Attention: Confluence is not suitable for the storage of highly confidential data. Please ensure that any data classified as Highly Protected is stored using a more secure platform.
If you have any questions, please refer to the University's data classification guide or contact ict.askcyber@sydney.edu.au
Computing platforms
Research computing comes in all shapes and sizes. In some cases, your compute needs are well met by your personal computer. In others they are not, and that is where the platforms below can be critical to the timely analysis of your data.
High performance computing
High performance computing refers to the use of parallel processing techniques to solve complex computational problems efficiently. HPC systems consist of clusters of interconnected computers, each equipped with multiple processors and large amounts of memory. These systems are capable of handling massive datasets and performing computations at speeds far beyond those achievable by your personal computer. HPC provides a reliable and efficient means of analysing data of all shapes and sizes across research domains.
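To make the idea of parallel processing concrete, the sketch below splits an embarrassingly parallel task across the cores of a single machine using only Python's standard library; HPC schedulers extend the same pattern across many nodes, each with its own processors and memory. The workload (summing squares over chunks of numbers) is purely illustrative, not a recommended analysis.

```python
# Illustrative only: single-machine parallelism with Python's standard library.
# HPC clusters scale this pattern out across many nodes via a job scheduler.
from multiprocessing import Pool

def sum_of_squares(chunk):
    """CPU-bound work applied independently to each chunk of data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))

    # Split the input into chunks that can be processed independently.
    n_chunks = 8
    chunk_size = len(data) // n_chunks
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    # Each chunk is handed to a separate worker process (one per CPU core by default).
    with Pool() as pool:
        partial_sums = pool.map(sum_of_squares, chunks)

    print(sum(partial_sums))
```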
To use HPC, your workflows do not need to make use of its multi-node architecture. There are many reasons why HPC may be justified (a sketch for gauging your local machine's limits follows the list below):
Large input data requiring vast physical storage for inputs and outputs
High CPU or node requirement
GPU requirement
High memory requirement
Long walltime requirement
Faster I/O operations than your local computer can handle
Freeing up your local computer's resources for other tasks, or letting you shut down for the day without stopping the analysis you're running
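Before requesting HPC resources, it can help to gauge whether your local machine's memory and storage are actually the limiting factor. The sketch below, which assumes a Linux machine and uses only the Python standard library, compares a workload's rough requirements against the machine it runs on; the figures for input_size_gb and peak_memory_gb are hypothetical placeholders you would replace with estimates for your own analysis.

```python
# Illustrative only: a rough check of whether a workload outgrows a local machine.
# Replace the placeholder estimates with figures for your own analysis.
import os
import shutil

# Hypothetical workload estimates (placeholders, not measurements).
input_size_gb = 500      # total size of input and output files
peak_memory_gb = 128     # peak RAM the analysis is expected to need

# Resources of the (Linux) machine this script runs on.
total_ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9
free_disk_gb = shutil.disk_usage("/").free / 1e9

if peak_memory_gb > total_ram_gb:
    print(f"Peak memory ({peak_memory_gb} GB) exceeds local RAM ({total_ram_gb:.0f} GB): consider HPC.")
if input_size_gb > free_disk_gb:
    print(f"Data ({input_size_gb} GB) exceeds free local storage ({free_disk_gb:.0f} GB): consider HPC.")
```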
What HPCs do you have access to?
USyd researchers have access to:
Artemis is our institutional HPC; it is due to be decommissioned on 29 August 2025. See the Intranet pages for more information.
The nationally subsidised Tier-1 HPCs, NCI Gadi and Pawsey Setonix. These systems offer vastly more CPU and GPU nodes than our institutional systems.
Cloud computing
Cloud computing enables researchers to access scalable, on-demand computing resources over the internet from their laptop. Unlike HPC, which often involves scheduled batch jobs and shared queues, cloud platforms offer flexible environments well suited to interactive work, customised software stacks, and rapidly scaling workflows.
Cloud services can be used for:
Hosting virtual machines and notebooks for analyses
Deploying web applications or APIs
Prototyping and testing software in isolated environments
Analyses requiring custom operating systems, tools, or workflows
Training and deploying machine learning models
Cloud computing can complement HPC by offering a more interactive, user-configurable approach to computation, which is particularly useful for tasks that don't benefit from multi-node parallelisation or when self-managed environments are required.
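As a concrete contrast with batch-style HPC jobs, the sketch below runs a tiny long-running web API of the kind you might host on a cloud virtual machine. It uses only the Python standard library; the endpoint, payload, and port are illustrative assumptions rather than a recommended deployment pattern.

```python
# Illustrative only: a minimal long-running web service, the kind of interactive
# workload that suits an on-demand cloud VM rather than a batch HPC queue.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET request with a small JSON payload.
        body = json.dumps({"status": "ok", "service": "example-analysis-api"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on all interfaces; port 8080 is an arbitrary example.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```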
What cloud platforms do you have access to?
USyd researchers have access to:
RONIN is an interface for creating virtual machines in the AWS cloud. The simplified user interface allows you to quickly turn on new machines that are appropriate for your workload, use them, and destroy them when you are finished.
Nirin is an on-demand environment suited to interactive analysis and service deployment. It offers researchers a dynamic alternative to traditional HPC, while remaining tightly integrated with NCI Gadi and data collections, enabling workflows that span both platforms.
Nectar is Australia's national research cloud platform, offering researchers self-service access to scalable virtual machines, GPUs, and large-memory instances for data analysis, software deployment, and collaborative research.
Sydney Informatics Hub GPU cluster
More information coming soon!
Virtual research desktops
Argus is the University of Sydney’s virtual research desktop (VRD) project providing remote, on-demand, interactive, graphically intensive compute environments. VRD supports common, convenient software and environments for processing data generated at the University's core research facilities. For more information, see the Service Portal.