Welcome
The University of Maryland has a number of high performance computing
resources available for use by campus researchers requiring compute cycles
for parallel codes and applications. These are:
- Zaratan: Our flagship cluster,
intended for large, parallel jobs, housed off campus and maintained by
the Division of Information Technology. It consists of over 380 nodes with dual-socket
(128 cores per node) AMD Milan processors. Twenty nodes also each contain four Nvidia A100 GPUs.
All nodes have at least 512 GB of RAM, with six large-memory nodes having 2 TB of RAM. All nodes have HDR-100 InfiniBand
(100 Gb/s) interconnects, and there is 2 PB of fast BeeGFS scratch storage. A minimal example of the kind of parallel code the cluster targets is sketched below.
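Zaratan's many-core nodes and InfiniBand interconnect are aimed at distributed-memory parallel codes, most commonly written with MPI. The following is a minimal sketch of such a program, not Zaratan-specific documentation; the compile and launch commands mentioned afterward are generic assumptions, and the exact modules and job-submission syntax should be taken from the cluster's own pages.

```c
/* hello_mpi.c -- minimal MPI sketch; site-specific details
 * (modules, launch commands) are assumptions, not Zaratan docs. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, name_len;
    char node_name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);                 /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total ranks in the job */
    MPI_Get_processor_name(node_name, &name_len);

    printf("Rank %d of %d running on node %s\n", rank, size, node_name);

    MPI_Finalize();                         /* shut down cleanly */
    return 0;
}
```

Compiled with an MPI wrapper (typically something like `mpicc hello_mpi.c -o hello_mpi`) and launched across two 128-core nodes with 256 ranks, each rank prints its rank and host name; on a Slurm-managed cluster the launch would go through `sbatch`/`srun`, with the exact invocation given in the cluster's documentation.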
The following clusters have been retired, but information about them is
still provided on this website for historical reasons:
- Deepthought2: Our previous cluster, retired in September 2022. It was
intended for large, parallel jobs, housed just off campus and maintained by
the Division of Information Technology. It consisted of over 480 nodes with dual-socket
(20 cores per node) Ivy Bridge 2.8 GHz processors, forty of which had dual Nvidia K20m GPUs.
Most nodes had 128 GB of RAM, with a few having 1 TB of RAM. All nodes had FDR InfiniBand
(56 Gb/s) interconnects, and there was 1 PB of fast Lustre storage.
- Juggernaut: An older experimental cluster that provided compute resources to
users who could not be added to the Deepthought2 cluster because of constraints
of its data center, and that served as a testing ground for the next cluster at UMD.
Comparison of Clusters
The following table compares the HPC resources:
| Cluster | Number of compute nodes | Cores/node | Processor speeds | Memory/node (GB) | Nodes with GPUs | Interconnect | Disk space | Licensed software |
|---------|-------------------------|------------|------------------|------------------|-----------------|--------------|------------|-------------------|
| Zaratan | 386 | 128 | 2.45 - 3.5 GHz | 512 (6 nodes have 2 TB) | 20 (quad Nvidia A100 40 GB) | HDR-100 InfiniBand (GPU nodes have HDR InfiniBand) | 2 PB BeeGFS scratch; 10 PB AuristorFS | Intel compiler suite; Matlab Distributed Compute Server |
| Deepthought2 (RETIRED) | 488 | 20 (1 TB nodes have 40) | 2.8 GHz (1 TB nodes: 2.2 GHz) | 128 (4 nodes have 1 TB) | 40 (dual Nvidia K20m) | FDR InfiniBand | 1 PB Lustre | Intel compiler suite; Matlab Distributed Compute Server |
Click on the cluster name above for more detailed information about a particular cluster.