Welcome to Montana Tech's High Performance Computing Cluster
From Montana Tech High Performance Computing
Latest revision as of 15:19, 1 April 2023
Supporting the Computational Science and Research Needs of Montana
Montana Tech's High Performance Computing (HPC) cluster debuted as the first HPC system in the Montana University System (MUS) and was designed to support collaborative research and instruction across the MUS. Funded by the Montana Department of Commerce as a MUS-wide initiative, the cluster is available to faculty, students, researchers, and public/private industry collaborators.
HPC Cluster
Montana Tech's HPC is a small cluster consisting of 26 nodes with 544 cores. Two of the nodes are GPU nodes, adding 7,488 CUDA cores. The nodes are connected with 40 Gbps InfiniBand and have access to 91 TB of storage. The theoretical peak performance of the entire cluster is about 21 TFLOPS.
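As a rough sanity check, the quoted ~21 TFLOPS peak follows from cores × clock × FLOPs per cycle. The clock speed and vector width below are illustrative assumptions, not published specifications of this cluster:

```python
# Back-of-the-envelope check of the quoted ~21 TFLOPS theoretical peak.
# clock_ghz and flops_per_cycle are assumptions for illustration only.
cores = 544              # CPU cores across the 26 nodes (from the page)
clock_ghz = 2.4          # assumed average clock (GHz); actual clocks vary by node
flops_per_cycle = 16     # assumed: one AVX-512 FMA unit, double precision
                         # (8 doubles per 512-bit register x 2 ops per FMA)
peak_tflops = cores * clock_ghz * flops_per_cycle / 1000
print(f"{peak_tflops:.1f} TFLOPS")  # ~20.9, consistent with the quoted ~21
```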
Data Visualization
Associated with the HPC are two 3D data visualization systems with a variety of visualization software packages. Each system is equipped with either a 108-inch stereo projection wall or a 70-inch 3D TV, along with shutter glasses and a tracking system that enables researchers to interact directly with the 3D imagery.
Collaborations
Montana Tech currently pays for system support. We hope current and future researchers will incorporate the facilities into their grant proposals to fund system expansion and future support. Researchers can also propose infrastructure expansions if funding is available.
Current Uses
- Multiphysics Simulations
- Molecular Dynamics
- Statistical Simulations
- Gene Analysis
- Teaching
What's New
- Current Cluster Status (Ganglia)
- 04/01/2023 Software will be reinstalled. User accounts will be recovered upon request.
- 03/31/2023 HPC system upgraded from CentOS 7 + xCAT (stateful) to Rocky Linux 8 + Warewulf (stateless).
- 03/05/2020 We have migrated from Torque/Moab to SLURM. Documentation will be updated.
- 08/20/2019 Four new compute nodes featuring the latest Xeon Platinum processors have arrived.
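Since the scheduler is now SLURM (per the 03/05/2020 note above), a minimal batch script might look like the sketch below. The partition name, resource numbers, and executable are assumptions for illustration; consult the cluster documentation for the actual values:

```shell
#!/bin/bash
#SBATCH --job-name=example      # job name shown in squeue output
#SBATCH --nodes=1               # request a single node
#SBATCH --ntasks=16             # hypothetical task count; adjust to your job
#SBATCH --time=01:00:00         # wall-clock limit (HH:MM:SS)
#SBATCH --output=slurm-%j.out   # stdout/stderr file, %j expands to the job ID

# Launch the job step; ./my_simulation is a placeholder executable.
srun ./my_simulation
```

Save this as, e.g., `job.sh` and submit it with `sbatch job.sh`; check its status with `squeue`.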