Kennesaw State University Research Computing Facilities and Resources (Fall 2025)

vHPC

The new HPC cluster has 12 compute nodes and is offered as a research computing core service. Each node has two CPUs (Intel Xeon Gold 6342 @ 2.80 GHz, 24 cores each) and 4 GPUs (NVIDIA Ampere A100 80 GB PCIe) with 512 GB RAM. The compute nodes are all interconnected on a fast InfiniBand network that also connects to the local storage. Each compute node has access to user home storage (25 GB), work storage for use during computation (180 TB), and staging storage for use between compute runs (180 TB). Additionally, larger and longer-term storage is available as a research computing core service. At launch, the cluster offered more than 400 software modules.
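
For quick reference, the per-node figures above multiply out to the following cluster-wide totals. The short Python sketch below is purely derived arithmetic from the stated specifications; the totals themselves are not quoted from the facilities statement.

    # Aggregate vHPC capacity derived from the per-node specifications above.
    # All inputs are the published per-node figures; the totals are computed.
    nodes = 12             # compute nodes
    cpus_per_node = 2      # Intel Xeon Gold 6342 @ 2.80 GHz
    cores_per_cpu = 24
    gpus_per_node = 4      # NVIDIA A100 80 GB PCIe
    ram_per_node_gb = 512

    print("CPU cores:", nodes * cpus_per_node * cores_per_cpu)  # 576
    print("GPUs:     ", nodes * gpus_per_node)                  # 48
    print("RAM (GB): ", nodes * ram_per_node_gb)                # 6,144 GB (6 TB)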

Research Storage

Kennesaw State University (KSU) has deployed a new high-capacity storage system as part of a National Science Foundation (NSF) Campus Cyberinfrastructure award. The system runs on Ceph and offers a total raw capacity of 6.5 petabytes (PB), which, after applying erasure coding for redundancy and fault tolerance, results in approximately 4.27 PB of usable storage. Of this usable capacity, 30% (1.28 PB) is designated as an Open Science Data Federation (OSDF) data origin, enabling the hosting of datasets from external researchers and facilitating the sharing of KSU-generated datasets to support national research initiatives. The remaining 70% (2.99 PB) is allocated for KSU researchers, supporting scientific research and educational activities across both campuses. This storage is available in 1 TB increments. The storage system is accessible to KSU researchers via desktop devices and the virtual high-performance computing (vHPC) environment as an SMB share, available through a secure VPN portal. The OSDF portion supports both public dataset sharing and private research collaborations, enhancing KSU’s role in national and global research efforts.
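
As a worked example of the capacity arithmetic, the Python sketch below reproduces the figures above. The exact Ceph erasure-coding profile is not named in the statement, so the efficiency factor here is simply inferred from the stated raw and usable capacities rather than taken from the system's configuration.

    # Capacity arithmetic behind the storage figures above.
    # Assumption: the erasure-coding profile is unspecified; its efficiency
    # is inferred from the stated 6.5 PB raw and 4.27 PB usable capacities.
    raw_pb = 6.5
    usable_pb = 4.27
    efficiency = usable_pb / raw_pb     # ~0.657, close to a 2/3 (e.g., 4+2) profile

    osdf_pb = 0.30 * usable_pb          # ~1.28 PB OSDF data origin
    ksu_pb = 0.70 * usable_pb           # ~2.99 PB for KSU researchers

    print(f"efficiency: {efficiency:.3f}")
    print(f"OSDF: {osdf_pb:.2f} PB, KSU: {ksu_pb:.2f} PB")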

KSU has established a high-speed pathway to Internet2 and other heavily used commercial content providers. The Kennesaw and Marietta campuses are now directly connected through SoX to Internet2 and have established connections for both the Regional Research and Education Network (R&E) routes and Internet2 Peer Exchange (I2PX) routes. The current connection speed is 10 Gb/s. This connection allows for rapid sharing of large amounts of data between KSU and other participating research institutions worldwide. It is now available to on-campus researchers, and eligible traffic is routed through it automatically.
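
To make the 10 Gb/s figure concrete, the Python sketch below estimates the bulk-transfer time for a 1 TB dataset. It assumes the full line rate is achievable with no protocol overhead, which is an idealization; real transfers will be somewhat slower.

    # Rough transfer-time estimate over the 10 Gb/s Internet2 connection.
    # Assumes full line rate with no protocol overhead (an idealization).
    link_gbps = 10                    # link speed in gigabits per second
    dataset_tb = 1.0                  # dataset size in terabytes

    bits = dataset_tb * 8e12          # 1 TB = 8 x 10^12 bits (decimal units)
    seconds = bits / (link_gbps * 1e9)
    print(f"~{seconds / 60:.1f} minutes per TB")  # ~13.3 minutes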

Kennesaw State University recommends that users of the university-level HPC include
the following acknowledgement statement: “This work was supported in part by research
computing resources and technical expertise via a partnership between Kennesaw State
University’s Office of the Vice President for Research and the Office of the CIO and Vice
President for Information Technology [1].” and cite using the appropriate citation format.

For the current KSU facilities statement, please use the persistent link https://digitalcommons.kennesaw.edu/training/10/ when referencing it in publications and proposals.


    Previous Facilities Statements

    • 2025 HPC Compute Node Details

      Queue   Nodes   CPUs                          Cores        RAM (GB)
      batch   34-51   2 Xeon Gold 6148 (2.40 GHz)   40           192
      batch   52-70   2 Xeon Gold 6126 (2.60 GHz)   24           192
      batch   71-77   4 Xeon Gold 6226 (2.70 GHz)   48           768
      himem   78      4 Xeon Gold 6226 (2.70 GHz)   48           1,537
      gpu     79-82   4 NVIDIA V100S                5,120 (GPU)  768
      Total (47 nodes)        118                   1,704        16,705



    • The Kennesaw State University HPC computing resources represent the University’s commitment to research computing. The KSU HPC is a Red Hat Linux-based cluster that offers a total capacity of over 50 teraflops to KSU faculty researchers and their teams. The cluster consists of around 50 nodes with 120 processors having 1,768 cores (excluding GPU cores) and 12.8 TB RAM, and has both CPU and GPU capabilities. There are queues available for standard, high-memory, and GPU jobs. The HPC is built on a fast network for data and interconnect traffic. A large storage array is provided for user home directories, and fast storage is available to each node during job runtime. Power for cooling and servers is backed by battery systems and natural gas generators. On- and off-campus access to the cluster is allowed only through secure protocols and utilizes Duo authentication.

      Software is provided through environment modules, which allow multiple versions of the same software to coexist and avoid dependency conflicts. There are around 200 software programs available, including titles for Astronomy, Biology, Chemistry, Math, Statistics, Physics, Engineering, and programming languages. Some popular titles include: Gaussian, MATLAB, Mathematica, R, TensorFlow, COMSOL, LS-DYNA, HH-Suite, MAFFT, LAMMPS, OpenFOAM, PHYLIP, and Trinity. Cluster management and job-scheduling software is used to provide free access to this shared resource.

      Kennesaw State University recommends that users of the university-level HPC include the following acknowledgement statement: “This work was supported in part by research computing resources and technical expertise via a partnership between Kennesaw State University’s Office of the Vice President for Research and the Office of the CIO and Vice President for Information Technology [1].” and cite using the appropriate citation format.

    • The Kennesaw State University HPC computing resources represent the University’s commitment to research computing. The KSU HPC is a Red Hat Linux-based cluster that offers a total capacity of over 50 teraflops to KSU faculty researchers and their teams. The cluster consists of over 50 nodes with 110 processors having 1,512 cores (excluding GPU cores) and 10.3 TB RAM, and has both CPU and GPU capabilities. There are queues available for standard, high-memory, and GPU jobs. The HPC is built on a fast InfiniBand network for data and interconnect traffic. A large storage array is provided for user home directories, and fast storage is available for use during job runtime. Power for cooling and servers is backed by battery systems and natural gas generators. On- and off-campus access to the cluster is allowed only through secure protocols.

      Software is provided through environment modules, which allow multiple versions of the same software to coexist and avoid dependency conflicts. There are 150 software programs available, including titles for Astronomy, Biology, Chemistry, Math, Statistics, Engineering, and programming languages. Some popular titles include: Gaussian, MATLAB, Mathematica, R, TensorFlow, COMSOL, LS-DYNA, HH-Suite, MAFFT, LAMMPS, OpenFOAM, PHYLIP, and Trinity. Cluster management and job-scheduling software is used to provide free access to this shared resource.

      Kennesaw State University recommends that users of the university-level HPC include the following acknowledgement statement: “This work was supported in part by research computing resources and technical expertise via a partnership between Kennesaw State University’s Office of the Vice President for Research and the Office of the CIO and Vice President for Information Technology [1].” and cite using the appropriate citation format.