
Department of Computer Science

Technical Services and Support

Summary of Systems

Computer Science has two generally-available “clusters”, faculty and iLab. The sections below describe each. Each cluster has its own home directories, so if you have access to both iLab and faculty, you will have a separate home directory on each. There is also a common file system shared by all clusters, /common/users; your directory on it is /common/users/NETID, and quotas on /common/users are 100 GB.
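As a quick sketch of checking your usage against that 100 GB quota: `du` is standard on Linux systems like these. On the clusters you would point it at your /common/users/NETID directory (NETID being your own NetID); the example below uses $HOME only so the command works anywhere.

```shell
# du -sh reports the total size of a directory tree in human-readable
# form. On the clusters, substitute /common/users/NETID for $HOME to
# see your usage against the 100 GB quota.
du -sh "$HOME"
```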

IMPORTANT: The CS Linux systems have enforced limitations that all users should be aware of. Make sure to check that page before running a big job.

Systems Status

Current System Status

The faculty cluster serves faculty offices, plus some servers. There are two faculty machines, constance.cs and porthos.cs. However, we recommend logging in through the common alias, which will give you either constance.cs or porthos.cs at random. If one is down, it will give you the one that is up.

These are not very powerful systems. They’re intended primarily to keep records in an environment with no students. Serious computing is normally done on the research cluster.


Our student system is the iLab/rLab cluster, consisting of a set of large servers (some with A4000, GTX 1080 Ti, or RTX 2080 Ti GPUs) and desktops in the iLabs (2nd floor Hill Center, rooms 248, 252, and 254) and grad student offices. Most students access these systems remotely, using WebLogin, an SSH client (such as the ssh command on macOS and Linux, or the Bitvise client on Windows; see the video tutorial or SSH HowTo), X2GO, or Windows Remote Desktop. Please use the alias, which will give you one of the four large servers. You can also look at the iLab status page to find a desktop system that isn’t in heavy use.
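For command-line access, one convenient setup is an ~/.ssh/config entry on your own machine so that a short name logs you in. This is only a sketch: HOSTNAME stands in for the cluster alias given in the SSH HowTo (or a specific desktop chosen from the iLab status page), and NETID stands in for your own NetID.

```
# ~/.ssh/config (on your own machine)
# HOSTNAME = cluster alias from the SSH HowTo, or a specific machine
# NETID    = your Rutgers NetID
Host ilab
    HostName HOSTNAME
    User NETID
```

With an entry like this, `ssh ilab` connects without retyping the full host name and NetID each time, and `scp somefile ilab:~/` copies files to your home directory there.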

More information about the student systems, and instructions on using them

Instructional Laboratory (iLab) Systems: Announcements and information about rooms

Detailed iLab System Specs

    • 50+ iLab desktop systems, including 31 machines with an Nvidia GTX 1060 GPU.
    • 4 iLab servers, each with 8 Nvidia A4000 GPUs and 512 GB–1 TB of memory.
    • 3 rLab servers, each with 8 Nvidia GTX 1080 Ti GPUs and 512 GB–1 TB of memory.
    • 1 rLab server with 8 Nvidia Titan X GPUs and 512 GB of memory.
    • 1 iLab server with 8 Nvidia RTX 2080 Ti GPUs and 256 GB of memory.
Private Research Systems

Many faculty research groups own machines that are generally available to CS researchers in their group. The list is growing fast. Access to these systems is normally controlled by the head of each research group.

High-performance systems: OARC

OARC is a University group that provides high-performance computing. Computer Science in general doesn’t have a conventional HPC cluster; we concentrate on GPUs and more specialized hardware. For large-scale HPC, OARC is the best source. They have a large cluster, Amarel, which is run as a “condo” cluster: grants buy nodes and are guaranteed at least as much capacity as they purchased, with the cost matched by the University. However, some capacity is available to those who haven’t bought into the system, particularly for course work and student use.

For more information see the OARC web site.

Some of OARC’s nodes have GPUs, typically Nvidia.

Virtual machines

LCSR can provide virtual machines, both for researchers and for use by classes. 

Researchers commonly use VMs for web servers and other support services. We normally put those VMs on the same servers used for LCSR infrastructure.

For course use, we talk with the instructor to find out the configuration and software needed, then create one small VM per user. We can also create limited VMs for grad students. These VMs are placed on one of two large (1 TB each) VM servers purchased specifically for instruction.

Virtual Machines for Student Academic Use

Virtual Machines for Faculty and Research

For help with our systems: if you need immediate assistance, visit the LCSR Operator at CoRE 235 or call 848-445-2443. Otherwise, see the CS HelpDesk. Don’t forget to include your NetID along with a description of your problem.