Facilities, Equipment and Other Resources

The Scientific Computing and Imaging (SCI) Institute is one of eight recognized research institutes at the University of Utah and includes twenty faculty members and over 200 other scientists, administrative support staff, and graduate and undergraduate students. SCI has over 25,000 square feet of functional space allocated to its research and computing activities within the new John and Marva Warnock Engineering Building on campus. Laboratory facilities for the researchers listed in this project include personal work spaces and access to common working areas.

Computing: The SCI computing facility, which has dedicated machine room space in the Warnock Engineering Building, includes shared memory multiprocessor computers, clusters, and dedicated graphics systems:

1. Nvidia DGX-1 with 8X Tesla V100, Dual 20-Core Intel Xeon E5-2698 v4 2.2 GHz Processors, and 512 GB RAM.

2. 264 core, 2.8TB shared memory SGI UV 1000 system with Intel X7542 2.67GHz Processors.

3. 64 node CPU cluster. Each node has 8 cores and 24GB of RAM, connected by a 4x DDR Infiniband backbone with dual 10G network connections to the SCI core switches.

4. 32 node GP-GPU cluster. Each node has 16 cores (Intel E5-2660 2.20GHz processors), 64GB of RAM, 2x Nvidia K20 GPUs, and 2 full-speed FDR Infiniband connections. The system has a total of 128 56Gb/s Infiniband connections.

5. 10 blade HPE Apollo 6000 cluster system with 10 Xeon-7210 CPUs and 480GB of total RAM.

6. 32 core, 192GB shared memory IBM Linux system with Intel Xeon X7350 3.0GHz processors.

7. 64 core, 512GB shared memory HP DL980 G7 with Intel Xeon X7560 2.27GHz processors.

8. 4x 80 core, 842GB shared memory HP DL980 G7 with Intel E7-4870 2.40GHz processors.

9. 3x 8 processor systems (24 cores, 2.5GHz AMD Opteron, with Nvidia Quadro FX 5600 graphics card) with a dual Gigabit Ethernet backbone and 96GB RAM.

10. 8 core, 2.0GHz AMD Opteron with Nvidia Quadro 2FX graphics card, a dual Gigabit Ethernet backbone, and 16GB RAM.

11. 6 core Intel Xeon X5650 2.67GHz with 196GB of RAM and 2x C2070 GPUs.

12. 8 core Intel Xeon X5570 2.93GHz (16 with HT enabled) with 126GB of RAM and C2050/C2070 GPUs.

13. 12 core Intel Xeon E5-2640 2.50GHz with 32GB of RAM and 3x K20c GPUs.

 

In addition, the SCI Institute computing facility contains:

1. An Isilon storage cluster with 13 36NL 36TB storage nodes, for a total of 422TB of usable space, connected via 6x dual 10 Gigabit Ethernet links.

2. A Qumulo storage cluster with 7 QC24 storage nodes and 4 QC208 storage nodes, for a total of 627TB of usable space; each QC24 has a 10 Gigabit Ethernet link and each QC208 has a 40 Gigabit Ethernet link.

3. Dedicated IBM backup server to manage SAN backup system and SL500 robots.

4. IBM 10TB LTO-4 tape library providing backup for infrastructure systems such as email, web, DNS, and administrative systems.

5. A 500TB LTO-4 StorageTek SL500 tape library serving as the primary backup system.

6. 2 fully redundant Foundry BigIron MLX-16 switching cores that provide a Gigabit network backbone for all HPC computers, servers, and individual workstations connected via Foundry floor switches.

7. Connections to the campus backbone via redundant 10 Gigabit Ethernet links.

8. A variety of Intel and AMD based desktop workstations running Linux with the latest ATI or Nvidia graphics cards.

9. Numerous Windows 7 desktop workstations.

10. Numerous Mac Pro workstations with 27" or 30" displays.

11. High-availability Linux servers providing core SCI IT services: website (www.sci.utah.edu), FTP, mail, and software distribution.

12. Dedicated version control server with 6TB of local disk space for all SCI code and software projects.

13. UPS power, including 100 minutes of battery backup for critical SCI servers.

 

Power Display Wall:

The interactive Power Display Wall provides users with the ability to explore 2D/3D visualizations on 36 (4 x 9) 27-inch tiled screens at a 133-megapixel resolution with 72GB of graphics memory. The display can be controlled by a computer and/or tablet device either on-site or by remote collaborators. Its infrastructure was designed to handle massive, terascale data sets from local or remote sources. Each node of the display wall drives 4 screens and can be configured by the controller to stream-process the data as it is displayed, to aid analysis. This is an ideal resource for local and remote collaborations in which users need to examine fine details of large datasets while maintaining the global context.
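As a rough cross-check of these figures, and assuming for illustration that each 27-inch panel runs at 2560 x 1440 (the per-panel resolution is not stated here), the tiled totals work out as:

\[
36 \times (2560 \times 1440) = 132{,}710{,}400 \ \text{pixels} \approx 133\ \text{megapixels},
\qquad
\frac{36\ \text{screens}}{4\ \text{screens/node}} = 9\ \text{nodes},
\qquad
\frac{72\ \text{GB}}{9\ \text{nodes}} = 8\ \text{GB per node}.
\]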

 

Office Space:

The SCI Institute houses its faculty and staff in individual offices. Students have individual desk space equipped with a workstation and located in large, open common areas that facilitate student collaboration and communication. All workstations are connected to the SCI local area network via full-duplex Gigabit Ethernet.

 

University Network:

The University of Utah is a member of the Internet2 advanced networking consortium. It is initially connected to the Internet2 Network via 100 Gigabit Ethernet (100 Gbit per second). The University, in partnership with the Utah Education Network, has the ability to extend dedicated wavelengths or dedicated circuits from Internet2, which is collocated in a Level 3 Communications facility west of downtown Salt Lake City.