Student Laboratory Room MEB 2460 is a 500 square foot computing laboratory dedicated to work in computational biomechanics. The computing hardware includes an array of personal computers running both Windows and Linux operating systems, printers, and networking infrastructure including a high-speed switch with Gigabit Ethernet.

Dr. Weiss is a faculty member in the Scientific Computing and Imaging (SCI) Institute, one of eight Research Institutes at the University of Utah. The SCI Institute is housed in the Warnock Engineering Building, directly adjacent to the building housing the MRL, with approximately 8,500 sq. ft. of space allocated to research and computing activities. The Institute employs approximately 100 scientists, administrative personnel, and graduate and undergraduate students.

The computing facilities include:

  •  264 core, 2.8TB shared memory SGI UV 1000 system with Intel X7542 2.67GHz Processors
  •  64 node GP-GPU cluster. Each node is an 8 core system with 24GB of RAM and Intel X5550 2.66GHz processors. The cluster is connected to (32) NVIDIA S1070 Tesla GPU systems, and nodes are linked with a 4x DDR InfiniBand backbone with dual 10Gb network connections to SCI core switches.
  •  32 core, 192GB shared memory IBM Linux system with Intel Xeon X7350 3.0GHz processors. This system can also be reconfigured into two separate 16 core systems with 96GB of RAM each.
  •  64 core, 512GB shared memory HP DL980 G7 with Intel Xeon X7560 2.27GHz processors, 2x NVIDIA m2070 GPU cards and 10Gb network connections.
  •  (3) 8 processor (24 cores, 2.5GHz, AMD Opteron with Nvidia Quadro FX 5600 graphics card) with a dual Gigabit Ethernet backbone and 96GB RAM
  •  (3) 2 processor (8 cores, 2.67GHz Intel X5550 with Nvidia GeForce GTS 250 graphics card) with a Gigabit Ethernet backbone and 48GB RAM
  •  4 processor (8 cores, 2.0GHz, AMD Opteron, with Nvidia Quadro 2FX graphics card) with a dual Gigabit Ethernet backbone and 16GB RAM
  •  Video display wall with 24 30″ LCD displays powered by 7 high-end workstation systems with dual Nvidia video cards

In addition, the SCI Institute computing facility contains:

  •  SGI InfiniteStorage system consisting of both Fibre Channel and SATA disk with a capacity of 80TB
  •  An Isilon storage cluster with 13x 36TB storage nodes for a total of 422TB usable space with 6x dual 10 Gigabit Ethernet links and significant expansion capacity (up to 1PB single namespace)
  •  Dedicated IBM backup server to manage SAN backup system and SL500 robots with 16TB of local SATA disk for disk to disk backups of critical systems
  •  IBM 10TB LTO-4 tape library providing backup for infrastructure systems such as email, web, DNS, and administrative systems
  •  500TB LTO-4 StorageTek SL500 tape library serving as the primary backup system
  •  40TB LTO-3 StorageTek SL500 tape library for archiving and offsite storage
  •  2 fully redundant Foundry BigIron RX-16 switching cores that provide a Gigabit network backbone for all HPC computers, servers, and individual workstations connected via Foundry floor switches
  •  Connections to the campus backbone via redundant 10 Gigabit Ethernet links – the first such attachments on campus
  •  A variety of Intel and AMD based desktop workstations running Linux with the latest ATI or Nvidia graphics cards
  •  Numerous Windows XP and Windows 7 desktop workstations
  •  Numerous Mac Pro workstations running Mac OS X 10.6 with 30″ displays
  •  A 10 foot by 8 foot high-resolution rear projection system with stereo capability for collaborative group interaction and code development
  •  Six Sun quad-core AMD Opteron high availability Linux servers providing core SCI IT services: web, ftp, mail, and software distribution
  •  Dedicated version control server with 6TB of local disk space for all SCI code and software projects
  •  UPS power, including 100 minutes of battery backup for critical SCI servers