Scheduled maintenance occurs at 7:00 AM on the first Tuesday of every month.
We are proud to announce the substantial hardware upgrades funded by our successful Beagle 2 grant application. The upgrade will consist of:
- Addition of 2.24 PB of raw disk space in two new DDN cabinets for a total of 2.84 PB of raw storage (~2.1 PB total usable).
- All compute blades upgraded from 6-core Magny-Cours processors to 8-core Abu Dhabi processors, increasing the per-node core count from 24 to 32.
- Memory on all compute blades upgraded from 32 GB to 64 GB per node.
- Network cards on the login nodes upgraded from 1 Gbps to 10 Gbps NICs.
- Addition of 4 compute nodes with NVIDIA GPUs.
Work on these hardware upgrades is scheduled to begin on November 11. At that time, Beagle will be taken offline for three weeks while the work is completed. Please note that Lustre data will be kept intact as part of the upgrade; however, while Beagle is offline there will be no way to access any data on Lustre.
At this time we expect the work to proceed on the following schedule, although the exact dates may still change:
- Week of Nov 10th – Beagle is down for pre-upgrade prep
- Week of Nov 17th – Beagle hardware upgrade begins
- Week of Nov 24th – Beagle hardware upgrade is completed
- Week of Dec 1st – Beagle acceptance testing is completed and users are given access to Beagle 2
The acquisition of the Beagle supercomputer was made possible by a grant from the National Institutes of Health (NIH) National Center for Research Resources (NCRR).
Ian Foster, director of the Computation Institute at the University of Chicago and Argonne National Laboratory, is the PI for this project. Together with UChicago’s team of technical and domain specialists, he identified the need for a powerful computational environment to serve the growing resource-intensive requirements of the biomedical research community.
Beagle’s “skin” was created by the Computation Institute’s Mark Hereld and Greg Cross. Beagle 2011 is built on three components: water, sky, and the wave that divides them. Moving to the right, the wave takes on the pitch of the double helix of DNA. The images of water and sky are computer-generated by a stochastic, context-free grammar. This application of stochastic image generation gives Beagle 2011 a fractal aspect, combining visual elements inspired by biology and mathematics, the disciplines at the heart of the research that Beagle will carry forward.
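As a loose illustration of the technique (a minimal sketch, not the Computation Institute’s actual generator), a stochastic context-free grammar expands symbols by choosing among weighted production rules at random; bounded recursive expansion is what yields the self-similar, fractal character described above. The rule names and weights below are invented purely for illustration:

```python
import random

# Hypothetical rules: each nonterminal maps to (probability, expansion) pairs.
RULES = {
    "WAVE": [(0.5, ["WAVE", "crest"]), (0.5, ["trough"])],
    "SKY":  [(0.7, ["SKY", "cloud"]), (0.3, ["clear"])],
}

def expand(symbol, depth=0, max_depth=6):
    """Recursively expand a symbol, picking a production at random."""
    if symbol not in RULES or depth >= max_depth:
        return [symbol]
    r, acc = random.random(), 0.0
    for prob, body in RULES[symbol]:
        acc += prob
        if r <= acc:
            return [tok for sym in body for tok in expand(sym, depth + 1, max_depth)]
    return [symbol]  # fallback if probabilities do not sum to 1

random.seed(2011)
print(" ".join(expand("WAVE")))  # a different "wave" with every seed
```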
- Roughly the 200th fastest machine in the world (Nov. 2011)
- Cray XE6 system
- 150 Teraflops
- 600 TB disk (450 TB formatted)
- Extreme Scalability Mode (ESM), which supports large scalable custom applications.
- Cluster Compatibility Mode (CCM), which allows standard programs designed for smaller machines or clusters to run without modifications.
- The nodes are connected in a 3D torus topology via the Cray Gemini interconnect.
- A high-speed inter-processor network that supports tightly coupled computational simulation and data-intensive analysis applications involving frequent inter-process communication (see the sketch after this list).
- At least 32 GB of memory per compute node, for applications that create large in-memory data structures or run many tasks on the same node.
- The ability to quickly and easily schedule large jobs as data become available, while also pursuing a very large number of smaller tasks.
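To make the frequent inter-process communication point concrete, below is a minimal halo-exchange sketch of the kind of tightly coupled, nearest-neighbor messaging a torus interconnect is built to accelerate. It assumes mpi4py and NumPy are available (an assumption about the environment, not a statement of Beagle’s installed software); on a Cray XE6, a program like this would typically be launched under ESM with aprun.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns a slice of a 1D periodic domain; neighbors form a ring.
local = np.full(10, float(rank))
left, right = (rank - 1) % size, (rank + 1) % size

halo_from_left = np.empty(1)
halo_from_right = np.empty(1)

# Trade boundary cells with both neighbors: send our right edge while
# receiving the left neighbor's, and vice versa. In a real simulation
# this exchange happens every time step.
comm.Sendrecv(sendbuf=local[-1:], dest=right,
              recvbuf=halo_from_left, source=left)
comm.Sendrecv(sendbuf=local[:1], dest=left,
              recvbuf=halo_from_right, source=right)

if rank == 0:
    print("rank 0 halos:", halo_from_left[0], halo_from_right[0])
```

Run with, e.g., `mpiexec -n 4 python halo.py` on a workstation; the same pattern scales to thousands of ranks on hardware like Beagle’s.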
Beagle will focus primarily, but not exclusively, on biomedical research supported by NIH funding.
Some of the project areas include:
- Quantitative determination of free energies associated with large conformational changes in cell membranes
- Molecular structure and ligand interaction prediction in cellular networks
- Whole-body model for studies of electrical and thermal injury
- Computation of possible configurations of transcriptional networks
- Data mining of biomedical literature to understand regulatory networks in cancer and complex disease processes
- Mapping brain structure to human behavior
- Quantitative medical-image analysis
- High-volume text mining
- Genomic and metagenomic data analysis
- Modeling of economic impact of climate change
- Large-scale molecular dynamics
- Modeling of ion channels in nerve cells
- Study of transcriptional networks