Building Cyberinfrastructure into a University Culture
EDUCAUSE Live! March 30, 2010
Curt Hillegas, Director, TIGRESS HPC Center, Princeton University
Context
As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.
www.Princeton.EDU
Cyberinfrastructure
Cyberinfrastructure consists of computational systems, data and information management, advanced instruments, visualization environments, and people, all linked together by software and advanced networks to improve scholarly productivity and enable knowledge breakthroughs and discoveries not otherwise possible.
– Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities: Challenges and Strategies, A Workshop Report and Recommendations
Contents
• Yesterday
• Today
• Tomorrow
• Lessons
2002: The Beginning
• OIT, Academic Services – Computational Science and Engineering Support group (CSES)
• The Princeton Institute for Computational Science and Engineering (PICSciE)
• Research Computing Advisory Group (RCAG)
Trust
• AdrOIT – small Beowulf cluster
• Princeton Software Repository
• Maintenance of old systems
• Elders image – rebuilt RHEL distribution
Relationships
• Hire new faculty
• New PICSciE Director, Prof. Jerry Ostriker
• 1st annual RCAG presentation
• Wow!!!
• Let’s buy something big together
– Collaborative selection process
– Cobble together funding
Partnerships
• 64-processor SGI Altix BX2, Hecate
• 1024-node IBM BlueGene/L, Orangena
– Faculty
– OIT
– Development
– Facilities
• Collaborative administration (without fees)
• Housed in central Data Center (without fees)
• 256-node Dell Beowulf cluster, Della
• HPC Steering Committee
Storage
• Hire a pair of junior faculty
– 192-node Beowulf cluster, including startup
• 35 TB
• Fees to recover 50% of capital cost
• 10% utilization within the first 6 months
• No fees!!!
• 95% utilization within 4 months
Hierarchical Storage Management
• 96-node SGI Altix ICE, Artemis
• RCAG – need for a scalable storage system that provides appropriate performance and availability for aging data
• OIT proposal to Provost’s office
• 1 PB total
– IBM DS4800, GPFS, GPFS HSM, TSM HSM
– Added benefit – free backups!!!
• All systems have high-performance access
Success Brings New Challenges
• Senior faculty hire
• 448-node Dell cluster
• New scheduling policies
• (Supercomputing|Scientific Computing) Administrators Meeting – SCAM
• DataSpace
• New PICSciE Director – Prof. Jeroen Tromp
Visualization
• Visualization expert
• Sony SRX-S110 projector
– 8,847,360 pixels (4096 x 2160)
– Rear-projection ultra-wide-angle fabric screen
– 9’3” (H) x 16’6” (W)
• Open source and proprietary software
Cyberinfrastructure at Princeton
[Diagram: layered model – Infrastructure at the base; the Research Computing Base (HPC Hardware, Storage, Visualization, Programming Support, Software) above it; Collaboration alongside]
HPC Hardware
Collaboration
• PICSciE
• RCAG
• TIGRESS Steering Committee
• TIGRESS Users
• SCAM
• PLUG
Future
• New Data Center
• Coordinated supervision of departmental scientific/Linux system administrators
• Collaboration with the Library
• Lifecycle management
• New technologies
– GPGPU
– POWER7
– x86_64-based single image
• Participation in Virtual Organizations
Lessons Learned
• Research is driven by the faculty
• Trust, relationships, and partnerships are essential to success
• Research computing relies on the complete cyberinfrastructure of the University
• Avoid fees
• Change is a constant
• Start small, do things well, and growth will follow
• It’s all about the people