Ocean Sciences Cyberinfrastructure Futures
Dr. Larry Smarr
Director, California Institute for Telecommunications and Information Technologies
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
Report to the ORION Ocean Observatory Workshop
San Juan, Puerto Rico
January 7, 2004
The Ocean.US DMAC Vision: A Strong Foundation
• Interoperability
• Open, easy access and discovery
• Reliable, sustained, efficient operations
• Effective user feedback
• Open design and standards process
• Preservation of data and products
Source: John Orcutt, SIO
www.dmac.ocean.us
Components of CI-Enabled Science & Engineering
Collaboration Services
Knowledge management institutions for collection building and curation of data, information, literature, digital objects
High-performance computing for modeling, simulation, data processing/mining
Individual & Group Interfaces & Visualization
Physical World
Humans
Facilities for activation, manipulation and construction
Instruments for observation and characterization
Global Connectivity
NSF Report on Revolutionizing Science and Engineering through Cyber-Infrastructure (Atkins Report)
www.communitytechnology.org/nsf_ci_report/
NASA Earth System Science IT Challenges
• EOSDIS Currently:
– Ingests Nearly 3 Terabytes of Data Each Day
– In 2003 it Delivered Over 25 Million Data Products
– In Response to Over 2.3 Million User Requests
– Making It the Largest “e-Science” System in the World (a rough scaling of these figures follows this list)
• Earth System Modeling is a Driving Requirement for High-End Computing, and will Continue to be so as Models:
– Increase in Resolution and
– Are Further Coupled (e.g., Atmosphere-Ocean-Land Processes)
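For a sense of scale, here is a minimal back-of-envelope sketch, assuming only the round numbers quoted above; the derived yearly volume and products-per-request figures are approximations, not values taken from the report.

```python
# Back-of-envelope scaling of the EOSDIS figures quoted above (approximate;
# only the slide's round numbers are used as inputs).
TB_PER_DAY = 3            # "ingests nearly 3 terabytes of data each day"
PRODUCTS_2003 = 25e6      # data products delivered in 2003
REQUESTS_2003 = 2.3e6     # user requests in 2003

annual_ingest_pb = TB_PER_DAY * 365 / 1000      # terabytes -> petabytes
products_per_request = PRODUCTS_2003 / REQUESTS_2003

print(f"Annual ingest: ~{annual_ingest_pb:.1f} PB/year")      # ~1.1 PB/year
print(f"Products per request: ~{products_per_request:.0f}")   # ~11
```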
Other Agencies are Learning from EOSDIS and are Moving Beyond. As NASA Lays Out the Evolution of its Information Infrastructure to Meet its Earth Science Challenges Over The Next Decade, it will Again Need to Move to The Leading-Edge.
NSF is Funding Research on Wireless Cyberinfrastructure for Ocean Observatories
http://roadnet.ucsd.edu/
www.cosis.net/abstracts/EAE03/07668/EAE03-J-07668.pdf
The Biomedical Informatics Research Network: a Multi-Scale Brain Imaging Federated Repository
National Partnership for Advanced Computational Infrastructure
Part of the UCSD CRBS Center for Research on Biological Structure
UCSD is the IT and Telecomm Integration Center
Average File Transfer: ~10-50 Mbps (see the transfer-time estimate below)
NCRR BIRN Site Rack (figure, one per site): Network Attached Storage (1-10 TB), Cisco 4006 Router, Grid POP, Network Stats, GigE Net Probe, UPS, Encryption
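To see why the ~10-50 Mbps shared-Internet rates above are limiting for racks holding 1-10 TB, a rough transfer-time estimate helps. This is a minimal sketch using the slide's figures; it ignores protocol overhead, contention, and disk speed, so real times would be longer.

```python
# Rough bulk-transfer times for BIRN-scale data at shared-Internet rates
# (illustrative only; overhead and contention are ignored).
def transfer_days(terabytes: float, mbps: float) -> float:
    """Days needed to move `terabytes` of data at a sustained rate of `mbps`."""
    bits = terabytes * 1e12 * 8          # TB -> bits (decimal units)
    seconds = bits / (mbps * 1e6)
    return seconds / 86400

for rate_mbps in (10, 50):
    print(f"1 TB at {rate_mbps} Mbps: ~{transfer_days(1, rate_mbps):.1f} days")
# 1 TB at 10 Mbps: ~9.3 days
# 1 TB at 50 Mbps: ~1.9 days
```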
Large Hadron Collider Cyberinfrastructure
Communications of the ACM, Volume 46, Issue 11 (November 2003)
From Shared Internet to Dedicated Lightpipes: Enabling the “I” in ORION
Source: Tom West, CEO NLR
“National Lambda Rail” Partnership Serves Very High-End Experimental and Research Applications
4 x 10 Gbps Wavelengths Initially; Capable of 40 x 10 Gbps Wavelengths at Build Out
www.skio.peachnet.edu/coop/materials/cora_lowres.pdf
An International-Scale Set of Dedicated Wavelengths is Operational over TransLight
European lambdas to US:
– 8 GEs Amsterdam–Chicago
– 8 GEs London–Chicago
Canadian lambdas to US:
– 8 GEs Chicago–Canada–NYC
– 8 GEs Chicago–Canada–Seattle
US lambdas to Europe:
– 4 GEs Chicago–Amsterdam
– 3 GEs Chicago–CERN
European lambdas:
– 8 GEs Amsterdam–CERN
– 2 GEs Prague–Amsterdam
– 2 GEs Stockholm–Amsterdam
– 8 GEs London–Amsterdam
TransPAC lambda:
– 1 GE Chicago–Tokyo
IEEAF lambdas (shown in blue):
– 8 GEs NYC–Amsterdam
– 8 GEs Seattle–Tokyo
UKLight
CERN
NorthernLight
Source: Tom DeFanti, EVL, UIC
The OptIPuter Project – Removing Bandwidth as an Obstacle In Data Intensive Sciences
• NSF Large Information Technology Research Proposal
– UCSD and UIC Lead Campuses (Larry Smarr, PI)
– USC, UCI, SDSU, NW, Texas A&M Partnering Campuses
• Industrial Partners: IBM, Telcordia/SAIC, Chiaro, Calient
• $13.5 Million Over Five Years
• Optical IP Streams From Lab Clusters to Large Data Objects
– NIH Biomedical Informatics Research Network
– NSF EarthScope
http://ncmir.ucsd.edu/gallery.html
siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml
Removing Barriers to Earth Observing & Simulation
• One Current Barrier: The Low Throughput of Today’s Internet
• Even Though the Internet2 Backbone is 10 Gigabits per Second
– The Network is Shared Using the TCP/IP Protocol
• A Remote NASA Earth Observation System User Only Sees:
– 10-50 Mbps (May 2003) Throughput to Campuses
– Typically Over Abilene From Goddard, Langley, or EROS
– UCSD’s SIO to Goddard in May 2003 (ICESAT, CERES Satellite Data): 12.4 Mbps, About 1/1000 of the Available Backbone Speed!
• In Contrast, OptIPuter Demonstrated 9.3 Gbps on a 10 Gbps Link
– NCSA to SDSC
– Using Reliable Blast UDP (a sketch of the idea follows below)
http://www.evl.uic.edu/cavern/rg/20030817_he
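Reliable Blast UDP reaches near-line-rate on a dedicated path by streaming the payload as UDP datagrams without per-packet acknowledgements, then using a separate reliable control channel to report which datagrams were lost so only those are re-blasted. The Python sketch below simulates that loop under stated assumptions; the datagram size, loss rate, and function names are illustrative, and this is not the EVL implementation, which uses real UDP data sockets plus a TCP control connection.

```python
# A minimal, simulated sketch of the Reliable Blast UDP idea: blast all
# datagrams without waiting for ACKs, learn which were lost via a reliable
# control channel, and re-blast only the missing ones until none remain.
import random

DATAGRAM_SIZE = 1500          # bytes per datagram (assumed for illustration)
LOSS_RATE = 0.02              # simulated probability of losing a datagram

def blast(payload: bytes, wanted: set[int], received: dict[int, bytes]) -> None:
    """Send every still-missing datagram once, without per-packet ACKs."""
    for seq in sorted(wanted):
        chunk = payload[seq * DATAGRAM_SIZE:(seq + 1) * DATAGRAM_SIZE]
        if random.random() > LOSS_RATE:      # datagram survives the network
            received[seq] = chunk

def rbudp_transfer(payload: bytes) -> bytes:
    total = (len(payload) + DATAGRAM_SIZE - 1) // DATAGRAM_SIZE
    received: dict[int, bytes] = {}
    missing = set(range(total))
    rounds = 0
    while missing:
        blast(payload, missing, received)
        # Control channel step: receiver reports which datagrams never arrived.
        missing = set(range(total)) - received.keys()
        rounds += 1
    print(f"delivered {total} datagrams in {rounds} blast round(s)")
    return b"".join(received[i] for i in range(total))

if __name__ == "__main__":
    data = random.randbytes(1000 * DATAGRAM_SIZE)   # ~1.5 MB test payload
    assert rbudp_transfer(data) == data
```

Because a dedicated lightpath drops few datagrams, the loop typically finishes in one or two rounds, which is how the approach keeps throughput close to the link rate rather than the much lower rates seen over the shared Internet.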
The UCSD OptIPuter Deployment: Prototyping a Campus-Scale OptIPuter by Linking Scalable Linux Clusters
Campus map (figure): dedicated fiber on a roughly half-mile scale links OptIPuter sites across UCSD (SIO, SDSC and the SDSC Annex, CRCA, Physical Sciences-Keck, SOM, JSOE, Preuss, 6th College, Earth Sciences, Medicine, and Engineering) to the CENIC collocation point.
Source: Phil Papadopoulos, SDSC; Greg Hidley, Cal-(IT)2
Figure: clusters at the SDSC Annex and other campus sites are linked by a Juniper T320 (0.320 Tbps backplane bandwidth) and a Chiaro Estara (6.4 Tbps backplane bandwidth, 20x greater) over dedicated fiber between sites, 2 miles and 0.01 ms apart.
Ultra-Resolution Displays Driven by Graphics Clusters for Ocean Sciences Imaging
Emmi Ito, U. Minnesota; Frank Rack, Joint Oceanographic Institutes;
Jason Leigh, EVL UIC
Geography Underground
Earth Sciences
Interactive 3D APPLICATIONS:
How Can We Make Scientific Discovery as Engaging as Video Games?
Source: Mike Bailey, Rozeanne Steckler, SDSC
GeoWall Linked by Fiber Optics to SIO
6-Week Earth Sciences Unit Aligned to State Standards