
The UK eScience Grid(and other real Grids)

Mark Hayes

NIEeS Summer School 2003

The Grid in the UK

Pilot projects in particle physics, astronomy, medicine, bioinformatics, environmental sciences...

Contributing to international Grid software development efforts

10 regional “eScience Centres”

Some UK Grid resources

• Daresbury - loki - 64 proc Alpha cluster
• Manchester - green - 512 proc SGI Origin 3800
• Imperial - saturn - large SMP Sun
• Southampton - iridis - 400 proc Intel Linux cluster
• Rutherford Appleton Lab - hrothgar - 32 proc Intel Linux
• Cambridge - herschel - 32 proc Intel Linux cluster
• ...
• coming soon: 4x >64 CPU JISC clusters, HPC(X)

Applications on the UK Grid

Ion diffusion through radiation damaged crystal structures (Mark Calleja, Earth Sciences, Cambridge)

• Monte Carlo simulation: lots of independent runs
• small input & output
• more CPU -> higher temperatures, better stats
• access to ~100 CPUs on the UK Grid
• Condor-G client tool for farming out jobs
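The pattern above is an embarrassingly parallel parameter sweep: many independent runs, each with a small input (a seed) and a small output (a statistic). A minimal sketch in plain Python is below; the toy random-walk observable and the seed-per-run scheme are illustrative stand-ins, not the actual diffusion code, and on the real Grid each call would be a separate job farmed out via a tool such as Condor-G rather than a local loop.

```python
import random
from statistics import mean, stdev

def run_simulation(seed, n_steps=10_000):
    """One independent Monte Carlo run: a toy 1-D random walk whose
    final displacement stands in for a physical observable."""
    rng = random.Random(seed)  # per-run seed = the run's entire input
    position = 0.0
    for _ in range(n_steps):
        position += rng.choice((-1.0, 1.0))
    return position

# Farm out many independent runs. More runs (more CPUs) means
# better statistics, exactly as the slide notes.
results = [run_simulation(seed) for seed in range(100)]
print(f"runs={len(results)} mean={mean(results):.2f} stdev={stdev(results):.2f}")
```

Because the runs share nothing, they can be scattered across any mix of machines and the outputs simply aggregated afterwards, which is why this class of problem suited the early Grid so well.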

Applications on the UK Grid

GEODISE - Grid Enabled Optimisation & Design Search for Engineering (Simon Cox, Andy Keane, Hakki Eres, Southampton)

• Genetic algorithm to find the best design for satellite truss beams

• Java plugins to MATLAB for remote job submission to the Grid

• Used CPU at Belfast, Cambridge, RAL, London, Oxford & Southampton
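A genetic algorithm like GEODISE's evolves a population of candidate designs through selection, crossover, and mutation. The sketch below shows the bare mechanism on a one-parameter toy objective; the real truss-beam fitness function (and the MATLAB/Java Grid plumbing) is far richer, so treat this as the shape of the technique only, with all names and numbers invented for illustration.

```python
import random

rng = random.Random(42)  # fixed seed so the sketch is reproducible

def fitness(x):
    """Toy stand-in for a truss-design score: best 'design' is x = 3."""
    return -(x - 3.0) ** 2

def evolve(pop_size=30, generations=50):
    # Initial population: random design parameters.
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2.0            # crossover: blend two parents
            child += rng.gauss(0.0, 0.5)     # mutation: small random tweak
            children.append(child)
        pop = parents + children             # elitism: parents survive
    return max(pop, key=fitness)

best = evolve()
print(f"best design parameter: {best:.3f}")
```

Each fitness evaluation is independent, which is why GEODISE could farm the expensive design evaluations out to CPUs at six different sites.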

Applications on the UK Grid

RealityGrid (Stephen Pickles, Robin Pinning - Manchester)

• Fluid dynamics of complex mixtures, e.g. oil, water and solid particles (mud)

• Used CPU at London, Cambridge

• Remote visualisation using an SGI Onyx in Manchester (from a laptop in Sheffield)

• Computational steering
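Computational steering means adjusting a running simulation's parameters without stopping and restarting it: the simulation checks for user commands between time steps. A minimal single-process sketch is below; RealityGrid's actual steering library works over the network between a remote simulation and a client, whereas here the command queue, the coefficient name, and the toy step loop are all invented for illustration.

```python
import queue

class SteerableSimulation:
    """Sketch of computational steering: the main loop polls a command
    queue each step, so parameters can change while the run proceeds."""
    def __init__(self):
        self.commands = queue.Queue()  # in practice fed by a remote client
        self.coefficient = 1.0         # a viscosity-like tunable parameter
        self.history = []

    def steer(self, name, value):
        """Queue a parameter change to be applied at the next step."""
        self.commands.put((name, value))

    def run(self, steps):
        for step in range(steps):
            # Apply any steering commands received since the last step.
            while not self.commands.empty():
                name, value = self.commands.get()
                setattr(self, name, value)
            self.history.append((step, self.coefficient))  # do one 'step'

sim = SteerableSimulation()
sim.steer("coefficient", 0.25)  # steer before the run starts, for determinism
sim.run(steps=5)
print(sim.history)
```

The key design point is that the simulation never blocks waiting for input; it drains whatever commands have arrived and carries on, so steering costs almost nothing when the user is idle.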

Applications on the UK Grid

GENIE - Grid Enabled Integrated Earth system model (Steven Newhouse, Murtaza Gulamali - Imperial)

• Ocean-atmosphere modelling

• How does moisture transport from the atmosphere affect ocean circulation?

• ~1000 independent 4000-year runs (3 days of real time!) on ~200 CPUs

• Flocked Condor pools at London & Southampton

• Coupled modelling

Two years to get this far...

July 2001 - Regional eScience Centres funded

October 2001 - First meeting of the Grid Engineering Taskforce (biweekly meetings using Access Grid)

August 2002 - ‘Level 1’ Grid operational (simple job submission possible between sites)

April 2003 - ‘Level 2’ Grid + applications (security, monitoring, accounting)

July 2003 - ‘Level 3’ Grid: more users, more robust

The European DataGrid

• Tiered structure: Tier 0 = CERN

• Lots of their own Grid software

• Applications: particle physics, earth observation, bioinformatics

http://www.eu-datagrid.org/

NASA Information Power Grid

• First “production quality” Grid

• Linking 10 NASA & academic supercomputing sites

• Applications: computational fluid dynamics, meteorological data mining, Grid benchmarking

http://www.ipg.nasa.gov/

TeraGrid

• Linking supercomputers through a high-speed network

• 4x 10 Gbps links between SDSC, Caltech, Argonne & NCSA

• Call for proposals out for applications & users

http://www.teragrid.org/

Asia-Pacific Grid

• No central source of funding

• Informal, bottom-up approach

• Lots of experiments on benchmarking & bio apps.

http://www.apgrid.org/

What does it take to build a Grid?

• Resources - CPU, network, storage
• People - sysadmins, application developers, Grid experts
• Grid middleware - Globus, Condor, Unicore…
• Security - so you want to use my computer?
• Maintenance - ongoing monitoring, upgrades… and co-ordination of this between multiple sites
• Applications and users!

How you can get involved...

• NIEeS

• National eScience Centre (Edinburgh) http://www.nesc.ac.uk/

• NERC PhD studentships

• Your local eScience Centre

• Adopt an application!

Questions?
