Cover Feature

Solving Einstein’s Equations on Supercomputers

Gabrielle Allen, Tom Goodale, Gerd Lanfermann, Thomas Radke, and Edward Seidel, Max Planck Institute for Gravitational Physics
Werner Benger, Hans-Christian Hege, and Andre Merzky, Konrad Zuse Center for Information Technology, Berlin
Joan Massó, University of the Balearic Islands
John Shalf, National Center for Supercomputing Applications

Globally distributed scientific teams, linked to the most powerful supercomputers, are running visual simulations of Einstein’s equations on the gravitational effects of colliding black holes.

Computer, December 1999. 0018-9162/99/$10.00 © 1999 IEEE.

In 1916, Albert Einstein published his famous general theory of relativity, which contains the rules of gravity and provides the basis for modern theories of astrophysics and cosmology. It describes phenomena on all scales in the universe, from compact objects such as black holes, neutron stars, and supernovae to large-scale structure formation such as that involved in creating the distribution of clusters of galaxies. For many years, physicists, astrophysicists, and mathematicians have striven to develop techniques for unlocking the secrets contained in Einstein’s theory of gravity; more recently, computational-science research groups have added their expertise to the endeavor.

Those who study these objects face a daunting challenge: The equations are among the most complicated seen in mathematical physics. Together, they form a set of 10 coupled, nonlinear, hyperbolic-elliptic partial differential equations that contain many thousands of terms. Despite more than 80 years of intense analytical study, these equations have yielded only a handful of special solutions relevant for astrophysics and cosmology, giving only tantalizing snapshots of the dynamics that occur in our universe.

Scientists have gradually realized that numerical studies of Einstein’s equations will play an essential role in uncovering the full picture. Realizing that this work requires new tools, developers have created computer programs to investigate gravity, giving birth to the field of numerical relativity. Progress was initially slow, owing to the complexity of the equations, the lack of computer resources, and the wide variety of numerical algorithms, computational techniques, and underlying physics needed to solve these equations.

A realistic 3D simulation based on the full Einstein equations is an enormous task: A single simulation of coalescing neutron stars or black holes would require more than a teraflop (one trillion floating-point operations per second) for reasonable performance, and a terabyte of memory. Even if the computers needed to perform such work existed, we must still increase our understanding of the underlying physics, mathematics, numerical algorithms, high-speed networking, and computational science. These cross-disciplinary requirements ensure that no single group of researchers has the expertise to solve the full problem. Further, many traditional relativists trained in the more mathematical aspects of the problem lack expertise in computational science, as do many astrophysicists trained in the techniques of numerical relativity. We need an effective community framework for bringing together experts from disparate disciplines and geographically distributed locations, one that will enable mathematical relativists, astrophysicists, and computer scientists to work together on this immensely difficult problem.
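A rough back-of-the-envelope estimate shows where the terabyte figure comes from. Assuming, purely for illustration, a uniform 1,000 × 1,000 × 1,000 grid with on the order of 100 double-precision variables per point (metric components, extrinsic curvature, gauge quantities, and temporaries), the memory requirement is

1000^3 \times 100 \times 8\ \text{bytes} = 8 \times 10^{11}\ \text{bytes} \approx 0.8\ \text{TB}.

The grid size and variable count are assumptions, not measurements from any particular code, but they convey the scale of the problem.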

Because the underlying scientific project provides such a demanding and rich system for computational science, techniques developed to solve Einstein’s equations will apply immediately to a large family of scientific and engineering problems. We have developed a collaborative computational framework that allows remote monitoring and visualization of simulations, at the center of which lies a community code called Cactus (see the “Cactus’s Modular, Extensible Code” sidebar). Many researchers in the general scientific computing community have already adopted Cactus, as have numerical relativists and astrophysicists. This past June, an international team of researchers at various sites around the world ran, over several weeks, some of the largest such simulations in numerical relativity yet undertaken, using a 256-processor SGI Origin 2000 supercomputer at NCSA (see http://access.ncsa.uiuc.edu/Features/o2krun/ for a description).

VISUALIZING RELATIVITY

The scientific visualization of Einstein’s equations as numerical simulations presents us with many challenges. The fundamental objects that describe the gravitational field form components of high-rank, four-dimensional tensors. Not only do the tensor components depend on the coordinates in which they are written, and thus have no direct physical meaning themselves, they are also numerous: The quantity describing space-time curvature has 256 components. Further, interpreting these components is not straightforward: Pure gravity is nothing more than curved “vacuum,” or empty space without matter. Standard intuition in terms of, say, fluid quantities such as density or velocity does not carry over easily to relativity. Thus, new paradigms for visualization must be developed. The many quantities needed to describe the gravitational field require an enormous volume of data that must be output, visualized, and analyzed.
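To make the component count concrete: the Riemann curvature tensor R_{abcd} in four dimensions has 4^4 = 256 components, but its symmetries,

R_{abcd} = -R_{bacd} = -R_{abdc} = R_{cdab}, \qquad R_{a[bcd]} = 0,

reduce the number of algebraically independent components to n^2(n^2-1)/12 = 20 for n = 4. Even so, these 20 functions must be stored, evolved, and interpreted at every grid point.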

Figure 1 shows one of the first detailed simulations depicting a full 3D merger of two spinning and orbiting black holes of unequal mass. (The “Black Holes and Gravitational Waves” sidebar describes black holes in greater detail.) We define the black holes by the surfaces of their respective horizons; in our case, and for rather complex reasons, we compute the apparent horizon. The horizon forms the critical 2D surface that separates everything trapped forever inside from everything that could possibly escape. This surface responds to local variations in the gravitational field by distorting and oscillating.

Sidebar: Cactus’s Modular, Extensible Code

Our experiences with highly complex numerical relativity simulation codes have led us to create a new paradigm for developing and reusing numerical software in a collaborative, portable environment. Cactus embodies this new approach, which can be applied to all areas of computational science and engineering. Cactus lets you extend traditional single-processor codes to full-blown parallel applications that can run on all supercomputers, but that can still be developed on a laptop. Cactus also provides access to a myriad of computational tools, such as advanced numerical techniques, adaptive mesh refinement, parallel I/O, live and remote visualization, and remote steering. We’re also working to integrate computational technologies such as metacomputing toolkits like Globus, and performance and steering tools like Autopilot, described elsewhere in this issue. The various computational layers and functionalities, as well as the different scientific modules, can be selected at runtime via parameter choices.

Design principles

The name “Cactus” comes from the initial design principle of a module set (the thorns) that could be plugged easily into a basic kernel, the flesh. The flesh has evolved into a metacode with its own language to enable dynamic assembly of many different application codes from a set of thorns. The flesh controls how the thorns work together, coordinating dataflow between them and deciding when any routine should be executed.

Developers build Cactus applications dynamically, using a metacode with a new object-oriented language that describes how the different pieces of code, written in any common computational language such as C, C++, Fortran 77, and Fortran 90, interweave.

Our desire to provide a solid framework for collaborative code use and development heavily influenced Cactus’s design principles. The modular approach enables both scientists and computational scientists to design, use, and add thorns that work together through the flesh API. Cactus lets you apply collaborative considerations not only to the code itself, but to the wider working environment. Thorn mailing lists, a versioning system, test-suite checking technology, and an open bug-tracking system help connect and inform users. To better address these collaborative needs, we made the flesh and computational tool thorns open source so that all developers can access them. In this sense, Cactus represents a huge collaboration between all users in all fields.
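To make the flesh-and-thorn pattern concrete, here is a minimal sketch in C of the registration-and-scheduling idea. Every name in it (Thorn, flesh_register, flesh_run, the toy wave thorn) is invented for illustration; the real Cactus flesh API and its configuration language look quite different.

/*
 * Minimal sketch of the flesh/thorn idea. The "flesh" owns the
 * schedule and calls the registered "thorn" routines in order.
 * All names are hypothetical, not the actual Cactus API.
 */
#include <stdio.h>

typedef struct {
    const char *name;
    void (*initial)(double *grid, int n);  /* set up initial data   */
    void (*evolve)(double *grid, int n);   /* advance one time step */
} Thorn;

#define MAX_THORNS 16
static Thorn thorns[MAX_THORNS];
static int nthorns = 0;

static void flesh_register(Thorn t) {
    if (nthorns < MAX_THORNS) thorns[nthorns++] = t;
}

static void flesh_run(double *grid, int n, int steps) {
    for (int i = 0; i < nthorns; i++) thorns[i].initial(grid, n);
    for (int s = 0; s < steps; s++)
        for (int i = 0; i < nthorns; i++) thorns[i].evolve(grid, n);
}

/* One toy "thorn": a trivial smoothing update standing in for physics. */
static void wave_initial(double *g, int n) {
    for (int i = 0; i < n; i++) g[i] = 0.0;
    g[n / 2] = 1.0;
}
static void wave_evolve(double *g, int n) {
    for (int i = 1; i < n - 1; i++) g[i] = 0.5 * (g[i - 1] + g[i + 1]);
}

int main(void) {
    double grid[64];
    flesh_register((Thorn){"wave", wave_initial, wave_evolve});
    flesh_run(grid, 64, 10);
    printf("center value after 10 steps: %g\n", grid[32]);
    return 0;
}

The point of the pattern is that the physics routine knows nothing about who calls it or what runs beside it; the flesh decides the order, which is what lets independently developed modules be assembled at runtime.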

Cactus synergies

It’s not always clear in emerging research areas, such as numerical relativity, what techniques will be best suited to solving a given scientific problem. Further, both the software and hardware used to run advanced scientific simulations change rapidly. These factors, combined with the complexity of the work undertaken, result in huge simulations that require intensive and often widely distributed collaboration.

To support such work, we needed an environment such as that which Cactus provides: a flexible, extensible, and portable computational infrastructure into which different components, from different communities, could be plugged. Although costly and labor-intensive, implementing such a system provides unique synergies. For example, while a physicist may develop a new formulation of Einstein’s equations, a numerical analyst may have a more efficient algorithm for solving a required elliptic equation, a computer scientist may develop a more efficient parallel I/O layer, a software team may develop a new paradigm for parallel programming, a hardware vendor may develop a more efficient parallel-computer architecture, and so on. All such developments take place independently on relatively short time scales, and all may significantly affect the scientific simulation.

Without the ability to effortlessly connect such components and move about in “architecture space,” projects can be delayed repeatedly for component retooling. Cactus allows the computational application scientist, such as a physicist or engineer, to assimilate the most appropriate available components into a solution tailored specifically to the problem at hand, and to execute a simulation using the best computer architecture available. You can find out more about Cactus at http://www.cactuscode.org.


In Figure 1, the two black holes have just merged in a grazing collision; the merged single black hole horizon, with its two black holes inside, is at the center. The collision emits a burst of gravitational waves that radiates away toward infinity.

The content of gravitational radiation in the system can be calculated from the 10 independent components of the so-called Riemann curvature tensor (in turn represented by five complex numbers Ψ0, …, Ψ4). For simplicity, we refer loosely to the real and imaginary parts of the quantity as the two independent physical polarizations of the gravitational waves. In Figure 1, the real part of Ψ4 shows how some energy from the cataclysm radiates away. (Most of the visualizations in this article, as well as the cover image, were produced with Amira, described in the “Amira” sidebar.)
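For reference, in a common convention (sign conventions vary between formulations) the two polarizations relate to Ψ4 through the second time derivatives of the wave strains:

\Psi_4 = \ddot{h}_+ - i\,\ddot{h}_\times,

so the real part tracks the plus polarization and the imaginary part the cross polarization, which is the loose identification used in the text.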

To fully evaluate the science in this complex simulation, we must examine all aspects of the gravitational waves and other quantities. Figure 2 shows the other polarization of the wave field, the imaginary part of Ψ4. This additional component of the gravitational radiation does not occur in previously computed axisymmetric collisions and shows interesting phenomena uniquely associated with 3D calculations.

Figure 1. Gravitational waves from a full 3D grazing merger of two black holes. The image shows the objects immediately after the coalescence, when the two holes (seen just inside) have merged to form a single, larger hole at the center. The distortions of the horizon (the Gaussian curvature of the surface) appear as a color map, while the resulting burst of gravitational waves (the even-parity polarization or real part of Ψ4) appears in red and yellow.

Sidebar: Black Holes and Gravitational Waves

Among the most exotic objects believed to populate the universe, black holes have not yet been directly observed. They are typically expected to form when a very massive star exhausts its nuclear fuel and then collapses in on itself. This process may lead to a supernova explosion, the creation of a compact object called a neutron star, or both phenomena. But if the star is massive enough, its stronger self-gravity will pull it into a runaway collapse that continues until all the original star’s mass and energy collapse to a point of infinite density. This event leaves behind a black hole in space. Travel too close to the black hole and you reach the point of no return: No signal, including light, can escape if caught within an imaginary surface called the horizon.

Many stars in the universe pair into binary systems: two stars that orbit each other. Some fraction of these binaries will be very massive stars; a fraction of these massive stars will ultimately collapse and form black holes; a yet smaller fraction of black holes will form binary black-hole pairs that orbit each other. These exotic orbiting holes of intense gravity would go around each other forever, but another novel prediction of Einstein’s theory, gravitational radiation, slowly but relentlessly drains the system of energy and angular momentum. The gravitational waves form ripples in space-time itself, traveling at the speed of light. Their strength is related to the strength of the gravitational field, its time rate of change, and its nonsphericity. Hence, as the binary holes orbit, they emit a steady stream of waves that cause their orbit to decay. After eons pass, the orbit decays almost completely, and the holes approach each other in a violent final plunge, as shown in Figure A, spiraling wildly together and creating the larger, rapidly spinning black hole and a burst of radiation. A worldwide effort is under way to develop the computing capability and algorithms needed to solve those of Einstein’s equations that describe this process. Achieving this goal will help us interpret and detect such events using the new gravitational-wave observatories being constructed on several continents.
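The rate of this decay follows, at leading (quadrupole) order, from Peters’s classic result for a circular binary of masses m1 and m2 with separation a:

\frac{dE}{dt} = -\frac{32}{5}\,\frac{G^4}{c^5}\,\frac{m_1^2 m_2^2 (m_1+m_2)}{a^5}.

The steep a^{-5} dependence is why the inspiral creeps along for eons and then ends in the violent final plunge described above; only that last, strong-field phase requires solving the full Einstein equations numerically.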

Einstein’s theory predicts that gravitational waves, although normally weak, may, if they are strong enough and exposed to certain conditions, collapse under their own self-gravity and form black holes! Such a scenario seems unlikely in normal astrophysical processes, but can be simulated on a computer as a way to probe Einstein’s theory itself. In the main text we show examples of such calculations.

Figure A. Anatomy of a black-hole collision. The visualization depicts a 3D-simulated head-on collision between equal-mass black holes. The (event) horizon surface appears green, while the isosurfaces of the gravitational waves emitted appear as yellow and purple surfaces.


Studying the horizon surface of the black hole by itself provides further insights. The coordinate location or coordinate shape of the horizon does not reflect its physical properties; those properties change with different coordinate choices. However, the (Gaussian) curvature of the surface is an intrinsic property of the horizon’s true geometry and thus does reflect its physical properties. We can compute this quantity and color-map it to the surface of the horizon to understand how deformed it becomes during the collision. Figure 3 shows the results.
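A useful sanity check on any such computation comes from the Gauss-Bonnet theorem: for a closed horizon surface H of spherical topology, the total curvature is fixed,

\oint_{H} K\, dA = 2\pi\chi = 4\pi,

so however distorted the surface becomes, regions of unusually high Gaussian curvature must be balanced by regions of lower curvature elsewhere on the horizon.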

Figure 2. Additional polarization of gravitational waves (imaginary part of Ψ4) from a 3D merging collision of two black holes. The merged single black-hole horizon can just be seen through the cloud of radiation emitted in the process.

Sidebar: Amira

Amira is a 3D visualization and modeling system developed at the Konrad Zuse Center for Information Technology to support interactive applications in science, medicine, and engineering. Designed for nonexperts, Amira balances complexity with ease of use and should be accessible to a broad range of users. Complex tasks require more familiarity but can be achieved by experimenting with the interactive facilities.

Amira does not adhere to the dataflow concept; instead, the user loads data and simply activates appropriate display or computational modules via context-sensitive menus. Instantiated objects and their mutual dependencies appear automatically as a graphical network. By modifying this representation and selecting suitable parameters, users can control the visualization process with minimal interaction.

Amira displays its visual results in one or several graphics windows that allow users to interact directly, in three dimensions, with the depicted data objects. Amira can also arbitrarily combine different display techniques in a single window. The system’s 3D graphics have been implemented using Open Inventor and OpenGL. Graphics hardware, if available, can be used to display very large data sets at interactive speed. Although typically used interactively, Amira’s functions can also be scripted via a Tcl-command interface.

The system currently comprises about 25 different data-object types and more than 100 modules that can be dynamically loaded at runtime. Visualization techniques include hardware-accelerated volume rendering for display of scalar fields, and advanced vector-field visualization algorithms, like the illuminated streamlines shown in Figure B. (For more on illuminated streamlines, see http://www.zib.de/Visual/projects/vector.) The illuminated streamlines greatly improve spatial perception of complex 3D vector fields. Amira intrinsically supports various grid types, like hexahedral, curvilinear, and unstructured tetrahedral grids. AMR data types for numerical relativity data, adapted to those in Cactus, are currently under development. (For more on Amira, see http://www.amiravis.com.)

Figure B. Collision of neutron stars showing isolevels of matter density and illuminated integral lines of the momentum vector field. The field lines and proper illumination greatly improve spatial perception of complex scenes.

Figure 3. Close-up of the horizon, with Gaussian curvature to show the distorted nature of the surface, of a black hole formed by the collision of two black holes. The two individual horizon surfaces can just be seen inside the larger horizon formed during the collision process.


The fabric of space-time may also change depending on the initial concentration of pure gravitational waves. A weak concentration lets the evolving gravitational radiation just “flow away,” like shallow ripples on a pond. However, a large-enough energy concentration in the gravitational wave can implode violently to create a black hole with, again, small ripples flowing away. At this point, comparing the gravitational waves’ field indicators and the oscillations of the Gaussian curvature of the newly formed black hole can be especially informative. For the simulation presented here, the black hole assumes for its final state a static spherical “Schwarzschild black hole,” but the initial black hole that forms will be highly distorted. These distortions radiate away during the evolution. We thus expect to see gravitational radiation correlated with the structures of the Gaussian curvature. By displaying the positive and negative values of the Ψ4 scalar in full 3D volume, using special volume-rendering techniques and overlaying the apparent horizon, we can make this correlation visible, as Figures 4 and 5 show.

TOWARD SCIENTIFIC PORTALS FOR LARGE-SCALE COLLABORATIVE SIMULATION

Keeping such a widely distributed group of researchers working together requires infrastructure support for collaboration and resource sharing. This emerging world of metacomputing, now referred to as the Grid, brings together extraordinary computing resources, wherever they may be located in the world, to attack extraordinary scientific problems that cannot be solved by other means.

Although distributed resources offer several advantages, there are downsides as well. Even something as simple as running a supercomputing job becomes a huge task when you and the supercomputer are on different continents. The enormous terabyte data sets produced by these simulations particularly tax bandwidth limits. Even with the best available international networking resources, downloading the data from the simulation run may take longer than it took to run the simulation itself. To load terabyte data sets for interactive visual analysis, the workstation assigned that task must be nearly as powerful as the supercomputer performing the simulation. These delays and resource limitations can be costly to research when a simulation goes awry or researchers select a nonoptimal set of simulation parameters. These problems have motivated many of our remote monitoring and steering efforts, an endeavor greatly simplified by Cactus’s modular architecture.

From a visualization standpoint, traditional researchers remain essentially blind while the supercomputer runs a simulation. This limitation becomes intolerable when we consider the hours or days that large-scale problems run and the exceedingly expensive computational resources these problems consume. If we equate these limitations to cooking a roast without a timer or thermometer, computational monitoring provides the “window into the oven” we need to control these simulations. Our own efforts in this area have focused on parallel geometry extraction inline with the simulation code, using, initially, the Cactus isosurfacer thorn. This approach allowed us to remotely visualize a large (300 × 300 × 300) data set running on a T3E in Germany using an Electronic Visualization Laboratory (EVL) ImmersaDesk VR system in San Jose, Calif., over a rather meager 5-megabits-per-second network connection.
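The arithmetic behind these bandwidth worries is stark. Shipping a one-terabyte data set over a 5-megabits-per-second link like the one just mentioned would take

\frac{8 \times 10^{12}\ \text{bits}}{5 \times 10^{6}\ \text{bits/s}} = 1.6 \times 10^{6}\ \text{s} \approx 18.5\ \text{days},

which is why extracting compact geometry in line with the simulation (an isosurface is orders of magnitude smaller than the volume data it summarizes) pays off so handsomely.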

Remote and distributed visualization simply decouples the parallel visualization pipeline from the simulation code.

Figure 4. Gravitational waves and horizon of a newly formed black hole, caused by massive collapse of a strong gravitational wave on itself. The dotted surface shows the horizon, where green colors indicate high curvature and yellow means zero curvature. The highest curvature indicates the largest gravitational radiation. The “distortion” (the nonsphericity) of the black hole radiates away over time, in accordance with mathematical theorems about black holes.

Figure 5. Horizon of a gravitational wave that implodes to form a black hole, with leftover waves escaping, shown at a later time in the same evolution presented in Figure 4. We see that the horizon curvature is affected by, and correlated with, the evolving gravitational wave.


The enormous data sizes generated by these simulation codes overwhelm even the largest of our workstations. The most desirable way around this challenge is to put the supercomputer that generated the data set to work as a “visualization supercomputer.” Doing so

• takes advantage of the data locality as well as the supercomputer’s disk I/O bandwidth and memory capacity to speed the loading of these data sets, and

• lets us bring to bear the parallel processing capabilities of these remote supercomputing resources to accelerate expensive visualization algorithms.

The data transport to the client must employ compression techniques (lossless and lossy) for geometries, textures, and images. Further, it must adapt to bandwidth and client capabilities by offering, for example, progressive transmission and multilevel resolution. To this end, we are experimenting with adding parallel computing and offscreen rendering extensions to popular open-source visualization packages like VTK. We’re also adding the protocols needed to convey remote-visualization control information for the DFN Gigabit Testbed project.
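As one concrete example of lossy geometry compression, the sketch below quantizes vertex coordinates to 16 bits within a known bounding box, halving the per-vertex payload relative to 32-bit floats. This is a generic technique offered for illustration, not necessarily the scheme used in the DFN testbed work.

/*
 * Lossy vertex quantization: a generic sketch of the kind of
 * geometry compression discussed above.
 */
#include <stdint.h>
#include <stdio.h>

/* Map a coordinate inside [lo, hi] to a 16-bit integer. */
static uint16_t quantize(float x, float lo, float hi) {
    float t = (x - lo) / (hi - lo);          /* normalize to [0,1] */
    return (uint16_t)(t * 65535.0f + 0.5f);  /* round to 16 bits   */
}

/* Recover an approximation of the original coordinate. */
static float dequantize(uint16_t q, float lo, float hi) {
    return lo + ((float)q / 65535.0f) * (hi - lo);
}

int main(void) {
    float lo = -10.0f, hi = 10.0f;  /* bounding box of the mesh */
    float x = 3.14159f;
    uint16_t q = quantize(x, lo, hi);
    printf("%f -> %u -> %f\n", x, q, dequantize(q, lo, hi));
    /* Worst-case error is half a quantum: (hi-lo)/65535/2 ~ 1.5e-4 */
    return 0;
}

Applied to a triangle mesh, three 16-bit coordinates replace three 32-bit floats per vertex, and coarser quantizations can serve as the early levels of a progressive transmission.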

Naturally, we would like to do more than watch our code “roasting in the oven.” Computational steering takes monitoring one step further by allowing you to manipulate the simulation as it runs. This can be useful when some event in the simulation requires a decision about the physical or computational parameters during runtime. For example, adaptive mesh refinement may exhaust a machine’s memory, and the researcher will be asked to choose between ignoring less interesting portions of the simulation or bringing online more computing resources to handle the additional load.

The Cactus “http” thorn turns the simulation code into a Web server that allows anyone to monitor all aspects of the running code using a Web browser. Its steering capabilities are currently limited to login-and-password authenticated code access to kill the simulation, so that we can, metaphorically, stop the oven when we see smoke. In the future, the steering capabilities will be expanded to allow manipulation of many parameters. We have also begun to work with Autopilot, which will let us use various sensors and actuators for steering and for visualization of the code performance on distributed computers (see the article by Eric Shaffer and colleagues on pages 44-51).
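The toy server below sketches the underlying idea, a simulation process answering HTTP requests about its own state. It is illustrative only, built on plain POSIX sockets with a hypothetical port and status variable; it shares no code or interface with the actual http thorn and omits its authentication and steering hooks.

/*
 * Toy embedded status server: the "window into the oven" idea.
 * Not the Cactus http thorn; a minimal sketch for illustration.
 */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

static int iteration = 0;  /* would be updated by the evolution loop */

int main(void) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);               /* hypothetical port */
    bind(srv, (struct sockaddr *)&addr, sizeof addr);
    listen(srv, 4);

    for (;;) {
        int cli = accept(srv, NULL, NULL);
        if (cli < 0) continue;
        char req[1024];
        if (read(cli, req, sizeof req) < 0) {  /* request is ignored */
            close(cli);
            continue;
        }
        char body[256], resp[512];
        int blen = snprintf(body, sizeof body,
            "<html><body>iteration %d</body></html>\n", iteration);
        int rlen = snprintf(resp, sizeof resp,
            "HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n"
            "Content-Length: %d\r\n\r\n%s", blen, body);
        write(cli, resp, rlen);
        close(cli);
        iteration++;  /* stand-in for simulation progress */
    }
}

In a real code the server would run alongside the evolution loop and report live grid diagnostics rather than a counter, but the architectural point is the same: the simulation itself is the monitoring endpoint, so no separate data export is needed.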

Perhaps the supercomputer visualization community has restricted itself too much by considering theory, computation, and visualization as separate entities. Doing so keeps everyone’s work area compartmentalized and discourages close interactions between each stage in the pipeline. Portals may let us do away with this partitioning and offer more open access to each piece of this scientific puzzle. A portal is a scientific workbench widely accessible through networking and optimized for a particular set of problems or a problem-solving paradigm.

We are just beginning work on a Collaborative Science Portal that will conveniently tie together interfaces for composing codes from community-developed repositories. This will provide a single interface to assemble a simulation code and resources, control the running code, and visualize the results, either while it is running or afterward. We need to incorporate new collaborative technologies such as CavernSoft that provide an infrastructure that supports multiway collaborative visual data analysis. The Cactus code’s modular nature makes possible this level of service integration and lets us meet the needs of our worldwide distributed band of researchers. We firmly believe that most major scientific codes will be viewed through some form of portal interface and that we will find visualization systems increasingly embedded in a ubiquitous service-based Grid fabric. ❖

Acknowledgments

We thank Ralf Kähler, Hermann Lederer, Jason Novotny, Manuel Panea, Warren Smith, Detlev Stalling, Ulrich Schwenn, Meghan Thornton, Paul Walker, Malte Zöckler, and Ian Foster for many important contributions to the computational science presented in this article, and Miguel Alcubierre, Steve Brandt, Bernd Brügmann, Lars Nerger, Wai-Mo Suen, and Ryoji Takahashi for many contributions to the science presented here. This work was supported by NCSA, the NSF, the Max Planck Institute for Gravitational Physics, the Konrad Zuse Center for Information Technology, and the DFN-Verein. Our international networking exploits would not have been possible without assistance and applications support from the Star Tap high-performance network access point.

Gabrielle Allen is a Cactus developer at the Max Planck Institute for Gravitational Physics. She received a PhD in astrophysics from the University of Wales and has worked in the field of numerical relativity.

Tom Goodale is a PhD student at the Max Planck Institute for Gravitational Physics, where he is a Cactus developer. His research interests include general relativistic hydrodynamics, parallel computational physics, and adaptive mesh refinement. He received an MSc in computational fluid dynamics from Cranfield University after working as a software engineer developing simulation code for the oil and gas industry.

Gerd Lanfermann is a PhD student at the Max Planck Institute for Gravitational Physics, where he is a member of the Cactus Development Team. He received a diploma degree in theoretical physics from the Free University, Berlin. His research interests include parallel and distributed computing.

Thomas Radke is a staff member at the Max Planck Institute for Gravitational Physics. He is involved in the TIKSL project, a German gigabit-testbed research project on the teleimmersion of black hole collisions. His research interests include multithreading, message-passing programming, and Linux kernel development. He received a diploma in information technology from the University of Technology, Chemnitz.

Edward Seidel is a professor at the Max Planck Institute for Gravitational Physics, holds adjunct positions in the Astronomy and Physics Departments at the University of Illinois, Urbana-Champaign, and is a senior research scientist at the National Center for Supercomputing Applications. His research interests include general relativity, astrophysics, and computational science. He received a PhD in relativistic astrophysics from Yale.

Werner Benger is a researcher at the Konrad Zuse Center for Information Technology in cooperation with the Max Planck Institute for Gravitational Physics. He works on visualization techniques for general relativity, including tensor-field visualization and relativistic ray tracing. He received a Magister degree from the University of Innsbruck, Austria. His images and movies have appeared in various journals and television broadcasts.

Hans-Christian Hege is the director of the Scientific Visualization Department at the Konrad Zuse Center for Information Technology and is the managing director of Indeed-Visual Concepts, a company specializing in visualization software and hardware. He studied physics at Free University, Berlin, and was a research assistant in quantum field theory and numerical physics.

Andre Merzky is a researcher at the Konrad Zuse Center for Information Technology. He studied elementary particle physics in Leipzig, Edinburgh, and Berlin. His research interests include distributed computing, remote steering, and visualization.

Joan Massó is an associate professor in the Department of Physics at the University of the Balearic Islands, where he is also the head of supercomputing. He is affiliated with the National Center for Supercomputing Applications and with the Department of Physics at Washington University. His research interests include numerical relativity, scientific and high-performance computing, and software development. He received a PhD in theoretical physics from the University of the Balearic Islands.

John Shalf works for the National Center for Supercomputing Applications’ Visualization and Virtual Environments group as well as the Star Tap high-performance network access point. His degree is in electrical engineering, but he has contributed to a variety of projects in networking, distributed computing, and application code frameworks.

Contact the authors at [email protected].
