Postgenomics: Data, data, everywhere



2001 in context

840 NATURE | VOL 414 | 20/27 DECEMBER 2001 | www.nature.com

[Image credits: E. Lander/Whitehead Institute; S. Touhig/Newsmakers]

The full ramifications may not yet have sunk in for many biologists, but a revolution is under way. For those comfortable with postgenomic techniques, a galaxy of scientific opportunities beckons.

To take advantage of this, many branches of biology may need to be transformed — moving from bench sciences dominated by competing research groups into collaborative enterprises. This would see entire research communities organized around databases and associated computational tools. In such fields, these online resources could eventually replace the conventional scientific paper as the predominant form of communication.

Although this year saw some early stages in this transition, there was first some unfinished business from 2000 — the year of genome sequences — to attend to. After months of anticipation, rival versions of the draft human genome, produced by the international Human Genome Project and Celera Genomics of Rockville, Maryland, were finally published in February.

As researchers attempt to link the sequence data with information on gene function and expression, protein production and even clinical records, a plethora of techniques are being used. From DNA microarrays for studying gene expression to advanced techniques for protein analysis, the common thread is the generation of unprecedented quantities of data: we are entering the era of high-throughput biology.

Increasingly, researchers will be able to apply sophisticated computational tools to huge data sets to model the behaviour and function of whole pathways of genes and proteins. And as more genomes become available, computational approaches will also be key to gaining insights into gene function through comparative studies.

Most biologists will need assistance if they are to get a piece of this action. But thankfully, experts are trying to help. The international Microarray Gene Expression Database Group, for instance, is developing common protocols for high-throughput


northern England, and these animals spread the disease across the country. Just over a month later, the number of infected farms had topped 700.

As the outbreak spread, it became clear that the Ministry of Agriculture, Fisheries and Food (MAFF) was unprepared for such a calamity. Understaffing left local vets struggling to keep up with new cases, and MAFF’s plans for responding to emerging data were woefully inadequate — many local veterinary offices were not even on e-mail. Indeed, MAFF never recovered from criticism of its handling of the crisis, and has since been reconstituted as the Department for Environment, Food and Rural Affairs.

In early March, three teams of independent epidemiologists started crunching the data. They forecast a ‘meltdown’ unless a much more effective culling policy was introduced. On hearing this message, the government’s chief scientific adviser, David King, wrested control of the crisis from MAFF and formulated a policy that leant heavily on the epidemiologists’ mathematical models.

The plan was to kill infected animals within 24 hours and those on surrounding farms within 48. It worked: the last case — on farm number 2,030 — was recorded on 30 September. But was the cost too high? Some veterinary scientists feel that vaccination would have limited the epidemic’s size, and complain that King paid little attention to their views. They cite the use of vaccination elsewhere, including a programme used this summer in the Netherlands to quell a nascent foot-and-mouth outbreak sparked by imports of infected animals from Britain.

The epidemiologists who advised King argue that their models show that the large size of the epidemic — and hence the huge cull needed to eradicate it — can be blamed on the initial delay in reducing the virus’s spread.
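The modellers’ claim that delay drove the epidemic’s size can be illustrated with a toy susceptible–infected–removed (SIR) calculation. The sketch below is purely schematic: the `simulate_outbreak` function and all of its parameters are invented for illustration, not fitted to the 2001 data, and stand in for the far more detailed spatial models the advisers actually used.

```python
def simulate_outbreak(n_farms=10_000, beta=0.5, cull_delay=30, days=300):
    """Toy discrete-time SIR-style model of farm-to-farm spread.

    Every parameter here is invented for illustration. From day
    `cull_delay` onwards, an intensified culling policy removes
    infected farms faster (a higher removal rate gamma).
    """
    susceptible, infected, removed = n_farms - 1.0, 1.0, 0.0
    gamma_slow, gamma_fast = 0.1, 0.6  # removal rate before/after the cull starts
    for day in range(days):
        gamma = gamma_fast if day >= cull_delay else gamma_slow
        new_infections = beta * susceptible * infected / n_farms
        new_removals = gamma * infected
        susceptible -= new_infections
        infected += new_infections - new_removals
        removed += new_removals
    return n_farms - susceptible  # farms ever infected

# Starting the same intensified cull 50 days later gives a far larger epidemic:
early = simulate_outbreak(cull_delay=10)
late = simulate_outbreak(cull_delay=60)
print(round(early), round(late))
```

Even in this crude sketch, the final epidemic size is dominated by how long the virus spreads unchecked before the intervention begins, which is the qualitative point the epidemiologists made.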

Epidemiologists argue that retrospective analyses of the British outbreak confirm that vaccination would have had little effect. They claim that any benefits from vaccination would have been outweighed by the adverse effects of moving resources away from the cull programme, and note that vaccination does not save animals from slaughter. Indeed, vaccinated livestock must be killed if exports are swiftly to be resumed, as simple tests cannot distinguish immunized from infected animals. But critics reply that the full range of vaccination strategies has yet to be modelled.

Clearly, more work is needed before there is a consensus on the strategy to be used against future outbreaks. But the wounds opened up by this year’s British epidemic may mean that considerations other than the output of mathematical models will come into play. “Culling on such a scale will not be politically acceptable,” says Mark Woolhouse, a veterinary epidemiologist at the University of Edinburgh, and one of King’s advisers. ■

Jim Giles

Dead loss: Britain’s foot-and-mouth crisis saw some four million animals slaughtered (left).

Base camp: long-promised draft sequences of the human genome finally appeared in print (right).

© 2001 Macmillan Magazines Ltd


publishing in biology, as a grassroots initiative called the Public Library of Science tried to convince publishers to deposit papers in a free-access online database six months after their initial publication. But for a vision of the future of biological communication, speak to those researchers who are most enthusiastically embracing the high-throughput approach. The Alliance for Cellular Signaling, for instance, which involves researchers from 20 institutions across the United States, sees its planned database and associated web pages as the primary means of disseminating its results. Conventional scientific papers simply cannot do the job, argue its leaders. ■

Declan Butler

problems with financial management are resolved. But ultimately the Young committee called for a fully capable station with a crew of six, clearly dedicated to science.

CERN’s problems are not in the same league as NASA’s, although its management has faced harsh criticism. Rumours that the LHC project was struggling to keep to its austere budget began circulating in the spring. After months of denials, CERN finally admitted in September that the project was likely to overspend. Its managers pointed to a lack of contingency funds, problems with the development of superconducting magnets and escalating civil-engineering costs; critics blamed financial mismanagement. Either way, CERN’s member states are in no mood to bail out the project.

CERN director-general Luciano Maiani remains optimistic that the LHC will still open on time. But how this will be achieved has yet to be determined, and strategic control of the project has effectively been handed to an external review board.

As NASA and CERN attempt to balance their books, researchers working on similarly ambitious projects are watching anxiously. In the current economic climate, they realize, talk of spiralling costs may make politicians nervous about committing money.

High-energy physicists trying to sell the next big machine after the LHC — an enormous electron–positron collider — are likely to face some awkward questions. The same goes for astronomers planning the Next Generation Space Telescope, scheduled for launch in December 2008. Its technical specification is ambitious, jumping from the 2.4-metre mirror of the Hubble Space Telescope to a proposed 8-metre reflector. Some of the project’s proponents are already wondering if it might be wise to adopt a more modest plan. ■

David Adam and Tony Reichhardt

gene-expression studies, from the preparation of samples to the visualization of microarray data. Biologists new to the technology will be able to plan their research using these standards, enter their results into a common global database, and use its software tools to query the pooled data sets.
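As a loose sketch of what querying such a pooled data set might look like, consider the fragment below. The gene names, fold-change values and the `genes_consistently_up` helper are all invented for this illustration, and bear no relation to the group’s actual schema or software tools.

```python
# Hypothetical stand-in for a pooled expression database; every record
# below is invented for illustration.
pooled_results = [
    {"gene": "TP53",  "lab": "A", "fold_change": 2.8},
    {"gene": "TP53",  "lab": "B", "fold_change": 3.1},
    {"gene": "MYC",   "lab": "A", "fold_change": 1.2},
    {"gene": "BRCA1", "lab": "B", "fold_change": 0.4},
]

def genes_consistently_up(records, threshold=2.0):
    """Return genes whose fold change meets `threshold` in every contributing lab."""
    by_gene = {}
    for record in records:
        by_gene.setdefault(record["gene"], []).append(record["fold_change"])
    return sorted(gene for gene, changes in by_gene.items()
                  if all(change >= threshold for change in changes))

print(genes_consistently_up(pooled_results))  # → ['TP53']
```

The point of common standards is precisely that a query like this can run uniformly across results contributed by different laboratories.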

Meanwhile, human-genome browsers such as Ensembl, run from the European Bioinformatics Institute in Hinxton, near Cambridge, or that hosted by the University of California, Santa Cruz, already give access to layers of comparative genomic, functional genomic, genetic, cytogenetic and clinical data, all anchored to the underlying sequence data.

This year saw a lively debate over online

[Image credits: NASA; CERN]

In the red: costs forced NASA to revise plans for the International Space Station (left), while problems with superconducting magnets (right) helped to push CERN’s Large Hadron Collider over budget.

Big science

Down to Earth with a bump

For NASA, 2001 was the year in which the space odyssey that is the International Space Station ran into serious trouble. The script is hastily being rewritten, with the goal of controlling this blockbuster’s budget.

The space station is the ultimate big-ticket project. But it was not the only such endeavour to run into serious budgetary difficulties this year. At CERN, the European high-energy physics laboratory near Geneva, a projected overspend of several hundred million dollars has left questions hanging over its next big particle accelerator, the Large Hadron Collider (LHC), which is supposed to start collecting data in 2006.

Few would dispute the judgement of incoming NASA administrator Sean O’Keefe that the space station is in “management and financial crisis”. Early in the year, NASA shocked the new US administration by saying that it would need $5 billion more than the $8 billion that had previously been budgeted to finish the project. In being handed over to O’Keefe, a budget and management expert, the space agency resembles a corporation that has gone into receivership. Financial responsibility, not technological and scientific vision, is the new priority.

Even before O’Keefe’s nomination, the decision had been taken to freeze the station at an interim stage, with a maintenance crew of three astronauts on board, until costs could be contained. A committee appointed to look into the project, headed by veteran aerospace executive Thomas Young, agreed in November that NASA should stay with a stripped-down station for two years, until
