Not just for particle physicists: the accelerator also attracts agronomists, archaeologists and engineers | Image: Aurel Märki

Born during the Second World War with the development of radar and the atomic bomb, ‘Big Science’ has since taken the form of particle accelerators beneath our feet, telescope networks in the desert and observatories orbiting above our heads. Giant laboratories house colossal instruments in projects headed by governments and groups of states, probing the origins of the universe and carrying physics and astronomy along with them. “Big science, hallelujah!” sang Laurie Anderson in 1982, introducing the expression ‘Big Science’ into American pop culture. ‘Hallelujah!’, indeed: an almost sacred aura now surrounds team research built on titanic infrastructure.

“Biomedical research is now the field that draws the most people to this kind of infrastructure, and it attracts funding like no other”. Catherine Westfall

Undoubtedly, it has been fruitful. The development of the Standard Model, a ‘theory of almost everything’ that describes matter and the fundamental forces with the help of a dozen or so elementary particles, was made possible by CERN and the American Fermilab. And it was the Hubble Space Telescope that provided us with stunning images of the expansion of the cosmos, the gravitational effects of black holes and the nurseries where stars form. Cultural prestige and unbridled grandeur, once the preserve of military might, are now hallmarks of science. In 1961, the physicist Alvin Weinberg saw these installations as the modern equivalent of cathedrals.

But since the turn of the millennium, this magnitude-based approach has begun to diversify. The circle of stakeholders has opened up rapidly to other disciplines and to industry, and the governance of megascience has become more democratic. The expectations and promises attached to major projects have become more concrete: we now talk about returns on investment, regional economic development, and solving the problems faced by society in the fields of energy, food and health. Finally, notions of size have become blurred: today we can conduct big science with modest means, and small science with giant synchrotrons.

Make way for the archaeologists and agronomists

“What strikes us today, when we look at a centre such as the Argonne National Laboratory in Illinois, home to one of the three largest synchrotrons in the world, is the diversity of its users”, says Catherine Westfall, a science historian at Michigan State University and author of several studies on the future of megascience. Once dedicated exclusively to particle physics, large accelerators have since considerably broadened their scope of application. “I’ve met agronomists who were developing seeds, archaeologists looking for new techniques for their digs, someone who wanted to build a jet engine less vulnerable to bird strikes...”. Particle beams are increasingly being used to probe the practical properties of matter, be it metals or proteins. “Biomedical research is today the field that draws the most people to this kind of infrastructure; it’s at the forefront and it attracts funding like no other”.

In this process, the link between mega infrastructure and research projects becomes blurred. Olof Hallonsten, a sociologist and science historian at Sweden’s Lund University and the author of a scholarly paper on the metamorphoses of Big Science, invokes two moments to illustrate this shift. “In 1984, Carlo Rubbia and Simon van der Meer received the Nobel Prize in Physics for the discovery of the W and Z bosons. Of course, Nobel laureates are always individuals, but in this case, through them, the prize landed in the lap of the research centre where the discovery took place: CERN”. The situation 25 years later was different: “In 2009, it was the molecular biologist Ada Yonath and her fellow laureates who received the Nobel Prize in Chemistry for their work on the ribosome. Half a dozen research centres around the world rushed to issue press releases explaining that it was their large equipment that had made this work possible”.

New billion-euro projects
With its Future and Emerging Technologies (FET) Flagships, the European Commission is supporting the bottom-up development of Big Science. Scientists themselves propose projects with a budget of around EUR 1 billion. A new call for proposals is underway in the areas of the connected society, health and the environment. Among the 33 proposals selected are two initiatives that strongly involve EPFL: Time Machine, which aims to convert historical archives into simulations of the past, and Health EU, which intends to develop digital avatars on which to test personalised medical care.

Following a selection carried out by scientific, political and industrial leaders, one or two projects will be launched in 2020. EPFL already coordinates the Human Brain Project, launched in 2013 alongside the second flagship, Graphene. The Quantum Flagship on quantum technologies begins in 2019.

More and more often, we do “small science with big tools”, says Hallonsten. The kind of research centre once created around a megaproject is giving way to large platforms that are not dedicated to a single mission but oriented towards users’ needs. “Most of the scientists who use these centres also have a job and a source of funding elsewhere. There they usually work on small equipment, but occasionally they need a very large machine. So they apply for access, and if they’re lucky and the machine isn’t oversubscribed, they’re granted it. They conduct their experiment and leave with the results. In this model, large accelerators or reactors are no longer the prerogative of teams whose mission is to win a war or investigate the origin of the universe. They become a resource that tends to be open to everyone. It is therefore a more democratic model, less over-determined by political or military decisions”.

The age of consortia

This decoupling of project size from infrastructure size also works in the opposite direction. We can conduct big science, mobilising big budgets and pursuing big goals, without needing big tools. This is the case with the Human Genome Project, and in fact quite broadly in biology. For a while, however, the discipline “tried to imitate physics in a race to become a giant”, says Bruno Strasser, a professor of the history of science at the University of Geneva and author of several studies on the history of the biomedical sciences, big data and participatory research. “When the European Molecular Biology Laboratory (EMBL) was established in Heidelberg in 1974, it was hoped it would work in the same way as CERN did for the physicists, i.e., around a central question and with a scientific instrument too expensive for any single university laboratory. In reality, this centralisation was not necessary, because biology did not use large instruments. In fact, in biology, good science is small science, according to the formula of the American biochemist Bruce Alberts. The EMBL researchers therefore spent their time trying to justify having the infrastructure”. “When DNA sequencing became widespread in the second half of the 1970s, the EMBL jumped at the opportunity, which finally appeared to be a means to legitimise its existence: this led to the establishment in 1982 of the first public database of genomic sequences, the Nucleotide Sequence Database”. Although sequencing does not require large tools, it does produce ‘big data’ requiring large infrastructure.

In 1990, the Human Genome Project (HGP) began the complete sequencing of human DNA. Public proclamations highlighted the magnitude of the objective and its medical potential. In 2000, the US President Bill Clinton declared it “quite possible that, for our children’s children, cancer will be no more than a constellation of the zodiac”. Presented as a Big Science project, the HGP actually took a form adapted to the modest scale of biology laboratories. “It differs from the CERN model, which concentrates resources in a single place, closed like a microcosm and populated like a small town”, notes Strasser. “The HGP, on the other hand, operates in a fragmented way, with research carried out in a large number of institutions that come together for the occasion in an international consortium”.

“When you receive money from the same source, you tend to talk more to each other”. Bruno Strasser

This logic is now being pushed even further. Strasser cites the case of SystemsX, a Swiss systems biology initiative led by a multi-site research consortium and, to date, the largest scientific project in the country. Launched in 2008 and wound up in 2018, it neither relied on giant infrastructure nor aimed for a single major goal. Rather, it was a cluster of projects presenting a common front, as shown in the thesis defended at ETH Zurich in 2017 by Alban Frei. “But SystemsX’s coordination and self-presentation are based on megascience”, says Strasser. “The initiative has full-time communication officers who are responsible for its image. It allows researchers to present their work in a new light that can reach the public and politicians. It’s very clever in terms of dialogue with the rest of society and fundraising”.

The return of amateurs

Joining forces to achieve critical mass and boost visibility: is it purely a branding exercise? “On the one hand, one could say that the SystemsX operation does not meet a scientific need”, says Strasser. “After all, no one really knows what systems biology is... But what it is doing is stimulating interesting research. It encourages exchanges, because when you receive money from the same source, you tend to talk more to each other. There will hopefully be cross-fertilisation among the projects, which will benefit from a common framework in which the diverse parts can be put together, revealing a pattern”.

“The circular buildings are grandiose and majestic. It’s an ideal setting for politicians to shake hands under the cameras’ watchful eye”. Olof Hallonsten

According to Strasser, by moving from centralised to networked models, the science of large projects is returning to an earlier condition: “In the 19th century, megascience belonged to the biologists. Its research centres were the botanical gardens and natural history museums of Berlin and London. Its major projects involved exploring the globe. It was a matter of coordinating hundreds of people travelling by ship to the four corners of the planet, and of bringing together the work of people of different nationalities and cultures, including a large number of amateurs”.

Strikingly, through participatory science, non-specialists are today reappearing. “According to our estimates, some 10 million people around the world are active in this field”, says Strasser. “The areas where public participation is growing rapidly and expanding on a large scale are the same ones that historically saw significant amateur involvement, namely the natural sciences and astronomy”. We take part by categorising millions of galaxies for Galaxy Zoo, we help track the evolution of biodiversity by posting photos online, we contribute to the study of climate change by noting when leaves begin to fall... This movement represents another way of doing research on a very large scale, building a megaproject out of a myriad of small, individual contributions. “Public participation in research, which was the norm in the 18th and 19th centuries, is now returning. It may thus merely have experienced an eclipse during the 20th century, when the public was a pure consumer of scientific information and of the spectacle of science”.

Participatory data is sometimes merged with data generated at multiple sites by professional science. This is the case for the Annotathon, an online project in which participants are invited to annotate DNA sequences from Craig Venter’s Global Ocean Sampling project. “It should be noted that the production of open data is one of the side effects of megascience”, says Strasser. “This principle of openness is all the more firmly established because it results not from idealism but from necessity, particularly in the context of consortia. Coordination is impossible if each participant can withdraw their data”. Adopted as part of the Human Genome Project, the Bermuda Principles (1996) and the Fort Lauderdale Agreement (2003) established the practice of open access and the immediate release of data in genomics.

A range of possibilities

Big Science now seems to be travelling along multiple trajectories simultaneously. The example of Global Ocean Sampling reflects the ongoing blurring of boundaries in this area. This undertaking to circumnavigate the globe and survey the genetic diversity of marine microbial populations is actually deployed from the relatively modest infrastructure of a private yacht. Its promoter, Craig Venter, is both a scientist and a businessman. The funding pool includes private foundations, the television network Discovery Channel, which broadcasts the expedition, and the US Department of Energy, which hopes to find in microbes innovative solutions to national energy needs. And the Annotathon gives it a citizen-science dimension.

Elsewhere, other projects are travelling more traditional paths in pursuit of grandeur (see boxes). But in one way or another, most of them face demands for openness and diversification. The Human Brain Project has muted its dizzying initial proclamations - reproducing the functioning of the human brain, even consciousness, on a supercomputer - to focus instead on developing a technological platform in the field of neurocomputing.

The European Extreme Light Infrastructure has started building the world’s most powerful lasers without defining precise research objectives, which are left to its future users. The same is true for the European Spallation Source (ESS), currently being built in Sweden around a pulsed neutron source publicised as thirty times more powerful than its current counterparts. The ESS is part of the new paradigm of ‘small science with big tools’, which seeks out practical applications. The thrill of magnitude is triggered here not by a Herculean research goal but by the physical presence of the place, says Hallonsten: “I see the construction site from the window of my office at Lund University. The circular buildings are grandiose and majestic. It is an ideal setting for politicians to shake hands under the watchful eye of the cameras”.

According to Hallonsten, this gigantism could produce some perverse effects. “One of the risks is that investment in these imposing infrastructure projects will come at the expense of the budgets that finance the work of researchers. In Sweden, we have heard politicians telling the scientific community: ‘we put all these funds into the ESS, so you’ve had your money!’ By contrast, the Danish government – which has also invested heavily in the project – has announced that for every euro allocated to the ESS, another euro will fund scientists so that they can use the infrastructure”. Contemporary megascience has drifted far from the idea of a single trajectory; its paths are now traced in the plural, opening onto a whole range of possibilities.

Nic Ulmi is a freelance journalist and lives in Geneva.