Feature: From images to knowledge
Gateways to the microcosm and the macrocosm
Imaging technologies enable researchers to make things visible that would otherwise remain hidden from the human eye. We look at seven examples to see what they can achieve.

Image: Keystone / Science Photo Library / Steve Gschmeissner
Electrons shift boundaries in the micro-universe
The very first objects to be drawn with the help of simple optical microscopes were insects, over 350 years ago. One well-known example is the magnified flea drawn by the British naturalist Robert Hooke and published in his book ‘Micrographia’. The details he recorded on the body of that tiny insect revealed a world that no one had previously suspected existed. The same devices also allowed cells and bacteria to be seen for the first time.
But visible light runs into physical limits of resolution. By using electrons instead of photons, microscopes can produce images showing up to 100 times more detail. Yet as the biophysicist Jacques Dubochet, emeritus professor at the University of Lausanne, has remarked, “this is a very difficult step”, because the electron microscope needs its samples to be dried and placed in a vacuum, thereby killing any cells to be observed. “And that’s a pity for biological science!”, he says. To improve the process, he invented cryo-electron microscopy, which involves cooling samples to almost minus 200 degrees Celsius so quickly that the water in them solidifies without forming ice crystals. His method earned him the Nobel Prize in Chemistry in 2017.
The ant pictured here doesn’t take us to such limits, however. In order to analyse surfaces like this, the sample is coated with a metal such as gold and scanned, point by point, using the electron beam of a scanning electron microscope. Images with a very high depth of field can be generated from the electrons that are reflected. “This also enlarges the world that we construct in our minds”, says Dubochet.

Image: Event Horizon Telescope
A place whence no light emerges
One night in April 2017, eight radio telescopes around the globe were synchronised using atomic clocks in order to observe a dot in the sky: the supermassive black hole at the heart of the galaxy M87 in the constellation of Virgo. The amount of data that they gathered was so large that it wasn’t collated over the Internet, but brought together on specially developed hard drives that were sent by plane. Researchers then used this data to calculate the image shown here. It went viral two years later.
Naturally, it doesn’t show the black hole itself, but rather the gas that races around it at roughly a quarter of the speed of light, glowing as it does so. The black spot within the glowing ring isn’t the black hole either – that would be an invisible, infinitesimally small dot at the centre of this image – but rather its shadow. When light gets too close to the black hole, some of it is swallowed, while some is deflected so strongly that it never reaches the telescope. In his media release about the image, Heino Falcke of Radboud University in the Netherlands wrote that this shadow is “something predicted by Einstein’s general relativity that we’ve never seen before”. Incidentally, the diameter of the glowing ring of gas is roughly 700 times the distance between the Earth and our Sun.

Image: Copernicus Sentinel data (2015) / Esa
Monitoring photosynthesis from above
The satellite Sentinel-2A passes every place on Earth every five days, always at noon in local time. Since 2015, it’s been photographing the globe using 13 channels and an image resolution of up to 10 metres. “Just after it was launched, its data were very valuable. Then they became less valuable, but now, after ten years, people are finally appreciating the opportunity it gives us to make long-term comparisons using the same sensor”, says Robert Meisner, the ESA Communications Programme Officer.
This photo shows Berlin in July 2015. The contrast between the wooded Tiergarten park in the centre and the city around it is particularly striking. We can also clearly see the Grunewald and Königswald areas to the south-west of Berlin. “On this satellite image, we’ve replaced the red channel with the infrared channel. Vegetation reflects much more strongly in this part of the spectrum”, says Meisner. This is why the plants in the image aren’t green, as they usually would be.
False-colour images such as these are used by farmers, stock-market analysts and climate researchers to monitor the state of vegetation. The redder an area appears, the denser the vegetation there and the more active its photosynthesis.
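The channel swap and the "redder means more photosynthesis" rule can be sketched in a few lines of Python. This is a toy illustration, not ESA's processing chain: the reflectance values are invented, and the band names in the comments (B08 for near-infrared, B04 for red, B03 for green) simply follow Sentinel-2 naming conventions. The index computed at the end is the widely used NDVI, which contrasts infrared against red reflectance.

```python
import numpy as np

# Invented reflectance values for a 2x2-pixel scene; in practice these
# would be read from Sentinel-2 GeoTIFFs. Top row: vegetation, bottom
# row: bare ground. (B08 = near-infrared, B04 = red, B03 = green.)
nir   = np.array([[0.45, 0.50], [0.10, 0.08]])  # B08
red   = np.array([[0.05, 0.06], [0.09, 0.10]])  # B04
green = np.array([[0.08, 0.09], [0.10, 0.11]])  # B03

# False-colour composite: the infrared band takes the place of the red
# channel, so strongly reflecting vegetation shows up red in the image.
false_colour = np.stack([nir, red, green], axis=-1)

# NDVI quantifies "the redder, the more active the photosynthesis":
# healthy vegetation reflects far more infrared than red light.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))
```

Running this, the vegetated pixels come out with NDVI values near 0.8, while the bare-ground pixels sit near zero or below.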

Image: Kevin Mackenzie / University of Aberdeen / Science Photo Library
Luminous cells
A laser scans a cell, point by point. When it hits a fluorescent molecule, the molecule lights up. Complex optics ensure that only light from this exact position is recorded. In this manner, an image of the entire cell is formed bit by bit – or, to be more precise, an image of the fluorescent molecules in the cell. The rest remains dark.
“Confocal fluorescence microscopy is able to ‘cut’ an optical slice through a sample”, says Kevin Mackenzie, the former manager of the Microscopy and Histology Core Facility at the University of Aberdeen’s Institute of Medical Sciences. For this image, he coloured the different components of the cell with different fluorescent dyes: the cell nucleus blue, the cytoskeleton green and the mitochondria red. Antibodies and genetic engineering make it possible to be even more precise – to determine whether two structures come into contact with each other, for example. Individual slices can also be superimposed to create 3D images. This is possible while the cells are still alive, even if the structures are moving.
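How separately recorded optical slices can be superimposed into one picture is easy to sketch with numpy. This is only an illustration with synthetic data, not Mackenzie's actual workflow: the "z-stack" here is random numbers standing in for fluorescence intensities, and the superposition shown is a maximum-intensity projection, one common way of flattening a confocal stack into a single overview image.

```python
import numpy as np

# A synthetic confocal z-stack: 4 optical slices of 3x3 pixels each,
# every slice recording fluorescence intensity at one depth.
rng = np.random.default_rng(0)
stack = rng.random((4, 3, 3))

# Maximum-intensity projection: for each pixel, keep the brightest
# value found anywhere along the depth axis. Structures from all
# slices appear superimposed in one 2-D image.
projection = stack.max(axis=0)
print(projection.shape)
```

Real 3D renderings go further than a flat projection, but the principle is the same: every voxel of the stack carries an intensity measured at a known depth.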
In the case of the pulmonary artery cells shown here, researchers could use the method described to investigate the behaviour of mitochondria in low-oxygen conditions or to study the role of the cells in the development of cancer. Mackenzie has already analysed bee larvae for mite bites, observed bone cells with 3D glasses, and seen yeast cells forming fungal filaments. “I always liked looking at something new”, he says.

Image: Victor Shahin, Hans Oberleithner, Münster University Hospital / Science Photo Library
Topography with atomic precision
In 1986, Gerd Binnig and Heinrich Rohrer of IBM Research in Zurich were awarded the Nobel Prize in Physics for inventing the scanning tunnelling microscope. The image shown here was made with a further development of it, the atomic force microscope. In this instrument, an extremely fine metal tip scans a surface, recording its movements with atomic precision. The tip can even pull on molecules and thereby measure forces. “We’re actually looking into the heart of biology when we do that”, said Binnig in 2016. In the case shown here, we can see the pores on the surface of a cell nucleus that control which signals reach the genetic material and which emerge from it again. These pores have a diameter of one ten-thousandth of a millimetre. The whiter their colour, the higher the structure.

Image: Keystone / Science Source
Sight on steroids
“A photograph does not show the world as it is physically”, says Sabine Süsstrunk, a professor of visual representation at EPFL. “If that were so, we wouldn’t regard an image as a photograph”. Technology depicts the world as we perceive it. Ever since Bell Laboratories developed the first digital sensor in 1970, there’s been no stopping it. Whether it’s macro lenses or telephoto lenses, time-lapse or slow motion, webcams or camera traps – cheap, miniaturised cameras are everywhere today. “There probably isn’t a branch of science that doesn’t need photography in some way or other”, says Süsstrunk.
Thermal imaging cameras, for example, use the mid-infrared range to measure temperature. Thermography is typically used to test technical equipment for heat generation – such as the car engine in our image here. But it can also be used to check buildings for heat loss, and it can even help to detect cancer. Süsstrunk says: “It gives us a two-dimensional measuring device that presents us with a nice visual representation of our results instead of just the bare figures”.

Image: Keystone / Photo Library / Zephyr
Watching the brain think
Magnetic resonance imaging (MRI) is best known for examining knee joints. First, the strong magnetic field in the device aligns the protons of the hydrogen atoms in the body. Then radio waves are pulsed through the body, exciting the protons, which makes it possible to measure how many hydrogen atoms are located where. An image of a knee made in this way can reveal injuries, for example. Contrast agents can also make things such as tumours visible.
Haemoglobin in the blood is a natural contrast agent. Researchers can use it to observe which regions of the brain are being better supplied with oxygen. Lydia Hellrung, a postdoc at the University of Zurich, has made a video about the process. “We watch the test subjects while they’re thinking”, she says. She and her colleagues at the Centre for Neuroeconomics are using this method to study how people try to change their behaviour.
