A thousand colors, one galaxy: Astronomers reveal a cosmic masterpiece

Astronomers have created a galactic masterpiece: an ultra-detailed image that reveals previously unseen features in the Sculptor Galaxy. Using the European Southern Observatory's Very Large Telescope (ESO's VLT), they observed this nearby galaxy in thousands of colors simultaneously. By capturing vast amounts of data at every single location, they created a galaxy-wide snapshot of the lives of stars within Sculptor.

"Galaxies are incredibly complex systems that we are still struggling to understand," says ESO researcher Enrico Congiu, who led a new Astronomy & Astrophysics study on Sculptor. Reaching hundreds of thousands of light-years across, galaxies are extremely large, but their evolution depends on what's happening at much smaller scales. "The Sculptor Galaxy is in a sweet spot," says Congiu. "It is close enough that we can resolve its internal structure and study its building blocks with incredible detail, but at the same time, big enough that we can still see it as a whole system."

A galaxy's building blocks — stars, gas and dust — emit light at different colors. Therefore, the more shades of color there are in an image of a galaxy, the more we can learn about its inner workings. While conventional images contain only a handful of colors, this new Sculptor map comprises thousands. This tells astronomers everything they need to know about the stars, gas and dust within, such as their age, composition, and motion.

To create this map of the Sculptor Galaxy, which is 11 million light-years away and is also known as NGC 253, the researchers observed it for over 50 hours with the Multi Unit Spectroscopic Explorer (MUSE) instrument on ESO's VLT. The team had to stitch together over 100 exposures to cover an area of the galaxy about 65,000 light-years wide.
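The "thousands of colors" are easiest to picture as a spectral datacube: two spatial axes plus one wavelength axis, so every pixel carries a full spectrum. The toy NumPy sketch below uses invented numbers (it is not the MUSE data format or the ESO reduction pipeline) to show roughly how such a cube can be collapsed into a map of a single emission line, the kind of operation that turns this data into maps of gas and stars.

```python
import numpy as np

# Toy spectral datacube: two spatial axes plus one wavelength axis, so every pixel
# carries a full spectrum. Sizes, wavelengths and the line position are invented for
# illustration -- this is not the MUSE data format or the ESO reduction pipeline.
ny, nx, nlam = 80, 80, 1000
wavelengths = np.linspace(470.0, 935.0, nlam)        # nm, roughly MUSE's optical range
rng = np.random.default_rng(42)
cube = rng.normal(1.0, 0.1, size=(ny, nx, nlam))     # flat continuum plus noise

# Add a fake H-alpha-like emission line (656.3 nm) in a blob of "star-forming" pixels.
line_centre, line_width = 656.3, 1.0
yy, xx = np.mgrid[0:ny, 0:nx]
blob = np.exp(-((yy - 50) ** 2 + (xx - 30) ** 2) / (2 * 8.0 ** 2))
line_profile = np.exp(-((wavelengths - line_centre) ** 2) / (2 * line_width ** 2))
cube += 5.0 * blob[:, :, None] * line_profile[None, None, :]

# Collapse the cube over a narrow window around the line and subtract the nearby
# continuum: the result is a 2D map showing only where the ionised gas sits.
on_line = np.abs(wavelengths - line_centre) < 2.0
off_line = (np.abs(wavelengths - line_centre) > 5.0) & (np.abs(wavelengths - line_centre) < 15.0)
line_map = cube[:, :, on_line].sum(axis=2) - cube[:, :, off_line].mean(axis=2) * on_line.sum()

print(line_map.shape, round(float(line_map.max()), 2))   # the gas "blob" stands out in the line map
```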

According to co-author Kathryn Kreckel from Heidelberg University, Germany, this makes the map a potent tool: "We can zoom in to study individual regions where stars form at nearly the scale of individual stars, but we can also zoom out to study the galaxy as a whole."

In their first analysis of the data, the team uncovered around 500 planetary nebulae, regions of gas and dust cast off from dying Sun-like stars, in the Sculptor Galaxy. Co-author Fabian Scheuermann, a doctoral student at Heidelberg University, puts this number into context: "Beyond our galactic neighborhood, we usually deal with fewer than 100 detections per galaxy."

Because of the properties of planetary nebulae, they can be used as distance markers to their host galaxies. "Finding the planetary nebulae allows us to verify the distance to the galaxy — a critical piece of information on which the rest of the studies of the galaxy depend," says Adam Leroy, a professor at The Ohio State University, USA, and study co-author.

Future projects using the map will explore how gas flows, changes its composition, and forms stars all across this galaxy. "How such small processes can have such a big impact on a galaxy whose entire size is thousands of times bigger is still a mystery," says Congiu.

This research was presented in a paper accepted for publication in Astronomy & Astrophysics.

The team is composed of E. Congiu (European Southern Observatory, Chile [ESO Chile]), F. Scheuermann (Astronomisches Rechen-Institut, Zentrum für Astronomie der Universität Heidelberg, Germany [ARI-ZAH]), K. Kreckel (ARI-ZAH), A. Leroy (Department of Astronomy and Center for Cosmology and Astroparticle Physics, The Ohio State University [OSU], USA), E. Emsellem (European Southern Observatory, Germany [ESO Garching] and Univ. Lyon, Univ. Lyon1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon, France), F. Belfiore (INAF – Osservatorio Astrofisico di Arcetri, Italy), J. Hartke (Finnish Centre for Astronomy with ESO [FINCA] and Tuorla Observatory, Department of Physics and Astronomy [Tuorla], University of Turku, Finland), G. Anand (Space Telescope Science Institute, USA), O. V. Egorov (ARI-ZAH), B. Groves (International Centre for Radio Astronomy Research, University of Western Australia, Australia), T. Kravtsov (Tuorla and FINCA), D. Thilker (Department of Physics and Astronomy, The Johns Hopkins University, USA), C. Tovo (Dipartimento di Fisica e Astronomia 'G. Galilei', Università di Padova, Italy), F. Bigiel (Argelander-Institut für Astronomie, Universität Bonn, Germany), G. A. Blanc (Observatories of the Carnegie Institution for Science, USA, and Departamento de Astronomía, Universidad de Chile, Chile), A. D. Bolatto and S. A. Cronin (Department of Astronomy, University of Maryland, USA), D. A. Dale (Department of Physics and Astronomy, University of Wyoming, USA), R. McClain (OSU), J. E. Méndez-Delgado (Instituto de Astronomía, Universidad Nacional Autónoma de México, Mexico), E. K. Oakes (Department of Physics, University of Connecticut, USA), R. S. Klessen (Universität Heidelberg, Zentrum für Astronomie, Institut für Theoretische Astrophysik and Interdisziplinäres Zentrum für Wissenschaftliches Rechnen, Germany, Center for Astrophysics Harvard & Smithsonian, USA, and Elizabeth S. and Richard M. Cashin Fellow at the Radcliffe Institute for Advanced Studies at Harvard University, USA), E. Schinnerer (Max-Planck-Institut für Astronomie, Germany), T. G. Williams (Sub-department of Astrophysics, Department of Physics, University of Oxford, UK).

The European Southern Observatory (ESO) enables scientists worldwide to discover the secrets of the Universe for the benefit of all. We design, build and operate world-class observatories on the ground — which astronomers use to tackle exciting questions and spread the fascination of astronomy — and promote international collaboration for astronomy. Established as an intergovernmental organisation in 1962, today ESO is supported by 16 Member States (Austria, Belgium, Czechia, Denmark, France, Finland, Germany, Ireland, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom), along with the host state of Chile and with Australia as a Strategic Partner. ESO's headquarters and its visitor centre and planetarium, the ESO Supernova, are located close to Munich in Germany, while the Chilean Atacama Desert, a marvelous place with unique conditions to observe the sky, hosts our telescopes. ESO operates three observing sites: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope and its Very Large Telescope Interferometer, as well as survey telescopes such as VISTA. Also at Paranal ESO will host and operate the Cherenkov Telescope Array South, the world's largest and most sensitive gamma-ray observatory. Together with international partners, ESO operates ALMA on Chajnantor, a facility that observes the skies in the millimeter and submillimeter range. At Cerro Armazones, near Paranal, we are building "the world's biggest eye on the sky" — ESO's Extremely Large Telescope. From our offices in Santiago, Chile we support our operations in the country and engage with Chilean partners and society.

Materials provided by ESO. Note: Content may be edited for style and length.

Winter sea ice supercharges Southern Ocean’s CO2 uptake

New research reveals the importance of winter sea ice in the year-to-year variability of the amount of atmospheric CO2 absorbed by a region of the Southern Ocean.

In years when sea ice lasts longer in winter, the ocean will overall absorb 20% more CO2 from the atmosphere than in years when sea ice forms late or disappears early. This is because sea ice protects the ocean from strong winter winds that drive mixing between the surface of the ocean and its deeper, carbon-rich layers.

The findings, based on data collected in a coastal system along the west Antarctic Peninsula, show that what happens in winter is crucial in explaining this variability in CO2 uptake.

The study was led by scientists at the University of East Anglia (UEA), in collaboration with colleagues from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI, Germany), British Antarctic Survey (BAS, UK) and Institute of Marine Research (IMR, Norway). It is published today in the journal Communications Earth & Environment.

The global ocean takes up about a quarter of all CO2 that humans emit into the atmosphere. The Southern Ocean is responsible for about 40% of this and the researchers wanted to know why it varies so much from year to year.

Lead author Dr Elise Droste, of UEA's School of Environmental Sciences, said: "Our picture of the Southern Ocean's carbon cycle is incomplete, and so we cannot predict whether its atmospheric CO2 uptake – and therefore its contribution to climate change mitigation – will increase, decrease, or remain the same in the future.

"Whatever it does, it will affect what our climate will look like and how fast it will change. To improve predictions, our work suggests that we need to look at how sea ice affects the exchange of carbon between the deep and shallow parts of the ocean. To do this, we need more wintertime observations in the Southern Ocean."

Much of the Southern Ocean surrounding the west Antarctic Peninsula is covered by sea ice in winter, which disappears in spring and summer. In spring and summer, phytoplankton growth and melt water lead to low CO2 concentrations at the ocean surface. This allows the Southern Ocean to absorb large amounts of atmospheric CO2, significantly reducing the global impact of anthropogenic emissions.

In winter, as sea ice forms, the ocean underneath mixes with deeper waters that contain lots of 'natural' carbon that has been in the ocean for centuries. This can cause CO2 at the ocean surface to increase to the point where it can be released into the atmosphere.

Sea ice blocks a large amount of this CO2 'outgassing'. However, it is part of the natural seasonal cycle that some CO2 does escape the ocean. This seasonal balance means that the total amount of CO2 absorbed by the Southern Ocean within one year often depends on how much CO2 is absorbed in summer and how much is released in winter.
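As a back-of-the-envelope illustration of that balance (all numbers below are invented, not values from the study): if winter outgassing shrinks in proportion to how long sea ice caps the surface, the annual net uptake differs noticeably between long-ice and short-ice years.

```python
# Toy annual CO2 budget for a coastal Southern Ocean site. All numbers are invented for
# illustration and are not data from the UEA study: net uptake = summer uptake minus
# winter outgassing, with outgassing reduced in proportion to winter sea ice duration.
def annual_net_uptake(summer_uptake, max_winter_outgassing, ice_cover_fraction):
    """Fluxes in arbitrary units; ice_cover_fraction is the share of winter during
    which sea ice blocks air-sea exchange (0 = no ice, 1 = ice all winter)."""
    winter_outgassing = max_winter_outgassing * (1.0 - ice_cover_fraction)
    return summer_uptake - winter_outgassing

long_ice_year = annual_net_uptake(summer_uptake=10.0, max_winter_outgassing=4.0, ice_cover_fraction=0.9)
short_ice_year = annual_net_uptake(summer_uptake=10.0, max_winter_outgassing=4.0, ice_cover_fraction=0.5)

print(long_ice_year, short_ice_year)   # 9.6 vs 8.0 arbitrary units
print(f"{(long_ice_year / short_ice_year - 1) * 100:.0f}% more uptake in the long-ice year")
# ~20% more, in line with the difference reported in the study.
```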

"We don't have a good grasp on what is driving this year-to-year variability, which is making it difficult to fully understand the system and to improve the predictability of how the ocean's CO2uptake will change in the future," said Dr Droste. "One major reason is because we have relatively little data on the Southern Ocean, particularly in the wintertime.

"It is extremely challenging to collect observations in the harsh weather and sea conditions of the Southern Ocean, not to mention sea ice cover making much of it inaccessible, even for the strongest icebreaker. However, this study takes us a step in the right direction."

The study draws on data for 2010-2020, a time series led and maintained by BAS, which collects year-round measurements along the west Antarctic Peninsula. At Rothera, the UK's Antarctic research station, ocean scientists measured physical aspects of the seawater in Ryder Bay and collected samples for nutrient and CO2analysis, carried out at both Rothera and UEA.

Using other physical and chemical data collected at the same time, the team was able to study why years with long sea ice duration differed from those with short sea ice duration.

Dr Hugh Venables, from BAS, said: "A series of ocean scientists have wintered at Rothera on the Antarctic Peninsula to collect these and other samples, from either a small boat or a sea ice sledge, to build a unique time series of year-round oceanographic conditions for the last 25 years.

"This important result shows the importance of this winter sampling and will hopefully lead to more year-round sampling in the Southern Ocean, both by humans and autonomous technology."

Prof Dorothee Bakker, Professor in Marine Biogeochemistry at UEA, added: "The fact that this data has been collected throughout the year at the same location allows us to investigate which mechanisms are important to explain the year-to-year variability of CO2uptake by the ocean at this particular location, but we can also use these insights to better understand how the rest of the Southern Ocean works."

The study also involved scientists from the National Institute of Oceanography and Applied Geophysics (Italy) and University of Gothenburg (Sweden). It was supported by funding from the UK's Natural Environment Research Council and European Union's Horizon 2020 research and innovation program.

Materials provided by University of East Anglia. Note: Content may be edited for style and length.

Defying Darwin: Scientists discover worms rewrote their DNA to survive on land

In 1859, Darwin imagined evolution as a slow, gradual process, with species accumulating small changes over time. But even he was surprised to find the fossil record offered no missing links: the intermediate forms which should have told this story step by step were simply not there. His explanation was as uncomfortable as it was unavoidable: basically, the fossil record is an archive where most of the pages have been torn out.

In 1972, the scarcity of intermediate forms led the palaeontologists Stephen Jay Gould and Niles Eldredge to propose a provocative idea: punctuated equilibrium. According to this theory, rather than changing slowly, species remain stable for millions of years and then suddenly make rapid, radical evolutionary jumps. This model would explain why the fossil record seems so silent between species: large changes would happen suddenly and in small, isolated populations, well off the palaeontological radar. Although some fossils support this pattern, the scientific community remains divided: is this a rule of evolution, or an eye-catching exception?

Now a research team led by the Institute of Evolutionary Biology (IBE), a mixed research centre belonging to the Spanish National Research Council (CSIC) and Pompeu Fabra University (UPF), points for the first time to a mechanism of rapid, massive genomic reorganisation which could have played a part in the transition of animals from sea to land 200 million years ago. The team has shown that marine annelids (worms) reorganised their genome from top to bottom, leaving it unrecognisable, when they left the oceans. Their observations are consistent with a punctuated equilibrium model, and could indicate that not only gradual but also sudden changes in the genome occurred as these animals adapted to terrestrial settings. The genetic mechanism identified could transform our concept of animal evolution and revolutionise the established laws of genome evolution.

An unprecedented invertebrate genomic library

The team sequenced high-quality genomes of various earthworms for the first time and compared them to other closely related annelid species (leeches and bristle worms, or polychaetes). The level of precision was the same as for sequencing human genomes, although in this case starting from scratch, with no existing references for the studied species. Until now, the lack of complete genomes had prevented the study of chromosomal-level patterns and characteristics for many species, limiting research to smaller-scale phenomena – population studies of a handful of genes, rather than macroevolutionary changes at the full-genome level.

After putting together each of the genomic jigsaw puzzles, the team was able to travel back in time with great precision more than 200 million years, to when the ancestors of the sequenced species were alive. "This is an essential episode in the evolution of life on our planet, given that many species, such as worms and vertebrates, which had been living in the ocean, now ventured onto land for the first time," comments Rosa Fernández, lead researcher of the IBE's Metazoa Phylogenomics and Genome Evolution Lab.

The analysis of these genomes has revealed an unexpected result: the annelids' genomes were not transformed gradually, as Neo-Darwinian theory would predict, but in isolated explosions of deep genetic remodelling. "The enormous reorganisation of the genomes we observed in the worms as they moved from the ocean to land cannot be explained with the parsimonious mechanism Darwin proposed; our observations chime much more with Gould and Eldredge's theory of punctuated equilibrium," Fernández adds.

A radical genetic mechanism which could provide evolutionary responses

The team has discovered that marine worms broke their genome into a thousand pieces only to reconstruct it and continue their evolutionary path on land. This phenomenon challenges the models of genome evolution known to date, given that if we observe almost any species, whether a sponge, a coral, or a mammal, many of their genomic structures are almost perfectly conserved. "The entire genome of the marine worms was broken down and then reorganised in a completely random way, in a very short period on the evolutionary scale," Fernández says. "I made my team repeat the analysis again and again, because I just couldn't believe it."
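A toy way to see what "broken into a thousand pieces and randomly reassembled" means for a genome (a sketch with made-up gene labels, not the study's actual analysis): shuffle blocks of an ordered gene list and count how many neighbouring gene pairs survive, a crude stand-in for comparisons of conserved gene order.

```python
import random

# Toy "genome": an ordered list of made-up gene labels (illustration only, not real data).
random.seed(1)
genome = [f"gene_{i:04d}" for i in range(10_000)]

def shatter_and_reassemble(genes, n_fragments):
    """Cut the gene order into n_fragments blocks and splice the blocks back together in
    random order -- a crude stand-in for chromosome-scale rearrangement."""
    cuts = sorted(random.sample(range(1, len(genes)), n_fragments - 1))
    fragments = [genes[i:j] for i, j in zip([0] + cuts, cuts + [len(genes)])]
    random.shuffle(fragments)
    return [g for fragment in fragments for g in fragment]

def conserved_adjacencies(original, rearranged):
    """Fraction of neighbouring gene pairs in the original order that remain neighbours."""
    original_pairs = {frozenset(p) for p in zip(original, original[1:])}
    kept = sum(frozenset(p) in original_pairs for p in zip(rearranged, rearranged[1:]))
    return kept / len(original_pairs)

for n_fragments in (10, 100, 1000):
    shuffled = shatter_and_reassemble(genome, n_fragments)
    print(n_fragments, f"{conserved_adjacencies(genome, shuffled):.3f}")
# Even with 1,000 fragments the gene *content* is untouched -- only the order changes --
# which is the toy analogue of a genome that keeps working despite massive reshuffling.
```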

The reason this drastic deconstruction did not lead to extinction may lie in the 3D structure of the genome. Fernández's team has discovered that the chromosomes of these modern worms are much more flexible than those of vertebrates and other model organisms. Thanks to this flexibility, it is possible that genes in different parts of the genome could change places and continue working together.

Major changes in their DNA could have helped the worms adapt quickly to life on land, reorganising their genes to respond better to new challenges such as breathing air or being exposed to sunlight. The study suggests that these adjustments not only moved genes around, but also joined fragments that had been separated, creating new "genetic chimeras" which would have driven their evolution. "You could think that this chaos would mean the lineage would die out, but it's possible that some species' evolutionary success is based on that superpower," comments Fernández.

The observations in the study are consistent with a punctuated equilibrium model, where we observe an explosion of genomic changes after a long period of stability. However, the lack of experimental data for or against — in this case, 200-million-year-old fossils — makes it difficult to validate this theory.

Chromosomal chaos: problem or solution?

It seems from this study that conserving the genomic structure at the linear level — i.e., where the genes are more or less in the same place in different species — may not be as essential as had been thought. "In fact, stability could be the exception and not the rule in animals, which could benefit from a more fluid genome," Fernández says.

This phenomenon of extreme genetic reorganisation had previously been observed in the progression of cancer in humans. The term chromoanagenesis covers several mechanisms which break down and reorganise chromosomes in cancerous cells, where we see similar changes to those observed in the earthworms. The only difference is that while these genomic breakdowns and reorganisations are tolerated by the worms, in humans they lead to diseases. The results of this study open the door to a better understanding of the potency of this radical genomic mechanism, with implications for human health.

The study has also reawakened one of the liveliest scientific debates of our time. "Both visions, Darwin's and Gould's, are compatible and complementary. While Neo-Darwinism can explain the evolution of populations perfectly, it has not yet been able to explain some exceptional and crucial episodes in the history of life on Earth, such as the initial explosion of animal life in the oceans over 500 million years ago, or the transition from the sea to land 200 million years ago in the case of earthworms," Fernández notes. "This is where the punctuated equilibrium theory could offer some answers."

In the future, a larger investigation of the genomic architecture of less-studied invertebrates could shed light on the genomic mechanisms shaping the evolution of the species. "There is a great diversity we know nothing about, hidden in the invertebrates, and studying them could bring new discoveries about the diversity and plasticity of genomic organisation, and challenge dogmas on how we think genomes are organised," Fernández concludes.

The study involved the collaboration of research staff from the Universitat Autònoma de Barcelona, Trinity College, the Universidad Complutense de Madrid, the University of Köln, and the Université Libre de Bruxelles.

The study received support from SEA2LAND (Starting Grant funded by the European Research Council), and from the Catalan Biogenome Project, which funded the sequencing of one of the worm genomes.

Materials provided by Spanish National Research Council (CSIC). Note: Content may be edited for style and length.

You hear the beep, but can’t find the car: The hidden flaw in electric vehicle safety

As electric cars become more common, vulnerable road users are encountering more and more warning signals from them. Now, new research from Chalmers University of Technology in Sweden shows that one of the most common signal types is very difficult for humans to locate, especially when multiple similar vehicles are in motion simultaneously.

In a recently published study, researchers from Chalmers investigated how well people can locate three common types of warning (or AVAS, Acoustic Vehicle Alerting System) signals from hybrid and electric vehicles moving at low speeds. The researchers' tests showed that all the signal types were harder to locate than the sound of an internal combustion engine. For one of the signals, the majority of test subjects were unable to distinguish the direction of the sound or determine whether they were hearing one, two or more vehicles simultaneously.

"The requirements placed on car manufacturers relate to detection, or detectability, not about locating sound direction or the number of vehicles involved. But if you imagine, say, a supermarket carpark, it's not inconceivable that several similar car models with the same AVAS signal will be moving at the same time and in different directions," says Leon Müller, a doctoral student at the Department of Architecture and Civil Engineering at Chalmers.

Today's electric and hybrid vehicles meet the requirements set for acoustic warning systems according to international standards. In Europe, plus China and Japan, for example, vehicles travelling at a speed below 20 kph must emit a warning signal consisting of tones or noise, to allow pedestrians, cyclists and other non-car users to detect them. In the United States, warning signals are required from vehicles traveling at speeds of up to 30 kph.

"The way the requirements are worded allows car manufacturers to design their own signature sounds. These warning signals are often tested without the complication of background noise. But in a real traffic environment there are usually many different types of sound," says Wolfgang Kropp, professor of acoustics at the Department of Architecture and Civil Engineering at Chalmers.

Trying multiple different signals

The experiments involved 52 test subjects and were conducted in Chalmers' acoustics laboratory in soundproofed, anechoic chambers. The aim of the tests was to emulate real conditions in, say, larger carparks. The subject was placed at the center of the room and surrounded by 24 loudspeakers placed in a ring at chest height. Three types of simulated vehicle sounds were played on the loudspeakers, corresponding to the signals from one, two or more electric and hybrid vehicles, plus an internal combustion engine. One of the signals consisted of two tones, one had multiple tones and one was just noise. The test subjects heard a vehicle warning signal from about 7.5 meters away, mixed with pre-recorded background noise from a quiet city carpark. When they heard the signal, the subjects had to mark the direction it was coming from as quickly as possible. The signal comprising two tones coming from three vehicles simultaneously was the most difficult and none of the test subjects managed to locate all the two-tone signals within the ten-second time limit.
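For a feel of the geometry and how a response might be scored (assumed details for illustration, not the published Chalmers protocol): 24 evenly spaced loudspeakers sit 15 degrees apart, and each answer can be scored as the smallest angular difference from the true source direction.

```python
# 24 loudspeakers evenly spaced in a ring around the listener, as in the Chalmers setup.
# The scoring below is an assumed illustration, not the analysis published in the study.
N_SPEAKERS = 24
speaker_azimuths = [i * 360 / N_SPEAKERS for i in range(N_SPEAKERS)]   # 0, 15, 30, ... degrees

def angular_error(true_azimuth, reported_azimuth):
    """Smallest absolute angle (degrees) between the true and the reported direction."""
    diff = abs(true_azimuth - reported_azimuth) % 360
    return min(diff, 360 - diff)

# Hypothetical trial: a two-tone AVAS signal plays from the speaker at 105 degrees,
# but the subject points to 270 degrees (a severe mislocalisation).
true_direction = speaker_azimuths[7]                       # 105 degrees
print(angular_error(true_direction, 270.0))                # 165 degrees off
print(angular_error(true_direction, 105.0))                # 0 degrees: correctly localised
```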

The test subjects were easily able to locate the sound corresponding to an internal combustion engine. Leon Müller says this sound consists of short pulses comprising all frequencies; something that is easier for the ear to perceive than a fixed tone at a single frequency. The fact that people can more easily perceive this type of sound may also be because of its familiarity.

"Naturally, as acousticians, we welcome the fact that electric cars are significantly quieter than internal combustion engines but it's important to find a balance," says Müller.

Existing research has focused mainly on detectability and what is usually referred to as "detection distance." No previous studies have investigated what happens when two or three cars emit the same type of signal. The researchers see a major need for further knowledge of how people react in traffic situations involving electric vehicles.

"From a traffic safety point of view, it would be desirable to find a signal that's as effective as possible in terms of detection and localisation but which doesn't affect people negatively; something our previous research has shown to be true of traffic noise," says Kropp.

In a follow-up study, the researchers have begun investigating how AVAS signals are perceived and what effect they may have on non-road users.

The article, Auditory Localization of Multiple Stationary Electric Vehicles, is published in The Journal of the Acoustical Society of America.

The authors are Leon Müller, Jens Forssén and Wolfgang Kropp, all working at the Division of Engineering Acoustics, Department of Architecture and Civil Engineering at Chalmers University of Technology in Sweden.

Materials provided by Chalmers University of Technology. Note: Content may be edited for style and length.

How can we make fewer mistakes? US Navy invests $860k in placekeeping

A team of cognitive psychologists from the Michigan State University Department of Psychology has received an $860,000 grant from the Office of Naval Research to develop assessments for identifying people who are good at performing complex procedural tasks, even under challenging conditions like sleep deprivation and frequent interruptions.

"If we develop the right tools, we can identify people who are going to be better at performing a wide range of procedures. This is important because Navy personnel are increasingly called upon to do lots of different tasks as military systems become more complex," said Erik Altmann, lead investigator of the study and professor in MSU's psychology department. "The goal is to get the right person in the right job at the right time."

This multiyear study will look at individual differences in placekeeping, which is the cognitive ability to remember what step you are on in a procedural sequence. The researchers will also test whether incorporating task interruptions during training can help personnel develop cognitive strategies for placekeeping during deployment, when personnel may be sleep-deprived.

"We know that under conditions of sleep deprivation, people make more procedural errors, especially when they're interrupted in the middle of a task. Procedural errors can be catastrophic, so the Navy is interested in reducing them," said Altmann.

The research team, which also includes cognitive psychologists Kimberly Fenn and Zach Hambrick, has been funded since 2016 by the Office of Naval Research, with past studies looking at multitasking and the effect of sleep on cognitive performance.

The results of this study could improve personnel selection and classification in the Navy and in other fields where procedural accuracy is critical; results also could suggest approaches to training that make people more resilient to effects of stressors like sleep deprivation and task interruption.

Materials provided by Michigan State University. Original written by Shelly DeJong. Note: Content may be edited for style and length.

The AI that writes climate-friendly cement recipes in seconds

AI paves the way towards green cement

The cement industry produces around eight percent of global CO2 emissions – more than the entire aviation sector worldwide. Researchers at the Paul Scherrer Institute PSI have developed an AI-based model that helps to accelerate the discovery of new cement formulations that could yield the same material quality with a better carbon footprint.

The rotary kilns in cement plants are heated to a scorching 1,400 degrees Celsius to burn ground limestone down to clinker, the raw material for ready-to-use cement. Unsurprisingly, such temperatures typically can't be achieved with electricity alone. They are the result of energy-intensive combustion processes that emit large amounts of carbon dioxide (CO2). What may be surprising, however, is that the combustion process accounts for far less than half of these emissions. The majority comes from the raw materials needed to produce clinker and cement: CO2 that is chemically bound in the limestone is released during its transformation in the high-temperature kilns.

One promising strategy for reducing emissions is to modify the cement recipe itself – replacing some of the clinker with alternative cementitious materials. That is exactly what an interdisciplinary team in the Laboratory for Waste Management in PSI's Center for Nuclear Engineering and Sciences has been investigating. Instead of relying solely on time-consuming experiments or complex simulations, the researchers developed a modelling approach based on machine learning. "This allows us to simulate and optimise cement formulations so that they emit significantly less CO2 while maintaining the same high level of mechanical performance," explains mathematician Romana Boiger, first author of the study. "Instead of testing thousands of variations in the lab, we can use our model to generate practical recipe suggestions within seconds – it's like having a digital cookbook for climate-friendly cement."

With their novel approach, the researchers were able to selectively filter out those cement formulations that could meet the desired criteria. "The range of possibilities for the material composition – which ultimately determines the final properties – is extraordinarily vast," says Nikolaos Prasianakis, head of the Transport Mechanisms Research Group at PSI, who was the initiator and co-author of the study. "Our method allows us to significantly accelerate the development cycle by selecting promising candidates for further experimental investigation." The results of the study were published in the journal Materials and Structures.

Industrial by-products such as slag from iron production and fly ash from coal-fired power plants are already being used to partially replace clinker in cement formulations and thus reduce CO2 emissions. However, the global demand for cement is so enormous that these materials alone cannot meet the need. "What we need is the right combination of materials that are available in large quantities and from which high-quality, reliable cement can be produced," says John Provis, head of the Cement Systems Research Group at PSI and co-author of the study.

Finding such combinations, however, is challenging: "Cement is basically a mineral binding agent – in concrete, we use cement, water, and gravel to artificially create minerals that hold the entire material together," Provis explains. "You could say we're doing geology in fast motion." This geology – or rather, the set of physical processes behind it – is enormously complex, and modelling it on a computer is correspondingly computationally intensive and expensive. That is why the research team is relying on artificial intelligence.

AI as computational accelerator

Artificial neural networks are computer models that are trained, using existing data, to speed up complex calculations. During training, the network is fed a known data set and learns from it by adjusting the relative strength or "weighting" of its internal connections so that it can quickly and reliably predict similar relationships. This weighting serves as a kind of shortcut – a faster alternative to otherwise computationally intensive physical modelling.

The researchers at PSI also made use of such a neural network. They themselves generated the data required for training: "With the help of the open-source thermodynamic modelling software GEMS, developed at PSI, we calculated – for various cement formulations – which minerals form during hardening and which geochemical processes take place," explains Nikolaos Prasianakis. By combining these results with experimental data and mechanical models, the researchers were able to derive a reliable indicator for mechanical properties – and thus for the material quality of the cement. For each component used, they also applied a corresponding CO2 factor, a specific emission value that made it possible to determine the total CO2 emissions. "That was a very complex and computationally intensive modelling exercise," the scientist says.

But it was worth the effort – with the data generated in this way, the AI model was able to learn. "Instead of seconds or minutes, the trained neural network can now calculate mechanical properties for an arbitrary cement recipe in milliseconds – that is, around a thousand times faster than with traditional modelling," Boiger explains.

How can this AI now be used to find optimal cement formulations – with the lowest possible CO2 emissions and high material quality? One possibility would be to try out various formulations, use the AI model to calculate their properties, and then select the best variants. A more efficient approach, however, is to reverse the process. Instead of trying out all options, ask the question the other way around: Which cement composition meets the desired specifications regarding CO2 balance and material quality?

Both the mechanical properties and the CO2 emissions depend directly on the recipe. "Viewed mathematically, both variables are functions of the composition – if this changes, the respective properties also change," the mathematician explains. To determine an optimal recipe, the researchers formulate the problem as a mathematical optimisation task: They are looking for a composition that simultaneously maximises mechanical properties and minimises CO2 emissions. "Basically, we are looking for a maximum and a minimum – from this we can directly deduce the desired formulation," the mathematician says.

To find the solution, the team integrated an additional AI technique into the workflow: genetic algorithms, computer-assisted methods inspired by natural selection. This enabled them to selectively identify formulations that best combine the two target variables.

The advantage of this "reverse approach": You no longer have to blindly test countless recipes and then evaluate their resulting properties; instead you can specifically search for those that meet specific desired criteria – in this case, maximum mechanical properties with minimum CO2 emissions.

Interdisciplinary approach with great potential

Among the cement formulations identified by the researchers, there are already some promising candidates. "Some of these formulations have real potential," says John Provis, "not only in terms of CO2 reduction and quality, but also in terms of practical feasibility in production." To complete the development cycle, however, the recipes must first be tested in the laboratory. "We're not going to build a tower with them right away without testing them first," Nikolaos Prasianakis says with a smile.

The study primarily serves as a proof of concept – that is, as evidence that promising formulations can be identified purely by mathematical calculation. "We can extend our AI modelling tool as required and integrate additional aspects, such as the production or availability of raw materials, or where the building material is to be used – for example, in a marine environment, where cement and concrete behave differently, or even in the desert," says Romana Boiger. Nikolaos Prasianakis is already looking ahead: "This is just the beginning. The time savings offered by such a general workflow are enormous – making it a very promising approach for all sorts of material and system designs."

Without the interdisciplinary background of the researchers, the project would never have come to fruition: "We needed cement chemists, thermodynamics experts, AI specialists – and a team that could bring all of this together," Prasianakis says. "Added to this was the important exchange with other research institutions such as EMPA within the framework of the SCENE project." SCENE (the Swiss Centre of Excellence on Net Zero Emissions) is an interdisciplinary research programme that aims to develop scientifically sound solutions for drastically reducing greenhouse gas emissions in industry and the energy supply. The study was carried out as part of this project.

Materials provided by Paul Scherrer Institute. Original written by Benjamin A. Senn. Note: Content may be edited for style and length.

Massive thread of hot gas found linking galaxies — and it’s 10 times the mass of the Milky Way

Astronomers have discovered a huge filament of hot gas bridging four galaxy clusters. At 10 times the mass of our galaxy, the thread could contain some of the Universe's 'missing' matter, addressing a decades-long mystery.

The astronomers used the European Space Agency's XMM-Newton and JAXA's Suzaku X-ray space telescopes to make the discovery.

Over one-third of the 'normal' matter in the local Universe – the visible stuff making up stars, planets, galaxies, life – is missing. It hasn't yet been seen, but it's needed to make our models of the cosmos work properly.

These models suggest that this elusive matter might exist in long strings of gas, or filaments, bridging the densest pockets of space. While we've spotted filaments before, it's tricky to make out their properties; they're typically faint, making it difficult to isolate their light from that of any galaxies, black holes, and other objects lying nearby.

New research is now one of the first to do just this, finding and accurately characterizing a single filament of hot gas stretching between four clusters of galaxies in the nearby Universe.

"For the first time, our results closely match what we see in our leading model of the cosmos – something that's not happened before," says lead researcher Konstantinos Migkas of Leiden Observatory in the Netherlands. "It seems that the simulations were right all along."

Clocking in at over 10 million degrees, the filament contains around 10 times the mass of the Milky Way and connects four galaxy clusters: two on one end, two on the other. All are part of the Shapley Supercluster, a collection of more than 8000 galaxies that forms one of the most massive structures in the nearby Universe.

The filament stretches diagonally away from us through the supercluster for 23 million light-years, the equivalent of traversing the Milky Way end to end around 230 times.

Konstantinos and colleagues characterized the filament by combining X-ray observations from XMM-Newton and Suzaku, and digging into optical data from several other telescopes.

The two X-ray telescopes were ideal partners. Suzaku mapped the filament's faint X-ray light over a wide region of space, while XMM-Newton pinpointed very precisely contaminating sources of X-rays – namely, supermassive black holes – lying within the filament.

"Thanks to XMM-Newton we could identify and remove these cosmic contaminants, so we knew we were looking at the gas in the filament and nothing else," adds co-author Florian Pacaud of the University of Bonn, Germany. "Our approach was really successful, and reveals that the filament is exactly as we'd expect from our best large-scale simulations of the Universe."

As well as revealing a huge and previously unseen thread of matter running through the nearby cosmos, the finding shows how some of the densest and most extreme structures in the Universe – galaxy clusters – are connected over colossal distances.

It also sheds light on the very nature of the 'cosmic web', the vast, invisible cobweb of filaments that underpins the structure of everything we see around us.

"This research is a great example of collaboration between telescopes, and creates a new benchmark for how to spot the light coming from the faint filaments of the cosmic web," adds Norbert Schartel, ESA XMM-Newton Project Scientist.

"More fundamentally, it reinforces our standard model of the cosmos and validates decades of simulations: it seems that the 'missing' matter may truly be lurking in hard-to-see threads woven across the Universe."

Piecing together an accurate picture of the cosmic web is the domain of ESA's Euclid mission. Launched in 2023, Euclid is exploring this web's structure and history. The mission is also digging deep into the nature of dark matter and dark energy – neither of which has ever been observed, despite accounting for a whopping 95% of the Universe – and working with other dark Universe detectives to solve some of the biggest and longest-standing cosmic mysteries.

Materials provided by European Space Agency. Note: Content may be edited for style and length.

Microscopic heist: How lung bacteria forge weapons to steal iron and survive

Bacteria of the genus Pandoraea have not been studied much to date. Their name is reminiscent of Pandora's box from Greek mythology, which is a symbol of uncontrollable dangers. "We have been working with an antibiotic-resistant bacterium," says Elena Herzog. She is the first author of the publication and works as a doctoral researcher in the team of Christian Hertweck, the head of the study at the Leibniz Institute for Natural Product Research and Infection Biology (Leibniz-HKI). However, like so many things in nature, these pathogenic bacteria do not only have negative properties. "Pandoraea bacteria not only harbor risks. They also produce natural products with an antibacterial effect."

Despite the high health risk posed by Pandoraea, their molecular properties were hardly known until now. "We only knew that these bacteria occur in nature and that they can be pathogenic because they have been found in the lung microbiome of patients with cystic fibrosis or sepsis," explains Herzog.

As for most living organisms, iron is also essential for bacteria. "Iron plays a central role in enzymes and the respiratory chain of living organisms, for example," explains Herzog. Particularly in iron-poor environments such as the human body, the conditions for sufficient absorption of the element are anything but ideal. Many microorganisms therefore produce so-called siderophores: small molecules that bind iron from the environment and transport it into the cell.

"However, there were no known virulence or niche factors in thePandoraeabacteria that could help them survive," says Herzog. The research team therefore wanted to find out howPandoraeastrains can survive in such a competitive environment.

Using bioinformatic analyses, the team identified a previously unknown gene cluster called pan. It codes for a non-ribosomal peptide synthetase – a typical enzyme for the production of siderophores. "We started with a gene cluster analysis and specifically searched for genes that could be responsible for the production of siderophores," reports Herzog.

Through targeted inactivation of genes as well as culture-based methods and state-of-the-art analytical techniques – including mass spectrometry, NMR spectroscopy, chemical degradation and derivatization – the researchers from Jena succeeded in isolating two new natural products and elucidating their chemical structure: Pandorabactin A and B. Both are able to complex iron and could play an important role in how Pandoraea strains survive in difficult environments. "The molecules help the bacteria to take up iron when it is scarce in their environment," says Herzog.

Bioassays have also shown that pandorabactins inhibit the growth of other bacteria such as Pseudomonas, Mycobacterium and Stenotrophomonas by removing iron from these competitors.

Analyses of sputum samples from the lungs of cystic fibrosis patients further revealed that the detection of the pan gene cluster correlates with changes in the lung microbiome. Pandorabactins could therefore have a direct influence on microbial communities in diseased lungs.

"However, it is still too early to derive medical applications from these findings," emphasizes Herzog. Nevertheless, the discovery provides important information on the survival strategies of bacteria of the genusPandoraeaand on the complex competition for vital resources in the human body.

The study was carried out in close cooperation between the Leibniz-HKI and the universities of Jena, Heidelberg and Hong Kong. It was conducted as part of the "Balance of the Microverse" Cluster of Excellence and the ChemBioSys Collaborative Research Center and was funded by the German Research Foundation. The imaging mass spectrometer used for the analyses was funded by the Free State of Thuringia and co-financed by the European Union.

Materials provided by Leibniz Institute for Natural Product Research and Infection Biology. Note: Content may be edited for style and length.

Thinking AI models emit 50x more CO2—and often for nothing

No matter which questions we ask an AI, the model will come up with an answer. To produce this information – regardless of whether the answer is correct or not – the model uses tokens. Tokens are words or parts of words that are converted into a string of numbers that can be processed by the LLM.

This conversion, as well as other computing processes, produces CO2 emissions. Many users, however, are unaware of the substantial carbon footprint associated with these technologies. Now, researchers in Germany have measured and compared the CO2 emissions of different, already trained, LLMs using a set of standardized questions.

"The environmental impact of questioning trained LLMs is strongly determined by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions," said first author Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences and first author of theFrontiers in Communicationstudy. "We found that reasoning-enabled models produced up to 50 times more CO2 emissions than concise response models."

'Thinking' AI causes most emissions

The researchers evaluated 14 LLMs ranging from seven to 72 billion parameters on 1,000 benchmark questions across diverse subjects. Parameters determine how LLMs learn and process information.

Reasoning models, on average, created 543.5 'thinking' tokens per question, whereas concise models required just 37.7 tokens per question. Thinking tokens are additional tokens that reasoning LLMs generate before producing an answer. A higher token footprint always means higher CO2 emissions. It doesn't, however, necessarily mean the resulting answers are more correct, as elaborate detail is not always essential for correctness.

The most accurate model was the reasoning-enabled Cogito model with 70 billion parameters, reaching 84.9% accuracy. The model produced three times more CO2 emissions than similarly sized models that generated concise answers. "Currently, we see a clear accuracy-sustainability trade-off inherent in LLM technologies," said Dauner. "None of the models that kept emissions below 500 grams of CO2 equivalent achieved higher than 80% accuracy on answering the 1,000 questions correctly." CO2 equivalent is the unit used to measure the climate impact of various greenhouse gases.

Subject matter also resulted in significantly different levels of CO2 emissions. Questions that required lengthy reasoning processes, for example abstract algebra or philosophy, led to up to six times higher emissions than more straightforward subjects, like high school history.

The researchers said they hope their work will cause people to make more informed decisions about their own AI use. "Users can significantly reduce emissions by prompting AI to generate concise answers or limiting the use of high-capacity models to tasks that genuinely require that power," Dauner pointed out.

Choice of model can make a significant difference in CO2 emissions. For example, having DeepSeek R1 (70 billion parameters) answer 600,000 questions would create CO2 emissions equal to a round-trip flight from London to New York. Meanwhile, Qwen 2.5 (72 billion parameters) can answer more than three times as many questions (about 1.9 million) with similar accuracy rates while generating the same emissions.
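A quick sanity check of that comparison (the flight figure is an assumption, not a number from the study): taking a London-New York round trip as roughly 1,000 kg CO2 equivalent per passenger, the quoted question counts translate into a few grams per question.

```python
# Back-of-the-envelope check of the article's comparison. The flight emissions figure is
# an assumed value (~1 tonne CO2e per passenger, round trip); the question counts are the
# ones quoted in the article.
FLIGHT_CO2E_KG = 1000.0

questions_per_flight = {
    "DeepSeek R1 (70B, reasoning)": 600_000,
    "Qwen 2.5 (72B, concise)": 1_900_000,
}

for model, n_questions in questions_per_flight.items():
    grams_per_question = FLIGHT_CO2E_KG * 1000 / n_questions
    print(f"{model}: ~{grams_per_question:.1f} g CO2e per question")
# DeepSeek R1 comes out near 1.7 g and Qwen 2.5 near 0.5 g per question -- roughly the
# factor-of-three difference implied by "more than three times as many questions."
```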

The researchers said that their results may be impacted by the choice of hardware used in the study, an emission factor that may vary regionally depending on local energy grid mixes, and the examined models. These factors may limit the generalizability of the results.

"If users know the exact CO2 cost of their AI-generated outputs, such as casually turning themselves into an action figure, they might be more selective and thoughtful about when and how they use these technologies," Dauner concluded.

Materials provided by Frontiers. Note: Content may be edited for style and length.

How life endured the Snowball Earth: Evidence from Antarctic meltwater ponds

When the Earth froze over, where did life shelter? MIT scientists say one refuge may have been pools of melted ice that dotted the planet's icy surface.

In a study appearing in Nature Communications, the researchers report that 635 million to 720 million years ago, during periods known as "Snowball Earth," when much of the planet was covered in ice, some of our ancient cellular ancestors could have waited things out in meltwater ponds.

The scientists found that eukaryotes — complex cellular lifeforms that eventually evolved into the diverse multicellular life we see today — could have survived the global freeze by living in shallow pools of water. These small, watery oases may have persisted atop relatively shallow ice sheets present in equatorial regions. There, the ice surface could accumulate dark-colored dust and debris from below, which enhanced its ability to melt into pools. At temperatures hovering around 0 degrees Celsius, the resulting meltwater ponds could have served as habitable environments for certain forms of early complex life.

The team drew its conclusions based on an analysis of modern-day meltwater ponds. Today in Antarctica, small pools of melted ice can be found along the margins of ice sheets. The conditions along these polar ice sheets are similar to what likely existed along ice sheets near the equator during Snowball Earth.

The researchers analyzed samples from a variety of meltwater ponds located on the McMurdo Ice Shelf in an area that was first described by members of Robert Falcon Scott's 1903 expedition as "dirty ice." The MIT researchers discovered clear signatures of eukaryotic life in every pond. The communities of eukaryotes varied from pond to pond, revealing a surprising diversity of life across the setting. The team also found that salinity plays a key role in the kind of life a pond can host: Ponds that were more brackish or salty had more similar eukaryotic communities, which differed from those in ponds with fresher waters.

"We've shown that meltwater ponds are valid candidates for where early eukaryotes could have sheltered during these planet-wide glaciation events," says lead author Fatima Husain, a graduate student in MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS). "This shows us that diversity is present and possible in these sorts of settings. It's really a story of life's resilience."

The study's MIT co-authors include Schlumberger Professor of Geobiology Roger Summons and former postdoc Thomas Evans, along with Jasmin Millar of Cardiff University, Anne Jungblut at the Natural History Museum in London, and Ian Hawes of the University of Waikato in New Zealand.

Snowball Earth is the colloquial term for periods of time in Earth history during which the planet iced over. It is often used as a reference to the two consecutive, multi-million-year glaciation events which took place during the Cryogenian Period, which geologists refer to as the time between 635 and 720 million years ago. Whether the Earth was more of a hardened snowball or a softer "slushball" is still up for debate. But scientists are certain of one thing: Most of the planet was plunged into a deep freeze, with average global temperatures of minus 50 degrees Celsius. The question has been: How and where did life survive?

"We're interested in understanding the foundations of complex life on Earth. We see evidence for eukaryotes before and after the Cryogenian in the fossil record, but we largely lack direct evidence of where they may have lived during," Husain says. "The great part of this mystery is, we know life survived. We're just trying to understand how and where."

There are a number of ideas for where organisms could have sheltered during Snowball Earth, including in certain patches of the open ocean (if such environments existed), in and around deep-sea hydrothermal vents, and under ice sheets. In considering meltwater ponds, Husain and her colleagues pursued the hypothesis that surface ice meltwaters may also have been capable of supporting early eukaryotic life at the time.

"There are many hypotheses for where life could have survived and sheltered during the Cryogenian, but we don't have excellent analogs for all of them," Husain notes. "Above-ice meltwater ponds occur on Earth today and are accessible, giving us the opportunity to really focus in on the eukaryotes which live in these environments."

For their new study, the researchers analyzed samples taken from meltwater ponds in Antarctica. In 2018, Summons and colleagues from New Zealand traveled to a region of the McMurdo Ice Shelf in East Antarctica, known to host small ponds of melted ice, each just a few feet deep and a few meters wide. There, water freezes all the way to the seafloor, in the process trapping dark-colored sediments and marine organisms. Wind-driven loss of ice from the surface creates a sort of conveyer belt that brings this trapped debris to the surface over time, where it absorbs the sun's warmth, causing ice to melt, while surrounding debris-free ice reflects incoming sunlight, resulting in the formation of shallow meltwater ponds.

The bottom of each pond is lined with mats of microbes that have built up over years to form layers of sticky cellular communities.

"These mats can be a few centimeters thick, colorful, and they can be very clearly layered," Husain says.

These microbial mats are made up of cyanobacteria: prokaryotic, single-celled photosynthetic organisms that lack a cell nucleus or other organelles. While these ancient microbes are known to survive within some of the harshest environments on Earth, including meltwater ponds, the researchers wanted to know whether eukaryotes — complex organisms that evolved a cell nucleus and other membrane-bound organelles — could also weather similarly challenging circumstances. Answering this question would take more than a microscope, as the defining characteristics of the microscopic eukaryotes present among the microbial mats are too subtle to distinguish by eye.

To characterize the eukaryotes, the team analyzed the mats for specific lipids they make called sterols, as well as genetic components called ribosomal ribonucleic acid (rRNA), both of which can be used to identify organisms with varying degrees of specificity. These two independent sets of analyses provided complementary fingerprints for certain eukaryotic groups. As part of the team's lipid research, they found many sterols and rRNA genes closely associated with specific types of algae, protists, and microscopic animals among the microbial mats. The researchers were able to assess the types and relative abundance of lipids and rRNA genes from pond to pond, and found the ponds hosted a surprising diversity of eukaryotic life.

"No two ponds were alike," Husain says. "There are repeating casts of characters, but they're present in different abundances. And we found diverse assemblages of eukaryotes from all the major groups in all the ponds studied. These eukaryotes are the descendants of the eukaryotes that survived the Snowball Earth. This really highlights that meltwater ponds during Snowball Earth could have served as above-ice oases that nurtured the eukaryotic life that enabled the diversification and proliferation of complex life — including us — later on."

This research was supported in part by the NASA Exobiology Program, the Simons Collaboration on the Origins of Life, and a MISTI grant from MIT-New Zealand.

Materials provided by Massachusetts Institute of Technology. Original written by Jennifer Chu. Note: Content may be edited for style and length.
