Space-laser AI maps forest carbon in minutes—a game-changer for climate science

Satellite data used by archaeologists to find traces of ancient ruins hidden under dense forest canopies can also be used to measure, faster and more accurately, how much carbon forests retain and release.

Understanding this carbon cycle is key to climate change research, according to Hamdi Zurqani, an assistant professor of geospatial science for the Arkansas Forest Resources Center and the College of Forestry, Agriculture and Natural Resources at the University of Arkansas at Monticello. The center is headquartered at UAM and conducts research and extension activities through the Arkansas Agricultural Experiment Station and the Cooperative Extension Service, the University of Arkansas System Division of Agriculture's research and outreach arms.

"Forests are often called the lungs of our planet, and for good reason," Zurqani said. "They store roughly 80 percent of the world's terrestrial carbon and play a critical role in regulating Earth's climate."

Measuring a forest's carbon cycle requires an estimate of its aboveground biomass. Though effective, traditional ground-based methods for estimating forest aboveground biomass are labor-intensive, time-consuming and limited in spatial coverage, Zurqani said.

In a study recently published in Ecological Informatics, Zurqani shows how information from open-access satellites can be integrated on Google Earth Engine with artificial intelligence algorithms to quickly and accurately map large-scale forest aboveground biomass, even in remote areas where accessibility is often an issue.

Zurqani's novel approach uses data from NASA's Global Ecosystem Dynamics Investigation LiDAR, also known as GEDI LiDAR, which includes three lasers installed on the International Space Station. The system can precisely measure three-dimensional forest canopy height, canopy vertical structure and surface elevation. LiDAR stands for "light detection and ranging" and uses light pulses to measure distance and create 3D models.

Zurqani also used imagery from the European Space Agency's Copernicus Sentinel Earth observation satellites, Sentinel-1 and Sentinel-2. By combining GEDI's 3D structural measurements with the radar and optical imagery from the Sentinels, Zurqani improved the accuracy of the biomass estimates.

The study tested four machine learning algorithms to analyze the data: Gradient tree boosting, random forest, classification and regression trees, or CART, and support vector machine. Gradient tree boosting achieved the highest accuracy score and the lowest error rates. Random forest came in second, proving reliable but slightly less precise. CART provided reasonable estimates but tended to focus on a smaller subset. The support vector machine algorithm struggled, Zurqani said, highlighting that not all AI models are equally suited for estimating aboveground forest biomass in this study.

The most accurate predictions, Zurqani said, came from combining Sentinel-2 optical data, vegetation indices, topographic features and canopy height, with the GEDI LiDAR dataset serving as the reference for both training and testing the machine learning models. The result shows that multi-source data integration is critical for reliable biomass mapping.
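
In broad strokes, the workflow resembles a standard supervised regression benchmark: satellite-derived predictors, a GEDI-derived reference biomass, and several models compared on held-out data. The sketch below is illustrative only, using synthetic placeholder data and scikit-learn implementations rather than Google Earth Engine or the study's own code.

```python
# Minimal sketch (not the study's code): comparing the four regressors named in the
# article, assuming a table where Sentinel-2 bands, vegetation indices, topography
# and canopy height are predictors and GEDI-derived biomass is the reference.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor   # CART
from sklearn.svm import SVR                       # support vector machine

rng = np.random.default_rng(0)
X = rng.random((500, 6))                                      # placeholder predictors
y = 120 * X[:, 0] + 80 * X[:, 5] + rng.normal(0, 10, 500)     # placeholder biomass (Mg/ha)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "Gradient tree boosting": GradientBoostingRegressor(random_state=42),
    "Random forest": RandomForestRegressor(n_estimators=300, random_state=42),
    "CART": DecisionTreeRegressor(random_state=42),
    "Support vector machine": SVR(),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:>24}: R2 = {r2_score(y_te, pred):.2f}  RMSE = {rmse:.1f} Mg/ha")
```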

Zurqani said that accurate forest biomass mapping has real-world implications for better accounting of carbon and improved forest management on a global scale. With more accurate assessments, governments and organizations can more precisely track carbon sequestration and emissions from deforestation to inform policy decisions.

While the study marks a leap forward in measuring aboveground forest biomass, Zurqani said the challenges remaining include the impact weather can have on satellite data. Some regions still lack high-resolution LiDAR coverage. He added that future research may explore deeper AI models, such as neural networks, to refine predictions further.

"One thing is clear," Zurqani said. "As climate change intensifies, technology like this will be indispensable in safeguarding our forests and the planet."

Materials provided by University of Arkansas System Division of Agriculture. Original written by John Lovett. Note: Content may be edited for style and length.

CRISPR-edited stem cells reveal hidden causes of autism

To enable the study of the genetic causes of autism spectrum disorder, a Kobe University research team created a bank of 63 mouse embryonic stem cell lines carrying the mutations most strongly associated with the disorder. The achievement was made possible by developing a new, more efficient method for editing the genome of embryonic stem cells.

Although it is well understood that genetics influence the development of autism spectrum disorder, the precise causes and mechanisms have yet to be pinpointed. To study the biological background of diseases, researchers use models: Cell models allow us to study how changes in the genes affect the shape and function of the cell, while animal models show how the change in its cellular components affects health and behavior. Despite significant differences between mice and humans, many disease-causing genes are very similar and cause similar conditions across these species. “One of the problems, however, is the lack of a standardized biological model to study the effects of the different mutations associated with autism spectrum disorder. This makes it difficult to find out, for example, whether they have common effects or what is specific to certain cell types,” explains Kobe University neuroscientist TAKUMI Toru.

Thus, twelve years ago, Takumi and his team embarked on a journey to change that. Being experts in studying mouse models of the disorder, they combined a conventional manipulation technique for mouse embryonic stem cells — cells that can be made to develop into almost any kind of cell in the body — with the then-newly discovered, highly specific and easy-to-handle CRISPR gene editing system. This new method proved highly efficient in making genetic variants of these cells and allowed the Kobe University team to produce a bank of 63 mouse embryonic stem cell lines of the genetic variants most strongly associated with autism spectrum disorder.

In the journal Cell Genomics, Takumi and his team now report that they were able to develop their cells into a broad range of cell types and tissues, and even to generate adult mice carrying these genetic variants. Analysis of these alone showed that the cell lines are adequate models for studying autism spectrum disorder. The cell lines also allowed the team to conduct large-scale data analyses that clearly identify which genes are abnormally active, and in which cell types.

One of the things the data analysis brought to light is that autism-causing mutations often result in neurons being unable to eliminate misshapen proteins. “This is particularly interesting since the local production of proteins is a unique feature in neurons, and a lack of quality control of these proteins may be a causal factor of neuronal defects,” explains Takumi.

The Kobe University neuroscientist expects that his team’s achievement, which has been made available to other researchers and can be flexibly integrated with other lab techniques and adjusted to other targets, will be an invaluable resource for the scientific community studying autism and trying to find drug targets. He adds: “Interestingly, the genetic variants we studied are also implicated in other neuropsychiatric disorders such as schizophrenia and bipolar disorder. So, this library may be useful for studying other conditions as well.”

This research was funded by the Japan Society for the Promotion of Science (grants 16H06316, 16F16110, 21H00202, 21H04813, 23KK0132, 23H04233, 24H00620, 24H01241, 24K22036, 17K07119 and 21K07820), the Japan Agency for Medical Research and Development (grant JP21wm0425011), the Japan Science and Technology Agency (grants JPMJPF2018, JPMJMS2299 and JPMJMS229B), the National Center of Neurology and Psychiatry (grant 6-9), the Takeda Science Foundation, the Smoking Research Foundation, the Tokyo Biochemical Research Foundation, the Kawano Masanori Memorial Public Interest Incorporated Foundation for Promotion of Pediatrics, the Taiju Life Social Welfare Foundation, the Tokumori Yasumoto Memorial Trust for Researches on Tuberous Sclerosis Complex and Related Rare Neurological Diseases, and Takeda Pharmaceutical Company Ltd. It was conducted in collaboration with researchers from the RIKEN Center for Brain Science, Radboud University, the RIKEN Center for Integrative Medical Sciences, the Agency for Science, Technology and Research, the RIKEN Center for Biosystems Dynamics Research, and Hiroshima University.

Kobe University is a national university with roots dating back to the Kobe Higher Commercial School founded in 1902. It is now one of Japan’s leading comprehensive research universities with nearly 16,000 students and nearly 1,700 faculty in 11 faculties and schools and 15 graduate schools. Combining the social and natural sciences to cultivate leaders with an interdisciplinary perspective, Kobe University creates knowledge and fosters innovation to address society’s challenges.

Materials provided by Kobe University. Note: Content may be edited for style and length.

Why giant planets might form faster than we thought

An international team of astronomers including researchers at the University of Arizona Lunar and Planetary Laboratory has unveiled groundbreaking findings about the disks of gas and dust surrounding nearby young stars, using the powerful Atacama Large Millimeter/submillimeter Array, or ALMA.

The findings, published in 12 papers in a focus issue of the Astrophysical Journal, are part of an ALMA large program called the ALMA Survey of Gas Evolution of PROtoplanetary Disks, or AGE-PRO. AGE-PRO observed 30 planet-forming disks around sunlike stars to measure gas disk mass at different ages. The study revealed that gas and dust components in these disks evolve at different rates.

Prior ALMA observations have examined the evolution of dust in disks; AGE-PRO, for the first time, traces the evolution of gas, providing the first measurements of gas disk masses and sizes across the lifetime of planet-forming disks, according to the project's principal investigator, Ke Zhang of the University of Wisconsin-Madison.

"Now we have both, the gas and the dust," said Ilaria Pascucci, a professor at planetary sciences at the U of A and one of three AGE-PRO co-principal investigators. "Observing the gas is much more difficult because it takes much more observing time, and that's why we have to go for a large program like this one to obtain a statistically significant sample."

A protoplanetary disk swirls around its host star for several million years as its gas and dust evolve and dissipate, setting the timescale for giant planets to form. The disk's initial mass and size, as well as its angular momentum, have a profound influence on the type of planet it could form — gas giants, icy giants or mini-Neptunes — and on the migration paths of those planets. The lifetime of the gas within the disk determines the timescale for the growth of dust particles into an object the size of an asteroid, the formation of a planet and finally the planet's migration from where it was born.

In one of the survey's most surprising findings, the team discovered that as disks age, their gas and dust are consumed at different rates and undergo a shift in gas-to-dust mass ratio as the disks evolve: Unlike the dust, which tends to remain inside the disk over a longer time span, the gas disperses relatively quickly, then more slowly as the disk ages. In other words, planet-forming disks blow off more of their gas when they're young.

Zhang said the most surprising finding is that although most disks dissipate after a few million years, the ones that survive have more gas than expected. This would suggest that gaseous planets like Jupiter have less time to form than rocky planets.

ALMA's unique sensitivity allowed researchers to study the cold gas in these disks using faint molecular lines: characteristic wavelengths in a light spectrum that essentially act as "fingerprints" identifying different species of gas molecules. The first large-scale chemical survey of its kind, AGE-PRO targeted 30 planet-forming disks in three star-forming regions, ranging from 1 million to 6 million years in age: Ophiuchus (youngest), Lupus (1-3 million years old), and Upper Scorpius (oldest). Using ALMA, AGE-PRO obtained observations of key tracers of gas and dust masses in disks spanning crucial stages of their evolution, from their earliest formation to their eventual dispersal. This ALMA data will serve as a comprehensive legacy library of spectral line observations for a large sample of disks at different evolutionary stages.

Dingshan Deng, a graduate student at LPL who is the lead author on one of the papers, carried out the data reduction — essentially, the analyses needed to turn raw radio signals into images of the disks — for the star-forming region in the constellation of Lupus (Latin for "wolf").

"Thanks to these new and long observations, we now have the ability to estimate and trace the gas masses, not only for the brightest and better studied disks in that region, but also the smaller and fainter ones," he said. "Thanks to the discovery of gas tracers in many disks where it hadn't been seen before, we now have a well-studied sample covering a wide range of disk masses in the Lupus star-forming region."

"It took years to figure out the proper data reduction approach and analysis to produce the images used in this paper for the gas masses and in many other papers of the collaboration," Pascucci added.

Carbon monoxide is the most widely used chemical tracer in protoplanetary disks, but to thoroughly measure the mass of gas in a disk, additional molecular tracers are needed. AGE-PRO used N2H+, or diazenylium, an ion used as an indicator for nitrogen gas in interstellar clouds, as an additional gas tracer to significantly improve the accuracy of measurements. ALMA's detections were also set up to receive spectral light signatures from other molecules, including formaldehyde, methyl cyanide and several molecular species containing deuterium, a hydrogen isotope.

"Another finding that surprised us was that the mass ratio between the gas and dust tends to be more consistent across disks of different masses than expected," Deng said. "In other words, different-size disks will share a similar gas-to-dust mass ratio, whereas the literature suggested that smaller disks might shed their gas faster."

Funding for this study was provided by the National Science Foundation, the European Research Council, the Alexander von Humboldt Foundation and FONDECYT (Chile), among other sources. For full funding information, see the research paper.

Materials provided by University of Arizona. Note: Content may be edited for style and length.

Passive cooling breakthrough could slash data center energy use

Engineers at the University of California San Diego have developed a new cooling technology that could significantly improve the energy efficiency of data centers and high-powered electronics. The technology features a specially engineered fiber membrane that passively removes heat through evaporation. It offers a promising alternative to traditional cooling systems like fans, heat sinks and liquid pumps. It could also reduce the water use associated with many current cooling systems.

The advance is detailed in a paper published on June 13 in the journal Joule.

As artificial intelligence (AI) and cloud computing continue to expand, the demand for data processing — and the heat it generates — is skyrocketing. Currently, cooling accounts for up to 40% of a data center's total energy use. If trends continue, global energy use for cooling could more than double by 2030.

The new evaporative cooling technology could help curb that trend. It uses a low-cost fiber membrane with a network of tiny, interconnected pores that draw cooling liquid across its surface using capillary action. As the liquid evaporates, it efficiently removes heat from the electronics underneath — no extra energy required. The membrane sits on top of microchannels above the electronics, pulling in liquid that flows through the channels and efficiently dissipating heat.

"Compared to traditional air or liquid cooling, evaporation can dissipate higher heat flux while using less energy," said Renkun Chen, professor in the Department of Mechanical and Aerospace Engineering at the UC San Diego Jacobs School of Engineering, who co-led the project with professors Shengqiang Cai and Abhishek Saha, both from the same department. Mechanical and aerospace engineering Ph.D. student Tianshi Feng and postdoctoral researcher Yu Pei, both members of Chen's research group, are co-first authors on the study.

Many applications currently rely on evaporation for cooling. Heat pipes in laptops and evaporators in air conditioners are some examples, explained Chen. But applying it effectively to high-power electronics has been a challenge. Previous attempts using porous membranes — which have high surface areas that are ideal for evaporation — have been unsuccessful because their pores were either so small that they clogged or so large that they triggered unwanted boiling. "Here, we use porous fiber membranes with interconnected pores with the right size," said Chen. This design achieves efficient evaporation without those downsides.

When tested across variable heat fluxes, the membrane achieved record-breaking performance. It managed heat fluxes exceeding 800 watts of heat per square centimeter — one of the highest levels ever recorded for this kind of cooling system. It also proved stable over multiple hours of operation.
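
For a sense of scale, a back-of-envelope calculation shows what such a heat flux demands of an evaporative cooler. The sketch below assumes a water-like coolant with a latent heat of vaporization of roughly 2.26 MJ/kg; it is an illustration, not the paper's analysis.

```python
# Back-of-envelope sketch (assumes a water-like coolant; not the paper's analysis):
# how much liquid must evaporate each second to carry away 800 W per square centimeter.
heat_flux = 800.0     # W/cm^2, peak value reported in the study
h_fg = 2.26e6         # J/kg, latent heat of vaporization of water near 100 C

evap_rate = heat_flux / h_fg                                   # kg per second per cm^2
print(f"{evap_rate * 1e3:.2f} g of liquid per second per cm^2")  # ~0.35 g/(s*cm^2)
```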

"This success showcases the potential of reimagining materials for entirely new applications," said Chen. "These fiber membranes were originally designed for filtration, and no one had previously explored their use in evaporation. We recognized that their unique structural characteristics — interconnected pores and just the right pore size — could make them ideal for efficient evaporative cooling. What surprised us was that, with the right mechanical reinforcement, they not only withstood the high heat flux-they performed extremely well under it."

While the current results are promising, Chen says the technology is still operating well below its theoretical limit. The team is now working to refine the membrane and optimize performance. Next steps include integrating it into prototypes of cold plates, which are flat components that attach to chips like CPUs and GPUs to dissipate heat. The team is also launching a startup company to commercialize the technology.

This research was supported by the National Science Foundation (grants CMMI-1762560 and DMR-2005181). The work was performed in part at the San Diego Nanotechnology Infrastructure (SDNI) at UC San Diego, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation (grant ECCS-2025752).

Disclosures: A patent related to this work was filed by the Regents of the University of California (PCT Application No. PCT/US24/46923.). The authors declare that they have no other competing interests.

Materials provided by University of California – San Diego. Note: Content may be edited for style and length.

Fruit-eating mastodons? Ancient fossils confirm a long-lost ecological alliance

Ten thousand years ago, mastodons vanished from South America. With them, an ecologically vital function also disappeared: the dispersal of seeds from large-fruited plants. A new study led by the University of O'Higgins, Chile, with key contributions from IPHES-CERCA, demonstrates for the first time — based on direct fossil evidence — that these extinct elephant relatives regularly consumed fruit and were essential allies of many tree species. Their loss was not only zoological; it was also botanical, ecological, and evolutionary. Some plant species that relied on mastodons for seed dispersal are now critically endangered.

Published in Nature Ecology & Evolution, the research presents the first solid evidence of frugivory in Notiomastodon platensis, a South American Pleistocene mastodon. The findings are based on a multiproxy analysis of 96 fossil teeth collected over a span of more than 1,500 kilometers, from Los Vilos to Chiloé Island in southern Chile. Nearly half of the specimens come from the emblematic site of Lake Tagua Tagua, an ancient lake basin rich in Pleistocene fauna, located in the present-day O'Higgins Region.

The study was led by Dr. Erwin González-Guarda, researcher at the University of O'Higgins and associate at IPHES-CERCA, alongside an international team that includes IPHES-CERCA researchers Dr. Florent Rivals, a paleodiet specialist; Dr. Carlos Tornero and Dr. Iván Ramírez-Pedraza, experts in stable isotopes and paleoenvironmental reconstruction; and Alia Petermann-Pichincura. The research was carried out in collaboration with the Universitat Rovira i Virgili (URV) and the Universitat Autònoma de Barcelona (UAB).

An ecological hypothesis finally proven

In 1982, biologist Daniel Janzen and paleontologist Paul Martin proposed a revolutionary idea: many tropical plants developed large, sweet, and colorful fruits to attract large animals — such as mastodons, native horses, or giant ground sloths — that would serve as seed dispersers. Known as the "neotropical anachronisms hypothesis," this theory remained unconfirmed for over forty years. Now, the study led by González-Guarda provides direct fossil evidence that validates it. To understand the lifestyle of this mastodon, the team employed various techniques: isotopic analysis, microscopic dental wear studies, and fossil calculus analysis. "We found starch residues and plant tissues typical of fleshy fruits, such as those of the Chilean palm (Jubaea chilensis)," explains Florent Rivals, ICREA research professor at IPHES-CERCA and an expert in paleodiet. "This directly confirms that these animals frequently consumed fruit and played a role in forest regeneration."

The forgotten role of large seed dispersers

"Through stable isotope analysis, we were able to reconstruct the animals' environment and diet with great precision," notes Iván Ramírez-Pedraza. The data point to a forested ecosystem rich in fruit resources, where mastodons traveled long distances and dispersed seeds along the way. That ecological function remains unreplaced.

"Dental chemistry gives us a direct window into the past," says Carlos Tornero. "By combining different lines of evidence, we've been able to robustly confirm their frugivory and the key role they played in these ecosystems."

A future threatened by an incomplete past

The extinction of mastodons broke a co-evolutionary alliance that had lasted for millennia. The researchers applied a machine learning model to compare the current conservation status of megafauna-dependent plants across different South American regions. The results are alarming: in central Chile, 40% of these species are now threatened — a rate four times higher than in tropical regions where animals such as tapirs or monkeys still act as alternative seed dispersers.

"Where that ecological relationship between plants and animals has been entirely severed, the consequences remain visible even thousands of years later," says study co-author Andrea P. Loayza.

Species like the gomortega (Gomortega keule), the Chilean palm, and the monkey puzzle tree (Araucaria araucana) now survive in small, fragmented populations with low genetic diversity. They are living remnants of an extinct interaction.

Paleontology as a key to conservation

Beyond its fossil discoveries, the study sends a clear message: understanding the past is essential to addressing today's ecological crises. "Paleontology isn't just about telling old stories," concludes Florent Rivals. "It helps us recognize what we've lost — and what we still have a chance to save."

Materials provided by Universitat Autònoma de Barcelona. Note: Content may be edited for style and length.

AI Reveals Milky Way’s Black Hole Spins Near Top Speed

An international team of astronomers has trained a neural network on millions of synthetic simulations, using artificial intelligence (AI) to tease out new cosmic curiosities about black holes and revealing that the one at the center of our Milky Way is spinning at nearly top speed.

These large ensembles of simulations were generated by throughput computing capabilities provided by the Center for High Throughput Computing (CHTC), a joint entity of the Morgridge Institute for Research and the University of Wisconsin-Madison. The astronomers published their results and methodology today in three papers in the journal Astronomy & Astrophysics.

High-throughput computing, celebrating its 40th anniversary this year, was pioneered by Wisconsin computer scientist Miron Livny. It's a novel form of distributed computing that automates computing tasks across a network of thousands of computers, essentially turning a single massive computing challenge into a supercharged fleet of smaller ones. This computing innovation is helping fuel big-data discovery across hundreds of scientific projects worldwide, including the search for cosmic neutrinos, subatomic particles and gravitational waves, as well as efforts to unravel antibiotic resistance.
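
The pattern is simple to picture: a large batch of independent jobs is farmed out wherever capacity is free, and results are gathered as they complete. The sketch below is a deliberately simplified, single-machine illustration of that pattern in Python; the job function and its parameters are made up, and the real CHTC infrastructure schedules such work across thousands of networked machines.

```python
# Simplified, single-machine illustration of the high-throughput pattern: many
# independent jobs submitted at once and collected as they finish.
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_simulation(job_id: int) -> float:
    """Placeholder for one independent synthetic-data job (hypothetical workload)."""
    return sum((job_id * i) % 7 for i in range(10_000)) / 10_000.0

if __name__ == "__main__":
    results = {}
    with ProcessPoolExecutor(max_workers=8) as pool:
        futures = {pool.submit(run_simulation, j): j for j in range(1_000)}
        for fut in as_completed(futures):        # gather results as jobs complete
            results[futures[fut]] = fut.result()
    print(f"completed {len(results)} independent jobs")
```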

In 2019, the Event Horizon Telescope (EHT) Collaboration released the first image of a supermassive black hole at the center of the galaxy M87. In 2022, they presented the image of the black hole at the center of our Milky Way, Sagittarius A*. However, the data behind the images still contained a wealth of hard-to-crack information. An international team of researchers trained a neural network to extract as much information as possible from the data.

Previous studies by the EHT Collaboration used only a handful of realistic synthetic data files. Funded by the National Science Foundation (NSF) as part of the Partnership to Advance Throughput Computing (PATh) project, the Madison-based CHTC enabled the astronomers to feed millions of such data files into a so-called Bayesian neural network, which can quantify uncertainties. This allowed the researchers to make a much better comparison between the EHT data and the models.
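
As an illustration of how a Bayesian-style network attaches an uncertainty to each prediction, the sketch below uses Monte Carlo dropout, a common approximation in which many stochastic forward passes yield a spread of outputs. It is a generic stand-in, not the collaboration's Zingularity framework, and its inputs and outputs are placeholders rather than EHT data.

```python
# Illustrative sketch only: Monte Carlo dropout as a simple stand-in for a Bayesian
# neural network that attaches an uncertainty to each prediction.
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

model = MCDropoutNet(n_features=16)
x = torch.randn(1, 16)                 # placeholder "observation", not EHT data

model.train()                          # keep dropout active so each pass differs
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(200)])

mean, std = samples.mean().item(), samples.std().item()
print(f"prediction = {mean:.3f} +/- {std:.3f}  (spread quantifies model uncertainty)")
```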

Thanks to the neural network, the researchers now suspect that the black hole at the center of the Milky Way is spinning at almost top speed. Its rotation axis points to the Earth. In addition, the emission near the black hole is mainly caused by extremely hot electrons in the surrounding accretion disk and not by a so-called jet. Also, the magnetic fields in the accretion disk appear to behave differently from the usual theories of such disks.

"That we are defying the prevailing theory is of course exciting," says lead researcher Michael Janssen, of Radboud University Nijmegen, the Netherlands. "However, I see our AI and machine learning approach primarily as a first step. Next, we will improve and extend the associated models and simulations."

"The ability to scale up to the millions of synthetic data files required to train the model is an impressive achievement," adds Chi-kwan Chan, an Associate Astronomer of Steward Observatory at the University of Arizonaand a longtime PATh collaborator. "It requires dependable workflow automation, and effective workload distribution across storage resources and processing capacity."

"We are pleased to see EHT leveraging our throughput computing capabilities to bring the power of AI to their science," says Professor Anthony Gitter, a Morgridge Investigator and a PATh Co-PI. "Like in the case of other science domains, CHTC's capabilities allowed EHT researchers to assemble the quantity and quality of AI-ready data needed to train effective models that facilitate scientific discovery."

The NSF-funded Open Science Pool, operated by PATh, offers computing capacity contributed by more than 80 institutions across the United States. The Event Horizon black hole project performed more than 12 million computing jobs in the past three years.

"A workload that consists of millions of simulations is a perfect match for our throughput-oriented capabilities that were developed and refined over four decades" says Livny, director of the CHTC and lead investigator of PATh. "We love to collaborate with researchers who have workloads that challenge the scalability of our services."

Deep learning inference with the Event Horizon Telescope I. Calibration improvements and a comprehensive synthetic data library. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope II. The Zingularity framework for Bayesian artificial neural networks. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope III. Zingularity results from the 2017 observations and predictions for future array expansions. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Materials provided by Morgridge Institute for Research. Note: Content may be edited for style and length.

Koalas on the brink: Precision DNA test offers a lifeline to Australia’s icons

A University of Queensland-led project has developed a tool to standardize genetic testing of koala populations, providing a significant boost to conservation and recovery efforts.

Dr Lyndal Hulse from UQ's School of the Environment said the standardized koala genetic marker panel provides a consistent method for researchers nationwide to capture and share koala genetic variation, enabling improved collaboration and data integration across studies.

"Koalas in the wild are under increasing pressure from habitat loss, disease and vehicle strikes, forcing them to live in increasingly smaller and more isolated pockets with limited access to breeding mates outside their group," Dr Hulse said.

"Population inbreeding can mean detrimental effects on their health.

"A standardized panel for directly comparing genetic markers enables researchers, conservationists and government agencies to better understand the genetic diversity of koala populations, allowing for greater collaboration to ensure their survival."

Saurabh Shrivastava, Senior Account Manager at project partner the Australian Genome Research Facility (AGRF Ltd), said the new screening tool was a single nucleotide polymorphism (SNP) array that used next-generation sequencing technologies.

"The Koala SNP-array can accommodate good quality DNA, so is suitable for broad-scale monitoring of wild koala populations," Mr Shrivastava said.

"Importantly, it is available to all researchers and managers."

Dr Hulse said ideally the tool could help guide targeted koala relocations across regions.

"There are very strict rules about relocating koalas, but this could be key to improving and increasing the genetics of populations under threat," she said.

"These iconic Australian marsupials are listed as endangered in Queensland, New South Wales and the ACT – and in 50 years we may only be able to see koalas in captivity.

"Understanding the genetic diversity of different populations of koalas is crucial if we're going to save them from extinction."

The project included researchers from the Australasian Wildlife Genomics Group at the University of New South Wales.

AGRF Ltd is a not-for-profit organization advancing Australian genomics through nationwide access to expert support and cutting-edge technology across a broad range of industries including biomedical, health, agriculture and environmental sectors.

Materials provided by University of Queensland. Note: Content may be edited for style and length.

Scientists reveal the hidden trigger behind massive floods

Atmospheric rivers are responsible for most flooding on the West Coast of the U.S., but also bring much needed moisture to the region. The size of these storms doesn't always translate to flood risk, however, as other factors on the ground play important roles. Now, a new study helps untangle the other drivers of flooding to help communities and water managers better prepare.

The research, published June 4 in the Journal of Hydrometeorology, analyzed more than 43,000 atmospheric river storms across 122 watersheds on the West Coast between 1980 and 2023. The researchers found that one of the primary driving forces of flooding is wet soils that can't absorb more water when a storm hits. They showed that flood peaks were 2-4.5 times higher, on average, when soils were already wet. These findings can help explain why some atmospheric river storms cause catastrophic flooding while others of comparable intensity do not. Even weaker storms can generate major floods if their precipitation meets a saturated Earth, while stronger storms may bring needed moisture to a parched landscape without causing flooding.

"The main finding comes down to the fact that flooding from any event, but specifically from atmospheric river storms, is a function not only of the storm size and magnitude, but also what's happening on the land surface," said Mariana Webb, lead author of the study who is completing her Ph.D. at DRI and the University of Nevada, Reno. "This work demonstrates the key role that pre-event soil moisture can have in moderating flood events. Interestingly, flood magnitudes don't increase linearly as soil moisture increases, there's this critical threshold of soil moisture wetness above which you start to see much larger flows."

The study also untangled the environmental conditions of regions where soil moisture has the largest influence on flooding. In arid places like California and southwestern Oregon, storms that hit when soils are already saturated are more likely to cause floods. This is because watersheds in these regions typically have shallow, clay-rich soils and limited water storage capacity. Due to lower precipitation and higher evaporation rates, soil moisture is also more variable in these areas. In contrast, in lush Washington and the interior Cascades and Sierra Nevada regions, watersheds tend to have deeper soils and snowpack, leading to a higher water storage capacity. Although soil saturation can still play a role in driving flooding in these areas, accounting for soil moisture is less valuable for flood management because soils are consistently wet or insulated by snow.

"We wanted to identify the watersheds where having additional information about the soil moisture could enhance our understanding of flood risk," Webb said. "It's the watersheds in more arid climates, where soil moisture is more variable due to evaporation and less consistent precipitation, where we can really see improvements in flood prediction."

Although soil moisture data is currently measured at weather monitoring stations like the USDA's SNOTEL Network, observations are relatively sparse compared to other measures like rainfall. Soil moisture can also vary widely within a single watershed, so often multiple stations are required to give experts a clear picture that can help inform flooding predictions. Increased monitoring in watersheds identified as high-risk, including real-time soil moisture observations, could significantly enhance early warning systems and flood management as atmospheric rivers become more frequent and intense.

By tailoring flood risk evaluations to a specific watershed's physical characteristics and climate, the study could improve flood-risk predictions. The research demonstrates how flood risk increases not just with storm size and magnitude, but with soil moisture, highlighting the value of integrating land surface conditions into impact assessments for atmospheric rivers. "My research really focuses on this interdisciplinary space between atmospheric science and hydrology," Webb said. "There's sometimes a disconnect where atmospheric scientists think about water up until it falls as rain, and hydrologists start their work once the water is on the ground. I wanted to explore how we can better connect these two fields."

Webb worked with DRI ecohydrologist Christine Albano to produce the research, building on Albano's extensive expertise studying atmospheric rivers, their risks, and their impacts on the landscape.

"While soil saturation is widely recognized as a key factor in determining flood risk, Mari's work helps to quantify the point at which this level of saturation leads to large increases in flood risk across different areas along the West Coast," Albano said. "Advances in weather forecasting allow us to see atmospheric rivers coming toward the coast several days before they arrive. By combining atmospheric river forecast information with knowledge of how close the soil moisture is to critical saturation levels for a given watershed, we can further improve flood early warning systems."

Materials provided by Desert Research Institute. Note: Content may be edited for style and length.

Impossible signal from deep beneath Antarctic ice baffles physicists

A cosmic particle detector in Antarctica has emitted a series of bizarre signals that defy the current understanding of particle physics, according to an international research group that includes scientists from Penn State. The unusual radio pulses were detected by the Antarctic Impulsive Transient Antenna (ANITA) experiment, a range of instruments flown on balloons high above Antarctica that are designed to detect radio waves from cosmic rays hitting the atmosphere.

The goal of the experiment is to gain insight into distant cosmic events by analyzing signals that reach the Earth. Rather than reflecting off the ice, the signals — a form of radio waves — appeared to be coming from below the horizon, an orientation that cannot be explained by the current understanding of particle physics and may hint at new types of particles or interactions previously unknown to science, the team said.

The researchers published their results in the journal Physical Review Letters.

"The radio waves that we detected were at really steep angles, like 30 degrees below the surface of the ice," said Stephanie Wissel, associate professor of physics, astronomy and astrophysics who worked on the ANITA team searching for signals from elusive particles called neutrinos.

She explained that by their calculations, the anomalous signal had to pass through and interact with thousands of kilometers of rock before reaching the detector, which should have left the radio signal undetectable because it would have been absorbed into the rock.
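
A rough geometric check makes the point. A signal emerging at an angle below the horizontal must have traveled along a chord through the Earth of length 2R sin(theta), so a 30-degree emergence angle implies a path of several thousand kilometers of rock. The calculation below uses a mean Earth radius of 6,371 km and is an order-of-magnitude illustration, not the collaboration's modeling.

```python
# Rough geometry check: chord length through the Earth for a ray emerging at an
# angle theta below the local horizontal is 2 * R * sin(theta).
import math

R_EARTH_KM = 6371.0
theta_deg = 30.0                         # emergence angle below the horizontal

chord_km = 2 * R_EARTH_KM * math.sin(math.radians(theta_deg))
print(f"path length through the Earth: about {chord_km:.0f} km")   # ~6400 km
```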

"It's an interesting problem because we still don't actually have an explanation for what those anomalies are, but what we do know is that they're most likely not representing neutrinos," Wissel said.

Neutrinos, a type of particle with no charge and the smallest mass of all subatomic particles, are abundant in the universe. Usually emitted by high-energy sources like the sun or major cosmic events like supernovas or even the Big Bang, there are neutrino signals everywhere. The problem with these particles, though, is that they are notoriously difficult to detect, Wissel explained.

"You have a billion neutrinos passing through your thumbnail at any moment, but neutrinos don't really interact," she said. "So, this is the double-edged sword problem. If we detect them, it means they have traveled all this way without interacting with anything else. We could be detecting a neutrino coming from the edge of the observable universe."

Once detected and traced to their source, these particles can reveal more about cosmic events than even the most high-powered telescopes, Wissel added, as the particles can travel undisturbed and almost as fast as the speed of light, giving clues about cosmic events that happened lightyears away.

Wissel and teams of researchers around the world have been working to design and build special detectors to capture sensitive neutrino signals, even in relatively small amounts. Even one small signal from a neutrino holds a treasure trove of information, so all data has significance, she said.

"We use radio detectors to try to build really, really large neutrino telescopes so that we can go after a pretty low expected event rate," said Wissel, who has designed experiments to spot neutrinos in Antarctica and South America.

ANITA is one of these detectors, and it was placed in Antarctica because there is little chance of interference from other signals. To capture the emission signals, the balloon-borne radio detector is sent to fly over stretches of ice, capturing what are called ice showers.

"We have these radio antennas on a balloon that flies 40 kilometers above the ice in Antarctica," Wissel said. "We point our antennas down at the ice and look for neutrinos that interact in the ice, producing radio emissions that we can then sense on our detectors."

These special ice-interacting neutrinos, called tau neutrinos, produce a secondary particle called a tau lepton that is released out of the ice and decays, the physics term for how a particle loses energy as it travels through space and breaks down into its constituents. This produces emissions known as air showers.

If they were visible to the naked eye, air showers might look like a sparkler waved in one direction, with sparks trailing it, Wissel explained. The researchers can distinguish between the two signals — ice and air showers — to determine attributes about the particle that created the signal.

These signals can then be traced back to their origin, similar to how a ball thrown at an angle will predictably bounce back at the same angle, Wissel said. The recent anomalous findings, though, cannot be traced back in such a manner as the angle is much sharper than existing models predict.

By analyzing data collected from multiple ANITA flights and comparing it with mathematical models and extensive simulations of both regular cosmic rays and upward-going air showers, the researchers were able to filter out background noise and eliminate the possibility of other known particle-based signals.

The researchers then cross-referenced signals from other independent detectors like the IceCube Experiment and the Pierre Auger Observatory to see if data from upward-going air showers, similar to those found by ANITA, were captured by other experiments.

Analysis revealed the other detectors did not register anything that could have explained what ANITA detected, which led the researchers to describe the signal as "anomalous," meaning that the particles causing the signal are not neutrinos, Wissel explained. The signals do not fit within the standard picture of particle physics, and while several theories suggest that it may be a hint of dark matter, the lack of follow-up observations with IceCube and Auger really narrow the possibilities, she said.

Penn State has built detectors and analyzed neutrino signals for close to 10 years, Wissel explained, and added that her team is currently designing and building the next big detector. The new detector, called PUEO, will be larger and better at detecting neutrino signals, Wissel said, and it will hopefully shed light on what exactly the anomalous signal is.

"My guess is that some interesting radio propagation effect occurs near ice and also near the horizon that I don't fully understand, but we certainly explored several of those, and we haven't been able to find any of those yet either," Wissel said. "So, right now, it's one of these long-standing mysteries, and I'm excited that when we fly PUEO, we'll have better sensitivity. In principle, we should pick up more anomalies, and maybe we'll actually understand what they are. We also might detect neutrinos, which would in some ways be a lot more exciting."

The other Penn State co-author is Andrew Zeolla, a doctoral candidate in physics. The research conducted by scientists from Penn State was funded by the U.S. Department of Energy and the U.S. National Science Foundation. The paper contains the full list of collaborators and authors.

Materials provided by Penn State. Note: Content may be edited for style and length.

83% of Earth’s climate-critical fungi are still unknown

Mycorrhizal fungi help regulate Earth's climate and ecosystems by forming underground networks that provide plants with essential nutrients, while drawing carbon deep into soils. Scientists and conservationists have been racing to find ways to protect these underground fungi, but they keep finding dark taxa – species known only from DNA sequences that can't be linked to named or described species.

It is estimated that only 155,000 of the roughly 2-3 million fungal species on the planet have been formally described. Now, a review published in Current Biology on June 9 shows that as much as 83% of ectomycorrhizal species are so-called dark taxa. The study helps identify underground hotspots of unknown mycorrhizal species occurring in tropical forests in southeast Asia and Central and South America, tropical forests and shrublands in central Africa, Sayan montane conifer forests above Mongolia, and more. This discovery has serious implications for conservation.

Names are important in the natural sciences. Traditionally, once a species is described, it is given a binomial – a name made of two Latin words that describe the species and genus. These names are used to categorize fungi, plants, and animals, and are critical identifiers for conservation and research. Most mycorrhizal fungi in the wild are found using environmental DNA (eDNA) — genetic material that organisms shed into their surroundings. Scientists extract fungal eDNA from soil and root samples, sequence that DNA, and then run those sequences through a bioinformatics pipeline that matches a sequence with a described species. For dark taxa there are no matches – just strings of As, Gs, Cs, and Ts.
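
Conceptually, the matching step works like the toy example below: each environmental sequence is compared against a reference database of named species, and anything without a sufficiently close match is flagged as a dark taxon. The sequences, the naive similarity measure, and the 97% cutoff here are illustrative stand-ins for the curated databases and clustering methods real pipelines use.

```python
# Toy illustration (hypothetical sequences, naive similarity): the matching step of an
# eDNA pipeline. Queries with no close match to any named reference are dark taxa.
def identity(a: str, b: str) -> float:
    """Fraction of positions that agree between two sequences of similar length."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

reference = {                                   # named species -> placeholder sequence
    "Laccaria bicolor": "ACGTACGTTTGACCGTA",
    "Russula emetica":  "ACGTTCGTAAGACCGAA",
}
queries = {
    "soil_sample_otu_1": "ACGTACGTTTGACCGTA",   # matches a named species
    "soil_sample_otu_2": "TTTTGGGGCCCCAAAAT",   # no close match -> dark taxon
}

THRESHOLD = 0.97                                # commonly used species-level cutoff
for otu, seq in queries.items():
    best = max(reference, key=lambda sp: identity(seq, reference[sp]))
    score = identity(seq, reference[best])
    label = best if score >= THRESHOLD else "dark taxon (no named match)"
    print(f"{otu}: {label}  (best identity {score:.2f})")
```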

"We are a long way out from getting all fungal DNA sequences linked to named species," says lead author Laura van Galen, a microbial ecologist working with the Society for the Protection of Underground Networks (SPUN) and ETH University, Switzerland. "Environmental DNA has enormous potential as a research tool to detect fungal species, but we can't include unnamed species in conservation initiatives. How can you protect something that hasn't yet been named?"

Ectomycorrhizal fungi are one of the largest groups of mycorrhizal fungi and form symbiotic partnerships with about 25% of global vegetation. Ectomycorrhizal fungi facilitate the drawdown of over 9 billion tons of CO2 annually (over 25% of yearly fossil fuel emissions) and help Earth's forests function by regulating nutrient cycles, enhancing stress tolerance, and even breaking down pollutants.

The researchers' work has uncovered that dark taxa of ectomycorrhizal fungi are not spread evenly across the Earth. "There are hotspots of high dark taxa around the globe, but particularly they are concentrated in tropical regions in Southeast Asia and parts of South America and Africa," says van Galen. "Most of the research on ectomycorrhizal fungi has been focused in the North, but mid-latitude and southern-hemisphere regions show signs of being home to many unknown species. This means there is a mismatch in resources and funding. We need to bridge this gap and facilitate more tropical researchers and those from southern-hemisphere regions to focus on identifying these super-important fungi."

The researchers have suggestions of how we can start bringing these fungi out of the shadows. "One way to reduce the dark taxa problem is to collect, study and sequence mushrooms and other fungi," says co-author Camille Truong, a mycorrhizal ecologist at SPUN and research scientist at the Royal Botanic Gardens Victoria in Australia. "Conversely, there are mushrooms that have been sitting for decades in collections of botanical gardens. These should be urgently sequenced so that we can, hopefully, start matching them up with some of these dark taxa."

Many of the unidentified fungal species are associated with plants that are themselves endangered. "We're at risk here," says van Galen. "If we lose these host plants, we might also be losing really important fungal communities that we don't know anything about yet."

The technology is available – what's missing is attention. "We really need to pay so much more attention to fungi in the soil so that we can understand the species and protect them and conserve them before we lose them," says van Galen. The team hopes that conservation organizations will use the information to protect hotspots of underground biodiversity, even if these species remain nameless.

Materials provided by SPUN (Society for the Protection of Underground Networks). Note: Content may be edited for style and length.