Why giant planets might form faster than we thought

An international team of astronomers including researchers at the University of Arizona Lunar and Planetary Laboratory has unveiled groundbreaking findings about the disks of gas and dust surrounding nearby young stars, using the powerful Atacama Large Millimeter/submillimeter Array, or ALMA.

The findings, published in 12 papers in a focus issue of the Astrophysical Journal, are part of an ALMA large program called the ALMA Survey of Gas Evolution of PROtoplanetary Disks, or AGE-PRO. AGE-PRO observed 30 planet-forming disks around sunlike stars to measure gas disk mass at different ages. The study revealed that gas and dust components in these disks evolve at different rates.

Prior ALMA observations have examined the evolution of dust in disks; AGE-PRO, for the first time, traces the evolution of gas, providing the first measurements of gas disk masses and sizes across the lifetime of planet-forming disks, according to the project's principal investigator, Ke Zhang of the University of Wisconsin-Madison.

"Now we have both, the gas and the dust," said Ilaria Pascucci, a professor of planetary sciences at the U of A and one of three AGE-PRO co-principal investigators. "Observing the gas is much more difficult because it takes much more observing time, and that's why we have to go for a large program like this one to obtain a statistically significant sample."

A protoplanetary disk swirls around its host star for several million years as its gas and dust evolve and dissipate, setting the timescale for giant planets to form. The disk's initial mass and size, as well as its angular momentum, have a profound influence on the type of planet it could form — gas giants, icy giants or mini-Neptunes — and migration paths of planets. The lifetime of the gas within the disk determines the timescale for the growth of dust particles to an object the size of an asteroid, the formation of a planet and finally the planet's migration from where it was born.

In one of the survey's most surprising findings, the team discovered that gas and dust are consumed at different rates, shifting the gas-to-dust mass ratio as a disk evolves: Unlike the dust, which tends to remain inside the disk over a longer time span, the gas disperses quickly at first and then more slowly as the disk ages. In other words, planet-forming disks blow off more of their gas when they're young.

Zhang said the most surprising finding is that although most disks dissipate after a few million years, the ones that survive have more gas than expected. This would suggest that gaseous planets like Jupiter have less time to form than rocky planets.

ALMA's unique sensitivity allowed researchers to study the cold gas in these disks using faint molecular lines, characteristic wavelengths in a light spectrum that act as "fingerprints" identifying different species of gas molecules. The first large-scale chemical survey of its kind, AGE-PRO targeted 30 planet-forming disks in three star-forming regions, ranging from 1 million to 6 million years in age: Ophiuchus (youngest), Lupus (1-3 million years old), and Upper Scorpius (oldest). Using ALMA, AGE-PRO obtained observations of key tracers of gas and dust masses in disks spanning crucial stages of their evolution, from their earliest formation to their eventual dispersal. This ALMA data will serve as a comprehensive legacy library of spectral line observations for a large sample of disks at different evolutionary stages.
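The "fingerprint" idea can be illustrated with a toy sketch: match observed line frequencies against laboratory rest frequencies of known transitions. This is not AGE-PRO's actual pipeline, and the rest-frequency values below are approximate round numbers used for illustration only.

```python
# Toy line identifier: approximate rest frequencies (GHz) of a few
# millimeter-wave transitions; values are illustrative, not authoritative.
REST_FREQS_GHZ = {
    "CO J=2-1": 230.538,
    "N2H+ J=3-2": 279.512,
    "H2CO 3(0,3)-2(0,2)": 218.222,
}

def identify_lines(observed_ghz, tolerance_ghz=0.05):
    """Return candidate species for each observed frequency."""
    matches = {}
    for f_obs in observed_ghz:
        candidates = [name for name, f_rest in REST_FREQS_GHZ.items()
                      if abs(f_obs - f_rest) <= tolerance_ghz]
        matches[f_obs] = candidates
    return matches

print(identify_lines([230.54, 279.50]))
```

A real identification also has to account for the Doppler shift of the source, which is why the sketch uses a frequency tolerance rather than exact matching.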

Dingshan Deng, a graduate student at LPL who is the lead author on one of the papers, provided the data reduction — essentially, the processing needed to turn raw radio signals into images of the disks — for the star-forming region in the constellation of Lupus (Latin for "wolf").

"Thanks to these new and long observations, we now have the ability to estimate and trace the gas masses, not only for the brightest and better studied disks in that region, but also the smaller and fainter ones," he said. "Thanks to the discovery of gas tracers in many disks where it hadn't been seen before, we now have a well-studied sample covering a wide range of disk masses in the Lupus star-forming region."

"It took years to figure out the proper data reduction approach and analysis to produce the images used in this paper for the gas masses and in many other papers of the collaboration," Pascucci added.

Carbon monoxide is the most widely used chemical tracer in protoplanetary disks, but to thoroughly measure the mass of gas in a disk, additional molecular tracers are needed. AGE-PRO used N2H+, or diazenylium, an ion used as an indicator for nitrogen gas in interstellar clouds, as an additional gas tracer to significantly improve the accuracy of measurements. ALMA's detections were also set up to receive spectral light signatures from other molecules, including formaldehyde, methyl cyanide and several molecular species containing deuterium, a hydrogen isotope.

"Another finding that surprised us was that the mass ratio between the gas and dust tends to be more consistent across disks of different masses than expected," Deng said. "In other words, different-size disks will share a similar gas-to-dust mass ratio, whereas the literature suggested that smaller disks might shed their gas faster."

Funding for this study was provided by the National Science Foundation, the European Research Council, the Alexander von Humboldt Foundation, and FONDECYT (Chile), among other sources. For full funding information, see the research paper.

Materials provided by University of Arizona. Note: Content may be edited for style and length.

Passive cooling breakthrough could slash data center energy use

Engineers at the University of California San Diego have developed a new cooling technology that could significantly improve the energy efficiency of data centers and high-powered electronics. The technology features a specially engineered fiber membrane that passively removes heat through evaporation. It offers a promising alternative to traditional cooling systems like fans, heat sinks and liquid pumps. It could also reduce the water use associated with many current cooling systems.

The advance is detailed in a paper published on June 13 in the journal Joule.

As artificial intelligence (AI) and cloud computing continue to expand, the demand for data processing — and the heat it generates — is skyrocketing. Currently, cooling accounts for up to 40% of a data center's total energy use. If trends continue, global energy use for cooling could more than double by 2030.

The new evaporative cooling technology could help curb that trend. It uses a low-cost fiber membrane with a network of tiny, interconnected pores that draw cooling liquid across its surface using capillary action. As the liquid evaporates, it efficiently removes heat from the electronics underneath — no extra energy required. The membrane sits on top of microchannels above the electronics, pulling in liquid that flows through the channels and efficiently dissipating heat.

"Compared to traditional air or liquid cooling, evaporation can dissipate higher heat flux while using less energy," said Renkun Chen, professor in the Department of Mechanical and Aerospace Engineering at the UC San Diego Jacobs School of Engineering, who co-led the project with professors Shengqiang Cai and Abhishek Saha, both from the same department. Mechanical and aerospace engineering Ph.D. student Tianshi Feng and postdoctoral researcher Yu Pei, both members of Chen's research group, are co-first authors on the study.

Many applications currently rely on evaporation for cooling. Heat pipes in laptops and evaporators in air conditioners are some examples, explained Chen. But applying it effectively to high-power electronics has been a challenge. Previous attempts using porous membranes — which have high surface areas that are ideal for evaporation — have been unsuccessful because their pores were either so small that they would clog or so large that they would trigger unwanted boiling. "Here, we use porous fiber membranes with interconnected pores with the right size," said Chen. This design achieves efficient evaporation without those downsides.
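The pore-size tradeoff can be quantified with the Young-Laplace relation, which gives the capillary pressure driving liquid through a pore: smaller pores pull harder but clog more easily. The sketch below is a back-of-envelope calculation for water with generic assumed values, not parameters from the paper.

```python
import math

# Young-Laplace capillary pressure: dP = 2 * gamma * cos(theta) / r.
# Surface tension and contact angle are assumed textbook values for water
# on a perfectly wetting surface, chosen purely for illustration.
GAMMA = 0.072   # surface tension of water at room temperature, N/m
THETA = 0.0     # contact angle, radians (perfectly wetting, assumed)

def capillary_pressure(pore_radius_m):
    """Capillary driving pressure (Pa) for a cylindrical pore of given radius."""
    return 2 * GAMMA * math.cos(THETA) / pore_radius_m

for r_um in (0.1, 1.0, 10.0):
    dp = capillary_pressure(r_um * 1e-6)
    print(f"pore radius {r_um:5.1f} um -> capillary pressure {dp / 1000:8.1f} kPa")
```

A 0.1-micron pore pulls two orders of magnitude harder than a 10-micron pore, which is why "interconnected pores with the right size" is the crux of the design: enough capillary drive to keep liquid flowing, without the clogging and boiling problems at the extremes.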

When tested across variable heat fluxes, the membrane achieved record-breaking performance. It managed heat fluxes exceeding 800 watts of heat per square centimeter — one of the highest levels ever recorded for this kind of cooling system. It also proved stable over multiple hours of operation.
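A simple energy balance shows what sustaining such a heat flux demands of the membrane: the heat carried away equals the evaporated mass flow times the latent heat of vaporization. The 800 W/cm² figure is from the article; the latent heat below is the textbook value for water and is an assumption about the working fluid.

```python
# Energy balance for evaporative cooling: q = m_dot * h_fg,
# solved for the required evaporation rate m_dot.
H_FG = 2.26e6                     # latent heat of vaporization of water, J/kg (assumed fluid)
q_w_per_cm2 = 800.0               # heat flux reported in the article
q_w_per_m2 = q_w_per_cm2 * 1e4    # 1 m^2 = 10^4 cm^2

m_dot = q_w_per_m2 / H_FG         # kg of liquid evaporated per m^2 per second
print(f"required evaporation rate: {m_dot:.2f} kg/(m^2*s)")
```

Roughly 3.5 kilograms of liquid per square meter per second must evaporate and be continuously resupplied by capillary flow, which gives a sense of why sustaining this flux for hours without dry-out or clogging is a notable result.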

"This success showcases the potential of reimagining materials for entirely new applications," said Chen. "These fiber membranes were originally designed for filtration, and no one had previously explored their use in evaporation. We recognized that their unique structural characteristics — interconnected pores and just the right pore size — could make them ideal for efficient evaporative cooling. What surprised us was that, with the right mechanical reinforcement, they not only withstood the high heat flux, they performed extremely well under it."

While the current results are promising, Chen says the technology is still operating well below its theoretical limit. The team is now working to refine the membrane and optimize performance. Next steps include integrating it into prototypes of cold plates, which are flat components that attach to chips like CPUs and GPUs to dissipate heat. The team is also launching a startup company to commercialize the technology.

This research was supported by the National Science Foundation (grants CMMI-1762560 and DMR-2005181). The work was performed in part at the San Diego Nanotechnology Infrastructure (SDNI) at UC San Diego, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation (grant ECCS-2025752).

Disclosures: A patent related to this work was filed by the Regents of the University of California (PCT Application No. PCT/US24/46923). The authors declare that they have no other competing interests.

Materials provided by University of California – San Diego. Note: Content may be edited for style and length.

Fruit-eating mastodons? Ancient fossils confirm a long-lost ecological alliance

Ten thousand years ago, mastodons vanished from South America. With them, an ecologically vital function also disappeared: the dispersal of seeds from large-fruited plants. A new study led by the University of O'Higgins, Chile, with key contributions from IPHES-CERCA, demonstrates for the first time — based on direct fossil evidence — that these extinct elephant relatives regularly consumed fruit and were essential allies of many tree species. Their loss was not only zoological; it was also botanical, ecological, and evolutionary. Some plant species that relied on mastodons for seed dispersal are now critically endangered.

Published in Nature Ecology & Evolution, the research presents the first solid evidence of frugivory in Notiomastodon platensis, a South American Pleistocene mastodon. The findings are based on a multiproxy analysis of 96 fossil teeth collected over a span of more than 1,500 kilometers, from Los Vilos to Chiloé Island in southern Chile. Nearly half of the specimens come from the emblematic site of Lake Tagua Tagua, an ancient lake basin rich in Pleistocene fauna, located in the present-day O'Higgins Region.

The study was led by Dr. Erwin González-Guarda, researcher at the University of O'Higgins and associate at IPHES-CERCA, alongside an international team that includes IPHES-CERCA researchers Dr. Florent Rivals, a paleodiet specialist; Dr. Carlos Tornero and Dr. Iván Ramírez-Pedraza, experts in stable isotopes and paleoenvironmental reconstruction; and Alia Petermann-Pichincura. The research was carried out in collaboration with the Universitat Rovira i Virgili (URV) and the Universitat Autònoma de Barcelona (UAB).

An ecological hypothesis finally proven

In 1982, biologist Daniel Janzen and paleontologist Paul Martin proposed a revolutionary idea: many tropical plants developed large, sweet, and colorful fruits to attract large animals — such as mastodons, native horses, or giant ground sloths — that would serve as seed dispersers. Known as the "neotropical anachronisms hypothesis," this theory remained unconfirmed for over forty years. Now, the study led by González-Guarda provides direct fossil evidence that validates it. To understand the lifestyle of this mastodon, the team employed various techniques: isotopic analysis, microscopic dental wear studies, and fossil calculus analysis. "We found starch residues and plant tissues typical of fleshy fruits, such as those of the Chilean palm (Jubaea chilensis)," explains Florent Rivals, ICREA research professor at IPHES-CERCA and an expert in paleodiet. "This directly confirms that these animals frequently consumed fruit and played a role in forest regeneration."

The forgotten role of large seed dispersers

"Through stable isotope analysis, we were able to reconstruct the animals' environment and diet with great precision," notes Iván Ramírez-Pedraza. The data point to a forested ecosystem rich in fruit resources, where mastodons traveled long distances and dispersed seeds along the way. That ecological function remains unreplaced.

"Dental chemistry gives us a direct window into the past," says Carlos Tornero. "By combining different lines of evidence, we've been able to robustly confirm their frugivory and the key role they played in these ecosystems."

A future threatened by an incomplete past

The extinction of mastodons broke a co-evolutionary alliance that had lasted for millennia. The researchers applied a machine learning model to compare the current conservation status of megafauna-dependent plants across different South American regions. The results are alarming: in central Chile, 40% of these species are now threatened — a rate four times higher than in tropical regions where animals such as tapirs or monkeys still act as alternative seed dispersers.

"Where that ecological relationship between plants and animals has been entirely severed, the consequences remain visible even thousands of years later," says study co-author Andrea P. Loayza.

Species like the gomortega (Gomortega keule), the Chilean palm, and the monkey puzzle tree (Araucaria araucana) now survive in small, fragmented populations with low genetic diversity. They are living remnants of an extinct interaction.

Paleontology as a key to conservation

Beyond its fossil discoveries, the study sends a clear message: understanding the past is essential to addressing today's ecological crises. "Paleontology isn't just about telling old stories," concludes Florent Rivals. "It helps us recognize what we've lost — and what we still have a chance to save."

Materials provided by Universitat Autonoma de Barcelona. Note: Content may be edited for style and length.

AI Reveals Milky Way’s Black Hole Spins Near Top Speed

An international team of astronomers has used artificial intelligence (AI), training a neural network on millions of synthetic simulations, to tease out new cosmic curiosities about black holes, revealing that the one at the center of our Milky Way is spinning at nearly top speed.

These large ensembles of simulations were generated by throughput computing capabilities provided by the Center for High Throughput Computing (CHTC), a joint entity of the Morgridge Institute for Research and the University of Wisconsin-Madison. The astronomers published their results and methodology today in three papers in the journal Astronomy & Astrophysics.

High-throughput computing, celebrating its 40th anniversary this year, was pioneered by Wisconsin computer scientist Miron Livny. It's a novel form of distributed computing that automates computing tasks across a network of thousands of computers, essentially turning a single massive computing challenge into a supercharged fleet of smaller ones. This computing innovation is helping fuel big-data discovery across hundreds of scientific projects worldwide, including the search for cosmic neutrinos, subatomic particles and gravitational waves, as well as efforts to unravel antibiotic resistance.

In 2019, the Event Horizon Telescope (EHT) Collaboration released the first image of a supermassive black hole at the center of the galaxy M87. In 2022, they presented the image of the black hole at the center of our Milky Way, Sagittarius A*. However, the data behind the images still contained a wealth of hard-to-crack information. An international team of researchers trained a neural network to extract as much information as possible from the data.

Previous studies by the EHT Collaboration used only a handful of realistic synthetic data files. Funded by the National Science Foundation (NSF) as part of the Partnership to Advance Throughput Computing (PATh) project, the Madison-based CHTC enabled the astronomers to feed millions of such data files into a so-called Bayesian neural network, which can quantify uncertainties. This allowed the researchers to make a much better comparison between the EHT data and the models.

Thanks to the neural network, the researchers now suspect that the black hole at the center of the Milky Way is spinning at almost top speed. Its rotation axis points to the Earth. In addition, the emission near the black hole is mainly caused by extremely hot electrons in the surrounding accretion disk and not by a so-called jet. Also, the magnetic fields in the accretion disk appear to behave differently from the usual theories of such disks.
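The core Bayesian idea behind comparing data to simulations can be shown in miniature: evaluate how well each candidate model reproduces an observation, then normalize to get a probability over the parameter of interest. This is a deliberately tiny illustration with made-up numbers, not the EHT pipeline, which trains a Bayesian neural network on millions of simulated data files.

```python
import math

# Toy grid-based Bayesian inference over black-hole spin.
# The "simulation" and the observed value are invented for illustration.
spins = [i / 10 for i in range(11)]                 # candidate spins 0.0 .. 1.0
simulated_obs = {a: 1.0 + 0.8 * a for a in spins}   # toy model prediction per spin
observed, sigma = 1.64, 0.05                        # toy measurement and its error

# Gaussian likelihood for each model under a flat prior, then normalize.
like = {a: math.exp(-0.5 * ((observed - simulated_obs[a]) / sigma) ** 2)
        for a in spins}
total = sum(like.values())
posterior = {a: like[a] / total for a in spins}

best = max(posterior, key=posterior.get)
print(f"most probable spin: {best}, posterior mass: {posterior[best]:.2f}")
```

The neural-network approach scales this same comparison to millions of simulations and many parameters at once, with the Bayesian architecture supplying the uncertainty estimates that a single best-fit comparison cannot.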

"That we are defying the prevailing theory is of course exciting," says lead researcher Michael Janssen, of Radboud University Nijmegen, the Netherlands. "However, I see our AI and machine learning approach primarily as a first step. Next, we will improve and extend the associated models and simulations."

"The ability to scale up to the millions of synthetic data files required to train the model is an impressive achievement," adds Chi-kwan Chan, an associate astronomer at Steward Observatory at the University of Arizona and a longtime PATh collaborator. "It requires dependable workflow automation, and effective workload distribution across storage resources and processing capacity."

"We are pleased to see EHT leveraging our throughput computing capabilities to bring the power of AI to their science," says Professor Anthony Gitter, a Morgridge Investigator and a PATh Co-PI. "Like in the case of other science domains, CHTC's capabilities allowed EHT researchers to assemble the quantity and quality of AI-ready data needed to train effective models that facilitate scientific discovery."

The NSF-funded Open Science Pool, operated by PATh, offers computing capacity contributed by more than 80 institutions across the United States. The Event Horizon black hole project performed more than 12 million computing jobs in the past three years.

"A workload that consists of millions of simulations is a perfect match for our throughput-oriented capabilities that were developed and refined over four decades," says Livny, director of the CHTC and lead investigator of PATh. "We love to collaborate with researchers who have workloads that challenge the scalability of our services."

Deep learning inference with the Event Horizon Telescope I. Calibration improvements and a comprehensive synthetic data library. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope II. The Zingularity framework for Bayesian artificial neural networks. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope III. Zingularity results from the 2017 observations and predictions for future array expansions. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Materials provided by Morgridge Institute for Research. Note: Content may be edited for style and length.

Koalas on the brink: Precision DNA test offers a lifeline to Australia’s icons

A University of Queensland-led project has developed a tool to standardize genetic testing of koala populations, providing a significant boost to conservation and recovery efforts.

Dr Lyndal Hulse from UQ's School of the Environment said the standardized koala genetic marker panel provides a consistent method for researchers nationwide to capture and share koala genetic variation, enabling improved collaboration and data integration across studies.

"Koalas in the wild are under increasing pressure from habitat loss, disease and vehicle strikes, forcing them to live in increasingly smaller and more isolated pockets with limited access to breeding mates outside their group," Dr Hulse said.

"Population inbreeding can mean detrimental effects on their health.

"A standardized panel for directly comparing genetic markers enables researchers, conservationists and government agencies to better understand the genetic diversity of koala populations, allowing for greater collaboration to ensure their survival."

Saurabh Shrivastava, Senior Account Manager at project partner the Australian Genome Research Facility (AGRF Ltd), said the new screening tool was a single nucleotide polymorphism (SNP) array that used next-generation sequencing technologies.

"The Koala SNP-array can accommodate good-quality DNA, so it is suitable for broad-scale monitoring of wild koala populations," Mr Shrivastava said.

"Importantly, it is available to all researchers and managers."
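One standard way such SNP panels are used to quantify genetic diversity is expected heterozygosity: for a biallelic marker with allele frequencies p and 1-p, He = 1 - p² - (1-p)², averaged over loci. The sketch below is illustrative only; the allele frequencies are invented, and this is not AGRF's analysis pipeline.

```python
# Expected heterozygosity as a simple diversity summary for biallelic SNPs.
# Populations near fixation (p close to 0 or 1) score low; populations with
# intermediate allele frequencies score high. Frequencies are invented.
def expected_heterozygosity(allele_freqs):
    """Mean expected heterozygosity across biallelic SNP loci."""
    per_locus = [1.0 - p**2 - (1.0 - p)**2 for p in allele_freqs]
    return sum(per_locus) / len(per_locus)

isolated_pop = [0.95, 0.99, 0.90, 0.97]   # near-fixed alleles: low diversity
healthy_pop = [0.55, 0.40, 0.50, 0.62]    # intermediate freqs: high diversity

print(f"isolated population He: {expected_heterozygosity(isolated_pop):.3f}")
print(f"healthy population  He: {expected_heterozygosity(healthy_pop):.3f}")
```

Comparable numbers computed from a shared marker panel are what make it possible to rank populations by inbreeding risk and prioritize candidates for relocation.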

Dr Hulse said ideally the tool could help guide targeted koala relocations across regions.

"There are very strict rules about relocating koalas, but this could be key to improving and increasing the genetics of populations under threat," she said.

"These iconic Australian marsupials are listed as endangered in Queensland, New South Wales and the ACT – and in 50 years we may only be able to see koalas in captivity.

"Understanding the genetic diversity of different populations of koalas is crucial if we're going to save them from extinction."

The project included researchers from the Australasian Wildlife Genomics Group at the University of New South Wales.

AGRF Ltd is a not-for-profit organization advancing Australian genomics through nationwide access to expert support and cutting-edge technology across a broad range of industries including biomedical, health, agriculture and environmental sectors.

Materials provided by University of Queensland. Note: Content may be edited for style and length.

Scientists reveal the hidden trigger behind massive floods

Atmospheric rivers are responsible for most flooding on the West Coast of the U.S., but also bring much needed moisture to the region. The size of these storms doesn't always translate to flood risk, however, as other factors on the ground play important roles. Now, a new study helps untangle the other drivers of flooding to help communities and water managers better prepare.

The research, published June 4 in the Journal of Hydrometeorology, analyzed more than 43,000 atmospheric river storms across 122 watersheds on the West Coast between 1980 and 2023. The researchers found that one of the primary driving forces of flooding is wet soils that can't absorb more water when a storm hits. They showed that flood peaks were 2-4.5 times higher, on average, when soils were already wet. These findings can help explain why some atmospheric river storms cause catastrophic flooding while others of comparable intensity do not. Even weaker storms can generate major floods if their precipitation meets a saturated Earth, while stronger storms may bring needed moisture to a parched landscape without causing flooding.

"The main finding comes down to the fact that flooding from any event, but specifically from atmospheric river storms, is a function not only of the storm size and magnitude, but also what's happening on the land surface," said Mariana Webb, lead author of the study who is completing her Ph.D. at DRI and the University of Nevada, Reno. "This work demonstrates the key role that pre-event soil moisture can have in moderating flood events. Interestingly, flood magnitudes don't increase linearly as soil moisture increases; there's this critical threshold of soil moisture wetness above which you start to see much larger flows."
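The kind of comparison behind the study's headline number can be sketched in a few lines: split storm events by antecedent soil moisture and compare typical flood peaks on either side of a wetness threshold. The events and the threshold below are invented toy values, not the study's data.

```python
from statistics import median

# Toy storm events: (antecedent soil moisture fraction, peak flow in m^3/s).
# Values are invented to mimic the qualitative pattern described in the study.
events = [(0.15, 20), (0.22, 28), (0.30, 35), (0.41, 55),
          (0.55, 60), (0.62, 80), (0.70, 110), (0.75, 130)]

THRESHOLD = 0.5   # assumed critical wetness level for this illustration
dry_peaks = [q for sm, q in events if sm < THRESHOLD]
wet_peaks = [q for sm, q in events if sm >= THRESHOLD]

ratio = median(wet_peaks) / median(dry_peaks)
print(f"median wet-soil peak is {ratio:.1f}x the median dry-soil peak")
```

Run on the real record of 43,000 storms, this style of conditional comparison is what yields the reported 2-4.5x amplification and locates the critical soil moisture threshold for each watershed.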

The study also untangled the environmental conditions of regions where soil moisture has the largest influence on flooding. In arid places like California and southwestern Oregon, storms that hit when soils are already saturated are more likely to cause floods. This is because watersheds in these regions typically have shallow, clay-rich soils and limited water storage capacity. Due to lower precipitation and higher evaporation rates, soil moisture is also more variable in these areas. In contrast, in lush Washington and the interior Cascades and Sierra Nevada regions, watersheds tend to have deeper soils and snowpack, leading to a higher water storage capacity. Although soil saturation can still play a role in driving flooding in these areas, accounting for soil moisture is less valuable for flood management because soils are consistently wet or insulated by snow.

"We wanted to identify the watersheds where having additional information about the soil moisture could enhance our understanding of flood risk," Webb said. "It's the watersheds in more arid climates, where soil moisture is more variable due to evaporation and less consistent precipitation, where we can really see improvements in flood prediction."

Although soil moisture data is currently measured at weather monitoring stations like the USDA's SNOTEL Network, observations are relatively sparse compared to other measures like rainfall. Soil moisture can also vary widely within a single watershed, so often multiple stations are required to give experts a clear picture that can help inform flooding predictions. Increased monitoring in watersheds identified as high-risk, including real-time soil moisture observations, could significantly enhance early warning systems and flood management as atmospheric rivers become more frequent and intense.

By tailoring flood risk evaluations to a specific watershed's physical characteristics and climate, the study could improve flood-risk predictions. The research demonstrates how flood risk increases not just with storm size and magnitude, but with soil moisture, highlighting the value of integrating land surface conditions into impact assessments for atmospheric rivers. "My research really focuses on this interdisciplinary space between atmospheric science and hydrology," Webb said. "There's sometimes a disconnect where atmospheric scientists think about water up until it falls as rain, and hydrologists start their work once the water is on the ground. I wanted to explore how we can better connect these two fields."

Webb worked with DRI ecohydrologist Christine Albano to produce the research, building on Albano's extensive expertise studying atmospheric rivers, their risks, and their impacts on the landscape.

"While soil saturation is widely recognized as a key factor in determining flood risk, Mari's work helps to quantify the point at which this level of saturation leads to large increases in flood risk across different areas along the West Coast," Albano said. "Advances in weather forecasting allow us to see atmospheric rivers coming toward the coast several days before they arrive. By combining atmospheric river forecast information with knowledge of how close the soil moisture is to critical saturation levels for a given watershed, we can further improve flood early warning systems."

Materials provided by Desert Research Institute. Note: Content may be edited for style and length.

Impossible signal from deep beneath Antarctic ice baffles physicists

A cosmic particle detector in Antarctica has emitted a series of bizarre signals that defy the current understanding of particle physics, according to an international research group that includes scientists from Penn State. The unusual radio pulses were detected by the Antarctic Impulsive Transient Antenna (ANITA) experiment, a range of instruments flown on balloons high above Antarctica that are designed to detect radio waves from cosmic rays hitting the atmosphere.

The goal of the experiment is to gain insight into distant cosmic events by analyzing signals that reach the Earth. Rather than reflecting off the ice, the signals — a form of radio waves — appeared to be coming from below the horizon, an orientation that cannot be explained by the current understanding of particle physics and may hint at new types of particles or interactions previously unknown to science, the team said.

The researchers published their results in the journal Physical Review Letters.

"The radio waves that we detected were at really steep angles, like 30 degrees below the surface of the ice," said Stephanie Wissel, associate professor of physics, astronomy and astrophysics who worked on the ANITA team searching for signals from elusive particles called neutrinos.

She explained that by their calculations, the anomalous signal had to pass through and interact with thousands of kilometers of rock before reaching the detector, which should have left the radio signal undetectable because it would have been absorbed into the rock.
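The absorption argument is an order-of-magnitude one: transmitted power falls off exponentially, as exp(-d/L), over a path of length d with attenuation length L. The attenuation length below is an assumed round number for rock, purely illustrative; real values depend strongly on frequency and composition.

```python
import math

# Why a through-Earth radio signal "should" be undetectable:
# exponential attenuation over thousands of kilometers of rock.
L_KM = 1.0      # assumed attenuation length in rock, km (illustrative only)
d_km = 5000.0   # thousands of km of chord through the Earth, per the article

surviving_fraction = math.exp(-d_km / L_KM)
print(f"surviving power fraction after {d_km:.0f} km: {surviving_fraction:.3g}")
```

Even with a generously long attenuation length, the surviving fraction underflows to zero, which is why a signal arriving from 30 degrees below the ice surface is so hard to reconcile with known particles.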

"It's an interesting problem because we still don't actually have an explanation for what those anomalies are, but what we do know is that they're most likely not representing neutrinos," Wissel said.

Neutrinos, a type of particle with no charge and the smallest mass of all subatomic particles, are abundant in the universe. Usually emitted by high-energy sources like the sun or major cosmic events like supernovas or even the Big Bang, there are neutrino signals everywhere. The problem with these particles, though, is that they are notoriously difficult to detect, Wissel explained.

"You have a billion neutrinos passing through your thumbnail at any moment, but neutrinos don't really interact," she said. "So, this is the double-edged sword problem. If we detect them, it means they have traveled all this way without interacting with anything else. We could be detecting a neutrino coming from the edge of the observable universe."

Once detected and traced to their source, these particles can reveal more about cosmic events than even the most high-powered telescopes, Wissel added, as the particles travel undisturbed at nearly the speed of light, carrying clues about cosmic events that happened light-years away.

Wissel and teams of researchers around the world have been working to design and build special detectors to capture sensitive neutrino signals, even in relatively small amounts. Even one small signal from a neutrino holds a treasure trove of information, so all data has significance, she said.

"We use radio detectors to try to build really, really large neutrino telescopes so that we can go after a pretty low expected event rate," said Wissel, who has designed experiments to spot neutrinos in Antarctica and South America.

ANITA is one of these detectors, and it was placed in Antarctica because there is little chance of interference from other signals. To capture the emission signals, the balloon-borne radio detector is sent to fly over stretches of ice, capturing what are called ice showers.

"We have these radio antennas on a balloon that flies 40 kilometers above the ice in Antarctica," Wissel said. "We point our antennas down at the ice and look for neutrinos that interact in the ice, producing radio emissions that we can then sense on our detectors."

These special ice-interacting neutrinos, called tau neutrinos, produce a secondary particle called a tau lepton that is released out of the ice and decays, the physics term for how a particle breaks down into lighter constituents as it travels. This produces emissions known as air showers.

If they were visible to the naked eye, air showers might look like a sparkler waved in one direction, with sparks trailing it, Wissel explained. The researchers can distinguish between the two signals — ice and air showers — to determine attributes about the particle that created the signal.

These signals can then be traced back to their origin, similar to how a ball thrown at an angle will predictably bounce back at the same angle, Wissel said. The recent anomalous findings, though, cannot be traced back in such a manner as the angle is much sharper than existing models predict.
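The bouncing-ball picture above is just specular reflection, and it can be sketched with simple vector geometry. This is a hedged illustration of the principle, not the collaboration's actual reconstruction code: reflecting off a flat surface flips the vertical component of a ray's direction, so the exit angle always equals the entry angle — which is why an emission emerging far more steeply than models predict cannot be traced back as an ordinary reflection.

```python
import math

# Minimal 2D sketch of specular reflection off a flat ice surface.
# Assumption: the surface is horizontal, with its normal pointing up.

def reflect(direction):
    """Reflect a 2D direction vector (dx, dz) off a horizontal surface:
    the horizontal component is preserved, the vertical one flips sign."""
    dx, dz = direction
    return (dx, -dz)

def angle_from_horizontal(direction):
    """Magnitude of the ray's angle above/below the horizontal, in degrees."""
    dx, dz = direction
    return math.degrees(math.atan2(abs(dz), abs(dx)))

incoming = (1.0, -0.3)          # shallow, downward-going ray
outgoing = reflect(incoming)    # same horizontal speed, now upward-going

# A reflected ray leaves at exactly the angle it arrived at --
# so a much steeper emerging signal cannot be a simple reflection.
assert angle_from_horizontal(incoming) == angle_from_horizontal(outgoing)
```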

By analyzing data collected from multiple ANITA flights and comparing it with mathematical models and extensive simulations of both regular cosmic rays and upward-going air showers, the researchers were able to filter out background noise and eliminate the possibility of other known particle-based signals.

The researchers then cross-referenced signals from other independent detectors like the IceCube Experiment and the Pierre Auger Observatory to see if data from upward-going air showers, similar to those found by ANITA, were captured by other experiments.

Analysis revealed that the other detectors registered nothing that could explain what ANITA detected, which led the researchers to describe the signal as "anomalous" — meaning the particles causing it are unlikely to be neutrinos, Wissel explained. The signals do not fit within the standard picture of particle physics, and while several theories suggest they may be a hint of dark matter, the lack of follow-up observations with IceCube and Auger significantly narrows the possibilities, she said.

Penn State has built detectors and analyzed neutrino signals for close to 10 years, Wissel explained, adding that her team is currently designing and building the next big detector. The new detector, called PUEO, will be larger and better at detecting neutrino signals, Wissel said, and will hopefully shed light on what exactly the anomalous signal is.

"My guess is that some interesting radio propagation effect occurs near ice and also near the horizon that I don't fully understand, but we certainly explored several of those, and we haven't been able to find any of those yet either," Wissel said. "So, right now, it's one of these long-standing mysteries, and I'm excited that when we fly PUEO, we'll have better sensitivity. In principle, we should pick up more anomalies, and maybe we'll actually understand what they are. We also might detect neutrinos, which would in some ways be a lot more exciting."

The other Penn State co-author is Andrew Zeolla, a doctoral candidate in physics. The research conducted by scientists from Penn State was funded by the U.S. Department of Energy and the U.S. National Science Foundation. The paper contains the full list of collaborators and authors.

Materials provided by Penn State. Note: Content may be edited for style and length.

83% of Earth’s climate-critical fungi are still unknown

Mycorrhizal fungi help regulate Earth's climate and ecosystems by forming underground networks that provide plants with essential nutrients while drawing carbon deep into soils. Scientists and conservationists have been racing to find ways to protect these underground fungi, but they keep finding dark taxa – species known only from DNA sequences that cannot be linked to any named or described species.

It is estimated that only 155,000 of the roughly 2-3 million fungal species on the planet have been formally described. Now, a review published in Current Biology on June 9 shows that as much as 83% of ectomycorrhizal species are so-called dark taxa. The study helps identify underground hotspots of unknown mycorrhizal species occurring in tropical forests in Southeast Asia and Central and South America, tropical forests and shrublands in central Africa, Sayan montane conifer forests above Mongolia, and more. This discovery has serious implications for conservation.

Names are important in the natural sciences. Traditionally, once a species is described, it is given a binomial – a name made of two Latin words identifying the genus and species. These names are used to categorize fungi, plants, and animals, and are critical identifiers for conservation and research. Most mycorrhizal fungi in the wild are found using environmental DNA (eDNA) — genetic material that organisms shed into their surroundings. Scientists extract fungal eDNA from soil and root samples, sequence that DNA, and then run those sequences through a bioinformatics pipeline that matches each sequence with a described species. For dark taxa there are no matches – just strings of As, Gs, Cs, and Ts.
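The matching step described above can be illustrated with a toy lookup. The species names and sequences here are invented for the example, and real pipelines match against curated reference databases using similarity thresholds rather than exact string equality — this sketch only shows the logic that leaves dark taxa behind:

```python
# Toy illustration of the eDNA matching step: each environmental sequence
# is looked up against a reference of described species; any sequence with
# no match is a "dark taxon" -- known only as a string of A/C/G/T.
# Sequences and species names below are invented for illustration.

reference = {
    "ACGTACGTAA": "Amanita muscaria",
    "TTGCAGGCTA": "Boletus edulis",
}

environmental_reads = ["ACGTACGTAA", "GGGTTTCCCA", "TTGCAGGCTA", "CCATGGATCC"]

named, dark_taxa = [], []
for seq in environmental_reads:
    if seq in reference:
        named.append(reference[seq])   # linked to a described species
    else:
        dark_taxa.append(seq)          # no described species to link it to

print(named)      # ['Amanita muscaria', 'Boletus edulis']
print(dark_taxa)  # ['GGGTTTCCCA', 'CCATGGATCC']
```

In practice the lookup is a sequence-similarity search (typically accepting matches above roughly 97% identity), which is why an unmatched read remains nothing more than its letters.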

"We are a long way out from getting all fungal DNA sequences linked to named species," says lead author Laura van Galen, a microbial ecologist working with the Society for the Protection of Underground Networks (SPUN) and ETH University, Switzerland. "Environmental DNA has enormous potential as a research tool to detect fungal species, but we can't include unnamed species in conservation initiatives. How can you protect something that hasn't yet been named?"

Ectomycorrhizal fungi are one of the largest groups of mycorrhizal fungi and form symbiotic partnerships with about 25% of global vegetation. Ectomycorrhizal fungi facilitate the drawdown of over 9 billion tons of CO2 annually (over 25% of yearly fossil fuel emissions) and help Earth's forests function by regulating nutrient cycles, enhancing stress tolerance, and even breaking down pollutants.

The researchers' work has uncovered that dark taxa of ectomycorrhizal fungi are not spread evenly across the Earth. "There are hotspots of high dark taxa around the globe, but particularly they are concentrated in tropical regions in Southeast Asia and parts of South America and Africa," says van Galen. "Most of the research on ectomycorrhizal fungi has been focused in the North, but mid-latitude and southern-hemisphere regions show signs of being home to many unknown species. This means there is a mismatch in resources and funding. We need to bridge this gap and facilitate more tropical researchers and those from southern-hemisphere regions to focus on identifying these super-important fungi."

The researchers have suggestions of how we can start bringing these fungi out of the shadows. "One way to reduce the dark taxa problem is to collect, study and sequence mushrooms and other fungi," says co-author Camille Truong, a mycorrhizal ecologist at SPUN and research scientist at the Royal Botanic Gardens Victoria in Australia. "Conversely, there are mushrooms that have been sitting for decades in collections of botanical gardens. These should be urgently sequenced so that we can, hopefully, start matching them up with some of these dark taxa."

Many of the unidentified fungal species are associated with plants that are themselves endangered. "We're at risk here," says van Galen. "If we lose these host plants, we might also be losing really important fungal communities that we don't know anything about yet."

The technology is available – what's missing is attention. "We really need to pay so much more attention to fungi in the soil so that we can understand the species and protect them and conserve them before we lose them," says van Galen. The team hopes that conservation organizations will use the information to protect hotspots of underground biodiversity, even if these species remain nameless.

Materials provided by SPUN (Society for the Protection of Underground Networks). Note: Content may be edited for style and length.

Tiny wasp’s shocking reproductive trick may transform global agriculture

Scientists have shed new light on the evolution of an important species of wasp – and believe that the findings could help improve the effectiveness of natural pest control.

Dr Rebecca Boulton, from the University of Stirling, has shown, for the first time, that Lysiphlebus fabarum – a tiny species of wasp – can reproduce with or without a mate. This discovery challenges the previous assumption that asexual females could not mate and produce offspring sexually.

Significantly, the wasps lay their eggs inside small sap-sucking insects called aphids before consuming their host from the inside out — meaning that they are natural pest controllers.

Lysiphlebus fabarum is known to have both sexual and asexual populations but, until now, it was not known whether asexual females could reproduce sexually with males. The discovery opens up new possibilities for improving biological pest control.

Many species of parasitoid wasps are mass-reared and released as a natural alternative to pesticides because they lay their eggs on or in other species, many of which are pests, before the developing wasp larvae consume their hosts, killing them in the process.

Asexual reproduction makes it easy to produce large numbers of wasps, but these need to be suitably adapted to local pests and environments to be effective. Currently, Lysiphlebus fabarum is not used commercially despite being found worldwide and naturally targeting aphids.

Developing an understanding of how the species reproduces could help boost genetic diversity in commercially reared lines, making future biocontrol agents more resilient and better adapted.

Dr Boulton, a lecturer in Biological and Environmental Sciences at the University's Faculty of Natural Sciences, led the study. She said: "In an evolutionary sense, facultative sex seems like a perfect strategy – asexual reproduction is highly efficient, and takes away the costs of finding a mate as well as the risks of failing to find one.

"But sex is really important for evolution. When females reproduce asexually they don't mix their genes up with any others which limits the potential for evolution to happen.

"If the environment changes, asexual species may be unable to adapt in the same way that sexuals can.

"Facultative sex brings the efficiency of asexual reproduction with the evolutionary benefits of sex and so has been touted as the best of both worlds.

"The results of my study show that there might be hidden costs to facultative sex though as it reduces female wasps' reproductive success, and this might limit how frequently it occurs in nature.

"The wasps that I studied are an important natural enemy of aphids, they aren't currently commercially reared, but they are found globally.

"My findings could be used to develop new biocontrol agents that can be used to control aphids throughout the world, harnessing their natural reproductive behavior to ensure that they are adapted to the hosts and environments that are specific to different regions."

Dr Boulton reared the wasps in a Controlled Environment Facility (CEF) at the University and had initially planned to put asexual and sexual wasps together, in direct competition, to see which parasitized the most aphids.

However, in the early stages of these experiments she realized the female asexual wasps were behaving unexpectedly and were mating with males from the sexual population.

This led to a change in strategy, as she started to record this behavior in more detail, before carrying out wasp paternity testing to see whether the asexual females were just mating or actually fertilizing eggs.

Once it was confirmed that the asexual wasps were engaging in facultative sex, Dr Boulton carried out an experiment where asexual females either mated or didn't, before examining how successful these females, and their daughters, were at parasitizing aphids.

The study involved putting around 300 wasps, each around 1mm long, in their own petri dish with a colony of sap-sucking aphids and counting how many were parasitized.

Lysiphlebus fabarum wasps only live a few days but spend two weeks developing as larvae on their hosts.

The entire experiment, which was carried out across two generations of wasps, took six weeks to run.

On completion Dr Boulton extracted DNA from the wasps and sent it to be paternity tested. When the results were returned it was clear that the asexual wasps which mated were, in most cases, reproducing sexually as their offspring had bits of DNA that were only found in the fathers.
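The logic of that paternity test can be sketched as a simple marker check. The allele labels here are invented for illustration, and real genotyping compares panels of genetic markers rather than named sets — the sketch only captures the inference: markers present in the offspring but unique to the father imply sexual reproduction.

```python
# Sketch of the paternity-test inference: if an offspring carries genetic
# markers found in the father but absent from the mother, it must have
# been produced sexually. Allele labels are invented for illustration.

def reproduced_sexually(offspring, mother, father):
    """True if the offspring carries any marker unique to the father."""
    paternal_only = set(father) - set(mother)
    return bool(set(offspring) & paternal_only)

mother = {"a1", "a2"}
father = {"a3", "a4"}

assert reproduced_sexually({"a1", "a3"}, mother, father)      # sexual
assert not reproduced_sexually({"a1", "a2"}, mother, father)  # clonal
```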

The study, "Is facultative sex the best of both worlds in the parasitoid wasp Lysiphlebus fabarum?", is published in Royal Society Open Science.

It was funded through a BBSRC Discovery fellowship.

Professor Anne Ferguson-Smith, Executive Chair of BBSRC, said: "This is an exciting example of how BBSRC's Discovery Fellowships are helping talented early career researchers explore fundamental questions in bioscience with real-world relevance.

"Dr Boulton's work, which measures the costs of sex in this predominantly asexual parasitoid wasp, opens up promising avenues for more sustainable pest control. Supporting curiosity-driven research like this not only strengthens the UK's research base, but helps drive innovation that benefits the environment, food systems and society at large."

Materials provided by University of Stirling. Note: Content may be edited for style and length.

Galactic mystery: Why massive stars struggle to form in the Milky Way’s center

New research led by Dr. James De Buizer at the SETI Institute and Dr. Wanggi Lim at IPAC at Caltech revealed surprising results about the rate at which high-mass stars form in the Galactic Center of the Milky Way. The researchers based their study primarily on observations from NASA's now-retired SOFIA airborne observatory, focusing on three star-forming regions — Sgr B1, Sgr B2, and Sgr C — located at the heart of the Galaxy. Although the central part of our Galaxy contains a much higher density of star-forming material than the rest of the Milky Way, the current rate of formation of massive stars there (those more than 8 times the mass of our Sun) appears to be lower than in the rest of the Galaxy.

The team compared these three Galactic Center star-forming regions to similar-sized regions further out in the Galaxy, including those closer to our Sun, and confirmed that the rate of star formation is below average near the Galactic Center. Their study finds that despite the Galactic Center's dense clouds of gas and dust, conditions that typically produce stars with high masses, these star-forming regions struggle to form high-mass stars. Furthermore, the studied areas appear to lack sufficient material for continued star formation, suggesting such regions effectively produce just one generation of stars, unlike typical star-forming regions.

"Recent studies have concluded that star formation is likely depressed near the Galactic Center, and even that there may be no present star formation occurring there," said De Buizer, lead author of the study. "Since presently-forming massive stars are brightest at long infrared wavelengths, we obtained the highest resolution infrared images of our Galaxy's central-most star-forming regions. The data show that, contrarily, massive stars are presently forming there, but confirm at a relatively low rate."

The study suggests that the slowdown in star formation is caused by the extreme conditions in the Galactic Center. These regions orbit swiftly around the black hole at the center of the Galaxy, interacting with older stars and possibly with other material falling toward the black hole. These conditions could prevent gas clouds from holding together long enough to form stars in the first place, and could disperse those that do form stars before they can sustain further star formation.

However, Sgr B2 appears to be the exception. Although its rate of present massive star formation is unusually low, like the other Galactic Center regions studied, it seems to have maintained its reservoir of dense gas and dust, allowing for a future emergent star cluster to be born.

Traditionally, astronomers have viewed giant H II regions — large clouds of gas, mainly hydrogen — like Sgr B1 and Sgr C as hosts of massive star clusters still embedded in their birth clouds. This study challenges that assumption. The team argues these two regions may not fit the classical definition at all, or they may represent a new, previously unrecognized category of stellar nursery.

Because these star-forming regions are enshrouded in gas and dust that hide them from view at all but the longest infrared wavelengths, SOFIA's high-resolution infrared eyes were needed for the team to identify more than six dozen presently-forming massive stars within the Galactic Center regions. However, these regions formed fewer stars — and topped out at a lower stellar mass — than the Galactic average.

"These Galactic Center star-forming regions are in many ways very similar to the massive star-forming regions in the relatively calm backwaters of our galaxy," said Lim. "However, the most massive stars we are finding in these Galactic Center regions, though still remarkably large, fall short in both size and quantity compared to those found in similar regions elsewhere in our Galaxy. Furthermore, such star-forming regions typically hang on to large reservoirs of star-forming material and continue to produce multiple epochs of stars, but that appears to not be the case for these Galactic Center regions."

Lim will present the results of this study at the 246th meeting of the American Astronomical Society in Anchorage, AK.

Materials provided by SETI Institute. Note: Content may be edited for style and length.
