Brain food fight: Rutgers maps the hidden switch that turns cravings on and off

Scientists know the stomach talks to the brain, but two new studies from Rutgers Health researchers suggest the conversation is really a tug-of-war, with one side urging another bite, the other signaling "enough."

Together, the papers in Nature Metabolism and Nature Communications trace the first complementary wiring diagram of hunger and satiety in ways that could refine today's blockbuster weight-loss drugs and blunt their side effects.

One study, led by Zhiping Pang of Robert Wood Johnson Medical School's Center for NeuroMetabolism, pinpointed a slender bundle of neurons that runs from the hypothalamus to the brainstem.

The cells bristle with GLP-1 receptors, the proteins mimicked by weight-loss drugs such as Ozempic. When Pang's team hit the pathway with pulses of light, well-fed mice quit eating; when they silenced the circuit or deleted the receptor, the animals packed on weight. Fasting weakened the connection until a burst of natural or synthetic GLP-1 restored it.

"The synapse is a volume knob that only turns up when energy stores are low," Pang said, warning that drugs that keep the signal high around the clock could disrupt the brain's normal rhythm and create some of the side effects of GLP-1 drugs such as nausea, vomiting, constipation or diarrhea and muscle wasting.

For the other paper, Mark Rossi, who co-leads the Center for NeuroMetabolism with Pang, charted the circuit that triggers hunger. His group traced inhibitory neurons in the stria terminalis to similar cells in the lateral hypothalamus.

When researchers triggered the connection, a suddenly hungry mouse would sprint for sugar water; when they blocked it, the animals lounged even after a long fast.

Hormones modulated the effect. An injection of ghrelin, the gut's hunger messenger, revved food seeking, while leptin, the satiety signal, slammed it shut. Overfed mice gradually lost the response, but it returned after diets made them thin again.

"Pang's pathway shuts things down," Rossi said. "Ours steps on the accelerator."

Although the circuits sit in different corners of the brain, members of both teams saw the same principle: Energy state rewires synapses quickly. During a fast, the hunger circuit gains sensitivity while the satiety circuit loosens; after a meal, the relationship flips.

It is the first time researchers have watched the push-pull mechanism operate in parallel pathways. This yin-yang arrangement may explain why diets and drugs that treat only one side of the equation often lose power over time, and it could guide the design of drugs that work even better than today's generation of GLP-1 medications.

GLP-1 mimics such as Wegovy and Zepbound can trigger double-digit weight loss but also nausea, diarrhea and, in some cases, muscle wasting. Pang's data suggest a therapy targeting only the brainstem circuit and sparing peripheral organs might curb eating without the side effects. Conversely, Rossi's work hints that restoring the body's response to the hunger-regulating hormone ghrelin could help dieters who plateau after months of calorie cutting.

Both projects relied on the modern neuroscience toolkit – optogenetics to fire axons with laser light, chemogenetics to silence them, fiber-optic photometry to watch calcium pulses and old-fashioned patch-clamp recordings to monitor single synapses. Those techniques allowed the researchers to tune individual pathways with a precision that has only recently become possible.

Follow-up work from both teams will explore more questions that could improve drug design. Pang wants to measure GLP-1 release in real time to see whether short bursts, rather than constant exposure, are enough to calm appetite. Rossi is cataloging the molecular identity of his hunger-trigger cells in hopes of finding drug targets that steer craving without crushing the joy of eating.

"You want to keep the system's flexibility," Rossi said. "It's the difference between dimming the lights and flicking them off."

Allowing the brain to correctly rebalance the desire to eat or stop eating throughout the day, rather than using drugs to keep desire constantly low, may be an important ingredient in tomorrow's weight-loss prescriptions.

Materials provided by Rutgers University. Note: Content may be edited for style and length.

Africa’s pangolin crisis: The delicacy that’s driving a species to the brink

The vast majority of pangolin hunting in African forest landscapes is done for meat consumed by people in the region, rather than for scales shipped to East Asia, a new study led by the University of Cambridge suggests.

Pangolins are the most heavily trafficked wild mammal in the world. A solitary, insect-eating animal about the size of a large domestic cat*, pangolins are famous for their highly prized keratin scales — a staple of traditional Chinese medicine.

All eight existing pangolin species are threatened with extinction and on the IUCN's Red List, with three Asian species categorised as critically endangered.

As Asian pangolins have declined dramatically, Nigeria has seen a boom in the export of pangolin scales to Asia. While hunting pangolins is illegal in Nigeria, the West African country is now the world's largest hub for the criminal trade in pangolin products.

However, a new study published in the journal Nature Ecology & Evolution suggests that some 98% of Nigerian pangolins are caught for meat first and foremost, with around two-thirds of scales from these animals simply thrown away.

A research team led by Cambridge collected data from over eight hundred hunters and traders in thirty-three locations across Nigeria's Cross River Forest region, primarily between 2020 and 2023, during which time the conservationists estimate that around 21,000 pangolins were killed annually in the area.

Almost all pangolins were captured "opportunistically" or during general hunting trips (97%) rather than sought out, and caught primarily for meat (98%). Around 71% of pangolins were consumed by hunters themselves, with 27% traded locally as food.

Perhaps surprisingly, given their potential overseas value, around 70% of the scales were discarded, while less than 30% were sold on. However, researchers calculated that, per animal, pangolin meat fetched 3-4 times the price of scales at local Nigerian markets.

"Thousands of kilos of pangolin scales are seized at Nigeria's ports, creating the impression that the international demand for scales is behind pangolin exploitation in West Africa," said study lead author and Gates Cambridge Scholar Dr Charles Emogor, who conducted the research for his PhD at the University of Cambridge's Department of Zoology.

"When we spoke to hunters and traders on the ground around the Cross River forest, the largest stronghold for Nigeria's pangolins, it was obvious that meat was the motivation for almost all of the pangolin killings."

"We found that dedicated pangolin hunts are virtually non-existent. Most pangolins are killed by hunters out for any type of game," said Emogor, now a Schmidt Science Fellow split between Cambridge, UK, and Harvard, US.

"Around a third of pangolins are caught opportunistically, often while people are working in the fields. Pangolins curl into a ball when threatened, which sadly makes them easy to catch." Among frequent hunters, by far the most common method of catching pangolins was given as simply picking them up by hand.

While Emogor says the demands of traditional medicine markets are exacerbating the decline of African pangolins — his previous research showed that shipments intercepted by Nigerian authorities between 2010 and 2021 alone amounted to 190,407 kilos of pangolin scales taken from around 800,000 dead animals — pangolins were exploited in West Africa long before being trafficked to Asia.

The meat is a delicacy in parts of Nigeria, often procured for pregnant women in the belief it helps produce strong babies. Emogor and colleagues surveyed hunters and Cross River locals on "palatability": asking them to rank the tastiness of almost a hundred different animals eaten in the region, from domestic beef and chicken to catfish, monkeys and antelope.

The three major African pangolin species were rated as the most palatable of all available meats, with average scores of almost nine out of ten, and the giant pangolin considered the most appetising meat in the region.

"Pangolins face a lethal combination of threats," said Emogor. "Pangolins are easy to hunt, breed slowly, taste good to humans, and are falsely believed to have curative properties in traditional medicines. In addition, their forest habitat is being destroyed."

Emogor's research led him to set up Pangolino in 2021, a global network of volunteers, scientists and pangolin enthusiasts committed to saving the endangered animal. He points out that policy interventions to tackle meat-driven pangolin trading might be cheaper than those aimed at the international scales market.

Such interventions should include anti-poaching patrols as well as community programmes focused on food security. Through Pangolino, Emogor is piloting interventions in four Southeast Nigerian communities by helping create by-laws that prohibit pangolin killing, with financial rewards for compliance.

"Clearly in designing any intervention we need good information on what's motivating the hunters," said Prof Andrew Balmford, co-author from Cambridge's Department of Zoology. "That's why studies such as this are vital for effective conservation of endangered species."

While the latest study focused on Nigeria, researchers say their pangolin hunting and consumption data echo that from countries such as Cameroon and Gabon — suggesting these patterns may be Africa-wide.

Raised on the edge of the Cross River National Park, home to Nigeria's endangered white-bellied and black-bellied pangolins, Emogor grew up surrounded by wildlife. Yet during childhood he only ever saw dead pangolins, and didn't encounter a living animal until his mid-twenties.

"If we lose the pangolin, we lose 80 million years of evolution," said Emogor. "Pangolins are the only mammals with scales, and their ancestors existed when dinosaurs still roamed the planet."

The latest study was conducted by an international team of researchers from the University of Cambridge, Wildlife Conservation Society, Pangolin Protection Network, University of Washington, CIFOR, CARE International, as well as the UK universities of Oxford, Exeter and Kent.

*While this is a rough size for some African species, such as the white-bellied pangolin, the giant pangolin can weigh up to 30 kg.

Materials provided by University of Cambridge. Note: Content may be edited for style and length.

This quantum sensor tracks 3D movement without GPS

In a new study, physicists at the University of Colorado Boulder have used a cloud of atoms chilled down to incredibly cold temperatures to simultaneously measure acceleration in three dimensions — a feat that many scientists didn't think was possible.

The device, a new type of atom "interferometer," could one day help people navigate submarines, spacecraft, cars and other vehicles more precisely.

"Traditional atom interferometers can only measure acceleration in a single dimension, but we live within a three-dimensional world," said Kendall Mehling, a co-author of the new study and a graduate student in the Department of Physics at CU Boulder. "To know where I'm going, and to know where I've been, I need to track my acceleration in all three dimensions."

The researchers published their paper, titled "Vector atom accelerometry in an optical lattice," this month in the journal Science Advances. The team included Mehling; Catie LeDesma, a postdoctoral researcher in physics; and Murray Holland, professor of physics and fellow of JILA, a joint research institute between CU Boulder and the National Institute of Standards and Technology (NIST).

In 2023, NASA awarded the CU Boulder researchers a $5.5 million grant through the agency's Quantum Pathways Institute to continue developing the sensor technology.

The new device is a marvel of engineering: Holland and his colleagues employ six laser beams, each as thin as a human hair, to pin a cloud of tens of thousands of rubidium atoms in place. Then, with help from artificial intelligence, they manipulate those lasers in complex patterns — allowing the team to measure the behavior of the atoms as they react to small accelerations, like pressing the gas pedal down in your car.

Today, most vehicles track acceleration using GPS and traditional, or "classical," electronic devices known as accelerometers. The team's quantum device has a long way to go before it can compete with these tools. But the researchers see a lot of promise for navigation technology based on atoms.

"If you leave a classical sensor out in different environments for years, it will age and decay," Mehling said. "The springs in your clock will change and warp. Atoms don't age."

Interferometers, in some form or another, have been around for centuries — and they've been used to do everything from transporting information over optical fibers to searching for gravitational waves, or ripples in the fabric of the universe.

The general idea involves splitting things apart and bringing them back together, not unlike unzipping, then zipping back up a jacket.

In laser interferometry, for example, scientists first shine a laser, then split it into two identical beams that travel over two separate paths. Eventually, they bring the beams back together. If the lasers have experienced diverging effects along their journeys, such as gravity acting in different ways, they may not mesh perfectly when they recombine. Put differently, the zipper might get stuck. Researchers can make measurements based on how the two beams, once identical, now interfere with each other — hence the name.
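The arithmetic behind that interference can be sketched in a few lines. Assuming two idealized unit-amplitude beams that recombine with a phase difference φ, the combined intensity is I = |1 + e^(iφ)|² = 2(1 + cos φ) — maximal when the beams are in phase, zero when they are half a wavelength apart:

```python
import math

def interference_intensity(phase_diff_rad: float) -> float:
    """Combined intensity of two unit-amplitude beams that
    recombine with a given phase difference:
    I = |1 + exp(i*phi)|^2 = 2 * (1 + cos(phi))."""
    return 2.0 * (1.0 + math.cos(phase_diff_rad))

# In phase: the beams reinforce (bright fringe).
print(interference_intensity(0.0))      # 4.0
# Half a wavelength out of phase: the beams cancel (dark fringe).
print(interference_intensity(math.pi))  # ~0.0
```

Measuring where the intensity lands between these extremes reveals the phase difference the two paths accumulated, which is the quantity an interferometer ultimately reads out.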

In the current study, the team achieved the same feat, but with atoms instead of light.

Here's how it works: The device currently fits on a bench about the size of an air hockey table. First, the researchers cool a collection of rubidium atoms down to temperatures just a few billionths of a degree above absolute zero.

In that frigid realm, the atoms form a mysterious quantum state of matter known as a Bose-Einstein Condensate (BEC). Carl Wieman, then a physicist at CU Boulder, and Eric Cornell of JILA won a Nobel Prize in 2001 for creating the first BEC.

Next, the team uses laser light to jiggle the atoms, splitting them apart. In this case, that doesn't mean that groups of atoms are separating. Instead, each individual atom exists in a ghostly quantum state called a superposition, in which it can be in two places at the same time.

When the atoms split and separate, those ghosts travel away from each other following two different paths. (In the current experiment, the researchers didn't actually move the device itself but used lasers to push on the atoms, causing acceleration).

"Our Bose-Einstein Condensate is a matter-wave pond made of atoms, and we throw stones made of little packets of light into the pond, sending ripples both left and right," Holland said. "Once the ripples have spread out, we reflect them and bring them back together where they interfere."

When the atoms snap back together, they form a unique pattern, just like the two beams of laser light zipping together but more complex. The result resembles a thumb print on a glass.

"We can decode that fingerprint and extract the acceleration that the atoms experienced," Holland said.

The group spent almost three years building the device to achieve this feat.

"For what it is, the current experimental device is incredibly compact. Even though we have 18 laser beams passing through the vacuum system that contains our atom cloud, the entire experiment is small enough that we could deploy it in the field one day," LeDesma said.

One of the secrets to that success comes down to an artificial intelligence technique called machine learning. Holland explained that splitting and recombining the rubidium atoms requires adjusting the lasers through a complex, multi-step process. To streamline the process, the group trained a computer program that can plan out those moves in advance.

So far, the device can only measure accelerations several thousand times smaller than the force of Earth's gravity. Currently available technologies can do a lot better.

But the group is continuing to improve its engineering and hopes to increase the performance of its quantum device many times over in the coming years. Still, the technology is a testament to just how useful atoms can be.

"We're not exactly sure of all the possible ramifications of this research, because it opens up a door," Holland said.

Materials provided by University of Colorado at Boulder. Note: Content may be edited for style and length.

Space-laser AI maps forest carbon in minutes—a game-changer for climate science

Satellite data used by archaeologists to find traces of ancient ruins hidden under dense forest canopies can also be used to measure, more quickly and accurately, how much carbon is retained and released in forests.

Understanding this carbon cycle is key to climate change research, according to Hamdi Zurqani, an assistant professor of geospatial science for the Arkansas Forest Resources Center and the College of Forestry, Agriculture and Natural Resources at the University of Arkansas at Monticello. The center is headquartered at UAM and conducts research and extension activities through the Arkansas Agricultural Experiment Station and the Cooperative Extension Service, the University of Arkansas System Division of Agriculture's research and outreach arms.

"Forests are often called the lungs of our planet, and for good reason," Zurqani said. "They store roughly 80 percent of the world's terrestrial carbon and play a critical role in regulating Earth's climate."

To measure a forest's carbon cycle, a calculation of forest aboveground biomass is needed. Though effective, traditional ground-based methods for estimating forest aboveground biomass are labor-intensive, time-consuming and limited in spatial coverage abilities, Zurqani said.

In a study recently published in Ecological Informatics, Zurqani shows how information from open-access satellites can be integrated on Google Earth Engine with artificial intelligence algorithms to quickly and accurately map large-scale forest aboveground biomass, even in remote areas where accessibility is often an issue.

Zurqani's novel approach uses data from NASA's Global Ecosystem Dynamics Investigation LiDAR, also known as GEDI LiDAR, which includes three lasers installed on the International Space Station. The system can precisely measure three-dimensional forest canopy height, canopy vertical structure and surface elevation. LiDAR stands for "light detection and ranging" and uses light pulses to measure distance and create 3D models.

Zurqani also used imagery data from the European Space Agency's collection of Earth observation Copernicus Sentinel satellites — Sentinel-1 and Sentinel-2. Combining the 3D imagery from GEDI and the optical imagery from the Sentinels, Zurqani improved the accuracy of biomass estimations.

The study tested four machine learning algorithms to analyze the data: Gradient tree boosting, random forest, classification and regression trees, or CART, and support vector machine. Gradient tree boosting achieved the highest accuracy score and the lowest error rates. Random forest came in second, proving reliable but slightly less precise. CART provided reasonable estimates but tended to focus on a smaller subset. The support vector machine algorithm struggled, Zurqani said, highlighting that not all AI models are equally suited for estimating aboveground forest biomass in this study.
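The ranking above comes down to scoring each model's predictions against held-out reference data. A minimal sketch of that comparison step, using standard error metrics (RMSE and R²) and entirely hypothetical biomass numbers invented for illustration:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error: lower is better."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def r_squared(y_true, y_pred):
    """Coefficient of determination: closer to 1 is better."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical held-out biomass values (Mg/ha) and predictions from
# two candidate models, used only to illustrate the ranking step.
actual  = [120.0, 95.0, 150.0, 80.0, 110.0]
model_a = [118.0, 97.0, 148.0, 83.0, 108.0]   # tight fit, e.g. boosting
model_b = [110.0, 105.0, 140.0, 90.0, 100.0]  # looser fit

for name, preds in [("model_a", model_a), ("model_b", model_b)]:
    print(name, round(rmse(actual, preds), 2), round(r_squared(actual, preds), 3))
```

Running the same scoring loop over the four algorithms' predictions is what lets a study report that gradient tree boosting "achieved the highest accuracy score and the lowest error rates."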

The most accurate predictions, Zurqani said, came from combining Sentinel-2 optical data, vegetation indices, topographic features, and canopy height with the GEDI LiDAR dataset serving as the reference input for both training and testing the machine learning models, showing that multi-source data integration is critical for reliable biomass mapping.

Zurqani said that accurate forest biomass mapping has real-world implications for better accounting of carbon and improved forest management on a global scale. With more accurate assessments, governments and organizations can more precisely track carbon sequestration and emissions from deforestation to inform policy decisions.

While the study marks a leap forward in measuring aboveground forest biomass, Zurqani said the challenges remaining include the impact weather can have on satellite data. Some regions still lack high-resolution LiDAR coverage. He added that future research may explore deeper AI models, such as neural networks, to refine predictions further.

"One thing is clear," Zurqani said. "As climate change intensifies, technology like this will be indispensable in safeguarding our forests and the planet."

Materials provided by University of Arkansas System Division of Agriculture. Original written by John Lovett. Note: Content may be edited for style and length.

CRISPR-edited stem cells reveal hidden causes of autism

To enable studies of the genetic causes of autism spectrum disorder, a Kobe University research team created a bank of 63 mouse embryonic stem cell lines containing the mutations most strongly associated with the disorder. The achievement was made possible by a new, more efficient method for editing the genome of embryonic stem cells.

Although it is well understood that genetics influence the development of autism spectrum disorder, researchers have yet to pinpoint the precise causes and mechanisms. To study the biological background of diseases, researchers use models: Cell models allow them to study how genetic changes affect a cell's shape and function, while animal models show how changes in cellular components affect health and behavior. Despite significant differences between mice and humans, many disease-causing genes are very similar and cause similar conditions across these species. “One of the problems, however, is the lack of a standardized biological model to study the effects of the different mutations associated with autism spectrum disorder. This makes it difficult to find out, for example, whether they have common effects or what is specific to certain cell types,” explains Kobe University neuroscientist TAKUMI Toru.

Thus, twelve years ago, Takumi and his team embarked on a journey to change that. Being experts in studying mouse models of the disorder, they combined a conventional manipulation technique for mouse embryonic stem cells — cells that can be made to develop into almost any kind of cell in the body — with the then-newly discovered, highly specific and easy-to-handle CRISPR gene editing system. This new method proved highly efficient in making genetic variants of these cells and allowed the Kobe University team to produce a bank of 63 mouse embryonic stem cell lines of the genetic variants most strongly associated with autism spectrum disorder.

In the journal Cell Genomics, Takumi and his team now report that they were able to develop their cells into a broad range of cell types and tissues, and even generate adult mice with their genetic variations. The analysis of these alone proved that their cell lines were adequate models for studying autism spectrum disorder. However, the cell lines also allowed them to conduct large-scale data analyses to clearly identify genes that are abnormally active, and in which cell types this is the case.

One of the things the data analysis brought to light is that autism-causing mutations often result in neurons being unable to eliminate misshapen proteins. “This is particularly interesting since the local production of proteins is a unique feature in neurons, and a lack of quality control of these proteins may be a causal factor of neuronal defects,” explains Takumi.

The Kobe University neuroscientist expects that his team’s achievement, which has been made available to other researchers and can be flexibly integrated with other lab techniques and adjusted to other targets, will be an invaluable resource for the scientific community studying autism and trying to find drug targets. He adds: “Interestingly, the genetic variants we studied are also implicated in other neuropsychiatric disorders such as schizophrenia and bipolar disorder. So, this library may be useful for studying other conditions as well.”

This research was funded by the Japan Society for the Promotion of Science (grants 16H06316, 16F16110, 21H00202, 21H04813, 23KK0132, 23H04233, 24H00620, 24H01241, 24K22036, 17K07119 and 21K07820), the Japan Agency for Medical Research and Development (grant JP21wm0425011), the Japan Science and Technology Agency (grants JPMJPF2018, JPMJMS2299 and JPMJMS229B), the National Center of Neurology and Psychiatry (grant 6-9), the Takeda Science Foundation, the Smoking Research Foundation, the Tokyo Biochemical Research Foundation, the Kawano Masanori Memorial Public Interest Incorporated Foundation for Promotion of Pediatrics, the Taiju Life Social Welfare Foundation, the Tokumori Yasumoto Memorial Trust for Researches on Tuberous Sclerosis Complex and Related Rare Neurological Diseases, and Takeda Pharmaceutical Company Ltd. It was conducted in collaboration with researchers from the RIKEN Center for Brain Science, Radboud University, the RIKEN Center for Integrative Medical Sciences, the Agency for Science, Technology and Research, the RIKEN Center for Biosystems Dynamics Research, and Hiroshima University.

Kobe University is a national university with roots dating back to the Kobe Higher Commercial School founded in 1902. It is now one of Japan’s leading comprehensive research universities with nearly 16,000 students and nearly 1,700 faculty in 11 faculties and schools and 15 graduate schools. Combining the social and natural sciences to cultivate leaders with an interdisciplinary perspective, Kobe University creates knowledge and fosters innovation to address society’s challenges.

Materials provided by Kobe University. Note: Content may be edited for style and length.

Why giant planets might form faster than we thought

An international team of astronomers including researchers at the University of Arizona Lunar and Planetary Laboratory has unveiled groundbreaking findings about the disks of gas and dust surrounding nearby young stars, using the powerful Atacama Large Millimeter/submillimeter Array, or ALMA.

The findings, published in 12 papers in a focus issue of the Astrophysical Journal, are part of an ALMA large program called the ALMA Survey of Gas Evolution of PROtoplanetary Disks, or AGE-PRO. AGE-PRO observed 30 planet-forming disks around sunlike stars to measure gas disk mass at different ages. The study revealed that gas and dust components in these disks evolve at different rates.

Prior ALMA observations have examined the evolution of dust in disks; AGE-PRO, for the first time, traces the evolution of gas, providing the first measurements of gas disk masses and sizes across the lifetime of planet-forming disks, according to the project's principal investigator, Ke Zhang of the University of Wisconsin-Madison.

"Now we have both, the gas and the dust," said Ilaria Pascucci, a professor of planetary sciences at the U of A and one of three AGE-PRO co-principal investigators. "Observing the gas is much more difficult because it takes much more observing time, and that's why we have to go for a large program like this one to obtain a statistically significant sample."

A protoplanetary disk swirls around its host star for several million years as its gas and dust evolve and dissipate, setting the timescale for giant planets to form. The disk's initial mass and size, as well as its angular momentum, have a profound influence on the type of planet it could form — gas giants, icy giants or mini-Neptunes — and migration paths of planets. The lifetime of the gas within the disk determines the timescale for the growth of dust particles to an object the size of an asteroid, the formation of a planet and finally the planet's migration from where it was born.

In one of the survey's most surprising findings, the team discovered that as disks age, their gas and dust are consumed at different rates and undergo a shift in gas-to-dust mass ratio as the disks evolve: Unlike the dust, which tends to remain inside the disk over a longer time span, the gas disperses relatively quickly, then more slowly as the disk ages. In other words, planet-forming disks blow off more of their gas when they're young.

Zhang said the most surprising finding is that although most disks dissipate after a few million years, the ones that survive have more gas than expected. This would suggest that gaseous planets like Jupiter have less time to form than rocky planets.

ALMA's unique sensitivity allowed researchers to study the cold gas in these disks using faint molecular lines, characteristic wavelengths in the light spectrum that essentially act as "fingerprints" identifying different species of gas molecules. The first large-scale chemical survey of its kind, AGE-PRO targeted 30 planet-forming disks in three star-forming regions, ranging from 1 million to 6 million years in age: Ophiuchus (youngest), Lupus (1-3 million years old), and Upper Scorpius (oldest). Using ALMA, AGE-PRO obtained observations of key tracers of gas and dust masses in disks spanning crucial stages of their evolution, from their earliest formation to their eventual dispersal. This ALMA data will serve as a comprehensive legacy library of spectral line observations for a large sample of disks at different evolutionary stages.

Dingshan Deng, a graduate student at LPL who is the lead author on one of the papers, provided the data reduction — essentially, the image analyses needed to turn raw radio signals into images of the disks — for the star-forming region in the constellation of Lupus (Latin for "wolf").

"Thanks to these new and long observations, we now have the ability to estimate and trace the gas masses, not only for the brightest and better studied disks in that region, but also the smaller and fainter ones," he said. "Thanks to the discovery of gas tracers in many disks where it hadn't been seen before, we now have a well-studied sample covering a wide range of disk masses in the Lupus star-forming region."

"It took years to figure out the proper data reduction approach and analysis to produce the images used in this paper for the gas masses and in many other papers of the collaboration," Pascucci added.

Carbon monoxide is the most widely used chemical tracer in protoplanetary disks, but to thoroughly measure the mass of gas in a disk, additional molecular tracers are needed. AGE-PRO used N2H+, or diazenylium, an ion used as an indicator for nitrogen gas in interstellar clouds, as an additional gas tracer to significantly improve the accuracy of measurements. ALMA's detections were also set up to receive spectral light signatures from other molecules, including formaldehyde, methyl cyanide and several molecular species containing deuterium, a hydrogen isotope.

"Another finding that surprised us was that the mass ratio between the gas and dust tends to be more consistent across disks of different masses than expected," Deng said. "In other words, different-size disks will share a similar gas-to-dust mass ratio, whereas the literature suggested that smaller disks might shed their gas faster."

Funding for this study was provided by the National Science Foundation, the European Research Council, the Alexander von Humboldt Foundation and FONDECYT (Chile), among other sources. For full funding information, see the research paper.

Materials provided by University of Arizona. Note: Content may be edited for style and length.

Passive cooling breakthrough could slash data center energy use

Engineers at the University of California San Diego have developed a new cooling technology that could significantly improve the energy efficiency of data centers and high-powered electronics. The technology features a specially engineered fiber membrane that passively removes heat through evaporation. It offers a promising alternative to traditional cooling systems like fans, heat sinks and liquid pumps. It could also reduce the water use associated with many current cooling systems.

The advance is detailed in a paper published on June 13 in the journal Joule.

As artificial intelligence (AI) and cloud computing continue to expand, the demand for data processing — and the heat it generates — is skyrocketing. Currently, cooling accounts for up to 40% of a data center's total energy use. If trends continue, global energy use for cooling could more than double by 2030.

The new evaporative cooling technology could help curb that trend. It uses a low-cost fiber membrane with a network of tiny, interconnected pores that draw cooling liquid across its surface using capillary action. As the liquid evaporates, it efficiently removes heat from the electronics underneath — no extra energy required. The membrane sits on top of microchannels above the electronics, pulling in liquid that flows through the channels and efficiently dissipating heat.

"Compared to traditional air or liquid cooling, evaporation can dissipate higher heat flux while using less energy," said Renkun Chen, professor in the Department of Mechanical and Aerospace Engineering at the UC San Diego Jacobs School of Engineering, who co-led the project with professors Shengqiang Cai and Abhishek Saha, both from the same department. Mechanical and aerospace engineering Ph.D. student Tianshi Feng and postdoctoral researcher Yu Pei, both members of Chen's research group, are co-first authors on the study.

Many applications currently rely on evaporation for cooling. Heat pipes in laptops and evaporators in air conditioners are some examples, explained Chen. But applying it effectively to high-power electronics has been a challenge. Previous attempts using porous membranes — which have high surface areas that are ideal for evaporation — have been unsuccessful because their pores were either so small that they would clog or so large that they would trigger unwanted boiling. "Here, we use porous fiber membranes with interconnected pores with the right size," said Chen. This design achieves efficient evaporation without those downsides.

When tested across variable heat fluxes, the membrane achieved record-breaking performance. It managed heat fluxes exceeding 800 watts of heat per square centimeter — one of the highest levels ever recorded for this kind of cooling system. It also proved stable over multiple hours of operation.
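To put the reported 800 W/cm² in perspective, a back-of-the-envelope calculation shows how much liquid must evaporate to carry that heat away. This is a minimal sketch assuming water as the working fluid (the article does not specify the coolant), using water's latent heat of vaporization:

```python
# Back-of-the-envelope: evaporation rate needed to carry a given heat flux.
# Assumption: water as the working fluid; the actual coolant is not specified.
H_FG_WATER = 2.257e6  # latent heat of vaporization of water, J/kg (at 100 C)

def evaporation_rate(heat_flux_w_per_cm2: float) -> float:
    """Return grams of water that must evaporate per second, per cm^2,
    to remove the given heat flux purely through latent heat."""
    return heat_flux_w_per_cm2 / H_FG_WATER * 1e3  # kg/s -> g/s

# The membrane's ~800 W/cm^2 corresponds to roughly 0.35 g of water
# evaporating per second from each square centimeter.
print(f"{evaporation_rate(800):.2f} g/s per cm^2")
```

The large latent heat is the reason evaporation can dissipate such fluxes with no mechanical work: the energy leaves with the vapor rather than being pumped away.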

"This success showcases the potential of reimagining materials for entirely new applications," said Chen. "These fiber membranes were originally designed for filtration, and no one had previously explored their use in evaporation. We recognized that their unique structural characteristics — interconnected pores and just the right pore size — could make them ideal for efficient evaporative cooling. What surprised us was that, with the right mechanical reinforcement, they not only withstood the high heat flux, they performed extremely well under it."

While the current results are promising, Chen says the technology is still operating well below its theoretical limit. The team is now working to refine the membrane and optimize performance. Next steps include integrating it into prototypes of cold plates, which are flat components that attach to chips like CPUs and GPUs to dissipate heat. The team is also launching a startup company to commercialize the technology.

This research was supported by the National Science Foundation (grants CMMI-1762560 and DMR-2005181). The work was performed in part at the San Diego Nanotechnology Infrastructure (SDNI) at UC San Diego, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation (grant ECCS-2025752).

Disclosures: A patent related to this work was filed by the Regents of the University of California (PCT Application No. PCT/US24/46923). The authors declare that they have no other competing interests.

Materials provided by University of California – San Diego. Note: Content may be edited for style and length.

Fruit-eating mastodons? Ancient fossils confirm a long-lost ecological alliance

Ten thousand years ago, mastodons vanished from South America. With them, an ecologically vital function also disappeared: the dispersal of seeds from large-fruited plants. A new study led by the University of O'Higgins, Chile, with key contributions from IPHES-CERCA, demonstrates for the first time — based on direct fossil evidence — that these extinct elephant relatives regularly consumed fruit and were essential allies of many tree species. Their loss was not only zoological; it was also botanical, ecological, and evolutionary. Some plant species that relied on mastodons for seed dispersal are now critically endangered.

Published in Nature Ecology & Evolution, the research presents the first solid evidence of frugivory in Notiomastodon platensis, a South American Pleistocene mastodon. The findings are based on a multiproxy analysis of 96 fossil teeth collected over a span of more than 1,500 kilometers, from Los Vilos to Chiloé Island in southern Chile. Nearly half of the specimens come from the emblematic site of Lake Tagua Tagua, an ancient lake basin rich in Pleistocene fauna, located in the present-day O'Higgins Region.

The study was led by Dr. Erwin González-Guarda, researcher at the University of O'Higgins and associate at IPHES-CERCA, alongside an international team that includes IPHES-CERCA researchers Dr. Florent Rivals, a paleodiet specialist; Dr. Carlos Tornero and Dr. Iván Ramírez-Pedraza, experts in stable isotopes and paleoenvironmental reconstruction; and Alia Petermann-Pichincura. The research was carried out in collaboration with the Universitat Rovira i Virgili (URV) and the Universitat Autònoma de Barcelona (UAB).

An ecological hypothesis finally proven

In 1982, biologist Daniel Janzen and paleontologist Paul Martin proposed a revolutionary idea: many tropical plants developed large, sweet, and colorful fruits to attract large animals — such as mastodons, native horses, or giant ground sloths — that would serve as seed dispersers. Known as the "neotropical anachronisms hypothesis," this theory remained unconfirmed for over forty years. Now, the study led by González-Guarda provides direct fossil evidence that validates it. To understand the lifestyle of this mastodon, the team employed various techniques: isotopic analysis, microscopic dental wear studies, and fossil calculus analysis. "We found starch residues and plant tissues typical of fleshy fruits, such as those of the Chilean palm (Jubaea chilensis)," explains Florent Rivals, ICREA research professor at IPHES-CERCA and an expert in paleodiet. "This directly confirms that these animals frequently consumed fruit and played a role in forest regeneration."

The forgotten role of large seed dispersers

"Through stable isotope analysis, we were able to reconstruct the animals' environment and diet with great precision," notes Iván Ramírez-Pedraza. The data point to a forested ecosystem rich in fruit resources, where mastodons traveled long distances and dispersed seeds along the way. That ecological function remains unreplaced.

"Dental chemistry gives us a direct window into the past," says Carlos Tornero. "By combining different lines of evidence, we've been able to robustly confirm their frugivory and the key role they played in these ecosystems."

A future threatened by an incomplete past

The extinction of mastodons broke a co-evolutionary alliance that had lasted for millennia. The researchers applied a machine learning model to compare the current conservation status of megafauna-dependent plants across different South American regions. The results are alarming: in central Chile, 40% of these species are now threatened — a rate four times higher than in tropical regions where animals such as tapirs or monkeys still act as alternative seed dispersers.

"Where that ecological relationship between plants and animals has been entirely severed, the consequences remain visible even thousands of years later," says study co-author Andrea P. Loayza.

Species like the gomortega (Gomortega keule), the Chilean palm, and the monkey puzzle tree (Araucaria araucana) now survive in small, fragmented populations with low genetic diversity. They are living remnants of an extinct interaction.

Paleontology as a key to conservation

Beyond its fossil discoveries, the study sends a clear message: understanding the past is essential to addressing today's ecological crises. "Paleontology isn't just about telling old stories," concludes Florent Rivals. "It helps us recognize what we've lost — and what we still have a chance to save."

Materials provided by Universitat Autonoma de Barcelona. Note: Content may be edited for style and length.

AI Reveals Milky Way’s Black Hole Spins Near Top Speed

An international team of astronomers has used artificial intelligence (AI) to tease out new cosmic curiosities about black holes, training a neural network on millions of synthetic simulations and revealing that the black hole at the center of our Milky Way is spinning at nearly top speed.

These large ensembles of simulations were generated by throughput computing capabilities provided by the Center for High Throughput Computing (CHTC), a joint entity of the Morgridge Institute for Research and the University of Wisconsin-Madison. The astronomers published their results and methodology today in three papers in the journal Astronomy & Astrophysics.

High-throughput computing, celebrating its 40th anniversary this year, was pioneered by Wisconsin computer scientist Miron Livny. It's a novel form of distributed computing that automates computing tasks across a network of thousands of computers, essentially turning a single massive computing challenge into a supercharged fleet of smaller ones. This computing innovation is helping fuel big-data discovery across hundreds of scientific projects worldwide, including the search for cosmic neutrinos, subatomic particles and gravitational waves, as well as efforts to unravel antibiotic resistance.

In 2019, the Event Horizon Telescope (EHT) Collaboration released the first image of a supermassive black hole at the center of the galaxy M87. In 2022, they presented the image of the black hole at the center of our Milky Way, Sagittarius A*. However, the data behind the images still contained a wealth of hard-to-crack information. An international team of researchers trained a neural network to extract as much information as possible from the data.

Previous studies by the EHT Collaboration used only a handful of realistic synthetic data files. Funded by the National Science Foundation (NSF) as part of the Partnership to Advance Throughput Computing (PATh) project, the Madison-based CHTC enabled the astronomers to feed millions of such data files into a so-called Bayesian neural network, which can quantify uncertainties. This allowed the researchers to make a much better comparison between the EHT data and the models.

Thanks to the neural network, the researchers now suspect that the black hole at the center of the Milky Way is spinning at almost top speed. Its rotation axis points to the Earth. In addition, the emission near the black hole is mainly caused by extremely hot electrons in the surrounding accretion disk and not by a so-called jet. Also, the magnetic fields in the accretion disk appear to behave differently from the usual theories of such disks.
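The point of using a Bayesian neural network, as described above, is that each prediction comes with an uncertainty estimate rather than a single number. A minimal sketch of the idea, with an ensemble of models standing in for a posterior over network weights (all names and data here are invented for illustration, not taken from the EHT pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a weight posterior: an ensemble of linear models whose
# weights are scattered around a "true" value. A Bayesian neural network
# maintains a distribution like this over its weights instead of one point.
TRUE_W = 2.0
ensemble_w = TRUE_W + 0.1 * rng.standard_normal(100)

def predict(x: float) -> tuple[float, float]:
    """Return the predictive mean and standard deviation across the ensemble.
    The spread quantifies how confident the model is in its answer."""
    preds = ensemble_w * x
    return float(preds.mean()), float(preds.std())

mean, std = predict(3.0)
print(f"prediction: {mean:.2f} +/- {std:.2f}")
```

Comparing observed data against models with this kind of calibrated spread is what lets the researchers say which black hole parameters (spin, orientation, emission mechanism) the EHT data actually favor.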

"That we are defying the prevailing theory is of course exciting," says lead researcher Michael Janssen, of Radboud University Nijmegen, the Netherlands. "However, I see our AI and machine learning approach primarily as a first step. Next, we will improve and extend the associated models and simulations."

"The ability to scale up to the millions of synthetic data files required to train the model is an impressive achievement," adds Chi-kwan Chan, an Associate Astronomer of Steward Observatory at the University of Arizona and a longtime PATh collaborator. "It requires dependable workflow automation, and effective workload distribution across storage resources and processing capacity."

"We are pleased to see EHT leveraging our throughput computing capabilities to bring the power of AI to their science," says Professor Anthony Gitter, a Morgridge Investigator and a PATh Co-PI. "Like in the case of other science domains, CHTC's capabilities allowed EHT researchers to assemble the quantity and quality of AI-ready data needed to train effective models that facilitate scientific discovery."

The NSF-funded Open Science Pool, operated by PATh, offers computing capacity contributed by more than 80 institutions across the United States. The Event Horizon black hole project performed more than 12 million computing jobs in the past three years.

"A workload that consists of millions of simulations is a perfect match for our throughput-oriented capabilities that were developed and refined over four decades," says Livny, director of the CHTC and lead investigator of PATh. "We love to collaborate with researchers who have workloads that challenge the scalability of our services."

Deep learning inference with the Event Horizon Telescope I. Calibration improvements and a comprehensive synthetic data library. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope II. The Zingularity framework for Bayesian artificial neural networks. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Deep learning inference with the Event Horizon Telescope III. Zingularity results from the 2017 observations and predictions for future array expansions. By: M. Janssen et al. In: Astronomy & Astrophysics, 6 June 2025.

Materials provided by Morgridge Institute for Research. Note: Content may be edited for style and length.

Koalas on the brink: Precision DNA test offers a lifeline to Australia’s icons

A University of Queensland-led project has developed a tool to standardize genetic testing of koala populations, providing a significant boost to conservation and recovery efforts.

Dr Lyndal Hulse from UQ's School of the Environment said the standardized koala genetic marker panel provides a consistent method for researchers nationwide to capture and share koala genetic variation, enabling improved collaboration and data integration across studies.

"Koalas in the wild are under increasing pressure from habitat loss, disease and vehicle strikes, forcing them to live in increasingly smaller and more isolated pockets with limited access to breeding mates outside their group," Dr Hulse said.

"Population inbreeding can mean detrimental effects on their health.

"A standardized panel for directly comparing genetic markers enables researchers, conservationists and government agencies to better understand the genetic diversity of koala populations, allowing for greater collaboration to ensure their survival."

Saurabh Shrivastava, Senior Account Manager at project partner the Australian Genome Research Facility (AGRF Ltd), said the new screening tool was a single nucleotide polymorphism (SNP) array that used next-generation sequencing technologies.

"The Koala SNP-array can accommodate good quality DNA, so is suitable for broad-scale monitoring of wild koala populations," Mr Shrivastava said.

"Importantly, it is available to all researchers and managers."

Dr Hulse said ideally the tool could help guide targeted koala relocations across regions.

"There are very strict rules about relocating koalas, but this could be key to improving and increasing the genetics of populations under threat," she said.

"These iconic Australian marsupials are listed as endangered in Queensland, New South Wales and the ACT – and in 50 years we may only be able to see koalas in captivity.

"Understanding the genetic diversity of different populations of koalas is crucial if we're going to save them from extinction."

The project included researchers from the Australasian Wildlife Genomics Group at the University of New South Wales.

AGRF Ltd is a not-for-profit organization advancing Australian genomics through nationwide access to expert support and cutting-edge technology across a broad range of industries including biomedical, health, agriculture and environmental sectors.

Materials provided by University of Queensland. Note: Content may be edited for style and length.
