
Chewing pasta too much makes it less healthy

The latest pasta research shows that tiny bits of noodles lead to higher glucose levels

Josseline Ramos-Figueroa

Chemical Biology

University of Saskatchewan

Refrigerating dried or homemade pasta may be beneficial to health, but also may alter how pasta breaks down upon chewing.

When cooking every day becomes too impractical, many people resort to meal prepping. This can mean a constant cycle of refrigerating and reheating meals. In the case of pasta, this process might actually be beneficial for your health.

Controlling glucose levels is critical to decreasing the risk of heart disease, diabetes, and obesity. Previous studies have shown that eating reheated pasta produces lower blood glucose levels than eating freshly boiled pasta. Glucose levels are lower because a type of starch, called resistant starch, forms at cold temperatures. In our gut, enzymes break starch down into units of glucose. But for resistant starch, degradation is slower, so less glucose forms. Do different types of pasta produce the same result? And does harsh or mild chewing affect starch degradation?

To find some answers, a group of researchers dug into two types of pasta: dried spaghetti and fresh homemade tagliatelle. And to simulate pasta particles produced upon chewing, they cut cooked pasta to two different sizes.

Results showed that cooled or reheated dry pasta produced only a small change in starch degradation compared with freshly boiled pasta. With small-sized fresh pasta, however, the researchers saw a sharper decrease in starch degradation than with dry pasta.

The researchers suspect that the non-starch ingredients in pasta may be additional factors in the formation of resistant starch. They also argued that the size of the pasta particles formed during chewing may have a greater effect than the process of cooling and reheating the pasta. They proposed that industrial pasta production should consider pasta sizes and shapes that will naturally break into large pieces.

A newly discovered cryosphere-dwelling yeast stays alive by making ethanol

Rhodotorula frigidialcoholis was isolated from 150,000-year-old permafrost in the McMurdo Dry Valleys of Antarctica

Mitra Kashani

Microbial Ecology

Centers for Disease Control and Prevention

Most of the Earth’s biosphere is permanently cold and contains environments below 0°C, known as the cryosphere. Microorganisms like bacteria and fungi call the cryosphere home, despite the seemingly inhospitable conditions. Some can even stick around in the ice for thousands of years.

To make this happen, microorganisms have evolved adaptations that help them survive their forever winter – whether it’s because they prefer cold environments (known as psychrophiles) or they can tolerate them (psychrotolerants) until more favorable conditions arise.

One example of cryosphere-adapted fungi is a genus of single-celled, pink-pigmented yeasts called Rhodotorula, which have been isolated and characterized from a range of cold ecosystems. To survive the coldest and driest parts of the Earth, they’ve evolved unique strategies to handle the elements. In a recent study by scientists at McGill University, a novel species of Rhodotorula yeast is changing our understanding of fungal cold adaptations in unexpected ways.

The newly identified psychrotolerant yeast, Rhodotorula frigidialcoholis, was isolated from 150,000-year-old ice-cemented permafrost in the McMurdo Dry Valleys of Antarctica. The researchers found it has two novel responses to extreme cold temperatures: it can switch its metabolism from respiration to ethanol fermentation as its main pathway, and it can overexpress molecules called small non-coding RNAs (sRNAs) that help regulate which genes are expressed after transcription. R. frigidialcoholis now also holds the record for the lowest temperature reported for ethanol production by any microorganism.

Scientists are still working to understand the precise role of sRNA expression in cold adaptation, but the metabolic switch from respiration to ethanol fermentation by R. frigidialcoholis may help the novel yeast – and potentially others like it – save energy and lower the freezing point inside its cells as a long-term survival strategy.

Southern England is home to a small but thriving population of red-necked wallabies

The wallabies were introduced to the country at the beginning of the 20th century

Maria Gatta

Ecology and Conservation Biology

University of the Witwatersrand, Johannesburg

Wallabies: they're very cute, relatively small, and, to Europeans, novel-looking. This is what led to the introduction of the red-necked wallaby, an Australian species, to countries like England, Ireland, and France at the beginning of the 20th century. At the time, wallabies were kept in zoos and private collections. Some escaped, especially during the Second World War, when people had more important things to worry about than maintaining fences.

Today, very little information is available about what happened to those introduced wallabies. Two scientists, Holly English and Anthony Caravaggi, decided to investigate. They gathered information on wallaby sightings from official records, social media, and newspapers. Because wallabies are so cute and unusual, sightings are often reported in local newspapers.

In their recent article published in the scientific journal Ecology and Evolution, the researchers reported small populations of wallabies living across southern England. Although some of these animals are probably modern escapees from a private collection or zoo, it is unlikely that such escapes account for all of the sightings in the region. The researchers therefore believe that the southern England populations are breeding in the wild.

So if you're ever in southern England and think you've seen a wallaby, don't be too surprised!

Research demonstrates speech-in-noise training helps children with auditory processing disorder

Children with APD have difficulty perceiving speech when there is background noise and may have trouble on cognitive tests

Stephanie Santo

Psychology

Central auditory processing disorder (APD), a hearing disability, can impact cognitive functioning and academic performance in those who experience it. It is typically diagnosed in childhood. Children with APD have difficulty perceiving speech when there is background noise, called speech-in-noise perception, and may have trouble on cognitive tests.

Previous studies on the link between cognitive performance, background noise, and auditory processing in children have included participants without confirmed APD diagnoses. Therefore, researchers in a recent study examining the potential utility of a technique called speech-in-noise training selected participants with confirmed diagnoses to understand the relationship between APD, speech-in-noise perception, and working memory.

The researchers administered one cognitive test and five auditory processing tests to the participants. They gave the children lists of words and asked them to repeat the words back. In one of the tests, the words were audible in only one ear, and the participants were asked to repeat them regardless of which ear they heard them in.

Participants in the experimental group were given speech-in-noise training within a week of the evaluation. During this type of training, participants are asked to listen for words or speech presented with background noise, which gets progressively louder or more difficult to navigate as the training progresses. The goal is to help people pick out important words while filtering out background sounds.

The study found a link between how participants did on the auditory tests and their performance on the cognitive test. Speech-in-noise training generally improved the participants' results on both. These findings indicate that speech-in-noise training may be a helpful intervention for children diagnosed with APD.

Deep sea bacteria use selfishness to their advantage

Some bathypelagic bacteria have found a way to maximize their energy intake by taking food into their cells before breaking it down

Sarah Brown

Marine Science

University of North Carolina - Chapel Hill

The bathypelagic zone of the ocean, which spans depths between 1,000 and 4,000 meters (3,300 – 13,100 feet) below the ocean’s surface, is characterized by permanent darkness, low temperatures, and high pressure. In this hostile environment, slow-growing bacteria survive by relying on sinking organic matter, including proteins and carbohydrates called polysaccharides, from algae in the sun-lit surface waters of the ocean. 

Much of this organic matter is heavily degraded by the time it reaches the deep sea, and intact polysaccharides are hard to come by. For bacteria living in the bathypelagic zone, survival means getting the most out of every rare polysaccharide that reaches these depths – and a new study suggests that for some bacteria, selfishness may be key to their survival.

Bacteria typically feed by releasing enzymes into the water to break down their food into small enough pieces to be taken into the cell. However, by releasing these enzymes into the surrounding water, bacteria naturally lose some of the products of this process. While breaking down food externally can be profitable when resources are in high abundance, it is a much less successful strategy in environments where resource availability is low, such as the deep sea.

In a recent pre-print that I am a co-author on, we suggest that some bacteria at these depths may be using a selfish method of polysaccharide uptake. This method allows them to bring large pieces of polysaccharides into their cells without first breaking them down externally, enabling the bacteria to selfishly keep all the food to themselves.

To make this discovery, we incubated bathypelagic bacteria with fluorescently-labeled polysaccharides. By staining the bacteria with a DNA-binding dye and viewing them under microscopes, we were able to see intact pieces of polysaccharides inside the cells, indicating that these bacteria had not used external enzymes to break them down prior to uptake.

These results provide the first example of selfish behavior in deep-sea bacteria, suggesting that selfishness may be more common among bacteria than previously thought.

The screen you are reading this on is probably emitting volatile organic compounds

A new study demonstrates that, in addition to a variety of other household products, LCD screens also emit these compounds

Kay McCallum

Atmospheric Chemistry

McMaster University

We spend a lot of time indoors - so it’s important that we know what’s in indoor air. Indoor chemists are especially concerned with volatile organic compounds (VOCs, a class of molecules that includes benzene, formaldehyde, and more), which can be harmful to human health and are highly reactive.

VOCs are released into indoor air from a number of sources – plants, wall paint, cooking and cleaning – and, as a recent study by a pair of researchers at the University of Toronto shows, from LCD screens like those in your phone, TV, and laptop.

To measure how LCD screens affect air quality, the researchers collected data on what types of compounds were contained in two types of samples: one of regular indoor air, and one collected near the surface of an LCD screen, such as a new TV or an old laptop. They identified the chemical signatures of those compounds using a technique called proton-transfer reaction mass spectrometry. They then cross-referenced these signatures against lists of known liquid crystal monomers (the “building blocks” of LCD screens) and other compounds used in LCD screen manufacturing.

They found that over 30 VOCs and 10 liquid crystal monomers were heavily emitted into the air exposed to the screen, including reactive species like isoprene and acetic acid. This finding indicates that LCD screens are an important source of VOCs in indoor environments, and that our screen time may be exposing us to more than just new things on the internet.

Getting vaccinated against COVID-19 improves mental health

People who received the COVID-19 vaccine experienced less depression and anxiety compared to unvaccinated individuals

Danielle Llaneza

Health and Medicine

Hunter College and MD Anderson Cancer Center

Vaccine hesitancy remains a pressing issue during the COVID-19 pandemic. About 69 percent of people in the US who are 12 years or older have received the full vaccine dosage, protecting them from COVID-19. Yet a portion of the remaining population remains reluctant to get the vaccine, citing concerns about the vaccine's safety, disbelief in the danger posed by COVID-19, and distrust of the government. Some also face logistical difficulties in getting vaccinated, either because they live in rural areas or because they cannot take sick days from work if they experience side effects.

A promising new study, published by researchers at the University of Southern California, found that individuals had significant decreases in mental health distress after receiving the first dose of the vaccine. This knowledge can, hopefully, push some vaccine hesitant people to receive their shots.

The researchers surveyed 8000+ people to measure their depressive and anxiety symptoms (mental distress level) before and after their first vaccine dose. They found that vaccinated individuals reported less mental distress after receiving the first dosage, while the unvaccinated group retained a consistently high level of mental distress for the entire 1-year study period.

Receiving the vaccine has turned from a health matter into a political one as the remaining population voluntarily refuses to get vaccinated. This voluntary refusal puts the refusers themselves, their communities, and individuals who cannot get vaccinated (like children under five years old and immunocompromised individuals) at greater risk of being infected with COVID-19. At this point, severe reactions to COVID-19 are preventable but will remain a significant problem until all eligible people are vaccinated. This study shows that we have an opportunity to improve both our physical and mental wellbeing. We must invest in more substantial policy at the corporate, state, and federal levels to increase vaccination rates now.

Don't bank those seeds — some oaks can be "cryopreserved"

Acorns can't be frozen, but tips of oak tree shoots can

Christina Del Greco

Genetics and Genomics

University of Michigan

Due in large part to human-induced climate change, up to 40 percent of all plant species are at risk of extinction. In response, conservationists have developed seed banks, where seeds of at-risk plants are frozen and stored in case of emergency.

Many species of oak trees fall on the list of endangered plants. However, their acorns are not usable after freezing, so conservationists are unable to add them to seed banks. As a result, scientists have had to investigate alternative preservation methods for oaks.

A recently published study has demonstrated that, for oaks, an alternative to seed-banking could be shoot tip cryopreservation. Shoot tip cryopreservation is the process of clipping off the shoot tip of a plant — the part that contains cells able to regenerate into a whole new plant — and placing it in droplets of a freezable substance. The shoot tips are then frozen in liquid nitrogen at -320 degrees Fahrenheit until they're ready to be thawed and grown.

Scientists found that, when they attempted shoot tip cryopreservation on four different species of oaks, some of the plants were able to grow after freezing and thawing, but some didn't survive. Survival depended on the species: one species survived liquid nitrogen freezing 56 percent of the time, while another never did. Looking specifically at the most successful species, the researchers also found that slight temperature differences during the freezing and thawing processes affected both overall plant survival and exactly how well the plants recovered after freezing.

Up until now, there had been no evidence that shoot tip cryopreservation worked on oaks. While survival does depend on the species of oak, this study demonstrates that this method can be added to the arsenal of the different conservation tools available for oak preservation, and can hopefully contribute to finding methods that work for all oak species.

Male and female mice form memories of fearful events differently

A drug that blocks memory forming in male mice has a different effect in females

Rita Ponce

Evolutionary Biology

Polytechnic Institute of Setúbal

Memory is the process by which the brain encodes, stores, and retrieves information. Several studies indicate that fear memories are processed differently in male and female animals, but the basis of these differences is still mostly unknown. A study published in Nature Communications has brought new information to the table: a drug known to reduce the ability to remember fearful events in male mice turns out to increase that ability in females.

The team that led the research is from the Institut de Neurociències at the Universitat Autònoma de Barcelona. They study the mechanisms of fear memory, aiming to find treatments for pathologies associated with traumatic experiences. This project coupled behavioral studies of mice with hormonal, biochemical, and molecular analyses.

The drug they used in the study, osanetant (which is not used to treat humans), blocks a brain signaling pathway involved in creating lasting memories of fear. The researchers found that the drug's blocking action has opposite effects in males and females, and that the effect depends on sex hormones — testosterone in males and estradiol in females.

While the new results about sex differences and memory are very interesting on their own, they raise an important issue for experimental design. Most research studies are done in males, but scientists would benefit from understanding how drugs affect more than just males. That's particularly relevant in this field, given that fear-related disorders are more common in women.

Psilocybin reduced depression symptoms as much as a leading antidepressant

New research compared the "magic mushrooms" component to Lexapro

Soren Emerson

Neuroscience

Vanderbilt University

Since their introduction in the late 1980s, selective-serotonin reuptake inhibitors (SSRIs) have become the go-to treatment for major depression. SSRIs, however, have a number of limitations: they take several weeks to start working, can cause a variety of side-effects, and do not help some people with depression. A series of recent clinical investigations suggest that psilocybin, the active compound in magic mushrooms, may be an effective alternative. One question that these studies left unanswered, however, is how effective psilocybin treatment is compared to SSRIs.

In a first-of-its-kind study recently published in The New England Journal of Medicine, researchers at the Center for Psychedelic Research at Imperial College London compared psilocybin and escitalopram, an SSRI drug sold under the name Lexapro, as treatments for major depression. The six-week long study enrolled 59 volunteers with moderate-to-severe major depression. They were randomly and blindly assigned to receive treatment with psilocybin and an escitalopram control, or escitalopram and a psilocybin control. All the participants also received psychological support.

To evaluate the two treatments, the researchers compared the change from baseline on the 16-item Quick Inventory of Depressive Symptomatology–Self-Report (QIDS-SR-16), a basic clinical measure of depression symptoms. Based on the results of the QIDS-SR-16, psilocybin and escitalopram both reduced depression symptoms. The researchers did not detect a statistically significant difference between the two treatments.

The results of other measures taken in the study, however, suggest that psilocybin may be more effective than escitalopram. When designing the study, the researchers determined that the QIDS-SR-16 most directly addressed their experimental question and would therefore be the primary outcome measure, but they also evaluated depression symptoms with a number of additional scales. Nearly all secondary outcome measures favored psilocybin over escitalopram, but their results hold less weight than the QIDS-SR-16 because of how the study was designed.

The study was also limited by its small size, non-random enrollment of interested volunteers, and the possibility that participants may have been unblinded by the strong subjective effects of psilocybin or the well-known side-effects of SSRIs. Nonetheless, as the most rigorous evaluation of the therapeutic potential of psilocybin conducted to date, the results provide a benchmark for the design of future investigations.

Cocaine use slices and dices RNA in mouse brain cells

The analysis of epigenetic changes caused by cocaine use adds to the evidence that substance use disorders are rooted in biology

Anna Rogers

Molecular Biology

UC Berkeley

Neuroscientists are known for doing some strange things to mice in their pursuit of learning about the brain. One such strange thing is training mice to self-administer cocaine, but it’s all for a good cause: Self-administration can help us understand the biological underpinnings of substance use disorders.

In a study recently published in Neuron, researchers found that cocaine use chemically modifies DNA in mouse brains, specifically in the brain regions associated with reward. After the mice consumed cocaine, the DNA in their brain cells carried different chemical modifications, known as epigenetic changes. These epigenetic changes also altered the types of RNAs the cells made through splicing, the process by which pieces of genes are left out or added in to create different RNAs that encode different proteins.

Scientists have known RNA splicing is particularly important for neurons, and the researchers behind this mouse study saw many epigenetic and splicing changes after the mice consumed cocaine. Then, they artificially recreated one specific epigenetic change at a gene called Srsf11. This led Srsf11 to be spliced differently. However, Srsf11 is also a gene that controls splicing across the genome, meaning that changes to it had ripple effects in the mice. This one change also altered splicing across a few hundred other genes, some of which were previously implicated in substance use disorders. Most interestingly, the mice with the modified version of Srsf11 self-administered more cocaine, showing that these sorts of changes in the brain may underlie addiction.

Some researchers argue that increasing the body of evidence for the biological basis of substance use disorders reduces stigma against people who use substances, though the effectiveness of this in public messaging is debated. Regardless, though we continue to see evidence that substance use disorders are biologically-driven, there are currently no approved drugs to treat the overuse of cocaine. Epigenetics and RNA splicing may be promising targets for future medical interventions.

Your gut bacteria may be hoarding your medication

Researchers have observed this effect in petri dishes and nematodes

Madeline Barron

Microbiology

University of Michigan

When we take medications, we generally do two things: first, we swallow some pills, then we wait for them to kick in. Whether or not they do, however, may be tied to our gut microbes.

Intestinal bacteria influence the availability and activity of therapeutic drugs in the body. For instance, some bacteria metabolically convert, or ‘biotransform,’ drugs into their active forms; others inactivate them. And some, according to a new report published in Nature, don’t chemically manipulate drug molecules — they hoard them.

In this study, researchers incubated 25 representative strains of gut bacteria with 12 orally administered drugs, including those used to treat asthma, high cholesterol, and diarrhea. By measuring drug levels in the growth medium before and after 48 hours of incubation, the scientists identified 29 novel bacteria-drug pairs in which the drug was depleted from the medium. Comparing drug concentrations in the medium alone with those in the total culture revealed that, in most cases, the drug was absent from the medium but recoverable from the total culture. These results suggest the medications were accumulating inside the bacteria.
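
To make the logic of that comparison concrete, here is a minimal, hypothetical Python sketch of how one might classify a bacteria-drug pair from such measurements. The function, concentration values, and cutoff are illustrative assumptions, not the authors' actual analysis.

# Illustration only: a made-up helper for the depletion logic described above,
# not the authors' analysis code. Concentrations are hypothetical (e.g., in µM).
def classify_interaction(medium_start, medium_end, whole_culture_end, depletion_cutoff=0.2):
    depleted_fraction = (medium_start - medium_end) / medium_start
    if depleted_fraction < depletion_cutoff:
        return "no substantial depletion"
    # Drug gone from the medium but still recoverable from the whole culture
    # points to accumulation inside cells rather than chemical transformation.
    if whole_culture_end >= medium_start * (1 - depletion_cutoff):
        return "likely bioaccumulation (drug stored intact inside cells)"
    return "likely biotransformation (drug chemically altered)"

print(classify_interaction(medium_start=20.0, medium_end=4.0, whole_culture_end=18.5))
# -> likely bioaccumulation (drug stored intact inside cells)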

The question is: When bacteria vacuum up drug molecules, does this alter the drug’s effect on the host?

To explore this, the researchers incubated Caenorhabditis elegans, a nematode and model organism, with duloxetine, an antidepressant that was accumulated by several bacterial strains. While duloxetine alone decreased nematode motility, adding a duloxetine-accumulating strain of E. coli to the culture reduced this effect.

These findings indicate that bacterial hoarding of medications may alter the way those drugs act on their targets. Ultimately, more research is required to determine whether a similar scenario plays out in the human gut, and in the context of other drugs. Greater insight into the interplay between medications and gut microbes could expand our understanding of drug bioavailability and efficacy, and how they may vary from one person (and gut microbiome) to the next.

Meet the springhare: the first glow-in-the-dark African mammal known to science

Researchers discovered the springhare's fluorescent abilities entirely by accident

Shakira Browne

Zoology

University College Dublin

Fluorescence occurs when an animal absorbs light and re-emits it, and in nature it's not a new thing. It has been documented in only a handful of mammals, but those species span three different continents and inhabit entirely different ecosystems. The platypus is one such animal, whose glow-in-the-dark abilities were only discovered in 2020.

But a discovery earlier this year by Northland College researchers that springhares fluoresce is special: it is the first documented case of biofluorescence in an Afro-Eurasian placental mammal. The study suggests that fluorescence in mammals may not be as rare as previously thought.

The researchers entered Chicago’s Field Museum of Natural History armed with a flashlight, intending to examine the fluorescent abilities of flying squirrels. Along the way, they accidentally discovered that springhares also glow. One specimen they examined was collected in 1905 and still glowed in the dark more than 100 years later.

The researchers subsequently tested live springhares (this time in the dead of night, since springhares are nocturnal) and found that they could also fluoresce, predictably more strongly than the dead specimens. The study raises the question: what other animals are out there, pulsating in every shade of the rainbow after the clock strikes midnight?

For the first time ever, researchers have "housebroken" cows

Controlling where cow waste ends up could lead to cleaner air and water and decreased greenhouse gas emissions

Fernanda Ruiz Fadel

Animal Behavior and Behavioral Genetics

Advanced Identification Methods GmbH

In a strange triumph of science, researchers have now successfully potty trained 11 cows. The study, done by research groups in Germany and New Zealand, included 16 calves, which were trained first by rewarding them when they urinated in a latrine and later by adding an unpleasant stimulus (a three-second water spray) when they began urinating outside the designated area. The calves' potty training performance is comparable to that of children and better than that of very young children.

But why is this important?

First, because cattle waste is a substantial contributor to greenhouse gas emissions and to soil and water contamination. Being able to collect cow waste in one place would enable us to treat and dispose of it properly. One way of doing this is to keep the animals confined in barns, but that lowers their welfare.

Second, it shows that cows are able to react to and control their reflexes, indicating that their behavior — as shown in many other animal species before — can be modified using rewards. This demonstrates that cows have more awareness than previously thought, which is important for better understanding their wellbeing and welfare needs. Having cattle keep their own living areas a bit cleaner would also improve their welfare.

Potty training cows in farm settings is time consuming and logistically challenging, but it would help significantly decrease gas emissions without compromising animal welfare. Model calculations predict that capturing 80 percent of cattle waste could lead to a 56 percent reduction in ammonia emissions, which would lead to cleaner air for all of us.

Feeding extra amino acids to cells with a mutated enzyme makes them grow faster

This new finding could lead to advances in treatment of diseases caused by ARS mutations

Christina Del Greco

Genetics and Genomics

University of Michigan

Our cells require proteins, which are composed of individual amino acids connected in a long chain, to perform important functions. These amino acids are delivered to the protein-building machinery by another molecule called a tRNA. Amino acids and tRNAs are attached together, or charged, by enzymes known as aminoacyl-tRNA synthetases, or ARS.

Mutations in ARS enzymes cause diseases such as Charcot-Marie-Tooth disease, which affects the nerves to a person's arms and legs, because cells cannot make proteins properly. Currently, there are few treatments for ARS defects. However, researchers predicted that flooding cells with extra amino acids might allow defective ARS enzymes to function better.

To test this, scientists identified patients with ARS mutations that cause charging defects and grew their cells in a petri dish. They then treated these cells with different amounts of amino acids and compared the electrical impedance of the cells that received treatment to those that did not. Impedance analysis is an approach where scientists grow cells on a surface that conducts electricity: as the cells grow, they block the electrical current, and the speed at which the current is blocked corresponds to how fast the cells are growing.
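
As a rough illustration of that idea (not the study's actual analysis pipeline or any instrument's software), a growth rate can be estimated from impedance-derived readings by fitting an exponential curve. The time points and "cell index" values below are invented for the example.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical impedance-derived "cell index" readings: the signal rises as
# growing cells cover the electrode and block more of the current.
hours = np.array([0, 6, 12, 18, 24, 30, 36, 42, 48], dtype=float)
cell_index = np.array([0.10, 0.14, 0.21, 0.30, 0.44, 0.63, 0.92, 1.33, 1.90])

def exponential(t, n0, rate):
    return n0 * np.exp(rate * t)

# Fit an exponential growth curve; the fitted rate is a proxy for how fast the cells grow.
(n0, rate), _ = curve_fit(exponential, hours, cell_index, p0=(0.1, 0.05))
print(f"estimated growth rate: {rate:.3f} per hour "
      f"(doubling time ~ {np.log(2) / rate:.1f} h)")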

The scientists found cells with ARS mutations that were treated with amino acids grew faster than cells that did not receive treatment. These promising results meant that the researchers could move on to trying this treatment in the patients themselves. They designed specific amino acid treatments for four people with the same ARS mutations they studied in cells, monitored their symptoms over time, and found that giving patients amino acids alleviated many of their most severe symptoms.

While we still don’t know if these results are applicable to all patients with ARS mutations, this study found a potential new way to treat ARS mutations in patients. Considering that ARS mutations can cause very severe disease, this is exciting and promising for both scientists and patients alike.

White pine blister rust's habitat range is changing with the climate

New study in Sequoia and Kings Canyon National Parks demonstrates the complexity of changing plant-pathogen interactions

Ornob Alam

Population Genetics

New York University

A rapidly changing climate is expected to shift where species live. This will also alter human activities like agriculture and forest conservation, as the ranges of plant pathogens change. 

Scientists have predicted that climate change will both increase and decrease the prevalence of a pathogen across its geographic range, depending on local climate effects, the pathogen’s favored conditions, and host factors. In an article published in Nature Communications, a group of researchers led by Joan Dudney demonstrate exactly this in a natural system.

The researchers report the effects of climate change on the occurrence of white pine blister rust in Sequoia and Kings Canyon National Parks (SEKI) in California. Blister rust is a fungal disease that threatens white pine forests across Europe and North America. Leveraging blister rust prevalence data from two surveys conducted twenty years apart (1996 and 2016) in SEKI, alongside climate data over the same timeframe, the authors found that a warming, increasingly dry climate caused the range of blister rust to contract at low elevations and expand at higher elevations, where conditions remained relatively mild. They noted an approximately 33 percent decline in overall disease prevalence despite the expansion at higher elevations.

The blister rust fungus has a complex lifecycle that requires both a white pine tree and an alternative host, such as Ribes shrubs. Dudney and her fellow researchers found that alternative hosts were less common at higher elevations, likely limiting blister rust's ability to infect pine trees at those elevations even though the pathogen could live there. They also observed that aridity, or dryness, played an important role in determining infection risk.

The study provides a roadmap for future studies on host-pathogen-climate interactions. Genomic adaptations of rapidly reproducing pathogens to changing conditions could further alter these dynamics and represent an additional avenue to explore in future work.

These uses of poop for protection are stranger than fiction

Defense by dung doesn't always rely on disgusting predators to repel them

Simon Spichak

Neuroscience

To some animals, their own excrement isn't just waste. They may use their fecal matter to ward off predators. Here are several examples of fecal prowess.

Poop... ink? 

Sperm whales are some of the largest animals to ever exist, reaching a whopping 14 meters long in adulthood. Despite their intimidating size, they still can get spooked (such as by pesky divers) and unleash a poopy trick. Through emergency defecation, a sperm whale can disperse a smoke screen of shit into the water before the cetacean makes its escape. Waving its tail to disperse the poop creates an underwater "poopnado," as Canadian diver Keri Wilk called it. These enormous diarrhea clouds also help recycle nutrients and store immense amounts of carbon, mitigating some effects of climate change.

Shields — I mean poop — up! 

The larvae of the tortoise beetle are the Captain America of the animal kingdom — because they make shields out of poop. Using their maneuverable anus that sits on their flexible rear end, they deposit their dung defense on their back. The fecal armor, made in part from the larvae's shed exoskeleton, can double as a club to whack away potential predators.

Rancid repellent

The Green Woodhoopoe takes a rather straightforward approach to defense. Young birds will simply coat themselves in liquid poop, using the odor to deter — or gross out — would-be predators. You wouldn't want to eat a poop-slathered bird now, would you?

Distance and our eyes distort the true colors of stars

New research calculates the colors of stars based on their actual energy distributions

Briley Lewis

Astronomy and Astrophysics

University of California, Los Angeles

From our perspective on Earth, most stars look like tiny, twinkling dots. But what color would a star be if you could actually see it up close?

Most astronomy textbooks will clearly say hot stars are blue, and colder stars are red. These colors come from an idealized version of the light a star gives off, called a blackbody curve. That’s not quite the whole story though, especially for smaller stars — the outer layers of a star absorb parts of the light emitted from the center, and our eyes respond differently to different wavelengths of light.

New research published in Research Notes of the American Astronomical Society calculated colors of stars based on their actual energy distributions and the response of the human eye. Turns out, we’ve been missing stars’ true colors. The hottest stars appear blue, as we’ve thought, but stars like our Sun appear off-white. Smaller stars, like K and M stars, are beige instead of red.
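
For readers curious how such a calculation works in outline, the sketch below implements the simpler textbook version that the new work refines: it weights an idealized blackbody spectrum by crude Gaussian stand-ins for the CIE color-matching functions (the eye's response) and converts the result to an approximate RGB color. The constants and approximations are illustrative only; the published study used real stellar energy distributions rather than pure blackbodies.

import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck constant, speed of light, Boltzmann constant

def planck(wavelength_m, temp_k):
    # Blackbody spectral radiance (arbitrary overall scale)
    return 2 * H * C**2 / wavelength_m**5 / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

wl_nm = np.arange(380.0, 781.0, 5.0)   # visible wavelengths in nanometers
wl_m = wl_nm * 1e-9

# Very rough Gaussian stand-ins for the CIE 1931 color-matching functions
xbar = 1.06 * gauss(wl_nm, 599, 38) + 0.36 * gauss(wl_nm, 442, 16)
ybar = 1.00 * gauss(wl_nm, 556, 47)
zbar = 1.78 * gauss(wl_nm, 446, 20)

def star_rgb(temp_k):
    spec = planck(wl_m, temp_k)
    X, Y, Z = (float(np.sum(spec * f)) for f in (xbar, ybar, zbar))
    # XYZ -> linear sRGB (D65 matrix), then normalize and gamma-compress
    rgb = np.array([ 3.2406 * X - 1.5372 * Y - 0.4986 * Z,
                    -0.9689 * X + 1.8758 * Y + 0.0415 * Z,
                     0.0557 * X - 0.2040 * Y + 1.0570 * Z])
    return np.clip(rgb / rgb.max(), 0.0, 1.0) ** (1 / 2.2)

for temp in (30000, 5800, 3500):   # a hot star, a Sun-like star, a cool star
    print(temp, "K ->", np.round(star_rgb(temp), 2))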

Most shocking of all — brown dwarfs aren’t even brown, they’re violet! These cool sub-stars are purple because absorption by molecules in their atmospheres takes out a whole chunk of their visible light, leaving only red and blue light for us to see.

There are a few more complexities that could change the color a star appears to us. For example, clouds on brown dwarfs may change how their atmosphere absorbs light, and that’s something researchers are still trying to figure out. Earth’s atmosphere reddens light, too, so all these colors would look different if we were looking from Earth’s surface. For now, though, it’s fun to have a better idea of what vivid colors are out in the universe, including purple (brown) dwarfs!

Zebrafish without "love hormone" neurons show no desire to socialize with each other

New research shows the importance of oxytocin for social affiliation and isolation

Kareem Clark

Neuroscience

Virginia Polytechnic Institute and State University

Whether you’re a social butterfly or a lone wolf, the brain circuits that define social behaviors begin forming early in life and mature over a lifetime. But how the social brain develops has remained unclear, and new research explores oxytocin – often referred to as the “love hormone” – for answers.

Oxytocin earns its loving nickname because the brain releases the hormone during moments of social bonding, such as those between a parent and child or romantic partners. But beyond this role, oxytocin has long been thought to play a more direct role in social circuit development, and a recent study published in the Journal of Neuroscience put this idea to the test with zebrafish.

Zebrafish are social creatures with brain circuitry evolutionarily similar to that of humans. Scientists can genetically alter them and then observe their behavior across an entire lifespan, making them ideal for studying social behavior. So, to understand the role of oxytocin-producing neurons in social brain development, researchers selectively removed those neurons from the zebrafish's brain circuits early in life and examined the consequences for social behavior once the fish reached adulthood.

The researchers evaluated the zebrafish behavior by first separating a fish from a larger group with a transparent barrier, then observing how the lone fish reacts to its isolation. Like a person with FOMO ("fear of missing out") from a party next door, socially healthy zebrafish stay close to the transparent barrier – seemingly longing to join the group on the other side. However, zebrafish with a disrupted social circuit explore their own tank with no preference to socialize.

Researchers found that zebrafish with their oxytocin neurons removed early in life showed less preference to socialize as adults. However, eliminating these cells in adulthood did not affect social behavior, suggesting that oxytocin shapes the social circuit early in life during a critical developmental window. They also found that removing oxytocin neurons early impaired other social brain components, including those required for attention, decision making, and reward.

Together, this suggests that the famous "love hormone" may define our long-term social preferences early in life. But unlike a Pixar movie, fish are not humans, and there is still more to learn about social brain development.

Wild Goffin's cockatoos can use tools, too

Scientists have observed captive cockatoos making tools before, but this is the first documented instance of tool use in wild cockatoos

Fernanda Ruiz Fadel

Animal Behavior and Behavioral Genetics

Advanced Identification Methods GmbH

Tool making is a complex behavior that, until recently, had only been confirmed in three species of primates (including humans) and in some birds, including captive Goffin's cockatoos. Now, a research group at the University of Vienna that has studied Goffin's cockatoos for decades has also observed the behavior in wild cockatoos.

This species of cockatoo, a member of the parrot family, is comparable to three-year-old humans in terms of intelligence. But until now, tool making had not been observed in wild cockatoos, an observation necessary to confirm that the species is indeed capable of making tools and that its tool use is not just an artifact of captivity.

The group spent over 884 hours observing wild birds in their natural habitat in the Tanimbar Islands, Indonesia, with no success in witnessing tool use and manufacture. They then moved on to a catch and release method, where they captured 15 individuals and placed them in temporary aviaries with many resources and a food option that finally encouraged more complex approaches to foraging: the Wawai fruit, or sea mango. The cockatoos really like eating the seeds of these fruits and need to go through the thick skin and flesh of the fruit in order to reach the seeds.

Two of the 15 individuals manufactured and used tools to extract sea mango seeds. Those two birds made tools by removing fragments from branches and then modifying them with their beaks. The researchers identified three different tool types: wedges, to widen the fissures to reach the seed inside the fruit; fine tools used for piercing the coating of the seed; and medium tools used for scooping the seeds. Furthermore, the tools were used sequentially, which the researchers believe to be the most complex example of tool use in a species without hands.

Both cockatoos proficiently manufactured and used the tools immediately after being provided with the Wawai fruit, suggesting they knew how to do so before capture. The fact that only two individuals were observed using tools indicates that this complex skill is not found species-wide and therefore has to be learned as a result of opportunity and innovation. This finding broadens our understanding of tool making ability beyond primates.

Giant clams are growing faster than ever. That's not a good thing

This supercharged growth is likely due to nitrate aerosols in our modern atmosphere

Sarah Heidmann

Fish Ecology

University of the Virgin Islands

The growth of modern giant clams is supercharged compared to growth measured from fossil clams. A recent study from the Red Sea has shown this, finding that growth lines from modern species are larger than those of fossils from similar animals dated to the Holocene and Pleistocene.

These increased growth rates appear to be related to higher amounts of nitrate aerosols in the modern atmosphere. These come from many different sources. Some are natural, such as lightning, biomass burning, and soil processes, but most come from anthropogenic activity like burning fossil fuels and agricultural fertilization.

This fast growth may seem like a good thing, but faster growth says nothing about the overall health of the clams. Additionally, aerosols may actually reduce the productivity of marine phytoplankton, which account for almost half of the world's primary production.

The overall effects of nitrate and other aerosol pollution on global land and ocean cycles are not well understood. They may appear to reduce global warming by improving carbon dioxide uptake and reflecting the sun's heat, but they contribute to poor air quality. We can congratulate today's super clams on their impressive growth. But in the long run, fewer emissions on our part are probably better for them.

Skeletons' broken clavicles tell a centuries-old tale of humans and horses

Clavicle fractures can be used to identify horse riders from their bones

Katie East

Biological Anthropology

SNA International

One thousand years ago, archers rode horses across the landscape of Hungary. They were probably intimidating, possibly threatening, and definitely adventurous, but just like equestrians today, they also fell a lot.

These horse riders remain a mystery. Who were they? Where were they from? When did they start riding horses? To answer these questions, an international team of scientists set out to find a way to identify horse riders from just their skeletons, using the fact that horse riders tend to fall. 

The researchers examined skeletons from a cemetery of well-known horse riders in Hungary dating to the 10th century CE. Riders in the cemetery were identified by the horse riding equipment and horse bones in their graves. However, the scientists could not be sure that the people buried without such artifacts in the Hungarian cemetery never rode horses. Therefore, they also investigated skeletons from a group of people from 20th century Portugal who definitely did not ride horses.

They found that upper body fractures were more common among riders, and that fractures of the clavicle (collar bone) were significantly more common among the Hungarian riders than the 20th century non-riders. To figure out if these fractures could be caused by horse riding, researchers turned to modern equestrians. Sure enough, fractures of the upper body, especially the clavicle, are some of the most commonly reported injuries in modern day equestrians.

The researchers argue that, in combination with other skeletal changes, clavicle fractures can be used to identify horse riders from just their skeletons. Being able to identify horse riders in the past could help researchers find the first horse riders, shedding light on the ways horse riding shaped human history.

Researchers observe a boar releasing two caged younglings in an impassioned rescue

The act sheds light on the prosocial behavior and empathy of wild boars, thought to be rare among animals

Simon Spichak

Neuroscience

Humans aren't the only animals that step up to help others out of difficult situations. In a study recently published in the journal Scientific Reports, Michaela Masilkova of the Czech University of Life Sciences and her colleagues described a boar's daring rescue of two young wild boars stuck in a trap.

Few animals show this kind of rescue behavior: going out of their way to help other members of their species that are caught in a dangerous situation. Masilkova's team inadvertently caught an astonishing act of altruism on camera while conducting a separate experiment to monitor wild boar movement for the prevention of African Swine Fever. The goal was to catch boars so the researchers could mark them individually. The researchers set up traps containing food as a lure. Once a boar was lured inside, logs would roll off the top of the enclosure and bar the door shut, caging it in.

One night the trap — operating as usual — snared two young boars. But the night took an unexpected turn when a new herd arrived at the scene. One adult female took particular interest in the captives' predicament. Over the course of 29 minutes, the female pushed against the logs and successfully moved them, allowing the young boars to escape. Given that the rescuer spent so much time on this activity and showed physical signs of distress throughout, the researchers believed her act to be potential evidence of prosocial empathy.

This discovery suggests that complex forms of empathy may be more common in the animal kingdom than scientists previously believed.

Roe deer pause development of their embryos for months, and researchers just learned how

An embryonic phenomenon discovered over 150 years ago may finally have an explanation

Charlotte Douglas

Genetics

Institut Curie Paris

In many species, not long after fertilization, the embryo implants into the uterine wall in preparation for further development. In humans, this implantation occurs around day eight or nine after conception. In European roe deer, however, instead of implanting, the embryo stops developing and hovers in a period of dormancy.

This phenomenon has now been found in over 130 species, including mice and armadillos. But roe deer have one of the longest known periods of embryonic suspension, lasting up to five months. This period is called diapause. Unlike in many other species that undergo diapause, cell division in roe deer embryos does not stop completely but drastically slows, with cells dividing just once every few weeks.

Until now, the cellular mechanism regulating this extensive slowdown of cell replication was unknown. A recently published study in PNAS has uncovered it. The researchers discovered that a predominant driver of embryonic diapause is the changing abundance of amino acids in the embryo. One family of amino acids in particular was found to cause a significant increase in a protein called mTORC1, inducing the embryo to activate more of it. In fact, the increase in mTORC1 appeared to coincide with the embryo’s exit from diapause, after which cells start dividing more rapidly, but it was not detectable during the preceding period of slow cell replication.

The mTOR protein family has long been known to be a crucial factor in regulating metabolic pathways, including in humans. In fact, a related protein called mTORC2, thought to be essential for maintaining slow cell divisions, remains switched on throughout roe deer diapause. This new study will open up avenues of research into the precise timing of embryo implantation, as well as increase our understanding of the interplay between the chemical and metabolic pathways of an animal and its embryo.

Female jumping spiders favor the most aggressive males

A new study provides evidence for sexual selection in these spiders

Hayden Waller

Evolutionary Biology

Cornell University

If you've ever witnessed an overly aggressive guy get bounced from a bar, you probably found yourself internally judging him. But new research published in the journal Animal Behaviour suggests that the opposite may be true for spiders: the more aggressive a male jumping spider is, the sexier his female counterparts find him.

Researchers from the National University of Singapore quantified female spiders' preference for aggressive males. They first placed males in a small chamber containing a mirror and observed how combative they were toward their own reflection. Once the males had demonstrated either contempt for or passivity towards their reflections, they were paired with other males for a series of bouts. Using the results of the mirror test and combat trials, the researchers assigned each male spider an aggression predictability score. Finally, a pair consisting of one highly aggressive male and one more passive male was placed in a chamber with a single female spider, and her preference was determined by the amount of time she spent ogling each of her potential suitors.

The researchers found that aggressive males are both more likely to defeat a rival in a combat trial and more likely to draw attention from females than their more pacifist competitors. They concluded not only that this is evidence for sexual selection, but also that the combination of strong competitiveness and female favor reinforces itself, pushing the most aggressive spiders to the top of the pile.

People with sickle cell disease are less likely to get kidney transplants than those without

Sickle cell disease predominantly affects Black populations, and kidney transplants can save their lives

Danielle Llaneza

Health and Medicine

Hunter College and MD Anderson Cancer Center

People with sickle cell disease encounter significant health issues such as kidney failure. Sickle cell disease, found predominantly in Black and African American populations, is when red blood cells are shaped like crescent moons (or “sickle-shaped”) instead of round and disc-like. This shape, which may have had evolutionary benefits during previous generations, can block blood flow, and therefore oxygen transport, through a person's body. 

Kidney failure is a major health complication encountered by people with sickle cell disease. Therefore, people with sickle cell disease are often reliant on dialysis treatment to filter the waste from their blood, but this is often not enough to save their lives. 

Researchers are therefore exploring kidney transplantation as an additional treatment option for these patients. In a study published in the Clinical Journal of the American Society of Nephrology, researchers used two national databases that collected information, from 1998 to 2017, on adults with kidney failure who were on dialysis or on the kidney transplant waiting list. They measured the impact of kidney transplantation on mortality, as well as differences in access to kidney transplants between people with and without sickle cell disease.

People with sickle cell disease who were on dialysis had a higher mortality risk than the control group. However, the researchers found that transplantation reduced mortality risk for people with sickle cell disease as well as those without it, a benefit that lasted for at least ten years. 

Finally, the researchers found that patients with sickle cell were less likely to receive a transplant when compared to the control group, even though kidney transplantation has a higher likelihood of increasing the lifespan of sickle cell patients than does dialysis. This is yet another example of health inequity for Black and African American populations, and one with serious consequences, since kidney transplant is a life-saving intervention for people with sickle cell disease. 
