Hungry microbes turn crusty leftovers into wine

Old bread gets second life as microbe soup

Hannah Thomasy


University of Washington

Globally, we produce about 100 million tons of bread every year. But because bread goes stale or moldy fairly quickly, a substantial amount of it is wasted. Now, scientists have come up with a use for bread that would otherwise be thrown away: growing beneficial microbes.

[Image: moldy bread. "r u gonna eat that?" - Hungry Microbe. Shutterbug75 from Pixabay]

The food industry depends on many types of beneficial microbes; they’re used in the production of things like bread, beer, wine, cheese, and yogurt. In this study, researchers created a special growing medium that contained 50% “waste bread” (called wasted bread medium). Using this nutrient-rich soup, they were able to grow several types of microbes involved in the production of wine and yogurt. Importantly, they estimated that, at least on a small scale, the wasted bread medium only cost about 30% as much as traditional formulas used to grow these microbes.

If this process can be scaled up, it means that stale bread — which would ordinarily be thrown away — could live a second life as food for microbes that churn out fermented foods (and drinks!) like wine and yogurt. This would help decrease food waste and perhaps even drop the cost of growing these beneficial microbes.

Precise CRISPR gene editing can correct the mutation that causes cystic fibrosis in mini-intestines

Prime editing, which edits DNA directly without cutting, restored healthy function in an organoid model of an intestine

Charlotte Douglas


Institut Curie Paris

For the first time, a type of CRISPR/Cas9 genome editing, called prime editing, has been performed in “mini-organs” to correct the mutation causing cystic fibrosis.

Cystic fibrosis is thought to affect more than 70,000 people worldwide. It is a genetic disease caused by a mutation in a single gene, called the CFTR gene, which results in a dysfunctional CFTR protein. This dysfunctional protein is what causes the main symptom of cystic fibrosis: a sticky mucus buildup in the respiratory tract and lungs.

Published in Life Science Alliance, researchers from the Hubrecht Institute, in collaboration with UMC Utrecht and Oncode Institute, demonstrated that they were able to correct a mutation in CFTR that causes cystic fibrosis by performing genome editing in a mini-organ called an organoid. The organoids, mini intestines, had been grown from stem cells originally collected from patients with cystic fibrosis.

Prime editing differs from traditional CRISPR genome editing. Instead of acting as a pair of molecular scissors, prime editing uses a modified Cas9 protein to make a direct change to the DNA sequence. This allows researchers to change the underlying DNA sequence without cutting the DNA, which also reduces the risk of Cas9 cutting randomly elsewhere in the genome.

To test whether the prime editing was successful, the researchers, led by Hans Clevers, added a compound called forskolin to the organoids. In healthy organoids, adding forskolin causes the mini-intestines to swell up as fluid moves into their centers. The researchers found that this happened in some of the prime-edited organoids as well, suggesting that the mutation had been corrected. Organoids that still carried a mutated CFTR gene, however, did not respond to forskolin treatment.

Prime editing efficiency varies between organoids and cell types, an important consideration in the development of gene therapies for cystic fibrosis and other diseases. Moreover, significant research effort will be needed to ensure that prime editing does not cause unintended off-target effects. Even so, these proof-of-principle findings are a step forward for the understanding and future development of gene therapy for cystic fibrosis.

Researchers find biomarkers for heart disease in young adults

The proteomic analysis detected differences in oxidative stress markers

Matthew Bomkamp

Physiology and Applied Physiology and Kinesiology

University of Florida

Cardiovascular diseases, or CVD, are the leading cause of mortality in the United States. As such, early detection of the risk factors for CVD allows for preventive measures to be put in place.

Many current risk prediction algorithms place the most emphasis on age as a contributor to developing CVD. Consequently, CVD risk in many younger adults is often underestimated. A recent study from vascular physiopathologists in Spain sought a new way to estimate CVD risk in this population using a technique called proteomics: quantifying the risk of CVD in young adults based on the proteins found in their plasma. They were particularly interested in oxidative stress markers, because oxidative stress is associated with the development of CVD. Oxidative stress occurs when your body has more free radicals than antioxidants, which damages your cells.

The study placed younger adults into three groups: healthy participants, participants with risk factors for CVD, and those who had already experienced a cardiovascular event. Among the proteins found in the participants' plasma samples, the team identified more irreversible oxidation of certain amino acids in those who had experienced a cardiovascular event, compared to healthy adults. 

The results indicate that oxidation is progressive with the development of CVD. Additionally, there were increases in the antioxidant response for both the adults with CVD risk factors and patients who had already experienced a cardiovascular event. This antioxidant boost is assumed to be a response to the increased oxidative stress seen with CVD.

Ultimately, these results reveal that both markers of oxidative stress and the antioxidant response are altered in people at risk for CVD. These biomarkers may thus prove clinically useful in developing better tests to quantify the risk of CVD in young adults.

Move over, mice: sheep have the superior brains for neuroscience research

Sheep brains more closely resemble human brains than do mouse brains

Dori Grijseels


University of Sussex

Animals are crucial for neuroscience research. Many important discoveries have been made by studying the brains and cells of different animals. For example, we learned how the brain encodes information from the eye through experiments on cats, and we learned how information is relayed by neurons from the squid giant axon. Nowadays the majority of such research is done on mice and rats. These rodents are easy to train and don’t take up a lot of space, making them perfect for neuroscience research. 

However, rodents also have a big disadvantage. Although they are mammals, their brains do not look much like human brains. As a consequence, not all research done on rodents is translatable, meaning that findings from rodents do not necessarily apply to humans. This is especially a problem for research on diseases and their treatments: treatments that work in mice may not work in humans. Many alternative animals have been proposed over the years to address this problem, but each has its own challenges.

In a recent preprint, researchers proposed a new alternative to rodents for neuroscience research: sheep. Sheep have brains that look similar to human brains. In addition, sheep are smarter than you might think. The scientists were able to teach them a behavioral task (the sheep had to choose between two different stimuli in order to receive a food pellet) and, for the first time, recorded activity from their brains while the animals were performing it. The sheep's brains showed the same patterns of activity as rodent and human brains, both during the task and while the animals were moving around.

Although this study was the first to perform recordings from a sheep’s brain while it was actively walking around, it follows earlier studies where sheep were already used for the study of various diseases, such as Huntington’s disease. Together this makes sheep a promising alternative to rodents for studying treatments for diseases that affect the brain. Hopefully, studying these woolly animals will lead to many new discoveries in neuroscience.

Earth's oxygen is projected to run out in a billion years

As the Sun ages, Earth's processes will change

Briley Lewis

Astronomy and Astrophysics

University of California, Los Angeles

Our Sun is middle-aged, with about five billion years left in its lifespan. However, it’s expected to go through some changes as it gets older, as we all do — and these changes will affect our planet. New research published in Nature Geoscience shows that Earth’s oxygen will only stick around for another billion years.

One of the Sun's age-related changes is that it grows brighter. When a star runs out of hydrogen fuel in its core, the core has to get hotter in order to fuse the next element, helium. As the core gets hotter, the outer layers expand, and the star gets brighter. This extra energy hitting Earth will eventually cause our planet to warm up and slowly lose its oceans and its oxygen.

The exact timing of when we lose our oxygen depends on more complicated factors — particularly our planet’s carbonate-silicate cycle, which releases carbon dioxide into the atmosphere from volcanoes. As the mantle cools and this cycle slows, less carbon dioxide will be available for the plants that produce oxygen, leading to a rapid loss of oxygen in the atmosphere. The researchers’ model took into account all these factors — our biosphere, the Sun’s changes, our planet’s changes, and more — to come up with their estimate of about a billion years.

Interestingly, this means that planets like Earth only have oxygen for a fraction of their lifetimes. When we try to find habitable worlds, this will be important to keep in mind.

Your saliva affects the way you spread pathogens

Our saliva can vary depending on our physiological state, making us more or less likely to pass on bugs to others

Marnie Willman


University of Manitoba Bannatyne and National Microbiology Laboratory

We've all been in a crowded place and seen someone sneezing or coughing nearby. You do your best to get away from them, but somehow they always end up right there beside you. The COVID-19 pandemic has increased our collective use of masks and other protective measures that have reduced the transmission of many viruses beyond coronaviruses, including influenza. 

Researchers are now finding that specific qualities of saliva might change how easily certain pathogens spread. You may think that everyone's saliva is the same, but our physiological state changes our saliva! If you're stressed or dehydrated, for example, your saliva's makeup is different than it would be otherwise. Saliva thickness also differs between genders: women tend to have thinner saliva, and less of it, than men.

University of Florida researchers have found that the compounds in your saliva, your salivary flow rate (how much saliva you produce), its thickness, and other features affect how far saliva can travel when you cough or sneeze. This matters for respiratory viruses like the one that causes COVID-19, which are transmitted by respiratory droplets.

Given these findings, it may be possible to alter your saliva to decrease your ability to pass potentially deadly bugs to others. Could simply staying hydrated and less stressed reduce viral and bacterial transmission? The University of Florida researchers say very possibly!

Alzheimer's drugs targeting amyloid plaque may be doomed to fail

In a study of the cause of Alzheimer's, cognitive decline tracked something other than high levels of amyloid plaques

Rose Egelhoff


In June, the Food and Drug Administration approved the first drug designed to slow the progression of Alzheimer’s disease. The drug, called aducanumab, clears amyloid plaques — clumps of brain proteins that are characteristic of Alzheimer’s disease. Proponents of the drug say that amyloid plaques are toxic, and that they lead to brain inflammation and the loss of brain cells, causing cognitive impairment.

But critics say there is scant evidence that the drug actually helps people with Alzheimer's, and not all scientists agree that amyloid plaques cause the disease, though there is a correlation. In fact, some people with amyloid plaques do not show cognitive decline.

In a new study, University of Cincinnati researchers sought to understand this apparent paradox. Their idea is that the cause of Alzheimer's may not be an accumulation of these protein clumps, but rather a decrease in their precursor: soluble, un-clumped amyloid proteins in the brain. Soluble amyloid proteins have a number of important jobs in brain function, including supporting brain development and protecting brain cells from premature death.

To test this idea, the researchers looked at soluble amyloid protein levels in people with varying stages of cognitive decline. They found that healthy individuals with amyloid plaques in their brains still had high levels of soluble amyloid protein. Dementia was much more closely related to low soluble protein levels than to high levels of amyloid plaques. These results add to the evidence that plaques may not be the direct cause of Alzheimer's, and they call into question the FDA's decision to approve aducanumab.

New machine learning approach can identify your circadian rhythm from a blood sample

Doctors do not currently monitor a person's circadian rhythms because there is not an efficient way to measure them

Soren Emerson


Vanderbilt University

Many of the body's physiological activities, including hunger, wakefulness, and metabolism, run on 24-hour cycles called circadian rhythms. These cycles are primarily controlled by the release of chemical messengers into the bloodstream from the brain, and they have been linked to cardiovascular disease, neurodegenerative disorders, and weight gain.

Measuring circadian rhythms could significantly improve medical care. Doctors could better prevent and treat illness by more accurately assessing individual risk of disease and recommending times to eat, take medication, and rest. Circadian rhythms are not used as a clinical indicator at present because there is not an efficient way to measure them. Based on the results of a new study, however, that could change soon.

The results were recently reported in the Journal of Biological Rhythms by a team of researchers at the University of Colorado at Boulder. The researchers sought to develop a circadian rhythm test that is more efficient than the standard dim-light melatonin assessment, which requires hourly collection of blood over the course of an entire day to measure the amount of a sleep-inducing molecule in the body called melatonin. The test is accurate, but it is impractical for clinical use.

To develop an efficient circadian rhythm test, the researchers invited 16 participants into their sleep lab. Over the next two weeks, the researchers took regular blood samples and analyzed them to measure the levels of melatonin and approximately 4,000 metabolites, molecules that are produced by biochemical reactions in the body. Then, they used machine learning to determine which metabolites are associated with different phases of the circadian rhythm. When the analysis was complete, the researchers could measure circadian rhythms by testing a single blood sample for 65 metabolites with similar results to the dim-light melatonin assessment.
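The study's actual pipeline isn't described in detail here, but the general approach — selecting a small panel of rhythmic metabolites and fitting a model that maps their levels to a circadian phase — can be illustrated with a toy sketch on synthetic data. The data, the feature-selection rule, and the simple linear model below are all assumptions for illustration, not the researchers' method; phase is circular, so the model predicts its sine and cosine rather than the raw angle.

```python
# Illustrative sketch (not the study's pipeline): estimate circadian
# phase from "metabolite" levels using feature selection plus a linear
# model on the sine/cosine of phase. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 blood samples x 100 metabolites, of which a
# handful oscillate with each sample's (known) circadian phase.
n_samples, n_metabolites, n_informative = 200, 100, 10
phase = rng.uniform(0, 2 * np.pi, n_samples)       # true circadian phase
X = rng.normal(size=(n_samples, n_metabolites))    # baseline noise
for j in range(n_informative):
    X[:, j] += np.cos(phase + j)                   # phase-locked rhythm

# Feature selection: keep the metabolites most correlated with the
# rhythm (analogous to narrowing thousands of metabolites to a panel).
targets = np.column_stack([np.sin(phase), np.cos(phase)])
corr = np.corrcoef(X.T, targets.T)[:n_metabolites, n_metabolites:]
keep = np.argsort(np.abs(corr).max(axis=1))[-n_informative:]

# Least-squares fit on the selected panel, then recover the phase
# angle from the predicted sine/cosine pair via arctan2.
coef, *_ = np.linalg.lstsq(X[:, keep], targets, rcond=None)
pred = X[:, keep] @ coef
pred_phase = np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)

# Wrapped phase error (handles the 0/2*pi discontinuity)
err = np.angle(np.exp(1j * (pred_phase - phase)))
print(f"median absolute phase error: {np.median(np.abs(err)):.2f} rad")
```

The key design point this sketch captures is that a single sample suffices at prediction time: once the panel and coefficients are learned, one blood draw yields a phase estimate, unlike the hourly draws required by the dim-light melatonin assessment.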

The new test does have some limitations; it worked best for people who had adequate sleep and whose food intake was controlled, which limits its practical application, and it would be more efficient if fewer metabolites had to be analyzed. Nonetheless, the study results are exciting and suggest an efficient circadian rhythm test may be available soon.

Tiny radio tags reveal the lives of Neotropical stingless bees

These bees are small, but the tags are smaller

Lila Westreich

Pollinator Ecology

University of Washington

Scientists have struggled for years with ways to understand bee movement and foraging behavior. Following bees around from flower to flower is tedious, and netting bees to identify species visiting different types of flowers only tells us so much about their behavior over a large area. Where bees are foraging, and what they're eating, will give insight into their health and survival in the face of changes to their habitat and climate. 

In a recent paper in Frontiers in Ecology and Evolution, researchers attached radio-frequency identification (RFID) tags to the backs of Neotropical stingless bees, Melipona fasciculata, to monitor their behavior. Similar to the little tags attached to some clothing that beep loudly when you carry them past the doors of a store, RFID tags can be used to track movement using microcomputers. The researchers found that 64.1 percent of tagged M. fasciculata individuals drifted from their nesting boxes to other colonies of bees. Their peak activity occurred at 9 am, and pollen foragers lived longer than nectar foragers.

This amazing research establishes a new system for understanding bees using RFID tag technology, and increases our understanding of Neotropical stingless bees. 

How “evolutionary medicine” helps create drugs that prevent antibiotic resistance

How do you stop a bug from becoming a superbug? Beat it at Darwin’s game

Marie Sweet

Structural Biology

NYU School of Medicine

Though it sounds like a plot device from a dystopian novel, the term “superbug” describes very real disease-causing microbes that have overcome the drugs normally used against them. These bugs kill over 35,000 people every year in the US alone, and the WHO considers them one of the biggest threats to global health and security.

In the arms race against microbes, scientists have begun fighting back by targeting the bugs’ most powerful weapon: the capacity to evolve. Antibiotics jam up essential proteins in bacteria, often by binding to crucial areas called “active sites.” The lock-and-key fit of these drugs makes them work with deadly precision right up until the bacteria change their locks. The liberal use of antibiotics creates dramatic evolutionary pressure, breeding superbugs with mutations in these active sites that prevent the drugs from binding.

A paper published in eLife in July describes an effort to shut down evolutionary routes to antibiotic resistance by tailoring a drug’s chemistry to preempt mutations. The idea was that, rather than waiting for evolution to render current drugs useless and require a return to the drawing board, scientists might design them to bind to essential areas within the site that tend not to mutate, prolonging their lifespan. 

Using computational chemistry, scientists tested 1.8 million chemicals for their ability to bind to a commonly targeted bacterial protein, both with and without real-life resistance mutations. They then brought the best candidates into the lab and put them to the test, pushing bacteria to evolve in response to them. In a success, the group found that one chemical dramatically weakened and delayed the bacteria’s evolution towards resistance. While the study’s specific chemical is not quite a silver bullet, its proof of principle bodes well for the use of this “evolutionary medicine” tactic in the design of future antimicrobials as well as drugs for other diseases that acquire resistance, like cancer.

What engineers can learn about infrastructure from predatory army ants

Ants can teach us how to design strong networks resilient to individual failures

Celia Ford

University of California, Berkeley

Engineers are grappling with a problem army ants solved ages ago: how can we design massive, coordinated networks that don’t topple when one individual fails? While it seems like big groups need one, centralized commander calling the shots, nature demonstrates otherwise. Simple rules followed by individuals, independently, produce incredible group-level behaviors. An April 2021 study published in PNAS turned to Eciton burchellii, predatory Central American army ants, to learn how they become greater than the sum of their parts. 

E. burchellii hunt in massive swarms, hundreds of thousands of ants strong. When faced with slippery slopes, they team up to form living structures called “scaffolds,” making footholds for the ants coming up behind them.

To model how ant colonies shape their behavior to the environment, researchers traveled to Barro Colorado Island, Panama. They built a moveable platform that could be tilted from 20 degrees (mostly flat) to 90 degrees (completely vertical), set the platform along the ants’ path, and watched. The steeper the slope, the denser the scaffold. Why? Selfish behavior. 

If an ant slips as they're climbing a precarious surface, they will dig in, gripping the underlying surface. Their body acts as a foothold, making the climb less slippery for the rest of the swarm. As more ants slip and catch themselves, their bodies add to the scaffold. As the surface becomes easier to grip, fewer ants slip, and scaffold construction winds down to a halt. This is an example of proportional control, where a system responds in proportion to the amount of error it detects. The rate at which bodies are scaffolded (the system’s response) decreases as fewer ants slip (the error it detects). 
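The proportional-control loop described above can be made concrete with a toy simulation. This is a sketch of the control principle only, not the study's model: the numbers, the gain, and the "scaffold covers slipperiness" rule are illustrative assumptions.

```python
# Toy sketch of proportional control, loosely inspired by the ants'
# scaffold-building. The "error" is the slip rate still uncovered by
# the scaffold; the response (bodies added) is proportional to it.

def simulate_scaffold(slope_steepness: float, steps: int = 200) -> float:
    """Return the final scaffold size after `steps` time steps."""
    scaffold = 0.0
    gain = 0.1  # proportional gain: response per unit of error (illustrative)
    for _ in range(steps):
        # Error signal: slipperiness not yet covered by the scaffold
        slip_rate = max(slope_steepness - scaffold, 0.0)
        # Response proportional to error; shrinks as the error shrinks
        scaffold += gain * slip_rate
    return scaffold

# Steeper slopes settle at denser scaffolds, and construction winds
# down on its own as the slip rate (error) approaches zero.
for steepness in (20, 45, 90):
    print(f"steepness {steepness:>2} deg -> scaffold size "
          f"{simulate_scaffold(steepness):.1f}")
```

Note how no central controller appears anywhere: each step applies the same local rule, yet the scaffold converges to exactly the size the slope demands, mirroring the colony-level behavior the researchers observed.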

Ants and their remarkable collective behaviors can inspire us to improve our own infrastructure. Mechanisms inspired by E. burchellii scaffold-building, which robustly and quickly respond to change, could be used to design self-healing fabrics, build better traffic control systems, and more.

Top-ranking baboons age the fastest. Is it worth it?

New research looks at the epigenetic effects of social status in baboons

Anna Rogers

Molecular Biology

UC Berkeley

Maybe you found a grey hair on your head in your twenties and wondered whether all your stress is making you age faster. It might be, if you're a high-achieving male baboon.

Our DNA collects different chemical modifications known as epigenetic changes. Some of these are age-related and make up our "epigenetic age." A person's age in years and epigenetic age don't always agree, and in humans faster epigenetic aging has been associated with increased disease risk and shorter lifespan. However, we're still learning what influences epigenetic age in humans and other animals.

A recent study published in eLife shows that social status is the best predictor of epigenetic age in male baboons. This faster aging isn't seen in female baboons, who inherit their ranks from their mothers. The researchers therefore posit that it's not rank itself that accelerates male baboons' aging, but the process of climbing the social ladder. So why would social climbing among baboons be associated with faster aging?

For male baboons, rising in rank means fighting for it, and their rank can change in their adult life. Not only does high status increase epigenetic age, but dropping in status can reverse accelerated aging as well. This “live fast, die young” way of life may still be worth it, from an evolutionary perspective. Despite the cost, high-ranking males are still more likely to reproduce. Social pressures among humans don’t align with those of baboons. Nevertheless, learning how other primates age and why can help us better understand the factors that contribute to aging in humans.

Access to free school lunch creates health benefits for a lifetime

A study of India's Midday Meal program shows clear nutritional benefits that are even passed on to the next generation

Sam Zlotnik

Ecology & Evolutionary Biology

University of Florida

School lunch programs are widely known to improve children’s health and education, but their benefits may extend even further than once thought. According to a recent study from the International Food Policy Research Institute, India’s national school meal program may provide intergenerational health benefits extending to participants’ own children.

India’s Midday Meal is the largest school meal program in the world, feeding around 100 million school children each year. The success of this program led researchers to wonder whether meal recipients grew up to have healthier children of their own.

The team found that women who grew up with school meals had young children (ages 0 to 5) who were taller for their age than the children of women who did not receive school meals. There were also fewer children with very short heights for their age, a common sign of malnutrition, among women who had experienced greater access to school meals. These differences were largest at low socioeconomic statuses, suggesting that school meals may be most beneficial for children with few resources at home.

They also found that school meals were associated with improved education and healthcare use by women, which may partially explain the nutritional gains in their children. This study highlights how a single free meal a day can have cascading effects across lifetimes.

What does a spider eat? Look at the DNA in their guts

DNA sequencing found wandering spiders eat at least 96 types of prey, including snakes and lizards

Fernanda Ruiz Fadel

Animal Behavior and Behavioral Genetics

Advanced Identification Methods GmbH

Despite not being the favorite animals of many people, spiders are very important land predators that shape the structure of ecological communities and control the populations of their prey species. Some spider species, like the South American “banana spiders” or “wandering spiders” (genus Phoneutria), are also of medical concern due to their potent venom. 

The diet of spiders is mainly inferred from observations in the field and laboratory, which are potentially biased and probably underestimate the number of species ingested. Spiders also pre-digest their prey externally before ingesting it, making their prey even harder to identify.

With more advanced molecular techniques and ever-growing DNA databases, it is now possible to perform molecular gut content analysis: sequencing the DNA present in spiders' guts and matching it to existing sequences in databases, allowing more precise identification of prey species. This methodology is called DNA metabarcoding.

Researchers from the Universidad del Tolima and Universidad de Ibagué in Colombia were the first to use DNA metabarcoding to analyze the diet of a wandering spider, Phoneutria boliviensis (disclosure: I, the author, worked on this project). We sequenced the guts of 57 spiders and identified 96 species of prey belonging to 10 orders, mainly flies, beetles, butterflies, moths, grasshoppers, locusts, and crickets. We also found DNA from a lizard species and a snake species among the prey eaten by females.

In this study, fewer prey species were identified in females than in males, even though the opposite was expected since females of this species are generally larger. These results indicate that the two sexes may have different predatory strategies. The 57 spiders analyzed came from three different populations in Colombia, and the prey composition of each population also differed, indicating that the three localities have small differences in prey availability.

This study confirms that wandering spiders feed on a wider variety of species than previously reported, further validating their generalist nature and flexible diet. 

Illegal shark fishing study shows widespread catch of threatened Galapagos species

The Fu Yuan Yu Leng 999's catch of a dozen different species of sharks is a sign that endangered species need better protection

Enzo M. R. Reyes

Conservation Biology

Massey University

Many fish stocks are expected to be depleted in the coming years due to overexploitation and illegal fishing. Shark fishing, driven by the million-dollar fin trade, has pushed some shark species to a heightened risk of disappearing. To counteract overfishing and preserve biodiversity, some governments have created marine protected areas (MPAs) where species can thrive without human pressure.

The Galapagos Marine Reserve (GMR), an MPA in the eastern Pacific, harbors and protects 36 species of sharks. This makes it a popular destination for divers and researchers attracted to its biodiversity. Nevertheless, the same biological richness also attracts illegal fisheries that trespass the borders of the reserve, exploiting the difficulty of monitoring the open sea.

In August 2017, the Fu Yuan Yu Leng 999, a Chinese vessel, was captured inside the GMR with 572 tons of fish, including hundreds of sharks. The boat's crew were sentenced to one to three years' imprisonment and fined $6 million for the illegal possession and transport of protected species. The silver lining was that the seized cargo gave a group of researchers the opportunity to study it using morphological identification and molecular barcoding.

They found that of the 12 shark species identified, 11 are present in the GMR, and nine of those are considered at risk. The cargo included species of very different sizes, from a whale shark to reef species whose presence suggests illegal trade with coastal fisheries. Several immature and unborn sharks were found on the boat; for one species, immature individuals made up 86 percent of the total sample analyzed. The researchers also used advanced molecular techniques to attempt to identify the geographic origins of the cargo, but they were unsuccessful due to the limited availability of molecular samples from the region in international genetic databases.

This case exposes the magnitude of the threat that fishing industries and illegal trade pose to already vulnerable shark populations in marine reserves. It also highlights the importance of satellite technologies in monitoring fishing activities, which, in the case of the Fu Yuan Yu Leng 999, provided the evidence necessary for the successful prosecution of the culprits.

Researchers might be using your Facebook data in their papers

A recent study asked people what they thought about academics using their social media data

Elliot Eva Ping

Cognitive Neuroscience and History of Science

Ohio State University

After the Facebook-Cambridge Analytica data scandal following the 2016 United States presidential election, public conversation has revolved around appropriate uses of social media data across domains, including academic research.

The collection of user-generated data from social media can blur the line of what constitutes human subjects research. While this data is generally considered beyond the purview of institutional review boards, users are often disconcerted by the lax approach to their data. A recent study surveyed active US Facebook users about their perspective on social media data use in academic research. 

Results indicate that the purpose of data collection mattered to people: while health science-focused research was viewed as an appropriate use of social media data, gender studies, computer science, and psychology research were not. This may indicate that fields seen as "more political" could experience backlash regarding data collection. Post content and context were also relevant, with participants rating public comments about food or science as more appropriate for collection than posts about more personal topics made in groups or sent via private message. Users also indicated that research which sought their consent was less concerning, and that research used for service improvements rather than knowledge generation was more appropriate.

These findings suggest that the discipline and purpose of the research, the kind of data collected, and the subjects’ awareness of the research affected how users assessed the appropriateness of the data collection. Researchers should be cognizant of these preferences when conducting research to help ease subject anxiety, promote public scholarship, and use context-informed methods in their studies.

Animals and their DNA move through the environment in different ways

Sampling a lake at different times of year and at different depths found fish DNA distributes in unexpected ways

Joanne Littlefair


Queen Mary University of London

Environmental DNA (or eDNA) is all the rage in ecology. Animals create and shake off DNA every day, leaving tracks as to where they’ve been. The DNA that has been shed into the environment can be sampled and tested later.

While the potential to test for a species without having to see or capture it is exciting, we need to be sure of its accuracy. Can biotic and abiotic processes in the environment influence how eDNA is distributed in an ecosystem? 

A recent study at IISD Experimental Lakes Area, a remote lake research area in Ontario, suggests that the distribution of eDNA in lakes can be affected by both summer stratification and fall turnover. Researchers tracked lake trout (Salvelinus namaycush) DNA and then compared those data to where the lake trout actually were, verified using acoustic telemetry.

The study revealed that during summer months, when the lake is stratified and the trout keep to its deep waters, lake trout could be detected using eDNA almost exclusively in the deepest layers of the lakes. In fact, lake trout eDNA was not detected at all in the surface water, despite abundant lake trout populations in the lake. 

Sampling from a lake

Joanne Littlefair

By contrast, during fall lake turnover when lake trout spend much of their time near the surface, their eDNA was found pretty much everywhere throughout the water column, likely being mixed in as the lake water turned over.

Many eDNA sampling programs only sample surface water. The study found that if only surface water were sampled in these lakes while stratified, lake trout — the abundant top predator in the lakes — would not have been detected. It also revealed that sampling during a period of lake mixing (fall through spring) or sampling the entire profile of the lake in summer will increase the probability of detecting all species present.

Measuring eDNA remains a reliable way to monitor animals in their environments. The study demonstrates the need to design eDNA sampling methods that take lake physics into account, and to be aware of how those physical processes can shape the results.

Drinking way, way too much coffee might shrink your brain

Up to five cups of coffee per day seems to be fine. Six or more? Your brain is going to feel it

Soren Emerson


Vanderbilt University

Coffee ranks as one of the most popular drinks on the planet after water, alcoholic beverages, and fruit and vegetable juices. There are many reasons that people drink coffee, but most people drink the brew to improve their brain function. 

In fact, research shows that moderate coffee consumption (3–5 cups per day) can enhance concentration and alertness and improve health outcomes by decreasing risk of chronic disease. In a new large-scale study, however, researchers found that drinking over six cups per day may negatively impact brain function. 

The results were recently published in the journal Nutritional Neuroscience by a team of researchers at the University of South Australia. To look at the association between coffee consumption and brain health, the researchers conducted a prospective analysis of data collected by the UK Biobank, which has amassed detailed health information on over half a million participants in the UK.

The scientists analyzed the data associated with 398,646 UK Biobank participants. They compared self-reported consumption of caffeinated coffee and measures of brain volume acquired via MRI and found an inverse association between coffee consumption and total brain volume (brain volume isn't associated with intelligence, though brains tend to shrink as we age). In addition, their analysis revealed that drinking more than six cups of coffee per day is associated with a 53 percent greater probability of dementia compared to drinking 1–2 cups of coffee per day. 

The results build on previous research showing that drinking large amounts of coffee is associated with worsened health outcomes such as anxiety and insomnia. Luckily, because most coffee drinkers drink about three cups of coffee per day, odds are you can continue to enjoy your morning brew without worry. 

Gelatins may protect the brain against Alzheimer’s disease

A traditional Chinese medicine successfully protected neurons from amyloid-induced death

Kareem Clark


Virginia Polytechnic Institute and State University

Grandma might have the right idea when bringing Jell-O salad to every church potluck.

Gelatins are animal-derived protein fragments created by breaking down collagen — a protein found in connective tissues like skin and ligaments.

Gelatins are also widely used medicinally, from skincare to joint pain. But traditional Chinese medicine also claims that they protect the brain against deteriorating diseases such as Alzheimer’s disease. And a recent study published in Frontiers in Pharmacology put this historically anecdotal remedy to the test.

Researchers first mimicked Alzheimer’s disease in a dish by treating lab-grown cells with amyloid-beta, a toxic protein fragment that accumulates in patients’ brain cells. As in human Alzheimer’s disease, amyloid-beta treatment induced profound cell death in this model. But, surprisingly, gelatin treatment completely protected these brain-like cells from this toxicity.

To understand how gelatins are neuroprotective, the researchers turned to mitochondria — the powerhouses of the cell. Mitochondria are cellular structures that generate energy in the form of a molecule called ATP. Because healthy cells need ATP to function, mitochondrial dysfunction is harmful to cell survival and the primary cause of brain cell death in Alzheimer’s disease. 

The researchers, therefore, hypothesized that gelatins prevent amyloid-beta-induced cell death through mitochondrial protection. Indeed, the gelatin-treated mitochondria showed reduced structural damage, improved ATP production, and lower oxidative stress. Furthermore, they believe gelatins exert these protective effects by blocking excessive calcium from entering the cell, which can trigger mitochondrial damage, oxidative stress, and ultimately cell death.

Of course, while these results are exciting, the human brain is a little more complex than a dish of cells, and more work is necessary to determine the therapeutic potential of gelatins in human disease. 

A fluorescent sensor signals when yeast cells are getting stressed

Yeast cells can produce large amounts of proteins for industrial and therapeutic use, but they sometimes get overworked

Colleen J Mulvihill


University of Texas at Austin

Saccharomyces cerevisiae, commonly known as yeast, is a biological workhorse. It has been used by humans for thousands of years to brew beer, make wine, leaven bread, and, more recently, to produce large amounts of proteins for therapeutic uses and enzymes for industrial applications.

While yeast is often the organism of choice for making large amounts of a protein of interest, there can be problems when protein production is scaled. The unfolded protein response, or UPR, is one of these issues. When yeast cells try to fold large numbers of proteins, their cellular machinery can become overworked, triggering the UPR. Hundreds of genes are turned on rapidly in an effort to decrease the amount of unfolded protein. A fine line needs to be maintained between making enough protein and not triggering a large UPR, which typically results in decreased yields. 

In a recently published study in ACS Synthetic Biology, researchers developed a sensor to monitor the UPR in real time. They engineered a fluorescent protein that turns on when the UPR is activated. They then characterized this sensor in cells with proteins produced at different levels and with configurations known to elicit the UPR. They found that the UPR is caused by both how challenging a protein is for the yeast cells to fold, and how much of it is being produced.

Triggering of the UPR limits protein production, a process that many industries rely on. This sensor gives a real-time readout that could be useful for scientists and engineers trying to balance maximal protein production against the health of the yeast cells producing it.

A controversial new study shows how male rats can become pregnant and give birth

Scientists have debated whether the scientific progress justified the invasiveness of the procedure

Julia A Licholai


Brown University

Males from just a few animal species, namely seahorses and some of their relatives, are known to get pregnant. New research, currently published as a pre-print awaiting peer review, documents the first recorded instance of a male mammal giving birth from a transplanted uterus. 

In the study, male rats were surgically prepared to nurture a transplanted uterus with embryos. The project yielded a low (under 4 percent) success rate, but the male rats delivered pups via C-section that grew without any reported complication. While some have excitedly associated the experiment’s success with the possibility of human application, the lead researcher has repeatedly pleaded online to not associate the study with potential male human pregnancy. Regardless of such prospects, this paper has gained tremendous attention for its ethical ramifications. 

The procedure and the results triggered an avalanche of tweets questioning its ethics. Scientists debated whether the scientific progress justified the invasiveness of the procedure — for it to work, the male rat was surgically joined to a female rat so that they shared a circulatory system (called parabiosis). One scientist raised concerns on whether this study is indicative of science turning into an entertainment business, according to reporting in Nature. The lead researcher responded by reassuring that, “[the] animals did not have any painful symptoms such as screaming during the entire experiment” and asked that people “... please don’t bring non-scientific factors into scientific research.”

Aside from its sensational outcome, the paper also highlights that pregnancies in male rats were successful only while the males were joined to a pregnant female, and that the rats sometimes carried abnormal fetuses. 

The paper only has a short discussion section where the potential implications or usefulness of the study is outlined. It reads: "...our findings reveal the potential for rat embryonic development in male parabionts, and it may have a profound impact on reproductive biology research."

Biologists grow mouse eggs in mini ovaries developed from stem cells 

The advance could generate new solutions for human fertility issues

Charlotte Douglas


Institut Curie Paris

Scientists have discovered a way to produce eggs from scratch, in mini ovaries grown in tubes, opening up new avenues for the field of reproductive medicine and fertility. 

The researchers, led by Katsuhiko Hayashi of Kyushu University, Japan, created a mouse ovary-like tissue, called an “ovarioid.” They developed the ovarioid with a cocktail of chemicals that induces stem cells to form ovarian-like tissue. This helped them overcome a previous limitation: forming eggs in a lab had required fresh ovarian tissue, usually taken from female mice. These results mark the very first time that all components, both the ovary-like tissue and the eggs, were made entirely from stem cells.

Hayashi has made other great strides in reproductive biology. In 2011, he and colleagues generated sperm from stem cells, also in mice. Hayashi's group also developed an original method for growing eggs from stem cells with fresh ovary tissue in 2016. The new findings, published in the journal Science in July, could eventually allow people who have damaged or lost ovarian tissue and same-sex couples to potentially have biological children, using their stem cells as the base for generating eggs.

Despite these strides in mice, a bottleneck remains for developing these technologies in humans, as the methods would need to be adapted to grow human eggs. Collecting fresh ovarian tissue from humans also carries huge ethical and technical implications, and this stem cell approach could circumvent that particular hurdle. But developing human eggs or ovaries from stem cells should be discussed by scientists, policy makers, and ethicists. 

Genes related to sleep help cells stay free of toxins

Genetic research in fruit flies points to an important link between sleep and cellular housekeeping

Srija Bhagavatula

Cell Biology and Molecular Biology

How long can a person go without sleep? Sleep deprivation is associated with many impaired body functions, and sleep is required to clear toxins from the brain.

Genetic screens in fruit flies have been used to identify genes regulating many basic cellular and physiological processes, including sleep. A recent study by researchers at the University of Pennsylvania and Howard Hughes Medical Institute found that some sleep genes are involved in managing cellular waste in neurons, a process known as autophagy (or self-eating). Non-functional proteins and other cellular debris are loaded into compartments called autophagosomes to be eventually broken down and reused by the cell. The proteins responsible for this process are known as the autophagosomal proteins.

The study, published in the online journal eLife, monitored autophagy in a mutant fruit fly (named argus) that sleeps less than others. The researchers followed the autophagosomes in these flies by tagging them with fluorescent proteins, and found more autophagosomes accumulating in the mutants' neurons. This accumulation means that cellular waste is not being cleared efficiently. 

A different mutation in another known autophagy gene, called blue cheese, reduced this autophagosome accumulation. While in argus mutants, autophagosomes do not proceed towards degradation, blue cheese mutants produced fewer autophagosomes to begin with. Interestingly, when neurons don't express blue cheese or other autophagy genes, fruit flies sleep more.  

The question remained, does sleep itself also affect the autophagy pathway?

The authors found that normal adult flies had more autophagosomes in the early night than in the early morning. Mechanically depriving the flies of sleep also led to autophagosomal accumulation in neurons. To rule out the possibility of an indirect effect, the authors used a sleep-inducing drug on the flies. They saw that this reduced autophagosome accumulation, emphasizing the relation between sleep and autophagy.

Scientists are closer to a molecular picture of sleep, with a direct link between sleep and a cellular housekeeping pathway. But it's still unclear how this pathway translates into a physiological response. More work in this direction could help researchers find suitable drugs to treat sleep disorders.

Ancient DNA pulled from dirt yields evidence of a Paleolithic human, wolf, and bison in Georgia

Previously, ancient DNA had been extracted from bones, hair, and teeth, but it can also be found in soil

Sequencing DNA fragments from skeletal remains has given us a clearer understanding of the genetic history of humans. Most often, ancient DNA (referred to as aDNA) samples come from recovered bone, teeth, or hair. Now, researchers from the University of Vienna have found that cave sediments can preserve ancient DNA well enough to provide genome-scale information. 

The samples were recovered from the Satsurblia cave in Georgia, where humans lived during the Paleolithic period. From a single soil sample, the researchers retrieved DNA from three mammalian species. The first genome was human, of Eurasian ancestry characteristic of post-ice-age people living in the Near East, North Africa, and parts of Europe. The second came from an unknown, extinct lineage related to wolves and dogs, and the third was from a European bison.

The DNA appears to have been preserved in the clay-rich sediment of the cave’s layers. The researchers were able to sequence the DNA found in the sediments directly, rather than using the common method of amplifying small amounts of DNA to make sequencing easier. The small amount of genomic material was sufficient for complementary analyses of the various mammalian species, allowing the team to dig into parts of their population histories. Even from small and fragmentary samples, the scientists could determine that the human was likely female, carrying XX chromosomes. They were even able to estimate the percentage of Neanderthal admixture in this human sample (approximately 1 percent). The wolf and bison sequences were more mixed, indicating that DNA from multiple individual wolves and bison may be present. 

The researchers conclude that this method offers an alternative to recovering ancient DNA from skeletal remains. The team plans to dive deeper into the soil of Satsurblia cave to determine how climatic changes affected ancestral human and animal populations.

Releasing bacteria-infected mosquitoes in Indonesia prevented the spread of dengue

Mosquitos carrying Wolbachia pipientis bacteria don't spread dengue fever

Madeline Barron


University of Michigan

Mosquitoes are the bane of our existence — they suck our blood, they spread disease, and for that, they suck in general. Some viruses that mosquitoes transmit, such as the dengue viruses, can be debilitating or even lethal. Scientists have previously shown that infecting mosquitoes with the insect-specific bacterium Wolbachia pipientis can stop or slow the replication of dengue virus within the bugs. That work raised the question: Can Wolbachia-infected mosquitoes help prevent the spread of dengue?

In a new study published in The New England Journal of Medicine, researchers sought to answer this question by releasing Aedes aegypti mosquitoes (the type that transmits dengue) with or without Wolbachia into geographic clusters throughout Yogyakarta, Indonesia. They found that Wolbachia-infected mosquitoes maintained stable populations in their areas of release over the three-year experiment. Importantly, the incidence of dengue fever was significantly lower among people living in clusters with Wolbachia-positive mosquitoes relative to control clusters — 2.3 percent and 9.3 percent, respectively. 

This study points to the efficacy of Wolbachia-mediated methods for controlling dengue virus, and potentially other diseases like yellow fever and Zika. Given that the Wolbachia method is being deployed in various regions throughout the world, this new work bolsters evidence that bacteria may be the key to keeping mosquito-transmitted viruses in check.

T cells activated by mRNA vaccines are effective against Alpha, Beta, and Gamma COVID-19 variants

Research into other aspects of COVID-19 immunity is swiftly developing

Josseline Ramos-Figueroa

Chemical Biology

University of Saskatchewan

With vaccination rolling around the world, it might feel for some that life is getting back to normal. But concern around the Delta variant has been growing. It has now been detected in over 130 countries, including 65 in July and early August. Three other variants of concern, Alpha, Beta, and Gamma, are also circulating in the United States.

Recently, a study led by Shane Crotty of the La Jolla Institute for Immunology, published in Cell Reports Medicine, found that the T cells that develop as a defense mechanism in people who have recovered from COVID-19, or who have received the Moderna or Pfizer vaccines, remain effective against the Alpha, Beta, and Gamma SARS-CoV-2 variants, along with another variant called CAL.20C. The Johnson & Johnson vaccine had not been available at the time Crotty was doing this research.

“You can think of T cells as a backup system: if the virus gets past the antibodies — if you have vaccine T cells the T cells can probably still stop the variant coronavirus infection before you get pneumonia," said Crotty in Science Daily.

While vaccine development has heavily focused on the antibody response of the body generated by B cells, new variants have mutations in the spike protein that can escape antibody recognition. In April, theoretical research by Binquan Luan and Tien Huynh showed that new mutations can sacrifice a tighter attachment to the human receptor ACE2 to gain antibody evasion abilities.

So far, studies have not converged on whether the Delta variant can cause more severe illness than the original strain in unvaccinated people. However, researchers have verified that two doses of an mRNA vaccine are effective at preventing hospitalization or death, which they report in a recent pre-print. More studies are required to understand vaccine breakthrough infections.

On July 27, WHO highlighted the urgent need to increase COVID-19 vaccination coverage and the recommendation for everyone to use the strategies that have been in place for the past year, including wearing masks in indoor public places.
