Forest fires are good for bee populations

Scientists identified more than twice as many bees at burned sites compared to unburned sites

Lila Westreich

Pollinator Ecology

University of Washington

Fire serves a vital role in our ecosystems. Forest managers often prescribe burns, also called "controlled" burns, to mimic the natural fires in forests and grasslands that may stimulate germination, reveal minerals in the soil, and increase seed vitality. Prescribed burns are performed by highly trained individuals to accelerate growth in the ecosystem. Certain tree species like lodgepole pine, ponderosa pine, and the famous sequoia actually require fire to open their pinecones and release their seeds.

But do these burns help pollinators? A recent study from researchers at North Carolina State University examined the impact of prescribed burns on the native bee community in the Sandhills area of North Carolina. Bees are important insect pollinators of grasslands, ecosystems that depend on natural fire to stay productive. But fire destroys and removes the flowering plants that bees visit. So how could fire help bees?

The study involved taking diversity counts (that is, placing traps to catch bees and identifying them at the species level) at sites that had been burned within the past two years and at sites that hadn't been burned.

The researchers found there were 2.3 times more bees captured in recently burned forest sites than in unburned sites.

Researchers found there were more flowering plants at recently burned sites, increasing the diversity of flowers known to benefit bees. Burns were conducted during the winter season, so as not to burn up woody debris that may serve as bee over-wintering habitat. 

Forest ecosystems that require fire for growth are found across the country. Fire benefits forest and prairie ecosystems by stimulating tree growth and increasing the diversity and presence of native plants such as Echinacea, providing food sources for bee species. Prescribed burns are a powerful tool for land managers, and this study is one of a growing number aiming to show the benefits of prescribed burns for bees.

Some lucid dreamers can answer questions and math problems in their sleep

For the first time, scientists have been able to communicate with people while they are dreaming

Marina Olle Hurtado



Lucid dreaming, or the ability to become aware that you're dreaming while you are actively doing so, is a rare psychological state that has long fascinated scientists. Until now, scientists had been able to show that participants can process external cues while remaining asleep, but never that they could communicate back.

This has changed with the publication of a new study in Current Biology detailing how scientists were able to communicate in real time with lucid dreamers, instead of having to wait for study participants to wake up and talk about their dreams.

While sleeping, the volunteers communicated with the researchers, responded to questions, and answered math problems by moving their eyes left and right and contracting their facial muscles. For example, one study participant was asked to solve the math problem "four minus zero" and answered by moving their eyes left and right four times each — all while sleeping. Moreover, volunteers typically were able to recollect what happened upon awakening.

The possibility of hearing the outside world while dreaming opens up a world of opportunities, including, but not limited to, interactive dreaming, recreational enjoyment, and even lucid dreaming therapy.

Your brain responds to why you're drinking, not just what you're drinking

Drinking for relief produces a different reaction in the brain than drinking for reward does

Elizabeth Burnette


University of California, Los Angeles

Different people drink alcohol for different reasons. Knowing someone’s motivation can help researchers develop more personalized treatments for problematic drinking. Studies show that reward drinkers, those who say that drinking makes them feel good, behave differently from relief drinkers, those who drink because it makes them feel less bad (i.e., they are self-medicating or alleviating withdrawal symptoms).

In a recent study, we aimed to explore whether these two categories of heavy drinkers also showed differences in their brains. Specifically, do reward and relief drinkers have different patterns of neural activation when looking at pictures of alcoholic beverages?  

To answer this question, we recruited people who drink heavily and categorized them into reward- and relief-drinking groups using the Reward/Relief/Habit Drinking Scale and the Reasons for Heavy Drinking Questionnaire. Previous research suggested that while most people begin drinking for positive reinforcement (reward), as they continue to drink heavily for a long period of time, they begin to drink out of negative reinforcement (relief). 

We also knew that the ventral striatum is a brain region associated with reward, while the dorsal striatum is associated with compulsive behavior. Therefore, we hypothesized that reward drinkers would have greater neural activation in the ventral striatum while looking at images of alcoholic drinks, whereas relief drinkers would have greater activation in the dorsal striatum when viewing these images.

We found that relief drinkers did indeed show significantly more activation in the dorsal striatum. However, contrary to our hypothesis, there was not much difference between reward and relief drinkers in the ventral striatum. We interpreted this to mean that the rewarding qualities of alcohol may not be lost in relief drinkers, but that a sense of relief may be gained in addition to the reward.

This study showed that there might be biological differences underlying different motivations for drinking, which opens the door to further development of precision medicine to treat alcohol addiction. One next step in this line of inquiry might be to examine differences in reward and relief drinkers’ response to existing treatments.

These are the health effects of tear gas

Tear gas can kill, but exposure can also cause chronic respiratory difficulty and severe eye injuries for months and potentially years after

Dan Samorodnitsky

Senior Editor

A tear gas canister contains a few different things. Primarily, it holds an irritant, most commonly 2-chlorobenzalmalononitrile (CS). Another irritant, 1‐chloroacetophenone (CN), was common until the 1980s, but it was found not to be potent or stable enough. A common composition for a complete tear gas cocktail is 45% CS, 30% potassium chlorate, 14% epoxy resin, 7% maleic anhydride, and 3% methyl nadic anhydride, with some small residual ingredients. Although CS is the main cause of pain and irritation, all of these ingredients have their own inherent toxicities. Potassium chlorate, for instance, causes burns and irritation itself, and can cause anemia if inhaled.

Tear gas agents like CS and CN function by interacting with TRPA1, a receptor protein expressed on the surface of nociceptors, the body's pain-sensing neurons. In a way that's similar to how strong mustard makes your nose burn, tear gas agents bind to these proteins at nanomolar or sub-nanomolar concentrations, tens of thousands of times more potently than mustard, over-activating the nociceptors. Tear gas causes extremely painful burning and irritation in the eyes, mouth, nose, and respiratory tract.

The chemical structure of tear gas irritant 2-chlorobenzalmalononitrile


Via Wikipedia

People exposed to tear gas were at risk for at least three months post-exposure for chest tightness, difficulty breathing, and chronic bronchitis. Severe exposure can cause edema and respiratory arrest leading to death as well. The CDC reports that tear gas can also kill due to severe burns in the respiratory tract. Other effects include corneal abrasion in the eye, glaucoma, and nerve damage. Deaths have also been reported after tear gas was deployed in prisons. A four-month-old infant developed pneumonitis after two to three hours of exposure to tear gas, when police fired canisters into a home while trying to arrest an adult there.

Tear gas was banned in warfare under the 1993 Chemical Weapons Convention. Police continue to use it domestically, in order to avoid using more lethal tactics for crowd control.

Denisovans left their DNA traces in humans, but their fossils remain elusive

New study digs into ancestries of people in Island Southeast Asia

Amanda Rossillo

Evolutionary Anthropology

Duke University

Extensive fossil and DNA evidence from around the world has shown that extinct human species occasionally interbred when they encountered one another. Modern humans interbred with at least two other species — Neanderthals and Denisovans — which have left traces in our DNA to this day.

In contrast to the well-studied Neanderthals, the Denisovans are a poorly understood species known from only a handful of 50,000- to 160,000-year-old fossils from Siberia and Tibet. But genetic studies have shown that Denisovans interbred with modern humans in Island Southeast Asia (ISEA), thousands of miles from where all known fossils were found.

There are plenty of fossils in ISEA, belonging to three distinct species: the well-traveled Homo erectus, and two endemic "super-archaic" species. These super-archaic species have a deep history in the region but disappeared from the fossil record around 50,000 years ago. This combination of super-archaic fossils and Denisovan DNA has complicated our understanding of the history of this region. To disentangle the genetic relationships among human species in ISEA, an international team of scientists analyzed the DNA of over 400 people from across the world, searching for segments that corresponded to both super-archaic and Denisovan DNA.

Surprisingly, they found no evidence that any of these super-archaic species interbred with modern humans, despite the wealth of fossils from the area. Instead, people with ancestry from ISEA, Papua New Guinea, and Australia had the largest amounts of Denisovan ancestry of all populations studied.

Finding genetic traces of Denisovans in an area of the world where they have yet to be found shows how little we know about the history of this region. While Denisovans appear anatomically distinct from the three super-archaic species of ISEA, it’s possible that they’ve been hiding in plain sight all along. There may also be Denisovan fossils still hidden across ISEA, waiting to be found. Either possibility will have major impacts on how we understand our own evolutionary history.

The muon g-2 experiment might mean the Standard Model of physics is incomplete, but that's just the beginning

Muons, elementary particles similar to electrons, behave like tiny magnets whose measured strength defies prediction, but the Fermilab experiment is still not confirmation of new physics

Katherine McCormick

Quantum Physics

University of Washington

Twenty years ago, an experiment at Brookhaven National Lab produced some puzzling results that might point to new physics beyond our current understanding. 

Just last week, Fermilab unveiled the product of a decade-long quest to verify that original Brookhaven result: they can now say with even more certainty that there is indeed a discrepancy between the measurement and our theoretical predictions based on current models of physics.

The experiment was a measurement of the so-called "muon g-2 factor." The muon — a subatomic particle which is, essentially, an electron, but about 200 times heavier — has a magnetic moment, meaning this tiny particle can be thought of as a little bar magnet. Our current working theory of particle physics, called the Standard Model, predicts how strong this bar magnet should be. But that 20-year-old experiment measured a strength of the magnetic moment that wasn't consistent with the Standard Model prediction. Fermilab repeated the experiment with even more care and precision, by looking at how fast the muons' little bar magnets precess, like spinning tops, when placed in a magnetic field. They were able to confirm that there is a 4.2-standard-deviation discrepancy between their measurement and the current best theory predictions, meaning there is only a one-in-40,000 chance that the discrepancy is a statistical fluke.
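The one-in-40,000 figure is simply the 4.2-sigma significance translated into a tail probability. As a rough sanity check (assuming a plain two-tailed normal distribution, which glosses over the experiment's full statistical analysis), the conversion can be done with Python's standard library:

```python
import math

def sigma_to_odds(n_sigma: float) -> float:
    """Return the 'one in N' odds that a deviation of n_sigma standard
    deviations (two-tailed, under a normal distribution) is a fluke."""
    p = math.erfc(n_sigma / math.sqrt(2.0))  # two-tailed tail probability
    return 1.0 / p

# A 4.2-sigma discrepancy works out to odds on the order of one in 40,000,
# matching the figure quoted for a statistical fluke.
print(f"one in {sigma_to_odds(4.2):,.0f}")
```

The same function shows why particle physicists insist on 5 sigma before claiming a discovery: the fluke odds drop to worse than one in a million.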

A graph showing the Fermilab muon experiment lining up with a previous experiment from Brookhaven National Labs

The result from the Muon g-2 experiment at Fermilab aligns well with the previous Brookhaven result

Ryan Postel, Fermilab/Muon g-2 collaboration

This is exciting, because it could be a hint towards a new, more complete theory of physics that might answer many questions that the Standard Model hasn't been able to. But then again, there could just be an issue with the way that the muon's magnetic moment is being calculated, not a problem with the physics itself. Calculating the "g-2" (pronounced "g minus two") is extremely complicated, and not every method of calculation produces the same result. Last week, at the same time the new experimental value of the muon g-2 was announced, a theoretical value using a new method was also announced — and this new theoretical value is actually much closer to the measured value than previous calculations have been.

So while this is a very important step in particle physics, it's by no means the end of the story. The theorists will be hard at work trying to figure out why different calculations are giving different results, and the experimentalists will be hard at work with the analysis of even more data runs, reducing their uncertainty further. We can expect more exciting announcements from Fermilab in the years to come.

Researchers identify a bacterium that enables its host to breathe nitrate instead of oxygen

Meet Candidatus Azoamicus ciliaticola, discovered in a Swiss lake

Irene Zhang

Microbial Ecology

Massachusetts Institute of Technology

Most eukaryotes generate energy through breathing oxygen. This happens in mitochondria, specialized organelles likely acquired by the ancestor of eukaryotes when it engulfed a free-living prokaryote. Over time, this prokaryote became an obligate endosymbiont, meaning that neither the prokaryote nor its eukaryotic host could survive without the other. Then with more time it became an organelle, by keeping only the genes needed for oxygen respiration and losing genes for independent living.

But not all eukaryotes breathe oxygen. In a paper published in Nature, researchers investigating a lake in Switzerland found many eukaryotic ciliates (single-celled eukaryotic organisms covered with tiny hairs they use to move) which swam away from oxygen, indicating that they used something else for energy. 

When the researchers stained the ciliates with a DNA-binding dye, they found multiple pockets of DNA within each ciliate outside of its nucleus. Using a different fluorescent dye, they found that these pockets contained only bacterial DNA. The researchers extracted and sequenced the DNA, and obtained a small circular genome not belonging to any known bacterium. They named this novel organism ‘Candidatus Azoamicus ciliaticola’.

Why was this bacterium inside these ciliates? The researchers compared its genome to other bacteria to answer this question. Like many other endosymbionts, the bacterium's genome was very small and lacked the genes needed for independent living. In particular, this genome contained a high proportion of genes for energy production, similar to mitochondrial genomes. However, it lacked genes for oxygen respiration. It instead possessed the genes needed for nitrate respiration. These clues led the researchers to conclude that ‘Candidatus Azoamicus ciliaticola’ was an obligate endosymbiont that enabled its ciliate host to breathe nitrate — not oxygen — for energy. This adaptation allows the ciliate to live in waters low in oxygen but high in nitrate.

While many free-living prokaryotes can respire nitrate, this is the first instance of this metabolism in a prokaryotic endosymbiont. Perhaps other eukaryotes out there have acquired new metabolisms with the help of endosymbionts. These interkingdom partnerships may allow eukaryotes to colonize environments formerly assumed to be the domain of only bacteria and archaea.  

Fulvia Del Duca


Technical University of Munich

Remember in high school, when you had to read the entire Divine Comedy — and said to yourself, "I'll never major in literature"?

Unsurprisingly, brain recordings from experts in a particular discipline — be it literature, painting, or music — show that experts rate the types of art that interest them more positively than people with little background in them do.

But just because you didn't major in literature doesn't mean that you can't be moved by a poem. In a recent study conducted in Italy and published in the journal Brain Sciences, non-literature students showed a stronger emotional reaction to excerpts from the Divine Comedy than literature students did, despite the fact that the literature students were able to appreciate and recall the text better. The researchers attribute this to 'emotional attenuation' in the literature students.

So if you are reading a poem and wishing you knew more about it, think again: the less you know, the more emotionally impactful it might be.

People with immune systems primed to fend off bacteria are more susceptible to a common virus

This new research could lead to greater understanding of how the flu and coronaviruses affect us

Hazel Walker

Immunology and Cell Biology

University of Cambridge

Every winter, coughs and sneezes run rampant through the population, but some people find themselves sick with the common cold time and again whereas others do not. Although immune memory generated from previous infections can partially explain why some people might be protected, it is not the whole story.

Researchers at Imperial College London, trying to better understand what makes us susceptible to the common cold, looked at the state of the airways in healthy volunteers before exposing them to Respiratory Syncytial Virus (RSV). RSV is one of a number of viruses which can cause common cold symptoms in healthy adults, but it can prove fatal in infants and the elderly.

In a paper published in Science, these researchers showed that people with bacteria-fighting cells, called neutrophils, in their airways before viral exposure were more likely to become infected with RSV. In this case, by being primed to tackle bacterial infections, their immune systems were less prepared to fight off a virus.

This finding could be used to identify people who might be more likely to become infected with RSV as well as further our understanding of how viruses affect us. These findings may prove relevant for other viruses such as influenza and coronaviruses.

Such a discovery would not have been possible without a challenge study, in which study volunteers are safely exposed to the virus. This allows researchers to capture information before, during, and after viral exposure, something which researchers at Imperial College London now hope to do with SARS-CoV-2, the virus responsible for the COVID-19 pandemic.

People of the Tiahuanaco civilization engineered their own rocks to build temples and monuments

The discovery illustrates the great ingenuity of ancient construction workers

Tiahuanaco, a small village south-east of Lake Titicaca, is a world-famous archaeological site with 1,400-year-old monuments and ceremonial buildings. The precision and detail of these sculptures caught the attention of scientists, who doubted that they could have been created with the simple tool technology known at the time.

A new pre-print (a scientific study that is completed but has not yet been peer-reviewed by other scientists) claims to solve the mystery: the 'H-shaped' blocks of one of the most iconic temples, Pumapunku, are not made of rock, as was always thought. They're made of sand!

Ever since archaeologists found the ruins of Tiahuanaco, they have tried to trace the city's source of materials, as almost all of it was made of rock blocks. In 1892, researchers discovered that these rocks (called andesite) were collected from an outcrop located at the foot of a volcano named "Cerro Khapia."

A geochemical analysis revealed that the composition of the H-shaped blocks from Pumapunku matched the andesitic sand, but also included organo-mineral binders, bat droppings, and other ingredients used to produce andesite geopolymer blocks.

The discovery of Tiahuanaco’s material supply route revealed the great ingenuity of ancient construction workers who created incredibly resistant blocks by using what they could find around them in ways archeologists had not anticipated.

New earthquake-detecting app may give people more time to evacuate

ShakeAlert was designed by researchers at the University of Oregon

Kristen Vogt Veggeberg

Science Education

University of Illinois at Chicago

The 10-year anniversary of the terrifying Tohoku earthquake and tsunami that wrecked Japan passed last month. Now, the other side of the Pacific Rim may have a new technology to better detect these devastating forces of nature.

ShakeAlert, designed by researchers at the University of Oregon and led by UO geophysicist Doug Toomey, uses data from over 400 different seismic detectors in the Pacific Northwest to sense the 'rumbles' that precede an earthquake and send alerts to smartphones and other wireless devices in areas where an earthquake might strike, including states such as California, Oregon, and Washington. By giving residents more time to prepare for an earthquake, and even a tsunami, more lives may be saved, similar to the impact that tornado warnings have had in preventing deaths from wind storms. Hopefully, this app can also expand to other countries and places vulnerable to earthquakes, such as Chile, New Zealand, and Japan, where notices of seismic activity could be sent quickly to people, allowing them to keep themselves and their loved ones safe during an otherwise cataclysmic event.

Uranus emits extra x-rays, and scientists don't know why

They could be just reflections, or Uranus could have its own version of the Northern Lights

Briley Lewis

Astronomy and Astrophysics

University of California, Los Angeles

Scientists have recently discovered x-rays coming from the planet Uranus. Using data from the Chandra X-Ray Telescope, scientists have observed x-rays on Uranus in images from both 2002 and 2017. (You might be thinking, “2002, how is that new?!” Sometimes, like in this case, astronomers will record a lot of data and not actually finish analyzing it until years later). 

They combined this x-ray information (shown in pink) with optical pictures of Uranus (blue), resulting in the image you see here.

Combined optical and X-ray image of Uranus.


Chandra image gallery. X-ray: NASA/CXO/University College London/W. Dunn et al; Optical: W.M. Keck Observatory

High energy x-rays from a planet might sound shocking, but we've actually seen x-rays coming from most planets in the solar system. The Sun emits x-rays, and planets reflect some of that light back into space. The interesting thing with Uranus is that it seems to emit more x-rays than you'd expect from just reflected sunlight. So, how is Uranus producing extra x-rays? Maybe it simply reflects more x-ray light than the other planets, or maybe it has charged particles hitting its rings, like Saturn. Another explanation could be aurorae — like the Aurora Borealis on Earth, other planets emit light when charged particles (like electrons) travel along the lines of their magnetic fields.

Either way, we'll need more observations to know for certain what's going on with Uranus. We know quite a bit about our solar system, but the two ice giants (Uranus and Neptune) are woefully unexplored. The only mission to visit them was Voyager 2, which flew by them in the 1980s, and we haven't been back since.

Scientists engineer a sustainable plastic made of tiny building blocks

The new plastic is just as strong as polyethylene and can be 3D printed into objects

Emily Mueller


University of Michigan

Plastics are incredible because they are so versatile. Plastic waste, however, is problematic because it is so persistent. Right now, the life of a typical piece of plastic follows a linear path: plastic is born from fossil-fuel building-blocks, molded into products, and dumped into landfills and oceans as waste. 

A cyclic life would be more sustainable, where plastic is born from plant-based building-blocks, molded into products, broken back down into building-blocks, reborn into plastic, and so on. This “closed-loop” recycling is challenging because the plastic must be durable enough for product use but breakable under the right conditions. 

To obtain durable yet breakable material, researchers made new plastic that imitates the world’s most common plastic, polyethylene, but with a slightly different chemical make-up. Polyethylene is durable because it is made of carbon and hydrogen atoms, which form strong bonds with each other. Long chains of connected atoms in polyethylene also arrange themselves into an organized 3D structure, further increasing strength. 

The new plastic still has long segments with carbon and hydrogen atoms, like polyethylene, but with small amounts of oxygen atoms, which serve as breaking points. To obtain a strong plastic despite these breaking points, the plastic needed to be made of extremely long chains of connected atoms. Researchers overcame this challenge by using a more reactive combination of molecular building-blocks to grow longer chains. 

The new plastic is just as strong as polyethylene and was 3D printed into a protective smartphone cover and a cup that could withstand boiling water. However, a combination of heat and alcohol completely broke the new plastic down into its molecular building-blocks – even in mixtures with other plastics and dyes that are present in real-world waste streams. The building-blocks were then re-used to make the plastic again in a “closed-loop” lifecycle. 

Plastics with “closed-loop” lifecycles could dramatically reduce the resources needed for products that we use every day, like smartphone covers or cups. While more research is needed to understand how to produce and recycle new plastics on large scales, this initial example is a promising step towards a sustainable future. 

Space travel is now open to the public

SpaceX's Inspiration4 has picked two new astronauts and will be launching September 2021

Briley Lewis

Astronomy and Astrophysics

University of California, Los Angeles

SpaceX’s Inspiration4 is the first all civilian space mission, breaking from a long tradition of NASA leadership and military-trained pilots. Funded by a billionaire CEO (Jared Isaacman, also the mission commander), the mission team just announced its final two crew members to join Isaacman and physician assistant and cancer survivor Hayley Arceneaux on the journey. 

Dr. Sian Proctor, a geoscientist and educator, won her spot through an entrepreneurship competition for “demonstrating innovation and ingenuity” using Isaacman’s Shift4Shop platform. She is extremely well-qualified to be an astronaut, too — she completed multiple trips on Earth as an “analog astronaut” and has been a finalist in NASA’s Astronaut Program. Chris Sembroski, an engineer, Air Force veteran, and all-around space enthusiast, won his seat through a raffle where proceeds went to St. Jude’s.

Launch is planned for September 2021, and these new astronauts will orbit Earth every 90 minutes for multiple days in SpaceX's Dragon crew capsule before returning home. This is by no means the first historic feat SpaceX has taken on — they were the first private company to launch a rocket to orbit and the first to send astronauts to the International Space Station. They've also created the first reusable orbital rocket, the Falcon 9, and currently operate the largest satellite constellation in the world (although that one might not be such a good thing).

Despite their illustrious track record, they’re not the first company to get into space tourism — although they are the first to successfully make it happen without the government’s help. Billionaires have been buying tickets through Space Adventures (hopping aboard Soyuz spacecraft) for years, Jeff Bezos (of Amazon infamy) owns Blue Origin, and Sir Richard Branson has Virgin Galactic. Even more private space companies have appeared in recent years, showing us that we truly are on the verge of a new era of space exploration (for better or for worse), with SpaceX leading the way. 

Think twice before you ice after an injury

Applying ice to a sprained ankle or wrist decreases blood flow to the area for longer than previously thought

Margaux Lopez


Vera C. Rubin Observatory

It is common practice to apply an ice pack to a sprained ankle or a sore muscle, and many professional athletes have been reported to use cryotherapy to aid with recovery. However, the benefits of the well-known RICE protocol (rest, ice, compression, and elevation) for injuries and sore muscles have been thoroughly debunked, including by the doctor who originally coined the term four decades ago. While icing an injury does effectively relieve pain, it also constricts blood vessels and reduces blood flow to the cold area. Even though the injury feels better, this impairs the body's ability to heal, extending the recovery process.

But what happens after the ice is removed? In a recent study, scientists hypothesized that once the area warmed up, there would be a large temporary increase in blood flow, aiding in the healing process. This “rebound” phenomenon has been observed after things like removing a tourniquet or unclamping an artery during surgery, but hadn’t been studied for restrictions due to cold temperatures.

The researchers found that using ice, compression, and elevation therapy on a muscle immediately after exercise led to significantly reduced blood flow, as expected. But instead of bouncing back immediately after treatment, the blood flow remained low for an extended period of time. We already knew that ice impairs muscle recovery even though it's great for reducing pain; now we can add that the negative effects last longer than previously hypothesized, suggesting that injured athletes should think twice before using ice as pain relief.

Hurricanes cause more health problems than you might think

New research into when and why people go to hospitals after hurricanes will help these facilities better prepare for future disasters

Knowing how to prepare for a hurricane isn’t easy, especially if you are running a hospital. In the United States, hospital facilities are required by law to plan for disasters, including outlining emergency leadership and staffing, triage protocols, and how to perform a full-scale evacuation, as several Gulf Coast hospitals did during Hurricane Laura last fall. Making the wrong decision can cost dollars and lives, and there are always lessons to be learned.

A recent study published in Nature Communications may have one of those lessons. By cross-referencing 70 million Medicare hospitalizations with 16 years of local wind measures, researchers identified patterns of when older Americans tend to go to hospitals after storms and why. Unsurprisingly, hospital visits for injuries jumped post-storm. But so did several seemingly unrelated maladies.

The day after hurricane force winds, for instance, hospitalization rates for respiratory issues — asthma, chronic obstructive pulmonary disease (COPD) — doubled on average and infectious disease visits spiked by roughly half. Visits for both remained higher than average throughout the week after hurricanes and tropical cyclones, as did hospitalizations for liver disease, delirium and dementia, and renal failure. Cancer visits dropped by 4 percent in the same time.

All told, an estimated 16,000 additional hospitalizations were associated with respiratory problems and 2,200 with infectious disease. Reasons are varied but likely have to do with the loss of power needed for breathing equipment and exposure to contaminated flood water — more than 800 wastewater facilities reported spills after Hurricane Harvey in 2017.

The results will help improve hospital preparedness for hurricanes. They will also clue hospitals in to how to stock medical supplies in the days before a storm and assign staff in the days after, crucial information when these facilities are strained. At a time when climate change is fueling more frequent and powerful cyclones, the study is also a reminder of the detrimental effect global warming has on health, even in wealthy countries.

Scientists re-discover a long carbon chain molecule hiding in space

Cyanodecapentayne has been detected in the Taurus Molecular Cloud for the first time in over 20 years

Olivia Harper Wilkins



Cyanopolyynes are a class of molecules that are out of this world. No, really — these molecules do not exist naturally on present-day Earth.

Cyanopolyynes are long chains of carbon atoms with a hydrogen atom on one end and a nitrogen atom on the other. Interstellar cyanopolyynes are interesting because they might help us better understand the carbon chemistry around stars and how larger carbon molecules, such as those that make up interstellar dust grains and soot, are formed and destroyed during stellar evolution. 

Observations with the Green Bank Telescope (GBT) in West Virginia have revealed another interstellar cyanopolyyne, cyanodecapentayne (pronounced sigh-ann-oh-deck-uh-pent-uh-ine), or HC11N, in the Taurus Molecular Cloud, about 430 light years from Earth. This discovery was reported in the February 2021 issue of Nature Astronomy, but the saga of detecting this molecule goes back to the late 1970s. 

Predictions for HC11N’s observable chemical signature were reported in 1978. Throughout the 1980s and 1990s, there were multiple detections of HC11N around a cool star and in the Taurus Molecular Cloud, but more recent observations taken toward that same cloud came up empty. This suggested that the earlier reports were based on a mistake in the assumed chemical signature of the molecule. After more than 20 years, HC11N was knocked off the list of detected molecules, until this latest discovery. 

Long carbon chains up to HC17N have been detected in the lab, so there might be even longer carbon chain molecules hiding in the clouds of interstellar space. The bigger question is: will we be able to detect them?

A scientific correction finds Venus's atmosphere probably does not contain phosphine gas

The initial discovery set off a flurry of excitement. The reality is something more mundane

Simon Spichak


Is there life on Venus? We once envisaged an alien world hidden beneath its yellow clouds. Through advances in astronomy, we now know that Venus is rather uninhabitable. It's scorching hot, with toxic sulfur dioxide clouds. The air pressure on the surface is literally crushing.

A study in September 2020 noticed an anomaly in the atmosphere of Venus. By pointing a telescope at the planet, researchers detected much more phosphine gas than expected in its atmosphere. This report set off a flurry of publicity and excitement, because the only way we know of to make this much gas involves microbial processes. The scientists were very careful in writing about their findings, conscious of the fact that extraordinary claims require extraordinary evidence, but it was widely hypothesized that either this was a newly discovered reaction in the atmosphere or a sign of microbes on Venus.

But in November of 2020, the authors contacted the journal to say that they found mistakes in the way that they processed some of their telescope data. They are in the process of correcting their article. In the meantime, other groups also delved into these findings. In January 2021, another group of researchers published their analysis, and they concluded that the gas was likely sulfur dioxide, not phosphine, high up in Venus's atmosphere. 

So, scientists didn't find signs of alien life on Venus after all. Nonetheless, they managed to improve their techniques and calibrations for observing faraway planets. And — importantly — the scientific process of discovery, debate, and correction when necessary worked exactly as it should have. 

By avoiding pumas, deer are drastically changing the vegetation around them

Animals' behaviors are shaped by fear of humans, and this, in turn, affects the plants they eat

Maria Gatta

Ecology and Conservation Biology

University of the Witwatersrand, Johannesburg

We humans modify our surroundings to fit our needs. We are capable of taking out any animal, from an inconvenient garden snail to a wolf that has a habit of killing livestock. It is no wonder that animals fear us.

But the effects of that fear do not stop with the animals themselves. Animals are adaptable creatures: they adjust to our schedules and behaviors by modifying their own. These changes in behavior can trickle down through the ecosystem, extending even to large carnivores like pumas.

Pumas live throughout the Americas, from Patagonia in the south all the way to the sub-arctic regions of Canada. But most people who live in those areas will go their whole lives never seeing one. Pumas know to avoid us — and their prey have noticed too.

New research published in Ecosphere has found that black-tailed deer in the Santa Cruz Mountains of California are spending more time in the forest areas closer to human habitation. And it is not due to tastier or better quality plants, nor because humans are providing them with food. Rather, it is because pumas are afraid of the nearby humans and so they don’t like going into those areas.

The deer are spending so much time in these forests that they are modifying the environment. The plants in these deer-frequented areas are becoming shrubbier, as they are essentially being pruned by the deer. This pruning, in turn, generates more food, a win-win scenario for the deer. More research is needed to see how this affects the other wildlife in the area, like birds and insects, but the effect that humans are indirectly having on the landscape is clear.

Orangutans and bonobos at the San Diego Zoo have been vaccinated for COVID-19

The apes were given an experimental vaccine originally developed for dogs and cats

Brittany Kenyon-Flatt

Biological Anthropology

North Carolina State University

Just over three months ago, in December 2020, research found that monkeys and apes from Africa and Asia are susceptible to the SARS-CoV-2 virus. At the time, there was particular concern for endangered gorillas because the disease could have a detrimental effect on their small population size, and for captive chimpanzees living in sanctuaries who are in close contact with humans.

Then in January, it was reported that a troop of eight captive gorillas living at the San Diego Zoo tested positive for COVID-19. While the gorillas made a full recovery, this incident showed that gorillas can, in fact, contract and become sick from COVID-19. 

On March 3, however, there was good news for our primate cousins: great apes living at the San Diego Zoo received experimental COVID-19 vaccines. Four orangutans and five bonobos were vaccinated, making these nine animals the only non-human great apes to be immunized in the world. There are three leftover doses, which will be given to the three gorillas living at the zoo who did not have COVID-19 in January. 

This specific vaccine cannot be used for humans. It was developed by Zoetis for dogs and cats, amid ongoing debate as to whether such a vaccine would be useful. While the vaccine used on the great apes in San Diego was not specifically designed for them, Zoetis provided an emergency supply of the vaccine for this specific use. The idea of using a vaccine developed for one animal to inoculate another is not new. In fact, the influenza vaccine developed for humans is given to great apes in zoos every year, and researchers are working to develop cross-species vaccines to treat other diseases like malaria.

While there is still relatively little known about how COVID-19 affects the animal kingdom, scientists are also worried about whether the virus could gain a foothold in an animal population, and then reemerge and infect humans again. While it remains unclear whether the vaccine could (or should) be used to inoculate wild populations, the immunization of captive populations is certainly a step towards a safer world for all animals. 

Two mistakes by a pair of astronomers expanded our knowledge of dying stars

Wade and Hjellming were initially looking for red supergiants. They didn't find them, but that doesn't mean they failed

Margaux Lopez


Vera C. Rubin Observatory

Radio telescopes detect signals at longer wavelengths than optical telescopes, making them ideal for observing gas rather than the visible light of stars. In 1970, a pair of astronomers named Wade and Hjellming decided to use their three precious weeks of observing time on a radio telescope to see if a certain type of star called a red supergiant was detectable. After two weeks of searching, the answer was a disappointing “no.” They moved on to the backup plan: using their last week to point the telescope at various other types of stars in the hopes they could find something.

And they did! They detected a strong signal when they pointed the telescope at a nova, or exploding star, and then shortly after, a second one. Building on those successes, they completely changed their observing strategy to focus on all types of novae, ultimately leading to the discovery of a new class of radio source and providing a valuable complement to the existing optical data on novae. 

There was one small catch — they had mis-typed a digit when filling out the punch card that feeds coordinates to the telescope, and the first thing they had detected was actually radio waves from a known radio source, not a nova. Same thing for the second detection: a calculation error meant that they had been pointing at the wrong patch of sky. The fact that they kept looking specifically at novae with the radio telescope was propelled by a few silly mistakes that just happened to lead them down the right path of inquiry. As Wade wrote: “When you can’t do [science] any other way, that’s how you have to do it!”

The mysterious cause of sea star wasting syndrome is a mystery no more

Sea stars suffer when microorganisms living on them suck up too much oxygen from the water

Amanda Rossillo

Evolutionary Anthropology

Duke University

Sea stars, also known as starfish, have a reputation for being resilient animals that can regenerate lost limbs. However, in 2013, sea stars off the Pacific Coast began rapidly dying in huge numbers, and no one knew why.

Similar sea star mass die-offs have been recorded for decades, but this event was one of the largest wildlife mass-mortality events ever recorded. Sick sea stars become covered in lesions, permanently losing their limbs and melting into blobs of decayed tissue. This illness, termed sea star wasting syndrome (SSWS), was widely thought to be caused by a viral infection. However, this could not be replicated in the lab and was ultimately disproven.

Now, in a new study published in Frontiers in Microbiology, researchers have found the mysterious illness was caused by microorganisms sucking up oxygen from the water around infected sea stars, essentially suffocating them. 

The researchers had previously investigated and ruled out other factors, such as water temperature, as the cause of SSWS. However, when they compared the water immediately surrounding sick sea stars to the environments around healthy sea stars, they found that nutrient-loving bacteria living on the sick animals had used up the oxygen the sea stars need to breathe. 

Although SSWS is caused by an ecological interaction rather than an infection, it can still be transmitted between sea stars. As dying sea stars decay, they generate organic matter that can promote bacterial growth on nearby sea stars in a dangerous feedback loop.

Sea stars play essential roles in many ecosystems and help maintain local biodiversity. Knowing how this disease develops can help researchers treat sick sea stars in the lab, helping to preserve delicate ecological relationships. 

Scientists starved E. coli for 1200 days to learn about bacterial evolution

They saw two main lineages emerge and compete with each other — all in a single test tube

Madeline Barron


University of Michigan

What do you get when you stick E. coli in a test tube and don't feed it for three years? According to a new report in mBio, bacterial evolution at its finest.

In this study, scientists grew a single tube of E. coli in the laboratory for 1,200 days (over three years!). They wanted to explore how bacteria adapt to nutrient-limited conditions for an extended period of time, as microbes often face periods of starvation and stress in the natural world. 

The growth medium in the tube was never replenished. The only nutrients available were those the bacteria recycled from waste accumulating in the tube. This approach was different from other long-term E. coli evolution experiments (some of which have gone on for over 25 years), in which bacteria are transferred to fresh media every day. 

When bacteria divide, they sometimes acquire mutations in their DNA. If those mutations are beneficial to the organism, their frequency within the population increases. By analyzing these mutations and their frequency, we can learn how bacteria evolve and adapt to their surroundings.
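To make the frequency idea concrete, here is a toy calculation on made-up sequencing reads; the bases, counts, and time points below are hypothetical illustrations, not data from the study:

```python
from collections import Counter

# Hypothetical reads covering one genome position at three time points:
# "A" is the ancestral base, "G" a new mutation spreading through the tube.
samples = {
    "day 100": "AAAAAAAAAG",
    "day 600": "AAAGGGGAAG",
    "day 1200": "GGGGGGGAGG",
}

for day, reads in samples.items():
    freq = Counter(reads)["G"] / len(reads)
    print(f"{day}: mutant frequency = {freq:.0%}")
# Prints 10%, 50%, then 90%: a rising frequency like this would suggest
# the mutation is beneficial under the starvation conditions in the tube.
```

Real studies compute these frequencies from millions of reads across the whole genome, but the underlying arithmetic is the same.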

In this study, the researchers sequenced DNA from E. coli samples collected at regular intervals over the course of the experiment. Based on the type and frequency of mutations they found, the researchers identified two main genetic groups, or lineages, that had diverged from the parent cells. While cells within each lineage coexisted with each other in the tube, one generally outnumbered those from the other, with the dominant lineage changing from one time point to the next. This indicates that the lineage best equipped to handle life in the tube varied according to the challenges faced by the cells at any given time. 

In addition to being the first to explore evolution of a single bacterial population under nutrient starvation conditions for this length of time, this study illustrates how complex and dynamic evolving bacterial communities can be. It also highlights specific mutations that may help bacteria thrive in diverse, often stressful, environments in the world beyond the test tube. 

Simon Spichak


As we age, our brain becomes more vulnerable to different diseases. Parkinson's disease is one such debilitating disorder. It affects more than 10 million people worldwide. Men are 1.5 times more likely than women to develop it. Most people with Parkinson's disease develop it when they are around 60 years old, when the motor cells in their brains begin to die off, leading to a loss of motor control.

This brain cell death is caused by clumping of alpha-synuclein (α-syn) proteins. We know that α-syn is found in the region of the neuron that sends out packages of neurochemical signals, called vesicles. But we don't know what this protein does in a healthy brain, and that gap might hold a clue to why α-syn begins clumping. 

A new study in Nature Communications provides more insight into the protein's function. Scientists used artificial cell membranes to mimic the vesicles sent out by cells. By adding α-syn to these vesicles, they found that it sticks like glue to the inside of the vesicle and keeps these tiny packages of neurochemicals at their origins until the cell is ready to send them out. This might explain how one abnormally clumping α-syn protein can spread and build up inside of neurons in people with Parkinson's disease. This very basic scientific finding could lead to better treatments for the disease in the future.

Green bridges in Germany are keeping a growing gray wolf population — and their prey — safe

Road accidents account for over 75 percent of all known wolf mortality in Germany

Gray wolves had been locally extinct in Germany since the 1800s, but have recently recolonized the northeastern region of the country by crossing the border from Poland. More than 100 wolf packs were counted in 2019. This is an impressive feat, but one obstacle stands in the way of a robust wolf population in the country — vehicle collisions.

Road accidents account for more than 75 percent of all known wolf mortality in Germany. Fortunately, recognizing the enormous benefit to both humans and wildlife that comes from keeping animals out of roadways, efforts have been made around the world to build “green bridges.” These bridges provide wildlife with safe crossing opportunities over roads and highways, helping them move freely throughout the landscape.

The novelty of these bridges means that researchers are still testing out if, and how, different wildlife species will utilize them. Will prey species be deterred if they sense that predators use the same crossings? Will animals use the crossings during the day, when noise from the road is peaking, or merely at night when human disturbance is lower? These and other questions led researchers to examine one such green bridge built over a highway connecting Berlin to Poland. 

Using camera traps, which monitor the activities of animals with motion-activated photos and videos, researchers determined that wolves actively use the green bridge to safely cross the highway, and that their presence doesn’t seem to deter deer and boar (their main prey) from using the same crossing. First constructed in 2012, when wolves were still absent from the region, the bridge has quickly become a known safe crossing destination for wolves. The first wolf crossing was recorded in late 2015, and the wolves used the bridge over 85 times the following year. The bridge is one of seven such green bridges throughout the German state of Brandenburg. 

The results from this study help wildlife biologists determine the myriad benefits of green bridges. Wolves use the bridges more frequently during the winter months, which could be due to the seasonal need to expand their hunting range. In contrast, deer and wild boar tend to cross more in the spring and summer. All species crossed the bridge more during dawn and dusk, and least often during the day, when human activity is highest.

These analyses of species use of green bridges and other wildlife crossings are important for understanding how we can best design and implement them. Studies like this can help determine where to place green bridges in the landscape to optimize their use by wildlife species and maximize the benefit to humans by keeping animals out of the roads, facilitating our coexistence as we endeavor to lighten our footprint on the landscape.

Three clusters of neurons control zebrafish decision-making and movement

New research sheds light on how our neurons guide our behavior

Simone Lackner


Laboratório de Instrumentação e Física Experimental de Partículas

To make good decisions, both humans and animals need to collect evidence and integrate new information and knowledge. Lots of research in psychology and the cognitive sciences has been done on understanding human decision-making. However, scientists still don't fully understand how decisions are made on a neuronal level: How do neurons in our brains and nervous systems guide our behavior?

To find answers to those questions, a pair of scientists from Harvard University combined behavioral experiments with brain imaging using larval zebrafish as a model organism. Zebrafish larvae are small, transparent vertebrates with roughly 100,000 neurons. With current imaging techniques, the entire volume of the zebrafish larval brain can be imaged non-invasively at high resolution over many hours. 

The researchers saw that when zebrafish larvae were exposed to whole-field visual drift, they swam in the direction of the motion to stabilize themselves with respect to the visual environment. However, larvae don’t swim continuously. They make discrete movements that can each be considered an individual decision, raising the hypothesis that larvae accumulate evidence of the direction of the drift during motionless periods.

A popular decision-making paradigm (originally applied to humans and primates) is the random-dot motion discrimination task, which tests a viewer's perception of motion and direction. Using an adapted version of this random-dot paradigm, the scientists found that zebrafish larvae indeed accumulate information about motion direction over time and robustly swim in the direction they perceive their surroundings to be moving.

Further, the scientists created a whole-brain functional map and identified three clusters of neurons that relate to the behavioral choices of the zebrafish during the random-dot task. They propose a biophysical neural network in which one cluster of neurons, integrating evidence of perceived motion direction over time, competes with a second cluster that represents a decision threshold. These two neuronal clusters are in a push-pull dynamic until a certain threshold is reached, finally activating the third cluster of motor neurons, which initiates movement in the perceived direction.
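The paper's full biophysical network isn't reproduced here, but the core mechanism (noisy evidence accumulating until it crosses a decision threshold) can be sketched as a toy drift-diffusion simulation; all parameter values are illustrative assumptions, not figures from the study:

```python
import random

def decide(drift=0.1, noise=1.0, threshold=10.0, max_steps=10_000, seed=42):
    """Toy evidence accumulator: positive evidence favors 'right',
    negative favors 'left'; a choice is made when the running total
    crosses the decision threshold in either direction."""
    rng = random.Random(seed)
    evidence = 0.0
    for step in range(1, max_steps + 1):
        # Each time step adds a small directional signal plus noise,
        # mimicking uncertain perception of the random-dot motion.
        evidence += drift + rng.gauss(0, noise)
        if abs(evidence) >= threshold:
            return ("right" if evidence > 0 else "left"), step
    return None, max_steps  # no decision within the time limit

choice, steps = decide()
print(f"decision: {choice} after {steps} steps")
```

In models like this, stronger motion coherence corresponds to a larger drift value, which tends to produce faster and more reliable decisions, much as clearer dot motion does for the fish.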

With these findings, the researchers have shown for the first time that larval zebrafish integrate sensory information over time and that larvae are a suitable model organism to study questions about perceptual decision-making.
