Individualized brain cell grafts reverse Parkinson’s symptoms in monkeys

Grafting neurons grown from monkeys’ own cells into their brains relieved the debilitating movement and depression symptoms associated with Parkinson’s disease, researchers at the University of Wisconsin–Madison reported today.

In a study published in the journal Nature Medicine, the UW team describes its success with neurons made from induced pluripotent stem cells from the monkeys’ own bodies. This approach avoids complications with the primates’ immune systems and takes an important step toward a treatment for millions of human Parkinson’s patients.

“This result in primates is extremely powerful, particularly for translating our discoveries to the clinic,” says UW–Madison neuroscientist Su-Chun Zhang, whose Waisman Center lab grew the brain cells.

Parkinson’s disease damages neurons in the brain that produce dopamine, a brain chemical that transmits signals between nerve cells. The disrupted signals make it progressively harder to coordinate muscles for even simple movements and cause rigidity, slowness and tremors that are the disease’s hallmark symptoms. Patients — especially those in earlier stages of Parkinson’s — are typically treated with drugs like L-DOPA to increase dopamine production.

Standing at center, Su-Chun Zhang, professor of neuroscience in the School of Medicine and Public Health, talks with postdoctoral researcher Lin Yao as she prepares stem-cell cultures in Zhang’s research lab at the Waisman Center at the University of Wisconsin–Madison on March 8, 2013. (Photo by Jeff Miller/UW–Madison)


“Those drugs work well for many patients, but the effect doesn’t last,” says Marina Emborg, a Parkinson’s researcher at UW–Madison’s Wisconsin National Primate Research Center. “Eventually, as the disease progresses and their motor symptoms get worse, they are back to not having enough dopamine, and side effects of the drugs appear.”

Scientists have tried with some success to treat later-stage Parkinson’s in patients by implanting cells from fetal tissue, but research and outcomes were limited by the availability of useful cells and interference from patients’ immune systems. Zhang’s lab has spent years learning how to dial donor cells from a patient back into a stem cell state, in which they have the power to grow into nearly any kind of cell in the body, and then redirect that development to create neurons.

“The idea is very simple,” Zhang says. “When you have stem cells, you can generate the right type of target cells in a consistent manner. And when they come from the individual you want to graft them into, the body recognizes and welcomes them as their own.”

The application was less simple. More than a decade in the works, the new study began in earnest with a dozen rhesus monkeys several years ago. A neurotoxin was administered — a common practice for inducing Parkinson’s-like damage for research — and Emborg’s lab evaluated the monkeys monthly to assess the progression of symptoms.

“We evaluated through observation and clinical tests how the animals walk, how they grab pieces of food, how they interact with people — and also with PET imaging we measured dopamine production,” Emborg says. (PET is positron emission tomography, a type of medical imaging.) “We wanted symptoms that resemble a mature stage of the disease.”


Guided by real-time MRI that can be used during procedures — technology developed at UW–Madison by biomedical engineer Walter Block during the course of the Parkinson’s study — the researchers injected millions of dopamine-producing neurons and supporting cells into each monkey’s brain, targeting an area called the striatum. The striatum becomes depleted of dopamine as Parkinson’s destroys dopamine-producing neurons.

Half the monkeys received a graft made from their own induced pluripotent stem cells (called an autologous transplant). Half received cells from other monkeys (an allogeneic transplant). And that made all the difference.

Within six months, the monkeys that got grafts of their own cells were making significant improvements. Within a year, their dopamine levels had doubled and tripled.

“The autologous animals started to move more,” Emborg says. “Where before they needed to grab the cage to stand up, they started moving much more fluidly and grabbing food much faster and easier.”

The monkeys that received allogeneic cells showed no such lasting boost in dopamine or improvement in muscle strength and control, and the physical differences in the brains were stark. The axons — the extensions of nerve cells that reach out to carry electrical impulses to other cells — of the autologous grafts were long and intermingled with the surrounding tissue.

“They could grow freely and extend far out within the striatum,” says Yunlong Tao, a scientist in Zhang’s lab and first author of the study. “In the allogeneic monkeys, where the grafts are treated as foreign cells by the immune system, they are attacked to stop the spread of the axons.”


The missing connections leave the allogeneic graft walled off from the rest of the brain, denying it the opportunity to form contacts with brain systems beyond those that manage movement.

“Although Parkinson’s is typically classified as a movement disorder, anxiety and depression are typical, too,” Emborg says. “In the autologous animals, we saw extension of axons from the graft into areas that have to do with what’s called the emotional brain.”

Symptoms that resemble depression and anxiety — pacing, disinterest in others and even in favorite treats — abated after the autologous grafts grew in. The allogeneic monkeys’ symptoms remained unchanged or worsened.

The results are promising enough that Zhang hopes to begin work on applications for human patients soon. In particular, Zhang says, the work Tao did in the new study to help measure the relationship between symptom improvement, graft size and resulting dopamine production gives the researchers a predictive tool for developing effective human grafts.

SOURCE: news.wisc.edu

Study compares discrimination claims of younger and older Americans with cancer

David Strauser – professor, Department of Kinesiology and Community Health (Photo by L. Brian Stauffer)

Kinesiology and community health scientists found that younger and older adults with cancer differ in their experiences of employment discrimination.

CHAMPAIGN, Ill. — Researchers assessed the employment discrimination claims made by younger and older American adults with cancer and found substantial differences in the nature – and outcomes – of their claims.

Reported in the Journal of Cancer Survivorship, the research focused on Title I complaints made to the U.S. Equal Employment Opportunity Commission from 2009 to 2016. This included 1,001 claims from cancer survivors up to age 35 and 8,874 claims by adults over 35 with a history of cancer.

The Americans with Disabilities Act originally recognized that people with cancer and undergoing cancer treatment could experience declines in their physical and cognitive functioning. But these difficulties were thought to disappear at the end of treatment or when the cancer was in remission. The ADA was amended in 2009 to allow for the fact that even after treatment ends, people with a history of cancer and cancer treatment often experience lingering difficulties.

“Fatigue is the most common issue that people with cancer experience,” said David Strauser, a professor of kinesiology and community health at the University of Illinois Urbana-Champaign who led the new research. “Also, chemotherapy can affect their ability to concentrate, to focus on details or to process information as fast as they used to.”

Previous studies have found that “adult cancer survivors experience discrimination at a similar rate as other groups with disabilities,” Strauser said. While several studies have focused on older adults with cancer in the workplace, the employment discrimination experiences of younger adults with cancer have been overlooked, he said.

 A recent analysis of dozens of studies found that younger adult survivors of childhood cancer were nearly twice as likely to be unemployed as their healthy peers. Those with cancers of the central nervous system were nearly five times as likely to be unemployed.

All of the complaints that Strauser and his colleagues analyzed had been resolved by the EEOC – either by a finding of merit or with a determination that there was not enough evidence to proceed. The EEOC found that 26.6% of younger cancer survivors’ claims had merit. Older adults with a history of cancer had a higher success rate; 31.4% of their claims were found to have merit.
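Those merit rates can be sanity-checked with simple arithmetic. A two-proportion z-test on the approximate counts implied by the rounded percentages (a back-of-envelope sketch, not the study’s own statistical analysis) suggests the gap between the two groups is larger than chance alone would likely produce:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Z statistic for comparing two independent proportions (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Merit counts reconstructed from the rounded rates reported above (approximate)
younger_merit = round(0.266 * 1001)   # ~266 of 1,001 younger claims
older_merit   = round(0.314 * 8874)   # ~2,786 of 8,874 older claims

z = two_proportion_z(younger_merit, 1001, older_merit, 8874)
# |z| > 1.96 corresponds to p < 0.05 in a two-sided test
```

Here z comes out near -3.1, well beyond the conventional cutoff, so the difference in success rates is unlikely to be rounding noise; the published paper’s own methods and significance tests may of course differ.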

The primary complaints of older and younger adults with cancer involved what they saw as unfair working terms and conditions, harassment, discipline, failure to accommodate their disabilities and wrongful termination of their employment.

But younger cancer survivors were more likely than their older peers to claim discriminatory treatment in regard to opportunities for training and promotion. They also brought significantly more claims relating to reinstatement – being allowed to return to their jobs after taking leave for treatment – and the writing of references to potential future employers.

“What we’re seeing here is that younger cancer survivors have different needs related to employment than their older counterparts,” Strauser said. “Their discrimination claims tend to be related to issues around their career advancement.”

This finding suggests that employers may not be familiar with laws protecting the rights of people with disabilities that stem from chronic illness, Strauser said.

“I think employers get a lot of training and support on how to handle affirmative action issues and family leave for parents,” he said. “But when it comes to disability in relation to chronic illness, they tend to be less versed, and we don’t do a lot of training in that area. These results suggest we need to do more.”

The paper “The employment discrimination experiences of younger and older Americans with cancer under Title I of the Americans with Disabilities Act” is available online and from the U. of I. News Bureau.

DOI: 10.1007/s11764-020-00867-x

SOURCE: news.illinois.edu Credit: Diana Yates

Biodiversity Key to Protecting Bees

A bumble bee (Bombus impatiens) feeding on a calendula flower.

A new analysis of thousands of native and nonnative Michigan bees shows that the most diverse bee communities have the lowest levels of three common viral pathogens.

University of Michigan researchers netted and trapped more than 4,000 bees from 60 species. The bees were collected at winter squash farms across Michigan, where both managed honeybee colonies and wild native bees pollinate the squash flowers.

All but one species—Apis mellifera, the common European honeybee—are native bees. The number of bee species found at each farm ranged from seven to 49.

Consistently, lower virus levels were strongly linked to greater species richness among the local bee communities. The study was published online Feb. 11 in the journal Ecology.

An eastern bumblebee on a wingstem flower. European honeybees and eastern bumblebees showed the highest levels of three common viral pathogens in the University of Michigan study. Image credit: Michelle Fearon, University of Michigan.

“This result is exciting because it suggests that promoting diverse bee communities may be a win-win strategy to simultaneously reduce viral infections in managed honeybee colonies while helping to maintain native bee biodiversity,” said study lead author Michelle Fearon, a postdoctoral researcher in the University of Michigan Department of Ecology and Evolutionary Biology.

“In light of recent global pollinator population declines that are due in part to the spread of pathogens, these results offer hope that conservation efforts could also broadly benefit pollinator health,” said Fearon, who conducted the study for her doctoral dissertation. She is now pursuing a follow-up study that explores how natural areas keep pollinator communities healthy.

The Ecology study is the first to show that high levels of biodiversity within bee communities can help dilute the harmful effects of viral pathogens. Support for this “dilution effect” has been reported in other host–pathogen systems — such as tick-borne Lyme disease — but this is the first time it’s been seen with pollinator viruses. The idea of a dilution effect remains controversial among ecologists, however.

Fearon and her colleagues collected 4,349 bees at 14 Michigan winter squash farms over two summers. Michigan winter squashes include acorn squash, butternut squash, spaghetti squash and pumpkins.

Honeybees were found at all of the sites, and a diverse array of native bees were also present in the squash fields and along field edges. In fact, native pollinators were much more common visitors to the squash flowers than honeybees at most locations.

Four types of bees—the European honeybee, the eastern bumblebee (Bombus impatiens), the squash bee (Eucera pruinosa) and several species of sweat bee (genus Lasioglossum)—were the most consistently abundant species among the bee communities that were sampled.

Those four groups were tested for the presence of three viruses that commonly infect managed honeybee colonies: deformed wing virus, black queen cell virus and sacbrood virus.

These pathogens contribute to high rates of colony loss among honeybees, and there are no widely available treatments that beekeepers can use to control them. Previous studies suggested that native bees are less commonly infected and may be less likely to transmit the pathogens to other bees.

The viruses spread as bees move from flower to flower, gathering pollen and nectar and pollinating the plants in the process. Consumption of virus-contaminated pollen is believed to be a primary mode of transmission.

For each of the four target bee groups in the U-M study, researchers found that lower viral prevalence was strongly linked to greater biodiversity of the local bee community: the more bee species present, the lower the percentage of bees infected.

Species-rich communities included many native bee species, which apparently helped to dilute the impact of the pathogens.

“Native bees likely reduce the viral prevalence in pollinator communities because they are poorer viral hosts than honeybees. This means that some native bees don’t get as sick as honeybees and are less likely to spread the virus to other bees,” said study co-author Elizabeth Tibbetts, a professor in the U-M Department of Ecology and Evolutionary Biology who was Fearon’s dissertation adviser.

A male eastern bumblebee (Bombus impatiens) and a sweat bee on a cone flower. Bumblebees and sweat bees were part of a University of Michigan study showing that the most diverse bee communities have the lowest levels of three common viral pathogens. Image credit: Michelle Fearon, University of Michigan.

“So, bees from pollinator communities with lots of species are less likely to get sick because they are sharing flowers with many bee species that are less likely to spread the virus, while bees from communities dominated by honeybees are more likely to share flowers with honeybees that are good at spreading the virus,” Tibbetts said.
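The flower-sharing logic Tibbetts describes can be sketched as a toy calculation. In this deliberately simplified model (an illustration of the dilution idea only, with invented species counts and competence values, not the study’s statistical model), a bee’s exposure risk scales with the chance that a shared flower was visited by a highly competent viral host:

```python
def exposure_risk(counts, competence):
    """Chance that a randomly shared flower was visited by a competent
    (virus-spreading) host, assuming every bee makes equal flower visits."""
    total = sum(counts.values())
    return sum(competence[sp] * n / total for sp, n in counts.items())

# Hypothetical competence values: honeybees and bumblebees spread virus well,
# other native bees poorly (consistent with the pattern described above)
competence = {"honeybee": 0.9, "bumblebee": 0.5,
              "squash_bee": 0.1, "sweat_bee": 0.1}

poor_community = {"honeybee": 80, "bumblebee": 20}
rich_community = {"honeybee": 40, "bumblebee": 20,
                  "squash_bee": 20, "sweat_bee": 20}

risk_poor = exposure_risk(poor_community, competence)   # ≈ 0.82
risk_rich = exposure_risk(rich_community, competence)   # ≈ 0.50
```

Swapping poor-host native bees into the community lowers the exposure risk even though the total number of bees is unchanged, which is the essence of the dilution effect the study reports.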

Bees are indispensable pollinators, supporting both agricultural productivity and the diversity of flowering plants worldwide. In recent decades, both native bees and managed honeybee colonies have seen population declines blamed on multiple interacting factors including habitat loss, parasites and disease, and pesticide use.

“We found encouraging evidence that pollinator conservation efforts can broadly benefit the health of both managed honeybee colonies and native bees,” Fearon said. “This management strategy could be especially crucial in agricultural areas where crop flowers are visited by both honeybees and native bees—places that may be hot spots for viral transmission among bee species.”

Source: news.umich.edu Credit: Jim Erickson

Study Link (Ecology)

Abstract

Most pathogens are embedded in complex communities composed of multiple interacting hosts, but we are still learning how community‐level factors, such as host diversity, abundance, and composition, contribute to pathogen spread for many host–pathogen systems. Evaluating relationships among multiple pathogens and hosts may clarify whether particular host or pathogen traits consistently drive links between community factors and pathogen prevalence. Pollinators are a good system to test how community composition influences pathogen spread because pollinator communities are extremely variable and contain several multi‐host pathogens transmitted on shared floral resources. We conducted a field survey of four pollinator species to test the prevalence of three RNA viruses (deformed wing virus, black queen cell virus, and sacbrood virus) among pollinator communities with variable species richness, abundance, and composition. All three viruses showed a similar pattern of prevalence among hosts. Apis mellifera and Bombus impatiens had significantly higher viral prevalence than Lasioglossum spp. and Eucera pruinosa. In each species, lower virus prevalence was most strongly linked with greater pollinator community species richness. In contrast, pollinator abundance, species‐specific pollinator abundance, and community composition were not associated with virus prevalence. Our results support a consistent dilution effect for multiple viruses and host species. Pollinators in species‐rich communities had lower viral prevalence than pollinators from species‐poor communities, when accounting for differences in pollinator abundance. Species‐rich communities likely had lower viral prevalence because species‐rich communities contained more native bee species likely to be poor viral hosts than species‐poor communities, and all communities contained the highly competent hosts A. mellifera and B. impatiens. 
Interestingly, the strength of the dilution effect was not consistent among hosts. Instead, host species with low viral prevalence exhibited weaker dilution effects compared to hosts with high viral prevalence. Therefore, host species susceptibility and competence for each virus may contribute to variation in the strength of dilution effects. This study expands biodiversity–disease studies to the pollinator–virus system, finding consistent evidence of the dilution effect among multiple similar pathogens that infect ‘replicate’ host communities.

Scientists Seek to Understand “Holistic” Soil Microbiome via Sequencing

Research promises advances in agriculture


Laramy Enders (center) attends Purdue University’s first Microbiome Symposium in 2019. She and other Purdue microbiome researchers believe a holistic approach to microbiome research will soon offer opportunities to make advances in agriculture.

WEST LAFAYETTE, Ind. — For thousands of years, humans have altered — often negatively and inadvertently — microbial communities in a quest to improve agricultural crops. In recent years, knowledge about the roles microbes play in these systems has grown rapidly but is not yet to the point at which farmers and society have reaped benefits.

That’s primed to change, according to a group of Purdue University scientists who authored a review of agricultural microbiome work for the journal Nature Plants published this week. This is the first review of agricultural microbiome research that comprehensively combines knowledge about plant, soil and insect microbiome work to develop an integrated portrait of the complex interactions that will come into play as scientists attempt to harness microbes to improve crops. 

“We wanted to gather what we know about the microbiome in an agricultural context and see if that knowledge can be translated into actionable information for farmers,” said Lizzie French, a postdoctoral researcher in entomology and lead author of the paper. “Using emerging technologies in next-generation sequencing and digital agriculture, we are beginning to integrate microbial community interactions into our understanding of agriculture as a whole ecosystem, which will enable growers to work with nature to farm more sustainably.”

Many decisions that have affected agriculture were made without knowledge of microbiomes or how those microbe populations would be affected. Crops bred for improved yield have lost genes that help the plants interact beneficially with microbes; pesticides can alter the abundance of beneficial insects and the microbes they carry; and monoculture or limited crop rotations can alter microbial diversity.

Other practices have improved the abundance and diversity of microbe populations, including better crop rotations and cover cropping. Biotechnology companies have developed products that can add beneficial microbes to an agricultural ecosystem, reducing the need for fertilizers and pesticides. However, more work is needed to make these products viable alternatives to traditional agricultural inputs.

“Efforts focused on improving plant growth and yields have caused us to either lose microbial species or the ability of plants to interact with some beneficial microbes,” said Laramy Enders, an assistant professor of entomology and corresponding author of the paper. “We’ve started to learn more about the interconnected nature of these microbial communities, and as we look at this from a holistic point of view, we can see opportunities to improve crops in new ways.”

The authors believe these advances are first steps to developing tools that will improve our ability to customize and shape microbiomes to boost plant defenses and yields in specific crops and in specific locations. 

“We ultimately envision a decision-tree framework that will enable growers to make data-driven management decisions on the appropriate practices, cultivars and microbial inoculants to optimize the health of their crop and soil for their specific region and farming system,” they write in the Nature Plants paper. “These are exciting times for harmonizing efforts that harness the power and complexity of all interacting sectors of crop microbiomes to fuel a future of sustainable and healthy agroecosystems.”

The Purdue Applied Microbiome Sciences team has led efforts to pull together scientists to address microbiome research in an interdisciplinary way. The team’s Microbiome Symposium has brought the field’s leading experts to the West Lafayette campus to investigate ways to utilize the latest available microbiome data and technology.

Purdue’s Ian Kaplan, professor of entomology, Anjali Iyer-Pascuzzi, associate professor of botany and plant pathology, and Cindy Nakatsu, professor of agronomy, are co-authors of the paper, which was supported by Purdue Agriculture’s 2019 Elevating the Visibility of Agricultural Research: 150th Anniversary Review program.

Source: purdue.edu/newsroom

Credit: Brian Wallheimer


ABSTRACT

Emerging strategies for precision microbiome management in diverse agroecosystems

Elizabeth French, Ian Kaplan, Anjali Iyer-Pascuzzi, Cindy H. Nakatsu and Laramy Enders

doi.org/10.1038/s41477-020-00830-9

Substantial efforts to characterize the structural and functional diversity of soil, plant and insect-associated microbial communities have illuminated the complex interacting domains of crop-associated microbiomes that contribute to agroecosystem health. As a result, plant-associated microorganisms have emerged as an untapped resource for combating challenges to agricultural sustainability. However, despite growing interest in maximizing microbial functions for crop production, resource efficiency and stress resistance, research has struggled to harness the beneficial properties of agricultural microbiomes to improve crop performance. Here, we introduce the historical arc of agricultural microbiome research, highlighting current progress and emerging strategies for intentional microbiome manipulation to enhance crop performance and sustainability. We synthesize current practices and limitations to managing agricultural microbiomes and identify key knowledge gaps in our understanding of microbe-assisted crop production. Finally, we propose research priorities that embrace a holistic view of crop microbiomes for achieving precision microbiome management that is tailored, predictive and integrative in diverse agricultural systems.

Accelerating and Improving Research on Brain Injuries via Data Sharing

The cloud-computing web platform was created by IU scientists to support and publish reproducible neuroscience research

Scientists in the United States, Europe and South America are reporting how a new cloud-computing web platform allows scientists to track data and analyses on the brain, potentially reducing delays in discovery.

The project, called brainlife.io, is led by Franco Pestilli, associate professor in the Indiana University Bloomington College of Arts and Sciences’ Department of Psychological and Brain Sciences and a member of the IU Network Science Institute, in collaboration with colleagues across the university. At IU, it is speeding research on disorders such as dementia, sports concussion and eye disease.

The logo of brainlife.io, a cloud-computing web platform that allows scientists to track data and analyses on the brain.

A new paper on the project was published May 30 in the journal Scientific Data.

“Scientists are increasingly embracing modern technology to reduce human errors in scientific research practice,” said Pestilli, who established brainlife.io in 2017 with support from the National Science Foundation and Microsoft.

“This article describes a unique mechanism by which scientists across the world can share data and analyses, which allows them to reproduce research results and extend them to new frontiers of human understanding,” he added. “The benefit of such a platform is faster research on brain disease.”

The system manages all aspects of research where people are more likely than machines to make mistakes, such as keeping track of data and code for analyses, storing information, and producing visualizations.

At IU, brainlife.io is being used to advance multiple health care research studies.

The new paper provides a “case study” on how to generate a full research study, including data collection, analysis and visualization, on the brainlife.io platform. It also describes how the system preserves data and analyses in a single digital record to create reusable research assets for other scientists to use in their work.

“I like to refer to the new technology as a process of ‘data upcycling,'” Pestilli said. “The new records that scientists create and share on brainlife.io can be easily reused by others to go beyond the goals of the original study.”

For example, a study on traumatic brain injury could potentially combine data from a study on Alzheimer’s disease to understand underlying biological mechanisms in both conditions.

Importantly, Pestilli added, brainlife.io is designed to store and process data derived from diffusion-weighted magnetic resonance imaging — a form of imaging that uses water molecules in the brain to create a highly detailed roadmap of the nerve tracts in the brain — as well as tractography, a 3D modeling technique used to visually represent these nerve tracts and understand the network of connections that make up the brain.

Franco Pestilli. Photo by Eric Rudd, Indiana University

“The use of these imaging techniques has revolutionized knowledge about networks inside the brain and the impact of the brain’s white matter on human behavior and disease,” Pestilli said. These techniques also generate enormous amounts of data that require serious computer resources to store and analyze.

Some of this computing power comes from Microsoft, which chose brainlife.io as one of the first eight projects to benefit from the company’s initiative to award $3 million in compute credits to projects under the NSF’s Big Data Spokes and Planning projects, of which IU is a part. The project is also supported under the NSF’s BRAIN Initiative, a federal project to generate new physical and conceptual tools to understand the brain.

IU contributors to brainlife.io include Soichi Hayashi, a software engineer at the IU Pervasive Technology Institute; graduate students Brad Caron, Lindsey Kitchell, Brent McPherson and Dan Bullock; and undergraduate students Yiming Qian and Andrew Patterson. Bullock and McPherson were supported by grants from the National Institutes of Health and NSF. Additional authors on the article include researchers at Indiana University, the University of Michigan at Ann Arbor, Northwestern University, the University of Trento in Italy and CONICET in Argentina.

SOURCE: news.iu.edu Credit: Kevin Fryling

New data locates hundreds of millions of objects throughout space

Survey has mapped one-eighth of the skies, studying dark energy

This irregular dwarf galaxy, named IC 1613 and imaged by the Dark Energy Survey, contains some 100 million stars (bluish in this portrayal). It is a member of our Local Group of galaxy neighbors, a collection that also includes our Milky Way, the Andromeda spiral and the Magellanic Clouds. Credit: DES/NOIRLab/NSF/AURA. Acknowledgments: Image processing: DES, Jen Miller (Gemini Observatory/NSF’s NOIRLab), Travis Rector (University of Alaska Anchorage), Mahdi Zamani & Davide de Martin

A longstanding project designed to study dark energy throughout the cosmos has released a second data set showing 300 million objects throughout space, one of the largest data releases of its kind. Combined with an initial release, the survey has now cataloged about 700 million objects in the universe.

The data was released by the Dark Energy Survey, an international collaboration of about 500 scientists from the U.S., Europe and South America that is mapping hundreds of millions of galaxies and thousands of supernovae in an attempt to understand more about dark energy, the force causing the universe’s expansion to accelerate. The Ohio State University has played a primary role in the survey from the beginning.

The survey, started in 2013, has so far mapped about one-eighth of the skies.

The release was announced Friday, Jan. 15, at the American Astronomical Society’s annual meeting, held virtually this year.


“This release tells the world, ‘If you ever want to see any of these galaxies, here’s where they are and here’s what they look like,’” said Klaus Honscheid, a physics professor at Ohio State and member of Ohio State’s Center for Cosmology and Astroparticle Physics. “And people can use this info and do their own analysis – look for objects of a certain property or compare to theoretical models. This is really enabling a lot of people to do work now outside the DES collaboration.”

Last week’s release is the second from the Dark Energy Survey. The data builds on the 400 million objects cataloged in the survey’s previous data release, and improves on the first release by refining calibration techniques and including deeper combined images of the objects throughout space. That combination, scientists say, led to improved estimates of the amount and distribution of matter in the universe.

The data allows researchers to determine the size, shape and location of objects – most of them galaxies, but also quasars, stars, interstellar gas clouds and asteroids – throughout space, and to build a catalog of those objects.

“These images combine data from the same locations in the skies – images of the same spot, multiple times,” said Ami Choi, co-convener of the DES science working group on weak gravitational lensing and a CCAPP fellow. “And it’s three-dimensional, so it allows us to build a map that looks deep into this one part of the universe.”

The new data should make it easier for astronomers, astrophysicists and cosmologists – both professional and amateur – to locate those objects in the night skies, and to build models around the distance those objects may have moved away from one another over time.

“The catalog contains these objects with their properties and their location, and anyone else who has this information and a big enough telescope can go look at the location we specified and repeat these observations,” Honscheid said.
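The kind of outside analysis Honscheid describes amounts to filtering catalog records by their properties. A minimal sketch, using hypothetical records and field names (not the actual DES catalog schema):

```python
# A toy object catalog: each record carries an identifier, sky coordinates
# (right ascension and declination, in degrees), an object type, and a
# brightness (magnitude; smaller numbers mean brighter objects).
catalog = [
    {"id": "obj-001", "ra_deg": 52.1, "dec_deg": -28.4, "type": "galaxy", "mag": 21.3},
    {"id": "obj-002", "ra_deg": 52.3, "dec_deg": -28.1, "type": "quasar", "mag": 19.8},
    {"id": "obj-003", "ra_deg": 53.0, "dec_deg": -27.9, "type": "galaxy", "mag": 23.7},
]

# "Look for objects of a certain property": e.g., galaxies brighter than mag 22.
bright_galaxies = [o for o in catalog if o["type"] == "galaxy" and o["mag"] < 22.0]

for obj in bright_galaxies:
    # The coordinates are what let anyone with a big enough telescope
    # point at the same spot and repeat the observation.
    print(obj["id"], obj["ra_deg"], obj["dec_deg"])
```

A real query against the released data would go through the survey's own data-access tools, but the principle is the same: select by property, then look up the location.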

The expansion of the universe is the key to understanding dark energy. Previous work has shown that the universe has been expanding since its birth some 13.8 billion years ago, and that for approximately the last 7 billion years that expansion has been accelerating.

The next set of results from the survey is expected later this spring, said Jack Elvin-Poole, co-convener of the DES science working group on large scale structure and a CCAPP fellow.

Scientists at the Dark Energy Survey, including those at Ohio State, are still analyzing data released last week to discern what it might say about dark energy and the expansion of the universe. The data are online and available to the public; the survey’s scientists will make their analysis available after it is complete.

“We are very interested in cosmology – the history of the universe – so we are looking for these dark energy signatures in this data,” Honscheid said. “If you think about dark energy, it’s something that pulls the universe apart, that pushes objects further apart.”

The survey involves taking photographs of light produced by each object and analyzing the wavelengths of that light.

This analysis is built on a concept called “redshifting,” which gets its name from the way wavelengths of light lengthen as they travel through the expanding universe.

“The farther away something is in the universe, the longer its wavelength of light – and longer wavelengths appear red, while shorter wavelengths appear blue,” said Anna Porredon, a CCAPP fellow who worked on the survey. “Scientists who study the cosmos call that lengthening the redshift effect.”
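The lengthening Porredon describes is conventionally quantified as the redshift z = (λ_observed − λ_emitted) / λ_emitted. A short sketch of that arithmetic, using the well-known hydrogen-alpha spectral line and a made-up observed wavelength:

```python
def redshift(lambda_observed_nm: float, lambda_emitted_nm: float) -> float:
    """Redshift z: fractional stretching of a spectral line's wavelength."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

# Rest (laboratory) wavelength of the hydrogen-alpha line, in nanometers.
H_ALPHA_REST_NM = 656.3

# Suppose the same line arrives from a distant galaxy at 787.56 nm
# (a hypothetical value chosen for illustration):
z = redshift(787.56, H_ALPHA_REST_NM)
print(round(z, 2))  # 0.2 -- the wavelength was stretched by 20% in transit
```

Larger z means more stretching, which for cosmological sources generally means a more distant object, which is why redshift doubles as a distance indicator in surveys like DES.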

There are a number of other researchers at Ohio State who have worked on the DES project, including David Weinberg, Paul Martini, Chris Hirata and Ashley Ross.

SOURCE: NEWS.OSU.EDU Credit: Laura Arenschield

Uncovering hidden forever chemicals

New tool finds and fingerprints previously undetected PFAS compounds in watersheds on Cape Cod

Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) found large quantities of previously undetectable compounds from the family of chemicals known as PFAS in six watersheds on Cape Cod using a new method to quantify and identify PFAS compounds. Exposures to some PFAS, widely used for their ability to repel heat, water, and oil, are linked to a range of health risks including cancer, immune suppression, diabetes, and low infant birth weight.

The new testing method revealed large quantities of previously undetected PFAS from fire-retardant foams and other unknown sources. Total concentrations of PFAS present in these watersheds were above state maximum contaminant levels (MCLs) for drinking water safety. 

“We developed a method to fully capture and characterize all PFAS from fire-retardant foams, which are a major source of PFAS to downstream drinking water and ecosystems, but we also found large amounts of unidentified PFAS that couldn’t have originated from these foams,” said Bridger Ruyle, a graduate student at SEAS and first author of the study. “Traditional testing methods are completely missing these unknown PFAS.”

The research will be published in Environmental Science & Technology.

PFAS — per- and polyfluoroalkyl substances — are present in products ranging from fire retardant foams to non-stick pans. Nicknamed “forever chemicals” due to their long lifespan, PFAS have been building up in the environment since they were first used in the 1950s.   

Despite the associated health risks, there are no legally enforceable federal limits for PFAS chemicals in drinking water. The Environmental Protection Agency’s provisional health guidelines for public water supplies only cover PFOS and PFOA, two common types of PFAS. Massachusetts, along with a few other states, has gone further by including six PFAS in their new MCLs in drinking water. But there are thousands of PFAS chemical structures known to exist, several hundred of which have already been detected in the environment. 

“We’re simply not testing for most PFAS compounds, so we have no idea what our total exposure is to these chemicals and health data associated with such exposures are still lacking,” said Elsie Sunderland, the Gordon McKay Professor of Environmental Chemistry at SEAS and senior author of the paper. 

The standard testing methods used by the EPA and state regulatory agencies test for only 25 or fewer known compounds. The problem is that the overwhelming majority of PFAS compounds are proprietary, and regulatory agencies can’t find what they don’t know exists.

The new method developed by Sunderland and her team can overcome that barrier and account for all PFAS in a sample. 

CSI: PFAS

PFAS are made by bonding carbon and fluorine atoms, forming one of the strongest bonds in organic chemistry. Fluorine is one of the most abundant elements on Earth, but naturally occurring organic fluorine is exceedingly rare, produced only by a few poisonous plants in the Amazon and Australia. Therefore, any organofluorine detected in the environment is sure to be human-made.

PFAS compounds found in the environment come in two forms: a precursor form and a terminal form. Most of the monitored PFAS compounds, including PFOS and PFOA, are terminal compounds, meaning they will not degrade under normal environmental conditions. But precursor compounds, which often make up the majority of PFAS chemicals in a sample, can be transformed through biological or environmental processes into terminal forms. So, while the EPA or state agencies may monitor PFAS concentrations, they still are not detecting much of the huge pool of PFAS precursors. 

That’s where this new method comes in. 

Co-author Heidi Pickard conducts fieldwork in the Santuit River on Cape Cod. (Photo credit: M. Salerno, URI STEEP Superfund Research Program)

The researchers first measure all the organofluorine in a sample. Then, using another technique, they oxidize the precursors in that sample and transform them into their terminal forms, which they can then measure. From there, the team developed a method of statistical analysis to reconstruct the original precursors, fingerprint their manufacturing origin, and measure their concentration within the sample. 
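The three steps reduce to a simple mass balance: measure the total, identify what can be identified (directly or after oxidation), and attribute the remainder to unknown sources. A toy sketch with invented numbers and compound amounts, not the study's actual measurements:

```python
# Units: hypothetical nanomoles of fluorine per liter of sample.

# Step 1: measure all organofluorine in the sample.
total_organofluorine = 100.0

# Step 2: terminal PFAS measured directly, and the terminal forms recovered
# after oxidizing the precursors in a split of the same sample.
directly_measured = {"PFOS": 18.0, "PFOA": 7.0}
after_oxidation = {"PFOS": 30.0, "PFOA": 12.0, "PFHxA": 21.0}

# Step 3: bookkeeping. Everything recovered after oxidation is "identified";
# the excess over the direct measurements came from precursors; whatever is
# left of the total must come from unknown sources.
identified = sum(after_oxidation.values())                       # 63.0
from_precursors = identified - sum(directly_measured.values())   # 38.0
unexplained = total_organofluorine - identified                  # 37.0

print(f"precursor-derived: {from_precursors:.0f} nmol F/L")
print(f"unexplained: {unexplained:.0f} nmol F/L "
      f"({unexplained / total_organofluorine:.0%} of the total)")
```

The study's actual fingerprinting adds a statistical step to attribute the oxidized terminal forms back to specific precursor classes and manufacturing origins, but the unexplained fraction falls out of exactly this kind of accounting.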

“We’re essentially doing chemical forensics,” said Sunderland. 

Using this method, Sunderland and her team tested six watersheds on Cape Cod as part of a collaboration with the United States Geological Survey and a research center funded by the National Institutes of Health and led by the University of Rhode Island that focuses on the sources, transport, exposure and effects of PFAS.

The team focused on identifying PFAS from the use of fire-retardant foams. These foams, which are used extensively at military bases, civilian airports, and local fire departments, are a major source of PFAS and have contaminated hundreds of public water supplies across the US. 

The research team applied their forensic methods to samples collected between August 2017 and July 2019 from the Childs, Quashnet, Mill Creek, Marstons Mills, Mashpee and Santuit watersheds on Cape Cod. During the collection process, the team members had to be careful about what they wore, since waterproof gear is treated with PFAS; the team ended up wearing decades-old waders to prevent contamination.

The sampling sites in the Childs, Quashnet and Mill Creek watersheds are downstream from sources of PFAS from fire-retardant foams — the Quashnet and Childs from the Joint Base Cape Cod military facility, and Mill Creek from the Barnstable County Fire Training Academy.

Current tests can only identify about 50 percent of PFAS from historical foams — products that were discontinued in 2001 due to high levels of PFOS and PFOA — and less than 1 percent of PFAS from modern foams. 

Using their new method, Sunderland and her team were able to identify 100 percent of all PFAS compounds in the types of fire-retardant foams that were used for decades at Joint Base Cape Cod and Barnstable County Fire Training Academy.

“Our testing method was able to find these missing compounds that have been used by the chemical industry for more than 40 years,” said Sunderland. 

The tests also revealed huge quantities of PFAS from unknown sources. 

“Our accounting of PFAS from firefighting foams could not explain 37 to 77 percent of the organofluorine that we measured,” said Ruyle. “This has huge ramifications for not only our understanding of human exposure but also for how much PFAS is discharging into the ocean and accumulating in marine life.”

To follow up on these findings, Ruyle is currently working with NIH to identify some of the health impacts of PFAS from contemporary firefighting foams using toxicology studies.  Sunderland’s team is continuing to study the unknown PFAS to better identify their sources and potential for accumulation in abundant marine food webs on Cape Cod.

March 5, 2021

SOURCE: SEAS.HARVARD.EDU Credit: Leah Burrows

Air pollution puts children at higher risk of disease in adulthood, according to Stanford researchers

First-of-its-kind study reveals evidence that early exposure to dirty air alters genes in a way that could lead to adult heart disease, among other ailments. The findings could change the way medical experts and parents think about the air children breathe, and could inform clinical interventions.

Children exposed to air pollution, such as wildfire smoke and car exhaust, for as little as one day may be doomed to higher rates of heart disease and other ailments in adulthood, according to a new Stanford-led study. The analysis, published in Nature Scientific Reports, is the first of its kind to investigate air pollution’s effects at the single cell level and to simultaneously focus on both the cardiovascular and immune systems in children. It confirms previous research that bad air can alter gene regulation in a way that may impact long-term health – a finding that could change the way medical experts and parents think about the air children breathe, and inform clinical interventions for those exposed to chronic elevated air pollution.

View from above Fresno, California, an area with some of the country’s highest air pollution levels. (Image credit: Vadim Manuylov / Wikimedia Commons)

“I think this is compelling enough for a pediatrician to say that we have evidence air pollution causes changes in the immune and cardiovascular system associated not only with asthma and respiratory diseases, as has been shown before,” said study lead author Mary Prunicki, director of air pollution and health research at Stanford’s Sean N. Parker Center for Allergy & Asthma Research. “It looks like even brief air pollution exposure can actually change the regulation and expression of children’s genes and perhaps alter blood pressure, potentially laying the foundation for increased risk of disease later in life.”

The researchers studied a predominantly Hispanic group of children ages 6-8 in Fresno, California, a city beset with some of the country’s highest air pollution levels due to industrial agriculture and wildfires, among other sources. Using a combination of continuous daily pollutant concentrations measured at central air monitoring stations in Fresno, daily concentrations from periodic spatial sampling and meteorological and geophysical data, the study team estimated average air pollution exposures for 1 day, 1 week and 1, 3, 6 and 12 months prior to each participant visit. When combined with health and demographics questionnaires, blood pressure readings and blood samples, the data began to paint a troubling picture.

The researchers used a form of mass spectrometry to analyze immune system cells for the first time in a pollution study. The approach allowed for more sensitive measurements of up to 40 cell markers simultaneously, providing a more in-depth analysis of pollution exposure impacts than previously possible.

Among their findings: exposure over time to fine particulate matter known as PM2.5, as well as to carbon monoxide and ozone, is linked to increased methylation, an alteration of DNA molecules that can change their activity without changing their sequence. This change in gene expression may be passed down to future generations. The researchers also found that air pollution exposure correlates with an increase in monocytes, white blood cells that play a key role in the buildup of plaques in arteries and could possibly predispose children to heart disease in adulthood. Future studies are needed to verify the long-term implications.

Hispanic children bear an unequal burden of health ailments, especially in California, where they are exposed to higher traffic-related pollution levels than non-Hispanic children. Among Hispanic adults, the prevalence of uncontrolled hypertension is higher than among other races and ethnicities in the U.S., making it all the more important to determine how air pollution will affect long-term health risks for Hispanic children.

Overall, respiratory diseases are killing more Americans each year, and rank as the second most common cause of deaths globally.

“This is everyone’s problem,” said study senior author Kari Nadeau, director of the Parker Center. “Nearly half of Americans and the vast majority of people around the world live in places with unhealthy air. Understanding and mitigating the impacts could save a lot of lives.”

Source: Stanford News Service Credit: Rob Jordan, 2/12/2021


Deep learning may help doctors choose better lung cancer treatments

Deep learning, a powerful machine learning model, could guide doctors and healthcare workers in weighing treatment and care options, according to a team of Penn State Great Valley researchers.

MALVERN, Pa. — Doctors and healthcare workers may one day use a machine learning model, called deep learning, to guide their treatment decisions for lung cancer patients, according to a team of Penn State Great Valley researchers.

In a study, the researchers report that they developed a deep learning model that, under certain conditions, was more than 71% accurate in predicting the survival expectancy of lung cancer patients, significantly better than the traditional machine learning models the team tested, which had an accuracy rate of about 61%.

Information on a patient’s survival expectancy could help guide doctors and caregivers in making better decisions on using medicines, allocating resources and determining the intensity of care for patients, according to Youakim Badr, associate professor of data analytics.

“This is a high-performance system that is highly accurate and is aimed at helping doctors make these important decisions about providing care to their patients,” said Badr. “Of course, this tool can’t be used as a substitute for a doctor in making decisions on lung cancer treatments.”

According to Robin G. Qiu, professor of information science and engineering and an affiliate of the Institute for Computational and Data Sciences, the model can analyze a large amount of data — typically called features in machine learning — that describe the patients and the disease to understand how a combination of factors affect lung cancer survival periods. Features can include information such as types of cancer, size of tumors, the speed of tumor growth, and demographic data.

What is deep learning?

Deep learning may be uniquely suited to tackle lung cancer prognosis because the model can provide the robust analysis necessary in cancer research, according to the researchers, who report their findings in International Journal of Medical Informatics. Deep learning is a type of machine learning that is based on artificial neural networks, which are generally modeled on how the human brain’s own neural network functions.

In deep learning, however, developers apply a sophisticated structure of multiple layers of these artificial neurons, which is why the model is referred to as “deep.” The learning aspect of deep learning comes from how the system learns from connections between data and labels, said Badr.

“Deep learning is a machine-learning algorithm that makes associations between the data, itself, and the labels that we use to describe the data examples,” said Badr. “By making these associations, it learns from the data.”

Qiu added that deep learning’s structure offers several advantages for many data science tasks, especially when confronted with data sets that have a large number of records — in this case, patients — as well as a large number of features.

“It improves performance tremendously,” said Qiu. “In deep learning we can go deeper, which is why they call it that. In traditional machine learning, you have a simple structure of layers of neural networks. In each layer, you have a group of cells. In deep learning, there are many layers of these cells that can be architected into a sophisticated structure to perform better feature transformation and extraction, which gives you the ability to further improve the accuracy of any model.”
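Qiu's picture of many layers of cells transforming features can be sketched as a forward pass through stacked weight matrices, each followed by a nonlinearity. This is a minimal illustration with random weights and hypothetical layer sizes, not the authors' model; the 150-feature input merely echoes the roughly 150 fields mentioned later in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """The nonlinearity applied after each layer's linear transformation."""
    return np.maximum(0.0, x)

def forward(x, layers):
    """Pass input features through a stack of (weights, bias) layers."""
    for W, b in layers:
        x = relu(x @ W + b)  # each layer re-represents (transforms) the features
    return x

# Two hidden layers: 150 input features -> 64 -> 16. A "deeper" model would
# simply stack more (weights, bias) pairs into this list.
layers = [
    (rng.normal(size=(150, 64)), np.zeros(64)),
    (rng.normal(size=(64, 16)), np.zeros(16)),
]

patient_features = rng.normal(size=(1, 150))  # one synthetic patient record
embedding = forward(patient_features, layers)
print(embedding.shape)  # (1, 16)
```

In a real prognosis model, a final layer would map this 16-dimensional representation to a survival prediction, and the weights would be learned from labeled data rather than drawn at random.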

In the future, the researchers would like to improve the model and test its ability to analyze other types of cancers and medical conditions.

“The accuracy rate is good, but it’s not perfect, so part of our future work is to improve the model,” said Qiu.

To further improve their deep learning model, the researchers would also need to connect with domain experts, who are people who have specific knowledge. In this case, the researchers would like to connect with experts on specific cancers and medical conditions.

“In a lot of cases, we might not know a lot of features that should go into the model,” said Qiu. “But, by collaborating with domain experts, they could help us collect important features about patients that we might not be aware of and that would further improve the model.”

The researchers analyzed data from the Surveillance, Epidemiology, and End Results (SEER) program. The SEER dataset is one of the biggest and most comprehensive databases of early-diagnosis information for cancer patients in the United States, according to Shreyesh Doppalapudi, a graduate-student research assistant and first author of the paper. The program’s cancer registries cover almost 35% of U.S. cancer patients.

“One of the really good things about this data is that it covers a large section of the population and it’s really diverse,” said Doppalapudi. “Another good thing is that it covers a lot of different features, which you can use for many different purposes. This becomes very valuable, especially when using machine learning approaches.”

Doppalapudi added that the team compared several deep learning approaches, including artificial neural networks, convolutional neural networks and recurrent neural networks, to traditional machine learning models. The deep learning approaches performed much better than the traditional machine learning methods, he said.

Deep learning architecture is better suited to processing large, diverse datasets such as SEER’s, according to Doppalapudi. Working with these datasets requires robust computational capacity; in this study, the researchers relied on ICDS’s Roar supercomputer.

With about 800,000 to 900,000 entries in the SEER dataset, the researchers said that manually finding these associations in the data with an entire team of medical researchers would be extremely difficult without assistance from machine learning.

“If it were only three fields, I would say it would be impossible — and we had about 150 fields,” said Doppalapudi. “Understanding all of those different fields, and then reading and learning from that information, would be impossible.”

SOURCE: NEWS.PSU.EDU Credit: Matt Swayne

TIRED OF THE Agenda and the HYPE? SO ARE WE.

There was a time when the media published Science as part of their normal, routine reporting. We’re here to bring back Science News reporting with no agenda.

You can read anything, anywhere on the internet. You can find “Fact-Checkers” who inject their own agenda into what you can and cannot read. We’re here to provide the bare-bones updates on Science from around the world. If there are two sides to a story, we’ll give both equal time.

Why do we do this?

  • Because major media outlets answer to their advertisers, not to their readers.
  • Because it’s hard to find Science News that isn’t tied to a particular agenda.

Subscribe to sign up, and be sure to share the stories that matter to you. If you like a story, give the author a Kudoz – or a few of them – and let them know they did a good job.

If you have a Science story idea, send it to newstip@ipak-edu.org and we’ll look into it!

-Science Weekly US Staff
