
Currents

September 01, 2015

In Research

Earth’s magnetic field is 500 million years older than previously thought

[Graphic: magnetic fields between Earth and the sun]

Since 2010, the best estimate of the age of Earth’s magnetic field has been 3.45 billion years. But now a researcher responsible for that finding has new data showing the magnetic field is far older.

John Tarduno, professor of earth and environmental sciences and a leading expert on Earth’s magnetic field, and his team of researchers say they believe the Earth’s magnetic field is at least four billion years old.

“A strong magnetic field provides a shield for the atmosphere,” says Tarduno. “This is important for the preservation of habitable conditions on Earth.”

The findings by Tarduno and his team were published in the latest issue of the journal Science.

Earth’s magnetic field protects the atmosphere from the solar wind—streams of charged particles shooting from the sun. The magnetic field helps prevent the solar wind from stripping away the atmosphere and water that make life on the planet possible.

The field is generated in Earth’s liquid iron core, and that “geodynamo” requires a regular release of heat from the planet to operate. Today, that heat release is aided by plate tectonics, which efficiently transfers heat from the deep interior of the planet to the surface. But, according to Tarduno, the time of origin of plate tectonics is hotly debated, with some scientists arguing that Earth lacked a magnetic field during its youth.

Given the importance of the magnetic field, scientists have been trying to determine when it first arose, which could, in turn, provide clues as to when plate tectonics got started and how the planet was able to remain habitable.

Fortunately for scientists, there are minerals—such as magnetite— that lock in the magnetic field record at the time the minerals cooled from their molten state. The oldest available minerals can tell scientists the direction and the intensity of the field at the earliest periods of Earth’s history.

Tarduno’s new results are based on the record of magnetic field strength fixed within magnetite found within zircon crystals collected from the Jack Hills of Western Australia. The zircons formed over a span of more than a billion years and came to rest in an ancient sedimentary deposit. By sampling zircons of different ages, researchers can trace the history of the magnetic field.

Read more at www.rochester.edu/newscenter.

Understanding GPS may help you hit a curve ball

[Photo: a pitcher throws a ball]
An algorithm that helps our brain track motion can be tricked by the pattern motion of an object, such as the seams on a spinning baseball, which causes our brain to “see” the ball suddenly drop from its path when, in reality, it curves steadily.

Our brains track moving objects by applying one of the algorithms your phone’s GPS (Global Positioning System) uses, according to University researchers. The algorithm also explains why we are fooled by several motion-related optical illusions, including the sudden “break” of baseball’s well-known “curveball illusion.”

The new open-access study published in Proceedings of the National Academy of Sciences shows that the brain applies an algorithm, known as a Kalman filter, when tracking an object’s position. The algorithm helps the brain process less-than-perfect visual signals, such as when objects move to the periphery of our visual field, where acuity is low.

However, the same algorithm that helps the brain track motion can be tricked by the pattern motion of an object, such as the seams on a spinning baseball, which causes the brain to “see” the ball suddenly drop from its path when, in reality, it curves steadily.

Though we often rely on GPS to get us to our destination, the accuracy of the system is limited. When the signal is “noisy” or unreliable, your phone’s GPS uses algorithms, including the Kalman filter, to estimate the location of your car based on its past position and speed.
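The idea behind the Kalman filter described above can be sketched in a few lines of code: at each step, the filter predicts where the object should be based on its past position and speed, then blends that prediction with the new, noisy measurement, weighting each by how much it is trusted. The following one-dimensional illustration is a simplified sketch; the noise values and velocity are made-up assumptions for demonstration, not figures from the study.

```python
# Minimal 1-D Kalman filter: estimate position from noisy readings
# by blending a constant-velocity prediction with each measurement.
# All noise parameters here are illustrative assumptions.

def kalman_track(measurements, dt=1.0, velocity=1.0,
                 process_var=1e-3, measurement_var=0.5):
    x = measurements[0]   # initial position estimate
    p = 1.0               # initial estimate uncertainty (variance)
    estimates = []
    for z in measurements:
        # Predict: assume the object moved forward at the known velocity.
        x = x + velocity * dt
        p = p + process_var
        # Update: blend the prediction with the noisy measurement.
        k = p / (p + measurement_var)   # Kalman gain (trust in measurement)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of an object moving one unit per step (true: 1,2,3,4,5).
noisy = [1.2, 1.9, 3.1, 4.0, 4.8]
smoothed = kalman_track(noisy)
```

When a measurement is very noisy, the gain `k` is small and the filter leans on its prediction; when the prediction is uncertain, `k` grows and the measurement dominates. The “curveball illusion” arises when the prediction side of this trade-off wins out over degraded peripheral input.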

“Like GPS, our visual ability, although quite impressive, has many limitations,” says the study’s coauthor, Duje Tadin, associate professor of brain and cognitive sciences.

Most of the time our vision does a really good job, but in some cases, such as a breaking curveball, the optimal solution that our brain comes up with belies the actual behavior—and trajectory—of the ball, and the result is an optical illusion.

Therefore, Tadin says, you have a better chance of hitting a curveball by realizing that our brains, like GPS, can lead us to “see” changes in speed or direction that don’t actually occur when the ball moves from the center of our visual field to the periphery.

Read more at www.rochester.edu/newscenter.

A healthy social life in your 20s can lead to longer life

A new study shows that the quantity of social interactions a person has at 20—and the quality of social relationships that person has at age 30—can benefit his or her wellbeing later in life.

The 30-year longitudinal study also suggests that people with poor social connections show an increased risk for early mortality.

In the study, published in Psychology and Aging, author Cheryl Carmichael, who conducted the research as a PhD candidate in psychology at Rochester, says that “having few social connections is equivalent to tobacco use, and it’s higher than for those who drink excessive amounts of alcohol, or who suffer from obesity.” Researchers found that frequent social interactions at age 20 were beneficial later in life because they built a diverse social tool set to draw on in later years.

However, the study also indicates that having a high number of social interactions at age 30 confers no psychosocial benefits later. Researchers were also surprised to find that socially active 20-year-olds did not necessarily go on to form quality relationships at age 30—the age at which quality social engagement appears to have had the greatest impact on their lives.

Read more at www.rochester.edu/newscenter.

Scientists take aim at genetically lethal strain of leukemia

In July, scientists at the Wilmot Cancer Institute developed what they believe to be the first mouse model to investigate why a certain subset of acute myeloid leukemia (AML) responds poorly to chemotherapy in some patients.

Approximately 15 percent of AML patients harbor a mutation in the RUNX1 gene, and, in those patients, standard treatment is unable to eradicate leukemic cells from their bone marrow, where the cancer is rooted. Scientists do not fully understand the underlying mechanisms protecting the residual AML cells.

An article published in the journal PLOS ONE by corresponding author Jason Mendler, assistant professor of medicine, suggests that a genetically defined mouse model of RUNX1-mutated AML, developed at Wilmot, is the ideal platform to investigate the cellular mechanisms protecting residual AML cells in this molecular subtype of the disease.

Mendler’s laboratory has already begun a preclinical collaboration with a drug company that has a therapy to target a pathway found to be overactive in RUNX1-mutated AML cases. Successful therapeutic approaches in these models will be moved into clinical trials for patients as soon as possible.

“We believe our mouse model will allow us to quickly define new ways to target this challenging disease,” Mendler says.

Read more at urmc.rochester.edu/news.

Study: Virtual research studies feasible

A pilot study in Parkinson’s disease suggests a new era of clinical research that removes the barrier of distance for both scientists and volunteers. The research, published in Digital Health, could also enable researchers to leverage the rapid growth in personal genetic testing to better diagnose and potentially treat a wide range of diseases.

“These findings demonstrate that remote recruitment and conduct of research visits is feasible and well received by participants,” says lead author Ray Dorsey, who holds the David M. Levy Professorship in Neurology. “Direct-to-consumer genetic testing, when paired with telemedicine, has the potential to involve more people in clinical research and accelerate the process of identifying the genetic causes and variations in chronic diseases such as Parkinson’s.”

Working with 23andMe and the Michael J. Fox Foundation for Parkinson’s Research, the researchers recruited 50 individuals in 23 states who agreed to undergo a remote assessment consisting of cognitive and motor tests via secure video conferencing and complete a survey.

The study found that physicians at a single site were able to successfully and rapidly diagnose and categorize patients located across the country. Findings could point the way for new and more cost-effective methods to recruit participants for clinical trials and make participation in clinical research more convenient.

Babies’ expectations may help brain development

Researchers at the Rochester Baby Lab and the University of South Carolina found that infants can use their expectations about the world to rapidly shape their developing brains.

The study, published in the Proceedings of the National Academy of Sciences on July 20, reports that in experiments with infants five to seven months old, the portions of babies’ brains responsible for visual processing responded not just to the presence of visual stimuli, but also to the expectation of visual stimuli.

“We show that in situations of learning and situations of expectations, babies are in fact able to really quickly use their experience to shift the ways different areas of their brains respond to the environment,” says Lauren Emberson, assistant professor in psychology at Princeton University, who conducted the study at Rochester’s Baby Lab while a research associate in the Department of Brain and Cognitive Sciences.

Researchers used functional near-infrared spectroscopy, a technology that uses light to measure oxygenation in regions of the brain, to assess brain activity as infants were exposed to sounds and images.

Read more at www.rochester.edu/newscenter.

Nursing home care for minorities improves

A new study of nursing homes has found that, while disparities continue to exist, the quality of care in homes with higher concentrations of racial and ethnic minority residents has improved and that the progress appears to be linked to increases in Medicaid payments.

“Racial and ethnic disparities in quality of care have long been documented in nursing homes,” says Yue Li, associate professor in the Department of Public Health Sciences and lead author of the study, which was published in the journal Health Affairs. “This study shows that recent regulatory-, financial-, and market-driven changes have resulted in an improvement not only in homes with higher numbers of minorities but across the board.”

There are an estimated 1.3 million older and disabled Americans receiving care in some 15,000 nursing homes across the nation. Over the past 20 years, the number of African-American, Hispanic, and Asian individuals in nursing homes has increased rapidly, and these populations now make up nearly 20 percent of nursing home residents. While the number has risen, nursing homes remain segregated, and homes with high concentrations of racial and ethnic minorities tend to have limited financial resources, make do with lower nurse staffing, and provide a lower level of care.

State Medicaid programs are the dominant source of funding for nursing homes, providing roughly half of total payments for long-term care. In recent years states have attempted to influence the quality of care by increasing reimbursement rates and linking those payments to improvements.

The researchers looked at data over a six-year period from more than 14,000 nursing homes. The data, which is compiled by Brown University and the Centers for Medicare and Medicaid Services, tracks approximately 180 federal quality standards, including clinical care, patient safety, quality of life, the physical state of the facilities, and the quality of administrative staff.

Read more at urmc.rochester.edu/news.

Stress in low-income families can affect children’s learning

Children living in low-income households who endure family instability and emotionally distant caregivers are at risk of having impaired cognitive abilities, according to University researchers.

A study of 201 low-income mother-child pairs, conducted at Mt. Hope Family Center, tracked the levels of the stress hormone cortisol in the children at ages two, three, and four. It found that specific forms of family adversity are linked to both elevated and low levels of cortisol in children. Children with either elevated or low cortisol levels also had lower than average cognitive ability at age four.

“What we were interested in seeing is whether specific risk factors of children living in poverty might be related to children’s cortisol levels,” says lead author Jennifer Suor, a PhD candidate in clinical psychology. “Then we looked to see if the hormone levels are predictive of significant differences in the children’s ability to think.”

The study, published in Child Development, shows that children in low-income, stressful home environments—specifically homes with family instability and harsh, disengaged mothers—can have adverse levels of cortisol in their bodies, which previous studies have associated with damaging effects on the structure and function of children’s brains.

How cortisol affects the brain’s cognitive abilities, though, is still unclear. Researchers hypothesize that too much cortisol can have toxic effects on parts of the brain that are important for cognitive functioning, while too little might hinder the body’s ability to recruit the biological resources necessary for optimal cognitive functioning.

Read more at www.rochester.edu/newscenter.
