Chapter 7. Life-Span Development of the Brain and Behavior
By Alice Klein Blame grandpa. A study in mice shows that the grandsons of obese males are more susceptible to the detrimental health effects of junk food, even if their fathers are lean and healthy. The finding adds to evidence that new traits can be passed down the family line without being permanently recorded in a family’s genes – a phenomenon called transgenerational epigenetics. Last year, a study found that the DNA in the sperm of obese men is modified in thousands of places, and that these sperm also contain short pieces of RNA. These are epigenetic modifications – they don’t affect the precise code of genes, but instead may affect how active particular genes are. Now Catherine Suter at Victor Chang Cardiac Research Institute in Sydney and her team have investigated the longer-term effects of paternal obesity. To do this, they mated obese male mice with lean female mice. They found that, compared with the offspring of lean males, both the sons and grandsons of the obese males were more likely to show the early signs of fatty liver disease and diabetes when given a junk food diet. The same effect wasn’t seen in daughters or granddaughters. Even when the sons of the obese males were fed a healthy diet and kept at a normal weight, their sons still had a greater tendency to develop obesity-related conditions when exposed to a junk diet. © Copyright Reed Business Information Ltd.
By William Kenower My youngest son, Sawyer, used to spend far more time relating to his imagination than he did to the world around him. He would run back and forth humming, flapping his hands and thumping on his chest. By the time he was in first grade, attempts to draw him out of his pretend world to join his classmates or do some class work led to explosions and timeouts. At 7 he was given a diagnosis of being on the autism spectrum. That was when my wife, Jen, learned about the practice called joining. The idea behind it, which she discovered in Barry Neil Kaufman’s book “Son-Rise,” is brilliant in its simplicity. We wanted Sawyer to be with us. We did not want him to live in this bubble of his own creation. And so, instead of telling him to stop pretending and join us, we started pretending and joined him. The first time Jen joined him, the first time she ran beside him humming and thumping her chest, he stopped running, stopped thumping, stopped humming and, without a single word from us, turned to her and said, “What are you doing?” We took turns joining him every day, and a week later we got an email from his special education teacher telling us to keep doing whatever we were doing. He’d gone from five timeouts a day to one in a week. The classroom was the same, the work was the same – all that was different was that we had found a way to say to him in a language he could understand, “You’re not wrong.” Emboldened by our success, we set about becoming more fluent in this language. For the next couple of years we taught ourselves to join him constantly. This meant that whatever we were doing had to stop whenever we heard him running back and forth and humming. But we could not join him simply to get him to stop running and thumping and humming. We had to join him without any judgment or impatience. That was the trickiest part. The desire to fix him was great. I had come to believe that there were broken people in need of fixing. 
Sometimes, I looked like one of those people. I was a 40-year-old unpublished writer working as a waiter. My life reeked of failure. Many days I looked in the mirror and asked, “What is wrong with me?” © 2016 The New York Times Company
Link ID: 22451 - Posted: 07.16.2016
James M. Broadway “Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects' sense that time was speeding up did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American
Helen Haste The American psychologist and educationist Jerome Bruner, who has died aged 100, repeatedly challenged orthodoxies and generated novel directions. His elegant, accessible writing reached wide audiences. His colleague Rom Harré described his lectures as inspiring: “He darted all over the place, one topic suggested another and so on through a thrilling zigzag.” To the charge that he was always asking impossible questions, Jerry replied: “They are pretty much impossible, but the search for the impossible is part of what intelligence is about.” He was willing to engage with controversy, both on academic issues and in education politics. Blind at birth because of cataracts, Jerry gained his sight after surgery at the age of two. He credited this for his sense that we actively interpret and organise our world rather than passively react to it – a theme that he continued to develop in different ways. His first work lay in perception, when he resumed research at Harvard after the second world war. He found that children’s judgments of the size of coins and coin-like disks varied: poorer children overestimated the size of the coins. This contributed to the emerging “new look” movement in psychology, involving values, intentions and interpretation in contrast to the then dominant behaviourist focus on passive learning, reward and punishment. His professorship at Harvard came in 1952, and by the middle of the decade a computer metaphor began to influence psychology – the “cognitive revolution”. With Jacqueline Goodnow and George Austin, Jerry published A Study of Thinking (1956). © 2016 Guardian News and Media Limited
By Andy Coghlan There once was a brainy duckling. It could remember whether the shapes or colours it saw just after hatching were the same as or different from each other. The feat surprised the researchers, who were initially sceptical about whether the ducklings could grasp such complex concepts as “same” and “different”. The fact that they could suggests the ability to think in an abstract way may be far more common in nature than expected, and not just restricted to humans and a handful of animals with big brains. “We were completely surprised,” says Alex Kacelnik at the University of Oxford, who conducted the experiment along with his colleague Antone Martinho III. Kacelnik and Martinho reasoned that ducklings might be able to grasp patterns relating to shape or colour as part of the array of sensory information they absorb soon after hatching. Doing so would allow them to recognise their mothers and siblings and distinguish them from all others – abilities vital for survival. In ducklings, goslings and other species that depend for survival on following their mothers, newborns learn quickly – a process called filial imprinting. Kacelnik wondered whether this would enable them to be tricked soon after hatching into “following” objects or colours instead of their natural mother, and recognising those same patterns in future. © Copyright Reed Business Information Ltd.
Laura Sanders If you’ve ever watched a baby purse her lips to hoot for the first time, or flash a big, gummy grin when she sees you, or surprise herself by rolling over, you’ve glimpsed the developing brain in action. A baby’s brain constructs itself into something that controls the body, learns and connects socially. Spending time with an older person, you may notice signs of slippage. An elderly man might forget why he went into the kitchen, or fail to anticipate the cyclist crossing the road, or muddle medications with awkward and unfamiliar names. These are the signs of the gentle yet unrelenting neural erosion that comes with normal aging. These two seemingly distinct processes — development and aging — may actually be linked. Hidden in the brain-building process, some scientists now suspect, are the blueprints for the brain’s demise. The way the brain is built, recent research suggests, informs how it will decline in old age. That the end can be traced to the beginning sounds absurd: A sturdily constructed brain stays strong for decades. During childhood, neural pathways make connections in a carefully choreographed order. But in old age, this sequence plays in reverse, brain scans reveal. In both appearance and behavior, old brains seem to drift backward toward earlier stages of development. What’s more, some of the same cellular tools are involved in both processes. © Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22440 - Posted: 07.14.2016
By Anahad O'Connor Like most of my work, this article would not have been possible without coffee. I’m never fully awake until I have had my morning cup of espresso. It makes me productive, energized and what I can only describe as mildly euphoric. But as one of the millions of caffeine-loving Americans who can measure out my life with coffee spoons (to paraphrase T.S. Eliot), I have often wondered: How does my coffee habit affect my health? The health community can’t quite agree on whether coffee is more potion or poison. The American Heart Association says the research on whether coffee causes heart disease is conflicting. The World Health Organization, which for years classified coffee as “possibly” carcinogenic, recently reversed itself, saying the evidence for a coffee-cancer link is “inadequate.” National dietary guidelines say that moderate coffee consumption may actually be good for you – even reducing the risk of chronic disease. Why is there so much conflicting evidence about coffee? The answer may be in our genes. About a decade ago, Ahmed El-Sohemy, a professor in the department of nutritional sciences at the University of Toronto, noticed the conflicting research on coffee and the widespread variation in how people respond to it. Some people avoid it because just one cup makes them jittery and anxious. Others can drink four cups of coffee and barely keep their eyes open. Some people thrive on it. Dr. El-Sohemy suspected that the relationship between coffee and heart disease might also vary from one individual to the next. And he zeroed in on one gene in particular, CYP1A2, which controls an enzyme – also called CYP1A2 – that determines how quickly our bodies break down caffeine. One variant of the gene causes the liver to metabolize caffeine very quickly. People who inherit two copies of the “fast” variant – one from each parent – are generally referred to as fast metabolizers.
Their bodies metabolize caffeine about four times more quickly than people who inherit one or more copies of the slow variant of the gene. These people are called slow metabolizers. © 2016 The New York Times Company
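The fast/slow distinction described above reduces to a simple genotype rule: two copies of the fast variant make a fast metabolizer, while one or more copies of the slow variant make a slow metabolizer. A minimal sketch of that rule follows; the function name and the "F"/"S" allele labels are illustrative placeholders, not notation from any real genetics library.

```python
def caffeine_metabolizer_type(allele1: str, allele2: str) -> str:
    """Classify CYP1A2 caffeine metabolism from two inherited alleles.

    "F" = fast variant, "S" = slow variant (labels are illustrative).
    Two fast copies -> fast metabolizer; one or more slow copies ->
    slow metabolizer, as described in the article.
    """
    if allele1 == "F" and allele2 == "F":
        return "fast"
    return "slow"

# A single copy of the slow variant is enough to classify as slow:
print(caffeine_metabolizer_type("F", "F"))  # fast
print(caffeine_metabolizer_type("F", "S"))  # slow
print(caffeine_metabolizer_type("S", "S"))  # slow
```

Because the slow variant dominates the phenotype here, the rule behaves like a dominant-recessive trait at a single locus, which is exactly why inheriting one fast copy is not enough.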
By Tara Parker-Pope Hoping to alert parents to “red flags” that might signal autism, two advocacy groups yesterday launched a Web site, the ASD Video Glossary, that provides online glimpses of kids with autism to worried parents. But some experts fear the site, though well intentioned, also may cause anxiety among parents whose children are perfectly fine. The site contains videos that show subtle differences in how kids with autism speak, react, play and express themselves. The organizations behind it, Autism Speaks and First Signs, hope that parents who see resemblances in their own kids will be emboldened to seek early diagnosis and treatment, which many experts believe can improve outcomes for kids with autism. Visitors to the new site must register in order to watch the videos, and in the first two hours of its release, more than 10,000 people did so. Yet some researchers fear the video glossary is certain to be troubling for the parents of children without autism, too, because the behavior of kids without the condition can resemble that depicted in the videos. “Just as there’s a spectrum in autism…there’s a spectrum in normal development,” Dr. Michael Wasserman, a pediatrician at Ochsner Medical Center in New Orleans, told the Associated Press. “Children don’t necessarily develop in a straight line.” But Amy Wetherby, a professor of communications disorders at Florida State University who helped create the site, said the videos would embolden parents to persist when doctors don’t listen to legitimate concerns about a child’s behavior. As she told the Associated Press, sometimes “parents are the first to be concerned, and the doctors aren’t necessarily worried. This will help give them terms to take to the doctor and say, ‘I’m worried about it.’” © 2016 The New York Times Company
Link ID: 22432 - Posted: 07.13.2016
By Maggie Koerth-Baker When former Tennessee women’s basketball coach Pat Summitt died Tuesday morning, news outlets, including ESPN, reported the cause of her death as “early-onset dementia, Alzheimer’s type.” That’s more than just a long-winded way of saying “Alzheimer’s.” By using five words instead of one, journalists were trying to point a big, flashing neon arrow at the complex realities of dementia. Dementia is more of a symptom than a diagnosis, and it can be caused by a number of different diseases. Even Alzheimer’s, the most common type of dementia, doesn’t seem to have a single cause. Instead, what ties Summitt to millions of other Alzheimer’s patients all over the world is the physical damage it wrought in her brain. Worldwide, 47.5 million people are living with some kind of dementia. Alzheimer’s represents 60 percent to 70 percent of those cases. Imagine a map of a city — roads branching out, intersecting with other roads, creating a network that allows mail to be delivered, food to be sold and brought home, people to get to their jobs. What would happen to that town if random intersections were suddenly barricaded and impassable? That’s the dystopian chaos Alzheimer’s causes, as damaged proteins clog the neurons and inhibit the flow of information from one neuron to another. Cut off from food, as well as data, the cells die. The brain shrinks. Eventually, the person dies, too. Afterward, doctors can cut into the brain and see the barriers, which are called plaques.
Link ID: 22426 - Posted: 07.12.2016
By Edd Gent The devastating neurodegenerative condition Alzheimer's disease is incurable, but with early detection, patients can seek treatments to slow the disease's progression, before some major symptoms appear. Now, by applying artificial intelligence algorithms to MRI brain scans, researchers have developed a way to automatically distinguish between patients with Alzheimer's and two early forms of dementia that can be precursors to the memory-robbing disease. The researchers, from the VU University Medical Center in Amsterdam, suggest the approach could eventually allow automated screening and assisted diagnosis of various forms of dementia, particularly in centers that lack experienced neuroradiologists. Additionally, the results, published online July 6 in the journal Radiology, show that the new system was able to classify the form of dementia that patients were suffering from, using previously unseen scans, with up to 90 percent accuracy. "The potential is the possibility of screening with these techniques so people at risk can be intercepted before the disease becomes apparent," said Alle Meije Wink, a senior investigator in the center's radiology and nuclear medicine department. "I think very few patients at the moment will trust an outcome predicted by a machine," Wink told Live Science. "What I envisage is a doctor getting a new scan, and as it is loaded, software would be able to say with a certain amount of confidence [that] this is going to be an Alzheimer's patient or [someone with] another form of dementia." © 2016 Scientific American
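The article does not specify which algorithm the Amsterdam team used, but the general pattern it describes is standard supervised learning: extract numeric features from each scan, train a classifier on labeled patients, and measure accuracy on previously unseen cases. The toy sketch below illustrates only that pattern, on synthetic stand-in "scan features"; the feature count, group labels, and choice of a random-forest model are assumptions for illustration, not the study's actual method.

```python
# Toy sketch of scan classification: fit a classifier on labeled
# feature vectors, then score it on held-out ("previously unseen")
# cases. Synthetic data only; NOT the VU University method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
labels = ["Alzheimer's", "precursor dementia A", "precursor dementia B"]

# Fake "MRI features": 100 patients per group, 50 features each,
# with each group's feature means shifted so classes are separable.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(100, 50))
               for i in range(len(labels))])
y = np.repeat(labels, 100)

# Hold out a test set so accuracy reflects unseen patients.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y)

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # fraction of held-out cases classified correctly
print(f"held-out accuracy: {accuracy:.2f}")
```

The essential point the sketch makes is the train/test split: the "up to 90 percent accuracy" figure in the article is only meaningful because it was measured on scans the system had never seen during training.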
Link ID: 22425 - Posted: 07.12.2016
By David Dobbs It’s difficult to tell what Gina Pace wants unless you already know what she wants. But sometimes that’s easy, and this is one of those times: Gina wants pizza. “I-buh!” she says repeatedly—her version of “I want.” We all do. We are sitting at Abate’s in New Haven, Connecticut, a town famous for—among other things—pizza and science. Gina and her father, Bernardo, who live on Staten Island in New York City, have made the two-hour drive here for both. The pizza is in the oven. The science is already at the table, represented by Abha Gupta, a developmental pediatrician at Yale’s renowned Child Study Center. Gupta is one of the few scientific experts on a condition that Bernardo and Gina know through hard experience. Gina, now 24, was diagnosed 20 years ago with childhood disintegrative disorder, or CDD. CDD is the strangest and most unsettling developmental condition you have probably never heard of. Also known as Heller’s syndrome, for the Austrian special educator who first described it in 1908, it is a late-blooming, viciously regressive form of autism. It’s rare, striking about 1 or 2 in every 100,000 children. After developing typically for two to 10 years (the average is three or four), a child with CDD will suffer deep, sharp reversals along multiple lines of development, which may include language, social skills, play skills, motor skills, cognition, and bladder or bowel control. The speed and character of this reversal varies, but it often occurs in a horrifyingly short period—as short as a couple of months, says Gupta. In about 75 percent of cases, this loss of skills is preceded by days or weeks in which the child experiences intense anxiety and even terror: nightmares and waking nightmares and bouts of confused, jumpy disturbance that resemble psychosis.
By Jane E. Brody To stem the current epidemic of obesity, there’s no arguing with the adage that an ounce of prevention is worth a pound of cure. As every overweight adult knows too well, shedding excess pounds and keeping them off is far harder than putting them on in the first place. But assuring a leaner, healthier younger generation may often require starting even before a baby is born. The overwhelming majority of babies are lean at birth, but by the time they reach kindergarten, many have acquired excess body fat that sets the stage for a lifelong weight problem. Recent studies indicate that the reason so many American children become overweight is far more complicated than consuming more calories than they burn, although this is certainly an important factor. Rather, preventing children from acquiring excess body fat may have to start even before their mothers become pregnant. Researchers are tracing the origins of being overweight and obese as far back as the pre-pregnancy weight of a child’s mother and father, and their explanations go beyond simple genetic inheritance. Twenty-three genes are known to increase the risk of becoming obese. These genes can act very early in development to accelerate weight gain in infancy and during middle childhood. In the usual weight trajectory, children are born lean, get chubby during infancy, then become lean again as toddlers when they grow taller and become more active. Then, at or before age 10 or so, body fat increases in preparation for puberty – a phenomenon called adiposity rebound. In children with obesity genes, “adiposity rebound occurs earlier and higher,” said Dr. Daniel W. Belsky, an epidemiologist at Duke University School of Medicine. “They stop getting leaner sooner and start putting on fat earlier and put on more of it.” © 2016 The New York Times Company
By Aviva Rutkin At first glance, she was elderly and delicate – a woman in her 90s with a declining memory. But then she sat down at the piano to play. “Everybody in the room was totally startled,” says Eleanor Selfridge-Field, who researches music and symbols at Stanford University. “She looked so frail. Once she sat down at the piano, she just wasn’t frail at all. She was full of verve.” Selfridge-Field met this woman, referred to as ME to preserve her privacy, at a Christmas party around eight years ago. ME, who is now aged 101, has vascular dementia: she rarely knows where she is, and doesn’t recognise people she has met in the last few decades. But she can play nearly 400 songs by ear – a trick that depends on tapping into a memory of previously stored musical imprints – and continues to learn new songs just by listening to them. She has even composed an original piece of her own. ME’s musical talent, despite her cognitive impairments, inspired Selfridge-Field to spend the last six years observing her, and she presented her observations today at the International Conference on Music Perception and Cognition in San Francisco, California. ME experienced a stroke-like attack when she was in her 80s, and a few years later was diagnosed with vascular dementia. She struggles most to remember events and encounters that are recent, and her memory is selective, focusing on specific periods – such as her childhood between the ages of 3 and 8. She can recognise people that she met before the age of about 75 to 80. She is never quite sure of her surroundings. © Copyright Reed Business Information Ltd.
Link ID: 22420 - Posted: 07.11.2016
Beatrice Alexandra Golomb Statins can indeed produce neurological effects. These drugs are typically prescribed to lower cholesterol and thereby reduce the risk of heart attack and stroke. Between 2003 and 2012 roughly one in four Americans aged 40 and older were taking a cholesterol-lowering medication, according to the Centers for Disease Control and Prevention. But studies show that statins can influence our sleep and behavior—and perhaps even change the course of neurodegenerative conditions, including dementia. The most common adverse effects include muscle symptoms, fatigue and cognitive problems. A smaller proportion of patients report peripheral neuropathy—burning, numbness or tingling in their extremities—poor sleep, and greater irritability and aggression. Interestingly, statins can produce very different outcomes in different patients, depending on an individual's medical history, the statin and the dose. Studies show, for instance, that statins generally reduce the risk of ischemic strokes—which arise when a blocked artery or blood clot cuts off oxygen to a brain region—but can also increase the risk of hemorrhagic strokes, or bleeding into the brain. Statins also appear to alter aggression, increasing it in some patients and decreasing it in others. In 2015 my colleagues and I observed that women taking statins, on average, showed increased aggression; men typically showed less, possibly because of reduced testosterone levels. Some men in our study did experience a marked increase in aggression, which was correlated with worsening sleep. © 2016 Scientific American
by Adriana Heguy, molecular biologist and genomics researcher: Interestingly, tongue-curling ability is not solely genetic, and the genetic component may be very small. Monozygotic (identical) twins are not always concordant for tongue-curling ability, so if there is a genetic component, it’s clearly not Mendelian. In other words, it’s not a trait coded by one single gene, and it’s clearly influenced by the environment—in this case, practice. But for some reason this is one of the “myths” about genetics that gets spread around in high school, where it is used as an example of a simple Mendelian trait with a simple dominant-recessive nature. It’s hard to comment on the evolutionary purpose of an ability so heavily influenced by the environment, and not obviously useful. There are many traits for which we do not have the faintest idea why they exist or what purpose they serve. In the case of tongue-curling, it’s possible that it’s a case of fine motor control of the tongue. We need to be able to move our tongues to not bite them when we eat, for example, and for swirling food around. For unknown reasons, some individuals are better than others at controlling tongue movement. And since the ability can be acquired by practicing (though not everybody apparently succeeds), it does seem likely that it is indeed a question of motor control. Most people are able to do it. It’s quite common. But it could be that evolution had nothing to do with it. Or it could be a spandrel; in other words, a side effect of evolution. Maybe the evolution of dexterity or finer motor control of other muscles resulted in tongue “dexterity.” It’s possible that it is an atavism: increased tongue muscle control may once have been useful for tasting or eating certain kinds of foods millions of years ago, and the ability has not disappeared because the developmental program for fine muscle control is still there.
By Louise Whiteley It’s an appealing idea: the notion that understanding the learning brain will tell us how to maximise children’s potential, bypassing the knotty complexities of education research. But promises to replace sociological complexity with biological certainty should always be treated with caution. Hilary and Steven Rose are deeply sceptical of claims that neuroscience can inform education and early intervention policy, and deeply concerned about the use of such claims to support neoliberal agendas. They argue that focusing on the brain encourages a focus on the individual divorced from their social context, and that this is easily aligned with a view of poor achievement as a personal moral failing, rather than a practical consequence of poverty and inequality. Whether or not you end up cheerleading for the book’s political agenda, its deconstruction of faulty claims about how neuroscience translates into the classroom is relevant to anyone interested in education. The authors tear apart the scientific logic of policy documents, interrogate brain-based interventions and dismantle prevalent neuro-myths. One of the book’s meatiest chapters deals with government reports advocating early intervention to increase “mental capital”, and thus reduce the future economic burden of deprived, underachieving brains. As we discover, the neuroscientific foundations of these reports are shaky. For instance, they tend to assume that the more synaptic connections between brain cells the better, and that poor environment in a critical early period permanently reduces the number of synapses. This makes early intervention focusing on the individual child and “poor parenting” seem like the obvious solution. But pruning of synapses is just as important to brain development, and learning involves the continual forming and reforming of synaptic connections. More is not necessarily better. 
And while an initial explosion in synapses can be irreversibly disrupted by extreme neglect, the evidence just isn’t there yet for extrapolating this to the more common kinds of childhood deprivation that such reports address.
By Jessica Hamzelou TEENAGE pregnancies have hit record lows in the Western world, largely thanks to increased use of contraceptives of all kinds. But strangely, we don’t really know what hormonal contraceptives – pills, patches and injections that contain synthetic sex hormones – are doing to the developing bodies and brains of teenage girls. You’d be forgiven for assuming that we do. After all, the pill has been around for more than 50 years. It has been through many large trials assessing its effectiveness and safety, as have the more recent patches and rings, and the longer-lasting implants and injections. But those studies were done in adult women – very few have been in teenage girls. And biologically, there is a big difference. At puberty, our bodies undergo an upheaval as our hormones go haywire. It isn’t until our 20s that things settle down and our brains and bones reach maturity. “If a drug is going to be given to 11 and 12-year-olds, it needs to be tested in 11 and 12-year-olds,” says Joe Brierley of the clinical ethics committee at Great Ormond Street Hospital in London. Legislation introduced in the US in 2003 and in Europe in 2007 was intended to make this happen but a New Scientist investigation can reveal that there is still scant data on what contraceptives actually do to developing girls. The few studies that have been done suggest that tipping the balance of oestrogen and progesterone during this time may have far-reaching effects, although there is not yet enough data to say whether we should be alarmed. © Copyright Reed Business Information Ltd.
By ERICA GOODE Irving Gottesman, a pioneer in the field of behavioral genetics whose work on the role of heredity in schizophrenia helped transform the way people thought about the origins of serious mental illness, died on June 29 at his home in Edina, Minn., a suburb of Minneapolis. He was 85. His wife, Carol, said he died while taking an afternoon nap. Although Dr. Gottesman had some health problems, she said, his death was unexpected, and several of his colleagues said they received emails from him earlier that day. Dr. Gottesman was perhaps best known for a study of schizophrenia in British twins he conducted with another researcher, James Shields, at the Maudsley Hospital in London in the 1960s. The study, which found that identical twins were more likely than fraternal twins to share a diagnosis of schizophrenia, provided strong evidence for a genetic component to the illness and challenged the notion that it was caused by bad mothering, the prevailing view at the time. But the findings also underscored the contribution of a patient’s environment: If genes alone were responsible for schizophrenia, the disorder should afflict both members of every identical pair; instead, it appeared in both twins in only about half of the identical pairs in the study. This interaction between nature and nurture, Dr. Gottesman believed, was critical to understanding human behavior, and he warned against tilting too far in one direction or the other in explaining mental illness or in accounting for differences in personality or I.Q. © 2016 The New York Times Company
By David Shultz Making eye contact for an appropriate length of time is a delicate social balancing act: too short, and we look shifty and untrustworthy; too long, and we seem awkward and overly intimate. To make this Goldilocks-like dilemma even trickier, it turns out that different people prefer to lock eyes for different amounts of time. So what’s too long or too short for one person might be just right for another. In a new study, published today in Royal Society Open Science, researchers asked a group of 498 volunteers to watch a video of an actor staring out from a screen and press a button if their gazes met for an uncomfortably long or short amount of time. During the test, the movement of their eyes and the size of their pupils were recorded with eye-tracking technology. On average, participants had a “preferred gaze duration” of 3.3 seconds, give or take 0.7 seconds. That’s a pretty narrow band for someone on their first date! Making things even harder, individual preferences can also be measured: Researchers found that how quickly people’s pupils dilated—an automatic reflex whenever someone looks into the eyes of another—was a good indicator of how long they wanted to gaze. The longer their preferred gaze, the faster their pupils expanded. The differences are so subtle, though, that they can only be seen with the eye-tracking software—so any attempt to game the system is likely to end up awkward rather than informative. © 2016 American Association for the Advancement of Science.
Carl Zimmer Our genes are not just naked stretches of DNA. They’re coiled into intricate three-dimensional tangles, their lengths decorated with tiny molecular “caps.” These so-called epigenetic marks are crucial to the workings of the genome: They can silence some genes and activate others. Epigenetic marks are crucial for our development. Among other functions, they direct a single egg to produce the many cell types, including blood and brain cells, in our bodies. But some high-profile studies have recently suggested something more: that the environment can change your epigenetic marks later in life, and that those changes can have long-lasting effects on health. In May, Duke University researchers claimed that epigenetics could explain why people who grow up poor are at greater risk of depression as adults. Even more provocative studies suggest that when epigenetic marks change, people can pass them to their children, reprogramming their genes. But criticism of these studies has been growing. Some researchers argue that the experiments have been weakly designed: Very often, they say, it’s impossible for scientists to confirm that epigenetics is responsible for the effects they see. Three prominent researchers recently outlined their skepticism in detail in the journal PLoS Genetics. The field, they say, needs an overhaul. “We need to get drunk, go home, have a bit of a cry, and then do something about it tomorrow,” said John M. Greally, one of the authors and an epigenetics expert at the Albert Einstein College of Medicine in New York. © 2016 The New York Times Company