Most Recent Links
By Matthew Hutson, Veronique Greenwood For some things, such as deciding whether to take a new job or nab your opponent's rook in chess, you're better off thinking long and hard. For others, such as judging your interviewer's or opponent's emotional reactions, first instincts are best—or so traditional wisdom suggests. But new research finds that careful reflection actually makes us better at assessing others' feelings. The findings could improve how we deal with bosses, spouses, friends and, especially, strangers. We would have trouble getting through the day or even a conversation if we couldn't tell how other people were feeling. And yet this ability, called empathic accuracy, eludes introspection. “We don't think too hard about the exact processes we engage in when we do it,” says Christine Ma-Kellams, a psychologist at the University of La Verne in California, “and we don't necessarily know how accurate we are.” Recently Ma-Kellams and Jennifer Lerner of Harvard University conducted four studies, all published in 2016. In one experiment, participants imagined coaching an employee for a particular job. When told to help the employee get better at reading others' emotions, most people recommended thinking “in an intuitive and instinctive way” as opposed to “in an analytic and systematic way.” When told to make employees worse at the task, the participants recommended the opposite. And yet later experiments suggested this coaching was off base. For instance, in another experiment, professionals in an executive-education program took a “cognitive reflection test” to measure how much they relied on intuitive versus systematic thinking. The most reflective thinkers were most accurate at interpreting their partners' moods during mock interviews. Systematic thinkers also outperformed intuiters at guessing the emotions expressed in photographs of eyes. © 2017 Scientific American
Many people think of fish and seafood as being healthy. However, new research suggests eating certain species that tend to have high levels of mercury may be linked to a greater risk of developing amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. Questions remain about the possible impact of mercury in fish, according to a preliminary study released Monday that will be presented at the American Academy of Neurology's 69th annual meeting in Boston in April. Fish and seafood consumption as a regular part of the diet was not associated with ALS, the study said. "For most people, eating fish is part of a healthy diet," said study author Elijah Stommel of Dartmouth College in Hanover, N.H., and a fellow of the academy. In addition, the authors said their study does not negate the fact that eating fish provides many health benefits. Instead, it suggests people may want to choose species that are known to have a lower mercury content, and avoid consuming fish caught in waters where there is mercury contamination. The researchers stressed that more research is needed before fish consumption guidelines for neurodegenerative illness can be made. While the exact cause of ALS is not known, some previous studies have suggested the neurotoxic metal to be a risk factor for ALS, a progressive neurological disease. ©2017 CBC/Radio-Canada.
By Greta Keenan The ocean might seem like a quiet place, but listen carefully and you might just hear the sounds of the fish choir. Most of this underwater music comes from soloist fish, repeating the same calls over and over. But when the calls of different fish overlap, they form a chorus. Robert McCauley and colleagues at Curtin University in Perth, Australia, recorded vocal fish in the coastal waters off Port Hedland in Western Australia over an 18-month period, and identified seven distinct fish choruses, happening at dawn and at dusk. The low “foghorn” call is made by the Black Jewfish (Protonibea diacanthus) while the grunting call that researcher Miles Parsons compares to the “buzzer in the Operation board game” comes from a species of Terapontid. The third chorus is a quieter batfish that makes a “ba-ba-ba” call. “I’ve been listening to fish squawks, burble and pops for nearly 30 years now, and they still amaze me with their variety,” says McCauley, who led the research. Sound plays an important role in various fish behaviours such as reproduction, feeding and territorial disputes. Nocturnal predatory fish use calls to stay together to hunt, while fish that are active during the day use sound to defend their territory. “You get the dusk and dawn choruses like you would with the birds in the forest,” says Steve Simpson, a marine biologist at the University of Exeter, UK. © Copyright Reed Business Information Ltd.
By WINNIE HU Ruth Brunn finally said yes to marijuana. She is 98. She pops a green pill filled with cannabis oil into her mouth with a sip of vitamin water. Then Ms. Brunn, who has neuropathy, settles back in her wheelchair and waits for the jabbing pain in her shoulders, arms and hands to ebb. “I don’t feel high or stoned,” she said. “All I know is I feel better when I take this.” Ms. Brunn will soon have company. The nursing home in New York City where she lives, the Hebrew Home at Riverdale, is taking the unusual step of helping its residents use medical marijuana under a new program to treat various illnesses with an alternative to prescription drugs. While the staff will not store or administer pot, residents are allowed to buy it from a dispensary, keep it in locked boxes in their rooms and take it on their own. From retirement communities to nursing homes, older Americans are increasingly turning to marijuana for relief from aches and pains. Many have embraced it as an alternative to powerful drugs like morphine, saying that marijuana is less addictive, with fewer side effects. For some people, it is a last resort when nothing else helps. Marijuana, which is banned by federal law, has been approved for medical use in 29 states, including New York, and the District of Columbia. Accumulating scientific evidence has shown its effectiveness in treating certain medical conditions. Among them: neuropathic pain, severe muscle spasms associated with multiple sclerosis, unintentional weight loss, and vomiting and nausea from chemotherapy. There have also been reports that pot has helped people with Alzheimer’s disease and other types of dementia as well as Parkinson’s disease. © 2017 The New York Times Company
By Robert F. Service Predicting color is easy: Shine a light with a wavelength of 510 nanometers, and most people will say it looks green. Yet figuring out exactly how a particular molecule will smell is much tougher. Now, 22 teams of computer scientists have unveiled a set of algorithms able to predict the odor of different molecules based on their chemical structure. It remains to be seen how broadly useful such programs will be, but one hope is that they may help fragrance makers and food producers design new odorants with precisely tailored scents. This latest smell prediction effort began with a recent study by olfactory researcher Leslie Vosshall and colleagues at The Rockefeller University in New York City, in which 49 volunteers rated the smell of 476 vials of pure odorants. For each one, the volunteers labeled the smell with one of 19 descriptors, including “fish,” “garlic,” “sweet,” or “burnt.” They also rated each odor’s pleasantness and intensity, creating a massive database of more than 1 million data points for all the odorant molecules in their study. When computational biologist Pablo Meyer learned of the Rockefeller study 2 years ago, he saw an opportunity to test whether computer scientists could use it to predict how people would assess smells. Besides working at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, Meyer heads something called the DREAM challenges, contests that ask teams of computer scientists to solve outstanding biomedical problems, such as predicting the outcome of prostate cancer treatment based on clinical variables or detecting breast cancer from mammogram data. “I knew from graduate school that olfaction was still one of the big unknowns,” Meyer says. Even though researchers have discovered some 400 separate odor receptors in humans, he adds, just how they work together to distinguish different smells remains largely a mystery. © 2017 American Association for the Advancement of Science
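In spirit, the contest entries learn a mapping from a vector of chemical-structure descriptors to perceptual labels. A deliberately tiny nearest-neighbour sketch of that idea follows; the molecule names are real odorants, but the three-number "feature vectors" and their labels are invented placeholders (real entries used thousands of computed chemical descriptors and far more sophisticated models).

```python
import math

# Toy 1-nearest-neighbour sketch of structure-to-smell prediction.
# The feature vectors below are invented stand-ins for real chemical
# descriptors; this only illustrates the shape of the problem.
TRAIN = {
    # molecule: (feature vector, odor descriptor)
    "limonene": ((0.9, 0.1, 0.3), "sweet"),
    "allicin":  ((0.2, 0.8, 0.1), "garlic"),
    "furfural": ((0.4, 0.2, 0.9), "burnt"),
}

def predict_descriptor(features):
    """Label a new molecule with the descriptor of its nearest
    training molecule in feature space (Euclidean distance)."""
    _, (_, label) = min(
        TRAIN.items(),
        key=lambda kv: math.dist(kv[1][0], features),
    )
    return label
```

A molecule whose (hypothetical) descriptors sit close to furfural's would be labeled "burnt", for example: `predict_descriptor((0.3, 0.3, 0.95))`.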
By JANE E. BRODY “Feeling My Way Into Blindness,” an essay published in The New York Times in November by Edward Hoagland, an 84-year-old nature and travel writer and novelist, expressed common fears about the effects of vision loss on quality of life. Mr. Hoagland, who became blind about four years ago, projected deep-seated sadness in describing the challenges he faces of pouring coffee, not missing the toilet, locating a phone number, finding the food on his plate, and knowing to whom he is speaking, not to mention shopping and traveling, when he often must depend on the kindness of strangers. And, of course, he sorely misses nature’s inspiring vistas and inhabitants that fueled his writing, though he can still hear birds chatter in the trees, leaves rustle in the wind and waves crash on the shore. Mr. Hoagland is hardly alone in his distress. According to Action for Blind People, a British support organization, those who have lost some or all sight “struggle with a range of emotions — from shock, anger, sadness and frustration to depression and grief.” When eyesight fails, some people become socially disengaged, leading to isolation and loneliness. Anxiety about a host of issues — falls, medication errors, loss of employment, social blunders — is common. A recent study from researchers at the Wilmer Eye Institute at Johns Hopkins University School of Medicine found that most Americans regard loss of eyesight as the worst ailment that could happen to them, surpassing such conditions as loss of limb, memory, hearing or speech, or having H.I.V./AIDS. Indeed, low vision ranks behind arthritis and heart disease as the third most common chronic cause of impaired functioning in people over 70, Dr. Eric A. Rosenberg of Weill Cornell Medical College and Laura C. Sperazza, a New York optometrist, wrote in American Family Physician. © 2017 The New York Times Company
By Michael Price BOSTON--Among mammals, primates are unique in that certain species have three different types of light-sensitive cone cells in their eyes rather than two. This allows humans and their close relatives to see what we think of as the standard spectrum of color. (Humans with red-green color blindness, of course, see a different spectrum.) The standard explanation for why primates developed trichromacy, as this kind of vision is called, is that it allowed our early ancestors to see colorful ripe fruit more easily against a background of mostly green forest. A particular Old World monkey, the rhesus macaque (pictured), has a genetic distinction that offers a convenient natural test of this hypothesis: a common genetic variation makes some females have three types of cone cells and others have two. Studies with captive macaques have shown that trichromatic females are faster than their dichromatic peers at finding fruit, but attempts to see whether that’s true for wild monkeys have been complicated by the fact that macaques are hard to find, and age and rank also play big roles in determining who eats when. A vision researcher reported today at the annual meeting of AAAS, which publishes Science, that after making more than 20,000 individual observations of 80 different macaques feeding from 30 species of trees on Cayo Santiago, Puerto Rico, she can say with confidence that wild trichromatic female monkeys do indeed appear to locate and eat fruit more quickly than dichromatic ones, lending strong support to the idea that this advantage helped drive the evolution of trichromacy in humans and our relatives. © 2017 American Association for the Advancement of Science.
JoAnna Klein Some microscopes today are so powerful that they can create a picture of the gap between brain cells, which is thousands of times smaller than the width of a human hair. They can even reveal the tiny sacs carrying even tinier nuggets of information to cross over that gap to form memories. And in colorful snapshots made possible by a giant magnet, we can see the activity of 100 billion brain cells talking. Decades before these technologies existed, a man hunched over a microscope in Spain at the turn of the 20th century was making prescient hypotheses about how the brain works. At the time, William James was still developing psychology as a science and Sir Charles Scott Sherrington was defining our integrated nervous system. Meet Santiago Ramón y Cajal, an artist, photographer, doctor, bodybuilder, scientist, chess player and publisher. He was also the father of modern neuroscience. “He’s one of these guys who was really every bit as influential as Pasteur and Darwin in the 19th century,” said Larry Swanson, a neurobiologist at the University of Southern California who contributed a biographical section to the new book “The Beautiful Brain: The Drawings of Santiago Ramón y Cajal.” “He’s harder to explain to the general public, which is probably why he’s not as famous.” Last month, the Weisman Art Museum in Minneapolis opened a traveling exhibit that is the first dedicated solely to Ramón y Cajal’s work. It will make stops in Minneapolis; Vancouver, British Columbia; New York; Cambridge, Mass.; and Chapel Hill, N.C., through April 2019. Ramón y Cajal started out with an interest in the visual arts and photography — he even invented a method for making color photos. But his father pushed him into medical school. Without his artistic background, his work might not have had as much impact, Dr. Swanson said. © 2017 The New York Times Company
By Nathaniel P. Morris Cardiovascular disease and mental illness are among the top contributors to death and disability in the United States. At first glance, these health conditions seem to lie at opposite ends of the medical spectrum: Treating the heart is often associated with lab draws, imaging and invasive procedures, whereas treating the mind conjures up notions of talk therapy and subjective checklists. Yet researchers are discovering some surprising ties between cardiac health and mental health. These connections have profound implications for patient care, and doctors are paying attention. Depression has become recognized as a major issue for people with heart disease. Studies have found that between 17 and 44 percent of patients with coronary artery disease also have major depression. According to the American Heart Association, people hospitalized for a heart attack are roughly three times as likely as the general population to experience depression. As many as 40 percent of patients undergoing coronary artery bypass surgery suffer from depression. Decades of research suggest these illnesses may actually cause one another. For example, patients with heart disease are often sick and under stressful circumstances, which can foster depressive symptoms. But depression itself is also a risk factor for developing heart disease. Researchers aren’t sure why, but something about being depressed — possibly a mix of factors including inflammatory changes and behavior changes — appears to increase risk of heart disease. © 1996-2017 The Washington Post
Emotions are a cognitive process that relies on “higher-order states” embedded in cortical (conscious) brain circuits; emotions are not innately programmed into subcortical (nonconscious) brain circuits, according to a potentially earth-shattering new paper by Joseph LeDoux and Richard Brown. The February 2017 paper, “A Higher-Order Theory of Emotional Consciousness,” was published online today ahead of print in the journal Proceedings of the National Academy of Sciences. This paper was written by neuroscience legend Joseph LeDoux of New York University and Richard Brown, professor of philosophy at the City University of New York's LaGuardia College. Joseph LeDoux has been working on the link between emotion, memory, and the brain since the 1990s. He's credited with putting the amygdala in the spotlight and making this previously esoteric subcortical brain region a household term. LeDoux founded the Emotional Brain Institute (EBI). He’s also a professor in the Departments of Psychiatry and Child and Adolescent Psychiatry at NYU Langone Medical Center. Why Is This New Report From LeDoux and Brown Significant? In the world of cognitive neuroscience, there's an ongoing debate about the interplay between emotional states of consciousness (or feelings) within cortical and subcortical brain regions. (Most experts believe that cortical brain regions house “thinking” neural circuits within the cerebral cortex. Subcortical brain regions are considered to be housed in “non-thinking” neural circuits beneath the 'thinking cap' of the cerebral cortex.) © 1991-2017 Sussex Publishers, LLC
By Timothy Revell It can be difficult to communicate when you can only move your eyes, as is often the case for people with ALS (also known as motor neurone disease). Microsoft researchers have developed an app to make talking with your eyes easier, called GazeSpeak. GazeSpeak runs on a smartphone and uses artificial intelligence to convert eye movements into speech, so a conversation partner can understand what is being said in real time. The app runs on the listener’s device. They point their smartphone at the speaker as if they are taking a photo. A sticker on the back of the phone, visible to the speaker, shows a grid with letters grouped into four boxes corresponding to looking left, right, up and down. As the speaker gives different eye signals, GazeSpeak registers them as letters. “For example, to say the word ‘task’ they first look down to select the group containing ‘t’, then up to select the group containing ‘a’, and so on,” says Xiaoyi Zhang, who developed GazeSpeak whilst he was an intern at Microsoft. GazeSpeak selects the appropriate letter from each group by predicting the word the speaker wants to say based on the most common English words, similar to predictive text messaging. The speaker indicates they have finished a word by winking or looking straight ahead for two seconds. The system also takes into account added lists of words, like names or places that the speaker is likely to use. The top four word predictions are shown onscreen, and the top one is read aloud. © Copyright Reed Business Information Ltd.
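The decoding scheme Zhang describes, four gaze directions selecting four letter groups, with dictionary lookup disambiguating the word, can be sketched in a few lines. The letter groupings and the mini-lexicon below are invented for illustration (Microsoft has not published the app's exact grid or dictionary), and the real app also ranks candidates by word frequency and user-supplied name lists.

```python
# Illustrative GazeSpeak-style decoder. The grouping of letters into the
# four gaze directions is an assumption, not the app's actual grid.
GROUPS = {
    "left":  set("abcdef"),
    "right": set("ghijkl"),
    "up":    set("mnopqr"),
    "down":  set("stuvwxyz"),
}

# Stand-in lexicon; the real system uses common-English word frequencies.
WORDS = ["task", "take", "the", "and", "water", "yes", "no"]

def letter_to_gaze(ch):
    """Which gaze direction selects the group containing this letter?"""
    for direction, letters in GROUPS.items():
        if ch in letters:
            return direction
    raise ValueError(f"no group for {ch!r}")

def decode(gazes, lexicon=WORDS):
    """Return lexicon words consistent with a sequence of gaze directions,
    analogous to predictive text on a phone keypad."""
    return [
        w for w in lexicon
        if len(w) == len(gazes)
        and all(letter_to_gaze(c) == g for c, g in zip(w, gazes))
    ]
```

With these (assumed) groups, the article's example word "task" comes out as the gaze sequence down, left, down, right, and the lexicon narrows the candidates to that single word.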
By Emma Hiolski There’s more to those love handles than meets the eye. Fat tissue can communicate with other organs from afar, sending out tiny molecules that control gene activity in other parts of the body, according to a new study. This novel route of cell-to-cell communication could indicate fat plays a much bigger role in regulating metabolism than previously thought. It could also mean new treatment options for diseases such as obesity and diabetes. “I found this very interesting and, frankly, very exciting,” says Robert Freishtat of Children’s National Health System in Washington, D.C., a pediatrician and researcher who has worked with metabolic conditions like obesity and diabetes. Scientists have long known that fat is associated with all sorts of disease processes, he says, but they don’t fully understand how the much-reviled tissue affects distant organs and their functions. Scientists have identified hormones made by fat that signal the brain to regulate eating, but this new study—in which Freishtat was not involved—takes a fresh look at another possible messenger: small snippets of genetic material called microRNAs, or miRNAs. MiRNAs, tiny pieces of RNA made inside cells, help control the expression of genes and, consequently, protein production throughout the body. But some tumble freely through the bloodstream, bundled into tiny packets called exosomes. There, high levels of some miRNAs have been associated with obesity, diabetes, cancer, and cardiovascular disease. © 2017 American Association for the Advancement of Science.
By Alice Callahan Once fat cells are formed, can you ever get rid of them? The number of fat cells in a person’s body seems to be able to change in only one direction: up. Fat cell number increases through childhood and adolescence and generally stabilizes in adulthood. But this doesn’t mean that fat cells, or adipocytes, are stagnant. The size of individual fat cells is remarkably variable, expanding and contracting with weight gain or weight loss. And as with most cell types in the body, adipocytes die eventually. “Usually when old ones die, they are replaced by new fat cells,” said Dr. Michael Jensen, an endocrinologist and obesity researcher at the Mayo Clinic. Cell death and production appear to be tightly coupled, so although about 10 percent of adipocytes die each year, they’re replaced at the same rate. Even among bariatric surgery patients, who can lose massive amounts of weight, the number of fat cells tends to remain the same, although the cells shrink in size, studies show. Liposuction reduces the number of fat cells in a person’s body, but studies show the weight lost is typically regained within a year. It isn’t known whether this regain occurs through the production of new fat cells or expansion of existing ones. People who are obese tend to have more fat cells than those who are not, and several studies have found an increase in fat cell number with weight regain following weight loss. The fact that fat cell number can be increased but not decreased most likely contributes to the body’s drive to regain weight after weight loss, said Dr. Kirsty L. Spalding, a cell biologist at the Karolinska Institute in Sweden and the lead author of a 2008 study showing that fat cells die and are replaced. Beyond their role in storing fat, adipocytes secrete proteins and hormones that affect energy metabolism. © 2017 The New York Times Company
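The steady state described above, with roughly 10 percent of fat cells dying each year and being replaced one-for-one, can be illustrated with a toy simulation. The death rate and the one-for-one replacement rule come from the article; everything else (the cell count, tracking ages) is a simplification for demonstration.

```python
import random

def simulate_turnover(n_cells=1000, years=10, death_rate=0.10, seed=42):
    """Toy model of adipocyte turnover: each year ~death_rate of cells die
    and are replaced one-for-one, so the total count stays flat even though
    individual cells turn over. Returns (final count, cells born during sim)."""
    random.seed(seed)
    ages = [years] * n_cells  # mark the original cohort with a sentinel age
    for _ in range(years):
        survivors = [a for a in ages if random.random() > death_rate]
        # Production is coupled to death: every lost cell is replaced.
        ages = survivors + [0] * (n_cells - len(survivors))
    replaced = sum(1 for a in ages if a < years)
    return len(ages), replaced
```

Running this, the final count always equals the starting count (that is the coupling the article describes), while a large fraction of the population consists of replacement cells, consistent with ~10 percent annual turnover compounding over a decade.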
By Kelly Clancy More than two hundred years ago, a French weaver named Joseph Jacquard invented a mechanism that greatly simplified textile production. His design replaced the lowly draw boy—the young apprentice who meticulously chose which threads to feed into the loom to create a particular pattern—with a series of paper punch cards, which had holes dictating the lay of each stitch. The device was so successful that it was repurposed in the first interfaces between humans and computers; for much of the twentieth century, programmers laid out their code like weavers, using a lattice of punched holes. The cards themselves were fussy and fragile. Ethereal information was at the mercy of its paper substrate, coded in a language only experts could understand. But successive computer interfaces became more natural, more flexible. Immutable program instructions were softened to “If x, then y. When a, try b.” Now, long after Jacquard’s invention, we simply ask Amazon’s Echo to start a pot of coffee, or Apple’s Siri to find the closest car wash. In order to make our interactions with machines more natural, we’ve learned to model them after ourselves. Early in the history of artificial intelligence, researchers came up against what is referred to as Moravec’s paradox: tasks that seem laborious to us (arithmetic, for example) are easy for a computer, whereas those that seem easy to us (like picking out a friend’s voice in a noisy bar) have been the hardest for A.I. to master. It is not profoundly challenging to design a computer that can beat a human at a rule-based game like chess; a logical machine does logic well. But engineers have yet to build a robot that can hopscotch. The Austrian roboticist Hans Moravec theorized that this might have something to do with evolution. Since higher reasoning has only recently evolved—perhaps within the last hundred thousand years—it hasn’t had time to become optimized in humans the way that locomotion or vision has. 
The things we do best are largely unconscious, coded in circuits so ancient that their calculations don’t percolate up to our experience. But because logic was the first form of biological reasoning that we could perceive, our thinking machines were, by necessity, logic-based. © 2017 Condé Nast.
Hannah Devlin A transportable brain-scanning helmet that could be used for rapid brain injury assessments of stroke victims and those felled on the sports pitch or battlefield is being tested by US scientists. The wearable device, known as the PET helmet, is a miniaturised version of the hospital positron emission tomography (PET) scanner, a doughnut-shaped machine which occupies the volume of a small room. Julie Brefczynski-Lewis, the neuroscientist leading the project at West Virginia University, said that the new helmet could dramatically speed up diagnosis and make the difference between a positive outcome and devastating brain damage or death for some patients. “You could roll it right to their bedside and put it on their head,” she said ahead of a presentation at the American Association for the Advancement of Science’s (AAAS) annual meeting in Boston. “Time is brain for stroke.” Despite being only the size of a motorbike helmet, the new device produces remarkably detailed images that could be used to identify regions of trauma to the brain in the ambulance on the way to hospital or at a person’s bedside. The device is currently being tested on healthy volunteers, but could be used clinically within two years, the team predicted.
Bret Stetka In a series of recent interviews, President Donald Trump's longtime personal physician Dr. Harold N. Bornstein told The New York Times that our new commander in chief has what amounts to a pretty unremarkable medical chart. Like about a quarter of American adults, Trump is on a statin for high cholesterol. He also takes a daily baby aspirin for heart health, an occasional antibiotic for rosacea, a skin condition, and Propecia, a pill to promote hair growth. Bornstein also told the Times that should he be appointed White House doctor, he probably wouldn't test the president for baseline dementia risk, something many doctors have argued should be mandatory. At 70, Trump is the oldest American president to ever take office. Couple his age with a family history of dementia — his father Fred developed Alzheimer's disease in his 80s — and one could argue that the question of baseline cognitive testing for the U.S. head of state has taken on new relevance. An assortment of fairly simple tests exist that can establish a reference point for cognitive capacity and detect early symptoms of mental decline. One of the most common such screens is the Mini-Mental Status Examination, a series of questions that gauges attention, orientation and short-term memory. It takes about five to 10 minutes to complete. Yet admitting vulnerability of any kind isn't something politicians have been keen to do. The true health of politicians has likely been cloaked in secrecy since the days of Mesopotamian kings, but definitely since the Wilson administration. © 2017 npr
By LISA SANDERS, M.D. The 3-year-old girl was having a very bad day — a bad week, really. She’d been angry and irritable, screaming and kicking at her mother over nothing. Her mother was embarrassed by this unusual behavior, because her husband’s sister, Amber Bard, was visiting. Bard, a third-year medical student at Michigan State, was staying in the guest room while working with a local medical practice in Grand Rapids so that she could spend a little time with her niece. The behavior was strange, but the mother was more concerned about her child’s left eye. A few days earlier it was red and bloodshot. It no longer was, but now the girl had little bumps near the eye. The mother asked Bard whether she could look at the eye. “I’m a third-year medical student,” Bard told her. “I know approximately nothing.” But Bard was happy to try. She turned to the girl, who immediately averted her face. “Can you show me your eye?” she asked. The girl shouted: “No! No, no, no!” Eventually Bard was able to coax her into allowing her a quick look at the eye. She saw a couple of tiny pimples along the lower lid, near the lashes, and a couple more just next to the eye. The eye itself wasn’t red; the lid wasn’t swollen. She couldn’t see any discharge. Once the child was in bed, Bard opened her laptop and turned to a database she’d been using for the past week when she started to see patients. Called VisualDx, it’s one of a dozen or so programs known as decision-support software, designed to help doctors make a diagnosis. This one focuses mostly on skin findings.
By Pallab Ghosh Scientists are appealing for more people to donate their brains for research after they die. They say they are lacking the brains of people with disorders such as depression and post-traumatic stress disorder. In part, this shortage results from a lack of awareness that such conditions are due to changes in brain wiring. The researchers' aim is to develop new treatments for mental and neurological disorders. The human brain is as beautiful as it is complex. Its wiring changes and grows as we do. The organ is a physical embodiment of our behaviour and who we are. In recent years, researchers have made links between the shape of the brain and mental and neurological disorders. Most of the specimens held by brain banks are from people with mental or neurological disorders. Samples are requested by scientists to find new treatments for Parkinson's, Alzheimer's and a whole host of psychiatric disorders. But there is a problem. Scientists at McLean Hospital and at brain banks across the world do not have enough specimens for the research community. According to Dr Kerry Ressler, who is the chief scientific officer at McLean hospital, new treatments for many mental and neurological diseases are within the grasp of the research community. However, he says it is the lack of brain tissue that is holding back their development. © 2017 BBC.
Ewen Callaway Researchers have no way to tell whether young babies may later be diagnosed with autism. But brain scans could help, a small study suggests. By scanning the brains of babies whose siblings have autism, researchers say they have been able to make reasonably accurate forecasts about which of these high-risk infants will later develop autism themselves. The findings raise the prospect of diagnosing autism spectrum disorder (ASD) months before children develop symptoms, a goal that has proved elusive. Nature looks at the new study and its implications. Why has it been so tough to diagnose autism in infants? Children typically show symptoms of ASD, such as difficulty making eye contact, after the age of 2. Researchers believe that the brain changes underlying ASD begin much earlier — possibly even in the womb. But behavioural assessments haven't been helpful in predicting who will get autism, says Joseph Piven, a psychiatrist at the University of North Carolina (UNC) in Chapel Hill, who co-led the study, published online in Nature. “Children who end up with autism at 2 or 3, they don’t look like they have autism in the first year," he says. Certain rare mutations are linked to ASD, but the vast majority of cases cannot be pinned to a single or even a handful of genetic risk factors. Beginning in the 1990s, Piven and other researchers noticed that children with autism tended to have larger brains than developmentally normal children, suggesting that brain growth could be a biomarker for ASD. But Piven and colleague Heather Cody Hazlett, a psychologist at UNC-Chapel Hill, say it had not been clear when overgrowth occurred. What did their latest study look at? © 2017 Macmillan Publishers Limited.
By Amy Ellis Nutt For the first time, scientists can point to substantial empirical evidence that people with attention-deficit/hyperactivity disorder have brain structures that differ from those of people without ADHD. The common disorder, they conclude, should be considered a problem of delayed brain maturation and not, as it is often portrayed, a problem of motivation or parenting. In conducting the largest brain imaging study of its kind, an international team of researchers found that ADHD involves decreased volume in key brain regions, in particular the amygdala, which is responsible for regulating the emotions. Although the study, published Wednesday in the Lancet Psychiatry, included children, adolescents and adults, the scientists said the greatest differences in brain volume appeared in the brains of children. Of seven subcortical brain regions targeted in the study, five, including the amygdala, were found to be smaller in those with ADHD, compared with those in a control group. The other regions that showed reductions in volume were: the caudate nucleus (which has been linked to goal-directed action), the putamen (involved in learning and responding to stimuli), the nucleus accumbens (which processes rewards and motivation) and the hippocampus (where memories are formed). © 1996-2017 The Washington Post