Chapter 13. Memory, Learning, and Development


By Andy Coghlan Two gene variants have been found to be more common in gay men, adding to mounting evidence that sexual orientation is at least partly biologically determined. How does this change what we already knew? Didn’t we already know there were “gay genes”? We have known for decades that sexual orientation is partly heritable in men, thanks to studies of families in which some people are straight and some people are gay. In 1993, genetic variations in a region on the X chromosome in men were linked to whether they were heterosexual or homosexual, and in 1995, a region on chromosome 8 was identified. Both findings were confirmed in a study of gay and straight brothers in 2014. However, these studies didn’t home in on any specific genes on these chromosomes. What’s new about the latest study? For the first time, individual genes have been identified that may influence how sexual orientation develops in boys and men, both in the womb and during life. Alan Sanders at North Shore University, Illinois, and his team pinpointed these genes by comparing DNA from 1077 gay and 1231 straight men. They scanned the men’s entire genomes, looking for single-letter differences in their DNA sequences. This enabled them to home in on two genes whose variants seem to be linked to sexual orientation. © Copyright New Scientist Ltd.

Keyword: Sexual Behavior; Genes & Behavior
Link ID: 24413 - Posted: 12.09.2017

By Ruth Williams Two studies in Science today—one that focuses on prenatal development in humans, the other on infancy to old age—provide insights into the extent of DNA sequence errors that the average human brain cell accumulates over a lifetime. Together, they reveal that mutations become more common as fetuses develop, and over a lifetime a person may rack up more than 2,000 mutations per cell. “I think these are both very powerful technical papers, and they demonstrate how single-cell sequencing . . . can reliably detect somatic changes in the genomes of human neurons,” says neuroscientist Fred Gage of the Salk Institute in La Jolla, who was not involved in either study. “What’s cool about [the papers] is that they show two different ways that one can look at somatic mutations in single human neurons . . . and yet they get consistent results,” says neuroscientist Michael McConnell of the University of Virginia School of Medicine. Cells of the human body acquire mutations over time, whether because of errors introduced during DNA replication or damage incurred during transcription and other cellular processes. But, until recent technological developments enabled whole genome sequencing from the minuscule quantities of DNA found inside single cells or small clones of the same cell, investigating the nature and extent of such somatic mutations—and the resulting tissue mosaicism—was practically impossible. © 1986-2017 The Scientist

Keyword: Development of the Brain; Epigenetics
Link ID: 24412 - Posted: 12.09.2017

Carl Zimmer When you drive toward an intersection, the sight of the light turning red will (or should) make you step on the brake. This action happens thanks to a chain of events inside your head. Your eyes relay signals to the visual centers in the back of your brain. After those signals get processed, they travel along a pathway to another region, the premotor cortex, where the brain plans movements. Now, imagine that you had a device implanted in your brain that could shortcut the pathway and “inject” information straight into your premotor cortex. That may sound like an outtake from “The Matrix.” But now two neuroscientists at the University of Rochester say they have managed to introduce information directly into the premotor cortex of monkeys. The researchers published the results of the experiment on Thursday in the journal Neuron. Although the research is preliminary, carried out in just two monkeys, the researchers speculated that further research might lead to brain implants for people with strokes. “You could potentially bypass the damaged areas and deliver stimulation to the premotor cortex,” said Kevin A. Mazurek, a co-author of the study. “That could be a way to bridge parts of the brain that can no longer communicate.” In order to study the premotor cortex, Dr. Mazurek and his co-author, Dr. Marc H. Schieber, trained two rhesus monkeys to play a game. The monkeys sat in front of a panel equipped with a button, a sphere-shaped knob, a cylindrical knob, and a T-shaped handle. Each object was ringed by LED lights. If the lights around an object switched on, the monkeys had to reach out their hand to it to get a reward — in this case, a refreshing squirt of water. © 2017 The New York Times Company

Keyword: Learning & Memory; Movement Disorders
Link ID: 24408 - Posted: 12.08.2017

Seventeen million babies under the age of one are breathing toxic air, putting their brain development at risk, the UN children's agency has warned. Babies in South Asia were worst affected, with more than 12 million living in areas with pollution six times higher than safe levels. A further four million were at risk in East Asia and the Pacific. Unicef said breathing particulate air pollution could damage brain tissue and undermine cognitive development. Its report said there was a link to "verbal and non-verbal IQ and memory, reduced test scores, grade point averages among schoolchildren, as well as other neurological behavioural problems". The effects lasted a lifetime, it said. "As more and more of the world urbanises, and without adequate protection and pollution reduction measures, more children will be at risk in the years to come," Unicef said. It called for wider use of face masks and air filtering systems, and for children not to travel during spikes in pollution. Last month hazardous smog began blanketing the Indian capital, Delhi, prompting its chief minister, Arvind Kejriwal, to say the city had become a "gas chamber". Some schools in the city were closed but there was criticism when they re-opened, with parents accusing the authorities of disregarding their children's health. © 2017 BBC.

Keyword: Development of the Brain; Neurotoxins
Link ID: 24398 - Posted: 12.07.2017

By David Z. Hambrick, Madeline Marquardt There are advantages to being smart. People who do well on standardized tests of intelligence—IQ tests—tend to be more successful in the classroom and the workplace. Although the reasons are not fully understood, they also tend to live longer, healthier lives, and are less likely to experience negative life events such as bankruptcy. Now there’s some bad news for people in the right tail of the IQ bell curve. In a study just published in the journal Intelligence, Pitzer College researcher Ruth Karpinski and her colleagues emailed a survey with questions about psychological and physiological disorders to members of Mensa. A “high IQ society”, Mensa requires that its members have an IQ in the top two percent. For most intelligence tests, this corresponds to an IQ of about 132 or higher. (The average IQ of the general population is 100.) The survey of Mensa’s highly intelligent members found that they were more likely to suffer from a range of serious disorders. The survey covered mood disorders (depression, dysthymia, and bipolar), anxiety disorders (generalized, social, and obsessive-compulsive), attention-deficit hyperactivity disorder, and autism. It also covered environmental allergies, asthma, and autoimmune disorders. Respondents were asked to report whether they had ever been formally diagnosed with each disorder, or suspected they suffered from it. With a return rate of nearly 75%, Karpinski and colleagues compared the percentage of the 3,715 respondents who reported each disorder to the national average. © 2017 Scientific American
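The "top two percent" threshold maps onto a concrete score because IQ tests are standardized to a normal distribution with mean 100; a quick sketch of that arithmetic, noting that the standard deviation is test-dependent (commonly 15 for Wechsler-style tests, 16 for the Stanford-Binet):

```python
from statistics import NormalDist

# IQ scores are scaled to a mean of 100; the standard deviation varies by test.
for sd in (15, 16):
    cutoff = NormalDist(mu=100, sigma=sd).inv_cdf(0.98)  # 98th percentile = top 2%
    print(f"SD {sd}: top-2% cutoff \u2248 {cutoff:.1f}")
# SD 15 gives ~130.8; SD 16 gives ~132.9 — consistent with "about 132 or higher"
```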

Keyword: Intelligence; Depression
Link ID: 24397 - Posted: 12.06.2017

By Rebecca Robbins Akili Interactive Labs on Monday reported that its late-stage study of a video game designed to treat kids with ADHD met its primary goal, a big step in the Boston company’s quest to get approval for what it hopes will be the first prescription video game. In a study of 348 children between the ages of 8 and 12 diagnosed with ADHD, those who played Akili’s action-packed game on a tablet over four weeks saw statistically significant improvements on metrics of attention and inhibitory control, compared to children who were given a different action-driven video game designed as a placebo. The company plans next year to file for approval with the Food and Drug Administration. “We are directly targeting the key neurological pathways that control attention and impulsivity,” said Akili CEO Eddie Martucci. The study “was meant to be a strong objective test to ask: Is it the targeting we do in the brain or is it general engagement with a treatment that’s exciting and interesting … that actually leads to these targeted effects? And so I think we clearly see that it’s the targeted algorithms that we have.” Despite the positive results, questions about the product remain. For instance, parents and physicians subjectively perceived about the same amount of improvement in children’s behavior whether they were playing the placebo game or the therapeutic game. And if Akili can get approval, it remains to be seen whether clinicians and insurers will embrace its product. The video game has not been tested head-to-head against ADHD medications or psychotherapy to see if it’s equally effective. © 2017 Scientific American

Keyword: ADHD; Learning & Memory
Link ID: 24396 - Posted: 12.06.2017

Anne Churchland Decisions span a vast range of complexity. There are really simple ones: Do I want an apple or a piece of cake with my lunch? Then there are much more complicated ones: Which car should I buy, or which career should I choose? Neuroscientists like me have identified some of the individual parts of the brain that contribute to making decisions like these. Different areas process sounds, sights or pertinent prior knowledge. But understanding how these individual players work together as a team is still a challenge, not only in understanding decision-making, but for the whole field of neuroscience. Part of the reason is that until now, neuroscience has operated in a traditional science research model: Individual labs work on their own, usually focusing on one or a few brain areas. That makes it challenging for any researcher to interpret data collected by another lab, because we all have slight differences in how we run experiments. Neuroscientists who study decision-making set up all kinds of different games for animals to play, for example, and we collect data on what goes on in the brain when the animal makes a move. When everyone has a different experimental setup and methodology, we can’t determine whether the results from another lab are a clue about something interesting that’s actually going on in the brain or merely a byproduct of equipment differences. © 2010–2017, The Conversation US, Inc.

Keyword: Learning & Memory; Attention
Link ID: 24392 - Posted: 12.05.2017

By JANE E. BRODY After 72 very nearsighted years, 55 of them spent wearing Coke-bottle glasses, Jane Quinn of Brooklyn, N.Y., is thrilled with how well she can see since having her cataracts removed last year. “It’s very liberating to be able to see without glasses,” Ms. Quinn told me. “My vision is terrific. I can even drive at night. I can’t wait to go snorkeling.” And I was thrilled to be able to tell her that the surgery very likely did more than improve her poor vision. According to the results of a huge new study, it may also prolong her life. The 20-year study, conducted among 74,044 women aged 65 and older, all of whom had cataracts, found a 60 percent lower risk of death among the 41,735 women who had their cataracts removed. The findings were published online in JAMA Ophthalmology in October by Dr. Anne L. Coleman and colleagues at the Stein Eye Institute of the David Geffen School of Medicine at the University of California, Los Angeles, with Dr. Victoria L. Tseng as lead author. A cataract is a clouding and discoloration of the lens of the eye. This normally clear structure behind the iris and pupil changes shape, enabling incoming visual images to focus clearly on the retina at the back of the eye. When cataracts form, images get increasingly fuzzy, the eyes become more sensitive to glare, night vision is impaired, and color contrasts are often lost. One friend at 74 realized she needed cataract surgery when she failed to see the yellow highlighted lines in a manuscript she was reading; for her husband, then 75, it was his ophthalmologist who said “it’s time.” Cataracts typically form gradually with age, and anyone who lives long enough is likely to develop them. They are the most frequent cause of vision loss in people over 40. Common risk factors include exposure to ultraviolet radiation (i.e., sunlight), smoking, obesity, high blood pressure, diabetes, prolonged use of corticosteroids, extreme nearsightedness and family history. 
© 2017 The New York Times Company

Keyword: Vision; Alzheimers
Link ID: 24390 - Posted: 12.05.2017

By PERRI KLASS, M.D. We can date our pregnancies by what we were told was safe that later turned out to be more problematic. My own mother often told me lovingly (and laughingly) of the understanding doctor who advised her to drink rum every night when she was pregnant with me and had trouble falling asleep. And we know that on balance, it’s a good thing that science and epidemiology march forward, with more careful and more thorough investigations of the possible effects of exposures during fetal development and their complex long-term implications. But it’s disconcerting to learn that something you did, or something you took, in all good faith, following all the best recommendations, may be part of a more complicated story. And the researchers who have been examining the possible effects of fairly extensive acetaminophen use during pregnancy are very well aware that these are complex issues to communicate to women who have been pregnant in the past, who are pregnant right now or who become pregnant in the future. Acetaminophen, found in Tylenol and many other over-the-counter products, has been the drug recommended for pregnant women with fever or pain or inflammatory conditions certainly as far back as my own pregnancies in the 1980s and ’90s. But in recent years there have been concerns raised about possible effects of heavy use of acetaminophen on the brain of the developing fetus. A Danish epidemiological study published in 2014 found an association between acetaminophen use during pregnancy and attention deficit hyperactivity disorder, especially if the acetaminophen use was more frequent. Zeyan Liew, a postdoctoral scholar in the department of epidemiology at the U.C.L.A. Fielding School of Public Health, who was the first author on the 2014 article, said it was challenging for researchers to look at effects that show up later in the child’s life.
“With a lot of drug safety research in pregnancy, they only look into birth outcomes or congenital malformations,” Dr. Liew said. “It’s very difficult to conduct a longitudinal study and examine outcomes like neurobehavioral disorders.” © 2017 The New York Times Company

Keyword: ADHD; Development of the Brain
Link ID: 24389 - Posted: 12.04.2017

An analysis of more than 800,000 people has concluded that people who remain single for life are 42 per cent more likely to get dementia than married people. The study also found that people who have been widowed are 20 per cent more likely to develop the condition, but that divorcees don’t have an elevated risk. Previous research has suggested that married people may have healthier lifestyles, which may help explain the findings. Another hypothesis is that married people are more socially engaged, and that this may protect against developing the condition. The stress of bereavement might be behind the increased risk in those who have been widowed. But marriage isn’t always good for health. While men are more likely to survive a heart attack if they are married, single women recover better than those who are married. Journal reference: Journal of Neurology, Neurosurgery, and Psychiatry © Copyright New Scientist Ltd.
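A relative risk like "42 per cent more likely" says nothing about absolute risk until a baseline rate is fixed; a minimal sketch of that conversion, using a purely hypothetical baseline (the study reports relative, not absolute, figures):

```python
# Hypothetical baseline: assume married people have a 10% risk of dementia.
baseline = 0.10
relative_risk_single = 1.42   # "42 per cent more likely" (lifelong single)
relative_risk_widowed = 1.20  # "20 per cent more likely" (widowed)

single_risk = baseline * relative_risk_single
widowed_risk = baseline * relative_risk_widowed
print(f"married: {baseline:.1%}, single: {single_risk:.1%}, widowed: {widowed_risk:.1%}")
# With a 10% baseline, the single-group risk works out to 14.2% and widowed to 12.0%
```

The same relative risk implies a very different absolute increase at a 2% baseline than at a 20% one, which is why epidemiologists report both where possible.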

Keyword: Alzheimers; Learning & Memory
Link ID: 24382 - Posted: 12.01.2017

By JOANNA KLEIN Chances are that’s a shy elk looking back at a bold magpie, in the photograph above. They get along, so to speak, because the elk needs grooming and the magpie is looking for dinner. But they may have never entered into this partnership if it weren’t for their particular personalities, suggests a study published Wednesday in Biology Letters. Let’s start with the elk. In Canada’s western province of Alberta, they’ve been acting strange. Some have quit migrating, opting to hang around towns with humans who protect them from predators like wolves. Others still migrate. As a doctoral student at the University of Alberta, Robert Found, now a wildlife biologist for Parks Canada, discovered over years of observing their personalities that bold elk stayed, while shy elk migrated. But he noticed something else in the process of completing his research: As elk lay down to rest at the end of the day, magpies approached. There appeared to be a pattern: elk of some personality types aggressively rejected magpies. Others didn’t. “Sometimes the magpies will walk around right on the head and the face of the elk,” Dr. Found said. Scientists define animal personality by an individual animal’s behavior. It’s predictable, but also varies from others in a group. Dr. Found created a bold-shy scale for elk, measuring how close they allowed him to get, where elk positioned themselves within the group, which elk fought other elk, which ones won, how long elk spent monitoring for predators and their willingness to approach unfamiliar objects like old tires, skis and a bike. He also noted which elk accepted magpies. To study the magpies, he attracted the birds to 20 experimental sites with peanuts on tree stumps. During more than 20 separate trials with different magpies, he judged each bird’s behavior relative to the other magpies in a trial.
Like the elk, he measured flight response, social structure and willingness to approach items they hadn’t previously encountered (a bike decorated with a boa and Christmas ornaments). He also noted who landed on a faux-elk that offered dog food rather than ticks (a previous study showed magpies liked dog food as much as ticks). © 2017 The New York Times Company

Keyword: Learning & Memory; Evolution
Link ID: 24380 - Posted: 11.30.2017

By NICHOLAS BAKALAR The daughters of women exposed to childhood trauma are at increased risk for serious psychiatric disorders, a new study concludes. Researchers studied 46,877 Finnish children who were evacuated to Sweden during World War II, between 1940 and 1944. They tracked the health of their 93,391 male and female offspring born from 1950 to 2010. The study, in JAMA Psychiatry, found that female children of mothers who had been evacuated to Sweden were twice as likely to be hospitalized for a psychiatric illness as their female cousins who had not been evacuated, and more than four times as likely to have depression or bipolar disorder. But there was no effect among male children, and no effect among children of either sex born to fathers who had been evacuated. The most obvious explanation would be that girls inherited their mental illness from their mothers, but the researchers controlled for parental psychiatric disorder and the finding still held. The lead author, Torsten Santavirta, an associate professor of economics at Uppsala University, said that it is possible that traumatic events cause changes in gene expression that can then be inherited, but the researchers did not have access to genetic information. “The most important takeaway is that childhood trauma can be passed on to offspring,” Dr. Santavirta said, “and the wrinkle here is that these associations are sex-specific.” © 2017 The New York Times Company

Keyword: Stress; Epigenetics
Link ID: 24377 - Posted: 11.30.2017

By Shawna Williams Neurodegenerative diseases are tough nuts to crack, not just because of the inherent difficulties of sorting through what has gone awry, and why, but also due to a dearth of biomarkers that could help spot the diseases and track their progression. This inability to easily diagnose many forms of neurodegeneration means that the diseases can’t be treated early in their progression. The lack of biomarkers also hinders the certainty with which researchers running clinical trials can assess whether and how well experimental treatments of the diseases are working. A simple, noninvasive eye scan now being developed for Alzheimer’s disease (AD), however, may help address both shortcomings. AD researchers already utilize amyloid positron emission tomography (PET), in which injected tracers make the disease’s characteristic amyloid plaques detectable by PET imaging. But the scans are very expensive, spurring the continuing hunt for biomarkers. “What we now know is that the disease essentially occurs about 20 years before a patient becomes symptomatic,” says Cedars-Sinai Medical Center neuroscientist and neurosurgeon Keith Black. “And by the time one is symptomatic, they’ve already lost a lot of their brain weight; they’ve already lost a significant number of brain cells; they’ve already lost a significant amount of connectivity.” What’s needed, he says, is a way to detect the disease early so it can be treated—with drugs, lifestyle interventions, or both—before it’s too late. © 1986-2017 The Scientist

Keyword: Alzheimers; Vision
Link ID: 24376 - Posted: 11.29.2017

By Linda Searing Benzodiazepines, also known as benzos, are drugs sometimes prescribed to ease the agitation, anxiety and insomnia often experienced by people with Alzheimer’s disease. Might these powerful medications have an effect beyond their sleep-inducing or calming properties?

This study: Researchers analyzed data on 31,140 adults with Alzheimer’s, most in their early 80s and predominantly women. The group included 10,380 people who started taking benzodiazepines (6,438), benzodiazepine-related “Z-drugs” (3,826) or both (116) after being diagnosed with Alzheimer’s. None of them had taken these drugs for at least a year before their diagnosis. Prescribed benzos included Valium, Librium, Ativan, Xanax, Restoril, Serax and one drug not approved for use in the United States. Prescribed Z-drugs were Ambien and one non-U.S. drug. Within six months of starting to take the medication, 1,225 people had died. Those taking benzos were 41 percent more likely to have died than were people who did not take these drugs, with the strongest mortality risk occurring within four months of starting the medication. No increased risk was linked to Z-drugs.

Who may be affected? People with Alzheimer’s, which usually affects those 60 and older. The researchers noted that benzodiazepines and similar drugs have a stronger effect on the central nervous system of older people than of younger ones, and they have been shown to raise older people’s risk for hip fractures, pneumonia and stroke. Because of this, they wrote, “the observed association with an increased risk of death might result from these outcomes.” Today, about 5 million Americans are living with Alzheimer’s — a number that may triple in the next three decades. © 1996-2017 The Washington Post

Keyword: Alzheimers
Link ID: 24369 - Posted: 11.28.2017

By Julie Hecht A good friend insists: "You don't study dogs, Julie. You study human culture. Dog behavior is a product of the people who love them." And since I'm not one for quick comebacks, I typically just smile and pet her dog. Because I like dogs. And I study them (see what I just did there? Bam). Or maybe she's onto something. Is dog behavior independent from where they live? From the cultural norms they're exposed to? Maybe German Shepherds can tell us a thing or two. In 2009, researchers from Hungary and the USA published a cross-cultural survey where German Shepherd owners from each country weighed in on their dogs. While a number of similarities emerged, so did differences. For example, USA German Shepherds were more likely to be kept indoors and have more types of training experiences. And when it came to behavior, on some measures there was no difference between German Shepherds in each country—all owners reported low activity-impulsivity and low inattention scores—but there were also a few differences: the USA dogs scored higher on confidence and aggressiveness than those in Hungary. Does this mean a German Shepherd here isn't the same as a German Shepherd there? One possible answer is: yes, the dogs are different. If German Shepherd lovers in the USA prefer dogs with higher confidence ratings, this preference "could lead to selective breeding for higher confidence, resulting in a population of German Shepherds in the USA with this trait." We know it’s possible to select for particular parental behavioral traits, and then observe them in offspring. "Genetic isolation, as well as environmental variation, could contribute to differences in pet behavior across cultures," the researchers offer. © 2017 Scientific American

Keyword: Learning & Memory; Evolution
Link ID: 24366 - Posted: 11.27.2017

By Mary Beth Aberlin Like the entomologist in search of colorful butterflies, my attention has chased, in the gardens of the grey matter, cells with delicate and elegant shapes, the mysterious butterflies of the soul, whose beating of wings may one day reveal to us the secrets of the mind. —Santiago Ramón y Cajal, Recollections of My Life Based on this quote, I am pretty certain that Santiago Ramón y Cajal, a founding father of modern neuroscience, would approve of this month’s cover. The Spaniard had wanted to become an artist, but, goaded by his domineering father into the study of medicine, Ramón y Cajal concentrated on brain anatomy, using his artistic talent to render stunningly beautiful and detailed maps of neuron placement throughout the brain. Based on his meticulous anatomical studies of individual neurons, he proposed that nerve cells did not form a mesh—the going theory at the time—but were separated from each other by microscopic gaps now called synapses. Fast-forward from the early 20th century to the present day, when technical advances in imaging have revealed any number of the brain’s secrets. Ramón y Cajal would no doubt have marveled at the technicolor neuron maps revealed by the Brainbow labeling technique. (Compare Ramón y Cajal’s drawings of black-stained Purkinje neurons to a Brainbow micrograph of the same type of neuron.) But the technical marvels have gotten even more revelatory. © 1986-2017 The Scientist

Keyword: Brain imaging; Development of the Brain
Link ID: 24348 - Posted: 11.24.2017

Laura Sanders Around the six-month mark, babies start to get really fun. They’re not walking or talking, but they are probably babbling, grabbing and gumming, and teaching us about their likes and dislikes. I remember this as the time when my girls’ personalities really started making themselves known, which, really, is one of the best parts of raising a kid. After months of staring at those beautiful, bald heads, you start to get a glimpse of what’s going on inside them. When it comes to learning language, it turns out that a lot has already happened inside those baby domes by age 6 months. A new study finds that babies this age understand quite a bit about words — in particular, the relationships between nouns. Work in toddlers, and even adults, reveals that people can struggle with word meanings under difficult circumstances. We might briefly falter with “shoe” when an image of a shoe is shown next to a boot, for instance, but not when the shoe appears next to a hat. But researchers wanted to know how early these sorts of word relationships form. Psychologists Elika Bergelson of Duke University and Richard Aslin, formerly of the University of Rochester in New York and now at Haskins Laboratories in New Haven, Conn., put 51 6-month-olds to a similar test. Outfitted with eye-tracking gear, the babies sat on a parent’s lap and looked at a video screen that showed pairs of common objects. Sometimes the images were closely related: mouth and nose, for instance, or bottle and spoon. Other pairs were unrelated: blanket and dog, or juice and car. © Society for Science and the Public

Keyword: Language; Development of the Brain
Link ID: 24343 - Posted: 11.21.2017

Laura Sanders In stark contrast to earlier findings, adults do not produce new nerve cells in a brain area important to memory and navigation, scientists conclude after scrutinizing 54 human brains spanning the age spectrum. The finding is preliminary. But if confirmed, it would overturn the widely accepted and potentially powerful idea that in people, the memory-related hippocampus constantly churns out new neurons in adulthood. Adult brains showed no signs of such turnover in that region, researchers reported November 13 at a meeting of the Society for Neuroscience in Washington, D.C. Previous studies in animals have hinted that boosting the birthrate of new neurons, a process called neurogenesis, in the hippocampus might enhance memory or learning abilities, combat depression and even stave off the mental decline that comes with dementia and old age (SN: 9/27/08, p. 5). In rodents, exercise, enriched environments and other tweaks can boost hippocampal neurogenesis — and more excitingly, memory performance. But the new study may temper those ambitions, at least for people. Researchers studied 54 human brain samples that ranged from fetal stages to age 77, acquired either postmortem or during brain surgery. These samples were cut into thin slices and probed with molecular tools that can signal dividing or young cells, both of which are signs that nerve cells are being born. As expected, fetal and infant samples showed evidence of both dividing cells that give rise to new neurons and young neurons themselves in the hippocampus. But with age, these numbers declined. In brain tissue from a 13-year-old, the researchers spotted only a handful of young neurons. And in adults, there were none. © Society for Science & the Public 2000 - 2017.

Keyword: Neurogenesis
Link ID: 24334 - Posted: 11.16.2017

By NICHOLAS BAKALAR Heart attack survivors have an increased risk for developing dementia, a new study has found. Danish researchers studied 314,911 heart attack patients and compared them with 1,573,193 controls who had not had a heart attack. They excluded anyone who had already been diagnosed with dementia or other memory disorders. The study, in Circulation, adjusted for heart failure, pulmonary disease, head trauma, kidney disease and many other variables. During 35 years of follow-up, there were 3,615 cases of Alzheimer’s disease, 2,034 cases of vascular dementia and 5,627 cases of other dementias among the heart attack patients. There was no association of heart attack with Alzheimer’s disease. But heart attack increased the risk for vascular dementia, the type caused by impaired blood flow to the brain, by 35 percent. There are several possible reasons for the link, including similar underlying causes for dementia and heart attack — among them, hypertension, stroke and having undergone coronary artery bypass surgery. The researchers had no data on smoking, and they acknowledge that there may be other variables they were unable to account for. “Dementia can’t be cured,” said the lead author, Dr. Jens Sundboll, a resident in cardiology at Aarhus University in Denmark. “What’s the solution? Prevention. And for prevention we have to identify risk factors. Here we’ve identified an important one.” © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 24330 - Posted: 11.16.2017

By Sarah DeWeerdt Young adults with autism have an unusual gait and problems with fine motor skills. Researchers presented the unpublished findings today at the 2017 Society for Neuroscience annual meeting in Washington, D.C. Motor problems such as clumsiness, toe-walking and altered gait are well documented in autism. But most studies have been limited to children or have included adults only as part of a broad age range. “Studies haven’t focused on just adults,” says Cortney Armitano, a graduate student in Steven Morrison’s lab at Old Dominion University in Norfolk, Virginia, who presented the work. The researchers looked at 20 young adults with autism between the ages of about 17 and 25, and 20 controls of about the same age range. They put these participants through a battery of standard tests of fine motor skills, balance and walking. When it comes to simple tasks—such as tapping a finger rapidly against a hard surface or standing still without swaying—those with autism perform just as well as controls do. But with activities that require more back-and-forth between the brain and the rest of the body, differences emerge. Adults with autism have slower reaction times compared with controls, measured by how fast they can click a computer mouse in response to seeing a button light up. They also have a weaker grip. © 2017 Scientific American

Keyword: Autism; Movement Disorders
Link ID: 24329 - Posted: 11.15.2017