Most Recent Links
Helen Haste The American psychologist and educationist Jerome Bruner, who has died aged 100, repeatedly challenged orthodoxies and generated novel directions. His elegant, accessible writing reached wide audiences. His colleague Rom Harré described his lectures as inspiring: “He darted all over the place, one topic suggested another and so on through a thrilling zigzag.” To the charge that he was always asking impossible questions, Jerry replied: “They are pretty much impossible, but the search for the impossible is part of what intelligence is about.” He was willing to engage with controversy, both on academic issues and in education politics. Blind at birth because of cataracts, Jerry gained his sight after surgery at the age of two. He credited this for his sense that we actively interpret and organise our world rather than passively react to it – a theme that he continued to develop in different ways. His first work lay in perception, when he resumed research at Harvard after the second world war. He found that children’s judgments of the size of coins and coin-like disks varied: poorer children overestimated the size of the coins. This contributed to the emerging “new look” movement in psychology, involving values, intentions and interpretation in contrast to the then dominant behaviourist focus on passive learning, reward and punishment. His professorship at Harvard came in 1952, and by the middle of the decade a computer metaphor began to influence psychology – the “cognitive revolution”. With Jacqueline Goodnow and George Austin, Jerry published A Study of Thinking (1956). © 2016 Guardian News and Media Limited
NOBODY knows how the brain works. But researchers are trying to find out. One of the most eye-catching weapons in their arsenal is functional magnetic-resonance imaging (fMRI). In this, MRI scanners normally employed for diagnosis are used instead for research on volunteers. By watching people’s brains as they carry out certain tasks, neuroscientists hope to get some idea of which bits of the brain specialise in doing what. The results look impressive. Thousands of papers have been published, from workmanlike investigations of the role of certain brain regions in, say, recalling directions or reading the emotions of others, to spectacular treatises extolling the use of fMRI to detect lies, to work out what people are dreaming about or even to deduce whether someone truly believes in God. But the technology has its critics. Many worry that dramatic conclusions are being drawn from small samples (the faff involved in fMRI makes large studies hard). Others fret about over-interpreting the tiny changes the technique picks up. A deliberately provocative paper published in 2009, for example, found apparent activity in the brain of a dead salmon. Now, researchers in Sweden have added to the doubts. As they reported in the Proceedings of the National Academy of Sciences, a team led by Anders Eklund at Linköping University has found that the computer programs used by fMRI researchers to interpret what is going on in their volunteers’ brains appear to be seriously flawed. © The Economist Newspaper Limited 2016
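The multiple-comparisons worry the critics raise is easy to see in a toy simulation. This sketch is illustrative only, not the Eklund analysis: when thousands of voxels are each tested at an uncorrected threshold, "significant" voxels emerge from pure noise, which is the dead-salmon problem in miniature.

```python
import random

# Illustrative sketch (not the Eklund et al. analysis): test many voxels
# at p < 0.05 with no correction and spurious "activity" appears in noise.
random.seed(42)

N_VOXELS = 10_000  # independent voxel-level tests
ALPHA = 0.05       # uncorrected significance threshold

# Under the null hypothesis (no real signal), each voxel's p-value
# is uniformly distributed on [0, 1].
p_values = [random.random() for _ in range(N_VOXELS)]

false_positives = sum(p < ALPHA for p in p_values)
print(f"{false_positives} of {N_VOXELS} voxels look 'active' by chance")
# With alpha = 0.05 we expect roughly 5% (~500) spurious detections.

# A Bonferroni correction divides the threshold by the number of tests,
# which all but eliminates the spurious hits (at a cost in power):
bonferroni = sum(p < ALPHA / N_VOXELS for p in p_values)
print(f"{bonferroni} voxels survive Bonferroni correction")
```

The flaw reported in the paper concerns cluster-level inference, which is subtler than this voxel-wise picture, but the underlying issue is the same: the correction procedures built into common fMRI packages did not control false positives as advertised.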
Keyword: Brain imaging
Link ID: 22444 - Posted: 07.15.2016
By Andy Coghlan There once was a brainy duckling. It could remember whether shapes or colours it saw just after hatching were the same as or different to each other. The feat surprised the researchers, who were initially sceptical about whether the ducklings could grasp such complex concepts as “same” and “different”. The fact that they could suggests the ability to think in an abstract way may be far more common in nature than expected, and not just restricted to humans and a handful of animals with big brains. “We were completely surprised,” says Alex Kacelnik at the University of Oxford, who conducted the experiment along with his colleague Antone Martinho III. Kacelnik and Martinho reasoned that ducklings might be able to grasp patterns relating to shape or colour as part of the array of sensory information they absorb soon after hatching. Doing so would allow them to recognise their mothers and siblings and distinguish them from all others – abilities vital for survival. In ducklings, goslings and other species that depend for survival on following their mothers, newborns learn quickly – a process called filial imprinting. Kacelnik wondered whether this would enable them to be tricked soon after hatching into “following” objects or colours instead of their natural mother, and recognising those same patterns in future. © Copyright Reed Business Information Ltd.
Rebecca Boyle Eliane Lucassen works the night shift at Leiden University Medical Center in the Netherlands, beginning her day at 6 p.m. Yet her own research has shown that this schedule might cause her health problems. “It’s funny,” the medical resident says. “Here I am, spreading around that it’s actually unhealthy. But it needs to be done.” Lucassen and Johanna Meijer, a neuroscientist at Leiden, report today in Current Biology that a constant barrage of bright light prematurely ages mice, playing havoc with their circadian clocks and causing a cascade of health problems. Mice exposed to constant light experienced bone-density loss, skeletal-muscle weakness and inflammation; restoring their health was as simple as turning the lights off. The findings are preliminary, but they suggest that people living in cities flooded with artificial light may face similar health risks. “We came to know that smoking was bad, or that sugar is bad, but light was never an issue,” says Meijer. “Light and darkness matter.”
Disrupted patterns
Many previous studies have hinted at a connection between artificial light exposure and health problems in animals and people. Epidemiological analyses have found that shift workers have an increased risk of breast cancer, metabolic syndrome and osteoporosis. People exposed to bright light at night are more likely to have cardiovascular disease and often don’t get enough sleep. © 2016 Macmillan Publishers Limited
Keyword: Biological Rhythms
Link ID: 22442 - Posted: 07.15.2016
Michael Egnor The most intractable question in modern neuroscience and philosophy of the mind is often phrased "What is consciousness?" The problem has been summed up nicely by philosopher David Chalmers as what he calls the Hard Problem of consciousness: How is it that we are subjects, and not just objects? Chalmers contrasts this hard question with what he calls the Easy Problem of consciousness: What are the neurobiological substrates underlying such things as wakefulness, alertness, attention, arousal, etc. Chalmers doesn't mean of course that the neurobiology of arousal is easy. He merely means to show that even if we can understand arousal from a neurobiological standpoint, we haven't yet solved the hard problem: the problem of subjective experience. Why am I an I, and not an it? Chalmers's point is a good one, and I think that it has a rather straightforward solution. First, some historical background is necessary. "What is consciousness?" is a modern question. It wasn't asked before the 17th century, because no one before Descartes thought that the mind was particularly mysterious. The problem of consciousness was created by moderns. The scholastic philosophers, following Aristotle and Aquinas, understood the soul as the animating principle of the body. In a human being, the powers of the soul -- intellect, will, memory, perception, appetite, and such -- were no more mysterious than the other powers of the soul, such as respiration, circulation, etc. Of course, biology in the Middle Ages wasn't as advanced as it is today, so there was much they didn't understand about human physiology, but in principle the mind was just another aspect of human biology, not inherently mysterious. In modern parlance, the scholastics saw the mind as the Easy Problem, no more intractable than understanding how breathing or circulation work.
Link ID: 22441 - Posted: 07.15.2016
Laura Sanders If you’ve ever watched a baby purse her lips to hoot for the first time, or flash a big, gummy grin when she sees you, or surprise herself by rolling over, you’ve glimpsed the developing brain in action. A baby’s brain constructs itself into something that controls the body, learns and connects socially. Spending time with an older person, you may notice signs of slippage. An elderly man might forget why he went into the kitchen, or fail to anticipate the cyclist crossing the road, or muddle medications with awkward and unfamiliar names. These are the signs of the gentle yet unrelenting neural erosion that comes with normal aging. These two seemingly distinct processes — development and aging — may actually be linked. Hidden in the brain-building process, some scientists now suspect, are the blueprints for the brain’s demise. The way the brain is built, recent research suggests, informs how it will decline in old age. That the end can be traced to the beginning sounds absurd: A sturdily constructed brain stays strong for decades. During childhood, neural pathways make connections in a carefully choreographed order. But in old age, this sequence plays in reverse, brain scans reveal. In both appearance and behavior, old brains seem to drift backward toward earlier stages of development. What’s more, some of the same cellular tools are involved in both processes. © Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22440 - Posted: 07.14.2016
Ramin Skibba Is Justin Bieber a musical genius or a talentless hack? What you 'belieb' depends on your cultural experiences. Some people like to listen to the Beatles, while others prefer Gregorian chants. When it comes to music, scientists find that nurture can trump nature. Musical preferences seem to be mainly shaped by a person’s cultural upbringing and experiences rather than biological factors, according to a study published on 13 July in Nature. “Our results show that there is a profound cultural difference” in the way people respond to consonant and dissonant sounds, says Josh McDermott, a cognitive scientist at the Massachusetts Institute of Technology in Cambridge and lead author of the paper. This suggests that other cultures hear the world differently, he adds. The study is one of the first to put an age-old argument to the test. Some scientists believe that the way people respond to music has a biological basis, because pitches that people often like have particular interval ratios. They argue that this would trump any cultural shaping of musical preferences, effectively making them a universal phenomenon. Ethnomusicologists and music composers, by contrast, think that such preferences are more a product of one’s culture. If a person’s upbringing shapes their preferences, then they are not a universal phenomenon. © 2016 Macmillan Publishers Limited
Jon Hamilton Letting mice watch Orson Welles movies may help scientists explain human consciousness. At least that's one premise of the Allen Brain Observatory, which launched Wednesday and lets anyone with an Internet connection study a mouse brain as it responds to visual information. "Think of it as a telescope, but a telescope that is looking at the brain," says Christof Koch, chief scientific officer of the Allen Institute for Brain Science, which created the observatory. The hope is that thousands of scientists and would-be scientists will look through that telescope and help solve one of the great mysteries of human consciousness, Koch says. "You look out at the world and there's a picture in your head," he says. "You see faces, you see your wife, you see something on TV." But how does the brain create those images from the chaotic stream of visual information it receives? "That's the mystery," Koch says. There's no easy way to study a person's brain as it makes sense of visual information. So the observatory has been gathering huge amounts of data on mice, which have a visual system that is very similar to the one found in people. The data come from mice that run on a wheel as still images and movies appear on a screen in front of them. For the mice, it's a lot like watching TV on a treadmill at the gym. But these mice have been genetically altered in a way that allows a computer to monitor the activity of about 18,000 neurons as they respond to different images. "We can look at those neurons and from that decode literally what goes through the mind of the mouse," Koch says. Those neurons were pretty active when the mice watched the first few minutes of Orson Welles' film noir classic Touch of Evil. The film is good for mouse experiments because "It's black and white and it has nice contrasts and it has a long shot without having many interruptions," Koch says. © 2016 npr
By Tanya Lewis In recent years, research on mammalian navigation has focused on the role of the hippocampus, a banana-shaped structure known to be integral to episodic memory and spatial information processing. The hippocampus’s primary output, a region called CA1, is known to be divided into superficial and deep layers. Now, using two-photon imaging in mice, researchers at Columbia University in New York have found these layers have distinct functions: superficial-layer neurons encode more-stable maps, whereas deep-layer brain cells better represent goal-oriented navigation, according to a study published last week (July 7) in Neuron. “There are lots of catalogued differences in sublayers of pyramidal cells” within the hippocampus, study coauthor Nathan Danielson of Columbia told The Scientist. “The question is, are the principal cells in each subregion doing the same thing? Or is there a finer level of granularity?” For the past few decades, scientists have been chipping away at an explanation of the brain’s “inner GPS.” The 2014 Nobel Prize in Physiology or Medicine honored the discovery of so-called place cells and grid cells in the hippocampus, which keep track of an individual’s location and coordinates in space, respectively. Since then, studies have revealed that neurons in different hippocampal regions have distinct genetic, anatomical, and physiological properties, said Attila Losonczy of Columbia, Danielson’s graduate advisor and a coauthor on the study. © 1986-2016 The Scientist
Keyword: Learning & Memory
Link ID: 22437 - Posted: 07.14.2016
Suzi Gage Ketamine hydrochloride is a synthetic dissociative anaesthetic. It was first synthesized in the 1960s for medical use, and was first used medicinally during the Vietnam war. Recreationally, it is usually consumed by snorting a white crystalline powder, and at lower doses than when it’s used as an anaesthetic. However, it can also be injected or smoked. It is used in a club setting, but also as a psychedelic.
Short-term effects
When ketamine is snorted, it gets into the bloodstream quickly, and intoxication effects occur soon after it’s taken. Although it’s an anaesthetic, at low doses it raises heart rate. It’s also associated with cognitive impairment during intoxication, including impairment of speech and executive function. It can also induce mild psychedelic effects such as perceptual changes and psychotic-like experiences, which are appealing to some users, but can also be distressing. At slightly higher doses, users can experience a dissociative state, where their mind feels separated from their body. This can also manifest as a feeling of depersonalization. At higher doses, the anaesthetic quality of ketamine becomes more pronounced. People may find it difficult to move and may feel numb, and can experience more vivid hallucinations. This is sometimes called the ‘k-hole’ by users. Amnesia can occur at this level of use. This is a particular danger of using ketamine recreationally: users are vulnerable to assault from others in this state, or can put themselves in danger by not being aware of their surroundings (for example, being unaware they are outside and it is cold can lead to hypothermia, or being unaware of surroundings could lead to walking into traffic). © 2016 Guardian News and Media Limited
Keyword: Drug Abuse
Link ID: 22436 - Posted: 07.14.2016
By Virginia Morell Infanticide—the killing of offspring—is generally rare among birds. And when it happens, it’s usually because of outsiders that want the nesting site or territory. But what happens among birds, such as the greater ani (Crotophaga major), which have a more socialist approach to nesting? Two to four pairs of the Central and South American cuckoos (which are usually unrelated) build a single nest, and then work together to raise their chicks, which generally hatch at the same time. Intriguingly, the adults cannot recognize either their own eggs or chicks, so they care for all of them. To find out why—and if the simultaneous hatching protects the chicks from infanticide—a scientist analyzed data on nestling mortality gathered at 104 communal greater ani nests from 2006 to 2015. Of the 741 nestlings, 321 (43%) fledged and 420 (57%) died. Most of the deaths (78.5%) were due to predation. But another 13.8%, or 58 nestlings, died from infanticide, the scientist reports online today in Evolution. The remaining 32 (7.7%) died from starvation. At most of the nests, the chicks hatched within 1 day of each other. Those that first emerged from their eggs were the most likely to be dispatched by one of the nest founders, not an outsider. Chicks that hatched last were also unlucky; weaker than their older and larger nest-mates, they weren’t able to compete for food and starved. Those two pressures—infanticide and food competition—end up favoring the chicks in the middle and those that hatch on the same day, the researcher reports. © 2016 American Association for the Advancement of Science
By Anahad O'Connor Like most of my work, this article would not have been possible without coffee. I’m never fully awake until I have had my morning cup of espresso. It makes me productive, energized and what I can only describe as mildly euphoric. But as one of the millions of caffeine-loving Americans who can measure out my life with coffee spoons (to paraphrase T.S. Eliot), I have often wondered: How does my coffee habit affect my health? The health community can’t quite agree on whether coffee is more potion or poison. The American Heart Association says the research on whether coffee causes heart disease is conflicting. The World Health Organization, which for years classified coffee as “possibly” carcinogenic, recently reversed itself, saying the evidence for a coffee-cancer link is “inadequate.” National dietary guidelines say that moderate coffee consumption may actually be good for you – even reducing chronic disease. Why is there so much conflicting evidence about coffee? The answer may be in our genes. About a decade ago, Ahmed El-Sohemy, a professor in the department of nutritional sciences at the University of Toronto, noticed the conflicting research on coffee and the widespread variation in how people respond to it. Some people avoid it because just one cup makes them jittery and anxious. Others can drink four cups of coffee and barely keep their eyes open. Some people thrive on it. Dr. El-Sohemy suspected that the relationship between coffee and heart disease might also vary from one individual to the next. And he zeroed in on one gene in particular, CYP1A2, which controls an enzyme – also called CYP1A2 – that determines how quickly our bodies break down caffeine. One variant of the gene causes the liver to metabolize caffeine very quickly. People who inherit two copies of the “fast” variant – one from each parent – are generally referred to as fast metabolizers.
Their bodies metabolize caffeine about four times more quickly than people who inherit one or more copies of the slow variant of the gene. These people are called slow metabolizers. © 2016 The New York Times Company
By Rebecca Brewer and Jennifer Murphy There is a persistent stereotype that people with autism are individuals who lack empathy and cannot understand emotion. It’s true that many people with autism don’t show emotion in ways that people without the condition would recognize. But the notion that people with autism generally lack empathy and cannot recognize feelings is wrong. Holding such a view can distort our perception of these individuals and possibly delay effective treatments. We became skeptical of this notion several years ago. In the course of our studies of social and emotional skills, some of our research volunteers with autism and their families mentioned to us that people with autism do display empathy. Many of these individuals said they experience typical, or even excessive, empathy at times. One of our volunteers, for example, described in detail his intense empathic reaction to his sister’s distress at a family funeral. Yet some of our volunteers with autism agreed that emotions and empathy are difficult for them. We were not willing to brush off this discrepancy with the ever-ready explanation that people with autism differ from one another. We wanted to explain the difference, rather than just recognize it. So we looked into the overlap between autism and alexithymia, a condition defined by a difficulty understanding and identifying one’s own emotions. People with high levels of alexithymia (which we assess with questionnaires) might suspect they are experiencing an emotion, but are unsure which emotion it is. They could be sad, angry, anxious or maybe just overheated. About 10 percent of the population at large — and about 50 percent of people with autism — has alexithymia. © 2016 Scientific American
By Tara Parker-Pope Hoping to alert parents to “red flags” that might signal autism, two advocacy groups yesterday launched a Web site, the ASD Video Glossary, that provides online glimpses of kids with autism to worried parents. But some experts fear the site, though well intentioned, also may cause anxiety among parents whose children are perfectly fine. The site contains videos that show subtle differences in how kids with autism speak, react, play and express themselves. The organizations behind it, Autism Speaks and First Signs, hope that parents who see resemblances in their own kids will be emboldened to seek early diagnosis and treatment, which many experts believe can improve outcomes for kids with autism. Visitors to the new site must register in order to watch the videos, and in the first two hours of its release, more than 10,000 people did so. Yet some researchers fear the video glossary is certain to be troubling for the parents of children without autism, too, because the behavior of kids without the condition can resemble that depicted in the videos. “Just as there’s a spectrum in autism…there’s a spectrum in normal development,” Dr. Michael Wasserman, a pediatrician at Ochsner Medical Center in New Orleans told the Associated Press. “Children don’t necessarily develop in a straight line.” But Amy Wetherby, a professor of communications disorders at Florida State University who helped create the site, said the videos would embolden parents to persist when doctors don’t listen to legitimate concerns about a child’s behavior. Sometimes “parents are the first to be concerned, and the doctors aren’t necessarily worried,” she told the Associated Press. “This will help give them terms to take to the doctor and say, ‘I’m worried about it.’” © 2016 The New York Times Company
Link ID: 22432 - Posted: 07.13.2016
Rachel Ehrenberg When mice have a stroke, their gut reaction can amp up brain damage. A series of new experiments reveals a surprising back-and-forth between the brain and the gut in the aftermath of a stroke. In mice, this dickering includes changes to the gut microbial population that ultimately lead to even more inflammation in the brain. There is much work to be done to determine whether the results apply to humans. But the research, published in the July 13 Journal of Neuroscience, hints that poop pills laden with healthy microbes could one day be part of post-stroke therapy. The work also highlights a connection between gut microbes and brain function that scientists are only just beginning to understand, says Ted Dinan of the Microbiome Institute at the University College Cork, Ireland. There’s growing evidence that gut microbes can influence how people experience stress or depression, for example (SN: 4/2/16, p. 23). “It’s a fascinating study,” says Dinan, who was not involved with the work. “It raises almost as many questions as it answers, which is what good studies do.” Following a stroke, the mouse gut becomes temporarily paralyzed, leading to a shift in the microbial community, neurologist Arthur Liesz of the Institute for Stroke and Dementia Research in Munich and colleagues found. This altered, less diverse microbial ecosystem appears to interact with immune system cells called T cells that reside in the gut. These T cells can either dampen inflammation or dial it up, leading to more damage, says Liesz. Whether the T cells further damage the brain after a stroke rather than soothe it seems to be determined by the immune system cells’ interaction with the gut microbes. © Society for Science & the Public 2000 - 2016.
Link ID: 22431 - Posted: 07.13.2016
Not much is definitively proven about consciousness, the awareness of one’s existence and surroundings, other than that it’s somehow linked to the brain. But theories as to how, exactly, grey matter generates consciousness are challenged when a fully conscious man is found to be missing most of his brain. Several years ago, a 44-year-old Frenchman went to the hospital complaining of mild weakness in his left leg. It was discovered then that his skull was filled largely by fluid, leaving just a thin perimeter of actual brain tissue. And yet the man was a married father of two and a civil servant with an IQ of 75, below average in his intelligence but not mentally disabled. Doctors believe the man’s brain slowly eroded over 30 years due to a buildup of fluid in the brain’s ventricles, a condition known as “hydrocephalus.” His hydrocephalus was treated with a shunt, which drains the fluid into the bloodstream, when he was an infant. But it was removed when he was 14 years old. Over the following decades, the fluid accumulated, leaving less and less space for his brain. While this may seem medically miraculous, it also poses a major challenge for cognitive psychologists, says Axel Cleeremans of the Université Libre de Bruxelles.
By Gretchen Reynolds To strengthen your mind, you may first want to exert your leg muscles, according to a sophisticated new experiment involving people, mice and monkeys. The study’s results suggest that long-term endurance exercise such as running can alter muscles in ways that then jump-start changes in the brain, helping to fortify learning and memory. I often have written about the benefits of exercise for the brain and, in particular, how, when lab rodents or other animals exercise, they create extra neurons in their brains, a process known as neurogenesis. These new cells then cluster in portions of the brain critical for thinking and recollection. Even more telling, other experiments have found that animals living in cages enlivened with colored toys, flavored varieties of water and other enrichments wind up showing greater neurogenesis than animals in drab, standard cages. But animals given access to running wheels, even if they don’t also have all of the toys and other party-cage extras, develop the most new brain cells of all. These experiments strongly suggest that while mental stimulation is important for brain health, physical stimulation is even more potent. But so far scientists have not teased out precisely how physical movement remakes the brain, although all agree that the process is bogglingly complex. Fascinated by that complexity, researchers at the National Institutes of Health recently began to wonder whether some of the necessary steps might be taking place far from the brain itself, and specifically, in the muscles, which are the body part most affected by exercise. Working muscles contract, burn fuel and pump out a wide variety of proteins and other substances. The N.I.H. researchers suspected that some of those substances migrated from the muscles into the bloodstream and then to the brain, where they most likely contributed to brain health. © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22429 - Posted: 07.13.2016
By Karen Weintraub Researchers at Stanford University have coaxed brain cells involved in vision to regrow and make functional connections—helping to upend the conventional dogma that mammalian brain cells, once damaged, can never be restored. The work was carried out in visually impaired mice but suggests that human maladies including glaucoma, Alzheimer’s disease and spinal cord injuries might be more repairable than has long been believed. Frogs, fish and chickens are known to regrow brain cells, and previous research has offered clues that it might be possible in mammals. The Stanford scientists say their new study confirms this and shows that, although fewer than 5 percent of the damaged retinal ganglion cells grew back, it was still enough to make a difference in the mice’s vision. “The brain is very good at coping with deprived inputs,” says Andrew Huberman, the Stanford neurobiologist who led the work. “The study also supports the idea that we may not need to regenerate every neuron in a system to get meaningful recovery.” Other researchers praised the study, published Monday in Nature Neuroscience. “I think it’s a significant step forward toward getting to the point where we really can regenerate optic nerves,” says Don Zack, a professor of ophthalmology at Johns Hopkins University who was not involved in the research. He calls it “one more indication that it may be possible to bring that ability back in humans.” © 2016 Scientific American
By Michael Price The blind comic book star Daredevil has a highly developed sense of hearing that allows him to “see” his environment with his ears. But you don’t need to be a superhero to pull a similar stunt, according to a new study. Researchers have identified the neural architecture used by the brain to turn subtle sounds into a mind’s-eye map of your surroundings. The study appears to be “very solid work,” says Lore Thaler, a psychologist at Durham University in the United Kingdom who studies echolocation, the ability of bats and other animals to use sound to locate objects. Everyone has an instinctive sense of the world around them—even if they can’t always see it, says Santani Teng, a postdoctoral researcher at the Massachusetts Institute of Technology (MIT) in Cambridge who studies auditory perception in both blind and sighted people. “We all kind of have that intuition,” says Teng over the phone. “For instance, you can tell I’m not in a gymnasium right now. I’m in a smaller space, like an office.” That office belongs to Aude Oliva, principal research scientist for MIT’s Computational Perception & Cognition laboratory. She and Teng, along with two other colleagues, wanted to quantify how well people can use sounds to judge the size of the room around them, and whether that ability could be detected in the brain. © 2016 American Association for the Advancement of Science.
Link ID: 22427 - Posted: 07.12.2016
By Maggie Koerth-Baker When former Tennessee women’s basketball coach Pat Summitt died Tuesday morning, news outlets, including ESPN, reported the cause of her death as “early-onset dementia, Alzheimer’s type.” That’s more than just a long-winded way of saying “Alzheimer’s.” By using five words instead of one, journalists were trying to point a big, flashing neon arrow at the complex realities of dementia. Dementia is more of a symptom than a diagnosis, and it can be caused by a number of different diseases. Even Alzheimer’s, the most common type of dementia, doesn’t seem to have a single cause. Instead, what ties Summitt to millions of other Alzheimer’s patients all over the world is the physical damage it wrought in her brain. Worldwide, 47.5 million people are living with some kind of dementia. Alzheimer’s represents 60 percent to 70 percent of those cases. Imagine a map of a city — roads branching out, intersecting with other roads, creating a network that allows mail to be delivered, food to be sold and brought home, people to get to their jobs. What would happen to that town if random intersections were suddenly barricaded and impassable? That’s the dystopian chaos Alzheimer’s causes, as damaged proteins clog the neurons and inhibit the flow of information from one neuron to another. Cut off from food, as well as data, the cells die. The brain shrinks. Eventually, the person dies, too. Afterward, doctors can cut into their brain and see the barriers, which are called plaques.
Link ID: 22426 - Posted: 07.12.2016