Chapter 2. Cells and Structures: The Anatomy of the Nervous System
At the Society for Neuroscience meeting earlier this month in San Diego, California, Science sat down with Geoffrey Ling, deputy director of the Defense Sciences Office at the Defense Advanced Research Projects Agency (DARPA), to discuss the agency’s plans for the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, a neuroscience research effort put forth by President Barack Obama earlier this year. So far, DARPA has released two calls for grant applications, with at least one more likely. The first, called SUBNETS (Systems-Based Neurotechnology for Emerging Therapies), asks researchers to develop novel wireless devices, such as deep brain stimulators, that can cure neurological disorders such as posttraumatic stress (PTS), major depression, and chronic pain. The second, RAM (Restoring Active Memory), calls for a separate wireless device that repairs brain damage and reverses memory loss. Below is an extended version of a Q&A that appears in the 29 November issue of Science.

Q: Why did DARPA get involved in the BRAIN project?

G.L.: It’s really focused on our injured warfighters, but it has a use for civilians who have stress disorders and civilians who also have memory disorders from dementia and the like. But at the end of the day, it is still meeting [President Obama’s] directive. Of all the things he could have chosen—global warming, alternative fuels—he chose this, so in my mind the neuroscience community should be as excited as all get-up.

Q: Why does SUBNETS focus on deep brain stimulation (DBS)?

G.L.: We’ve opened the possibility of using DBS, but we haven’t exclusively said that. We’re challenging people to go after neuropsychiatric disorders like PTS [and] depression. We’re challenging the community to come up with something in 5 years that’s clinically feasible.
DBS is an area that has really been traditionally underfunded, so we thought what the heck, let’s give it a go—in this new BRAIN Initiative the whole idea is to go after the things that there aren’t 400 R01 grants for—and let’s be bold, and boy, if it works, fabulous. © 2013 American Association for the Advancement of Science
By Dwayne Godwin and Jorge Cham Dwayne Godwin is a neuroscientist at the Wake Forest University School of Medicine. His Twitter handle is @brainyacts. Jorge Cham draws the comic strip Piled Higher and Deeper at www.phdcomics.com. © 2013 Scientific American
Keyword: Brain imaging
Link ID: 18980 - Posted: 11.30.2013
By Neuroskeptic

Claims that children with autism have abnormal brain white-matter connections may just reflect the fact that they move about more during their MRI scans. So say a team of Harvard and MIT neuroscientists, including Nancy “Voodoo Correlations” Kanwisher, in a new paper: Spurious group differences due to head motion in a diffusion MRI study.

Essentially, the authors show how head movement during a diffusion tensor imaging (DTI) scan causes apparent differences in the integrity of white-matter tracts. In comparisons of two randomized groups of healthy children – in whom no white-matter differences ought to appear – spurious effects were seen whenever one group moved more than the other.

As for autism, the authors found that kids with autism moved more, on average, than controls, and that matching the two groups by motion reduced the magnitude of the group differences in white matter (though many remained significant). Technically, the motion-related differences manifested as increases in radial diffusivity (RD) and reductions in fractional anisotropy (FA), and they were localized: The pathways that exhibited the most substantial motion-induced group differences in our data were the corpus callosum and the cingulum bundle. Perhaps this is related to their proximity to non-brain voxels (such as the ventricles) … deeper brain areas appear to be more affected than more superficial ones, thus distance from the head coils may also be a factor.

The good news is that there’s a simple fix: entering the motion parameters, extracted from the DTI data itself, as a covariate in the analysis. The authors show that this is extremely effective. The bad news is that most researchers don’t do this.
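The covariate fix can be sketched in a few lines. This is a toy illustration, not the authors' actual pipeline: the data are simulated so that the entire apparent group difference in fractional anisotropy (FA) is driven by head motion, and adding a per-subject motion summary as a regressor shrinks the spurious group effect toward zero.

```python
import numpy as np

# Toy data (all numbers invented): one group moves more on average,
# and FA depends only on motion, not on group membership.
rng = np.random.default_rng(0)
n = 100
group = np.repeat([0, 1], n)                               # 0 = control, 1 = patient
motion = 0.5 + 0.3 * group + 0.05 * rng.standard_normal(2 * n)
fa = 0.45 - 0.10 * motion + 0.01 * rng.standard_normal(2 * n)

# Naive model: FA ~ group. The group coefficient absorbs the motion difference.
X_naive = np.column_stack([np.ones(2 * n), group])
b_naive, *_ = np.linalg.lstsq(X_naive, fa, rcond=None)

# Corrected model: FA ~ group + motion, i.e. motion entered as a covariate.
X_cov = np.column_stack([np.ones(2 * n), group, motion])
b_cov, *_ = np.linalg.lstsq(X_cov, fa, rcond=None)

print(f"group effect without motion covariate: {b_naive[1]:+.4f}")
print(f"group effect with motion covariate:    {b_cov[1]:+.4f}")
```

With the covariate included, the estimated group effect collapses, mirroring the paper's point that the "difference" was an artifact of differential motion.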
Peter Hildebrand

Neuroscience is a rapidly growing field, but one that is usually thought to be too complex and expensive for average Americans to participate in directly. Now, an explosion of cheap scientific devices and online tutorials is on the verge of changing that. This change could have exciting implications for our future understanding of the brain.

From 1995 to 2005, the amount of money spent on neuroscience research doubled. A lot of that research used medical devices, like MRI and CT scan machines, and drugs that everyday citizens don’t have access to. Even in colleges, experience with powerful research equipment is reserved for upperclassmen and graduate students. The lowlier castes can work with models or dissect animal brains, but as scientist and engineer Greg Gage points out in this TED video, the brain isn’t like the heart or the lungs. You can’t tell how it works just by looking at it. Gage is calling for a “neuro-revolution,” in which scientists and inventors come together to put the tools for learning neuroscience into the hands of the public. He may be onto something, too, because those tools are looking more accessible than ever before.

One of the best-publicized examples of this punk-rock revolution has been Gage’s own “SpikerBox,” which he co-developed with Tim Marzullo. Roughly the size of your fist, the SpikerBox is a small collection of electronic components bolted between two squares of orange plastic. Coming out of one end are two pins that you can use to record the electrical activity of nerve cells in, say, a recently severed cockroach leg. There’s also a port that allows you to attach the box to a smartphone or tablet, and watch the spikes of activity as the neurons are stimulated. © 2013 Salon Media Group, Inc.
Keyword: Brain imaging
Link ID: 18976 - Posted: 11.26.2013
Ian Sample, science correspondent in San Diego Criminal courts in the United States are facing a surge in the number of defendants arguing that their brains were to blame for their crimes and relying on questionable scans and other controversial, unproven neuroscience, a legal expert who has advised the president has warned. Nita Farahany, a professor of law who sits on Barack Obama's bioethics advisory panel, told a Society for Neuroscience meeting in San Diego that those on trial were mounting ever more sophisticated defences that drew on neurological evidence in an effort to show they were not fully responsible for murderous or other criminal actions. Lawyers typically drew on brain scans and neuropsychological tests to reduce defendants' sentences, but in a substantial number of cases the evidence was used to try to clear defendants of all culpability. "What is novel is the use by criminal defendants to say, essentially, that my brain made me do it," Farahany said following an analysis of more than 1,500 judicial opinions from 2005 to 2012. The rise of so-called neurolaw cases has caused serious concerns in the country where brain science first appeared in murder cases. The supreme court has begun a review of how such evidence can be used in criminal cases. But legal and scientific experts nevertheless foresee the trend spreading to other countries, including the UK, and Farahany said she was expanding her work abroad. The survey even found cases where defendants had used neuroscience to argue that their confessions should be struck out because they were not competent to provide them. "When people introduce this evidence for competency, it has actually been relatively successful," Farahany said. © 2013 Guardian News and Media Limited
Kenneth S. Kosik

Twenty years of research and more than US$1-billion worth of clinical trials have failed to yield an effective drug treatment for Alzheimer's disease. Most neuroscientists, clinicians and drug developers now agree that people at risk of the condition will probably need to receive medication before the onset of any cognitive symptoms. Yet a major stumbling block for early intervention is the absence of tools that can reveal the first expression of the insidious disease.

So far, researchers have tended to focus on macroscopic changes associated with the disease, such as the build-up of insoluble plaques of protein in certain areas of the brain, or on individual genes or molecular pathways that seem to be involved in disease progression. I contend that detecting the first disruptions to brain circuitry, and tracking the anatomical and physiological damage underlying the steady cognitive decline that is symptomatic of Alzheimer's, will require tools that operate at the 'mesoscopic' scale: techniques that probe the activity of thousands or millions of networked neurons. Although such tools are yet to be realized, several existing technologies indicate that they are within reach.

Charted territory

All the current approaches that are used to diagnose Alzheimer's are crude and unreliable. Take the classic biomarkers of the disease: a build-up of plaques of the protein β-amyloid in a person's cerebral cortex, for instance, or elevated levels of the tau protein and dampened levels of β-amyloid in their cerebrospinal fluid. Although such markers are predictive of the disease, the interval between their appearance and the onset of cognitive problems is hugely variable, ranging from months to decades. © 2013 Nature Publishing Group
Virginia Gewin Corey White felt pretty fortunate during his job search late last year. Over the course of 4 months, he found at least 25 posts to apply for — even after he had filtered the possibilities to places where his wife also had job prospects. Competition for the jobs was, as he expected, fierce, but he secured three interviews. In the end, he says, it was his skills in functional magnetic resonance imaging (fMRI) that helped him to clinch a post at Syracuse University in New York, where they were eager to elevate their neuroscience profile. The human brain is something of an enigma. Much is known about its physical structure, but quite how it manages to marshal its myriad components into a powerhouse capable of performing so many different tasks remains a mystery. Neuroimaging offers one way to help find out, and universities and government initiatives are betting on it. Already, an increasing number of universities across the United States and Europe are buying scanners dedicated to neuroimaging — a clear signal that the area is set for growth. “Institutions feel an imperative to develop an imaging programme because everybody's got to have one to be competitive,” says Mark Cohen, an imaging pioneer at the Semel Institute for Neuroscience and Human Behavior at the University of California, Los Angeles. At the same time, a slew of major projects focusing on various aspects of the brain is seeking to paint the most comprehensive picture yet of the organ's organizing principles — from genes to high-level cognition. As a result, young scientists with computational expertise, a fluency in multiple imaging techniques and a willingness to engage in interdisciplinary collaborations could readily carve out a career in this dynamic landscape. © 2013 Nature Publishing Group
Keyword: Brain imaging
Link ID: 18894 - Posted: 11.08.2013
Helen Shen A mixture of excitement, hope and anxiety made for an electric atmosphere in the crowded hotel ballroom. On a Monday morning in early May, neuroscientists, physicists and engineers packed the room in Arlington, Virginia, to its 150-person capacity, while hundreds more followed by webcast. Only a month earlier, US President Barack Obama had unveiled the neuroscience equivalent of a Moon shot: a far-reaching programme that could rival Europe's 10-year, €1-billion (US$1.3-billion) Human Brain Project (see page 5). The US Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative would develop a host of tools to study brain activity, the president promised, and lead to huge breakthroughs in understanding the mind. But Obama's vague announcement on 2 April had left out key details, such as what the initiative's specific goals would be and how it would be implemented. So at their first opportunity — a workshop convened on 6 May by the National Science Foundation (NSF) and the Kavli Foundation of Oxnard, California — researchers from across the neuroscience spectrum swarmed to fill in the blanks and advocate for their favourite causes. The result was chaotic, acknowledges Van Wedeen, a neurobiologist at Harvard Medical School in Boston, Massachusetts, and one of the workshop's organizers. Everyone was afraid of being left out of 'the next big thing' in neuroscience — even though no one knew exactly what that might be. “The belief is we're ready for a leap forward,” says Wedeen. “Which leap, and in which direction, is still being debated.” © 2013 Nature Publishing Group
From supercomputing to imaging, technologies have developed far enough that it is now possible for us to imagine a day when we will understand the murky workings of our most complex organ: the brain. True, that day remains distant, but scientists are no longer considered crazy if they report a glimpse of it on the horizon. This turning point has been marked by the independent launches this year of two major brain projects: US President Barack Obama’s Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative and the European Commission’s Human Brain Project. Even if they fail to achieve the ambitions the research community sets for them, they are signals of a new confidence. Right now, the two projects are not equal. The BRAIN Initiative is in an early phase of development, and has so far been promised little new money. The impetus behind it was a brash proposal by a group of neuroscientists for a billion-dollar project to measure the activity of every neuron in the human brain. That ambition was lost on the starting block when peers, justifiably, deemed it scientifically inappropriate — but it is yet to be replaced by a single goal of equivalently Apollo-programme proportions (see page 26). This may make it hard to maintain the political support large projects always need. Conversely, the Human Brain Project — headquartered in Switzerland, where it will soon relocate from Lausanne to its new base in Geneva — has 135 partner institutes and is blessed with a plenitude of money and planning. And it has a romantic Moon-landing-level goal: to simulate the human brain in a computer within ten years, and provide it to scientists as a research resource. Programme leaders have committed €72 million (US$97 million) to the 30-month ramp-up stage; those monies started to flow into labs after the project’s launch last month. The project has a detailed ten-year road map, laden with explicit milestones. © 2013 Nature Publishing Group
Keyword: Brain imaging
Link ID: 18892 - Posted: 11.08.2013
By KATE MURPHY Whether it’s hitting a golf ball, playing the piano or speaking a foreign language, becoming really good at something requires practice. Repetition creates neural pathways in the brain, so the behavior eventually becomes more automatic and outside distractions have less impact. It’s called being in the zone. But what if you could establish the neural pathways that lead to virtuosity more quickly? That is the promise of transcranial direct current stimulation, or tDCS — the passage of very low-level electrical current through targeted areas of the brain. Several studies conducted in medical and military settings indicate tDCS may bring improvements in cognitive function, motor skills and mood. Some experts suggest that tDCS might be useful in the rehabilitation of patients suffering from neurological and psychological disorders, perhaps even in reducing the time and expense of training healthy people to master a skill. But the research is preliminary, and now there is concern about a growing do-it-yourself community, many of them video gamers, who are making tDCS devices with nine-volt batteries to essentially jump-start their brains. “If tDCS is powerful enough to do good, you have to wonder if, done incorrectly, it could cause harm,” said Dr. H. Branch Coslett, chief of the cognitive neurology section at the University of Pennsylvania School of Medicine and a co-author of studies showing that tDCS improves recall of proper names, fosters creativity and improves reading efficiency. Even the tDCS units used in research are often little more than a nine-volt battery with two electrodes and a controller for setting the current and the duration of the session. Several YouTube videos show how to make a rough facsimile. © 2013 The New York Times Company
Link ID: 18848 - Posted: 10.29.2013
Kerri Smith Jack Gallant perches on the edge of a swivel chair in his lab at the University of California, Berkeley, fixated on the screen of a computer that is trying to decode someone's thoughts. On the left-hand side of the screen is a reel of film clips that Gallant showed to a study participant during a brain scan. And on the right side of the screen, the computer program uses only the details of that scan to guess what the participant was watching at the time. Anne Hathaway's face appears in a clip from the film Bride Wars, engaged in heated conversation with Kate Hudson. The algorithm confidently labels them with the words 'woman' and 'talk', in large type. Another clip appears — an underwater scene from a wildlife documentary. The program struggles, and eventually offers 'whale' and 'swim' in a small, tentative font. “This is a manatee, but it doesn't know what that is,” says Gallant, talking about the program as one might a recalcitrant student. They had trained the program, he explains, by showing it patterns of brain activity elicited by a range of images and film clips. His program had encountered large aquatic mammals before, but never a manatee. Groups around the world are using techniques like these to try to decode brain scans and decipher what people are seeing, hearing and feeling, as well as what they remember or even dream about. © 2013 Nature Publishing Group
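The train-then-guess loop described above can be caricatured in a few lines. This sketch is purely illustrative: made-up "voxel" patterns stand in for brain activity, and a correlation-based nearest-prototype rule stands in for Gallant's far more sophisticated encoding models. It does, however, show why the program fails on a manatee: a label it has never been trained on simply is not in its vocabulary.

```python
import numpy as np

# Invented training data: one prototype activity pattern (50 "voxels")
# per label the program has previously been shown.
rng = np.random.default_rng(1)
labels = ["woman", "talk", "whale", "swim"]
prototypes = {lab: rng.standard_normal(50) for lab in labels}

def decode(pattern, prototypes):
    """Guess the label whose stored prototype correlates best with the pattern."""
    return max(prototypes,
               key=lambda lab: np.corrcoef(pattern, prototypes[lab])[0, 1])

# A noisy "scan" evoked by a whale clip: the decoder can only answer with
# labels it was trained on, so a manatee would come out as "whale" too.
scan = prototypes["whale"] + 0.3 * rng.standard_normal(50)
print(decode(scan, prototypes))
```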
By Gary Stix The Obama administration’s neuroscience initiative highlights new technologies to better understand the workings of brain circuits on both a small and large scale. Various creatures, from roundworms to mice, will be centerpieces of that program because the human brain is too complex—and the ethical issues too intricate—to start analyzing the actual human organ in any meaningful way. But what if there were already a means to figure out how the brain wires itself up and, in turn, to use this knowledge to study what happens in various neurological disorders of early life? Reports in scientific journals have started to trickle in on the way stem cells can spontaneously organize themselves into complex brain tissue—what some researchers have dubbed mini-brains. Christopher A. Walsh, Bullard Professor of pediatrics and neurology at Harvard Medical School, talked to Scientific American about the importance of just such work for understanding brain development and neurological disease. (Also, check out the Perspective Walsh did for Science on this topic, along with Byoung-il Bae.) In order to be able to understand the way the brain solves this tremendously complex problem of wiring itself up, we need to be able to study it rigorously in the laboratory. We need some sort of model. We can’t just take humans and put them under the microscope, so we have to find some way of modeling the brain. The mouse has been tremendously useful for understanding brain wiring and how cells in the brain form. And the mouse will continue to be very useful. The mouse is particularly useful in studying cellular effects of particular genes, but, as we get smarter and smarter about what the problems are, we’re increasingly able to think, not about things that we share with mice, but the differences that distinguish us from mice. © 2013 Scientific American
Special Note to Teachers: The content of the following lesson plans compares the “normal” brain to a “zombie” brain. Zombies are not real, but there are plenty of diseases that affect real people, and students may have people in their lives who have suffered because of them. The following lessons about neuroscience have been inspired by the book “The Zombie Autopsies,” written by Steven C. Schlozman, M.D., and are intended to complement it. “The Zombie Autopsies” was inspired by George Romero’s 1968 cult-classic horror film “Night of the Living Dead”. These original lessons build upon each other and have an accompanying plot line in which the world is fighting a zombie apocalypse and the best and brightest young people are being trained as medical students – with a specialty in neuroscience – in the hope that they will be able to provide a cure to this terrible epidemic and save humanity. For a richer experience, have the students read the book in class and as homework (see suggested reading schedule) along with the class activities. Although the materials are organized as a unit, lessons can be used as stand-alone activities or can be shaped to fit the needs of you and your students regarding time and content. For example, Lesson 3 is perfect for the day of Halloween. © 2013 MacNeil-Lehrer Productions
Keyword: Learning & Memory
Link ID: 18824 - Posted: 10.23.2013
By Sandra G. Boodman

Janet Ruddock was crushed: She had dreamed of greeting her first grandchild, and now that once-in-a-lifetime experience had been marred by the embarrassing problem that had derailed her life for nearly a decade. In June 2010, Ruddock, then 59, and her husband had flown to Vancouver, B.C., from Washington to meet their new grandson. But soon after they arrived, Ruddock’s intractable sweating went into overdrive. As she sat in a rocking chair, perspiration drenched her head and upper body, soaking her shirt and dripping onto the 4-week-old infant.

“I burst into tears,” Ruddock recalled. “All I can remember is the feeling that I’m wet, this poor baby’s wet and a moment you should always remember is ruined. You’re never going to get it back.”

For Ruddock, that event precipitated a suicidal depression. For the previous eight years she had undergone tests, taken drugs and endured the bafflement — and skepticism — of a parade of doctors she consulted about the extreme, unpredictable sweating that engulfed her head and upper body. After confiding her despair to a relative, she began seeing a psychiatrist. By chance, a few months later she learned about a woman whose experience mirrored her own and provided her a much-needed road map.

“It’s a fascinoma,” said retired Washington internist Charles Abrams, using the medical slang for an unusual — or unusually interesting — case. “You usually hate for patients to come in and say, ‘I found this on the Internet,’ ” said Abrams, who treated Ruddock until his retirement last year. “But every once in a while, something is brought to your attention.” © 1996-2013 The Washington Post
Link ID: 18787 - Posted: 10.15.2013
By WILLIAM J. BROAD SCIENCE has looked into some strange things over the centuries — reports of gargantuan sea monsters, purported images of Jesus, sightings of alien spaceships and so on. When I first heard of spontaneous orgasm, while researching a book on yoga, including its libidinal cousin, tantra, I figured it was more allegory than reality and in any event would prove beyond the reach of even the boldest investigators. Well, I was wrong. It turns out science has tiptoed around the subject for more than a century and of late has made considerable progress in determining not only the neurophysiological basis of the phenomenon but also its prevalence. Men are mentioned occasionally. But sex researchers have found that the novel type of autoerotism shows up mainly in women. Ground zero for the research is Rutgers University, where scientists have repeatedly had female volunteers put their heads into giant machines and focus their attention on erotic fantasies — the scans reveal that the pleasure centers of their brains light up in ways indistinguishable from everyday orgasms. The lab atmosphere is no-nonsense, with plenty of lights and white coats and computer monitors. Subjects often thrash about so forcefully that obtaining clear images of their brains can be difficult. “Head movement is a huge issue,” Nan Wise, a doctoral candidate at Rutgers who helps run the project, said in an interview. “It’s hard to get a decent signal.” She said a volunteer’s moving her head more than two millimeters — less than a 10th of an inch — can make for a bad day in the lab. It is easy to dismiss this as a new kind of narcissism in search of scientific respectability, a kinky pleasure coming out of the shadows. Many YouTube videos now purport to show people using controlled breathing and erotic introspection to achieve what they describe as “thinking off” and “energy orgasms.” © 2013 The New York Times Company
By Neuroskeptic

The comparative anatomy of male and female brains is an incredibly popular topic. From teachers to cartoonists, everyone’s interested in it. One supposed dude-dame dimorphism is the width of the corpus callosum, the white-matter bridge that connects the brain’s left and right hemispheres. Some studies suggest that women have a larger corpus callosum, relative to overall brain size, than men. This has led to a lot of speculation about how females, with their more ‘interconnected’ brains, are therefore better at things like multitasking: The corpus callosum is 30 percent more highly developed in the female brain… allowing information to flow more easily from one side of the brain to the other, which allows a woman to focus on more than one thing at a time.

However, according to Eileen Luders and colleagues, that’s all a wash, because: Differences in Brain Volume Account for Apparent Sex Differences in Callosal Anatomy. It’s been argued that women’s relatively larger corpus callosa may reflect the fact that men have larger brains, on average, and that the corpus callosum is relatively smaller in larger brains. In other words, the corpus callosum difference might be a side-effect of the true gender difference (perhaps the only one) – bigger male brains overall. Luders et al. confirmed this with a clever technique: they looked in a large online brain database to find some extremely small male brains, and extremely large female ones. Thus, the two genders were matched on total size.
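The size-matching idea is easy to sketch. The numbers below are invented round figures, and restricting both sexes to a common-volume window is one simple way to match; it is not necessarily Luders et al.'s exact selection procedure. The point is just that within the matched subsamples, the overall volume difference that could confound callosal comparisons largely disappears.

```python
import numpy as np

# Invented, roughly plausible total brain volumes (cm^3) for 500 men and 500 women.
rng = np.random.default_rng(2)
male_vol = rng.normal(1260, 100, 500)
female_vol = rng.normal(1130, 100, 500)

# Match on total size: keep only brains inside a shared volume window,
# i.e. relatively small male brains and relatively large female brains.
lo, hi = 1100, 1250
matched_m = male_vol[(male_vol >= lo) & (male_vol <= hi)]
matched_f = female_vol[(female_vol >= lo) & (female_vol <= hi)]

gap_raw = male_vol.mean() - female_vol.mean()
gap_matched = matched_m.mean() - matched_f.mean()
print(f"unmatched volume gap: {gap_raw:.0f} cm^3")
print(f"matched volume gap:   {gap_matched:.0f} cm^3")
```

Any remaining callosal difference in the matched subsamples can then be attributed to sex rather than to overall brain size.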
By Fritz Andersen

It was hot that Sunday morning in February 2011 in Old San Juan. I had just retired after 40 years of cardiology practice in the suburbs of Washington, and my wife and I were spending the winter in Puerto Rico. A couple of friends had arrived by cruise ship, and I took them to see the 450-year-old Spanish fortress that sits above the entrance of the harbor. The fortress walls radiated heat, and after reentering the city we walked to our home for a breather and a refreshing ceiling fan. While sitting in the kitchen and sipping a beer, I suddenly passed out. I woke up a bit dizzy and confused; my friend, an internist from Arlington, told me I had had a grand mal seizure.

My wife, Carmen Alicia, called a local friend, also a cardiologist, who sent us to a nearby hospital; there, an MRI exam revealed a small spot on my brain. The neurologist felt it needed to be biopsied to obtain a tissue diagnosis. I immediately returned to Virginia and went to several specialists, who suggested further testing before I decided to have an invasive brain biopsy. I also had a blood test for cysticercosis, an infection that results from eating undercooked pork contaminated with Taenia solium. This common parasite produces cysts all over the body, including the brain. It is the most common cause of seizures in many countries, particularly in India, where children with seizures are treated for this disease even before other studies are done. My blood test was strongly positive. I started a course of oral medicine to treat it. The test reassured me. Unfortunately, my spot grew a bit over the course of three months, reaching the size of a grape. A biopsy and excision were now indicated. © 1996-2013 The Washington Post
Keyword: Genes & Behavior
Link ID: 18692 - Posted: 09.24.2013
Joseph Brean

U.S. President Barack Obama’s much-hyped BRAIN initiative to crack the mysteries of consciousness via a finely detailed map of the brain in action took its first big step this week, with the release of a strategy report that foresees “revolutionary advances” in the $100-million effort to “crack the brain’s code,” perhaps in as little as “a few years.” “We stand on the verge of a great journey into the unknown,” the report says, explicitly comparing BRAIN to the Apollo moon shot, and predicting it will “change human society forever.”

As a grand challenge, Apollo was an unambiguous success, despite the vast expense and human costs, but there is a growing sense among scientists, if not legacy-minded politicians, that the road ahead for modern neuroscience will be pocked with disappointment, with more impenetrable mysteries than solvable problems. As the world approaches what some are calling “peak neuro,” after three decades of over-hyped “brain porn,” the optimistic hope is that Mr. Obama’s BRAIN project will lead to a detailed and dynamic map of the brain, and thus reveal both how it works and how it fails in such diseases as Alzheimer’s or autism. The pessimistic fear, however, is that the “speed of thought,” as Mr. Obama described it, is just too quick for our current brain imaging technologies, primarily functional magnetic resonance imaging (fMRI). As the anonymous blogger Neuroskeptic, a British brain scientist who tracks the misinterpretation of brain scan studies by both scientists and media, put it in an email, “there’s just as much hype and misrepresentation as ever.” The more we learn about the brain, the less we seem to know. With its potential overstated and its aspirations presented as foregone conclusions, the relatively new field of neuroscience is in a period of self-reflection, said Jackie Sullivan, a philosopher of neuroscience at Western University in London, Ont.
“The vast majority of neuroscientists are well aware that the goals going forward need to be more modest,” she said. © 2013 National Post
by Andy Coghlan The two major brain abnormalities that underlie Alzheimer's disease can now be viewed simultaneously in brain scans while people are still alive, providing new insight into how the disease develops and whether drugs are working. The breakthrough comes from the development of a harmless tracer chemical that is injected into the bloodstream and accumulates exclusively in "tau tangles" – one type of abnormality that occurs in the brains of people with Alzheimer's and other kinds of dementia. Fluorescent light emitted from the chemical is picked up using positron emission tomography (PET), showing exactly where the tangles are. The tracer remains in the brain for a few hours before being broken down and expelled from the body. Similar tracers already exist for beta amyloid plaques, the other major anatomical feature of Alzheimer's, so the one for tau tangles completes the picture. "This is a big step forward," says John Hardy, an Alzheimer's researcher at University College London. "This is of critical significance, as tau lesions are known to be more intimately associated with neuronal loss than plaques," says Makoto Higuchi of the National Institute of Radiological Sciences in Chiba, Japan, and head of the team who developed the new tracer. The tracer could help researchers unravel exactly how Alzheimer's develops, and enable earlier diagnosis and monitoring of treatments. © Copyright Reed Business Information Ltd.
Posted by Gary Marcus On Monday, the National Institutes of Health released a fifty-eight-page report on the future of neuroscience—the first substantive step in developing President Obama’s BRAIN Initiative, which seeks to “revolutionize our understanding of the human mind and uncover new ways to treat, prevent, and cure brain disorders like Alzheimer’s, schizophrenia, autism, epilepsy, and traumatic brain injury.” Assembled by an advisory panel of fifteen scientists led by Cori Bargmann, of Rockefeller University, and William Newsome, of Stanford, the report assesses the state of neuroscience and offers a vision for the field’s future. The core challenge, as the report puts it, is simply that “brains—even small ones—are dauntingly complex”: Information flows in parallel through many different circuits at once; different components of a single functional circuit may be distributed across many brain structures and be spatially intermixed with the components of other circuits; feedback signals from higher levels constantly modulate the activity within any given circuit; and neuromodulatory chemicals can rapidly alter the effective wiring of any circuit. To tackle the brain’s immense complexity, the report outlines nine goals for the initiative. No effort to study the brain is likely to succeed without devoting serious attention to all nine, which range from creating structural maps of its static, physical connections to developing new ways of recording continuous, dynamic activity as it perceives the world and directs action. A less flashy, equally critical goal is to create a “census” of the brain’s basic cell types, which neuroscientists haven’t yet established. (The committee also devotes attention to ethical questions that could arise, such as what should happen if neural enhancement—the use of engineering to alter the brain—becomes a realistic possibility.) © 2013 Condé Nast.
Keyword: Brain imaging
Link ID: 18668 - Posted: 09.18.2013