Chapter 1. Biological Psychology: Scope and Outlook
by Helen Thompson Earth’s magnetic field guides shark movement in the open ocean, but scientists had always suspected that sharks might also get their directions from an array of other factors, including smell. To sniff out smell’s role, biologists clogged the noses of leopard sharks (Triakis semifasciata), a Pacific coastal species that makes foraging trips out to deeper waters. Researchers released the sharks out at sea and tracked their path back to the California coast over four hours. Sharks with an impaired sense of smell only made it 37.2 percent of the way back to shore, while unimpaired sharks made it 62.6 percent of the way back to shore. The study provides the first experimental evidence that smell influences a shark’s sense of direction, the team writes January 6 in PLOS ONE. The animals may be picking up on chemical gradients produced by food sources that live on the coast. © Society for Science & the Public 2000 - 2015.
By R. Douglas Fields We all heard the warning as kids: “That TV will rot your brain!” You may even find yourself repeating the threat when you see young eyes glued to the tube instead of exploring the real world. The parental scolding dates back to the black-and-white days of I Love Lucy, and today concern is growing amid a flood of video streaming on portable devices. But are young minds really being harmed? With brain imaging, the effects of regular TV viewing on a child's neural circuits are plain to see. Studies suggest watching television for prolonged periods changes the anatomical structure of a child's brain and lowers verbal abilities. Behaviorally, even more detrimental effects may exist: although a cause-and-effect relation is hard to prove, higher rates of antisocial behavior, obesity and mental health problems correlate with hours in front of the set. Now a new study hits the pause button on this line of thinking. The researchers conclude that the entire body of research up to now has overlooked an important confounding variable, heredity, that could call into question the conventional wisdom that TV is bad for the brain. Further study will be needed to evaluate this claim, but the combined evidence suggests we need a more nuanced attitude toward our viewing habits. To understand the argument against television, we should rewind to 2013, when a team of researchers at Tohoku University in Japan, led by neuroscientist Hikaru Takeuchi, first published findings from a study in which the brains of 290 children between the ages of five and 18 were imaged. The kids' TV viewing habits, ranging from zero to four hours each day, were also taken into account. © 2016 Scientific American
By Elahe Izadi Tiny cameras attached to wild New Caledonian crows capture, for the first time, video footage of these elusive birds fashioning hooked stick tools, according to researchers. These South Pacific birds build tools out of twigs and leaves that they use to root out food, and they're the only non-humans that make hooked tools in the wild, write the authors of a study published Wednesday in the journal Biology Letters. Humans have previously seen the crows making the tools in artificial situations, in which scientists baited feeding sites and provided the raw materials; but researchers say the New Caledonian crows have never been filmed doing this in a completely natural setting. "New Caledonian crows are renowned for their unusually sophisticated tool behavior," the study authors write. "Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds." Study author Jolyon Troscianko of the University of Exeter in England said in a press release that the tropical birds are "notoriously difficult to observe" because of the terrain of their habitat and their sensitivity to disturbance. "By documenting their fascinating behavior with this new camera technology, we obtained valuable insights into the importance of tools in their daily search for food," he added.
Tim Radford British scientists believe they have made a huge step forward in the understanding of the mechanisms of human intelligence. That genetic inheritance must play some part has never been disputed. Despite occasional claims, all later dismissed, no one has yet identified a single gene that controls intelligence. But Michael Johnson, a consultant neurologist at Imperial College London, and his colleagues report in Nature Neuroscience that they may have discovered a very different answer: two networks of genes, perhaps controlled by some master regulatory system, lie behind the human gift for lateral thinking, mental arithmetic, pub quizzes, strategic planning, cryptic crosswords and the ability to laugh at limericks. As usual, such research raises potentially politically loaded questions about the nature of intelligence. “Intelligence is a composite measure of different cognitive abilities and how they are distributed in a population. It doesn’t measure any one thing. But it is measurable,” Dr Johnson said. About 40% of the variation in intelligence is explained by inheritance. The other factors are not yet certain. But the scientists raise the distant possibility that, armed with the new information, they may be able to devise ways to modify human intelligence. “The idea of ultimately using drugs to affect cognitive performance is not in any way new. We all drink coffee to improve our cognitive performance,” Dr Johnson said. “It’s about understanding the pathways that are related to cognitive ability both in health and disease, especially disease, so one day we could help people with learning disabilities fulfill their potential. That is very important.” © 2015 Guardian News and Media Limited
Parrots can dance and talk, and now apparently they can use and share grinding tools. They were filmed using pebbles for grinding, thought to be a uniquely human activity – one that allowed our civilisations to extract more nutrition from cereal-based foods. Megan Lambert from the University of York, UK, and her colleagues were studying greater vasa parrots (Coracopsis vasa) in an aviary when they noticed some of the birds scraping shells in their enclosure with pebbles and date pips. “We were surprised,” says Lambert. “Using tools [to grind] seashells is something never seen before in animals.” Afterwards, the birds would lick the powder from the tool. Some of the parrots even passed tools to each other, which is rarely seen in animals. This behaviour was exclusively male to female. Lambert and her team, who watched the parrots for six months, noticed that the shell-scraping was more frequent before their breeding season. Since seashells contain calcium, which is critical for females before egg-laying, they suspect that the parrots could be manufacturing their own calcium supplements, as the mineral is probably better absorbed in powder form. Greater vasa parrots are native to Madagascar and have breeding and social systems unique among parrots. For example, two or more males have an exclusive sexual relationship with two or more females, and they are unusually tolerant of their group members. The reproductive ritual of sharing tools and grinding could be yet another one of their quirks. © Copyright Reed Business Information Ltd.
By Nicholas Bakalar Watching television may be bad for your brain, a new study suggests. Researchers followed 3,274 people whose average age was 25 at the start of the study for 25 years, using questionnaires every five years to collect data on their physical activity and TV watching habits. At year 25, they administered three tests that measured various aspects of mental acuity. The highest level of TV watching — more than three hours a day most days — was associated with poor performance on all three tests. Compared with those who watched TV the least, those who watched the most had between one-and-a-half and two times the odds of poor performance on the tests, even after adjusting for age, sex, race, educational level, body mass index, smoking, alcohol use, hypertension and diabetes. Those with the lowest levels of physical activity and the highest levels of TV watching were the most likely to have poor test results. The authors acknowledge that their findings, published in JAMA Psychiatry, depend on self-reports, and that they had no baseline tests of cognitive function for comparison. “We can’t separate out what is going on with the TV watching,” said the lead author, Dr. Kristine Yaffe, a professor of psychiatry and neurology at the University of California, San Francisco. “Is it just the inactivity, or is there something about watching TV that’s the opposite of cognitive stimulation?” © 2015 The New York Times Company
Sara Reardon Panzee the chimpanzee was a skilled communicator that could tell untrained humans where to find hidden food by using gestures and vocalizations. Austin the chimp was particularly adept with a computer, and scientists have been scanning its genome for clues to its unusual cognitive abilities. Both apes lived at a language-research centre at Georgia State University in Atlanta, and both died several years ago — but they will live on in an online database of brain scans and behavioural data from nearly 250 chimpanzees. Researchers hope to combine this trove, now in development, with a biobank of chimpanzee brains to enable scientists anywhere in the world to study the animals’ neurobiology. This push to repurpose old data is especially timely now that the US National Institutes of Health (NIH) has decided to retire its remaining research chimpanzees. The agency decommissioned more than 300 animals in 2013, but kept 50 available for research in case of a public-health emergency. Following an 18 November decision, this remaining population will also be sent to sanctuaries in the coming years. The NIH also hopes to retire another 82 chimps that it supports but does not own, says director Francis Collins. “We were on a trajectory toward zero, and today’s the day we’re at zero,” says Jeffrey Kahn, a bioethicist at Johns Hopkins University in Baltimore, Maryland, who led a 2011 study on the NIH chimp colony for the Institute of Medicine. © 2015 Nature Publishing Group
by Sarah Zielinski Call someone a “bird brain” and they are sure to be offended. After all, it’s just another way of calling someone “stupid.” But it’s probably time to retire the insult because scientists are finding more and more evidence that birds can be pretty smart. Consider these five species: We may call pigeons “flying rats” for their penchant for hanging out in cities and grabbing an easy meal. (Long before there was “pizza rat,” you know there had to be “pizza pigeons” flying around New York City.) But there may be more going on in their brains than just where to find a quick bite. Richard Levenson of the University of California, Davis Medical Center and colleagues trained pigeons to recognize images of human breast cancers. In tests, the birds proved capable of sorting images of benign and malignant tumors. In fact, they were just as good as humans, the researchers report November 18 in PLOS ONE. In keeping with the pigeons’ reputation, though, food was the reward for their performance. No one would suspect the planet’s second-best toolmakers would be small black birds flying through mountain forests on an island chain east of Australia. But New Caledonian crows have proven themselves not only keen toolmakers but also pretty good problem-solvers, passing some tests that even dogs (and pigeons) fail. For example, when scientists present an animal with a bit of meat on a long string dangling down, many animals don’t ever figure out how to get the meat. Pull it up with one yank, and the meat is still out of reach. Some animals will figure out how to get it through trial and error, but a wild New Caledonian crow solved the problem — pull, step on string, pull some more — on its first try. © Society for Science & the Public 2000 - 2015
By James Gallagher Health editor, BBC News website A mass vaccination programme against meningitis A in Africa has been a "stunning success", say experts. More than 220 million people were immunised across 16 countries in the continent's meningitis belt. In 2013 there were just four cases across the entire region, which once faced thousands of deaths each year. However, there are fresh warnings from the World Health Organization that "huge epidemics" could return unless a new vaccination programme is started. The meningitis belt stretches across sub-Saharan Africa from Gambia in the west to Ethiopia in the east. In the worst epidemic recorded, in 1996-97, the disease swept across the belt, infecting more than a quarter of a million people and leading to 25,000 deaths. Unlike other vaccines, the MenAfriVac was designed specifically for Africa, and in 2010 a mass vaccination campaign was started. "The disease has virtually disappeared from this part of the world," said Dr Marie-Pierre Preziosi from the World Health Organization. The mass immunisation programme was aimed at people under 30. However, routine vaccination will be needed to ensure that newborns are not vulnerable to the disease. Projections, published in the journal Clinical Infectious Diseases, showed the disease could easily return. Dr Preziosi told the BBC News website: "What could happen is a huge epidemic that could sweep the entire area, that could target hundreds of thousands of people with 5-10% deaths at least." © 2015 BBC
Richard A. Friedman You can increase the size of your muscles by pumping iron and improve your stamina with aerobic training. Can you get smarter by exercising — or altering — your brain? This is hardly an idle question considering that cognitive decline is a nearly universal feature of aging. Starting at age 55, our hippocampus, a brain region critical to memory, shrinks 1 to 2 percent every year, to say nothing of the fact that one in nine people age 65 and older has Alzheimer’s disease. The number afflicted is expected to grow rapidly as the baby boom generation ages. Given these grim statistics, it’s no wonder that Americans are a captive market for anything, from supposed smart drugs and supplements to brain training, that promises to boost normal mental functioning or to stem its all-too-common decline. The very notion of cognitive enhancement is seductive and plausible. After all, the brain is capable of change and learning at all ages. Our brain has remarkable neuroplasticity; that is, it can remodel and change itself in response to various experiences and injuries. So can it be trained to enhance its own cognitive prowess? The multibillion-dollar brain training industry certainly thinks so and claims that you can increase your memory, attention and reasoning just by playing various mental games. In other words, use your brain in the right way and you’ll get smarter. A few years back, a joint study by BBC and Cambridge University neuroscientists put brain training to the test. Their question was this: Do brain gymnastics actually make you smarter, or do they just make you better at doing a specific task? For example, playing the math puzzle KenKen will obviously make you better at KenKen. But does the effect transfer to another task you haven’t practiced, like a crossword puzzle? © 2015 The New York Times Company
Jon Hamilton For a few days this week, a convention center in Chicago became the global epicenter of brain science. Nearly 30,000 scientists swarmed through the vast hallways of the McCormick Place convention center as part of the annual Society for Neuroscience meeting. Among them were Nobel Prize winners, the director of the National Institutes of Health, and scores of researchers regarded as the international rock stars of neuroscience. "It's amazing. I'm a bit overwhelmed," said Kara Furman, a graduate student from Yale who was attending her first Society for Neuroscience meeting. Furman was just one of several hundred neuroscientists I found standing in lines outside the center one afternoon, waiting for shuttle buses. She was pondering a presentation from a few hours earlier that she found "pretty mind-blowing." What was it about? "Using MRI techniques to access dopamine release at the molecular level," she told me, deadpan. Welcome to the five-day annual event that's become known simply as "The Neuro Meeting." It's where brain scientists from around the world come to present their own work and discover the "mind-blowing" research others are doing. And there are thousands of presentations to choose from. "I prepared an itinerary based on my interests and that ran into 20 pages," said Srinivas Bharath from the National Institute of Mental Health and Neurosciences in Bangalore, India. © 2015 npr
By Melissa Dahl Next time you feel you are in danger of losing an argument, make some obscure reference to the brain. Any nod to neuroscience will do, even if it doesn’t actually illuminate the problem at hand or prove anything that halfway resembles a point. People tend to find explanations that include references to the brain very convincing, even if those references are mostly nonsense, according to the latest episode of "Psych Crunch," a podcast hosted by psychologist (and Science of Us contributor) Christian Jarrett. Jarrett interviews Sara Hodges, a research psychologist at the University of Oregon and the co-author of a study published this May on the appeal of “superfluous neuroscience information.” In it, Hodges and her colleagues presented students with a variety of explanations for various psychological phenomena. Some of these explanations were not really explanations at all, but rather just a restatement of the facts already presented. The students considered explanations for various quirks of human behavior from the fields of social science, biological science, and neuroscience, and rated how convincing they found each explanation. “The social sciences would refer to something about how people were raised, and the hard-science explanation referred to changes in DNA, the structure of DNA,” Hodges explained to Jarrett. The neuroscience explanation, on the other hand, would pretty much just name an area of the brain thought to be associated with the behavior at hand and leave matters at that, without really explaining anything. Even still, Hodges said, the “neuroscience explanations always came out on top — better than no explanation, better than social science, better than the hard science.” © 2015, New York Media LLC
by Ben Cipollini Thanks to Ms. Amazing, it’s now cliche to say, but hey… I really love SfN. For the uninitiated, SfN is a thirty-thousand-person international conference for neuroscience–a conference so large, only a few cities in the US can handle it. Yes, that’s a giant C-SPAN2 bus that's dwarfed by this small section of the “Great Room”. For many, SfN evokes fear and dread; it’s truly overwhelming in its size, breadth, and depth. For me, it was love at first “OM*G!!!”. Don’t believe me, scientists? Let’s review the data: I loathe running, but I actually do it at SfN. One needs wheels to get from talks to posters to talks again. We filled the New Orleans convention center in 2012; it’s so long you can actually get directions from one end of it to the other on Google Maps. Yes, that map does say “1.0 kilometers”. I hate crowds, but I will fight through the poster session crowds like a salmon heading upstream to spawn, just to get to one more poster before the end of the session. SfN may have more human traffic jams than China has vehicle jams during Golden Week… but that won’t stop me from finding out how callosal connections have properties similar to those of long-range lateral connections, or to understand how hemispherectomy affects functional organization. You’d better too; you never know when one of your research heroes might be presenting the poster, or you’ll find yourself standing in front of a poster that winds up in Science just a few months later.
By AUSTIN RAMZY HONG KONG — Australian officials have responded to criticism from animal rights activists and celebrities, including the former actress Brigitte Bardot and the singer Morrissey, that a government plan to protect threatened species by killing millions of feral cats is unnecessarily cruel. Gregory Andrews, Australia’s threatened species commissioner, has written open letters to Ms. Bardot and Morrissey saying that feral cats prey on more than 100 of the country’s threatened species and that they were a “major contributor” to the extinction of at least 27 mammal species in the country over the past 200 years. He called some of the extinct species, such as the lesser bilby, desert bandicoot, crescent nailtail wallaby and big-eared hopping mouse, “delightful creatures, rich in importance in Australian indigenous culture, and formerly playing important roles in the ecology of our country. We don’t want to lose any more species like these.” The Australian Department of the Environment says that feral cats are the biggest threat to the country’s mammals, ahead of foxes and habitat loss. The government plan would use poison and traps to kill the cats. In announcing the plan in July, Greg Hunt, the environment minister, said that he wanted two million feral cats culled by 2020. Australia has an estimated 20 million feral cats, which are an invasive species brought by European settlers. Calls to exterminate the cats have been floated before, including one in the 1990s that called for killing all feral cats by 2020. © 2015 The New York Times Company
By Martin Enserink Researchers who conduct animal studies often don't use simple safeguards against biases that have become standard in human clinical trials—or at least they don't report doing so in their scientific papers, making it impossible for readers to ascertain the quality of the work, an analysis of more than 2500 journal articles shows. Such biases, conscious or unconscious, can make candidate medical treatments look better than they actually are, the authors of the analysis warn, and lead to eye-catching results that can't be replicated in larger or more rigorous animal studies—or in human trials. Neurologist Malcolm MacLeod of the Centre for Clinical Brain Sciences at the University of Edinburgh and his colleagues combed through papers reporting the efficacy of drugs in eight animal disease models and checked whether the authors reported four measures that are widely acknowledged to reduce the risk of bias. First, if there was an experimental group and a control group, were animals randomly assigned to either one? (This makes it impossible for scientists to, say, assign the healthiest mice or rats to a treatment group, which could make a drug look better than it is.) Second, were the researchers who assessed the outcomes of a trial—for instance, the effect of a treatment on an animal's health—blinded to which animal underwent what procedure? Third, did the researchers calculate in advance the sample size needed to show that they didn't just accumulate data until they found something significant? And finally, did they make a statement about their conflicts of interest? © 2015 American Association for the Advancement of Science
Carl Zimmer In recent years, a peculiar sort of public performance has taken place periodically on the sidewalks of Seattle. It begins with a woman named Kaeli N. Swift sprinkling peanuts and cheese puffs on the ground. Crows swoop in to feed on the snacks. While Ms. Swift observes the birds from a distance, notebook in hand, another person walks up to the birds, wearing a latex mask and a sign that reads “UW CROW STUDY.” In the accomplice’s hands is a taxidermied crow, presented like a tray of hors d’oeuvres. This performance is not surreal street theater, but an experiment designed to explore a deep biological question: What do crows understand about death? Ms. Swift has been running this experiment as part of her doctoral research at the University of Washington, under the guidance of John M. Marzluff, a biologist. Dr. Marzluff and other experts on crow behavior have long been intrigued by the way the birds seem to congregate noisily around dead comrades. Dr. Marzluff has witnessed these gatherings many times himself, and has heard similar stories from other people. “Whenever I give a talk about crows, there’s always someone who says, ‘Well, what about this?’ ” he said. Dr. Marzluff and Ms. Swift decided to bring some scientific rigor to these stories. They wanted to determine whether a dead crow really does trigger a distinctive response from living crows and, if so, what the purpose of the large, noisy gatherings might be. To run the experiment, Ms. Swift began by delivering food to a particular spot each day, so that the crows learned to congregate there to eat. Then one of her volunteers would approach the feast with a dead crow, and Ms. Swift observed how the birds reacted. © 2015 The New York Times Company
Sara Reardon The brain’s wiring patterns can shed light on a person’s positive and negative traits, researchers report in Nature Neuroscience. The finding, published on 28 September, is the first from the Human Connectome Project (HCP), an international effort to map active connections between neurons in different parts of the brain. The HCP, which launched in 2010 at a cost of US$40 million, seeks to scan the brain networks, or connectomes, of 1,200 adults. Among its goals is to chart the networks that are active when the brain is idle; these are thought to keep the different parts of the brain connected in case they need to perform a task. In April, a branch of the project led by one of the HCP's co-chairs, biomedical engineer Stephen Smith at the University of Oxford, UK, released a database of resting-state connectomes from about 460 people between 22 and 35 years old. Each brain scan is supplemented by information on approximately 280 traits, such as the person's age, whether they have a history of drug use, their socioeconomic status and personality traits, and their performance on various intelligence tests. Smith and his colleagues ran a massive computer analysis to look at how these traits varied among the volunteers, and how the traits correlated with different brain connectivity patterns. The team was surprised to find a single, stark difference in the way brains were connected. People with more 'positive' variables, such as more education, better physical endurance and above-average performance on memory tests, shared the same patterns. Their brains seemed to be more strongly connected than those of people with 'negative' traits such as smoking, aggressive behaviour or a family history of alcohol abuse. © 2015 Nature Publishing Group
By David Grimm The journal Nature is revising its policy on publishing animal experiments after a study it ran in 2011 received criticism because the authors allowed tumors to grow excessively large in mice. The paper reported that a compound isolated from a pepper plant killed cancer cells without harming healthy cells. Yesterday, the journal published a correction to the study (the paper’s second), which noted that “some tumors on some of the animals exceeded the maximum size … permitted by the Institutional Animal Care and Use Committee.” The tumors were only supposed to grow to a maximum of 1.5 cubic centimeters, but some reached 7 cubic centimeters, according to David Vaux, a cell biologist at the Walter and Eliza Hall Institute of Medical Research in Melbourne, Australia, who first raised concerns about the paper in 2012. (Vaux spoke to Retraction Watch, which first reported the correction.) In an editorial published yesterday, Nature calls the large tumors “a breach of experimental protocol,” one that could have caused the mice to “have experienced more pain and suffering than originally allowed for.” The journal also noted the lapse could have implications beyond the one study, saying that “cases such as this could provoke a justifiable backlash against animal research.” Nature says it will now require authors to include the maximum tumor size allowed by its institutional animal-use committee, and to state that this size was not exceeded during the experiments. The journal does say, however, that it is not retracting the paper, and that the study remains “valid and useful.”
James Gorman If spiders had nightmares, the larvae of ichneumonid wasps would have to star in them. The wasp lays an egg on the back of an orb weaver spider, where it grows fat and bossy, and occupies itself with turning the spider into a zombie. As Keizo Takasuka and his colleagues point out in The Journal of Experimental Biology, this is a classic case of “host manipulation.” Using more colorful language, he described the larva turning the spider into a “drugged navvy.” The larva forces the spider to turn its efforts away from maintaining a sticky, spiral web to catch prey, and to devote itself to building a safe and sturdy web to serve as a home for the larva’s cocoon, in which it will transform itself into a wasp. This process was well known, but Dr. Takasuka and Kaoru Maeto at Kobe University, working with other Japanese researchers, wanted to explore how the wasp overlords controlled their spiders. They suspected that the larvae were co-opting a natural behavior of the spiders. Turning on a behavior already in the spiders’ repertoire would be much easier than controlling every step of modifying a sticky web. So they compared the cocoon web to one that the spiders themselves build to rest in when they are molting. It’s called a resting web. The similarities were striking. In both the resting and cocoon webs, the sticky, spiraling threads that make the webs of orb weavers so appealing were gone. Instead, the spokes of the web remained, decorated with fibrous spider silk that the researchers found reflected ultraviolet light. That would be a highly useful quality to warn away birds and some large insects from flying into the web because those creatures can see in the ultraviolet spectrum. The strength of the two silk webs was also similar. © 2015 The New York Times Company
By GREGORY COWLES Oliver Sacks, the neurologist and acclaimed author who explored some of the brain’s strangest pathways in best-selling case histories like “The Man Who Mistook His Wife for a Hat,” using his patients’ disorders as starting points for eloquent meditations on consciousness and the human condition, died on Sunday at his home in Manhattan. He was 82. The cause was cancer, said Kate Edgar, his longtime personal assistant. Dr. Sacks announced in February, in an Op-Ed essay in The New York Times, that an earlier melanoma in his eye had spread to his liver and that he was in the late stages of terminal cancer. As a medical doctor and a writer, Dr. Sacks achieved a level of popular renown rare among scientists. More than a million copies of his books are in print in the United States, his work was adapted for film and stage, and he received about 10,000 letters a year. (“I invariably reply to people under 10, over 90 or in prison,” he once said.) Dr. Sacks variously described his books and essays as case histories, pathographies, clinical tales or “neurological novels.” His subjects included Madeleine J., a blind woman who perceived her hands only as useless “lumps of dough”; Jimmie G., a submarine radio operator whose amnesia stranded him for more than three decades in 1945; and Dr. P. — the man who mistook his wife for a hat — whose brain lost the ability to decipher what his eyes were seeing. Describing his patients’ struggles and sometimes uncanny gifts, Dr. Sacks helped introduce syndromes like Tourette’s or Asperger’s to a general audience. But he illuminated their characters as much as their conditions; he humanized and demystified them. © 2015 The New York Times Company