Chapter 16.
by Catherine Brahic Move over Homo habilis, you're being dethroned. A growing body of evidence – the latest published this week – suggests that our "handy" ancestor was not the first to use stone tools. In fact, the ape-like Australopithecus may have figured out how to be clever with stones before modern humans even evolved. Humans have a way with flint. Sure, other animals use tools. Chimps smash nuts and dip sticks into ant nests to pull out prey. But humans are unique in their ability to apply both precision and strength to their tools. It all began hundreds of thousands of years ago when a distant ancestor began using sharp stone flakes to scrape meat off skin and bones. So who were those first toolmakers? In 2010, German researchers working in Ethiopia discovered markings on two animal bones that were about 3.4 million years old. The cut marks had clearly been made using a sharp stone, and they were at a site that was used by Lucy's species, Australopithecus afarensis. The study, led by Shannon McPherron of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, was controversial. The bones were 800,000 years older than the oldest uncontested stone tools, and at the time few seriously thought that australopithecines had been tool users. Plus, McPherron hadn't found the tool itself. The problem, says McPherron, is that if we just go on tools that have been found, we must conclude that one day somebody made a beautifully flaked Oldowan hand axe, completely out of the blue. That seems unlikely. © Copyright Reed Business Information Ltd.
Link ID: 20512 - Posted: 01.23.2015
by Linda Geddes Our personality literally shapes our world. It helps determine how many friends we have, which jobs we excel in and how we cope with adversity. Now it seems it may even play a role in our health – and not just in terms of any hypochondriac tendencies we harbour, but also how prone our bodies are to getting sick in the first place. It is a provocative idea but one that has been steadily gaining traction. We think of conscientiousness, for example, as a positive trait because it suggests caution, careful planning and an aversion to potential danger. But could it also be a symptom of underlying weakness in the immune system? That's one interpretation of a study published last month that sought to pick apart the links between personality traits and the immune system. It found that highly conscientious people had lower levels of inflammation, an immune response that helps the body fight infection and recover from injury. Highly extrovert people had higher levels. This may mean that extroverts are more physically robust – at least while they're young. While this sounds like good news, there's also a downside since sustained inflammation over a lifetime may leave you vulnerable to diabetes, atherosclerosis and cancer. "The biggest take-home message is that what happens in our health is connected to what happens in our heads and what happens in our lives," says Steven Cole at the University of California in Los Angeles (UCLA), who supervised the research. © Copyright Reed Business Information Ltd.
Link ID: 20510 - Posted: 01.22.2015
By Rachel Feltman Fear is one of our most basic evolutionary instincts, a sudden physical jolt to help us react to danger more quickly. In the modern world, fear often seems excessive -- in the absence of wild animals to flee, we're left screaming over roller coasters and scary movies. But for at least one woman, fear is unobtainable. And while she lives a normal life, her fearlessness is actually a handicap. The researchers who study her keep her closely guarded, using the code-name "SM" when publishing papers about her brave brainpower. And until this year, she'd never been interviewed. "Tell me what fear is," began Daniel Tranel, one of the neuroscientists who study her. "Well, that's what I'm trying to -- to be honest, I truly have no clue," SM said, her voice raspy. That's actually a symptom of the condition that stole fear from her. Urbach-Wiethe disease, which is characterized by a hoarse voice, small bumps around the eyes, and calcium deposits in the brain, is rare in its own right -- only 400 people on the planet are known to have it -- but in SM's case, some of those brain deposits happened to take over her amygdalae. These almond-shaped structures deep inside the brain are crucial to human fear response. And in SM's case, they've been totally calcified since she was a young woman. Now in her 40s, her fear-center is as good as gone. "It's a little bit as if you would go to this region and literally scoop it out," Antonio Damasio, another neuroscientist who studies SM, told "Invisibilia" hosts Lulu Miller and Alix Spiegel.
Link ID: 20504 - Posted: 01.21.2015
Oliver Burkeman One spring morning in Tucson, Arizona, in 1994, an unknown philosopher named David Chalmers got up to give a talk on consciousness, by which he meant the feeling of being inside your head, looking out – or, to use the kind of language that might give a neuroscientist an aneurysm, of having a soul. Though he didn’t realise it at the time, the young Australian academic was about to ignite a war between philosophers and scientists, by drawing attention to a central mystery of human life – perhaps the central mystery of human life – and revealing how embarrassingly far they were from solving it. The scholars gathered at the University of Arizona – for what would later go down as a landmark conference on the subject – knew they were doing something edgy: in many quarters, consciousness was still taboo, too weird and new agey to take seriously, and some of the scientists in the audience were risking their reputations by attending. Yet the first two talks that day, before Chalmers’s, hadn’t proved thrilling. “Quite honestly, they were totally unintelligible and boring – I had no idea what anyone was talking about,” recalled Stuart Hameroff, the Arizona professor responsible for the event. “As the organiser, I’m looking around, and people are falling asleep, or getting restless.” He grew worried. “But then the third talk, right before the coffee break – that was Dave.” With his long, straggly hair and fondness for all-body denim, the 27-year-old Chalmers looked like he’d got lost en route to a Metallica concert. “He comes on stage, hair down to his butt, he’s prancing around like Mick Jagger,” Hameroff said. “But then he speaks. And that’s when everyone wakes up.”
Link ID: 20503 - Posted: 01.21.2015
The presence of a romantic partner during painful medical procedures could make women feel worse rather than better, researchers say. A small study found this increase in pain was most pronounced in women who tended to avoid closeness in their relationships. The authors say bringing a loved one along for support may not be the best strategy for every patient. The work appears in the journal Social Cognitive and Affective Neuroscience. Researchers from University College London, King's College London and the University of Hertfordshire say there has been very little scientific research into the effects of a partner's presence on someone's perception of pain, despite this being common medical advice. They recruited 39 heterosexual couples and asked them a series of questions to measure how much they sought or avoided closeness and emotional intimacy in relationships. Each female volunteer was then subjected to a series of painful laser pulses while her partner was in and then out of the room. The women were asked to score their level of pain. They also had their brain activity measured using a medical test called an EEG. The researchers found that certain women were more likely to score high levels of pain while their partner was in the room. These were women who said they preferred to avoid closeness, trusted themselves more than their partners and felt uncomfortable in their relationships. © 2015 BBC
Keyword: Pain & Touch
Link ID: 20502 - Posted: 01.21.2015
By JOHN MARKOFF A new laboratory technique enables researchers to see minuscule biological features, such as individual neurons and synapses, at a nearly molecular scale through conventional optical microscopes. In a paper published last week in the journal Science, researchers at M.I.T. said they were able to increase the physical size of cultured cells and tissue by as much as five times while still preserving their structure. The scientists call the new technique expansion microscopy. The idea of making objects larger to make them more visible is a radical solution to a vexing challenge. By extending the resolving power of conventional microscopes, scientists are able to glimpse such biological mysteries as the protein structures that form ion channels and the outline of the membrane that holds the genome within a cell. The researchers have examined minute neural circuits, gaining new insights into local connections in the brain and a better understanding of larger networks. The maximum resolving power of conventional optical microscopes is about 200 nanometers, about half the wavelength of visible light. (By contrast, a human hair is about 500 times wider.) In recent decades, scientists have struggled to push past these limits. Last year, three scientists received a Nobel Prize for a technique in which fluorescent molecules are used to extend the resolving power of optical microscopes. But the technique requires specialized equipment and is costly. With expansion microscopy, Edward S. Boyden, a co-director of the M.I.T. Center for Neurobiological Engineering, and his colleagues were able to observe objects originally measuring just 70 nanometers in cultured cells and brain tissue through an optical microscope. © 2015 The New York Times Company
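The arithmetic behind expansion microscopy is worth making explicit: physically enlarging the specimen relaxes the demand on the optics, because a feature only needs to appear larger than the ~200-nanometer diffraction limit after expansion. A minimal sketch of that check, using the fivefold expansion factor and 70-nanometer figure from the article (the function names are illustrative, not from the paper):

```python
# Conventional optical microscopes resolve features down to ~200 nm,
# roughly half the wavelength of visible light.
DIFFRACTION_LIMIT_NM = 200.0

def apparent_size_nm(feature_nm: float, expansion_factor: float) -> float:
    """Physical size of a feature after uniform specimen expansion."""
    return feature_nm * expansion_factor

def resolvable(feature_nm: float, expansion_factor: float) -> bool:
    """Can a conventional scope resolve the feature once expanded?"""
    return apparent_size_nm(feature_nm, expansion_factor) >= DIFFRACTION_LIMIT_NM

# A 70 nm structure is below the limit at its native size, but swollen
# fivefold it spans 350 nm -- comfortably above the diffraction limit.
print(resolvable(70, 1))  # False
print(resolvable(70, 5))  # True
```

This is why a fivefold expansion suffices for the 70-nanometer objects the researchers report: anything larger than 200/5 = 40 nm becomes visible in principle.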
Keyword: Brain imaging
Link ID: 20499 - Posted: 01.20.2015
By Christie Aschwanden Maybe it’s their famously protruding brow ridge or perhaps it’s the now-discredited notion that they were primitive scavengers too dumb to use language or symbolism, but somehow Neanderthals picked up a reputation as brutish, dim and mannerless cretins. Yet the latest research on the history and habits of Neanderthals suggests that such portrayals of them are entirely undeserved. It turns out that Neanderthals were capable hunters who used tools and probably had some semblance of culture, and the DNA record shows that if you trace your ancestry to Europe or Asia, chances are very good that you have some Neanderthal DNA in your own genome. The bad rap began when the first Neanderthal skull was discovered around 1850 in Germany, says Paola Villa, an archaeologist at the University of Colorado. “The morphological features of these skulls — big eyebrows, no chin — led to the idea that they were very different from us, and therefore inferior,” she says. While the majority of archaeologists no longer believe this, she says, the idea that Neanderthals were inferior, brutish or stupid remains in popular culture. Neanderthals first appeared in Europe and western Asia between 300,000 and 400,000 years ago. They are our closest (extinct) relative, and their species survived until 30,000 to 40,000 years ago, when they vanish from the fossil record, says Svante Paabo, director of the Max Planck Institute of Evolutionary Anthropology in Leipzig, Germany, and author of “Neanderthal Man: In Search of Lost Genomes.” Why these relatives of ours thrived for so long and then ended their long, successful run about the same time that modern humans began to spread remains a point of debate and speculation.
Link ID: 20498 - Posted: 01.20.2015
by Jennifer Viegas Researchers eavesdropping on wild chimpanzees determined that the primates communicate about at least two things: their favorite yummy fruits, and the trees where these fruits can be found. Of particular interest to the chimps is the size of trees bearing the fruits that they relish most, such that the chimps yell out that information, according to a new study published in the journal Animal Behaviour. The study is the first to find that information about tree size and available fruit amounts are included in chimp calls, in addition to assessments about food quality. "Chimpanzees definitely have a very complex communication system that includes a variety of vocalizations, but also facial expressions and gestures," project leader Ammie Kalan of the Max Planck Institute for Evolutionary Anthropology told Discovery News. "How much it resembles human language is still a matter of debate," she added, "but at the very least, research shows that chimpanzees use vocalizations in a sophisticated manner, taking into account their social and environmental surroundings." Kalan and colleagues Roger Mundry and Christophe Boesch spent over 750 hours observing chimps and analyzing their food calls in the Ivory Coast's Taï Forest. The Wild Chimpanzee Foundation in West Africa is working hard to try and protect this population of chimps, which is one of the last wild populations of our primate cousins. © 2015 Discovery Communications, LLC
Helen Fisher, a biological anthropologist at Rutgers University responds: Several years ago I embarked on a project to see if the seven-year itch really exists. I began by studying worldwide data on marriage and divorce and noticed that although the median duration of marriage was seven years, of the couples who divorced, most did so around their fourth year together (the “mode”). I also found that divorce occurred most frequently among couples at the height of their reproductive and parenting years—for men, ages 25 to 29, and for women, ages 20 to 24 and 25 to 29—and among those with one dependent child. To try to explain these findings, I began looking at patterns of pair bonding in birds and mammals. Although only about 3 percent of mammals form a monogamous bond to rear their young, about 90 percent of avian species team up. The reason: the individual that sits on the eggs until they hatch will starve unless fed by a mate. A few mammals are in the same predicament. Take the female fox: the vixen produces very thin milk and must feed her young almost constantly, so she relies on her partner to bring her food while she stays in the den to nurse. But here's the key: although some species of birds and mammals bond for life, more often they stay together only long enough to rear their young through infancy and early toddlerhood. When juvenile robins fly away from the nest or maturing foxes leave the den for the last time, their parents part ways as well. Humans retain traces of this natural reproductive pattern. In more contemporary hunter-gatherer societies, women tend to bear their children about four years apart. Moreover, in these societies after a child is weaned at around age four, the child often joins a playgroup and is cared for by older siblings and relatives. This care structure allows unhappy couples to break up and find a more suitable partner with whom to have more young. © 2015 Scientific American
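Fisher's distinction between the seven-year median and the four-year mode is easy to misread, so a toy example helps: in a right-skewed distribution of marriage durations, the most common divorce year can sit well below the midpoint. The numbers below are made up purely for illustration, not Fisher's data:

```python
from statistics import median, mode

# Hypothetical durations (in years) of marriages ending in divorce:
# breakups cluster around year four, but a long right tail of
# late divorces pulls the median up toward seven.
durations = [3, 4, 4, 4, 4, 6, 8, 9, 12, 15, 20, 22]

print(mode(durations))    # 4   -- the "itch" peaks here
print(median(durations))  # 7.0 -- but half of divorces come later
```

The same skew is why both of Fisher's figures can be true at once: a four-year peak in divorce frequency is fully compatible with a seven-year median marriage duration.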
By PAULA SPAN DEDHAM, Mass. — Jerome Medalie keeps his advance directive hanging in a plastic sleeve in his front hall closet, as his retirement community recommends. That’s where the paramedics will look if someone calls 911. Like many such documents, it declares that if he is terminally ill, he declines cardiopulmonary resuscitation, a ventilator and a feeding tube. But Mr. Medalie’s directive also specifies something more unusual: If he develops Alzheimer’s disease or another form of dementia, he refuses “ordinary means of nutrition and hydration.” A retired lawyer with a proclivity for precision, he has listed 10 triggering conditions, including “I cannot recognize my loved ones” and “I cannot articulate coherent thoughts and sentences.” If any three such disabilities persist for several weeks, he wants his health care proxy — his wife, Beth Lowd — to ensure that nobody tries to keep him alive by spoon-feeding or offering him liquids. VSED, short for “voluntarily stopping eating and drinking,” is not unheard-of as an end-of-life strategy, typically used by older adults who hope to hasten their decline from terminal conditions. But now ethicists, lawyers and older adults themselves have begun a quiet debate about whether people who develop dementia can use VSED to end their lives by including such instructions in an advance directive. Experts know of just a handful of people with directives like Mr. Medalie’s. But dementia rates and numbers have begun a steep ascent, already afflicting an estimated 30 percent of those older than 85. Baby boomers are receiving a firsthand view of the disease’s devastation and burdens as they care for aging parents. They may well prove receptive to the idea that they shouldn’t be kept alive if they develop dementia themselves, predicted Alan Meisel, the director of the University of Pittsburgh’s Center for Bioethics and Health Law. © 2015 The New York Times Company
Link ID: 20495 - Posted: 01.20.2015
By Tia Ghose Being around strangers can cause people stress and, in turn, make them less able to feel others' pain, new research suggests. But giving people a drug that blocks the body's stress response can restore that sense of empathy, scientists said. What's more, the same effect shows up in both humans and mice. "In some sense, we've figured out what to do about increasing empathy as a practical matter," said Jeffrey Mogil, a neuroscientist at McGill University in Montreal. "We've figured out what stops it from happening and, therefore, the solution to make it happen more between strangers." Decreasing stress by doing a shared activity could be a simple way to increase empathy between people who don't know each other, the findings suggest. Past studies had found that mice seemed to feel the pain of familiar mice but were less responsive to foreign mice. Other studies found that, in both humans and mice, stress levels tended to rise around strangers. To see how stress and empathy are connected, Mogil and his colleagues placed two mice together in a cage, then inflicted a painful stimulus on one of them. When the mice were cage mates, the unaffected mouse showed more signs of pain than when they were strangers. But when the team gave the mice a drug called metyrapone, which blocks the formation of the stress hormone cortisol, the mice responded equally to the strangers' pain.
Link ID: 20491 - Posted: 01.17.2015
By Viviane Callier In the deep sea, where light is dim and blue, animals with bigger eyes see better—but bigger eyes are more conspicuous to predators. In response, the small (10 mm to 17 mm), transparent crustacean Paraphronima gracilis has evolved a unique eye structure. Researchers collected the animals from 200- to 500-meter deep waters in California’s Monterey Bay using a remote-operated vehicle. They then characterized the pair of compound eyes, discovering that each one was composed of a single row of 12 distinct red retinas. Reporting online on 15 January in Current Biology, the researchers hypothesize that each retina captures an image that is transmitted to the crustacean’s brain, which integrates the 12 images to increase brightness and contrast sensitivity, adapting to changing light levels. Future work will focus on how images are processed by the neural connections between the retinas and the brain. © 2015 American Association for the Advancement of Science.
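The researchers' hypothesis — twelve retinal images combined in the brain to boost brightness and contrast — is the same trick as frame averaging in imaging: averaging N independent noisy copies of a scene improves signal-to-noise by roughly √N. A rough numpy sketch of that principle (the noise model and numbers are assumptions for illustration, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

signal = np.full(1000, 5.0)  # a dim, constant "scene"
noise_sd = 2.0               # per-retina sensor noise

def snr(n_retinas: int) -> float:
    """Empirical SNR of the pixelwise mean of n independent noisy images."""
    images = signal + rng.normal(0, noise_sd, size=(n_retinas, signal.size))
    averaged = images.mean(axis=0)
    return averaged.mean() / averaged.std()

one = snr(1)
twelve = snr(12)
print(round(twelve / one, 2))  # roughly sqrt(12) ≈ 3.46
```

Under this simple model, pooling twelve retinas more than triples sensitivity in dim light — one plausible reason for the unusual eye architecture.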
By Brady Dennis The Food and Drug Administration on Wednesday approved a device aimed at helping obese people shed weight in a novel way – by targeting the nerve pathway between the brain and the stomach that controls feelings of hunger and fullness. The Maestro Rechargeable System, as it is known, consists of an electrical charge generator, wire leads and electrodes that are implanted surgically into a patient’s abdomen. It sends electrical pulses designed to interfere with the vagus nerve, which signals to the brain when the stomach is full or empty. Though researchers don't know exactly how such electrical stimulation leads to weight loss, the approach seems promising. In a year-long clinical trial involving 233 patients with a body-mass index, or BMI, of 35 or greater, those who received a working Maestro device lost 8.5 percent more weight than those without it. About half those in the experimental group lost at least 20 percent of their excess weight, and more than a third lost more than 25 percent of their excess weight. The overall figure was below the original goal of the trial, which was to show weight loss of 10 percent more excess weight in those using the new device than in the control group. Nevertheless, an FDA advisory group said the data showed sustained weight loss among participants and argued that the benefits of the device outweigh its risks for certain patients. In the clinical trial, some patients experienced nausea, vomiting, surgical complications and other side effects. The FDA is requiring the device's manufacturer, EnteroMedics, to conduct a five-year, post-approval study to gather additional data about its safety and effectiveness.
Link ID: 20488 - Posted: 01.15.2015
by Ashley Yeager The brain's got its own set of pipes for flushing waste. The plumbing is delicate, however — a finding that may complicate scientists' attempts to create a blood test to diagnose traumatic brain injuries. Bumps to the head can knock proteins out of brain cells. The brain's plumbing system is supposed to wash these proteins away from the damaged area and eventually into the blood. But new research in mice shows that slight alterations to the brain's self-cleaning system, even from treating head injuries, can change the levels of proteins flushed into the blood. As a result, the proteins are unreliable markers of injury, researchers report January 14 in the Journal of Neuroscience. © Society for Science & the Public 2000 - 2015.
By SAM ROBERTS When he was just 5 years old, Thomas Graboys declared that he intended to become a doctor. As a young physician, he visited a nephew serving in the Peace Corps in Mauritania and remained for two months, treating dozens of patients a day. He skied and played tennis and joined fellow cardiologists as the drummer in a rock band called the Dysrhythmics. In Boston, he was famous as a member of the team that diagnosed the Celtics star Reggie Lewis’s heart defect before he died abruptly on a basketball court. In short, “he was a medical version of one of Tom Wolfe’s masters of the universe,” one reviewer concluded after Dr. Graboys (pronounced GRAY-boys) published his autobiography. But barely 60, after experiencing horrific nightmares, frequently flailing in bed, losing his memory, suffering tremors and finally collapsing on his wedding day, he acknowledged that he was suffering from Parkinson’s disease and the onset of dementia. He informed his patients that he had no choice but to close his practice. “My face is often expressionless, though I still look younger than my 63 years,” he recalled in the autobiography, “Life in the Balance: A Physician’s Memoir of Life, Love, and Loss With Parkinson’s Disease and Dementia,” which was published in 2008. “I am stooped,” he continued. “I shuffle when I walk, and my body trembles. My train of thought regularly runs off the rails. There is no sugarcoating Parkinson’s. There is no silver lining here. There is anger, pain, and frustration at being victimized by a disease that can to some extent be managed but cannot be cured.” After managing for more than a decade, Dr. Graboys died on Jan. 5 at his home in Chestnut Hill, Mass., his daughter, Penelope Graboys Blair, said. The cause was complications of Lewy Body Dementia, which was diagnosed after his Parkinson’s. He was 70. © 2015 The New York Times Company
Link ID: 20485 - Posted: 01.15.2015
By Will Boggs MD NEW YORK (Reuters Health) - Patients with chronic pain show signs of glial activation in brain centers that modulate pain, according to results from a PET-MRI study. "Glia appears to be involved in the pathophysiology of chronic pain, and therefore we should consider developing therapeutic approaches targeting glia," Dr. Marco L. Loggia from Massachusetts General Hospital, Harvard Medical School, Charlestown, Massachusetts, told Reuters Health by email. "Glial activation is accompanied by many cellular responses, which include the production and release of substances (such as so-called 'pro-inflammatory cytokines') that can sensitize the pain pathways in the central nervous system," he explained. "Thus, glial activation is not a mere reaction to a pain state but actively contributes to the establishment and/or maintenance of persistent pain." To test their hypothesis that patients with chronic pain demonstrate in vivo activation of brain glia, Dr. Loggia's team imaged the brains of 19 individuals diagnosed with chronic low back pain as well as 25 pain-free healthy volunteers using 11C-PBR28, a PET radioligand that binds to the translocator protein (TSPO), a protein upregulated in activated microglia and reactive astrocytes in animal models of pain. Each patient exhibited higher 11C-PBR28 uptakes than his/her age-, sex-, and TSPO genotype-matched control in the thalamus, and there were no brain regions for which the healthy controls showed statistically higher uptakes than the patients with chronic low back pain. © 2015 Scientific American
By Michael Balter If there’s one thing that distinguishes humans from other animals, it’s our ability to use language. But when and why did this trait evolve? A new study concludes that the art of conversation may have arisen early in human evolution, because it made it easier for our ancestors to teach each other how to make stone tools—a skill that was crucial for the spectacular success of our lineage. Researchers have long debated when humans starting talking to each other. Estimates range wildly, from as late as 50,000 years ago to as early as the beginning of the human genus more than 2 million years ago. But words leave no traces in the archaeological record. So researchers have used proxy indicators for symbolic abilities, such as early art or sophisticated toolmaking skills. Yet these indirect approaches have failed to resolve arguments about language origins. Now, a team led by Thomas Morgan, a psychologist at the University of California, Berkeley, has attacked the problem in a very different way. Rather than considering toolmaking as a proxy for language ability, he and his colleagues explored the way that language may help modern humans learn to make such tools. The researchers recruited 184 students from the University of St. Andrews in the United Kingdom, where some members of the team were based, and organized them into five groups. The first person in each group was taught by archaeologists how to make artifacts called Oldowan tools, which include fairly simple stone flakes that were manufactured by early humans beginning about 2.5 million years ago. This technology, named after the famous Olduvai Gorge in Tanzania where archaeologists Louis and Mary Leakey discovered the implements in the 1930s, consists of hitting a stone “core” with a stone “hammer” in such a way that a flake sharp enough to butcher an animal is struck off. Producing a useful flake requires hitting the core at just the right place and angle.
© 2015 American Association for the Advancement of Science.
By Susan Svrluga Edwin Chapman’s secretary handed him a pile of prescription slips, and the doctor’s pen moved quickly across them: “Buprenorphine/naloxone.” “Buprenorphine/naloxone.” “Buprenorphine/naloxone.” His waiting room was full of heroin-addicted patients there to refill their generic prescriptions for Suboxone, a drug that helps keep their relentless cravings at bay and now outpaces methadone as a treatment. Chapman is an internist, a cardiologist. This drug has transformed his D.C. medical practice — now more than half of his patients are there to seek it, addicts edging out elderly ladies with arthritis and diabetes. And the drug, he believes, has transformed lives. He wishes more people could get it. Yet even as heroin use surges in the United States, destroying neighborhoods and families — drug overdoses kill more people than any other kind of accident — both addicts and doctors say there are barriers that keep some from the treatment they desperately need. “In the past we’ve kind of run away from these patients, put them in methadone clinics, places no one can see them,” said Chapman, who estimates that two-thirds of his heroin-addicted patients tested positive for hepatitis C and more than one in 10 for HIV. “We need to reverse that. Put them in primary care. We need to be taking care of sick folks, not running away from them.”
Keyword: Drug Abuse
Link ID: 20480 - Posted: 01.14.2015
By Neuroskeptic A new study offers two reasons to be cautious about some of the claims made for the role of the hormone oxytocin in human behavior. The paper’s out now in PLoS ONE from researchers James C. Christensen and colleagues, who are based at the US Air Force Research Laboratory in Ohio. That the military are interested in oxytocin at all is perhaps a testament to the huge amount of interest that this molecule has attracted in recent years. Oxytocin has been called the “hug hormone”, and is said to be involved in such nice things as love and trust. But according to Christensen et al., quite a lot of previous oxytocin research may be flawed. Their paper is in two parts. Christensen et al. first show that the only accurate way to measure oxytocin levels in blood is by performing plasma extraction before chemical analysis. Using unextracted plasma, they find, leads to seriously distorted measures. The differences between extracted and unextracted plasma estimates of oxytocin have been noted before, but Christensen et al. show directly that unextracted plasma interferes with oxytocin measurement. They found that oxytocin test kits were unable to detect a ‘spike’ of pure oxytocin added to some unextracted plasma samples, whereas the spike was reliably detected when added to an extracted sample. This was true using either the ELISA or RIA method for quantification of oxytocin. With ELISA, unextracted oxytocin measures were also very noisy and unrealistically high:
Keyword: Hormones & Behavior
Link ID: 20479 - Posted: 01.14.2015
Vernon Mountcastle, one of Johns Hopkins Medicine's giants of the 20th century, died peacefully at his North Baltimore home on Sunday, with Nancy, his wife of seven decades, and family at his bedside. He was 96. Mountcastle was universally acknowledged as the "father of neuroscience" and served Johns Hopkins with extraordinary dedication for nearly 65 years. A 1942 graduate of the School of Medicine and a member of the faculty since 1948, Mountcastle served as director of the Department of Physiology and head of the Philip Bard Laboratories of Neurophysiology at Johns Hopkins from 1964 to 1980. He later became one of the founding members of Johns Hopkins' Zanvyl Krieger Mind/Brain Institute, where he continued to work until his retirement at 87. Colleagues remember his dedication to the professional development of neuroscientists, fiercely focused work ethic, and devotion to collaborative research. Mountcastle once was dubbed the "Jacques Cousteau of the cortex" for his revolutionary research delving into the unknown depths of the brain and establishing the basis for modern neuroscience. In 1957, he made the breakthrough discovery that revolutionized the concept of how the brain is built. He found that the cells of the cerebral cortex are organized in vertical columns, extending from the surface of the brain down through six layers of the cortex, each column processing a specific kind of information.
Keyword: Brain imaging
Link ID: 20478 - Posted: 01.14.2015