Chapter 13. Memory, Learning, and Development
By BENEDICT CAREY

People of a certain age (and we know who we are) don’t spend much leisure time reviewing the research into cognitive performance and aging. The story is grim, for one thing: Memory’s speed and accuracy begin to slip around age 25 and keep on slipping. The story is familiar, too, for anyone who is over 50 and, having finally learned to live fully in the moment, discovers it’s a senior moment. The finding that the brain slows with age is one of the strongest in all of psychology.

Over the years, some scientists have questioned this dotage curve. But these challenges have had an ornery-old-person slant: that the tests were biased toward the young, for example. Or that older people have learned not to care about clearly trivial things, like memory tests. Or that an older mind must organize information differently from one attached to some 22-year-old who records his every Ultimate Frisbee move on Instagram.

Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases. Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.

“What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email.
But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.” © 2014 The New York Times Company
by Helen Thomson

When the criteria for diagnosing autism were changed last year, concerns were raised that people already diagnosed might be re-evaluated and end up losing access to treatments and services. The American Psychiatric Association (APA), which publishes the diagnostic guidelines, recommends that children who are receiving appropriate treatment as a result of the old criteria should not be required by insurance companies to undergo a re-examination under the new criteria. But a small survey revealed to New Scientist suggests that not everyone is following the party line.

In May, the APA published the DSM-5, the latest edition of what has come to be known as psychiatry's diagnostic bible. One controversial change was to the criteria used to diagnose different kinds of autism, which are now combined under the umbrella term "autism spectrum disorder" (ASD). Under the previous criteria of DSM-IV, a person would be diagnosed with ASD by exhibiting at least six of 12 behaviours, which include problems with communication, interaction and repetition. Now, that same person would need to exhibit three deficits in social communication and interaction and at least two repetitive behaviours – the latter requirement, say critics, makes the new criteria more restrictive.

To see how the change in criteria was affecting people, Autism Speaks, a US science and advocacy organisation, asked users of its website to complete an online survey about their experiences. "We wanted to ensure that people are still maintaining access to the services they need," says Michael Rosanoff, Autism Speaks' associate director for public health research and scientific review. © Copyright Reed Business Information Ltd.
Link ID: 19174 - Posted: 01.27.2014
By SARAH MASLIN NIR

The day after the funeral of Avonte Oquendo, the boy with autism whose remains were found this month after he disappeared at age 14 from his school in October, his mother and grandmother stood with Senator Charles E. Schumer as he announced a proposal for a new law. Called “Avonte’s law,” it would finance a program to provide optional electronic tracking devices to be worn by children with autism.

“Avonte’s running away was not an isolated incident,” Mr. Schumer, Democrat of New York, said at a news conference on Sunday morning in his office on the East Side of Manhattan. “This is a high-tech solution to an age-old problem.”

Citing research that suggests nearly 50 percent of children with autism wander off, often to escape the overstimulation of sounds and noise, Mr. Schumer said the new legislation would expand an existing Department of Justice program that grants money to law enforcement agencies and other groups to provide trackers for people who have Alzheimer’s disease. Mr. Schumer said he had contacted the department months ago about including children with autism in the program. There was receptiveness, he said, but money was needed to provide children with the devices, which cost $80 to $90 and a few dollars a month to operate.

The legislation would allocate $10 million for the program, giving interested parents free access to the equipment, which can be worn like a watch or even sewn into clothing. Whether to use such a monitor would be up to the parents, and the exact system of employing the devices would be up to individual municipalities, Mr. Schumer said. There are different variants that could be selected, including one that alerts authorities automatically when a child has stepped across a given perimeter — for example, outside school grounds — and another that becomes activated only after authorities are called. © 2014 The New York Times Company
Link ID: 19173 - Posted: 01.27.2014
by Helen Thomson

The brain that made the greatest contribution to neuroscience and to our understanding of memory has become a gift that keeps on giving. A 3D reconstruction of the brain of Henry Molaison, whose surgery to cure him of epilepsy left him unable to form new long-term memories, will allow scientists to continue to garner insights into the brain for years to come.

"Patient HM" became arguably the most famous person in neuroscience after he had several areas of his brain removed in 1953. His resulting amnesia and willingness to be tested have given us unprecedented insights into where memories are formed and stored in the brain. On his death in 2008, HM was revealed to the world as Henry Molaison. Now, a post-mortem examination of his brain, and a new kind of virtual 3D reconstruction, have been published.

As a child, Molaison had major epileptic seizures. Anti-epileptic drugs failed, so he sought help from neurosurgeon William Scoville at Hartford Hospital in Connecticut. When Molaison was 27 years old, Scoville removed portions of his medial temporal lobes, including an area called the hippocampus on both sides of his brain. As a result, Molaison's epilepsy became manageable, but he could not form any new memories, a condition known as anterograde amnesia. He also had difficulty recollecting his long-term past – partial retrograde amnesia.
Link ID: 19172 - Posted: 01.27.2014
By CARL ZIMMER

The term “X chromosome” has an air of mystery to it, and rightly so. It got its name in 1891 from a baffled biologist named Hermann Henking. To investigate the nature of chromosomes, Henking examined cells under a simple microscope. All the chromosomes in the cells came in pairs. All except one. Henking labeled this outlier chromosome the “X element.”

No one knows for sure what he meant by the letter. Maybe he saw it as an extra chromosome. Or perhaps he thought it was an ex-chromosome. Maybe he used X the way mathematicians do, to refer to something unknown.

Today, scientists know the X chromosome much better. It’s part of the system that determines whether we become male or female. If a fertilized egg inherits an X chromosome from both parents, it becomes female. If it gets an X from its mother and a Y from its father, it becomes male.

But the X chromosome remains mysterious. For one thing, females shut down an X chromosome in every cell, leaving only one active. That’s a drastic step to take, given that the X chromosome has more than 1,000 genes. In some cells, the father’s copy goes dormant, and in others, the mother’s does. While scientists have known about this so-called X-chromosome inactivation for more than five decades, they still know little about the rules it follows, or even how it evolved.

In the journal Neuron, a team of scientists has unveiled an unprecedented view of X-chromosome inactivation in the body. They found a remarkable complexity to the pattern in which the chromosomes were switched on and off. © 2014 The New York Times Company
by Laura Sanders

Growing up, I loved it when my parents read aloud the stories of the Berenstain Bears living in their treehouse. So while I was pregnant with my daughter, I imagined lots of cuddly quiet time with her in a comfy chair, reading about the latest adventures of Brother and Sister. Of course, reality soon let me know just how ridiculous that idea was. My newborn couldn’t see more than a foot away, cried robustly and frequently for mysterious reasons, and didn’t really understand words yet. Baby V was simply not interested in the latest dispatch from Bear Country.

When I started reading child development expert Elaine Reese’s new book Tell Me a Story, I realized that I was not the only one with idyllic story time dreams. Babies and toddlers are squirmy, active people with short attention spans. “Why, then, do we cling to this soft-focus view of storytelling when we know it is unrealistic?” she writes.

These days, as Baby V closes in on the 1-year mark, she has turned into a most definite book lover. But it’s not the stories that enchant her. It’s holding the book, turning its pages back to front to back again, flipping it over and generally showing it who’s in charge. Every so often I can entice Baby V to sit on my lap with a book, but we never read through a full story. Instead, we linger on the page with all the junk food that the Hungry Caterpillar chomps through, sticking our fingers in the little holes in the pages. And we make Froggy pop in and out of the bucket. And we study the little goats as they climb up and up and up on the hay bales. © Society for Science & the Public 2000 - 2014
by Bethany Brookshire

There are some scientific topics that are bound to generate excitement. A launch to the moon, a potential cure for cancer or any study involving chocolate will always make the news. And then of course there’s caffeine. More than half of Americans have a daily coffee habit, not to mention the boost offered by tea, soda, chocolate and energy drinks. We’d all love to believe that it has more benefit than just papering over a poor night’s sleep.

This week, scientists reported that caffeine could give a jolt to memory consolidation, the step right after your brain acquires a memory. During memory consolidation, activity patterns laid down in your brain become more permanent. The study suggested that caffeine might perk up this stage of memory formation. But while it’s an interesting finding, the scientific brew may not be strong enough to justify your coffee habit.

Caffeine is a great way to wake you up. It blocks the action of adenosine, a chemical messenger that promotes sleep. Caffeine also has indirect effects on other chemical messengers such as norepinephrine, the neurotransmitter that gives us our famous “fight or flight” response. The net result is increased attention, wakefulness and faster responses. But attention, focus and response time are not memory. And previous studies of memory, says neuroscientist Michael Yassa, the lead author on the new study, were “all over the place.” So Yassa, then at Johns Hopkins University (he’s now at the University of California, Irvine), and undergraduate student Daniel Borota decided to study the effects of caffeine on memory “in a rigorous way.” © Society for Science & the Public 2000 - 2014
A clean slate—that’s what people suffering from posttraumatic stress disorder (PTSD) crave most with their memories. Psychotherapy is more effective at muting more recent traumatic events than those from long ago, but a new study in mice shows that modifying the molecules that attach to our DNA may offer a route to quashing painful memories in both cases.

One of the most effective treatments for PTSD is exposure psychotherapy. A behavioral psychologist asks a patient to recall and confront a traumatic event; each time the traumatic memory is revisited, it becomes susceptible to editing through a phenomenon known as memory reconsolidation. As the person relives, for example, a car crash, the details of the event—such as the color and make of the vehicle—gradually uncouple from the anxiety, reducing the likelihood of a panic attack the next time the patient sees, say, a red Mazda. Repeated therapy sessions can also lead to memory extinction, in which the fears tied to an event fade away as old memories are replaced with new ones.

Yet this therapy works only for recent memories. If too much time passes before intervention, the haunting visions become stalwart, refusing to budge from the crevices of the mind. This persistence raises the question of how the brain tells the age of a memory in the first place. Researchers at the Massachusetts Institute of Technology, led by neurobiologist Li-Huei Tsai, have now uncovered a chemical modification of DNA that regulates gene activity and dictates whether a memory is too old for reconsolidation in mice. A drug that tweaks these “memory wrinkles” gives old memories a face-lift, allowing them to be edited by reconsolidation and resulting in fear extinction during behavior therapy. © 2014 American Association for the Advancement of Science.
By Melissa Healy

Adolescents treated with the antidepressant fluoxetine -- better known by its commercial name, Prozac -- appear to undergo changes in brain signaling that result in changed behavior well into adulthood, says a new study. Adult mice and rats who were administered Prozac for a stretch of mid-adolescence responded to daunting social and physical challenges with less despair than animals who passed their teen years unmedicated, a team of researchers found. But, even as adults long separated from their antidepressant days, the Prozac veterans reacted to stressful situations with greater anxiety than did the adult Prozac virgins.

The latest research, published Wednesday in the Journal of Neuroscience, offers evidence that treatment with a selective serotonin reuptake inhibitor -- an SSRI antidepressant -- has long-lived effects on the developing brain. It also zeroes in on how and where fluoxetine effects those lasting changes: by modifying the cascade of chemical signals issued by the brain's ventral tegmentum -- a region active in mood regulation -- in stressful situations.

Yet the new research raises more questions than it answers, since the changes in adults who were treated with Prozac as adolescents seem contradictory. Sensitivity to stress appears to predispose one to developing depression. So how does a medication that treats depression in children and teens -- and that continues to protect them from depression as adults -- also heighten their sensitivity to stress?
Dan Hurley

Forget mindfulness meditation, computerized working-memory training, and learning a musical instrument, all methods recently shown by scientists to increase intelligence. There could be an easier answer. It turns out that sex might actually make you smarter.

Researchers in Maryland and South Korea recently found that sexual activity in mice and rats improves mental performance and increases neurogenesis (the production of new neurons) in the hippocampus, where long-term memories are formed. In April, a team from the University of Maryland reported that middle-aged rats permitted to engage in sex showed signs of improved cognitive function and hippocampal function. In November, a group from Konkuk University in Seoul concluded that sexual activity counteracts the memory-robbing effects of chronic stress in mice. “Sexual interaction could be helpful,” they wrote, “for buffering adult hippocampal neurogenesis and recognition memory function against the suppressive actions of chronic stress.” So growing brain cells through sex does appear to have some basis in scientific fact.

But there’s some debate over whether fake sex—pornography—could be harmful. Neuroscientists from the University of Texas recently argued that excessive porn viewing, like other addictions, can result in permanent “anatomical and pathological” changes to the brain. That view, however, was quickly challenged in a rebuttal from researchers at the University of California, Los Angeles, who said that the Texans "offered little, if any, convincing evidence to support their perspectives. Instead, excessive liberties and misleading interpretations of neuroscience research are used to assert that excessive pornography consumption causes brain damage." © 2014 by The Atlantic Monthly Group
Injuries to the head can leave victims susceptible to early death even years later through impaired judgement, a major analysis of survivors shows. Those with a history of psychiatric disorders before the injury are most at risk of dying prematurely. The study, in JAMA Psychiatry, of 40 years of data on more than two million people, showed that overall a brain injury trebled the risk. Suicide and fatal injuries were among the commonest causes of early death.

More than one million people in Europe are taken to hospital with a traumatic brain injury each year. The study, by researchers at the University of Oxford and the Karolinska Institute in Stockholm, looked at Swedish medical records between 1969 and 2009. They followed patients who survived the initial six-month danger period after injury.

The data showed that without injury 0.2% of people were dying prematurely - before the age of 56. However, the premature-death rate was three-fold higher in patients who had previously suffered traumatic brain injury. In those who also had a psychiatric disorder the rate soared to 4%. Dr Seena Fazel, one of the researchers in Oxford, said: "There are these subgroups with really high rates, and these are potentially treatable illnesses, so this is something we can do something about." BBC © 2014
A new website that helps determine whether someone might have Alzheimer's disease or dementia is so popular that the site crashed temporarily. Ohio State University's website says its Self-Administered Gerocognitive Exam (SAGE) is a test that can be done in your own home with a paper and pencil.

When researchers visited 45 community events where they asked people to take the simple test, they found that of the 1,047 who did it, 28 per cent were identified with cognitive impairment, test developer Dr. Douglas Scharre of Ohio State and his team reported Monday in The Journal of Neuropsychiatry and Clinical Neurosciences.

Participants were told the test represented their baseline level, which doctors could use for future comparisons during re-screening. "What we found was that this SAGE self-administered test correlated very well with detailed cognitive testing," Scharre said in a release. "If we catch this cognitive change really early, then we can start potential treatments much earlier than without having this test."

The Alzheimer Society of Canada says early diagnosis can help with planning, care and support. © CBC 2014
Link ID: 19137 - Posted: 01.16.2014
By ANDREW POLLACK

Lisa Tremblay still recalls in horror the time her daughter Kristin pulled a hot dog crawling with ants from the garbage at a cookout and prepared to swallow it. Kristin has a rare genetic abnormality that gives her an incessant, uncontrollable hunger. Some people with the condition, called Prader-Willi syndrome, will eat until their stomach ruptures and they die. And, not surprisingly, many are obese.

“She’s eaten dog food. She’s eaten cat food,” said Ms. Tremblay, who lives in Nokomis, Fla. When Kristin, now 28, was a child, neighbors once called social welfare authorities, thinking Kristin was not being fed because she complained of being hungry so much.

Once an obscure and neglected disease, Prader-Willi is starting to attract more attention from scientists and pharmaceutical companies for a simple reason: It may shed some light on the much broader public health problems of overeating and obesity.

“These are remarkable human models of severe obesity,” said Dr. Steven B. Heymsfield, a professor and former executive director of the Pennington Biomedical Research Center in Baton Rouge, La. “When we discover the underlying mechanism of these very rare disorders, they will shed light on garden-variety obesity.”

One drug being developed to help obese people lose weight has shown some preliminary signs of success in patients with Prader-Willi. The drug, beloranib, is believed to work by reducing fat synthesis and increasing fat use. In a small trial, it reduced weight and body fat and lowered the food-seeking urge, according to the drug’s developer, Zafgen. © 2014 The New York Times Company
By Ashutosh Jogalekar

Popular wisdom holds that caffeine enhances learning, alertness and retention, leading millions to consume coffee or caffeinated drinks before a challenging learning task such as attending a business strategy meeting or a demanding scientific presentation. However, a new study in the journal Nature Neuroscience conducted by researchers from Johns Hopkins hints that when it comes to long-term memory and caffeine, timing may be everything: caffeine may enhance consolidation of memories only if it is consumed after a learning or memory challenge.

In the study the authors conducted a randomized, double-blind controlled experiment in which 160 healthy female subjects between the ages of 18 and 30 were asked to perform a series of learning tasks. The subjects were handed cards with pictures of various random indoor and outdoor objects (for instance leaves, ducks and handbags) and asked to classify the objects as indoor or outdoor. Immediately after the task the volunteers were handed pills containing either 200 mg of caffeine or a placebo. Saliva samples to test for caffeine and its metabolites were collected after 1, 3 and 24 hours.

After 24 hours the researchers tested the participants’ recollection of the previous day’s task. Along with items from the task (‘old’) they were presented with new items (‘foils’) and similar-looking items (‘lures’), neither of which had been part of the task. They were then asked to again classify the items as old, new or similar. Volunteers in the caffeinated group were significantly more likely to mark the ‘similar’ items as ‘similar’ rather than ‘old’. That is, caffeinated participants were better able to distinguish the old items from the lookalikes, indicating that they were retaining the memory of the old items better than the people in the placebo group. © 2014 Scientific American
Training to improve cognitive abilities in older people lasted to some degree 10 years after the training program was completed, according to results of a randomized clinical trial supported by the National Institutes of Health. The findings showed training gains for aspects of cognition involved in the ability to think and learn, but researchers said memory training did not have an effect after 10 years.

The report, from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study, appears in the January 2014 issue of the Journal of the American Geriatrics Society. The project was funded by the National Institute on Aging (NIA) and the National Institute of Nursing Research (NINR), components of the NIH.

“Previous data from this clinical trial demonstrated that the effects of the training lasted for five years,” said NIA Director Richard J. Hodes, M.D. “Now, these longer term results indicate that particular types of cognitive training can provide a lasting benefit a decade later. They suggest that we should continue to pursue cognitive training as an intervention that might help maintain the mental abilities of older people so that they may remain independent and in the community.”

“ACTIVE is an important example of intervention research aimed at enabling older people to maintain their cognitive abilities as they age,” said NINR Director Patricia Grady, Ph.D. “The average age of the individuals who have been followed over the last 10 years is now 82. Given our nation’s aging population, this type of research is an increasingly high priority.”
Ian Sample, science correspondent

A cup or two of coffee could boost the brain's ability to store long-term memories, researchers in the US claim. People who had a shot of caffeine after looking at a series of pictures were better at distinguishing them from similar images in tests the next day, the scientists found. The task gives a measure of how precisely information is stored in the brain, reflecting a process called pattern separation that can be crucial in everyday situations. If the effect is real, and some scientists are doubtful, then it would add memory enhancement to the growing list of benefits that moderate caffeine consumption seems to provide.

Michael Yassa, a neuroscientist who led the study at Johns Hopkins University in Baltimore, said the ability to separate patterns was vital for discriminating between similar scenarios and experiences in life. "If you park in the same parking lot every day, the spot you choose can look the same as many others. But when you go and look for your car, you need to look for where you parked it today, not where you parked it yesterday," he said.

Writing in the journal Nature Neuroscience, Yassa described how 44 volunteers who were not heavy caffeine consumers and had abstained for at least a day were shown a rapid sequence of pictures on a computer screen. The pictures included a huge range of items, such as a hammer, a chair, an apple, a seahorse, a rubber duck and a car. © 2014 Guardian News and Media Limited
By Gary Stix

The blood-brain barrier is the Berlin Wall of human anatomy and physiology. Its closely packed cells shield neurons and the like from toxins and pathogens, while letting pass glucose and other essential chemicals for brain metabolism (caffeine?). For years, pharmaceutical companies and academic researchers have engaged in halting efforts to traverse this imposing blockade in order to deliver some of the big molecules that might potentially help slow the progression of devastating neurological diseases. Like would-be refugees from the former East Germany, many medications get snagged by border guards during the crossing—a molecular security force that either impedes or digests any invader.

There have been many attempts to secure safe passage—deploying chemicals that make brain-barrier “endothelial” cells shrivel up, or wielding tiny catheters or minute bubbles that slip through minuscule breaches. Success has been mixed at best—none of these molecular cargo carriers has made its way as far as human trials.

Roche, the Swiss-based drugmaker, reported in the Jan. 8 Neuron a bit of progress toward overcoming the lingering technical impediments. The study described a new technique that tricks one of the BBB’s natural checkpoints into letting through an elaborately engineered drug that attacks the amyloid-beta protein fragments that may be the primary culprit in the damage wrought by Alzheimer’s. The subterfuge involves the transferrin receptor, a docking site used to transport iron into the brain. Roche took a fragment of an antibody that binds the transferrin receptor and latched it onto another antibody that, once on the other side of the BBB, attaches to and then removes amyloid. © 2014 Scientific American
Link ID: 19121 - Posted: 01.13.2014
by Helen Thomson

A drug for perfect pitch is just the start: mastering new skills could become easy if we can restore the brain's youthful ability to create new circuits.

WANNABE maestros, listen up. A mood-stabilising drug can help you achieve perfect pitch – the ability to identify any note you hear without inferring it from a reference note. Since this is a skill that is usually acquired only early in life, the discovery is the first evidence that it may be possible to revert the human brain to a childlike state, enabling us to treat disorders and unlock skills that are difficult, if not impossible, to acquire beyond a certain age.

From bilingualism to sporting prowess, many abilities rely on neural circuits that are laid down by our early experiences. Until the age of 7 or so, the brain goes through several "critical periods" during which it can be radically changed by the environment. During these times, the brain is said to have increased plasticity. In order to take advantage of these critical periods, the brain needs to be stimulated appropriately so it lays down the neuronal circuitry needed for a particular ability. For example, young children with poor sight in one eye may develop lazy eye, or amblyopia. It can be treated by covering the better eye, forcing the child to use the lazy eye – but this strategy only works during the critical period. These windows of opportunity are fleeting, but now researchers are beginning to understand what closes them and how they might be reopened. © Copyright Reed Business Information Ltd.
by Bethany Brookshire

When most people think of the quintessential lab mouse, they think of a little white mouse with red eyes. Soft fur. A timid nature. But scientists think of something very different. This mouse is black, small and fast, with pink ears and a pinkish tail. It’s got black eyes to match. The fur may be soft, but the temper sure isn’t. This is the C57 Black 6 mouse.

Each Black 6 mouse should be almost identical to every other Black 6 mouse. They have been bred to their own siblings for hundreds of generations, so there should be very few genetic differences left. But even supposedly identical mouse strains have their differences. These take the form of mutations in single DNA base pairs that accumulate in different populations. Recently, researchers showed that one of these tiny changes in a single gene was enough to produce a huge difference in how two groups of Black 6 mice respond to drugs. And the authors identified a surprising number of other small DNA differences still waiting to be explored. On one level, the new work offers scientists a novel tool for identifying genes that could relate to behaviors. But it also serves as a warning: “identical” mouse populations aren’t as alike as many scientists had assumed.

The Black 6, the most common lab mouse in the United States, is used for everything from drug abuse studies to cancer research. The Black 6 is also the reference strain for the Mouse Genome Sequencing Consortium. Whenever scientists discover a new genetic change in a mouse strain, they compare it first against the Black 6. And it’s the mouse used by the International Knockout Mouse Consortium (now the International Mouse Phenotyping Consortium), which keeps a library of mouse embryos with different deleted genes. The Allen Brain Atlas, a database of neuroanatomy and gene activity throughout the mouse brain, relies on the Black 6 as well. © Society for Science & the Public 2000 - 2014
Oliver Burkeman

What happens when you attach several electrodes to your forehead, connect them via wires to a nine-volt battery and resistor, ramp up the current and send an electrical charge directly into your brain? Most people would be content just to guess, but last summer a 33-year-old from Alabama named Anthony Lee decided to find out. "Here we go… oooahh, that stings a little!" he says, in one of the YouTube videos recording his exploits. "Whoa. That hurts… Ow!" The video cuts out. When Lee reappears, the electrodes are gone: "Something very strange happened," he says thoughtfully. "It felt like something popped." (In another video, he reports a sudden white flash in his visual field, which he describes, in a remarkably calm voice, as "cool".)

You might conclude from this that Lee is a very foolish person, but the quest he's on is one that has occupied scientists, philosophers and fortune-hunters for centuries: to find some artificial way to improve upon the basic cognitive equipment we're born with, and thus become smarter and maintain mental sharpness into old age. "It started with Limitless," Lee told me – the 2011 film in which an author suffering from writer's block discovers a drug that can supercharge his faculties. "I figured, I'm a pretty average-intelligence guy, so I could use a little stimulation."

The scientific establishment, it's fair to say, remains far from convinced that it's possible to enhance your brain's capacities in a lasting way – whether via electrical jolts, brain-training games, dietary supplements, drugs or anything else. But that hasn't impeded the growth of a huge industry – and thriving amateur subculture – of "neuro-enhancement", which, according to the American Psychological Association, is worth $1bn a year. "Brain fitness technology" has been projected to be worth up to $8bn in 2015 as baby boomers age.
Anthony Lee belongs to the sub-subculture of DIY transcranial direct-current stimulation, or tDCS, whose members swap wiring diagrams and cautionary tales online, though if that makes you queasy, you can always pay £179 for Foc.us, a readymade tDCS headset that promises to "make your synapses fire faster" and "excite your prefrontal cortex", so that you can "get the edge in online gaming". © 2014 Guardian News and Media Limited