Chapter 17. Learning and Memory
by Helen Thomson

Have you read this before? A 23-year-old man from the UK almost certainly feels like he has – he's the first person to report persistent déjà vu stemming from anxiety rather than any obvious neurological disorder.

Nobody knows exactly how or why déjà vu happens, but for most of us it is rare. Some people experience it more often, as a side effect associated with epileptic seizures or dementia. Now, researchers have described the first person with what they call "psychogenic déjà vu" – where the cause appears to be psychological.

The man's episodes began just after he started university, a period when he felt anxious and was also experiencing obsessive compulsions. As time went on, his déjà vu became more and more prolonged, and then fairly continuous after he tried LSD. Now he avoids television and radio, and finds newspapers distressing because their content feels familiar.

There are different theories as to what is going on, says Christine Wells at Sheffield Hallam University in the UK, who has written a paper on the man's experiences. "The general theory is that there's a misfiring of neurons in the temporal lobes – which deal with recollection and familiarity. That misfiring during the process of recollection means we interpret a moment in time as something that has already been experienced," she says.

Surprisingly, when Wells gave the man a standard recall test, his scores resembled those of people his own age without the condition more than those of people with epilepsy-related déjà vu. An MRI and an EEG of his brain activity also showed no abnormalities.

© Copyright Reed Business Information Ltd.
By David Noonan

It was the day before Christmas, and the normally busy MIT laboratory on Vassar Street in Cambridge was quiet. But creatures were definitely stirring, including a mouse that would soon be world famous.

Steve Ramirez, a 24-year-old doctoral student at the time, placed the mouse in a small metal box with a black plastic floor. Instead of curiously sniffing around, though, the animal instantly froze in terror, recalling the experience of receiving a foot shock in that same box. It was a textbook fear response, and if anything, the mouse's posture was more rigid than Ramirez had expected. Its memory of the trauma must have been quite vivid. Which was amazing, because the memory was bogus: the mouse had never received an electric shock in that box. Rather, it was reacting to a false memory that Ramirez and his MIT colleague Xu Liu had planted in its brain.

"Merry Freaking Christmas," read the subject line of the email Ramirez shot off to Liu, who was spending the 2012 holiday in Yosemite National Park.

The observation culminated more than two years of a long-shot research effort and supported an extraordinary hypothesis: not only was it possible to identify brain cells involved in the encoding of a single memory, but those specific cells could be manipulated to create a whole new "memory" of an event that never happened. In this neuroscience breakthrough, the duo had implanted a false memory in a mouse.

"It's a fantastic feat," says Howard Eichenbaum, a leading memory researcher and director of the Center for Neuroscience at Boston University, where Ramirez did his undergraduate work. "It's a real breakthrough that shows the power of these techniques to address fundamental questions about how the brain works."
By Gary Stix

Our site recently ran a great story about how brain training really doesn't instantly endow you with a genius IQ. The games you play just make you better at playing those same games. They aren't a direct route to a Mensa membership.

Just a few days before that story came out, Proceedings of the National Academy of Sciences published a report suggesting that playing action video games – Call of Duty: Black Ops II and the like – actually lets gamers learn the essentials of a particular visual task (judging the orientation of a Gabor signal – don't ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has "transfer effects." Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers.

Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So what's going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that's what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple is a lot less simple than proponents make it out to be.

© 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014
by Helen Thomson

Zapping your brain might make you better at maths tests – or worse. It depends how anxious you are about taking the test in the first place.

A recent surge of studies has shown that brain stimulation can make people more creative and better at maths, and can even improve memory, but these studies tend to neglect individual differences. Now, Roi Cohen Kadosh at the University of Oxford and his colleagues have shown that brain stimulation can have completely opposite effects depending on your personality.

Previous research has shown that a type of non-invasive brain stimulation called transcranial direct current stimulation (tDCS) – which enhances brain activity using an electric current – can improve mathematical ability when applied to the dorsolateral prefrontal cortex, an area involved in regulating emotion. To test whether personality traits might affect this result, Cohen Kadosh's team tried the technique on 25 people who find mental arithmetic highly stressful and 20 people who do not.

They found that participants with high maths anxiety made correct responses more quickly and, after the test, showed lower levels of cortisol, an indicator of stress. On the other hand, individuals with low maths anxiety performed worse after tDCS.

"It is hard to believe that all people would benefit similarly [from] brain stimulation," says Cohen Kadosh. He says that further research could shed light on how to optimise the technology and help to discover who is most likely to benefit from stimulation.

© Copyright Reed Business Information Ltd.
Ian Sample, science editor

Electrical brain stimulation equipment – which can boost cognitive performance and is easy to buy online – can also impair brain functioning, research from scientists at Oxford University has shown.

A steady stream of reports of stimulators boosting brain performance, coupled with the simplicity of the devices, has led to a rise in DIY enthusiasts who cobble the equipment together themselves, or buy it assembled on the web, then zap themselves at home. In science laboratories, brain stimulators have long been used to explore cognition. The equipment uses electrodes to pass gentle electric pulses through the brain, to stimulate activity in specific regions of the organ.

Roi Cohen Kadosh, who led the study, published in the Journal of Neuroscience, said: "It's not something people should be doing at home at this stage. I do not recommend people buy this equipment. At the moment it's not therapy, it's an experimental tool."

The Oxford scientists used a technique called transcranial direct current stimulation (tDCS) to stimulate the dorsolateral prefrontal cortex in students as they did simple sums. The results of the test were surprising. Students who became anxious when confronted with sums became calmer and solved the problems faster than when they had sham stimulation (the stimulation itself lasted only 30 seconds of the half-hour study). The shock was that the students who did not fear maths performed worse with the same stimulation.
By Bret Stetka

When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did.

The hormone was oxytocin, often called our "love hormone" due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala – the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples – falls.

The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign – like, say, a balloon popping – a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn't subside and fear memories are more easily formed.

To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks.

© 2014 Scientific American
By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" allowing them to keep track of exactly where, and even which way up, they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature.

Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they had flown around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving."

Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize for physiology and medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells – essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other.

BBC © 2014
By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS

NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series "Cosmos," regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush's speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order "to distinguish we from they" — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words "Our God is the God who named the stars." Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact.

This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn't blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory.

In his post-9/11 speech, Mr. Bush actually said, "The enemy of America is not our many Muslim friends," and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that "the same creator who names the stars also knows the names of the seven souls we mourn today." Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20387 - Posted: 12.03.2014
By David Z. Hambrick

If you've spent more than about five minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training, better known as "brain training," is one of the hottest new trends in self-improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be "a computer-based solution for attention problems caused by poor working memory," and BrainHQ will help you "make the most of your unique brain." The promise of all of these products, implied or explicit, is that brain training can make you smarter and make your life better.

Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world's leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based 'brain games' alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease."

The statement also cautions that although some brain training companies "present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell."

© 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20380 - Posted: 12.02.2014
By Nicholas Bakalar

Researchers have found that people diagnosed with diabetes in their 50s are significantly more likely than others to suffer mental decline by their 70s.

The study, published Monday in the Annals of Internal Medicine, started in 1990. Scientists examined 13,351 black and white adults, aged 48 to 67, for diabetes and prediabetes using self-reported physician diagnoses and glucose control tests. They also administered widely used tests of memory, reasoning, problem solving and planning. About 13 percent had diabetes at the start. The researchers followed them with five periodic examinations over the following 20 years, by which time 5,987 participants were still enrolled.

After adjusting for numerous health and behavioral factors, and for the large attrition in the study, the researchers found that people with diabetes suffered a 30 percent larger decline in mental acuity than those without the disease. Diabetes can impair blood circulation, and the authors suggest that the association of diabetes with thinking and memory problems may be the result of damage to small blood vessels in the brain.

"People may think cognitive decline with age is inevitable, but it's not," said the senior author, Elizabeth Selvin, an associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health. "Factors like diabetes are potentially modifiable. If we can better control diabetes, we can stave off cognitive decline and future dementia."

© 2014 The New York Times Company
by Andy Coghlan

What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings.

The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human."

Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups, where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins."

Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses.

© Copyright Reed Business Information Ltd.
By BENEDICT CAREY

Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.)

Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades.

The particulars from the new study, which tested Americans' ability to recollect the names of past presidents, are hardly jaw-dropping: people tend to recall best the presidents who served recently, as well as the first few in the country's history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science.

"It's an exciting study, because it mixes history and psychology and finds this one-on-one correspondence" in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research.

The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order.

© 2014 The New York Times Company
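The claim that societies forget "according to the same formula" as individuals can be made concrete with a toy model. The sketch below uses a generic exponential forgetting curve; this is a standard textbook form, not the actual function fitted in the Science paper, and all the numbers are invented for illustration only.

```python
import math

def retention(t, stability):
    """Generic forgetting curve R(t) = exp(-t / s).

    t: time since encoding (any unit).
    stability: parameter s controlling how slowly recall decays,
    in the same unit as t. Returns the fraction still recalled.
    """
    return math.exp(-t / stability)

# Hypothetical values: a studied word list might decay with a
# stability of days, a president's name in collective memory with
# a stability of decades. Same curve shape, different time scale.
word_list_recall = retention(7, 10.0)    # 7 days after study
president_recall = retention(25, 40.0)   # 25 years after leaving office
```

The point of the model is only that the two processes share a functional form: rescale the time axis and the word-list curve and the presidents curve lie on top of each other.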
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
By Linda Searing

THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter?

THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone's IQ, education or environment.

WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep, as well as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Gary Stix

One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation — a ponderous technical term that, once translated, means giving yourself a second chance.

Memories of our daily experience are formed, often during sleep, by inscribing — or "consolidating" — a record of what happened into neural tissue: joy at the birth of a child, or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream.

Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist's office, the patient lets the demons return, and the goal is then to reshape the memory into a new, more benign form. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. A clinician would say that the memory has undergone a change in "valence" — from negative to neutral and detached.

The trick to successful reconsolidation is to revive these memories without provoking the panic and chaos that can only make things worse. Talk therapies and psychopharmacology may not be enough. One new idea just starting to be explored is toning down memories while a patient is fast asleep.

© 2014 Scientific American
By MAX BEARAK

MUMBAI, India — The young man sat cross-legged atop a cushioned divan on an ornately decorated stage, surrounded by other Jain monks draped in white cloth. His lip occasionally twitched, his hands lay limp in his lap, and for the most part his eyes were closed. An announcer repeatedly chastised the crowd for making even the slightest noise.

From daybreak until midafternoon, members of the audience approached the stage, one at a time, to show the young monk a random object, pose a math problem, or speak a word or phrase in one of at least six different languages. He absorbed the miscellany silently, letting it slide into his mind, as onlookers in their seats jotted everything down on paper. After six hours, the 500th and last item was uttered — it was the number 100,008. An anxious hush descended over the crowd. And the monk opened his eyes and calmly recalled all 500 items, in order, detouring only once to fill in a blank he had momentarily set aside. When he was done, and the note-keepers in the audience had confirmed his achievement, the tense atmosphere dissolved and the announcer led the crowd in a series of triumphant chants.

The opportunity to witness the feat of memory drew a capacity crowd of 6,000 to the Sardar Vallabhbhai Patel stadium in Mumbai on Sunday. The exhibition was part of a campaign to encourage schoolchildren to use meditation to build brainpower, as Jain monks have done for centuries in India, a country drawn both toward ancient religious practices and more recent ambitions. But even by Jain standards, the young monk — Munishri Ajitchandrasagarji, 24 — is something special. His guru, P. P. Acharya Nayachandrasagarji, said no other monk in many years had come close to his ability.

© 2014 The New York Times Company
By Gretchen Reynolds

Exercise seems to be good for the human brain, with many recent studies suggesting that regular exercise improves memory and thinking skills. But an interesting new study asks whether the apparent cognitive benefits from exercise are real or just a placebo effect — that is, if we think we will be "smarter" after exercise, do our brains respond accordingly? The answer has significant implications for any of us hoping to use exercise to keep our minds sharp throughout our lives.

In experimental science, the best, most reliable studies randomly divide participants into two groups, one of which receives the drug or other treatment being studied and the other of which is given a placebo, similar in appearance to the drug but not containing the active ingredient. Placebos are important because they help scientists to control for people's expectations. If people believe that a drug, for example, will lead to certain outcomes, their bodies may produce those results, even if the volunteers are taking a look-alike dummy pill. That's the placebo effect, and its occurrence suggests that the drug or procedure under consideration isn't as effective as it might seem to be; some of the work is being done by people's expectations, not by the medicine.

Recently, some scientists have begun to question whether the apparently beneficial effects of exercise on thinking might be a placebo effect. While many studies suggest that exercise may have cognitive benefits, those experiments all have had a notable scientific limitation: they have not used placebos. This issue is not some abstruse scientific debate. If the cognitive benefits from exercise are a result of a placebo effect rather than of actual changes in the brain because of the exercise, then those benefits could be ephemeral and unable in the long term to help us remember how to spell ephemeral.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20329 - Posted: 11.20.2014
By NICK BILTON

Ebola sounds like the stuff of nightmares. Bird flu and SARS also send shivers down my spine. But I'll tell you what scares me most: artificial intelligence. The first three, with enough resources, humans can stop. The last, which humans are creating, could soon become unstoppable.

Before we get into what could possibly go wrong, let me first explain what artificial intelligence is. Actually, skip that. I'll let someone else explain it: grab an iPhone and ask Siri about the weather or stocks. Or tell her "I'm drunk." Her answers are artificially intelligent.

Right now these artificially intelligent machines are pretty cute and innocent, but as they are given more power in society, these machines may not take long to spiral out of control. In the beginning, the glitches will be small but eventful. Maybe a rogue computer momentarily derails the stock market, causing billions in damage. Or a driverless car freezes on the highway because a software update goes awry. But the upheavals can escalate quickly and become scarier and even cataclysmic. Imagine how a medical robot, originally programmed to rid the body of cancer, could conclude that the best way to obliterate cancer is to exterminate the humans who are genetically prone to the disease.

Nick Bostrom, author of the book "Superintelligence," lays out a number of petrifying doomsday scenarios. One envisions self-replicating nanobots, which are microscopic robots designed to make copies of themselves. In a positive situation, these bots could fight diseases in the human body or eat radioactive material on the planet. But, Mr. Bostrom says, a "person of malicious intent in possession of this technology might cause the extinction of intelligent life on Earth."

© 2014 The New York Times Company
By James Gallagher
Health editor, BBC News website

Working antisocial hours can prematurely age the brain and dull intellectual ability, scientists warn. Their study, in the journal Occupational and Environmental Medicine, suggested a decade of shift work aged the brain by more than six years. There was some recovery after people stopped working antisocial shifts, but it took five years to return to normal. Experts say the findings could be important in dementia, as many patients have disrupted sleep.

The body's internal clock is designed for us to be active in the day and asleep at night. The damaging effects on the body of working against the body clock, from breast cancer to obesity, are well known. Now a team at the University of Swansea and the University of Toulouse has shown an impact on the mind as well.

Three thousand people in France performed tests of memory, speed of thought and wider cognitive ability. The brain naturally declines as we age, but the researchers said working antisocial shifts accelerated the process. Those with more than 10 years of shift work under their belts had the same results as someone six and a half years older. The good news is that when people in the study quit shift work, their brains did recover, even if it took five years.

Dr Philip Tucker, part of the research team in Swansea, told the BBC: "It was quite a substantial decline in brain function. It is likely that when people are trying to undertake complex cognitive tasks they might make more mistakes and slip-ups; maybe one in 100 makes a mistake with a very large consequence. But it's hard to say how big a difference it would make in day-to-day life."

BBC © 2014
By RICHARD A. FRIEDMAN

ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason.

Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don't match the expectations of our contemporary culture.

From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the "illness," the real problem is that, to your brain, the world that you live in essentially feels not very interesting.

One of my patients, a young woman in her early 20s, is prototypical. "I've been on Adderall for years to help me focus," she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. "No problem. I can paint for hours at a stretch."

Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, "Pay attention, this is an important experience that is worth remembering."

© 2014 The New York Times Company
Maanvi Singh

How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion.

"I asked them: 'So what's moving? And why?'" Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions."

Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many other teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information.

© 2014 NPR