Most Recent Links
By Alyssa Abkowitz If you’re wary of investing in a certain stock or exchange-traded fund, it could be because of your brain’s physical composition. In a recent study, 61 participants from the Northeastern U.S. were asked to choose between monetary options that differed in the level of risk. Questions included: “Would you prefer a 50 percent chance of receiving $5 or would you rather take a 13 percent chance of winning $50?” and “Would you prefer $10 for sure or a 50 percent chance of receiving $50?” Researchers found that individuals with more gray matter in a specific part of their brains tend to tolerate more financial risk, says Agnieszka Tymula, an economist at the University of Sydney and co-author of the findings. Most of the participants answered questions while their brains were being scanned, while others received MRIs afterward (the timing doesn’t make a difference because the researchers were looking at brain structure, not brain function). The study involved measuring the volume of gray matter, or the outer layer of the brain, in the right posterior parietal region of the cortex. Thicker gray matter corresponded to riskier responses. Tymula worked with researchers from Yale University, University College London, New York University, and the University of Pennsylvania. Their findings, published in the Journal of Neuroscience this month, dovetail with previous work in which Tymula found that adults become more risk-averse as they age. Other neuroscience research shows that people’s cortexes become thinner as they get older, meaning there could be a link between a thinning cortex and risk aversion. © 2014 Bloomberg L.P.
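The gambles quoted above differ not only in risk but in expected value, which is what makes the choices informative. A quick sketch makes the arithmetic explicit (the probabilities and dollar amounts come from the article; the comparison code itself is an illustration, not the study's method):

```python
def expected_value(prob: float, amount: float) -> float:
    """Expected payoff of a gamble paying `amount` with probability `prob`."""
    return prob * amount

# Q1: 50% chance of $5 vs. a 13% chance of $50
q1_safe = expected_value(0.50, 5)    # $2.50
q1_risky = expected_value(0.13, 50)  # $6.50

# Q2: $10 for sure vs. a 50% chance of $50
q2_safe = expected_value(1.00, 10)   # $10.00
q2_risky = expected_value(0.50, 50)  # $25.00

# In both questions the riskier option has the higher expected value,
# so a purely risk-neutral chooser would gamble every time; consistently
# preferring the safe option is what reveals risk aversion.
for safe, risky in [(q1_safe, q1_risky), (q2_safe, q2_risky)]:
    print(f"safe EV ${safe:.2f} vs. risky EV ${risky:.2f}")
```

Because the gambles are stacked in favor of risk-taking, how often a participant still picks the sure thing becomes a behavioral measure that can be correlated with gray-matter volume.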
by Laura Sanders Earlier this month, a star running back for the Minnesota Vikings was indicted for whipping his young son bloody with a switch. Leaked photographs allegedly showed Adrian Peterson’s 4-year-old son with cuts and bruises on his legs, back, buttocks and scrotum. As details about the incident emerged, Peterson took to Twitter to say that he’s not a perfect parent but what he did was not abuse. It was discipline. “My goal is always to teach my son right from wrong and that’s what I tried to do that day,” he wrote. Many people, and I’m one of them, think Peterson’s actions were disgusting. There’s no way that hitting a 4-year-old with a switch until his body is cut and bruised is a good way to impart values and morals. Peterson’s extreme actions, done in the name of corporal punishment, ignited a ferocious, emotionally fraught debate over whether it’s OK to hit your kid. The debate reflects deep divides in our society, chasms that track along political, religious, regional and racial lines. Half of all U.S. parents say they’ve spanked their kid. Spanking doesn’t just happen in the privacy of homes, either. Nineteen states allow teachers or principals to hit children. Opponents often point to scientific studies as proof that spanking is bad. And I confess, I originally thought this post was going to describe those results that we’ve all heard: how children who have been spanked are more aggressive and have more behavioral problems. But despite the headlines, the science behind spanking is actually quite limited, says clinical psychologist Christopher Ferguson of Stetson University in DeLand, Fla. “Because it’s a culture war issue, I think a lot of what we hear has misrepresented what is very nuanced science,” he says. © Society for Science & the Public 2000 - 2014.
Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region controls executive functions such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even be that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we’re all guilty of doing.
Link ID: 20115 - Posted: 09.25.2014
by Greg Laden I heard yesterday that my friend and former advisor Irven DeVore died. He was important, amazing, charming, difficult, harsh, brilliant, fun, annoying. My relationship to him as an advisee and a friend was complex, important to me for many years, and formative. For those who don’t know, he was instrumental in developing several subfields of anthropology, including behavioral biology, primate behavioral studies, hunter-gatherer research, and even ethnoarchaeology. He was a cultural anthropologist who realized during his first field season that a) he was not cut out to be a cultural anthropologist and b) most of the other cultural anthropologists were not either. Soon after, he became Washburn’s student and independently invented the field study of complex social behavior in primates (though some others were heading in that direction at the same time), producing his famous work on the baboons of Kenya’s Nairobi National Park. For many years, what students learned about primate behavior, they learned from that work. Later he and Richard Lee, along with John Yellen, Alison Brooks, Henry Harpending, and others started up the study of the Ju/’hoansi Bushmen along the Namibia/Botswana border. One of the outcomes of that work was the famous Wenner-Gren conference and volume called “Man the Hunter.” That volume has two roles in the history of anthropology. First, it launched modern forager studies. Second, it became one of the more maligned books in the field of Anthropology. I have yet to meet a single person who has a strong criticism of that book that is not based on having not read it. For many years, much of what students learned about human foragers, they learned from that work.
Link ID: 20114 - Posted: 09.25.2014
By Sarah C. P. Williams Press the backs of your hands against the inside of a door frame for 30 seconds—as if you’re trying to widen the frame—and then let your arms down; you’ll feel something odd. Your arms will float up from your sides, as if lifted by an external force. Scientists call this the Kohnstamm phenomenon, but you may know it as the floating arm trick. Now, researchers have studied what happens in a person’s brain and nerve cells when they repress this involuntary movement, holding their arms tightly by their sides instead of letting them float up. Two theories existed as to how this repression worked: The brain could send a positive “push down” signal to the arm muscles at the same time as the involuntary “lift up” signal was being transmitted to cancel it out; or the brain could entirely block the involuntary signal at the root of the nerves. The new study, which analyzed brain scans and muscle activity recordings from 39 volunteers, found that the latter was true—when a person stifles the Kohnstamm phenomenon, the involuntary “lift” signal is blocked before it reaches the muscle. The difference between the repression mechanisms may seem subtle, but understanding it could help people repress other involuntary movements—including the tremors associated with Parkinson’s disease and the tics associated with Tourette syndrome, the team reports online today in the Proceedings of the Royal Society B. © 2014 American Association for the Advancement of Science
By Dick Miller, CBC News Dan Campbell felt the bullets whiz past his head. The tracer rounds zipped between his legs. It was his first firefight as a Canadian soldier in Afghanistan. “I was completely frightened and scared like I’d never been before in my life,” he says. As the attack continued, the sights, sounds and smells started to form memories inside his brain. The fear he felt released the hormone norepinephrine, and in the complex chemistry of the brain, the memories of the battle became associated with the fear. “I think one day, hopefully in the not-too-distant future, we will be able to delete a memory,” says Dr. Sheena Josselyn, a senior scientist at the Hospital for Sick Children Research Institute. Six years later, a sight or sound such as a firecracker or car backfiring can remind Campbell of that night in 2008. The fear comes back and he relives rather than remembers the moments. “It can be hard. Physically, you know, there’s the tapping foot, my heart beating,” he says. Like so many soldiers and victims of assault or people who have experienced horrific accidents, Campbell was diagnosed with post-traumatic stress disorder. Now a newspaper reporter in Yellowknife, Campbell thinks one day he may get therapy. But for now he is working on his own to control the fear and anger the memories bring. © CBC 2014
By SAM BORDEN Bellini, a Brazilian soccer star who led the team that won the 1958 World Cup and was honored with a statue outside the Estádio do Maracanã in Rio de Janeiro, had a degenerative brain disease linked to dozens of boxers and American football players when he died in March at age 83. At the time, his death was attributed to complications related to Alzheimer’s disease. But researchers now say he had an advanced case of chronic traumatic encephalopathy, or C.T.E., which is caused by repeated blows to the head and has symptoms similar to those of Alzheimer’s. C.T.E. can be diagnosed only posthumously, and few brains of former soccer players have been examined. Bellini is the second known case, according to Dr. Ann McKee, a neuropathologist at Boston University and the Veterans Affairs Medical Center in Bedford, Mass., who assisted in examining Bellini’s brain. McKee was also involved this year when researchers found C.T.E. in the brain of a 29-year-old man from New Mexico who had played soccer semiprofessionally. McKee said in an interview that she was aware of a third former soccer player who had C.T.E. but that she was not yet authorized to publicly identify the person. As C.T.E. began to gain widespread attention about six years ago, it was often thought of as an American problem. Many of the early cases of the disease, for which there is no known cure, were connected to boxers and American football players. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20110 - Posted: 09.24.2014
by Sarah Zielinski Chimps may be cute and have mannerisms similar to humans, but they are wild animals. A new study finds that chimps raised as pets or entertainers have behavioral problems as adults. There are plenty of good reasons why chimpanzees should not be pets or performers, no matter how cute or humanlike they appear: They are wild animals. They can be violent with each other. And they can be violent toward humans — even humans that have a long history with the chimp. Plus, there’s evidence that seeing an adorable chimp dressed up like a miniature human actually makes us care less about the plight of their species. Now comes evidence that the way that chimps are raised to become pets or entertainers — taking them away from other chimps at a young age and putting them in the care of humans, who may or may not feed and care for them properly — has long-term, negative effects on their behavior. “We now add empirical evidence of the potentially negative welfare effects on the chimpanzees themselves as important considerations in the discussion of privately owned chimpanzees,” Hani Freeman and Stephen Ross of the Lincoln Park Zoo in Chicago write September 23 in PeerJ. Freeman and Ross compiled life history and behavioral data on 60 captive chimps living in zoos. Some of the animals had always lived in zoos and grew up in groups of chimpanzees. Six were raised solely by humans and were later placed in zoos after they became too big or too old for their owners to care for them. Others had a more mixed background. © Society for Science & the Public 2000 - 2014
by Jennifer Viegas Harems -- where a group of females share a single mate -- can be sexual bliss for the male, but the arrangement poses many challenges for him, according to a new study. Male leaders of harems are often overworked and tired, finds the study, published in the latest issue of Royal Society Open Science. Gelada baboons exemplify the problems. "Being a gelada leader male is fairly exhausting," co-author David Pappano told Discovery News. "In order to keep the females within his harem happy, gelada leader males spend a lot of time grooming them." "When bachelors are around, leader males often engage in costly displays -- running around, climbing up a tree, and producing a very loud (ee-yow) display call," added Pappano, who is an NSF postdoctoral research fellow at Princeton University's Department of Ecology and Evolutionary Biology. He co-authored the paper with Jacinta Beehner. © 2014 Discovery Communications, LLC.
Keyword: Sexual Behavior
Link ID: 20108 - Posted: 09.24.2014
By Simon Makin The Claim: Casual cannabis use harms young people's brains. The Facts: A study found differences in the brains of users and nonusers, but it did not establish that marijuana use caused the variations or that they had any functional significance. The Details: Researchers at Northwestern University and Harvard Medical School conducted MRI scans of two groups of 20 young adults ages 18 to 25. One group reported using marijuana at least once a week, smoking 11 joints a week on average, whereas the other had used it less than five times total and not at all during the last year. Neither group had any psychiatric disorders, and the users were psychiatrically assessed as not dependent on the drug. The study focused on two brain regions involved in processing rewards, the nucleus accumbens and the amygdala. These areas create pleasurable experiences of things such as food and sex, as well as the high associated with drugs, and have been shown to change in animals given THC, the main psychoactive component of cannabis. The researchers found that cannabis users had more gray matter density in the left nucleus accumbens and left amygdala, as well as differences in the shape of the left nucleus accumbens and right amygdala. The left nucleus accumbens also tended to be slightly larger in users. They concluded that recreational cannabis use might be associated with abnormalities in the brain's reward system. News reports have proclaimed that scientists have shown that even casual cannabis use harms young people's brains. The Caveats: The most obvious problem with leaping to that conclusion is that the scans were conducted at only one point. © 2014 Scientific American
By Corinne Iozzio Albert “Skip” Rizzo of the University of Southern California began studying virtual reality (VR) as psychological treatment in 1993. Since then, dozens of studies, his included, have shown the immersion technique to be effective for everything from post-traumatic stress disorder (PTSD) and anxiety to phobias and addiction. But a lack of practical hardware has kept VR out of reach for clinicians. The requirements for a VR headset seem simple—a high-resolution, fast-reacting screen, a field of vision that is wide enough to convince patients they are in another world and a reasonable price tag—yet such a product has proved elusive. Says Rizzo, “It’s been 20 frustrating years.” In 2013 VR stepped into the consumer spotlight in the form of a prototype head-mounted display called the Oculus Rift. Inventor Palmer Luckey’s goal was to create a platform for immersive video games, but developers from many fields—medicine, aviation, tourism—are running wild with possibilities. The Rift’s reach is so broad that Oculus, now owned by Facebook, hosted a conference for developers in September. The Rift, slated for public release in 2015, is built largely from off-the-shelf parts, such as the screens used in smartphones. A multi-axis motion sensor lets the headset refresh imagery in real time as the wearer’s head moves. The kicker is the price: $350. (Laboratory systems start at $20,000.) Rizzo has been among the first in line. His work focuses on combat PTSD. In a 2010 study, he placed patients into controlled traumatic scenarios, including a simulated battlefield, so they could confront and process emotions triggered in those situations. © 2014 Scientific American
Jia You In the future, a nurse could determine whether a baby is likely to develop a reading disorder simply by attaching a few electrodes to its scalp and watching its brain waves respond to human speech. Such is the scenario suggested by a new study, which finds a potential biological indicator of how well preschool children perceive rhythm, an ability linked to language development. “It’s really impressive to work with children this young, who are not often looked at,” says Aniruddh Patel, a cognitive neuroscientist at Tufts University in Medford, Massachusetts, who was not involved with the research. Spoken language consists of sound waves occurring over multiple timescales. A syllable, for example, takes place over a quarter of a second, while a sentence unfolds over a few seconds. To make sense of this complex auditory information, humans use rhythmic cues such as stress and pause to discern words and syllables. Adults and school-aged children with reading disorders, however, struggle to pick up on these rhythmic patterns. Scientists estimate that dyslexia and other reading disabilities plague about 5% to 10% of the population. Detecting such impairments early could lead to more effective intervention, but observing telltale signs in younger children who have not learned to read has proven a challenge. So biologist Nina Kraus of Northwestern University in Evanston, Illinois, and her colleagues looked for automatic brain responses that can track language development in preschoolers, who have not learned to read. © 2014 American Association for the Advancement of Science
By Melinda Wenner Moyer Autism is primarily a disorder of the brain, but research suggests that as many as nine out of 10 individuals with the condition also suffer from gastrointestinal problems such as inflammatory bowel disease and “leaky gut.” The latter condition occurs when the intestines become excessively permeable and leak their contents into the bloodstream. Scientists have long wondered whether the composition of bacteria in the intestines, known as the gut microbiome, might be abnormal in people with autism and drive some of these symptoms. Now a spate of new studies supports this notion and suggests that restoring proper microbial balance could alleviate some of the disorder's behavioral symptoms. At the annual meeting of the American Society for Microbiology held in May in Boston, researchers at Arizona State University reported the results of an experiment in which they measured the levels of various microbial by-products in the feces of children with autism and compared them with those found in healthy children. The levels of 50 of these substances, they found, significantly differed between the two groups. And in a 2013 study published in PLOS ONE, Italian researchers reported that, compared with healthy kids, those with autism had altered levels of several intestinal bacterial species, including fewer Bifidobacterium, a group known to promote good intestinal health. One open question is whether these microbial differences drive the development of the condition or are instead a consequence of it. A study published in December 2013 in Cell supports the former idea. When researchers at the California Institute of Technology incited autismlike symptoms in mice using an established paradigm that involved infecting their mothers with a viruslike molecule during pregnancy, they found that after birth, the mice had altered gut bacteria compared with healthy mice. © 2014 Scientific American
Link ID: 20104 - Posted: 09.23.2014
By Nicholas Bakalar Average waist circumference — but not body mass index — increased significantly in the United States between 1999 and 2012, a new study reports. Abdominal obesity — a “beer belly” or “beer gut” — is caused by fat around the internal organs. It is one of the indicators of metabolic syndrome, a group of five conditions that raises the risk for heart disease and diabetes. After adjusting for age, the overall mean waist circumference increased to 38.7 inches in 2012 from 37.5 in 1999. The increases were significant for men, women, non-Hispanic whites, non-Hispanic blacks and Mexican-Americans. They were greatest among non-Hispanic whites in their 40s, and non-Hispanic black men in their 30s. “I would encourage people to keep track of their waists,” said the lead author of the study, Dr. Earl S. Ford, a medical officer with the Centers for Disease Control and Prevention. “Standing on the scale every day is all good and well, but you can have a steady weight and still have an expanding waist. And that should be a signal for people to start looking at their diet and physical activity.” In 2012, 54.2 percent of Americans had abdominal obesity (defined as an age-adjusted waist circumference of more than 40 inches for men and more than 34.6 for women) compared with 46.4 percent in 1999. The study was published in JAMA. © 2014 The New York Times Company
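The cutoffs quoted in the study reduce to a simple sex-specific threshold check. As a sketch (the thresholds are the ones reported in the article; the function itself is our own illustration, not the study's code):

```python
def has_abdominal_obesity(waist_inches: float, sex: str) -> bool:
    """Apply the article's waist-circumference cutoffs:
    more than 40 inches for men, more than 34.6 inches for women."""
    threshold = 40.0 if sex == "male" else 34.6
    return waist_inches > threshold

# The 2012 age-adjusted mean waist of 38.7 inches falls below the
# male cutoff but above the female one.
print(has_abdominal_obesity(38.7, "male"))    # False
print(has_abdominal_obesity(38.7, "female"))  # True
```

This also shows why the population averages understate the headline figure: a mean of 38.7 inches is under the male cutoff, yet 54.2 percent of individuals still exceeded their own sex-specific threshold.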
Link ID: 20103 - Posted: 09.23.2014
2014 by Dan Jones The vast majority of people think we have free will and are the authors of our own life stories. But if neuroscientists were one day able to predict our every action based on brain scans, would people abandon this belief in droves? A new study concludes that such knowledge would not by itself be enough to shake our confidence in our own volition. Many neuroscientists, such as the late Francis Crick, have argued that our sense of free will is no more than the behaviour of a vast assembly of nerve cells. This is tied to the idea of determinism, which has it that every effect is preceded by a cause, with cause and effect connected by physical laws. This is why the behaviour of physical systems can be predicted – even the brain, in principle. As author Sam Harris puts it: "If determinism is true, the future is set – and this includes all our future states of mind and our subsequent behaviour." If people lost their belief in their own free will, that would have important consequences for how we think about moral responsibility, and even how we behave. For example, numerous studies have shown that when people are led to reject free will they are more likely to cheat, and are also less bothered about punishing other wrongdoers. For those who argue that what we know about neuroscience is incompatible with free will, predicting what our brain is about to do should reveal the illusory nature of free will, and lead people to reject it. Experimental philosopher Eddy Nahmias at Georgia State University in Atlanta dubs this view "willusionism". He recently set out to test it. © Copyright Reed Business Information Ltd.
Link ID: 20102 - Posted: 09.22.2014
By Victoria Stern Many studies show that teens who use marijuana face a greater risk of later developing schizophrenia or symptoms of it, especially if they have a genetic predisposition. For instance, one 15-year study followed more than 45,000 Swedes who initially had no psychotic symptoms. The researchers determined that subjects who smoked marijuana by age 18 were 2.4 times more likely to be diagnosed with schizophrenia than their nonsmoking peers, and this risk increased with the frequency of cannabis use. The connection still held when researchers accounted for participants' use of other drugs. Yet despite these results and an uptick in marijuana use in the 1970s and 1980s, other researchers have not uncovered an increase in the incidence of schizophrenia in the general Swedish population—suggesting that perhaps people who were going to develop schizophrenia anyway were more likely to use marijuana. Another study, conducted in Australia over a 30-year period, also found no increase in schizophrenia diagnoses among the general population, despite rising rates of teen marijuana use. These authors concluded that although cannabis most likely does not cause schizophrenia, its use might trigger psychosis in vulnerable people or exacerbate an existing condition. © 2014 Scientific American
By Melissa Dahl Recently, I was visiting my family in Seattle, and we were doing that thing families do: retelling old stories. As we talked, a common theme emerged. My brother hardly remembered anything from our childhood, even the stories in which he was the star player. (That time he fell down the basement steps and needed stitches in the ER? Nope. That panicky afternoon when we all thought he’d disappeared, only to discover he’d been hiding in his room, and then fell asleep? Nothing.) “Boys never remember anything,” my mom huffed. She’s right. Researchers are finding some preliminary evidence that women are indeed better at recalling memories, especially autobiographical ones. Girls and women tend to recall these memories faster and with more specific details, and some studies have demonstrated that these memories tend to be more accurate, too, when compared to those of boys and men. And there’s an explanation for this: It could come down to the way parents talk to their daughters, as compared to their sons, when the children are developing memory skills. To understand this apparent gender divide in recalling memories, it helps to start with early childhood—specifically, ages 2 to 6. Whether you knew it or not, during these years, you learned how to form memories, and researchers believe this happens mostly through conversations with others, primarily our parents. These conversations teach us how to tell our own stories, essentially; when a mother asks her child for more details about something that happened that day in school, for example, she is implicitly communicating that these extra details are essential parts to the story. © 2014 The Slate Group LLC
2014 by Helen Thomson Shall I compare thee to... well, no one actually. A 76-year-old woman has developed an incredibly rare disorder – she has the compulsive urge to write poetry. Her brain is now being studied by scientists who want to understand more about the neurological basis for creativity. In 2013, the woman arrived at a UK hospital complaining of memory problems and a tendency to lose her way in familiar locations. For the previous two years, she had experienced occasional seizures. She was diagnosed with temporal lobe epilepsy and treated with the drug lamotrigine, which stopped her seizures. However, as they receded, a strange behaviour took hold. She began to compulsively write poetry – something she hadn't shown any interest in previously. Suddenly, the woman was writing 10 to 15 poems a day, becoming annoyed if she was disrupted. Her work rhymed but the content was banal if a touch wistful – a style her husband described as doggerel (see "Unstoppable creativity"). About six months after her seizures stopped, the desire to write became less strong, although it still persists to some extent. Doctors call the intense desire to write hypergraphia. It typically occurs alongside schizophrenia and an individual's output is usually rambling and disorganised. "It was highly unusual to see such highly structured and creative hypergraphia without any of the other behavioural disturbances," says the woman's neurologist, Jason Warren at University College London. © Copyright Reed Business Information Ltd
By CLYDE HABERMAN When it came to pharmacological solutions to life’s despairs, Aldous Huxley was ahead of the curve. In Huxley’s 1932 novel about a dystopian future, the Alphas, Betas and others populating his “Brave New World” have at their disposal a drug called soma. A little bit of it chases the blues away: “A gramme” — Huxley was English, remember, spelling included — “is better than a damn.” With a swallow, negative feelings are dispelled. Prozac, the subject of this week’s video documentary from Retro Report, is hardly soma. But its guiding spirit is not dissimilar: A few milligrams of this drug are preferable to the many damns that lie at the core of some people’s lives. Looking back at Prozac’s introduction by Eli Lilly and Company in 1988, and hopscotching to today, the documentary explores the enormous influence, both chemical and cultural, that Prozac and its brethren have had in treating depression, a concern that gained new resonance with the recent suicide of the comedian Robin Williams. In the late 1980s and the 90s, Prozac was widely viewed as a miracle pill, a life preserver thrown to those who felt themselves drowning in the high waters of mental anguish. It was the star in a class of new pharmaceuticals known as S.S.R.I.s — selective serotonin reuptake inhibitors. Underlying their use is a belief that depression is caused by a shortage of the neurotransmitter serotonin. Pump up the levels of this brain chemical and, voilà, the mood lifts. Indeed, millions have embraced Prozac, and swear by it. Depression left them emotionally paralyzed, they say. Now, for the first time in years, they think clearly and can embrace life. Pharmacological merits aside, the green-and-cream pill was also a marvel of commercial branding, down to its market-tested name. Its chemical name is fluoxetine hydrochloride, not the most felicitous of terms. A company called Interbrand went to work for Eli Lilly and came up with Prozac. “Pro” sounds positive. 
Professional, too. “Ac”? That could signify action. As for the Z, it suggests a certain strength, perhaps with a faint high-techy quality. © 2014 The New York Times Company
Link ID: 20098 - Posted: 09.22.2014
By Maria Konnikova At the turn of the twentieth century, Ivan Pavlov conducted the experiments that turned his last name into an adjective. By playing a sound just before he presented dogs with a snack, he taught them to salivate upon hearing the tone alone, even when no food was offered. That type of learning is now called classical—or Pavlovian—conditioning. Less well known is an experiment that Pavlov was conducting at around the same time: when some unfortunate canines heard the same sound, they were given acid. Just as their luckier counterparts had learned to salivate at the noise, these animals would respond by doing everything in their power to get the imagined acid out of their mouths, each “shaking its head violently, opening its mouth and making movements with its tongue.” For many years, Pavlov’s classical conditioning findings overshadowed the darker version of the same discovery, but, in the nineteen-eighties, the New York University neuroscientist Joseph LeDoux revived the technique to study the fear reflex in rats. LeDoux first taught the rats to associate a certain tone with an electric shock so that they froze upon hearing the tone alone. In essence, the rat had formed a new memory—that the tone signifies pain. He then blunted that memory by playing the tone repeatedly without following it with a shock. After multiple shock-less tones, the animals ceased to be afraid. Now a new generation of researchers is trying to figure out the next logical step: re-creating the same effects within the brain, without deploying a single tone or shock. Is memory formation now understood well enough that memories can be implanted and then removed absent the environmental stimulus?