Most Recent Links



Links 61 - 80 of 19768

by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras misidentify only the people they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.

Keyword: Attention; Aggression
Link ID: 20284 - Posted: 11.06.2014

By NICK BILTON Ebola sounds like the stuff of nightmares. Bird flu and SARS also send shivers down my spine. But I’ll tell you what scares me most: artificial intelligence. The first three, with enough resources, humans can stop. The last, which humans are creating, could soon become unstoppable. Before we get into what could possibly go wrong, let me first explain what artificial intelligence is. Actually, skip that. I’ll let someone else explain it: Grab an iPhone and ask Siri about the weather or stocks. Or tell her “I’m drunk.” Her answers are artificially intelligent. Right now these artificially intelligent machines are pretty cute and innocent, but as they are given more power in society, it may not take long for them to spiral out of control. In the beginning, the glitches will be small but eventful. Maybe a rogue computer momentarily derails the stock market, causing billions in damage. Or a driverless car freezes on the highway because a software update goes awry. But the upheavals can escalate quickly and become scarier and even cataclysmic. Imagine how a medical robot, originally programmed to eradicate cancer, could conclude that the best way to do so is to exterminate humans who are genetically prone to the disease. Nick Bostrom, author of the book “Superintelligence,” lays out a number of petrifying doomsday settings. One envisions self-replicating nanobots, which are microscopic robots designed to make copies of themselves. In a positive situation, these bots could fight diseases in the human body or eat radioactive material on the planet. But, Mr. Bostrom says, a “person of malicious intent in possession of this technology might cause the extinction of intelligent life on Earth.” © 2014 The New York Times Company

Keyword: Robotics; Aggression
Link ID: 20283 - Posted: 11.06.2014

By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did just that, publishing The Neuro Revolution: How Brain Science is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research, but that which exists (such as this, on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, Myth #20 in my new book Great Myths of the Brain is “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.

Keyword: Attention
Link ID: 20282 - Posted: 11.06.2014

By James Gallagher Health editor, BBC News website Working antisocial hours can prematurely age the brain and dull intellectual ability, scientists warn. Their study, in the journal Occupational and Environmental Medicine, suggested a decade of shifts aged the brain by more than six years. There was some recovery after people stopped working antisocial shifts, but it took five years to return to normal. Experts say the findings could be important in dementia, as many patients have disrupted sleep. The body's internal clock is designed for us to be active in the day and asleep at night. The damaging effects on the body of working against the body clock, from breast cancer to obesity, are well known. Now a team at the University of Swansea and the University of Toulouse has shown an impact on the mind as well. Three thousand people in France performed tests of memory, speed of thought and wider cognitive ability. The brain naturally declines as we age, but the researchers said working antisocial shifts accelerated the process. Those with more than 10 years of shift work under their belts had the same results as someone six and a half years older. The good news is that when people in the study quit shift work, their brains did recover, even if it took five years. Dr Philip Tucker, part of the research team in Swansea, told the BBC: "It was quite a substantial decline in brain function. It is likely that when people are trying to undertake complex cognitive tasks, they might make more mistakes and slip-ups. Maybe one in 100 makes a mistake with a very large consequence, but it's hard to say how big a difference it would make in day-to-day life." BBC © 2014

Keyword: Biological Rhythms; Aggression
Link ID: 20281 - Posted: 11.05.2014

Kate Baggaley Much of the increase in autism diagnoses in recent decades may be tied to changes in how the condition is reported. Sixty percent of the increase in autism cases in Denmark can be explained by these changes, scientists report November 3 in JAMA Pediatrics. The researchers followed all 677,915 people born in Denmark from 1980 through 1991, monitoring them from birth through the end of 2011. Among children born in this period, diagnoses increased fivefold, until 1 percent of children born in the early 1990s were diagnosed with autism by age 20. During these decades, Denmark experienced two changes in the way autism is reported. In 1994, the criteria physicians rely on to diagnose autism were updated in both the International Classification of Diseases manual used by Denmark and in its American counterpart, the Diagnostic and Statistical Manual of Mental Disorders. Then in 1995, the Danish Psychiatric Register began reporting diagnoses where doctors had only outpatient contact with children, in addition to cases where autism was diagnosed after children had been kept overnight. The researchers estimated Danish children’s likelihood of being diagnosed with autism before and after the two reporting changes. These changes accounted for 60 percent of the increase in diagnoses. © Society for Science & the Public 2000 - 2014.

Keyword: Autism
Link ID: 20280 - Posted: 11.05.2014

By Amy Robinson Whether you’re walking, talking or contemplating the universe, a minimum of tens of billions of synapses are firing at any given second within your brain. “The weak link in understanding ourselves is really about understanding how our brains generate our minds and how our minds generate our selves,” says MIT neuroscientist Ed Boyden. One cubic millimeter in the brain contains over 100,000 neurons connected through a billion synapses computing on a millisecond timescale. To understand how information flows within these circuits, we first need a “brain parts” list of neurons and glia. But such a list is not enough. We’ll also need to chart how cells are connected and to monitor their activity over time both electrically and chemically. Researchers can do this at small scale thanks to a technology developed in the 1970s called patch clamping. Bringing a tiny glass needle very near to a neuron living within a brain allows researchers to perform microsurgery on single neurons, piercing the cell membrane to do things like record the millivolt electrical impulses flowing through it. Patch clamping also facilitates measurement of proteins contained within the cell, revealing characteristic molecules and contributing to our understanding of why one neuron may behave differently than another. Neuroscientists can even inject glowing dyes in order to see the shape of cells. Patch clamping is a technique that has been used in neuroscience for 40 years. Why now does it make an appearance as a novel neuroscience technology? In a word: robots. © 2014 Scientific American

Keyword: Brain imaging
Link ID: 20279 - Posted: 11.05.2014

By Kelly Servick Using data from old clinical trials, two groups of researchers have found a better way to predict how amyotrophic lateral sclerosis (ALS) progresses in different patients. The winning algorithms—designed by non-ALS experts—outperformed the judgments of a group of ALS clinicians given the same data. The advances could make it easier to test whether new drugs can slow the fatal neurodegenerative disease. The new work was inspired by the so-called ALS Prediction Prize, a joint effort by the ALS-focused nonprofit Prize4Life and Dialogue for Reverse Engineering Assessments and Methods, a computational biology project whose sponsors include IBM, Columbia University, and the New York Academy of Sciences. Announced in 2012, the $50,000 award was designed to bring in experts from outside the ALS field to tackle the notoriously unpredictable illness. Liuxia Wang, a data analyst at the marketing company Sentrana in Washington, D.C., was used to helping companies make business decisions based on big data sets, such as information about consumer choices, but says she “didn’t know too much about this life science thing” until she got an unusual query from a client. One of the senior managers she worked with revealed that her son had just been diagnosed with ALS and wondered if Sentrana’s analytics could apply to patient data, too. When Wang set out to investigate, she found the ALS Prediction Prize. The next step, she said, was to learn something about ALS. The disease destroys the neurons that control muscle movement, causing gradual paralysis and eventually killing about half of patients within 3 years of diagnosis. But the speed of its progression varies widely. About 10% of patients live a decade or more after being diagnosed. That makes it hard for doctors to answer patients’ questions about the future, and it’s a big problem for testing new ALS treatments. © 2014 American Association for the Advancement of Science.

Keyword: ALS-Lou Gehrig's Disease
Link ID: 20278 - Posted: 11.04.2014

James Gorman Here is something to keep arachnophobes up at night. The inside of a spider is under pressure, like the air in a balloon, because spiders move by pushing fluid through valves. They are hydraulic. This works well for the spiders, but less so for those who want to study what goes on in the brain of a jumping spider, an aristocrat of arachnids that, according to Ronald R. Hoy, a professor of neurobiology and behavior at Cornell University, is one of the smartest of all invertebrates. If you insert an electrode into the spider’s brain, what’s inside might squirt out, and while that is not the kind of thing that most people want to think about, it is something that the researchers at Cornell had to consider. Dr. Hoy and his colleagues wanted to study jumping spiders because they are very different from most of their kind. They do not wait in a sticky web for lunch to fall into a trap. They search out prey, stalk it and pounce. “They’ve essentially become cats,” Dr. Hoy said. And they do all this with a brain the size of a poppy seed and a visual system that is completely different from that of a mammal: two big eyes dedicated to high-resolution vision and six smaller eyes that pick up motion. Dr. Hoy gathered four graduate students in various disciplines to solve the problem of recording activity in a jumping spider’s brain when it spots something interesting — a feat nobody had accomplished before. In the end, they not only managed to record from the brain, but discovered that one neuron seemed to be integrating the information from the spider’s two independent sets of eyes, a computation that might be expected to involve a network of brain cells. © 2014 The New York Times Company

Keyword: Brain imaging; Aggression
Link ID: 20277 - Posted: 11.04.2014

By Sandra Upson Jan Scheuermann is not your average experimental subject. Diagnosed with spinocerebellar degeneration, she is only able to move her head and neck. The paralysis, which began creeping over her muscles in 1996, has been devastating in many ways. Yet two years ago she seized an opportunity to turn her personal liability into an extraordinary asset for neuroscience. In 2012 Scheuermann elected to undergo brain surgery to implant two arrays of electrodes on her motor cortex, a band of tissue on the surface of the brain. She did so as a volunteer in a multi-year study at the University of Pittsburgh to develop a better brain-computer interface. When she visits the lab, researchers hook up her brain to a robotic arm and hand, which she practices moving using her thoughts alone. The goal is to eventually allow other paralyzed individuals to regain function by wiring up their brains directly to a computer or prosthetic limb. The electrodes in her head record the firing patterns of about 150 of her neurons. Specific patterns of neuronal activity encode her desire to perform different movements, such as swinging the arm to the left or clasping the fingers around a cup. Two thick cables relay the data from her neurons to a computer, where software can identify Scheuermann’s intentions. The computer can then issue appropriate commands to move the robotic limb. On a typical workday, Jan Scheuermann arrives at the university around 9:15 am. Using her chin, she maneuvers her electric wheelchair into a research lab headed by neuroscientist Andrew Schwartz and settles in for a day of work. Scientific American Mind spoke to Scheuermann to learn more about her experience as a self-proclaimed “guinea pig extraordinaire.” © 2014 Scientific American

Keyword: Robotics
Link ID: 20276 - Posted: 11.04.2014

By SINDYA N. BHANOO BERKELEY, CALIF. — Lilith Sadil, 12, climbs into an examination chair here at the Myopia Control Center at the University of California. “Do you know why you are here?” asks Dr. Maria Liu, an optometrist. “Because my eyes are changing fast,” Lilith says. “Do you read a lot?” Dr. Liu asks. “Yes.” “Do you use the computer a lot?” “Yes.” Lilith is an active child who practices taekwondo. But like an increasing number of children, she has myopia — she can see close up but not farther away. Her mother, Jinnie Sadil, has brought her to the center because she has heard about a new treatment that could help. Eye specialists are offering young patients special contact lenses worn overnight that correct vision for the next day. Myopia has become something of a minor epidemic: More than 40 percent of Americans are nearsighted, a 16 percent increase since the 1970s. People with so-called high myopia — generally, blurry vision beyond about five inches — face an increased likelihood of developing cataracts and glaucoma, and are at higher risk for retinal detachments that can result in blindness. Exactly what is causing the nationwide rise in nearsightedness is not known. “It can’t be entirely genetic, because genes don’t change that fast,” said Susan Vitale, an epidemiologist at the National Institutes of Health who studies myopia. “It’s probably something that’s environmental, or a combination of genetic and environmental factors.” Some research indicates that “near work” — reading, computer work, playing video games, and using tablets and smartphones — is contributing to the increase. A recent study found that the more educated a person is, the more likely he or she will be nearsighted. A number of other studies show that children who spend time outdoors are less likely to develop high myopia. But no one is certain whether the eye benefits from ultraviolet light or whether time outside simply means time away from near work. © 2014 The New York Times Company

Keyword: Vision; Aggression
Link ID: 20275 - Posted: 11.04.2014

by Catherine Brahic Once described as the finest sound in nature, the song of the North American hermit thrush has long captivated the human ear. For centuries, birdwatchers have compared it to human music – and it turns out they were on to something. The bird's song is beautifully described by the same maths that underlies human harmonies. To our ears, two notes usually sound harmonious together if they follow a set mathematical relationship. An octave is a doubling of frequencies. Tripling the frequency produces a perfect fifth above that octave, quadrupling is yet another octave, and quintupling produces a major third above that. These relationships define the most common major chords – the ones that, across human cultures, we tend to find most pleasant to listen to. Early studies sought to determine whether these mathematical relationships also governed the notes in bird song. Studies in the white-throated sparrow and the northern nightingale-wren failed to find the same musical intervals as those used in human music, and deemed birdsong to be something different entirely. Making tweet music The song of the hermit thrush challenges that conclusion. Tecumseh Fitch of the University of Vienna in Austria and colleagues analysed recordings taken in the wild of 70 full songs from this species. They isolated the frequencies corresponding to each note, and calculated the relationships between pitches appearing in each song. Lo and behold, the vast majority of songs used notes that fitted the same simple mathematical ratios as human harmony. What's more, Fitch says the thrush can produce other notes - meaning it must choose to use these harmonic chords. © Copyright Reed Business Information Ltd.
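The interval arithmetic described in the excerpt above – doubling for an octave (2:1), 3:2 for a perfect fifth, 5:4 for a major third – can be illustrated with a toy checker that folds any frequency ratio back into a single octave and compares it against simple harmonic ratios. This is only a sketch of the idea, not the analysis the Vienna team actually ran; the ratio table and the 1% tolerance are illustrative choices.

```python
from fractions import Fraction

# Simple just-intonation ratios named in the article.
SIMPLE_RATIOS = {
    Fraction(2, 1): "octave",
    Fraction(3, 2): "perfect fifth",
    Fraction(4, 3): "perfect fourth",
    Fraction(5, 4): "major third",
}

def fold_into_octave(ratio):
    """Remove whole octaves so the ratio lands in the range (1, 2]."""
    while ratio > 2:
        ratio /= 2
    return ratio

def classify_interval(f1, f2, tolerance=0.01):
    """Name the interval between two pitches (in Hz) if, after octave
    folding, it lies within `tolerance` of a simple harmonic ratio."""
    ratio = fold_into_octave(Fraction(max(f1, f2), min(f1, f2)))
    for simple, name in SIMPLE_RATIOS.items():
        if abs(float(ratio / simple) - 1) < tolerance:
            return name
    return None  # no simple-ratio match: not "harmonic" in this sense

# A tripled frequency folds down to a perfect fifth,
# a quintupled one to a major third:
print(classify_interval(1000, 3000))  # perfect fifth
print(classify_interval(1000, 5000))  # major third
```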

Keyword: Animal Communication; Aggression
Link ID: 20274 - Posted: 11.04.2014

Joan Raymond TODAY contributor It’s well established that baby talk plays a huge role in helping the wee widdle babies learn to tawk. And — no surprise — moms talk more to babies than dads do. But it seems that the baby's sex plays a role, too: Moms may be talking more to their infant daughters than their sons during the early weeks and months of a child’s life. In a new study published Monday in the online edition of Pediatrics, researchers looked at the language interactions between 33 late preterm and term infants and their parents by capturing 3,000 hours of recordings. Somewhat surprisingly, the researchers found that moms interacted vocally more with infant daughters than with sons both at birth and at 44 weeks post-menstrual age (equivalent to 1 month old). Male adults responded more frequently to infant boys than infant girls, but the difference did not reach statistical significance, say the researchers. “We wanted to look more at gender and factors that affect these essentially mini-conversations that parents have with infants,” says lead author and neonatologist Dr. Betty Vohr, director of the Neonatal Follow-Up Program at Women & Infants Hospital of Rhode Island. “Infants are primed to vocalize and have reciprocal interactions.”

Keyword: Sexual Behavior; Aggression
Link ID: 20273 - Posted: 11.04.2014

By RICHARD A. FRIEDMAN ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason. Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture. From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting. One of my patients, a young woman in her early 20s, is prototypical. “I’ve been on Adderall for years to help me focus,” she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. “No problem. I can paint for hours at a stretch.” Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, “Pay attention, this is an important experience that is worth remembering.” © 2014 The New York Times Company

Keyword: ADHD; Aggression
Link ID: 20272 - Posted: 11.03.2014

Maanvi Singh How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion. "I asked them: 'So what's moving? And why?' " Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions." Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many others teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information. © 2014 NPR

Keyword: Learning & Memory; Aggression
Link ID: 20271 - Posted: 11.03.2014

Colin Barras It's the sweetest relief… until it's not. Scratching an itch only gives temporary respite before making it worse – we now know why. Millions of people experience chronic itching at some point, as a result of conditions ranging from eczema to kidney failure to cancer. The condition can have a serious impact on quality of life. On the face of it, the body appears to have a coping mechanism: scratching an itch until it hurts can bring instant relief. But when the pain wears off the itch is often more unbearable than before – which means we scratch even harder, sometimes to the point of causing painful skin damage. "People keep scratching even though they might end up bleeding," says Zhou-Feng Chen at the Washington University School of Medicine in St Louis, Missouri, who has now worked out why this happens. His team's work in mice suggests it comes down to an unfortunate bit of neural crosstalk. We know that the neurotransmitter serotonin helps control pain, and that pain – from the heavy scratching – helps soothe an itch, so Chen's team set out to explore whether serotonin is also involved in the itching process. They began by genetically engineering mice to produce no serotonin. Normally, mice injected with a chemical that irritates their skin will scratch up a storm, but the engineered mice seemed to have almost no urge to scratch. Genetically normal mice given a treatment to prevent serotonin leaving the brain also avoided scratching after being injected with the chemical, indicating that the urge to scratch begins when serotonin from the brain reaches the irritated spot. © Copyright Reed Business Information Ltd.

Keyword: Pain & Touch
Link ID: 20270 - Posted: 11.03.2014

By James Gallagher Health editor, BBC News website Weight loss surgery can dramatically reduce the odds of developing type 2 diabetes, according to a major study. Doctors followed nearly 5,000 people as part of a trial to assess the health impact of the procedure. The results, published in the Lancet Diabetes and Endocrinology journal, showed an 80% reduction in type 2 diabetes in those having surgery. The UK NHS is considering offering the procedure to tens of thousands of people to prevent diabetes. Obesity and type 2 diabetes are closely tied - the bigger someone is, the greater the risk of the condition. The inability to control blood sugar levels can result in blindness, amputations and nerve damage. Around a tenth of NHS budgets are spent on managing the condition. The study followed 2,167 obese adults who had weight loss - known as bariatric - surgery. They were compared with 2,167 fellow obese people who continued as they were. There were 38 cases of diabetes after surgery compared with 177 in people left as they were - a reduction of nearly 80%. Around 3% of morbidly obese people develop type 2 each year; however, surgery reduced the figure to around 0.5%, which is the background figure for the whole population. Bariatric surgery, also known as weight loss surgery, is used as a last resort to treat people who are dangerously obese and carrying an excessive amount of body fat. This type of surgery is available on the NHS only to treat people with potentially life-threatening obesity when other treatments have not worked. Around 8,000 people a year currently receive the treatment. The two most common types of weight loss surgery are the gastric band, where a band is used to reduce the size of the stomach so a smaller amount of food is required to make someone feel full, and the gastric bypass, where the digestive system is re-routed past most of the stomach so less food is digested to make someone feel full. BBC © 2014
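The headline figure can be reproduced from the numbers quoted in the excerpt above: 38 diabetes cases among 2,167 surgery patients versus 177 among 2,167 matched controls. A quick back-of-the-envelope check (not part of the study's own statistics):

```python
# Reproducing the article's arithmetic with the numbers quoted in the text.
surgery_cases, surgery_total = 38, 2167
control_cases, control_total = 177, 2167

surgery_risk = surgery_cases / surgery_total   # about 1.75% over the study
control_risk = control_cases / control_total   # about 8.17% over the study

relative_reduction = 1 - surgery_risk / control_risk
print(f"relative reduction: {relative_reduction:.0%}")  # prints 79%, i.e. "nearly 80%"
```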

Keyword: Obesity
Link ID: 20269 - Posted: 11.03.2014

by Aviva Rutkin IF DINNER is missing some zing, a spoon studded with electrodes could help. It creates tastes on your tongue with a pulse of electricity. The utensil may add some extra flavour for people who shouldn't eat certain foods. Different frequencies and magnitudes of current through the electrodes can create the impression of saltiness, sourness or bitterness. The spoon was developed by Nimesha Ranasinghe and his team at New York University Abu Dhabi in the United Arab Emirates, who have also developed a water bottle with similar hardware on the mouthpiece. Both devices use various coloured lights, like blue for salty, in an attempt to augment the perceived intensity of the flavour. "Taste is not only taste. It's a multisensory sensation, so we need smell, colour, previous experiences, texture," says Ranasinghe. "I am trying to integrate different aspects of these sensations." By boosting the flavour of plain foods, he says a tool like this could be useful for people with diabetes or heart issues who have been ordered to cut down on salt and sugar. To see how well the electric utensils could fool diners, 30 people tried them out in a taste test with plain water and porridge. The spoon and bottle were judged 40 to 83 per cent successful at recreating the tastes, depending on which one they were aiming for. Bitter was the hardest sensation to get right. Some testers also said they were distracted by the metallic taste of the electrodes – a pitfall the researchers will work on next. © Copyright Reed Business Information Ltd.

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20268 - Posted: 11.03.2014

by Helen Thomson As you read this, your neurons are firing – that brain activity can now be decoded to reveal the silent words in your head TALKING to yourself used to be a strictly private pastime. That's no longer the case – researchers have eavesdropped on our internal monologue for the first time. The achievement is a step towards helping people who cannot physically speak communicate with the outside world. "If you're reading text in a newspaper or a book, you hear a voice in your own head," says Brian Pasley at the University of California, Berkeley. "We're trying to decode the brain activity related to that voice to create a medical prosthesis that can allow someone who is paralysed or locked in to speak." When you hear someone speak, sound waves activate sensory neurons in your inner ear. These neurons pass information to areas of the brain where different aspects of the sound are extracted and interpreted as words. In a previous study, Pasley and his colleagues recorded brain activity in people who already had electrodes implanted in their brain to treat epilepsy, while they listened to speech. The team found that certain neurons in the brain's temporal lobe were only active in response to certain aspects of sound, such as a specific frequency. One set of neurons might only react to sound waves that had a frequency of 1000 hertz, for example, while another set only cares about those at 2000 hertz. Armed with this knowledge, the team built an algorithm that could decode the words heard based on neural activity alone (PLoS Biology, doi.org/fzv269). © Copyright Reed Business Information Ltd.
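The decoding idea the excerpt describes – neurons that each respond to particular sound frequencies, plus an algorithm that works backwards from their activity to the sound – can be illustrated with a toy model. This is a hypothetical sketch, not Pasley's actual reconstruction algorithm: the Gaussian tuning curve, the four frequencies and the template-matching decoder are all invented for illustration.

```python
import math
import random

# Toy model: four neurons, each tuned to a preferred sound frequency (Hz).
FREQUENCIES = [500, 1000, 2000, 4000]  # hypothetical tuning preferences

def tuning(preferred, stimulus):
    """Firing rate of one neuron: highest when the stimulus is at its
    preferred frequency, falling off with distance measured in octaves."""
    octaves_away = abs(math.log2(stimulus / preferred))
    return math.exp(-octaves_away ** 2)

def population_response(stimulus, noise=0.05):
    """Noisy activity of the whole model population for one heard tone."""
    return [tuning(f, stimulus) + random.gauss(0, noise) for f in FREQUENCIES]

def decode(response):
    """Infer the heard tone: pick the candidate frequency whose ideal
    (noise-free) response pattern best matches the observed activity."""
    def mismatch(candidate):
        ideal = [tuning(f, candidate) for f in FREQUENCIES]
        return sum((o - i) ** 2 for o, i in zip(response, ideal))
    return min(FREQUENCIES, key=mismatch)

random.seed(0)
print(decode(population_response(2000)))  # recovers the 2000 Hz tone
```

The real study worked with continuous speech and far richer models, but the principle is the same: once you know what each neuron is tuned to, its activity becomes evidence about what was heard.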

Keyword: Language; Aggression
Link ID: 20267 - Posted: 11.01.2014

BY Laura Sanders The first time Nathan Whitmore zapped his brain, he had a college friend standing by, ready to pull the cord in case he had a seizure. That didn’t happen. Instead, Whitmore started experimenting with the surges of electricity, and he liked the effects. Since that first cautious attempt, he’s become a frequent user of, and advocate for, homemade brain stimulators. Depending on where he puts the electrodes, Whitmore says, he has expanded his memory, improved his math skills and solved previously intractable problems. The 22-year-old, a researcher in a National Institute on Aging neuroscience lab in Baltimore, writes computer programs in his spare time. When he attaches an electrode to a spot on his forehead, his brain goes into a “flow state,” he says, where tricky coding solutions appear effortlessly. “It’s like the computer is programming itself.” Whitmore no longer asks a friend to keep him company while he plugs in, but he is far from alone. The movement to use electricity to change the brain, while still relatively fringe, appears to be growing, as evidenced by a steady increase in active participants in an online brain-hacking message board that Whitmore moderates. This do-it-yourself community, some of whom make their own devices, includes people who want to get better test scores or crush the competition in video games as well as people struggling with depression and chronic pain, Whitmore says. As reckless as it sounds to juice a brain at home with a 9-volt battery and 40 dollars’ worth of spare parts, this technology’s buzz is based on legit science. Small laboratory studies suggest that carefully controlled brain stimulation can boost a person’s memory and math abilities, hone attention and fast-track learning. The U.S. military is interested and is funding studies to test brain stimulation as a way to boost soldiers’ alertness and vigilance. The technique may even be a viable treatment for pernicious mental disorders such as major depression, according to other laboratory-based studies. © Society for Science & the Public 2000 - 2014.

Keyword: Depression; Aggression
Link ID: 20266 - Posted: 11.01.2014

by Helen Thomson Scared of the dark? Terrified of heights? Spiders make you scream? For the first time, a person's lifelong phobia has been completely abolished overnight. Unfortunately, it required removing a tiny bit of the man's brain, so for now, most people will have to find another way to dispel their fears. The phobia was abolished by accident. A 44-year-old businessman started having seizures out of the blue. Brain scans showed he had an abnormality in his left amygdala – an area in the temporal lobe involved in emotional reactions, among other things. Further tests showed the cause was sarcoidosis, a rare condition that causes damage to the lungs, skin and, occasionally, the brain. Doctors decided it was necessary to remove the man's damaged left amygdala. The surgery went well, but soon after the man noticed a strange turn of events. Not only did he have a peculiar "stomach-lurching" aversion to music – which was particularly noticeable when he heard the song accompanying a certain TV advert – but he also discovered he was no longer afraid of spiders. While his aversion to music waned over time, his arachnophobia never returned. Before the surgery he would throw tennis balls at spiders, or use hairspray to immobilise them before vacuuming them up. Now he is able to touch and observe the little critters at close distance and says he actually finds them fascinating. He hasn't noticed any changes to other kinds of fears or anxieties. For example, he is equally as anxious about public speaking now as he was prior to surgery. © Copyright Reed Business Information Ltd.

Keyword: Emotions
Link ID: 20265 - Posted: 11.01.2014