Most Recent Links
By THE NEW YORK TIMES Q. What are the risks of long-term use of prescription drugs to combat insomnia? —Claudia, New York, N.Y. A. Dr. Thorpy and Dr. Harris respond: In our sleep center, we use several types of prescription medications to treat insomnia. The most well known are drugs like Ambien (zolpidem), Lunesta (eszopiclone) and Sonata (zaleplon) — sometimes referred to as the “Z-drugs” or hypnotics. They all affect a brain structure called the GABA receptor, which is widely found throughout the brain and has many functions; their main effect is to dampen arousal, thereby allowing sleep to occur. There are also non-GABA sleep medications like Rozerem (ramelteon), which reduces arousal by affecting the receptor for the hormone melatonin. Before using any of these medications, it is vital to understand the underlying cause of the insomnia, since other treatments may be more appropriate. If depression is the cause of poor sleep, for example, antidepressants or lifestyle changes may be the most appropriate course. Insomnia has now been shown to be associated with a range of underlying medical and psychological disorders, and it is therefore an important condition to treat. We usually use behavioral and medication therapy, either alone or together, to get the best response. The aim is to have the patient come off the sleep medication completely when the insomnia resolves. Some people, though, have chronic insomnia that is associated with a medical or psychiatric disorder and need to continue medication very long-term. They can do so without any untoward effects. Copyright 2010 The New York Times Company
Keyword: Sleep
Link ID: 14267 - Posted: 07.17.2010
By GINA KOLATA Marilyn Maldonado is not quite sure why she is at the Memory Enhancement Center in the seaside town of Oakhurst, N.J. “What are we waiting for?” she asks. About 10 minutes later, she asks again. Then she asks again. She is waiting to enter a new type of Alzheimer’s drug study that will, in the boldest effort yet, test the leading hypothesis about how to slow or stop this terrifying brain disease. The disease is defined by freckles of barnacle-like piles of a protein fragment, amyloid beta, in the brain. So, the current thinking goes, if you block amyloid formation or get rid of amyloid accumulations — plaque — and if you start treatment before the disease is well under way, you might have a chance to alter its course. On Tuesday, that plan got a new push. The National Institute on Aging and the Alzheimer’s Association proposed new guidelines for diagnosis to find signs of Alzheimer’s in people who do not yet have severe symptoms, or even any symptoms at all. The guidelines are needed for the new approach to Alzheimer’s drug development. Just about every pharmaceutical company and many biotechnology companies have experimental drugs to block amyloid — there are more than 100 in the pipeline. And the companies would like to show that if they give their drugs early, they can slow or stop the disease. Copyright 2010 The New York Times Company
Keyword: Alzheimers
Link ID: 14266 - Posted: 07.17.2010
By Bruce Bower Here’s some not-so-sobering news for party people, barhoppers and clubgoers. Individuals who inherit a particular gene variant that tweaks the brain’s reward system are especially likely to drink a lot of alcohol in the company of heavy-boozing peers. That’s the preliminary indication of a new study directed by psychology graduate student Helle Larsen of Radboud University Nijmegen in the Netherlands. Adults carrying at least one copy of a long version of the dopamine D4 receptor gene, dubbed DRD4, imbibed substantially more alcohol around a heavy-drinking peer than did others who lacked that gene variant, Larsen’s group reports in a paper published online July 7 in Psychological Science. “Carriers of the long gene may be more attuned to, and influenced by, another person’s heavy drinking than noncarriers are,” Larsen says. Her study provides the first evidence that a gene influences human alcohol use in social situations. Scientists have yet to decipher the precise brain effects of DRD4’s long form. Larsen hypothesizes that in the presence of heavy drinkers, the gene variant may increase dopamine activity in brain areas that amplify alcohol’s appeal as a rewarding social activity. © Society for Science & the Public 2000 - 2010
Keyword: Drug Abuse; Genes & Behavior
Link ID: 14265 - Posted: 07.17.2010
Miriam Frankel A type of brain cell thought to be responsible for supporting other cells may have a previously unsuspected role in controlling breathing. Star-shaped cells called astrocytes, found in the brain and spinal cord, can 'sense' changes in the concentration of carbon dioxide in the blood and stimulate neurons to regulate respiration, according to a study published online in Science today. The research may shed some light on the role of astrocytes in certain respiratory illnesses, such as cot death, which are not well understood. Astrocytes are a type of glial cell — the most common type of brain cell, and far more abundant than neurons. "Historically, glial cells were only thought to 'glue' the brain together, providing neuronal structure and nutritional support but not more," explains physiologist Alexander Gourine of University College London, one of the authors of the study. "This old dogma is now changing dramatically; a few recent studies have shown that astrocytes can actually help neurons to process information." "The most important aspect of this study is that it will significantly change ideas about how breathing is controlled," says David Attwell, a neuroscientist at University College London, who was not involved in the study. During exercise, the amount of CO2 in the blood increases, making the blood more acidic. Until now, it was thought that this pH change was 'sensed' by specialized neurons that signal to the lungs to expel more CO2. But the study found that astrocytes can sense such a decrease in pH too — a change that causes an increase in the concentration of calcium ions (Ca2+) in the cells and the release of the chemical messenger adenosine-5'-triphosphate (ATP). © 2010 Nature Publishing Group
by Dolly Krishnaswamy Under the vivid green canopies of the tropics, a young gorilla sneaks up behind another, yanks its hair, and dashes away with a toothy grin on its face. It may seem like harmless fun, but this game of tag has profound implications. In a new study, researchers say the behavior indicates that gorillas know the limits of their social status—and that they play tag to help even the score. Other studies have shown that nonhumans can sense unfairness. In 2005, for example, a group led by psychologist Sarah Brosnan of Yerkes National Primate Research Center in Atlanta reported that capuchin monkeys refused to exchange tokens with an experimenter for a cucumber if they saw a fellow monkey receiving a more desirable grape for its "money." But do animals also sense unfairness in more natural settings? To find out, behavioral biologist Marina Davila Ross of the University of Portsmouth in Hampshire, U.K., and colleagues watched videos collected over 3 years of gorillas at various zoos and reserves in Germany and Switzerland. Almost every afternoon, a couple of the gorillas would begin wrestling with each other. In some instances, gorillas hit their playmates and ran away. Most gorillas seemed to play this game of tag, though older mothers observed from the sidelines. Ross's team noticed a pattern in the play: Gorillas lower on the social ladder were usually the taggers. These gorillas were also twice as likely to instigate another round of the game, and they frequently bared their teeth—a possible indication that they were willing to bite the other gorilla. © 2010 American Association for the Advancement of Science.
Keyword: Development of the Brain; Evolution
Link ID: 14263 - Posted: 07.15.2010
Lucas Laursen The rust-coloured plateau above Mecca in Saudi Arabia may soon attract pilgrims of palaeontology. The hills, which overlook the Red Sea, have disgorged the 29–28-million-year-old partial skull fossil of an early primate that possesses features both of apes and monkeys. The skull could help palaeontologists to answer questions about the life of primates in a period that until now has provided few fossils. When he caught sight of the skull during an expedition in search of ancient whale fossils last year, Iyad Zalmout wondered whether it belonged to a monkey or an ape. "It turns out it's not an ape, it's not a monkey, it's something intermediate," says Zalmout, a palaeontologist at the University of Michigan in Ann Arbor, and an author of a paper published in Nature today. The primate, dubbed Saadanius hijazensis, shares characteristics with Propliopithecoidea, an ancestor of apes and monkeys which existed more than 30 million years ago, as well as with more recent primates found to have lived from 23 million years ago. Saadanius lacks the advanced sinuses of the modern apes and monkeys that are collectively called catarrhines, but has a bony ear tube that was not yet fully developed in the Propliopithecoidea. "This fossil is really key because it has that bony tube," says Erik Seiffert, an anatomist at Stony Brook University in New York. Comparison of the tube and other features, such as the teeth and the position of the eye sockets on the partial skull, with those of other primates could help palaeontologists to reconstruct the branches of the catarrhine family tree, between 30 and about 23 million years ago, says Seiffert. © 2010 Nature Publishing Group
Keyword: Evolution
Link ID: 14262 - Posted: 07.15.2010
By Janet Raloff Being fat may diminish mental performance, studies find — a problem that worsens with age. But among elderly women, where fat is deposited may matter. To wit: The big apple is sharper than the obese pear. Genetics dictates where people preferentially accumulate body fat. For most it’s around the belly. Among the obese, these apple-shaped individuals tend to run a bigger risk of developing heart disease than do pears — people who deposit most of their excess fat at the hips and thighs. For a host of reasons, physicians had expected that if body shape affected mental performance, apples would again prove the bigger losers. In fact, the opposite appears true, Diana Kerwin of Northwestern University’s Feinberg School of Medicine in Chicago and her colleagues report online July 14 in the Journal of the American Geriatrics Society. The team pored over data collected from more than 8,700 women, all 65 to 79 years old. These were a healthy subset of incoming participants to the Women’s Health Initiative study. This long-running trial at 40 medical centers across the country has been investigating the role of hormone-replacement therapy and diet on risk of heart disease, fractures and certain cancers. Each woman was administered a test of memory and reasoning known as the Modified Mini-Mental State Examination, or 3MSE. Kerwin’s team correlated a participant’s score with her shape and her height-adjusted weight — something known as body-mass index, or BMI. BMI values were divided into six categories, with 1 being lean and 6 morbidly obese. © Society for Science & the Public 2000 - 2010
Keyword: Obesity; Learning & Memory
Link ID: 14261 - Posted: 07.15.2010
Having low vitamin D levels may increase a person's risk of developing Parkinson's disease later in life, say Finnish researchers. Their study of 3,000 people, published in Archives of Neurology, found people with the lowest levels of the sunshine vitamin had a three-fold higher risk. Vitamin D could be helping to protect the nerve cells gradually lost by people with the disease, experts say. The charity Parkinson's UK said further research was required. Parkinson's disease affects several parts of the brain, leading to symptoms like tremor and slow movements. The researchers from Finland's National Institute for Health and Welfare measured vitamin D levels from the study group between 1978 and 1980, using blood samples. They then followed these people over 30 years to see whether they developed Parkinson's disease. They found that people with the lowest levels of vitamin D were three times more likely to develop Parkinson's, compared with the group with the highest levels of vitamin D. Most vitamin D is made by the body when the skin is exposed to sunlight, although some comes from foods like oily fish, milk or cereals. As people age, however, their skin becomes less able to produce vitamin D. Doctors have known for many years that vitamin D helps calcium uptake and bone formation. But research is now showing that it also plays a role in regulating the immune system, as well as in the development of the nervous system. © BBC
Keyword: Parkinsons
Link ID: 14260 - Posted: 07.13.2010
by MARILYNN MARCHIONE MILWAUKEE, Wisconsin — Scientists are reporting advances in detecting and predicting Alzheimer's disease at a conference in Honolulu this week, plus offering more proof that getting enough exercise and vitamin D may lower your risk. There are better brain scans to spot Alzheimer's disease. More genes that affect risk. Blood and spinal fluid tests that may help tell who will develop the mind-robbing illness and when. But what is needed most — a treatment that does more than just ease symptoms — is not at hand. "We don't have anything that slows or stops the course," said William Thies, the Alzheimer's Association scientific director. "We're really in a silent window right now" with new drugs, he said. Several promising ones flopped in late-stage tests — most recently, Pfizer Inc.'s Dimebon. Results on several others won't be ready until next year. Still, there is some progress against Alzheimer's, a dementia that afflicts more than 26 million people worldwide. Highlights of the research being reported this week: Prevention. Moderate to heavy exercisers had half the risk of developing dementia compared with less active people, researchers from the long-running Framingham Heart Study reported Sunday. Earlier studies also found exercise helps. Copyright 2010 The Associated Press.
Keyword: Alzheimers
Link ID: 14259 - Posted: 07.13.2010
Steve Connor Huntington’s disease is a relatively rare genetic disorder that you wouldn’t wish upon your worst enemy. If you carry a single copy of the affected gene you are destined to die a horrible death involving uncontrollable movements, psychiatric disturbances and progressive dementia. The first symptoms typically occur around the age of 40, and it takes between 10 and 15 more years for the gradual neurodegeneration to end life. Ten years after the excitement of mapping the human genome, and the revolution in the understanding of genetic disorders that the achievement has brought, it is easy to forget that some of those directly affected by inherited diseases have seen little in terms of practical benefit. The gene involved in Huntington’s disease was mapped to chromosome 4 in 1983 by a team led by Jim Gusella at Harvard Medical School in Boston, but it took another 10 years of intensive effort to isolate and clone the gene itself. This allowed scientists to find the type of changes, or mutations, that cause the disorder – the mutated gene has about two or three times the normal number of ‘CAG’ repeats. I remember on both occasions – in 1983 and 1993 – there were optimistic predictions that the discoveries would soon lead to a test for the carriers of the Huntington’s mutation and effective treatments – even possibly a cure – for the disease. The sad fact is that although a relatively cheap and accurate diagnostic test for the Huntington’s mutation has existed for some years, this medical advance has for the affected families arguably produced more misery than it has eradicated. For a start, there has been no accompanying revolution in treatment, largely because there are so few affected people (estimated to be about 12,000 in Britain) to make it worth the expense and effort of the drug companies to develop new therapies. © independent.co.uk
Keyword: Huntingtons; Genes & Behavior
Link ID: 14258 - Posted: 07.13.2010
By GINA KOLATA A small company with a new brain scan for detecting plaque, the hallmark physical sign of Alzheimer’s disease, presented its results on Sunday at an international conference in Hawaii, and experts who attended said the data persuaded them that the method works. Until now, the only definitive way to diagnose Alzheimer’s has been to search for plaque with a brain autopsy after the patient dies. Scientists hope the new scanning technique, described June 24 in The New York Times’s series “The Vanishing Mind,” will allow doctors to see plaque while the patient is still alive, improving diagnosis and aiding research on drugs to slow or stop plaque accumulation. Neurologists have known about plaques ever since Alzheimer’s disease was first described in 1906. They are microscopic bumps made up of a protein, amyloid beta, appearing on the surface of the brain in areas involved with learning and memory. They are so characteristic of Alzheimer’s that they are required for a definitive diagnosis of the disease. Of course, doctors do not wait for a brain autopsy to diagnose Alzheimer’s. They use memory tests and evaluations of patients’ reasoning and ability to care for themselves. Yet at autopsy, even doctors at leading medical centers have been found to be wrong as often as 20 percent of the time: people they said had Alzheimer’s did not have plaque. Copyright 2010 The New York Times Company
Keyword: Alzheimers; Brain imaging
Link ID: 14257 - Posted: 07.13.2010
By Melody Dye Subject 046M, for male, was seated nervously across from me at the table, his hands clasped tightly together in his lap. He appeared to have caught an incurable case of the squirms. I resisted the urge to laugh, and leaned forward, whispering conspiratorially. “Today, we’re going to play a game with Mr. Moo” —I produced an inviting plush cow from behind my back. “Can you say hi to Mr. Moo?” In the Stanford lab I work in with Professor Michael Ramscar, we study how children go about what is arguably the most vital project in their career as aspiring adults—learning language. Over the last several years, we’ve been particularly taken with the question of how kids learn a small, but telling piece of that vast complex: color words. We want to know how much they know, when they know it, and whether we can help them get there faster. 046M was off to a good start. I arranged three different color swatches in front of him. “Can you show me the red one?” He paused slightly, then pointed to the middle rectangle: red. “Very good!” I said, beaming. “Now, what about the one that’s blue?” The test was not designed to trip kids up. Far from it—we only tested basic color words, and we never made kids pick between confusable shades, like red and pink. To an adult, the test would be laughably easy. Yet, after several months of testing two-year-olds, I could count my high scorers on one hand. Most would fail the test outright. 046M, despite his promising start, proved no exception. © 2010 Scientific American,
Keyword: Development of the Brain; Vision
Link ID: 14256 - Posted: 07.13.2010
When a person loses his sense of smell, does he also lose any memory associated with a smell? —Ana Artega, via e-mail David Smith, a professor of psychology and a researcher at the Center for Smell and Taste at the University of Florida, replies: Normally people can detect a cacophony of odors using the 40 million olfactory receptor neurons that reside in the nasal cavity. When we encounter a new odor, these neurons send information about the whiff to a brain area called the olfactory cortex, leaving an imprint of the smell there. These memories accumulate over time to create a library of odors. Although we do not fully understand how the olfactory cortex encodes these memories, we do know that olfactory memories seem to be particularly rich—perhaps because the olfactory cortex is closely connected to the brain regions important for recollection. These areas include the amygdala, which processes emotions, and the hippocampus, which encodes and stores memories. Damage to the olfactory receptor neurons because of a respiratory infection, a head injury or a neurodegenerative disease can disrupt the brain’s ability to process different smells. When olfactory neurons stop working altogether, a person develops anosmia, or the inability to discern odors. According to a 2008 report from the National Institutes of Health, 1 to 2 percent of the U.S. population younger than 65 years old, and more than half older than 65, have almost completely lost their sense of smell. © 2010 Scientific American,
Keyword: Chemical Senses (Smell & Taste); Learning & Memory
Link ID: 14255 - Posted: 07.13.2010
By CLAUDIA DREIFUS Jeremy Niven spends his days at Cambridge University running locusts across ladders and through mazes, trying to figure out how bugs think. Dr. Niven, 34, studies the evolution of brains and neurons in insects and other animals, like humans. We spoke during a break in last month's World Science Festival in New York, where he was a guest presenter, and then again later via telephone. An edited version of the two conversations follows: Q. YOUR RESEARCH SUBJECTS ARE LOCUSTS. SOME PEOPLE MIGHT SAY, “LOCUSTS, YUCK!” WHY STUDY THEM? A. I think locusts are sweet. When you get used to them, they are actually quite nice. Actually, I find that working with invertebrates opens your mind. Insects don’t perceive the world the way we do. Trying to understand them makes you think more about why we see the world as we do. Many animals have different sensors and receive different energies. Birds have ultraviolet vision. So do bees. They can see things we don’t. One learns respect for their capacities. But the other thing is that insects in general and locusts in particular are admirable because they permit us to gain new information about nervous systems. With insects, we can actually study neural circuits and see how what happens in the neurons relates to behavior. Copyright 2010 The New York Times Company
Keyword: Learning & Memory; Evolution
Link ID: 14254 - Posted: 07.13.2010
By SINDYA N. BHANOO Handsome men may turn the heads of women, but for those less attractive, sociability and friendliness also seem to seduce the fairer sex. The same is true for male house finches, according to a new study. Female house finches prefer to mate with males with the reddest feathers, but dull-colored males make themselves more appealing by acting more social before mating season, according to a study in the September issue of the American Naturalist. The researchers found that the duller a male bird was in color, the more likely he was to engage with multiple social groups. Birds in a social group flock and forage together and any bird can belong to multiple groups. Drab-looking male finches drifted from group to group in the winter, the researchers found. By mating season in the spring, the less attractive males tended to have the same level of mating success as the most colorful, attractive males. “Females have limited options to choose from and this is a way for males to manipulate their chances to find mates, by placing themselves in certain settings,” said Kevin Oh, an evolutionary biologist at Cornell University and the study’s lead author. The least attractive, or most yellow, males were four times as likely to interact with multiple social groups than the most attractive, or reddest, males, Dr. Oh said. House finches are found across North America, but Dr. Oh and his co-author, Alexander Badyaev of the University of Arizona, studied wild populations in Arizona. Copyright 2010 The New York Times Company
Keyword: Sexual Behavior; Evolution
Link ID: 14253 - Posted: 07.13.2010
by Linda Geddes A form of synaesthesia in which people experience letters or numbers in colour may be trainable. The discovery could shed new light on how such traits develop. Synaesthesia is thought to have a genetic component, but some people have reported synaesthetic experiences following hypnosis, so Olympia Colizoli at the University of Amsterdam in the Netherlands, and colleagues, wondered if it might also be possible to acquire synaesthesia through training. To test the idea, they gave seven volunteers a novel to read in which certain letters were always written in red, green, blue or orange (see picture). Before and after reading the book, the volunteers took a "synaesthetic crowding" test, in which they identified the middle letter of a grid of black letters which were quickly flashed onto a screen. Synaesthetes perform better on the test when a letter they experience in colour is the target letter. The volunteers performed significantly better on this test after training compared with people who read the novel in black and white. The findings suggest that natural synaesthesia may develop as a result of childhood experiences as well as genetics, says Colizoli, who presented the findings at the Forum of European Neuroscience in Amsterdam last week. © Copyright Reed Business Information Ltd.
Keyword: Miscellaneous
Link ID: 14252 - Posted: 07.13.2010
By Tina Hesman Saey Aging and wisdom are supposed to go together, but it turns out that a molecule that prevents one may actually play a role in the other. Researchers have discovered a new role for the famous antiaging protein SIRT1. It not only fends off aging, but also aids in learning and memory, a new study published online July 11 in Nature shows. Sirtuins, a family of proteins that includes SIRT1, help to regulate gene activity and have been implicated in governing metabolism and many of the biological processes that lead to aging. In the new study, Li-Huei Tsai, a neuroscientist and Howard Hughes Medical Institute investigator at MIT, finds that SIRT1 also plays a critical role in protecting learning and memory, at least in mice. Tsai and her colleagues had an inkling that SIRT1 might play some role in the brain from earlier experiments showing that resveratrol, an activator of sirtuins, could help neurons survive a mouse version of Alzheimer’s disease. Resveratrol also improved the animals’ ability to learn and remember. Since resveratrol can act on all seven of the sirtuins found in mammals and also affects other biological processes (SN Online: 6/28/10), the researchers didn’t know what role, if any, SIRT1 played in the process. To find out, Tsai and her colleagues put mice genetically engineered to lack SIRT1 in their brains through a series of learning and memory tests. The mice had trouble remembering the location of a submerged platform in a water maze, couldn’t tell the difference between a new object and an old one placed in their cages, and did poorly on other memory tests. “The ability for these animals to learn is clearly impaired,” Tsai says. © Society for Science & the Public 2000 - 2010
Keyword: Learning & Memory; Alzheimers
Link ID: 14251 - Posted: 07.13.2010
Mice might turn up their noses at alcohol, but not the prairie vole. This usually upstanding rodent, famous for mating for life and sharing pup-raising duties, apparently likes a stiff drink. “They not only drink alcohol, they prefer it over water,” Allison Anacker, a neuroscience graduate student at Oregon Health & Science University told The Oregonian. Anacker, working under behavioral neuroscience professor Andrey Ryabinin, was looking for a model organism to study some humans’ troubled relationship with alcohol. Mice and rats fail in this role—it’s unusual to find ones that want even a sip of the stuff. In a study published in Addiction Biology last month, Ryabinin’s team records the drunken misadventures of prairie voles. After chugging their preferred 6 percent alcohol drink (about the equivalent of beer), some thirsty voles shoved off parental responsibilities and even walked out on their mates. Though some drank responsibly, others drank to excess, stumbling away from the bar/spiked water bottle. The study suggests that like humans, the voles also make drinking buddies, seemingly encouraging each other to have another. When caged together, the voles appear to match one another drink for drink, a practice that apparently has nothing to do with who’s buying the next round.
Keyword: Drug Abuse
Link ID: 14250 - Posted: 07.13.2010
By GINA KOLATA If you had to choose one public health problem to attack, which would it be: teenage smoking or childhood obesity? To answer that question, you might want to pose another. Who will have the harder road in life, or indeed the longer one: the teenage puffer or the chubby child? Pitting smoking against obesity is tricky because it can mean comparing apples and bonbons, but there is some suggestion that a kind of weird zero-sum game is actually going on. And some smoking opponents fear that a choice has been made — with obesity the winner, quite possibly for the wrong reasons. “Obesity is the new kid on the block, relatively speaking,” said Kenneth E. Warner, dean of the University of Michigan’s school of public health. “Tobacco is old news.” When it comes to smoking, said Stanton A. Glantz, director of the University of California at San Francisco’s Center for Tobacco Control Research and Education, “we really haven’t had anyone pushing it to the top of the agenda.” That is a problem. “It’s not that I am for obesity,” he said, but he finds it less than encouraging, for example, that the hugely influential Robert Wood Johnson Foundation is pulling back from its anti-smoking efforts while directing its money and resources to preventing childhood obesity. Then there is Michelle Obama’s campaign, Let’s Move, to prevent childhood obesity. And in May, the White House Task Force on Childhood Obesity announced its goal — reduce the rate of childhood obesity, now 17 percent, to 5 percent by 2030. Copyright 2010 The New York Times Company
Keyword: Obesity; Drug Abuse
Link ID: 14249 - Posted: 07.12.2010
by Helen Thomson I'VE just had a brainwave. Oh, and there's another. And another! In fact, you will have had thousands of them since you started reading this sentence. These waves of electricity flow around our brains every second of the day, allowing neurons to communicate while we walk, talk, think and feel. Exactly where brainwaves are generated in the brain, and how they communicate information, is something of a mystery. As we begin to answer these questions, surprising functions of these ripples of neural activity are emerging. It turns out they underpin almost everything going on in our minds, including memory, attention and even our intelligence. Perhaps most importantly, haphazard brainwaves may underlie the delusions experienced by people with schizophrenia, and researchers are investigating this possibility in the hope that it will lead to treatments for this devastating condition. So what exactly is a brainwave? Despite the way it is bandied about in everyday chit-chat, the term "brainwave" has a specific meaning in neuroscience, referring to rhythmic changes in the electrical activity of a group of neurons. Each neuron has a voltage, which can change when ions flow in or out of the cell. This is normally triggered by stimulation from another cell, and once a neuron's voltage has reached a certain point, it too will fire an electrical signal to other cells, repeating the process. When many neurons fire at the same time, we see these changes in the form of a wave, as groups of neurons are all excited, silent, then excited again, at the same time. © Copyright Reed Business Information Ltd.
Keyword: Miscellaneous
Link ID: 14248 - Posted: 07.12.2010