Chapter 16




By Erez Ribak and The Conversation UK The human eye is optimised to have good colour vision by day and high sensitivity at night. But until recently it seemed as if the cells in the retina were wired the wrong way round, with light travelling through a mass of neurons before it reaches the light-detecting rod and cone cells. New research presented at a meeting of the American Physical Society has uncovered a remarkable vision-enhancing function for this puzzling structure. About a century ago, the fine structure of the retina was discovered. The retina is the light-sensitive part of the eye, lining the inside of the eyeball. The back of the retina contains cones to sense the colours red, green and blue. Spread among the cones are rods, which are much more light-sensitive than cones, but which are colour-blind. Before arriving at the cones and rods, light must traverse the full thickness of the retina, with its layers of neurons and cell nuclei. These neurons process the image information and transmit it to the brain, but until recently it has not been clear why these cells lie in front of the cones and rods, not behind them. This is a long-standing puzzle, even more so since the same structure, of neurons before light detectors, exists in all vertebrates, showing evolutionary stability. Researchers in Leipzig found that glial cells, which also span the retinal depth and connect to the cones, have an interesting attribute. These cells are essential for metabolism, but they are also denser than other cells in the retina. In the transparent retina, this higher density (and corresponding refractive index) means that glial cells can guide light, just like fibre-optic cables. © 2015 Scientific American
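The fibre-optic comparison is ordinary waveguide physics: light launched into a region of higher refractive index is trapped by total internal reflection whenever it strikes the boundary beyond the critical angle. A minimal sketch of that calculation, with illustrative index values assumed for glial cells and surrounding tissue (the excerpt reports only that glial cells are denser, not specific numbers):

```python
import math

def critical_angle_deg(n_core: float, n_cladding: float) -> float:
    """Angle of incidence (degrees from the boundary normal) beyond which
    light is totally internally reflected and stays inside the 'core'."""
    return math.degrees(math.asin(n_cladding / n_core))

# Assumed, illustrative refractive indices -- not values from the study.
n_glial = 1.38    # denser glial cell
n_tissue = 1.34   # surrounding retinal tissue

print(f"critical angle ~ {critical_angle_deg(n_glial, n_tissue):.0f} degrees")
# Rays striking the cell boundary more obliquely than this are guided
# along the glial cell toward the cones, as in a fibre-optic cable.
```

Even a small index difference yields guiding; it only narrows the range of angles that stay trapped.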

Keyword: Vision; Glia
Link ID: 20701 - Posted: 03.19.2015

By PAM BELLUCK What happens to forgotten memories — old computer passwords, friends’ previous phone numbers? Scientists have long held two different theories. One is that memories do not diminish but simply get overshadowed by new memories. The other is that older memories become weaker, that pulling to mind new passwords or phone numbers degrades old recollections so they do not interfere. The difference could be significant. If old memories stay strong and are merely papered over by new ones, they may be easier to recover. That could be positive for someone trying to remember an acquaintance’s name, but difficult for someone trying to lessen memories of abuse. It could suggest different strategies for easing traumatic memories, evaluating witness testimony about crimes, or helping students study for tests. Now, a study claims to provide evidence of memory’s weakening by showing that people’s ability to remember something and the pattern of brain activity that thing generates both appear to diminish when a competing memory gets stronger. Demonstrating sophisticated use of brain scans in memory research, authors of the study, published Monday in the journal Nature Neuroscience, appear to have identified neural fingerprints of specific memories, distinguishing brain activity patterns produced when viewing a picture of a necklace, say, from a picture of binoculars or other objects. The experiment, conducted by scientists in Birmingham and Cambridge, England, involved several stages with 24 participants first trained to associate words to two unrelated black and white pictures from lists of famous people, ordinary objects or scenes. © 2015 The New York Times Company
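The "neural fingerprint" idea is, at bottom, pattern matching: each remembered item evokes a characteristic spatial pattern of brain activity, and a memory's strength can be tracked by how closely later activity reinstates that pattern. A toy sketch of the comparison using cosine similarity over made-up voxel vectors; the study's actual analysis of fMRI data is far more sophisticated:

```python
import numpy as np

def pattern_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two activity patterns (1.0 = same shape)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
necklace_fingerprint = rng.normal(size=100)  # hypothetical voxel pattern

# A strong memory reinstates its pattern with little noise; a memory
# weakened by a competitor reinstates it with much more.
strong_recall = necklace_fingerprint + rng.normal(scale=0.3, size=100)
weak_recall = necklace_fingerprint + rng.normal(scale=2.0, size=100)

print(f"strong recall match: {pattern_similarity(necklace_fingerprint, strong_recall):.2f}")
print(f"weak recall match:   {pattern_similarity(necklace_fingerprint, weak_recall):.2f}")
```

On this toy measure, a weakening memory shows up exactly as the study describes: the reinstated pattern drifts away from the original fingerprint as a competing memory strengthens.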

Keyword: Learning & Memory
Link ID: 20695 - Posted: 03.17.2015

By RENEE ENGELN On Tuesday, in the wake of an online petition signed by thousands of people, Facebook announced that it was removing “feeling fat” from its list of status update emoticons. The petition argued that the offending emoticon, with its chubby cheeks and double chin, reinforced negative body images, and Facebook seemed to agree. Is it really such a big deal if you tell everyone how fat you feel? After all, a simple “I’m so fat!” can result in a chorus of empathetic voices, saying, “Me, too!” or “You’re beautiful just the way you are!” And that will help you feel better, and help others feel better, too — right? Wrong. As someone who studies this type of public body self-disparagement, known as “fat talk,” I can say that it probably will make you feel worse. And it may drag down other people with you. Conversational shaming of the body has become practically a ritual of womanhood (though men also engage in it). In a survey that a colleague and I reported in 2011 in the Psychology of Women Quarterly, we found that more than 90 percent of college women reported engaging in fat talk — despite the fact that only 9 percent were actually overweight. In another survey, which we published in December in the Journal of Health Psychology, we canvassed thousands of women ranging in age from 16 to 70. Contrary to the stereotype of fat talk as a young woman’s practice, we found that fat talk was common across all ages and all body sizes. Most important, fat talk is not a harmless social-bonding ritual. According to an analysis of several studies that my colleagues and I published in 2012 in the Psychology of Women Quarterly, fat talk was linked with body shame, body dissatisfaction and eating-disordered behavior. Fat talk does not motivate women to make healthier choices or take care of their bodies; in fact, the feelings of shame it brings about tend to encourage the opposite. © 2015 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 20691 - Posted: 03.17.2015

By Emily Underwood Deep brain stimulation, which now involves surgically inserting electrodes several inches into a person's brain and connecting them to a power source outside the skull, can be an extremely effective treatment for disorders such as Parkinson's disease, obsessive compulsive disorder, and depression. The expensive, invasive procedure doesn't always work, however, and can be risky. Now, a study in mice points to a less invasive way to massage neuronal activity, by injecting metal nanoparticles into the brain and controlling them with magnetic fields. Major technical challenges must be overcome before the approach can be tested in humans, but the technique could eventually provide a wireless, nonsurgical alternative to traditional deep brain stimulation surgery, researchers say. "The approach is very innovative and clever," says Antonio Sastre, a program director in the Division of Applied Science & Technology at the National Institute of Biomedical Imaging and Bioengineering in Bethesda, Maryland. The new work provides "a proof of principle." The inspiration to use magnets to control brain activity in mice first struck materials scientist Polina Anikeeva while working in the lab of neuroscientist-engineer Karl Deisseroth at Stanford University in Palo Alto, California. At the time, Deisseroth and colleagues were refining optogenetics, a tool that can switch specific ensembles of neurons on and off in animals with beams of light. © 2015 American Association for the Advancement of Science.

Keyword: Brain imaging
Link ID: 20690 - Posted: 03.14.2015

By Matthew J.X. Malady One hour and seven minutes into the decidedly hit-or-miss 1996 comedy Black Sheep, the wiseass sidekick character played by David Spade finds himself at an unusually pronounced loss for words. While riding in a car driven by Chris Farley’s character, he glances at a fold-up map and realizes he somehow has become unfamiliar with the name for paved driving surfaces. “Robes? Rouges? Rudes?” Nothing seems right. Even when informed by Farley that the word he’s looking for is roads, Spade’s character continues to struggle: “Rowds. Row-ads.” By this point, he’s become transfixed. “That’s a total weird word,” he says, “isn’t it?” Now, it’s perhaps necessary to mention that, in the context of the film, Spade’s character is high off nitrous oxide that has leaked from the car’s engine boosters. But never mind that. Row-ad-type word wig outs similar to the one portrayed in that movie are things that actually happen, in real life, to people with full and total control over their mental capacities. These wordnesias sneak up on us at odd times when we’re writing or reading text. Here’s how they work: Every now and again, for no good or apparent reason, you peer at a standard, uncomplicated word in a section of text and, well, go all row-ads on it. If you’re typing, that means inexplicably blanking on how to spell something easy like cake or design. The reading version of wordnesia occurs when a common, correctly spelled word either seems as though it can’t possibly be spelled correctly, or like it’s some bizarre combination of letters you’ve never before seen—a grouping that, in some cases, you can’t even imagine being the proper way to compose the relevant term. © 2014 The Slate Group LLC.

Keyword: Language
Link ID: 20688 - Posted: 03.14.2015

When it comes to fight or flight for brawling crickets, a chemical in the brain is in charge. Being roughed up in a skirmish can trigger nerve cells in Mediterranean field crickets (Gryllus bimaculatus) to release nitric oxide, making the losing cricket run away, scientists report online March 13 in Science Advances. When two crickets face off and the loser hits its limit, it flees the fight. In a second bout, the loser then tries to avoid the winner. Nitric oxide prompts this continued submissive behavior, which lasts several hours before a cricket’s will to fight returns. “If you block nitric oxide they recover quickly, and if you give them nitric oxide they don’t,” says Paul Stevenson, a coauthor of the new research and a behavioral neurobiologist at Leipzig University in Germany. “It’s a very simple algorithm for controlling a very complicated social situation.” P. Stevenson and J. Rillich. Adding up the odds—Nitric oxide signaling underlies the decision to flee and post-conflict depression of aggression. Science Advances. Published online March 13, 2015. doi: 10.1126/sciadv.1500060. © Society for Science & the Public 2000 - 2015.
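Stevenson's "very simple algorithm" can be caricatured in a few lines: defeat spikes the loser's nitric oxide signal, the signal decays over hours, and the cricket fights only while the signal sits below a threshold. All constants below are invented for illustration; the paper reports hours-long suppression, not these exact dynamics:

```python
# Toy version of the flee/fight switch described by Stevenson and Rillich.
# All constants are invented for illustration, not taken from the paper.
NO_THRESHOLD = 1.0       # above this level the cricket stays submissive
HOURLY_DECAY = 0.35      # fraction of nitric oxide signal lost per hour

def willing_to_fight(no_level: float) -> bool:
    return no_level < NO_THRESHOLD

no_level = 3.0  # spike in nitric oxide signalling after losing a bout
for hour in range(8):
    behaviour = "fights" if willing_to_fight(no_level) else "flees"
    print(f"hour {hour}: NO = {no_level:.2f} -> {behaviour}")
    no_level *= 1 - HOURLY_DECAY
```

With these made-up numbers the cricket's will to fight returns after about three hours, matching the several-hour submissive period the article describes; blocking nitric oxide corresponds to zeroing the level at once.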

Keyword: Aggression
Link ID: 20686 - Posted: 03.14.2015

By Emily Underwood From imaging babies to blasting apart kidney stones, ultrasound has proved to be a versatile tool for physicians. Now, several research teams aim to unleash the technology on some of the most feared brain diseases. The blood-brain barrier, a tightly packed layer of cells that lines the brain's blood vessels, protects it from infections, toxins, and other threats but makes the organ frustratingly hard to treat. A strategy that combines ultrasound with microscopic blood-borne bubbles can briefly open the barrier, in theory giving drugs or the immune system access to the brain. In the clinic and the lab, that promise is being evaluated. This month, in one of the first clinical tests, Todd Mainprize, a neurosurgeon at the University of Toronto in Canada, hopes to use ultrasound to deliver a dose of chemotherapy to a malignant brain tumor. And in some of the most dramatic evidence of the technique's potential, a research team reports this week in Science Translational Medicine that they used it to rid mice of abnormal brain clumps similar to those in Alzheimer's disease, restoring lost memory and cognitive functions. If such findings can be translated from mice to humans, “it will revolutionize the way we treat brain disease,” says biophysicist Kullervo Hynynen of the Sunnybrook Research Institute in Toronto, who originated the ultrasound method. Some scientists stress that rodent findings can be hard to translate to humans and caution that there are safety concerns about zapping the brain with even the low-intensity ultrasound used in the new study, which is similar to that used in diagnostic scans. © 2015 American Association for the Advancement of Science.

Keyword: Alzheimers
Link ID: 20685 - Posted: 03.12.2015

By Daisy Yuhas The brain is a hotbed of electrical activity. Scientists have long known that brain cells communicate via electrical missives, created by charged atoms and molecules called ions as they travel across the membranes of those cells. But a new study suggests that in the days and weeks that lead up to a brain forming in an embryo or fetus, altering the electrical properties of these cells can dramatically change how the ensuing brain develops. Researchers at Tufts University and the University of Minnesota have investigated how the difference in charge on either side of a resting cell’s membrane—its electrical potential—helps build the brain. In previous work Tufts University developmental biologist Michael Levin found that patterns of electrical potentials in the earliest stages of an embryo’s development can direct how an animal’s body grows, and that manipulating those potentials can cause a creature to sprout extra limbs, tails or functioning eyes. Now, Levin’s group has investigated how these potentials shape the brain. Working with frog embryos the researchers first used dyes to see the patterns of electrical potentials that precede brain development. They noticed that before the development of a normal brain the cells lining the neural tube, a structure that eventually becomes the brain and spinal cord, have extreme differences in ionic charge within and outside the membrane that houses the cells. In other words, these cells are extremely polarized. © 2015 Scientific American
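The "electrical potential" in question is the ordinary membrane potential set by ion concentration gradients. For a single ion species the equilibrium value follows the Nernst equation; a worked example for potassium, using textbook mammalian concentrations rather than anything measured in the frog embryos of the study:

```python
import math

R = 8.314      # gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol
T = 310.0      # temperature, K (about 37 C)

def nernst_mv(z: int, conc_out_mM: float, conc_in_mM: float) -> float:
    """Equilibrium (Nernst) potential in millivolts for an ion of valence z."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out_mM / conc_in_mM)

# Textbook mammalian potassium concentrations (mM) -- illustrative values,
# not measurements from the frog-embryo study described above.
e_k = nernst_mv(z=+1, conc_out_mM=5.0, conc_in_mM=140.0)
print(f"E_K = {e_k:.0f} mV")  # about -89 mV: a strongly polarized cell
```

The larger the concentration asymmetry across the membrane, the further this value sits from zero, which is what "extremely polarized" means for the neural-tube cells in the study.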

Keyword: Development of the Brain
Link ID: 20684 - Posted: 03.12.2015

Older people could improve or maintain their mental function through heart-healthy lifestyle changes, a large randomized trial for dementia prevention shows. Researchers in Finland and Sweden designed a trial to tackle risk factors for Alzheimer's disease. The 1,260 Finns aged 60 to 77 participating in the study were all considered at risk of dementia based on standard test scores. Half were randomly assigned to a two-year programme of advice from health professionals on maintaining a healthy diet, aerobic and muscle-training exercises, and brain-training exercises, with regular checks of blood pressure and of height and weight for body mass index, plus physical exams; the other half received regular health advice. Participants in the Finnish Geriatric Intervention Study to Prevent Cognitive Impairment and Disability (FINGER) had their cognitive function measured in a battery of mental tests. "The main hypothesis was that simultaneous changes in several risk factors (even of smaller magnitude) would lead to a protective effect on cognition," Miia Kivipelto from the Karolinska Institute in Stockholm and her co-authors said in Wednesday's issue of The Lancet. Overall, test scores were 25 per cent higher in the diet-and-training group than in the control group. There was no effect on memory. ©2015 CBC/Radio-Canada.

Keyword: Alzheimers
Link ID: 20683 - Posted: 03.12.2015

Mutations in the presenilin-1 gene are the most common cause of inherited, early-onset forms of Alzheimer’s disease. In a new study, published in Neuron, scientists replaced the normal mouse presenilin-1 gene with Alzheimer’s-causing forms of the human gene to discover how these genetic changes may lead to the disorder. Their surprising results may transform the way scientists design drugs that target these mutations to treat inherited or familial Alzheimer’s, a rare form of the disease that affects approximately 1 percent of people with the disorder. The study was partially funded by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health. For decades, it has been unclear exactly how the presenilin mutations cause Alzheimer’s disease. Presenilin is a component of an important enzyme, gamma secretase, which cuts up amyloid precursor protein into two protein fragments, Abeta40 and Abeta42. Abeta42 is found in plaques, the abnormal accumulations of protein in the brain which are a hallmark of Alzheimer’s. Numerous studies suggested that presenilin-1 mutations increased activity of gamma-secretase. Investigators have developed drugs that block gamma-secretase, but they have so far failed in clinical trials to halt the disease. The study led by Raymond Kelleher, M.D., Ph.D. and Jie Shen, Ph.D., professors of neurology at Harvard Medical School, Boston, provides a plot twist in the association of presenilin-1 mutations and inherited Alzheimer’s disease. Using mice with altered forms of the presenilin gene, Drs. Kelleher and Shen discovered that the mutations may cause the disease by decreasing, rather than increasing, the activity of gamma-secretase.

Keyword: Alzheimers; Genes & Behavior
Link ID: 20682 - Posted: 03.12.2015

By Gretchen Reynolds An easy, two-minute vision test administered on the sidelines after a young athlete has hit his or her head can help to reliably determine whether the athlete has sustained a concussion, according to a new study of student athletes, some as young as 5. The test is so simple and inexpensive that any coach or parent potentially could administer it, the study’s authors believe, and any league could afford to provide it as a way to help evaluate and safeguard players. Those of us who coach or care for young athletes know by now that an athlete who falls or collides with something during play or seems dazed, dizzy, loses consciousness or complains of head pain should be tested for a concussion, which occurs when the brain is physically jostled within the skull. But most of us are clueless about how to test young athletes. The most commonly recommended sideline test is the Standardized Assessment of Concussion, a multipart examination during which athletes are asked to name the date, describe how they feel, memorize and recall lists of words, and do jumping jacks and other tests of coordination. Ideally, this assessment should be administered and evaluated by a medical professional. But while the sidelines of college and professional games are crowded with doctors and certified athletic trainers, few high schools and youth leagues have those resources. Most of the time, concussion testing in youth sports falls to volunteer coaches or parents with little if any medical experience. That situation prompted researchers at New York University’s Langone Concussion Center to begin wondering recently whether there might be other, easier diagnostic tools to check young players for concussions. Their thoughts soon turned to vision. “About 50 percent of the brain’s pathways are tied in some way to vision and visual processing,” said Dr. Steven Galetta, chairman of neurology at N.Y.U. Langone Medical Center and senior author of the study, which was published in The Journal of Neuro-Ophthalmology. © 2015 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 20680 - Posted: 03.12.2015

By Douglas Starr In 1906, Hugo Münsterberg, the chair of the psychology laboratory at Harvard University and the president of the American Psychological Association, wrote in the Times Magazine about a case of false confession. A woman had been found dead in Chicago, garroted with a copper wire and left in a barnyard, and the simpleminded farmer’s son who had discovered her body stood accused. The young man had an alibi, but after questioning by police he admitted to the murder. He did not simply confess, Münsterberg wrote; “he was quite willing to repeat his confession again and again. Each time it became richer in detail.” The young man’s account, he continued, was “absurd and contradictory,” a clear instance of “the involuntary elaboration of a suggestion” from his interrogators. Münsterberg cited the Salem witch trials, in which similarly vulnerable people were coerced into self-incrimination. He shared his opinion in a letter to a Chicago nerve specialist, which made the local press. A week later, the farmer’s son was hanged. Münsterberg was ahead of his time. It would be decades before the legal and psychological communities began to understand how powerfully suggestion can shape memory and, in turn, the course of justice. In the early nineteen-nineties, American society was recuperating from another panic over occult influence; Satanists had replaced witches. One case, the McMartin Preschool trial, hinged on nine young victims’ memories of molestation and ritual abuse—memories that they had supposedly forgotten and then, after being interviewed, recovered. The case fell apart, in 1990, because the prosecution could produce no persuasive evidence of the victims’ claims. A cognitive psychologist named Elizabeth Loftus, who had consulted on the case, wondered whether the children’s memories might have been fabricated—in Münsterberg’s formulation, involuntarily elaborated—rather than actually recovered.

Keyword: Learning & Memory
Link ID: 20679 - Posted: 03.12.2015

By Anne Skomorowsky On a Saturday night last month, 12 students at Wesleyan University in Connecticut were poisoned by “Molly,” a hallucinogenic drug they had taken to enhance a campus party. Ambulances and helicopters transported the stricken to nearby hospitals, some in critical condition. Molly—the street name for the amphetamine MDMA—can cause extremely high fevers, liver failure, muscle breakdown, and cardiac arrest. Given the risks associated with Molly, why would anybody take it? The obvious answer—to get high—is only partly true. Like many drugs of abuse, Molly causes euphoria. But Molly is remarkable for its “prosocial” effects. Molly makes users feel friendly, loving, and strongly connected to one another. Molly is most commonly used in settings where communion with others is highly valued, such as raves, music festivals, and college parties. Recently, psychiatrists have taken an interest in its potential to enhance psychotherapy; this has led to new research into the mechanisms by which MDMA makes people feel closer. It appears that MDMA works by shifting the user’s attention towards positive experiences while minimizing the impact of negative feelings. To investigate this, a 2012 study by Cedric Hysek and colleagues used the Reading the Mind in the Eyes Test (RMET), which was developed to evaluate people with autism. In the RMET, participants are shown 36 pictures of the eye region of faces. Their task is to describe what the person in the picture is feeling. Volunteers taking MDMA, under carefully controlled conditions, improved in their recognition of positive emotions; but their performance in recognizing negative emotions declined. In other words, they incorrectly attributed positive or neutral feelings to images that were actually negative in emotional tone. They mistook negative and threat-related images for friendly ones. © 2015 Scientific American

Keyword: Drug Abuse
Link ID: 20678 - Posted: 03.12.2015

By CELIA WATSON SEUPEL Every year, nearly 40,000 Americans kill themselves. The majority are men, and most of them use guns. In fact, more than half of all gun deaths in the United States are suicides. Experts and laymen have long assumed that people set on suicide will ultimately do it even if temporarily deterred. “People think if you’re really intent on dying, you’ll find a way,” said Cathy Barber, the director of the Means Matters campaign at Harvard Injury Control Research Center. Prevention, it follows, depends largely on identifying those likely to harm themselves and getting them into treatment. But a growing body of evidence challenges this view. Suicide can be a very impulsive act, especially among the young, and therefore difficult to predict. Its deadliness depends more upon the means than the determination of the suicide victim. Now many experts are calling for a reconsideration of suicide-prevention strategies. While mental health and substance abuse treatment must always be important components in treating suicidality, researchers like Ms. Barber are stressing another avenue: “means restriction.” Instead of treating individual risk, means restriction entails modifying the environment by removing the means by which people usually die by suicide. The world cannot be made suicide-proof, of course. But, these researchers argue, if the walkway over a bridge is fenced off, a struggling college freshman cannot throw herself over the side. If parents leave guns in a locked safe, a teenage son cannot shoot himself if he suddenly decides life is hopeless. With the focus on who dies by suicide, these experts say, not enough attention has been paid to restricting the means to do it — particularly access to guns. © 2015 The New York Times Company

Keyword: Depression
Link ID: 20674 - Posted: 03.10.2015

If you missed the great dress debate of 2015, you were probably living under a rock. Staffrooms across the globe threatened to come to a standstill as teachers addressed the all-important question – was the dress white and gold or blue and black? This is just one example of how our brains interpret things differently. So, with the 20th anniversary of Brain Awareness Week from 16 to 22 March, this week we bring you a collection of ideas and resources to get students’ synapses firing. The brain is one of our most interesting organs, and advances in technology and medicine mean we now know more about it than ever before. Brain Awareness Week is a global campaign to raise awareness of the progress and benefits of brain research. The organisers, the Dana Foundation, have put together an assortment of teaching materials for primary and secondary students. For children aged five to nine, the Mindboggling Workbook is a good place to start. It includes information on how the brain works, what it does and how to take care of it. There’s also a section on the nervous system, which you could turn into a fun group activity. Ask one student to lie down on a large sheet of paper while others trace around them. Add a drawing of the brain and the spinal cord. Use different coloured crayons to illustrate how neurons send messages around your body when you a) touch something hot, b) get stung on the leg by a wasp, and c) wriggle your toes after stepping in sand. Can students explain why the brain is described as being more powerful than a computer? © 2015 Guardian News and Media Limited

Keyword: Miscellaneous
Link ID: 20673 - Posted: 03.10.2015

Robin Tricoles The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times. Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, like I did, are classified as having AWLS. The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporal and objective distortions in a paper in the Canadian Medical Association Journal. © 2015 by The Atlantic Monthly Group.

Keyword: Attention
Link ID: 20672 - Posted: 03.10.2015

Alison Abbott Mediators appointed to analyse the rifts within Europe’s ambitious €1-billion (US$1.1-billion) Human Brain Project (HBP) have called for far-reaching changes both in its governance and its scientific programmes. Most significantly, the report recommends that systems neuroscience and cognitive neuroscience should be reinstated into the HBP. The mediation committee, led by engineer Wolfgang Marquardt, director of Germany’s national Jülich Research Centre, sent its final report to the HBP board of directors on 9 March, and issued a press release summarizing its findings. (The full report will not be published until after the board, a 22-strong team of scientists, discusses its contents at a meeting on 17–18 March). The European Commission flagship project, which launched in October 2013, is intended to boost supercomputing through neuroscience, with the aim of simulating the brain in a computer. But the project has been racked by dissent from the outset. In early 2014, a three-person committee of scientists who ran the HBP’s scientific direction revealed that they planned to eliminate cognitive neuroscience from the initiative, which precipitated a mass protest. More than 150 of Europe’s leading neuroscientists signed a letter to the European Commission, complaining about the project’s management and charging that the HBP plan to simulate the brain using only ‘bottom-up’ data on the behaviour of neurons was doomed to failure if it did not include the top-down constraints provided by systems and cognitive neuroscience. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20670 - Posted: 03.10.2015

By James Gallagher Health editor, BBC News website, San Diego A dog has been used to sniff out thyroid cancer in people who had not yet been diagnosed, US researchers say. Tests on 34 patients showed an 88% success rate in finding tumours. The team, presenting their findings at the annual meeting of the Endocrine Society, said the animal had an "unbelievable" sense of smell. Cancer Research UK said using dogs would be impractical, but discovering the chemicals the dogs can smell could lead to new tests. The thyroid is a gland in the neck that produces hormones to regulate metabolism. Thyroid tumours are relatively rare and are normally diagnosed by testing hormone levels in the blood and by using a needle to extract cells for testing. Cancers are defective, out-of-control cells. They have their own unique chemistry and release "volatile organic compounds" into the body. The canine approach relies on dogs having 10 times the number of smell receptors as people and being able to pick out the unique smells being released by cancers. The man's best friend approach has already produced promising results in patients with bowel and lung cancers. A team at the University of Arkansas for Medical Sciences (UAMS) had previously shown that a dog could be trained to smell the difference between urine samples of patients with and without thyroid cancer. The next step was to see if this could be used as a diagnostic test. Frankie the German Shepherd was trained to lie down when he could smell thyroid cancer in a sample and to turn away if the urine was clean; he gave the correct diagnosis in 30 out of 34 cases.
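The quoted 88% is simply Frankie's raw accuracy, 30 correct calls out of 34 samples. Judging a diagnostic test properly would also need sensitivity and specificity, which require the split into true and false positives and negatives that the excerpt does not report; the arithmetic on the known totals:

```python
correct, total = 30, 34
print(f"accuracy = {correct / total:.0%}")   # 88%, the figure quoted above

# Sensitivity (cancers correctly flagged) and specificity (clean samples
# correctly passed) would need the full confusion matrix, which the
# excerpt does not give -- so they are left uncomputed here.
```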

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20668 - Posted: 03.09.2015

By Lily Hay Newman When I was growing up, I had a lazy eye. I had to wear a patch over my stronger eye for many years so that good-for-nothing, freeloading, lazy eye could learn some responsibility and toughen up. Wearing a patch was really lousy, though, because people would ask me about it all the time and say things like, "What's wrong with you?" Always fun to hear. I would have much preferred to treat my condition, which is also called amblyopia, by playing video games. Who wouldn't? And it seems like that dream may become a possibility. On Tuesday, developer Ubisoft announced Dig Rush, a game that uses stereoscopic glasses and blue and red figures in varying contrasts to attempt to treat amblyopia. Working in collaboration with McGill University and the eye treatment startup Amblyotech, Ubisoft created a world where controlling a mole character to mine precious metals is really training patients' brains to coordinate their eyes. When patients wear a patch, they may force their lazy eye to toughen up, but they aren't doing anything to teach their eyes how to work together. This lack of coordination, called strabismus, is another important factor that the game makers hope can be addressed better by Dig Rush than by "patching" alone. Amblyotech CEO Joseph Koziak said in a statement, “[This] electronic therapy has been tested clinically to significantly increase the visual acuity of both children and adults who suffer from this condition without the use of an eye patch.” One advantage of Dig Rush, he noted, is that it's easier to measure compliance with video games.

Keyword: Vision
Link ID: 20667 - Posted: 03.09.2015

By Neuroskeptic There is a popular view that all of the natural sciences can be arranged in a chain or ladder according to the complexity of their subjects. On this view, physics forms the base of the ladder because it deals with the simplest building-blocks of matter, atoms and subatomic particles. Chemistry is next up because it studies interacting atoms i.e. molecules. Biology studies complex collections of molecules, i.e. cells. Then comes neuroscience, which deals with a complex collection of interacting cells – the brain. Psychology, perhaps, can be seen as the next level above neuroscience, because psychology studies brains interacting with each other and with the environment. So, on this model, we have a kind of Great Chain of Science: physics, then chemistry, then biology, then neuroscience, then psychology. This is an appealing model. But is biology really basic to neuroscience (and psychology)? At first glance it seems like biology – most importantly cell and molecular biology – surely is basic to neuroscience. After all, brains are comprised of cells. All of the functions of brain cells, like synaptic transmission and plasticity, are products of biological machinery, i.e. proteins and ultimately genes. This doesn’t imply that neuroscience could be ‘reduced to’ biology, any more than biology will ever be reduced to pure chemistry, but it does seem to imply that biology is the foundation for neuroscience.

Keyword: Miscellaneous
Link ID: 20664 - Posted: 03.09.2015