Chapter 16.
By Anne Skomorowsky

On a Saturday night last month, 12 students at Wesleyan University in Connecticut were poisoned by “Molly,” a hallucinogenic drug they had taken to enhance a campus party. Ambulances and helicopters transported the stricken to nearby hospitals, some in critical condition. Molly—the street name for the amphetamine MDMA—can cause extremely high fevers, liver failure, muscle breakdown, and cardiac arrest.

Given the risks associated with Molly, why would anybody take it? The obvious answer—to get high—is only partly true. Like many drugs of abuse, Molly causes euphoria. But Molly is remarkable for its “prosocial” effects. Molly makes users feel friendly, loving, and strongly connected to one another. Molly is most commonly used in settings where communion with others is highly valued, such as raves, music festivals, and college parties. Recently, psychiatrists have taken an interest in its potential to enhance psychotherapy; this has led to new research into the mechanisms by which MDMA makes people feel closer. It appears that MDMA works by shifting the user’s attention towards positive experiences while minimizing the impact of negative feelings.

To investigate this, a 2012 study by Cedric Hysek and colleagues used the Reading the Mind in the Eyes Test (RMET), which was developed to evaluate people with autism. In the RMET, participants are shown 36 pictures of the eye region of faces. Their task is to describe what the person in the picture is feeling. Volunteers taking MDMA, under carefully controlled conditions, improved in their recognition of positive emotions, but their performance in recognizing negative emotions declined. In other words, they incorrectly attributed positive or neutral feelings to images that were actually negative in emotional tone. They mistook negative and threat-related images for friendly ones.

© 2015 Scientific American
Keyword: Drug Abuse
Link ID: 20678 - Posted: 03.12.2015
By CELIA WATSON SEUPEL

Every year, nearly 40,000 Americans kill themselves. The majority are men, and most of them use guns. In fact, more than half of all gun deaths in the United States are suicides.

Experts and laymen have long assumed that people intent on suicide will ultimately do it even if temporarily deterred. “People think if you’re really intent on dying, you’ll find a way,” said Cathy Barber, the director of the Means Matters campaign at Harvard Injury Control Research Center. Prevention, it follows, depends largely on identifying those likely to harm themselves and getting them into treatment. But a growing body of evidence challenges this view. Suicide can be a very impulsive act, especially among the young, and therefore difficult to predict. Its deadliness depends more upon the means than upon the determination of the suicide victim.

Now many experts are calling for a reconsideration of suicide-prevention strategies. While mental health and substance abuse treatment must always be important components in treating suicidality, researchers like Ms. Barber are stressing another avenue: “means restriction.” Instead of treating individual risk, means restriction entails modifying the environment by removing the means by which people usually die by suicide.

The world cannot be made suicide-proof, of course. But, these researchers argue, if the walkway over a bridge is fenced off, a struggling college freshman cannot throw herself over the side. If parents keep guns in a locked safe, a teenage son cannot shoot himself if he suddenly decides life is hopeless. With the focus on who dies by suicide, these experts say, not enough attention has been paid to restricting the means to do it — particularly access to guns.

© 2015 The New York Times Company
Link ID: 20674 - Posted: 03.10.2015
If you missed the great dress debate of 2015 you were probably living under a rock. Staffrooms across the globe threatened to come to a standstill as teachers addressed the all-important question – was the dress white and gold or blue and black? This is just one example of how our brains interpret things differently. So, with the 20th anniversary of Brain Awareness Week from 16 to 22 March, this week we bring you a collection of ideas and resources to get students’ synapses firing. The brain is one of our most interesting organs, and advances in technology and medicine mean we now know more about it than ever before. Brain Awareness Week is a global campaign to raise awareness of the progress and benefits of brain research. The organisers, the Dana Foundation, have put together an assortment of teaching materials for primary and secondary students. For children aged five to nine, the Mindboggling Workbook is a good place to start. It includes information on how the brain works, what it does and how to take care of it. There’s also a section on the nervous system, which you could turn into a fun group activity. Ask one student to lie down on a large sheet of paper while others trace around them. Add a drawing of the brain and the spinal cord. Use different coloured crayons to illustrate how neurons send messages around your body when you a) touch something hot, b) get stung on the leg by a wasp, and c) wriggle your toes after stepping in sand. Can students explain why the brain is described as being more powerful than a computer? © 2015 Guardian News and Media Limited
Link ID: 20673 - Posted: 03.10.2015
Robin Tricoles

The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times.

Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, as I did, are classified as having AWLS.

The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporeal and objective distortions in a paper in the Canadian Medical Association Journal.

© 2015 by The Atlantic Monthly Group.
Link ID: 20672 - Posted: 03.10.2015
Alison Abbott Mediators appointed to analyse the rifts within Europe’s ambitious €1-billion (US$1.1-billion) Human Brain Project (HBP) have called for far-reaching changes both in its governance and its scientific programmes. Most significantly, the report recommends that systems neuroscience and cognitive neuroscience should be reinstated into the HBP. The mediation committee, led by engineer Wolfgang Marquardt, director of Germany’s national Jülich Research Centre, sent its final report to the HBP board of directors on 9 March, and issued a press release summarizing its findings. (The full report will not be published until after the board, a 22-strong team of scientists, discusses its contents at a meeting on 17–18 March). The European Commission flagship project, which launched in October 2013, is intended to boost supercomputing through neuroscience, with the aim of simulating the brain in a computer. But the project has been racked by dissent from the outset. In early 2014, a three-person committee of scientists who ran the HBP’s scientific direction revealed that they planned to eliminate cognitive neuroscience from the initiative, which precipitated a mass protest. More than 150 of Europe’s leading neuroscientists signed a letter to the European Commission, complaining about the project’s management and charging that the HBP plan to simulate the brain using only ‘bottom-up’ data on the behaviour of neurons was doomed to failure if it did not include the top-down constraints provided by systems and cognitive neuroscience. © 2015 Nature Publishing Group
Keyword: Brain imaging
Link ID: 20670 - Posted: 03.10.2015
By James Gallagher, Health editor, BBC News website, San Diego

A dog has been used to sniff out thyroid cancer in people who had not yet been diagnosed, US researchers say. Tests on 34 patients showed an 88% success rate in finding tumours. The team, presenting their findings at the annual meeting of the Endocrine Society, said the animal had an "unbelievable" sense of smell. Cancer Research UK said using dogs would be impractical, but discovering the chemicals the dogs can smell could lead to new tests.

The thyroid is a gland in the neck that produces hormones to regulate metabolism. Thyroid tumours are relatively rare and are normally diagnosed by testing hormone levels in the blood and by using a needle to extract cells for testing. Cancers are defective, out-of-control cells. They have their own unique chemistry and release "volatile organic compounds" into the body. The canine approach relies on dogs having 10 times the number of smell receptors as people and being able to pick out the unique smells being released by cancers. The man's best friend approach has already produced promising results in patients with bowel and lung cancers.

A team at the University of Arkansas for Medical Sciences (UAMS) had previously shown that a dog could be trained to smell the difference between urine samples of patients with and without thyroid cancer. The next step was to see if this could be used as a diagnostic test. Frankie the German Shepherd was trained to lie down when he could smell thyroid cancer in a sample and turn away if the urine was clean. Frankie gave the correct diagnosis in 30 out of 34 cases.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20668 - Posted: 03.09.2015
By Lily Hay Newman When I was growing up, I had a lazy eye. I had to wear a patch over my stronger eye for many years so that good-for-nothing, freeloading, lazy eye could learn some responsibility and toughen up. Wearing a patch was really lousy, though, because people would ask me about it all the time and say things like, "What's wrong with you?" Always fun to hear. I would have much preferred to treat my condition, which is also called amblyopia, by playing video games. Who wouldn't? And it seems like that dream may become a possibility. On Tuesday, developer Ubisoft announced Dig Rush, a game that uses stereoscopic glasses and blue and red figures in varying contrasts to attempt to treat amblyopia. Working in collaboration with McGill University and the eye treatment startup Amblyotech, Ubisoft created a world where controlling a mole character to mine precious metals is really training patients' brains to coordinate their eyes. When patients wear a patch, they may force their lazy eye to toughen up, but they aren't doing anything to teach their eyes how to work together. This lack of coordination, called strabismus, is another important factor that the game makers hope can be addressed better by Dig Rush than by "patching" alone. Amblyotech CEO Joseph Koziak said in a statement, “[This] electronic therapy has been tested clinically to significantly increase the visual acuity of both children and adults who suffer from this condition without the use of an eye patch.” One advantage of Dig Rush, he noted, is that it's easier to measure compliance with video games.
Link ID: 20667 - Posted: 03.09.2015
By Neuroskeptic

There is a popular view that all of the natural sciences can be arranged in a chain or ladder according to the complexity of their subjects. On this view, physics forms the base of the ladder because it deals with the simplest building-blocks of matter, atoms and subatomic particles. Chemistry is next up because it studies interacting atoms, i.e. molecules. Biology studies complex collections of molecules, i.e. cells. Then comes neuroscience, which deals with a complex collection of interacting cells – the brain. Psychology, perhaps, can be seen as the next level above neuroscience, because psychology studies brains interacting with each other and with the environment. So on this model, we have a kind of Great Chain of Science: physics → chemistry → biology → neuroscience → psychology.

This is an appealing model. But is biology really basic to neuroscience (and psychology)? At first glance it seems like biology – most importantly cell and molecular biology – surely is basic to neuroscience. After all, brains are composed of cells. All of the functions of brain cells, like synaptic transmission and plasticity, are products of biological machinery, i.e. proteins and ultimately genes. This doesn’t imply that neuroscience could be ‘reduced to’ biology, any more than biology will ever be reduced to pure chemistry, but it does seem to imply that biology is the foundation for neuroscience.
Link ID: 20664 - Posted: 03.09.2015
By Jonathan Webb, Science reporter, BBC News, San Antonio

Physicists have pinned down precisely how pipe-shaped cells in our retina filter the incoming colours. These cells, which sit in front of the ones that actually sense light, play a major role in our colour vision that was only recently confirmed. They funnel crucial red and green light into cone cells, leaving blue to spill over and be sensed by rod cells - which are responsible for our night vision. Key to this process, researchers now say, is the exact shape of the pipes.

The long, thin cells are known as "Muller glia" and they were originally thought to play more of a supporting role in the retina. They clear debris, store energy and generally keep the conditions right for other cells - like the rods and cones behind them - to turn light into electrical signals for the brain. But a study published last year confirmed the idea, proposed in earlier simulations, that Muller cells also function rather like optical fibres. 3D scans revealed the pipe-like structure of the Muller cells (in red) sitting above the photoreceptor cells (in blue).

And more than just piping light to the back of the retina, where the rods and cones sit, they selectively send red and green light - the most important for human colour vision - to the cone cells, which handle colour. Meanwhile, they leave 85% of blue light to spill over and reach nearby rod cells, which specialise in those wavelengths and give us the mostly black-and-white vision that gets us by in dim conditions.

© 2015 BBC.
By Dina Fine Maron

Obesity stems primarily from the overconsumption of food paired with insufficient exercise. But this elementary formula cannot explain how quickly the obesity epidemic has spread globally in the past several decades, nor why more than one third of adults in the U.S. are now obese. Many researchers believe that a more complex mix of environmental exposures, lifestyle, genetics and the microbiome’s makeup helps explain that phenomenon. And a growing body of work suggests that exposure to certain chemicals—found in nature as well as industry—may play an essential role by driving the body to produce and store surplus fat in its tissues.

Evidence of that cause-and-effect relationship in humans is still limited, but in laboratory animals and in petri dishes, data linking the chemicals to problematic weight gain are mounting. Moreover, the effects in animals appear to be passed on not just to immediate offspring but also to grandchildren and great-grandchildren—potentially accounting for some multigenerational obesity.

The murkier picture for humans may become clearer in the next five years, says Jerry Heindel, a health science administrator at the National Institute of Environmental Health Sciences. His agency is now funding 57 grants related to obesity and diabetes, he said on March 2 at a meeting of the Institute of Medicine (IOM). The studies look at how chemicals, including those that appear to alter hormone regulation (such as the plasticizer bisphenol A and the antibacterial chemical triclosan), affect weight gain or insulin resistance. Thirty-two of the ongoing studies are in humans. And 20 of those will help assess the longer-term risks to children by tracking the youngsters' chemical levels in utero or as newborns and beyond.

© 2015 Scientific American
Link ID: 20660 - Posted: 03.07.2015
Dr. Lisa Sanders

On Thursday we challenged Well readers to solve the case of a middle-aged woman with arthritis who developed a wasting illness after what looked like a simple cold. Her rheumatologist was worried that the immune-suppressing medications the patient took to treat her joint disease had caused the new illness. More than 300 of you took on the challenge, and 17 of you correctly identified this rarity. The correct diagnosis is …

Whipple’s disease

The first reader to make the diagnosis was Mike Natter, a second-year medical student at the Sidney Kimmel Medical College at Thomas Jefferson University in Philadelphia. Mike said it was an easy case for him because he had been studying for an exam the next day and had just read about the disease. He is a frequent contributor to this column and says that he got the right diagnosis twice before, but this was the first time he was first. Well done, Mike!

The Diagnosis

Whipple’s disease was first identified in 1907 by Dr. George Whipple, who was caring for a fellow physician who had “gradual loss of weight and strength, stools consisting chiefly of neutral fat and fatty acids, indefinite abdominal signs, and a peculiar multiple arthritis.” The patient eventually died. Dr. Whipple suspected an infectious cause because he found bacteria in many of the patient’s affected tissues, but the organism itself wasn’t identified for nearly 80 years. The bug, Tropheryma whipplei, is common and found mostly in soil. And yet the infection is rare. There have been only about 1,000 reported cases of Whipple’s disease in the more than one hundred years since it was first described. Over two-thirds of those were in middle-aged white men. Many of them were farmers or others who had occupational exposure to soil.

© 2015 The New York Times Company
Link ID: 20659 - Posted: 03.07.2015
By David Masci Potential Republican presidential candidate Dr. Ben Carson made news earlier this week when he said that being gay is a “choice,” but when it comes to public opinion, polls show that Americans remain divided over whether “nature” or “nurture” is ultimately responsible for sexual orientation. Four-in-ten Americans (42%) said that being gay or lesbian is “just the way some choose to live,” while a similar share (41%) said that “people are born gay or lesbian,” according to the most recent Pew Research Center poll on the issue, conducted in 2013. Fewer U.S. adults (8%) said that people are gay or lesbian due to their upbringing, while another one-in-ten (9%) said they didn’t know or declined to give a response. People with the most education are the most likely to say that gays and lesbians were born that way. Indeed, 58% of Americans with a postgraduate degree say that people are born gay or lesbian, compared with just 35% of those with a high school diploma or less. The percentage of all Americans who believe that people are born gay or lesbian has roughly doubled (from 20% to 41%) since 1985, when the question was asked in a Los Angeles Times survey. More than three decades of Gallup polls also show a considerable rise in the view that being gay or lesbian is a product of “nature” rather than “nurture.” But the most recent survey, in 2014, still finds that the nation remains split in its feelings on the origins of sexual orientation. Copyright 2015 Pew Research Center
Keyword: Sexual Behavior
Link ID: 20658 - Posted: 03.07.2015
In archaeology it is very rare to find any soft tissue remains: no skin, no flesh, no hair and definitely no brains. However, in 2009, archaeologists from York Archaeological Trust found something very surprising at a site in Heslington, York. During the excavation of an Iron Age landscape at the University of York, a skull, with the jaw and two vertebrae still attached, was discovered face down in a pit, without any evidence of what had happened to the rest of its body.

At first it looked like a normal skull, but it was not until it was being cleaned that Collection Projects Officer Rachel Cubitt discovered something loose inside. “I peered through the hole at the base of the skull to investigate and to my surprise saw a quantity of bright yellow spongy material. It was unlike anything I had seen before,” says Rachel. Sonia O’Connor, from Archaeological Sciences, University of Bradford, was able to confirm that this was brain. With the help of York Hospital’s mortuary they were able to remove the top of the skull in order to get their first look at this astonishingly well-preserved human brain.

Since the discovery, a team of 34 specialists has been working on this brain to study and conserve it as much as possible. By radiocarbon dating a sample of jaw bone, it was determined that this person probably lived in the 6th century BC, which makes this brain about 2,600 years old. By looking at the teeth and the shape of the skull it is likely this person was a man between 26 and 45 years old. An examination of the vertebrae in the neck tells us that he was first hit hard on the neck, and then the neck was severed with a small sharp knife, for reasons we can only guess.

© Copyright York Archaeological Trust 2013-2015.
Keyword: Brain imaging
Link ID: 20657 - Posted: 03.07.2015
by Jan Piotrowski

It's not the most charismatic fossil ever found, but it may reveal secrets of our earliest evolution. Unearthed in Ethiopia, the broken jaw with greying teeth suggests that the Homo lineage – of which modern humans are the only surviving member – existed up to 400,000 years earlier than previously thought. The fragment dates from around 2.8 million years ago, and is by far the most ancient specimen to bear the Homo signature. The earliest previously known specimen was thought to be up to 2.4 million years old.

Showing a mixture of traits, the new find pinpoints the time when humans began their transition from primitive, apelike Australopithecus to the big-brained conqueror of the world, says Brian Villmoare from the University of Nevada, Las Vegas, whose student made the find.

Geological evidence from the same area, also reported this week in a study led by Erin DiMaggio from Pennsylvania State University, shows that the jaw's owner lived just after a major climate shift in the region: forests and waterways rapidly gave way to arid savannah, leaving only the occasional crocodile-filled lake. Except for the sabre-toothed big cat that once roamed these parts, the environment ended up looking much like it does today. It was probably the pressure to adapt to this new world that jump-started our evolution into what we see looking back at us in the mirror today, according to Villmoare.

© Copyright Reed Business Information Ltd.
Link ID: 20654 - Posted: 03.05.2015
Loss of sensation in the eye that gradually leads to blindness has been prevented with an innovative technique, Canadian surgeons say. Abby Messner, 18, of Stouffville, Ont., lost feeling in her left eye after a brain tumour was removed, along with a nerve wrapped around it, when she was 11. Messner said she didn’t notice the loss of feeling until she scratched the eye.

Messner wasn’t able to feel pain in the eye, a condition called corneal anaesthesia. Despite her meticulous care, the eye wouldn’t blink to protect itself when confronted by dust. A sore formed on her cornea and burrowed through, leaving a scar doctors feared would eventually obliterate her vision. "Everyone was like, 'Wow, she had a brain tumour and she’s fine,'" Messner recalled. "You don't really think that everything that is holding me back is my eye."

Messner had to give up competitive swimming because of irritation from the chlorine, as well as playing hockey and spending time outdoors, where wind was a hazard, or inside dry shopping malls.

Over time, ophthalmology surgeon Dr. Asam Ali at SickKids introduced the idea of a nerve graft to restore feeling in the eye. "She started getting feeling back at about the two, three-month mark and that was a real surprise to her and we were very happy at that point because that was a lot faster than anything that had been reported before," Ali said.

©2015 CBC/Radio-Canada.
Alison Abbott Europe’s ambitious but contentious €1-billion Human Brain Project (HBP) has announced changes to its organization in a response to criticisms of its management and scientific trajectory by many high-ranking neuroscientists. On 26 February, the HBP's Board of Directors voted narrowly to disband the three-person executive committee that had run the project, which launched in October 2013 and is intended to boost digital technologies such as supercomputing through collaboration with neuroscience. That decision is expected to be endorsed by HBP’s 85 or so partner universities and research institutes by the end of this week. The revamp comes seven months after 150 top neuroscientists signed a protest letter to the European Commission, charging, among other things, that the committee was acting autocratically and running the project's scientific plans off course. Led by the charismatic but divisive figure of Henry Markram, a neuroscientist at the Swiss Federal Institute of Technology in Lausanne (EPFL) which coordinates the HBP, the committee had stirred up anger last spring when it revealed plans to cut cognitive neuroscience from the initiative. The neuroscientists vowed to boycott the HBP's future phases if their concerns were ignored. An independent mediation committee was established to look into the charges and make recommendations. Its report, which is expected to further shake up the HBP's management, will be published in the next few weeks. In the meantime, the three-person committee's responsibilities will be taken on by the HBP's Board of Directors (currently a 22-strong team of scientists that includes the disbanded executive committee, although they do not have voting rights). © 2015 Nature Publishing Group
Keyword: Brain imaging
Link ID: 20651 - Posted: 03.05.2015
By Abby Phillip

Jan Scheuermann, who has quadriplegia, brings a chocolate bar to her mouth using a robot arm guided by her thoughts. Research assistant Elke Brown watches in the background. (University of Pittsburgh Medical Center)

Over at the Defense Advanced Research Projects Agency, also known as DARPA, there are some pretty amazing (and often top-secret) things going on. But one notable component of a DARPA project was revealed by a Defense Department official at a recent forum, and it is the stuff of science fiction movies. According to DARPA Director Arati Prabhakar, a paralyzed woman was able to use her thoughts to control an F-35 and a single-engine Cessna in a flight simulator.

It's just the latest advance for one woman, 55-year-old Jan Scheuermann, who has been the subject of two years of groundbreaking neurosignaling research. First, Scheuermann began by controlling a robotic arm and accomplishing tasks such as feeding herself a bar of chocolate and giving high fives and thumbs ups. Then, researchers learned that -- surprisingly -- Scheuermann was able to control both right-hand and left-hand prosthetic arms with just the left motor cortex, which is typically responsible for controlling the right-hand side. After that, Scheuermann decided she was up for a new challenge, according to Prabhakar.
Link ID: 20647 - Posted: 03.04.2015
by Catherine de Lange

You won't believe you do it, but you do. After shaking hands with someone, you'll lift your hands to your face and take a deep sniff. This newly discovered behaviour – revealed by covert filming – suggests that much like other mammals, humans use bodily smells to convey information.

We know that women's tears transmit chemosensory signals - their scent lowers testosterone levels and dampens arousal in men - and that human sweat can transmit fear. But unlike other mammals, humans don't tend to go around sniffing each other. Wondering how these kinds of signals might be exchanged, Noam Sobel and his colleagues at the Weizmann Institute of Science in Rehovot, Israel, turned to one of the most common ways in which people touch each other - shaking hands. "We started looking at people and noticed that afterwards, the hand somehow inadvertently reached the face," says Sobel.

To find out if people really were smelling their hands, as opposed to scratching their nose, for example, his team surreptitiously filmed 153 volunteers. Some were wired up to a variety of physiological instruments so that airflow to the nose could be measured without them realising this was the intention. The volunteers were filmed as they greeted a member of the team, either with or without a handshake. The researchers recorded how often the volunteers lifted their hands close to their nose, and how long they kept them there, in the minute before and after the greeting.

© Copyright Reed Business Information Ltd.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20645 - Posted: 03.04.2015
By Felicity Muth

Visual illusions are fun: we know with our rational mind that, for example, these lines are parallel to each other, yet they don’t appear that way. Similarly, I could swear that squares A and B are different colours. But they are not. This becomes clearer when a connecting block is drawn between the two squares (see the image below).

Illusions aren’t just fun tricks for us to play with; they can also tell us something about our minds. Things in the world look to us a certain way, but that doesn’t mean that they are that way in reality. Rather, our brain represents the world to us in a particular way, one that has been selected over evolutionary time. Having such a system means that, for example, we can see some animals running but not others; we couldn’t see a mouse moving from a mile away as a hawk could. This is because there hasn’t been the evolutionary selective pressure on our visual system to be able to do such a thing, whereas there has on the hawk’s. We can also see a range of wavelengths of light, represented as particular colours in our brain, while not being able to see other wavelengths (that, for example, bees and birds can see). Having a system limited by what evolution has given us means that there are many things we are essentially blind to (and wouldn’t know about if it weren’t for technology). It also means that sometimes our brain misrepresents physical properties of the external world in a way that can be confusing once our rational mind realises it.

Of course, all animals have their own representation of the world. How a dog visually perceives the world will be different to how we perceive it. But how can we know how other animals perceive the world? What is their reality? One way we can try to get at this is through visual illusions.

© 2015 Scientific American
By Christof Koch

In the Dutch countryside, a tall, older man, dressed in a maroon sports coat, his back slightly stooped, stands out because of his height and a pair of extraordinarily bushy eyebrows. His words, inflected by a British accent, are directed at a middle-aged man with long, curly brown hair, penetrating eyes and a dark, scholarly gown, who talks in only a halting English that reveals his native French origins. Their strangely clashing styles of speaking and mismatched clothes do not seem to matter to them as they press forward, with Eyebrows peering down intently at the Scholar. There is something distinctly odd about the entire meeting—a crossing of time, place and disciplines.

Eyebrows: So I finally meet the man who doubts everything.

The Scholar: (not missing a beat) At this time, I admit nothing that is not necessarily true. I'm famous for that!

Eyebrows: Is there anything that you are certain of? (sotto voce) Besides your own fame?

The Scholar: (evading the sarcastic jibe) I can't be certain of my fame. Indeed, I can't even be certain that there is a world out there, for I could be dreaming or hallucinating it. I can't be certain about the existence of my own body, its shape and extension, its corporality, for again I might be fooling myself. But now what am I, when I suppose that there is some supremely powerful and, if I may be permitted to say so, malicious deceiver who deliberately tries to fool me in any way he can? Given this evil spirit, how do I know that my sensations about the outside world—that is, it looks, feels and smells in a particular way—are not illusions, conjured up by Him to deceive me? It seems to me that therefore I can never know anything truly about the world. Nothing, rien du tout. I have to doubt everything.

© 2015 Scientific American
Link ID: 20640 - Posted: 03.03.2015