Chapter 16



Links 61 - 80 of 3701

By Emily Underwood It doesn’t take long for tropical zebrafish to get hooked on hydrocodone. Within a week, they will risk their lives thousands of times per hour to get a dose of the opioid, shows the first study that let the fish themselves choose when to take a hit. To train them, researchers released 1.5 milligrams of hydrocodone per liter of water every time they swam over a shallow platform. The drug quickly filtered out of the tank, so they had to keep going back if they wanted to maintain their high. After just 5 days, the trained fish were visiting the opioid-delivering platform almost 2000 times every 50 minutes, the team reports online today in Behavioural Brain Research. When no drug was present, they visited the platform only about 200 times. Fish normally avoid shallow water, where they’re more likely to be spotted by predators. But over and over again, the jonesing zebrafish left the safety of deep water for the shallow platform. When the team rigged the tank so it took several visits to get a hit, the fish ramped up their efforts, returning as many as 20 times for one dose. Previous studies have shown that zebrafish exposed to opioids become stressed and anxious when the drug is taken away, displaying symptoms of withdrawal. But this is the first time scientists have shown that zebrafish will expend effort—and even court danger—to get a dose. Because zebrafish and humans share the same opioid receptor in their brains, as well as neurotransmitters such as dopamine and serotonin that signal pleasure and reward, the team hopes to use them to screen for new treatments for opioid addiction. © 2017 American Association for the Advancement of Science

Keyword: Drug Abuse
Link ID: 23998 - Posted: 08.26.2017

By Helen Thomson Have you ever seen the Virgin Mary in your grilled cheese? Or a screaming face inside a bell pepper? Seeing faces in inanimate objects is a common phenomenon. Now it seems that we’re not alone in experiencing it – monkeys do too. Pareidolia is the scientific term for erroneously perceiving faces where none exist. Other examples include seeing “ghosts” in blurry photos and the man in the moon. To investigate whether pareidolia was a uniquely human experience, Jessica Taubert at the US National Institute of Mental Health in Maryland and her colleagues trained five rhesus macaques to stare at pairs of photos. Each photo showed either an inanimate object that prompts pareidolia in humans, an equivalent object that doesn’t, or the face of a monkey. We already knew that both people and monkeys will look longer at images of faces than other things. So the team presented each of the photos in every possible pairing – 1980 in all – and measured the time the monkeys spent looking at each. The monkeys did indeed seem to succumb to pareidolia – they spent more time looking at illusory faces than the non-illusory photos they were paired with. Interestingly, they also spent more time looking at the illusory faces than the monkey faces, perhaps because they spent longer studying these more unusual “faces”, or because they tend to dislike holding the gaze of another monkey. © Copyright New Scientist Ltd.

Keyword: Attention
Link ID: 23997 - Posted: 08.25.2017

Jon Hamilton It's not just what you say that matters. It's how you say it. Take the phrase, "Here's Johnny." When Ed McMahon used it to introduce Johnny Carson on The Tonight Show, the words were an enthusiastic greeting. But in The Shining, Jack Nicholson used the same two words to convey murderous intent. Now scientists are reporting in the journal Science that they have identified specialized brain cells that help us understand what a speaker really means. These cells do this by keeping track of changes in the pitch of the voice. "We found that there were groups of neurons that were specialized and dedicated just for the processing of pitch," says Dr. Eddie Chang, a professor of neurological surgery at the University of California, San Francisco. Chang says these neurons allow the brain to detect "the melody of speech," or intonation, while other specialized brain cells identify vowels and consonants. "Intonation is about how we say things," Chang says. "It's important because we can change the meaning, even without actually changing the words themselves." For example, by raising the pitch of our voice at the end of a sentence, a statement can become a question. The identification of neurons that detect changes in pitch was largely the work of Claire Tang, a graduate student in Chang's lab and the Science paper's lead author. Tang and a team of researchers studied the brains of 10 epilepsy patients awaiting surgery. The patients had electrodes placed temporarily on the surface of their brains to help surgeons identify the source of their seizures. © 2017 npr

Keyword: Language
Link ID: 23996 - Posted: 08.25.2017

By Frank Swain When you search for a medical condition online, would you also want to take a test for it then and there? Google has announced plans to offer people in the US searching for “depression” a clinically validated questionnaire so they can find out if they may have the condition. But then what? “This sounds like a really good idea that can quickly help people work out whether they are having low moods or feeling blue, [or if they] may have more serious and enduring problems that could be alleviated by seeking help,” says Marjorie Wallace, of the UK mental health charity SANE. “Our concern [however] is that raising expectations of help can be disappointing.” In places where access to therapy is hard to come by, a questionnaire may offer little comfort. Google has partnered with the National Alliance on Mental Illness (NAMI) – a US advocacy group for those affected by mental illness – to provide a link to a depression questionnaire at the top of the search results for terms related to depression. In an announcement posted on Google’s blog, NAMI states that the results of the self-assessment can form the first step towards a diagnosis, and help people have a more informed conversation with their doctor. © Copyright New Scientist Ltd.

Keyword: Depression
Link ID: 23995 - Posted: 08.25.2017

By Sam Wong It seems you can judge an athlete by their face – if they are a man, that is. Male athletes with a higher world ranking tend to be judged as more attractive by women, but there is no such trend for female athletes. Several studies have previously reported a link between facial attractiveness and sporting performance in men, leading to suggestions that women respond to facial cues that reflect athletic ability in potential partners. Some have suggested this is because, in our evolutionary past, women might have benefited from choosing a partner with speed, skill and endurance. As a better hunter, the idea goes, he would have brought home more food, and he might pass on his fitness to their children. But these studies have been criticised, notably for only looking at men. They also tended to focus on team sports, therefore failing to isolate individual performance. To find more evidence, Tim Fawcett and colleagues at the University of Exeter, UK, collected photos of 156 men and women who competed at the 2014 Winter Olympics in the biathlon – an event combining cross-country skiing and shooting. Each athlete was rated for their facial attractiveness by members of the opposite sex, who didn’t know the purpose of the study. © Copyright New Scientist Ltd.

Keyword: Sexual Behavior; Evolution
Link ID: 23992 - Posted: 08.25.2017

By James Gallagher People with higher levels of lithium in their drinking water appear to have a lower risk of developing dementia, say researchers in Denmark. Lithium is naturally found in tap water, although the amount varies. The findings, based on a study of 800,000 people, are not clear-cut. The highest levels cut risk, but moderate levels were worse than low ones. Experts said it was an intriguing and encouraging study that hinted at a way of preventing the disease. The study, at the University of Copenhagen, looked at the medical records of 73,731 Danish people with dementia and 733,653 without the disease. Tap water was then tested in 151 areas of the country. The results, published in JAMA Psychiatry, showed moderate lithium levels (between 5.1 and 10 micrograms per litre) increased the risk of dementia by 22% compared with low levels (below five micrograms per litre). However, those drinking water with the highest lithium levels (above 15 micrograms per litre) had a 17% reduction in risk. The researchers said: "This is the first study, to our knowledge, to investigate the association between lithium in drinking water and the incidence of dementia. "Higher long-term lithium exposure from drinking water may be associated with a lower incidence of dementia." Lithium is known to have an effect on the brain and is used as a treatment in bipolar disorder. However, the lithium in tap water is at much lower levels than is used medicinally. Experiments have shown the element alters a wide range of biological processes in the brain. © 2017 BBC.

Keyword: Alzheimers
Link ID: 23990 - Posted: 08.24.2017

By Eric Bender Physicians call it the 5,000-hour problem. If you have a common chronic condition such as cardiovascular disease or diabetes, the expert in charge of your health for almost all of your 5,000 waking hours annually is — you. And, frankly, you won’t always make the best choices. “The behavior changes that are necessary to address chronic disease are much more in your hands than in the doctor’s,” points out Stacey Chang, executive director of the Design Institute for Health at Dell Medical School in Austin, Texas. “To cede that control to the doctor sometimes is actually counterproductive.” With that in mind, a rapidly evolving set of new digital health tools is angling to help patients engage better with their own care. Wearable health monitors already on the market help to track heart rate, footsteps, or blood glucose levels; sophisticated home health sensors can report on weight and blood pressure; and phone apps can present key feedback and maybe even offer personalized advice. The only problem: It has thus far proved very difficult to know what really works. Indeed, despite a veritable avalanche of “digital health” products, from Fitbits to telehealth heart sensors, and despite floods of data flowing both to the people who use them and to their physicians — and even despite clear evidence that many doctors very much want these new gadgets to work — there is still precious little clinical data proving that they are providing major patient benefits or delivering more cost-effective care. Copyright 2017 Undark

Keyword: Obesity
Link ID: 23989 - Posted: 08.24.2017

By Matthew Hutson Engineers have figured out how to make antennas for wireless communication 100 times smaller than their current size, an advance that could lead to tiny brain implants, micro–medical devices, or phones you can wear on your finger. The brain implants in particular are “like science fiction,” says study author Nian Sun, an electrical engineer and materials scientist at Northeastern University in Boston. But that hasn’t stopped him from trying to make them a reality. The new mini-antennas play off the difference between electromagnetic (EM) waves, such as light and radio waves, and acoustic waves, such as sound and inaudible vibrations. EM waves are fluctuations in an electromagnetic field, and they travel at light speed—an astounding 300,000,000 meters per second. Acoustic waves are the jiggling of matter, and they travel at the much slower speed of sound—in a solid, typically a few thousand meters per second. So, at any given frequency, an EM wave has a much longer wavelength than an acoustic wave. Antennas receive information by resonating with EM waves, which they convert into electrical voltage. For such resonance to occur, a traditional antenna's length must roughly match the wavelength of the EM wave it receives, meaning that the antenna must be relatively big. However, like a guitar string, an antenna can also resonate with acoustic waves. The new antennas take advantage of this fact. They will pick up EM waves of a given frequency if their size matches the wavelength of the much shorter acoustic waves of the same frequency. That means that for any given signal frequency, the antennas can be much smaller. © 2017 American Association for the Advancement of Science
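The size advantage follows directly from wavelength = propagation speed / frequency. A back-of-envelope sketch using the speeds quoted above (the 2.5 GHz carrier frequency is an illustrative assumption, not a figure from the study):

```python
# Compare EM vs. acoustic wavelength at the same frequency.
# Speeds follow the article; the carrier frequency is an assumed example.
EM_SPEED = 3.0e8        # speed of light in m/s (~300,000,000 m/s)
ACOUSTIC_SPEED = 5.0e3  # rough speed of sound in a solid, m/s

def wavelength(speed_m_s, frequency_hz):
    """Wavelength = propagation speed / frequency."""
    return speed_m_s / frequency_hz

freq = 2.5e9  # an assumed wireless carrier frequency, 2.5 GHz

em = wavelength(EM_SPEED, freq)        # EM wavelength, meters
ac = wavelength(ACOUSTIC_SPEED, freq)  # acoustic wavelength, meters

print(f"EM wavelength:       {em:.3f} m")
print(f"Acoustic wavelength: {ac:.2e} m")
print(f"Size ratio:          {em / ac:,.0f}x")
```

At the same frequency, the acoustic wavelength is shorter by the ratio of the two propagation speeds (here 60,000x), which is why a structure resonating acoustically can be so much smaller than a conventional quarter-wave or half-wave antenna.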

Keyword: Brain imaging
Link ID: 23987 - Posted: 08.23.2017

By Alexander P. Burgoyne, David Z. Hambrick More than 60 years ago, Francis Crick and James Watson discovered the double-helical structure of deoxyribonucleic acid—better known as DNA. Today, for the cost of a Netflix subscription, you can have your DNA sequenced to learn about your ancestry and proclivities. Yet, while it is an irrefutable fact that the transmission of DNA from parents to offspring is the biological basis for heredity, we still know relatively little about the specific genes that make us who we are. That is changing rapidly through genome-wide association studies—GWAS, for short. These studies search for differences in people’s genetic makeup—their “genotypes”—that correlate with differences in their observable traits—their “phenotypes.” In a GWAS recently published in Nature Genetics, a team of scientists from around the world analyzed the DNA sequences of 78,308 people for correlations with general intelligence, as measured by IQ tests. The major goal of the study was to identify single nucleotide polymorphisms—or SNPs—that correlate significantly with intelligence test scores. Found in most cells throughout the body, DNA is made up of four molecules called nucleotides, referred to by their organic bases: cytosine (C), thymine (T), adenine (A), and guanine (G). Within a cell, DNA is organized into structures called chromosomes. Humans normally have 23 pairs of chromosomes, with one in each pair inherited from each parent. © 2017 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 23986 - Posted: 08.23.2017

By Helen Thomson Our brains seem better at predictions than we are. A part of our brain becomes active when it knows something will be successfully crowdfunded, even if we consciously decide otherwise. If this finding stands up and works in other areas of life, neuroforecasting may lead to better voting polls or even predict changes in financial markets. To see if one can predict market behaviour by sampling a small number of people, Brian Knutson at Stanford University in California and his team scanned the brains of 30 people while they decided whether to fund 36 projects from the crowdfunding website Kickstarter. The projects were all recently posted proposals for documentary films. Each participant had their brain scanned while taking in the pictures and descriptions of each campaign, and they were then asked if they would want to fund the project. When the real Kickstarter campaigns ended a few weeks later, 18 of the projects had gained enough funding to go forward. Examining the participants’ brain scans, the team discovered that activity in a region called the nucleus accumbens had been different when they considered projects that later went on to be successful. Prediction paradox The team trained an algorithm to recognise these differences in brain activity using scan data from 80 per cent of the projects, then tested the program on the remaining 20 per cent. Using neural activity alone, the algorithm was able to forecast which Kickstarter campaigns would be funded with 59.1 per cent accuracy – more than would be expected by chance. © Copyright New Scientist Ltd.
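The evaluation logic described above (train on 80 per cent of the projects, test on the held-out 20 per cent) can be illustrated with a toy sketch. Everything here is an assumption for illustration only: the data are synthetic, and the simple threshold classifier stands in for the study's actual algorithm trained on real neural recordings.

```python
import random

random.seed(42)

# Synthetic stand-in: one scalar per "project" summarizing nucleus accumbens
# activity, drawn slightly higher on average for projects that end up funded.
# (Invented numbers -- purely to demonstrate the 80/20 hold-out procedure.)
def make_projects(n):
    projects = []
    for _ in range(n):
        funded = random.random() < 0.5
        activity = random.gauss(1.0 if funded else 0.0, 1.5)
        projects.append((activity, funded))
    return projects

projects = make_projects(1000)
split = int(len(projects) * 0.8)
train, test = projects[:split], projects[split:]  # 80% train, 20% held out

# "Train": place the decision threshold midway between the two class means.
funded_vals = [a for a, f in train if f]
other_vals = [a for a, f in train if not f]
threshold = (sum(funded_vals) / len(funded_vals) +
             sum(other_vals) / len(other_vals)) / 2

# "Test": classify only the held-out projects and measure accuracy.
correct = sum((a > threshold) == f for a, f in test)
accuracy = correct / len(test)
print(f"Hold-out accuracy: {accuracy:.1%}")
```

With a weak, noisy signal like this, hold-out accuracy should land modestly but reliably above the 50 per cent chance baseline, which is the shape of the result the study reports.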

Keyword: Attention
Link ID: 23984 - Posted: 08.22.2017

By RONI CARYN RABIN Many of us grab coffee and a quick bite in the morning and eat more as the day goes on, with a medium-size lunch and the largest meal of the day in the evening. But a growing body of research on weight and health suggests we may be doing it all backward. A recent review that tracked the dietary patterns of 50,000 Seventh-day Adventist adults over seven years provides the latest evidence suggesting that we should front-load our calories early in the day to jump-start our metabolisms and prevent obesity, starting with a robust breakfast and tapering off to a smaller lunch and light supper, or no supper at all. More research is needed, but a series of experiments in animals and some small trials in humans have pointed in the same direction, suggesting that watching the clock, and not just the calories, may play a more important role in weight control than previously acknowledged. And doctors’ groups are taking note. This year, the American Heart Association endorsed the principle that the timing of meals may help reduce risk factors for heart disease, like high blood pressure and high cholesterol. The group issued a scientific statement emphasizing that skipping breakfast — which 20 to 30 percent of American adults do regularly — is linked to a higher risk of obesity and impaired glucose metabolism or diabetes, even though there is no proof of a causal relationship. The heart association’s statement also noted that occasional fasting is associated with weight loss, at least in the short term. “I always tell people not to eat close to bedtime, and to try to eat earlier in the day,” said Marie-Pierre St-Onge, an associate professor of nutritional medicine at Columbia University’s College of Physicians and Surgeons, who led the work group that issued the statement.
Perhaps not surprisingly, the latest study found that those who supplemented three meals a day with snacks tended to gain weight over time, while those who ate only one or two meals a day tended to lose weight, even compared with those who just ate three meals a day. © 2017 The New York Times Company

Keyword: Obesity
Link ID: 23983 - Posted: 08.22.2017

By JANE E. BRODY A very slender friend recently admitted to me that she “can’t stand to be around fat people.” Her reaction is almost visceral, and it prompts her to avoid social and professional contact with people who are seriously overweight. Although she can’t pinpoint the source of her feelings, she said they go back as far as she can remember. And she is hardly alone. Decades ago, researchers found that weight-based bias, which is often accompanied by overt discrimination and bullying, can date back to childhood, sometimes as early as age 3. The prejudiced feelings may not be apparent to those who hold them, yet they can strongly influence someone’s behavior. A new study by researchers at Duke University, for example, found that “implicit weight bias” in children ages 9 to 11 was as common as “implicit racial bias” is among adults. The study’s lead author, Asheley C. Skinner, a public health researcher, said that prejudices that people are unaware of may predict their biased behaviors even better than explicit prejudice. She traced the origins of weight bias in young children and adolescents to the families they grow up in as well as society at large, which continues to project cultural ideals of ultra-slimness and blames people for being fat. “It’s pretty common for parents to comment on their own weight issues and tell their children they shouldn’t be eating certain foods or remark about how much weight they’re gaining,” Dr. Skinner said.

Keyword: Obesity
Link ID: 23982 - Posted: 08.22.2017

By Abby Olena Our brains quickly characterize everything we see as familiar or new, and scientists have been investigating this connection between vision and cognition for years. Now, research in Japanese macaques (Macaca fuscata) reveals that the activation of neurons in a part of the primate brain called the perirhinal cortex can cause monkeys to recognize new objects as familiar and vice versa. The study was published today (August 17) in Science. “There are a lot of really exciting aspects to this paper,” says neuroscientist David Sheinberg of Brown University, who did not participate in the work. “This group continues to make advances that are helping us understand how we convert visual impressions into things we know.” Primate brains process visual information through several brain structures that make up the ventral visual stream. The last stop in this stream is the perirhinal cortex, part of the medial temporal lobe. Scientists know that this brain structure plays roles in visual memory and object discrimination. But one open question is whether the perirhinal cortex represents objects’ physical traits or whether it might also communicate information about nonphysical attributes, such as whether an object has been seen before. “In the primate, the perirhinal cortex is the link between the visual pathway and the limbic memory system,” coauthor and University of Tokyo neuroscientist Yasushi Miyashita writes in an email to The Scientist. “Therefore, the perirhinal cortex is one of the most likely candidates in the brain where visual information is transformed to subjective semantic values by referring to one’s own memory.” © 1986-2017 The Scientist

Keyword: Attention
Link ID: 23979 - Posted: 08.19.2017

By Steven Lubet There is a memorable episode in the now-classic sitcom Scrubs in which the conniving Dr. Kelso unveils a plan to peddle useless “full body scans” as a new revenue stream for the perpetually cash-strapped Sacred Heart Hospital. The irascible but ultimately patient-protecting Dr. Cox objects loudly. “I think showing perfectly healthy people every harmless imperfection in their body just to scare them into taking invasive and often pointless tests is an unholy sin,” he says. Undeterred, Kelso launches an advertising campaign that promotes the scans in a tear-jerking television commercial and a billboard screaming “YOU may already be DYING.” Alarmist medical advertising is pretty funny on television, but it can be far more troubling in real life. Although I’ve never been alerted to impending death, I recently received an advertisement from my own trusted health care provider warning that I may have Alzheimer’s disease, although I have no known symptoms and no complaints. As long-time patients at NorthShore University Health System, which is affiliated with the University of Chicago, my wife and I received two solicitations from its Center for Brain Health touting the development of “ways to slow brain aging and even prevent the onset of Alzheimer’s.” According to the ads, which arrived in both postcard and email form, there is “new hope for delaying — even preventing — aging brain diseases” through “genetic testing, advanced diagnostics, and lifestyle factors.” Copyright 2017 Undark

Keyword: Alzheimers
Link ID: 23978 - Posted: 08.19.2017

By Denise D. Cummins Looking directly at the camera, NPR's Skunk Bear host Adam Cole laments, "It's pretty clear that I'll never be able to have a real human-style conversation with an ape.” In his short and very entertaining video, Cole summarizes decades of research aimed at teaching apes human language, all of which, we are to understand, came to naught. But what the video actually shows us is how little the average person (and many scientists) understands about language. At one point, Cole tells his dog to sit, and the dog sits. This, he tells us, is not evidence that the dog knows English. But actually, it is. The dog's behavior shows us that he is capable of understanding the simple concept of sitting, that he is capable of distinguishing the verbal signal "sit" from other verbal signals, and that he is capable of connecting the two. This isn't rocket science, it isn't magic, and it isn't anthropomorphizing. It is just the way word learning works. In studies conducted at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, a border collie named Rico was taught the meanings of 200 words. He could even use the process of elimination to figure out unfamiliar words: If he already knew the word "ball,” and his trainer showed him a ball and a stick and told him to get the "stick,” he would bring the stick. He could remember new words even after a month of not hearing them. © 2017 Scientific American,

Keyword: Animal Communication; Language
Link ID: 23977 - Posted: 08.19.2017

Tina Hesman Saey Add a new ingredient to the sugar, spice and everything nice needed to make girls. A protein called COUP-TFII is necessary to eliminate male reproductive tissue from female mouse embryos, researchers report in the Aug. 18 Science. For decades, females have been considered the “default” sex in mammals. The new research overturns that idea, showing that making female reproductive organs is an active process that involves dismantling a primitive male tissue called the Wolffian duct. In males, the Wolffian duct develops into the parts needed to ejaculate sperm, including the epididymis, vas deferens and seminal vesicles. In females, a similar embryonic tissue called the Müllerian duct develops into the fallopian tubes, uterus and vagina. Both duct tissues are present in early embryos. A study by French endocrinologist Alfred Jost 70 years ago indicated that the testes make testosterone and an anti-Müllerian hormone to maintain the Wolffian duct and suppress female tissue development. If those hormones are missing, the Wolffian duct degrades and an embryo by default develops as female, Jost proposed. That’s the story written in textbooks, says Amanda Swain, a developmental biologist at the Institute of Cancer Research in London. But the new study “demonstrates that females also have a pathway to make sure you don’t get the wrong ducts,” says Swain, who wrote a commentary in the same issue of Science. © Society for Science & the Public 2000 - 2017.

Keyword: Sexual Behavior
Link ID: 23976 - Posted: 08.19.2017

By NICHOLAS BAKALAR A handful of walnuts may be an effective weight loss tool. Walnuts are rich in omega-3 fatty acids and other substances and, in moderation, have been linked to reduced risk of obesity and diabetes. They may also efficiently reduce appetite. Researchers now may have found out why. They had nine hospitalized obese patients drink, on five consecutive days, either a smoothie containing 48 grams of walnuts (1.7 ounces, or about 14 walnut halves and 315 calories) or a placebo smoothie identical in taste and calorie content. Then, after a month on their regular diet, the patients returned for a second five-day trial, with placebo drinkers on the first trial receiving a walnut smoothie, and vice versa. The participants underwent M.R.I. brain exams while looking at pictures of high-fat food (cake, for example), low-fat food (vegetables) or neutral pictures of rocks and trees. The study, published in Diabetes, Obesity and Metabolism, found that when people looked at pictures of high-fat food, activation in the insula, a part of the brain involved in appetite and impulse control, increased among those who drank the walnut smoothie, but not among placebo drinkers. The study was funded in part by the California Walnut Commission. “Walnuts can alter the way our brains view food and impact our appetites,” said the lead author, Olivia M. Farr, of Beth Israel Deaconess Medical Center in Boston. “Our results confirm the current recommendations to include walnuts as part of a healthy diet.” © 2017 The New York Times Company

Keyword: Obesity
Link ID: 23975 - Posted: 08.19.2017

Laurel Hamers Scientists have traced the sensation of itch to a place you can’t scratch. The discomfort of a mosquito bite or an allergic reaction activates itch-sensitive nerve cells in the spinal cord. Those neurons talk to a structure near the base of the brain called the parabrachial nucleus, researchers report in the Aug. 18 Science. It’s a region that’s known to receive information about other sensations, such as pain and taste. The discovery gets researchers one step closer to finding out where itch signals ultimately end up. “The parabrachial nucleus is just the first relay center for [itch signals] going into the brain,” says study coauthor Yan-Gang Sun, a neuroscientist at the Chinese Academy of Sciences in Shanghai. Understanding the way these signals are processed by the brain could someday provide relief for people with chronic itch, Sun says. While the temporary itchiness of a bug bite is annoying, longer term, “uncontrollable scratching behavior can cause serious skin damage.” Previous studies have looked at the way an itch registers on the skin or how neurons convey those sensations to the spinal cord. But how those signals travel to the brain has been a trickier question, and this research is a “major step” toward answering it, says Zhou-Feng Chen, director of the Center for the Study of Itch at Washington University School of Medicine in St. Louis. © Society for Science & the Public 2000 - 2017.

Keyword: Pain & Touch
Link ID: 23972 - Posted: 08.18.2017

Researchers from the National Institutes of Health have identified a class of sensory neurons (nerve cells that electrically send and receive messages between the body and brain) that can be activated by stimuli as precise as the pulling of a single hair. Understanding basic mechanisms underlying these different types of responses will be an important step toward the rational design of new approaches to pain therapy. The findings were published in the journal Neuron. “Scientists know that distinct types of neurons detect different types of sensations, such as touch, heat, cold, pain, pressure, and vibration,” noted Alexander Chesler, Ph.D., lead author of the study and principal investigator with the National Center for Complementary and Integrative Health’s (NCCIH) Division of Intramural Research (DIR). “But they know more about neurons involved with temperature and touch than those underlying mechanical pain, like anatomical pain related to specific postures or activities.” In this study, Chesler and his colleagues used a novel strategy that combined functional imaging (which measures neuronal activity), recordings of electrical activity in the brain, and genetics to see how neurons respond to various stimuli. The scientists focused on a class of sensory neurons that express a gene called Calca, as these neurons have a long history in pain research. The scientists applied various stimuli to the hairy skin of mice cheeks, including gentle mechanical stimuli (air puff, stroking, and brushing), “high-threshold” mechanical stimuli (hair pulling and skin pinching), and temperature stimulation. They found that the target neurons belong to two broad categories, both of which were insensitive to gentle stimulation. The first was a well-known type of pain fiber—a polymodal nociceptor—that responds to a host of high intensity stimuli such as heat and pinching. The second was a unique and previously unknown type of neuron that responded robustly to hair pulling. 
They called this previously undescribed class of high-threshold mechanoreceptors (HTMRs) “circ-HTMRs,” due to the unusual nerve terminals these neurons made in skin. They observed that the endings of the fibers made lasso-like structures around the base of each hair follicle.

Keyword: Pain & Touch
Link ID: 23970 - Posted: 08.17.2017

By Catherine Offord | On August 21, the moon will pass between the Earth and the sun, resulting in a total solar eclipse visible across a large strip of the United States. Self-proclaimed eclipse-chaser Ralph Chou, an emeritus professor of optometry at the University of Waterloo, has been working to spread awareness about eye-safety during eclipses for around 30 years. Last year, he put together the American Astronomical Society’s technical guide to eye safety, aimed at everyone from astronomers to educators to medical professionals. The Scientist spoke to Chou to find out what happens to the eye when exposed to too much sunlight, and how to watch next week’s solar eclipse safely. Ralph Chou: Light comes into the eye and goes through all the various layers of cells until it reaches the photoreceptors—essentially, the bottom of a stack of cells. The photoreceptors themselves guide the light towards a specialized structure [of the cells] called the outer segment, where there is a stack of discs that contain the visual pigment. Under normal circumstances, the light would interact with the pigment, which generates an electrical signal that then starts the process of sending an impulse through the optic nerve to the brain. In looking at the sun, you have a very large volume of photons—light energy—coming in and hitting these pigment discs, and it’s more than they can really handle. In addition to generating the electrical signal, [the cell] also starts generating photo-oxidative compounds. So you’re getting oxidative species like hydroxyl radicals and peroxides that will go on to attack the cell’s organelles. © 1986-2017 The Scientist

Keyword: Vision
Link ID: 23969 - Posted: 08.17.2017