Most Recent Links



Links 81 - 100 of 22535

Alan Yu Being overweight can raise your blood pressure, cholesterol and risk for developing diabetes. It could be bad for your brain, too. A diet high in saturated fats and sugars, the so-called Western diet, actually affects the parts of the brain that are important to memory and makes people more likely to crave the unhealthful food, says psychologist Terry Davidson, director of the Center for Behavioral Neuroscience at American University in Washington, D.C. He didn't start out studying what people ate. Instead, he was interested in learning more about the hippocampus, a part of the brain that's heavily involved in memory. He was trying to figure out which parts of the hippocampus do what. He did that by studying rats that had very specific types of hippocampal damage and seeing what happened to them. In the process, Davidson noticed something strange. The rats with the hippocampal damage would go to pick up food more often than the other rats, but they would eat a little bit, then drop it. Davidson realized these rats didn't know they were full. He says something similar may happen in human brains when people eat a diet high in fat and sugar. Davidson says there's a vicious cycle of bad diets and brain changes. He points to a 2015 study in the Journal of Pediatrics that found obese children performed more poorly on memory tasks that test the hippocampus compared with kids who weren't overweight. He says if our brain system is impaired by that kind of diet, "that makes it more difficult for us to stop eating that diet. ... I think the evidence is fairly substantial that you have an effect of these diets and obesity on brain function and cognitive function." © 2016 npr

Keyword: Obesity; Learning & Memory
Link ID: 23039 - Posted: 12.31.2016

By Nicole Mortillaro Post-traumatic stress disorder can be a debilitating condition. It's estimated that it affects nearly one in 10 Canadian veterans who served in Afghanistan. Now, there's promising research that could lead to treatments for the disorder. Following a particularly traumatic event — one where there is the serious threat of death or a circumstance that was overwhelming — we often exhibit physical symptoms immediately. But the effects in our brains actually take some time to form. That's why symptoms of PTSD — reliving an event, nightmares, anxiety — don't show up until some time later. Research has shown that, after such an event, the hippocampus — which is important in dealing with emotions and memory — shrinks, while our amygdala — also important to memory and emotions — becomes hyperactive. In earlier research, Sumantra Chattarji from the National Centre for Biological Sciences (NCBS) and the Institute for Stem Cell Biology and Regenerative Medicine (inStem), in Bangalore, India, discovered that traumatic events cause new nerve connections to form in the amygdala, which also causes hyperactivity. This plays a crucial role in people dealing with post-traumatic stress disorder. Chattarji has been studying changes in the brain after traumatic events for more than a decade. In an earlier study, he concluded that a single stress event had no immediate effect on the amygdala of rats. However, 10 days later, the rats exhibited increased anxiety. There were even changes to the brain, and, in particular, the amygdala. So Chattarji set out to see if there was a way to prevent these changes. The new research focused on a particular cell receptor in the brain, called N-Methyl-D-Aspartate Receptor (NMDA-R), which is crucial in forming memories. ©2016 CBC/Radio-Canada.

Keyword: Stress; Learning & Memory
Link ID: 23038 - Posted: 12.31.2016

By Alice Callahan Can psychiatric medications alter the mother-baby bond? I am having a baby in a month and am on an antidepressant, antipsychotic and mood stabilizer. I don't feel a natural instinct to mother or connect to my baby yet. Could it be because of my medications? It’s normal for expectant parents to worry if they don’t feel a strong connection to the baby right away. “Those kinds of mixed fears and anxieties are really common in most pregnancies, certainly first pregnancies,” said Dorothy Greenfeld, a licensed clinical social worker and professor of obstetrics and gynecology at Yale School of Medicine. Bonding is a process that takes time, and while it can begin in pregnancy, the relationship between parent and child mostly develops after birth. Psychiatric conditions, and the medicines used to treat them, can complicate the picture. Antidepressants, the most widely used class of psychiatric drugs, do not seem to interfere with a woman’s attachment to the fetus during pregnancy, as measured by the amount of time the mother spends thinking about and planning for the baby, a 2011 study in the Archives of Women’s Mental Health found. On the other hand, the study found that women with major depression in pregnancy had lower feelings of maternal-fetal attachment, and this sense of disconnection intensified with more severe symptoms of depression. “Depression can definitely affect a person’s ability to bond with their baby, to feel those feelings of attachment, which is why we encourage treatment so strongly,” said Dr. Amy Salisbury, the study leader and a professor of pediatrics and psychiatry at the Alpert Medical School at Brown University. “That’s more likely to interfere than the medication itself.” There is less research on the effects of other types of mental health medications on mother-baby bonding, but psychiatric medications can have side effects that might interfere with parenting. 
For example, a small percentage of people taking mood-stabilizing medications have feelings of apathy, and that could hinder the bonding process, said Dr. Salisbury. And some mental health medications, depending on dosage and combination, might make a person feel too sedated. But again, letting mental illness go untreated is likely far riskier for both the mother and the baby. © 2016 The New York Times Company

Keyword: Depression; Sexual Behavior
Link ID: 23037 - Posted: 12.31.2016

By Laura Beil Justin Shamoun began to hate his body a few weeks into seventh grade. He was a year younger than his suburban Detroit classmates, having skipped a grade. Many of his peers were entering puberty, their bodies solidifying into sleek young men. Justin still had the doughy build of a boy. After gym class one day, someone told Justin he could probably run faster if he weren’t so fat. The remark crushed him. Ashamed, he started hiding his body under ever-baggier clothes and making excuses to skip P.E., the pool, anywhere required to expose bare skin. Finally, he decided to fix himself. He dove headlong into sports and cut back on food. Before long, he was tossing his lunch into the garbage and picking at his dinner. He ate just enough to blunt his hunger, until the time came when he ate barely at all. The thought that he had an eating disorder never occurred to him. Long considered an affliction of women, eating disorders — the most deadly of all mental illnesses — are increasingly affecting men. The National Eating Disorders Association predicts that 10 million American men alive today will be affected, but that number is only an estimate based on the limited research available. The official criteria for diagnosing eating disorders were updated to be more inclusive of men only in 2013. And last year, Australian researchers writing in the Journal of Eating Disorders noted that “the prevalence of extreme weight control behaviors, such as extreme dietary restriction and purging” may be increasing at a faster rate in men than women. © 2016 Scientific American

Keyword: Anorexia & Bulimia; Sexual Behavior
Link ID: 23036 - Posted: 12.31.2016

By BENEDICT CAREY She was all there, all the time: exuberant in describing her mania, savage and tender when recalling her despair. And for decades, she gracefully wore the legacy of her legendary role as Princess Leia, worshiped by a generation of teenage girls as the lone female warrior amid the galactic male cast of the “Star Wars” trilogy. In her long, openhearted life, the actress and author Carrie Fisher brought the subject of bipolar disorder into the popular culture with such humor and hard-boiled detail that her death on Tuesday triggered a wave of affection on social media and elsewhere, from both fans and fellow bipolar travelers, whose emotional language she knew and enriched. She channeled the spirit of people like Patty Duke, who wrote about her own bipolar illness, and Kitty Dukakis, who wrote about depression and alcoholism, and turned it into performance art. Ms. Fisher’s career coincided with the growing interest in bipolar disorder itself, a mood disorder characterized by alternating highs and lows, paralyzing depressions punctuated by flights of exuberant energy. Her success fed a longstanding debate on the relationship between mental turmoil and creativity. And her writing and speaking helped usher in a confessional era in which mental disorders have entered the pop culture with a life of their own: Bipolar is now a prominent trait of another famous Carrie, Claire Danes’s character Carrie Mathison in the Showtime television series “Homeland.” “She was so important to the public because she was telling the truth about bipolar disorder, not putting on airs or pontificating, just sharing who she is in an honest-to-the-bone way,” said Judith Schlesinger, a psychologist and author of “The Insanity Hoax: Exposing the Myth of the Mad Genius.” © 2016 The New York Times Company

Keyword: Schizophrenia
Link ID: 23035 - Posted: 12.29.2016

By Heather M. Snyder For more than 25 years, Mary Read was a successful nurse in Lititz, Pennsylvania. But in 2010, at the age of 50, she started having trouble with her memory and thinking, making it difficult for her to complete routine tasks and follow instructions at work. The problems worsened, bringing her career to an abrupt end. In 2011, her doctor conducted a comprehensive evaluation, including a cognitive assessment, and found that she was in the early stages of younger-onset Alzheimer’s, which affects hundreds of thousands of people under 65. A year earlier, Elizabeth Wolf faced another sort of upheaval. The 36-year-old community health program director was forced to abandon her own career, home and community in Vermont when both of her parents were diagnosed with Alzheimer’s three months apart. Wolf made the difficult decision to move back into her childhood home in Mount Laurel, New Jersey in order to become their primary caregiver. These stories are not unusual. Alzheimer’s dementia disproportionately affects women in a variety of ways. Two and a half times as many women as men provide 24-hour care for an affected relative. Nearly 19 percent of these wives, sisters and daughters have had to quit work to do so. In addition, women make up nearly two-thirds of the more than 5 million Americans living with Alzheimer’s today. According to the Alzheimer’s Association 2016 Alzheimer’s Disease Facts and Figures, an estimated 3.3 million women aged 65 and older in the United States have the disease. To put that number in perspective, a woman in her sixties is now about twice as likely to develop Alzheimer’s as breast cancer within her lifetime. © 2016 Scientific American

Keyword: Alzheimers; Sexual Behavior
Link ID: 23034 - Posted: 12.29.2016

Ian Sample Science editor The first subtle hints of cognitive decline may reveal themselves in an artist’s brush strokes many years before dementia is diagnosed, researchers believe. The controversial claim is made by psychologists who studied renowned artists, from the founder of French impressionism, Claude Monet, to the abstract expressionist Willem de Kooning. While Monet aged without obvious mental decline, de Kooning was diagnosed with Alzheimer’s disease more than a decade before his death in 1997. Alex Forsythe at the University of Liverpool analysed more than 2,000 paintings from seven famous artists and found what she believes are progressive changes in the works of those who went on to develop Alzheimer’s. The changes became noticeable when the artists were in their 40s. Though intriguing, the small number of artists involved in the study means the findings are highly tentative. While Forsythe said the work does not point to an early test for dementia, she hopes it may open up fresh avenues for investigating the disease. The research provoked mixed reactions from other scientists. Richard Taylor, a physicist at the University of Oregon, described the work as a “magnificent demonstration of art and science coming together”. But Kate Brown, a physicist at Hamilton College in New York, was less enthusiastic and dismissed the research as “complete and utter nonsense”. © 2016 Guardian News and Media Limited

Keyword: Alzheimers
Link ID: 23033 - Posted: 12.29.2016

By KEVIN DEUTSCH An anesthetic commonly used for surgery has surpassed heroin to become the deadliest drug on Long Island, killing at least 220 people there in 2016, according to medical examiners’ records. The drug, fentanyl, is a synthetic opioid, which can be 100 times more potent than morphine. The numbers from Long Island are part of a national pattern, as fentanyl fatalities have already surpassed those from heroin in other parts of the country, including New England, as its use has skyrocketed. Part of the reason for the increase is economic — because fentanyl can be manufactured in a lab, it is much cheaper and easier to produce than heroin, which must be cultivated from poppies. In New York City, more than 1,000 people are expected to die from drug overdoses this year — the first recorded four-digit death total in city history, according to statistics compiled by the Department of Health and Mental Hygiene. Nearly half of all unintentional drug overdose deaths in the city since July have involved fentanyl, the health department said. The medical examiners of Long Island’s two counties, Nassau and Suffolk, compiled the new numbers. “Fentanyl has surpassed heroin as the most commonly detected drug in fatal opioid overdoses,” Dr. Michael J. Caplan, the Suffolk County medical examiner, said in a written statement about the statistics, which were obtained by The New York Times ahead of their release. “The influx of illicitly manufactured fentanyl from overseas is a nationwide issue that requires a multidisciplinary intervention from all levels of government.” Nationwide, recorded deaths from opioids surpassed 30,000 in 2015, according to data compiled by the Centers for Disease Control and Prevention. And overdoses caused by synthetic opioids like fentanyl increased by 72.2 percent in 2015 over 2014 — one of the deadliest year-over-year surges for any drug in United States history, the same data shows. © 2016 The New York Times Company

Keyword: Drug Abuse; Pain & Touch
Link ID: 23032 - Posted: 12.29.2016

Perry Link People who study other cultures sometimes note that they benefit twice: first by learning about the other culture and second by realizing that certain assumptions of their own are arbitrary. In reading Colin McGinn’s fine recent piece, “Groping Toward the Mind,” in The New York Review, I was reminded of a question I had pondered in my 2013 book Anatomy of Chinese: whether some of the struggles in Western philosophy over the concept of mind—especially over what kind of “thing” it is—might be rooted in Western language. The puzzles are less puzzling in Chinese. Indo-European languages tend to prefer nouns, even when talking about things for which verbs might seem more appropriate. The English noun inflation, for example, refers to complex processes that were not a “thing” until language made them so. Things like inflation can even become animate, as when we say “we need to combat inflation” or “inflation is killing us at the check-out counter.” Modern cognitive linguists like George Lakoff at Berkeley call inflation an “ontological metaphor.” (The inflation example is Lakoff’s.) When I studied Chinese, though, I began to notice a preference for verbs. Modern Chinese does use ontological metaphors, such as fāzhǎn (literally “emit and unfold”) to mean “development” or xìnxīn (“believe mind”) for “confidence.” But these are modern words that derive from Western languages (mostly via Japanese) and carry a Western flavor with them. “I firmly believe that…” is a natural phrase in Chinese; you can also say “I have a lot of confidence that…” but the use of a noun in such a phrase is a borrowing from the West. © 1963-2016 NYREV, Inc

Keyword: Consciousness; Language
Link ID: 23031 - Posted: 12.28.2016

Morwenna Ferrier Is my face attractive? Don’t answer that. Not because I’m ducking out of this, but because you can’t. Attractiveness is subjective, perhaps the most subjective question of all; that we outsource the answer to Google (and we do, in our droves) is ironic since it depends on a bias that is impossible to unpack. Yet in searching the internet for an answer, it also reveals the question to be one of the great existential tensions of our time. Because, as we all know, being attractive is absolutely 100% the A-road to happiness. If you are Googling to rate your attractiveness, then you are probably working on the assumption that you aren’t. You’re also, possibly, more vulnerable and susceptible to being told that you aren’t. In short, you’re a sitting duck, someone who had a sore throat and who asked good old Dr Google for advice only to be told it was cancer. Still, it’s only in investigating precisely why Google is the last person you should ask – being a search engine therefore insentient – that you can start cobbling together an idea of what attractiveness really is. It’s worth starting with semantics. Beauty is not attractiveness and vice versa, though we commonly confuse the two. Beauty (arguably) has a template against which we intuit and against which we measure ourselves. It is hinged around genetics and a particular look associated with this politically correct (and largely western-governed) model. Darwin wouldn’t agree: “It is certainly not true that there is in the mind of man any universal standards of beauty with respect to the human body,” he said. But a lot has changed since his time. © 2016 Guardian News and Media Limited

Keyword: Sexual Behavior
Link ID: 23030 - Posted: 12.28.2016

By Ben Andrew Henry Traveling from the forests and fields of Europe to the grasslands south of the Sahara desert is a monumental trip for anyone, and especially for a diminutive insect. Yet every year, populations of the painted lady (Vanessa cardui) butterfly make that journey over the course of several generations. The logistics of this migratory feat had been the subject of speculation for some time, but were never fully understood, in part because of the difficulty of tracking the tiny insects across long distances. In a study published October 4 in Biology Letters, researchers reported having measured the isotopic composition of butterfly wings in Europe and south of the Sahara. Since the fraction of heavy hydrogen isotopes in the environment varies geographically, the team used its analysis to identify the origins of the captured butterflies, confirming that groups of butterflies in the Sahara did originate in Europe. The butterflies do not linger in Africa long. They most likely make their trip, the authors suggested, to take advantage of the burst of productivity in the tropical savannah that follows the rainy season—and to breed the generation that will start the trip back. Europe’s freshwater eels (Anguilla anguilla) live out their days in rivers and streams, but they never spawn there. Massive catches of larval eels in the Sargasso Sea tipped researchers off a century ago that eels must spawn in the swirling mid-Atlantic gyre of free-floating seaweed and then migrate to Europe. Eels leave their homes in the late fall, but other than that, the details of their journey have been a mystery. © 1986-2016 The Scientist

Keyword: Animal Migration
Link ID: 23029 - Posted: 12.28.2016

by Bethany Brookshire An opioid epidemic is upon us. Prescription painkillers such as fentanyl and morphine can ease terrible pain, but they can also cause addiction and death. The Centers for Disease Control and Prevention estimates that nearly 2 million Americans are abusing or addicted to prescription opiates. Politicians are attempting to stem the tide at state and national levels, with bills to change and monitor how physicians prescribe painkillers and to increase access to addiction treatment programs. Those efforts may make access to painkillers more difficult for some. But pain comes to everyone eventually, and opioids are one of the best ways to make it go away. Morphine is the king of pain treatment. “For hundreds of years people have used morphine,” says Lakshmi Devi, a pharmacologist at the Icahn School of Medicine at Mount Sinai in New York City. “It works, it’s a good drug, that’s why we want it. The problem is the bad stuff.” The “bad stuff” includes tolerance — patients have to take higher and higher doses to relieve their pain. Drugs such as morphine depress breathing, an effect that can prove deadly. They also cause constipation, drowsiness and vomiting. But “for certain types of pain, there are no medications that are as effective,” says Bryan Roth, a pharmacologist and physician at the University of North Carolina at Chapel Hill. The trick is constructing a drug with all the benefits of an opioid painkiller, and few to none of the side effects. Here are three ways that scientists are searching for the next big pain buster, and three of the chemicals they’ve turned up. © Society for Science & the Public 2000 - 2016

Keyword: Pain & Touch; Drug Abuse
Link ID: 23028 - Posted: 12.27.2016

By GINA KOLATA It was Oct. 11, 2015, and a middle-aged man and a young woman, both severely obese, were struggling with the same lump-in-the-throat feeling. The next day they were going to have an irreversible operation. Were they on the threshold of a new beginning or a terrible mistake? They were strangers, scheduled for back-to-back bariatric surgery at the University of Michigan with the same doctor. He would cut away most of their stomachs and reroute their small intestines. They were almost certain to lose much of their excess weight. But despite the drastic surgery, their doctor told them it was unlikely that they would ever be thin. Nearly 200,000 Americans have bariatric surgery each year. Yet far more — an estimated 24 million — are heavy enough to qualify for the operation, and many of them are struggling with whether to have such a radical treatment, the only one that leads to profound and lasting weight loss for virtually everyone who has it. Most people believe that the operation simply forces people to eat less by making their stomachs smaller, but scientists have discovered that it actually causes profound changes in patients’ physiology, altering the activity of thousands of genes in the human body as well as the complex hormonal signaling from the gut to the brain. It often leads to astonishing changes in the way things taste, making cravings for a rich slice of chocolate cake or a bag of White Castle hamburgers simply vanish. Those who have the surgery naturally settle at a lower weight. © 2016 The New York Times Company

Keyword: Obesity
Link ID: 23027 - Posted: 12.27.2016

By GINA KOLATA Bariatric surgery is an option that obesity medicine specialists say is too often ignored or dismissed. Yet it is the only option that almost always works to help very heavy people lose a lot of weight and that also can mysteriously make some chronic conditions vanish. Here are some answers about bariatric surgery and what it does.

HOW MANY AMERICANS ARE ELIGIBLE FOR BARIATRIC SURGERY? Twenty-four million, according to the American Society for Metabolic and Bariatric Surgery. The criteria are a body mass index above 40, or a B.M.I. of at least 35 along with other medical conditions like diabetes, hypertension, sleep apnea or acid reflux.

HOW MANY HAVE THE SURGERY EACH YEAR? Fewer than 200,000.

WHAT ARE THE OPERATIONS? There are four in use today. The two most popular procedures are the Roux-en-Y gastric bypass and the gastric sleeve. Both make the stomach smaller. The bypass also reroutes the small intestine. A simpler procedure, the gastric band, is less effective and has fallen out of favor. And a much more drastic operation, the biliopancreatic diversion with duodenal switch, which bypasses a large part of the small intestine, is rarely used because it has higher mortality and complication rates.

HOW MUCH DO THE OPERATIONS COST? The average cost of a sleeve gastrectomy is $16,000 to $19,000, and the average cost of a gastric bypass is $20,000 to $25,000. Most insurance plans cover the cost for patients who qualify, though some plans require that patients try dieting for a certain amount of time first.

DOES THE SURGERY SAVE MONEY ON OTHER HEALTH CARE COSTS IN THE END? © 2016 The New York Times Company

Keyword: Obesity
Link ID: 23026 - Posted: 12.27.2016

By Sheryl Ubelacker, The Canadian Press Peter Chaban was up early doing dishes one morning in 2012 when he noticed there was water flowing over his hand — but he couldn't feel it. Next thing he knew, he lost all sensation and strength on his left side and dropped to the floor. Within seconds he was lying there completely immobilized. By the time the ambulance arrived at his vacation property near Collingwood, Ont., Chaban had recovered. But doctors at the local hospital diagnosed him with a probable transient ischemic attack, or TIA, a type of temporary stroke that leaves no permanent damage. Once he returned home to Toronto, Chaban was sent for an MRI, and the brain scan confirmed that diagnosis. But of more concern was the discovery of "quite a few" lesions in his brain, the result of "silent strokes" that show up as small holes on imaging. When the strokes had occurred and over what time period was a mystery to Chaban, who had experienced no symptoms. That's why, in fact, they're known as silent — patients have no idea they've had a miniature clot or microbleed in the brain that has destroyed a tiny chunk of neurons, but resulted in no loss of function as would typically occur with a full-blown stroke. "I was never aware of any deficits," said Chaban, 64, who retired from his research job at the Hospital for Sick Children three years ago. "When I was employed, I was quite cognitively active. "I was physically very active. I ski, play golf, I played squash until a few years ago. And my health is very good, so the silent strokes hadn't expressed themselves, at least to my awareness." ©2016 CBC/Radio-Canada.

Keyword: Stroke
Link ID: 23025 - Posted: 12.27.2016

As 2016 draws to a close, we are re-visiting some of the people we met this year — including one man who survived a stroke at a young age, and a listener who heard his story on the radio.

DAVID GREENE, HOST: Now as 2016 draws to a close, we're revisiting some of the people we met this year. And NPR's Rae Ellen Bichell checks back with a man who survived a stroke in his 40s and also a listener who heard his story.

RAE ELLEN BICHELL, BYLINE: Back in February, I reported a story about strokes increasing in adults under 50. Troy Hodge, a 43-year-old man living in Maryland, shared his story about having a stroke two years earlier.

(SOUNDBITE OF ARCHIVED BROADCAST)

TROY HODGE: I remember setting myself on the floor because I was really hot. And I wanted to get some water to splash on my face.

BICHELL: When the story aired on MORNING EDITION, the radio waves carried Hodge's voice into the home of Sue Bryson, a teacher in Virginia.

SUE BRYSON: It was just a normal Monday morning and I was just getting ready for work and I was listening to NPR.

BICHELL: Listening to Hodge's story, Bryson realized that right then, she was having similar symptoms, that she was having a stroke. So she called her neighbors and they took her to the emergency room.

BRYSON: I would have never gone to the hospital if I didn't hear your show - never.

BICHELL: Bryson is now back in the classroom and Hodge has made some changes. He moved into a bigger apartment. He walks up a flight of stairs each day without his cane to check the mail. He sometimes forgets things.

HODGE: Memory's not too bad, I mean, it's... © 2016 npr

Keyword: Stroke
Link ID: 23024 - Posted: 12.27.2016

Anna Gorman Rosemary Navarro was living in Mexico when her brother called from California. Something wasn't right with their mom, then in her early 40s. She was having trouble paying bills and keeping jobs as a food preparer in convalescent homes. Navarro, then 22, sold her furniture to pay for a trip back to the U.S. for herself and her two young children. Almost as soon as she arrived, she knew her mother wasn't the same person. "She was there but sometimes she wasn't there," she said. "I thought, 'Oh man this isn't going to be good.' " Before long, Navarro was feeding her mom, then changing her diapers. She put a special lock on the door to keep her from straying outside. Unable to continue caring for her, Navarro eventually moved her mom to a nursing home, where she spent eight years. Near the end, her mom, a quiet woman who had immigrated to the U.S. as a teenager and loved telenovelas, could communicate only by laughing or crying. Navarro was there when she took her last breath in 2009, at age 53. "What I went through with my mom I wouldn't wish on anyone," she said. It has happened again and again in her family — relatives struck by the same terrible disease, most without any clue what it was. An aunt, an uncle, a cousin, a grandfather, a great grandfather. "Too many have died," Navarro said. All in their early 50s. © 2016 npr

Keyword: Alzheimers
Link ID: 23023 - Posted: 12.27.2016

By Stephen L. Macknik Masashi Atarashi, a physics high school teacher from Japan, submitted this wonderful winter illusion to the 2015 Best Illusion of the Year Contest, where it competed as a finalist. Atarashi discovered this effect serendipitously, while watching the snow fall through the venetian window blinds of his school’s faculty lounge—just like his students must sometimes do in the classroom during a lecture! Notice that as the blinds occupy more area on the screen, the speed of the snowfall seems to accelerate. A great illusion to ponder during our white holiday season. Nobody knows how Atarashi’s effect works, but our working hypothesis is that each time the snow disappears behind a blind, or reappears below it, it triggers transient increases in the activity of your visual system’s motion-sensitive neurons. Such transient surges in neural activity are perhaps misinterpreted by your brain as faster motion speed. © 2016 Scientific American,

Keyword: Vision
Link ID: 23022 - Posted: 12.27.2016

By Susana Martinez-Conde, Stephen L. Macknik We think we know what we want—but do we, really? In 2005 Lars Hall and Petter Johansson, both at Lund University in Sweden, ran an experiment that transformed how cognitive scientists think about choice. The experimental setup looked deceptively simple. A study participant and researcher faced each other across a table. The scientist offered two photographs of young women deemed equally attractive by an independent focus group. The subject then had to choose which portrait he or she found more appealing. Next, the experimenter turned both pictures over, moved them toward the subjects and asked them to pick up the photo they just chose. Subjects complied, unaware that the researcher had just performed a swap using a sleight-of-hand technique known to conjurers as black art. Because your visual neurons are built to detect and enhance contrast, it is very hard to see black on black: a magician dressed in black against a black velvet backdrop can look like a floating head. Hall and Johansson deliberately used a black tabletop in their experiment. The first photos their subjects saw all had black backs. Behind those, however, they hid a second picture of the opposite face with a red back. When the experimenter placed the first portrait face down on the table, he pushed the second photo toward the subject. When participants picked up the red-backed photos, the black-backed ones stayed hidden against the table's black surface—that is, until the experimenter could surreptitiously sweep them into his lap. © 2016 Scientific American

Keyword: Consciousness; Attention
Link ID: 23021 - Posted: 12.26.2016

By Drake Baer Convergent evolution is what happens when nature takes different courses from different starting points to arrive at similar results. Consider bats, birds, and butterflies developing wings; sharks and dolphins finding fins; and echidnas and porcupines sporting spines. Or, if you want to annoy a traditionalist scientist, talk about humans and octopuses — and how they may both have consciousness. This is the thrust of Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness, a new book by the scuba-diving, biology-specializing philosopher Peter Godfrey-Smith, originally of Australia and now a distinguished professor at the City University of New York’s graduate center. The book was written up by Olivia Judson in The Atlantic, and you should read the whole thing, but what I find mesmerizing is how categorically other the eight-tentacled ink-squirters are, and how their very nature challenges our conceptualizations of intelligence. “If we can make contact with cephalopods as sentient beings, it is not because of a shared history, not because of kinship, but because evolution built minds twice over,” Godfrey-Smith is quoted as saying. “This is probably the closest we will come to meeting an intelligent alien.” (He’s not the first to think so: The Hawaiian creation myth holds that octopuses are the only creatures left over from an earlier incarnation of the Earth, making them more proto-terrestrials than extraterrestrials.) © 2016, New York Media LLC.

Keyword: Evolution; Learning & Memory
Link ID: 23020 - Posted: 12.26.2016