Links for Keyword: Intelligence



Links 1 - 20 of 291

By Alexander P. Burgoyne, David Z. Hambrick More than 60 years ago, Francis Crick and James Watson discovered the double-helical structure of deoxyribonucleic acid—better known as DNA. Today, for the cost of a Netflix subscription, you can have your DNA sequenced to learn about your ancestry and proclivities. Yet, while it is an irrefutable fact that the transmission of DNA from parents to offspring is the biological basis for heredity, we still know relatively little about the specific genes that make us who we are. That is changing rapidly through genome-wide association studies—GWAS, for short. These studies search for differences in people’s genetic makeup—their “genotypes”—that correlate with differences in their observable traits—their “phenotypes.” In a GWAS recently published in Nature Genetics, a team of scientists from around the world analyzed the DNA sequences of 78,308 people for correlations with general intelligence, as measured by IQ tests. The major goal of the study was to identify single nucleotide polymorphisms—or SNPs—that correlate significantly with intelligence test scores. Found in most cells throughout the body, DNA is made up of four molecules called nucleotides, referred to by their organic bases: cytosine (C), thymine (T), adenine (A), and guanine (G). Within a cell, DNA is organized into structures called chromosomes. Humans normally have 23 pairs of chromosomes, with one in each pair inherited from each parent. © 2017 Scientific American
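The core statistical move in a GWAS can be sketched in a few lines. The toy simulation below (hypothetical data and effect sizes, not the study's actual pipeline) correlates a genotype at a single SNP — coded as the number of variant-allele copies a person carries (0, 1, or 2) — with a simulated test-score phenotype:

```python
import random

random.seed(42)

# Pearson correlation between two equal-length lists.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical sample: each person's genotype at one SNP is the count
# of variant-allele copies (0, 1, or 2), and their phenotype is a test
# score with a tiny genetic effect buried in a lot of noise.
genotypes = [random.choice([0, 1, 2]) for _ in range(5000)]
scores = [100 + 0.5 * g + random.gauss(0, 15) for g in genotypes]

r = pearson_r(genotypes, scores)
print(round(r, 3))
```

A real GWAS repeats this kind of test across millions of SNPs, which is why the significance threshold must be set far stricter than the usual p < 0.05, and why per-SNP effects as small as the one simulated here require tens of thousands of participants to detect.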

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23986 - Posted: 08.23.2017

Susan Milius Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful for working a puzzle box and getting food later. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science. “The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle, of the University of Cambridge, said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science. In the wild, ravens cache some of their food, but that apparent foresight could be more of a specific adaptation that evolved with diet instead of as some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23835 - Posted: 07.14.2017

People with higher IQs are less likely to die before the age of 79. That’s according to a study of over 65,000 people born in Scotland in 1936. Each of the people in the study took an intelligence test at the age of 11, and their health was then followed for 68 years, until the end of 2015. When Ian Deary, of the University of Edinburgh, UK, and his team analysed data from the study, they found that a higher test score in childhood was linked to a 28 per cent lower risk of death from respiratory disease, a 25 per cent reduced risk of coronary heart disease, and a 24 per cent lower risk of death from stroke. These people were also less likely to die from injuries, digestive diseases, and dementia – even when factors like socio-economic status were taken into account. Deary’s team say there are several theories for why more intelligent people live longer, such as people with higher IQs being more likely to look after their health and less likely to smoke. They also tend to do more exercise and seek medical attention when ill. “I’m hoping it means that if we can find out what smart people do and copy them, then we have a chance of a slightly longer and healthier life,” says Deary. But there’s evidence genetics is involved too. A recent study suggests that very rare genetic variants can play an important role in lowering intelligence, and that these may also be likely to impair a person’s health. Journal reference: British Medical Journal, DOI: 10.1136/bmj.j2708 © Copyright New Scientist Ltd.

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 23786 - Posted: 06.29.2017

By Katie Langin No one likes a con artist. People avoid dealing with characters who have swindled them in the past, and—according to new research—birds avoid those people, too. Ravens, known for their intelligence and only slightly less for their love of cheese, were trained by researchers to trade a crust of bread for a morsel of cheese with human partners. When the birds then tried to broker a trade with “fair” and “unfair” partners—some completed the trade as expected, but others took the raven’s bread and kept (and ate) the cheese—the ravens avoided the tricksters in separate trials a month later. This suggests that ravens can not only differentiate between “fair” and “unfair” individuals, but also retain that ability for at least a month, the researchers write this month in Animal Behaviour. Ravens have a complex social life involving friendships and rivalries. Their ability to recognize and punish dishonest individuals, even after a single encounter, may help explain how cooperation evolved in this group of birds. For people, though, the moral of the story is simple: Be nice to ravens. © 2017 American Association for the Advancement of Science.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23709 - Posted: 06.06.2017

By David Z. Hambrick Physical similarities aside, we share a lot in common with our primate relatives. For example, as Jane Goodall famously documented, chimpanzees form lifelong bonds and show affection in much the same way as humans. Chimps can also solve novel problems, use objects as tools, and may possess “theory of mind”—an understanding that others may have different perspectives than oneself. They can even outperform humans in certain types of cognitive tasks. These commonalities may not seem all that surprising given what we now know from the field of comparative genomics: We share nearly all of our DNA with chimpanzees and other primates. However, social and cognitive complexity is not unique to our closest evolutionary cousins. In fact, it is abundant in species with which we would seem to have very little in common—like the spotted hyena. For more than three decades, the Michigan State University zoologist Kay Holekamp has studied the habits of the spotted hyena in Kenya’s Masai Mara National Reserve, once spending five years straight living in a tent among her oft-maligned subjects. One of the world’s longest-running studies of a wild mammal, this landmark project has revealed that spotted hyenas not only have social groups as complex as those of many primates, but are also capable of some of the same types of problem solving. This research sheds light on one of science’s greatest mysteries—how intelligence has evolved across the animal kingdom. According to the social brain hypothesis, intelligence has evolved to meet the demands of social life. The subject of many popular articles and books, this hypothesis posits that the complex information processing that goes along with coexisting with members of one’s own species—forming coalitions, settling disputes, trying to outwit each other, and so on—selects for larger brains and greater intelligence. 
By contrast, the cognitive buffer hypothesis holds that intelligence emerges as an adaptation to dealing with novelty in the environment, in whatever form it presents itself. © 2017 Scientific American

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 23685 - Posted: 05.31.2017

Carl Zimmer In a significant advance in the study of mental ability, a team of European and American scientists announced on Monday that they had identified 52 genes linked to intelligence in nearly 80,000 people. These genes do not determine intelligence, however. Their combined influence is minuscule, the researchers said, suggesting that thousands more are likely to be involved and still await discovery. Just as important, intelligence is profoundly shaped by the environment. Still, the findings could make it possible to begin new experiments into the biological basis of reasoning and problem-solving, experts said. They could even help researchers determine which interventions would be most effective for children struggling to learn. “This represents an enormous success,” said Paige Harden, a psychologist at the University of Texas, who was not involved in the study. For over a century, psychologists have studied intelligence by asking people questions. Their exams have evolved into batteries of tests, each probing a different mental ability, such as verbal reasoning or memorization. In a typical test, the tasks might include imagining an object rotating, picking out a shape to complete a figure, and then pressing a button as fast as possible whenever a particular type of word appears. Each test-taker may get varying scores for different abilities. But over all, these scores tend to hang together — people who score low on one measure tend to score low on the others, and vice versa. Psychologists sometimes refer to this similarity as general intelligence. It’s still not clear what in the brain accounts for intelligence. Neuroscientists have compared the brains of people with high and low test scores for clues, and they’ve found a few. Brain size explains a small part of the variation, for example, although there are plenty of people with small brains who score higher than others with bigger brains. © 2017 The New York Times Company
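The observation that test scores "hang together" — the statistical core of general intelligence — can be illustrated with a small simulation (hypothetical subtest names and numbers, not data from the study): if each subtest score reflects one shared ability plus test-specific noise, every pair of subtests comes out positively correlated.

```python
import random

random.seed(0)

# Pearson correlation between two equal-length lists.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical model: each person has one shared ability level g, and
# each subtest score is g plus noise specific to that subtest.
g = [random.gauss(0, 1) for _ in range(2000)]
names = ["verbal", "spatial", "speed"]
subtests = {name: [gi + random.gauss(0, 1) for gi in g] for name in names}

pairs = [("verbal", "spatial"), ("verbal", "speed"), ("spatial", "speed")]
correlations = {p: pearson_r(subtests[p[0]], subtests[p[1]]) for p in pairs}
for p, r in correlations.items():
    print(p, round(r, 2))
```

Under this toy model every pairwise correlation lands near 0.5 — a "positive manifold" of the kind psychologists summarize with a single general factor.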

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23650 - Posted: 05.23.2017

By Ian Randall René Descartes began with doubt. “We cannot doubt of our existence while we doubt. … I think, therefore I am,” the 17th century philosopher and scientist famously wrote. Now, modern scientists are trying to figure out what made the genius’s mind tick by reconstructing his brain. Scientists have long wondered whether the brains of geniuses (especially the shapes on their surfaces) could hold clues about their owners’ outsized intelligences. But most brains studied to date—including Albert Einstein’s—were actual brains. Descartes’s had unfortunately decomposed by the time scientists wanted to study it. So with techniques normally used for studying prehistoric humans, researchers created a 3D image of Descartes’s brain (above) by scanning the impression it left on the inside of his skull, which has been kept for almost 200 years now in the National Museum of Natural History in Paris. For the most part, his brain was surprisingly normal—its overall dimensions fell within regular ranges, compared with 102 other modern humans. But one part stood out: an unusual bulge in the frontal cortex, in an area which previous studies have suggested may process the meaning of words. That’s not to say this oddity is necessarily indicative of genius, the scientists report online in the Journal of the Neurological Sciences. Even Descartes might agree: “It is not enough to have a good mind,” he wrote. “The main thing is to use it well.” © 2017 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 19: Language and Lateralization; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 15: Brain Asymmetry, Spatial Cognition, and Language; Chapter 1: An Introduction to Brain and Behavior
Link ID: 23579 - Posted: 05.06.2017

Ian Sample Science editor Tempting as it may be, it would be wrong to claim that with each generation humans are becoming more stupid. As scientists are often so keen to point out, it is a bit more complicated than that. A study from Iceland is the latest to raise the prospect of a downwards spiral into imbecility. The research from deCODE, a genetics firm in Reykjavik, finds that groups of genes that predispose people to spend more years in education became a little rarer in the country from 1910 to 1975. The scientists used a database of more than 100,000 Icelanders to see how dozens of gene variants that affect educational attainment appeared in the population over time. They found a shallow decline over the 65-year period, implying a downturn in the natural inclination to rack up qualifications. But the genes involved in education affected fertility too. Those who carried more “education genes” tended to have fewer children than others. This led the scientists to propose that the genes had become rarer in the population because, for all their qualifications, better educated people had contributed less than others to the Icelandic gene pool. Spending longer in education and the career opportunities that provides is not the sole reason that better educated people tend to start families later and have fewer children, the study suggests. Many people who carried lots of genes for prolonged education left the system early and yet still had fewer children than the others. “It isn’t the case that education, or the career opportunities it provides, prevents you from having more children,” said Kari Stefansson, who led the study. “If you are genetically predisposed to have a lot of education, you are also predisposed to have fewer children.” © 2017 Guardian News and Media Limited

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 23113 - Posted: 01.17.2017

By PETER GODFREY-SMITH Around 2008, while snorkeling and scuba diving in my free time, I began watching the unusual group of animals known as cephalopods, the group that includes octopuses, cuttlefish and squid. The first ones I encountered were giant cuttlefish, large animals whose skin changes color so quickly and completely that swimming after them can be like following an aquatic, multi-armed television. Then I began watching octopuses. Despite being mollusks, like clams and oysters, these animals have very large brains and exhibit a curious, enigmatic intelligence. I followed them through the sea, and also began reading about them, and one of the first things I learned came as a shock: They have extremely short lives — just one or two years. I was already puzzled by the evolution of large brains in cephalopods, and this discovery made the questions more acute. What is the point of building a complex brain like that if your life is over in a year or two? Why invest in a process of learning about the world if there is no time to put that information to use? An octopus’s or cuttlefish’s life is rich in experience, but it is incredibly compressed. The particular puzzle of octopus life span opens up a more general one. Why do animals age? And why do they age so differently? A scruffy-looking fish that inhabits the same patch of sea as my cephalopods has relatives who live to 200 years of age. This seems extraordinarily unfair: A dull-looking fish lives for centuries while the cuttlefish, in their chromatic splendor, and the octopuses, in their inquisitive intelligence, are dead before they are 2? There are monkeys the size of a mouse that can live for 15 years, and hummingbirds that can live for over 10. Nautiluses (who are also cephalopods) can live for 20 years. A recent Nature paper reported that despite continuing medical advances, humans appear to have reached a rough plateau at around 115 years, though a few people will edge beyond it. 
The life spans of animals seem to lack all rhyme or reason. © 2016 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22951 - Posted: 12.05.2016

James Gorman The Goffin’s cockatoo is a smart bird, so smart it has been compared to a 3-year-old human. But even for this species, a bird named Figaro stands out for his creativity with tools. Hand-raised at the Veterinary University of Vienna, the male bird was trying to play with a pebble that fell outside his aviary onto a wooden beam about four years ago. First he used a piece of bamboo to try to rake the stone back in. Impressed, scientists in the university’s Goffin lab, which specializes in testing the thinking abilities of the birds, put a cashew nut where the pebble had been. Figaro extended his beak through the wire mesh to bite a splinter off the wooden beam. He used the splinter to fish the cashew in, a fairly difficult process because he had to work the splinter through the mesh and position it at the right angle. In later trials, Figaro made his tools much more quickly, and also picked a bamboo twig from the bottom of the aviary and trimmed it to make a similar tool. Cockatoos don’t do anything like this in nature, as far as anyone knows. They don’t use tools. They don’t even build nests, so they are not used to manipulating sticks. And they have curved bills, unlike the straight beaks of crows and jays that make manipulating tools a bit easier. Blue jays have been observed creating tools from newspaper to pull food pellets to them. Alice M.I. Auersperg, a researcher at the Veterinary University of Vienna who studies cognition in animals, and her colleagues reported those first accomplishments by Figaro in 2012. Since then, they have continued to test Figaro and other birds in the lab that were able to learn tool use or tool making, sometimes both, by watching Figaro. © 2016 The New York Times Company

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior
Link ID: 22892 - Posted: 11.21.2016

Bruce Bower Apes understand what others believe to be true. What’s more, they realize that those beliefs can be wrong, researchers say. To make this discovery, researchers devised experiments involving a concealed, gorilla-suited person or a squirreled-away rock that had been moved from their original hiding places — something the apes knew, but a person looking for King Kong or the stone didn’t. “Apes anticipated that an individual would search for an object where he last saw it, even though the apes knew that the object was no longer there,” says evolutionary anthropologist Christopher Krupenye. If this first-of-its-kind finding holds up, it means that chimpanzees, bonobos and orangutans can understand that others’ actions sometimes reflect mistaken assumptions about reality. Apes’ grasp of others’ false beliefs roughly equals that of human 2-year-olds tested in much the same way, say Krupenye of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and his colleagues. Considering their targeted gazes during brief experiments, apes must rapidly assess others’ beliefs about the world in wild and captive communities, the researchers propose in the October 7 Science. Understanding the concept of false beliefs helps wild and captive chimps deceive their comrades, such as hiding food from those who don’t share, Krupenye suggests. © Society for Science & the Public 2000 - 2016.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 22733 - Posted: 10.08.2016

By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. 
Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22674 - Posted: 09.21.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
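The logic behind the Linda problem is easy to verify by enumeration. A short sketch over a toy population (hypothetical counts, chosen only to make the arithmetic visible) shows why option B can never be more probable than option A:

```python
# The conjunction rule: for any two events, P(A and B) <= P(A).
# Every feminist bank teller is also a bank teller, so the set of
# people satisfying B is a subset of those satisfying A.
population = [
    {"teller": True, "feminist": True},
    {"teller": True, "feminist": False},
    {"teller": False, "feminist": True},
    {"teller": False, "feminist": False},
] * 25  # 100 hypothetical people, 25 of each kind

n = len(population)
p_a = sum(p["teller"] for p in population) / n                      # P(bank teller)
p_ab = sum(p["teller"] and p["feminist"] for p in population) / n   # P(teller AND feminist)

print(p_a, p_ab)    # prints 0.5 0.25
assert p_ab <= p_a  # holds for any population whatsoever
```

The inequality holds no matter how the four counts are chosen, which is exactly why the 85 percent of subjects who picked B were committing a logical error rather than expressing a defensible judgment.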

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 22666 - Posted: 09.19.2016

By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22659 - Posted: 09.15.2016

By Virginia Morell Fourteen years ago, a bird named Betty stunned scientists with her humanlike ability to invent and use tools. Captured from the wild and shown a tiny basket of meat trapped in a plastic tube, the New Caledonian crow bent a straight piece of wire into a hook and retrieved the food. Researchers hailed the observation as evidence that these crows could invent new tools on the fly—a sign of complex, abstract thought that became regarded as one of the best demonstrations of this ability in an animal other than a human. But a new study casts doubt on at least some of Betty’s supposed intuition. Scientists have long agreed that New Caledonian crows (Corvus moneduloides), which are found only on the South Pacific island of the same name, are accomplished toolmakers. At the time of Betty’s feat, researchers knew that in the wild these crows could shape either stiff or flexible twigs into tools with a tiny, barblike hook at one end, which they used to lever grubs from rotting logs. They also make rakelike tools from the leaves of the screw pine (Pandanus) tree. But Betty appeared to take things to the next level. Not only did she fashion a hook from a material she’d never previously encountered—a behavior not observed in the wild—she seemed to know she needed this specific shape to solve her particular puzzle. © 2016 American Association for the Advancement of Science.

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22538 - Posted: 08.10.2016

by Helen Thompson Pinky and The Brain's smarts might not be so far-fetched. Some mice are quicker on the uptake than others. While it might not lead to world domination, wits have their upside: a better shot at staying alive. Biologists Audrey Maille and Carsten Schradin of the University of Strasbourg in France tested reaction time and spatial memory in 90 African striped mice (Rhabdomys pumilio) over the course of a summer. For this particular wild rodent, surviving harsh summer droughts means making it to mating season in the early fall. The team saw some overall trends: Females were more likely to survive if they had quick reflexes, and males were more likely to survive if they had good spatial memory. Cognitive traits like reacting quickly and remembering the best places to hide are key to eluding predators during these tough times but may come with trade-offs for males and females. The results show that an individual mouse’s cognitive strengths are linked to its survival odds, suggesting that the pressure to survive can shape basic cognition, Maille and Schradin write August 3 in Biology Letters. © Society for Science & the Public 2000 - 2016

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22511 - Posted: 08.04.2016

Not much is definitively proven about consciousness, the awareness of one’s existence and surroundings, other than that it’s somehow linked to the brain. But theories as to how, exactly, grey matter generates consciousness are challenged when a fully conscious man is found to be missing most of his brain. Several years ago, a 44-year-old Frenchman went to the hospital complaining of mild weakness in his left leg. It was discovered then that his skull was filled largely by fluid, leaving just a thin perimeter of actual brain tissue. And yet the man was a married father of two and a civil servant with an IQ of 75, below-average in his intelligence but not mentally disabled. Doctors believe the man’s brain slowly eroded over 30 years due to a buildup of fluid in the brain’s ventricles, a condition known as “hydrocephalus.” His hydrocephalus was treated with a shunt, which drains the fluid into the bloodstream, when he was an infant. But it was removed when he was 14 years old. Over the following decades, the fluid accumulated, leaving less and less space for his brain. While this may seem medically miraculous, it also poses a major challenge for cognitive psychologists, says Axel Cleeremans of the Université Libre de Bruxelles.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 22430 - Posted: 07.13.2016

By Agata Blaszczak-Boxe People with higher levels of education may be more likely to develop certain types of brain tumors, a new study from Sweden suggests. Researchers found that women who completed at least three years of university courses were 23 percent more likely to develop a type of cancerous brain tumor called glioma, compared with women who only completed up to nine years of mandatory education and did not go to a university. And men who completed at least three years of university courses were 19 percent more likely to develop the same type of tumor, compared with men who did not go to a university. Though the reasons behind the link are not clear, "one possible explanation is that highly educated people may be more aware of symptoms and seek medical care earlier," and therefore are more likely to be diagnosed, said Amal Khanolkar, a research associate at the Institute of Child Health at the University College London and a co-author of the study. [Top 10 Cancer-Fighting Foods] In the study, the researchers looked at data on more than 4.3 million people in Sweden who were a part of the Swedish Total Population Register. The researchers tracked the people for 17 years, beginning in 1993, to see if they developed brain tumors during that time. They also collected information about the people's education levels, income, marital status and occupation. During the 17-year study, 5,735 men and 7,101 women developed brain tumors, according to the findings, published today (June 20) in the Journal of Epidemiology & Community Health. Copyright 2016 LiveScience

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 22346 - Posted: 06.22.2016

James Gorman There’s an aura of power around invasive species. How is it that they can sweep in and take over from the locals? Are they more adaptable, tougher? What are their secrets? The great-tailed grackle is a case in point. North America has its own similar species — the common and boat-tailed grackles. But the great-tailed bird, Quiscalus mexicanus, native to Central America, is one of the most invasive species in the United States. The black birds with iridescent feathers were prized by the Aztec emperor Auitzotl, who, by some accounts, relocated some of them from Veracruz to near Mexico City about 500 years ago. Over the past century or so the bird has spread north and its range is still expanding, particularly in the West, where it haunts cattle feedlots and big dairy farms. The birds are also quite happy in urban areas, like Santa Barbara, Calif., where Corina J. Logan captured and later released some grackles for recent experiments. Great-tailed grackles first caught the attention of Dr. Logan, now at Cambridge University, in 2004 when she was doing undergraduate research in Costa Rica. “They’ll actually walk right up and look you in the eye,” she said. “They look like they’re so smart.” Years later, having earned her Ph.D. at Cambridge, she decided to look more closely at them because she was interested in behavioral flexibility. Grackles, for example, might look under rocks at the beach for something to eat, or switch to discarded sandwich wrappers in a city park. © 2016 The New York Times Company

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:None
Link ID: 22343 - Posted: 06.21.2016

By David Z. Hambrick If you’re a true dog lover, you take it as one of life’s simple truths that all dogs are good, and you have no patience for scientific debate over whether dogs really love people. Of course they do. What else could explain the fact that your dog runs wildly in circles when you get home from work, and, as your neighbors report, howls inconsolably for hours on end when you leave? What else could explain the fact that your dog insists on sleeping in your bed, under the covers—in between you and your partner? At the same time, there’s no denying that some dogs are smarter than others. Not all dogs can, like a border collie mix named Jumpy, do a back flip, ride a skateboard, and weave through pylons on his front legs. A study published in the journal Intelligence by British psychologists Rosalind Arden and Mark Adams confirms as much. Consistent with over a century of research on human intelligence, Arden and Adams found that a dog that excels in one test of cognitive ability will likely excel in other tests of cognitive ability. In more technical terms, the study reveals that there is a general factor of intelligence in dogs—a canine “g” factor. For their study, Arden and Adams devised a battery of canine cognitive ability tests. All of the tests revolved around—you guessed it—getting a treat. In the detour test, the dog’s objective was to navigate around barriers arranged in different configurations to get to a treat. In the point-following test, a researcher pointed to one of two inverted beakers concealing a treat, and recorded whether the dog went to that beaker or the other one. Finally, the quantity discrimination test required the dog to choose between a small treat (a glob of peanut butter) and a larger one (the “correct” answer). Arden and Adams administered the battery to 68 border collies from Wales; all had been bred and trained to do herding work on a farm, and thus had similar backgrounds. © 2016 Scientific American
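The "g" factor reported by Arden and Adams reflects a statistical pattern: scores on the detour, point-following, and quantity-discrimination tests all correlate positively (a "positive manifold"), so a single latent factor captures much of the shared variance. As a minimal sketch of the idea (not the study's actual analysis or data), the snippet below simulates test scores for 68 dogs driven by a shared latent ability, then extracts a general factor as the first principal component of the correlation matrix; the test names, loadings, and noise levels are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 68  # same sample size as the study; all scores below are simulated

# Hypothetical latent general ability ("g") plus test-specific noise
g = rng.normal(size=n)
detour   = 0.7 * g + rng.normal(scale=0.7, size=n)
pointing = 0.6 * g + rng.normal(scale=0.8, size=n)
quantity = 0.5 * g + rng.normal(scale=0.9, size=n)
scores = np.column_stack([detour, pointing, quantity])

# Correlation matrix: all off-diagonal entries come out positive,
# the "positive manifold" that motivates a general factor
R = np.corrcoef(scores, rowvar=False)

# First principal component of R stands in for the general factor
eigvals, eigvecs = np.linalg.eigh(R)
loadings = eigvecs[:, -1]            # eigenvector of the largest eigenvalue
loadings *= np.sign(loadings.sum())  # fix sign so loadings read as positive

print(np.round(R, 2))
print("g-factor loadings:", np.round(loadings, 2))
```

Because every test loads positively on the first component, a dog scoring high on one test tends to score high on the others, which is exactly the pattern the study reports.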

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 22272 - Posted: 06.01.2016