Chapter 13. Memory, Learning, and Development
by Bethany Brookshire There are times when science is a painful experience. My most excruciating moment in science involved a heated electrode placed on my bare leg. This wasn’t some sort of graduate school hazing ritual. I was a volunteer in a study to determine how we process feelings of pain. As part of the experiment I was exposed to different levels of heat and asked how painful I thought they were. When the electrode was removed, I eagerly asked how my pain tolerance compared with that of others. I secretly hoped that I was some sort of superwoman, capable of feeling pain that would send other people into screaming fits and of brushing it off with a stoic grimace. It turns out, however, that I was a bit of a wuss. Ouch. I figured I could just blame my genes. About half of our susceptibility to pain can be explained by genetic differences. The other half, however, remains up for grabs. And a new study published February 4 in Nature Communications suggests that part of our susceptibility to pain might lie in chemical markers on our genes. These “notes” on your DNA, known as epigenetic changes, can be affected by environment, behavior and even diet. So the findings reveal that our genetic susceptibility to pain might not be our destiny. Tim Spector and Jordana Bell, genetic epidemiologists at King’s College London, were interested in the role of the epigenome in pain sensitivity. Epigenetic changes such as the addition (or subtraction) of a methyl group on a gene make that gene more or less likely to be used in a cell by altering how much protein can be made from it. These differences in proteins can affect everything from obesity to memory to whether you end up like your mother. © Society for Science & the Public 2000 - 2013.
By Roy H. Hamilton and Jihad Zreik It's hard to imagine anyone, no matter how brilliant, who doesn't yearn to be even smarter. Thanks to recent advances in neural science, that wish may come true. Researchers are finding ways to rev up the human brain like never before. There would be just one question: Do we really want to inhabit that world? It may be too late to ask. Modern society has already embraced the basic idea of fine-tuning our intellects via artificial procedures—what might be termed “cosmetic” neurology. Schoolchildren take Adderall, Concerta and other attention-focusing medications. Parents and teachers rely on antidepressants and antianxiety drugs. And self-help books offer the latest advances in neuroscience to help ordinary people think faster and sharper. Add to those advances another cognitive-enhancement method: transcranial direct-current stimulation (tDCS). With this technique, electrodes applied to the scalp deliver minuscule amperages of current to the brain. This trickle of electricity seems to cause incremental adjustments in the electrical potentials of membranes in the neurons closest to the electrodes, increasing or decreasing their likelihood of firing. And that, in turn, induces measurable changes in memory, language, mood, motor function, attention and other cognitive domains. Investigators still aren't sure whether tDCS can cause long-term neural changes. Although most tests show only transient effects, there is limited evidence that repeated applications might have more persistent results. The procedure is not approved by the U.S. Food and Drug Administration, and the consensus among experts is that it should be performed only under qualified supervision. Nevertheless, if used properly, it is safe, portable, easy to implement and inexpensive. © 2014 Scientific American
By Annie Sneed Alzheimer’s disease is now the sixth leading cause of death in the U.S., but researchers still do not know what causes the degenerative neurological disorder. In recent years they have pinpointed several genes that seem largely responsible for those cases in which the disorder develops early on, prior to age 60. They have also identified about 20 genes that can increase or decrease risk for the more common late-onset variety that starts appearing in people older than 60. But genetics simply cannot explain the whole picture for the over five million Americans with late-onset Alzheimer’s. Whereas genetics contribute some risk of developing this version of the disorder, no combination of genes inevitably leads to the disease. Scientists are now urgently searching for the other missing pieces to explain what causes late-onset Alzheimer’s. Some researchers have shifted their attention from genes to the environment—especially to certain toxins. Their studies of pesticides, food additives, air pollution and other problematic compounds are opening a new front in the battle against this devastating malady. Here’s a roundup of some of the possibilities being studied: Scientists have already found a strong potential link between pesticides and Parkinson’s disease. Now, a preliminary study released in January suggests that the pesticide DDT, which degrades so slowly that it continues to linger in the environment more than 40 years after the U.S. Environmental Protection Agency banned its use in the U.S., may also contribute to Alzheimer’s. © 2014 Scientific American
By Daniel Engber Drop a mouse in some water and white paint, and it will know just what to do. Mice can swim, by whipping their tails like a flagellum, but they don't like doing it; a mouse in a tub tries to find a way out. There's no need for training, or food pellets, or annoying electric shocks: To put a mouse through a water maze, you need only to build a little platform for it, hidden somewhere just beneath the surface. The mouse will try to find that platform without any encouragement. It's a setup that's so simple—and so useful in measuring an animal's capacity for learning and memory—it hardly seems like it would need inventing. But it took a cognitive neuroscientist at the University of St. Andrews in Scotland to come up with the tub-and-platform method. In 1979, Richard Morris built a heated pool about 4 feet 3 inches in diameter, filled it with water and fresh milk, and then added a platform made of stones and drain piping. Within a few years, his method (designed for rats) had been adapted for smaller lab mice, and had made its way into rodent labs around the world. Now it's among the most widespread animal-testing protocols in all of biomedicine. Scientists plunge mice in murky water to test the effects of brain damage, or the functions of particular genes on learning, or the efficacy of new drugs for treating Alzheimer's. You can even buy a standard-issue "Morris Water Maze" direct from a lab-supply shop, along with specialized software for recording its results. The fact that so few of us would call a tub full of milk a “maze” only goes to show that rodent mazes aren't what they used to be. Early psychologists tempted rats with tricky blind alleys and wrong turns using contraptions built by hand, of wood and wire. © 2014 The Slate Group LLC.
Keyword: Learning & Memory
Link ID: 19233 - Posted: 02.11.2014
By Maggie Fox Researchers looking for simple ways to treat autism say they may have explained why at least some cases occur: It all has to do with the stress babies undergo at birth. They’re already testing a simple drug for treating kids with autism and say their findings may point to ways to treat the disorder earlier in life. It’s all experimental, but the study, published in the journal Science, should inspire other researchers to take a closer look. “This is exciting stuff to people in the field, because it’s getting at a basic mechanism,” says Andrew Zimmerman of the University of Massachusetts Medical School, who reviewed the study. Yehezkel Ben-Ari of the Mediterranean Institute of Neurobiology in Marseille, France, and colleagues have been treating children with autism with a diuretic called bumetanide that reduces levels of chloride in cells. Diuretics lower blood pressure by making people urinate more, reducing fluid. Ben-Ari has had mixed success in his trials in kids, and wanted to prove his theory that chloride was the key. He worked with two rodent “models” of autism — the closest things scientists have for replicating the human disorder. One has mutated genes linked with autism, and another develops autism when given valproate, an epilepsy drug blamed for causing autism in the children of mothers who take it while pregnant. They looked at what was going on in the brains of the mouse and rat pups just before and after birth. Then they gave the mouse and rat moms bumetanide — and fewer of their newborns showed autistic-like behaviors.
Ewen Callaway A study in mice and rats suggests that an imbalance in chloride ions during a child's development in the womb could be a factor in autism. Children with autism typically begin showing obvious symptoms, such as trouble making eye contact and slow language development, a year or more after birth. A study in mice and rats now hints that prenatal drug treatment could head off these problems. The findings, reported today in Science, do not suggest that autism spectrum disorders can be prevented in children. But researchers not involved in the study say that they add support to a controversial clinical trial suggesting that some children with autism benefited from taking a common diuretic medication called bumetanide. In that trial, a team led by neuroscientist Yehezkel Ben-Ari at the Mediterranean Institute of Neurobiology in Marseille gave 60 children bumetanide or a placebo daily for three months. Children who had less severe forms of autism showed mild improvements in social behaviour after taking the drug, and almost no adverse side effects were observed (see 'Diuretic drug improves symptoms of autism'). But autism researchers greeted the results with caution. Many pointed out that the study did not provide a clear biological mechanism that could explain how the drug improved the symptoms of the disorder. The latest study is an attempt to answer such criticisms by identifying a role for the neurotransmitter GABA. Studies in humans and animals have suggested that GABA, which in healthy people typically inhibits the activity in neurons, is altered in autism and instead activates some brain cells. © 2014 Nature Publishing Group
Link ID: 19225 - Posted: 02.08.2014
Memory can be altered by new experience, and isn't nearly as accurate as courtroom testimony might have us believe, a new study suggests. The results suggest a cheeky answer to the question posed by comedian Richard Pryor: "Who you gonna believe: me, or your lyin' eyes?" Turns out, Pryor was onto something. The brain behind our eyes can distort reality or verify it, based on subsequent experience. And somewhat paradoxically, the same area of the brain appears to be strongly involved in both activities, according to a study published online Tuesday in the Journal of Neuroscience. Northwestern University cognitive neuroscientist Donna Bridge was testing how memory is either consolidated or altered, by giving 17 subjects a deceptively simple task. They studied the location of dozens of objects briefly flashed at varied locations on a standard computer screen, then were asked to recall the object's original location on a new screen with a different background. When subjects were told to use a mouse to drag the re-presented object from the center of the new screen to the place where they recalled it had been located, 16 of 17 got it wrong, by an average of about 3 inches. When the same subjects then were given three choices - the original location, the wrong guess and a neutral spot between them - they almost unfailingly dragged the object to the incorrectly recalled location, regardless of the background screen. Their new memory was false. © 2014 Hearst Communications, Inc.
Keyword: Learning & Memory
Link ID: 19224 - Posted: 02.08.2014
| by Isaac Saul Multi-step puzzles can be difficult for humans, but what if I told you there was a bird that could solve them on its own? In this BBC special, Dr. Alex Taylor has set up an eight-step puzzle to try and stump one of the smartest crows he's seen in captivity. They describe the puzzle as "one of the most complex tests of the animal mind ever." This isn't the first time crows' intelligence has been tested, either. Along with being problem solvers, these animals have an eerie tendency towards complex human-like memory skills. Through several different studies, we've learned that crows can recognize faces, communicate details of an event to each other and even avoid places they recognize as dangerous. This bird, dubbed "007" for its crafty mind, flies into the caged puzzle and spends only seconds analyzing the puzzle before getting down to business. Despite the puzzle's difficulty, the bird only seems to be stumped momentarily. At the end of the puzzle is a food reward, but how he gets there is what will really blow your mind. © 2014 TheHuffingtonPost.com, Inc
By NICHOLAS BAKALAR There are many well established risk factors for cardiovascular death, but researchers may have found one more: slower reaction time. In the late 1980s and early ’90s, researchers measured the reaction times of 5,134 adults ages 20 to 59, having them press a button as quickly as possible after a light flashed on a computer screen. Then they followed the participants to see how many would still be alive after 15 years. The study is in the January issue of PLOS ONE. Unsurprisingly, men, smokers, heavy drinkers and the physically inactive were more likely to die. But after controlling for these and other factors, the researchers found that those with slower reaction times were 25 percent more likely to die of any cause, and 36 percent more likely to die of cardiovascular disease, than those with faster reactions. Reaction time made no difference in cancer mortality. The reasons for the connection are unclear, but the lead author, Gareth Hagger-Johnson, a senior research associate at University College London, said it may reflect problems with the brain or nervous system. He stressed, though, that “a single test of reaction time is not going to tell you when you’re going to die. There’s a link at a population level. We didn’t look at individual people.” © 2014 The New York Times Company
Keyword: Development of the Brain
Link ID: 19210 - Posted: 02.06.2014
By Geoffrey Giller Working memory—our ability to store pieces of information temporarily—is crucial both for everyday activities like dialing a phone number and for more taxing tasks like arithmetic and accurate note-taking. The strength of working memory is often measured with cognitive tests, such as repeating lists of numbers in reverse order or recalling sequences of dots on a screen. For children, performance on working memory assessments is considered a strong predictor for future academic performance. Yet cognitive tests can fail to identify children whose brain development is lagging in subtle ways that may lead to future deficits in working memory and, thus, in learning. Doctors give the tests periodically and plot the results along a development curve, much like a child’s height and weight. By the time these tests reveal that a child’s working memory is below average, however, it may be too late to do much about it. But in a new study, published January 29 in The Journal of Neuroscience, scientists demonstrated that they could predict the future working memory of children and adolescents by examining brain scans from two different types of magnetic resonance imaging (MRI), instead of looking only at cognitive tests. Henrik Ullman, a PhD student at the Karolinska Institute in Stockholm and the lead author on the paper, says that this was the first study attempting to use MRI scans to predict future working memory capacity. “We were pretty surprised when we found what we actually found,” Ullman says. © 2014 Scientific American
by Andy Coghlan If you flinch where others merely frown, you might want to take a look at your lifestyle. That's because environmental factors may have retuned your genes to make you more sensitive to pain. "We know that stressful life events such as diet, smoking, drinking and exposure to pollution all have effects on your genes, but we didn't know if they specifically affected pain genes," says Tim Spector of King's College London. Now, a study of identical twins suggests they do. It seems that epigenetic changes – environmentally triggered chemical alterations that affect how active your genes are – can dial your pain threshold up or down. This implies that genetic tweaks of this kind, such as the addition of one or more methyl groups to a gene, may account for some differences in how our senses operate. Spector and his colleagues assessed the ability of hundreds of pairs of twins to withstand the heat of a laser on their skin, a standard pain test. They selected 25 pairs who showed the greatest difference in the highest temperature they could bear. Since identical twins have the same genes, any variation in pain sensitivity can be attributed to epigenetic differences. The researchers screened the twins' DNA for differences in methylation levels across 10 million gene regions. They found a significant difference in nine genes, most of which had previously been implicated in pain sensitivity in animal experiments. © Copyright Reed Business Information Ltd.
One thing marijuana isn’t known to do is improve your memory. But there’s another reason why scientists believe it could fight Alzheimer’s disease. Gary Wenk, PhD, professor of neuroscience, immunology and medical genetics at Ohio State University, has studied how to combat brain inflammation for over 25 years. His research has led him to a class of compounds known as cannabinoids, which includes many of the common ingredients in marijuana. He says, throughout all of his research, cannabinoids have been the only class of drugs he’s found to work. What’s more, he believes early intervention may be the best way of fighting Alzheimer’s. Dr. Wenk doesn’t see cannabinoids – or anything else – as a cure. But he took the time to discuss with us how marijuana might prevent the disorder from developing. Q: What’s so important about brain inflammation? Over the past few years, there’s been a focus on inflammation in the brain as causing a lot more than Alzheimer’s. We now know it plays a role in ALS, Parkinson’s disease, AIDS, dementia, multiple sclerosis, autism, schizophrenia, etc. We’re beginning to see that inflammation in the brain, if it lasts too long, can be quite detrimental. And if you do anything, such as smoke a bunch of marijuana in your 20s and 30s, you may wipe out all of the inflammation in your brain and then things start over again. And you simply die of old age before inflammation becomes an issue for you. © 2013-2014 All rights reserved
By Ariana Eunjung Cha The National Institutes of Health is undertaking an ambitious collaboration with private industry in an attempt to speed up the search for treatments for some of the world’s most devastating diseases — Alzheimer’s, type 2 diabetes, rheumatoid arthritis and lupus. The pilot projects announced Tuesday will involve the sharing not only of scientists but also of data, blood samples and tissue specimens among 10 rival companies, the federal government and several nonprofit groups and research foundations. The companies that have signed up to participate include most of the large drug makers, which in the past had resisted calls to share detailed data and samples from experiments, preferring instead to use the information to gain lucrative patents. The agreement with NIH represents a major break from how they used to do business. The competing pharmaceutical companies have said they will hold off launching commercial ventures based on discoveries from the partnership until after the data has been made publicly available. The idea behind the collaboration is similar to that of the “open source” movement among some computer scientists who believe that sharing their code with anyone who wants it is the best way to innovate. The first group of projects, which will last three to five years, will involve an investment of more than $230 million from industry participants including Bristol-Myers Squibb, GlaxoSmithKline, Johnson & Johnson, Eli Lilly, Merck, Pfizer, Sanofi and Takeda, as well as a few smaller biotech companies. © 1996-2014 The Washington Post
Link ID: 19206 - Posted: 02.05.2014
Karen Weintraub Every time you pull up a memory – say of your first kiss – your mind reinterprets it for the present day, new research suggests. If you're in the middle of an ugly divorce, for example, you might recall it differently than if you're happily married and life is going well. This makes your memory quite unlike the video camera you may imagine it to be. But new research in the Journal of Neuroscience suggests it's very effective for helping us adapt to our environments, said co-author Joel Voss, a researcher at Northwestern University's Feinberg School of Medicine. Voss' findings build on others and may also explain why we can be thoroughly convinced that something happened when it didn't, and why eyewitness testimony is notoriously unreliable. The new research also suggests that memory problems like those seen in Alzheimer's could involve a "freezing" of these memories — an inability to adapt the memory to the present, Voss said. Our memories are thus less a snapshot of the past than "a record of our current view on the past," said Donna Rose Addis, a researcher and associate professor at the University of Auckland in New Zealand, who was not involved in the research. Using brain scans of 17 healthy volunteers as they were taught new data and recalled previously learned information, Voss and his colleagues were able to show for the first time precisely when and where new information gets implanted into existing memories.
Keyword: Learning & Memory
Link ID: 19205 - Posted: 02.05.2014
by Laura Sanders Despite seeming like a bystander, your baby is attuned to your social life (assuming you have one, which, with a baby, would be amazing). Every time you interact with someone, your wee babe is watching, eagerly slurping up social conventions. Scientists already know that babies expect some social graces: They expect people in a conversation to look at each other and talk to other people, not objects, and are eager to see good guys rewarded and bad guys punished, scientists have found. Now, a new study shows that babies are also attuned to other people’s relationships, even when those relationships have nothing to do with them. Babies are pretty good at figuring out who they want to interact with. The answer in most cases: Nice people. And that makes sense. The helpless wailers need someone reliable around to feed, change and entertain them. So to find out how good babies are at reading other people’s social relationships, University of Chicago psychologists showed 64 9-month-old babies a video of two women eating. Sometimes the women ate from the same bowl and agreed that the food was delicious, or agreed that it was gross. Sometimes the women disagreed. Later, the women interacted again, either warmly greeting each other and smiling, or giving each other the cold shoulder, arms crossed with a “hmph.” Researchers then timed how long the babies spent looking at this last scene, with the idea that the longer the baby spent looking, the more surprising the scene was. © Society for Science & the Public 2000 - 2014.
By Jennifer Ouellette It was a brisk October day in a Greenwich Village café when New York University neuroscientist David Poeppel crushed my dream of writing the definitive book on the science of the self. I had naively thought I could take a light-hearted romp through genotyping, brain scans, and a few personality tests and explain how a fully conscious unique individual emerges from the genetic primordial ooze. Instead, I found myself scrambling to navigate bumpy empirical ground that was constantly shifting beneath my feet. How could a humble science writer possibly make sense of something so elusively complex when the world’s most brilliant thinkers are still grappling with this marvelous integration that makes us us? “You can’t. Why should you?” Poeppel asked bluntly when I poured out my woes. “We work for years and years on seemingly simple problems, so why should a very complicated problem yield an intuition? It’s not going to happen that way. You’re not going to find the answer.” Well, he was right. Darn it. But while I might not have found the Ultimate Answer to the source of the self, it proved to be an exciting journey and I learned some fascinating things along the way. 1. Genes are deterministic but they are not destiny. Except for earwax consistency. My earwax is my destiny. We tend to think of our genome as following a “one gene for one trait” model, but the real story is far more complicated. True, there is one gene that codes for a protein that determines whether you will have wet or dry earwax, but most genes serve many more than one function and do not act alone. Height is a simple trait that is almost entirely hereditary, but there is no single gene helpfully labeled height. Rather, there are several genes interacting with one another that determine how tall we will be. Ditto for eye color. 
It’s even more complicated for personality traits, health risk factors, and behaviors, where traits are influenced, to varying degrees, by parenting, peer pressure, cultural influences, unique life experiences, and even the hormones churning around us as we develop in the womb.
Madhusree Mukerjee By displaying images on an iPad, researchers tested patients' ability to detect contrast after their vision was restored by cataract surgery. In a study of congenitally blind children who underwent surgery to restore vision, researchers have found that the brain can still learn to use the newly acquired sense much later in life than previously thought. Healthy infants start learning to discern objects, typically by their form and colour, from the moment they open their eyes. By the time a baby is a year old vision development is more or less complete, although refinements continue through childhood. But as the brain grows older, it becomes less adaptable, neuroscientists generally believe. "The dogma is that after a certain age the brain is unable to process visual inputs it has never received before," explains cognitive scientist Amy Kalia of the Massachusetts Institute of Technology (MIT) in Cambridge. Consequently, eye surgeons in India often refuse to treat children blinded by cataracts since infancy if they are over the age of seven. Such children are not usually found in wealthier countries such as the United States — where cataracts are treated as early as possible — but are tragically plentiful in India. In the study, which was published last week in Proceedings of the National Academy of Sciences, Kalia and her collaborators followed 11 children enrolled in Project Prakash, a humanitarian and scientific effort in India that provides corrective surgery to children with treatable cataracts and subsequently studies their visual abilities. ('Prakash' is Sanskrit for light.) © 2014 Nature Publishing Group
By GINA KOLATA For many obese adults, the die was cast by the time they were 5 years old. A major new study of more than 7,000 children has found that a third of children who were overweight in kindergarten were obese by eighth grade. And almost every child who was very obese remained that way. Some obese or overweight kindergartners lost their excess weight, and some children of normal weight got fat over the years. But every year, the chances that a child would slide into or out of being overweight or obese diminished. By age 11, there were few additional changes: Those who were obese or overweight stayed that way, and those whose weight was normal did not become fat. “The main message is that obesity is established very early in life, and that it basically tracks through adolescence to adulthood,” said Ruth Loos, a professor of preventive medicine at the Icahn School of Medicine at Mount Sinai in New York, who was not involved in the study. These results, surprising to many experts, arose from a rare study that tracked children’s body weight for years, from kindergarten through eighth grade. Experts say they may reshape approaches to combating the nation’s obesity epidemic, suggesting that efforts must start much earlier and focus more on the children at greatest risk. The findings, to be published Thursday in The New England Journal of Medicine, do not explain why the effect occurs. Researchers say it may be a combination of genetic predispositions to being heavy and environments that encourage overeating in those prone to it. But the results do provide a possible explanation for why efforts to help children lose weight have often had disappointing results. The steps may have aimed too broadly at all schoolchildren, rather than starting before children enrolled in kindergarten and concentrating on those who were already fat at very young ages. © 2014 The New York Times Company
by Susan Milius Male bee flies fooled into trying to copulate with a daisy may learn from the awkward incident. Certain orchids and several forms of South Africa’s Gorteria diffusa daisy lure pollinators by mimicking female insects. The most effective daisy seducers grow a dark, somewhat fly-shaped bump on one of their otherwise yellow-to-orange petals. Males of the small, dark bee fly Megapalpus capensis go wild. But tests show the daisy’s victims waste less time trying to mate with a second deceptive daisy than with the first. “Far from being slow and stupid, these males are actually quite keen observers and fairly perceptive for a fly,” says Marinus L. de Jager of Stellenbosch University in South Africa. Males’ success locating a female bee fly drops in the presence of deceitful daisies, de Jager and Stellenbosch University colleague Allan Ellis say January 29 in the Proceedings of the Royal Society B. That’s the first clear demonstration of sexual deceit’s cost to a pollinator, Ellis says. Such evolutionary costs might push the bee fly to learn from mating mistakes. How long bee flies stay daisy-wary remains unknown. In other studies, wasps tricked by an Australian orchid forgot their lesson after about 24 hours. © Society for Science & the Public 2000 - 2014
Alison Abbott By slicing up and reconstructing the brain of Henry Gustav Molaison, researchers have confirmed predictions about a patient who has already contributed more than most to neuroscience. No big scientific surprises emerge from the anatomical analysis, which was carried out by Jacopo Annese of the Brain Observatory at the University of California, San Diego, and his colleagues, and published today in Nature Communications. But it has confirmed scientists’ deductions about the parts of the brain involved in learning and memory. “The confirmation is surely important,” says Richard Morris, who studies learning and memory at the University of Edinburgh, UK. “The patient is a classic case, and so the paper will be extensively cited.” Molaison, known in the scientific literature as patient H.M., lost his ability to store new memories in 1953 after surgeon William Scoville removed part of his brain — including a large swathe of the hippocampus — to treat his epilepsy. That provided the first conclusive evidence that the hippocampus is fundamental for memory. H.M. was studied extensively by cognitive neuroscientists during his life. After H.M. died in 2008, Annese set out to discover exactly what Scoville had excised. The surgeon had made sketches during the operation, and brain-imaging studies in the 1990s confirmed that the lesion corresponded to the sketches, although it was slightly smaller. But whereas brain imaging is relatively low-resolution, Annese and his colleagues were able to carry out an analysis at the micrometre scale. © 2014 Nature Publishing Group