December 22, 2011

MIND READING: ENGINEERS HELP REVEAL MEANING IN BRAIN SCANS




(December 22, 2011)  Princeton engineers are working closely with neuroscientists to understand how visual information and words are encoded in the brain.

In a five-year collaboration, a team led by Princeton’s Peter Ramadge, chair and the Gordon Y.S. Wu Professor of electrical engineering, and James Haxby, a neuroscientist at Dartmouth College, has found common patterns in data from functional magnetic resonance imaging (fMRI) brain scans, which reveal brain activity as people perform tasks. The researchers are addressing the long-standing challenge of comparing one person’s brain activity to another’s, which until now has been difficult because both the anatomy and the functional processes of each person’s brain are different.

In one recent result, published in the journal Neuron, the researchers had subjects watch the entire movie “Raiders of the Lost Ark” while undergoing fMRI scans and used the data to derive a “common neural code” for how the brain recognizes complex visual images. Based on data from the first half of the movie, the researchers were able to predict, using only a person’s fMRI results, what scene he or she was watching in the second half of the movie.
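The prediction step amounts to pattern matching: a new fMRI activity pattern is compared against the scene-specific patterns learned from the first half of the movie, and the best match wins. The toy sketch below (hypothetical data and dimensions; the actual study used far richer models and a shared cross-subject neural space) illustrates such a nearest-correlation decoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 scene categories, each characterized by an average
# activity pattern over 100 voxels, learned from the first half of the movie.
n_scenes, n_voxels = 5, 100
templates = rng.normal(size=(n_scenes, n_voxels))

def predict_scene(pattern, templates):
    """Return the index of the learned template most correlated with `pattern`."""
    corrs = [np.corrcoef(pattern, t)[0, 1] for t in templates]
    return int(np.argmax(corrs))

# A noisy fMRI pattern recorded while the subject watches scene 3
# (in the second half of the movie) is matched back to the right template.
observed = templates[3] + 0.2 * rng.normal(size=n_voxels)
print(predict_scene(observed, templates))  # 3
```

The same idea scales to real data once subjects' brains are mapped into a common space, which is exactly the alignment problem the collaboration tackled.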


December 19, 2011

IBM 5 in 5: Mind Reading is no longer science fiction




(December 19, 2011)  One of the many great things about working with the Emerging Technology Services team is that I am always focused on “what’s next.”  For a long time speech recognition fit into this category, as the computing industry looked to make technology more pervasive, to free our fingertips from typing and to help us become more productive.

We are benefitting from this today with voice recognition for our cars, smartphones and even automated phone services for banks and travel reservations.

Now that speech recognition is becoming mainstream, and many other forms of human computer interaction have come along, like touch, gesture recognition, etc., we are thinking about what’s next - or in the case of the IBM 5 in 5 - what's next by 2017. In my view there will be huge leaps made in bioinformatics - this is a large topic, so I am more specifically referring to the use of sensors to understand our thoughts.

read entire news release >>

IBM Reveals Five Innovations That Will Change Our Lives within Five Years




(December 19, 2011)  Today IBM formally unveiled the sixth annual “IBM 5 in 5” (#ibm5in5) – a list of innovations that have the potential to change the way people work, live and interact during the next five years:

*  People power will come to life
*  You will never need a password again
*  Mind reading is no longer science fiction
*  The digital divide will cease to exist
*  Junk mail will become priority mail

The next IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM’s research labs around the world that can make these transformations possible. 
At IBM, we’re bridging the gap between science fiction and science fact on a daily basis. Here is how five technologies will define the future:

read entire press release

December 7, 2011

Brain Energy Metabolism: Focus on Astrocyte-Neuron Metabolic Cooperation




(December 7, 2011)  The energy requirements of the brain are very high, and tight regulatory mechanisms operate to ensure adequate spatial and temporal delivery of energy substrates in register with neuronal activity. Astrocytes—a type of glial cell—have emerged as active players in brain energy delivery, production, utilization, and storage. Our understanding of neuroenergetics is rapidly evolving from a “neurocentric” view to a more integrated picture involving an intense cooperativity between astrocytes and neurons. This review focuses on the cellular aspects of brain energy metabolism, with a particular emphasis on the metabolic interactions between neurons and astrocytes.

read entire article (OPEN ACCESS) >>

December 2, 2011

Changing Ideas




10 new technologies that will make a difference

(December 2, 2011)  Revolutions often spring from the simplest of ideas. When a young inventor named Steve Jobs wanted to provide computing power to “people who have no computer experience and don’t particularly care to gain any,” he ushered us from the cumbersome technology of mainframes and command-line prompts to the breezy advances of the Macintosh and iPhone. His idea helped to forever change our relationship with technology. What other simple but revolutionary ideas are out there in the labs, waiting for the right moment to make it big? We have found 10, and in the following pages we explain what they are and how they might shake things up: Computers that work like minds. Batteries you can top off at the pump. A crystal ball made from data (the focus of a feature on page 52). Consider this collection our salute to the power of a simple idea.

read entire technology special report (pdf) >>

November 15, 2011

Mimicking the brain, in silicon




New computer chip models how neurons communicate with each other at synapses.

(November 15, 2011)  For decades, scientists have dreamed of building computer systems that could replicate the human brain’s talent for learning new tasks.

MIT researchers have now taken a major step toward that goal by designing a computer chip that mimics how the brain’s neurons adapt in response to new information. This phenomenon, known as plasticity, is believed to underlie many brain functions, including learning and memory.

With about 400 transistors, the silicon chip can simulate the activity of a single brain synapse — a connection between two neurons that allows information to flow from one to the other. The researchers anticipate this chip will help neuroscientists learn much more about how the brain works, and could also be used in neural prosthetic devices such as artificial retinas, says Chi-Sang Poon, a principal research scientist in the Harvard-MIT Division of Health Sciences and Technology.

read entire press release >>

November 2, 2011

The Mind Reader



How Frank Guenther turns thoughts into words

(November 2, 2011)  For thousands of years humans have spoken. Noam Chomsky and many other linguists argue that speech is what sets Homo sapiens apart in the animal kingdom. “Speech,” wrote Aristotle, “is the representation of the mind.”

It is a complex process, the series of lightning-quick steps by which your thoughts form themselves into words and travel from your brain, via the tongue, lips, vocal folds, and jaw (together known as the articulators), to your listeners’ ears—and into their own brains.

Complex, but mappable. Over the course of two decades and countless experiments using functional magnetic resonance imaging (fMRI) and other methods of data collection, neuroscientist Frank Guenther has built a computer model describing just how your brain pulls off the trick of speaking.

And the information isn’t merely fascinating. Guenther (GRS’93), a Sargent College professor of speech, language and hearing sciences, believes his model will help patients suffering from apraxia (where the desire to speak is intact, but speech production is damaged), stuttering, Lou Gehrig’s disease, throat cancer, even paralysis.

read entire press release >>

November 1, 2011

TMC Shows New Nursing and Healthcare Robots in Tokyo




Four New Types of Robots Aimed for Commercialization from 2013

(November 1, 2011)  Toyota Motor Corporation (TMC) held an event today at its vehicle display space and theme park Mega Web in Tokyo to display a number of new robots developed to provide support in nursing and healthcare.  The robots form part of the Toyota Partner Robot series, which is being developed to assist humans in their everyday activities.

TMC considers Partner Robots to be useful in four fields: nursing and healthcare, short-distance personal transport, manufacturing and domestic duties.  TMC is developing technology that cooperates with humans, including devices that assist in the loading and moving of heavy components in factories, in addition to automated technology that enables autonomous tool operation.

TMC endeavors to provide the freedom of mobility to all people, and understands from its tie-ups with the Toyota Memorial Hospital and other medical facilities that there is a strong need for robots in the field of nursing and healthcare.  TMC aims to support independent living for people incapacitated through sickness or injury, while also assisting in their return to health and reducing the physical burden on caregivers.

Each robot incorporates the latest in advanced technologies developed by TMC, including high-speed, high-precision motor control technology, highly stable walking-control technology advanced through development of two-legged robots, and sensor technology that detects the user's posture as well as their grasping and holding strength.

read entire press release >>

Hippocampus Plays Bigger Memory Role Than Previously Thought




(November 1, 2011)  Human memory has historically defied precise scientific description, its biological functions broadly but imperfectly defined in psychological terms. In a pair of papers published in the November 2 issue of The Journal of Neuroscience, researchers at the University of California, San Diego report a new methodology that more deeply parses how and where certain types of memories are processed in the brain, and challenges earlier assumptions about the role of the hippocampus.

Specifically, Larry R. Squire, PhD, a Research Career scientist at the VA Medical Center, San Diego and professor of psychiatry, neurosciences, and psychology at UC San Diego, and Christine N. Smith, PhD, a project scientist, say that contrary to current thinking the hippocampus (a small seahorse-shaped structure located deep in the center of the brain and long associated with memory function) supports both recollection and familiarity memories when these memories are strong.

Recollection and familiarity memory are two components of recognition memory – the ability to identify an item as having been previously encountered. Recollection memory involves remembering specific details about a learning episode, such as where and when the episode occurred. Familiarity memory refers to remembering an item as previously encountered, but without any recall of specific details – for example, recognizing someone’s face but recalling nothing else about that person, such as where you met.


image >>

October 17, 2011

NJIT Researcher Testing Micro-Electronic Stimulators for Spinal Cord Injuries


Implant location of the micro electrode array shown on the rubrospinal tract (RST)
at the C5 level of the rat spinal cord.

(October 17, 2011)  A new wireless device to help victims of spinal cord injury is receiving attention in the research community. Mesut Sahin, PhD, associate professor in the department of biomedical engineering at NJIT, recently has published and presented news of his findings to develop micro-electrical stimulators for individuals with spinal cord injuries.

The work, now in its third year of support from a four-year, $1.4 million National Institutes of Health (NIH) grant, has resulted in the development and testing of a technology known by its acronym, FLAMES (floating light-activated micro-electrical stimulators). The technology, really a tiny semiconductor device, will eventually enable people with spinal cord injuries to recover some of the motor functions that are lost due to injury. Energized by an infrared light beam delivered through an optical fiber located just outside the spinal cord, these micro-stimulators will activate the nerves in the spinal cord below the point of injury and thus allow the use of muscles that were once paralyzed.



image (read also) >>

Man With Spinal Cord Injury Uses Brain-Computer Interface to Move Prosthetic Arm With His Thoughts



(October 17, 2011)  Seven years after a motorcycle accident damaged his spinal cord and left him paralyzed, 30-year-old Tim Hemmes reached up to touch hands with his girlfriend in a painstaking and tender high-five.

Hemmes, of Evans City, Pa., is the first to participate in a new trial assessing whether the thoughts of a person with spinal-cord injury can be used to control the movement of an external device, such as a computer cursor or a sophisticated prosthetic arm. The project, one of two brain-computer interface (BCI) studies under way at the University of Pittsburgh School of Medicine and UPMC Rehabilitation Institute, used a grid of electrodes placed on the surface of the brain to control the arm.

It was a unique robotic arm and hand, designed by the Johns Hopkins University Applied Physics Laboratory, that Hemmes willed to extend first toward the palm of a researcher on the team and, a few minutes later, to his girlfriend’s hand.

read entire news >>

October 11, 2011

Large study shows females are equal to males in math skills




(October 11, 2011)  The mathematical skills of boys and girls, as well as men and women, are substantially equal, according to a new examination of existing studies in the current online edition of the journal Psychological Bulletin.

One portion of the new study looked systematically at 242 articles that assessed the math skills of 1,286,350 people, says chief author Janet Hyde, a professor of psychology and women's studies at the University of Wisconsin-Madison.

These studies, all published in English between 1990 and 2007, looked at people from grade school to college and beyond. A second portion of the new study examined the results of several large, long-term scientific studies, including the National Assessment of Educational Progress.


image

October 5, 2011

Monkeys "Move and Feel" Virtual Objects Using Only Their Brains




(October 5, 2011)  In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton," said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures.

read entire press release

September 28, 2011

Tech Industry Visionaries Foresee "Internet of Everything" at Marconi Symposium



(September 28, 2011)  The "Internet of everything" has arrived, and according to a panel of prominent experts who assembled at the University of California, San Diego earlier this month for the 2011 Marconi Society Symposium, surfboards, dog collars and even tube socks will one day cross the digital divide and make for an increasingly Internet-enabled world.

This year's symposium, which was co-sponsored by the Center for Magnetic Recording Research and the UC San Diego division of the California Institute for Telecommunications and Information Technology (Calit2), explored the role that hardware (infrastructure) and software (applications) will play as the Internet evolves over the next several decades. The Marconi Society hosted the symposium in advance of its awards ceremony to recognize two scientists who–like radio inventor Guglielmo Marconi–pursued advances in communications and information technology for the social, economic and cultural development of all humanity. This year's winners of the Marconi Prize were former UCSD professors of electrical and computer engineering Jack Wolf and Irwin Mark Jacobs (Jacobs is also the co-founder of Qualcomm, Inc.). Speakers at this year's symposium included Jacobs and Google's Chief Internet Evangelist Vint Cerf, as well as Calit2 Senior Research Scientist Thomas A. DeFanti, Calit2 Research Scientist Albert Yu-Min Lin and several representatives from Bell Labs, Alcatel-Lucent and the University of Texas at Austin.

read entire news release >>

September 22, 2011

Scientists use brain imaging to reveal the movies in our mind




(September 22, 2011)  Imagine tapping into the mind of a coma patient, or watching one’s own dream on YouTube. With a cutting-edge blend of brain imaging and computer simulation, scientists at the University of California, Berkeley, are bringing these futuristic scenarios within reach.

Using functional Magnetic Resonance Imaging (fMRI) and computational models, UC Berkeley researchers have succeeded in decoding and reconstructing people’s dynamic visual experiences – in this case, watching Hollywood movie trailers.

As yet, the technology can only reconstruct movie clips people have already viewed. However, the breakthrough paves the way for reproducing the movies inside our heads that no one else sees, such as dreams and memories, according to researchers.

September 21, 2011

Ultrasound for Mind Reading


HEARING THOUGHTS: University of Toronto graduate student Sarah Power
models ultrasound headgear that can distinguish between two mental tasks.

Ultrasound transducers could make a better brain-computer interface

(September 21, 2011)  Ultrasound is good for more than monitoring fetuses and identifying heart defects. According to engineers in Canada, it can help tell what people are thinking as well. Their research suggests that ultrasound-based devices could lead to a new kind of brain-computer interface.

Brain-computer interface technology allows users to control devices with brain activity alone. Researchers have focused primarily on clinical applications for people with severe disabilities who would otherwise have difficulty interacting with the outside world.

In addition to brain-computer interfaces that involve electronics inserted directly into a patient’s head, researchers are also developing a number of noninvasive methods. For instance, electroencephalography (EEG) relies on electrodes attached to a person’s head; functional magnetic resonance imaging (fMRI) uses powerful magnetic fields to measure blood flow in the brain that telegraphs brain activity; magnetoencephalography (MEG) detects the magnetic fields generated by clusters of thousands of neurons; and near-infrared spectroscopy (NIRS) uses light to scan for changes in blood hemoglobin concentrations.

Yet practical use of these methods has so far been limited due to a number of drawbacks. For instance, EEG faces "noise" from electrical signals sent by the muscles and eyes; fMRI and MEG are very expensive and require large equipment; and NIRS, while still early in development as a brain-computer interface technology, has a low data-transmission rate.

Now biomedical engineer Tom Chau and his colleagues at the University of Toronto reveal that ultrasound can also monitor brain activity, suggesting that it could be used for brain-computer interfaces.


journal reference (OPEN ACCESS) >>

September 20, 2011

Proton-based transistor could let machines communicate with living things




(September 20, 2011)  Human devices, from light bulbs to iPods, send information using electrons. Human bodies and all other living things, on the other hand, send signals and perform work using ions or protons.

Materials scientists at the University of Washington have built a novel transistor that uses protons, creating a key piece for devices that can communicate directly with living things. The study is published online this week in the interdisciplinary journal Nature Communications.

Devices that connect with the human body’s processes are being explored for biological sensing or for prosthetics, but they typically communicate using electrons, which are negatively charged particles, rather than protons, which are positively charged hydrogen atoms, or ions, which are atoms with positive or negative charge.


journal reference >>

September 2, 2011

Autonomous Wheelchair




Computer Scientists from Freie Universität Berlin Present Novelty at IFA

(September 2, 2011)  Computer scientists from Freie Universität Berlin demonstrated a new type of wheelchair at IFA, an international trade fair for home electronics. The wheelchair, on loan from the Otto Bock company, makes it significantly easier to navigate inside buildings. It is equipped with laser and camera sensors and a computer under the seat. The laser sensors detect the position of walls and obstacles and prevent collisions. A Kinect sensor, developed for the Microsoft Xbox 360 game console, is also mounted on the wheelchair; it detects the three-dimensional structure of the environment and can, for example, prevent the wheelchair from colliding with people. Demonstrations and explanations are available in videos on YouTube.

A camera is installed for steering using eye movements. To make the wheelchair go to the right or left, the user needs only to glance toward the right or left. Accelerating and braking are triggered by looking upward or downward. For steering by thought, the wheelchair user wears a cap with 16 sensors that continuously measure brain activity. The system is trained to distinguish four brain patterns: drive left, drive right, accelerate, and brake. After a training period, the user should be able to steer the wheelchair just by thinking. A great deal of concentration is required, as ideally the user should think of only the four practiced patterns the entire time. Since obstacles automatically cause the wheelchair to stop, the user remains safe from accidents in any case. Previously, the group had demonstrated steering a car using only brain power.
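The control scheme described above can be sketched as a small command mapper. Everything here is a hypothetical simplification: a classifier (not shown) is assumed to emit one of the four trained pattern labels, and the obstacle sensors always take priority, which is what keeps the rider safe:

```python
# Hypothetical command mapping for the brain-controlled wheelchair: the
# classifier yields one of four trained labels, which map to steering and
# speed changes; obstacle detection from the laser/Kinect sensors overrides
# everything with a full stop.
COMMANDS = {
    "drive_left":  (-1, 0),   # (steering, speed change)
    "drive_right": (+1, 0),
    "accelerate":  (0, +1),
    "brake":       (0, -1),
}

def drive_command(pattern_label, obstacle_detected):
    """Translate a classified brain pattern into a (steering, speed) command."""
    if obstacle_detected:
        return (0, 0)  # safety override: the chair stops regardless of thought
    # An unrecognized pattern produces no motion rather than a guess.
    return COMMANDS.get(pattern_label, (0, 0))

print(drive_command("drive_left", False))   # (-1, 0)
print(drive_command("accelerate", True))    # (0, 0)
```

Keeping the safety override outside the classifier is the design point: even a misclassified thought cannot drive the chair into an obstacle.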

read entire press release >>


August 18, 2011

IBM Unveils Cognitive Computing Chips


SyNAPSE technical project manager Bill Risk next to the "brain wall."
Each of the yellow boxes represents one of the cognitive computing chips (256 neurons),
and close up you'll see them blinking - these are neurons firing.

(August 18, 2011)  Today, IBM (NYSE: IBM) researchers unveiled a new generation of experimental computer chips designed to emulate the brain’s abilities for perception, action and cognition. The technology could require many orders of magnitude less power and space than today’s computers.

In a sharp departure from traditional concepts in designing and building computers, IBM’s first neurosynaptic computing chips recreate the phenomena between spiking neurons and synapses in biological systems, such as the brain, through advanced algorithms and silicon circuitry. Its first two prototype chips have already been fabricated and are currently undergoing testing.

Called cognitive computers, systems built with these chips won’t be programmed the same way traditional computers are today. Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember – and learn from – the outcomes, mimicking the brain’s structural and synaptic plasticity.

To do this, IBM is combining principles from nanoscience, neuroscience and supercomputing as part of a multi-year cognitive computing initiative. The company and its university collaborators also announced they have been awarded approximately $21 million in new funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 2 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project. 


image >>

July 15, 2011

The Internet of Things [INFOGRAPHIC]




(July 15, 2011)  When we think of being connected to the Internet, our minds immediately shift to our computers, phones and, most recently, tablets. This week at Cisco Live, I shared that in 2008 the number of devices connected to the Internet exceeded the number of people on Earth.

That’s right. There are more devices tapping into the Internet than people on Earth to use them. How is this possible?

The infographic below provides a visual representation of the increase in “things” connected to the Internet. With this increase, how will you prepare your network for the future?

read and see more >>

July 14, 2011

Your Brain on Androids




(July 14, 2011)  Ever get the heebie-jeebies at a wax museum? Feel uneasy with an anthropomorphic robot? What about playing a video game or watching an animated movie, where the human characters are pretty realistic but just not quite right and maybe a bit creepy? If yes, then you’ve probably been a visitor to what’s called the “uncanny valley.”

The phenomenon has been described anecdotally for years, but how and why this happens is still a subject of debate in robotics, computer graphics and neuroscience. Now an international team of researchers, led by Ayse Pinar Saygin of the University of California, San Diego, has taken a peek inside the brains of people viewing videos of an uncanny android (compared to videos of a human and a robot-looking robot).

Published in the Oxford University Press journal Social Cognitive and Affective Neuroscience, the functional MRI study suggests that what may be going on is due to a perceptual mismatch between appearance and motion.

read entire news release >>

June 29, 2011

Brain research predicts premeditated actions




(June 29, 2011)  Bringing the real world into the brain scanner, researchers at The University of Western Ontario from The Centre for Brain and Mind can now determine the action a person was planning, mere moments before that action is actually executed.

The findings were published this week in the prestigious Journal of Neuroscience, in the paper, “Decoding Action Intentions from Preparatory Brain Activity in Human Parieto-Frontal Networks.”

“This is a considerable step forward in our understanding of how the human brain plans actions,” says Jason Gallivan, a Western Neuroscience PhD student, who was the first author on the paper.

Over the course of the one-year study, human subjects had their brain activity scanned using functional magnetic resonance imaging (fMRI) while they performed one of three hand movements: grasping the top of an object, grasping the bottom of the object, or simply reaching out and touching the object. The team found that by using the signals from many brain regions, they could predict, better than chance, which of the actions the volunteer was merely intending to do, seconds later.

read entire press release

June 23, 2011

A First Step Toward a Prosthesis for Memory




A neural implant helps rats with short-term recall.

(June 23, 2011)  Researchers have developed the first memory prosthetic device—a neural implant that, in rats, restored lost brain function and improved short-term memory retention. While human testing is still a distant goal, the implant provides evidence that the brain’s complex neural code can be interpreted and reproduced to enhance cognitive function.

The device, which consists of a tiny chip and a set of 32 electrodes, marries math and neuroscience. At its heart is an algorithm that deciphers and replicates the neural code that one layer of the brain sends to another. The function restored by the implant is limited—rats were able to remember which of two levers they had pressed. But its creators believe that a device based on the same principle could one day be used to improve recall in people suffering from stroke, dementia, or other brain damage.

Wake Forest University neurophysiologist Samuel Deadwyler first trained the rats to press two different levers in succession. The animals learned to press one lever as it was presented to them and then, after a delay, remember which they’d pressed and choose the other one the second time around. While the rats performed the task, two sets of minute electrodes recorded the activity of individual neurons on the right and left sides of the hippocampus, an area of the brain that consolidates short-term memory by processing information as it passes through multiple layers. A set of 16 electrodes—eight on the right, eight on the left—monitored signals being sent from neurons in an area of the hippocampus called the CA3 layer, and another 16 monitored the processed signals received by neurons in the CA1 layer.
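The implant's core idea, deciphering the transformation that CA3 activity undergoes on its way to CA1 and then reproducing it, can be caricatured with a linear model fit by least squares. The actual device uses a nonlinear multi-input multi-output model fit to recorded spike trains; everything below (data, dimensions, the linear mapping) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the prosthesis: learn the transformation that one
# hippocampal layer (CA3, 16 recorded channels) applies to produce the next
# layer's activity (CA1, 16 channels), then use it to predict CA1 output.
n_samples, n_ch = 200, 16
ca3 = rng.normal(size=(n_samples, n_ch))            # recorded CA3 activity
true_W = rng.normal(size=(n_ch, n_ch))              # unknown layer-to-layer code
ca1 = ca3 @ true_W + 0.01 * rng.normal(size=(n_samples, n_ch))

W_hat, *_ = np.linalg.lstsq(ca3, ca1, rcond=None)   # decipher the code
predicted_ca1 = ca3 @ W_hat                         # replicate it

print(np.allclose(W_hat, true_W, atol=0.05))  # True: the mapping is recovered
```

In the real prosthesis, the "replicate" step is what matters: once the CA3-to-CA1 transformation is modeled, the chip can stimulate CA1 with the predicted output even when the biological pathway is impaired.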

read entire news >>

June 20, 2011

Caltech Researchers Create the First Artificial Neural Network Out of DNA




(June 20, 2011)  Artificial intelligence has been the inspiration for countless books and movies, as well as the aspiration of countless scientists and engineers. Researchers at the California Institute of Technology (Caltech) have now taken a major step toward creating artificial intelligence—not in a robot or a silicon chip, but in a test tube. The researchers are the first to have made an artificial neural network out of DNA, creating a circuit of interacting molecules that can recall memories based on incomplete patterns, just as a brain can.

"The brain is incredible," says Lulu Qian, a Caltech senior postdoctoral scholar in bioengineering and lead author on the paper describing this work, published in the July 21 issue of the journal Nature. "It allows us to recognize patterns of events, form memories, make decisions, and take actions. So we asked, instead of having a physically connected network of neural cells, can a soup of interacting molecules exhibit brainlike behavior?"


June 17, 2011

RESTORING MEMORY, REPAIRING DAMAGED BRAINS




(June 17, 2011)  USC Viterbi School of Engineering scientists have developed a way to turn memories on and off—literally with the flip of a switch.

Using an electronic system that duplicates the neural signals associated with memory, they managed to replicate the brain function in rats associated with long-term learned behavior, even when the rats had been drugged to forget.

"Flip the switch on, and the rats remember. Flip it off, and the rats forget," said Theodore Berger of the USC Viterbi School of Engineering's Department of Biomedical Engineering.

Berger is the lead author of an article that will be published in the Journal of Neural Engineering. His team worked with scientists from Wake Forest University in the study, building on recent advances in our understanding of the brain area known as the hippocampus and its role in learning.

In the experiment, the researchers had rats learn a task, pressing one lever rather than another to receive a reward. Using embedded electrical probes, the experimental research team, led by Sam A. Deadwyler of the Wake Forest Department of Physiology and Pharmacology, recorded changes in the rats’ brain activity between the two major internal divisions of the hippocampus, known as subregions CA3 and CA1. During the learning process, the hippocampus converts short-term memory into long-term memory, as the researchers’ prior work has shown.

read entire press release >>

May 18, 2011

Decoding brainwaves lets scientists read minds




(May 18, 2011)  While currently in the realm of sci-fi fantasy, the ability to read people’s minds has taken a step closer to reality thanks to neuroscientists at the University of Glasgow.

Researchers at the Institute of Neuroscience & Psychology have been able to identify the type of information contained within certain brainwaves related to vision.

Brainwaves – the patterns of electrical activity created in the brain when it is engaged in different activities – can easily be measured using electroencephalography (EEG).

However, knowing exactly what information is encoded within them, and how that encoding takes place, is a mystery.

Professor Philippe Schyns, Director of the Institute of Neuroscience & Psychology and the Centre for Cognitive Neuroimaging, who led the pioneering study, said: “It’s a bit like unlocking a scrambled television channel. Before, we could detect the signal but couldn’t watch the content; now we can.”


journal reference (OPEN ACCESS) >>

'Mind reading' brain scans reveal secrets of human vision




"Mind reading" scans show that, to our brains, a sparse line drawing of a street scene is almost as recognizable as a detailed color photograph.

(May 18, 2011)  Researchers call it mind reading. One at a time, they show a volunteer – who's resting in an MRI scanner – a series of photos of beaches, city streets, forests, highways, mountains and offices. The subject looks at the photos, but says nothing.

The researchers, however, can usually tell which photo the volunteer is viewing at any given moment, aided by sophisticated software that interprets the signals coming from the scan. They glean clues not only by noting which part of the brain is especially active, but also by analyzing the patterns created by the firing neurons. They call it decoding.
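The decoding step described above can be sketched, in spirit, as a pattern classifier. The toy example below is purely illustrative (synthetic data, invented category names, and a simple nearest-centroid rule rather than the study's actual software): it labels a new "voxel pattern" by the training category it most resembles.

```python
# Toy fMRI-style decoder: average the training patterns for each scene
# category, then label a new pattern by its nearest category average.
# All data here is synthetic; real decoding uses genuine voxel activity
# and far more sophisticated models.
import math
import random

random.seed(0)
CATEGORIES = ["beach", "street", "forest"]
N_VOXELS = 50

def make_pattern(category):
    """Fake voxel pattern: a category-specific baseline plus noise."""
    base = CATEGORIES.index(category)
    return [base + random.gauss(0, 0.5) for _ in range(N_VOXELS)]

def train(examples):
    """Compute the mean pattern (centroid) for each category."""
    centroids = {}
    for cat in CATEGORIES:
        pats = [p for c, p in examples if c == cat]
        centroids[cat] = [sum(v) / len(v) for v in zip(*pats)]
    return centroids

def decode(centroids, pattern):
    """Label a new pattern by its nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda c: dist(centroids[c], pattern))

training = [(c, make_pattern(c)) for c in CATEGORIES for _ in range(20)]
centroids = train(training)
print(decode(centroids, make_pattern("forest")))  # prints "forest"
```

With well-separated classes like these, the nearest centroid recovers the category reliably; the hard part in real studies is that genuine brain patterns overlap far more than this.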

Now, psychologists and computer scientists at Stanford, Ohio State University and the University of Illinois at Urbana–Champaign have taken mind reading a step further, with potential impact on how both computers and the visually impaired make sense of the world they see.


May 5, 2011

The benefits of meditation




MIT and Harvard neuroscientists explain why the practice helps tune out distractions and relieve pain.

(May 5, 2011)  Studies have shown that meditating regularly can help relieve symptoms in people who suffer from chronic pain, but the neural mechanisms underlying the relief were unclear. Now, MIT and Harvard researchers have found a possible explanation for this phenomenon.

In a study published online April 21 in the journal Brain Research Bulletin, the researchers found that people trained to meditate over an eight-week period were better able to control a specific type of brain waves called alpha rhythms.

“These activity patterns are thought to minimize distractions, to diminish the likelihood stimuli will grab your attention,” says Christopher Moore, an MIT neuroscientist and senior author of the paper. “Our data indicate that meditation training makes you better at focusing, in part by allowing you to better regulate how things that arise will impact you.”

There are several different types of brain waves that help regulate the flow of information between brain cells, similar to the way that radio stations broadcast at specific frequencies. Alpha waves, the focus of this study, flow through cells in the brain’s cortex, where sensory information is processed. The alpha waves help suppress irrelevant or distracting sensory information.
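The frequency-band idea can be made concrete with a small, self-contained computation: measure how much power a signal carries in the alpha band (8-12 Hz) versus a higher band. The signal and the hand-rolled Fourier correlation below are illustrative only, not the study's analysis.

```python
# Estimate band power by correlating a signal with sine and cosine
# waves at each whole-Hz frequency (a discrete Fourier transform by
# hand). The synthetic "EEG" is a 10 Hz oscillation plus noise, so
# the alpha band (8-12 Hz) should dominate the beta band (13-30 Hz).
import math
import random

random.seed(1)
FS = 256          # sampling rate in Hz
T = 2.0           # seconds of signal
n = int(FS * T)
signal = [math.sin(2 * math.pi * 10 * t / FS) + random.gauss(0, 0.3)
          for t in range(n)]

def band_power(sig, fs, lo, hi):
    """Sum of spectral power at whole-Hz frequencies in [lo, hi]."""
    total = 0.0
    for f in range(lo, hi + 1):
        re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(sig))
        im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(sig))
        total += (re ** 2 + im ** 2) / len(sig)
    return total

alpha = band_power(signal, FS, 8, 12)    # contains the 10 Hz rhythm
beta = band_power(signal, FS, 13, 30)    # mostly noise
print(alpha > beta)  # prints True
```

The "radio stations at specific frequencies" analogy in the text is exactly this: each band's strength can be read out independently from the same raw signal.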

read entire news release >>

April 20, 2011

Brains that switch between active areas more often learn faster




(April 20, 2011)  The “flexibility” of a person’s brain — how much different areas of the brain link up in different combinations — can be used to predict how fast someone will learn, according to research by an international team from Oxford University, UC Santa Barbara, and UNC Chapel Hill.

The team ran an experiment over three sessions in which 18 volunteers had to push a series of buttons as fast as possible. The researchers then divided functional MRI images of each volunteer’s brain into 112 different regions and analyzed how these different areas were active together while the volunteers performed the task.

They found that people with more “flexible” brains, whose brain regions switched active areas more often, were faster at learning the motor tasks.

“It’s the first time that anyone has defined this concept of ‘flexibility’ in the brain: how brain regions ‘light up’ together in different combinations. We’ve been able to show that how much these areas ‘swap partners’ in one session can predict how fast people will perform a task in a later session,” said Dr Mason Porter of Oxford University’s Mathematical Institute, an author of the report. “It suggests that in order to learn, the networks of our brains have to be flexible.”
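The notion of flexibility, how often a brain region "swaps partners," can be illustrated with a back-of-the-envelope calculation. The module labels below are invented; the published analysis derives them from community detection on fMRI networks.

```python
# Toy "flexibility" measure: given a region's module (community)
# assignment in successive scanning windows, compute the fraction of
# consecutive windows in which the assignment changes.

def flexibility(assignments):
    """Fraction of consecutive window pairs where the module changes."""
    changes = sum(1 for a, b in zip(assignments, assignments[1:]) if a != b)
    return changes / (len(assignments) - 1)

# Module labels for two hypothetical regions across 5 windows:
rigid_region = ["A", "A", "A", "A", "A"]
flexible_region = ["A", "B", "A", "C", "B"]

print(flexibility(rigid_region))     # prints 0.0
print(flexibility(flexible_region))  # prints 1.0
```

The study's finding, in these terms, is that volunteers whose regions scored higher on a measure like this in one session learned the button-pressing task faster in a later one.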


journal reference (OPEN ACCESS) >>

April 11, 2011

UCSF Study on Multitasking Reveals Switching Glitch in Aging Brain




(April 11, 2011)  Scientists at the University of California, San Francisco (UCSF) have pinpointed a reason older adults have a harder time multitasking than younger adults: they have more difficulty switching between tasks at the level of brain networks.

Researchers know that multitasking negatively affects short-term, or “working,” memory in both young and older adults. Working memory is the capacity to hold and manipulate information in the mind for a period of time. It is the basis of all mental operations, from learning a friend’s telephone number and entering it into a smartphone, to following the train of a conversation, to conducting complex tasks such as reasoning, comprehension and learning.

However, anecdotal accounts of “senior moments” – such as forgetting what one wanted to retrieve from the refrigerator after leaving the couch – combined with scientific studies conducted at UCSF and elsewhere indicate that the impact is greater in older people.

The current study offers insights into what is occurring in the brain in older adults. “Our findings suggest that the negative impact of multitasking on working memory is not necessarily a memory problem, per se, but the result of an interaction between attention and memory,” said the senior author of the study, Adam Gazzaley, MD, PhD, UCSF associate professor of neurology, physiology and psychiatry and director of the UCSF Neuroscience Imaging Center.


March 30, 2011

UT Southwestern researchers discover how brain's memory center repairs damage from head injury




(March 30, 2011)  Researchers from UT Southwestern Medical Center have described for the first time how the brain’s memory center repairs itself following severe trauma – a process that may explain why it is harder to bounce back after multiple head injuries.

The study, published in The Journal of Neuroscience, reports significant learning and memory problems in mice that were unable to create new nerve cells in the brain’s memory area, the hippocampus, following brain trauma. The study’s senior author, Dr. Steven G. Kernie, associate professor of pediatrics and developmental biology at UT Southwestern, said the hippocampus contains a well of neural stem cells that become neurons in response to injury; those stem cells must grow into functioning nerve cells to mend the damage.

“Traumatic brain injury (TBI) has received a lot of attention recently because of the recognition that both military personnel and football players suffer from debilitating brain injuries,” Dr. Kernie said, adding that memory and learning problems are common after repeated severe head injuries.

“We have discovered that neural stem cells in the brain’s memory area become activated by injury and remodel the area with newly generated nerve cells,” Dr. Kernie said. “We also found that the activation of these stem cells is required for recovery.”

read entire press release >>

February 22, 2011

Stanford researcher's new stretchable solar cells will power artificial electronic 'super skin'




(February 22, 2011)  Ultrasensitive electronic skin developed by Stanford researcher Zhenan Bao is getting even better. Now she's demonstrated that it can detect chemicals and biological molecules, in addition to sensing an incredibly light touch. And it can now be powered by a new, stretchable solar cell she's developed in her lab, opening up more applications in clothing, robots, prosthetic limbs and more.

"Super skin" is what Stanford researcher Zhenan Bao wants to create.  She's already developed a flexible sensor that is so sensitive to pressure it can feel a fly touch down.  Now she's working to add the ability to detect chemicals and sense various kinds of biological molecules.  She's also making the skin self-powering, using polymer solar cells to generate electricity.  And the new solar cells are not just flexible, but stretchable – they can be stretched up to 30 percent beyond their original length and snap back without any damage or loss of power. 

Super skin, indeed.

"With artificial skin, we can basically incorporate any function we desire," said Bao, a professor of chemical engineering. "That is why I call our skin 'super skin.' It is much more than what we think of as normal skin."

read entire press release >>

February 17, 2011

Scientists Steer Car with the Power of Thought



Computer Scientists at Freie Universität Couple Brain Waves with Driving Technology – Testing at Former Tempelhof Airport

(February 17, 2011)  You need to keep your thoughts from wandering if you drive using the new technology from the AutoNOMOS innovation labs of Freie Universität Berlin. The computer scientists have developed a system that makes it possible to steer a car with your thoughts. Using new commercially available sensors for recording electroencephalograms (EEG) to measure brain waves, the scientists were able to distinguish the bioelectrical wave patterns for control commands such as “left,” “right,” “accelerate,” or “brake” in a test subject. They then succeeded in developing an interface to connect the sensors to their otherwise purely computer-controlled vehicle, so that it can now be “controlled” via thoughts. Driving by thought control was tested on the site of the former Tempelhof Airport.

The scientists from Freie Universität first used the brain-wave sensors to let a person move a virtual cube in different directions with the power of his or her thoughts. The test subject thought of four situations associated with driving, for example, “turn left” or “accelerate.” In this way the person trained the computer to interpret the bioelectrical wave patterns emitted by his or her brain and to link them to commands that could later be used to control the car. The computer scientists then connected the measuring device to the steering, accelerator, and brakes of a computer-controlled vehicle, making it possible for the subject to influence the movement of the car using thought alone.
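The final interface stage, wiring a decoded command into the vehicle's controls, can be sketched as a simple dispatch table. Everything here (command names, state fields, step sizes) is invented for illustration; the real AutoNOMOS system is a full drive-by-wire platform with safety layers this toy omits.

```python
# Toy "thought-to-actuator" dispatch: once the EEG stage has produced
# a discrete command, update a simulated vehicle state accordingly.

class CarState:
    def __init__(self):
        self.steering = 0.0   # negative = left, positive = right (degrees)
        self.speed = 0.0      # arbitrary units

    def apply(self, command):
        """Map a decoded thought-command to an actuator change."""
        dispatch = {
            "left":       lambda: setattr(self, "steering", self.steering - 5.0),
            "right":      lambda: setattr(self, "steering", self.steering + 5.0),
            "accelerate": lambda: setattr(self, "speed", self.speed + 1.0),
            "brake":      lambda: setattr(self, "speed", max(0.0, self.speed - 1.0)),
        }
        dispatch[command]()

car = CarState()
for cmd in ["accelerate", "accelerate", "left", "brake"]:
    car.apply(cmd)
print(car.steering, car.speed)  # prints -5.0 1.0
```

The four commands mirror the four trained thought patterns mentioned above; in the actual system, each decoded pattern triggers the corresponding vehicle actuator rather than a field in a Python object.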

read entire press release >>

February 13, 2011

The Brain-Machine Connection: Humans and Computers in the 21st Century




(February 13, 2011)  A bull is charging at full speed, riveted in its fury, straight at you. Rather than running for your life, you calmly flip a switch on a remote control you’re holding. Immediately, the bull halts its furious charge and awkwardly trots away. This sort of mind control is not science fiction – it was an actual experiment performed by one of the earliest practitioners of brain implants – José Delgado, a neurophysiologist at Yale University from 1946 to 1974.

Trained in the venerable tradition of neuroanatomists, José Delgado was a physiologist who primarily studied the neural anatomies of animals. After reading about how Nobel Prize-winning neurologist Walter Hess was able to induce various emotions through electrical stimulation, Delgado chose to further explore this concept. Over the next thirty years, he constructed increasingly sophisticated devices that would deliver measured electrical pulses to specific targets in the brain. For example, one of his innovations was a device known as a stimoceiver, a pacemaker-like device that could electrically stimulate a certain area of the brain when triggered by a remote electrical receiver. The device provided Delgado unprecedented control of an animal’s movement and emotional state. In his mind, the final purpose of these devices was to be able to control mental illnesses, such as schizophrenia or depression, by stimulating various parts of the brain, a less invasive and destructive alternative to a then-popular surgical procedure known as a lobotomy. Using this device, he dramatically demonstrated his control of behavior by stopping a charging bull just a few feet away.

read entire article >>

February 2, 2011

The brain knows what the nose smells, but how? Stanford researchers trace the answer




(February 2, 2011)  Professor of Biology Liqun Luo has developed a new technique to trace neural pathways across the brain. He has mapped the path of odor signals as they travel to the higher centers of a mouse brain, illuminating the ways mammalian brains process smells.

Mice know fear. And they know to fear the scent of a predator. But how do their brains quickly figure out with a sniff that a cat is nearby?

It's a complex process that starts with the scent being picked up by specific receptors in their noses. But until now it wasn't clear exactly how these scent signals proceeded from nose to noggin for neural processing.

In a study to be published in Nature (available online now to subscribers), Stanford researchers describe a new technique that makes it possible to map long-distance nerve connections in the brain. The scientists used the technique to map for the first time the path that the scent signals take from the olfactory bulb, the part of the brain that first receives signals from odor receptors in the nose, to higher centers of the mouse brain where the processing is done.

read entire press release >>