Scientists Hack Into Fish Brain, Observe Thoughts Swimming
By Roxanne Palmer | February 1, 2013 6:35 AM EST
What do thoughts look like? Though that sort of question might sound like something posed late one night in a college dormitory, neuroscientists are keenly interested in being able to see signals traveling throughout the brain in real-time.
Now Japanese scientists have scored a key breakthrough, using technology to watch as a thought ‘swims’ through the brain of a living zebrafish. They reported the find in a paper published online on Thursday in the journal Current Biology.
“Our work is the first to show brain activities in real time in an intact animal during that animal's natural behavior," senior author Koichi Kawakami, a researcher at Japan's National Institute of Genetics, said in a statement Thursday. "We can make the invisible visible; that's what is most important."
The researchers used genetic techniques to insert a special fluorescent probe right into the neurons of the fish. When the fish saw a tasty paramecium, various pathways in the fish’s brain lit up, allowing the researchers to correlate specific brain activities to different behaviors.
A zebrafish’s brain is primitive, but its basic design does not differ too much from our own. The researchers indicated that they are interested in exploring more complex behaviors next.
"In the future, we can interpret an animal's behavior, including learning and memory, fear, joy, or anger, based on the activity of particular combinations of neurons," Kawakami said.
If the technique eventually makes its way into humans, researchers could potentially use this kind of specific brain activity reading to develop better psychiatric drugs.
"This has the potential to shorten the long processes for the development of new psychiatric medications," Kawakami said.
This latest thought visualization experiment is just one of the many recent advances in neuroscience that have ushered in an era where hacking the brain is becoming more and more plausible.
In 2010, University of Utah scientists unveiled a technique that uses sensors attached directly to the brain to translate signals into speech – a technique that could prove especially useful to patients rendered mute by paralysis.
A computer hooked up to the sensors was able to correctly identify which word an epileptic man was thinking between 76 and 90 percent of the time. In the initial experiments, the researchers focused on just a few essential words: "yes," "no," "hungry," "thirsty," "hot," "cold," "more" and "less."
"Even if we can just get them 30 or 40 words, that could really give them so much better quality of life," study author Bradley Greger told the Telegraph in 2010.
SOURCE: Muto et al. "Real-Time Visualization of Neuronal Activity during Perception." Current Biology, published online 31 January 2013.