
Second Life on the Brain - BCI Controls Online Avatars

The Biomedical Engineering Laboratory at Keio University in Japan recently announced that researchers were able to control a Second Life avatar using a brain-computer interface (BCI), bringing the possibility of total immersion in the online virtual world one step closer.

Prior attempts at creating such an interface involved hardwiring it into the brain in the form of an implant. This system instead uses external electrodes wired to an electroencephalograph (EEG) to receive and interpret commands. Still, it's not as simple as strapping on a helmet and running amok in Linden Land. The interface, which reads the brain's electrical signals, must first be "trained" to learn which electrical impulses and patterns correspond to which intended actions. Once this learning process is complete, a user can exercise simple control over an avatar in the virtual world by imagining what they would like it to do.
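The training step described above can be thought of as teaching a classifier to map recorded EEG feature patterns to intended actions. Here is a minimal, purely illustrative sketch of that idea using a nearest-centroid model; the function names, feature vectors, and action labels are assumptions for illustration, not the Keio team's actual method or code.

```python
# Hypothetical sketch of BCI "training": record EEG feature vectors while
# the user imagines each action, average them into per-action centroids,
# then classify new readings by nearest centroid. All data is toy data.

def train_centroids(labeled_samples):
    """labeled_samples: list of (action, feature_vector) pairs."""
    sums, counts = {}, {}
    for action, vec in labeled_samples:
        acc = sums.setdefault(action, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[action] = counts.get(action, 0) + 1
    return {a: [x / counts[a] for x in s] for a, s in sums.items()}

def classify(centroids, vec):
    """Return the action whose centroid is closest to this reading."""
    def dist(action):
        return sum((a - b) ** 2 for a, b in zip(centroids[action], vec))
    return min(centroids, key=dist)

# Toy calibration session: imagined "walk" vs. "turn" yield distinct patterns.
samples = [
    ("walk", [1.0, 0.1]), ("walk", [0.9, 0.2]),
    ("turn", [0.1, 1.0]), ("turn", [0.2, 0.9]),
]
model = train_centroids(samples)
print(classify(model, [0.95, 0.15]))  # → walk
```

Real EEG decoding involves far noisier signals and heavier preprocessing, but the calibrate-then-classify loop is the essence of the "training" the article refers to.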

Movement is still crude, but this is partly due to limitations in the way Second Life renders motion (as pre-scripted animations based on the movements of human models recorded via motion capture). Another limiting factor is the current control mechanism for Second Life avatars: keyboard and mouse input, which cannot possibly trigger the full range of human movement. The interface works by converting brain impulses into Second Life keyboard commands rather than controlling the avatar directly, so it inherits those limits.
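That last stage, turning a decoded intention into an ordinary keyboard command, can be sketched very simply. The key bindings and function names below are illustrative assumptions; Second Life's actual movement keys and the researchers' mapping may differ.

```python
# Hypothetical sketch of the BCI output stage: the decoded imagined action
# is translated into the same keypresses Second Life already understands,
# rather than driving the avatar directly. Bindings are assumptions.

KEY_BINDINGS = {
    "walk_forward": "Up",
    "walk_back": "Down",
    "turn_left": "Left",
    "turn_right": "Right",
}

def to_keypress(decoded_action):
    """Map a decoded imagined action to a simulated keypress, if one exists."""
    key = KEY_BINDINGS.get(decoded_action)
    if key is None:
        return None  # unrecognized pattern: safer to do nothing than to guess
    return f"press:{key}"

print(to_keypress("turn_left"))  # → press:Left
```

Because the vocabulary of keypresses is so small, this design caps what the BCI can express, which is exactly the limitation the paragraph above describes.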
The technology has many potential applications in the real world as well, such as remotely controlling robots in situations that are dangerous or too difficult for humans, and triggering devices that enable the handicapped to perform otherwise impossible tasks or to communicate with others.
Perhaps one day a BCI will enable me to play the solo to "Stairway to Heaven" on my air-Gibson Les Paul.

Posted by Christos Tsirbas.


Are we considering eye movement? The visuals would look just the way images normally appear to us, particularly when our eyes are performing saccades, and the brain would sort out the simulated jumble of imagery just as it does the actual input from our eyes. That assumes, of course, that AI has unravelled the mystery of these movements from EEG alone.
