I Controlled a Telepresence Robot with My Mind.

Simply by concentrating, I was able to get a remote telepresence robot to move where I wanted, in a manner of speaking.
In the video below I demonstrate an assembly of technologies I put together that lets me exert mind control over the telepresence robot I have demonstrated previously (itself an assembly of a range of technologies). The key addition to that earlier setup (which can be found here: http://youtu.be/lJTU3fLoZuY) is an Emotiv EPOC headset, which can sense, record, and transmit EEG signals ("brain waves") from my brain.
Using this, I can direct my attention to particular areas of the computer screen and, by concentrating and training the Emotiv software to associate particular brain states with particular actions (such as left-clicking the mouse), use my mind to control the little robot, whose "eyes and ears" I see and hear through via the streaming video feed displayed on the computer.
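The control loop just described can be sketched roughly as follows. This is purely a hypothetical illustration of the idea, not the actual Emotiv SDK; all the state names, button regions, and function names here are assumptions for the sake of the example:

```python
# Hypothetical sketch: a recognized, trained mental state is dispatched
# to a screen action, e.g. left-clicking an on-screen movement button
# of the telepresence web interface. None of these names come from the
# real Emotiv software.

# Pixel coordinates of the (assumed) movement buttons on screen.
BUTTON_REGIONS = {
    "forward": (400, 500),
    "left": (300, 550),
    "right": (500, 550),
}

# Which trained mental state should press which button.
STATE_TO_BUTTON = {
    "push": "forward",        # "mental pressure" on the forehead
    "focus_left": "left",     # attention on the left side
    "focus_right": "right",   # attention on the right side
}

def handle_state(state, click=lambda x, y: (x, y)):
    """Translate a recognized mental state into a left-click.

    `click` stands in for whatever actually moves and clicks the
    mouse; here it simply returns the target coordinates.
    """
    button = STATE_TO_BUTTON.get(state)
    if button is None:
        return None  # neutral or unrecognized state: do nothing
    return click(*BUTTON_REGIONS[button])

print(handle_state("push"))     # clicks the forward button -> (400, 500)
print(handle_state("neutral"))  # no action -> None
```

In practice the hard part is not this dispatch logic but reliably producing and holding the mental state in the first place.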
I have trained with the Emotiv EPOC headset for only a relatively brief period, and this form of mind control is still very difficult; it is a lot harder than it looks in this video. There were occasions where quite a bit of time elapsed between successful movements. You can see the transitions between each movement of the robot, which I edited because some of them were nearly a minute long. If you attempt this yourself, be prepared for occasional frustration! Much more training is required before I could claim "competent" control, with fluid and responsive action of the robot from my thoughts.
But while I freely admit that this is a rudimentary, crude, and inelegant implementation, as demonstrated in the video it works, and it is a good little proof of concept. In fact, I might even claim a limited and primitive form of telekinesis, having moved a hunk of matter with my mind, purely in homage to Arthur C. Clarke and his observation that any sufficiently advanced technology is indistinguishable from magic.
So, it is 2012 and for a few hundred bucks you too can put together your own mind-controlled telepresence robot system. Robot Surrogates and Avatars await!
Of course, if you’re part of a professional laboratory with some funding and a very smart team, you can use a very similar system to exert mind control over a flying AR Drone quadcopter, as this group has done here: http://www.youtube.com/watch?v=JH96O5niEnI
_________________________________________
A few extra notes:
* The headset has an accelerometer and can do pretty damn precise head tracking, and you can use this to substitute for the computer mouse.
* Examples of mind-states I trained include concentrating on the left side of my head/body or the right side, or applying “mental pressure” to my forehead and imagining pushing fluid from my brain out through my forehead. That sounds easy, but actually maintaining such a state for decent periods of time is difficult. If it gets too difficult, you can fall back on facial expressions instead (e.g. smiling, blinking), which the headset picks up very easily.
* One type of difficulty is your own emotional response. So when I was successful in concentrating and moving the robot for the very first time - seeing an external material object move just by thinking about it - I had such a surge of elation, surprise, humour, and excitement that it destroyed my mind state and made it impossible for me to exert any control for many minutes afterwards.
* Leading on from this, I’m pretty sure that if there were a feedback mechanism from the robot via sensors - e.g. a touch sensor that conveyed, via an actuator or electrodes delivering a signal to my skin, when the robot had been touched - then interesting things would start happening to my body map and sense of self.
* I’ve been in discussions with Peter Mora, lead developer of the WebKey application software / web service that I used to enable this telepresence robot solution, and he has indicated that he might try to add the ability to map sections of the screen (via the web interface) to keys on the keyboard. Forward, backward, left, right, etc. could then be the arrow keys or the WASD keys, allowing for more intuitive control - not just for normal control of the telepresence robot, but particularly for the mind-control version I discuss here. Different mind states could be mapped to each key, for example, making movement of the robot somewhat more intuitive (ideally; you would have to be very well trained, though).