Monday, October 15, 2012

Outside the Asylum: Tools of the Trade

One of the biggest problems you inmates run into is communicating across your diverse delusions. For some reason, you all seem to have your own personal languages. While I realize each of you bounces off of reality at a different angle, my greatest dream is to create a means by which you can share your unique imaginary view of reality with each other, so you can all align yourselves properly. To that end, I've pulled together a map toward the idea.

I realize that in the digital age, the ability to share experiences has been... abused. The bandwidth necessary to share that picture of your bagel this morning could have been dedicated to more beneficial duty, but if we're going to have the ability, let's TRULY capitalize on it.

With the emergence of products like Project Glass from Google and the advancements in augmented reality gaming, it won't be long before Second Life and your first life are bumping up against each other. You're going to be able to skin your friends and decorate your house using purely digital sources combined with 3D printers.

But the really exciting part is the sort of interfaces coming to help us manage all that data more efficiently. Spatially aware gaming sensors have accustomed people to the idea of using their body as a controller, but we are still chasing the dream of the brain-computer interface.

There are several different approaches currently being examined, and each has its own benefits and limitations. Emotiv is a company pursuing electroencephalography (EEG) as a means to interface with a computer. Berkeley is using fMRI technology to reconstruct images from your brain in digital form. Potentially the most useful, though also the most prohibitive, non-invasive method would use magnetoencephalography (MEG), which promises the most precise reading of neural activity in real time but is severely limited in practice by its sensitivity.

Then we have the people at BrainGate, who are in the early stages of clinical trials for a chip implanted directly into the motor cortex of the brain, which lets you control a computer cursor or a robotic arm using the same thoughts you would use for natural muscle movement.

The goal is to gather detailed information about brain activity and map that activity to the experiences that create it. Every person has a unique biological signature when mentally executing a task, so all of these methods require software that learns to connect your brain activity with a particular event. The software has to learn how we think so that it can act like an extension of us.
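That calibration loop can be sketched in a few lines. Everything here is hypothetical: the fake signals, the two-number "feature vectors," and the nearest-centroid "learning" are stand-ins for the far richer features and models a real brain-computer interface would use.

```python
import random

random.seed(0)

TASKS = ["move_left", "move_right"]

def fake_eeg_sample(task):
    """Stand-in for a feature vector from a headset while the user
    imagines a task. Real systems would extract many features; here
    each task just has a characteristic two-number pattern plus noise."""
    base = [0.2, 0.8] if task == "move_left" else [0.8, 0.2]
    return [b + random.gauss(0, 0.1) for b in base]

# Calibration: record labelled samples while the user attempts each
# known task, exactly the "learning" phase the post describes.
training = {t: [fake_eeg_sample(t) for _ in range(50)] for t in TASKS}

# The "model" is just a per-task average pattern (a centroid) -- the
# simplest possible way to connect brain activity with an event.
centroids = {
    t: [sum(col) / len(col) for col in zip(*samples)]
    for t, samples in training.items()
}

def decode(sample):
    """Map a new activity pattern to the nearest learned task."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TASKS, key=lambda t: dist(sample, centroids[t]))

print(decode(fake_eeg_sample("move_left")))  # usually "move_left"
```

The point of the sketch is the shape of the problem, not the math: the software never knows what a thought "is," it only learns which recorded patterns tend to accompany which events, which is why every user needs their own calibration session.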

Knowing humans and technology, how long do you think it will be before we're trading experiences like sky-diving in our data streams the way we share pictures of biscuits and tea today?
