Artist Neil Harbisson was born completely color blind, but these days a device attached to his head turns color into audible frequencies. Instead of seeing a world in grayscale, Harbisson can hear a symphony of color, and yes, even listen to faces and paintings. His "eyeborg" even lets him hear colors beyond the range of human sight, such as ultraviolet and infrared frequencies.
His story is unbelievable: a remarkable use of modern technology, and just one example of a growing trend of integrating technology with our biological "hardware". I want to talk about encouraging the innovation that is pushing the boundary between technology and our bodies. To do that, we have to start designing for our bodies, for our native ecosystems. There is a lot of material in this post, so make sure you catch all of the exciting developments. Listen to Neil's story, and get excited!
Neil Harbisson is a pioneer and a revolutionary, but he is not alone. Check out this documentary on the technological feasibility of some of the cyborg "enhancements" in the video game Deus Ex: Human Revolution. Rob "Eyeborg" Spence is a real person with a prosthetic eye that houses a wireless camera, capable of streaming its video feed to another device. In the documentary, he examines the prosthetics imagined by science-fiction pop culture and assesses their industry-leading real-world equivalents. In my lifetime, we may see "body hardware" become the next step in human evolution.
But the cyborg future will also include software that has to interface with our existing world. By designing interfaces that behave the way we already interact with the rest of the world, we can merge two wildly different ecosystems. I shouldn't have to sit in an office, using a mouse invented in the 1960s and a keyboard layout inherited from 1800s typewriters, just to use technology. Many of these innovators point to augmented reality as a desired result of biotech augmentation. So let's talk about some real-world toys that are helping to merge the two worlds together!
Remember the computer in Minority Report (2002)? No mouse or keyboard, just gloves. We built the real deal by 2008: the g-speak platform, complete with multi-touch, simultaneous input, and motion detection in 3D space to execute intelligent UI commands. Watching this technology in action is a thing of beauty.
What happens when you take away the screens? What can we do today at an individual consumer level?
Pranav Mistry has been working on augmented-reality interfaces similar to g-speak, but in a manner that won't break the bank. For just $350 as of 2009, you can carry SixthSense, a portable projection solution that takes the computing world with you as you go. Imagine being able to shop in the real world with all of the review and product data you can get online, even though it's not on the label. Imagine being able to take and browse pictures without ever pulling out your phone.
Mozilla is dreaming up a similar future with its Firefox Seabird concept. So is Google, with Google Glass. Each idea's implementation may diverge as it develops, but they all point to a strong interest in bringing augmented reality to consumers within the next 5-10 years. If you can dream it, it is possible!
Once the building blocks are in place (the hardware, the bioelectric interface, and the user software), it is only a matter of time before the form factors of these technologies reach a point where they become a part of our daily lives. When they do, we will indeed have skipped a few steps of biological evolution to enhance our world for the better.
When that day comes, will you be willing to give up your own arm for a stronger prosthetic? Would you be willing to give up your eye for a smarter, stronger eye with video capture and augmented reality? As with all technological advancements in our history, challenges are on the horizon that will test our ethical responsibilities. We will need to focus on ensuring that, as we rush to the "next stage" of evolution, we don't leave our "humanity" behind.
Leave a comment if you feel excited, apathetic, or nervous about the future of this technology. I personally can’t wait until these dreams become reality.