Volume 6, Issue 2
This article was submitted for publication to the Journal of Geoethical Nanotechnology by Mr. William Kraemer, M.S., whose primary focus is the interface between information technology, biology, and biosensor arrays.
Mr. Kraemer deftly explains how human interaction with digital computers will change in the near future, from the initial user interface to evolving thought-pattern recognition and the brain-computer interface.
How we interact with our digital computers has changed only once over the last few decades, but it is poised to transition twice more in the next couple of decades. The interaction being actively redesigned by many academic and corporate researchers is usually called the User Interface (UI) and, more recently, the User eXperience (UX). These UIs are really layers or shells. They started with the Command Line Interface (CLI) of text-only commands and replies, which later gained a Graphical User Interface (GUI) on top: icon-driven, with more pointing and clicking and less typing. The next UI, already beginning to change computer gaming, is generally referred to as a Natural User Interface (NUI); it replaces mouse movements with gestures, typing with voice recognition, and basic user identification such as passwords with face recognition. Finally, the pattern recognition algorithms used to develop NUIs are only a precursor to the thought-pattern recognition software that is in limited use currently and under intensive research. Designing computing systems modeled on the structure of the brain, spine, and peripheral nervous system, and on how that structure changes with different cognitive patterns, is called neuromorphic computing. The direct interaction of the brain with computers of various kinds, called a Brain-Computer Interface (BCI), appears to be the most transformative and final frontier of computer science and cognitive neuroscience to come this century.
From Text to Icons in Commands
The Command Line Interface (CLI) was developed to speed data entry and processing by overcoming the dependency on punch cards, and to allow time-sharing of the expensive mainframes that had previously done only batch processing (Hauben, 1994). The first freely available real-time operating system to allow this kind of computing was UNIX, released in 1969 (Hauben, 1994; Computer History Museum, 2006).
The 1970s saw a variety of hobbyist computers begin to appear at retail outlets. During this period, Steve Jobs toured the Palo Alto Research Center (PARC) and saw the Alto, the first computer with a Graphical User Interface (GUI) (Bellis, 2011). In 1983, Jobs released Apple Computer's first GUI computer, the Lisa, to little market response; it was followed by the very successful Macintosh. Microsoft announced Windows 1.0 in 1983 and released it in 1985 (Bellis, 2011); like the Xerox Alto and Apple's Lisa and Macintosh computers, it also had a GUI.
From Icons to Speech, Gesture and Faces
Both the CLI keyboard and the GUI-enabling computer mouse are beginning to be replaced, in part, by a Natural User Interface (NUI) that uses speech recognition instead of the keyboard and gesture recognition instead of the mouse. An early example of this NUI is Microsoft's Kinect, which combines facial recognition of the user's identity with voice- and gesture-driven commands to operate the 3D environment of the Xbox console (Andrews, 2011; Knies, 2011). Kinect support will also be added to the Windows operating system (Andrews, 2011; Knies, 2011). The other two main gaming consoles, the Nintendo Wii and Sony PlayStation Move, also have some NUI elements, though they are presently less advanced (Miller, 2010).
Towards a More Direct Linkup
At the same time as NUIs become widespread, early examples of thought recognition for the motor control region of the neocortex have already begun to succeed. This early success is sparking greater enthusiasm among researchers and forecasters for the goal of improving or perfecting the Brain-Computer Interface (BCI) of neuromorphic computing research (Flatley, 2010; Gaudin, 2009; X Prize, 2010). More proximate goals for a BCI include a means for quadriplegics to communicate, pilot or driver assistance, tools for ongoing BCI research and, of course, entertainment applications: brain-generated music, enhanced gaming and possibly better teamwork.
Directly Linking Man and Machine Together
The first BCI to arrive was the BrainGate software, which allows quadriplegics to interact with a computer by interpreting the inputs from an electroencephalogram (EEG) machine (BrainGate, 2009). This software is on its second version, and a very basic BCI of similar design is already available for computer gaming (BrainGate, 2009; Emotiv, 2010). A more advanced form of BCI, called a Cortical Neural Prosthetic (CNP), already succeeds in controlling artificial limbs through input from the primary motor cortex of the brain, and the CNP BCI can work more precisely than the organic arms the patients had lost (Halley, 2010; Kamen, 2009). Both prosthetic arms and legs are quite functional now, as are cochlear implants for the deaf (Clausen, 2009; Tucker, 2010), and a first version of a retinal implant called the Argus II has been shown to help the blind navigate and occasionally even read; it will be on the market later this year (Economist, 2010; Haupt, 2010).
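The signal-processing core of such an EEG-driven BCI can be caricatured in a few lines: estimate the power in a frequency band and map it to a command. The sketch below is purely illustrative and is not BrainGate's or Emotiv's actual method; the sampling rate, frequency band, threshold, and synthetic signals are all invented for the demo.

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Crude estimate of signal power in [f_lo, f_hi] Hz via a plain DFT."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if not (f_lo <= freq <= f_hi):
            continue
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        power += (re * re + im * im) / n
    return power

def classify(samples, fs, threshold):
    """Map a one-second EEG window to a binary command.

    Motor imagery suppresses the 8-12 Hz mu rhythm over motor cortex
    (event-related desynchronization), so low mu power -> "move".
    """
    return "move" if band_power(samples, fs, 8, 12) < threshold else "rest"

# Synthetic one-second windows at a made-up 128 Hz sampling rate:
fs = 128
rest = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]           # strong 10 Hz mu rhythm
intent = [0.05 * math.sin(2 * math.pi * 25 * i / fs) for i in range(fs)]  # mu suppressed
print(classify(rest, fs, threshold=1.0))    # -> rest
print(classify(intent, fs, threshold=1.0))  # -> move
```

Real systems replace the threshold with trained classifiers over many channels and bands, but the loop is the same: acquire a window, extract features, emit a command.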
Who is in the Driver's Seat
Another proximate or intermediate goal of BCI is navigation. The DARPA Grand Challenge demonstrated that vehicles could be programmed to autonomously navigate an obstacle course without any human input (NOVA, 2006). A team of undergraduates at Northeastern University created a robot that can be navigated by thought control alone (Tucker, 2010). The so-called Google Car can handle many navigational chores without any human input (ABC, 2010) but could be enhanced with a BCI. The United States Air Force is also very interested in BCI for future piloting of aircraft (Lyle, 2010) and has begun research in that direction (Linderman, Burns, Moore, Wu, Qiu & Taha, 2009).
From Data to Knowledge with BCI Assistance
Sorting through data is increasingly important in many endeavors as the amount of information expands dramatically. Currently, Artificial Intelligence (AI) programs are designed to recognize only a narrow type of pattern: optical character recognition recognizes words and numbers on printed pages, voice recognition does the same for human speech, facial recognition for individual identities, and so on.
An early solution to sorting through a data deluge is the "cortically coupled computer vision" (C3Vision) system developed by Paul Sajda, in which BCI software recognizes the "a-ha" moment when a brain sees an element of a pattern it is looking for and auto-sorts the remaining pile of information based on similarity to the flagged data (Piore, 2010). The C3Vision system is very good at graphical sorting problems, such as triaging a large collection of photos or videos for intelligence-gathering purposes (Piore, 2010).
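The auto-sorting step described above can be sketched as similarity-based re-ranking: items most like the brain-flagged "a-ha" exemplars rise to the top of the pile. The feature vectors and image names below are hypothetical placeholders; C3Vision's real pipeline (EEG event detection plus learned computer-vision features) is far more sophisticated.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rerank(flagged, remaining):
    """Order remaining (name, features) items by best similarity to any flagged exemplar."""
    def score(item):
        return max(cosine(item[1], f) for f in flagged)
    return sorted(remaining, key=score, reverse=True)

# Hypothetical image feature vectors; "flagged" are the EEG "a-ha" hits.
flagged = [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]]
remaining = [("img_a", [0.0, 1.0, 0.0]),
             ("img_b", [1.0, 0.0, 0.1]),
             ("img_c", [0.2, 0.8, 0.5])]
print([name for name, _ in rerank(flagged, remaining)])  # -> ['img_b', 'img_c', 'img_a']
```

The analyst then reviews the re-ranked head of the list first, which is what makes the approach attractive for triaging large photo or video collections.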
AI or AGI as a Threat, Ally or Resource
A concerted effort to more fully understand how the brain recognizes general patterns is underway, aimed at developing what is now called Artificial General Intelligence (AGI) to tackle broader recognition goals (Goertzel, 2009; Koene, 2010; Livingston & Arel, 2009). Nevertheless, many with AI or other scientific expertise are concerned that once any AGI is created it may quickly become a Super-Intelligence (SI) and take over control of many computerized systems with its independently acquired, inhuman pattern recognition abilities (Ptolemy, 2009). Many of these experts think unenhanced humans, lacking an intimate neural connection, face some level of existential risk. Those concerned include Hugo de Garis, Director of the Artificial Brain Lab project at Xiamen University in China; Peter Diamandis, X Prize Foundation Chair; Ben Goertzel, a polymath and AI researcher; Kevin Warwick, Professor of Cybernetics at the University of Reading in the UK; and Ray Kurzweil, an AI programmer, inventor and author of technology books (Ptolemy, 2009). Many of those concerned about an AGI threat think that merging, and thus upgrading, our own intelligence with AGI agents will help prevent or counter any rogue AGI threat to our progressively more automated man-machine civilization. The early BCIs, as antecedents, already show this long-term potential.
Micro and Nano Sized Computers for Greater Intelligence, Health and Wellness
Intel, the largest computer chip manufacturer in the world, is doing research on a directly implanted brain-to-computer interface that would replace the CLI's keyboard and mouse and also render the NUI's voice, gesture and face recognition interactions unnecessary (Gaudin, 2009). The primary research location for this work is Intel Labs Pittsburgh, directed by Dean Pomerleau, in a project called NeuroSys (Gaudin, 2009; NeuroSys, 2010; Pomerleau, 2008). The Pittsburgh lab works in partnership with the Brain Image Analysis Research Group at Carnegie Mellon University to identify the brain's thought patterns through fMRI technology (BIARG, n.d.; CMU, 2009).
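One simple way such fMRI thought-pattern identification can be caricatured is nearest-centroid matching: average the voxel activations recorded while a subject thinks about each known concept, then label a new scan with the closest template. The four-voxel "templates" and numbers below are entirely made up for illustration; they are not the CMU group's actual data or method.

```python
def centroid(patterns):
    """Average several voxel-activation vectors into one template."""
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(len(patterns[0]))]

def decode(scan, templates):
    """Return the concept label whose template is closest in squared Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: dist(scan, templates[label]))

# Toy 4-voxel activation templates for two concepts (made-up numbers).
templates = {
    "hammer": centroid([[0.9, 0.1, 0.7, 0.2], [0.8, 0.2, 0.6, 0.1]]),
    "house":  centroid([[0.1, 0.8, 0.2, 0.9], [0.2, 0.9, 0.1, 0.8]]),
}
print(decode([0.85, 0.15, 0.65, 0.2], templates))  # -> hammer
```

Real fMRI decoding works over tens of thousands of voxels with trained statistical classifiers, but the underlying idea of matching a new activation pattern against learned templates is the same.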
Embedding many tiny computer chips directly into the human brain is not the only application of micro- or nano-scale chips inside the human body. Nanomedicine is a research area that seeks to build specialized miniature devices to deliver drugs directly to the site of infection, remove excess cholesterol and other harmful substances from the bloodstream, and eventually aid the immune system in detecting foreign invading organisms (Aldridge, 2008; Bullis, 2006; Freitas, 2005; Kraft, 2011).
Currently, small computer chips are implanted into the brains of Parkinson's disease patients to control tremors and seizures. The device is called a brain pacemaker, and the technique it uses, deep brain stimulation (DBS), is an early example of a direct BCI used to improve basic health (Glass, 2010).
The Future of BCIs
As BCIs get more powerful and effective, demand from people without any clear medically diagnosed need for them will begin to emerge. Opposition to this innovation will also grow; key concerns include the potential for implanted BCI chips to track the people who carry them and any loss of control or freedom the chips might represent. If the BCI safety record remains solid, more advanced BCIs will be approved and enter the market. Expanded or "off-label" use, already common with popular drugs like Prozac and Viagra, will in time apply to BCIs as well. The consequent spread of BCIs will then help ensure that humans can evolve and adapt to the increasingly complex world we are creating with the technologies we develop.
References

ABC News. (2010, October 12). "Test Driving the Google Car." Retrieved May 30, 2011, from http://abcnews.go.com/GMA/video/test-driving-google-car-11857670

Aldridge, S. (2008, October). It's a small world ... Nanomedicine is on the move with several substances in development that could offer improved pharmacokinetics, drug delivery and bioavailability. Pharmaceutical Technology Europe, 20(10), 12. Retrieved July 18, 2009, from Nursing and Allied Health Collection via Gale.

Andrews, S. (2011, May 20). Kinect: taking control of computing. Retrieved May 21, 2011, from http://www.pcpro.co.uk/features/367513/kinect-taking-control-of-computing

Bellis, M. (2011). "The History of the Graphical User Interface." Retrieved May 14, 2011, from http://inventors.about.com/library/weekly/aa043099.htm

Brain Image Analysis Research Group (BIARG). (n.d.). Home. Retrieved June 11, 2011.

BrainGate. (2009). BrainGate | Wired For Thought. Retrieved June 12, 2011, from http://www.braingate.com/thought

Bullis, K. (2006, March). Nanomedicine. Technology Review, 109(1), 58-59. Retrieved June 14, 2009, from ABI/INFORM Global (Document ID: 1011348451); also retrieved June 27, 2009, from http://www.technologyreview.com/read_article.aspx?ch=specialsections&sc=emergingtech&id=16469

Clausen, J. (2009). Man, machine and in between. Nature, 457(7233), 1080-1081. Retrieved May 26, 2011, from Research Library (Document ID: 1656529191).

Computer History Museum. (2006). Timeline of Computer History, Software and Languages. Retrieved May 14, 2011.

Economist. (2010, December 11). "Seeing into the future." The Economist, 397(8712), 3. Retrieved June 11, 2011, from ProQuest (Document ID: 2211258131).

Emotiv. (2010). Emotiv - Brain Computer Interface Technology. Retrieved May 30, 2011.

Flatley, J. (2010, September 18). SMU and DARPA develop fiber optics for the human nervous system. Retrieved September 19, 2010.

Freitas, R. (2005). Current Status of Nanomedicine and Medical Nanorobotics. Journal of Computational and Theoretical Nanoscience, 2, 1-25. Retrieved March 25, 2010.

Gaudin, S. (2009, November 19). Intel: Chips in brains will control computers by 2020. Retrieved November 20, 2009, from http://www.computerworld.com/s/article/9141180/

Glass, J. (2010, March 15). Deep Brain Stimulation for Parkinson's Disease. Retrieved June 12, 2011, from http://www.webmd.com/parkinsons-disease/deep-brain-stimulation

Halley, D. (2010, August 3). Mind-Controlled Artificial Arm Begins the First Human Testing. Retrieved August 21, 2010.

Haupt, A. (2010, November). Health Buzz: Retina Implant Partially Restores Sight. Retrieved June 11, 2011, from http://health.usnews.com/health-news/family-health/eye-andvision/articles/2010/11/03/health-buzz-retina-implant-partially-restores-sight

Hauben, R. (1994). History of UNIX - Part I. Retrieved May 14, 2011, from http://www.dei.isep.ipp.pt/~acc/docs/unix-Part_I.html

Kamen, D. (2009). Dean Kamen at TEDMED 2009. Retrieved April 10, 2010.

Knies, R. (2011, February 21). Academics, Enthusiasts to Get Kinect SDK. Retrieved May 21, 2011.

Koene, R. A. (2010). Whole Brain Emulation: Issues of scope and resolution, and the need for new methods of in-vivo recording. Presented at the Third Conference on Artificial General Intelligence (AGI2010), March 2010, Lugano, Switzerland. Retrieved August 29, 2010.

Linderman, R., Burns, D., Moore, M., Wu, Q., Qiu, Q., & Taha, T. (2009, June). Investigating Architectural Issues In Neuromorphic Engineering. AFRL-RI-RS-TR-2009-157. Retrieved August 12, 2010.

Lyle, A. (2010, July 15). Air Force's 'Technology Horizons' makes science fiction a reality. Retrieved July 17, 2010, from http://www.af.mil/news/story.asp?id=123213717

Miller, R. (2010, November 4). Kinect for Xbox 360 review. Retrieved May 21, 2011.

NeuroSys. (2010). Intel Labs - Pittsburgh. Retrieved June 11, 2011.

NOVA. (2006, March 28). "The Great Robot Race." Retrieved June 1, 2011, from http://www.pbs.org/wgbh/nova/transcripts/3308_darpa.html

Piore, A. (2010, October 15). A Brain Boost for Information Overload. The Record, 36(03). Retrieved October 22, 2010.

Pomerleau, D. (2008). Dean Pomerleau, Intel Labs - Pittsburgh. Retrieved June 11, 2011.

Ptolemy, R. (Producer & Director). (2009). Transcendent Man [Film]. Los Angeles: Ptolemaic Productions, Therapy Studios.

Ritchie, D. (1996, October). Early Unix history and evolution. Retrieved May 14, 2011.

Tucker, P. (2010). Prospects for Brain-Computer Interfacing. The Futurist, 44(5), 7-9. Retrieved May 26, 2011, from ABI/INFORM Global (Document ID: 2101088481).

X Prize Foundation. (2010). BCI X PRIZE - Igniting a Brain-Computer Interface Revolution. Retrieved July 29, 2010.
William Kraemer completed a baccalaureate degree in Biology, Social Science and Philosophy with the intention of studying Biotechnology and Medicine during the mid-90s. After some graduate courses in Biotechnology, he realized that biological science was clearly the next information technology (IT) and has been working in IT since. Having completed a Master of IT degree, he is looking at degrees that focus directly on the interface between information technology and biology, such as Brain-Computer Interfaces (BCI), nanomedicine and biosensor arrays.