For a generation, I told friends that computers were a box connected to a TV set for output, a typewriter for input and a tape recorder for storage, all sitting on a desk.



iOS changed all that. Now we think of only the screen. Touching the screen replaces the typewriter, and the tape recorder is replaced by memory chips. The whole thing can be held in one hand. You can walk around with it.

A personal computer, it turns out, is more like the box held up by a store clerk in the cartoon accompanying Intel co-founder Gordon Moore's famous article, which now lives at Intel's Web site. Writing in 1965, Moore had no idea which interfaces would matter: The computer was just intelligence, processing at the center of it all.

So iOS is not the final evolution of computer interfaces. Already, Apple's Siri and its imitators are telling us we can talk to our computers, as the late James Doohan tried to talk to a Macintosh in Star Trek IV.


Microsoft's Kinect, and now Leap Motion's gesture device, tell us that we don't have to touch the screen to interact, that we can use a form of sign language.

Those who are dismissing Apple's latest earnings, thinking there's nothing for the company to innovate beyond its current lineup, are missing this key point. The very nature of computing is about to be transformed.

Above my screen right now sits a camera, purchased for an abortive career as a Web TV personality. The microphone is studio-quality, and the video output is top-notch. But the camera is becoming more than an input device. It's becoming an interface.


By this time next year, instead of talking in front of the camera about stocks and technology, I might use the camera, and the microphone, to tell the computer what to do, to control software, making it my interface with an operating system.

The camera itself cost me just $50. At that price I can have one in every room. If a product combining voice, visuals and gesture interpretation is part of an operating system, one shared by all the screens in the home, now I'm walking, talking, and interacting with technology as naturally as I might with my family.


All the tools needed for this kind of interface already exist. We can connect such devices using WiFi, which now runs to 100 Mbps. We have voice interfaces, and we have gesture interfaces that work at both short and long range. Companies already sell software that lets computers take dictation.
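The argument here — that touch, voice and gesture can all feed one shared platform — can be sketched in code. The following Python toy is purely illustrative (every name in it is hypothetical, not any real operating-system API): an event bus normalizes input from different interface devices into a single command stream, so an application never needs to care which device the user happened to use.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InterfaceEvent:
    source: str    # which device produced it: "touch", "voice", "gesture"
    command: str   # the normalized command, e.g. "open_mail"

class InterfaceBus:
    """A tiny publish/subscribe bus keyed on normalized commands."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[InterfaceEvent], None]]] = {}

    def subscribe(self, command: str,
                  handler: Callable[[InterfaceEvent], None]) -> None:
        self._handlers.setdefault(command, []).append(handler)

    def publish(self, event: InterfaceEvent) -> None:
        for handler in self._handlers.get(event.command, []):
            handler(event)

# Usage: the same command arrives from two different devices, and the
# subscriber is invoked identically for both.
log: List[str] = []
bus = InterfaceBus()
bus.subscribe("open_mail", lambda e: log.append(f"open mail (via {e.source})"))
bus.publish(InterfaceEvent(source="voice", command="open_mail"))
bus.publish(InterfaceEvent(source="gesture", command="open_mail"))
```

The design choice is the point: once every device reduces its raw input to the same command vocabulary, the "interface war" happens in the drivers, not in the applications.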

The processing and storage of computing go into the background. Much of it will exist on a desk, or desks, but other parts will be in a closet somewhere, and still others in the cloud. All that's needed is some form of integration, an operating system, like Windows or Android or iOS, that can tie it all together for us.


I call this next evolution in computing the interface wars. It's what we're going to be seeing on the client side of computing for the rest of this decade, and for some time beyond. Touch, gesture, voice and vision are the tools we use in our daily life to interact with one another. And all these tools are coming to a computer near you.

At the same time, that computer will become more distant. If you don't have to sit before a typewriter to write, if you can dictate to an interface that will then produce an output you can check on your iPad, then why are you sitting down, in either an office or in your home? The "computer" will thus become the nearest screen, and all the screens in your home should be compatible.

The best part of this new computer interaction is that it finally enables something I've written about for a decade: the "Internet of Things." With WiFi becoming the home of your interfaces, devices spread throughout the home can quietly make use of the same platform. You can wear medical devices tied to the WiFi interface, and you can tie security, gardening, and home automation to the same interfaces.
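A minimal sketch of that shared home platform, assuming nothing about any vendor's actual product (class and device names here are invented for illustration): devices on the house network register themselves with one registry, which can then route a command to any of them.

```python
from typing import Callable, Dict

class HomePlatform:
    """Hypothetical registry for devices sharing one home network."""

    def __init__(self) -> None:
        self.devices: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        # A device announces itself and supplies a command handler.
        self.devices[name] = handler

    def command(self, name: str, action: str) -> str:
        # Route an action to a registered device by name.
        if name not in self.devices:
            raise KeyError(f"unknown device: {name}")
        return self.devices[name](action)

# Usage: a thermostat and a door lock join the same platform, so one
# interface (voice, gesture, a screen) can drive both.
platform = HomePlatform()
platform.register("thermostat", lambda a: f"thermostat: {a}")
platform.register("front_door", lambda a: f"front_door: {a}")
```

The key property is that adding a new device means one `register` call, not a new interface: the voice or gesture front end stays unchanged.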

How much of this will Apple do? How much of it will Microsoft do? How much of it will Google do, and how much will be done by as-yet unlaunched start-ups?

I don't know. I'm just convinced it is going to be done, that it's inevitable, because all the tools for doing it exist, and because there is certain to be ready demand. We want to interface with our machines as naturally as we interface with each other. Now we can.

At the time of publication, the author was long AAPL and GOOG.

Follow @DanaBlankenhorn

This article is commentary by an independent contributor, separate from TheStreet's regular news coverage.