It's not exactly news that Google (GOOG) dominates text search on the Internet. Nor is it surprising that Google is working to broaden its command of image and video search. But Google is going about the latter all wrong.

The problem with searching for images or video through Google is that you have to describe what you're seeking in text, then refine your search with more text detail. See the problem? Your visual senses, when searching, become completely reliant on your verbal capacity, along with the text and metadata associated with an image or video.

A video-search competitor, Blinkx, attempts to offer better results by searching a transcription of the audio that accompanies the video. Again, your search results are based on auditory cues, not visual ones.

Enter Bryan Calkins, 47, and Dr. Leonid Kontsevich, 49, founders of CogniSign.

The two San Francisco-based entrepreneurs teamed up in 2003 and launched their image-search portal for stock images in 2007. With that launch, they turned search into a visual experience.

"When humans are processing an object or a scene for recognition, their visual attention is jumping from key feature to key feature," says Calkins. "We have mimicked that part of the human visual-processing system with our trajectory technology. And it is the same thing, evaluating those salient features and their spatial relationship to one another. Then accommodating changes in scale, changes in viewing perspective and changes in position within the image."