NEW YORK (TheStreet) -- IBM (IBM) revealed on Tuesday a new cloud-delivered personal shopper, built on its Watson supercomputer, that the company says will revolutionize today's often tedious online and mobile shopping experience.
IBM's Institute for Business Value recently surveyed more than 30,000 global consumers and found that 40% of shoppers use mobile, social and location technologies to gather information, yet are unlikely to use these channels to purchase products. What hinders shoppers from making purchase decisions on these devices is, for a start, the time-consuming state of online and mobile shopping today. Consumers have to enter keywords into websites and hope for the best. Or, worse, they have to comb through a site for details buried under a sea of categories and subpages. In fact, according to PowerReviews, 50% of consumers spend 75% or more of their total shopping time conducting online research.
The studies indicate that what customers are looking for from today's online and mobile experience is the thoughtful help of a knowledgeable and relatable sales representative who's instantly ready to advise on important purchase decisions.
Drawing from the $100 million that IBM has earmarked to fuel new cognitive apps, IBM is collaborating with digital commerce company Fluid to create the Fluid Expert Personal Shopper (XPS), made with IBM Watson. The XPS app will, for the first time, give consumers one-on-one interaction with Watson on desktops, tablets and smartphones, along with the power to ask the app highly specific questions on key purchase decisions, as they would a sales associate in a store.
For instance, if you're a North Face customer, you could ask Watson for advice on what outdoor gear would be best-suited for a five-day, June hiking adventure in Phoenix. Fluid XPS then uses Watson's understanding of natural language to identify clues from the question, suggesting particular needs around weather, terrain and trail conditions. The app then combs through massive amounts of data, including product details, expert reviews, blogs and travel maps to find the right products for the customer's trip.
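The pipeline the article describes -- pull inferred needs out of a free-form question, then match products against them -- can be sketched in miniature as follows. The keyword table, function names and needs below are purely illustrative assumptions, not Fluid's or IBM's actual code:

```python
# Illustrative-only sketch of turning a shopper's natural-language question
# into a set of inferred needs. A real system would use full NLP, not a
# hand-built keyword table.
NEED_KEYWORDS = {
    "hiking": {"terrain"},
    "june": {"hot-weather"},
    "phoenix": {"hot-weather", "desert"},
    "five-day": {"multi-day"},
}

def extract_needs(question):
    """Map words in the shopper's question to inferred needs."""
    needs = set()
    for word in question.lower().replace(",", "").split():
        needs |= NEED_KEYWORDS.get(word, set())
    return needs

needs = extract_needs("Gear for a five-day June hiking trip in Phoenix")
print(sorted(needs))  # the needs the product search would then filter on
```

The interesting part, per the article, is that Watson infers needs such as weather and terrain that the shopper never states explicitly; the toy table above only hints at that mapping.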
The North Face is one of several consumer brands for which the XPS app is currently being customized.
Stephen Gold, vice president at IBM Watson Ecosystem, tells TheStreet that as the Watson-made personal shopper becomes smarter with each human interaction, its ability to engage customers in an intellectually challenging way will only grow stronger.
Here's TheStreet's interview with Gold...
Tse: How is the new, Watson-created, personal shopper different from the other automated queries in existence?
Gold: Watson is really a very different type of technology -- a new technology that affords an individual the opportunity to ask questions in natural language. So rather than a programmatic input, I can express my query as I would to a friend, and it then has the unique ability to discover, in context, information that's been made available. So if you're thinking about asking a question about a product, or seeking reviews of a product at that very moment, you're able to direct that question through the service.
The service investigates all the information that's been populated and then brings back results. But what's really interesting, what I think is extremely interesting, is that it does so in a very transparent way. It brings back not just a single response -- it can bring back a set of responses to a request and weigh those responses with confidence, as to whether each would be an appropriate response to the question. It also brings back all the supporting evidence. So unlike other forms of information discovery, where I have to rely on the system to make a determination for me, this is actually an intelligent system that's capable of understanding much the way you and I would in a conversation.
We now can have a conversation with Watson.
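The response shape Gold describes -- a set of candidate answers, each carrying a confidence score and its supporting evidence, rather than one opaque result -- can be sketched like this. The class and field names are assumptions for illustration, not IBM's Watson API:

```python
# Minimal sketch of confidence-weighted answering: return every candidate,
# ranked by how well the evidence supports it, along with that evidence.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    answer: str
    confidence: float                               # 0.0 - 1.0
    evidence: list = field(default_factory=list)    # supporting passages

def rank_candidates(candidates):
    """Best-supported candidate first, instead of a single silent answer."""
    return sorted(candidates, key=lambda c: c.confidence, reverse=True)

results = rank_candidates([
    Candidate("3-season tent", 0.82, ["expert review excerpt"]),
    Candidate("4-season tent", 0.41, ["blog post excerpt"]),
])
print([(c.answer, c.confidence) for c in results])
```

The design point is transparency: the caller sees the whole ranked set and the evidence behind each entry, so the "determination" is inspectable rather than made silently by the system.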
Tse: Are the queries to be made with voice or typed commands?
Gold: Well, it's agnostic in the sense that the technology can process any natural language input. That could be from a text, that could be from something that was keyed in, it could come through a technology that did a voice-to-text translation. That's out there, so we can support those technologies. But the essence of it is that the input itself is unstructured, right, so it's in our native language. And what's interesting about that is that it's not voice recognition.
We've all been on those voice recognition systems, pounding on the zero key yelling "agent, agent, agent."
This Watson technology is very different in the sense that it actually understands the nuances -- the colloquialisms, the idiosyncrasies of the human language.
We don't really think about the way we talk to each other as illogical, but at least in a historical frame of reference to computerization, it is, right? In English, we say things like houses "burn up" as they "burn down," right? And that's counterintuitive. And we say things like noses "run" and feet "smell." You look at a classical definition and you wouldn't uphold that.
So when I talk about this natural language capability, it's really in and of itself quite an accomplishment. And that's a part of Watson. But it's only one dimension, right? That by itself is interesting, but then to literally be able to read context ...
We live in a world where there's an information deluge, right? 90% of the world's data was created in the last two years, and 80% of that is unstructured -- and it's not just blogs and tweets and posts, it's information that's been captured and stored away in recordings, surveys and forums. I mean, how often do I call a vendor and hear, "this call is being recorded for quality purposes"? I can't help but wonder, "what do they do with that recording?"
Imagine now if you transcribe it, and literally a system can learn from that experience, can learn from that interaction with the consumer and get smarter -- which is the third part of Watson. It learns. The typical conventional system we know today is programmatic in nature -- that's what we've known in computing since the '50s: programmatic systems, driven by logic, driven by rules based on structured data. Now remove those barriers and imagine a system that learns. It gets smarter with each interaction, with each outcome. With each new piece of information, it's getting progressively more insightful, and I always think about this in the context of today.
If I use a piece of software and it doesn't work as designed or it doesn't have the capability that would be beneficial, I would have to wait until the next version or the next release. And then it may or may not be addressed. With Watson, it's progressively advancing every time there's an interaction. And again, it's learning based on behavior, based on activity, and it turns out that that's a very important distinction as we think about how we as individuals live our lives.
Computers have historically been deterministic, right? They've always been looking for the single right answer. Watson has the ability to take in new data and say, "you know what, I know yesterday I was 90% confident that the world was flat, but today I'm only 84% confident, because there's new evidence that would controvert that statement." With further explanation and further use, the system gets smarter, right, and eventually learns that the world is round. And this happens every single day. In healthcare, finance, law, retail -- every industry, things are changing. And we want to be able to take advantage of that and learn from it.
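The confidence revision Gold describes -- a belief weakened or strengthened as new evidence arrives -- is the behavior of a simple Bayesian update. The toy function below illustrates the mechanism only; it is not Watson's actual algorithm, and the likelihood numbers are invented for the example:

```python
# Toy illustration of confidence revised by one new piece of evidence,
# using Bayes' rule. Not IBM's actual method.
def update_confidence(prior, likelihood_if_true, likelihood_if_false):
    """Revise belief in a hypothesis given how likely the new evidence
    is under the hypothesis versus under its negation."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

confidence = 0.90   # "yesterday I was 90% confident"
# New evidence that is twice as likely if the hypothesis is false:
confidence = update_confidence(confidence, 0.4, 0.8)
print(round(confidence, 2))  # confidence drops below 0.90
```

Each further piece of evidence feeds the updated confidence back in as the new prior, which is the "gets smarter with each interaction" behavior in miniature.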
Tse: Will the personal shopping concierge be accessible from essentially anywhere?
Gold: It's a cloud-based service, which means that as long as your smartphone, your tablet, your PC is connected, it's accessible through the respective app that you're using. So what we're going to see is this take form and shape in a variety of interactions that we as individuals will experience every day.
In the case of the announcement with Fluid, it will be accessible online through various merchant sites, typically expressed as an "Ask Watson"-type function. So rather than doing a simple search or using a drop-down menu filter, I may just ask a question. And the question will pertain to my interest, to my needs at that moment in time. And what's really interesting is I can literally have a sustained conversation with Watson, as I would with a personal assistant, with a personal shopper, a knowledgeable salesperson at my side, saying "yeah, you can buy that, but I recommend this because." That's exactly what Watson's doing.
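A "sustained conversation" implies that each turn carries the accumulated context forward, so follow-up questions are answered in light of earlier ones. A minimal sketch of that session state, with all names hypothetical (a real client would send the question plus context to the cloud service):

```python
# Hypothetical sketch of a conversational shopping session: every turn
# appends to shared context, so later answers can draw on earlier turns.
class ShoppingSession:
    def __init__(self):
        self.context = []   # questions (and, in a real system, answers) so far

    def ask(self, question):
        self.context.append(question)
        # A real service would post question + context to the cloud API;
        # here we just report how much context the next answer would see.
        return {"question": question, "context_turns": len(self.context)}

session = ShoppingSession()
session.ask("What tent do I need for Joshua Tree in June?")
reply = session.ask("And what about a sleeping bag?")
print(reply["context_turns"])  # the second turn still sees the first question
```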
Tse: What's next in the personal shopper's evolution as Watson becomes increasingly sophisticated with each human interaction?
Gold: I think a couple of things will happen. If you think about this notion of learning systems, we equate it with what we've experienced going to school -- moving from first grade to second grade, to third grade, to fourth grade. With each successive year, we're getting smarter, we're able to do more with that knowledge, and we're able to interact in a more intellectually challenging way. So today you can ask a question and you can carry on a conversation, but tomorrow it will advance to a point where the system will prompt you with a question.
Right, so take for example: I want to go on a camping trip. Rather than answering the question "what equipment do I need," maybe the best response is, "where are you going?" 'Cause if you're going to Joshua Tree, I'm going to answer differently than if you're going to Yellowstone. "And what time of the year are you going? ... Oh! Well, I'm going to make this venture in winter" -- well, that's a different answer, 'cause the tent, the sleeping bag, the equipment you need are going to be different. So the next generation of the system is going to take us to a more evolved, more advanced level of interaction, and I think you're going to see this progression in terms of that type of sophistication. The other thing that's going to happen, if you think about it, is that this takes us deeper into dialoguing -- I think you're also going to see this technology extend to a greater variety of use cases.
So some of the things you might imagine as a consumer -- I might want to talk about my camping trip, but maybe as a buyer who works at the store I'm talking to, I want to actually look across the cohorts of all the buyers; I want to find new patterns, I want to find new insights. And I want Watson to help me do that. And that's discovery. That's a different kind of capability. What we are introducing is more centered around the notion of discovery, not simply "ask a question and answer." So I think you're going to see both the depth of capabilities associated with dialoguing and question-and-answering, and the breadth of what types of investigations Watson can do, as information grows substantially.
Tse: Do you anticipate that the personal shopper will come with other languages in the future?
Gold: I do. Today it is English only, but it is something we're working on, and it's interesting: I'm always asked about translations. But the problem isn't translations -- translations are very straightforward. It's actually understanding the language: the way a language is fundamentally constructed, the interpretation of that language. English is a great example, right, where the way we express things, the way we construct nouns and verbs and adjectives, is very different from German, very different from Chinese. So it's just a matter of time and task, right? We're working on it. I absolutely believe we're going to be talking about cognitive technology for the next 50 years, and there's no question it will address the languages we typically interact in.
-- Written by Andrea Tse in New York