Apple Inc. (AAPL) is tired of people beating up on Siri, its voice-activation technology.
Yet when the company releases its new operating system in the fall, Apple may once again feel the fury of users who also own a device powered by Amazon.com Inc.'s (AMZN) Alexa or Alphabet Inc.'s (GOOGL) Google Assistant. Siri has long been criticized for being unable to answer seemingly simple questions, such as the weather forecast for next week on Cape Cod or what time the Dodgers game begins against the Giants.
If recent experience is any guide, Siri is unlikely to surprise on the upside. That's because users of voice assistants have come to expect that they can ask a wide assortment of questions and receive accurate answers. Yet Apple just isn't focused on building a voice-recognition technology that does that.
As everyone should know by now, Apple has never been interested in making products for the entire world. After all, its hardware devices have always been priced at a premium. Instead, Apple wants to keep you coming back to Apple, to buy its newest piece of hardware.
Siri's job is to integrate those devices; it's meant to grease the connections between Apple devices, making the iPhone integral to the iPad, Apple Watch and Apple TV -- and all points in between. The problem for Apple is that people have come to expect a voice-activated device that can answer relatively easy questions quickly and efficiently, and Siri has mostly fallen short.
Apple's position is that Siri isn't trailing Alexa, Google Assistant, Samsung's Bixby or Microsoft Corp.'s (MSFT) Cortana. Yet when Apple's new operating system, iOS 11, is made public in the fall, much attention will be heaped on whether Siri has improved.
How did we get here?
Siri's origins are a product of the intersection between technology and the defense industry, and not surprisingly, those developments took place in California. The technology at the heart of Siri was originally developed by SRI International, a tech-focused defense firm that began in 1946, just after the end of World War II. Back then, it was known as Stanford Research Institute, located right next door to Stanford University in Menlo Park, Calif.
In 2008, SRI separated its own voice-activation technology from the rest of the company when it received venture capital money from Menlo Ventures and Morgenthaler Ventures. In April 2010, Apple bought Siri for somewhere north of $200 million.
At the time, Siri was the clear leader in voice-activated artificial intelligence. Apple being Apple was eager to plant its flag on the seemingly far-away planet of machine learning. Owing to SRI's rather extraordinary work, Apple was able to introduce Siri to the public in October 2011 as part of the iPhone 4S. For nearly four years after that, Apple had few rivals.
But then Alexa came out in late 2015, albeit to mixed reviews, followed in May 2016 by Google, which introduced its Home product, powered by Google Assistant.
Suddenly, Siri's ability to recognize speech didn't seem so compelling. The public and the many media outlets that cover Apple began publishing a steady stream of stories citing common errors in Siri's ability to convert voice commands and questions into text.
In the summer of 2014, Apple took great pains to rewrite Siri's speech-recognition technology to integrate more current machine-learning capabilities. Further improvement in language recognition took place in the summer of 2016 as Siri sought to establish a more defined rules-based approach to understanding natural language.
At present, Siri receives over 2 billion requests each week from some 375 million devices that access the voice assistant each month. It would seem that Apple has more than enough data to determine what users want, and Apple insists that what its users really want is to get a device or app to do something very specific.
That means voice requests around messaging, phone calls, reminders, rearranging a calendar or controlling music selection -- topics that happen to dovetail with Apple's most widely used applications. While Apple says it's working to improve Siri's ability to handle topical questions, it's just not as high a priority.
If it were, Apple might have a different view of third-party developers. But it doesn't. Whereas Amazon has largely lifted the veil on Alexa, opening it to thousands of unique developer commands, Apple didn't want Siri bombarded with loads of new syntax. Users shouldn't be asked to speak to Siri with different commands based on the whims of outside developers, Apple has argued.
Rather, Siri looks for developers to support one of seven well-defined disciplines: transportation, payments, messaging, photo search, voice over IP/video calling, workouts and auto controls such as opening windows or starting an engine. With iOS 11, outside developers have been allowed to add apps around paying bills and making purchases.
The new operating system is also expected to reveal a Siri with new male and female voices across 21 languages. That is certain to get a lot of initial attention. But Siri's focus remains on Apple, not necessarily on the general consumer. And that means integration across a user's iPhone, iPad, Apple Watch and Apple TV as well as with CarPlay, HomeKit and Apple's wireless AirPods earbuds.
And by the end of the year, the HomePod. In other words, integration within the Apple ecosystem.
As it has successfully done for some 35 years, Apple's message is simple: come to Apple. Even if Siri can't tell you the weather forecast for Cape Cod.
Apple's shares rose 0.6% to $145.15 on Monday morning.
Editor's Pick: Originally published Jun. 16.