This week, Google began rolling out an AR mode for Google Maps to select users. The feature, first demoed last May at the Google I/O conference, uses a phone's rear camera, GPS signals, Google's machine-learning smarts and its Maps and Street View data to provide turn-by-turn directions as a user navigates an urban environment.
When a user lifts up his or her phone while the AR mode is engaged, more than half the screen shows a rear-camera view with superimposed AR content such as arrows and street names. The rest of the screen shows a traditional Maps view, with a blue arrow and dots letting users know their current location and planned route.
As Google itself admits, this feature isn't meant to be a full replacement for Maps' standard turn-by-turn navigation, which doesn't require lifting up one's smartphone and (since the camera isn't used) drains less battery. Rather, it's meant to be a complement in urban areas where a phone's GPS and digital compass may not be completely reliable.
However, provided battery drain isn't excessive, it's easy to imagine a feature such as Maps' AR mode being frequently used on augmented reality headsets that can superimpose content on a hands-free display that's constantly in front of a user's eyes. Likewise, one could imagine the Google Lens service -- it identifies objects, places and text picked up by a phone's camera, and is integrated with Google Assistant and Android camera apps -- proving popular on AR headsets.
Services such as Lens and Maps' AR mode highlight how well-positioned Google, aided by its user data, Maps data, knowledge graph and arguably unmatched machine learning strengths, is to deliver a powerful set of cloud services for mass-market augmented reality headsets. In addition to Lens and Maps, Google's ability to integrate Google Assistant, which frequently comes out on top in voice assistant tests, into an AR headset platform is a valuable strength. And so is its ability to bake in services such as Gmail and YouTube.
Last May, Google was reported to be working on a standalone AR headset that would be separate from its Google Glass headset platform, and powered by a Qualcomm (QCOM) processor. If and when this headset launches, look for it to integrate many, if not all, of the aforementioned Google services.
On the other hand, Apple, which just named a long-time iPhone marketing exec its first AR marketing chief and is reportedly prepping an AR/VR headset that could launch in 2020, has a different set of AR strengths. Chief among them is Apple's superb mobile hardware and chip engineering capabilities, which could make a huge difference in an AR headset market where -- though progress is gradually being made -- current products (such as Google Glass and Microsoft's (MSFT) first-gen HoloLens) tend to have major limitations in terms of processing power, display quality, portability and/or battery life. With regard to display quality in particular, Apple's 2018 acquisition of Akonia Holographics, a startup that was working on AR lenses, should help its cause.
In addition to its engineering chops, Apple's knack for creating integrated user experiences across its hardware platforms is a major plus, given that any AR headset meant for outdoor use will lean heavily on smartphone pairing. Apple's success to date at using iPhone pairing to enable innovative features for the Apple Watch and AirPods bodes well for its ability to do the same for an AR headset. Such integration might prove a little more challenging for Google's headset platform, since it will have to support headsets and phones from third-party Android OEMs, in addition to Google's own devices.
In addition, for now, Apple, via its ARKit software development kit (SDK) for iOS, appears to be the world's most popular platform for developing AR-powered mobile apps. Though Google is hungry to catch up via its rival ARCore platform, the lead that ARKit currently seems to have in terms of developer support could make it easier for Apple to have a large base of third-party apps available when it rolls out its first AR headset, or soon afterwards.
To the extent that Google tries to make its services available on popular third-party software platforms in addition to its own, Apple's and Google's AR strengths could wind up complementing each other. Should Apple launch an AR headset with third-party app support, odds are pretty good that Google will want Maps, Assistant, Lens and other services to be available on it.
However, it's also quite likely that -- as is the case for iPhones and iPads -- Apple's headsets will tightly integrate its own mapping and voice assistant services out of the box, rather than Google's. It's also probable that just as an iPhone or iPad user looking to use Google Assistant rather than Siri needs to load an app rather than simply utter a two-word voice command, using Google's services on Apple's headsets won't always be as convenient as using Apple's.
As a result, the augmented reality headset fight that Apple and Google each seem to be prepping for could feature combatants whose platforms have very different selling points. Apple might end up selling would-be AR headset buyers on the quality of its hardware and iPhone integration, while Google could pitch them on the quality of the cloud services running on its headset platform.
Apple closed Tuesday up 0.86% at $170.89. Alphabet gained 2.41% to close at $1,121.37.