In announcing some of Siri’s new “proactive” capabilities yesterday at WWDC, Apple’s Craig Federighi said the company was “enhancing how you use your device but without compromising your privacy.” The line was a not-so-indirect swipe at Google, but the approach also suggests limitations out of the gate.
Federighi discussed a cluster of “intelligent” enhancements to Siri and search intended to improve the iPhone user experience. To a great degree they’re also intended to play catch-up with Google (Now) and Microsoft, whose own “assistants” have leaped ahead of Siri in sophistication and expanded use cases.
Some of the new functionality is built upon Apple’s late-2013 acquisition of personal virtual assistant Cue (founded as Greplin) for an estimated $35 million to $45 million. Like Google Now and Cortana, Cue looked at email, contacts and other on-device content to present personal agendas and deliver contextually relevant, time-sensitive information.
Apple’s Federighi explained yesterday that Siri speech recognition and natural language understanding have become 40 percent more accurate and faster in the past year. He demonstrated some of the new functionality from the stage, calling up several user queries and requests in a hypothetical day-in-the-life scenario:
- “Show me photos from last August”
- Location-based reminders
- “Remind me about this when I get home…,” capturing the in-app or web link being viewed at that moment
Federighi also showed a useful caller ID feature that looks in a user’s email to identify inbound callers who aren’t in contacts.
As indicated, Siri will offer context-sensitive content based on user location, time and historical behavior. It will automatically populate user calendars with meeting requests and invitations and then offer time-to-leave reminders based on traffic conditions.
A reinvigorated Spotlight Search (swipe left on the homescreen) will feature Siri suggestions of people, apps, music and local information based on user behavior patterns, time of day and location.
Spotlight Search also offers more complete and useful search functionality, including results that offer in-app deep linked content. On stage Federighi demonstrated this with a search for “potatoes” that yielded recipes from several apps, taking users directly to the recipe page. Apple’s embrace of deep linking will also boost Google and Facebook’s similar initiatives — and could well be one of the most significant parts of yesterday’s keynote.
Siri is getting over “1 billion requests per week,” Federighi also announced. While this is not the same thing as Google search query volume, it indicates Siri is being heavily used. The new proactive capabilities and improved Spotlight Search should generate more usage and engagement and, at the margins, substitute for some types of Google search queries.
Apple has staked out a position as the OS that cares about privacy. Surveys indicate that consumers are willing to give up some of their privacy for rewards or improved personalization in selected contexts. I don’t believe the world of on-device personal digital assistants is one where consumers are especially concerned about the use of their data; they want these tools to be helpful. Accordingly, Apple’s “restraint” here may be less of a selling point.
Google Now improvements (third-party app integration, Now on Tap) and Cortana are coming to iOS. While Siri started the assistant arms race, Apple had fallen behind. It was, and remains, critical that the company keep investing in and evolving Siri.
Beyond this, Apple continues to incrementally build out its own search capability on the PC and in mobile. It’s unlikely ever to be a direct substitute for Google, but Apple will probably continue to peel away usage that might otherwise have gone to Google (e.g., weather, sports, nearby locations). Indeed, on the iPhone, Apple’s improved search could turn out to have its greatest impact in Maps and local.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.