Siri, Topsy, and the Web – Context is everything
Last night, my youngest child and I were talking, and I wound up telling her about the scene from 2001: A Space Odyssey where the HAL 9000 computer, as he is being disassembled, sings the old song Daisy to Dave Bowman. My child loves music, and didn’t see the irony in immediately asking me, “How does the song go?” So I taught her – she hadn’t ever heard it before. At the time I didn’t get the irony in doing that either – not until I woke up this morning.
Think about that line right before Dave tells HAL to sing him the song:
“My instructor was Mr. Langley, and he taught me to sing a song. If you’d like to hear it I can sing it for you.”
Topsy is Siri’s Mr. Langley.
A little over two years ago I wrote about how Siri was the start of Apple escaping the Web, and escaping Google search. In that piece, I discussed how important context was for Siri. Over the last few years, Apple has improved Siri by connecting it to (often very contextually specific) sources, such as sports and movie information, and has demonstrated those integrations at WWDC.
However, Siri had, and continues to have, rather large holes in her knowledge set. Many questions we think of as very simple, Siri cannot answer. The child I mentioned earlier is fascinated with technology, and with Siri in particular. Periodically, she will come up with random, obscure queries and throw them at Siri. The Siri system often can't answer them, but sometimes it can.
Twitter is amazing because it can provide insight into the zeitgeist (the Web's short-term memory), but it also holds knowledge of long-term events, along a timeline, as they happened. In many ways, Twitter is a bit of a knowledge Mechanical Turk, where Twitter users mine the Web and real-time events and surface what they know in discrete snippets of information. Topsy was uniquely situated to surface Twitter's knowledge in an API-driven way, and is ideally positioned for Apple to integrate into Siri (since Siri doesn't really learn anything; it just connects into other systems).
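To make that last point concrete, here is a rough, purely illustrative sketch in Swift of what "Siri just connects into other systems" means architecturally. None of these types reflect Apple's or Topsy's actual APIs; they are hypothetical stand-ins for an assistant that holds no knowledge of its own and simply routes each query to whichever external source can answer it.

```swift
// Hypothetical sketch: the assistant owns no knowledge, it only routes
// queries to external sources. Names and behavior are illustrative.

protocol KnowledgeSource {
    var name: String { get }
    func canAnswer(_ query: String) -> Bool
    func answer(_ query: String) -> String
}

// Stand-in for a contextual source like movie or sports data.
struct MovieSource: KnowledgeSource {
    let name = "Movies"
    func canAnswer(_ query: String) -> Bool { query.lowercased().contains("movie") }
    func answer(_ query: String) -> String { "Showtimes near you: ..." }
}

// Stand-in for a Topsy-style source that surfaces what Twitter users are
// saying right now (and have said over time) about a topic.
struct RealtimeSocialSource: KnowledgeSource {
    let name = "Realtime social search"
    func canAnswer(_ query: String) -> Bool {
        let q = query.lowercased()
        return q.contains("saying") || q.contains("happening")
    }
    func answer(_ query: String) -> String {
        // A real integration would call an external search API here and
        // rank tweets by relevance, recency, and author influence.
        "Here's what people are saying about that right now: ..."
    }
}

// The assistant: no learning, just routing to the first source that can help.
struct Assistant {
    let sources: [KnowledgeSource]

    func respond(to query: String) -> String {
        guard let source = sources.first(where: { $0.canAnswer(query) }) else {
            return "Sorry, I don't have an answer for that."
        }
        return "[\(source.name)] \(source.answer(query))"
    }
}

let assistant = Assistant(sources: [MovieSource(), RealtimeSocialSource()])
print(assistant.respond(to: "What movies are playing tonight?"))
print(assistant.respond(to: "What are people saying about the keynote?"))
print(assistant.respond(to: "How does the song Daisy go?"))
```

The point of the sketch is the shape, not the details: adding a Topsy-like source is just plugging one more connector into the list, which is exactly why an acquisition like this slots so naturally into Siri.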
Many people have said Topsy was acquired to enhance advertising or iTunes content. Both are tangentially right. But ads have never appeared to be a primary focus for Apple – which makes sense, because the customer they build their hardware, software, and services for usually isn't a fan of ads. That said, the analytics from Topsy Pro could well wind up integrated into iAds. We'll see in time. As for content discovery? Sure, that'll happen too, and people will buy content as a result of their searches. But I don't believe that's what this acquisition was about.
People expect Siri to answer their queries; when it can't, and doesn't just work the way they expect, they disengage from the service, and potentially from Apple's platform. That's why I believe Topsy has everything to do with Siri: that's where the team will end up, and that's how we'll see the technology demonstrated at WWDC next summer.
A few pundits have also made the connection to Siri, but most analysis I've seen focuses on real-time search, without mentioning the (relatively) long-term knowledge that Topsy surfaces from Twitter, and how that can only grow over time. Just as importantly, as I understand it, Topsy had created an algorithm that enabled tweets to be sorted geographically. This is invaluable to Apple, as it gives Siri location-based context and will let the system help users find resources near them that others are discussing in near real-time through the Twitter firehose.

I think the acquisition of Topsy is good news for Apple and its customers, as well as for Twitter itself and Twitter users. I think it's really bad news for Google.