Searching the Matrix, Or Why Siri Wouldn’t Make A Good Airtalent

Posted on November 8, 2011


Everyone, or at least my poker buddies, is abuzz about Siri, the new voice-activated personal assistant app on the iPhone.

As I wrote in this post a few weeks ago, I think Siri is a fascinating development.

However, underneath the packaging, Siri depends on an algorithm just like every other search engine or recommendation feature on the web.

An algorithm is, according to Wikipedia, “a finite list of well-defined instructions for calculating a function,” and, just like we learned in “The Matrix,” because computers have to work from instructions without any intuition, they have limitations.

Those limitations, which some experts are calling “filter bubbles,” mean that as more people depend on recommendations from algorithm-driven programs, important content will never surface because of the system’s blind spots.

For example, news about the war in Afghanistan may not get many “likes” so it will be less likely to be suggested by an algorithm despite being important news that people should be reading.
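To make the filter-bubble mechanic concrete, here is a toy sketch of a popularity-only recommender. The story titles and like counts are hypothetical, invented for illustration; no real engine works this simply, but the bias is the same: ranking purely by “likes” means a low-engagement story never makes the cut.

```python
# Hypothetical story data: title and like count (illustration only).
stories = [
    {"title": "Celebrity gossip", "likes": 950},
    {"title": "Viral cat video", "likes": 870},
    {"title": "Sports upset", "likes": 600},
    {"title": "War in Afghanistan update", "likes": 40},
]

def recommend(stories, top_n=3):
    """Return the top_n story titles, ranked purely by like count."""
    ranked = sorted(stories, key=lambda s: s["likes"], reverse=True)
    return [s["title"] for s in ranked[:top_n]]

print(recommend(stories))
# The Afghanistan story is squeezed out of the top three,
# no matter how important it is.
```

A human curator would override that ranking; a likes-only algorithm never will.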

A recent article on Mashable describes the differences between human content curators — such as newspaper editors, radio hosts or Keanu Reeves — and algorithms:

  • Risk Taking: Author Matt Silverman writes, “Suggestion engines almost always offer up ‘safe’ content within a very narrow spectrum.” You can look for stories outside of the norm that will grab your audience’s attention.
  • Big Picture: According to Silverman, “Algorithms seldom connect the dots between specks of content to form a big picture of current events.” You can group stories to show how they fit together and help listeners draw broader conclusions.
  • Pairing: “Human editors can draw you in with something ‘clicky’ and get you to stick around by pairing that item with something of substance.” Think of this as good teasing. Grab the listener’s attention in a way no algorithm can; then, when you have them, give them good content.
  • Social Importance: This is the Afghanistan example I gave earlier. It’s part of your job to be sure all kinds of topics are being discussed, not just the ones people are going to click the “like” button on.
  • Mind-Blowingness: Think about stories and content that are “bubbling under.” It’s your job to be the conduit for what’s going to be popular, so your listeners are smarter than their friends. Algorithms can’t predict the next big thing; you can.

If you can consistently provide unique, more intuitive information than an algorithm, your listeners will reward you with something else Silverman writes about: trust.

“People learn to trust good editors. If something seems boring or irrelevant but a trusted editor says it’s important, you’ll heed. Algorithms may never be so trustworthy.”