Personalization algorithms provide curated experiences based on what we’ve done in the past. But in the future, will we prefer surprises?
These algorithms act as a continual rearview mirror of our lives. Netflix shows us what we previously watched. Google shows us what we previously searched for. In the end, our choices get restricted to who we once were, leading to far more conformity than we might have wanted for ourselves. So where do we draw the line? And how long until we get tired of perfectly curated experiences and start seeking out surprises?
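To make the "rearview mirror" idea concrete, here is a deliberately tiny sketch of a history-only recommender versus one with a surprise knob. Nothing here reflects how Netflix or Google actually rank content; the catalog, tags, and `explore_prob` parameter are all made up for illustration.

```python
# Toy illustration of the "rearview mirror" effect described above.
# A purely history-based recommender keeps surfacing whatever overlaps
# most with what the user already watched; an exploration knob
# occasionally injects a deliberate surprise instead.
# Illustrative sketch only -- not how any real service works.
import random

CATALOG = {
    "Heist Movie":  {"crime", "thriller"},
    "Cop Drama":    {"crime", "drama"},
    "Noir Classic": {"crime", "noir"},
    "Nature Doc":   {"documentary", "nature"},
    "Baking Show":  {"reality", "food"},
    "Space Opera":  {"sci-fi", "adventure"},
}

def overlap(tags_a, tags_b):
    """Similarity = number of shared tags (deliberately crude)."""
    return len(tags_a & tags_b)

def recommend(history, explore_prob=0.0):
    """Rank unwatched titles by similarity to the viewing history.

    With explore_prob > 0, sometimes pick the *least* similar title
    instead -- the kind of surprise the episode argues people will
    start demanding.
    """
    watched_tags = set().union(*(CATALOG[t] for t in history))
    unwatched = [t for t in CATALOG if t not in history]
    if random.random() < explore_prob:
        return min(unwatched, key=lambda t: overlap(CATALOG[t], watched_tags))
    return max(unwatched, key=lambda t: overlap(CATALOG[t], watched_tags))

history = ["Heist Movie", "Cop Drama"]
print(recommend(history))                    # always another crime title
print(recommend(history, explore_prob=1.0))  # a forced surprise instead
```

Left at its default, the recommender never leaves the crime shelf, which is exactly the narrowing effect discussed in the episode.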
Here are the articles referenced in this episode:
- Perspectives on Personalization at Scale: Volume 1
- The Future of Personalization – and How to Get Ready For It
Highlights include:
2:42 – JCM: “And all I could think of was my Netflix queue. Now Netflix has a perfect eye for what I usually like. But the thing is, what I liked 20 years ago when I saw a movie or two a month is not the same as what I like now that I can watch a movie every day or 17,000 episodes of a series in a week. You know, I want maybe to try different things. And it is very hard to find them because Netflix says, no, this is what you like. Now I have other members of my family, so I can go on their queues. And it’s not that Netflix is so wrong. It’s just that you start noticing that you’re not being shown every possibility.”
4:55 – JCM: “If everything is programmed to please us based on what we’ve liked in the past, how do we ever learn anything new? How do we ever find out what else we might like?”
5:50 – JCM: “I’m going to call it and say, give it 10 years and I’m going to tell you, the future is going to be about surprise. In 10 years, I’m calling it. I’m saying it. Everyone is going to be begging to be surprised. And the companies that come in and provide that are going to be leading the way.”
6:10 – JCM: “There’ll be a point where everybody is sick and tired of perfectly curated experiences that have narrowed their world view so dramatically.”
8:55 – JCM: “These are all coming from algorithms. These are all programmed, right? So it’s not like we’re getting some neutral, unbiased world. We’re simply, instead of choosing our own realities, we’re letting somebody else choose it for us. So instead of getting our preferences, we’re getting somebody else’s preferences – or, more accurately, somebody else’s interpretation of our preferences. Now, what I wonder is, why is that preferable?”