How Does Spotify Know You So Well?
20 February 2020 | 3min read
This Monday, just like every Monday before it, over 100 million Spotify users found a new playlist waiting for them called ‘Discover Weekly’. It’s a custom list of 30 songs they’ve never listened to before but will probably love.
So how does Spotify do such an amazing job of choosing those 30 songs for each person every week? And how does it seem to know individual users’ tastes so much more accurately than any similar service?
Spotify uses more than a single recommendation model. To create Discover Weekly, Spotify relies on three main types of recommendation models:
- Collaborative Filtering models, which analyze your behavior and others’ behaviors.
- Natural language processing (NLP) models to analyse text.
- Audio models to analyze raw audio tracks.
Combining these models is called hybrid recommendation, and it tends to be more effective than any single model on its own.
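To make the idea of hybrid recommendation concrete, here is a minimal sketch of blending per-model scores for one (user, track) pair. The model names, scores and weights are invented for illustration; Spotify’s actual blending logic is not public.

```python
# Hypothetical sketch: blending scores from three recommendation models.
# All names and numbers here are illustrative, not Spotify's actual values.

def hybrid_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-model scores for one (user, track) pair."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in weights) / total

scores = {"collaborative": 0.8, "nlp": 0.6, "audio": 0.4}
weights = {"collaborative": 0.5, "nlp": 0.3, "audio": 0.2}
print(round(hybrid_score(scores, weights), 2))  # 0.66
```

A weighted blend like this is the simplest hybrid strategy; real systems often learn the weights, or feed all model outputs into a further ranking model.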
Netflix was one of the first companies to use collaborative filtering to build a recommendation model: using users’ star-based movie ratings, it determined which movies to recommend to other, similar users.
Unlike Netflix, Spotify doesn’t have a system where users rate their music. Instead, Spotify uses data such as stream counts of the tracks, whether a user saved the track to their own playlist and more.
But what is collaborative filtering, and how does it work?
Every person has track preferences: let’s say Joe likes tracks A, B and C, and Carol likes tracks B, C and D. The collaborative filter reasons as follows: “These people like two of the same tracks (B and C), so each of them is likely to enjoy the other’s remaining tracks.”
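The Joe-and-Carol reasoning above can be sketched in a few lines. This is a toy, user-based version of collaborative filtering with made-up data; real systems use matrix factorization over millions of users rather than set overlap.

```python
# Toy sketch of user-based collaborative filtering (illustrative only).
# Each user maps to the set of tracks they have engaged with.
preferences = {
    "Joe": {"A", "B", "C"},
    "Carol": {"B", "C", "D"},
}

def recommend(user: str, prefs: dict[str, set[str]]) -> set[str]:
    """Recommend tracks liked by users whose taste overlaps with this user's."""
    own = prefs[user]
    recs = set()
    for other, tracks in prefs.items():
        if other != user and own & tracks:  # any shared track counts as "similar"
            recs |= tracks - own            # suggest what they like and we lack
    return recs

print(recommend("Joe", preferences))    # {'D'}
print(recommend("Carol", preferences))  # {'A'}
```

Joe gets Carol’s track D, and Carol gets Joe’s track A, exactly as the quoted reasoning predicts.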
Collaborative filtering does a pretty good job, but Spotify went even further by adding another engine: natural language processing.
Natural language processing (NLP)
Natural language processing is used for feature engineering: it enriches the song data that feeds the collaborative filtering model. The source data for these models are words: metadata, blog posts, news articles and more.
Natural language processing is the ability of a computer to understand human language as it is spoken or written.
You can read about NLP in more detail in our NLP blog post, but here’s what happens at a very high level: Spotify constantly crawls the web for blog posts, news articles and other text about music to figure out what people are saying about specific artists and songs.
Then, much like in collaborative filtering, the NLP model turns these descriptive terms and their weights into a vector representation of each song, which can be used to determine whether two pieces of music are similar. Cool, right?
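As a rough illustration of comparing two such vectors, here is a cosine-similarity sketch over sparse term-weight dictionaries. The terms and weights are invented for the example; the real vectors are learned from far more text.

```python
import math

# Hypothetical "cultural vectors": term weights derived from text about each track.
# The terms and weights below are made up for illustration.
song_x = {"indie": 0.9, "dreamy": 0.7, "guitar": 0.4}
song_y = {"indie": 0.8, "dreamy": 0.5, "synth": 0.6}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse term-weight vectors (0 = unrelated, 1 = identical direction)."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

print(round(cosine(song_x, song_y), 2))  # 0.79
```

A high score here means the web talks about both songs in similar terms, so one is a reasonable recommendation for fans of the other.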
Raw audio models
The first two models already provide a lot of data, and adding a third further improves the accuracy of the recommendations. Unlike the first two types, raw audio models can take brand-new songs into account.
For example, say a new singer-songwriter has just released a song on Spotify. Maybe it only has 50 listens, so there are too few listeners to collaboratively filter it. And if it isn’t mentioned anywhere on the internet yet, NLP models won’t pick it up either.
Raw audio models don’t discriminate between new tracks and popular tracks, so a new song could end up in a Discover Weekly playlist alongside popular songs!
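To give a feel for how audio-based comparison can work regardless of popularity, here is a sketch that compares two tracks by audio-derived features. In practice Spotify runs neural networks over the raw audio; here we assume such a model has already produced a handful of features, and the feature names and values are invented.

```python
# Hypothetical sketch: comparing tracks by audio-derived features.
# Assumes an upstream model has already turned raw audio into these numbers;
# the features and values are illustrative only.
new_song     = {"tempo": 0.62, "energy": 0.80, "danceability": 0.71}
popular_song = {"tempo": 0.60, "energy": 0.85, "danceability": 0.70}

def audio_distance(a: dict[str, float], b: dict[str, float]) -> float:
    """Euclidean distance between two feature vectors; smaller means more similar."""
    return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

print(round(audio_distance(new_song, popular_song), 3))  # 0.055
```

Because this comparison only needs the audio itself, a track with 50 listens and a chart-topper are treated identically, which is exactly why new songs can surface in Discover Weekly.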
This covers the three major types of recommendation models powering the Discover Weekly playlist.