Getting release dates right: a product perspective on building trust

Photo by Matthias Groeneveld / Pexels

When the old and new worlds meet

When thinking about Deezer, many of us see a digital-born company, based in the “new world” — this tech era we live in. As much as we are honored to work in a thrilling creative environment, we must not forget that music didn’t wait for streaming to exist. Neither did metadata.

“Metadata” refers to a set of data that describes and gives information about an object, for example an album. Song titles, artists’ and contributors’ names, as well as release dates are all metadata. In the music industry, a release date, i.e. the day an album was released, can be physical or digital. The physical release date is the day an album was released as a CD or vinyl in a specific country. The digital release date is the day an album becomes legally available online.
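To make the distinction concrete, here is a toy metadata record for one album. The field names and the digital date are illustrative assumptions, not Deezer’s actual schema:

```python
# A toy album metadata record. Field names and the digital date are
# illustrative assumptions, not Deezer's actual schema.
album = {
    "title": "Exodus",
    "artist": "Bob Marley & The Wailers",
    "contributors": ["Bob Marley", "Aston Barrett"],
    "physical_release_date": "1977-06-03",  # first CD/vinyl release
    "digital_release_date": "2008-03-10",   # went online (made-up date)
}
```

One album, two different dates: which one should a streaming service display?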

Release dates are delivered to Deezer by labels and providers along with the audio: we receive albums (and their tracks) together with their associated metadata. We then ingest all this content to make it available and store it in the right place in our product.

At Deezer, we used to struggle with release dates. Which one to choose? What should we do when it’s incorrect?

At this point, you may ask yourself “Why is it tricky for Deezer to display the right release dates?” The reasons are many: release dates can differ from one territory to another, labels sometimes make mistakes when inputting them, fans build their own databases, etc. Basically, it is all about transposing the good ole CD / vinyl world into the brand new digital one. All of this makes release dates an exciting topic for a Product Manager (PM) because it’s tricky and very dependent on the industry, it impacts all music fans and Deezer users, and it has to do with trust and collective intelligence. I wanted to write about this to bring up a product perspective on a complex topic: the crossroads between tech and the music industry.

Start with the problem

Read between the lines

Identifying problems is the number one job of every PM. The occasional display of wrong release dates had been a known issue at Deezer for quite a while. For example, it was reported more than once that Bob Marley & The Wailers’ albums were not, in fact, released in 2020. Such issues could be explained by the fact that we mainly used physical release dates, and when those were unavailable, we sometimes displayed digital dates instead. Because an album is a piece of art, its release date should stay consistent over time; this is why we chose the physical over the digital release date. The problem was, we used to rely solely on the information provided by labels. However, this information is not always correct: there is sometimes confusion between physical and digital dates, for instance when it comes to re-editions.

So, we were not happy with the release dates we displayed. But the real question was: is it a problem?

That is when it got exciting for a PM: defining the problem. Let me take you through the steps I followed.

Step one: start from facts, not personal opinions

It is not easy to rationalize problem identification when everyone already has an opinion, or when the issue has been known for a long time. The more I talked to people, the more I collected ideas on solutions, impacts, and even mere feasibility. Opinions are not facts though, and I needed to know more about our users to determine whether we had a problem or not.

I had plenty of information to start my investigation: reasons why users churned (as declared when they unsubscribed), results of user interviews conducted by our User Research team, comments on social networks (including our Community website), and app store reviews. I decided to read and observe everything related to metadata in general.

Step two: be data-informed and not data-driven

Being data-informed allows us to interpret data while acknowledging and understanding its limitations. Depending on a company’s structure, a data-informed or a data-driven mindset can be more effective. It was hard to get quantitative data about release dates, but I had a ton of qualitative feedback on the issue. Using cold data as-is made no sense in that situation. Going through all available materials again and again, I understood that I had to read between the lines. I attended many User Experience (UX) test sessions (on totally different topics) to build as much empathy as possible with users who brought up metadata and catalog issues. I wrote down users’ comments verbatim, observed their behavior, and noted whether or not they used metadata to make decisions during their journey.

Step three: bring context to the problem

At the time, I focused so much on finding the right data that I almost missed the point. I had to take a step back and look at the situation within its context. Actually, the question was not “Do people care about release dates?” but rather “Do release dates impact Deezer’s business?” That is when I understood something crucial when talking about music: trust and expertise. So I met with the music experts at Deezer: the Artist Marketing and Editorial teams. Together, we were able to build a first draft of what we call “album fundamentals,” i.e. the necessary set of information a user needs to understand an album, as well as the level of information Deezer should provide as a music expert. Later on, I discussed this topic with user behavior experts (meaning the User Research team) to refine users’ emotional expectations.

It turned out release dates were a real problem for our users, but an implicit one. For some people using a streaming service, learning and improving their knowledge is part of their emotional journey into music. Building a trust-based relationship with a digital streaming platform matters a lot to them. One phrase that sums up what we heard: “I have to go on Google to know more about what I listen to.” In a word, people expect Deezer to provide them with truthful music expertise. Another notion that is key to fully appreciating the problem is ordering. On artist pages, we display discographies ordered by release date. Erratic release dates were therefore partly responsible for messy discographies, which made it hard for users to find what they wanted to listen to. Yet we want Deezer to be the most intuitive music app: 17% of users churning for product experience reasons did so because they couldn’t access or find their content. Now you can understand why it was a real problem.
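To see why ordering matters, here is a minimal sketch of a discography sort, with a deliberately wrong re-edition date (titles and dates are real albums, but the scenario is illustrative):

```python
# Artist pages order discographies by release date (newest first here).
# A wrong re-edition date pushes a 1977 album to the top of the list.
albums = [
    {"title": "Exodus", "release_date": "2020-01-01"},  # wrong: re-edition date
    {"title": "Kaya", "release_date": "1978-03-23"},
    {"title": "Uprising", "release_date": "1980-06-10"},
]
discography = sorted(albums, key=lambda a: a["release_date"], reverse=True)
# "Exodus" wrongly appears first; with its correct 1977 date it would be last.
```

One bad date is enough to scramble the whole list a user scrolls through.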

Build a solution

Collective intelligence

As a Product Manager, I am lucky enough to work with many different people. A part of my role is to maximize team efficiency to design the best solutions. To do so, I am a true believer in and advocate for collective intelligence. Collective intelligence is the ability to share ideas and knowledge within a group of people through collaboration, collective efforts and co-design. There are many brilliant people out there so it’s all about making sure everyone can raise their voice and make the most of their talent.

A talented Data Engineer on my team had the idea to compute what we soon called “original release dates.” Back to the idea of an album being a piece of art: not only does its release date need to be consistent over time, it also needs to reflect the first, original worldwide release. Original release dates are based on an algorithm that compares multiple internal and external sources and chooses the best one. Release dates are sent by labels, but there is another data source we should not ignore: music fans! On collaborative platforms like MusicBrainz, fans have listed almost every release date for almost every album in almost every territory. That is the beauty of crowd-sourcing.
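As a rough sketch of the idea (a simplification, not Deezer’s actual algorithm), picking the earliest plausible date across sources might look like this:

```python
from datetime import date

def original_release_date(candidates):
    """Return the earliest plausible date among candidate sources.
    Simplified sketch: the real algorithm compares and weighs sources
    in more sophisticated ways."""
    plausible = [d for d in candidates if d is not None and d >= date(1900, 1, 1)]
    return min(plausible, default=None)

sources = {
    "label_feed": date(2020, 1, 1),   # re-edition date sent by a provider
    "musicbrainz": date(1977, 6, 3),  # crowd-sourced original date
}
print(original_release_date(sources.values()))  # -> 1977-06-03
```

Cross-checking the label feed against crowd-sourced data is what catches the re-edition dates that would otherwise slip through.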

We decided to use this algorithm for all official albums in our catalog. What we call “official” is an album that is part of a discography manually verified and curated by our Metadata Curators. Original release dates then go through a whole technical pipeline: from a database to our catalog, where the backend serves them to all necessary clients such as iOS, Android, and web. This way the date is displayed throughout the product and can be used to order discographies.

I believe it is part of a PM’s job to hear what contributors have to say. The Data Engineer on my team had had this idea for quite some time but never had the chance to implement it, since the problem had not been investigated before. My job is to identify problems so that we can leverage collective intelligence and bring users what suits them best.

And… it is also my job to be able to measure the success of a feature.

Deliver, measure, iterate.

Before launching the feature, we iterated a lot to make sure original release dates were accurate. Thanks to a predetermined dataset of almost 2,000 albums, we were able to refine the algorithm and its methods. The goal was to determine the first-ever release date by comparing dates listed on MusicBrainz, dates given by providers, and dates indicated in copyright and producer lines (both legal dates that providers sometimes send). Each time we added a new comparison element, we reviewed the dataset manually to find out what worked best.

Excerpt of the dataset used to create the Original Release Date algorithm

I also decided to narrow the scope of this feature to ensure a reasonable time to market. Easier said than done. It was a good exercise for me, as a PM, to learn how to stay focused and realistic. In the end, we decided to limit our external comparison to MusicBrainz, at least at first. After hours and hours of thinking, trying, and learning, original release dates were born.

Yet, a feature in production is worth nothing if we cannot measure its success. This was the hardest part, where I faced a lot of constraints and limitations: how do you quantitatively measure something as qualitative as trust? I questioned myself a lot on the notion of trust. Is it one of the reasons why people churn? Yes. Is it the only one? No. How do we measure such a change, then? I went back to my first hypothesis: by computing original release dates, I assumed users would more easily find the content they want to listen to, so the bounce rate on artist pages, where the discography section lives, should decrease. Then came my second hypothesis, on trust: by computing original release dates, I assumed users would trust Deezer more as a music expert, and would therefore be less likely to churn for “missing content” reasons.

Taking a look at the Deezer community and users’ feedback, we were able to spot some errors and iterate again. For example, we allowed Deezer’s Metadata Curators to override original release dates — thus correcting some flaws in the process.
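The override rule itself is simple precedence; a hypothetical helper (names are mine, not Deezer’s) could look like this:

```python
def displayed_release_date(album_id, curator_overrides, computed_dates):
    """A manual curator override always wins over the computed
    original release date. Hypothetical helper for illustration."""
    return curator_overrides.get(album_id, computed_dates.get(album_id))

computed = {"alb_1": "2020-01-01", "alb_2": "1978-03-23"}
overrides = {"alb_1": "1977-06-03"}  # flaw fixed by a Metadata Curator
```

Keeping overrides separate from computed values means the algorithm can be re-run at any time without erasing curators’ manual fixes.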

The first step of a long and exciting journey

With original release dates, I had the chance to bring a product perspective to a problem that demanded both music and tech savvy. Thanks to collective intelligence, product vision, and technical efficiency, we clearly enhanced our music expertise. This is the first step toward enriching Deezer’s metadata for album fundamentals.

I am delighted I could take part in this beautiful cross-functional expedition. At the end of the day, this is exactly what Deezer needs: solid music industry expertise, a deep understanding of user behavior, a clear view of technical possibilities and constraints, and finally a product vision on trust and the power of metadata. This is the first step of a long and exciting journey!

If you would like to further discuss this topic, share your feedback, or if you have suggestions on how to measure metadata improvements, feel free to reach out!

And if you wish to be part of Deezer’s long and exciting journey, have a look at our open positions and jump on board!