UX Planet - Medium, two days ago at 17:02
What If… YouTube Music Translated Lyrics in Real Time?

This article is a UX case study on improving the multilingual music experience. It explores how YouTube Music could use a real-time lyrics translation feature to help users better understand songs in non-native languages, enriching the listening experience. The author took on both product and design responsibilities, and walks through the design process in detail, from user research, market research, and concept testing to A/B testing, covering how the feature is triggered, how translations are presented, and the settings options. The final design emphasizes readability, discoverability, and flow, and its decisions were validated through A/B testing. The article also discusses product KPIs, edge cases, and future directions.

🎧 The project tackles the problem that listeners of songs in non-native languages miss the meaning of the lyrics, which diminishes the experience; the core is designing a real-time lyrics translation feature for YouTube Music.

🎤 User interviews and market research confirmed the demand for real-time lyrics translation and informed lightweight personas, such as users who enjoy songs in non-native languages and users who want to understand the lyrics while listening.

💡 The design went through multiple rounds of iteration and testing, including concept testing and A/B testing. Testing showed that users prefer to see the original lyrics first, followed by the translation, and want the feature to be easy to discover without being intrusive.

⚙️ The final design covers how translation is triggered (e.g., a floating button), how it is presented (e.g., alongside the original lyrics), and settings options (e.g., translation language and script), and accounts for edge cases such as missing or overly long lyrics.

📈 The project also defines key performance indicators (KPIs) and proposes future directions, such as evaluating whether the feature belongs in the Premium plan and supporting crowdsourced lyrics improvements.

A UX case study on making multilingual music more meaningful.

My Role

With this project I identified a missed opportunity, so on top of my end-to-end design ownership, I took on product responsibilities (I wrote my own PRD, with AI help).

Trailer before the Movie

Imagine listening to a beautiful Punjabi or Spanish song but missing the meaning.
What if YouTube Music could translate lyrics for you in real time — without breaking the vibe?

Here’s the core experience — from triggering translation to how it integrates into the player. Deep dives on user research, configurations, and edge cases follow.

Now let’s dive deep into the process. KPIs have been captured at the end.

Why This Problem?

I listen to music every day, and often find myself drawn to songs in languages I don’t understand. To get their meaning, I had to Google lyrics or watch YouTube translations — pulling me out of the moment.

That made me ask:
What if lyrics translation happened within the listening experience itself?

Is This a Real Problem?

Market Research

I wrote a brief PRD for myself based on the research; you can refer to it here, or just take a look at my market research summary:

My Hypothesis

To enjoy their favorite songs in non-native languages, users would love to have optional, real-time lyrics translation.

User Interviews for Understanding in Depth

I interviewed 6 users to get a first-hand understanding of their music-listening experience.
Recruitment criteria: users should be interested in songs in non-native languages, especially ones they don't completely understand. Here are the major takeaways and excerpts:

Understanding user listening contexts (e.g., active discovery vs. passive listening) was crucial in forecasting the adoption and impact of real-time lyrics translation on user engagement.

Problem Statement

Multilingual music listeners need real-time translation of lyrics because it enables them to understand and connect with non-native songs during playback, enhancing their overall listening experience.

For whom?

Based on the limited research, I created the following lightweight personas:

2 lightweight personas

Use case details

Why only favorite songs?

Users don’t look for translation in every song from a non-native language.

Sometimes they connect with a song and want to understand and experience it during playback, and that’s when knowing the meaning would be of real help.
In most cases, they wouldn’t open the translated lyrics unless the song has something unique in terms of trends, emotional relevance etc.

What kind of Translation?

Script vs Language: Both full translation and transliteration (e.g., Hindi in Roman script) are valid use cases.

2 types of translation — Script and Language
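To make the distinction concrete, here's a minimal Python sketch of the two modes. All names here are hypothetical illustrations, not YouTube Music's actual API:

```python
from enum import Enum

class LyricsMode(Enum):
    TRANSLATION = "translation"          # meaning, rendered in the target language
    TRANSLITERATION = "transliteration"  # original words, rendered in the target script

def secondary_line(translated: str, transliterated: str, mode: LyricsMode) -> str:
    # Pick which secondary line to show beneath the original lyric.
    return translated if mode is LyricsMode.TRANSLATION else transliterated

# Example with a Hindi line ("तुम ही हो"):
print(secondary_line("You are the one", "Tum hi ho", LyricsMode.TRANSLATION))      # You are the one
print(secondary_line("You are the one", "Tum hi ho", LyricsMode.TRANSLITERATION))  # Tum hi ho
```

Both modes operate on the same original line; only the secondary rendering changes.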

Should Translation Always be On?

Big NO — it’s a complementary feature for very specific sets of users with explicit use cases.
In settings, users can turn it on by default for any genres of songs they choose; otherwise it is triggered manually.
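That opt-in behavior can be sketched as a simple check, assuming a hypothetical per-genre preference set (not an actual YouTube Music data model):

```python
def should_auto_translate(track_genre: str, enabled_genres: set) -> bool:
    """Hypothetical opt-in check: translation turns on automatically only for
    genres the user enabled in settings; everything else needs a manual tap."""
    return track_genre in enabled_genres

enabled = {"punjabi pop", "latin pop"}
print(should_auto_translate("latin pop", enabled))  # True: user opted in
print(should_auto_translate("k-pop", enabled))      # False: manual trigger only
```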

I looked at Existing Solutions…

The apps don't support this today, so users must be achieving the same goal some other way. I checked the most frequently used ways of reading translations.

some references of lyrics translation

Ideation and Low-Fidelity Exploration

I started with paper sketching and asked myself some key questions:

Concept Testing

To move swiftly, I wanted to validate the progress with the users.

1 data point that I needed an answer to:
- Show lyrics first, then translation, or the reverse?

Two variants used for concept testing

Major Learnings to Incorporate

I tested the prototype using Figma mirror, again with the same 6 users, and here’s the key takeaway:

Users want to see original lyrics first. Translation should follow to preserve structure and rhythm.
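That "original first, translation second" ordering can be sketched as a simple interleaving of the two line lists (a hypothetical rendering helper, not production code):

```python
def interleave_lyrics(original: list, translated: list) -> list:
    """Render each original line first, with its translation directly below,
    so the song's structure and rhythm are preserved."""
    lines = []
    for orig, trans in zip(original, translated):
        lines.append(orig)
        lines.append(trans)
    return lines

print(interleave_lyrics(["तुम ही हो", "अब तुम ही हो"],
                        ["You are the one", "Now you are the one"]))
```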

Designing for Discoverability

Before finalizing the core product, I explored banner designs to announce this feature in-app.

The feature demo video will need a few more conditions defined, to ensure users see the demo in the genre that interests them most.

Feature Announcement

A/B Testing with 28 Users (Using Figma Mirror)

Criteria: open to all music lovers, not just multilingual music lovers. (I wanted to see how different the reactions would be, since the feature would be available to everyone.)

Things to be tested (2 Versions — 4 variations):

After testing, I asked them to fill out a Google form to help answer a few qualitative and quantitative questions.

Two design variants for A/B testing

Reading between the lines

Once I went through all the feedback in detail, here is what I understood the users want:

Learnings from A/B testing

Trigger for Translation

I went down a spiral of explorations, but the A/B testing gave me a solid direction to pursue.

Final design for translation trigger and a few rejected iterations

Translation Experience

You’ll see a few of the many iterations and why the final design is MY CHOICE here.

Italic text wasn't used in the final design due to accessibility concerns. None of the users in the A/B test had any kind of visual impairment, but italics aren't the best choice at scale.

Configuration in Settings

Translation Language — English by default (assuming most YouTube Music listeners understand it, or would know to change it in settings)

Translation Script — the script of the most-listened song genre by default (e.g., Devanagari for Hindi, or Latin). I'm assuming that users who want to read the script would be hardcore listeners/linguists.

Configuration in settings for Translation
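Those two defaults can be summarized as a small resolution function. This is a hypothetical sketch of the fallback logic described above, not an actual settings schema:

```python
def resolve_defaults(user_language=None, most_listened_script=None):
    """Hypothetical defaults resolution: translation language falls back to
    English; transliteration script follows the most-listened genre's script,
    falling back to Latin when no listening history is available."""
    return {
        "translation_language": user_language or "English",
        "transliteration_script": most_listened_script or "Latin",
    }

print(resolve_defaults())                         # both fall back to defaults
print(resolve_defaults("Spanish", "Devanagari"))  # user overrides win
```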

Product KPIs

Here are some key KPIs for me:

Some Edge Cases

Here’s when translation wouldn’t be available:
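Two of the conditions noted in this case study are missing lyrics and overly long lyrics. As a sketch (with a hypothetical line cap, since the actual thresholds aren't specified), an availability guard might look like:

```python
MAX_LYRIC_LINES = 200  # hypothetical cap for the "lyrics too long" case

def translation_available(lyrics) -> bool:
    """Translation is only offered when lyrics exist and fit within the cap."""
    if not lyrics:          # no lyrics attached to this track
        return False
    return len(lyrics) <= MAX_LYRIC_LINES

print(translation_available(None))           # False: no lyrics
print(translation_available(["line"] * 10))  # True
print(translation_available(["line"] * 500)) # False: lyrics too long
```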

What’s Next?

If I were working on this product, I would:

Major Learnings

I’m super grateful to my mentors from ADPList and my close friends for helping me out with design feedback when I needed it the most.

_________________________________________________________________


If you liked this project, don’t shy away from rewarding me with some claps! 👏👏👏


What If… YouTube Music Translated Lyrics in Real Time? was originally published in UX Planet on Medium, where people are continuing the conversation by highlighting and responding to this story.
