Meet the man with an impossible job: cleaning up YouTube

A Q&A with Neal Mohan, YouTube’s chief product officer.

YouTube chief product officer Neal Mohan at a YouTube event on February 28, 2017.
Jeff Kravitz/FilmMagic for YouTube
Peter Kafka covers media and technology, and their intersection, at Vox. Many of his stories can be found in his Kafka on Media newsletter, and he also hosts the Recode Media podcast.

YouTube, the world’s largest video site, has 2 billion users. Most of them just use YouTube to watch … anything.

But some YouTubers put their own videos on the site, too. And those people do that a lot: They upload a staggering 500 hours of video a minute. That’s every. Single. Minute.

How do you police all of the video and keep the worst stuff — like videos that exploit children, promote extremism, or push hate speech — off the site? That’s Neal Mohan’s job.

The easiest way for Mohan, YouTube’s chief product officer and its de facto No. 2 executive, to wrangle YouTube would be to put a wall around it.

But the idea of an open platform, one that allows users to upload anything they want — requiring YouTube and its users to find odious stuff after it’s already on the site — is core to YouTube (as well as to many of Silicon Valley’s most successful companies, including Facebook and Twitter). It’s an ideological imperative, as well as a business and legal one.

So YouTube’s going to stay open, Mohan told me last week in an interview, which you can listen to right now on my podcast Recode Media.

“I wouldn’t be working at YouTube if I didn’t believe that having an open platform, where ... anybody on the platform can have a voice, no matter where they came from in the world, isn’t an important founding principle.”

So instead of locking YouTube up, Mohan and his team are trying to tame it as best they can, with computers, humans, and a set of constantly updated guidelines for those computers and humans to follow.

During my conversation with Mohan, he returned, over and over, to those guidelines and the work the company has done to update them over the last few years.

That emphasis surprised me: I would have thought the problem was the sheer volume of horrible things people are uploading, which is why YouTube took down 8.3 million videos in the first three months of this year. The company uses a combination of software and humans — at least 10,000 people have been hired to help flag offensive content — to find and remove those videos.

But if I understood Mohan correctly, he’s arguing that computers and humans can’t do anything without rules to follow. And that YouTube thinks refining and changing those rules is core to the work it’s doing to clean up the site. He’s also arguing that those rules will have to allow some videos that you might not like to remain on the site.

“In some cases, some of those videos … might be something that lots of users might find objectionable but are not violating our policies as they stand today,” he said.

That makes sense (though Bloomberg has reported, convincingly, that YouTube turned a blind eye to some of its worst content because it was more concerned about increasing engagement). But it doesn’t explain a recurring story for YouTube, where users or journalists find offensive (or worse) videos and point them out to YouTube, which then takes them down.

An example I discussed at some length with Mohan: Soph, a 14-year-old girl whose videos “lecture her hundreds of thousands of followers about Muslim ‘rape gangs,’ social justice ‘homos,’ and the evils wrought by George Soros,” as BuzzFeed’s Joseph Bernstein reported this month.

After Bernstein flagged two particularly upsetting videos to YouTube — including one containing a death threat against YouTube CEO Susan Wojcicki — YouTube removed them.

But Soph’s channel, which now boasts more than 900,000 subscribers, is up and running. Mohan wouldn’t comment directly on Soph’s output, but presumably the stuff that remains on YouTube falls into the objectionable-but-not-something-that-violates-YouTube-policy category he referenced.

I understand Mohan’s argument that letting the Sophs of the world post what they like — and taking their worst stuff down if it crosses particular lines — is fundamental to YouTube.

But the fact that YouTube’s policies, software, and army of evaluators didn’t flag a popular video creator making death threats against the company’s CEO until a reporter pointed it out suggests that YouTube has a structural problem. And it’s not one it can solve with tools, rules, and people.

You can listen to Recode Media wherever you get your podcasts — including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts, and Overcast.

Below, we’ve shared a lightly edited full transcript of my conversation with Neal.


Peter Kafka: Hi, Neal. You’re chief product officer at YouTube?

Neal Mohan: That’s right. Hey, Peter.

Hello. Nice to see you again. What is the shortest way to describe what chief product officer at YouTube means? I think of it as No. 2 guy at YouTube.

Well, I’m responsible for building all the products at YouTube that our creators and our viewers use every single day. I work very closely with our advertiser-facing team, as well. As you know, I came from the advertiser side of Google’s business and so really just looking after the products that support our ecosystem.

So you touch everything, so we can talk about everything. We have limited time, so we can’t talk about everything. Every week or two weeks or three weeks I read a story about YouTube and content problems. Sometimes they’re short stories. Sometimes they’re very long stories. I want to ask you about a couple of specific things, but my general question is: Whenever you guys get asked about “We found this piece of content we don’t like, or it’s objectionable,” or worse, on YouTube, the answer you guys usually provide is, “We’re really big. We’re an open platform. We have 2 billion people. This is a really hard challenge to solve.”

And my big overall question is: Can you sort of break down how much of what you’re trying to figure out, in terms of objectionable content at YouTube, is a scale and math and computing problem, and how much of it is, “We’re a giant platform. We let everyone upload whatever they want, and we’re always going to have stuff that people find objectionable”? How much of this is a technical problem versus an ideological debate?

Well, I’m happy to kind of break it down into more detail as well, but just starting out on first principles, I will say that YouTube started — and you know, you’ve been, obviously, following it for quite some time — as a platform where anyone could upload a video and share it with the world. It does have this basis of an open platform, and freedom of expression is an important concept that goes along with that ability to express yourself. I mean, it’s YouTube. It’s about broadcasting, broadcasting yourself in that manner.

Having said that, even from the very early days of YouTube we’ve always had a set of community guidelines, and I would say that what’s happened over the course of certainly the last decade, but in particular over the last couple of years, is that those community guidelines are the part that needs to evolve. And so while it is an open platform and remains that way, the community guidelines need to evolve with the nature of YouTube today compared to where it was a decade-plus ago. Part of that is that it’s a larger platform. We’ve kind of grown up, if you will, from this small village where everybody kind of knew each other. The creators actually knew each other.

Originally, if you had a complaint about something you wanted taken down off YouTube, there was a woman you called. You called her on her cellphone.

And the notion of, actually, users flagging it was something that worked with that scale. Now we’re kind of a big metropolis where you need sort of rules of the road, which is what our community guidelines have evolved to, and also a more sophisticated enforcement mechanism, which I’m happy to describe as well. To your point about whether that’s technology-driven or not, I think it’s a combination of both technology as well as — as you know, we’ve invested a lot in terms of people.

It strikes me that you have this platform. Anyone can upload whatever they want, but there are various times where you guys step in and say, “We’re not going to allow this kind of content and we’re going to take steps to take it down,” whether it’s ... A couple years ago it was ISIS-related stuff. Now you’ve got a set of guidelines for sort of borderline content that you want to stamp out.

But other times you’ve got ... I was at your Brandcast event for advertisers, and this is a pledge you’ve made now a couple of years running: You’ve got something called Google Preferred, which is basically sort of a clean, well-lit space where you guys say, “All the videos in this space that we want you to advertise in, we are going to have a human being review them.” Right?

So you’ve already sort of narrowed the focus of, “Anything can go on YouTube,” to say, “Here’s this specific part of YouTube that we’ve really cleaned up.” So when you want to, right, you guys can step in and do certain things to clean up certain parts of YouTube. Is there a way to sort of scour the entire thing or is that too technically difficult? And/or could you change the way YouTube fundamentally operates so it’s not an open platform and there are more gates?

I would say it’s actually ... The way that I think about it, it is actually more of the former in the sense that I do believe ... And remember Google Preferred, as you know very well, is a small subset of our overall corpus and ...

But your most valuable real estate, right? It’s the stuff you are telling advertisers, “This is where we really want you to spend your time and money.”

Well, it’s the place that we showcase in our Upfront event at Brandcast, but it’s certainly not the only place advertising runs; it’s not just limited to that. Advertising runs across the broad swath of our corpus as well. But to the core of your question, what I would say is, our aspiration is to make sure that we are enforcing our community guidelines across the entire swath of our corpus. Let me say that very clearly. And the best way to do that can be a combination.

What I have found — and machine learning technology and artificial intelligence, of course, is evolving and has made leaps and bounds in the last couple of years — is that a combination of machines and humans, people, is really the most effective way.

Let me give you a concrete example of what I mean. Machines are good at operating at scale. Take the example that you cited, around violent extremism. What machines can do is enqueue videos that might be candidates for policy violations — takedown, if you will — but they have a harder time making nuanced decisions. So machines can cover our entire corpus and enqueue a bunch of videos, but it has to be human beings that ultimately make the final decision. What the machines have done is reduce those decisions down from, let’s say, hundreds of millions of videos, or what have you, to, whatever, the thousands.

It makes the work more manageable.

It makes the work more manageable, but it’s the human being that can tell the difference between an NGO documenting war-time atrocities versus a video that could look and feel the same way, but is actually a recruitment or propaganda video for a terrorist organization.
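To make the division of labor Mohan is describing concrete, here is a minimal sketch of that two-stage triage: classifiers score the whole corpus and enqueue likely violations, and human raters make the nuanced final call. Every name, threshold, and signature below is hypothetical — an illustration of the pattern, not YouTube’s actual system.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical score above which a video is enqueued for human review.
ENQUEUE_THRESHOLD = 0.7

@dataclass
class Video:
    video_id: str
    score: float = 0.0  # stand-in for a classifier's violation likelihood

@dataclass
class ReviewQueue:
    pending: List[Video] = field(default_factory=list)

def triage(uploads: List[Video], classify: Callable[[Video], float], queue: ReviewQueue) -> None:
    """Stage 1: machines scan everything and enqueue candidate violations."""
    for video in uploads:
        if classify(video) >= ENQUEUE_THRESHOLD:
            queue.pending.append(video)

def review(queue: ReviewQueue, rater: Callable[[Video], str]) -> List[Video]:
    """Stage 2: a human rater makes the final, nuanced decision — e.g., telling an
    NGO's atrocity documentation apart from a terrorist recruitment video."""
    removed = [v for v in queue.pending if rater(v) == "violates"]
    queue.pending.clear()
    return removed

# Toy run: the classifier narrows a vast corpus down to a reviewable queue.
queue = ReviewQueue()
triage([Video("garage-door-howto", 0.02), Video("borderline-clip", 0.91)],
       lambda v: v.score, queue)
print([v.video_id for v in review(queue, lambda v: "violates")])  # ['borderline-clip']
```

The point of the pattern is the one Mohan makes: the machines don’t decide anything; they just shrink hundreds of millions of candidates down to a queue small enough for people to judge.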

Is that combination of computers and human beings, can that fundamentally scale up to the 2 billion users growing all the time? What’s your latest stat on how many minutes of content are uploaded per minute?

I think it’s on the order of 500 hours are uploaded every single minute.

Every minute, right.

Every single minute.

It seems like this is sort of an impossible task for humans to solve, and ultimately, if you’re really going to solve it, it’s got to be computers. Or maybe it’s just not solvable.

I think that at the current state of the art it’s going to be a combination of machines and also human beings. I think that we’ve made a lot of progress in the last two years. The work is by no means done. I’ll be the first to say that, but just in the last couple of years we’ve ... First and foremost, we’ve updated our policies. We’ve updated over 30 policies to be much more precise and up to date in terms of the type of content we allow and don’t allow. That work is always ongoing. As I said, we’ve hired up to 10,000 human beings to evaluate this content, and we’ve built dozens of machine classifiers to detect this content.

What that resulted in is what you see in the transparency report, for example, that we issue every quarter now, where we take down on the order of, I think, eight, nine million videos every single quarter. That represents a tiny fraction of our overall corpus, but the problematic content tends to be relatively small in terms of number of videos compared to the rest of the corpus. But we are actioning millions and millions of videos every quarter now.

Sometimes the stuff that people will find seems extremely pernicious and also extremely hard to find, right? They’ve clipped in objectionable stuff sort of in the middle of the video where it’s harder to find or the objectionable stuff is in the comments where you guys might not have been looking to begin with. I still don’t know why you have comments at all.

Then occasionally, like ... So there’s a BuzzFeed report that came out a couple of days ago. It said, “Here’s a YouTube star. She has 800,000 followers. She’s a 14-year-old girl. She says completely terrible stuff. Here’s an example of the video.” As of yesterday, it was up. Now you guys have taken it down. There’s also a video where she threatens your boss, Susan Wojcicki, with murder. But this is someone who’s ... This is not someone sort of at the far edge of YouTube, right? She’s got close to a million followers. It’s hard to imagine how someone like that can have that kind of popularity and not get picked up sooner than by a BuzzFeed reporter. How do you deal with that level of problem?

Yeah, what I would say is that there’s a few things there ...

By the way, she’s still on the site. You’ve taken down a couple of her videos, but she’s still there.

Yeah, and I think that it all has to do ... Different cases are different, but what I would say is that at the core of it is the policy. It really breaks down into: Do we have a policy that that channel or that video — and we tend to focus on the video as the unit — is violating? And is there enough of a violation of a specific policy to strike that video, to take it down? That’s one element.

The other element is what we said which is, do we detect those videos? Do we try to detect those videos as quickly as possible and then do we enqueue them for this enforcement act?

What does enqueue mean?

Enqueue means set them up so that they’ve been identified for a human being to actually take a closer look at. The way that we set our goals is, we of course want to remove policy-violative content as quickly as possible, with as few views by actual people — our users — as possible. That’s the goal we aspire to. We aspire to that number being zero. Of course, we’re not at zero today. Every day we get better, but our systems are not perfect.
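As a rough illustration of the goal he’s describing, here is how a “views before removal” metric might be tallied — the aspiration being that 100 percent of violative videos come down with zero views. The record format is hypothetical, not YouTube’s actual reporting pipeline.

```python
from statistics import median

# Hypothetical takedown log: (video_id, views accumulated before removal).
takedowns = [
    ("a1", 0), ("b2", 0), ("c3", 4), ("d4", 0), ("e5", 312),
]

views = [v for _, v in takedowns]
zero_view_share = sum(1 for v in views if v == 0) / len(views)

print(f"Removed before a single view: {zero_view_share:.0%}")  # the aspirational 100%
print(f"Median views at removal: {median(views)}")
```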

I get that you guys have talked about the Christchurch videos and how difficult it was for you to handle that and the steps you took. Again, I can sort of get ... I still don’t understand why there are thousands of people sending objectionable videos at once. That’s a different question, but that’s a technical problem, with sort of a fire alarm going off and you guys trying to rush to deal with it.

In the case of this girl, Soph I guess is her name, she’s been around for a long time. Lots of people have watched her videos. It’s been out there for ... She has a large following. So how do you sort of suss that problem out?

Yeah, so it goes back to these pieces. One area that we are constantly looking at — and, just to give you a little bit of insight into how some of these policies evolve, one area that turns out to be quite hard in terms of actually determining where you draw the lines or don’t draw the lines — is the area of hate and harassment. We have a set of policies. They’re on our website, in terms of hate policies on our platform, and of course anything where there is incitement to violence or specific physical threats against an individual — you mentioned one earlier — then that video would be struck.

Again, it was sitting out there for a long time. So how does that not surface?

That’s a combination of things, right? One is, does the video actually violate our policies? Are our policies drawn in the right way? We’re constantly looking at our policies, including our hate and harassment policies. The second part is, are we detecting it quickly enough, and are we taking an enforcement action on it quickly enough? And so what I would say is that all three of those elements are evolving, and we’re not perfect. We get better every day, but we’re not perfect about them.

I just want to go again, one last time ...

But on this specific channel, there are aspects in terms of — there are multiple videos, for example. And so what our raters will do, if some of those videos are enqueued as potential candidates for policy action, is review them against our policies. And in some cases, some of those videos ... The content might be something that lots of users might find objectionable but is not violating our policies as they stand today. That doesn’t mean that our policies won’t evolve over time, but it might mean — and I’m not speaking specifically about any video on this channel, I’m giving you a more general answer — that explains why something might look to you like, “Well, why didn’t they actually take an action on that, or what happened here?”

But this is the thing, you did take an action once the BuzzFeed story went up. The BuzzFeed story went up, the video was there, then it went down, then another video came down. So obviously, you guys say the BuzzFeed story …

And that goes to the second piece that I’m describing, which is we might have the policy in the right place, we might be happy about our enforcement guidelines around that policy, but then it’s also up to our machines or our trusted flaggers or users to flag that content to then be enforced. And as I said, we strive to be as perfect as we can there, but we are not. And every day we get better at it, but we’re not ... It’s a problem that we’ll continue to work to get better at.

But our detection ... If your question is ...

My question is ...

”Do we have 100 percent ...”

No, no. But my question is, if a BuzzFeed reporter can find this stuff, it seems like the people you’re paying to find this stuff should be able to get it before he gets to it.

And our machines find an enormous amount of content — in terms of the percentage of content that might be potentially violative — on the order of what I described: Eight, nine million videos every quarter are coming down that our machines are finding. The vast majority of those videos, we are finding without even a single user actually having seen them. So they’ve been uploaded to our platform, our classifiers have enqueued them, and raters have made a judgment on those videos even before a single user has seen them.

Do you guys ever wonder if maybe we just shouldn’t have this platform be open? Maybe there should be some sort of process where you need permission to upload something?

I think about it. I think others here at YouTube think about it. I wouldn’t frankly be working at YouTube if I didn’t believe that having an open platform where this mission of ... Where anybody on the platform can have a voice no matter where they came from in the world isn’t an important founding principle. I do think that that is something that’s core to our platform.

But again, having said that, that by far doesn’t mean that anything goes. We do have community guidelines. Those community guidelines have been strengthened, as I said, over 30 times; 30 new policy adjustments over the last two years. And we’re going to continue to make those next month, over the next quarter or year, etc., so that our community guidelines can evolve with the scale of our platform. But I think those two things go hand in hand.

That eight or nine million pieces of content that you pull down, that’s over what period of time?

Every quarter.

Every quarter, that’s eight or nine million pieces of objectionable garbage.

Video that comes down that violates our policies.

Comes down, that you don’t even need a human being to step in.

Lots of those videos are videos that were identified by our classifiers. Some portion of those videos are actually identified by trusted flaggers or users, etc. And also, the majority of those videos were taken down before a user saw them. They might’ve actually been evaluated by an employee, a rater. But a user or a viewer didn’t see them.

So you guys use this stat — and Susan had a similar stat at the Brandcast event — to say this many videos were taken down before even a single view happened, and you’re justifiably proud of doing that. To me it also seems like, man, there’s that much garbage floating around that people are injecting into your system, that you’ve got to go deal with. And there are lots of other media companies that don’t have to deal with this. They have their own problems, but they don’t have people sending eight to nine million pieces of garbage at them every quarter.

Yeah. You know this, Peter; as I said, 500 hours of content are uploaded every single minute. The corpus is large. One of the things that I think does make YouTube so special is this diversity of content that you find on the platform. Everything from ... Last weekend, for example, somebody tweaked the door on my garage. Basically, I looked up on YouTube how to fix my garage door — I’m not good at that kind of stuff, but I was able to do it after a few minutes. And so there are billions of those types of examples ...

I got a very good lamb shoulder recipe from YouTube, so thanks for that.

Yeah. And it probably had a positive impact on your life.

Tastes good. People are impressed when I make it. It’s with pomegranate and rosemary.

You’ll have to make it for me next time I’m in New York.

I will.

So we mentioned this Brandcast event a couple times. This is your big event you do for advertisers. I noticed that Susan didn’t spend a lot of time talking about the garbage on YouTube, which makes sense; you want to celebrate what you’ve got there. But when you are talking to advertisers, and publishers for that matter, what kind of response and feedback are you getting from them today? And how has that changed over the last couple years?

Yeah, so as you can imagine, I spend a lot of my time talking to advertisers. Susan does as well, and it’s been kind of a continuous conversation. I believe that they have seen the progress over the last couple of years, and I think they have seen it in a few buckets. They’ve seen it in this general bucket of how we approach problematic content on the platform, where we remove violative content as quickly as possible, we raise up authoritative voices, and we reduce some of even the borderline content from recommendations — the change that you mentioned. And so they appreciate that part of it.

They also appreciate the fact that we’ve built in more controls for them to be able to manage where their advertising campaigns run on the platform. And I don’t mean just Google Preferred or not, or this channel versus not, but also taking into account what is appropriate for their brands and giving them that level of control. We’ve worked really hard in terms of giving them third-party verification of the nature of their campaigns, where they’re running, the types of videos they run on; working with people like DoubleVerify, IAS, etc. And so they appreciate that whole spectrum.

And one really big piece — which doesn’t come up that much, but I should highlight it to you because my insight has been that it’s actually been a big part of the positive reception we’ve gotten from advertisers over the last couple of years — is the changes we made to our YPP program, the YouTube Partner Program, where we established thresholds of a thousand subscribers and 4,000 hours of watch time.

This is for a regular person who wants to upload videos and be a YouTube star, or just do it for fun?

Correct. Any YouTube creator who is looking to build an audience and monetize it now has to hit a certain threshold on their channel in terms of subs, subscribers, as well as watch hours in order to even apply to the Partner Program. And then the channel is manually reviewed before they’re actually even allowed into the Partner Program. And that was something that was received positively, of course, by advertisers, because now there’s both a threshold and a human vetting of the channels that are eligible for their ads to run on.

Secondly — and this was, in retrospect, not so surprising, but it was a positive thing — our YouTube creators view that as positive as well, because the dollars are now going to creators that are creating real value for the overall ecosystem.
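For illustration only, here is a minimal sketch of the eligibility gate Mohan outlines above: threshold checks on subscribers as well as watch hours, followed by a manual review before a channel can monetize. The figures are the ones he cites; the function and field names are hypothetical.

```python
from dataclasses import dataclass

MIN_SUBSCRIBERS = 1_000   # threshold Mohan cites
MIN_WATCH_HOURS = 4_000   # threshold Mohan cites

@dataclass
class Channel:
    name: str
    subscribers: int
    watch_hours: float

def may_apply(channel: Channel) -> bool:
    """Threshold gate: subs as well as watch hours must clear the bar
    before a channel can even apply to the Partner Program."""
    return (channel.subscribers >= MIN_SUBSCRIBERS
            and channel.watch_hours >= MIN_WATCH_HOURS)

def admitted(channel: Channel, passed_manual_review: bool) -> bool:
    """Channels that clear the thresholds are then manually reviewed
    before they are allowed in and eligible for ads."""
    return may_apply(channel) and passed_manual_review

print(admitted(Channel("growing-creator", 1_200, 4_500.0), passed_manual_review=True))  # True
print(admitted(Channel("brand-new", 40, 12.0), passed_manual_review=True))              # False
```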

Right. There was an initial burst of people saying, “You’re demonetizing me, you’re making it harder for me to make a living,” and your response generally was that the people at the margins weren’t making real money to begin with.

Well, there’s the dollar amount and hitting those thresholds. Those are real thresholds, but they’re not ... If you’re a growing creator, you’ll reach that level. But I think the key thing was an evaluation to see that this is original content, content that’s adding real value to our ecosystem, and therefore is something that advertisers, in general, would be happy to run on. That’s sort of the principle now of the YouTube Partner Program.

And so back to your question on advertiser response: For all of those reasons, all of those changes, the response has been positive. Again, does that mean that every single issue is resolved? No, I wouldn’t say that, and it’s an ongoing conversation, and frankly partnership with these brands. But one of the key things that they also recognize is, basically everything else that you saw at Brandcast, which is the audience that they’re looking to reach is on YouTube, they’re engaged. Every time a user opens up the phone, they’re there for 60-plus minutes. And so they have a vested interest in working with us to address some of these challenges as well, and that’s what they tell us.

You mentioned watch time and engagement, and for a long time you guys were really focused on increasing that, getting to this billion-hour mark that Susan’s predecessor had set. Are you guys still optimizing for engagement, or do you have a new goal now?

So there’s always been an evolution here, and engagement is just a blurry term, so let me be a little bit more specific with you in terms of how we think about this. And what I will say, just again to be very clear, is that our first and foremost priority is responsibility around the content that’s on the platform. And that means three things: It means removing the content that is policy-violative; raising up authoritative sources when users are looking for information — including things like Christchurch, where we had our breaking news shelf and authoritative ranking in our search results — and then reducing content that might be spreading harmful misinformation. That is our top priority overall. Our objectives for the company are oriented around that. Those are our primary objectives.

We also obviously look at things like satisfaction of our users, and satisfaction of our users can be viewed in many different ways. And so that’s where we measure how our users are using our platform. We also have metrics in terms of satisfaction of our creators. Creators, obviously, are looking to grow their audience, but they’re also looking to establish more connections with their audience. So features that we have — like community posts, which you can do now, and a YouTube version of Stories — those are the features that go into that. And creators are also looking to make money on the platform, and so that’s advertising, what we talked about with brands.

So there’s a lot of stuff you want to do, but for a while you had this overall company goal: We want to get to a billion hours of watch time. And that was a big focus, and you did a bunch of things to get there. So has that been replaced with something specific?

So our top-level goals are the ones that I described in terms of ... We put them in this bucket of responsibility for the content on the platform. Then I’m describing some of the other goals that exist, including user-facing goals and creator-facing goals. We have advertiser-facing goals as well. And that’s the way that our objectives are organized for YouTube overall this year.

We try to set those on a yearly basis, and that includes things like how long users are spending time on the platform, but also includes things like surveys that we run. For example, you might have seen them, you might get them after you watch a video. How satisfied were you with this, in terms of YouTube recommending it to you?

Never asked, but I’m happy to tell you, if you want.

Have you been satisfied?

Yeah.

Okay.

I like it, I like it. I’m a little worried about my kids getting into it.

Yeah. So, those are the ways that we... Because I think, again, as the person responsible for the products, my experience has been those types of goals are the things that prove the efficacy of the product, for our creators and our users.

YouTube has always been free, and then you guys have introduced this thing called YouTube Red, which is both a music service and then also premium content, and then again at this Brandcast event you guys were putting money into making your own movies and TV shows.

You mean YouTube Originals.

YouTube Originals. Then at the Brandcast event you said, “Actually, all the YouTube Originals we do from now on, we’re going to make these free for users.” So, what does that tell us about this subscription product you guys have?

So, the subscription product is called YouTube Premium now. It’s been rebranded. We also have within that a subscription product called YouTube Music Premium. As you know, we built our standalone YouTube Music app, which is around music, and obviously it still features video, because it’s YouTube, but it also has an audio-forward type of experience, as I’m sure you’ve played around with.

Yeah. We had Lyor Cohen in to tell us about it.

Oh yeah, that’s right. Lyor. Yes, I remember that. I remember that podcast fondly.

There’s an unedited version too there.

All right, I’ll have to ask him about that. So, YouTube Music Premium is the subscription service for that, where you get background play, offline, without ads or interruptions. So, that business has been growing, continues to grow. We keep adding subscribers. Our goal there is to continue to roll it out worldwide. I think we’re in over 40 countries now, including …

But what does it mean when you take this thing that you had to subscribe to get, and now you’re saying, “We’re going to bring this basically in front of the paywall”? There’s a whole move across media to get people to pay for access to stuff, and you’re saying, “Here’s stuff that we’re actually going to put in front of the wall.”

Yeah, I think that’s a good question. I would say — and Susan has talked about this before as well — I just think YouTube is different from a lot of those other platforms, and I don’t mean just in terms of its business model. Obviously we’re an advertising-supported platform, primarily, even though our subscription business is growing, and growing nicely, and we’re happy with the results there and we want to make that global as well, just as our advertising business is. So, we’re going to continue to do that and have both those pieces.

One of the things that we thought about was, what is the most immediate way to give as wide of an audience to some of this original content that we have been producing? It’s something our advertisers were asking for too, which is, could they be associated with some of these YTO productions that we had? So, we made the decision in this Brandcast to actually ...

I’m reading it as a shift. You guys saying, “We have this stuff and not enough people are seeing it, and we’ll do better if we bring it out.”

I mean, you know the scale of our ad-supported business relative to where subscription businesses are. The other point that I would say — and I think that we need to be clear about this — is that music is front and center in our subscription business. Music is one of those habits that’s a daily habit. Everybody listens to music on a daily basis, and we want that to be a star when it comes to our subscription service. We find that users who are music listeners are users who tend to be happy with our subscription service. So our orientation, and I’m sure Lyor talked about this as well, for our paid subscribers is ...

Music.

Is music first.

Two more questions and then you’re allowed out of here. You have this … YouTube Live? What’s the OTT service called?

YouTube TV.

YouTube TV.

Yeah.

So, it’s essentially another version of the cable bundle. I pay you guys, I think, $45 a month now to …

We just raised it to $49.99.

Right, you raised it to $49.99. You added more channels. I think you’re still probably losing ... Everyone who has a version of this basically is saying, “We’re losing money on this and we’ve got to raise our prices eventually.” Of all the people who are trying to sell me TV over the internet, you guys seem best situated to say, “We’re going to sell this at a significant loss so we can build up scale, and by the way, we have a reason to do this, because if we get a lot of people to subscribe to YouTube TV, we’re really good at advertising.” So why haven’t you guys, instead of raising your price to 50 bucks, gone to 20 bucks or some bargain-basement price and tried to get as many subscribers as possible?

You know, it really is a matter of optimizing for what we think will be a business that continues to grow in terms of households that we’re bringing online, and healthy economics of the business. As you said, getting to enough scale in terms of subscribers where it’s an attractive proposition for our brands and advertisers who are already telling us that, “Hey…” Actually, it was a lot of brand and advertiser input that had us feature YouTube TV in the way that you saw at Brandcast a couple weeks ago, where for the first time ever we actually featured YouTube TV inventory as a lineup within Google Preferred.

You guys — just to beat this into the ground, because it’s confusing — I understand why maybe Hulu or some of the other people, AT&T, why these people are trying to figure out how they can get close to profitability and are reassessing that business. But you guys have billions of dollars. You’ve spent billions of dollars on all kinds of things in the past. This is directly a business that you can benefit from. Why not just blow it out?

So, there were two types of feedback that we’ve gotten since we launched YouTube TV, and as you know, it’s been in the market now for a couple of years. One is very positive in terms of the product. People love the product features: the DVR in the cloud, the power of Google-powered recommendations on content. All the magical stuff of, “Oh, wow. My Warriors game was recorded, and by the way, it didn’t cut off with two minutes left. It got the whole thing.” That kind of stuff. So, we’ve gotten a lot of positive feedback about the core product itself.

Another area where we’ve gotten feedback is the channel lineups and the content that we’ve had. So, one of the things that we have done in response to user feedback, periodically over the course of the last couple years, is actually add content. So, the bundle that started when we launched it is not the bundle that we have today.

Right. It keeps getting fatter and fatter.

With added channels, because we got feedback. For example, sports is a very big use case on YouTube TV. We got feedback that, “Hey, if I’m an NBA fan, I have expectations that you’re going to give me pretty broad coverage of NBA games.” We got feedback that there are certain channels that were very attractive to users, even though they’re not sports fans. So, those were some of the channels that we added, for example, in the most recent announcement.

I still think if any company is going to come out and not replicate the cable bundle, it should be you guys.

I don’t think that we’re at the point where we’re nearly where some of the existing products are in terms of scale of our channels. I think that we, at this point, feel pretty comfortable about the size of the bundle that we have, and I think that you should look for more product innovation there, more ways that we feature content in the future. So, we’re going to continue to invest in the product and we’re adding households every single week, every single month.

Andrea is waving at me furiously, so one last question. We know what your digital video business looks like. We’ve been talking about that. We know what your OTT business looks like. But there’s still linear TV, right? They’re doing upfronts this week — a giant $70 billion, $80 billion business. You guys tried a long time ago to get into that business. It seems like it’s still pretty ripe for you to enter. What’s your strategy for the conventional linear TV ad business?

I think the business that we like, that you’ll continue to see us invest in that’s most related to that, is our YouTube TV product. We think that while… And those are a lot of the partners that we work with. We work very closely with a lot of those linear broadcast and cable partners. It’s just that we think that the way that we wrap those channels up in the product that we have is really the way of the future. Our innovation, just like other Google products, is going to be in terms of what we can do for users, in terms of a new way of actually consuming that content.

I have so many more questions, but I’m going to get tackled by this ferocious person here.

We’ll have to do it again. You’ve got to come out to YouTube again.

I’ll come out again.

All right.

Thank you, Neal. Appreciate it.

That was great, thanks, Peter.


