Roger McNamee: ‘The internet is how it is because Google and Facebook made it that way.’ Photograph: Winni Wintermeyer/The Times/News Licensing

Roger McNamee: ‘It’s bigger than Facebook. This is a problem with the entire industry’

Mark Zuckerberg’s mentor and an early investor in Facebook on why his book Zucked urges people to turn away from big tech’s toxic business model

Roger McNamee is an American fund manager and venture capitalist who has made investments in, among others, Electronic Arts, Sybase, Palm Inc and Facebook. In 2004, along with Bono and others, he co-founded Elevation Partners, a private equity firm. He has recently published Zucked: Waking Up to the Facebook Catastrophe.

What is your history with Facebook?
I’ve been a technology investor since 1982, and a tech optimist until very recently. I first met Mark Zuckerberg in 2006, when he was 22 years old and I was 50. Even then it was obvious to me that Facebook would eventually be as successful as Google was at that time, which is to say spectacularly successful. He had broken the code on the two things that historically had undermined all network-based companies: he had required authenticated identity, and he had provided genuine control of privacy.

I thought that represented a staggering success. So I met him, and before he said anything I told him that I was afraid somebody was going to try to buy the company, they were going to pay a billion dollars and everybody was going to tell him to take the money. And I said, look, if you believe in your dream, I hope you’ll tell them no.

It turned out that the reason he was coming to see me was that a company had offered a billion dollars for Facebook and everyone had told him to accept it.

He wanted a chief operating officer and I suggested Sheryl Sandberg, and persuaded her to meet with Mark. He sold her on the company and they became a team.

When did you first realise that things had taken a turn for the worse?
In January 2016 I saw things coming out of Facebook groups ostensibly associated with the Bernie Sanders campaign. The stuff they were spreading was uniformly inappropriate, it was misogynistic, it was disinformation.

And then a month later, I saw a report that Facebook had expelled a company that was using its advertising tools to gather information about people interested in Black Lives Matter and selling that data to police departments. That was a massive violation of civil rights and just completely horrible.

Finally, I reached out to Mark and Sheryl with an opinion piece I’d written, in which I suggested that there was a systemic problem with the algorithms and the business model that was allowing bad actors to harm innocent people. They very politely replied to me, but suggested that in their view what I’d seen was isolated and they had taken care of it.

So that’s when I decided I’d better figure out exactly what happened. I met Tristan Harris, a design ethicist from Google, who talked about what he called “brain hacking”, a term that he invented to describe the persuasive technologies used by internet platforms that enable them to develop habits in the minds of the people who use the products. Those habits evolve into addictions, and that situation makes [users] vulnerable to manipulation.

Like a stroke of lightning, it made me see what it was that had potentially influenced the election in the US, and had potentially influenced Brexit.

Is this book the culmination of those efforts?
No, this is just the next step. We kept at this for a while and then, kaboom, the Observer and Carole Cadwalladr came out with the story about Cambridge Analytica and that changed everything.

We’d been banging the drum [but] the privacy angle was the first one that really [hit] home, with everyone, and it transformed everything.

That’s when I began working on the book, and the notion was very simple: there’s only so much you can do in a tiny team of people going around and meeting small groups. So we needed to find a way of broadening the message. What I was really trying to do was to use my personal narrative to help people understand what matters, so that as new news came along, they could interpret it for themselves.

My thought process was that this isn’t a tech story. It’s not a business story. This is an everybody story. This is a catastrophe that we’re all facing, and we don’t necessarily have a vocabulary for it, because the business model that Facebook and Google have created is something we’ve never seen before.

They were very much in the business of manipulating attention in order to get you to spend more time on [their services]. And that is a very dangerous business model for society. It’s bad for the mental health of the people who use it. It’s terrible for democracy. It’s completely destroyed any sense of privacy, and it’s undermined entrepreneurship in the United States, because these companies essentially pick off one industry at a time and disrupt it in a way that destroys the old without replacing it with something of equal value.

What’s the reaction been inside Facebook to this book?
One of the things that has been most surprising to me is that no one at Facebook has communicated with me since February 2017. I have known Sheryl since 2000 and she is one of the most politically savvy, exceptionally capable executives you could possibly meet. And the notion that they would not reach out to critics is astonishing. It’s so harmful to them to be so closed-minded about criticism.

The problems are not isolated. They are systemic. They’re related to a business model that has worked extraordinarily well for investors and horrifically for everyone else. The failure to recognise that moderation would’ve been a better long-term strategy for the company is ultimately going to be very costly, because they are leaving governments around the world no choice but to bring the hammer down.

Is this a Facebook problem or a Mark Zuckerberg problem?
It’s bigger than Facebook. This is a problem with the entire internet platform industry, and Mark is just one of the two most successful practitioners of it.

This is a cultural model that infected Silicon Valley around 2003 – so, exactly at the time that Facebook and LinkedIn were being started – and it comes from a specific root.

Silicon Valley spent the period from 1950 to 2003 first with the space programme, and then with personal computers and the internet. The cultures of those things were very idealistic: make the world a better place through technology. Empower the people who use technology to be their best selves. Steve Jobs famously characterised his computers as bicycles for the mind.

The problem with Google and Facebook is that their goal is to replace humans in many of the core activities of life. If you think about what they’re doing with artificial intelligence, there are three markets that have proven to be incredibly lucrative: getting rid of white-collar work; telling people what to think with filter bubbles – that’s what Facebook does; and recommendation engines that tell people what to enjoy or consume.

Any list of the things that make us who we are is going to include those three characteristics. Our work, the things we believe, and the things we enjoy are part of what makes us human. And to take those away and give those over to a computer instead strikes me as the opposite of bicycles for the mind.

Do you think there’s a version of history in which we don’t end up in this situation?
The culture into which Facebook was born was this deeply libertarian philosophy that was espoused by their first investor, Peter Thiel, and the other members of the so-called “PayPal mafia”.

They were almost single-handedly responsible for creating the social generation of companies. And their insights were brilliant. Their ideas about how to grow companies were revolutionary and extraordinarily successful. The challenge was that they also had a very different philosophy from the prior generations of Silicon Valley. Their notion was that disruption was perfectly reasonable because you weren’t actually responsible for anybody but yourself, so you weren’t responsible for the consequences of your actions.

That philosophy got baked into their companies in this idea that you could have a goal – in Facebook’s case, connecting the whole world on one network – and that goal would be so important that it justified whatever means were necessary to get there.

You said Brexit was a wake-up call. Why was that?
It never occurred to me that there would be an asymmetry in the way that advertising works: that in order to command attention, you want to appeal to what [Tristan Harris] calls “the lizard brain”, the things that provoke outrage and fear. Things that essentially create a perception of reward. Those things, when you put them into advertising, can really be bad for democracy. Suddenly a neutral centrist idea gets very little traction on Facebook, where really extreme, emotionally charged ideas go viral.

There’s evidence that in the US, the messages of the Trump campaign got 17 times the effective reach per dollar spent as Clinton’s messages, and that’s just a staggering advantage.

With Brexit, because of the shocking outcome, I had to ask myself: had Facebook played a role? Because clearly Facebook had been part of the promotional strategies of the campaigns. And it was equally clear that in one case you had a very emotionally charged message, and in the other a very neutral message. So it was a pretty good test of that question. I didn’t have the data so I didn’t know. But the hypothesis occurred to me and in the context of the other things I had seen, raised more alarms.

Can you see Facebook changing on issues like this without being forced to?
Tristan and I have worked very hard to share our message with people at Facebook in a way that they could accept. After two years of this, I am sceptical about the ability of anyone to get Facebook to make these changes internally.

So I fear that we’re now at the point where external stimuli are the only way this will happen, which is why I emphasise the role that the people who use these products need to play by withdrawing some or all of their attention and by making their voices heard with policymakers in government to force regulatory change.

Zuckerberg argued that the downsides of Facebook are the same as the downsides of the internet. Do you think that’s true?
No. There was an old definition of chutzpah: a child who kills the parents and then begs the mercy of the court because now he’s an orphan. That’s what we’re talking about here. The internet is the way it is because Google and Facebook have made it this way. To Mark’s credit, Facebook has been a spectacular success on its own terms, but we should not forget that these are Facebook’s terms.

Zucked: Waking Up to the Facebook Catastrophe by Roger McNamee is published by HarperCollins (£16.99). To order a copy go to guardianbookshop.com.
