
I Used Google Ads for Social Engineering. It Worked.

Ad campaigns that manipulate searchers’ behavior are frighteningly easy for anyone to run.


Mr. Berlinquette is the founder of the search engine marketing consulting firm Berlin SEM.

Kevin Hines had one thought as he plummeted toward the Pacific Ocean: I can change anything in my life except the fact that I just jumped from the Golden Gate Bridge.

“One sentence could have stopped me,” Kevin wrote. “Had any one of the hundreds of passers-by engaged with me, it would … potentially have showed me that I had the ability to choose life.”

No person stopped Kevin from trying to kill himself. Could a Google ad have?

Three out of four smartphone owners turn to Google first to address their immediate needs. As a result, Google marketers like me survive by exploiting impatience and impulsiveness. We must be there to serve you an ad in your “micromoment” — the second you use your phone to alleviate the discomfort of not having something now.

You have micromoments about 150 times per day. Mobile device users will see ads during most of them. A helpful ad on Google will match your keywords with a relevant landing page. But some ads provide countermessaging or alternative destinations that go against your search words. These are called “redirect ads.”

With redirection, marketers swerve your monetizable desperation. But we can also swerve something bigger: your beliefs, convictions and ideology. There are advertisers in the digital marketing industry who want to find out how effective this new form of social engineering is. One of those advertisers is Google.

The Redirect Method was a Google-incubated project that used redirect ads to deradicalize would-be extremists. In the first eight weeks of 2016, some 320,000 people — all of whom were believed to harbor sympathy toward the Islamic State — clicked on ads designed to reflect an interest in extremist content. Instead of arriving at a page that supported their views, Islamic State sympathizers who clicked the ads found themselves directed to a playlist of videos debunking the terror group’s recruitment narratives.

Most of the visitors stuck around. Together, they watched more than half a million minutes of video.


After the ISIS campaign ended, Google left behind a blueprint. The blueprint shows, step by step, how you can create your own redirect ads to sway any belief or opinion — held by any Google user, anywhere in the world — of your choice.

You don’t have to be a marketer with years of experience to do this. You just need to follow the instructions and put up a credit card (a few hundred bucks will suffice).

Recently, I followed the blueprint and created a redirect campaign of my own.

The first step was to identify the problem I wanted to address. I thought about Kevin Hines and how his fate might have changed if cellphones with Google had existed back in 2000 when he tried to take his own life.

Could Kevin have been redirected? Could he have been persuaded — by a few lines of ad copy and a persuasive landing page — not to jump? I wondered if I could redirect the next Kevin Hines. The goal of my first redirect campaign was to sway the ideology of suicidal people.

The problem my campaign addressed: Suicidal people are underserved on Google. In 2010, Google started making the National Suicide Prevention Lifeline the top result of certain searches relating to suicide. It also stopped autocomplete from completing such searches.

The weakness of Google’s initiative is that not enough variations of these searches trigger the hotline. A search for “I am suicidal” will surface it. But a search for “I’m going to end it” won’t always, and “I intend to die” won’t ever. A lot of “higher-funnel” searches don’t trigger the hotline.

I hoped my redirect campaign would fill the gap in Google’s suicide algorithm. I would measure my campaign’s success by how many suicidal searchers clicked my ad and then called the number on my website, which forwarded to the National Suicide Prevention Lifeline.

Nine days after my campaign began, the ads were accepted by Google. My ad was the first result across the United States when someone Googled with suicidal intent. I showed unique ads to suicidal people who were physically located around the Golden Gate Bridge.
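For readers curious about what the blueprint actually has you assemble, here is a minimal sketch of a campaign like the one described above, written as plain Python data rather than calls to Google’s actual Ads interface. The keywords, coordinates, radius, URL and ad copy are hypothetical placeholders; the sketch only illustrates the three moving parts: high-intent search terms, a geographic target near the bridge, and a landing page whose number forwards to the Lifeline.

```python
# Illustrative only: a plain-Python description of a redirect campaign's parts.
# Nothing here uses Google's actual Ads API; every value below is a placeholder.

redirect_campaign = {
    # "Higher-funnel" searches that, per the article, don't reliably trigger
    # Google's built-in hotline result.
    "keywords": [
        "i'm going to end it",
        "i intend to die",
    ],
    # Geotargeting: show a distinct ad to searchers physically near the bridge.
    "location_target": {
        "latitude": 37.82,    # Golden Gate Bridge, approximate
        "longitude": -122.48,
        "radius_km": 5,       # hypothetical radius
    },
    # The redirect itself: ad copy pointing to a landing page whose phone
    # number forwards to the National Suicide Prevention Lifeline.
    "ad": {
        "headline": "You don't have to go through this alone",  # illustrative copy
        "landing_page": "https://example.org/talk",              # placeholder URL
        "forwarding_number": "<call-forwarding number routed to the Lifeline>",
    },
}
```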

Nearly one in three searchers who clicked my ad dialed the hotline — a conversion rate of 28 percent. The average Google Ads conversion rate is 4 percent.

The campaign’s 28 percent conversion rate was met in the first week. Even discounting people who may have thought I was affiliated with the Lifeline, or who did not read the ad or the language on my website, the rate suggests there is an unmet need in this ad space.
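The arithmetic behind those figures is simple; the sketch below just makes it explicit, using hypothetical click counts chosen only to reproduce the rates cited above.

```python
def conversion_rate(conversions: int, clicks: int) -> float:
    """Fraction of ad clicks that led to the desired action (here, a call)."""
    return conversions / clicks if clicks else 0.0

# Hypothetical counts chosen only to reproduce the rates cited in the article.
clicks, calls = 100, 28
campaign_rate = conversion_rate(calls, clicks)   # 0.28 -> 28 percent
google_ads_average = 0.04                        # roughly 4 percent, per the article

print(f"Campaign: {campaign_rate:.0%}; typical Google Ads campaign: {google_ads_average:.0%}")
# Campaign: 28%; typical Google Ads campaign: 4%
```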

After the suicide deterrence campaign ended, I created another redirect campaign targeting Americans who told Google they wanted to shoot up a school or commit terrorism.

Like the suicide deterrence campaign, these ads connected clickers to a website with a crisis hotline. Prospective school shooters were enticed by my ad, but the conversion rates were low. These prospective shooters were reluctant to speak with someone.

Google let me run the ads with no issue. It didn’t seem to care what the language on my website was, or what phone number I directed people to. There was no vetting process to become a redirector. I didn’t need qualifications to be a conduit of people’s fates. I expected the ads to be rejected, but they were not.

For every search conducted by an American who wanted to kill, I saw the exact words he or she typed into Google before clicking my ad. And anyone who runs campaigns using the blueprint will have access to the same. It is a one-way mirror into the American psyche.

Click data can be used for harm by a redirector with bad intentions. If redirectors can use click data to steer ISIS sympathizers away from extremism, they can also use it to groom school shooters. A redirector using a call-forwarding service could connect clickers with like-minded terrorists simply by directing their calls to those terrorists’ phones.

With the ISIS campaign, Google decided what a radical view was, who seemed to hold those views and who should be able to view them. It’s hard to be cynical about an initiative that deters extremism. But entering the domain of social engineering is a slippery slope. The standard of what needs to be deradicalized is adjustable.

Using Google’s ISIS campaign blueprint, anyone can access the platform’s precise targeting tools and redirect ads to further his or her own agenda, for instance by swaying people’s political beliefs during an election.

Those who bear the brunt of that abuse aren’t just the impatient and impulsive. More than 50 percent of people still can’t differentiate between an ad (redirect or not) and an organic result on Google.

What the public needs is a free blueprint on how to use Google defensively. But that won’t happen while profits for advertising are tied to exploiting our micromoments.

Mr. Berlinquette is a Google certified partner, and the founder of the search engine marketing consulting firm Berlin SEM.
