The Great Hack, but Not Really a Hack or Particularly Great

Posted by Qassem Naim
Head of Data & Tech

I finally sat down and finished Netflix's The Great Hack.

I made it about 36 minutes in when it was first released (and more buzzworthy), but lost interest as the narrative inevitably drifted into the interpersonal drama surrounding the scandal.

But I persevered. And as the issues the film reacts to are far from being resolved, I’m compelled to share a few thoughts on the documentary’s representation of the Cambridge Analytica story.

Really Not That Cool

The principles Cambridge Analytica employed to develop personas and audiences for targeting weren’t revolutionary, by any means.

Utilising data to drive targeting for the US election and other political events did not require great sophistication. Targeting voting-age populations in swing states who are likely to hold certain political affinities is a pretty basic strategy – and if they refined their audience through profiling, that’s just an efficiency play.

Even the smallest advertisers and brands now have access to a breadth of audiences and look-a-like models online, built from demographics, psychographics and a stream of behavioural information, which together attempt to find people who might be willing to buy what they happen to be selling.
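For readers curious what a look-a-like model actually does under the hood, here is a minimal sketch of the underlying idea: score strangers by how similar their attributes are to an existing customer base, then keep the closest matches. The feature columns, audience sizes and the 1% cut-off below are all invented for illustration – real platforms use far richer signals and proprietary models.

```python
# Minimal lookalike sketch: score prospects by cosine similarity to the
# "seed" audience (existing customers), then keep the closest matches.
# All numbers and features here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Rows = people, columns = hypothetical demographic/behavioural signals
# (age bucket, pages liked, purchase frequency, ...), already normalised.
seed_audience = rng.random((50, 8))      # known customers
prospects = rng.random((10_000, 8))      # everyone else

# Average the seed audience into a single "typical customer" vector.
centroid = seed_audience.mean(axis=0)

# Cosine similarity between each prospect and that typical customer.
sims = prospects @ centroid / (
    np.linalg.norm(prospects, axis=1) * np.linalg.norm(centroid)
)

# The top 1% most similar prospects become the lookalike audience.
k = int(0.01 * len(prospects))
lookalike_idx = np.argsort(sims)[-k:]
print(f"Selected {len(lookalike_idx)} lookalike prospects")
```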

And these tactics have existed long before they were applied digitally. Propaganda has always been around, and is not dissimilar from advertising in many ways.

Long before the uber-creepy digital options, there were creepy direct mail campaigns in your letter box, sent by politicians and businesses using only marginally worse data and likely more sophisticated models to reach prospective voters and customers with the right message.

What digital has enabled is rapid testing and optimisation.

According to Bloomberg, Trump not only outspent Clinton on Facebook by 58% ($16M USD), but also achieved markedly different results for his spend, a disparity attributed to the two candidates’ differing use of the platform. Trump reportedly ran 5.9 million different ad variations during his presidential campaign, spending over 25% of his budget on look-a-like targeting, compared to the 66,000 ad variations and 4% of look-a-like spend used by Clinton.

Clinton’s campaign was out-tested to a far greater degree than it was outspent.
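To make “rapid testing and optimisation” concrete, here is a toy sketch of the kind of loop that sits behind running thousands of ad variations: an epsilon-greedy allocation that keeps experimenting with every variation while pouring most impressions into whatever is converting best. The click-through rates, variant counts and epsilon value are invented; this illustrates the principle, not anyone’s actual campaign tooling.

```python
# Toy epsilon-greedy loop: keep testing all ad variations, but steadily
# shift impressions toward whichever variation is converting best.
# All rates and counts are made up for illustration.
import random

true_ctr = [0.010, 0.012, 0.009, 0.025]   # unknown to the optimiser in practice
shown = [0] * len(true_ctr)
clicks = [0] * len(true_ctr)
epsilon = 0.1                              # share of traffic reserved for exploration

def observed_ctr(v):
    return clicks[v] / shown[v] if shown[v] else 0.0

for impression in range(100_000):
    if random.random() < epsilon:
        variant = random.randrange(len(true_ctr))            # explore: random variation
    else:
        variant = max(range(len(true_ctr)), key=observed_ctr)  # exploit: best so far
    shown[variant] += 1
    clicks[variant] += random.random() < true_ctr[variant]     # simulate a click

for v in range(len(true_ctr)):
    print(f"variant {v}: {shown[v]:>6} impressions, observed CTR {observed_ctr(v):.4f}")
```

Run it and the highest-converting variation ends up with the overwhelming majority of impressions – which is, in miniature, the out-testing described above.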

Context is Everything

My friend started a company selling sustainably made t-shirts he designed himself. Eventually he made a website, and we looked at how he could target people interested in the types of clothes he was making. In this case, that was women between the ages of 16 and 34 who live in Auckland and are interested in yoga.
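For a sense of what that looks like mechanically, here is a toy sketch of the selection logic: a made-up user table filtered down to exactly that audience. This is not the Facebook targeting interface or its API – just the kind of query those checkboxes ultimately boil down to.

```python
# Toy illustration of how narrow a targeting spec is in practice:
# filter a (made-up) user table down to the audience described above.
users = [
    {"id": 1, "gender": "female", "age": 27, "city": "Auckland",   "interests": {"yoga", "running"}},
    {"id": 2, "gender": "male",   "age": 31, "city": "Auckland",   "interests": {"yoga"}},
    {"id": 3, "gender": "female", "age": 22, "city": "Wellington", "interests": {"yoga"}},
    {"id": 4, "gender": "female", "age": 19, "city": "Auckland",   "interests": {"surfing"}},
]

audience = [
    u for u in users
    if u["gender"] == "female"
    and 16 <= u["age"] <= 34
    and u["city"] == "Auckland"
    and "yoga" in u["interests"]
]

print([u["id"] for u in audience])   # -> [1]
```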

With just a few hundred bucks, anyone can jump onto Facebook and choose to target their ads to a hyper-specific audience, based on a trove of attributes and signals.

And in the context of a start-up, most people seem to think that’s okay. But as soon as you aim it at 18-to-45-year-olds who are interested in voting, you’ve crossed the line.

There’s a Deeper Issue Here

The larger problem is really education: educating users about the world they’ve stumbled into; and educating and reminding ourselves of the responsibility we, as practitioners, have to protect and honour the trust of our customers (and fellow humans) who have wittingly or unwittingly relinquished their data.

It’s akin to being caught off guard by someone a little too observant, à la Sherlock Holmes. Well, I hate to break it to you, but technology “pays attention” constantly, follows you around all the time, and has a memory immortalised in digital storage.

And still, no-one reads the terms and conditions. Well, except people in the industry who understand the implications, like me – who start to read them, then give up once they get too long.

Honestly, I could have easily digitally signed away my firstborn child and eternal soul numerous times by now. It doesn’t feel like a binding agreement, but depending on your jurisdiction, it may well be defensible in court.

Let’s fix that.

And help people come around to accepting that when you interact with a machine, it’s always watching.

And check ourselves, and the way we go about achieving our business objectives by driving behaviour change.

We use aggressive language like “target” profusely, but in what other context is it acceptable to openly discuss “targeting women and children”?

All That Said…

Data privacy and security is a genuine issue that is finally getting the attention it’s due. In the last year alone, we’ve seen more publicity, more clients conducting audits, and more data-sharing agreements than in the three years prior.

At FCB, we benefit from best-in-class global security and compliance experts through our parent company IPG, which has invested deeply in data. This ensures our data practices are compliant, even by US and EU standards, but smaller independents and start-ups are really going to have to up their game and invest in order to keep up.

The Great Hack wasn’t the worst piece of content I’ve ever seen. And it was at least useful for helping to explain what I do for work to my parents.

“Imagine all that clever data and targeting and modelling they dramatized in the film, and then make it way less creepy and far more ethical…”