Much noise has rightly been made about the role Facebook played in the 2016 presidential election. Critics have pointed to a targeted ad campaign by Russian groups as proof that the Menlo Park-based company wasn’t minding the store — and alleged that disaster followed as a result.
But that argument overlooks one key point: In showing microtargeted “dark ads” to users, Facebook was doing exactly what it was designed to do. The larger problem is not these specific Russian ads (which Facebook refuses to disclose to the public) — or even that Donald Trump was elected president — but the very system upon which the company is built.
Mark Zuckerberg’s plan to increase transparency on political advertisements, while welcome, falls into the same trap. Yes, more disclosure is good, but what is the remedy when the underlying architecture itself is gangrenous?
Zeynep Tufekci, author of Twitter and Tear Gas and associate professor at the University of North Carolina at Chapel Hill, made this point painfully clear in a September TED Talk exploring how the same algorithms designed to serve us better ads on platforms like Facebook can be deployed for much darker purposes.
“So Facebook’s market capitalization is approaching half a trillion dollars,” Tufekci told the gathered crowd. “It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change.”
Tufekci further argued that when machine learning comes into play, humans can lose track of exactly how algorithms work their magic. And, she continued, not fully understanding how the system works has potentially scary consequences — like advertising Vegas trips to people about to enter a manic phase.
This concern is real. Facebook can now infer all kinds of data about its users — from their political views to their religious affiliations, intelligence, and much more. What happens when that power is made available to anyone with a small advertising budget? Or, worse, an oppressive government?
“Imagine what a state can do with the immense amount of data it has on its citizens,” noted Tufekci. “China is already using face detection technology to identify and arrest people. And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads.”
Facebook bills itself as a company striving to bring “the world closer together,” but the truth of the matter is far different. It is, of course, a system designed to collect an endless amount of data on its users with the goal of nudging us toward whatever behavior the company believes is in its best interest — be that purchasing an advertised item, voting, or being in a particular mood.
That’s a fundamental problem that cuts to Facebook’s very core, and it’s not one that a new political ad disclosure policy will fix.