
A Few Words From Christopher Wylie

Months ago, the shocking news of how Facebook’s algorithms may have helped send Trump to the White House was broken by Cambridge Analytica’s own Christopher Wylie.

We recently attended an on-stage interview at the Gather Festival here in Stockholm, Sweden, where Linus Larsson interviewed Christopher Wylie. We were struck not only by Wylie’s insights into what has really been happening at Facebook, but also by the severity of the impact we are seeing, even beyond the Cambridge Analytica scandal.

Here are the highlights from that conversation.

Linus Larsson:

“How did you notice something was changing at the place you were working? How did it become apparent to you that your job was something else?”

According to Wylie, it began soon after Steve Bannon joined the company. “The direction started to shift, and we started doing general research on the US population and testing how disinformation worked online, as well as exploring how race ideology spreads. We were testing things to provoke a reaction in people. One of the things Cambridge Analytica realized is that Facebook algorithms are responsive to certain fringe groups and pages, which then enabled them to manipulate the news feed.”

Mobilizing “Conspiratorial” Groups / When Fake News Becomes Reality

One of the scariest tactics apparently used by Cambridge Analytica was this provocation of a pack mentality, a reactionary mindset that ultimately feeds the polarization of the world.

“You can profile people using data to identify who is conspiratorial online,” Wylie said, “and then target those kinds of people to like a page or group. Let’s say it’s called ‘New York Patriots.’ When they join the group or page, you can manipulate their news feed with content.” After that, he said, accounts began to engage in the groups or pages and develop relationships with those people. Then you could set up a meet-up at a local coffee shop, and people attend, only to find, while talking to others at an event where everybody thinks the same, that their beliefs and opinions aren’t “crazy” after all.

As soon as people go to these kinds of meet-ups, something that was once fake becomes reality. Once that happens, these groups self-organize, and a movement starts. Hence the polarization in the world, enabled by algorithms and companies like Cambridge Analytica.

“This is undermining society,” Wylie said: it pulls out the worst in people’s character and draws the most vulnerable toward unhealthy, socially problematic thinking … tricking them online into joining a group that is “fake,” which later becomes a toxic movement.

As soon as he saw that the company was using Facebook data and algorithms to harm people, he and many others left, Wylie said. The original team was gone within three months, which speaks volumes about the work Cambridge Analytica was doing. But by then the company already had a large database and everything it needed to continue the work.

In 2016, when Wylie heard Trump speaking, he had flashbacks to Cambridge Analytica’s research reports, and he knew the research was being used for Steve Bannon’s dream candidate. The company had laid the groundwork for Trump by targeting vulnerable people online.

Society works like this, Wylie said: Candidates give speeches and, when lies are told, an opposition party and news organizations call the lies out. But this only works in a public forum. When you are online, the candidates can track you. Someone can whisper in your ear, and you have no idea who it is. It might be disguised as news, a friend, or an event. So how can journalists and civil society call the lies out? They can’t, because they don’t know what is happening. People start debating what is fact and what is fake news. Cambridge Analytica was capitalizing on this effect. “If we cannot agree on what’s real, democracy becomes pointless. If we can’t agree on what’s real, we can’t have functioning anything,” Wylie said.

He went on to say that content matters when it comes to information. There is an important difference between what Cambridge Analytica was doing and targeted marketing, for example: a difference between persuasion and manipulation. In Cambridge Analytica’s case, people were not aware that they were being targeted, and the material they were targeted with had an agenda and, in many cases, was misleading. The idea that Cambridge Analytica simply ran a targeted ad campaign is a misconception. Ask: Are people aware of the content and its source? Is it deceptive? If the targeting is hidden and the content deceptive, then you are being manipulated, not persuaded.

How widespread is this in Sweden during election times?

We are seeing a rise in disinformation and active bots on Twitter, Wylie said. “The thing about working in the digital space influencing elections is that you can do it from anywhere. People are far more likely to share fake news than real news.” The virality of fake news is greater. So, if you can get large numbers of people in Sweden to vote for neo-fascists, the larger parties get smaller, and that is a problem. If it can happen in Sweden, it can happen anywhere in Europe. But the Swedish government is talking about it, which is far more than other countries are doing.

Cambridge Analytica was not set up to service campaigns, Wylie said: “It was set up to engage in cultural warfare. Politics flows from culture. If you want to change politics, you have to change culture.” …

The Dangers That Come with the Increasing Digitization of Our Lives

“The impact of AI is going to increase in our society now with all the technical gadgets we have brought into our homes,” Wylie said. And we must start thinking about the problematic situations these kinds of applications can create. “Thank God Cambridge Analytica happened now, and not ten years from now.”

If the trend of the digitization of our lives continues … imagine how easy it will be to manipulate someone when the whole physical environment is connected to AI, whose only morality is to “optimize” you.

There are very few examples in history where people become the product, Wylie noted. “We had it before and called it slavery.” It is a denial of free will: you don’t have the right to make up your own mind, and people make money off of you.

We have industries that create algorithms to optimize you and make money off of you as a user, Wylie said. They are also optimizing the information on Facebook to keep you there as long as possible and make it like a slot machine. (Constant scrolling is an intentional design.)

Now imagine that in your physical space: you go into a store, a car, or an office, or simply walk around, and there is an ambient presence that follows you, knows who you are, and is designed to optimize you in every way. Then imagine a company with the ethics of Cambridge Analytica taking advantage of that situation, where AI is present all around you, data is constantly collected, and information is targeted at you at every step of your life, and consider how that would influence you. “And that is a disaster scenario, because if you can control all the information in the physical space around a person, and not only what they see online, you control that person’s reality, and that becomes a new form of slavery.”

In Conclusion / The Questions We Should Ask Ourselves

“[These industries] make you do things whether you know it or not, and make money off of you and deny you a choice. We have to, as a society, take a step back and look at what the ongoing trends are and how the technologies are developing and ask [ourselves]: Are we building technological infrastructure that is in the interest of humanity? Does this technology work for humanity? Is this making us freer? Is this going to help us or limit us and, [if it limits us], how do we make sure to avoid that situation?”

Wylie ended by saying that he is hopeful for the future because at least now there is a conversation about it. We are collectively thinking about it! 
