In Conversation With Chris Wylie

Christopher Wylie, self-described ‘gay, Canadian vegan’, is ‘not the sort of person the military would want to recruit’. Yet, beyond the traditional battle spaces of land, air, sea and space, information is a fifth dimension of warfare. The military needs expertise from unexpected quarters to equip itself for the security challenges that flow from a networked society.

I spoke with Chris Wylie at Chatham House’s 2018 Cyber Conference, its annual flagship event, which brings together cyber security experts, policy makers and defence professionals.

The Cambridge Analytica scandal

The Cambridge Analytica scandal had broken a few months previously, and for cyber policy wonks, Christopher Wylie was the face of 2018. Given that disinformation has been around for approximately as long as information, I asked what makes social media different. Wylie replied that social networks act as an ‘accelerant for disinformation.’ Traditional broadcast and print media create a common experience among audiences: everyone hears the same thing, and if someone lies, they can be called out. Online, ‘I can go and whisper in your ear with the benefit of having followed you about for weeks, while being invisible.’ This creates a series of ‘fractured cognitive monocultures.’

While some commentators dislike the entire concept of targeted advertising, Wylie is not one of them. ‘No one likes spam. Relevance is important.’ He highlighted positive uses of targeting ‘to motivate the traditionally, systemically disenfranchised.’ But there is ‘a line between relevance and coercion or manipulation’.

Making democracies resilient

Taking a more future-orientated approach, we discussed how to make democracies more resilient against social-media manipulation and disinformation. Wylie responded that elections are ‘no longer purely a domestic affair’, and that Facebook is ‘part of the information battle-space.’ He had some harsh words for Facebook: they ‘know what is happening, but are not disclosing.’ He speculated that the reason Mark Zuckerberg had not shown up to face questions from Damian Collins’ DCMS Committee inquiry into Fake News and Disinformation - held in the country where the Cambridge Analytica scandal took place - was that ‘he knows he will be asked tough questions. They’re not lobbied. They do the research and they care.’

Require people who make technology to become more ethical

The most effective global response would be for the platforms to self-regulate to reduce the impact of disinformation. I asked Christopher Wylie whether he agreed. ‘They have no incentives to solve the problem. They like it how it is.’ He said that the #DeleteFacebook campaign was the wrong response. It’s like saying ‘if you don’t want to be electrocuted, don’t use electricity. You can’t expect the onus to be on the user to manage their own safety. It’s like having buildings with no emergency exits, just terms and conditions.’

Is supporting quality journalism the answer? Wylie responded, ‘Facebook doesn’t work collaboratively with journalists or civil society.’ He stated that Facebook’s response to Carole Cadwalladr’s breaking of the Cambridge Analytica story was to threaten to sue the Guardian.

Unpredictable future technologies

When technology is moving fast, it is difficult for both the platform and the user to imagine how data collected now could be used in the future. ‘Facebook didn’t have facial recognition when I started using it in 2007.’ Now all our photos are up there to be analysed. But there are trends. We are starting to see rudimentary AI being deployed in people’s homes - for example, Amazon, Google and Apple’s voice assistants. ‘Alexa is always listening,’ said Wylie.

In the future, when you walk into a Starbucks, how much you pay, or whether you are allowed in at all, will be ‘decided by algorithms whose only morality is optimising you.’ The appropriate response, according to Wylie, would be to require the people who make technology to become more ethical, like doctors or lawyers, and to give due consideration to the impact of their actions on users and society. ‘You need to empower the engineers to say no’ to their bosses and their company legal teams.

Being a whistleblower

Finally, we reflected together on the experience of being a whistleblower. Whenever there is a big scandal, like Cambridge Analytica or the treatment of vulnerable people in care homes, the great and good always ask why no one came forward. I asked Christopher Wylie to explain to the great and good in the room why no one comes forward. ‘Cos it sucks’, was the reply. ‘There’s almost no support. The narrative of the Herculean, heroic whistleblower doesn’t help. It shouldn’t require people to blow up their entire lives.’ He described his own journey from anonymous source to being publicly named. It took a year of planning, including how he would get a job after the episode had ended. ‘We celebrate [whistleblowers] in theory, but for HR this person might be problematic.’

A final thought from Christopher Wylie: ‘Disinformation is like dating. Would you go on a blind date with someone who knew all about you, and you know nothing about them?’ Regulation has to start with the platforms, which could engineer solutions to make their services less addictive and less coercive.

Emily Taylor

Emily Taylor is the CEO and co-founder of Oxford Information Labs, an ICANN-accredited registrar. She is an Associate Fellow of Chatham House and the Editor of the Journal of Cyber Policy.
