Security, Data, and the Future of Elections

‘It began with a dream of a connected world. A space where everyone can share experiences and feel less alone. It wasn’t long before this world became our matchmaker, fact-checker, personal entertainer, photo book, even our therapist.’ – Professor David Carroll

The data from our online activity isn’t disappearing; it is being mined into a multi-million pound industry. Worse, most people are unaware this is happening, and with so little public concern for the security of smart devices, the influence industry has an open playing field.

With the advent of social media and the emergence of smart devices, there are now eyes and ears in your kitchen, on your doorbell and in your pocket. Your every move is supposedly recorded: your location, your opinions, your spending patterns, even your heart rate. As social creatures, we constantly upload our lives, from baby photos to political opinions. Meanwhile, governments and marketing agencies create bespoke messaging campaigns designed to influence, built from the information available about their target audiences.

The key ingredient: your data. 

A breakdown of which topics each party chose to focus on in their Facebook ads during the 2019 General Election (Graphic: Tristian Hotham / Political Studies Association)

What started out as a clever method of keeping you on social platforms for longer has grown into something more: tailored algorithms that capitalise on our activity in more ways than one. Social platforms monitor every online interaction, from spending patterns and locations to likes, dislikes and even personal messages. Private companies harness this data to take your emotional pulse, personalising algorithms for maximum content exposure and feeding us content they know we like. This is why, when you ‘liked’ that post on internships abroad in February, your feed began to fill with similar content and suggest like-minded ‘friends you may know’. Drawing like-minded peers together creates stronger divisions against those who disagree, inflaming prejudice and hatred for the opposition and producing an echo chamber of bias.
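
In outline, this kind of content-based filtering is simple. Here is a minimal sketch of the feedback loop; the topics, posts and scoring scheme are invented for illustration, not any platform’s actual algorithm:

```python
# Illustrative sketch of content-based filtering: posts a user engages with
# shift their interest profile, which in turn ranks what they see next.
# The topics, weights and posts are invented for demonstration.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse topic vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Each post is tagged with topics; a user's profile is the sum of liked topics.
posts = {
    "internships-abroad": Counter({"travel": 1, "careers": 1}),
    "gap-year-diary":     Counter({"travel": 2}),
    "local-bake-sale":    Counter({"community": 2}),
}
user_profile = Counter()

def like(post_id: str) -> None:
    user_profile.update(posts[post_id])   # engagement reinforces the profile

def ranked_feed() -> list[str]:
    return sorted(posts, key=lambda p: cosine(user_profile, posts[p]),
                  reverse=True)

like("internships-abroad")
print(ranked_feed())  # travel/careers content now outranks everything else
```

Because every interaction feeds back into the profile, the loop narrows: the more you engage with one kind of content, the less of anything else you are shown.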

This data misuse is not limited to social media, however.

Personal data is collected from your mobile phone, your social media accounts, the Google Home Hub, Alexa, Ring doorbell cameras and just about any other device that connects to the internet. Some experts argue that the general public aren’t especially bothered about how these devices work, only that they do work. This is highlighted in the Guardian’s recent piece on the security flaws in smart devices.

Interestingly, Amazon’s Alexa works by always listening for its name to be called, much as a human is always inadvertently listening to their surroundings. The difference is that, beyond its name being called, the device supposedly listens to and captures everything else for ‘market research’. This data is said to be used to build advertising patterns for users and create a better Amazon experience. The same data could just as easily be sold to third parties for political or financial gain.
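
In outline, always-on wake-word detection looks something like the loop below. This is a simplified sketch with stubbed-out audio functions, not Amazon’s implementation:

```python
# Simplified always-listening loop: audio is sampled continuously, but is only
# meant to leave the device once the wake word is spotted. All three helper
# functions are stand-in stubs for illustration, not real device APIs.
import collections

def read_audio_frame() -> bytes:
    return b"\x00" * 320          # stub: one short chunk of microphone audio

def matches_wake_word(frames) -> bool:
    return False                  # stub: an on-device keyword-spotting model

def stream_to_cloud(frames) -> None:
    print(f"uploading {len(frames)} frames")  # stub: the privacy-sensitive step

buffer = collections.deque(maxlen=50)  # rolling window of recent audio
for _ in range(1000):                  # on a real device this loop never ends
    buffer.append(read_audio_frame())
    if matches_wake_word(buffer):
        # What gets buffered, and what gets uploaded, is a design decision
        # the owner cannot inspect from the outside.
        stream_to_cloud(list(buffer))
        buffer.clear()
```

The sketch shows why this is ultimately a question of trust: the only thing separating a local rolling buffer from an uploaded recording is a branch the owner never sees.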

As Bloomberg reported when it revealed that Amazon employs staff to review Alexa recordings: ‘Occasionally the listeners pick up things Amazon Echo owners would rather stay private: a woman singing badly off key in the shower or a child screaming for help. Two of the workers said they picked up what they believe was a sexual assault.’

This may seem like wild conspiratorial speculation, but investigative journalists such as Carole Cadwalladr uncovered the Cambridge Analytica scandal in 2018 with the help of whistleblowers Christopher Wylie and Brittany Kaiser. The trio exposed the illegal collection and manipulation of public data to develop political campaign strategies and surgical psychological profiling.

One popular method of profiling used by Cambridge Analytica is known as psychographics: a detailed map of a user’s personality. Ultimately it is personality that drives behaviour, and private companies break your personality down into five exploitable traits, the ‘Big Five’: Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. Using an aggregate measure of these traits, private advertising companies design specific campaign material aligned with your thoughts and opinions.

A visualisation of IBM’s Personality Insights, an example of psychographic information (Image: CB Insights / IBM)
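
As a toy illustration of the mechanics (the quiz items, trait scoring and message copy below are invented, not Cambridge Analytica’s actual model):

```python
# Toy psychographic scoring: quiz answers (1-5 agreement) are mapped onto the
# Big Five traits, and the dominant trait selects a message variant.
# Items, scoring and message copy are all invented for illustration.

TRAIT_ITEMS = {
    "openness":          ["I enjoy trying unfamiliar things"],
    "conscientiousness": ["I like order and routine"],
    "extraversion":      ["I feel energised in crowds"],
    "agreeableness":     ["I avoid conflict where possible"],
    "neuroticism":       ["I worry about things going wrong"],
}

MESSAGES = {
    "openness":    "A vote for change and new opportunities.",
    "neuroticism": "Protect what you have before it's too late.",
    # ... one variant per trait in a real system
}

def score_profile(answers: dict[str, int]) -> dict[str, float]:
    """Average the 1-5 answers for each trait's items into a trait score."""
    return {
        trait: sum(answers[q] for q in items) / len(items)
        for trait, items in TRAIT_ITEMS.items()
    }

def pick_message(profile: dict[str, float]) -> str:
    dominant = max(profile, key=profile.get)   # the trait to exploit
    return MESSAGES.get(dominant, "Generic campaign message.")

answers = {q: 3 for items in TRAIT_ITEMS.values() for q in items}
answers["I worry about things going wrong"] = 5   # high neuroticism
print(pick_message(score_profile(answers)))       # fear-framed variant wins
```

Scaled up to hundreds of quiz items and millions of scraped profiles, this same score-segment-select loop is what made individual ‘persuadables’ addressable with tailored content.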

‘Remember those Facebook quizzes that used to form personality models? After gathering the personal data of our ‘persuadables’ (targets that were identified as undecided), our creative team created intricately personalised content to trigger those individuals to vote the way we wanted them to. We bombarded them through blogs, websites, articles, videos, ads and every platform you can imagine, until they saw the world the way we wanted them to.’ – Brittany Kaiser, Cambridge Analytica

With this level of data intrusion, it is no surprise that there is widespread public suspicion of foreign involvement in Brexit, or that state surveillance powers are being held to account for apparently unlawful monitoring of citizens.

If current attitudes towards social media and smart devices continue, elections will become a question of data science, to the point where common practice will be companies finding and isolating the key characteristics of target voters, then pulling their emotional strings to steer voting habits with every technological resource available.

However, since the Cambridge Analytica scandal, Facebook is taking active measures to identify and take down rogue accounts showing coordinated inauthentic behaviour. Twitter is implementing a misinformation identification strategy, prompting users to exercise caution when sharing flagged content.

This plethora of messaging and available information is increasingly falsified and malign in nature. That now includes faked video content.

Deepfake videos are video footage of a person that has been doctored to falsify the speaker’s words, face and behaviour: a complete fabrication of a video. Mouths can be animated to fake speech, with an impressionist or machine-learning technology reproducing the target’s voice. This could be, and is increasingly likely to be, used to fabricate speeches by political candidates that never took place. The technology is already being experimented with: comedian Jordan Peele created a video that pretended to show former President Barack Obama insulting US President Donald Trump in a speech.

There are countless ways in which we are exposed to advertising and messaging, both online and in the physical environment. One technology expert, Jamie Bartlett, described how even the novel smart fridge can be weaponised:

‘If companies can access your fridge data, they will know that at about 7pm is when you eat and at 6.30pm, you are hangry because your social posts are written with charged, emotional language. We know those that are emotional are marginally more open to messages of influence. So at 6.30pm, Jacob Rees-Mogg is probably going to pop up on your smart fridge screen with a personalised message to you and you alone, knowing you are more susceptible in your current state of mind.’

The future of elections looks primed to be a war of disinformation and data manipulation. In an era of constant competition, protecting our cognitive security appears to be a losing battle. Technology will keep evolving, and institutions across the globe remain on the back foot on data policy and platform misuse. This must not be ignored.