Has Black Mirror come to China?

Cast your minds back to the first episode of the new series of Charlie Brooker’s dystopian anthology Black Mirror. In ‘Nosedive’, the protagonist Lacie simpers her way through a perfectly manicured world of perfectly polite people. In this alternate reality, every action is judged and rated by those around her, and her average rating is visible to everyone she comes into contact with. Lacie adopts a persona of false niceties in an attempt to improve her ratings, but the pressure becomes too much: her emotions break through her polished exterior and, in no time at all, she finds herself slipping further and further down the five-point rating scale as her freedoms become increasingly restricted.

The thing about Charlie Brooker’s imagined world is that, despite seeming extreme, it fits the blueprint of our contemporary world so snugly that it never feels completely inconceivable. ‘Nosedive’ could hardly be more relevant now that recent reports about the Chinese government’s decision to speed up plans for its new social credit system have caused ethical concerns to flare up.

In 2014, the Chinese government released reports about a social credit system that would combine data such as tax payments and adherence to traffic regulations in order to build a profile of trustworthiness for each of its citizens. It would require a kind of national database of each individual’s ratings, which would bureaucratize almost every aspect of public life, from social security to the labour force to internet usage. According to China Daily, “The credit-worthy will be granted conveniences in education, employment and opening start-ups, while severe wrongdoing will be made public”. The system is scheduled to become compulsory in 2020.
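To see why folding so many spheres of life into a single “trustworthiness” number is so contentious, here is a minimal, purely illustrative sketch of how such a score might be computed. The actual formula has never been published; every field name, weight and starting value below is invented for the sake of clarity.

```python
# Illustrative sketch only: the real scoring rules are not public, so the
# data fields, weights and baseline below are hypothetical.

def social_credit_score(citizen: dict) -> float:
    """Fold records from different spheres of life into one score (0-1000)."""
    score = 600  # hypothetical neutral baseline

    if citizen.get("taxes_paid_on_time"):
        score += 300                                   # reward financial compliance
    score -= 50 * citizen.get("traffic_violations", 0)  # penalise each violation
    score -= 150 * citizen.get("loan_defaults", 0)      # penalise each default
    score += min(2 * citizen.get("volunteering_hours", 0), 100)  # capped bonus

    return max(0, min(1000, score))  # clamp to the 0-1000 range


# Example: prompt taxpayer with two recorded traffic violations.
print(social_credit_score({"taxes_paid_on_time": True, "traffic_violations": 2}))
```

The point of the sketch is not the particular numbers but the structure: once unrelated records share one scale, a penalty in any sphere of life depresses the single score that gates access to everything else.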

The reaction of the global press has been understandably incredulous, largely because this move by the government coincides with the piloting of eight social rating apps such as Sesame Credit (launched by Alibaba, China’s answer to Amazon), which have been described as having “gamified obedience to the State”. However, China does not currently have a financial credit rating system as rigorous as those in the West, so efforts to make credit ratings more transparent seem justifiable, especially considering the problems China faces with fraud and corruption throughout its industrial, healthcare and education systems.

Yet these pilot projects penetrate further than financial credit alone, combining data from every sphere of an individual’s life. For example, a user’s internet activity may be assessed on the basis of whether they share content that is pro- or anti-state. The fact that the government has given these apps its stamp of approval, while the political rhetoric in its own reports remains characteristically ambiguous about the consequences a social credit system would have, has led many media sources to conflate the two ideas and conclude that China is embarking on a new form of social engineering. Given China’s history of social control, this is a predictable accusation to make, though, admittedly, it is not far from the truth. If plans go ahead as expected, the publicising of social scores means that people will have a regulating effect not only on each other but also on themselves. What is often harder to accept is that these Orwellian forms of social control already exist in democratic countries. They simply appear on a smaller scale and in a socially accepted format.

Take a moment to consider the apps you use in everyday life. How about the last time you used an Uber? The simplicity of it means that barely a second thought is spared for the number of stars with which you rate your driver. However, internal charts shared by Business Insider in 2015 leave little room for doubt about the direct bearing driver ratings have on the security of an Uber driver’s job: drivers with an average rating of 4.6 out of 5 or below run the risk of being ‘deactivated’. Of course, this is all for the purpose of providing excellent customer service and improving safety for Uber’s customers and ‘partners’ alike. In an automated world where instant gratification has become the norm and customers and suppliers both dance to the merciless tune of the markets, there is no room for human error. When the requirement is to give a 100% ‘performance’ 100% of the time, human emotions and shortcomings become inconvenient blips in the system, yet sanctions resulting from mistakes have very real consequences for individuals.
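For a concrete sense of how quickly an average can slide towards that cut-off, here is a minimal sketch. Only the 4.6 figure comes from the charts reported by Business Insider; the function names and sample trip data are invented for illustration and are not Uber’s actual code.

```python
# Illustrative sketch, not Uber's implementation. The 4.6 threshold is taken
# from the 2015 Business Insider report; everything else is hypothetical.

DEACTIVATION_THRESHOLD = 4.6

def average_rating(ratings: list[int]) -> float:
    """Mean of the 1-5 star ratings a driver has received."""
    return sum(ratings) / len(ratings)

def at_risk_of_deactivation(ratings: list[int]) -> bool:
    """A driver whose average sits at or below the threshold risks 'deactivation'."""
    return average_rating(ratings) <= DEACTIVATION_THRESHOLD

# A handful of 4-star trips among mostly 5-star ones is enough to cross the line.
trips = [5, 5, 4, 5, 4, 5, 4, 4, 5, 4]
print(average_rating(trips))           # 4.5
print(at_risk_of_deactivation(trips))  # True
```

On a five-point scale where anything below perfection reads as a complaint, a few merely “good” trips drag the mean under the line, which is exactly why drivers feel compelled to perform flawlessness on every ride.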

This brings us back to Lacie’s performance in Black Mirror. Her interactions with people are not genuine but tactical moves to gain approval and improve her social rating. All the while, the spectre of those who have slipped down the social ladder and lost privileges such as the right to work looms large, pressuring her to keep up her performance. Similarly, China’s decision to quantify the trustworthiness of its citizens reeks of irony as it will inevitably weaken authentic relationships and hand totalitarian control to those who decide which citizens are deemed ‘trustworthy’.

Joshua Fisque
