Engineering democracy
Cambridge Analytica fracas reopens the big questions about the organisation of the information order
The romance of democracy is that voters, acting collectively as agents, shape their own future. But this idea rests on the presumption that voters are agents, not mere effects of some propaganda machine, some information order that manipulates them: that consent is not manufactured, to use Noam Chomsky’s phrase. The information order also has to grant relatively open access and a degree of equality that allows all citizens to be heard, so that our collective decisions are genuinely all-things-considered decisions. The purpose of protecting free speech, of ensuring that media power is not concentrated and that information is not secret, and so forth, was to protect democracy itself. Most democracies have long betrayed these ideals, not least through asymmetries of power in the information order itself.
The alleged scandal involving Cambridge Analytica’s use of the data of more than 55 million Facebook users has once again reopened big questions about the organisation of the information order in a democracy. It has also revealed how much of the language of our democracy is struggling to come to terms with complex technological developments. The exact nature of this scandal, what laws were violated, and who is responsible will unfold in due course. Nor is it entirely clear that similar violations have not happened in the past. But the episode forces us to confront, once again, basic questions about the nature of democracy.
The first issue at stake is what consent means in the new information order. The conceit, and attraction, of the modern information order is that it does things with our consent, in our name, ostensibly to satisfy our desires. But given the complexities of data-sharing, possible third-party uses, or use by friends through whom your data can be accessed, it is not very clear what we are consenting to, or whether the terms of that consent can be enforced. The idea that simply because you have the formal option of giving or withholding consent, you can control what is done with your data seems like a pipe dream. Consent has become the normative equivalent of financial derivatives: a mysterious instrument that ostensibly helps manage risk, but has itself become the risk, because we don’t know what it is we are authorising.
This scandal should be a reminder of one of Marx’s insights: the language of things can disguise the fact that things signify relationships between people. The use of seemingly anodyne terms like data, technology and information can disguise the fact that all of these involve profound shifts in power relations between people, and can have implications for democracy. One thing that obscures this basic fact is that the debate over new technology, the power of social media and so on is presented as a debate between technologists and anti-technologists. But the real issue is not that. It is, rather, what forms of ownership and what regulatory architectures ensure that the collection of data, and the use of and profit from data, do not subvert the ideals of citizenship.
We now have an information architecture in which a handful of large private players can exercise near-monopoly power, with very little accountability for how that power is used. What has facilitated this power is the idea that there is such a thing as private power that is purely private. In older critiques of capitalism, we had to establish the hard-won knowledge that private economic power has great public effects. But the rise of the tech companies coincided both with ebbing trust in the state and with a belief that technology was about serving consumers, not distorting the meaning of citizenship. So, in a way, the private sector was given a free pass. In India, the debate has a slightly different valence.
We are suspicious of the state. But when it comes to the now galloping uses of Aadhaar, far beyond very limited, well-defined and sequestered purposes, we are ready to give the state everything we have. First, even if you trust the state, our regime has very few safeguards against the contracting out of data to private and foreign parties, which is what the game is increasingly becoming about. Second, we rely on the specious argument that since private companies have access to data, there is nothing wrong in the government collecting and linking data. The obvious answer is that questions will have to be asked of both the public and the private sector when it comes to data protection and its monetisation.
But there is now also a more sinister possibility. What if the state and Facebook or Jio were to collude in how data is used? One of the ironies of the Cambridge Analytica episode is how dependent these companies are on state patronage: apparently, they were being used by states to effect outcomes in other jurisdictions. Both state surveillance and private power are challenges for democracy.
But apart from safeguards and regulatory oversight, there is a deeper anxiety this episode once again raises. In India, we had a nice phrase for some forms of democratic mobilisation, “social engineering”: the idea of creating configurations of social groups, based on their identity, to carve out electoral majorities. To some extent, social engineering is inevitable. But there was always the worry that social engineering is rarely about justice; it involves manipulating people’s fears. One of the interesting conceits embedded in projects like Cambridge Analytica and the new science of data-based campaigning is this: their ability to attract clients depends on their ability to socially engineer electoral outcomes. In computer-security parlance, social engineering is a kind of confidence trick that gets you to divulge information. The jury is still out on how effective all this is. But what exactly is the confidence trick? It is that the voters think they are getting what they want, while all the time it is the clients who are getting out of the voters what they want.
Is democracy increasingly becoming such a confidence trick, merely a feat of social engineering that a good combination of surveillance and data extraction can profoundly affect? It is premature to panic on this score. But the Cambridge Analytica episode does prompt the question, and it does dent confidence in democracy. It also lends credence to the view that the Chinese state, with its sophisticated arsenal of data-based surveillance, is at least being honest: voters don’t exercise sovereignty, they are manufactured; they are not causes, they are effects. The only question is whether a public authority creates them or a private company does. These are serious questions for our ideas of citizenship.
The writer is vice-chancellor, Ashoka University. Views are personal