Wednesday, 9 May 2018

When Code Is Law

Growing dependence on algorithms will affect defence, privacy, social fabric

Written by Pukhraj Singh | Updated: May 9, 2018 12:21:36 am

With the debate spurred by the revelations of Cambridge Analytica’s dealings with Facebook — and, closer to home, by Aadhaar — we may have to revisit the very foundations of the individual’s social contract with the state when it comes to privacy. Those familiar with the hacker counter-culture of the Nineties knew one thing — the most potent weapon of information warfare is availability.
Julian Assange wrote an informal manifesto for WikiLeaks in 2006, stating that “where leaking is easy, secretive or unjust systems are nonlinearly hit, relative to open just systems”. Dave Aitel, a cyber-offence expert recruited by the National Security Agency at the age of 18, concedes that Assange’s document “was way ahead of its time”. Back then, regimes around the world were still honing the dark art of extending the militaristic domain of information warfare to cyberspace, with a few exceptions such as the United States and its anglophone allies.
We crossed the Rubicon when Russia allegedly influenced the 2016 US presidential elections by weaponising the availability of information, proving the WikiLeaks hypothesis that keeping secrets would become costly. Billions of digital identities are for sale at ridiculously cheap prices. Emin Gün Sirer, a self-proclaimed hacker and associate professor of computer science at Cornell University, writes that “our laws were written for a time and place where giant data collections and intersections were difficult to perform, so we’ve erred on the side of forcing the government to release whatever it knows”.
Every interaction is a leak, says Sirer. The digitisation of social interfaces has made leaking so rapid that it outpaces the human ability to comprehend it. He postulates that eventually “everyone will have access to all the data related to everyone who is alive during their lifetime”. The research-grade problem is to strip the data of its value, which requires a fundamental shift in how privacy is perceived.
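Sirer does not prescribe a mechanism for stripping data of its value, but differential privacy is one well-known direction: answer aggregate questions with just enough noise that no single person's record is worth stealing. The short Python sketch below is offered purely as an illustration of that idea under my own assumptions; the noisy_count helper, the epsilon parameter and the toy dataset are not drawn from the column.

import random

def noisy_count(records, predicate, epsilon=0.5):
    """Count the records matching `predicate`, adding Laplace noise so that
    any single individual's presence or absence barely shifts the answer."""
    true_count = sum(1 for r in records if predicate(r))
    # The difference of two Exponential(epsilon) draws is Laplace(0, 1/epsilon);
    # a counting query has sensitivity 1, so this yields epsilon-differential privacy.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Toy usage: the aggregate trend survives, the individual rows lose their value.
people = [{"age": random.randint(18, 80)} for _ in range(10_000)]
print(noisy_count(people, lambda p: p["age"] > 60))

Noisy aggregation is only one of several approaches (synthetic data and on-device processing are others); the sketch is there simply to make the abstract claim of value-stripping concrete.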
Access to hitherto forbidden information spawns unexpected formations, which is what we are witnessing with the ideological echo chambers aggravated by social media. Despite being bound by perceived commonalities, these groups are ad-hoc, unpredictable and probably their own worst enemies.
When such unpredictability — intensified by the deluge of information — becomes the norm, we will inevitably rely on Artificial Intelligence (AI), much to our peril. In 2015, reports appeared that Google’s artificial neural network had started spouting “Dali-esque” images when queried about common worldly objects. It took a while to figure out that the system’s “brain” had gone into learning overdrive.
There is a Gordian knot in the pursuit of objectivity. Dan Geer, a cybersecurity expert at the CIA’s venture capital fund, In-Q-Tel, explains the paradox: “The more data [an AI system] is given, the more its data utilisation efficiency matters. The more its data utilisation efficiency matters, the more its algorithms will evolve to opaque operation. Above some threshold of dependence on such an algorithm in practice, there can be no going back”. He frames this evolving opaqueness as a loss of “interrogatability”: the ability to ask an algorithm why it decided what it did.
It is obvious that analytical systems will become less interrogatable as the data deluge continues. The threshold is already being crossed in domains like cyber-defence, where one can only fight algorithms with algorithms. It is scary to imagine the impact this may have on the real world — on nations, societies and individuals making crucial decisions while relying merely on esoteric computations. That is almost an eerie allusion to the technological singularity (when AI would surpass societal intelligence) speculated by futurists like Ray Kurzweil.
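The column does not spell out what fighting algorithms with algorithms looks like, so the following is a deliberately toy sketch of one ingredient: an automated detector that flags traffic spikes against a rolling baseline. The flag_anomalies function, the window and threshold parameters and the synthetic traffic are all my own assumptions, not anything described in the article.

import random
from statistics import mean, stdev

def flag_anomalies(event_counts, window=50, threshold=4.0):
    """Flag positions whose count deviates sharply from a rolling baseline,
    a toy stand-in for the machine-speed defences the column alludes to."""
    alerts = []
    for i in range(window, len(event_counts)):
        baseline = event_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(event_counts[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# Toy usage: jittery but steady traffic with one injected spike at position 150.
traffic = [100 + random.randint(-5, 5) for _ in range(200)]
traffic[150] = 900
print(flag_anomalies(traffic))  # expected to flag position 150

Even this toy scans a log faster than any analyst could eyeball it, which is exactly the kind of dependence Geer warns about.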
Democracies would be susceptible to these pitfalls. Life and liberty would be etched on semiconductors. Lawrence Lessig, a professor at Harvard Law School, divined in 2000 that code — the language in which computational logic is expressed — would act as the enforcer of law, and might even become the law.
The writer is a cyber-intelligence specialist and has worked with the Indian government and security response teams of global companies
