How Data-Driven Transformation of Our Society Multiplies Our Problems

By The Hundredth Monkey · Published 3 years ago · 7 min read
Source: Shoshana Zuboff | Falling Walls Foundation

A woman living in the Rajsamand district of Rajasthan worked under MGNREGA for 60 days between April and November 2019 and was owed Rs. 12,000 in wages. When she went to the officials to enquire, she was given the established standard response: she was told to wait. Waiting, the accepted fate of the common person. Once a round of work is completed, a worker’s wages are supposed to be transferred to her bank account within 15 days. This woman waited for eight months, receiving the same response from the bank every time she went to ask about her payment: that her wages had not been deposited into her account.

We have been witnessing how the delivery of welfare programs has become highly digitized. The hype of AI as the messiah for healthcare, education, and agriculture is generally accepted without question. This means that at every stage, from application to delivery, data is collected from individuals for the “social good”. This newsletter aims to discuss the failures, violations of rights, corruption, and surveillance that take place behind all the hype of data transforming our lives.

If you are familiar with this reality, you will have no difficulty picturing thousands of MGNREGA workers whose wages are delayed, rejected, or even misdirected to others’ accounts. The reason they are usually given is an “inactive Aadhaar”. There can be numerous causes, and they result in what has become a common term in rural India: “link fail”. A link can fail at any of the many steps involved: online submission, updates from one system to another, verification between systems, and the operationalizing of Aadhaar-mapped payments by the National Payments Corporation of India (NPCI), before the payment finally reaches a bank account.

The Government of India started Direct Benefit Transfers (DBT) in 2013, and since then all of its schemes have been plagued with the same issues. Detecting the exact cause of failure is much more difficult in some cases, and the methods to resolve these issues are never clear. Radha Devi’s payments were rejected, and she quit MGNREGA work. A widow in Kurnool, Andhra Pradesh, had her PM-KISAN payments rejected because of a failure of Aadhaar mapping with the bank. Another woman, of Garhwa, Jharkhand, was living on PDS rations and a small social security pension. She died of starvation in December 2017, like many other victims of technological failure in Jharkhand. Collateral damage, that is what we are meant to think they are, isn’t it?

To get a bigger picture: payments worth Rs. 1,375 crore have been lost under MGNREGA alone, without any resolution. The failure rate of DBT schemes was over 10% for the years 2017-18 and 2018-19. To the authorities, these failures may seem insignificant, but for those at the grassroots they become a matter of life and death. It is baffling that the state collects huge amounts of data in the name of social good, and what is delivered are alarming human rights violations and afflictions upon human lives.

Social security schemes already cover the population poorly. With the digitization of welfare delivery, old problems are multiplied manifold, with reduced accountability and transparency. Beneficiaries face difficulties in accessing the technology and, even more, in understanding the issues. When data is the building block, those on the ground whose data are not in the systems, or whose records contain errors, are completely invisible to the government.

What is effectively happening is that people’s livelihoods are tracked by linking them to dystopian and dysfunctional data-based identities. Technologies such as facial recognition and biometrics are being employed to build a digital infrastructure of surveillance, but the costs of laying its foundations are being paid by the disabled, the marginalized, women, queer-trans communities, immigrants, and the underprivileged.

The “AI for social good” narrative is also propelled in the healthcare sector with the deployment of machine learning technologies. These work with huge data sets, or Big Data, and are programmed to analyze historical data to surface patterns, learn from instances, predict, and classify generalized outcomes to aid decision making. AI-based systems are being developed and put in place for diagnosis and for functions like telemedicine. The goal is initially to assist doctors in diagnostic procedures and later to fill the place of the doctor.

The unaffordable and predatory conditions of the highly privatized healthcare sector thus provide easy ground for introducing these technologies and for letting the bodies of the ill and underprivileged, along with their medical data, be used to train machine learning algorithms. In fact, India serves as a hub of data collection for developing AI-enabled diagnosis mechanisms for several reasons. One of them is, as Virginia Eubanks puts it in her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, “the low-rights environment where there are few expectations of accountability and transparency”. Another is that the diversity of Indian patients provides ample data on which the software can be trained. Yet another is that in many hospitals treatment is combined with “experimental processes of data gathering” to train the AI software, and there is no objection to this.

The questions we must ask here are: why is the presence of doctors in underserved parts of the country being neglected? Why are the sick and poor being subjected to these untested and unethical technologies?

Why is the state allowing capital flows from the Global North to replace labor, with technologies that do not even benefit the sick and poor beyond outreach programs, since their lived realities are hardly given any consideration?

The case of IBM’s Watson for Oncology, a cancer diagnosis tool, makes this clear. This AI-powered tool has been rejected by several countries on the grounds that it is centered on the preferences of a handful of American doctors, is biased towards American care standards, and does not take into account the socio-economic issues faced by patients in poor countries. Yet it is used freely by Manipal Hospitals across India.

One of the problems with the development of AI models is that the teams behind them are populated by cis, hetero, upper-class males across the world. Among them, there are hardly any sociologists or anthropologists who could outline the impacts of these models on societies or on people from different socio-economic backgrounds. This is perhaps the reason that most facial recognition technologies fail to see past gender binaries. LGBTQ groups have vehemently resisted the growing use of such technologies, often calling them dangerous. The technologies also misread women from South Asian countries, as most are built to cater to white American women.

Facial recognition is sold to us everywhere, despite ample research showing that both physiognomy and pathognomy are pseudosciences and that neither criminality nor emotion can be “detected” from facial expressions. The installation of AI cameras that allow employees to enter only if they are smiling, in the garb of ensuring positivity, is one such move towards forcing people to behave in a certain way, towards coercing and cowing workers. Shoshana Zuboff puts it succinctly in her book The Age of Surveillance Capitalism: “it is no longer enough to automate information flows about us, the goal is to automate us.”

When social media platforms came about, we were sold the ideas of self-expression, free speech, upholding democracy, and the like. The price we are paying to date is the opposite: erosion of the sense of self, attacks on free speech, the spread of fake news, preying upon the privacy of our lives, and democracies on cliffhangers.

Such data-driven transformations of our lives render us vulnerable to exploitation, marginalization, erosion of privacy, and a lack of accountability. The realization of these issues should not lead us to the gloomy notion that we are all doomed. Rather, it should drive us to challenge the collaborations of big corporations with political power structures that aim to set the conditions of access to, and control over, our bodies, lives, and societies. It is about saying no to our everyday lives being cataloged, organized, and criminalized. It is about protecting our right to be who we are, and not letting human lives be given up as collateral damage in building a world governed through databases.

Book Recommendation

The Age of Surveillance Capitalism by Shoshana Zuboff


About the Creator

The Hundredth Monkey

The Hundredth Monkey is a weekly newsletter by National Association of Students, India.

We are a group of students who write about policy, politics, and major social issues.

