How Tech Is Quietly Bringing Back Caste Discrimination in India

A forensic dive into how supposedly neutral tech platforms deepen caste biases—from AI hiring filters to ad targeting in rural India.


Introduction: The Invisible Hand of Bias

Welcome to the 21st century—where caste discrimination doesn’t scream in the streets anymore. It whispers through code, quietly deciding who gets a job, who sees an ad, and who stays invisible.

India’s deep-rooted caste system didn’t vanish with urbanisation or smartphones. It simply evolved—migrating from village panchayats to digital platforms, cloaked in neutrality but reeking of hierarchy. The new battleground? Algorithms. And the casualties? Dalits, Adivasis, and backward castes—again.


1. The Myth of “Neutral” Technology

We’ve been sold a seductive lie: algorithms are objective. They don’t see caste. They only see data. But what if the data is casteist?

  • AI hiring tools trained on company resumes over decades? They learn to prefer “upper-caste-sounding names” because that’s who got hired before.
  • Natural language processing models used in customer service bots flag “regional dialects” as unprofessional.
  • Facial recognition software shows far higher error rates on darker skin tones, conveniently excluding large sections of Dalit and Adivasi populations from rural identity verification systems.

This isn’t neutrality. It’s repackaged apartheid.
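
To see how "the data is casteist" plays out in practice, here is a minimal sketch in Python, using synthetic data and made-up feature names: a screener that never sees caste, only a proxy like college tier, yet still reproduces yesterday's hiring gap.

```python
# Minimal sketch (synthetic data, hypothetical feature names): how a screener
# trained on historically biased hiring decisions re-learns that bias through
# proxy features, even when caste itself is never an input.
from collections import defaultdict
import random

random.seed(42)

def make_history(n=5000):
    # Past decisions favoured one social group, and that group also clusters
    # in certain college tiers -- the proxy the model will latch onto.
    rows = []
    for _ in range(n):
        group = random.choices(["dominant", "marginalised"], weights=[0.7, 0.3])[0]
        college_tier = random.choices([1, 3], weights=[0.8, 0.2] if group == "dominant" else [0.2, 0.8])[0]
        hired = random.random() < (0.6 if group == "dominant" else 0.2)  # biased past outcome
        rows.append({"college_tier": college_tier, "hired": hired, "group": group})
    return rows

history = make_history()

# "Train" a naive screener: P(hired | college_tier). Caste never appears.
stats = defaultdict(lambda: [0, 0])          # tier -> [hired, total]
for r in history:
    stats[r["college_tier"]][0] += r["hired"]
    stats[r["college_tier"]][1] += 1
score = {tier: h / t for tier, (h, t) in stats.items()}

# Apply the screener to equally qualified new applicants from each group.
applicants = make_history(2000)
by_group = defaultdict(list)
for a in applicants:
    by_group[a["group"]].append(score[a["college_tier"]])

for g, scores in by_group.items():
    print(f"{g:>12}: mean screening score = {sum(scores)/len(scores):.2f}")
# The marginalised group scores lower despite identical skills: the model has
# laundered yesterday's bias into today's "neutral" metric.
```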


2. Caste in Hiring Algorithms

Let’s talk tech recruitment—India’s shiny sector. AI-based screening systems are being sold to MNCs and startups alike. But behind the automation is a chilling pattern:

  • Candidates with surnames like Verma, Reddy, Sharma receive higher callback rates than Paswan, Valmiki, or Munda, even with identical resumes.
  • Automated HR filters quietly deprioritize candidates from non-English-speaking regions or Tier 3 colleges, which disproportionately affects marginalized communities.
  • Some platforms use “location targeting” to match “culture-fit” employees—an excuse to avoid hiring from Dalit-dominated areas.

We’ve gone from “fair-skinned, convent-educated” to “data-optimized and algorithm-approved.” Same poison. New bottle.
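
None of this has to be taken on faith; it can be measured. Below is a correspondence-audit sketch, where a hypothetical screen_resume() stands in for whatever endpoint a screening vendor actually exposes: otherwise-identical resumes, only the surname swapped, and the shortlist rates compared.

```python
# Correspondence-audit sketch. screen_resume() is a placeholder, simulated here
# as a biased scorer so the audit logic has something to measure; in a real
# audit it would call the vendor's screening system.
import itertools
import random

random.seed(7)

SURNAMES = {
    "dominant-caste": ["Sharma", "Verma", "Reddy"],
    "marginalised":   ["Paswan", "Valmiki", "Munda"],
}

BASE_RESUME = {"degree": "B.Tech CSE", "experience_years": 3, "skills": ["python", "sql"]}

def screen_resume(resume):
    """Stand-in for a real screening endpoint (deliberately biased simulation)."""
    score = 0.5 + 0.1 * (resume["surname"] in SURNAMES["dominant-caste"])
    return random.random() < score   # True = shortlisted

def audit(trials_per_name=500):
    rates = {}
    for group, names in SURNAMES.items():
        outcomes = [
            screen_resume({**BASE_RESUME, "surname": name})
            for name, _ in itertools.product(names, range(trials_per_name))
        ]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

rates = audit()
for group, rate in rates.items():
    print(f"{group:>16}: shortlist rate = {rate:.1%}")
print(f"disparity ratio = {rates['marginalised'] / rates['dominant-caste']:.2f}")
# Anything well below 1.0 on identical resumes is a red flag worth escalating.
```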


3. Ad Targeting: Prejudice on a Budget

Think caste can’t creep into your Facebook feed? Think again.

  • Real estate ads use geotargeting to exclude Dalit-majority neighborhoods from their promotion radius.
  • Education ads avoid SC/ST-majority regions, assuming “lower conversion rates.”
  • Matrimonial platforms secretly rank profiles based on caste preference, even when users opt for “caste no bar.”

All this happens quietly in the back end. Caste bias becomes just another metric on a dashboard.
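
Checking for this doesn't require anything exotic. The sketch below uses entirely made-up PIN codes and demographic shares to show the shape of the audit: overlay a campaign's geo-exclusion list on census-style data and see whose neighbourhoods keep falling outside the "promotion radius."

```python
# Delivery-audit sketch (all PIN codes and shares are hypothetical): given a
# campaign's excluded PIN codes and demographic shares per PIN code, check
# whether the exclusions fall disproportionately on SC/ST-majority areas.

# pin code -> share of SC/ST residents (made-up numbers)
demographics = {
    "560001": 0.08, "560002": 0.11, "560045": 0.62,
    "560078": 0.57, "560103": 0.09, "560068": 0.66,
}

# PIN codes a hypothetical real-estate campaign excluded from its geotarget
excluded = {"560045", "560068", "560078"}

def sc_st_majority(pin):
    return demographics[pin] > 0.5

excluded_majority = sum(sc_st_majority(p) for p in excluded)
total_majority = sum(sc_st_majority(p) for p in demographics)

print(f"SC/ST-majority PIN codes excluded: {excluded_majority}/{total_majority}")
# If nearly every SC/ST-majority PIN code sits inside the exclusion list while
# other PIN codes stay targeted, "location optimisation" is doing caste's work.
```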


4. The Silicon Valley of Savarna Privilege

Let’s be brutally honest: Indian tech is a Savarna boys’ club. From IITs to VC firms, it’s an ecosystem steeped in the unspoken dominance of upper castes.

And when only one kind of person builds the tools, only one kind of reality gets coded in.

  • Who audits these algorithms for caste bias? No one.
  • Who represents marginalized voices in product design? Rarely anyone.
  • Who benefits from “data-driven” development models in rural India? Hint: it’s not the Bahujans.

Tech isn’t just failing to dismantle caste—it’s digitally institutionalizing it.


5. Caste in Surveillance: The Digital Policing of Dalit Lives

Surveillance tech deployed in police departments, educational institutions, and welfare schemes is disproportionately aimed at “low-income” and “high-risk” areas: keywords that conveniently map onto caste demographics.

  • Students from SC/ST communities in hostels are tracked more closely, assumed to be “troublemakers.”
  • Welfare schemes using Aadhaar-based verification often fail in remote tribal belts due to poor infrastructure, denying benefits to those who need them most.
  • Dalit activists are flagged as “cyber threats” and targeted by AI-driven surveillance programs.

It’s not just discrimination. It’s digital policing.
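
The same counting that excludes people can be turned around to expose the exclusion. Here is a small sketch, with entirely hypothetical numbers, of what a failure-rate audit on verification logs could look like:

```python
# Sketch with made-up numbers: turn raw biometric-verification logs into an
# exclusion-rate table per district type, so the people the system silently
# drops become visible instead of vanishing into an aggregate success metric.
from collections import Counter

# (district_type, authentication_succeeded) -> count, hypothetical log summary
log_counts = Counter({
    ("urban", True): 9400, ("urban", False): 600,
    ("remote_tribal", True): 7200, ("remote_tribal", False): 2800,
})

totals, failures = Counter(), Counter()
for (district, ok), n in log_counts.items():
    totals[district] += n
    if not ok:
        failures[district] += n

for district in totals:
    rate = failures[district] / totals[district]
    print(f"{district:>14}: exclusion rate = {rate:.1%}")
# A four-to-five-fold gap in failure rates is not a connectivity footnote; it is
# a list of real households whose rations, pensions, or scholarships never arrived.
```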


6. Resistance Is Not Futile: Cracking the Caste Code

But all is not lost. Bahujan techies, activists, and ethical AI researchers are calling it out:

  • Organizations like Equality Labs are documenting and mapping caste bias in tech systems.
  • Dalit-run startups are creating inclusive platforms that center the marginalized, not erase them.
  • Some regional governments are starting to mandate algorithmic audits for public procurement tools.

But it’s an uphill battle. And silence is complicity.
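
For anyone wondering what a mandated audit could actually compute, here is one minimal, widely used check, the adverse-impact (four-fifths) ratio, sketched in Python over a hypothetical decision log. The group labels and the 0.8 threshold are policy choices, not technical constants; the point is that the check is cheap and automatable.

```python
# Minimal adverse-impact audit: per-group selection rates from a system's own
# decision logs, compared against a reference group.
def adverse_impact_report(decisions, reference_group, threshold=0.8):
    """decisions: list of (group, selected: bool) pairs from a deployed system."""
    totals, selected = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + ok

    rates = {g: selected[g] / totals[g] for g in totals}
    ref_rate = rates[reference_group]
    report = {}
    for g, rate in rates.items():
        ratio = rate / ref_rate if ref_rate else float("nan")
        report[g] = {"selection_rate": rate, "impact_ratio": ratio, "flagged": ratio < threshold}
    return report

# Hypothetical decision log from a screening or benefit-allocation system
log = [("general", True)] * 300 + [("general", False)] * 200 + \
      [("sc_st", True)] * 90 + [("sc_st", False)] * 210

for group, row in adverse_impact_report(log, reference_group="general").items():
    print(group, {k: round(v, 2) if isinstance(v, float) else v for k, v in row.items()})
# Any group whose impact_ratio falls below the threshold gets flagged for review.
```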


Conclusion: Your Code Is Political

We cannot debug social injustice with tools built on unjust data. If we don’t address caste bias in algorithms today, tomorrow’s AI will write a future where oppression is automated and resistance is shadowbanned.

This isn’t just a tech problem—it’s a moral one. And the question is: Will we keep pretending our systems are fair? Or will we rewrite the code?


🧠 Think Before You Code. Audit Before You Automate.

Because if we don’t control these algorithms, they will control us—caste and all.

