LESSON FOUR: MODULE TWO
Hidden Filters and Silent Codes
How technology will dismantle caste (or make it worse)
“If Black people don't have broadband access, if HBCU people don't have broadband access, that is going to lead to the fall of America.”
Rashad Bilal, How HBCU Broadband Deserts Could Cost Black People Billions

Kriti Sharma, a global leader in the field of AI and a leading executive in the chatbot industry, discusses its potential future impact on society, and why we should be more afraid of our own biases than the so-called ‘robot apocalypse’.

Can technology, which isn’t human, have a bias against certain people or things? How can stereotypes exist in something that isn’t human? By the end of the lesson, learners will be equipped to recognize biases in digital interactions, challenge algorithmic prejudices and advocate for more inclusive technological practices.

Introduction

When considering how hierarchies show up in technology, it is important to ask the question: Will technology dismantle caste or make it worse? Melinda Gates is a philanthropist, businesswoman and advocate for women and girls. In her book, The Moment of Lift, she writes about diversity as the best way to defend equality. “Gender and racial diversity is essential for a healthy society. When one group marginalizes others and decides on its own what will be pursued and prioritized, its decisions will reflect its values, its mindsets and its blind spots.” Gates continues, “That’s why we have to include everyone in the decisions that shape our cultures, because even the best of us are blinded by our own interests. If you care about equality, you have to embrace diversity—especially now, as people in tech are programming our computers and designing artificial intelligence. We’re at an infant stage of A.I. We don’t know all the uses that will be made of it—health uses, battlefield uses, law enforcement uses, corporate uses—but the impact will be profound, and we need to make sure it’s fair. If we want a society that reflects the values of empathy, unity and diversity, it matters who writes the code.”

Watch the next video and answer the reflection questions that follow.

Crash Course is produced in association with PBS Digital Studios

REFLECTION QUESTIONS

  1. Are we transitioning into an era dominated by technology?
  2. Is there a risk that the age-old biases and prejudices that support and fuel caste systems could get translated or even amplified in the digital space? If you answered yes, which technology platforms do you believe are most likely to be impacted?
  3. As technology grows, who or what entity is responsible for monitoring its impact on human beings?
  4. Many people use technology considered ‘helpful’ to order food for delivery, purchase household items online and control the temperature in their homes. Should there be limits on which technologies are allowed? If so, who decides what we keep and what we stop using?

“On January 9, 2020, Detroit police drove to the suburb of Farmington Hills and arrested Robert Williams in his driveway while his wife and young daughters looked on. Williams, a Black man, was accused of stealing watches from Shinola, a luxury store. He was held overnight in jail.”
MIT Technology Review

When police in Detroit, Michigan, arrested Robert Williams in his front yard while his wife and two young daughters looked on in disbelief, they did so based on a match generated by facial recognition software. According to Nathan Freed Wessler, a member of Williams’ defense team, the police department relied on flawed and biased technology for what turned out to be a false arrest. “These algorithms have been tested by the National Institute of Standards and Technology, a federal agency, and by numerous independent researchers, who have found that facial-recognition algorithms make significantly higher rates of false identifications when used to try to identify people of color, particularly Black people,” says Wessler.

And while facial misidentification headlines have grabbed the attention of many people around the world, there are other examples of biased technology. In some classrooms, A.I. headbands are being used to monitor students’ attention spans throughout the day. Could this technology be used to classify people, reinforce hierarchies or punish students? In the medical field, critical life-saving technology such as pulse oximeters has been found to give less accurate readings for patients with darker skin. If a healthcare tool doesn’t function as well for people with darker skin as it does for people with lighter skin, could this be seen as placing a higher value on one kind of person over another?

In the United States, certain neighborhoods and communities are disproportionately subjected to new technologies that surveil residents and label them as criminals. Investigations into ShotSpotter, a system that uses sensors to detect gunshots and trigger armed police responses, revealed a stark discrepancy in where the sensors are placed. “In most of the white neighborhoods, there are no sensors at all,” reports Vice News. In the four cities they investigated, the “data shows that the sensors are also placed almost exclusively in majority Black and brown neighborhoods, based on population data from the U.S. Census.” The data these technologies collect is then sold and shared, from law enforcement to corporations, to create the tailored customer experiences of the future. Most recently, this kind of surveillance has expanded beyond audio and video to DNA sequencing, setting a troubling new precedent for discrimination based on birth.

Before we begin the next activity, let’s explore a few vocabulary terms that will be helpful to learners as we move forward.

In her book “Race After Technology,” Princeton professor and author Ruha Benjamin coined the term “The New Jim Code” to refer to the ways in which “tech fixes often hide, speed up, or even deepen discrimination while appearing to be neutral and benevolent.” Throughout the book, she explores the many ways in which the increasingly automated and online functions of social life threaten to encode the caste divisions of the past and present into our technological future.

Engineered Inequity

Technologies designed to create convenience and access for users may actually enforce the exclusionary boundaries of caste. As social media continues to connect people across the globe, some online dating services have been created that allow users to select for potential partners based on caste, creating a virtual endogamy that was once enforced through matchmakers and arranged marriages.

However, it’s not only users who discriminate. Several algorithms powered by artificial intelligence have produced clearly discriminatory results without any direct human input. These “racist robots” reflect the failure of the tech industry to properly evaluate the ways in which its own discriminatory practices might be magnified through the technology it produces.

Default Discrimination

In “ORIGIN,” during the family reunion scene, Isabel explains to her cousin that people place other people in “containers”: you see a person walking down the street and, because of the way they look or dress or their body type, you make an assumption about who they are and what they like. Default discrimination refers to the consequences of tech design that lacks proper sociohistorical context. When technology used by people worldwide is produced within an insular and largely homogeneous environment, it shows in the blatant biases these technologies demonstrate. As people become increasingly reliant on these technologies to access important information and services, these biases grow more consequential for our social and political lives.

Coded Exposure

Even in the absence of discriminatory design, social codes based on caste still create biased encounters between technology and lower-caste peoples. This is exemplified by Kodak’s Shirley Cards, a series of photos of women created in the 1950s and used to calibrate color settings for the company’s film. Beginning with the original “Shirley,” these images used white women as the “standard” for calibration, making the film poorly suited for photographing people with darker skin tones.

REFLECTION QUESTIONS

  1. Are there potential societal consequences if only the elite have access to advanced technology?
  2. Should advanced technology be available to all people of all ages, backgrounds and locations? What risks could be associated with the mass distribution of powerful technology?
  3. We learned about endogamy in “ORIGIN.” How is technology, specifically dating apps, impacting how people choose companionship?

ACTIVITY: Can Technology Dismantle Caste?

Technology experts and everyday people fighting discrimination in technology are getting into “good trouble” by exposing the biases they see and asking others to do the same. Their efforts have led to updates in many technology products that otherwise would have continued producing flawed results for those in lower castes.

From large groups like Data for Black Lives, a movement by activists, organizers and scientists who create change by using data, to individuals like a TikTok creator who used machine learning to turn Kenyan sign language into audio, everyone can make a difference.

What can you do to dismantle caste in technology? Watch the TED Talk “How I’m fighting bias in algorithms” by computer scientist and digital activist Joy Buolamwini and answer the reflection questions that follow.

REFLECTION QUESTIONS

  1. You’ve been presented with information about technology bias in healthcare, gaming, policing and other areas. Which topic was most interesting to you?
  2. We all use technology daily: what did you learn in this lesson that you didn’t know before you started?
  3. Can advanced (or old) technology enforce caste systems? If yes, how?

ADDITIONAL RESOURCES

LESSON 4: MODULE 1

Reimagined Roots

LESSON 4: MODULE 3

This Land Is Our Land

ORIGIN 101