
AI is sexist. Here’s how


We were nearing the end of the semester for my third-year undergraduate economics students. One lecture, on gender inequality in India, was still pending in an elective course I teach titled ‘Economics of Poverty and Inequality’. I began my preparation by collecting statistics on the usual metrics concerning women: low workforce participation, the gender wage gap, and so on.

Since these were well-known issues, I realised I was cooking up a boring lecture. While researching new dimensions of gender inequality, I found a book titled Data Feminism. It provided thought-provoking examples of how data science has been used to “discriminate, police, and surveil”. The few pages I read changed the direction of my lecture completely.

I remembered an article published a while back on how large language models, a type of Artificial Intelligence (AI), can be sexist. However, since this information was publicly available, I assumed the sexism would have been corrected in updated versions of these AI tools and would no longer be easy to capture. Boy, was I wrong.

In front of my students, I prompted ChatGPT to generate letters of recommendation for two students, one male and one female, both with equal marks (35 out of 40) in my class on poverty and inequality. When I read the two letters, I realised why a recruiter would pick the male candidate over the female one. The male student was described with adjectives like “confident”, “outstanding” and “technical”, whereas the female student was described as “good for collaborative activities”, “cordial”, “good asset to the company”, “empathetic” and “compassionate”. The demonstration generated a furious reaction from the class. Their very own trusted AI, which they so often used to draft emails and polish their assignments, resumes and applications for jobs or college, was biased.
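For readers who wish to repeat the exercise themselves, here is a rough sketch of how the comparison could be scripted with the OpenAI Python client. The model name, student names and prompt wording below are illustrative assumptions, not the exact prompts used in class:

```python
# Minimal sketch: ask for two recommendation letters that differ only in the
# student's name and pronoun, then compare the language used in each.
# Assumptions: the OpenAI Python client (v1) is installed and OPENAI_API_KEY
# is set; the model, names and prompt wording are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def recommendation_letter(student_name: str, pronoun: str) -> str:
    """Request a recommendation letter for one hypothetical student."""
    prompt = (
        f"Write a letter of recommendation for my student {student_name}. "
        f"{pronoun} scored 35 out of 40 in my undergraduate course "
        "'Economics of Poverty and Inequality'."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Identical marks and course; only the name and pronoun differ.
letter_for_male = recommendation_letter("Rahul", "He")
letter_for_female = recommendation_letter("Priya", "She")

print("--- Letter for the male student ---\n", letter_for_male)
print("--- Letter for the female student ---\n", letter_for_female)
```

Because everything except the name and pronoun is held constant, any difference in the adjectives the model chooses is easy to spot.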


After the lecture, I tried a couple more prompts that evening. I asked ChatGPT to recommend jobs for my students. While the posts recommended for male students included financial analyst, investment banker, data analyst, market research analyst and policy analyst, female students were seen as a better fit for roles such as financial planner, development consultant and environmental economist. Repeated experiments showed that ChatGPT thought men were overwhelmingly suited to jobs requiring rigour in quantitative subjects, mathematics, finance and the like, while investment banking and financial analysis did not figure among the top five jobs for a female graduate in economics.

These results give us food for thought: start writing your own emails, resumes and assignments. Your friend, AI, might introduce a bias and weaken your drafts. AI-assisted writing also takes away the personal relationship we have with our words, vocabulary and language. It makes every piece of writing formal, simpler and standardised. The “craft” in the craft of writing, which comes from the hands of craftsmen (or women), is completely lost in the process. Encouraging students to read and write is critical these days, when all writing can be mass-produced.

We, the teachers, can start small. Discourage every email written using AI. Back in the 2000s, when I was in school, SMS language had become popular: “pls” replaced “please”, and every other term became abbreviated. It was easily caught and corrected by our teachers. While universities are deploying plagiarism tools to detect the use of AI in writing and docking students’ grades, it is important to understand that the bias generated by AI goes unchecked despite this exercise. We need to embrace the imperfection in written drafts and applications so that a student is not forced to make their work look more “formal” with the help of these AI tools. It looks like it’s time to shift gears and make sure we don’t lose the craft of writing, as we have, with time, lost the habit of reading.

Prachi Bansal is an assistant professor at the Jindal School of Government and Public Policy, O P Jindal Global University, Sonipat

National Editor Shalini Langer curates the ‘She Said’ column

