Artificial Intelligence (AI): Changes Needed to Consider Gender & Racial Equality

Thought Prompt:

Have you ever been frustrated with Siri, Alexa, or Google not understanding your instructions, or frequently ignoring your voice altogether? Do you use AI technology to screen job applicants and notice a pattern in the applicant pools it generates? Did you know that gender and racial biases have been found in AI programming, and that the problem is only growing? đŸ˜±

That’s right: our advanced technology is not immune to biased and discriminatory algorithms. This raises important questions about AI’s ethics and its impact on society. A study by the Berkeley Haas Center for Equity, Gender and Leadership analyzed 133 AI systems across different industries and found that about 44 percent showed gender bias and 25 percent exhibited both gender and racial bias.

What?! AI technology considers gender?!

The Problem:

AI exhibits gender bias because of the data it was trained on, and in particular because of “word embeddings.” This technique encodes words as numerical representations in machine learning, capturing their meanings and associations with other words so that machines can understand and interact with human language. If an AI system is trained on data that associates women and men with different, specific skills or interests, it will generate content reflecting that bias, as the sketch below illustrates.
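
To make the mechanism concrete, here is a minimal sketch of how such associations can be measured in an embedding space. The word vectors below are toy, hand-made values for illustration only (they do not come from any real model); in practice you would load pretrained embeddings such as word2vec or GloVe and run the same comparison.

```python
import numpy as np

# Toy 3-dimensional "embeddings" (hypothetical values, skewed on purpose
# to mimic the kind of gendered associations learned from biased text).
vectors = {
    "he":       np.array([0.9, 0.1, 0.2]),
    "she":      np.array([0.1, 0.9, 0.2]),
    "engineer": np.array([0.8, 0.2, 0.3]),
    "nurse":    np.array([0.2, 0.8, 0.3]),
}

def cosine(a, b):
    """Cosine similarity: closer to 1.0 means more strongly associated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

for word in ("engineer", "nurse"):
    gap = cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])
    print(f"{word}: similarity(he) - similarity(she) = {gap:+.3f}")

# A consistently positive or negative gap across occupation words is one
# simple signal that the embedding space has absorbed gendered associations
# from its training data.
```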

đŸ€– Who creates AI, and which biases are (or are not) built into AI training data, can perpetuate, widen, or reduce gender equality gaps.

According to the Global Gender Gap Report 2024, the gender disparity is especially pronounced in the technology workforce, with women making up only 30% of the AI and big data workforce, 31% of programming jobs, and 31% of network and cybersecurity positions. Even though more women are graduating into and joining STEM careers today than ever before, they are consistently kept in entry-level positions and are less likely to hold the leadership roles that could shape AI algorithms. This means that AI is mostly developed by men and trained on datasets that primarily reflect men, so AI is designed to work best FOR men. Think of how this affects recruitment and hiring, especially when most companies use some form of AI technology to filter applicants and find potential candidates. Think about how this bias can produce inaccurate answers for patients using AI-based medical and health provider apps.

The Future:

Removing gender bias from AI starts with prioritizing gender equality as a goal while AI systems are conceptualized and built. This includes assessing data for misrepresentation, providing data that represents diverse gender and racial experiences, and reshaping the teams developing AI to make them more diverse and inclusive. It is crucial to incorporate a wide range of expertise in AI development, including gender-specific knowledge. Doing so will improve the performance of machine learning systems and contribute to a more equitable and sustainable world.
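
One of the steps above, assessing data for misrepresentation, can begin with a simple audit of who is actually present in the training data. The sketch below is a minimal illustration under assumed inputs: the file name, the "gender" column, and the benchmark shares are hypothetical placeholders, not a prescribed method.

```python
import pandas as pd

# Hypothetical dataset of labeled training examples with a demographic column.
df = pd.read_csv("training_data.csv")

# Observed share of each group in the training data.
observed = df["gender"].value_counts(normalize=True)

# Illustrative benchmark: the population the system is meant to serve.
expected = pd.Series({"woman": 0.50, "man": 0.50})

gap = (observed - expected).dropna()
print("Representation gap by group (observed share minus expected share):")
print(gap.round(3))

# Large negative gaps flag under-represented groups whose experiences the
# model is less likely to learn, one source of the biases discussed above.
```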

đŸ’Ș The AI field needs more women, and that requires enabling and increasing girls’ and women’s access to and leadership in STEM and ICT education and careers.

In the fast-evolving AI sector, the absence of diverse racial and gender perspectives, data, and decision-making could perpetuate significant inequalities well into the future. This oversight could lead to a decline in service quality and biased decision-making across sectors including employment, credit, and healthcare. The UN Women position paper on the Global Digital Compact (GDC) provides concrete recommendations to harness the speed, scale, and scope of digital transformation for the empowerment of women and girls in all their diversity, and to trigger transformations that set countries on paths to an equitable digital future for all.

💭 Let’s start a conversation and discuss how we can address this issue and create a more inclusive future for all.

#AI #ArtificialIntelligence #GenderEquality #GenderDisparity #EqualityInAI #InclusiveTechnology #EthicsInTech #BreakingBarriers #TechTalk #DiversityandInclusion #HigherEducation #ResearchTopics

 

Sources:

Nicoletti, L., & Bass, D. (2023, June 9). Humans are biased. Generative AI is even worse. Bloomberg. https://www.bloomberg.com/graphics/2023-generative-ai-bias/

Smith, G., & Rustagi, I. (2021). When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity. Stanford Social Innovation Review. https://doi.org/10.48558/A179-B138

UN Women. (2024). Placing gender equality at the heart of the Global Digital Compact: Taking forward the recommendations of the sixty-seventh session of the Commission on the Status of Women. https://www.unwomen.org/en/digital-library/publications/2024/03/placing-gender-equality-at-the-heart-of-the-global-digital-compact

World Economic Forum. (2024, June 11). Global Gender Gap Report 2024. https://www.weforum.org/publications/global-gender-gap-report-2024

Academic Integrity – Reasoning vs. Consequences

Think about how you present academic integrity in your class. All too often, academic misconduct and its consequences are treated as the sole components of academic integrity. Too much emphasis is placed on a single exam, paper, or project, increasing student pressure and the likelihood of cheating. When students have guidance on how to navigate your course, approach their studies, and reflect on their learning, they are more motivated, more deeply engaged, and more academically successful.

Now think about your parenting style versus how you were raised. What methods did your parents or guardians use to teach you important lessons? What methods do you use with your children? Which system or methods do you feel have been most effective in communicating values and emphasizing why those values matter? When you instruct your children not to do something, don’t they almost always ask “Why?” Is your parents’ or guardians’ typical response, “Because I said so,” effective with your own children, or do they challenge your reasoning? This inquisitive thinking should be applied to discussions about academic integrity.

Provide actual reasoning as to the WHY. Communicating the rationale behind your content choices, your assignment design, and the course structure itself has been shown to improve student retention, grades, and learning outcomes (Winkelmes et al., 2016; Ou, 2018). Highlight the foundational values underlying the education system: original thought, scholarly conversation, respect, understanding, and exploration. For example, a student who can explain a concept in their own words, rather than regurgitating the text, has truly learned that concept. When a student truly understands a concept, they can develop their own voice, and scholarly discussions can take place. This in turn allows new concepts and ideas to develop, driving overall intellectual advancement.

Evaluate your learning environment. Overemphasis on grade performance and high-stakes exams or assignments increases the likelihood of cheating (Lang, 2013). Visualize a toddler sticking their hand in the cookie jar. The toddler knows they will get in trouble if caught taking cookies without permission. Yet, given the chance, the child might still drag a chair or stool to the counter to get ahold of one of those delightful treats.

I’m not saying students are children, but the same logic translates. Simply knowing the consequences of cheating only deters misconduct; it neither stops it nor encourages constructive behavior. Students enrolled in courses where tests are worth 50% or more of their grade are more likely to be concerned with performing well than with learning the material, which increases the likelihood of misconduct (Karpicke & Roediger, 2008).

 

Karpicke, J. D., & Roediger, H. L. (2008). The critical importance of retrieval for learning. Science, 319, 966–968.

Lang, J. M. (2013). Cheating lessons: Learning from academic dishonesty. Harvard University Press.

Ou, J. (2018). Board 75: Work in Progress: A Study of Transparent Assignments and Their Impact on Students in an Introductory Circuit Course. Paper presented at 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah. https://peer.asee.org/30100

Ohio State University. Designing assessments of student learning. https://teaching.resources.osu.edu/teaching-topics/designing-assessments-student

Winkelmes, M., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Harriss-Weavil, K. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2).