Despite industry initiatives and government regulations to ensure fairness in AI, gender bias remains a concerning issue, causing poor user experiences, injustices, and psychological harm to women. Computing education has incorporated ethics discussions to prepare students to design more ethical AI systems. However, based on interviews with 18 AI practitioners/learners, we reveal that gender bias education in current computing curricula is often absent, sporadic, abstract, or overly technical. We design and evaluate hands-on tutorials to raise AI practitioners'/learners' awareness and knowledge of gender bias, complementing the insufficient coverage of AI gender bias in CS/AI courses. Reflecting on lessons from the design and evaluation process, we synthesize design implications and a rubric to guide future research, education, and design.
Gender bias pervades AI tools for recruitment, recommendation, and many other scenarios that people regularly encounter, giving rise to frequent procedural and substantive injustices against women. For example, women receive fewer job-posting advertisements on social media, and medical AI is less effective for women.
Gender bias in AI thus remains a pressing issue, and AI creators are the ones responsible for building bias-free AI systems. Yet AI creators often lack knowledge of preventive measures against bias in AI applications, and male AI creators, who account for a significant portion of this workforce, are less aware of gender bias than their female counterparts. Equipping AI creators with both awareness and knowledge of AI gender bias is therefore timely and important. However, little research has examined how, and how well, AI gender bias is taught in CS/AI courses. To our knowledge, there is only one education tool on this topic; it targets children and youth and effectively raises their awareness of gender bias in AI, but it is not suitable for AI creators, who need more technical knowledge such as debiasing techniques. How best to introduce AI gender bias to AI creators remains under-investigated.
To bridge this research gap, we designed hands-on tutorials that teach AI gender bias to AI creators and equip them with awareness and practical knowledge. At the awareness level, we aimed to help them better recognize gender bias in AI and motivate them to address it. At the knowledge level, we aimed to convey technical knowledge, e.g., how gender bias is introduced into AI (sources of gender bias) and how it can be mitigated (debiasing techniques). To engage learners, we built the tutorials around two real-world scenarios: AI-based recruitment, which helps organizations effectively source and screen candidates, and AI-based autocomplete, which search engines use to complete queries as users type. To make the education meaningful for real-world AI development, we embedded technical knowledge (e.g., debiasing techniques) and components (e.g., code, datasets) into the tutorials.
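To illustrate one of the recruitment-tutorial debiasing techniques, "neglecting the gender feature" can be read as dropping the protected attribute from the model's inputs before training. The sketch below is a minimal, hypothetical reconstruction; the field names and records are illustrative, not taken from the actual tutorial dataset.

```python
# Sketch of the "neglect the gender feature" idea: the protected
# attribute is removed from every candidate record before a model
# (e.g., the tutorial's Random Forest) is trained on the data.
# Field names below are illustrative assumptions.

def drop_protected(records, protected=("gender",)):
    """Return copies of the records without protected attributes."""
    return [{k: v for k, v in r.items() if k not in protected}
            for r in records]

candidates = [
    {"gender": "F", "years_experience": 5, "degree": "MSc", "hired": 1},
    {"gender": "M", "years_experience": 3, "degree": "BSc", "hired": 0},
]

debiased = drop_protected(candidates)
# Each record now carries only non-protected features and the label.
```

Note that removing the gender column alone does not eliminate bias if other features act as proxies for gender, which is one reason the tutorials also cover data modification.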
| Step | Recruitment Tutorial | Autocomplete Tutorial |
|---|---|---|
| Tutorial Overview | Overview of following steps | Overview of following steps |
| Scenario Explanation | Gender bias in AI-based recruitment | Gender bias in autocomplete |
| Dataset Selection | Identifying gender bias in dataset | Identifying gender bias in dataset |
| Model Explanation | Random Forest | Markov Language Model |
| Debiasing Technique-1 | Data modification | Data modification |
| Debiasing Technique-2 | Neglecting gender feature | Gender swapping |
Table 1. Tutorial User Flow
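The "gender swapping" step in the autocomplete tutorial can be sketched as a data-augmentation pass: each training sentence is mirrored by swapping gendered words, and both versions are kept so the Markov language model observes each context with both genders. The word list and corpus below are illustrative assumptions, not the tutorial's actual data.

```python
# Sketch of gender-swapping augmentation for a Markov language model's
# training corpus. The swap dictionary and sentences are illustrative.

SWAP = {"he": "she", "she": "he", "his": "her", "her": "his",
        "man": "woman", "woman": "man"}

def gender_swap(sentence):
    """Replace each gendered word with its counterpart."""
    return " ".join(SWAP.get(w, w) for w in sentence.split())

def augment(corpus):
    """Keep every original sentence plus its gender-swapped mirror."""
    return corpus + [gender_swap(s) for s in corpus]

corpus = ["she is a nurse", "he is a doctor"]
balanced = augment(corpus)
# balanced also contains "he is a nurse" and "she is a doctor",
# so the model's next-word counts are no longer skewed by gender.
```

After augmentation, n-gram counts conditioned on "he" and "she" are symmetric by construction, which is what keeps biased completions out of the trained autocomplete model.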
Eighteen AI creators evaluated our tutorials: four AI researchers focused on the technical aspects of AI, four AI/HCI researchers focused on the user/society aspects of AI, three AI developers, two AI product managers, and five students who had studied AI.
In the pre-study interview, participants commonly reported lived experiences of gender bias when using or creating AI, and indicated that CS/AI courses covered the topic insufficiently and that no dedicated education tools existed. This lack of education left them unable to identify gender bias in AI or to mitigate it.
After completing our tutorials, participants showed improvement in both awareness and knowledge of AI gender bias. Awareness-wise, they expressed a strengthened ability to identify gender bias in AI and an enhanced intent to address it. Knowledge-wise, they perceived a heightened level of technical knowledge, evidenced by higher accuracy in knowledge-question (KQ) surveys, and felt more confident in debiasing AI.
Participants reported a generally good user experience with our tutorials and suggested areas for further improvement. Table 2 summarizes average usability scores (0–100, converted from a 7-point Likert scale) for each tutorial.
| Item | Recruitment | Autocomplete |
|---|---|---|
| The tutorial was well organized and made good use of time | 79 | 87 |
| I learned about something technical | 72 | 75 |
| I learned about something important for society | 86 | 81 |
| I felt the tutorial was interesting | 83 | 82 |
| I felt the tutorial kept my attention | 83 | 86 |
| The sources of bias introduced were easy to understand | 87 | 89 |
| The debiasing methods introduced were easy to understand | 89 | 88 |
| Overall score | 83 | 84 |
Table 2. Usability Scores Summary (Converted From 7-point Likert Scale)
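The exact Likert-to-percentage conversion is not stated in the text; a standard linear rescaling that maps 1 to 0 and 7 to 100 is one plausible reading, sketched below as an assumption rather than the study's confirmed formula.

```python
# Hypothetical reconstruction of the 7-point-Likert to 0-100 conversion:
# a linear rescaling mapping the scale endpoints 1 -> 0 and 7 -> 100.

def likert_to_100(x):
    return (x - 1) / 6 * 100

assert likert_to_100(1) == 0
assert likert_to_100(7) == 100
```

Under this mapping, a mean item rating of 6 corresponds to roughly 83, in line with the overall scores reported in Table 2.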
1. AI gender bias education is urgently needed yet insufficient.
2. Our tutorials improved AI creators’ awareness and knowledge of AI gender bias.
3. Hands-on activities are highly effective in engaging learners.
| Category | Learning Objective | Tasks | Evaluation | Rating |
|---|---|---|---|---|
| Awareness | Understanding | Defining Bias | Tutorial | Yes/No |
| | Identifying | Dataset Selection | Tutorial | Yes/No |
| | Willingness to Address | Self-Reporting | Exit Interview | Yes/No |
| Knowledge | Sources of Bias | Social Bias | Exit Interview | 0–10 |
| | | Technical Bias | Exit Interview | 0–10 |
| | Debiasing Techniques | Dataset Modification | Tutorial, Exit Interview | 0–10 |
| | | Model Modification | Tutorial, Exit Interview | 0–10 |
Table 3. A Rubric for Teaching AI Bias
1. Equipping Tech People with Gender Awareness.
2. Accommodating Different Levels of Tech Literacy.
3. Promoting Workforce Diversity.