Published By: The Guardian, 9/8/2016
Summary: Beauty.AI developed a set of algorithms to judge photos against five factors in human standards of beauty; the winners it chose were disproportionately white. The article discusses the potential consequences of emergent bias in algorithms and/or datasets generally, including more consequential examples like predictive policing.
Content Advisory: The Guardian article leads with a beauty-contest photo of women’s bare legs, and makes a brief tangential reference to masturbation.
Extended Discussion Questions:
- What does this story demonstrate about using computers to judge subjective characteristics like beauty?
- How might the results of the Beauty.AI contest have been affected by prejudices that already exist in society? (Answers can be general or address technical details.)
- Do you think the problem could have been avoided? How? What do you think computer scientists should do to avoid these types of problems generally?
- One of the five algorithms, MADIS, compared photos to models or actors within the same ethnic group. Do you think MADIS could still have contributed to the biased results? Why or why not? (Hints could include who gets chosen as a model and where models come from, the breadth of variation within ethnic groups, the number of training photos of models available…)
- To analyze what went wrong, what would you want to know about how the algorithms were designed and what data was used to train them? (If you’ve covered model training.)
Alternative Article: “The First AI-Judged Beauty Contest Taught Us One Thing: Robots Are Racist” (The Next Web, 9/8/2016) lists the five algorithms used (including MADIS).
CSP Global Impact Learning Objectives/EKs:
LO 7.3.1 Analyze the beneficial and harmful effects of computing.
LO 7.4.1 Explain the connections between computing and real-world contexts, including economic, social, and cultural contexts.
EK 7.3.1A Innovations enabled by computing raise legal and ethical concerns.
Other CSP Big Ideas:
3 Data and Information
Banner Image: “Network Visualization – Violet – Crop 8”, derivative work by ICSI. New license: CC BY-SA 4.0. Based on “Social Network Analysis Visualization” by Martin Grandjean. Original license: CC BY-SA 3.0
Tagged: 2 Abstraction, 3 Data & Info, 4 Algorithms, 7.3.1 Benefits and harm, 7.3.1A Law and ethics, 7.4.1 Real-world contexts