Artificial intelligence has a gender-bias problem - just ask Siri

SOURCE: HSRC Review
OUTPUT TYPE: Journal Article
PUBLICATION YEAR: 2020
AUTHOR(S): R. Adams
KEYWORDS: ARTIFICIAL INTELLIGENCE (AI), GENDER EQUALITY, SIRI, VIRTUAL PERSONAL ASSISTANT
DEPARTMENT: Impact Centre (IC), Impact Centre (PRESS), Impact Centre (CC)
Print: HSRC Library: shelf number 11311
HANDLE: 20.500.11910/15262
URI: http://hdl.handle.net/20.500.11910/15262

If you would like to obtain a copy of this Research Output, please contact Hanlie Baudin at researchoutputs@hsrc.ac.za.

Abstract

All the virtual personal assistants on the market today come with a default female voice and are programmed to respond to all kinds of suggestive questions. Does their design as stereotyped females suggest that, in the midst of a global technological revolution, women remain trapped in the traditional roles and personalities of the past?