Abstract
The use of female AI agents, such as voice assistants, chatbots, and robots, is on the rise, but the indiscriminate feminization of these AI agents raises novel ethical concerns about their impact on gender relations in society. This conceptual article argues that AI agents, even virtual ones, can display sexed cues (bodies, faces, and voices) beyond mere gendered cues (e.g., names, pronouns, hairstyle) and asks how assigning artificial female gender and sex to AI agents can harm women and transform gender power dynamics. Grounded in the Social Shaping of Technology and Technofeminism with an existentialist feminist lens, this work draws a parallel with the scrutiny that the use of gendered and sexed cues in female advertising models has faced over past decades to critique the deceptive practice of linking artificial gender and sex in female AI agents. It suggests that by tying a narrow view of gender to a narrow view of biological sex, the use of female AI agents limits women's self-concepts by binding their identities to deceptive, narrow body/face/voice-centric scripts, while facilitating covert manipulation, reinforcing harmful stereotypes, amplifying objectification, and exacerbating gender power imbalances. This research offers ethical guidelines for the further development of AI agents based on transparency, justice, and care, addressing this new form of surveillance capitalism and sexual oppression, and providing insights for creating a more authentic, equitable, and caring technological landscape.
Keywords
Gender; Sex; AI agents; Feminism; Stereotypes; Discrimination; Objectification; Manipulation
Published in
Journal of Business Ethics, vol. 198, no. 1, April 2025, pp. 1–19