Gender, ethnic, and social class bias in the use of AIGs for audiovisual media
DOI: https://doi.org/10.20868/ardin.2026.15.5662
Keywords: algorithmic bias, generative artificial intelligence, audiovisual representation, gender discrimination, racial discrimination, class discrimination, AI ethics
Abstract
This article summarizes the results of a research project funded by the TAI University School of Arts, whose aim was to critically review the programming that underlies generative artificial intelligence (GenAI) applied to audiovisual creation. During the 2023-2024 academic year, the content generated by several platforms (some requiring a paid subscription) was analyzed, focusing on the biases and stereotypes that emerged during the production of an audiovisual piece.
Different platforms shaped each stage of the short film's production: from scriptwriting, through the development of the shooting plan and the technical script, to editing and post-production. Producing this audiovisual piece served as a starting point for exposing the ideological biases of GenAI, as described below.
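The kind of audit described here, comparing how often a GenAI platform depicts a given gender, ethnicity, or social class for a neutral prompt, can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' instrument: the prompts, labels, and counts below are invented for the example, and the annotations stand in for a human reviewer labeling each generated image.

```python
from collections import Counter

def representation_rates(annotations):
    """For each prompt, return the share of each annotated label
    (e.g., perceived gender) among the generated images."""
    rates = {}
    for prompt, labels in annotations.items():
        counts = Counter(labels)
        total = sum(counts.values())
        rates[prompt] = {label: n / total for label, n in counts.items()}
    return rates

# Hypothetical manual annotations of ten images per neutral prompt.
sample = {
    "a CEO in an office": ["man"] * 9 + ["woman"],
    "a nurse at work": ["woman"] * 8 + ["man"] * 2,
}

rates = representation_rates(sample)
# A skewed rate for a gender-neutral prompt (here, 0.9 "man" for "CEO")
# is the kind of disparity the study reads as a stereotype.
```

In practice such tallies would be computed over many prompts and platforms; the point of the sketch is only that the bias the article discusses is measurable as a distribution over labeled outputs.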
License
ArDIn does not charge authors for processing or publishing an article and provides immediate Open Access to its content. All content is available free to the user or their institution. Users are permitted to read, download, copy, distribute, print, search or link to the full text of articles, or use them for any other lawful purpose, without prior permission from the publisher or author. This is in accordance with the BOAI definition of open access.
- Authors retain the copyright and grant the journal the right to publish under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License, which allows others to share the work with an acknowledgement of authorship, for non-commercial use only.
- Authors may separately establish additional agreements for the non-exclusive distribution of the version of the work published in the journal (for example, placing it in an institutional repository or publishing it in a book).
Unless otherwise indicated, all contents of the electronic edition are distributed under a Creative Commons license.