Dear Editor:
Since the release of ChatGPT in 2022, the abilities of large language artificial intelligence (AI) models continue to be explored. ChatGPT-4 can generate AI-based images in response to prompts.
Despite major advancements in technology, artificial intelligence still struggles with skin of color.1,2 Face-analysis algorithms are less accurate for people with darker skin, and new research from Thong et al1 suggests that AI algorithms are biased against skin with yellow and red hues.
With increasing data reflecting disparities for skin of color in artificial intelligence, it is important to understand just how far this inequity extends. When ChatGPT is prompted to “create an image of the most beautiful [man/woman/child/teenager],” all images produced display subjects with Fitzpatrick Skin Type I.
It is essential from a dermatologic perspective to consider what equating beauty with Fitzpatrick Skin Type I means. Intravenous (IV) glutathione is a compound increasingly used in Asian countries to lighten the skin; importantly, it lacks extensive safety data, and the Food and Drug Administration (FDA) of the Philippines has issued a warning against this agent.3 The Philippines FDA reports possible adverse effects that include Stevens-Johnson syndrome, toxic epidermal necrolysis, severe abdominal pain, thyroid dysfunction, renal dysfunction, air embolism, and sepsis.3 Given the extensive use of technology among teenagers and ChatGPT’s bias towards Fitzpatrick Skin Type I images when prompted to generate the “most beautiful” teenager, it may be warranted for pediatric dermatologists to inform their patients and patients’ parents of the harmful effects of IV glutathione as a means of achieving skin lightening.
Another possible consequence of ChatGPT’s definition of beauty is the emotional impact it may have on impressionable youth. The perpetuation of unrealistic beauty standards may impact children with congenital anomalies and their caregivers negatively. These children already face increased rates of bullying, teasing, and taunting from their peers, while their caregivers are also impacted emotionally.4
Given this information and the increasing integration of artificial intelligence into society, it is necessary to consider methods that dermatologists can use to help their patients. As systemic inequity and power imbalances are strongly implicated in AI’s algorithmic bias,5 dermatologists can support their patients by being culturally competent, engaging in multidisciplinary collaboration, advocating for equity in healthcare, providing accessible care, and empowering their patients through support and education. Structural inequality necessitates action at the student, faculty, and institutional levels. Organizations such as the Skin of Color Society, which works to address the root causes of structural inequality, are excellent resources and are easy for dermatologists to join.
In conclusion, ChatGPT’s beauty bias towards images of individuals with Fitzpatrick Skin Type I is highly problematic. Not only do such responses perpetuate racial bias, uphold an ethnically homogenous standard of beauty, and devalue diverse beauty, but such responses also promote colorism, limit representation, broaden social inequity, and impede social progress. Other consequences may include a negative impact on self-esteem, particularly among teenagers and people of Asian origin. As technology continues to grow and progress, equity must advance along with it.
REFERENCES
- Thong W, Joniak P, Xiang A. Beyond skin tone: a multidimensional measure of apparent skin color. Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV). 2023:4903–4913.
- ChatGPT perpetuates racial and gender biases. AuntMinnie. Published January 2, 2024. Accessed January 2, 2024. https://www.auntminnie.com/imaging-informatics/artificial-intelligence/article/15660882/chatgpt-perpetuates-racial-and-gender-biases
- Sonthalia S, Jha AK, Lallas A, et al. Glutathione for skin lightening: a regnant myth or evidence-based verity? Dermatol Pract Concept. 2018;8(1):15–21. doi:10.5826/dpc.0801a04
- Vivar KL, Kruse L. The impact of pediatric skin disease on self-esteem. Int J Womens Dermatol. 2018;4(1):27–31. doi:10.1016/j.ijwd.2017.11.002
- Walker R, Dillard-Wright J, Iradukunda F. Algorithmic bias in artificial intelligence is a problem—And the root issue is power. Nurs Outlook. 2023;71(5):102023. doi:10.1016/j.outlook.2023.102023
