American Journal of Public Health
Editorial. 2020 Apr;110(4):440–441. doi: 10.2105/AJPH.2020.305570

Machine Learning and Medical Appointment Scheduling: Creating and Perpetuating Inequalities in Access to Health Care

Michele Samorani, Linda Goler Blount
PMCID: PMC7067080; PMID: 32159974

We are deeply concerned about how machine learning and algorithms create and perpetuate inequalities in health. We are led to believe that algorithms are developed to ensure that no one has an unfair advantage over anyone else and that human bias is removed from decision-making. Sounds good in theory.

In real-life circumstances—such as medical diagnoses, policies that determine access to health care and social services, or where your child is placed in school—algorithms can separate populations into groups of haves and have-nots along racial lines, exacerbating the racial disparities those groups already experience. Algorithms can determine the health of entire communities. Yet, invisible to most of us, algorithms are described as the great equalizers.

However, unlike people, not all algorithms are created equal. Scheduling a medical appointment is the most common way for patients to access a health care provider: a patient asks for an appointment and is given a day and time to see a doctor. If she’s on time, she expects that she’ll be seen at or about the time of her appointment. Straightforward and fair? Or not? Our recent study1 argues that state-of-the-art appointment scheduling algorithms may, in fact, contribute to racial disparities, because they make Black patients wait longer than non-Black patients.

In our study, in which we examined electronic scheduling systems in safety net clinics, we revealed how racial bias is woven into the algorithms of electronic health records scheduling systems. To understand how this happens, consider how modern appointment scheduling systems work. To maximize efficiency, most outpatient clinics overbook some of their appointment slots, that is, they give the same appointment time to more than one patient. Overbooking is meant to ensure that providers are fully utilized even if some patients fail to show up for their scheduled appointment. However, if patients who are scheduled in overbooked slots do show up, some of them will experience waiting time at the clinic because the provider can see only one patient at a time.

Modern appointment scheduling systems decide which patients to overbook through machine learning: when a patient is given an appointment, a machine-learning algorithm predicts his or her individual probability of showing up for the appointment at the scheduled time—the show-up probability. It can be shown that, to maximize efficiency, a clinic should overbook the patients with the lowest show-up probability. Although the purpose is to optimize provider time and clinic revenue, these same algorithms unfortunately overbook Black patients, forcing them to wait longer. Built into the machine learning, apparently, is the assumption that for Black patients, timely, quality care can wait.
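To make that mechanism concrete, here is a minimal, hypothetical sketch of an efficiency-only overbooking rule applied after a predictive model has produced show-up probabilities; the names and numbers are illustrative and are not the actual code of any scheduling system.

```python
# Hypothetical sketch: an efficiency-only overbooking rule. Given predicted
# show-up probabilities, double-book the patients least likely to show up.
from dataclasses import dataclass
from typing import List

@dataclass
class AppointmentRequest:
    patient_id: str
    show_up_probability: float  # model-predicted probability, in [0, 1]

def choose_patients_to_overbook(requests: List[AppointmentRequest],
                                n_overbooked_slots: int) -> List[AppointmentRequest]:
    """Rank patients by predicted show-up probability and return the
    lowest-probability patients as the ones to place in overbooked slots."""
    ranked = sorted(requests, key=lambda r: r.show_up_probability)
    return ranked[:n_overbooked_slots]

requests = [
    AppointmentRequest("patient_a", 0.92),
    AppointmentRequest("patient_b", 0.55),
    AppointmentRequest("patient_c", 0.71),
]
print([r.patient_id for r in choose_patients_to_overbook(requests, 1)])  # ['patient_b']
```

If predicted show-up probabilities are systematically lower for one racial group, a rule like this concentrates overbooking, and the in-clinic waits it creates, on that group.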

Significant amounts of data factor into the calculation of a patient’s show-up probability: sociodemographic information, the patient’s past no-shows, the number of past appointments, how far in advance the appointment is scheduled, and so on. Critically, it is well known that lower show-up probabilities are correlated with factors typically associated with less advantaged socioeconomic status: limited transportation, lack of health insurance, and inconsistent employment, to name a few. In these safety net clinics, Black patients are overrepresented at lower socioeconomic status levels.
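As an illustration of how that correlation enters the prediction, the sketch below fits a show-up model on synthetic data with hypothetical features (past no-shows, booking lead time, insurance, transportation); it is not the clinics’ actual data or model.

```python
# Illustrative only: synthetic data and hypothetical feature names, not the
# clinics' actual records. A show-up model trained on features that correlate
# with socioeconomic status inherits that correlation, even with race excluded.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.poisson(1.0, n),     # past_no_shows
    rng.integers(1, 60, n),  # lead_time_days (how far in advance it was booked)
    rng.integers(0, 2, n),   # has_insurance
    rng.integers(0, 2, n),   # reliable_transportation
])
# Synthetic outcome: showing up is less likely with prior no-shows and long
# lead times, more likely with insurance and reliable transportation.
logit = 0.5 - 0.6 * X[:, 0] - 0.02 * X[:, 1] + 0.8 * X[:, 2] + 0.8 * X[:, 3]
showed_up = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, showed_up)
show_up_probability = model.predict_proba(X)[:, 1]
print(show_up_probability[:5].round(2))
```

In this toy setup, patients without insurance or reliable transportation receive lower predicted show-up probabilities, and under an efficiency-only rule they are the ones placed in overbooked slots.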

Some studies show that Black patients are less likely to show up; other studies show that the patients who are least likely to show up should be overbooked; we connected the dots.

We were honored to work with coauthors Shannon Harris of Virginia Commonwealth University and Haibing Lu and Michael Santoro of Santa Clara University on “Overbooked and Overlooked: Machine Learning and Racial Bias in Medical Appointment Scheduling.”1 Our concern is that the patients least able to afford waiting are forced to wait longer to be seen by providers and that these patients may in fact leave before being seen, perhaps never to return until their health conditions have worsened. Black patients are overbooked not because they’re Black but because of the lived experience of being Black and having a low income. Their failure to show up for appointments, however, can be conflated with race and incorporated into algorithms in the name of efficiency.

Our study developed a solution method that addresses racial disparity by modifying the objective of the scheduling algorithm. Essentially, instead of minimizing the in-clinic waiting time of the general patient population, our method minimizes the in-clinic waiting time of the group that is worse off. In this way, we remove the disparity between groups. When tested on the data set of a large specialty clinic whose Black patients have a lower show-up probability than non-Black patients, our proposed method built schedules without any racial disparity and without any negative impact on clinic efficiency.
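The following is a hedged sketch of that change in objective, using made-up candidate schedules and waiting times rather than the study’s actual scheduling model: score each candidate schedule by the average wait of its worst-off group instead of by the overall average wait, and pick the schedule with the best score.

```python
# Illustrative sketch of swapping the scheduling objective. The candidate
# schedules and waiting times are made-up placeholders, not the study's model.
from typing import Dict, List

Schedule = Dict[str, List[float]]  # group -> expected in-clinic waits (minutes)

def average_wait(schedule: Schedule) -> float:
    """Efficiency-only objective: overall mean wait across all patients."""
    waits = [w for group_waits in schedule.values() for w in group_waits]
    return sum(waits) / len(waits)

def worst_group_wait(schedule: Schedule) -> float:
    """Equity-oriented objective: mean wait of the worst-off group."""
    return max(sum(w) / len(w) for w in schedule.values())

def pick_schedule(candidates: List[Schedule], objective) -> Schedule:
    """Choose the candidate schedule that scores best under the given objective."""
    return min(candidates, key=objective)

schedule_a = {"Black": [40.0, 40.0], "non-Black": [0.0] * 6}   # low average, high disparity
schedule_b = {"Black": [12.0, 12.0], "non-Black": [12.0] * 6}  # equal waits across groups

print(pick_schedule([schedule_a, schedule_b], average_wait) is schedule_a)      # True
print(pick_schedule([schedule_a, schedule_b], worst_group_wait) is schedule_b)  # True
```

The toy numbers only illustrate what each objective rewards; the finding that the equity-oriented objective removed the disparity without hurting efficiency comes from the study’s tests on the clinic’s data.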

In other words, we showed that it is possible to achieve efficiency without adversely affecting patients. More importantly, it is possible to factor the experiences of a disadvantaged population into algorithms in a way that promotes equity.

Our study suggests that there are ways that machine learning and optimization can be used for the benefit of all patients, without leaving anyone behind.

CONFLICTS OF INTEREST

The authors have no conflicts of interest to declare.

Footnotes

See also Morabia, p. 421, and Rodenberg, p. 441.

REFERENCES
1. Samorani M, Harris S, Blount LG, Lu H, Santoro M. Overbooked and overlooked: machine learning and racial bias in medical appointment scheduling.