Abstract
Contemporary health and social care is saturated by processes of datafication. In many cases, these processes are nested within an ostensibly simple logic of accountability: Define a politically and morally desirable goal, then measure the level of achievement. This logic has come to permeate public health initiatives globally and today it operates in most health care systems in various ways. We explore here a particular instantiation of the logic associated with the introduction of a measurement instrument used in Danish home care. Building on ethnographic fieldwork, interviews, and analysis of policy documents, we show how the instigated processes of datafication—despite hopeful political claims—erode care levels and disempower older people. We believe that these findings can be of relevance for other settings that subscribe to the same accountability logic and to similar forms of measurement instruments.
Keywords: accountability, data, datafication, health care, home care
Introduction
This article describes what happened as a Danish municipality decided to shift methods in home care delivery and measure the effect of that change. The stated aims of the new method were to provide better care for less money and to empower older persons. The shift was accompanied by the introduction of a particular logic of accountability that uses a measurement instrument to prove that the desired effect has been achieved. This accountability logic is increasingly common around the world, with new measurement instruments developed across different settings. In our case, however, the data that were accumulated did little to ensure accountability. We therefore suggest reconsidering the political power ascribed to data. Data, we propose, can be surprisingly weak.
The dominant accountability logic in this form of datafication is ostensibly simple. It relies on combining goals with measurements: First, define a politically and morally desirable goal, then measure its level of achievement. This logic has come to permeate public health initiatives globally as an element of an intricate politics of evidence (Adams and Biehl 2016). In 2015, the United Nations (UN) General Assembly decided on 17 sustainable development goals—i.e., they defined a set of politically and morally desirable goals—and then translated them into 232 indicators to measure their level of achievement. Similar processes have defined interventions in reproductive health (Halfon 2006) and human rights, gender violence, and sex trafficking (Merry 2016). This logic illustrates a form of “trust in numbers” (Porter 1996), where numbers are expected to help ensure accountability (Power 1997). This confidence in data contributes to particular forms of data politics (Bigo et al. 2019). Powerful global actors thus propagate the conviction that more data are needed to ensure accountability in initiatives aimed at empowering the fragile and poor. The UN, for example, published a report, A World that Counts, which stated that “[d]ata are the lifeblood of decision‐making and the raw material for accountability” (2014: 2), while praising data in the following way: “Never again should it be possible to say ‘we didn't know’. No one should be invisible. This is the world we want—a world that counts” (2014: 3). Informed by such beliefs, numerous organizations, including the Bill and Melinda Gates Foundation, have invested in systems of data collection to enhance accountability in relation to politically determined goals (Hoeyer et al. 2019).
In this article, we trace the local history of a specific measurement instrument in a Danish municipality: where it came from, how it entered the municipality, how it became entrenched in practice, and what happened in the process to the type of home care it was supposed to measure. We argue that though numbers and datafication are attributed significant political power in popular policy discourses and often also in critical social science (Adams 2016a), this case serves as a warning against attributing too much political power to data. Data might seem mighty and seductive (Merry 2016), but their power easily dwindles in the political machinery, and their ability to ensure governmental accountability remains limited. It takes political will to deliver care to those in need. This should come as no surprise, readers might think, but it becomes important to reiterate this basic point at a time when progressively more hope is invested in the power of data. By exploring the work of the logic and the measurement instrument in a wealthy welfare state with a long tradition of high standards of care, we believe the warnings should have general appeal and relevance—also for less fortunate settings. Before we turn to the narration of the local history of the measurement instrument, we briefly introduce its place in Danish home care, revisit the anthropological literature on data and measurement instruments, and outline our methods.
A Measurement Instrument Expected to Ensure Accountability in Home Care
The specific measurement instrument that we investigate can be viewed as a Weberian ideal type of the logic of separating goal setting and data collection. Its name is Patient Specific Functional Scale (PSFS) and it was introduced in municipal home care in Denmark to measure the effect of a shift in method in care for older people from emphasizing service delivery to emphasizing restoration of functionality through training in everyday activities. This new form of home care is termed “reablement.”
The PSFS consists of a blank chart on which older persons are asked to list up to five everyday activities in which they wish to improve their functionality (e.g., vacuum cleaning or bathing). For each activity, the older persons are asked to rate their ability on a scale from 0 to 10 before and after an eight‐week training program. Increased measures on the scale serve as an indicator of the effect of the program. The aggregate data from PSFS—in the form of the difference in functional level before and after reablement training—are expected to hold the municipality accountable by measuring the effect of the intervention: Higher measures are taken as proof of increased functionality and increased functionality legitimates reductions in homecare services. As we will discuss below, proponents of the PSFS see the instrument as empowering older persons by allowing them to set their own goals for reablement and then seeing the effect through measurements. In this way, it mirrors the same logic of separating goal‐setting and measurement‐of‐achievement at an individual and at an aggregate municipal level.
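The arithmetic behind the instrument can be sketched briefly. The following is an illustrative sketch only, with hypothetical scores and a hypothetical helper function (not the municipality's actual software): each person rates up to five self-chosen activities from 0 to 10 before and after the eight-week program, and the aggregate effect is the mean before/after change.

```python
# Illustrative sketch of PSFS-style aggregation (hypothetical helper,
# hypothetical scores). Each older person rates up to five self-chosen
# activities from 0 (unable) to 10 (fully able), before and after the
# eight-week reablement program.

def psfs_change(before, after):
    """Mean change across one person's rated activities."""
    assert len(before) == len(after) and 1 <= len(before) <= 5
    return sum(a - b for b, a in zip(before, after)) / len(before)

# Hypothetical scores for three older persons: (before, after) per activity.
cohort = [
    ([3, 2, 4], [5, 4, 6]),   # improved by 2 on each activity
    ([6, 5], [6, 5]),         # no change
    ([4], [3]),               # declined
]

individual = [psfs_change(before, after) for before, after in cohort]
aggregate = sum(individual) / len(individual)
print(individual)  # [2.0, 0.0, -1.0]
print(aggregate)   # mean change across the cohort, reported as the "effect"
```

A positive aggregate is what, in the logic described above, legitimates reducing home care services; the sketch also shows how individual decline can disappear inside a cohort mean.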
Danish welfare services are organized through three levels of government: national‐level coordination, regions responsible for specialized care, and municipalities responsible for most of the public services ordinary citizens encounter in their everyday life: social services, nursing homes, nurseries, schools, libraries, and so on. While municipalities are responsible for home care, they operate within a national legal framework monitored by the ministry. The national framework defines entitlements to ensure minimum levels of care across all 98 Danish municipalities. Data are supposed to hold municipalities accountable for meeting these standards, but increasingly data are also used with the purpose of facilitating new, “smart,” and data‐driven ways of optimizing service delivery (Petersen 2019). In this way, data are expected to help identify problems, find solutions, and document that the solutions work. To ensure fiscal stability, national rules impose budget constraints on all municipalities. Adding to the economic constraints is a changing demography with a growing older population in need of care. It is in this environment that reablement has become popular among Danish mayors. Reablement is offered to people over the age of 65 who apply for publicly funded home care, and it is expected to enable older people to retain the competences of their younger selves so that they no longer need home care. However, reablement differs from rehabilitation, which is typically offered to people of all ages recovering from an injury or illness, in that reablement is directed at home care applicants of whom many suffer from a gradual loss of functional capacity to care for themselves caused by age or dementia (Hansen 2016). Despite age‐induced functional loss, they are now expected to regain competences through training.
In short, PSFS separates goal‐setting and measurement and is expected to ensure accountability toward the national standards of good care (documenting that people do not need help and the municipality legitimately can cut expenses), while enhancing respect for the rights and dignity of older people. On the ground, however, this is not exactly what happens.
In Need of Measurement? The Datafication of Home Care
Data permeate contemporary health and social care at almost every level—from political and administrative decisions to the interaction with individual citizens in need of care. A rapid growth in data availability is facilitated further by digital tools that generate data traces out of everyday work and movements (Moore 2017; Petersen 2019). Oxlund and Whyte (2014) identify three types of data collection in eldercare that are all intensifying: measurements quantifying pathological biological values as a basis for medical intervention (Greene 2007); measurements used to track and assess health interventions in the home (Schwennesen 2017); and measurements quantifying functional capacity to be used in the allocation of welfare services (Bowling 2004). The specific case developed below belongs to the latter category, but common to all three types of measurements is that they are seen by policymakers as involving opportunities for achieving a more just allocation of health and social care services based on the same logic: Set a goal and collect data to measure the extent to which it is met. 1
Advocates of big data methodologies have described datafication as a straightforward matter of quantification: “To datafy a phenomenon is to put it into a quantified format so it can be tabulated and analysed” (Mayer‐Schönberger and Cukier 2013: 78). The datafication of health care is anything but straightforward, however, and the translation of multifaceted and unbounded phenomena into structured data formats sets in motion complex technological, social, and political dynamics (Boellstorff and Maurer 2015; Ruckenstein and Schüll 2017). As demonstrated by Martin and Lynch (2009), even the simple act of counting is a lot less simple than it sounds. Phenomena are often difficult to delineate, and those interested in counting do not always have access to the desired information. Furthermore, datafication is not just a matter of quantification. There is always a qualitative component, a narrative element, involved. The narrative element describes what data are supposed to capture. The accountability logic we describe here characteristically separates the qualitative and quantitative aspects of datafication: A political definition of a goal takes narrative form, and numbers are then used to capture whether organizations reach that goal.
Daston and Galison (2010) point out how quantitative data are aligned with popular ideas about objectivity. This feature helps ensure the political efficacy of instruments that claim to separate the political–normative setting of a goal from its technical–administrative achievement. Data, and in particular their quantitative elements, have become integral to efforts aimed at enhancing accountability in public organizations (Bessette 2001). This does not mean that quantitative data succeed in erasing ambiguity or local tinkering (Best 2012; Triantafillou 2014). In some cases, the data work said to deliver accountability has been found to hide as much as it reveals (Sullivan 2017). Health care staff sometimes describe the involved tinkering as a fight against surveillance systems that impede care (Høybye‐Mortensen 2015; Hunt et al. 2017; Hutchinson et al. 2018). Nonetheless, when frontline staff do complain, it often leads to additional investments in “better data tools” rather than the abandonment of the tools. Complaints rarely affect the overall legitimacy of the quest to become a data‐driven and learning health‐care system (Olsen et al. 2007; Pedersen 2019).
Data facilitate forms of ranking that have become intrinsic to new public governance (Merry 2016), though it is well known that datafication processes both illuminate and hide aspects of the phenomena they are supposed to capture (Kitchin 2014; Sullivan 2017). Quantitative data tend to travel further and be used for more general management decisions, while qualitative insights and local narratives about their meaning tend to remain just that: local (Dahler‐Larsen 2011). In other cases, it turns out that not even the quantitative data travel. They are generated and accumulated, but not used for anything (Shaw et al. 2009). Nevertheless, data continue to possess a particular lure of attraction similar to that of beauty (Halpern 2014). They can be seductive (Merry 2016) and tend to remain subject to administrative desire even when they fail to do what they were supposed to do.
Despite the praise of new digital tools, which are supposed to make data collection seamless, many forms of data still stem from manual labor. In fact, data work consumes significant time and energy, and this work sometimes subtracts from the time available for care work (Wadmann et al. 2018; Wiener 2000). Data work has become a fertile ground for ethnographic studies exploring what data do to people and what people do with data (Adams 2016a; Biruk 2017, 2018; Carruth 2018; Erikson 2018; Hogle 2019; Hunt et al. 2017; Merry 2016). This study broadly belongs to this budding field that Levin (2019) refers to as an “anthropology of data,” questioning “the norms, politics, and values that get wrapped up in data” (p. 665). We are not against data, but we wish to explore what data work does and to question some common assumptions about the power of data. Thereby, perhaps, we may help install more effective forms of accountability.
Methodology
How do you study the history of a measurement instrument? Where do you locate the logics that bring it about, and where do you go to explore its implications? We came across the PSFS in the course of 14 months of ethnographic fieldwork aimed at understanding changing conditions for care among older people. Bødker carried out this fieldwork in a Danish municipality over several periods in 2015, 2016, and 2019 as a form of multi‐sited ethnographic fieldwork (Marcus 1995). Meanwhile, Hoeyer was exploring the data politics of what he calls “intensified data sourcing” in Danish health care (Hoeyer 2016). Seeing that the PSFS captured generic elements of data tools used across the health services, we teamed up and decided to explore how it had come about and what it did in practice. We began tracing the documents used to argue the need for the PSFS, and we followed the data produced with it. All along, we kept relating this work to the political agendas relating to care for older people and the political decisions affecting the home care services that the PSFS data were supposed to measure (in the form of the difference in functionality between before and after reablement training). The fieldwork thus consisted of a combination of participant observation, semi‐structured interviews, and document analysis.
When tracing the documents, we realized that some of our own esteemed colleagues had participated in validation studies of the PSFS instrument. Suddenly, this became “ethnography at home” in a very direct sense (Jackson 1987). It turned out to be a great opportunity, and with their help we could dive deeper into the hopes and aspirations associated with the tool. From January to May 2019, we conducted eight interviews with various actors working with reablement data. In addition to one with our colleague, who conducted the validation of the PSFS for the municipality, the interviews included five people working with reablement data infrastructures in the municipality, and two working with national standards. During this period, political uses of data became subject to what Cool (2019) has termed increased “accountability anxiety.” Consequently, some of our informants have had to insist on strict protection of their identity. This illustrates the difficulty staff face in speaking out against management's interpretation and use of data, but, unfortunately, it means that we have had to present this material with less transparency than would have been academically desirable.
The everyday practices of using the PSFS were observed by Bødker, who shadowed health care professionals engaged in using PSFS in relation to reablement activities (Czarniawska‐Joerges 2007). Observations included 28 assessment meetings (i.e., meetings in the older person's home attended by an interdisciplinary reablement team [typically consisting of a therapist, assessor and/or nurse and less frequently a home helper]). The purpose of these meetings is to decide whether the older person has the potential to benefit from reablement (i.e., from receiving training instead of simply receiving traditional home care services). Additionally, we draw on empirical material from the final assessment meetings in the homes of the older persons, also attended by the interdisciplinary team and the older person. Semi‐structured interviews (Kvale and Brinkmann 2009) were conducted with 10 professionals who administer the PSFS in their everyday work: five assessors, three therapists, and two nurses. To further understand the political pressures for increased data collection, we also drew on Hoeyer's interviews and document analysis exploring the data politics of health data infrastructures in Denmark (Hoeyer 2016, 2019; Hoeyer and Wadmann 2020). We have translated quotes, and informants’ own emphasis is marked using italics. 2
Our analysis below reflects a thematic coding (Madden 2010), which divided the material into three sections aimed at describing the history of the PSFS: first, how the instrument was chosen as ‘the right tool for the job’ of measuring the effect of reablement; second, how the PSFS shapes the everyday encounters of older people going through reablement; and finally, how the political uses and non‐uses of the PSFS data have implications for home care services.
The Identification and Validation of a Measurement Instrument
The PSFS was developed in the 1990s in the United States to measure patients’ physical progress in the field of orthopaedic rehabilitation (Stratford 1995). Why, then, was it introduced into Danish reablement services? When interviewing people about why the municipality chose the PSFS, they explained how the scale could balance different and at times conflicting ambitions. They needed a tool that could ensure high‐quality and citizen‐centered care, document efficiency and accountability, ensure scientific validity, and limit documentation work to a minimum to fit a busy everyday practice. As we will now illustrate, the PSFS was not perfect for any of these agendas, but it spoke to all of them enough to serve as a compromise.
The delivery of high‐quality care to senior citizens is a long‐standing ambition in the Danish tax‐financed welfare state and, in recent years, this ambition has also come to emphasize goals of ensuring dignity and empowerment. Home care services should now be “citizen‐centered” (The Home Care Commission 2013). Our interviewees thus emphasized that the PSFS allows citizens to focus their care on what is important to them. The introduction of short‐term reablement programs as an alternative to long‐term home care emanates from Fredericia Municipality (Kjellberg 2010). It has been presented as a way of re‐enabling older persons to function independently, ideally to an extent that they do not need home care by the end of the reablement program. The quick dissemination of reablement among Danish municipalities might reflect the expectation that reablement is cheaper than traditional home care because it runs over a shorter period (Hansen 2016). In the current political environment, however, it is important to document the effect of this transition.
The ambition to document the effects of the shift relates to a political agenda for the public sector at all levels to become more data driven. In 2014, the Ministry of Health, in collaboration with Deloitte, analyzed Denmark's health data infrastructure, publishing a transformation plan to improve the use of health data (Deloitte 2014). This was in response to an OECD report, which concluded that Denmark does not make full use of its data potential (OECD 2013). It led to the Health Data Programme operating as a framework for health data collection and use (Ministry of Health 2015). The program's stated aim is to achieve “better health through better use of data.” Building on the Health Data Programme, in 2018 the organization representing all Danish municipalities, Local Government Denmark, initiated the policy program “Visible Municipal Health and Eldercare Data” (Local Government Denmark 2018). In an interview with a man we call Peter, who had worked in the municipality for years, we were told that the OECD report had also created a demand for a different type of data, namely on effects (rather than on structures and processes):
We have to look at the results that we actually achieve for the citizens. Not on the big ones like mortality or something like that […], but we have to go closer, we have to go all the way down to the level of the citizen and look at what is the citizen‐perceived effect of the services that we go around providing. There are many good reasons for that. […] If we stopped doing all of that, which has no effect in the Danish health care system, we would have plenty of funds at our disposal. (Peter, data analyst)
Peter here assumes that many activities have no effect. He thereby articulates a link between measuring citizens’ experiences (to document the “effect of services”) and the ambition of municipal budget control. It is in line with what is typically called value‐based care (Bonde et al. 2018). Unfortunately, the required measurements create a lot of data work. So, along with requesting more data, our informants also articulate a desire for quicker or easier tools for data collection. Lea, also working in the municipality in a management function, for example, said:
We have to limit documentation. There must be the necessary documentation, and we have to make sure that the citizen gets the right care, but we must not in any way drown in it, because there also needs to be time to perform that care. (…) Sometimes those two things are not compatible. (Lea, municipality management)
In the context of such conflicting agendas, the PSFS seemed to constitute a reasonable compromise, Lea explained as she continued:
[W]e had tried all sorts of [measurement instruments] first and it was hard to make them work. They took too much time and were troublesome […]. Then we found the PSFS, one of my colleagues who knows everything about rehabilitation, he knew it from before and it just appealed to us pretty intuitively because it's short, it asks the citizen “What's most important to you?” So this thing about citizen‐involvement, it does that pretty directly and […] it has this element of scoring. (Lea, municipality management)
Lea and her colleagues in this way came across the PSFS somewhat by coincidence and saw in it a tool that could ensure citizen‐involvement and measure effect without consuming too much of the professionals’ time for documentation. It had “this element of scoring”: It translated experience into numbers. One more challenge then needed to be resolved: scientific validity.
Despite validation studies in relation to orthopaedic rehabilitation (Stratford 1995), PSFS had not been validated specifically for older people at the time. The administrative management of the municipality therefore decided to initiate a two‐tier validation study of the PSFS consisting of a qualitative and a quantitative part. To do this, they enrolled our colleagues Christina Schnohr and Jakob Bjørner. The qualitative study explored experiences with the PSFS among professionals and older people through interviews and was used to develop a Danish PSFS manual for professionals working with reablement. They found that while PSFS works well in many instances, it is not valid for cognitively impaired persons with limited awareness of their situation and reduced ability to set up goals (Schnohr and Bjørner 2015). A recommendation discouraging the use of the PSFS on cognitively impaired persons was therefore included in the report, although this was regrettable for some in the municipality. Christina Schnohr explained:
It was important for me to emphasise the finding that PSFS cannot be used on someone cognitively impaired or confused for other reasons (e.g., medication or simply their current state). The municipality did not seem to appreciate this precaution, but I did not want to be a part of the evaluation without including the points about when not to complete the test. The measure is usable only under certain circumstances, and they had to be underlined. (Christina, researcher)
Approximately 25% of Danish home care recipients are estimated to suffer from dementia (Local Government Denmark 2009), and a conflict therefore emerges here between the political agenda of a universally applicable effect measurement tool, on the one hand, and a scientific agenda of producing valid data on the other.
The quantitative validation of the PSFS (Bjørner and Schnohr 2017), which was published two years after the implementation of the tool in reablement, concludes (to much relief for the administration in the municipality) that the PSFS “is suitable for assessing a group of citizens’ change [in functional capacity] over time” (Bjørner and Schnohr 2017: 18). PSFS was validated by comparing results with two other scales, Patient Reported Outcome Measurement Information System and Sit‐to‐Stand Test. The study's Kappa coefficient for the correlation between the PSFS and these two scales, both validated for measuring functional capacity, was 0.2, which is just at the cut‐off point between none and weak correlation. Christina explained that the positive evaluation should be read in light of the fact that the PSFS had already been implemented: At this stage, the point was to help practitioners produce data as valid as possible, and the PSFS manual was written with this purpose in mind. Christina also remarked that as researchers they would have chosen a better tool, but if they want to know what happens in practice, they need to use data that can be collected in the course of that practice. The data analysts in the municipality also acknowledged the limitations of PSFS. Peter remarked that it was “not the world's best effect measurement tool,” but “that's just the reality in municipal practice.”
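To give non-specialist readers a sense of how small a value of 0.2 is, the following sketch encodes one commonly used convention for reading agreement coefficients such as kappa, the Landis and Koch (1977) bands. Which convention the validation report itself applied is our assumption; the function and its labels are purely illustrative.

```python
# Illustrative only: Landis & Koch (1977) style bands for reading an
# agreement coefficient such as kappa. The validation report's value of
# 0.2 falls exactly on the lower boundary of the bands, i.e., at the
# edge between essentially no agreement and weak ("fair") agreement.

def interpret_kappa(kappa):
    """Map a kappa value to a Landis-Koch style label (illustrative)."""
    bands = [
        (0.00, "poor"),           # at or below zero: no better than chance
        (0.20, "slight"),
        (0.40, "fair"),
        (0.60, "moderate"),
        (0.80, "substantial"),
        (1.00, "almost perfect"),
    ]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "almost perfect"

print(interpret_kappa(0.2))   # "slight" -- the report's estimate
print(interpret_kappa(0.21))  # "fair"
```

On this reading, a coefficient of 0.2 sits at the very bottom of the scale, which underlines why the "positive" conclusion of the validation required contextual explanation.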
Overall, this meant that the PSFS was chosen and implemented more or less by coincidence, but it became the municipality's standard tool because it constituted a compromise between several important, though partly conflicting, agendas. What then happened in practice?
Using the Measurement Instrument in Practice
In practice, PSFS encountered serious challenges. The political ambition—to pick a scientifically validated tool that empowers citizens by letting them choose their own goals and then measure the effect—did not deliver as promised. First, it is noteworthy that the scientific validation study was conducted on people receiving rehabilitation (people recently discharged from hospital due to an illness or injury), not reablement. This essentially means that the validation was conducted on patients rather than older citizens who experience gradual age‐related decline and are less likely to experience functional progress (Bødker 2018). 3
Second, the ambition of letting citizens choose their own goals ran into several obstacles. According to the reablement teams, many older people were unable to come up with any goals, or as Ellen said:
[Goal‐setting using the PSFS] is something that I find difficult because many citizens consider it very abstract. (…) often it's like putting words in their mouths, because you know that you, in your documentation, have to set a goal. (Ellen, Assessor)
Many older people were unprepared to discuss reablement because they had applied for home care, not reablement. They expected assistance with house chores, not training. Another obstacle to the empowerment agenda was that many older people mentioned goals that fell outside the catalog of services that the municipality could provide according to existing frameworks. They would suggest leisure or hobby activities, such as attending a craft class or music café—one woman even mentioned pole dancing. We observed how assessors typically overcame this challenge by trying to convert leisure‐related goals into goals more directly corresponding to specific home care services available in the service catalog (e.g., cleaning or grocery shopping).
It was not only assessors and regulatory constraints influencing the older person's goals. As we move close to the concrete application of PSFS, we see how also relatives influence the goal setting. We have chosen an assessment meeting with 94‐year‐old Walter as an example (see also Bødker 2018). In this case, an assessor, a therapist, a home helper, and two relatives were gathered around Walter's dining table when the assessor took out the PSFS form and asked Walter to mention five activities that he was unable to perform but would like to become able to perform again. Walter mentioned going for longer walks and doing his grocery shopping.
Assessor: “What about meals?”
Walter: “Yes.”
Assessor: “So would you like to become able to prepare a meal or just heat a meal?”
Walter: “Yes.”
The home helper asks in the direction of the assessor: “Do you also wish to become able to eat real [i.e., not pureed] food?” The assessor says “yes” and gives Walter an inquiring look.
Walter: “Yes”
Assessor: “One more thing? Diaper?”
As there is no response, the home helper looks at Walter and says:
“It's not very nice getting wet and having your private parts washed is it?”
Walter: “No.”
Walter's daughter then interrupts:
“Then say it dad!”
In this way, the goals come about through input from multiple parties and in reflection of differently orchestrated agendas of care: The older citizen is not a free‐floating autonomous agent empowered through the PSFS. In fact, it is likely that some citizens feel humiliated rather than empowered when confronted with this type of questioning.
When the measurement begins, we then see a third diversion from the ideal use of the PSFS: The measurement can be manipulated and, in some instances, come across as arbitrary. When the assessor asks Walter to put a number from 0 to 10 on each goal indicating how well he is currently able to perform the activity, Walter says “10” for “washing himself.” The assessor looks puzzled and says that it does not make sense because he has just said that he is unable to wash himself.
Walter: “But I used to.”
The assessor explains that the score is about his current abilities to which Walter, who is currently receiving home care assistance with personal hygiene, replies:
“But I can't do it at all now. Someone is doing it for me.”
Assessor: “Sure, but you have to imagine if you were to do it yourself, how well would you then be able to do it?”
Walter looks puzzled and goes silent.
Assessor: “Can we say 7?”
Walter: “Yes.”
Data from the PSFS were regarded by many professionals as relatively arbitrary. Two assessors said: “It's a bit like the wind blows”—a Danish expression describing a random process that can go in either direction. Some professionals were confused about the scale themselves: In three cases, professionals accidentally reversed the direction of the scale when explaining it to Bødker (2018).
Nonetheless, most of the assessors were committed to setting up goals using the PSFS. They knew the municipality management valued the data and that an indicator of success had been put in place: At least 80% of older citizens requesting home care should instead be referred to reablement. This made some assessors go to great lengths to set up goals (e.g., by putting words in the older people's mouths or simply by typing fabricated goals into the documentation system on returning to their office). It also made professionals tinker with the referral data. Line, a reablement coordinator, for example, had “a theory” that the 80% rate was reached “creatively”:
After all, it's also … it's very easy to cheat with it because the only thing they measure is if a [reablement] service is put on or not, so actually they don't even see whether the [reablement] programme has begun. (Line, reablement coordinator)
When professionals are held accountable for reaching indicators they do not agree with, the risk of such “data gaming” is known to increase (Wallenburg and Bal 2019). In practice, such factors can seriously compromise data validity, and validation studies have few means of capturing them.
In short, the PSFS has in practice come to be used for a different group than the one for which it was validated; it does not empower older citizens in the intended way; and it produces a set of relatively arbitrary data. As a consequence, PSFS data come to form a data world detached from actual reablement practices.
The Political Effects of the Measurement Instrument and the Generated Data
If the political goal of using the PSFS rather than other scales to document the effect of reablement was to empower older citizens, the observable political effects differ in significant ways. As we shall now see, the PSFS seems to have become an actor in a political process shifting the emphasis from citizens’ goals to those set by the municipality management, in particular those relating to budget control and economic savings. As we were writing this article, an independent national research report documented that the proportion of Danish older citizens receiving home care has decreased dramatically over the last 10 years, leaving an estimated 44% of frail older people in need of help without any formal or informal home care (Rostgaard and Matthiessen 2019). The municipality nevertheless officially still understands its work as data driven and regards reablement as a success. What has happened?
When reablement was implemented in the municipality, the management board established three indicators: two activity indicators (that 80% of new home care applicants, and 15% of existing home care recipients applying for additional services, be referred to reablement) and one effect indicator (that 50% of all those referred to reablement have a positive outcome). Effect was intended to be measured in two ways: one using PSFS data (with a positive outcome defined as the older person improving their PSFS score[s] by two points or more) and one using data on allocated time for home care services for each person undergoing reablement (with a positive outcome defined as the person being granted a reduced amount of time for home care). The latter effect measure, reduced assistance, was ambivalent for the professionals, for whom a positive outcome in the case of marginalized and fragile citizens had traditionally been to convince them of their need for home care, thereby increasing their use of home care. Nevertheless, management had insisted on also measuring an economic effect and not just the fulfilment of care needs.
Unfortunately, the implementation of a new IT system in 2017 meant that PSFS data were locked in an old database from which aggregate data could not be drawn. PSFS data are still produced, but they are not used at the aggregate level. For more than two years, the management board has therefore relied only on data on referral rates and expenditures on home care, not on improvements in functionality, when declaring reablement a success. In effect, a system has emerged in which the municipality measures the success of professionals’ work as a matter of delivering less care, regardless of older people's actual need for care.
Rather than serving the goals set by the citizens, data have come to serve those set by the system. They have become symbolic management tools. One interviewee was very concerned about the tendency for the municipal board to select only those data that they liked, but we also spoke to people at the upper management level who were less concerned about the symbolic uses of data and the fact that they did not reflect practice very well. For example, when we mentioned how the professionals found the 80% referral rate unrealistic, Lea argued that the symbolic value of indicators was important even when indicators were unrealistic:
Well 80% is perhaps … is it the right level? [shrugs, indicating that it is not] But it does after all give a powerful signal that [reablement] is something that we think that all of our citizens can benefit from. Therefore, it is valuable, and putting a number on it makes it more concrete than just saying that “It's very important for us that all citizens get it.” That's one of the things that numbers do to us. (Lea, municipality management)
From a management perspective, indicators can create a form of governance through symbols, regardless of their adequacy for the everyday work practices of front‐line workers (Hoeyer and Wadmann 2020).
There is a growing awareness in the municipality of this shift from a representational perception of data (that data represent practice) to a symbolic approach to data (that data symbolize governance ambitions). This shift frustrates those employees who feel subjected to management decisions that are legitimized with what they believe are invalid data. It also frustrates the very committed data analysts who are asked to deliver the numbers. However, those interviewed did not want to be cited on this topic. Criticizing the way management uses data is currently perceived as dangerous in the municipality.
In short, the PSFS failed to ensure citizen‐centeredness and data‐driven decision making. Data did not provide the leverage to hold the municipality accountable; on the contrary, data fueled a system of symbolic governance and data gaming, and contributed to the creation of a system in which more than 40% of frail older citizens no longer receive the assistance they need.
Discussion
The analysis above outlined how a municipality in a welfare state reduced the levels of care offered to fragile older citizens and claimed to have data documenting this reduction as a success. Our analysis of the adoption and use of an international measurement instrument in a local context has illustrated the data politics of claiming a separation between a political goal and the neutral collection of data to document levels of its achievement. In this case, unmet care needs have been disguised and more or less obscure data work has proliferated. There is an abundance of data, but no documentation of effect and no means of ensuring that citizen needs are met. That measurements alone do not ensure adequate levels of care is not a new insight. More than 30 years ago, a call was made for nesting the use of numbers in more careful forms of analysis: “What the study of caregiving now needs is a new language, the analytic services of interpretation, transformation, and circumstantiality. Without them, measurement will take us farther away from the object of our concern than we ever intended” (Gubrium and Lynott 1987: 283).
The analysis above has sought to reestablish these links between quantitative data and the stated political ambition of ensuring adequate levels of care, precisely by staying attentive to interpretations, transformations, and circumstantiality. We believe this is particularly important in the current political climate, where accountability efforts increasingly rely on the accountability logic we have explored in this article: Politically define a goal and use quantitative data to document its achievement. Our analysis does not suggest that there is anything wrong with using data to document the achievement of goals. What it does suggest is that political vulnerabilities arise when goal achievement is construed as a technical process and data are ascribed the ability to hold organizations accountable. Political commitment, rather than data, defines the organizational path.
To contemplate the broader relevance of these insights, we wish to use this discussion to move beyond the specific study of the PSFS. While we were conducting our study, a case with many similarities unfolded in the same municipality in relation to cuts in disability services, accompanied by new standardized data tools documenting adequate levels of care (despite complaints from users). At the national level, the Danish health services have adopted a quality reform termed “National Goals,” which is used to govern health care providers by setting goals and defining standards for documenting their achievement (Ministry of Health 2018). One of the people working with National Goals explains the logic of separation: “If politicians say, ‘we want to focus on [this or that],’ then you convene a group of technicians who create an indicator for that” (Torben, civil servant). Here again, good governance is seen as a matter of separating goal setting from “technical” measurement. In our work with other parts of the health services, we have learned that National Goals are also subject to many complaints from health professionals who do not believe in the validity of the measurements and worry about being encouraged to perform tasks to fulfil the indicators rather than to meet patient needs (Hoeyer and Wadmann 2020). In the introduction, we outlined how the same logic permeates global health efforts. Social scientists and policy makers alike might therefore benefit from rethinking, in more general terms, what they expect data to do in political organizations.
What would more effective forms of accountability look like? We still believe that data can be helpful in documenting challenges as well as progress, but we posit that data accumulation cannot overrule political will. It may, however, disguise it (Taylor‐Alexander 2016). The separation of goal setting and measurement involves serious perils in environments facing fierce economic constraints. Rather than holding professionals responsible for achieving a given indicator, the political system needs to protect the professional ethos of care and retain room for professional judgment. Judgment involves dangers of its own, not least relating to prejudice and bias. In some instances, data can be used to help identify biases in the professional use of judgment. Such uses of data serve the purpose of learning rather than control. The inadequacies of data in relation to accountability point to the continued relevance of more traditional forms of political mobilization toward accountability, including the dissemination of stories about the need for good care. As Adams (2016b) argues, narratives continue to matter. In this way, data and narratives may go hand in hand.
Data facilitate particular political narratives, but they do not speak for themselves. Data can be surprisingly weak. They depend on infrastructures (Bowker and Star 2000), and they inevitably interact with the multiplicity of political agendas characteristic of any governmental setting. The promise of accountability, therefore, cannot be fulfilled through data‐driven provision of eldercare without the political will to ensure sensitive and adequate care. Though data are politically weak in and of themselves, statements such as “data show that reablement works” seem to travel well. In this case, at least, such statements travel further than the data themselves. These words acquire more political weight than the words of individual older citizens like Walter because the narrative about data legitimizes claims about the population. Narratives about data become seductive (Merry 2016), even when the data as such are not being used to establish any particular insight. Data facilitate such narratives, and they create political relations of subjection when people are asked to datafy their experiences (Asad 1994). With this narrative about the PSFS, we hope to inspire others likewise to disentangle narratives about measurement instruments from claims about disadvantaged populations and in this way explore the political effects of other datafication processes.
Funding
This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement number 682110) and from the Danish Ministry of Research and Education (reference number 6161‐00029B).
Supporting information
Supplementary Material
Acknowledgment
We would like to thank our informants for their generous contribution to this work and our colleagues at Section for Health Services Research, especially Ezio di Nucci, for comments on an earlier version of this article.
Notes
Obviously, there are many other drivers of these processes of datafication, including the expansion of pharmaceutical markets and opportunities for surveillance.
This type of research is not eligible for ethics committee assessment in Denmark, but it was set up in accordance with the rules of the General Data Protection Regulation. It was guided by an imperative to help articulate what mattered to those affected by the politics involved without causing them any harm.
The scale was later validated for older citizens (Mathis et al. 2019), but this validation did not form part of the process of adoption in the municipality.
References Cited
- Adams, V. 2016a. Introduction. In Metrics: What Counts in Global Health, reprint edition, edited by Adams V., 1–19. Durham: Duke University Press.
- Adams, V. 2016b. Metrics of the Global Sovereign: Numbers and Stories in Global Health. In Metrics: What Counts in Global Health, edited by Adams V., 19–54. Durham: Duke University Press.
- Adams, V., and Biehl J. 2016. The Work of Evidence in Critical Global Health. Medicine Anthropology Theory 3: 123–26.
- Asad, T. 1994. Representation, Statistics and Modern Power. Social Research 61: 55–88.
- Bessette, J. M. 2001. Accountability: Political. In International Encyclopedia of the Social & Behavioral Sciences, edited by Smelser N. J. and Baltes P. B., 38–41. New York: Elsevier.
- Best, J. 2012. Bureaucratic Ambiguity. Economy and Society 41: 84–106.
- Bigo, D., Isin E., and Ruppert E. 2019. Data Politics. In Data Politics: Worlds, Subjects, Rights, edited by Bigo D., Isin E., and Ruppert E., 1–17. London: Routledge, Taylor & Francis Group.
- Biruk, C. 2017. Ethical Gifts? An Analysis of Soap‐for‐data Transactions in Malawian Survey Research Worlds. Medical Anthropology Quarterly 31: 365–84.
- Biruk, C. 2018. Cooking Data: Culture and Politics in an African Research World. Durham: Duke University Press.
- Bjørner, J. B., and Schnohr C. W. 2017. Vurdering af måleegenskaber for Patient Specifik Funktionel Skala, PSFS [Assessment of Measurement Properties for the Patient Specific Functional Scale, PSFS]. Copenhagen: Afdeling for Social Medicin, Institut for Folkesundhedsvidenskab, Copenhagen University.
- Boellstorff, T., and Maurer B. 2015. Introduction. In Data, Now Bigger and Better!, edited by Engelke M., 1–7. Chicago: Prickly Paradigm.
- Bonde, M., Bossen C., and Danholt P. 2018. Translating Value‐based Health Care: An Experiment into Healthcare Governance and Dialogical Accountability. Sociology of Health & Illness 40: 1113–16.
- Bowker, G. C., and Star S. L. 2000. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
- Bowling, A. 2004. Measuring Health. Berkshire, UK: McGraw‐Hill Education.
- Bødker, M. N. 2018. Potentiality Made Workable – Exploring Logics of Care in Reablement for Older People. Ageing & Society 39: 1–24.
- Carruth, L. 2018. The Data Hustle: How Beneficiaries Benefit from Continual Data Collection and Humanitarian Aid Research in the Somali Region of Ethiopia. Medical Anthropology Quarterly 32: 340–64.
- Cool, A. 2019. Impossible, Unknowable, Accountable: Dramas and Dilemmas of Data Law. Social Studies of Science 49: 762–86.
- Czarniawska‐Joerges, B. 2007. Shadowing: And Other Techniques for Doing Fieldwork in Modern Societies. Copenhagen: Copenhagen Business School Press.
- Dahler‐Larsen, P. 2011. The Evaluation Society. Stanford, CA: Stanford University Press.
- Daston, L., and Galison P. 2010. Epistemologies of the Eye. In Objectivity, edited by Daston L. and Galison P., 17–53. New York: Zone Books.
- Deloitte. 2014. Foranalyse: Muligheder for bedre brug af sundhedsdata. Transformationsplan [Pre‐analysis: Possibilities for Better Use of Health Data. Transformation Plan]. Copenhagen: Deloitte.
- Erikson, S. L. 2018. Cell Phones ≠ Self and Other Problems with Big Data Detection and Containment during Epidemics. Medical Anthropology Quarterly 32: 315–39.
- Greene, J. A. 2007. Prescribing by Numbers: Drugs and the Definition of Disease. Baltimore: Johns Hopkins University Press.
- Gubrium, J. F., and Lynott R. J. 1987. Measurement and the Interpretation of Burden in the Alzheimer's Disease Experience. Journal of Aging Studies 1: 265–85.
- Halfon, S. 2006. The Disunity of Consensus: International Population Policy Coordination as Socio‐Technical Practice. Social Studies of Science 36: 783–807.
- Halpern, O. 2014. Beautiful Data: A History of Vision and Reason since 1945. Durham: Duke University Press.
- Hansen, A. M. 2016. Rehabilitative Bodywork: Cleaning up the Dirty Work of Homecare. Sociology of Health and Illness 38: 1092–105.
- Hoeyer, K. 2016. Denmark at a Crossroad? Intensified Data Sourcing in a Research Radical Country. In The Ethics of Biomedical Big Data, edited by Mittelstadt B. D. and Floridi L., 73–93. Dordrecht: Springer.
- Hoeyer, K. 2019. Data as Promise: Reconfiguring Danish Public Health through Personalized Medicine. Social Studies of Science 49: 531–55.
- Hoeyer, K., Bauer S., and Pickersgill M. 2019. Datafication and Accountability in Public Health: Introduction to a Special Issue. Social Studies of Science 49: 459–75.
- Hoeyer, K., and Wadmann S. 2020. “Meaningless Work”: How the Datafication of Health Reconfigures Knowledge about Work and Erodes Professional Judgement. Economy and Society 49. doi:10.1080/03085147.2020.1733842.
- Hogle, L. F. 2019. Accounting for Accountable Care: Value‐based Population Health Management. Social Studies of Science 49: 556–82.
- Home Care Commission, The. 2013. Fremtidens hjemmehjælp – ældres ressourcer i centrum for en sammenhængende indsats [Home Care for the Future – Older People's Resources at the Centre of a Coherent Effort]. Copenhagen: Social‐ og Integrationsministeriet.
- Høybye‐Mortensen, M. 2015. Decision‐making Tools and Their Influence on Caseworkers’ Room for Discretion. The British Journal of Social Work 45: 600–15.
- Hunt, L. M., Bell H. S., Baker A. M., and Howard H. A. 2017. Electronic Health Records and the Disappearing Patient. Medical Anthropology Quarterly 31: 403–21.
- Hutchinson, E., Nayiga S., Nabirye C., Taaka L., and Staedke S. 2018. Data Value and Care Value in the Practice of Health Systems: A Case Study in Uganda. Social Science & Medicine 211: 123–30.
- Jackson, A., ed. 1987. Anthropology at Home. London: Tavistock Publications.
- Kitchin, R. 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures & Their Consequences. London: Sage.
- Kjellberg, P. K. 2010. Hverdagsrehabilitering i Fredericia Kommune [Reablement in Fredericia Municipality]. Fredericia, Denmark: Fredericia Kommune.
- Kvale, S., and Brinkmann S. 2009. InterViews: Learning the Craft of Qualitative Research Interviewing. Thousand Oaks, CA: Sage.
- Levin, N. 2019. Big Data and Biomedicine. In The Palgrave Handbook of Biology and Society, edited by Meloni M., Cromby J., Fitzgerald D., and Lloyd S., 663–81. London: Palgrave Macmillan.
- Local Government Denmark. 2009. Kvalitet i Kommunerne—Resultater fra Kvalitetsprojektet [Quality in the Municipalities—Results from the Quality Project]. Copenhagen: Local Government Denmark.
- Local Government Denmark. 2018. Projektbeskrivelse for Synlige kommunale sundheds‐ og ældredata [Project Description for Visible Municipal Data on Health and Older People]. Copenhagen: Local Government Denmark.
- Madden, R. 2010. Being Ethnographic: A Guide to the Theory and Practice of Ethnography. Thousand Oaks, CA: Sage.
- Marcus, G. E. 1995. Ethnography in/of the World System: The Emergence of Multi‐sited Ethnography. Annual Review of Anthropology 24: 95–117.
- Martin, A., and Lynch M. 2009. Counting Things and People: The Practices and Politics of Counting. Social Problems 56: 243–66.
- Mathis, R. A., Taylor J. D., Odom B. H., and Lairamore C. 2019. Reliability and Validity of the Patient‐specific Functional Scale in Community‐dwelling Older Adults. Journal of Geriatric Physical Therapy 42: E67–E72.
- Mayer‐Schönberger, V., and Cukier K. 2013. Big Data: A Revolution that Will Transform How We Live, Work, and Think. London: Houghton Mifflin Harcourt.
- Merry, S. E. 2016. The Seductions of Quantification: Measuring Human Rights, Gender Violence, and Sex Trafficking. Chicago Series in Law and Society. Chicago: The University of Chicago Press.
- Ministry of Health. 2015. Vision: Sundhedsdataprogrammet [Vision: The Health Data Programme]. Copenhagen: Ministry of Health. https://www.sum.dk/Sundhedsprofessionelle/~/media/Filer%20-%20dokumenter/Sundhedsdataprogrammet/Bilag%203%20-%20Vision.ashx (accessed July 1, 2020).
- Ministry of Health, Local Government Denmark, and Danish Regions. 2018. Nationale mål for sundhedsvæsenet [National Goals for the Healthcare Sector]. Copenhagen: Ministry of Health.
- Moore, P. V. 2017. The Quantified Self in Precarity: Work, Technology and What Counts. New York: Routledge.
- OECD. 2013. OECD Reviews of Health Care Quality: Denmark. Paris: OECD.
- Olsen, L., Aisner D., and McGinnis J. M., eds. 2007. The Learning Healthcare System: Workshop Summary. Washington, DC: National Academies Press.
- Oxlund, B., and Whyte S. R. 2014. Measuring and Managing Bodies in the Later Life Course. Journal of Population Ageing 7: 217–30.
- Pedersen, J. S. 2019. Data‐driven Management in Practice in the Digital Welfare State. In Big Data: Promise, Application and Pitfalls, edited by Pedersen J. S. and Wilkinson A., 220–23. Northampton, MA: Edward Elgar Publishing.
- Petersen, A. R. 2019. Digital Health and Technological Promise: A Sociological Inquiry. London: Routledge.
- Porter, T. M. 1996. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.
- Power, M. 1997. The Audit Society: Rituals of Verification. Oxford: Oxford University Press.
- Rostgaard, T., and Matthiessen M. U. 2019. Hjælp til svage ældre [Help for Frail Older People]. Copenhagen: VIVE.
- Ruckenstein, M., and Schüll N. D. 2017. The Datafication of Health. Annual Review of Anthropology 46: 261–78.
- Schnohr, C. W., and Bjørner J. B. 2015. Patient Specifik Funktionel Skala, PSFS: Notat vedr. kvalitativt valideringsarbejde og manual til brugen af PSFS i Københavns Kommune [Patient Specific Functional Scale, PSFS: Note on Qualitative Validation Work and Manual for the Use of the PSFS in the City of Copenhagen]. Copenhagen: Københavns Kommune.
- Schwennesen, N. 2017. When Self‐tracking Enters Physical Rehabilitation: From “Pushed” Self‐tracking to Ongoing Affective Encounters in Arrangements of Care. Digital Health 3: 1–8.
- Shaw, I., Bell M., Sinclair I., Sloper P., Mitchell W., Dyson P., Clayden J., and Rafferty J. 2009. An Exemplary Scheme? An Evaluation of the Integrated Children's System. The British Journal of Social Work 39: 613–26.
- Stratford, J. P. 1995. Assessing Disability and Change on Individual Patients: A Report of a Patient Specific Measure. Physiotherapy Canada 47: 258–63.
- Sullivan, N. 2017. Multiple Accountabilities: Development Cooperation, Transparency, and the Politics of Unknowing in Tanzania's Health Sector. Critical Public Health 27: 193–204.
- Taylor‐Alexander, S. 2016. Ethics in Numbers: Auditing Cleft Treatment in Mexico and Beyond. Medical Anthropology Quarterly 31: 385–402.
- Triantafillou, P. 2014. Against All Odds? Understanding the Emergence of Accreditation of the Danish Hospitals. Social Science & Medicine 101: 78–85.
- United Nations (UN). 2014. The United Nations Secretary‐General's Independent Expert Advisory Group on a Data Revolution for Sustainable Development. New York: United Nations.
- Wadmann, S., Holm‐Petersen C., and Levay C. 2018. “We Don't Like the Rules and Still We Keep Seeking New Ones”: The Vicious Circle of Quality Control in Professional Organizations. Journal of Professions and Organization 6: 17–32.
- Wallenburg, I., and Bal R. 2019. The Gaming Healthcare Practitioner: How Practices of Datafication and Gamification Reconfigure Care. Health Informatics Journal 25: 549–57.
- Wiener, C. L. 2000. The Elusive Quest: Accountability in Hospitals. New York: Aldine de Gruyter.