Abstract
Ageism is the most invisible form of discrimination. While there is some awareness of gender, racial, and socioeconomic discrimination on digital platforms, ageism has received less attention. This article analyzes, from an old-age perspective, some tools that are frequently embedded on digital platforms, in order to increase awareness of the different ways in which ageism works. We will firstly look at how innovation teams, following homophilic patterns, disregard older people. Secondly, we will show how ageism tends to be amplified by the methods often used on digital platforms. And thirdly, we will show how corporate values, hidden behind obscure algorithms, lie behind usability issues that mainly affect people with a low level of (digital) skills, which is more common among older people. Counterbalancing the abusive power of the corporations behind digital platforms and compensating for the underrepresentation of groups in less favorable situations could help to tackle such discrimination.
Keywords: Ageism, big data, digital divide, digital minorities, digital platforms, discrimination, older people, usability
Introduction
Discrimination is about unfairness. Stereotypes, which usually work unconsciously, contribute to such discrimination by ignoring the real habits, interests, and values of diverse individuals (Ayalon and Tesch-Römer, 2018). Most of today’s digital services are provided by corporate digital platforms (van Dijck et al., 2018). There is evidence of gender, race, and religious discrimination on digital platforms (Buolamwini and Gebru, 2018; Hajian and Domingo-Ferrer, 2013; Kamiran et al., 2012; Neff and Nagy, 2016; Pedreschi et al., 2009), as well as other types of discrimination. There have also been discussions on how the automation of public policies fosters inequality (Eubanks, 2018; González-Bailón et al., 2017; O’Neil, 2016) and how digital platforms create new strands of discrimination (O’Neil, 2016; Wachter-Boettcher and Emmes, 2018) that move away from the ideal of a fair networked society (Kretchmer, 2017) because of the unintended social consequences of algorithms (González-Bailón et al., 2017).
Like gender, ethnicity, or class, age is an aspect of social structure that ‘involves differential (and sometimes discriminatory) treatment’ (Brah and Phoenix, 2004: 81). However, ageism is often ignored in the analysis of discrimination on digital platforms (e.g. Buolamwini and Gebru, 2018; Hajian and Domingo-Ferrer, 2013; Kamiran et al., 2012; Neff and Nagy, 2016; Pedreschi et al., 2009). Ageism is a particular form of discrimination, in which individuals are judged according to age-based stereotypes, or views on what people should be doing, experiencing, or feeling depending on their age. These complex considerations can be positive or negative, subtle or explicit (Ayalon and Tesch-Römer, 2018), and may be directed at people of any age (Bodner et al., 2012). With regard to older people, ageism is based on a view that focuses on disabilities and implies ‘inferiorization’ and ‘patronage’ (Neves and Amaro, 2012: 3), uses practices that deprioritize, disregard, or even exclude older people (AGE Platform Europe, 2016), and influences digital usage habits (e.g. Comunello et al., 2017; Lagacé et al., 2015). The limited awareness of ageism in society has been widely discussed (Ayalon and Tesch-Römer, 2018; Palmore, 1999; World Health Organization, 2017). Ageism is more pervasive and more invisible than sexism or racism (World Health Organization, 2017). This is partly because ageism is based on unnoticed ageist stereotypes, and partly because industrial society has ousted older people from the positions of power they used to hold (Palmore, 1999).
Ageism shapes both the image(s) that individuals have of themselves and the image(s) that society has of the different life stages. At a societal level, ageism refers to ‘the way in which society and its institutions sustain ageist attitudes, actions or language in laws, policies, practices or culture’ (AGE Platform Europe, 2016). We will thus analyze the influence of ageism on the design of corporate digital platforms. The analysis takes into account three factors that foster discrimination on digital platforms:
First, despite the fact that the digital divide is blurring in terms of access and use, the second digital divide, or the divide in skills, purposes of use, and motivation, is persistent or widening (Brandtzæg et al., 2011; Ragnedda and Ruiu, 2017; van Dijk, 2006). This means that people have access to and make use of digital technologies, but with less interest, for a narrower range of purposes, and with more difficulty. The second digital divide affects older people, among other excluded collectives (Brandtzæg et al., 2011; Lagacé et al., 2015), which reinforces the idea that older people are not interested in digital technologies (Durick et al., 2013), disempowering them as a group in digital media and perpetuating the exclusionary stigmatization of older people (Stangor and Schaller, 2000).
Second, the prevailing negative aging-related ideas (Garattini and Prendergast, 2015) affect research design (Ayalon and Tesch-Römer, 2018). Young people are the points of reference for Information and Communication Technologies (ICT) studies, as they help identify usage trends (Castells et al., 2006; Ito et al., 2010). Most studies on digital practices do not include older people, do not ensure that their samples include older people, or use inaccurate open-ended categories (45+, 55+, or 65+) that lump together people at different life stages (Rosales and Fernández-Ardèvol, 2016; Sawchuk and Crow, 2011). Thus, research projects that consider the older population are comparatively outnumbered by research on teenagers or the adult population, and they tend to focus on the younger old, because the older old are (considered) comparatively less accessible or not of interest to stakeholders outside the biomedical sphere (Fernández-Ardèvol, 2019). Consequently, we can safely assume that older Internet users are underrepresented in the data sets that inform algorithms on digital platforms.
Third, most approaches wrongly see later life as a homogeneous life stage, often identified by an inaccurate ‘grey area’ label (Sawchuk and Crow, 2011). With regard to the use of digital platforms, there are diverse levels of interest, uses, and skills among older people (Rosales and Fernández-Ardèvol, 2019), while stereotyped views of older people describe them in general as less interested in ICTs (Neves and Amaro, 2012), which is then used as justification to deprioritize older people in product design decisions.
This article describes and discusses four tools provided by digital platforms in which, as we understand it, ageist mechanisms are at work. We analyze the tools embedded on social network sites (SNS), security systems, and smartphone technologies, as they are widely used technologies which are of general interest. The identified mechanisms include (1) homophily (or self-centered ideas) among the innovation teams that design digital platforms, (2) some sampling methods used on digital platforms, and (3) the corporate ageist values hidden behind obscure algorithms (Pasquale, 2015), which result in the exclusion of older people.
Prejudices and stereotyped views of older people deprioritize, disregard, and exclude them from digital platforms. These views work by means of the corporate and personal values behind the design process and methods used. Thus, directly or indirectly, implicitly or explicitly, digital platforms reproduce this strand of discrimination by failing to take into account the diversity of the prevailing everyday life practices, interests, and usage conditions at the different stages of later life.
Ageist mechanisms on digital platforms
There have been a number of efforts to heighten awareness of how stereotypes and prejudices reinforce discrimination, and there is a corpus of laws and regulations that prohibit discrimination, for example, gender equality and antidiscrimination action (Justice and fundamental rights|European Commission, n.d.) and the Fair Housing and Equal Credit Opportunity Acts (Housing and Civil Enforcement Section|CRT|Department of Justice, n.d.). However, intelligent systems are reinforcing discrimination (Eubanks, 2018). Discriminatory biases work in digital systems by means of different mechanisms. In this article, we focus on three kinds of ageist mechanisms that are common in the design and implementation of digital platforms.
First, algorithms are influenced by the biases of their developers, mainly young men earning above-average salaries (Beyer, 2014; Cohoon and Aspray, 2004). Innovation teams on digital platforms therefore often follow homophilic patterns. The principle of homophily refers to the fact that ‘contact between similar people occurs at a higher rate than among dissimilar people’ (McPherson et al., 2001: 416). Design decisions are thus strongly influenced by common points of view, falling into homophilic or self-centered ideas: teams make design decisions shaped by their shared interests and practices while ignoring the practices of other groups. The homophilic ideas of programmers became evident when researchers demonstrated gender and race bias in face recognition systems, which were less accurate with Black women while performing better with White men (Buolamwini and Gebru, 2018). Beyond any technical limitations, the systems had not been tested or calibrated for Black women. The accuracy of such systems with older people, presumably, was not even questioned.
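To illustrate the kind of check that was apparently missing, the following minimal sketch (hypothetical, and not drawn from any of the studies cited) computes a system’s accuracy separately for each demographic group; disaggregating the evaluation in this way, including by age group, is what makes such gaps visible.

```python
# Illustrative sketch (not from the studies cited above): evaluating a
# classifier's accuracy separately for each demographic group, including age,
# makes the kind of gap reported by Buolamwini and Gebru (2018) visible.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group_label, true_label, predicted_label)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        hits[group] += int(truth == prediction)
    return {group: hits[group] / totals[group] for group in totals}

# Hypothetical evaluation records: (age group, ground truth, system output).
records = [
    ("20-39", "match", "match"), ("20-39", "no match", "no match"),
    ("40-64", "match", "match"), ("40-64", "match", "no match"),
    ("65+", "match", "no match"), ("65+", "match", "no match"),
]
print(accuracy_by_group(records))  # reveals the 65+ group performing worst
```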
Second, the research methods commonly used on digital platforms face limitations when considering the interests of diverse groups of users. Big data implies a nontraditional way of carrying out research (Kitchin, 2014; Manovich, 2012). Rather than starting from research questions and controlled experiments to collect data, in big data studies the data precedes the research question (Kitchin, 2014): data is a by-product of human–machine interaction that is repurposed for research (Kitchin, 2014). Big data is thus not exempt from methodological concerns. To make predictions, intelligent systems often build on previous data, or on a continuous flux of data, used as a learning data set. The stereotypes held by users are likely to be reflected in the content they contribute to the learning data set and, consequently, in the resulting intelligent system. Big data relies on thousands of data points from thousands of users to make decisions; however, being big does not mean that a sample is representative of a population (boyd and Crawford, 2012), or that it represents older people. Finally, big data approaches are mainly based on predictions and correlations (Bonchi et al., 2017), which are less effective for nonmainstream uses (Hajian et al., 2016). The system looks for the most common associations in the learning data set, meaning that any discrimination present in the learning data set is amplified by the predictions (Zhao et al., 2017). Microsoft’s chatbot, Tay, was released in 2016 and shut down one day later due to the racist and sexist responses it learned from its audience (Neff and Nagy, 2016). This failure illustrated the vulnerability of minority or disempowered groups in big data approaches when they are not properly considered in the overall tool design process. Moreover, the likely ageist responses were not examined in this case.
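A minimal sketch, with hypothetical figures, can illustrate this amplification effect: a predictor that always returns the most frequent association found in a skewed learning data set produces an output distribution that is even more skewed than the data it learned from.

```python
# Minimal sketch of bias amplification: a predictor that always returns the
# most frequent association seen in the learning data set makes the output
# distribution even more skewed than the (already skewed) training data.
from collections import Counter

training_data = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34  # hypothetical skew

def train_majority_predictor(pairs):
    counts = Counter(label for _, label in pairs)
    majority_label = counts.most_common(1)[0][0]
    return lambda activity: majority_label  # ignores everything but the majority

predict = train_majority_predictor(training_data)
predictions = [predict("cooking") for _ in range(100)]

print(Counter(label for _, label in training_data))  # 66% vs 34% in the data
print(Counter(predictions))                          # 100% vs 0% in the output
```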
Third, in contrast to their analog predecessors, the algorithms running intelligent systems are black boxes (Pasquale, 2015). Coders are not always able to accurately describe their computing details (Pasquale, 2015). Thus, a complex system of data analysis and decision-making often remains private and opaque (Pasquale, 2015), hiding the values involved in the algorithm design. Algorithms have therefore become the new power-control tool that dominates online social practices (Mager, 2012), from digital communications to entertainment, consumption, and life in general. Digital platforms are run by corporations, for whom corporate interests take precedence over the general interest. Their algorithms are thus aligned with their objectives (Cheney-Lippold, 2011), and keeping them hidden helps to ensure that the corporate ideology remains invisible (Cheney-Lippold, 2011). In this sense, there is evidence of discrimination in credit approval decisions that take into account race, income, and zip code, among other aspects (Pedreschi et al., 2009), and in intelligent systems used to support the decision-making of security agencies (Kamiran et al., 2012). However, none of these studies consider ageism to be a significant source of discrimination.
There is a need to understand the way in which these mechanisms of discrimination on digital platforms reinforce ageism, in order to raise awareness of how ageism works in the digital society, and to incorporate this knowledge into the design of algorithms to prevent discriminative biases.
Digital platforms from an old-age perspective
In this section, we analyze, from an old-age perspective, four digital platform tools which are of general interest. The examples presented are not intended to be exhaustive, but were selected for their impact on society.
Psychometric predictions on social network sites
There is a growing interest in psychometric prediction of digital platform users. This is particularly true on SNS, where people often express their interests by sharing, following, or supporting different content, and digital platforms wish to profile their audience.
Applymagicsauce.com (Magic Sauce) is an academic project developed to raise awareness of the implications of data sharing. It allows users to discover what their digital footprints might reveal about their psychological profile (Apply Magic Sauce – Demo, n.d.). The tool can make predictions about any Facebook user by comparing their Facebook activities with a learning data set. They claim that their system is transparent (Apply Magic Sauce – Transparency, n.d.). To back up this claim, they publish a list of the services using their system and explain their algorithm. A learning data set of over 58,000 volunteers fed the algorithm. Volunteers gave their consent to the use of their Facebook likes, detailed demographic profiles and the results of several psychometric tests (Kosinski et al., 2013). Predictions based on this collection of data contain personal attributes, including sexual orientation, ethnicity, religious and political views, personality traits, measurement of intelligence and happiness, and use of addictive substances, among others. The authors published the estimated probability of correct classification of each variable and claimed a higher level of accuracy than predictions made by humans (Youyou et al., 2015).
However, the authors did not provide information on the representativeness of the learning data set. There is no public information on its demographics, so it is not clear whether or not the learning data set reproduces the Facebook population, for example, with regard to age, country, or gender. It is therefore impossible to know, for any given user, whether a prediction is fair or has failed, either because the user is atypical or because there are no similar profiles in the learning data set. Voluntary sampling does not guarantee that the sample matches the characteristics of the population studied on SNS, particularly given that the digital divide in relation to motivations, skills, and purposes of use mainly affects older people (Ragnedda and Ruiu, 2017). For example, recruitment processes that make use of digital media for advertising, registration, and participation tend to exclude older people. The Magic Sauce sample might be biased toward individuals with a high number of interests on Facebook and more diverse uses of the platform, meaning that young users might be more likely to contribute.
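The following sketch, based on hypothetical figures rather than any data published by Magic Sauce, illustrates the kind of representativeness check that cannot currently be performed: comparing the age profile of a voluntary learning data set with that of the platform population.

```python
# Illustrative sketch: comparing the age profile of a voluntary learning data
# set with a reference population. The figures are hypothetical; Magic Sauce
# does not publish the demographics needed to run such a check.
volunteer_sample = {"18-29": 0.55, "30-49": 0.35, "50-64": 0.08, "65+": 0.02}
platform_population = {"18-29": 0.32, "30-49": 0.38, "50-64": 0.20, "65+": 0.10}

for age_group in platform_population:
    ratio = volunteer_sample[age_group] / platform_population[age_group]
    flag = "underrepresented" if ratio < 0.8 else "ok"
    print(f"{age_group}: sample/population ratio = {ratio:.2f} ({flag})")
# Older groups end up underrepresented, so predictions about them rest on
# comparatively few similar profiles in the learning data set.
```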
Moreover, the system uses correlations to make predictions. According to the authors, personal interest prediction is accurate in 72% of the cases studied (Apply Magic Sauce – Documentation, n.d.). Thus, more than one in four predictions might not be accurate, and this could have a more significant effect on individuals who use Facebook differently from most of the population on this SNS, which is the case of older people, who use digital media less often and for a narrower range of purposes than the majority of the population (Rosales and Fernández-Ardèvol, 2019). By ignoring the 28% of the population for whom the predictions are not accurate, the system deprioritizes both less motivated and less skilled users, two aspects that appear to create a bias against older people.
To predict the age of a user, other systems use the average age of the user’s network (e.g. Culotta et al., 2016; Perozzi and Skiena, 2015). This approach is based on available evidence of age homophily on SNS (Perozzi and Skiena, 2015), which supports the idea that contacts on SNS will be around the user’s age, as happens at school. However, available evidence also shows that intergenerational engagement is notably higher among older individuals, with younger relatives being an important part of their relationships (Marsden, cited in McPherson et al., 2001).
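A simplified sketch of this heuristic (not the implementation of the cited systems) shows why intergenerational ties distort the estimate for older users:

```python
# Simplified sketch of the network-based heuristic discussed above: predict a
# user's age as the mean age of their contacts. It is not the implementation
# of the cited systems, but it shows why intergenerational ties distort the
# estimate for older users.
from statistics import mean

def predict_age(contact_ages):
    return mean(contact_ages)

# A 25-year-old whose contacts are mostly same-aged peers:
print(predict_age([23, 24, 26, 27, 25]))      # 25.0, close to the real age

# A 75-year-old whose network includes children and grandchildren:
print(predict_age([73, 70, 48, 45, 20, 18]))  # ~45.7, far below the real age
```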
CAPTCHA security
CAPTCHA stands for Completely Automated Public Turing test to tell Computers and Humans Apart. CAPTCHAs are used to identify whether the subject trying to access a digital platform is a human or a bot. To do this, the system asks the user to perform a task that a bot would not be able to do. While studies have shown that hackers can compromise CAPTCHAs (Yan and El Ahmad, 2008), they are designed to be a key element of the security of digital platforms (Yan and El Ahmad, 2008).
CAPTCHAs have evolved over time, setting users different types of challenges (Gafni and Nagar, 2016; Kaur and Cook, 2019). Initially, they required the transcription of distorted texts or short audio clips. Later versions involved crowdsourcing in altruistic missions: the user had to transcribe two words instead of one, where the first met the original goal of challenging bots and the second contributed to the digitization of books. More recently, the identification of particular elements in an image was introduced. The latest CAPTCHAs use a secret algorithm and require most users simply to select a checkbox (Padave, 2014).
Many users find CAPTCHAs difficult. There are a number of usability issues, as CAPTCHAs are more difficult for people facing either the physical decline often associated with old age (Prusty, n.d.) or learning disabilities (Kaur and Cook, 2019). There is evidence that they also discriminate against individuals in peripheral cultures, as the texts, audio clips, and images tend to reflect prevailing Western cultural content (Gafni and Nagar, 2016). In particular, image-based CAPTCHAs require the identification of images containing cultural elements that may differ from one context to another, meaning that people from peripheral cultures will find it more challenging to perform the identification task correctly. Thus, for the sake of security and/or altruism, CAPTCHAs fall into ageism: corporate decision-making deprioritizes the limitations that older people face on digital platforms and ignores the ways in which their comparatively limited skills reduce their chances of completing the CAPTCHA challenges.
Biometrics in security systems
Biometric technologies are meant to provide an alternative to traditional username-and-password control systems that is more accurate and faster (Wagner and Fernández-Ardèvol, 2016). Biometric systems include iris, retina, and fingerprint scanning. While iris and retina scanning should be more accurate than fingerprint scanning (Parei and Hamidi, 2018), fingerprint systems have become more widespread in recent years (Thakkar, n.d.; Trader, n.d.). They are widely used as a security feature in new smartphones (InAuth, 2017), but also in border controls (Fingerprint Cards AB, n.d.). They could also be implemented for ATM transactions (Bleiker, 2017) and contribute to health-care systems (Soares and Gaikwad, 2016). However, they are far from perfect (Hamidi, 2019).
In addition to all the other body changes that occur with aging, fingerprints may fade with age (Blosfield, 2018; Ng, 2016). Other factors that can contribute to the loss of fingerprints include medical treatments (Harmon, 2009), prolonged use of strong chemical products, and prolonged hard manual work (Sarbaz et al., 2016). Older individuals are therefore more likely to have been exposed to some of these risks, and their fingerprints are more likely to be unreadable by conventional systems, which have probably been built on and calibrated for an average adult population that does not face such physical issues. Distrust arises when fingerprints do not match the user’s records. Individuals unable to pass conventional identity checks are therefore forced to undergo extraordinary controls and, in extreme situations, could even be detained by government institutions (Harmon, 2009).
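A generic sketch of threshold-based verification (not any particular vendor’s algorithm, and with hypothetical scores) illustrates why worn or faded prints lead to more false rejections:

```python
# Generic sketch of threshold-based fingerprint verification (not a specific
# vendor's algorithm): the matcher returns a similarity score, and scores below
# a fixed threshold are rejected. Worn or faded prints produce lower scores on
# average, so genuine users with such prints face more false rejections.
MATCH_THRESHOLD = 0.80  # hypothetical operating point

def verify(similarity_score, threshold=MATCH_THRESHOLD):
    return "accepted" if similarity_score >= threshold else "rejected"

# Hypothetical genuine-user scores: clear prints vs prints worn by age,
# medical treatments, chemical exposure, or manual work.
clear_print_scores = [0.93, 0.90, 0.88, 0.95]
worn_print_scores = [0.82, 0.74, 0.69, 0.78]

print([verify(s) for s in clear_print_scores])  # all accepted
print([verify(s) for s in worn_print_scores])   # several false rejections
```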
As in the case of face recognition systems, fingerprint systems have been widely implemented in society with limited testing and calibration, following homophilic patterns (Harmon, 2009). A fingerprint check that is meant to simplify the user’s life by being more accessible and secure than passwords (Parei and Hamidi, 2018) thus creates new strands of discrimination. While digital platforms should provide tools for an inclusive networked society (Buolamwini and Gebru, 2018), fingerprint systems reinforce ageism by ignoring the reality of older people.
Passive metering tools for smartphones
Passive metering tools, or trackers, collect data from digital devices by tracking system logs (Kretchmer, 2017). They are used to make psychometric predictions (Rosales and Fernández-Ardèvol, 2019).
Most tools based on the passive metering of smartphones rely on big data approaches. The collection of system logs depends on the voluntary participation of individuals, without checking whether the sample is representative of the studied population (Böhmer et al., 2011; Ferreira et al., 2011; Jones et al., 2015). As with psychometric predictions on SNS, voluntary sampling methods do not favor the inclusion of older people.
Moreover, digital logs are the raw data of what are seen as nonintrusive methods of data collection (Böhmer et al., 2011; Ferreira et al., 2011; Jones et al., 2015). Researchers often argue that tracking systems provide real-life data with little or no inconvenience and no effort from participants (Kiukkonen et al., 2010; Xu et al., 2016). On the one hand, passive metering tools for smartphones are far from universal. At present, tracking systems can monitor only a limited number of smartphones, depending on their operating system (OS) version and model (Holz et al., 2015; Lee et al., 2014; Shin et al., 2012). The effort–benefit ratio leads to the exclusion of some models from tracking systems, particularly older or less popular models. On the other hand, tracking systems do have an impact on the battery, memory, and processor of the tracked device. In this vein, some studies have expressed concern about the battery drain of monitoring systems (Wagner et al., 2013). The capacity of the smartphone can therefore act as a technical limitation that prevents individuals from participating in tracking studies, as well as preventing the inclusion of older and low-end smartphones in such studies. This affects older people most, as they are the age group that tends to have the oldest devices, since they upgrade them less frequently (Rahmati et al., 2012; Srinivasan et al., 2014; Yan et al., 2012) and often use second-hand devices inherited from their relatives (Jacobson et al., 2017).
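The following sketch, with hypothetical field names and thresholds, illustrates the kind of eligibility filter a passive metering study might apply before enrolment, and how it silently screens out older and low-end handsets:

```python
# Sketch of the kind of eligibility filter a passive-metering study might
# apply before enrolment (field names and thresholds are hypothetical):
# requiring a recent OS version, a supported model, and enough battery headroom
# silently screens out older and low-end handsets.
MIN_OS_VERSION = 10                        # hypothetical minimum OS version
SUPPORTED_MODELS = {"model_a", "model_b"}  # tracker only tested on these models
MIN_BATTERY_MAH = 3000                     # to tolerate the tracker's battery drain

def eligible(device):
    return (device["os_version"] >= MIN_OS_VERSION
            and device["model"] in SUPPORTED_MODELS
            and device["battery_mah"] >= MIN_BATTERY_MAH)

devices = [
    {"owner": "younger user", "os_version": 13, "model": "model_a", "battery_mah": 4500},
    {"owner": "older user, hand-me-down phone", "os_version": 8, "model": "model_c", "battery_mah": 2600},
]
print([d["owner"] for d in devices if eligible(d)])  # only the younger user remains
```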
Tracking systems thus reinforce ageism by deprioritizing the habits of older people in the development of passive metering tools, particularly with regard to the use of basic and older models of mobile phones. They also use voluntary sampling methods that disregard older people’s comparatively limited purposes of use. All these factors limit the chances of older individuals participating in studies that make use of smartphone logs.
Discussion and conclusion
The aim of this research is to raise awareness of the different explicit and non-explicit ageist mechanisms that limit the participation of older people on digital platforms. Our analysis shows how ageism has already become part of digital platforms through the homophily that shapes corporate teams, the discriminatory methods inadvertently embedded in their design and development processes, and the (obscure) algorithms that increasingly run them. We have shown how these mechanisms deprioritize, disregard, or explicitly exclude older people from digital platforms, and how explicit and non-explicit ageism limits the chances of ‘creating digital and social equality for all in our digital, networked society’ (Fernández-Ardèvol and Ivan, 2013; Oreglia and Kaye, 2012).
Homophily refers to the tendency to associate with peers, for example, in terms of age, gender, class, and societal role (Kretchmer, 2017: 88). Marsden (cited in McPherson et al., 2001: 416) found evidence of age-based endogamy among younger age groups, a trend that has remained stable over time (Grossetti, 2007; Smith et al., 2014). Innovation teams on digital platforms are primarily made up of young, educated men with above-average salaries and a keen interest in technologies (Beyer, 2014; Cohoon and Aspray, 2004). Their shared socioeconomic and cultural references thus influence the processes that run the digital platforms they develop, and it is comparatively less likely that they will properly include the older population’s experiences when designing digital tools. Homophilic dynamics therefore lead teams to disregard older people in their design decisions.
The methods used to analyze data from digital platforms rely on big data approaches, which disregard groups that behave differently from the mainstream, something that is more common among older people. Big data approaches implicitly assume that, with large amounts of data, there is no need to evaluate representativeness (boyd and Crawford, 2012). They are mostly based on predictions and correlations (Bonchi et al., 2017) that are less accurate with minorities (Hajian et al., 2016) or any groups that behave differently from the mainstream. By using predictions and correlations, the system thus deprioritizes users outside the mainstream, among whom older people are more commonly found. In addition, voluntary samples recruited through digital media exclude users with fewer skills and a narrower range of uses of digital media, which is more common among older people.
Obscure algorithms (Pasquale, 2015) help to hide corporate priorities that contradict user interests. This article highlights usability and inclusivity issues that mainly affect nonexpert users, who are more common among the older population. Behind these usability issues lies not only the absence of a user experience approach, but also the failure to take unexpected users into account. Beyond the user experience, design decisions are made in support of corporate decisions. Social equality in the digital society thus often contradicts the aims of market products, revealing the asymmetrical power embedded in digital platforms that are part of the capitalist economy (Pasquale, 2015) and the way in which corporate decision-making reinforces ageism by deprioritizing people with limited digital skills or motivation, among whom older people are commonly found. The capitalist approach to digital platforms is a ‘source of invisibilities that support inequalities and ultimately injustices’ (Schäfer and Van Es, 2017). The analyzed mechanisms thus reinforce ageism on digital platforms, based on stereotyped views of older people that portray all of them as less avid users who are not interested in technologies. By doing so, corporations tend to deprioritize, disregard, or directly exclude older people through their design and development processes.
A general principle for tackling discrimination should consider giving users control over their personal data, that is, control over how the data provided to digital platforms can be used, which implies a movement toward more transparent algorithms. In addition, intelligent algorithms should compensate for the underrepresentation of collectives in less favorable conditions (Stocchetti, 2018: 23). It is therefore necessary to design algorithms that are capable of setting up bias-free training data sets (Pedreschi et al., 2009), to take into account the granularity of the data (Bolukbasi et al., 2016), to create more inclusive algorithms, and to use statistical models capable of incorporating the cultural and social digital media practices of broader population segments. In this sense, we join Dressel et al. (1997) in suggesting that research should adapt to the context of overlooked groups, which the older population has become, and avoid imposing the frameworks of the market and of economically and politically dominant groups.
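As a simple illustration of this principle (one of many possible fairness-aware techniques, and not a prescription), training examples can be weighted by the inverse of their group’s share of the data, so that underrepresented groups, such as older users, are not drowned out during model fitting:

```python
# One simple illustration of compensating for underrepresentation (there are
# many fairness-aware learning methods; this is only a sketch): weight each
# training example by the inverse of its group's frequency, so underrepresented
# groups, such as older users, are not drowned out during model fitting.
from collections import Counter

def inverse_frequency_weights(group_labels):
    counts = Counter(group_labels)
    total = len(group_labels)
    return [total / (len(counts) * counts[g]) for g in group_labels]

group_labels = ["18-29"] * 70 + ["30-64"] * 25 + ["65+"] * 5  # hypothetical sample
weights = inverse_frequency_weights(group_labels)
print(round(weights[0], 2), round(weights[70], 2), round(weights[95], 2))
# 18-29 examples get weight ~0.48, 30-64 ~1.33, 65+ ~6.67; many scikit-learn
# style estimators accept such values through a sample_weight argument.
```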
Author biographies
Andrea Rosales is a senior researcher at the Internet Interdisciplinary Institute (IN3), Universitat Oberta de Catalunya (UOC). Her current research agenda combines a critical analysis of the (non)explicit biases of intelligent systems with qualitative studies of how people perceive and use mobile technologies. She thus analyses the social impact of intelligent systems in order to help reshape the power relationships intrinsic to digital technologies.
Mireia Fernández-Ardèvol is a senior researcher and group leader at the Internet Interdisciplinary Institute (IN3), Universitat Oberta de Catalunya (UOC). Her research interests range from the social and economic aspects of mobile communication to the intersection between digital communication and ageing. She has published with MIT Press and Routledge, and in several high-impact journals.
Footnotes
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The Spanish Ministry of Science, Innovation, and Universities supported the first author with the personal grant IJCI-2017-32162.
ORCID iD: Andrea Rosales https://orcid.org/0000-0001-8506-5126
Contributor Information
Andrea Rosales, Universitat Oberta de Catalunya (UOC), Internet Interdisciplinary Institute (IN3), Spain.
Mireia Fernández-Ardèvol, Universitat Oberta de Catalunya (UOC), Internet Interdisciplinary Institute (IN3), Spain.
References
- AGE Platform Europe (2016) AGE Platform Europe Position on Structural Ageism. Brussels, Belgium.
- Apply Magic Sauce – Demo (n.d.) Available at: https://applymagicsauce.com/demo (accessed 12 September 2019).
- Apply Magic Sauce – Documentation (n.d.) Available at: https://applymagicsauce.com/documentation (accessed 28 November 2018).
- Apply Magic Sauce – Transparency (n.d.) Available at: https://applymagicsauce.com/transparency (accessed 21 December 2018).
- Ayalon L, Tesch-Römer C. (eds) (2018) Contemporary Perspectives on Ageism. Cham, Switzerland: Springer Open.
- Beyer S. (2014) Why are women underrepresented in Computer Science? Gender differences in stereotypes, self-efficacy, values, and interests and predictors of future CS course-taking and grades. Computer Science Education 24(2–3): 153–192.
- Bleiker C. (2017) EU to implement border fingerprint checks similar to United States. Available at: https://www.dw.com/en/eu-to-implement-border-fingerprint-checks-similar-to-united-states/a-41111621 (accessed 19 December 2018).
- Blosfield E. (2018) Password stolen; create a new one. What if your retina scan or fingerprint is stolen? Available at: https://www.insurancejournal.com/news/national/2018/05/11/488962.htm (accessed 19 December 2018).
- Bodner E, Bergman YS, Cohen-Fridel S. (2012) Different dimensions of ageist attitudes among men and women: A multigenerational perspective. International Psychogeriatrics 24(6): 895–901.
- Böhmer M, Hecht B, Schöning J, et al. (2011) Falling asleep with Angry Birds, Facebook and Kindle: A large scale study on mobile application usage. In: Human-computer interaction with mobile devices and services (MobileHCI’11), Stockholm, Sweden, 30 August 2011, pp. 47–56. ACM.
- Bolukbasi T, Chang K-W, Zou JY, et al. (2016) Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. In: Neural Information Processing Systems (NIPS’16), Barcelona. Available at: https://www.semanticscholar.org/paper/Man-is-to-Computer-Programmer-as-Woman-is-to-Word-Bolukbasi-Chang/274459c52103f9b7880d0697aa28755ac3366987 (accessed 11 October 2018).
- Bonchi F, Hajian S, Mishra B, et al. (2017) Exposing the probabilistic causal structure of discrimination. International Journal of Data Science and Analytics 3(1): 1–21.
- boyd danah, Crawford K. (2012) Critical questions for big data. Information, Communication & Society 15(5): 662–679.
- Brah A, Phoenix A. (2004) Ain’t I a woman? Revisiting intersectionality. Journal of International Women’s Studies 5(3): 75–86.
- Brandtzæg PB, Heim J, Karahasanović A. (2011) Understanding the new digital divide – A typology of Internet users in Europe. International Journal of Human-Computer Studies 69(3): 123–138.
- Buolamwini J, Gebru T. (2018) Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency 81: 1–15. Available at: http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf.
- Castells M, Fernández-Ardèvol M, Linchuan Qiu J, et al. (2006) Mobile Communication and Society: A Global Perspective. Cambridge: MIT Press.
- Cheney-Lippold J. (2011) A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society 28(6): 164–181.
- Cohoon JM, Aspray W. (eds) (2004) Women and Information Technology: Research on Underrepresentation. London: MIT Press.
- Comunello F, Fernández-Ardèvol M, Mulargia S, et al. (2017) Women, youth and everything else: Age-based and gendered stereotypes in relation to digital technology among elderly Italian mobile phone users. Media, Culture & Society 39(6): 798–815.
- Culotta A, Ravi NK, Cutler J. (2016) Predicting Twitter user demographics using distant supervision from website traffic data. Journal of Artificial Intelligence Research 55: 389–408.
- Dressel P, Minkler M, Yen I. (1997) Gender, race, class, and aging: Advances and opportunities. International Journal of Health Services 27(4): 579–600.
- Durick J, Robertson T, Brereton M, et al. (2013) Dispelling ageing myths in technology design. In: Proceedings of the Australian computer-human interaction conference (OzCHI’13), pp. 467–476. DOI: 10.1145/2541016.2541040.
- Eubanks V. (2018) Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor. New York: St Martin’s Press.
- Fernández-Ardèvol M. (2019) Older people go mobile. In: Oxford Handbook of Mobile Communication, Culture, and Information. Oxford: Oxford University Press.
- Fernández-Ardèvol M, Ivan L. (2013) Older people and mobile communication in two European contexts. Romanian Journal of Communication and Public Relations 15(3): 83–101. Available at: http://journalofcommunication.ro/index.php/journalofcommunication/article/view/196 (accessed 1 February 2015).
- Ferreira D, Dey AK, Kostakos V. (2011) Understanding human-smartphone concerns: A study of battery life. In: Proceedings of the international conference on pervasive computing, San Francisco, June 2011, pp. 19–33. Berlin: Springer-Verlag.
- Fingerprint Cards AB. (n.d.) Fingerprints biometric solutions for smartphones & tablets. Available at: https://www.fingerprints.com/solutions/smartphones-tablets/ (accessed 19 December 2018).
- Gafni R, Nagar I. (2016) CAPTCHA: Impact on user experience of users with learning disabilities. Interdisciplinary Journal of e-Skills and Lifelong Learning 12: 207–223.
- Garattini C, Prendergast D. (2015) Critical reflections on ageing and technology in the twenty-first century. In: Prendergast D, Garattini C. (eds) Aging and the Digital Life Course. Life Course, Culture and Aging: Global Transformations. New York: Berghahn Books, pp. 1–15.
- González-Bailón S, Braman S, Jaeger PT. (2017) Decoding the Social World: Data Science and the Unintended Consequences of Communication. Cambridge, MA: MIT Press.
- Grossetti M. (2007) Are French networks different? Social Networks 29(3): 391–404.
- Hajian S, Domingo-Ferrer J. (2013) A methodology for direct and indirect discrimination prevention in data mining. IEEE Transactions on Knowledge and Data Engineering 25(7): 1445–1459.
- Hajian S, Bonchi F, Castillo C. (2016) Algorithmic bias: From discrimination discovery to fairness-aware data mining. In: Proceedings of the international conference on knowledge discovery and data mining (KDD’16), pp. 2125–2126. DOI: 10.1007/s41060-016-0040-z.
- Hamidi H. (2019) An approach to develop the smart health using Internet of Things and authentication based on biometric technology. Future Generation Computer Systems 91: 434–449.
- Harmon K. (2009) Can you lose your fingerprints? Scientific American. Available at: https://www.scientificamerican.com/article/lose-your-fingerprints/ (accessed 28 November 2018).
- Holz C, Bentley F, Church K, et al. (2015) ‘I’m just on my phone and they’re watching TV’: Quantifying mobile device use while watching television. In: Proceedings of the ACM international conference on interactive experiences for TV and online video (TVX 2015). New York: Association for Computing Machinery, pp. 93–102.
- Housing and Civil Enforcement Section|CRT|Department of Justice (n.d.) Available at: https://www.justice.gov/crt/housing-and-civil-enforcement-section (accessed 11 December 2018).
- InAuth (2017) Fingerprints: The most popular biometric. Available at: https://www.inauth.com/blog/fingerprints-popular-biometric/ (accessed 20 December 2018).
- Ito M, Baumer S, Bittanti M, et al. (2010) Hanging Out, Messing Around, and Geeking Out: Kids Living and Learning with New Media. Cambridge, MA: MIT Press.
- Jacobson J, Lin CZ, McEwen R. (2017) Aging with technology: Seniors and mobile connections. Canadian Journal of Communication 42(2): 331.
- Jones SL, Ferreira D, Hosio S, et al. (2015) Revisitation analysis of smartphone app use. In: Pervasive and ubiquitous computing (UbiComp’15). Osaka, Japan: ACM Press, pp. 1197–1208.
- Justice and fundamental rights|European Commission (n.d.) Available at: https://ec.europa.eu/info/policies/justice-and-fundamental-rights_en (accessed 11 December 2018).
- Kamiran F, Karim A, Verwer S, et al. (2012) Classifying socially sensitive data without discrimination: An analysis of a crime suspect dataset. In: Proceedings of the 12th IEEE international conference on data mining workshops (ICDMW 2012), pp. 370–377. DOI: 10.1109/ICDMW.2012.117.
- Kaur K, Cook DM. (2019) Haptic alternatives for mobile device authentication by older technology users. In: Advances in Intelligent Systems and Computing. DOI: 10.1007/978-3-319-93692-5_24.
- Kitchin R. (2014) Big data, new epistemologies and paradigm shifts. Big Data & Society 1(1): 1–12.
- Kiukkonen N, Blom J, Dousse O, et al. (2010) Towards rich mobile phone datasets: Lausanne data collection campaign. In: Pervasive services (ICPS’10). Berlin, Germany: Springer.
- Kosinski M, Stillwell D, Graepel T. (2013) Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences 110(15): 5802–5805. Available at: http://www.pnas.org/cgi/doi/10.1073/pnas.1218772110.
- Kretchmer SB. (2017) Theorizing digital divides through the lens of the social construction of technology and social shaping of technology. In: Ragnedda M, Muschert GW. (eds) Theorizing Digital Divides. Taylor & Francis, pp. 88–102.
- Lagacé M, Charmarkeh H, Laplante J, et al. (2015) How ageism contributes to the second level digital divide: The case of Canadian seniors. Journal of Technologies and Human Usability 11(4): 1–13.
- Lee U, Lee J, Ko M, et al. (2014) Hooked on smartphones: An exploratory study on smartphone overuse among college students. In: Human factors in computing systems (CHI’14). Toronto, Canada: ACM Press, pp. 2327–2336. Available at: http://www.yatani.jp/paper/CHI2014_MobileOveruse.pdf.
- Mager A. (2012) Algorithmic ideology: How capitalist society shapes search engines. Information, Communication and Society 15(5): 769–787.
- Manovich L. (2012) Trending: The promises and the challenges of big social data. In: Gold MK. (ed) Debates in the Digital Humanities. Minneapolis: University of Minnesota Press, pp. 460–475.
- McPherson M, Smith-Lovin L, Cook JM. (2001) Birds of a feather: Homophily in social networks. Annual Review of Sociology 27(1): 415–444.
- Neff G, Nagy P. (2016) Talking to bots: Symbiotic agency and the case of Tay. International Journal of Communication 10: 4915–4931.
- Neves B, Amaro F. (2012) Too old for technology? How the elderly of Lisbon use and perceive ICT. The Journal of Community Informatics 8(1): 1–12.
- Ng A. (2016) Child uses sleeping mom’s fingerprints to buy Pokemon gifts – CNET. Available at: https://www.cnet.com/news/child-uses-sleeping-moms-fingerprints-to-buy-pokemon-gifts/ (accessed 19 December 2018).
- O’Neil C. (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books.
- Oreglia E, Kaye JJ. (2012) A gift from the city: Mobile phones in rural China. In: Computer-supported cooperative work and social computing (CSCW’12). Seattle: ACM Press, pp. 137–146. DOI: 10.1145/2145204.2145228.
- Padave K. (2014) The evolution of CAPTCHA – How an anti-spam measure has grown? Available at: https://www.exeideas.com/2014/11/the-evolution-of-captcha.html (accessed 20 December 2018).
- Palmore EB. (1999) Ageism: Negative and Positive, 2nd ed. New York, NY: Springer.
- Parei A, Hamidi H. (2018) A method for FIDO management through biometric technology in IOT.
- Pasquale F. (2015) The Black Box Society. Cambridge, MA: Harvard University Press. DOI: 10.4159/harvard.9780674736061.
- Pedreschi D, Ruggieri S, Turini F. (2009) Measuring discrimination in socially-sensitive decision records. In: Proceedings of the 2009 SIAM international conference on data mining. Nevada: Society for Industrial and Applied Mathematics, pp. 581–592. DOI: 10.1137/1.9781611972795.50.
- Perozzi B, Skiena S. (2015) Exact age prediction in social networks. In: Proceedings of the 24th international conference on world wide web. New York: Association for Computing Machinery, pp. 91–92. DOI: 10.1145/2740908.2742765.
- Prusty N. (n.d.) How does Google’s no CAPTCHA ReCAPTCHA work? Available at: http://qnimate.com/how-does-googles-no-captcha-recaptcha-work/ (accessed 20 December 2018).
- Ragnedda M, Ruiu ML. (2017) Social capital and the three levels of digital divide. In: Ragnedda M, Muschert GW. (eds) Theorizing Digital Divides. Taylor & Francis, pp. 21–34.
- Rahmati A, Tossell C, Shepard C, et al. (2012) Exploring iPhone usage. In: Proceedings of the international conference on human computer interaction with mobile devices and services (MobileHCI’12), p. 11. DOI: 10.1145/2371574.2371577.
- Rosales A, Fernández-Ardèvol M. (2016) Smartphones, apps and older people’s interests: From a generational perspective. In: Human-computer interaction with mobile devices and services (MobileHCI’16). Florence, Italy: ACM Press, pp. 491–503. DOI: 10.1145/2935334.2935363.
- Rosales A, Fernández-Ardèvol M. (2019) Smartphone usage diversity amongst older people. In: Sayago S. (ed) Perspectives on Human-Computer Interaction Research with Older People. Berlin: Springer.
- Sarbaz S, Azadeh P, Samiei F, et al. (2016) Evaluation of fingerprint loss in patients under paclitaxel based chemotherapy regimen. Annals of Oncology 27(9): ix189.
- Sawchuk K, Crow B. (2011) Into the grey zone: Seniors, cell phones and milieus that matter. WI: Journal of Mobile Media 5(1). Available at: http://wi.mobilities.ca/into-the-grey-zone-seniors-cell-phones-and-milieus-that-matter/ (accessed 6 September 2011).
- Schäfer MT, Van Es K. (2017) The Datafied Society: Studying Culture Through Data. Amsterdam: Amsterdam University Press.
- Shin C, Hong J-H, Dey AK. (2012) Understanding and prediction of mobile application usage for smart phones. In: Proceedings of the joint conference on pervasive and ubiquitous computing (UbiComp’12), p. 173. DOI: 10.1145/2370216.2370243.
- Smith JA, McPherson M, Smith-Lovin L. (2014) Social distance in the United States: Sex, race, religion, age, and education homophily among confidants, 1985 to 2004. American Sociological Review 79(3): 432–456.
- Soares J, Gaikwad AN. (2016) A self banking biometric machine with fake detection applied to fingerprint and iris along with GSM technology for OTP. In: International conference on communication and signal processing (ICCSP 2016). Piscataway: Institute of Electrical and Electronics Engineers, pp. 508–512. DOI: 10.1109/ICCSP.2016.7754189.
- Srinivasan V, Moghaddam S, Mukherji A, et al. (2014) MobileMiner: Mining your frequent patterns on your phone. In: Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing (UbiComp 2014). New York: Association for Computing Machinery, pp. 389–400. DOI: 10.1145/2632048.2632052.
- Stangor C, Schaller M. (2000) Stereotypes as individual and collective representations. In: Stangor C. (ed) Stereotypes and Prejudice: Essential Readings. Philadelphia, PA: Psychology Press, pp. 64–82.
- Stocchetti M. (2018) Invisibility, inequality and the dialectics of the Real in the Digital Age. Interações 34(1): 23–46.
- Thakkar D. (n.d.) Iris recognition scanners vs. fingerprint scanners. Available at: https://www.bayometric.com/iris-recognition-scanners-vs-fingerprint-scanners/ (accessed 20 December 2018).
- Trader J. (n.d.) Iris recognition vs. retina scanning – What are the differences? Available at: http://www.m2sys.com/blog/biometric-hardware/iris-recognition-vs-retina-scanning-what-are-the-differences/ (accessed 20 December 2018).
- van Dijk J. (2006) Digital divide research, achievements and shortcomings. Poetics 34(4–5): 221–235.
- van Dijck J, Poell T, de Waal M. (2018) The Platform Society: Public Values in a Connective World. Oxford: Oxford University Press.
- Wachter-Boettcher S, Emmes A. (2018) Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. New York: W.W. Norton & Company.
- Wagner DT, Rice A, Beresford AR. (2013) Device analyzer: Understanding smartphone usage. In: Proceedings of the international conference on mobile and ubiquitous systems. Tokyo, Japan: Springer, pp. 1–12. DOI: 10.1007/978-3-319-11569-6_16.
- Wagner S, Fernández-Ardèvol M. (2016) Local content production and the political economy of the mobile app industries in Argentina and Bolivia. New Media & Society 18(8): 1768–1786.
- World Health Organization (2017) 10 facts on ageing and health. Available at: http://www.who.int/features/factfiles/ageing/en/ (accessed 12 September 2018).
- Xu R, Frey RM, Fleisch E, et al. (2016) Understanding the impact of personality traits on mobile app adoption – Insights from a large-scale field study. Computers in Human Behavior 62: 244–256.
- Yan J, El Ahmad AS. (2008) A low-cost attack on a Microsoft CAPTCHA. In: Proceedings of the 15th ACM conference on computer and communications security (CCS ’08), p. 543. DOI: 10.1145/1455770.1455839.
- Yan T, Chu D, Ganesan D, et al. (2012) Fast app launching for mobile devices using predictive user context. In: Proceedings of the conference on mobile systems, applications, and services (MobiSys’12), Low Wood Bay, England: ACM Press, pp. 113–126.
- Youyou W, Kosinski M, Stillwell D. (2015) Computer-based personality judgments are more accurate than those made by humans. Proceedings of the National Academy of Sciences 112(4): 1036–1040.
- Zhao J, Wang T, Yatskar M, et al. (2017) Men also like shopping: Reducing gender bias amplification using corpus-level constraints. In: Proceedings of the 2017 conference on empirical methods in natural language processing, Copenhagen, Denmark, September 2017. Stroudsburg, PA: Association for Computational Linguistics.