Abstract
Introduction:
This study investigated the accessibility and usability of online job applications by individuals with visual impairments who use screen readers.
Method:
Three blind, experienced screen reader users on the research team attempted to complete online job applications from 30 randomly selected Fortune 500 companies, for a total of 90 application trials. Each tester used a different screen reader and browser combination.
Results:
The testers experienced significant barriers to completing the online job applications: 50 of 90 (55.6%) attempts were successful, but 23 of the 30 (76.7%) sites had critical issues that prevented at least one tester from completing the application. Testers found 694 accessibility issues and gave an average ease of use rating of 3.60 (SD=1.05) out of 5 for the job application sites. Twenty-six of the job application sites were created by eight job application software vendors, with significant variation in accessibility across different applications from the same vendor.
Discussion:
The accessibility and usability issues identified with the majority of online job applications evaluated present an unnecessary barrier to employment for screen reader users. While the 55.6% success rate found in this study is considerably higher than in previous research, only 23.3% of the job application sites were accessible with all three screen readers.
Implications:
Job application software vendors and companies that create their own online job applications should take a Universal Design approach by following the WCAG guidelines and implementing recommended solutions to increase the accessibility of their applications. Software vendors are encouraged to only offer accessible options for creating online job applications, such that companies would not be able to create an inaccessible form.
In 2022, 80% of all job searches were conducted online (Kolmar, 2022). Online resources were the most popular method of searching for a job, ahead of personal and professional networks, and more than a third of job seekers considered them their most important resource (Smith, 2015). Other research documented that the most common way to search for a job was on company websites (Gallup, 2017). Online applications are often the only method available to apply for jobs with large companies, given the popularity of applicant tracking systems, which are used by 99% of Fortune 500 companies (Hu, 2019). While multiple research studies have formally evaluated the accessibility of federal government (Jaeger, 2006; Loiacono et al., 2005; Olalere & Lazar, 2011), state government (Fagan & Fagan, 2004; Lazar et al., 2013; Youngblood & MacKiewicz, 2012; Youngblood, 2014; Yu & Parmanto, 2011), corporate (Loiacono et al., 2009; Loiacono & Djamasbi, 2013), and even web development company websites (Gilbertson & Machin, 2012), there have been far fewer studies on the accessibility of online job application sites, which is particularly concerning given their frequent use in the job search.
According to 2019 American Community Survey (ACS) estimates, almost 3.8 million working-age adults have visual impairments. Recent ACS estimates indicate that 46% of working-age individuals who report a visual disability are employed, compared with 79% of people without disabilities (U.S. Census Bureau, 2020). People with visual impairments are approximately twice as likely to be unemployed (actively seeking a job but unable to find one) as people without disabilities (McDonnall & Sui, 2019). The first step to achieving employment is the ability to apply for a job, and the inaccessibility of these applications for screen reader users may contribute to the high rate of unemployment among people with visual impairments. Thus, it is essential that online job applications be accessible to screen reader users.
Web inaccessibility, particularly for screen reader users, is a longstanding issue that has received a fair amount of research attention. Numerous state and national laws and regulations require digital technology and online content to be accessible and usable by people with disabilities. Yet inaccessible websites continue to be a problem, as documented in WebAIM’s annual study of the top one million websites, in which 96.8% of home pages had detectable Web Content Accessibility Guidelines (WCAG) failures (WebAIM, 2022). Another study, in which hundreds of websites were analyzed by people who are blind, found that more than 70% of the sites had critical accessibility blockers that made the main purpose of the site inaccessible (Wettemann & White, 2019). A growing number of lawsuits have attempted to address this problem. In 2017, 814 cases were filed; this number increased by more than 175% in 2018 and 2019 and has continued to increase every year since, to a high of 3,255 in 2022 (Launey & Vu, 2023). Website accessibility lawsuits made up 37% of the Americans with Disabilities Act (ADA) Title III lawsuits filed in 2022 (Launey & Vu, 2023).
Most web accessibility lawsuits have been filed under ADA Title III, but issues with inaccessible online job application websites could fall under ADA Title I, which prohibits employment discrimination against qualified individuals with disabilities, including discrimination in applying for jobs (Weissburg, 2021). Additionally, these online employment applications could potentially fall under Section 503 of the Rehabilitation Act of 1973, which requires that all employers with federal contracts or subcontracts of at least $10,000 “must take affirmative action to hire, retain, and promote qualified individuals with disabilities” (41 C.F.R. § 60-741.1).
With the almost ubiquitous use of online application sites today, inaccessible online applications potentially pose a significant barrier to employment, but this has received little research attention. We identified only three previous research studies that evaluated the accessibility of online job application sites for screen reader users, and all indicated significant accessibility problems. The first study used an automated system and expert simulation to check accessibility and found that only 33% (3/9) of aggregator job sites and 25% (3/12) of corporate sites were accessible enough to allow submission of an application (Bruyère et al., 2005). The next study evaluated the accessibility of eight aggregator job sites through expert inspection based on guidelines from the Section 508 regulations and found that only one site did not violate any of the 16 guidelines (Lazar et al., 2011). The most recent study, conducted in 2011, evaluated the accessibility and usability of 16 company application sites with screen reader users and found that only 28% of application attempts could be completed without sighted assistance (Lazar et al., 2012). Given that these studies were conducted more than 10 years ago and technology has changed substantially, additional research on the accessibility of online job applications is warranted.
This study adds to the literature by providing current information about the accessibility and usability of online job application sites of 30 large, national companies, making it the largest study of its kind. We investigated the following questions to evaluate the ability of people who are visually impaired to independently complete online job applications:
RQ1: What is the current state of accessibility and usability of online job application sites of national companies?
RQ2: What are the most common accessibility and usability issues on companies’ online job application sites?
RQ3: What is the user experience for blind screen reader users when applying for jobs through companies’ online application sites?
Methods
Sample
To answer the primary research questions, this study evaluated online job applications from 30 U.S.-based Fortune 500 companies across five industry sectors: (a) financial, (b) health care, (c) retail, (d) technology, and (e) hotels, restaurants, and leisure. Criteria for company inclusion were a required online application process through the company’s website and routine hiring of a large number of employees (defined as having 100 or more open positions posted or having at least 70,000 employees). The research team identified Fortune 500 companies that met these criteria and randomly selected six companies to represent each industry category. Because our sample does not consist of human subjects, this study was determined by the second author’s institution to be exempt from IRB oversight.
Study Design
Three expert testers conducted a task-oriented heuristic evaluation of the 30 online job application sites while being observed by accessibility engineers who recorded the data. We used a mixed methods protocol to collect user experience feedback from the testers during the online job application process for each company, including both technical reporting of issues encountered and a questionnaire summarizing the user experience. All three testers on the research team are long-time screen reader users who are totally blind (without functional vision). They are advanced users trained to provide structured feedback regarding the accessibility and usability of websites. We utilized expert screen reader users who are experienced usability testers because an experienced tester can provide (a) precise, specific feedback regarding accessibility issues encountered and (b) more consistent results across multiple tests than someone who has never performed testing before. Having the same people test all 30 job application sites provided consistency in screen reader skill level, ensuring that tester skill level did not influence their evaluations of the accessibility or usability of the sites.
During the testing process, accessibility engineers recorded the following data for each issue identified by a tester: (a) action performed, (b) expected results, (c) actual results, (d) impact on user (severity), and (e) the category of the issue based on WCAG 2.1 success criteria. To rate the impact of each issue on the user’s progress, testers used a system that considered both the severity and importance of the issue to the application submission process. For example, if a user could easily work around an accessibility problem, the impact was rated low. However, if that issue occurred in an area critical to completing the application process, then the impact was rated high. A critical issue, referred to as a “blocker,” required the user to seek help from a sighted person (i.e., the accessibility engineer) to work around the problem.
Upon completing each online job application, testers responded to a questionnaire regarding their experience. The questionnaire items were:
What specific barriers did you encounter? (free response)
Rating of ease of use (1 = No part of the process was usable, 2 = Encountered many barriers, 3 = Encountered some barriers, 4 = Encountered a few barriers, 5 = Encountered no barriers at any point)
General feedback (free response)
(Optional) On this application, did you notice any screen reader features that developers built in? (Yes/No, with option for free response)
Materials
The testing was conducted in the U.S. on desktop computers that were free of external applications or extensions that could interfere with testing. Each tester used a different screen reader and browser combination, as follows: (a) NVDA version 2020.4 with Chrome version 89 on Windows 10, (b) JAWS 2021 with Chrome version 89 on Windows 10, and (c) VoiceOver with Safari 14 on macOS Big Sur. Researchers selected these three screen readers because, together, they are the primary screen readers of 90.9% of screen reader users (WebAIM, 2021). Chrome is the most popular web browser in the United States among JAWS and NVDA users (WebAIM, 2021) and was therefore paired with those screen readers on Windows.
Procedure
Each tester assessed all 30 job application websites while going through the entire process of completing an online job application for a total of 90 tests. During testing, an accessibility engineer monitored data collection via screen share, recorded the sessions, captured real-time feedback from the testers (i.e., typed testers’ comments and documented issues testers encountered), and recorded the length of time taken to complete each application, which ranged from 20 minutes to 2 hours and 15 minutes. Usability testers narrated their actions and experience as they completed each online job application. The engineers did not provide feedback or guidance unless the tester required assistance to overcome blocker issues that prevented them from proceeding. Blocker issues were noted and resolved with the assistance of the accessibility engineer to allow testing to continue.
When a tester identified an accessibility or usability issue, the accessibility engineer recorded a new entry that included (a) a description of the action, (b) expected and actual results, (c) impact on the user, and (d) the category of the error related to the WCAG 2.1 success criteria in violation. A job application site was considered to fail if testers found one or more blocker accessibility issues during the testing. After each online job application test, the tester responded to the series of four questions to describe their user experience.
Data Analysis
Descriptive statistics were used to analyze the data. Frequencies were tabulated for issues per platform, number of applications by vendor, and specific issue categories. Means and standard deviations were calculated to evaluate ease of use overall and by job application site, industry sector, and vendor.
Results
Current State of Online Job Application Accessibility and Usability (RQ1)
A total of 694 accessibility issues were discovered during testing of the 30 online job applications, broken down as follows:
73 blocker/critical issues
345 high-impact issues
188 medium-impact issues
87 low-impact issues
1 bug with no severity level assigned (see Table 1)
Twenty-three of the 30 (76.7%) online job application sites failed on at least one screen reader/browser combination, meaning that a tester found a blocker issue and could not continue without sighted assistance (see Table 1). On many application sites, the three platforms experienced considerably different results. VoiceOver with Safari encountered the most critical issues (38) and succeeded on 13 (43.3%) of the applications. JAWS with Chrome had 24 critical issues and succeeded on 14 (46.7%) of the applications. NVDA with Chrome encountered the fewest critical issues (11) and was the most successful platform, succeeding on 23 (76.7%) of the applications. Although a similar total number of issues was found with the NVDA/Chrome platform, those issues were rated lower in impact. Even so, this tester encountered critical issues on some applications on which the other screen reader and browser combinations succeeded.
Table 1.
Issues per Platform During Job Application Testing
| Issues per Platform | Blocker | Bug | High | Medium | Low | Grand Total |
|---|---|---|---|---|---|---|
| JAWS/Chrome | 24 | 0 | 116 | 94 | 5 | 239 |
| NVDA/Chrome | 11 | 1 | 58 | 64 | 80 | 214 |
| VoiceOver/Safari (macOS) | 38 | 0 | 171 | 30 | 2 | 241 |
Overall, the screen readers succeeded on 50 of 90 attempts, a 55.6% success rate, indicating a poor overall state of accessibility for online job applications. Viewed from another perspective, however, 26 of the 30 job applications (86.7%) succeeded on at least one of the three test platforms. Achieving that relatively high success rate, though, could require a user to attempt the job application on up to three separate platforms, which would place an undue burden on the user.
Eight software vendors accounted for 26 of the 30 job applications; the remaining four were custom builds by the companies themselves (see Table 2). There was significant variation in the number and type of blocking issues found across a given vendor’s applications. Although the custom-built job application sites did not perform well on average, they included both the highest (5.00 [SD=0]) and lowest (1.67 [SD=1.15]) company scores, reflecting substantial variability.
Table 2.
Job Application Software Vendor Count and Average Ease of Use Scores
| Vendor | Count | Ease of Use Mean (SD) | Min | Max |
|---|---|---|---|---|
| My Workday Jobs | 9 | 3.37 (0.84) | 1 | 5 |
| Taleo | 7 | 3.48 (1.17) | 1 | 5 |
| [Custom-built system] | 4 | 3.50 (1.51) | 1 | 5 |
| Phenom People | 3 | 3.67 (1.00) | 2 | 5 |
| Brass Ring | 3 | 4.11 (0.60) | 3 | 5 |
| Peopleclick.com | 1 | 4.33 (0.58) | 4 | 5 |
| Talemetry | 1 | 4.67 (0.58) | 4 | 5 |
| iCIMS | 1 | 3.00 (1.00) | 2 | 4 |
| Oracle Cloud | 1 | 4.00 (1.00) | 3 | 5 |
Looking at the data by industry sector, applications from the technology sector were rated highest (M=4.17, SD=0.79), followed by retailing (M=3.78, SD=1.11); hotels, restaurants, and leisure (M=3.67, SD=1.03); health care (M=3.28, SD=1.27); and financial (M=3.11, SD=0.68). It is perhaps unsurprising that the tech sector would handle the technical issues of accessibility and usability best, but clearly all sectors need improvement.
Common Accessibility and Usability Issues (RQ2)
All 30 job applications tested were large multi-page forms with different types of form fields and controls, many of which had problems in one or more areas. Table 3 presents the issue categories and counts of issues, including blocker issues. Nearly three-quarters (74%) of the blocker issues fell into the keyboard accessibility, information and relationships, or labeling categories. More than half (53%) of the blocker issues were related to keyboard accessibility. More than one-third (34%) of blocker issues involved either date pickers or combo fields, making these specific form controls the most problematic for independent completion of the job applications.
Table 3.
WCAG 2.1 Issue Categories and Raw Count by Any Severity and Blockers
| Issue Category | Any Severity | Blockers* |
|---|---|---|
| Labels | 125 | 8 |
| Keyboard Accessible | 116 | 39 |
| Info and Relationships | 92 | 7 |
| Role | 72 | 5 |
| Setting Focus | 64 | 1 |
| Error Messages | 47 | 3 |
| Needs Discussion | 43 | 3 |
| Consistent Navigation | 43 | 1 |
| Headings | 32 | 0 |
| Focus Order | 25 | 1 |
| Modal Dialogs | 15 | 4 |
| Alternative Text | 12 | 1 |
| Contrast | 8 | 0 |
| Total | 694 | 73 |
*Blocker means a critical issue that prevented the tester from completing the job application.
A picker is a common form control that lets the user choose from a list of prescribed options and is often used to choose a date (see Figure 1). In many of the forms tested, the date pickers were not standard components; instead, they were customized in ways that blocked screen reader users from independently choosing a date. A standard edit field allows the user to type or paste text into that field, while an edit combo field expands on that functionality by populating a drop-down list of choices after the user begins typing (see Figure 2 for an example). These were often used for items such as choosing the job applicant’s college, and the usability testers experienced significant barriers with them: often, testers were not alerted to the existence of the drop-down list and were unable to scroll through the list and choose an option (a minimal sketch of the semantics a screen reader needs from such a field follows Figure 2).
Figure 1.
Default Chromium date picker.
Figure 2.
Example of a combobox from the ARIA 1.1 design patterns.
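To make the combo-field barrier concrete, the sketch below shows the semantics the WAI-ARIA combobox pattern relies on, written in TypeScript against the browser DOM. This is a minimal illustration under stated assumptions, not a full implementation: the element ids and option strings are hypothetical, and a production widget would also need the complete keyboard handling (arrow keys, Escape, Enter) described in the Authoring Practices.

```typescript
// Skeleton of an accessible autocomplete ("edit combo") field.
// Hypothetical ids and options; uses the input-based combobox form
// of the WAI-ARIA pattern.
const input = document.createElement("input");
input.type = "text";
input.setAttribute("role", "combobox");
input.setAttribute("aria-autocomplete", "list");
input.setAttribute("aria-expanded", "false"); // list starts closed
input.setAttribute("aria-controls", "college-listbox");

const listbox = document.createElement("ul");
listbox.id = "college-listbox";
listbox.setAttribute("role", "listbox");
listbox.hidden = true;

function showSuggestions(options: string[]): void {
  listbox.textContent = ""; // clear previous suggestions
  options.forEach((name, i) => {
    const item = document.createElement("li");
    item.id = `college-option-${i}`;
    item.setAttribute("role", "option");
    item.textContent = name;
    listbox.appendChild(item);
  });
  listbox.hidden = false;
  // These two updates are what tell a screen reader that suggestions
  // exist and which one is current; a widget that omits them leaves
  // the user typing into what seems to be a plain text field.
  input.setAttribute("aria-expanded", "true");
  input.setAttribute("aria-activedescendant", "college-option-0");
}

document.body.append(input, listbox);
showSuggestions(["Example University", "Example State College"]);
```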
Common user comments during testing that described these critical blocker issues are summarized as follows (a sketch of fixes for the most common labeling problems appears after the list):
Questions not associated with labels
Unlabeled buttons and form fields
Unclear labels for fields and inaccessible (likely custom) fields
Unclear error messages which were hard to find and access
Form field labels constantly reading “accessibility”
Inaccessible search and drop-down edit fields
Unclear selection and acceptance of suggestions presented in edit fields
Nonstandard drop-down list improperly presented like free form text, leading to a complete blocker
Inaccessible autocomplete with unopenable drop-down list
Focus traps created by unconventional date fields and insufficient labels
Unconfirmed selection of items within unconventional pickers
Blockers from date fields
Unusable calendar widgets
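Several of these comments (questions not associated with labels, unlabeled buttons and form fields, labels reading “accessibility”) trace back to a missing or incorrect programmatic name. The following is a minimal sketch of the two standard fixes, with hypothetical ids and strings:

```typescript
// Fix 1: associate the visible label with its field via for/id so a
// screen reader announces "Legal name, edit" instead of just "edit".
const label = document.createElement("label");
label.htmlFor = "legal-name"; // renders as for="legal-name"
label.textContent = "Legal name";

const nameField = document.createElement("input");
nameField.id = "legal-name";
nameField.type = "text";

// Fix 2: give an icon-only control an explicit accessible name;
// without one it is announced as an anonymous "button", and a careless
// aria-label produces nonsense names like the "accessibility" labels
// the testers reported.
const searchButton = document.createElement("button");
searchButton.type = "button";
searchButton.setAttribute("aria-label", "Search jobs");
searchButton.textContent = "🔍"; // decorative glyph; name comes from aria-label

document.body.append(label, nameField, searchButton);
```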
User Experience (RQ3)
The average overall ease of use rating for the 30 online job applications was 3.60 (SD=1.05) out of 5, indicating a moderate level of overall usability for the job applications tested. Not surprisingly, the existence of a blocker preventing application completion appeared to affect user ratings. Of the 14 job applications with average scores between 4 and 5, 13 succeeded on at least one platform. Additionally, because in many cases one screen reader and browser combination performed significantly better than another on a specific job application site, the average scores tended to converge towards the middle. The ease of use ratings also varied by job application vendor.
User Experience by Screen Reader
Ease of use ratings by screen reader platform did not correspond with each screen reader’s success in completing the job applications. The average overall ease of use rating given by the NVDA tester was 3.33 (SD=0.88), while the JAWS tester’s average rating was 3.50 (SD=1.07) and the VoiceOver tester’s average rating was 3.97 (SD=1.10). Although NVDA succeeded on nearly double the number of online job applications as JAWS or VoiceOver, the NVDA tester rated the experiences lowest. This may indicate that the NVDA tester had to apply more advanced techniques, or try multiple methods to find a workaround, than the other testers did, and that the necessity of such efforts decreased satisfaction.
User Experience by Vendor
User experience varied across vendor platforms. Although most user comments were negative overall, some were positive about certain aspects of the job applications. While eight vendors accounted for 26 of the online job applications, each company’s implementation of a vendor’s application was different: vendors offer multiple versions of their applications, so while there are similarities, each company using a given vendor presents a unique interface. Variability was high for most vendors whose job applications were used by more than one company (see Table 2).
Of the eight vendors, Talemetry received the highest user rating (4.67) and iCIMS the lowest (3.00), but each was based on only one job application form. For vendors with several job application forms in the sample, testers reported a wide range of experiences: some applications had confusing navigation, unpredictable controls, and inconsistent focus, while other applications from the same vendor were reported to be much more accessible and easier to use.
Noted Intentional Efforts for Accessibility
Testers found features added specifically for screen readers, such as screen-reader-only text or instructions, on some sites; at least two of the three testers noted such features on 12 (40%) of the applications. However, on one-third of the applications with added screen reader features, testers reported that the features made the experience worse instead of better. Testers also identified many occasions where the application process worked well. In some instances, components such as form fields, combo boxes, and graphics were all labeled and configured properly, and it was obvious that developers considered accessibility and usability during design. A few exemplary comments are provided below.
“Search worked very well to find a specific job.”
“Everything was labeled correctly.”
“They did a great job of error correction, with the error correction dialog. Alerts were automatically announced, better than most sites.”
“The experience was simple enough, and I was able to complete an application.”
“Headings were done right and allowed for easy navigation.”
“No problematic edit combos and mostly used standard components or well-done components.”
However, several efforts towards better usability were either poorly implemented or simply misguided. Examples of user comments about such instances include the following (a sketch of a better-implemented alert pattern appears after these comments):
“Inappropriately added accessibility labels.”
“Too many unhelpful drop-down instructions.”
“Skip links don’t work.”
“ARIA alerts are used to tell you information but are not well implemented.”
“ARIA alert indicates certain errors, but places focus in a random area.”
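The last two comments describe alerts that announce an error but leave keyboard focus stranded. Below is a hedged sketch of the conventional pattern (element names are hypothetical): put the message in a role="alert" live region, which screen readers announce automatically, and move focus to the offending field so the announcement and the keyboard position agree.

```typescript
// Error reporting that both announces and places focus predictably.
// role="alert" implies aria-live="assertive", so updating textContent
// triggers an automatic screen reader announcement.
const alertRegion = document.createElement("div");
alertRegion.setAttribute("role", "alert");
document.body.appendChild(alertRegion);

function reportFieldError(field: HTMLInputElement, message: string): void {
  alertRegion.textContent = message; // announced automatically
  field.setAttribute("aria-invalid", "true");
  field.focus(); // focus lands on the problem field, not a "random area"
}

// Usage with a hypothetical required field:
const email = document.createElement("input");
email.id = "email";
email.type = "email";
document.body.appendChild(email);
reportFieldError(email, "Email address is required.");
```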
Discussion
This study of online job application accessibility and usability for screen reader users represents the first research conducted on this topic since 2011. Using expert testers who utilized three different screen reader/browser combinations, we evaluated 30 online job application sites of Fortune 500 companies. Accessible and usable online job applications are essential to increase employment opportunities for individuals with visual impairments. The results of this study indicate that this essential need has not been met, with the accessibility and usability problems identified presenting an unnecessary barrier during what is often the first step in finding employment. These accessibility problems could immediately discourage potential applicants using screen readers, indicating that employers are ignoring a portion of their potential hiring pool. Given the current difficulty companies face in hiring employees (Morath, 2021; Nguyen, 2022), this is an important consideration.
Testing job applications from these 30 companies resulted in an unacceptably low success rate. The percentage of applications that were completed independently (with no sighted assistance required) was substantially higher in this study (55.6%) than in Lazar et al.’s (2012) study (28.1%), suggesting that accessibility of job applications may have improved during the past 10 years. However, given that the overall success rate was only slightly more than half and only 23.3% of the job applications could be completed independently across all three screen reader/browser combinations, we clearly have a long way to go to reach universal online job application accessibility. Usability testers’ overall ease of use ratings were moderate, and their comments demonstrated significant frustration. Another indication of poor usability is the time it took to complete the applications: the shortest application process was 20 minutes, and the longest was 135 minutes. This compares unfavorably to the average time it takes a typical user to complete a Fortune 500 company’s application process, which is less than 5 minutes (Maurer, 2022). Companies are concerned about the application process being a barrier to obtaining talent, but many do not seem to be seriously considering accessibility as an aspect of this.
The eight job application software vendors utilized by companies in this study do not offer one standard online job application to their customers. Instead, they allow companies to choose from several existing options or to customize the application system. Our findings suggest that accessibility may have been considered for some aspects of the applications but not for others. On several occasions, the testers and accessibility engineers discovered functionality specifically designed for screen reader users. In a few cases, these were “workarounds” to allow using a screen reader with the site; in other cases, they were specific messages or alerts to help screen reader users navigate the site. However, even when job application developers attempted to improve functionality for screen reader users, the added functionality was occasionally implemented incorrectly or inappropriately, causing more harm than good. Our findings suggest that a clean, well-implemented application will provide a good user experience for screen reader users and that it is not necessary to build screen-reader-specific functionality, which often adds complexity.
The most promising finding from the study was that 26 of the 30 job applications could be successfully and independently completed with at least one of the three screen reader/browser combinations tested. However, there was significant variation in the success of the different screen readers, suggesting that vendors and companies might have designed with one screen reader platform in mind and not others; moreover, the optimal screen reader and browser configuration, if there is one, was not disclosed to the user. To achieve this 86.7% success rate, a screen reader user would potentially be required to experiment with two different computer operating systems and three separate screen reader/browser combinations. Employers would never accept placing such an undue burden on their general population of job applicants.
Implications
The misguided accessibility efforts found in this study show the need for increased and varied testing with screen reader users by the creators of online job applications; such testing would easily catch these problems. Accessible and usable online job applications should be a high priority for companies, which must understand the impact that inaccessible job applications have on people with disabilities. Vendors that supply job application software and companies that create their own online job applications are urged to take a Universal Design approach by following the WCAG and implementing recommended solutions to increase the accessibility of their applications, creating a more inclusive hiring system. Software vendors are encouraged to offer only accessible options for creating online job applications, such that companies would not be able to create an inaccessible form.
Of the 694 accessibility and usability issues discovered in this study that hindered testers from independently completing the online job applications, none present difficult hurdles for the designers and developers of online job applications. The problems posed by each issue, including the blocker issues, can be solved with straightforward, well-known solutions and techniques (Melnyk et al., 2014), such as creating properly labeled form fields and using standard pickers and drop-down lists (see the sketch below); these techniques are documented in the WAI-ARIA Authoring Practices 1.1. Following WCAG and using commonly known usability techniques would solve all the problems discovered and improve job application accessibility and usability for screen reader users.
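As a minimal illustration of the “standard components” recommendation (field names are hypothetical): native date and select controls inherit the browser’s accessible behavior, such as the Chromium picker shown in Figure 1, with no custom scripting, so a properly labeled native control avoids the focus traps and unannounced drop-down lists reported above.

```typescript
// Standard controls typically need only a programmatic label.
const dateLabel = document.createElement("label");
dateLabel.htmlFor = "start-date";
dateLabel.textContent = "Earliest start date";

const startDate = document.createElement("input");
startDate.id = "start-date";
startDate.type = "date"; // browser supplies the accessible picker

const stateLabel = document.createElement("label");
stateLabel.htmlFor = "state";
stateLabel.textContent = "State";

const stateSelect = document.createElement("select");
stateSelect.id = "state";
for (const name of ["Alabama", "Alaska", "Arizona"]) {
  stateSelect.append(new Option(name)); // native drop-down option
}

document.body.append(dateLabel, startDate, stateLabel, stateSelect);
```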
Limitations
While there are several advantages to using expert testers to evaluate the job application sites, as discussed in the Methods section, this design is not without limitations. Although severity levels were defined for this study, testers may have had different subjective opinions about the severity of the issues they encountered. Because only one person used each screen reader, personal interpretation of severity may have influenced the severity ratings by screen reader. Likewise, ease of use ratings by screen reader may be biased by how each individual interpreted the rating scale. Another limitation of using experienced testers is that advanced users are likely not representative of the typical screen reader user, whose skills and ability to navigate websites may be lower. Our testers navigated the websites using different methods (tabbing, using arrow keys, etc.), whereas some less-experienced screen reader users may use only one method to navigate websites. Therefore, our study may have underestimated the challenges that a typical screen reader user would encounter while trying to complete online job applications.
Conclusion
The state of online job application accessibility for screen readers is moderate at best. Completing an accessible and usable online job application is not a difficult task for a screen reader user with average skills. Developers of online job application sites and software vendors should be aware of and follow all WCAG 2.1 guidelines. Software from job application vendors was used to create 86.7% of the job applications studied; if vendors offered only options that meet WCAG 2.1 guidelines, companies would create accessible online job applications by default. Companies should require this of the vendors they utilize. Creating a clean, well-implemented application that follows all WCAG guidelines is not difficult and will result in a job application site that is accessible with screen readers. Simply fixing the critical and high-impact issues discovered by this study would go a long way towards increasing job opportunities and reducing employment discrimination for screen reader users with visual impairments.
Source of Funding:
The contents of this manuscript were developed under a grant from the U.S. Department of Health and Human Services, NIDILRR grant 90RTEM0007. However, these contents do not necessarily represent the policy of the Department of Health and Human Services, and endorsement by the Federal Government should not be assumed.
Footnotes
Conflicts of Interest: The authors declare that they have no conflicts of interest.
References
- Bruyère, S. M., Erickson, W. E., & VanLooy, S. (2005). Information technology and the workplace: Implications for persons with disabilities. Disability Studies Quarterly, 25(2), 1–16. http://dsq-sds.org/article/view/548
- Fagan, J. C., & Fagan, B. (2004). An accessibility study of state legislative web sites. Government Information Quarterly, 21(1), 65–85. https://doi.org/10.1016/j.giq.2003.12.010
- Gallup. (2017). State of the American workplace. Gallup, Inc. https://www.gallup.com/workplace/238085/state-american-workplace-report-2017.aspx
- Gilbertson, T. D., & Machin, C. H. C. (2012). Guidelines, icons and marketable skills: An accessibility evaluation of 100 web development company homepages. In W4A 2012 – International Cross-Disciplinary Conference on Web Accessibility (pp. 1–4). ACM Press. https://doi.org/10.1145/2207016.2207024
- Higgins, T. (2020, May 9). Supreme Court hands victory to blind man who sued Domino’s over site accessibility. CNBC. https://www.cnbc.com/2019/10/07/dominos-supreme-court.html
- Hu, J. (2019, November 7). 99% of Fortune 500 companies use Applicant Tracking Systems (ATS). Jobscan. https://www.jobscan.co/blog/99-percent-fortune-500-ats/
- Jaeger, P. T. (2006). Assessing Section 508 compliance on federal e-government Web sites: A multi-method, user-centered evaluation of accessibility for persons with disabilities. Government Information Quarterly, 23(2), 169–190. https://doi.org/10.1016/j.giq.2006.03.002
- Kolmar, C. (2022). 15+ incredible job search statistics [2023]: What job seekers need to know. Zippia. https://www.zippia.com/advice/job-search-statistics/
- Launey, K. M., & Vu, M. N. (2023). Plaintiffs set a new record for website accessibility lawsuit filings in 2022. Seyfarth ADA Title III News & Insights Blog. https://www.adatitleiii.com/2023/01/plaintiffs-set-a-new-record-for-website-accessibility-lawsuit-filings-in-2022/
- Lazar, J., Beavan, P., Brown, J., et al. (2010). Investigating the accessibility of state government web sites in Maryland. In Designing inclusive interactions: Inclusive interactions between people and products in their contexts of use (pp. 69–78). Springer-Verlag London. https://doi.org/10.1007/978-1-84996-166-0_7
- Lazar, J., Wentz, B., Biggers, D., et al. (2011). Societal inclusion: Evaluating the accessibility of job placement and travel web sites. ResearchGate. https://www.researchgate.net/publication/267623502_Societal_Inclusion_evaluating_the_accessibility_of_job_placement_and_travel_web_sites
- Lazar, J., Olalere, A., & Wentz, B. (2012). Investigating the accessibility and usability of job application web sites for blind users. Journal of Usability Studies, 7(2), 68–87. https://uxpajournal.org/investigating-the-accessibility-and-usability-of-job-application-web-sites-for-blind-users/
- Lazar, J., Wentz, B., Almalhem, A., et al. (2013). A longitudinal study of state government homepage accessibility in Maryland and the role of web page templates for improving accessibility. Government Information Quarterly, 30(3), 289–299. https://doi.org/10.1016/j.giq.2013.03.003
- Loiacono, E. T., & Djamasbi, S. (2013). Corporate website accessibility: Does legislation matter? Universal Access in the Information Society, 12(1), 115–124. https://doi.org/10.1007/s10209-011-0269-1
- Loiacono, E. T., McCoy, S., & Chin, W. (2005). Federal web site accessibility for people with disabilities. IT Professional, 7(1), 27–31. https://doi.org/10.1109/MITP.2005.1407801
- Loiacono, E. T., Romano, N. C., & McCoy, S. (2009). The state of corporate website accessibility. Communications of the ACM, 52(9), 128–132. https://doi.org/10.1145/1562164.1562197
- Maurer, R. (2022, February 16). Most people – 92% – never finish online job applications. SHRM. https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/most-people-never-finish-online-job-applications.aspx
- McDonnall, M. C., & Sui, Z. (2019). Employment and unemployment rates of people who are blind or visually impaired: Estimates from multiple sources. Journal of Visual Impairment & Blindness, 113(6), 481–492. https://doi.org/10.1177/0145482X19887620
- Melnyk, V., Ashok, V., Puzis, Y., Soviak, A., Borodin, Y., & Ramakrishnan, I. V. (2014). Widget classification with applications to web accessibility. In Web engineering: 14th International Conference, ICWE 2014 (pp. 341–358). Springer International Publishing.
- Morath, E. (2021, May 6). Millions are unemployed. Why can’t companies find workers? The Wall Street Journal. https://www.wsj.com/articles/millions-are-unemployed-why-cant-companies-find-workers-11620302440
- Nguyen, J. (2022, March 23). Companies still say they can’t find enough workers. What’s going on? Marketplace. https://www.marketplace.org/2022/03/23/companies-still-say-they-cant-find-enough-workers-whats-going-on/
- Olalere, A., & Lazar, J. (2011). Accessibility of U.S. federal government home pages: Section 508 compliance and site accessibility statements. Government Information Quarterly, 28(3), 303–309. https://doi.org/10.1016/j.giq.2011.02.002
- Smith, A. (2015). Searching for work in the digital era. Pew Research Center. https://www.pewresearch.org/internet/2015/11/19/searching-for-work-in-the-digital-era/
- U.S. Census Bureau. (2020). 2019 American Community Survey 1-year estimates: Employment status by disability status and type (Table B18120). https://data.census.gov/cedsci/table?q=B18120&tid=ACSDT1Y2019.B18120
- WebAIM. (2021). Screen reader user survey #9 results. https://webaim.org/projects/screenreadersurvey9/
- WebAIM. (2022). The WebAIM million: The 2022 report on the accessibility of the top 1,000,000 home pages. https://webaim.org/projects/million/
- Wettemann, R., & White, T. (2019). The internet is unavailable (Research Note T103). Nucleus Research. https://accessibility.deque.com/nucleus-accessibility-research-2019
- Youngblood, N. E. (2014). Revisiting Alabama state website accessibility. Government Information Quarterly, 31(3), 476–487. https://doi.org/10.1016/j.giq.2014.02.007
- Youngblood, N. E., & MacKiewicz, J. (2012). A usability analysis of municipal government website home pages in Alabama. Government Information Quarterly, 29(4), 582–588. https://doi.org/10.1016/j.giq.2011.12.010
- Yu, D. X., & Parmanto, B. (2011). U.S. state government websites demonstrate better in terms of accessibility compared to federal government and commercial websites. Government Information Quarterly, 28(4), 484–490. https://doi.org/10.1016/j.giq.2011.04.001