Abstract
Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing, as this is still the primary way computers are used in education and employment. To that end, the paper presents a user study with 21 visually-impaired participants that specifically involved switching screen readers within and across different computing platforms and using screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access (an early forerunner of true ubiquitous access), screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.
Keywords: Ubiquitous accessibility, visually impaired users, multiple screen readers, remote access, mobile computing
ACM Classification Keywords: K.4.2 Computers and Society: Social Issues
INTRODUCTION
Nearly all aspects of modern life now incorporate computing. The nature of computing has also shifted, from a user owning a single, dedicated PC to the modern era where users commonly own multiple devices, including desktops, phones, tablets, and laptops. As internet access has connected these devices, software has given users the ability to interact with data and apps on any of these devices at any time, from anywhere. For instance, users commonly work from home using remote desktop technology, run applications for different operating systems (OSes) in a virtual machine, or interact with other programmable devices. This concept, called ubiquitous access, has fundamentally changed how users interact with computers, unlocking new opportunities in business, education, and health care. Although ubiquitous access has not been fully realized (e.g., one cannot access smartphone apps remotely from a laptop, and the “internet of things” is still in its infancy), users no longer use a single computer and OS; work, education, and other circumstances require users to move among different computer systems as part of daily life.
Related to ubiquitous access is the concept of Ubiquitous Accessibility (UA) [30], wherein assistive technologies, which help users with disabilities use computers, also become ubiquitously available. We define UA relatively broadly: users with disabilities should be able to interact as easily as any other user with multiple devices, or multiple applications that may be running on a cloud or remote system. This paper focuses on users with visual impairments. For these users, the predominant assistive technology is a Screen Reader (SR), which narrates the on-screen text, and provides keyboard shortcuts for efficient navigation of the content. Current SRs are not portable across OSes because of the heterogeneity of accessibility APIs [2, 10, 22, 23], creating potential barriers for users with visual impairments.
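As a concrete illustration of this heterogeneity, the minimal Python sketch below enumerates top-level UI elements through two different accessibility stacks: UI Automation on Windows via the third-party pywinauto bindings, and AT-SPI on GNOME via pyatspi. It is only meant to show that the APIs and object models differ across platforms; it is not how any of the screen readers discussed in this paper are implemented, and it assumes the named bindings are installed.

```python
# Illustrative sketch: listing top-level UI elements on two platforms.
# The two code paths use different, incompatible accessibility APIs, which is
# one reason a screen reader written for one OS cannot simply run on another.
import sys

if sys.platform == "win32":
    # Windows exposes UI Automation; pywinauto is one set of Python bindings for it.
    from pywinauto import Desktop
    for window in Desktop(backend="uia").windows():
        print(window.window_text())
else:
    # GNOME/Linux exposes AT-SPI; pyatspi is its Python binding.
    import pyatspi
    desktop = pyatspi.Registry.getDesktop(0)
    for app in desktop:
        print(app.name)
```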
As computing platforms proliferate, so do SRs. This paper seeks to understand the impact of this trend on users. Different SRs have different navigation models; learning or regularly switching between different SRs can lead to cognitive load and frustration. Specifically, the paper examines potential accessibility issues when visually-impaired people interact with (i) applications on multiple devices, possibly with different operating systems, and (ii) applications hosted on remote and cloud-based systems.
This paper summarizes the results of a user study with 21 visually-impaired participants. The study involved using screen readers in remote access scenarios; switching screen readers on different types of computers; and open-ended discussion of the participants’ experience with using computers at home, at work, and at school. We focus on desktop computing, as this is still the primary way computers are used in education and employment, but we also include some mobile device activities and attend to the increasing need for users to incorporate multiple devices, OSes, and software packages into their daily routine. Questions for participants included: What difficulties did they face in switching between different devices? How often do they need to access a computing device that is not their own (e.g., computers in public libraries and universities), and what difficulties did they experience in doing so?
The study reveals several practical obstacles to education and employment that users face today; these problems will only be exacerbated as different types of devices proliferate. For instance, screen readers on the same OS do not support applications equally well; several study participants reported losing employment after routine software upgrades broke accessibility. Similarly, remote access technology often renders assistive technology unusable; another participant reported failing a college course because required course software was deployed on a virtual desktop system, which did not interoperate with the student’s SR.
A common theme from these interviews is that uniformity of experience is essential—whether it is from having the same SR installed on every device, or simply consistent navigation models and shortcuts across SRs. Our participants agreed that it would be useful to be able to carry one screen reading device with them, such as using their smartphone as a terminal to other computers.
RELATED WORK
The term ubiquitous accessibility [30] was coined by Vanderheiden to describe the goal of extending assistive technologies to the new models of interaction enabled by ubiquitous computing. Vanderheiden proposed “pluggable interfaces” for the computing devices at hand, and recommended that these interfaces be available online for use anywhere. Several papers [8, 11, 20, 29] explore ubiquitous accessibility within a model where all user interface (UI) elements and their relationships are represented in a high-level user interface description language (UIDL) [7, 18]; to accommodate a user’s disability, these interfaces are then adapted according to a set of rules derived from the user’s profile, interaction history, and the specific kind of disability.
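To make this adaptation model concrete, the toy Python sketch below applies a profile-derived rule to an abstract UI description. The element fields, rule, and profile are invented for illustration; real systems in this line of work express the UI in a UIDL such as UIML rather than in Python dictionaries.

```python
# Toy sketch of rule-based UI adaptation: the same abstract description is
# rendered differently depending on the user's profile. All names are made up.
abstract_ui = [
    {"type": "image_button", "label": "Submit", "icon": "check.png"},
    {"type": "slider", "label": "Volume", "min": 0, "max": 100},
]

def adapt(element, profile):
    """Rewrite one abstract element according to a (hypothetical) rule set."""
    if profile.get("vision") == "blind":
        if element["type"] == "image_button":
            # Replace an icon-only control with a plainly labeled button.
            return {"type": "button", "label": element["label"]}
        if element["type"] == "slider":
            # Replace a drag-based control with a keyboard-operable one.
            return {"type": "spinbox", "label": element["label"],
                    "min": element["min"], "max": element["max"]}
    return element

profile = {"vision": "blind"}
print([adapt(e, profile) for e in abstract_ui])
```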
Our motivation for conducting this study was a surprising dearth of holistic studies of end-to-end issues in accessibility, especially as end-user technology trends are shifting towards ubiquitous access. In other words, we wanted to know the most important accessibility problems computer users faced, so we could focus our own research agenda. A number of previous user studies have investigated usability issues in accessibility in specific domains. Specifically, prior studies have identified usability and accessibility issues in JAWS SR [28], exercise tracking devices [26], and in particular applications, including social media, email clients, and course management systems [15, 24, 31, 32]. Ahmed et al. study issues in privacy and security facing blind users [9].
Given the amount of content that is moving to the web, there is a reasonable argument that the browser will effectively become the OS. Web accessibility [12] is a special case of UA, which focuses on making the Web accessible. We agree that web accessibility is essential, but also note that there is a long “tail” of software, essential to education and employment, that has not yet moved to the web, which must become accessible.
Sinter [13] addresses the problem of integrating screen readers with remote desktop technologies, which are growing rapidly [34]. Sinter works with clients and servers running different OSes. Similarly, Hahn et al. describe an application of remote desktop technologies for online, collaborative training of low-vision screen magnifier users [33]. The work of Dixon et al., which reverse-engineers user interface structure from pixels, could potentially be adapted to address inaccessibility problems in cross-platform computing [16]. Although tools like Sinter may be useful building blocks for accessibility solutions, this study identifies important, open problems that should be solved.
METHODS
The IRB-approved study was conducted at Lighthouse Guild in New York City [5]. The 21 visually-impaired participants were from the New York metropolitan area, including New Jersey and Connecticut, and were recruited through email advertisements. Participants were required to have some experience with college education (not necessarily completing a degree), and some degree of familiarity with two or more screen readers. Participants were both interviewed and asked to complete screen reading tasks. The interviews were semi-structured and in-person, to facilitate open-ended discussion of issues we may not have anticipated in advance.
Interview Preparation and Process
Two interviewers conducted the study: one interacted with the participant, while the other recorded and transcribed the interviews. Following completion of the first five interviews, the interviewers analyzed the transcripts using an iterative coding process [14]: they performed initial coding to identify concepts, categorized the concepts, framed new questions for subsequent interviews, and updated the concept list.
For the interactive part of the study, the interviewers used two laptops (Mac and Windows), two smartphones (iPhone and Android), and two full-size, standard, external keyboards (one Windows, one Mac). The Windows laptop, which could dual-boot Win7 and Win10, had the following applications installed: major screen readers (JAWS [27], NVDA [25], SuperNova [17], Window-Eyes [21], and System Access [1]); popular internet browsers (IE, Firefox, and Chrome with ChromeVox [3]); remote access tools (Microsoft RDP, NVDA Remote [6], and System Access To Go); Office 2015; and the NEWT network emulator [35]. The Mac laptop included its built-in VoiceOver screen reader and a VirtualBox virtual machine running Windows 7 with the same applications listed above. The two computers were directly connected to each other via an Ethernet cable, and NEWT was used to emulate different network types (Wi-Fi, WLAN, Cellular) between them. All computers and phones had internet access.
Participants were asked to perform tasks within simulated ubiquitous access scenarios, such as (a) browsing a file directory with (1) VoiceOver on a Mac using Finder and (2) JAWS on Windows using Windows Explorer; and (b) editing Word documents on a local Windows computer and on a remote Windows computer. For tasks involving interaction with remote applications, the network speeds were varied using NEWT. Participants used these simulated scenarios to demonstrate specific accessibility issues they faced in their daily lives and as a starting point for open-ended conversation during the interview. All participants used the full 3 hours, with a 5-minute break after 1.5 hours. Each participant was compensated $75.
Interview Protocol
We began by asking participants to introduce themselves, their educational and professional background, history of visual impairment, and their use of assistive technologies.
Our interview questions were designed to elicit feedback on UA. The topics were preselected; however, the questions under each topic were adapted somewhat to each participant’s expertise and profession. The topics explored were:
(1) Difficulties in switching from one device, platform, or screen reader to another. Sample questions included: (a) What motivated you to become familiar with other SRs? (b) Do you customize your SR? (c) What difficulties did you face in learning a new SR? (d) Would you prefer your smartphone’s SR over your desktop’s?
(2) Accessing remote applications and devices. Sample questions included: (a) Are you aware of any remote access technologies? (b) What was your experience in using them? (c) Will you use them in the future?
(3) Usability of computers and accessible technologies at school and work. Sample questions included: (a) Do you bring your own screen reader for use at school and/or at work (why or why not)? (b) How do you handle software that is not fully usable with your screen reader? (c) How frequently does your application software get updated? (d) How familiar are you with the SRs at your school or workplace?
We also discussed potential solutions to problems that participants experienced, and, if applicable, told participants about additional accessibility technologies they were unaware of. The interview process culminated with participants making suggestions and recommendations.
Participant Demographics
The 21 participants included 11 men and 10 women, all with college experience. Participants varied in age from 22 to 63 (mean=43, median=37). 18 were blind and 3 had very low vision. They had varying amounts of expertise in screen reading and came from diverse professional backgrounds, as shown in Table 1. The table lists unique IDs (P1, P2, etc.) for the participants.
Table 1.
Participants’ IDs and screen reading skills (expert, intermediate, and beginner) grouped by professions.
| Profession | Expert | Inter. | Beginner |
|---|---|---|---|
| Musician | P1 | P2 | |
| Transcriber | P3, P4 | | |
| IT/Tech | P5, P6 | P7 | |
| Teacher | P8 | P9 | |
| Info. Dispatcher | | | P10, P11, P12 |
| Student | P13, P14, P15 | P16 | |
| Self-employed | P17 | P18 | |
| Radio-host | P19 | | |
| Service Industry | | | P20 |
| Unemployed | | | P21 |
FINDINGS
This section reports our findings, supplemented with corroborating comments from the participants. Block quoted, italicized passages are direct quotes from participants.
Switching Devices, Platforms, and Screen Readers
All participants who were college students or recent graduates described the following accessibility issues they frequently experienced with computers on campus: (i) absence of SRs; (ii) having to use a different SR than the one they were familiar with; (iii) having to use a different OS platform than the one they prefer; (iv) dealing with different versions (Win10 vs. Win7) of a familiar platform; (v) not having appropriate privileges for customizing SR settings; and (vi) losing their customized SR settings when their current session was terminated.
Although sighted users also experience unfamiliarity when switching devices or platforms, the problem is compounded for blind users, because different SRs offer different navigation strategies with overlapping and sometimes conflicting shortcuts. For example, P5, a recent college graduate who could not use her preferred SR at school, had this to say:
The settings of JAWS at University is different from the settings of NVDA that I am used to. It makes it more difficult because you have to spend time to figure out what works. You cannot use your custom strategies on the school computer, and it slows you down – a lot of key presses and efforts in vain.
6 participants reported severe difficulties using their preferred SR at school because of different OS versions (Win10 vs. Win7). Version changes are disruptive; for example, the simple up/down navigation of the Start menu in Win7 becomes confusing in the grid layout of Win10 because of the non-uniform sizes and irregular alignment of the grid cells. Quoting P16:
Windows 10 has too many grids, that it is a hassle to navigate. It gets stuck up and I have to restart my PC each time by pressing reboot button. And then I lose my custom JAWS settings.
All participants agreed that it would be useful if their preferred SR and settings were portable, i.e., easily installed on workplace and school computers. However, two participants (P13, P14) who had previously used a portable version of the NVDA SR on a flash drive and the cloud-based System Access To Go said that these worked only on Windows and thus could not be used on the Macs in their university libraries. Moreover, they felt that these portable screen readers were slow and had poor usability.
Switching between screen readers on the same platform
13 participants who were JAWS users had also learned to use at least one other Windows-specific SR. Their motivation was economic: a single JAWS license costs around $1,000 annually, whereas NVDA is free, and both Window-Eyes and System Access are affordably priced. The hidden cost of switching to these no-cost or low-cost SRs is a disruptive, inefficient, and frustrating learning experience. P8, a teacher at a disability center, shared her experience:
Knowledge of knowing one screen reader is not transferrable. Every screen reader works somewhat differently. It’s like putting a lot on my plate.
For P17, a JAWS expert, the difficulty in learning Window-Eyes was the fast shortcuts:
The basic shortcuts are same, but the fast ones are different. To learn that, you need to take classes.
Usually, an SR’s ‘fast’ shortcuts are application-specific and designed to improve navigation. The richness and efficiency of these ‘fast’ shortcuts distinguish SRs from one another.
Participants also noted that the user experience varies for similar applications even when using the same SR. For example, NVDA provides a better browsing experience with Firefox than with IE. Moreover, they reported that SRs were not robust and were prone to crashes. Quite often, participants were faced with the need to use an unfamiliar SR at school or at work because the contracted IT vendor for these institutions did not provide any alternatives. P6, an IT professional, said:
I usually use JAWS at work. But when I need to work with SPSS software, SPSS crashes with JAWS. So I have to switch to another screen reader called Super Nova. In Super Nova, the key commands are different. So I use Super Nova only when I use SPSS. When I’m done with SPSS, I have to switch back to JAWS. And the pain comes in when I’m using SPSS, and I have to IM somebody, I have to switch back and forth [between two screen readers]. And you can’t run two screen readers at a time. Because the commands confuse each other. Every time, I turn off a screen reader software, and switch to another one, there is always a risk that the computer will crash. When I’m running the survey dataset in the background, it’s like I’m praying and praying and praying, please, don’t crash.
Switching screen readers across platforms
All participants indicated that it would be very hard and disruptive to learn a new SR on an unfamiliar platform. 7 participants had tried learning VoiceOver, Mac’s free, built-in SR. Only two (P14 and P15) succeeded, and it took them 3–4 months to become reasonably proficient. P15 had this to say about her switching experience:
Switching between Windows and Apple was highly inconvenient, because of the differences in keyboard, shortcuts, and navigational strategies [flat vs. hierarchical].
For example, JAWS on Windows explores an application roughly left-to-right, top-to-bottom (flat), whereas navigation in VoiceOver follows a logical tree hierarchy (analogous to the folder tree in Windows Explorer, as P1 explained).
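The difference between these two navigation models can be illustrated with a toy UI tree, as in the Python sketch below. The tree, element names, and traversal functions are hypothetical and only approximate the two styles participants described; they are not taken from either screen reader’s implementation.

```python
# Minimal sketch of the two navigation models over a toy UI tree:
# "flat" visits leaves in reading order (roughly JAWS-style), while
# "hierarchical" descends container by container (roughly VoiceOver-style).
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    children: list = field(default_factory=list)

toolbar = Element("Toolbar", [Element("Open"), Element("Save")])
body = Element("Document body", [Element("Paragraph 1"), Element("Paragraph 2")])
window = Element("Window", [toolbar, body])

def flat_order(root):
    """Reading order: flatten the tree and step through every leaf."""
    if not root.children:
        yield root.name
    for child in root.children:
        yield from flat_order(child)

def hierarchical_order(root, depth=0):
    """Tree order: each container must be entered before its children are reachable."""
    yield "  " * depth + root.name
    for child in root.children:
        yield from hierarchical_order(child, depth + 1)

print(list(flat_order(window)))   # ['Open', 'Save', 'Paragraph 1', 'Paragraph 2']
for line in hierarchical_order(window):
    print(line)
```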
P14, another participant who switched from Windows to Mac, still uses JAWS inside a Virtual Machine running Windows, for browsing the Web with IE and for editing MS Word docs. P14 reported that running two SRs still leads to occasional confusion. To avoid such problems, after buying a Mac laptop, P4 installed Windows using Boot Camp.
Participants reported a reluctance to switch SRs because of the time, effort, and financial burden of training on a new SR. They summarized the training process as memorizing numerous shortcuts and navigational strategies and building muscle memory through practice. The net effect is that, having been trained in one SR, they preferred not to go through the training process for a new SR without compelling reasons to do so.
Switching Screen Readers due to Software updates
8 participants reported that they worried about their employment security, particularly when the software at work was updated or new applications were introduced. 5 (P3, P7, P12, P20, P21) stated that they lost full-time employment because upgraded versions of required software did not interoperate as well with their current SRs. They were then required to use a different, unfamiliar screen reader that worked with these upgrades. P3 and P7, whose SR expertise was between intermediate and expert level, decided to quit instead of retraining on a new SR; P12, P20, and P21, who were beginners, tried to adapt but were let go because they were less productive than before. At the time of this study, P21 was still unemployed.
3 participants (P5, P9, P10) reported knowing someone who had lost a job for similar reasons.
Accessing Remote Devices and Cloud Applications
Remote access technologies, such as Microsoft RDP or Citrix, are commonly used for telecommuting and for deploying educational software. Yet these technologies operate by relaying pixels from the remote display to a client application and are not accessible to an SR on the client [13]. The alternatives are to relay audio from a screen reader on the remote system, or to synthesize audio locally, as is done in NVDA-Remote [6] and JAWS-Tandem [4].
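The “synthesize locally” alternative can be sketched in a few lines of Python: the remote machine forwards the text the screen reader would speak, and the client voices it with a local speech engine. This is only the general idea behind tools such as NVDA-Remote, not their actual protocol; the host, port, newline framing, and the print-based stand-in for text-to-speech are all assumptions for illustration.

```python
# Hedged sketch of relaying screen-reader *text* (not audio or pixels) to a
# client that synthesizes speech locally. Framing and endpoint are assumed.
import socket

def speak_locally(text: str) -> None:
    """Stand-in for a call into a local text-to-speech engine."""
    print(f"[speech] {text}")

def relay_remote_speech(host: str, port: int) -> None:
    """Receive newline-delimited utterances from the remote side and voice them locally."""
    with socket.create_connection((host, port)) as conn:
        buffer = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                speak_locally(line.decode("utf-8", errors="replace"))

# Example (hypothetical endpoint): relay_remote_speech("remote.example.org", 9023)
```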
With the exception of P9, P10, and P20, all other 18 participants were aware of the existence of remote-access technologies. However, only 6 had actually used such technologies (Microsoft RDP and JAWS-Tandem). 3 of them (P5, P6, P17) had used JAWS-Tandem only for training and troubleshooting. They reported that JAWS-Tandem requires the same SR and OS on both the remote and local systems, reducing flexibility. In fact, 15 participants had misconceptions about the use of remote access technologies; they believed these technologies were only for an IT vendor or an instructor to provide technical support over the internet.
Telecommuting
10 participants, who were all employed, would prefer to work from home, especially on bad-weather days. They were, however, skeptical of existing RDP technology. 3 of these 10 (P3, P4, P11) had tried Microsoft RDP with audio relayed by the remote SR. They found the user experience restrictive, slow, and frustrating. The other 7 were either dissuaded by friends’ negative experiences or did not know how to set up such tools. For example, P12 said the following:
I do know that you can access other computers from your personal laptop, but I don’t know what software to use for that purpose.
Accessing academic institutional resources from home
Educational institutions commonly place course materials on remote servers that are accessed via virtual desktop clients, such as Citrix. 4 participants reported that the latency of relaying SR audio over remote desktop, especially over WLAN, rendered the solution unusable. P14 failed a course because the course software hosted on the school’s Citrix server did not work well with the SR installed on that server.
DISCUSSION
Our findings revealed that the participants found it difficult to switch SRs regardless of the underlying platforms because of the differences in shortcuts and navigational models. Intuitively, blind users build a mental model of where items are in the UI: not necessarily how they appear on the screen, but where they fall in the logical navigation order. When this ordering changes due to (i) software updates (e.g., the ribbon replacing the classic menu in MS Word) or (ii) switching SRs, the user has to build a new mental model, which is tedious and disruptive. The participants reported that such changes can cause navigational confusion and that they often have to abort and restart applications, which has also been observed by Shinohara and Tenenberg [28].
With regard to remote access, extant technologies are inflexible and have been found to have poor usability with SRs. Moreover, visually-impaired people need help setting up these tools, such as remote access clients, virtual private networks, and portable screen readers.
In the open-ended discussion, participants suggested ideas that can serve as the basis for future ubiquitous accessibility research. A sample of these ideas follows:
Uniformity of interaction experience
A consistent theme from our interviews was the need for a uniform interaction experience across different screen readers and platforms. One suggestion was to standardize screen-reader shortcuts across platforms. Although keystrokes can be remapped relatively easily, encapsulating the heterogeneity of different platforms and navigation models is an open problem, as the sketch below illustrates. Another suggestion was to have uniformity of UI elements across platforms. Harris [19] advocates the use of standardized UI description languages for this purpose.
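As a concrete illustration of the keystroke-remapping point, the toy Python sketch below translates a user’s familiar shortcuts into command names understood by another (hypothetical) screen reader. The bindings and command names are examples we chose for illustration, not an authoritative mapping of any particular SR; the unmapped case hints at why navigation-model differences cannot be papered over by remapping alone.

```python
# Toy shortcut-remapping table: easy for keystrokes, insufficient for
# commands that have no equivalent in the target screen reader's model.
from typing import Optional

FAMILIAR_TO_LOCAL = {
    # familiar gesture -> command name in the local SR (illustrative only)
    "Insert+F7":   "list_links",
    "Insert+F6":   "list_headings",
    "Insert+Down": "say_all",
}

def translate(gesture: str) -> Optional[str]:
    """Return the local command for a familiar gesture, or None if unmapped."""
    return FAMILIAR_TO_LOCAL.get(gesture)

print(translate("Insert+F7"))   # 'list_links'
print(translate("Ctrl+Home"))   # None: no uniform equivalent, i.e., the open problem above
```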
Universally portable screen reader
All participants expressed a desire to carry their screen reader and all personal customizations with them, plugging it into any computer they need to use in the course of their day. However, assistive technologies are generally OS-specific; the differences in the underlying accessibility APIs create barriers to SR portability. Sinter [13] has taken a step towards addressing this problem.
Smartphone as a “portable” screen reader
19 participants owned smartphones. They had learned to use the SR on their smartphones with some proficiency, and they carried their smartphones everywhere. If any other device or application could appear to the smartphone SR as an accessible app, using some remote access protocol over a network or a local wireless protocol like Bluetooth, this would meet their accessibility needs. We note that several projects are investigating building blocks for this direction of ubiquitous accessibility [8, 29].
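One way to picture this direction is a desktop exporting its accessibility tree in a device-neutral form that a smartphone screen reader could present like a local app. The Python sketch below serializes such a tree as JSON; the field names, roles, and the choice of JSON are assumptions for illustration, not a description of Sinter or any existing protocol.

```python
# Sketch: a device-neutral snapshot of an accessibility tree, as the desktop
# side might send it and the phone-side screen reader might receive it.
import json

ui_tree = {
    "role": "window",
    "name": "Mail",
    "children": [
        {"role": "button", "name": "Compose", "children": []},
        {"role": "list", "name": "Inbox", "children": [
            {"role": "listitem", "name": "Meeting at 3pm", "children": []},
        ]},
    ],
}

payload = json.dumps(ui_tree)           # what the desktop would transmit
restored = json.loads(payload)          # what the phone-side SR would reconstruct
print(restored["children"][1]["name"])  # 'Inbox'
```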
CONCLUSION
Usability issues in current screen readers create significant barriers to employment and education for users with visual impairments. Some of these issues arise because not all applications that run on an OS are accessible with a screen reader for that OS. Other issues are a product of the move toward ubiquitous access: remote and virtual desktop infrastructure are widely used, yet interact poorly with current assistive technologies. Until users have a consistent screen reading experience across a range of devices, applications, and operating systems, the vision of ubiquitous access will be thwarted. One promising direction is using the smartphone as a primary, portable interface to other devices.
Acknowledgments
We thank Yevgen Borodin, the anonymous reviewers, and our shepherd for their insightful feedback. Part of this work was completed while Porter was at Stony Brook University. This research was supported in part by NSF: IIS-1447549, CNS-1161541, CNS-1405641; National Eye Institute of NIH: R01EY026621; NIDILRR: 90IF0117-01-00; and VMware.
References
- 1. System Access. Available from: http://www.serotek.com/systemaccess.
- 2. GNOME Accessibility Architecture (ATK and AT-SPI). Available from: https://developer.gnome.org/accessibility-devel-guide/stable/dev-start-5.html.en.
- 3. Introducing ChromeVox. Available from: http://www.chromevox.com/
- 4. JAWS Tandem. Available from: http://www.freedomscientific.com/JAWSHq/JAWSTandemQuickStart.
- 5. Lighthouse Guild. Available from: http://www.lighthouseguild.org/
- 6. NVDA Remote brings free remote access to the blind. Available from: http://nvdaremote.com/
- 7. OASIS User Interface Markup Language (UIML). Available from: https://www.oasis-open.org/committees/uiml/
- 8. Abascal Julio, Aizpurua Amaia, Cearreta Idoia, Gamecho Borja, Garay-Vitoria Nestor, Miñón Raúl. Automatically generating tailored accessible user interfaces for ubiquitous services. Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility; ACM, Dundee, Scotland, UK. 2011. pp. 187–194.
- 9. Ahmed Tousif, Hoyle Roberto, Connelly Kay, Crandall David, Kapadia Apu. Privacy Concerns and Behaviors of People with Visual Impairments. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems; ACM, Seoul, Republic of Korea. 2015. pp. 3523–3532.
- 10. Apple. NSAccessibility. Available from: https://developer.apple.com/reference/appkit/nsaccessibility.
- 11. Atkinson Matthew Tylee, Bell Matthew J, Machin Colin HC. Towards ubiquitous accessibility: capability-based profiles and adaptations, delivered via the semantic web. Proceedings of the International Cross-Disciplinary Conference on Web Accessibility; ACM, Lyon, France. 2012. pp. 1–4.
- 12. Bigham Jeffrey P, Prince Craig M, Ladner Richard E. WebAnywhere: a screen reader on-the-go. Proceedings of the 2008 International Cross-Disciplinary Conference on Web Accessibility (W4A); ACM, Beijing, China. 2008. pp. 73–82.
- 13. Billah Syed Masum, Porter Donald E, Ramakrishnan IV. Sinter: Low-bandwidth Remote Access for the Visually-impaired. Proceedings of the Eleventh European Conference on Computer Systems; ACM, New York, NY, USA. 2016. p. 16.
- 14. Bryman Alan, Burgess Bob, et al. Analyzing Qualitative Data. Routledge; 2002.
- 15. Calvo Rocío, Iglesias Ana, Moreno Lourdes. Accessibility barriers for users of screen readers in the Moodle learning content management system. Universal Access in the Information Society. 2013;13(3):315–327. doi: 10.1007/s10209-013-0314-3.
- 16. Dixon Morgan, Leventhal Daniel, Fogarty James. Content and Hierarchy in Pixel-based Methods for Reverse Engineering Interface Structure. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; ACM, New York, NY, USA. 2011. pp. 969–978.
- 17. Dolphin. SuperNova Screen Reader. Available from: http://www.yourdolphin.com/productdetail.asp?id=1.
- 18. Guerrero-Garcia Josefina, Gonzalez-Calleros Juan Manuel, Vanderdonckt Jean, Munoz-Arteaga Jaime. A Theoretical Survey of User Interface Description Languages: Preliminary Results. Proceedings of the 2009 Latin American Web Congress (LA-WEB 2009); IEEE Computer Society. 2009. pp. 36–43.
- 19. Harris Kip. Challenges and solutions for screen reader/I.T. interoperability. SIGACCESS Access Comput. 2006;(85):10–20. doi: 10.1145/1166118.1166120.
- 20. Heron Michael, Hanson Vicki L, Ricketts Ian W. ACCESS: a technical framework for adaptive accessibility support. Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems; ACM, London, United Kingdom. 2013. pp. 33–42.
- 21. GW Micro. Window-Eyes. Available from: http://www.gwmicro.com/Window-Eyes/
- 22. Microsoft. Microsoft Active Accessibility: Architecture. Available from: https://msdn.microsoft.com/en-us/library/windows/desktop/dd373592(v=vs.85).aspx.
- 23. Microsoft. UI Automation Overview. Available from: http://msdn.microsoft.com/en-us/library/ms747327.aspx.
- 24. Morris Meredith Ringel, Zolyomi Annuska, Yao Catherine, Bahram Sina, Bigham Jeffrey P, Kane Shaun K. “With Most of It Being Pictures Now, I Rarely Use It”: Understanding Twitter’s Evolving Accessibility to Blind Users. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems; ACM, New York, NY, USA. 2016. pp. 5506–5516.
- 25. NV Access. NV Access: Home of the free NVDA Screen Reader. Available from: http://www.nvaccess.org/
- 26. Rector Kyle, Milne Lauren, Ladner Richard E, Friedman Batya, Kientz Julie A. Exploring the Opportunities and Challenges with Exercise Technologies for People who are Blind or Low-Vision. Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility; ACM, Lisbon, Portugal. 2015. pp. 203–214.
- 27. Freedom Scientific. JAWS Screen Reader. Available from: http://www.freedomscientific.com/downloads/jaws.
- 28. Shinohara Kristen, Tenenberg Josh. Observing Sara: a case study of a blind person’s interactions with technology. Proceedings of the 9th International ACM SIGACCESS Conference on Computers and Accessibility; ACM, Tempe, Arizona, USA. 2007. pp. 171–178.
- 29. Tavares João, Barbosa Jorge, Costa Cristiano, Yamin Adenauer, Real Rodrigo. Hefestos: A Model for Ubiquitous Accessibility Support. Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments; ACM, New York, NY, USA. 2012. p. 28.
- 30. Vanderheiden Gregg C. Ubiquitous Accessibility, Common Technology Core, and Micro Assistive Technology. ACM Transactions on Accessible Computing. 2008;1(2):1–7. doi: 10.1145/1408760.1408764.
- 31. Voykinska Violeta, Azenkot Shiri, Wu Shaomei, Leshed Gilly. How Blind People Interact with Visual Content on Social Networking Services. Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing; ACM, San Francisco, California, USA. 2016. pp. 1584–1595.
- 32. Wentz Brian, Hochheiser Harry, Lazar Jonathan. A survey of blind users on the usability of email applications. Universal Access in the Information Society. 2013;12(3):327–336. doi: 10.1007/s10209-012-0285-9.
- 33. Hahn Thomas, Rahman Hidayat Ur, Segall Richard, Heim Christoph, Brunson Raphaela, Sharma Ankush, Aslam Maryam, Lara-Rodriguez Ana, Islam Md Sahidul, Gupta Neha, Embry Charles S, Grossmann Patrick, Babar Shahrukh, Skibinski Gregory A, Tang Fusheng. Remote Access Programs to Better Integrate Individuals with Disabilities. Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’16); ACM, New York, NY, USA. 2016. pp. 245–250. doi: 10.1145/2982142.2982182.
- 34. Visiongain. The Cloud-Based Virtual Desktop Infrastructure (VDI) Market 2013–2023. Available from: https://www.visiongain.com/Report/1160/The-Cloud-Based-Virtual-Desktop-Infrastructure-(VDI)-Market-2013-2023.
- 35. Network Emulator Toolkit (NEWT). Available from: https://blog.mrpol.nl/2010/01/14/network-emulator-toolkit/
