Telemedicine Journal and e-Health
2016 Jan 1;22(1):56–62. doi: 10.1089/tmj.2015.0027

Mobile Videoconferencing Apps for Telemedicine

Kai Zhang 1, Wei-Li Liu 1, Craig Locatis 1, Michael Ackerman 1
PMCID: PMC4739126  PMID: 26204322

Abstract

Introduction: The quality and performance of several videoconferencing applications (apps) tested on iOS (Apple, Cupertino, CA) and Android™ (Google, Mountain View, CA) mobile platforms using Wi-Fi (802.11), third-generation (3G), and fourth-generation (4G) cellular networks are described. Materials and Methods: The tests were done to determine how well apps perform compared with videoconferencing software installed on computers or with more traditional videoconferencing using dedicated hardware. The rationale for app assessment and the testing methodology are described. Results: Findings are discussed in relation to operating system platform (iOS or Android) for which the apps were designed and the type of network (Wi-Fi, 3G, or 4G) used. The platform, network, and apps interact, and it is impossible to discuss videoconferencing experienced on mobile devices in relation to one of these factors without referencing the others. Conclusions: Apps for mobile devices can vary significantly from other videoconferencing software or hardware. App performance increased over the testing period due to improvements in network infrastructure and how apps manage bandwidth.

Key words: technology, telemedicine, mobile health, telecommunications

Introduction

Mobile videoconferencing applications (apps) enable consultations between healthcare providers and patients from any location, even while in transit, using data-capable mobile devices with cellular or Wi-Fi access. Mobile videoconferencing apps are a form of cloud computing, where locally installed client software communicates over networks with programs on servers providing services. Cloud videoconferencing systems are software-based: client apps use the microphones and cameras built into or attached to local computing devices to access server software that manages communication. They are more scalable and flexible and thus are distinctively different from traditional, stand-alone hardware-based videoconferencing devices.1

The ubiquity of mobile videoconferencing apps is partially the result of enhanced microprocessor capacities making it possible to videoconference with software, rather than special dedicated hardware. It is also partially the result of cameras being added to cell phones, tablets, and other mobile computing platforms. Most cloud videoconferencing system developers have created concomitant apps for popular Apple (Cupertino, CA) iOS and Google (Mountain View, CA) Android™ mobile operating systems, although some have been developed for the Windows® mobile platform of Microsoft (Redmond, WA). The apps are specifically designed for the touch screen interfaces on smartphone and tablet devices. Users of traditional videoconferencing hardware or cloud conferencing software designed for desktops and laptops need to understand the extent to which videoconferencing on mobile devices may vary and what performance to expect. Consequently, 12 apps were tested in varied contexts with iOS and Android mobile devices over third-generation (3G) and fourth-generation (4G) cellular networks and Wi-Fi (802.11x).

Background

Although mobile apps may potentially revolutionize telemedicine with anytime–anywhere patient monitoring and remote consultation,2 there is little research on mobile video apps for telemedicine. A PubMed search on telemedicine and mobile video retrieved 80 citations, but most of these concerned using older videoconferencing hardware devices or desktop/laptop software with 802.11x Wi-Fi in single settings, such as hospitals and clinics, or were very technical appraisals of different video compression algorithms for wireless transmission. A review of the use of Skype™ (Microsoft) for telemedicine that did not differentiate between its use on computers or mobile devices identified a single controlled clinical trial on nurse communication with elderly dementia patients that had only 7 subjects.3 A 1-year study documenting five rural physicians' tablet use on 3G cellular networks found that they did not use video for communication. They believed better online video content would improve tablet usefulness for educating patients and that apps designed specifically for tablet interfaces were easier to use than apps (e.g., browsers) designed for multiple platforms.4 Three video citations were about single cases involving cell phone use to guide a resident to program a pacemaker,5 a cell phone–recorded video of a patient seen in-person documenting previously experienced tachycardia,6 and cell phone transmission of a 30-second video clip of 1 patient's scrolled-through still radiologic images to off-site surgeons using a multimedia messaging service that was essentially a store-and-forward application.7

Eleven studies had more than one teleconsultation involving physicians or other health providers.8–18 They included, for example, studies of 3G cellular networks with transmissions of 64 kilobits/s (Kbps) for cardiopulmonary resuscitation assistance with and without video,8 video with multiplexed 3G networks (combining multiple 3G data streams into a single signal) to achieve transmissions of 125–150 Kbps from moving ambulances to transmit endotracheal intubations and ultrasound,9 and video telestroke consultations with land line transmissions over 700 Kbps and 3G wireless transmissions as low as 128 Kbps.10 Overall, the studies suggest that there may be lower boundary transmission rates below which the data transmitted become unusable. There is no agreement on what the boundary may be; it may depend on the content transmitted. An ambulance-based study of multiplexed 3G found data rates of 125–150 Kbps were insufficient for ultrasound but acceptable for endotracheal intubations.9 Several studies reported issues with lower data rates or use of cellular networks versus Wi-Fi.10–13 One stroke study identified a lower threshold of 400 Kbps,14 whereas an echocardiogram study's lower threshold was 200 Kbps.15 The only study documenting transmission while moving found that maintaining transmission while switching between cell towers was an issue.9
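The content-dependent lower bounds reported in these studies can be summarized as a simple lookup. The sketch below is illustrative only: the threshold values come from the cited studies, but the function and its names are hypothetical, not part of any study's protocol.

```python
# Illustrative sketch: checking a measured bit rate against the content-specific
# lower bounds reported in the cited studies. Thresholds are taken from the text;
# the function itself is a hypothetical convenience, not a study artifact.

# Lower-bound transmission rates (Kbps) suggested by individual studies.
MIN_KBPS = {
    "telestroke": 400,       # lower threshold identified in one stroke study
    "echocardiogram": 200,   # lower threshold from an echocardiogram study
    "ultrasound": 151,       # 125-150 Kbps was found insufficient for ultrasound
}

def transmission_usable(content: str, measured_kbps: float) -> bool:
    """Return True if the measured rate meets the study-reported lower bound."""
    return measured_kbps >= MIN_KBPS[content]

print(transmission_usable("echocardiogram", 250))  # 250 >= 200 -> True
print(transmission_usable("telestroke", 128))      # 3G at 128 < 400 -> False
```

The point of the lookup structure is that no single threshold applies: acceptability must be judged per content type, as the studies above disagree on a universal boundary.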

The results of these studies are highly variable, and the number of cases in many studies was low (under 10 in four studies11,12,14,15 and only 20 in another13). The video apps used were seldom specified, even though the video compression each uses affects image quality.19 Only a single app was used in each study; none compared performance of several apps on different wireless networks. Moreover, transmission rates were unspecified in many studies,11,16–18 another image quality factor.19 References to Universal Mobile Telecommunications Systems without specifying the technology's iteration (e.g., 3G or 4G) were often made. Readers have to infer the more specific technology from the data rate or network capacity, if presented, or the location and date of the research.

The present study addresses some of these shortcomings by documenting the performance of a range of videoconferencing apps over Wi-Fi and cellular (3G and 4G Long-Term Evolution) networks, accounting for transmission rates, the operating systems and platforms used (iOS and Android tablet computers), video quality, and app usability over a period exceeding 1 year (summer 2013 through fall 2014).

Materials and Methods

Apps tested on Android and iOS platforms are listed in Table 1. With the exception of Skype, Google Hangout, LinPhone (Belledonne Communications, Grenoble, France), and VSee (VSee, Sunnyvale, CA), the apps tested were designed for use with cloud videoconferencing systems participating in Internet2's test drive program. Companies developing these systems are major vendors in higher education and partners with Internet2, an advanced telecommunications network serving universities, national laboratories, and research centers. The program allows Internet2 member institutions free access to the technologies for testing. Skype and Google Hangout were added because of their widespread use, LinPhone was added because it is an open source program popular among Linux users, and VSee was added because prior use indicated it had highly efficient bandwidth allocation. Some apps were designed for only one mobile operating system, but when there were dual versions for iOS and Android, both were tested, provided they could be made to work on both platforms.

Table 1.

Mobile Applications Tested on iOS and Android Devices, Respectively

SYSTEM TESTED ON APPLICATION
Android Skype
  Fuzebox
  Google Hangout
  Radvision Scopia
  SeeVogh
  Vidyo
  Lifesize ClearSea
  LinPhone
  Polycom
iOS Skype
  Fuzebox
  Google Hangout
  Radvision Scopia
  SeeVogh
  Vidyo
  Lifesize ClearSea
  Yahoo Messenger
  Jabber
  VSee

Each app was tested in point-to-point and multipoint conferences having three participants. Conferences were done entirely over Wi-Fi, entirely over cellular, and between Wi-Fi and cellular networks. Public Wi-Fi networks and cellular networks at the National Institutes of Health (NIH) (Bethesda, MD) and Wi-Fi networks and cellular networks at the authors' homes were used in these tests. Every NIH and home test was done on three occasions for each app on different days and times to account for traffic variance by cell tower location or time of day. All Wi-Fi tests were done in close proximity to base stations located in homes and NIH offices. In addition, each app was tested over cellular networks in a moving automobile traveling from NIH to a point 25 miles north at an average speed of 60 miles per hour. Bit rates were recorded in each app's tests and averaged. Technical issues such as frozen or posterized video and call drops were recorded. Latencies were observed and averaged, audio and video synchronization also was recorded, and subjective judgments were made of audio and video quality by conference participants.
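The reduction of logged bit rates to the ranges and averages reported in the Results can be sketched as follows. This is a minimal illustration, assuming invented sample values; it is not the authors' actual logging tooling.

```python
# Minimal sketch of the aggregation described above: per-app bit-rate samples
# logged during test sessions are reduced to the range and average reported in
# the Results tables. The sample values below are invented for illustration.
from statistics import mean

def summarize(samples_kbps):
    """Return (low, high, average) for one app's logged bit-rate samples."""
    return min(samples_kbps), max(samples_kbps), round(mean(samples_kbps))

# e.g., bit rates monitored across three test sessions for one app over Wi-Fi
samples = [104, 480, 620, 710, 1200, 350, 490]
low, high, avg = summarize(samples)
print(f"range {low}-{high} Kbps, average {avg} Kbps")
```

Because the range pools momentary lows and highs across all three sessions and locations, it is far more variable than the average, which is why the Results emphasize averages.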

Testing involved different cellular networks, network carriers, and tablet platforms and evolved over time. Ultimately, there were three rounds of tests, each based on the results from the previous round (Table 2). Initial tests were done using iOS iPad® (Apple) devices configured for 3G cellular and Android Samsung (Seoul, Korea) Galaxy tablets configured for 4G cellular both from the same cellular carrier. Additional iPads were acquired with 4G capability but from a different carrier for a second round of tests. This led to a third testing round where apps were tested on a second Android platform, Nexus 7 (Google and Asus [Taipei, Taiwan]), configured for the same 4G network and carrier that the iPads used in Round 2 to keep networks and carriers constant.

Table 2.

Testing Rounds, Platforms, Networks, and Carriers

TESTING ROUND OPERATING SYSTEM PLATFORM NETWORK/CARRIER
1 iOS iPad Wi-Fi/3G Carrier B
  Android Galaxy Wi-Fi/4G Carrier B
2 iOS iPad Wi-Fi/4G Carrier A
  Android Galaxy Wi-Fi/4G Carrier B
3 Android Nexus 7 Wi-Fi/4G Carrier A

3G, third generation; 4G, fourth generation.

Results

In Round 1 (summer 2013), Android mobile apps performed poorly across the board over 4G and Wi-Fi networks, whereas iOS apps were generally stable and usable over Wi-Fi but not 3G. One reason apps for both platforms performed poorly is that they appeared to encode video for data transmission rates higher than the wireless networks could accommodate. This was the obvious reason for the poor iOS app performance over 3G. Problems occurred in all locations, and there were predictable drops at certain places on the route the automobile traveled, indicating insufficient cell tower coverage. There are three possible reasons for the poor performance of Android apps in these initial tests. First, the Android tablets were newer than the iPads and, given iPhone® (Apple) and iPad popularity, developers may have tended to create apps for iOS first; consequently, iOS apps may have been more fully developed. A second reason may be the open source nature of the Android operating system: device manufacturers may have modified the system to optimize it for their hardware, making it harder for app developers to create programs that perform consistently on different Android devices. Finally, some performance issues with Android apps on the 4G network may have been due to poor network infrastructure. Overall, data for Round 1 were too poor and inconsistent to report.

In Round 2 (spring/summer 2014), both iOS and Android apps performed acceptably. Round 2's Wi-Fi, 4G, and Wi-Fi to 4G findings for iOS are shown in Table 3, and those for Android are shown in Table 4. To simplify presentation, Tables 3 and 4 show data only for apps with versions for both platforms and tests at fixed locations. Tests of apps unique to each platform were consistent with these results. In addition, all apps performed similarly when tested in the moving vehicle but were more variable, with some degradation in performance when moving between cell towers. The improved performance indicated that 4G network infrastructure had matured over the period between Round 1 and Round 2 tests and/or that developers had made improvements to their apps. The data rate range shown in Tables 3 and 4 is highly variable, reflecting the lowest and highest rates monitored in the three tests of each app at all fixed locations. For example, at some time during one test of iOS/iPad app A, either at NIH or at the authors' homes, the Wi-Fi data rate momentarily dropped as low as 104 Kbps, and sometime during another test it burst to 1.2 megabits/s (Mbps), giving an overall range of 104 Kbps–1.2 Mbps. The range was more consistent for a given test of an app at a given location. For example, in Round 3 testing, app A on an Android device had a range of 1.2–1.3 Mbps at one home location and 95–125 Kbps at NIH.

Table 3.

Round 2 Mobile Application Testing Results on iOS for Applications Having Android Versions (See Table 4)

APPLICATION  NETWORK   DATA RATE RANGE (Kbps)  AVERAGE DATA RATE (Kbps)  TECHNICAL ISSUES  AVERAGE LATENCY (s)  VIDEO QUALITY
A            Wi-Fi     104–1,200               526                       Some              0                    Fair
A            4G        176–1,040               336                       Minor             0                    Fair/poor
A            Wi-Fi/4G  160–264                 184                       Many              0                    Poor
B            Wi-Fi     96–280                  216                       Some              0                    Good/fair
B            4G        233–680                 485                       Minor             0                    Good/fair
B            Wi-Fi/4G  208–584                 459                       Some              0                    Good/fair
C            Wi-Fi     288–880                 585                       Minor             0                    Good/fair
C            4G        544–864                 702                       Minor             0                    Good/fair
C            Wi-Fi/4G  360–870                 663                       Some              0                    Good/fair
D            Wi-Fi     344–592                 476                       Minor             0.5                  Fair
D            4G        424–994                 683                       Minor             0.5                  Fair
D            Wi-Fi/4G  568–664                 597                       Some              0.5                  Fair/poor
E            Wi-Fi     176–704                 517                       Minor             0.5                  Good
E            4G        114–1,048               506                       Some              0.5                  Good/fair
E            Wi-Fi/4G  608–744                 641                       Minor             0.5                  Good/fair
F            Wi-Fi     448–1,312               812                       Minor             0                    Good/fair
F            4G        576–1,248               697                       Minor             0                    Good
F            Wi-Fi/4G  392–1,112               726                       Minor             0                    Good/fair
G            Wi-Fi     248–936                 631                       Many              0                    Fair/poor
G            4G        520–1,120               667                       Some              0                    Good/fair
G            Wi-Fi/4G  424–680                 559                       Some              0                    Good/fair

4G, fourth generation; Kbps, kilobits/s.

Table 4.

Round 2 Mobile Application Testing Results on Android (Samsung Galaxy) for Applications Having iOS Versions (See Table 3)

APPLICATION  NETWORK   DATA RATE RANGE (Kbps)  AVERAGE DATA RATE (Kbps)  TECHNICAL ISSUES  AVERAGE LATENCY (s)  VIDEO QUALITY
A            Wi-Fi     95–1,100                569                       Some              0.5                  Good/fair
A            4G        128–1,300               981                       None              <0.5                 Good
A            Wi-Fi/4G  225–1,300               906                       Minor             <0.5                 Good
B            Wi-Fi     265–400                 330                       Some              <0.5                 Good
B            4G        400–750                 625                       Minor             0                    Excellent/good
B            Wi-Fi/4G  100–731                 507                       Minor             0                    Good
C            Wi-Fi     200–340                 249                       Minor             0                    Good/fair
C            4G        350–900                 531                       Some              <0.5                 Fair
C            Wi-Fi/4G  152–515                 382                       Some              0                    Good/fair
D            Wi-Fi     419–878                 618                       Some              >0.5                 Good
D            4G        1,100–1,600             1,283                     Many              >5                   Poor
D            Wi-Fi/4G  550–1,600               1,308                     Many              >3                   Poor
E            Wi-Fi     37–875                  248                       Some              <1                   Fair/poor
E            4G        250–952                 478                       Minor             <0.5                 Good/fair
E            Wi-Fi/4G  263–920                 447                       Some              0.5                  Fair
F            Wi-Fi     281–429                 368                       Minor             <0.5                 Good
F            4G        488–745                 602                       Minor             <0.5                 Good
F            Wi-Fi/4G  439–803                 698                       Minor             <0.5                 Good
G            Wi-Fi     251–1,100               804                       Minor             0                    Good
G            4G        630–2,200               1,415                     Minor             0.5                  Fair
G            Wi-Fi/4G  482–2,200               1,803                     Minor             <0.5                 Good/fair

4G, fourth generation; Kbps, kilobits/s.

Data rate averages are more representative of performance than ranges. Most iOS and Android apps' averages were about 400–600 Kbps. The three outliers were iOS app A for Wi-Fi to 4G, which averaged only 184 Kbps, and Android apps D and G, which had data rates exceeding 1 Mbps for 4G and Wi-Fi to 4G tests. At either bandwidth extreme, high or low, there tended to be more technical and video quality issues. The two apps using the most bandwidth were Android, suggesting these developers were still trying to maximize data transmission rather than optimize it. Tables 3 and 4 also show there was little audio latency, usually less than 0.5 s, the point at which it becomes noticeable.
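The pattern noted above lends itself to a simple classification. In the sketch below, the 400–600 Kbps band edges come from the text, but the function and its labels are our own illustrative invention, not terminology from the study.

```python
# Hypothetical classifier for the observation above: app averages well below or
# above the roughly 400-600 Kbps band coincided with more technical and quality
# issues. Band edges are from the text; the labels are illustrative only.
def bandwidth_profile(avg_kbps: float, low: float = 400, high: float = 600) -> str:
    if avg_kbps < low:
        return "under-provisioned"   # risks compromised image quality
    if avg_kbps > high:
        return "over-provisioned"    # risks transmission problems on cellular
    return "optimized"

print(bandwidth_profile(184))   # iOS app A, Wi-Fi to 4G -> under-provisioned
print(bandwidth_profile(1415))  # Android app G over 4G  -> over-provisioned
print(bandwidth_profile(531))   # a typical average      -> optimized
```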

Round 3 Android tests with Nexus 7s (Table 5) came later (fall 2014); the results for all apps, including the previous high-bandwidth outliers, are closer to the 400–600 Kbps average, and there were fewer technical problems, more acceptable video quality, and no excessive latency, suggesting further app optimization for wireless and, perhaps, even more network build-out.

Table 5.

Round 3 Mobile Application Testing Results on Android (Nexus 7) for Applications Having iOS Versions (See Table 3)

APPLICATION  NETWORK   DATA RATE RANGE (Kbps)  AVERAGE DATA RATE (Kbps)  TECHNICAL ISSUES  AVERAGE LATENCY (s)  VIDEO QUALITY
A            Wi-Fi     107–1,100               568                       Minor             0.5                  Good/fair
A            4G        678–1,300               956                       None              0                    Good
A            Wi-Fi/4G  685–1,250               906                       Minor             0                    Good
B            Wi-Fi     265–367                 331                       Minor             0.5                  Good
B            4G        575–727                 624                       None              0.5                  Excellent/good
B            Wi-Fi/4G  238–678                 502                       None              0.5                  Good
C            Wi-Fi     191–288                 246                       None              0.5                  Good
C            4G        376–750                 531                       Minor             0.5                  Fair
C            Wi-Fi/4G  231–495                 382                       Some              0.5                  Good/fair
D            Wi-Fi     364–587                 506                       Minor             0.5                  Good
D            4G        468–959                 701                       Minor             0.5                  Good
D            Wi-Fi/4G  539–660                 586                       Minor             0.5                  Good/fair
E            Wi-Fi     356–684                 535                       Minor             0.5                  Good
E            4G        314–764                 596                       Minor             0.5                  Good/fair
E            Wi-Fi/4G  342–795                 541                       Minor             0.5                  Good/fair
F            Wi-Fi     319–413                 368                       Minor             0.5                  Good
F            4G        585–643                 602                       None              0.5                  Good
F            Wi-Fi/4G  607–795                 698                       Minor             0.5                  Good/fair
G            Wi-Fi     515–710                 601                       Minor             0.5                  Good
G            4G        415–750                 553                       Minor             0.5                  Good/fair
G            Wi-Fi/4G  452–676                 559                       Minor             0.5                  Good/fair

4G, fourth generation; Kbps, kilobits/s.

Network jitter and packet loss were not tracked because not all of the apps provide such measurement, but both are reflected in assessments of video quality and latency. Quality judgments were subjective, determined by tester consensus, and based on whether image artifacts (posterization, freeze frames, pixelization) occurred and, if so, their frequency and duration during typical 15–30-min test sessions. Good indicates no artifacts or one to two very minor, temporary ones. Poor indicates artifacts occurred several times or consistently with certain images (such as those where objects moved), or an app stopped and had to be restarted.
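The consensus rubric described above can be encoded as a function. In this sketch, the cut-off for "good" (no artifacts, or one to two very minor temporary ones) and the conditions for "poor" come from the text; the intermediate "fair" label and the function itself are hypothetical.

```python
# Sketch of the testers' consensus rubric, encoded as a function. The "good"
# and "poor" conditions are from the text; "fair" is an interpolated,
# hypothetical middle category for illustration.
def rate_video(artifact_count: int, persistent: bool, restarted: bool) -> str:
    """Map artifacts observed in a 15-30 min session to a quality label."""
    if restarted or persistent:
        return "poor"    # artifacts recurred with certain images, or the app
                         # stopped and had to be restarted
    if artifact_count <= 2:
        return "good"    # none, or one to two very minor, temporary artifacts
    return "fair"        # several artifacts, but the session remained usable

print(rate_video(0, persistent=False, restarted=False))  # good
print(rate_video(5, persistent=True, restarted=False))   # poor
```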

Discussion

There are several conclusions that can be made about mobile videoconferencing apps based on the tests:

  • Mobile apps on both iOS and Android platforms improved over the period between Round 1 and Round 2 testing and are now generally stable and usable given sufficient bandwidth. App performance was generally poor in Round 1. Audio latency continued to be an issue for Android apps in Round 2, but not in later rounds. Android apps with too low or too high data transfer rates had more technical and quality issues in Round 2 that resolved in Round 3 with more optimized bandwidth use.

  • Network speed and infrastructure greatly affect performance. In Round 1 testing, 3G was inadequate for iOS apps, and initial 4G performance using Android apps was poor due to either app instability or a 4G network that was possibly insufficiently built out at the time.

  • Wireless base station/cellular tower location and time of use greatly impact performance, especially on cellular networks.5 Apps usually had significantly lower data rates when tested during the workday at NIH than at the authors' homes.

  • High bandwidth use does not guarantee video quality. Early in Round 1, some apps used more bandwidth than cellular networks could likely manage, and performance was poor, resulting in latency, image freezing or degradation, and dropped calls. This carried over for some Android apps in Round 2.

  • Most mobile apps are now optimized to use 300–600 Kbps of bandwidth, with many optimized around 400 Kbps (Tables 3–5). These transmission rates are consistent with those previously reported as acceptable in some studies. They target a "sweet spot": enough cellular bandwidth for consistent performance, but not so much as to cause transmission problems or so little as to compromise image quality.

  • Performance fluctuations and slight degradations are possible with all apps and platforms when used in motion, due to the signal passing between cell towers and varied tower traffic.

  • The performance and features of the mobile apps from developers of cloud conferencing or hardware-based systems were inferior to those of their desktop/hardware commercial products, possibly because app data transfer rates are optimized for cellular network bandwidth, requiring greater compression.19 Whether the quality is sufficient depends on the content transmitted.

  • Only three apps took advantage of the highest high-definition camera quality, 1080p. Most have 720p resolution, and all performed well except when high degrees of motion were captured in the video. In many cases, frame rate may be as important as resolution.

  • Although reviewers subjectively judged the quality of the latest apps, platforms, and wireless networks as sufficient (that is, with relatively little latency and few image artifacts), there is still upward performance potential for mobile hardware, software, and networks.

Conclusions

The performance of mobile videoconferencing apps improved substantially over the test period, likely resulting from device, software, and network improvements. In the final tests, apps worked well on iOS and two different Android platforms over two different 4G cellular networks. These findings suggest the potential of mobile videoconferencing apps for telemedicine, but they do not obviate the need to conduct tests in local settings. Cellular service in one's setting may not be as good as that used in these tests, or the overall video quality may still be inadequate for particular content. Cell signals may need amplification, and wired communication may be needed in locations having substantial cell traffic.20 A limitation of the study is its subjective judgments of image quality based on video of conference participants rather than specific types of medical data such as sonograms, electrocardiograms, etc. Moreover, the study is based on current wireless technology, which is itself a moving target, subject to cell tower and base station density and evolving bandwidth standards. Finally, there are other videoconferencing issues, such as security and encryption, that the study did not address.

Acknowledgments

The research reported was supported by the Intramural Research Program of the National Institutes of Health and the National Library of Medicine.

Disclosure Statement

No competing financial interests exist.

References

1. Liu W, Zhang K, Locatis C, Ackerman M. Cloud and traditional videoconferencing technology for telemedicine and distance learning. Telemed J E Health 2015;21:422–426.
2. Weinstein RS, Lopez AM, Joseph BA, Erps KA, Halcomb M, Barker GP, Krupinski EA. Telemedicine, telehealth, and mobile health applications that work: Opportunities and barriers. Am J Telemed 2014;127:183–187.
3. Armfield NR, Gray LC, Smith AC. Clinical use of Skype: A review of the evidence base. J Telemed Telecare 2012;18:125–127.
4. Anderson C, Henner T, Burkey J. Tablet computers in support of rural and frontier clinical practice. Int J Med Inform 2013;82:1046–1058.
5. Chakrabarti S, Chou A, Andrade J, Bennett MT, Deyell MW, Tung SKK, Krahn A. Real-time transmission to real-time patient care: A tale of 4 devices. Can J Cardiol 2013;29:1014.e1–1014.e2.
6. Parakh N, Chaturvedi V. Indigenous telemedicine. Indian Pacing Electrophysiol J 2012;12:131–132.
7. Shivapathasundram G, Heckelmann M, Sheridan M. Using smart phone video to supplement communication of radiology imaging in a neurosurgical unit: Technical note. Neurol Res 2012;34:318–320.
8. Bolle S, Johnsen E, Gilbert M. Video calls for dispatcher-assisted cardiopulmonary resuscitation can improve the confidence of lay rescuers—Surveys after simulated cardiac arrest. J Telemed Telecare 2011;17:88–92.
9. Sibert K, Ricci MA, Caputo M, Callas PW, Rogers FB, Charash W, Malone P, Leffler SM, Clark H, Salinas J, Wall J, Kocmoud C. The feasibility of using ultrasound and video laryngoscopy in a mobile telemedicine consult. Telemed J E Health 2008;14:266–272.
10. Audebert H, Boy S, Jankovits R, Pilz P, Klucken J, Fehm N, Schenkel J. Is mobile teleconsulting equivalent to hospital based telestroke services? Stroke 2008;39:3427–3430.
11. Dickson BW, Pedersen PC. Wireless image streaming in mobile ultrasound. Telemed J E Health 2010;16:161–166.
12. Larsen SB, Clemensen J, Ejskjaer N. A feasibility study of UMTS mobile phones for supporting nurses doing home visits to patients with diabetic foot ulcers. J Telemed Telecare 2006;12:358–362.
13. Seeman R, Guevara, Undt G, Ewers R, Schicho K. Clinical evaluation of tele-endoscopy using UMTS cellphones. Surg Endosc 2010;24:2855–2859.
14. Caverno E, Alesanco A, Garcia J. Enhanced protocol for real-time transmission of echocardiograms over wireless channels. IEEE Trans Biomed Eng 2012;59:3212–3220.
15. Kim DK, Yoo SK, Park IC, Choa M, Kyoung BY, Kim YD, Heo JH. A mobile telemedicine system for remote consultation in cases of acute stroke. J Telemed Telecare 2009;15:102–107.
16. Lim TH, Choi HJ, Kang BS. Feasibility of dynamic cardiac ultrasound transmission via mobile phone for basic emergency teleconsultation. J Telemed Telecare 2010;16:281–285.
17. Waran V, Bahuri NF, Narayanan V, Ganesan D, Kadir KA. Video clip transfer of radiological images using a mobile telephone in emergency neurosurgical consultations (3G multi-media messaging service). Br J Neurosurg 2012;26:199–201.
18. Van Dillen C, Silvestri S, Haney M, Ralls G, Zuver C, Freeman D, Diaz L, Papa L. Evaluation of an off-the-shelf mobile telemedicine model in emergency department wound assessment and management. J Telemed Telecare 2013;19:84–88.
19. Liu W, Zhang K, Locatis C, Ackerman M. Internet-based videoconferencing coder/decoders and tools for telemedicine. Telemed J E Health 2011;17:358–362.
20. Locatis C, Williamson D, Sterrett J, Detzler I, Ackerman M. Video medical interpretation over 3G cellular networks: A feasibility study. Telemed J E Health 2011;17:809–813.

Articles from Telemedicine Journal and e-Health are provided here courtesy of Mary Ann Liebert, Inc.
