Abstract
Effective response to natural or man-made disasters (e.g., terrorism) is predicated on the ability to communicate among the many organizations involved. Disaster response exercises enable disaster planners and responders to test procedures and technologies and incorporate the lessons learned from past disasters or exercises. On May 31 and June 1, 2002, one such exercise event took place at the Camp Lejeune Marine Corps Base in Jacksonville, North Carolina. During the exercise, East Carolina University tested: (1) in-place telehealth networks and (2) rapidly deployable communications, networking, and data collection technologies such as satellite communications, local wireless networking, on-scene video, and clinical and environmental data acquisition and telemetry. Exercise participants included local, county, state, and military emergency medical services (EMS), emergency management, specialized response units, and local fire and police units. The technologies and operations concepts tested at the exercise and recommendations for using telehealth to improve disaster response are described.
The September 11, 2001 terrorist attacks on the World Trade Center and the Pentagon illustrated the unfortunate reality of the terrorist threat within the United States. Concomitantly, these events heightened the awareness of emergency response and security organizations at the federal, state, and local levels throughout the U.S. and the world, and many of these organizations have been reexamining and evolving their disaster and terrorism preparedness and response plans. Examination and analysis of the events leading up to the September 11 attacks, the response to the events, and prospective assessment of readiness to detect or respond to other terrorism attacks, including weapons of mass destruction (WMD), have revealed that one of the most pressing issues is to improve communication among all of the disparate intelligence, law enforcement, and emergency response organizations involved in the detection of and response to terrorism.1,2
On May 31 and June 1, 2002, the East Carolina University (ECU) Telemedicine Center participated in a regional terrorism response exercise sponsored by the Camp Lejeune Marine Corps Base/Onslow County Military-Civilian Task Force for Emergency Response (MCTFER), the North Carolina Division of Emergency Management, and the Coastal Carolina Regional Steering Committee (CCRSC) for Domestic Preparedness. This event was conducted at the Camp Lejeune Military Operations on Urbanized Terrain (MOUT) training facility, which is a full-scale town used for urban combat and search and rescue training. Multiple activities were concurrently conducted during the two-day exercise, including hostage rescue, mass triage, firefighting, mass decontamination, aeromedical transport, foreign animal disease detection and management, and family assistance, to name a few. Personnel from local, county, state, and military emergency medical services (EMS), fire, police, emergency management, and specialized response units (e.g., hazardous materials) participated in the exercise. The goals of the ECU Telemedicine Center’s participation were to demonstrate and test the following within the context of simulated terrorist or mass casualty events: (1) existing telemedicine networks, systems, and operations concepts and (2) rapidly deployable satellite, local wireless networking, and mobile clinical and environmental monitoring technologies. Such systems and practices offer the potential for improving interresponder and inter-agency coordination and for keeping all parties informed about evolving conditions at the scene of the disaster.
Background
As evidenced in many disaster events, existing public-switched telephone networks (PSTN) and cellular telephone networks can quickly become disabled or overwhelmed, rendering them degraded or useless for disaster response communications. This may result from either physical damage to the communications infrastructure or extreme demand that exceeds the network’s ability to handle increased usage (i.e., insufficient surge capacity). Frequently, however, hospitals and other clinical facilities rely solely on PSTN or cellular telephone systems for emergency communications. Emergency response and public safety organizations use separate radiofrequency (wireless) terrestrial communications networks and avoid reliance on PSTN or cellular networks. However, these systems can become overwhelmed in large-scale disasters. One possible solution to loss of terrestrial communications infrastructure during a disaster is the use of satellite networks for voice and data communications.
The use of in-place telehealth networks is another way to improve communications reliability for medical response to disaster or terrorism events. According to the American Telemedicine Association,3 over 200 telehealth networks within the U.S. are linked to more than 2,000 clinical facilities. These high-speed networks can be used in the event of a disaster for many applications, including improved triage and medical transport decision making, interfacility or interagency collaboration, teleconsultation, training, postevent psychiatric/psychological consultation, or long-term patient follow-up. Currently, several proposed initiatives address the utilization and interconnection of existing telehealth network resources to support disaster response. For example, the Southern Governors Association’s Telehealth/Homeland Security Task Force has conducted an assessment of telehealth, public health, and distance learning networks within its member states as a first step in determining how such existing resources can be used to support regional disaster preparedness and response.
Like many telemedicine programs throughout the U.S., the ECU Telemedicine Center operates an extensive high-speed telecommunications and videoconferencing network to serve patients throughout its regional service area. The ECU network consists of heterogeneous communications links, including full and fractional T-1 (1.544 Mbps), Integrated Services Digital Network (ISDN) at multiples of the 128-kbps Basic Rate Interface (BRI), most commonly aggregated at 3 BRI (384 kbps), Plain Old Telephone Service (POTS, up to 56 kbps), Internet 2 (OC-3, 155 Mbps), and fixed-point wireless Ethernet (45 Mbps). Both H.320 and H.323 videoconferencing systems are used in the ECU network, and three multipoint conference units enable these disparate systems to interoperate. In addition to ECU-managed network sites, several other videoconference sites are frequently interconnected via the ECU Telemedicine Center for distance learning and grand rounds via the North Carolina Information Highway (NCIH) and North Carolina Research and Education Network (NC-REN). Although all of the aforementioned sites are designed primarily for clinical telehealth and distance education, they also can be used to support disaster or terrorism response operations.
The ECU Telemedicine Center used the MCTFER exercise as an opportunity to assess the feasibility and operations concepts for using in-place videoconferencing networks, satellite communications, wireless Local Area Networks (LAN), and Internet Protocol (IP) biomedical and environmental data telemetry for disaster response. The exercise involved a simulated scenario of a nerve agent attack on a high-school graduation ceremony and was conducted to test both the individual response plans of participating organizations and local and regional interorganizational coordination.
Design Objectives
Medical response to natural disasters or acts of terrorism requires a diverse team, distributed at many locations. Emergency department staff at rural hospitals and regional trauma centers will be involved in decision making and patient care. On-scene responders will include emergency medical technicians and paramedics as well as firefighters and law enforcement personnel, hazardous materials teams, and volunteers. Many other county, state, and federal emergency management organizations will be involved. This complex effort absolutely requires effective collaboration, mediated by telecommunications, and up-to-date information about conditions at the scene.
As previously described, disasters often disable or degrade standard communications modes. Therefore, the most fundamental system requirement is redundant, fault-tolerant communications/network infrastructure. Associated applications will be required to improve medical response to natural or manmade disasters. These basic requirements can be decomposed into several constituent elements:
- Redundant wide-area network (WAN) or “reach-back” infrastructure to enable connectivity between geographically dispersed personnel:
  - Fixed WAN assets: the existing ECU Telemedicine network and other in-place networks
  - Deployable WAN systems: satellite communications, cellular telephone, amateur radio
- On-scene wireless LAN infrastructure to support responders in the field who will be dispersed throughout the scene.
- Collaboration applications, including videoconferencing and voice communications, to facilitate interaction between field-deployed personnel and remote decision makers.
- Remote sensing and telemetry of biomedical and environmental parameters to improve “situational awareness” (i.e., assessment of victims or on-scene conditions).
- Basic network-enabled IT productivity applications (e.g., e-mail, file transfer, web browsing).
Additional requirements can be further derived or implied from these core requirements and knowledge of the nature of disaster response operations. Applications that use the communications/network infrastructure must be able to use any potential network path and not be constrained by proprietary protocols linked to physical characteristics peculiar to individual modes of communication. Therefore, applications should be optimized for IP networking. In addition, on-scene systems should be mobile and able to operate without an AC power source (i.e., battery, solar, or other alternative power).
Systems Description
Multiple systems were demonstrated and tested within the context of the MCTFER exercise to examine their utility for on-scene disaster response operations. Examples include in-place networks, satellite and local wireless networking for reach-back and on-scene communications, clinical and environmental devices, videoconferencing systems, and information technologies.
In-Place Networks
As described previously, the ECU Telemedicine Network consists of ISDN and T-1 connections to various clinical sites throughout the region, with capabilities to interconnect with several other networks throughout the state, as well as the Internet via Internet 2. Several of these sites were in counties that participated in the exercise. These sites, plus other key operational sites, were interconnected in a multipoint videoconferencing session (Figure 1): (1) Onslow Memorial Hospital (Jacksonville, NC) and Duplin General Hospital (Kenansville, NC) via T-1 (H.320) links, (2) Pitt County Memorial Hospital (Greenville, NC) and the State of North Carolina Emergency Operations Center (Raleigh, NC) via ISDN (H.320) links, and (3) the deployed ECU Telemedicine Center team at the MOUT (Jacksonville, NC) via IP satellite (H.323). Existing ECU Telemedicine Center multipoint conference units were used to make the necessary interconnections between the disparate videoconferencing systems and standards.
Figure 1.
The network configuration that supported the multipoint videoconference during the exercise.
Satellite “Reach-back” Communications
Three different types of satellite communications systems were deployed to facilitate reach-back connectivity: the Global Communications Solutions (GCS) 212, Starband, and two Inmarsat Global Area Network (GAN) systems. The U.S. Army Telemedicine and Advanced Technology Research Center provided one of the Inmarsat GAN units for the exercise, and the other was provided by GCS. Both Iridium and Globalstar satellite telephones provided voice communications and 9.6-kbps telemetry, and a packet (ham) radio was used for 300-bps telemetry. The GCS-212 system consisted of a 1.2-m parabolic antenna and very small aperture terminal (VSAT) that connected to the GCS network operations center (NOC) in Victor, NY, at 768 kbps. From the GCS NOC, IP traffic was delivered to the commodity Internet via a 3-Mbps connection and reached the ECU Telemedicine Center via Internet 2 (OC-3). The router at the MOUT was configured to connect four analog telephones, which were switched into the PSTN at the GCS NOC, providing reliable voice communications.
The Starband system is a 35-pound satellite dish that can be used anywhere within the continental U.S.; aligned southward to connect with the Telstar-7 satellite, it establishes a sharable Internet connection. The gateway for the Starband system is a desktop or laptop PC with either two network cards or one network card and a free Universal Serial Bus (USB) port. The Globalstar system uses a constellation of Low Earth Orbit (LEO) satellites and telephone handsets of roughly the same size, and with similar power requirements, as a commercial cellular telephone handset.
Local Wireless Networking
Commercial off-the-shelf (COTS) 802.11b wireless network access points were used to provide connectivity on-scene at the MOUT. A NETGEAR (model ME102) access point was used to cover the Telemedicine Operations Center, while two Linksys (Instant Access model WAP11) systems were bridged to provide outdoor coverage for mobile users. Different channel settings were used for the two manufacturers’ systems (channel 1 for the NETGEAR, channel 6 for the Linksys). These wireless networks were connected to the different satellite networks via cabled (Cat 5) connections. A high-gain antenna was used with the outdoor access point.
On-scene Video
GCS provided a Zydachron H.323 codec that enabled IP videoconferencing. The ECU team also tested a commercially available 802.11b-enabled web cam (D-Link DCS-1000W) that was used for general on-scene video streaming at VGA resolution. The system has an onboard webserver that allows the camera to operate without a dedicated PC, and the video stream can be viewed from any PC with Internet access and a standard web browser client. The system can be configured to automatically take digital still photos at defined intervals or as triggered by an external sensor signal (e.g., motion, infrared); however, these capabilities were not used during the exercise.
Additionally, two mobile video systems were tested: a GCS “video vest” and a wearable computer with a web cam. The GCS system is a vest with an integrated video camera, a headset with microphone, and a battery pack; it allows the user to transmit video and two-way voice via a 900-MHz wireless link back to a transceiver at the telehealth operations center. The video and audio from the mobile user can be routed into the auxiliary inputs on the codec to enable reach-back to other sites, although this capability was not tested during the exercise. ECU Telemedicine Center personnel also used a wearable computer, manufactured by Charmed, Inc., with a 233-MHz Pentium II processor, a 20-GB hard drive, 128 MB of RAM, two PCMCIA card slots, and onboard Ethernet and USB. The Charmed device was outfitted with a USB Logitech QuickCam VC web cam and used Microsoft NetMeeting software (H.323). This system used the 802.11 wireless LAN to connect back to the satellite reach-back communications to videoconference externally.
Clinical and Environmental Data Telemetry
The presence of reliable reach-back and local wireless networking infrastructure allows on-scene responders to use information technologies to collaborate electronically with key external players and to provide up-to-date, if not real-time, data about the disaster scene and medical response operations. The satellite and wireless LAN systems enabled ECU Telemedicine Center personnel to demonstrate and test communications-enabled biomedical and environmental monitors, videoconferencing, and basic information technologies, including notebook PCs, webcams, and personal digital assistants (PDAs).
The ECU-developed “TopHat” platform (Figure 2) is central to providing IP data telemetry from devices and sensors. TopHat is a micro-webserver with four serial (RS-232) ports and one Ethernet (RJ-45) port that enables devices to communicate via a shared network connection. TopHat is based on the 22.1-MHz Rabbit 2100 microprocessor. Its four 5-volt CMOS-compatible serial ports can be used simultaneously with the 10Base-T Ethernet port. TopHat can be controlled concurrently via the network port, a serial port, or programmed thresholds. Network control consists of user interaction with an internally hosted web page and of TopHat periodically checking an e-mail account for commands or new parameters; multiple users can interact with the network interface concurrently. Serial control can be performed from a personal computer running any of the major operating systems or from a PDA. Programmed thresholds include sensor or device data parameters exceeding specified limits, or patterns of events triggering programmed responses; an event pattern typically consists of thresholds on multiple sensors being exceeded within a certain time frame. With built-in power regulators, TopHat can be powered by 120 VAC, a 7.2-VDC 3-A battery, an automotive cigarette lighter, solar power, or any combination thereof.
Figure 2.
The ECU TopHat micro-web server platform was used to convert serial device data to an Internet Protocol stream.
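The threshold behavior described above, serial data relayed over IP with alerts when programmed limits are exceeded, can be sketched in Python (a hypothetical model; the actual firmware ran on the Rabbit 2100, and the record format, limit, and URL below are invented for illustration):

```python
import json
import urllib.request

# Hypothetical sketch of TopHat-style threshold logic: serial sensor readings
# are packaged as IP-ready records, and a reading that exceeds a programmed
# limit is flagged for an immediate alert post. All names, units, and
# thresholds here are illustrative, not the actual Rabbit 2100 firmware.

RADIATION_LIMIT_CPM = 100  # programmed threshold (counts per minute)

def classify_reading(sensor: str, value: float, limit: float) -> dict:
    """Package one serial-port reading and flag it if it exceeds the limit."""
    return {"sensor": sensor, "value": value, "alert": value > limit}

def build_post(record: dict, url: str) -> urllib.request.Request:
    """Build (but do not send) the HTTP POST that would carry the record."""
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

# Simulated Geiger-counter samples: normal background, then a spike.
samples = [12.0, 15.0, 240.0]
records = [classify_reading("GM-10", v, RADIATION_LIMIT_CPM) for v in samples]
alerts = [r for r in records if r["alert"]]
req = build_post(alerts[0], "http://example.invalid/telemetry")
print(len(alerts), req.get_method())  # 1 POST
```

An event-pattern extension would simply apply the same test across several sensors within a time window before raising the alert.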
TopHat was used to interface with several of the deployed devices. Clinical devices that used TopHat during the exercise included Welch Allyn Vital Signs monitor (Model 52000) and Roche Diagnostics’ Accu-Check Complete glucometer. TopHat was used to interface with environmental monitoring devices, including Rae Systems’ MultiRAE Plus air and chemical detection system and a Black Cat Systems’ GM-10 radiation detector (Geiger counter).
TopHat was connected to the wireless LAN via either a direct cable connection to one of the wireless access points/routers or a wireless 802.11b connection, and subsequently to the Internet connections provided by the satellite systems. TopHat was also connected directly to the Qualcomm GSP-1600 satellite phone via an RS-232 serial cable interface and communicated with the aforementioned clinical and environmental monitors as well as notebook computers; it was used with a cellular telephone as well. TopHat posted device data via the satellite connections to a remote Apache webserver and a Cold Fusion server at ECU that could be accessed by any authorized person with Internet access. In addition to posting data to remote web servers, TopHat’s internal webserver enabled data to be displayed locally on PDAs, including both a Handspring Visor and a Compaq iPAQ. The network configuration that supported all of the field-deployed technologies is depicted in Figure 3.
Figure 3.
The overall architecture of the communications and telemetry network involved satellite reach-back communications integrated with local wireless networking and device communications.
Two other environmental telemetry systems, developed by ECU Telemedicine Center personnel, were tested: one for personal monitoring, the other for wide-area monitoring. The personal system provided location tracking and environmental monitoring hardware. The equipment consisted of an APD2000 (Smiths Detection, Environmental Technologies Group, Baltimore, MD) gas monitoring system and a Magellan 315 GPS unit. Both were interfaced via custom controlling hardware that in turn controlled data flow to a 9,600-bps, 2.4-GHz radio transmitter. Each time the APD2000 sent environmental data, GPS data followed immediately afterward. The APD2000 was configured to transmit data approximately every 30 seconds unless the unit detected dangerous levels of a substance, in which case it transmitted immediately.
The GPS unit sent a continuous stream when in contact with satellites. When the controlling hardware detected data from the APD2000, it would get the relevant data from the GPS unit and send it immediately after the APD2000 data. The radio transmitted to a receiver, serially attached to a computer. Data were displayed in Hyperterminal, a terminal emulator that comes installed with Microsoft Windows.
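The transmit policy described above, a roughly 30-second schedule that is overridden immediately when dangerous levels are detected, with GPS data following each gas reading, can be modeled as follows (a hypothetical sketch; the actual controlling hardware was custom-built, and the frame format shown is invented):

```python
# Hypothetical model of the controlling hardware's transmit policy:
# send a reading roughly every 30 seconds, but transmit at once when the
# gas monitor reports a dangerous level. GPS data follow each gas reading,
# mirroring the ordering used on the 2.4-GHz radio link.

NORMAL_INTERVAL_S = 30

def should_transmit(elapsed_s: float, dangerous: bool) -> bool:
    """Transmit on the 30-second schedule, or immediately on danger."""
    return dangerous or elapsed_s >= NORMAL_INTERVAL_S

def frame(gas_reading: str, gps_fix: str) -> str:
    """Gas data first, GPS data immediately afterward (invented delimiter)."""
    return f"{gas_reading}|{gps_fix}"

# Simulated timeline: (seconds since last send, danger flag)
events = [(10, False), (31, False), (5, True)]
sent = [should_transmit(t, d) for t, d in events]
print(sent)  # [False, True, True]
print(frame("APD2000:clear", "GPS:34.69N,77.34W"))
```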
The wide-area system was stationary and based on packet (ham) radio. Ham radio is often the least common denominator in disaster situations, used when other modes fail: it is highly reliable and mobile, covers a wide area, offers the flexibility to change bands and equipment, and has a large base of knowledgeable operators. A Graseby Dynamics GID-3 was used for environmental chemical detection and a Magellan 315 GPS unit for positioning. Both were interfaced via custom controlling hardware that in turn controlled the initialization and data flow of a Kenwood TH-D7AG packet-radio-equipped handheld radio. The frequency was adjustable. The controlling hardware was set up to send both GID-3 and GPS data every 10 seconds. When the entire system was turned on, the controlling hardware set up the radio with an identification call and then prepared to transmit data. The hardware was connected to the GID-3 and GPS and controlled the flow of data into the radio.
Both the GID-3 and GPS units output data continually. The controlling hardware took a sample of data from each device every 10 seconds and sent it to the radio. A receiving radio was connected to a packet modem that, in turn, was connected serially to a PC that displayed all information. The packet modems handled all data traffic protocols, ensuring that data were sent and received with error checking; data were retransmitted if necessary. Data rates were set at 9.6 kbps, since both systems produce low-bit-rate data streams.
Status Report: Performance of the System During the Regional Disaster Response Exercise
For the exercise, the ECU Telemedicine Center used the MOUT’s Building 5 (Embassy Building) to set up a communications and telehealth operations center. The three-story building, constructed primarily of concrete blocks, had limited interior lighting and 120-VAC electrical power. This particular building was selected because it had an unobstructed view of the southern sky and a large, flat rooftop suitable for deploying antennae. On the first day of the exercise, the ECU team was allowed to drive its vehicles on site to drop off equipment at the base of the building. The test strategy was to simulate an actual event and, therefore, to set up all of the equipment de novo and measure the time required to establish communications links, from unpacking the boxes until connectivity was verified. Performance of the many systems tested during the exercise is described below.
Testing of In-place Networks
Existing networks were tested in the “tabletop” simulation of a nerve agent attack on a high-school graduation ceremony. A tabletop exercise is a facilitated, scenario-driven activity that involves participants describing how they or their organization would respond to events that arise during the scenario. An important component of this tabletop simulation was the coordination of emergency medical response among hospitals in the region. This involved testing the procedures and protocols for interhospital communication of bed availability and staffing levels, coordination of mass casualty medical transport (air and ground), determination of in-place and supplemental medical supply needs, and assessment of the situation at the high school. In addition to usual means of communication, the ECU Telemedicine Center made its network available for use in management of the event.
From the time the first request for video connectivity was received at the Telemedicine Center, a multipoint videoconference was successfully established within 30 minutes on an ad hoc basis. The two rural hospital sites (Duplin General and Onslow Memorial) were able to communicate with the regional trauma center, Pitt County Memorial Hospital. The telehealth operations center at the Camp Lejeune MOUT was connected into the conference via the GCS-212 satellite system (via the GCS NOC in NY), and the State Emergency Operations Center was bridged into the multipoint conference. However, once these connections were demonstrated, they were not further used for the tabletop simulation and sat idle until later that afternoon, when senior exercise planners at the MOUT used the session to discuss the exercise’s progress with State Emergency Management officials at the Emergency Operations Center in Raleigh. Therefore, although the capability to bridge multiple sites of different standards over varied network paths on an ad hoc basis was successfully demonstrated, it was not tested as a tool to improve intersite, interagency coordination of disaster response.
On-scene Communications
The on-scene communications concept assumed that the local telecommunications infrastructure was unavailable, relying instead on multiple, redundant satellite paths of different bandwidths for reach-back connectivity and on wireless networking for local communications. The ECU Telemedicine Center tested the different, previously described satellite systems for reach-back and local wireless networking for on-scene communications. The bandwidth, communications protocol, and set-up time for each of the satellite systems are listed in Table 1.
Table 1.
Bandwidth, Protocols, and Set-up Times for the Different Satellite Reach-back Systems Tested During the Exercise
| System | Bandwidth | Protocols | Set-up Time |
|---|---|---|---|
| GCS 212 | 768 kbps | IP | 2 hr |
| Starband™ | 70 kbps downlink / 35 kbps uplink (asymmetric) | IP | 2 hr, 10 min |
| Inmarsat GAN | 128 kbps | ISDN | 10 min |
| Globalstar | 4 kbps | IP (PPP) | 40 sec |
To the extent practical, devices were configured to use IP so that they could interoperate and transmit critical data regardless of the physical network path(s). This allowed devices that inherently supported IP, or that were converted to IP via the TopHat micro-webserver platform, to use all of the network paths, whether the high-bandwidth 768-kbps GCS-212 system or the 4-kbps Globalstar satellite telephone. Using IP would also allow devices to use existing local or wide-area networks, the PSTN, emergency response RF, packet (ham) radio, or cellular networks, if they were available in a disaster situation, without requiring the devices to be reconfigured.
Once the satellite links were established, two wireless LANs were set up and connected to the GCS 212 and Starband satellite systems. The COTS 802.11 wireless LAN products readily provided local network infrastructure without modification, covering the MOUT (approximately a 0.25-mile radius); Ethernet bridging capabilities allowed access points to interconnect, further increasing the possible coverage area. The 802.11b networks were set up in under five minutes (the time it took to move the antennae to the desired locations and run the cable). One of the wireless LANs used a high-gain antenna that provided long-range line-of-sight connectivity within a horizontal plane. The other used a lower-gain antenna that did not provide as much line-of-sight distance but performed better in this scenario because of its greater vertical range of coverage: the low-gain antenna was able to broadcast over buildings, whereas the high-gain antenna’s limited broadcast angle of plus or minus 13 degrees could not reach over or around buildings.
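The antenna trade-off can be illustrated with simple beam geometry (an approximation that ignores reflections, diffraction, and terrain, and assumes the plus-or-minus 13-degree figure describes the vertical beamwidth; the 20-m range used below is an invented example):

```python
import math

# Approximate vertical reach of an antenna beam at a given horizontal range.
# This is only a rough illustration of why a narrow vertical beam struggles
# to clear nearby buildings; real propagation involves multipath and terrain.

def beam_half_height(range_m: float, half_angle_deg: float) -> float:
    """Vertical half-height (m) of a beam with the given half-angle."""
    return range_m * math.tan(math.radians(half_angle_deg))

# A +/-13-degree beam 20 m from a building reaches only about 4.6 m up,
# below the roofline of a multistory structure.
print(round(beam_half_height(20.0, 13.0), 1))  # 4.6
```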
On-scene Video
Videoconferencing and video streaming technologies that were tested on-scene during the exercise performed well over the wireless LAN and satellite reach-back communications. IP video calls were established between the field-deployed system and a Polycom VS-4000 IP codec at the ECU Telemedicine Center at 128 kbps, 256 kbps, and 512 kbps. Although the same satellite link was simultaneously used for Internet reach-back, only limited degradation of the video was observed, even during a 512-kbps videoconference. As described previously, this videoconference link from the field was tied into the multipoint videoconference that was established for the regional tabletop simulation.
The satellite paths added noticeable latency to the videoconference, but users quickly adapted their communication patterns to account for the increased delay. Videoconferencing capabilities enabled field-deployed personnel to collaborate with remote experts, coordinate planning and logistics activities, and provide real-time assessment of on-scene conditions. The wireless web cam provided surprisingly good video quality and could be useful for remote surveillance of the scene and of disaster response operations. The two wearable video systems demonstrated the feasibility of using such systems for mobile operations, such as telementoring of responders.
Clinical and Environmental Data Telemetry
The TopHat was primarily used at the telehealth operations center in the MOUT’s Building 5, but it was also deployed at other locations at the MOUT. A sarin gas attack was simulated at the MOUT’s high school gymnasium, and the TopHat was deployed there for simultaneous monitoring of simulated gas levels with the MultiRAE device, location with a Magellan 315 Global Positioning System (GPS) unit, and radiation levels with a GM-10. When radiation levels exceeded preset thresholds with a radioactive test sample, TopHat transmitted its location and radiation levels over the 802.11b network via the Hypertext Transfer Protocol (HTTP). When TopHat was out of 802.11b range and unable to send data using its network connection, it automatically switched communications to the cellular phone that was also connected and successfully posted its data to a Sun Cobalt Qube web server running Perl 5 and the Apache webserver. The Perl script that accepted the data then notified a predefined list of people in response to the posted data.
TopHat was demonstrated both inside and outside the ECU Telemedicine Center’s van. From within the van, TopHat was able to communicate with devices located outside the van via Bluetooth personal-area wireless communications, using Connectblue’s Bluetooth Serial Port Adapter. TopHat also worked outside the van, connected directly to the sensors, with a wireless link back to a Handspring Visor inside the van.
TopHat was also tested as a wearable platform, broadcasting GPS data, radiation levels, and toxic gas levels on a web page that it hosted. When radiation thresholds were reached, TopHat posted data to an external web server while running its own web server and simultaneously transmitted e-mail messages to external team members. When the 802.11b network connection to TopHat was disconnected, TopHat continued to post radiation data readings through the Globalstar Qualcomm GSP-1600 satellite phone. The user interface for TopHat was a Handspring Visor connected through TopHat’s data port and running a Palm OS program that could configure TopHat’s alarm thresholds and network settings and display data from the attached devices.
Both of the amateur radio-based environmental data telemetry systems were evaluated within the MOUT site. Both systems had similar performance characteristics. Set-up times were limited to connecting and turning on the hardware. This typically required 5–10 minutes. A simulated toxic gas sample (oil of wintergreen) was used to test the systems’ responses. Both systems immediately transmitted gas concentration and GPS data when stimulated by the simulated toxic gas sample. Receivers in the telehealth operations center successfully collected these data from different indoor and outdoor locations throughout the MOUT within an approximately 500-yard radius.
At the request of the Onslow County EMS team, the ECU Telemedicine Center devised a means of using the Internet to transmit electrocardiogram (ECG) data from their standard field-deployed Medtronic Physio-Control LIFEPAK 12 system. The LIFEPAK 12 can use a modem or a direct serial connection but cannot send recorded ECG samplings across the Internet. A Bluetooth Serial Adapter™ from Connectblue, Inc., was used to interface the LIFEPAK 12 with a Bluetooth-enabled Compaq iPAQ. Files transmitted from the LIFEPAK to the iPAQ could then be forwarded, via e-mail, FTP, or web-based upload, to multiple recipients anywhere in the world. This capability was established de novo on scene at the MOUT, with no prior integration work performed before the exercise. Communication between the LIFEPAK 12 and the iPAQ was established, and the data files were sent to Medtronic Physio-Control specialists. The data were not viewable on site, because that required proprietary software, but Medtronic Physio-Control personnel were able to view them at their location.
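The bridging pattern used here, reading vendor-framed records off a serial link and forwarding each one over an IP transport, can be sketched as below. The STX/ETX framing is a hypothetical stand-in for the LIFEPAK’s proprietary protocol (which is why on-scene integration work was needed), and the in-memory byte string stands in for the Bluetooth serial stream.

```python
"""Sketch of serial-to-IP bridging: extract framed records from a
byte stream and hand each one to a network transport."""
STX, ETX = 0x02, 0x03  # illustrative frame delimiters, not the real protocol

def extract_records(stream):
    """Yield the payload of each STX...ETX frame in a byte stream."""
    buf = bytearray()
    in_frame = False
    for byte in stream:
        if byte == STX:
            buf.clear()
            in_frame = True
        elif byte == ETX and in_frame:
            yield bytes(buf)
            in_frame = False
        elif in_frame:
            buf.append(byte)

def forward(records, send):
    """Push each record to a transport (e-mail, FTP, web upload, ...)."""
    count = 0
    for rec in records:
        send(rec)
        count += 1
    return count

# Simulated serial capture containing two ECG records.
capture = bytes([STX]) + b"ECG-1" + bytes([ETX, STX]) + b"ECG-2" + bytes([ETX])
outbox = []  # stand-in for the iPAQ's outgoing transport
n = forward(extract_records(capture), outbox.append)
print(n, outbox)  # 2 [b'ECG-1', b'ECG-2']
```

Note that the bridge never interprets the payload; that is exactly why the files could be relayed to Medtronic Physio-Control but not viewed on site.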
Alternative Power
Although AC electrical power was available in the telehealth operations center, rechargeable battery packs and solar panel systems were used and tested. The solar panel system tested was a commercial off-the-shelf (COTS) panel, PowerQwest’s Suncatcher Expedition. The battery packs were 7.2-V, 3-Ah NiMH batteries available at any electronics store. Several devices, including TopHat and the 802.11b wireless access points, were designed to use the same commercially available power connectors, which made the power systems interchangeable. TopHat was powered by the battery system, an automobile cigarette lighter, and the solar panel. Although several of the devices ran off their own proprietary batteries, they could have been adapted to use the battery or solar power systems.
Discussion
Extensive network and videoconferencing infrastructure is already in place to support telehealth throughout the United States. During the exercise, the ECU Telemedicine Center demonstrated that its network can be activated in a timely manner to support disaster operations, and existing multipoint conference units can enable disparate network and videoconferencing standards to interoperate. The test scenario involved hospitals in the affected area, the trauma center that is responsible for coordinating disaster medical response in the region, the NC Emergency Operations Center, and an on-scene team. This demonstrated only one use of the network: administrative management of disaster response and medical assets. However, other potential clinical applications include the following:
Assistance for overwhelmed emergency departments and facilitation of transport decisions
Consultation with regional or national centers of expertise in diagnosis and management of bioterrorism agent exposure
Psychologic/psychiatric consultation (e.g., post-traumatic stress)
Long-term follow-up with disaster victims
“Traditional” telehealth (i.e., medical specialty consultation)
Although ECU’s telehealth network can provide tremendous benefit in disaster response, these resources are not part of current disaster response plans. In fact, although relevant sites in the network were activated for the tabletop simulation, participants at those facilities did not use the systems for the exercise. Ironically, one of the chief problems noted during the postexercise debrief was the inability to communicate with the involved hospitals, because the hospitals’ switchboards were overwhelmed. Meanwhile, the already-activated videoconferencing links sat mostly idle. This illustrates the need to add telemedicine networks to response plans, advertise their availability, and use them in disaster drills.
However, one key change in the existing network must be made: telehealth network-related systems must be put on primary back-up power (generator) systems to ensure that these resources are available in case of loss of electrical power. Currently, none of the nodes, including the hub at the ECU Telemedicine Center, is on primary back-up power.
A bridged wireless LAN network would allow the use of IP-based video, data, and voice products that would enable collaboration between personnel in isolated on-scene locations or surveillance and remote monitoring of a “hot zone” from a safe distance. These wireless networks could be either isolated or connected to the rest of the Internet via the reach-back paths, or separate private and public networks could be deployed. Furthermore, a wireless LAN infrastructure would support communications with a mobile responder who is equipped with wearable computing, monitoring, or video systems.
During the exercise only real-time videoconferencing and streaming technologies were tested, but asynchronous video transmission also can be employed if network bandwidth is insufficient to support videoconferencing or if higher than NTSC (i.e., broadcast television) resolution is required. Several inexpensive MPEG-1 and MPEG-2 digital video encoders are commercially available and work with standard personal computer interfaces. Also, digital still cameras would be a valuable part of a field-deployed system, both for clinical diagnostic purposes and to provide images of the disaster scene.
The concept of using web-enabled clinical and environmental devices and wireless communications was demonstrated. Although only small numbers of devices were used during the exercise, it is feasible that an array of devices and sensors could be simultaneously monitored on scene or remotely to assess the nature and extent of casualties. Ostensibly, many environmental sensors, linked with GPS coordinates, could be deployed in a hazardous area by protective-suited personnel, left in situ, and remotely monitored. These hazardous agent concentration and location data could then be combined with meteorologic and mapping data to provide detailed assessment of the situation and prediction of the spread of toxins.4
A primary barrier to remote monitoring of casualties or the environment is the lack of widely employed data representation and interoperability standards in COTS devices. For the exercise, ECU Telemedicine Center engineers had to develop a means of converting vendor-proprietary data and communications protocols. Serial data communications protocols can be translated readily to IP via micro-webserver platforms such as TopHat. However, data representation and device control standards must be developed to enable interoperability without customized systems integration. Such standards would allow data from any device to be readable by any other device or application. This goal can be accomplished by using an open standard, such as the Extensible Markup Language (XML) and/or the Common Object Request Broker Architecture (CORBA), as a basis for interoperability. Currently, several efforts are under way to use open standards for health and scientific device interoperability, including Health Level Seven–Version 3, which uses XML schema to represent health data, and Instrument Markup Language, a variant of XML for astronomic instrumentation.5
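The device-neutral data representation argued for above can be illustrated with a small XML encoding of a sensor reading. The element and attribute names below form a hypothetical schema invented for this sketch; they are not drawn from HL7 Version 3 or Instrument Markup Language.

```python
"""Sketch of an open, device-neutral XML representation of a sensor
reading, so that any consumer can parse it without vendor-specific code."""
import xml.etree.ElementTree as ET

def reading_to_xml(device, quantity, value, unit, lat, lon):
    # Top-level element identifies the reporting device.
    root = ET.Element("reading", device=device)
    # The measurement itself, tagged with what was measured and its unit.
    meas = ET.SubElement(root, "measurement", quantity=quantity, unit=unit)
    meas.text = str(value)
    # GPS coordinates, so readings can be placed on a map.
    ET.SubElement(root, "gps", lat=str(lat), lon=str(lon))
    return ET.tostring(root, encoding="unicode")

xml_doc = reading_to_xml("MultiRAE", "gas_concentration", 1.4, "ppm",
                         34.68, -77.34)
print(xml_doc)
```

A receiving application needs only a generic XML parser to recover the device name, value, unit, and location, which is precisely the decoupling from proprietary serial protocols that the exercise had to achieve by hand.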
To achieve the vision of internetworked, distributed disaster medical response, environmental and clinical device interoperability standards must be developed and adopted. The lack of interoperability between clinical devices and information systems has limited the adoption of telehealth as well. Perhaps the post-September 11 emphasis on improving bioterrorism preparedness, public health surveillance, and emergency response coordination will be the impetus for achieving this goal.
The technologies demonstrated during the exercise operated well in the hands of technically trained and experienced operators. However, a truer assessment of the performance and usability of these technologies should be based on their use by emergency responders. It is anticipated that a combination of improved human factors engineering, technical training for emergency response personnel, and making technical experts members of the response team will be required to make such systems work in real conditions.
This exercise provided the opportunity to test telehealth technologies and operations concepts for medical response within the context of a simulated disaster response environment. The unique locale and its concomitant isolation, with environmental and logistical challenges, provided a suitable setting for simulating disaster response operations. Using a simulated disaster situation allows system developers and disaster planners to test innovative technologies and approaches in a relevant environment without worrying about the consequences of failure. The use of existing terrestrial telemedicine networks for disaster/terrorism applications was demonstrated, and the set-up and performance of enabling, rapidly deployable satellite, wireless LAN, videoconferencing, and clinical and environmental data telemetry technologies were evaluated. The exercise validated several basic requirements for using telehealth to improve disaster response (Table 2). Further refinement of these requirements will involve additional testing and exercises that incorporate “real” users, including emergency responders, physicians, and disaster planners.
Table 2.
Recommendations, Based on the Exercise, for Functional Requirements for Both Fixed and Deployable Telehealth Systems.
[Two columns of recommendations: one for existing telehealth networks and one for on-scene operations.]
This experience helped develop and evolve extramural collaborative relationships, and the ECU Telemedicine Center was able to leverage the experience and in-kind support of these collaborators. Relationships with other local and regional organizations were cultivated by participation in the exercise and will be invaluable in the development of disaster response plans and infrastructure to better support the region, the state, and the nation.
Acknowledgments
The authors thank the many organizations and individuals who contributed to the success of this exercise. Onslow County Emergency Medical Services, especially Butch Thompson, Lynda Buchikos, and Judi Costa, were valuable partners in developing our plans. We also thank our hosts at Camp Lejeune Marine Corps Base, Col. Mark Goodman and Maj. Steve Simmons in particular. Global Communications Solutions, Inc. (GCS), provided satellite communications equipment, and GCS’s Vance Kannapel and Brian Skurka were greatly appreciated for their diligence in setting up and operating that equipment. Tommy Morris and Renee Clerici from the U.S. Army’s Telemedicine and Advanced Technology Research Center (TATRC) made one of their Inmarsat terminals available for our use. Rae Systems and Graesby Dynamics lent us environmental monitors, and the North Carolina Division of Emergency Management loaned two Iridium phones for the exercise. Finally, we express our gratitude to Mark Harrington, Chad Waters, and Tony Cooke at the ECU Telemedicine Center for their skill and patience in setting up and testing the videoconference network and multipoint sessions.
References
- 1. Teich JM, Wagner MM, MacKenzie CF, Schafer KO. The informatics response in disaster, terrorism, and war. J Am Med Inform Assoc. 2002;9(2):97–104.
- 2. U.S. General Accounting Office. National Preparedness: Integrating new and existing technology and information sharing into an effective homeland security strategy. Testimony of Y.A. Yim, Managing Director, National Preparedness. GAO-02-811T. Washington, DC, June 2002.
- 3.American Telemedicine Association. The Telemedicine Response to Homeland Safety and Security—Developing a National Network for Rapid and Effective Response for Emergency Medical Care. Available at <http://www.atmeda.org/news/newres.htm>. 2001.
- 4.Boris J. The threat of chemical and biological terrorism: preparing a response. Comput Sci Engineer Mar/Apr 2002; 22–32.
- 5. Cox D. XML for instrument control and monitoring: exchanging data between instruments. Dr. Dobb’s Journal. 2001;330:83–85.