Introduction
Numerous studies have evaluated the psychometric properties of the VAS in measuring degree of pain or pain relief, concluding that the VAS is a simple, reliable, reproducible, valid, and sensitive tool (e.g., Huskisson, 1974; Jensen, Karoly, & Braver, 1986; Scott & Huskisson, 1976; Sriwatanakul et al., 1983). Although the VAS has several advantages for measuring pain intensity and pain relief, Cline and colleagues (1992) described several methodological issues that need to be considered before using the tool in research or clinical practice. One of the unresolved issues is cumbersome scoring.
A VAS score is determined by measuring the distance from the end indicating zero on a straight line to the mark placed by the respondent. Researchers have used a micrometer (Tesler et al., 1991; Wilkie & Keefe, 1991), a clear ruler (Scott & Huskisson, 1976), or a transparent scoring template (Cline, Herman, Shaw, & Morton, 1992) to measure the VAS lines. The large amount of time required for line measurement, data encoding, and data entry has been the major impediment to using the VAS. Computerized, automated measurement and entry of VAS scores provide a practical solution and are essential to the efficient use of VAS data in contemporary clinical research.
The purposes of this paper are to introduce a computerized, software-driven digitizer tablet for scoring and entering VAS data; to describe testing of the intra- and inter-rater reliability and accuracy of the digitizer technique; and to highlight the time efficiency of the digitizer program in scoring and entering VAS data. Conclusions and recommendations for research applications of the procedure are provided.
Description of the Digitizer Tablet and the Computer Software
The computerized digitizer system consists of a DOS software program, dig2 (Steinke, 1993), and the SummaSketch II digitizer tablet (Model MM1201). The resolution of the MM1201 digitizer tablet is up to 40 lines per mm, meaning the smallest distance it can distinguish between two points is 0.025 mm. The standard accuracy of the digitizer tablet is ±0.381 mm, as reported by Summagraphics Corporation (1990). To operate the digitizer tablet and process the VAS data, the dig2 software must be installed on a hard drive before the digitizer tablet is connected (Steinke, 1993).
To measure the VAS, the data sheet is placed within the active area of the digitizer tablet (297 mm × 297 mm). The VAS score is measured as the distance a hand-held cursor moves over the digitizer tablet and is automatically entered into a pre-created data file. The detailed procedure for setting up the program and entering data is provided in Table I. In addition to digitizer tablet entry, keyboard data entry is an option. A file with a maximum of 512 variables and 4,096 subjects can be created with the dig2 software. The 1993 cost of the VAS computer scoring system was $384 for the digitizer tablet and $100 for the dig2 software.
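The distance-to-score conversion described above can be sketched in a few lines. This is an illustrative reconstruction, not the dig2 source: the function name and the use of raw tablet counts as input are assumptions, based only on the MM1201's stated resolution of 40 lines per mm.

```python
import math

# Assumed resolution of the MM1201 tablet: 40 counts (lines) per mm,
# so the smallest resolvable step is 1/40 = 0.025 mm.
COUNTS_PER_MM = 40

def vas_score_mm(zero_end, mark, counts_per_mm=COUNTS_PER_MM):
    """Distance from the scale's zero end to the respondent's mark, in mm.

    `zero_end` and `mark` are hypothetical (x, y) cursor positions
    reported by the tablet in raw counts.
    """
    dx = mark[0] - zero_end[0]
    dy = mark[1] - zero_end[1]
    return math.hypot(dx, dy) / counts_per_mm
```

For example, a mark 2,000 counts from the zero end of a horizontal line would score 50.0 mm on a standard 100-mm VAS.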
Table I.

A. Connecting the digitizer to a computer.
B. Accessing the dig2 program and creating a dig2 file.
C. Scoring and entering data.
D. Converting the dig2 database to an SPSS/PC, ASCII, or DIF database.
E. Writing the ASCII database to a Crunch4 database.
Reliability and Accuracy of the Digitizer Program
Intra-rater reliability
To test the stability and reproducibility of the digitizer measurement, 138 horizontal VASs were measured to 0.01 mm using the digitizer and dig2 software. The 138 VASs were then measured again by the same rater using the same method. The Pearson correlation was .9999 (p ≤ .0001) for the two measurement sets of 117 non-identical scores (21 scores were identical; most of these were zero or a missing value). The discrepancy between the two data sets varied from 0.01 to 0.85 mm with a mean of 0.25 mm (SD = 0.19) for the non-identical scores. This comparison demonstrated high intra-rater reliability, supporting the stability and reproducibility of digitizer and software measurement and encoding when performed by the same person. Steinke (1989) reported similar findings (r = .9998 to .9999, p < .001) for test-retest by the same investigator, separated by four months, using a similar computerized digitizer program (dig software and a Summagraphics digitizer tablet, MM1103).
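The analysis above pairs each score from the first session with its retest value, then computes a Pearson correlation and discrepancy statistics over the non-identical pairs. A minimal sketch of that computation, with hypothetical function names (the paper does not say which statistics package was used):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def discrepancy_stats(x, y):
    """Mean and sample SD of |x - y| over the non-identical pairs,
    mirroring how the text summarizes discrepancies."""
    diffs = [abs(a - b) for a, b in zip(x, y) if a != b]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean, sd
```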
Inter-rater reliability
To test the reliability of the measurement program operated by different persons, 510 VASs were measured by a trained high school student with prior basic computer knowledge. The 1.5-hour training session included a brief introduction to the digitizer equipment, connecting the equipment to a computer, accessing an existing file, and data entry. After training, the student independently measured the 510 VASs, which also were measured by one of the authors using the measurement program. The Pearson correlation was .9900 (p ≤ .0001) for the 499 non-identical scores. Based on the 499 scores, the mean discrepancy was 0.56 mm (SD = 4.47 mm). The maximum discrepancy was 99.80 mm, which was due to operator error. Ninety-six percent of the discrepancies were less than 1 mm. Although these findings demonstrate greater variation between two raters using the dig2 program and the digitizer tablet than when one rater performs the measurement, the inter-rater correlation was very strong. Measurement rules and likely sources of measurement error should be clarified and standardized to improve inter-rater reliability. For example, we found that a rule is needed for consistently placing the cross-hairs of the cursor on the left end of the marked line, since the thickness of the mark and inconsistent placement can affect the measurement.
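Large outliers such as the 99.80-mm discrepancy are easy to catch automatically by screening paired scores against a tolerance. The sketch below is a hypothetical quality-control step, not part of dig2; the 1-mm default tolerance follows the text's observation that 96% of discrepancies fell below 1 mm.

```python
def flag_discrepancies(rater_a, rater_b, tolerance_mm=1.0):
    """Return the percentage of paired scores agreeing within `tolerance_mm`,
    plus the indices of pairs exceeding it (candidates for operator error,
    such as a score measured from the wrong end of the line)."""
    diffs = [abs(a - b) for a, b in zip(rater_a, rater_b)]
    flagged = [i for i, d in enumerate(diffs) if d >= tolerance_mm]
    pct_within = 100.0 * (len(diffs) - len(flagged)) / len(diffs)
    return pct_within, flagged
```

Flagged pairs can then be re-measured before analysis, which is cheaper than duplicating every measurement.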
Accuracy
To test accuracy, we measured 100 VASs using both a traditional micrometer and the computer program. The Pearson correlation was .9984 (p ≤ .0001) for the 93 non-identical scores. The discrepancy ranged from 0.02 to 10.11 mm with a mean of 0.66 mm (SD = 1.49). Five discrepancy scores were greater than 1 mm. Examination of the discrepant data revealed that all five disagreements were due to misreading of the micrometer, a common error with that method. Steinke (1989) also found errors in ruler measurement due to reverse-encoded scores and data entry mistakes. In summary, these findings demonstrate the accuracy of the software-driven digitizer for scoring VAS data.
Time Efficiency of the Digitizer Program
To test the efficiency of the digitizer, we recorded the time required to measure and enter 100 VASs when the software-driven digitizer or the micrometer method was used. After completing the dig2 measurement and conversion, the same person used a micrometer to measure the VASs; the scores were encoded on data sheets before being entered into a Crunch4 file (Crunch Software Corporation, 1991). A total of 40 minutes was required to score and enter the 100 VASs for 10 VAS variables, including conversion to a Crunch4 database. The time increased to 60 minutes with the micrometer plus keyboard entry and file creation, 50% more time than the digitizer method required (Table II). In a large clinical trial, the cumulative extra time required for micrometer measurement would be burdensome. For example, the digitizer could reduce VAS data scoring and entry time by 8.8 hours in a randomized clinical trial we are conducting in 200 patients with lung cancer. This time savings assumes that the 1,200 VASs collected in the study would be scored and entered into a Crunch file as a set.
Table II.

| Digitizer Procedures | Time* (min) | Micrometer Procedures | Time* (min) |
|---|---|---|---|
| 1. Connecting the digitizer to a computer and accessing the dig2 program. | 3 | 1. Measuring and encoding the 100 VAS scores. | 45 |
| 2. Creating a dig2 file with 11 variables (one subject identification value and 10 VAS variables). | 4 | 2. Creating a Crunch4 file with 11 variables (one subject identification value and 10 VAS variables). | 3 |
| 3. Scoring and simultaneously entering 100 VAS scores into a dig2 file. | 25 | 3. Entering encoded scores into the Crunch4 database. | 12 |
| 4. Converting the dig2 file to an SPSS/PC, ASCII, or DIF file. | 0 | | |
| 5. Writing the ASCII file to a Crunch4 file. | 8 | | |
| Total | 40 | Total | 60 |

*Rounded to the nearest minute.
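As a quick check of the reported 50% figure, the per-100-VAS totals from Table II can be compared directly. This sketch simply encodes the table's line items; it makes no assumption about how the times scale to larger studies.

```python
# Per-100-VAS timings from Table II (minutes), summed from the line items.
DIGITIZER_MIN_PER_100 = 3 + 4 + 25 + 0 + 8   # = 40 min total
MICROMETER_MIN_PER_100 = 45 + 3 + 12         # = 60 min total

def extra_time_pct(dig=DIGITIZER_MIN_PER_100, mic=MICROMETER_MIN_PER_100):
    """Percent more time the micrometer method takes relative to the digitizer."""
    return 100.0 * (mic - dig) / dig
```

With the Table II totals, `extra_time_pct()` returns 50.0, matching the text.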
Conclusions and Recommendations
Testing the intra-rater reliability of the dig2 software-driven digitizer system indicated exceptional stability and reproducibility for VAS measurement and data entry. We conclude that duplicate (double) measurement is not necessary when using this automated system, further reducing the personnel costs associated with use of VASs. Although testing the inter-rater reliability revealed small variation in scores obtained by two persons, the differences reflected acceptable measurement error. The variation can be reduced by applying standard procedures, training raters, and identifying possible sources of error in making the measurements. For example, scoring rules might include placing the cursor cross-hairs on the left side of the subject's mark regardless of how thick the mark is, inspecting the scores immediately after scoring to identify operational errors, and not moving the data sheet during scoring.
Comparing the digitizer with the micrometer for scoring VASs indicated excellent accuracy for both methods and fewer errors when the digitizer was used. Use of the digitizer minimized errors in measurement technique, encoding, and entering the data into a computer file. The time required to use the digitizer was less than that required for the micrometer. Similar time savings are likely when comparing digitizer tablet scoring with plastic ruler scoring: placing the cursor takes about as long as placing a ruler, but additional time would be required to encode and enter ruler-measured scores. The dig2 file can easily be converted to an SPSS/PC, ASCII, or DIF file. Personnel salary savings would offset the cost of the equipment in many studies. Therefore, we strongly recommend use of the computerized, software-driven digitizer tablet for scoring and entering VAS scores rather than traditional micrometer measurement.
Acknowledgments
This research was supported by a grant from the National Cancer Institute (5 R29 CA62477-02) and a professorship award from the American Cancer Society (DW, principal investigator). Equipment and software purchase was supported by the Department of Biobehavioral Nursing and Health Systems, University of Washington.
Contributor Information
Hsiu-Ying Huang, University of Washington, School of Nursing.
Diana J. Wilkie, University of Washington, Department of Biobehavioral Nursing and Health Systems.
Donna L. Berry, University of Washington, Department of Biobehavioral Nursing and Health Systems.
References
- Cline ME, Herman J, Shaw E, Morton RD. Standardization of the Visual Analogue Scale. Nursing Research. 1992;41(6):378–380.
- Crunch Software Corporation. Reference Manual: Crunch Statistical Package. Oakland, CA: Author; 1991.
- Huskisson EC. Measurement of pain. Lancet. 1974;2:1127–1131. doi:10.1016/s0140-6736(74)90884-8.
- Jensen MP, Karoly P, Braver S. The measurement of clinical pain intensity: A comparison of six methods. Pain. 1986;27:117–126. doi:10.1016/0304-3959(86)90228-9.
- Scott J, Huskisson EC. Graphic representation of pain. Pain. 1976;2:175–184.
- Sriwatanakul K, Kelvie W, Lasagna L, Calimlim JF, Weis OF, Mehta G. Studies with different types of visual analog scales for measurement of pain. Clinical Pharmacology and Therapeutics. 1983;34(2):234–239. doi:10.1038/clpt.1983.159.
- Steinke D. Unpublished Manuscript for Dig2 Software. Tucson, AZ: Author; 1993.
- Steinke N. The digitizer tablet: A faster yet reliable method of measuring Visual Analogue Scales. Western Institute of Nursing Conference; San Diego; 1989. p. 184.
- Summagraphics Corporation. SummaSketch II Series Drives/Utilities User Guide. Seymour, CT: Author; 1990.
- Tesler MD, Savedra MC, Holzemer WL, Wilkie DJ, Ward JA, Paul SM. The word-graphic rating scale as a measure of children and adolescents' pain intensity. Research in Nursing & Health. 1991;14:361–371. doi:10.1002/nur.4770140507.
- Wilkie DJ, Keefe FJ. Coping strategies of patients with lung cancer-related pain. The Clinical Journal of Pain. 1991;7:292–299. doi:10.1097/00002508-199112000-00007.