Author manuscript; available in PMC: 2013 Jun 20.
Published in final edited form as: Nurs Res. 1996 Nov-Dec;45(6):370–372. doi: 10.1097/00006199-199611000-00015

A Novel Approach to Score and Enter Visual Analogue Scale Data: Use of a Computerized Digitizer Tablet

Hsiu-Ying Huang 1, Diana J Wilkie 2, Donna L Berry 3
PMCID: PMC3687081  NIHMSID: NIHMS439070  PMID: 8941313

Introduction

Numerous studies have evaluated the psychometric properties of the visual analogue scale (VAS) for measuring degree of pain or pain relief, concluding that the VAS is a simple, reliable, reproducible, valid, and sensitive tool (e.g., Huskisson, 1974; Jensen, Karoly, & Braver, 1986; Scott & Huskisson, 1976; Sriwatanakul, et al., 1983). Although the VAS has several advantages in measuring pain intensity and pain relief, Cline and colleagues (1992) described several methodological issues that need to be considered before using the tool in research or clinical practice. One of the unresolved issues is cumbersome scoring.

A VAS score is determined by measuring the distance along a straight line from the end indicating zero to the mark placed by the respondent. Researchers have used a micrometer (Tesler, et al., 1991; Wilkie & Keefe, 1991), a clear ruler (Scott & Huskisson, 1976), or a transparent scoring template (Cline, Herman, Shaw, & Morton, 1992) to measure the VAS lines. The large amount of time required for line measurement, data encoding, and data entry has been the major impediment to using the VAS. Computerized, automated measurement and data entry of VAS scores provide a practical solution and are essential to the efficient use of VAS data in contemporary clinical research.

The purposes of this paper are to introduce a computerized, software-driven digitizer tablet for scoring and entering VAS data; to describe testing of the intra- and inter-rater reliability and the accuracy of the digitizer technique; and to highlight the time efficiency of the digitizer program in scoring and entering VAS data. Conclusions and recommendations for research applications of the procedure are provided.

Description of the Digitizer Tablet and the Computer Software

The computerized digitizer system consists of a DOS software program, dig2 (Steinke, 1993), and the SummaSketch II digitizer tablet (Model MM1201). The resolution of the MM1201 digitizer tablet is up to 40 lines per mm, indicating that the smallest distance it can distinguish between two points is 0.025 mm. The standard accuracy of the digitizer tablet is ±0.381 mm, as reported by Summagraphics Corporation (1990). To operate the digitizer tablet and process VAS data, the dig2 software must be installed on a hard drive before the digitizer tablet is connected (Steinke, 1993).

To measure the VAS, the data sheet should be placed within the active area of the digitizer tablet (297 mm × 297 mm). The VAS score is measured by the distance a hand-held cursor moves over the digitizer tablet and is automatically entered into a pre-created data file. The detailed procedure for setting the program and data entry is provided in Table I. In addition to digitizer tablet entry, keyboard data entry is an option. A file with a maximum of 512 variables and 4,096 subjects can be created with the dig2 software. The cost in 1993 for the VAS computer scoring system was $384 for the digitizer tablet and $100 for the dig2 software.
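
To make the resolution figures concrete, the short sketch below (in Python, with hypothetical names that are not part of dig2 or the Summagraphics driver) converts raw tablet resolution units into millimetres, showing why a resolution of 40 lines per mm corresponds to a smallest reportable distance of 0.025 mm.

```python
# Illustrative sketch only; counts_to_mm and the example values are hypothetical
# and are not taken from the dig2 software or the tablet's documentation.
RESOLUTION_LINES_PER_MM = 40       # SummaSketch II MM1201 resolution
STATED_ACCURACY_MM = 0.381         # manufacturer's stated accuracy

def counts_to_mm(counts: int) -> float:
    """Convert a count of resolution units into millimetres."""
    return counts / RESOLUTION_LINES_PER_MM

print(counts_to_mm(1))       # 0.025 mm: the smallest distance the tablet resolves
print(counts_to_mm(2740))    # 68.5 mm: e.g., a mark 68.5 mm from the zero end
```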

Table I.

Procedure for accessing the dig2 program and VAS data entry.

A. Connecting the digitizer to a computer:
  1. Attach the digitizer cursor to the digitizer tablet by plugging the puck cursor into the tablet connection labeled "cursor/stylus".

  2. Connect the digitizer tablet to a DOS personal computer (with the dig2 program pre-installed) using the supplied power cable; the connector fits onto a 9-pin male port of the computer.

  3. Push the small black switch next to the connection labeled "I/O/PWR" to turn on the power and activate the digitizer tablet.


B. Accessing dig2 program and creating a dig2 file:

  1. In DOS, change to the directory on the C drive where the dig2 software was pre-installed and start the program: at the C:\> prompt, type cd dig and press Enter; then, at the C:\dig> prompt, type dig2 filename and press Enter (the filename should not exceed eight characters).

  2. In the dig2 main menu, choose "Option 1: Edit variable control file" to create the variables.

    # Enter the variable name as indicated on the screen.

    # Format the variable by typing fa.b (a = the number of digits plus one for the decimal point; b = the number of digits after the decimal point; e.g., f6.2 means the score has three digits before the decimal point, a decimal point, and two digits after it, such as 100.00).

    # Enable the digitizer for the variable in the indicated place.

    # Set the minimum and maximum data values (e.g., 0 as the minimum and 100 as the maximum).

    Create the next variable by typing "ctrl-A". Repeat the steps marked with #.


C. Scoring and entering data:

  1. Organize the data sheets by subject identification number (SID).

  2. In the main menu, choose "Option 2: Enter/Edit data."

    Enter the SID as requested. Place the data sheet on the active area of the tablet. Place the intersection of the cursor cross-hairs on the left end of the VAS line, which is defined as no pain or zero. Push the green button to activate the measurement. Move the cursor without moving the data sheet. Place the hair line on the left edge of the mark made by the respondent. Push the same green button to signal the end of the measurement. The score will be shown in the parentheses ( ) corresponding to the variable. Continue to measure the other variables for the same SID, making sure each score corresponds to the appropriate variable, especially in a file with several variables. After finishing all variables for the subject, press Esc to return to the main menu. (A brief sketch following this table illustrates the distance calculation performed at this step.)

  3. Repeat Step 2 to score the VASs from another subject.


D. Converting the dig2 database to an SPSS/PC, ASCII, or DIF database:

  1. In the main menu, choose "Option 3: Write data to formatted file." Then select the data output option, such as ASCII columns. The data will automatically be written to the selected format (e.g., a filename.asc file for an ASCII database).


E. Writing the ASCII database to a Crunch4 database:

  1. In the Crunch4 program control menu, select "build." Build a new output file with the variable attributes defined. Return to the control menu.

  2. In the control menu, choose "datarw" to read data from the ASCII database. In datarw "Option 2: Input file," enter the name of the ASCII file to be read. Then choose "Option 4: Variables to be read/written" and fill in the beginning line, ending line, and decimal point position to indicate the column position of each variable in the ASCII database. After finishing these steps, choose "Option 1: Begin reading data." The Crunch4 database will be created for further data analysis.
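
The core of the scoring step in section C of Table I is a simple distance calculation: the score is the distance between the digitized position of the zero end of the line and the digitized position of the respondent's mark, checked against the minimum and maximum values defined for the variable. The sketch below (Python, with hypothetical names; dig2 itself performs the equivalent calculation internally) illustrates that calculation and the f6.2-style output format described in section B.

```python
# A minimal sketch of the scoring step in Table I, section C. All names are
# hypothetical; dig2 performs the equivalent calculation internally.
from math import hypot

def vas_score_mm(zero_end, mark, minimum=0.0, maximum=100.0):
    """Distance in mm between two digitized points, checked against the
    minimum and maximum values defined for the variable."""
    (x0, y0), (x1, y1) = zero_end, mark
    score = hypot(x1 - x0, y1 - y0)
    if not minimum <= score <= maximum:
        raise ValueError(f"score {score:.2f} mm is outside {minimum}-{maximum} mm")
    return score

# Example: the respondent's mark lies 68.5 mm to the right of the zero end.
score = vas_score_mm(zero_end=(10.0, 50.0), mark=(78.5, 50.0))
print(f"{score:6.2f}")   # ' 68.50', analogous to the f6.2 format in section B
```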

Reliability and Accuracy of the Digitizer Program

Intra-rater reliability

To test the stability and reproducibility of the digitizer measurement, 138 horizontal VASs were measured to 0.01 mm using the digitizer and dig2 software. The 138 VASs were then measured again by the same rater using the same method. The Pearson correlation between the two measurement sets was .9999 (p ≤ .0001) for the 117 non-identical scores (21 scores were exactly the same; most represented zero or a missing value). The discrepancy between the two data sets varied from 0.01 to 0.85 mm, with a mean of 0.25 mm (SD = 0.19 mm) for the non-identical scores. This comparison demonstrated high intra-rater reliability, supporting the stability and reproducibility of digitizer and software measurement and encoding when performed by the same person. Steinke (1989) reported similar findings (r = .9998 to .9999, p < .001) for test-retest by the same investigator, separated by four months, using a similar computerized digitizer program (dig software and a Summagraphics digitizer tablet, Model MM1103).
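
The same simple comparison underlies the intra-rater, inter-rater, and accuracy analyses reported here: two sets of scores for the same VASs are correlated, and the discrepancies between non-identical pairs are summarized. The sketch below (Python, with hypothetical names and made-up example scores; the original analyses used the study's own data and statistical software) shows one way to compute those statistics.

```python
# Sketch of the comparison reported above: given two sets of scores for the
# same VASs (two passes by one rater, two raters, or digitizer vs. micrometer),
# compute the Pearson correlation and discrepancy statistics for the
# non-identical pairs. Names and example values are hypothetical.
from math import sqrt
from statistics import mean, stdev

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def compare(first_pass, second_pass):
    pairs = [(a, b) for a, b in zip(first_pass, second_pass) if a != b]
    x, y = zip(*pairs)
    diffs = [abs(a - b) for a, b in pairs]
    return {
        "n_non_identical": len(pairs),
        "pearson_r": pearson_r(x, y),
        "mean_discrepancy_mm": mean(diffs),
        "sd_discrepancy_mm": stdev(diffs),
        "max_discrepancy_mm": max(diffs),
    }

# Example with made-up scores (mm):
print(compare([0.0, 12.34, 55.10, 99.87], [0.0, 12.31, 55.32, 99.90]))
```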

Inter-rater reliability

To test the reliability of the measurement program when operated by different persons, 510 VASs were measured by a trained high school student with prior basic computer knowledge. The 1.5-hour training session included a brief introduction to the digitizer equipment, connecting the equipment to a computer, accessing an existing file, and data entry. After training, the student was able to independently measure the 510 VASs, which also were measured by one of the authors using the measurement program. The Pearson correlation was .9900 (p ≤ .0001) for the 499 non-identical scores. Based on the 499 scores, the mean discrepancy was 0.56 mm (SD = 4.47 mm). The maximum discrepancy was 99.80 mm, which was due to operator error; 96% of the discrepancies were less than 1 mm. Although these findings demonstrate greater variation between two raters using the dig2 program and the digitizer tablet than when one rater performs the measurement, the inter-rater correlation was very strong. Measurement rules should be standardized and likely sources of measurement error clarified to improve inter-rater reliability. For example, we found that a rule is needed for consistently placing the cross-hairs of the cursor on the left end of the marked line, since the thickness of the marked line and inconsistent placement can affect the measurement.

Accuracy

To test accuracy, we measured 100 VASs using a traditional micrometer and the computer program. The Pearson correlation was .9984 (p ≤ .0001) for the 93 non-identical scores. The discrepancy ranged from 0.02 to 10.11 mm, with a mean of 0.66 mm (SD = 1.49 mm). Five discrepancy scores were greater than 1 mm. Examination of the discrepant data revealed that all five disagreements were due to misreading of the micrometer, a common error with that method. Steinke (1989) also found errors in ruler measurement due to reverse-encoded scores and data entry mistakes. In summary, these findings demonstrate the accuracy of the software-driven digitizer for scoring VAS data.

Time Efficiency of the Digitizer Program

To test the efficiency of using the digitizer, we recorded the time required to measure and enter the 100 VASs when the software-driven digitizer or the micrometer method was used. After completing the dig2 measurement and conversion, the same person used a micrometer to measure the VASs; the scores were encoded on data sheets before being entered into a Crunch4 file (Crunch Software Corporation, 1991). A total of 40 minutes was required to score and enter 100 VASs for 10 VAS variables, including conversion to a Crunch4 database. The time increased to 60 minutes with the micrometer plus keyboard entry and file creation, 50% more time than required with the digitizer method (Table II; a brief sketch after the table totals these figures). In a large clinical trial, the cumulative additional time required for micrometer measurement would be burdensome. For example, the digitizer could reduce VAS data scoring and entry by 8.8 hours in a randomized clinical trial we are conducting in 200 patients with lung cancer. This time savings assumes that the 1,200 VASs collected in the study would be scored and entered into a Crunch file as a set.

Table II.

Comparison of time required to use the digitizer and a micrometer to measure 100 VASs and enter scores for 10 variables.

Digitizer procedures (time in minutes*):
  1. Connecting the digitizer to a computer and accessing the dig2 program: 3
  2. Creating a dig2 file with 11 variables (one subject identification value and 10 VAS variables): 4
  3. Scoring and simultaneously entering 100 VAS scores into a dig2 file: 25
  4. Converting the dig2 file to an SPSS/PC, ASCII, or DIF file: 0
  5. Writing the ASCII file to a Crunch4 file: 8
  Total: 40

Micrometer procedures (time in minutes*):
  1. Measuring and encoding the 100 VAS scores: 45
  2. Creating a Crunch4 file with 11 variables (one subject identification value and 10 VAS variables): 3
  3. Entering the encoded scores into the Crunch4 database: 12
  Total: 60

* Rounded to the nearest minute.
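
As a rough check on Table II, the sketch below (Python; the step labels paraphrase the table, and any extrapolation beyond a 100-VAS batch is an assumption that depends on how data sheets are grouped) totals the times for each method and expresses the difference.

```python
# Totals the per-method times from Table II for scoring and entering 100 VASs
# (10 variables) and reports the difference. Figures are taken from the table;
# nothing here extrapolates to a whole study.
digitizer_min = {
    "connect tablet and start dig2": 3,
    "create dig2 file (11 variables)": 4,
    "score and enter 100 VASs": 25,
    "convert dig2 file to ASCII": 0,
    "write ASCII file to Crunch4": 8,
}
micrometer_min = {
    "measure and encode 100 VASs": 45,
    "create Crunch4 file (11 variables)": 3,
    "enter encoded scores into Crunch4": 12,
}

dig_total = sum(digitizer_min.values())   # 40 minutes
mic_total = sum(micrometer_min.values())  # 60 minutes
print(dig_total, mic_total)
print(f"micrometer takes {100 * (mic_total - dig_total) / dig_total:.0f}% longer")
```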

Conclusions and Recommendations

Testing the intra-rater reliability of the dig2 software-driven digitizer system indicated exceptional stability and reproducibility for VAS measurement and data entry. We conclude that duplicate or double measurement is not necessary when using this automated system, further reducing the personnel costs associated with use of VASs. Although testing the inter-rater reliability revealed small variation in scores obtained by two persons, the differences reflected acceptable measurement error. The variation can be reduced by applying standard procedures, training raters, and identifying possible sources of error in making the measurements. For example, scoring rules might include placing the cursor hair line on the left side of the subject's mark no matter how thick the mark is, inspecting the scores immediately after scoring to identify operational errors, and not moving the data sheet during scoring.

Comparing the use of the digitizer to the use of the micrometer for scoring VASs indicated excellent accuracy for both methods and fewer errors when the digitizer was used. Use of the digitizer minimized errors in measurement technique, encoding, and entering the data into a computer file. The time required to use the digitizer was less than that required to use the micrometer. Similar time savings are likely when comparing digitizer tablet scoring to plastic ruler scoring: placement of the cursor requires time similar to placement of a ruler, but additional time would be required to encode and enter ruler-measured scores. The dig2 file can easily be converted to an SPSS/PC, ASCII, or DIF file. Personnel salary savings would offset the cost of the equipment in many studies. Therefore, we strongly recommend use of the computerized, software-driven digitizer tablet for scoring and entering VAS scores, rather than traditional micrometer measurement.

Acknowledgments

This research was supported by a grant from the National Cancer Institute (5 R29 CA62477-02) and a professorship award from the American Cancer Society (DW, principal investigator). Equipment and software purchase was supported by the Department of Biobehavioral Nursing and Health Systems, University of Washington.

Contributor Information

Hsiu-Ying Huang, University of Washington, School of Nursing.

Diana J. Wilkie, University of Washington, Department of Biobehavioral Nursing and Health Systems.

Donna L. Berry, University of Washington, Department of Biobehavioral Nursing and Health Systems.

References

  1. Cline ME, Herman J, Shaw E, Morton RD. Standardization of the Visual Analogue Scale. Nursing Research. 1992;41(6):378–380. [PubMed] [Google Scholar]
  2. Crunch Software Corporation. Reference Manual: Crunch Statistical Package. Oakland, CA: Crunch Software Program; 1991. [Google Scholar]
  3. Huskisson EC. Measurement of pain. Lancet. 1974;2:1127–1131. doi: 10.1016/s0140-6736(74)90884-8. [DOI] [PubMed] [Google Scholar]
  4. Jensen MP, Karoly P, Braver S. The measurement of clinical pain intensity: A comparison of six methods. Pain. 1986;27:117–126. doi: 10.1016/0304-3959(86)90228-9. [DOI] [PubMed] [Google Scholar]
  5. Scott J, Huskisson EC. Graphic representation of pain. Pain. 1976;2:175–184. [PubMed] [Google Scholar]
  6. Sriwatanakul K, Kelvie W, Lasagna L, Calimlim JF, Weis OF, Mehta G. Studies with different types of visual analog scales for measurement of pain. Clinical Pharmacology and Therapeutics. 1983;34(2):234–239. doi: 10.1038/clpt.1983.159. [DOI] [PubMed] [Google Scholar]
  7. Steinke D. Unpublished Manuscript for Dig2 Software. Tucson, AZ: Author; 1993. [Google Scholar]
  8. Steinke N. The digitizer tablet: A faster yet reliable method of measuring Visual Analogue Scales; Western Institute of Nursing Conference; San Diego. 1989. p. 184. [Google Scholar]
  9. Summagraphics Corporation. SummaSketch II Series Drives/Utilities User Guide. Seymour, CT: Author; 1990. [Google Scholar]
  10. Tesler MD, Savedra MC, Holzemer WL, Wilkie DJ, Ward JA, Paul SM. The word-graphic rating scale as a measure of children and adolescents’ pain intensity. Research in Nursing & Health. 1991;14:361–371. doi: 10.1002/nur.4770140507. [DOI] [PubMed] [Google Scholar]
  11. Wilkie DJ, Keefe FJ. Coping strategies of patients with lung cancer-related pain. The Clinical Journal of Pain. 1991;7:292–299. doi: 10.1097/00002508-199112000-00007. [DOI] [PubMed] [Google Scholar]
