J Med Internet Res. 2020 Jul 9;22(7):e15770. doi: 10.2196/15770

Table 2.

The Checklist for Reporting Results of Internet E-Surveys (CHERRIES).

Item category and checklist item: Explanation
Design

Describe survey design: A randomized controlled trial testing the impact of using the GRASP framework on clinicians' and health care professionals' decisions in selecting predictive tools for CDS^a, using a convenience sample of invited participants
IRB^b approval and informed consent process

IRB approval: The experiment was approved by the Human Research Ethics Committee, Faculty of Medicine and Health Sciences, Macquarie University, Sydney, Australia

Informed consent: Informed consent was presented at the beginning of the survey, and participants had to agree before taking it. The consent covered the expected duration of the survey, the types of data collected and how they would be stored, the investigators, and the purpose of the study

Data protection: Collected personal information was protected through the Macquarie University account on the Qualtrics survey platform
Development and pretesting

Development and testing: The first author (MK) developed the survey and pilot tested the questions and the survey's usability before deploying it to the participants
Recruitment process and description of the sample having access to the questionnaire

Open survey versus closed survey: This was a closed survey; only invited participants had access to complete the survey

Contact mode: Initial contact was made via email with all invited participants. Only those who agreed to participate completed the web-based survey

Advertising the survey: The survey was not advertised. Only invited participants were informed of the study and completed the survey
Survey administration

Web or email: The survey was developed on the Qualtrics survey platform, and the link to the web-based survey was sent to invited participants via email. Responses were automatically collected through the Qualtrics platform and then retrieved by the investigators for analysis

Context: Only invited participants were informed of the study via email

Mandatory/voluntary: The survey was not mandatory for invited participants

Incentives: The only incentive was that participants could request to be acknowledged in the published study. Participants were also informed of the results of the survey once the analysis was complete

Time/date: Data were collected over 6 weeks, from March 11 to April 21, 2019

Randomization of items or questionnaires: To prevent biases, items were randomized. Figure 2 shows the survey workflow and the randomization of the 4 scenarios (see the sketch below, after the review step item)

Adaptive questioning: Four scenarios were used and randomized, but they were not conditionally displayed

Number of items: 5 to 8 items per page

Number of screens (pages): The questionnaire was distributed over 5 pages

Completeness check: Completeness checks were performed after the questionnaire was submitted, and mandatory items were highlighted. Items provided a nonresponse option ("not applicable" or "don't know")

Review step: Respondents were able to review and change their answers before submitting
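As noted under the randomization item above, every respondent saw all 4 scenarios in a shuffled order, with no conditional display. For illustration only, a minimal Python sketch of that step, with hypothetical scenario labels (the paper does not name the scenarios in this table):

    import random

    # Hypothetical labels; the paper's 4 scenarios are not named here.
    SCENARIOS = ["Scenario 1", "Scenario 2", "Scenario 3", "Scenario 4"]

    def randomized_order(rng: random.Random) -> list[str]:
        # Every respondent sees all 4 scenarios in a shuffled order;
        # no adaptive (conditional) display is applied.
        order = SCENARIOS[:]
        rng.shuffle(order)
        return order

    print(randomized_order(random.Random()))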
Response rates

Unique site visitor: We used IP^c addresses to check for unique survey visitors

View rate (ratio of unique survey visitors/unique site visitors): Only invited participants had access to the survey. Survey visitors included those who completed the survey and those who started it but did not complete it or gave incomplete answers

Participation rate (ratio of unique visitors who agreed to participate/unique first survey page visitors): The participation rate was 90% (218 of the 242 invited participants who visited the first page agreed to participate)

Completion rate (ratio of users who finished the survey/users who agreed to participate): The completion rate was 91% (198 of the 218 participants who agreed to participate completed the survey)
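The two rates above reduce to simple ratios; a worked check in Python (the variable names are ours, not the paper's):

    invited_first_page_visitors = 242  # unique visitors to the first survey page
    agreed_to_participate = 218
    completed_survey = 198

    participation_rate = agreed_to_participate / invited_first_page_visitors  # 218/242
    completion_rate = completed_survey / agreed_to_participate                # 198/218

    print(f"Participation rate: {participation_rate:.0%}")  # 90%
    print(f"Completion rate: {completion_rate:.0%}")        # 91%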
Preventing multiple entries from the same individual

Cookies used: Cookies were not used to assign a unique user identifier; instead, we used users' computer IP addresses to identify unique users

IP address check: The IP addresses of participants' computers were used to identify potential duplicate entries from the same user. Only 2 duplicate entries were identified, and they were eliminated before analysis

Log file analysis: We also checked the demographic information provided by all participants to confirm that the 2 identified duplicates were the only incidents

Registration: After data collection, user IP addresses and other demographic data were used to eliminate duplicate entries before analysis. The most recent entries were used in the analysis
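A minimal sketch of the deduplication described above: keep the most recent entry per IP address. The record fields are hypothetical, not Qualtrics's actual export schema:

    from datetime import datetime

    def deduplicate_by_ip(records: list[dict]) -> list[dict]:
        # Keep only the most recent entry per IP address.
        latest: dict[str, dict] = {}
        for rec in records:
            ip = rec["ip"]
            if ip not in latest or rec["timestamp"] > latest[ip]["timestamp"]:
                latest[ip] = rec
        return list(latest.values())

    records = [
        {"ip": "203.0.113.7", "timestamp": datetime(2019, 3, 12, 9, 0)},
        {"ip": "203.0.113.7", "timestamp": datetime(2019, 3, 13, 10, 0)},  # duplicate; this one is kept
        {"ip": "198.51.100.4", "timestamp": datetime(2019, 3, 14, 11, 0)},
    ]
    print(len(deduplicate_by_ip(records)))  # 2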
Analysis

Handling of incomplete questionnaires: Only completed surveys were used in the analysis

Questionnaires submitted with an atypical timestamp: Task completion time was captured, but no specific timeframe was enforced. Because the survey allowed users to re-enter after a break (for example, the next day), statistical outliers were excluded from the analysis, as discussed in the paper (see the sketch below)

Statistical correction: No statistical correction was required
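The table does not specify which outlier rule was applied to completion times; as one possibility only, a sketch using Tukey's 1.5*IQR fences:

    import statistics

    def exclude_outliers_iqr(times: list[float]) -> list[float]:
        # Tukey fences at 1.5*IQR; this particular rule is our assumption,
        # since the table only says statistical outliers were excluded.
        q1, _, q3 = statistics.quantiles(times, n=4)
        iqr = q3 - q1
        lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
        return [t for t in times if lo <= t <= hi]

    # Example: 1440 min corresponds to a respondent who returned the next day.
    times_min = [12.0, 15.0, 14.0, 13.0, 16.0, 1440.0]
    print(exclude_outliers_iqr(times_min))  # drops 1440.0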

^a CDS: clinical decision support.

^b IRB: institutional review board.

^c IP: internet protocol.