
Checklist for Reporting Results of Internet E-Surveys (CHERRIES)

Each entry below lists the Item Category (where one applies), the Checklist Item, the CHERRIES Explanation, and the Location in paper.

Item Category: Design
Checklist Item: Describe survey design
Explanation: Describe target population, sample frame. Is the sample a convenience sample? (In “open” surveys this is most likely.)
Location in paper: Please see Methods, 1st paragraph. A convenience sample was not used.

Item Category: IRB (Institutional Review Board) approval and informed consent process
Checklist Item: IRB approval
Explanation: Mention whether the study has been approved by an IRB.
Location in paper: Please see Methods, 5th paragraph.

Checklist Item: Informed consent
Explanation: Describe the informed consent process. Where were the participants told the length of time of the survey, which data were stored and where and for how long, who the investigator was, and the purpose of the study?
Location in paper: Please see Methods, 2nd paragraph. Participants received an email describing the study (for informed consent) with a hyperlink to the questionnaire. It was a voluntary survey. The participants were told the length of time of the survey (15-30 minutes). The investigators were named. The purpose of the study and all other items mentioned earlier in the paragraph were explained in the invitation email.

Checklist Item: Data protection
Explanation: If any personal information was collected or stored, describe what mechanisms were used to protect it from unauthorized access.
Location in paper: Please see Methods, 5th paragraph.

Item Category: Development and pre-testing
Checklist Item: Development and testing
Explanation: State how the survey was developed, including whether the usability and technical functionality of the electronic questionnaire had been tested before fielding the questionnaire.
Location in paper: Please see Methods, 2nd paragraph.

Item Category: Recruitment process and description of the sample having access to the questionnaire
Checklist Item: Open survey versus closed survey
Explanation: An “open survey” is a survey open to every visitor of a site, while a closed survey is only open to a sample which the investigator knows (password-protected survey).
Location in paper: Please see Methods, 1st paragraph. The survey was only open to a sample which the investigator knew.

Checklist Item: Contact mode
Explanation: Indicate whether or not the initial contact with the potential participants was made on the Internet. (Investigators may also send out questionnaires by mail and allow for Web-based data entry.)
Location in paper: Please see Methods, 1st paragraph.
Checklist Item: Advertising the survey
Explanation: How/where was the survey announced or advertised? Some examples are offline media (newspapers), or online (mailing lists; if yes, which ones?) or banner ads (where were these banner ads posted and what did they look like?). It is important to know the wording of the announcement as it will heavily influence who chooses to participate. Ideally the survey announcement should be published as an appendix.
Location in paper: N/A (closed survey).

Item Category: Survey administration
Checklist Item: Web/E-mail
Explanation: State the type of e-survey (eg, one posted on a Web site, or one sent out through e-mail). If it is an e-mail survey, were the responses entered manually into a database, or was there an automatic method for capturing responses?
Location in paper: Please see Methods, 1st and 2nd paragraphs.

Checklist Item: Context
Explanation: Describe the Web site (for mailing list/newsgroup) in which the survey was posted. What is the Web site about, who is visiting it, what are visitors normally looking for? Discuss to what degree the content of the Web site could pre-select the sample or influence the results. For example, a survey about vaccination on an anti-immunization Web site will have different results from a Web survey conducted on a government Web site.
Location in paper: Please see Methods, 2nd paragraph. We used an online survey tool (FluidSurveys), but the survey was not posted on a Web site (email invitation).

Checklist Item: Mandatory/voluntary
Explanation: Was it a mandatory survey to be filled in by every visitor who wanted to enter the Web site, or was it a voluntary survey?
Location in paper: Please see Methods, 2nd paragraph. It was a voluntary survey (informed consent).
Checklist Item: Incentives
Explanation: Were any incentives offered (eg, monetary, prizes, or non-monetary incentives such as an offer to provide the survey results)?
Location in paper: Please see Methods, 1st paragraph.

Checklist Item: Time/Date
Explanation: In what timeframe were the data collected?
Location in paper: Please see Methods, 2nd paragraph.

Checklist Item: Randomization of items or questionnaires
Explanation: To prevent biases, items can be randomized or alternated.
Location in paper: Please see Methods, 2nd paragraph. Due to the nature of this survey, including the use of adaptive questioning, randomization of items was not performed.
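
Item randomization was not performed in this survey; purely as an illustration of the technique the checklist describes, the sketch below shuffles a per-respondent item order (the item identifiers are hypothetical).

```python
import random

# Hypothetical item identifiers; real names would come from the survey tool.
ITEMS = ["q1_profession", "q2_experience", "q3_barriers", "q4_comments"]

def randomized_order(item_ids, seed=None):
    """Return a per-respondent random ordering of questionnaire items."""
    rng = random.Random(seed)   # optional seed for reproducibility
    shuffled = list(item_ids)   # copy so the master list is left untouched
    rng.shuffle(shuffled)
    return shuffled

print(randomized_order(ITEMS))
```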

Checklist Item: Adaptive questioning
Explanation: Use adaptive questioning (certain items are only conditionally displayed, based on responses to other items) to reduce the number and complexity of the questions.
Location in paper: Please see Methods, 2nd paragraph.
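
A minimal sketch of adaptive questioning as described above, assuming hypothetical item names and a dictionary of answers collected so far; an item is displayed only when its condition on earlier answers is met (this is an illustration, not the branching logic of the survey tool used in the study).

```python
# Each item carries an optional "show_if" predicate over the answers given so far.
# Item identifiers and conditions are hypothetical, for illustration only.
QUESTIONS = [
    {"id": "uses_telehealth", "text": "Do you use telehealth?", "show_if": None},
    {"id": "telehealth_tools", "text": "Which tools do you use?",
     "show_if": lambda answers: answers.get("uses_telehealth") == "yes"},
    {"id": "barriers", "text": "What are the main barriers?",
     "show_if": lambda answers: answers.get("uses_telehealth") == "no"},
]

def visible_questions(answers):
    """Return the unanswered items whose display condition is satisfied."""
    visible = []
    for q in QUESTIONS:
        if q["id"] in answers:
            continue  # already answered
        if q["show_if"] is None or q["show_if"](answers):
            visible.append(q)
    return visible

print([q["id"] for q in visible_questions({"uses_telehealth": "yes"})])
# -> ['telehealth_tools']
```
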
Checklist Item: Number of items
Explanation: What was the number of questionnaire items per page? The number of items is an important factor for the completion rate.
Location in paper: Please see Methods, 2nd paragraph. There were 1 to 4 questions per page, depending on the length of the question.

Checklist Item: Number of screens (pages)
Explanation: Over how many pages was the questionnaire distributed? The number of items is an important factor for the completion rate.
Location in paper: Please see Methods, 2nd paragraph. There were 16 pages (often fewer, given the adaptive questioning).

Checklist Item: Completeness check
Explanation: It is technically possible to do consistency or completeness checks before the questionnaire is submitted. Was this done, and if “yes”, how (usually JavaScript)? An alternative is to check for completeness after the questionnaire has been submitted (and highlight mandatory items). If this has been done, it should be reported. All items should provide a non-response option such as “not applicable” or “rather not say”, and selection of one response option should be enforced.
Location in paper: Please see Methods, 2nd paragraph. Completeness checks were not done. All items provided a non-response option when relevant.
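
Completeness checks were not done here; as an illustration of the post-submission alternative described above, a minimal sketch that flags mandatory items left blank in a submitted response (the field names are hypothetical).

```python
# Hypothetical mandatory items and a submitted response record.
MANDATORY_ITEMS = ["lcsc_name", "profession", "uses_telehealth"]

def missing_mandatory(response):
    """Return mandatory items that are absent or empty, so they can be highlighted."""
    missing = []
    for item in MANDATORY_ITEMS:
        value = response.get(item)
        if value is None or (isinstance(value, str) and value.strip() == ""):
            missing.append(item)
    return missing

submitted = {"lcsc_name": "Example LCSC", "profession": "", "uses_telehealth": "yes"}
print(missing_mandatory(submitted))  # -> ['profession']
```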

Checklist Item: Review step
Explanation: State whether respondents were able to review and change their answers (eg, through a Back button or a Review step which displays a summary of the responses and asks the respondents if they are correct).
Location in paper: Please see Methods, 2nd paragraph. Respondents were able to review and change their answers through a Back button.

Item Category: Response rates
Checklist Item: Unique site visitor
Explanation: If you provide view rates or participation rates, you need to define how you determined a unique visitor. There are different techniques available, based on IP addresses or cookies or both.
Location in paper: Please see Methods, 1st paragraph. 158 respondents were invited. Respondents provided the name of their LCSC. It was thus easy to determine participation rates.

Checklist Item: View rate (ratio of unique survey visitors to unique site visitors)
Explanation: Requires counting unique visitors to the first page of the survey, divided by the number of unique site visitors (not page views!). It is not unusual to have view rates of less than 0.1% if the survey is voluntary.
Location in paper: N/A. The survey was not posted on a Web site (email invitation).

Checklist Item: Participation rate (ratio of unique visitors who agreed to participate to unique first survey page visitors)
Explanation: Count the unique number of people who filled in the first survey page (or agreed to participate, for example by checking a checkbox), divided by visitors who visit the first page of the survey (or the informed consent page, if present). This can also be called the “recruitment” rate.
Location in paper: N/A.

Checklist Item: Completion rate (ratio of users who finished the survey to users who agreed to participate)
Explanation: The number of people submitting the last questionnaire page, divided by the number of people who agreed to participate (or submitted the first survey page). This is only relevant if there is a separate “informed consent” page or if the survey goes over several pages. This is a measure of attrition. Note that “completion” can involve leaving questionnaire items blank. This is not a measure of how completely questionnaires were filled in. (If you need a measure for this, use the word “completeness rate”.)
Location in paper: Please see Results, 1st paragraph. This was named the response rate in the paper. The denominator corresponded to the number of invited health professionals.
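
The three rates above are simple ratios; a minimal sketch with hypothetical counts (in this study only the completion rate, reported as the response rate, applies):

```python
def rate(numerator, denominator):
    """Return numerator/denominator, or 0.0 when the denominator is zero."""
    return numerator / denominator if denominator else 0.0

# Hypothetical counts, for illustration only.
unique_site_visitors   = 5000   # visitors to the hosting site (open surveys only)
unique_survey_visitors = 400    # unique visitors to the first survey page
agreed_to_participate  = 200    # e.g. checked the informed-consent checkbox
finished_survey        = 150    # submitted the last questionnaire page

view_rate          = rate(unique_survey_visitors, unique_site_visitors)
participation_rate = rate(agreed_to_participate, unique_survey_visitors)
completion_rate    = rate(finished_survey, agreed_to_participate)

print(f"view rate: {view_rate:.1%}")                     # 8.0%
print(f"participation rate: {participation_rate:.1%}")   # 50.0%
print(f"completion rate: {completion_rate:.1%}")         # 75.0%
```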

Item Category: Preventing multiple entries from the same individual
Checklist Item: Cookies used
Explanation: Indicate whether cookies were used to assign a unique user identifier to each client computer. If so, mention the page on which the cookie was set and read, and how long the cookie was valid. Were duplicate entries avoided by preventing users access to the survey twice, or were duplicate database entries having the same user ID eliminated before analysis? In the latter case, which entries were kept for analysis (eg, the first entry or the most recent)?
Location in paper: Please see Methods, 1st paragraph. N/A: cookies were not used because respondents were recruited from an existing sampling frame.

Checklist Item: IP check
Explanation: Indicate whether the IP address of the client computer was used to identify potential duplicate entries from the same user. If so, mention the period of time for which no two entries from the same IP address were allowed (eg, 24 hours). Were duplicate entries avoided by preventing users with the same IP address access to the survey twice, or were duplicate database entries having the same IP address within a given period of time eliminated before analysis? If the latter, which entries were kept for analysis (eg, the first entry or the most recent)?
Location in paper: Please see Methods, 1st paragraph. The IP address was not used to identify duplicate entries. Respondents were recruited from an existing sampling frame. Other information (e.g. name of the LCSC, contact information) made it possible to identify potential duplicate entries.
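
IP-based deduplication was not needed for this survey; as an illustration of the approach the checklist describes (keeping the first entry per identifier within a time window), a sketch assuming each entry carries an identifier such as an IP address and a submission timestamp:

```python
from datetime import datetime, timedelta

def deduplicate(entries, window=timedelta(hours=24)):
    """Keep the first entry per identifier within the given time window.

    Each entry is a dict with an 'id' (e.g. an IP address or respondent ID)
    and a 'submitted_at' datetime; later entries from the same identifier
    inside the window are dropped before analysis.
    """
    kept = []
    last_kept = {}  # identifier -> timestamp of the entry that was kept
    for entry in sorted(entries, key=lambda e: e["submitted_at"]):
        previous = last_kept.get(entry["id"])
        if previous is not None and entry["submitted_at"] - previous < window:
            continue  # duplicate within the window; discard
        last_kept[entry["id"]] = entry["submitted_at"]
        kept.append(entry)
    return kept

# Hypothetical example: two submissions from the same IP address within 24 hours.
entries = [
    {"id": "192.0.2.1", "submitted_at": datetime(2014, 5, 1, 9, 0)},
    {"id": "192.0.2.1", "submitted_at": datetime(2014, 5, 1, 17, 30)},
    {"id": "192.0.2.7", "submitted_at": datetime(2014, 5, 2, 11, 15)},
]
print(len(deduplicate(entries)))  # -> 2
```
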
Checklist Item: Log file analysis
Explanation: Indicate whether other techniques to analyze the log file for identification of multiple entries were used. If so, please describe.
Location in paper: Please see above.

Checklist Item: Registration
Explanation: In “closed” (non-open) surveys, users need to log in first and it is easier to prevent duplicate entries from the same user. Describe how this was done. For example, was the survey never displayed a second time once the user had filled it in, or was the username stored together with the survey results and later eliminated? If the latter, which entries were kept for analysis (eg, the first entry or the most recent)?
Location in paper: Please see above.

Item Category: Analysis
Checklist Item: Handling of incomplete questionnaires
Explanation: Were only completed questionnaires analyzed? Were questionnaires which terminated early (where, for example, users did not go through all questionnaire pages) also analyzed?
Location in paper: Please see Results, 1st paragraph. All questionnaires received were complete.

Checklist Item: Questionnaires submitted with an atypical timestamp
Explanation: Some investigators may measure the time people needed to fill in a questionnaire and exclude questionnaires that were submitted too soon. Specify the timeframe that was used as a cut-off point, and describe how this point was determined.
Location in paper: N/A. There was no time restriction for this survey.
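
No completion-time cut-off was applied in this survey; a minimal sketch of the practice described above, assuming each response records when it was started and submitted, with a hypothetical five-minute cut-off:

```python
from datetime import datetime

MIN_SECONDS = 5 * 60  # hypothetical cut-off; the checklist asks how this point was determined

def plausible(response):
    """Return True if the time spent on the questionnaire meets the cut-off."""
    spent = (response["submitted_at"] - response["started_at"]).total_seconds()
    return spent >= MIN_SECONDS

responses = [
    {"started_at": datetime(2014, 5, 1, 9, 0), "submitted_at": datetime(2014, 5, 1, 9, 2)},
    {"started_at": datetime(2014, 5, 1, 9, 0), "submitted_at": datetime(2014, 5, 1, 9, 25)},
]
print([plausible(r) for r in responses])  # -> [False, True]
```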

Checklist Item: Statistical correction
Explanation: Indicate whether any methods such as weighting of items or propensity scores have been used to adjust for the non-representative sample; if so, please describe the methods.
Location in paper: Please see Methods, 4th paragraph. No statistical correction was used.
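
No statistical correction was used in this study; as an illustration of one of the methods the checklist mentions, a sketch of simple post-stratification weighting (weight = population share divided by sample share per stratum), with hypothetical strata:

```python
from collections import Counter

def poststratification_weights(sample_strata, population_shares):
    """Return one weight per stratum: population share / sample share."""
    counts = Counter(sample_strata)
    total = len(sample_strata)
    return {
        stratum: population_shares[stratum] / (counts[stratum] / total)
        for stratum in counts
    }

# Hypothetical strata (e.g. respondent profession) and known population shares.
sample = ["physician"] * 60 + ["nurse"] * 40
population = {"physician": 0.4, "nurse": 0.6}

print(poststratification_weights(sample, population))
# physicians are over-represented in this sample, so their weight falls below 1
```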

Eysenbach, G. (2004). Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). Journal of Medical Internet Research, 6(3), e34. doi:10.2196/jmir.6.3.e34. http://www.jmir.org/2004/3/e34/
