Internet Learning Volume 4, Number 1, Spring 2015 | Page 44
Procedure
The survey instrument was administered electronically through a unique URL furnished by a designated contact person at each cooperating institution. Participants received the URL via an e-mail message, a link posted to the home page of the institution's course management system, or an announcement in the online course in which they were enrolled. Data were collected from all cooperating institutions and aggregated into a cumulative data file.
RESULTS
To determine how students’ ratings of each QM statement relate to the point values
assigned by the 2011-2013 edition of the QM rubric, one-sample t-tests were conducted.
Additionally, effect sizes were calculated for each item using Cohen’s d to indicate the
practical significance of the differences. Table 1 shows the survey items that correspond to a
QM indicator assigned a point value of “3 – Essential” on the 2011-2013 QM rubric.
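The statistics reported in Table 1 can be reproduced from the summary values alone. A minimal sketch follows, assuming the comparison value is the rubric's maximum point value of 3 (implied by the "3 – Essential" ranking) and using the standard textbook formulas for a one-sample t statistic and Cohen's d; this is an illustration, not the authors' actual analysis code. The function name `one_sample_t` and the example values (item 1.1 in Table 1) are chosen here for demonstration.

```python
import math

def one_sample_t(mean, sd, n, mu=3.0):
    """One-sample t statistic and Cohen's d from summary statistics.

    t = (mean - mu) / (sd / sqrt(n))   -- distance from mu in standard errors
    d = (mean - mu) / sd               -- distance from mu in SD units
    """
    t = (mean - mu) / (sd / math.sqrt(n))
    d = (mean - mu) / sd
    return t, d

# Item 1.1 from Table 1: N = 3154, Mean = 2.66, SD = 0.60
t, d = one_sample_t(2.66, 0.60, 3154)
```

Computed from the rounded summary values, t is approximately -31.8 and d approximately -0.57, which matches the tabled t = -31.58 and |d| = 0.56 to within rounding of the reported mean and SD.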
Table 1. Comparison of participant ratings to QM point values for items ranked "3 – Essential" by QM

| QM # | QM statement | N | Mean | SD | t | p | Mean Diff. | d |
|------|--------------|------|------|------|--------|--------|-------|------|
| 1.1 | Clear instructions tell me how to get started and how to find various course components. | 3154 | 2.66 | 0.60 | -31.58 | .000** | -0.34 | 0.56 |
| 3.3 | Criteria for how my work & participation will be evaluated are descriptive & specific. | 2984 | 2.52 | 0.64 | -40.42 | .000** | -0.48 | 0.74 |
| 6.3 | Navigation throughout the online components of the course is logical, consistent, and efficient. | 2685 | 2.51 | 0.67 | -37.94 | .000** | -0.49 | 0.73 |
| 3.2 | The grading policy is stated clearly. | 2998 | 2.49 | 0.65 | -43.12 | .000** | -0.51 | 0.79 |
| 3.1 | Assessments measure the stated learning objectives and are consistent with course activities and resources. | 2997 | 2.48 | 0.66 | -43.46 | .000** | -0.52 | 0.79 |
| 2.4 | Instructions on how to meet the … | 3038 | 2.30 | 0.77 | -49.88 | .000** | -0.70 | 0.91 |