Internet Learning Volume 3, Number 2, Fall 2014 | Page 107
a group, or among multiple conversations or
courses. As mentioned earlier, such metrics
could serve as a foundation for content, peer
tutor, or study group recommendations.
They could also serve to support instructor
facilitation, student awareness and engagement,
dimensions of assessment, comparative
analysis for research, suggested conversational
entry points based on personal
interests, and more. As one example, Figure
26 illustrates concept overlap between
the text of Renlit’s lead post and the text of
How to Lie with Statistics (Huff, 1954), an
assigned reading cited in the post.
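Concept overlap of the kind shown in Figure 26 can be approximated, for illustration, by comparing sets of candidate concept terms extracted from each text. The following is a hypothetical sketch only; the study's actual concept-extraction pipeline is not specified here, and the sample texts and the simple Jaccard measure are our own illustrative assumptions.

```python
# Hypothetical sketch: concept overlap between two texts, measured as
# Jaccard similarity over crudely extracted candidate terms. The real
# extraction pipeline used in the study is not specified here.

import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "that", "with"}

def extract_terms(text):
    """Very rough candidate-concept extraction: lowercase word tokens,
    minus stopwords and very short words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return {t for t in tokens if t not in STOPWORDS and len(t) > 3}

def concept_overlap(text_a, text_b):
    """Return (Jaccard similarity, shared terms) for the two texts."""
    a, b = extract_terms(text_a), extract_terms(text_b)
    shared, union = a & b, a | b
    return (len(shared) / len(union) if union else 0.0), shared

# Illustrative inputs, standing in for a lead post and an assigned reading.
post = "The sample size and sampling bias shape every statistic we report."
reading = "Beware sampling bias: a biased sample can make any statistic lie."
score, shared = concept_overlap(post, reading)
```

In practice, a real pipeline would use part-of-speech tagging or keyphrase extraction rather than raw token sets, but the overlap idea is the same.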
Ongoing work in this area includes
automated concept categorization, automated
approaches to scoring topicSpread, mapping
concepts to an ontology, and linking
topicSpread scores to the actual concepts
under discussion.
IX - Discussion and Implications for
Future Research
The emergence of social tools in educational
settings, combined with a
developing awareness of big data and
visualization techniques, marks a critical opportunity
to develop techniques for collecting
meaningful data that enable us to better
assess social behaviors in online courses.
This area has been previously under-represented
in research, and conditions are favorable
for us to develop a deeper understanding
of the tools and pedagogies that support
learning in social and cooperative online
learning spaces.
Our research to date details a methodology
for capturing individual and conversational
patterns present in online Social
Knowledge Networks. Although we are
encouraged by the findings so far, we have
gone deep but not broad. A more rigorous
examination is required to draw clear conclusions
about this work.
A. Learning Activity Design
We suggest that the most effective
approach for assessing the productivity
of a discussion is not
a standardized “counting mechanism,” but
one tailored to the activity
type. A discussion in which students
share their own experiences and engage in
interviewing activities should have a different
fingerprint than one in which students
are working to develop a single solution to
a problem. Identifying the anticipated data
fingerprints associated with a library of activity
types, and their variations, will be a
critical step to defining student and instructional
strategies for success.
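One way such activity-type fingerprints might be operationalized is to compare a discussion's observed metric vector against a library of expected profiles. This is a hypothetical sketch: the metric names (e.g., `reply_depth`, `topic_spread`) and the template values below are illustrative assumptions, not figures from the study.

```python
# Hypothetical sketch: match an observed discussion's metrics against a
# small library of expected "fingerprints" for different activity types.
# Metric names and template values are illustrative only.

import math

# Expected profile per activity type (values normalized to [0, 1]).
FINGERPRINTS = {
    "experience-sharing": {"reply_depth": 0.2, "topic_spread": 0.8, "reply_breadth": 0.9},
    "problem-solving":    {"reply_depth": 0.9, "topic_spread": 0.3, "reply_breadth": 0.4},
}

def cosine(u, v):
    """Cosine similarity between two metric dicts."""
    keys = sorted(set(u) | set(v))
    a = [u.get(k, 0.0) for k in keys]
    b = [v.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def best_match(observed):
    """Return the activity type whose fingerprint best matches the
    observed metric vector."""
    return max(FINGERPRINTS, key=lambda t: cosine(observed, FINGERPRINTS[t]))

# A discussion with deep reply chains and narrow topical focus should
# resemble the problem-solving profile more than experience-sharing.
observed = {"reply_depth": 0.85, "topic_spread": 0.25, "reply_breadth": 0.5}
```

A deployed version would need empirically derived templates per activity type, which is exactly the "library of activity types" the text calls for.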
B. Learner and Instructor Strategies
Similarly, whether a learner
or instructor strategy is effective depends at least
in part on our expectations for the discussion.
We can also ask questions about how
instructor strategies might vary depending
on the students to whom they are responding.
This connection, however, relies on us
knowing more about the nature of corpora.
In particular, does the character of a corpus
stay the same across a student’s academic
career? Or does it change based on the composition
of their cohort, their development
through a program, or other factors? These
questions may lead us to identify new metrics
for predicting and supporting team and
cohort success, and the ways in which individuals
may influence one another over the
course of their interactions. If we can begin
to measure these influences, we might be
able to establish and support successful cooperative
and collaborative teams, learning
communities, peer tutoring relationships,
and more.