Thursday, September 26, 2013

More Thoughts on the Dissertation

After a few days of reading the articles, "critical thinking" - the term I originally used in my dissertation title - doesn't seem so bad to me now. Text-based discussion forums provide a means for interaction, consensus searching, and new knowledge construction in distance education (Fahy, 2001). The quality of interaction should reflect the levels of critical thinking of online learners.

Buraphadeja's study (2010) suggested no statistically significant relationship between the mean level of knowledge construction from content analysis and the SNA measures, while Fahy's study (2001) found that the results from the two methods supported each other. The difference might come from the authors' different choices of content analysis tools and the target courses they investigated. My thesis aims to adopt a similar research idea to study a new context - MOOCs. A possible finding is the interaction pattern in MOOCs. I also plan an in-depth study of several of the most engaged learners in a selected MOOC to further investigate their interaction patterns. The thesis would contribute to the field by offering suggestions on facilitating learning in MOOCs for instructors and designers.

Oct. 19
I am thinking of investigating the following questions:
The depth of learning:
(1) The depth of discussion threads (how many sub-threads does one thread derive? or intensity). Note: 70% of messages were posted in the first two levels (initial post and subtopic 1) (Gibbs, 2008).
(2) How many topics evolved from the topic given by the instructor?
(3) Have the derivative topics maintained a connection to the goals listed by the instructor at the beginning of the MOOC?
(4) Number of isolated messages (see the sketch after this list)
Has the discussion continued after the MOOC?
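
To make (1) and (4) concrete, here is a minimal sketch of how thread depth and isolated messages could be computed once the forum posts are exported as parent-child records. The field names (post_id, parent_id) are my own placeholders, not the export format of any particular MOOC platform.

```python
# A minimal sketch (placeholder field names, not any platform's real export
# format) of computing thread depth and isolated messages from forum data.
from collections import defaultdict

posts = [  # each post: its id, and the id it replies to (None for an initial post)
    {"post_id": 1, "parent_id": None},
    {"post_id": 2, "parent_id": 1},
    {"post_id": 3, "parent_id": 2},
    {"post_id": 4, "parent_id": None},   # an initial post nobody replied to
]

children = defaultdict(list)
for p in posts:
    children[p["parent_id"]].append(p["post_id"])

def depth(post_id):
    """Length of the longest reply chain starting at this post."""
    return 1 + max((depth(child) for child in children[post_id]), default=0)

# (1) Depth of each thread: longest chain under each initial post.
thread_depths = {p["post_id"]: depth(p["post_id"])
                 for p in posts if p["parent_id"] is None}

# (4) Isolated messages: initial posts that never received a reply.
isolated = [pid for pid, d in thread_depths.items() if d == 1]

print(thread_depths)   # {1: 3, 4: 1}
print(isolated)        # [4]
```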

The self-organized grouping:
(1) Are there homogeneous characteristics among members of groups that self-organize while taking the MOOC? How do learners choose whom to connect with? (class conduct, initial impressions, and interactions (Gibbs, 2008, p. 17))
(2) Grouping and centrality (Gibbs, 2008, p. 17) (see the sketch after this list)
(3) How long does it take for learners to form a stable (or somewhat stable) discussion group? How long can a group sustain itself?
(4) Power law
(5) Group size and stability (what group size is most stable?)
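
For (2) and (4), the grouping, centrality, and degree-distribution checks could start from a reply network. Below is a minimal sketch with networkx; it assumes the (replier, replied-to) pairs have already been extracted, and the learner names and edges are invented placeholders.

```python
# A minimal sketch, assuming (replier, replied_to) pairs were extracted
# from the forum. Names and edges below are invented placeholders.
from collections import Counter
import networkx as nx
from networkx.algorithms import community

reply_edges = [("ann", "bob"), ("bob", "cara"), ("ann", "cara"),
               ("dan", "eve"), ("eve", "dan"), ("fay", "ann")]
G = nx.Graph(reply_edges)

# (2) Grouping and centrality: detect self-organized groups and rank members.
groups = community.greedy_modularity_communities(G)
centrality = nx.degree_centrality(G)

# (4) Power law: the degree distribution is the usual first look
# (a formal fit would need something like the `powerlaw` package).
degree_distribution = Counter(deg for _node, deg in G.degree())

for i, members in enumerate(groups):
    print(f"group {i}: {sorted(members)}")
print(centrality)
print(degree_distribution)
```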

The characteristics of the learner-instructor interaction:
(1) Are there differences in the above four questions between different instructors or MOOCs?

Independent variables: learner characteristics, moderating characteristics/styles (skills), types of posts (5+3 categories in TAT), timeline of posts (when/who posts what), social media adopted by learners for the MOOC.
Dependent variables: depth of discussion, number of derivative topics, consistency of the derivative topics.
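
One way to keep these variables straight is to record each post as a single structured record along these lines. This is only a sketch: the field names are mine, and the TAT code labels are how I currently read Fahy (2001), so treat them as placeholders rather than the final coding scheme.

```python
# A sketch of one coded forum post; field names and category labels are
# placeholders, not the final coding scheme.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CodedPost:
    post_id: int
    author_id: str
    author_role: str               # "learner" or "instructor" (learner/moderator characteristics)
    posted_at: datetime            # timeline of posts: when/who posts what
    thread_id: str
    parent_id: Optional[int]       # None for an initial post
    tat_code: str                  # one of the 5+3 TAT categories, e.g. "2A"
    external_media: Optional[str]  # social media used alongside the MOOC, if any

example = CodedPost(
    post_id=42, author_id="learner_017", author_role="learner",
    posted_at=datetime(2013, 10, 19, 14, 30), thread_id="week3-q1",
    parent_id=7, tat_code="2A", external_media="twitter",
)
```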





Friday, September 20, 2013

Interaction in MOOCs - Students and Instructors' Perception

I've found quite a bit of MOOC literature. To my disappointment, much of it is either not relevant or weak in its arguments. Here is an example.

One article (Khalil & Ebner, 2013) discussed levels of satisfaction with interaction in MOOCs. The authors deployed two web-based survey questionnaires based on the five-step model of interactivity (Access and Motivation, Online Socialization, Information Exchange, Knowledge Construction, and Development) developed by Salmon (2001). The students and the instructors self-reported their perceptions, and the data were collected and analyzed. The authors concluded that there is a gap between students' and instructors' perceptions of, and satisfaction with, interaction in MOOCs, and that there was a lack of student-instructor interaction.

I found myself not persuaded by the authors' conclusions because:

  • the return rate of the questionnaires was low (11 out of 250)
  • the research results were based on participants' self-reports rather than methods that could represent participants' perceptions more directly, such as content analysis
  • using Salmon's five-step model as the single measurement makes it difficult to look into the heart of the interaction taking place in the courses.


Khalil, H., & Ebner, M. (2013). “How satisfied are you with your MOOC?” - A research study on interaction in huge online courses. World Conference on Educational Multimedia, Hypermedia and Telecommunications 2013, 2013(1), 830–839.

Literature Review of MOOC Related Articles

Title: MOOCs: A systematic study of the published literature 2008-2012
Authors: Liyanagunawardena, Adams, and Williams (2013)

Description:

Basically, the article is more about classifying the content of published articles than about critical review. Eight categories were used: introductory, concept, case studies, educational theory, technology, participant focused, provider focused, and other.

Others: popular authors in MOOCs in terms of the number of articles published; types of MOOCs (c-MOOC and AI-Stanford-like MOOC (Rodriguez, 2012) vs. cMOOC and xMOOC (Daniel, 2012)).

The authors also pointed out problems in MOOC studies, including a lack of ethical consideration in using publicly available data and the neglect of data existing in virtual spaces other than the LMS.

Data Collection

Duration: 2008-2012
No. of Articles: 46 including 2008 (1), 2009 (1), 2010 (7), 2011 (11), 2012 (26).
Sources: journals (17), conferences (13), magazines (10), reports (3), workshops (2)

My comment:

The collection of articles was limited: only 46 in total. This seems hard to believe, since 2012 was called "the year of the MOOC" because so many MOOCs attracted so much discussion. One reason for the small collection could be that the authors only considered articles with the term "MOOC" in the title or abstract.

Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distance Learning, 14(3), 202–227.

Title: The Maturing of the MOOC: Literature Review of Massive Open Online Courses and Other Forms of Online Distance Learning
Author: Department for Business, Innovation and Skills (2013)

Description:


  • The review mainly examined the topics covered in the literature.
  • “mere completion is not a relevant metric, that learners participate in many valid ways, and that those who do complete MOOCs have high levels of satisfaction.” (Haggard, 2013, p. 6)
  • The literature review on MOOCs concludes that "after a phase of broad experimentation, a process of maturation is in place. MOOCs are heading to become a significant and possibly a standard element of credentialed University education, exploiting new pedagogical models, discovering revenue and lowering costs." (p. 5)

Data Collection

One hundred known and recent pieces of literature on MOOCs and open distance learning were reviewed. Three categories of literature were collected: individual polemical articles discussing the impacts of MOOCs on educational institutions and learners, formal and comprehensive surveys, and general press writing and journalism.

Department for Business, Innovation and Skills. (2013). The Maturing of the Mooc: Literature Review of Massive Open Online Courses and Other Forms of Online Distance Learning (BIS Report Papers No. 130) (p. 123). London, UK: Department for Business, Innovation & Skills. 

Wednesday, September 18, 2013

MOOC Discussion Forums

September 18, 2013

Revisited the literature on interaction theory to refresh my memory. After all, it's been three years since I studied the topic as an assignment project for EDDE 801. The revisit did not bring me any new ideas or findings.

Routinely, I check the newsletters from OLDaily and MOOC.ca every day. An article from Phil Hill, an education technology consultant, led me to a list of articles. Hill argued that discussion forums in MOOCs (massive open online courses) are centralized and are barriers to student engagement. As he quoted from Robert McGuire (Sept. 3, 2013):

"Most MOOC discussion forums have dozens of indistinguishable threads and offer no way to link between related topics or to other discussions outside the platform. Often, they can’t easily be sorted by topic, keyword, or author. As a result, conversations have little chance of picking up steam, and community is more often stifled than encouraged."

Hill also supported his argument with studies from MIT and Stanford University, which pointed out that fewer than 3% of MOOC participants posted in discussion forums unless they were credit earners.

I haven't read these articles carefully, but they reminded me of Rivard's report on MOOC dropout rates (March 8, 2013). Rivard argued that it may not make sense to compare the number who register with the number who finish, because different kinds of people sign up for the online classes with different goals. "Some clearly do not intend to ace or even take every test, nor want to earn a largely meaningless certificate of completion." (Rivard, 2013) MOOC participants are a diverse population who come with various goals in mind. Some faculty members also enroll in MOOCs because they want to watch how other faculty teach their subject. This group of MOOC participants is not likely to take the assignments and is likely to drop out at any time. They drop out of the courses because they never intended to complete them in the first place.

The same reasoning might also apply to the problem of low posting rates in MOOC discussion forums. Some participants just want to be lurkers: by checking in on the forums and viewing the discussions, they get what they need. What I am trying to say is that a study on MOOCs might do well to begin by dividing the participants into several subgroups based on their individual goals for enrolling in the MOOC. Then it would be more meaningful to continue discussing learner behavior.

Su-Tuan Lulee


Tuesday, September 17, 2013

Sensemaking or not?

I think I'd better not change the term "critical thinking" in my research topic to "sensemaking" because:

  1. A quick browse of the literature told me that sensemaking is a term/concept mainly adopted in fields other than distance education. It originated in, and is often used in, the study of organization management, organizational behavior, and philosophy.
  2. Studying a body of literature from fields with which I am not familiar would be overwhelming. I might need to postpone the completion date of my dissertation by six months.
I could adapt the arguments, concepts, and methods of sensemaking to help me outline the coding scheme or build up the argument/explanation of the research findings. However, it's better not to let sensemaking become the main focus of my discussion.

Monday, September 16, 2013

First Meeting with the Supervisor

My supervisor, Dr. Siemens, wanted to talk to me, and we had a 30-minute Skype call at 8:00 AM my time today.

Conclusions:

  • My research topic sounds good (including the MOOC study, content analysis, and social network analysis).
  • Changing the term in the research topic from "Critical Thinking" to "Sensemaking" or "Interaction" is good. However, I need to learn more about "sensemaking" as a specific study area.
  • I can continue working on the proposal: finishing the literature review and specifying the research questions and methodology.
  • No need to worry about technology right now. Technology should not interfere with the research.
  • Submit a timeline to let him know when he will get what.

Before the meeting, I had submitted a brief introduction to my preliminary proposal:

Abstract

The proposed study will examine the relationship between two methods of assessing critical thinking in discussion forums of massive open online courses (MOOCs): content analysis and social network analysis. The study proposes that the results from content analysis will be highly related to the results obtained from social network analysis. The study will focus on the results of interaction analysis with the TAT (Transcript Analysis Tool; Fahy, 2001) model and their relationship with the centrality measures obtained from social network analysis. This study will not only provide online educators with useful information for interpreting educational data analysis but also offer evidence-based considerations for the design of learning activities and the organization of content/learning resources. Moreover, the technique proposed in this study will offer insight into the process of knowledge construction in online learning environments in terms of interaction. This study will also add to the body of research on MOOCs and interaction in relation to the validation of existing measurements.
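
To illustrate the relationship the abstract proposes, here is a minimal sketch (not the actual study design) of relating per-learner content analysis results to SNA centrality with a rank correlation. All data and names below are invented placeholders, and the choices of degree centrality and Spearman's rho are only examples; the real study could substitute the TAT coding results and other centrality measures.

```python
# A minimal sketch: relate per-learner content analysis counts to SNA
# centrality. All data below are invented placeholders.
import networkx as nx
from scipy.stats import spearmanr

# Content analysis side: hypothetical (learner, TAT code) pairs.
coded_messages = [("ann", "2A"), ("ann", "1B"), ("ann", "5B"),
                  ("bob", "2A"), ("bob", "1A"),
                  ("cara", "3"), ("dan", "2B")]

# SNA side: hypothetical reply network for the same forum.
G = nx.Graph([("ann", "bob"), ("ann", "cara"), ("bob", "cara"), ("dan", "ann")])

counts = {}
for learner, _code in coded_messages:   # messages coded per learner
    counts[learner] = counts.get(learner, 0) + 1

centrality = nx.degree_centrality(G)    # betweenness/closeness could be swapped in

learners = sorted(set(counts) & set(centrality))
rho, p = spearmanr([counts[l] for l in learners],
                   [centrality[l] for l in learners])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```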