Professor Sandel has a demonstrated commitment to sharing his teaching widely with the world. He kept enrollment open in his on-campus course as registration neared 1,000 students; he produced a television version of the course with WGBH; he has lectured widely on moral reasoning to popular audiences around the world; and JusticeHarvard.org provided an online version of this course years before the emergence of edX. JusticeX was designed to expand access to Justice’s ideas about the importance of moral reasoning in society and in participants’ lives. This intent is important to keep in focus in reviewing how students chose to participate in the various learning opportunities in JusticeX.

Grades and Certification

Some courses on the edX platform have maintained a commitment to evaluating students using methods that mirror, as closely as possible, the assessments used in residential course settings. JusticeX is not one of those courses. In interviews, Professor Sandel and his course team described the goals of their course as focused more on sharing their perspectives and sparking student interest and reflection than on evaluating, sorting, and certifying students.

In the spirit of creating a “course” rather than a lecture series, Professor Sandel did include assessments in JusticeX, and in an interview he described multiple-choice questions as the “least bad” of the available options for assessing student competence at scale in the humanities. In evaluating grades and certificates in JusticeX, it is important to keep in focus the limits of the assessments used and the course team’s belief that the learning opportunities provided to “auditors” and others with no interest in the problems and certificates were as important as the experience of students who chose to engage with the for-credit problems.

Figure 8. Distribution of grades across all JusticeX participants who answered at least one for-credit question correctly (n=12,330).


Of the 76,079 students who registered for the course on or before the final due date of August 2, 2013, 5,442 people earned a certificate. (Of interest: two of the 133 people who registered for the course on August 2 earned a certificate.)

Many analyzing large-scale online courses are interested in the percentage of students who earn a certificate, and in its inverse. As a set of purely descriptive statistics, we can take the number of students who earned a certificate (5,442) and divide it by the number of students who registered (79,787, yielding 6.8%); by the number who registered before the final due date (76,079, yielding 7.2%); by the number of participants who viewed the course and registered by the final due date (47,469, yielding 11.5%); or by the number of participants who viewed more than half the course and registered by the final due date (8,285, yielding 65.7%). However, none of the data we have for JusticeX allow us to identify the number of students who intended to complete the course, and thus the proportion who actually did so.
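These ratios are simple to reproduce from the counts reported above. The sketch below, a minimal illustration rather than the report’s actual analysis code, recomputes each rate:

    # Certification rates for JusticeX under four candidate denominators.
    # Counts come from the text above; labels and structure are illustrative.
    certified = 5442

    denominators = {
        "all registrants": 79787,
        "registered by the final due date": 76079,
        "viewed courseware and registered by the due date": 47469,
        "viewed more than half the course and registered by the due date": 8285,
    }

    for label, n in denominators.items():
        print(f"{label}: {certified / n:.1%}")
    # -> 6.8%, 7.2%, 11.5%, 65.7%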

Interestingly, there is little variation in certification rate based on when students signed up for the course. Figure 9 shows an average trend line (loess approximation) of pass rate by daily registration cohort. About 7% of the students who registered on any given day passed the course, with rates slightly lower before the launch and slightly higher after.

Figure 9. Average trend line (loess approximation) of pass rate (certified students over all registered students) by daily registration cohort (n=76,079).

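A trend line of this kind can be approximated with a standard lowess smoother. The sketch below assumes a data frame with one row per registrant and illustrative column names (registration_date, certified); it is not the report’s actual code:

    # Sketch of a lowess-smoothed pass-rate trend by daily registration
    # cohort. Assumes one row per registrant with a registration_date
    # (datetime) column and a boolean certified column (names assumed).
    import pandas as pd
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def daily_pass_rate_trend(df: pd.DataFrame, frac: float = 0.3) -> pd.DataFrame:
        # Pass rate per daily cohort: certified students over all registrants.
        daily = df.groupby("registration_date")["certified"].mean()
        # lowess needs a numeric x-axis, so count days from the first cohort.
        days = (daily.index - daily.index.min()).days
        smoothed = lowess(daily.to_numpy(), days, frac=frac)
        return pd.DataFrame(smoothed, columns=["day", "pass_rate"])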

Activity

A number of metrics can illustrate the nature and diversity of student activity in JusticeX. In Figure 10, we show the distribution of chapters viewed by JusticeX participants. Chapters represent the highest-level unit in the edX courseware, including lecture units, quizzes, the final exam, and several logistical chapters such as the introduction to the course and supplementary materials. The number of chapters viewed by a participant, therefore, is a rough summary of what fraction of the course content they had a chance to learn from. The distribution of chapters viewed is bimodal, a shape characteristic of usage data on the edX platform: a large number of people do very little, and a second mode appears for students who seem interested in viewing the entire course. Here, we see that most people open only a few chapters, but many people also open nearly all of them. (Together, Figures 8 and 10 represent the marginal distributions of the variables comprising Figure 5.)

Figure 10. Distribution of chapters viewed for JusticeX participants who viewed courseware (n=50,044).

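The tally behind a figure like this is straightforward to sketch. Assuming an event log with illustrative user_id and chapter_id columns (not the report’s actual schema), per-participant chapter counts could be computed as follows:

    # Count the distinct chapters each participant opened at least once.
    # Column names are assumed for illustration.
    import pandas as pd

    def chapters_viewed(events: pd.DataFrame) -> pd.Series:
        return events.groupby("user_id")["chapter_id"].nunique()

    # counts = chapters_viewed(events)
    # counts.value_counts().sort_index()  # histogram data, as in Figure 10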

We can gain another perspective on student activity in JusticeX by looking at students’ use of assessments. Justice had 24 sets of lecture review questions, five quizzes, and a final exam. Figure 11 shows that both views and attempts peak early in the course and then decline at a steady rate after the first quiz, two weeks into the 12-week course. The peaks of activity at the quizzes show that more people viewed and attempted the quizzes than the lecture review questions.

Figure 11. Number of students viewing and attempting each problem set in JusticeX of all students who view courseware (n=50,044). Numbered sets are practice questions, Q1–Q5 are for-credit quizzes, and Final is the final exam.


This decline of activity over time in massive open online courses has been widely reported in early studies of the form, and it is useful to compare these patterns of activity with those of other online media. For instance, Figure 12 displays the number of views, as of mid-November 2013, of the PBS Justice videos that were posted to YouTube in October of 2009. We see similar patterns of “attrition” in watching this lecture series: the initial video has been viewed over four million times, while by the fifth video in the series, views range from 200,000 to 300,000.

Figure 12. Number of views of PBS Justice videos hosted on YouTube since September 2009, as of November 2013.


An additional metric of student activity is a simple count of “clicks,” or actions taken, within the course site. Figure 13 shows the distribution of participant clicks on a logarithmic scale for both certificate earners and all other participants.

Figure 13. Number of participant clicks (i.e. recorded actions) plotted on a log scale for JusticeX certificate earners (n=5,442) and non-certificate earners (n=73,345).

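A comparison of this kind might be sketched as follows, assuming a mapping from participant to total click count and a set of certificate earners; both inputs are illustrative rather than the report’s data structures:

    # Sketch of a log-scale click distribution split by certificate status.
    # clicks maps user_id -> total action count; earners is the set of
    # certified user_ids. Both are assumed inputs.
    import numpy as np
    import matplotlib.pyplot as plt

    def plot_click_distribution(clicks: dict, earners: set) -> None:
        cert = [c for u, c in clicks.items() if u in earners]
        non = [c for u, c in clicks.items() if u not in earners]
        # Logarithmically spaced bins keep the heavy-tailed counts readable.
        bins = np.logspace(0, np.log10(max(clicks.values())), 50)
        plt.hist([non, cert], bins=bins, label=["non-certified", "certified"])
        plt.xscale("log")
        plt.xlabel("Number of clicks (log scale)")
        plt.ylabel("Participants")
        plt.legend()
        plt.show()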

One immediate insight is that, in coarse terms, activity predicts achievement: those who do a lot in the course tend to be those who pass it. As in Figure 5, there are some certificate earners who pass the course with only a few hundred actions, while there are also participants who do not pass the course but take thousands of actions. In general, however, those who take many actions in the course are more likely to earn a certificate than those who take few, a simple insight that echoes findings from other early studies of lecture-based large-scale online courses.

Persistence

We also examined student persistence through the course. Unlike in courses that begin and end on set dates, students could join JusticeX at any point. Moreover, there were no incremental due dates throughout the course, only a final due date, so late registrants could complete all of the work of the course, even if they registered on the final day. The official course launch was March 2, 2013; the first content was released on March 12; the final exam was made available on May 28; and all assignments were due by August 2. Some students engaged with the course at the pace that the content was released, completing two lecture units every week from March through May. Students could also sign up for Justice in July and, to borrow a phrase from contemporary television viewing habits, “binge-watch” the course, completing all assignments over a weekend.

Therefore, to examine student persistence, we focused more on each student’s relative timescale—the time from their enrollment or the course content launch, whichever was later—rather than on the absolute time in the course.

In Figure 14, we show an average hazard function derived by empirically calculating the hazard proportions of each weekly registration cohort and then averaging those proportions within each relative week. Thus we evaluated the questions: “What proportion of students stop participating during their first week? Their second week?” (and so on). We can then infer the proportion of students who would remain from week to week in each cohort. This is an “implied survivor function” calculated directly from the average hazard function.

Figure 14. Hazard function comprised of the average of all registration cohort hazard functions plotted on relative course week (where week 0 is the course launch or initial registration week, whichever is later) and truncated at week 21 when the final exam was due. Implied survival function is calculated directly from the average hazard function.

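Formally, if h(t) is the hazard proportion averaged across cohorts at relative week t, the implied survivor function is S(t) = (1 - h(0))(1 - h(1))...(1 - h(t)). A minimal sketch of this calculation, assuming each student has been reduced to a (registration week, last active week) pair, mirrors the method described above rather than the report’s actual code:

    # Average hazard across weekly registration cohorts and the implied
    # survivor function. Students are assumed to be reduced to pairs of
    # (registration_week, last_active_week) on the course calendar.
    from collections import defaultdict

    def average_hazard(students, max_week=21):
        by_cohort = defaultdict(list)
        for reg_week, last_week in students:
            by_cohort[reg_week].append(last_week - reg_week)  # relative week

        weekly = defaultdict(list)  # relative week -> per-cohort hazards
        for rel_last_weeks in by_cohort.values():
            at_risk = len(rel_last_weeks)
            for t in range(max_week + 1):
                exits = sum(1 for w in rel_last_weeks if w == t)
                if at_risk > 0:
                    weekly[t].append(exits / at_risk)
                at_risk -= exits

        hazard = [sum(props) / len(props) for _, props in sorted(weekly.items())]
        # Implied survivor function: S(t) = product of (1 - h(k)) for k <= t.
        survival, surviving = [], 1.0
        for h in hazard:
            surviving *= 1.0 - h
            survival.append(surviving)
        return hazard, survival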

The key insight from this figure is that, within each weekly registration cohort, students are very likely to cease activity during the week that they register and the week right afterwards. After that, however, hazard proportions drop below .2 and level out below .1 after the fifth week. Colloquially, if students get hooked on a course within their first two weeks, regardless of when they start, they are likely to stay, or at least the risk of dropping out in any subsequent week is constant.

This survival model gives some sense of how students persist throughout the whole course, examining the span of time from first action to last action. However, a student could log in only twice, on the first day and the last day, and be counted as “surviving” through the length of the course. This motivates an alternative metric that counts how many discrete days students view course material. Among all registrants, we find that the median number of days of activity is two, and 75% of registrants have seven or fewer days of activity.
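The count of discrete active days can be sketched as follows, again assuming an event log with illustrative user_id and UTC timestamp columns:

    # One "day with activity" = at least one action on a UTC calendar day.
    # Column names are assumed for illustration.
    import pandas as pd

    def active_days(events: pd.DataFrame) -> pd.Series:
        day = events["timestamp"].dt.normalize()  # truncate to the UTC date
        return events.assign(day=day).groupby("user_id")["day"].nunique()

    # stats = active_days(events)
    # stats.median()        # 2 among all registrants, per the text
    # stats.quantile(0.75)  # 75% of registrants have 7 or fewer active days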

For a more granular view, therefore, in Figure 15 we examine the daily activity of those who viewed over half the course (those who “explored”) as well as certificate earners. We see that among these active students, most had between 10 and 40 days of activity, with a long tail of students who were active on over 100 days within the period from March 3 to our date of data collection on September 8.

Figure 15. Days with activity, the number of discrete days (demarcated in UTC time) during the observational period in which participants had at least one action, for explorers and certificate earners (n=8,415).


These contrasting figures illustrate the diversity of ways of participating substantially in the Justice course. Most people took an action one or two days a week over the 26-week period that we examined; however, some students participated over the course of only a few days, and still others engaged much more frequently.