A number of metrics can illustrate the nature and diversity of student activity in JusticeX. In Figure 10, we show the distribution of chapters viewed by JusticeX participants. Chapters represent the highest-level unit in the edX courseware, including lecture units, quizzes, the final exam, and several logistical chapters such as the introduction to the course and supplementary materials. The number of chapters viewed by a participant is therefore a rough summary of what fraction of the course content they had a chance to learn from. The distribution of chapters viewed is bimodal, a shape characteristic of many distributions of usage data on the edX platform: a large number of people do very little, and a second mode appears for students who seem interested in viewing the entire course. Here, we see that most people open only a few chapters, but many people also open nearly all of them. (Together, Figures 8 and 10 represent the marginal distributions of the variables comprising Figure 5.)

Figure 10. Distribution of chapters viewed for JusticeX participants who viewed courseware (n=50,044).

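As a concrete illustration, the following short Python sketch shows one way such a per-participant distribution could be tabulated from an event-level extract of edX tracking logs. The file name and column names (user_id, chapter_id) are hypothetical placeholders, not the actual HarvardX data schema.

    import pandas as pd
    import matplotlib.pyplot as plt
    # Hypothetical per-event extract: one row per chapter view.
    events = pd.read_csv("justicex_chapter_views.csv")
    # Count the number of distinct chapters each participant opened.
    chapters_per_user = events.groupby("user_id")["chapter_id"].nunique()
    # A histogram of this count shows the bimodal shape described above:
    # one mode near a handful of chapters, another near the full chapter count.
    chapters_per_user.hist(bins=chapters_per_user.max())
    plt.xlabel("Chapters viewed")
    plt.ylabel("Number of participants")
    plt.show()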

We can gain another perspective on student activity in JusticeX by looking at students’ use of assessments. Justice had 24 sets of lecture review questions, five quizzes, and a final exam. Figure 11 shows that both views and attempts peak early in the course and then decline at a steady rate after the first quiz, two weeks into the 12-week course. The peaks of activity at the quizzes show that more people viewed and attempted the quizzes than the lecture review questions.

Figure 11. Number of students viewing and attempting each problem set in JusticeX, among all students who viewed courseware (n=50,044). Numbered sets are practice questions, Q1–Q5 are for-credit quizzes, and Final is the final exam.

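As with the chapter distribution, a short sketch suggests how the per-problem-set counts underlying Figure 11 could be assembled: unique students who viewed each set and unique students who attempted it. The file name, column names, and event-type labels here are simplified, hypothetical stand-ins for the actual edX log fields.

    import pandas as pd
    # Hypothetical extract of assessment events; real edX logs use different
    # event names and module identifiers.
    events = pd.read_csv("justicex_problem_events.csv")
    counts = (
        events[events["event_type"].isin(["problem_view", "problem_attempt"])]
        .groupby(["problem_set", "event_type"])["user_id"]
        .nunique()                # unique students, not raw event counts
        .unstack("event_type")    # one row per problem set, one column per metric
    )
    print(counts)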

This decline of activity over time in massive open online courses has been widely reported in early studies of the form, and it is useful to compare these patterns of activity to other online media. For instance, Figure 12 displays the number of views, as of mid-November 2013, of the PBS Justice videos that were posted to YouTube in October of 2009. We see a similar pattern of “attrition” in watching this lecture series: the initial video has been viewed over four million times, while by the fifth video in the series, views range from 200,000 to 300,000.

Figure 12. Number of views of PBS Justice videos hosted on YouTube since September 2009, as of November 2013.


An additional metric of student activity is a simple count of “clicks,” or actions taken within the course site. Figure 13 shows the distribution of student clicks on a logarithmic scale for both certificate earners and all other participants.

Figure 13. Number of participant clicks (i.e. recorded actions) plotted on a log scale for JusticeX certificate earners (n=5,442) and non-certificate earners (n=73,345).

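A sketch of how a figure like Figure 13 could be produced from a per-participant summary table follows. The file and column names (n_clicks, earned_certificate) are hypothetical; log-spaced bins are used so that the long right tail of the click distribution remains visible.

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    # Hypothetical per-user table with total clicks and certificate status.
    users = pd.read_csv("justicex_user_summary.csv")
    bins = np.logspace(0, np.log10(users["n_clicks"].max()), 50)
    for earned, group in users.groupby("earned_certificate"):
        plt.hist(group["n_clicks"], bins=bins, alpha=0.5,
                 label="certificate earners" if earned else "non-certificate earners")
    plt.xscale("log")
    plt.xlabel("Number of clicks (log scale)")
    plt.ylabel("Number of participants")
    plt.legend()
    plt.show()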

One immediate insight is that, in coarse terms, activity predicts achievement: those who do a lot in the course tend to be those who pass the course. Similar to Figure 5, Figure 13 shows that some certificate earners pass the course with only a few hundred actions, while some listeners take thousands of actions but do not pass. In general, however, those who take many actions in the course are more likely to earn a certificate than those who take few, a simplistic insight that echoes findings from other early studies of lecture-based large-scale online courses.
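One simple way to quantify this relationship, offered here only as an illustrative sketch rather than an analysis reported in the text, is a logistic regression of certificate earning on log-transformed click counts, using the same hypothetical per-user table as above; a positive coefficient corresponds to the pattern that more active participants are more likely to earn certificates.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    # Hypothetical per-user table with total clicks and certificate status.
    users = pd.read_csv("justicex_user_summary.csv")
    X = np.log10(users[["n_clicks"]] + 1)          # log-transform click counts
    y = users["earned_certificate"].astype(int)    # 1 = earned, 0 = did not
    model = LogisticRegression().fit(X, y)
    print("coefficient on log10(clicks + 1):", model.coef_[0][0])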