The engineering education team’s staff meeting on May 2, 2012 began like any other, with reports on new-engineer orientation, computer science outreach efforts, and an updated mission statement: “To provide Google engineers and the world with relevant and timely technical content, learning resources, and tools.”

With five minutes remaining in the meeting, the director announced that she was recruiting team members who were willing to tackle an audacious goal: create an online course for ten million people in eight weeks. Many of us left the room that day with more questions than answers: Could we really create a course from scratch for that many people? In only eight weeks? What would we teach, and why? How would we know if we were successful?

Why MOOCs?

Many elements of Google culture have contributed to our experiments with massive open online courses (MOOCs), including the company’s mission, desire to think big, commitment to our users, and ability to launch and iterate. In the year and a half since that fateful staff meeting, we have served over 360,000 students through five courses for the general public, developed a handful of courses for our own engineers, and helped numerous partners launch courses using Google’s open-source Course Builder platform.

You might be wondering about Google’s interest in MOOCs. Our company mission statement is, “To organize the world’s information and make it universally accessible and useful.” Enabling educators to share their expertise with the world fits in this mission, as does expanding education to everyone. We had enthusiasm and a vision, but what should we teach?

After some brainstorming, we decided to start with what we know—our own products. We have worked with teams to enhance the user experience of Google tools through education. If people know how to better use our products, they will likely use them more, which helps the company meet its business goals. Google has also focused resources on helping individual professors, small colleges, and non-profit organizations scale their education efforts.

At Google we hope to help MOOCs evolve from their current implementation by encouraging others to build interesting courses and share their discoveries about effective pedagogy. Several strategies have worked for us: experimentation (hypothesizing, testing, evaluating, and iterating), building student community, emphasizing activities, keeping videos short, using alternative evaluation methods, and paying attention to student goals.

Experimentation

At Google we are encouraged to experiment by hypothesizing, gathering feedback, launching, evaluating data, and iterating. An intrepid team of content experts, designers, and engineers worked together to develop our first course. After eight weeks of development we launched Power Searching with Google. The course consisted of 28 lessons, each containing a video, a text transcript, and an activity, along with three assessments and a certificate of completion. The interface had some rough edges, and we spent late nights fixing bugs. However, by releasing an early version of the course with its imperfections, we were able to collect student interaction data that in turn guided future design decisions. We offered Power Searching a second time a few months later with clearer activity instructions, explicit links to the discussion forum, and new assessment questions. Because students organically answered one another’s questions in the course forum, we were also able to decrease the number of support staff answering questions.

Since the first course, we have experimented with numerous pedagogical elements including community, videos, activities, designing for student goals, and assessment strategies. These observations have helped inform the Course Builder technology.

Lessons learned: Release courses early, even if they are not perfect, and gather feedback to inform and improve course content. Given the size of many online courses, it’s impossible to predict every student experience.

Student community

Technology enables students to connect with hundreds or thousands of other students. This also presents a challenge for course developers: how do we ensure the same course is valuable for students with diverse backgrounds, varying levels of technical savviness and experience with the topic, locations around the world, and different ways of applying the concepts?

Despite conducting several usability studies with members of what we assumed to be our target audience, we discovered that our actual audience was more diverse than we had anticipated. For example, we found that challenge activities at the end of each module motivated a small number of people but were too difficult for other students. We therefore offered these challenge activities as supplementary to the primary content. We also realized that we hadn’t considered how the Google search interface would appear differently in other regions. Students in Brazil, for example, saw slightly different search interfaces than peers in Japan and the US. We had developed the course with US-centric examples, resulting in student confusion in one lesson. Because these students could ask questions and share their experiences via the course forum, they were able to help each other achieve the lesson’s goals.

We also found that students shared examples of how they could apply the course content. A librarian in Power Searching commented in the forum that she had used color filtering in image search to help one of her patrons search for a particular book with a green cover. Educators shared lesson plans for using Google Maps in their classrooms. Although we could not have anticipated all the ways students would interact with each other, the course forum enabled students to differentiate the course for each other in ways we did not expect. Furthermore, students crowd-sourced solutions to overcome learning barriers.

Lessons learned: Students will have diverse interests, backgrounds, and needs. Enable students to personalize the course for their own needs and share experiences with each other. Giving the students freedom to share how they will apply the course concepts enables them to help each other.

Videos and activities

Many MOOCs consist of videos with intermittent quizzes to maintain student interest and engagement. From our data, it’s not clear whether watching videos is the most effective way to learn the skills we taught; we have found that many students prefer clicking on a text lesson instead of, or in addition to, watching videos. In fact, when we featured the video prominently on the page, with a small button linking to the text version of the lesson, students clicked on the video about seventy percent of the time and the text lesson thirty percent of the time.

In the Advanced Power Searching course, we presented video links next to text-version links of the same lessons. In this course, which also gave students opportunities to try search challenges before viewing lessons, we found that students clicked on the text and video lessons in equal numbers. We have discovered that shorter, targeted videos seem to hold students’ interest better than longer videos. In our courses, videos shorter than five minutes have, on average, an eighty percent engagement rate (meaning that students watched an average of eighty percent of the video).
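The engagement-rate metric described above can be computed as the mean fraction of a video that viewers actually watch. The sketch below uses hypothetical session data and an illustrative function name, not Google’s actual analytics code:

```python
# Illustrative sketch (hypothetical data, not Google's analytics code):
# "engagement rate" here is the mean fraction of a video watched,
# averaged across viewing sessions.

def engagement_rate(watch_seconds, video_length_seconds):
    """Mean fraction of the video watched across all viewing sessions."""
    if not watch_seconds:
        return 0.0
    # Cap each session at 1.0 in case of rewatching or timing noise.
    fractions = [min(w / video_length_seconds, 1.0) for w in watch_seconds]
    return sum(fractions) / len(fractions)

# Hypothetical sessions for a four-minute (240-second) video:
sessions = [240, 200, 120, 240, 160]
rate = engagement_rate(sessions, 240)
print(f"{rate:.0%}")
```

A real pipeline would aggregate per-student watch logs, but the metric itself reduces to this average.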

Online education enables us to give students opportunities to apply skills and receive instant feedback. In our courses we couple instructional videos with activities where students practice the skills and receive guidance about how well they are mastering the content. We have discovered that significantly more students complete activities than watch videos. One hypothesis is that students jump directly to activities, try them, and assess whether they need to review the relevant lessons. In fact, students who completed course activities had a higher course completion rate than students who did not do activities.

Lessons learned: Students appreciate control over their learning experiences; make it easy for students to choose activities, text lessons, or video lessons in their preferred order. We plan to use short videos for motivation, rationale, and authentic examples of the content.

Goals

We asked students to select a goal when they registered for the Mapping with Google and Introduction to Web Accessibility courses. We provided a list of goals including “Meet all course requirements in order to earn a certificate of completion,” “Learn one or two new things about Google Maps [or web accessibility] without achieving a certificate of completion,” and “I’m curious about how this online course is taught.”

Surprisingly, we found that only fifty-four percent of Mapping registrants (and fifty-six percent of Web Accessibility registrants) intended to complete course requirements to earn a certificate. The vast majority of other registrants only wanted to learn one or two new things, either out of curiosity or for a work-related need. Based on what we know about student progress in the Mapping with Google course, we inferred that forty-two percent of active students did achieve the goals they set out to meet (compared to thirteen percent of all registrants who completed the course).

Lessons learned: We should consider changing course designs to meet a variety of student goals. Instead of assuming that all students will interact with all course materials from A to Z, make it easier to search for small nuggets of content. Publish clear learning objectives that enable students to self-select whether they will get what they want out of the course. Lastly, consider publicizing multiple paths that students could take through the course.

Self-evaluation

Although peer grading has become quite popular in MOOCs because it relieves professors of the burden of grading thousands of assignments, we believe self-evaluation offers greater benefit to students. Self-grading helps build the metacognitive skills students will use when applying what they learned in class. For example, we want students who create a map after the class to stop and think about the qualities of an effective Google Map. By having them evaluate their maps against these criteria, our hope is that they will continue to apply these skills after the class.

In Advanced Power Searching, students submitted two case studies that detailed how they solved complex challenges related to their lives in order to earn certificates of completion. Students provided great examples of how they used Google tools to research their family’s history, the origins of common objects, or trips they anticipate taking. In addition to listing their queries, they wrote details about how they knew websites were credible and what they learned along the way. They graded their own assignments based on a rubric we provided. Similarly, in Mapping with Google, students created maps and evaluated them based on a checklist.

Teaching assistants (TAs) graded a random sample of student assignments. We found a modest yet statistically significant correlation between TAs’ grades and students’ grades, a low incidence of cheating (duplicate assignments), and an overall high quality of work. In fact, the majority of students graded themselves within six percentage points of how an expert grader would have assessed their work. This is a positive result, since it suggests that self-graded project work in a MOOC can be a valuable assessment mechanism. Reading stories of how people used their new skills to plan vacations, find jobs, and research ordinary objects was one of the most inspiring aspects of this course for the TAs.
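The agreement check described above can be sketched as a simple comparison of self-grades against TA grades on the sampled assignments. The grades and function name below are hypothetical, not the actual course data or grading code:

```python
# Illustrative sketch (hypothetical data, not the course's grading records):
# report the share of sampled assignments where the student's self-grade
# falls within a tolerance of the TA's grade.

def agreement_within(self_grades, ta_grades, tolerance=6):
    """Fraction of assignments where |self - TA| <= tolerance points."""
    pairs = list(zip(self_grades, ta_grades))
    close = sum(1 for s, t in pairs if abs(s - t) <= tolerance)
    return close / len(pairs)

# Hypothetical self-grades and TA grades on a 0-100 scale:
self_scores = [90, 85, 78, 95, 60, 88]
ta_scores = [88, 80, 75, 99, 70, 90]
print(f"{agreement_within(self_scores, ta_scores):.0%} within 6 points")
```

With six-point tolerance, only the one pair differing by ten points falls outside agreement in this toy sample.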

Lessons learned: Continue to explore self-evaluation as an assessment mechanism. Test rubrics against a broad sample of submitted assignments. Provide additional guidance to students on evaluating their own work.

Areas of future exploration and reflection

We have launched five courses and iterated to improve aspects of each course. Future product education courses will involve many more experiments, as many hands-on activities as possible, opportunities for students to connect with each other, short videos, and opportunities for students to evaluate their work.

We also anticipate continuing to experiment with motivation, community, and personalization. How do we inspire students to achieve their goals? How do we maximize the value of having tens of thousands of people working on the same content at roughly the same time? How do we help students collaborate with each other to further differentiate the content? How do we provide personalized learning experiences for all students?

Though much has changed since that staff meeting a year and a half ago, and over 45,000 students have completed our courses, we still have more questions than answers.