Learning Analytics Lessons from Moneyball

by Ellen Wagner, WCET Executive Director

The recent release of the movie Moneyball has drawn attention to an idea that has also been top of mind at WCET in recent months: What might happen if we used advanced statistical methods to examine the mounds of data describing a venerable old American institution and came up with new, exciting and decidedly different ways to use results as intelligence to inform decision-making about, well, just about everything?

Moneyball tells the story of how the Oakland Athletics Major League Baseball team applied the principles of sabermetrics to analyze every aspect of their game, and then to invest their relatively scant salary dollars in very smart, statistically informed ways. The book, Moneyball, was written by Michael Lewis in 2003. The movie, starring Brad Pitt, was released in September 2011.

If you have been watching the rising tide of interest in applying pattern recognition and predictive analytics to forecast such things as, say, points of student loss and momentum in educational settings, you will definitely want to read the book, see the movie, or both. It will help you understand the appeal of learning analytics, and will explain why learning analytics adoption is probably going to occur more quickly than the 4–5 years that the Horizon Report has posited.

Moneyball is not a story of how statistics saved the day for Oakland, any more than collecting more and more data on everything we do in post-secondary online education is going to save the day for educational quality. Everybody in baseball has collected, is collecting, and will go on collecting statistics on just about everything that anybody did, is doing, or might do in baseball. Similarly, educators collect all kinds of information about everything that online students do.

What was different was that the A’s started using sabermetrics to analyze each and every aspect of the game – and the business – of baseball. A’s executive leadership then made decisions informed by the specialized analysis of objective, empirical evidence – specifically, baseball statistics that measure in-game activity.

Bill James, one of sabermetrics’ pioneers and often considered its most prominent advocate, was puzzled that, in spite of evidence to the contrary, important baseball decisions continued to be made on the basis of personal biases, folk wisdom, and how a player looked. He was also struck by the aversion to using the data baseball collected about itself:

“Baseball keeps copious records, and people talk about them and argue about them and think about them a great deal. Why doesn’t anybody use them? Why doesn’t anybody say, in the face of this contention or that, ‘Prove it’?”

When I saw that paragraph in the book I froze: What if we substituted the words “postsecondary education” for “baseball”? In education we would need to be content with demonstrating tenability at a particular level of significance…but still.

I was also struck by James’s observation that there are some parts of the game of baseball where the metrics used to evaluate performance were not only inadequate – they actually lied and, in the case of fielding errors, made judgments based on what the scorer thought should have happened!  And that made me reflect a bit on student selection interviews and student teaching observation and internships and other programs where certification of competency comes from being observed by an expert.

It also made me wonder about what we are all going to need to do to ensure the thoughtful adoption of analytics practices, services and software. What ARE the essential skills, tools, and resources required to actively inform practice decisions when we are likely to be confronted with all kinds of reasons that we can’t or won’t believe in our own numbers?

How will we respond to the challenges and opportunities coming from “big data”? Douglas Bowman protested his experience working under what he called the “Sword of Data” by resigning his post as the lead designer at Google when he found his design decisions being overruled by engineers fueled by customer use case statistics. How will educators respond when their “art of teaching” is confounded by the science of empirical evidence?

Fueling the fires of interest in learning analytics is the appeal of personalizing the learning experience, along with the promise that analytics will help identify specific points of student loss and momentum. And while there are legitimate privacy and transparency concerns to be sorted out, the pull of these benefits is likely to drive solutions to those barriers.

Lest you think I am being starry-eyed about predictive analytics, let me point to three recent education examples:

  • Kevin Carey’s great article in the Atlantic recently suggested a Match.com-like online college admissions service that could transform the college admissions process.
  • At Austin Peay State University, students have begun using a recommendation system designed by the provost to guide them as they pick their courses—a step that could change GPAs and career paths.
  • My favorite quote of recent months comes from Dr. Phil Ice, American Public University, during the closing plenary session at the Sloan-C Emerging Technologies for Online Learning conference when he talked about the feasibility of using ecommerce recommendation techniques in learning settings: “Mathematically speaking there is very little difference between a point of sale and a learning outcome.”

Perhaps my favorite example of all: The Predictive Analytics Reporting (PAR) Framework. This is the project that WCET and six forward-thinking members undertook to see if we could federate de-identified student records and then apply descriptive, inferential and predictive analysis techniques to the single data set, looking for points of student loss and student momentum.
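To give a feel for the descriptive end of that pipeline (PAR’s actual methods and variables are not detailed here, so the record structure and engagement measure below are purely illustrative), one might federate de-identified records from several institutions and then look at retention rates by engagement band:

```python
from collections import defaultdict

# Hypothetical de-identified student records from two institutions.
institution_a = [
    {"logins_per_week": 5, "retained": True},
    {"logins_per_week": 1, "retained": False},
    {"logins_per_week": 4, "retained": True},
]
institution_b = [
    {"logins_per_week": 2, "retained": False},
    {"logins_per_week": 6, "retained": True},
    {"logins_per_week": 1, "retained": True},
]

def federate(*record_sets):
    """Pool already de-identified record sets into one data set."""
    return [r for records in record_sets for r in records]

def retention_by_engagement(records, threshold=3):
    """Retention rate per engagement band: a crude look for points of loss."""
    tallies = defaultdict(lambda: [0, 0])  # band -> [retained, total]
    for r in records:
        band = "high" if r["logins_per_week"] >= threshold else "low"
        tallies[band][0] += int(r["retained"])
        tallies[band][1] += 1
    return {band: kept / total for band, (kept, total) in tallies.items()}
```

A gap between the “low” and “high” bands in output like this is the descriptive starting point; inferential and predictive techniques would then test whether the pattern holds and forecast which students it applies to.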

I am pleased to tell you that members of the PAR project team will be sharing preliminary findings during the upcoming WCET Annual Meeting.   We think that we have found some patterns of student engagement that will change the way we think about providing one-size-fits-all online education experiences.  We can’t wait to see you in Denver so we can tell you all about what we’ve learned.

Biggest learning of all – this is about a whole lot more than arcane methodologies for generating statistics. It is ALL about reframing the way that we think about using data. In Moneyball, they used analysis to find ways to win more games. With PAR, we’re using analysis to find ways to keep people in school and make their learning experiences more relevant and engaging. In Moneyball, once the secret of sabermetrics was revealed, some pundits believed that the A’s lost their competitive advantage. The great thing about analytics in education is that, as we raise the bar on quality and accountability, ALL education stakeholders will be winners.
