U-Pace: Building Student Success Through Content Mastery & Proactive Support

The University of Wisconsin-Milwaukee is a 2014 WCET Outstanding Work Award winner for its innovative online instructional approach, U-Pace. Today Diane Reddy and Ray Fleming, co-creators of U-Pace, and Laura Pedrick, executive director of UWM Online, share with us a little about the program.

Putting data into action. Valuable student data is now easily available to instructors through the Learning Management System (LMS) at their college or university. A wealth of information about each student’s work habits and progress is automatically recorded for each course. But while advances in learning management systems and learning analytics are accelerating, a gap remains between instructors’ ability to access student data and to act upon the data in an empirically-based way to fully utilize the potential to benefit their students. The U-Pace instructional approach uses information about learner engagement and performance recorded in the institution’s LMS to maximize students’ learning experience and provide personalized support for students to be successful.

What does U-Pace mean for instructors? U-Pace is a self-paced, mastery-based online instructional approach that works within any LMS. U-Pace proactively supports learners through instructor-initiated messages called Amplified Assistance.

The mastery-based learning component of U-Pace consists of course content divided into small, manageable units (usually half of a chapter/lesson) that are each associated with a 10-item quiz. Students must demonstrate mastery on each unit quiz by scoring at least 90% before they can advance to new content. Retakes are unlimited (with a required one-hour wait between attempts), and consist of different quizzes that cover the same content.
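The mastery rule described above (at least 90% on a 10-item quiz, unlimited retakes with a one-hour wait between attempts) can be sketched as a simple gating check. This is purely illustrative; the function names and structure are our assumptions, not part of any actual U-Pace software.

```python
from datetime import datetime, timedelta

MASTERY_THRESHOLD = 0.90          # score at least 90% (9 of 10 items) to advance
RETAKE_WAIT = timedelta(hours=1)  # required wait between quiz attempts

def has_mastered(score: int, total_items: int = 10) -> bool:
    """True if the quiz score meets the 90% mastery threshold."""
    return score / total_items >= MASTERY_THRESHOLD

def can_retake(last_attempt: datetime, now: datetime) -> bool:
    """Retakes are unlimited, but must be at least one hour apart."""
    return now - last_attempt >= RETAKE_WAIT

# A student scoring 8/10 has not yet mastered the unit...
print(has_mastered(8))   # False
# ...but may attempt a different quiz on the same content after an hour.
print(can_retake(datetime(2014, 11, 1, 9, 0), datetime(2014, 11, 1, 10, 30)))  # True
```

The one-hour wait is the piece that discourages rapid guessing and nudges students back into the course material between attempts.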

Amplified Assistance consists of tailored feedback and motivational support emailed at least weekly to each student, which may be particularly useful for students who are struggling but reluctant to ask for help. Instructors use information recorded in the LMS to craft Amplified Assistance messages personalized for each student. Valuable information provided by most LMSs includes:

  • When was the last time the student took a quiz or accessed the course material?
  • How many attempts does the student require to demonstrate mastery on a quiz?
  • Are students missing questions related to one particular concept that the instructor can help clarify?
  • Are students missing questions from multiple different content areas, perhaps suggesting they need assistance with their study skills and general approach to learning the material?

How does Amplified Assistance help students? In Amplified Assistance messages, instructors communicate unwavering belief in the student’s ability to succeed, and praise the student for small accomplishments (such as mastering a single quiz). Instructors also reinforce students’ effort and persistence (for example, by praising a student for consistently making attempts at a quiz they have found challenging to master), which may be especially beneficial for students who have the tendency to give up after minor set-backs. By focusing on the student’s positive behavior, instructors are shaping the student’s behavior for success. Through Amplified Assistance messages, U-Pace instructors provide the support students need to meet the high standards created by U-Pace’s mastery-based learning component, a combination which empowers students and fosters their sense of control over learning.

How can instructors create Amplified Assistance messages efficiently? Past U-Pace instructors have field-tested dozens of Amplified Assistance messages and created templates that are freely available on the U-Pace website. The variety of templates offered allows instructors to select a message from an appropriate category (e.g., “Students who are on schedule,” “Students who are behind schedule,” or “Students who have not started”) and then tailor each message based on the individual student’s performance. These templates reduce the time needed for instructors to compose effective messages to address student needs, while still offering instructors the flexibility to craft personalized messages that will be meaningful and build rapport with each student.
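As a sketch of how that triage might work, the rule below sorts students into the three template categories the post names, using LMS progress data. The field names and the pacing threshold are illustrative assumptions, not rules from U-Pace itself.

```python
def template_category(units_mastered: int, units_expected: int) -> str:
    """Pick an Amplified Assistance template category from LMS progress data.

    units_expected is how many units a student pacing toward on-time course
    completion would have mastered by this point in the term (an illustrative
    scheduling assumption, not a U-Pace rule).
    """
    if units_mastered == 0:
        return "Students who have not started"
    if units_mastered >= units_expected:
        return "Students who are on schedule"
    return "Students who are behind schedule"

# With six units expected by mid-term:
print(template_category(0, 6))  # Students who have not started
print(template_category(4, 6))  # Students who are behind schedule
print(template_category(7, 6))  # Students who are on schedule
```

The instructor would then personalize the chosen template with the student's specific quiz history before sending it.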

Does U-Pace produce results? The U-Pace instructional approach has been honored with a 2014 WCET Outstanding Work (WOW) Award. U-Pace instruction produced striking student success in multiple rigorous evaluations, including a large randomized controlled trial (where learners did not self-select their course format) funded by the U.S. Department of Education, Institute of Education Sciences, and a multi-institutional study funded by EDUCAUSE’s Next Generation Learning Challenges program. U-Pace has consistently produced greater learning and greater academic success compared to conventional face-to-face instruction. U-Pace students have scored higher than conventionally taught students on proctored, cumulative exams taken at the end of the course, and again six months later (EDUCAUSE Review Online). A greater percentage of U-Pace students (compared to conventionally taught students) have earned final grades of A or B (EDUCAUSE Learning Initiative Case Study). Furthermore, U-Pace students have shown improvements in self-regulated learning, as evidenced by a decrease in the number of attempts needed to master the quizzes over the course of the semester (Journal of Asynchronous Learning Networks).

[Charts: U-Pace evaluation results]

How do students react to U-Pace instruction? Survey data show that, relative to conventionally taught students, U-Pace students perceive greater instructor support, greater control over their learning, and improvements in time management and study skills over the semester (NGLC Grantee Profile, Journal of Asynchronous Learning Networks). Student reviews have mirrored these findings:

“I am actually retaining the information that I learned in this course. It has helped me out so much in boosting my confidence, and actually showing me, and opening the door, and saying you are just a step further from graduation and you can succeed because you have all these skills in you that you might have never seen before.”

“I go out and try new things, and I know that that sounds really weird, that a course can change someone like that, but you know that it is, I learned the content as well, but it is not even that, it is the fact that I am learning to be myself more, and I am opening up more doors to being motivated and having better time managing skills and being more confident in myself. Outside of school, people have noticed changes in me, that I have more of a glow to me, that I am more outgoing, almost because I have that confidence that I can actually do stuff that I used to think I had no business doing.”

Bottom line. By acting upon student data recorded in the LMS, instructors can have a meaningful impact on students. U-Pace is an empirically-tested instructional approach that has shown great success in utilizing this data to motivate, engage, and improve the learning of students. By integrating the U-Pace instructional method with LMS capabilities, instructors have the opportunity to maximize the value of these tools for guiding students to success.

If you’d like to learn more about U-Pace instruction, we’d be delighted to talk with you at the WCET Annual Meeting at one of our presentations on Thursday, November 20th: Conversation about Competency Based Education (1:30 – 2:30 pm) & U-Pace Instruction: Paving the Way to College Success (3:00 – 4:00 pm)


Diane Reddy, Co-creator of U-Pace Instruction

Ray Fleming, Co-creator of U-Pace Instruction

Laura Pedrick, Executive Director, UWM Online

Education Department Urges Colleges to Follow IPEDS Distance Ed Definitions

In an extended conversation, U.S. Department of Education (US ED) IPEDS personnel confirmed which distance education enrollment counts colleges should report to the Department's IPEDS (Integrated Postsecondary Education Data System) survey.

The Department representatives also wondered why we did not highlight some of the errors colleges made in reporting their enrollments. They encouraged colleges to follow the IPEDS definitions and instructions and to call the Department with any questions.


A few weeks ago Phil Hill of the e-Literate blog and I reported on anomalies we found in how colleges reported their distance education enrollments to the U.S. Department of Education. Earlier this year the Department released data from its Fall Enrollment 2012 IPEDS survey. For the first time in a long time, the survey included counts of distance education students.

Upon publishing our initial IPEDS blog posts analyzing distance education enrollments, we heard from some of our readers. They told us about situations in which their reported numbers strayed from what was expected of them.

Undercounts:  Some Colleges Did Not Report All of Their Distance Education Students


There are many definitions of “distance education.”

The first report was from a college that did not report any students enrolled in its continuing education college, a self-support unit (one that receives no state funding). We were surprised at this and learned that other colleges also did not report all their distance education students.

In following up, we found that some of those we contacted were unaware that some campuses have self-support entities offering for-credit courses leading to full degrees. They do exist. The most common instance is a public college with a College of Continuing Education. Jim Fong, Director of the University Professional and Continuing Education Association's Center for Research and Consulting, said that his organization has about 370 members. In a survey a couple of years ago, about 91% of respondents to his inquiry had for-credit offerings. He did not have data on how many are self-support units.

Reasons for the Undercount

We heard different reasons for not reporting these students:

  • Misunderstanding the IPEDS instructions.  The survey instructs colleges to: “Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs).”  Of course, this instruction is intended to reference non-credit CEU courses, not colleges of continuing education.  It is conceivable that someone misread the instruction.
  • They understood, but it was too difficult to do.  For some colleges, the data systems for the continuing education college differ from those for the main campus.  Merging the data is difficult and would require calculations by hand.
  • They chose not to report the correct enrollments to IPEDS.  A college might decide that it does not wish to report different enrollment numbers to IPEDS than it reported to the state, even though the requirements for each government entity are different.
  • Their data system was not ready.  One college said that their data system simply was not ready to report the correct numbers.

Response from the Department

The Department was offered a chance to provide a written response, but they declined.  In their discussion with me they noted:

  • Most of the reasons given above were not due to the IPEDS definition, but were due to errors or inaction by the colleges.  That’s a fair point.
  • The definition asks colleges to: “Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.”  There is no mention of how the courses are funded or whether the courses are offered by a continuing education college.  They were very clear that students enrolled in for-credit courses in colleges of continuing education should have been included in the counts.  The Department will not issue a clarifying document, but they plan to inform the state IPEDS coordinators when they next meet.

Overcounts:  Some Colleges Used the Wrong Definition of “Distance Education”

As we talked to colleges, we learned that some did not use the IPEDS definition of “distance education.”  IPEDS defines a “distance education course” as: “A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.”

Reasons for the Overcount

We heard different reasons for using their own definitions:

  • Misunderstanding the IPEDS instructions.  One institution said that it tried to get a clarification on the definition and was still confused even after contacting the IPEDS call center.
  • They understood, but it was too difficult to do.  The state and/or an accrediting agency may already have its own definition that differs from the IPEDS definition and it would be difficult to create another classification just for IPEDS.  A few examples:
    • The Southern Association of Colleges and Schools Commission on Colleges defines distance education as when “a majority of instruction (interaction between students and instructors and among students) in a course occurs when students and instructors are not in the same place.” By majority, colleges are interpreting that to mean more than 50% of the instruction.
    • The Texas Coordinating Board defines a “Fully Distance Education Course” as having “mandatory face-to-face sessions totaling no more than 15 percent of the instructional time.”  Therefore, at least 85% of the instruction must be at a distance.
  • They chose not to report the correct enrollments to IPEDS.  A college might decide that it does not wish to report different enrollment numbers to IPEDS than it reported to the state, even though the requirements are different.

Response from the Department

In their discussion with me they noted:

  • Most of the reasons given above were not due to the IPEDS definition, but were due to errors or inaction by the colleges.  Once again, that’s a fair point.
  • The IPEDS “distance education” definition (cited above) defines a distance education course as being delivered nearly 100% at a distance.  The definition is clearly listed in the IPEDS Glossary.  While they understand that states may have differing reporting requirements, they were very clear that they expect colleges to use this nearly-100% definition in reporting distance education enrollments.  Again, the Department will not issue a clarifying document, but they plan to inform the state IPEDS coordinators when they next meet.

In Conclusion…

Some final thoughts:

  • As shown with the “distance education” definition examples, a college in Texas would need to report distance education as 51+% of a course to SACS, 86+% of a course to its Coordinating Board, and nearly 100% of a course to IPEDS.  You can see the difficulties they face.
  • The Department did not seem to think that the errors from these anomalies were significant.  From the enrollment numbers reported to IPEDS, about one in eight students take all of their courses at a distance and about one in four take at least some distance courses.  Those are significant numbers, and I'd like to see both colleges and IPEDS strive to make future counts as accurate as possible.
  • Those colleges waiting for a clarification from the Department will not see anything dramatic.  They may want to call the Department if they have any questions or feel that they might not be reporting enrollments correctly.
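To make the Texas example concrete, the sketch below classifies a single course under the three thresholds described above: SACSCOC (a majority of instruction at a distance), the Texas Coordinating Board (face-to-face sessions totaling no more than 15 percent), and IPEDS (instruction delivered exclusively at a distance). The function is purely illustrative.

```python
def classifications(pct_at_distance: float) -> dict:
    """Classify one course's distance education status under three definitions.

    pct_at_distance: percent of instruction delivered at a distance (0-100).
    """
    return {
        # SACSCOC: a majority of instruction occurs at a distance
        "SACSCOC": pct_at_distance > 50,
        # Texas Coordinating Board: face-to-face time is at most 15%
        "Texas": (100 - pct_at_distance) <= 15,
        # IPEDS: instructional content delivered exclusively at a distance
        "IPEDS": pct_at_distance == 100,
    }

# A course with 90% of its instruction at a distance counts as distance
# education for SACSCOC and Texas, but not for IPEDS:
print(classifications(90))  # {'SACSCOC': True, 'Texas': True, 'IPEDS': False}
```

A single course can therefore be "distance education" for an accreditor and a state board while falling outside the IPEDS count, which is exactly the reporting difficulty described above.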

Finally, we could ask whether the Department's definition of “distance education” is a useful one.  On the plus side, it is a clear definition.  On the negative side, the “nearly 100%” definition does not reflect current practice.  But that's a question for a different day, and it is a discussion that may need to include accreditors and states.

For now, let’s use the definitions as presented by the Department so that IPEDS has accurate data to inform federal financial aid policies.


Russell Poulin
Deputy Director, Research & Analysis
WCET – WICHE Cooperative for Educational Technologies
Twitter:  wcet_info and RussPoulin

Join us in Portland, OR for the WCET Annual Meeting – November 19-21.


Photo credits:
Dictionaries – Morgue File
“Disclaimer” definition – Morgue File


Intellipath for MBA preparation

Today we welcome Colorado Technical University Chief Academic Officer and Provost Connie Johnson and CTU faculty member Sarah Pingrey as they share what their WCET Outstanding Work (WOW) Award-winning program has done to improve the student and faculty experience in MBA-preparatory courses.

One of the great pleasures I have as Chief Academic Officer of Colorado Technical University is working with a large group of talented faculty who embrace new technology to improve student learning. In 2012, CTU committed to creating a personalized learning experience for our students.

Because CTU has many adult students pursuing an MBA who may hold undergraduate degrees in other disciplines, University Dean of Business Dr. Gail Whitaker worked with the business program committee to integrate adaptive learning into business prerequisites. The purpose was two-fold: to ensure that each student received the content he or she specifically needed as a knowledge base for the MBA program, and to comply with prerequisite requirements prescribed by the Accreditation Council for Business Schools and Programs (ACBSP). This innovative approach, for which CTU received the WCET Outstanding Work Award, provided students with a tailored learning path for topics including accounting, statistics, economics, and finance.

Central to the implementation of the adaptive learning technology (Intellipath) were faculty who created relevant assessments and content for each course. Intellipath delivers content to each student based on assessment results and gives instructors the ability to work directly with students.

The Faculty Experience

Sarah Pingrey, faculty member at CTU, shares her experience using the platform:

I started working on the adaptive learning platform Intellipath at CTU in the spring of 2012. From development to testing to piloting courses to full implementation, I’ve seen Intellipath grow into an essential learning platform for students. Throughout my teaching career other platforms have tried to woo me, but Intellipath does something different – faculty members are intimately involved in their students’ progress every step of the way.

Teaching in an Intellipath classroom is such a joyful experience. Training is simple, with videos and documents to review and a short quiz to demonstrate competency. Once training is complete and fundamental best practices are understood, the next step is to delve deeper into exactly what Intellipath offers and how to access and use this information. With so many students entering the classroom scared that mathematics will be the end of their college careers, I am able to follow their progress through the course objectives, praise their successes, and help them immediately when they struggle. Intellipath gives me the information I need to do this, and there is no way a student can fall behind without me knowing.

Intellipath contains detailed data for the entire class and for each student, and using this data effectively is crucial. The first thing I want to know is whether a student has started working on their weekly assignments. Intellipath clearly shows which students have started or completed the assignment. Also, it only takes a quick glance to find a student’s current knowledge score on the assignment, the number of objectives completed, the time spent working on the assignment, and the day that the assignment was last accessed. This information is such a treat for an instructor to have. Instructors can now motivate students who have not started the assignment and give praise to those who have.
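A minimal sketch of the kind of per-student record such a dashboard exposes, and the simplest triage rule built on it, might look like the following. The field names here are illustrative assumptions, not Intellipath's actual data model.

```python
from dataclasses import dataclass

@dataclass
class StudentProgress:
    """One student's progress on a weekly assignment, as a dashboard might show it."""
    name: str
    started: bool
    knowledge_score: float    # current knowledge score on the assignment, 0-100
    objectives_completed: int
    minutes_spent: int
    last_accessed: str        # e.g. "Tuesday", or "never"

def needs_nudge(student: StudentProgress) -> bool:
    """Flag students who have not yet started, so the instructor can motivate them."""
    return not student.started

roster = [
    StudentProgress("Alice", True, 87.5, 6, 140, "Tuesday"),
    StudentProgress("Ben", False, 0.0, 0, 0, "never"),
]
print([s.name for s in roster if needs_nudge(s)])  # ['Ben']
```

The value is less in any one field than in having all of them visible at a glance, so the quick motivate-or-praise decision the author describes takes seconds rather than a records search.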

Students can also easily flag difficult problems. A detailed solution is provided for every problem, but if a student doesn't understand the solution or has a question, they can flag the problem by just pushing a button. The problem, the student's solution, and the scratch work can be viewed, and I am able to leave feedback for the student. Encouraging students to use this feature is crucial, and students are very likely to use it since they can ask questions without having to email the instructor directly: pushing a button is easy.

Intellipath has definitely led to more interaction between students and faculty. It has also changed the dynamics of synchronous lectures. Having the lectures apply to all students can be challenging when some students have already started their Intellipath assignments and have very specific questions, while other students don’t have enough foundational knowledge yet to jump into answering these questions. Having organized slides and corresponding class activities, and being able to jump around in them during the lecture, makes teaching more effective for both students and faculty.

The biggest challenge for an online professor can be making that initial connection with students. Students may be struggling, but what they are struggling with is unknown until it is too late. Intellipath takes away the mystery of why a student is struggling and makes interactions between the instructor and student easy, fun, and frequent. I am excited for the future of Intellipath, and most of all, excited that students are truly learning!

If you’re interested in learning more about CTU’s Intellipath for MBA-preparation program, be sure to join us at the WCET Annual Meeting where Connie will share more about the program on Thursday, November 20.


Connie Johnson
Chief Academic Officer & Provost
Colorado Technical University

Sarah Pingrey
Colorado Technical University

Capella University FlexPath

Capella University is a 2014 WCET Outstanding Work (WOW) Award recipient for the development of FlexPath. Deb Bushway, Vice President of Academic Innovation and Chief Academic Officer, shares with us today the evolution of FlexPath from pilot to celebrating its first birthday as a program. Capella will accept the award at the WCET 26th Annual Meeting.

In May 2012, Capella University's academic leadership made the decision to delve into the world of direct assessment. The impetus for designing a direct assessment program came from a conversation with leaders at the U.S. Department of Education regarding barriers to innovation. They recommended that we leverage the Department's Title IV eligibility language, introduced in August 2006 and finalized that November, to explore our options for introducing direct assessment programs. As the process rolled out, it became clear that we also needed approval from our regional accreditor, the Higher Learning Commission, to begin developing direct assessment courses under this portion of Title IV.

It is important to note that all curricula at Capella University are competency based. These academic programs are still rooted in courses and comply with seat time (and other) requirements for credit bearing distance education. This curricular base provided a great starting place to move to a competency-based, direct assessment delivery model.

The Self-Paced Course Pilot

Initially, we started with a pilot to help us understand the learning experience in a self-paced delivery model. The initiative was simply referred to as the Self-Paced Course Pilot. We based these courses on the infrastructure built to support our credit-bearing programs. From the outset of design and development, we have taken a very faculty-driven approach to design and delivery. The faculty chairs from both programs worked directly with our instructional designers, curriculum specialists, assessment specialists, and project managers as we built a self-paced course experience that would be as rigorous and engaging as our more “traditional” online credit-bearing model. The team worked very closely together for many months, with a goal of offering four courses from each of the two programs in early January 2013.

Support Structures for Learners

Concurrently, we built out a support structure for our learners to be able to achieve the necessary competencies to successfully complete each course in a self-paced format. After countless meetings with faculty, advising staff, and a host of other contributors, we arrived at a three-pronged support structure. This learner support structure consists of traditional faculty, tutoring faculty, and coaching faculty. In traditional credit-bearing delivery models, these three support roles are often integrated into the work of a single faculty member. With this new architecture of the faculty role, extensive training was necessary for the individuals who took on these new functions.

Capella faculty chairs selected faculty members from our traditional programs to pilot this new approach to teaching and learning. These faculty members then led the work to articulate competencies, align the criteria for assessing them, design the authentic assessments, and serve as evaluators of learners' demonstrations of competency. The tutoring faculty are aligned to particular “clusters of competency” (or courses) and are content-area experts. Many are enrolled in our doctoral programs at Capella University and have demonstrated success in the relevant content area. One reason for this design is that our research indicates adult learners prefer seeking help from peers rather than from traditional faculty. Finally, the coaching faculty team was formed. This team came from among our traditional advising teams, although this model is significantly more proactive and engaged than the traditional advising model. Each learner is assigned a coaching faculty member who stays with that individual throughout his or her experience at Capella.

Aligning the Technology

The third major component in developing FlexPath was to align Capella’s technology infrastructure to accommodate the needs of a competency-based direct assessment delivery model. There are many unique attributes to FlexPath that our systems simply were not designed to accommodate. These attributes range from not having any course level grades to not being able to transcribe individual competencies for each course. Other technological areas needing alterations included our learning management system, our institution’s learner website (iGuide), and our student administration system. We needed to accommodate these attributes without making permanent changes to our systems, not knowing if the direct assessment delivery model would be accepted widely enough to make permanent changes a worthwhile endeavor. Additionally, the university’s catalog, policy, and external communications needed to reflect all of the changes needed to deliver FlexPath. The entire initiative took dozens of people and thousands of hours’ worth of work before a single learner was enrolled into a FlexPath course!

Celebrating Successes & Expanding to the Future

All in all, FlexPath has been and continues to be an exciting endeavor. We have been honored with several awards for our efforts, most notably, the WOW Award from WCET, along with Blackboard Learning’s Innovation Award and the NUTN Award for Innovation in Distance Education.

FlexPath will soon be celebrating its first birthday. There are now a total of six degree programs offered in the FlexPath model. As the FlexPath program expands, so does our knowledge base for developing high-quality, competency-based direct assessment programs. With that said, we identify opportunities to enhance FlexPath on a daily basis. As more and more universities take on the challenge of implementing this type of program, we look forward to the opportunity to participate in a larger community of practice around direct assessment to further address the needs of 21st century adult learners and employers.

Learn more about the first year of FlexPath on November 20th at 9:45am during #WCET14.


Deborah Bushway

Vice President of Academic Innovation & Chief Academic Officer

Capella University

Learner-Centric Shifts in Education

Katie Blot, Senior Vice President of Education Services at Blackboard, shares with us how MyEdu is helping learners succeed through academic planning and out into the marketplace.

When we talk about changes in education, the best place to start is with the learner.  And if there’s one statistic that highlights the shift in who our learners are, it could be this: today less than 15 percent of higher education students in the U.S. are what we would call “traditional” 4-year residential students.  That means that there are roughly 18 million post-traditional students who are over 25, need to work to afford education, attend multiple institutions, have dependents, and are actively working toward job and career goals.

Even “traditional” undergraduates are seeking flexibility and transparency in their educational options (online, self-paced, dual enrollment, accelerated degrees, competency based learning, etc.), such that the old distinctions between “traditional” and “non-traditional” are not really applicable anymore.  Post-traditional students do not fit in clearly defined categories, and they follow extremely varied educational pathways.

Economics Make Students More Focused on Their Goals

These shifts are driven in part by economics.  The need for higher education opportunities is greater than ever before, as increasingly jobs require post-secondary education.  Research repeatedly emphasizes that higher education leads to greater economic attainment, both for individuals and for our country.  But with the recession and the rise in the cost of higher education, degree attainment is extremely difficult for many people to afford.  Potential students are looking for a more economical means to access upward mobility and are accelerating huge consumer-driven changes in higher education.

Students today are much more focused on establishing a clear-cut career path and figuring out which job is going to help them earn a living and eventually pay for or justify the cost of their education.  This leads to a focus on the competencies they’ll gain from their education and how they can demonstrate those competencies to prospective employers and translate them into gainful employment.

What Do College and University Presidents Foresee? It’s Not Always Positive.

Naturally, these shifts in learner behavior put significant pressure on our institutions.  But educational institutions, even in their own estimation, are not adapting quickly enough.  Our recent research co-sponsored by The Chronicle of Higher Education gathered insights from over 350 college and university presidents on the topics of change and innovation in higher education.

  • Nearly 70% of presidents believe that at least a moderate amount of disruption is needed in higher education.
  • 63% of presidents believe that the pace of change is too slow.
  • While 60% of respondents are optimistic about the direction the American higher education system is going in, only 30% felt as if the higher education system is currently the best in the world. Surprisingly, this drops to 17 percent when asked about the next decade.
  • And fewer than half (49%) of presidents believed we are providing good or excellent value to students and their families.

How will we approach systemic educational changes?  We’ve been talking for a while about mobile, personal, and flexible – but that is not enough.  Now we need to add affordable, modular, and accessible.  The learning ecosystems that serve these needs empower students to:

  • See at a glance what’s happening, what they’ve accomplished, and what should be next.
  • Access learning materials anytime, anyplace, including diverse personalized learning resources.
  • Collaborate in learning activities while forming communities of peers and mentors.
  • Easily create learning artifacts and reflect on their own learning.
  • Collect portable evidence of learning.
  • Manage competencies and learning achievements from multiple sources.
  • Develop an online academic and professional identity.

MyEdu:  A New Academic Planning Tool

One example of how Blackboard is helping learners succeed is MyEdu.  MyEdu offers free, learner-centric academic planning tools, including a personal education planner, a flexible scheduler, and an assignment tracker.  By consolidating multiple components into a rich, easy-to-use platform, MyEdu helps students plan their degree path, take the right classes, and stay on track to graduate on time.  MyEdu also presents students with up-to-date information about courses, schedules, and professors as well as feedback from other students.

MyEdu helps learners establish their professional profiles.  The traditional resume doesn’t accurately represent the skills and talents of learners, and tools that work for mid-career professionals don’t effectively convey a student’s credentials, capabilities, and evidence of learning.   As students use MyEdu’s academic tools, they build their professional profiles with data about their school, major, courses, graduation date and academic work.  They can also personalize their profiles with credentials, projects, organizations, services, languages, work experiences and competencies, providing valuable information to employers.

And perhaps most importantly, MyEdu connects learners with employers and jobs.  Learners choose how much and what type of information from their profiles to show to employers for potential jobs and internships.  By connecting the learning achievements from their courses and projects with lifelong, cross-institutional learning profiles, these achievements and their related competencies become more powerful for helping learners succeed in their goals of completing their degrees, getting jobs, and advancing their careers. MyEdu empowers learners to recognize, manage and continuously build on their own recognizable achievements beyond any single course, program, degree, or institution.

MyEdu enables employers to connect directly to students whose data indicates that they would be a great match for open positions. They can see not only courses and credits, but also much more granular achievements that reveal who the student is and what specific talents they can bring to the job.  These are very concrete ways in which new technologies and evolving learning ecosystems serve post-traditional learners.

It’s not just about jobs—it’s the new normal of evolving careers and the need for lifelong learning. Today’s learners need to build skills and work toward credentials at any time, at any age and apply them to an ever-changing landscape of personal goals. Today’s evolutions in our learning ecosystems coincide with the rise in learners’ need to have more control over their own learning achievements.

Katie Blot
Senior Vice President
Education Services



Investigation of IPEDS Distance Education Data: System Not Ready for Modern Trends

After billions of dollars spent on administrative computer systems and billions of dollars invested in ed tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online & hybrid education, flexible terms, and the expansion of continuing and extended education. Based on an investigation of the recently released distance education data for IPEDS, the primary national education database maintained by the National Center for Education Statistics (NCES), we have found significant confusion over basic definitions of terms, manual gathering of data outside of the computer systems designed to collect data, and, due to confusion over which students to include in IPEDS data, the systematic non-reporting of large numbers of degree-seeking students.

In Fall 2012, the IPEDS (Integrated Postsecondary Education Data System) data collection for the first time included distance education – primarily for online courses and programs. This data is important for policy makers and institutional enrollment management as well as for the companies serving the higher education market.

We first noticed the discrepancies based on feedback from analyses that we have both published on the e-Literate and WCET blogs. One of the most troubling calls came from a state university representative who said that the school has never reported any students who took their credit-bearing courses through its self-supported, continuing education program.  Since they did not include the enrollments in reporting to the state, they did not report those enrollments to IPEDS. These were credits toward degrees and certificate programs offered by the university and therefore should have been included in IPEDS reporting based on the following instructions.

Calculating distance education enrollments hit some major snags.

“Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.”

Unfortunately, the instructions call out this confusing exclusion (one example out of four):

“Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs)”

How many schools have interpreted this continuing education exclusion to apply to all continuing education enrollments? To do an initial check, we contacted several campuses in the California State University system and were told that all IPEDS reporting was handled at the system level. Based on the introduction of the Fall 2012 distance education changes, Cal State re-evaluated whether to change their reporting policy. A system spokesman explained that:

“I’ve spoken with our analytic studies staff and they’ve indicated that the standard practice for data reporting has been to share only data for state-supported enrollments. We have not been asked by IPEDS to do otherwise so when we report distance learning data next spring, we plan on once again sharing only state-supported students.”

Within the Cal State system, this means that more than 50,000 students taking for-credit self-support courses will not be reported, and this student group has never been reported.

One of the reasons for the confusion, as well as the significance of this change, is that continuing education units have moved past their roots of offering CEUs and non-credit courses for the general public (hence the name continuing education) and have taken on a new role of offering courses not funded by the state (hence self-support). Since these courses and programs are not state funded, they are not subject to the same oversight and restrictions, such as maximum tuition per credit hour, as their state-funded equivalents.

This situation allows continuing education units in public schools to become laboratories and innovators in online education. The flip side is that, given the non-state-funded nature of these courses and programs, it appears that schools may not be reporting these for-credit enrollments through IPEDS, whether or not the students were in online courses. However, the new distance education questions may actually prompt institutions to revisit their reporting practices.

Did Other Colleges Also Omit Students from Their IPEDS Report?

Given what was learned from the California State University System, we were interested in learning whether other colleges were having similar problems with reporting distance education enrollments to IPEDS.  WCET conducted a non-scientific canvassing of colleges to get their feedback on what problems they may have encountered.  Twenty-one institutions were selected through a non-scientific process of identifying colleges that reported enrollment figures that seemed incongruous with their size or distance education operations.  See “Appendix A: Methodology” (link added after publishing) for more details.

From early August to mid-September, we sought answers regarding whether the colleges reported all for-credit distance education and online enrollments for Fall 2012.  If they did not, we asked about the size of the undercount and why some enrollments were not reported.

Typically, the response included some back-and-forth between the institutional research and distance education units at each college.  Through these conversations, we quickly realized that we should have asked a question about the U.S. Department of Education’s definition of “distance education.”   Institutions were very unclear about what activities to include or exclude in their counts.  Some used local definitions that varied from the federal expectations.  As a result, we asked that question as often as we could.

The Responses

Twenty institutions provided usable responses. We agreed to keep responses confidential.  Table 1 provides a very high-level summary of the responses to the following two questions:

  • Counts Correct? – Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
  • Problem with “Distance Education” Definition? – Although we did not specifically ask this question, several people volunteered that they had trouble applying the IPEDS definition.


Table 1:  Counts for Institutional Responses

  Response   Counts Correct?   Problem with “Distance Education” Definition?
  Yes              11                             3
  Maybe             5                             5
  No                4                            12


One institution declined to respond.  Given that its website advertises many hundreds of online courses, the distance education counts it reported would lead us to believe that it either: a) under-reported, or b) averages one or two students per online class.  The second scenario seems unlikely.

Of those that assured us that they submitted the correct distance education counts, some of them also reported having used their own definitions or processes for distance education.  This would make their reported counts incomparable to the vast majority of others reporting.


This analysis found several issues that call into question the usability of IPEDS distance education enrollment counts and, more broadly and more disturbingly, IPEDS statistics in general.

There is a large undercount of distance education students

While only a few institutions reported an undercount, one was from the California State University System and another from a large university system in another populous state.  Since the same procedures were used within each system, there are a few hundred thousand students who were not counted in just those two systems.

In California, they have never reported students enrolled in Continuing Education (self-support) units to IPEDS.  A source of the problem may be in the survey instructions.  Respondents are asked to exclude: “Students enrolled exclusively in Continuing Education Units (CEUs).”  The intent of this statement is to exclude those taking only non-credit courses.  It is conceivable that some might misinterpret this to mean excluding everyone in the campus’s continuing education division. What was supposed to be reported was the number of students taking for-credit courses, regardless of which college or institutional unit was responsible for offering the course.

In the other large system, they do not report out-of-state students as they do not receive funding from the state coffers.

It is unclear what the full scope of the undercount would be if we knew the actual numbers across all institutions.  Given that the total number of “students enrolled exclusively in distance education courses” for Fall 2012 was 2,653,426, an undercount of a hundred thousand students from these two systems alone would be nearly a 4% error.  That percentage is attention-getting on its own.
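As a back-of-the-envelope check on that error figure (the 100,000-student undercount is a rough illustrative number drawn from the two systems discussed above, not an official count), the arithmetic is simply:

```python
# Rough sanity check: what share of the reported "exclusively distance
# education" total would a 100,000-student undercount represent?
# (Illustrative figures from this post; not an official NCES calculation.)
reported_exclusive_de = 2_653_426  # Fall 2012 IPEDS reported total
estimated_undercount = 100_000     # rough undercount from two large systems

error_rate = estimated_undercount / reported_exclusive_de
print(f"{error_rate:.1%}")  # prints 3.8%, i.e. roughly a 4% error
```

And note that this counts only two systems; the true national error rate could be larger.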

The IPEDS methodology does not work for innovative programs…and this will only get worse

One institutional respondent estimated an approximately 40% undercount in its reported enrollments because the institution uses as many as 28 start dates for courses.  A student completing a full complement of courses in a 15-week period might not be enrolled in all of those courses at the census date.  With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering.

The definition of ‘distance education’ is causing confusion

It is impossible to get an accurate count of anything if there is not a clear understanding of what should or should not be included in the count.  The definition of a “distance education course” from the IPEDS Glossary is:

“A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.”

Even with that definition, colleges faced problems with counting ‘blended’ or ‘hybrid’ courses.  What percentage of a course needs to be offered at a distance to be counted in the federal report?  Some colleges had their own standard (or one prescribed by the state), and the percentage required for a course to be labeled “distance education” varied greatly.  One reported that it included all courses with more than 50% of the content offered at a distance.

To clarify the federal definition, one college said they called the IPEDS help desk.  After escalating the issue to a second line manager, they were still unclear on exactly how to apply the definition.

The Online Learning Consortium is updating its distance education definitions.  Its current work could inform IPEDS on possible definitions, but probably contains too many categories for such widespread data gathering.

There is a large overcount of distance education students

Because many colleges used their own definition, there is a massive overcount of distance education.  At least, it is an overcount relative to the current IPEDS definition.  This raises the question, is the near 100% standard imposed by that definition useful in interpreting activity in this mode of instruction?  Is it the correct standard since no one else seems to use it?

In addressing the anomalies, IPEDS reporting becomes burdensome or the problems are ignored

In decentralized institutions, or institutions with “self-support” units that operate independently from the rest of campus, data systems are often not connected.  Institutional researchers are also faced with simultaneously reconciling differing “distance education” definitions. One choice is to knit together numbers from incompatible data systems and/or with differing definitions. Often by hand. To their credit, institutional researchers overcome many such obstacles.  Whether through misunderstanding the requirements or lacking the capacity to perform the work, some colleges did not tackle this burdensome task.

Conclusions – We Don’t Know

While these analyses have shed light on the subject, we are still left with the feeling that we don’t know what we don’t know.  That realization brings to mind former Secretary of Defense Donald Rumsfeld’s famous rumination:

“There are known knowns. These are things we know that we know. We also know there are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are ones we don’t know we don’t know.”

The net effect is not known

Some institutions reported accurately, some overcounted, some undercounted, some did both at the same time.  What should the actual count be?

We don’t know.

The 2012 numbers are not a credible baseline

The distance education field looked forward to the 2012 Fall Enrollment statistics as a welcome baseline for the size and growth of this mode of instruction.  That is not possible, and the problems will persist with the 2013 Fall Enrollment report when those numbers are released.  These problems can be fixed, but it will take work.  When can we get a credible baseline?

We don’t know.

A large number of students have not been included on ANY IPEDS survey, EVER.

A bigger issue for the U.S. Department of Education goes well beyond the laser-focused issue of distance education enrollments.  Our findings indicate that there are hundreds of thousands of students who have never been reported on any IPEDS survey that has ever been conducted.  What is the impact on IPEDS?  What is the impact on the states where they systematically underreported large numbers of students?

We don’t know.

Who is at fault?

Everybody and nobody.  IPEDS is faced with institutional practices that vary greatly and often change from year-to-year as innovations are introduced.  Institutional researchers are faced with reporting requirements that vary depending on the need, such as state oversight agencies, IPEDS, accrediting agencies, external surveys and ranking services, and internal pressures from the marketing and public relations staffs.  They do the best they can in a difficult situation.  Meanwhile, we are in an environment in which innovations may no longer fit into classic definitional measurement boxes.

What to expect?

In the end, this expansion of data from NCES through the IPEDS database is a worthwhile effort in our opinion, and we should see greater usage of real data to support policy decisions and market decisions thanks to this effort. However, we recommend the following:

  • Reporting between the Fall 2012 and Fall 2013 periods will include significant changes in methodology at participating institutions. Assuming that we get improved definitions over time, there will also be changes in reporting methodology at least through Fall 2015. Therefore, we recommend that analysts and policy-makers not put too much credence in year-over-year changes for the first two or three years.
  • The most immediate improvement available is for NCES to clarify and gain broader consensus on the distance education definitions. This process should include working with accrediting agencies, whose own definitions influence school reporting, as well as leading colleges and universities with extensive online experience.


NOTE:  The research on the IPEDS survey and this blog post are the result of an ongoing partnership between Phil Hill (e-Literate blog and co-founder of MindWires Consulting, @PhilonEdTech) and WCET. Throughout this year, we coordinated in analyzing and reporting on the IPEDS Fall Enrollment 2012 distance education enrollment data. As the anomalies came to light, we also coordinated in examining them. We very much appreciate Phil and this partnership.

Much thanks goes to Terri Taylor Straut, who performed the heavy lifting for WCET in surveying institutions and conducting follow-up calls with respondents. Her insightful questions and attention to detail were invaluable. And thank you to Cali Morrison, WCET, for her work in helping us to get the word out about our findings.


Russell Poulin
Interim Co-Executive Director
WCET – WICHE Cooperative for Educational Technologies
Twitter:  @RussPoulin





Phil Hill
MindWires / e-Literate
Twitter:  @philonedtech





Calculator Photo Credit: Alvimann

Have Fun in Portland at WCET’s Annual Meeting

Greetings from the great Pacific Northwest!

Portland (aka the City of Roses, Bridgetown, Beervana, P-Town, Rip City, Stumptown, and PDX) welcomes you to WCET’s 26th Annual Meeting. There’s no way to tell if you’ll experience the usual rain, sun breaks, or gorgeous weather when you’re here. No worries though, there’s plenty to do when you’re not engaged in sessions, rain or shine.


Portland’s known to be quite a foodie town. Check out one of the many restaurants downtown such as Mother’s Bistro, the Red Star Tavern, Veritable Quandary, Tasty n Alder, or Nel Centro. Take public transportation over to the Pearl and dine at Andina, Piazza Italia, or Isabel. For a more casual dining experience, you may want to try one of the many Portland Food Carts. There are several locations in the downtown area.


Of course, Portland’s rich with coffee and tea spots. But if you’re interested in experiencing Beervana, choose from over 50 microbrews within the city. You can check out a map of brewpubs in the Portland area at the PortlandBrewPubs web site. You can select downtown, Northwest Portland, or the Pearl and either walk or ride the MAX or Streetcar. You cannot go wrong if you want to make it easy on yourself and head over to the Rogue Distillery and Public House or the Deschutes Brewery. Find the best downtown walking map at portlandmap.com.

Shop

Did you know Oregon does not have sales tax? Yes, you can shop tax free in Portland! Choose from the NW 23rd area or Pearl District, Downtown, or on the weekend visit Saturday Market on Saturday or Sunday (rain or shine) to check out up to 250 artisans sharing their work. Stop in and listen to some music while you enjoy a tasty meal from one of the food vendors.   Of course, right across the street from the hotel is the Riverplace Marina, Shops, and Restaurants. Stop in for ice cream, sushi, drinks, or browse the shops just steps from the hotel.


You must have heard about Powell’s City of Books. It occupies an entire city block (68,000 square feet), and you may need a map just to find your way around the store.


Relax

Without leaving the city, you can take a little break and visit the beautiful Lan Su Chinese Garden, created by artists from our sister city, Suzhou. It’s not just a garden but a work of art, complete with a Teahouse.   If you come early or stay late, you may want to take advantage of our rich climate and head out of town. Take a drive on the Historic Columbia River Highway. Stop in to see Multnomah Falls, drive on to Hood River, or go all the way to Mt. Hood. If you want to head west instead, check out Cannon Beach, Lincoln City, or Pacific City on the Oregon Coast, and stop for some wine tasting on the way at one of the Willamette Valley Wineries.

As you are exploring new ideas and innovations at the annual meeting, be sure to get out and experience Portland. It’s not quite as extreme as Portlandia makes us out to be. Enjoy the city and help us keep Portland weird!


Loraine Schmitt
Director of Distance Education
Portland Community College


Come join us at the 26th WCET Annual Meeting in Portland, Oregon on November 19 – 21.  Early bird registration deadline is October 18.

Creating a New Kind of OWL: Online Writing Support that Makes a Difference

The Excelsior College Online Writing Lab (OWL) is a 2014 recipient of the WCET Outstanding Work (WOW) Award and will accept the award at the WCET Annual Meeting.  Today Crystal Sands, Director of the OWL, shares with us the goals, process, and results of a pilot study on the OWL’s use that resulted in their award.

We knew we wanted our Online Writing Lab (OWL) to stand out, to be more student friendly than other online writing resources, and to use some of the latest research about what works in writing instruction and in online education. It turned out to be a monumental task; we had just one year to build it. But, the Excelsior College Online Writing Lab was a labor of love for all of us, and I think it shows.

The Excelsior College OWL is a first-of-its-kind, open-source multimedia online writing lab. While we continue to expand its resources, the OWL already provides comprehensive writing support for students across eight areas:

  1. Locating Information and Writing with Sources takes students through the entire process of writing a research paper.
  2. Grammar Essentials provides students with detailed, student-friendly grammar, punctuation, and common error support.
  3. The Writing Process area helps students develop a strong writing process for papers that do not require research.
  4. The Essay Zone provides comprehensive support for the major rhetorical styles students are likely to encounter in college.
  5. Digital Writing supports students who are writing in digital environments, with coverage for everything from e-mails to digital presentations.
  6. The Avoiding Plagiarism tutorial explains what plagiarism is, what its consequences are, and what students can do to avoid it.
  7. The ESL Writing Online Workshop provides detailed writing process support for ESL writers.
  8. Paper Capers is an original writing process video game, allowing students to practice writing process steps and build a writer’s vocabulary, which is essential for skill transfer. The game also features mini assessments, allowing students to practice lessons from the other areas of the OWL.

Funding for building this kind of comprehensive support was generously provided by the Kresge Foundation. To fit within the funding criteria, our team worked quickly to build the OWL, completing it in just one year. During the second year of the grant, we conducted a national pilot study and based revisions upon feedback from the study.

Creating the OWL

Excelsior College worked with one writing faculty member from each of its five community college partners to develop specific goals for the OWL. We knew we wanted to create an OWL that was different from other online writing labs, one that was student-centered, warm, and engaging. We wanted to make the OWL a fun learning experience, a place that students would come back to even after their writing class was over. We decided to focus on helping students build a strong writing process, as research indicates that students who have a better writing process also have better writing products. We also needed to help students build a rhetorical foundation and vocabulary, which would assist them in becoming more flexible writers. As part of the creation of the OWL, a writing video game was developed to reinforce both the writing process and a rhetorical foundation.

As director, my job was to develop content based on feedback from the committee and try to imagine how the content could be brought to life for students. An instructional designer was critical in that process. Additionally, we worked with an outside vendor, who was committed to our idea to do something creative and fun, on the website build and design. The brainstorming sessions we had were remarkable at times, and it was not long before we were seeing our ideas become reality.

As we neared the end of the first year of the project, we realized we were doing more than we had originally envisioned in the scope of the grant—adding new content, additional areas, and working to add a creative flair to the OWL. The hours were long, but our committed, small team got the OWL ready for the pilot study, which was to begin in the fall of 2013.

The summer of 2013 was an epic time. As the project director, I was responsible for making sure deadlines were met and budgets were kept. Thankfully, we had a wonderful grants office that supported me and our team in this endeavor. My family became involved in the project as well, with my husband providing audio, and when testing on the site began, my high-school-aged son joined us in the testing. The OWL became our dinner-time conversation, and when my toddler asked me, “Mama, what is a thesis statement?” I knew I had probably crossed that work-life balance line. I knew I was not alone in crossing that line, as our team of five went above and beyond that summer. Thankfully, we were just about ready for the pilot study.

It truly was a labor of love. I don’t think we could have built such a resource in such a short time otherwise. Fortunately, our hard work has been rewarded.

The Pilot Study

Thanks to an amazing team effort, the OWL was ready to go, minus a few tweaks, for the pilot study. Our team of teachers from our partner colleges worked together to build the OWL into the curriculum of their writing classes. We ran treatment and control classes so that sets of students worked with the same curriculum, with and without the added support of the OWL. The results were positive and gave us a good start on future study of the OWL and how it benefits students.

We found that students in the treatment groups, who used the OWL regularly, scored 6.6 points higher on their final grades than students in the control groups. We also ran a “writing about writing” assessment in order to evaluate how students approached the writing process. In six of the seven categories we assessed, students in the treatment groups exhibited more growth than students in the control groups. In our assessment of the final product essays, something we knew would be tricky, as it is difficult to show improvement in just one semester, we had positive results as well. Students in the treatment groups exhibited more growth in three of the five categories we assessed, showing greater improvement in context and purpose for writing, control of syntax and mechanics, and genre and disciplinary conventions.

Students also completed extensive surveys on the OWL and their attitudes toward writing at the beginning and end of the semester. Students responded well to the OWL, reporting that the content felt relevant and helpful. Students in the treatment groups also reported greater improvements in their general attitudes about writing, with many students indicating they were more likely to write in their spare time after using the OWL.

These results are promising and are in line with the goals of the OWL. While longitudinal study is needed, we have evidence that the Excelsior College OWL provides students with a strong foundation in writing, one that is going to help them transfer the skills they learn in writing classes to other writing situations, which is, of course, the ultimate goal of writing instruction.

The OWL team at Excelsior College feels we have set the stage, through solid writing instruction and extensive multimedia support, to be the kind of free resource that students can rely upon and come back to, throughout their college careers and beyond.

Our team has been honored with the WCET Outstanding Work Award. We are excited that high schools, community colleges, and universities across the country are beginning to use the OWL in their classes and their writing centers. We have been successful in our goals to create a warm, engaging learning environment. The structure of the OWL makes it a valuable resource, whether students need one short lesson on documentation or extensive instruction in writing support. There is something for everyone in the OWL!

Crystal SandsCrystal Sands, Director
Online Writing Lab (OWL)
Excelsior College

Email Crystal

Seven Key Takeaways from the State Authorization Webcasts

In partnership with M-SARA (run by MHEC), the Online Learning Consortium (OLC), and the University Professional and Continuing Education Association (UPCEA), we offered two webcasts in August with updates on state authorization. The first webcast focused on state and federal regulations. The second provided background on the State Authorization Reciprocity Agreement (SARA) processes and an update on progress made by states in joining SARA.

Archives of the webcasts, the presenters’ slides, and responses to questions that were not verbally covered in each webcast are freely available for your use.

Seven Key Takeaways
To save you some time, I’ve developed a list of seven key takeaways to help you in your thinking about state authorization. Many of these are not new, but I’m surprised at the ongoing misunderstanding and misinformation on some of these issues. Forgive the repetition, but we keep getting these questions, and repetition reinforces the message.

1. There is No Federal Regulation or Deadline
I often hear, “I know that we have to be in compliance with federal state authorization laws by July 1 of (fill in the year).” That is not correct. There is currently no federal regulation for distance education. The regulation issued in 2010 (600.9 (c)) was “vacated” by the courts and the Department of Education will not enforce it. Earlier this year, a Negotiated Rulemaking Committee failed to reach consensus on a new regulation for distance education.

Therefore, there is no regulation. There is no deadline.

Don’t be confused about another regulation regarding distance education within a state. Enforcement of that one is delayed until July 1, 2015. But, that regulation (600.9 (a) and (b)) does not cover distance education across state lines.

The Department may issue a new regulation regarding distance education, but that effort is currently on “pause.” OLC (then operating as Sloan-C), UPCEA, and WCET jointly suggested to the Department of Education what should (and should not) be included in any new regulation. Such a regulation may be issued for public comment early next year.

2. States Expect You to Comply Now
You are not off the hook. States expect you to follow their regulations BEFORE you conduct any regulated activity in their states. Depending on the state, this could include direct marketing, enrolling a student, or expecting students to participate in a clinical experience. If the student is in a different state from your institution, you are expected to follow that state’s regulations whether there is a federal regulation or not.

3. State Authorization Covers All of an Institution’s Activities in a State
State Authorization regulations are not confined to distance education courses. There are states that regulate direct marketing, having faculty in a state, conducting field experiences (clinicals, practica, etc.) in a state, or just about any other activity that you might be conducting in another state. This is true whether those activities are tied to distance education or not.

4. SARA is Growing
As of the webcast, nine states were fully approved to participate in SARA. Several institutions from those states have already been authorized by their states to participate in SARA and are now eligible for all the agreement’s benefits.

Looking to the future, SARA expects to have 20-24 states in the fold by the end of this calendar year and around 40 by the end of next year. Progress in each state can be tracked on the SARA website. Talk with the SARA Director in your region should you wish to promote it.

5. State Licensure Programs Require Separate Approvals and Are Not Covered by SARA
Academic programs in fields that require state licensure (such as Nursing, Psychology, Social Work, and others) sometimes require extra approvals from the appropriate boards overseeing those professions in each state. The requirements vary widely by state and profession. Students have been restricted from participating in clinical experiences or kept from sitting for the licensure exam if they attended an institution that was not approved in the state.

SARA does not cover the authorization of academic programs in professional licensure fields. Whether it is a SARA member or not, colleges are expected to follow state regulations regarding these programs in each state.

In the Negotiated Rulemaking Committee discussions earlier this year, it was clear that the Department of Education is very interested in this issue. Once a new regulation is released for public comment, it will not be surprising to see expanded requirements for notifying students about an institution’s approval status in each state, for each profession, in which it enrolls students.

6. The Origin of the C-RAC Guidelines Used by SARA
We have heard many “interesting stories” about the origin of the C-RAC Guidelines for distance education programs. Among the theories that we have heard is that they were the product of for-profit colleges, corporations, or national accrediting agencies. I think I heard someone say that they came from the lost island of Atlantis. None of these are true.

The Guidelines are based on Best Practices developed almost two decades ago by the Council of Regional Accrediting Commissions (thus the initials C-RAC) and WCET. Over the years the regional accrediting agencies have updated the Guidelines, which were used to advise accreditation review teams on items they should examine in their campus visits. Most of the regional accrediting agencies still use these guidelines.

7. Should Institutions Pause in Seeking Authorization?

Given the pause in the federal regulation and the growing adoption of SARA, some wonder if it might be good to wait.

First, you should be following state regulations regardless of the federal regulation.

Second (if the first reason does not sway you and you are more focused on self-preservation), it looks like the federal regulation will return. I would not be surprised to see a short deadline for institutions to be in full compliance in each state in which they serve students. If you wait and discover that a state that is important for your enrollments is not part of SARA, you will probably want to seek authorization quickly. Don’t expect the state regulators to do you any favors. The regulators are great people, but they already have a long line of applications ahead of you. You might get trampled in the rush to seek approval.

Thank You to Our Partners
I am very glad that we were able to partner on these webcasts. Within a few days in July, I learned that there were plans for at least three different webcasts with essentially the same content scheduled for the same August timeframe. It made sense for us to partner to produce these two webcasts in which we could share expertise and delve deeper into questions that you may have.

Thank you to Jenny Parks and the group at MHEC for hosting the reciprocity webcast. Thank you to Laurie Hillman of OLC for expertly moderating the regulations webcast. Thank you to Jim Fong of UPCEA for lending his expertise on the survey that we conducted on institutional progress in seeking authorization. Thank you to all our presenters. And thank you to Megan Raymond for organizing WCET’s webcasts while trying to pull together our Annual Meeting.

Your Turn
Do you have additional takeaways that you would like to share or questions that you would like to ask? If so, please share them in the comment field.

Thank you!

Photo of Russ Poulin with baseball bat.


Russell Poulin
Interim Co-Executive Director
WCET – WICHE Cooperative for Educational Technologies
Twitter:  @RussPoulin

WCET’s Annual Meeting includes several sessions on state authorization and regulations.
Join us November 19-21 in Portland, Oregon.

Education 3.0 – Around The Globe

What do Greeks, Vietnamese, Australians, and Americans have in common? The answer is no joke…

I travel a lot. For the past several years, I have accumulated over 200,000 miles per year, going around the world to speak about education reform, effective practices, education technology, learning analytics, and neo-millennial learning, to name a few topics. In fact, by my calculations I have spent over 15,000 hours in front of audiences over the past decade.

But lately, I have been socializing the concept of “Education 3.0.” I don’t know if I can say I coined the term (some other notable bloggers and leaders have been using it too), but in my estimation, if education were able to truly use the most effective, study-driven practices from 1) neuroscience, 2) learning research, and 3) education technology, we could fix much of what is wrong with education at every level. As some of you know, my research center/think tank created a short film (“School of Thought”), shot in Hollywood last year. The premise of the 21-minute film was essentially a question: What could be, if Education 3.0 were actually implemented?

Photo of numerous motorcycles parked at a technical college in Vietnam.

Education 3.0 can be found on two wheels at this technical college in Vietnam.

While I travel, I try very hard to keep my wits about me: I try to notice what education looks and feels like in other places. I not only deliver keynotes and workshops, but I also have lengthy conversations with educators at all levels and of all types. These insiders often give amazing feedback and insights regarding the state of education today. And while I am always humbled and inspired by the simple experience of traveling abroad (if you haven’t done so, add touring the Acropolis, swimming at the beaches in Perth, or taking a motorcycle cab ride in Ho Chi Minh City to your bucket list…), I’m most fascinated by the similarities among the educational issues we all seem to share.

Neuroscience: Apply What We Know about Learning

When in Vietnam, I witnessed something I had seen in other Asian countries. I walked past classrooms (both K-12 and higher ed) where students were asleep on pillows sold specifically for that context. Why? Because in many Asian cultures, learning does not end when the school day is finished. Formal learning may happen over the course of 18 hours, every day. So students buy these specially designed pillows, as well as quality recording devices, and the teachers lecture to the devices while some students sleep, others surf the web, etc. In Vietnam, my consulting focused on the cultural implications of a lack of interactivity between teacher and students, but it was obvious that a paradigm from the USA is shared by many Asian cultures: time = learning.

We know some interesting things about time and our brains. We know that waking up during a REM cycle can impair a person’s cognitive ability to a degree equivalent to being drunk, and that impairment can last for several hours. Yet we still promote and/or require students to attend early classes. We have researchers like John Medina telling us that some learners (and some teachers!) should have all learning completed before noon, while others should not start until noon. Yet we do nothing to even test which students fall into which category, let alone act on it.

And we all know the trouble with the Carnegie Unit. You know, the 110-year-old, industrial-age model that says spending X amount of time on a subject means it has been learned. Silly, right? Yet the rules, regulations, accreditations, and policies persist. Sure, Competency-Based Education is trying to fight this notion, and is making some great headway, but there is a ton of enculturation and baggage to push through.

I recently heard some game manufacturers explain that they had a product that, they guaranteed, would help students learn math faster, retain it longer, and apply it better than any college algebra course. Yet nobody would adopt it. Why? Because the teacher had to give up approximately 40% of their traditional teaching time (classroom time), and instructors wouldn’t do so. We know more about the brain than ever before. Without using neuroscience to inform practice, we’ll never reach Education 3.0.

Photo of the Acropolis in Greece at night.

Education 3.0 had its roots in Greece.

Learning Research: Apply What We Know about Teaching

The Greeks showed me much of what I consider the origins of my cultural heritage. To walk the paths and roads where great philosophers stood, where ideas like democracy were first debated, and where Western architecture began was humbling! But I also heard from educators who are struggling with yet another common American problem: the lecture.

I get the allure of lectures. I do! I go around the world (essentially) lecturing. But keep a few things in mind. I’m lecturing from about 12-15 total hours of material that I’ve developed over 20 years, because I only have one hour in which to make an argument or propose an idea. Yes, there are new pieces every time, but 90% of the lectures are polished and have gotten solid feedback. A GREAT lecture can be amazing, and I try in my keynotes to deliver a great lecture.

But in my classes it’s a different story! I rarely lecture at all anymore. I have those students for 45 hours a term, so I don’t need to cram anything into an hour. And I know that nobody can create 45 amazing lectures per term. In fact, in my polling of about 20,000 teachers and professors, the average number of great lecturers on a campus seems to be 3, and the total number of great lectures any one person delivers seems to be 3.

So, despite years of research confirming that lecturing should be rare and surgical in its use, we still see evidence in polls like the National Survey of Student Engagement suggesting that half of a college student’s experience in every class, every term, is lecture. Despite the work of Dr. Eric Mazur, lecturer of the year at Harvard, who has shown that lecturing doesn’t work, many teachers still engage in the practice. Despite Richard Light’s Harvard Assessment Seminars, which show that students’ best experiences in college are the non-lecture-based classes, we still overuse it to a fault. Without using learning research to inform practice, we’ll never reach Education 3.0.

Photo of an Australian beach.

Education 3.0 can be found in the beaches of Australia.

Education Technology: Apply What We Know about Technologies

I was down under very recently. I spent some time in Melbourne training faculty in the effective use of education technology. The people in Australia are quite remarkable. They are simply the kindest culture of people (collectively) that I’ve experienced in my travels. But that kindness cannot mask the frustration of some faculty at the notion of being asked (forced?) to use ed tech.

In the States, we share this trouble. I have spent over a decade “e-vangelizing” the use of education technology. I believe it is impossible to reach all students in meaningful ways without ed tech. History has shown us that education without technology cannot scale. Yet many educators still balk at the idea of infusing technology into the classroom, and those who do mostly just substitute ed tech for non-technical activities. (Instead of a paper test, they’ll use a computer test, etc.)

But as Puentedura points out nicely in his S.A.M.R. (Substitution, Augmentation, Modification, Redefinition) model of transformative use of ed tech, it is not until we actually Modify and/or Redefine our activities, making use of the power, scalability, and connect-ability of these tools, that we start to see substantive, meaningful changes for our students. Until we use education technology to inform practice, we’ll never reach Education 3.0.

Let’s Strive for Education 3.0

I’m honored to have been asked to share some thoughts with the WCET community. It’s been a few years since I spoke at your conference and I hope to do so again soon! But as we all strive to fix our own corners of education, I really hope we’ll start to let the same important frameworks and research-driven practices inform those fixes. I hope we’ll all start to strive for and use Education 3.0. There is a lot at stake.

Good luck and good teaching my friends.

Dr. Jeff D Borden

Photo of Jeff Borden.

Dr. Jeff Borden (@bordenj), Pearson’s VP of Instruction & Academic Strategy, is a consultant, speaker, professor, comedian, and trainer, all while leading the Center for eLearning (an academic research center and think tank). A university faculty member of 18 years and a past college administrator, Jeff has assisted faculty, administrators, executives, and even politicians in conceptualizing and designing eLearning programs globally. Jeff has testified before the U.S. Congress’ Education Committee, blogs for Wired Innovations, provides global keynote addresses, promotes research findings from the academic think tank he directs, and has been asked to help determine the “Academic Vision” for Pearson Higher Education. To read Jeff’s blog, follow the cMOOC his research group is building, or get more information, check out:

http://pearsonlearningsolutions.com/blog/?s=jeff+borden&x=-1066&y=-133 http://researchnetwork.pearson.com/blog

To see the Short-Film “School of Thought” that Jeff wrote and produced: http://researchnetwork.pearson.com/sot


