New Gainful Employment Regulations Tied to State Authorization

This morning, the U.S. Department of Education released a 945-page document describing its new “Gainful Employment” (GE) regulations. An informal version is available on the Department’s website and the final version will be published in the Federal Register on October 31. Except for the quote from Inside Higher Ed, all quotes are taken from an early release of the final document that I was able to view.

New to these regulations is an explicit connection between Gainful Employment and the state authorization regulations. As a result, the amount of information that will need to be reported and the number of states for which it will need to be reported could dramatically expand over what was published in the original proposed regulation.

Originally, colleges would have had to report in the state in which they were located and also in their local Metropolitan Statistical Area (MSA). The expansion means that institutions will be required to inform students about licensure, certification, and accreditation for each GE program in each state in which the institution must meet the federal state authorization rules.

Currently, there is no federal regulation for state authorization for distance education, so it is not yet enforceable. Should that regulation be reinstated, the Gainful Employment notification regulations will be triggered in each state in which the college needs to be authorized for each program that is covered by Gainful Employment. We have heard that the Department may release proposed language for a new federal state authorization regulation sometime in 2015.

What is Gainful Employment?

Gainful Employment has been a controversial subject for several years. Its purpose is described in an article in today’s Inside Higher Ed:

“Gainful employment applies to vocational programs, including most of the for-profit sector’s offerings. Non-degree programs at community colleges would also need to comply with the rules, which are set to go into effect in July 2015. So would some non-degree programs at four-year nonprofit institutions, both public and private.”

The reasons for the Gainful Employment rules are:

“Specifically, the Department is concerned that a number of GE programs: (1) do not train students in the skills they need to obtain and maintain jobs in the occupation for which the program purports to provide training, (2) provide training for an occupation for which low wages do not justify program costs, and (3) are experiencing a high number of withdrawals or “churn” because relatively large numbers of students enroll but few, or none, complete the program, which can often lead to default.”

As a result, the Department will create a “transparency network” that will:

“…increase the transparency of student outcomes of GE programs so that students, prospective students, and their families have accurate and comparable information to help them make informed decisions about where to invest their time and money in pursuit of a postsecondary degree credential.”

Must Disclose Licensure, Certification, and Accreditation Info for GE Programs

The regulation will require colleges to disclose their licensure, certification, and accreditation status to students in Gainful Employment programs:

“We are…eliminating the proposal for program certifications to cover the States within an MSA, and requiring instead that the institutions provide applicable program certification in any State where the institution is otherwise required to obtain State approval under 34 CFR 600.9.”

As a reminder, §600.9 is the federal state authorization regulation. As stated in the paragraph below, the current federal state authorization regulation is only for states where an institution has a physical location:

“The current State authorization regulations apply to States where an institution has a physical location, and the program certification requirements also apply in those States so these two sets of requirements are aligned.”

But the document goes on to hint that the requirement will also cover distance education if that regulation returns:

“If any changes are made in the future to extend the State authorization requirements in 34 CFR 600.9 to apply in other States, we intend the program certification requirement to remain aligned…We believe that the requirements for the applicable program certifications should also be provided for those States. This will ensure a program and institution that provides the program have the necessary State approvals for purposes of the Title IV, HEA programs. Linking the State certification requirements in §668.414(d)(2) with the State authorization regulations in §600.9 to identify States where institutions must obtain the applicable approvals benefits students and prospective students because the State authorization requirements include additional student protections for students enrolled in the programs for which certifications would be required.”

And a final reason for doing this…

“…institutions may be required to include on a program’s disclosure template whether the program meets the licensure, certification, and accreditation requirements of States…for which the institution has made a determination regarding those requirements so that students who intend to seek employment in those other States can consider this information before enrolling in the program.”

Conclusion

There is an unsettling number of colleges that are not transparent with students about this information. While Gainful Employment has definitely targeted the for-profit sector, there are plenty of institutions in other sectors that have not informed students about whether their programs will meet local requirements.

I’ve only had a few hours to review this regulation. Some people did not think informing students was their job; it will be now.

I would not be surprised to see significant push-back and possibly lawsuits regarding the whole regulation.

As I learn more, I’ll let you know.

Russ

Russell Poulin
Deputy Director, Research & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
wcet.wiche.edu
303-541-0305
Twitter: @wcet_info and @russpoulin


Photo credit: MorgueFile


Community Colleges Adapt to CBE for the Benefit of Their Students

At your cooperative, we’re always happy to share the learning of our members. Sharing with us today is Sally Johnstone, Vice President for Academic Advancement at Western Governors University, who writes about the work WGU has done with community colleges to launch CBE programs and the resources they have produced, which are open to all.

As some of you may be aware, Western Governors University has been working with almost a dozen community colleges across the country for the last two years. Our role was to help them develop their own competency-based education (CBE) degree programs. The staff members and faculty at these colleges worked at an incredible pace to incorporate the basic tenets of CBE into their new programs. They were all faced with many challenges from both within and outside their institutions. They met these challenges creatively and within the context of their own campus cultures.

Among them, the colleges now have over 3,000 students enrolled in CBE programs. The lessons they learned in their journeys from being vaguely aware of CBE to launching their own programs are being collected. This week we launched www.CBEinfo.org. It is a site to help other community colleges learn from the pioneering work of Austin Community College, Bellevue College, Broward College, Columbia Basin College, Edmonds Community College, Ivy Tech at both Fort Wayne and Lafayette, Lone Star College’s University Campus, Sinclair Community College, and Spokane Falls Community College.

One of the most remarkable aspects of all their work is that they integrated CBE into their regular campus operations. That effort has already enabled several of the colleges to expand their CBE activities from a single degree or certificate program to other academic areas. The preliminary evidence indicating improvement in student success has encouraged faculty and staff not involved in the initial projects to pay attention.

Accommodating Campus Cultures

The varieties in campus culture I mention above include both strong and loose system arrangements, plus almost total autonomy. Some campuses had faculty unions; some did not. Some campuses were in states that tend to micromanage academic activities, like requiring A–F grades in each course. These grades become meaningless when students are progressing by demonstrating mastery of the courses and working at different paces to achieve it. The solution in this case was to assign a grade but redefine ‘passing’. At a different campus, the faculty did not have state mandates to assign grades, but their student information system did require them. Their solution was to count an ‘A’ or ‘B’ as passing. If a student earned a ‘C’ or lower, he or she was allowed to continue working toward the degree, but in a more traditional distance learning program.

As you explore the lessons on the site, you will also notice that the organizational structure to support CBE was dependent on the culture at each college. For example, Sinclair Community College already had in place a sophisticated distance learning support center. The faculty on campus were used to working with instructional designers who used common course templates. This was a very good fit for the development of their CBE program. In addition, they had developed technological tracking systems that allowed them to flag at-risk distance learning students. This was adapted to their CBE program and helped their academic coaches know which students might be struggling with their courses.

In contrast, at Austin Community College they did not have a centralized distance learning operation.  Consequently they created a support structure for CBE students and the faculty developing the courses within the academic department in which the program was housed.  It was a good way to get started, but as other academic departments are beginning to develop their own CBE programs, the staff in the Dean’s office is developing a new plan that will have some of the characteristics of the Sinclair center.

Interaction with the Pioneers

Throughout this whole project all the colleges have been learning from one another. They will continue to do so as they modify their initial practices to better serve their students. We now invite you to join in that sharing process. Within www.CBEinfo.org is a Discussion option. The staff and faculty from the partnering colleges have agreed to pay attention to questions and comments in that section and to share what they have learned and are learning.

I look forward to seeing you online.

 

Sally M. Johnstone
Vice President for Academic Advancement
Western Governors University

U-Pace: Building Student Success Through Content Mastery & Proactive Support

The University of Wisconsin-Milwaukee is a 2014 WCET Outstanding Work Award winner for its innovative online instructional approach, U-Pace. Today Diane Reddy and Ray Fleming, co-creators of U-Pace, and Laura Pedrick, executive director of UWM Online, share with us a little about the program.

Putting data into action. Valuable student data is now easily available to instructors through the Learning Management System (LMS) at their college or university. A wealth of information about each student’s work habits and progress is automatically recorded for each course. But while advances in learning management systems and learning analytics are accelerating, a gap remains between instructors’ ability to access student data and their ability to act on that data in an empirically based way that fully benefits their students. The U-Pace instructional approach uses information about learner engagement and performance recorded in the institution’s LMS to maximize students’ learning experience and provide personalized support that helps students succeed.

What does U-Pace mean for instructors? U-Pace is a self-paced, mastery-based online instructional approach that works within any LMS. U-Pace proactively supports learners through instructor-initiated messages called Amplified Assistance.

The mastery-based learning component of U-Pace consists of course content divided into small, manageable units (usually half of a chapter/lesson) that are each associated with a 10-item quiz. Students must demonstrate mastery on each unit quiz by scoring at least 90% before they can advance to new content. Retakes are unlimited (with a required one-hour wait between attempts), and consist of different quizzes that cover the same content.
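To make the mastery rule concrete, here is a minimal sketch of the gating logic described above. The 90% threshold and the one-hour wait come from the paragraph above; the function and variable names are illustrative only and are not part of the U-Pace software.

```python
from datetime import datetime, timedelta

MASTERY_THRESHOLD = 0.90          # students must score at least 90% to advance
RETAKE_WAIT = timedelta(hours=1)  # required wait between quiz attempts

def can_advance(latest_score: float) -> bool:
    """A student moves on to new content only after demonstrating mastery."""
    return latest_score >= MASTERY_THRESHOLD

def can_retake(last_attempt: datetime, now: datetime) -> bool:
    """Retakes are unlimited, but at least one hour must pass between attempts."""
    return now - last_attempt >= RETAKE_WAIT

# Example: a student scores 80% at 9:00 and asks to retake the quiz.
attempt_time = datetime(2014, 11, 3, 9, 0)
print(can_advance(0.80))                                       # False: below 90%
print(can_retake(attempt_time, datetime(2014, 11, 3, 9, 30)))  # False: only 30 minutes have passed
print(can_retake(attempt_time, datetime(2014, 11, 3, 10, 5)))  # True: more than an hour has passed
```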

Amplified Assistance consists of tailored feedback and motivational support that is emailed at least weekly to each student, which may be particularly useful for students who are struggling but reluctant to ask for help. Instructors use information recorded in the LMS to craft Amplified Assistance messages personalized for the student. Valuable information provided by most LMSs includes:

  • When was the last time the student took a quiz or accessed the course material?
  • How many attempts does the student require to demonstrate mastery on a quiz?
  • Are students missing questions related to one particular concept that the instructor can help clarify?
  • Are students missing questions from multiple different content areas, perhaps suggesting they need assistance with their study skills and general approach to learning the material?

How does Amplified Assistance help students? In Amplified Assistance messages, instructors communicate unwavering belief in the student’s ability to succeed, and praise the student for small accomplishments (such as mastering a single quiz). Instructors also reinforce students’ effort and persistence (for example, by praising a student for consistently making attempts at a quiz they have found challenging to master), which may be especially beneficial for students who have the tendency to give up after minor set-backs. By focusing on the student’s positive behavior, instructors are shaping the student’s behavior for success. Through Amplified Assistance messages, U-Pace instructors provide the support students need to meet the high standards created by U-Pace’s mastery-based learning component, a combination which empowers students and fosters their sense of control over learning.

How can instructors create Amplified Assistance messages efficiently? Past U-Pace instructors have field-tested dozens of Amplified Assistance messages and created templates that are freely available on the U-Pace website. The variety of templates offered allows instructors to select a message from an appropriate category (e.g., “Students who are on schedule,” “Students who are behind schedule,” or “Students who have not started”) and then tailor each message based on the individual student’s performance. These templates reduce the time needed for instructors to compose effective messages to address student needs, while still offering instructors the flexibility to craft personalized messages that will be meaningful and build rapport with each student.
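As a rough illustration of how LMS data can drive that template choice, here is a small sketch. The three category names are taken from the paragraph above; the progress fields (units mastered, units expected by now, whether the student has started) are hypothetical stand-ins for whatever the local LMS actually records, not fields of U-Pace or any particular LMS.

```python
def choose_template_category(units_mastered: int,
                             units_expected_by_now: int,
                             has_started: bool) -> str:
    """Pick an Amplified Assistance template category from LMS progress data."""
    if not has_started:
        return "Students who have not started"
    if units_mastered >= units_expected_by_now:
        return "Students who are on schedule"
    return "Students who are behind schedule"

# Example: by week 4 a student is expected to have mastered 4 units but has mastered 2.
category = choose_template_category(units_mastered=2,
                                     units_expected_by_now=4,
                                     has_started=True)
print(category)  # "Students who are behind schedule"
```

The instructor would then tailor the selected template with the specifics of that student’s performance before sending it.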

Does U-Pace produce results? The U-Pace instructional approach has been honored with a 2014 WCET Outstanding Work (WOW) Award. U-Pace instruction produced striking student success in multiple rigorous evaluations, including a large randomized controlled trial (where learners did not self-select their course format) funded by the U.S. Department of Education, Institute of Education Sciences, and a multi-institutional study funded by EDUCAUSE’s Next Generation Learning Challenges program. U-Pace has consistently produced greater learning and greater academic success compared to conventional face-to-face instruction. U-Pace students have scored higher than conventionally taught students on proctored, cumulative exams taken at the end of the course, and again six months later (EDUCAUSE Review Online). A greater percentage of U-Pace students (compared to conventionally taught students) have earned final grades of A or B (EDUCAUSE Learning Initiative Case Study). Furthermore, U-Pace students have shown improvements in self-regulated learning, as evidenced by a decrease in the number of attempts needed to master the quizzes over the course of the semester (Journal of Asynchronous Learning Networks).

How do students react to U-Pace instruction? Survey data has found that, relative to conventionally taught students, U-Pace students perceive greater instructor support, control over their learning, and improvements in time management and study skills over the semester (NGLC Grantee Profile, Journal of Asynchronous Learning Networks). Student reviews have mirrored these findings:

“I am actually retaining the information that I learned in this course. It has helped me out so much in boosting my confidence, and actually showing me, and opening the door, and saying you are just a step further from graduation and you can succeed because you have all these skills in you that you might have never seen before.”

“I go out and try new things, and I know that that sounds really weird, that a course can change someone like that, but you know that it is, I learned the content as well, but it is not even that, it is the fact that I am learning to be myself more, and I am opening up more doors to being motivated and having better time managing skills and being more confident in myself. Outside of school, people have noticed changes in me, that I have more of a glow to me, that I am more outgoing, almost because I have that confidence that I can actually do stuff that I used to think I had no business doing.”

Bottom line. By acting upon student data recorded in the LMS, instructors can have a meaningful impact on students. U-Pace is an empirically-tested instructional approach that has shown great success in utilizing this data to motivate, engage, and improve the learning of students. By integrating the U-Pace instructional method with LMS capabilities, instructors have the opportunity to maximize the value of these tools for guiding students to success.

If you’d like to learn more about U-Pace instruction, we’d be delighted to talk with you at the WCET Annual Meeting at one of our presentations on Thursday, November 20th: Conversation about Competency Based Education (1:30 – 2:30 pm) & U-Pace Instruction: Paving the Way to College Success (3:00 – 4:00 pm)

Diane Reddy, Co-creator of U-Pace Instruction
reddy@uwm.edu

Ray Fleming, Co-creator of U-Pace Instruction
mundo@uwm.edu

Laura Pedrick, Executive Director, UWM Online
lpedrick@uwm.edu

Education Department Urges Colleges to Follow IPEDS Distance Ed Definitions

In an extended conversation, U.S. Department of Education (US ED) IPEDS personnel confirmed which distance education enrollment counts colleges should be reporting to the Department’s IPEDS (Integrated Postsecondary Education Data System) survey.

The Department representatives also wondered why we did not highlight some of the errors made by colleges in reporting their enrollments.   They encouraged colleges to follow the IPEDS definitions and instructions and to call them if they have any questions.

Background

A few weeks ago Phil Hill of the e-Literate blog and I reported on anomalies that we found when colleges reported their distance education enrollments to the U.S. Department of Education.  Earlier this year the Department released data from its Fall Enrollment 2012 IPEDS (Integrated Postsecondary Education Data System) survey.  For the first time in a long time, the survey included counts of distance education students.

Upon publishing our initial IPEDS blogs analyzing distance education enrollments, we heard from some of our readers. They told us about situations in which their reported numbers strayed from what was expected of them.

Undercounts:  Some Colleges Did Not Report All of Their Distance Education Students

There are many definitions of “distance education.”

The first report was from a college that did not report any of the students enrolled in its continuing education, self-support (receiving no state funding) college. We were surprised at this and learned that other colleges also did not report all their distance education students.

In following up, we found that some of those we contacted were unaware that there were self-support entities on some campuses offering for-credit courses leading to full degrees. They do exist. The most common instance is with public colleges that have a College of Continuing Education. Jim Fong, Director of the University Professional and Continuing Education Association’s Center for Research and Consulting, said that his organization has about 370 members. In a survey a couple of years ago, about 91% of the respondents to his inquiry had for-credit offerings. He did not have data on how many are self-support units.

Reasons for the Undercount

We heard different reasons for not reporting these students:

  • Misunderstanding the IPEDS instructions.  The survey instructs colleges to: “Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs).”  Of course, this instruction is intended to reference non-credit, CEU courses, not colleges of continuing education.  It is conceivable that someone may have misread the instruction.
  • They understood, but it was too difficult to do.  For some colleges the data systems for the continuing education colleges are different than those for the main campus.  Merging the data is difficult and would take calculations by hand.
  • They chose not to report the correct enrollments to IPEDS.  A college might decide that it does not wish to report different enrollment numbers to IPEDS than it reported to the state, even though the requirements for each government entity are different.
  • Their data system was not ready.  One college said that their data system simply was not ready to report the correct numbers.

Response from the Department

The Department was offered a chance to provide a written response, but they declined.  In their discussion with me they noted:

  • Most of the reasons given above were not due to the IPEDS definition, but were due to errors or inaction by the colleges.  That’s a fair point.
  • The definition asks colleges to: “Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.”  There is no mention of how the courses are funded or whether the courses are offered by a continuing education college.  They were very clear that students enrolled in for-credit courses in colleges of continuing education should have been included in the counts.  The Department will not issue a clarifying document, but they plan to inform the state IPEDS coordinators when they next meet.
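To make the include/exclude distinction concrete, here is a minimal sketch of the reporting rule as the Department describes it: count any student with at least one for-credit enrollment, no matter which campus unit offers the course, and exclude only students taking exclusively non-credit CEU coursework. The data fields are hypothetical stand-ins for whatever a campus student information system records.

```python
def include_in_ipeds_count(enrollments):
    """Return True if a student belongs in the IPEDS enrollment count.

    Per the instructions quoted above: include any student enrolled in at
    least one for-credit course, no matter which unit (including a
    self-support college of continuing education) offers it; exclude only
    students enrolled exclusively in non-credit CEU coursework.
    """
    return any(course["for_credit"] for course in enrollments)

# A student taking a for-credit course through continuing education counts:
print(include_in_ipeds_count([{"for_credit": True, "unit": "continuing education"}]))   # True
# A student taking only non-credit CEU workshops does not:
print(include_in_ipeds_count([{"for_credit": False, "unit": "continuing education"}]))  # False
```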

Overcounts:  Some Colleges Used the Wrong Definition of “Distance Education”

As we talked to colleges, we learned that some did not use the IPEDS definition of “distance education.”  IPEDS defines a “distance education course” as: “A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.”

Reasons for the Overcount

We heard different reasons for using their own definitions:

  • Misunderstanding the IPEDS instructions.  One institution said that it tried to get a clarification on the definition and was still confused even after contacting the IPEDS call center.
  • They understood, but it was too difficult to do.  The state and/or an accrediting agency may already have its own definition that differs from the IPEDS definition and it would be difficult to create another classification just for IPEDS.  A few examples:
    • The Southern Association of Colleges and Schools Commission on Colleges defines distance education as when “a majority of instruction (interaction between students and instructors and among students) in a course occurs when students and instructors are not in the same place.” By majority, colleges are interpreting that to mean more than 50% of the instruction.
    • The Texas Coordinating Board defines a “Fully Distance Education Course” as having “mandatory face-to-face sessions totaling no more than 15 percent of the instructional time.”  Therefore, at least 85% of the instruction is at a distance.
  • They chose not to report the correct enrollments to IPEDS.  A college might decide that it does not wish to report different enrollment numbers to IPEDS than it reported to the state, even though the requirements are different.

Response from the Department

In their discussion with me they noted:

  • Most of the reasons given above were not due to the IPEDS definition, but were due to errors or inaction by the colleges.  Once again, that’s a fair point.
  • The IPEDS “distance education” definition (cited above) defines a distance education course as being nearly 100% at a distance.  The definition is clearly listed in the IPEDS Glossary.  While they understand that states may have differing reporting requirements, they were very clear that they expect colleges to use this nearly 100% definition in reporting distance education enrollments.  Again, the Department will not issue a clarifying document, but they plan to inform the state IPEDS coordinators when they next meet.

In Conclusion…

Some final thoughts:

  • As shown with the “distance education” definition examples, a college in Texas would need to count a course as distance education when more than 50% of it is at a distance for SACS, at least 85% for its Coordinating Board, and nearly 100% for IPEDS.  You can see the difficulties they face (a small sketch comparing these thresholds follows this list).
  • The Department did not seem to think that the errors from these anomalies were significant.  From the enrollment numbers that were reported to IPEDS, about one in eight students take all of their courses at a distance and about one in four take at least some distance courses.  Those are significant numbers, and I’d like to see both colleges and IPEDS strive to make future counts as accurate as possible.
  • Those colleges waiting for a clarification from the Department will not see anything dramatic. They may want to call them if they have any questions or feel that they might not be reporting enrollments correctly.
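Here is that sketch: a minimal comparison of how a single course would be classified under the three thresholds as interpreted in this post (SACS: a majority of instruction at a distance; the Texas Coordinating Board: at least 85%; IPEDS: instruction delivered exclusively at a distance, with campus visits for orientation or testing not counting against it). The function and labels are illustrative only.

```python
def classify_course(distance_fraction: float) -> dict:
    """Label one course under the three definitions discussed above.

    distance_fraction is the share of instruction delivered at a distance.
    """
    return {
        "SACS distance education course": distance_fraction > 0.50,
        "Texas 'Fully Distance Education Course'": distance_fraction >= 0.85,
        "IPEDS distance education course": distance_fraction >= 1.0,
    }

# A hybrid course with 70% of its instruction online counts for SACS only:
print(classify_course(0.70))
# {'SACS distance education course': True,
#  "Texas 'Fully Distance Education Course'": False,
#  'IPEDS distance education course': False}
```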

Finally, we could ask whether the Department’s definition of “distance education” is a useful one.  On the plus side, it is a clear definition.  On the negative side, the “nearly 100%” definition does not reflect current practice.  But that’s a question for a different day.  And it is a discussion that may need to include accreditors and states.

For now, let’s use the definitions as presented by the Department so that IPEDS has accurate data to inform federal financial aid policies.

Russ

Russell Poulin
Deputy Director, Research & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
wcet.wiche.edu
303-541-0305
Twitter: @wcet_info and @RussPoulin

Join us in Portland, OR for the WCET Annual Meeting – November 19-21.

Photo credits:
Dictionaries – Morgue File
“Disclaimer” definition – Morgue File

Intellipath for MBA Preparation

Today we welcome Colorado Technical University Chief Academic Officer and Provost Connie Johnson and CTU faculty member Sarah Pingrey as they share what their WCET Outstanding Work (WOW) award-winning program has done to improve the student and faculty experience in MBA-preparatory courses.

One of the great pleasures that I have as Chief Academic Officer of Colorado Technical University is to work with a large group of talented faculty who embrace new technology to improve student learning. In 2012, CTU committed to creating a personalized learning experience for our students.

Because CTU has many adult students pursuing an MBA degree who may have an undergraduate degree in another discipline, University Dean of Business Dr. Gail Whitaker worked with the business program committee to integrate adaptive learning into business prerequisites. The purpose was two-fold: to ensure that each student received the content he or she specifically needed as a knowledge base for the MBA program, and to comply with prerequisite requirements prescribed by the Accreditation Council for Business Schools and Programs (ACBSP). This innovative approach, for which CTU received the WCET Outstanding Work award, provided students with a tailored learning path for topics including accounting, statistics, economics, and finance.

Central to the implementation of the adaptive learning technology (Intellipath) were faculty who created relevant assessments and content for each course. Intellipath delivers content to each student based on an assessment and gives instructors the ability to monitor and work with students.

The Faculty Experience

Sarah Pingrey, faculty member at CTU, shares her experience using the platform:

I started working on the adaptive learning platform Intellipath at CTU in the spring of 2012. From development to testing to piloting courses to full implementation, I’ve seen Intellipath grow into an essential learning platform for students. Throughout my teaching career other platforms have tried to woo me, but Intellipath does something different – faculty members are intimately involved in their students’ progress every step of the way.

Teaching in an Intellipath classroom is such a joyful experience. Training is simple, with videos and documents to review and a short quiz to demonstrate competency. Once training is complete and fundamental best practices are understood, the next step is to delve deeper into exactly what Intellipath offers and how to access and use this information. With so many students entering the classroom scared that mathematics will be the end of their college careers, I am able to follow their progress through the course objectives, praise their successes, and help them immediately when they struggle. Intellipath gives me the information I need to do this, and there is no way a student can fall behind without me knowing.

Intellipath contains detailed data for the entire class and for each student, and using this data effectively is crucial. The first thing I want to know is whether a student has started working on their weekly assignments. Intellipath clearly shows which students have started or completed the assignment. Also, it only takes a quick glance to find a student’s current knowledge score on the assignment, the number of objectives completed, the time spent working on the assignment, and the day that the assignment was last accessed. This information is such a treat for an instructor to have. Instructors can now motivate students who have not started the assignment and give praise to those who have.

Students can also easily flag difficult problems. A detailed solution is provided to every problem, but if a student doesn’t understand the solution or has a question, they can easily flag the problem by just pushing a button. The problem, the student’s solution, and the scratch work can be viewed, and I am able to leave feedback for the student. Encouraging students to use this feature is crucial and students are very likely to use it since they are able to ask questions without having to directly email the instructor: pushing a button is easy.

Intellipath has definitely led to more interaction between students and faculty. It has also changed the dynamics of synchronous lectures. Having the lectures apply to all students can be challenging when some students have already started their Intellipath assignments and have very specific questions, while other students don’t have enough foundational knowledge yet to jump into answering these questions. Having organized slides and corresponding class activities, and being able to jump around in them during the lecture, makes teaching more effective for both students and faculty.

The biggest challenge for an online professor can be making that initial connection with students. Students are struggling, but what they are struggling with is unknown until it is too late. Intellipath takes away the mystery of why a student is struggling and makes interactions between the instructor and student easy, fun, and often. I am excited for the future of Intellipath, and most of all, excited that students are truly learning!

If you’re interested in learning more about CTU’s Intellipath for MBA-preparation program, be sure to join us at the WCET Annual Meeting where Connie will share more about the program on Thursday, November 20.

Connie Johnson
Chief Academic Officer & Provost
Colorado Technical University

Sarah Pingrey
Professor
Colorado Technical University

Capella University FlexPath

Capella University is a 2014 WCET Outstanding Work (WOW) Award recipient for the development of FlexPath. Deb Bushway, Vice President of Academic Innovation and Chief Academic Officer, shares with us today the evolution of FlexPath from pilot to celebrating its first birthday as a program. Capella will accept the award at the WCET 26th Annual Meeting.

In May 2012, Capella University’s academic leadership made the decision to delve into the world of direct assessment. The impetus for our pursuit of designing a direct assessment program came from a conversation with leaders at the U.S. Department of Education regarding barriers to innovation. They recommended that we leverage the Department’s Title IV eligibility language, introduced in August 2006 and finalized in November of that same year, to explore our options for introducing direct assessment programs. As the process rolled out, it became clear that we also needed to acquire approval from our regional accreditor, the Higher Learning Commission, before beginning development of direct assessment courses citing this portion of Title IV.

It is important to note that all curricula at Capella University are competency based. These academic programs are still rooted in courses and comply with seat-time (and other) requirements for credit-bearing distance education. This curricular base provided a great starting place for moving to a competency-based, direct assessment delivery model.

The Self-Paced Course Pilot

Initially, we started with a pilot to help us understand the learning experience in a self-paced delivery model. The initiative was simply referred to as the Self-Paced Course Pilot. We based these courses on the infrastructure built to support our credit-bearing programs. From the outset of design and development of these courses, we have taken a very faculty-driven design and delivery approach. The faculty chairs from both programs worked directly with our instructional designers, curriculum specialists, assessment specialists, and project managers as we built a self-paced course experience that would be as rigorous and engaging as our more “traditional” online credit-bearing model. The team worked very closely together for many months with a goal of offering four courses from each of the two programs in early January 2013.

Support Structures for Learners

Concurrently, we built out a support structure for our learners to be able to achieve the necessary competencies to successfully complete each course in a self-paced format. After countless meetings with faculty, advising staff, and a host of other contributors, we arrived at a three-pronged support structure. This learner support structure consists of traditional faculty, tutoring faculty, and coaching faculty. In traditional credit-bearing delivery models, these three support roles are often integrated into the work of a single faculty member. With this new architecture of the faculty role, extensive training was necessary for the individuals who took on these new functions.

Capella faculty chairs chose select faculty members from our traditional programs to pilot this new approach to teaching and learning. These faculty members then led the work to articulate competencies, align the criteria through which to assess competencies, design the authentic assessments, and serve as the evaluators of learners’ demonstration of competency. The tutoring faculty are aligned to particular “clusters of competency” (or courses) and are content area experts. Many are enrolled in our doctoral programs at Capella University and have demonstrated success in the relevant content area.  One reason for this design is that our research indicates adult learners prefer seeking help from peers rather than the traditional faculty. Finally, the coaching faculty team was formed.  This team came from among our traditional advising teams, although this model is significantly more proactive and engaged than the traditional advising model. Each learner is assigned a coaching faculty member who stays with that individual throughout his/her experience at Capella.

Aligning the Technology

The third major component in developing FlexPath was to align Capella’s technology infrastructure to accommodate the needs of a competency-based direct assessment delivery model. There are many unique attributes to FlexPath that our systems simply were not designed to accommodate. These attributes range from not having any course level grades to not being able to transcribe individual competencies for each course. Other technological areas needing alterations included our learning management system, our institution’s learner website (iGuide), and our student administration system. We needed to accommodate these attributes without making permanent changes to our systems, not knowing if the direct assessment delivery model would be accepted widely enough to make permanent changes a worthwhile endeavor. Additionally, the university’s catalog, policy, and external communications needed to reflect all of the changes needed to deliver FlexPath. The entire initiative took dozens of people and thousands of hours’ worth of work before a single learner was enrolled into a FlexPath course!

Celebrating Successes & Expanding to the Future

All in all, FlexPath has been and continues to be an exciting endeavor. We have been honored with several awards for our efforts, most notably, the WOW Award from WCET, along with Blackboard Learning’s Innovation Award and the NUTN Award for Innovation in Distance Education.

FlexPath will soon be celebrating its first birthday. There are now a total of six degree programs offered in the FlexPath model. As the FlexPath program expands, so does our knowledge base for developing high-quality, competency-based direct assessment programs. With that said, we identify opportunities to enhance FlexPath on a daily basis. As more and more universities take on the challenge of implementing this type of program, we look forward to the opportunity to participate in a larger community of practice around direct assessment to further address the needs of 21st century adult learners and employers.

Learn more about the first year of FlexPath on November 20th at 9:45am during #WCET14.

Deborah Bushway

Vice President of Academic Innovation & Chief Academic Officer

Capella University

Learner-Centric Shifts in Education

Katie Blot, senior vice president of education services at Blackboard, shares with us how MyEdu is helping learners succeed through academic planning and out into the marketplace.

When we talk about changes in education, the best place to start is with the learner.  And if there’s one statistic that highlights the shift in who our learners are, it could be this: today less than 15 percent of higher education students in the U.S. are what we would call “traditional” 4-year residential students.  That means that there are roughly 18 million post-traditional students who are over 25, need to work to afford education, attend multiple institutions, have dependents, and are actively working toward job and career goals.

Even “traditional” undergraduates are seeking flexibility and transparency in their educational options (online, self-paced, dual enrollment, accelerated degrees, competency based learning, etc.), such that the old distinctions between “traditional” and “non-traditional” are not really applicable anymore.  Post-traditional students do not fit in clearly defined categories, and they follow extremely varied educational pathways.

Economics Make Students More Focused on Their Goals

These shifts are driven in part by economics.  The need for higher education opportunities is greater than ever before, as increasingly jobs require post-secondary education.  Research repeatedly emphasizes that higher education leads to greater economic attainment, both for individuals and for our country.  But with the recession and the rise in the cost of higher education, degree attainment is extremely difficult for many people to afford.  Potential students are looking for a more economical means to access upward mobility and are accelerating huge consumer-driven changes in higher education.

Students today are much more focused on establishing a clear-cut career path and figuring out which job is going to help them earn a living and eventually pay for or justify the cost of their education.  This leads to a focus on the competencies they’ll gain from their education and how they can demonstrate those competencies to prospective employers and translate them into gainful employment.

What Do College and University Presidents Foresee? It’s Not Always Positive.

Naturally, these shifts in learner behavior put significant pressure on our institutions.  But educational institutions, even in their own estimation, are not adapting quickly enough.  Our recent research co-sponsored by The Chronicle of Higher Education gathered insights from over 350 college and university presidents on the topics of change and innovation in higher education.

  • Nearly 70% of presidents believe that at least a moderate amount of disruption is needed in higher education.
  • 63% of presidents believe that the pace of change is too slow.
  • While 60% of respondents are optimistic about the direction the American higher education system is going, only 30% felt that the higher education system is currently the best in the world. Surprisingly, this drops to 17% when they were asked about the next decade.
  • And less than half (49%) of the presidents believed we are providing good or excellent value to students and their families.

How will we approach systemic educational changes?  We’ve been talking for a while about mobile, personal, and flexible – but that is not enough.  Now we need to add affordable, modular, and accessible.  The learning ecosystems that serve these needs empower students to:

  • See at a glance what’s happening, what they’ve accomplished, and what should be next.
  • Access learning materials anytime, anyplace, including diverse personalized learning resources.
  • Collaborate in learning activities while forming communities of peers and mentors.
  • Easily create learning artifacts and reflect on their own learning.
  • Collect portable evidence of learning.
  • Manage competencies and learning achievements from multiple sources.
  • Develop an online academic and professional identity.

MyEdu:  A New Academic Planning Tool

One example of how Blackboard is helping learners succeed is MyEdu.  MyEdu offers free, learner-centric academic planning tools, including a personal education planner, a flexible scheduler and an assignment tracker.  By consolidating multiple components into a rich, easy-to-use platform, MyEdu helps students plan their degree path, take the right classes, and stay on track to graduate on time.  MyEdu also presents students with up-to-date information about courses, schedules, and professors as well as feedback from other students.

MyEdu helps learners establish their professional profiles.  The traditional resume doesn’t accurately represent the skills and talents of learners, and tools that work for mid-career professionals don’t effectively convey a student’s credentials, capabilities, and evidence of learning.   As students use MyEdu’s academic tools, they build their professional profiles with data about their school, major, courses, graduation date and academic work.  They can also personalize their profiles with credentials, projects, organizations, services, languages, work experiences and competencies, providing valuable information to employers.

And perhaps most importantly, MyEdu connects learners with employers and jobs.  Learners choose how much and what type of information from their profiles to show to employers for potential jobs and internships.  By connecting the learning achievements from their courses and projects with lifelong, cross-institutional learning profiles, these achievements and their related competencies become more powerful for helping learners succeed in their goals of completing their degrees, getting jobs, and advancing their careers. MyEdu empowers learners to recognize, manage and continuously build on their own recognizable achievements beyond any single course, program, degree, or institution.

MyEdu enables employers to connect directly to students whose data indicates that they would be a great match for open positions. They can see not only courses and credits, but also much more granular achievements that reveal who the student is and what specific talents they can bring to the job.  These are very concrete ways in which new technologies and evolving learning ecosystems serve post-traditional learners.

It’s not just about jobs—it’s the new normal of evolving careers and the need for lifelong learning. Today’s learners need to build skills and work toward credentials at any time, at any age and apply them to an ever-changing landscape of personal goals. Today’s evolutions in our learning ecosystems coincide with the rise in learners’ need to have more control over their own learning achievements.

Katie Blot
Senior Vice President
Education Services
Blackboard
katieblot@blackboard.com

Investigation of IPEDS Distance Education Data: System Not Ready for Modern Trends

After billions of dollars spent on administrative computer systems and billions of dollars invested in ed tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online & hybrid education, flexible terms, and the expansion of continuing and extended education. Based on an investigation of the recently released distance education data for IPEDS, the primary national education database maintained by the National Center for Education Statistics (NCES), we have found significant confusion over basic definitions of terms, manual gathering of data outside of the computer systems designed to collect data, and, due to confusion over which students to include in IPEDS data, the systematic non-reporting of large numbers of degree-seeking students.

In Fall 2012, the IPEDS (Integrated Postsecondary Education Data System) data collection for the first time included distance education – primarily for online courses and programs. This data is important for policy makers and institutional enrollment management as well as for the companies serving the higher education market.

We first noticed the discrepancies based on feedback from analyses that we have both published on the e-Literate and WCET blogs. One of the most troubling calls came from a state university representative who said that the school has never reported any students who took their credit-bearing courses through its self-supported, continuing education program.  Since they did not include the enrollments in reporting to the state, they did not report those enrollments to IPEDS. These were credits toward degrees and certificate programs offered by the university and therefore should have been included in IPEDS reporting, based on the following instructions.

Calculating distance education enrollments hit some major snags.

“Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.”

Unfortunately, the instructions call out this confusing exclusion (one example out of four):

“Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs)”

How many schools have interpreted this continuing education exclusion to apply to all continuing education enrollments? To do an initial check, we contacted several campuses in the California State University system and were told that all IPEDS reporting was handled at the system level. With the introduction of the Fall 2012 distance education changes, Cal State re-evaluated whether to change its reporting policy. A system spokesman explained:

“I’ve spoken with our analytic studies staff and they’ve indicated that the standard practice for data reporting has been to share only data for state-supported enrollments. We have not been asked by IPEDS to do otherwise so when we report distance learning data next spring, we plan on once again sharing only state-supported students.”

Within the Cal State system, this means that more than 50,000 students taking for-credit self-support courses will not be reported, and this student group has never been reported.

One of the reasons for the confusion, as well as the significance of this change, is that continuing education units have moved past their roots of offering CEUs and non-credit courses for the general public (hence the name continuing education) and have taken up a new role of offering courses not funded by the state (hence self-support). Since these courses and programs are not state funded, they are not subject to the same oversight and restrictions as state-funded equivalents, such as maximum tuition per credit hour.

This situation allows continuing education units in public schools to become laboratories and innovators in online education. The flip side is that, given the non-state-funded nature of these courses and programs, it appears that schools may not be reporting these for-credit enrollments through IPEDS, whether or not the students were in online courses. However, the new distance education questions may actually trigger changes in overall reporting practices.

Did Other Colleges Also Omit Students from Their IPEDS Report?

Given what was learned from the California State University System, we were interested in learning whether other colleges were having similar problems with reporting distance education enrollments to IPEDS.  WCET conducted a non-scientific canvassing of colleges to get their feedback on what problems they may have encountered.  Twenty-one institutions were selected through a non-scientific process of identifying colleges that reported enrollment figures that seemed incongruous with their size or distance education operations.  See “Appendix A: Methodology” (link added after publishing) for more details.

From early August to mid-September, we sought answers regarding whether the colleges reported all for-credit distance education and online enrollments for Fall 2012.  If they did not, we asked about the size of the undercount and why some enrollments were not reported.

Typically, the response included some back-and-forth between the institutional research and distance education units at each college.  Through these conversations, we quickly realized that we should have asked a question about the U.S. Department of Education’s definition of “distance education.”   Institutions were very unclear about what activities to include or exclude in their counts.  Some used local definitions that varied from the federal expectations.  As a result, we asked that question as often as we could.

The Responses

Twenty institutions provided usable responses. We agreed to keep responses confidential.  Table 1 provides a very high-level summary of the responses to the following two questions:

  • Counts Correct? – Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
  • Problem with “Distance Education” Definition? – Although we did not specifically ask this question, several people volunteered that they had trouble applying the IPEDS definition.

 

Table 1:  Counts for Institutional Responses

Response   Counts Correct?   Problem with “Distance Education” Definition?
Yes              11                         3
Maybe             5                         5
No                4                        12

 

One institution declined to respond.  Given that its website advertises many hundreds of online courses, the distance education counts it reported would lead us to believe that it either: a) under-reported, or b) averages one or two students per online class.  The second scenario seems unlikely.

Of those that assured us that they submitted the correct distance education counts, some of them also reported having used their own definitions or processes for distance education.  This would make their reported counts incomparable to the vast majority of others reporting.

Findings

This analysis found several issues that call into question the usability of the IPEDS distance education enrollment counts and, more broadly and more disturbingly, of IPEDS statistics in general.

There is a large undercount of distance education students

While only a few institutions reported an undercount, one was from the California State University System and another from a large university system in another populous state.  Since the same procedures were used within each system, there are a few hundred thousand students who were not counted in just those two systems.

In California, they have never reported students enrolled in Continuing Education (self-support) units to IPEDS.  A source of the problem may be in the survey instructions.  Respondents are asked to exclude: “Students enrolled exclusively in Continuing Education Units (CEUs).”  The intent of this statement is to exclude those taking only non-credit courses.  It is conceivable that some might misinterpret this to mean excluding those in the campus’s continuing education division. What was supposed to be reported was the number of students taking for-credit courses, regardless of what college or institutional unit was responsible for offering the course.

In the other large system, they do not report out-of-state students as they do not receive funding from the state coffers.

It is unclear what the full numeric scope of the undercount would be across all institutions.  Given that the total number of “students enrolled exclusively in distance education courses” for Fall 2012 was 2,653,426, an undercount of a hundred thousand students just from these two systems would be nearly a 4% error.  That percentage is attention-getting on its own.

The IPEDS methodology does not work for innovative programs…and this will only get worse

Because its courses use as many as 28 start dates, one institutional respondent estimated that there was approximately a 40% undercount in its reported enrollments.  A student completing a full complement of courses in a 15-week period might not be enrolled in all of those courses on the census date.  With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering.
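A small, made-up example may help show why a census-date snapshot undercounts enrollments in programs with many start dates: courses that end before, or begin after, the census date are invisible to the snapshot even though the student completes them in the same term. The dates and enrollment windows below are invented for illustration and are not drawn from any institution's data.

```python
from datetime import date

CENSUS_DATE = date(2012, 10, 15)

# Each tuple is one enrollment's (start, end) window in a term with rolling starts.
enrollments = [
    (date(2012, 9, 4), date(2012, 10, 12)),    # ends before the census date: missed
    (date(2012, 10, 1), date(2012, 11, 9)),    # active on the census date: counted
    (date(2012, 11, 12), date(2012, 12, 21)),  # starts after the census date: missed
]

counted = sum(start <= CENSUS_DATE <= end for start, end in enrollments)
print(f"Counted on the census date: {counted} of {len(enrollments)} enrollments")
# Counted on the census date: 1 of 3 enrollments
```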

The definition of ‘distance education’ is causing confusion

It is impossible to get an accurate count of anything if there is not a clear understanding of what should or should not be included in the count.  The definition of a “distance education course” from the IPEDS Glossary is:

“A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.”

Even with that definition, colleges faced problems with counting ‘blended’ or ‘hybrid’ courses.  What percentage of a course needs to be offered at a distance for it to be counted in the federal report?  Some colleges had their own standard (or one prescribed by the state), and the percentage required for a course to be labeled “distance education” varied greatly.  One reported that it included all courses with more than 50% of the content offered at a distance.

To clarify the federal definition, one college said it called the IPEDS help desk.  Even after escalating the issue to a second-line manager, its staff were still unclear on exactly how to apply the definition.

The Online Learning Consortium is updating its distance education definitions.  Its current work could inform IPEDS on possible definitions, but it probably contains too many categories for such widespread data gathering.

There is a large overcount of distance education students

Because many colleges used their own definitions, there is a massive overcount of distance education, at least relative to the current IPEDS definition.  This raises the question: is the near-100% standard imposed by that definition useful for interpreting activity in this mode of instruction?  Is it the correct standard, given that no one else seems to use it?

In addressing the anomalies, IPEDS reporting becomes burdensome or the problems are ignored

In decentralized institutions, or in institutions with “self-support” units that operate independently from the rest of the campus, data systems are often not connected.  Institutional researchers are also faced with reconciling differing “distance education” definitions at the same time.  One choice is to knit together numbers from incompatible data systems and differing definitions, often by hand.  To their credit, institutional researchers overcome many such obstacles.  Whether through misunderstanding the requirements or lacking the capacity to perform the work, some colleges did not tackle this burdensome task.

Conclusions – We Don’t Know

While these analyses have shed light on the subject, we are still left with the feeling that we don’t know what we don’t know.  That, in brief, is the biggest finding, and it brings to mind former Secretary of Defense Donald Rumsfeld’s famous rambling:

“There are known knowns. These are things we know that we know. We also know there are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are ones we don’t know we don’t know.”

The net effect is not known

Some institutions reported accurately, some overcounted, some undercounted, and some did both at the same time.  What should the actual count be?

We don’t know.

The 2012 numbers are not a credible baseline

The distance education field looked forward to the 2012 Fall Enrollment statistics, with their new distance education numbers, as a welcome baseline for the size and growth of this mode of instruction.  That is not possible, and the problems will persist with the 2013 Fall Enrollment report when those numbers are released.  These problems can be fixed, but it will take work.  When can we get a credible baseline?

We don’t know.

A large number of students have not been included on ANY IPEDS survey, EVER.

A bigger issue for the U.S. Department of Education goes well beyond the laser-focused issue of distance education enrollments.  Our findings indicate that there are hundreds of thousands of students who have never been reported on any IPEDS survey ever conducted.  What is the impact on IPEDS?  What is the impact on the states in which institutions systematically underreported large numbers of students?

We don’t know.

Who is at fault?

Everybody and nobody.  IPEDS is faced with institutional practices that vary greatly and often change from year to year as innovations are introduced.  Institutional researchers are faced with reporting requirements that vary depending on the audience: state oversight agencies, IPEDS, accrediting agencies, external surveys and ranking services, and internal pressures from marketing and public relations staffs.  They do the best they can in a difficult situation.  Meanwhile, we are in an environment in which innovations may no longer fit into classic definitional measurement boxes.

What to expect?

In the end, this expansion of data from NCES through the IPEDS database is, in our opinion, a worthwhile effort, and it should lead to greater use of real data to support policy and market decisions.  However, we recommend the following:

  • The changes from the Fall 2012 to the Fall 2013 reporting periods will include significant changes in methodology at participating institutions. Assuming that definitions improve over time, there will also be changes in reporting methodology at least through Fall 2015. Therefore, we recommend that analysts and policy-makers not put too much credence in year-over-year changes for the first two or three years.
  • The most immediate improvement available is for NCES to clarify and gain broader consensus on the distance education definitions. This process should include working with accrediting agencies, whose own definitions influence school reporting, as well as leading colleges and universities with extensive online experience.

 

NOTE: The research on the IPEDS survey and this blog post are the result of an ongoing partnership between Phil Hill (e-literate blog and co-founder of MindWires Consulting, @PhilonEdTech) and WCET. Throughout this year, we coordinated in analyzing and reporting on the IPEDS Fall Enrollment 2012 distance education enrollment data. As they came to light, we also coordinated in examining the anomalies. We very much appreciate Phil and this partnership.

Much thanks goes to Terri Taylor Straut, who performed the heavy lifting for WCET in surveying institutions and conducting follow-up calls with respondents. Her insightful questions and attention to detail were invaluable. And thank you to Cali Morrison of WCET for her work in helping us get the word out about our findings.
Russ Poulin

 

Russell Poulin
Interim Co-Executive Director
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
Twitter: @RussPoulin

Phil Hill
MindWires / e-Literate
phil@mindwires.com
Twitter: @philonedtech

Calculator Photo Credit: Alvimann

Have Fun in Portland at WCET’s Annual Meeting

Greetings from the great Pacific Northwest!

Portland (aka the City of Roses, Bridgetown, Beervana, P-Town, Rip City, Stumptown, and PDX) welcomes you to WCET’s 26th Annual Meeting. There’s no way to tell whether you’ll experience the usual rain, sun breaks, or gorgeous weather while you’re here. No worries though, there’s plenty to do when you’re not engaged in sessions, rain or shine.

Eat

Portland’s known to be quite a foodie town. Check out one of the many restaurants downtown such as Mother’s Bistro, the Red Star Tavern, Veritable Quandary, Tasty n Alder, or Nel Centro. Take public transportation over to the Pearl and dine at Andina, Piazza Italia, or Isabel. For a more casual dining experience, you may want to try one of the many Portland Food Carts. There are several locations in the downtown area.

Drink

Of course, Portland’s rich with coffee and tea spots. But if you’re interested in experiencing Beervana, choose from over 50 microbrews within the city. Check out the map of brewpubs in the Portland area on the PortlandBrewPubs website. You can select downtown, Northwest Portland, or the Pearl and either walk or ride the MAX or the Streetcar. If you want to make it easy on yourself, you cannot go wrong heading over to the Rogue Distillery and Public House or the Deschutes Brewery. Find the best downtown walking map at portlandmap.com.

Shop

Did you know Oregon does not have a sales tax? Yes, you can shop tax-free in Portland! Choose from the NW 23rd area, the Pearl District, or Downtown, or on the weekend visit Saturday Market (open Saturday or Sunday, rain or shine) to check out up to 250 artisans sharing their work. Stop in and listen to some music while you enjoy a tasty meal from one of the food vendors. Of course, right across the street from the hotel is the Riverplace Marina, Shops, and Restaurants. Stop in for ice cream, sushi, or drinks, or browse the shops just steps from the hotel.

Read

You must have heard about Powell’s City of Books. It takes up an entire city block, 68,000 square feet, and you need a map just to find your way around the store.

 

Relax

Without leaving the city, you can take a little break and visit the beautiful Lan Su Chinese Garden, created by artists from our sister city, Suzhou. It’s not just a garden, but a work of art, complete with a Teahouse. If you come early or stay late, you may want to take advantage of our rich climate and head out of town. Take a drive out on the Historic Columbia River Highway. Stop in to see Multnomah Falls, drive on to Hood River, or go all the way to Mt. Hood. If you want to head west, check out Cannon Beach, Lincoln City, or Pacific City on the Oregon Coast, and stop in for some wine tasting on the way at one of the Willamette Valley wineries.

As you are exploring new ideas and innovations at the annual meeting, be sure to get out and experience Portland. It’s not quite as extreme as Portlandia makes us out to be. Enjoy the city and help us keep Portland weird!

 

Loraine Schmitt
Director of Distance Education
Portland Community College

 

Come join us at the 26th WCET Annual Meeting in Portland, Oregon on November 19 – 21.  Early bird registration deadline is October 18.

Creating a New Kind of OWL: Online Writing Support that Makes a Difference

The Excelsior College Online Writing Lab (OWL) is a 2014 recipient of the WCET Outstanding Work (WOW) Award and will accept the award at the WCET Annual Meeting.  Today, Crystal Sands, Director of the OWL, shares with us the goals, process, and results of the pilot study of the OWL’s use that resulted in the award.

We knew we wanted our Online Writing Lab (OWL) to stand out, to be more student friendly than other online writing resources, and to use some of the latest research about what works in writing instruction and in online education. It turned out to be a monumental task; we had just one year to build it. But, the Excelsior College Online Writing Lab was a labor of love for all of us, and I think it shows.

The Excelsior College OWL is a first-of-its-kind, open-source multimedia online writing lab. While we continue to expand its resources, the OWL already provides comprehensive writing support for students across eight areas:

  1. Locating Information and Writing with Sources takes students through the entire process of writing a research paper.
  2. Grammar Essentials provides students with detailed, student-friendly grammar, punctuation, and common error support.
  3. The Writing Process area helps students develop a strong writing process for papers that do not require research.
  4. The Essay Zone provides comprehensive support for the major rhetorical styles students are likely to encounter in college.
  5. Digital Writing supports students who are writing in digital environments, with coverage for everything from e-mails to digital presentations.
  6. The Avoiding Plagiarism tutorial explains what plagiarism is, what its consequences are, and what students can do to avoid it.
  7. The ESL Writing Online Workshop provides detailed writing process support for ESL writers.
  8. Paper Capers is an original writing process video game, allowing students to practice writing process steps and build a writer’s vocabulary, which is essential for skill transfer. The game also features mini assessments, allowing students to practice lessons from the other areas of the OWL.

Funding for building this kind of comprehensive support was generously provided by the Kresge Foundation. To fit within the funding criteria, our team worked quickly to build the OWL, completing it in just one year. During the second year of the grant, we conducted a national pilot study and based revisions upon feedback from the study.

Creating the OWL

Excelsior College worked with one writing faculty member from each of its five community college partners to develop specific goals for the OWL. We knew we wanted to create an OWL that was different from other online writing labs, one that was student-centered, warm, and engaging. We wanted to make the OWL a fun learning experience, a place that students would come back to even after their writing class was over. We decided to focus on helping students build a strong writing process, as research indicates that students who have a better writing process also produce better writing products. We also needed to help students build a rhetorical foundation and vocabulary, which would assist them in becoming more flexible writers. As part of the creation of the OWL, a writing video game was created to reinforce both the writing process and a rhetorical foundation.
As director, my job was to develop content based on feedback from the committee and try to imagine how the content could be brought to life for students. An instructional designer was critical in that process. Additionally, we worked with an outside vendor, who was committed to our idea to do something creative and fun, on the website build and design. The brainstorming sessions we had were remarkable at times, and it was not long before we were seeing our ideas become reality.

As we neared the end of the first year of the project, we realized we were doing more than we had originally envisioned in the scope of the grant—adding new content, additional areas, and working to add a creative flair to the OWL. The hours were long, but our committed, small team got the OWL ready for the pilot study, which was to begin in the fall of 2013.

The summer of 2013 was an epic time. As the project director, I was responsible for making sure deadlines were met and budgets were kept. Thankfully, we had a wonderful grants office that supported me and our team in this endeavor. My family became involved in the project as well, with my husband providing audio; when testing on the site began, my high-school-aged son joined in the testing, too. The OWL became our dinner-time conversation, and when my toddler asked me, “Mama, what is a thesis statement?” I knew I had probably crossed that work-life balance line. I knew I was not alone in crossing that line, as our team of five went above and beyond that summer. Thankfully, we were just about ready for the pilot study.

It truly was a labor of love. I don’t think we could have built such a resource in such a short time otherwise. Fortunately, our hard work has been rewarded.

The Pilot Study

Thanks to an amazing team effort, the OWL was ready to go, minus a few tweaks, for the pilot study. Our team of teachers from our partner colleges worked together to build the OWL into the curriculum of their writing classes. We ran treatment and control group classes so that we had sets of students working with the same curriculum, with and without the added support of the OWL. The results were positive and gave us a good start on future study of the OWL and how it benefits students.

We found that students in the treatment groups, who used the OWL regularly, scored 6.6 points higher on their final grades than students in the control groups. We also ran a “writing about writing” assessment in order to evaluate how students approached the writing process. In six of the seven categories we assessed, students in the treatment groups exhibited more growth than students in the control groups. In our assessment of the final product essays, something we knew would be tricky, as it is difficult to show improvement in just one semester, we had positive results as well. Students in the treatment groups exhibited more growth in three of the five categories we assessed, showing greater improvement in context and purpose for writing, control of syntax and mechanics, and genre and disciplinary conventions.

Students also completed extensive surveys on the OWL and their attitudes toward writing at the beginning and end of the semester. Students responded well to the OWL, reporting that the content felt relevant and helpful. Students in the treatment groups also reported greater improvements in their general attitudes about writing, with many students indicating they were more likely to write in their spare time after using the OWL.

These results are promising and are in line with the goals of the OWL. While longitudinal study is needed, we have evidence that the Excelsior College OWL provides students with a strong foundation in writing, one that is going to help them transfer the skills they learn in writing classes to other writing situations, which is, of course, the ultimate goal of writing instruction.

The OWL team at Excelsior College feels we have set the stage, through solid writing instruction and extensive multimedia support, to be the kind of free resource that students can rely upon and come back to, throughout their college careers and beyond.
Our team has been honored with the WCET Outstanding Work Award. We are excited that high schools, community colleges, and universities across the country are beginning to use the OWL in their classes and their writing centers. We have been successful in our goals to create a warm, engaging learning environment. The structure of the OWL makes it a valuable resource, whether students need one short lesson on documentation or extensive instruction in writing support. There is something for everyone in the OWL!

Crystal Sands, Director
Online Writing Lab (OWL)
Excelsior College

Email Crystal
