DEAC Announces Peer Review Process for Non-traditional Distance Ed Providers

How do students judge the quality of distance education courses?  The Distance Education Accrediting Commission’s (DEAC) new quality review process helps students to make that evaluation.

Traditional colleges have accreditation. Accrediting agencies traditionally provide peer-review evaluations to (according to the U.S. Department of Education) “ensure that education provided by institutions of higher education meets acceptable levels of quality.” However, there are a growing number of entities offering distance education courses that are not “institutions of higher education.”

Last Friday, DEAC announced its new Approved Quality Curriculum (AQC). This new service uses a standard rubric to peer-review non-institutional “providers” of distance education. StraighterLine and Sophia are the first “providers” to be recognized as achieving AQC status for their online courses.

Logo for DEAC’s Approved Quality Curriculum.

This is a major step toward erasing one of the key distinctions between traditional credit-bearing institutions and their non-credit counterparts.

What is the Approved Quality Curriculum?

The Distance Education Accrediting Commission (formerly the Distance Education and Training Council) is a federally-recognized accrediting agency. Even though they are approved by the Department of Education and the Council on Higher Education Accreditation (CHEA), the “national” accrediting agencies are viewed by some as distant relatives of the regional accrediting agencies. Despite that perception, DEAC’s rigorous review methods are on par with those of its regional cousins.

“In AQC, we are applying the hallmarks of our accreditation process and expertise in distance education in a constructive and meaningful way for all offerings of online learning, whether institutionally based or not,” said Leah Matthews, DEAC’s Executive Director. “When DEAC conducts an accreditation process, it reviews education quality all the way to the course level.  We are implementing the aspects of our course review process, and we call it AQC reviewed curriculum. Earning an AQC status means that StraighterLine and Sophia have met the same quality expectations for their courses that DEAC implements when it reviews courses as part of an accreditation review.”

Photo of Leah Matthews

Leah Matthews, Executive Director, Distance Education Accrediting Commission

Matthews credits the leadership of her board in seeking innovative quality assurance processes. Creating the AQC rubric took nearly a year of long and hard work according to Matthews. The process included input from institutions accredited by other accrediting agencies and from providers who might wish to participate in AQC.

According to Matthews, DEAC is careful to separate AQC from full-fledged accreditation since this process is not approved by the Department of Education or CHEA. Until now, consumers have mostly been left to their own devices to evaluate course quality. AQC helps fill that void.

Perceptions on AQC from a “Provider”

WCET member StraighterLine provides low-cost online courses offered individually or on a subscription basis.  It is one of the two providers to achieve the initial AQC recognition. CEO Burck Smith is a former member of WCET’s Executive Council, so I reached out to him to get his input on AQC.

“DEAC is to be congratulated for taking a leadership role in reviewing curricula and courses from providers outside of traditional higher education,” said Smith. “StraighterLine is honored to have had its courses validated and vetted by acknowledged leaders in quality online education provision.”

Smith said that StraighterLine has sought third-party reviews from other sources and adds: “We have close to 90 colleges who have conducted their own individual evaluations of our courses and entered into articulation agreements with us.”

On his thoughts about accreditation, Smith said “At the highest level, accreditation provides three things 1) access to substantial taxpayer subsidies, 2) third party validation of quality for prospective students and other colleges, and 3) some degree of credit transferability. In exchange, you must offer entire degree programs and be subject to their review standards (which may make sense for a program, but don’t for stand-alone courses). For us, we believed that our price point was low enough that we could forego taxpayer subsidies, that articulation agreements with colleges would create credit transferability and that, eventually, enough 3rd party ‘Good Housekeeping Seals of Approval’ would equal or exceed accreditation as a stamp of quality.”

The Clamor Around Non-Credit Providers

You can probably blame it on the MOOC phenomenon, but there has been increased interest by policy-makers in the question of how non-credit educational opportunities fit into the educational puzzle.  The emergence of “providers” such as StraighterLine and Sophia has also raised serious questions about alternative paths of learning.

Some policy examples:

  • ACE was quick to get on the bandwagon of reviewing MOOCs for credit-worthiness, which is not accreditation but does provide course-level review that is used by many colleges in recognizing externally-offered learning for credit purposes.
  • Senator Lee (R-UT) recently reintroduced his proposed legislation (S649), which would allow each state to “establish an alternative accreditation system for the purpose of establishing institutions that provide postsecondary education and postsecondary education courses or programs as eligible for funding under title IV…” Oy! If you thought state authorization was bad, this could be a real headache.
  • The Council on Higher Education Accreditation has been trying to figure out a path to quality assurance for alternative providers and announced a Quality Platform Pilot. In partnership with the Presidents’ Forum, they published the paper “Quality Assurance and Alternative Higher Education: A Policy Perspective” last year.
  • Inside Higher Ed recently published a nice summary of many of the issues around the review of alternative providers and competency-based education.

The reason that this issue is bubbling to the top is the expected reauthorization of the Higher Education Act of 1965, which covers the many regulations regarding higher education and federal financial aid.  The big question is whether these alternative providers will be allowed into Title IV or other federal pots of funding. If so, what constraints will be placed on them, and do they really want to enter the world of regulatory oversight?

“We’re thrilled to be working with DEAC; it’s reflective of the semi-feverish activity going on to ‘own’ alternative credit review. CHEA announced a process,” said Burck Smith. “ACE has its Gates-funded Alt-Credit project. Lamar Alexander issued a paper about it. David Bergeron and the New York billionaire are talking about it. The Department of Education is certainly interested. So, there’s a lot of activity which reflects the increasing inability to ignore the price differential of accredited v. unaccredited providers offering substantially similar offerings online.”

Leah Matthews says, “I see an increasing level of interest among higher education research groups and federal policy makers in alternative ways of assuring the quality of online learning that is offered outside of the domain of traditional institutions. I think it is incredibly important to continually visualize the future of online learning and think about how higher education is steadily transitioning to a more ‘learner-driven’ model.”  Aside from the nascent “direct assessment” models, she feels there has been little innovation in federal financial aid since the Higher Education Act of 1965.  We are still tied to the credit hour.

What does this mean for the future?

“Unaccredited providers are challenging a whole host of assumptions about higher education,” according to Smith. Chief among those is that “anyone can offer a college course. There’s nothing special about a course offered by a college or by some other provider — so long as the content, rigor and assessments are consistent (which they aren’t across all of higher ed). Therefore, online, accreditation is an arbitrary distinction that confers competitive advantages on one group of providers — usually much higher priced – than on the other equivalent set. As more students opt for lower priced offerings, colleges will have to acknowledge the validity of non-college courses.”

In looking to the future, Matthews envisions a financial aid model that benchmarks on student achievement. As for the new “providers,” she sees the regulatory curiosity about them continuing to grow. If they start receiving any federal money, they will face a new level of scrutiny that they have not previously encountered.

In my opinion, we are facing an evolving landscape that will no longer look like the traditional higher education structure we have known. Institutional leaders who do not recognize that the Earth is shifting beneath their feet do so at their own long-term peril.  For the regulatory landscape, there needs to be a balance between consumer protection and innovation. Unfortunately, finding that Goldilocks “just right” point is tricky business.

Congratulations to DEAC; we look forward to following the progress of AQC and other efforts regarding non-institutional “providers.”

 

Photo of Russ Poulin with baseball bat

Ready for baseball and regulatory season.

Russ

Russell Poulin
Director, Policy & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
Twitter:  wcet_info and RussPoulin

NANSLO Web-based Labs: Real Equipment, Real Data, Real People!

Many pressures on higher education make the services of the North American Network of Science Labs Online (NANSLO) essential.  These include growing enrollments in online courses while campus enrollments decline, the need to provide flexibility for nontraditional students, a growing demand for digital literacy, and a national emphasis on growing the number of STEM students, among others.

What is NANSLO?
The North American Network of Science Labs Online (NANSLO) is an alliance of cutting-edge science laboratories based at higher education institutions that provide students enrolled in science courses with opportunities to conduct their lab experiments on state-of-the-art science equipment using robotics, software, and video over the Internet. From any computer, students can log into one of the labs’ web interfaces and manipulate the controls on a microscope or other scientific equipment, participate in conversation with lab partners, ask for assistance from a knowledgeable lab technician in real time, and collect data and images for their science assignments. Through NANSLO, institutions can expand student access to STEM pathways, as they make it possible for students who cannot come to campus to complete lab activities online.

How Does it Work?
Through the NANSLO control panel, students:

  1. CONNECT by computer to control the movement of high-quality scientific equipment used to perform the assigned lab activities. “Very convenient and easy to use.” (Great Falls College Montana State University student, MT)
  2. DISCOVER AND EXPLORE. Students have the opportunity to think like a scientist – observing, interpreting, predicting, classifying, modeling, communicating, and drawing conclusions based on the data collected.  Students watch their progress in real time on a webcam that displays what they are doing, and they gather real data to analyze, make predictions, and draw conclusions. “It is much easier using an online microscope than even one by hand … You can zoom/capture images, and do things you cannot otherwise do unless the microscope is hooked up to a computer …” (Community College of Aurora student, CO)
  3. COLLABORATE with lab partners nearby or around the world as each takes turns using the equipment. And, students get immediate help from the NANSLO lab techs when needed. “Great! Makes me feel like I’m in an actual lab!” (Lamar Community College student, CO)
  4. ENGAGE in active learning. As they work through the activities, students are actively performing their experiment, using their settings, experiencing their own observations, and collecting their own data. “This type of unique ‘hands on’ experience taps into parts of the brain that even person-person labs miss.” (Kenai Peninsula College, University of Alaska Anchorage student, AK)

Photographic rendering of a student accessing real lab equipment via the internet using her own computer and talking to the tech via telephone.

What is NANSLO’s Discipline Focus?
NANSLO has developed 27 lab activities in biology, chemistry, physics, and allied health that are openly licensed with Creative Commons BY attribution.  These labs are easily integrated into course curriculum and include background information, pre-lab questions, and lab activities that can be performed online.  Or, these labs can easily be customized to meet individual course requirements.  A list of all NANSLO lab activities with access to the Word version is available for easy download.  Over time, NANSLO expects to expand its collection within these initial disciplines and add others.

Photo of sheets of papers with instructions on conducting a lab experiment.

Who Participates in NANSLO?
The NANSLO network’s hub is based at the Western Interstate Commission for Higher Education (WICHE) in Boulder, CO. WICHE serves as the public’s primary resource for information about NANSLO, coordinates communication among the network’s lab partners, provides the centralized scheduling system, and oversees selected contracting and financial transaction services for the partners.  Currently, the network includes three laboratories: the Colorado Community College System (CCCS) laboratory is located at Red Rocks Community College in Arvada, Colorado; the Great Falls College Montana State University (GFCMSU) laboratory is located in Great Falls, Montana; and the North Island College (NIC) laboratory is located in Courtenay, British Columbia.

Three photos of different lab locations and the equipment at each location.

The CCCS laboratory began serving CCConline students in 2012 and continues to do so along with serving students from three Consortium for Healthcare Education Online (CHEO) community colleges in Colorado through a U.S. Department of Labor Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant that established that initiative and provided funding for expanding NANSLO’s work.  The NIC laboratory is primarily NANSLO’s development site but delivers lab activities to one CHEO community college in Alaska.  NANSLO’s GFCMSU laboratory, established with funds from the TAACCCT grant, opened in late fall 2014 and serves four CHEO community colleges in Montana, South Dakota, and Wyoming.

Photo of screen used to schedule time on the lab equipment.

How Do Students Access an Assigned Lab?
First, an institution or its faculty uses the centralized scheduling system to reserve a block of time for students to perform assigned NANSLO lab activities.  When a reservation is made, a unique URL and PIN are generated.  Faculty give their students this information, and students use it to access the scheduling system and select a day and time within the reserved block to complete the lab activity.  Students also use that link, or the student dashboard customized with information unique to them, to access their NANSLO lab activities.
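
To make that workflow concrete, here is a minimal, purely illustrative sketch of the reservation step described above. The class, field, and URL names are hypothetical and are not drawn from NANSLO’s actual scheduling system; the sketch only shows the general idea of reserving a block of time and generating a unique link and PIN for students.

```python
# Illustrative sketch only: the names below are hypothetical and do not describe
# NANSLO's actual scheduling system.
import secrets
import string
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Reservation:
    """A faculty-reserved block of lab time for an assigned lab activity."""
    lab: str            # which remote laboratory hosts the equipment
    activity: str       # the assigned lab activity
    starts: datetime    # beginning of the reserved block
    ends: datetime      # end of the reserved block
    url: str            # unique link faculty pass along to students
    pin: str            # PIN students enter along with the link


def create_reservation(lab: str, activity: str,
                       starts: datetime, ends: datetime) -> Reservation:
    """Reserve a block of time and generate the unique URL and PIN for students."""
    token = secrets.token_urlsafe(8)  # hard-to-guess identifier for the reservation
    pin = "".join(secrets.choice(string.digits) for _ in range(6))
    return Reservation(lab, activity, starts, ends,
                       url=f"https://scheduler.example.org/reserve/{token}", pin=pin)


# Faculty reserve a week-long block; students later use the URL and PIN to pick
# a day and time within that block.
reservation = create_reservation("Remote microscopy lab", "Cell structure lab",
                                 datetime(2015, 4, 6, 8, 0),
                                 datetime(2015, 4, 10, 22, 0))
print(reservation.url, reservation.pin)
```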

Once connected to the NANSLO lab, students have access to real scientific lab equipment that lets them:

  1. Collect real time data and capture it electronically;
  2. Experiment with different settings to see the impact on that data;
  3. Capture high-resolution images to include in lab reports;
  4. Engage in authentic instrumental experimentation;
  5. Collaborate with classmates and lab personnel through teleconferencing; and
  6. Gain skills that can be used in the future whether in science or other careers that require data collection, analysis and decision making, while gaining digital experience in remote web-based control.

Faculty view of the progress reports for students accessing and using the lab equipment.

Faculty can find out how their students are performing by logging into a dashboard customized for their use and selecting the lab activity of interest.  The report generated assists them in determining student participation in the assigned lab activity.

Conclusion
Over the past year, students have been surveyed about their experience with NANSLO.  Here are some comments:

“It was amazing to be able to sit in comfort of my own home and be able to work with this equipment.  I believe this is the way of the future just because it is so fitting for people to be able to do this.  Everyone in this world has busy lives and this makes it that much easier on people.”  Kodiak College University of Alaska Anchorage Student, AK

“What a great resource, it was way easier to use and much cheaper than buying the microscope
for my class”  Flathead Valley Community College Student, MT

“This was great and I can see enormous potential for online students.  Thank you for the opportunity!”  Community College of Aurora, CO

“I think this was a great experience.  I think it comes pretty close to the real thing, which is
great.  :)” Arapahoe Community College, CO

In sum, NANSLO can provide real value to institutions by:

  1. Delivering high-quality lab activities to students online in science courses requiring a lab component;
  2. Providing students with access to real lab equipment allowing them to collect real data and think like real scientists;
  3. Reducing the need to expend limited dollars on expanding labs on campus;
  4. Providing students with an experience that can be applied to many professions; and
  5. Addressing the need for flexibility in accessing and performing lab activities.

NANSLO’s future plans call for implementing a fee-for-service model so that other institutions can purchase NANSLO services for their students. It is also looking at cloud computing as a possible approach to expand capacity exponentially while further enhancing efficiencies.

If you would like more information about NANSLO go to www.wiche.edu/nanslo or contact Sue Schmidt at sschmidt@wiche.edu or 303-541-0220.

Photo of Sue Schmidt

Sue Schmidt
NANSLO/CHEO Project Coordinator
WICHE
sschmidt@wiche.edu

 

Photo of Pat Shea

Pat Shea
Director, Academic Leadership Initiatives
WICHE
pshea@wiche.edu

This product was funded by a grant awarded by the U.S. Department of Labor’s Employment and Training Administration.  The product was created by the grantee and does not necessarily reflect the official position of the U.S. Department of Labor.  The Department of Labor makes no guarantees, warranties, or assurances of any kind, express or implied, with respect to such information, including any information on linked sites and including, but not limited to, accuracy of the information or its completeness, timeliness, usefulness, adequacy, continued availability, or ownership.

Opening the Doors to Education: Ensuring Accessibility in Open Textbooks

Accessibility is a concern across all of technology-enhanced education.  At BCcampus, they wanted to help content creators incorporate accessible practices into their open materials. Amanda Coolidge, Open Education manager at BCcampus, shares with us how they crafted the BC Open Textbook Accessibility Toolkit and how you can take advantage of this great resource.

The BC Open Textbook Accessibility Toolkit is a collaboration between BCcampus and the Centre for Accessible Post-secondary Education Resources BC (CAPER-BC). BCcampus is a publicly funded organization that uses information technology to connect the expertise, programs, and resources of all B.C. post-secondary institutions under a collaborative service delivery framework. BCcampus is the lead organization for the BC Open Textbook project. CAPER-BC provides accessible learning and teaching materials to students and instructors who cannot use conventional print because of disabilities.

BC Campus Open Textbook Accessibility Toolkit

At the end of 2014, BCcampus and CAPER-BC contacted the Disability Services Coordinators at partner institutions to find student participants with print disabilities to evaluate British Columbia (B.C.) open textbooks. The participants were asked to evaluate five chapters from the open textbook library and provide their evaluation of each chapter. They were asked to access the materials in their preferred layout, such as web format, ePub, or PDF, and then provide written feedback about their experience. This model worked well, but we decided to take this further and invited the participants to join us for a half-day focus group, where we had the opportunity to understand why they responded to the questions – or didn’t respond – and to see how they were reading and accessing the materials on their different devices. Based on student feedback, we were able to create a series of tasks to make our own textbooks more accessible.

Working with Tara Robertson from CAPER-BC and Sue Doner, an instructional designer from Camosun College who has been working with universal design and creating accessibility guides for instructors, we have developed an accessibility toolkit.  The goal of the BC Open Textbook Accessibility Toolkit is to provide the resources needed by each content creator, instructional designer, educational technologist, librarian, administrator, and teaching assistant to create a truly open and accessible textbook — one that is free and accessible for all students.

We developed the toolkit in Pressbooks, and as a result it is available in a variety of downloadable formats (PDF, EPUB, MOBI, XHTML, and WordPress XML). Within the toolkit you will find information on how to make content accessible, with specifics on:

  • Images/Charts/Graphs/Maps
  • Weblinks
  • Tables
  • Multimedia
  • Formulas (math and scientific)
  • Font size
  • Colour contrast

Photo of the BC Open Textbook Accessibility Toolkit team working.

As you work through the content of the BC Open Textbook Accessibility Toolkit, you will find that the suggestions provided are intended for the non-technical user. If you are looking for more technical descriptions of how to make your work accessible, we suggest you review the WCAG (Web Content Accessibility Guidelines).
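
As one concrete illustration of the kind of check the toolkit encourages for images, the short sketch below (our own example, not code from the toolkit) uses Python’s standard html.parser to flag img tags that have no alt attribute in an exported HTML or XHTML chapter. Intentionally decorative images would still carry an empty alt attribute, so only images missing the attribute entirely are reported.

```python
# Illustrative sketch: flag <img> tags with no alt attribute in an (X)HTML chapter.
# This is our own example of a simple accessibility check, not part of the toolkit.
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Collects the sources of images that lack an alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no alt attribute at all

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<unknown source>"))


checker = AltTextChecker()
checker.feed('<p><img src="figure1.png"> <img src="map.png" alt="Map of B.C."></p>')
print(checker.missing)  # ['figure1.png'] -- this image still needs a text description
```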

Based on some of the accessibility testing we conducted, our technical team at BCcampus is creating a new accessibility plug-in for Pressbooks. The plug-in will give users the option to modify the user interface and the exports’ interface, font size, and line spacing for accessibility purposes.

If you have comments, suggestions, or questions about the BC Open Textbook Accessibility Toolkit, we would love to hear from you. Please contact us at opentext@bccampus.ca.
Photo of Amanda Coolidge

Amanda Coolidge

Manager, Open Education

BCcampus 

Twitter: @acoolidge

What Can Happen If I Don’t Follow State Authorization Regulations?

Those of us in WCET’s State Authorization Network (SAN) and in the State Authorization Reciprocity Agreement (NC-SARA) leadership often get asked the questions:

  • “Does anyone really enforce ‘state authorization’ in the U.S.?”
  • “Why don’t I read in the higher education news about colleges being fined for ‘state authorization’ violations?”

States are watching, regulating, and taking action.  Before delving into that, let’s start with an example.

University Required to Issue Refund in “Unauthorized” State

Photo of a rat trap loaded for action with a piece of cheese on the trigger.

There could be consequences for violating state laws.

This recent action is told in the colorful verbiage of Alan Contreras.  In a previous life Alan was the state regulator for Oregon.  He now works for WCET SAN and NC-SARA.

A public university in the Midwest recently discovered what can happen in a relatively straightforward situation in which the institution failed to get authorization. 

 Institution X enrolled an online student in a state in which state authorization is required, but did not get that authorization.  The student paid only a small part of the tuition due for a course and withdrew late in the term, past the standard withdrawal date. The student therefore owed the institution some money, and the institution requested payment.  The student did not pay and the debt was assigned to a collection agency, as is the institution’s standard practice. When the student heard from the collection agency, the student wrote to the institution and said “this debt is uncollectable because you were operating illegally in my state.” 

At this point the institution’s new compliance officer was served this rat sandwich by the affected department with a request for advice, and called WCET-SAN staff to discuss the issue. In our view, the institution was on shaky ground, so we advised the compliance officer to bring in the institution’s legal staff. 

When the institution contacted the appropriate agency in the student’s state of residence, state officials there indicated that not only was the debt uncollectable, but all of the student’s tuition that had been collected had to be refunded in order to avoid formal action against the provider, which could have included a ban on operating in that state, as the institution had no authority to charge tuition to a resident of that state.

Yes, states really do take action; it just rarely appears in the headlines.  Would you prefer mayonnaise or mustard with that sandwich?

All I can say is “Ouch”!

Photo of a line-up of  motorcycle police and their motorcycles.

There are not state authorization motorcycle brigades, but there are those enforcing the laws.

What Usually Happens When an Institution is Found to be Out-of-Compliance?

Many states will contact the institution to inquire about an alleged infraction.  They usually don’t start with a “cease and desist” letter, but I have heard of some colleges suddenly being surprised with such a notification.  It’s a doubly unhappy day if the letter goes directly to your college’s president, as the president and public relations folks at your institution understand the damage of bad publicity, even if there is no official action and word simply leaks out that laws were not followed.

The goal of the state regulatory agency is to protect students in the state by getting the institution into compliance.  Often one of two paths is followed:

  • The institution decides to come into compliance and a process for doing so is negotiated between the institution and the state.
  • The institution decides to leave the state and an exit process is negotiated between the institution and the state.

Either way, at the end of the day the institution is following state law by either obtaining the correct approval(s) or leaving the state.  Fines are a threat, but are rarely part of the final equation as both sides seek an amicable solution.

What About Student Actions?

Ah yes.  Students can take matters into their own hands and sue the institution.  This seems to happen most often in cases involving professional licensure.  As you can imagine, a student will be quite upset after spending several years studying with you only to learn that their degree will be worthless in the state in which they are residing.  A few years ago, I wrote about two such students who suddenly found that their program was unrecognized by the Board of Nursing in their home state.  In one case, the institution sought authorization and made things right.  In the other case, it was only under the threat of lawsuit from the student that the institution took action.

I heard a sad case last year in which a student moved to another state to attend face-to-face courses after being told that she could perform all of her practical fieldwork back in her home state.  She quit her job and moved to the institution’s state. Once there, the non-profit institution told her that a new federal law had been passed and that she could not conduct her fieldwork in her home state.  This was completely untrue.  She quit the program before starting it, received a minor refund, and moved back to her home state.  She did not wish to go after the institution and asked me not to reveal the identity of the institution.  That college dodged a bullet.  I was mad at the blatant disrespect and dishonesty demonstrated by this college.

Why Don’t I Read About These Actions?

Since everyone is seeking the best possible outcome, there is no reason for the regulator to embarrass the institution.  The two parties often reach an understanding not to publicize the details of agreements resulting from findings of non-compliance.

This is all very boring to the press.  No conflict.  No story.

In Conclusion…

It often takes more time and effort to fix an unpleasant situation than to just do it right the first time.

The bigger compliance hammer will come if the Department of Education decides to bring back the state authorization regulation for distance education.  The signs point to their planning to do so later this year.   We will keep you updated on that process.

Meanwhile, states still expect you to be in compliance now.

Finally, life is easier if you treat students properly.

Thank you,

Russ

P.S.  WCET’s State Authorization Network membership is now open. To keep updated on state authorization issues, come join us!

Russell Poulin
Director, Policy & Analysis
WCET (the WICHE Cooperative for Educational Technologies)
rpoulin@wiche.edu
Twitter:  @russpoulin

If you like our work, join WCET!

 

Photo credits:
Rat trap:  Morgue File.
Police:  Morgue File.

eCampusAlberta Quality Rubric for Online Courses

eCampusAlberta has been a long-time supporter of WCET, and we are delighted to hear today from Tricia Donovan, Executive Director of eCampusAlberta.  Thank you, Tricia, for sharing with us eCA’s work in developing the Quality eToolkit.

eCampusAlberta logo

eCampusAlberta is a consortium of 26 publicly funded post-secondary institutions in Alberta.

From its inception, the consortium was established to increase access to high quality online learning offerings across the province of Alberta. Against a backdrop of strong institutional autonomy, this senior executive-led initiative required unprecedented institutional collaboration. Participating institutions sought ways to inform their efforts to collaborate, and the development of the eCA Quality Standards became a mechanism to facilitate trust and inspire shared practices across member institutions.

Creating the Original ‘Quality Suite’

Work on quality began shortly after the consortium was formed. In 2005, members developed a position paper on Quality Standards which was primarily adapted from the widely heralded Quality on the Line: Benchmarks for Success in Internet-Based Distance Education, published by The Institute for Higher Education Policy and Blackboard, May 2000.

The Quality Standards were used as a requirement for access to funding for online curricula and eventually became adopted or adapted for design at many of the member institutions. In 2007, further development of the standards resulted in a suite of resources to support the development of online curricula through what was known as the eCampusAlberta Quality Suite. The suite included a set of  Essential Quality Standards, a Quality eRubric, a Curricula Assessment Scorecard and a Course Review and Report Process.

In 2012, It Was Time to Refurbish our ‘Quality Suite’ or Move On

In 2012, eCampusAlberta evaluated the eCA Quality Suite in terms of effectiveness, usage, and alignment with current academic literature on quality of online courses. Emphasis was placed on determining if revisions of our Quality Suite were needed or if we would benefit from adopting an existing external set of quality standards.

An intensive review of more than 40 online course quality standards was conducted, as well as a literature review of quality standards, quality assurance principles, processes in higher education and online learning in many jurisdictions. We also surveyed our eCA Quality Suite users and eCA course reviewers and consulted with experts in online course development.

We found a strong correlation between other established quality standards and the eCampusAlberta Quality Suite, as well as high levels of awareness, usage, and satisfaction with the Quality Suite. We also identified areas in which additions and improvements were needed.

We determined that our standards were robust and held up well against others in the field and we worked on process revisions and updates to support the use and application of the standards.

The links below provide access to our Quality Suite of materials, all of which are licensed under Creative Commons, and we encourage WCET members to use our work. The OERu has adopted our Quality Standards globally, and they are currently being reviewed for applicability to open educational resources.

eCampusAlberta Quality 2.0

In July 2014, we launched the eCampusAlberta Quality Suite 2.0.  The suite comprises the Essential Quality Standards, the eLearning Rubric, the Quality eToolkit, an online review and database system, and many quality-related professional development resources and opportunities. We also introduced three levels of achievement on the quality standards, expanding beyond those standards deemed Essential to include those identified as Excellent and Exemplary. This initiative was viewed as a means of recognizing the work of those faculty, designers, and institutions that were exceeding the minimum or Essential standards and meeting more robust design standards. It is interesting to note that, where we had experienced challenges in meeting the Essential Standards across the consortium for years, many institutions are now inspired to showcase their work at all three levels of the standards.

Essential Quality Standards and Course Review Process

The Essential Quality Standards include a new rubric approach which offers criteria for the Essential (the required level for courses to meet), Excellent, and Exemplary levels for each standard. It also includes examples of effective practice and academic references. These are licensed under a Creative Commons Attribution 3.0 License.

The Course Review Process guides the provision of review services to members. eCampusAlberta reviewers use the new online course review system to evaluate member courses prior to these being added to the eCA Course Catalogue for delivery. Review reports are provided to the institutions. The review process defines expectations of timelines and deliverables for all participants. As reviews are conducted, the Quality Team identifies examples of quality course design that institutions are asked to share as examples of effective practice. Some institutions have decided to create templates based on the Essential Quality Standards for their online courses, and these are reviewed upon request.

Quality eLearning Rubric

The eCampusAlberta eLearning Rubric supports the creation of quality online curriculum. Developers may use the free online rubric to self-assess their courses using the Essential Quality Standards. Their reviews can be saved and reports can be downloaded. All works are licensed under a Creative Commons Attribution 3.0 License.

ECA Quality Toolkit buttons

Quality eToolkit

The Quality eToolkit hosts all of our quality resources and supports our quality services. Components include the Essential Quality Standards and their accompanying resources, the online Quality eLearning Rubric, Frequently Asked Questions, background information, information on the Course Review Process, an annually updated literature review, examples of effective course design provided by member institutions, and more.

Quality Professional Development

Quality-related professional development is an ongoing activity that includes webinars, orientation sessions, articles, conference sessions, workshops, etc. The Quality Manager provides consultation with institutions as requested and customized sessions are also delivered as needed. Reviewer training is also ongoing. Webinars are also available at no cost to participating institutions and a new “Quality Corner” has been recently added to our eZine to promote dialogue and awareness of quality standards and approaches.

After one year of implementation, and after hearing mixed reviews anecdotally, we initiated an evaluation process in fall 2014. We held a workshop with a group of instructional designers, directors, faculty, and staff involved in producing quality online courses at our member institutions.  Keen to learn how we could enhance our processes and the experience for members, we openly solicited feedback on the standards, the rubric, and our course review process. We learned that there was overall acceptance and adoption of our standards and that many of our members had created templates to support their curricula development. We also heard about challenges arising from implementation, primarily around a lack of consistency in our reviews, tone, and approach. Collectively, we then adapted our course review approach to be more open and constructive and to provide opportunities for designers to meet with our Quality Manager to discuss a course review. An online survey will complete our evaluation of Quality 2.0, and we will publish the results in spring 2015.

Please contact me if you have any questions or would like to discuss our standards with our Quality Team: Tricia.donovan@ecampusalberta.ca

Tricia Donovan
Executive Director
eCampusAlberta

IPEDS Fall 2013: Less than Half of Fully Distant Students Come from Other States

This is the third in a series of three blog posts examining the 2013 IPEDS Fall Enrollment survey and its counts of distance education enrollments.  The first blog focused on the 2013 survey statistics and the second compared the growth between 2012 and 2013.  The survey also asked if institutions enrolled students from other states or other countries. We examine those statistics in this blog post, especially in relation to state authorization regulations.

Russ Poulin

With the addition of Fall 2013 IPEDS Distance Education (DE) data, for the first time we have year-to-year enrollment data to compare.  In theory, this should allow us to see whether institutions are improving their IPEDS reporting for the location of students. However, a WCET and e-Literate study published in September 2014 revealed numerous challenges with IPEDS DE reporting (institutions using different definitions, some not reporting all students) and cautioned against putting much stock in year-to-year comparisons with the first few years of IPEDS data since data reliability issues remain.

The IPEDS DE survey requests information about the location of the student only for those students who are “exclusively in DE courses”. This is a relatively small proportion of the total population of online students, representing 13% of all enrollments in both 2012 (see last year’s blog post) and 2013.  All of our analysis on the geographic location of the students is limited to this population of students enrolled “exclusively in DE courses”.

Table 1: Fall 2013 Out-of-State Distance Education Enrollments by Sub-sector

(All figures refer to students enrolled exclusively in distance education courses.)

Sector | # IHEs in Sector | Enrollments Exclusively in DE Courses | % of Total Enrollment | % Enrolled in Same State as IHE | % Enrolled Not in Same State as IHE | % Enrolled Located Outside U.S. | % Enrolled in U.S., State Unknown | % Student Location Unknown/Not Reported
Public, 4-year or above | 691 | 620,386 | 8% | 75% | 21% | 1% | 1% | 2%
Private, Non-Profit, 4-year and above | 1,587 | 519,588 | 13% | 40% | 56% | 2% | 0% | 2%
Private, For-Profit, 4-year and above | 761 | 837,795 | 62% | 13% | 78% | 2% | 1% | 6%
Public, 2-year | 934 | 661,494 | 10% | 91% | 5% | 0% | 3% | 1%
Private, non-profit, 2-year | 88 | 802 | 2% | 50% | 50% | 0% | 0% | 0%
Private, for-profit, 2-year | 663 | 19,138 | 6% | 30% | 70% | 0% | 0% | 0%
Totals | 4,724 | 2,659,203 | 13% | 52% | 42% | 2% | 1% | 3%


 Public Students Primarily Attend Distance Education Institutions in Their Home State

For-profit Students Primarily Attend Distance Education Institutions in Other States

As we would predict, a large proportion of students enrolled “exclusively in DE courses” are served by public institutions in their state of residence: 75% for 4-year and 91% for 2-year public schools. These numbers are consistent with last year’s data. We see a decline in in-state enrollment in the Private, Non-Profit 2-year IHEs, from 61% in-state in 2012 to just 50% in-state in 2013. Private For-Profit 4-year IHEs report a consistent 13% of online enrollments as in-state, while their 2-year counterparts report a small increase from 26% in 2012 to 30% in 2013.

About One in Five Public Four-year Students are Located in Other States

Public institutions report that about 20% of their fully online students are from another state for 4-year IHEs and about 5% for 2-year schools. This data appears to be consistent between Fall 2012 and Fall 2013 reporting. Private Non-Profit 4-year institutions reported 54% of their fully online students were in another state in 2012 and 56% in 2013; again, this appears to be consistent data.  There was growth in the reported number of fully online enrollments at public 4-year institutions between 2012 (576,615) and 2013 (620,386). So while the proportion of enrollments reported as “not in the same state as the IHE” remained relatively constant (20% in 2012 and 21% in 2013), there were approximately 15,000 more out-of-state enrollments reported in 2013.
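
The arithmetic behind that estimate is straightforward; the snippet below simply recomputes it from the figures quoted in the paragraph above (public 4-year enrollment counts and rounded out-of-state proportions), so the roughly 15,000 figure is approximate.

```python
# Recompute the out-of-state estimate from the figures in the paragraph above
# (public 4-year institutions, students enrolled exclusively in DE courses).
exclusively_de = {2012: 576_615, 2013: 620_386}   # reported enrollments
out_of_state_share = {2012: 0.20, 2013: 0.21}     # rounded proportions reported

out_of_state = {year: exclusively_de[year] * out_of_state_share[year]
                for year in (2012, 2013)}
print(round(out_of_state[2013] - out_of_state[2012]))  # about 15,000 more in 2013
```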

For Profit Institutions are Enrolling a Smaller Percentage of Out-of-State Students

We see some variation in reporting in the For-Profit sector. Private, For-Profit 4-year institutions reported that 84% of their “exclusively DE” enrollments were from another state in Fall 2012 and that has dropped to 78% in Fall 2013. In the same timeframe, Private, For-Profit 2-year IHEs reported a small decline from 73% in Fall 2012 to 70% in Fall 2013. These decreases are consistent with the overall decline in student enrollment in the Private, For-Profit sector reported for the period.

Colleges are Serving Very Few International Students via Distance Education

The proportion of students reported as located outside the U.S. remains small, no more than 2% of exclusively DE enrollments in any sector. This reporting does not appear to have changed significantly between Fall 2012 and Fall 2013. Based on Fall 2013 data, international students who are served exclusively online number about 26,600. The institutions that serve these students need to have clear policies, procedures, and services in place for them. Institutions also need to be aware of any laws of the students’ country of residence and need to comply with those laws.

Public Colleges Improving in Reporting the Student’s Location

Overall, we do not see improvement in the proportion of online students for whom their location is reported as “in US, state unknown” or “unknown/not reported”. In both 2012 and 2013, the total in that category remains about 4% of all “exclusively DE” enrollments.

We see improvement in the Public sector reporting of the location of students, as those in the “unknown” categories fell from 4% in 2012 to 2% in 2013. There is a concerning increase in the proportion reported as “unknown/not reported” by Private, For-Profit 4-year schools. The Fall 2012 data showed just 1% “in US, state unknown” and 1% “unknown/not reported,” while the 2013 data showed 1% for “in US, state unknown” but 6% for “unknown/not reported.”  While this is a large increase, it is largely attributable to a reporting anomaly at a single institution.

It should be noted that the reporting methodology inconsistencies of even one large IHE can affect the percentage results since the numbers are relatively small. For example, in Fall 2013 a large for-profit institution reported 50,704 of its 52,131 exclusively online enrollments as “unknown/not reported.” This single campus accounts for 59% of the “unknowns” reported for the sector. Curiously, the same campus reported only 298 students as “unknown/not reported” last year, and another 500 as “in US, state unknown.” As long as IHEs are not consistent with their reporting methodology from year to year, the data on student location remain unreliable. This is unfortunate, since there is much attention on state authorization and compliance. We need good data to understand the magnitude of the state authorization challenge.

Institutions That Don’t Know Student Location Tend to Remain Uninformed

Among the concerns with IPEDS DE data reporting is the fact that some institutions simply do not report the geographic location of their students. Rather, they report the entire population of “exclusively DE” enrollments as “location of student unknown/not reported.” The Fall 2013 data reveal that some institutions continue to report their entire population of “exclusively DE” enrollments as location “unknown/not reported,” while others report a large proportion of their online enrollments that way. The IHEs reporting this way are a combination of For-Profits and Publics.

An analysis of the IHEs reporting the largest numbers of enrollments as “unknown/not reported” in Fall 2013 shows that some of these schools used the same reporting methodology in Fall 2012, but some actually seemed to have had a better handle on where their students were in 2012 than they did in 2013. This, again, points out the importance of using accurate and consistent methodology from year to year within an IHE, as well as the need to improve the methodology of the schools that continue to not report where their online students are physically located.

Bottom line:  If an institution does not know where its students are located, it cannot possibly be in compliance with state authorization regulations.

The Number of Institutions Serving Students Outside Their State and Outside the U.S. is Growing

Institutions in each sector reported the number of distance education students they enrolled either outside their state or outside the U.S. According to the data (see Table 2), 47% of the IHEs reporting (2,249 of 4,724) indicate that they have students outside their state. This represents an increase of 100 institutions reporting students outside their state borders from last year. In addition, 21% of institutions (1,008 of 4,724) indicate that they have students located outside the U.S., as compared to 940 institutions reporting having students from outside the U.S. exclusively in DE courses in Fall 2012. These survey questions are not mutually exclusive; many IHEs have both out-of-state and out-of-country students.

With about one-fifth of all institutions responding to the IPEDS survey reporting that they enrolled students exclusively at a distance outside the U.S., it is curious that those students represent only 2% of the reported distance education enrollments. This is certainly possible, as many institutions probably enroll only a handful of students in other countries, but it would be interesting to examine whether this might be another source of reporting anomalies.

Table 2: Number of Institutions reporting students outside their state and outside the U.S.

Sector | # of Institutions with at Least One Student Exclusively DE and OUTSIDE Their State | # of Institutions with at Least One Student Exclusively DE and OUTSIDE the U.S.
Public, 4-year or above | 519 | 312
Private, Non-Profit, 4-year and above | 742 | 365
Private, For-Profit, 4-year and above | 249 | 82
Public, 2-year | 674 | 244
Private, non-profit, 2-year | 8 | 0
Private, for-profit, 2-year | 57 | 5
Total | 2,249 | 1,008

 

More Institutions are Subject to State Authorization Regulations in Other States

Many states have regulations requiring institutions that enroll students in their state to seek state authorization, request an exemption, or perform some other act authorizing the institution to operate in the state. Based on the number of institutions reporting students outside their state and/or outside the U.S., there are a great number of institutions that should be seeking state authorization in the states where they serve students. Institutions that do not know where their students are located simply cannot be in compliance with state authorization regulations. Since the IPEDS DE survey does not ask about the specific states in which students reside, it is impossible to determine from the IPEDS data how many institutions may be out of compliance.

As seen in our survey of institutions regarding their progress on obtaining authorizations, only 25% of responding institutions possess all the approvals that they need.  Since that survey, the Department of Defense has required all institutions offering Tuition Assistance for military personnel to have obtained the authorizations in each state.  The Department of Education has said that they plan to bring back their own federal requirements for students collecting Title IV funds in other states.  The State Authorization Reciprocity Agreement will require reporting of student enrollments at a distance in each SARA state.  Given all these pressures, institutions should be incentivized to provide clearer data on the locations of students in future years.

The analysis of the reported location of students in relation to the institution that serves them continues to show that some institutions are not reporting geographic location data. Either they are not collecting the data at the time of enrollment or they do not have mechanisms to accurately report student location. These institutions need to know the location of their students to ensure that they are in compliance with the state regulations in each state that is home to their students.

This concludes the current series examining Fall 2013 Distance Education IPEDS data. WCET will continue to monitor the data as it is released. We are hopeful that as institutions all adopt the IPEDS definition of distance education and as they gain the systems and experience to pull and report accurate student data, the industry will soon have a true benchmark.

 

Terri Taylor Straut
Ascension Consulting

 

 

 

 

Russ Poulin
Director, Policy and Analysis
WCET

 

If you like our work, join WCET!

IPEDS Fall 2013: Distance Education Data Reveals More Than Overall Flat Growth

The first blog post in this series analyzed the Fall 2013 distance education enrollment data (as released by the U.S. Department of Education’s IPEDS survey) sector by sector. The post revealed that one in four students took at least one distance education course and that public institutions continue to be the largest provider of DE courses.

While the 2012 IPEDS survey provides a shaky base for comparison, let’s see what we can learn.  Again, thank you to Terri Taylor Straut for the deep dive into these data.

Russ Poulin

For the first time, we have two sets of IPEDS Distance Education (DE) data. The industry is anxious to have benchmark data that will help us determine trends in the marketplace.

While we still have grave reservations over the accuracy of the IPEDS data reported (some colleges used different definitions, some did not report all enrollments), the data set is the most comprehensive and bears examination.  The biggest caveat is this:  Given the errors that we found in colleges reporting to IPEDS, the Fall 2012 distance education reported enrollments create an unstable base for comparisons.

A new concern for the comparison of data by educational sector between Fall 2012 and Fall 2013 is the fact that many institutions have changed their IPEDS identification numbers and how they report. For example, in Fall 2012, all DE enrollments for Arizona State University were reported through a single IPEDS number; in Fall 2013, five separate campuses reported DE enrollments. Anyone who wants to dig into the data for themselves needs to be aware of these changes in the data set.

With these caveats, let’s take a look at the data regarding overall higher education enrollments as that will provide context on trends in distance education enrollments.

Overall Higher Education Enrollments are Down 4%

There is Significant (17%) Retrenchment in For-Profit College Enrollment

It is important to understand the enrollment trends by sector. The table below shows that, while all sectors report declining enrollment between 2012 and 2013, the retrenchment in the For-Profit sector, with a decline of 17%, is significantly greater than in the other two sectors.

Table 1: Total Enrollment by Sector 2012 to 2013

Sector | 2013 | 2012 | % Change
Public | 14,745,558 | 15,085,798 | -2%
Private, Non-Profit | 3,974,004 | 4,118,688 | -4%
Private, For-Profit | 1,656,227 | 1,932,857 | -17%
Total | 20,375,789 | 21,137,343 | -4%

 

Overall Distance Education Enrollment Growth Was Relatively Flat

Distance Education Grew Despite Declines in Overall Higher Ed Enrollments

Comparisons of overall IPEDS enrollment data for Fall 2012 and Fall 2013 reveal no growth, or a slight decline (-1.3%), in total student enrollments. Analysis of the IPEDS categories “exclusively enrolled in DE courses” and “enrolled in some but not all DE courses” reveals increases in enrollment of less than 1%. Given the known problems with IPEDS data validity and the fact that these changes are so small, we conclude that, for the first time since these data have been reported, there is no growth in higher education DE enrollments. Phil Hill’s blog on this topic provides additional detailed analysis.

Table 2: Annual DE Enrollment Comparisons

Year | All Students | Enrolled Exclusively in DE Courses | Some But Not All DE Courses | At Least One DE Course | At Least One DE Course as % of Total Enrollment
2012 | 20,642,819 | 2,653,426 | 2,842,609 | 5,496,035 | 26.6%
2013 | 20,375,789 | 2,659,203 | 2,862,991 | 5,522,194 | 27.1%
% Change | -1.3% | 0.2% | 0.7% | 0.5% | 1.8%


 

Fully Distance Ed Growth Varied Greatly by Sector:

  • Non-profit Enrollments Grew by Almost 9%.
  • Public College Enrollments Grew by a Modest 2.4%.
  • For-profit Enrollments Fell by 8.3%.

Combining all the enrollments from each sector provides an incomplete picture of the distance education trends.  For many reasons, the for-profit sector is in a major retrenchment mode.  Its fall enrollment decline (71,154 fewer fully distance students than last year) is dragging down the overall totals.

Most notable is the growth in fully online enrollments in non-profit institutions.  Their fully distance enrollments grew by almost 47,000, which is almost 9% more than last year.  This is especially notable in the context of overall enrollment in non-profit colleges, which fell by 4% from year to year.  Non-profit institutions have fewer students overall, and many of those they are retaining are going online.

The change in fully distance enrollments in Public colleges is relatively flat at 2.4%, but occurred at the same time that Public colleges lost 2% of their enrollment.

Table 3: Percentage Change (2012 to 2013) in Distance Education Enrollment for Students Enrolled Exclusive at a Distance by Sector

Sector | Exclusively DE Courses 2013 | Exclusively DE Courses 2012 | % Change
Public | 1,281,880 | 1,251,398 | 2.4%
Private, Non-Profit | 520,390 | 473,941 | 8.9%
Private, For-Profit | 856,933 | 928,087 | -8.3%
Total | 2,659,203 | 2,653,426 | 0.2%


 

Enrollment in Some But Not All DE Courses is Also Mixed by Sector:

  • Publics’ Enrollment in Some DE Courses Grew About 2%
  • Non-Profits Lost Nearly 6% of Enrollment in Some DE Courses
  • For-Profits Lost Nearly 13% of Enrollment in Some DE Courses

While the combined reporting shows a very small gain in enrollment of 0.7%, the differences among the sectors in this category are significant. Publics maintained growth of about 2% between 2012 and 2013. Non-Profits report a loss of about 6% in this category of DE enrollments. The For-Profits report the greatest loss at nearly 13%. Once again, sector analysis reveals significantly different results.

Table 4: Percentage Change (2012 to 2013) in Distance Education Enrollment for Students Taking Some (But Not All) Distance Courses by Sector

Sector | Some But Not All DE Courses 2013 | Some But Not All DE Courses 2012 | % Change
Public | 2,462,362 | 2,409,595 | 2.1%
Private, Non-Profit | 275,020 | 291,144 | -5.9%
Private, For-Profit | 125,609 | 141,870 | -12.9%
Total | 2,862,991 | 2,842,609 | 0.7%

 

Combined DE Data Reveal Enrollment Growth in Publics (2.2%) and Non-Profits (3.8%)

Only For-Profits Lost DE Enrollment Overall, at Nearly -9%

The final category of DE enrollments, “at least one DE course,” is calculated by adding the IPEDS categories “exclusively DE courses” and “some but not all DE courses.” The combined data reveal that only the For-Profits actually lost DE enrollments overall between 2012 and 2013; both Public and Non-Profit overall DE enrollments grew during the period. This is another case where the sector analysis shows a very different picture than the combined data alone. While there is no significant growth in DE enrollments overall, it is clear that the growth in Publics and Non-Profits is negated by the decline in For-Profit enrollment.
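As a concrete illustration of that calculation, here is a minimal Python sketch (ours, not from the original report) that derives the sector totals in Table 5 by summing the figures from Tables 3 and 4; the variable names are illustrative:

    # Derive "at least one DE course" per sector by adding the two IPEDS
    # categories; figures are copied from Tables 3 and 4 as (2013, 2012) pairs.
    exclusively_de = {
        "Public":              (1_281_880, 1_251_398),
        "Private, Non-Profit": (520_390, 473_941),
        "Private, For-Profit": (856_933, 928_087),
    }
    some_but_not_all_de = {
        "Public":              (2_462_362, 2_409_595),
        "Private, Non-Profit": (275_020, 291_144),
        "Private, For-Profit": (125_609, 141_870),
    }

    for sector, (excl_2013, excl_2012) in exclusively_de.items():
        some_2013, some_2012 = some_but_not_all_de[sector]
        print(f"{sector}: 2013 = {excl_2013 + some_2013:,}, 2012 = {excl_2012 + some_2012:,}")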

Table 5: Percentage Change (2012 to 2013) in Students Taking At Least One Distance Education Course by Sector

Sector | At Least One DE Course 2013 | At Least One DE Course 2012 | % Change
Public | 3,744,242 | 3,660,993 | 2.2%
Private, Non-Profit | 795,410 | 765,085 | 3.8%
Private, For-Profit | 982,542 | 1,069,957 | -8.9%
Total | 5,522,194 | 5,496,035 | 0.5%

 

Large Players’ Enrollment Changes Impact Total Enrollment

Closer evaluation of the largest institutions in the distance education market, measured by student enrollment, reveals interesting results. There are large year-to-year variations in the enrollment in “exclusively DE courses” reported by some of the largest players in the marketplace. With large sums of money being spent on advertising by public and private schools alike, the winners and losers are a bit surprising.

Table 6: Institutions with Large Variation in DE Enrollments between 2012 and 2013

Institution | Total 2013 Enrollment | 2013 Exclusively DE Enrollment | 2012 Exclusively DE Enrollment | % Change
For-Profit Institutions
Grand Canyon University | 55,497 | 45,496 | 28,417 | 38%
Kaplan University-Davenport Campus | 52,407 | 52,131 | 44,678 | 14%
University of Phoenix-Online Campus | 212,044 | 207,060 | 250,600 | -21%
Ashford University | 58,104 | 57,235 | 76,722 | -34%
Non-Profit Institutions
Southern New Hampshire University | 28,389 | 20,701 | 10,679 | 94%
Western Governors University | 46,733 | 46,733 | 41,369 | 19%
Liberty University | 77,338 | 64,503 | 61,786 | 4%
Public Institution
Arizona State University—All Campuses | 76,728 | 9,958 | 7,444 | 34%

 

University of Phoenix-Online Campus maintains the number one ranking, by far, with 2013 total enrollment of 212,044 students; 207,060 are reported as fully online. However, they lost 21% of their enrollment in just one year. The other big loser in reported enrollments is Ashford University with a 34% enrollment decline, from 76,722 online enrollments in 2012 down to 57,235 enrollments in 2013.

Big enrollment winners during the period from 2012 to 2013 include the Private, Non-Profits. Western Governors University reported 19% online enrollment growth, with 2013 enrollments at 46,733, and Liberty University reported a 4% gain, growing to 64,503 fully DE enrollments in 2013.

Among the largest schools, Grand Canyon University, a Private, For-Profit, shows the highest growth for the period, with a 38% enrollment increase from 28,417 online enrollments to 45,496 in a single year. Kaplan University, also a Private, For-Profit, reported a 14% enrollment increase and 52,131 exclusively DE students.

Does Advertising Work?

The public reporting and the Fall 2013 IPEDS data support the conclusion that some of the large For-Profit players are retrenching and have experienced significant enrollment declines. We were curious to see whether the increased advertising we are seeing from some of the smaller institutions is resulting in enrollment gains.

Arizona State University, a Public IHE, has been actively seeking students from outside Arizona through radio advertising, billboards, and other channels. According to its IPEDS reporting, ASU increased its online enrollments by 34% between 2012 and 2013, reaching nearly 10,000 exclusively DE enrollments in 2013. A Private, Non-Profit institution, Southern New Hampshire University, reported a whopping 94% growth in fully online enrollments over the same period, nearly doubling those enrollments from 10,679 to 20,701. While far from a scientific sampling, it does appear that targeted, effective advertising can have a significant impact on enrollment growth. What is not publicly known is the cost of gaining those increases in enrollment.
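For those who want to check the math, a quick sketch (ours) that recomputes those two growth rates from the reported exclusively-DE counts:

    # Growth rates for the two advertising-heavy institutions named above,
    # computed as (2013 - 2012) / 2012 from the reported exclusively-DE counts.
    asu_2012, asu_2013 = 7_444, 9_958      # Arizona State University
    snhu_2012, snhu_2013 = 10_679, 20_701  # Southern New Hampshire University

    for name, old, new in [("ASU", asu_2012, asu_2013), ("SNHU", snhu_2012, snhu_2013)]:
        print(f"{name}: {(new - old) / old:.0%} growth")  # ASU: 34%, SNHU: 94%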

The next blog in the series will focus on the data reported about where students who are enrolled exclusively in DE courses are physically located in relation to the geographic location of the IHE serving them. This is important information for institutional leaders responsible for compliance, as well as for regulators and policy-makers who are working to create a more manageable state authorization process.

 

Note: We erroneously listed Southern New Hampshire University as a public institution in the second-to-last paragraph.  That has been corrected. (03/16/15)


 

Terri Taylor Straut
Ascension Consulting

 

 

 

 

Russ Poulin
Director, Policy and Analysis
WCET

IPEDS Fall 2013: Higher Ed Sectors Vary Greatly in Distance Ed Enrollments

This is the first of a series of posts providing insight on the distance education enrollment data released by the U.S. Department of Education earlier this year.  Terri Taylor Straut crunched the data for WCET.  Terri has a deep knowledge of the distance education world, and we appreciate her thorough analysis. As noted below, Phil Hill of the e-Literate blog has published several articles on this issue.  We communicated with him to make sure that our work was complementary to his and followed the same methodology.  The most recent version of the Babson Survey Research Group study of enrollments in online learning switched to using the IPEDS data this year.  Phil Hill and I provided an update in that publication on some of the shortcomings we found last year in how institutions answered the IPEDS survey.

This series will examine three different aspects of the IPEDS distance education enrollments: 1) overall distance education enrollments in 2013, 2) comparisons of the 2012 and 2013 data, and 3) the data on serving students in other states and the implications for state authorization.

We will use a headline format (highlighting the major findings) and offer some observations that differ slightly from those made by Phil Hill or Babson.

Thank you,
Russ Poulin, WCET

The U.S. Department of Education’s National Center for Education Statistics (NCES) released the second year of Integrated Postsecondary Education Data System (IPEDS) data reporting Distance Education (DE) course enrollment for the Fall of 2013. The fact that we now have two consecutive years of DE data may lead us to believe that we can finally begin tracking enrollment trends. However, as reported in the Frontiers blog last fall, there are still issues with the methodology many institutions use when reporting their DE enrollments to IPEDS. Since the Fall 2013 data was reported to IPEDS before WCET and e-Literate published anything about the challenges with the Fall 2012 data, we believe that there are also problems with the validity of the Fall 2013 data. Please see the Methodology section at the end of this blog for more detail.

Previous reports by the Babson Survey Research Group and e-Literate have led with headlines about the overall flat growth in distance education. Our WCET analysis looks more closely at the sector data, which reveals more nuanced results and supports the continued dominance of public institutions in the offering of distance education courses to students nationwide.

A statistical note: the IPEDS Fall Enrollment survey is a point-in-time snapshot of Fall 2013.  The survey asks institutions to separate students into three categories:

  • Enrolled Exclusively in Distance Education Courses.
  • Enrolled in Some (But Not All) Distance Education Courses.
  • Not Enrolled in Any Distance Education Courses.

We focus primarily on the first two categories and add them together to create a new category: students who took at least one distance education course.

Publics Enrolled about 3/4 of all Students

Public institutions of higher education continue to educate nearly three-fourths (72%) of all enrolled students, regardless of mode of delivery. For all the discussion about the impact of the many new players and the profit motive, the majority of learners still attend public institutions. Non-Profits account for 20% of all enrollments and, despite all the hype, For-Profits enrolled just 8% of U.S. learners in 2013.

Table 1: Sectors as Percentage of Total Higher Education Enrollments

2013 Sector Data | Total Enrollments | Sector Enrollment as % of Total Enrollment
Public | 14,745,558 | 72%
Private, Non-Profit | 3,974,004 | 20%
Private, For-Profit | 1,656,227 | 8%
Totals | 20,375,789 | 100%

 



The Percentage of Exclusively Distance Students Varies Significantly by Sector:

  • One-in-Eleven Public Students
  • One-in-Eight Non-profit Students
  • Over Half of For-profit Students

While For-profit institutions account for just 8% of total enrollments, they have grown their exclusively online programs to the point that those programs now account for 52% of all For-profit enrollments. For-Profits launched exclusively online programs earlier than the other sectors, and it is this accomplishment that receives much of the attention in the media and in the distance education marketplace. It is also important to note that exclusively DE enrollments account for about 13% of all student enrollments in Fall 2013; in other words, roughly one out of eight students is studying exclusively online.

Table 2: Exclusively DE Enrollment as Percentage of Total Sector Enrollment

Sector | Total Sector Enrollment | Students Enrolled Exclusively in DE Courses | Exclusively DE as % of Total Sector Enrollment
Public | 14,745,558 | 1,281,880 | 9%
Private, Non-Profit | 3,974,004 | 520,390 | 13%
Private, For-Profit | 1,656,227 | 856,933 | 52%
Totals | 20,375,789 | 2,659,203 | 13%
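A minimal sketch (ours, not part of the original analysis) showing how the sector percentages in Table 2 can be reproduced; the variable names are illustrative:

    # Exclusively-DE enrollment as a share of each sector's total enrollment,
    # using the Fall 2013 figures from Table 2.
    sectors = {
        # sector: (total enrollment, exclusively DE enrollment)
        "Public":              (14_745_558, 1_281_880),
        "Private, Non-Profit": (3_974_004, 520_390),
        "Private, For-Profit": (1_656_227, 856_933),
    }

    for sector, (total, exclusively_de) in sectors.items():
        share = exclusively_de / total
        print(f"{sector}: {share:.0%} of students are exclusively online")

    overall = sum(e for _, e in sectors.values()) / sum(t for t, _ in sectors.values())
    print(f"All sectors: {overall:.0%}")  # roughly one in eight students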

For-Profit Enrollments Comprise Less than a Third of All Fully Distance Students
Public Institutions Account for Almost Half of All Fully Distance Students

It is still a common myth, held by many in the public, in the press, and in the policy arena, that almost all distance education enrollments are in the for-profit sector.  Nothing could be further from the truth.

For-Profit enrollments represent just 32% of the exclusively distance education student enrollments reported in 2013. So while fully online learning is the dominant mode for the For-Profit sector, public institutions still enroll more fully online learners (1,281,880) than the For-Profits (856,933).

 

Table 3: Students Enrolled Exclusively in DE by Sector

Sector | Students Enrolled Exclusively in DE Courses | % of Exclusively DE Enrollment
Public | 1,281,880 | 48%
Private, Non-Profit | 520,390 | 20%
Private, For-Profit | 856,933 | 32%
Total | 2,659,203 | 100%

 

Almost 90% of Students Taking Some (But Not All) Distance Education Courses Are Enrolled in Public Institutions

It is Relatively Rare for a Non-profit or For-profit Student to Mix Face-to-Face and Distance Courses

 

Table 4: Students Enrolled in Some But Not All DE Courses by Sector

Sector | Students Enrolled in Some But Not All DE Courses | % of Some But Not All DE Enrollment
Public | 2,462,362 | 86%
Private, Non-Profit | 275,020 | 10%
Private, For-Profit | 125,609 | 4%
Totals | 2,862,991 | 100%

Public Students Taking at Least One Distance Course Outnumber For-profits 4 to 1

As described in the Methodology section below, to approximate the Babson definition of online course enrollments, the IPEDS results for “students enrolled in some but not all distance education courses” and “students enrolled exclusively in distance education courses” are combined. When we look at the sector data using this broader definition of online enrollments, we see that more than one in four students (27%) took at least one online course in Fall 2013. And once again, there are far more such enrollments in the Publics (3,744,242) than in the For-Profits (982,542) or Non-Profits (795,410). In fact, Public IHE online enrollments outnumber Private For-Profit enrollments nearly 4 to 1.

Table 5: Enrollment in At Least One DE Course by Sector

Sector | Total Enrollment | Students Enrolled Exclusively in DE Courses | Students Enrolled in Some But Not All DE Courses | Students Enrolled in At Least One DE Course | At Least One DE Course as % of Total Enrollment
Public | 14,745,558 | 1,281,880 | 2,462,362 | 3,744,242 | 25%
Private, Non-Profit | 3,974,004 | 520,390 | 275,020 | 795,410 | 20%
Private, For-Profit | 1,656,227 | 856,933 | 125,609 | 982,542 | 59%
Totals | 20,375,789 | 2,659,203 | 2,862,991 | 5,522,194 | 27%

Conclusions

It is not our intention to place value judgments on the different sectors, but rather to puncture some common myths about distance education enrollments by sector.  Having a better handle on the level of distance education enrollment activity and the type of activity (fully distance or mixed) helps us to better understand the marketplace, whether we are interested in competitive positioning or regulatory oversight.

The analysis of sector data reveals that Public institutions of higher education continue to educate the lion’s share of all students, including students taking online courses, whether as a component of their course load or exclusively online. While the For-Profits enroll a higher proportion of their students fully online, they remain far from dominating the distance education marketplace.

In the next blog post, despite reservations about the validity of the IPEDS data (see Methodology below), we will focus on comparisons between Fall 2012 and Fall 2013 enrollment in distance education courses.  We will examine the recent headlines regarding distance education growth being flat.  It is definitely true overall, but not in all cases.  Stay tuned.


 

Terri Taylor Straut
Ascension Consulting

 

 

 

 

Russ Poulin
Director, Policy and Analysis
WCET

 

If you like our work, join WCET!

 

Methodology

Matching the Fall 2012 process, the analysis of the Fall 2013 IPEDS data was conducted on all degree-granting institutions in the U.S. This represents 4,724 institutions of higher education (IHE) in total, both 4-year and 2-year colleges. This data set matches the one that Phil Hill, edtech author and blogger at e-Literate, has used in his recent blogs on the Fall 2013 IPEDS data. According to Phil, the data set also matches the historical data reported by the Babson Survey Research Group (BSRG)/Sloan-C/Pearson survey. In order to approximate the measures used by the prior survey, the data fields “enrolled exclusively in DE courses” and “enrolled in some but not all DE courses” were combined to match the Babson category “enrolled in at least one online course.” We want to thank Phil for his early analysis of the Fall 2012 data and for his ongoing collaboration with the BSRG to ensure that the two data sets can be compared appropriately. Phil has also done a fine job of illuminating the differences in the data and definitions used.
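For readers who want to replicate this step, a rough sketch of the field combination is below. It assumes a CSV extract of the IPEDS Fall Enrollment survey; the file name and column names are hypothetical placeholders, since an actual IPEDS download uses its own variable names.

    # A rough sketch (ours, not the authors') of the field combination described
    # above. The file and column names are illustrative placeholders only.
    import pandas as pd

    df = pd.read_csv("ipeds_fall_2013_distance_ed.csv")  # hypothetical extract

    # Approximate the Babson category "enrolled in at least one online course"
    # by summing the two IPEDS distance education fields.
    df["at_least_one_de"] = df["enrolled_exclusively_de"] + df["enrolled_some_de"]

    # Sector-level totals; the sector column is assumed to hold
    # Public / Private, Non-Profit / Private, For-Profit labels.
    sector_totals = df.groupby("sector")[
        ["enrolled_exclusively_de", "enrolled_some_de", "at_least_one_de"]
    ].sum()
    print(sector_totals)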

The biggest caveat is this:  Given the errors that we found in colleges reporting to IPEDS, the Fall 2012 distance education reported enrollments create a very unstable base for comparisons.

The fact that the Department of Education has included DE data in annual IPEDS reporting is a solid step toward having comprehensive, reliable data from every institution in the U.S. As the reporting becomes more consistent from year to year and between institutions, the data will become more reliable and stronger conclusions can be drawn from it. We are not there yet.

The IPEDS DE data that the distance education market very much wants to rely upon as a baseline for measuring industry changes will not be accurate until all 4,700+ institutions have a shared understanding of the definitions and reporting requirements. In addition to understanding how they need to report the data, institutions need systems in place to collect and report accurate data about their DE students, as well as the entire student population. Without this consistency, the industry continues to function without any reliable measure of change from year to year, by sector, or otherwise.

 

WCET Strategic Priorities: Practice, Policy, and Advocacy

Today we hear from Peter Smith, Founding President, Open College @ Kaplan University and Chair of the WCET Executive Council.  Thank you, Peter, for your insights today and for the leadership you provide the Cooperative.

It is a tremendous honor to have the opportunity to serve as the Chair of the WCET Executive Council, especially as the “Abbiatti Era” dawns at WCET. Once again, WCET has identified the right leader at the right time to address the challenges and opportunities that we and our members face in the educational technology space.

As those of you who have worked with Mike know, he moves quickly and surely to define objectives and create clarity of direction and purpose. Since he arrived, Mike has worked with the staff, members of the Steering Committee, and the Executive Council to list, prioritize, and select three strategic focus areas for WCET. They represent, I believe, appropriately diverse levels and types of activity that are at once aggressive, important, and achievable.

Practice

There are three areas identified for strategic focus, each distinct from the other two, but all intertwined in an organizational DNA that will prove extremely valuable. Historically, WCET has been a membership organization focused on best and emerging practices in the use of technology in higher education at the institutional level. This will continue to be the organizing vision and purpose of WCET: to serve the practice-related needs of our institutional and other members in the use of technology. In my opinion, no one does it better than we do. But choosing practice as an explicit strategic focus area will drive a more operational focus, and hence improvement, going forward.

Policy

At the same time, we all know and understand that the policy arena in higher education will be significantly occupied by technology-oriented issues as well as the disruptive consequences of big data, abundant information, and new technological capacities emerging every day. Therefore, it makes all the sense in the world that WCET should lead the analysis of policy that impacts technology-enhanced teaching and learning. This year, it will undoubtedly involve the reauthorization of the Higher Education Act. But as we have seen from other initiatives, such as PAR and SARA, the policy conversations of the future will be as diverse as they are, in some cases, unanticipated.

Advocacy

Finally, when you consider the sum total of these first two priorities, it suggests that WCET’s third area of strategic focus should be advocacy, guiding proposed regulations and policies so that they result in practices that benefit students while also balancing the needs of institutions and governments. While we can and should engage in joint advocacy with other groups, and there will be, and should be, other voices in this discussion, WCET’s membership base, focus on practice, and history in the advocacy field position us well to be a major national and global voice.

I look forward to an exciting and productive year working with Mike, his extraordinary team, the Executive and Steering Committee members, and all the members who make WCET the vital and important organization that it has become. We are standing at a crossroads in higher education. The leadership opportunity in practice, policy, and advocacy in and for technology-related issues has never been bigger or more important. Working together, we can seize the future.

Peter Smith
Founding President
Open College @ Kaplan University

Empowering Practices: 3 Steps You Can Take Now To Improve Academic Integrity

Today we welcome academic integrity expert Tricia Bertram Gallant, Outreach Coordinator at the International Center for Academic Integrity, who will share how the Center is helping institutions employ academic integrity best practices.  The “Trusted Seal of Integrity” should help us combat issues like those addressed in this December 30 article.  Tricia and her team are making great strides, but they can’t combat integrity issues in a bubble; they need the support of the entire academic community to do so!  Thank you, Tricia, for the informative blog and the great work you’re doing.

There is a rumor I frequently hear uttered (both under one’s breath and outright, as if in exclamation) — “online students cheat more than regular students!”

This rumor is uttered in traditional brick and mortar institutions by faculty who are resisting online education. It is uttered in general society by people who want, yet fear, new and expanding opportunities for people to undertake higher education. And the rumor is uttered by online instructors who worry that academic integrity will be undermined by people they cannot identify or know.

The truth is that there is no research to support the claim that online students cheat more than students who take traditional brick-and-mortar classes. The research, in fact, is inconclusive. Some online students cheat more than some traditional students, and some traditional classes have more cheating than some online classes.

Why is the Research on Online Cheating Inconclusive?

The research is inconclusive because it doesn’t consider the practices employed by the instructors of those courses. And the truth is that online and traditional brick-and-mortar class instructors can implement the same best practices to enhance integrity and reduce cheating. There is no secret here to what works – we just need to commit to applying those practices in the online educational environment.

The best practices for enhancing integrity and reducing cheating in any classroom are to:

  1. Inform and Educate
  2. Prevent & Protect
  3. Practice & Support

Inform and Educate

Generally speaking, many university-bound students are ignorant of academic integrity. Yes, they may have been taught not to lie, cheat, or steal, like most of us, but they have existed (and thrived) in educational environments in which almost any method of getting assignments in and passing tests is acceptable or, at least, only lightly punished.

This means that all of our students – domestic or international, online or face-to-face, younger or older – need to be informed and educated about academic integrity, and specifically about the academic integrity “dos and don’ts” of a particular classroom, program, and/or institution. And their knowledge needs to be assessed so we can be more certain that they share the same understanding that we do.

Prevent and Protect

We also need to take steps to ensure that our assessment practices promote integrity. Are we verifying student identity when they are demonstrating their knowledge? Are we monitoring and verifying the integrity of assessments? This is critical, of course, in both in-person and online exams.

If the online students are taking their exams in person, the institution should ensure that the testing site is employing best integrity practices, and if the online student is taking his or her exam remotely, the institution should be using technological tools to ensure that the person completing the assessment is the person who is enrolled in the class.

Practice and Support

Finally, in order to support both faculty and students in ensuring that integrity is the norm and cheating is the exception, the program and institution need policies and procedures that are consistently implemented, equitable, and regularly reviewed.  These policies should encourage consistent reporting of integrity violations, provide a fair and educational process for alleged violators, create a teachable moment, and be reviewed every 3-5 years to ensure the integrity of the process.

The “Trusted Seal of Integrity”

If we manage to do these three things in online (or face-to-face) educational environments, then we will be employing best integrity practices and, perhaps more importantly, it will be more likely than not that cheating will be the exception and integrity the norm.

To help faculty, programs and institutions employ these best integrity practices, the International Center for Academic Integrity and Software Secure have partnered to develop the Trusted Seal of Integrity program. This program provides faculty, programs and institutions with a rubric by which they can assess how well they are employing best integrity practices. And, if they are performing sufficiently, they can be awarded the Trusted Seal – a public declaration that integrity is a priority for the instructor, the program and the institution.

For more information about Trusted Seal, you can email me or visit us at www.integritytrusted.com. I’d be happy to hear from you!

 

Tricia Bertram Gallant

ICAI & Trusted Seal

 

 

 

 

 

 
