As the Online Program Management (OPM) industry matures, universities increasingly enjoy the opportunity to negotiate a more customized arrangement to accelerate the growth and visibility of their online programs. But, with expanded opportunity comes increased responsibility to negotiate an agreement that advances the institution’s mission while minimizing potential liability to its operation and reputation.
Even as they avoid a “one-size-fits-all” approach to OPM contracts, universities often overlook a number of key contract dimensions--an oversight that can result in poor partner relations, weak program momentum, and even an early dissolution of the executed agreement, with a concomitant loss of opportunity.
Based on numerous consultations helping universities and OPMs create mutually beneficial agreements, we recommend special attention to a number of key dimensions to clarify partner roles and manage expectations:
For example, universities often rely solely on inflated and/or subjective internal assessments of their technological capabilities, platform support services, course development capacity, and student support infrastructure when determining which services to include in the OPM contract, only to discover later in the partnership that they can’t deliver--or can’t deliver as cost effectively as the OPM.
Conversely, universities frequently overestimate the power of their brand and the attractiveness of their academic programs, often entering OPM agreements with lofty visions of instant, worldwide success. OPM partnerships require a thoughtful, rational pre-contract assessment of each partner’s capacity and ability, as well as a realistic view of enrollment growth and sustainability. Control of key marketing assets--such as search terms, marketing channels, landing pages, and institutional messaging--requires a transparent approach between the partners that supports all aspects of the university enrollment mix while building online as a strategic focus of enrollment expansion.
JenEd Consulting offers extensive support to universities and OPMs in the brokering and negotiation of contracted partnerships. Let us know how we can help you develop an achievable and sustainable relationship.
Dr. John E. Neal
JenEd Consulting, LLC
As the field of big data moves out of the “early adopter” phase and into more mainstream use, many colleges and universities are struggling to balance the challenges it creates with the opportunities it affords. Finding the answer to a question on the Internet has famously been compared to getting a sip of water from a firehose. The same metaphor applies to big data, where the pursuit of a relatively simple question can lead to a deluge of data-supported answers from all sides.
While there are several methods to improve the likelihood of an effective research effort using big data, there is no element as critical to your success as a research team that includes staff with the necessary skills and knowledge for the effort. When assembling your team, the following roles and skill sets are requisite.
Senior Sponsor: The Senior Sponsor is usually a member of the senior academic or administrative staff. This individual is often responsible for the operations of the area that is being researched. This may include academic affairs, student services, enrollment management, faculty administration, institutional research, information technology, etc. The Senior Sponsor is responsible for ensuring that the research process stays true to its intention, represents the strategic needs of the institution, and is reported correctly and without bias to all of the necessary channels.
Data Scientist: In many ways the Data Scientist is the nexus for the entire research process. Data Scientists must have several skills and abilities to accomplish their responsibilities within the team. Since this role provides the primary oversight and direction of the study, the Data Scientist must have a clear understanding of all phases of the research, including its design, acquisition/preparation of data, analysis, interpretation of output, and reporting. Staff in these roles need to pay particular attention to the seams that naturally occur in the “hand-off” between the phases of research; these areas are prone to error, and the integrity of the data and analysis may be compromised there.
Data Scientists have the ultimate responsibility for the fidelity of the process and its output. In areas where the institutional talent is not deep enough to support the needs, the college or university may elect to consider outside experience and expertise.
Subject Matter Expert/Content Representatives: The Subject Matter Expert is the team member who works most closely in the area represented in the research. Of all the team members, this individual is “in the trenches” daily and has the greatest knowledge and awareness of the research topic and its context.
Subject Matter Experts are engaged in validating the research plans and questions as they pertain to the area of analysis. They are excellent resources for identifying important internal and external data sources and can often be called upon to judge the quality and completeness of records as part of the data audit. Subject Matter Experts represent the needs and perspective of the consumer of the research. As such, they are in a critical position to inform and shape the format of the research output.
Data Hygienist: The Data Hygienist is responsible for evaluating and determining the suitability of the data for use in analysis. This will include an audit of the accuracy and completeness of the records. The Data Hygienist will also prepare the data based on the agreed upon standards for ingestion. The person(s) in this role also organizes the data in the specified format for analysis.
Data Journalist: The Data Journalist is responsible for the design and composition of the reporting elements of the research. This includes determining the ideal format for presenting clear and actionable output for the identified audience. The Data Journalist often needs to convert technical statistical information into knowledge that can be easily consumed by those without quantitative expertise. The staff member(s) in this position should be comfortable determining the correct and accurate method for graphically representing statistical data. Finally, the Data Journalist ensures that the reporting accurately represents the outcome of the research in a fair and unbiased manner.
IT Team: The IT challenges associated with big data are legion and a subject for another blog, if not a book! The development of the IT subteam that supports big data is a challenge unto itself. Institutions seeking to work with big data must be committed to recruiting and retaining IT staff who are capable of extracting, storing, accessing, aggregating, formatting, and manipulating structured and semi-structured data from disparate sources. The institution will need data engineers, programmers, database managers, systems architects, and related staff with proficiency in coding, modeling, machine learning, database management, information systems management, and related areas. No research effort involving big data will be successful without a skilled and talented IT subteam in place.
While the difficulties in using big data for research are significant, the advantages are considerable. The detail and sheer volume of data now available to your institution have never been matched. Used appropriately, these data allow colleges and universities to make decisions informed by analysis of unprecedented depth. But this possibility is unlikely to be realized without a skilled and dedicated research team.
Dr. Rob Sapp, Senior Associate
JenEd Consulting, LLC, a comprehensive consulting services company for colleges and universities seeking to extend their mission through quality online programming, today announced a strategic alliance with Academic Keys, LLC, the premier source for academic employment, to collaborate on executive searches in higher education. Offering four levels of search, Academic Keys has filled a long-standing need in higher education--comprehensive search options for all levels of positions at affordable prices. By design, their services focus on sourcing candidates for challenging and tough-to-fill positions.
The strategic alliance will combine JenEd Consulting’s executive higher education experience with Academic Keys’ expertise providing comprehensive searches at an affordable price. “As a former university president, I know how difficult it can be to attract quality candidates in key faculty and administrative positions,” said Dr. John Neal, President of JenEd Consulting. “JenEd’s strategic consulting activities often result in opportunities to facilitate key searches at a variety of faculty and administrative levels. We are pleased to be utilizing Academic Keys’ dedicated team of researchers, remarkable database, recruiting skills and advertising to offer the highest caliber search services at attractive prices.”
Jennifer Muller, Vice President of Academic Keys, sees the alliance with JenEd Consulting as “a unique partnership with seasoned executives who bring university and corporate experience to their strategic consulting--enriching Academic Keys’ relationships with colleges and universities who view searches as more than simply filling positions, but as strategic opportunities to advance the mission of an institution.”
The JenEd/Academic Keys alliance will offer four levels of search: full-service, economical, cluster hires, and interim positions.
In my work as an online higher education consultant, I divide my time between colleges/universities and the companies that serve them in the online space, particularly online program management firms (OPMs). As a former university president and a former OPM CEO, I have the opportunity to observe universities and OPMs negotiate, create, and manage complex partnership agreements--and I'm often invited to the table because I bring a unique practical perspective of both sides of the relationship.
Over the past five years, I've interviewed over 100 presidents of medium to smaller private, not-for-profit institutions (largely members of the Council of Independent Colleges) about their strategic online plans and what they want and/or need from an OPM. As online learning moves from the periphery to the strategic agendas of private, not-for-profit colleges, presidential perspectives on the role of online learning provide an essential resource to an OPM’s development of products, services, features and marketing themes to assure consonance between the OPM’s strategic direction and its target market. Here are a few of the common themes:
Serve as a source of research for colleges. Presidents wrestle with conflicting constituencies, including regulatory bodies. Objective research on optimum philosophies and approaches (as well as national trends and models for improvement) will give presidents needed ammunition and make institutional change easier.
Don’t treat colleges like a business, nor try to appeal to business-driven results. Not-for-profits bring different values to the table and feel vindicated by recent public floggings of the for-profit sector. Appeal to the larger purposes of higher education—changing lives, expanding the reach of the college’s mission and educational approach.
Colleges maintain a healthy skepticism of external business partners. They all have horror stories of high-priced, overblown promises that cost the president stature and the college momentum and resources. Entrepreneurial colleges feel confident of their ability to run online in-house, while colleges behind the curve are intimidated about which model (and partner) would be best. Trust built through relationships and results must remain paramount.
Most presidents are content with their current online model, but quickly reveal fear of falling behind or losing market share. Few presidents have grandiose dreams about online enrollment, but they are concerned that the world is passing them by.
Online development is usually driven by a single innovator or a small team. Presidents like the inexpensive and home-grown nature of the arrangement, but understand that these key players could leave and move the college back to square one. Many also admit to current difficulties in scaling for success.
Presidents admit that too much time was spent on curricular design and platform development, while too little time and expertise went into the branding, marketing, and student recruitment side of the online venture. The ability to deliver adequate numbers of students at a reasonable price would convince some presidents to change models and/or platforms.
Declining and/or flat enrollments on campus and in adult programs increase the sense of urgency for online development. Most presidents admit to little or no knowledge of online programming, thinking that it would be an extension of their on-ground adult and off-site programs. Online is becoming a separate item on the strategic agenda of many presidents.
Concern persists that online program expansion will cannibalize the adult and off-site programs. Presidents worry that their lack of online marketing expertise will result in fishing in the same prospect pool for both online and on-ground adult students, leading to a net loss of their current markets.
Faculty governance remains the #1 internal obstacle to all innovation, including online development and expansion. No issue varies more from campus to campus, ranging from faculty driving all decisions to faculty being avoided entirely. In all cases, however, the role and function of faculty is a foundational concern.
Presidents must preserve the core identity and mission of the institution. Their jobs hang in the balance. Ironically, many of the presidents admitted that they could survive financial downturns, but not accusations of changing the nature of the college. The for-profits (or not-for-profits that became for-profits) are not attractive models. Partners must understand how colleges measure success over the long haul.
Accreditation and regulatory constraints are making strategic change almost impossible. Many presidents have recent horror stories of timelines for approvals of new programs and locations—most of the stories involved a year or more of waiting for decisions.
Presidents aren’t looking for big markets or rapid growth. Both scenarios raise accreditation and regulatory red flags, and private colleges prefer running under the radar. Conversely, most presidents found incremental long-term expansion desirable, preferring modest increases that would make a substantial impact on the institution over time.
The student environment must be the focus of all aspects of online delivery. Private college presidents take great pride in the hands-on, personal approach of their programs, including off-site and adult services that exceed student expectations. Retention is not viewed as a business metric, but as a measure of their success in supporting their students.
At JenEd Consulting, we're facilitating the conversation between colleges and OPMs to clarify expectations, while also increasing an understanding and appreciation for the folks on the opposite side of the negotiating table.
Dr. John E. Neal, President
“User Experience” is everywhere. As an emerging formal discipline, its degree programs, best practices, vocabulary and models have made User Experience (note self-important uppercase) feel terribly significant, but totally inaccessible. The province of techies and experts.
But before there were “users”, there were customers and consumers—the regular people who bought and used products and services in the market. Customers interacted with and kept or returned a product. If they were satisfied, they may have come back to the same company for more or they might be lured away by an innovation, better price or clever advertising. Similarly, before students became users, they chose a school, a field of study and courses that they took with more or less enthusiasm and/or learning. Ironically, in a field that prides itself on human-centric design, the very word “users” feels less personal, less human.
If the fundamental idea behind user experience (note less arrogant lowercase) is not new, what has changed? Here are five major shifts.
User experience is often unnecessarily narrowed to refer to on-screen interactions, ignoring bigger goals and context. Instead, user experience at a university might be seen as everything and everyone a student (or parent) interacts with from the first visit to a website to an on-campus visit to a phone call about financial aid. It includes online (marketing, enrollment, delivery of education), books, physical materials, and—most important—contact with real people. Good user experience allows students to manage their own education and achieve their goals.
Faculty members also enjoy or despair of a user experience. In their quest to help students learn and share their passion for a subject, they feel that their knowledge, abilities and time are valued to a greater or lesser extent. Good user experience allows faculty members to use a variety of techniques to deliver content, talk with students, set up meaningful collaboration and create and test new models of learning.
The institution has its own goals: attracting and retaining students and faculty, providing education, improving its reputation, and meeting financial expectations. Good user experience can help a university stay competitive, raise standards, manage risk, and use resources effectively.
Why User Experience?
Providing a great user experience is a challenge and an opportunity. If you’re looking for reasons to focus on user experience, here’s a short list.
User Experience Questions
A good user experience results from asking lots of questions—broad and tightly focused—listening carefully to the answers and pursuing new lines of inquiry. A by-no-means comprehensive list of questions includes:
User experience work is totally accessible, so there’s no need to be intimidated. Like all disciplines, a good guide can help you get smarter fast. JenEd Consulting can work with you to clarify your goals, understand your audiences, and create a process to meet their needs. We practice what we preach—lots of questions, careful listening, advocating on behalf of users, with better results through relationships and collaboration.
Senior Consultant, User Experience
JenEd Consulting, LLC
JenEd Consulting is a Contributor for CIC's 2016 Institute for Chief Academic Officers in New Orleans (November 5-8, 2016). Dr. John Neal and Dr. Rob Sapp will be in attendance, so stop by the booth to say hello and discover specific resources and strategies for your online programs.
Welcome Ginny Rice to the JenEd Consulting team!
Ginny specializes in strategy, user experience and design for education and public information. Her experience spans industries, content and applications with clients that include the Virginia Community College System, Junior Achievement, K12 Inc., the National Library of Medicine, the Corporation for Public Broadcasting, Time-Life for Children, and the Smithsonian Institution. She is also an entrepreneur—having helped launch three successful businesses—and uses this knowledge to work with new initiatives and growing organizations.
By understanding the audience and identifying the keys to engaging them, she has consistently delivered measurably successful products, services and experiences. She is experienced in working with experts in complex subject matter to develop programs that appeal to multiple audiences with varying levels of competence.
Ginny is a graduate of Brown University and the Culinary Institute of America. She studied Educational Technology at the University of Wisconsin.
I'm honored to be speaking in the Leadership Learning Series for Leadership Donelson-Hermitage. Come join us at this event sponsored by First Tennessee Foundation and Nashville Shores Lakeside Resort (free to attend, but registration required).
Online learning is changing the landscape of higher education and employee development. Employees and organizations must seek innovative alternatives for their development and advancement offerings while balancing requests and proposals from vendors and service providers--a task that requires a keen awareness of what is available and what delivers the greatest return on investment for your team.
Attendees at this program will:
Friday, October 14, 2016/7:30am
Piedmont Natural Gas/83 Century Blvd/Nashville, TN 37214
In recent years, the Learning Management System (LMS) has emerged as the enterprise technology for most academic functions in higher education. No longer defined within the narrow parameters of course management, the current generation of LMSs is a powerful amalgamation of many functions, including content management, learning analytics, electronic portfolios, program tracking, digital rights management, enterprise academic calendaring, teleconferencing, adaptive instruction and assessment, and more.
Your LMS is also likely to be a hub for several other technologies, including your student information system, customer relationship management system, and a myriad of administrative systems associated with an ERP (Enterprise Resource Planning) platform. The LMS now supports traditional face-to-face, hybrid, and fully online instruction, and it is difficult to overstate the strategic importance of this system for current and future academic operations.
A routine and comprehensive evaluation of your LMS is critical for sustaining and advancing the academic mission of your institution. When conducting such evaluations, you may wish to consider the following questions in your examination.
1. What are my eLearning needs and plans?
The evaluation of any enterprise technology must begin with an understanding of its role in your college or university. This includes both the current institutional context and your plans for the use of the system in the coming years. While a fully articulated strategic plan is not necessary to evaluate your LMS, you will need a working document that states the institutional needs, goals, and/or objectives to be used in setting the standards for the evaluation.
Since the assessment of your LMS can only be as strong as the criteria you use to evaluate it, your statement of these needs should be a thoughtful and complete representation of the institution’s requirements. You will likely want to create a small team of five to eight critical staff members who are involved in the use of the current LMS and who bring broad institutional perspectives in academic, administrative, and technical areas.
2. Does the existing LMS meet the current needs and plans?
The evaluation of your existing LMS usually begins with some list of prioritized criteria based on the response to question one. These should reference both your existing needs and your plans for growth and development. Once the criteria are assembled, an honest and thoughtful review of your current system against these standards should be completed.
There are a few important points to consider in this evaluation. First, realize that there will be inherent bias in the evaluation of your existing system from anyone who currently uses it in your institution. The bias can occur both for and against the current system and it will occur even if the evaluators are aware of this possibility and are attempting to control for it. You may wish to consider an external evaluator to augment your staff.
The answer to question two is often not a clear “Yes” or “No.” A careful review of your LMS against the criteria you have identified will likely show strengths and weaknesses of the system as it stands. The team that created the evaluation criteria should meet again to consider the output of the analysis. The final determination is generally delivered as a short report with a narrative describing the performance of the existing system against each area of the criteria, along with a recommendation to maintain the current system, change or redeploy it, or consider a new system.
3. Can another LMS better meet my needs and plans?
If the response to question two leads you to consider replacing your existing LMS, you should plan to dedicate significant time and resources to the consideration of a new system. Selecting a new LMS can be an intimidating process. The market is large and complex and current systems vary widely. No single LMS is the strongest in all categories of functionality.
While you will certainly want to use the criteria you identified for evaluating your existing system, realize that assessing candidate LMSs is more involved than a simple environmental scan using checklists and questionnaires. It is a more comprehensive process that includes an RFI (Request for Information) or an RFP (Request for Proposal) created by a team of qualified representatives from the important areas of your institution. You may also want to include expertise external to your college or university to assist in building the documents and assessing the responses.
4. Is the change worth the cost?
In the final calculus of selecting a new LMS or maintaining the existing one, be sure to consider the total cost. The licensing of the software is only one element of total cost. Be sure to include other hard technical costs in the fiscal evaluation. For example, will the LMS be hosted externally or internally? What are the total relative costs of each solution, including staff training and augmentation, new hardware, infrastructure build-outs, networking upgrades, etc.?
Try to provide an extensive accounting for the peripheral costs of implementing your new system. Consider the real costs of preparing your students, faculty and administrators. Be sure to reference conversion costs such as the need to maintain concurrent systems if that is within your transition plan. Include expenditures associated with the development of new training materials, help functions, and associated staffing.
Consider also the soft but legitimate costs of transitioning your user community. The hours that students and faculty spend learning the new system may not be directly billable to you, but they may have an adverse effect on the process of learning and teaching. A thoughtful transition plan and careful communication can mitigate this, but there will be some frustration. Conversely, frustration may also arise from maintaining a legacy LMS with outmoded functions that no longer meet the core needs of your user community.
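As a rough illustration, this kind of multi-year total-cost comparison can be sketched in a few lines. Every figure, category name, and the five-year horizon below is a hypothetical placeholder, not a benchmark; substitute your institution’s actual numbers.

```python
# Illustrative five-year total-cost comparison: keep the current LMS
# versus migrate to a new one. All dollar amounts are hypothetical.

YEARS = 5

current = {
    "licensing_per_year": 80_000,
    "hosting_per_year": 30_000,
    "support_staff_per_year": 60_000,
}

new = {
    "licensing_per_year": 65_000,
    "hosting_per_year": 20_000,
    "support_staff_per_year": 55_000,
    # One-time transition costs
    "migration": 120_000,
    "training_materials": 25_000,
    "concurrent_systems": 40_000,  # running both LMSs during the cutover
}

def five_year_cost(costs: dict) -> int:
    """Recurring costs multiplied out over the horizon, plus one-time costs."""
    recurring = sum(v for k, v in costs.items() if k.endswith("per_year"))
    one_time = sum(v for k, v in costs.items() if not k.endswith("per_year"))
    return recurring * YEARS + one_time

print(f"Current LMS, {YEARS}-year cost: ${five_year_cost(current):,}")
print(f"New LMS, {YEARS}-year cost:     ${five_year_cost(new):,}")
```

Note how the transition costs can outweigh lower recurring fees over a short horizon; in this fabricated example the new system is actually more expensive over five years, which is exactly the kind of finding question four is meant to surface.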
Your final determination between an existing system and a new system will be greatly informed by question four. After the careful consideration you have given questions two and three, be sure you provide the same diligence to your analysis of real costs.
The consultants at JenEd Consulting have extensive experience in helping to determine your LMS needs, evaluating your current system and assessing new systems for use in your institution. Whether you need someone to lead the process or just an external set of experienced eyes, we are ready to assist you in your LMS review.
Dr. Rob Sapp
One of the great benefits of online instruction is the ability it affords to track the activities and behavior of the students on the virtual campus. By identifying the characteristics of “successful” students and the metrics that measure those characteristics, we can determine which student conditions, activities, and behaviors best predict student success. But even those institutions with the most advanced online learning programs are only just beginning to realize the untapped potential of their student and faculty data. This type of critical inquiry requires a thoughtful design by experienced researchers. Here are some suggestions to get you started.
At the foundation of nearly all institutional research is the goal of student success. How does your institution define student success? Common elements include student performance (formative and summative grades), retention, class standing, progression toward degree, satisfaction, etc. The definition for your institution may include some or all of these elements. However, the definition may be nuanced, and the weighting of these elements is likely to vary greatly from one institution to another. It is important to build consensus on the institutional definition of student success before determining the processes and selecting the metrics to measure it.
Choosing Data Sources
Once you have a definition for student success, you should take inventory of the data sources available for analysis. You will likely need input from several departments within your college or university to complete a comprehensive account of the available sources. Consider including the owners or “keepers” of areas such as your learning technologies, learning content, academic information systems, enterprise resource planning systems, etc. You may need multiple levels of expertise for some of these systems, e.g., content knowledge, information technology knowledge, policy knowledge, and more.
Now consider which specific sources and systems will be included in your inventory. Typical locations include the Learning Management System, Student Information System, Customer Relationship Management system, Enterprise Resource Planning, Education Management Information Software, etc. This will likely begin as a relatively small list. However, the inventory must also include field-level data from the sources; these are the actual metrics that will be used in the analysis, either as direct input or as data for generating input.
There are several classifications of student data to consider in your analysis:
• Student Conditions: Student conditions are those student metrics that occur outside the classroom. Common student conditions include demographics, academic programs, previous performance, class status/standing, etc.
• Student Activities: Student activities are the actions that students take in your virtual campus and online course. Some examples of student activities include the time spent in course (total time, average time per day, average time per session, etc.), time spent in various content areas, performance in course to date, when assignments are completed, etc.
• Student Behavior: Some research efforts elect to separate student activities and student behavior. Student behavior examines the “how” of specific student activities. How do students elect to consume instructional content and complete their assignments? What are the sequences they use to complete necessary academic activities? In many instances, how students choose to work online can tell us as much or even more than the actual tasks they have completed.
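To make the activity metrics above concrete, here is a minimal sketch of how per-student time-in-course figures might be derived from session logs. The log format, student IDs, and timestamps are hypothetical; a real LMS export will differ in structure and will require cleaning first.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical LMS session log: (student_id, login_time, logout_time)
sessions = [
    ("s1", "2016-09-01 09:00", "2016-09-01 09:45"),
    ("s1", "2016-09-02 20:15", "2016-09-02 21:00"),
    ("s2", "2016-09-01 22:30", "2016-09-01 22:50"),
]

FMT = "%Y-%m-%d %H:%M"
totals = defaultdict(float)   # total minutes in course, per student
counts = defaultdict(int)     # number of sessions, per student

for student, start, end in sessions:
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    totals[student] += delta.total_seconds() / 60
    counts[student] += 1

for student in sorted(totals):
    avg = totals[student] / counts[student]
    print(f"{student}: total={totals[student]:.0f} min, "
          f"avg/session={avg:.0f} min")
```

The same aggregation pattern extends naturally to the behavioral questions above, for example by grouping events by content area or by the sequence in which activities were completed.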
Analysis and Interpretation
The processes for analysis and interpretation of your data will vary widely depending on your research questions, available sources, the metrics you choose, and the output you desire. It is likely that you will want a descriptive statistical report for the key variables of your student population. These may include metrics describing distributions, central tendencies, and dispersions.
It is also quite likely your analysis will include inferential statistical models. These types of analyses examine the relationship between two or more variables. For example, what is the predictive value of time spent in course content to the final grade in that course? Once again, the quantitative models you select will be a product of the nature of your research and metrics you have decided to use.
The interpretation of your statistical output is normally completed in a small team with senior members of academic administration, subject or content experts, the researchers, and experts in relevant statistical areas. The key to quality interpretation is determining a consensus based on the data and articulating that in “actionable” terms. By actionable, we mean a clear and obvious path to change based on the output of the data.
One of the reasons that colleges and universities have been reluctant to pursue significant research using their online student data is the perception of the complexities and difficulties associated with it. While research of student conditions, activities and behavior is not a trivial thing, it is certainly a realistic goal worth pursuing.
Knowing which student actions are likely to predict factors of success such as strong academic performance, retention, persistence, and degree completion is clearly invaluable strategic knowledge. JenEd Consulting has a great deal of experience across the entire research lifecycle. We can help you articulate your research questions, identify key data sources, prepare the data for analysis, select the appropriate research models and methods, and complete actionable interpretation of the output.
Dr. Rob Sapp