Q&A with US News & World Report

The answers to the questions below come from Bob Morse, director of data research for US News, and Eric Brooks, the lead US News data research analyst for the recently released first US News Top Online Education Program Rankings. US News welcomes the opportunity to communicate with WCET's members. We hope that this will be the first of many conversations between US News and WCET about our new online degree rankings.

All questions were submitted by WCET members in January.

General Questions

What is USNWR's perspective on the actual value of the first-round results (apart from the obvious self-promotional brand-building aspects)?
o What is the perceived value for the consumer or other audiences of a first round of rankings that so clearly does not reflect the quality of available programs? Or
o Does USNWR view its findings as somehow revelatory (e.g., by uncovering the four best online programs, which made its honor roll)?

The value of the first-round rankings has more to do with the fact that US News decided to do them in the first place. The timing was right because online education is no longer in its infancy. It has reached a certain level of mainstream acceptance and continues to grow. We felt that now was the time to expand our data-collection and ranking efforts into the rapidly expanding online education sector. There was also a great deal of demand for comparative information on the relative merits of online programs. Such rankings information was not available for online degree programs, and US News is now starting to fill this large information gap. So, the main value in the first round is filling an information gap and taking the first steps toward providing evaluative, comparative data on online degree programs.

In the questionnaire, there seemed to be an assumption that it was desirable for distance education students to commingle with on-campus students online.
Can you point to the research findings that support this concept?

It's clear from our data collection that there are many ways that online education is offered and that there is no one-size-fits-all way to do online education. Consequently, none of the rankings gave blended programs a comparative advantage over online-only programs aside from
a single minor indicator about career centers used to compute the online master's in business Services and Technology ranking. But the surveys still asked many questions about the extent to which online programs were integrated with on-campus programs, so prospective students can find the types of programs that are right for them by sorting among the characteristics of each school.

How much are the organizations listed paying for leads generated by your site?

We are a private company and do not release that type of data, financial or web statistics.

What are the web analytics on the site, e.g., visits, unique visits, page visits, etc.?

We are a private company and do not release that type of data, financial or web statistics.

I ran across this quote from USNWR's marketing materials:
o "And because US News & World Report is the leading ranking resource for anyone seeking an on-campus or online degree programs, you know you're getting the best advice and information available for your on-campus or online education."

In what sense do the rankings they've come up with represent "the best advice and information available for your... online education"?

The rankings stand by themselves. US News published a very detailed methodology that explained how programs were ranked. That methodology described each variable and the weights used. Prospective students use all kinds of information to make decisions, and the rankings will be one way to get comparative information about online programs that didn't exist before, in our view. There are detailed profiles of programs that are part of the US News online education site. US News hopes that the rankings will be used as only one tool and not the only basis for deciding which online programs to select.

How do the results of this set of rankings support their tacit assumption that rankings lead to the best available advice and information?
US News is the first to say that rankings should not be used as the sole basis to choose a program and that they should be used as one tool in the
college/program search process. We stress the need to look at many factors in addition to rankings, like cost, program offerings, etc.

Implementation Questions

What specific steps does USNWR plan to take to increase the number of participants and ensure accuracy and completeness of survey responses?

We are pleased with our response rate and expect to get a higher level of participation next year. Once schools see this year's online education rankings and the searchable individual profiles for each program, we think they may want to be part of it in the future if they can. We believe we got a very large percentage of schools with online degree programs to participate. Of course, some questions will be refined; that's just part of the process. We learned a lot doing this for the first time, and some of that will shape next year's questions. We also want to get more outcomes data, but that will take a while. Students will have to complete entire online degree programs, graduate, and enter the workforce. But that's data we definitely want to get in the future.

Given that USNWR has selected a set of criteria, how wedded will USNWR be to maintaining those criteria in future years to enable cross-year comparisons?

US News is not wedded to maintaining the same methodology in order for schools or the public to do benchmarking.

o Alternatively, how open is USNWR to changing those criteria to provide a more complete picture of each category?

US News has said with all academic rankings that if we are able to collect better data that enables us to improve the methodology for our users, we will make those changes. We have said that since this is the first online degree program rankings, we expect that the methodology will evolve over time. It's not frozen or set. US News would like to have a systematic way, like an advisory group of online programs and experts in the field, to meet with us on a regular basis in order to improve what we have done.
The public good to be provided in ranking education institutions, as in any ranking scheme, would be to present to stakeholders (parents, students, legislators, etc.) outcomes from an evaluation tool that can reasonably differentiate between programs based on reliable, accurate, and shared measures of quality that together sufficiently address the learning experience. Surely, this was the goal of the US News rankings? If so, such measures should be pre-determined and, in this case, might reasonably be
based on research, a detailed survey of stakeholders in the learning experience, or a serious outreach and data-collection effort aimed at the community of practice involved. Ideally, the effort to determine what ought to be measured should include all three methods. And the questions themselves should be provided to the surveyed group with enough explanatory detail that there is reasonable assurance they are understood in the same way by respondents.

o Which methods listed above did US News use to determine what questions to ask in the survey, and why do you think the resulting set of questions is necessary and sufficient to differentiate quality between programs in meaningful ways?

In addition to examining literature reviews and academic research, US News contacted a large number of academics who run online degree programs in the areas we surveyed, both to do research on online education specific to each degree area and to get ideas on which questions to ask. We used those many interviews as the main basis for determining many of the questions that were asked.

o What did you do to ensure the questions asked could be understood or interpreted in a consistent manner by respondents?

We did a limited amount of pre-testing of the survey by sending it out to persons in each online degree program area we planned to survey to get feedback on the clarity of the survey wording, etc. But a question interpreted the same way by a handful of pilot testers will not always be interpreted the same way across hundreds of respondents.

o Can you explain how your scoring system, based on the pre-determined questions (or rubric) of quality measures, maintains its integrity as a scoring tool if you eliminate some questions because you couldn't collect the data (for whatever reason)?

We said clearly at the beginning that we were doing research from the ground up and that we had not determined the methodology in advance. The questions were designed with evaluative possibilities in mind.
But as US News's published methodologies state, the 23 indicator rankings, their components, and the weights attributed to each component were determined based on the quantity and quality of data collected, and finalized only after survey administration was finished.
o How would you recommend stakeholders use the information you present here?

Assuming that "stakeholders" means the schools/programs: US News is not going to prescribe how the data can be used. One possibility is that programs can get a detailed look at specific programs by looking at their web profiles on usnews.com and, if they choose, can see how they stack up on the indicators US News has used in our ranking methodology. So, simply put: peer-to-peer comparisons.

How does the ranking or rating you have provided help them do this in reliable and effective ways?

Based on our research, this is the first time so much comparative information about the online degree areas at the program level has ever been available. US News believes that the information is reliable since it was collected on a uniform basis and each program filled out the questions using the same definitions. All in all, the information provides programs the most in-depth look ever at other programs.

The survey as it was implemented in this iteration contained no options for community colleges to participate. Will USN be adding options for including this part of the online education community?

US News is studying adding other parts of the online community for the next survey. At this point in time we don't have immediate plans for adding two-year colleges. We would be interested in finding out about the scope of online degree offerings at two-year colleges.

In subsequent rankings, will USN provide a glossary of terms to ensure data is reported consistently across campuses?

Many of the questions were defined on the survey instruments. However, we will consider doing that for the next data collection. US News would like to work more directly with online educators on which questions need definitions.

Did USN gather input from any of the learner audiences (i.e., online or adult learners) in regard to the survey questions?
US News did not seek input from learner audiences regarding the specific survey questions or the ranking methodologies.
Will there be an opportunity before survey re-deployment for the questions to be re-vetted?

Yes. US News is open to having the survey questions re-vetted. However, we would like to do it in an organized way.

Could members of WCET assist with the vetting process?

Yes. US News would like to do that in a systematic way, such as an advisory group of WCET members meeting with us on a regular basis in order to improve what we have done and advise us. WCET members are the experts, and US News would very much like to work with WCET in an organized way.

Given that there was no way to provide any question interpretations, some survey questions were left blank because we didn't have the specific information in the specific way it was asked for in the survey. It was not apparent to us how specific questions are weighted, so we could not understand the consequences of a non-response. Can you provide additional insights about issues of non-response and how they impact an institution's ranking?

Our answer depends upon the ranking. To be included in an Admissions Selectivity ranking or a Faculty Credentials and Training ranking, schools needed to have answered all statistical questions. For example, all programs ranked in Admissions Selectivity for Master's in Engineering provided data on average undergraduate GPAs of new entrants, average GREs of new entrants, and admissions data (number of applicants and number of offers of admission) used to calculate acceptance rates. Schools providing all these data were ranked; schools not providing all these data were not ranked (they do not appear anywhere on US News's admissions selectivity ranking table). No school was given a lower ranking for not providing statistical data for either of these indicator rankings.
In contrast, the Engagement indicator rankings and Student Services and Technology indicator rankings consisted predominantly of non-statistical questions, such as whether a business program was AACSB-accredited or whether it offered career placement assistance. We believed these profile questions could be answered by all respondents, so those who left them blank were scored zero points on them.

Can the turnaround time for institution participation be extended in the future? Since the questions were vastly different from previous surveys, the institutional staff time involved was tremendous and crossed multiple units of our institution. Many of the questions required a lot of research for us to complete, and we really needed additional time.
Sorry to hear that. Indeed, this was the first year online education degree program surveys were administered to schools, and understandably many were not accustomed to collecting and reporting these data at the program level, which increased the time burden. That said, the initial amount of time schools received to complete the online education surveys was comparable to what US News asks for its longstanding graduate and undergraduate surveys, while the number of questions was smaller. Schools requesting extensions were also repeatedly granted them. In fact, master's degree programs ended up having three full months between when they were first sent the surveys and when the absolute final deadline was implemented. Bachelor's degree programs had 3.5 months. Allowing more time would not have been feasible for an annual ranking. But out of respect for survey respondents' time, the 2012 survey will trim and eliminate some problematic questions from the 2011 survey. Furthermore, the upcoming 2012 online degree program surveys will once again have at least two months in the field, and probably more.

Survey Questions

How open is USNWR to sharing the sub-criteria behind those criteria in cases where they are not explicit?

For each discipline, US News published detailed methodologies on its website outlining all the indicators used to compute each ranking. These also include descriptions, on an indicator-by-indicator level, of how scoring was done. Here is a sample for Master's in Education: News.com/education/onlineeducation/articles/2012/01/09/methodology-online-masters-of-education-degreerankings?page=3

o For example, what does USNWR deem as indicators of quality relative to its "maximum class size" criterion -- is it that the program states a maximum class size? Is there a particular number which is deemed "best"? If so, what are the gradations above and below that target figure?
As the published methodologies explain: a school's score [for the maximum class size indicator] equals the lowest maximum class size among all [ranked] schools divided by the maximum class size of [the school being scored], multiplied by the weight [assigned to
the indicator]. For example, if the smallest maximum class size reported by a school in the engagement and accreditation ranking was 5 students, and a school reported a maximum class size of 12 students, then its score for that indicator would be 5/12 times the number of points out of 100 that question was worth.

As many totally online colleges serve adult students, have no first-time students, and rely primarily on adjunct faculty, will USN include factors for accurately evaluating the success of these institutions in future rankings?
o How will USN develop these new measures?

None of the ranking indicators had a bias against bachelor's degree programs that served adult learners or did not have first-time students. This is because only the admissions selectivity rankings incorporated student-level data, and these rankings were produced only for master's-level programs. But to evaluate the success of institutions, all of the online education degree surveys included questions on outcome measures like retention, graduation, indebtedness, and career outcomes. US News may modify some of these questions for 2012 based on research and input provided by schools and, possibly, organized input from organizations such as WCET. The goal is to better incorporate these metrics into its rankings as soon as the quality and quantity of data provided by schools allow.

One heavily weighted question in the student engagement category measures whether or not group work is required. Can you provide research that supports mandatory group work as a valid measure of student engagement?

Yes. Student-to-student collaboration was one of the 70 quality indicators listed by 43 administrators of online education programs as part of a six-round Delphi Method study in 2010 published by Kaye Shelton of Dallas Baptist University. Additionally, student interaction with other faculty and students was one of 24 quality indicators outlined in a 2000 Institute for Higher Education Policy study.
Anecdotally, US News interviewed multiple online degree program administrators who recommended student collaboration as an evaluative metric that demonstrates engagement.

In the survey, USN asked colleges to "guesstimate" or give a "composite answer" broadly about all degree programs inside one college and did not look at individual degree programs.
There is often very little congruency across online degree programs within one academic division, let alone one entire college. Do you plan to correct that?

Little has been planned for 2012 so far. But US News understands there often is incongruence among programs and that this presents dilemmas for schools not wanting to report inconsistent information. Unfortunately, allowing schools to report data on more specialized levels would require specialized survey instruments with specialized questions. This would make computing rankings using standardized criteria more challenging. However, if the scope of the online education degree rankings eventually expands (as we hope), allowing schools to report on more individualized program levels may be a focus.

In subsequent rankings, will you include metrics on the accessibility of the programs to students with disabilities?

A couple of such questions already exist. Each of the Engagement indicator rankings scores the extent of programs' compliance with the Americans with Disabilities Act. Additionally, each of the Services and Technology indicator rankings scores whether programs offer software-based readers (which vocalize print on computer screens, aiding the visually impaired).

Would USN consider changing the questions regarding faculty training, for instance looking at the percentage of participation in and the quality of the faculty training? This is especially important for institutions where large percentages of online courses are taught by tenured faculty, thus making it harder to mandate the training.

Those are good ideas for new questions, although measuring the quality of faculty training is no simple task. Regardless, the Faculty Training and Credentials indicator ranking is not biased against tenured faculty. In fact, the rankings for each discipline placed slightly greater weight on a traditional credential (the percentage of instructional faculty with PhDs) than on the sums of the online training questions.
The highest-ranking programs were those that managed both to employ highly credentialed faculty and to train them to teach online.

Ultimately, quality can be measured by what students know, do, and value. Does USN plan to adapt or augment the surveys with more questions to address learner outcomes, including demonstrated outcomes that create career readiness?
US News is very much open to modifying the survey to include learner outcomes and demonstrated outcomes. Unfortunately, standardized information on either is difficult for schools to report, or simply does not exist. Programs often do not assess what their students learned in standardized ways that can be compared to other programs. Some do not track their students after graduation at all. We hope these data will become more standardized in later years.

The Retention and Graduation section of the Best Online Bachelor's Degree survey included questions that seemed restrictive with regard to retention/graduation of part-time adult learners. The survey seemed to favor a first-year, traditional-freshman model where learners complete their degree requirements in 4 years or less. Is it possible to work with WCET and its member institutions to find alternative ways of exploring retention and graduation given the diverse learner populations we serve?

Yes, input would be greatly appreciated. US News is strongly considering modifying these questions for the 2012 survey. Possible changes may include lengthening the time periods students have to graduate. The bachelor's-level surveys will likely also evaluate graduation rates relative to the number of course credits initially possessed by new entrants.

USNWR's rankings criteria seem skewed toward synchronous online delivery rather than asynchronous -- specifically, live tutoring, live streaming video, etc. Is this so, and if so, why? If not, what does USNWR plan to do to counteract the perception that its ranking system holds this bias toward synchronous delivery, which may be misleading to its audience?

Hopefully, public perception of the rankings will be rooted in an understanding of the methodologies. There is no favoritism in the rankings toward synchronous over asynchronous classroom instruction.
The Services and Technology indicator ranking scores whether live audio and live video are accessible to students, but it equally weights whether recorded audio and recorded video are accessible to students, demonstrating no preference for one delivery method over the other. The programs that scored best were those with technologies that provide students the most flexibility in how they receive courses. Regarding services external to course delivery, such as live tutoring or live technical support, no doubt students would prefer speaking to a person in real time for those things.
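To make the scoring rules described in these answers concrete, here is a minimal sketch of the two mechanisms US News outlines above: ratio-based scoring of "smaller is better" statistical indicators (the maximum class size example), and zero-scoring of blank profile questions. The function names and the point weights used below are our own illustrative assumptions, not US News's actual code or weights.

```python
def ratio_indicator_score(value, best_value, weight):
    """Score a 'smaller is better' statistical indicator such as
    maximum class size: the school reporting the smallest value
    earns the full weight; others earn a proportional fraction."""
    return best_value / value * weight

def profile_indicator_score(answered_yes, weight):
    """Score a yes/no profile question (e.g., whether career
    placement assistance is offered). Per the answers above,
    a blank (None) scores zero, the same as a 'no'."""
    return weight if answered_yes else 0.0

# The methodology's own example: the smallest maximum class size
# among ranked schools is 5, and this school reported 12, so it
# earns 5/12 of whatever weight the indicator carries.
class_size_points = ratio_indicator_score(12, 5, 4.0)  # 4-point weight is hypothetical

# A blank (unanswered) profile question earns nothing.
blank_points = profile_indicator_score(None, 2.0)
```

The ratio form means a school is always scored relative to the best reported value among ranked schools, so scores shift whenever the pool of respondents changes, which is one reason year-over-year comparisons are tricky.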
Comments Received Regarding the US News Survey

Please feel free to write a response to any of these. We will be including them in the document we share through our blog.

Of the 110 questions, 17 were completely inapplicable. And there were probably another dozen or more that were worded in such a way that I had grave concerns about answering them. I did not want to misrepresent my institution. I will try to provide examples.

o First, we have three online degree completion programs: 1) the RN-BSN, 2) the Bachelor of Liberal Studies (BLS), and 3) the Bachelor of Applied Studies (BAS). I essentially answered the survey for the BLS and BAS, which are administered through the Division of Continuing Education. The College of Nursing administers the RN-BSN, and the differences between their programs and ours are significant enough that I did not think it reasonable to address them as a group.

Understood. Unfortunately, US News cannot design a separate survey or create separate survey questions for each distinct online bachelor's degree program administered at institutions.

o Also, there was no place to note that all students at the institution, on-campus or distance, meet the same admission and graduation criteria and go through the same office. The only difference is that our distance education programs are geared to students who have completed 60 hours of credit anywhere, though presumably at a community college.

The online bachelor's degree survey contained a series of questions asking whether the program was linked to a face-to-face program, and whether each had identical standards on admissions, course credits, course delivery, curriculum, and faculty. Maybe we can make those questions more specific next time.

o Q-15: I understand the value of asking about residency. However, the institution has a residency requirement for all students, including BAS and BLS.
On-campus students can meet residency by taking 90 out of 120 hours on campus, 45 of their last 60 on campus, or their final 30. Because these are distance learners, the 30-hour residency rule is the one that applies, and it is modified so that distance learners must take 30 hours through the institution, not necessarily physically on campus. That was impossible to explain, given that our distance degree learners all come in with 60 hours as a requirement of admission. Also, in answering simply "yes, we have residency requirements," it
suggests an on-campus component. If I answered no, it would appear as if we had lax standards.

A response option for this survey question was "Some of our OBD-granting programs have residency requirements and some do not." This response would have most closely reflected the circumstantial nature of in-person attendance requirements at the institution. However, US News will strongly consider modifying some survey questions to account for the high prevalence of transfer students in online degree programs.

o Q-18: None of the answers are true for us. "Yes" is misleading. "No" is somewhat true, insofar as the on-campus majors do operate separately from our degree programs. "Not applicable" is not true either, though, since we do offer courses that are concurrently on campus and streamed to distance learners. So again, I have an answer but could not provide it within the confines of the question. Not responding makes us look weak; responding within the confines is misleading.

Without more specifics about the nature of the program, it is difficult to understand how these responses were insufficient. The thought behind designing the response options was that online students either typically share classes with campus-based students or they do not.

o Q-21: Makes no sense, unless I am misunderstanding it. A student in the BAS or BLS program could certainly decide to move to campus, and the credits earned would transfer; they are considered institution courses. But our online and face-to-face programs are not "integrated," whatever is meant by that.

The yes/no question referenced is as follows: "Can the credits earned for courses in the online bachelor's degree programs be transferred toward the equivalent degree programs offered in the face-to-face setting? Select yes if online and face-to-face bachelor's students are integrated." The confusion involves the second part.
Just because all programs with integrated online and face-to-face bachelor's students are instructed to select yes does not mean that all programs that select yes must have integrated online and face-to-face bachelor's students. Perhaps the wording needs to be modified for 2012.
o Q-23: Again, makes little sense. Every course we offer online is a fully vetted institution course. The curriculum is not the same, because we do not offer majors online. The degrees offered are constructed as areas of study.

US News will strongly consider modifying some survey questions to account for the high prevalence of transfer students in online degree programs.

o Q-24: We don't offer majors, but by not responding it appears we just offer courses and not degrees. There was no opportunity to explain how the degrees are built, and they are, indeed, rigorous degrees.

o Q: Geared toward first-year admission. Our students must meet institutional admission criteria, but after 60 hours of college coursework, most of the requirements for high school students are inapplicable (class rank, ACT, etc.).

Some online bachelor's degree programs require applicants to submit high-school-level information. A majority do not, which is why this information was not used in the rankings. But no doubt prospective students benefit from being able to learn whether an individual program requires SAT scores and, if so, what those scores typically are.

o Q-109: This is an example of what I consider a prejudicial question. Of course I hope our graduates are employed and employable, and they are. But are institutions required to ask this question of their on-campus graduates?

As many have pointed out, the rankings would be improved if they included outcome-level data on employment and salaries. Most programs do not collect this information, which is why it was not used for the rankings.

o I think surveys can be valuable. And some of these questions got me thinking about how to collect this kind of data for our own purposes. However, I have studied and conducted survey research, and many of the questions here are poorly designed and simply do not address the many ways students study online and the many options for online degrees.
US News' "honor roll" of online colleges misses the point and lacks understanding of adult learners. In more than a decade of experience, I've learned that students choose online programs that serve their needs, not institutions. However, US News bundles online programs on an institutional level, the more the better. Therefore, an institution
with one or two exceptional, nationally respected online programs will be ignored. One does not have to question the validity of US News' honor roll because its weakness is all too apparent.

Incorrect. Data for these surveys were solicited at the program level, not at the institutional level. US News's website clearly labels each honor roll as being at the degree-program level, not the institution level.

US News is out of touch in other ways as well. I'll offer one example. One question asks which anti-plagiarism service the university uses. The question presupposes that students will write papers. Many online programs require authentic assessment in the form of real-life projects in place of papers, but the question provides no option for twenty-first-century practice.

Anti-plagiarism services apply to papers, exams, or any other assignment in which students submit writing for a grade. Writing skills may play a role in all career areas. Consequently, US News believes it is fair to assume, as a very small ranking criterion, that quality academic programs administer at least some writing component at least once across their entire courses of study. Also, while the use of anti-plagiarism devices was part of the Engagement ranking, the type of anti-plagiarism service the institution uses was collected for profile information only.

Riposte: Online Rankings & One Question

Their arguments boil down to the following: the rankings received too little participation from schools to be useful; not enough emphasis was placed in the rankings on student-level outcomes; and US News did not properly engage online education experts in developing the questionnaires. These arguments were in most cases exaggerated or misinformed.

Well, no. The most problematic argument WCET makes is that we need to learn to live with rankings and the purported consumer need for them, however dubious their efficacy and utility. I suppose.
Certainly the success of publications like US News and World Report confirms the usefulness of publishing those rankings, at least for US News and its shareholders. It doesn't confirm they are really used, however (see Zemsky's* research, for instance, which finds that rankings are NOT used, even among selective schools, in which "information about academic performance is rarely decisive"). And the
16 successful sales of US News and World Report s rankings certainly don t confirm that rankings should be used. That success does suggest, though, that there is a pressing need to do more to educate the American population in a variety of literacies, most notably those useful for understanding the silliness of ranking complex organizations in ways similar to the ways one might rank the reliability of toasters or determining this year s sexiest man alive. That aside, as I reviewed the survey and the issue, and after one gets past the packaging, there is potentially great value in US News and World Reports initiative. As a service, a thorough analysis of the great variety of programs could be very useful. Imagine if the issue focused on creating a taxonomy of learning opportunities rather trying to cram a host of insipid assumptions about quality into a hierarchy. It is (or would be) useful to see an easily accessible presentation of the various student demographics and the different kinds of schools that are evolving to serve such great and growing diversity. It is or would be useful to do more, as WCET suggests, to understand the various selection criteria that help institutions identify their students and the associated configurations of technologies and instructional strategies they use to help their students. Such a service might be very useful contribution for helping prospective students as well as the institutions who serve them. But for the most part it is the insipidity articulated well by many others in WCET and elsewhere that is most prominent in this round of ratings. More precisely, it is, as WCET s critique suggests, the underlying assumptions about quality indicators that are so unfortunate. Elsewhere in the issue we see things like response time to identified as a quality indicator. 
Certainly that is one kind of expectation, common in practice and in the underlying assumption that good pedagogy depends upon a dyadic relationship between teacher and student. But there are other ways to effectively promote productive interaction. How student community and collaboration are similarly valued, and how they are combined with response times into a single linear measure of quality, does confound things a bit. Similarly, the emphasis on quantity of software packages used, technologies used, and full-time faculty employed, or the requirement of attending training, conflates quantity with quality. Maybe the biggest problem is the idea that it is the institution, not the program or discipline, that matters. And the irony is, the issue does begin to make useful descriptive program-level distinctions.
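The critique of a "linear measure of quality" can be made concrete. Below is a minimal, purely illustrative sketch of how heterogeneous indicators (response times, faculty counts, and so on) might be normalized and folded into one weighted score. The indicator names, weights, and min-max normalization are all invented for illustration; this is not US News's published method.

```python
def min_max_normalize(values):
    """Scale a list of raw indicator values to the 0-1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def linear_quality_score(programs, weights):
    """Combine normalized indicators into one weighted score per program.

    programs: dict mapping program name -> {indicator: raw value}
    weights:  dict mapping indicator -> weight (assumed to sum to 1.0)
    """
    names = list(programs)
    scores = {name: 0.0 for name in names}
    for indicator, weight in weights.items():
        raw = [programs[name][indicator] for name in names]
        for name, norm in zip(names, min_max_normalize(raw)):
            scores[name] += weight * norm
    return scores

# Hypothetical inputs: two programs, two indicators, arbitrary weights.
programs = {
    "Program A": {"response_speed": 9, "pct_full_time_faculty": 40},
    "Program B": {"response_speed": 5, "pct_full_time_faculty": 90},
}
weights = {"response_speed": 0.4, "pct_full_time_faculty": 0.6}
scores = linear_quality_score(programs, weights)
```

Even this toy version shows the problem the riposte raises: the choice of weights, not the underlying data, determines which program "wins."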
Similarly, WCET rightly questions the conflation of professional accreditation with engagement. One concern is that professional accreditation is essentially purchased by the program. Though it is not broadly known, it is for this reason that regional accreditors sometimes have problematic relationships with professional accreditation. It is not clear, in other words, that professional accreditation is always the most useful measure of quality.

All that aside, it is troubling that in practice, most of us who grouse loudest about rankings are the first to post good scores on our websites, boast about them to our donors, and polish them in hopes of attracting prospective students. Complicating all this is maybe the most disturbing aspect of the online rankings: the fact that they are largely self-reported. If you ask me, depending upon the stakes, I'm either the sexiest man alive or a very reliable toaster.

So my question: how do we expand collaboration with the publishers of US News and World Report in ways that really do serve the public, students, and the institutions that strive to serve them?

*In 2005 Robert Zemsky reported that the vast majority of students in the U.S. exercise little choice about where they attend because the primary drivers of this decision are institution location and price. Online programs notably eliminate the location issue, but, again as Zemsky reports, "Even where competitive markets do exist among selective institutions in traditional higher education, information about academic performance is rarely decisive" (2005, in Ewell, 2010).

U.S. News definitely appreciates your observation that the rankings are controversial and that there is a big debate about their impact on students and institutions. One key point: US News would very much like to expand that collaboration in a systematic way, for example through an advisory group of WCET members that meets with us on a regular basis to improve what we have done and to advise us.
WCET members are the experts and US News would very much like to work with WCET in an organized way. US News hopes that this Q&A is the first step to setting up such an advisory group of experts in online education.
Given that each of the schools surveyed has a diverse student population and characteristics, it would be helpful if some of the questions allowed for further explanation or clarification. This could be in the form of an open text box where schools can provide additional information about their data. As an example, the survey question about live technical support services to students was difficult for us to answer. Often schools may not offer 24/7 live technical support, but do accept emails from students at all times of day and respond quickly to address technical issues. Unfortunately there is no way to convey this given the construction of the survey question, which could be misleading to students and the public.

Accepting emails from students at all times of day is not 24/7 live technical support. It still might be very good technical support, but categorizing schools by hard distinctions is necessary for computing rankings. However, for information purposes each school's US News profile page lists whether any technical support at all was reported as being offered, in addition to 24/7 live technical support.

As a non-traditional institution, our students tend to be older students who are seeking undergraduate or graduate degrees. Many of our students come from the military, and only a very small percentage of them enroll directly after high school. Many of the questions focus on traditional students; that is, they ask about admissions criteria such as rigor of secondary school record, class rank, high school GPA, and standardized test scores. For schools that offer open admissions and cater to an older student population, these factors are not particularly relevant.

High school level information was not used in the online bachelor's degree rankings due to the degree completion nature of most programs.
However, some online bachelor's degree programs do mandate high school information from at least some applicants, which is why this information was solicited and published on their school profile pages. At the graduate level, one proxy of a quality online master's degree program is that its admissions standards resemble those one would expect of a reputable bricks-and-mortar program. Older populations (on average, in their early 30s according to survey responses) are capable of taking GREs and submitting college transcripts.

Retention data was requested for all students who matriculated during the four most recent 12-month time frames.

For schools that have open admissions and no application fee, there are many students who attempt one or two courses and then leave the university. Unlike at some traditional universities, the cost to enroll and attempt courses is
low. Given our population of students, who are typically older working adults, many of whom are serving in the military, a large share of our students fit the higher education definition of student "swirlers." These are students who try out a number of institutions throughout their course of study toward a higher education degree. They are sometimes enrolled in two or more institutions, either simultaneously or consecutively. Given that these students are not limited geographically as to where they can enroll, they often shop around for what is appropriate for them at the moment. Taking into account the unique characteristics of both traditional and nontraditional schools in this respect will better inform students and the public.

Yes, input would be greatly appreciated. US News is strongly considering modifying these questions for the 2012 survey. Possible changes may include, but are not guaranteed to include, lengthening the time periods students have to graduate. The bachelor's level surveys will likely also evaluate graduation rates relative to the number of course credits initially possessed by new entrants.

The rankings are an important step toward providing increased transparency and accountability for schools offering online education. By focusing more on learning outcomes rather than on the qualifications of entering students, US News could provide more relevant information to students. We are encouraged that the surveys captured a great deal of information about how exams are administered and what techniques are employed to ensure academic integrity. We are also encouraged by the inquiries about the types of assessment instruments that schools use to measure student learning. To follow up on this work, we would like to see a greater focus on student learning and career outcomes in future iterations of the survey and methodology.

US News is very much open to modifying the survey to include learner outcomes and demonstrated outcomes.
Unfortunately, getting schools to report standardized information on either is difficult; in many cases the data simply do not exist. Programs often do not assess what their students learned in standardized ways that can be compared with other programs. Some do not track their students after graduation at all. We hope this data will become more standardized in later years.

GENERAL CONCERN: The survey did not provide thorough definitions of the data elements. The IR staff completing the survey need a glossary of terms, similar to what IPEDS provides, to
ensure that data is reported consistently across campuses (e.g., the OBD student definition needs to be clarified). Below is a specific example of clarification we requested from US News. Their answers didn't really help us complete the survey. They (US News) need to define the items so that universities with our student populations can accurately answer the questions and not have to leave them blank or report estimated numbers without footnotes!

Institution questions: Per our discussion, we are planning to use the following criteria as guidelines for our data collection. This institution has 8 bachelor's degree programs that qualify as OBD. Please note that these programs include on-campus and online students, and when students select a program they do not select online or face-to-face.

a. Enrollment calculations: OBD students will be defined as students in one of the 8 OBD programs who are taking at least one course online. To select cohorts for retention and graduation calculations, we will use the fall first-time baccalaureate students who are taking at least one course online during their first term.

b. Degrees awarded: We cannot distinguish between OBD graduates and face-to-face graduates. (For example, if we have 154 students in our BBA Marketing program, which is one of the 8 OBDs, we cannot distinguish who would qualify as an OBD student. Many of our on-campus students take a few online courses.) For Question 24, should we just list the 5 highest percentages of graduates in the 8 OBDs even though they include face-to-face students?

c. For employment statistics, we would also need to use all graduates from all OBD programs, some of whom may never have taken an online course.

US News response to the institution at the time of the survey: advising them that A (above) is correct and that they should just leave B and C blank if they can't distinguish students.

Additional response from Bob Morse: In terms of B and C, tell the school we aren't going to footnote their data.
If they report it as OBD, we are going to say it's OBD. We aren't going to be able to have disclaimers on questions. So, if they are comfortable saying in B that the blend represents OBD, then answer. If in C they are comfortable that the blend = OBD, then it's OK.
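Guideline (a) above amounts to a simple cohort filter, and writing it out shows why a shared glossary matters: every institution implementing it must make the same choices. The sketch below is a hypothetical illustration only; the student-record layout (program name, first-time status, first-term course list) and the OBD program names are invented, not the institution's actual systems.

```python
# Hypothetical subset of the institution's 8 OBD programs.
OBD_PROGRAMS = {"BBA Marketing", "BS Accounting"}

def select_obd_cohort(students):
    """Return the IDs of students who qualify for the retention/graduation
    cohort under guideline (a): fall first-time baccalaureate students in an
    OBD program who took at least one online course in their first term."""
    cohort = []
    for s in students:
        in_obd_program = s["program"] in OBD_PROGRAMS
        online_first_term = any(c["online"] for c in s["first_term_courses"])
        if s["first_time_baccalaureate"] and in_obd_program and online_first_term:
            cohort.append(s["id"])
    return cohort

# Invented sample records: only student 1 meets all three conditions.
students = [
    {"id": 1, "program": "BBA Marketing", "first_time_baccalaureate": True,
     "first_term_courses": [{"online": True}, {"online": False}]},
    {"id": 2, "program": "BBA Marketing", "first_time_baccalaureate": True,
     "first_term_courses": [{"online": False}]},
    {"id": 3, "program": "BS Biology", "first_time_baccalaureate": True,
     "first_term_courses": [{"online": True}]},
]
cohort = select_obd_cohort(students)
```

Note how student 2 (same program, no online course) and student 3 (online course, non-OBD program) fall out of the cohort; questions B and C are hard precisely because degree and employment records carry no equivalent of the first-term online flag.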