GLOBAL OPINION SURVEY NEW OUTLOOKS ON INSTITUTIONAL PROFILES




NEW OUTLOOKS ON INSTITUTIONAL PROFILES
FEBRUARY 2010
DR JONATHAN ADAMS, KATHY BAKER

Copyright 2010 Thomson Reuters

THE HISTORY BEHIND UNIVERSITIES, LEAGUE TABLES AND THE BRAND
BY PROFESSOR DAVID N SMITH, DIRECTOR, CENTRE FOR RESEARCH ON LIFELONG LEARNING, GLASGOW CALEDONIAN UNIVERSITY

Although we associate ranked tables with ideas about a global higher education industry, the idea of universities comparing themselves has a much longer history. According to Ruth Pagell, University Librarian at Singapore Management University, governments and scholars have been publishing quality or research rankings for over 100 years.[1] The medieval university was no stranger to competition and hierarchy: the Oxford scholars who founded the University of Northampton in 1261 did such a good promotional job that the bishops persuaded Henry III to suppress the institution in 1265. By the 19th century there was much to count, classify and rank amidst the explosion of interest in education and qualification. In the 20th century universities were still seen as collections of individual organizations, but interest in a higher education system, linked to policy for science and society, was increasing. As infrastructure expanded on the back of increased public funding, the scene was set for an emerging battle.

For states and their governments, knowing what goes on inside the university and what comes out of it has been a fundamental driver of the desire to classify.[2] The league table thus becomes an instrument of governance as well as a measure of comparison, and it is these multi-faceted uses that make tables such an object of desire, or of revulsion. On one side, among the faculty and the disciplines, is a desire to set goals and drive learning; on the other, among the state and the funders, is a desire to steer universities and drive impact. The competing forces of autonomy and accountability drive the rise of contemporary university league tables while, paradoxically, the internationalization of higher education and the need to secure competitive advantage through global positioning unite these interests.

National league tables, largely about student choice within a consumer market, form one part of the landscape. The Berlin Principles of UNESCO-CEPES and IREG reflect a movement to bring international standards to bear on national teaching and student learning outcomes. International league tables, by contrast, are driven largely by the desire to measure research. Despite being restricted to a relatively small number of global players, the desire to compete is intense and potentially so significant that it gains national policy attention.[3]

The brand of the institution unites academic, national and international interest. This is based not on the strength of the various disciplines but on values associated with the institution as a whole. Some cry foul at the perverse consequences of league positions, yet the allure of jostling for position is a potent force, and promoting the brand is served by the very notion of a public league table position.

There is a global twist to all this. The European, and even the American, age that gave birth to mass higher education and modern conceptions of academic research and innovation is, if not over, at least challenged. If the 20th was the American century, does 2010 signal the start of the Asia-Pacific century? Opinion leaders like Rick Levin, President of Yale University, suggest we take this seriously.[4] The changing numbers in university league tables reflect this shift and remind us of the continuing global reach of the university as a key knowledge institution.

OPINION SURVEY

Thomson Reuters is in little doubt that league tables matter when it comes to ranking universities and colleges. If well developed, they can be informative to students and their mentors,[5] and they matter hugely to those who run universities.[6] But league tables can hide as much as they show, because universities are complex organizations that span cultural boundaries and support multiple missions. No single indicator can capture that.

Thomson Reuters is committed to providing high-quality, well-informed analyses that help people make decisions. We do not produce league tables. Other companies use our data to create rankings, and we want to enable them to get the best data for their publications. We aim to gather data and create products and services that go the next step and reveal the more complex management information behind those tables. To create a database useful to purpose, we need to compile the most relevant indicators of organizational competence and quality. We understand and acknowledge the Berlin Principles formulated by UNESCO-CEPES and IREG.[7] We also need to know which indicators people consider important and what they think is wrong with previous methods.

Thomson Reuters worked with Ipsos Insight Corporation, a global market research firm, to develop and implement an online opinion survey to find out what the global academic community thinks about these issues. This report contains the results of that survey, conducted during year-end 2009. The sample includes 350 respondents, making it larger and more diverse than any previous survey. The composition of that sample includes:
- Employees and students of academic institutions, and other informants who were aware of academic institution comparisons
- Responses from over 30 countries, with the majority from the United Kingdom (n=107), United States (n=90) and Australia (n=30)
- Respondents from many different roles

A high proportion of those from North America classed themselves as administration, but in that region this category includes senior managers. The most frequent category elsewhere was academic, which included staff and students. Many different kinds of institutions were represented, most having between 6,000 and 25,000 students, but a significant proportion of responding institutions in North America and Australia/New Zealand were much larger (35,000+). About one-half of the respondents' institutions were identified as having a relatively high level of research activity.

What did people think about current league-table publications? The best-known and most familiar ranking systems around the world are the Times Higher Education World University Rankings and Shanghai Jiao Tong's Academic Ranking of World Universities, which enjoys a strong reputation in Asia, Australia and Europe. The U.S. News & World Report College and University Rankings is well known and recognized most frequently in North America.

[CHART 1: Familiarity of Comparison Systems by Region. Q3: What is your level of familiarity with each of the following academic institution comparison services? Bars show % extremely/very familiar with the Times Higher Education World University Rankings, the Academic Ranking of World Universities (Jiao Tong) and the U.S. News & World Report College and University Rankings, for Total (n=350), North America (n=99), Europe (n=140), Australia/New Zealand (n=37) and Asia (n=32).]

How useful are current league and ranking tables? Respondents generally felt that the current analytic comparison systems had recognizable utility. About 40% globally said they were extremely/very useful, and a further 45% said they were somewhat useful. The strongest positive response came from students. Respondent feedback showed that there is an opportunity to increase the usefulness of the published ranking tables in North America and Europe to match what was seen as having greater utility in Asia and Australia.

[CHART 2: Usefulness of Academic Comparisons. Q1: How useful do you believe the analytic comparisons between academic institutions are in general? Response options ranged from extremely useful to not at all useful. Base: all respondents (n=350).]

However, the overriding feeling was that a need existed to use more information, not only on research but also on broader institutional characteristics. The data indicators and methodology currently utilized were perceived unfavorably by many, and there was widespread concern about data quality in North America and Europe. In particular, many felt that data enquiries were too often directed to the wrong offices within institutions.

TABLE 1: Primary Weaknesses (Open-End Results). Q5: What do you view as the primary weaknesses of the current academic institution comparison systems? Please be as specific as possible. (% of respondents, n=350)

Data metrics/methodology issues (net): 36%
- Improper methodology: 10%
- Some factors are inappropriately weighted: 10%
- Data metrics not consistent: 9%
- Doesn't consider all/most of the parameters: 9%
- Data metrics used not appropriate/the best: 7%
Complexity (net): 19%
- Difficult to compare: 17%
Data quality issues (net): 17%
- Improper/lack of information: 11%
- Unreliable: 4%
Too biased: 13%
Too subjective/based on opinions: 12%
Not result oriented: 7%
Only quantitative based, does not include qualitative: 6%
Considers other factors than data: 5%
Don't know/not sure: 10%
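The weighting complaint recorded in Table 1 can be made concrete: most published rankings reduce an institution to a weighted sum of normalized indicators, so the choice of weights alone can reorder institutions. A minimal sketch of that mechanism follows; the indicator names and weights are invented for illustration and are not the methodology of any actual ranking.

```python
# Illustrative sketch only: a generic composite score of the kind respondents
# criticized. Indicator names, values and weights are hypothetical.

def composite_score(indicators: dict, weights: dict) -> float:
    """Weighted average of indicators already normalized to a 0-100 scale."""
    total_weight = sum(weights.values())
    return sum(indicators[k] * weights[k] for k in weights) / total_weight

weights = {"citations": 0.4, "faculty_student_ratio": 0.3, "reputation": 0.3}
uni_a = {"citations": 82.0, "faculty_student_ratio": 61.0, "reputation": 74.0}
print(round(composite_score(uni_a, weights), 1))  # prints 73.3
```

Changing the weight on "citations" relative to "reputation" would move institutions up or down without any change in what they do, which is exactly the arbitrariness respondents objected to.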

A concern found in the survey, and echoed in discussions with representative groups, was that published ranking tables could have more insidious effects. They changed the behavior, even the strategy, of institutions, not to become more effective but to perform well against arbitrary ranking criteria. Some would even manipulate their data to move up in the rankings. This is of great concern and warns against any reliance on indicators that could be manipulated without creating a real underlying improvement.

Of particular interest was that, across all regions, the same concern was often expressed with about the same degree of frequency. This gave us some confidence in the outcomes but also highlighted a specific concern from Asia: that all the current analyses tend to favor English-speaking nations. This is an important reminder that while English remains the international language for academic discourse, its pervasiveness may obscure the changing geography of academic activity.

CHART 3: Evaluation of Academic Comparison Systems. Q6: How much do you agree or disagree with each of the following statements regarding current academic institution comparison systems? (% strongly/somewhat agree; base: all respondents, n=350)

- Some institutions manipulate their data to move up in the rankings: 74%
- Makes institutions focus on numerical comparisons rather than on educating students: 71%
- Methodologies and data uses are neither transparent nor reproducible: 70%
- Appropriate metrics are not included when compiling institutional comparisons: 68%
- Only institutions that have always been ranked highly continue to be ranked highly: 66%
- Quantitative information misleads institutional comparisons: 66%

Across all regions, four out of five respondents said that league tables favored research-oriented institutions. This could again be due to the commonality of English as a research language, the public attention given to research and the ease of acquiring research-linked data.

[CHART 4: Research vs. Teaching Bias. Q2: When thinking about research-oriented institutions versus institutions engaged primarily in teaching, do institutional comparisons tend to favor one type of institution more than another? Response options: comparisons favor research-oriented institutions the most; comparisons favor teaching-focused institutions the most; both are treated fairly. Base: all respondents (excludes don't know).]

What should be collected to inform profiles and rankings? Respondents who expressed a preference wanted the data collection from HEIs to be more reliable and better validated. Clarity and transparency from the outset about the data to be collected, the methodology to be used and the disciplines to be included were seen as important in ensuring that the results are valid and reliable. One way to ensure validation and quality, it was suggested, would be to develop a process to share data among neighboring institutions as a validation cross-check, though willingness to participate in such evaluation was not indicated.

While the majority of respondents said that league tables favored research-oriented institutions, they also indicated that research metrics and institutional characteristics were the more important measures to use in comparison systems. In particular, over 90% rated faculty output and impact as "nice to have" or "must have", the highest ratings for any indicators, clearly showing how much people value the information that comes from data on research publications and citations. Research awards (grants and contracts) are also important but need to be scaled against staff numbers to get a useful indicator such as a faculty activity ratio. Patents, on the other hand, are less valued. Finally, respondents were not as positive as expected about possible new indicators on demographics, perhaps because HEIs are seen as naturally diverse and inclusive.

TABLE 2: Essential Metrics for Comparisons: Research and Institutional Characteristics. Q7: For each of the following types of information, please identify whether the data element is "must have", "nice to have", "not essential" or "not relevant" in thinking about their use in the creation of valid institutional comparisons. (% must have/nice to have)

Research
- Faculty output: research publications: 92%
- Faculty impact: citations and major scholarly: 91%
- Research awards received: 85%
- Patents: 76%

Institutional characteristics
- Faculty/student ratios: 86%
- Faculty activity ratios (teaching income/research grants/publications per staff): 85%
- Number of faculty: 79%
- Demographics of faculty and student population (international, gender, race/ethnicity): 75%

Base: all respondents (n=350)
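The scaling idea behind the faculty activity ratio mentioned above is simple arithmetic: divide raw research totals by staff headcount so that large and small institutions become comparable. A minimal sketch under that assumption; all figures are invented for illustration.

```python
# Hypothetical sketch of size-adjusted activity indicators, as described in the
# text: raw totals divided by faculty headcount. All numbers are illustrative.

def per_faculty_ratios(publications: int, grant_income: float, faculty: int) -> dict:
    """Return per-faculty-member activity indicators."""
    if faculty <= 0:
        raise ValueError("faculty headcount must be positive")
    return {
        "publications_per_faculty": publications / faculty,
        "grant_income_per_faculty": grant_income / faculty,
    }

# A large institution and a small one with identical per-capita activity:
print(per_faculty_ratios(publications=4500, grant_income=90_000_000.0, faculty=1500))
print(per_faculty_ratios(publications=300, grant_income=6_000_000.0, faculty=100))
```

On raw totals the first institution looks fifteen times stronger; per faculty member the two are identical, which is why respondents wanted awards scaled rather than reported as absolute counts.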

Respondents indicated that the range and number of graduate programs and degrees are important, as is the size of the student body, though not the number of courses or classes taught. Of the financial measures asked about in the survey, research income is the most important to respondents, but the analysis of income and expenditure is also valued as "nice to have" by many. However, teaching income is not seen as being so important, perhaps because of variations in government funds and individual fee levels.

Collaboration was identified as a key indicator in comparison systems by 82% of respondents, as it is seen as a significant marker of quality by peers. Community engagement is seen as a less valuable indicator, perhaps because its significance is less obvious. External perception is a tricky indicator, because it is about seeming rather than doing; it is also true that reputation can linger after performance has changed. Nonetheless, peer-researcher perception was seen as a key indicator by a good number of respondents, with the value of employer, alumni and community perceptions all being slightly lower.

TABLE 3: Essential Metrics for Comparisons: Teaching and Financial, External Engagement and Reputation. Q7: For each of the following types of information, please identify whether the data element is "must have", "nice to have", "not essential" or "not relevant" in thinking about their use in the creation of valid institutional comparisons. (% must have/nice to have)

Teaching
- Graduate degrees offered: 85%
- Graduate programs offered: 83%
- Number of students enrolled: 79%
- Number of classes taught: 64%

Financial
- Income from research grants and awards: 84%
- Analysis of income sources: 75%
- Analysis of expenditures: 73%
- Total expenditures: 72%
- Income from teaching: 61%

External engagement
- Collaborations (industry, international, multidisciplinary): 82%
- Community engagement: 69%

Reputation
- External perception among peer researchers: 79%
- External perception among employers: 71%
- External perception among alumni and the community: 69%
- External perception among administrators: 52%

Base: all respondents (n=350)

REFERENCES
1. Ruth A. Pagell (2009). University Research Rankings: From Page Counting to Academic Accountability. 3:1:33-63.
2. Hood, Christopher (2009). Foreword, in Jeroen Huisman (ed.), International Perspectives on the Governance of Higher Education: Alternative Frameworks for Coordination. New York & Abingdon, Oxon: Routledge; Scott, P. (1990). Knowledge and Nation. Edinburgh: Edinburgh University Press.
3. Higher Education Funding Council for England (2008). Counting What Is Measured or Measuring What Counts? League Tables and Their Impact on Higher Education Institutions in England. Issues paper no. 14 (April 2008). Commissioned by HEFCE, the report surveys ninety respondents across English universities plus six case studies, presenting findings from an analysis of five league tables and an investigation of how higher education institutions respond to league tables and the extent to which they influence institutional decision-making and actions. http://www.hefce.ac.uk/pubs/hefce/2008/08_14/
4. Michael Bastedo and Nicholas Bowman (2010). U.S. News & World Report College Rankings: Modelling Institutional Effects on Organizational Reputation. American Journal of Education, 116 (published electronically Dec 18, 2009).
5. Philippe Van Parijs (2009). European Higher Education under the Spell of University Rankings. Ethical Perspectives, 16, 189-206.
6. Institute for Higher Education Policy, Washington (2007). College and University Ranking Systems: Global Perspectives and American Challenges. IHEP, Washington DC, www.ihep.org. This edited report includes chapters on, among other things, a global survey of rankings and their impact on student access.
7. The Berlin Principles were formulated by UNESCO's European Centre for Higher Education (CEPES) in Bucharest, Romania, and the International Rankings Expert Group: http://www.che.de/downloads/berlin_Principles_IREG_534.pdf. IREG has met in Warsaw (2002), Washington (2004), Berlin (2006), Shanghai (2007) and Astana (2009), and will meet again in Berlin in 2010. http://www.ireg-observatory.org/

SCIENCE HEAD OFFICES
Americas: Philadelphia +1 800 336 4474, +1 215 386 0100
Europe, Middle East and Africa: London +44 20 7433 4000
Asia Pacific: Singapore +65 6411 6888; Tokyo +81 3 5218 6500
For a complete office list visit: science.thomsonreuters.com/contact
AG1002257
Copyright 2010 Thomson Reuters