
University and Higher Education Rankings – What Relevance Do they Have?


  1. University and Higher Education Rankings – What Relevance Do they Have? EI Affiliates Conference in the OECD member countries “FRAMING EDUCATION FOR THE PUBLIC GOOD” 29-30 January 2013, London, United Kingdom Grahame McCulloch General Secretary, NTEU (Australia) EI Executive Board Member

  2. BACKGROUND AND CONTEXT • Rise of mass higher education in rich countries (1960s-1990s) – mainly well funded public systems (with notable Japanese and Korean exceptions) but recognisable hierarchies and stratification (with a premium on research intensity) • Economic, social and labour market benefits (including R&D, innovation, technology transfer and human capital)

  3. BACKGROUND AND CONTEXT • Slowing of growth and scarcer resources – increased managerial authority, expanded role for private effort and markets, accountability, performance measurement and indicators and erosion of tenure and academic autonomy (1990s – present) • Rapid growth in emerging and developing countries (embryo of mass systems) and strong preoccupation with science, technology, R&D and direct economic role of universities (particularly in Asia)

  4. BACKGROUND AND CONTEXT • 177 million students, 17,000 institutions and 11 million staff – expected to double in 20 years (mainly in Asia and Latin America) • Global trade and flows – 2.5 million international students (around $100 billion), regional trade blocs (US, Canada, UK, Australia and Europe) and regional/national accreditation, qualifications and quality assurance

  5. BACKGROUND AND CONTEXT • Imbalance between per capita resources and enrolments on world scale • Rich countries seeking larger share of world expansion via trade, offshore and joint ventures and research collaboration • Emerging countries seeking domestic expansion through national strategy and investment (including imports of foreign capital, expertise and technical systems)

  6. [Chart] Source: UNESCO 2010

  7. [Chart] Source: UNESCO 2010

  8. OVERVIEW OF GLOBAL MEASUREMENT INSTRUMENTS • Parallels between schools, vocational education and higher education – Programme for International Student Assessment (PISA), Teaching and Learning International Survey (TALIS), Programme for the International Assessment of Adult Competencies (PIAAC) and Assessment of Higher Education Learning Outcomes (AHELO) • Focussed on systems and/or disciplines (not individual institutions) for cross-country comparisons and use in national benchmarking and quality assurance processes, high media and political visibility

  9. OVERVIEW OF GLOBAL MEASUREMENT INSTRUMENTS • Problems of measurement and misinterpretation or misuse of data by national governments – causation, correlation, limits of mathematical language and dangers of simplistic international league tables and single standardised scores • Narrowing of domestic public policy standards with strong instrumental focus, and less emphasis on wider social and educational objectives.

  10. OVERVIEW OF GLOBAL MEASUREMENT INSTRUMENTS • Tendency of scores and metrics to undermine qualitative and organisational quality assurance measures, and to encourage ‘gaming’ and manipulation of metrics

  11. UNIVERSITY RANKING SYSTEMS • Siblings and first cousins of international and national performance indicators/accountability systems, but focussed on individual institutions and not systems • Typical weighted indicators include undergraduate and postgraduate enrolments, research grants and endowments, public and private funding, student/staff ratios, graduation rates, research citations and publications and prizes/awards

  12. UNIVERSITY RANKING SYSTEMS • Measurement of teaching and research quality uses proxies (metrics and reputational surveys), and league tables are based on standardisation, aggregation into single score and ordinal scale based on the top ranked institutions • Developed and administered by media companies or specialist arm of university research centres – no direct government or intergovernmental involvement
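  [Editor's illustration] The standardise–aggregate–rank procedure described in the slide above can be sketched in a few lines. The institutions, indicator values and weights below are purely illustrative, not drawn from any actual ranking system.

```python
# Sketch of how league-table rankings are typically produced, as described
# above: standardise each proxy indicator across institutions, aggregate
# into a single weighted score, then rank ordinally.
from statistics import mean, stdev

def composite_ranking(institutions, weights):
    """institutions: {name: {indicator: raw_value}}; weights: {indicator: weight}."""
    names = list(institutions)
    # Standardise each indicator to a z-score across all institutions
    z = {}
    for ind in weights:
        vals = [institutions[n][ind] for n in names]
        mu, sd = mean(vals), stdev(vals)
        for n in names:
            z.setdefault(n, {})[ind] = (institutions[n][ind] - mu) / sd
    # Aggregate the standardised indicators into a single weighted score
    scores = {n: sum(weights[i] * z[n][i] for i in weights) for n in names}
    # Produce an ordinal league table by descending composite score
    return sorted(names, key=lambda n: scores[n], reverse=True)

# Hypothetical data: two proxy indicators for three institutions
table = composite_ranking(
    {"Uni A": {"citations": 90, "funding": 60},
     "Uni B": {"citations": 70, "funding": 80},
     "Uni C": {"citations": 50, "funding": 40}},
    weights={"citations": 0.6, "funding": 0.4},
)
print(table)  # ['Uni A', 'Uni B', 'Uni C']
```

  Note how the final ordinal table discards all information about the size of the gaps between institutions, one of the criticisms raised in the slides that follow.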

  13. UNIVERSITY RANKING SYSTEMS • Multi-Ranking without league tables – University Ranking and U-Map – the “Berlin Rankings” (CHE/die Zeit, Germany and IREG) and U-Multirank (EU)

  14. UNIVERSITY RANKING SYSTEMS • National ranking league tables – Japan (Asahi Shimbun), Canada (Maclean's), Italy (La Repubblica), US (US News and World Report) • International ranking league tables – US News and World Report (with QS (Quacquarelli Symonds)), Times Higher Education Supplement (with Thomson Reuters), Academic Ranking of World Universities (Shanghai Jiao Tong University, China), Global Universities Rankings (Lomonosov Moscow State University, Russia), Performance Ranking of Scientific Papers for World Universities (Higher Education Evaluation and Accreditation Council, Taiwan), Leiden Research Ranking (Leiden University, Netherlands), University Web Ranking (CSIC Cybermetrics, Spain)

  15. MOST INFLUENTIAL – THES AND ARWU • In rich countries used by governments in domestic policy debate and by universities in marketing and promotion, particularly in North and South East Asia • In emerging and developing countries used by governments as benchmark for development of domestic institutions and systems • Directly affects institutional behaviour and indirectly high achieving student choice

  16. MOST INFLUENTIAL – THES AND ARWU • ARWU based solely on metrics with research (maths and science in particular), accounting for 90% of composite scores • THES apparently more balanced (30% teaching, 30% research volume, income and reputation, 32.5% research citations, 7.5% international and 2.5% economic innovation), but actually closer to 75% weighting for research
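  [Editor's illustration] The slide's claim that THES's headline weights understate its research emphasis can be shown with simple arithmetic on the weights as stated above: the explicitly research-labelled components alone account for 62.5% of the composite, before counting the research-driven reputation survey inside the teaching pillar (the presentation's own argument, not independently verified here).

```python
# Arithmetic behind the slide's claim, using the THES weights as stated there.
weights = {"teaching": 30.0, "research": 30.0, "citations": 32.5,
           "international": 7.5, "innovation": 2.5}

# Components explicitly labelled as research: research pillar + citations
direct_research = weights["research"] + weights["citations"]
print(direct_research)  # 62.5
```

  Adding the research-reputation share of the teaching pillar is what pushes the effective research weighting toward the ~75% figure the slide cites.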

  17. MOST INFLUENTIAL – THES AND ARWU • Both rankings actually reflect the prestige, high selectivity in student enrolments and staff appointments, economic resources and global reach of each university • Are not able and do not aspire to reflect diversity of institutions and systems (large and small, teaching intensity, access and equality, three and four year programs, cultural context) • Not a guide or benchmark for national system development

  18. Proportion of universities covered by THES and ARWU rankings Source: European University Association (EUA) 2011

  19. AN EI RESPONSE • High quality information and feedback for national and international students is necessary in mass systems, and robust quality assurance is essential • Quality assurance and performance assessment should reflect the characteristics, resources and social/educational objectives of each institution, and be autonomously determined within each university using peer review and stakeholder consultation

  20. AN EI RESPONSE • Academic freedom, collegial decision-making, trade union rights and employment standards should be part of quality assurance criteria • The aggregation of data at national and international level for any cross-institutional comparative purposes should prevent the construction of league tables • Building on EI’s strategic response to PISA, EI should continue a critical dialogue with OECD in the development and implementation of AHELO (noting its discipline and national system focus). Any final methodology should prevent the construction of arbitrary league tables

  21. AN EI RESPONSE • EI should develop direct dialogue with the Berlin rankings group (CHE/die Zeit and IREG) on the development of University Ranking and U-Map, and with the EU on U-Multirank (noting these are consciously constructed to enable comparison without league tables)

  22. FURTHER READING • Global university rankings: where to from here?, Simon Marginson, Centre for the Study of Higher Education, University of Melbourne, Australia • To Rank or To Be Ranked: The Impact of Global Rankings in Higher Education, Simon Marginson and Marijk van der Wende, Journal of Studies in International Education, Vol. 11 No. 3/4 • College and University Ranking Systems – Global Perspectives and American Challenges, Institute for Higher Education Policy, Washington D.C., April 2007 • Global University Rankings and Their Impact, EUA Report on Rankings 2011, Andrejs Rauhvargers • The Road to Academic Excellence – The Making of World-Class Research Universities, Philip G. Altbach and Jamil Salmi (Editors), The World Bank, Washington D.C.
