
Graham Gibbs



Presentation Transcript


  1. If we want to improve our courses, what should we be paying attention to, and how should we go about making changes? Graham Gibbs

  2. ‘Dimensions of Quality’
  Literature review to inform debates about:
  • whether UK HE is comparatively good
  • whether university league tables are valid
  • whether the NSS and KIS provide info students can trust
  ‘Implications of “Dimensions of Quality” in a Market Environment’
  Review of institutional behaviour:
  • is how universities are responding to their PIs likely to “drive up quality”?
  • which enhancement strategies are working?

  3. ‘Presage’ variables
  • Resources per student predict much less than one might expect (but learning resources predict effort)
  • Selectivity predicts performance, but not learning gains, or engagement, or use of pedagogies known to enhance engagement
  • Research predicts performance, but not engagement, and negatively predicts satisfaction & measures of learning gains
  • Who does the teaching predicts performance and gains
  • Reputation predicts only selectivity, funding & research
  • Peer ratings only reflect reputation (US and TQA)

  4. ‘Process’ variables
  • Cohort size, class size, ‘close contact’ with teachers (SSRs) (cohort effect avoidable...)
  • Not class contact hours but total study hours
  • Quality of teaching: training, student ratings, but not teachers’ research
  • Quality of research environment: not at undergraduate level
  Consequences for learning:
  • Deep and surface approaches
  • Engagement: close contact, high and clear expectations, good quick feedback, active and collaborative learning, time on task

  5. ‘Product’ variables
  • Degree classifications
  • Retention
  • Employability
  ...too many confounding variables to be able to make much sense of any of this data, and degree classifications and employability data are highly unreliable

  6. What to pay attention to in terms of pedagogy?
  • Changing students: effort, internalisation of goals and standards, metacognitive awareness, self-efficacy
  • Changing teachers: who teaches, and how sophisticated they are
  • Moving from solitary to social learning
  • Changing curricula:
    • focussing course design, review and evaluation around learning hours
    • shifting from summative to formative assessment
    • making programmes coherent, with comprehensive changes implemented by course teams, not only by individuals (no matter how wonderful)

  7. Departments and social mediation of quality
  • Programmes vary widely in quality within institutions (except where there is an ‘institutional pedagogy’)
  • Institutions with no quality-enhancement focus on programmes have problems
  • Communities of practice (Havnes)
  • Talking about teaching at programme level (TESTA)
  • Employment practices (adjunct faculty, pseudo-departments, Fordism)
  • Modular structures, with no assessment (or even shared understanding) of programme outcomes
  ...all of which implies an increased developmental focus on departments or course teams (Lund, Oslo, Finland, Utrecht...)

  8. The ‘how’ of change...
  1 Using teaching PIs to improve quality
  2 Unanticipated impacts on curricula
  3 Student engagement
  4 KIS, and better information about provision

  9. 1 Using teaching PIs to improve quality
  • Unprecedented attention to quantitative PIs
  • Average NSS scores up every year
  • Some institutions climbing rankings every year
  • ...by paying attention and using clever change processes:
    • Exeter
    • Coventry
    • Winchester: TESTA assessment and feedback
  • ‘hygiene’ factors

  10. The first degree programme at Winchester to use TESTA is now top ranked nationally

  11. University of Winchester

  12. 1 Using teaching PIs to improve quality
  • Unprecedented attention to quantitative PIs
  • Average NSS scores up every year
  • Some institutions climbing rankings every year
  • ...by paying attention and using clever change processes:
    • Exeter
    • Coventry
    • Winchester: 20+ universities now using TESTA
  • ‘hygiene’ factors

  13. 2 Unanticipated impacts on curricula
  • Whole is less than the sum of the parts (OU, Plymouth, module-level NSS scores)
  • Course rationalisation, abandoning joint degrees
  • Abandoning modularity altogether
  • Bigger, longer, fewer modules, fewer in parallel
  • Planned programme assessment regimes, including programme-level learning outcomes

  14. 2 Unanticipated impacts on curricula
  • Whole is less than the sum of the parts (OU)
  • Course rationalisation, abandoning joint degrees
  • Abandoning modularity altogether
  • Bigger, longer, fewer modules, fewer in parallel
  • Planned programme assessment regimes
  • ...but this may cause:
    • less choice, less engagement
    • larger classes

  15. 3 Student engagement
  • Students as change agents (Exeter)
  • Students as democratic partners (Bath)
  • Students as educational researchers (Winchester)
  • Students as collaborators (Leicester)
  • Changed practices, changed student attitudes
  • Better engagement in studies (USA, NSSE)
  • Improved NSS scores (2008–12: 7%, against an average of 2%)

  16. 4 KIS

  17. If you study Pork Products at the University of Poppleton, you will experience:
  ...lecture classes of 900
  ...run by teachers you will only see from 30m away
  ...seminar classes of 40
  ...run by part-time teachers who do not have an office and who do not mark your work
  ...computer-based assessment
  ...fellow students you don’t know, never talk to, and who disappear at the end of the lectures
  NONE of these variables feature in KIS
  ALL are valid indicators of poor quality

  18. If you study Egyptology at Hamble University you will experience...
  • small classes, with other students you will get to know
  • ...taught by teachers you will get to know
  • ...who will give you copious written and oral feedback on your assignments
  • ...which will engage your time and effort
  • ...and which will often involve working with other students, which will mainly be for learning, not for marks
  • ...in a well-stocked ‘learning resource centre’
  NONE of these variables feature in KIS
  ALL are valid indicators of good quality

  19. 4 KIS
  • Includes invalid PIs, & emphasises satisfaction
    • % class contact
  • Misses key valid PIs about provision:
    • cohort size, class sizes, formative assessment
  • Difficult to specify standardised ways of calculating single quantitative indicators that work across contexts
  • ...so obliges programmes to specify what provision students are buying, in prospectuses or in descriptive text linked to KIS pages
  • David Willetts has agreed to review KIS and add unique local data

  20. A postscript on MOOCs ...and the ‘avalanche’ predicted to engulf conventional universities
  • Online and distance courses usually have terrible quality PIs, and very poor retention
  • The Open University is an exception – but:
    • it builds in all the pedagogic practices MOOCs lack (and largely abandoned video lectures decades ago)
    • so it costs the same as conventional provision
  • Employers have traditional notions about which universities to recruit graduates from
  • MIT gave away its lectures and course materials a decade ago
    • on the assumption that they represented nothing of importance about the experience of studying at MIT
  • Valid quality PIs will protect conventional universities – provided that universities pay attention to these PIs better than MOOCs do...
