
Quality Assurance within Higher Education Institutions




  1. Quality Assurance within Higher Education Institutions Professor Phil Cardew, Pro Vice Chancellor (Academic), London South Bank University

  2. Objectives of the Day • To establish the concepts of ‘standards’ and ‘quality’ and their place within a higher education institution. • To consider core processes of benchmarking, reporting and review needed to manage quality and standards. • To discuss the relationship between management processes and resourcing and quality assurance systems. • To consider the interrelationship between internal and external assurance systems. • To consider reporting mechanisms, and risk assessment and management from a management perspective.

  3. Agenda 0930: Welcome, introductions, discussion of core concepts. 1000: Quality assurance at programme level: establishing and assessing to a standard. 1100: Coffee Break 1130: Validation, monitoring and review: process and reporting. 1230: Lunch 1300: Student engagement: feedback and representation. 1400: Using reports, risk assessment and management. 1430: Plenary Discussion

  4. Quality assurance at programme level: establishing and assessing to a standard.

  5. Standards and Quality • What is ‘a standard’? • Thresholds of attainment • Levels of achievement • Benchmarking and equity • What is ‘quality’? • Customer service models • Enhancement

  6. The Building Blocks of degree awards – working with institutional variation: • ‘One size fits all’ approaches. • Establishing ‘labels’ – understanding structures. • Common ‘labels’: • Programmes and courses • Frameworks and Pathways • Modules and Units • Variability of approach • Modification

  7. Structures of delivery • Full-time and part-time • Single honours and combined honours • Distance, blended and distributed delivery • Delivery by partner institutions • Multi-site delivery • Comparability • Assessment of standards • Research degrees and learning contracts • Relationship to academic regulations

  8. How do we establish a standard at programme level? • National Qualifications Frameworks • Subject benchmark statements • Professional body requirements • Employer requirements • External examiners • Academic • Professional

  9. Standards, levels, awards • Award outcomes – graduate attributes • Levels within an award • Exit qualifications • Assessment • Type • Variation • Loading • Embedded, dual and articulated awards.

  10. Employability • ‘Professional’ and ‘Academic’ qualifications • Subject knowledge, technical ability, specialist skills, core skills • Currency of knowledge • Work-based learning

  11. Conclusions • No ‘one size fits all’ or ‘standard’ model • However – especially in early stages – a consistent approach pays dividends • Important to establish an ‘outcomes-based’ approach • Clear understanding of overall learning outcomes • Clear understanding of level and progression • Clear assessment strategy

  12. Coffee Break

  13. Validation, monitoring and review: process and reporting.

  14. The Building Blocks • Validation – programme approval • Annual monitoring: • Action planning • Relationship to other processes • Periodic review: • Cycle of operation • ‘End of cycle’ and ‘mid-cycle’

  15. Working with collaborative partners • Types of relationship • ‘Flying faculty’ • Part-franchise • Franchise • Validation • Accredited Partner • Approval of delivery • Periodic review

  16. Validation • Initial approval in principle: • Strategic ‘fit’ within overall academic portfolio • Clarity of award title • Market • Desirability for professional and/or employment market • Validation event: • Programme specification • External involvement • Academic • Professional • Employer

  17. Annual Monitoring • Cyclical action planning • Responding to data: • External Examiners’ report(s) • Progression and Award Statistics • Module Evaluation Questionnaire results • National Student Survey • Employment Statistics • ‘Sign off’ of minor modifications

  18. Periodic Review • Relationship between review, validation, monitoring and minor modifications: • Incremental change and re-validation • Stability of award title and learning outcomes • (Advantage of frameworks and pathways) • Gives experience of programme over a longer time-scale • Allows for ‘major’ changes • MUST include appropriate externality

  19. Action Planning and Reporting • Identifies short and medium-term actions • Includes responsibility • Identifies activities to be undertaken • Includes review point • Establishes benefits of activity • Reports on: • Conclusion of activity • Results of action

  20. Conclusions • Nested activities – not separate processes • Should establish continuum of evidenced action planning • Can work in clusters of programmes as well as individual programmes • Should lead to clear, concise reports • MUST include externality in all aspects and at all points.

  21. Lunch

  22. Student engagement: feedback and representation. • Why engage students with quality processes? • Identify strengths and weaknesses of delivery from a student perspective • Engage with aspects of delivery outside teaching: • Classroom and lecture space • Library • IT • Specialist equipment • Engage with assessment, marking, moderation and feedback to students

  23. Basic Methods • Module Evaluation • Annual Surveys • Course Boards • Student Meetings • Student involvement in Periodic Review: • In meetings • As Reviewers • Senior engagement with the Students’ Union

  24. Module Evaluation • Standard questions • Scoring • Similar timescales of delivery • Anonymous completion • Comments as well as scores • Standard reports • Module • Programme • Department • Faculty • Focus on under-performing modules

  25. Annual Surveys • New entrants • International Students • National Student Survey • Postgraduate Surveys: • Taught programmes • Research Students • ‘Pulse’ surveys

  26. Course Boards and Student Meetings • Elected representatives • Training • Timescales for meetings • Standard Agendas • Gathering information • Feedback • Relationship to other processes • External examining • Annual monitoring

  27. Students within validation and review processes • Student meetings • Engagement with new proposals • Feedback on existing courses • Recent graduates reflecting on employability • Engagement with department – responsiveness to feedback, etc. • Students as reviewers • Experience on QAA reviews • Training • Limits of process

  28. Conclusions • Student input adds value to processes. • Needs to happen in collaboration with the Students’ Union (or a Student Society). • Representatives need training. • Need to establish a clear understanding of the goals of engagement. • Needs careful handling to avoid patronising or antagonising. • Need to reassure staff that they are in control of their programmes – but that student input is valuable!

  29. Using reports, risk assessment and management.

  30. What are the aims of quality assurance processes? • Confirmation of standards • Reassurance that processes have been completed • Reflection on performance (data monitoring) • Enhancement of future delivery (programme structure and quality of delivery/environment).

  31. What should processes focus on? • Specialist understanding of the academic discipline • Statistical data – progression and achievement • Feedback from external examiners • Feedback from students • Employment statistics • Resources

  32. What should we avoid? • Long and tedious reports with nothing to say. • Repetition from previous years. • Narrative description with no analysis. • ‘Open-ended’ action planning.

  33. What should we promote? • Focused reports. • Clear analysis of data. • Action plans which show monitoring and completion of actions. • Forward planning related to analysis. • Responsiveness to feedback. • Development, not stagnation.

  34. Becoming ‘risk aware’ • Can we focus only on key areas (programmes) of risk? • What should lead to investigation? • Threats to standards • Poor progression and/or achievement • Negative feedback (from examiners or students) • Poor satisfaction • Poor employability • Lack of action

  35. Reporting as part of a cycle • Reflection on previous year’s report (and actions). • Analysis of data – including comparison with past performance (what is ‘direction of travel’?) • What action is needed as a result? • Who will do it – by when?

  36. Conclusions • Reporting need not be a huge burden (either to the author or the reader). • Must have clear outcomes and be useful. • Must be used. • Must have place in future activity and reflection. • Poor performance must be dealt with (both in terms of activity and reporting).

  37. Plenary Discussion
