
PTES 2014 Update



Presentation Transcript


  1. PTES 2014 Update

  2. Postgraduate Taught Experience Survey 2014 • 100 participating institutions (up from 89 in 2013); • Mission Groups: • GuildHE 12/28 • Million+ 9/17 • Russell Grp 18/24 • Uni Alliance 17/22

  3. How is it going for you? Discussion Summary: International students are sometimes difficult to recruit; Having PTES as part of course review encourages engagement; Shibboleth login would be useful (BOS is exploring).

  4. The Response Rate Chase

  5. Response rates 2013

  6. Three rules for getting good response rates Personalise emails and make them informal; Publicise what you have done in response to student feedback; Use a hyperlink that passes the student to the survey directly from the email.

  7. When is enough, enough? • The number of responses, not the response rate, is critical for the validity of results; • More than 23 responses for a score is OK, more than 50 is good, more than 100 is better.

  8. When is enough, enough?

  9. When is enough, enough? • Non-response can skew results, but more important is the impact on staff confidence; • Aim for a 15% response rate as a minimum: 25% is OK, 33% is good, 50% is great.
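The two rules of thumb above can be sketched as a small check. This is a hypothetical helper (not part of PTES or BOS tooling); only the thresholds are taken from the slides:

```python
def response_quality(responses: int, cohort: int) -> tuple[str, str]:
    """Grade a course's PTES return against the rule-of-thumb thresholds
    from the slides: >23 responses OK, >50 good, >100 better; and a
    15% minimum response rate, with 25% OK, 33% good, 50% great."""
    rate = responses / cohort if cohort else 0.0

    if responses > 100:
        count_grade = "better"
    elif responses > 50:
        count_grade = "good"
    elif responses > 23:
        count_grade = "ok"
    else:
        count_grade = "too few"

    if rate >= 0.50:
        rate_grade = "great"
    elif rate >= 0.33:
        rate_grade = "good"
    elif rate >= 0.25:
        rate_grade = "ok"
    elif rate >= 0.15:
        rate_grade = "minimum"
    else:
        rate_grade = "below minimum"

    return count_grade, rate_grade

# 60 responses from a cohort of 180 (a 33% response rate)
print(response_quality(60, 180))  # → ('good', 'good')
```

Note that the two scales can disagree: a very large course can clear the response-count bar while missing the rate minimum, and vice versa, which is why the slides treat them separately.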

  10. Chasing Response Rates Discussion summary: Some institutions run a “survey season”, others prefer a late launch; Different strategies exist around the minimum numbers for what is reported on, which can encourage courses to respond; Conversations with staff are difficult if results are withheld – hence a preference that the HEA provide stronger guidelines.

  11. Standard Benchmarking Reports

  12. What are standard benchmarking reports? • Reports provided to each institution by the HEA that give detailed benchmarking information, including statistical analysis; • Results are only provided for benchmarks with at least 23 responses, drawn from at least three institutions, where no single institution constitutes over 50% of the responses; • Results broken down by subject; • Summary graphs, summary tables, detailed data tables and guidance provided; • NOTE: Benchmarking also available in BOS from late May.
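The disclosure rule above can be expressed as a short predicate. This is a sketch only (the HEA's actual implementation is not published here); `counts` is a hypothetical mapping from institution to response count:

```python
def benchmark_reportable(counts: dict[str, int]) -> bool:
    """A benchmark is reported only if there are at least 23 responses
    in total, drawn from at least three institutions, with no single
    institution contributing over 50% of the responses."""
    total = sum(counts.values())
    if total < 23 or len(counts) < 3:
        return False
    return all(n <= total / 2 for n in counts.values())

print(benchmark_reportable({"A": 12, "B": 8, "C": 6}))  # True
print(benchmark_reportable({"A": 20, "B": 3, "C": 3}))  # False: A has over 50%
```

The over-50% clause stops any one institution from inferring how the rest of the benchmark group scored by subtracting its own results.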

  13. Process for 2014 • Late May – officers contacted with further details; • June – survey sent to PTES officers asking for preferences; • Option to select up to three standard benchmarks (e.g. Russell Group, Universities Alliance, “1994”) in addition to sector scores; • Option to select from 6 to 20 participating institutions to construct a custom benchmark, for a charge of £150 for the first and £50 each for up to two more.
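On one reading of the pricing above – £150 for the first custom benchmark and £50 for each of up to two more (an interpretation, since the slide is terse) – the charge works out as:

```python
def custom_benchmark_cost(n: int) -> int:
    """Charge in GBP for n custom benchmarks, assuming £150 for the
    first and £50 for each additional one, capped at three in total."""
    if not 1 <= n <= 3:
        raise ValueError("institutions may take between one and three custom benchmarks")
    return 150 + 50 * (n - 1)

print([custom_benchmark_cost(n) for n in range(1, 4)])  # [150, 200, 250]
```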

  14. What could we do for you? • The HEA Surveys Team is considering an extended reporting service that would, usually for a charge, offer options such as: • Benchmark response rates; • Detailed subject analysis; • Course/Department/Faculty reports; • Trend analysis.

  15. Analysis and reporting for enhancement

  16. Reporting within institutions Discussion summary: HEA graphs and Tableau are useful for reporting simple stats at an institutional level, mainly around exceptions; Detailed information goes to departments; Internal comparisons use overall benchmarks or sector quartile scores; Free-text comments are labour intensive, given the need to ensure their use fits with institutional policy, e.g. being fully anonymised; Preference for anonymised comments for distribution to all staff and confidential un-anonymised comments for management. However, students should be directed to use appropriate channels for complaints.

  17. Disseminating results and engaging ‘staff’ • Identifying possible users of PTES results • Who currently receives the PTES results in your institution? (i.e. staff in which roles?) • Who else might find them useful? Why? • What currently stops them from receiving or using the results?

  18. Disseminating results and engaging ‘staff’ • Engaging users of PTES results • How do you (or could you) ensure senior staff are aware of and act on PTES results? • How do you (or could you) ensure academic staff use the results to inform enhancement?

  19. Case studies • We need your help in producing case studies of the impact of PTES to: • Persuade even more institutions to take part (and thus improve benchmarking further); • Share good practice between institutions about using PTES for enhancement; • Demonstrate the impact of PTES to the sector; • Please let us know if you can help.

  20. Linking results to institutional processes • PTES should be part of a wider process of enhancement, not a standalone survey… DCU Learning Innovation Unit

  21. Linking results to institutional processes • PTES should be part of a wider process of enhancement, not a standalone survey; • Results are best used in combination with other evidence, as a starting point for conversations about enhancement; • The new PTES scales are partly designed to fit with other institutional metrics…

  22. Linking results to NSS results and QAA indicators See online (Excel spreadsheet with detailed mapping)

  23. PTES (and PRES) 2015

  24. PTES 2015 and BOS2.0 • BOS 2.0 on schedule for July release; • Much improved interface for survey design; • Much improved navigation and accessibility; • More useable and useful results analysis; • Email facility? (we don’t know yet…)

  25. PTES 2015 and Demographic Data • HEA Surveys propose to use institutional and HESA data rather than student-entered data where possible; • The process has some potential risks and places greater burdens on institutions; • It seeks to gather institutional data where required for immediate benchmarking analysis, and to use HESA data where data are only needed for further research.

  26. PTES 2015 and Demographic Data 2011 review suggested several fields are not used by many institutions. We need to establish what fields are used for benchmarking.

  27. Consulting on demographics Discussion summary: HEA to ask officers which data are required for institutional benchmarking; HEA then to produce a definitive list of which variables will be required for upload to BOS, setting out what data will be needed.
