Look! Our ‘Stuff’ is Making an Impact: Using Meaningful Assessment to Demonstrate the Effectiveness of Academic Support Services. Michael Anne Greer, Director, Academic Success Center; Amanda Novak, Director, Advising and New Student Services; Texas Wesleyan University.

Presentation Transcript


  1. Look! Our ‘Stuff’ is Making an Impact: Using Meaningful Assessment to Demonstrate the Effectiveness of Academic Support Services. Michael Anne Greer, Director, Academic Success Center; Amanda Novak, Director, Advising and New Student Services; Texas Wesleyan University

  2. The Same Old Question: Why Assess? • We have to • Accreditation • Reaffirm our roles • Demonstrate department’s contributions to institutional goals • SEE IF OUR STUFF IS WORKING

  3. Who Should Be Assessing? • Academic Departments • Experience creating, writing, and evaluating learning outcomes • Administrative Departments • Why it can be daunting • Which areas should assess? • Internally, within the department • Centralized Institutional Research

  4. Let’s Look at Our ‘Stuff’ • Texas Wesleyan: brief statistics • 3,204 students • 1,794 undergraduates • Private • Academic Success Center • History • Services • Roles • Advising & New Student Services • History • Services • Roles

  5. Common Pitfalls • Avoiding assessment • It’s not that bad: after 1 or 2 cycles, it’s much less daunting! • Don’t be afraid of the data • Much easier and more useful if kept up regularly • Staying on track • Evaluation Grid • Advising Assessment Matrices (Redmond & Babenchuk, 2011) (Robbins, 2011)

  6. Common Pitfalls • Alignment with University • Crucial to demonstrate how your department is contributing to institutional goals and mission • Demonstrate needs (funding, staff, resources) • Informed decisions for improvement • Alignment with department mission • Make sure goals support current mission (Bresciani, 2011)

  7. Common Pitfalls • Alignment with University • “Our mission at Texas Wesleyan University is to develop students to their full potential as individuals and as members of the world community.” • “The Academic Success Center’s mission is to serve as a critical link between students and a fully successful academic experience by providing personal support services that remove barriers to academic excellence and assist in the development of critical thinking, analytical reasoning, and problem solving skills.” • “Academic advising at Texas Wesleyan University is a teaching and learning process dedicated to student success. Consistent with the university mission, academic advising engages students in developing a plan to realize their educational, career and life goals.” (Bresciani, 2011)

  8. Common Pitfalls • Unreasonable goals and objectives • Keep goals manageable and attainable • Be sure goals are relevant and meaningful to stakeholders • Goals should be broad and reflect what your program would like to be, look like, and do (Bresciani, 2011)

  9. Common Pitfalls • Ineffective measures • Are we measuring actual outcomes? • Tracking visits does not measure outcomes. • Lack of documentation/tracking • Are we asking for, collecting, and pulling the data we need at the right time? • We’ve been doing all of these great things, but have no evidence. • One survey is not assessment!

  10. Common Pitfalls • Invalid data • Does the data give us a valid measure of our outcomes? • Incorrect interpretation of data • Increased GPA does not imply retention • Ineffective use of data: not closing the loop!

  11. ‘Backwards Thinking’ • Some struggle with how to think about assessment • Let’s think backwards! • What are we doing? • 85 students come in for writing tutoring • Held 12 academic skills workshops • 150 out of 200 freshmen register on the first day of early registration • Write down 2 or 3 of your own services

  12. ‘Backwards Thinking’ • Why are we doing it? • “Students need/request it”: not specific enough • Need to demonstrate why the time, effort, and funds spent on services are justified

  13. Outcomes • What are we attempting to achieve? • Outcomes: what the end user should know, have, or be able to do as a result of the service • Must be specific • Must be measurable (Bussell, 2011)

  14. Writing Outcomes • “80% of undergraduates will utilize tutoring services”: not an outcome • “Students who utilize tutoring services will succeed”: not specific enough • “Students will benefit from workshops”: not measurable • “Students will receive adequate advising”: not easily measured, too broad

  15. Writing Outcomes • “Students who receive writing assistance will pass writing-intensive courses with a C or better” • “Students who attend academic workshops will gain effective note-taking methods” • “Freshman students will be able to calculate their GPA” (a worked sketch of the GPA arithmetic follows below) • Write 2-3 outcomes for your own services
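To make the GPA outcome concrete, here is a minimal Python sketch of the credit-weighted arithmetic a freshman would need to perform. The 4.0 scale, the grade values, and the sample courses are illustrative assumptions, not details from the presentation.

    # Credit-weighted GPA on an assumed standard 4.0 scale (not specified in the slides).
    GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

    def calculate_gpa(courses):
        # courses: list of (letter_grade, credit_hours) pairs
        total_points = sum(GRADE_POINTS[grade] * hours for grade, hours in courses)
        total_hours = sum(hours for _, hours in courses)
        return total_points / total_hours if total_hours else 0.0

    # A 3-credit B plus a 4-credit A: (3 * 3.0 + 4 * 4.0) / 7 = 25 / 7, about 3.57
    print(round(calculate_gpa([("B", 3), ("A", 4)]), 2))  # 3.57

A direct measure of this outcome could then be a short exercise in which freshmen compute a GPA from a sample transcript and their answers are checked against this arithmetic.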

  16. Measuring Outcomes • Is it working? • How do we know? • Measures should assess the actual outcome, not another process • Direct measures: assess actual performance or a product • Indirect measures: assess perceptions or opinions

  17. Measures • Direct or Indirect? • Practice test at the end of a math tutoring session • Survey to workshop attendees • Focus group opinions of tutors • Demonstration exercise during a workshop • Enrollment data during early registration • Write 1 direct and 1 indirect measure for your outcomes

  18. Gathering the Data • What do the measures tell us? • Data collection examples • Survey responses • Focus group feedback • Retention reports • Grade reports • GPA • Self-study audits • Individual observation of students performing a task (especially useful in advising) • A sketch combining two of these sources follows below
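Much of this data work reduces to joining one collection source against another. The sketch below shows one hedged way to check the slide 15 writing-assistance outcome by matching a tutoring visit log against a grade report; the field names, the course code, and all records are invented for illustration, not drawn from Texas Wesleyan data.

    # Hypothetical join of a writing-center visit log with a registrar grade
    # report, to estimate how many visitors passed a writing-intensive course
    # with a C or better. All identifiers and records here are made up.
    PASSING = {"A", "B", "C"}

    visit_log = [{"student_id": 101}, {"student_id": 102}]
    grade_report = [
        {"student_id": 101, "course": "ENG 1301", "grade": "B"},
        {"student_id": 102, "course": "ENG 1301", "grade": "D"},
        {"student_id": 103, "course": "ENG 1301", "grade": "A"},  # never visited
    ]

    visitors = {row["student_id"] for row in visit_log}
    visitor_grades = [r["grade"] for r in grade_report if r["student_id"] in visitors]
    pass_rate = sum(g in PASSING for g in visitor_grades) / len(visitor_grades)
    print(f"{pass_rate:.0%} of writing-center visitors earned a C or better")  # 50%

Note the caveat from slide 10 still applies: a pass rate computed this way shows association, not proof that tutoring caused the grades.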

  19. Assessing the Data • Are we doing well? • Did we meet our goals? • Do we need to increase or decrease goals? • What are appropriate benchmarks? • CAS Standards • Professional association standards • Peer institutions

  20. Using the Data • Now what? • Closing the loop • Which goals were met, and which were not? • Why or why not? • Needs: resources, staff, funding • How to improve • Affirmations • Recommendations/plans going forward

  21. References • Bresciani, M. J. (2011, June). Enhancing outcomes-based assessment for student affairs. Webinar. Lecture conducted from Innovative Educators, Thornton, CO. • Bussell, K. H. (2011, May). Strategic planning & outcomes assessment. Academic Student Support Services Council retreat. Lecture conducted from Texas Wesleyan University, Fort Worth, TX. • Redmond, J., & Babenchuk, I. (2011, September). Data-driven decision-making. Learning centers: At the crossroads of student success. Lecture conducted from National College Learning Center Association 26th Annual Conference, Indianapolis, IN.
