
Essentials of Assessment


Presentation Transcript


  1. Essentials of Assessment Eric Hampton, Ph.D. Cindy Crowder, Ph.D.

  2. Assessment in Context • Evaluation • A process of reaching a conclusion, judgment, or decision about an evaluation object. This involves judging the worth of something (Scriven, 1967). • Assessment • Procedures and processes which identify, collect, and prepare data to serve evaluative needs (e.g. student outcomes, educational objectives, program objectives). • Measurement • A process of systematically assigning numbers to measured attributes according to established rules.
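
To make the three terms concrete, here is a minimal Python sketch (not from the original slides; the performance levels, scores, and cutoff are invented) showing measurement as rule-based number assignment, assessment as data collection and preparation, and evaluation as a judgment against a criterion:

```python
# Measurement: an established rule mapping observed performance to a number.
# (Hypothetical levels and cutoff, for illustration only.)
LEVELS = {"unsatisfactory": 1, "developing": 2, "proficient": 3, "exemplary": 4}

def measure(observed_level: str) -> int:
    """Systematically assign a number to an observed attribute."""
    return LEVELS[observed_level]

# Assessment: identify and collect data to serve an evaluative need.
observations = ["proficient", "developing", "exemplary", "proficient"]
scores = [measure(level) for level in observations]

# Evaluation: judge worth against a criterion (here, a hypothetical cutoff).
mean_score = sum(scores) / len(scores)
print("Target met" if mean_score >= 3.0 else "Target not met")
```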

  3. Attributes of High Quality Assessment (Stiggins, 1997) • Clear Targets • Clear achievement targets are a must; know what you are after. • Focused Purpose • Know why the assessment is conducted and how results will be used. • Proper Method • The method of assessment must match the target. • Sound Sampling • A representative sampling of possible performances is gathered. • Accuracy in Assessment • Assessment limits error and bias in measurement.

  4. Potential Educational Assessment Targets (Adapted from Stiggins, 1997) • Knowledge • Reasoning/problem solving • Skill • Creation of products • Dispositions/attitudes

  5. Potential Assessment Methods Direct Assessment Methods Student knowledge, skill or product is directly examined or observed. Student performance on the direct measure is compared against measurable performance criteria. • Student work products assessment • Classroom assessment of knowledge/reasoning • Observations of student skills • Standardized tests • National certification exams • Locally developed tests • Juried review • Simulations • Internship evaluations based on learning outcomes

  6. Potential Assessment Methods Indirect Assessment Methods Assessment data are gathered from reported perceptions of student learning. • Student self-assessment of learning • Surveys (alumni, employer) • Exit interviews • Focus groups

  7. Assessment Paradigms Quantitative • Gathering assessment data in numeric form. • Can be analyzed statistically. Qualitative • Gathering assessment data in narrative form. • Can provide rich detail.

  8. Matching Data to Outcomes Do not address a specific outcome with a global measure. • Course grades are good reflections of overall performance in a class. • Course grades are poor reflections of particular learning outcomes. • Can be impacted by many factors not tied to the outcome (attendance, participation, etc.) • Summative exam scores are good reflections of overall performance in a content area or construct of knowledge. • Summative exams may not provide sufficient information about particular outcomes. • Particular items from exams may provide a more direct measure of student performance on particular outcomes.
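
The contrast between a global measure and an outcome-specific one can be sketched in a few lines of Python (the item IDs, outcome labels, and proportions below are hypothetical, not from the slides):

```python
# Item id -> (outcome the item targets, proportion of students correct).
exam_items = {
    "q1": ("SLO 1.1", 0.92), "q2": ("SLO 1.1", 0.88),
    "q3": ("SLO 1.2", 0.54), "q4": ("SLO 1.2", 0.61),
}

# The global measure: one number, no diagnostic detail.
summative = sum(p for _, p in exam_items.values()) / len(exam_items)
print(f"Summative mean: {summative:.2f}")

# The outcome-specific measure: aggregate items by the outcome they target.
by_outcome: dict[str, list[float]] = {}
for outcome, p_correct in exam_items.values():
    by_outcome.setdefault(outcome, []).append(p_correct)

for outcome, scores in by_outcome.items():
    print(f"{outcome}: {sum(scores) / len(scores):.2f}")
# SLO 1.2 stands out as a weakness that the summative mean alone would hide.
```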

  9. Matching Data to Outcomes Rubrics are useful in assessment of specific performance criteria for outcomes. • Can be used in student papers, theses, presentations, portfolios, projects, etc. • Review existing rubrics for a match with desired student learning outcomes. • Revise when the coherency of the match between rubric and outcome can be improved. • Analyze existing student work (course projects, papers, etc.) for match with desired student learning outcomes and develop new rubrics. • Find those outcomes not adequately assessed by the first two steps and develop new student performances and corresponding rubrics.
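
As one illustration, a rubric can be treated as a small data structure so that scoring is applied consistently across student work. A minimal sketch, assuming invented criteria and a 1-4 scale:

```python
# Hypothetical rubric: criterion -> description of what is rated (1-4 scale).
RUBRIC = {
    "thesis_clarity":  "Thesis is explicit and arguable (1-4)",
    "use_of_evidence": "Claims supported with cited evidence (1-4)",
    "organization":    "Logical structure with clear transitions (1-4)",
}

def score_artifact(ratings: dict[str, int]) -> dict[str, float]:
    """Check a set of ratings against the rubric and summarize it."""
    missing = set(RUBRIC) - set(ratings)
    if missing:
        raise ValueError(f"Unrated criteria: {missing}")
    return {"total": sum(ratings.values()),
            "mean": sum(ratings.values()) / len(ratings)}

print(score_artifact({"thesis_clarity": 3, "use_of_evidence": 2, "organization": 4}))
```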

  10. Some Assessment Guidelines • All assessment methods have advantages and disadvantages. There is no “perfect” method in the abstract. • A good assessment has a strong match with the specific outcome to be assessed. • A good assessment demonstrates validity for the purposes for which the data will be used. • A good assessment is measured with accuracy and reliability. • A good assessment is feasible in terms of resources (time, effort, and money).

  11. Use multiple methods of assessment and data gathering. • Any one method of data collection carries its own strengths and weaknesses. • Use of only one method leads to mono-method bias. • Student learning or process outcomes should be approached from multiple angles, using multiple methods, and carried out by multiple individuals.

  12. Assessment/Evaluation Steps • Generate assessment questions based on student learning outcomes. • Generate ideas on behaviors, skills, attitudes, performances, and dispositions which would provide data to answer these assessment questions. • Compare questions with ideas, looking for congruence. If congruence is not perfect, consider whether ideas need to be added or removed, or assessment questions added or removed. • Assess existing data sources. • What is already being gathered? • What would need to be developed? • What can be gathered?

  13. Assessment/Evaluation Steps • Develop an assessment plan which collects data to provide answers to assessment questions. • Identify how/when the data will be collected. • Identify how/when the data will be analyzed. • Identify how/when the data will be disseminated and to whom. • Operationally define each measure. • What behaviors, skills, etc. are targeted? • Where will each be exhibited or measured? • In what way will each be exhibited or measured? • Develop measurement/evaluation/assessment instruments. • Identify what differentiates successful and unsuccessful performance. • Develop rubrics for assessment of performance.
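
One way to keep these operational definitions explicit is to record each measure as a structured object whose fields mirror the questions above. The sketch below is illustrative only; the outcome, course, scale, and target are invented:

```python
from dataclasses import dataclass

@dataclass
class Measure:
    outcome: str    # which SLO this measure addresses
    behavior: str   # what behavior/skill is targeted
    where: str      # where it will be exhibited or measured
    how: str        # in what way it will be measured
    collected: str  # how/when the data will be collected
    analyzed: str   # how/when the data will be analyzed
    success: str    # what differentiates successful performance

capstone_demo = Measure(
    outcome="SLO 1.1",
    behavior="Develop and execute an experiment to validate a design",
    where="Capstone course, final project demonstration",
    how="Faculty-scored rubric (4 criteria, 1-4 scale)",
    collected="Each spring, by the capstone instructor",
    analyzed="Each summer, by the assessment committee",
    success="Mean rubric score of 3.0 or higher on every criterion",
)
```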

  14. Assessment/Evaluation Steps • Initial assessment of evaluation plan adequacy. • Adjust plan/measures/metrics accordingly. • Carry out assessment. • Collect initial data. • Analyze initial data. • Evaluate analyzed data for adequacy in answering the assessment questions posed. • Evaluate effectiveness of instruments. • Evaluate effectiveness of analysis. • Make revisions to the assessment plan if necessary.

  15. Assessment/Evaluation Steps • Evaluate data in comparison to assessment questions. • What are the strengths of student performance or program preparation? • What are the weaknesses of student performance or program preparation? • Disseminate assessment results. • Consider gathering and listening to feedback from stakeholders on the usefulness/appropriateness of the findings. • Continue data collection. • Revise assessment plan as necessary to meet program/accreditation needs.

  16. Assessment at ISU • Standing Requirements • Mission Statement • Outcomes Library • Curriculum Map • Communication of Outcomes • Assessment Plan • 2011-2012; 2012-2013; 2013-2014 • Assessment Cycle • Assessment Findings • Action Plan based on findings • Status Report

  17. Assessment at ISU Timelines • 2011-2012 Assessment Cycle • Assessment plan May 1, 2011. • Assessment findings May 1, 2012. • Action plan December 1, 2012. • Status report May 1, 2013. • 2012-2013 Assessment Cycle • Assessment plan May 1, 2012. • Assessment findings May 1, 2013. • Action plan December 1, 2013. • Status report May 1, 2014.

  18. Aspects Involved in Assessment • Mission Statement • Program Educational/Process Objectives (3-6) • Student Learning Outcomes/Process Outcomes (3-6 for each objective) • Student Learning Outcomes/Process Outcomes Aligned with Practices (Curriculum Mapping) • Assessment Plan: which objectives/outcomes will be assessed when; methods; performance targets • Assessment: Collection and Analysis of Evidence • Evaluation: Interpretation of Evidence • Action Plan

  19. Mission Statement • Links the function of the unit to the overall mission and strategic priorities of ISU • Identifies the program’s purpose • Identifies the primary stakeholders (e.g., students) • Formulating a mission statement: • What is the primary function of the unit? • What are the core activities? • What should those whom you serve experience while/after interacting with your unit?

  20. Outcomes Library Language of assessment: • LEARNING OBJECTIVE: general knowledge/skill/ability a student should have at time of graduation • LEARNING OUTCOME: Specific accomplishments to be achieved. What are you looking for in student performance to tell if they “get it”? • Stated in the form of: <one action verb> + <one something> • In general, aim for 3-6 OBJECTIVES and 3-6 OUTCOMES for each objective.

  21. OBJECTIVE 1: Students will be able to conduct, analyze, and interpret experiments, and apply experimental results to improve processes. • SLO 1.1: Students will develop and execute experiments to validate designs. • SLO 1.2: Students will design and execute test plans as a part of system commissioning. OBJECTIVE 2: Students will function effectively on teams. • SLO 2.1: Student gathers information that relates to the team’s topic. • SLO 2.2: Student shares in the work of the team. • SLO 2.3: Student listens to other teammates. (Could be measured using a rubric during observation and for peer evaluation.)

  22. Curriculum Map • Educational experiences (e.g., courses, internships) mapped to Objectives/Outcomes • Ensure that experiences are present at appropriate levels to support student achievement of each outcome • Communication tool • Faculty identify gaps in the curriculum • Sharing the map with students helps them understand how their courses form a curriculum and support achievement of identified outcomes
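
A curriculum map is, at bottom, a mapping from educational experiences to outcomes, which makes gap-finding mechanical. A minimal sketch with invented course numbers and the SLO labels from the earlier example:

```python
# Hypothetical course -> outcomes mapping (the map's rows).
curriculum_map = {
    "TECH 101": {"SLO 1.1"},
    "TECH 305": {"SLO 1.1", "SLO 1.2"},
    "TECH 490": {"SLO 1.2", "SLO 2.1"},
}
all_outcomes = {"SLO 1.1", "SLO 1.2", "SLO 2.1", "SLO 2.2", "SLO 2.3"}

# Gaps: outcomes with no supporting educational experience.
covered = set().union(*curriculum_map.values())
gaps = all_outcomes - covered
print("Outcomes with no supporting course:", sorted(gaps))
# -> ['SLO 2.2', 'SLO 2.3']: the teamwork outcomes need a curricular home.
```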

  23. Curriculum Map

  24. Communication of Outcomes • A description of how students and other stakeholders are informed about the programmatic learning outcomes. • Identify programmatic stakeholders. • Identify methods for informing constituents about learning outcomes (e.g., program handbooks, syllabi, programmatic documents, web publishing). • The best method of communication for students may not be the best method of communication for other stakeholders. • This is distinct from communication of assessment findings.

  25. Assessment Plan • Which outcomes are assessed? • Include timeline – not all outcomes need be assessed every year. • How are the outcomes assessed? • What methodology is employed to collect assessment data? • Where will outcomes be assessed? • What are the targets for student achievement? • Who is responsible for carrying out each element of the assessment plan (identified by title)?

  26. Assessment Cycles • Developing assessment cycles: • Don’t try to assess every outcome every year. • Develop a timetable for assessment activities. • Identify person(s) responsible for each assessment activity. • Try to avoid random acts of assessment.
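
A simple rotation can turn “don’t assess every outcome every year” into a concrete timetable. The sketch below is hypothetical; the outcome labels reuse the earlier examples and the years follow the ISU cycles above:

```python
from itertools import cycle

outcomes = ["SLO 1.1", "SLO 1.2", "SLO 2.1", "SLO 2.2", "SLO 2.3"]
years = ["2011-2012", "2012-2013", "2013-2014"]

# Rotate outcomes across the cycle so no year carries every outcome.
timetable: dict[str, list[str]] = {year: [] for year in years}
for outcome, year in zip(outcomes, cycle(years)):
    timetable[year].append(outcome)

for year, assessed in timetable.items():
    print(year, "->", ", ".join(assessed))
# 2011-2012 -> SLO 1.1, SLO 2.2
# 2012-2013 -> SLO 1.2, SLO 2.3
# 2013-2014 -> SLO 2.1
```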

  27. Assessment Cycle Table

  28. Assessment Findings • Aggregate your data • Analyze your data • Ask whether the target for achievement was met. • If not, what are the recommendations for improvement? • If you cannot tell from the collected data, revise your assessment plan. • Provide supporting evidence (meeting minutes, etc.) of faculty discussions of the assessment findings and proposed improvements.
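
The aggregate/analyze/compare steps reduce to a short computation once a target is stated. A minimal sketch with invented scores and a hypothetical “80% of students at 3.0 or above” target:

```python
scores = [3.5, 2.0, 3.0, 4.0, 2.5, 3.0, 3.5, 1.5]  # one rubric mean per student
TARGET = 0.80   # at least 80% of students...
CUTOFF = 3.0    # ...score 3.0 or higher

# Aggregate and analyze: proportion of students meeting the cutoff.
proportion_met = sum(s >= CUTOFF for s in scores) / len(scores)
print(f"{proportion_met:.0%} of students met the cutoff (target {TARGET:.0%})")

# Compare to the target and decide the next step.
if proportion_met >= TARGET:
    print("Target met: document the finding and continue monitoring.")
else:
    print("Target not met: draft recommendations for improvement.")
```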

  29. Action Plan • Focus on the findings. • In light of the findings, what will the program do? • Include a timetable for implementing this response. • Evaluate and discuss the resources necessary to support the action plan. • Identify individuals responsible for ensuring that implementation occurs.

  30. Assessment and Evaluation Cycles

  31. Definitions (from G. Rogers, ABET, Program Assessment of Student Learning: Keep it Simple)

  32. Questions?

  33. An Assessment Plan Table
