
Strategies, Tips, and Tools for Facilitating Learning Outcomes Assessment



Presentation Transcript


  1. Student Learning Outcomes Strategies, Tips, and Tools for Facilitating Learning Outcomes Assessment Jerry Rudmann, Irvine Valley College, February 2008

  2. Overview - Instruction • Fine-tuning assessment • Item analysis primer • Calibrating rubric scoring • Tips for writing surveys • Helpful technology tools • Clickers - promote active learning and record SLO information • PDF Acrobat forms - autoscoring and recording student input • Portfolios - making students responsible and reflective • Scanning - some ideas • eTablets • Rubric generators - a way to measure most anything • Excel templates • CCC Confer - makes dialogue easier • Calibrated Peer Review • Tracking software - organizing all this stuff • Several options / strategies for making SLOs meaningful • Address “robust” SLOs (overarching outcomes) • Problem focus • Less is better • Share SLOs with students • Use what you already have • Think of SLOs in the context of student development • Qualitative assessment is OK

  3. Some Options / Strategies for Making SLOs Meaningful • Address “robust” SLOs (overarching outcomes) • Problem focus • Less is better • Share SLOs with students • Use what you already have • Think of SLOs in the context of student development • Qualitative assessment is OK • Others…

  4. General Tip 1: Robust SLOs • Developed through faculty dialog • Behavioral/measurable • Real-world • Higher-level • Conditions • Performance Criteria • Global, over-arching • Scored with rubric

  5. General Tip 2: Problem Focus Approach • What concepts or competencies do students have difficulty mastering? • Focus SLO activities on problem areas.

  6. General Tip 3: Keep It Simple But Meaningful • Corollary - Often, less is better.

  7. General Tip 4: Student Development Approach • Student development • Academic self-efficacy (Bandura) • Academic self-regulation • Campus involvement (Astin) • Mentoring professor studies • Student Services DO help student success

  8. A Closer Look at Objective Test Items

  9. Item Considerations • Reliability • Item Difficulty • Validity • Level of assessment (Bloom’s taxonomy) • Tips from ETS • Recommendations

  10. What is an Assessment? • In the most general sense, an assessment is a sample of behavior • Achievement • Aptitude • Personality • For assessing SLOs, an assessment is a finite set of objective format items

  11. Assessment and Item Analysis Concerns We must consider the properties of the items that we choose • Is your assessment reliable? • Item difficulty level • Examine performance of distracters • Is your assessment valid?

  12. Improving Reliability of Assessment • Use several similar items that measure a particular skill or area of knowledge. • Seek items giving high item/total test score correlations. Each item should correlate well with the assessment’s total score.

  13. Said in another way… • When you are giving a test that measures a specific characteristic (e.g., knowledge of the steps for completing a financial aid form; knowledge of theoretical perspectives in psychology; knowledge of the components of a neuron), the items on the test should be intercorrelated (i.e., have “internal consistency” - they relate with one another). • Only when the items relate to one another can we be confident that we are measuring the characteristic we intended to measure (i.e., when the test has internal consistency, it is a reliable measure of that characteristic and is more likely to be valid).

  14. How Do We Determine Internal Reliability? • Examine correlations between each item score and the total test score—this is one way to assess “internal consistency” • You are correlating students’ “pass” vs. “fail” status on each item with students’ overall test scores. • This analysis indicates whether the item and total scores are assessing the behavior in the same way. • In general, items should be answered correctly by those obtaining high total scores (thus there should be a positive correlation). • In your final test, select only those items that have high positive internal correlations.
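
A minimal Python sketch of the item-total correlation check described above, assuming a hypothetical 0/1 scored response matrix (rows are students, columns are items). It uses the "corrected" total that leaves out the item being examined, so an item is never correlated with itself.

```python
# Minimal sketch: corrected item-total correlations for a 0/1 scored test.
# The response matrix is invented for illustration; 1 = correct, 0 = incorrect.
import statistics

def pearson_r(xs, ys):
    """Plain Pearson correlation, sufficient for a class-sized sample."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

responses = [  # 6 students x 5 items (hypothetical data)
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1],
]

for item in range(len(responses[0])):
    item_scores = [row[item] for row in responses]
    # "Corrected" total: exclude the item itself from the total score.
    rest_totals = [sum(row) - row[item] for row in responses]
    r = pearson_r(item_scores, rest_totals)
    print(f"Item {item + 1}: item-total r = {r:.2f}")
```

Items with low or negative item-total correlations are the candidates for revision or removal.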

  15. Item Difficulty • Knowing item difficulty permits constructing a test in a specific way, with specific characteristics. • Difficulty is based on the proportion of persons who pass or correctly answer the item. • The greater the proportion, the easier the item. • What is the optimum level of item difficulty?
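
For illustration, a minimal sketch of the difficulty calculation, with invented 0/1 scored responses.

```python
# Item difficulty = proportion of examinees answering the item correctly.
# The scored responses below are hypothetical.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
]

n_students = len(responses)
for item in range(len(responses[0])):
    p = sum(row[item] for row in responses) / n_students
    print(f"Item {item + 1}: difficulty p = {p:.2f}")
```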

  16. Item Difficulty - Prediction • If you are assessing achievement, proficiency, or mastery of subject matter, AND the results will be used in studies or examinations for prediction, then strive for an average item difficulty of .50 (and each item should not deviate much from this). • With .50 difficulty, more “discriminations” are possible, so you get the maximum variance among test scores, which leads to better reliability and validity.
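
The "maximum variance at .50" point follows from the variance of a 0/1 item being p(1 - p); the tiny sketch below just tabulates that, assuming nothing beyond the dichotomous item model.

```python
# Variance of a dichotomous (0/1) item is p * (1 - p), which peaks at p = 0.50.
# That is why an average difficulty near .50 spreads total scores out the most.
for p in [0.10, 0.30, 0.50, 0.70, 0.90]:
    print(f"p = {p:.2f}  item variance = {p * (1 - p):.4f}")
```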

  17. Item Difficulty - Competency • If you are interested in classification (e.g., mastery or not of most of the material in the course), then use the proportion that represents that standard. • If you deem 80% on an exam as “mastering the material,” then use .80 as the average difficulty level • Some items will be higher and some lower, but the average would be .80.

  18. Item Analysis: Validity • Test Validity: Relationship between total test scores and scores on an outside variable • Item Validity: Relationship between scores on each of the items and some external criterion. • Most are concerned with test validity, but test validity is a function of item validity.

  19. More on Item Validity • Create external criterion groups: e.g., those with high total scores (say the upper 27%) and those with low total scores (say the lower 27%). Items on the test (say, one meant to predict school aptitude) that are passed by a significantly greater number in one group than in the other are the more effective items. • To select items, calculate the “discrimination index” (D): the difference between the number of correct responses in the high (H) and low (L) groups. If 80 high scorers answered an item correctly while 10 low scorers did, then D = H – L = 70. Select items with high positive D values (especially for achievement or aptitude tests) for inclusion in the final form of the test. D can also be computed from proportions rather than counts, which makes it independent of sample size.
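
A minimal sketch of the upper/lower-group discrimination index described on this slide. The 27% split is the one mentioned above; the scored responses are invented for the example, and D is computed from proportions so group size drops out.

```python
# Discrimination index D: difference in proportion correct between the top and
# bottom scoring groups (commonly the upper and lower 27% on total score).
def discrimination_index(responses, item, fraction=0.27):
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(len(ranked) * fraction))
    high, low = ranked[:k], ranked[-k:]
    p_high = sum(row[item] for row in high) / k
    p_low = sum(row[item] for row in low) / k
    return p_high - p_low  # proportion difference, independent of sample size

responses = [  # hypothetical 0/1 scored responses, 8 students x 5 items
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 1],
    [1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 1, 0],
]

for item in range(len(responses[0])):
    print(f"Item {item + 1}: D = {discrimination_index(responses, item):+.2f}")
```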

  20. Tips for Writing Multiple-Choice Questions • You CAN test more than recognition and recall. • Applying Bloom’s Taxonomy.

  21. Bloom’s Taxonomy & Objective Test Items

  22. A Lower Order Question Obsessive-compulsive disorder is characterized by which primary symptom? • Hallucination • Memory loss • Intense specific fear • Delusion • Unwanted repetitive thoughts*

  23. Lower Order Question, Type 2 Which disorder is characterized by unwanted, intrusive thoughts and repetitive behavior? • Phobia • Obsessive-compulsive disorder* • Dissociative identity disorder • Major depressive disorder • Schizophrenia

  24. Creating Higher-Order Questions • The question requires students to mediate their answers by doing an extra step they had not previously learned in their studies. • Students must transfer recalled knowledge to a new situation, break apart and reassemble concepts in new ways, or combine content of two areas in novel ways to answer a question. • Not always easy to distinguish between application and analysis questions

  25. A student who misses deadlines in school while striving for perfection may be exhibiting symptoms of which of the following disorders? • Phobia • Obsessive-compulsive disorder* • Dissociative identity disorder • Major depressive disorder • Schizophrenia

  26. Gene is always late for school because he spends an hour organizing his closet each morning. Which of the following treatments would be most effective for Gene’s problem? • In-depth interpretation of dreams • Electroconvulsive therapy • Medication affecting serotonin levels* • Systematic desensitization • Regular exposure to bright lights

  27. Tips from ETS • Whenever possible, write items in positive form. • Don’t include “teaching” in the stem. • Use plausible distracters. • Can you give a reason why each distracter is not an acceptable response? • The stem should be a complete question or statement. • The correct answer should be about the same length as the distracters. • Items should not test trivial information; the point being tested should be one worth testing.

  28. ETS Tips on Distracters • Should be reasonable. • May include misconceptions and errors typical of less prepared examinees. • May include truisms or rules-of-thumb that do not apply to or satisfy the problem requirements. • Negative stems should be avoided; stems that include “EXCEPT,” “NOT,” or “LEAST” can be difficult to process. Never use negatives in both the stem and the options. • Avoid using “All of the above” as a distracter.

  29. Conclusions and Recommendations • Take care when writing and/or selecting items from a test bank. • Look for at least some items that test higher levels of Bloom’s Taxonomy. • After the test, have your best students critique it and find items needing revision. • When selecting software (clicker, scanner, survey, test), consider the item analysis capability that comes with the software and factor that into your purchase decision: • Pass/fail rate for each item • Percentage breakdown for all distracters • Discrimination index (high versus low scorers) • Item correlation with total score
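
As an illustration of the “percentage breakdown for all distracters” item in the checklist above, here is a minimal sketch that tallies how often each option was chosen on one item; the keyed answer and the response letters are invented.

```python
# Distracter analysis: percentage of examinees choosing each option on an item.
# A distracter that nobody picks (or that the strongest students pick) usually
# needs revision. The responses below are hypothetical.
from collections import Counter

item_responses = ["B", "E", "E", "A", "E", "C", "E", "B", "E", "D"]  # key = "E"
counts = Counter(item_responses)

for option in "ABCDE":
    pct = 100 * counts.get(option, 0) / len(item_responses)
    flag = " (keyed answer)" if option == "E" else ""
    print(f"Option {option}: {pct:.0f}%{flag}")
```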

  30. Final Word on Valid Assessment • Try using different methods of assessing learning • Converging evidence • This increases the overall validity of assessment • Example • Embedded assessment (multiple choice quizzes, exams) • Authentic assessment (students apply the skill) • Students self-rate their ability • Students post evidence in ePortfolio

  31. Surveys

  32. Surveys - SLO Uses • Students self-rate their competencies on program or college level learning outcomes. • Students’ satisfaction with various student services.

  33. Types of Questions • Open-ended – respondents answer in their own words • Closed-ended – respondents are limited to a finite range of choices

  34. Types of Questions • Open-ended: flexible; answers are hard to code; good for preliminary work to finalize a survey • Closed-ended: answers are easier to code, process, and analyze; good closed-ended items are hard to write

  35. Item Format • Visual Analogue Scale: Food in the cafeteria is… Poor _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ Excellent • Likert Scale: Food in the cafeteria is outstanding! SD D N A SA (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree)

  36. Nine Tips for Designing and Deploying a Survey • Don’t call it a survey • Provide a carefully worded rationale or justification at the beginning • Group items by common format • Start with more interesting items • Put demographic items last • Mix in negative wording to catch acquiescence (aka “response set”) • Automate scoring when possible • If asking for sensitive information, use procedures designed to assure anonymity • Always, always, always pilot test first
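
Two of the tips above (automate scoring, mix in negative wording) combine naturally. The sketch below shows one hypothetical way to score a five-point Likert survey with reverse-coded, negatively worded items; the item names, scale codes, and the set of reverse-scored items are assumptions for the example.

```python
# Likert scoring sketch: SD/D/N/A/SA coded 1-5, with negatively worded items
# reverse-scored so a higher total always means a more positive response.
SCALE = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}
REVERSE_SCORED = {"q3"}  # e.g., a negatively worded item such as "The food is poor."

def score_response(answers):
    total = 0
    for item, choice in answers.items():
        value = SCALE[choice]
        if item in REVERSE_SCORED:
            value = 6 - value  # flip 1<->5 and 2<->4 on a 5-point scale
        total += value
    return total

one_student = {"q1": "A", "q2": "SA", "q3": "D", "q4": "N"}
print(score_response(one_student))  # 4 + 5 + (6 - 2) + 3 = 16
```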

  37. Survey Administration Methods • Face to face • Written: group administration, mail • Computerized (e.g., http://research.ccc.cccd.edu): password protected, validation rules, branching and piping • Telephone

  38. Focus Groups • Focus groups can be especially insightful and helpful for program- and institutional-level learning outcome assessment. • Have your college researcher provide some background materials. • Focus Groups: A Practical Guide for Applied Research, by Richard A. Krueger and Mary Anne Casey. • The RP Group has sponsored several “drive-in” workshops over the last few years.

  39. Goal for This Section • Technology Uses • Technology Tools • Expected Outcome: Be able to select and use technology-based approaches to assess student learning outcomes

  40. Assessment Challenges • Assessing Students in Large Classes • Assessing Performance at a Distance • Minimizing Subjectivity in Assessment • Creating Authentic Assessments • Engaging Students in Self-Evaluation • Accommodating Diverse Learning Styles • Assessing Conceptual Thinking • More Efficient Assessment

  41. Technology Tools • CCC Confer (Web Conferencing) • Online Rubric Builders • eLumen (SLO Assessment/Tracking) • Calibrated Peer Review (CPR) • Classroom Responders (“Clickers”) • Scannable and Online Tests • ePortfolio • Adobe Acrobat Forms • Excel Spreadsheets

  42. CCC Confer • Small-group work in project-based learning • Involving ALL instructors in the department’s SLO dialogue http://www.cccconfer.org

  43. CCC Confer Screen Shot

  44. Rubrics • A way to measure the heretofore immeasurable: products and performances. • A rubric breaks the assessment into important components. • Each component is rated along a well-labeled scale.
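
As a rough sketch of how "components rated along a well-labeled scale" can be represented and scored in software, here is a hypothetical rubric expressed as data with a weighted total. The components, level labels, and weights are invented for illustration (a resume rubric, as developed on the next slide, might look something like this).

```python
# Minimal rubric sketch: each component is rated on a labeled 1-4 scale, and the
# ratings combine into a weighted score. Components and weights are hypothetical.
RUBRIC = {
    "formatting": {"weight": 0.25, "labels": {1: "cluttered", 4: "clean and consistent"}},
    "content":    {"weight": 0.50, "labels": {1: "vague duties", 4: "quantified accomplishments"}},
    "mechanics":  {"weight": 0.25, "labels": {1: "many errors", 4: "error-free"}},
}

def rubric_score(ratings):
    """ratings: component name -> level (1-4). Returns a weighted score on the 1-4 scale."""
    return sum(RUBRIC[name]["weight"] * level for name, level in ratings.items())

print(rubric_score({"formatting": 3, "content": 4, "mechanics": 2}))  # 3.25
```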

  45. Let’s Develop an Assessment Rubric for a Resume

  46. Chocolate Chip Cookie Rubric

  47. Rubrics are Good! • Facilitate staff dialogue regarding satisfactory performance. • Create a more objective assessment. • Make expectations more explicit to the student. • Encourage the metacognitive skill of self-monitoring one’s own learning. • Facilitate scoring and reporting of data.
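
The overview mentions calibrating rubric scoring; one simple calibration check is the agreement rate between two raters who scored the same work with the same rubric. The sketch below computes exact and within-one-level agreement on invented ratings; the ratings and the choice of agreement measure are assumptions, not part of the presentation.

```python
# Calibration check sketch: exact and adjacent agreement between two raters who
# scored the same set of papers with the same 1-4 rubric. Ratings are hypothetical.
rater_a = [3, 4, 2, 4, 1, 3, 2, 4]
rater_b = [3, 3, 2, 4, 2, 3, 2, 3]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs) / len(pairs)
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
print(f"Exact agreement: {exact:.0%}, within one level: {adjacent:.0%}")
```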

  48. Online Discussion Rubric http://www.uas.alaska.edu/sitka/IDC/resources/onlineDiscussionRubric.pdf

  49. Design Your Own Rubric • Please work in groups and use the worksheet in your packet to design a scoring rubric for assessing one of the following: • Coffee shop (not café) • Syllabi • Customer service at retail stores • Grocery stores • Online courses

  50. Online Rubric Builders • Rubrics to guide and measure learning • Tools • Rubistar http://rubistar.4teachers.org • Landmark Rubric Machine http://landmark-project.com/rubric_builder
