
i2a Institute: Assessing Critical Thinking In Your Course


Presentation Transcript


  1. i2a Institute: Assessing Critical Thinking In Your Course • Cathy L. Bays, PhD, RN, i2a Specialist for Assessment

  2. Objectives • Describe the assessment process. • Articulate the core Essential Intellectual Standards. • Identify methods to assess critical thinking.

  3. Assessment Write in one word what comes to mind when you hear the word “ASSESSMENT”

  4. Assessment vs. Evaluation • Gelmon, S.B., Holland, B.A., Driscoll, A., Spring, A., & Kerrigan, S. (2001). Assessing service-learning and civic engagement: Principles and techniques. Providence, RI: Campus Compact. • Palomba, C.A. & Banta, T.W. (1999). Assessment essentials: Planning, implementing and improving assessment in higher education. San Francisco, CA: Jossey-Bass. • Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass. • Walvoord, B.A. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.

  5. Assessment vs. Evaluation • Palomba, C.A. & Banta, T.W. (1999). Assessment essentials: Planning, implementing and improving assessment in higher education. San Francisco, CA: Jossey-Bass. • Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development.

  6. Assessment vs. Evaluation • Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass. • The terms are used in differing ways: evaluation as one part of the assessment process (interpreting and using the results), evaluation as a broader concept than assessment, or evaluation and assessment as synonyms.

  7. Ideas to Action (i2a) • i2a Evaluation is the systematic collection of information about i2a initiatives and processes and their impact on student learning and development. In this process, baseline, process, and outcome assessments are conducted, and the information is reviewed and used to enhance learning and achieve i2a goals. • The i2a Evaluation Vision is a systematic, ongoing process to evaluate the evidence of undergraduate students' ability to think critically and to connect student learning to community, for the purpose of enhancing the quality of the undergraduate educational experience and documenting accountability to accreditation agencies. • Specific i2a Evaluation Goals include triangulation of meaningful direct and indirect assessments, consistency with the Paul-Elder critical thinking model, evaluation of outcomes and process, "value-added" assessments, and faculty input and participation.

  8. Formative & Summative Assessment • Formative Assessment: The gathering of information about student learning during the progression of a course or program, usually repeatedly, to improve the learning of those students. Example: reading the first lab reports of a class to assess whether some or all students in the group need a lesson on how to make them succinct and informative. • Summative Assessment: The gathering of information at the conclusion of a course, program, or undergraduate career to improve learning or to meet accountability demands. When used for improvement, it impacts the next cohort of students taking the course or program. Examples: examining student final exams in a course to see if certain specific areas of the curriculum were understood less well than others; analyzing senior projects for the ability to integrate across disciplines. • Leskes, A. (2002). Beyond confusion: An assessment glossary. AAC&U Peer Review, 4(2/3).

  9. Assessment Cycle

  10. i2a Evaluation Plan

  11. Universal Intellectual Standards • Used to assess the “quality” of thinking • Where the standards appear: Miniature Guide, pp. 7-10; Analytic Thinking, pp. 9-18; Intellectual Standards, pp. 9+ • 6 “core” standards: Clarity, Accuracy, Precision, Relevance, Depth, Breadth

  12. Standards for Thinking • Clarity: Understandable; the meaning can be grasped • Accuracy: Free from errors or distortions; true • Precision: Exact to the necessary level of detail • Relevance: Relating to the matter at hand • Depth: Containing complexities and interrelationships • Breadth: Involving multiple viewpoints • Logic: The parts make sense together; no contradictions • Significance: Focusing on the important, not the trivial • Fairness: Justifiable, not self-serving (or egocentric) • Source: Richard Paul Keynote, 28th International Conference on Critical Thinking

  13. Standards Activity • Each group has been given 3 standards to discuss. • In your group, discuss the unique characteristics of each standard; in other words, what makes that standard unique or different from the other ones. • Be prepared to share with the group, in 1 or 2 words, what makes each standard unique.

  14. The “Standards” in Action • Art Methods for Elementary and Middle School • Creative production must be critically assessed. Our artworks can be viewed in light of the following: • Clarity: Is my artwork unclear? • Accuracy: Is it accurate? • Precision: Is it imprecise? • Relevance: Is it irrelevant? • Depth: Is it superficial? • Breadth: Does it have breadth, or is it too narrow? • Logic: Is it illogical? • Significance: Is it trivial?

  15. The “Standards” in Action • Social Theory Class • Assignment Criteria: A. Setting the Theoretical Framework: With accuracy, precision, clarity, depth, and breadth (see the Miniature Guide to Critical Thinking), explain Durkheim’s theory of suicide, including the concepts of integration, regulation, collective consciousness/conscience collective, social solidarity, and ritual. Also explain his ontological assumptions, his epistemology, and his goals in creating this theory. (30 points) B. Presenting the Data. C. Analyzing the Data. D. Using Logic: drawing conclusions based on your data. E. Implications.

  16. Measures • Global vs. Discipline/Content-Specific • Direct measures: Palomba & Banta: “…require students to display their knowledge & skills…”; Suskie: “…tangible, visible, self-explanatory evidence of exactly what students have and haven’t learned.” Examples: tests, papers, presentations, clinical evaluations • Indirect measures: Palomba & Banta: “…ask students to reflect on their learning rather than to demonstrate it.”; Suskie: “…signs that students are probably learning, but the evidence of exactly what they are learning is less clear and less convincing.” Examples: surveys, course grades

  17. Critical Thinking Measures • Course activities (e.g., SEE-I), assignments (e.g., with rubrics), evaluations • Instruments: • Watson-Glaser http://pearsonassess.com/HAIWEB/Cultures/en-us/Productdetail.htm?Pid=015-8191-013 • California Critical Thinking Disposition & Skills http://www.insightassessment.com/ • Basic Concepts & Skills Test http://www.criticalthinking.org/resources/assessment/index.cfm • Critical Thinking Assessment Test http://www.tntech.edu/cat/ • CAAP, CLA, MAPP, NSSE • Intellectual Traits Inventory (Speed School of Engineering)

  18. Rubrics • Stevens, D.D. & Levi, A.J. (2005). Introduction to rubrics. Sterling, VA: Stylus. • Rubric: In general, a rubric is a scoring guide used in subjective assessments. A rubric implies that a rule defining the criteria of an assessment system is followed in evaluation. A rubric can be an explicit description of performance characteristics corresponding to a point on a rating scale. A scoring rubric makes explicit the expected qualities of performance on a rating scale or the definition of a single scoring point on a scale. From “A Short Glossary of Assessment Terms” at http://serc.carleton.edu/sp/library/assessment/glossary.html • Types: Holistic (critical thinking) and Analytic (grading)

  19. Rubric Development • Planning: define the project and its objectives. Questions to ask: 1. What skills will students need to have or develop to successfully complete the project? 2. What evidence can students provide in this project that would show they have accomplished what you hoped they would accomplish when you created the project? 3. What are the highest expectations you have for student performance on this project overall? 4. What is the worst fulfillment of the project you can imagine, short of simply not turning it in at all? • Development: choose dimensions (key content or behaviors) and their weighting, and choose scales (levels), numeric or behavioral, with an even vs. odd number of points • Application: score the project • Revision: refine the rubric with use • A code sketch of this weighted-dimension structure follows below.
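To make the Development step concrete, here is a minimal sketch of an analytic rubric as a data structure, written in Python. It is illustrative only and not part of the presentation: the dimension names, weights, level labels, and descriptors are hypothetical, and the score function simply normalizes a weighted sum of per-dimension ratings to a 0-100 scale.

```python
# Illustrative sketch only (not from the presentation): an analytic rubric
# modeled as weighted dimensions, each with descriptors per scale level.
# All names, weights, and descriptors below are hypothetical.
from dataclasses import dataclass

@dataclass
class Dimension:
    name: str                     # key content or behavior being judged
    weight: float                 # relative importance of this dimension
    descriptors: dict[int, str]   # scale level -> observable description

# An even number of levels (here 4) forces raters off the midpoint;
# an odd number (e.g., 5) would allow a neutral middle score.
LEVELS = {1: "Beginning", 2: "Developing", 3: "Proficient", 4: "Exemplary"}

rubric = [
    Dimension("Clarity", 0.4, {
        1: "Meaning cannot be grasped",
        2: "Meaning grasped only with effort",
        3: "Mostly understandable",
        4: "Fully understandable",
    }),
    Dimension("Accuracy", 0.6, {
        1: "Major errors or distortions",
        2: "Several errors",
        3: "Minor errors",
        4: "Free from errors or distortions",
    }),
]

def score(ratings: dict[str, int]) -> float:
    """Weighted analytic score, normalized to a 0-100 scale."""
    total_weight = sum(d.weight for d in rubric)
    raw = sum(d.weight * ratings[d.name] for d in rubric)
    return 100 * raw / (total_weight * max(LEVELS))

# Example: one student project rated on each dimension.
print(score({"Clarity": 3, "Accuracy": 4}))  # -> 90.0
```

A holistic rubric, by contrast, would collapse the dimensions into a single descriptor per level and assign one overall rating rather than a weighted sum.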

  20. Rubric Examples • Rubistar http://rubistar.4teachers.org/index.php • Examples: • Washington State http://wsuctproject.wsu.edu/ctr.htm • Miami University http://www.units.muohio.edu/led/Assessment/Assessment_Basics/Rubrics.htm • Foundation for Critical Thinking • CEHD & Speed • PEACC & REACH • Discussion question: Do students see the evaluation/grading rubric before they complete the project?

  21. i2a Assessment http://louisville.edu/ideastoaction/what/assessment
