
Daniel L. Stufflebeam C. I. P. P. Evaluation Model



Presentation Transcript


  1. Daniel L. Stufflebeam C. I. P. P. Evaluation Model

  2. CIPP Model Objectives: • Be familiar with Stufflebeam’s educator background • Understand Stufflebeam’s CIPP model • Be able to discuss the HRD “essence” of the CIPP model

  3. CIPP Model Pre-Test 1. What do the letters CIPP stand for? 2. What profession is Daniel L. Stufflebeam? 3. Name the three major steps for any evaluation. 4. Draw the matrix for the four decision-making settings.

  4. CIPP Model Pre-Test 5. Describe the General Evaluation Model. 6. Classify each evaluation type within the ends, means, intended and actual matrix. 7. Name the four evaluation types and their decision-making purpose.

  5. CIPP Model Stufflebeam Biography • Daniel Leroy Stufflebeam, educator • Born in Waverly, Iowa, September 19, 1936 • BA, State University of Iowa, 1958 • MS, 1962, and PhD, 1964, Purdue University; postgraduate study, University of Wisconsin, 1965

  6. CIPP Model Stufflebeam Biography • Professor and Director, Ohio State University Evaluation Center, Columbus, 1963-1973 • Professor of education and Director, Western Michigan University Evaluation Center, Kalamazoo, 1973- • Author of monographs and 15 books; contributor of chapters to books and of articles to professional journals

  7. CIPP Model • Recipient of the Paul Lazarsfeld Award, Evaluation Research Society, 1985 • Member, American Educational Research Association, National Council on Measurement in Education, American Evaluation Association • Served with the United States Army, 1960 • Children: Kevin D., Tracy Smith, Joseph

  8. CIPP Model Key Components: 1. Evaluation definition 2. Three major steps for any evaluation 3. Decision-making settings 4. Types of decisions 5. General evaluation model 6. Types of evaluation 7. Total evaluation model

  9. CIPP Model Definition: Evaluation is the process of delineating, obtaining and providing useful information for judging decision alternatives

  10. CIPP Model Definition Key Terms: • Evaluation: ascertainment of value • Decision: act of making up one’s mind. Then, from the decision-maker’s viewpoint: evaluation is the process of ascertaining the relative value of competing alternatives

  11. CIPP Model Evaluation is: • Decision-making driven • A systematic and continuing process • Made up of 3 major steps/methodologies: 1. Delineating 2. Obtaining 3. Providing

  12. CIPP Model Definitions of Evaluation Steps: 1. Delineating - focusing the requirements for information to be collected through specifying, defining and explicating

  13. CIPP Model Definitions of Evaluation Steps: 2. Obtaining - making information available through processes such as collecting, organizing and analyzing, and through means such as statistics and measurement 3. Providing - fitting the information together into systems or sub-systems that best serve the needs or purposes of the evaluation
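As a reading aid (not part of the original slides), the three steps can be pictured as a small pipeline. The sketch below is a minimal Python illustration in which the function names and toy data are assumptions, chosen only to show how delineating, obtaining and providing feed into one another.

```python
# Illustrative sketch of the three evaluation steps (slides 11-13).
# Function names and data are hypothetical, not Stufflebeam's terminology.

def delineate(activities):
    """Step 1: specify and define the information the decision requires."""
    return [f"information need: {activity}" for activity in activities]

def obtain(requirements):
    """Step 2: collect, organize and analyze that information."""
    return {req: "collected and analyzed data" for req in requirements}

def provide(information):
    """Step 3: fit the information into a form that serves the decision-maker."""
    return {"evaluation report": information}

activities = ["classroom instruction", "teacher training"]
report = provide(obtain(delineate(activities)))
print(report)
```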

  14. CIPP Model Decision-Making Settings [2 x 2 matrix: Information Grasp (high/low) by Degree of Change (small/large); high grasp + small change = Homeostatic, low grasp + small change = Incremental, low grasp + large change = Neomobilistic, high grasp + large change = Metamorphic]

  15. CIPP Model Decision-Making Settings - Key Points: • Driven by the relation between the useful information available and the degree of change to be effected • The importance and consequences of the decision to be made drive how extensive the evaluation must be • When little information is available, or it is not in a useful form, a more extensive evaluation is required

  16. CIPP Model Decision-Making Setting Definitions 1. Metamorphic - a utopian, complete change in the educational system made with full information/knowledge of how to effect the desired changes (low probability) 2. Homeostatic - small, remedial changes that restore the educational system to its normal state, guided by technical standards and routine data collection systems (prevalent “quality control” with low risk)

  17. CIPP Model Decision-Making Setting Definitions 3. Incremental - continuous improvement in an educational system intended to shift the program to a new norm (rather than correcting back to a norm, as in the homeostatic setting), but guided by little available knowledge and ad-hoc/special-project in nature (allows “innovation” in a trial-and-error, iterative manner with acceptable risk, since small corrections can be made as problems are detected)

  18. CIPP Model Decision-Making Setting Definitions 4. Neomobilistic - innovative activities aimed at major change/new solutions to significant problems in an educational system, but supported by little theory and little knowledge; driven by great and compelling opportunities such as the knowledge explosion, critical conditions or world competition (becoming more prevalent in response to the need for higher rates of change where the risk is judged worthwhile)
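To make the four-way classification on slides 14-18 concrete, here is a minimal sketch, assuming a simple Python lookup keyed on the two dimensions from slide 14 (information grasp and degree of change); the function name and string values are illustrative, not part of the model.

```python
def classify_setting(information_grasp: str, degree_of_change: str) -> str:
    """Illustrative mapping of the four decision-making settings (slides 14-18)."""
    settings = {
        ("high", "small"): "homeostatic",    # routine quality control, low risk
        ("low",  "small"): "incremental",    # trial-and-error continuous improvement
        ("low",  "large"): "neomobilistic",  # innovative change with little theory or knowledge
        ("high", "large"): "metamorphic",    # utopian complete change (low probability)
    }
    return settings[(information_grasp, degree_of_change)]

# Example: a small remedial correction backed by routine data collection
print(classify_setting("high", "small"))  # -> homeostatic
```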

  19. CIPP Model Types of Decisions [2 x 2 matrix: Ends vs. Means crossed with Intended vs. Actual]

  20. CIPP Model Types of Decisions Matrix: • Forms a model of all the decision-making categories an educational system may need, with mutually exclusive cells (ends vs. means, intended vs. actual) • Provides for a generalizable evaluation design model

  21. CIPP Model General Evaluation Model [figure: system activities feed an evaluation made up of the 3 major steps (1. delineating, 2. obtaining, 3. providing), which in turn informs decisions about the system activities]

  22. CIPP Model Types of Evaluation: Context Evaluation - to determine objectives Input Evaluation - to determine program design Process Evaluation - to control program operations Product Evaluation - to judge and react to program attainments

  23. CIPP Model Types of Decisions and Evaluations [the same ends/means by intended/actual matrix, with each evaluation type placed in the cell of the decisions it serves]
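As an aid for pre-test question 6 (an addition, not a transcription of the slide), the conventional placement of each evaluation type in the intended/actual by ends/means matrix can be written as a small lookup table, paired with the purposes listed on slide 22.

```python
# Hypothetical sketch: each CIPP evaluation type placed in the
# (intended/actual, ends/means) matrix with its decision-making purpose.
CIPP_MATRIX = {
    "context": {"cell": ("intended", "ends"),  "purpose": "determine objectives"},
    "input":   {"cell": ("intended", "means"), "purpose": "determine program design"},
    "process": {"cell": ("actual",   "means"), "purpose": "control program operations"},
    "product": {"cell": ("actual",   "ends"),  "purpose": "judge and react to program attainments"},
}

for name, entry in CIPP_MATRIX.items():
    timing, focus = entry["cell"]
    print(f"{name.title()} evaluation -> {timing} {focus}: {entry['purpose']}")
```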

  24. CIPP Model Evaluation Design: • Evaluations are designed after a decision has been made to effect a system change, and the actual evaluation design is driven by the decision-making setting • Generally, the greater the change and the lower the information grasp, the more formal, structured and comprehensive the evaluation required

  25. Evaluation Type Objectives: CONTEXT EVALUATION • Provides rationale for determination of objectives • Defines relevant environment • Describes desired and actual conditions of environment • Identifies unmet needs • Identifies unused opportunities

  26. Evaluation Type Objectives: INPUT EVALUATION • Determines how to use resources • Assesses capabilities of responsible agency • Assesses strategies for achieving objectives • Assesses designs for implementing a selected strategy

  27. Evaluation Type Objectives: PROCESS EVALUATION • Detect or predict defects in procedure design or its implementation • Provide information for programming decisions • Maintain record of the procedure as it occurs

  28. Evaluation Type Objectives: PRODUCT EVALUATION • Measure attainments • Interpret attainments • Done as often as necessary during the program life

  29. A Total Evaluation Model: 1. Follows the general evaluation model’s relationships between activities, evaluation and decisions, and uses the 3 major steps for any evaluation 2. Needs a full-time program evaluator

  30. A Total Evaluation Model: 3. Needs a continuous and systematic context evaluation process, sponsored by the program planning body, for the purpose of deciding whether to change or continue with program goals and objectives 4. Initiates specific, ad-hoc input, process and product evaluations only after a planning decision to effect a system change

  31. A Total Evaluation Model: 5. Specific evaluation designs vary according to the setting for the change • Homeostatic (small changes with adequate information) • Incremental (low information for small changes) • Neomobilistic (low information for large changes) • (Metamorphic is excluded since it has only theoretical relevance)
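Points 3-5 can be summarized in one short sketch (an illustration under assumed names, not the model’s formal statement): context evaluation runs continuously, the other three evaluations start only after a planning decision to change the system, and the design’s formality follows the change setting.

```python
def plan_total_evaluation(setting: str, change_decided: bool) -> list:
    """Illustrative outline of the total evaluation model (slides 29-31)."""
    plan = ["continuous, systematic context evaluation"]   # always running (point 3)
    if change_decided:                                      # only after a planning decision (point 4)
        plan += ["input evaluation", "process evaluation", "product evaluation"]
    # Point 5 (and slide 24): larger change and less information => more formal design
    formality = {
        "homeostatic":   "routine, low-formality design",
        "incremental":   "ad-hoc design with iterative small corrections",
        "neomobilistic": "formal, structured, comprehensive design",
    }
    plan.append(f"design guideline: {formality[setting]}")
    return plan

print(plan_total_evaluation("incremental", change_decided=True))
```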

  32. CIPP Model HRD Essence • HRD viewpoint • Formative - Summative • Evaluation traditions

  33. CIPP Model HRD Viewpoint • Discrepancy • Democratic • Analytical • Diagnostic - CIPP: a logical, research-based approach to the total training system

  34. CIPP Model Formative - Summative • Context and Input evaluations: formative • Process and Product evaluations: summative

  35. CIPP Model Evaluation Traditions • Scientific - 1950’s • Systems - 1970’s (CIPP) • Qualitative - 1980’s • Eclectic - late 1980’s

  36. Post-Test 1. What do the letters CIPP stand for? 2. What profession is Daniel L. Stufflebeam? 3. Name the three major steps for any evaluation. 4. Draw the matrix for the four decision-making settings. 5. Describe the General Evaluation Model. 6. Classify each evaluation type within the ends, means, intended and actual matrix. 7. Name the four evaluation types and their decision-making purpose.
