Discover how to create tests that assess learner performance and instructional effectiveness based on criteria, with various types and techniques explained.
Developing Assessment Instruments Instructional Design: Unit 3, Design Phase
Criterion-Referenced Tests • Designed to measure explicit behavioral objectives • Allow instructors to determine how well learners have met the objectives that were set forth • Used to evaluate: • learner performance • effectiveness of the instruction
Criterion-Referenced • Also called objective-referenced or domain-referenced • Refers directly to an explicit “criterion,” or specified performance • A criterion-referenced test must: • match each test item to a performance objective • indicate the learner’s degree of mastery of the skill
Types of Criterion-Referenced Tests Dick, Carey, and Carey discuss four types of criterion-referenced tests that fit into the design process: • Entry Behaviors Test • Pretest • Practice Tests • Posttests
Types of Criterion Tests Entry behaviors test: • Consists of items that: • measure the required entry behavior skills, rather than the skills to be taught • are drawn from the skills below the entry behavior line • Helps determine the appropriateness of the required entry skills • Used during the formative evaluation process; may be discarded in the final version of the instruction
Types of Criterion Tests Pretest: • Used to determine whether learners have previously mastered some or all of the skills to be included in the instruction • The entry behaviors test determines whether students are ready to begin the instruction; the pretest determines which skills in the main instructional analysis students may already have mastered
Types of Criterion Tests Practice test: • Provides active learner participation during instruction • Enables learners to rehearse the new knowledge and skills they are being taught • Allows instructors to provide corrective feedback that keeps learners on track
Types of Criterion Tests Posttest: • Administered following instruction; parallel to the pretest • Assesses all of the objectives, with a focus on the terminal objectives • Helps identify ineffective instructional segments • Used during the design process; may eventually be modified to measure only the terminal objectives
Exercise • Using the instructional analysis diagram on this slide, indicate by box number(s) the skills that should be used to develop test items for: • Entry behaviors test: ………. • Pretest: ………… • Posttest: ……….. [Diagram: instructional analysis chart — boxes 1–4 sit below the entry behavior line (entry behaviors); boxes 5–14 are the skills for instruction.]
Designing Tests for Learning Domains • Intellectual & Verbal Information • paper & pencil, short-answer, matching, and multiple-choice. • Attitudinal • state a preference or choose an option • Psychomotor • performance quantified on checklist • subordinate skills tested in paper-and-pencil format
Determining Mastery Levels • Approach #1 • mastery defined as the level of performance normally expected from the best learners • arbitrary (norm-referenced, group-comparison methods) • Approach #2 • mastery defined in statistical terms: performance beyond mere chance • the required mastery level varies with how critical the task is • example: nuclear work vs. painting a house • mastery is the level required in order to be successful on the job
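Approach #2’s “beyond mere chance” criterion can be made concrete with a short calculation. The sketch below (a hypothetical `chance_mastery_cutoff` helper, not part of the Dick, Carey, and Carey text) finds the lowest passing score on a multiple-choice test that a pure guesser would rarely reach:

```python
from math import comb

def chance_mastery_cutoff(n_items: int, p_guess: float, alpha: float = 0.05) -> int:
    """Smallest score whose probability of being reached by pure
    guessing is below alpha (binomial tail probability)."""
    for cutoff in range(n_items + 1):
        # P(score >= cutoff) when every answer is a random guess
        tail = sum(comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
                   for k in range(cutoff, n_items + 1))
        if tail < alpha:
            return cutoff
    return n_items

# 20 four-option multiple-choice items: guessing succeeds with p = 0.25,
# so a mastery cutoff of 9 or more can rarely be reached by chance alone
print(chance_mastery_cutoff(20, 0.25))  # → 9
```

Note that this only rules out guessing; the actual cutoff should still be raised for critical tasks, as the slide’s nuclear-work example suggests.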
Writing Test Items What should test items do? • Match the behavior of the objective • Use the correct “verb” to specify the behavior • Match the conditions of the objective
Writing Test Items How many test items do you need? • Determined by the learning domain • Intellectual skills require three or more items per objective • For objectives covering a wide range of content, use a random sample of items
Writing Items (continued) What types (true/false, multiple choice, etc.) should you use? • Clues are provided by the behavior listed in the objective • Review “Types of Test Items” in this chapter (p. 148) for each test: • Entry behaviors test • Pretest • Practice test • Posttest
Writing Items (continued) Item types tempered by: • amount of testing time • ease of scoring • amount of time to grade • probability of guessing • ease of cheating, etc. • availability of simulations
Writing Items (continued) What types are inappropriate? • true/false for a definition • it tests discrimination, not the definition itself • Acceptable alternatives when the “best possible” item type cannot be used: • for simulations, have learners list the steps
Constructing Test Items Consider: • vocabulary • setting of the test item (familiar vs. unfamiliar) • clarity • all necessary information • avoiding trick questions • double negatives, misleading information, etc.
Other Factors • Sequencing Items • Consider clustering by objective • Test Directions • Clear and concise • General • Section specific • Evaluating Tests / Test Items
Measuring Performance, Products, & Attitudes • Write directions to guide learner activities • Construct an instrument to evaluate those activities • a product, a performance, or an attitude • Sometimes includes both a process and a product
Test Directions for Performance, Products, & Attitudes • Determine the: • Amount of guidance • Special conditions • time limits, special steps, etc. • Nature of the task (i.e., complexity) • Sophistication level of the audience
Assessment Instruments for Performance, Products, & Attitudes • Identify which elements are to be evaluated • cleanliness, finish, tolerance of the item, etc. • Paraphrase each element • Sequence the items on the instrument • Select the type of judgment for the rater • Determine how the instrument is scored
Formats for Assessments of Performance, Products, & Attitudes • Checklist • Rating Scale • Frequency Counts • Etc.
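As a rough illustration of how the checklist and rating-scale formats differ in scoring, here is a Python sketch; the element names and scoring functions are hypothetical, not taken from the source:

```python
def score_checklist(ratings: dict) -> float:
    """Checklist: each element is a yes/no judgment; the score is the
    fraction of elements marked 'yes'."""
    return sum(ratings.values()) / len(ratings)

def score_rating_scale(ratings: dict, max_points: int = 5) -> float:
    """Rating scale: each element gets graded points; the score is the
    total as a fraction of the maximum possible."""
    return sum(ratings.values()) / (max_points * len(ratings))

checklist = {"cleanliness": True, "finish": True, "tolerance": False}
scale = {"cleanliness": 4, "finish": 5, "tolerance": 2}

print(round(score_checklist(checklist), 2))   # → 0.67 (2 of 3 elements present)
print(round(score_rating_scale(scale), 2))    # → 0.73 (11 of 15 possible points)
```

The rating scale preserves more information per element, at the cost of requiring more judgment (and training) from the rater.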
Evaluating Congruency • Skills, Objectives, & Assessments should refer to the same behaviors • To check for congruency • Construct Congruency Evaluation Chart • include: Subskills, Behavioral Objectives, & Test Items
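One way to automate the congruency check described above is to store each chart row as a record and flag rows where a subskill lacks a behavioral objective or any test item. The row data and `incongruent_rows` helper below are hypothetical illustrations:

```python
# Hypothetical congruency evaluation chart:
# each row pairs a subskill with its behavioral objective and test items
chart = [
    {"subskill": "5", "objective": "Select relevant sites", "test_items": ["item_1"]},
    {"subskill": "6", "objective": "Judge source credibility", "test_items": []},
]

def incongruent_rows(chart):
    """Return subskills whose row lacks an objective or any test item,
    i.e. where skill, objective, and assessment are not aligned."""
    return [row["subskill"] for row in chart
            if not row["objective"] or not row["test_items"]]

print(incongruent_rows(chart))  # → ['6'] (subskill 6 has no test item yet)
```

A full check would also verify that each item’s behavior and conditions match the wording of its objective, which still requires human review.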
Example • Objective 1: Given a research topic and a list of ten Google search results, select the three web sites most appropriate to the research topic. • What will they need to do? The learners should be able to select web sites from a list of search results. • What conditions will need to be provided? The learners will need to be given a predetermined research topic and a list of actual Google search results related to that topic. • Domain: Intellectual Skills — Rules. Students have to apply a set of criteria in order to make a decision. • This objective will require a fill-in-the-blank test item, as the students will have to write down the three most appropriate sites based on certain criteria. • Test Item 1: • Take a look at the following Google search results: (show screen capture of search results). Which 3 web sites are likely to have specific and relevant information dealing with the subject of Life on Mars?