
Validity


Presentation Transcript


  1. Validity

  2. Validity • The appropriate use and interpretation of test results, rather than the instrument itself. • It is a matter of degree, rather than yes/no. • It is a unitary concept. • Validity is specific to a particular group, use, or purpose of the test score interpretations. • Purpose, population, and similar specifics: reference the test manual.

  3. The Nature of Validity • Construct (latent variable) • Concepts, ideas, or hypotheses used to describe or explain behaviors. • Cannot be measured directly • Cannot be observed • Ex: intelligence, depression, etc. • Threats to Validity • Construct underrepresentation • The construct was measured insufficiently • Construct-irrelevant variance • The test measures variables irrelevant to the construct • Others: the design of the instrument, failure to comply with the manual, test-taker factors, an inappropriate test group

  4. Validity & Reliability • Reliability is a necessary, but not sufficient condition for validity
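A well-known result from classical test theory makes this slide's point concrete: a test's validity coefficient cannot exceed the square root of its reliability, so an unreliable instrument cannot be highly valid, while a perfectly reliable instrument may still measure the wrong construct. A minimal sketch in conventional textbook notation (the symbols below are standard, not taken from the slides):

```latex
% Upper bound on validity from classical test theory
% r_{XY}  : correlation of test X with criterion Y (validity coefficient)
% r_{XX'} : reliability of test X
r_{XY} \le \sqrt{r_{XX'}}
% Example: a reliability of .64 caps the validity coefficient at .80
```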

  5. Sources of Validity Evidence • Content-Related Evidence • The relationship between the content of an instrument and the construct. • Criterion-Related Evidence (concurrent, predictive) • Evidence based on the relationship between test scores and external variables: an existing criterion or a predictive criterion • Construct-Related Validity Evidence • Evidence based on the appropriateness of inferences drawn from test scores as they relate to a particular construct

  6. Content-Related Evidence • Based on response processes • Actions, thought processes, and emotional traits that the test taker invokes in responding to a test. • Face validity • The test appears to be measuring what it claims to be measuring.

  7. Criterion-Related Validity Evidence • Evaluating a criterion • Results of an instrument are correlated with those produced by a well-respected instrument. • Validity Coefficients • Correlation “r” • -1 ≤ r ≤ +1 • Very high: .50 and above • High: .40 - .49 • Moderate/acceptable: .21 - .39 • Low/unacceptable: .20 and below • Prediction errors • Hit/error
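As a rough illustration of the coefficients above, the sketch below correlates scores from a new instrument with scores from an established criterion and labels the result using the slide's bands. The data, the helper name classify_validity, and the use of Python's statistics module (3.10+ for correlation) are illustrative assumptions, not part of the original presentation.

```python
from statistics import correlation  # Pearson r; available in Python 3.10+

def classify_validity(r: float) -> str:
    """Label a validity coefficient using the bands from the slide."""
    r = abs(r)
    if r >= 0.50:
        return "very high"
    if r >= 0.40:
        return "high"
    if r >= 0.21:
        return "moderate/acceptable"
    return "low/unacceptable"

# Hypothetical scores on a new instrument and on a well-respected criterion measure
new_test = [12, 15, 9, 20, 17, 11, 14, 18]
criterion = [30, 34, 25, 41, 36, 27, 31, 38]

r = correlation(new_test, criterion)
print(f"validity coefficient r = {r:.2f} ({classify_validity(r)})")
```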

  8. Construct-Related Validity Evidence • Evidence of homogeneity • Uniformity: the items and components of an instrument are measuring a single concept. • Convergent & discriminant validity evidence
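Evidence of homogeneity is often quantified with an internal-consistency index such as Cronbach's alpha. The sketch below is one possible way to script that check; the formula is standard, but the function name and the made-up response matrix are assumptions added here, not content from the slides.

```python
from statistics import pvariance

def cronbach_alpha(responses: list[list[float]]) -> float:
    """Cronbach's alpha for a score matrix: rows = respondents, columns = items."""
    k = len(responses[0])                     # number of items
    totals = [sum(row) for row in responses]  # each respondent's total score
    item_vars = [pvariance([row[i] for row in responses]) for i in range(k)]
    return (k / (k - 1)) * (1 - sum(item_vars) / pvariance(totals))

# Hypothetical ratings: 5 respondents x 4 items intended to tap the same concept
responses = [
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```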

  9. Evidence Based on Consequences of Testing • Intended and unintended consequences • Gather evidence to show that the positive consequences outweigh the negative consequences. • Ex: No Child Left Behind

  10. Process and Procedures of Testing

  11. Selecting Assessment Instruments and Strategies • Identify the type of information needed • Purposes of testing • Identify available information • Existing information • What is it? Where can you obtain or request it? • Determine methods for obtaining information • Formal assessment vs. informal assessment

  12. Selecting Assessment Instruments and Strategies (cont.) • Search assessment resources • Mental Measurement Yearbook • Publishers’ websites and catalogs • Manuals • Research literature • Internet resources

  13. Evaluate and Select an Instrument • What is the purpose of the instrument? • Who is the intended population? • What is the makeup of the norm group? • Are the results of the instrument reliable? • Do the instrument's results have evidence of validity? • Does the instrument’s manual provide clear and detailed instructions about administration procedures?

  14. Evaluate and Select an Instrument (cont.) • Does the manual provide sufficient information about scoring, interpreting, and reporting results? • Is the instrument biased? • What level of competency is needed to use the instrument? • What practical issues should be considered for the instrument? • Time • Ease of administration • Ease of scoring • Ease of interpretation • Reliability • Cost of the instrument

  15. Administering Assessment Instruments • Before administration • Become acquainted with the instrument • Manual, forms, answer sheets, and relevant materials • Obtain informed consent prior to testing • During administration • Check all materials relevant to testing, as well as the site, seating, and signs • Paper-and-pencil testing vs. computer testing • After administration • Organize testing materials and collect answer sheets • Record any incidents during testing that may affect individuals’ responses to test items.

  16. Interpreting Assessment Results • What type of instrument? • Norm-referenced vs. criterion-referenced • Inter-individual (variation among test takers) vs. intra-individual (variation of the same individual across different sub-scales of the same test) • Be knowledgeable about various types of scores: percentile, standard scores, etc. • Communicate results in language (at a level) that your clients can understand. • Interpret only the scores. Do not carelessly infer beyond the test results or project values onto test scores. • True scores / measurement errors
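The true score / measurement error bullet can be made concrete with the standard error of measurement, SEM = SD × √(1 − reliability), which places a band around an observed score rather than treating it as a single point. The sketch below uses hypothetical numbers and a helper name (score_band) introduced only for illustration.

```python
import math

def score_band(observed: float, sd: float, reliability: float, z: float = 1.96) -> tuple[float, float]:
    """Band around an observed score based on the standard error of measurement."""
    sem = sd * math.sqrt(1 - reliability)
    return observed - z * sem, observed + z * sem

# Hypothetical: observed score of 105 on a scale with SD = 15 and reliability = .91
low, high = score_band(105, sd=15, reliability=0.91)
print(f"Approximate 95% band: {low:.1f} to {high:.1f}")  # about 96.2 to 113.8
```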

  17. Summary • What elements should a counselor consider before choosing an instrument? • What information is needed • What information is available • What approach is appropriate for the situation • What resources are available for the selection of an instrument? • Mental Measurement Yearbook • Publishers’ websites and catalogs • Manuals • Research literature • Internet resources

  18. Feedback Sessions • Oral or written • Comprehensible and useful • Need to consider: • Purposes of the assessment • What it means to the test taker: education, career, etc. • Should take place soon after the assessment

  19. What to Cover in a Feedback Session? • Purposes of the assessment • Concepts involved • Limits of the assessment • Test bias, like ___________ • Assessment error -- ___________ • The result is one of many resources, and not sufficient for assessing any construct fully. • Report results in a manner that clients can understand

  20. Interpreting Scores • Percentile rank • Standard (z) score • T score • Stanine • Grade-equivalent scores • Age-equivalent scores
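For a roughly normal score distribution, several of the score types above (standard scores, T scores, stanines, percentile ranks) are simple transformations of one another. The sketch below applies the usual conversions (z = (raw − mean) / SD, T = 50 + 10z, stanine ≈ 2z + 5 rounded and clipped to 1-9, percentile from the normal CDF); the raw score, mean, and SD are hypothetical.

```python
from statistics import NormalDist

def convert(raw: float, mean: float, sd: float) -> dict[str, float]:
    """Convert a raw score to z, T, stanine, and percentile rank (normal assumption)."""
    z = (raw - mean) / sd
    return {
        "z": z,
        "T": 50 + 10 * z,
        "stanine": min(9, max(1, round(2 * z + 5))),
        "percentile": NormalDist().cdf(z) * 100,
    }

# Hypothetical: raw score 62 on a test with mean 50 and SD 10
print(convert(62, mean=50, sd=10))
# {'z': 1.2, 'T': 62.0, 'stanine': 7, 'percentile': 88.49...}
```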

  21. Summary • What is the range of percentile scores? • A. 1-99 B. 1-100 C. 0-100 • What is the mean of T scores? • A. 50 B. 40 C. 60 • What is the range of stanine scores? • A. 1-9 B. 0-9 C. 0-10

  22. Questions?
