Measuring Up on College-Level Learning

Presentation Transcript


  1. Measuring Up on College-Level Learning Margaret Miller, Project Director September 2003

  2. Measuring Up 2000

  3. Learning in the States: Incomplete [Add state map on incomplete]

  4. State Efforts to Measure Learning (taxonomy: Peter Ewell, Change magazine)
     • Certification of individual students, e.g., Texas’s TASP, Florida’s CLAST
     • Institutional assessment for improvement, e.g., Tennessee's performance measures, Missouri’s accountability program
     • Campus-based assessment
     • Institutional assessment for accountability, e.g., S. Dakota and Arkansas

  5. National Attention to College-Level Learning
     • Pew’s Quality of Undergraduate Education and writing assessment projects
     • Association of American Colleges and Universities’ general education assessment project
     • Council for Higher Education Accreditation’s project on institutional effectiveness
     • Secretary's Commission on Achieving Necessary Skills (SCANS) skills
     • Equipped for the Future
     • National Skills Standards Board

  6. Key Questions
     • What do the state’s college-educated citizens know, and what can they do that contributes to the social good? What kind of educational capital do they represent?
     and

  7. Key Questions (cont.)
     • How well do the state’s public and private, two- and four-year colleges and universities collectively contribute to that capital? What do those whom they educate know, and what can they do?

  8. Key Decisions
     • Whose learning will we measure?
     • What learning will we measure?
     • How will we use the information?
     • What strategies will we pursue?

  9. Whose Learning
     • The college-educated in the states and college students

  10. What Learning
     • National Education Goal 6: “By the year 2000, every adult American will be literate and will possess the knowledge and skills necessary to compete in a global economy and exercise the rights and responsibilities of citizenship”

  11. What Learning (cont.)
     • National Goal 6, objective for college education: “By the year 2000, every adult American will be literate and will possess the knowledge and skills necessary to compete in a global economy and exercise the rights and responsibilities of citizenship”

  12. Policy Purposes
     • Higher education policy and K-12 education + economic development + adult literacy policy

  13. Direct Strategies
     • National Assessment of Adult Literacy
     • Graduate-admissions and licensing exams
     • General intellectual skills tests

  14. National Assessment of Adult Literacy (NAAL), concludes 12/03
     Disadvantages:
     • Labor-intensive, expensive
     • Decadal federal survey (timing)
     • National sample only, except in 6 states
     • Not what colleges think they teach
     Advantages:
     • Advanced literacy levels are a good measure of educational capital
     • Assesses general population
     • Comparison group of non-college-educated
     • Household survey: respondent motivation high

  15. Existing Exams
     • Graduate-admissions exams: Dental, Graduate Management, Graduate Record, Law School, Medical College, Optometry, Pharmacy
     • Licensing exams: Clinical Pathology, Dental Hygiene, Occupational Therapy, Physical Therapy, Physician Assistant, Nursing, Respiratory Therapy, Teaching

  16. Existing Exams, data gathered by 03/04
     Disadvantages:
     • Selection bias
     • Uneven coverage by discipline
     • Variable (and sometimes small) numbers of test-takers in each state
     • Most in health professions
     Advantages:
     • Established, credible instruments
     • Highly motivated test-takers
     • Admissions tests assess general intellectual abilities
     • Availability
     • Low cost

  17. General Intellectual Skills Tests, administered fall 03
     • WorkKeys to a sample of two-year students in each state: Applied Math, Locating Information, Reading for Information, Business Writing
     • Collegiate Learning Assessment (CLA) to a sample of four-year students in each state

  18. WorkKeys and CLA
     Disadvantages:
     • Institutional motivation
     • Test-taker motivation
     • Expense
     Advantages:
     • Excellent tests of general & functional intellectual skills
     • Can impart useful information to student and school

  19. Indirect Measures (NSSE/CCSSE co-administered with tests; CRS summer through fall 03)
     • National Survey of Student Engagement (NSSE)
     • Community College Survey of Student Engagement (CCSSE)
     • College Results Survey (CRS)

  20. Surveys
     Disadvantages:
     • Not direct learning measures
     • Not yet cross-correlated with direct measures
     Advantages:
     • Excellent and recently developed instruments
     • Process measure could lead to improvement
     • Both have face validity
     • Respondent motivation good

  21. Challenges
     • Political instability in states: gubernatorial, SHEEO
     • Personnel changes among key players
     • Institutional skepticism
     • Faculty resistance
     • Data-collection hurdles
     • Test-taker motivation

  22. General Timeline
     • Measuring Up 2002: model tested with incomplete data from Kentucky
     • 2002-2004: five-state pilot to test the assessment model: IL, KY, NV, OK, SC
     • Measuring Up 2004: publish the results of the pilot
     • Measuring Up 2006: if enough states adopt the model, grade states on learning

  23. Reasons to Act
     • It is the right thing to do.
     • We can determine how to do it right.
     • This initiative will generate information useful to states, institutions, and students.
     • State-level analysis can promote collaborations to serve underachieving subpopulations or regions of the state.
     • State resources can be effectively targeted.

  24. http://collegelevellearning.org
