
Evaluating Your STEM Outreach Program


Presentation Transcript


  1. Evaluating Your STEM Outreach Program MISO Spring Workshop May 7, 2012 MISO Data Analytics Team Jeni Corn, Tricia Townsend, Alana Unfried http://miso.ncsu.edu/

  2. Goals of the Session • Understand how the program evaluation process can help you improve your STEM programs • Ask evaluation questions that are useful to your STEM programs • Identify a variety of useful data sources for your STEM programs

  3. Agenda • Introductions • The Evaluation Process - Presentation • Asking Good Evaluation Questions - Small group discussion • Identifying Data - Presentation, small group discussion • Q & A - Whole group discussion

  4. Formative and Summative Evaluation Just like formative and summative assessments … “When the cook tastes the soup, that's formative evaluation. When the guests taste the soup, that is summative evaluation.” ~ Bob Stake

  5. Keep it Simple and Focused A basic chart is a great way to organize your own thinking and to easily share the plan with others. • An evaluation doesn’t have to be big! Match the number of evaluation questions in your plan to your resources. • Focus on efficient, effective data collection strategies. [Two-column chart: Evaluation Questions | Data Sources]

  6. Program Evaluation Steps 1-5, Repeat! • Identify the critical elements of the STEM program (logic model). • Ask important questions about your STEM program (evaluation questions). • Identify what data are available to help answer your questions and determine the additional data needed (MISO surveys, NCERDC data, program-level data). • Collect, analyze, and interpret data to answer your questions. • What changes to your STEM program should you make based on your results? Repeat steps 1-5!!

  7. Program Evaluation Steps 1-5, Repeat! • Identify the critical elements of the STEM program (logic model). • Ask important questions about your STEM program (evaluation questions). • Identify what data are available to help answer your questions and determine the additional data needed (MISO surveys, NCERDC data, program-level data). • Collect, analyze, and interpret data to answer your questions. • What changes to your STEM program should you make based on your results? Repeat steps 1-5!!

  8. What is a logic model? A logic model is a graphic representation of the relationships among the key elements of a project: inputs, strategies, objectives, long-term goals. • Helps to articulate the key elements of the project. • Enables evaluation efficiency and effectiveness. • Promotes stakeholder buy-in by helping clarify how the project works. • Drafting one can be a great way to involve stakeholders in planning.

  9. What is a logic model?

  10. Program Evaluation Steps 1-5, Repeat! • Identify the critical elements of the STEM program (logic model). • Ask important questions about your STEM program (evaluation questions). • Identify what data are available to help answer your questions and determine the additional data needed. • Collect, analyze, and interpret data to answer your questions. • What changes to your STEM program should you make based on your results? Repeat steps 1-5!!

  11. Developing evaluation questions • The process for identifying the questions to be answered by the evaluation is critical. • Evaluation questions provide the direction and foundation for the entire evaluation. [Diagram: EVALUATION QUESTIONS form the base, supporting Data Collection, then Data Analysis, then Results.]

  12. Developing evaluation questions Why do we need to ask good questions? To … • Determine what is really important and to whom • Project leaders, program participants (teachers, students, and their parents), etc. • Focus data collection efforts • What do we need to find out? • How can we collect that information? • Who is the best person to collect that information?

  13. Developing evaluation questions The main types of evaluation questions are: • Questions about STRATEGIES: these questions ask about how well the strategies were implemented. • Questions about OBJECTIVES: these questions ask about impacts. Logic models are great guides for developing evaluation questions.

  14. Developing evaluation questions Quick tips for writing good questions: • Try to avoid simple “yes or no” questions • Consider QUANTITY questions, e.g.: • “How many” • “How much” • “How often” • Consider QUALITY questions, e.g.: • “How well” • “How effectively” • “In what ways” • Stay tuned in to unexpected results.

  15. Developing evaluation questions • IMPLEMENTATION questions: • How many hours of sleep am I getting each week? (quantity) • How soundly am I sleeping? (quality) • IMPACT questions: • How much weight have I lost? (quantity) • How has my stress level changed? (quality)

  16. Developing evaluation questions Not every evaluation question can be answered - finding the answers costs time, money, and people. Pick the most important questions that provide the most valuable information to users.

  17. Small Group Activity With a partner(s) at your table, use the sample STEM program logic model to: • Brainstorm 2-3 implementation questions about the program’s strategies. E.g., Strategy: Teachers will engage in face-to-face and online professional development. Quantity Questions: What percentage of teachers attend the PD regularly? What was the total number of hours teachers attended PD each month? Over the course of the program? Quality Question: How do teachers rate the professional development? • Brainstorm 2-3 impact questions to evaluate how well the outcomes are being met. Whole group share-out.

  18. STEM Program Implementation Questions [Worksheet with two columns: Quantity Questions | Quality Questions]

  19. STEM Program Impact Questions [Worksheet with two columns: Quantity Questions | Quality Questions]

  20. Program Evaluation Steps 1-5, Repeat! • Identify the critical elements of the program (logic model). • Ask important questions about your program. • Identify what data are available to help answer your questions and determine the additional data needed. • Collect, analyze, and interpret data to answer your questions. • What changes to your program should you make based on your results? Repeat steps 1-5!!

  21. Data in Evaluations For each evaluation question, what information are you going to gather in order to answer it? • Consider a wide variety of data types and sources – both quantitative and qualitative. • What data do you already have? • What data do you need? • How much time, money and/or other resources will it cost to collect the data? • Make a calendar of what you’ll need, from whom, by when. REMEMBER: Data must be interpreted, not just analyzed.

  22. Data in STEM Evaluations MISO Instruments: • Student STEM Attitudes Surveys (Upper Elementary; Middle/High) • Teacher STEM Attitudes Surveys (Elementary; Science; Technology; Engineering; Mathematics) Evaluation Questions: • To what extent did students’ interest in STEM careers increase? • Did inquiry-based learning increase student engagement? • How did teachers’ self-efficacy for teaching STEM content change?
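
A minimal sketch of one way a "change" question like those above could be answered: compare mean survey scores before and after the program. The scores and the 1-5 scale here are invented for illustration and are not drawn from the MISO instruments.

```python
# Hypothetical pre/post student interest scores (mean of Likert items, 1-5 scale).
# All values are invented for illustration; they are not MISO data.
pre_scores = [3.1, 2.8, 3.5, 3.0, 2.9, 3.4]
post_scores = [3.6, 3.0, 3.9, 3.2, 3.1, 3.8]

pre_mean = sum(pre_scores) / len(pre_scores)
post_mean = sum(post_scores) / len(post_scores)

print(f"Pre-program mean interest:  {pre_mean:.2f}")
print(f"Post-program mean interest: {post_mean:.2f}")
print(f"Average change:             {post_mean - pre_mean:+.2f}")
```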

  23. Data in Evaluations “Qualitative data are measurements that cannot be measured on a numerical scale; they can only be classified into one of a group of categories.” Qualitative data are analyzed for patterns or themes. There are many sources of qualitative data; those common in education evaluation include: • Interviews and focus groups with teachers • Interviews and focus groups with students • Open-ended questions on surveys or questionnaires • Open-ended assessments • Portfolios of student work or other performance artifacts • Open-ended classroom observation notes • Journals, logs or other artifacts of project activities
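
One hedged illustration of looking for patterns in qualitative data: after open-ended responses have been read and assigned theme codes by a person, a simple tally shows which themes recur most often. The theme labels below are invented for the example.

```python
from collections import Counter

# Theme codes assigned (by a human reader) to open-ended survey responses.
# The codes and their frequencies are invented purely for illustration.
coded_responses = [
    "hands-on activities", "more time needed", "hands-on activities",
    "mentor support", "hands-on activities", "more time needed",
]

theme_counts = Counter(coded_responses)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} responses")
```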

  24. Data in Evaluations “Quantitative data are measurements that are recorded on a naturally occurring numerical scale.” Quantitative data are analyzed using descriptive or inferential statistics; descriptive statistics (absolute numbers, percentages, averages, etc.) are the most common and straightforward type. There are many sources of quantitative data; those common in education evaluation include: • Demographics • Grade-level information • Years of teaching experience • Standardized assessment scores • Scaled questions on surveys • Scale-scored classroom observations • Graduation rates • Rates of course-taking or course completion
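
A small sketch of the descriptive statistics mentioned above (counts, percentages, averages), using invented program-level numbers rather than real data:

```python
from statistics import mean

# Invented program-level data, for illustration only.
teachers_enrolled = 24
teachers_completed_pd = 21
student_scores = [72, 85, 90, 66, 78, 88, 95, 70]

completion_rate = teachers_completed_pd / teachers_enrolled * 100
print(f"Teachers completing PD: {teachers_completed_pd} of {teachers_enrolled} "
      f"({completion_rate:.0f}%)")
print(f"Average student assessment score: {mean(student_scores):.1f}")
```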

  25. Data in Evaluations Beware of common data traps! • Biting off more than you can chew • Not collecting data needed to answer important questions • Collecting data that is not really useful • Neglecting hard-to-quantify data • Not formalizing “informal data” (e.g., anecdotes, unrecorded observations) • Not using valuable data after it has been collected

  26. Small Group Activity • With a partner(s) at your table, select either the evaluation questions you developed or 4-5 questions shared during report out. • Brainstorm 1-3 data sources and collection strategies (how & when) you could use to answer each question. • Examples: • How many teachers attended the PD regularly? Attendance counts collected after each PD session throughout the entire evaluation time period. • How do participants rate the PD sessions? Interviews with teachers collected once or twice throughout the entire evaluation period. Feedback forms administered and collected after each session. Observations of each session. Whole group share-out.
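
Continuing the attendance example from this activity, here is a hypothetical sketch of turning per-session attendance counts into a regular-attendance figure. The session names, counts, and enrollment are invented for illustration.

```python
# Invented per-session attendance counts for a cohort of 20 enrolled teachers.
enrolled = 20
session_attendance = {"Sep": 18, "Oct": 17, "Nov": 15, "Dec": 16, "Jan": 19}

for session, attended in session_attendance.items():
    print(f"{session}: {attended}/{enrolled} ({attended / enrolled:.0%})")

average_rate = sum(session_attendance.values()) / (enrolled * len(session_attendance))
print(f"Average attendance rate across sessions: {average_rate:.0%}")
```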

  27. STEM Evaluation Data Sources

  28. STEM Evaluation Data Sources

  29. Q & A Thank You
