Differential Success or Different Populations: Variations Across Sites within Penny Harvest Program Evaluation E. Christine Baker-Smith Christopher C. Weiss Vanessa G. Ohta Quantitative Methods in the Social Sciences (QMSS) Institute for Social and Economic Research and Policy Columbia University 30 October 2010
Outline/Overview • Previous Research on Service Learning • Evaluating Service Learning – The Penny Harvest Program in New York City & Columbus, Ohio • The Penny Harvest Program • Evaluation/Research Design • Using Factor Analysis to assess “program success” • Factor Analysis Theory • Our Factors • Variations by site
Previous Research on Service Learning • Evaluation of Service Learning and Civic Instruction has grown in the past decade: • Service learning programs enhance a host of outcomes: • Personal efficacy and enhanced social skills • Academic learning • Critical thinking skills • Sense of social responsibility • Civic engagement • Increased later service (Astin & Sax, 1998; Smith 2007; Billig 2000, 2003, 2004)
Previous Research on Service Learning • Yet many of the research designs used have significant limitations: • Focus on older students – usually middle school and high school age. • Artifact of program focus – most service learning programs are oriented toward older students. • However, some research documents benefits of service learning for younger grades. • e.g., the Michigan Learn and Serve study finds greater benefit for students in grades 2-5 than for those in older grades. (Billig 2004; Billig and Klute 2003; Smith 2007)
Previous Research on Service Learning • Likewise, most research focuses on one particular group using: • Cross-sectional analysis • Self-selected samples • Single location analysis lacking comparative possibilities • Are benefits location and program-site specific? • Program fidelity, site comparability, etc. • Are there ways to compare programs and sites with regard to outcomes?
Evaluating Service Learning: The Penny Harvest Program in NYC • One evaluation of a program focused on younger grades: the Penny Harvest Program • Founded in 1991 by Common Cents, the Penny Harvest is an inclusive, year-round program designed to develop community values through real-world service experiences. • The Penny Harvest Program consists of four integrated stages.
Scale and Recognition • The Common Cents Penny Harvest is now in over 850 New York City schools, serving over half a million children. • The program and curriculum guide have been internationally recognized: • Named one of four top programs in an international competition on civic engagement by Bertelsmann LP • Recognized by Bridgespan as a leader in the youth development field. • Common Cents was recently selected as a key partner in NYC Service, Mayor Bloomberg’s initiative to increase civic participation amongst all New Yorkers. • The organization has replicated the model in six (primarily urban) learning sites across the country: • New York region; New York City; Columbus, Ohio; Denver, CO; Seattle, WA; Florida
Penny Harvest • Stage 1: PENNY HARVEST – Kick-off, Wheels of Caring, Collection • Stage 2: ROUNDTABLE – Student Leaders, Group Decision-Making, Grant Awards • Stage 3: SERVICE – Service in Community • Stage 4: REFLECT & PLAN – Check Presentation Ceremony, Reflection Time • “Transforming the multi-million-dollar resource of idle pennies into the philanthropic property of children”
How the Common Cents Method Unites a School Community: Dynamic Development (diagram connecting Common Cents/affiliate partners, PH coach(es), community-based organizations, teachers, PH student leaders, and the school student body)
Program Goals for Multiple Levels • Program goals (growth, implementation of model) • Student goals • Teacher goals • Community goals
Student Goals • Efficacy • Connectedness • Ethical Development
Student Goals and Measures (diagram linking the measures – Academic Achievement, Engagement, Behavior in School, Voice/Capacity, Helping Others, Community Orientation, Self v. Others – to the goals of Efficacy, Connectedness, and Ethical Development)
Evaluating Service Learning: Research Design • Measurement techniques: questions drawn from 3 sources • 1. University of California at Berkeley’s Service-Learning Research and Development Center’s Civic Responsibility Survey / National Survey of Student Engagement (Furco, Muller, & Ammon, 1998) • 2. MacArthur School Engagement Survey (Blumenfeld, 1998) • 3. Penny Harvest Pilot Roundtable Survey, 2005-06 • Benefits: • Generalizability (sources 1 & 2): used in multiple locations nationally; used with varied student groups by age, gender, ethnicity, and urbanicity • Internal Validity (source 3): used on a similar population to test the same program; allows modification of the instrument with specific regard to the unique program
Evaluating Service Learning: The Penny Harvest Program in NYC – Research Design • Selected a sample of schools based on a set of criteria: • Geographic location • Characteristics of the student population • Features of the program in the school • Then invited grades 3, 4, and 5 in these schools to participate; all students with parent permission were surveyed. • Approximately 500 student interviews in Year 1 (fall & spring) across approximately 10 schools.
Evaluating Service Learning: The Penny Harvest Program in NYC – Research Design • Issue of self-selection and comparison groups: • All schools in NYC are eligible for participation. • Factors related to whether a school participates are also likely related to program outcomes. • Limits comparability of non-program schools. • We focus only on schools with programs in this analysis. • Concentrate on variation of program effects. • Comparisons are not to other schools in NYC, but to sites where the questions we use have been fielded.
Evaluating Service Learning: The Penny Harvest Program in Ohio – Research Design • Selected a sample of schools based on a set of criteria: • Geographic location • Characteristics of the student population • Features of the program in the school • Then invited grades 3, 4, and 5 in these schools to participate; all students with parent permission were surveyed. • Approximately 500 student interviews in Years 1 & 2 (fall & spring) across approximately 6 schools.
Evaluating Service Learning: Research Design • Pretest/Posttest evaluation design • Issues with pre/post testing in NYC: • Program in operation for over 15 years – only a true pre-test for 3rd graders* • Importance of a true pre-test . . . Columbus, Ohio • Program begins in 2008 • Initial survey conducted before the program “kicks off” • + Accomplishes a true pre-test for all grades • – Potential problems with program fidelity in the first years
Evaluating Service Learning: Analysis • Measures of student attitudes and behaviors – all created using principal-component factor (P.C.F.) analysis: • Student engagement • Behavior in school • Voice/Capacity • Helping others (other orientation)
Factor Analysis (Kim & Mueller, 1982) • Common objective: represent a group of variables in terms of a smaller number of hypothetical variables. • Exploratory: data reduction • Confirmatory: test theoretical hypotheses • Factor loadings • Standardized coefficients (a minimal computational sketch follows)
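To make the approach concrete, the following is a minimal sketch of exploratory principal-component factoring in Python. The item names, the 1-4 response range, and the simulated data are placeholders for illustration only, not the evaluation's actual data or variable names.

```python
# Minimal sketch of exploratory principal-component factoring (PCF).
# All column names and data below are hypothetical.
import numpy as np
import pandas as pd

# Hypothetical Likert-type responses (1-4), one row per student.
rng = np.random.default_rng(0)
items = pd.DataFrame(
    rng.integers(1, 5, size=(500, 6)),
    columns=["attention", "homework", "bored", "fun", "excited", "like_school"],
)

# 1. Correlation matrix of the items.
R = items.corr().to_numpy()

# 2. Eigendecomposition; retain components with eigenvalue > 1 (Kaiser rule).
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = max(1, int(np.sum(eigvals > 1.0)))

# 3. Loadings: standardized correlations between each item and each factor.
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(pd.DataFrame(loadings, index=items.columns,
                   columns=[f"factor{i + 1}" for i in range(n_factors)]))
```

In practice a rotation (e.g., varimax) would typically be applied before interpreting the loadings; the sketch omits it for brevity.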
“Student Engagement” • Created from student responses to seven questions (reverse-scored item noted): • I pay attention in class. • I complete my homework on time. • I feel bored in school. (scale reversed) • My classroom is a fun place to be. • I feel excited by the work at school. • I like being at school. • I am interested in the work at school.
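Negatively worded items such as “I feel bored in school” are reverse-scored before entering the factor analysis, so that higher values always indicate more engagement. A minimal sketch, assuming a 1-4 Likert coding and hypothetical column names (the survey's actual coding may differ):

```python
# Reverse-scoring sketch; the 1-4 range and column names are assumptions.
import pandas as pd

SCALE_MIN, SCALE_MAX = 1, 4      # assumed Likert response range
REVERSED = ["bored"]             # negatively worded item(s)

def prepare_engagement_items(responses: pd.DataFrame) -> pd.DataFrame:
    """Flip reverse-worded items so a high score always means more engagement."""
    out = responses.copy()
    for col in REVERSED:
        out[col] = (SCALE_MIN + SCALE_MAX) - out[col]
    return out

# Example: a student answering 4 ("very bored") becomes 1 after reversal.
demo = pd.DataFrame({"attention": [3], "homework": [4], "bored": [4],
                     "fun": [2], "excited": [3], "like_school": [3]})
print(prepare_engagement_items(demo))
```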
Comparison of Loadings: Engagement (table of factor loadings for the engagement items, NYC vs. Ohio)
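One way a side-by-side comparison like this can be produced is to fit the same one-factor principal-component model separately to each site's responses and juxtapose the loadings. The data, sample sizes, and column names below are hypothetical placeholders.

```python
# Sketch of a cross-site loading comparison; all data here are simulated.
import numpy as np
import pandas as pd

def pcf_loadings(items: pd.DataFrame, n_factors: int = 1) -> pd.DataFrame:
    """First n_factors principal-component factor loadings for a set of items."""
    R = items.corr().to_numpy()
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]
    top = order[:n_factors]
    loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
    return pd.DataFrame(loadings, index=items.columns,
                        columns=[f"factor{i + 1}" for i in range(n_factors)])

rng = np.random.default_rng(1)
cols = ["attention", "homework", "bored", "fun", "excited", "like_school"]
nyc = pd.DataFrame(rng.integers(1, 5, size=(300, 6)), columns=cols)
ohio = pd.DataFrame(rng.integers(1, 5, size=(200, 6)), columns=cols)

comparison = pd.concat(
    {"NYC": pcf_loadings(nyc)["factor1"], "Ohio": pcf_loadings(ohio)["factor1"]},
    axis=1,
)
print(comparison)  # loadings side by side, one row per survey item
```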
Behavior in School • Created from student responses to eight questions: • I help people who are picked on. • I share things with others. • I work very well with other students. • I find ways to solve problems that are fair. • I pay attention in class. • I complete my homework on time. • I get in trouble at school. • I follow the rules at school.
Voice/Capacity • Created from student responses to five questions: • I am interested in doing something about the problems in my school or neighborhood. • If I work hard I can accomplish my goals. • I help people who are picked on. • I help others with their schoolwork. • I talk to other students about helping our school or neighborhood.
Helping Others • Created from student responses to ten questions: • I think all students should learn about problems in their neighborhood or city. • People who have problems should only turn to their family for help. • I think communities should take care of people who can’t take care of themselves. • I think you should help all people, not just people you know. • It’s better to work on a problem with a group than to work alone. • I share things with others. • I help people who are picked on. • I help others with their schoolwork. • I find ways to solve problems that are fair. • I cheer up people who are feeling sad.
Helping Others (diagram grouping the scale items into Theory vs. Action)
Further Lessons Learned • Survey design for young audiences: Theory vs. Action • Timing and surveying children: school-year and standardized-testing constraints • New York vs. Ohio: population testing for surveys; different populations interpret differently; engagement variations vs. behavior
Community Orientation • Created from student responses to ten questions: • I talk to other students about helping our school or neighborhood. • I help others with their schoolwork. • I work very well with other students. • I help people who are picked on. • I share things with others. • I think you should help all people, not just people you know well. • I am interested in doing something about problems in my school or neighborhood. • It’s important for all students to help out their school or community. • It’s better to work on a problem with a group than to work alone. • I think all students should learn about problems in their neighborhood or city.
Self vs. Others Orientation • Created from student responses to eight questions: • It’s sometimes hard for me to talk when I’m in a group of people. • It’s better to work on a problem with a group than to work alone. • I am interested in doing something about problems in my school or neighborhood. • I share things with others. • I work very well with other students. • I help others with their schoolwork. • I talk to other students about helping our school or neighborhood. • I help people who are picked on.
Evaluating Child-Service Programs LIMITATIONS: • Sample size • Sampling bias • Age • Little understanding of how younger children are affected • Longevity of study • Difficult to determine true program effects • Difficult to see changes in program effects (deterioration, augmentation, etc.) • Difficulty in defining a good program/success