Update on RAINS (Research, Analysis and Insight into National Standards)


Presentation Transcript


  1. Update on RAINS (Research, Analysis and Insight into National Standards)
  Martin Thrupp, NZPF President's Moot, 22 March 2013

  2. Today’s session • 1. RAINS progress, features and early international feedback • 2. Illustrate some of the power and reach of the project • ‘Doing it to ourselves’ • ‘Comparability of OTJs across schools’

  3. The RAINS project – a three-year study of six diverse schools as they respond to the National Standards policy
  • First report (2012): Researching Schools' Enactments of New Zealand's National Standards Policy
  • Second report (out soon): Understanding New Zealand's Very Local National Standards
  • Final report (November 2013)
  • A small project that makes the most of its advantages:
    • Rich data that illustrates the issues in schools
    • Longitudinal – getting to see changes over time
    • One researcher has oversight of all cases
    • Time to assemble background material and develop insights
  • RAINS lead teachers and principals
  • International and national advisory groups

  4. Profs David Berliner (USA), Barbara Comber (Australia), David Hursh (USA), Margie Hohepa, Bob Lingard (Australia), Meg Maguire (UK), Martin Thrupp and other local panellists. Facilitated by Lester Flockton. Wellington, 22–24 January 2014. Registrations open soon.

  5. What are international reviewers saying? (1) The NS campaign as inspiring and worthwhile
  As a non-New Zealander (and indeed someone far out of the region) I appreciated this [paper's findings] on a number of levels. It was heartening to read about the resistance to standardization and assessment movements to which other countries (including my own) long ago submitted. It was also fascinating to see the extent to which teachers and administrators seem to protest the labeling of children as 'below' etc. – again not something we see protests about in my country, despite even worse labels.

  6. What are international reviewers saying? (2) It makes sense there would be problems
  The paper is great. Real problems are addressed/discussed in the cases of the various schools. It seems like [the NS policy] is not turning out well. Sad, but almost predictable.
  National Standards are designed specifically to incapacitate comparisons which rank individual schools; they were not meant to be published but they are. Because they use a standardized method of representing students' achievements, they enable exactly the comparisons for which they provide no grounds.

  7. Impact of NS varies widely: 'It depends'
  • 'Tell me about your school and I'll tell you about its response to National Standards.'
  • Perspectives and responses often 'make sense' when seen against context and are more nuanced than the debate over National Standards usually allows.
  • Some schools have done very little to respond to National Standards; other schools have done a lot more.

  8. Where there are adverse impacts we will often be doing it to ourselves (performativity)
  One of my [experienced] staff sent me an email last term saying that we should cut out all extraneous things in the afternoon so we can focus on numeracy and literacy. I said 'you're buying into what you and I both hate'. But she's under pressure to improve the results of her kids. And there is pressure put on by the leadership team to improve results, there has to be. Because we are in the business of making a difference. (Cicada principal, Nov 2012)
  When you hear teachers speak, and we had another conversation around art, and other areas, and it's like 'well we don't really have time for that'. (Cicada lead teacher, Nov 2012)

  9. I had a teacher [from another school] in yesterday and she asked 'what do you call this for writing?', and we found we had a different level, and they had been talking to intermediate schools and they were worried and… If it's so unfair, who cares? I can't compare them because I just know too much that they are doing different things in every school. So why worry about them. But I do know other people will analyse those lists and take them at face value… but it's not honest data. (Seagull lead teacher, Oct 2012)
  I was so pleased in our leadership meeting, [a senior teacher] talking about more testing, another test. Wow, I thought that was interesting, usually the argument is 'less is best', and my question was 'Well why would we do it?' and then the discussion started, you know, 'I don't think we are getting enough data up here', or 'I don't know that this one is hitting the spot'. (Kanuka principal, October 2012)

  10. One of those schools that put their data [into the newspaper] had been asking me for our data and she wanted to tell her board what our data was, to prove her data [was OK]. And I hadn't given it, and when all our data was put online I laughed because she can access it now anyway!… I know her thinking was that [her board] didn't think [her data] was good enough and she wanted to prove 'well here's another school the same decile, their data is more or less the same', that was her thinking, she had obviously been questioned by her board. When I looked at her data it was outstanding. (Seagull lead teacher, Oct 2012)
  There had also been some discussion at a board meeting: 'so what do you think about this?' But no one was saying 'How does our data compare?' At the same time I did have a conversation about how was that beneficial to our students… [but the board] still asked it again. So yes there is a buy-in from parents who wish to have an easy comparison number to look at and go 'we must be better than them'. (Magenta principal, Dec 2012)

  11. Chloe, aged 12, talking to MT about her impending end-of-Yr8 school report
  Chloe: I'll probably be like 'at' and 'below', I'm not really that good at anything… I'd be happy with an 'at' but if I got 'below' then I'd be a bit, down, because I'd feel I hadn't really tried in class.
  MT: But you have tried in class, you are saying though?
  Chloe: Yeah, but I'll just feel like I haven't really tried. Yeah.
  After Chloe got her report with 'below' in writing and maths:
  Chloe: I sort of knew it was coming but I didn't think I'd be below in maths because that was my strongest one.
  Chloe's mother: I expected these marks, I was actually surprised she got the 'at' in reading, I actually thought she would be 'below'.

  12. National Standards and OTJs: just how far are we from an 'apples to apples' comparison across schools?
  Over time, OTJs are supposed to become comparable across all primary and intermediate schools. As indicated by the 'National Standards: School Sample Monitoring and Evaluation Project' (the MTL research), comparability or consistency is seen by the Ministry as a problem of teacher or school assessment practices.

  13. MTL: Using artificial assessment scenarios
  The study collected information about teachers' ability to rate individual pieces of student work in relation to the National Standards, and to collate several pieces of assessment evidence that had already been rated against the standards to make an OTJ… There was considerable variability in the accuracy of teachers' ratings against the National Standards for individual work or assessment samples. In writing, accuracy ranged from 3% to 89% over the samples, while accuracy in mathematics ranged from 18% to 90%. This is a cause for concern as it is these individual judgements that are the basis of OTJs. (Ward & Thomas 2012, p. 2)

  14. MTL: Working with Ministry performance criteria
  • 'Teachers use their knowledge of the National Standards in the process of making OTJs.
  • OTJs are informed by student achievement information that is relevant and current.
  • Teachers make OTJs efficiently.
  • Schools use processes and systems to ensure OTJs are consistent.
  • Moderation decisions are informed by the NS in reading, writing, and mathematics.
  • Moderation processes are efficient and effective.
  • Teachers make dependable OTJs.' (Ward & Thomas 2012, p. 6)

  15. These criteria suggest that if teachers were just more knowledgeable, more data-informed, more efficient and more systematic, then variability in OTJ-making would all but disappear. In focussing on teachers' individual abilities and understandings, the MTL research reflects what is probably the most common way people think about why National Standards are not comparable across and within schools, and this thinking is found in the RAINS schools as well:
  What [a child] writes for me, she may sacrifice surface features for deep messaging and I get really excited about that. Someone else may say 'I'm not that interested in the deep messaging, I can't read it'. (Principal, Cicada School)

  16. Yet what prevents teachers from being as it is imagined they should be? The consistency or comparability of OTJs has to be seen as more than a matter of individual practice because, like so many issues and processes in education, context comes into play. The difficulty with making OTJs in schools is that teachers are not dealing with artificial assessment scenarios. They are dealing with real scenarios that are heavily influenced by school-specific factors and by incremental changes leading to different school trajectories and variability at multiple levels.

  17. Sources of OTJ and National Standards data variation from national and regional level
  • Ambiguities
  • Varying professional development opportunities
  • Weak Ministry requirements
  • Difficulties around advice
  • Crude and misleading reporting

  18. Sources of OTJ and National Standards data variation at school level
  • Differences in schools' overall trajectories
  • Differences in each school's framing of the National Standards:
    • Defining the National Standards categories
    • Matching National Standards categories to curriculum levels
    • The rigour of data sent to the Ministry
  • Differences in the detail of policies and practices:
    • Formalising policies and practices
    • Discussion about National Standards and National Standards-related areas
    • Intervention by the SLT
    • Balance between types of evidence for informing OTJs
    • Choice of assessment tools
    • Assessment and moderation procedures

  19. Very different trajectories of enactment across schools because of school-specific factors that cannot easily be set aside, for instance:
  • Juniper's principal had more time for her enthusiasm for assessment than would be the case in most New Zealand schools.
  • Seagull had been fine-tuning its assessment processes for years.
  • Kanuka was particularly concerned about 'deficit thinking' and saw an opportunity to get staff focused on 'acceleration'.
  • Magenta was preoccupied by its local response to the New Zealand Curriculum.
  • Cicada was opposing the National Standards.
  • Huia teachers were a long way from making OTJs.

  20. Sources of OTJ and National Standards data variation at classroom level
  • Differences in teacher judgements
  • Deviation from school expectations
  • Children's practices

  21. Doing an IKAN test at Huia Intermediate
  We go to the hall, where there are perhaps 150 children sitting or lying on their tummies on the wooden floor, not a chair or desk in sight. A data projector plays the test onto a large screen and a teacher is reading out the questions as well. The questions are fast (as they are intended to be, this is a maths recall test) but with lying on the floor the children's heads have to bob up and down to see both the screen and their answer papers in a way that must be quite tiring. Other class teachers are standing around the walls of the hall. After a short time the test is over and the noise in the hall rapidly increases. After a minute the teacher in charge of the testing asks 'Who couldn't keep up? Who didn't write anything?' (A fairly large number of hands go up, maybe a third.) The teacher then comments 'Well it's incredibly fast, Ms "Speedy" [the teacher who was reading] couldn't even speak it out fast enough. But we have no control over the speed so we have decided you guys can have it again'. And so the questions are projected again and at the end the children swap sheets and mark the answers, also projected on the screen. In later discussion I am told a few children complained about the second chance for the test. The teachers are using an old version of IKAN because it allows the children to self-mark, whereas the latest version requires teachers to mark all the tests.

  22. Further information
  www.waikato.ac.nz/wmier
  www.education2014.org.nz
  thrupp@waikato.ac.nz
