Manaiakalani Whānau Capability Building and Classroom Instruction

Final Report – Full Report
UniServices Task Number: 32635.001

Auckland UniServices Limited
A wholly owned company of The University of Auckland

Prepared for:
Manaiakalani Education Trust
P O Box 18061
Glen Innes
Auckland 1743
Attn: Pat Snedden, Executive Chair, Manaiakalani Education Trust

Prepared by:
Dr Rebecca Jesson
Professor Stuart McNaughton
Naomi Rosedale
Tong Zhu
Maria Meredith
Alexander Kegel
Faculty of Education and Social Work

Date: 4 March 2016

Reports from Auckland UniServices Limited should only be used for the purposes for which they were commissioned. If it is proposed to use a report prepared by Auckland UniServices Limited for a different purpose or in a different context from that intended at the time of commissioning the work, then UniServices should be consulted to verify whether the report is being correctly interpreted. In particular it is requested that, where quoted, conclusions given in UniServices reports should be stated in full.

Acknowledgements

We warmly acknowledge the substantial contribution that different groups within Manaiakalani are making to this ongoing research and development project. We hope that this report provides useful input to the Manaiakalani Education Trust’s continual development in primary and secondary schools. We especially acknowledge the students, teachers and Principals at the schools with whom we worked for their time and for engaging in ongoing discussion. Your time and helpful assistance are greatly appreciated. We are also grateful to the parents and the wider community for their support of the Woolf Fisher Research Centre and their involvement with the research. As part of the Manaiakalani innovation community, special thanks are afforded to Jenny Oxley, Pat Snedden, Dorothy Burt and Russell Burt, for their support and feedback on this report. We thank all who have taken the time to participate in the variety of research measures and who have engaged in constructive dialogue about both the processes and findings of the ongoing research-practice partnership. Staff members at the Woolf Fisher Research Centre also contributed to the completion of this research project and report: Lin Teng, Jacinta Oldehaver and Angela McNicholl had particular input.



Table of Contents

Acknowledgements
Executive Summary
1. Overview
   1.1 Whānau Capability Building and Classroom Instruction Programme
   1.2 Previous Reports
2. Methods and Procedures
   2.1 Data Sources
      e-asTTle writing
      Progressive Achievement Test (PAT) reading and mathematics
      National Certificate of Educational Achievement (NCEA)
      Classroom observations
      Case study teachers – reading
      Parent/whānau case studies
   2.2 Analysis
      e-asTTle writing
      Progressive Achievement Test (PAT) reading and mathematics
      National Certificate of Educational Achievement (NCEA)
      Classroom observations
      Case study teachers – reading
      Parent/whānau case studies
3. Results
   3.1 Writing, Reading and Mathematics – Term 4, 2015
      e-asTTle writing
      Progressive Achievement Test (PAT) reading
      Progressive Achievement Test (PAT) mathematics
   3.2 Writing, Reading and Mathematics – Term 1, 2015 to Term 4, 2015
      e-asTTle writing
      Progressive Achievement Test (PAT) reading
      Progressive Achievement Test (PAT) mathematics
   3.3 Writing, Reading and Mathematics – 2012 to 2015
      e-asTTle writing
      Progressive Achievement Test (PAT) reading
      Progressive Achievement Test (PAT) mathematics
   3.4 National Certificate of Educational Achievement (NCEA)
      NCEA Level 1, 2, and 3
   3.5 Classroom Observations
      Classroom observations Term 4, 2014 to Term 3, 2015
   3.6 Case Study Teachers – Reading
      Promoting engagement in reading
      Instruction and support for understanding
      Creativity
      Offering in-task support
      Enhancing tutorial properties
      Criticality and complexity
      Child histories
   3.7 Parent/Whānau Case Studies
      Engagement type one: Fanau learning
      Engagement type two: Engagement with digital learning
      Engagement type three: Other sites of learning
4. Summary of Results and Discussion
   4.1 Student Achievement Data
      e-asTTle writing
      Progressive Achievement Test (PAT) reading
      Progressive Achievement Test (PAT) mathematics
      National Certificate of Educational Achievement (NCEA)
   4.2 Classroom Observations
   4.3 Case Study Teachers – Reading
      Increased engagement in reading
      Instruction and support for depth of understanding
      Greater opportunity for in-task support
      Challenging tasks and high expectations
      Closer connections to students’ interests and reading histories
5. Recommendations
6. References
Appendix A: Classroom Observation Tool 2015
Appendix B: Post Classroom Observation Scoring Protocol
Appendix C: Parent/Whānau Interview Tool


Executive Summary

Overview

This report is the third milestone report to the Manaiakalani Education Trust. It details the activities and findings of the “research and development” strand of the research and incorporates the evaluative information provided by student achievement data. This milestone details the findings from ongoing student achievement measures and classroom observations across the cluster. In addition, it reports the findings from case studies of the most effective teachers of reading within the primary schools, which provide the impetus for developing hypotheses about effective teaching of reading in a digital learning environment.

Methods and Procedures

Data Sources

In 2015 we relied on the following sources of data: e-asTTle and Progressive Achievement Test (PAT) achievement; preliminary National Certificate of Educational Achievement (NCEA) standards; classroom observations; classroom site and student blog analysis of case study classes; and parent/whānau interviews.

Analysis

e-asTTle and Progressive Achievement Test (PAT)

The cross-sectional analyses contain single time-point snapshots, including comparisons of differences from normative averages between groups. The PAT tools were introduced to the schools in Term 1, 2015; previously, students were tested using e-asTTle reading and mathematics tests. Individual tests are analysed using the scale scores for that test (PAT scale score or aWs). We used the PAT scale score to measure gains for each subject from Term 1, 2015 to Term 4, 2015. The longitudinal analyses include summary statistics of scale scores over time, across year levels, schools, school clusters and ethnicities, and between school types, school religion and gender. Additional cross-sectional and longitudinal analyses compared the e-asTTle writing and PAT reading and mathematics achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster against the achievement of students who transitioned from outside the cluster.
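The within-year gain analysis described above can be sketched as follows. All scores and the normative gain below are hypothetical placeholders, not actual PAT norms; the effect size here is simply the mean gain beyond the normative expectation, scaled by the spread of the observed scores.

```python
# Sketch of the within-year gain analysis (all values hypothetical).
# A gain is the Term 4 scale score minus the Term 1 scale score; the
# effect size is the mean gain beyond a placeholder normative gain,
# divided by the standard deviation of the observed scores.
from statistics import mean, stdev

term1 = [28.0, 31.5, 25.0, 34.0, 29.5]  # hypothetical Term 1 PAT scale scores
term4 = [33.0, 36.0, 31.5, 38.0, 33.5]  # hypothetical Term 4 PAT scale scores
normative_gain = 4.0                    # placeholder, not an actual PAT norm

gains = [t4 - t1 for t1, t4 in zip(term1, term4)]
mean_gain = mean(gains)
effect_size = (mean_gain - normative_gain) / stdev(term1 + term4)
```

A positive effect size on this sketch indicates average progress beyond the (placeholder) normative expectation; the report's cluster-level effect sizes are computed from the full student dataset rather than a single class.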

National Certificate of Educational Achievement (NCEA)

NCEA data supplied by the school were analysed as the percentage of students (roll-based) who attained qualifications, including NCEA Levels 1, 2 and 3, University Entrance (UE), and Level 1 literacy and numeracy, from 2010 to 2015. Pass rates for each qualification in each year were calculated on the assumption that Level 1 qualifications should be attained by Year 11 students, Level 2 qualifications by Year 12 students and Level 3 qualifications by Year 13 students.
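The roll-based pass-rate calculation can be sketched as follows; the roll and attainment counts are hypothetical illustrations, not the school's actual figures.

```python
# Sketch of the roll-based pass-rate calculation (hypothetical counts).
# Each NCEA level is attributed to its expected year level: Level 1 to
# Year 11, Level 2 to Year 12 and Level 3 to Year 13.
expected_year = {1: 11, 2: 12, 3: 13}
roll = {11: 108, 12: 94, 13: 94}   # hypothetical roll counts by year level
attained = {1: 42, 2: 64, 3: 56}   # hypothetical qualification counts

pass_rates = {
    level: attained[level] / roll[expected_year[level]]
    for level in expected_year
}
```

Because the denominator is the whole year-level roll rather than only the students entered for the qualification, these rates are conservative relative to participation-based rates.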

Classroom observations

Descriptive statistics are provided which portray the percentage of observation blocks in which each of the variables was observed. Percentages of blocks containing each variable are presented in side-by-side bar charts.
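The block-percentage summary can be sketched as follows; the codings and variable names here are hypothetical stand-ins for the observation tool's actual categories.

```python
# Sketch of the observation-block summary (hypothetical codings).
# Each lesson is divided into three-minute blocks, and each block is
# coded for whether each variable of interest was observed.
blocks = [
    {"extended_discussion": True,  "group_teaching": True},
    {"extended_discussion": False, "group_teaching": True},
    {"extended_discussion": True,  "group_teaching": False},
    {"extended_discussion": True,  "group_teaching": True},
]

def percent_of_blocks(blocks, variable):
    """Percentage of blocks in which the coded variable was observed."""
    return 100.0 * sum(b[variable] for b in blocks) / len(blocks)
```

The same calculation, run per term, produces the side-by-side comparisons reported in the results.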

Case study teachers – reading

The selection of case study teachers was informed by the triangulation of three effectiveness measures. A ranking process identified a “top 10” tier of practitioners, and convergence across measures strengthened an individual teacher’s nomination. The final selection of eight cases was identified from convergence across three sources of evidence:

1. Achievement in reading;
2. Observer “impressions”;
3. Principal nominations.

Following identification of the eight case study teachers, teacher online planning, student blog posts and classroom observations were qualitatively and quantitatively analysed. The analyses were used to build instrumental case studies of effective teaching of reading within the digital learning environment.


Parent/whānau case studies

Initial analyses of the 2015 interviews with eight parents/caregivers identified several types of engagement. The analysis has led to the development of the Fanau Engagement hypothesis (Meredith, 2015), which uses two frameworks. The first is cultural modes, used to identify values, beliefs, practices and aspirations. The second is academic socialisation, in which parents/caregivers describe how they support their children’s education and learning (Suizzo, Pahlke, Chen, & Romero, 2014). Within this framework researchers examined parents’/caregivers’ face-to-face and digital interactions at home, with schools and with the community (Epstein, 1996; Epstein & Sanders, 2002). Full analysis of the 2015 interviews resulted in three types of engagement: learning at home; digital learning; and other sites of learning. The theoretical framework will identify patterns of parent/family involvement in learning, their literacy practices, home-school interaction, community engagement and opportunities for community development (Riveria, 2014). Interviews will conclude in early 2016.

Summary of Results and Discussion

Student Achievement Data

e-asTTle writing

Despite achievement levels generally remaining below national levels, writing remains the area where students made the most progress during 2015. Of the 10 schools within Manaiakalani that assessed student writing using the e-asTTle writing tool in 2015, two had average student achievement above national normative comparisons at the end of the year, while eight had average levels that remained below normative comparisons. Within the school year, however, seven of the 10 schools had average progress rates above national comparative rates of progress, one had progress rates in line with averages, and two schools’ progress rates were below average. At the classroom level, six of the 58 classes had average achievement above national comparisons, 15 had average achievement at national levels and 37 had achievement levels on average below the national comparison. In terms of average classroom progress in writing within a year, 20 classes made accelerated progress, 33 made expected progress, and five made progress at rates below average. In school cohorts 1 and 2, compared with 2014, slightly fewer schools had average achievement that was accelerated.

Of the eight schools whose achievement levels have been tracked since 2012, one school has average levels above normative comparisons, while the rest were below. Progress was accelerated, on average, for five of the eight schools and in line with normative comparisons for one. In two schools rates of progress dipped below national rates. This indicates some decline in progress compared with 2014, when seven out of eight schools made accelerated progress in writing. At the classroom level, 34 of the 47 classes were below national comparisons on average, while 13 were at or above expected levels. Fifteen classes made accelerated progress, 27 made average progress, and five classes’ average progress was less than the normative comparison; two of these were in primary schools.

There was a difference between the genders in writing achievement, but not in rates of progress. At the end of 2015, across the whole cluster, girls continued to outperform boys in writing on average, scoring more than 50 e-asTTle points (approximately a year’s growth) higher. The difference is noted both in school cohorts 1 and 2 and when newer schools are included in the analyses. In terms of progress, both girls and boys made accelerated progress compared with national averages, and at similar rates.

Across the cluster as a whole, NZ European students outperformed other ethnicities; however, this effect is not present in the schools that have been part of Manaiakalani since 2012, which likely reflects the different demographic patterns in the Manaiakalani schools. In school cohorts 1 and 2, Pasifika and Māori students both had effect size gains of 0.19 (above expected), students categorised ‘Other’ had effect size gains of 0.32, while the small number of NZ European students made the expected gain overall.

Emerging differences between year levels are highlighted as a pattern in writing in 2015. While all other year levels perform approximately one year below average in school cohorts 1 and 2, the gap for Year 9 and 10 student achievement is greater. Moreover, whereas all other year levels made accelerated progress within the year on average, rates of progress in these upper year levels were negative (students lost ground within the year). In the writing test in Term 4, 2015, Year 9 and 10 students performed on average at levels that merit concern. It will need to be a matter of cluster investigation whether these test scores represent concerning patterns of learning or whether an issue of assessment underpins these results. Transition analyses suggest that students who enter Year 9 from Manaiakalani schools score higher in writing, on average, than those who enter from other schools; however, there is great variability in the scores of both groups, so the differences are non-significant.

Progressive Achievement Test (PAT) reading

Progress in reading was generally at national levels, but achievement levels in reading remained lower than national. Average progress during the 2015 school year in reading was, in general, in line with expected growth over 12 months. At the end of 2015, nine of the 10 cluster schools scored below national expectations in reading, while one scored at expectations.

End-of-year expected achievement, using PAT benchmarks, is deemed equivalent to beginning-of-year levels for the subsequent year (e.g., the end of Year 4 mean is estimated to be the beginning of Year 5 mean). Thus, to achieve at rates commensurate with the norms, students need to make 12 months’ average progress within the school year. In eight schools students made this expected progress on average in reading, and in two schools students made lower than expected progress on average. Similarly, most classes across the cluster (40 of 59) made progress in line with normative comparisons between the beginning and end of the year. When primary school classes are considered separately, 44 of the 48 classes had average gains at or above expectations.

Similar rates of progress were found within school cohorts 1 and 2. Of these schools, all had average achievement levels below normative expectations in reading at the end of the year. Six of those eight schools had progress rates in line with expectations (i.e., meeting the beginning-of-year mean for the subsequent year by year’s end), while two had progress rates below expectations. Of the 37 primary school classes, 34 had rates of progress at or above expected levels. The rates of progress shown in PAT are likely also influenced by the normative expectations, because end-of-year achievement using PAT is deemed equivalent to the next year’s beginning-of-year achievement. Thus, to make expected within-year progress, students need to gain a full year’s learning in the school year.

Gender differences in reading are slightly less marked than those in writing. Across the whole cluster, girls on average outperformed boys by approximately five scale score points (six months’ progress). This difference is apparent in the cluster combined, although slightly smaller (four scale score points) when school cohorts 1 and 2 are considered alone.

Across the whole cluster, there were differences in progress according to ethnicity in reading, with only NZ European and ‘Other’ students making expected progress overall. For school cohorts 1 and 2, ethnicities did not differ in terms of relative progress, although ‘Other’ students were the highest performing group. There are likely differences between individual schools in the interaction between year level, ethnicity and progress rates, which will be important at individual school levels.

For most year levels progress was at expected rates; that is, most year levels made 12 months’ progress within the school year. The exceptions are Year 5 and Years 9 and 10. Both in school cohorts 1 and 2 and across the wider cluster, Year 5 students made slightly less than a year’s progress, likely reflecting a slightly higher expected growth rate in Year 5 within the normative comparisons. Years 9 and 10 made little progress across the year, scoring at levels similar to their beginning-of-year levels and thus falling away from the normative comparison. As with writing, there is a need to investigate whether students are failing to learn more widely, or whether issues of assessment affect these results.

Progressive Achievement Test (PAT) mathematics

There was some variability between schools in progress rates across the cluster in mathematics. Across the wider cluster, one school had achievement levels above the normative comparisons; the remaining nine schools achieved below end-of-year expectations. As is the case with PAT generally, the end-of-year normative estimate is the beginning of the next year’s normative comparison. In three of the 10 schools, progress exceeded the 12 months’ normative progress; in five schools progress was equivalent to 12 months’ progress; and in two schools average progress was less than the normative estimate. A large majority of classes (41 of 57) made on average a year’s progress within the 2015 school year. Eight classes made greater gains on average, and another eight made less progress than the normative expectation. Of the 35 primary school mathematics classes in school cohorts 1 and 2, 32 had rates of progress at or above expected levels.

Unlike reading and writing, there was no marked difference between genders in mathematics achievement. Across the whole cluster, NZ European and ‘Other’ students made the greatest progress in mathematics (ES = 0.15 to 0.16 above expected progress). When school cohorts 1 and 2 are considered in isolation, all ethnicities made expected progress, with the exception of Pasifika students. As with reading, individual schools will need to inquire into the interactions between ethnicity and progress at each year level.

As with reading and writing, attainment began to fall away from normative expectations in the older year levels. This effect becomes apparent from Year 7, at which point average progress fails to keep pace with the normative (12 months’) estimates.

National Certificate of Educational Achievement (NCEA)

Preliminary results from the 2015 NCEA data provided by the school indicate key areas of improvement, in Levels 2 and 3 in particular, in terms of both the quantity of qualifications gained and the quality of learning in the subject areas. To date, 39% of students have achieved a Level 1 qualification, less than the 50% of students who achieved Level 1 in 2014. Of the 42 students who have achieved Level 1, 22 achieved either a merit or excellence endorsement. At Level 2 the percentage of students achieving the qualification was 68%, similar to 2014 (70%). Of the 64 students who achieved a Level 2 qualification, 18 received a merit or excellence endorsement. At Level 3, 59% of students achieved the qualification. Of note is the number of students working toward Level 3 in the year: of the 94 students, 56 gained the qualification, 10 with merit or excellence endorsements. Of the 29 students who indicated that they wanted to go on to university study, 24 achieved a University Entrance qualification. These results represent significant advances in the number of students leaving school with higher education opportunities.

Classroom Observations

Most observed classes from Term 1, 2015 (22 out of 33) to Term 3, 2015 (24 out of 29) were reading lessons, in line with a focus on developing reading achievement across the cluster. Almost all classes (82-89%) had high levels of implementation and digital access (over 90% of students with access to devices) across the whole year. There were also consistently high levels of group teaching, with teachers working with a group in 61.9% of blocks of time in Term 4, 2014, 50% in Term 1, 2015 and 53.8% in Term 3, 2015.

In general, changes over time in the way teachers work with students are becoming apparent. There seem to be shifts toward more open-ended and cognitively engaging forms of teaching interactions. This is apparent in the proportion of three-minute blocks coded as focused on extended discussion: 54.3% in Term 4, 2014, 36.8% in Term 1, 2015 and 54.5% in Term 3, 2015. It seems that teachers are increasingly creating spaces in classes that allow learners to engage in discussion around texts.

The teachers’ lessons still feature a mixture of instructional types. Teachers’ interactions mostly focused on practice (59.5% of blocks in Term 4, 2014, 55.7% in Term 1, 2015 and 62.8% in Term 3, 2015). Teachers also asked students to link to prior knowledge and to have metacognitive discussions about strategies, with 27% of blocks coded as these types both at the beginning and end of the year. The only instructional type less well represented was critical appraisal of content or texts (6.7% of blocks in Term 4, 2014, 11.8% in Term 1, 2015 and 4.7% in Term 3, 2015). Skills to evaluate the author’s position, the credibility of a text or the intended effect of a text on the reader are vital tools for students engaged in digital learning environments.

In reading lessons at the end of 2015, teacher-led activities were less constrained and focused more on deeper thinking than the tasks assigned by teachers as independent reading tasks. In this way, the profile of reading lessons with the teacher contrasts with the independent tasks assigned to students. Whereas teachers seemed to be working to extend student thinking through extended discussion, independent tasks remained predominantly focused on reading a single text (66.5% of blocks) and answering a constrained practice worksheet (53% of blocks). Therefore, although discussions may extend student thinking, the structures of independent tasks seem to lend themselves more to testing comprehension through closed questions. The opportunities for extended thinking observed previously, through the creation of digital learning objects (14.9% in Term 3, 2015) and the use of multiple texts (13.5% in Term 3, 2015), were much less apparent in the final 2015 observations, which focused on reading.

There was a continued high level of digital task management over time. Teachers therefore seemed to be implementing the collaborative and efficiency affordances of the digital learning environment at a high level. Collaboration and joint authoring were taken up during 2015: students tended to work more collaboratively on jointly authored texts using digital means, with an increase from 0.5% of blocks in Term 4, 2014 to 23.3% in Term 3, 2015. Face-to-face collaboration was less evident, decreasing from 19% in Term 4, 2014 and 17.1% in Term 1, 2015 to 5.1% in Term 3, 2015, possibly reflecting the assignment of worksheets as independent tasks.

Case Study Teachers – Reading

From analysing the practices of teachers who were effective in teaching reading in 2015, it would seem that more effective teaching may result from using the affordances of the digital learning environment to promote not only greater reading expertise but also agency over that reading.

Increased engagement in reading

Students learn to read by reading; thus, reading instruction needs to engage students in the practice of reading. Practices that seemed instrumental for reading engagement were providing opportunities and support for independent reading, and extending the quantity of reading by providing supplementary texts as part of instruction. In the case study classes, teachers drew on digital tools to provide both self-chosen reading materials and teacher-selected materials to supplement instruction. Case study teachers used both digital means (for example, video recording) and traditional sustained silent reading to develop students’ independent reading skills, knowledge of books and routines for independent reading, for both recreation (novels) and information (current events). Case study teachers also took the opportunity to increase reading mileage through instruction that drew on multiple texts, whether layered texts (e.g., text sets) or supplementary texts (e.g., contrastive texts).

Instruction and support for depth of understanding

Case study teachers supported depth of understanding through instruction in reading comprehension strategies and vocabulary. Most tutorial (teaching) interactions occurred face-to-face, supported by digital tools such as digital modelling books. As with building mileage, building comprehension combined both student-directed and teacher-directed approaches. Students were supported to become aware of, and monitor, their own skills and strategies through self-assessment and goal setting. Teachers supported reading comprehension strategies through focused lessons led by learning intentions. Vocabulary strategies and word consciousness are areas where there was less instruction; these appear likely to be catalytic for supporting comprehension and thinking, supported by multimedia. Similarly, critical skills, perspective taking and language choices are all areas where there is opportunity to extend instruction.

Greater opportunity for in-task support

Case study teachers provided structures and supports for students to employ before, during and after independent and instructional reading. In-task support could come from online tools or scaffolds; examples include thinking prompts or guides, developed by teachers to structure students’ thinking about texts. Direct in-task support was also achieved through collaboration: case study teachers used the digital learning environment to create shared spaces for thinking, for example through shared documents. Another potentially powerful practice was peer feedback, most commonly through responding to blog posts, but potentially also through other media (e.g., face-to-face feedback about podcasts, group self-assessments on a digital learning object).

Challenging tasks and high expectations

Case study teachers developed students’ abilities in higher order thinking. They tended to do this through asking students to evaluate their reading, justify their thinking and develop agency over their interpretation. Teachers also supported students to self-monitor the depth of that thinking through levelled approaches and thinking taxonomies. Connections between reading and writing were an opportunity to increase the complexity of tasks and agency over reading. In such cases, students used their reading as “knowledge fuel” for writing or for creating, thereby reading for a student-defined purpose rather than to answer teacher-assigned questions. Creativity and innovation were also apparent in case study classes, and these served to increase the challenge when students were asked to “repurpose” their reading to another form. Examples include written arguments as an approach to book sharing (why you should read this book), advertisements for books, diagrams to summarise content, and reading advice columns.

Closer connections to students’ interests and reading histories

Case study teachers used the digital learning environment to build on what students knew about. They did this by incorporating texts that reflected students’ values, language backgrounds and identities. Such texts were international as well as local: for example, Samoan and Tongan newspapers as well as New Zealand content. Teachers also sought to incorporate students’ personal reading histories, as well as inviting students to share their differing perspectives on the content of their reading. Case study teachers therefore used the digital learning environment to provide a wider range of texts that made links to personal histories, and also as a tool to support students’ thinking about how texts had links to their lives and histories.

Recommendations

1. Continue to embed the effective practices in writing

While there is some evidence that writing continues to be the area of greatest acceleration, there are also signs that slightly less acceleration may have been achieved in writing over 2015 than in 2014. And, while girls and boys both make accelerated progress in writing on average, girls continue to outperform boys by approximately a year’s learning. Thus there is a need to continue to reinforce the most effective practices in writing, including highly engaged learners, complex, creative tasks and powerful conversations.

2. Develop a shared understanding of effective practices in reading, including innovative digital practices which broaden and deepen reading

The analysis of some of the most effective teachers of reading highlights some potentially effective practices, likely to develop reading ability. Some of these practices are effective in traditional environments and have the potential to be amplified by digital learning environments, such as making links to students’ lives outside school. Others are innovative practices made possible by digital learning environments, for example repurposing content across modes to create a digital learning object. Finally, a set of effective practices respond to the additional reading skills demanded of learners within a digital learning environment, for example perspective taking about social issues or critical appraisal of evidence in texts. We recommend therefore that, with Manaiakalani, we develop a shared set of hypotheses about the relationships between effective teaching and accelerated learning in reading. Once these are agreed, we recommend that the cluster work to embed these practices throughout the cluster. The following appear to be the key levers through which case study teachers drew on the digital learning environment to enhance reading:

a) Promote engagement in reading, comprehension and higher order thinking. This might include supporting students’ independent monitoring of their reading enjoyment, interests, engagement and mileage.

b) Promote instruction for depth of understanding and independence. This might include both teacher-led and student-led approaches to depth, as well as intertextual approaches to deep understanding of topics through reading, and an increased focus on vocabulary learning and use.

c) Provide in-task support for thinking about reading. This might include opportunities and support for students to develop independence in higher order thinking and agency over their interpretations. It would also include critical appraisal of what is read and the intended influences of texts on their readers.

d) Increase the challenge and expectations in assigned texts and tasks. This might include leveraging multiple-text reading and the reading-writing connections drawn within the ‘learn, create, share’ learning cycle. This might also include a greater emphasis on creativity and repurposing the ‘learning’ from multiple sources using multiple modes.

e) Make connections. This might include explicit teacher selection of texts that make links to students’ communities and reading histories. It might also include supporting students to make links between texts that they have read, juxtaposing texts, comparing and contrasting, and taking an agentive stance toward how texts position them as readers.

3. Investigate subject-specific literacies and pedagogies supporting adolescent literacy

In both cross-sectional and longitudinal measures of student achievement, an emerging pattern is the need to keep pace with normative comparisons at older year levels. International research suggests that there are additional, more specialised literacy demands on students as the texts they encounter become more complex and subject specific. While general literacy skills suffice for students in the middle primary years, it seems likely that more subject-specialised demands are impacting on students in these older year levels. Thus, we recommend that we work with Manaiakalani to pay particular attention to the specialised reading and writing demands for learners in the upper primary and junior secondary years. Each of the hypothesised effective practices in reading previously mentioned will likely be relevant to this endeavour, as teachers work to increase the challenge, complexity and higher order thinking of students into secondary school.


1. Overview

This report is the third milestone report to the Manaiakalani Education Trust (MET). It details the activities and findings since Milestone 2 – Full Report (24 June, 2015). The Research and Development strand is focused on identifying and building on the strengths of the cluster. In this milestone, classroom instruction in reading is a key consideration, in particular the practices that can be identified through an in-depth focus on successful teachers of reading. In addition to the instructional focus, a number of analyses update the student achievement data monitoring. During 2013, a number of new schools (n = 4) joined the cluster; two of these new schools have contributed data to the project. This milestone reports on the progress made toward achieving the aims of Manaiakalani during 2015. For this reason, schools are represented in two sets of analyses: one consists of both cohort 1 schools (the five schools existing in the cluster in 2011) and cohort 2 schools (the three schools that joined the cluster during 2011 to 2012); the other includes all the schools with student achievement data at present (whole cluster; n = 10).

1.1 Whānau Capability Building and Classroom Instruction Programme

There are two parts in the development programme: whānau capability building, and classroom instruction. The research and development programme focuses on these two core activities of the overall Manaiakalani programme to identify and generate new knowledge and innovative practice for the Manaiakalani group of schools and nationally. The focus is on highlighting positive deviance, i.e., those factors identifiable within the existing variation which most seem to contribute to positive outcomes for learners. In this report, the pedagogical factors that seem most likely to contribute to student achievement in reading are presented.

1.2 Previous Reports

Milestone 2 offered a number of recommendations based on whānau capability building, learning out of school and classroom instruction in 2013 and 2014 (Jesson, McNaughton, Rosedale, Zhu & Meredith, 2015). Key messages from that report highlighted a number of successes of the Manaiakalani innovation, including effective practices in the teaching of writing and effective practices of students and their families in learning at home. Based on the research findings and also the strategic focus of the Manaiakalani leaders (including the MET and school leaders), a number of activities have taken place within Manaiakalani since the delivery of Milestone 2. Those activities directly related to the research foci include:

• Analysis and interpretation of 2015 e-asTTle writing, PAT reading and mathematics for students in Years 4-10;
• Analysis and interpretation of preliminary 2015 NCEA Level 1, 2 and 3 results;
• Development of a hierarchical linear model (HLM) for student achievement data over 2014;
• Analysis of in-depth observations in outlier/case study classes in reading;
• Development of the “Summer Learning Journey” project from November 2015;
• Further development of the Learning@Home resource website, presented at ULearn15, October 2015;
• Presentations by researchers to Manaiakalani teachers;
• Presentation by Rebecca Jesson at the Manaiakalani Hui in 2015;
• Participation of the research team in Principal meetings;
• Further interviews with parents/caregivers for parent/whānau case studies.

These activities, alongside the ongoing self-improvement and self-review professional learning groups of school leaders and teachers, contribute to the ongoing redesign and improvement of Manaiakalani. This milestone report seeks to document and contribute to these efforts by identifying and describing the most effective practices in reading.


2. Methods and Procedures

2.1 Data Sources

In 2015 we relied on the following sources of data: e-asTTle and Progressive Achievement Test (PAT) achievement; preliminary National Certificate of Educational Achievement (NCEA) standards; and classroom observations, including analysis of case study classes in reading. Schools were grouped into cohorts according to their entry year into Manaiakalani (Table 1).

Table 1
Schools Included in Cohorts According to Years Involved in Manaiakalani

School                   Cohort
Tamaki Primary School    Cohort 1
Point England School     Cohort 1
Panmure Bridge School    Cohort 1
St Pius X School         Cohort 1
Glenbrae School          Cohort 1
Tamaki College           Cohort 2
Glen Innes School        Cohort 2
St Patrick’s School      Cohort 2
Stonefields School       Cohort 3
Ruapotaka School         Cohort 3

e-asTTle writing

Student achievement data for writing at Term 4, 2015 were downloaded directly from the e-asTTle website, with separate data files downloaded for each school. The raw data for each school were then collated into one data file. In many cases, students had been tested multiple times at Term 4, 2015; duplicate entries were deleted, with later tests retained. The data were then collated into the longitudinal writing database. To ensure completeness of the databases, quality assurance checks were conducted across assessment data obtained from e-asTTle and roll data obtained from schools. Longitudinal data were matched and collated in Excel, then exported into R (statistical software) for analysis. Any recoding or creation of variables required for analysis was completed in R.
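The duplicate-handling rule described above (keep only a student's latest Term 4 test) can be sketched as follows. This is an illustrative sketch only: the actual collation was done in Excel and R, and the record fields shown are hypothetical.

```python
# Keep only the most recent test per student (a sketch of the
# duplicate-removal rule described above; field names are invented).
records = [
    {"student_id": "S1", "test_date": "2015-11-02", "score": 1450},
    {"student_id": "S1", "test_date": "2015-11-20", "score": 1480},  # later test, retained
    {"student_id": "S2", "test_date": "2015-11-10", "score": 1390},
]

latest = {}
# Sorting by ISO date means a later test overwrites an earlier one.
for rec in sorted(records, key=lambda r: r["test_date"]):
    latest[rec["student_id"]] = rec

deduplicated = list(latest.values())
```

The same pattern applies to the PAT data described below, except that the retention rule there keeps the higher scale score rather than the later date.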

Progressive Achievement Test (PAT) reading and mathematics

Student achievement data for Progressive Achievement Test (PAT) reading and mathematics were downloaded directly from the New Zealand Council for Educational Research (NZCER) website, with one data file downloaded for each test in Term 4, 2015. That is, the raw data for all schools were contained in one data file for each subject. In many cases, students had been tested multiple times with different tests in Term 4, 2015. Duplicate entries were deleted, and tests with higher scale scores were retained. The data were then collated into two large longitudinal databases, one for each subject. To ensure completeness of the databases, quality assurance checks were conducted across assessment data obtained from NZCER and roll data obtained from schools.

National Certificate of Educational Achievement (NCEA)

The 2015 NCEA data were collected directly from the school and are preliminary only; the final cut-off date for the official NCEA data has not yet passed. These data will need to be updated once finalised data are available from the New Zealand Qualifications Authority (NZQA). NCEA Level 1, 2 and 3 data for 2010-2014 were obtained directly from the NZQA website in an Excel spreadsheet.

Classroom observations

Observations were conducted in Term 1 and Term 3, 2015. At both time points, classroom instruction was captured using the observation tool (Appendix A) designed, and previously employed, to capture both the teaching foci and the student activities that took place during lessons. The requested observation focus was reading, where possible. Table 2 shows the number of classes observed in each school across Term 4, 2014 to Term 3, 2015.

Table 2
Number of Classes Observed in each School Across Term 4, 2014 to Term 3, 2015

School                   Term 4, 2014   Term 1, 2015   Term 3, 2015
Glen Innes School        2              2              2
Point England School     6              5              5
Tamaki Primary School    4              4              3
Glenbrae School          3              3              3
Panmure Bridge School    4              5              5
St Patrick’s School      2              3              3
St Pius X School         3              3              3
Tamaki College           5              6              6
Stonefields School       4              4              2
Ruapotaka School         2              3              4
Total                    35             38             36

The numbers of lessons observed in primary and secondary classrooms from Term 4, 2014 to Term 3, 2015 are presented in Table 3 and Table 4.

Table 3
Number of Lessons Observed in Primary School Classroom Observations by Lesson Type, Term 4, 2014 to Term 3, 2015

Lesson Type    Term 4, 2014   Term 1, 2015   Term 3, 2015
Reading        11             22             24
Writing        17             5              4
Mathematics    3              2              0
Inquiry        2              3              1
Spelling       0              1              0
Total          33             33             29

Table 4
Number of Lessons Observed at Tamaki College Classroom Observations by Lesson Type, Term 4, 2014 to Term 3, 2015

Lesson Type    Term 4, 2014   Term 1, 2015   Term 3, 2015
English        3              2              2
Mathematics    1              2              2
Science        1              2              2
Total          5              6              6

Using the classroom observation tool, researchers collected data in samples of three-minute intervals: the observer alternated between observing the teacher interacting with a targeted group of students for three minutes and observing the tasks assigned to the students who were not working with the teacher for the next three minutes. During the interval focused on the teacher group, judgments were made about the nature of the main teaching activity (question and answer; lecturing and modelling; extended discussion and conferencing; roving; and behaviour management). Instances of feedback were recorded as evaluative, descriptive and/or generative feed forward, and as to whether they involved the use of digital affordances. Finally, any teaching foci were coded as: item teaching (e.g., “So the plot includes the events that happen in the story…”); activating prior knowledge (e.g., “Can we remember what we mean by quarters?”); practice (e.g., writing a summary); and critical thinking/literacy (e.g., identifying how the use of language positions a reader). In the alternating three-minute interval, focused on the tasks and activities assigned to the non-teaching group, the observer recorded any texts used, the nature of the digital sites and activities students were engaged in, the nature of behavioural engagement (on-task/off-task), and whether agency was afforded (e.g., students’ independent, self-directed decision making). Any working together was also coded as teacher directed (or not); as face-to-face discussion (FTF) or computer-mediated discussion (CMD); and as targeted at individual or shared work. Finally, activity during this interval was judged to be managed offline, online with some verbal prompts (by the teacher), or totally digitally managed.


We considered only the first six blocks from each class in the classroom analyses. For the whole cluster, there were:

• 210 blocks in Term 4, 2014;
• 228 blocks in Term 1, 2015;
• 215 blocks in Term 3, 2015.

For school cohorts 1 and 2 we considered:

• 174 blocks in Term 4, 2014;
• 186 blocks in Term 1, 2015;
• 180 blocks in Term 3, 2015.

Case study teachers – reading

Building on the case studies of effective teachers of writing in 2014 (reported in Milestone 2), a corresponding study of effective reading practices was carried out in Term 1 and Term 3, 2015. A small number of teacher cases were purposefully selected for in-depth exploration. The ‘closeness’ of a case study approach enabled comprehensive descriptions and comparative analysis of digital pedagogy and innovation identified as potential reading improvement accelerators. The investigation was informed by three primary data sources:

• Teacher online planning (Terms 1 and 3, 2015);
• Student blog posts (Terms 1 and 3, 2015);
• Classroom observations (Term 4, 2014; Terms 1 and 3, 2015).

Parent/whānau case studies

A qualitative ethnographic approach was employed to explore case studies of a small number of families, to account accurately for the nature of change in family engagement. Eight Pasifika parents/caregivers were recruited from four local schools. Parents were aged between 30 and 50 years. Four participants stated they were New Zealand born and four were Pasifika-born migrants. Table 5 shows the distribution of ethnicity and gender of participating parents/caregivers. Data were drawn from five semi-structured interviews carried out with each parent/caregiver in 2015 and 2016, and through ongoing communication with researchers via blog posting, texting and email. In the first 45-minute interview, parents/caregivers were asked about their background, education, and family and community activities, to gauge their level of engagement in learning at home, with schools and in their communities. In the subsequent interviews of 15-20 minutes in duration, researchers continued their discussions with parents/caregivers about their activities at home, with school and the wider community. Appendix C presents the interview tool. Nineteen interviews were completed in 2015 and transcribed using Dragon (voice recognition software). In 2016 researchers will continue their data collection, with a further 21 interviews planned. Data from interviews (n = 40) will be transcribed and analysed. These data will be used to explore the nature and frequency of activities of families in their roles as parents, community members and citizens more widely. The changes in the frequency and nature of these activities as a result of engagement in a digital learning environment are described through these case studies.

Table 5
Parents/Caregivers Case Studies by Ethnicity and Gender

Ethnicity           Male   Female
Cook Island Māori   0      1
Samoan              0      4
Tongan              1      2
Total               1      7

Table 6 shows parents’/caregivers’ reported occupations. Of the seven mothers, three were studying toward a qualification at certificate level (one in hospitality and one in small business) or degree level (i.e., a BA in Psychology). Three parents/caregivers were employed full-time.


Table 6
Number of Parents/Caregivers Who Participated in Case Studies by Occupation

Occupation          N
Stay-at-home mum    2
Teacher/Manager     1
Student             3
IT Programmer       1
Manager             1
Total               8

2.2 Analysis

In 2015 we analysed the following sources of data: e-asTTle writing; PAT reading and mathematics; NCEA achievement; classroom observations; and case study teachers of reading.

e-asTTle writing

e-asTTle writing achievement data were analysed with R, and data visualisations were created using Excel. The cross-sectional analyses contained single time point snapshots, including comparisons of differences in overall e-asTTle writing scores from normative averages (Table 7) between groups (independent samples t-tests). The longitudinal analyses included summary and inferential statistics of differences in overall writing scores from normative averages over time, and differences from expected gains in writing over time compared with normative gains at each time point, across year levels, schools, school cohorts (Table 1) and ethnicities, and between school types, school religions and genders. Additional cross-sectional and longitudinal analyses compared e-asTTle writing achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster against achievement of students who transitioned from outside the cluster. We considered students to have transitioned to Tamaki College from outside the cluster if they could not be tracked in the existing longitudinal database from the previous time points.
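The between-group comparison logic (difference-from-norm scores compared with an independent samples t-test) can be illustrated with a minimal sketch. The analysis itself was run in R; the Python version below, with invented scores, simply shows the shape of the calculation using Welch's unequal-variance t statistic.

```python
import math

def welch_t(a, b):
    """Independent samples (Welch) t statistic for two groups of
    difference-from-norm scores."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Invented difference-from-norm writing scores for two groups
girls = [-20, -35, -28, -41]
boys = [-70, -95, -82, -88]
t = welch_t(girls, boys)  # positive: girls' mean difference is higher
```

A positive t here corresponds to the pattern reported later in the results, where girls' mean difference from the norm is less negative than boys'.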


Table 7
Summary Statistics for the e-asTTle Writing Sample

Year   M1     M2
4      1355   1413
5      1429   1471
6      1485   1526
7      1539   1578
8      1591   1630
9      1645   1699
10     1716   1770

Note: M1 and M2 depict e-asTTle writing norms for Term 1 and Term 4, respectively.

Progressive Achievement Test (PAT) reading and mathematics

Progressive Achievement Test (PAT) achievement data were analysed in R, and data visualisations were created in R or Excel. The cross-sectional analyses contained single time point snapshots, including comparisons of differences in PAT scale scores from normative averages (Table 8 and Table 9) between groups (independent samples t-tests). The longitudinal analyses included summary and inferential statistics of scale scores over time, across year levels, schools, school cohorts (Table 1) and ethnicities, and between school types, school religions and genders. Additional cross-sectional and longitudinal analyses compared PAT reading and mathematics achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster against achievement of students who transitioned from outside the cluster. We considered students to have transitioned to Tamaki College from outside the cluster if they could not be tracked in the existing longitudinal database from the previous time points. PAT scale scores were used in both cross-sectional and longitudinal analyses, thus allowing an analysis of growth over time (Darr, Neill, Stephanou & Ferral, 2007, p. 30; Darr, Neill, Stephanou & Ferral, 2008, p. 34; Ministry of Education & NZCER, 2012, p. 57).


Table 8
Summary Statistics for the Progressive Achievement Test Reading Sample

Year   N      M1     M2
4      1442   28.8   35.8
5      1432   35.8   45.0
6      1439   45.0   53.2
7      2086   53.2   60.4
8      1814   60.4   67.0
9      2204   67.0   76.5
10     1903   76.5   84.5

Note: M1 depicts PAT reading norms at Term 1. M2 depicts the end-of-year expectation, taken as the subsequent year level’s norm assuming no summer gains. For Year 10 students, the gain is assumed to be the average across all year levels, which is 8.

Table 9
Summary Statistics for the Progressive Achievement Test Mathematics Sample

Year   N      M1     M2
4      1469   30.6   38.9
5      1616   38.9   45.1
6      1521   45.1   49.6
7      1834   49.6   55.0
8      1815   55.0   60.6
9      1479   60.6   65.4
10     1572   65.4   71.4

Note: M1 depicts PAT mathematics norms at Term 1. M2 depicts the end-of-year expectation, taken as the subsequent year level’s norm assuming no summer gains. For Year 10 students, the gain is assumed to be the average across all year levels, which is 6.


National Certificate of Educational Achievement (NCEA)

NCEA data supplied by the school were analysed as the percentage of students (roll-based) who attained qualifications, including NCEA Level 1, 2, 3 and University Entrance (UE), and Level 1 literacy and numeracy, from 2010 to 2015. Pass rates for each qualification in each year were calculated on the assumption that Level 1 qualifications should be attained by Year 11 students, Level 2 qualifications by Year 12 students and Level 3 qualifications by Year 13 students.
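The roll-based pass rate calculation described above reduces to a simple proportion of the year-level roll; a sketch with invented numbers:

```python
def roll_based_pass_rate(n_attained, year_roll):
    """Percentage of the year-level roll attaining a qualification,
    e.g. NCEA Level 1 against the Year 11 roll (numbers invented)."""
    return 100.0 * n_attained / year_roll

# e.g. 45 students of a 60-student Year 11 roll attaining NCEA Level 1
level1_rate = roll_based_pass_rate(45, 60)  # 75.0 per cent
```

Because the denominator is the whole year-level roll rather than the number of entered candidates, this measure is conservative relative to candidate-based pass rates.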

Classroom observations

Descriptive statistics are presented as percentages of blocks in which each variable was observed. These summaries are presented as percentages of all the blocks observed, rather than as an average per teacher; thus, the picture is of the teaching profile across the cluster. Comparisons for each factor variable were presented in side-by-side bar charts.

Case study teachers – reading

Consistent with our approach to investigating accelerated outcomes in students’ writing in 2014, the focus on reading acceleration was also based on analysing variability in features of pedagogy and implementation. The selection of case study teachers was informed by a triangulation process involving three effectiveness measures. Triangulation of these data followed a ranking process to determine a “top 10” tier of practitioners, within which convergence for a single teacher across measures strengthened nomination. Comparisons of convergence across the three measures (per teacher) determined a final selection of cases designated for analysis (n = 8). The three sources of evidence triangulated for analysis were:

1. Achievement in reading – teachers of classes whose average gain scores in reading were accelerated by comparison with national normative progress in 2014;

2. Observer “impressions” – teachers observed in Terms 1 and/or 3, 2015, whose practice reflected “a powerful uptake of digital affordance” based on a post-observation scoring protocol (Appendix B). Evaluation of “powerful uptake” was determined by totalling the observer score (1 = Limited; 2 = Some; 3 = Powerful) for each of the following indicators:

• Nature of cognitive challenge – for example, intertextuality (tasks with text), criticality, text level, significance/relevance;
• Digital integration that takes learning beyond what could be achieved ordinarily – e.g., skill development, efficiencies, engagement, agency, collaboration;
• Cultivation of learner independence – for example, scaffolding in/out, tracking learning goals, self-regulation, initiative, and agency.

Evidence in support of each observer rating was recorded alongside the indicator.

3. Principal nominations – Principals were invited to nominate “top tier teachers of reading” from their school’s teaching staff.

Following identification of the case study teachers (n = 8), the sources of evidence were qualitatively and quantitatively analysed to identify and describe instrumental reading practice within digital contexts. Qualitative analyses of these data sets were loosely guided by the framework derived from our case study investigation of effective writing practice reported in Milestone 2, which in turn had been initially informed by a collaborative workshop with the principal investigators and a group of Manaiakalani senior leaders, incorporating hypotheses built from the extant literature. The resulting five categories, comprising a framework for accelerating writing outcomes through digital affordance, had been identified:

• Increased engagement (in writing);
• Tutorial properties of interactions;
• Greater opportunities for in-task support;
• Increased opportunities for complex settings and tasks;
• Closer connection to student settings.

Although it was hypothesised that the “acceleration practices” represented by the writing framework would not be limited to writing improvement only (and would therefore be useful in uncovering influencers of accelerated reading), a grounded coding or “bottom up” approach was initially adopted. In this way, any features of instruction associated with improved reading would not be overlooked or unduly influenced by the existing framework. Firstly, teacher online planning was charted (typically by week) across Terms 1 and 3 (2015), followed by investigations of associated student digital artefacts posted to individual student blog sites. Sampling of blog sites, to trace student postings representative of teacher-planned tasks, followed a purposive approach: totalling the blog post frequencies of all students within each teacher’s classroom and selecting the two male and two female students (n = 4) with the highest frequency of postings per term (n = 32). As in the previous case study of effective writing, it was determined that artefacts of students with the most blog posts would potentially better contextualise implementation of teachers’ reading planning, alignment with teacher learning objectives and evidence of digital affordance. Finally, qualitative descriptions derived from a process of constant comparative, grounded analysis identified thematic patterns of teaching and learning practice, which were then compared with the previous acceleration framework for possible alignment.
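The purposive sampling rule (the two male and two female students with the highest posting frequency in each class) can be sketched as follows; the student names and post counts are invented:

```python
def select_focus_students(students, per_gender=2):
    """Pick the top blog posters per gender within one classroom
    (a sketch of the purposive sampling rule; data are invented)."""
    chosen = []
    for gender in ("male", "female"):
        group = [s for s in students if s["gender"] == gender]
        group.sort(key=lambda s: s["posts"], reverse=True)  # most posts first
        chosen.extend(group[:per_gender])
    return chosen

classroom = [
    {"name": "A", "gender": "male", "posts": 12},
    {"name": "B", "gender": "male", "posts": 30},
    {"name": "C", "gender": "male", "posts": 18},
    {"name": "D", "gender": "female", "posts": 25},
    {"name": "E", "gender": "female", "posts": 9},
    {"name": "F", "gender": "female", "posts": 27},
]
sample = select_focus_students(classroom)  # B, C, F, D
```

Applying this rule across the eight case study classrooms and two terms yields the 32 sampled student-terms noted above.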

Parent/whānau case studies

Analyses of the 2015 interviews with parents and caregivers are ongoing, as interviews continue. While interviews are being finalised during the first months of 2016, a number of initial types of engagement have been identified. The initial analysis informed the development of the Fanau Engagement hypothesis (Meredith, 2015) using two frameworks. The first framework is cultural modes, such as the values, beliefs, practices and aspirations of parents and their families. The second is academic socialisation, where parents describe their child-rearing practices and how they support their children’s education and learning (Suizzo, Pahlke, Chen, & Romero, 2014). Within this framework researchers examined parents’/caregivers’ involvement and engagement in face-to-face and digital interactions at home, with schools and in the wider community. Involvement is where parents/caregivers participate in school-focused activities such as reading programmes, parent helping, or coaching (Epstein, 1996; Epstein & Sanders, 2002). Engagement within the home/school context is a partnership where both sites are working together. Fanau Engagement is the combination of involvement (participation) and engagement (partnership), influenced by family learning, learning in a digital learning environment and other sites of learning, that seeks to enable parents, their families and communities to make a positive difference in their children’s education as well as for themselves. The theoretical framework may provide opportunities for learning and community development (Riveria, 2014).

In analysing the parent/whānau case studies, three types of engagement were found:

• Learning at home;
• Digital learning;
• Other sites of learning.

Each of these is influenced by families’ (i.e., parents’/caregivers’) aspirations, values, beliefs and practices surrounding learning.


Figure 1. Fanau Engagement: cultural modes and academic socialisation. (The diagram places Fanau Learning at the centre, surrounded by aspirations, values, beliefs and practices, alongside digital learning and other sites of learning.)


3. Results

3.1 Writing, Reading and Mathematics – Term 4, 2015

e-asTTle writing

Whole cluster

In this section, we consider only students (n = 1364) from the whole cluster who have e-asTTle writing scores at Term 4 in 2015. An ANOVA was conducted to determine whether there was any variation in difference from norm e-asTTle writing score across year levels, schools and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 10, and descriptive statistics for school and student characteristics are presented in Tables 11-17. Figures 2-5 show confidence intervals of difference from norm e-asTTle writing score at Term 4 in 2015 by student characteristics (gender, ethnicity, year level and school, respectively). Figures 6-8 show confidence intervals by school characteristics (school cohort, school type and school religion, respectively). Table 18 shows the number of classrooms whose average writing score was above, at or below the norm at Term 4 in 2015, by school cohort.

Table 10
ANOVA Summary Statistics – Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015

              df      F        p     ES*
Gender         1   67.3   < .001    0.05
Year Level     6   27.9   < .001    0.11
Ethnicity      4   10.9   < .001    0.03
School         1   48.2   < .001    0.03
Residuals   1351

Note: *The effect size used is partial eta squared (ηp²).
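The effect size reported in these ANOVA tables, partial eta squared, can be illustrated with a small self-contained sketch. This is not the report's analysis code: the one-way decomposition and the toy data below are purely illustrative (in the single-factor case, partial eta squared reduces to SS_effect / (SS_effect + SS_error)).

```python
def anova_one_way(groups):
    """One-way ANOVA F statistic plus partial eta squared.

    For a single factor, partial eta squared is
    SS_effect / (SS_effect + SS_error).
    """
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group (effect) sum of squares
    ss_effect = sum(len(g) * (m - grand_mean) ** 2
                    for g, m in zip(groups, group_means))
    # Within-group (error) sum of squares
    ss_error = sum((x - m) ** 2
                   for g, m in zip(groups, group_means) for x in g)
    df_effect = len(groups) - 1
    df_error = len(all_scores) - len(groups)
    f_stat = (ss_effect / df_effect) / (ss_error / df_error)
    partial_eta_sq = ss_effect / (ss_effect + ss_error)
    return f_stat, partial_eta_sq

# Toy difference-from-norm scores for two hypothetical groups
f_stat, eta_sq = anova_one_way([[-40, -25, -31], [-90, -78, -85]])
```

The report's model has several factors, so its ηp² values partial out the other terms; the single-factor case above conveys the idea.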

By gender

Table 11
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by Gender

Gender        N    Mdiff   SDdiff       t   Significance
Female      662    -31.1    119.2    -6.7   ***
Male        702    -83.1    134.3   -16.4   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
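The t values in these tables are consistent with a one-sample t-test of the mean difference-from-norm score against zero, t = M / (SD / √N). A quick sketch checks this against the Table 11 values:

```python
import math

def one_sample_t(mean_diff, sd_diff, n):
    # t statistic for testing a mean difference-from-norm score against zero
    return mean_diff / (sd_diff / math.sqrt(n))

t_female = one_sample_t(-31.1, 119.2, 662)  # reported t = -6.7
t_male = one_sample_t(-83.1, 134.3, 702)    # reported t = -16.4
```

The same recomputation can be applied to any of the descriptive tables in this section.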

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Gender - Whole Cluster]
Figure 2. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by gender.


By ethnicity

Table 12
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by Ethnicity

Ethnicity        N    Mdiff   SDdiff       t   Significance
Māori          329    -69.0    114.3   -11.0   ***
Pasifika       785    -71.3    130.2   -15.3   ***
NZ European     93      3.8    141.7     0.3
Other          156     -2.4    126.7    -0.2

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Ethnicity - Whole Cluster]
Figure 3. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by ethnicity.


By year level

Table 13
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by Year Level

Year Level      N    Mdiff   SDdiff       t   Significance
4             238    -17.8    145.2    -1.9   .
5             261    -48.6    126.7    -6.2   ***
6             221    -49.3    127.9    -5.7   ***
7             238    -45.5    109.6    -6.4   ***
8             221    -50.8    111.0    -6.8   ***
9             103   -131.2    111.2   -12.0   ***
10             82   -189.3    107.1   -16.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Year Level - Whole Cluster]
Figure 4. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by year level.


By school

Table 14
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by School

School          N    Mdiff   SDdiff       t   Significance
1             120    -64.2    131.5    -5.3   ***
2             327    -77.3    134.7   -10.4   ***
3             116    -79.5    131.0    -6.5   ***
4              98    -39.1     92.7    -4.2   ***
5             140    -41.3    132.0    -3.7   ***
6              64     26.1     85.7     2.4   *
7              65    -29.4     97.6    -2.4   *
8             185   -157.0    112.9   -18.9   ***
9             158     37.9     90.6     5.3   ***
10             91    -41.8    105.3    -3.8   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School - Whole Cluster]
Figure 5. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by school.


By school cohort

Table 15
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by School Cohort

School Cohort a      N    Mdiff   SDdiff       t   Significance
1                  746    -61.7    127.2   -13.2   ***
2                  369    -95.0    134.2   -13.6   ***
3                  249      8.7    103.4     1.3

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
a see Table 1

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School Cohort - Whole Cluster]
Figure 6. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by school cohort.


By school type

Table 16
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by School Type

School Type       N    Mdiff   SDdiff       t   Significance
Primary        1179    -42.3    125.3   -11.6   ***
Secondary       185   -157.0    112.9   -18.9   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School Type - Whole Cluster]
Figure 7. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by school type.


By school religion

Table 17
Difference from Norm e-asTTle Writing Score (Whole Cluster) at Term 4 in 2015 by School Religion

School Religion      N    Mdiff   SDdiff       t   Significance
Catholic           129     -1.8     95.7    -0.2
Non-Catholic      1235    -63.7    131.5   -17.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School Religion - Whole Cluster]
Figure 8. Difference from norm e-asTTle writing score (whole cluster) at Term 4 in 2015 by school religion.


By classroom

Table 18
Number of Classrooms with Average Writing Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 in 2015 by School Cohort

Normative Comparison   School Cohort 1 and 2   School Cohort 3   Total
Above                                      1                 5       6
At                                        12                 3      15
Below                                     34                 3      37
Total                                     47                11      58
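The report does not spell out the rule for judging a classroom's average as above, at, or below the norm. One plausible rule, sketched here strictly as an assumption, is to check whether an approximate 95% confidence interval for the classroom's mean difference-from-norm score lies wholly above zero, straddles zero, or lies wholly below it.

```python
import math

def classify_classroom(mean_diff, sd_diff, n, z=1.96):
    """Classify a classroom mean against the norm via an approximate 95% CI.

    Assumed rule (normal approximation), not the report's documented method.
    """
    half_width = z * sd_diff / math.sqrt(n)
    if mean_diff - half_width > 0:
        return "Above"
    if mean_diff + half_width < 0:
        return "Below"
    return "At"

# Hypothetical classroom: mean -60, SD 120, 25 students -> CI about (-107, -13)
label = classify_classroom(-60.0, 120.0, 25)
```

With a class-sized n, only classrooms whose means sit well away from zero are classed as above or below, which is consistent with the large "At" counts in the table.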

School cohort 1 and 2

In this section, we consider only students (n = 1115) from school cohort 1 and 2 who have e-asTTle writing scores at Term 4 in 2015. An ANOVA was conducted to determine whether there was any variation in difference from norm e-asTTle writing score across year levels, schools and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 19, and descriptive statistics for school and student characteristics are presented in Tables 20-24. Figures 9-11 show confidence intervals of difference from norm e-asTTle writing score at Term 4 in 2015 by student characteristics (gender, ethnicity and year level, respectively). Figures 12-13 show confidence intervals by school characteristics (school type and school religion, respectively).

Table 19
ANOVA Summary Statistics – Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015

              df      F        p     ES*
Gender         1   49.8   < .001    0.04
Year Level     6   19.3   < .001    0.10
Ethnicity      4    1.9   = .105    0.01
School         1   30.6   < .001    0.03
Residuals   1102

Note: *The effect size used is partial eta squared (ηp²).


By gender

Table 20
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015 by Gender

Gender        N    Mdiff   SDdiff       t   Significance
Female      527    -45.8    119.2    -8.8   ***
Male        588    -96.8    135.4   -17.3   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Gender - School Cohort 1 and 2]
Figure 9. Difference from norm e-asTTle writing score (school cohort 1 and 2) at Term 4 in 2015 by gender.


By ethnicity

Table 21
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015 by Ethnicity

Ethnicity        N    Mdiff   SDdiff       t   Significance
Māori          301    -75.0    113.5   -11.5   ***
Pasifika       705    -75.3    132.7   -15.1   ***
NZ European     29    -95.0    185.4    -2.8   *
Other           79    -31.0    140.6    -2.0   .

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Ethnicity - School Cohort 1 and 2]
Figure 10. Difference from norm e-asTTle writing score (school cohort 1 and 2) at Term 4 in 2015 by ethnicity.


By year level

Table 22
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015 by Year Level

Year Level      N    Mdiff   SDdiff       t   Significance
4             171    -46.5    152.5    -4.0   ***
5             201    -60.3    129.8    -6.6   ***
6             180    -59.6    132.1    -6.1   ***
7             194    -50.7    111.8    -6.3   ***
8             184    -61.9    108.3    -7.8   ***
9             103   -131.2    111.2   -12.0   ***
10             82   -189.3    107.1   -16.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Year Level - School Cohort 1 and 2]
Figure 11. Difference from norm e-asTTle writing score (school cohort 1 and 2) at Term 4 in 2015 by year level.


By school type

Table 23
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015 by School Type

School Type      N    Mdiff   SDdiff       t   Significance
Primary        930    -56.0    127.2   -13.4   ***
Secondary      185   -157.0    112.9   -18.9   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School Type - School Cohort 1 and 2]
Figure 12. Difference from norm e-asTTle writing score (school cohort 1 and 2) at Term 4 in 2015 by school type.


By school religion

Table 24
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) at Term 4 in 2015 by School Religion

School Religion      N    Mdiff   SDdiff       t   Significance
Catholic           129     -1.8     95.7    -0.2
Non-Catholic       986    -82.0    131.6   -19.6   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by School Religion - School Cohort 1 and 2]
Figure 13. Difference from norm e-asTTle writing score (school cohort 1 and 2) at Term 4 in 2015 by school religion.


By transition to secondary school

In this section, we analysed e-asTTle writing achievement at Term 4 in 2015 for students at Tamaki College. Analyses compared the achievement of students (n = 185) who transitioned to Tamaki College from within the Manaiakalani cluster with that of students who transitioned from outside the cluster. Table 25 shows the number of students at Tamaki College who transitioned from within and outside the Manaiakalani cluster. Figures 14-15 present confidence intervals of difference from norm e-asTTle writing score at Term 4 in 2015 by transition to Tamaki College.

Table 25
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level   Transition from          N
9            Outside Manaiakalani    26
9            Within Manaiakalani     77
10           Outside Manaiakalani    42
10           Within Manaiakalani     40

Note: All students transitioned after Year 8.
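The figures that follow compare the transition groups via overlapping confidence intervals. An alternative way to compare two groups from summary statistics alone is a Welch two-sample t statistic; this is offered only as an illustrative sketch, and the means and SDs below are hypothetical, not taken from the report.

```python
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    # Welch t statistic for two independent groups from summary statistics
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (m1 - m2) / se

# Hypothetical means/SDs for "Within" (n = 77) vs "Outside" (n = 26) groups
t_stat = welch_t(-150.0, 110.0, 77, -120.0, 100.0, 26)
```

Welch's form does not assume equal variances, which suits groups of very different sizes such as these transition cohorts.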


[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Transition, Year 9]
Figure 14. Year 9 difference from norm e-asTTle writing scores by transition at Term 4 in 2015 at Tamaki College.

[Figure: Difference from Norm Writing Score at Term 4 in 2015 by Transition, Year 10]
Figure 15. Year 10 difference from norm e-asTTle writing scores by transition at Term 4 in 2015 at Tamaki College.

Progressive Achievement Test (PAT) reading

Whole cluster

In this section, we consider only students (n = 1431) from the whole cluster who have PAT reading scores at Term 4 in 2015. An ANOVA was conducted to determine whether there was any variation in difference from norm PAT reading score across year levels, schools and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 26, and descriptive statistics for school and student characteristics are presented in Tables 27-33. Figures 16-19 show confidence intervals of difference from norm PAT reading score at Term 4 in 2015 by student characteristics (gender, ethnicity, year level and school, respectively). Figures 20-22 show confidence intervals by school characteristics (school cohort, school type and school religion, respectively). Table 34 shows the number of classrooms whose average reading score was above, at or below the norm at Term 4 in 2015, by school cohort.

Table 26
ANOVA Summary Statistics – Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015

              df      F        p     ES*
Gender         1   60.6   < .001    0.04
Year Level     1   34.9   < .001    0.02
Ethnicity      3   50.0   < .001    0.10
School         1   25.8   < .001    0.02
Residuals   1415

Note: *The effect size used is partial eta squared (ηp²).


By gender

Table 27
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by Gender

Gender        N    Mdiff   SDdiff       t   Significance
Female      694     -8.1     12.7   -16.9   ***
Male        735    -12.9     11.8   -29.6   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Gender - Whole Cluster]
Figure 16. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by gender.


By ethnicity

Table 28
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by Ethnicity

Ethnicity        N    Mdiff   SDdiff       t   Significance
Māori          349    -11.6     11.7   -18.5   ***
Pasifika       822    -12.6     10.8   -33.3   ***
NZ European    108     -0.1     14.2    -0.1
Other          143     -4.0     15.3    -3.1   **

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Ethnicity - Whole Cluster]
Figure 17. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by ethnicity.


By year level

Table 29
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by Year Level

Year Level      N    Mdiff   SDdiff       t   Significance
4             249     -7.2     15.7    -7.2   ***
5             278    -11.7     11.0   -17.7   ***
6             240    -10.0     11.9   -13.0   ***
7             240     -9.7     12.0   -12.6   ***
8             220     -9.0     10.9   -12.3   ***
9             111    -14.9     11.1   -14.1   ***
10             93    -18.6      8.9   -20.1   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Year Level - Whole Cluster]
Figure 18. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by year level.


By school

Table 30
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by School

School          N    Mdiff   SDdiff       t   Significance
1             119    -14.4     10.2   -15.3   ***
2             336    -12.0     11.5   -19.2   ***
3             139    -14.0     10.3   -16.1   ***
4             100    -12.0     11.0   -10.9   ***
5             140     -7.9     12.2    -7.7   ***
6              64     -5.5     10.8    -4.1   ***
7              67     -9.8     10.0    -8.0   ***
8             204    -16.6     10.3   -22.9   ***
9             159      1.6     14.1     1.4
10            103     -9.7     11.3    -8.7   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School - Whole Cluster]
Figure 19. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by school.


By school cohort

Table 31
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by School Cohort

School Cohort a      N    Mdiff   SDdiff       t       p
1                  782    -11.4     11.3   -28.2   0.000
2                  387    -14.0     11.1   -25.0   0.000
3                  262     -2.8     14.2    -3.2   0.001

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
a see Table 1

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School Cohort - Whole Cluster]
Figure 20. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by school cohort.


By school type

Table 32
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by School Type

School Type       N    Mdiff   SDdiff       t   Significance
Primary        1227     -9.6     12.5   -26.9   ***
Secondary       204    -16.6     10.3   -22.9   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School Type - Whole Cluster]
Figure 21. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by school type.


By school religion

Table 33
Difference from Norm PAT Reading Score (Whole Cluster) at Term 4 in 2015 by School Religion

School Religion      N    Mdiff   SDdiff       t   Significance
Catholic           131     -7.7     10.6    -8.3   ***
Non-Catholic      1300    -10.9     12.6   -31.1   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School Religion - Whole Cluster]
Figure 22. Difference from norm PAT reading score (whole cluster) at Term 4 in 2015 by school religion.


By classroom

Table 34
Number of Classrooms with Average Reading Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 in 2015 by School Cohort

Normative Comparison   School Cohort 1 and 2   School Cohort 3   Total
Above                                      0                 1       1
At                                         2                 6       8
Below                                     47                 5      52
Total                                     49                12      61

School cohort 1 and 2

In this section, we consider only students (n = 1169) from school cohort 1 and 2 who have PAT reading scores at Term 4 in 2015. An ANOVA was conducted to determine whether there was any variation in difference from norm PAT reading score across year levels, schools and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 35, and descriptive statistics for school and student characteristics are presented in Tables 36-40. Figures 23-25 show confidence intervals of difference from norm PAT reading score at Term 4 in 2015 by student characteristics (gender, ethnicity and year level, respectively). Figures 26-27 show confidence intervals by school characteristics (school type and school religion, respectively).

Table 35
ANOVA Summary Statistics – Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015

              df      F        p     ES*
Gender         1   44.9   < .001    0.04
Year Level     1   10.6   < .01     0.01
Ethnicity      3    8.4   < .001    0.02
School         1    1.8   = .182    0.00
Residuals   1162

Note: *The effect size used is partial eta squared (ηp²).


By gender

Table 36
Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015 by Gender

Gender        N    Mdiff   SDdiff       t   Significance
Female      552    -10.0     11.5   -20.5   ***
Male        617    -14.3     10.8   -33.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Gender - School Cohort 1 and 2]
Figure 23. Difference from norm PAT reading score (school cohort 1 and 2) at Term 4 in 2015 by gender.


By ethnicity

Table 37
Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015 by Ethnicity

Ethnicity        N    Mdiff   SDdiff       t   Significance
Māori          320    -12.3     11.2   -19.6   ***
Pasifika       739    -13.0     10.7   -32.9   ***
NZ European     32     -9.5     14.2    -3.8   ***
Other           78     -6.8     14.2    -4.3   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Ethnicity - School Cohort 1 and 2]
Figure 24. Difference from norm PAT reading score (school cohort 1 and 2) at Term 4 in 2015 by ethnicity.


By year level

Table 38
Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015 by Year Level

Year Level      N    Mdiff   SDdiff       t   Significance
4             172    -11.1     13.8   -10.5   ***
5             219    -13.1     10.0   -19.4   ***
6             195    -11.3     11.1   -14.2   ***
7             199    -11.0     11.3   -13.6   ***
8             180    -10.3     10.3   -13.4   ***
9             111    -14.9     11.1   -14.1   ***
10             93    -18.6      8.9   -20.1   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Year Level - School Cohort 1 and 2]
Figure 25. Difference from norm PAT reading score (school cohort 1 and 2) at Term 4 in 2015 by year level.


By school type

Table 39
Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015 by School Type

School Type      N    Mdiff   SDdiff       t   Significance
Primary        965    -11.4     11.3   -31.3   ***
Secondary      204    -16.6     10.3   -22.9   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School Type - School Cohort 1 and 2]
Figure 26. Difference from norm PAT reading score (school cohort 1 and 2) at Term 4 in 2015 by school type.


By school religion

Table 40
Difference from Norm PAT Reading Score (School Cohort 1 and 2) at Term 4 in 2015 by School Religion

School Religion      N    Mdiff   SDdiff       t   Significance
Catholic           131     -7.7     10.6    -8.3   ***
Non-Catholic      1038    -12.9     11.3   -36.8   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by School Religion - School Cohort 1 and 2]
Figure 27. Difference from norm PAT reading score (school cohort 1 and 2) at Term 4 in 2015 by school religion.


By transition to secondary school

In this section, we analysed Progressive Achievement Test (PAT) reading achievement at Term 4 in 2015 for students at Tamaki College. Analyses compared the achievement of students (n = 204) who transitioned to Tamaki College from within the Manaiakalani cluster with that of students who transitioned from outside the cluster. Table 41 shows the number of students at Tamaki College who transitioned from within and outside the Manaiakalani cluster. Figures 28-29 present confidence intervals of difference from norm PAT reading score at Term 4 in 2015 by transition to Tamaki College.

Table 41
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level   Transition from          N
9            Outside Manaiakalani    32
9            Within Manaiakalani     79
10           Outside Manaiakalani    51
10           Within Manaiakalani     42

Note: All students transitioned after Year 8.


[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Transition, Year 9]
Figure 28. Year 9 difference from norm PAT reading scores by transition at Term 4 in 2015 at Tamaki College.

[Figure: Difference from Norm Reading Score at Term 4 in 2015 by Transition, Year 10]
Figure 29. Year 10 difference from norm PAT reading scores by transition at Term 4 in 2015 at Tamaki College.

Progressive Achievement Test (PAT) mathematics

Whole cluster

In this section, we consider only students (n = 1450) from the whole cluster who have Progressive Achievement Test (PAT) mathematics scores at Term 4 in 2015. An ANOVA was conducted to determine whether there was any variation in difference from norm PAT mathematics score across year levels, schools and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 42, and descriptive statistics for school and student characteristics are presented in Tables 43-49. Figures 30-33 show confidence intervals of difference from norm PAT mathematics score at Term 4 in 2015 by student characteristics (gender, ethnicity, year level and school, respectively). Figures 34-36 show confidence intervals by school characteristics (school cohort, school type and school religion, respectively). Table 50 shows the number of classrooms whose average mathematics score was above, at or below the norm at Term 4 in 2015, by school cohort.

Table 42
ANOVA Summary Statistics – Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015

              df      F        p     ES*
Gender         1    0.4   = .527    0.00
Year Level     1   42.0   < .001    0.03
Ethnicity      3   72.8   < .001    0.13
School         1    7.8   < .01     0.01
Residuals   1434

Note: *The effect size used is partial eta squared (ηp²).


By gender

Table 43
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by Gender

Gender        N    Mdiff   SDdiff       t   Significance
Female      709     -7.8     12.4   -16.8   ***
Male        741     -8.2     11.7   -19.1   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by Gender - Whole Cluster]
Figure 30. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by gender.


By ethnicity

Table 44
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by Ethnicity

Ethnicity        N    Mdiff   SDdiff       t   Significance
Māori          351     -9.0     10.5   -16.0   ***
Pasifika       837    -10.5     10.4   -29.0   ***
NZ European    107      3.4     13.7     2.6   *
Other          146      0.6     14.1     0.5

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by Ethnicity - Whole Cluster]
Figure 31. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by ethnicity.


By year level

Table 45
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by Year Level

Year Level      N    Mdiff   SDdiff       t   Significance
4             258     -6.0     14.0    -6.9   ***
5             279     -7.5     12.5   -10.0   ***
6             242     -6.1     11.4    -8.3   ***
7             239     -7.9     11.7   -10.4   ***
8             226     -8.9     10.8   -12.3   ***
9             116    -11.4      9.2   -13.5   ***
10             90    -14.1      9.9   -13.6   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by Year Level - Whole Cluster]
Figure 32. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by year level.


By school

Table 46
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by School

School          N    Mdiff   SDdiff       t   Significance
1             126     -9.4     11.1    -9.5   ***
2             335     -9.0     10.7   -15.4   ***
3             144     -9.6     10.1   -11.4   ***
4             100    -10.7     10.7   -10.0   ***
5             139     -8.7     12.2    -8.4   ***
6              63     -5.5     10.6    -4.2   ***
7              70     -8.7      9.2    -7.9   ***
8             206    -12.6      9.5   -19.0   ***
9             164      6.0     12.7     6.1   ***
10            103    -11.6     10.9   -10.8   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by School - Whole Cluster]
Figure 33. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by school.

By school cohort

Table 47
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by School Cohort

School Cohort a      N    Mdiff   SDdiff       t   Significance
1                  788     -9.2     10.8   -24.2   ***
2                  395    -10.4     10.5   -19.7   ***
3                  267     -0.8     14.8    -0.9

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
a see Table 1

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by School Cohort - Whole Cluster]
Figure 34. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by school cohort.


By school type

Table 48
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by School Type

School Type       N    Mdiff   SDdiff       t   Significance
Primary        1244     -7.3     12.2   -20.9   ***
Secondary       206    -12.6      9.5   -19.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by School Type - Whole Cluster]
Figure 35. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by school type.


By school religion

Table 49
Difference from Norm PAT Mathematics Score (Whole Cluster) at Term 4 in 2015 by School Religion

School Religion      N    Mdiff   SDdiff       t   Significance
Catholic           133     -7.2      9.9    -8.3   ***
Non-Catholic      1317     -8.1     12.2   -24.0   ***

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: Difference from Norm Mathematics Score at Term 4 in 2015 by School Religion - Whole Cluster]
Figure 36. Difference from norm PAT mathematics score (whole cluster) at Term 4 in 2015 by school religion.

68

By classroom

Table 50
Number of Classrooms with Average Mathematics Score Above, At, and Below Norm Score (Whole Cluster) at Term 4, 2015 by School Cohort

Normative Comparison | School Cohort 1 and 2 | School Cohort 3 | Total
Above | 0 | 4 | 4
At | 5 | 3 | 8
Below | 45 | 5 | 50
Total | 50 | 12 | 62

School cohort 1 and 2

In this section, we consider only students (n = 1183) from school cohorts 1 and 2 who have Progressive Achievement Test (PAT) mathematics scores at Term 4 in 2015. An ANOVA was conducted to determine whether difference from norm PAT mathematics score varied across year levels, schools, and ethnicities, and between genders. A summary of the ANOVA statistics is provided in Table 51, and descriptive statistics for each school and student characteristic are presented in Table 52 - Table 56. Figure 37 - Figure 39 show confidence intervals of difference from norm PAT mathematics score at Term 4 in 2015 by student characteristics (gender, ethnicity and year level, respectively). Figure 40 - Figure 41 show confidence intervals of difference from norm PAT mathematics score at Term 4 in 2015 by school characteristics (school type and school religion, respectively).

Table 51
ANOVA Summary Statistics – Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015

Source | df | F | p | ES*
Gender | 1 | 1.1 | = .292 | 0.00
Year Level | 1 | 15.4 | < .001 | 0.01
Ethnicity | 3 | 11.0 | < .001 | 0.03
School | 1 | 0.2 | = .632 | 0.00
Residuals | 1176 | | |
Note: *The effect size used is partial eta squared ηp2.
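The partial eta squared effect size reported in Table 51 is conventionally computed from sums of squares as ηp² = SS_effect / (SS_effect + SS_residual). A minimal sketch of that formula (our own helper with illustrative, hypothetical sums of squares, not values from the report):

```python
def partial_eta_squared(ss_effect: float, ss_residual: float) -> float:
    """Partial eta squared: share of variance attributable to an effect,
    relative to the effect plus residual variance."""
    return ss_effect / (ss_effect + ss_residual)

# Hypothetical sums of squares for an effect explaining a small share of
# variance, comparable to Year Level in Table 51 (eta_p^2 = 0.01).
es = partial_eta_squared(1200.0, 118800.0)  # -> 0.01
```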


By gender

Table 52
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015 by Gender

Gender | N | Mdiff | SDdiff | t | Significance
Female | 565 | -10.0 | 10.7 | -22.1 | ***
Male | 618 | -9.3 | 10.6 | -21.8 | ***
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by Gender - School Cohort 1 and 2; Female vs Male.]

Figure 37. Difference from norm PAT mathematics score (school cohort 1 and 2) at Term 4 in 2015 by gender.

By ethnicity

Table 53
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015 by Ethnicity

Ethnicity | N | Mdiff | SDdiff | t | Significance
Māori | 320 | -9.4 | 10.3 | -16.3 | ***
Pasifika | 753 | -10.6 | 10.3 | -28.1 | ***
NZ European | 32 | -4.7 | 16.1 | -1.7 |
Other | 78 | -3.9 | 10.7 | -3.2 | **
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by Ethnicity - School Cohort 1 and 2; Māori, Pasifika, NZ European, Other.]

Figure 38. Difference from norm PAT mathematics score (school cohort 1 and 2) at Term 4 in 2015 by ethnicity.

By year level

Table 54
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015 by Year Level

Year Level | N | Mdiff | SDdiff | t | Significance
4 | 179 | -9.4 | 12.0 | -10.5 | ***
5 | 218 | -9.4 | 11.5 | -12.1 | ***
6 | 196 | -7.0 | 10.5 | -9.4 | ***
7 | 196 | -8.8 | 10.8 | -11.3 | ***
8 | 188 | -10.6 | 8.8 | -16.5 | ***
9 | 116 | -11.4 | 9.2 | -13.5 | ***
10 | 90 | -14.1 | 9.9 | -13.6 | ***
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by Year Level - School Cohort 1 and 2; year levels 4-10.]

Figure 39. Difference from norm PAT mathematics score (school cohort 1 and 2) at Term 4 in 2015 by year level.

By school type

Table 55
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015 by School Type

School Type | N | Mdiff | SDdiff | t | Significance
Primary | 977 | -9.0 | 10.8 | -26.1 | ***
Secondary | 206 | -12.6 | 9.5 | -19.0 | ***
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by School Type - School Cohort 1 and 2; Primary vs Secondary.]

Figure 40. Difference from norm PAT mathematics score (school cohort 1 and 2) at Term 4 in 2015 by school type.

By school religion

Table 56
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) at Term 4 in 2015 by School Religion

School Religion | N | Mdiff | SDdiff | t | Significance
Catholic | 133 | -7.2 | 9.9 | -8.3 | ***
Non-Catholic | 1050 | -10.0 | 10.7 | -30.1 | ***
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by School Religion - School Cohort 1 and 2; Catholic vs Non-Catholic.]

Figure 41. Difference from norm PAT mathematics score (school cohort 1 and 2) at Term 4 in 2015 by school religion.

By transition to secondary school

In this section we analysed Progressive Achievement Test (PAT) mathematics achievement at Term 4 in 2015 for students from Tamaki College. Analyses compared achievement of students (n = 206) who transitioned to Tamaki College from within the Manaiakalani cluster against achievement of those who transitioned from outside the cluster. Table 57 shows the number of students at Tamaki College who transitioned from within and outside the Manaiakalani cluster. Figure 42 - Figure 43 present confidence intervals of difference from norm PAT mathematics score at Term 4 in 2015 by transition to Tamaki College.

Table 57
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level | Transition From | N
9 | Outside Manaiakalani | 36
9 | Within Manaiakalani | 80
10 | Outside Manaiakalani | 47
10 | Within Manaiakalani | 43
Note: All students transitioned after Year 8.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by Transition - Year 9; Outside vs Within Manaiakalani.]

Figure 42. Year 9 difference from norm PAT mathematics scores by transition at Term 4 in 2015 at Tamaki College.

[Bar chart omitted: Difference from Norm Mathematics Score at Term 4 in 2015 by Transition - Year 10; Outside vs Within Manaiakalani.]

Figure 43. Year 10 difference from norm PAT mathematics scores by transition at Term 4 in 2015 at Tamaki College.

3.2 Writing, Reading and Mathematics – Term 1, 2015 to Term 4, 2015

e-asTTle writing

Whole cluster

In this section, we consider only students (n = 1254) who have e-asTTle writing scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 58 - Table 64. Figure 44 - Figure 48 show difference from norm e-asTTle writing score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity, year level, and school, respectively). Figure 49 - Figure 51 show difference from norm e-asTTle writing score across Term 1, 2015 to Term 4, 2015 by school characteristics (school cohort, school type, and school religion, respectively).


By gender

Table 58
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Gender

Gender | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Female | 620 | -52.8 | 127.5 | *** | -29.0 | 116.1 | *** | 0.19
Male | 634 | -103.9 | 133.8 | *** | -77.0 | 132.9 | *** | 0.20
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
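The effect sizes in these longitudinal tables behave like a standardised mean change: the Term 4 mean minus the Term 1 mean, divided by a pooled standard deviation of the two terms. A sketch under that assumption (the exact pooling formula used in the report is not stated in this section, so treat this as illustrative):

```python
import math

def standardized_change(m1: float, sd1: float, m2: float, sd2: float) -> float:
    """Standardised mean change: (Term 4 M - Term 1 M) / pooled SD of the two terms."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / pooled_sd

# Female writing, Table 58: Term 1 M = -52.8 (SD 127.5), Term 4 M = -29.0 (SD 116.1)
es = standardized_change(-52.8, 127.5, -29.0, 116.1)
# es is approximately 0.195, consistent with the tabled 0.19
```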

[Bar chart omitted: e-asTTle Writing by Gender - Whole Cluster; Term 1 and Term 4, 2015, Female and Male against the norm.]

Figure 44. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 59
Differences from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

Ethnicity | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Māori | 295 | -83.6 | 124.0 | *** | -61.6 | 115.2 | *** | 0.18
Pasifika | 738 | -91.9 | 128.6 | *** | -67.7 | 127.5 | *** | 0.19
NZ European | 82 | -25.5 | 148.3 | | 0.1 | 144.3 | | 0.18
Other | 138 | -28.1 | 148.0 | * | 11.2 | 109.8 | | 0.30
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by Ethnicity - Whole Cluster; Term 1 and Term 4, 2015, Māori, Pasifika, NZ European and Other against the norm.]

Figure 45. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 60
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Year Level

Year Level | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
4 | 222 | -64.7 | 165.8 | *** | -12.5 | 138.9 | | 0.34
5 | 245 | -71.1 | 146.8 | *** | -44.2 | 126.5 | *** | 0.20
6 | 206 | -87.1 | 141.2 | *** | -46.5 | 125.2 | *** | 0.30
7 | 220 | -61.3 | 108.3 | *** | -43.8 | 104.5 | *** | 0.17
8 | 206 | -70.6 | 99.8 | *** | -45.8 | 102.6 | *** | 0.25
9 | 84 | -122.0 | 107.0 | *** | -131.8 | 117.8 | *** | -0.09
10 | 70 | -149.2 | 91.1 | *** | -192.1 | 110.8 | *** | -0.42
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by Year Level - Whole Cluster; Term 1 and Term 4, 2015, year levels 4-10 against the norm.]

Figure 46. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by year level.

[Line chart omitted: Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - Whole Cluster; overall e-asTTle writing score (1,250-1,850) for the Manaiakalani Cluster, Norm and Baseline, plotted at February and November of Years 4-10.]

Figure 47. Overall e-asTTle writing scores by year levels (whole cluster) across Term 1, 2015 and Term 4, 2015.

By school

Table 61
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School

School | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
1 | 109 | -65.0 | 111.3 | *** | -48.2 | 108.8 | *** | 0.15
2 | 312 | -122.3 | 131.4 | *** | -75.1 | 132.1 | *** | 0.36
3 | 108 | -45.8 | 114.6 | *** | -74.9 | 132.3 | *** | -0.24
4 | 97 | -81.7 | 118.2 | *** | -36.9 | 90.6 | *** | 0.43
5 | 128 | -85.4 | 167.0 | *** | -36.9 | 134.5 | ** | 0.32
6 | 62 | 0.8 | 111.5 | | 27.5 | 82.5 | * | 0.27
7 | 61 | -86.0 | 159.6 | *** | -23.7 | 87.4 | * | 0.48
8 | 154 | -134.4 | 100.7 | *** | -159.2 | 118.2 | *** | -0.23
9 | 134 | 7.2 | 122.4 | | 44.0 | 80.3 | *** | 0.36
10 | 88 | -50.9 | 90.6 | *** | -37.7 | 104.6 | ** | 0.13

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by School - Whole Cluster; Term 1 and Term 4, 2015, schools 1-10 against the norm.]

Figure 48. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school.

By school cohort

Table 62
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Cohort

School Cohort a | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
1 | 706 | -95.2 | 139.5 | *** | -58.5 | 125.7 | *** | 0.28
2 | 325 | -85.3 | 118.2 | *** | -86.4 | 131.6 | *** | -0.01
3 | 222 | -15.8 | 114.3 | * | 11.6 | 98.9 | . | 0.26
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
a see Table 1

[Bar chart omitted: e-asTTle Writing by School Cohort - Whole Cluster; Term 1 and Term 4, 2015, school cohorts 1-3 against the norm.]

Figure 49. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school cohort.

By school type

Table 63
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Type

School Type | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Primary | 1100 | -70.8 | 135.3 | *** | -38.4 | 121.1 | *** | 0.25
Secondary | 154 | -134.4 | 100.7 | *** | -159.2 | 118.2 | *** | -0.23
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by School Type - Whole Cluster; Term 1 and Term 4, 2015, Primary and Secondary against the norm.]

Figure 50. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 64
Difference from Norm e-asTTle Writing Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Religion

School Religion | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Catholic | 123 | -42.3 | 143.7 | ** | 2.1 | 88.5 | | 0.37
Non-Catholic | 1131 | -82.5 | 131.4 | *** | -59.3 | 129.2 | *** | 0.18
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by School Religion - Whole Cluster; Term 1 and Term 4, 2015, Catholic and Non-Catholic against the norm.]

Figure 51. Difference from norm e-asTTle writing score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school religion.

School cohort 1 and 2

In the following, we consider only students (n = 1032) from school cohorts 1 and 2 who have e-asTTle writing scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 65 - Table 69. Figure 52 - Figure 55 show difference from norm e-asTTle writing score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity, and year level, respectively). Figure 56 - Figure 57 show difference from norm e-asTTle writing score across Term 1, 2015 to Term 4, 2015 by school characteristics (school type and school religion, respectively).

By gender

Table 65
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Gender

Gender | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Female | 496 | -67.8 | 130.5 | *** | -43.2 | 116.7 | *** | 0.20
Male | 536 | -114.6 | 131.6 | *** | -89.6 | 134.3 | *** | 0.19
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by Gender - School Cohort 1 and 2; Term 1 and Term 4, 2015, Female and Male against the norm.]

Figure 52. Difference from norm e-asTTle writing score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 66
Differences from Norm e-asTTle Writing Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

Ethnicity | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Māori | 272 | -89.9 | 124.1 | *** | -67.7 | 114.1 | *** | 0.19
Pasifika | 660 | -96.7 | 131.8 | *** | -71.8 | 129.9 | *** | 0.19
NZ European | 28 | -98.4 | 169.0 | ** | -96.7 | 188.6 | * | 0.01
Other | 71 | -54.7 | 158.1 | ** | -9.3 | 120.7 | | 0.32
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by Ethnicity - School Cohort 1 and 2; Term 1 and Term 4, 2015, Māori, Pasifika, NZ European and Other against the norm.]

Figure 53. Difference from norm e-asTTle writing score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 67
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Year Level

Year Level | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
4 | 159 | -96.2 | 169.2 | *** | -39.8 | 146.4 | *** | 0.36
5 | 194 | -88.0 | 145.4 | *** | -58.2 | 130.5 | *** | 0.22
6 | 168 | -98.5 | 149.3 | *** | -54.9 | 129.8 | *** | 0.31
7 | 183 | -66.1 | 111.6 | *** | -46.6 | 106.5 | *** | 0.18
8 | 173 | -76.7 | 98.3 | *** | -54.6 | 98.3 | *** | 0.22
9 | 84 | -122.0 | 107.0 | *** | -131.8 | 117.8 | *** | -0.09
10 | 70 | -149.2 | 91.1 | *** | -192.1 | 110.8 | *** | -0.42
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by Year Level - School Cohort 1 and 2; Term 1 and Term 4, 2015, year levels 4-10 against the norm.]

Figure 54. Difference from norm e-asTTle writing score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by year level.

[Line chart omitted: Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - School Cohort 1 and 2; overall e-asTTle writing score (1,250-1,850) for the Manaiakalani Cluster, Norm and Baseline, plotted at February and November of Years 4-10.]

Figure 55. Overall e-asTTle writing scores by year levels (school cohort 1 and 2) across Term 1, 2015 and Term 4, 2015.

By school type

Table 68
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Type

School Type | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Primary | 878 | -84.7 | 136.7 | *** | -51.1 | 123.0 | *** | 0.26
Secondary | 154 | -134.4 | 100.7 | *** | -159.2 | 118.2 | *** | -0.23
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by School Type - School Cohort 1 and 2; Term 1 and Term 4, 2015, Primary and Secondary against the norm.]

Figure 56. Difference from norm e-asTTle writing score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 69
Difference from Norm e-asTTle Writing Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Religion

School Religion | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Catholic | 123 | -42.3 | 143.7 | ** | 2.1 | 88.5 | | 0.37
Non-Catholic | 909 | -98.8 | 130.2 | *** | -76.7 | 129.8 | *** | 0.17
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: e-asTTle Writing by School Religion - School Cohort 1 and 2; Term 1 and Term 4, 2015, Catholic and Non-Catholic against the norm.]

Figure 57. Difference from norm e-asTTle writing score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school religion.

By transition to secondary school

In this section we analysed e-asTTle writing achievement at Term 1 and Term 4, 2015 for students from Tamaki College. We considered only students (n = 154) who have e-asTTle writing scores at both Term 1 and Term 4, 2015. Analyses compared achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster against achievement of those who transitioned from outside the cluster. Table 70 shows the number of students at Tamaki College who transitioned from within and outside the Manaiakalani cluster. Figure 58 presents difference from norm e-asTTle writing score across Term 1, 2015 to Term 4, 2015 by transition to Tamaki College.

Table 70
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level | Transition From | N
9 | Outside Manaiakalani | 20
9 | Within Manaiakalani | 64
10 | Outside Manaiakalani | 35
10 | Within Manaiakalani | 35

Note: All students transitioned after Year 8.

[Bar chart omitted: Difference from Norm e-asTTle Writing Score at Term 1 and Term 4 in 2015 by Transition; Year 9 and Year 10, Outside vs Within Manaiakalani.]

Figure 58. Difference from norm e-asTTle writing scores by year levels and Manaiakalani transition across Term 1, 2015 to Term 4, 2015 at Tamaki College.

Progressive Achievement Test (PAT) reading

Whole cluster

In the following, we consider only students (n = 1311) who have PAT reading scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 71 - Table 77. Figure 59 - Figure 63 show difference from norm PAT reading score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity, year level, and school, respectively). Figure 64 - Figure 66 show difference from norm PAT reading score across Term 1, 2015 to Term 4, 2015 by school characteristics (school cohort, school type, and school religion, respectively).

By gender

Table 71
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Gender

Gender | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Female | 640 | -6.9 | 12.6 | *** | -7.8 | 12.5 | *** | -0.07
Male | 671 | -11.2 | 12.3 | *** | -12.5 | 11.9 | *** | -0.10
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Gender - Whole Cluster; Term 1 and Term 4, 2015, Female and Male against the norm.]

Figure 59. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 72
Differences from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

Ethnicity | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Māori | 318 | -10.2 | 11.7 | *** | -11.1 | 11.5 | *** | -0.08
Pasifika | 764 | -10.7 | 11.4 | *** | -12.4 | 10.8 | *** | -0.15
NZ European | 100 | -0.4 | 16.2 | | 0.4 | 14.0 | | 0.05
Other | 129 | -3.9 | 14.4 | ** | -3.1 | 15.0 | * | 0.06
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Ethnicity - Whole Cluster; Term 1 and Term 4, 2015, Māori, Pasifika, NZ European and Other against the norm.]

Figure 60. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 73
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Year Level

Year Level | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
4 | 209 | -8.1 | 13.3 | *** | -5.6 | 15.5 | *** | 0.17
5 | 256 | -9.1 | 12.8 | *** | -11.4 | 11.1 | *** | -0.19
6 | 223 | -9.0 | 12.7 | *** | -9.8 | 11.9 | *** | -0.06
7 | 227 | -10.0 | 13.3 | *** | -9.6 | 12.0 | *** | 0.03
8 | 211 | -9.6 | 14.2 | *** | -9.1 | 11.0 | *** | 0.03
9 | 101 | -6.7 | 7.9 | *** | -14.8 | 11.3 | *** | -0.83
10 | 84 | -11.4 | 6.8 | *** | -17.6 | 7.6 | *** | -0.86
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Year Level - Whole Cluster; Term 1 and Term 4, 2015, year levels 4-10 against the norm.]

Figure 61. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by year level.

[Line chart omitted: Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - Whole Cluster; scaled PAT reading score (0-90) for the Manaiakalani Cluster and Norm, plotted at February and November of Years 4-10.]

Figure 62. Scaled PAT reading scores by year levels (whole cluster) across Term 1, 2015 and Term 4, 2015.

By school

Table 74
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School

School | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
1 | 88 | -14.5 | 12.4 | *** | -13.1 | 9.3 | *** | 0.13
2 | 320 | -11.6 | 12.5 | *** | -12.0 | 11.4 | *** | -0.03
3 | 130 | -13.4 | 13.7 | *** | -13.7 | 10.5 | *** | -0.02
4 | 99 | -9.3 | 10.4 | *** | -12.1 | 11.0 | *** | -0.25
5 | 129 | -8.7 | 12.3 | *** | -7.9 | 12.4 | *** | 0.07
6 | 62 | -6.5 | 10.8 | *** | -5.4 | 10.7 | *** | 0.11
7 | 64 | -8.4 | 11.9 | *** | -9.4 | 10.0 | *** | -0.09
8 | 185 | -8.8 | 7.8 | *** | -16.1 | 9.9 | *** | -0.81
9 | 145 | 0.8 | 14.5 | | 2.3 | 13.8 | * | 0.11
10 | 89 | -7.9 | 11.3 | *** | -9.0 | 11.7 | *** | -0.09

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by School - Whole Cluster; Term 1 and Term 4, 2015, schools 1-10 against the norm.]

Figure 63. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school.

By school cohort

Table 75
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Cohort

School Cohort a | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
1 | 742 | -10.8 | 12.5 | *** | -11.3 | 11.4 | *** | -0.04
2 | 335 | -9.9 | 10.1 | *** | -13.3 | 10.7 | *** | -0.33
3 | 234 | -2.5 | 14.0 | ** | -2.0 | 14.2 | * | 0.04
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
a see Table 1

[Bar chart omitted: PAT Reading by School Cohort - Whole Cluster; Term 1 and Term 4, 2015, school cohorts 1-3 against the norm.]

Figure 64. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school cohort.

By school type

Table 76
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Type

School Type | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Primary | 1126 | -9.1 | 13.2 | *** | -9.2 | 12.5 | *** | 0.00
Secondary | 185 | -8.8 | 7.8 | *** | -16.1 | 9.9 | *** | -0.81
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by School Type - Whole Cluster; Term 1 and Term 4, 2015, Primary and Secondary against the norm.]

Figure 65. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 77
Difference from Norm PAT Reading Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Religion

School Religion | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Catholic | 126 | -7.5 | 11.4 | *** | -7.4 | 10.5 | *** | 0.01
Non-Catholic | 1185 | -9.3 | 12.7 | *** | -10.5 | 12.5 | *** | -0.09
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by School Religion - Whole Cluster; Term 1 and Term 4, 2015, Catholic and Non-Catholic against the norm.]

Figure 66. Difference from norm PAT reading score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school religion.

School cohort 1 and 2

In this section, we consider only students (n = 1077) from school cohorts 1 and 2 who have PAT reading scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 78 - Table 82. Figure 67 - Figure 70 show difference from norm PAT reading score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity and year level, respectively). Figure 71 - Figure 72 show difference from norm PAT reading score across Term 1, 2015 to Term 4, 2015 by school characteristics (school type and school religion, respectively).

By gender

Table 78
Difference from Norm PAT Reading Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Gender

Gender | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Female | 511 | -8.5 | 11.5 | *** | -9.7 | 11.2 | *** | -0.10
Male | 566 | -12.3 | 11.8 | *** | -14.0 | 10.8 | *** | -0.15
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Gender - School Cohort 1 and 2; Term 1 and Term 4, 2015, Female and Male against the norm.]

Figure 67. Difference from norm PAT reading score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 79
Differences from Norm PAT Reading Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

Ethnicity | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Māori | 291 | -10.8 | 11.6 | *** | -11.8 | 10.8 | *** | -0.09
Pasifika | 684 | -11.1 | 11.3 | *** | -12.8 | 10.7 | *** | -0.16
NZ European | 29 | -7.9 | 17.7 | * | -9.7 | 13.6 | *** | -0.11
Other | 73 | -5.4 | 13.2 | *** | -5.6 | 13.9 | *** | -0.02
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Ethnicity - School Cohort 1 and 2; Term 1 and Term 4, 2015, Māori, Pasifika, NZ European and Other against the norm.]

Figure 68. Difference from norm PAT reading score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 80
Difference from Norm PAT Reading Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Year Level

Year Level | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
4 | 140 | -10.8 | 11.3 | *** | -9.4 | 13.7 | *** | 0.11
5 | 207 | -10.3 | 12.1 | *** | -13.1 | 10.1 | *** | -0.25
6 | 182 | -10.6 | 11.9 | *** | -11.1 | 11.0 | *** | -0.05
7 | 188 | -11.7 | 13.0 | *** | -10.9 | 11.3 | *** | 0.07
8 | 175 | -11.0 | 13.7 | *** | -10.3 | 10.5 | *** | 0.06
9 | 101 | -6.7 | 7.9 | *** | -14.8 | 11.3 | *** | -0.83
10 | 84 | -11.4 | 6.8 | *** | -17.6 | 7.6 | *** | -0.86
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by Year Level - School Cohort 1 and 2; Term 1 and Term 4, 2015, year levels 4-10 against the norm.]

Figure 69. Difference from norm PAT reading score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by year level.

[Line chart omitted: Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - School Cohort 1 and 2; scaled PAT reading score (0-90) for the Manaiakalani Cluster and Norm, plotted at February and November of Years 4-10.]

Figure 70. Scaled PAT reading scores by year levels (school cohort 1 and 2) across Term 1, 2015 and Term 4, 2015.

By school type

Table 81
Difference from Norm PAT Reading Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Type

School Type | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Primary | 892 | -10.9 | 12.5 | *** | -11.1 | 11.3 | *** | -0.02
Secondary | 185 | -8.8 | 7.8 | *** | -16.1 | 9.9 | *** | -0.81
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by School Type - School Cohort 1 and 2; Term 1 and Term 4, 2015, Primary and Secondary against the norm.]

Figure 71. Difference from norm PAT reading score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 82
Difference from Norm PAT Reading Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Religion

School Religion | N | Term 1 M | Term 1 SD | Sig. | Term 4 M | Term 4 SD | Sig. | Effect Size
Catholic | 126 | -7.5 | 11.4 | *** | -7.4 | 10.5 | *** | 0.01
Non-Catholic | 951 | -10.9 | 11.8 | *** | -12.5 | 11.2 | *** | -0.14
Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Bar chart omitted: PAT Reading by School Religion - School Cohort 1 and 2; Term 1 and Term 4, 2015, Catholic and Non-Catholic against the norm.]

Figure 72. Difference from norm PAT reading score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school religion.

By transition to secondary school

In this section we analysed PAT reading achievement at Term 1 and Term 4, 2015 for students at Tamaki College. We considered only the students (n = 185) who have PAT reading scores at both Term 1 and Term 4, 2015. Analyses compared the achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster with that of students who transitioned from outside the cluster. Table 83 shows the number of students at Tamaki College who transitioned from within and from outside the Manaiakalani cluster. Figure 73 presents the difference from the reading norm score across Term 1, 2015 to Term 4, 2015 by transition to Tamaki College.

Table 83
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level   Transition From        N
9            Outside Manaiakalani   26
9            Within Manaiakalani    75
10           Outside Manaiakalani   46
10           Within Manaiakalani    38

Note: All students transitioned after Year 8.

[Figure: bar chart "Difference from Norm PAT Reading Score at Term 1 and Term 4 in 2015 by Transition"; y-axis: Difference from Norm Score; series: Year 9 Outside, Year 9 Within, Year 10 Outside, Year 10 Within]
Figure 73. Difference from norm PAT reading scores by year levels and Manaiakalani transition across Term 1, 2015 to Term 4, 2015 at Tamaki College.

Progressive Achievement Test (PAT) mathematics

Whole cluster

In this section we consider only the students (n = 1297) who have PAT mathematics scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 84 - Table 90. Figure 74 - Figure 78 show the difference from the mathematics norm score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity, year level, and school respectively). Figure 79 - Figure 81 show the difference from the mathematics norm score across Term 1, 2015 to Term 4, 2015 by school characteristics (school cohort, school type, and school religion respectively).
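The statistics reported in the tables below can be reproduced from student-level difference-from-norm scores. The sketch below is a minimal illustration under two stated assumptions (the report itself does not restate its formulas here): the effect size is a standardised mean change, (Term 4 mean - Term 1 mean) divided by the pooled SD of the two terms, and the two-tailed p-value of a one-sample t-test against the norm is approximated with a normal distribution, which is reasonable at these sample sizes.

```python
import math

def sig_stars(mean, sd, n):
    """Map a group's mean difference from the norm to the report's
    significance codes, via a one-sample t statistic (assumed rule)."""
    t = mean / (sd / math.sqrt(n))
    # Normal approximation to the two-tailed p-value (large-n shortcut).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))
    for cutoff, code in [(0.001, "***"), (0.01, "**"), (0.05, "*"), (0.1, ".")]:
        if p < cutoff:
            return code
    return ""

def effect_size(m1, sd1, m2, sd2):
    """Standardised mean change: (Term 4 - Term 1) / pooled SD (assumed)."""
    pooled = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / pooled

# Primary schools, PAT reading, Table 81: M = -10.9 (SD 12.5) at Term 1,
# M = -11.1 (SD 11.3) at Term 4, n = 892.
print(sig_stars(-10.9, 12.5, 892))                         # "***"
print(round(effect_size(-10.9, 12.5, -11.1, 11.3), 2))     # -0.02
```

With the Table 81 Primary row this reproduces the tabled values (*** and -0.02), which supports the assumed definitions, though small rounding differences are possible elsewhere.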

By gender

Table 84
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Gender

                   Term 1, 2015           Term 4, 2015
Gender     N       M      SD     Sig.     M      SD     Sig.    Effect Size
Female     638     -7.4   10.8   ***      -7.4   12.2   ***     0.00
Male       659     -8.3   10.8   ***      -8.1   11.7   ***     0.02

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Gender - Whole Cluster"; y-axis: Difference from Norm Score]
Figure 74. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 85
Differences from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

                      Term 1, 2015           Term 4, 2015
Ethnicity     N       M      SD     Sig.     M      SD     Sig.    Effect Size
Māori         312     -8.8   9.8    ***      -8.7   10.6   ***     0.01
Pasifika      762     -9.8   9.5    ***      -10.2  10.3   ***     -0.04
NZ European   93      2.1    12.2            4.0    13.7   **      0.15
Other         130     -1.4   12.5            0.7    14.1           0.16

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Ethnicity - Whole Cluster"; y-axis: Difference from Norm Score]
Figure 75. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 86
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by Year Level

                     Term 1, 2015           Term 4, 2015
Year Level   N       M      SD     Sig.     M      SD     Sig.    Effect Size
4            211     -6.7   11.1   ***      -4.8   14.1   ***     0.15
5            254     -8.1   11.2   ***      -7.3   12.2   ***     0.07
6            207     -7.9   10.6   ***      -6.2   11.6   ***     0.15
7            228     -7.0   11.4   ***      -7.7   11.8   ***     -0.06
8            215     -7.9   11.4   ***      -8.8   10.8   ***     -0.08
9            98      -8.2   8.1    ***      -11.2  9.0    ***     -0.35
10           84      -11.3  7.4    ***      -13.5  9.7    ***     -0.25

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Year Level - Whole Cluster"; y-axis: Difference from Norm Score; series: Year 4 to Year 10, Norm]
Figure 76. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by year level.

[Figure: line chart "Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - Whole Cluster"; y-axis: Scaled PAT Mathematics Score; x-axis: Feb and Nov for Year 4 to Year 10; series: Manaiakalani Cluster, Norm]
Figure 77. Scaled PAT mathematics scores by year levels (whole cluster) across Term 1, 2015 and Term 4, 2015.

By school

Table 87
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School

                 Term 1, 2015           Term 4, 2015
School   N       M      SD     Sig.     M      SD     Sig.    Effect Size
1        93      -8.1   11.6   ***      -7.8   10.6   ***     0.02
2        320     -9.7   10.0   ***      -8.8   10.7   ***     0.09
3        133     -9.3   8.9    ***      -9.7   10.1   ***     -0.04
4        98      -9.2   8.9    ***      -10.8  10.8   ***     -0.17
5        129     -7.2   10.7   ***      -8.1   11.8   ***     -0.08
6        61      -6.1   10.6   ***      -5.6   10.7   ***     0.05
7        62      -10.7  9.6    ***      -7.9   8.9    ***     0.30
8        182     -9.6   7.9    ***      -12.3  9.4    ***     -0.30
9        140     3.0    11.7   **       6.6    13.0   ***     0.29
10       79      -11.2  11.3   ***      -11.8  11.3   ***     -0.06

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by School - Whole Cluster"; y-axis: Difference from Norm Score; series: School 1 to School 10, Norm]
Figure 78. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school.

By school cohort

Table 88
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Cohort

                         Term 1, 2015           Term 4, 2015
School Cohort [a]  N     M      SD     Sig.     M      SD     Sig.    Effect Size
1                  742   -9.2   9.8    ***      -9.1   10.7   ***     0.02
2                  336   -8.6   9.6    ***      -9.8   10.3   ***     -0.13
3                  219   -2.1   13.4   *        0.0    15.2           0.14

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.
[a] See Table 1.

[Figure: bar chart "PAT Mathematics by School Cohort - Whole Cluster"; y-axis: Difference from Norm Score]
Figure 79. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school cohort.

By school type

Table 89
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Type

                      Term 1, 2015           Term 4, 2015
School Type   N       M      SD     Sig.     M      SD     Sig.    Effect Size
Primary       1115    -7.6   11.2   ***      -7.0   12.2   ***     0.05
Secondary     182     -9.6   7.9    ***      -12.3  9.4    ***     -0.30

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by School Type - Whole Cluster"; y-axis: Difference from Norm Score]
Figure 80. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 90
Difference from Norm PAT Mathematics Score (Whole Cluster) Across Term 1, 2015 to Term 4, 2015 by School Religion

                       Term 1, 2015           Term 4, 2015
School Religion  N     M      SD     Sig.     M      SD     Sig.    Effect Size
Catholic         123   -8.4   10.3   ***      -6.8   9.8    ***     0.16
Non-Catholic     1174  -7.8   10.8   ***      -7.8   12.2   ***     0.00

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by School Religion - Whole Cluster"; y-axis: Difference from Norm Score]
Figure 81. Difference from norm PAT mathematics score (whole cluster) across Term 1, 2015 to Term 4, 2015 by school religion.

School cohort 1 and 2

In this section, we consider only the students (n = 1183) who have PAT mathematics scores at both Term 1 and Term 4, 2015. A summary of descriptive statistics for each school and student characteristic is presented in Table 91 - Table 95. Figure 82 - Figure 85 show the difference from the mathematics norm score across Term 1, 2015 to Term 4, 2015 by student characteristics (gender, ethnicity, and year level respectively). Figure 86 - Figure 87 show the difference from the mathematics norm score across Term 1, 2015 to Term 4, 2015 by school characteristics (school type and school religion respectively).

By gender

Table 91
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Gender

                   Term 1, 2015           Term 4, 2015
Gender     N       M      SD     Sig.     M      SD     Sig.    Effect Size
Female     516     -8.8   9.3    ***      -9.5   10.5   ***     -0.07
Male       562     -9.2   10.2   ***      -9.1   10.6   ***     0.01

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Gender - School Cohort 1 and 2"; y-axis: Difference from Norm Score]
Figure 82. Difference from norm PAT mathematics score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by gender.

By ethnicity

Table 92
Differences from Norm PAT Mathematics Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Ethnicity

                      Term 1, 2015           Term 4, 2015
Ethnicity     N       M      SD     Sig.     M      SD     Sig.    Effect Size
Māori         286     -9.2   9.8    ***      -9.1   10.4   ***     0.01
Pasifika      689     -9.7   9.3    ***      -10.2  10.1   ***     -0.06
NZ European   30      -3.7   12.7            -3.6   15.6           0.01
Other         73      -4.2   11.1   **       -3.6   10.6   **      0.06

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Ethnicity - School Cohort 1 and 2"; y-axis: Difference from Norm Score]
Figure 83. Difference from norm PAT mathematics score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by ethnicity.

By year level

Table 93
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by Year Level

                     Term 1, 2015           Term 4, 2015
Year Level   N       M      SD     Sig.     M      SD     Sig.    Effect Size
4            142     -8.8   9.7    ***      -8.2   11.9   ***     0.05
5            205     -9.7   10.0   ***      -9.1   11.0   ***     0.05
6            183     -8.4   9.6    ***      -6.8   10.5   ***     0.15
7            187     -8.0   11.2   ***      -8.7   10.9   ***     -0.06
8            179     -9.5   9.7    ***      -10.5  8.7    ***     -0.11
9            98      -8.2   8.1    ***      -11.2  9.0    ***     -0.35
10           84      -11.3  7.4    ***      -13.5  9.7    ***     -0.25

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by Year Level - School Cohort 1 and 2"; y-axis: Difference from Norm Score; series: Year 4 to Year 10, Norm]
Figure 84. Difference from norm PAT mathematics score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by year level.

[Figure: line chart "Normative Comparison by Year Level across Term 1 to Term 4 in 2015 - School Cohort 1 and 2"; y-axis: Scaled PAT Mathematics Score; x-axis: Feb and Nov for Year 4 to Year 10; series: Manaiakalani Cluster, Norm]
Figure 85. Scaled PAT mathematics scores by year levels (school cohort 1 and 2) across Term 1, 2015 and Term 4, 2015.

By school type

Table 94
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Type

                      Term 1, 2015           Term 4, 2015
School Type   N       M      SD     Sig.     M      SD     Sig.    Effect Size
Primary       896     -8.9   10.1   ***      -8.7   10.7   ***     0.02
Secondary     182     -9.6   7.9    ***      -12.3  9.4    ***     -0.30

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by School Type - School Cohort 1 and 2"; y-axis: Difference from Norm Score]
Figure 86. Difference from norm PAT mathematics score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school type.

By school religion

Table 95
Difference from Norm PAT Mathematics Score (School Cohort 1 and 2) Across Term 1, 2015 to Term 4, 2015 by School Religion

                       Term 1, 2015           Term 4, 2015
School Religion  N     M      SD     Sig.     M      SD     Sig.    Effect Size
Catholic         123   -8.4   10.3   ***      -6.8   9.8    ***     0.16
Non-Catholic     955   -9.1   9.7    ***      -9.6   10.6   ***     -0.05

Significance of p-values: *** < 0.001; ** < 0.01; * < 0.05; . < 0.1.

[Figure: bar chart "PAT Mathematics by School Religion - School Cohort 1 and 2"; y-axis: Difference from Norm Score]
Figure 87. Difference from norm PAT mathematics score (school cohort 1 and 2) across Term 1, 2015 to Term 4, 2015 by school religion.

By transition to secondary school

In this section we analysed PAT mathematics achievement at Term 1 and Term 4, 2015 for students at Tamaki College. We considered only the students (n = 182) who have PAT mathematics scores at both Term 1 and Term 4, 2015. Analyses compared the achievement of students who transitioned to Tamaki College from within the Manaiakalani cluster with that of students who transitioned from outside the cluster. Table 96 shows the number of students at Tamaki College who transitioned from within and outside the Manaiakalani cluster. Figure 88 presents the difference from the mathematics norm score across Term 1, 2015 to Term 4, 2015 by transition to Tamaki College.

Table 96
Number of Students at Tamaki College who Transitioned from Within and Outside the Manaiakalani Cluster

Year Level   Transition From        N
9            Outside Manaiakalani   24
9            Within Manaiakalani    74
10           Outside Manaiakalani   44
10           Within Manaiakalani    40

Note: All students transitioned after Year 8.

[Figure: bar chart "Difference from Norm PAT Mathematics Score at Term 1 and Term 4 in 2015 by Transition"; y-axis: Difference from Norm Score; series: Year 9 Outside, Year 9 Within, Year 10 Outside, Year 10 Within]
Figure 88. Difference from norm PAT mathematics scores by year levels and transition across Term 1, 2015 to Term 4, 2015 at Tamaki College.

3.3 Writing, Reading and Mathematics – 2012 to 2015

e-asTTle writing

Whole cluster

In this section we collated analyses of e-asTTle writing data across four years (2012 to 2015) for the whole cluster. Table 97 - Table 100 present the number of schools and classrooms in the whole cluster from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 101 - Table 102 show the same analysis for primary school classrooms only.
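The above/at/below counts in the tables that follow require a rule for deciding when a school or classroom mean differs from the national norm. The report does not restate that rule in this section, so the sketch below uses an assumed criterion for illustration only: a two-tailed one-sample t-test (normal approximation) at alpha = 0.05, classifying a mean as "Above" or "Below" only when it differs significantly from the norm, and "At" otherwise.

```python
import math
from collections import Counter

def classify(mean_diff, sd, n, alpha=0.05):
    """Classify a group's mean difference from the norm as 'Above',
    'At', or 'Below'. Assumed rule: significant difference required
    before labelling a group 'Above' or 'Below'."""
    if n < 2 or sd == 0:
        return "At"
    t = abs(mean_diff) / (sd / math.sqrt(n))
    # Normal approximation to the two-tailed t-test p-value.
    p = 2 * (1 - 0.5 * (1 + math.erf(t / math.sqrt(2))))
    if p >= alpha:
        return "At"
    return "Above" if mean_diff > 0 else "Below"

# Hypothetical per-school mean writing differences (mean, SD, n) for one year.
schools = [(-12.4, 11.0, 60), (1.2, 10.5, 55), (8.9, 9.7, 48)]
print(Counter(classify(m, sd, n) for m, sd, n in schools))
```

Tallying the labels over all schools (or classrooms) for a year yields one row of a table such as Table 97.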

Table 97
Number of Schools with Average Writing Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       0     7       7
2013    0       1     7       8
2014    2       1     7       10
2015    2       0     8       10
Total   4       2     29      35

Table 98
Number of Schools with Average Writing Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    2       2     1       5
2013    5       3     0       8
2014    9       1     0       10
2015    7       1     2       10
Total   23      7     3       33

Table 99
Number of Classrooms with Average Writing Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       14    37      51
2014    10      22    33      65
2015    6       15    37      58
Total   16      60    135     211

Table 100
Number of Classrooms with Average Writing Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    4       11    3       18
2013    22      23    5       50
2014    36      20    3       59
2015    20      33    5       58
Total   82      87    16      185

Table 101
Number of Primary School Classrooms with Average Writing Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       13    26      39
2014    10      21    23      54
2015    6       15    27      48
Total   16      58    104     178

Table 102
Number of Primary School Classrooms with Average Writing Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    4       11    3       18
2013    20      15    3       38
2014    33      14    2       49
2015    19      27    2       48
Total   76      67    10      153

School cohort 1 and 2

In this section we collated analyses of e-asTTle writing data across four years (2012 to 2015) for school cohort 1 and 2. Table 103 - Table 106 present the number of schools and classrooms in school cohort 1 and 2 from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 107 - Table 108 show the same analysis for primary school classrooms only.

Table 103
Number of Schools with Average Writing Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       0     7       7
2013    0       1     7       8
2014    1       1     6       8
2015    1       0     7       8
Total   2       2     27      31

Table 104
Number of Schools with Average Writing Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    2       2     1       5
2013    5       3     0       8
2014    7       1     0       8
2015    5       1     2       8
Total   19      7     3       29

Table 105
Number of Classrooms with Average Writing Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       14    37      51
2014    3       16    32      51
2015    1       12    34      47
Total   4       51    131     186

Table 106
Number of Classrooms with Average Writing Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    4       11    3       18
2013    22      23    5       50
2014    29      15    3       47
2015    15      27    5       47
Total   70      76    16      162

Table 107
Number of Primary School Classrooms with Average Writing Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       13    26      39
2014    3       15    22      40
2015    1       12    24      37
Total   4       49    100     153

Table 108
Number of Primary School Classrooms with Average Writing Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    4       11    3       18
2013    20      15    3       38
2014    26      9     2       37
2015    14      21    2       37
Total   64      56    10      130

Progressive Achievement Test (PAT) reading

Whole cluster

In this section we collated analyses of Progressive Achievement Test (PAT) reading data across four years (2012 to 2015) for the whole cluster. Table 109 - Table 112 present the number of schools and classrooms in the whole cluster from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 113 - Table 114 show the same analysis for primary school classrooms only.

Table 109
Number of Schools with Average Reading Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       1     7       8
2013    0       1     7       8
2014    1       0     9       10
2015    0       1     9       10
Total   1       3     32      36

Table 110
Number of Schools with Average Reading Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    3       2     3       8
2013    3       3     2       8
2014    1       7     2       10
2015    0       8     2       10
Total   7       20    9       36

Table 111
Number of Classrooms with Average Reading Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       11    37      48
2014    8       8     46      62
2015    1       8     52      61
Total   9       36    163     208

Table 112
Number of Classrooms with Average Reading Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    6       27    3       36
2013    7       32    8       47
2014    6       41    5       52
2015    4       40    15      59
Total   23      140   31      194

Table 113
Number of Primary School Classrooms with Average Reading Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       9     30      39
2014    6       8     37      51
2015    1       7     42      50
Total   7       33    137     177

Table 114
Number of Primary School Classrooms with Average Reading Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    6       27    3       36
2013    4       26    8       38
2014    3       34    4       41
2015    4       40    4       48
Total   17      127   19      163

School cohort 1 and 2

In this section we collated analyses of PAT reading data across four years (2012 to 2015) for school cohort 1 and 2. Table 115 - Table 118 present the number of schools and classrooms in school cohort 1 and 2 from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 119 - Table 120 show the same analysis for primary school classrooms only.

Table 115
Number of Schools with Average Reading Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       1     7       8
2013    0       1     7       8
2014    0       0     8       8
2015    0       0     8       8
Total   0       2     30      32

Table 116
Number of Schools with Average Reading Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    3       2     3       8
2013    3       3     2       8
2014    1       5     2       8
2015    0       6     2       8
Total   7       16    9       32

Table 117
Number of Classrooms with Average Reading Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       11    37      48
2014    3       7     41      51
2015    0       2     47      49
Total   3       29    153     185

Table 118
Number of Classrooms with Average Reading Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    6       27    3       36
2013    7       32    8       47
2014    6       35    4       45
2015    3       31    14      48
Total   22      125   29      176

Table 119
Number of Primary School Classrooms with Average Reading Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       9     28      37
2013    0       9     30      39
2014    6       8     37      51
2015    1       7     42      50
Total   7       33    137     177

Table 120
Number of Primary School Classrooms with Average Reading Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    6       27    3       36
2013    4       26    8       38
2014    3       28    3       34
2015    3       31    3       37
Total   16      112   17      145

Progressive Achievement Test (PAT) mathematics

Whole cluster

In this section we collated analyses of Progressive Achievement Test (PAT) mathematics data across four years (2012 to 2015) for the whole cluster. Table 121 - Table 124 present the number of schools and classrooms in the whole cluster from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 125 - Table 126 show the same analysis for primary school classrooms only.

Table 121
Number of Schools with Average Mathematics Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       0     8       8
2013    0       0     8       8
2014    1       0     9       10
2015    1       0     9       10
Total   2       0     34      36

Table 122
Number of Schools with Average Mathematics Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    1       6     1       8
2013    4       4     0       8
2014    7       3     0       10
2015    3       5     2       10
Total   15      18    3       36

Table 123
Number of Classrooms with Average Mathematics Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       6     31      37
2013    0       7     35      42
2014    4       9     53      66
2015    4       8     50      62
Total   8       30    169     207

Table 124
Number of Classrooms with Average Mathematics Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    5       25    6       36
2013    15      23    2       40
2014    13      40    2       55
2015    8       41    8       57
Total   41      129   18      188

Table 125
Number of Primary School Classrooms with Average Mathematics Score Above, At, and Below Norm Score (Whole Cluster) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       6     31      37
2013    0       7     33      40
2014    4       8     43      55
2015    4       7     40      51
Total   8       28    147     183

Table 126
Number of Primary School Classrooms with Average Mathematics Gain Above, At, and Below Norm Gain (Whole Cluster) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    5       25    6       36
2013    15      21    2       38
2014    12      32    0       44
2015    8       34    4       46
Total   40      112   12      164

School cohort 1 and 2

In this section we collated analyses of PAT mathematics data across four years (2012 to 2015) for school cohort 1 and 2. Table 127 - Table 130 present the number of schools and classrooms in school cohort 1 and 2 from 2012 to 2015 that achieved above, at or below national levels in terms of their end-of-year achievement and their gain within the same academic year. Table 131 - Table 132 show the same analysis for primary school classrooms only.

Table 127
Number of Schools with Average Mathematics Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       0     8       8
2013    0       0     8       8
2014    0       0     8       8
2015    0       0     8       8
Total   0       0     32      32

Table 128
Number of Schools with Average Mathematics Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    1       6     1       8
2013    4       4     0       8
2014    5       3     0       8
2015    2       4     2       8
Total   12      17    3       32

Table 129
Number of Classrooms with Average Mathematics Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       6     31      37
2013    0       7     35      42
2014    0       5     47      52
2015    0       5     45      50
Total   0       23    158     181

Table 130
Number of Classrooms with Average Mathematics Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    5       25    6       36
2013    15      23    2       40
2014    9       35    2       46
2015    5       34    7       46
Total   34      117   17      168

Table 131
Number of Primary School Classrooms with Average Mathematics Score Above, At, and Below Norm Score (School Cohort 1 and 2) at Term 4 Across 2012 to 2015

Year    Above   At    Below   Total
2012    0       6     31      37
2013    0       7     33      40
2014    4       8     43      55
2015    4       7     40      51
Total   8       28    147     183

Table 132
Number of Primary School Classrooms with Average Mathematics Gain Above, At, and Below Norm Gain (School Cohort 1 and 2) within Academic Year Across 2012 to 2015

Year    Above   At    Below   Total
2012    5       25    6       36
2013    15      21    2       38
2014    8       27    0       35
2015    5       27    3       35
Total   33      100   11      144

3.4 National Certificate of Educational Achievement (NCEA)

NCEA Level 1, 2, and 3

In this section we present the preliminary 2015 NCEA results from Tamaki College. Table 133 - Table 135 show the number of students who achieved Level 1, 2, and 3 subject endorsements respectively. Figure 89 shows roll-based NCEA pass rates at Tamaki College from 2010 to 2015 by qualification level.

Table 133
Number of Students who Achieved Level 1 Subject Endorsements

Subject              Merit   Excellence
English              5       1
Art                  1
History              2
Mathematics          1
Physical Education   9
Science              3
Social Studies       1
Business             1
DVC                  1
Total                24      4

Table 134
Number of Students who Achieved Level 2 Subject Endorsements

Subject              Merit   Excellence
English              6       2
Mathematics          3       3
Physical Education   8       4
History              1       1
Biology              2
Chemistry            1
Media                1
Music                2
Social Studies       1
Total                25      10

Table 135
Number of Students who Achieved Level 3 Subject Endorsements

Subject              Merit   Excellence
English              5       1
Physical Education   2
History              1
Biology                      1
Total                8       2

[Figure: line chart; y-axis: Percentage of Attainment; x-axis: 2010 to 2015; series: Level 1, Level 2, Level 3, UE, Level 1 Literacy, Level 1 Numeracy]
Figure 89. Roll-based NCEA achievement rates at Tamaki College in 2010 to 2015 by achievement level.

3.5 Classroom Observations

Classroom observations Term 4, 2014 to Term 3, 2015

Whole cluster

In this section we present descriptive results of classroom observations from all schools across Term 4, 2014 to Term 3, 2015. Figure 90 shows the percentage of students with access to a device across Term 4, 2014 to Term 3, 2015. Figure 91 - Figure 98 show the percentages of observed blocks by various measures across Term 4, 2014 to Term 3, 2015.

[Figure: histogram "Percent of Students with Device Access - Whole Cluster"; y-axis: Number of Classrooms; x-axis bins: 0, 0-10, 10-20, ..., 90-99, 100; series: Term 4 2014, Term 1 2015, Term 3 2015]
Figure 90. Percentage of students (whole cluster) with access to a device across Term 4, 2014 to Term 3, 2015.

[Figure: bar chart "Number of Students in Teacher Group - Whole Cluster"; y-axis: Percentage of Blocks; categories: Individual, Group, Whole Class; series: Term 4 2014, Term 1 2015, Term 3 2015]
Figure 91. Percentage of observed blocks (whole cluster) by number of students in teacher group across Term 4, 2014 to Term 3, 2015.

[Figure: bar chart "Main Teaching Activity - Whole Cluster"; y-axis: Percentage of Blocks; categories: Lecture/Model, Q&A, Conf/ED, Rove, Instruct, Behaviour; series: Term 4 2014, Term 1 2015, Term 3 2015]
Figure 92. Percentage of observed blocks (whole cluster) by main teaching activity across Term 4, 2014 to Term 3, 2015.

[Figure: bar chart "Type of Feedback - Whole Cluster"; y-axis: Percentage of Blocks; categories: Any, Evaluative, Descriptive, Generative; series: Term 4 2014, Term 1 2015, Term 3 2015]
Figure 93. Percentage of observed blocks (whole cluster) by type of feedback across Term 4, 2014 to Term 3, 2015.

[Figure: bar chart "Instructional Depth - Whole Cluster"; y-axis: Percentage of Blocks; categories: Item, APK, Practice, Strategy, Critical; series: Term 4 2014, Term 1 2015, Term 3 2015]
Figure 94. Percentage of observed blocks (whole cluster) by instructional depth across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by nature of task (Gaming; Constrained Practice; Open Ended Template; Worksheet; Extended Reading (Multiple Texts); Extended Reading (Single Texts); Extended Writing; Creating a DLO; Navigating and Organising; Commenting on Blogs), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 95. Percentage of observed blocks (whole cluster) by nature of task across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by nature of site, for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 96. Percentage of observed blocks (whole cluster) by nature of site across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by task management (Offline; Online and Verbal Prompt; Totally Digitally Managed), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 97. Percentage of observed blocks (whole cluster) by task management across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by patterns of collaboration (My Work, Both Using Computer; My Work, Discussion (FTF); Our Work, Both Using Computer; Our Work, Discussion (FTF)), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 98. Percentage of observed blocks (whole cluster) by patterns of collaboration across Term 4, 2014 to Term 3, 2015.

School cohort 1 and 2

In this section we present descriptive results of classroom observations from cohort 1 and 2 schools across Term 4, 2014 to Term 3, 2015. Figure 99 shows the percentage of students with access to a device across Term 4, 2014 to Term 3, 2015. Figures 100 to 107 show the percentages of observed blocks by various measures over the same period.


[Bar chart: number of classrooms (y-axis) by percent of students with device access, in bands from 0 to 100 (x-axis), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 99. Percentage of students with access to a device (school cohort 1 and 2) across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by number of students in teacher group (Individual, Group, Whole Class), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 100. Percentage of observed blocks (school cohort 1 and 2) by number of students in teacher group across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by main teaching activity (Lecture/Model, Q&A, Conf/ED, Rove, Instruct, Behaviour), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 101. Percentage of observed blocks (school cohort 1 and 2) by main teaching activity across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by type of feedback (Any, Evaluative, Descriptive, Generative), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 102. Percentage of observed blocks (school cohort 1 and 2) by type of feedback across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by instructional depth (Item, APK, Practice, Strategy, Critical), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 103. Percentage of observed blocks (school cohort 1 and 2) by instructional depth across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by nature of task (Gaming; Constrained Practice; Open Ended Template; Worksheet; Extended Reading (Multiple Texts); Extended Reading (Single Texts); Extended Writing; Creating a DLO; Navigating and Organising; Commenting on Blogs), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 104. Percentage of observed blocks (school cohort 1 and 2) by nature of task across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by nature of site, for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 105. Percentage of observed blocks (school cohort 1 and 2) by nature of site across Term 4, 2014 to Term 3, 2015.

[Bar chart: percentage of observed blocks by task management (Offline; Online and Verbal Prompt; Totally Digitally Managed), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 106. Percentage of observed blocks (school cohort 1 and 2) by task management across Term 4, 2014 to Term 3, 2015.


[Bar chart: percentage of observed blocks by patterns of collaboration (My Work, Both Using Computer; My Work, Discussion (FTF); Our Work, Both Using Computer; Our Work, Discussion (FTF)), for Term 4 2014, Term 1 2015 and Term 3 2015.]

Figure 107. Percentage of observed blocks (school cohort 1 and 2) by patterns of collaboration across Term 4, 2014 to Term 3, 2015.

3.6 Case Study Teachers – Reading

Eight teachers were identified as case studies in order to describe the features of effective teachers of reading. Teachers' online planning and students' blog posts were used as primary sources of data, alongside existing classroom observations. The following descriptions are gathered from the reading instruction in these classes over Terms 1 and 3, 2015, and include teachers' learning intentions, assigned tasks and activities, and student blog posts related to their reading activities.

Promoting engagement in reading

Teachers tended to plan for instruction and engagement in reading in two ways: through student-selected texts or through teacher-selected texts. When teachers promoted student text selection, this tended to be for independent reading for pleasure via library visits and online e-book reading; keeping abreast of current events, such as online newspaper reading; and reading for the purposes of independent inquiry or research. Teacher text selection was employed for guided literacy or literary instruction using print texts such as School Journals, the Connected series, novels, and websites.


Teachers employed two distinct strategies for achieving reading mileage. A small number of the case study teachers provided time for student-selected reading and associated activities alongside the teacher-selected instructional texts with complementary online text links. The balance of teachers tended to emphasise purposefully selected activities over a fortnight or more, associated with a principal teacher-selected text supported by complementary links to one or more online texts. Some of these timetables included planned sustained silent reading (SSR) time, but student-selected reading with an associated purpose or challenge was backgrounded, whilst deeper engagement with a teacher-selected text was foregrounded. This strategy resulted in less overall breadth of reading, but greater depth of understanding.

Two practices from these approaches are worth highlighting because of their impact on reading mileage and motivation. The first was the promotion of independent reading through a 'stretch' approach. This was evident in tasks such as the Independent Reading Challenge, in which students were required to negotiate 'challenge' activities over a school term, related to texts of their choice from within a range. Each challenge had a number of activity choices, based on Bloom's thinking levels, shared within a collaborating group, and incorporating scheduled 'report-ins' with the teacher. Students managed their own reading time, selected challenge tasks, and achieved reading project goals by specific time points. Students who chose the same challenge activity published to a shared document, so that they collaborated both within their reading group and across groups at the same time. In this way students could work within ability groups for support, but had the opportunity to be influenced by the various offerings and feedback of students with varying abilities.
Extensive independent reading was achieved through coverage of a variety of reading types, including current affairs (news media), print fiction, and online fiction. In addition to monitoring their own personal reading goals, students' awareness of their reading strengths and patterns was fostered through reflection on their developing reading preferences and histories. For example, one class created a digital timeline of their favourite books from when they first became readers to the present day. The teacher shared her most influential titles and the ways her personal reading tastes had changed as she became a more mature reader over time.

The second practice was to promote additional reading for deeper understanding. Most often, teachers purposefully paired an instructional text with an online text to problematise a key idea. For example, one teacher selected a School Journal story that featured sharks, paired with an online petition to have sharks culled in a problem swimming area in South Australia. The online text provided a non-fiction comparison with the narrative, further exposure to diverse vocabulary from a related but different context, and opportunities to contrast the representation of sharks from another perspective. Further opportunities to juxtapose texts for critical purposes could take this practice beyond meaning-making or interest to investigating alternative beliefs or positions within texts.

Teacher-selected texts for guided reading were often informational texts. Prominent were School Journal or Connected articles covering topics such as healthy lunches, global warming, the Manawatu Gorge road repair, native bird or forest conservation, the 'Power of Rubbish' and whitebaiting. Online texts included petitions, websites, Auckland Council reports, virtual museums (video game history), and Kids Science ("How to Make a Bottle Rocket"). Class and individual blogs were valued texts in promoting digital citizenship, reading mileage and 'making connections across a wide range of texts'. Novel study was generally reserved for higher ability readers and included print texts such as "Holes" by Louis Sachar, "The Twits" by Roald Dahl, "Harry Potter" by J. K. Rowling and "Counting by 7s" by Holly Goldberg Sloan.

Instruction and support for understanding

Teachers' learning intention statements over two terms indicated two main approaches for developing reading comprehension:

• Strategies and skills;
• Vocabulary.

Strategy building was far more common than word work. In terms of strategy building, the case study teachers used both student-led and teacher-led approaches. The student-led approach was to support students' awareness of strategies through their independent learning plans or goals for future learning, which were based on assessment data and curriculum objectives. The teacher-led approach was through practice on skills embedded in differentiated reading groups.

Student-led goals for future learning

Teachers employing this approach planned for specific consultation with students at least once a term about their goals, set as a result of their individual assessment (e-asTTle; PAT) or curriculum achievement objectives (Progressions), at times additional to the guided reading instruction. Students received electronic copies of assessment results and progression or objective descriptors to set their improvement goals. Indicators that were clear to students and individualised meant that each learner could work towards building their own comprehension capability. For example, a single document that described the student's close reading, language, critical thinking and information processing objectives provided an overall and explicit approach to helping students work towards a repertoire of strategies. It was not always clear whether there was synergy between individual students' goal setting and the guided reading sessions tailored to these goals. However, those case study teachers who planned for their students to create learning goals also designed learning that incorporated general maintenance as well as focussed guidance. Table 136 gives an example of a reading tumble programme that incorporated weekly independent student sessions to practise upkeep of skills.

Table 136
Example of a Reading Tumble Programme Incorporating Independent Student Maintenance Activities for Upkeep of Skills

Activity | Activity Instruction
Chunky Challenge; Hunt a Chunk | WALT: Decode words by looking for chunks we already know. For example: "Use the ABCya Word Cloud generator to display chunk words in an interesting way"
Scavenger Hunt | "Use the text you have been reading to find the following: word parts; sentence types; structural features; visualisations of words."
Setting Study/Character Inference | "Choose a story, article, or text from your Journal book or a chapter from your SSR book to get inside a character's head."
Summarising Main Ideas | "Find and summarise the information of a current event."
Fantastic Facts | WALT: Locate information and record it in my own words. For example: topic; questions; keywords; what I found out; website links.

These sorts of maintenance practices were often linked to student-selected reading activities, such as current events or independent reading, and were supplementary to the weekly guided sessions with the teacher. Teachers created slide presentations to guide practice in these skills. Importantly, some of the teachers achieving the highest student gains incorporated maintenance practices across a range of student-selected text types. Weekly practice activities included summarising ideas, character inference, setting analyses and vocabulary meanings. Although students created personal reading goals, we found little evidence of student reflection, tracking or evaluation of outcomes against these goals. An exception was a teacher who maintained online modelling books in which groups recorded reflections related to reading activities, conversations and thinking, including sharing their discussion of the reading progressions.


Teacher-led goals and practice

The most prevalent type of instruction was teachers' selection of a single strategy or skill, for guidance and practice, over a period of one to three weeks, employing a teacher-selected focus text (usually a School Journal article). Activities were tailored by teachers to the development of the particular skill(s) described in learning intention statements. Table 137 shows examples of this teacher-led approach.

Table 137
Learning Intention and Focus Text Duration by Sample Teacher and Year Level

Teacher | Year | Weeks | Learning Intention
A | 6 | 2 | Present information in different ways and evaluate ideas to see both sides of an argument (Text 1)
A | 6 | 2 | Make connections across a wide range of texts
A | 6 | 2 | Present information in different ways (Text 2)
A | 6 | 2 | Find information from a range of sources
A | 6 | 1 | Present information in different ways (Text 3)
B | 6 | 1 | Know that other people might take a different judgement from the same information (Text 1)
B | 6 | 1 | Know that other people might take a different judgement from the same information (Text 2)
B | 6 | 1 | Make connections across a wide range of texts
B | 6 | 1 | Look beyond the text
B | 6 | 1 | Look beyond the text
B | 6 | 1 | Follow instructions, and complete tasks to a high standard
B | 6 | 1 | Identify the main points in a text
C | 7/8 | 3 | We are learning to identify the keywords and strategies to help me learn from the text (Text 1)
C | 7/8 | 2 | We are learning to identify the keywords and inference skills to gain information from the text (Text 2)
C | 7/8 | 1 | We are learning to identify the keywords and inference skills to gain information from the text (Text 3)
C | 7/8 | 2 | Connecting: I find the answers in different places in the text and join the information together to make connections (Text 1)
C | 7/8 | 1 | Connecting: I find the answers in different places in the text and join the information together to make connections (Text 2)


In this approach, students could be limited to reading a relatively small number of texts over the course of a term's programme if independent reading and other supplementary maintenance activities were not included in the planning. Most learning intentions analysed were focussed on meaning making. Less evident were reading intentions requiring evaluation of an author's style or its impact on the reader, for example: "I think critically about the author's style, perspective and language choices and how that is trying to influence me or how that might impact choices I make both as a reader (i.e., to continue reading; agree/disagree), and as a writer (to incorporate ideas or language)".

Vocabulary development

Teachers' approaches to word learning were largely traditional. Word work focussed most commonly on identifying the meanings of new words in text, mostly through dictionary definitions, for example, finding "three new words you have come across" and "look up the meanings." More in-depth approaches were less evident, but included multiple exposures to unfamiliar words in different contexts, where students were required to "Write the meaning, find a picture, write the word in a sentence and find synonyms" (or "Write a sentence for vocab in focus e.g., salary, employer, and post a picture").


Figure 108. Year 6 vocabulary task posted to class blog incorporating visual connections to word meaning in student's own words.

Although some teachers employed collaborative tools such as Padlet or other word cloud apps, students often simply pasted definitions directly from online dictionaries. This meant that students were not able to give meaningful examples, because the dictionary definitions were either too sophisticated or too abstract to link to the text or to students' prior knowledge. One teacher included prefix/suffix analyses as a practice skill within students' independent reading plans, and another identified technical keywords for students to 'look up'. There were only two instances of digital innovation for word learning. The first was the use of digital word walls as glossaries to revisit new vocabulary over time. The second was the inclusion of multimedia resources such as audio links for word pronunciation. The promotion of word consciousness, or an appreciation and awareness of words, was not apparent in the artefacts students created during vocabulary work, although some teachers' instructions did encourage students to notice words, such as identifying words that are "interesting".

Creativity

Most opportunities for innovation or creativity in reading were made through:

1. Making links between reading and writing;
2. Creation of digital learning objects (DLO).

Making reading/writing connections was a strongly embedded practice across case study teachers. In all the teachers' classes, students were engaged in building on the ideas, language and information they had read about and repurposing these into written form. Activities that demonstrated innovative approaches included:

• Creating advertisements;
• Using diagrams to summarise content;
• Reading to write advice columns;
• Amazon reviews;
• Interpreting literary characters as onscreen versions in rewriting a story line with celebrity actors.

Students also became authors, writing arguments for why readers should read their books, or describing what they thought an author wanted them to learn from a variety of texts, both literary and non-fiction. The reading, and subsequent authoring, of poetry was scarce, particularly in developing interpretations from figurative language and in discerning the difference between literal and abstract language use. The incorporation of DLO and multimedia authorship within reading instruction was also less obvious. Production of learning objects was limited to only a couple of teachers, with a focus on reimagining reading content in motivating ways such as:

• Creating a flip-book to "retell the story in your own words";
• A multimedia representation of personal book reading history;
• Redesigning content as flow charts or diagrams;
• Slide show presentations of content.


One class composed narratives based on traditional Victorian forms accessed online, and re-presented these compositions as clay animation. Another teacher encouraged students to "complete the task using other interesting Web2 tools that you are confident with e.g., GoAnimate". In a very small number of cases students recorded themselves reading, or produced humorous enactments of language or ideas, such as demonstrating a "skunk dance" to accompany their comprehension activity on skunk behaviour, or acting out a "visual dictionary" entry for what it means to "peck". There was no evidence of learning objects that built knowledge about what a skilled reader does, supported metacognitive conversations about reading, or celebrated reading. Unlike writing practices in the cluster, there was also no obvious use of digital tools to annotate texts, or of learning objects or multimedia to capture language, ideas or strategic approaches to reading.

Offering in-task support

Within-task direction and guidance for reading-related support was comprehensive and tended to take two forms:

1. Task framing;
2. Collaboration.

The framing of activities was typically directed at post-reading, rather than before or during reading, to support comprehension and higher order thinking. A number of teachers used De Bono or Bloom's frameworks to guide understanding through staged progressions from literal to inferential analyses, culminating in evaluation. Templates of this nature were usually in slide presentation form, with each slide guiding comprehension analyses or interpretation at each thinking level. Two case study teachers framed students' encounters with unfamiliar text as Before, During and After guidance in the form of open questioning and personal reflection. Table 138 provides examples of these.


Table 138
Template Resource with Prompted Guidance to Support a 'Holistic' Approach to Unfamiliar Text Before, During and After Reading

Reading Sequence | Teacher Prompt
Before | How does the title set the scene for a persuasive text? What type of language might be used in these texts? How do the photographs support the title?
During | How is the author trying to convince the reader of the author's point of view? Which words appeal to the emotions? Is this just one person's opinion?
After | What factual information is used in the text to support the author's point of view? Where does the author use 'repetition' to persuade the reader? (Quotes, rhetorical questions, statistics, exaggeration?) What is your opinion of this issue? What are the main points of a persuasive text? What solutions are offered to the problem?

In general, student response to text was highly supported by teachers, either through templates or questioning. Less evident was the scaffolding 'out' of teacher support for responding to text. There is opportunity to move students from reliance on teacher-developed templates to more student-developed creations. In-task support through student collaboration was strongly represented. Students were often asked to discuss understandings or text features face-to-face with peers, and in shared documents. Shared media included:

• Tables to record the gathering of evidence from the text;
• KWL charts (what I know, what I want to know and what I now know);
• Documents to summarise common thinking synthesised from individual ideas (e.g., the significance of conserving forests, clean waterways, needing people);
• Student-created quizzes (Google Forms);
• Electronic group modelling books.


All of these approaches built the potential for comparing perspectives and language use, communicating ideas and clarifying thinking. Similarly, but less frequently, students collaborated by giving feedback, guided by rubric progression criteria, though this was almost always limited to the reading of student blog posts. There appears to be further potential for expanding peer support using digital tools such as video or podcast, with students acting as reviewers of text, explaining understandings and interpretations, or reading aloud. There is also potential to use these tools to develop knowledge as a reader, for example about how to identify misunderstandings, ask questions, or summarise.

Enhancing tutorial properties

Most tutorial interactions occurred in face-to-face settings. Digital tools to support these conversations included online electronic modelling books. These shared documents were used as a student and teacher thinking space, to compare ideas or give peer feedback during the reading group session. Digital modelling books captured various perspectives and also required that students justify and evaluate reasoning so that they were accountable to the text. For example, contributors recorded their discussion about "What our critical literacy tells us about these advert[isement]s and these products?", "Which was the most effective and why?" or "Why are we looking at infomercials?" As each student was required to first "Bring what you know", to contemplate their own or others' thinking, the electronic space afforded the capture of all voices consecutively in a way that whole-class or group discussion cannot. The shared document would also have future use for 'rewindability' and review purposes.

Criticality and complexity

Teachers tended to integrate higher order thinking by way of:

• Evaluation;
• Justification;
• Agency.

Using a "levels of thinking" approach to task design (e.g., De Bono/Bloom's Taxonomy/PAT), reading-related activities consistently incorporated evaluation of the author's point of view, or the significance of the text's message. Examples of such activities were:

• "Explain what you think is best and why";
• "Review whether the product is good and bad and why"; or
• "What changes to the outcome would you recommend and why".


Teachers' text choices were often non-fiction, and the topics were well aligned with topical or contentious issues, community problem solving and the need for consideration. A typical example was the juxtaposition of a print article with an online text, where students analysed the text information and then adopted positions as "agents of change" by producing a poster for local residents. Further potential for deeper reasoning could come from analysing the content for social and scientific positions. Synthesising information and ideas across texts for the purposes of comparing and evaluating opposing points of view is an opportunity for further developing higher order thinking. Considerable emphasis was also placed on finding evidence from a text and justifying ideas or viewpoints. Case study teachers engaged students in reasoned support practices with varying degrees of complexity. Table 139 illustrates examples of case study teachers' task directives.

Table 139
Case Study Teacher Task Directives Illustrating Varying Degrees of Complexity

• "In the book they proved this on page ___ when they…"
• "Record what you think the main idea is and justify this"
• "Why do you think Sue cares/doesn't care about penguins?"
• "What do you think may be reasons for why Kakapo are not seen more widely?"
• "Why do we need window washers?"
• "Would you like to help orphan skunks? Why or why not?"
• "Rate this story from 1-10 and give reasons why"
• "What do you think the author wanted you to learn from this article?"
• "Write a persuasive argument, justifying why others should or should not read this book."

A particularly sophisticated reading analysis was achieved by one teacher who asked students to compare, analyse and contrast texts over time, by comparing their own reading with classic or popular texts from previous eras (Animal Farm; '70s song lyrics). Students were invited to make connections from their personal experiences of text features ("our favourite novels…how they created tensions in their problems") and then evaluate those same techniques in the teacher-selected texts ("how these authors have created fear…"). Another uncommon example, because of the use of historical literature, was where students were introduced to English Victorian forms ("The Highwayman"; "Charge of the Light Brigade"), which they analysed and then used the patterns of rhythm, rhyme and structure in their own text creation.

Child histories

Teachers' planning reflected connections to children's funds of knowledge by inviting students to include experiences beyond school, both cultural and individual. Reading provision that reflected valuing students' identities leveraged the digital affordance by including:

• Texts situated in the Pacific Islands, such as Tongan or Samoan newspapers;
• Tasks linking the reading context to students' past experiences by reflecting and blog posting about personal connections;
• Perspective taking after being informed by an online text: "Would you try to build a penguin shelter? Write your thoughts…";
• Making links to popular culture, such as: "Who would you choose to play the characters if the book was a movie? Create a Google Presentation with images and support your choices…"

Also embedded in comparing and analysing texts were connections to students’ personal reading, for example, “Look beyond the Text” or “Use critical thinking skills”, to relate texts to previous reading or experiences.

3.7 Parent/Whānau Case Studies

The following analysis draws on a developing hypothesis of fanau engagement that examines three types of family engagement: fanau (family) learning; digital learning; and other sites of learning. It is based on eight parent/caregiver case studies (Meredith, 2015).

Engagement type one: Fanau learning

Learning at home has two approaches. In the first, schools provide information and ideas to families about how to support their children's learning out of school (Epstein, 1996). In the second, parents/caregivers add value by being involved in their children's learning experiences through family activities (e.g., discussion, learning together) to enhance fanau learning. The range of practices of the families we interviewed demonstrates that parents/caregivers do take an interest in, and are supportive of, their children's learning (i.e., academic socialisation) through discussion: "Every day I like to ask my children how school was. Tell me one thing they learnt at school… I get them to tell me about that".

There were also opportunities to learn together: "I think I learned more from my children than my children from me. I've learnt to shop online. My daughter taught me how to do that". The digital learning environment enabled these families to share knowledge and skills, changing the nature of interaction within family learning at home. Figure 109 illustrates fanau learning, where parents provide support for their children's learning as well as their own learning at home.

[Diagram: a single circle labelled "Fanau Learning".]

Figure 109. Fanau learning: learning out-of-school, family learning.

Engagement type two: Engagement with digital learning

Learning in a digital learning environment, at home and in the wider community, requires connectivity (i.e., access to the internet and devices). Parents' engagement with the digital learning environment allows them to see the benefits of this engagement in a broader context, beyond school, as this parent states: "…it's the future the technology we are talking about… To progress and understand the requirements of our society these days we actually need to be able to educate ourselves and to know how to cope with the evolution of technology." Social media was the most common tool that families found beneficial for staying connected with others: "I think that Facebook does keep me informed on all areas whether it's personal or social or education you can like so many pages and following different types of pages…it keeps me informed on what's happening in the world." Parents/caregivers found that engaging digitally beyond school was useful for their children, as this mother found: "With my kids, it's quite good as it helps them with their assignments and their research projects to find things." Digital enablement changes the nature of how and why families access learning and how they interact with each other. Figure 110 illustrates digital learning practices embedded in fanau learning.

[Diagram: a circle labelled "Fanau Learning" nested inside a larger circle labelled "Digital Learning".]

Figure 110. Digital learning: access to internet and devices.

Engagement type three: Other sites of learning

Parents'/caregivers' engagement with their children's learning is not restricted to digital engagement. Other 'sites of learning' are off-line learning experiences in the wider community, where families engage in activities, events and cultural experiences in the "real world" (i.e., cultural modes). These activities tell us more about families' lives outside the home: how families are connected, and how their experiences influence their children's learning and values. The practices of the families we interviewed centred on the church and community groups they belonged to. One mother, immersed in church life, states: "My spiritual life is also part of my life. I am a preschool Sunday school teacher. I believe in role modelling… I take my children to church and I teach my children." Another mother and her family were involved in a community organisation for bikers: "We get involved in bike rides for charity… raising money for stem cell treatment for this boy who needs treatment." This mother spoke about her community involvement: "It's only the Samoan community that I am involved in… I'm talking about the Auckland Samoan community that's where we are". These multiple sites of learning in the wider community add value and meaning to family life. Families in this community value belonging to an organisation or establishment, which influences their children's worldviews, cultural models, aspirations and sense of citizenship. Figure 111 illustrates the relationship of other sites of learning to fanau learning.

Figure 111. Other sites of learning: learning experiences in the community.


4. Summary of Results and Discussion

4.1 Student Achievement Data

e-asTTle writing

Despite achievement levels generally remaining below national levels, writing remains the area where students made most progress during 2015. Of the 10 schools within Manaiakalani that assessed student writing using the e-asTTle writing tool in 2015, two had average student achievement above national normative comparisons at the end of the year, while eight had average levels that remained below normative comparisons. Within the school year, however, seven of the 10 schools had average progress rates above national comparative rates of progress, one had progress rates in line with averages, and two schools' progress rates were below average. At the classroom level, six of the 58 classes had average achievement above national comparisons, 15 had average achievement at national levels and 37 had achievement levels on average below the national comparison. In terms of average classroom progress in writing within a year, 20 classes made accelerated progress, 33 made expected progress, and five made progress at rates below average. In school cohorts 1 and 2, compared with 2014, slightly fewer schools had average achievement that was accelerated. Of the eight schools whose achievement levels have been tracked since 2012, one school has average levels above normative comparisons, while the rest were below. Progress was accelerated, on average, for five of the eight schools and in line with normative comparisons for one. In two schools rates of progress dipped below national rates. This indicates some decline in progress compared with 2014, when seven out of eight schools made accelerated progress in writing. At the classroom level, 34 of the 47 classes were below national comparisons on average, while 13 of the classes were at or above expected levels.
Fifteen classes made accelerated progress, 27 made average progress, and five classes' average progress was less than the normative comparison; two of these five were in primary schools. There was a difference between the genders in writing achievement, but not in rates of progress. At the end of 2015, across the whole cluster, girls continued to outperform boys in writing on average. Overall, girls scored more than 50 e-asTTle points (approximately a year's growth) higher than boys. The difference is evident both in school cohorts 1 and 2 and when newer schools are included in the analyses. In terms of progress, both girls and boys made accelerated progress compared with national averages, and at similar rates. Across the cluster as a whole, NZ European students outperformed other ethnicities; however, this effect is not present in the schools that have been part of Manaiakalani since 2012, which likely reflects different demographic patterns across the Manaiakalani schools. In school cohorts 1 and 2, Pasifika and Māori students both had effect size gains of 0.19 (above expected), students categorised 'Other' had effect size gains of 0.32, while the small number of NZ European students made overall expected gain. Emerging differences between year levels are a notable pattern in writing in 2015. While all other year levels perform approximately one year below average in school cohorts 1 and 2, the gap for Year 9 and 10 student achievement is greater. Moreover, whereas all other year levels made accelerated progress within the year on average, rates of progress in these upper year levels were negative (students lost ground within the year). In the writing test in Term 4, 2015, Year 9 and 10 students performed on average at levels that merit concern. It will need to be a matter of cluster investigation whether these test scores represent concerning patterns of learning or whether an assessment issue underpins these results. Transition analyses suggest that students who enter Year 9 from Manaiakalani schools score higher in writing, on average, than those who enter from other schools; however, there is great variability in the scores of both groups, so the differences are non-significant.
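The effect size gains quoted above (e.g., 0.19 for Pasifika and Māori students) express mean gain beyond the expected gain in standard deviation units. The general calculation can be sketched as below; the function name, the choice of standard deviation and the sample scores are all illustrative assumptions, and the report's exact method may differ.

```python
import statistics

def effect_size_gain(start_scores, end_scores, expected_gain):
    """Mean gain beyond the expected (normative) gain, divided by the
    standard deviation of the starting scores (a Cohen's-d-style index).
    Simplified sketch only; the report's exact method may differ."""
    gains = [end - start for start, end in zip(start_scores, end_scores)]
    extra_gain = statistics.mean(gains) - expected_gain
    return extra_gain / statistics.stdev(start_scores)

# Illustrative e-asTTle-like scores: each student gains 62 points where
# roughly 50 points (about a year's growth) was expected.
start = [1380.0, 1420.0, 1460.0, 1500.0]
end = [1442.0, 1482.0, 1522.0, 1562.0]
print(round(effect_size_gain(start, end, expected_gain=50.0), 2))  # → 0.23
```

A positive value indicates progress above the normative expectation; zero indicates expected progress.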

Progressive Achievement Test (PAT) reading

Progress in reading was generally at national levels, but achievement levels in reading remained below national norms. Average progress during the 2015 school year in reading was, in general, in line with expected growth over 12 months. At the end of 2015, nine of the 10 cluster schools scored below national expectations in reading, while one scored at expectations. End-of-year expected achievement, using PAT benchmarks, is deemed equivalent to beginning-of-year levels for the subsequent year (e.g., the end of Year 4 mean is estimated to be the beginning of Year 5 mean). Thus, to achieve at rates commensurate with the norms, students need to make 12 months' average progress within the school year. In eight schools students made this expected progress on average in reading, and in two schools students made lower than expected progress on average. Similarly, most classes across the cluster (40 of 59) made progress in line with normative comparisons between the beginning and end of the year. When primary school classes are considered separately, 44 of the 48 classes had average gains at or above expectations. Similar rates of progress were found within school cohorts 1 and 2. Of these schools, all had average achievement levels below normative expectations in reading at the end of the year. Six of those eight schools had progress rates in line with expectations (i.e., meeting the beginning-of-year mean for the subsequent year by year's end), while two had progress rates below expectations. Of the 37 primary school classes, 34 had rates of progress at or above expected levels. The rates of progress shown in PAT are likely also influenced by the normative expectations, because end-of-year achievement using PAT is deemed equivalent to the next year's beginning-of-year achievement; thus, to make expected within-year progress, students need to gain a full year's learning in the school year. Gender differences in reading are slightly less marked than those in writing. Across the whole cluster, girls on average outperformed boys by approximately five scale score points (six months' progress). This difference is apparent both in school cohorts 1 and 2 and in the cluster combined, although it is slightly smaller (four scale score points) when school cohorts 1 and 2 are considered alone. Across the whole cluster, there were differences in reading progress according to ethnicity, with only NZ European and 'Other' students making expected progress overall. For school cohorts 1 and 2, ethnicities did not differ in terms of relative progress, although 'Other' students were the highest performing group. There are likely differences between individual schools in the interaction between year level, ethnicity and progress rates, which will be important at individual school level. For most year levels progress was at expected rates; that is, most year levels made 12 months' progress within the school year. The exceptions are Year 5 and Years 9 and 10. In both school cohorts 1 and 2 and across the wider cluster, Year 5 students made slightly less than a year's progress, likely reflecting a slightly higher expected growth rate in Year 5 within the normative comparisons. Years 9 and 10 made little progress across the year, scoring at levels similar to their beginning-of-year levels and thus falling away from the normative comparison. As with writing, there is a need to investigate whether students are failing to learn more widely, or whether issues of assessment affect these results.
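The norm-referenced logic described above, where the end-of-year expectation equals the next year level's beginning-of-year mean, can be sketched as follows. The normative values here are illustrative placeholders, not actual PAT norms, and the function and tolerance band are assumptions for illustration only.

```python
# Hypothetical beginning-of-year PAT normative means by year level
# (placeholder values for illustration only, NOT actual PAT norms).
NORM_BOY_MEAN = {4: 30.0, 5: 38.0, 6: 45.0}

def classify_progress(year_level, start_score, end_score, tolerance=1.0):
    """Classify within-year progress: 'expected' progress means gaining
    enough to reach the next year level's beginning-of-year mean."""
    expected_gain = NORM_BOY_MEAN[year_level + 1] - NORM_BOY_MEAN[year_level]
    actual_gain = end_score - start_score
    if actual_gain > expected_gain + tolerance:
        return "accelerated"
    if actual_gain < expected_gain - tolerance:
        return "below expected"
    return "expected"

# Under these placeholder norms, a Year 4 student must gain the full
# 8 points within the school year just to keep pace with the norm.
print(classify_progress(4, 28.0, 36.0))  # expected
print(classify_progress(4, 28.0, 33.0))  # below expected
```

This is why a cohort that simply holds its scores steady, as Years 9 and 10 did, falls away from the normative comparison.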

Progressive Achievement Test (PAT) mathematics

There was some variability between schools in progress rates in mathematics across the cluster. Across the wider cluster, one school had achievement levels above the normative comparisons; the remaining nine schools achieved below end-of-year expectations. As is the case with PAT generally, the end-of-year normative estimate is the beginning of the next year's normative comparison. In three of the 10 schools, progress exceeded the 12 months' normative progress; in five schools progress was equivalent to 12 months' progress; and in two schools average progress was less than the normative estimate. A large majority of classes (41 of 57) made on average a year's progress within the 2015 school year. Eight classes made greater gains on average, and another eight made less progress than the normative expectation. Of the 35 primary school mathematics classes in school cohorts 1 and 2, 32 had rates of progress at or above expected levels. Unlike reading and writing, there was no marked difference between genders in mathematics achievement. Across the whole cluster, NZ European and 'Other' students made the greatest progress in mathematics (ES = 0.15 - 0.16 above expected progress). When school cohorts 1 and 2 are considered in isolation, all ethnicities made expected progress, with the exception of Pasifika students. As with reading, individual schools will need to inquire into the interactions between ethnicity and progress at each year level. As with reading and writing, attainment began to fall away from normative expectations in older year levels. This effect becomes apparent from Year 7, at which point average progress fails to keep pace with the normative (12 months') estimates.

National Certificate of Educational Achievement (NCEA)

Preliminary results from the 2015 NCEA data provided by the school indicate key areas of improvement at Levels 2 and 3 in particular, in terms of both the quantity of qualifications gained and the quality of the learning in the subject areas. To date 39% of students have achieved a Level 1 qualification, fewer than the 50% of students who achieved Level 1 in 2014. Of the 42 students who achieved Level 1, 22 achieved either a merit or excellence endorsement. At Level 2 the percentage of students achieving the qualification was 68%, similar to 2014 (70%). Of the 64 students who achieved a Level 2 qualification, 18 received a merit or excellence endorsement. At Level 3, 59% of students achieved the qualification. Of note is the number of students working toward Level 3 in the year: of the 94 students, 56 gained the qualification, 10 with merit or excellence endorsements. Of the 29 students who indicated that they wanted to go on to university study, 24 achieved a University Entrance qualification. The results represent significant advances in the number of students leaving school with higher education opportunities.
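As a quick check, the endorsement and qualification rates above can be derived from the counts given in the report. The percentages below are computed directly from those counts; small differences from the reported figures (e.g., 60% here versus the reported 59% at Level 3) may reflect rounding or a slightly different denominator in the source data.

```python
# Counts taken from the 2015 NCEA results reported above.
level1_achieved, level1_endorsed = 42, 22
level2_achieved, level2_endorsed = 64, 18
level3_candidates, level3_achieved = 94, 56
ue_intending, ue_achieved = 29, 24

print(f"Level 1 merit/excellence endorsement rate: {level1_endorsed / level1_achieved:.0%}")  # 52%
print(f"Level 2 merit/excellence endorsement rate: {level2_endorsed / level2_achieved:.0%}")  # 28%
print(f"Level 3 qualification rate: {level3_achieved / level3_candidates:.0%}")               # 60%
print(f"University Entrance rate (of those intending): {ue_achieved / ue_intending:.0%}")     # 83%
```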

4.2 Classroom Observations

Most observed classes from Term 1, 2015 (22 out of 33) to Term 3, 2015 (24 out of 29) were reading lessons, in line with a focus on development in reading achievement across the cluster. Almost all classes (82 - 89%) had high levels of implementation and digital access (over 90% of students with access to devices) across the whole year. There were also consistently high levels of group teaching, with teachers working with a group in 61.9% of blocks of time in Term 4, 2014, 50% in Term 1, 2015 and 53.8% in Term 3, 2015. In general, changes over time in the way teachers work with students are becoming apparent. There seem to be shifts toward more open-ended and cognitively engaging forms of teaching interactions. This is apparent in the proportion of three-minute blocks coded as focussed on extended discussion: 54.3% in Term 4, 2014, 36.8% in Term 1, 2015 and 54.5% in Term 3, 2015. It seems that teachers are increasingly creating spaces in classes that allow learners to engage in discussion around texts. Teachers' lessons still feature a mixture of instructional types. Teachers' interactions mostly focused on practice (59.5% of blocks in Term 4, 2014, 55.7% in Term 1, 2015 and 62.8% in Term 3, 2015). Teachers also asked students to link to prior knowledge and to have metacognitive discussions about strategies, with 27% of blocks coded as these types both at the beginning and end of the year. The only instructional type less well represented was critical appraisal of content or texts (6.7% of blocks in Term 4, 2014, 11.8% in Term 1, 2015 and 4.7% in Term 3, 2015). Skills to evaluate the author's position, the credibility of a text or the intended effect of a text on the reader are vital tools for students engaged in digital learning environments. In reading lessons at the end of 2015, teacher-led activities were less constrained and focused more on deeper thinking than the tasks assigned by teachers as independent reading tasks. In this way, the profile of reading lessons with the teacher contrasts with the independent tasks assigned to students. Whereas teachers seemed to be working to extend student thinking through extended discussion, independent tasks remained predominantly focused on reading a single text (66.5% of blocks) and answering a constrained practice worksheet (53% of blocks). Therefore, although discussions may extend student thinking, the structures of independent tasks seem to lend themselves more to testing comprehension through closed questions.
The opportunities for extended thinking observed previously, through digital learning object creation (14.9% in Term 3, 2015) and the use of multiple texts (13.5% in Term 3, 2015), were much less apparent in the final 2015 observations, which focused on reading. There was a continued high level of digital task management over time. Teachers therefore seemed to be implementing the collaborative and efficiency affordances of the digital learning environment at a high level. Collaboration and joint authoring were taken up during 2015: students tended to work more collaboratively on jointly authored texts using digital means, with an increase from 0.5% in Term 4, 2014, to 23.3% in Term 3, 2015. Face-to-face collaboration was less evident, decreasing from 19% in Term 4, 2014, and 17.1% in Term 1, 2015, to 5.1% in Term 3, 2015, possibly reflecting the independent assignment of worksheet tasks.


4.3 Case Study Teachers – Reading

From analysing the practices of teachers who were effective in teaching reading in 2015, it would seem that more effective teaching may result from using the affordances of the digital learning environment to promote not only greater reading expertise but also agency over that reading.

Increased engagement in reading

Students learn to read by reading; thus reading instruction needs to engage students in the practice of reading. Practices that seemed instrumental for reading engagement were providing opportunities and support for independent reading, and extending the quantity of reading by providing supplementary texts as part of instruction. In the case study classes, teachers drew on digital tools to provide both self-chosen reading materials and teacher-selected materials to supplement instruction. Case study teachers used both digital means (for example, video recording) and traditional SSR (sustained silent reading) to develop students' independent reading skills, knowledge of books and routines for independent reading, for both recreation (novels) and information (current events). Case study teachers also took the opportunity to increase reading mileage through reading instruction that drew on multiple texts, through layering texts (e.g., text sets) or supplementary texts (e.g., contrastive texts).

Instruction and support for depth of understanding

Case study teachers supported depth of understanding through instruction in reading comprehension strategies and vocabulary. Most tutorial (teaching) interactions occurred face-to-face, supported by digital tools such as digital modelling books. As with building mileage, building comprehension combined student-directed and teacher-directed approaches. Students were supported to become aware of and monitor their own skills and strategies through self-assessment and goal setting. Teachers supported reading comprehension strategies through focussed lessons led by learning intentions. Vocabulary strategies and word consciousness were areas of less instruction; these appear likely to be catalytic for supporting comprehension and thinking, and can be supported by multimedia. Similarly, critical skills, perspective taking and language choices are all areas where there is opportunity to extend instruction.

Greater opportunity for in-task support

Case study teachers provided structures and supports for students to use before, during and after independent and instructional reading. In-task support could come from online tools or scaffolds. Examples include thinking prompts or guides developed by teachers to structure students' thinking about texts. Direct in-task support was also achieved through collaboration. Case study teachers used the digital learning environment to create shared spaces for thinking, for example through shared documents. Another potentially powerful practice was peer feedback, most commonly through responding to blog posts, but potentially also through other media (e.g., face-to-face feedback about podcasts, group self-assessments on a digital learning object).

Challenging tasks and high expectations

Case study teachers developed students' abilities in higher order thinking. They tended to do this by asking students to evaluate their reading, justify their thinking and develop agency over their interpretation. Teachers also supported students to self-monitor the depth of that thinking through levelled approaches and thinking taxonomies. Connections between reading and writing were an opportunity to increase the complexity of tasks and agency over reading. In such cases, students used their reading as "knowledge fuel" for writing or for creating, thereby reading for a student-defined purpose rather than to answer teacher-assigned questions. Creativity and innovation were also apparent in case study classes, and these served to increase the challenge when students were asked to "repurpose" their reading into another form. Examples include written arguments as an approach to book sharing (why you should read this book), advertisements for books, diagrams to summarise content, and reading advice columns.

Closer connections to students' interests and reading histories

Case study teachers used the digital learning environment to build on what students knew about. They did this by incorporating texts that reflected students' values, language backgrounds and identities. Such texts were international as well as local; examples include Samoan and Tongan newspapers alongside New Zealand content. Teachers also sought to incorporate students' personal reading histories, as well as inviting students to share their differing perspectives on the content of their reading. Case study teachers therefore used the digital learning environment both to provide a wider range of texts that made links to personal histories, and as a tool to support students' thinking about how texts connect to their lives and histories.


5. Recommendations

1. Continue to embed the effective practices in writing

While there is some evidence that writing continues to be the area of greatest acceleration, there are also signs that slightly less acceleration may have been achieved in writing over 2015 than in 2014. And, while girls and boys both make accelerated progress in writing on average, girls continue to outperform boys by approximately a year's learning. Thus there is a need to continue to reinforce the most effective practices in writing, including highly engaged learners, complex, creative tasks and powerful conversations.

2. Develop a shared understanding of effective practices in reading, including innovative digital practices which broaden and deepen reading

The analysis of some of the most effective teachers of reading highlights some potentially effective practices, likely to develop reading ability. Some of these practices are effective in traditional environments and have the potential to be amplified by digital learning environments, such as making links to students' lives outside school. Others are innovative practices made possible by digital learning environments, for example repurposing content across modes to create a digital learning object. Finally, a set of effective practices responds to the additional reading skills demanded of learners within a digital learning environment, for example perspective taking about social issues or critical appraisal of evidence in texts. We recommend therefore that with Manaiakalani we develop a shared set of hypotheses about the relationships between effective teaching and accelerated learning in reading. Once these are agreed, we recommend that the cluster work to embed these practices throughout the cluster. The following appear to be the key levers through which case study teachers drew on the digital learning environment to enhance reading:

a) Promote engagement in reading, comprehension and higher order thinking. This might include supporting students' independent monitoring of their reading enjoyment, interests, engagement and mileage.

b) Promote instruction for depth of understanding and independence. This might include both teacher-led and student-led approaches to depth, as well as intertextual approaches to deep understanding of topics through reading and an increased focus on vocabulary learning and use.

c) Provide in-task support for thinking about reading. This might include opportunities and support for students to develop independence in higher order thinking and agency over their interpretations. It would also include critical appraisal of what is read and the intended influences of texts on their readers.

d) Increase the challenge and expectations in assigned texts and tasks. This might include leveraging multiple text reading and the reading-writing connections drawn within the 'learn, create, share' learning cycle. It might also include a greater emphasis on creativity and repurposing the 'learning' from multiple sources using multiple modes.

e) Make connections. This might include explicit teacher selection of texts that make links to students' communities and reading histories. It might also include supporting students to make links between texts they have read, juxtaposing texts, comparing and contrasting, and taking an agentive stance toward how texts position them as readers.

3. Investigate subject-specific literacies and pedagogies supporting adolescent literacy

In both cross-sectional and longitudinal measures of student achievement, an emerging pattern is that students in the older year levels are not keeping pace with the normative comparisons. International research suggests that students face additional, more specialised literacy demands as the texts they encounter become more complex and subject specific. While general literacy skills suffice for students in the middle primary years, it seems likely that more subject-specialised demands are affecting students in these older year levels. Thus, we recommend that we work with Manaiakalani to pay particular attention to the specialised reading and writing demands on learners in the upper primary and junior secondary years. Each of the hypothesised effective practices in reading previously mentioned will likely be relevant to this endeavour, as teachers work to increase the challenge, complexity and higher order thinking of students into secondary school.


6. References

Beck, I., McKeown, M. G., & Kucan, L. (2002). Bringing words to life: Robust vocabulary development. New York: Guilford.

Bromley, D. B. (1986). The case-study method in psychology and related disciplines. John Wiley & Sons.

Darr, C., Neill, A., Stephanou, A., & Ferral, H. (2007). Progressive Achievement Test: Mathematics. Teacher manual (2nd ed.). Wellington: New Zealand Council for Educational Research.

Darr, C., Neill, A., Stephanou, A., & Ferral, H. (2008). Progressive Achievement Test: Reading. Teacher manual (2nd ed.). Wellington: New Zealand Council for Educational Research.

Davies, C., & Jewitt, C. (2011). Introduction to the special issue on parental engagement in children's uses of technologies for learning: Putting policy into practice in the home. Journal of Computer Assisted Learning, 27, 289-291.

Epstein, J. L. (1996). School/family/community partnerships: Caring for the children we share. Phi Delta Kappan, 76, 701-712.

Epstein, J. L., & Sanders, M. G. (2002). Family, school and community partnership. In M. H. Bornstein (Ed.), Handbook of parenting: Vol. 5. Practical issues in parenting (pp. 404-437). Mahwah, NJ: Erlbaum.

Jesson, R., McNaughton, S., Meredith, M., & Oldehaver, J. (2014, April). Whānau capability building and classroom instruction – Milestone 1 report. Presented at the meeting of Manaiakalani Education Trust, Auckland.

Jesson, R., McNaughton, S., Rosedale, N., Zhu, T., & Meredith, M. (2015). Manaiakalani evaluation final report 2012 - 2014 – Full report. Auckland, New Zealand: Auckland UniServices Limited.

Ministry of Education. (2012). e-asTTle writing: The technical report. Wellington: New Zealand Council for Educational Research.


Meredith, M. (2015). Fanau engagement: Understanding the effects of digital learning and home school partnership. Paper presented at the Language, Education and Diversity 2015 Conference, University of Auckland.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Sage.

'Otunuku, M., Nabobo-Baba, U., & Johansson Fua, S. (2014). Of waves, winds and wonderful things: A decade of rethinking Pacific education. Fiji: USP Press.

Riviera, H. H. (2014). Studying the impact of technology-influenced activities among low-income Spanish-speaking immigrant families. Journal of Latinos and Education, 13(3), 196-211.

Suizzo, M., Pahlke, E., Yarnell, L., Chen, K., & Romero, S. (2014). Home-based parental involvement in young children's learning across U.S. ethnic groups: Cultural models of academic socialization. Journal of Family Issues, 53(2), 254-284.

Thaman, K. (1997). Considerations of culture in distance education in the Pacific Islands. In L. Rowan, L. Bartlett & T. Evans (Eds.), Shifting borders: Globalisation, localisation and open and distance education (pp. 23-43). Geelong: Deakin University Press.


Appendix A: Classroom Observation Tool 2015

Date: _____ Observer: ___________ School: ________ Teacher: __________ Class: _____ Subject: ___________ # Students: ______ # Devices: ______

Teacher group

Time

List of texts used

Main teaching activity (circle)

Other group

Feedback (tick)

Tchg foci (circle)

Evaluative L/Model

Item Descriptive

Generative Q&A

Group activity

Example Conf/ED

Nature of task (tick)

Nature of site (tick)

Gaming

Game

Constrained practice w/sheet

Googledocs

Open-ended template

Student product

Extended reading (multiple)

Information site

Extended reading (single)

Book

Extended writing

Language

DLO creation

Creation tools

Navigating/organising

Class site

Commenting on blogs

Blog

Agency

Y/N

Working Together (tick) ** Teacher Directed? (Y/N)

Task Management (circle)

Evidence of student decision Offline Nature of Decision:

My work

APK Online (Y/N)

#ss

Comment

Online + verbal prompts

Practice Our work

Rove

Totally digitally managed

Strategy

Instruct Other: e.g., off-task

Email Search Engine

Critical Behaviour

Other Both using computer (CMD)

Discussion (or one person on computer only) (FTF)

Other


Appendix B: Post Classroom Observation Scoring Protocol

Observer Impressions - Manaiakalani Classroom Observations

"What is your sense of whether classroom reading practice in this lesson reflects a powerful uptake of digital affordance?"

Impression Dimensions

Evidence (e.g., record evidence to support your impressions towards building a corpus of practice)

Uptake Rating [1 = Limited; 2 = Some; 3 = Powerful]

Indicators:

How much opportunity is there to learn? [e.g., direct instruction, guidance, vocabulary, reading strategy, mileage (extended reading)]

What is the nature of cognitive challenge? [e.g., intertextuality (tasks with text), criticality, text level, significance/relevance]

Does digital integration take learning beyond what could be achieved ordinarily? [e.g., skill development, efficiencies, engagement, agency, collaboration]

How well is learner independence cultivated? [e.g., scaffold in/out, tracking learning goals, self-regulation, initiative, agency]


Appendix C: Parent/Whānau Interview Tool

Manaiakalani Parent Whānau Interview Tool

Semi-structured interview, designed to elicit the following key themes.

Interview 1: I am really interested in what you experience as a parent in digital learning that helps you and your family connect to the school, community and beyond. Remember, you don't have to answer my questions if you don't feel comfortable. I would like to find out why you made the decision to engage in digital learning and what you see as benefits or concerns you face as a family in this environment.

Background
Tell me about your background. What ethnic group do you most identify with? What language do you speak? Tell me about your education and schooling experience. Did you have access to and use computers when you were at school? Career? Are you currently employed? What is your status? [single, married, de facto, caregiver, grandparent, etc.] How many children do you have? Which school did you send your child to?

Family engagement
What kinds of digital devices do you have in your home? [desktop computer, netbook, laptop, mobile phone, iPad, Xbox] What kinds of online activities do you engage in at home? [personal development, information, news, shopping, leisure, social, gaming, communication] How much time do you spend online with these activities? How do you use your online knowledge to support your learning? Your child's? Your family's? Tell me about other [offline] learning activities that your family engages in. [discussions, learning, enquiry] What do you believe are the benefits of your family being online? Do you have any concerns? How would you like to resolve your concerns?


School engagement
What kinds of school activities do you get involved in at your child’s school? [netbook training, Reading Together, home school partnership, coach, cultural, parent committee, parent helper]
Do you access the school’s website?
Do you access your child’s school work online? Parent portal online?
Do you communicate with the school online? [email, blog posts, text message]
Do you communicate with your child online about their school work? [email, blog, text message]
In your school engagement activities, how important is it for you to be connected?
What do you believe are the future benefits of school engagement online?
Do you have any concerns? How would you like to resolve your concerns?

Community engagement
What kinds of community activities are you involved in? [church, sports club, culture group, training, education, work or other]
In your community activities, how important is it for you to be connected?
What do you believe are the future benefits of being online in the community, education, work?
Do you have any concerns? How would you like to resolve your concerns?

Digital citizenship
Tell me about the types of global activities you engage in. [personal development, information, news, shopping, leisure, social, gaming, communication]
What do you believe are the future benefits of digital citizenship?
Do you have any concerns? How would you like to resolve your concerns?

Interviews 2, 3, 4 & 5: Semi-structured interview, designed to elicit the following key themes. I am really interested in what you experience as a parent in digital learning that helps you and your family connect to the school, community and beyond. Remember, you don’t have to answer my questions if you don’t feel comfortable. I would like to find out why you made the decision to engage in digital learning, what you see as benefits or concerns you face as a family in this environment, and note any changes since we last met.

Fanau engagement


What kinds of digital devices do you have in your home? [desktop computer, netbook, laptop, mobile phone, iPad, Xbox]
What kinds of online activities do you engage in at home? [personal development, information, news, shopping, leisure, social, gaming, communication]
How much time do you spend online with these activities?
How do you use your online knowledge to support your learning? Your child’s? Your family’s?
Tell me about other [offline] learning activities that your family engages in. [discussions, learning, enquiry]
What do you believe are the benefits of your family being online?
Do you have any concerns? How would you like to resolve your concerns?

School engagement
What kinds of school activities do you get involved in at your child’s school? [netbook training, Reading Together, home school partnership, coach, cultural, parent committee, parent helper]
Do you access the school’s website?
Do you access your child’s school work online? Parent portal online?
Do you communicate with the school online? [email, blog, text message]
Do you communicate with your child online about their school work? [email, blog posts, text message]
In your school engagement activities, how important is it for you to be connected?
What do you believe are the future benefits of school engagement online?
Do you have any concerns? How would you like to resolve your concerns?

Community engagement
What kinds of community activities are you involved in? [church, sports club, culture group, training/education/work or other]
In your community activities, how important is it for you to be connected?
What do you believe are the future benefits of being online in the community, education, work?
Do you have any concerns? How would you like to resolve your concerns?

Digital citizenship
Tell me about the types of global activities you engage in. [personal development, information, news, shopping, leisure, social, gaming, communication]
What do you believe are the future benefits of digital citizenship?
Do you have any concerns? How would you like to resolve your concerns?


Manaiakalani Evaluation Programme

Data Sources. In 2015 we relied on the following sources of data: e-asTTle and Progressive Achievement Test (PAT) achievement; preliminary National Certificate of Educational Achievement (NCEA) standards; classroom observations, classroom site and student blog analysis of case study classes and parent/whānau ...
