Module 7: Te Whakamahi Pūrongo (Data-Driven Leadership): This module focuses on using data effectively to inform decision-making, assess progress, and drive continuous improvement.

“He aha te take? He aha te pūtake?”

“What is the cause? What is the root cause?”

Module Objectives:

  • Understand the importance of data-driven decision-making in education.
  • Identify and collect relevant data to inform school improvement initiatives.
  • Analyse and interpret data effectively to identify trends, patterns, and areas for improvement.
  • Use data to inform and evaluate school programmes and initiatives.
  • Communicate data effectively to stakeholders, including teachers, students, parents, and the wider community.
  • Develop a data-driven improvement plan for a specific area of school focus.

Data are crucial for improving student achievement. By revealing gaps in student learning and instructional practices, they guide teachers and leaders in identifying areas for improvement and tailoring instruction to meet individual student needs.

However, data alone do not provide solutions. They serve as a valuable tool for understanding student learning and informing decision-making. Interpreting data is paramount; it involves uncovering the ‘story behind the numbers’ by identifying patterns and relationships. This process requires ongoing analysis, reflection, and the collection of further evidence to refine understanding and inform continuous improvement.

Types of Data:

  • Demographic data: Information about students, staff, and the school community.
  • Student achievement data: Standardised tests, classroom assessments, and student work samples.
  • Perceptions data: Information gathered through surveys, questionnaires, observations, and student voice.
  • School processes data: Information about programmes, classroom practices, and assessment strategies.

When gathering data, focus on relevant information that serves a specific purpose. Avoid collecting excessive data, which can be time-consuming and difficult to analyse.

While student achievement data provide valuable information about outcomes, they don’t explain the underlying causes. To understand these, utilise formative assessment data, classroom observations, student voice, and other relevant sources.

Analysing Data:

Start by posing a specific question about your data, focusing on differences, gaps, or the impact of teaching practices. Look for unexpected findings and identify patterns, trends, and categories.

Avoid jumping to conclusions; explore the data deeply, considering multiple perspectives and questioning your assumptions.

Evaluate data quality using the 4 Cs: Completeness, Consistency, Comparison, and Concealed information.

Create a concise data overview and share it with colleagues to gain diverse perspectives.

Generate inferences and potential explanations, remembering that correlation does not equal causation.

Develop a range of data stories to identify areas for further investigation.

Recognise that data may not tell the whole story and that further data collection may be necessary to confirm findings.
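To make these steps concrete, here is a minimal sketch in Python (using pandas) of the first moves above: posing one specific question, summarising by group, and flagging gaps for further investigation rather than drawing conclusions. The student groups, scores, and benchmark are all hypothetical, not a prescribed tool or dataset.

```python
# A minimal sketch only: column names, scores, and the benchmark are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "group":      ["A", "A", "B", "B", "C", "C"],
    "score":      [78, 85, 61, 66, 72, 90],
})
BENCHMARK = 70  # notional 'at expected level' cut score

# Question posed of the data: which groups sit furthest from the benchmark?
summary = data.groupby("group")["score"].mean().to_frame("mean_score")
summary["gap_to_benchmark"] = summary["mean_score"] - BENCHMARK
summary = summary.sort_values("gap_to_benchmark")
print(summary)

# A negative gap is a pattern to investigate with further evidence
# (observations, student voice), not a conclusion about cause.
to_investigate = summary[summary["gap_to_benchmark"] < 0]
print(to_investigate)
```

The same question can be asked of any grouping in the data (year level, cohort, programme), which is often where unexpected findings surface.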

Resources:

https://research.acer.edu.au/cgi/viewcontent.cgi?article=1317&context=research_conference

https://education.nsw.gov.au/about-us/education-data-and-research/cese/publications/research-reports/5-essentials-for-effective-evaluation 

https://www.education-leadership-ontario.ca/application/files/6414/9445/9507/Ideas_Into_Action_for_School_and_System_Leaders_Using_Data_Transforming_Potential_into_Practice_Updated__Winter_2013-14.pdf 

https://cdn.auckland.ac.nz/assets/education/about/schools/tchldv/docs/Using%20Evidence%20in%20the%20Classroom%20for%20Professional%20Learning.pdf 

Task: Developing a Data-Driven Improvement Plan

  • Select a specific area of school focus (e.g., literacy, numeracy, student well-being).
  • Identify relevant data sources and collect the necessary data.
  • Analyse the data and identify key trends, patterns, and areas for improvement.
  • Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
  • Post your data-driven improvement plan on the online forum for peer feedback and discussion.

Assessment:

  • Completion of all readings.
  • Participation in the online forum discussion.
  • Development and submission of a data-driven improvement plan.
  • Demonstration of the ability to analyse and interpret data effectively.

18 Responses

  1. I am choosing literacy as my focus, in particular looking at NCEA Level 1 literacy outcomes. I have chosen this because Level 1 is the first major qualification milestone for our students and is strongly linked to their success in senior school. New qualifications are also potentially coming in a few years’ time, which will have a huge impact on how students approach Year 11.
    I am looking at our Westlake Girls’ High School achievement data from the past three years (2020–2022). The data shows that the school sustains very high levels of achievement overall, with Level 1 pass rates averaging above 90%. However, the new literacy and numeracy requirements present fresh challenges for our students, and gaps remain for some groups, particularly Māori and Pacific learners.
    After looking at the data, it’s clear that students are strong in overall NCEA Level 1 participation and achievement. They also perform well in literacy skills such as reading comprehension at surface level and basic vocabulary acquisition. The data shows, however, that students need further support in literacy and numeracy at the foundational skills level (reading for inference and extended writing) to meet the new national standards. Māori and Pacific students in particular are not yet achieving at the same levels as the school average, which is a key area for improvement.
    Strategies:
    Provide teachers with regular access to literacy data, and support them with professional learning in data-informed teaching.
    Integrate quick, targeted literacy and numeracy activities into junior classes (Year 9 and 10) to strengthen foundational skills before students reach NCEA. (This is something I have seen the school do this year with small, targeted groups, and hopefully it will be rolled out to capture more students in the years to come.)
    Incorporate culturally responsive teaching strategies and mentorship programmes to support Māori and Pacific students in particular.
    Build student agency through the use of tools such as the NZQA NCEA Student App, allowing them to monitor their own credits and literacy/numeracy progress.

    Next steps:
    Short term: Provide all teachers with a bank of literacy “warm-up” activities (5–10 minutes) that can be embedded across subjects. Identify students who are below benchmark in foundational literacy and place them into small-group support. Initiate regular data-check meetings in departments to monitor progress.
    Medium term: Track Māori and Pacific student outcomes specifically at Level 1 and adjust interventions if disparities remain. Encourage early uptake of scholarship preparation for students who are excelling, to strengthen literacy skills through higher-level critical thinking tasks.
    Long term: Reassess literacy and numeracy progress at the end of each year and use the data to refine teaching strategies. Hold an annual literacy and numeracy review meeting, including senior leaders and the literacy lead, to evaluate progress toward equity goals and sustained success with the new requirements.

  2. This module prompted some discussion around using data to assess the impact of the WSL role for both teachers and learners. I developed a survey for both stakeholder groups, and this was sent out to see how they felt the Writing rubrics had supported their understanding of Te Mataiaho.

    This came from a space of curiosity around how teachers and learners felt it supported the clarity of learning, success criteria and next steps for goal setting. One of the key indicators was the number of teachers who took the survey. Only 50% of the staff took the survey, which indicated we still have some work to do around the barriers to engaging with the rubrics. Of those that did complete the survey, the majority found it useful for understanding the progression outcomes within Te Mataiaho, along with using them with learners. This could be for planning and modelling learning intentions, and/or reflecting on success criteria and generating next steps.

    The learners we surveyed were from the WSL and early adopter classrooms. As we are a school with Assessment for Learning pedagogy embedded, it was interesting to see how confident the learners were with using rubrics to find evidence of success and plan next steps. This was encouraging feedback that the WSLs can share back with the teaching staff, noting how children found the rubrics brought them more clarity around what is being learned during writing sessions.

    Some next steps that I have identified:
    Be curious and inquire why some teachers did not complete the survey. Draw on the rumble language from Brené Brown and lead with openness and curiosity around what barriers these teachers feel they have with using the writing rubrics for implementation of Te Mataiaho.
    Share the success that the surveyed learners feel they are having using the rubrics to support writing. Unpack this data and share trends/alignment with our Assessment for Learning pedagogy.

    One of the barriers identified by the teachers who were surveyed was “time”. Start collaborative discussions around how we can manage this barrier and set an action plan to support those who feel it is one more thing at a time when the educational landscape and goalposts are constantly moving.

    Look at options for how we can engage learners across the school in ways that are more manageable, e.g. teachers working alongside a smaller group of children within classes to use the rubrics, with these learners then becoming the experts (a tuakana–teina approach of reciprocal learning).
    Collaboratively look at and decide what data the teachers would find useful to support implementation and engagement and to measure effectiveness and outcomes.

  3. I am choosing literacy as my focus, in particular our new entrant data, as I am literacy lead and the Year 0–2 team leader. I am looking at our Year 0/1 children’s (2025) on-entry data from our foundation literacy skills test. This test focuses on key areas of foundational literacy: phonological awareness (identifying rhyme, syllables, and individual sounds (phonemes) within words) and the alphabetic principle (understanding the relationship between letters (graphemes) and the sounds they represent).

    After looking at the data, it is clear that children starting school this year have strengths in syllable-based skills (blending, segmenting) and auditory discrimination. They can easily combine and break apart syllables.
    The data shows children need to further develop phonemic-level skills, especially rhyme generation, blending sounds, and isolating sounds.

    Strategies:
    Integrate multi-sensory and game-based activities into our Year 0/1 classes, using tactile activities to make phonemic awareness more concrete. Incorporate rhyming and sound games into daily routines to make learning engaging. Targeted small-group interventions: use the screening data to form small groups of students who scored low on the blending and isolating sounds tasks and provide extra support on top of their whole-class instruction.

    Next steps based on the data:
    Short term: Provide teachers with a bank of quick, 5-minute warm-up activities for blending and isolating sounds to use at the start of each literacy lesson. Implement small-group rotations where students in need of more exposure get this on top of their whole-class learning.
    Talk to ECE settings in our community about their knowledge of phonological awareness and invite them into our new entrant classrooms to observe games and ways to bring this in through play.

    Long term: Reteach and reassess blending and isolating sounds every term to monitor progress and adjust small groups and whole class teaching as needed. Regularly look at new data to determine if the strategies are effective and if a change in focus is needed. Hold a literacy team meeting at the end of the year to review the final data and plan for next year.

  4. Developing a Data-Driven Improvement Plan

    Specific area of school focus- Literacy
    Year 4 started training in the BSLA programme this year. We performed the following tests: 1) Maze test: students were given a limited time and had to choose the correct word to go in each sentence of a passage. 2) ORF test: students read out as many made-up words as they could in a minute, then as many real words as they could in a minute, and lastly were given another minute to read as many words as they could from a given passage. 3) Students were also assigned a reading comprehension test and a word study test, which were completed on a computer. This testing was performed by Years 5 and 6 as well. We met regularly to discuss the progress we had made in setting up these tests and having students sit them.

    After completing this testing, Years 4, 5 and 6 recorded their data in a spreadsheet, and some data was recorded in the BSLA programme online. We then colour-coded every score for every test completed: one colour to indicate the student was above the expected standard, one colour for at the expected standard, and one colour for below the expected standard. We also completed a Reading e-asTTle test, which we colour-coded as well. Once we had done this, Year 4 grouped all of their students, and we made a collective agreement that one teacher would start at Taumata 7 and below, one teacher would start at Taumata 10, one teacher at Taumata 13 and one teacher at Taumata 16. We had a huge range of data!
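    (As a rough illustration of the banding step described above, the sketch below classifies each score against invented cut scores; the real BSLA thresholds, test names and spreadsheet columns would differ, so treat it only as an assumption-laden example of the rule behind the colour coding.)

```python
# Hypothetical sketch: scores, test names, and cut scores are made up.
import pandas as pd

scores = pd.DataFrame({
    "student": ["Ana", "Ben", "Cara"],
    "test":    ["Maze", "Maze", "Maze"],
    "score":   [34, 22, 15],
})

AT_STANDARD = 20      # notional 'at expected standard' cut score
ABOVE_STANDARD = 30   # notional 'above expected standard' cut score

def band(score: float) -> str:
    """Return the achievement band a score falls into."""
    if score >= ABOVE_STANDARD:
        return "above"
    if score >= AT_STANDARD:
        return "at"
    return "below"

scores["band"] = scores["score"].apply(band)
print(scores)
# In a spreadsheet, the same rule would drive the conditional formatting
# (one colour per band) and the initial grouping of students.
```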

    The BSLA programme’s key strategies include systematic teaching of phonics, phonological awareness, morphology, orthographic patterns, vocabulary, and comprehension, supported by decodable readers. We have been using the PowerPoints with pre-made lessons for both reading comprehension and word study (PHOM). Texts are already selected, with assigned activities ready to be used. Our Year 4 group of teachers are finding that, as they progress through these lessons, some teachers are modifying activities slightly to make them more engaging and enjoyable for their learners. We have also moved some students up to the next Taumata class as we have noticed improvement, and identified some students as needing further extension in their learning.

    During our recent Student Led Conferences I was able to discuss individual students’ data with their parents so that they could identify areas of strength and areas for improvement. We also enabled students’ parents to meet with their child’s reading teacher to receive a more detailed, formative analysis of how their child was performing and progressing.

    We will perform another round of testing after we have taught this programme for 10 weeks, as per the BSLA instructions. This will allow us to see improvements in our students’ data and inform our next steps and teaching programme for each student.

  5. Focus Area – Literacy – Reading

    As a school, we are preparing to begin our structured literacy journey with iDeal next year. Before launching into this, our aim in 2025 was to “give things a go” by exploring ways to strengthen teacher practice, lessen workload, and increase student engagement in reading. We wanted to build momentum and confidence across our staff and learners so that when we transition into iDeal, we do so from a position of shared growth and open-mindedness.

    As leader of the Reading PLG, the focus is on strengthening teacher capability and confidence in planning and delivering reading programmes, and on lifting student engagement and achievement in reading.

    Data Sources

    Teacher Surveys (Term 2 & Term 3): Collected teacher voice on confidence and needs in teaching reading and planning programmes.

    Student Voice:
    – Term 2: Informal verbal feedback.
    – Term 3: Google Form survey with structured questions.
    Achievement Data: Teacher OTJs informed by asTTle, PROBEs, samples, and formative assessment.

    Data Analysis

    Teacher Voice
    Term 2 (Baseline): (on a scale of 1-5)
    – 60% of teachers rated their confidence in teaching reading at 3 or below
    – 86% rated confidence in planning reading programmes at 3 or below.

    Term 3 (Follow-up):
    – 30% rated confidence in teaching at 3 or below.
    – 38% rated confidence in planning at 3 or below.
    Trend: Noticeable improvement in teacher confidence, suggesting the PLG support and shared resources may have been effective.
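    (A small, purely illustrative sketch of how the “3 or below” percentages above could be recomputed from raw ratings each term, so the Term 2 and Term 3 comparisons stay consistent; the ratings below are invented, not our survey data.)

```python
# Hypothetical 1-5 confidence ratings for each term (invented values).
term2 = [2, 3, 4, 3, 2, 5, 3, 1, 4, 3]
term3 = [4, 4, 3, 5, 4, 2, 5, 4, 3, 4]

def pct_at_or_below(ratings, threshold=3):
    """Percentage of ratings at or below the threshold."""
    return 100 * sum(r <= threshold for r in ratings) / len(ratings)

print(f"Term 2: {pct_at_or_below(term2):.0f}% at 3 or below")
print(f"Term 3: {pct_at_or_below(term3):.0f}% at 3 or below")
```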

    Student Voice
    Term 2: Reading described as “not my favourite” or “not the best part of the day.”
    Term 3: Only 6.8% of students indicated they still don’t enjoy reading.
    Biggest challenges identified:
    – Giving full answers to comprehension questions.
    – Understanding new vocabulary.
    – Finding time to read in own time.
    Trend: Engagement has improved; barriers to comprehension are more clearly identified.

    Achievement Data
    Year 7s: Term 1: 55% at 3P or above → Term 3: 77%.
    Year 8s: Term 1: 64% at 3A or above → Term 3: 73%.

    Trend: Steady upward shift in achievement across both cohorts.

    Identified Needs

    Continue lifting teacher confidence in planning effective reading programmes.
    Address specific student challenges: vocabulary knowledge and extended comprehension responses.
    Maintain focus on making reading engaging and relevant to students.

    Improvement Plan

    Goal 1: Lift Teacher Capability and Confidence
    Target: By end of year, less than 20% of teachers rate confidence at 3 or below for teaching and planning reading.

    Strategies & Actions:
    – PLG continue to co-construct and share planning templates and exemplars.
    – Termly workshops on scaffolding comprehension strategies (e.g., inference, extended responses).
    – Ongoing modelling of lessons and peer observations

    Goal 2: Strengthen Student Engagement in Reading
    Target: Reduce “dislike reading” responses to below 5% by end of year.

    Strategies & Actions:
    – Incorporate more student choice in texts
    – Continue to embed fun and engaging activities (drama, debates, visual projects) into reading tumbles.
    – Connect texts to real-world contexts (events, cultural weeks, inquiry topics).
    – Start whole school initiatives during a book week with a Litquiz, spelling bee, and book character parade.

    Goal 3: Improve Student Comprehension and Vocabulary
    Target: By end of year, 80% of students will show improvement in giving full responses and understanding key vocabulary (measured by formative probes and classroom assessments).

    Strategies & Actions:
    – Introduce explicit vocabulary teaching routines
    – Scaffold extended responses through sentence starters and modelling (created and shared on shared planning resources)
    – Use peer and self-assessment checklists (created and shared)
    – Support teachers to provide targeted small-group support for learners below benchmark.

    Monitoring & Evaluation
    – Termly teacher confidence survey (same questions for consistency).
    – Termly student survey on enjoyment, challenges, and engagement.
    – Ongoing analysis of OTJs, asTTle, and formative probes.
    – Regular PLG reflections and adjustments based on data.

    1. Thank you for sharing all of your thinking here! This is a fantastic and comprehensive data-driven plan. You’ve done an excellent job of using teacher, student, and achievement data to inform your goals and strategies, creating a clear and logical path forward for your school’s structured literacy journey.

  6. I chose to look at our recently gathered Year 3 Better Start Literacy Approach (BSLA) data alongside OTJ data in Hero. Testing includes Word Reading (decoding and recognition of familiar words), Non-word Reading (phonological decoding of unfamiliar words), and Spelling results (encoding and application of phoneme–grapheme mapping). We are now in our second year of implementing the BSLA in Year 3, which has required considerable teacher training and a significant shift in practice to embed a new literacy programme.
    This year, our school has also extended the BSLA into Years 4 and above, with teachers currently undergoing training and upskilling. I am excited to explore how the Year 3 results can inform teaching priorities, identify students who may require additional support, and ensure there is a consistent, data-driven progression of literacy instruction across the school. By comparing BSLA outcomes with OTJs, I can see where structured assessment data aligns with teacher judgment and where gaps exist, helping us refine both our teaching practices and our moderation processes.
    Across Year 3, 26 students (out of 115) are identified as ‘below’ or ‘well below’ in reading; however, only 14 students currently receive Tier 2 targeted support. Our Tier 2 programme has two teachers working with these 14 students. This data highlights a group of learners who have been recognised through OTJs as needing additional support but are not yet receiving targeted intervention. While they may not meet the current threshold for Tier 2, without careful monitoring there is a risk they could fall further behind.
    *Results summary*
    Word Reading is a relative strength: most students can recognise familiar words.
    Non-word reading and non-word spelling are weaker: more students struggle to decode unfamiliar words, showing reliance on sight word recognition rather than phonics transfer. While students can read words, this skill is not transferring into accurate spelling, making spelling the lowest-performing area.

    *Improvement Plan*
    Targeted Support for Identified Students
    Review the 12 students identified as ‘below’ or ‘well below’ who are not currently accessing Tier 2 intervention.
    Prioritise these students for additional support, either by extending Tier 2 provision or providing targeted small-group/individual instruction within class. Track the progress of this group specifically to measure the impact of any new support.
    Strengthening Non-Word Reading (Phonological Decoding)

    Embed regular practice with non-words in guided reading and literacy rotations to normalise decoding of unfamiliar words.
    Provide professional learning for teachers on strategies to support phonological generalisation. Complete regular teacher observations to check-in on classroom practices/provide support.
    Improving Spelling Outcomes
    Implement daily, short, focused spelling instruction that links phoneme–grapheme mapping to writing tasks.
    2026 goal: train in the BSLA spelling programme and implement it in our literacy practice.
    Use BSLA-aligned spelling assessments to group students for targeted instruction.
    Use staff meetings, PLD sessions, and peer observations to share effective practice in decoding, spelling, and assessment.

    *Strengthening Whānau Engagement*
    Review home–school partnerships, with a focus on supporting literacy learning at home.
    Hold whānau hui to share progress data, explain BSLA practices, and provide practical strategies for supporting literacy.
    Use communication platforms (e.g., Hero) more explicitly to share weekly sounds, spelling patterns, and key messages, ensuring families are well-informed and supported.

    1. This is a fantastic and highly effective data-driven improvement plan. You’ve done an excellent job of using specific BSLA data to not only identify areas for growth but also to create a comprehensive, actionable plan.

  7. I chose to focus on Mathematics, looking at my class’s e-asTTle Term 2 data. This was a teacher generated assessment, built around key concepts that had been explored since the beginning of the year.

    Significant strengths were identified across number concepts including rounding, rational numbers and operations with fractions and mixed numbers. The majority of the class were working at grade level with whole number and place value concepts, word problems involving the four operations, data analysis, and solving linear, simultaneous and quadratic equations. Areas for improvement were solving problems with exponents, the 11, 12 and 13 times tables, and working with decimals.

    This year we have started using the Maths No Problem programme. Pre-testing fractions and decimal knowledge before beginning to teach these units revealed minimal to no understanding of the concepts that were going to be taught. Post-testing showed growth; however, the majority of students were still considerably below grade level. Discussions were held at grade and senior leadership level as to how to address the gaps being seen across Year 7.

    From this I have put the following strategies and action steps in place:
    1. Recognise that 4 of our 6 feeder primary schools have also started with Maths No Problem this year. I will track our data closely for the next cycle of our strategic plan to see if the pre-test data improves.
    2. Ensure high quality differentiation activities are available in each lesson to target the needs of each student.
    3. Continue to implement strong AFL practices in every lesson – learning intention, success criteria. Daily reflections for the students on what went well and what their next steps are.
    4. Daily feedback and retrieval practice to elicit evidence from the students and monitor progress.
    5. Acknowledge that Maths No Problem is a tool but not the curriculum. Pivot, adapt and use this resource in conjunction with other modes/methods to ensure growth and understanding continues.
    6. Appreciate and recognise that e-asTTle is one type of diagnostic tool, and also take into account daily observations, book work, PAT Maths, etc.
    7. Focus on making the basics stick, closing the gaps.

    1. Thank you, Leanne. This is a great reflection on your maths data and a really solid plan for moving forward. You’ve clearly identified strengths and areas for improvement, and your strategies show a deep understanding of effective teaching practice.

  8. Literacy Data Analysis and Improvement Plan.
    I teach two groups four days per week, making them the most relevant cohorts for this task:
    • Year 7–8 ESoL support
    • Year 5–6 extension

    Data Sources and Collection
    I drew on the following standardized tools:
    • PAT Reading Comprehension
    • PAT Reading Vocabulary
    I use these assessments because they drive both learning and parent reporting.
    These are administered in Terms 1 and 3 to emphasize their formative role in shaping instruction, rather than summative reporting.
    Data Analysis
    In examining the Term 1 and 3 results, I focused on:
    1. Scale scores, to track change over time
    2. Genre-based comprehension results, to pinpoint specific strengths and weaknesses
    Noted Inconsistencies
    1. The extension group took a year-level assessment in February but an adaptive PAT in July, making direct score comparisons less reliable.
    2. Four of the twelve highest-achieving boys showed a decrease in scale score, suggesting factors beyond raw ability (for example, test format or motivation) may be at play.
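    (A hypothetical sketch of how students whose scale scores dipped could be flagged for follow-up; the names and PAT scale scores below are invented, and a negative change is treated as a prompt to look at context rather than a conclusion about ability.)

```python
# Illustrative only: made-up names and scale scores.
import pandas as pd

results = pd.DataFrame({
    "student":     ["A", "B", "C", "D"],
    "term1_scale": [68.2, 72.5, 61.0, 75.3],
    "term3_scale": [70.1, 69.8, 64.4, 71.0],
})

results["change"] = results["term3_scale"] - results["term1_scale"]
dipped = results[results["change"] < 0]  # students whose scores went down
print(dipped)
# Follow up on context (test format, motivation, attendance) before inferring cause.
```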
    Key Trends and Areas for Improvement
    • Overall growth in most students’ reading vocabulary and comprehension
    • Variation in genre performance, with poetry and narrative comprehension lagging for the extension group
    • Pronouns, positional verbs, and prepositions as gaps in the ESoL group’s vocabulary knowledge
    • A small subgroup of high-achieving boys whose scores dipped, indicating a need for targeted support in test-taking strategies
    Data-Driven Improvement Plan
    Goals
    1. For the extension group, increase average comprehension scale scores by Term 4, with a focus on poetry and narrative texts.
    2. For the ESoL group, boost vocabulary mastery of pronouns, positional verbs, and prepositions by 25% as measured in a Term 4 reassessment.
    Strategies and Action Steps
    Extension Group: Genre-Focused Instruction
    • Target poetry and narrative units during the second half of Term 3
    • Design lessons around:
    – Identifying evidence in different text types
    – Determining the “best answer” from multiple choices
    – Recognizing and applying synonyms to deepen comprehension
    • Implement small-group workshops twice weekly, using past PAT items for hands-on practice
    • Use peer-review of practice answers to cultivate metacognitive awareness
    ESoL Group: Vocabulary and Sentence Structure
    • Administer a weekly cloze activity that recycles target pronouns, positional verbs, and prepositions
    • Provide cut-up sentence exercises (both compound and complex) to reinforce word order, noun indicators, and prepositional usage
    • Employ context circles:
    1. Select a key vocabulary word
    2. Brainstorm related words and scenarios
    3. Build and share sentences orally before transferring to written form
    By aligning assessment formats, drilling into genre specifics, and embedding test-taking strategies into daily instruction, this plan utilises data insights to create focused and actionable teaching steps.

    1. This is an excellent and highly detailed data-driven plan. You’ve done a fantastic job of moving beyond surface-level scores to identify specific needs for both your extension and ESoL groups, creating a clear and highly actionable set of strategies to address those gaps.

  9. The area I have selected is Maths, as I am Head of Maths. I am using our informed decisions and comparing them to our OTJs from mid-term last year.
    I have found out through our informed decisions from mid-year that in:
    Year 1 (61% are at or above)
    Year 2 (52% are at or above)
    Year 3 (45% are at or above)
    Year 4 (55% are at or above)
    Year 5 (40% are at or above)
    Year 6 (50% are at or above).

    Extra information to help analyse:
    We are currently using Maths No Problem in our Year 2-6 classes and Numicon in our Year 0/2 classes. In Year 0/1 we are using the Jo Knox and Marie Hurst Snapshots as well as gathering information through Numicon. In the Year 2-6 classrooms, we are mainly using the Maths No Problem chapter reviews.

    Findings:
    This data shows that our Year 0/1s are achieving the highest, and that teaching the foundations through Numicon is working. They are also using the ‘Snapshot’ assessment, which is a quick snapshot of what children are able to do separate from the programme and is directly linked to the New Zealand Curriculum. Our target group was the Year 3 cohort, as they were our least successful group in maths. But in looking at this data, the Year 5s are actually lower, so I need to look into this and look at individual kids to see why so many are working towards the standard. We do know the curriculum signposts have changed, and children are expected to achieve harder maths, so I know this will have an impact. As a school, we have done a lot of PD on teaching our maths programmes in an effective way and having children work through the ‘I do, we do, you do’ strategy.
    A wondering that I have, and am currently working on, is our assessment. I am currently trialling the Phase 2 Snapshot to see how my children achieve on it, as it directly links to the curriculum and isn’t as literacy-heavy as Maths No Problem. I am also hoping that through the Snapshot we can identify their successes and gaps, then work on bringing back our visible learning concepts and help the children to be more assessment-capable, which will help them to work on their gaps.

    Action plan:
    Continue to finish the Snapshot in my class and analyse the data against my current informed decisions. I am also asking a Year 3 teacher to do the same. From here, we can see the areas the children are struggling in across the school, build teacher knowledge to improve this, and get children and parents involved and working on their next steps together. We have just had a PM on basic facts, so I need to check that this is happening, as I am wondering if this is holding the children back.

  10. Over the last 5 years, our school has completed the NZCER Wellbeing@School student survey, which provides data on how students feel (positively or negatively) towards themselves, their peers, their teachers, their school, and their community. I have had the privilege of taking part in this survey, and also organising this survey for our entire kāhui ako and analysing my own class data, my whole school data, and my kāhui ako data. So I am answering this question from an Across School Leader perspective, based on the data that has driven our strategic planning moving forward as a collective over the past 2 years.

    Key Findings from 2023-2024:
    – One of the main aspects (out of the 5) that had the most significant change between 2023 and 2024 was “Pro-social Student Culture and Strategies”. The proportion of students agreeing or strongly agreeing with every question in this area decreased by between 2% and 8%, with more students disagreeing or strongly disagreeing with each question in 2024 than in 2023.
    – Overall, out of the 63 questions asked, 4 questions had a positive change/increase, 4 questions remained the same, and 55 questions had negative change or decreases from 2023 to 2024.
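    (For transparency, a small hypothetical sketch of the question-level tally behind a finding like this, using invented “% agree or strongly agree” figures for each question in each year rather than the actual Wellbeing@School results.)

```python
# Invented per-question '% agree or strongly agree' figures for two survey years.
pct_2023 = {"q1": 78, "q2": 65, "q3": 81, "q4": 70}
pct_2024 = {"q1": 74, "q2": 65, "q3": 83, "q4": 62}

increased = sum(pct_2024[q] > pct_2023[q] for q in pct_2023)
same      = sum(pct_2024[q] == pct_2023[q] for q in pct_2023)
decreased = sum(pct_2024[q] < pct_2023[q] for q in pct_2023)
print(f"increased: {increased}, same: {same}, decreased: {decreased}")
```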

    After analysis from 2019-2024, I found that questions focussed on “community” had seen negative changes (potentially due to COVID, social distancing, etc. over the past 5 years). Seeing the amount of negative change in this survey data really helped us to make informed decisions around student wellbeing and what schools were doing about mental health. This led to many conversations with ISLs and principals, and to getting the majority of our schools to take on the mental health and wellbeing programme “Mitey”. In my professional opinion, this is a great outcome for our schools as a collective, as there is continuity for students as they transition between schools and take those key understandings and that language with them.

    I understand that a survey like this can be administered differently in each class, or even each school, but I did my best to create procedures and guidelines and communicate these clearly to all schools and leaders for consistency. These findings were communicated to each school’s SLT (school data and across-school data, e.g. how their data compared with the community data), and have then been used in strategic planning for the following years and to help guide ISL decision-making.

    1. This is a fantastic and insightful reflection on using a major data source to drive strategic change across your kāhui ako. You’ve clearly demonstrated how a single data set, particularly the Wellbeing@School survey, can be used to identify significant trends and lead to a powerful, collaborative response.

  11. I have analysed the data of those students in my class who took the Level 2 e-asTTle Maths assessment in Term 2.

    Key Findings:
    Students perform slightly better on deep understanding questions compared to surface-level knowledge.
    Strengths are observed in Measurement and Shape, while Number Knowledge and Number Sense & Operations are areas for improvement.
    Demographic disparities exist, with lower average scores for students speaking ‘Other Than English’ and for Māori and ‘Other’ ethnicity students.
    A positive correlation exists between student attitude and overall performance.
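    (A minimal sketch, with invented numbers, of how an attitude-performance correlation like the one noted above could be checked; the correlation is descriptive only, and the attitude scores and scale scores below are hypothetical.)

```python
# Illustrative only: invented attitude scores and overall scale scores.
import pandas as pd

df = pd.DataFrame({
    "attitude":      [2.1, 3.4, 3.0, 4.2, 2.8, 4.5],
    "overall_score": [1420, 1510, 1480, 1580, 1450, 1600],
})

r = df["attitude"].corr(df["overall_score"])  # Pearson correlation by default
print(f"attitude vs overall score: r = {r:.2f}")
# A positive r describes an association; it does not show that attitude causes achievement.
```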

    Areas for Improvement:
    1. Strengthen Foundational Number Skills:
    Goal: Improve Number Knowledge and Number Sense & Operations.
    Strategies: Implement targeted interventions, differentiated instruction, curriculum review, diagnostic assessments, small-group instruction, interactive activities, and progress monitoring.
    2. Support ESOL Students:
    Goal: Reduce the achievement gap for ‘Other Than English’ language speakers.
    Strategies: Provide comprehensive language and academic support, including bilingual support, vocabulary development, visual aids, peer support programmes, and teacher training.
    3. Address Ethnic Disparities:
    Goal: Reduce the achievement gap for Māori and ‘Other’ ethnicity students.
    Strategies: Implement culturally responsive teaching practices, community engagement, mentorship programmes, a positive classroom environment, and continuous data disaggregation.
    4. Enhance Test-Taking Attitude:
    Goal: Increase the average ‘Attitude – General’ score.
    Strategies: Foster a positive growth mindset, reduce maths anxiety by celebrating progress, incorporate low-stakes practice, connect concepts to real-world applications, provide opportunities for student voice, and conduct parent workshops.

    1. Your findings are insightful and align with the principles of effective data analysis. You’ve gone beyond the surface-level scores to identify nuanced trends, such as the disparity between deep understanding and surface-level knowledge, and the correlation between attitude and performance.
