Module 7: Te Whakamahi Pūrongo (Data-Driven Leadership): This module focuses on using data effectively to inform decision-making, assess progress, and drive continuous improvement.
“He aha te take? He aha te pūtake?”
“What is the cause? What is the root cause?”
Module Objectives:
- Understand the importance of data-driven decision-making in education.
- Identify and collect relevant data to inform school improvement initiatives.
- Analyse and interpret data effectively to identify trends, patterns, and areas for improvement.
- Use data to inform and evaluate school programmes and initiatives.
- Communicate data effectively to stakeholders, including teachers, students, parents, and the wider community.
- Develop a data-driven improvement plan for a specific area of school focus.
Data are crucial for improving student achievement. By revealing gaps in student learning and instructional practices, they guide teachers and leaders in identifying areas for improvement and tailoring instruction to meet individual student needs.
However, data alone do not provide solutions. They serve as a valuable tool for understanding student learning and informing decision-making. Interpreting data is paramount; it involves uncovering the ‘story behind the numbers’ by identifying patterns and relationships. This process requires ongoing analysis, reflection, and the collection of further evidence to refine understanding and inform continuous improvement.
Types of Data:
- Demographic data: Information about students, staff, and the school community.
- Student achievement data: Standardised tests, classroom assessments, and student work samples.
- Perceptions data: Information gathered through surveys, questionnaires, observations, and student voice.
- School processes data: Information about programmes, classroom practices, and assessment strategies.
When gathering data, focus on relevant information that serves a specific purpose. Avoid collecting excessive data, which can be time-consuming and difficult to analyse.
While student achievement data provide valuable information about outcomes, they do not explain the underlying causes. To understand these, utilise formative assessment data, classroom observations, student voice, and other relevant sources.
Analysing Data:
Start by posing a specific question about your data, focusing on differences, gaps, or the impact of teaching practices. Look for unexpected findings and identify patterns, trends, and categories.
Avoid jumping to conclusions; explore the data deeply, considering multiple perspectives and questioning your assumptions.
Evaluate data quality using the 4 Cs: Completeness, Consistency, Comparison, and Concealed information.
Create a concise data overview and share it with colleagues to gain diverse perspectives.
Generate inferences and potential explanations, remembering that correlation does not equal causation.
Develop a range of data stories to identify areas for further investigation.
Recognise that data may not tell the whole story and that further data collection may be necessary to confirm findings.
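The concise overview suggested above can often be produced straight from an assessment or OTJ export. The following is a minimal sketch only, assuming a hypothetical CSV with student_id, year_level, and otj columns (not any particular student management system’s format), that tallies the proportion of each year level at or above expectation.

```python
# Minimal sketch: proportion of each year level at or above expectation,
# from a hypothetical CSV export with columns student_id, year_level, otj.
# The file name and column names are assumptions for illustration only.
import csv
from collections import defaultdict

AT_OR_ABOVE = {"at", "above"}

def overview(path):
    counts = defaultdict(lambda: {"total": 0, "at_or_above": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year = row["year_level"].strip()
            counts[year]["total"] += 1
            if row["otj"].strip().lower() in AT_OR_ABOVE:
                counts[year]["at_or_above"] += 1
    for year in sorted(counts):
        c = counts[year]
        pct = 100 * c["at_or_above"] / c["total"]
        print(f"Year {year}: {pct:.0f}% at or above ({c['at_or_above']}/{c['total']})")

overview("otj_export.csv")  # hypothetical file name
```

A summary like this only points to where to look; the ‘story behind the numbers’ still has to come from classroom evidence, teacher knowledge, and student voice.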
Resources:
https://research.acer.edu.au/cgi/viewcontent.cgi?article=1317&context=research_conference
Task: Developing a Data-Driven Improvement Plan
- Select a specific area of school focus (e.g., literacy, numeracy, student well-being).
- Identify relevant data sources and collect the necessary data.
- Analyse the data and identify key trends, patterns, and areas for improvement.
- Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
- Post your data-driven improvement plan on the online forum for peer feedback and discussion.
Assessment:
- Completion of all readings.
- Participation in the online forum discussion.
- Development and submission of a data-driven improvement plan.
- Demonstration of the ability to analyse and interpret data effectively.
6 Responses
I chose to look at our recently gathered Year 3 Better Start Literacy Approach (BSLA) data alongside OTJ data in Hero. Testing includes Word Reading (decoding and recognition of familiar words), Non-word Reading (phonological decoding of unfamiliar words), and Spelling (encoding and application of phoneme–grapheme mapping). We are now in our second year of implementing the BSLA in Year 3, which has required considerable teacher training and a significant shift in practice to embed a new literacy programme.
This year, our school has also extended the BSLA into Years 4 and above, with teachers currently undergoing training and upskilling. I am excited to explore how the Year 3 results can inform teaching priorities, identify students who may require additional support, and ensure there is a consistent, data-driven progression of literacy instruction across the school. By comparing BSLA outcomes with OTJs, I can see where structured assessment data aligns with teacher judgment and where gaps exist, helping us refine both our teaching practices and our moderation processes.
Across Year 3, 26 students (out of 115) are identified as ‘below’ or ‘well below’ in reading; however, only 14 students currently receive Tier 2 targeted support. Our Tier 2 programme has two teachers working with these 14 students. This data highlights a group of learners who have been recognised through OTJs as needing additional support but are not yet receiving targeted intervention. While they may not meet the current threshold for Tier 2, without careful monitoring there is a risk they could fall further behind.
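To keep monitoring this group, one low-effort option is a simple cross-check of the OTJ list against the Tier 2 roll. The sketch below is illustrative only; the file names and column names are assumptions, not the actual Hero export format.

```python
# Sketch: find students flagged 'below' or 'well below' in reading who are
# not on the Tier 2 roll. File and column names are hypothetical.
import csv

def flagged_students(path):
    """Return the IDs of students with a 'below' or 'well below' reading OTJ."""
    flagged = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["reading_otj"].strip().lower() in {"below", "well below"}:
                flagged.add(row["student_id"].strip())
    return flagged

def tier2_roll(path):
    """Return the IDs of students currently receiving Tier 2 support."""
    with open(path, newline="") as f:
        return {row["student_id"].strip() for row in csv.DictReader(f)}

below = flagged_students("year3_reading_otj.csv")  # e.g. 26 students
tier2 = tier2_roll("tier2_roll.csv")               # e.g. 14 students
to_monitor = sorted(below - tier2)                 # the remaining students to watch
print(to_monitor)
```

Re-running a check like this each term would show whether the unsupported group is shrinking as the improvement plan below takes effect.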
*Results summary*
Word Reading is a relative strength: most students can recognise familiar words.
Non-word Reading and non-word Spelling are weaker: more students struggle to decode unfamiliar words, showing reliance on sight-word recognition rather than phonics transfer. While students can read words, this skill is not transferring into accurate spelling, making spelling the lowest-performing area.
*Improvement Plan*
Targeted Support for Identified Students
Review the 12 students identified as ‘below’ or ‘well below’ who are not currently accessing Tier 2 intervention.
Prioritise these students for additional support, either by extending Tier 2 provision or providing targeted small-group/individual instruction within class. Track the progress of this group specifically to measure the impact of any new support.
Strengthening Non-Word Reading (Phonological Decoding)
Embed regular practice with non-words in guided reading and literacy rotations to normalise decoding of unfamiliar words.
Provide professional learning for teachers on strategies to support phonological generalisation. Complete regular teacher observations to check in on classroom practice and provide support.
Improving Spelling Outcomes
Implement daily, short, focused spelling instruction that links phoneme–grapheme mapping to writing tasks.
2026 goal: train in the BSLA spelling programme and embed it in our literacy practice.
Use BSLA-aligned spelling assessments to group students for targeted instruction.
Use staff meetings, PLD sessions, and peer observations to share effective practice in decoding, spelling, and assessment.
Strengthening Whānau Engagement
Review home–school partnerships, with a focus on supporting literacy learning at home.
Hold whānau hui to share progress data, explain BSLA practices, and provide practical strategies for supporting literacy.
Use communication platforms (e.g., Hero) more explicitly to share weekly sounds, spelling patterns, and key messages, ensuring families are well-informed and supported.
I chose to focus on Mathematics, looking at my class’s e-asTTle Term 2 data. This was a teacher-generated assessment, built around key concepts that had been explored since the beginning of the year.
Significant strengths were identified across number concepts, including rounding, rational numbers, and operations with fractions and mixed numbers. The majority of the class were working at grade level with whole-number and place-value concepts, word problems involving the four operations, data analysis, and solving linear, simultaneous, and quadratic equations. Areas for improvement were solving problems with exponents, the 11, 12, and 13 times tables, and working with decimals.
This year we have started using the Maths No Problem programme. Pre-testing fractions and decimals knowledge before beginning to teach these units revealed minimal to no understanding of the concepts that were going to be taught. Post-testing showed growth; however, the majority of students were still considerably below grade level. Discussions were held at grade and senior leadership level about how to address the gaps being seen across Year 7.
From this I have put the following strategies and action steps in place:
1. Recognise that 4 of our 6 feeder primary schools have also started with Maths No Problem this year. I will track our data closely for the next cycle of our strategic plan to see if the pre-test data improves.
2. Ensure high quality differentiation activities are available in each lesson to target the needs of each student.
3. Continue to implement strong assessment-for-learning (AfL) practices in every lesson: learning intentions and success criteria, with daily reflections for students on what went well and what their next steps are.
4. Use daily feedback and retrieval practice to elicit evidence from students and monitor progress.
5. Acknowledge that Maths No Problem is a tool but not the curriculum. Pivot, adapt and use this resource in conjunction with other modes/methods to ensure growth and understanding continues.
6. Appreciate and recognise that e-asTTle is one type of diagnostic tool, and also take into account daily observations, book work, PAT Maths, etc.
7. Focus on making the basics stick and closing the gaps.
Literacy Data Analysis and Improvement Plan.
I teach two groups four days per week, making them the most relevant cohorts for this task:
• Year 7–8 ESoL support
• Year 5–6 extension
Data Sources and Collection
I drew on the following standardised tools:
• PAT Reading Comprehension
• PAT Reading Vocabulary
I use these assessments because they drive both learning and parent reporting.
These are administered in Terms 1 and 3 to emphasise their formative role in shaping instruction, rather than summative reporting.
Data Analysis
In examining the Term 1 and 3 results, I focused on:
1. Scale scores, to track change over time
2. Genre-based comprehension results, to pinpoint specific strengths and weaknesses
Noted Inconsistencies
1. The extension group took a year-level assessment in February but an adaptive PAT in July, making direct score comparisons less reliable.
2. Four of the twelve highest-achieving boys showed a decrease in scale score, suggesting factors beyond raw ability (for example, test format or motivation) may be at play.
Key Trends and Areas for Improvement
• Overall growth in most students’ reading vocabulary and comprehension
• Variation in genre performance, with poetry and narrative comprehension lagging for the extension group
• Pronouns, positional verbs, and prepositions as gaps in the ESoL group’s vocabulary knowledge
• A small subgroup of high-achieving boys whose scores dipped, indicating a need for targeted support in test-taking strategies
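To keep an eye on that subgroup, a quick comparison of each student’s scale score across the two testing windows can flag any drop. This is a rough sketch with made-up placeholder scores, not actual PAT results.

```python
# Sketch: flag students whose PAT scale score dropped between Term 1 and
# Term 3. The scores below are illustrative placeholders only.
term1 = {"student_a": 68.2, "student_b": 71.5, "student_c": 64.0}
term3 = {"student_a": 66.9, "student_b": 74.1, "student_c": 63.2}

for student, t1 in sorted(term1.items()):
    t3 = term3.get(student)
    if t3 is not None and t3 < t1:
        print(f"{student}: scale score dropped {t1:.1f} -> {t3:.1f} ({t3 - t1:+.1f})")
```

Given the switch from a year-level to an adaptive test noted above, a small drop may reflect test format rather than learning, so flagged students warrant a conversation rather than an automatic conclusion.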
Data-Driven Improvement Plan
Goals
1. For the extension group, increase average comprehension scale scores by Term 4, with a focus on poetry and narrative texts.
2. For the ESoL group, boost vocabulary mastery of pronouns, positional verbs, and prepositions by 25% as measured in a Term 4 reassessment.
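For the second goal, the 25% target can be tracked with a simple before/after comparison. The sketch below assumes a hypothetical teacher-made vocabulary check (items correct out of 20); the numbers are placeholders, not student data.

```python
# Sketch: check progress towards a "+25% vocabulary mastery" goal by
# comparing a baseline with the Term 4 reassessment. Placeholder scores only.
baseline = {"student_a": 8, "student_b": 11, "student_c": 6}
term4 = {"student_a": 12, "student_b": 13, "student_c": 9}

for student, before in sorted(baseline.items()):
    after = term4[student]
    change = 100 * (after - before) / before
    status = "target met" if change >= 25 else "not yet"
    print(f"{student}: {before} -> {after} ({change:+.0f}%, {status})")
```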
Strategies and Action Steps
Extension Group: Genre-Focused Instruction
• Target poetry and narrative units during the second half of Term 3
• Design lessons around:
o Identifying evidence in different text types
o Determining the “best answer” from multiple choices
o Recognising and applying synonyms to deepen comprehension
• Implement small-group workshops twice weekly, using past PAT items for hands-on practice
• Use peer-review of practice answers to cultivate metacognitive awareness
ESoL Group: Vocabulary and Sentence Structure
• Administer a weekly cloze activity that recycles target pronouns, positional verbs, and prepositions
• Provide cut-up sentence exercises (both compound and complex) to reinforce word order, noun indicators, and prepositional usage
• Employ context circles:
1. Select a key vocabulary word
2. Brainstorm related words and scenarios
3. Build and share sentences orally before transferring to written form
By aligning assessment formats, drilling into genre specifics, and embedding test-taking strategies into daily instruction, this plan utilises data insights to create focused and actionable teaching steps.
The area I have selected is Maths, as I am Head of Maths. I am using our informed decisions and comparing them to our OTJs from mid-year last year.
I have found out through our informed decisions from mid-year that:
Year 1: 61% at or above
Year 2: 52% at or above
Year 3: 45% at or above
Year 4: 55% at or above
Year 5: 40% at or above
Year 6: 50% at or above
Extra information to help analyse:
We are currently using Maths No Problem in our Year 2-6 classes and Numicon in our Year 0/1 classes. In Year 0/1 we are using the Jo Knox and Marie Hurst Snapshots as well as gathering information through Numicon. In the Year 2-6 classrooms, we are mainly using the Maths No Problem chapter reviews.
Findings:
This data shows that our Year 0/1s are achieving the highest and that teaching the foundations through Numicon is working. They are also using the ‘Snapshot’ assessment, which gives a quick picture of what they are able to do separate from the programme and is directly linked to the New Zealand Curriculum. Our target group was the Year 3 cohort, as they were our least successful group in maths. But in analysing this data, the Year 5s are actually lower, so I need to look into this and look at individual kids to see why so many are working towards the standard. We do know the curriculum signposts have changed and that children are expected to achieve harder maths, so I know this will have an impact. As a school, we have done a lot of PD on teaching our maths programmes in an effective way and having children join in and work through the ‘I do, we do, you do’ strategy.
A wondering that I have, and am currently working on, is our assessment. I am trialling the Phase 2 Snapshot to see how my children achieve on it, as it directly links to the curriculum and isn’t as literacy-heavy as Maths No Problem. I am also hoping that the Snapshot will identify their successes and gaps, so we can work on bringing back our visible learning concepts and help the children become more assessment-capable, which will help them work on their gaps.
Action plan:
Continue to finish the Snapshot in my class and analyse the data against my current informed decisions. I am also asking a Year 3 teacher to do the same. From here, we can see the areas the children are struggling in across the school, build teacher knowledge to improve this, and get children and parents involved and working on their next steps together. We have just had a PM on basic facts, so I need to check that this is happening, as I am wondering if this is holding the children back.
Over the last 5 years, our school has completed the NZCER Wellbeing@School student survey, which provides data on how students feel (positively or negatively) towards themselves, their peers, their teachers, their school, and their community. I have had the privilege of taking part in this survey, organising it for our entire kāhui ako, and analysing my own class data, my whole-school data, and my kāhui ako data. So I am answering this question from an Across School Leader perspective, based on the data that has driven our strategic planning as a collective over the past 2 years.
Key Findings from 2023-2024:
– One of the main aspects (out of the 5) that showed the most significant change between 2023 and 2024 was “Pro-social Student Culture and Strategies”. The proportion of students agreeing or strongly agreeing with every question in this area decreased by between 2% and 8%, with more students disagreeing or strongly disagreeing with each question in 2024 than in 2023.
– Overall, out of the 63 questions asked, 4 questions showed a positive change, 4 remained the same, and 55 showed a negative change from 2023 to 2024.
After analysing the 2019-2024 results, I found that questions focused on “community” had seen negative changes (potentially due to COVID-19, social distancing, etc. over the past 5 years). Seeing the amount of negative change in this survey data really helped us make informed decisions around student wellbeing and what schools were doing about mental health. This led to many conversations with ISLs and principals, and to the majority of our schools taking on the mental health and wellbeing programme “Mitey”. In my professional opinion, this is a great outcome for our schools as a collective, as it provides continuity as students transition between schools and take those key understandings and that language with them.
I understand that a survey like this can be administered differently within each class, or even each school, but I did my best to create procedures and guidelines and communicate these clearly to all schools and leaders for consistency. These findings were communicated to each school’s SLT (school data and across-school data, and how their data compared with the community data), which has since been used in strategic planning for following years and to help guide ISL decision-making.
I have analysed the data of those students in my class who took the Level 2 e-asTTle Maths assessment in Term 2.
Key Findings:
Students perform slightly better on deep-understanding questions than on surface-level knowledge questions.
Strengths are observed in Measurement and Shape, while Number Knowledge and Number Sense & Operations are areas for improvement.
Demographic disparities exist, with lower average scores for students in the ‘Other Than English’ language group and for Māori and ‘Other’ ethnicity students.
A positive correlation exists between student attitude and overall performance.
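That relationship can be checked directly from the export. The sketch below computes a Pearson correlation using statistics.correlation from Python’s standard library (3.10+); the paired values are placeholders, not my class’s actual scores.

```python
# Sketch: Pearson correlation between 'Attitude - General' scores and overall
# e-asTTle scale scores. Values are illustrative placeholders only.
from statistics import correlation  # requires Python 3.10+

attitude = [3.2, 2.8, 4.1, 3.9, 2.5, 4.4, 3.0]
overall = [1420, 1390, 1510, 1480, 1360, 1530, 1400]

r = correlation(attitude, overall)  # Pearson's r by default
print(f"Pearson r = {r:.2f}")
```

As the module notes, correlation is not causation: a positive r is a prompt to investigate whether confidence drives performance, performance drives confidence, or both.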
Areas for Improvement:
1. Strengthen Foundational Number Skills:
Goal: Improve Number Knowledge and Number Sense & Operations.
Strategies: Implement targeted interventions, differentiated instruction, curriculum review, diagnostic assessments, small group instruction, interactive activities, and progress monitoring.
2. Support ESOL Students:
Goal: Reduce the achievement gap for ‘Other Than English’ language speakers.
Strategies: Provide comprehensive language and academic support, including bilingual support, vocabulary development, visual aids, peer support programmes, and teacher training.
3. Address Ethnic Disparities:
Goal: Reduce the achievement gap for Māori and ‘Other’ ethnicity students.
Strategies: Implement culturally responsive teaching practices, community engagement, and mentorship programmes; foster a positive classroom environment; and continue disaggregating data.
4. Enhance Test-Taking Attitude:
Goal: Increase the average ‘Attitude – General’ score.
Strategies: Foster a positive growth mindset, reduce math anxiety by celebrating progress, incorporate low-stakes practice, connect concepts to real-world applications, provide opportunities for student voice, and conduct parent workshops.