Module 7: Te Whakamahi Pūrongo (Data-Driven Leadership): This module focuses on using data effectively to inform decision-making, assess progress, and drive continuous improvement.

“He aha te take? He aha te pūtake?”

“What is the cause? What is the root cause?”

Module Objectives:

  • Understand the importance of data-driven decision-making in education.
  • Identify and collect relevant data to inform school improvement initiatives.
  • Analyse and interpret data effectively to identify trends, patterns, and areas for improvement.
  • Use data to inform and evaluate school programmes and initiatives.
  • Communicate data effectively to stakeholders, including teachers, students, parents, and the wider community.
  • Develop a data-driven improvement plan for a specific area of school focus.

Data are crucial for improving student achievement. By revealing gaps in student learning and instructional practices, they guide teachers and leaders in identifying areas for improvement and tailoring instruction to meet individual student needs.

However, data alone do not provide solutions. They serve as a valuable tool for understanding student learning and informing decision-making. Interpreting data is paramount; it involves uncovering the ‘story behind the numbers’ by identifying patterns and relationships. This process requires ongoing analysis, reflection, and the collection of further evidence to refine understanding and inform continuous improvement.

Types of Data:

  • Demographic data: Information about students, staff, and the school community.
  • Student achievement data: Standardised tests, classroom assessments, and student work samples.
  • Perceptions data: Information gathered through surveys, questionnaires, observations, and student voice.
  • School processes data: Information about programmes, classroom practices, and assessment strategies.

When gathering data, focus on relevant information that serves a specific purpose. Avoid collecting excessive data, which can be time-consuming and difficult to analyse.

While student achievement data provide valuable information about outcomes, they don’t explain the underlying causes. To understand these, utilise formative assessment data, classroom observations, student voice, and other relevant sources.

Analysing Data:

Start by posing a specific question about your data, focusing on differences, gaps, or the impact of teaching practices. Look for unexpected findings and identify patterns, trends, and categories.

Avoid jumping to conclusions; explore the data deeply, considering multiple perspectives and questioning your assumptions.

Evaluate data quality using the 4 Cs: Completeness, Consistency, Comparison, and Concealed information.

Create a concise data overview and share it with colleagues to gain diverse perspectives.

Generate inferences and potential explanations, remembering that correlation does not equal causation.

Develop a range of data stories to identify areas for further investigation.

Recognise that data may not tell the whole story and that further data collection may be necessary to confirm findings.
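To make the analysis step concrete, the kind of grade tallying described above can be sketched in a few lines of Python. This is an illustrative sketch only: the group labels, grade codes (NA/A/M/E), and sample rows below are invented for demonstration, not drawn from any school’s data or system.

```python
from collections import Counter, defaultdict

def grade_distribution(rows):
    """Tally grades (NA/A/M/E) within each student group."""
    dist = defaultdict(Counter)
    for row in rows:
        dist[row["group"]][row["grade"]] += 1
    return dist

def summarise(dist):
    """Print each group's grade shares as percentages.

    The output is a prompt for further questions, not a conclusion.
    """
    for group, counts in sorted(dist.items()):
        total = sum(counts.values())
        shares = {g: round(100 * counts[g] / total) for g in ("NA", "A", "M", "E")}
        print(group, shares)

# Invented sample rows for demonstration only.
rows = [
    {"group": "boys", "grade": "NA"},
    {"group": "boys", "grade": "A"},
    {"group": "girls", "grade": "M"},
    {"group": "girls", "grade": "E"},
]
summarise(grade_distribution(rows))
```

Seeing distributions side by side makes gaps easier to spot, but the caution above still applies: a difference in shares is a starting point for inference, not evidence of causation.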

Resources:

https://research.acer.edu.au/cgi/viewcontent.cgi?article=1317&context=research_conference

https://education.nsw.gov.au/about-us/education-data-and-research/cese/publications/research-reports/5-essentials-for-effective-evaluation 

http://www.edu.gov.on.ca/eng/policyfunding/leadership/IdeasIntoActionBulletin5.pdf

https://cdn.auckland.ac.nz/assets/education/about/schools/tchldv/docs/Using%20Evidence%20in%20the%20Classroom%20for%20Professional%20Learning.pdf 

Task: Developing a Data-Driven Improvement Plan

  • Select a specific area of school focus (e.g., literacy, numeracy, student well-being).
  • Identify relevant data sources and collect the necessary data.
  • Analyse the data and identify key trends, patterns, and areas for improvement.
  • Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
  • Post your data-driven improvement plan on the online forum for peer feedback and discussion.

Assessment:

  • Completion of all readings.
  • Participation in the online forum discussion.
  • Development and submission of a data-driven improvement plan.
  • Demonstration of the ability to analyse and interpret data effectively.

27 Responses

  1. Select a specific area of school focus (e.g., literacy, numeracy, student well-being):
    We have a particular focus on boys’ achievement in junior English.

    Identify relevant data sources and collect the necessary data:
    I selected achievement data for Year 7 boys in the first two assessments of this year (their first two at high school). I also gathered their initial AsTTle information. I gathered Social Studies achievement data for comparison.

    Analyse the data and identify key trends, patterns, and areas for improvement.
    In English, boys have failed at a rate of approximately 48% in each assessment, with Excellence grades at 8%. This is not in line with their AsTTle data, which has them at a higher level (collectively). In comparison, achievement in Social Studies for boys was 10% Not Achieved and 14% Excellence.

    Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    The goal is to close the gap between girls’ and boys’ achievement in English by raising the achievement rate for boys from 45% to 70% by the end of Term 4.
    To achieve this, the leader of Y7 and 8 English will attend a Social Studies dept meeting and have a one-on-one discussion with the head of junior Social Studies about how they implement assessment in their subject. This information will be brought back to junior English, giving us a chance to review our assessments and see if we can make them more interesting, accessible and engaging. Given the glaring difference between two ‘writerly’ subjects, there is likely room for improvement. The results of this will be fed back to our HoL for inclusion in the annual plan for next year.

    1. Kia ora Jess,
      You’ve used the data really effectively to highlight a clear gap in boys’ English achievement, and I like how you compared it with Social Studies to give context. Your improvement plan is practical and collaborative, especially drawing on another department’s strengths to review assessment design. It’s a great way to turn data into meaningful action. Ngā mihi!

  2. 1- Specific area of school focus

    I selected overall academic achievement in relation to attendance for Level 2 Outdoor Education over the last four years. I also compared achievement in the subject between ethnicities to gain insight into how our Māori students in particular are doing.

    2- Identify relevant data sources and collect the necessary data

    I explored achievement data from 2022, 2023, 2024 and this year so far and compared achievement and attendance specific to Level 2 Outdoor Education.

    3- Analyse the data and identify key trends, patterns, and areas for improvement

    I compared the achievement data from 2022, 2023, 2024, and 2025, alongside attendance for Level 2 Outdoor Education students. A couple of interesting trends emerged, along with evidence of some pleasing positive changes that have resulted from targeted action.
    In 2022, achievement in the AS we deliver in Term 3 was significantly lower than in Terms 1 and 2. It was also lower than Term 3 in 2023, 2024, and 2025. This increase in achievement (fewer NAs and more A/M/E grades) can be attributed to our identifying at the end of 2022 that we had scheduled two achievement standards with heavy workloads in Term 3. This term is often a challenging time of year, as some students already have their 80 credits, while others become extremely busy and struggle to manage their time with Winter Tournament Week, AIMS Games refereeing, and mock exams all taking place. These events negatively affected both attendance and achievement, demonstrating a clear relationship between the two.

    As subject teachers, we responded by restructuring the assessment calendar for our subject. This has resulted in improved achievement across all assessments throughout the year, which is very pleasing to reflect on.

    The data also shows that Māori achievement has improved, particularly over the last three years. I believe this can be credited, in part, to the mahi our kura has undertaken to improve staff knowledge and capability in culturally responsive and relational pedagogy. However, Māori achievement at Merit and Excellence levels remains slightly lower than that of NZ European students (Māori: 8% NA, 61% A, 21% M, 10% E vs NZ European: 6% NA, 38% A, 34% M, 22% E).

    4- Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps

    As outlined above, we have already achieved a significant improvement in overall achievement simply by restructuring our assessment calendar to reduce pressure on ākonga. However, the data shows that Māori achievement is still slightly lower than that of their peers.
    Our plan moving forward is to:
    – Continue to strengthen the culturally responsive and relational pedagogy of our subject teachers.
    – Place greater emphasis on a place-responsive approach.
    – Gather student and whānau voice to gain further insight into what else we can do to raise Māori achievement.

    As always, our department strives for all students to experience success at kura, whatever that looks like for them. We will continue to place relationship-building and understanding our ākonga at the forefront of our practice.

    1. Kia ora Jarrod,
      I really like how you’ve used multi-year data to show the clear link between attendance, timing, and achievement. Restructuring the assessment calendar was a smart move and it’s great to see the positive impact already. Your focus on continuing to strengthen culturally responsive practice and gathering whānau voice shows real commitment to equity and supporting all ākonga to succeed. Ngā mihi!

  3. Kia ora koutou,
    I’ve been reviewing our Technology Achievement data and noticed some clear patterns we need to act on. Our department currently sits at NA 18.5% | A 53.3% | M 17.8% | E 10.4% (M+E = 28.2%), which is noticeably below the whole-school M+E average of 38.5%. We also have a higher rate of Not Achieved (18.5% vs school 14.8%). Māori students make up 34.5% of our roll (272 students), but only around 5.5% reach Excellence. This shows that while many students achieve at “Achieved” level, fewer are being stretched into Merit and Excellence, and equity gaps remain for Māori learners.

    SMART Goals (12 months)
    Reduce Not Achieved from 18.5% → ≤10%.
    Lift department Merit + Excellence from 28.2% → 35%.
    Raise Māori Excellence from 5.5% → 12%.
    Ensure all at-risk students are identified and supported within 2 weeks.

    Key Strategies
    Assessment design: Redesign 2–3 key assessments to clearly target M/E, with exemplars and modelling.
    Formative assessment cycles: Regular checks to catch gaps early, paired with a simple early-warning tracker.
    Professional development: Department PD on designing and moderating for M/E.
    Culturally responsive practice: Whānau engagement, Māori contexts in tasks, and co-design of projects.
    Targeted support: Small-group tutoring for at-risk students, extension workshops for those aiming for M/E.

    Action Steps
    Months 1–2: Run PD, redesign assessments, share M/E exemplars.
    Months 3–4: Launch formative cycles, begin weekly student tracking, identify at-risk students.
    Months 5–8: Provide small-group extension/tutoring, run student goal-setting sessions, whānau emails.
    Months 9–12: Department-wide moderation, evaluate progress against goals, scale up effective practices.

    I’d love some feedback on these areas:
    What PD formats have worked best in helping teachers design tasks that draw out Merit and Excellence?
    Which culturally responsive strategies have you found most effective for lifting Māori achievement in practical/technology subjects?
    What are your go-to low-load formative checks that still provide strong insight into student progress?

    1. Kia ora Sean,
      Love the clarity in your plan – you’ve nailed down some really practical goals that feel achievable. I think the balance you’ve struck between assessment tweaks, formative checks, and culturally responsive practice is spot on. I’ve found brief learning checks and short chats with students provide valuable information efficiently, and co-constructing assessments in moderation sessions has really helped push more kids into M/E. Your focus on whānau and real contexts will make a big difference, especially for Māori ākonga. Kia Kaha!

  4. Data-Driven Improvement Plan: Whānau Engagement

    Area of Focus: Gaining engagement with whānau (responding to reports and coming in for a whānau night). Need to create a “Why” for our parents. Why might they engage and reply to our teachers after reading their reports? Why is this important? What benefits/opportunities might this open for our tamariki?

    Context: The goal of this plan is to improve communication and partnership with all whānau following the distribution of student reports, with a particular focus on increasing the number of meaningful conversations and follow-up meetings.

    Step 1: Data Sources and Collection
    The following data sources have been identified as relevant to whānau engagement post-reporting:
    – Parent-Teacher Conference (Whānau night) Attendance: Records of which whānau attended the night.
    – Communication Logs: Teacher and administration records of phone calls, emails, and messages sent to and received from whānau regarding the reports.
    – Digital Report Viewership: Data from the student management system (EDGE) showing which whānau have accessed or viewed the digital reports. We hope whānau will also engage with this, even with just a thumbs up.
    – Whānau Survey Responses: Feedback collected from post-report surveys about preferred communication methods and meeting times.

    Data Collection Summary:
    Data from the last two reporting cycles was compiled and analysed. This involved cross-referencing attendance lists, communication logs, and survey responses to identify patterns in engagement. We had a 10% engagement rate on the last round of reports.

    Step 2: Data Analysis and Key Findings
    An analysis of the collected data revealed several key trends and patterns:
    – Low Attendance at Scheduled Meetings: A significant portion of whānau (approximately 20%) did not attend the scheduled parent-teacher conferences, despite multiple reminders.
    – Preference for Flexible Communication: The whānau survey results indicated a strong preference for more flexible communication options, such as phone calls or video conferences, and for a less “intense” encounter than a fixed, in-person meeting time (a casual walk-in, a look around, and then a read over the report).
    – Positive Impact of Proactive Contact: A small-scale trial in one classroom, where teachers made proactive follow-up phone calls, emails, and notices on the SMS to whānau of students who did not attend conferences, resulted in an 80% success rate in establishing contact and having a conversation about the report.

    Step 3: Data-Driven Improvement Plan
    Based on the analysis, the following improvement plan has been developed:

    Overall Goal: To increase the percentage of whānau who engage in a conversation with the school about their child’s report from 55% to 80% by the end of the next academic year.

    Strategy 1: Offer Flexible and Accessible Engagement Options
    Action Step 1.1: Develop and implement a system for whānau to book phone or video conference meetings as an alternative to in-person meetings.
    Responsibility: Teacher, Office Administrator
    Timeline: Before the next reporting period.

    Action Step 1.2: Distribute a new whānau feedback form that specifically asks for their preferred meeting times and communication channels (e.g., text message, email, phone call).
    Responsibility: Teachers
    Timeline: Immediately after reports are sent out.

    Strategy 2: Proactive Steps

    Action Step 2.1: Set a school-wide expectation for teachers to make one-on-one contact with all whānau who do not attend a scheduled conference within two weeks of the meeting date.
    Responsibility: All Staff
    Timeline: Ongoing throughout the school year.

    Action Step 2.2: Host an informal whānau night with kai to promote attendance and engagement in a fun, whānau-friendly atmosphere.
    Responsibility: All Staff
    Timeline: Ongoing throughout the school year each term after reports.

    Step 4: Success Metrics and Monitoring
    We will monitor our progress using the following metrics:

    Bi-annually: Track whānau attendance at meetings and follow-up engagement rates using school data logs.
    Annually: Analyse whānau survey results to gauge satisfaction and identify areas for improvement.
    Continuous: Review progress in staff meetings to share successful strategies and address any challenges that arise.

    Step 5: Post on the Online Forum
    This document will be shared on our online forum for peer feedback and discussion. We welcome comments and suggestions on the following points:
    Are the proposed strategies realistic and achievable within our school’s resources?
    How can we effectively measure the “quality” of engagement, not just the quantity?
    Do you have any examples of successful “Whānau Connect” events you’ve seen in other schools?

    1. Kia ora Mark,
      I really like how you’ve laid out your plan to boost whānau engagement – it’s clear, practical, and grounded in real data. I think offering flexible meeting options and following up proactively shows a genuine commitment to building strong, positive relationships with whānau. I also love the idea of informal whānau nights with kai – it feels welcoming and fun, and will make engagement more natural. Your approach balances both connection and clear follow-through, and I can see it making a real difference for ākonga, kaiāko, and whānau. Ka rawe!

  5. Area of School Focus

    Numeracy results in Year 7 and Year 8
    The goal is to improve outcomes for students in the lower ability sets so that their results reflect the effort they put in. Our school follows the Cambridge curriculum, which sets a high baseline for mathematical knowledge. This creates a challenge when students join us from other schools where the curriculum differs, often leaving them with gaps in learning. Bridging these gaps is essential to help these students build confidence in the classroom.

    Identify Relevant Data Sources and Collect Data:
    We can use the following data sources to inform our approach:
    Previous school records (where available) for background on student performance.
    Teacher feedback on classwork, homework, and engagement.
    Internal assessments, while being mindful that test results may reflect gaps rather than actual ability.
    Progressive Achievement Tests (PATs), particularly the Number Knowledge component, which helps identify a student’s foundational numeracy skills and areas for development.

    Analyse Data and Identify Key Trends

    Analysis of current data shows:
    Students who transition from different schools often lag behind their peers and need one to two terms to adjust to the Cambridge approach.
    The Cambridge numeracy component is demanding and requires strong foundational knowledge and number sense. Gaps are largely due to differences in curriculum coverage and teaching approaches at previous schools. These gaps do not necessarily reflect a student’s potential but rather their exposure to content and numeracy language.

    Areas for improvement include:

    – Providing structured scaffolding for students new to the Cambridge system.
    – Identifying whether challenges are due to conceptual understanding or gaps in content knowledge.
    – Offering targeted interventions such as lunchtime or after-school tutoring.
    – Engaging parents early to explain the nature of the Cambridge curriculum and manage expectations about progress.

    Develop a Data-Driven Improvement Plan

    Key steps in the plan:
    Early Identification: Use PAT data, teacher observations, and class performance to identify students who require support.
    Parent Communication: Inform parents about the gaps and explain the strategies being implemented to support their child. Emphasise that progress may take time.

    Targeted Support:

    Provide additional sessions (lunchtime, morning tea, or after school).
    Create tailored learning plans to address specific weaknesses.

    Action Planning:

    For qualitative issues (e.g., work ethic, engagement), involve the Dean, parents, and students in setting goals and strategies.
    For quantitative issues (e.g., assessment results), create measurable goals and track progress over time.

    Ongoing Monitoring and Review:

    Hold follow-up meetings with parents after key assessments to evaluate progress.
    Adjust the support plan as needed to ensure improvement.

    Core Principle: Maintain consistent communication with parents so they understand the goals, the support being offered, and the student’s progress toward achieving those goals.

    1. Kia ora Matthew,
      I really like how you’ve unpacked the challenges for Year 7 and 8 students in numeracy and set out a clear, practical plan to support them. Your focus on early identification, tailored scaffolding, and targeted interventions shows a real commitment to helping students build confidence and succeed. I also like the emphasis on keeping parents informed—it makes the whole process transparent and collaborative. Your approach balances academic support with relationship-building, and I can see it making a real difference for both students and their families. Kia Kaha!

  6. Select a specific area of school focus
    Improving academic achievement for students at risk of not achieving in history. A significant focus has been on improving academic performance for all students, but in particular identified learner groups such as ESOL, Māori and Pasifika students throughout the school and those at risk of not achieving.

    Identify relevant data sources and collect the necessary data.
    – Achievement data from internal assessments throughout the year (data tracking of assessments). Early identification of issues on a shared spreadsheet, highlighting students at risk of not achieving and any interventions or alternative pathway plans (e.g. extra internal or external assessment opportunities).
    – Tracking attendance, ethnicity, SAC, learning needs and other data along with achievement data to see if there is a correlation between student achievement and these other factors.
    – Teacher feedback and dean feedback to identify other contributing factors to student achievement, e.g. health concerns, learning needs, etc.

    Analyse the data and identify key trends, patterns, and areas for improvement.
    – Data for the cohort has identified those students at risk of not achieving. There is also a direct correlation between attendance and achievement (in general). There is some evidence, when comparing years, that results for certain assessments improve when they are run later in the year. This may be due to prior learning or less impact from other assessment demands. More data analysis is required to identify direct causation.

    Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    – Teachers offer extra internals (to all) but particularly encourage those students in need to take them up (only an option in some subjects). The timeline is set for students to enrol in extra internals. Only some students will take this opportunity, but it can mean the difference between passing a subject and not achieving at all.
    – Communication with parents either via the interview process and/or email/call home to signal student is at risk of not achieving.
    – Keep transparent data records for all involved in the teaching team, so all staff can access them at all times (e.g. a shared Google Sheet). This way, staff can add data as it becomes available and ensure early interventions. They can track their own students against the rest of the cohort and be more accountable for their students’ achievements.
    – This system has been trialled and refined over a couple of years, with regular attendance data being added each term and colour-coded to identify issues early. As a result we have significant data showing the correlation between attendance and achievement. Early interventions with students to discuss their attendance and potential impacts on their learning have resulted in generally improved attendance records for most.

    1. Kia ora Melanie,
      I really like the way you’ve set this up so systematically – pulling together achievement, attendance, and teacher/dean insights gives such a clear picture of what’s going on for learners. The colour-coded shared spreadsheet and early interventions sound really effective, especially with the strong correlation you’ve identified between attendance and achievement.
      It’s also great how you’ve looked at assessment timing and adjusted where it supports better outcomes. I think the transparency across the teaching team is a real strength too – everyone sharing responsibility and being able to act early makes a big difference. Ngā mihi.

  7. 1. Select a specific area of school focus (e.g., literacy, numeracy, student well-being).

    Perception data. The focus area is student device management (Chromebooks, laptops, etc.). Since becoming a BYOD school, staff have noticed or felt a significant drop in learning engagement due to online distractions. Distraction in this context includes any material that is contrary to the intention of learning (social media, gaming, messaging apps, etc.). With the added concern about the unethical use of LLMs in student learning and assessment, student device management software becomes an area that schools and industry will need to explore urgently.

    2. Identify relevant data sources and collect the necessary data.

    I surveyed the school staff to determine their concerns with students’ mismanagement of their devices. Just under 50% (n=43) of staff filled in the survey. Ultimately, the data I am after is whether staff feel a student device management system is necessary. There are 3 local BOP secondary schools that have already deployed the software (Classwize).

    I have also been in close communication with our IT team, looking for alternative software and identifying the holes in student device management software.

    3. Analyse the data and identify key trends, patterns, and areas for improvement.

    The data strongly suggests that the staff need a software device management system such as Classwize. 42% of teachers surveyed said they were “very likely” to use the software, and 42% of teachers surveyed said they were “likely” to use the software.

    42% of teachers surveyed felt their junior students were “Always” off task, and 26% felt that their students were “Almost Always” off task.

    Similar results were found with senior classes, with 47% of teachers surveyed feeling their senior students were “Always” off task, and 12% feeling that their students were “Almost Always” off task.

    After watching a brief video about the function and usability of the Classwize software, 84% of the teachers surveyed said they were “likely” and “very likely” to use the classroom tile monitoring system that Classwize provides.

    70% of teachers surveyed said they were “likely” and “very likely” to use the focus feature (designated and controlled tabs) that Classwize provides. 21% of teachers surveyed were unsure if they would use this feature.

    4. Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.

    Based on this data, I will present to the senior leadership team the clear need for a software device management system, such as Classwize. Implementing a student device management system should aid student-teacher relationships, school culture, teacher autonomy, student learning, and student achievement, among many other benefits.

  8. Kia ora Damon,
    This is a really timely focus. Going BYOD has so many benefits, but I can see how the distractions you’ve described quickly become a big challenge for both teachers and learners. I like how you’ve gathered staff perceptions and then connected that with what’s happening in other local schools – that gives your case real weight.
    The data you’ve shared is really clear, and it’s great that you’ve thought about how device management software isn’t just about control, but also about supporting engagement, relationships, and culture. Framing it that way makes the purpose feel much more about learning than about monitoring. Kia Kaha!

  9. Select a specific area of school focus
    Improving academic achievement for students at risk of not achieving in the commerce learning area. A significant focus has been on improving academic performance for all students, but in particular identified learner groups such as ESOL, Māori and Pasifika students throughout the school and those at risk of not achieving.

    Identify relevant data sources and collect the necessary data.
    – Achievement data from internal assessments throughout the year (data tracking of assessments). Early identification of issues on a shared spreadsheet, highlighting students at risk of not achieving and any interventions or alternative pathway plans (e.g. extra internal or external assessment opportunities).
    – Tracking attendance, ethnicity, SAC, learning needs and other data along with achievement data to see if there is a correlation between student achievement and these other factors.
    – Teacher feedback and dean feedback to identify other contributing factors to student achievement, e.g. health concerns, learning needs, etc.

    Analyse the data and identify key trends, patterns, and areas for improvement.
    – Data for the cohort has identified those students at risk of not achieving. There is also a direct correlation between attendance and achievement (in general). There is some evidence, when comparing years, that results for certain assessments improve when they are run later in the year. This may be due to prior learning or less impact from other assessment demands. More data analysis is required to identify direct causation.

    Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    – Teachers offer extra internals (to all) but particularly encourage those students in need to take them up (only an option in some subjects). The timeline is set for students to enrol in extra internals. Only some students will take this opportunity, but it can mean the difference between passing a subject and not achieving at all.
    – Communication with parents either via the interview process and/or email/call home to signal student is at risk of not achieving.
    – Keep transparent data records for all involved in the teaching team, so all staff can access them at all times (e.g. a shared Google Sheet). This way, staff can add data as it becomes available and ensure early interventions. They can track their own students against the rest of the cohort and be more accountable for their students’ achievements.
    – This system has been trialled and refined over a couple of years, with regular attendance data being added each term and colour-coded to identify issues early. As a result we have significant data showing the correlation between attendance and achievement. Early interventions with students to discuss their attendance and potential impacts on their learning have resulted in generally improved attendance records for most.

    1. Kia ora Leanne,

      I really enjoyed reading your reflection. You’ve shown a clear focus on supporting priority learners and using data in a practical way to guide timely interventions. I like how your system makes student progress visible for the whole team and how you’ve linked attendance closely with achievement. It’s great to see that these strategies are already making a difference. Kia Kaha!

  10. 1. Select a specific area of school focus
    Within the science department we have had a focus on students continuing to learn at home. We have therefore invested in a programme called Carousel, a retrieval-based system in which we have made questions for students to revise throughout the week using flashcards. They then take a test on those flashcards at the end of the week, which they self-mark.

    2. Identify relevant data sources and collect the necessary data
    With this system you can track whether a student has completed the end-of-week test, the accuracy of students’ marking, and the accuracy of students’ answers.
    To see whether this programme is making a difference to students’ learning, their overall results have also been collected.
    I also thought that attendance in class should be tracked, to see whether this was the greater determinant of success within class.

    3. Analyse the data and identify key trends, patterns, and areas for improvement
    The main focus of our department is to get students doing homework. My data shows that there is a dip in homework completion mid-term. This is normally when we change topics in our scheme, and could be a reason why students struggle with the first topic in their end-of-term exams.

    From the first two terms of data I have found that students who completed the homework over 75% of the time have not received a grade lower than Achieved (with the exception of ESOL students).
    Students with the greatest completion and accuracy rates also seem to get the best results, with a higher percentage receiving Merit and Excellence grades.

    I have been in a fortunate situation where I recently took over a colleague’s L1 class. With my own class I have been holding lunchtime detentions for those who have not completed their homework, using the time to go over effective uses of the system and how I want them to be using it. I have also been sending emails home to parents reinforcing the importance of their child doing the homework, and praising those who have completed the homework at a high level and those who have shown the biggest increase in their completion and accuracy. In the first week of data comparing these two classes (Week 4 of Term 3), the class I took over had a 38% completion rate whereas my class had 53%. My class had an accuracy rate of 85% compared to 75% for my new class.
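    The class comparison above boils down to two simple rates. A minimal sketch, with field names and sample figures that are assumptions for illustration only:

    ```python
    # Illustrative sketch of comparing two classes on weekly homework:
    # completion rate (share of students who took the end-of-week test)
    # and accuracy rate (correct answers among those who completed).
    # Record structure and sample numbers are invented for this example.

    def class_rates(students):
        """students: list of dicts with 'completed' (bool) and, for those
        who completed, 'correct' and 'total' question counts.
        Returns (completion %, accuracy %) rounded to whole percents."""
        completed = [s for s in students if s["completed"]]
        completion_rate = 100 * len(completed) / len(students)
        correct = sum(s["correct"] for s in completed)
        total = sum(s["total"] for s in completed)
        accuracy_rate = 100 * correct / total if total else 0.0
        return round(completion_rate), round(accuracy_rate)

    sample_class = [
        {"completed": True, "correct": 17, "total": 20},
        {"completed": True, "correct": 17, "total": 20},
        {"completed": False},
        {"completed": False},
    ]
    rates = class_rates(sample_class)  # (completion %, accuracy %)
    ```

    Running the same function over both classes each week gives a like-for-like comparison, which is what makes the 38% vs 53% figures meaningful.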

    4. Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    From this data I can see that completion of homework has a fairly significant effect on student achievement in science. However, there is a big mid-term dip in homework completion.
    Goal
    Improve completion of weekly homework to 80%.
    Strategy
    Continue lunchtime detentions for those who are not completing their homework. Use this time to reinforce the benefits of doing the homework by showing them the data, as well as giving them time to revise their key terms.

    1. Kia ora Heath,
      I really like how you’ve used the data to show the clear link between homework completion, accuracy, and achievement. The strategies you’ve put in place — detentions, parent communication, and student praise — are already lifting results, which is great to see. The mid-term dip you’ve noticed is really interesting, and I like how you’re thinking about ways to address it. You’re building a strong, practical system that’s making a real difference for your students. Ngā mihi.

  11. Kia ora,

    I have selected overall academic achievement and its relationship with attendance for the house I am dean of. I also identified Māori/Pasifika students within the data set as a priority group within our school. Thankfully I shared this with one of the maths teachers in our house, who added further value to the analysis.
    Some summaries we found last term were:
    – We had 74% of students attending school at or over 90% of the time. That is 136 students out of 182 meeting the Ministry expectation; however, overall we are below what the Ministry is aiming for:
    “80% of students attending more than 90% of the time by 2030”
    – Alongside this, the MOE has set a target of >94% for daily attendance. As at Week 6 we were at 92.8%, which we would ideally like to improve. Illness, unapproved absences and holidays are the biggest contributors. Without looking for excuses, the livelihoods of whānau (farming, midwifery, transport, location) are reasons for the above. We give reasonable leniency towards these reasons while providing plenty of notice on our stance on absences.
    – As at Week 7, 66% of Māori/Pasifika students were attending over 90% of the time. Unfortunately, by the end of Week 10 this had dropped to 59%.
    – The relationship between attendance and passing the year (previous years’ results) showed an 88% chance of passing for students attending over 90% of the time, dropping below 83% at 60% attendance.
    – Another statistic we gathered was the amount of time spent out of class for school sport fixtures. Although this is recorded as ‘present’, it is still time out of class; for some students it accounted for over 50% of their total periods.
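    The pass-rate-by-attendance relationship described above is a simple grouped calculation over previous years’ records. A minimal sketch, where the sample data is invented purely for illustration:

    ```python
    # Minimal sketch of computing pass rate for students at or above an
    # attendance threshold, from historical (attendance %, passed) records.
    # The sample records below are invented for illustration only.

    def pass_rate(records, min_attendance):
        """Share (as %) of students at or above min_attendance who passed."""
        group = [passed for att, passed in records if att >= min_attendance]
        if not group:
            return None  # no students in this band
        return 100 * sum(group) / len(group)

    history = [(95, True), (92, True), (91, True), (90, False),
               (75, True), (62, False), (55, True), (40, False)]
    high_band = pass_rate(history, 90)  # pass rate for >=90% attendance
    ```

    Comparing `pass_rate(history, 90)` against the rate at lower thresholds is what produces statements like “88% chance of passing above 90% attendance, below 83% at 60%”.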

    At times I know there is little we can do about attendance, especially as a hostel where some ākonga live remotely or travel some distance to school. But the more we can do for whānau to encourage attendance and demonstrate its value in relation to achievement, hopefully we can create a culture change in this space. Similarly for Māori/Pasifika students: the more they feel welcomed, safe and part of our hostel, hopefully we will see an improvement in academic success and attendance.

    1. Kia ora Leighton,

      I really like the way you’ve analysed the attendance data and linked it clearly to achievement. It’s great to see the focus on Māori and Pasifika students, and how you’ve worked with a colleague to strengthen the insights. I also like your recognition of the challenges hostel whānau face and your focus on building a culture that values attendance. Bringing in more student and whānau voice could be a powerful next step to support the culture shift you’re aiming for. Ka rawe on developing such a strong, data-driven focus that clearly shows your commitment to equity and student success. Kia Kaha!

  12. Data-Driven Improvement Plan for the Science Department
    Area of School Focus:
    Mid-year data analysis to identify senior students at risk of not achieving NCEA qualifications, and to implement timely interventions to support them.
    Relevant Data Sources and Collection:
    To identify students at risk and understand the ‘why’ behind their performance, a combination of quantitative and qualitative data will be collected and analysed.
    Student Achievement Data:
    Quantitative: Review of Internal Assessment grades for all senior students across Chemistry, Biology, and Physics.
    Quantitative: Analysis of Practice External grade data to identify students who are performing below the required achievement standard.
    School Processes Data:
    Quantitative: Review of student attendance records, particularly for science classes.
    Qualitative: Analysis of notes from teacher-student check-ins or parent meetings to identify non-academic barriers to achievement (e.g., motivation issues, well-being concerns).
    Perceptions Data:
    Teacher Perceptions: A brief survey for teachers to highlight students they have concerns about, even if grades do not yet reflect it, and to provide context for academic performance.
    Analysis and Key Findings:
    An initial analysis of the data reveals several key trends and patterns.
    The data identifies a group of students with a pattern of low achievement (grades of “Not Achieved” or low “Achieved”) on internal assessments across multiple science subjects.
    Practice external exam grades for this same cohort confirm a gap in their knowledge, suggesting they are unlikely to meet the requirements for a full qualification without intervention.
    A cross-reference with attendance data shows a correlation between poor attendance and lower grades for a significant number of at-risk students.
    Discussions with teachers and review of pastoral notes reveal that some of these students have not submitted key assessments, and others show a lack of motivation or are dealing with personal issues that affect their learning.
    Data-Driven Improvement Plan:
    Goal: To provide targeted support to at-risk senior science students to improve their academic standing, with the aim of increasing the mid-year qualification rate by 10%.
    Strategies and Action Steps:
    Strategy: Implement a multi-tiered intervention system for identified at-risk students.
    Action Step 1.1: Develop a “watchlist” based on the data. For students on the list, their Head of Department (HOD) will initiate a personal one-on-one meeting with the student to discuss their grades and set a clear academic plan.
    Action Step 1.2: Establish a drop-in session for students, where they will be supported by a science teacher to complete overdue work and revise for upcoming assessments. Invite at-risk students to attend.
    Action Step 1.3: Communicate with parents/guardians to share the findings and collaborate on a support plan, including regular check-ins and progress reports.
    Strategy: Provide targeted academic support and resources.
    Action Step 2.1: Create small-group tutoring sessions led by teachers or high-achieving peer mentors, focusing on areas identified as common weaknesses in the practice external data.
    Action Step 2.2: Update the department’s digital resource library to include a ‘Revision’ section with practice questions, instructional videos, and past papers.
    Evaluation:
    The effectiveness of this plan will be evaluated through continuous monitoring of student progress. The primary measures of success will be a minimum 10% increase in the number of at-risk students who successfully achieve their internal assessments by the end of the year and an improvement in their overall practice external grades. Qualitative feedback from students and teachers will be used to understand the impact of the interventions.
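    The “watchlist” in Action Step 1.1 is essentially a cross-reference of the three data sources named in the plan. A hypothetical sketch, where all names, grade codes, and thresholds are assumptions rather than the department’s actual criteria:

    ```python
    # Hypothetical sketch of building the at-risk watchlist by
    # cross-referencing internal grades, practice-external results and
    # attendance. Grade codes ("NA" = Not Achieved, "A-" = low Achieved)
    # and the 85% attendance cutoff are illustrative assumptions.

    LOW_GRADES = {"NA", "A-"}

    def build_watchlist(students, attendance_cutoff=85):
        """Flag students who trip two or more risk indicators."""
        watchlist = []
        for s in students:
            flags = []
            if any(g in LOW_GRADES for g in s["internal_grades"]):
                flags.append("low internals")
            if s["practice_external"] in LOW_GRADES:
                flags.append("weak practice external")
            if s["attendance"] < attendance_cutoff:
                flags.append("poor attendance")
            if len(flags) >= 2:  # multiple indicators -> HOD meeting
                watchlist.append((s["name"], flags))
        return watchlist

    students = [
        {"name": "Student X", "internal_grades": ["NA", "A-"],
         "practice_external": "NA", "attendance": 78},
        {"name": "Student Y", "internal_grades": ["M", "E"],
         "practice_external": "A", "attendance": 96},
    ]
    watchlist = build_watchlist(students)
    ```

    Keeping the flags alongside each name means the HOD meeting can start from the specific indicators that triggered the listing, not just the fact of being listed.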

    1. Kia ora Laura,
      Thank you for your clear and practical plan. I really like how you’ve used both data and teacher insights to identify at-risk students and put strategies in place like the watchlist, drop-in sessions, and parent engagement. One suggestion is to clarify how you’ll monitor progress and share updates, and maybe include success indicators beyond grades, like attendance or engagement. This is a strong, actionable plan that should make a real difference for your students. Kia Kaha!

  13. Focus Area – Reading
    Current Data Summary – Mid-year data shows that 88% of the students in the team are working at or above expectations in reading, with 12% working towards. This indicates strong overall achievement but highlights a small group requiring targeted support to reach the expected level.
    Identified Needs – The 12% (14 students) working below expectations represent a small, specific group rather than a widespread concern. Each of these students is likely there for different reasons, and understanding their individual contexts, strengths, and challenges is key. The data suggests our overall reading practice is strong, and the focus should be on targeted, personalised support for these learners.
    Goal – To accelerate progress for the 14 identified students so that at least half (7 students) are working at or above expectation by the end of the year, while maintaining high achievement levels for the rest of the cohort.
    Actions –
    Adopt a culturally responsive relational approach.
    Conversations between the team leader and classroom teachers to explore each student’s learning story.
    Draw on teachers’ existing in-depth knowledge of these students to identify strengths, challenges, and effective strategies already in place.
    Agree on a small number of practical, targeted adjustments that can be embedded into existing classroom practice.
    Check in regularly to celebrate progress and adjust support as needed.
    Resources & Support – Review the current supports in place for each student and identify what additional help, if any, is needed. This will be individualised and may include in-class strategies, external programmes, or whānau engagement. Focus on building on existing supports rather than creating entirely new systems.
    Timeline – Interventions and support strategies will be implemented and refined throughout the term, with flexibility to respond to student progress.
    Monitoring & Evaluation – Progress will be discussed in team meetings and reviewed formally each term. This will be a shared ownership process, with the team collectively responsible for supporting these students’ success.
    Next Steps – To be determined following the completion of the term’s interventions and review of student progress data.

  14. Data Driven Improvement Notes:
    DATA COLLECTION AND ANALYSIS
    In my department we gather data from multiple sources, and data collection methods vary across Years 1–13. We use standardised tests, including PAT and e-asTTle, for Years 4–10 students. We’re looking forward to the replacement of e-asTTle, as the current format presents challenges for our Year 3 and 4 students due to age-appropriateness issues. We also use individual student and teacher tracking sheets, student engagement surveys, diagnostic tests, summative tests, and Maths Whizz progression data reports.
    My department report does not cover only one year; we review a cohort over multiple years using standardised tests. This also creates teacher accountability. We look for gaps in achievement, areas of concern and successful progression. Reports to the Board of Trustees and staff identify patterns and trends as well as breaking down the data:
    – Standardised test results
    – Achievement by topic (Number, Algebra, Geometry, Measurement, Statistics & Probability)
    – Achievement breakdowns by ethnicity
    – Attendance analysis
    – Cohort tracking across multiple years (not just single-year snapshots)
    – Engagement survey results from students
    SMART GOAL SETTING
    We do not only do this for our yearly department goal; recently we asked students to do it too for their own learning. Our Maths department goals are based on data, and specific targets are set. Review timelines need to be allocated to the goals set.
    EVIDENCE BASED STRATEGY SELECTION
    Our data indicated that Year 9 students were performing at a level where they could successfully achieve Numeracy. As a result, we moved Numeracy exam preparation units down into our Year 9 course. All Year 9 students now sit the Numeracy exam in September, with dedicated time spent ensuring they’re exam-ready, particularly focusing on question comprehension and unpacking skills.
    Research-backed interventions are covered by external PD; staff are encouraged to undertake Maths PD, including NZAMT, yearly. Our department meeting minutes reflect unpacking of the new curriculum, coverage of strategies, and modelling of best practice.
    We have renewed our Mathematics learning units, which required a large amount of practical resources to support creativity. Our board approved our budget to purchase scales, number resources, geometry equipment, etc., to build on our hands-on resources for mathematical learning.
    “The key is that every decision, from identifying priorities to measuring success, is grounded in evidence rather than assumptions or intuition alone.”
    MONITORING AND EVALUATION SYSTEMS
    Results are reviewed quarterly by staff together, discussing cohort progress and concerns, with minutes kept of these meetings.
    As part of our ‘Assessment for Learning’ focus we are actively developing and using student self-monitoring tools. Currently we have a visual progress tracking system.
    Each mathematics topic features fill-in/colour-in assessment boxes that enable students to:
    – Colour sections they have mastered based on test results/teacher feedback
    – Identify their next learning steps independently
    The idea is colour-coded progress over time. We use a different colour for each term, allowing students to view their progress chronologically. The key principle underlying our approach is that it’s not where you start that matters, but the progress you make. This philosophy ensures every student can see their learning journey and celebrate their individual growth. We use Maths Whizz monitoring as well.
    COMMUNICATION & ACCOUNTABILITY
    For the Maths department this involves using data driven decision making, being open to growth. Helping students understand it is not where they start on their Maths journey every year, but how much progress they make. Personal success. Individual tracking. Cohort tracking over multiple years creates teacher accountability. Board Reports, Reports to Parents. Sound standardised assessments. Reward systems and progress milestones celebrated with certificates and chocolates in class and at assembly etc.

    Blessings,
    Monique

    1. Kia ora Monique,

      Ngā mihi for a really thorough and strong data-driven approach. I like how you track cohorts over multiple years, use a variety of data sources, and involve students in goal-setting and self-monitoring. Your evidence-based adjustments, like moving Numeracy prep into Year 9, show responsiveness to student needs. The visual progress tracking and celebrations of growth are excellent ways to keep students engaged and motivated. One suggestion is to make the review timelines for goals a bit more explicit so progress can be monitored consistently. Kia Kaha!

  15. 1. Select a specific area of school focus
    The focus area is student well-being through improved Year 13 attendance. This is a crucial year for our students, as it is their last before moving on to further study, training, or work. Good attendance in Year 13 not only supports NCEA Level 3 success but also helps students develop positive habits and routines that will set them up for life after school.
    2. Identify relevant data sources and collect the necessary data
    To get a clear understanding of the issue, I used the Term 1 2025 Attendance Data from the Every Day Matters report, which provided a current snapshot of attendance patterns. The year level breakdown of attendance categories was also reviewed to compare Year 13 with other cohorts. In addition, historical Term 1 attendance data from 2020 to 2025 was examined to see long-term trends and identify changes in student attendance over time.
    3. Analyse the data and identify key trends, patterns, and areas for improvement
    – In Term 1 2025, 32% of Year 13 students were attending regularly (above 90%).
    – 32% were in the chronic absence category (below 70%), the highest rate in the school.
    – The remainder of students were in the irregular (80–90%) or moderate (70–80%) attendance categories.
    – 82% of all Year 13 absences were unjustified, pointing to disengagement rather than unavoidable absence.
    – The school-wide regular attendance rate fell from 57% in Term 1 2024 to 50% in Term 1 2025.
    – Chronic absence in Year 13 indicates a long-term pattern of disengagement, not just occasional absence.
    – Many Year 13 students are balancing work, personal responsibilities, and school, which affects attendance priorities.
    4. Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps
    The goal is to raise Year 13 regular attendance from 32% to 50% by the end of Term 3, 2025. This will mean moving roughly one-third of students currently in the irregular or moderate attendance categories into the regular category.
    To achieve this, every Year 13 student will have an individual meeting with a Dean or Whānau teacher to discuss their attendance, understand any barriers they are facing, and link their attendance to their NCEA achievement and future opportunities. Careers advisors will work alongside these students to connect their attendance goals directly to their post-school pathway plans, whether that involves further study, apprenticeships, or employment. Finally, we will work to acknowledge and reward improvement in attendance, rather than only celebrating students who already have high attendance, so that progress from any starting point is recognised and valued.
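    The goal above (32% to 50% regular attendance) translates directly into a count of students who must move bands. A quick check of that arithmetic, where the cohort size of 120 is a hypothetical figure for illustration:

    ```python
    # Quick sketch of the goal arithmetic: how many students must move
    # into the "regular" (>90%) band to lift the rate from 32% to 50%?
    # The cohort size below is a hypothetical figure, not the school's.

    def students_to_move(cohort_size, current_rate, target_rate):
        """Number of students who must shift into the regular band."""
        current = round(cohort_size * current_rate)
        target = round(cohort_size * target_rate)
        return target - current

    needed = students_to_move(120, 0.32, 0.50)
    ```

    Expressing the target as a concrete number of named students, rather than a percentage, is what makes the individual-meeting strategy trackable term by term.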

    1. Kia ora Wessel,
      You’ve provided a clear, data-driven reflection on Year 13 attendance, highlighting key trends and challenges. I like how your plan combines individualised support with positive reinforcement and sets a clear, measurable goal. This is a strong, strategic approach that connects student well-being, achievement, and future pathways. Ka rawe!
