Module 7: Te Whakamahi Pūrongo (Data-Driven Leadership): This module focuses on using data effectively to inform decision-making, assess progress, and drive continuous improvement.

“He aha te take? He aha te pūtake?”

“What is the cause? What is the root cause?”

Module Objectives:

  • Understand the importance of data-driven decision-making in education.
  • Identify and collect relevant data to inform school improvement initiatives.
  • Analyse and interpret data effectively to identify trends, patterns, and areas for improvement.
  • Use data to inform and evaluate school programmes and initiatives.
  • Communicate data effectively to stakeholders, including teachers, students, parents, and the wider community.
  • Develop a data-driven improvement plan for a specific area of school focus.

Data are crucial for improving student achievement. By revealing gaps in student learning and instructional practices, they guide teachers and leaders in identifying areas for improvement and tailoring instruction to meet individual student needs.

However, data alone do not provide solutions. They serve as a valuable tool for understanding student learning and informing decision-making. Interpreting data is paramount; it involves uncovering the ‘story behind the numbers’ by identifying patterns and relationships. This process requires ongoing analysis, reflection, and the collection of further evidence to refine understanding and inform continuous improvement.

Types of Data:

  • Demographic data: Information about students, staff, and the school community.
  • Student achievement data: Standardised tests, classroom assessments, and student work samples.
  • Perceptions data: Information gathered through surveys, questionnaires, observations, and student voice.
  • School processes data: Information about programmes, classroom practices, and assessment strategies.

When gathering data, focus on relevant information that serves a specific purpose. Avoid collecting excessive data, which can be time-consuming and difficult to analyse.

While student achievement data provide valuable information about outcomes, they do not explain the underlying causes. To understand these, draw on formative assessment data, classroom observations, student voice, and other relevant sources.

Analysing Data:

Start by posing a specific question about your data, focusing on differences, gaps, or the impact of teaching practices. Look for unexpected findings and identify patterns, trends, and categories.

Avoid jumping to conclusions; explore the data deeply, considering multiple perspectives and questioning your assumptions.

Evaluate data quality using the 4 Cs: Completeness, Consistency, Comparison, and Concealed information.

Create a concise data overview and share it with colleagues to gain diverse perspectives.

Generate inferences and potential explanations, remembering that correlation does not equal causation.
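
As a quick illustration of this point, here is a minimal sketch (Python with pandas, using invented numbers purely for demonstration) that computes a correlation between attendance and an assessment score. A strong coefficient shows only that the two measures move together; it says nothing about why.

  import pandas as pd

  # Hypothetical data: one row per student (invented for illustration only).
  students = pd.DataFrame({
      "attendance_pct": [95, 88, 72, 60, 91, 83, 55, 97],
      "assessment_score": [78, 70, 55, 48, 74, 66, 40, 82],
  })

  # Pearson correlation measures the linear association between the two columns.
  r = students["attendance_pct"].corr(students["assessment_score"])
  print(f"Correlation between attendance and score: r = {r:.2f}")

  # A high r tells us the two move together; it does NOT tell us that attendance
  # causes the scores (a third factor, such as engagement, may drive both).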

Develop a range of data stories to identify areas for further investigation.

Recognise that data may not tell the whole story and that further data collection may be necessary to confirm findings.

Resources:

https://research.acer.edu.au/cgi/viewcontent.cgi?article=1317&context=research_conference

https://education.nsw.gov.au/about-us/education-data-and-research/cese/publications/research-reports/5-essentials-for-effective-evaluation 

http://www.edu.gov.on.ca/eng/policyfunding/leadership/IdeasIntoActionBulletin5.pdf

https://cdn.auckland.ac.nz/assets/education/about/schools/tchldv/docs/Using%20Evidence%20in%20the%20Classroom%20for%20Professional%20Learning.pdf 

Task: Developing a Data-Driven Improvement Plan

  • Select a specific area of school focus (e.g., literacy, numeracy, student well-being).
  • Identify relevant data sources and collect the necessary data.
  • Analyse the data and identify key trends, patterns, and areas for improvement (a starter sketch of this step follows this list).
  • Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
  • Post your data-driven improvement plan on the online forum for peer feedback and discussion.
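
For those who would like a concrete starting point for the analysis step, the sketch below (Python with pandas) shows one way several data sources might be joined and students flagged for follow-up. The column names, thresholds, and figures are invented placeholders to be replaced with your own school's data.

  import pandas as pd

  # Hypothetical extracts from school systems (invented for illustration).
  attendance = pd.DataFrame({
      "student_id": [1, 2, 3, 4],
      "attendance_pct": [94, 68, 85, 59],
  })
  achievement = pd.DataFrame({
      "student_id": [1, 2, 3, 4],
      "credits_achieved": [42, 18, 30, 12],
  })

  # Join the two sources on a common student identifier.
  merged = attendance.merge(achievement, on="student_id")

  # Flag students who fall below assumed thresholds (adjust to your own context).
  merged["at_risk"] = (merged["attendance_pct"] < 80) | (merged["credits_achieved"] < 20)

  # The flagged list is a prompt for further investigation, not a conclusion.
  print(merged[merged["at_risk"]])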

Assessment:

  • Completion of all readings.
  • Participation in the online forum discussion.
  • Development and submission of a data-driven improvement plan.
  • Demonstration of the ability to analyse and interpret data effectively.

7 Responses

  1. Select a specific area of school focus
    Improving academic achievement for students at risk of not achieving in the commerce learning area. The school-wide focus has been on improving academic performance for all students, with particular attention to identified learner groups such as ESOL, Māori and Pasifika students, and to those at risk of not achieving.

    Identify relevant data sources and collect the necessary data.
    – Achievement data from internal assessments throughout the year (data tracking of assessments). Early identification of issues on a shared spreadsheet, highlighting students at risk of not achieving and any interventions or alternative pathway plans (e.g. extra internal or external assessment opportunities).
    – Tracking attendance, ethnicity, SAC, learning needs and other data along with achievement data to see if there is a correlation between student achievement and these other factors.
    – Teacher feedback and dean feedback to identify other contributing factors to student achievement, e.g. health concerns, learning needs, etc.

    Analyse the data and identify key trends, patterns, and areas for improvement.
    – Data for the cohort have identified the students at risk of not achieving. There is also, in general, a direct correlation between attendance and achievement. Comparing across years, there is some evidence that results for certain assessments improve when they are run later in the year; this may be due to prior learning or less pressure from other assessment demands. More data analysis is required before claiming direct causation.

    Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    – Teachers offer an extra internal assessment opportunity to all students, but actively encourage those in need to take it up (this is only an option in some subjects). A timeline is set for students to enrol in extra internals. Only some students take this opportunity, but for some it can be the difference between passing a subject and not achieving at all.
    – Communicate with parents, either via the interview process and/or an email or call home, to signal that a student is at risk of not achieving.
    – Keep transparent data records for everyone involved in the teaching team, so all staff can access them at all times (e.g. a shared Google Sheet). This way, staff can add data as it becomes available and ensure early interventions. They can track their own students against the rest of the cohort and be more accountable for their students’ achievements.
    – This system has been trialled and refined over a couple of years, with attendance data added each term and colour-coded to identify issues early. As a result, we have substantial data showing the correlation between attendance and achievement. Early interventions with students to discuss their attendance and its potential impact on their learning have generally improved attendance records for most.

  2. 1. Select a specific area of school focus
    Within the science department we have had a focus on students continuing to learn at home. We have therefore invested in a programme called Carousel, a retrieval-practice system in which we create questions for students to revise throughout the week using flashcards. Students then sit a quiz on those flashcards at the end of the week, which they self-mark.

    2. Identify relevant data sources and collect the necessary data
    With this system you can track whether a student has completed the end-of-week quiz, the accuracy of students’ marking, and the accuracy of their answers.
    To see whether the programme is making a difference to students’ learning, their overall results have also been collected.
    I also tracked class attendance to see whether this was the greater determinant of success in class.

    3. Analyse the data and identify key trends, patterns, and areas for improvement
    The main focus of our department is to get students doing homework. My data show a dip in homework completion mid-term. This is normally when we change subjects in our scheme, and it could be a reason why students struggle with the first topic in their end-of-term exams.

    From the first two terms of data I have found that students who completed the homework over 75% of the time have not received a grade lower than Achieved (with the exception of ESOL students).
    Students with the greatest completion rate and accuracy rate also seem to get the best results with a higher percentage receiving Merit and Excellence grades.

    I have been in a fortunate situation where I have recently taken over another colleague’s L1 class. With my own class I have been holding lunchtime detentions for those who have not completed their homework, using the time to go over effective uses of the system and how I want them to use it. I have also been sending emails home to parents reinforcing the importance of their child doing the homework, and praising those who have completed the homework to a high level and those who have shown the biggest increase in completion and accuracy. In the first week of data comparing the two classes (Week 4 of Term 3), the class I have taken over had a 38% completion rate, whereas my own class had a 53% completion rate. My class had an accuracy rate of 85%, compared with 75% for my new class (a rough sketch of this comparison is included at the end of this response).

    4. Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
    From this data I can see that homework completion is strongly associated with student achievement in science. However, there is a marked mid-term dip in homework completion.
    Goal
    Raise weekly homework completion to 80%.
    Strategy
    Continue lunchtime detentions for those who are not completing their homework. Use this time to reinforce the benefits of doing the homework by showing them the data, and to give them time to revise their key terms.
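
    A rough sketch of how the completion and accuracy comparison above could be produced from a weekly quiz log (Python with pandas; the data and column names are invented for illustration and are not Carousel’s actual export format):

      import pandas as pd

      # Hypothetical weekly quiz log: one row per student per week (invented data).
      log = pd.DataFrame({
          "class": ["mine", "mine", "mine", "new", "new", "new"],
          "completed": [True, True, False, True, False, False],
          "accuracy": [0.90, 0.80, None, 0.75, None, None],
      })

      # Completion rate = share of students who did the quiz; accuracy is averaged
      # over those who completed it (missing accuracy values are ignored by mean()).
      summary = log.groupby("class").agg(
          completion_rate=("completed", "mean"),
          mean_accuracy=("accuracy", "mean"),
      )
      print(summary)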

  3. Kia Ora,

    I have selected overall academic achievement and its relationship with attendance for the house I am dean of. I have also identified Māori/Pasifika students within the data set, as they are a priority group within our school. Thankfully I shared this with one of the maths teachers in our house, who added further value to the analysis.
    Some summaries we found last term were:
    – We had 74% of students attending school at or over 90% of the time. That is 136 students out of 182 meeting the Ministry expectation; however, overall we are below what the Ministry is aiming for:
    “80% of students attending more than 90% of the time by 2030”
    – Alongside this, the MOE has set a target of >94% for daily attendance. As at Week 6 we were at 92.8%, which we would ideally like to improve. Illness, unapproved absences and holidays are the biggest contributors. Without looking for excuses, the livelihoods of whānau (farming, midwifery, transport, location) are reasons for the above. We give reasonable leniency towards these reasons while providing plenty of notice on our stance on absences.
    – As at Week 7, 66% of Māori/Pasifika students were attending over 90% of the time. By the end of Week 10 this had unfortunately dropped to 59%.
    – The relationship between attendance and passing the year (based on previous years’ results) showed an 88% chance of passing for students attending over 90% of the time, dropping below 83% at 60% attendance (a rough sketch of this kind of banded analysis is included at the end of this post).
    – Another statistic we gathered was the amount of class time missed for school sport fixtures. Although this is recorded as ‘present’, it is still time out of class; for some students it accounted for over 50% of their total periods.

    At times I know there is little we can do about attendance, especially as a hostel school where some ākonga live remotely or travel some distance to school. But the more we can do for whānau to encourage attendance and demonstrate its value in relation to achievement, hopefully we can create a culture change in this space. Similarly for Māori/Pasifika students: the more they feel welcomed, safe and part of our hostel, the more we hope to see an improvement in both attendance and academic success.
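
    A rough sketch of the kind of banded pass-rate analysis mentioned above (Python with pandas; the records and cut-offs are invented for illustration, not our actual data):

      import pandas as pd

      # Hypothetical prior-year records: attendance and whether the year was passed.
      history = pd.DataFrame({
          "attendance_pct": [95, 92, 88, 75, 61, 58, 97, 70],
          "passed_year": [True, True, True, False, True, False, True, True],
      })

      # Band attendance, then compute the pass rate within each band.
      history["band"] = pd.cut(
          history["attendance_pct"],
          bins=[0, 60, 70, 80, 90, 100],
          labels=["<60%", "60-70%", "70-80%", "80-90%", ">90%"],
      )
      pass_rates = history.groupby("band", observed=False)["passed_year"].mean()
      print(pass_rates)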

  4. Data-Driven Improvement Plan for the Science Department
    Area of School Focus:
    Mid-year data analysis to identify senior students at risk of not achieving NCEA qualifications, and to implement timely interventions to support them.
    Relevant Data Sources and Collection:
    To identify students at risk and understand the ‘why’ behind their performance, a combination of quantitative and qualitative data will be collected and analysed.
    Student Achievement Data:
    Quantitative: Review of Internal Assessment grades for all senior students across Chemistry, Biology, and Physics.
    Quantitative: Analysis of Practice External grade data to identify students who are performing below the required achievement standard.
    School Processes Data:
    Quantitative: Review of student attendance records, particularly for science classes.
    Qualitative: Analysis of notes from teacher-student check-ins or parent meetings to identify non-academic barriers to achievement (e.g., motivation issues, well-being concerns).
    Perceptions Data:
    Teacher Perceptions: A brief survey for teachers to highlight students they have concerns about, even if grades do not yet reflect it, and to provide context for academic performance.
    Analysis and Key Findings:
    An initial analysis of the data reveals several key trends and patterns.
    The data identifies a group of students with a pattern of low achievement (grades of “Not Achieved” or low “Achieved”) on internal assessments across multiple science subjects.
    Practice external exam grades for this same cohort confirm a gap in their knowledge, suggesting they are unlikely to meet the requirements for a full qualification without intervention.
    A cross-reference with attendance data shows a correlation between poor attendance and lower grades for a significant number of at-risk students.
    Discussions with teachers and review of pastoral notes reveal that some of these students have not submitted key assessments, and others show a lack of motivation or are dealing with personal issues that affect their learning.
    Data-Driven Improvement Plan:
    Goal: To provide targeted support to at-risk senior science students to improve their academic standing, with the aim of increasing the mid-year qualification rate by 10%.
    Strategies and Action Steps:
    Strategy: Implement a multi-tiered intervention system for identified at-risk students.
    Action Step 1.1: Develop a “watchlist” based on the data. For students on the list, their Head of Department (HOD) will initiate a personal one-on-one meeting with the student to discuss their grades and set a clear academic plan.
    Action Step 1.2: Establish a drop-in session for students, where they will be supported by a science teacher to complete overdue work and revise for upcoming assessments. Invite at-risk students to attend.
    Action Step 1.3: Communicate with parents/guardians to share the findings and collaborate on a support plan, including regular check-ins and progress reports.
    Strategy: Provide targeted academic support and resources.
    Action Step 2.1: Create small-group tutoring sessions led by teachers or high-achieving peer mentors, focusing on areas identified as common weaknesses in the practice external data.
    Action Step 2.2: Update the department’s digital resource library to include a ‘Revision’ section with practice questions, instructional videos, and past papers.
    Evaluation:
    The effectiveness of this plan will be evaluated through continuous monitoring of student progress. The primary measures of success will be a minimum 10% increase in the number of at-risk students who successfully achieve their internal assessments by the end of the year and an improvement in their overall practice external grades. Qualitative feedback from students and teachers will be used to understand the impact of the interventions.

  5. Focus Area – Reading
    Current Data Summary – Mid-year data shows that 88% of the students in the team are working at or above expectations in reading, with 12% working towards. This indicates strong overall achievement but highlights a small group requiring targeted support to reach the expected level.
    Identified Needs – The 12% (14 students) working below expectations represent a small, specific group rather than a widespread concern. Each of these students is likely there for different reasons, and understanding their individual contexts, strengths, and challenges is key. The data suggests our overall reading practice is strong, and the focus should be on targeted, personalised support for these learners.
    Goal – To accelerate progress for the 14 identified students so that at least half (7 students) are working at or above expectation by the end of the year, while maintaining high achievement levels for the rest of the cohort.
    Actions –
    Adopt a culturally responsive relational approach.
    Conversations between the team leader and classroom teachers to explore each student’s learning story.
    Draw on teachers’ existing in-depth knowledge of these students to identify strengths, challenges, and effective strategies already in place.
    Agree on a small number of practical, targeted adjustments that can be embedded into existing classroom practice.
    Check in regularly to celebrate progress and adjust support as needed.
    Resources & Support – Review the current supports in place for each student and identify what additional help, if any, is needed. This will be individualised and may include in-class strategies, external programmes, or whānau engagement. Focus on building on existing supports rather than creating entirely new systems.
    Timeline – Interventions and support strategies will be implemented and refined throughout the term, with flexibility to respond to student progress.
    Monitoring & Evaluation – Progress will be discussed in team meetings and reviewed formally each term. This will be a shared ownership process, with the team collectively responsible for supporting these students’ success.
    Next Steps – To be determined following the completion of the term’s interventions and review of student progress data.

  6. Data Driven Improvement Notes:
    DATA COLLECTION AND ANALYSIS
    In my department we gather data from multiple sources, and data collection methods for Years 1-13 vary. We utilise standardised tests including PAT and e-asTTle for Years 4-10. We’re looking forward to the replacement of e-asTTle, as the current format presents challenges for our Year 3 and 4 students due to age-appropriateness issues. We also use individual student and teacher tracking sheets, student engagement surveys, diagnostic tests, summative tests and the like, along with Maths Whizz progression data reports.
    My department report does not cover only one year; we review each cohort over multiple years using standardised tests, which also creates teacher accountability (a rough sketch of this kind of cohort view appears later in this post). We look for gaps in achievement, areas of concern and successful progression. Reports to the Board of Trustees and staff identify patterns and trends as well as breaking down the data:
    – Standardised test results
    – Achievement by topic (Number, Alg, Geometry, Measurement, Statistics & probability)
    – Achievement breakdowns by ethnicity
    – Attendance analysis
    – Cohort tracking across multiple years (not just single-year snapshots)
    – Engagement survey results from students
    SMART GOAL SETTING
    We do this not only for our yearly department goals; we recently asked students to do it for their own learning too. Our Maths department goals are based on data, and specific targets are set. Review timelines need to be allocated to the goals that are set.
    EVIDENCE BASED STRATEGY SELECTION
    Our data indicated that Year 9 students were performing at a level where they could successfully achieve Numeracy. As a result, we moved Numeracy exam preparation units down into our Year 9 course. All Year 9 students now sit the Numeracy exam in September, with dedicated time spent ensuring they’re exam-ready, particularly focusing on question comprehension and unpacking skills.
    Research-backed interventions are covered by external PD, and staff are encouraged to undertake Maths PD, including NZAMT, yearly. Our department meeting minutes reflect unpacking of the new curriculum, coverage of strategies, and modelling of best practice.
    We have renewed our Mathematics learning units, which required a large amount of practical resources; our board approved a budget to purchase scales, number resources, geometry equipment and so on, building up our hands-on resources for mathematical learning.
    “The key is that every decision, from identifying priorities to measuring success, is grounded in evidence rather than assumptions or intuition alone.”
    MONITORING AND EVALUATION SYSTEMS
    Results are reviewed quarterly by staff together, discussing cohort progress and concerns, with minutes recorded in meetings.
    As part of our ‘Assessment for Learning’ focus we are actively developing and using student self-monitoring tools. Currently we have a visual progress tracking system.
    Each mathematics topic features fill-in/colour-in assessment boxes that enable students to:
    – Colour sections they have mastered based on test results/teacher feedback
    – Identify their next learning steps independently
    The idea is colour-coded progress over time. We use a different colour for each term, allowing students to view their progress chronologically when in colour mode. The key principle underlying our approach is that it’s not where you start that matters, but the progress you make. This philosophy ensures every student can see their learning journey and celebrate their individual growth. We use Maths Whizz monitoring as well.
    COMMUNICATION & ACCOUNTABILITY
    For the Maths department this involves using data-driven decision-making and being open to growth: helping students understand that it is not where they start on their Maths journey each year, but how much progress they make. Personal success. Individual tracking. Cohort tracking over multiple years creates teacher accountability. Board reports, reports to parents. Sound standardised assessments. Reward systems and progress milestones celebrated with certificates and chocolates in class and at assembly.
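
    A rough sketch of the multi-year cohort view described above (Python with pandas; the cohorts and scores are invented for illustration):

      import pandas as pd

      # Hypothetical standardised-test results for two cohorts across two years.
      results = pd.DataFrame({
          "cohort": ["2022 intake"] * 4 + ["2023 intake"] * 4,
          "year": [2023, 2023, 2024, 2024, 2023, 2023, 2024, 2024],
          "scale_score": [55, 48, 62, 57, 50, 44, 58, 49],
      })

      # Pivot to see each cohort's average score year by year,
      # rather than a single-year snapshot.
      cohort_view = results.pivot_table(
          index="cohort", columns="year", values="scale_score", aggfunc="mean"
      )
      print(cohort_view)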

    Blessings,
    Monique

  7. 1. Select a specific area of school focus
    The focus area is student well-being through improved Year 13 attendance. This is a crucial year for our students, as it is their last before moving on to further study, training, or work. Good attendance in Year 13 not only supports NCEA Level 3 success but also helps students develop positive habits and routines that will set them up for life after school.
    2. Identify relevant data sources and collect the necessary data
    To get a clear understanding of the issue, I used the Term 1 2025 Attendance Data from the Every Day Matters report, which provided a current snapshot of attendance patterns. The year level breakdown of attendance categories was also reviewed to compare Year 13 with other cohorts. In addition, historical Term 1 attendance data from 2020 to 2025 was examined to see long-term trends and identify changes in student attendance over time.
    3. Analyse the data and identify key trends, patterns, and areas for improvement
    – In Term 1 2025, 32% of Year 13 students were attending regularly (above 90%).
    – 32% were in the chronic absence category (below 70%), the highest rate in the school.
    – The remainder of students were in the irregular (80–90%) or moderate (70–80%) attendance categories.
    – 82% of all Year 13 absences were unjustified, pointing to disengagement rather than unavoidable absence.
    – The school-wide regular attendance rate fell from 57% in Term 1 2024 to 50% in Term 1 2025.
    – Chronic absence in Year 13 indicates a long-term pattern of disengagement, not just occasional absence.
    – Many Year 13 students are balancing work, personal responsibilities, and school, which affects attendance priorities.
    4. Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps
    The goal is to raise Year 13 regular attendance from 32% to 50% by the end of Term 3, 2025. This will mean moving roughly half of the students currently in the irregular or moderate attendance categories into the regular category (a small worked check of this is included at the end of this response).
    To achieve this, every Year 13 student will have an individual meeting with a Dean or Whānau teacher to discuss their attendance, understand any barriers they are facing, and link their attendance to their NCEA achievement and future opportunities. Careers advisors will work alongside these students to connect their attendance goals directly to their post-school pathway plans, whether that involves further study, apprenticeships, or employment. Finally, we will work to acknowledge and reward improvement in attendance, rather than only celebrating students who already have high attendance, so that progress from any starting point is recognised and valued.
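
    A small worked check of the target above (plain Python; the cohort size is invented, since the actual roll is not given here):

      # Assumed Year 13 roll (invented for illustration; substitute the real number).
      cohort = 120

      current_regular = round(0.32 * cohort)  # students already attending over 90%
      target_regular = round(0.50 * cohort)   # students needed for a 50% regular rate

      # Students who must move up from the irregular/moderate bands to hit the target.
      students_to_shift = target_regular - current_regular
      print(f"About {students_to_shift} of {cohort} students need to move into the regular band.")

      # With 32% regular and 32% chronic, roughly 36% of the cohort sits in the
      # irregular or moderate bands, so the shift is about half of that group.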
