Module 7: Te Whakamahi Pūrongo (Data-Driven Leadership): This module focuses on using data effectively to inform decision-making, assess progress, and drive continuous improvement.

“He aha te take? He aha te pūtake?”

“What is the cause? What is the root cause?”

Module Objectives:

  • Understand the importance of data-driven decision-making in education.
  • Identify and collect relevant data to inform school improvement initiatives.
  • Analyse and interpret data effectively to identify trends, patterns, and areas for improvement.
  • Use data to inform and evaluate school programmes and initiatives.
  • Communicate data effectively to stakeholders, including teachers, students, parents, and the wider community.
  • Develop a data-driven improvement plan for a specific area of school focus.

Data are crucial for improving student achievement. By revealing gaps in student learning and instructional practices, they guide teachers and leaders in identifying areas for improvement and tailoring instruction to meet individual student needs.

However, data alone do not provide solutions. They serve as a valuable tool for understanding student learning and informing decision-making. Interpreting data is paramount; it involves uncovering the ‘story behind the numbers’ by identifying patterns and relationships. This process requires ongoing analysis, reflection, and the collection of further evidence to refine understanding and inform continuous improvement.

Types of Data:

  • Demographic data: Information about students, staff, and the school community.
  • Student achievement data: Standardised tests, classroom assessments, and student work samples.
  • Perceptions data: Information gathered through surveys, questionnaires, observations, and student voice.
  • School processes data: Information about programmes, classroom practices, and assessment strategies.

When gathering data, focus on relevant information that serves a specific purpose. Avoid collecting excessive data, which can be time-consuming and difficult to analyse.

While student achievement data provide valuable information about outcomes, they do not explain the underlying causes. To understand these, draw on formative assessment data, classroom observations, student voice, and other relevant sources.

Analysing Data:

Start by posing a specific question about your data, focusing on differences, gaps, or the impact of teaching practices. Look for unexpected findings and identify patterns, trends, and categories.
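For leaders comfortable working with a spreadsheet export, a short script can make this step concrete. The sketch below is Python and entirely illustrative: the file achievement_export.csv and the columns student_id, year_level, gender and reading_score are hypothetical stand-ins for whatever your SMS actually exports.

```python
import pandas as pd

# Hypothetical SMS export; the file name and column names are illustrative only.
df = pd.read_csv("achievement_export.csv")  # columns: student_id, year_level, gender, reading_score

# Pose a specific question: do reading scores differ across year levels or between groups?
summary = (
    df.groupby(["year_level", "gender"])["reading_score"]
      .agg(["count", "mean", "median"])
      .round(1)
)
print(summary)

# Look for unexpected findings: groups whose mean sits well below the school-wide mean.
school_mean = df["reading_score"].mean()
gaps = summary[summary["mean"] < school_mean - 5]  # the 5-point threshold is arbitrary
print("Groups to investigate further:")
print(gaps)
```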

Avoid jumping to conclusions; explore the data deeply, considering multiple perspectives and questioning your assumptions.

Evaluate data quality using the 4 Cs: Completeness, Consistency, Comparison, and Concealed information.
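Two of the four Cs, completeness and consistency, lend themselves to quick automated checks; comparison and concealed information still call for professional judgement. A minimal sketch, reusing the hypothetical export from above:

```python
import pandas as pd

df = pd.read_csv("achievement_export.csv")  # same hypothetical export as above

# Completeness: what proportion of each column is actually filled in?
print("Percent missing per column:")
print(df.isna().mean().mul(100).round(1))

# Consistency: do categorical fields use one agreed set of labels?
print("Year-level labels in use:", sorted(df["year_level"].dropna().unique()))

# Duplicate records inflate counts and distort means.
print("Duplicate student_id rows:", df.duplicated(subset="student_id").sum())
```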

Create a concise data overview and share it with colleagues to gain diverse perspectives.

Generate inferences and potential explanations, remembering that correlation does not equal causation.
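Correlation is easy to compute and easy to over-read. The sketch below, which assumes the same hypothetical export also carries an attendance_rate column, shows how an association might be quantified; the caveat sits in the comments:

```python
import pandas as pd

# Hypothetical export, now assumed to include an attendance_rate column.
df = pd.read_csv("achievement_export.csv")

# Correlation measures whether two things move together; it says nothing about why.
r = df["attendance_rate"].corr(df["reading_score"])
print(f"Attendance vs reading score: r = {r:.2f}")

# Even a strong r may reflect a third factor (wellbeing, home support, prior attainment),
# so treat it as a prompt for further evidence, not a conclusion.
```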

Develop a range of data stories to identify areas for further investigation.

Recognise that data may not tell the whole story and that further data collection may be necessary to confirm findings.

Resources:

https://research.acer.edu.au/cgi/viewcontent.cgi?article=1317&context=research_conference

https://education.nsw.gov.au/about-us/education-data-and-research/cese/publications/research-reports/5-essentials-for-effective-evaluation 

https://www.education-leadership-ontario.ca/application/files/6414/9445/9507/Ideas_Into_Action_for_School_and_System_Leaders_Using_Data_Transforming_Potential_into_Practice_Updated__Winter_2013-14.pdf

https://cdn.auckland.ac.nz/assets/education/about/schools/tchldv/docs/Using%20Evidence%20in%20the%20Classroom%20for%20Professional%20Learning.pdf 

Task: Developing a Data-Driven Improvement Plan

  • Select a specific area of school focus (e.g., literacy, numeracy, student well-being).
  • Identify relevant data sources and collect the necessary data.
  • Analyse the data and identify key trends, patterns, and areas for improvement.
  • Develop a data-driven improvement plan that outlines specific goals, strategies, and action steps.
  • Post your data-driven improvement plan on the online forum for peer feedback and discussion.

Assessment:

  • Completion of all readings.
  • Participation in the online forum discussion.
  • Development and submission of a data-driven improvement plan.
  • Demonstration of the ability to analyse and interpret data effectively.

5 Responses

  1. The area I chose was literacy across the whole school, as I am leading Literacy at my kura. I used the data from Hero, our SMS, which included e-asTTle data as well as teacher OTJs. I also used BSLA data from the junior school, as well as the first round of the new Phonics Checks implemented by the MOE. Additionally, I included student voice, as I have been running a pilot programme; this helped to gauge student enjoyment and engagement before implementing the change school-wide. It also helped with teacher buy-in across the school.
    My analysis showed that our small-group reading is going really well across the school, so there is no need to change anything there. But our writing was quite lacking due to weak surface-level skills such as spelling, handwriting and grammar/punctuation. It was noted that across the school there was limited explicit teaching of sentence structure and these skills, so I made that a prominent part of my literacy programme. My personal research, looking at model schools around NZ and in Australia, showed me that students’ literacy improves when reading and writing are interwoven.
    My DDIP is:
    – a new whole-school literacy approach to align with the new literacy curriculum, kept very similar across the phases for cohesion
    – whole-class reading and writing to come from the same class novel
    – a new structure for writing lessons, including explicit teaching of new vocabulary from the text, sentence structure, and genre lessons that make sense with the novel/shared text
    – implementation of a school-wide handwriting and spelling scope and sequence that aligns with BSLA and the new curriculum.

  2. Over the course of this year we have been looking at our school-wide data, specifically Reading Comprehension.
    We have noticed that progress in Reading has not shown the same level of improvement as our other subjects. Therefore, in our teams, we looked at possible reasons for this and how we could make a difference to these results.

    We started by looking at our testing data. From this we were able to develop “hunches” (assumptions about likely causes). We then dug a bit deeper with in-class formative assessment to gauge these hunches and whether or not we were/are on the right track.

    What we saw/see is that there is a gap in students’ skills in comprehending and answering higher-order thinking questions. This was/is apparent across our syndicate data.

    The next step was to research how to address these gaps. We found a range of possible programmes that we could use in our team. However, we decided that we should have one consistent approach as opposed to multiple, and that this approach would last for at least a term. This meant there was a bit of pressure on finding the one with the most “bang for your buck.”

    We have settled on an approach and are currently implementing our programme. So far, we believe that it is making a difference for our students. How much difference? We will wait and see.

  3. The recent update to the NZC means that the data in our SMS can no longer be reliably compared to previous data. In this instance, I reviewed the most recent overall analysis of writing data in our Years 0–2 team. An overwhelming number of students (84%) were identified as “developing,” when we would expect them to be “progressing.” When I compare this with other teams’ data, I am left with several targeted questions:

    Completeness: Is this data complete? What systems are in place to ensure the data is current and accurate?

    Consistency: What processes does our team follow to ensure consistency and reliability in our data?

    Comparison: At this stage, the only benchmark is the NZC. It’s difficult to show progress given the limited time we’ve had with the updated curriculum.

    Concealed Information: I don’t believe this data fully represents our students’ abilities. What might be missing or not captured?

    Before making plans to “improve” what the data shows, I want to ensure the data itself is accurate, consistent, and up to date. My next step is to discuss with my team how they are tracking achievement and talk with SLT about assessment tools to support greater consistency. Once I feel confident in the data, I can develop a targeted improvement plan.

  4. Standardised testing, teacher judgements and curriculum goals usually inform our data, but with the change of curriculum it is a little harder to have an accurate set of data at the moment. In saying that, looking at the literacy and maths data for our Yr 5/6s, it shows the usual range from well below to above.
    The things that stood out to me that require addressing are:
    – around 30% of our students are below the expected curriculum level for their age. We have recently switched to a structured literacy approach in our Yr 5 classes and will be switching our Yr 6 classes next term. We also now have dedicated LA time to help support our struggling learners with their literacy and maths goals.
    – 2–15% of our students are above their curriculum level, which is lower than in previous years. This can be explained by our focus on a single year-level curriculum, but it does mean there is a gap in supporting our top learners. We are starting an extension programme that uses STEM as the basis to extend critical thinking, problem-solving skills and maths capabilities, so hopefully this will help extend our higher-achieving learners.

  5. As part of normal review, we were recently tasked as team leaders to look at our literacy and numeracy data. The Year 7 literacy data showed a very normal distribution: a small number of students (4–5) achieving well below, the large bulk in the middle achieving below and at standard, and a few (7–8) achieving above the expected curriculum level. These findings were also supported by standardised testing.
    A couple of things stood out to me as needing immediate plans to address:
    – the ~40% of students who are achieving below the expected curriculum level in the first half of Year 7. There were no obvious gender or ethnicity factors impacting the data.
    – no students were identified as working well above the curriculum level and requiring an extension programme in Year 7 literacy.
    As a team we discussed how we would address these results. We discussed the following actions:
    – Designing a ‘rapid acceleration’ approach for the first two terms of the year in literacy in 2026: increasing explicit literacy teaching, with a focus on foundation skills and habit forming. This group of students came through school just ahead of BSLA and was not taught through those methods, so there are what you would now call gaps in their knowledge when measured against the expectations of the new curriculum.
    – Working with students who are able to be extended: finding their interests early and leveraging these to awaken a love of literacy and excite them about it.
