Using assessment data with teachers: what works?

As my final post in this series, I wanted to review the literature to highlight which practices have been effective in helping teachers use assessment data to inform their instruction, and what we still need to learn more about.

I compared the results of six articles to identify what we currently know about what does, and does not, have a positive impact on teachers’ use of student assessment data to improve learning. Two of these articles, Datnow and Hubbard’s (2015) and Marsh’s (2012), are literature reviews that synthesise what has already been learned about how educators use data and which strategies and conditions seem to indicate success. The rest are research papers conducted within specific contexts. Two of them relate to an intervention led at secondary schools in the Netherlands (Kippers, Wolterinck, Schildkamp, Poortman, & Visscher, 2018; Ebbeler, Poortman, Schildkamp, & Pieters, 2017). A third study was done in South Africa (Kanjee & Moloi, 2014), and the final one in a large suburban school district in the United States (Hoover & Abrams, 2013). It is important to note that all of these studies relied on teachers’ self-reported accounts of how they used data; there is little to no independent empirical evidence to support their claims.


One consistent finding concerned how teachers used the data. Generally, data was used to:

  • Identify the need to differentiate for certain students, specifically low-achieving students. Rarely was it used to differentiate for high-achieving students.
  • Group students for instruction.
  • Identify the need to reteach content.
  • Inform future practice.

Common barriers were:

  • Pacing: many teachers reported that the pacing of their curriculum meant there was no time to go back and reteach, even when their data indicated this was necessary.
  • Assessment literacy: all of the articles identified that teachers did not analyse the data deeply or frequently enough to find the information that would provide the most learning traction. A lack of assessment literacy was identified as the barrier to analysing the data effectively.
  • Time: time was consistently mentioned as a barrier in a few ways. First, there were concerns about the lack of time available for teachers to meet and engage in data analysis, as it cannot be done in 30-minute sessions. Second, time was needed to develop assessment literacy. Third, Ebbeler et al. noted that time was needed for a culture of assessment to develop and for teachers to shift their mindset (p. 101).
  • Beliefs: I’ve dealt with this in previous posts, but teachers’ beliefs about assessment and its role have a significant impact on how willingly they will use assessment data. While most of the articles found that teachers had positive attitudes towards using assessment data, some beliefs inhibited their participation. Datnow and Hubbard found a difference in how primary school and secondary school teachers in Spain viewed and used formative assessment data: primary school teachers held a pedagogical orientation, which saw assessment as a tool for supporting instructional decisions, whereas secondary school teachers held a societal conception of assessment, which led them to use formative data to determine levels of performance (p. 18).

Strategies that helped:

  • Collaboration: most of the studies reported that teachers had more success in using data if they worked in collaborative teams. Collaboration meant that teachers felt a responsibility to each other and were exposed to more perspectives and a wider range of instructional strategies and solutions.
  • Trust and norms: again, this featured in almost all the articles. Collaborative teams with high levels of trust were more likely to engage with the process. In schools where data use was tied to punitive measures, there was more resistance amongst staff to participating meaningfully. Marsh found that when school leaders used non-threatening approaches to data use, such as anonymous test results, it built trust within the school around the use of data (p. 12). Collaborative norms were also found to support the process, as they meant more equal participation and reduced the influence of positional hierarchy on the decisions made.
  • Tools: the use of inquiry models and protocols to support the data analysis process allowed teams to maintain focus on the inquiry. Making the steps “as concrete and explicit as possible” (Ebbeler et al., 2017, p. 101) was identified as an important intervention.
  • Coaching: access to a data coach who could help teachers make sense of the assessment data or teach them the assessment literacy skills was found to have a positive impact on teachers’ use of data. Additionally, coaches who had expertise in pedagogy were found useful as they helped the teams identify instructional solutions.
  • Technical support to render data usable (i.e. teacher-friendly!): some of the schools in the studies used computer programmes that turned raw scores into charts and graphs that were easier for teachers to interpret.
  • School leadership: a distributed leadership model was found to have a positive effect, as it spread the responsibility for using data across the school and meant the expertise did not reside within one person. Marsh found that without an expectation from leadership that teachers were accountable for engaging and following through in their inquiry, they were unlikely to see tangible results (p. 18). The school’s policies and procedures were also found to influence how successfully teachers engaged with data.
  • Structured and sustained approach: this was again a common theme. Teacher use of assessment data needs to be a long-term goal, sustained over time through investments in professional learning, a culture that values the use of assessment data, and systems that provide the time and necessary resources.

Marsh and Kippers et al. both recommend using a range of implementation strategies for the best chance of success. Developing an assessment culture, building teachers’ assessment capacity, dedicating blocks of structured time, and using protocols and models together make it more likely that teaching teams will be able to use data to improve learning for their students than any one of these strategies alone.

What this means for me:

Many of the barriers identified are not necessarily large hurdles in my context. For example, we have large blocks of time for professional learning: early release Wednesdays for whole-staff professional learning, weekly PYP planning meetings of 80 minutes per team, and a further 40-minute team planning block. Additionally, we have six in-service days within the academic year.

As previously discussed we have a strong collaborative culture and our leadership model is distributed. We work hard as a leadership team to ensure that expertise does not reside in any one person, as one of the features of international schools is the transience of the teaching population. To ensure sustainability the school invests heavily in a shared leadership model and in developing expertise throughout the organisation.

We have slightly more flexibility when it comes to pacing as well. As a PYP school, we are allowed to decide for ourselves how long a unit should run, and if necessary we can make changes to these plans. For example, we have some math units that run for four weeks and some that run for 10 to 14 weeks, depending upon the complexity of the material, how much previous experience students have had with the content, and what our diagnostic data tells us about our learners’ needs.

I am hopeful that I can use my learning from this inquiry to work with my school leadership team to plan a path forward for our school. We will need to invest time and energy in developing assessment literacy and coaching, as these are strategies we have control over and both will have a significant impact on how effectively we can use assessment data at different levels – both the classroom level and the whole-school level. At a personal level, I am still dedicated to learning how to make sense of our ISA data, particularly now that I am clear about its role in programme evaluation, but I am equally dedicated to helping my teachers learn how to use their data to improve their teaching and their students’ learning. After all, that’s our raison d’être.


Datnow, A., & Hubbard, L. (2015). Teachers’ Use of Assessment Data to Inform Instruction: Lessons from the Past and Prospects for the Future. Teachers College Record, 117, 1-48.

Ebbeler, J., Poortman, C., Schildkamp, K., & Pieters, J. (2017). The effects of a data use intervention on educators’ satisfaction and data literacy. Educational Assessment, Evaluation and Accountability, 29, 83-105. doi:10.1007/s11092-016-9251-z.

Hoover, N., & Abrams, L. (2013). Teachers’ Instructional Use of Summative Assessment Data. Applied Measurement in Education, 26, 219-231.

Kanjee, A., & Moloi, Q. (2014). South African teachers’ use of national assessment data. South African Journal of Childhood Education, 4(2), 90-113.

Kippers, W., Wolterinck, C., Schildkamp, K., Poortman, C., & Visscher, A. (2018). Teachers’ views on the use of assessment for learning and data-based decision making in classroom practice. Teaching and Teacher Education, 75, 199-213.

Marsh, J. (2012). Interventions Promoting Educators’ Use of Data: Research Insights and Gaps. Teachers College Record, 114, 1-48.
