Programme Evaluation Use

A properly designed evaluation will provide a “decision-making framework for enhancing utility and actual use of evaluations. Therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that will be done, from beginning to end, will affect use” (Patton, 2014). The driving question of any evaluation should be: how will the results be used?

The results of this evaluation will be used to provide the pedagogical leadership team with feedback to determine how to support teachers in implementing the Numicon programme.

Data collection methods and analysis strategies:

We will use mixed methods, collecting both quantitative and qualitative data about the programme. Including qualitative data will help us to understand the “process – that is, the mechanisms by which a particular intervention instigates a series of events that ultimately result in the observed impact” (Rao & Woolcock, 165).

Quantitative data:

  • Questionnaire
  • Standardised test data (ACER, PUMA)
  • Year level assessment data from progress assessments (noting these were not always administered consistently, or at all)
  • Report card data
  • Demographic data (gender, age, attendance, language profile, year level, number of years in the school, class assignment)

Qualitative data:

  • Focus group interviews (teaching teams) – coded example
  • A review of minutes from weekly planning meetings, in which teachers can choose to discuss an area of teaching that has gone well. We will quantify how many times the Numicon programme was mentioned and qualify why it was mentioned (see the sketch after this list).
  • Minutes from vertical Numicon meetings (coded)
  • Literature review of case studies and research reports from other schools that have implemented the Numicon programme (UK, NZ)
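
As a concrete illustration, here is a minimal sketch (in Python) of how the mention counts from the weekly planning minutes could be produced. It assumes the minutes are exported as one plain-text file per meeting in a single folder; the folder name and file layout are hypothetical placeholders, not our actual setup.

    # Tally case-insensitive mentions of "Numicon" across weekly planning
    # minutes, assuming one plain-text file per meeting in a "minutes/"
    # folder (hypothetical layout).
    import re
    from pathlib import Path

    def count_mentions(folder: str, term: str = "Numicon") -> dict[str, int]:
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        return {
            path.name: len(pattern.findall(path.read_text(encoding="utf-8")))
            for path in sorted(Path(folder).glob("*.txt"))
        }

    if __name__ == "__main__":
        mentions = count_mentions("minutes")
        for meeting, n in mentions.items():
            print(f"{meeting}: {n} mention(s)")
        print("Total:", sum(mentions.values()))

Counting only tells us how often the programme comes up; the qualitative coding described below tells us why.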

Data Analysis:

Data will be used to answer our three guiding questions: How are teacher perceptions of the programme influencing how they implement it? What barriers to implementation are teachers facing? Are students achieving better results under this programme than under previous programmes, and if so, by how much?

Qualitative data analysis:

We are using both deductive and inductive approaches to qualitative data analysis. Deductively, some data is being used immediately to solve implementation issues, such as problems with resources. Inductively, all data is also being used to identify larger patterns.

Because we are collecting teacher perception data and sharing it back with the teachers as we go, we are taking a broadly ethnographic approach to the qualitative analysis. This allows us to identify patterns and themes as they emerge while we read through and review the focus group interviews, weekly planning minutes, and minutes from vertical meetings. Sharing the collected data with the teachers builds up a picture of how they are feeling and thinking about the programme, and gives us a context for understanding the findings.

Qualitative data is also being open-coded. The evaluator first reads each document in its entirety, then re-reads it more closely and codes passages with labels: teaching methods, teacher perceptions, student methods, teacher knowledge, resources, and differences. See the example coded minutes from a vertical meeting and from a focus group interview.
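
To make the coding step concrete, here is a minimal sketch of how coded excerpts could be stored for the category work described below. The excerpt texts and sources are invented examples; only the code labels come from our actual scheme.

    # Store open-coded excerpts for a later category roll-up. The example
    # excerpts are invented; the code labels are from our coding scheme.
    from dataclasses import dataclass

    @dataclass
    class CodedExcerpt:
        source: str        # e.g. "focus group" or "vertical meeting"
        text: str          # the quoted passage from the minutes/transcript
        codes: list[str]   # open codes applied to this passage

    excerpts = [
        CodedExcerpt("vertical meeting",
                     "We ran out of Numicon shapes for the Year 4 group.",
                     ["resources"]),
        CodedExcerpt("focus group",
                     "I still model place value the way I used to.",
                     ["teaching methods", "teacher perceptions"]),
    ]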

Qualitative data will then be reviewed again and the codes will be combined to create categories. This is where the literature review will be especially helpful in identifying common areas of implementation challenges and how they were addressed in other schools.

Since we want to respond quickly to implementation challenges, the categories are action focused. Currently the categories that are emerging seem to be:

  • Supporting teaching: teaching methods, student methods
  • Supporting teachers: teacher knowledge, teacher perceptions
  • Resourcing: resources (internal from school and external from Numicon/Oxford University Press)
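
Continuing the sketch above, the roll-up from open codes to these action-focused categories is a simple mapping and tally. The mapping below mirrors the category list just given; "differences" has not yet been assigned a category.

    # Roll the open codes up into action-focused categories, reusing the
    # `excerpts` list from the previous sketch.
    from collections import Counter

    CODE_TO_CATEGORY = {
        "teaching methods":    "Supporting teaching",
        "student methods":     "Supporting teaching",
        "teacher knowledge":   "Supporting teachers",
        "teacher perceptions": "Supporting teachers",
        "resources":           "Resourcing",
    }

    category_counts = Counter(
        CODE_TO_CATEGORY[code]
        for excerpt in excerpts
        for code in excerpt.codes
        if code in CODE_TO_CATEGORY  # skip codes without a category yet
    )
    print(category_counts.most_common())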

Quantitative data analysis:

Sample:

  • We will be analysing data collected from the teaching staff who are implementing the programme (23 members). Twenty of the 23 teachers (87%) completed the questionnaire, a response rate high enough for us to draw useful conclusions about this group.
  • We will be analysing student data from students in Years 3 to 6, as these are the year levels in which we conduct PUMA and ACER testing (173 students).

Methodology:

We will use bivariate analysis to determine associations between data sets, specifically between the PUMA and ACER student test results and the teacher questionnaire results (a small analysis sketch follows the list below). For example, we will

  • compare teacher perception data (questions 2, 3, 7, 9, and 12) with teacher use data (questions 4, 5, and 6) to see if there is any correlation. As an early example, we compared question 1 (What year level do you teach?) with question 2 (To what degree do you enjoy teaching the programme?). The resulting graph suggests that Year 2 and 3 teachers enjoy teaching the programme more than Year 4 and 5 teachers, and that feelings about the programme are consistent within each year level.

  • compare the individual results of questions 2, 3, 7, 9, and 12 from the teacher questionnaire, which relate to teacher perceptions of the programme, with each teacher’s class PUMA results from term 1 and term 2, to see if there is any correlation between student scores and teacher perceptions. There is substantial research evidence that teacher beliefs and perceptions significantly shape how and what they teach.
  • compare the individual results of questions 4, 5, and 6 from the questionnaire, which relate to teacher use of the programme, with that teacher’s students’ PUMA results from term 1 and term 2, to see if there is any correlation between student scores and teacher use of the programme.
  • compare last year’s ACER results (pre-implementation) with this year’s (post-implementation) to see whether results have improved or declined. This comparison will be presented as a bar graph.
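
Here is a minimal sketch of what the bivariate analysis might look like in Python (pandas and SciPy). All data values and column names are hypothetical placeholders. Spearman’s rank correlation is used because Likert-scale questionnaire items are ordinal; with only around 20 teachers, p-values should be read cautiously.

    # Correlate questionnaire items with class-mean PUMA gains (term 2
    # minus term 1). All values below are invented placeholders.
    import pandas as pd
    from scipy.stats import spearmanr

    df = pd.DataFrame({
        "q2_enjoyment": [4, 5, 3, 2, 4, 5, 3, 2],  # perception item, Likert 1-5
        "q4_use":       [5, 5, 3, 2, 4, 4, 3, 1],  # use item, Likert 1-5
        "puma_gain":    [3.1, 4.0, 1.2, 0.4, 2.8, 3.5, 1.0, -0.5],
    })

    # Spearman suits ordinal Likert data better than Pearson.
    for item in ("q2_enjoyment", "q4_use"):
        rho, p = spearmanr(df[item], df["puma_gain"])
        print(f"{item} vs PUMA gain: rho={rho:.2f}, p={p:.3f}")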

Once the data has been analysed, we can use the qualitative data to help us interpret and understand the quantitative data. For example, if student test results go down from last year, we can look at teacher perception data to identify if there are any factors that may explain that decrease.

Examining both the qualitative and quantitative data together should enable us to answer our guiding questions, as well as identify how we can use this information to improve the implementation of the programme.

How we will enhance evaluation use:

To ensure that the evaluation has an impact on the implementation of the Numicon programme, we will involve the teaching and pedagogical leadership teams in the process evaluation. The design of this part of the evaluation will be collaborative, with the aim of increasing teachers’ understanding of the programme, developing shared knowledge, strengthening their commitment to teaching the programme, and increasing their sense of agency in responding to implementation challenges.

Once the data has been collected and analysed, the evaluator (curriculum coordinator) will present it to the teachers and pedagogical leaders. They will then work in small groups to examine the data, draw conclusions, and make commendations and recommendations, sharing their group findings back with the rest of the staff. This will provide a variety of perspectives, allow for rich discussion of how we can improve implementation, and hopefully let some innovative ideas emerge. The pedagogical leadership team will then use this information to develop an action plan, which will be assigned to different groups within the staff to carry out.

Findings from the small teams and the action plan will also be shared with the school board by the Head of School as part of the monthly update on school progress.

Commitment to Standards of Practice:

This evaluation will attempt to meet the standards of practice as set out by the Joint Committee on Standards for Educational Evaluation.

Utility standards have been met by clearly identifying the pedagogical leadership, teachers, students and, to a lesser extent, the board members as the stakeholders involved in and impacted by the programme evaluation. Evaluation questions have been clearly stated, are the driving force of the evaluation, and are responsive to the needs of the teachers and students. Interim findings can be acted upon quickly because the teachers and pedagogical leaders are actively collaborating in the evaluation. The development of an action plan should also increase the likelihood that findings will be used purposefully. The only questionable standard is evaluator credibility: given financial constraints it is not possible to hire an outside evaluator; however, the person facilitating the evaluation is not acting in isolation, and the findings are for internal use only.

Feasibility standards have been met in that the evaluation will be run at zero cost to the school and the information produced should be of sufficient value for decisions to be made from it. The evaluation procedures are practical and are carried out through the normal operating procedures of the school. The evaluation will involve all teachers who teach the programme, and the findings will help us understand the variety of needs those teachers experience.

Propriety standards have been met; the only alternative would be not to conduct the evaluation at all, which would be a disservice to our students and teachers. Before the evaluation data is analysed, an essential agreement will be written and agreed upon stating how the data will and will not be used. Additionally, much of the data will be relabelled with non-descriptive titles (replacing names with numbers, for example), which helps protect the privacy of students and reduces teacher exposure to their peers. Identifying programme strengths and weaknesses is part of the design brief: we want to capture what is working well so it can be shared with teachers as a “continue to do”, and weaknesses can be addressed to improve outcomes for students. There is a clear conflict of interest in having teachers and pedagogical leaders facilitate the evaluation, but it would be a more significant conflict of interest not to evaluate the programme at all; that could be considered professional negligence.

Accuracy standards have been met as the programme and its context have been clearly defined. A mixed-methods design was chosen specifically so that the qualitative data can help us understand the context of the quantitative data and help defend its reliability. Using site-specific data as well as data from external sources (the literature review) increases the validity of the findings. As the teams analysing the data will have overlapping tasks, the findings should be relatively robust and allow us to draw justified conclusions; this should also contribute to more impartial reporting. A reflection involving the teachers and pedagogical leaders will be conducted at the conclusion of the evaluation to help us better understand the value of the process and rectify any outstanding issues.


References:

Interview Strategies. Cambridge: Harvard Department of Sociology, n.d. PDF. http://sociology.fas.harvard.edu/files/sociology/files/interview_strategies.pdf

Little, Kate. “9 Quantitative Data Analysis.” YouTube, 4 Apr. 2016. Web. 17 Mar. 2017. https://www.youtube.com/watch?v=81jmL-jUdcA

Lofgren, Kent. “Qualitative Analysis of Interview Data: A Step-by-Step Guide.” YouTube, 19 May 2013. Web. 17 Mar. 2017. https://www.youtube.com/watch?v=DRL4PF2u9XA

Mymandchannel. “Utilization-Focused Evaluation for Equity-Focused and Gender-Responsive Evaluations.” YouTube, 7 June 2013. Web. 12 Mar. 2017. https://www.youtube.com/watch?v=jQP1FGhxloY

Sanders, James R. The Program Evaluation Standards: Summary of Standards. Thousand Oaks: Sage Publications, n.d. PDF. http://www.oecd.org/dev/pgd/38406354.pdf

Tipsheet – Qualitative Interviewing. Durham: Duke Initiative on Survey Methodology, n.d. PDF. http://www.dism.ssri.duke.edu/pdfs/Tipsheet%20-%20Qualitative%20Interviews.pdf

Qualitative Data Analysis Info. N.p.: n.p., n.d. PDF.
