Friday, 20 October 2017

The School Research Lead and the 'backfire effect' - should you be worried?


One of the challenges faced by school research leads is the need to engage with colleagues who have different views about the role of evidence in bringing about improvement.  Indeed, these differences are unlikely to be restricted to the role of evidence; they are also likely to include differing views about the research evidence itself.  What's more, in a widely cited article, Nyhan and Reifler (2010) show how attempts to correct misconceptions through the use of evidence frequently fail to reduce the misconceptions held by a target group.  Indeed, these attempts at correction may inadvertently increase misconceptions in the target group, i.e. the so-called 'backfire effect'.

Now if there is a 'backfire effect', this could have profound implications for both evidence-based school leaders and school research leads as they attempt to engage in dialogue to correct the misconceptions colleagues may hold about research.   This matters because it is necessary to know whether it is possible to engage in constructive dialogue in which misconceptions can be corrected.   If not, school research leads will need to give careful consideration to how they go about disseminating scholarly research, as it may lead to major opinion formers within a school holding an even less favourable view of research as a means of bringing about improvement.

However, there may be an even bigger issue - the 'backfire effect' may not exist at all, and even if it does, it may well be the exception rather than the norm.  In a peer-reviewed paper, Wood and Porter (2016) present results from four experiments involving over 8,000 subjects and found that, on the whole, individuals tended to take on board factual information even when it challenged their partisan and ideological commitments.

So what are the implications for you, as you attempt to develop a school climate and culture based on evidence use?

First, as Wood and Porter (2016) note, the backfire effect appeared to be a product of question wording, which suggests it's important to think through carefully how information is presented to colleagues and how subsequent questions are phrased.

Second, Wood and Porter note that, in general, respondents tend to shy away from cognitive effort and will deploy strategies to avoid it, whereas the backfire effect relies on substantial cognitive effort - developing new considerations to offset the cognitive dissonance generated by the new information.    However, the research which identified the backfire effect often took place in university settings, where the respondents - be they students or teaching staff - often take great delight in cognitive effort.  The school staff room may have a number of similarities with those university settings, so schools may be particularly prone to seeing a disproportionate number of incidents of the 'backfire effect'.

Third, Wood and Porter note that their findings are not without limitations; for example, just because individuals have been presented with information that addresses their misconceptions does not mean that this information has been retained.


And finally, it's important to note that even when relatively new ideas and concepts break out from the academy and reach the public domain, that does not mean they should be taken as 'gospel'; rather, they should be seen as having more than surface plausibility.  That said, even when an explanation is plausible, that does not mean it is the only explanation for what is taking place.

References

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.

Wood, T., & Porter, E. (2016). The elusive backfire effect: Mass attitudes' steadfast factual adherence.

Friday, 13 October 2017

The school research lead, the 5 Whys and appraising school data

As a school research lead, one of your key tasks will be to help colleagues interpret the quantitative data generated by your school.  So in this post I am going to suggest that we look at a technique called the 'five whys', which you can use to analyse data in a way that helps get to the very heart of any underlying issue (Pojasek, 2000).  In doing so, we will use a case study in which last year's GCSE results in a particular department were especially disappointing.

Asking ‘why’ five times

The 'five whys' is a simple technique which involves asking the question 'why' at least five times so that you can get to the root cause of a problem.  The process tends to come to an end when it is no longer possible to come up with an answer to 'why'.    But first, let's look at what happens when you ask 'why' only once and then come up with a fairly 'lazy' answer.

Problem: A subject's examination results are substantially below the previous year's results - the 1 why

Q Why are this department's examination results below those of the previous year?

A Because both the Head of Department and the teacher who taught this subject are newly qualified and relatively inexperienced, and need support and improvement targets

However, we know from the work of Crawford and Benton (2017) that almost all of the year-to-year change in a school's examination results can be explained by changes in the pupil cohort.  So let's have a go with the 5 whys.

Problem: A subject’s examination results are substantially below the previous year’s results - the 5 whys

Q Why are examination results below the previous year's results?
A Because this year a weaker cohort of students took the subject

Q Why did a weaker cohort of students take the subject this year?
A Because 'stronger' students who would normally take this subject chose other subjects

Q Why did the stronger students choose other subjects?
A Because in the year before the students chose their 'options', they had been taught predominantly by non-specialist teachers who were adequate rather than inspiring

Q Why did non-specialist teachers deliver this subject?
A Because all teachers had to have a full timetable

Q Why did all teachers have to have a full timetable?
A Because, due to financial pressures, it was not viable to have teachers on 'light' timetables
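
For those who like to see the mechanics laid bare, the chain above can be sketched as a short script. This is purely an illustration, not part of Pojasek's technique: the `five_whys` function and the wording of the answers are my own, and the final answer is simply treated as the working root cause.

```python
# A minimal sketch of the 'five whys' chain above: each answer becomes
# the subject of the next 'why', and the last answer is taken as the
# working root cause.

def five_whys(problem, answers):
    """Walk a chain of 'why' answers; return the chain and the root cause."""
    chain = []
    question = f"Why: {problem}"
    for answer in answers:
        chain.append((question, answer))
        question = f"Why: {answer}"  # the answer becomes the next 'why'
    root_cause = answers[-1] if answers else None
    return chain, root_cause

chain, root = five_whys(
    "Examination results are below the previous year's results",
    [
        "A weaker cohort of students took the subject this year",
        "'Stronger' students who would normally take it chose other subjects",
        "In the options year the subject was taught mainly by non-specialists",
        "All teachers had to have a full timetable",
        "Financial pressures made 'light' timetables unviable",
    ],
)
for q, a in chain:
    print(q, "->", a)
```

The point the sketch makes is structural: each answer is provisional and simply re-opens the question, which is why stopping after the first 'why' so often produces a 'lazy' answer.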

Pojasek (2000) identifies a number of benefits which come from asking 'why' five times.  First, once you have got the hang of it, it's a pretty quick and easy technique to use.  Second, it helps you think through an issue so that you can drill down to the underlying cause of the problem.  Third, it may help you change your perception of the root cause of a problem.  That said, there are a couple of clear challenges in using the 'five whys'.  These include the need for strong facilitation skills, as the focus is on getting to the root cause of an issue rather than allocating blame.  There is also the issue that there may be multiple causes in play, so it may be difficult to isolate a single root cause.

And some final words

In these times of acute financial pressure on schools, it needs to be emphasised that decisions often have long-term consequences - and what may be a quick fix for the current year may cause substantive problems in years to come.

References

Crawford, C., & Benton, T. (2017). Volatility happens: Understanding variation in schools' GCSE results. Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.


Pojasek, R. B. (2000). Asking "Why?" five times. Environmental Quality Management, 10, 79-84.


Saturday, 7 October 2017

Leading the research and evidence-based school: Are you ready?

When introducing evidence-based practice into your school, a major challenge you will face is diagnosing your school's readiness to both engage with and implement research and other evidence. One way of helping you address this challenge is to use the Promoting Action on Research Implementation in Health Services (PARiHS) framework (Kitson, Harvey, & McCormack, 1998), which provides a useful conceptual and pragmatic heuristic to help you think about the implementation of research within your school. The PARiHS framework describes the successful implementation of research into practice as a function of the interaction of three core elements: the quality and type of the evidence; the context, setting or environment in which the research is used; and the approaches and techniques of facilitation.

Evidence – this consists of four elements: academic/scientific research, practitioner expertise, school/organisational data and the views of stakeholders.

Context – the environment in which the proposed change – either an evidence-informed intervention or the adoption of evidence-based school leadership – is being implemented. It can be divided into three sub-elements: the prevailing culture; school leadership; and the school's approach to accountability and development.

Facilitation – this is where one or more persons – for example, senior leaders, school research leads and champions – make things easier for others; it covers the characteristics of facilitators, their role, style and skills. In this context, the facilitator's role is to help people understand the change required and what needs to be done to bring it about.

Now, each of these elements and sub-elements is placed on a low to high continuum, with Kitson et al. (1998) stating

‘…that for the implementation of research into practice to be successful there needs to be a clear understanding of the nature of the evidence being used, the quality of context in terms of its ability to cope with change and type of facilitation needed to ensure a successful change process’ (p. 152)

For each of these three elements a range of conditions may exist which indicate the likelihood of the successful implementation of research and evidence-based practices. These conditions are illustrated in Figure 1

Figure 1 Conditions for evidence, context and facilitation

For each sub-element, 'Low' and 'High' indicate the likelihood of successfully implementing change.

Element: Evidence

Research
Low: Anecdotal; descriptive
High: Systematic reviews; randomised controlled trials

Practitioner expertise
Low: Expert opinion divided; several 'camps'
High: High level of consensus; consistency of view of the value of evidence

Organisational data
Low: Little detailed data available; data comes from a restricted number of sources
High: High level of detailed quantitative data available; data available from multiple sources

Stakeholder views
Low: Stakeholders not involved
High: Partnerships with full range of stakeholders

Element: Context

Culture
Low: Task driven; low regard for individuals; low morale; little or no CPD; focus on the immediate
High: Learning school; pupil centred; values people; focus on CPD; focus on capacity and capability building

Leadership
Low: Lack of vision; diffuse roles; lack of team roles; poor leadership; poor organisation or management of the school
High: Clarity of vision; clear roles; effective team work; effective organisational structures; clear leadership

Measurement
Low: Absence of audit and feedback, peer reviews, performance review and external evaluation
High: Internal measures regularly used; audit or feedback used routinely; peer review; external measures

Support structures
Low: Not in place; lack of journal clubs/research learning communities; no guidance on processes; no external partnerships with research schools or HEIs; time not made available
High: Part of routine processes; journal clubs and research learning communities embedded; clear guidance on processes; partnerships with research schools and HEIs; dedicated and ring-fenced time available

Element: Facilitation

Characteristics
Low: Respect; empathy; authenticity; credibility
High: Respect; empathy; authenticity; credibility

Roles
Low: Access; authority; position in school; change agenda
High: Access; authority; change agenda successfully negotiated

Style
Low: Inflexible; sporadic; infrequent; inappropriate
High: Range of style and flexibility; consistent and appropriate presence and support

Adapted from Kitson et al. (1998), Figure 3, p. 151


As such, those schools with the greatest chance of successfully implementing evidence-based practice and associated innovations would appear to be those located predominantly on the right-hand, high side of the continuum; whereas those schools with significant work to do to increase their chances of successfully implementing evidence-based practice will have features primarily located on the left-hand side.

Now having undertaken an initial assessment of your school’s readiness to use research and evidence, try to plot where you and your school are on the following evaluative grid. In doing so, you are going to focus on your evaluation of the evidence and context elements of the PARiHS model.

Figure 2 The PARiHS Diagnostic and Evaluative Grid - adapted from Kitson et al., 2008


In Figure 2, Q1 represents a school which has a weak context, though strong evidence on which decision-makers and the main stakeholders can agree. Q2 represents the ideal situation for the implementation of evidence-based practice, where there is a strong, supportive school context and agreement on the strength of the evidence available. Q3 is where there is a weak context, and a school in this quadrant is not well placed to take advantage of any agreement about the strength of the evidence. Finally, Q4 represents a situation where the school has a strong context but where there is little or no agreement about the strength of the evidence available to bring about changes in practice.
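
The grid described above can be sketched as a simple classification rule. This is my own illustrative reading of it, not code from Kitson et al.: I assume each axis is rated simply 'weak' or 'strong', and I take Q3 to be the weak-context, weak-evidence quadrant, since the other three combinations are accounted for in the description.

```python
# A minimal sketch of the Figure 2 diagnostic grid: two axes (context and
# evidence), each rated 'weak' or 'strong', giving four quadrants.

def parihs_quadrant(context, evidence):
    """Place a school in one of the four quadrants of the grid."""
    if context == "weak" and evidence == "strong":
        return "Q1"  # weak context, but strong agreed evidence
    if context == "strong" and evidence == "strong":
        return "Q2"  # the ideal: supportive context and agreed evidence
    if context == "weak" and evidence == "weak":
        return "Q3"  # weak on both counts (assumed reading of Q3)
    if context == "strong" and evidence == "weak":
        return "Q4"  # strong context, little agreement on the evidence
    raise ValueError("ratings must be 'weak' or 'strong'")

print(parihs_quadrant("strong", "strong"))  # Q2
```

The design point is that the diagnosis drives the response: a Q1 school works on context, a Q4 school works on building agreement about the evidence, and a Q3 school has to do both.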

So what does this mean for the facilitation of evidence-based school leadership?  Drawing on the work of Greenhalgh (2017), if a primary concern is that colleagues are not aware of the research evidence available to them as teachers and school leaders, a priority for you as the school research lead/champion may be to help individuals gain a greater awareness of the available evidence and how to evaluate it.    Alternatively, if there is a recognition that the school context is weak, there may be a greater focus on putting in place enabling conditions - such as focussing on pupil and staff learning, ensuring individuals have clear roles and responsibilities, and providing an appropriate organisational framework, such as journal clubs or time for evidence-based CPD.

However, whenever we look at a conceptual model and heuristic, we need to ask whether there is robust evidence demonstrating the efficacy of the approach. As Greenhalgh (2017) notes in the context of healthcare, there are no studies of how PARiHS has been used as the original authors intended; indeed, studies of PARiHS have tended to use the framework to look back at what had already been done. That said, as Greenhalgh notes, PARiHS has what she calls 'face validity' - it seems intuitively sensible that evidence, context and facilitation are all key to implementation. Furthermore, at a broad-brush level, the PARiHS framework is 'sufficiently flexible to allow its application to many different situations and examples' (Greenhalgh, 2017).

And finally

You may wish to use the PARiHS framework as an initial diagnostic which captures your school's readiness to engage with the implementation of research and evidence-based practice. If it works for you, fantastic. If not, there may be other models which work in your context, such as the NFER self-assessment tool.