Pupil Premium and the dream world of perfect evidence-based decision making

Pupil Premium funding is back in the education news after the Chair of the Education Select Committee suggested that it should be tied to getting students into apprenticeships. It is another terrible idea, but one which has prompted a little investigation into what this funding is for, and whether it is being used effectively.

The most up-to-date government evaluation is from 2015, and I think it still makes for pretty interesting reading. Links to the other reports are at the bottom.

Interesting themes:

  • We don’t know what factors influence the decisions that leaders make

The Department for Education concluded that part of the problem is insufficient accountability: ‘Ofsted does not routinely schedule inspections on the basis of these pupils’ performance’ and ‘does not routinely monitor early-warning signs of success or failure.’ This once again assumes that better inspection will lead to better outcomes – something we should be highly sceptical of. Without a proper understanding of what is influencing the decision making of school leaders, this seems like a pretty drastic conclusion to reach.

  • Pupil Premium is being used to fill the gaps and maybe that’s ok

Despite the fact that the attainment gap is not closing, the department confusingly concluded that many schools are spending this money in ‘useful ways’. It feels as though the department is reluctant to cut the money because it knows that the most deprived 16% of schools have received a 5% real-terms cut in funding, and the Pupil Premium is essentially filling that gap. (This is all before the new funding formula comes in.) I’m also reading between the lines here, but is there a tacit recognition that disadvantage and SEND overlap to such a degree that the loss of local funding for the latter should be made up by the former?

So the department paradoxically admits that “the core school funding that the Pupil Premium supplements is not distributed on the basis of need” but also blames schools, arguing “that accountability and intervention mechanisms allow schools to waste money on ineffective activities for many years without effective challenge.” Which is it? Are schools in challenging budgetary conditions simply using this money to plug gaps, or are they wilfully wasting it?

When seeking to solve this problem, the first assumption the Department for Education needs to make is that no school leader wishes to spend money badly and ignore the most deprived within their school community. In fact, the DfE’s 2013 report noted that 90% of schools were targeting these pupils before the Pupil Premium existed – we should trust that leaders are making decisions with the best intentions; they were doing it before the money came along. If the response is simply to increase monitoring and inspection rather than address the legitimate concern of school leaders that much of the EEF research is too general, then we have an issue. Just a guess on my part, but the likely result will be that schools find a way of reporting the spending that satisfies inspectors rather than honestly reviewing how the money is actually spent. If you don’t give school leaders the tools to reform effectively, don’t expect a miracle.

  • The EEF are frustrating when it comes to giving advice

Across the 2013 and 2015 reports, the EEF are constantly cited as the school leader’s solution to spending wisely. The EEF unsurprisingly recommended “a strong commitment to the promotion of rigorous evidence” in their own evaluation of the Pupil Premium, and suggested that using their services is a good way of achieving this. However, they then produce graphs for schools like this one:

[Screenshot: EEF graph of interventions, with some labelled ‘Not worth it’]

I could go on all day about why this graph is bad. One thing that makes it particularly bad is that it is so widely used in schools! Top of the list of offending features: teaching assistants are labelled ‘Not worth it’. Yes, they are expensive, but graphs like this fail to capture the nuances within the research, such as the fact that teaching assistants can be very effective provided you train them and deploy them in the right way. Also, labelling performance pay as ‘not worth it’ when the actual state of the research is ‘we don’t know’ is another jaw-dropper. Let’s not even go into the fact that ‘Feedback’ could mean literally anything in a school context. The EEF love to keep things simple, but that somewhat undermines their hopes of being ‘rigorous’. With advice like this to go on, it seems absurd for Ofsted to criticise school leaders for defective decision making!

 

The 2015 report: https://www.nao.org.uk/wp-content/uploads/2015/06/Funding-for-disadvantaged-pupils.pdf

The Sutton Trust report is also useful: http://www.suttontrust.com/wp-content/uploads/2015/06/Pupil-Premium-Summit-Report-FINAL-EDIT.pdf

Finally, the 2013 DfE report gives some context on how this area has developed: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/243919/DFE-RR282.pdf
