Collecting data about what happens in our schools is important. It helps us identify trends, spot problems and react to them. However, sometimes data that is accurate and collected for all the right reasons tells a skewed story that benefits no one.
A good example of this is what we call positive destination figures: the information collected about what young people do after leaving school. Every child who goes into some form of education, training or employment is counted as being in a positive destination, and overall percentages for each school can then be calculated and published. The mistake is to rank schools by how high that percentage is without knowing any of the detail behind it.
For example, a positive destination can be a zero-hours contract working in a bar, or a young person on an activity agreement, which can mean as little as two hours of contact time with a project worker in a youth work centre once a week. It can mean signing up for, but never completing, a college course.
For the purposes of the statistics, all of these outcomes count as positive destinations, but how many are really positive? What's more, the last time I looked, we only track young people for the first nine months after they leave school. If you leave at 16, the government has little or no idea what you are doing at 18, 21 or 23.
In my experience, some of the “worst” schools are actually the best at preparing young people for life beyond the school gates, giving them the resilience and confidence they need to make their way in the world.