The Independent Inquiry into Child Sexual Abuse has highlighted the issues of data capture in protecting vulnerable children from abuse, making specific recommendations for a standardisation of data sets. This is the latest of a number of safeguarding practice reviews that have discussed the importance and challenges of sharing and making use of information across partnerships.
It’s one thing to standardise data, but are police forces, health, social care and others able to make best use of it with their current technology and resources? For me, the challenge is not just the collection of standardised data; it is the ability to do something meaningful with it. The amount of data produced has increased dramatically in recent years, but we haven’t seen a corresponding increase in the ability to process and understand it. From the perspective of a child in the system, this really matters.
Organisations that work in safeguarding environments have a finite capacity to process and manage large, complex datasets. This often leads to prioritisation of work, whereby only the immediate or the critical is looked at and the rest is implicitly de-prioritised. In a world where every child matters, can we really afford to de-prioritise young lives in this way? Safeguarding reports often identify missed opportunities for intervention in early life because intervention thresholds have been too high – often for these very reasons of necessary prioritisation. But we know that a key feature of successful safeguarding is early intervention to prevent the conditions for harm from developing and worsening.
So why is it still happening and what can we do to help stop it? Standardised datasets will certainly help with analysis and understanding. But the ability to intervene early – i.e. before the event – is more often hindered by the relevant data being lost within a fog of data overload than by the quality of the data itself being poor. When you also add the impetus to share more data, you run the very real risk of thickening that fog, even if the new data is of a higher quality or standardised, making those early intervention points even harder to detect.
The sense that all we need to do is share more data, and better data, to increase performance and reduce harm is something we are working to challenge here at Digital Intelligence. More and better data can help, but only up to the point at which those sharing and using the data can make sense of it, understand it and spot the nuances and micro-alerts that hide within it. The trouble is that in most cases, safeguarding organisations are already well past that point, unable to cope with the vast amounts of data already coming at them. I believe there is a critical dimension to this whole equation that has so far been largely overlooked: how to make sense of and use all this information effectively, proactively ensuring that the relevance of each piece of information is understood and never missed.
Technology can provide us with answers here. A system exists that works right across complex datasets, identifying risk at the earliest point, assembling pictures of risk using inference, and presenting them in a prioritised, accessible, explainable way. Once you have such technology, standardised datasets can only make it stronger, more effective and able to analyse more data.
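To make the idea concrete, the approach described above can be sketched in a few lines of code. This is a minimal, illustrative sketch only: the signal names, sources, weights and the simple additive scoring are all invented for the example and do not reflect any real safeguarding system or product. The point it shows is the shape of the technique – combining low-level signals from multiple agencies into a single per-child risk picture that is ranked and explainable, with nothing discarded.

```python
from dataclasses import dataclass, field

# Illustrative only: all signal names, sources and weights below are
# hypothetical, not drawn from any real safeguarding dataset.

@dataclass
class Signal:
    source: str    # contributing agency, e.g. "police", "health"
    name: str      # the observation, e.g. "unexplained_absence"
    weight: float  # contribution to the overall risk score

@dataclass
class RiskPicture:
    child_id: str
    signals: list = field(default_factory=list)

    @property
    def score(self) -> float:
        # Simplistic additive scoring, standing in for inference.
        return sum(s.weight for s in self.signals)

    def explain(self) -> str:
        # Every contributing signal is surfaced, so the ranking
        # is explainable rather than a black-box number.
        parts = [f"{s.source}: {s.name} (+{s.weight})" for s in self.signals]
        return f"{self.child_id} score={self.score:.1f} <- " + "; ".join(parts)

def prioritise(pictures: list) -> list:
    """Rank risk pictures highest-risk first; none are filed away."""
    return sorted(pictures, key=lambda p: p.score, reverse=True)
```

In use, each agency's records would be mapped into `Signal`s, and `prioritise` would surface the children whose combined picture most warrants attention – while `explain` keeps every low-level signal visible rather than silently de-prioritised.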
The power of machines not only to analyse masses of data but to function in a way that replicates human analytical behaviour is the next step change in the evolution of safeguarding. Every single piece of information can be looked at, understood in its true context and used to piece together the best picture of a child’s real-world situation. There is no de-prioritisation, no filing away of low-level or low-risk information, and far less chance of missing something.
People are unique and they need to be understood in the light of their own journey and situation. Every young person has the right to be kept safe, and the effective use of personal information is a critical step towards this: every piece of data about a child’s safety should be looked at and understood. Because every child matters.
To find out more about how BAE Systems Digital Intelligence is working to solve crucial data challenges in this space, contact ben.hargreaves2@baesystems.com