Real-time decisioning is not about implementing batch-style strategies in real time; it is about designing systems and processes to predict, detect and respond to real-world events, and about enabling and empowering people to work in this new way.
With customers and citizens demanding frictionless experiences with Amazon-esque standards for speed and convenience, organisations are shifting from a batch way of working to an always-on, never-ending continuum that more accurately reflects the age we live in.

Digital distractions and information fatigue

For employees, this means more decisions to oversee and more data than ever to absorb. So it isn't surprising that the unceasing demands of technology in our data-saturated age are damaging our physical and mental wellbeing, with employees struggling with digital distractions and information fatigue.
A YouGov study found that 55 per cent of 2,000 British employees surveyed1 experience some form of information overload at work. A similar proportion feel they are distracted by information; just under half experience stress as a result, and 28 per cent believe it affects their wellbeing.
In Information Overload: Causes, Symptoms and Solutions, an article for the Harvard Graduate School of Education's Learning Innovations Laboratory, Joseph Ruff comments that we are bombarded with so much data that we are on information overload: an overload that interferes with our ability to concentrate, think and be creative, and with our personal productivity. Its symptoms include hurry sickness (the belief that one must constantly rush to keep pace with time), pervasive hostility resulting in a chronic state of irritability bordering on anger, and burnout.
Burnout and stress are rife in the technology and data industry, a topic spotlighted in a BBC article last year and called out in the Harvey Nash Tech Survey 2020.

So, what's this got to do with real-time decisioning?

In a real-time decisioning environment, you can blink and, with a refresh of data, the decision you have just made may turn out to be the wrong one. And if you are continuously bombarded with data, alerts and automated decision exceptions, there is never a sense of accomplishment or of getting away from the data smog2.
Alert fatigue is a topical issue in healthcare. For example, the annual report from ECRI Institute ranks alert fatigue in their top ten technology hazards.
Alert fatigue — also known as alarm fatigue — is when an overwhelming number of alerts desensitises the people tasked with responding to them, leading to missed or ignored warnings or delayed responses. In short, the more you're exposed to something, the more you tolerate, normalise, and forget it. And while the risks are somewhat different, alert fatigue is a common issue within IT, DevOps and Data teams as they monitor the always-on technology, data-driven dashboards and automated decisions that enable organisations.
Mistakes, slower response times, feeling overwhelmed by the role, missed or ignored warnings, productivity slumps, and a sense that colleagues are too busy to help are all symptoms of a lack of psychological safety in the workplace.
In a real-time decisioning organisation, leaders create safe environments where a learning mindset is encouraged. As a result, individuals trust that their mistakes won't be punished by the team or leadership, but will instead be learnt from collectively.

Why psychological safety matters

Psychological safety primarily draws upon the research of American scholar Amy Edmondson, who defines it as "a shared belief held by members of a team that the team is safe for interpersonal risk-taking".
Decision making requires individuals to navigate a triple constraint of speed, accuracy and cost: weighing the risk that a delay in applying information will miss the decision window against that of making a decision too quickly, on half-baked data, which can cost money and even lives.
Individuals don't want their peers or their organisation to perceive them as incompetent or ignorant, and if they feel vulnerable and exposed they are less likely to ask left-field questions, challenge the status quo, admit mistakes or ask for help. This sort of environment inhibits collaboration, team learning and innovation, allowing groupthink and confirmation bias to take hold.

Canary in the coalmine

Psychological safety is not necessarily about developing tougher or more resilient employees. Dr Christina Maslach, social psychologist and author of the Maslach Burnout Inventory, describes burnout as the "canary in the coalmine"3.
The canary in its cage goes deep into the coalmine, and if the canary has trouble breathing and functioning, it is a red flag that something isn't right: a sign that the workplace, the mine, is dangerous. This warning sign of an unsafe environment calls for leaders and employees to ask what is going on. In her presentation at the 2018 DevOps Enterprise Summit, Maslach explained that the answer is not to make the canary (the employees) more resilient, but to reduce the toxic fumes and make the workplace itself less toxic.

Championing psychological safety

Real-time decisioning is as much about culture, people and mindset as it is about technology, data and business processes. Google famously learnt this lesson in its quest to build the perfect team in its Project Aristotle initiative, where psychological safety was identified as the most important factor in determining team effectiveness.
Organisations that make better decisions and implement them at pace can outmanoeuvre competitors and adversaries. As organisations shift their decisioning from a traditional, static 'request and response' model to an event mindset that helps them be responsive and resilient to the customers and citizens they serve, the case for psychological safety as a table stake becomes clearer.
By promoting an environment where employees feel empowered to take measured risks, confident that no one on the team will embarrass or punish anyone else for admitting a mistake, asking a question or offering new ideas, leaders can foster a culture of psychological safety.
While leaders set and shape their team's culture, all employees can role-model the behaviours that underpin psychological safety.
This could include, for example:
  • Sharing their own experiences and what they have learnt from their mistakes;
  • Improving how they give (and receive) regular constructive, informal feedback;
  • Remaining alert to how they 'hear' others and how they signal for help; and
  • Embracing, not shying away from, productive conflict.
The bottom line is that when teams and their leaders champion psychologically safe work cultures, interpersonal risk-taking becomes normal. Individuals feel able to speak up and, most importantly, they can see how they are contributing towards making their organisation a better place for all.

About the author
Holly Armitage is Principal Strategist at BAE Systems Digital Intelligence

1. From a report by Microsoft called 'Defying Digital Distraction', fronted by Dave Coplin, then Chief Envisioning Officer of Microsoft UK and author of "The Rise of the Humans: How to outsmart the digital deluge".
2. According to Techopedia, data smog refers to an overwhelming amount of data and information (often obtained through an internet search) whose volume serves more to confuse the user than to illuminate a topic. The term was coined in a book by the journalist David Shenk, which deals with the influence of the information technology revolution and how the vast amount of information available online makes it increasingly difficult to separate fact from fiction.
3. Understanding Job Burnout, a talk by Dr Christina Maslach at DOES18 Las Vegas.

Recommended reading

  • Bringing data to the party. Caroline Bellamy is on a mission to transform how the UK Ministry of Defence uses data. She tells Mivy James about her 30-year career in industry and why data holds the key to smarter and faster decision-making across Defence
  • Delivering data dividends. Mivy James examines what needs to be done to help the military be more data centric
  • We need to talk about data engineering. Organisations across the public and private sectors are increasingly prioritising the role of data engineers – and rightly so, says Alex Richards  
  • Tuning up data trust. How can governments generate greater trust when it comes to data? Nicola Eschenburg says it can be done, and the sooner the better
  • Homing in on data-driven government. Statistical models, data, and analytics have always loomed large in Andy Gregory’s in-tray, but now he’s putting his expertise to good use at the UK’s Home Office. He tells Dylan Langley about his eclectic career and adjusting to life as a senior civil servant.
  • How a new model of collaboration can detect data risk. Data sharing has many advocates but it is fraught with ethical risk. Richard Thorburn says technology can help fuel greater collaboration that safeguards data by detecting risk earlier and faster
  • Operationalising big data analytics and machine learning. Harriet Barr examines the challenges faced in moving from research to operations and explains why machine learning is actually the easy bit

