AWS x SEBP Webinar Series: Episode 1 - Emotions, Data and Decision-Making in Critical Incidents
- Helen Khezrzadeh
Introduction
The first webinar in the AWS x SEBP 2026 series brought together experts from the military and ambulance service to explore a deceptively simple question: what really shapes decision‑making in critical incidents?
Across policing, health, and emergency response, practitioners increasingly operate in environments saturated with data - dashboards, feeds, sensors, mapping tools, and AI‑enabled systems. Yet, as this session made clear, the human emotional experience remains inseparable from the decisions made under pressure.
SEBP's Chief Operating Officer Matt Bland opened the session by framing the challenge: critical incidents unfold too quickly for calm, analytical weighing of all available information. Instead, decisions are shaped by time pressure, uncertainty, threat, and emotional load. Experienced commanders rely heavily on pattern recognition - a sense that “something doesn’t feel right” - built from years of exposure to similar situations.
Matt brought together two speakers from outside policing to broaden the perspective:
- Steve Killick, Tactical Director at Airbox and former senior military leader with 38 years' experience in operations.
- Dave Williams, senior leader in the ambulance service and doctoral researcher studying emotional decision‑making in large‑scale incidents.
Both speakers emphasised that while technology is advancing rapidly, the emotional and cognitive realities of decision‑makers must remain central.
Training, Pattern Recognition and the “Right” Information
Dave highlighted that effective decision‑making begins long before an incident occurs. Training, exposure, and prior knowledge create the mental patterns responders rely on when events unfold at speed. “How do we ensure the data we’re looking at matches something that we see?” he asked, emphasising the role of preparation in building those internal templates.
Dave described responding to the 2009 Boeing 777 crash, where information was limited to voice‑only radio updates - a stark contrast to today's digital tools. The lack of visual data increased uncertainty and anxiety, reinforcing a finding from his research that information flow can mitigate fear, which he defines (following Huberman) as the combination of anxiety and uncertainty.
Steve echoed this from a military perspective: early in a career, concise information is essential; with experience, leaders prefer more data so they can self‑filter. But accuracy and consistency are critical — every step in the chain (gathering, transmitting, receiving, interpreting) introduces potential distortion.
Emotion, Instinct and the Limits of Data
Both speakers agreed that instinct - far from being guesswork - is the brain’s rapid matching of current cues to past patterns. Dave noted that experienced responders often act instinctively first, then use data to steer or validate their decisions. Data becomes an enabler, not a replacement for human judgement.
Dave's research uses self‑evaluation tools to track emotional states during decision‑making. While physiological measures (heart rate, eye tracking) are possible, simple self‑reflection remains highly effective.
Designing Technology for Humans, Not the Other Way Around
A recurring theme was the need for systems that support, rather than overwhelm, decision‑makers.
Dave argued strongly for visual information as a stabilising force. Many responders are visual learners, and mapping tools help reduce emotional load by grounding people in space and context. “Stability in information gives you key evidence to drop your emotions down a bit,” he explained.
Steve described how Airbox evolved from an aviation tracking tool into a multi‑agency situational awareness platform, shaped by user feedback rather than theoretical design. But he acknowledged that current systems rarely incorporate emotional cues - an area ripe for development.
Both emphasised:
- intuitive interfaces
- simplicity under pressure
- systems usable even after long gaps
- early and continuous user involvement in design
As Dave put it: “System design can either add to or reduce anxiety.”
Emotional Regulation: A Missing Piece of Training
One of the most compelling discussions centred on whether experienced leaders “turn off” their emotions. Both speakers rejected the idea. Emotions cannot be switched off - but they can be regulated.
Dave noted that emotional regulation is a trainable skill, yet rarely taught in emergency services. He argued for embedding emotional intelligence, fatigue awareness, and self‑reflection into command training, especially for 3am decision‑making when cognitive resources are low.
Steve added that repeated realistic training conditions people to operate effectively despite emotional load, and that heightened emotion can sometimes sharpen decision‑making when time is critical.
Gamification and Safe Failure
Both speakers saw huge potential in gamification, VR, and simulation. Steve argued that humans learn best from failure - but real‑world failure in major incidents is unacceptable. Technology allows safe experimentation, pushing scenarios to breaking points to reveal hidden weaknesses.
Dave emphasised narrative‑based learning: exercises should tell a story, not just present isolated tasks. This mirrors how humans naturally make sense of events.
What Should Procurement Teams Look For?
The session closed with practical advice for organisations buying decision‑support tools:
- Intuitive, simple interfaces that reduce cognitive load
- Visual mapping and spatial tools
- User involvement from the earliest design stages
- Systems that work under pressure and after long periods of non‑use
- Evaluation criteria that measure emotional impact, not just technical performance
As Steve warned, if a system is too complex, “under pressure, they’re just not going to use it.”
