Psychological Safety Can't Be Data-fied
The new OHS regulations in Victoria are a necessary and powerful step towards protecting mental health in the workplace. They rightly mandate a proactive risk management approach, treating psychological harm with the same seriousness as physical injury.
However, as the focus shifts to compliance, a worrying trend is emerging: the push for "real-time data," "custom analytics," and "proactive mental health tools." We are being sold a vision where employee wellbeing is a dashboard, compliance is a series of green ticks, and a human being is just a collection of quantifiable metrics.
The Illusion of "Real-Time" Safety
The promise is seductive: Real-time data and insights into employee wellbeing will allow HR and WHS teams to head off crises. Proactive mental health tools will support employees before they burn out.
This approach fundamentally misunderstands the nature of psychological safety. Psychosocial hazards—like poor job control, low role clarity, or a toxic culture—are structural problems embedded in the design of work. They are slow-burn issues that cannot be fully captured by immediate, quantitative inputs.
Lagging vs. Leading Indicators: Tools that track sentiment, keystrokes, or meeting attendance capture lagging indicators of stress, not leading indicators of hazard. By the time an AI flags a change in typing speed or a shift in the tone of internal emails, the underlying hazard (e.g., an unsustainable workload) has already been active for months.
The Problem is the "Why": A system can show that 40% of a team uses the 24/7 access to qualified counsellors—a seemingly positive metric. However, the true compliance question is: Why is the need for crisis support so high? The answer lies not in the usage data, but in qualitative insights: Are managers bullying, is the change management chaotic, or are the demands impossible? The solution is changing the work system, not just providing a digital safety net.
Reducing Humanity to a Data Point
The most profound ethical concern with the "analytics-first" mentality is that it risks stripping employees of their human dignity, converting complex lives into consumable metrics.
When we rely on custom reporting and analytics to drive WHS decisions, we risk:
Surveillance and Mistrust: Systems that monitor activity to infer "wellbeing" breed a culture of fear, where employees are incentivised to mask stress and work longer hours just to keep their data profile "healthy." The very act of being monitored can become a new, acute psychosocial hazard, directly undermining psychological safety.
The Black Box of Decision Making: If an employee’s data—be it their EAP usage, performance metrics, or passive sentiment analysis—informs a hiring, promotion, or even a termination decision, that human being has been judged not on their character or work quality, but on an algorithmic score. This removes transparency, justice, and humanity from the employment relationship.
The False Fix: By focusing on the individual's mental state (the "data point") rather than the organisational hazard (the "system"), companies use these tools as a substitute for the heavy lifting required by the Victorian regulations: redesigning work, fixing bad management, and truly consulting with staff. It becomes an act of mitigating stress rather than eliminating its source.
The Compliance We Truly Need
True psychosocial compliance, as envisioned by the Victorian OHS reforms, is not an audit of metrics. It's an audit of human processes and work design.
It is achieved when leaders sit down and genuinely ask:
How can we redesign this team’s handover process to eliminate the 60-hour week?
How can we train managers to give control and clarity, rather than just demanding results?
How can we make our complaints process so fair and transparent that workers trust it completely?
The necessary input is not real-time data from an app, but real-world consultation from a person.
Psychological safety cannot be measured into existence; it must be designed and built through transparent, human-led effort. Anything less risks reducing our people to data points on a digital compliance checklist, forgetting the very human complexity the law seeks to protect.