Walk a quarry the morning after a hand injury. The shift lead has the worker, the witnesses, and the maintenance log. By the time those fragments get typed into a form, reviewed, paired with a corrective action, and tied back to the relevant subpart of the regulation, three weeks have passed. The inspector arrives in two.
That delay is where most EHS programs get caught flat-footed. The capture surface is usually fine. The synthesis is what falls apart — the analysis that turns a record into an answer the auditor can read in one sitting.
Structured, not free-form
Free-form root-cause prose reads as authoritative. That is the problem with it. Auditors do not score prose. They score citations and corrective actions, and the question they are answering is whether the action actually addresses the root cause.
So we committed the output to four explicit sections, in order:
- Immediate cause — the unsafe act or condition closest in time to the incident.
- Contributing factors — the conditions or system gaps that made the immediate cause possible.
- Root cause(s) — the systemic node that, if changed, prevents the next iteration.
- Recommended corrective actions — each tied to a specific section of the regulation, with the citation visible.
The fourth section is where the system earns its keep. A recommended action without a citation is a suggestion. With a citation, it is an audit defense.
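The four-section shape above is easy to pin down as a data structure. Here is a minimal sketch, assuming a simple dataclass model; the class and field names are illustrative, not the platform's actual schema:

```python
# Hypothetical model of the four-section analysis record described above.
# Names are illustrative assumptions, not the platform's real schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CorrectiveAction:
    description: str
    citation: Optional[str] = None   # e.g. a subpart/section reference, if grounded
    needs_review: bool = False       # True when no citation could be grounded

@dataclass
class IncidentAnalysis:
    immediate_cause: str                       # unsafe act/condition closest in time
    contributing_factors: list[str]            # system gaps that enabled it
    root_causes: list[str]                     # systemic nodes that prevent recurrence
    corrective_actions: list[CorrectiveAction]

    def uncited_actions(self) -> list[CorrectiveAction]:
        """Actions that shipped without a citation and should be flagged."""
        return [a for a in self.corrective_actions if a.citation is None]
```

The useful property is that "recommended action without a citation" is a queryable state, not a sentence buried in prose.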
The regulation does the citing
Citations have to come from somewhere defensible. We work against a curated body of regulatory text — the relevant federal subparts for general industry, construction, and surface mining, plus the codes those subparts reference. When the system writes a recommended corrective action, the only citations available to it are passages already present in that body. If the relevant subpart is not in scope, the action ships without a citation and is flagged for human review.
That last part matters. Inventing a citation — a real-looking section number paired with text that does not actually appear in the regulation — is the single fastest way to make an EHS tool untrustworthy in a regulated environment. We treated that as non-negotiable from day one.
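The grounding rule can be stated in a few lines. This is a sketch under the assumption that the curated body is keyed by section identifier; the function name and corpus shape are mine, not the product's:

```python
# Illustrative grounding check: a candidate citation survives only if its
# section identifier is actually present in the curated regulatory corpus.
# Function name and corpus shape are assumptions for this sketch.
from typing import Optional

def ground_citation(
    candidate_section: str,
    corpus: dict[str, str],   # section id -> regulatory text actually in scope
) -> tuple[Optional[str], bool]:
    """Return (citation, needs_review).

    A citation pointing outside the corpus is dropped, and the corrective
    action is flagged for human review instead of shipping a fake reference.
    """
    if candidate_section in corpus:
        return candidate_section, False
    return None, True
```

The asymmetry is deliberate: a missing citation degrades to a flagged gap, while an invented one would degrade trust in every other citation in the record.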
What we deliberately did not let it do
Three guardrails matter as much as the affirmative behavior.
1. It does not invent citations.
If the relevant passage is not in the regulatory context we have curated, the recommended action ships uncited and flagged. A half-finished action with a clear gap is a better outcome than a confidently wrong reference number.
2. It does not replace human investigation on serious incidents.
The analysis is a starting frame, not a substitute for a properly run investigation. For serious-injury or fatality events, the platform surfaces the structured analysis as evidence for the investigators and routes the record into a multi-investigator workflow with explicit sign-off. The analysis informs. It does not close.
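As a routing rule, guardrail 2 is tiny. The sketch below assumes a string severity label and invented workflow names; it is not the platform's API, only the shape of the decision:

```python
# Hypothetical routing for guardrail 2: serious-injury and fatality records
# never auto-close; they enter a multi-investigator workflow with sign-off,
# with the structured analysis attached as evidence only.
# Severity labels and workflow names are assumptions for this sketch.
SERIOUS_SEVERITIES = {"serious_injury", "fatality"}

def route_incident(severity: str) -> str:
    """Return the workflow an incident record should enter."""
    if severity in SERIOUS_SEVERITIES:
        return "multi_investigator_signoff"
    return "standard_review"
```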
3. It does not run silently in the background.
Every analysis is triggered by a person — usually the EHS lead or an inspector pulling up a record. Background batch-generation against the entire historical corpus is available but off by default. Programs where every record carries a machine-generated analysis tend to get worse, not better, because the human eye stops reading.
Where it sits in the workflow
The trigger order matters. An inspection is filed at the work face on a Pulse kiosk by card-tap. The record arrives in the dashboard with the asset already bound to it. The EHS lead — or, later, an inspector working through last year's records — can ask the system for a structured analysis. The result appears on the incident detail page with the cited sections rendered as clickable references.
The practical effect for an inspector arriving on site: when they pull up an incident from six months ago, the analysis is already there, the citations are already there, and the corrective action ledger is tied back to the section of the regulation they were about to ask about. The conversation moves immediately from "what happened" to "what changed and when," which is the conversation everyone wanted to have in the first place.
What we're working on next
- Cross-incident pattern surfacing. When the same root-cause node appears in multiple incidents inside a rolling window, an alert escalates and the associated corrective actions are tied back into the ledger.
- Regulation revision watchers. When the underlying regulation is updated, the affected analyses surface a delta for the EHS lead to review and accept.
- Voice capture continuity. Dictated incident descriptions feeding straight into the structured analysis path. Narrow scope, high-value handshake.
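The first item on that list — recurrence of the same root-cause node inside a rolling window — reduces to a counting problem. A minimal sketch, where the window length, threshold, and all names are assumptions rather than shipped defaults:

```python
# Sketch of cross-incident pattern surfacing: count occurrences of each
# root-cause node inside a trailing window and escalate past a threshold.
# Window, threshold, and names are assumptions, not shipped defaults.
from collections import Counter
from datetime import datetime, timedelta

def recurring_root_causes(
    incidents: list[tuple[datetime, str]],   # (occurred_at, root_cause_node)
    window: timedelta = timedelta(days=90),
    threshold: int = 3,
) -> list[str]:
    """Root-cause nodes seen `threshold` or more times in the trailing window."""
    if not incidents:
        return []
    cutoff = max(t for t, _ in incidents) - window
    counts = Counter(node for t, node in incidents if t >= cutoff)
    return [node for node, n in counts.items() if n >= threshold]
```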