---
name: Human Error
slug: human-error
type: behavioral exception
status: running
version: 0.0.∞
released: "~200,000 BCE"
maintainer: no one in particular
dependencies:
  - confidence
  - fatigue
  - ambiguity
  - incentive structures
  - the assumption that you understood the instructions
license: Unwaivable Commons 1.0
tags:
  - cognition
  - blame
  - systems
  - universal
  - unfixable
---
A label applied to the last person who touched something before it broke, in lieu of examining the system that put them there.
A human encounters a task. The task has hidden assumptions, time pressure, fatigue, or ambiguous inputs. The human makes a decision using the information available at the time. The decision turns out to be wrong. A report is filed. The report says "human error." The system continues unchanged.
The loop runs indefinitely.
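The loop above can be sketched in code. Everything here is illustrative: the function names, the report shape, and the `"operator"` label are hypothetical, and no real incident-management tooling is implied.

```python
# A minimal sketch of the loop described above. All names are
# hypothetical; this models the behavior, not any real system.

def make_decision(information_available):
    """The human decides using only what is visible at the time."""
    return "reasonable given the inputs"

def file_report(outcome, last_person):
    """Close the investigation at the first human encountered."""
    if outcome == "wrong":
        return {"root_cause": "human error", "blamed": last_person}
    return {"root_cause": None, "blamed": None}

def fix_system(report):
    """The system continues unchanged."""
    return report  # filed, closed, nothing modified

# One iteration shown; in practice the loop runs indefinitely.
decision = make_decision(information_available="partial")
report = file_report(outcome="wrong", last_person="operator")
report = fix_system(report)
print(report["root_cause"])  # human error
```

Note that `fix_system` receives the report, not the system: the causal chain terminates at the person, by construction.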
"We traced the failure back to operator mistake." — every post-incident report ever written
| Bug ID | Description | Status |
|---|---|---|
| HE-001 | Error blamed on individual when procedure was unexecutable | Won't fix |
| HE-002 | "Human error" used to close investigation before root cause found | By design |
| HE-003 | System designers excluded from blame surface | Documented, ignored |
| HE-004 | Hindsight bias makes the error appear obvious in retrospect | Persistent |
```
ERR_CONTEXT_MISSING       // acted on incomplete information
ERR_OVERLOADED            // too many concurrent tasks, one dropped
ERR_NORMALIZATION_DRIFT   // deviance became procedure; procedure failed
ERR_ASSUMED_CONFIRMED     // believed step was done; step was not done
ERR_WRONG_MENTAL_MODEL    // understood the system incorrectly, confidently
FATAL_HINDSIGHT_ASSIGNED  // blame filed after outcome known
```
Human error does not run in isolation. It requires:

- confidence
- fatigue
- ambiguity
- incentive structures
- the assumption that you understood the instructions
Remove the dependencies and the error rate drops. Remove the human and you get automation bias instead.
Is human error preventable? Specific instances, yes. The category, no. Design around it.
Who is responsible? Whoever signed the org chart above the person named in the report, if you follow the causal chain far enough. No one ever does.
Should we retrain the individual? Only if the training addresses the actual failure mechanism. Retraining someone to not be tired is not a curriculum.
Is this a bug or a feature? The error is a bug. The label "human error" is, operationally, a feature.