Data Quality & Degradation
Definition
Data quality determines how much a system can reliably evaluate, interpret, and act on the data it receives.
When quality decreases, the system must degrade gracefully—not compensate with assumptions.
Quality Dimensions
Data quality must be assessed along:
- Continuity — Are there gaps in the data stream?
- Resolution — Is the data granular enough for evaluation?
- Source reliability — Is the data source trustworthy?
- Context completeness — Is sufficient context available?
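These dimensions can be held as an explicit quality assessment rather than left implicit. The sketch below is illustrative only; the `QualityAssessment` structure, its field names, and the numeric thresholds are assumptions for this example, not part of the ESGR System Model specification.

```python
from dataclasses import dataclass
from enum import Enum


class QualityLevel(Enum):
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"
    INSUFFICIENT = "insufficient"


@dataclass
class QualityAssessment:
    """Hypothetical container for the four quality dimensions (scores in [0, 1])."""
    continuity: float            # fraction of the expected data stream actually present
    resolution: float            # granularity relative to what evaluation requires
    source_reliability: float    # trust placed in the data source
    context_completeness: float  # how much of the required context is available

    def overall(self) -> QualityLevel:
        # Illustrative thresholds: the weakest dimension bounds overall quality,
        # since any single gap limits what can be reliably evaluated.
        weakest = min(
            self.continuity,
            self.resolution,
            self.source_reliability,
            self.context_completeness,
        )
        if weakest >= 0.8:
            return QualityLevel.HIGH
        if weakest >= 0.5:
            return QualityLevel.MEDIUM
        if weakest >= 0.2:
            return QualityLevel.LOW
        return QualityLevel.INSUFFICIENT
```

Taking the minimum rather than the average is a deliberate choice in this sketch: averaging would let strong dimensions mask a critical gap, which is exactly the kind of compensation the model forbids.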
Degradation Rules
When quality is insufficient:
- Evaluation depth must decrease — Fewer constructs evaluated, wider uncertainty bands
- Interpretation scope must narrow — Less specific language, more hedging
- Action availability must reduce — Fewer suggestions, more "no action" outputs
- Full degradation to silence is permitted — producing no output at all is a valid response
The Degradation Cascade
| Quality Level | Evaluation | Interpretation | Action    |
|---------------|------------|----------------|-----------|
| High          | Full       | Specific       | Available |
| Medium        | Partial    | General        | Limited   |
| Low           | Minimal    | Hedged         | None      |
| Insufficient  | None       | None           | Refused   |
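The cascade can be encoded as a direct lookup from quality level to permitted behavior, so that no code path can raise confidence beyond what the level allows. The `DegradationPolicy` name below is hypothetical; only the level-to-behavior mapping is taken from the table.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DegradationPolicy:
    evaluation: str      # "full", "partial", "minimal", or "none"
    interpretation: str  # "specific", "general", "hedged", or "none"
    action: str          # "available", "limited", "none", or "refused"


# Direct transcription of the degradation cascade table; keys are quality levels.
CASCADE: dict[str, DegradationPolicy] = {
    "high":         DegradationPolicy("full",    "specific", "available"),
    "medium":       DegradationPolicy("partial", "general",  "limited"),
    "low":          DegradationPolicy("minimal", "hedged",   "none"),
    "insufficient": DegradationPolicy("none",    "none",     "refused"),
}
```

Because the mapping is a lookup rather than conditional logic, the only way to obtain specific interpretation or available actions is to earn a high quality level; dropping quality can only remove capability, never add it.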
Why Degradation Matters
Systems that maintain confidence despite poor data:
- Mislead users
- Create false precision
- Cannot admit uncertainty
- Eventually fail catastrophically
Systems that degrade gracefully:
- Match confidence to evidence
- Preserve user trust
- Fail safely
- Maintain long-term reliability
What Degradation Looks Like
High quality data: "Recovery capacity is present. Stress load is moderate. Conditions support light activity."
Medium quality data: "Recovery capacity appears present. Stress load is elevated. Limited confidence in current assessment."
Low quality data: "Insufficient data continuity. Current state uncertain. No action recommended."
Insufficient data: [No output]
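Tying these examples back to the cascade: the wording of the output, and whether any output is produced at all, follows from the policy rather than from the evaluator's confidence. The `render` function and its templates below are hypothetical illustrations (reusing `CASCADE` from the sketch after the cascade table), not prescribed ESGR output formats.

```python
from typing import Optional


def render(quality_level: str, findings: dict[str, str]) -> Optional[str]:
    """Hypothetical renderer: wording narrows with quality; silence is a valid output."""
    policy = CASCADE[quality_level]  # lookup from the sketch after the cascade table

    if policy.interpretation == "none":
        return None  # insufficient data: refuse to produce any output

    if policy.interpretation == "hedged":
        return "Insufficient data continuity. Current state uncertain. No action recommended."

    if policy.interpretation == "general":
        return (
            f"Recovery capacity appears {findings.get('recovery', 'uncertain')}. "
            "Limited confidence in current assessment."
        )

    # "specific": full-quality wording; action suggestions may accompany it
    return (
        f"Recovery capacity is {findings['recovery']}. "
        f"Stress load is {findings['stress']}. "
        "Conditions support light activity."
    )


# Example: silence, not a guess, when data is insufficient.
assert render("insufficient", {}) is None
```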
Compliance Note
Systems that maintain high confidence despite low data quality, or that compensate for poor data with assumptions, violate ESGR System Model specifications.