How to use this page: this is a directory of every page in this series, listed with its exact title and subsection numbers. Use it for navigation and cross-linking.
Section 1 – Overview
Section 2 – Data & Calibration
- SSM-JTK – Data & Calibration – Sources + Time & frame (2.1 + 2.2)
- SSM-JTK – Data & Calibration – Pre-processing (2.3)
- SSM-JTK – Data & Calibration – Sampling modes + Guardrails (2.4)
- SSM-JTK – Data & Calibration – Train/Test windows & midpoint anchor (2.5)
- SSM-JTK – Data & Calibration – Kernel families & carriers (2.6)
- SSM-JTK – Data & Calibration – OLS target, BIC, event-aware loss, selection (2.7)
- SSM-JTK – Data & Calibration – Calibration pseudocode (end-to-end) (2.7A)
- SSM-JTK – Data & Calibration – Event detectors (reference) (2.8)
- SSM-JTK – Data & Calibration – Outputs and manifest (2.9)
- SSM-JTK – Data & Calibration – Evaluation formula (runtime) (2.10)
- SSM-JTK – Data & Calibration – Integrity & invariants (2.11)
- SSM-JTK – Data & Calibration – Public verification snippets (CSV-only) (2.12)
Section 3 – Results to Date
- SSM-JTK – Results to Date – Public-golden CSV checks (3.1)
- SSM-JTK – Results to Date – Baseline trust settings (3.2)
- SSM-JTK – Results to Date – Per-body highlights (3.3)
- SSM-JTK – Results to Date – Acceptance bands (3.4)
- SSM-JTK – Results to Date – Artifacts & manifest (concept, no downloads) (3.5)
- SSM-JTK – Results to Date – Alignment utility (3.6)
- SSM-JTK – Results to Date – Benchmark cross-checks (observation-only; optional) (3.7)
Section 4 – Runtime Reproduction
Section 5 – Scorecards
Section 6 – Walkthrough
Section 7 – Conclusion
Navigation
Back: SSM-JTK – Conclusion (7)
Next: End