Canopy reads six signals from every AI agent session: intent, spatial position in the codebase, momentum, coherence decay, knowledge state, and structured user intent. It fuses them through a Mixture-of-Experts layer and outputs a single pressure score.
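The fusion step can be sketched as a softmax gate over per-signal experts. This is a minimal illustration, not Canopy's implementation: the signal names, the expert functions, and the gating logits are all hypothetical stand-ins.

```python
import math

# Hypothetical signal names for illustration only.
SIGNALS = ["intent", "position", "momentum", "coherence",
           "knowledge", "user_intent"]

def softmax(xs):
    """Numerically stable softmax."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_pressure(signal_values, gate_logits, experts):
    """Mixture-of-Experts fusion: a softmax gate weights each expert's
    pressure estimate, yielding one scalar score in [0, 1]."""
    gates = softmax(gate_logits)                       # weights sum to 1
    scores = [f(v) for f, v in zip(experts, signal_values)]
    return sum(g * s for g, s in zip(gates, scores))   # convex combination

# Toy usage: three constant experts, uniform gate -> plain average.
score = fuse_pressure(
    [1.0, 1.0, 1.0],
    [0.0, 0.0, 0.0],
    [lambda v: 0.2, lambda v: 0.4, lambda v: 0.6],
)
```

Because the gate is a convex combination, the fused score stays inside the range of the expert estimates, which keeps it interpretable as a single pressure value.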
That score drives everything. Which agent gets spawned next. What it should focus on. Whether the current trajectory is productive or going in circles.
The backbone is Mamba, a state-space model that processes temporal sequences without the quadratic cost of attention. Hidden states capture things handcrafted features can't: transitions between work phases, stuck states, drift from the original goal.
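The linear-time recurrence behind a state-space model can be sketched with a toy diagonal SSM. The fixed coefficients `a`, `b`, `c` are illustrative assumptions; Mamba's selective scan makes them input-dependent, which this sketch does not attempt.

```python
def ssm_scan(inputs, a=0.9, b=0.1, c=1.0):
    """Toy diagonal state-space recurrence:
        h_t = a * h_{t-1} + b * x_t
        y_t = c * h_t
    One pass over the sequence: linear in length, no attention matrix."""
    h = 0.0
    hidden, outputs = [], []
    for x in inputs:
        h = a * h + b * x      # state carries a decaying memory of the past
        hidden.append(h)
        outputs.append(c * h)
    return hidden, outputs

# An impulse at t=0 decays geometrically through the hidden state.
hidden, outputs = ssm_scan([1.0, 0.0, 0.0])
```

The hidden state is the point: it summarizes everything seen so far in constant memory, which is what lets it encode work-phase transitions and drift rather than individual tokens.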
We tested this rigorously. Five research tracks. Frozen benchmarks. Bootstrap confidence intervals on every metric. The result: a hybrid architecture where gradient-boosted trees predict from handcrafted features (AUROC 0.924) while Mamba hidden states capture temporal structure underneath.
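A percentile bootstrap CI on AUROC can be sketched in a few lines: resample sessions with replacement, recompute the metric, and take the tail quantiles. This is a generic illustration of the technique, not Canopy's evaluation harness; the sample sizes and seed are arbitrary.

```python
import random

def auroc(labels, scores):
    """AUROC via pairwise comparison: P(score_pos > score_neg),
    counting ties as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_ci(labels, scores, metric=auroc,
                 n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap: resample (label, score) pairs with
    replacement, recompute the metric, return the alpha/2 and
    1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(labels)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        ys = [labels[i] for i in idx]
        ss = [scores[i] for i in idx]
        if len(set(ys)) < 2:       # AUROC needs both classes present
            continue
        stats.append(metric(ys, ss))
    stats.sort()
    lo = stats[int(alpha / 2 * len(stats))]
    hi = stats[int((1 - alpha / 2) * len(stats)) - 1]
    return lo, hi

# Toy usage on a perfectly separable set.
labels = [0, 0, 0, 1, 1, 1]
scores = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
lo, hi = bootstrap_ci(labels, scores)
```

Reporting the interval rather than the point estimate is what makes a number like AUROC 0.924 comparable across frozen benchmarks.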
Canopy doesn't tell agents what to think. It tells them where the pressure is.