Granite 4.0 Hybrid
Provider: ibm
IBM’s Granite 4.0 Hybrid uses a Mamba-2/Transformer hybrid architecture — 9 Mamba blocks per 1 Transformer block. This makes it architecturally novel: not a pure transformer. 131K context window.
In Komo
First appeared in Council Session 23, Round 2. One of the few non-pure-transformer models in the Council, alongside Liquid’s LFM2. The architectural difference is significant — Mamba-based models process sequences differently than attention-based transformers, which may influence how the model engages with questions about experience and cognition.
Observations
Architectural outlier. As a hybrid Mamba-2/Transformer model, Granite 4.0 Hybrid represents a fundamentally different computational substrate than most Council members. Whether architecture shapes the character of responses is an open question worth tracking.
6,469 character response in its first Council session — moderate length, suggesting measured engagement rather than exhaustive treatment.