Library

Current primitives, backend families, and agent-adjacent surfaces.

This page tracks the repository as it exists now, not the older site wording.

Scope

InfoTheory combines general information-theoretic primitives, a shared predictive model class, compression adapters, and two Universal-AI-oriented planners: MC-AIXI and AIQI [kim2026_aiqi].

Primitives and estimators

The core library exposes entropy, entropy-rate, mutual information, cross-entropy, intrinsic dependence, NED, NTE, and several NCD variants. The implementation is organized around explicit runtime contexts so the same higher-level operations can be driven by different predictive or compression backends.
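To make the compression-distance side of this concrete, here is a hedged sketch of the normalized compression distance NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)). The library's real NCD variants use its actual compression backends; the `code_len` function below is a toy LZ78-phrase-count proxy, chosen only so the example is self-contained and still rewards shared structure between inputs.

```rust
use std::collections::HashSet;

// Toy stand-in for a compressor's output length: an LZ78 parse,
// charged ~(index bits + one literal byte) per phrase. Not the
// library's real code-length function.
fn code_len(data: &[u8]) -> f64 {
    let mut dict: HashSet<Vec<u8>> = HashSet::new();
    let mut phrase = Vec::new();
    let mut phrases = 0usize;
    for &b in data {
        phrase.push(b);
        if !dict.contains(&phrase) {
            dict.insert(phrase.clone());
            phrases += 1;
            phrase.clear();
        }
    }
    if !phrase.is_empty() {
        phrases += 1; // unfinished trailing phrase
    }
    (phrases as f64) * (((phrases as f64) + 1.0).log2() + 8.0)
}

// NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
fn ncd(x: &[u8], y: &[u8]) -> f64 {
    let (cx, cy) = (code_len(x), code_len(y));
    let mut xy = x.to_vec();
    xy.extend_from_slice(y);
    (code_len(&xy) - cx.min(cy)) / cx.max(cy)
}

fn main() {
    // A string is "closer" to itself than to unrelated material,
    // because C(xx) is much less than 2 * C(x) under any real compressor.
    let a = b"abababababab";
    let c = b"qzkrwpqzkrwp";
    assert!(ncd(a, a) < ncd(a, c));
}
```

The same definition works unchanged whichever backend supplies `code_len`, which is why the NCD variants here are parameterized over the compression layer rather than tied to one codec.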

Rate backend families

The central predictive model class is RateBackend. Library operations such as entropy-rate estimation, generation, rate-coded compression, and agent planning all consume this same model class. The important RateBackend families are:

  • ROSA+ for fast sequential estimation.
  • CTW and FAC-CTW for Bayesian bit-level modeling and AIXI-compatible world models.
  • Match, sparse-match, and PPMD-style byte predictors.
  • Mixture, calibrated, and particle-style meta-predictors.
  • Mamba and RWKV neural backends for deterministic CPU-oriented byte modeling.
  • ZPAQ as a rate model where strict conditioning constraints are not required.
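The reason one model class can serve estimation, generation, compression, and planning is that all four only need a predictive distribution over the next symbol plus a state update. The trait below is a hypothetical illustration of that shape, not the real RateBackend API; the counting model is a deliberately trivial stand-in for the families listed above.

```rust
// Hypothetical minimal interface in the spirit of the shared model class.
// The real RateBackend surface is richer; this only shows why one
// interface suffices for several workloads.
trait SequentialModel {
    /// Predictive distribution over the next byte given the history so far.
    fn predict(&self) -> [f64; 256];
    /// Fold the observed byte into the model state.
    fn update(&mut self, byte: u8);
}

/// Entropy-rate estimate in bits/byte: average surprisal under the model.
/// Generation would sample from predict(); compression would code at
/// -log2 p per byte; planning would roll the model forward.
fn entropy_rate<M: SequentialModel>(model: &mut M, data: &[u8]) -> f64 {
    let mut bits = 0.0;
    for &b in data {
        bits += -model.predict()[b as usize].log2();
        model.update(b);
    }
    bits / data.len() as f64
}

/// Toy adaptive order-0 counter model (Laplace-smoothed).
struct CountModel {
    counts: [u64; 256],
    total: u64,
}

impl SequentialModel for CountModel {
    fn predict(&self) -> [f64; 256] {
        let mut p = [0.0; 256];
        for i in 0..256 {
            p[i] = self.counts[i] as f64 / self.total as f64;
        }
        p
    }
    fn update(&mut self, byte: u8) {
        self.counts[byte as usize] += 1;
        self.total += 1;
    }
}

fn main() {
    let mut m = CountModel { counts: [1; 256], total: 256 };
    let rate = entropy_rate(&mut m, &vec![b'a'; 1024]);
    // A constant stream should estimate well below 8 bits/byte.
    assert!(rate > 0.0 && rate < 8.0);
}
```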

Mamba support is not speculative: the repository already contains a deterministic CPU Mamba-1 runtime and exposes it through the shared backend surface [gu2023_mamba].

RateBackend::Mixture is the ensemble mechanism: it combines a list of RateBackend experts into a single predictive model. The Bayes, Switching, and Convex schemes follow On Ensemble Techniques for AIXI Approximation [veness2012_aixiens]; FadingBayes, Mdl, and Neural are extensions implemented in this repository.
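The Bayes scheme is the textbook Bayesian mixture rule: each expert's weight is multiplied by the probability it assigned to the observed symbol, and the mixture's prediction is the weight-averaged expert prediction. A hedged sketch, with fixed Bernoulli predictors standing in for real RateBackend experts so the weighting mechanics are isolated:

```rust
// Toy Bayes mixture over fixed Bernoulli bit predictors. Real experts
// are sequential models; the update rule shown is the same either way.
struct BayesMixture {
    probs_one: Vec<f64>, // each expert's fixed P(bit = 1)
    log_w: Vec<f64>,     // log posterior weights
}

impl BayesMixture {
    fn new(probs_one: Vec<f64>) -> Self {
        let n = probs_one.len();
        // Uniform prior over experts.
        Self { probs_one, log_w: vec![-(n as f64).ln(); n] }
    }

    /// Mixture predictive probability that the next bit is 1.
    fn predict_one(&self) -> f64 {
        let z: f64 = self.log_w.iter().map(|lw| lw.exp()).sum();
        self.log_w
            .iter()
            .zip(&self.probs_one)
            .map(|(lw, p)| lw.exp() * p)
            .sum::<f64>()
            / z
    }

    /// Bayes update: w_i <- w_i * p_i(bit), then renormalize.
    fn observe(&mut self, bit: u8) {
        for (lw, p) in self.log_w.iter_mut().zip(&self.probs_one) {
            let pb = if bit == 1 { *p } else { 1.0 - *p };
            *lw += pb.ln();
        }
        let z: f64 = self.log_w.iter().map(|lw| lw.exp()).sum();
        for lw in &mut self.log_w {
            *lw -= z.ln();
        }
    }
}

fn main() {
    // On a stream of mostly ones, posterior mass should concentrate
    // on the 0.9 expert and pull the mixture prediction up with it.
    let mut mix = BayesMixture::new(vec![0.1, 0.5, 0.9]);
    for &b in &[1u8, 1, 1, 0, 1, 1, 1, 1] {
        mix.observe(b);
    }
    assert!(mix.predict_one() > 0.7);
}
```

Switching and Convex reuse this machinery with different weight dynamics; the fixed-share variant used by Switching is sketched under the agent-facing configuration below only in the sense that both planners consume the same parser.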

Compression backends

Compression is handled separately from prediction. The library supports standalone ZPAQ where enabled, generic rate-coded compression through arithmetic coding or rANS, and RWKV-specific compression wiring in builds that include the RWKV feature set. This separation is what makes the same modeling stack useful for both information estimation and compression-distance workflows.
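The accounting behind rate-coded compression is worth spelling out: an arithmetic coder or rANS spends about −log2 p(byte) bits per byte under the model, so the ideal code length of a buffer is just the summed surprisal, and any predictive backend therefore doubles as a compressor. A hedged sketch, with an adaptive order-0 counter model standing in for a real RateBackend:

```rust
// Ideal code length in bits under an adaptive order-0 byte model.
// An entropy coder (arithmetic coding or rANS) approaches this total
// to within a small constant; the model here is a toy stand-in.
fn ideal_code_len_bits(data: &[u8]) -> f64 {
    let mut counts = [1u64; 256]; // Laplace smoothing
    let mut total = 256u64;
    let mut bits = 0.0;
    for &b in data {
        let p = counts[b as usize] as f64 / total as f64;
        bits += -p.log2();       // the coder's cost for this byte
        counts[b as usize] += 1; // adapt only after coding, so the
        total += 1;              // decoder can mirror the update
    }
    bits
}

fn main() {
    // A highly repetitive buffer codes far below 8 bits/byte.
    let redundant = vec![b'a'; 4096];
    let bits = ideal_code_len_bits(&redundant);
    assert!(bits / 4096.0 < 1.0);
}
```

The update-after-coding discipline is what the strict conditioning constraints mentioned above protect: both sides of the codec must see identical model states, which is also why ZPAQ is only usable as a rate model where those constraints are relaxed.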

Agent-facing configuration

On the agent side, MC-AIXI and AIQI are not separate projects bolted onto the repo. They share the same environment layer, use the same backend abstractions, and are configured from the same codebase.

Both planners can also take a rate_backend object using the same RateBackend schema and mixture parser as the rest of the repo, including the browser mixture editor; this is how the library's shared model class becomes the planner world model. In practice, Bayes and Convex mixtures are exposed directly, while Switching uses the fixed-share update from On Ensemble Techniques for AIXI Approximation [veness2012_aixiens] with a constant switch-rate alpha; FadingBayes, Mdl, and Neural remain explicit extensions.
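The fixed-share update behind Switching is a small modification of the Bayes step: after multiplying each expert's weight by its predictive probability and renormalizing, a constant fraction alpha of each expert's mass is redistributed over the other experts, so no expert's weight ever decays to zero and the mixture can track a changing best expert. A hedged sketch with toy per-step probabilities in place of real RateBackend outputs:

```rust
// One fixed-share step: Bayes update, then redistribute alpha of each
// expert's posterior mass over the other experts. `step_probs[i]` is
// expert i's probability for the symbol actually observed this step.
fn fixed_share_step(weights: &mut [f64], step_probs: &[f64], alpha: f64) {
    let n = weights.len() as f64;
    let z: f64 = weights.iter().zip(step_probs).map(|(w, p)| w * p).sum();
    for (w, p) in weights.iter_mut().zip(step_probs) {
        let u = *w * p / z; // Bayes posterior for this expert
        // Keep (1 - alpha) of u, spread alpha of everyone else's mass here.
        *w = (1.0 - alpha) * u + alpha * (1.0 - u) / (n - 1.0);
    }
}

fn main() {
    let mut w = vec![0.5, 0.5];
    // Expert 0 predicts well for a while...
    for _ in 0..20 {
        fixed_share_step(&mut w, &[0.9, 0.2], 0.05);
    }
    assert!(w[0] > 0.8);
    // ...then expert 1 takes over. The alpha floor on w[1] lets the
    // mixture switch back within a few steps instead of paying for the
    // whole history, which pure Bayes would require.
    for _ in 0..20 {
        fixed_share_step(&mut w, &[0.2, 0.9], 0.05);
    }
    assert!(w[1] > 0.8);
}
```

A constant alpha corresponds to a fixed prior rate of switching between experts; alpha → 0 recovers the plain Bayes mixture.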

For the planner-specific details, see Agents.