Research release
September 20, 2025
Novelty-triggered Capacity Growth (NCG) for Continual Learning
NCG is a continual learning framework in which a network grows capacity dynamically—but only when it needs to. A novelty detector monitors incoming data and flags inputs that differ significantly from what the model has seen before; when novelty is detected, the network adds capacity (e.g., new units) to accommodate the new knowledge. Meta-parameters govern when and how much growth occurs. The goal is to mitigate catastrophic forgetting—the loss of prior knowledge when a model is trained on new tasks.
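To make the trigger-then-grow loop concrete, here is a minimal toy sketch. The distance-based novelty test, the prototype memory, and the names (`NoveltyDetector`, `GrowingNet`, `threshold`, `growth_step`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np


class NoveltyDetector:
    """Flags an input as novel if it is far from all stored prototypes.

    `threshold` is one of the meta-parameters that govern *when* growth
    happens (hypothetical formulation, not the paper's detector).
    """

    def __init__(self, threshold):
        self.threshold = threshold
        self.prototypes = []

    def is_novel(self, x):
        if not self.prototypes:
            return True  # nothing seen yet: everything is novel
        dists = [np.linalg.norm(x - p) for p in self.prototypes]
        return min(dists) > self.threshold

    def remember(self, x):
        self.prototypes.append(x)


class GrowingNet:
    """Toy network whose hidden layer widens on demand.

    `growth_step` is the meta-parameter that governs *how much*
    capacity is added per novelty event.
    """

    def __init__(self, in_dim, hidden, growth_step):
        self.growth_step = growth_step
        self.W = np.random.randn(hidden, in_dim) * 0.1

    @property
    def capacity(self):
        return self.W.shape[0]

    def grow(self):
        # Append freshly initialized rows; existing weights are untouched,
        # which is what preserves previously learned knowledge.
        new_rows = np.random.randn(self.growth_step, self.W.shape[1]) * 0.1
        self.W = np.vstack([self.W, new_rows])


detector = NoveltyDetector(threshold=1.0)
net = GrowingNet(in_dim=4, hidden=8, growth_step=4)

rng = np.random.default_rng(0)
for task_mean in (0.0, 5.0):  # two "tasks" from well-separated distributions
    x = rng.normal(task_mean, 0.1, size=4)
    if detector.is_novel(x):
        net.grow()
        detector.remember(x)

print(net.capacity)  # 8 initial units + 2 novelty events * 4 = 16
```

In a real system the novelty signal would come from model statistics (e.g., reconstruction error or predictive uncertainty) rather than raw input distances, but the control flow—detect, grow, record—is the same.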
In our paper we report a 21% reduction in forgetting on Split-MNIST (p = 0.012) and a 64% reduction on Split-CIFAR-10 (p < 0.0001). We also report an honest negative finding: meta-parameter recovery ratios sometimes fell below 0.5, indicating that meta-parameter estimation does not always converge cleanly.
The NCG paper and release artifacts are now available with reproducible reporting and clear statements of both strengths and limitations.