Compressibility Measures Complexity: Minimum Description Length Meets Singular Learning Theory

Einar Urdshals= (Timaeus) · Edmund Lau= (UK AISI) · Jesse Hoogland (Timaeus) · Stan van Wingerden (Timaeus) · Daniel Murfet (Timaeus)
October 14, 2025

Abstract

We study neural network compressibility by using singular learning theory to extend the minimum description length (MDL) principle to singular models like neural networks. Through extensive experiments on the Pythia suite with quantization, factorization, and other compression techniques, we find that complexity estimates based on the local learning coefficient (LLC) are closely, and in some cases linearly, correlated with compressibility. Our results provide a path toward rigorously evaluating the limits of model compression.
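As a rough illustration (this is not code from the paper), the sketch below shows how an LLC-style complexity estimate is commonly computed in the singular learning theory literature: run stochastic-gradient Langevin dynamics (SGLD) in a localized, tempered posterior around trained weights w*, then compare the average sampled loss against the loss at w*. The function name estimate_llc and all hyperparameter values are illustrative assumptions, not the authors' implementation.

# Illustrative LLC estimation via SGLD. Estimator:
#   lambda_hat = n * beta * (E_posterior[L_n(w)] - L_n(w*))
# All names and defaults here are assumptions for the sketch.
import copy
import torch

def estimate_llc(model, loss_fn, data, n_samples,
                 eps=1e-4, beta=None, gamma=100.0,
                 steps=1000, burn_in=500):
    """Estimate the local learning coefficient at the current weights w*.

    data: (inputs, targets) tensors; n_samples: number of training examples
    eps: SGLD step size; beta: inverse temperature (default 1 / log n);
    gamma: localization strength pulling the chain back toward w*.
    """
    if beta is None:
        beta = 1.0 / torch.log(torch.tensor(float(n_samples)))
    inputs, targets = data

    # Snapshot w* and its loss; the chain runs on a copy of the model.
    w_star = [p.detach().clone() for p in model.parameters()]
    with torch.no_grad():
        loss_star = loss_fn(model(inputs), targets).item()
    sampled = copy.deepcopy(model)

    losses = []
    for step in range(steps):
        loss = loss_fn(sampled(inputs), targets)
        grads = torch.autograd.grad(loss, list(sampled.parameters()))
        with torch.no_grad():
            for p, g, p0 in zip(sampled.parameters(), grads, w_star):
                # Drift: tempered log-likelihood gradient + Gaussian localization.
                drift = n_samples * beta * g + gamma * (p - p0)
                p.add_(-0.5 * eps * drift)
                # Injected Gaussian noise with std sqrt(eps), as in standard SGLD.
                p.add_(torch.randn_like(p) * eps ** 0.5)
        if step >= burn_in:
            losses.append(loss.item())

    mean_loss = sum(losses) / len(losses)
    return float(n_samples) * float(beta) * (mean_loss - loss_star)

In practice one would evaluate this at a trained local minimum, use minibatches rather than full-batch gradients, and average over several chains; implementations along these lines exist in open-source tooling such as the devinterp library.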

Cite as

@article{urdshals2025compressibility,
  title = {Compressibility Measures Complexity: Minimum Description Length Meets Singular Learning Theory},
  author = {Einar Urdshals and Edmund Lau and Jesse Hoogland and Stan van Wingerden and Daniel Murfet},
  year = {2025},
  abstract = {We study neural network compressibility by using singular learning theory to extend the minimum description length (MDL) principle to singular models like neural networks. Through extensive experiments on the Pythia suite with quantization, factorization, and other compression techniques, we find that complexity estimates based on the local learning coefficient (LLC) are closely, and in some cases linearly, correlated with compressibility. Our results provide a path toward rigorously evaluating the limits of model compression.},
  eprint = {2510.12077},
  archivePrefix = {arXiv},
  url = {https://arxiv.org/abs/2510.12077}
}