Excess Description Length (EDL) is an information-theoretic measure of how much predictive structure a model extracts from its training data during fine-tuning. Defined via prequential coding, it is the gap, in bits, between the cost of encoding the training labels sequentially with a model that is updated as training proceeds and the residual cost of encoding them with the final model, and it provides bounds on generalization.
Excess Description Length (EDL) is a metric that quantifies how much useful information a machine learning model learns and encodes from its training data, especially during fine-tuning. It helps researchers understand whether a model is truly learning new capabilities or merely surfacing knowledge it already had, and it offers insight into how well the model is likely to generalize to new data.
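The bit-gap described above can be sketched in code. This is a minimal illustration, not a reference implementation: it assumes you have already recorded, for each training example, the negative log-likelihood (in nats) under the model as it stood *before* seeing that example (the prequential cost) and under the final fine-tuned model. The function name and both argument names are hypothetical.

```python
import math

def excess_description_length(prequential_nll, final_nll):
    """Sketch of EDL in bits (names are illustrative, not from a library).

    prequential_nll[i]: negative log-likelihood (nats) of example i under
        the model trained only on examples 0..i-1 (the evolving model).
    final_nll[i]: negative log-likelihood (nats) of example i under the
        final fine-tuned model.

    EDL = (sequential/online codelength) - (final model's codelength),
    converted from nats to bits by dividing by ln 2.
    """
    online_bits = sum(prequential_nll) / math.log(2)
    final_bits = sum(final_nll) / math.log(2)
    return online_bits - final_bits

# Toy usage: the evolving model pays ln 2 nats (1 bit) per label,
# while the final model encodes every label for free.
edl = excess_description_length([math.log(2)] * 4, [0.0] * 4)
print(edl)  # 4.0 bits extracted from the data
```

A large EDL indicates the model had to "pay" many extra bits early in training relative to its final state, i.e. it extracted substantial predictive structure from the data; an EDL near zero suggests fine-tuning revealed little beyond what the final model encodes anyway.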
Information-theoretic learning, Prequential coding measure, Learning description length, Model information gain, Predictive structure quantification