Double descent
In statistics and machine learning, double descent is the phenomenon where a model with a small number of parameters and a model with a very large number of parameters both achieve low test error, while a model whose parameter count is roughly equal to the number of training data points has high test error. This appears to contradict the bias-variance tradeoff of classical statistics, which predicts that test error keeps growing once a model has too many parameters and begins to overfit.
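The effect can be reproduced in a few lines. The sketch below is a hypothetical setup, not from the article: it fits minimum-norm least squares on random ReLU features of synthetic data, and as the parameter count p sweeps past the number of training points, the test error typically spikes near p = n_train and falls again beyond it. All names, sizes, and the target function are illustrative assumptions.

```python
# Illustrative sketch (assumed setup): double descent with random ReLU
# features and minimum-norm ("ridgeless") least-squares regression.
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test, d = 40, 500, 5          # n_train marks the interpolation threshold
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
w_true = rng.normal(size=d)              # hypothetical linear target
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
y_test = X_test @ w_true

def random_features(X, W):
    """ReLU random-feature map: phi(x) = max(0, x W)."""
    return np.maximum(X @ W, 0.0)

for p in [5, 10, 20, 40, 80, 160, 640]:  # p = number of parameters
    W = rng.normal(size=(d, p)) / np.sqrt(d)
    Phi_train = random_features(X_train, W)
    Phi_test = random_features(X_test, W)
    # pinv returns the minimum-norm solution when p > n_train (overparameterized)
    beta = np.linalg.pinv(Phi_train) @ y_train
    mse = np.mean((Phi_test @ beta - y_test) ** 2)
    print(f"p = {p:4d}  test MSE = {mse:.3f}")
```

Running the sweep usually shows test error dipping, peaking around p = 40 (where the model barely interpolates the training data), and then descending a second time as p grows well past n_train.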