Bias-Variance Tradeoff
The bias-variance tradeoff is a core concept in Machine Learning. A model with too few parameters tends to oversimplify the problem and under-fit the data (high bias, low variance), whereas a model with many parameters tends to be overly complex and sensitive to noise in the training set, over-fitting it and generalising poorly (low bias, high variance). Bias is the error introduced by approximating a real-world problem with a simplified model; variance is the model's sensitivity to small fluctuations in the training data.
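The tradeoff is easy to see empirically. The sketch below (a minimal illustration assuming NumPy; the sine-curve data generator and the particular polynomial degrees are illustrative choices, not from the text) fits polynomials of increasing degree to noisy samples: the low-degree fit has high error on both training and test data (high bias), while the very high-degree fit drives training error down but test error up (high variance), with an intermediate degree balancing the two.

```python
# Minimal sketch: under-fitting vs over-fitting with polynomial regression.
# Assumes only NumPy; data and degrees are illustrative, not prescriptive.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Noisy samples of a sine curve: true signal plus Gaussian noise.
    x = np.sort(rng.uniform(0.0, 1.0, n))
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, n)
    return x, y

x_train, y_train = make_data(30)
x_test, y_test = make_data(200)

for degree in (1, 4, 15):
    # Least-squares polynomial fit; the degree-15 fit may warn about
    # poor conditioning on only 30 points, which is part of the point.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Typically the degree-1 model shows similarly high error on both sets (under-fitting), the degree-15 model shows near-zero training error but much larger test error (over-fitting), and the intermediate degree generalises best.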
The tradeoff is typically treated as a problem of classical Machine Learning. In the era of Deep Learning, where datasets are large and plentiful and model parameter counts are very high, it is discussed much less, but it remains relevant, especially when data is limited.