Machine Learning Overview

An over-fitted model is an idiosyncratic model that performs well only on the training data and fails on unseen testing data. We use a process called regularization to prevent over-fitting. Regularization adds a penalty term to the error function that discourages large and sudden fluctuations in the coefficients. An under-fitted model, by contrast, is too simple to capture the underlying pattern even in the training data. As analysts, we attempt to construct models that are neither over-fitted nor under-fitted.
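As a minimal sketch of this idea (not from the text itself), the example below fits the same high-degree polynomial twice with scikit-learn: once with plain least squares and once with a ridge (L2) penalty added to the error function. The dataset, polynomial degree, and alpha value are illustrative assumptions; the point is only that the penalized fit keeps the coefficients small while the unpenalized fit lets them blow up to chase noise.

```python
# Illustrative sketch of regularization; data and parameters are made up.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 20)).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + rng.normal(scale=0.2, size=20)

# High-degree polynomial with no penalty: coefficients grow large and the
# curve chases the noise in the training data (over-fitting).
overfit = make_pipeline(PolynomialFeatures(degree=12), LinearRegression())
overfit.fit(x, y)

# Same polynomial with an L2 penalty on the coefficients: large, sudden
# coefficient values are penalized, so the fitted curve is smoother.
regularized = make_pipeline(PolynomialFeatures(degree=12), Ridge(alpha=1e-3))
regularized.fit(x, y)

print("unregularized coefficients:", overfit[-1].coef_.round(1))
print("ridge coefficients:        ", regularized[-1].coef_.round(1))
```

Comparing the printed coefficients shows the effect of the penalty: the ridge model trades a small amount of training-set accuracy for coefficients of a much more moderate scale, which is what helps it generalize to unseen data.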

Figure: over-fitting vs. under-fitting