Machine Learning Overview

An over-fitted model is an idiosyncratic model: it fits the training data closely but generalizes poorly to unseen testing data. We can prevent over-fitting by setting constraints on the algorithm (hyper-parameter tuning) and/or by using a process called regularization. Regularization adds a penalty term to the error function that discourages large or rapidly fluctuating coefficients. An under-fitted model, by contrast, is too simple to capture the underlying pattern. As analysts, we aim to construct models that are neither over- nor under-fitted, which means balancing bias (error from an overly simple model) against variance (error from sensitivity to the training data) so that performance on the training and testing sets is comparable.
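As a concrete illustration of the penalty term described above, here is a minimal sketch of L2 regularization (ridge regression) in NumPy. The data, coefficient values, and function names are all illustrative, not from the source; the closed-form solution adds alpha times the identity matrix to the normal equations, which shrinks the fitted coefficients.

```python
import numpy as np

# Illustrative synthetic data: y depends linearly on X plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=50)

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression.

    Minimizes ||y - Xw||^2 + alpha * ||w||^2; the alpha * I term
    below is the regularization penalty shrinking the coefficients.
    """
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w_plain = ridge_fit(X, y, alpha=0.0)   # ordinary least squares, no penalty
w_ridge = ridge_fit(X, y, alpha=10.0)  # penalized fit: smaller coefficients
```

Increasing `alpha` shrinks the coefficients further, trading a little bias for lower variance, which is exactly the balance the paragraph above describes.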

[Figure: over-fitting vs. under-fitting]