Usually the validation error is higher than the training error, but are there cases in which they are equal?
Best Answer
- Reason 1: the model is underfitted, i.e. it has a high bias. A model that is too simple cannot fit the training set much better than the validation set, so both errors end up high and close together (see the sketch after this list).
- Reason 2: the model is near perfect, so both errors are close to the irreducible error.
- Reason 3: the training set is very similar to the validation set, e.g. if some data from the validation set has leaked into the training set.
- Reason 4: if using a neural network, training has been stopped prematurely, before the model could start fitting the training set better than the validation set.
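Below is a minimal sketch of Reason 1, assuming scikit-learn and NumPy are available; the synthetic sine data and the plain linear model are illustrative choices, not part of the original answer. A high-bias model cannot capture the signal in either split, so its training and validation errors come out nearly equal.

```python
# Illustration of Reason 1: an underfit (high-bias) model scores almost
# identically on training and validation data, because it is too simple
# to fit either split well. (Synthetic data; assumes scikit-learn/NumPy.)
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(1000, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=1000)  # non-linear target

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# A straight line cannot capture sin(x): both errors are high and very close.
model = LinearRegression().fit(X_train, y_train)
train_mse = mean_squared_error(y_train, model.predict(X_train))
val_mse = mean_squared_error(y_val, model.predict(X_val))
print(f"train MSE = {train_mse:.3f}, validation MSE = {val_mse:.3f}")
```

With a more flexible model (e.g. adding polynomial features), the training error would drop well below the validation error, which is the usual regime the question describes.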