Why does training error increase in learning curves?

I can't seem to think of a reason why training error increases in learning curves as the number of samples increases. Would someone please explain?

[Figure: learning curves showing training and validation error against the number of training samples; the training error rises as samples increase while the validation error falls.]

Because it is harder for a model of fixed complexity to overfit a larger training set. With very few samples, the model has enough flexibility to fit them almost perfectly, so training error is near zero. As the number of samples grows, the model can no longer memorize every point, and the training error climbs toward the irreducible noise level while the validation error typically falls toward it from above.
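This effect is easy to reproduce with a small simulation. The sketch below (an illustration, not from the original post; the degree-5 polynomial, the sine target, and the noise level are all arbitrary choices) fits a fixed-complexity model to increasingly large noisy samples and reports the training MSE, which rises toward the noise variance:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mse(n, degree=5, noise_sd=0.3):
    """Training MSE of a fixed-complexity polynomial fit on n noisy samples."""
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + rng.normal(0, noise_sd, n)
    # Model complexity is fixed (degree 5) regardless of sample size.
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    return float(np.mean((y - pred) ** 2))

for n in [6, 20, 100, 1000]:
    # With n=6 the degree-5 polynomial interpolates the data exactly
    # (training error ~0); as n grows, the error approaches the noise
    # variance (0.3**2 = 0.09).
    print(n, train_mse(n))
```

The key point is that model capacity stays constant while the data grows, so the best achievable fit on the training set itself gets worse, which is exactly the upward slope of the training curve.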

I've copied @ŁukaszGrad's comment as an answer because the comment is, more or less, an answer to this question. This site has a dramatic gap between the number of questions and the number of answers, and at least part of the problem is that some questions are answered only in comments: if comments that answered the question were posted as answers instead, we would have fewer unanswered questions.
