Where did the term “learn a model” come from?

Often I have heard the data miners here use this term. As a statistician who has worked on classification problems, I am familiar with the term "train a classifier", and I assume "learn a model" means the same thing. I don't mind the term "train a classifier": it conveys the idea of fitting a model, in that the training data are used to obtain good or "improved" estimates of the model parameters. But the word "learn" means to gain knowledge, so in plain English "learn a model" would mean to know what it is. In fact we never "know" the model. Models approximate reality, but no model is correct; as Box put it, no model is correct, but some are useful.

I would be interested to hear the data miners' response. How did the term originate? If you use it, why do you like it?

I suspect its origins lie in the artificial neural network research community, where a neural network can be thought of as learning a model of the data by modifying its synaptic weights, much as the human brain does as we ourselves learn from experience. My research career started out in artificial neural networks, so I sometimes use the phrase.

Perhaps it makes more sense if you think of the model as being encoded in its parameters rather than in the equation, in the same way that a mental model is not an identifiable physical component of the brain so much as a set of parameter settings for some of our neurons.

Note that there is no implication that a mental model is necessarily correct either!
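To make that concrete, here is a minimal sketch in Python (the choice of scikit-learn and the toy data are illustrative assumptions, not part of the original discussion): whether you call it training, fitting, or learning, the operation is estimating parameters from data, and everything the classifier has "learned" is just those fitted parameter values.

    # A toy illustration (assumes NumPy and scikit-learn are installed;
    # neither is part of the original discussion).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))              # two made-up features
    y = (X[:, 0] + X[:, 1] > 0).astype(int)    # a toy labelling rule

    clf = LogisticRegression()
    clf.fit(X, y)   # call it "training", "fitting", or "learning" - same step

    # Everything the classifier has "learned" is encoded in these numbers:
    print(clf.coef_, clf.intercept_)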
