Solved – How to find the optimal penalty parameter C for SVM (regression)

I am training an SVM regressor using Python's sklearn.svm.SVR.

Following the example given on the sklearn website, the line of code below defines my SVM:

svr_rbf = SVR(kernel='rbf', C=1e3, gamma=0.1) 

where C is "penalty parameter C of the error term."

My question is: how do I find the optimal value for C? Is there a method, or a library, that could help me with this?

Thank you

If you are using an RBF kernel, as in the code example, you should optimize $C$ and $\gamma$ together. The most common way to optimize hyperparameters is grid search, but exhaustively evaluating every combination becomes inefficient as the number of hyperparameters grows.
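For a small search space, grid search is easy to do directly in scikit-learn with GridSearchCV. A minimal sketch, using synthetic data as a stand-in for your own X and y (the grid values below are illustrative, not recommendations):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Toy regression data standing in for your own X, y
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Log-spaced grids are the usual choice for C and gamma
param_grid = {
    'C': [0.1, 1, 10, 100, 1000],
    'gamma': [0.001, 0.01, 0.1, 1],
}

# 5-fold cross-validation over every (C, gamma) combination
search = GridSearchCV(SVR(kernel='rbf'), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # best (C, gamma) found on the grid
print(search.best_score_)   # mean cross-validated R^2 of that model
```

Note that this evaluates all 20 combinations times 5 folds, which is exactly the scaling problem mentioned above once the grid gets finer or more hyperparameters are added.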

Principled methods to optimize hyperparameters are available in specialized libraries, such as Optunity (code example specifically for SVM here; disclaimer: I'm the lead dev of Optunity). Such libraries offer black-box function optimizers that efficiently find good sets of hyperparameters. Alternatives include Hyperopt, SMAC and Spearmint.
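If you want something more efficient than an exhaustive grid without leaving scikit-learn, RandomizedSearchCV samples hyperparameter values from distributions instead of enumerating a grid. A hedged sketch on the same kind of synthetic data (the distribution bounds and `n_iter` are illustrative assumptions, not tuned recommendations):

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV

# Toy regression data standing in for your own X, y
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Sample C and gamma log-uniformly rather than enumerating a grid
param_distributions = {
    'C': loguniform(1e-1, 1e3),
    'gamma': loguniform(1e-3, 1e1),
}

search = RandomizedSearchCV(
    SVR(kernel='rbf'),
    param_distributions,
    n_iter=30,        # number of sampled (C, gamma) configurations
    cv=5,
    random_state=0,
)
search.fit(X, y)

print(search.best_params_)  # best sampled (C, gamma)
```

The budget is fixed by `n_iter` regardless of how many hyperparameters you search over, which is one reason random search is often preferred to grid search in higher dimensions.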
