I am training an SVM regressor using Python's `sklearn.svm.SVR`.

Following the example given on the sklearn website, the line of code below defines my SVM:

`svr_rbf = SVR(kernel='rbf', C=1e3, gamma=0.1)`

where C is "penalty parameter C of the error term."

My question is: how do I find the optimal value for C? Is there a method or library that could help me with this?

Thank you


#### Best Answer

If you are using an RBF kernel as in the code example, you should be optimizing $C$ and $\gamma$ *together*. The most common way to optimize hyperparameters is grid search, but grid search is inefficient: its cost grows exponentially with the number of hyperparameters, and it wastes evaluations in regions that early results already show to be poor.
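As a baseline, here is a minimal sketch of a joint grid search over $C$ and $\gamma$ using scikit-learn's `GridSearchCV`; the toy data and the grid values are illustrative assumptions, not recommendations:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Toy regression data (illustrative only): noisy sine curve
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Search C and gamma jointly on a log-spaced grid (values are a guess,
# not a tuned recommendation)
param_grid = {
    "C": [0.1, 1, 10, 100, 1000],
    "gamma": [0.001, 0.01, 0.1, 1],
}
search = GridSearchCV(SVR(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # best (C, gamma) pair found on the grid
```

`GridSearchCV` evaluates every combination with cross-validation, so the cost here is 5 × 4 × 5 = 100 model fits; this is exactly the inefficiency mentioned above.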

Principled methods to optimize hyperparameters are available in specialized libraries, such as Optunity (code example specifically for SVM here; disclaimer: I'm the lead developer of Optunity). Such libraries offer specialized black-box function optimizers that efficiently find good sets of hyperparameters. Alternatives include Hyperopt, SMAC and Spearmint.
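If you want to stay within scikit-learn, `RandomizedSearchCV` with log-uniform sampling is a simple step up from grid search: it samples the joint $(C, \gamma)$ space rather than enumerating a fixed grid. A sketch, with the same illustrative toy data and assumed search ranges as before:

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV

# Toy regression data (illustrative only)
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Sample C and gamma log-uniformly; the ranges are assumptions to adapt
param_distributions = {
    "C": loguniform(1e-1, 1e3),
    "gamma": loguniform(1e-3, 1e0),
}
search = RandomizedSearchCV(
    SVR(kernel="rbf"),
    param_distributions,
    n_iter=30,       # fixed budget of 30 sampled configurations
    cv=5,
    random_state=0,
)
search.fit(X, y)

print(search.best_params_)
```

The budget (`n_iter`) is fixed regardless of how many hyperparameters you add, which is the main practical advantage over grid search; the dedicated libraries mentioned above go further by using past evaluations to guide where to sample next.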
