Support Vector Regression (SVR) is a data-driven machine learning method that extends support vector classification to regression tasks through the introduction of the \(\varepsilon\)-insensitive loss function.
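As a quick illustration of the \(\varepsilon\)-insensitive loss mentioned above, the following minimal Python sketch (the function name and default \(\varepsilon\) are my own, not from any cited paper) computes the loss for a single prediction: deviations inside the \(\varepsilon\)-tube cost nothing, and larger deviations are penalized linearly.

```python
def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Epsilon-insensitive loss: max(0, |y_true - y_pred| - eps).

    Errors within the epsilon tube incur zero loss; deviations
    beyond the tube are penalized linearly in their excess.
    """
    return max(0.0, abs(y_true - y_pred) - eps)

# A prediction within the tube is "free"; one outside it is not:
# eps_insensitive_loss(1.0, 1.05) -> 0.0
# eps_insensitive_loss(1.0, 1.50) -> 0.4
```

This tube is what distinguishes SVR from ordinary least-squares regression, where every deviation, however small, contributes to the loss.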
The parameter \(C\) controls the trade-off between the complexity of the regression function and the degree to which errors larger than \(\varepsilon\) are tolerated. The kernel parameter \(\sigma\) governs the mapping of the input data into the feature space and thereby the complexity of the model. It is therefore important to select suitable values for both parameters, and \(\sigma\) should be chosen even more carefully than \(C\).
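To make \(\sigma\)'s role concrete, here is a small pure-Python sketch of the Gaussian (RBF) kernel that \(\sigma\) parameterizes; the function name and the example points are illustrative, not taken from the cited papers.

```python
import math

def rbf_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel: k(x, z) = exp(-||x - z||^2 / (2 * sigma^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

# A small sigma makes the kernel decay quickly with distance, giving a
# very flexible model that risks overfitting; a large sigma makes the
# kernel vary slowly, giving a smoother, simpler model:
# rbf_kernel([0.0], [1.0], sigma=0.5)  is much smaller than
# rbf_kernel([0.0], [1.0], sigma=2.0)
```

This sensitivity of the kernel shape to \(\sigma\) is why metaheuristic searches such as the sine cosine algorithm or WOA in the references below are applied to tune \((C, \sigma)\) jointly rather than fixing \(\sigma\) by hand.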
Li, S., Fang, H. & Liu, X., 2018. Parameter optimization of support vector regression based on sine cosine algorithm. Expert Systems with Applications, 91, pp.63–77. Available at: http://dx.doi.org/10.1016/j.eswa.2017.08.038.
Li, S. & Fang, H., 2017. A WOA-based algorithm for parameter optimization of support vector regression and its application to condition prognostics. 2017 36th Chinese Control Conference (CCC). Available at: http://dx.doi.org/10.23919/chicc.2017.8028516.
The MATLAB samples for SVM parameter selection can be found in my Source repository.
To be continued...