SVM
Support Vector Machines.
See
REF: ml_intro_svm
Member of Ml
-
-
Retrieves all the uncompressed support vectors of a linear SVM.
The method returns all the uncompressed support vectors of a linear SVM that the compressed support vectors, used for prediction, were derived from. They are returned in a floating-point matrix, with the support vectors stored as matrix rows.
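For illustration, a minimal sketch of reading these vectors back from an already trained linear SVM; the member name getUncompressedSupportVectors() is assumed from the C++ API, since its declaration is not shown on this page:
// Assumption: the wrapper exposes getUncompressedSupportVectors() returning a Mat,
// with one uncompressed support vector per row; svm is a trained linear SVM.
let uncompressed = svm.getUncompressedSupportVectors()
print("uncompressed support vectors:", uncompressed.rows())
-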
Generates a grid for SVM parameters.
Declaration
Objective-C
+ (nonnull ParamGrid *)getDefaultGridPtr:(int)param_id;
Swift
class func getDefaultGridPtr(param_id: Int32) -> ParamGrid
Parameters
param_id - SVM parameter ID; must be one of the SVM::ParamTypes. The grid is generated for the parameter with this ID.
The function generates a grid pointer for the specified parameter of the SVM algorithm. The grid may be passed to the function SVM::trainAuto.
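For illustration, a minimal sketch of generating default grids; the parameter-type constants SVM.C and SVM.GAMMA are assumed to be the Swift counterparts of the C++ SVM::ParamTypes values:
// Assumption: SVM.C and SVM.GAMMA mirror the C++ SVM::ParamTypes constants.
let cGrid = SVM.getDefaultGridPtr(param_id: SVM.C)
let gammaGrid = SVM.getDefaultGridPtr(param_id: SVM.GAMMA)
The resulting grids can then be passed to SVM::trainAuto, as shown in the trainAuto example further below.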
-
Creates an empty model. Use StatModel::train to train the model. Since SVM has several parameters, you may want to find the best ones for your problem; this can be done with SVM::trainAuto.
Declaration
Objective-C
+ (nonnull SVM *)create;
Swift
class func create() -> SVM
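A minimal usage sketch, assuming trainSamples and trainResponses are already populated Mat objects, Ml.ROW_SAMPLE is the row-layout constant from ml::SampleTypes, SVM.C_SVC and SVM.RBF are the Swift counterparts of the C++ type and kernel constants, and train(samples:layout:responses:) is the overload inherited from StatModel:
let svm = SVM.create()
svm.setType(val: SVM.C_SVC)          // assumption: C-support vector classification constant
svm.setKernel(kernelType: SVM.RBF)   // assumption: RBF kernel constant
svm.setC(val: 1.0)
let trained = svm.train(samples: trainSamples, layout: Ml.ROW_SAMPLE, responses: trainResponses)
-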
Loads and creates a serialized SVM from a file.
Use SVM::save to serialize and store an SVM to disk. Load the SVM from this file again by calling this function with the path to the file.
Declaration
Objective-C
+ (nonnull SVM *)load:(nonnull NSString *)filepath;
Swift
class func load(filepath: String) -> SVM
Parameters
filepath - path to the serialized SVM
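For illustration, a minimal round trip, assuming svm is a previously trained SVM instance; "svm.xml" is a hypothetical path, and save(filename:) is assumed to be the serialization method inherited from Algorithm:
// Assumption: save(filename:) is inherited from Algorithm and writes the model to disk.
svm.save(filename: "svm.xml")
let restored = SVM.load(filepath: "svm.xml")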
-
Declaration
Objective-C
- (nonnull TermCriteria *)getTermCriteria;
Swift
func getTermCriteria() -> TermCriteria
-
Trains an SVM with optimal parameters.
Declaration
Objective-C
- (BOOL)trainAuto:(nonnull Mat *)samples layout:(int)layout responses:(nonnull Mat *)responses kFold:(int)kFold Cgrid:(nonnull ParamGrid *)Cgrid gammaGrid:(nonnull ParamGrid *)gammaGrid pGrid:(nonnull ParamGrid *)pGrid nuGrid:(nonnull ParamGrid *)nuGrid coeffGrid:(nonnull ParamGrid *)coeffGrid degreeGrid:(nonnull ParamGrid *)degreeGrid balanced:(BOOL)balanced;
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
pGrid - grid for p
nuGrid - grid for nu
coeffGrid - grid for coeff
degreeGrid - grid for degree
balanced - If true and the problem is 2-class classification, the method creates more balanced cross-validation subsets, that is, the proportions between classes in the subsets are close to the proportions in the whole train dataset.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
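A minimal sketch of the call above, assuming trainSamples and trainResponses are pre-filled Mat objects, Ml.ROW_SAMPLE comes from ml::SampleTypes, and SVM.C, SVM.GAMMA, SVM.P, SVM.NU, SVM.COEF, SVM.DEGREE mirror the C++ SVM::ParamTypes constants; the Swift selector names are inferred from the Objective-C declaration above:
let svm = SVM.create()
svm.setType(val: SVM.C_SVC)        // assumption: classification type constant
svm.setKernel(kernelType: SVM.RBF) // assumption: RBF kernel constant
let ok = svm.trainAuto(
    samples: trainSamples, layout: Ml.ROW_SAMPLE, responses: trainResponses,
    kFold: 10,
    Cgrid: SVM.getDefaultGridPtr(param_id: SVM.C),
    gammaGrid: SVM.getDefaultGridPtr(param_id: SVM.GAMMA),
    pGrid: SVM.getDefaultGridPtr(param_id: SVM.P),
    nuGrid: SVM.getDefaultGridPtr(param_id: SVM.NU),
    coeffGrid: SVM.getDefaultGridPtr(param_id: SVM.COEF),
    degreeGrid: SVM.getDefaultGridPtr(param_id: SVM.DEGREE),
    balanced: true)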
-
Trains an SVM with optimal parameters.
Declaration
Objective-C
- (BOOL)trainAuto:(nonnull Mat *)samples layout:(int)layout responses:(nonnull Mat *)responses kFold:(int)kFold Cgrid:(nonnull ParamGrid *)Cgrid gammaGrid:(nonnull ParamGrid *)gammaGrid pGrid:(nonnull ParamGrid *)pGrid nuGrid:(nonnull ParamGrid *)nuGrid coeffGrid:(nonnull ParamGrid *)coeffGrid degreeGrid:(nonnull ParamGrid *)degreeGrid;
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
pGrid - grid for p
nuGrid - grid for nu
coeffGrid - grid for coeff
degreeGrid - grid for degree
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Objective-C
- (BOOL)trainAuto:(nonnull Mat *)samples layout:(int)layout responses:(nonnull Mat *)responses kFold:(int)kFold Cgrid:(nonnull ParamGrid *)Cgrid gammaGrid:(nonnull ParamGrid *)gammaGrid pGrid:(nonnull ParamGrid *)pGrid nuGrid:(nonnull ParamGrid *)nuGrid coeffGrid:(nonnull ParamGrid *)coeffGrid;
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
pGrid - grid for p
nuGrid - grid for nu
coeffGrid - grid for coeff
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
pGrid - grid for p
nuGrid - grid for nu
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
pGrid - grid for p
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
gammaGrid - grid for gamma
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
Cgrid - grid for C
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
kFold - Cross-validation parameter. The training set is divided into kFold subsets. One subset is used to test the model, the others form the train set. So, the SVM algorithm is executed kFold times.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
Trains an SVM with optimal parameters.
Declaration
Parameters
samples - training samples
layout - See ml::SampleTypes.
responses - vector of responses associated with the training samples.
The method trains the SVM model automatically by choosing the optimal parameters C, gamma, p, nu, coef0, degree. Parameters are considered optimal when the cross-validation estimate of the test set error is minimal.
This function only makes use of SVM::getDefaultGrid for parameter optimization and thus only offers rudimentary parameter options.
This function works for the classification (SVM::C_SVC or SVM::NU_SVC) as well as for the regression (SVM::EPS_SVR or SVM::NU_SVR). If it is SVM::ONE_CLASS, no optimization is made and the usual SVM with parameters specified in params is executed.
-
See
-setC:
Declaration
Objective-C
- (double)getC;
Swift
func getC() -> Double
-
See
-setCoef0:
Declaration
Objective-C
- (double)getCoef0;
Swift
func getCoef0() -> Double
-
Retrieves the decision function.
Declaration
Parameters
i - the index of the decision function. If the problem solved is regression, 1-class or 2-class classification, then there will be just one decision function and the index should always be 0. Otherwise, in the case of N-class classification, there will be N(N-1)/2 decision functions.
alpha - the optional output vector for weights, corresponding to different support vectors. In the case of linear SVM all the alpha's will be 1's.
svidx - the optional output vector of indices of support vectors within the matrix of support vectors (which can be retrieved by SVM::getSupportVectors). In the case of linear SVM each decision function consists of a single "compressed" support vector.
The method returns the rho parameter of the decision function, a scalar subtracted from the weighted sum of kernel responses.
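For illustration, a minimal sketch of querying the first decision function of an already trained svm; the Swift mapping getDecisionFunction(i:alpha:svidx:) is assumed from the C++ getDecisionFunction(i, alpha, svidx), since the declaration is not shown on this page:
// Assumption: alpha and svidx are returned as output Mats, and the method returns rho.
let alpha = Mat()
let svidx = Mat()
let rho = svm.getDecisionFunction(i: 0, alpha: alpha, svidx: svidx)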
-
See
-setDegree:
Declaration
Objective-C
- (double)getDegree;
Swift
func getDegree() -> Double
-
See
-setGamma:
Declaration
Objective-C
- (double)getGamma;
Swift
func getGamma() -> Double
-
See
-setNu:
Declaration
Objective-C
- (double)getNu;
Swift
func getNu() -> Double
-
See
-setP:
Declaration
Objective-C
- (double)getP;
Swift
func getP() -> Double
-
Type of an SVM kernel. See SVM::KernelTypes. Default value is SVM::RBF.
Declaration
Objective-C
- (int)getKernelType;
Swift
func getKernelType() -> Int32
-
See
-setType:
Declaration
Objective-C
- (int)getType;
Swift
func getType() -> Int32
-
getC - see:
-getC:
Declaration
Objective-C
- (void)setC:(double)val;
Swift
func setC(val: Double)
-
getCoef0 - see:
-getCoef0:
Declaration
Objective-C
- (void)setCoef0:(double)val;
Swift
func setCoef0(val: Double)
-
getDegree - see:
-getDegree:
Declaration
Objective-C
- (void)setDegree:(double)val;
Swift
func setDegree(val: Double)
-
getGamma - see:
-getGamma:
Declaration
Objective-C
- (void)setGamma:(double)val;
Swift
func setGamma(val: Double)
-
Initialize with one of the predefined kernels. See SVM::KernelTypes.
Declaration
Objective-C
- (void)setKernel:(int)kernelType;
Swift
func setKernel(kernelType: Int32)
-
getNu - see:
-getNu:
Declaration
Objective-C
- (void)setNu:(double)val;
Swift
func setNu(val: Double)
-
getP - see:
-getP:
Declaration
Objective-C
- (void)setP:(double)val;
Swift
func setP(val: Double)
-
getTermCriteria - see:
-getTermCriteria:
Declaration
Objective-C
- (void)setTermCriteria:(nonnull TermCriteria *)val;
Swift
func setTermCriteria(val: TermCriteria)
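For illustration, a minimal sketch of tightening the solver's termination criteria on an existing svm instance; the TermCriteria(type:maxCount:epsilon:) initializer is assumed to follow the core module's conventions:
// Assumption: 3 == TermCriteria::MAX_ITER + TermCriteria::EPS in the C++ core module,
// so the solver stops after 1000 iterations or when the accuracy reaches 1e-6.
let crit = TermCriteria(type: 3, maxCount: 1000, epsilon: 1e-6)
svm.setTermCriteria(val: crit)
-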
getType - see:
-getType:
Declaration
Objective-C
- (void)setType:(int)val;
Swift
func setType(val: Int32)