SVMSGD Class Reference
[Machine Learning]
Stochastic Gradient Descent SVM classifier. More...
#include <ml.hpp>
Inherits cv::ml::StatModel.
Public Types
enum SvmsgdType { SGD, ASGD }
 SVMSGD type.
enum MarginType { SOFT_MARGIN, HARD_MARGIN }
 Margin type.
enum Flags { RAW_OUTPUT = 1 }
 Predict options.
Public Member Functions
virtual CV_WRAP Mat getWeights ()=0
virtual CV_WRAP float getShift ()=0
virtual CV_WRAP void setOptimalParameters (int svmsgdType=SVMSGD::ASGD, int marginType=SVMSGD::SOFT_MARGIN)=0
 Function sets optimal parameter values for the chosen SVM SGD model.
virtual CV_WRAP int getSvmsgdType () const =0
 Algorithm type, one of SVMSGD::SvmsgdType.
virtual CV_WRAP void setSvmsgdType (int svmsgdType)=0
 Algorithm type, one of SVMSGD::SvmsgdType.
virtual CV_WRAP int getMarginType () const =0
 Margin type, one of SVMSGD::MarginType.
virtual CV_WRAP void setMarginType (int marginType)=0
 Margin type, one of SVMSGD::MarginType.
virtual CV_WRAP float getMarginRegularization () const =0
 Parameter marginRegularization of a SVMSGD optimization problem.
virtual CV_WRAP void setMarginRegularization (float marginRegularization)=0
 Parameter marginRegularization of a SVMSGD optimization problem.
virtual CV_WRAP float getInitialStepSize () const =0
 Parameter initialStepSize of a SVMSGD optimization problem.
virtual CV_WRAP void setInitialStepSize (float InitialStepSize)=0
 Parameter initialStepSize of a SVMSGD optimization problem.
virtual CV_WRAP float getStepDecreasingPower () const =0
 Parameter stepDecreasingPower of a SVMSGD optimization problem.
virtual CV_WRAP void setStepDecreasingPower (float stepDecreasingPower)=0
 Parameter stepDecreasingPower of a SVMSGD optimization problem.
virtual CV_WRAP TermCriteria getTermCriteria () const =0
 Termination criteria of the training algorithm.
virtual CV_WRAP void setTermCriteria (const cv::TermCriteria &val)=0
 Termination criteria of the training algorithm.
virtual CV_WRAP int getVarCount () const =0
 Returns the number of variables in training samples.
virtual CV_WRAP bool empty () const
 Returns true if the Algorithm is empty (e.g. in the very beginning or after unsuccessful read).
virtual CV_WRAP bool isTrained () const =0
 Returns true if the model is trained.
virtual CV_WRAP bool isClassifier () const =0
 Returns true if the model is a classifier.
virtual CV_WRAP bool train (const Ptr< TrainData > &trainData, int flags=0)
 Trains the statistical model.
virtual CV_WRAP bool train (InputArray samples, int layout, InputArray responses)
 Trains the statistical model.
virtual CV_WRAP float calcError (const Ptr< TrainData > &data, bool test, OutputArray resp) const
 Computes error on the training or test dataset.
virtual CV_WRAP float predict (InputArray samples, OutputArray results=noArray(), int flags=0) const =0
 Predicts response(s) for the provided sample(s).
virtual CV_WRAP void clear ()
 Clears the algorithm state.
virtual void write (FileStorage &fs) const
 Stores algorithm parameters in a file storage.
virtual void read (const FileNode &fn)
 Reads algorithm parameters from a file storage.
virtual CV_WRAP void save (const String &filename) const
 Saves the algorithm to a file.
virtual CV_WRAP String getDefaultName () const
 Returns the algorithm string identifier.
Static Public Member Functions
static CV_WRAP Ptr< SVMSGD > create ()
 Creates empty model.
template<typename _Tp >
static Ptr< _Tp > train (const Ptr< TrainData > &data, int flags=0)
 Create and train model with default parameters.
template<typename _Tp >
static Ptr< _Tp > read (const FileNode &fn)
 Reads algorithm from the file node.
template<typename _Tp >
static Ptr< _Tp > load (const String &filename, const String &objname=String())
 Loads algorithm from the file.
template<typename _Tp >
static Ptr< _Tp > loadFromString (const String &strModel, const String &objname=String())
 Loads algorithm from a String.
Detailed Description
Stochastic Gradient Descent SVM classifier.
SVMSGD provides a fast and easy-to-use implementation of the SVM classifier using the Stochastic Gradient Descent approach, as presented in [bottou2010large].
The classifier has the following parameters:
- model type,
- margin type,
- margin regularization ($\lambda$),
- initial step size ($\gamma_0$),
- step decreasing power ($c$),
- and termination criteria.
The model type may have one of the following values: SGD and ASGD.
- SGD is the classic version of the SVMSGD classifier: every next step is calculated by the formula
  $$w_{t+1} = w_t - \gamma(t) \frac{dQ_i}{dw}\Big|_{w = w_t}$$
  where
  - $w_t$ is the weights vector for the decision function at step $t$,
  - $\gamma(t)$ is the step size of the model parameters at iteration $t$; it is decreased on each step by the formula $\gamma(t) = \gamma_0 (1 + \lambda \gamma_0 t)^{-c}$,
  - $Q_i$ is the target functional from the SVM task for the sample with number $i$; this sample is chosen stochastically on each step of the algorithm.
- ASGD is the Average Stochastic Gradient Descent SVM classifier. The ASGD classifier averages the weights vector on each step of the algorithm by the formula $\widehat{w}_{t+1} = \frac{t}{1+t}\widehat{w}_{t} + \frac{1}{1+t}w_{t+1}$.
The recommended model type is ASGD (following [bottou2010large]).
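For illustration only, the following sketch implements a single SGD update step using the formulas above. It is not OpenCV's internal code: the exact form of the objective $Q_i$ (here a regularized hinge loss) and the unregularized treatment of the shift term are assumptions made for this example.

#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative sketch of one SGD step, w_{t+1} = w_t - gamma(t) * dQ_i/dw,
// assuming Q_i is the regularized hinge loss for one stochastically chosen sample.
void sgdStep(std::vector<float>& w, float& shift,
             const std::vector<float>& x, float y,   // one sample, label y in {-1, +1}
             float lambda, float gamma0, float c, int t)
{
    // step size: gamma(t) = gamma0 * (1 + lambda * gamma0 * t)^(-c)
    const float gamma = gamma0 * std::pow(1.0f + lambda * gamma0 * t, -c);

    // decision value f(x) = w * x + shift
    float f = shift;
    for (std::size_t i = 0; i < w.size(); ++i)
        f += w[i] * x[i];

    const bool marginViolated = (y * f < 1.0f);
    for (std::size_t i = 0; i < w.size(); ++i) {
        float grad = lambda * w[i];                  // gradient of the regularization term
        if (marginViolated) grad -= y * x[i];        // gradient of the hinge part
        w[i] -= gamma * grad;                        // the update formula from above
    }
    if (marginViolated) shift += gamma * y;          // shift update (assumed unregularized)
    // ASGD would additionally keep a running average:
    // w_hat_{t+1} = t/(1+t) * w_hat_t + 1/(1+t) * w_{t+1}
}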
The margin type may have one of the following values: SOFT_MARGIN or HARD_MARGIN.
- You should use HARD_MARGIN type, if you have linearly separable sets.
- You should use SOFT_MARGIN type, if you have non-linearly separable sets or sets with outliers.
- In the general case (if you know nothing about linear separability of your sets), use SOFT_MARGIN.
The other parameters may be described as follows:
- Margin regularization parameter ($\lambda$) is responsible for weights decreasing at each step and for the strength of restrictions on outliers (the smaller the parameter, the smaller the probability that an outlier will be ignored). Recommended value for the SGD model is 0.0001, for the ASGD model 0.00001.
- Initial step size parameter ($\gamma_0$) is the initial value for the step size $\gamma(t)$. You will have to find the best initial step size for your problem.
- Step decreasing power ($c$) is the power parameter for $\gamma(t)$ decreasing by the formula mentioned above. Recommended value for the SGD model is 1, for the ASGD model 0.75.
- Termination criteria can be TermCriteria::COUNT, TermCriteria::EPS or TermCriteria::COUNT + TermCriteria::EPS. You will have to find the best termination criteria for your problem.
Note that the parameters margin regularization, initial step size, and step decreasing power should be positive.
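A minimal configuration sketch using the setters and recommended ASGD values quoted above; the initial step size value here is only a placeholder, since it is problem-dependent.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

Ptr<SVMSGD> makeConfiguredSvmsgd()
{
    Ptr<SVMSGD> model = SVMSGD::create();
    model->setSvmsgdType(SVMSGD::ASGD);
    model->setMarginType(SVMSGD::SOFT_MARGIN);
    model->setMarginRegularization(0.00001f);  // lambda, recommended for ASGD
    model->setInitialStepSize(0.05f);          // gamma_0, placeholder value; tune per problem
    model->setStepDecreasingPower(0.75f);      // c, recommended for ASGD
    // stop after 100000 iterations or when the change between iterations drops below 1e-5
    model->setTermCriteria(TermCriteria(TermCriteria::COUNT + TermCriteria::EPS, 100000, 1e-5));
    return model;
}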
To use the SVMSGD algorithm, do as follows:
- first, create the SVMSGD object. The algorithm will set optimal parameters by default, but you can set your own parameters via the functions setSvmsgdType(), setMarginType(), setMarginRegularization(), setInitialStepSize(), and setStepDecreasingPower().
- then the SVM model can be trained using the training features and the corresponding labels by the method train().
- after that, the label of a new feature vector can be predicted using the method predict().
// Create empty object
cv::Ptr<SVMSGD> svmsgd = SVMSGD::create();
// Train the Stochastic Gradient Descent SVM
svmsgd->train(trainData);
// Predict labels for the new samples
svmsgd->predict(samples, responses);
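A slightly fuller sketch of the same flow, showing how the trainData, samples and responses used above might be prepared; the data values are placeholders chosen for illustration only.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

int main()
{
    // Toy data (placeholder values): 4 samples with 2 features each, labels -1/+1
    Mat samples = (Mat_<float>(4, 2) << 0.f, 0.f,  0.f, 1.f,  1.f, 0.f,  1.f, 1.f);
    Mat labels  = (Mat_<float>(4, 1) << -1.f, -1.f, 1.f, 1.f);

    Ptr<TrainData> trainData = TrainData::create(samples, ROW_SAMPLE, labels);

    Ptr<SVMSGD> svmsgd = SVMSGD::create();   // optimal (ASGD) parameters are set by default
    svmsgd->train(trainData);

    Mat responses;
    svmsgd->predict(samples, responses);     // one predicted label per input row
    return 0;
}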
Definition at line 1584 of file ml.hpp.
Member Enumeration Documentation
enum Flags [inherited]
enum MarginType
enum SvmsgdType
Member Function Documentation
virtual CV_WRAP float calcError (const Ptr< TrainData > & data, bool test, OutputArray resp) const [virtual, inherited]
Computes error on the training or test dataset.
- Parameters:
  data: the training data
  test: if true, the error is computed over the test subset of the data, otherwise it is computed over the training subset of the data. Please note that if you loaded a completely different dataset to evaluate an already trained classifier, you will probably want not to set the test subset at all with TrainData::setTrainTestSplitRatio and specify test=false, so that the error is computed for the whole new set. Yes, this sounds a bit confusing.
  resp: the optional output responses.
The method uses StatModel::predict to compute the error. For regression models the error is computed as RMS, for classifiers as a percentage of misclassified samples (0-100%).
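A usage sketch, assuming the caller provides CV_32F sample and label matrices: split the data into train/test subsets and report both errors.

#include <iostream>
#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

// Evaluate an SVMSGD classifier on a held-out subset of the data.
void evaluate(const Mat& samples, const Mat& labels)
{
    Ptr<TrainData> data = TrainData::create(samples, ROW_SAMPLE, labels);
    data->setTrainTestSplitRatio(0.8, /*shuffle=*/true);   // keep 20% of samples for testing

    Ptr<SVMSGD> model = SVMSGD::create();
    model->train(data);                                    // trains on the train subset

    // for classifiers, calcError returns the percentage of misclassified samples (0-100)
    float trainError = model->calcError(data, false, noArray());
    float testError  = model->calcError(data, true,  noArray());
    std::cout << "train error: " << trainError << "%, test error: " << testError << "%\n";
}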
virtual CV_WRAP void clear () [virtual, inherited]
Clears the algorithm state.
Reimplemented in DescriptorMatcher, and FlannBasedMatcher.
static CV_WRAP Ptr< SVMSGD > create () [static]
Creates empty model.
Use StatModel::train to train the model. Since SVMSGD has several parameters, you may want to find the best parameters for your problem or use setOptimalParameters() to set some default parameters.
virtual CV_WRAP bool empty () const [virtual, inherited]
Returns true if the Algorithm is empty (e.g. in the very beginning or after unsuccessful read).
virtual CV_WRAP String getDefaultName () const [virtual, inherited]
Returns the algorithm string identifier.
This string is used as top level xml/yml node tag when the object is saved to a file or string.
virtual CV_WRAP float getInitialStepSize () const [pure virtual]
Parameter initialStepSize of a SVMSGD optimization problem.
- See also:
- setInitialStepSize
virtual CV_WRAP float getMarginRegularization () const [pure virtual]
Parameter marginRegularization of a SVMSGD optimization problem.
- See also:
- setMarginRegularization
virtual CV_WRAP int getMarginType () const [pure virtual]
Margin type, one of SVMSGD::MarginType.
- See also:
- setMarginType
virtual CV_WRAP float getShift () [pure virtual]
- Returns:
- the shift of the trained model (decision function f(x) = weights * x + shift).
virtual CV_WRAP float getStepDecreasingPower () const [pure virtual]
Parameter stepDecreasingPower of a SVMSGD optimization problem.
- See also:
- setStepDecreasingPower
virtual CV_WRAP int getSvmsgdType () const [pure virtual]
Algorithm type, one of SVMSGD::SvmsgdType.
- See also:
- setSvmsgdType
virtual CV_WRAP TermCriteria getTermCriteria () const [pure virtual]
Termination criteria of the training algorithm.
You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon).
- See also:
- setTermCriteria
virtual CV_WRAP int getVarCount () const [pure virtual, inherited]
Returns the number of variables in training samples.
virtual CV_WRAP Mat getWeights () [pure virtual]
- Returns:
- the weights of the trained model (decision function f(x) = weights * x + shift).
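For illustration, a short sketch that reproduces the decision function from these accessors; it assumes getWeights() returns a 1 x nFeatures CV_32F row and that the sample has the same size and type.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

// f(x) = weights * x + shift, computed manually from the trained model
float decisionValue(const Ptr<SVMSGD>& model, const Mat& sample)
{
    Mat weights = model->getWeights();                 // weights of the decision function
    float shift = model->getShift();                   // shift (bias) of the decision function
    return static_cast<float>(weights.dot(sample)) + shift;
}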
virtual CV_WRAP bool isClassifier () const [pure virtual, inherited]
Returns true if the model is a classifier.
virtual CV_WRAP bool isTrained () const [pure virtual, inherited]
Returns true if the model is trained.
static Ptr< _Tp > load (const String & filename, const String & objname = String()) [static, inherited]
Loads algorithm from the file.
- Parameters:
  filename: Name of the file to read.
  objname: The optional name of the node to read (if empty, the first top-level node will be used).
This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):
Ptr<SVM> svm = Algorithm::load<SVM>("my_svm_model.xml");
In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn).
static Ptr< _Tp > loadFromString (const String & strModel, const String & objname = String()) [static, inherited]
Loads algorithm from a String.
- Parameters:
  strModel: The string variable containing the model you want to load.
  objname: The optional name of the node to read (if empty, the first top-level node will be used).
This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):
Ptr<SVM> svm = Algorithm::loadFromString<SVM>(myStringModel);
virtual CV_WRAP float predict (InputArray samples, OutputArray results = noArray(), int flags = 0) const [pure virtual, inherited]
Predicts response(s) for the provided sample(s)
- Parameters:
  samples: The input samples, floating-point matrix.
  results: The optional output matrix of results.
  flags: The optional flags, model-dependent. See cv::ml::StatModel::Flags.
Implemented in LogisticRegression.
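A hedged sketch of calling predict with and without flags; the assumption here (not stated on this page beyond the RAW_OUTPUT enumerator) is that passing StatModel::RAW_OUTPUT makes the classifier return raw decision values instead of class labels.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

void predictBoth(const Ptr<SVMSGD>& model, const Mat& samples)
{
    Mat labels, rawValues;
    model->predict(samples, labels);                             // predicted class labels
    model->predict(samples, rawValues, StatModel::RAW_OUTPUT);   // raw outputs (assumed behaviour of the flag)
}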
static Ptr< _Tp > read (const FileNode & fn) [static, inherited]
Reads algorithm from the file node.
This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):
cv::FileStorage fsRead("example.xml", FileStorage::READ);
Ptr<SVM> svm = Algorithm::read<SVM>(fsRead.root());
In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn) and also have a static create() method without parameters (or with all the optional parameters).
Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.
virtual void read (const FileNode & fn) [virtual, inherited]
Reads algorithm parameters from a file storage.
Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.
virtual CV_WRAP void save (const String & filename) const [virtual, inherited]
Saves the algorithm to a file.
In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).
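A small round-trip sketch combining save() with the static load() described elsewhere on this page; the file name is a placeholder.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

void saveAndReload(const Ptr<SVMSGD>& trained)
{
    trained->save("svmsgd_model.xml");                                   // placeholder file name
    Ptr<SVMSGD> restored = Algorithm::load<SVMSGD>("svmsgd_model.xml");  // read it back
    CV_Assert(restored->isTrained());
}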
virtual CV_WRAP void setInitialStepSize (float InitialStepSize) [pure virtual]
Parameter initialStepSize of a SVMSGD optimization problem.
- See also:
- getInitialStepSize
virtual CV_WRAP void setMarginRegularization (float marginRegularization) [pure virtual]
Parameter marginRegularization of a SVMSGD optimization problem.
- See also:
- getMarginRegularization
virtual CV_WRAP void setMarginType (int marginType) [pure virtual]
Margin type, one of SVMSGD::MarginType.
- See also:
- getMarginType
virtual CV_WRAP void setOptimalParameters (int svmsgdType = SVMSGD::ASGD, int marginType = SVMSGD::SOFT_MARGIN) [pure virtual]
Function sets optimal parameter values for the chosen SVM SGD model.
virtual CV_WRAP void setStepDecreasingPower (float stepDecreasingPower) [pure virtual]
Parameter stepDecreasingPower of a SVMSGD optimization problem.
- See also:
- getStepDecreasingPower
virtual CV_WRAP void setSvmsgdType (int svmsgdType) [pure virtual]
Algorithm type, one of SVMSGD::SvmsgdType.
- See also:
- getSvmsgdType
virtual CV_WRAP void setTermCriteria (const cv::TermCriteria & val) [pure virtual]
Termination criteria of the training algorithm.
- See also:
- getTermCriteria
virtual CV_WRAP bool train (InputArray samples, int layout, InputArray responses) [virtual, inherited]
Trains the statistical model.
- Parameters:
  samples: training samples
  layout: See ml::SampleTypes.
  responses: vector of responses associated with the training samples.
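A minimal sketch of this overload, assuming row-major samples (ROW_SAMPLE layout) and one label per row.

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

void trainFromMatrices(const Ptr<SVMSGD>& model, const Mat& samples, const Mat& labels)
{
    CV_Assert(samples.type() == CV_32F);            // training samples must be floating point
    model->train(samples, ROW_SAMPLE, labels);      // each row of samples is one training sample
}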
virtual CV_WRAP bool train (const Ptr< TrainData > & trainData, int flags = 0) [virtual, inherited]
Trains the statistical model.
- Parameters:
  trainData: training data that can be loaded from file using TrainData::loadFromCSV or created with TrainData::create.
  flags: optional flags, depending on the model. Some of the models can be updated with the new training samples, not completely overwritten (such as NormalBayesClassifier or ANN_MLP).
virtual void write (FileStorage & fs) const [virtual, inherited]
Stores algorithm parameters in a file storage.
Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.