Renesas / opencv-lib

Dependents:   RZ_A2M_Mbed_samples

SVMSGD Class Reference

Stochastic Gradient Descent SVM classifier. More...

#include <ml.hpp>

Inherits cv::ml::StatModel.

Public Types

enum  SvmsgdType { SGD, ASGD }
 SVMSGD type.

enum  MarginType { SOFT_MARGIN, HARD_MARGIN }
 Margin type.

enum  Flags { RAW_OUTPUT = 1 }
 Predict options.

Public Member Functions

virtual CV_WRAP Mat getWeights ()=0
virtual CV_WRAP float getShift ()=0
virtual CV_WRAP void setOptimalParameters (int svmsgdType=SVMSGD::ASGD, int marginType=SVMSGD::SOFT_MARGIN)=0
 Sets optimal parameter values for the chosen SVMSGD model.
virtual CV_WRAP int getSvmsgdType () const =0
 Algorithm type, one of SVMSGD::SvmsgdType.
virtual CV_WRAP void setSvmsgdType (int svmsgdType)=0
 Algorithm type, one of SVMSGD::SvmsgdType.
virtual CV_WRAP int getMarginType () const =0
 Margin type, one of SVMSGD::MarginType.
virtual CV_WRAP void setMarginType (int marginType)=0
 Margin type, one of SVMSGD::MarginType.
virtual CV_WRAP float getMarginRegularization () const =0
 Parameter marginRegularization of a SVMSGD optimization problem.
virtual CV_WRAP void setMarginRegularization (float marginRegularization)=0
 Parameter marginRegularization of a SVMSGD optimization problem.
virtual CV_WRAP float getInitialStepSize () const =0
 Parameter initialStepSize of a SVMSGD optimization problem.
virtual CV_WRAP void setInitialStepSize (float InitialStepSize)=0
 Parameter initialStepSize of a SVMSGD optimization problem.
virtual CV_WRAP float getStepDecreasingPower () const =0
 Parameter stepDecreasingPower of a SVMSGD optimization problem.
virtual CV_WRAP void setStepDecreasingPower (float stepDecreasingPower)=0
 Parameter stepDecreasingPower of a SVMSGD optimization problem.
virtual CV_WRAP TermCriteria getTermCriteria () const =0
 Termination criteria of the training algorithm.
virtual CV_WRAP void setTermCriteria (const cv::TermCriteria &val)=0
 Termination criteria of the training algorithm.
virtual CV_WRAP int getVarCount () const =0
 Returns the number of variables in training samples.
virtual CV_WRAP bool empty () const
 Returns true if the Algorithm is empty (e.g. in the very beginning or after an unsuccessful read).
virtual CV_WRAP bool isTrained () const =0
 Returns true if the model is trained.
virtual CV_WRAP bool isClassifier () const =0
 Returns true if the model is a classifier.
virtual CV_WRAP bool train (const Ptr< TrainData > &trainData, int flags=0)
 Trains the statistical model.
virtual CV_WRAP bool train (InputArray samples, int layout, InputArray responses)
 Trains the statistical model.
virtual CV_WRAP float calcError (const Ptr< TrainData > &data, bool test, OutputArray resp) const
 Computes error on the training or test dataset.
virtual CV_WRAP float predict (InputArray samples, OutputArray results=noArray(), int flags=0) const =0
 Predicts response(s) for the provided sample(s).
virtual CV_WRAP void clear ()
 Clears the algorithm state.
virtual void write (FileStorage &fs) const
 Stores algorithm parameters in a file storage.
virtual void read (const FileNode &fn)
 Reads algorithm parameters from a file storage.
virtual CV_WRAP void save (const String &filename) const
 Saves the algorithm to a file.
virtual CV_WRAP String getDefaultName () const
 Returns the algorithm string identifier.

Static Public Member Functions

static CV_WRAP Ptr< SVMSGD > create ()
 Creates empty model.
template<typename _Tp >
static Ptr< _Tp > train (const Ptr< TrainData > &data, int flags=0)
 Create and train model with default parameters.
template<typename _Tp >
static Ptr< _Tp > read (const FileNode &fn)
 Reads algorithm from the file node.
template<typename _Tp >
static Ptr< _Tp > load (const String &filename, const String &objname=String())
 Loads algorithm from the file.
template<typename _Tp >
static Ptr< _Tp > loadFromString (const String &strModel, const String &objname=String())
 Loads algorithm from a String.

Detailed Description

Stochastic Gradient Descent SVM classifier.

SVMSGD provides a fast and easy-to-use implementation of the SVM classifier using the Stochastic Gradient Descent approach, as presented in bottou2010large.

The classifier has the following parameters:

  • model type,
  • margin type,
  • margin regularization ( $\lambda$),
  • initial step size ( $\gamma_0$),
  • step decreasing power ( $c$),
  • and termination criteria.

The model type may have one of the following values: SGD and ASGD.

  • SGD is the classic version of the SVMSGD classifier: each step is calculated by the formula

    \[w_{t+1} = w_t - \gamma(t) \frac{dQ_i}{dw} |_{w = w_t}\]

    where
    • $w_t$ is the weights vector of the decision function at step $t$,
    • $\gamma(t)$ is the step size of the model parameters at iteration $t$; it is decreased at each step by the formula $\gamma(t) = \gamma_0 (1 + \lambda \gamma_0 t) ^ {-c}$,
    • $Q_i$ is the target functional of the SVM task for the sample with index $i$; this sample is chosen stochastically at each step of the algorithm.
  • ASGD is the Average Stochastic Gradient Descent SVM classifier. It averages the weights vector at each step of the algorithm by the formula $\widehat{w}_{t+1} = \frac{t}{1+t}\widehat{w}_{t} + \frac{1}{1+t}w_{t+1}$.

The recommended model type is ASGD (following bottou2010large).
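The step-size decay and ASGD averaging formulas above can be sketched in plain C++. This is an illustrative stand-in, not the OpenCV implementation; the function names and a one-dimensional weight are simplifications for the sake of the example:

```cpp
#include <cmath>

// gamma(t) = gamma0 * (1 + lambda * gamma0 * t)^(-c)
// (the step-size decay used by both SGD and ASGD)
double stepSize(double gamma0, double lambda, double c, int t) {
    return gamma0 * std::pow(1.0 + lambda * gamma0 * t, -c);
}

// ASGD running average of the weights (1-D weight for brevity):
// w_hat_{t+1} = t/(1+t) * w_hat_t + 1/(1+t) * w_{t+1}
double asgdAverage(double avgW, double newW, int t) {
    return (double(t) / (1 + t)) * avgW + (1.0 / (1 + t)) * newW;
}
```

Note that at $t = 0$ the step size is exactly $\gamma_0$, and the ASGD average at $t = 0$ simply takes the new weight.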

The margin type may have one of the following values: SOFT_MARGIN or HARD_MARGIN.

  • Use the HARD_MARGIN type if your sets are linearly separable.
  • Use the SOFT_MARGIN type if your sets are not linearly separable or contain outliers.
  • In the general case (if you know nothing about the linear separability of your sets), use SOFT_MARGIN.

The other parameters may be described as follows:

  • The margin regularization parameter controls both the weight decay at each step and the strength of the restrictions on outliers (the smaller the parameter, the lower the probability that an outlier is ignored). The recommended value is 0.0001 for the SGD model and 0.00001 for the ASGD model.
  • The initial step size parameter is the initial value of the step size $\gamma(t)$. You will have to find the best initial step size for your problem.
  • The step decreasing power is the power parameter $c$ in the decay formula for $\gamma(t)$ given above. The recommended value is 1 for the SGD model and 0.75 for the ASGD model.

Note that the parameters margin regularization, initial step size, and step decreasing power should be positive.

To use the SVMSGD algorithm, do as follows:

  • first, create an SVMSGD object with SVMSGD::create(); default parameters can be set with setOptimalParameters(),
  • then the SVM model can be trained on the training features and the corresponding labels with the method train(),
  • after that, the label of a new feature vector can be predicted with the method predict().
// Create empty object
cv::Ptr<SVMSGD> svmsgd = SVMSGD::create();

// Train the Stochastic Gradient Descent SVM
svmsgd->train(trainData);

// Predict labels for the new samples
svmsgd->predict(samples, responses);

Definition at line 1584 of file ml.hpp.


Member Enumeration Documentation

enum Flags [inherited]

Predict options.

Enumerator:
RAW_OUTPUT 

makes the method return the raw results (the sum), not the class label

Reimplemented in DTrees.

Definition at line 303 of file ml.hpp.

enum MarginType

Margin type.

Enumerator:
SOFT_MARGIN 

General case: suits non-linearly separable sets and allows outliers.

HARD_MARGIN 

More accurate for the case of linearly separable sets.

Definition at line 1597 of file ml.hpp.

enum SvmsgdType

SVMSGD type.

ASGD is often the preferable choice.

Enumerator:
SGD 

Stochastic Gradient Descent.

ASGD 

Average Stochastic Gradient Descent.

Definition at line 1590 of file ml.hpp.


Member Function Documentation

virtual CV_WRAP float calcError ( const Ptr< TrainData > &  data,
bool  test,
OutputArray  resp 
) const [virtual, inherited]

Computes error on the training or test dataset.

Parameters:
  data  the training data
  test  if true, the error is computed over the test subset of the data; otherwise it is computed over the training subset. Note that if you load a completely different dataset to evaluate an already trained classifier, you will probably want to skip TrainData::setTrainTestSplitRatio and pass test=false, so that the error is computed on the whole new set.
  resp  the optional output responses.

The method uses StatModel::predict to compute the error. For regression models the error is computed as RMS; for classifiers, as a percentage of misclassified samples (0-100%).

virtual CV_WRAP void clear (  ) [virtual, inherited]

Clears the algorithm state.

Reimplemented in DescriptorMatcher, and FlannBasedMatcher.

Definition at line 3030 of file core.hpp.

static CV_WRAP Ptr<SVMSGD> create (  ) [static]

Creates empty model.

Use StatModel::train to train the model. Since SVMSGD has several parameters, you may want to find the best parameters for your problem or use setOptimalParameters() to set some default parameters.

virtual CV_WRAP bool empty (  ) const [virtual, inherited]

Returns true if the Algorithm is empty (e.g. in the very beginning or after an unsuccessful read).

Reimplemented from Algorithm.

virtual CV_WRAP String getDefaultName (  ) const [virtual, inherited]

Returns the algorithm string identifier.

This string is used as top level xml/yml node tag when the object is saved to a file or string.

virtual CV_WRAP float getInitialStepSize (  ) const [pure virtual]

Parameter initialStepSize of a SVMSGD optimization problem.

See also:
setInitialStepSize
virtual CV_WRAP float getMarginRegularization (  ) const [pure virtual]

Parameter marginRegularization of a SVMSGD optimization problem.

See also:
setMarginRegularization
virtual CV_WRAP int getMarginType (  ) const [pure virtual]

Margin type, one of SVMSGD::MarginType.

See also:
setMarginType
virtual CV_WRAP float getShift (  ) [pure virtual]
Returns:
the shift of the trained model (decision function f(x) = weights * x + shift).
virtual CV_WRAP float getStepDecreasingPower (  ) const [pure virtual]

Parameter stepDecreasingPower of a SVMSGD optimization problem.

See also:
setStepDecreasingPower
virtual CV_WRAP int getSvmsgdType (  ) const [pure virtual]

Algorithm type, one of SVMSGD::SvmsgdType.

See also:
setSvmsgdType
virtual CV_WRAP TermCriteria getTermCriteria (  ) const [pure virtual]

Termination criteria of the training algorithm.

You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon).

See also:
setTermCriteria
virtual CV_WRAP int getVarCount (  ) const [pure virtual, inherited]

Returns the number of variables in training samples.

virtual CV_WRAP Mat getWeights (  ) [pure virtual]
Returns:
the weights of the trained model (decision function f(x) = weights * x + shift).
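The decision function built from getWeights() and getShift() can be evaluated by hand. The sketch below is a plain-C++ stand-in that uses std::vector in place of the cv::Mat weight row:

```cpp
#include <cstddef>
#include <vector>

// f(x) = weights . x + shift; the predicted class is +1 if f(x) > 0,
// and -1 otherwise.
float decisionFunction(const std::vector<float>& weights,
                       const std::vector<float>& x,
                       float shift) {
    float f = shift;
    for (std::size_t i = 0; i < weights.size(); ++i)
        f += weights[i] * x[i];
    return f;
}
```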
virtual CV_WRAP bool isClassifier (  ) const [pure virtual, inherited]

Returns true if the model is classifier.

virtual CV_WRAP bool isTrained (  ) const [pure virtual, inherited]

Returns true if the model is trained.

static Ptr<_Tp> load ( const String &  filename,
const String &  objname = String() 
) [static, inherited]

Loads algorithm from the file.

Parameters:
  filename  Name of the file to read.
  objname  The optional name of the node to read (if empty, the first top-level node will be used).

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     Ptr<SVM> svm = Algorithm::load<SVM>("my_svm_model.xml");

In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn).

Definition at line 3074 of file core.hpp.

static Ptr<_Tp> loadFromString ( const String &  strModel,
const String &  objname = String() 
) [static, inherited]

Loads algorithm from a String.

Parameters:
  strModel  The string variable containing the model you want to load.
  objname  The optional name of the node to read (if empty, the first top-level node will be used).

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     Ptr<SVM> svm = Algorithm::loadFromString<SVM>(myStringModel);

Definition at line 3094 of file core.hpp.

virtual CV_WRAP float predict ( InputArray  samples,
OutputArray  results = noArray(),
int  flags = 0 
) const [pure virtual, inherited]

Predicts response(s) for the provided sample(s)

Parameters:
  samples  The input samples, floating-point matrix.
  results  The optional output matrix of results.
  flags  The optional flags, model-dependent. See cv::ml::StatModel::Flags.

Implemented in LogisticRegression.

static Ptr<_Tp> read ( const FileNode &fn ) [static, inherited]

Reads algorithm from the file node.

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     cv::FileStorage fsRead("example.xml", FileStorage::READ);
     Ptr<SVM> svm = Algorithm::read<SVM>(fsRead.root());

In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn) and also provide a static create() method without parameters (or with all parameters optional).

Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.

Definition at line 3055 of file core.hpp.

virtual void read ( const FileNode &fn ) [virtual, inherited]

Reads algorithm parameters from a file storage.

Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.

Definition at line 3038 of file core.hpp.

virtual CV_WRAP void save ( const String &  filename ) const [virtual, inherited]

Saves the algorithm to a file.

In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).

virtual CV_WRAP void setInitialStepSize ( float  InitialStepSize ) [pure virtual]

Parameter initialStepSize of a SVMSGD optimization problem.

See also:
getInitialStepSize
virtual CV_WRAP void setMarginRegularization ( float  marginRegularization ) [pure virtual]

Parameter marginRegularization of a SVMSGD optimization problem.

See also:
getMarginRegularization
virtual CV_WRAP void setMarginType ( int  marginType ) [pure virtual]

Margin type, one of SVMSGD::MarginType.

See also:
getMarginType
virtual CV_WRAP void setOptimalParameters ( int  svmsgdType = SVMSGD::ASGD,
int  marginType = SVMSGD::SOFT_MARGIN 
) [pure virtual]

Sets optimal parameter values for the chosen SVMSGD model.

Parameters:
  svmsgdType  the type of the SVMSGD classifier.
  marginType  the type of margin constraint.
virtual CV_WRAP void setStepDecreasingPower ( float  stepDecreasingPower ) [pure virtual]

Parameter stepDecreasingPower of a SVMSGD optimization problem.

See also:
getStepDecreasingPower
virtual CV_WRAP void setSvmsgdType ( int  svmsgdType ) [pure virtual]

Algorithm type, one of SVMSGD::SvmsgdType.

See also:
getSvmsgdType
virtual CV_WRAP void setTermCriteria ( const cv::TermCriteria &val ) [pure virtual]

Termination criteria of the training algorithm.

See also:
getTermCriteria
virtual CV_WRAP bool train ( InputArray  samples,
int  layout,
InputArray  responses 
) [virtual, inherited]

Trains the statistical model.

Parameters:
  samples  training samples
  layout  See ml::SampleTypes.
  responses  vector of responses associated with the training samples.
static Ptr<_Tp> train ( const Ptr< TrainData > &  data,
int  flags = 0 
) [static, inherited]

Create and train model with default parameters.

The class must implement a static `create()` method with no parameters, or with all parameters having default values.

Definition at line 364 of file ml.hpp.

virtual CV_WRAP bool train ( const Ptr< TrainData > &  trainData,
int  flags = 0 
) [virtual, inherited]

Trains the statistical model.

Parameters:
  trainData  training data that can be loaded from a file using TrainData::loadFromCSV or created with TrainData::create.
  flags  optional flags, depending on the model. Some models can be updated with new training samples rather than being completely retrained (such as NormalBayesClassifier or ANN_MLP).
virtual void write ( FileStorage &fs ) const [virtual, inherited]

Stores algorithm parameters in a file storage.

Reimplemented in Feature2D, DescriptorMatcher, and FlannBasedMatcher.

Definition at line 3034 of file core.hpp.