opencv on mbed

Dependencies:   mbed


ANN_MLP Class Reference

Artificial Neural Networks - Multi-Layer Perceptrons.

#include <ml.hpp>

Inherits cv::ml::StatModel.

Public Types

enum  TrainingMethods { BACKPROP = 0, RPROP = 1 }
 Available training methods.
enum  ActivationFunctions { IDENTITY = 0, SIGMOID_SYM = 1, GAUSSIAN = 2 }
 Possible activation functions.
enum  TrainFlags { UPDATE_WEIGHTS = 1, NO_INPUT_SCALE = 2, NO_OUTPUT_SCALE = 4 }
 Train options.
enum  Flags { RAW_OUTPUT = 1 }
 Predict options.

Public Member Functions

virtual CV_WRAP void setTrainMethod (int method, double param1=0, double param2=0)=0
 Sets training method and common parameters.
virtual CV_WRAP int getTrainMethod () const =0
 Returns current training method.
virtual CV_WRAP void setActivationFunction (int type, double param1=0, double param2=0)=0
 Initialize the activation function for each neuron.
virtual CV_WRAP void setLayerSizes (InputArray _layer_sizes)=0
 Integer vector specifying the number of neurons in each layer including the input and output layers.
virtual CV_WRAP cv::Mat getLayerSizes () const =0
 Integer vector specifying the number of neurons in each layer including the input and output layers.
virtual CV_WRAP TermCriteria getTermCriteria () const =0
 Termination criteria of the training algorithm.
virtual CV_WRAP void setTermCriteria (TermCriteria val)=0
 Termination criteria of the training algorithm.
virtual CV_WRAP double getBackpropWeightScale () const =0
 BPROP: Strength of the weight gradient term.
virtual CV_WRAP void setBackpropWeightScale (double val)=0
 BPROP: Strength of the weight gradient term.
virtual CV_WRAP double getBackpropMomentumScale () const =0
 BPROP: Strength of the momentum term (the difference between weights on the 2 previous iterations).
virtual CV_WRAP void setBackpropMomentumScale (double val)=0
 BPROP: Strength of the momentum term (the difference between weights on the 2 previous iterations).
virtual CV_WRAP double getRpropDW0 () const =0
 RPROP: Initial value $\Delta_0$ of update-values $\Delta_{ij}$.
virtual CV_WRAP void setRpropDW0 (double val)=0
 RPROP: Initial value $\Delta_0$ of update-values $\Delta_{ij}$.
virtual CV_WRAP double getRpropDWPlus () const =0
 RPROP: Increase factor $\eta^+$.
virtual CV_WRAP void setRpropDWPlus (double val)=0
 RPROP: Increase factor $\eta^+$.
virtual CV_WRAP double getRpropDWMinus () const =0
 RPROP: Decrease factor $\eta^-$.
virtual CV_WRAP void setRpropDWMinus (double val)=0
 RPROP: Decrease factor $\eta^-$.
virtual CV_WRAP double getRpropDWMin () const =0
 RPROP: Update-values lower limit $\Delta_{min}$.
virtual CV_WRAP void setRpropDWMin (double val)=0
 RPROP: Update-values lower limit $\Delta_{min}$.
virtual CV_WRAP double getRpropDWMax () const =0
 RPROP: Update-values upper limit $\Delta_{max}$.
virtual CV_WRAP void setRpropDWMax (double val)=0
 RPROP: Update-values upper limit $\Delta_{max}$.
virtual CV_WRAP int getVarCount () const =0
 Returns the number of variables in training samples.
virtual CV_WRAP bool empty () const
 Returns true if the Algorithm is empty (e.g. in the very beginning or after an unsuccessful read).
virtual CV_WRAP bool isTrained () const =0
 Returns true if the model is trained.
virtual CV_WRAP bool isClassifier () const =0
 Returns true if the model is a classifier.
virtual CV_WRAP bool train (const Ptr< TrainData > &trainData, int flags=0)
 Trains the statistical model.
virtual CV_WRAP bool train (InputArray samples, int layout, InputArray responses)
 Trains the statistical model.
virtual CV_WRAP float calcError (const Ptr< TrainData > &data, bool test, OutputArray resp) const
 Computes error on the training or test dataset.
virtual CV_WRAP float predict (InputArray samples, OutputArray results=noArray(), int flags=0) const =0
 Predicts response(s) for the provided sample(s)
virtual CV_WRAP void clear ()
 Clears the algorithm state.
virtual void write (FileStorage &fs) const
 Stores algorithm parameters in a file storage.
virtual void read (const FileNode &fn)
 Reads algorithm parameters from a file storage.
virtual CV_WRAP void save (const String &filename) const
 Saves the algorithm to a file.
virtual CV_WRAP String getDefaultName () const
 Returns the algorithm string identifier.

Static Public Member Functions

static CV_WRAP Ptr< ANN_MLP > create ()
 Creates empty model.
template<typename _Tp >
static Ptr< _Tp > train (const Ptr< TrainData > &data, int flags=0)
 Create and train model with default parameters.
template<typename _Tp >
static Ptr< _Tp > read (const FileNode &fn)
 Reads algorithm from the file node.
template<typename _Tp >
static Ptr< _Tp > load (const String &filename, const String &objname=String())
 Loads algorithm from the file.
template<typename _Tp >
static Ptr< _Tp > loadFromString (const String &strModel, const String &objname=String())
 Loads algorithm from a String.

Detailed Description

Artificial Neural Networks - Multi-Layer Perceptrons.

Unlike many other models in ML that are constructed and trained at once, in the MLP model these steps are separated. First, a network with the specified topology is created using the non-default constructor or the method ANN_MLP::create. All the weights are set to zero. Then, the network is trained using a set of input and output vectors. The training procedure can be repeated more than once; that is, the weights can be adjusted based on new training data.

Additional flags for StatModel::train are available: ANN_MLP::TrainFlags.

See also:
ml_intro_ann

Definition at line 1254 of file ml.hpp.


Member Enumeration Documentation

enum ActivationFunctions

Possible activation functions.

Enumerator:
IDENTITY 

Identity function: $f(x)=x$.

SIGMOID_SYM 

Symmetrical sigmoid: $f(x)=\beta(1-e^{-\alpha x})/(1+e^{-\alpha x})$.

Note:
If you are using the default sigmoid activation function with the default parameter values param1=0 and param2=0, then the function actually used is $f(x)=1.7159\tanh(\frac{2}{3}x)$, so the output will range over [-1.7159, 1.7159] instead of [0, 1].
GAUSSIAN 

Gaussian function: $f(x)=\beta e^{-\alpha x^2}$.

Definition at line 1354 of file ml.hpp.
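The default sigmoid from the note above can be sketched in plain C++ to make the output range concrete (this helper is illustrative, not part of the OpenCV API):

```cpp
#include <cmath>

// Default symmetrical sigmoid selected when param1 == 0 and param2 == 0:
// f(x) = 1.7159 * tanh((2/3) * x), an odd function saturating at +/-1.7159.
double sigmoid_sym_default(double x)
{
    return 1.7159 * std::tanh((2.0 / 3.0) * x);
}
```

For example, sigmoid_sym_default(0.0) is 0, and large positive or negative inputs approach +1.7159 and -1.7159, matching the range stated above.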

enum Flags [inherited]

Predict options.

Enumerator:
RAW_OUTPUT 

makes the method return the raw results (the sum), not the class label

Reimplemented in DTrees.

Definition at line 296 of file ml.hpp.

enum TrainFlags

Train options.

Enumerator:
UPDATE_WEIGHTS 

Update the network weights, rather than compute them from scratch.

In the latter case the weights are initialized using the Nguyen-Widrow algorithm.

NO_INPUT_SCALE 

Do not normalize the input vectors.

If this flag is not set, the training algorithm normalizes each input feature independently, shifting its mean value to 0 and making the standard deviation equal to 1. If the network is to be updated frequently, the new training data could differ significantly from the original data; in that case, you should take care of proper normalization yourself.

NO_OUTPUT_SCALE 

Do not normalize the output vectors.

If the flag is not set, the training algorithm normalizes each output feature independently, transforming it to a certain range that depends on the activation function used.

Definition at line 1368 of file ml.hpp.

enum TrainingMethods

Available training methods.

Enumerator:
BACKPROP 

The back-propagation algorithm.

RPROP 

The RPROP algorithm. See RPROP93 for details.

Definition at line 1258 of file ml.hpp.


Member Function Documentation

virtual CV_WRAP float calcError ( const Ptr< TrainData > &  data,
bool  test,
OutputArray  resp 
) const [virtual, inherited]

Computes error on the training or test dataset.

Parameters:
data: the training data
test: if true, the error is computed over the test subset of the data; otherwise it is computed over the training subset. Note that if you load a completely different dataset to evaluate an already trained classifier, you will probably want to skip setting a test subset via TrainData::setTrainTestSplitRatio and pass test=false, so that the error is computed over the whole new set.
resp: the optional output responses.

The method uses StatModel::predict to compute the error. For regression models the error is computed as RMS; for classifiers, as the percentage of misclassified samples (0-100%).

virtual CV_WRAP void clear (  ) [virtual, inherited]

Clears the algorithm state.

Reimplemented in DescriptorMatcher, and FlannBasedMatcher.

Definition at line 2984 of file core.hpp.

static CV_WRAP Ptr<ANN_MLP> create (  ) [static]

Creates empty model.

Use StatModel::train to train the model, Algorithm::load<ANN_MLP>(filename) to load the pre-trained model. Note that the train method has optional flags: ANN_MLP::TrainFlags.

virtual CV_WRAP bool empty (  ) const [virtual, inherited]

Returns true if the Algorithm is empty (e.g. in the very beginning or after an unsuccessful read).

Reimplemented from Algorithm.

virtual CV_WRAP double getBackpropMomentumScale (  ) const [pure virtual]

BPROP: Strength of the momentum term (the difference between weights on the 2 previous iterations).

This parameter provides some inertia to smooth random fluctuations of the weights. It can vary from 0 (the feature is disabled) to 1 and beyond; a value around 0.1 is usually sufficient. Default value is 0.1.

See also:
setBackpropMomentumScale
virtual CV_WRAP double getBackpropWeightScale (  ) const [pure virtual]

BPROP: Strength of the weight gradient term.

The recommended value is about 0.1. Default value is 0.1.

See also:
setBackpropWeightScale
virtual CV_WRAP String getDefaultName (  ) const [virtual, inherited]

Returns the algorithm string identifier.

This string is used as top level xml/yml node tag when the object is saved to a file or string.

virtual CV_WRAP cv::Mat getLayerSizes (  ) const [pure virtual]

Integer vector specifying the number of neurons in each layer including the input and output layers.

The very first element specifies the number of neurons in the input layer; the last element specifies the number of neurons in the output layer.

See also:
setLayerSizes
virtual CV_WRAP double getRpropDW0 (  ) const [pure virtual]

RPROP: Initial value $\Delta_0$ of update-values $\Delta_{ij}$.

Default value is 0.1.

See also:
setRpropDW0
virtual CV_WRAP double getRpropDWMax (  ) const [pure virtual]

RPROP: Update-values upper limit $\Delta_{max}$.

It must be >1. Default value is 50.

See also:
setRpropDWMax
virtual CV_WRAP double getRpropDWMin (  ) const [pure virtual]

RPROP: Update-values lower limit $\Delta_{min}$.

It must be positive. Default value is FLT_EPSILON.

See also:
setRpropDWMin
virtual CV_WRAP double getRpropDWMinus (  ) const [pure virtual]

RPROP: Decrease factor $\eta^-$.

It must be <1. Default value is 0.5.

See also:
setRpropDWMinus
virtual CV_WRAP double getRpropDWPlus (  ) const [pure virtual]

RPROP: Increase factor $\eta^+$.

It must be >1. Default value is 1.2.

See also:
setRpropDWPlus
virtual CV_WRAP TermCriteria getTermCriteria (  ) const [pure virtual]

Termination criteria of the training algorithm.

You can specify the maximum number of iterations (maxCount) and/or the smallest change in error between iterations that keeps the algorithm running (epsilon). Default value is TermCriteria(TermCriteria::MAX_ITER + TermCriteria::EPS, 1000, 0.01).

See also:
setTermCriteria
virtual CV_WRAP int getTrainMethod (  ) const [pure virtual]

Returns current training method.

virtual CV_WRAP int getVarCount (  ) const [pure virtual, inherited]

Returns the number of variables in training samples.

virtual CV_WRAP bool isClassifier (  ) const [pure virtual, inherited]

Returns true if the model is a classifier.

virtual CV_WRAP bool isTrained (  ) const [pure virtual, inherited]

Returns true if the model is trained.

static Ptr<_Tp> load ( const String &  filename,
const String &  objname = String() 
) [static, inherited]

Loads algorithm from the file.

Parameters:
filename: the name of the file to read.
objname: the optional name of the node to read (if empty, the first top-level node will be used)

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     Ptr<SVM> svm = Algorithm::load<SVM>("my_svm_model.xml");

In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn).

Definition at line 3027 of file core.hpp.

static Ptr<_Tp> loadFromString ( const String &  strModel,
const String &  objname = String() 
) [static, inherited]

Loads algorithm from a String.

Parameters:
strModel: the string variable containing the model you want to load.
objname: the optional name of the node to read (if empty, the first top-level node will be used)

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     Ptr<SVM> svm = Algorithm::loadFromString<SVM>(myStringModel);

Definition at line 3046 of file core.hpp.

virtual CV_WRAP float predict ( InputArray  samples,
OutputArray  results = noArray(),
int  flags = 0 
) const [pure virtual, inherited]

Predicts response(s) for the provided sample(s).

Parameters:
samples: the input samples, a floating-point matrix
results: the optional output matrix of results.
flags: the optional model-dependent flags. See cv::ml::StatModel::Flags.

Implemented in LogisticRegression.

virtual void read ( const FileNode &fn ) [virtual, inherited]

Reads algorithm parameters from a file storage.

Reimplemented in DescriptorMatcher, and FlannBasedMatcher.

Definition at line 2992 of file core.hpp.

static Ptr<_Tp> read ( const FileNode &fn ) [static, inherited]

Reads algorithm from the file node.

This is a static template method of Algorithm. Its usage is as follows (in the case of SVM):

     Ptr<SVM> svm = Algorithm::read<SVM>(fn);

In order to make this method work, the derived class must override Algorithm::read(const FileNode& fn) and also have a static create() method without parameters (or with all parameters optional).

Reimplemented in DescriptorMatcher, and FlannBasedMatcher.

Definition at line 3008 of file core.hpp.

virtual CV_WRAP void save ( const String &  filename ) const [virtual, inherited]

Saves the algorithm to a file.

In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).

virtual CV_WRAP void setActivationFunction ( int  type,
double  param1 = 0,
double  param2 = 0 
) [pure virtual]

Initialize the activation function for each neuron.

Currently the default and the only fully supported activation function is ANN_MLP::SIGMOID_SYM.

Parameters:
type: the type of activation function. See ANN_MLP::ActivationFunctions.
param1: the first parameter of the activation function, $\alpha$. Default value is 0.
param2: the second parameter of the activation function, $\beta$. Default value is 0.
virtual CV_WRAP void setBackpropMomentumScale ( double  val ) [pure virtual]

BPROP: Strength of the momentum term (the difference between weights on the 2 previous iterations).

See also:
getBackpropMomentumScale
virtual CV_WRAP void setBackpropWeightScale ( double  val ) [pure virtual]

BPROP: Strength of the weight gradient term.

See also:
getBackpropWeightScale
virtual CV_WRAP void setLayerSizes ( InputArray  _layer_sizes ) [pure virtual]

Integer vector specifying the number of neurons in each layer including the input and output layers.

The very first element specifies the number of neurons in the input layer; the last element specifies the number of neurons in the output layer. The default value is an empty Mat.

See also:
getLayerSizes
virtual CV_WRAP void setRpropDW0 ( double  val ) [pure virtual]

RPROP: Initial value $\Delta_0$ of update-values $\Delta_{ij}$.

See also:
getRpropDW0
virtual CV_WRAP void setRpropDWMax ( double  val ) [pure virtual]

RPROP: Update-values upper limit $\Delta_{max}$.

See also:
getRpropDWMax
virtual CV_WRAP void setRpropDWMin ( double  val ) [pure virtual]

RPROP: Update-values lower limit $\Delta_{min}$.

See also:
getRpropDWMin
virtual CV_WRAP void setRpropDWMinus ( double  val ) [pure virtual]

RPROP: Decrease factor $\eta^-$.

See also:
getRpropDWMinus
virtual CV_WRAP void setRpropDWPlus ( double  val ) [pure virtual]

RPROP: Increase factor $\eta^+$.

See also:
getRpropDWPlus
virtual CV_WRAP void setTermCriteria ( TermCriteria  val ) [pure virtual]

Termination criteria of the training algorithm.

See also:
getTermCriteria
virtual CV_WRAP void setTrainMethod ( int  method,
double  param1 = 0,
double  param2 = 0 
) [pure virtual]

Sets training method and common parameters.

Parameters:
method: the training method. Default value is ANN_MLP::RPROP. See ANN_MLP::TrainingMethods.
param1: passed to setRpropDW0 for ANN_MLP::RPROP and to setBackpropWeightScale for ANN_MLP::BACKPROP.
param2: passed to setRpropDWMin for ANN_MLP::RPROP and to setBackpropMomentumScale for ANN_MLP::BACKPROP.
virtual CV_WRAP bool train ( const Ptr< TrainData > &  trainData,
int  flags = 0 
) [virtual, inherited]

Trains the statistical model.

Parameters:
trainData: training data that can be loaded from a file using TrainData::loadFromCSV or created with TrainData::create.
flags: optional model-dependent flags. Some models can be updated with new training samples rather than completely overwritten (such as NormalBayesClassifier or ANN_MLP).
static Ptr<_Tp> train ( const Ptr< TrainData > &  data,
int  flags = 0 
) [static, inherited]

Create and train model with default parameters.

The class must implement a static `create()` method with no parameters, or with all parameters having default values.

Definition at line 357 of file ml.hpp.

virtual CV_WRAP bool train ( InputArray  samples,
int  layout,
InputArray  responses 
) [virtual, inherited]

Trains the statistical model.

Parameters:
samples: the training samples
layout: see ml::SampleTypes.
responses: the vector of responses associated with the training samples.
virtual void write ( FileStorage &fs ) const [virtual, inherited]

Stores algorithm parameters in a file storage.

Reimplemented in DescriptorMatcher, and FlannBasedMatcher.

Definition at line 2988 of file core.hpp.