class cv::ml::ANN_MLP
Overview
Artificial Neural Networks - Multi-Layer Perceptrons.
#include <ml.hpp>

class ANN_MLP: public cv::ml::StatModel
{
public:
    // enums
    enum ActivationFunctions;
    enum TrainFlags;
    enum TrainingMethods;

    // methods
    virtual double getBackpropMomentumScale() const = 0;
    virtual double getBackpropWeightScale() const = 0;
    virtual cv::Mat getLayerSizes() const = 0;
    virtual double getRpropDW0() const = 0;
    virtual double getRpropDWMax() const = 0;
    virtual double getRpropDWMin() const = 0;
    virtual double getRpropDWMinus() const = 0;
    virtual double getRpropDWPlus() const = 0;
    virtual TermCriteria getTermCriteria() const = 0;
    virtual int getTrainMethod() const = 0;
    virtual Mat getWeights(int layerIdx) const = 0;
    virtual void setActivationFunction(int type, double param1 = 0, double param2 = 0) = 0;
    virtual void setBackpropMomentumScale(double val) = 0;
    virtual void setBackpropWeightScale(double val) = 0;
    virtual void setLayerSizes(InputArray _layer_sizes) = 0;
    virtual void setRpropDW0(double val) = 0;
    virtual void setRpropDWMax(double val) = 0;
    virtual void setRpropDWMin(double val) = 0;
    virtual void setRpropDWMinus(double val) = 0;
    virtual void setRpropDWPlus(double val) = 0;
    virtual void setTermCriteria(TermCriteria val) = 0;
    virtual void setTrainMethod(int method, double param1 = 0, double param2 = 0) = 0;

    static Ptr<ANN_MLP> create();
    static Ptr<ANN_MLP> load(const String& filepath);
};
Inherited Members
public:
    // enums
    enum Flags;

    // methods
    virtual void clear();
    virtual bool empty() const;
    virtual String getDefaultName() const;
    virtual void read(const FileNode& fn);
    virtual void save(const String& filename) const;
    virtual void write(FileStorage& fs) const;
    template <typename _Tp>
    static Ptr<_Tp> load(const String& filename, const String& objname = String());
    template <typename _Tp>
    static Ptr<_Tp> loadFromString(const String& strModel, const String& objname = String());
    template <typename _Tp>
    static Ptr<_Tp> read(const FileNode& fn);
    virtual float calcError(const Ptr<TrainData>& data, bool test, OutputArray resp) const;
    virtual bool empty() const;
    virtual int getVarCount() const = 0;
    virtual bool isClassifier() const = 0;
    virtual bool isTrained() const = 0;
    virtual float predict(InputArray samples, OutputArray results = noArray(), int flags = 0) const = 0;
    virtual bool train(const Ptr<TrainData>& trainData, int flags = 0);
    virtual bool train(InputArray samples, int layout, InputArray responses);
    template <typename _Tp>
    static Ptr<_Tp> train(const Ptr<TrainData>& data, int flags = 0);

protected:
    // methods
    void writeFormat(FileStorage& fs) const;
Detailed Documentation
Artificial Neural Networks - Multi-Layer Perceptrons.
Unlike many other models in ML that are constructed and trained at once, in the MLP model these steps are separated. First, a network with the specified topology is created using the non-default constructor or the method ANN_MLP::create. All the weights are set to zeros. Then, the network is trained using a set of input and output vectors. The training procedure can be repeated more than once, that is, the weights can be adjusted based on the new training data.
Additional flags for StatModel::train are available: ANN_MLP::TrainFlags.
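The create/configure/train workflow described above can be sketched end to end as follows. This is a minimal illustrative example, not a recommended configuration: the toy XOR-style data, the layer sizes, the BACKPROP parameters (0.1/0.1), and the termination criteria are all arbitrary choices made for the sketch.

```cpp
#include <opencv2/ml.hpp>

using namespace cv;
using namespace cv::ml;

int main()
{
    // Toy data: 4 samples with 2 features each, one response per sample.
    // Both samples and responses must be CV_32F.
    float samplesData[]   = { 0,0,  0,1,  1,0,  1,1 };
    float responsesData[] = { -1, 1, 1, -1 };
    Mat samples(4, 2, CV_32F, samplesData);
    Mat responses(4, 1, CV_32F, responsesData);

    // Topology: 2 input neurons, one hidden layer of 4, 1 output neuron.
    Ptr<ANN_MLP> mlp = ANN_MLP::create();
    mlp->setLayerSizes((Mat_<int>(3, 1) << 2, 4, 1));
    mlp->setActivationFunction(ANN_MLP::SIGMOID_SYM, 1, 1);
    mlp->setTrainMethod(ANN_MLP::BACKPROP, 0.1, 0.1); // weight scale, momentum
    mlp->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER + TermCriteria::EPS,
                                      1000, 0.01));

    mlp->train(samples, ROW_SAMPLE, responses);

    Mat result;
    mlp->predict(samples, result);

    // A trained network can be serialized and restored later.
    mlp->save("mlp.yml");
    Ptr<ANN_MLP> restored = ANN_MLP::load("mlp.yml");
    return 0;
}
```

Training can be repeated on new data with further calls to train, which adjusts the existing weights rather than rebuilding the network.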
Methods
virtual double getBackpropMomentumScale() const = 0
BPROP: Strength of the momentum term (the difference between weights on the 2 previous iterations). This parameter provides some inertia to smooth the random fluctuations of the weights. It can vary from 0 (the feature is disabled) to 1 and beyond. The value 0.1 or so is good enough. Default value is 0.1.
See also: setBackpropMomentumScale
virtual double getBackpropWeightScale() const = 0
BPROP: Strength of the weight gradient term. The recommended value is about 0.1. Default value is 0.1.
See also: setBackpropWeightScale
virtual cv::Mat getLayerSizes() const = 0
Integer vector specifying the number of neurons in each layer, including the input and output layers. The first element specifies the number of neurons in the input layer; the last element specifies the number of neurons in the output layer.
See also: setLayerSizes
virtual double getRpropDW0() const = 0
RPROP: Initial value \(\Delta_0\) of update-values \(\Delta_{ij}\). Default value is 0.1.
See also: setRpropDW0
virtual double getRpropDWMax() const = 0
RPROP: Update-values upper limit \(\Delta_{max}\). It must be >1. Default value is 50.
See also: setRpropDWMax
virtual double getRpropDWMin() const = 0
RPROP: Update-values lower limit \(\Delta_{min}\). It must be positive. Default value is FLT_EPSILON.
See also: setRpropDWMin
virtual double getRpropDWMinus() const = 0
RPROP: Decrease factor \(\eta^-\). It must be <1. Default value is 0.5.
See also: setRpropDWMinus
virtual double getRpropDWPlus() const = 0
RPROP: Increase factor \(\eta^+\). It must be >1. Default value is 1.2.
See also: setRpropDWPlus
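The five RPROP parameters above (\(\Delta_0\), \(\Delta_{min}\), \(\Delta_{max}\), \(\eta^-\), \(\eta^+\)) control the per-weight step-size adaptation of the RPROP algorithm. The following self-contained sketch shows that rule for a single weight; the function name and structure are illustrative and are not OpenCV internals.

```cpp
#include <algorithm>
#include <cmath>

// One RPROP step-size adaptation for a single weight.
// Parameter names mirror the getters above:
// delta starts at dw0; dwPlus/dwMinus are eta+/eta-; dwMin/dwMax are the clamps.
double rpropUpdateDelta(double prevGrad, double grad, double delta,
                        double dwPlus = 1.2, double dwMinus = 0.5,
                        double dwMin = 1.19209290e-7 /* FLT_EPSILON */,
                        double dwMax = 50.0)
{
    double s = prevGrad * grad;
    if (s > 0)        // gradient kept its sign: accelerate, capped at dwMax
        return std::min(delta * dwPlus, dwMax);
    else if (s < 0)   // sign flipped: the step overshot, shrink toward dwMin
        return std::max(delta * dwMinus, dwMin);
    return delta;     // one gradient is zero: keep the current step size
}
// The weight itself then moves by -sign(grad) * delta.
```

This makes the constraints on the getters concrete: \(\eta^+ > 1\) so steps can grow, \(\eta^- < 1\) so they can shrink, and \(\Delta_{min}\)/\(\Delta_{max}\) keep the step size in a usable range.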
virtual TermCriteria getTermCriteria() const = 0
Termination criteria of the training algorithm. You can specify the maximum number of iterations (maxCount) and/or how much the error could change between the iterations to make the algorithm continue (epsilon). Default value is TermCriteria (TermCriteria::MAX_ITER + TermCriteria::EPS, 1000, 0.01).
See also: setTermCriteria
virtual int getTrainMethod() const = 0
Returns the current training method.
virtual void setActivationFunction( int type, double param1 = 0, double param2 = 0 ) = 0
Initialize the activation function for each neuron. Currently the default and the only fully supported activation function is ANN_MLP::SIGMOID_SYM.
Parameters:
type | The type of activation function. See ANN_MLP::ActivationFunctions. |
param1 | The first parameter of the activation function, \(\alpha\). Default value is 0. |
param2 | The second parameter of the activation function, \(\beta\). Default value is 0. |
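For SIGMOID_SYM, the two parameters scale a symmetrical sigmoid, \(f(x) = \beta \cdot \frac{1 - e^{-\alpha x}}{1 + e^{-\alpha x}}\), which is a scaled tanh. A self-contained sketch of that formula (the function name is illustrative; note that OpenCV substitutes its own internal defaults when both parameters are left at 0):

```cpp
#include <cmath>

// Symmetrical sigmoid as used by ANN_MLP::SIGMOID_SYM:
// f(x) = beta * (1 - e^(-alpha*x)) / (1 + e^(-alpha*x))
//      = beta * tanh(alpha*x / 2)
// param1 = alpha controls the slope, param2 = beta the output range (-beta, beta).
double sigmoidSym(double x, double alpha, double beta)
{
    double e = std::exp(-alpha * x);
    return beta * (1.0 - e) / (1.0 + e);
}
```

Because the output lies in (-beta, beta), target responses for classification are typically coded symmetrically (e.g. -1/1) rather than 0/1.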
virtual void setBackpropMomentumScale(double val) = 0
See also: getBackpropMomentumScale
virtual void setBackpropWeightScale(double val) = 0
See also: getBackpropWeightScale
virtual void setLayerSizes(InputArray _layer_sizes) = 0
Integer vector specifying the number of neurons in each layer, including the input and output layers. The first element specifies the number of neurons in the input layer; the last element specifies the number of neurons in the output layer. The default value is an empty Mat.
See also: getLayerSizes
virtual void setRpropDW0(double val) = 0
See also: getRpropDW0
virtual void setRpropDWMax(double val) = 0
See also: getRpropDWMax
virtual void setRpropDWMin(double val) = 0
See also: getRpropDWMin
virtual void setRpropDWMinus(double val) = 0
See also: getRpropDWMinus
virtual void setRpropDWPlus(double val) = 0
See also: getRpropDWPlus
virtual void setTermCriteria(TermCriteria val) = 0
See also: getTermCriteria
virtual void setTrainMethod( int method, double param1 = 0, double param2 = 0 ) = 0
Sets training method and common parameters.
Parameters:
method | The training method. See ANN_MLP::TrainingMethods. Default value is ANN_MLP::RPROP. |
param1 | Passed to setRpropDW0 for ANN_MLP::RPROP and to setBackpropWeightScale for ANN_MLP::BACKPROP. |
param2 | Passed to setRpropDWMin for ANN_MLP::RPROP and to setBackpropMomentumScale for ANN_MLP::BACKPROP. |
static Ptr<ANN_MLP> create()
Creates an empty model.
Use StatModel::train to train the model, or Algorithm::load<ANN_MLP>(filename) to load a pre-trained model. Note that the train method has optional flags: ANN_MLP::TrainFlags.
static Ptr<ANN_MLP> load(const String& filepath)
Loads and creates a serialized ANN from a file.
Use ANN_MLP::save to serialize and store an ANN to disk. Load the ANN from this file again by calling this function with the path to the file.
Parameters:
filepath | path to serialized ANN |