mlpack  master
mlpack::ann Namespace Reference

Artificial Neural Network. More...

Classes

class  Add
 Implementation of the Add module class. More...
 
class  AddMerge
 Implementation of the AddMerge module class. More...
 
class  AddVisitor
 AddVisitor exposes the Add() method of the given module. More...
 
class  BackwardVisitor
 BackwardVisitor executes the Backward() function given the input, error, and delta parameters. More...
 
class  BaseLayer
 Implementation of the base layer. More...
 
class  Concat
 Implementation of the Concat class. More...
 
class  ConcatPerformance
 Implementation of the concat performance class. More...
 
class  Constant
 Implementation of the constant layer. More...
 
class  Convolution
 Implementation of the Convolution class. More...
 
class  DeleteVisitor
 DeleteVisitor executes the destructor of the instantiated object. More...
 
class  DeltaVisitor
 DeltaVisitor exposes the delta parameter of the given module. More...
 
class  DeterministicSetVisitor
 DeterministicSetVisitor sets the deterministic parameter given the deterministic value. More...
 
class  DropConnect
 The DropConnect layer is a regularizer that randomly sets connection values to zero with probability ratio and scales the remaining elements by a factor of 1 / (1 - ratio). More...
 
class  Dropout
 The dropout layer is a regularizer that randomly sets input values to zero with probability ratio and scales the remaining elements by a factor of 1 / (1 - ratio). More...
 
class  ELU
 The ELU activation function, defined by. More...
 
class  FFN
 Implementation of a standard feed forward network. More...
 
class  FFTConvolution
 Computes the two-dimensional convolution via the fast Fourier transform (FFT). More...
 
class  ForwardVisitor
 ForwardVisitor executes the Forward() function given the input and output parameters. More...
 
class  FullConvolution
 
class  Glimpse
 The glimpse layer returns a retina-like representation (down-scaled cropped images) of increasing scale around a given location in a given image. More...
 
class  GradientSetVisitor
 GradientSetVisitor updates the gradient parameter given the gradient set. More...
 
class  GradientUpdateVisitor
 GradientUpdateVisitor updates the gradient parameter given the gradient set. More...
 
class  GradientVisitor
 GradientVisitor executes the Gradient() method of the given module using the input and delta parameters. More...
 
class  GradientZeroVisitor
 
class  HardTanH
 The Hard Tanh activation function, defined by. More...
 
class  IdentityFunction
 The identity function, defined by. More...
 
class  Join
 Implementation of the Join module class. More...
 
class  KathirvalavakumarSubavathiInitialization
 This class is used to initialize the weight matrix with the method proposed by T. Kathirvalavakumar and S. Subavathi. More...
 
class  LayerTraits
 This is a template class that can provide information about various layers. More...
 
class  LeakyReLU
 The LeakyReLU activation function, defined by. More...
 
class  Linear
 Implementation of the Linear layer class. More...
 
class  LinearNoBias
 Implementation of the LinearNoBias class. More...
 
class  LoadOutputParameterVisitor
 LoadOutputParameterVisitor restores the output parameter using the given parameter set. More...
 
class  LogisticFunction
 The logistic function, defined by. More...
 
class  LogSoftMax
 Implementation of the log softmax layer. More...
 
class  Lookup
 Implementation of the Lookup class. More...
 
class  LSTM
 An implementation of an LSTM network layer. More...
 
class  MaxPooling
 Implementation of the MaxPooling layer. More...
 
class  MaxPoolingRule
 
class  MeanPooling
 Implementation of the MeanPooling layer. More...
 
class  MeanPoolingRule
 
class  MeanSquaredError
 The mean squared error performance function measures the network's performance according to the mean of squared errors. More...
 
class  MultiplyConstant
 Implementation of the multiply constant layer. More...
 
class  NaiveConvolution
 Computes the two-dimensional convolution. More...
 
class  NegativeLogLikelihood
 Implementation of the negative log likelihood layer. More...
 
class  NguyenWidrowInitialization
 This class is used to initialize the weight matrix with the Nguyen-Widrow method. More...
 
class  OivsInitialization
 This class is used to initialize the weight matrix with the OIVS method. More...
 
class  OrthogonalInitialization
 This class is used to initialize the weight matrix with the orthogonal matrix initialization. More...
 
class  OutputHeightVisitor
 OutputHeightVisitor exposes the OutputHeight() method of the given module. More...
 
class  OutputParameterVisitor
 OutputParameterVisitor exposes the output parameter of the given module. More...
 
class  OutputWidthVisitor
 OutputWidthVisitor exposes the OutputWidth() method of the given module. More...
 
class  ParametersSetVisitor
 ParametersSetVisitor updates the parameters set using the given matrix. More...
 
class  ParametersVisitor
 ParametersVisitor exposes the parameters set of the given module and stores the parameters set into the given matrix. More...
 
class  PReLU
 The PReLU activation function, defined by (where alpha is trainable) More...
 
class  RandomInitialization
 This class is used to randomly initialize the weight matrix. More...
 
class  RectifierFunction
 The rectifier function, defined by. More...
 
class  Recurrent
 Implementation of the Recurrent layer class. More...
 
class  RecurrentAttention
 This class implements the Recurrent Model for Visual Attention, using a variety of possible layer implementations. More...
 
class  ReinforceNormal
 Implementation of the reinforce normal layer. More...
 
class  ResetVisitor
 ResetVisitor executes the Reset() function. More...
 
class  RewardSetVisitor
 RewardSetVisitor sets the reward parameter given the reward value. More...
 
class  RNN
 Implementation of a standard recurrent neural network container. More...
 
class  SaveOutputParameterVisitor
 SaveOutputParameterVisitor saves the output parameter into the given parameter set. More...
 
class  Select
 The select module selects the specified column from a given input matrix. More...
 
class  Sequential
 Implementation of the Sequential class. More...
 
class  SetInputHeightVisitor
 SetInputHeightVisitor updates the input height parameter with the given input height. More...
 
class  SetInputWidthVisitor
 SetInputWidthVisitor updates the input width parameter with the given input width. More...
 
class  SoftplusFunction
 The softplus function, defined by. More...
 
class  SoftsignFunction
 The softsign function, defined by. More...
 
class  SVDConvolution
 Computes the two-dimensional convolution using singular value decomposition. More...
 
class  TanhFunction
 The tanh function, defined by. More...
 
class  ValidConvolution
 
class  VRClassReward
 Implementation of the variance reduced classification reinforcement layer. More...
 
class  WeightSetVisitor
 WeightSetVisitor updates the module parameters given the parameters set. More...
 
class  WeightSizeVisitor
 WeightSizeVisitor returns the number of weights of the given module. More...
 
class  ZeroInitialization
 This class is used to initialize the weight matrix to all zeros. More...
 

Typedefs

template<class ActivationFunction = IdentityFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using IdentityLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard Identity-Layer using the identity activation function. More...
 
using LayerTypes = boost::variant< Add< arma::mat, arma::mat > *, AddMerge< arma::mat, arma::mat > *, BaseLayer< LogisticFunction, arma::mat, arma::mat > *, BaseLayer< IdentityFunction, arma::mat, arma::mat > *, BaseLayer< TanhFunction, arma::mat, arma::mat > *, BaseLayer< RectifierFunction, arma::mat, arma::mat > *, Concat< arma::mat, arma::mat > *, ConcatPerformance< NegativeLogLikelihood< arma::mat, arma::mat >, arma::mat, arma::mat > *, Constant< arma::mat, arma::mat > *, Convolution< NaiveConvolution< ValidConvolution >, NaiveConvolution< FullConvolution >, NaiveConvolution< ValidConvolution >, arma::mat, arma::mat > *, DropConnect< arma::mat, arma::mat > *, Dropout< arma::mat, arma::mat > *, Glimpse< arma::mat, arma::mat > *, HardTanH< arma::mat, arma::mat > *, Join< arma::mat, arma::mat > *, LeakyReLU< arma::mat, arma::mat > *, Linear< arma::mat, arma::mat > *, LinearNoBias< arma::mat, arma::mat > *, LogSoftMax< arma::mat, arma::mat > *, Lookup< arma::mat, arma::mat > *, LSTM< arma::mat, arma::mat > *, MaxPooling< arma::mat, arma::mat > *, MeanPooling< arma::mat, arma::mat > *, MeanSquaredError< arma::mat, arma::mat > *, MultiplyConstant< arma::mat, arma::mat > *, NegativeLogLikelihood< arma::mat, arma::mat > *, PReLU< arma::mat, arma::mat > *, Recurrent< arma::mat, arma::mat > *, RecurrentAttention< arma::mat, arma::mat > *, ReinforceNormal< arma::mat, arma::mat > *, Select< arma::mat, arma::mat > *, Sequential< arma::mat, arma::mat > *, VRClassReward< arma::mat, arma::mat > * >
 
template<class ActivationFunction = RectifierFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using ReLULayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard rectified linear unit non-linearity layer. More...
 
template<class ActivationFunction = LogisticFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using SigmoidLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard Sigmoid-Layer using the logistic activation function. More...
 
template<class ActivationFunction = TanhFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using TanHLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard hyperbolic tangent layer. More...
 

Functions

 HAS_MEM_FUNC (Gradient, HasGradientCheck)
 
 HAS_MEM_FUNC (Deterministic, HasDeterministicCheck)
 
 HAS_MEM_FUNC (Parameters, HasParametersCheck)
 
 HAS_MEM_FUNC (Add, HasAddCheck)
 
 HAS_MEM_FUNC (Model, HasModelCheck)
 
 HAS_MEM_FUNC (Location, HasLocationCheck)
 
 HAS_MEM_FUNC (Reset, HasResetCheck)
 
 HAS_MEM_FUNC (Reward, HasRewardCheck)
 
 HAS_MEM_FUNC (InputWidth, HasInputWidth)
 
 HAS_MEM_FUNC (InputHeight, HasInputHeight)
 
 HAS_MEM_FUNC (InputHeight, HasRho)
 

Detailed Description

Artificial Neural Network.

Typedef Documentation

template<class ActivationFunction = IdentityFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using mlpack::ann::IdentityLayer = typedef BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard Identity-Layer using the identity activation function.

Definition at line 147 of file base_layer.hpp.

using mlpack::ann::LayerTypes = typedef boost::variant< Add<arma::mat, arma::mat>*, AddMerge<arma::mat, arma::mat>*, BaseLayer<LogisticFunction, arma::mat, arma::mat>*, BaseLayer<IdentityFunction, arma::mat, arma::mat>*, BaseLayer<TanhFunction, arma::mat, arma::mat>*, BaseLayer<RectifierFunction, arma::mat, arma::mat>*, Concat<arma::mat, arma::mat>*, ConcatPerformance<NegativeLogLikelihood<arma::mat, arma::mat>, arma::mat, arma::mat>*, Constant<arma::mat, arma::mat>*, Convolution<NaiveConvolution<ValidConvolution>, NaiveConvolution<FullConvolution>, NaiveConvolution<ValidConvolution>, arma::mat, arma::mat>*, DropConnect<arma::mat, arma::mat>*, Dropout<arma::mat, arma::mat>*, Glimpse<arma::mat, arma::mat>*, HardTanH<arma::mat, arma::mat>*, Join<arma::mat, arma::mat>*, LeakyReLU<arma::mat, arma::mat>*, Linear<arma::mat, arma::mat>*, LinearNoBias<arma::mat, arma::mat>*, LogSoftMax<arma::mat, arma::mat>*, Lookup<arma::mat, arma::mat>*, LSTM<arma::mat, arma::mat>*, MaxPooling<arma::mat, arma::mat>*, MeanPooling<arma::mat, arma::mat>*, MeanSquaredError<arma::mat, arma::mat>*, MultiplyConstant<arma::mat, arma::mat>*, NegativeLogLikelihood<arma::mat, arma::mat>*, PReLU<arma::mat, arma::mat>*, Recurrent<arma::mat, arma::mat>*, RecurrentAttention<arma::mat, arma::mat>*, ReinforceNormal<arma::mat, arma::mat>*, Select<arma::mat, arma::mat>*, Sequential<arma::mat, arma::mat>*, VRClassReward<arma::mat, arma::mat>* >

Definition at line 115 of file layer_types.hpp.

template<class ActivationFunction = RectifierFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using mlpack::ann::ReLULayer = typedef BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard rectified linear unit non-linearity layer.

Definition at line 158 of file base_layer.hpp.

template<class ActivationFunction = LogisticFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using mlpack::ann::SigmoidLayer = typedef BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard Sigmoid-Layer using the logistic activation function.

Definition at line 136 of file base_layer.hpp.

template<class ActivationFunction = TanhFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using mlpack::ann::TanHLayer = typedef BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard hyperbolic tangent layer.

Definition at line 169 of file base_layer.hpp.

Function Documentation

mlpack::ann::HAS_MEM_FUNC (Gradient, HasGradientCheck)
mlpack::ann::HAS_MEM_FUNC (Deterministic, HasDeterministicCheck)
mlpack::ann::HAS_MEM_FUNC (Parameters, HasParametersCheck)
mlpack::ann::HAS_MEM_FUNC (Add, HasAddCheck)
mlpack::ann::HAS_MEM_FUNC (Model, HasModelCheck)
mlpack::ann::HAS_MEM_FUNC (Location, HasLocationCheck)
mlpack::ann::HAS_MEM_FUNC (Reset, HasResetCheck)
mlpack::ann::HAS_MEM_FUNC (Reward, HasRewardCheck)
mlpack::ann::HAS_MEM_FUNC (InputWidth, HasInputWidth)
mlpack::ann::HAS_MEM_FUNC (InputHeight, HasInputHeight)
mlpack::ann::HAS_MEM_FUNC (InputHeight, HasRho)