mlpack::ann::RNN< OutputLayerType, InitializationRuleType > Class Template Reference

Implementation of a standard recurrent neural network container. More...

Public Types

using NetworkType = RNN< OutputLayerType, InitializationRuleType >
 Convenience typedef for the internal model construction. More...
 

Public Member Functions

 RNN (const size_t rho, const bool single=false, OutputLayerType outputLayer=OutputLayerType(), InitializationRuleType initializeRule=InitializationRuleType())
 Create the RNN object without providing a training set; the predictors and responses can be supplied later via Train(). More...
 
 RNN (const arma::mat &predictors, const arma::mat &responses, const size_t rho, const bool single=false, OutputLayerType outputLayer=OutputLayerType(), InitializationRuleType initializeRule=InitializationRuleType())
 Create the RNN object with the given predictors and responses set (this is the set that is used to train the network). More...
 
 ~RNN ()
 Destructor to release allocated memory. More...
 
template<typename LayerType >
void Add (const LayerType &layer)
 
template<class LayerType , class... Args>
void Add (Args... args)
 
void Add (LayerTypes layer)
 
double Evaluate (const arma::mat &, const size_t i, const bool deterministic=true)
 Evaluate the recurrent neural network with the given parameters. More...
 
void Gradient (const arma::mat &parameters, const size_t i, arma::mat &gradient)
 Evaluate the gradient of the recurrent neural network with the given parameters, and with respect to only one point in the dataset. More...
 
size_t NumFunctions () const
 Return the number of separable functions (the number of predictor points). More...
 
const arma::mat & Parameters () const
 Return the initial point for the optimization. More...
 
arma::mat & Parameters ()
 Modify the initial point for the optimization. More...
 
void Predict (arma::mat &predictors, arma::mat &responses)
 Predict the responses to a given set of predictors. More...
 
template<typename Archive >
void Serialize (Archive &ar, const unsigned int)
 Serialize the model. More...
 
template<template< typename > class OptimizerType = mlpack::optimization::SGD>
void Train (const arma::mat &predictors, const arma::mat &responses, OptimizerType< NetworkType > &optimizer)
 Train the recurrent neural network on the given input data using the given optimizer. More...
 
template<template< typename > class OptimizerType = mlpack::optimization::SGD>
void Train (const arma::mat &predictors, const arma::mat &responses)
 Train the recurrent neural network on the given input data. More...
 

Private Member Functions

void Backward ()
 The Backward algorithm (part of the Forward-Backward algorithm). More...
 
void Forward (arma::mat &&input)
 The Forward algorithm (part of the Forward-Backward algorithm). More...
 
void Gradient ()
 Iterate through all layer modules and update the gradient using the layer-defined optimizer. More...
 
void ResetDeterministic ()
 Reset the module status by setting the current deterministic parameter for all modules that implement the Deterministic function. More...
 
void ResetGradients (arma::mat &gradient)
 Reset the gradient for all modules that implement the Gradient function. More...
 
void ResetParameters ()
 Reset the module information (weights/parameters). More...
 
void SinglePredict (const arma::mat &predictors, arma::mat &responses)
 

Private Attributes

arma::mat currentInput
 The current input of the forward/backward pass. More...
 
DeleteVisitor deleteVisitor
 Locally-stored delete visitor. More...
 
DeltaVisitor deltaVisitor
 Locally-stored delta visitor. More...
 
bool deterministic
 The current evaluation mode (training or testing). More...
 
arma::mat error
 The current error for the backward pass. More...
 
InitializationRuleType initializeRule
 Instantiated InitializationRule object for initializing the network parameter. More...
 
size_t inputSize
 The input size. More...
 
std::vector< arma::mat > moduleOutputParameter
 List of all module parameters for the backward pass (BPTT). More...
 
std::vector< LayerTypes > network
 Locally-stored model modules. More...
 
size_t numFunctions
 The number of separable functions (the number of predictor points). More...
 
OutputLayerType outputLayer
 Instantiated output layer used to evaluate the network. More...
 
OutputParameterVisitor outputParameterVisitor
 Locally-stored output parameter visitor. More...
 
size_t outputSize
 The output size. More...
 
arma::mat parameter
 Matrix of (trained) parameters. More...
 
arma::mat predictors
 The matrix of data points (predictors). More...
 
bool reset
 Indicator of whether the model has already been trained. More...
 
ResetVisitor resetVisitor
 Locally-stored reset visitor. More...
 
arma::mat responses
 The matrix of responses to the input data points. More...
 
size_t rho
 Number of steps to backpropagate through time (BPTT). More...
 
bool single
 Only predict the last element of the input sequence. More...
 
size_t targetSize
 The target size. More...
 
WeightSizeVisitor weightSizeVisitor
 Locally-stored weight size visitor. More...
 

Detailed Description

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
class mlpack::ann::RNN< OutputLayerType, InitializationRuleType >

Implementation of a standard recurrent neural network container.

Template Parameters
OutputLayerType: The output layer type used to evaluate the network.
InitializationRuleType: Rule used to initialize the weight matrix.

Definition at line 40 of file rnn.hpp.
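
For orientation, the following is a minimal usage sketch (not taken from the mlpack sources): the layer stack, layer sizes, and data layout are illustrative, assuming the usual convention that each column of the predictor matrix holds one flattened input sequence of rho time steps.

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/rnn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

int main()
{
  // Hypothetical data: one flattened sequence per column (left empty here).
  arma::mat predictors, responses, testData, predictions;

  const size_t rho = 10;            // Steps to backpropagate through time.
  RNN<> model(rho);                 // NegativeLogLikelihood<> output layer by default.
  model.Add<IdentityLayer<> >();
  model.Add<LSTM<> >(1, 16, rho);   // 1 input unit, 16 recurrent units per step.
  model.Add<Linear<> >(16, 4);
  model.Add<LogSoftMax<> >();

  model.Train(predictors, responses);    // Default SGD optimizer.
  model.Predict(testData, predictions);
}
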

Member Typedef Documentation

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
using mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::NetworkType = RNN<OutputLayerType, InitializationRuleType>

Convenience typedef for the internal model construction.

Definition at line 44 of file rnn.hpp.

Constructor & Destructor Documentation

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::RNN ( const size_t  rho,
const bool  single = false,
OutputLayerType  outputLayer = OutputLayerType(),
InitializationRuleType  initializeRule = InitializationRuleType() 
)

Create the RNN object without providing a training set; the predictors and responses used for training can be supplied later via Train().

Optionally, specify which initialization rule and performance function should be used.

Parameters
rho: Maximum number of steps to backpropagate through time (BPTT).
single: Predict only the last element of the input sequence.
outputLayer: Output layer used to evaluate the network.
initializeRule: Optional instantiated InitializationRule object for initializing the network parameter.
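
A minimal sketch of this constructor, assuming the default output layer and initialization rule; the training set is supplied later through Train(), and the rho value shown is illustrative:

RNN<> model(5 /* rho */, true /* single: score only the last step of each sequence */);
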
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::RNN ( const arma::mat &  predictors,
const arma::mat &  responses,
const size_t  rho,
const bool  single = false,
OutputLayerType  outputLayer = OutputLayerType(),
InitializationRuleType  initializeRule = InitializationRuleType() 
)

Create the RNN object with the given predictors and responses set (this is the set that is used to train the network).

Optionally, specify which initialization rule and performance function should be used.

Parameters
predictors: Input training variables.
responses: Output results corresponding to the input training variables.
rho: Maximum number of steps to backpropagate through time (BPTT).
single: Predict only the last element of the input sequence.
outputLayer: Output layer used to evaluate the network.
initializeRule: Optional instantiated InitializationRule object for initializing the network parameter.
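
A minimal sketch of this overload, with hypothetical matrices trainData and trainLabels holding one sequence per column; the stored set is what Evaluate() and Gradient() later operate on (layer sizes are illustrative):

RNN<> model(trainData, trainLabels, 5 /* rho */);
model.Add<LSTM<> >(1, 8, 5);    // 1 input unit, 8 recurrent units per step.
model.Add<Linear<> >(8, 2);
model.Add<LogSoftMax<> >();
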
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::~RNN ( )

Destructor to release allocated memory.

Member Function Documentation

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
template<typename LayerType >
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Add ( const LayerType &  layer)
inline
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
template<class LayerType , class... Args>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Add ( Args...  args)
inline
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Add ( LayerTypes  layer)
inline
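
The three Add() overloads cover copying an existing layer object, constructing a layer in place from its arguments, and pushing an already-allocated layer held in the LayerTypes variant. A short sketch, with hypothetical layer sizes:

const size_t rho = 4;                 // Hypothetical BPTT length.
RNN<> model(rho);
model.Add<LSTM<> >(1, 8, rho);        // In-place construction: Add<LayerType>(args...).
model.Add(new LogSoftMax<>());        // Hand over an already-allocated layer via the LayerTypes variant.
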
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Backward ( )
private

The Backward algorithm (part of the Forward-Backward algorithm).

Computes the backward pass for each module.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
double mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Evaluate ( const arma::mat &  ,
const size_t  i,
const bool  deterministic = true 
)

Evaluate the recurrent neural network with the given parameters.

This function is usually called by the optimizer to train the model.

Parameters
parameters: Matrix of model parameters.
i: Index of the point to use for objective function evaluation.
deterministic: Whether to run the network in testing (deterministic) mode or training mode. Note that some layers behave differently in training and testing modes.
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Forward ( arma::mat &&  input)
private

The Forward algorithm (part of the Forward-Backward algorithm).

Computes forward probabilities for each module.

Parameters
input: Data sequence to compute probabilities for.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Gradient ( const arma::mat &  parameters,
const size_t  i,
arma::mat &  gradient 
)

Evaluate the gradient of the recurrent neural network with the given parameters, and with respect to only one point in the dataset.

This is useful for optimizers such as SGD, which require a separable objective function.

Parameters
parameters: Matrix of the model parameters to be optimized.
i: Index of the point to use for objective function gradient evaluation.
gradient: Matrix to output the gradient into.
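
Together with Evaluate() and NumFunctions(), this forms the separable-function interface that SGD-style optimizers consume. The loop below is a loose sketch of that contract (not mlpack's actual optimizer code), assuming an RNN<> named model that already holds a training set:

arma::mat gradient;
for (size_t i = 0; i < model.NumFunctions(); ++i)
{
  model.Evaluate(model.Parameters(), i, false);     // Forward pass for point i in training mode.
  model.Gradient(model.Parameters(), i, gradient);  // BPTT gradient with respect to point i.
  model.Parameters() -= 0.01 * gradient;            // Plain gradient step with a fixed, illustrative rate.
}
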
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Gradient ( )
private

Iterate through all layer modules and update the gradient using the layer-defined optimizer.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::NumFunctions ( ) const
inline

Return the number of separable functions (the number of predictor points).

Definition at line 186 of file rnn.hpp.

References mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::numFunctions.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
const arma::mat& mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters ( ) const
inline

Return the initial point for the optimization.

Definition at line 189 of file rnn.hpp.

References mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::parameter.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat& mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters ( )
inline
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Predict ( arma::mat &  predictors,
arma::mat &  responses 
)

Predict the responses to a given set of predictors.

The responses will reflect the output of the given output layer as returned by the output layer function.

Parameters
predictors: Input predictors.
responses: Matrix to put the output predictions of the responses into.
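
A short usage sketch, with a hypothetical test matrix laid out the same way as the training predictors (one flattened sequence per column):

arma::mat testData;      // One flattened input sequence per column.
arma::mat predictions;   // Filled by Predict(); one predicted output sequence per column.
model.Predict(testData, predictions);
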
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::ResetDeterministic ( )
private

Reset the module status by setting the current deterministic parameter for all modules that implement the Deterministic function.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::ResetGradients ( arma::mat &  gradient)
private

Reset the gradient for all modules that implement the Gradient function.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::ResetParameters ( )
private

Reset the module information (weights/parameters).

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
template<typename Archive >
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Serialize ( Archive &  ar,
const unsigned  int 
)
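
Serialize() is typically invoked indirectly through mlpack's data::Save() and data::Load() helpers rather than called by hand. A sketch under that assumption (the file name, object name, and rho of the placeholder object are all arbitrary/illustrative):

mlpack::data::Save("rnn-model.xml", "rnn", model);     // Writes the model state via Serialize().

RNN<> restored(5 /* rho */);                           // Placeholder object to load into.
mlpack::data::Load("rnn-model.xml", "rnn", restored);
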
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::SinglePredict ( const arma::mat &  predictors,
arma::mat &  responses 
)
private
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
template<template< typename > class OptimizerType = mlpack::optimization::SGD>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Train ( const arma::mat &  predictors,
const arma::mat &  responses,
OptimizerType< NetworkType > &  optimizer 
)

Train the recurrent neural network on the given input data using the given optimizer.

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
Parameters
predictors: Input training variables.
responses: Output results corresponding to the input training variables.
optimizer: Instantiated optimizer used to train the model.
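
A sketch of the instantiated-optimizer form, assuming an RNN<> named model and hypothetical training matrices trainData/trainLabels; the SGD header (mlpack/core/optimizers/sgd/sgd.hpp) and its constructor arguments (step size, iteration budget) are illustrative and may differ between mlpack versions:

mlpack::optimization::SGD<RNN<> > opt(model, 0.01, 20 * trainData.n_cols);
model.Train(trainData, trainLabels, opt);
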
template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
template<template< typename > class OptimizerType = mlpack::optimization::SGD>
void mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Train ( const arma::mat &  predictors,
const arma::mat &  responses 
)

Train the recurrent neural network on the given input data.

By default, the SGD optimization algorithm is used, but others can be specified (such as mlpack::optimization::RMSprop).

This will use the existing model parameters as a starting point for the optimization. If this is not what you want, then you should access the parameters vector directly with Parameters() and modify it as desired.

Template Parameters
OptimizerType: Type of optimizer to use to train the model.
Parameters
predictors: Input training variables.
responses: Output results corresponding to the input training variables.
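
A short sketch of the default-optimizer form, with the same hypothetical matrices as above; a second call continues from the current Parameters() rather than restarting:

model.Train(trainData, trainLabels);   // First fit with the default SGD optimizer.
model.Train(trainData, trainLabels);   // Continues from the learned parameters.
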

Member Data Documentation

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::currentInput
private

The current input of the forward/backward pass.

Definition at line 287 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
DeleteVisitor mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::deleteVisitor
private

Locally-stored delete visitor.

Definition at line 305 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
DeltaVisitor mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::deltaVisitor
private

Locally-stored delta visitor.

Definition at line 290 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
bool mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::deterministic
private

The current evaluation mode (training or testing).

Definition at line 308 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::error
private

The current error for the backward pass.

Definition at line 284 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
InitializationRuleType mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::initializeRule
private

Instantiated InitializationRule object for initializing the network parameter.

Definition at line 251 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::inputSize
private

The input size.

Definition at line 254 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
std::vector<arma::mat> mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::moduleOutputParameter
private

List of all module parameters for the backward pass (BPTT).

Definition at line 296 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
std::vector<LayerTypes> mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::network
private

Locally-stored model modules.

Definition at line 269 of file rnn.hpp.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Add().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::numFunctions
private

The number of separable functions (the number of predictor points).

Definition at line 281 of file rnn.hpp.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::NumFunctions().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
OutputLayerType mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::outputLayer
private

Instantiated output layer used to evaluate the network.

Definition at line 247 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
OutputParameterVisitor mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::outputParameterVisitor
private

Locally-stored output parameter visitor.

Definition at line 293 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::outputSize
private

The output size.

Definition at line 257 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::parameter
private

Matrix of (trained) parameters.

Definition at line 278 of file rnn.hpp.

Referenced by mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::Parameters().

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::predictors
private

The matrix of data points (predictors).

Definition at line 272 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
bool mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::reset
private

Indicator of whether the model has already been trained.

Definition at line 263 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
ResetVisitor mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::resetVisitor
private

Locally-stored reset visitor.

Definition at line 302 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
arma::mat mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::responses
private

The matrix of responses to the input data points.

Definition at line 275 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::rho
private

Number of steps to backpropagate through time (BPTT).

Definition at line 244 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
bool mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::single
private

Only predict the last element of the input sequence.

Definition at line 266 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
size_t mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::targetSize
private

The target size.

Definition at line 260 of file rnn.hpp.

template<typename OutputLayerType = NegativeLogLikelihood<>, typename InitializationRuleType = RandomInitialization>
WeightSizeVisitor mlpack::ann::RNN< OutputLayerType, InitializationRuleType >::weightSizeVisitor
private

Locally-stored weight size visitor.

Definition at line 299 of file rnn.hpp.


The documentation for this class was generated from the following file:
rnn.hpp