ConvNet 1.0
A GPU-based C++ implementation of Convolutional Neural Nets
ReLULayer Class Reference

Implements a layer with a rectified linear activation function. More...

#include <layer.h>

Inheritance diagram for ReLULayer:
ReLULayer → LinearLayer → Layer

Public Member Functions

 ReLULayer (const config::Layer &config)
 
virtual void ApplyActivation (bool train)
 Apply the activation function. More...
 
virtual void ApplyDerivativeOfActivation ()
 Apply the derivative of the activation. More...
 
- Public Member Functions inherited from LinearLayer
 LinearLayer (const config::Layer &config)
 
virtual void AllocateMemory (int imgsize, int batch_size)
 Allocate memory for storing the state and derivative at this layer. More...
 
virtual void ComputeDeriv ()
 Compute derivative of loss function. More...
 
virtual float GetLoss ()
 Compute the value of the loss function that is displayed during training. More...
 
- Public Member Functions inherited from Layer
 Layer (const config::Layer &config)
 Instantiate a layer from config. More...
 
virtual float GetLoss2 ()
 Compute the value of the actual loss function. More...
 
void ApplyDropout (bool train)
 Apply dropout to this layer. More...
 
void ApplyDerivativeofDropout ()
 Apply derivative of dropout. More...
 
void AccessStateBegin ()
 
void AccessStateEnd ()
 
void AccessDerivBegin ()
 
void AccessDerivEnd ()
 
Edge * GetIncomingEdge (int index)
 Returns the incoming edge by index. More...
 
Matrix & GetState ()
 Returns a reference to the state of the layer. More...
 
Matrix & GetDeriv ()
 Returns a reference to the deriv at this layer. More...
 
Matrix & GetData ()
 Returns a reference to the data at this layer. More...
 
void Display ()
 
void Display (int image_id)
 
void AddIncoming (Edge *e)
 Add an incoming edge to this layer. More...
 
void AddOutgoing (Edge *e)
 Add an outgoing edge from this layer. More...
 
const string & GetName () const
 
int GetNumChannels () const
 
int GetSize () const
 
bool IsInput () const
 
bool IsOutput () const
 
int GetGPUId () const
 
void AllocateMemoryOnOtherGPUs ()
 
Matrix & GetOtherState (int gpu_id)
 
Matrix & GetOtherDeriv (int gpu_id)
 
void SyncIncomingState ()
 
void SyncOutgoingState ()
 
void SyncIncomingDeriv ()
 
void SyncOutgoingDeriv ()
 

Protected Attributes

const bool rectify_after_gaussian_dropout_
 
- Protected Attributes inherited from Layer
const string name_
 
const int num_channels_
 
const bool is_input_
 
const bool is_output_
 
const float dropprob_
 
const bool display_
 
const bool dropout_scale_up_at_train_time_
 
const bool gaussian_dropout_
 
const float max_act_gaussian_dropout_
 
int scale_targets_
 
int image_size_
 
Matrix state_
 State (activation) of the layer. More...
 
Matrix deriv_
 Deriv of the loss function w.r.t. More...
 
Matrix data_
 Data (targets) associated with this layer. More...
 
Matrix rand_gaussian_
 Need to store random variates when doing gaussian dropout. More...
 
map< int, Matrix * > other_states_
 Copies of this layer's state on other gpus. More...
 
map< int, Matrix * > other_derivs_
 Copies of this layer's deriv on other gpus. More...
 
ImageDisplayer * img_display_
 
const int gpu_id_
 
set< int > other_incoming_gpu_ids_
 
set< int > other_outgoing_gpu_ids_
 

Additional Inherited Members

- Static Public Member Functions inherited from Layer
static Layer * ChooseLayerClass (const config::Layer &layer_config)
 
- Public Attributes inherited from Layer
vector< Edge * > incoming_edge_
 
vector< Edge * > outgoing_edge_
 
bool has_incoming_from_same_gpu_
 
bool has_outgoing_to_same_gpu_
 
bool has_incoming_from_other_gpus_
 
bool has_outgoing_to_other_gpus_
 
- Protected Member Functions inherited from Layer
void ApplyDropoutAtTrainTime ()
 
void ApplyDropoutAtTestTime ()
 

Detailed Description

Implements a layer with a rectified linear activation function.

Member Function Documentation

virtual void ReLULayer::ApplyActivation (bool train)

Apply the activation function.

Derived classes must implement this. This method applies the activation function to state_, overwriting it in place.

Parameters
train: If true, use dropout.

Reimplemented from LinearLayer.
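As a concrete illustration, here is a minimal CPU sketch of what this method does conceptually. The real layer operates on the GPU-resident Matrix state_; the function name ApplyReLU and the plain std::vector buffer below are illustrative stand-ins, not part of the actual API.

```cpp
#include <algorithm>
#include <vector>

// CPU sketch of the ReLU activation: overwrite each element of the
// layer's state with max(0, x), mirroring how ApplyActivation
// overwrites state_ in place on the GPU.
void ApplyReLU(std::vector<float>& state) {
  for (float& x : state) {
    x = std::max(0.0f, x);  // rectify: negative values become zero
  }
}
```

Note that because the state is overwritten, the original pre-activation values are not retained; the backward pass must recover the gradient mask from the rectified output itself.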

virtual void ReLULayer::ApplyDerivativeOfActivation ()

Apply the derivative of the activation.

Derived classes must implement this. Computes the derivative w.r.t. the inputs to this layer from the derivative w.r.t. the outputs of this layer. Applies the derivative of the activation function to deriv_ and overwrites it.

Reimplemented from LinearLayer.
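Sketched on the CPU under the same assumptions (the hypothetical name ApplyReLUDeriv and plain float buffers standing in for the GPU Matrix objects), the backward step masks the incoming derivative wherever the rectified output is zero:

```cpp
#include <cstddef>
#include <vector>

// CPU sketch of the ReLU backward step. Because ApplyActivation
// overwrote state_ with max(0, x), the stored output is positive
// exactly where the pre-activation input was, so the gradient
// w.r.t. the input is the incoming deriv masked by (state > 0).
void ApplyReLUDeriv(std::vector<float>& deriv,
                    const std::vector<float>& state) {
  for (std::size_t i = 0; i < deriv.size(); ++i) {
    if (state[i] <= 0.0f) {
      deriv[i] = 0.0f;  // gradient is zero where ReLU clipped the input
    }
  }
}
```

This is why ReLU needs no saved pre-activation buffer: the mask is fully recoverable from the overwritten state, which keeps the layer's memory footprint down.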


The documentation for this class was generated from the following files: