MyCaffe  1.12.2.41
Deep learning software for Windows C# programmers.
MyCaffe.layers.Layer< T > Class Template Reference [abstract]

An interface for the units of computation which can be composed into a Net. More...

Inheritance diagram for MyCaffe.layers.Layer< T >:
MyCaffe.layers.AccuracyLayer< T > MyCaffe.layers.ArgMaxLayer< T > MyCaffe.layers.AttentionLayer< T > MyCaffe.layers.BaseConvolutionLayer< T > MyCaffe.layers.BaseDataLayer< T > MyCaffe.layers.BatchNormLayer< T > MyCaffe.layers.BatchReindexLayer< T > MyCaffe.layers.BiasLayer< T > MyCaffe.layers.ConcatLayer< T > MyCaffe.layers.ConstantLayer< T > MyCaffe.layers.CopyLayer< T > MyCaffe.layers.CropLayer< T > MyCaffe.layers.DataNormalizerLayer< T > MyCaffe.layers.DebugLayer< T > MyCaffe.layers.DummyDataLayer< T > MyCaffe.layers.EltwiseLayer< T > MyCaffe.layers.EmbedLayer< T > MyCaffe.layers.FilterLayer< T > MyCaffe.layers.FlattenLayer< T > MyCaffe.layers.Im2colLayer< T > MyCaffe.layers.InnerProductLayer< T > MyCaffe.layers.InputLayer< T > MyCaffe.layers.LRNLayer< T > MyCaffe.layers.LSTMAttentionLayer< T > MyCaffe.layers.LSTMSimpleLayer< T > MyCaffe.layers.LSTMUnitLayer< T > MyCaffe.layers.LossLayer< T > MyCaffe.layers.MVNLayer< T > MyCaffe.layers.NeuronLayer< T > MyCaffe.layers.PoolingLayer< T > MyCaffe.layers.RecurrentLayer< T > MyCaffe.layers.ReductionLayer< T > MyCaffe.layers.ReshapeLayer< T > MyCaffe.layers.SPPLayer< T > MyCaffe.layers.ScaleLayer< T > MyCaffe.layers.SilenceLayer< T > MyCaffe.layers.SliceLayer< T > MyCaffe.layers.SoftmaxLayer< T > MyCaffe.layers.SplitLayer< T > MyCaffe.layers.TileLayer< T > MyCaffe.layers.beta.AccuracyDecodeLayer< T > MyCaffe.layers.beta.AccuracyEncodingLayer< T > MyCaffe.layers.beta.ConvolutionOctaveLayer< T > MyCaffe.layers.beta.DataSequenceLayer< T > MyCaffe.layers.beta.DecodeLayer< T > MyCaffe.layers.beta.GatherLayer< T > MyCaffe.layers.beta.GlobResNormLayer< T > MyCaffe.layers.beta.InterpLayer< T > MyCaffe.layers.beta.KnnLayer< T > MyCaffe.layers.beta.MergeLayer< T > MyCaffe.layers.beta.ModelDataLayer< T > MyCaffe.layers.beta.Normalization1Layer< T > MyCaffe.layers.beta.SqueezeLayer< T > MyCaffe.layers.beta.TextDataLayer< T > MyCaffe.layers.beta.TransposeLayer< T > MyCaffe.layers.beta.UnPoolingLayer1< T > 
MyCaffe.layers.beta.UnPoolingLayer< T > MyCaffe.layers.beta.UnsqueezeLayer< T > MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T > MyCaffe.layers.gpt.CausalSelfAttentionLayer< T > MyCaffe.layers.gpt.LayerNormLayer< T > MyCaffe.layers.gpt.MultiheadAttentionLayer< T > MyCaffe.layers.gpt.PositionalEncodingLayer< T > MyCaffe.layers.gpt.TokenizedDataLayer< T > MyCaffe.layers.gpt.TokenizedDataPairsLayer< T > MyCaffe.layers.gpt.TransformerBlockLayer< T > MyCaffe.layers.hdf5.HDF5DataLayer< T > MyCaffe.layers.lnn.CfcLayer< T > MyCaffe.layers.lnn.LnnUnitLayer< T > MyCaffe.layers.nt.EventLayer< T > MyCaffe.layers.nt.GramLayer< T > MyCaffe.layers.nt.OneHotLayer< T > MyCaffe.layers.ssd.DetectionEvaluateLayer< T > MyCaffe.layers.ssd.DetectionOutputLayer< T > MyCaffe.layers.ssd.Normalization2Layer< T > MyCaffe.layers.ssd.PermuteLayer< T > MyCaffe.layers.ssd.PriorBoxLayer< T > MyCaffe.layers.tft.CategoricalTransformationLayer< T > MyCaffe.layers.tft.ChannelEmbeddingLayer< T > MyCaffe.layers.tft.DataTemporalLayer< T > MyCaffe.layers.tft.GateAddNormLayer< T > MyCaffe.layers.tft.GluLayer< T > MyCaffe.layers.tft.GrnLayer< T > MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T > MyCaffe.layers.tft.NumericTransformationLayer< T > MyCaffe.layers.tft.QuantileAccuracyLayer< T > MyCaffe.layers.tft.ReshapeTemporalLayer< T > MyCaffe.layers.tft.VarSetNetLayer< T >

Public Member Functions

 Layer (CudaDnn< T > cuda, Log log, LayerParameter p)
 The Layer constructor. More...
 
void Dispose ()
 Releases all GPU and host resources used by the Layer. More...
 
virtual void ConnectLoss (LossLayer< T > layer)
 Called to connect the loss OnLoss event to a specified layer (typically the data layer). More...
 
virtual BlobCollection< T > PreProcessInput (PropertySet customInput, out int nSeqLen, BlobCollection< T > colBottom=null)
 The PreProcessInput method allows derivative data layers to convert a property set of input data into the bottom blob collection used as input. More...
 
virtual bool PreProcessInput (string strEncInput, int? nDecInput, BlobCollection< T > colBottom)
 Preprocess the input data for the RUN phase. More...
 
virtual List< Tuple< string, int, double > > PostProcessOutput (Blob< T > blobSofmtax, int nK=1)
 The PostProcessOutput allows derivative data layers to post-process the results, converting them back into text results (e.g., detokenizing). More...
 
virtual List< Tuple< string, int, double > > PostProcessLogitsOutput (int nCurIdx, Blob< T > blobLogits, Layer< T > softmax, int nAxis, int nK=1)
 The PostProcessLogitsOutput allows derivative data layers to post-process the results, converting them back into text results (e.g., detokenizing). More...
 
virtual string PostProcessFullOutput (Blob< T > blobSoftmax)
 The PostProcessFullOutput allows derivative data layers to post-process the results, usually by detokenizing the data in the blobSoftmax. More...
 
virtual string PostProcessOutput (int nIdx)
 Convert the index to the word. More...
 
virtual void SetOnDebug (EventHandler< GetWorkBlobArgs< T > > fn)
 Set the OnDebug event. More...
 
virtual void ResetOnDebug (EventHandler< GetWorkBlobArgs< T > > fn)
 Reset the OnDebug event, disabling it. More...
 
virtual bool ReInitializeParameters (WEIGHT_TARGET target)
 Re-initialize the parameters of the layer. More...
 
void SetNetReshapeRequest ()
 Called by the Net when requesting a reshape. More...
 
void SetPhase (Phase phase)
 Changes the layer's Phase to the one specified. More...
 
void Setup (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Implements common Layer setup functionality. More...
 
abstract void LayerSetUp (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Performs Layer specific setup. Derived layers should override this function as well as the Reshape function. More...
 
virtual void SetNetParameterUsed (NetParameter np)
 This function allows other layers to gather needed information from the NetParameters, if any, and is called when initializing the Net. More...
 
abstract void Reshape (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Adjust the shapes of top blobs and internal buffers to accommodate the shapes of the bottom blobs. More...
 
void ConvertToBase (BlobCollection< T > col)
 ConvertToBase converts any blobs in a collection that are in half size to the base size. More...
 
double Forward (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Given the bottom (input) Blobs, this function computes the top (output) Blobs and the loss. More...
 
void Backward (BlobCollection< T > colTop, List< bool > rgbPropagateDown, BlobCollection< T > colBottom)
 Given the top Blob error gradients, compute the bottom Blob error gradients. More...
 
double loss (int nTopIdx)
 Returns the scalar loss associated with the top Blob at a given index. More...
 
void set_loss (int nTopIdx, double dfLoss)
 Sets the loss associated with a top Blob at a given index. More...
 
virtual bool AllowForceBackward (int nBottomIdx)
 Return whether to allow force_backward for a given bottom (input) Blob index. More...
 
bool param_propagate_down (int nParamIdx)
 Returns whether or not the Layer should compute gradients w.r.t. a parameter at a particular index given by a parameter index. More...
 
void set_param_propagate_down (int nParamIdx, bool bPropagate)
 Sets whether or not the Layer should compute gradients w.r.t. a parameter at a particular index given by a parameter index. More...
 
void SetEnablePassthrough (bool bEnable)
 Enables/disables the pass-through mode. More...
 

Static Public Member Functions

static Layer< T > Create (CudaDnn< T > cuda, Log log, LayerParameter p, CancelEvent evtCancel, IXDatabaseBase db=null, TransferInput trxinput=null)
 Create a new Layer based on the LayerParameter. More...
 

Protected Member Functions

virtual void dispose ()
 Releases all GPU and host resources used by the Layer. More...
 
void dispose (ref Layer< T > l)
 Helper method used to dispose internal layers. More...
 
void dispose (ref Blob< T > b)
 Helper method used to dispose internal blobs. More...
 
void dispose (ref BlobCollection< T > rg, bool bSetToNull=true)
 Dispose the blob collection. More...
 
GetIterationArgs getCurrentIteration ()
 Fires the OnGetIteration event to query the current iteration. More...
 
long convert_to_full (int nCount, long hMem)
 Convert half memory to full memory. More...
 
void convert (BlobCollection< T > col)
 Convert a collection of blobs from / to half size. More...
 
virtual bool reshapeNeeded (BlobCollection< T > colBottom, BlobCollection< T > colTop, bool bReset=true)
 Tests the shapes of both the bottom and top blobs and if they are the same as the previous sizing, returns false indicating that no reshape is needed. More...
 
bool compareShapes (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Compare the shapes of the top and bottom and if the same, return true, otherwise false. More...
 
void setShapes (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Set the internal shape sizes - used when determining if a Reshape is necessary. More...
 
abstract void forward (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 This forward abstract function must be overridden by each derived Layer class to compute the top (output) Blobs for this layer. More...
 
abstract void backward (BlobCollection< T > colTop, List< bool > rgbPropagateDown, BlobCollection< T > colBottom)
 This backward abstract function must be overridden by each derived Layer class to compute the bottom (input) Blob diffs for this Layer. More...
 
virtual void setup_internal_blobs (BlobCollection< T > col)
 Derivative layers should add all internal blobs to the 'col' provided. More...
 
void CheckBlobCounts (BlobCollection< T > colBottom, BlobCollection< T > colTop)
 Called by the Layer::Setup function to check that the number of bottom (input) and top (output) Blobs provided matches the number expected via the {ExactNum,Min,Max}{Bottom,Top}Blobs functions. More...
 
void SetLossWeights (BlobCollection< T > colTop)
 Called by Layer::Setup to initialize the weights associated with any top (output) Blobs in the loss function and store non-zero loss weights in the diff Blob. More...
 
LayerParameter convertLayerParam (LayerParameter pChild, LayerParameter pParent)
 Called to convert a parent LayerParameterEx, used in blob sharing, with a child layer parameter. More...
 
bool shareParameter (Blob< T > b, List< int > rgMinShape, bool bAllowEndsWithComparison=false)
 Attempts to share a parameter Blob if another parameter Blob with the same name and acceptable size is found. More...
 
bool shareLayerBlob (Blob< T > b, List< int > rgMinShape)
 Attempts to share a Layer Blob if another parameter Blob with the same name and acceptable size is found. More...
 
bool shareLayerBlobs (Layer< T > layer)
 Attempts to share the Layer blobs and internal_blobs with matching names and sizes with those in another matching layer. More...
 
virtual WorkspaceArgs getWorkspace ()
 Returns the WorkspaceArgs used to share a workspace between Layers. More...
 
virtual bool setWorkspace (ulong lSizeInBytes)
 Sets the workspace size (in items) and returns true if set, false otherwise. More...
 
void check_nan (Blob< T > b)
 Checks a Blob for NaNs and throws an exception if found. More...
 
T convert (double df)
 Converts a double to a generic. More...
 
T convert (float f)
 Converts a float to a generic. More...
 
double convertD (T df)
 Converts a generic to a double value. More...
 
float convertF (T df)
 Converts a generic to a float value. More...
 
double[] convertD (T[] rg)
 Converts an array of generic values into an array of double values. More...
 
T[] convert (double[] rg)
 Converts an array of double values into an array of generic values. More...
 
float[] convertF (T[] rg)
 Converts an array of generic values into an array of float values. More...
 
T[] convert (float[] rg)
 Converts an array of float values into an array of generic values. More...
 
int val_at (T[] rg, int nIdx)
 Returns the integer value at a given index in a generic array. More...
 
Size size_at (Blob< T > b)
 Returns the Size of a given two element Blob, such as one that stores Blob size information. More...
 

Protected Attributes

LayerParameter.LayerType m_type = LayerParameter.LayerType._MAX
 Specifies the Layer type. More...
 
CudaDnn< T > m_cuda
 Specifies the CudaDnn connection to Cuda. More...
 
Log m_log
 Specifies the Log for output. More...
 
LayerParameter m_param
 Specifies the LayerParameter describing the Layer. More...
 
Phase m_phase
 Specifies the Phase under which the Layer is run. More...
 
BlobCollection< T > m_colBlobs
 Specifies the learnable parameter Blobs of the Layer. More...
 
BlobCollection< T > m_colInternalBlobs = new BlobCollection<T>()
 Specifies internal blobs used by the layer. More...
 
DictionaryMap< bool > m_rgbParamPropagateDown
 Specifies whether or not to compute the learnable diff of each parameter Blob. More...
 
DictionaryMap< double > m_rgLoss
 Specifies the loss values that indicate whether each top (output) Blob has a non-zero weight in the objective function. More...
 
T m_tOne
 Specifies a generic type equal to 1.0. More...
 
T m_tZero
 Specifies a generic type equal to 0.0. More...
 
bool m_bEnablePassthrough = false
 Enables/disables the pass-through mode for the layer. Default = false. More...
 
bool m_bUseHalfSize = false
 Specifies that the half size of the top (if any) should be converted to the base size. More...
 
bool m_bConvertTopOnFwd = false
 Specifies whether or not the layer should convert the top on the forward pass when using half sized memory (typically only done with input data). More...
 
bool m_bConvertTopOnBwd = true
 Specifies whether or not to convert the top on the backward pass when using half sized memory (typically not done on loss layers). More...
 
bool m_bConvertBottom = true
 Specifies whether or not the layer should convert the bottom when using half sized memory. More...
 
bool m_bReshapeOnForwardNeeded = true
 Specifies whether or not a reshape is needed on the forward pass. More...
 
bool m_bNetReshapeRequest = false
 Specifies whether the reshape is requested from a Net.Reshape call or not. More...
 
LayerParameter.LayerType? m_parentLayerType = null
 Specifies the layer type of the parent. More...
 

Properties

LayerParameter.LayerType? parent_layer_type [get]
 Optionally, specifies the parent layer type (e.g. LOSS, etc.) More...
 
virtual bool SupportsPreProcessing [get]
 Should return true when the PreProcessing methods are overridden. More...
 
virtual bool SupportsPostProcessing [get]
 Should return true when the PostProcessing methods are overridden. More...
 
virtual bool SupportsPostProcessingLogits [get]
 Should return true when the PostProcessingLogits methods are overridden. More...
 
virtual bool SupportsPostProcessingFullOutput [get]
 Should return true when PostProcessingFullOutput is supported. More...
 
BlobCollection< T > blobs [get]
 Returns the collection of learnable parameter Blobs for the Layer. More...
 
BlobCollection< T > internal_blobs [get]
 Returns the collection of internal Blobs used by the Layer. More...
 
LayerParameter layer_param [get]
 Returns the LayerParameter for this Layer. More...
 
LayerParameter.LayerType type [get]
 Returns the LayerType of this Layer. More...
 
virtual int ExactNumBottomBlobs [get]
 Returns the exact number of bottom (input) Blobs required by the Layer, or -1 if no exact number is required. More...
 
virtual int MinBottomBlobs [get]
 Returns the minimum number of bottom (input) Blobs required by the Layer, or -1 if no minimum number is required. More...
 
virtual int MaxBottomBlobs [get]
 Returns the maximum number of bottom (input) Blobs required by the Layer, or -1 if no maximum number is required. More...
 
virtual int ExactNumTopBlobs [get]
 Returns the exact number of top (output) Blobs required by the Layer, or -1 if no exact number is required. More...
 
virtual int MinTopBlobs [get]
 Returns the minimum number of top (output) Blobs required by the Layer, or -1 if no minimum number is required. More...
 
virtual int MaxTopBlobs [get]
 Returns the maximum number of top (output) Blobs required by the Layer, or -1 if no maximum number is required. More...
 
virtual bool EqualNumBottomTopBlobs [get]
 Returns true if the Layer requires an equal number of bottom (input) and top (output) Blobs. More...
 
virtual bool AutoTopBlobs [get]
 Return whether "anonymous" top (output) Blobs are created automatically by the Layer. More...
 
double forward_timing [get]
 Returns the timing of the last forward pass in milliseconds. More...
 
double forward_timing_average [get]
 Returns the average timing of the forward passes in milliseconds. More...
 
double backward_timing [get]
 Returns the timing of the last backward pass in milliseconds. More...
 
double backward_timing_average [get]
 Returns the average timing of the backward passes in milliseconds. More...
 

Events

EventHandler< WorkspaceArgs > OnGetWorkspace
 Specifies the OnGetWorkspace event that fires when the getWorkspace() function is called by a layer to get a shareable workspace to conserve GPU memory. More...
 
EventHandler< WorkspaceArgs > OnSetWorkspace
 Specifies the OnSetWorkspace event that fires when the setWorkspace() function is called by a layer to set the shareable workspace used to conserve GPU memory. More...
 
EventHandler< GetIterationArgs > OnGetIteration
 Specifies the OnGetIteration event that fires when a layer needs to get the current iteration from the solver. More...
 
EventHandler< GetWorkBlobArgs< T > > OnDebug
 Specifies the OnDebug event that is only supported when debugging to get a work blob from the primary Net holding this layer. More...
 

Detailed Description

An interface for the units of computation which can be composed into a Net.

Layers must implement an override to the forward function, in which they take their input (bottom) Blobs (if any) and compute their output (top) Blobs (if any). They may also implement an override to the backward function, in which they compute the error gradients with respect to their input Blobs, given the error gradients with respect to their output Blobs.
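
As a hypothetical sketch (not taken from the MyCaffe samples), a minimal derived layer that passes its input through unchanged might look like the following; the Blob helpers ReshapeLike and CopyFrom are assumed to behave as their names suggest:

```csharp
// Hypothetical identity layer sketch; error handling omitted.
public class MyIdentityLayer<T> : Layer<T>
{
    public MyIdentityLayer(CudaDnn<T> cuda, Log log, LayerParameter p)
        : base(cuda, log, p)
    {
    }

    public override void LayerSetUp(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // One-time setup; the bottom Blob dimensions are available here.
    }

    public override void Reshape(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // The top takes the shape of the bottom.
        colTop[0].ReshapeLike(colBottom[0]);
    }

    protected override void forward(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // Identity: top data = bottom data.
        colTop[0].CopyFrom(colBottom[0]);
    }

    protected override void backward(BlobCollection<T> colTop, List<bool> rgbPropagateDown, BlobCollection<T> colBottom)
    {
        // Pass the diffs straight back down when requested.
        if (rgbPropagateDown[0])
            colBottom[0].CopyFrom(colTop[0], true);
    }
}
```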

Template Parameters
T  Specifies the base type float or double. Using float is recommended to conserve GPU memory.

Definition at line 30 of file Layer.cs.
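
The typical calling sequence from a host such as a Net can be sketched as follows (a hedged example: the layer, colBottom and colTop objects are assumed to exist already):

```csharp
// Hypothetical lifecycle sketch: Setup validates blob counts and shapes,
// Forward computes the top blobs and returns the loss, and Backward
// fills the bottom diffs.
layer.Setup(colBottom, colTop);
double dfLoss = layer.Forward(colBottom, colTop);
layer.Backward(colTop, new List<bool>() { true }, colBottom);
layer.Dispose(); // release GPU and host resources when done
```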

Constructor & Destructor Documentation

◆ Layer()

MyCaffe.layers.Layer< T >.Layer ( CudaDnn< T >  cuda,
Log  log,
LayerParameter  p 
)

The Layer constructor.

Setup code for derivative classes should go into an override of the LayerSetUp function, where the dimensions of the Blobs are provided to the Layer.

Parameters
cuda  Specifies the CudaDnn connection to Cuda.
log  Specifies the Log for output.
p  Specifies the LayerParameter that contains the settings of the Layer.

Definition at line 158 of file Layer.cs.

Member Function Documentation

◆ AllowForceBackward()

virtual bool MyCaffe.layers.Layer< T >.AllowForceBackward ( int  nBottomIdx)
virtual

Return whether to allow force_backward for a given bottom (input) Blob index.

If AllowForceBackward(i) == false, the force_backward setting will be ignored, and the Layer backpropagates to Blob i only if it needs gradient information (as is done when force_backward == false).

Parameters
nBottomIdx  Specifies the index of the bottom (input) item to force.

Reimplemented in MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.LossLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, and MyCaffe.layers.RecurrentLayer< T >.

Definition at line 1046 of file Layer.cs.

◆ Backward()

void MyCaffe.layers.Layer< T >.Backward ( BlobCollection< T >  colTop,
List< bool >  rgbPropagateDown,
BlobCollection< T >  colBottom 
)

Given the top Blob error gradients, compute the bottom Blob error gradients.

The Backward function calls the overridden backward function implemented by each specific Layer derivative to compute the bottom (input) Blob diffs given the top (output) Blob diffs.

Parameters
colTop  Specifies a collection of top (output) Blobs, whose diff fields store the gradient of the error with respect to themselves.
rgbPropagateDown  Specifies a List of equal length to the bottom, with each element indicating whether or not to propagate the error gradients down to the bottom Blob at the corresponding index.
colBottom  Specifies a collection of bottom (input) Blobs, whose diff fields are filled with the gradient of the error with respect to themselves after the Backward function is run.

Definition at line 815 of file Layer.cs.
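
For example, when a layer has two bottom blobs and gradients should flow only to the first (a label input usually needs no gradient), rgbPropagateDown can be set per index. This is a hypothetical sketch assuming layer, colTop and colBottom already exist:

```csharp
// Propagate error gradients to bottom[0] only; skip bottom[1].
List<bool> rgbPropagateDown = new List<bool>() { true, false };
layer.Backward(colTop, rgbPropagateDown, colBottom);
```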

◆ backward()

abstract void MyCaffe.layers.Layer< T >.backward ( BlobCollection< T >  colTop,
List< bool >  rgbPropagateDown,
BlobCollection< T >  colBottom 
)
protected pure virtual

This backward abstract function must be overridden by each derived Layer class to compute the bottom (input) Blob diffs for this Layer.

Parameters
colTop  Specifies a collection of top (output) Blobs, whose diff fields store the gradient of the error with respect to themselves.
rgbPropagateDown  Specifies a List of equal length to the bottom, with each element indicating whether or not to propagate the error gradients down to the bottom Blob at the corresponding index.
colBottom  Specifies a collection of bottom (input) Blobs, whose diff fields are filled with the gradient of the error with respect to themselves after the Backward function is run.

Implemented in MyCaffe.layers.beta.AccuracyDecodeLayer< T >, MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.ConvolutionOctaveLayer< T >, MyCaffe.layers.CopyLayer< T >, MyCaffe.layers.beta.DataSequenceLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GatherLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.InterpLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.MeanErrorLossLayer< T >, MyCaffe.layers.beta.MergeLayer< T >, MyCaffe.layers.beta.MishLayer< T >, MyCaffe.layers.beta.ModelDataLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.SerfLayer< T >, MyCaffe.layers.beta.SqueezeLayer< T >, MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.beta.UnPoolingLayer< T >, MyCaffe.layers.beta.UnPoolingLayer1< T >, MyCaffe.layers.beta.UnsqueezeLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.GeluLayer< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.hdf5.HDF5DataLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LeCunLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.lnn.SiLULayer< T >, MyCaffe.layers.lnn.SoftPlusLayer< T >, MyCaffe.layers.nt.EventLayer< T >, MyCaffe.layers.nt.GramLayer< T >, MyCaffe.layers.nt.OneHotLayer< T >, MyCaffe.layers.nt.ScalarLayer< T >, MyCaffe.layers.nt.TVLossLayer< T >, MyCaffe.layers.ssd.DetectionEvaluateLayer< T >, 
MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.PriorBoxLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.AbsValLayer< T >, MyCaffe.layers.AccuracyLayer< T >, MyCaffe.layers.ArgMaxLayer< T >, MyCaffe.layers.BaseDataLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.BNLLLayer< T >, MyCaffe.layers.ClipLayer< T >, MyCaffe.layers.ConcatLayer< T >, MyCaffe.layers.ConstantLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.ConvolutionLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DataNormalizerLayer< T >, MyCaffe.layers.DebugLayer< T >, MyCaffe.layers.DeconvolutionLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.DummyDataLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.ELULayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.ExpLayer< T >, MyCaffe.layers.FilterLayer< T >, MyCaffe.layers.FlattenLayer< T >, MyCaffe.layers.GradientScaleLayer< T >, MyCaffe.layers.HingeLossLayer< T >, MyCaffe.layers.Im2colLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.InputLayer< T >, MyCaffe.layers.LabelMappingLayer< T >, MyCaffe.layers.LogLayer< T >, 
MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MathLayer< T >, MyCaffe.layers.MemoryLossLayer< T >, MyCaffe.layers.MultinomialLogisticLossLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.ParameterLayer< T >, MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PowerLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, MyCaffe.layers.ReLULayer< T >, MyCaffe.layers.ReshapeLayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SigmoidCrossEntropyLossLayer< T >, MyCaffe.layers.SigmoidLayer< T >, MyCaffe.layers.SilenceLayer< T >, MyCaffe.layers.SliceLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxCrossEntropyLossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, MyCaffe.layers.SoftmaxLossLayer< T >, MyCaffe.layers.SplitLayer< T >, MyCaffe.layers.SPPLayer< T >, MyCaffe.layers.SwishLayer< T >, MyCaffe.layers.TanhLayer< T >, MyCaffe.layers.ThresholdLayer< T >, and MyCaffe.layers.TileLayer< T >.

◆ check_nan()

void MyCaffe.layers.Layer< T >.check_nan ( Blob< T >  b)
protected

Checks a Blob for NaNs and throws an exception if found.

Parameters
b  Specifies the Blob to check.

Definition at line 1313 of file Layer.cs.

◆ CheckBlobCounts()

void MyCaffe.layers.Layer< T >.CheckBlobCounts ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
protected

Called by the Layer::Setup function to check that the number of bottom (input) and top (output) Blobs provided matches the number expected via the {ExactNum,Min,Max}{Bottom,Top}Blobs functions.

Parameters
colBottom  Specifies the collection of bottom (input) Blobs.
colTop  Specifies the collection of top (output) Blobs.

Definition at line 1080 of file Layer.cs.
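
A derived layer declares the counts that CheckBlobCounts validates by overriding the corresponding properties; a hypothetical sketch:

```csharp
// Example: a loss-style layer expecting two inputs and one output.
public override int ExactNumBottomBlobs
{
    get { return 2; } // e.g., predictions and labels
}

public override int ExactNumTopBlobs
{
    get { return 1; } // a single loss value
}
```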

◆ compareShapes()

bool MyCaffe.layers.Layer< T >.compareShapes ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
protected

Compare the shapes of the top and bottom and if the same, return true, otherwise false.

Parameters
colBottom  Specifies the bottom blobs.
colTop  Specifies the top blobs.
Returns
If the top and bottom blobs have not changed shape, true is returned, otherwise false.

Definition at line 648 of file Layer.cs.

◆ ConnectLoss()

virtual void MyCaffe.layers.Layer< T >.ConnectLoss ( LossLayer< T >  layer)
virtual

Called to connect the loss OnLoss event to a specified layer (typically the data layer).

When connected, the OnLoss event is called on each forward pass during the loss function.

Parameters
layer  Specifies the layer to connect the OnLoss event to.

Reimplemented in MyCaffe.layers.tft.DataTemporalLayer< T >.

Definition at line 240 of file Layer.cs.

◆ convert() [1/5]

void MyCaffe.layers.Layer< T >.convert ( BlobCollection< T >  col)
protected

Convert a collection of blobs from / to half size.

Parameters
col  Specifies the collection to convert.

Definition at line 535 of file Layer.cs.

◆ convert() [2/5]

T MyCaffe.layers.Layer< T >.convert ( double  df)
protected

Converts a double to a generic.

Parameters
df  Specifies the double value.
Returns
Returns the generic value.

Definition at line 1329 of file Layer.cs.

◆ convert() [3/5]

T[] MyCaffe.layers.Layer< T >.convert ( double[]  rg)
protected

Converts an array of double values into an array of generic values.

Parameters
rg  Specifies the array of double values.
Returns
Returns an array of generic values.

Definition at line 1385 of file Layer.cs.

◆ convert() [4/5]

T MyCaffe.layers.Layer< T >.convert ( float  f)
protected

Converts a float to a generic.

Parameters
f  Specifies the float value.
Returns
Returns the generic value.

Definition at line 1339 of file Layer.cs.

◆ convert() [5/5]

T[] MyCaffe.layers.Layer< T >.convert ( float[]  rg)
protected

Converts an array of float values into an array of generic values.

Parameters
rg  Specifies the array of float values.
Returns
Returns an array of generic values.

Definition at line 1417 of file Layer.cs.

◆ convert_to_full()

long MyCaffe.layers.Layer< T >.convert_to_full ( int  nCount,
long  hMem 
)
protected

Convert half memory to full memory.

Parameters
nCount  Specifies the number of items.
hMem  Specifies the memory to convert.
Returns
A handle to the converted memory is returned.

Definition at line 514 of file Layer.cs.

◆ convertD() [1/2]

double MyCaffe.layers.Layer< T >.convertD ( T  df)
protected

Converts a generic to a double value.

Parameters
df  Specifies the generic value.
Returns
The double value is returned.

Definition at line 1349 of file Layer.cs.

◆ convertD() [2/2]

double[] MyCaffe.layers.Layer< T >.convertD ( T[]  rg)
protected

Converts an array of generic values into an array of double values.

Parameters
rg  Specifies the array of generic values.
Returns
The array of double values is returned.

Definition at line 1369 of file Layer.cs.

◆ convertF() [1/2]

float MyCaffe.layers.Layer< T >.convertF ( T  df)
protected

Converts a generic to a float value.

Parameters
df  Specifies the generic value.
Returns
The float value is returned.

Definition at line 1359 of file Layer.cs.

◆ convertF() [2/2]

float[] MyCaffe.layers.Layer< T >.convertF ( T[]  rg)
protected

Converts an array of generic values into an array of float values.

Parameters
rg  Specifies the array of generic values.
Returns
Returns an array of float values.

Definition at line 1401 of file Layer.cs.
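
Inside a derived layer, the protected conversion helpers documented above can be combined as in this hypothetical sketch (the values shown are illustrative):

```csharp
// Convert between the generic type T and concrete numeric types.
T tVal = convert(0.5);          // double -> T
double df = convertD(tVal);     // T -> double
float f = convertF(tVal);       // T -> float
float[] rgf = convertF(new T[] { m_tZero, m_tOne }); // T[] -> float[]
```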

◆ convertLayerParam()

LayerParameter MyCaffe.layers.Layer< T >.convertLayerParam ( LayerParameter  pChild,
LayerParameter  pParent 
)
protected

Called to convert a parent LayerParameterEx, used in blob sharing, with a child layer parameter.

Parameters
pChild  Specifies the child layer parameter.
pParent  Specifies the parent layer parameter.
Returns
If the parent layer parameter is a LayerParameterEx, the shared blobs are passed to the child.

Definition at line 1134 of file Layer.cs.

◆ ConvertToBase()

void MyCaffe.layers.Layer< T >.ConvertToBase ( BlobCollection< T >  col)

ConvertToBase converts any blobs in a collection that are in half size to the base size.

Parameters
col  Specifies the blob collection to convert.

Definition at line 579 of file Layer.cs.

◆ Create()

static Layer< T > MyCaffe.layers.Layer< T >.Create ( CudaDnn< T >  cuda,
Log  log,
LayerParameter  p,
CancelEvent  evtCancel,
IXDatabaseBase  db = null,
TransferInput  trxinput = null 
)
static

Create a new Layer based on the LayerParameter.

Parameters
cudaSpecifies the CudaDnn connection to Cuda.
logSpecifies the Log for output.
pSpecifies the LayerParameter that contains the LayerType to create.
evtCancelSpecifies the CancelEvent used by some Layers when created.
dbOptionally, specifies the in-memory MyCaffeDatabase used by data Layers.
trxinputOptionally, specifies the transfer input object used by some of the data Layers.
Returns
The new Layer instance is returned.

Definition at line 1468 of file Layer.cs.
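A typical use of the static factory follows the documented signature; the sketch below is illustrative only and assumes an already-initialized CudaDnn connection, Log, and CancelEvent supplied by the hosting application, with hypothetical parameter settings:

```csharp
// Sketch only: 'cuda' (CudaDnn<float>), 'log' (Log) and 'evtCancel' (CancelEvent)
// are assumed to come from the hosting application.
LayerParameter p = new LayerParameter(LayerParameter.LayerType.INNERPRODUCT);
p.inner_product_param.num_output = 10;   // illustrative setting

Layer<float> layer = Layer<float>.Create(cuda, log, p, evtCancel);

// ... call Setup/Forward/Backward as usual, then release GPU and host resources.
layer.Dispose();
```

The optional `db` and `trxinput` arguments only matter for data layers; for computational layers they can be left at their `null` defaults as shown.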

◆ Dispose()

void MyCaffe.layers.Layer< T >.Dispose ( )

Releases all GPU and host resources used by the Layer.

Definition at line 180 of file Layer.cs.

◆ dispose() [1/4]

virtual void MyCaffe.layers.Layer< T >.dispose ( )
protectedvirtual

Releases all GPU and host resources used by the Layer.

Reimplemented in MyCaffe.layers.beta.AccuracyDecodeLayer< T >, MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.ConvolutionOctaveLayer< T >, MyCaffe.layers.beta.DataSequenceLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.MeanErrorLossLayer< T >, MyCaffe.layers.beta.MergeLayer< T >, MyCaffe.layers.beta.ModelDataLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.beta.UnPoolingLayer< T >, MyCaffe.layers.beta.UnPoolingLayer1< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.hdf5.HDF5DataLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.nt.OneHotLayer< T >, MyCaffe.layers.ssd.AnnotatedDataLayer< T >, MyCaffe.layers.ssd.DetectionEvaluateLayer< T >, MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.PriorBoxLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, MyCaffe.layers.ssd.VideoDataLayer< T >, MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, 
MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.AccuracyLayer< T >, MyCaffe.layers.BaseConvolutionLayer< T >, MyCaffe.layers.BaseDataLayer< T >, MyCaffe.layers.BasePrefetchingDataLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.ConvolutionLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DataLayer< T >, MyCaffe.layers.DataNormalizerLayer< T >, MyCaffe.layers.DebugLayer< T >, MyCaffe.layers.DeconvolutionLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.ELULayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.HingeLossLayer< T >, MyCaffe.layers.Im2colLayer< T >, MyCaffe.layers.ImageDataLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MemoryDataLayer< T >, MyCaffe.layers.MemoryLossLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, MyCaffe.layers.ReLULayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SigmoidCrossEntropyLossLayer< T >, MyCaffe.layers.SigmoidLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxCrossEntropyLossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, MyCaffe.layers.SoftmaxLossLayer< T >, MyCaffe.layers.SPPLayer< T >, MyCaffe.layers.SwishLayer< T >, and MyCaffe.layers.TanhLayer< T >.

Definition at line 188 of file Layer.cs.

◆ dispose() [2/4]

void MyCaffe.layers.Layer< T >.dispose ( ref Blob< T >  b)
protected

Helper method used to dispose internal blobs.

Parameters
bSpecifies the blob to dispose

Definition at line 209 of file Layer.cs.

◆ dispose() [3/4]

void MyCaffe.layers.Layer< T >.dispose ( ref BlobCollection< T >  rg,
bool  bSetToNull = true 
)
protected

Dispose the blob collection.

Parameters
rgSpecifies the blob collection to dispose.
bSetToNullSpecifies to set the rg param to null (default = true).

Definition at line 223 of file Layer.cs.

◆ dispose() [4/4]

void MyCaffe.layers.Layer< T >.dispose ( ref Layer< T >  l)
protected

Helper method used to dispose internal layers.

Parameters
lSpecifies the internal layer to dispose.

Definition at line 196 of file Layer.cs.

◆ Forward()

double MyCaffe.layers.Layer< T >.Forward ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)

Given the bottom (input) Blobs, this function computes the top (output) Blobs and the loss.

The Forward function calls the overridden forward function implemented by each specific Layer derivative to compute the top (output) Blob values given the bottom (input) Blobs. If the layer has any non-zero loss_weights, this function then computes and returns the loss.

Parameters
colBottomSpecifies the collection of bottom (input) Blobs, whose data fields store the input data for this layer.
colTopSpecifies the collection of preshaped top (output) Blobs, whose data fields will store this layer's outputs.
Returns
Returns the total loss from the Layer.

Definition at line 728 of file Layer.cs.
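The usual calling sequence pairs Setup with Forward; a hedged sketch, assuming a layer created via Layer<float>.Create and bottom/top BlobCollection<float> instances sized for that layer (all names here stand in for application objects):

```csharp
// Sketch only: 'layer', 'colBottom' and 'colTop' are assumed to exist.
layer.Setup(colBottom, colTop);                    // LayerSetUp + Reshape + loss weights

double dfLoss = layer.Forward(colBottom, colTop);  // fills colTop, returns the loss
// For loss layers dfLoss holds the weighted loss; for most other layers it is 0.
```

Because Setup calls Reshape internally, the top Blobs only need to be allocated, not pre-shaped, before this sequence runs.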

◆ forward()

abstract void MyCaffe.layers.Layer< T >.forward ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
protectedpure virtual

This abstract forward function must be overridden by each derived Layer class to compute the top (output) Blobs for this layer.

Parameters
colBottomSpecifies the collection of bottom (input) Blobs, whose data fields store the input data for this layer.
colTopSpecifies the collection of preshaped top (output) Blobs, whose data fields will store this layer's outputs.

Implemented in MyCaffe.layers.beta.AccuracyDecodeLayer< T >, MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.ConvolutionOctaveLayer< T >, MyCaffe.layers.CopyLayer< T >, MyCaffe.layers.beta.DataSequenceLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GatherLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.InterpLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.MeanErrorLossLayer< T >, MyCaffe.layers.beta.MergeLayer< T >, MyCaffe.layers.beta.MishLayer< T >, MyCaffe.layers.beta.ModelDataLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.SerfLayer< T >, MyCaffe.layers.beta.SqueezeLayer< T >, MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.beta.UnPoolingLayer< T >, MyCaffe.layers.beta.UnPoolingLayer1< T >, MyCaffe.layers.beta.UnsqueezeLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.GeluLayer< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.hdf5.HDF5DataLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LeCunLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.lnn.SiLULayer< T >, MyCaffe.layers.lnn.SoftPlusLayer< T >, MyCaffe.layers.nt.EventLayer< T >, MyCaffe.layers.nt.GramLayer< T >, MyCaffe.layers.nt.OneHotLayer< T >, MyCaffe.layers.nt.ScalarLayer< T >, MyCaffe.layers.nt.TVLossLayer< T >, MyCaffe.layers.ssd.DetectionEvaluateLayer< T >, 
MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.PriorBoxLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.AbsValLayer< T >, MyCaffe.layers.AccuracyLayer< T >, MyCaffe.layers.ArgMaxLayer< T >, MyCaffe.layers.BasePrefetchingDataLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.BNLLLayer< T >, MyCaffe.layers.ClipLayer< T >, MyCaffe.layers.ConcatLayer< T >, MyCaffe.layers.ConstantLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.ConvolutionLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DataNormalizerLayer< T >, MyCaffe.layers.DebugLayer< T >, MyCaffe.layers.DeconvolutionLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.DummyDataLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.ELULayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.ExpLayer< T >, MyCaffe.layers.FilterLayer< T >, MyCaffe.layers.FlattenLayer< T >, MyCaffe.layers.GradientScaleLayer< T >, MyCaffe.layers.HingeLossLayer< T >, MyCaffe.layers.Im2colLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.InputLayer< T >, MyCaffe.layers.LabelMappingLayer< T >, MyCaffe.layers.LogLayer< T >, 
MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MathLayer< T >, MyCaffe.layers.MemoryDataLayer< T >, MyCaffe.layers.MemoryLossLayer< T >, MyCaffe.layers.MultinomialLogisticLossLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.ParameterLayer< T >, MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PowerLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, MyCaffe.layers.ReLULayer< T >, MyCaffe.layers.ReshapeLayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SigmoidCrossEntropyLossLayer< T >, MyCaffe.layers.SigmoidLayer< T >, MyCaffe.layers.SilenceLayer< T >, MyCaffe.layers.SliceLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxCrossEntropyLossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, MyCaffe.layers.SoftmaxLossLayer< T >, MyCaffe.layers.SplitLayer< T >, MyCaffe.layers.SPPLayer< T >, MyCaffe.layers.SwishLayer< T >, MyCaffe.layers.TanhLayer< T >, MyCaffe.layers.ThresholdLayer< T >, and MyCaffe.layers.TileLayer< T >.

◆ getCurrentIteration()

GetIterationArgs MyCaffe.layers.Layer< T >.getCurrentIteration ( )
protected

Fires the OnGetIteration event to query the current iteration.

Returns
The GetIterationArgs is returned if the event is connected, otherwise null is returned.

Definition at line 398 of file Layer.cs.

◆ getWorkspace()

virtual WorkspaceArgs MyCaffe.layers.Layer< T >.getWorkspace ( )
protectedvirtual

Returns the WorkspaceArgs used to share a workspace between Layers.

Returns
The WorkspaceArgs are returned.

Reimplemented in MyCaffe.layers.BaseConvolutionLayer< T >.

Definition at line 1285 of file Layer.cs.

◆ LayerSetUp()

abstract void MyCaffe.layers.Layer< T >.LayerSetUp ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
pure virtual

Performs Layer specific setup. Derived layers should override this function as well as the Reshape function.

This method should perform one-time Layer specific setup. This may include reading and processing relevant parameters from the layer_param. Setting up the shapes of the top (output) blobs and internal buffers should be done in the Reshape function, which will be called before the Forward pass to adjust the top (output) Blob sizes.

Parameters
colBottomSpecifies the collection of bottom (input) Blobs to this Layer.
colTopSpecifies the collection of allocated but unshaped top (output) Blobs.

Implemented in MyCaffe.layers.beta.AccuracyDecodeLayer< T >, MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.ConvolutionOctaveLayer< T >, MyCaffe.layers.CopyLayer< T >, MyCaffe.layers.beta.DataSequenceLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GatherLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.InterpLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.MeanErrorLossLayer< T >, MyCaffe.layers.beta.MergeLayer< T >, MyCaffe.layers.beta.ModelDataLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.SqueezeLayer< T >, MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.beta.UnPoolingLayer< T >, MyCaffe.layers.beta.UnPoolingLayer1< T >, MyCaffe.layers.beta.UnsqueezeLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.hdf5.HDF5DataLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.nt.EventLayer< T >, MyCaffe.layers.nt.GramLayer< T >, MyCaffe.layers.nt.OneHotLayer< T >, MyCaffe.layers.nt.ScalarLayer< T >, MyCaffe.layers.nt.TVLossLayer< T >, MyCaffe.layers.ssd.DetectionEvaluateLayer< T >, MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.PriorBoxLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T 
>, MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.AccuracyLayer< T >, MyCaffe.layers.ArgMaxLayer< T >, MyCaffe.layers.BaseConvolutionLayer< T >, MyCaffe.layers.BaseDataLayer< T >, MyCaffe.layers.BasePrefetchingDataLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.ConcatLayer< T >, MyCaffe.layers.ConstantLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.ConvolutionLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DataNormalizerLayer< T >, MyCaffe.layers.DebugLayer< T >, MyCaffe.layers.DeconvolutionLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.DummyDataLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.ELULayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.ExpLayer< T >, MyCaffe.layers.FilterLayer< T >, MyCaffe.layers.FlattenLayer< T >, MyCaffe.layers.GradientScaleLayer< T >, MyCaffe.layers.Im2colLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.InputLayer< T >, MyCaffe.layers.LabelMappingLayer< T >, MyCaffe.layers.LogLayer< T >, MyCaffe.layers.LossLayer< T >, MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MathLayer< T >, MyCaffe.layers.MemoryLossLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.NeuronLayer< T >, MyCaffe.layers.ParameterLayer< T >, 
MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PowerLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, MyCaffe.layers.ReLULayer< T >, MyCaffe.layers.ReshapeLayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SigmoidCrossEntropyLossLayer< T >, MyCaffe.layers.SigmoidLayer< T >, MyCaffe.layers.SilenceLayer< T >, MyCaffe.layers.SliceLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxCrossEntropyLossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, MyCaffe.layers.SoftmaxLossLayer< T >, MyCaffe.layers.SplitLayer< T >, MyCaffe.layers.SPPLayer< T >, MyCaffe.layers.SwishLayer< T >, MyCaffe.layers.TanhLayer< T >, MyCaffe.layers.ThresholdLayer< T >, and MyCaffe.layers.TileLayer< T >.
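The division of labor between LayerSetUp, Reshape, forward and backward can be sketched as a minimal custom layer; this skeleton follows the abstract signatures documented here but is illustrative, not a MyCaffe-supplied layer (the class name and blob operations are assumptions):

```csharp
using System.Collections.Generic;

// Hypothetical identity layer sketched against the documented Layer<T> contract.
public class MyIdentityLayer<T> : Layer<T>
{
    public MyIdentityLayer(CudaDnn<T> cuda, Log log, LayerParameter p)
        : base(cuda, log, p)
    {
    }

    public override void LayerSetUp(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // One-time setup: read and process settings from the layer_param here.
    }

    public override void Reshape(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // Size the top blob to match the bottom blob; called before each Forward pass.
        colTop[0].ReshapeLike(colBottom[0]);
    }

    protected override void forward(BlobCollection<T> colBottom, BlobCollection<T> colTop)
    {
        // Identity: copy bottom data to top.
        colTop[0].CopyFrom(colBottom[0]);
    }

    protected override void backward(BlobCollection<T> colTop, List<bool> rgbPropagateDown, BlobCollection<T> colBottom)
    {
        // Identity gradient: copy the top diff back to the bottom diff.
        if (rgbPropagateDown[0])
            colBottom[0].CopyFrom(colTop[0], true);
    }
}
```

Note that shape-dependent work belongs in Reshape, not LayerSetUp, since Reshape is re-run whenever the bottom shapes change.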

◆ loss()

double MyCaffe.layers.Layer< T >.loss ( int  nTopIdx)

Returns the scalar loss associated with the top Blob at a given index.

Parameters
nTopIdxSpecifies the index.
Returns
The loss value is returned.

Definition at line 908 of file Layer.cs.

◆ param_propagate_down()

bool MyCaffe.layers.Layer< T >.param_propagate_down ( int  nParamIdx)

Returns whether or not the Layer should compute gradients w.r.t. the parameter at a given index.

Parameters
nParamIdxSpecifies the parameter index.
Returns
True if the Layer computes gradients for the parameter at the given index, otherwise false.
Definition at line 1057 of file Layer.cs.

◆ PostProcessFullOutput()

virtual string MyCaffe.layers.Layer< T >.PostProcessFullOutput ( Blob< T >  blobSoftmax)
virtual

The PostProcessFullOutput allows derivative data layers to post-process the results, usually by detokenizing the data in the blobSoftmax.

Parameters
blobSoftmaxSpecifies the data to be post processed.
Returns
A string of the post processed data is returned.

Reimplemented in MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >.

Definition at line 351 of file Layer.cs.

◆ PostProcessLogitsOutput()

virtual List< Tuple< string, int, double > > MyCaffe.layers.Layer< T >.PostProcessLogitsOutput ( int  nCurIdx,
Blob< T >  blobLogits,
Layer< T >  softmax,
int  nAxis,
int  nK = 1 
)
virtual

The PostProcessLogitsOutput allows derivative data layers to post-process the results, converting them back into text results (e.g., detokenizing).

Parameters
nCurIdxSpecifies the current index being processed, or -1 for the last index.
blobLogitsSpecifies the logits blob output by the last inner product layer of the network.
softmaxSpecifies the softmax layer used to post process the logits.
nAxisSpecifies the axis of the softmax layer.
nKOptionally, specifies the K top items to return (default = 1).
Returns
A list of (word, index, probability) tuples corresponding to the softmax output is returned.

Reimplemented in MyCaffe.layers.gpt.TokenizedDataLayer< T >, and MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >.

Definition at line 342 of file Layer.cs.
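Post-processing logits generally means applying a softmax and selecting the top-K entries; a minimal self-contained sketch of that step (an illustration of the idea, not the MyCaffe implementation):

```csharp
using System;
using System.Linq;

public static class LogitsSketch
{
    // Softmax over raw logits, then return the top-K (index, probability) pairs.
    public static (int Index, double Prob)[] TopK(double[] rgLogits, int nK)
    {
        double dfMax = rgLogits.Max();  // subtract the max for numerical stability
        double[] rgExp = rgLogits.Select(v => Math.Exp(v - dfMax)).ToArray();
        double dfSum = rgExp.Sum();

        return rgExp.Select((v, i) => (Index: i, Prob: v / dfSum))
                    .OrderByDescending(t => t.Prob)
                    .Take(nK)
                    .ToArray();
    }
}
```

In the actual layer the indices would then be mapped back to vocabulary words by the data layer's detokenizer.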

◆ PostProcessOutput() [1/2]

virtual List< Tuple< string, int, double > > MyCaffe.layers.Layer< T >.PostProcessOutput ( Blob< T >  blobSofmtax,
int  nK = 1 
)
virtual

The PostProcessOutput allows derivative data layers to post-process the results, converting them back into text results (e.g., detokenizing).

Parameters
blobSofmtaxSpecifies the softmax blob output by the network.
nKOptionally, specifies the K top items to return (default = 1).
Returns
A list of (word, index, probability) tuples corresponding to the softmax output is returned.

Reimplemented in MyCaffe.layers.beta.TextDataLayer< T >.

Definition at line 328 of file Layer.cs.

◆ PostProcessOutput() [2/2]

virtual string MyCaffe.layers.Layer< T >.PostProcessOutput ( int  nIdx)
virtual

Convert the index to the word.

Parameters
nIdxSpecifies the index to convert.
Returns
The corresponding word is returned.

Reimplemented in MyCaffe.layers.beta.TextDataLayer< T >.

Definition at line 361 of file Layer.cs.

◆ PreProcessInput() [1/2]

virtual BlobCollection< T > MyCaffe.layers.Layer< T >.PreProcessInput ( PropertySet  customInput,
out int  nSeqLen,
BlobCollection< T >  colBottom = null 
)
virtual

The PreProcessInput allows derivative data layers to convert a property set of input data into the bottom blob collection used as input.

Parameters
customInputSpecifies the custom input data.
nSeqLenReturns the sequence length.
colBottomOptionally, specifies the bottom data to fill.
Returns
The bottom data is returned.

The blobs returned should match the blob descriptions returned in the LayerParameter's overrides for 'PrepareRunModelInputs' and 'PrepareRunModel'.

Reimplemented in MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, and MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >.

Definition at line 294 of file Layer.cs.

◆ PreProcessInput() [2/2]

virtual bool MyCaffe.layers.Layer< T >.PreProcessInput ( string  strEncInput,
int?  nDecInput,
BlobCollection< T >  colBottom 
)
virtual

Preprocess the input data for the RUN phase.

Parameters
strEncInputSpecifies the encoder input.
nDecInputSpecifies the decoder input.
colBottomSpecifies the bottom blob where the preprocessed data is placed where colBottom[0] contains the preprocessed decoder input. colBottom[1] contains the preprocessed encoder input (depending on param settings), colBottom[2] contains the preprocessed encoder input reversed (depending on param settings)
Returns
If nDecInput == EOS, false is returned, otherwise true.

NOTE: LayerSetup must be called before preprocessing input, for it is during LayerSetup that the vocabulary is loaded.

Reimplemented in MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, and MyCaffe.layers.beta.TextDataLayer< T >.

Definition at line 316 of file Layer.cs.

◆ ReInitializeParameters()

◆ ResetOnDebug()

virtual void MyCaffe.layers.Layer< T >.ResetOnDebug ( EventHandler< GetWorkBlobArgs< T > >  fn)
virtual

Reset the OnDebug event, disabling it.

Parameters
fnSpecifies the event function to call when the OnDebug event fires.

Reimplemented in MyCaffe.layers.RecurrentLayer< T >.

Definition at line 379 of file Layer.cs.

◆ Reshape()

abstract void MyCaffe.layers.Layer< T >.Reshape ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
pure virtual

Adjust the shapes of top blobs and internal buffers to accommodate the shapes of the bottom blobs.

This method should reshape top blobs as needed according to the shapes of the bottom (input) Blobs, as well as reshaping any internal buffers and making any other necessary adjustments so that the layer can accommodate the bottom (input) Blobs.

Parameters
colBottomSpecifies the collection of bottom (input) Blobs, with requested input shapes.
colTopSpecifies the collection of top (output) Blobs, which should be reshaped as needed by the Layer.

Implemented in MyCaffe.layers.beta.AccuracyDecodeLayer< T >, MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.ConvolutionOctaveLayer< T >, MyCaffe.layers.CopyLayer< T >, MyCaffe.layers.beta.DataSequenceLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GatherLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.InterpLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.MeanErrorLossLayer< T >, MyCaffe.layers.beta.MergeLayer< T >, MyCaffe.layers.beta.ModelDataLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.SqueezeLayer< T >, MyCaffe.layers.beta.TextDataLayer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.beta.UnPoolingLayer< T >, MyCaffe.layers.beta.UnPoolingLayer1< T >, MyCaffe.layers.beta.UnsqueezeLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.hdf5.HDF5DataLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.nt.EventLayer< T >, MyCaffe.layers.nt.GramLayer< T >, MyCaffe.layers.nt.OneHotLayer< T >, MyCaffe.layers.nt.TVLossLayer< T >, MyCaffe.layers.ssd.DetectionEvaluateLayer< T >, MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.PriorBoxLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, 
MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.QuantileLossLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.AccuracyLayer< T >, MyCaffe.layers.ArgMaxLayer< T >, MyCaffe.layers.BaseConvolutionLayer< T >, MyCaffe.layers.BaseDataLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.ConcatLayer< T >, MyCaffe.layers.ConstantLayer< T >, MyCaffe.layers.ContrastiveLossLayer< T >, MyCaffe.layers.ConvolutionLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DataNormalizerLayer< T >, MyCaffe.layers.DebugLayer< T >, MyCaffe.layers.DeconvolutionLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.DummyDataLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.ELULayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.EuclideanLossLayer< T >, MyCaffe.layers.FilterLayer< T >, MyCaffe.layers.FlattenLayer< T >, MyCaffe.layers.Im2colLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.InputLayer< T >, MyCaffe.layers.LossLayer< T >, MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MemoryDataLayer< T >, MyCaffe.layers.MemoryLossLayer< T >, MyCaffe.layers.MultinomialLogisticLossLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.NeuronLayer< T >, MyCaffe.layers.ParameterLayer< T >, MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, 
MyCaffe.layers.ReLULayer< T >, MyCaffe.layers.ReshapeLayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SigmoidCrossEntropyLossLayer< T >, MyCaffe.layers.SigmoidLayer< T >, MyCaffe.layers.SilenceLayer< T >, MyCaffe.layers.SliceLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxCrossEntropyLossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, MyCaffe.layers.SoftmaxLossLayer< T >, MyCaffe.layers.SplitLayer< T >, MyCaffe.layers.SPPLayer< T >, MyCaffe.layers.SwishLayer< T >, MyCaffe.layers.TanhLayer< T >, and MyCaffe.layers.TileLayer< T >.

◆ reshapeNeeded()

virtual bool MyCaffe.layers.Layer< T >.reshapeNeeded ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop,
bool  bReset = true 
)
protectedvirtual

Tests the shapes of both the bottom and top blobs; if they are the same as the previous sizing, returns false, indicating that no reshape is needed.

Parameters
colBottomSpecifies the bottom blobs.
colTopSpecifies the top blobs.
bResetSpecifies to reset the test (set to false when using in second derivative classes, e.g. set to true in BaseConvolutionLayer, and false in ConvolutionLayer).
Returns
If a reshape is needed, true is returned, otherwise false is returned.

Reimplemented in MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, and MyCaffe.layers.ConvolutionLayer< T >.

Definition at line 622 of file Layer.cs.
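The reshape test amounts to caching the previous bottom/top shapes and comparing them against the current ones; a self-contained sketch of that caching pattern (illustrative names, not the actual implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class ReshapeCheck
{
    private List<int[]> m_rgLastShapes = new List<int[]>();

    // Returns true when any shape differs from the cached ones (a reshape is
    // needed) and caches the new shapes for the next call; returns false when
    // all shapes match the previous sizing.
    public bool ReshapeNeeded(List<int[]> rgShapes)
    {
        bool bSame = m_rgLastShapes.Count == rgShapes.Count &&
                     m_rgLastShapes.Zip(rgShapes, (a, b) => a.SequenceEqual(b)).All(x => x);

        if (!bSame)
            m_rgLastShapes = rgShapes.Select(s => (int[])s.Clone()).ToList();

        return !bSame;
    }
}
```

The bReset flag in the real method exists so that a base class (e.g. BaseConvolutionLayer) can perform the cached comparison while a derived class (e.g. ConvolutionLayer) re-checks without clobbering the cache.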

◆ set_loss()

void MyCaffe.layers.Layer< T >.set_loss ( int  nTopIdx,
double  dfLoss 
)

Sets the loss associated with a top Blob at a given index.

Parameters
nTopIdxSpecifies the index.
dfLossSpecifies the loss value.

Definition at line 918 of file Layer.cs.

◆ set_param_propagate_down()

void MyCaffe.layers.Layer< T >.set_param_propagate_down ( int  nParamIdx,
bool  bPropagate 
)

Sets whether or not the Layer should compute gradients w.r.t. the parameter at a given index.

Parameters
nParamIdxSpecifies the parameter index.
bPropagateSpecifies whether or not to propagate gradients down to the parameter.

Definition at line 1068 of file Layer.cs.

◆ SetEnablePassthrough()

void MyCaffe.layers.Layer< T >.SetEnablePassthrough ( bool  bEnable)

Enables/disables the pass-through mode.

When enabled, the forward pass merely copies the bottom inputs to the top outputs and returns.

Parameters
bEnableEnable/disable the pass-through mode.

Definition at line 1276 of file Layer.cs.

◆ SetLossWeights()

void MyCaffe.layers.Layer< T >.SetLossWeights ( BlobCollection< T >  colTop)
protected

Called by Layer::Setup to initialize the weights associated with any top (output) Blobs in the loss function and store non-zero loss weights in the diff Blob.

Parameters
colTopSpecifies the collection of top (output) Blobs.

Definition at line 1109 of file Layer.cs.

◆ SetNetParameterUsed()

virtual void MyCaffe.layers.Layer< T >.SetNetParameterUsed ( NetParameter  np)
virtual

This function allows other layers to gather needed information from the NetParameters, if any, and is called when initializing the Net.

Parameters
npSpecifies the NetParameter.

Reimplemented in MyCaffe.layers.LabelMappingLayer< T >.

Definition at line 492 of file Layer.cs.

◆ SetNetReshapeRequest()

void MyCaffe.layers.Layer< T >.SetNetReshapeRequest ( )

Called by the Net when requesting a reshape.

Definition at line 414 of file Layer.cs.

◆ SetOnDebug()

virtual void MyCaffe.layers.Layer< T >.SetOnDebug ( EventHandler< GetWorkBlobArgs< T > >  fn)
virtual

Set the OnDebug event.

Parameters
fnSpecifies the event function to call when the OnDebug event fires.

Reimplemented in MyCaffe.layers.RecurrentLayer< T >.

Definition at line 370 of file Layer.cs.

◆ SetPhase()

void MyCaffe.layers.Layer< T >.SetPhase ( Phase  phase)

Changes the layer's Phase to the one specified.

Parameters
phaseSpecifies the new Phase for the layer.

Definition at line 423 of file Layer.cs.
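
A minimal usage sketch (assuming an existing Layer<T> instance named `layer`), useful for layers whose behavior differs between phases, such as DropoutLayer:

```csharp
// Hypothetical sketch: switch the layer to testing behavior before inference.
layer.SetPhase(Phase.TEST);
// ... run inference ...
layer.SetPhase(Phase.TRAIN); // restore the training phase afterward
```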

◆ setShapes()

void MyCaffe.layers.Layer< T >.setShapes ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)
protected

Set the internal shape sizes - used when determining if a Reshape is necessary.

Parameters
colBottomSpecifies the bottom input blobs.
colTopSpecifies the top output blobs.

Definition at line 685 of file Layer.cs.

◆ Setup()

void MyCaffe.layers.Layer< T >.Setup ( BlobCollection< T >  colBottom,
BlobCollection< T >  colTop 
)

Implements common Layer setup functionality.

Checks that the number of bottom and top blobs is correct. Calls LayerSetup to do Layer specific setup for each layer type, followed by Reshape to set up the sizes of the top Blobs and internal buffers. Sets up the loss weight multiplier blobs for any non-zero loss weights.

Parameters
colBottomSpecifies the collection of preshaped bottom (input) Blobs.
colTopSpecifies the collection of allocated but unshaped top (output) Blobs.

Definition at line 439 of file Layer.cs.
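
A minimal usage sketch (the variable names `layer`, `colBottom`, and `colTop` are assumed to already exist):

```csharp
// Hypothetical sketch: 'colBottom' holds pre-shaped input Blobs and
// 'colTop' holds allocated (but not yet shaped) output Blobs.
layer.Setup(colBottom, colTop); // calls LayerSetup, then Reshape, internally
```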

◆ setup_internal_blobs()

virtual void MyCaffe.layers.Layer< T >.setup_internal_blobs ( BlobCollection< T >  col)
protectedvirtual

Derived layers should add all internal blobs to the 'col' provided.

Parameters
colSpecifies the blob collection where internal blobs are added.

Reimplemented in MyCaffe.layers.beta.AccuracyEncodingLayer< T >, MyCaffe.layers.AttentionLayer< T >, MyCaffe.layers.beta.DecodeLayer< T >, MyCaffe.layers.beta.GlobResNormLayer< T >, MyCaffe.layers.beta.KnnLayer< T >, MyCaffe.layers.LSTMAttentionLayer< T >, MyCaffe.layers.beta.Normalization1Layer< T >, MyCaffe.layers.beta.TransposeLayer< T >, MyCaffe.layers.beta.TripletLossLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer< T >, MyCaffe.layers.gpt.CausalSelfAttentionLayer2< T >, MyCaffe.layers.gpt.LayerNormLayer< T >, MyCaffe.layers.gpt.MultiheadAttentionLayer< T >, MyCaffe.layers.gpt.NLLLossLayer< T >, MyCaffe.layers.gpt.PositionalEncodingLayer< T >, MyCaffe.layers.gpt.TokenizedDataPairsLayer< T >, MyCaffe.layers.gpt.TransformerBlockLayer< T >, MyCaffe.layers.lnn.CfcLayer< T >, MyCaffe.layers.lnn.CfcUnitLayer< T >, MyCaffe.layers.lnn.LtcUnitLayer< T >, MyCaffe.layers.ssd.DetectionOutputLayer< T >, MyCaffe.layers.ssd.MultiBoxLossLayer< T >, MyCaffe.layers.ssd.Normalization2Layer< T >, MyCaffe.layers.ssd.PermuteLayer< T >, MyCaffe.layers.ssd.SmoothL1LossLayer< T >, MyCaffe.layers.tft.CategoricalTransformationLayer< T >, MyCaffe.layers.tft.ChannelEmbeddingLayer< T >, MyCaffe.layers.tft.DataTemporalLayer< T >, MyCaffe.layers.tft.GateAddNormLayer< T >, MyCaffe.layers.tft.GluLayer< T >, MyCaffe.layers.tft.GrnLayer< T >, MyCaffe.layers.tft.MultiHeadAttentionInterpLayer< T >, MyCaffe.layers.tft.NumericTransformationLayer< T >, MyCaffe.layers.tft.QuantileAccuracyLayer< T >, MyCaffe.layers.tft.ReshapeTemporalLayer< T >, MyCaffe.layers.tft.VarSetNetLayer< T >, MyCaffe.layers.BaseConvolutionLayer< T >, MyCaffe.layers.BatchNormLayer< T >, MyCaffe.layers.BatchReindexLayer< T >, MyCaffe.layers.BiasLayer< T >, MyCaffe.layers.CropLayer< T >, MyCaffe.layers.DropoutLayer< T >, MyCaffe.layers.EltwiseLayer< T >, MyCaffe.layers.EmbedLayer< T >, MyCaffe.layers.InfogainLossLayer< T >, MyCaffe.layers.InnerProductLayer< T >, MyCaffe.layers.LRNLayer< T >, MyCaffe.layers.LSTMSimpleLayer< 
T >, MyCaffe.layers.LSTMUnitLayer< T >, MyCaffe.layers.MVNLayer< T >, MyCaffe.layers.PoolingLayer< T >, MyCaffe.layers.PReLULayer< T >, MyCaffe.layers.RecurrentLayer< T >, MyCaffe.layers.ReductionLayer< T >, MyCaffe.layers.ScaleLayer< T >, MyCaffe.layers.SoftmaxCrossEntropy2LossLayer< T >, MyCaffe.layers.SoftmaxLayer< T >, and MyCaffe.layers.SoftmaxLossLayer< T >.

Definition at line 891 of file Layer.cs.

◆ setWorkspace()

virtual bool MyCaffe.layers.Layer< T >.setWorkspace ( ulong  lSizeInBytes)
protectedvirtual

Sets the workspace size (in bytes) and returns true if set, false otherwise.

Parameters
lSizeInBytesSpecifies the size of the workspace data in bytes.
Returns
If the OnSetWorkspace event is set, true is returned, otherwise false.

Reimplemented in MyCaffe.layers.BaseConvolutionLayer< T >.

Definition at line 1300 of file Layer.cs.

◆ shareLayerBlob()

bool MyCaffe.layers.Layer< T >.shareLayerBlob ( Blob< T >  b,
List< int >  rgMinShape 
)
protected

Attempts to share a Layer Blob if another parameter Blob with the same name and acceptable size is found.

Parameters
bSpecifies the Blob to share.
rgMinShapeSpecifies the minimum shape required to share.
Returns
If the Blob is shared, true is returned, otherwise false is returned.

Definition at line 1170 of file Layer.cs.

◆ shareLayerBlobs()

bool MyCaffe.layers.Layer< T >.shareLayerBlobs ( Layer< T >  layer)
protected

Attempts to share this Layer's blobs and internal blobs with matching names and sizes found in another layer.

Parameters
layerSpecifies the layer that will use the shared blobs and internal blobs from the shared layer.
Returns
If the layer blobs and internal blobs are shared successfully, true is returned; otherwise false is returned.

Definition at line 1187 of file Layer.cs.

◆ shareParameter()

bool MyCaffe.layers.Layer< T >.shareParameter ( Blob< T >  b,
List< int >  rgMinShape,
bool  bAllowEndsWithComparison = false 
)
protected

Attempts to share a parameter Blob if another parameter Blob with the same name and acceptable size is found.

Parameters
bSpecifies the Blob to share.
rgMinShapeSpecifies the minimum shape required to share.
bAllowEndsWithComparisonOptionally, allow name comparison where end of blob 'b' name is compared with the share blob names (default = false).
Returns
If the Blob is shared, true is returned, otherwise false is returned.

Definition at line 1152 of file Layer.cs.

◆ size_at()

Size MyCaffe.layers.Layer< T >.size_at ( Blob< T >  b)
protected

Returns the Size of a given two-element Blob, such as one that stores Blob size information.

Parameters
bSpecifies the Blob.
Returns
The height and width are returned in a Size object.

Definition at line 1444 of file Layer.cs.

◆ val_at()

int MyCaffe.layers.Layer< T >.val_at ( T[]  rg,
int  nIdx 
)
protected

Returns the integer value at a given index in a generic array.

Parameters
rgSpecifies the generic array.
nIdxSpecifies the index.
Returns
The value at the index is returned as an integer.

Definition at line 1434 of file Layer.cs.

Member Data Documentation

◆ m_bConvertBottom

bool MyCaffe.layers.Layer< T >.m_bConvertBottom = true
protected

Specifies whether or not the layer should convert the bottom when using half sized memory.

Definition at line 96 of file Layer.cs.

◆ m_bConvertTopOnBwd

bool MyCaffe.layers.Layer< T >.m_bConvertTopOnBwd = true
protected

Specifies whether or not to convert the top on the backward pass when using half sized memory (typically not done on loss layers).

Definition at line 92 of file Layer.cs.

◆ m_bConvertTopOnFwd

bool MyCaffe.layers.Layer< T >.m_bConvertTopOnFwd = false
protected

Specifies whether or not the layer should convert the top on the forward pass when using half sized memory (typically only done with input data).

Definition at line 88 of file Layer.cs.

◆ m_bEnablePassthrough

bool MyCaffe.layers.Layer< T >.m_bEnablePassthrough = false
protected

Enables/disables the pass-through mode for the layer. Default = false.

Definition at line 80 of file Layer.cs.

◆ m_bNetReshapeRequest

bool MyCaffe.layers.Layer< T >.m_bNetReshapeRequest = false
protected

Specifies whether or not the reshape was requested by a Net.Reshape call.

Definition at line 104 of file Layer.cs.

◆ m_bReshapeOnForwardNeeded

bool MyCaffe.layers.Layer< T >.m_bReshapeOnForwardNeeded = true
protected

Specifies whether or not a reshape is needed on the forward pass.

Definition at line 100 of file Layer.cs.

◆ m_bUseHalfSize

bool MyCaffe.layers.Layer< T >.m_bUseHalfSize = false
protected

Specifies that the half size of the top (if any) should be converted to the base size.

Definition at line 84 of file Layer.cs.

◆ m_colBlobs

BlobCollection<T> MyCaffe.layers.Layer< T >.m_colBlobs
protected

Specifies the learnable parameter Blobs of the Layer.

Definition at line 55 of file Layer.cs.

◆ m_colInternalBlobs

BlobCollection<T> MyCaffe.layers.Layer< T >.m_colInternalBlobs = new BlobCollection<T>()
protected

Specifies internal blobs used by the layer.

Definition at line 59 of file Layer.cs.

◆ m_cuda

CudaDnn<T> MyCaffe.layers.Layer< T >.m_cuda
protected

Specifies the CudaDnn connection to Cuda.

Definition at line 39 of file Layer.cs.

◆ m_log

Log MyCaffe.layers.Layer< T >.m_log
protected

Specifies the Log for output.

Definition at line 43 of file Layer.cs.

◆ m_param

LayerParameter MyCaffe.layers.Layer< T >.m_param
protected

Specifies the LayerParameter describing the Layer.

Definition at line 47 of file Layer.cs.

◆ m_parentLayerType

LayerParameter.LayerType? MyCaffe.layers.Layer< T >.m_parentLayerType = null
protected

Specifies the layer type of the parent.

Definition at line 108 of file Layer.cs.

◆ m_phase

Phase MyCaffe.layers.Layer< T >.m_phase
protected

Specifies the Phase under which the Layer is run.

Definition at line 51 of file Layer.cs.

◆ m_rgbParamPropagateDown

DictionaryMap<bool> MyCaffe.layers.Layer< T >.m_rgbParamPropagateDown
protected

Specifies whether or not to compute the learnable diff of each parameter Blob.

Definition at line 63 of file Layer.cs.

◆ m_rgLoss

DictionaryMap<double> MyCaffe.layers.Layer< T >.m_rgLoss
protected

Specifies the loss values that indicate whether each top (output) Blob has a non-zero weight in the objective function.

Definition at line 68 of file Layer.cs.

◆ m_tOne

T MyCaffe.layers.Layer< T >.m_tOne
protected

Specifies a generic type equal to 1.0.

Definition at line 72 of file Layer.cs.

◆ m_type

LayerParameter.LayerType MyCaffe.layers.Layer< T >.m_type = LayerParameter.LayerType._MAX
protected

Specifies the Layer type.

Definition at line 35 of file Layer.cs.

◆ m_tZero

T MyCaffe.layers.Layer< T >.m_tZero
protected

Specifies a generic type equal to 0.0.

Definition at line 76 of file Layer.cs.

Property Documentation

◆ AutoTopBlobs

virtual bool MyCaffe.layers.Layer< T >.AutoTopBlobs
get

Return whether "anonymous" top (output) Blobs are created automatically by the Layer.

If this method returns true, Net::Init will create enough "anonymous" top Blobs to fulfill the requirement specified by ExactNumTopBlobs() or MinTopBlobs().

Definition at line 1030 of file Layer.cs.

◆ backward_timing

double MyCaffe.layers.Layer< T >.backward_timing
get

Returns the timing of the last backward pass in milliseconds.

Definition at line 1256 of file Layer.cs.

◆ backward_timing_average

double MyCaffe.layers.Layer< T >.backward_timing_average
get

Returns the average timing of the backward passes in milliseconds.

Definition at line 1264 of file Layer.cs.

◆ blobs

Returns the collection of learnable parameter Blobs for the Layer.

Definition at line 874 of file Layer.cs.

◆ EqualNumBottomTopBlobs

virtual bool MyCaffe.layers.Layer< T >.EqualNumBottomTopBlobs
get

Returns true if the Layer requires an equal number of bottom (input) and top (output) Blobs.

This method should be overridden to return true if your Layer expects an equal number of bottom and top Blobs.

Definition at line 1017 of file Layer.cs.

◆ ExactNumBottomBlobs

virtual int MyCaffe.layers.Layer< T >.ExactNumBottomBlobs
get

Returns the exact number of bottom (input) Blobs required by the Layer, or -1 if no exact number is required.

This method should be overridden to return a non-negative value if your Layer expects an exact number of bottom (input) Blobs.

Definition at line 939 of file Layer.cs.

◆ ExactNumTopBlobs

virtual int MyCaffe.layers.Layer< T >.ExactNumTopBlobs
get

Returns the exact number of top (output) Blobs required by the Layer, or -1 if no exact number is required.

This method should be overridden to return a non-negative value if your Layer expects an exact number of top (output) Blobs.

Definition at line 978 of file Layer.cs.

◆ forward_timing

double MyCaffe.layers.Layer< T >.forward_timing
get

Returns the timing of the last forward pass in milliseconds.

Definition at line 1240 of file Layer.cs.

◆ forward_timing_average

double MyCaffe.layers.Layer< T >.forward_timing_average
get

Returns the average timing of the forward passes in milliseconds.

Definition at line 1248 of file Layer.cs.

◆ internal_blobs

BlobCollection<T> MyCaffe.layers.Layer< T >.internal_blobs
get

Returns the collection of internal Blobs used by the Layer.

Definition at line 882 of file Layer.cs.

◆ layer_param

LayerParameter MyCaffe.layers.Layer< T >.layer_param
get

Returns the LayerParameter for this Layer.

Definition at line 898 of file Layer.cs.

◆ MaxBottomBlobs

virtual int MyCaffe.layers.Layer< T >.MaxBottomBlobs
get

Returns the maximum number of bottom (input) Blobs required by the Layer, or -1 if no maximum number is required.

This method should be overridden to return a non-negative value if your Layer expects a maximum number of bottom (input) Blobs.

Definition at line 965 of file Layer.cs.

◆ MaxTopBlobs

virtual int MyCaffe.layers.Layer< T >.MaxTopBlobs
get

Returns the maximum number of top (output) Blobs required by the Layer, or -1 if no maximum number is required.

This method should be overridden to return a non-negative value if your Layer expects a maximum number of top (output) Blobs.

Definition at line 1004 of file Layer.cs.

◆ MinBottomBlobs

virtual int MyCaffe.layers.Layer< T >.MinBottomBlobs
get

Returns the minimum number of bottom (input) Blobs required by the Layer, or -1 if no minimum number is required.

This method should be overridden to return a non-negative value if your Layer expects a minimum number of bottom (input) Blobs.

Definition at line 952 of file Layer.cs.

◆ MinTopBlobs

virtual int MyCaffe.layers.Layer< T >.MinTopBlobs
get

Returns the minimum number of top (output) Blobs required by the Layer, or -1 if no minimum number is required.

This method should be overridden to return a non-negative value if your Layer expects a minimum number of top (output) Blobs.

Definition at line 991 of file Layer.cs.

◆ parent_layer_type

LayerParameter.LayerType? MyCaffe.layers.Layer< T >.parent_layer_type
get

Optionally, specifies the parent layer type (e.g. LOSS, etc.)

Definition at line 247 of file Layer.cs.

◆ SupportsPostProcessing

virtual bool MyCaffe.layers.Layer< T >.SupportsPostProcessing
get

Should return true when the PostProcessing methods are overridden.

Definition at line 263 of file Layer.cs.

◆ SupportsPostProcessingFullOutput

virtual bool MyCaffe.layers.Layer< T >.SupportsPostProcessingFullOutput
get

Should return true when PostProcessingFullOutput is supported.

Definition at line 279 of file Layer.cs.

◆ SupportsPostProcessingLogits

virtual bool MyCaffe.layers.Layer< T >.SupportsPostProcessingLogits
get

Should return true when the PostProcessingLogits methods are overridden.

Definition at line 271 of file Layer.cs.

◆ SupportsPreProcessing

virtual bool MyCaffe.layers.Layer< T >.SupportsPreProcessing
get

Should return true when the PreProcessing methods are overridden.

Definition at line 255 of file Layer.cs.

◆ type

Returns the LayerType of this Layer.

Definition at line 926 of file Layer.cs.

Event Documentation

◆ OnDebug

EventHandler<GetWorkBlobArgs<T> > MyCaffe.layers.Layer< T >.OnDebug

Specifies the OnDebug event, which is only supported when debugging, used to get a work blob from the primary Net holding this layer.

When implemented, this event causes a nan/inf check at the end of each forward and backward pass, and is recommended for use only during debugging.

Definition at line 140 of file Layer.cs.

◆ OnGetIteration

EventHandler<GetIterationArgs> MyCaffe.layers.Layer< T >.OnGetIteration

Specifies the OnGetIteration event that fires when a layer needs to get the current iteration from the solver.

Definition at line 132 of file Layer.cs.

◆ OnGetWorkspace

EventHandler<WorkspaceArgs> MyCaffe.layers.Layer< T >.OnGetWorkspace

Specifies the OnGetWorkspace event that fires when the getWorkspace() function is called by a layer to get a shareable workspace to conserve GPU memory.

Definition at line 124 of file Layer.cs.

◆ OnSetWorkspace

EventHandler<WorkspaceArgs> MyCaffe.layers.Layer< T >.OnSetWorkspace

Specifies the OnSetWorkspace event that fires when the setWorkspace() function is called by a layer to set the size of a shareable workspace used to conserve GPU memory.

Definition at line 128 of file Layer.cs.


The documentation for this class was generated from the following file: