MyCaffe  1.12.2.41
Deep learning software for Windows C# programmers.
MyCaffe.param.gpt.LayerNormParameter Class Reference

Specifies the parameters for the LayerNormalizationLayer. More...

Inheritance diagram for MyCaffe.param.gpt.LayerNormParameter:
MyCaffe.basecode.BaseParameter → MyCaffe.param.LayerParameterBase → MyCaffe.param.gpt.LayerNormParameter (implements MyCaffe.basecode.IBinaryPersist)

Public Member Functions

 LayerNormParameter ()
 Constructor for the parameter. More...
 
override object Load (System.IO.BinaryReader br, bool bNewInstance=true)
 Load the parameter from a binary reader. More...
 
override void Copy (LayerParameterBase src)
 Copy one parameter to another. More...
 
override LayerParameterBase Clone ()
 Creates a new copy of this instance of the parameter. More...
 
override RawProto ToProto (string strName)
 Convert the parameter into a RawProto. More...
 
- Public Member Functions inherited from MyCaffe.param.LayerParameterBase
 LayerParameterBase ()
 Constructor for the parameter. More...
 
virtual string PrepareRunModelInputs ()
 This method gives derived classes a chance to specify the model inputs required by the run model. More...
 
virtual void PrepareRunModel (LayerParameter p)
 This method gives derived classes a chance to prepare the layer for a run model. More...
 
void Save (BinaryWriter bw)
 Save this parameter to a binary writer. More...
 
abstract object Load (BinaryReader br, bool bNewInstance=true)
 Load the parameter from a binary reader. More...
 
- Public Member Functions inherited from MyCaffe.basecode.BaseParameter
 BaseParameter ()
 Constructor for the parameter. More...
 
virtual bool Compare (BaseParameter p)
 Compare this parameter to another parameter. More...
 

Static Public Member Functions

static LayerNormParameter FromProto (RawProto rp)
 Parses the parameter from a RawProto. More...
 
- Static Public Member Functions inherited from MyCaffe.basecode.BaseParameter
static double ParseDouble (string strVal)
 Parse a double value: use the US culture when the decimal separator is '.', otherwise try the native culture and, lastly, fall back to the US culture. This handles prototypes that contain '.' as the decimal separator yet are parsed in a culture that does not use '.' as a decimal (a short usage sketch follows this list). More...
 
static bool TryParse (string strVal, out double df)
 Try to parse a double value: use the US culture when the decimal separator is '.', otherwise try the native culture and, lastly, fall back to the US culture. This handles prototypes that contain '.' as the decimal separator yet are parsed in a culture that does not use '.' as a decimal. More...
 
static float ParseFloat (string strVal)
 Parse a float value: use the US culture when the decimal separator is '.', otherwise try the native culture and, lastly, fall back to the US culture. This handles prototypes that contain '.' as the decimal separator yet are parsed in a culture that does not use '.' as a decimal. More...
 
static bool TryParse (string strVal, out float f)
 Try to parse a float value: use the US culture when the decimal separator is '.', otherwise try the native culture and, lastly, fall back to the US culture. This handles prototypes that contain '.' as the decimal separator yet are parsed in a culture that does not use '.' as a decimal. More...
 
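The culture fallback described above can be exercised directly. A minimal sketch (the de-DE culture is only an illustrative choice):

using System;
using System.Globalization;
using System.Threading;
using MyCaffe.basecode;

// Sketch: a prototxt stores "0.001" with '.' as the decimal separator,
// but the host thread runs in a culture (here de-DE) that uses ',' instead.
Thread.CurrentThread.CurrentCulture = new CultureInfo("de-DE");

double dfLr = BaseParameter.ParseDouble("0.001");   // falls back to the US culture and parses correctly

float fEps;
if (BaseParameter.TryParse("1e-10", out fEps))      // same fallback behavior for float values
    Console.WriteLine(fEps);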

Properties

double epsilon [getset]
 Specifies the epsilon value used to avoid invalid values (default = 1e-10). More...
 
bool enable_passthrough [getset]
 Specifies to pass the data through on the forward and backward pass (i.e., skips the layer norm; used only for debugging; default = false). More...
 
bool enable_cuda_impl [getset]
 Specifies to use the low-level, full CUDA implementation of LayerNorm (default = false). More...
 

Additional Inherited Members

- Public Types inherited from MyCaffe.param.LayerParameterBase
enum  LABEL_TYPE { NONE, SINGLE, MULTIPLE, ONLY_ONE }
 Defines the label type. More...
 

Detailed Description

Specifies the parameters for the LayerNormalizationLayer.

See also
GitHub: CyberZHG by Zhao HG (MIT License).
LayerNorm PyTorch
Understanding and Improving Layer Normalization by Xu et al., 2019, arXiv:1911.07013

Definition at line 20 of file LayerNormParameter.cs.
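A minimal configuration sketch; the LayerParameter wiring shown in the trailing comment (the LAYERNORM layer type and the layer_norm_param property) follows the usual MyCaffe naming convention and is an assumption here:

using MyCaffe.param.gpt;

// Construct and configure the layer-norm parameter directly.
LayerNormParameter p = new LayerNormParameter();
p.epsilon = 1e-10;             // default; guards against invalid values during normalization
p.enable_cuda_impl = true;     // use the low-level, full CUDA implementation
p.enable_passthrough = false;  // default; set to true only when debugging

// Hypothetical wiring into a layer definition (names assumed, not confirmed by this page):
// LayerParameter layer = new LayerParameter(LayerParameter.LayerType.LAYERNORM, "ln1");
// layer.layer_norm_param = p;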

Constructor & Destructor Documentation

◆ LayerNormParameter()

MyCaffe.param.gpt.LayerNormParameter.LayerNormParameter ( )

Constructor for the parameter.

Definition at line 27 of file LayerNormParameter.cs.

Member Function Documentation

◆ Clone()

override LayerParameterBase MyCaffe.param.gpt.LayerNormParameter.Clone ( )
virtual

Creates a new copy of this instance of the parameter.

Returns
A new instance of this parameter is returned.

Implements MyCaffe.param.LayerParameterBase.

Definition at line 97 of file LayerNormParameter.cs.

◆ Copy()

override void MyCaffe.param.gpt.LayerNormParameter.Copy ( LayerParameterBase  src)
virtual

Copy one parameter to another.

Parameters
src    Specifies the parameter to copy.

Implements MyCaffe.param.LayerParameterBase.

Definition at line 85 of file LayerNormParameter.cs.

◆ FromProto()

static LayerNormParameter MyCaffe.param.gpt.LayerNormParameter.FromProto ( RawProto  rp)
static

Parses the parameter from a RawProto.

Parameters
rp    Specifies the RawProto to parse.
Returns
A new instance of the parameter is returned.

Definition at line 127 of file LayerNormParameter.cs.

◆ Load()

override object MyCaffe.param.gpt.LayerNormParameter.Load ( System.IO.BinaryReader  br,
bool  bNewInstance = true 
)

Load the parameter from a binary reader.

Parameters
br    Specifies the binary reader.
bNewInstance    When true (the default), a new instance is created; otherwise the existing instance is loaded from the binary reader.
Returns
Returns an instance of the parameter.

Definition at line 70 of file LayerNormParameter.cs.
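A binary round-trip sketch using the inherited Save(BinaryWriter) together with this Load override (the epsilon value is arbitrary):

using System.IO;
using MyCaffe.param.gpt;

LayerNormParameter p = new LayerNormParameter();
p.epsilon = 1e-8;

using (MemoryStream ms = new MemoryStream())
{
    using (BinaryWriter bw = new BinaryWriter(ms, System.Text.Encoding.UTF8, true))
        p.Save(bw);                      // inherited from LayerParameterBase

    ms.Position = 0;
    using (BinaryReader br = new BinaryReader(ms))
    {
        // bNewInstance = true (the default) returns a freshly loaded copy.
        LayerNormParameter pLoaded = (LayerNormParameter)p.Load(br, true);
    }
}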

◆ ToProto()

override RawProto MyCaffe.param.gpt.LayerNormParameter.ToProto ( string  strName)
virtual

Convert the parameter into a RawProto.

Parameters
strName    Specifies the name to associate with the RawProto.
Returns
The new RawProto is returned.

Implements MyCaffe.basecode.BaseParameter.

Definition at line 109 of file LayerNormParameter.cs.
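A text round-trip sketch combining ToProto with the static FromProto documented above (the RawProto name string is arbitrary):

using MyCaffe.basecode;
using MyCaffe.param.gpt;

LayerNormParameter p = new LayerNormParameter();
p.enable_cuda_impl = true;

RawProto proto = p.ToProto("layer_norm_param");                      // serialize to a RawProto
LayerNormParameter pParsed = LayerNormParameter.FromProto(proto);    // parse it back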

Property Documentation

◆ enable_cuda_impl

bool MyCaffe.param.gpt.LayerNormParameter.enable_cuda_impl
getset

Specifies to use the low-level, full CUDA implementation of LayerNorm (default = false).

The CUDA implementation runs around 30% faster when using the float base type.

Definition at line 58 of file LayerNormParameter.cs.

◆ enable_passthrough

bool MyCaffe.param.gpt.LayerNormParameter.enable_passthrough
getset

Specifies to pass the data through on the forward and backward pass (i.e., skips the layer norm; used only for debugging; default = false).

Definition at line 45 of file LayerNormParameter.cs.

◆ epsilon

double MyCaffe.param.gpt.LayerNormParameter.epsilon
getset

Specifies the epsilon value used to avoid invalid values (default = 1e-10).

Definition at line 35 of file LayerNormParameter.cs.


The documentation for this class was generated from the following file:
LayerNormParameter.cs