The bn class is a layer that can be used inside the model class; more specifically, it is embedded inside cl layers to define group normalization layers and inside fcl layers to define normalization layers. The model class does not accept a bn layer that is not wrapped inside a cl or fcl layer.
def __cinit__(self, int batch_size,
              int vector_input_dimension,
              bint does_have_learning_parameters = True,
              bint does_have_arrays = True,
              bint is_only_for_feedforward = False):
-
@batch_size indicates the group dimension of the inputs that must be normalized. For example, in the group normalization case it indicates the number of channels that are normalized together.
-
@vector_input_dimension is the size of the input of each instance of the batch, as well as of the output
-
@does_have_learning_parameters if set to True, the class is initialized with all the arrays needed for the feed forward and the back propagation, as well as the learning parameters such as the computed mean and standard deviation.
-
@does_have_arrays if set to False, the class will not allocate any array.
-
@is_only_for_feedforward if set to True, the class will not allocate the arrays used for the back propagation.
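To make the role of @batch_size and @vector_input_dimension concrete, here is a minimal sketch of what a normalization layer computes during the feed forward: the mean and standard deviation are taken over the whole group of vectors, and each value is rescaled with them. The function name, the epsilon value, and the omission of the learnable scale/shift parameters are assumptions for illustration, not the library's actual implementation.

```python
import math

def group_normalize(batch, epsilon=1e-7):
    """Normalize a group of vectors to zero mean and unit variance.

    `batch` is a list of `batch_size` vectors, each of length
    `vector_input_dimension`; mean and std are computed over the whole
    group, mirroring the statistics a bn layer computes. The learnable
    scale/shift parameters are omitted here for brevity.
    """
    values = [x for vector in batch for x in vector]
    n = len(values)
    mean = sum(values) / n
    var = sum((x - mean) ** 2 for x in values) / n
    std = math.sqrt(var + epsilon)
    return [[(x - mean) / std for x in vector] for vector in batch]

# batch_size = 2, vector_input_dimension = 2
normalized = group_normalize([[1.0, 2.0], [3.0, 4.0]])
```

The output has the same shape as the input, which is why the same @vector_input_dimension describes both.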
-
def save(self, number_of_file)
- @number_of_file saves the class in binary format to a file named number_of_file.bin, where number_of_file is an integer
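As a rough illustration of the naming convention described above, the sketch below writes some float parameters to a `<number_of_file>.bin` file. The function name and the raw-float layout are assumptions; the real on-disk format of bn is library-specific.

```python
import os
import struct
import tempfile

def save_layer(number_of_file, params):
    """Hypothetical sketch: dump a layer's float parameters in raw
    binary to a file named <number_of_file>.bin, following the naming
    convention save() uses. The actual bn serialization format differs.
    """
    filename = "%d.bin" % number_of_file
    with open(filename, "wb") as f:
        f.write(struct.pack("%df" % len(params), *params))
    return filename

# usage: write three floats to 7.bin inside a scratch directory
os.chdir(tempfile.mkdtemp())
path = save_layer(7, [1.0, 2.0, 3.0])
```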
-
def get_size(self)
- returns the approximate number of bytes occupied by this class
-
def make_it_only_for_ff(self)
- deallocates all the arrays needed only for the back propagation
-
def reset(self)
- resets the internal arrays; usually called after each feed forward pass, each back propagation pass, or both
-
def clip(self, float threshold, float norm)
- applies gradient clipping to this layer (it is usually not needed for normalization layers)
- @threshold the threshold used for the clipping
- @norm the normalization parameter used to rescale the gradients
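A minimal sketch of clip-by-norm may clarify how @threshold and @norm interact: when the (precomputed) gradient norm exceeds the threshold, every gradient is rescaled by threshold/norm. The function name is an assumption, and treating @norm as the precomputed overall gradient norm is an interpretation of the docs, not a confirmed detail of the library.

```python
def clip_by_norm(grads, threshold, norm):
    """Rescale gradients so their overall norm does not exceed
    `threshold`. `norm` is assumed to be the precomputed gradient
    norm (possibly accumulated across several layers)."""
    if norm > threshold:
        scale = threshold / norm
        return [g * scale for g in grads]
    return list(grads)

# gradients with norm 5 clipped down to norm 1
clipped = clip_by_norm([3.0, 4.0], threshold=1.0, norm=5.0)
```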