The class implements a layer that performs object normalization using the following formula:
```c++
objectNorm(x)[i][j] = ((x[i][j] - mean[i]) / sqrt(var[i] + epsilon)) * scale[j] + bias[j]
```
where:

- `scale` and `bias` are trainable parameters
- `mean` and `var` are the mean and variance of each object in the batch
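To make the indexing concrete, below is a minimal standalone sketch of the same computation in plain C++. It is not the actual NeoML implementation, and all names are illustrative; `i` runs over the objects, `j` over the elements of one object:

```c++
#include <cmath>
#include <vector>

// Illustrative sketch only. objects has shape [objectCount][objectSize].
void ObjectNormNaive( std::vector<std::vector<float>>& objects,
	const std::vector<float>& scale, const std::vector<float>& bias, float epsilon )
{
	for( auto& object : objects ) {
		// mean and var are computed separately over the elements of each object
		float mean = 0.f;
		for( float v : object ) {
			mean += v;
		}
		mean /= object.size();

		float var = 0.f;
		for( float v : object ) {
			var += ( v - mean ) * ( v - mean );
		}
		var /= object.size();

		// normalize, then apply the learned per-element scale and bias
		for( size_t j = 0; j < object.size(); ++j ) {
			object[j] = ( object[j] - mean ) / std::sqrt( var + epsilon ) * scale[j] + bias[j];
		}
	}
}
```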
```c++
void SetEpsilon( float newEpsilon );
```

Sets `epsilon`, which is added to the variance to avoid division by zero.
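For example, assuming `objectNorm` is a pointer to this layer (the value shown is an arbitrary illustrative choice):

```c++
objectNorm->SetEpsilon( 1e-5f ); // a small positive value; 1e-5f is illustrative
```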
```c++
CPtr<CDnnBlob> GetScale() const;
```

Gets the scale vector. It is a blob of any shape whose total size is equal to `Height * Width * Depth * Channels` of the input.
```c++
CPtr<CDnnBlob> GetBias() const;
```

Gets the bias vector. It is a blob of any shape whose total size is equal to `Height * Width * Depth * Channels` of the input.
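As an illustration, a sketch of reading the trained parameters back to host memory; `objectNorm` is again assumed to be a pointer to this layer:

```c++
CPtr<CDnnBlob> scale = objectNorm->GetScale();
CPtr<CDnnBlob> bias = objectNorm->GetBias();
// Each blob's total size matches Height * Width * Depth * Channels of the input.
std::vector<float> scaleData( scale->GetDataSize() );
scale->CopyTo( scaleData.data() );
```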
The single input accepts a blob containing `BatchLength * BatchWidth * ListSize` objects, each of size `Height * Width * Depth * Channels`.
The single output contains a blob of the same size as the input, with the results of normalization.
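Putting it together, a minimal sketch of wiring the layer into a network, assuming the usual NeoML layer-creation pattern; the names, seed, and epsilon value are illustrative:

```c++
CRandom random( 451 );
CDnn dnn( random, GetDefaultCpuMathEngine() );

// The source layer provides the input blob with the objects to normalize.
CPtr<CSourceLayer> source = new CSourceLayer( dnn.GetMathEngine() );
source->SetName( "data" );
dnn.AddLayer( *source );

CPtr<CObjectNormalizationLayer> objectNorm = new CObjectNormalizationLayer( dnn.GetMathEngine() );
objectNorm->SetName( "objectNorm" );
objectNorm->SetEpsilon( 1e-5f ); // illustrative value
dnn.AddLayer( *objectNorm );
objectNorm->Connect( *source );
```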