
# CHSwishLayer Class

This class implements a layer that calculates the h-swish activation function for each element of a single input.

The activation function formula:

f(x) = x * ReLU6( x + 3 ) / 6
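
For reference, a minimal standalone sketch of the same formula in plain C++ (not the library implementation; the `HSwish` helper below is only illustrative):

```c++
#include <algorithm>
#include <cstdio>

// Illustrative elementwise h-swish: f(x) = x * ReLU6(x + 3) / 6,
// where ReLU6(y) clamps y to the range [0, 6].
static float HSwish( float x )
{
    const float relu6 = std::min( std::max( x + 3.0f, 0.0f ), 6.0f );
    return x * relu6 / 6.0f;
}

int main()
{
    const float inputs[] = { -4.0f, -1.0f, 0.0f, 1.0f, 4.0f };
    for( float x : inputs ) {
        std::printf( "h-swish(%.1f) = %.4f\n", x, HSwish( x ) );
    }
    return 0;
}
```

Note that the function saturates to `0` for inputs below `-3` and becomes the identity for inputs above `3`.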

## Settings

There are no settings for this layer.

## Trainable parameters

There are no trainable parameters for this layer.

## Inputs

There is only one input, which accepts a data blob of arbitrary size.

## Outputs

There is only one output, which returns a blob of the same size as the input blob. Each element of the output contains the value of the activation function calculated on the corresponding element of the input.
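
A minimal wiring sketch, assuming the common NeoML layer pattern (constructor taking the math engine, `AddLayer`, `Connect`); the exact signatures and the helper function name are assumptions, not taken from this document:

```c++
#include <NeoML/NeoML.h>

using namespace NeoML;

// Illustrative helper (hypothetical name): attaches an h-swish activation
// to an existing layer in the network, assuming the usual NeoML API.
void AddHSwishOnTop( CDnn& dnn, CBaseLayer& input )
{
    CPtr<CHSwishLayer> hswish = new CHSwishLayer( dnn.GetMathEngine() );
    hswish->SetName( "hswish" );
    dnn.AddLayer( *hswish );
    // The single input accepts a blob of arbitrary size;
    // the single output returns a blob of the same size.
    hswish->Connect( input );
}
```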