Neural Networks. Activation functions: ReLU for the hidden layers, tanh for the output layer. See the `Layer` class and the `get_activation` / `get_derivative_activation` functions for details.
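A minimal sketch of what those helpers might look like, assuming a NumPy-based implementation. The function names `get_activation` and `get_derivative_activation` come from the text above, but the bodies here are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

def get_activation(name):
    """Return the activation function by name (hypothetical dispatch)."""
    if name == "relu":
        return lambda x: np.maximum(0.0, x)
    if name == "tanh":
        return np.tanh
    raise ValueError(f"unknown activation: {name}")

def get_derivative_activation(name):
    """Return the derivative of the named activation w.r.t. its input."""
    if name == "relu":
        # Subgradient convention: derivative 0 at x = 0.
        return lambda x: (x > 0).astype(float)
    if name == "tanh":
        # d/dx tanh(x) = 1 - tanh(x)^2
        return lambda x: 1.0 - np.tanh(x) ** 2
    raise ValueError(f"unknown activation: {name}")

# Example forward pass: hidden layers use ReLU, output layer uses tanh.
x = np.array([-1.0, 0.5, 2.0])
hidden = get_activation("relu")(x)
out = get_activation("tanh")(hidden)
```

The derivative lookup mirrors the activation lookup so backpropagation can fetch both by the same name string.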