Softplus — PyTorch 1.10.1 documentation
class torch.nn.Softplus(beta=1, threshold=20)

Applies the Softplus function element-wise:

\text{Softplus}(x) = \frac{1}{\beta} \log\left(1 + \exp(\beta x)\right)

SoftPlus is a smooth approximation to the ReLU function and can be used to constrain the output of a machine to always be positive.

For numerical stability, the implementation reverts to the linear function when \text{input} \times \beta > \text{threshold}.

Parameters:
- beta – the \beta value for the Softplus formulation. Default: 1
- threshold – values above this revert to a linear function. Default: 20
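A minimal usage sketch, assuming the signature and defaults documented above; the example tensor values are illustrative:

```python
import torch
import torch.nn as nn

# Softplus with the default beta=1 and threshold=20
m = nn.Softplus()

x = torch.tensor([-2.0, 0.0, 2.0, 100.0])
y = m(x)

print(y)
# All outputs are strictly positive. For x = 100.0, input * beta exceeds the
# threshold of 20, so the module returns the input essentially unchanged
# (the linear regime), avoiding overflow in exp(beta * x).
```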