Hard sigmoid - Wikipedia
https://en.wikipedia.org/wiki/Hard_sigmoid
In artificial intelligence, especially computer vision and artificial neural networks, a hard sigmoid is a non-smooth function used in place of a sigmoid function. Hard sigmoids retain the basic shape of a sigmoid, rising from 0 to 1, but use simpler functions, especially piecewise linear or piecewise constant functions. They are preferred where speed of computation is more important than precision.
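The two families mentioned above can be sketched in Python. This is an illustrative comparison, not a definition from the article: the slope and intercept of the piecewise-linear variant, and the threshold of the piecewise-constant variant, are assumed here for demonstration.

```python
import math


def sigmoid(x: float) -> float:
    # The smooth logistic sigmoid: rises from 0 to 1.
    return 1.0 / (1.0 + math.exp(-x))


def hard_sigmoid_linear(x: float) -> float:
    # Piecewise-linear approximation: a line clamped to [0, 1].
    # Slope 0.5 and intercept 0.5 are an assumed (illustrative) choice.
    return max(0.0, min(1.0, 0.5 * x + 0.5))


def hard_sigmoid_step(x: float) -> float:
    # Piecewise-constant approximation: a step at an assumed threshold of 0.
    return 0.0 if x < 0.0 else 1.0
```

All three functions agree at the midpoint and at the saturated ends; the hard variants trade smoothness for cheaper arithmetic (a multiply-add and two comparisons, or a single comparison, instead of an exponential).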
Hard Sigmoid Explained | Papers With Code
https://paperswithcode.com/method/hard-sigmoid
Hard Sigmoid. Introduced by Courbariaux et al. in BinaryConnect: Training Deep Neural Networks with binary weights during propagations. The Hard Sigmoid is an activation function used for neural networks, of the form: f(x) = max(0, min(1, (x + 1)/2)). Image source: Rinat Maksutov.
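The formula above translates directly into code. A minimal sketch (the function name is illustrative):

```python
def hard_sigmoid(x: float) -> float:
    # BinaryConnect hard sigmoid: f(x) = max(0, min(1, (x + 1) / 2)).
    # Saturates at 0 for x <= -1 and at 1 for x >= 1; linear in between.
    return max(0.0, min(1.0, (x + 1.0) / 2.0))
```

For example, `hard_sigmoid(0.0)` gives 0.5, while inputs at or beyond -1 and 1 clamp to 0 and 1 respectively.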