ReLU6
Introduced by Howard et al. in MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications.

ReLU6 is a modification of the rectified linear unit that caps the activation at a maximum value of 6, i.e. ReLU6(x) = min(max(0, x), 6). The cap is motivated by increased robustness when the network is used with low-precision computation.
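A minimal sketch in PyTorch, showing the built-in nn.ReLU6 module alongside the equivalent clamp formulation:

```python
import torch
import torch.nn as nn

# Built-in ReLU6 activation module.
act = nn.ReLU6()

x = torch.tensor([-3.0, 0.5, 4.0, 8.0])
print(act(x))  # tensor([0.0000, 0.5000, 4.0000, 6.0000])

# Equivalent definition: clamp the input to the range [0, 6].
print(torch.clamp(x, min=0.0, max=6.0))
```

Because the output is bounded to [0, 6], the activation can be represented with a small fixed-point range, which is what makes it attractive for low-precision inference.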