Keras documentation: Layer activation functions
https://keras.io/api/layers/activations
Applies the rectified linear unit activation function. With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor. Modifying the default parameters allows you to use non-zero thresholds, to change the max value of the activation, and to use a non-zero multiple of the input for values below the threshold.
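A minimal sketch of the behavior described above, written in plain NumPy rather than calling the Keras API itself; the parameter names (negative_slope, max_value, threshold) mirror the options mentioned in the excerpt but the implementation is illustrative, not the library code:

```python
import numpy as np

def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    """Illustrative parameterized ReLU (NumPy sketch, not the Keras implementation)."""
    x = np.asarray(x, dtype=float)
    # Values at or above the threshold pass through unchanged; values below it
    # are scaled by negative_slope (0.0 by default, giving standard ReLU).
    out = np.where(x >= threshold, x, negative_slope * (x - threshold))
    # Optionally cap the activation at max_value.
    if max_value is not None:
        out = np.minimum(out, max_value)
    return out

# With defaults this is the standard ReLU: max(x, 0).
print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))        # -> [0. 0. 0. 1.5 3.]
print(relu([-2.0, 3.0], negative_slope=0.1))    # -> [-0.2 3.]  (leaky below threshold)
print(relu([-2.0, 3.0], max_value=2.0))         # -> [0. 2.]    (capped activation)
```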
Activation function - Wikipedia
en.wikipedia.org › wiki › Activation_function
Activation functions like tanh, Leaky ReLU, GELU, ELU, Swish and Mish are sign-equivalent to the identity function and cannot learn the XOR function with a single neuron. The output of a single neuron, or its activation, is $a = g(z) = g(\boldsymbol{w}^{T}\boldsymbol{x} + b)$, where $g$ is the activation function.
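A short illustration of that formula for a single neuron; the weight vector, input, bias, and the choice of tanh as $g$ are arbitrary values assumed for the example, not taken from either source:

```python
import numpy as np

def neuron_activation(w, x, b, g=np.tanh):
    """Compute a = g(w^T x + b) for a single neuron."""
    z = np.dot(w, x) + b   # pre-activation: weighted sum plus bias
    return g(z)            # apply the activation function g

w = np.array([0.5, -1.0, 2.0])   # weights (illustrative values)
x = np.array([1.0, 0.0, 0.25])   # input vector
b = 0.1                          # bias

print(neuron_activation(w, x, b))                                  # tanh(1.1) ~ 0.8005
print(neuron_activation(w, x, b, g=lambda z: np.maximum(z, 0.0)))  # ReLU: 1.1
```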