# ReLU

Rectified Linear Units (ReLU) only output the positive signal of the input. They have the benefit of a monotonic derivative and are cheap to compute. [^1]

$$
\operatorname{ReLU}(x) = \begin{cases} 0 & \text{if } x \leq 0 \\ x & \text{if } x > 0 \end{cases} = \max\{0, x\}
$$

## Parameters
This activation function does not have any parameters.

## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

$activationFunction = new ReLU();
```

A sketch showing ReLU applied to the hidden layers of a network follows the references below.

## References
[^1]: A. L. Maas et al. (2013). Rectifier Nonlinearities Improve Neural Network Acoustic Models.
[^2]: K. Konda et al. (2015). Zero-bias Autoencoders and the Benefits of Co-adapting Features.
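In practice the activation function is usually not used on its own but passed to a hidden layer of a neural network estimator. The following is a minimal sketch assuming the library's `Dense` and `Activation` hidden layer classes and the `MultilayerPerceptron` estimator; the layer sizes are arbitrary and the exact constructor signatures should be checked against the current API docs.

```php
use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;

// Two dense hidden layers, each followed by a ReLU non-linearity.
$estimator = new MultilayerPerceptron([
    new Dense(128),
    new Activation(new ReLU()),
    new Dense(64),
    new Activation(new ReLU()),
]);
```

Since ReLU has no parameters, the same instance could also be reused across layers without affecting training.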