Leaky ReLU is a type of activation function that tries to solve the Dying ReLU problem.
A traditional rectified linear unit, f(x) = max(0, x), returns 0 when x ≤ 0. The Dying ReLU problem refers to when the unit gets stuck this way, always returning 0 for any input.
Leaky ReLU aims to fix this by returning a small, non-zero output for negative inputs instead of 0, as such:

f(x) = x     if x > 0
f(x) = αx    otherwise

where α is typically a small value like 0.01.
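The piecewise definition above can be sketched in a few lines of NumPy (the function name and the default slope of 0.01 here are illustrative choices, not from any particular library):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Return x for positive inputs, alpha * x for negative inputs."""
    return np.where(x > 0, x, alpha * x)

# Negative inputs are scaled down rather than zeroed out,
# so the unit still passes a small gradient and can recover.
print(leaky_relu(np.array([-2.0, 0.0, 3.0])))
```

Because the negative branch has a non-zero slope, the gradient is α (rather than 0) for negative inputs, which is what lets a "dead" unit keep learning.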