#concept
What is ReLU?
?
Rectified Linear Unit. It is a common type of activation function.
f(x) = max(0, x)
- For x ≥ 0: the function returns x (i.e., it behaves linearly).
- For x < 0: the function returns 0.
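A minimal Python sketch of this definition (the function name `relu` is illustrative):

```python
def relu(x):
    # ReLU activation: identity for x >= 0, zero otherwise
    return max(0.0, x)

print(relu(2.5))   # 2.5
print(relu(-1.3))  # 0.0
```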