
Activation functions


The activation functions listed below are provided by default; see how to add more.
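
Each of these can be selected by name when building a model. A minimal usage sketch, assuming the Dann constructor and the addHiddenLayer(size, activationName) signature from the main Dann documentation:

```js
// Hypothetical usage sketch; layer sizes chosen arbitrarily.
const Dann = require('dannjs').dann;

const nn = new Dann(4, 2);          // 4 inputs, 2 outputs
nn.addHiddenLayer(16, 'leakyReLU'); // any activation below, referenced by name
nn.makeWeights();
```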

sigmoid

Sigmoid is the default activation function. This function outputs a value in the range [0,1].

Definition:

\sigma(x) = \frac{1}{1 + e^{-x}}
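
A minimal sketch of this formula in plain JavaScript (for illustration; not necessarily Dann's internal source):

```js
// Sigmoid squashes any real input into the (0, 1) range.
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

console.log(sigmoid(0)); // 0.5
```
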
reLU

The reLU (rectified linear unit) activation function is cheap to compute, since it only requires comparing the input with zero. This function outputs values in the range [0,infinity].

Definition:

f(x) = \max(0, x)
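
A minimal sketch of this formula in plain JavaScript (for illustration; not necessarily Dann's internal source):

```js
// reLU keeps positive inputs unchanged and clamps negative inputs to 0.
function relu(x) {
  return Math.max(0, x);
}

console.log(relu(-2), relu(3)); // 0 3
```
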
leakyReLU

Similar to reLU, this activation function is cheap to compute. Unlike reLU, it allows the output to be negative, which mitigates the "dying reLU neuron" problem. This function outputs values in the range [-infinity,infinity].

Definition:

f(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha x & \text{otherwise} \end{cases}

where \alpha is a small positive slope (0.01 is a common default).
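
A minimal sketch in plain JavaScript; the slope value here is an assumption (0.01 is the common default), not necessarily the one Dann uses internally:

```js
// Slope applied to negative inputs; assumed value, commonly 0.01.
const ALPHA = 0.01;

function leakyRelu(x) {
  return x > 0 ? x : ALPHA * x;
}

console.log(leakyRelu(-2), leakyRelu(3)); // -0.02 3
```
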
tanH

This activation function shares a lot of similarities with sigmoid. Unlike sigmoid, it outputs a value in the range [-1,1].

Definition:

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}
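
A minimal sketch of this formula in plain JavaScript (for illustration; not necessarily Dann's internal source):

```js
// Written out to mirror the definition above.
function tanH(x) {
  return (Math.exp(x) - Math.exp(-x)) / (Math.exp(x) + Math.exp(-x));
}

console.log(tanH(1), Math.tanh(1)); // both ~0.76159
```

In practice, the built-in Math.tanh(x) is preferable, since the explicit form overflows for large |x|.
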
siLU

This activation function is the sigmoid's output multiplied by its input x. SiLU shares a lot of similarities with leakyReLU, except that its output on the negative domain is bounded: it dips slightly below zero (to about -0.28) and approaches 0 as x decreases, instead of growing more negative without bound.

Definition:

f(x) = x \cdot \sigma(x) = \frac{x}{1 + e^{-x}}
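
A minimal sketch of this formula in plain JavaScript (for illustration; not necessarily Dann's internal source):

```js
// SiLU multiplies the input by its own sigmoid.
function silu(x) {
  return x / (1 + Math.exp(-x));
}

console.log(silu(-5), silu(5)); // ~-0.0335 ~4.9665
```
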
leakySigmoid

This is an experimental function; it is very similar to arctan(x). Unlike sigmoid, this function outputs a value in the range [~0, ~1].

Definition: