
Sigmoid Unit

[Figure: sigmoid unit (./bookps/ann-sigmoid.epsf)]



$\sigma(x)$ is the sigmoid function

\begin{displaymath}\sigma(x) = \frac{1}{1 + e^{-x}} \end{displaymath}
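As a rough illustration (the function name sigmoid_unit and the example weights below are our own, not from the text), a sigmoid unit computes a weighted sum of its inputs, including a bias weight $w_0$, and passes it through $\sigma$:

\begin{verbatim}
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_unit(weights, inputs):
    # net = w0 + sum_i w_i * x_i   (w0 is the bias / threshold weight)
    net = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return sigmoid(net)

# Example: bias w0 = -0.5, weights (0.3, 0.8), inputs (1, 0)
print(sigmoid_unit([-0.5, 0.3, 0.8], [1.0, 0.0]))  # sigma(-0.2), about 0.45
\end{verbatim}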

Nice property: $\frac{d \sigma(x)}{dx} = \sigma(x) (1 - \sigma(x))$
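This identity follows from the chain rule:

\begin{displaymath}\frac{d \sigma(x)}{dx} = \frac{e^{-x}}{(1 + e^{-x})^2} = \frac{1}{1 + e^{-x}} \cdot \frac{e^{-x}}{1 + e^{-x}} = \sigma(x) (1 - \sigma(x)) \end{displaymath}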


We can derive gradient descent rules to train one sigmoid unit.
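For squared error $E = \frac{1}{2} \sum_d (t_d - o_d)^2$, the standard rule is $\Delta w_i = \eta (t - o)\, o (1 - o)\, x_i$, where the factor $o(1-o)$ comes from the derivative property above. A minimal sketch of stochastic gradient descent for one sigmoid unit (the training data and learning rate are made up for illustration):

\begin{verbatim}
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_sigmoid_unit(examples, eta=0.5, epochs=5000):
    # examples: list of (inputs, target); w[0] is the bias weight (input fixed at 1)
    n = len(examples[0][0])
    w = [0.0] * (n + 1)
    for _ in range(epochs):
        for x, t in examples:
            net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            o = sigmoid(net)
            delta = (t - o) * o * (1 - o)   # (t - o) * d(sigma)/d(net)
            w[0] += eta * delta             # bias input is 1
            for i, xi in enumerate(x):
                w[i + 1] += eta * delta * xi
    return w

# Hypothetical data: an OR-like target on two boolean inputs
data = [([0, 0], 0.0), ([0, 1], 1.0), ([1, 0], 1.0), ([1, 1], 1.0)]
print(train_sigmoid_unit(data))
\end{verbatim}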

Multilayer networks of sigmoid units $\rightarrow$ Backpropagation
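As a preview (stated here without derivation), backpropagation reuses the same derivative property: each output unit $k$ gets an error term $\delta_k = o_k (1 - o_k)(t_k - o_k)$, each hidden unit $h$ gets $\delta_h = o_h (1 - o_h) \sum_k w_{kh} \delta_k$, and every weight is updated by $\Delta w_{ji} = \eta\, \delta_j\, x_{ji}$.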


