Hard sigmoid


In artificial intelligence, especially computer vision and artificial neural networks, a hard sigmoid is a non-smooth function used in place of a sigmoid function. Hard sigmoids retain the basic shape of a sigmoid, rising from 0 to 1, but are built from simpler functions, especially piecewise linear or piecewise constant functions. They are preferred where speed of computation is more important than precision.
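
A common piecewise linear form can be written as follows; the slope of 0.2 matches the example discussed below, while the intercept of 0.5 (which centres the ramp at 0) and the notation are assumptions for illustration, and other libraries choose different constants:

    \sigma_{\text{hard}}(x) = \max\bigl(0, \min(1, 0.2x + 0.5)\bigr)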

Examples


The most extreme examples are the sign function and the Heaviside step function, which jump from −1 to 1 or from 0 to 1, respectively (which to use depends on normalization), at 0.[1]
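
For reference, in the 0-to-1 normalization this is the Heaviside step function, written here with one common convention for the value at 0 (conventions for that point differ):

    H(x) = \begin{cases} 0 & x < 0 \\ 1 & x \ge 0 \end{cases}

while the sign function takes the values −1, 0 and 1.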

Other examples are found in the Theano library, which provides two approximations: ultra_fast_sigmoid, a multi-part piecewise approximation, and hard_sigmoid, a 3-part piecewise linear approximation (output 0, a line with slope 0.2, output 1).[2][3]
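
A minimal NumPy sketch of such a 3-part approximation follows. The slope of 0.2 comes from the description above; the intercept of 0.5 and the resulting cutoffs at ±2.5 are assumptions chosen to centre the ramp at 0, and this is an illustration rather than Theano's actual implementation:

    import numpy as np

    def hard_sigmoid(x):
        # 3-part piecewise linear approximation of the logistic sigmoid:
        # 0 for x <= -2.5, the line 0.2*x + 0.5 in between, 1 for x >= 2.5.
        return np.clip(0.2 * x + 0.5, 0.0, 1.0)

    # Compare against the exact logistic sigmoid at a few points.
    x = np.linspace(-4.0, 4.0, 9)
    print(hard_sigmoid(x))
    print(1.0 / (1.0 + np.exp(-x)))

The approximation avoids the exponential, so it is cheaper to evaluate and trivial to differentiate piecewise, at the cost of a kink at the two cutoff points.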

References

  1. ^ Curves and Surfaces in Computer Vision and Graphics, Volume 1610, SPIE, 1992, p. 301
  2. ^ "nnet – Ops for neural networks". Archived from the original on 2018-08-14. Retrieved 2018-09-03.
  3. ^ "Theano/sigm.py at 38a6331ae23250338290e886a72daadb33441bc4". Theano/Theano, GitHub.