Hard sigmoid

In artificial intelligence, especially computer vision and artificial neural networks, a hard sigmoid is a non-smooth function used in place of a sigmoid function. Hard sigmoids retain the basic shape of a sigmoid, rising from 0 to 1, but use simpler functions, especially piecewise linear or piecewise constant functions. They are preferred where speed of computation matters more than precision.

Examples

The most extreme examples are the sign function and the Heaviside step function, which jump at 0 from −1 to 1 or from 0 to 1, respectively (which one to use depends on the normalization).[1]
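As an illustrative sketch only (the function names and the convention that the step takes the value 1 at exactly 0 are choices made here, not taken from the sources), these extreme cases can be written in NumPy as:

import numpy as np

def heaviside_step(x):
    # Jumps from 0 to 1 at 0; the value at exactly 0 depends on the chosen convention.
    return np.where(x >= 0, 1.0, 0.0)

def sign_activation(x):
    # Jumps from -1 to 1 at 0; np.sign returns 0 at exactly 0.
    return np.sign(x)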

Another example is the Theano library, which provides two approximations: ultra_fast_sigmoid, a multi-part piecewise linear approximation, and hard_sigmoid, a 3-part piecewise linear approximation (output 0, a line with slope 0.2, output 1).[2][3]
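As a sketch of the 3-part approximation described above, assuming the common form max(0, min(1, 0.2·x + 0.5)) rather than reproducing Theano's exact code:

import numpy as np

def hard_sigmoid(x):
    # Outputs 0 for x <= -2.5, follows the line 0.2*x + 0.5 for -2.5 < x < 2.5,
    # and outputs 1 for x >= 2.5.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

# Comparison with the smooth logistic sigmoid at a few points:
x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(hard_sigmoid(x))           # [0.  0.3 0.5 0.7 1. ]
print(1.0 / (1.0 + np.exp(-x)))  # smooth sigmoid for comparison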


References
