Approximate inference

Approximate inference methods make it possible to learn realistic models from large amounts of data when exact learning and inference are computationally intractable, by trading some accuracy for a feasible amount of computation.
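
This trade-off can be illustrated with a minimal sketch (the toy network, probabilities, and function names below are hypothetical and not taken from the cited sources): a marginal probability in a two-variable model is approximated by forward sampling, and the estimate approaches the exact value as more samples, and therefore more computation, are spent.

    # A toy sketch of the accuracy/computation trade-off (hypothetical model,
    # probabilities, and names; not from the cited sources).
    import random

    # Toy two-variable network: Rain -> WetGrass, with made-up probabilities.
    P_RAIN = 0.2
    P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}

    def exact_p_wet():
        # Exact inference: sum out the hidden variable Rain.
        return P_RAIN * P_WET_GIVEN_RAIN[True] + (1 - P_RAIN) * P_WET_GIVEN_RAIN[False]

    def sampled_p_wet(n_samples, rng=random.Random(0)):
        # Approximate inference by forward sampling: draw joint samples and
        # count how often WetGrass comes out true.
        hits = 0
        for _ in range(n_samples):
            rain = rng.random() < P_RAIN
            wet = rng.random() < P_WET_GIVEN_RAIN[rain]
            hits += wet
        return hits / n_samples

    if __name__ == "__main__":
        print("exact:", exact_p_wet())            # 0.26 for these probabilities
        for n in (100, 10_000, 1_000_000):
            print(f"{n} samples:", sampled_p_wet(n))

Drawing more samples costs proportionally more time but shrinks the estimation error at roughly the usual Monte Carlo rate of one over the square root of the number of samples.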

Major classes of methods

The major classes of approximate inference methods include variational methods (such as mean-field and variational Bayes approximations), sampling methods (such as Markov chain Monte Carlo), loopy belief propagation, and expectation propagation.[1][2]
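
As a concrete illustration of one class named above, sampling, the following is a minimal sketch of Gibbs sampling, a Markov chain Monte Carlo method, applied to a hypothetical standard bivariate Gaussian with correlation rho (the function name and parameter values are illustrative, not from the cited references). The sampler alternates draws from the two closed-form conditional distributions and averages the samples to approximate an expectation.

    # A toy sketch of Gibbs sampling (a Markov chain Monte Carlo method) for a
    # hypothetical standard bivariate Gaussian with correlation rho; all names
    # and values are illustrative.
    import math
    import random

    def gibbs_bivariate_normal(rho, n_samples, burn_in=1000, rng=random.Random(0)):
        """Approximate E[x*y], which equals rho for a standard bivariate normal."""
        x, y = 0.0, 0.0
        cond_std = math.sqrt(1.0 - rho * rho)   # std. dev. of x|y and of y|x
        total = 0.0
        for t in range(burn_in + n_samples):
            x = rng.gauss(rho * y, cond_std)    # draw x from p(x | y)
            y = rng.gauss(rho * x, cond_std)    # draw y from p(y | x)
            if t >= burn_in:
                total += x * y
        return total / n_samples

    if __name__ == "__main__":
        print("estimated E[x*y]:", gibbs_bivariate_normal(rho=0.8, n_samples=50_000))
        print("true value:", 0.8)

The burn-in period discards early samples that still depend on the arbitrary starting point; after it, the running average converges to the true value of rho as more samples are drawn.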

References

  1. "Approximate Inference and Constrained Optimization". Uncertainty in Artificial Intelligence - UAI: 313–320. 2003.
  2. "Approximate Inference". Retrieved 2013-07-15.