Causal Markov condition

The Markov condition, sometimes called the Markov assumption, is an assumption made in Bayesian probability theory that every node in a Bayesian network is conditionally independent of its non-descendants, given its parents. Stated loosely, it is assumed that, once its parents are known, a node carries no information about nodes that do not descend from it. This is equivalent to stating that a node is conditionally independent of the rest of the network, given its Markov blanket.

The related Causal Markov (CM) condition states that, conditional on the set of all its direct causes, a node is independent of all variables which are not direct causes or direct effects of that node.[1] In the event that the structure of a Bayesian network accurately depicts causality, the two conditions are equivalent. However, a network may accurately embody the Markov condition without depicting causality, in which case it should not be assumed to embody the causal Markov condition.

Definition

Let G be an acyclic causal graph (a graph in which no node appears more than once along any directed path) with vertex set V, and let P be a probability distribution over the vertices in V generated by G. G and P satisfy the Causal Markov Condition if every node X in V is independent of V \ (Descendants(X) ∪ Parents(X)), given Parents(X).[2]
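The condition can be checked numerically on a small network. The following is a minimal sketch for the chain A → B → C, with made-up conditional probability tables (the network and all numbers are illustrative, not from the article): given its parent B, the node C is independent of its non-descendant A.

```python
from itertools import product

# Hypothetical chain network A -> B -> C (all CPT values are invented).
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # keyed (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # keyed (c, b)

def joint(a, b, c):
    # Markov factorization: P(A, B, C) = P(A) P(B | A) P(C | B)
    return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

# Causal Markov Condition for C: given Parents(C) = {B}, C is independent
# of its non-descendant A, i.e. P(C | A, B) = P(C | B) for every assignment.
for a, b, c in product([0, 1], repeat=3):
    p_ab = sum(joint(a, b, cc) for cc in [0, 1])
    p_c_given_ab = joint(a, b, c) / p_ab
    assert abs(p_c_given_ab - p_c_given_b[(c, b)]) < 1e-12
```

Because the joint distribution is generated by the graph's factorization, the equality holds exactly; a distribution not generated this way would generally fail the check.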

Motivation

Statisticians are greatly interested in the ways in which certain events and variables are connected. A precise notion of what constitutes a cause and effect is necessary to understand these connections. The central idea behind the philosophical study of causation is that causes raise the probabilities of their effects, all else being equal.

A deterministic interpretation of causation means that if A causes B, then A must always be followed by B. In this sense, smoking does not cause cancer because some smokers never develop cancer.

On the other hand, a probabilistic interpretation simply means that causes raise the probability of their effects. In this sense, changes in meteorological readings associated with a storm do cause that storm, since they raise its probability. (Simply looking at a barometer, however, does not change the probability of the storm; for a more detailed analysis, see [3].)

The looseness of the definition of probabilistic causation raises the question of whether events which are traditionally classified as effects (e.g. a wet piece of paper after water is spilled on it) can actually make a difference to the probability of their causes. In a world without CM, the wetness of a piece of paper changes the probability that a glass of water was spilled on it. In a world with CM, only events which are parents of an event change its probability (e.g. gravity, a hand passing by the water glass, the nearness of the paper).

Implications

Dependence and Causation

It follows from the definition that if X and Y are in V and are probabilistically dependent, then either X causes Y, Y causes X, or X and Y are both effects of some common cause Z in V.[1]
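The common-cause case can be illustrated with a small sketch. Below is a hypothetical network Z → X, Z → Y with invented probability tables (names and numbers are assumptions for illustration): X and Y are marginally dependent, yet independent once their common cause Z is conditioned on.

```python
from itertools import product

# Hypothetical common-cause network Z -> X, Z -> Y (all CPT values invented).
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}  # keyed (x, z)
p_y_given_z = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}  # keyed (y, z)

def joint(z, x, y):
    # P(Z, X, Y) = P(Z) P(X | Z) P(Y | Z)
    return p_z[z] * p_x_given_z[(x, z)] * p_y_given_z[(y, z)]

def marg(**fixed):
    # Sum the joint over every variable not pinned in `fixed`.
    total = 0.0
    for z, x, y in product([0, 1], repeat=3):
        v = {"z": z, "x": x, "y": y}
        if all(v[k] == val for k, val in fixed.items()):
            total += joint(z, x, y)
    return total

# X and Y are dependent: P(X=1, Y=1) != P(X=1) P(Y=1) ...
assert abs(marg(x=1, y=1) - marg(x=1) * marg(y=1)) > 1e-3

# ... but conditionally independent given their common cause Z.
for z in [0, 1]:
    lhs = marg(z=z, x=1, y=1) / marg(z=z)
    rhs = (marg(z=z, x=1) / marg(z=z)) * (marg(z=z, y=1) / marg(z=z))
    assert abs(lhs - rhs) < 1e-12
```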

Screening

It once again follows from the definition that the parents of X screen X from other "indirect causes" of X (parents of Parents(X)) and other effects of Parents(X) which are not also effects of X.[1]
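Screening can likewise be demonstrated on a slightly larger sketch. The graph below, A → B → C together with A → D, uses invented probability tables (the structure and numbers are illustrative assumptions): A is an indirect cause of C, and D is an effect of Parents(C)'s parent that is not an effect of C; conditioning on Parents(C) = {B} screens C from both.

```python
from itertools import product

# Hypothetical graph A -> B -> C and A -> D (all CPT values invented).
p_a = {0: 0.5, 1: 0.5}
p_b_given_a = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}  # keyed (b, a)
p_d_given_a = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.1, (1, 1): 0.9}  # keyed (d, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}  # keyed (c, b)

def joint(a, b, c, d):
    # P(A, B, C, D) = P(A) P(B | A) P(C | B) P(D | A)
    return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)] * p_d_given_a[(d, a)]

def marg(**fixed):
    # Sum the joint over every variable not pinned in `fixed`.
    total = 0.0
    for a, b, c, d in product([0, 1], repeat=4):
        v = {"a": a, "b": b, "c": c, "d": d}
        if all(v[k] == val for k, val in fixed.items()):
            total += joint(a, b, c, d)
    return total

# Parents(C) = {B} screens C from the indirect cause A and the sibling
# effect D: P(C=1 | A, B, D) = P(C=1 | B) for every assignment.
for b, a, d in product([0, 1], repeat=3):
    p_c1 = marg(b=b, a=a, d=d, c=1) / marg(b=b, a=a, d=d)
    assert abs(p_c1 - p_c_given_b[(1, b)]) < 1e-12
```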

Examples

In a simple view, releasing one's hand from a hammer causes the hammer to fall. However, doing so in outer space does not produce the same outcome, calling into question whether releasing one's grip on a hammer always causes it to fall.

A causal graph could be created to acknowledge that both the presence of gravity and the release of the hammer contribute to its falling. However, it would be very surprising if the surface underneath the hammer affected its falling. This essentially states the Causal Markov Condition: given the existence of gravity and the release of the hammer, it will fall regardless of what is beneath it.

Notes

  1. Hausman, D.M.; Woodward, J. (December 1999). "Independence, Invariance, and the Causal Markov Condition" (PDF). British Journal for the Philosophy of Science. 50 (4): 521–583. doi:10.1093/bjps/50.4.521.
  2. Spirtes, Peter; Glymour, Clark; Scheines, Richard (1993). Causation, Prediction, and Search. Lecture Notes in Statistics. 81. New York, NY: Springer New York. doi:10.1007/978-1-4612-2748-9. ISBN 9781461276500.
  3. Pearl, Judea (2009). Causality. Cambridge: Cambridge University Press. doi:10.1017/cbo9780511803161. ISBN 9780511803161.