## MARKOV CHAIN


A sequence of transitions from one state of a system to another that is neither purely random nor purely deterministic.

Another interesting definition by K. KRIPPENDORFF: "The behavior of an informationally closed and generative system that is specified by transition probabilities between the system's states" (1986, p. 47).

He adds: "The probabilities of a MARKOV chain are usually entered into a transition matrix indicating which state or symbol follows which other state or symbol. The order of a MARKOV chain corresponds to the number of states or symbols from which probabilities are defined to a successor. Ordinarily, MARKOV chains are state determined, or of the first order. Higher orders are history determined. An unequal distribution of transition probabilities is a mark of a MARKOV chain's redundancy, and a prerequisite of predictability" (Ibid.).
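Krippendorff's transition matrix can be sketched concretely. The following is a minimal illustration, not part of the original entry: a hypothetical two-state "weather" chain whose matrix rows hold the transition probabilities, sampled step by step so that each successor depends only on the current state (a first-order, state-determined chain).

```python
import random

# Hypothetical two-state chain; row i of P gives the probabilities of
# moving from state i to each possible successor state.
states = ["sunny", "rainy"]
P = [
    [0.8, 0.2],  # from "sunny": stay sunny 80%, turn rainy 20%
    [0.4, 0.6],  # from "rainy": clear up 40%, stay rainy 60%
]

def step(i, rng):
    """Pick the successor of state i according to row i of the matrix."""
    return rng.choices(range(len(states)), weights=P[i])[0]

def sample_chain(start, length, seed=0):
    """Generate a sequence in which each step depends only on the current state."""
    rng = random.Random(seed)
    i = states.index(start)
    seq = [start]
    for _ in range(length - 1):
        i = step(i, rng)
        seq.append(states[i])
    return seq

print(sample_chain("sunny", 10))
```

A higher-order (history-determined) chain would instead index the matrix by the last *k* states rather than by the current state alone.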

I. PRIGOGINE and I. STENGERS state the three general characteristics of Markov chains: "Non-repetitivity, existence of long range correlations and spatial symmetry breaks" (1992, p. 90).

Markov chains are "statistically reproductive" and correspond to deterministic chaos "intermediary between pure randomness and redundant order" (Ibid).
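The position of a Markov chain between "pure randomness and redundant order" can be made quantitative with Krippendorff's notion of redundancy. The sketch below is an added illustration (not from the original entry): it measures the mean Shannon entropy of the matrix rows against its maximum, so a uniform matrix scores 0 (purely random), a deterministic matrix scores 1 (redundant order), and a genuine Markov chain falls in between.

```python
from math import log2

def row_entropy(row):
    """Shannon entropy (in bits) of one row of a transition matrix."""
    return -sum(p * log2(p) for p in row if p > 0)

def redundancy(P):
    """Mean redundancy 1 - H/H_max of a transition matrix, where
    H_max = log2(number of states): 0 for purely random transitions,
    1 for fully deterministic ones."""
    n = len(P)
    mean_h = sum(row_entropy(row) for row in P) / n
    return 1 - mean_h / log2(n)

uniform       = [[0.5, 0.5], [0.5, 0.5]]   # pure randomness
markov        = [[0.8, 0.2], [0.4, 0.6]]   # intermediary case
deterministic = [[1.0, 0.0], [0.0, 1.0]]   # redundant order

print(redundancy(uniform))        # 0.0
print(redundancy(deterministic))  # 1.0
print(redundancy(markov))         # strictly between 0 and 1
```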

### Categories

- 1) General information
- 2) Methodology or model
- 3) Epistemology, ontology and semantics
- 4) Human sciences
- 5) Discipline oriented

### Publisher

Bertalanffy Center for the Study of Systems Science (2020).

To cite this page, please use the following information:

*Bertalanffy Center for the Study of Systems Science (2020).* Title of the entry. In Charles François (Ed.), *International Encyclopedia of Systems and Cybernetics* (2). Retrieved from www.systemspedia.org/[full/url]
