
use of probability theory. Weaver also hints at the concept later crystallized by Anderson (1972), saying that a group of scientists working together formed a “unit [that] was much greater than the mere sum of its parts” (1948, p. 542).
Shannon’s paper (1948) establishes definitions that have become classic in communication theory, among them: information source, transmitter, channel, receiver, and destination. Shannon describes the information source as a symbol generator in a stochastic process that can be successively refined so as to approximate natural language. As an example, he suggests using English words, each associated with a probability of occurrence. Such a stochastic process is now described as a discrete Markov process.
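As a minimal sketch of this kind of word-level approximation (the tiny corpus and function names below are illustrative assumptions, not taken from Shannon's paper), the following Python snippet builds a first-order Markov model from a sample text and samples a short sequence of words from it:

```python
import random
from collections import defaultdict

def build_markov_model(text):
    """Map each word to the list of words that follow it in the text (a first-order Markov chain)."""
    words = text.split()
    transitions = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        transitions[current].append(nxt)
    return transitions

def generate(transitions, start, length=10):
    """Sample a word sequence by repeatedly drawing a successor of the current word."""
    word, output = start, [start]
    for _ in range(length - 1):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = build_markov_model(corpus)
print(generate(model, "the"))
```

Because successors are stored with repetition, drawing one at random reproduces the empirical transition probabilities of the sample text.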
The efficiency of the system when coding a message to be sent through a channel is the ratio between the coding rate and the actual capacity of the channel. Shannon then defines distortion as what occurs when the received signal is a definite function of the signal sent, that is, when the difference between the sent and received signals is systematically the same. Distortion has the advantage that it can be corrected, since applying the inverse of that known function recovers the original signal.
Noise occurs when the differences between the sent and received signals are not always the same. In these cases, it is not possible to reconstruct the sent signal with precision. In a statistical sense, a system with noise contains two statistical processes in action: that of the source and that of the noise. Redundancy allows the probability of error to be kept small.
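As a minimal sketch of how redundancy keeps the error probability small (the flip probability, repetition factor, and sample size below are illustrative assumptions, not figures from Shannon's paper), the following Python simulation sends each bit three times over a binary symmetric channel and decodes by majority vote:

```python
import random

def transmit(bit, flip_prob):
    """Send one bit through a binary symmetric channel that flips it with probability flip_prob."""
    return bit ^ (random.random() < flip_prob)

def send_with_repetition(bit, flip_prob, copies=3):
    """Redundant transmission: repeat the bit and decode by majority vote."""
    received = [transmit(bit, flip_prob) for _ in range(copies)]
    return int(sum(received) > copies // 2)

if __name__ == "__main__":
    random.seed(0)
    flip_prob, trials = 0.1, 100_000
    raw_errors = sum(transmit(1, flip_prob) != 1 for _ in range(trials))
    coded_errors = sum(send_with_repetition(1, flip_prob) != 1 for _ in range(trials))
    print(f"error rate without redundancy: {raw_errors / trials:.4f}")    # about 0.10
    print(f"error rate with 3x repetition:  {coded_errors / trials:.4f}")  # about 0.028
```

The redundant code fails only when at least two of the three copies are flipped, so the residual error rate drops from roughly 0.10 to roughly 0.03 in this setting, at the cost of a lower coding rate.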
Within the context of information theory, entropy is the simultaneous measure of the uncertainty of a given variable and of the information content that is present, so that the more content there is in the message, the greater the uncertainty. Specifically, Shannon’s entropy, also known as statistical entropy, is a measure of the quantity of information present in a message, or, in other words, the mathematical expectation of the information carried by each possible outcome. Shannon’s entropy is therefore maximal when all outcomes are equally probable, and it decreases as some outcomes become more probable than others.
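As a worked sketch of this definition (not drawn from the article itself), Shannon’s entropy of a discrete distribution is H(X) = −Σᵢ pᵢ log₂ pᵢ, and the short Python function below computes it, illustrating that a uniform distribution maximizes entropy while a skewed one lowers it:

```python
import math

def shannon_entropy(probabilities):
    """H(X) = -sum(p * log2(p)) over outcomes with nonzero probability, measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform distribution over four outcomes: maximal uncertainty, 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# Skewed distribution: one outcome dominates, so uncertainty (and entropy) is lower.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36
```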
Kolmogorov (1965), in turn, defines complexity as the measure of the computational resources necessary to describe an object, in other words, the length of the smallest program capable of specifying a given object. Kolmogorov (1965) further proposes a new approach to describing information quantitatively, beyond the paradigms of combinatorial and probabilistic analysis: the algorithmic approach. According to him, the combinatorial approach calculates the possibilities of constructing ‘words’ given the ‘letters’ of an alphabet. Probabilistic analysis is possible, but it can lead to senseless results, such as negative values for entropy. Take the case of a literary work. Using Tolstoy’s ‘War and Peace’ as an example, Kolmogorov asks: “Is it reasonable to include this novel in the set of ‘possible novels’, or even to postulate some probability distribution for this set?” (Kolmogorov, 1965, p. 3).
Kolmogorov then defines the relative complexity of y given x, using his proposed algorithmic approach, as “… the minimal length l(p) of the ‘program’ p for obtaining y from x” (Kolmogorov, 1965, p. 5), that is, the length of the shortest program p satisfying

φ(p, x) = y
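Kolmogorov complexity is not computable in general, but a common practical stand-in (a substitute technique, not Kolmogorov’s own construction) is the length of a compressed description of the object. The sketch below, assuming Python’s standard zlib module, uses compressed length as a rough upper bound to contrast a highly regular string with a random-looking one:

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """Length of a zlib-compressed description: a crude upper bound on descriptive complexity."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 5000           # highly regular: "repeat 'ab' 5000 times" is a very short description
random_like = os.urandom(10000)  # random bytes: in general, no description much shorter than the data itself

print(compressed_length(regular))      # small (a few dozen bytes)
print(compressed_length(random_like))  # close to the original 10,000 bytes
```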
Finally, Gell-Mann and Lloyd (2004) offer a definition of the effective complexity of an entity as the “length of a highly compressed description of its regularities” (Gell-Mann & Lloyd, 2004, p. 387), in other words, the smallest complete description of the patterns of a given object or system.