
The Correlation Equation

Things may interact with other things. Such interactions correspond to correlations between the things. These correlations exist only if the things coexist during a finite interval of time, and things coexist at least at their mutual boundary. The basic equation can easily be squared, or more generally raised to an arbitrary power n; here the square is considered:

\[\sum_{i=0}^{N_\Phi}\Phi_i= 1 \rightarrow \left (\sum_{i=0}^{N_\Phi}\Phi_i \right )^2=1\]
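As a minimal worked illustration, assume three things (\(N_\Phi = 2\)) with the arbitrarily chosen, purely hypothetical values \(\Phi_0 = 0.5\), \(\Phi_1 = 0.3\), \(\Phi_2 = 0.2\); these numbers are not from the original text and serve only to make the algebra tangible:

\[(0.5+0.3+0.2)^2 = 1^2 = 1\]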

The squared sum can be rewritten as:

\[\left (\sum_{i=0}^{N_\Phi}\Phi_i \right )^2=\Phi_0^2+\Phi_1^2+...+\Phi_{N_\Phi}^2+\Phi_0\Phi_1+\Phi_0\Phi_2+...+\Phi_{N_\Phi}\Phi_{N_\Phi-1}\]

In short this reads:

\[\left (\sum_{i=0}^{N_\Phi}\Phi_i \right )^2= \sum_{i=0}^{N_\Phi}\Phi_i^2 +\sum_{i,j=0 \:i\neq j}^{N_\Phi}\Phi_i\Phi_j =1\]
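For the hypothetical three-thing example introduced above, this decomposition can be checked numerically: the squared terms contribute \(0.25+0.09+0.04=0.38\), and the mixed terms, each pair counted twice, contribute \(2\,(0.15+0.10+0.06)=0.62\):

\[0.38 + 0.62 = 1\]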

This procedure obviously introduces correlation terms of the type \(\Phi_i\Phi_j\) between different things, i.e. terms describing the coexistence and thus the interactions of pairs of things. Such terms are only present if the things coexist, i.e. if both things have non-zero values. These terms can be further elucidated as follows. This equation, in combination with the basic equation (repeated here for better readability):

\[\sum_{i=0}^{N_\Phi}\Phi_i=1\]

gives

\[-\sum_{i,j=0 \:i\neq j}^{N_\Phi}\Phi_i\Phi_j=\sum_{i=0}^{N_\Phi}\Phi_i^2 -1=\sum_{i=0}^{N_\Phi}\Phi_i^2 -\sum_{i=0}^{N_\Phi}\Phi_i=\sum_{i=0}^{N_\Phi}(\Phi_i^2-\Phi_i)\]
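Continuing the hypothetical example: the pair terms give \(-\sum_{i\neq j}\Phi_i\Phi_j = -0.62\), while the right-hand side evaluates to

\[(0.25-0.5)+(0.09-0.3)+(0.04-0.2)=-0.62\]

in agreement with the identity.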

The term in parentheses on the right-hand side of this equation, \(\Phi_i^2-\Phi_i\), interestingly corresponds to the lowest order of the Taylor expansion of an entropy-type logarithmic formulation [19]:

\[\Phi_i^2-\Phi_i\cong \Phi_i\ln\Phi_i \quad \text{for}\ \Phi_i\leq 1\]
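A one-line sketch of why this approximation holds (a standard Taylor argument, not spelled out in the original): expanding the logarithm around \(\Phi_i = 1\) gives \(\ln\Phi_i \approx \Phi_i - 1\) to lowest order, so

\[\Phi_i\ln\Phi_i \approx \Phi_i(\Phi_i-1)=\Phi_i^2-\Phi_i\]

Both sides also vanish exactly at \(\Phi_i = 1\) and in the limit \(\Phi_i \rightarrow 0\).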

In total, the sum of the correlations between all pairs of things can thus be expressed by functions each depending on a single thing only:

\[-\sum_{i,j=0 \:i\neq j}^{N_\Phi}\Phi_i\Phi_j \cong \sum_{i=0}^{N_\Phi}\Phi_i\ln\Phi_i\]

("correlation equation")

\[S =- \sum_{i=0}^{N_\Phi}\Phi_i\ln\Phi_i\]

("entropy equation")

Discussion


The term S is the well-known entropy formulation appearing in Gibbs's thermodynamics ([20], [21]), in the derivation of Boltzmann's statistics [22], and in Shannon's theory of communication [23]. This seems to be a very important intermediate result:

entropy-type terms are related to correlations of things

Because correlations exist at least at interfaces, entropy-type terms are related to interfaces. Correlations further correspond to interactions, and interactions between things eventually correspond to forces. Forces are often described as gradients of scalar fields, and such gradients exist at boundaries between things. Correlations, in turn, have been shown above to be related to entropy-type terms. The terms correlation, interaction, force, gradient, entropy, and boundary therefore seem to be strongly interrelated and may even be based on a common principle.