Golden mean, statistics and probability

It has been said for some time that in today’s science «correlation supersedes causation», and by correlation we obviously mean statistical correlation. But ever since Newton, physics has not been much concerned with causation, nor could it be; so rather than a radical change, what we have is a steady increase in the complexity of the variables involved.

In the handling of statistical distributions and frequencies it makes little sense to talk about false or correct theories, but rather about models that fit the data better or worse, which gives this area much more freedom and flexibility with respect to assumptions. Physical theories may be unnecessarily restrictive, and conversely no statistical interpretation is truly compelling; but on the other hand, fundamental physics is increasingly saturated with probabilistic aspects, so the interaction between both disciplines continues to tighten.

Things become even more interesting if we introduce the possibility that the principle of maximum entropy production is present in the fundamental equations of both classical and quantum mechanics, to say nothing of what would happen if basic relations between this principle and the continuous proportion φ were eventually discovered.

Possibly the reflected wave/retarded potential model we have outlined for the circulatory system gives us a good idea of a virtuous correlation/causation circle that meets the demands of mechanics but suspends, if not reverses, the direction of the cause-effect sequence. In the absence of a specific study in this area, we will content ourselves here with mentioning some more circumstantial associations between our constant and probability distributions.

The first association of φ with probability, combinatorics, and the binomial and hypergeometric distributions is already suggested by the presence of the Fibonacci series in the polar triangle mentioned earlier.
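
If the triangle in question is the arithmetic (Pascal) triangle that underlies the binomial coefficients, the connection can be made explicit: the sums of its shallow diagonals reproduce the Fibonacci series. A minimal sketch (the function name is ours):

```python
from math import comb

def fibonacci_from_diagonals(n):
    """Sum of the n-th shallow diagonal of the arithmetic triangle:
    sum_k C(n-k, k) = F(n+1), linking binomial coefficients to Fibonacci."""
    return sum(comb(n - k, k) for k in range(n // 2 + 1))

print([fibonacci_from_diagonals(n) for n in range(10)])
# [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```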

When we speak of probability in nature or in the social sciences, two distributions come first to mind: the almost ubiquitous bell-shaped normal or Gaussian distribution, and the power-law distributions, also known as Zipf or Pareto distributions, or zeta distributions in the discrete case.

Richard Merrick has spoken of a «harmonic interference function» resulting from harmonic damping: in other words, the squares of the first twelve frequencies of the harmonic series divided by the first twelve Fibonacci numbers. According to the author, this expresses a balance between spatial resonance and temporal damping.
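
Taken literally, the description admits a simple numerical sketch; the formula H(n) = n²/F(n) is our reading of the text, not Merrick’s own notation:

```python
# Fibonacci numbers F(1)..F(12) and the harmonic-series frequencies 1..12
fib = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

# Hypothetical reading of Merrick's function: resonance n^2 over damping F(n)
interference = [n**2 / fib[n - 1] for n in range(1, 13)]
for n, v in zip(range(1, 13), interference):
    print(f"harmonic {n:2d}: {v:.3f}")
# The values rise to a maximum around the 4th-5th harmonic and then decay.
```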

In this way he arrives at what he calls a «symmetrical model of reflexive interference», formed from the harmonic mean between a circle and a spiral. Merrick insists on the transcendental importance for all life of organization around an axis, which Vladimir Vernadsky had already considered the key problem of biology.

Richard Merrick, Harmonically guided evolution

Merrick’s ideas about thresholds of maximum resonance and maximum damping can be set alongside Pinheiro’s thermomechanical equations, and, as we have indicated, they would have a wider scope if they contemplated the principle of maximum entropy as conducive to organization rather than the opposite. Merrick also elaborates a sort of musical theory of the privileged proportion 5/6-10/12 at different levels, from the organization of the human torso to the arrangement of the double helix of DNA, seen as the rotation of a dodecahedron around a bipolar axis.

*

Power laws and zeta distributions are equally important in nature and in human affairs, and are present, among many other things, in fundamental laws of physics, the distribution of wealth among populations, the size of cities, and the frequency of earthquakes. Ferrer i Cancho and Fernández note that «φ is the value where the exponents of the probability distribution of a discrete magnitude and the value of the magnitude versus its rank coincide». It is not known at present whether this is a mere curiosity or whether it will allow us to deepen our knowledge of these distributions [37].
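
The observation can be unpacked with the standard link between the two exponents (this rendering is ours, not the authors’ notation): if a magnitude decays with rank as r^(−α), its probability distribution decays as x^(−β) with β = 1 + 1/α, and demanding that the two exponents coincide singles out the golden mean:

```latex
x(r) \propto r^{-\alpha}
\;\Longrightarrow\;
p(x) \propto x^{-\beta}, \quad \beta = 1 + \frac{1}{\alpha};
\qquad
\beta = \alpha
\;\Longrightarrow\;
\alpha^{2} - \alpha - 1 = 0
\;\Longrightarrow\;
\alpha = \frac{1+\sqrt{5}}{2} = \varphi .
```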

Zipf or zeta distributions are linked to hierarchical structures and catastrophic events, and also overlap with fractals in the space domain and with so-called 1/f noise in the time domain. A. Z. Mekjian makes a broader study of the application of Fibonacci-Lucas numbers to statistics, including hyperbolic power laws [38].

I. Tanackov et al. show the close relationship of the elementary exponential distribution with the value 2 ln φ, which leads them to think that the emergence of the continuous proportion in nature could be linked to a special case of Markov processes, a non-reversible case, we would venture. It is well known that exponential distributions have maximum entropy. An incomparably faster convergence to the number e can be obtained with Lucas numbers, a generalization of the Fibonacci numbers, than with Bernoulli’s original expression, which is food for thought in itself; we can also obtain a faster convergence with non-reversible walks than with the usual random walk [38].
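
The exact construction of Tanackov et al. is not reproduced here; one plausible reading of the convergence claim is simply to index Bernoulli’s expression (1 + 1/n)^n by Lucas numbers, whose geometric growth (L(n) ≈ φ^n) makes the error shrink exponentially with the index rather than like 1/n. A minimal sketch under that assumption:

```python
from math import e

def lucas(n):
    """Lucas numbers: L(0)=2, L(1)=1, L(n)=L(n-1)+L(n-2)."""
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Bernoulli's expression sampled at n = 10 versus at n = L(10) = 123
print(f"e          = {e:.6f}")
print(f"n = 10     : {(1 + 1/10)**10:.6f}")       # still off in the 2nd decimal
print(f"n = L(10)  : {(1 + 1/lucas(10))**lucas(10):.6f}")  # error ~ e/(2*123)
```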

Edward Soroko proposed a law of structural harmony for the stability of self-organized systems, based on the continuous proportion and its series, considering entropy from the point of view of thermodynamic equilibrium [39]. Although here we give preference to entropy in systems far from equilibrium, his work is of great interest and can be a source of new ideas.

It would be desirable to further clarify the relationship of power laws to entropy. The use of the principle of maximum entropy seems particularly suitable for open systems out of equilibrium and with strong self-interaction. Researchers such as Matt Visser think that Jaynes’ principle of maximum entropy allows a very direct and natural interpretation of power laws [40].

Normally one looks for continuous power laws or discrete power laws, but in nature we can find a middle ground between the two, as Mitchell Newberry observes with regard to the circulatory system. As usual in such cases, reverse engineering of the natural model is required. The continuous proportion and its series offer an optimal recursive procedure for passing from continuous to discrete scales, and its appearance in this context could be natural [41].
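
As a toy illustration of such a middle ground (our own construction, not Newberry’s model), one can place a continuous power-law envelope on a discrete, self-similar hierarchy of scales; choosing φ as the scale ratio makes the hierarchy Fibonacci-like:

```python
import numpy as np

# Discretely self-similar power law: sizes supported only on geometric
# scales x_k = phi^k, with weights following the envelope x^(-alpha).
phi = (1 + 5**0.5) / 2
alpha = 2.0
k = np.arange(12)
sizes = phi**k
weights = sizes**(-alpha)
weights /= weights.sum()

# In log space the support is evenly spaced with period ln(phi): a
# continuous power-law envelope over a discrete hierarchy of scales.
print(np.diff(np.log(sizes)))   # constant spacing ln(phi) ~ 0.4812
```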

The logarithmic average seems to be the most important component of these power laws, and we immediately associate the base of the natural logarithms, the number e, with the exponential growth in which a certain variable increases without restriction, something that in nature is viable only for very short periods of time. The golden mean, on the other hand, seems to arise in a context of critical equilibrium between at least two variables. But this would lead us rather to logistic or S-curves, whose shape recalls that of the cumulative normal distribution and which are an exact shifted and rescaled form of the hyperbolic tangent. Then again, exponential and power-law distributions look very different but can sometimes be directly connected, which is a subject in its own right.
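
The relation invoked here between the logistic curve and the hyperbolic tangent is in fact exact:

```latex
\sigma(x) \;=\; \frac{1}{1+e^{-x}} \;=\; \frac{1}{2}\left(1 + \tanh\frac{x}{2}\right),
```

so every logistic S-curve is a hyperbolic tangent shifted up and rescaled to run between 0 and 1.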

As already noted, we can also connect the constants e and Φ through the complex plane, as in the equality Φᵢ = e^(±πi/3). Although entropy has always been measured with algebras of real numbers, G. Rotundo and M. Ausloos have shown that here too the use of complex values can be justified, allowing one to treat not only a «basic» free energy but also «corrections due to some underlying scale structure» [42]. The use of asymmetric correlation matrices could also be linked with the golden matrices generalized by Stakhov and applied to genetic code information by Sergey Petoukhov [43].
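
The equality can be checked directly: e^(±iπ/3) are the two complex numbers satisfying the sign-reversed counterpart of the golden mean’s defining equation:

```latex
e^{\pm i\pi/3} = \tfrac{1}{2} \pm \tfrac{\sqrt{3}}{2}\, i,
\qquad
\left(e^{\pm i\pi/3}\right)^{2} = e^{\pm 2i\pi/3} = e^{\pm i\pi/3} - 1,
```

so z = e^(±iπ/3) solves z² − z + 1 = 0, just as φ solves z² − z − 1 = 0.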

In the mechanical-statistical context, maximum entropy is only an extreme referred to the thermodynamic limit and to Poincaré’s immeasurably long recurrence times; but in many relevant cases in nature, and evidently in the thermomechanical context, it is necessary to consider a non-maximum equilibrium entropy, which may be defined by the coarse grain of the system. Pérez-Cárdenas et al. exhibit a non-maximum coarse-grained entropy linked to a power law, the entropy being lower the finer the grain of the system [44]. This graininess can be linked to the constants of proportionality in the equations of mechanics, such as Planck’s constant itself.
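
As a toy illustration of the last point (ours, not the construction of Pérez-Cárdenas et al.), one can estimate the entropy of one and the same distribution through cells of decreasing width and watch it fall toward the fine-grained value:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # any fixed underlying distribution

def coarse_grained_entropy(samples, width):
    """Differential entropy of the histogram smeared over cells of a given
    width: -sum P_i ln(P_i / width). Coarse graining can only raise it."""
    bins = np.arange(samples.min(), samples.max() + width, width)
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / len(samples)
    return -(p * np.log(p / width)).sum()

for w in (2.0, 1.0, 0.5, 0.25, 0.1):
    print(f"cell width {w:4.2f}: entropy {coarse_grained_entropy(x, w):.4f}")
# The value decreases toward the fine-grained limit as the grain gets finer.
```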

*

Probability is a predictive concept and statistics a descriptive, interpretative one, and both should be balanced if we do not want human beings to be increasingly governed by concepts they do not understand at all.

Just to give an example, the mathematical apparatus known as the renormalization group applied to particle and statistical physics is particularly relevant in deep learning, to the point that some experts claim both are the same thing. But it goes without saying that this group historically emerged to deal with the effects of the Lagrangian self-interaction in the electromagnetic field, a central theme of this article.

For prediction, the effects of self-interaction are mostly «pathological», since they complicate calculations and often lead to infinities; although, in fact, the blame for this lies with special relativity’s inability to work with extended particles rather than with self-interaction itself. But for description and interpretation the problem is the opposite: it is a matter of recovering the continuity of a natural feedback broken by layer upon layer of mathematical tricks. The conclusion could not be clearer: the search for predictions, and the «artificial intelligence» so conceived, has grown exponentially at the expense of ignoring natural intelligence, the intrinsic capacity for self-compensation in nature.

If we want somehow to reverse the fact that man is increasingly governed by numbers he does not understand (even the experts are bound to trust programs whose outputs lie far beyond their understanding), it is necessary to work at least as hard in a regressive or retrodictive direction. If the gods destroy men by making them blind, they make them blind by means of predictions.

*

As Merrick points out, for the current theory of evolution, if life were to disappear from this planet or had to start all over again, the long-term results would be completely different, and if a rational species were to emerge it would be totally different from our own. That is what random evolution means. In a harmonically guided evolution conditioned by resonance and interference, as Merrick suggests, the results would be broadly the same, except for the uncertain incidence that the great cosmic cycles beyond our reach might have.

There is no such thing as pure chance; nothing is purely random. No matter how little organized an entity may be, be it a particle or an atom, it cannot fail to filter the «random» environmental influences according to its own intrinsic structure. And the first sign of organization is the appearance of an axis of symmetry, which in particles is defined by axes of rotation.

The dominant theory of evolution, like cosmology, has emerged to fill the great gap between abstract, reversible, and therefore timeless physical laws and the ordinary world, with its irreversible time, perceptible forms, and sequences of events. The whole of today’s cosmology rests on an unnecessary and contradictory assumption, the principle of inertia. The biological theory of evolution rests on a false one: that life is governed by chance alone.

The present «synthetic theory» of evolution has only come into existence because of the separation of disciplines, and more specifically because of the segregation of thermodynamics from fundamental physics, despite the fact that there is nothing more fundamental than the Second Law. It is not by chance that thermodynamics emerged simultaneously with the theory of evolution: the former begins with Mayer, who started from considerations of work and physiology, and the latter with Wallace and Darwin, who started, according to the candid admission of the latter in the first pages of his main work, from Malthus’ assumptions about resources and competition, which in turn go back to Hobbes. One is a theory of work, the other a theory of the global ecosystem understood as a capital market. The accumulated capital in this ecosystem is, of course, biological inheritance.

Merrick’s harmonic evolution, due to the collective interference of wave-particles, is an updating of an idea as old as music; and it is also a timeless, purpose-free vision of the events of the world. But to reach the desired depth in time, it must be linked to the two other clearly teleological yet spontaneous domains, mechanics and thermodynamics, which together we call thermomechanics for short.

It would be enough to unite these three elements for the present theory of evolution to begin to become irrelevant, to say nothing of the fact that human and technological evolution is decidedly Lamarckian beyond all speculation. Even DNA molecules are organized in the most obvious way along an axis. And as for information theory, one only has to remember that it arose from a peculiar interpretation of thermodynamics, and that it is impossible to perform automatic computations without components with a turning axis. Whatever the degree of chance, the Pole rules and defines its sense and meaning.

However, in order to better understand the action of the Pole and the spontaneous reaction involved in mechanics, it would be good to rediscover the meaning of polarity.
