Questions of interpretation – and of principle

According to Proclus, when Euclid wrote the Elements his primary objective was to elaborate a complete geometrical theory of the Platonic solids. Indeed, it has been said on several occasions that behind the name «Euclid» there could be a collective with a strong Pythagorean component. The existence of only five regular solids is possibly the best argument for thinking that we live in a three-dimensional world.
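The count itself is elementary: a convex vertex where q regular p-gons meet needs the interior angles to total less than 360°, which reduces to (p − 2)(q − 2) < 4. A minimal sketch of the enumeration (pure Python; the upper bound of 10 is just a safe cutoff, since larger p or q cannot satisfy the inequality):

```python
# Schläfli symbols {p, q}: q regular p-gons meeting at each vertex.
# A convex corner requires q * (1 - 2/p) * 180 < 360 degrees,
# which simplifies to (p - 2) * (q - 2) < 4.
solids = [(p, q) for p in range(3, 10) for q in range(3, 10)
          if (p - 2) * (q - 2) < 4]
print(solids)   # [(3, 3), (3, 4), (3, 5), (4, 3), (5, 3)] -- exactly five
```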

[Figure: Dodecahedron]

Another great champion of the concept of harmony in science was Kepler, the astronomer who introduced the ellipse into the history of physics; we need only recall the title of his magnum opus, Harmonices Mundi. He discovered that the ratios of consecutive Fibonacci numbers converge to the golden ratio, combined Pythagoras' theorem with the golden ratio in the triangle that bears his name, and theorized extensively about the Platonic solids, which he even placed between the orbits of the planets, and which a certain tradition has regarded as the arcana of the four elements plus the quintessence or Ether.

[Figure: Icosahedron]

The latter would be represented by the pentagon and the pentagram, and by the dodecahedron, the dual of the icosahedron, which for its part corresponds to the element water.

a/b = b/c = c/d = (1+√5)/2

The cascade of identical proportions at different scales immediately evokes the capacity to generate self-similar forms, like the logarithmic spiral that displays its proportion indefinitely in a series of powers: φ, φ², φ³, φ⁴… If recursion is that which allows something to be defined in its own terms, the pentagram shows us the simplest recursive process in the plane.
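Both properties are easy to check numerically; a minimal sketch, assuming nothing beyond the definitions just given:

```python
from math import sqrt

phi = (1 + sqrt(5)) / 2

# Self-similarity of the cascade: each power of phi is the sum of the
# two preceding ones, phi**n = phi**(n-1) + phi**(n-2).
for n in range(2, 6):
    print(round(phi**n, 10), round(phi**(n - 1) + phi**(n - 2), 10))

# Kepler's observation: ratios of consecutive Fibonacci numbers tend to phi.
a, b = 1, 1
for _ in range(20):
    a, b = b, a + b
print(b / a, phi)   # both print as 1.61803398...
```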

As late as 1884, Felix Klein, the reconciler of analytic and synthetic geometry, gave a series of lectures on the icosahedron in which he considered it the central object of the main branches of mathematics: «Every single geometrical object is connected in one way or another to the properties of the regular icosahedron». This is another of those «surprisingly well connected» objects, and it would be interesting to trace its links with the two representations of the Pole from which we started. The Rogers-Ramanujan continued fraction, for example, plays a role for the icosahedron analogous to that of the exponential function for a regular polygon.
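As a numerical aside, Ramanujan's celebrated evaluation R(e^(−2π)) = √(φ√5) − φ ties this continued fraction directly to the golden ratio. A minimal sketch that truncates the fraction at a finite depth (the function name is mine):

```python
from math import exp, pi, sqrt

def rogers_ramanujan(q: float, depth: int = 60) -> float:
    """Evaluate R(q) = q^(1/5) / (1 + q/(1 + q^2/(1 + q^3/...)))
    by truncating the continued fraction after `depth` levels."""
    tail = 1.0
    for n in range(depth, 0, -1):
        tail = 1.0 + q**n / tail
    return q**0.2 / tail

phi = (1 + sqrt(5)) / 2
q = exp(-2 * pi)
print(rogers_ramanujan(q))         # 0.284079...
print(sqrt(phi * sqrt(5)) - phi)   # 0.284079...  (Ramanujan's closed form)
```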

The orientation proposed by the influential Klein did not receive as wide a reception as his famous Erlangen program; to deepen it properly required the dynamic guidance of nature, of applied physics and mathematics. The present time is actually more propitious for recovering this program, even though physics itself has become more abstract by leaps and bounds. The best way to revive Klein's second major program today might be through a very down-to-earth version of geometric algebra.

The golden section emerges under the unequivocal seal of fivefold symmetries, which were believed to be exclusive to living beings until the discovery of quasicrystals. That ordinary crystals, static structures by definition, exclude this type of symmetry seems to indicate long-range conditions of equilibrium. The optical properties of photonic quasicrystals and other less periodic structures, often with a zero or double-zero refractive index in which the permittivity and/or the permeability vanish, are being studied intensively in 1D, 2D, and 3D. As expected, phases with spiral holonomy have been found. As in graphene, although the Berry curvature has to be zero, the phase shift can be zero or π [26].
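A minimal sketch of that quantization, using the standard SSH two-band chain as a stand-in for such lattices (my choice of illustration, not a model from the cited study): the Berry curvature vanishes identically, yet the phase accumulated around the Brillouin zone, computed gauge-invariantly as a Wilson loop, is pinned to 0 or π.

```python
import numpy as np

def zak_phase(v: float, w: float, n_k: int = 400) -> float:
    """Berry (Zak) phase of the lower band of the SSH model
    H(k) = [[0, v + w*exp(-ik)], [v + w*exp(ik), 0]]."""
    ks = np.linspace(0, 2 * np.pi, n_k, endpoint=False)
    states = []
    for k in ks:
        h = np.array([[0, v + w * np.exp(-1j * k)],
                      [v + w * np.exp(1j * k), 0]])
        _, vecs = np.linalg.eigh(h)
        states.append(vecs[:, 0])            # lower band
    states.append(states[0])                 # close the loop
    prod = 1.0 + 0j
    for a, b in zip(states[:-1], states[1:]):
        prod *= np.vdot(a, b)                # gauge-invariant Wilson loop
    return -np.angle(prod)

print(abs(zak_phase(1.0, 0.5)))   # ~0    (trivial dimerization)
print(abs(zak_phase(0.5, 1.0)))   # ~pi   (nontrivial dimerization)
```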

It would be interesting to study all these new states from the point of view of thermomechanics, quantum thermodynamics and retarded potentials; the latter can offer a more convincing interpretation of, for example, the so-called relativistic effects in graphene, just as they allow singular points to be dealt with in a more logical way. On the border between periodic and random structures, the secrets of fivefold symmetry will not be unlocked without a careful, unbiased interpretation.

*

Except for confusing minds and things, the proscription of the Ether by special relativity is anything but relevant. First, because what makes special relativity work is the Lorentz-Poincaré transformation, conceived expressly for the Ether. Second, because although special relativity, which is the general theory, dispenses with the Ether, general relativity, which is the special theory for gravity, demands it, even if only in a rarefied theoretical way.

Weber’s electrodynamics coincides closely with the Lorentz factor up to speeds of 0.85 c without having to dispense with the third principle of mechanics. But in any case, even if we stick to Maxwell’s equations, which is the very crux of the matter, what we have is that there are not, and cannot be, electromagnetic waves moving through space with one component orthogonal to the other, but only a statistical average of what occurs between space and matter. This is Mazilu’s conclusion, but one need only acknowledge the complete failure of every attempt to specify the geometric description of the field and its waves [27].
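For reference, the expression at stake is Weber's force, which depends only on the separation r between the two charges and its time derivatives. A minimal sketch in the form given by Assis (units with the Coulomb constant set to 1; the function name is mine):

```python
def weber_force(q1, q2, r, r_dot, r_ddot, c=1.0):
    """Weber's force along the line joining two charges:
    F = (q1*q2 / r**2) * (1 - r_dot**2/(2*c**2) + r*r_ddot/c**2),
    where r_dot and r_ddot are the first and second time derivatives
    of the separation r.  Positive F means repulsion."""
    return (q1 * q2 / r**2) * (1 - r_dot**2 / (2 * c**2) + r * r_ddot / c**2)

# Static limit: with r_dot = r_ddot = 0 it reduces to Coulomb's law.
print(weber_force(1, 1, 2.0, 0.0, 0.0))   # 0.25 = 1/r**2
```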

The strange thing, once again, is that this was not seen clearly before. But it turns out that the idea of the Ether around 1900, at least Larmor’s idea, among others, was not only that of a medium between the particles of matter, but of something that also penetrated those particles, which were in fact seen as condensations of the Ether, just as we can now see particles as condensations of the field. There was Ether outside in space and Ether inside matter, as in electromagnetic waves, among which is light.

The Ether is nothing but light itself, though it can also be something other than the light we see, or than the whole electromagnetic spectrum. It is just that we cannot know anything without the help of light. Light is the mediator between a space that we cannot know directly, but which gives us the metric, and a matter that is in the same situation, but which is the subject of measurement.

And now that we know that we are in the midst of the Ether, like the bourgeois gentleman who discovered that he had always been speaking prose without knowing it, perhaps we can look at things more calmly. Only a consummately dualistic mentality that thinks in terms of «this or that» has been able to remain perplexed for so long on this question.

This idea of the Ether in medias res could not have been very clear at the beginning of the 20th century; otherwise physicists would not have opened their arms to relativity as they did. If it was welcomed, leaving other reasons aside, it was because it seemed to put an end once and for all to an endless series of doubts and contradictions; or so it was thought at the time, until insoluble «paradoxes» began to emerge one after another. One might think that by then Weber’s law, which did not even need the existence of a medium because it did not consider waves either, had largely fallen into oblivion, though certainly not for researchers as exhaustive as Poincaré.

Of course, instead of Ether we can also use the word «field» now, as long as we do not understand it as the supplement of space that surrounds the particles, but as the fundamental entity from which they emerge.

In any case, special relativity proper practically does not come into contact with matter, and when it does through quantum electrodynamics, since Dirac, we are confronted with vacuum polarization and an even more crowded, strange and contradictory medium than in any of the previous avatars of the Ether.

Today, transformation optics and the anisotropies of metamaterials are used to «illustrate» black holes or to «design», it is said, space-times in different flavors. And yet only the macroscopic parameters of Maxwell’s old equations, such as permeability and permittivity, are being manipulated. And why should empty space have properties if it is really empty? But, again, what this is all about is statistical averages between space and matter. Only the prejudice created by relativity prevents us from seeing these things better.
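What is being manipulated is easy to state: under a coordinate map with Jacobian L, the material tensors transform as eps' = L·eps·Lᵀ/det(L), and likewise for the permeability. A minimal sketch (the stretch below is an arbitrary illustration, not any published device):

```python
import numpy as np

def transform_medium(eps: np.ndarray, L: np.ndarray) -> np.ndarray:
    """Transformation-optics rule: eps' = L @ eps @ L.T / det(L)."""
    return L @ eps @ L.T / np.linalg.det(L)

eps_vacuum = np.eye(3)                  # isotropic "empty" space
L = np.diag([2.0, 1.0, 1.0])            # stretch the x axis by a factor 2
print(transform_medium(eps_vacuum, L))  # anisotropic: diag(2.0, 0.5, 0.5)
```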

It would be much more interesting to study the properties of the space-matter continuum accessible to our direct modulation than the exotic aspects of hypothetical objects of a theory, general relativity, that is not even unified with classical electromagnetism.

And if the Ether of 1900 could prove inconvenient, what can we say about a theory that breaks with the continuity of the equations of classical mechanics and that, to mitigate this, introduces infinitely many frames of reference? It is certainly not an economical solution, and it looks even worse if we consider that with the principle of dynamic equilibrium we can dispense with inertia and with the distinction between frames of reference, the very criterion routinely used to rule out other theories and clear the field.

On top of this, Maxwell’s equations are valid only for extended portions of the field; special relativity is valid only for point-events; and in the field equations of general relativity point particles again become meaningless. Transformation optics takes advantage of this threefold incompatibility with a bypass that leaves special relativity in limbo in order to link Maxwell with another incompatible theory. And yet, although this is not said, it is because of special relativity that quantum mechanics has been unable to work with extended particles. By contrast, starting from Weber’s mechanics there is no problem in working with both extended and point particles.

For those who still believe that the fundamental framework of classical mechanics must have four dimensions, it can be recalled that, well into the 21st century, consistent gauge theories of gravity have been developed that satisfy the criterion formulated by Poincaré in 1902: to elaborate a relativistic theory in ordinary Euclidean space by modifying the laws of optics, instead of curving space with respect to the geodesic lines described by light. Light is the mediator between space and matter; and if light can be deformed, which is obvious, there is no need to deform anything else.

Maxwell’s equations are not even a general case, but a particular case, both of Weber’s equations and of Euler’s equations of fluid mechanics. Within fluid mechanics, Maxwell sought the case of a static or motionless medium; if Maxwell’s equations are not fundamental, the principle of relativity cannot be fundamental either [28]. The reciprocity of special relativity is purely abstract and kinematic, not mechanical, since it is not bound to centers of mass and does not allow a distinction between internal forces, which comply with the third law, and external forces, which need not comply with it. The principle of relativity, which asserts the impossibility of finding a privileged frame of reference, is valid if and only if there are no forces external to those considered within the system; but, by neglecting the third principle, the internal forces are not mechanically defined either.

The so-called Poincaré stress, which the French physicist introduced so that the Lorentz force would comply with the third principle, plays the same role in the relativistic context as Noskov’s longitudinal vibrations do for the Weber force. That this stress was later considered irrelevant for special relativity shows conclusively the theory’s total divorce from mechanics.

Maxwell’s equations, as Mazilu says, are a reaction to the partially or totally uncontrollable aspects of the Ether. In physical theories the quantities that matter are not those we can measure, but those we can control; but in this way we dispense with information that could be integrated into a broader theoretical framework.

Questions of interpretation inevitably bring us back to questions of principle; without changing the principles we are condemned to work for them.

The principle of relativity is contingent and therefore unnecessarily restrictive, depending moreover on arbitrary synchronization procedures. Nor does the equivalence principle of general relativity put an end to the problems of reference frames; in combination with the principle of relativity, it rather multiplies them.

The principle of dynamic equilibrium of relational mechanics radically simplifies this situation without creating unnecessary restrictions. Leaving generality aside, a principle should not be restrictive, but, first of all, necessary. On the other hand, the inability of the equivalence principle to get rid of the principle of inertia automatically subordinates it to the latter.

And it is no coincidence. If we know as little about inertia as we know about the Ether, it is rather because inertia itself inadvertently overlaps with the idea of the Ether and supplants it. The Ether, then, could only emerge without mystifications from a physics that completely dispenses with the idea of inertia, something that is perfectly feasible and compatible with all our experience.

No doubt each theory has its own virtues, but Maxwell’s theory and relativity have already been more than sufficiently extolled. Here we prefer to look at the theory that has historical precedence over both, since the presentation it receives today could not be more biased.

Returning to the past, we see that Lorentz’s non-dragging medium, Fresnel and Fizeau’s partially dragging medium, and Stokes’ totally dragging medium are not contradictory, but refer to clearly different cases. There are experiments, such as those of Miller, Hoek, Trouton and Noble, and many others, which could be carried out again under much better conditions and would provide invaluable information from many points of view, provided our theoretical framework allowed us to contemplate them, which is not now the case [29]. Moreover, these experiments are thousands of times less expensive, simpler and more informative than the current «confirmations» of special and general relativity.

There is also an inevitable complementarity between the constitutive aspects of electromagnetism in modern metamaterials, with their mixture of controllable and uncontrollable parameters in matter, and the measurement of uncontrollable parameters in a free environment out in space. But this complementarity cannot be appreciated without principles, and a framework that makes them compatible, to begin with. On the other hand, it goes without saying that between transformation optics and general relativity there is more of a flimsy parallelism than any real contact.

Another way of speaking of the geometric phase is as a transformation, or holonomy, around a singularity. This singularity can be a vortex, which provides a natural connection with the entropy or attenuation of certain magnitudes, which obviously cannot reach infinite values.
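A minimal numerical sketch of such a holonomy (the field and names are illustrative): sample a complex field with a vortex at the origin along any closed loop around it, accumulate the phase jumps, and the total returns an integer multiple of 2π, the vortex charge.

```python
import numpy as np

def winding_number(field, loop_points):
    """Accumulated phase of `field` along a closed loop, in units of 2*pi."""
    phases = np.angle([field(x, y) for x, y in loop_points])
    jumps = np.diff(np.concatenate([phases, phases[:1]]))
    jumps = (jumps + np.pi) % (2 * np.pi) - np.pi   # wrap each jump to [-pi, pi)
    return jumps.sum() / (2 * np.pi)

def vortex(m):
    # Optical-vortex-like field exp(i*m*theta), singular at the origin.
    return lambda x, y: (x + 1j * y) ** m

t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
loop = np.column_stack([np.cos(t), np.sin(t)])
print(winding_number(vortex(1), loop))   # ~1.0
print(winding_number(vortex(3), loop))   # ~3.0
```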

An interesting case is the so-called transmutation of optical vortices, that is, the qualitative change of their most intrinsic feature, their vorticity, which has recently been achieved even in free space [30], also involving pentagonal symmetries. Vortices occur in the four states of matter, solid, liquid, gas and plasma, which are our version of the ancient four elements. Given that their characteristic behavior can be described in terms of constitutive stress/strain relations, a quantitative description of the transmutation of the states of matter is also feasible, something very different from the nuclear transmutation of the elements, without prejudice to the fact that nuclei too can be described more or less classically with vortices such as skyrmions.

The so-called geometric phase, a phenomenon so universal that it manifests itself even as vorticity on the surface of water, becomes, when applied to classical electromagnetism, «Maxwell’s fifth equation», so to speak, since it brings into play and encompasses the four known ones. The very name «geometric phase» seems clearly a euphemism, since it is not geometers who usually deal with it, but physicists, and applied physicists at that. I also prefer to call it holonomy rather than anholonomy, since the latter refers to the fact that it cannot be integrated within the frame of a theory, while holonomy refers to a global aspect that can be recognized even with the naked eye.

Berry himself admits that the geometric phase is a way of including the (uncontrollable) environmental factors that are not within the system defined by the theory [31]. In this sense, to arrive at Maxwell’s «fifth equation» we do not need to add terms, but only to refer to the «predynamics» of the less restricted or more general equations from which they originate, Weber’s and Euler’s in this instance. And the same applies to relativity.

No doubt the phenomenology of light is so vast that it never ceases to surprise us, but all this would have an incomparably greater significance if a parallel effort were devoted to the uncontrollable but complementary aspects that are now outlawed or masked by the dominant theories. In fact, there is no part of physics that is not being contemplated today through an unnecessarily distorted optic.

*

There is also a place in the theory of black holes for the golden ratio to emerge, just at the critical turning point where the temperature goes from rising to falling: M⁴/J² = (1+√5)/2, with M and J being the mass and angular momentum in units where the constants c and G equal 1. The meaning and relevance of this is not clear, but it echoes this constant’s habit of appearing at critical points [32].
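A quick numerical check of this turning point, as a sketch: the Kerr temperature in geometric units, scanned over the mass while the ratio J/M is held fixed (the choice of what is held constant matters; this is the convention under which the golden ratio appears, while holding J itself fixed gives a different number):

```python
from math import sqrt, pi

def kerr_temperature(M: float, J: float) -> float:
    """Hawking temperature of a Kerr black hole with G = c = hbar = k_B = 1."""
    w = sqrt(M**4 - J**2)
    return w / (4 * pi * M * (M**2 + w))

a = 1.0   # fixed ratio a = J/M; the temperature peaks at some M > a
best_M = max((m / 10000 for m in range(10001, 30000)),
             key=lambda M: kerr_temperature(M, a * M))
J = a * best_M
print(best_M**4 / J**2)   # ~1.6180 = (1 + sqrt(5))/2
```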

In Weber-type forces there is no room for theoretical objects such as black holes, since the force decreases with speed, and it would be interesting to see whether transformation optics can find a laboratory replica for the evolution of these parameters. In relational mechanics the most that can be expected are different types of phase singularity, such as the optical vortices already mentioned.

If the emergence of φ in black holes holds any interest for us, even though these are purely theoretical calculations, it is because of its direct association with angular momentum, entropy and thermodynamics. It shows us at least that the continuous proportion can also emerge in accordance with the principle of maximum entropy, which we consider fundamental for understanding nature, quantum mechanics, or the thermomechanical formulation of classical mechanics. If it can emerge here, it can also do so in other types of singularity, such as the phase of vortices, in the optical model that de Broglie built for the light ray, or in holograms.

For perhaps the greatest interest of the theoretical study of black holes has been to introduce this principle of maximum entropy into fundamental physics, albeit as a final term under extreme conditions, when surely it is present at every moment, past and present. In modern theoretical physics this could be expressed through the so-called holographic principle, which makes sense, since that principle makes extensive use of the light phase and, after all, we have already seen that there is no knowledge of our physical world that does not pass through light. However, there are all sorts of doubts about how to apply such a principle to ordinary low-energy physics.

Black holes are extreme theoretical objects of maximum energy, yet they have been reached through the «ordinary» physics of gravity, governed by action principles of minimum energy variation. Technically this involves no contradiction, but it makes us wonder about the very nature of action principles, something that still worried so conservative a physicist as Planck.

The action principle of Weber’s law, or of Noskov’s extension of it, does not allow the existence of these extreme objects because, by applying reciprocity in a strictly mechanical way to the centres of mass, force and speed become balanced. Something similar happens in Mathis’ breakdown of the Lagrangian into two forces. Nor does it seem possible in Pinheiro’s thermomechanics, in which there is a balance between minimum energy variation and maximum entropy, provided there is free energy available.

The ordinary Lagrangian is the most devoid of causality, and Mathis’ theory, which seeks to dispense with energy and action principles in order to remain only with vectors and forces, would obviously be the «fullest» model; whether it is viable is another matter. The other two lie in between. We know that with action principles univocal causes are impossible. One can choose whichever path one prefers and see how far it goes, but my position is that although a univocal determination of causes is not possible, we can have a statistical but certain sense of causality, related to the Second Law of Thermodynamics. Noskov’s and Pinheiro’s action principles seem compatible.

It is the third principle that defines what a closed system is, but the ironic twist is that this principle cannot be applied without the help of an environment with free energy contributing to close the balance. This happens even in Mathis’ model, where free charge is recycled by matter. Thus any reversible mechanics emerges like an island from an irreversible background, and it is this superposition of levels that gives us our intuition of causality.

The propagation of light is based on the homogeneity of space, but the masses on which gravity acts involve a non-homogeneous distribution. If we assume a primitive homogeneous medium, no type of force, including gravity, can alter that homogeneity except in a transitory way. The self-correction of forces, which is already implicit in Newton and the original Lagrangian, leads in that direction and seems the only conceivable way, if there is any, to cancel out the infinities arising in calculations. The entropic and thermodynamic treatment of gravity would also necessarily have to follow that direction.

The emergence of the continuous proportion and its series in plant growth, in pinecones or sunflowers, makes us think of vortices made up of discrete units, while at the same time it brings us back to the considerations about the optimal, rather than maximal, use of resources, matter and data that nature seems to display.

Yasuichi Horibe demonstrated that Fibonacci binary trees are subject to the principle of maximum information entropy production [33], something that might be extended to thermodynamic entropy and, perhaps, to other branches such as optics, holography, or quantum thermodynamics. The question is whether these series can simply emerge from the principle of maximum entropy, or whether they lie at some variable optimum between minimum energy and maximum entropy, as Pinheiro’s equations suggest.
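An adjacent, well-established fact (a stand-in illustration, not Horibe's theorem itself) ties Fibonacci counting to maximum entropy: binary strings with no two consecutive ones are counted by Fibonacci numbers, and the largest entropy rate attainable under that constraint is log φ.

```python
from math import log, sqrt

def count_no_11(n: int) -> int:
    """Number of binary strings of length n with no two consecutive 1s.
    The counts follow the Fibonacci recurrence: 2, 3, 5, 8, 13, ..."""
    a, b = 1, 2   # lengths 0 and 1
    for _ in range(n - 1):
        a, b = b, a + b
    return b if n >= 1 else a

n = 500
print(log(count_no_11(n)) / n)   # ~0.4815, approaching the maximum rate
print(log((1 + sqrt(5)) / 2))    # 0.4812... = log(phi)
```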

Since Pinheiro begins by testing his mechanics on some very elementary models, such as a sphere rolling on a concave surface or the period of oscillation of a simple pendulum, it would be of great interest to determine the simplest problem, within this mechanics, in which the continuous proportion appears in a critical or relevant role. With this we could take up again the Ariadne’s thread of this ratio for action variables and optimization problems.

Planck was still concerned that action principles seem to imply a purpose; the same was true of the Second Law of Thermodynamics for Clausius, although the latter was not at all bothered by it. It is plain to see that both types of processes, apparently so far apart, are effectively teleological, and this is no coincidence, since they are not even separate, as Pinheiro’s thermomechanics shows. It seems that the simultaneous inclusion of two undeniable propensities of nature is more natural than their separate treatment.

In the West there has been a strong rejection of any teleological connotation, because teleology has always been confused either with theology and the providential invisible hand, or with the intentional hand of man. Tertium non datur. However, it is clear that here, for both mechanics and thermodynamics, we are speaking of a tendency as unquestionable as it is spontaneous. Understanding this third position, which already existed before the false dilemma of mechanism, leads us to change our understanding of Nature radically.
