A Likely Convergence between the Quantum Hologram Theory of Consciousness and Post-Quantum Mechanics (work in progress)
By
Giorgio Piacenza
Preamble
Conceptually speaking, there are parallels between the Quantum Hologram Theory of Consciousness (QHTC) and Post-Quantum Mechanics (PQM). I think it is imperative to explore both theories and how they might reinforce each other in order to advance a more comprehensive scientific theory capable of dealing with non-local, self-organizing, intelligent, conscious phenomena.
Introduction
Non-local, two-way information signaling for effective communication is not allowed in orthodox quantum mechanics because of the unitary, linear, statistical evolution of the wave function. Bell’s Theorem shows that quantum reality must be non-local, but it shows this under an orthodox interpretation of quantum mechanics without back-action or non-linearity. This is why Bell’s Theorem bears on the non-random, deterministic Bohmian interpretation, which is partly based on hidden, non-observable but real objects (“be-ables” or “beables,” as John Bell called them) evolving alongside the orthodox wave function. Not probability waves but real waves and potential waves are said to move real particles, and the beables are non-observable but real. However, the emphasis is on the waves guiding the particles, and little feedback from the latter to the former is stressed. Pilot wave theory’s hidden variables, which are not encoded in the original wave function, are difficult to accept intuitively, and most physicists have preferred the (also intuitively difficult) strangeness of sudden collapse and of probability waves associated with the Copenhagen Interpretation, or the strangeness of complete alternative real universes appearing every time there is a measurement.
But the introduction of back-reaction between classical particles and their pilot waves in Bohmian mechanics, together with retrocausality in closed, time-like curves, allows for a non-linear evolution of the wave function and for a more practical quantum mechanical process in which events like non-local, two-way, classical information signaling through traversable wormholes become possible.
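To make the contrast concrete, here is a minimal sketch, in standard textbook notation (my own summary, not taken from the PQM literature), of the orthodox de Broglie-Bohm relations that a back-reaction would modify. In the orthodox pilot-wave picture the wave steers the particle, but nothing lets the particle act back on the wave:

\psi(\mathbf{x},t) = R(\mathbf{x},t)\, e^{\,i S(\mathbf{x},t)/\hbar}
    % polar form of the wave function: amplitude R and phase S
\dot{\mathbf{x}}(t) = \frac{1}{m}\,\nabla S(\mathbf{x},t)\,\Big|_{\mathbf{x}=\mathbf{x}(t)}
    % guidance equation: the phase gradient alone fixes the particle velocity

A post-quantum back-reaction would amount to letting the wave (or its phase S) depend in turn on the actual particle trajectory, which is what would break the linear, unitary evolution assumed above.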
Interestingly, this would also be compatible with the universe-as-a-hologram idea and with ER = EPR (originally conceptualized in relation to the AdS/CFT duality, which was proposed as a way to address the black hole information loss paradox). It allows information from 3D physical objects to be stored or represented on a 2D surface area. This is an interface that, according to Chris Fields and Donald Hoffman, is an imaginary boundary (as in black holes) which allows classical information interactions to be defined. And, in this picture, entanglement would be “the condition of interacting with the world through an imaginary interface on which classical information appears.” Source: https://www.youtube.com/watch?v=XGulRS2IyF8
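As a point of reference for the “3D information on a 2D surface” claim (a standard result in black hole thermodynamics, not specific to Fields or Hoffman), the Bekenstein-Hawking entropy ties the maximal information content of a region to the area A of its boundary rather than to its volume:

S_{BH} = \frac{k_B c^{3} A}{4 G \hbar} = \frac{k_B A}{4\,\ell_P^{2}},
\qquad \ell_P = \sqrt{\frac{G\hbar}{c^{3}}}
    % entropy grows with boundary area measured in Planck units, the quantitative core of the holographic idea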
Ultimately (especially in Don Hoffman’s theory) we would have a consciousness monism or consciousness panpsychism, and consciousness would interact with its own symbolic representations, acquiring classical information and experiencing itself as distinct from other consciousnesses (or “conscious agents”) and from “objects” (which would also be “conscious agents”). But the interaction could be thought of as an “interface.” And this “interface” may connect with but also transcend spacetime, since Hoffman’s theory is already mathematically connecting with spacetime conformal supersymmetry, Dirac spinors and Penrose twistors, which are models that relate to various quantum-relativistic physics proposals. Hoffman also prefers new directions in physics which lie outside of spacetime and reveal deeper symmetries in nature, facilitating complex calculations (like Nima Arkani-Hamed’s work on the Amplituhedron). Sources: https://www.youtube.com/watch?v=7E-MwJgy2lI and https://www.youtube.com/watch?v=oadgHhdgRkI
In Hoffman’s theory, physical objects and brain components in spacetime would have no actual causal powers, but perhaps (by considering them as symbolic representations of consciousness) they can still be modeled as if they did: in PQM, as interacting through the resonant convergence of causal and retrocausal information involving beables and pilot waves, and in the QHTC, as quantum offer waves meeting PCAR “virtual” waves.
Interestingly (and compatibly with the concept of a holographic matrix), Hoffman basically states that we are living in a “data structure” rather than in an ultimately real spacetime, and that our “error correcting code” produces the illusion of an objective 3D world. Thus, the ultimate source would be consciousness itself, engaged with and playing with itself and (in agreement with the QHTC) coding and decoding its own created matrix. Source: https://www.youtube.com/watch?v=oadgHhdgRkI
I like this approach (at least in partial agreement with QHTC) because I’m not satisfied with suggestions by some QHTC and PQM authors that, for consciousness to exist, other formal, decodable, quantifiable processes must exist. Instead, I lean toward the view that consciousness uses vehicles of expression and experience resonant with different levels of reality, both ultimately of its own making (as in the concept of Hylic Pluralism and in Vedanta, Theosophy, Mahayana-Tantric and other non-dual, experiential, metaphysical systems that generally converge on this point).
I suppose that, in terms of some Indian metaphysics, the “interface” would be part of an “akasha,” acting as a sort of transducer between gross, physical reality and a subtle, mind-dominated, non-physical reality. I suppose that (in terms of the QHTC) this “interface” is situated in the “handshake” between the emitted quantum information waves and the PCAR decoder of those waves, just as (in terms of PQM) it is situated in the causal and retrocausal relation involving classical beables and “local” pilot waves. It may be “imaginary” not only because the square root of minus one is used in the quantum wave function but because it allows observers with consciousness or subjectivities to obtain classical information about their world (perform a measurement) while simultaneously relying on retrocausality to do this.
Retrocausality
The original pilot wave theory referred to global hidden variables. It is often affirmed that von Neumann’s argument and, later, John Bell’s Theorem invalidated once and for all the possibility of hidden variables, but they referred to local hidden variables. In David Bohm’s proposal, the wave function possesses holistic, non-local, global information about the position and momentum of every particle. Furthermore, the entire wave function is instantaneously modified when the particles are affected.
John Bell was only able to show that hidden variables exclusively obeying retarded, past-to-future causality contradicted the statistical predictions of orthodox quantum theory. He felt that his own approach to retrocausality as a solution to (spatial) non-locality would contradict free will.
But, besides serving as an explanation for the non-locality revealed by Bell’s Theorem, retrocausality can also be an “explanation of the results of 'weak measurements' by Aharonov, Vaidman and others.”
The introduction of retrocausality may also render supposedly “no-go,” physically impossible states possible, and invalidate no-go-theorem-based arguments that favor epistemic explanations over ontological ones. Source: http://prce.hu/centre_for_time/jtf/retro.html
It is said that Bohm’s Pilot Wave, hidden-variable theory is not relativistic in the way Quantum Field Theory is, but the introduction of retrocausality relates Pilot Wave Theory to relativity.
Perhaps Bell’s mistake was to think only about one possible future and one possible past determining what happens to a quantum system instead of thinking about many possible futures and – when the past is unknown – many possible pasts.
A serious consideration of the need to include retrocausality, however counter-intuitive it might seem, has been entertained by renowned physicists like John A. Wheeler, Richard Feynman, Oskar Klein and Walter Gordon (of the relativistically invariant Klein-Gordon equation), by Jeff Tollaksen and John Cramer (author of the Transactional Interpretation of QM), and by mathematicians like Luigi Fantappiè, who since the 1940s considered the necessary role of retrocausality in “syntropic” living processes.
However, accepting retrocausality requires not only demonstrable arguments but also the capacity to include in our conscious awareness an intuition capable of transcending a forward-in-time-only perception of reality. Otherwise, the inclusion of retrocausality (which among its effects allows for traversable wormholes, conscious quantum computing, real-time telepathic communication and other phenomena sometimes associated with psi and with events reported by experiencers of contact with technologically advanced non-human intelligent beings) would already have become an important part of standard quantum physics, at least since the 1920s, when (as Ulisse di Corpo explains in “The Conflict between Entropy and Syntropy: The Vital Needs Model”):
“In special relativity the energy-momentum relation relates the energy of an object (E) with its momentum (p), and mass (m), where c is the speed of light: E² = p²c² + m²c⁴
This equation has a dual energy solution: one positive +E, which moves forward in time, and one negative −E, which moves backward in time. If the momentum is zero then the equation simplifies into the famous energy-mass relation: E = mc²
Oskar Klein and Walter Gordon, in order to generalize the Schrödinger wave equation into a relativistic invariant equation, had to insert the full energy-momentum relation, arriving at a dual wave solution which characterizes the d'Alembert operator: retarded potentials which propagate from the past to the future (+E) and anticipated potentials which propagate backward, from the future to the past (−E).” Source: http://www.hessdalen.org/sse/program/Ulisse.pdf
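For clarity, the dual solutions di Corpo refers to can be written out explicitly (standard relations, reproduced here only as a reading aid):

E = \pm\sqrt{p^{2}c^{2} + m^{2}c^{4}}
    % two roots: +E, read as retarded (past-to-future), and -E, read as anticipated (future-to-past)
\left(\frac{1}{c^{2}}\frac{\partial^{2}}{\partial t^{2}} - \nabla^{2} + \frac{m^{2}c^{2}}{\hbar^{2}}\right)\psi = 0
    % Klein-Gordon equation, built on the full energy-momentum relation; its d'Alembertian admits both retarded and advanced solutions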
Or perhaps the fundamental reconciling role of retrocausality in physics would have been recognized when, in the 1950s, French physicist Olivier Costa de Beauregard basically showed that a realistic interpretation of quantum mechanics was possible with its introduction… that the alternative to “spooky” action-at-a-distance was retrocausality, as discussed in Huw Price and Ken Wharton’s “Taming the Quantum Spooks: Reconciling Einstein with Quantum Mechanics May Require Abandoning the Notion that Cause Always Precedes Effect.” Source: https://aeon.co/essays/can-retrocausality-solve-the-puzzle-of-action-at-a-distance
“Costa de Beauregard pointed out that Alice could affect Bob’s particle without action-at-a-distance, if the influence followed an indirect, zigzag path through space and time, via the point in the past where the two particles intersect. But there are no zigzags like that in standard quantum mechanics, so if we put them in we are actually agreeing with Einstein that the theory is incomplete.
Later, when Bell’s work appeared, Costa de Beauregard recognised the deeper significance of the zigzag: it offers a potential reconciliation between Bell and Einstein. Bell’s argument depends on the assumption that the choice of measurement settings at the two sides of the experiment is independent of any earlier properties, or ‘hidden variables’, of the particles. This assumption is called statistical independence, but the Parisian zigzag gives us a reason to reject it.”
Furthermore, as explained in the abstract for “Disentangling the Quantum World,” Huw Price and Ken Wharton elucidate that “Correlations related to quantum entanglement have convinced many physicists that there must be some at-a-distance connection between separated events, at the quantum level. In the late 1940s, however, O. Costa de Beauregard proposed that such correlations can be explained without action at a distance, so long as the influence takes a zigzag path, via the intersecting past lightcones of the events in question. Costa de Beauregard's proposal is related to what has come to be called the retrocausal loophole in Bell's Theorem, but -- like that loophole -- it receives little attention, and remains poorly understood. Here we propose a new way to explain and motivate the idea. We exploit some simple symmetries to show how Costa de Beauregard's zigzag needs to work, to explain the correlations at the core of Bell's Theorem. As a bonus, the explanation shows how entanglement might be a much simpler matter than the orthodox view assumes -- not a puzzling feature of quantum reality itself, but an entirely unpuzzling feature of our knowledge of reality, once zigzags are in play.” Source: https://arxiv.org/abs/1508.01140
Renowned physicist Yakir Aharonov has also basically shown (with both theoretical and experimental support) that retrocausality is possible. In the abstract to his article (with Eliahu Cohen and Tomer Shushi) “Accommodating Retrocausality with Free Will,” he writes:
Retrocausal models of quantum mechanics add further weight to the conflict between causality and the possible existence of free will. We analyze a simple closed causal loop ensuing from the interaction between two systems with opposing thermodynamic time arrows, such that each system can forecast future events for the other. The loop is avoided by the fact that the choice to abort an event thus forecasted leads to the destruction of the forecaster's past. Physical law therefore enables prophecy of future events only as long as this prophecy is not revealed to a free agent who can otherwise render it false. This resolution is demonstrated on an earlier finding derived from the two-state vector formalism, where a weak measurement's outcome anticipates a future choice, yet this anticipation becomes apparent only after the choice has been actually made. To quantify this assertion, weak information is described in terms of Fisher information. We conclude that an already existing future does not exclude free will nor invoke causal paradoxes. On the quantum level, particles can be thought of as weakly interacting according to their past and future states, but causality remains intact as long as the future is masked by quantum indeterminism. Quanta 2016; 5: 53–60.
Lev Vaidman and Yakir Aharonov found that a weak vertical magnetic field is equivalent not to a classical measurement but to a quantum measurement. Furthermore, post-selected weak measurement experiments conducted by Lev Vaidman, Aephraim Steinberg, Yakir Aharonov and others show that negative probabilities and retrocausality, leading to a deterministic yet non-linear interpretation of the wave function in which free will remains available, must be seriously considered. With post-selected weak measurements, the disturbances introduced into a quantum system are kept to a minimum but, naturally, precision is lost.
Using large ensembles of identically prepared quantum systems and statistically averaging the results counteracts the loss of precision. With these post-selection experiments, the experimenter can only work with an ensemble of final states and retroactively recover how those final states influenced the outcome by interacting with the initial states. This allows for a time-symmetric quantum interpretation, as in the two-state vector formalism of quantum mechanics (TSVF), originating in the proposals of Aharonov, Bergmann, and Lebowitz (also known as the “ABL proposal”).
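A toy numerical sketch (my own illustration, not drawn from the cited experiments, and omitting the post-selection step for simplicity) of why ensemble averaging recovers the precision lost in weak measurements: each weak reading returns the true expectation value buried in large noise, and averaging N readings shrinks the statistical error roughly as 1/sqrt(N).

import numpy as np

rng = np.random.default_rng(0)
true_value = 0.5      # hypothetical expectation value being probed weakly
weak_noise = 10.0     # each individual weak reading is dominated by noise

for n in (10, 1_000, 100_000):
    readings = true_value + weak_noise * rng.standard_normal(n)  # n identically prepared copies
    estimate = readings.mean()                                   # ensemble average
    expected_error = weak_noise / np.sqrt(n)                     # statistical error shrinks as 1/sqrt(n)
    print(f"N={n:>7}: estimate = {estimate:+.3f} (expected error ~ {expected_error:.3f})")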
In “The Two-State Vector Formalism: An Updated Review” Aharonov and Vaidman write:
“A system at a given time t is described completely by a two-state vector ⟨Φ| |Ψ⟩ which consists of a quantum state |Ψ⟩ defined by the results of measurements performed on the system in the past relative to the time t and of a backward evolving quantum state ⟨Φ| defined by the results of measurements performed on this system after the time t. Again, the status of the two-state vector might be interpreted in different ways, but a noncontroversial fact is that it yields maximal information about how this system can affect other systems (in particular, measuring devices) interacting with it at time t.” Source: https://link.springer.com/chapter/10.1007/978-3-540-73473-4_13
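For readers unfamiliar with the notation, the two-state vector and the associated weak value (the quantity read out by the weak measurements discussed earlier) take the standard forms:

\langle \Phi |\;\; | \Psi \rangle
    % the two-state vector: |Ψ⟩ fixed by past measurements, ⟨Φ| fixed by future ones
A_w = \frac{\langle \Phi |\, \hat{A} \,| \Psi \rangle}{\langle \Phi | \Psi \rangle}
    % weak value of an observable between pre-selection |Ψ⟩ and post-selection ⟨Φ|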
Perhaps Einstein’s realistic intuition was significantly correct after all, if non-local communication is possible under realistic terms. The classical concept of spatial causality would endure while being non-local in time (due to the meeting of causality and retrocausality). Thus, locality would no longer be synonymous with “physical objectivity” (under a form of determinism that allows choice/free will and in which, computationally, P = NP in closed, timelike loops that overcome algorithmic limitations).
PQM would overcome the no-signaling theorem in the sense that real physical entities would be able to interact by exchanging information in a classical way.
Key Concepts in Jack Sarfatti’s abstract for his article “Bohm Pilot Wave Post-Quantum Theory”
“Valentini has shown that the Born probability rule and its consequent no entanglement signaling restriction is not fundamental.”
“Sutherland has shown how Yakir Aharonov’s retrocausal “weak measurement” technique applies in the Lagrangian framework to give a relativistically covariant post-quantum theory in which there is two-way action-reaction between the qubit pilot waves and their beables (e.g. classical particles and classical local gauge fields) without the need for configuration space for many-particle entanglement.”
“The post-quantum backreaction corresponds to computation around closed timelike curves in which P = NP with profound implications for quantum cryptography code breaking.”
“We expect Prigogine pumped open dissipative structures with Frohlich macro-quantum coherence to be post-quantum systems.”
“It’s a great calculational advantage of Sutherland’s local real retrocausal weak measurement formalism that we can use LOCAL field equations without second-quantization even when the beables are entangled.”
Discussion 1
PQM would refer to a deeper, more inclusive level of nature interpreted by a more comprehensive quantum theory operating in a relativistically invariant, realistic (rather than statistical) manner, building upon Dr. David Bohm’s interpretation of quantum mechanics. While differing from the orthodox de Broglie-Bohm-Hiley interpretation, it would still be non-local and compatible with the restrictions imposed by Bell’s Theorem. It is now being called “Post-Quantum Mechanics,” is consistent with weak measurement experiments in which retrocausality is involved, has been proposed by Dr. Jack Sarfatti for several years, and recently became mathematically consistent with the inclusion of Dr. Roderick Sutherland’s formalism.
The QHTC relies upon an orthodox Copenhagen, statistical interpretation of the wave function and, in its formalism, dismisses the need for higher dimensions of spacetime. It is “realistic” in that it operates in 3 dimensions of space and 1 dimension of time. Similarly, PQM’s formalism does not require the use of a higher-dimensional spacetime.
Moreover (as in General Relativity, in which mass-energy has a back-action on spacetime), there is a form of back-action in PQM theory. “Backreaction” between particles and pilot waves is paramount in PQM and, similarly, in the QHTC the concept of RESONANCE is paramount. For example, regarding human (and animal) conscious experience of reality, specific brain components (perhaps involving microtubules) are postulated to generate a phase conjugate adaptive resonance (PCAR) wave representation of the quantum information waves emitted by physical objects. The whole body is also part of the resonance. However, resonance requires feedback or back-action, and retrocausality may be required.
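As a reminder of what “phase conjugate” means formally (a standard optics relation, included only to make the resonance idea concrete), the phase conjugate of a monochromatic wave has the complex-conjugated spatial amplitude and behaves as the time-reversed replica of the original, retracing its path back toward the source:

E(\mathbf{r},t) = \mathrm{Re}\!\left[A(\mathbf{r})\,e^{-i\omega t}\right]
\;\longrightarrow\;
E_{pc}(\mathbf{r},t) = \mathrm{Re}\!\left[A^{*}(\mathbf{r})\,e^{-i\omega t}\right] = E(\mathbf{r},-t)
    % conjugating the spatial amplitude is equivalent to running the original wave backward in time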
I suggest that the information waves in the QHTC would be of a nature akin or identical to the Pilot Waves of the Quantum Potential in David Bohm’s framework. In the former case, the dimensional framework for causal energy effects would be “realistic” in the sense of remaining within 3 dimensions of space and 1 dimension of time. In the latter case, the framework would be “realistic” in the sense that a real configuration of the wave function is stressed even in the absence of an observer.
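For reference, the quantum potential in Bohm’s framework, which carries the non-local information attributed here to the pilot waves, takes the standard form (using the polar decomposition of the wave function given earlier):

Q(\mathbf{x},t) = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R(\mathbf{x},t)}{R(\mathbf{x},t)}
    % depends on the form of the wave (through R), not on its overall intensity
m\,\ddot{\mathbf{x}} = -\nabla\!\left(V + Q\right)
    % along a trajectory the particle responds to the classical potential V plus the quantum potential Q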
If we extend resonant, coding-decoding, quantum-wave, non-local, holographic information communication to a deeper quantum-realistic level comprising all objects in the universe (a level at which real objects can detect, decode, observe, measure and thus “converse” with each other), the “measurement problem” associated with the non-realistic Copenhagen “collapse of the (probability) wave function” might be solved.
Under the hypothesis of “quantum equilibrium,” Bohmian mechanics reasonably replaces the “collapse (of probabilities)” with the interaction between real quantum particles and the particle configuration of a real measuring object. The particles are “real” and the theory is deterministic. The interface seems to lie in the pilot waves and the non-local wave function. This may be akin to a non-local, holographic conversation between objects through PCAR coding and decoding but, for the Bohmian interpretation to be feasible, it would have to be extended into Post-Quantum Mechanics.
In the QHTC theory it is said that a meeting of an object and its decoded information waves would generate a resonant standing wave pattern, as if two hands were shaking. The zero-point field from quantum field theory is considered necessary for the existence of the information waves. Source: https://www.youtube.com/watch?v=EQuFtyruewo
In PQM theory, the back-reaction between pilot waves and their particles/beables would also produce a standing wave. But, unlike the situation in Bohmian mechanics, the pilot waves would not be independent of the particles/beables.
I suggest that the PCAR in the QHTC and the back reaction in PQM may be different aspects of a single phenomenon.
In the QHTC real physical objects are non-locally connected through a wave coding-decoding resonance in which the PCAR is compatible with a holographic mathematical description and which (through the Copenhagen interpretation extended to the observer) involves consciousness. In PQM theory we can also think of the non-local connection between beables/particles (objects) and pilot waves involving back action or feedback between the former and the latter as a coding-decoding, non-local, resonant, standing wave communication.
Discussion 2
The experience of subjective meaningfulness that we have is not explained well by the QHTC as it stands now, because it is partially based on orthodox quantum theory and the orthodox evolution of the wave function equation. But if we introduce non-linearity into the equation (and other factors related to PQM), then the possibility of choice and, therefore, quite likely, of subjective meaningfulness would be included.
The symmetry “handshake” between retrocausality and time-forward causality in PQM giving rise to the world of conscious experience apparently echoes the Andean concept that the present world of experience depends on the meeting and creative encounter between a past related world of established principles and a future-oriented world that is about to rise.
The non-unitarity of the wave function allows consciousness to manipulate probabilities of experience, possibly combining future probabilities with past probabilities. This non-unitary evolution of the wave function is mathematically possible and coherent using Roderick Sutherland’s formalism, which allows for retrocausation through closed time loops. According to Sarfatti, this situation would be compatible with Herbert Fröhlich’s idea of a non-linear, creative, cohering influence on mesoscopic and macroscopic energy-pumped, open systems far from equilibrium which, like living organisms, would resist environmental decoherence. Fröhlich was the author of “The Connection between Macro- and Micro-physics,” and, according to Sarfatti, this “is intimately connected with locally-retrocausal PQM back-reactions violation of the de Broglie guidance equation that was assumed by Bohm in his 1952 pilot wave theory.” Source: file:///C:/Users/holo1/Downloads/Solving_the_Hard_Problem_Mind-Matter-Con.pdf
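As a rough indication of the kind of expression involved (my paraphrase of the general two-boundary-condition idea, not necessarily Sutherland’s exact formalism), the probability density at an intermediate time would depend on both an initial, past-fixed state ψ_i and a final, future-fixed state ψ_f, rather than on the initial state alone:

\rho(x,t) \;\propto\; \mathrm{Re}\!\left[\frac{\langle \psi_f(t) | x \rangle\,\langle x | \psi_i(t) \rangle}{\langle \psi_f(t) | \psi_i(t) \rangle}\right]
    % both boundary conditions enter, which is where the retrocausal ingredient appears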