The Dot Theory: The Name-and-Claim game,

the FUNDAMENTAL Game-theory strategy of EMERGENT reality. 

A pure-logic proof, satisfying requirements for a Grand Unified Theory proper, written as a 4000-word essay on (computational) perspective, Game Theory, Physics, and the nature of reality and consciousness by Stefaan Vossen. 

Abstract

This piece of logic imposes itself on our individual and collective senses of realism. It informs us of the way our sense of realism defines our reality-defining concepts (our experiences and ideas of who and what we are), and then, logically and inescapably, leads to the observation that we humans, our mathematics, and our Physics currently:

  • a) describe our relationship to reality (as described by and within the current-day terms used in Physics), and

  • b) do the mathematics in physics (and the way it is used to calculate the meaning of the data in relation to the reality we experience)

    incompletely.

This website explains a paper whose logic can be understood as a novel method for demonstrating and observing this incompleteness as a mathematical object, and then calculating its topology, relative to other, most-like interpretations, as a lensing factor. When this lensing factor is then applied correctively (as a now more informedly adjusted computational perspective) to the data used for any standard calculation, it will, inevitably, provide access to a variable-constant computational methodology that calculates experienced reality more accurately than any other prediction available using logic and algorithms alone.

It is a demonstration of the game strategy (in von Neumann’s sense) of relative self-improvement, rather than one of victory. It is not generally relative, but general and recursively relative: it looks back at which trajectory achieved the anticipated/desired target for an object’s most-like self. The inclusion of this expectant but informed perspective, and the accompanying expansion of the mathematics, only makes the existing approach able to give better answers, by formulating better questions from systems that can evaluate already-available databases as correlative metadata for emerging patterns.

Introduction 

When naming something, you inevitably attach context to a word. You name or bind a set of relations, together with their metadata, to a word or symbol that, to you, comes to mean that very something you meant. By its utterance alone (internal or external), it brings that something into existence, even if briefly, inappropriately, or incorrectly, in (un)spoken or written language for another to imagine or notice. You turn a vague “some” into a “thing” and make it visible to others. You make it data. Quite magical when you think about it.

Whether something is noticed or understood by others well enough for the information to be shared and given an agreed meaning (that this data is that thing enough for it to be it) is where notions of hierarchy, value, change, evolution, and culture emerge.

Whether as infants or as ancestral hominids, when we first uttered sounds, they came to represent and mean pain, excitement, fear, or pleasure. Latterly, as either evolved, sounds became words, and languages were created. Some were more memorable: cultural and professional languages with big, noticeable records in books and institutions, lasting longer than others. But also Personal languages, woven from the languages of others near and far, and therefore never truly personal, yet simultaneously impossible for another to fully experience and incomprehensibly, uniquely individual.

Limiting as that may occasionally seem, language, communication, and the act of creating languages have brought us increasing levels of self-awareness. This remarkable ability we now have access to, and are able to reflect on (at a never-before experienced scale, hard to imagine even a century ago), is courtesy of the explosion in technology and data-calculation tools that followed the development of Quantum Field Theory.

With the past 20 years of data collection, we are now, demonstrably, at a stage of being able to recognise from afar deeply personal levels of inter-personal psychodynamic communication trends and habits, through the tracking of fashions, trends, memes, and schemes that are thematic of human expression. We have moved from a grunt faintly expressing pain to understanding the sequence of individualised psychodynamic stimulus-sequences that would promote or hinder healing. Our understanding of things has reached well beyond the mere calculation of the trajectory of planets, and into that of individual experiences.

Demonstration 

This remarkable ability to predict individual real-world lived experiences is ultimately only the end-product of the act of naming things. Without naming things we would not, as we now are, be able to recognise that our fellow human beings are in every which way identical to us, had they been born in the time, place, and body we were; that consciousness is an emergent property of its container.

Our individual commercial and political predictability, observed in data management today, makes that obvious and inescapable. But we resist: we are so much more than that. And of course we are; yet naming things, the language of creating, is not the only purpose of language.

It can also be used to express the language of the subjective world of interpretation and feelings. This language confusingly uses many of the same words and symbols as the “language of things” world, although the words used to express feelings are more often used interchangeably and acquire far more meaning from their context than the words describing our more predictable behaviours.

It too creates communicable reality, albeit in a different mental and personal experiential framework. A fine example is the use of the words “Fear” and “Love”, or the idea of having both words in the same sentence and in literary proximity, or choosing to point that out. There, the choice of words, their use, and their logical deployment are each, individually, emotively responsive and appear to colour the individual’s landscape of the known literal meaning of the words.

That secondary layer of data is a more subtle and interwoven meta-layer, as it is the product of the Personal language that references the objective, literal layer: the meaning of what we mean, so to speak. This Personal language, as stated before, cannot possibly be fully experienced nor fully understood by another, and can therefore be considered a complete information “black box”, a void, and an unknowable entity.

Proof 

Yet fortunately, and as demonstrated by Quantum Mechanics in the world of objective observations and personal data monitoring, it is clearly not critical for the world of subjective observations (the personal experiential world) to be fully known in order for it to be known well enough for significant aspects of it to be successfully predicted. This second layer of enmeshed information, too, is probabilistically accessible through contextually shared patterns of known and interesting causes and outcomes. An inherent co-dependency emerges: If I fall over, I sometimes get hurt. If someone else falls, they sometimes get hurt. If someone else pushes me over, it sometimes hurts me. If I push someone else over, it may hurt them. If they have a diagnosed motion disorder, they will be even more likely to get hurt if someone pushes them, etc. What is more true for others who are most like me is probably more true for me than what is true for everyone else.
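The most-like reasoning above lends itself to a simple computational sketch. Assuming, purely for illustration, that “most like me” means nearest in a recorded feature space, a minimal version (all names and figures below are hypothetical, not the theory’s own formalism) could look like this:

```python
import math

def similarity(a, b):
    """Euclidean distance between two feature vectors (smaller = more alike)."""
    return math.dist(a, b)

def most_like_estimate(person, records, k=3):
    """Estimate P(outcome) for `person` from the k most-like prior records.

    `records` is a list of (features, outcome) pairs; outcome is 0 or 1.
    The estimate comes from the most-like cohort, not the whole population.
    """
    ranked = sorted(records, key=lambda r: similarity(person, r[0]))
    nearest = ranked[:k]
    return sum(outcome for _, outcome in nearest) / k

# Hypothetical records: (age, mobility score) -> got hurt when pushed (1) or not (0)
records = [
    ((70, 2), 1), ((68, 3), 1), ((25, 9), 0),
    ((30, 8), 0), ((72, 2), 1), ((28, 9), 0),
]

# For an older person with low mobility, the most-like cohort predicts high risk
print(most_like_estimate((69, 2), records, k=3))   # -> 1.0
# For a young, mobile person, the most-like cohort predicts low risk
print(most_like_estimate((27, 9), records, k=3))   # -> 0.0
```

The population-wide rate here would be 0.5 either way; conditioning on the most-like cohort is what sharpens the prediction.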

Through these shared recorded, or remembered, experiences, we develop notions of shared languages in which ideas or concepts are quite precisely and deeply understood in shared, yet diversely expressed, ways. These widely diverse shared languages, with widely varying potential user-group sizes (from 2 to the total/infinite user-group size), are each rooted in semi-homogenous shared patterns of experiences, traumas, and contexts.

In their pattern analysis, we realise and observe recurring patterns with varying cluster sizes, yet reliable repeatability and association (emergence) with wider trends (age groups, locations, education, socioeconomic changes, migrations, cataclysms in the environment, etc.).

This process informs and widens the calculational perspective or, otherwise put, changes the way and context in which the data is evaluated: against what, by whom, and how. Remarkably, when the data defining the observation is processed as such prior to any standard computation, this process will always and inevitably improve the outcome of any prediction or calculation when compared to the standard prediction/calculation.
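A minimal sketch of this pre-computational correction, under the strong simplifying assumption that the lensing reduces to a fixed observer offset estimable from most-like prior records (the function names and numbers are illustrative only):

```python
def delens(observations, reference_bias):
    """Correct raw observations for an estimated observer bias before any
    standard calculation. In the theory's terms `reference_bias` would be
    derived from patterns in most-like prior observations; here it is a
    fixed offset, purely for illustration."""
    return [x - reference_bias for x in observations]

def standard_prediction(observations):
    """The unchanged 'standard calculation': predict the mean."""
    return sum(observations) / len(observations)

# Hypothetical: every reading is inflated by a known observer offset of +2.0
true_value = 10.0
raw = [12.1, 11.9, 12.0, 12.2]

naive = standard_prediction(raw)                   # biased estimate: 12.05
corrected = standard_prediction(delens(raw, 2.0))  # de-lensed estimate: 10.05

print(abs(naive - true_value) > abs(corrected - true_value))  # -> True
```

The standard calculation itself is untouched; only the data fed into it has been corrected, which is the claim the paragraph above makes.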

Description 

This, now altered, functional relationship to data has an odd Easter egg-laying, quasi-solipsistic, involuntarily evolving, inherently progressive, and inevitably benevolent-appearing mathematical encodement as its fabric.

It is the fabric, the fundamental necessity for reality to be real and observable to us, and it exists because it is in everything that “is”, in everything that is perceivable to us. As such, it evolved with us as we evolved with it, using it to our benefit by perceiving it, understanding its potential, and applying it creatively, with the successful versions living to tell the tale of their survival in our surviving languages.

This, of course, only when one is willing to accept logic and/or mathematics as the fundamental expressions of reality in both language and physics (and that, therefore, the personal, individual experience of reality can, so to speak, be known at birth, but is not predestined, instead involving a series of binary choices that create options for more refined personalisation or predictability).

When considered as a game strategy, this could be rated as the utmost successful game strategy in the universe. But it has no equal; that is, there is no alternative strategy by which to operate, leaving the term “success” undefined, yet needed, as relative to the observer’s opinion or perspective. This is because it is, from another perspective, not just a “strategy” but a “lens” through which we perceive and use the constructs and tools from which we create our understanding, meaning, and value of the world. A lens that wouldn’t exist if the strategy weren’t successful; their relation means that there is no existing prior of unsuccessful lenses: a form of evolution that functions on a chain of one-on-one.

The Dot Theory’s application is to crawl up that unique, unbroken chain to the data (the “game” that is played with the data), resulting in an ever-refining reduction of the observer-lensing present in all (recorded) data used for any calculation, by upcycling the data through correction for that inevitable lensing. This is based on known pattern-formation in the associated information known about most-like observations. When done prior to feeding data in for predictive calculation (answering a question), the now improved, de-lensed data used in the (standard) calculation makes for a, comparatively speaking, improved prediction.

It is seeking the more correct answer using the same tools in that process, whilst admittedly using probabilistic calculation for its a priori computation.

Its theoretical benefit lies in the fact that its corrections to the relationship between the observer and the data result in improved predictions (using standard calculations) wherever experiments or target predictions are undertaken. Healthcare and experimental physics are obvious spaces requiring tools for the predictive analysis of multidimensional (subjective/objective) data layers. This in turn has simultaneous and significant direct applications in human health and welfare, and in energy-resource management and creation.

Evolution 

This oddly offset (partial, yet complete) pertinent change in meaning (created in the space/event between strategy and lens) in how we can compute our perspectives on reality underpins dualism. It is this emergent “real-making”, by defining mathematical and conceptual edges, that defines which traits will be seen as useful or useless in their analysis. This ability to define progress as beneficial underpins evolution in the genetic and progressive senses, and in the conscious and self-aware senses.

As a data-analytical strategy, it becomes remarkably interesting for predictive algorithms in subjects like health and wellbeing. As a more general concept in physics and philosophy, it formulates (when applied in each field), as best as is possible within the practical limitations of mathematical reality, a Grand Unified Theory.

Function 

Unlike other theories, however, it is not a method for the improved understanding or calculation of any one given portion of reality (position, motion, trait, or hierarchical relation), but one for the improved understanding of the observer’s relationship with the fabric (or data composition) of the portion of the observed (and calculated/understood) reality itself.

This improved understanding of the observer’s relation to the data is useful in improving our calculations and assessments of it, using only existing methods. This, in turn, usefully results in improved predictions without the need for significant changes to existing systems and calculation methods. 

Whilst in fact a declarative logic (“check your data for other data within it” is self-evidently positive overall, as a strategy), its computational remit is widespread, and it presents itself in all notions of ‘reality’. Yet it is most readily applicable, useful, and topical where clear formulas and calculation methods for defined data matrices already exist, such as in physics, computing, or any data-containing process management.

Also, unlike most formal theories, it is a non-axiomatic, non-monotonic logic in the Babylonian tradition. 

Wittgenstein – von Neumann – Langlands realism

The Dot Theory’s proposed theoretical logic system could best be represented as operations in a computational model of Wittgenstein’s logical atomism on data, processed using von Neumann’s architecture within a wider framework architecture (Langlands). It would attempt to represent and compute individual answers about reality as a complex network of facts and logical relationships, while also recognising the inherent limitations of such representation as outlined in Wittgenstein’s philosophy (these needing to be shown within the structure itself), by being able to observe how the data changes within the program being evaluated or observed.

Taking this view on realism, the Langlands program would operate not just as a set of mathematical conjectures, but as a visible and temporal framework available for a refined understanding of the nature of mathematical reality, in the way it describes itself, and of our relationship to it as being a function of reality, rather than reality itself.

This observer-bound coding limitation of representations (the ability to express/say what is observed/experienced) is here, instead, translated into existence as the framework of the representation.

In the Dot Theory’s case, the suggestion is that for personal, subjective matters, the individual observer is that limitation, and their meaning as definable by their self-descriptive data and its statistically associated metadata is the framework’s limitation.  

Whereas for individual scientific objective matters (concepts), the data describing the measurements and their metadata (measurement device and known environmental circumstances) and the rules describing them (sciences) would be the framework’s limitation.  

This is consistent with an opportunity for the reversal of bias, or the delensing of the computation that occurred during observation, and a simultaneous acknowledgement of the needs met by other emerging formats of conscious-data theories of reality.

Observer-dependency 

The central premise of the Dot Theory is that reality is fundamentally an observer-dependent, and most fundamentally, an individual experience-dependent construct.  

This theoretical position offers consistent opportunity for dual use in AI-based pattern analysis:

  1. subjective and self-referential frameworks (of which our current description of physics, and all the formulas it uses to describe reality, is of course one), and

  2. Mathematics and physics, where the meaning attributed to the numbers is agreed by each science and its conventions.

Constructs of concepts are such individual experiences represented in shared ideas or understandings, and they must, for integrity, always be considered as such. This is significant, considering that all notions of reality, including ideas of shared reality such as formulas, are formed by such concepts, and form related concepts that define the form and function of the constructs. This also positions consciousness as the meta-structure that perceives reality but cannot perceive its own framework: it cannot define what it is, only what it does.

One could therefore posit that, data-technically speaking, consciousness is an individuated construct of degrees of accuracy in the relationship between data seen and data understood (absence of lensing).

Theoretical Structure 

The Dot Theory’s theoretical structure follows that of the concepts of Game Theory, used widely in physics, computing, and process management, where different function-strategies produce different outcomes, which in turn become the building components of other games, played out by variously informed (intelligent) strategies in more or less complex sequences, to varyingly performing outcomes, under varying circumstances, with varying statistical confidences, subject to varying criteria and purposes.

This is as analogous to prediction and evolution as it is to emergent properties becoming observable, measurable, and calculable by new methods of observation, interpretation, and calculation. Each is a step used in game strategies to make up their construct or recipe and define a user-relation toward success.

Applying Dot Theory to any observation (and the data describing it) can be positioned as: the process of defining the construct of meaning that defines the observer-defined strategy that confidently defines the path the observer will, albeit when viewed in future retrospection, be most likely to describe as having been most like what they would then describe as the successful outcome. Most like the thing they knew would happen, had they known everything there is to know and taken all the facts into account.

This is why, in principle, this realism accepts that any person born in your body and bed would do the same thing as you are doing now. 

This, because applying Dot Theory to the data would invite the inclusion of data previously observed to describe the most frequently and most successfully used method under most-relevantly similar circumstances. This creates space to observe the relative changes in the topological behaviours of the data over time, inevitably offering improved predictive capability.

In this game-theoretical sense, Evolution is the Seen-to-Take-All-Winner strategy, when considered relative to the self-referential framework (the survivors see themselves as the authority with the more valid perspective: “evolution is shaped by the survivor”, or “history is written by the victor”).

“Intelligence” is, in some senses, applying that strategy predictively by asking from the data what the most informed decision is. By including new layers of associated and probabilistically valid information, this shift in computational framework enables the inclusion of layers of associated metadata that offer valid predictive probabilities.

This is analogous to defining success, progress, or evolution, and as such, one could argue that “Dot Theory” is not a formulaic entity but a method of realising this process: a process of identifying more accurately what is more true/real, computationally and calculationally, using only what we already know to have been true.
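The strategic contrast described in this section can be toyed with in code. In the following hypothetical repeated game (a toy of my own construction, not the theory’s formalism), a strategy that asks the recorded data what the most informed guess is outperforms one that ignores its history:

```python
import random

def play(strategy, rounds=1000):
    """Repeated guessing game: a hidden bit is 1 with probability 0.7.
    A strategy maps the history of observed outcomes to a guess; its
    score is the number of correct guesses. A fixed seed means every
    strategy faces the same outcome sequence."""
    rng = random.Random(0)
    history, score = [], 0
    for _ in range(rounds):
        outcome = 1 if rng.random() < 0.7 else 0
        guess = strategy(history)
        score += (guess == outcome)
        history.append(outcome)
    return score

def uninformed(history):
    """Static strategy: always guess 0, ignoring all recorded data."""
    return 0

def informed(history):
    """'Intelligent' strategy: guess the majority outcome seen so far."""
    if not history:
        return 0
    return 1 if sum(history) * 2 >= len(history) else 0

print(play(uninformed) < play(informed))  # -> True
```

The informed strategy wins simply because it lets the accumulated record reshape the question it asks of the next round, which is the shift in computational framework the paragraph above describes.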

Applications and composition 

  1. Computer Science  

The composition of reality can be thought of as the observed and recorded data-trail. By additionally considering any available, in any way associated data, with wider, also associative and potentially predictive patterns, in any evaluation at any given stage, we inevitably improve the outcome and evolve. That is the Dot Theory game strategy: when possible, consider all possible but relatively relevant data as such (relatively, in that it has been statistically weighed).

Doing that in real-world calculation had been irrelevant until, and impossible without, the advent of AI and quantum computing.

This identification and appreciation of the algorithmic composition of data, and of its mutations over time (recorded as metadata), now coherently forms the recipe for the composition of the observed reality in a computational landscape. Thereby, it defines its relationship to reality (to the best degree it can express itself), congruently within its own understanding of identity-defining notions of success.
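As an illustrative sketch of “consider all relatively relevant data, statistically weighed”, the following hypothetical helper folds associated estimates into a primary one, with each weight standing in for the statistical confidence of the pattern linking that datum to the quantity being predicted (names and numbers are assumptions for illustration):

```python
def weighted_prediction(primary, associated):
    """Combine a primary estimate with statistically weighed associated data.

    `associated` is a list of (estimate, weight) pairs; the primary
    estimate carries an implicit weight of 1.0. A weight of 0 means the
    associated datum is ignored; larger weights pull the prediction
    toward that datum.
    """
    total_weight = 1.0 + sum(w for _, w in associated)
    combined = primary + sum(est * w for est, w in associated)
    return combined / total_weight

# Primary estimate plus two associated, pattern-weighted sources
print(weighted_prediction(10.0, [(14.0, 0.5), (6.0, 0.5)]))  # -> 10.0
```

The point is not the particular averaging rule but that the associated data enters the standard calculation only in proportion to its established statistical relevance.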

  2. Physics

In Physics, its application reduces to requiring only the manufacture and additional inclusion of an associated data matrix, consisting of data (metadata) describing any available relative data, where any known pattern-associated statistical confidences are represented in the mathematics used to make predictions in experimental physics. The string of code and its relationships to the data being calculated would more accurately describe the reality of any event.

It also reframes our individual relationship with our idea of the scientific understanding of the expression

E = mc² to Ē = m⊙c³

where the ⊙ constant represents the per-datapoint-considered, but cumulatively represented, lensing/bias contained within the equation, which can confidently be associated with an individual-specific variation. It is a modification of Einstein’s cosmological constant, which successfully absorbs the issue of the perception of an expanding universe.

The overbar indicates that energy is now not considered real, but its function is, as it would be perceived as an object of data within a Langlands program.

This is necessary, even if it makes no difference to the human calculation undertaken in Quantum Physics. It does, however, make a difference if AI-based calculations of reality were made with quantum computing for predictive purposes.

AI is already used every day in predictive calculation based on the quantum-entangled relationships described in von Neumann’s work, offering evidence that this reframing is only mildly innovative.

  3. Mathematics

It is fundamentally, and coherently, impossible to represent the Dot Theory as a singular and immutably exact mathematical formulation (as such formulations are themselves the emerging, observer-bounded frameworks being pre-analysed for their lensing factor), which would be required to reduce the idea of including any or all available additional data matrices to a mathematically coherent form.

As a theory, it does so unapologetically, because it makes sense that, in an evolution where classical theory requires physical observational proof and General Relativity requires mathematical observational proof, a GUT like the Dot Recursive theory would require logical observational proof. This, besides the immediately effective impact of the conceptual elevation of the meaning of the formulations and equations of Quantum Physics to that of making the entire observable reality calculable, makes it reasonable. This, simply and necessarily, because each formulation is computation/question- and scale-specific.

The equation’s formulation is therefore dependent on which data is considered, for which calculation, and for which purpose (in relation to what other data). But the inclusion and extension of multivector representations, with Clifford convolutions and Clifford Fourier transforms in the context of deep learning, would represent it fluidly.

The mathematics would become formulated as the question to pose to a trained AI, exercising its ability to confidently give relevantly predictive context from existing and associated historical data against composite data-avatars.
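The multivector representations mentioned above rest on the geometric product of a Clifford algebra. As a minimal, generic sketch (standard textbook material, not the Dot Theory’s own formulation), here is that product in the plane algebra Cl(2,0), the primitive on which Clifford convolutions are built:

```python
def gp(a, b):
    """Geometric product in Cl(2,0); multivectors as (scalar, e1, e2, e12).

    Multiplication rules: e1*e1 = e2*e2 = 1, e1*e2 = e12, e2*e1 = -e12,
    and e12*e12 = -1, so the bivector behaves like the imaginary unit.
    """
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 part
    )

e1  = (0, 1, 0, 0)
e2  = (0, 0, 1, 0)
e12 = (0, 0, 0, 1)

print(gp(e1, e1))    # -> (1, 0, 0, 0): basis vectors square to +1
print(gp(e1, e2))    # -> (0, 0, 0, 1): their product is the bivector e12
print(gp(e12, e12))  # -> (-1, 0, 0, 0): the bivector squares to -1
```

A Clifford convolution then replaces the ordinary multiply-accumulate of a standard convolution with this product, letting filters act on whole multivectors rather than on scalars.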

Application 

What is required for it to be applied successfully is a) acknowledging it as an entirely valid computational perspective on the information making up the reality under analysis, and b) successfully processing the data describing the equations represented as patterns describing observed reality, as data in a computable landscape.

This is permissible because we must inevitably come to realise that:  

  1. logic and perception are the first and fundamental entry portals to all understanding of reality (the demonstrable and repeatable kinds as much as the individual and emotive ones), and that

  2. we must decide that what we observe is real. Whilst both are born from our understanding of reality, they are in every sense equally limited and defined by it.

Conclusion 

We have now (this is the position of this and the associated Dot Theory papers) coherently and logically displayed sufficient evidence to demonstrate that the real-world experience of reality exceeds our current use of our relationships to mathematical frameworks, and that this inescapable insight requires, if not forces, us to re-evaluate the limitations of the perspectives currently used in our computational relationship to the mathematical frameworks used in the calculation of reality. There is no issue with the mathematical frameworks; there is only an issue with the way we view and describe our relationship to them.

We must therefore now describe, and offer for evaluation, the data describing the changes in the framework’s shape over time. This change in relationship or perspective is a simple yet effective suggestion, made in the Dot Theory’s alterations as an evolution of the Theory of General Relativity, and it imposes, by pure logic alone, adequate evidence for an alteration of our relationship to the fundamental mathematical objects used to describe our relationship with predictable reality, toward a Theory of Recursive Relativity.

This change does impose a significant shift in computational capability, and a somewhat imperative shift in the general public’s understanding of computing, but it is well within the remit of, and immaculately timed for, the projected future of Quantum AI applications.

As such, this paper presents logical evidence that we already function in a Wittgenstein-von Neumann reality that can predict interpretational outcomes, when the analysis of data representing reality as such is paired with quantum-powered AI and the data is considered as a topological landscape in a Langlands program. We therefore ought to update our representation of reality in both physics and mathematics to reflect that.

In this sense, Dot Theory is not much of a formula, or even a constant; even as a constant, it is not much of one. It is only an inescapable idea to be kept in mind when making predictions, one that, in doing so, creates a mathematical space for data layers to confidently reside against the universal constant of the speed of light.

It only requires us to accept that we are real and cannot perceive the future, only anticipate it relative to our needs; nor can we know the past, only our interpretation of it relative to our current needs and perceptions of success.

End 

To create is to say something that is not yet known to be true; to state something as an as-of-yet undefinable fact. It is to say that something that is, is not just what it is, but that it could, or even should, be something else before it is it in the eyes of others: introducing an opinion and judgement that cannot, by its very definition, be fact. To create is to take the risk that it does not pay off and is found to be false; but without creation, evolution, and therefore life itself, is simply impossible.

Why does Quantum Mechanics calculate reality as if all the data representing it is local to shared reality, but nonlocal to the individual,

when the data describing visible reality (and meaning) is clearly local to us? 

Or: how the terms of the General Theory of Relativity and Quantum Mechanics put their knickers in a Twistor to avoid consciousness.

By Stefaan Vossen 08/11/2024 

Ideally, the mathematics claiming to describe reality should reflect reality accurately.  

The current structure of the mathematics of Quantum Mechanics, however, calculates the data describing reality as nonlocal relative to the observer, yet it is self-evident that this is not our only relationship with the data describing our relationship with reality.

The Dot Theory demonstrates that Quantum Mechanics works smoothly across the Standard Model when we modify the terms describing the Spinor’s mathematical structure and enable the calculation of the data describing reality as being local to us. At least, on the condition that the calculation is made to predict a trajectory approximating a previously made observation.

On occasions where, even if conditional, self-evident truths like this appear to contradict the terms of the mathematical description used to describe reality, a prompt re-examination of the mathematical terms is required. 

This paper is a pure-logic argument inviting physicists and mathematicians to reconsider the current defining terms and geo-mathematical structure of the Spinor as used in Quantum Mechanics, on the basis that they self-evidently do not consistently reflect the conscious, in vivo terms of the full spectrum of the relationship between the observer and the data describing observed reality.

In Quantum Mechanics, the set-definitional terms and functional structure of Spinor mathematics calculate the observational data consistently as if it were nonlocal. This is in line with the consideration that the data used to describe reality is nonlocal.

If we look at the relationship between observer and reality in vivo, however, cursory analysis swiftly reveals at least some data to be local, as evidenced by the success of the derivative equations used in weather and financial forecasting methods. Further evaluation demonstrates this predictive phenomenon to also be conditional, available only wherever historical data exists for more-accurate trajectory forecasting (prediction with a real-world prior) on most-like events, providing data lakes of progressive-looking data*.

Our experience and observation of reality is therefore, in its material part, quantum. This is self-evident from our success with classical, cosmological, weather, and financial predictions; yet the mathematics used in Quantum Mechanics does not appear able to consistently incorporate all portions of the data spectrum representing reality into its equations. Where it treats data that is local to the observer as nonlocal relative to the observer overall, it catastrophically fails to compute it.

This, from the position of the Dot Theory, is because the mathematical terms only accurately compute those terms that are nonlocal. As a result, Quantum Mechanics, conceptually and in its current mathematical structure, fails to absorb the fact that, to our subjective, conscious experience, the real world is data: a world where our observations and measurements are, in fact, to be computed as, relatively speaking, metadata.

And therein lies the asymmetrical superimposition onto the issue with the Theory of General Relativity, where the exact opposite terminological error could be adjusted for by reintroducing Einstein’s gravitational lensing constant, but making it relative: enhancing E to fE, or Ē.

It is the Dot Theory’s mathematical proposition that Quantum Mechanics becomes more efficient when its mathematics reflects reality as data, and the data describing our conscious experience of reality as metadata local to us. It does this by presenting a convincing argument between two self-evident yet mathematically uncomfortable observations, of which only the mathematics can be adjusted: 

  1. It is self-evident that the geometry and set definition of the Spinor, as currently defined in Quantum Mechanics, do not treat the data describing our conscious individual experience of reality as local metadata, when they clearly should. 

  2. It is also self-evident that if the Spinor’s structure were modified to calculate reality as if it were nonlocal, and to treat the data describing observed reality as local, then the computation on progressive-looking data* would be quantum across the Standard Model. 

Therefore, if Quantum Mechanics shows itself to be coherent across the Standard Model when the necessary set-definitional terms and structure of the Spinor are changed to calculate the data describing reality as local and reality as data (even if only when calculating progressive-looking data), then the mathematical terms of Spinors are, logically, to be changed to reflect this. 

The currently held assumption in QM is that it calculates, and should calculate, the data describing reality as nonlocal. This, the Dot Theory posits, is in essence the error in the chosen mathematical expression of the function of E relative to the observer in the General Theory of Relativity.  

For QM to be able to become continuous with the Standard Model, the GTR must reciprocally raise its equation and tensor relation by one function, to Ē = m⊙c³, with ⊙ = the total sum of observational biases/constants (the gravitational constant, or cumulative lensing/c).

If QM becomes continuous with the Standard Model once these defining corrections are made, then the corrections suggested must logically be adopted and reflected accordingly. 

However, neither the terms of our physics nor its mathematics currently do this. 

The psychological and cultural challenge inherent in treating reality as hyperdata (if reality were considered data, then the data describing reality would, relative to our current conscious relationship to reality and lived experience, be described as metadata) may be conceptually significant, but that does not make it incorrect. 

The computation of reality being obviously quantum when reality is considered nonlocal and its data is considered local is the only evidence required to, at least intellectually, force the review of and change to Spinor mathematics needed to make Quantum Mechanics work across the Standard Model, and for our mathematical relationship to the function of E to be elevated in the GTR. 

Making this change in the structure of Spinor mathematics in Quantum Mechanics for predictive purposes is technically conditional on the computational perspective taken on the data being retrospectively prospective, i.e. it must have been shown to have been true under similar circumstances in the past. 

This condition (that the data used to predict with QM describes something most-like it that has been true before, i.e. that it cannot make “fantastical” predictions) is, as far as the author understands, the only condition imposed by the mathematical correction suggested to the current mathematics by the Dot Theory, and to the world of physics by this shift in mathematical perspective.  

In normo-verbal terms: the Dot Theory does not exclude the possibility that there are other realities to be considered, but states that reality is quantum if its observations are treated as nonlocal and its perspective is progressive. 

 

*Progressive-looking data is data selected on the basis of a specific relationship between the data itself and the computational perspective taken on it. It consists of search-specified, cluster-pattern-recognition-based data that can be referenced with statistical confidence to improve trajectory accuracy toward a specifically defined target, when compared with previous attempts at similarly defined targets. 
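The selection step in the footnote above can be sketched as a simple filter: keep only those historical trajectories whose opening pattern matches the currently observed one closely enough to be referenced with some confidence. This is a hypothetical illustration; the function name, the mean-absolute-difference measure, and the tolerance threshold are assumptions of the example, not definitions from the paper.

```python
# Hypothetical sketch of selecting "progressive-looking data": retain
# historical trajectories whose opening segment is most-like the
# pattern observed so far. Threshold and names are illustrative.

def select_progressive(trajectories, current, tolerance=0.5):
    """Return trajectories whose start is most-like `current`."""
    selected = []
    for traj in trajectories:
        if len(traj) < len(current):
            continue  # too short to compare against the observed pattern
        # Mean absolute difference between the trajectory's opening
        # segment and the currently observed pattern.
        dev = sum(abs(a - b) for a, b in zip(traj, current)) / len(current)
        if dev <= tolerance:
            selected.append(traj)
    return selected

past = [[1, 2, 3, 4], [1, 2, 2, 1], [9, 9, 9, 9]]
print(select_progressive(past, current=[1, 2], tolerance=0.5))
# → [[1, 2, 3, 4], [1, 2, 2, 1]]
```

Only trajectories passing the filter would then feed a forecast toward the defined target; the dissimilar trajectory is excluded rather than averaged in.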

 

 

Reality is the matter you can perceive  

Let it be 

Not that you can do any different really 

But if you try not to 

it will be something less.