6. Discussion


Equivalence and fairness 

By using the fair-market premise inherent to the BSM equation, we establish an ethical mathematical basis on which to relate the quanta representative of deeply intra- and inter-personal life experiences. It positions the value of an individual’s Quality of Life within a larger framework that requires calculational equanimity in order to function. The relational framework’s value metrics now include a wide and varied set of parameters: objective and subjective clinical and personal qualifiers, together with current and historical psychological, genetic and circumstantial factors, which together form the defining metric of Quality of Life. The equation also captures the way resources used to soothe one person instantly become less available to the needs of others elsewhere (a real-world example of “spooky action at a distance”), and recognises its capacity to create a sense of competition, friction and irritation (emergence): sometimes to people’s growth and strength, sometimes to their demise or distraction. 

With this equation, and its direct translation into the predictive models already used in financial markets, translating existing database-management platforms into healthcare ones appears within reach. It requires only the development of an avatar-based analytics framework to embody the notion of the asset, identifying the individual as an object/particle within the framework rather than as a collection of data.  

Interestingly, this can be executed from any perspective of calculation or observation. 

It now falls to developers to define those metrics from the data available to them: using AI and LLMs to identify their best-performing relationships and introduce them into the existing infrastructure; making them available via wallet-based access; and establishing contractual relationships between patients and the companies that hold personal information on them, enabling that data’s inclusion in analytics executed for the patients’ benefit. 

When thinking about this in healthcare process terms, the equation defining the value of stock options invites an appreciation of the health of “the company”, as could be done when looking at past call-option values. In healthcare this analogy continues to hold, since both our own and our suppliers’ and customers’ (i.e. our environment’s) general sense of value, or “track record of wellbeing”, is a reliable predictor of an individual person’s sense of wellbeing.    
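The “health of the company” analogy rests on the BSM call-option formula. A minimal sketch of that formula follows, using the standard textbook symbols S, K, r, σ and T; any mapping of these symbols onto wellbeing metrics is this paper’s analogy, not an established convention:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bsm_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes-Merton price of a European call option.

    S: current value of the underlying (the "track record of wellbeing"
       in this paper's analogy), K: strike, r: risk-free rate,
    sigma: volatility, T: time to expiry in years.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Standard textbook inputs: S=100, K=100, r=5%, sigma=20%, T=1 year.
print(round(bsm_call(100, 100, 0.05, 0.2, 1.0), 2))  # ≈ 10.45
```

The same closed form that prices a call from past volatility is what the analogy proposes to feed with wellbeing parameters instead of market ones.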

Considered as a coalescence of three highly computational approaches, this proposal crosses the mythical bridge by addressing the localisation and identification of the “particle”, i.e. “Patient X”, in the process of creating a signature kinesio-topological shift for that individual. Rather than only “identifying” the individual by this set of data (the non-unique signature topological shifts), the system creates “archetypes” of recurring patterns as the shareable reference pattern. These archetypes sit at an agreed level of distance from the private individual. The system would benefit from access to the raw data, so that AI bots can be trained to seek correlations in other branches of the n-ary tree wherever correlations are observed in or across others, and feed them back as recurring patterns.  
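The archetype step can be illustrated with a toy sketch: trait combinations that recur across enough individuals are promoted to shareable reference patterns, detached from any single person. All names and the support threshold below are hypothetical:

```python
from collections import Counter
from itertools import combinations

# Hypothetical per-individual trait records; names are illustrative only.
individuals = {
    "patient_a": {"low_appetite", "reduced_gait_speed", "poor_sleep"},
    "patient_b": {"low_appetite", "reduced_gait_speed"},
    "patient_c": {"reduced_gait_speed", "poor_sleep"},
    "patient_d": {"low_appetite", "reduced_gait_speed", "poor_sleep"},
}

def extract_archetypes(records, min_support=3):
    """Trait pairs shared by >= min_support individuals become archetypes:
    shareable reference patterns at an agreed distance from any one person."""
    counts = Counter()
    for traits in records.values():
        for pair in combinations(sorted(traits), 2):
            counts[pair] += 1
    return {pair for pair, n in counts.items() if n >= min_support}

archetypes = extract_archetypes(individuals)
print(archetypes)
```

Only the recurring pattern (the archetype) is shared onward; the individual records that produced it stay private, which is the “agreed level of distance” described above.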

A seemingly complex set of steps, but as luck would have it, all the tools already exist. In essence it requires only the small addition of an individual perspective, yet that enables a quantal computational leap in optimisation and customisation. 

The innovation is to alter the search perspective to one that looks at the number of “traits shared” by the individual and offered for evaluation of the question. This restricts the motive with which questions can be approached, but exponentially multiplies the breadth of potentially relevant associations. These can subsequently be observed for predictive relevance and helpfulness and offered in real time: powerful, real-world, intelligent, personalised healthcare and data services. By interacting with the data, patterns emerge and in turn become part of the information considered in future computation, especially when viewed from a similar perspective as dictated by trends emerging alongside sickness, injury and treatment. 
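The shared-traits search perspective amounts, at its simplest, to a similarity ranking; Jaccard similarity is one standard choice for this kind of set overlap (the trait names and records below are illustrative, not drawn from any real dataset):

```python
def shared_trait_score(a: set, b: set) -> float:
    """Jaccard similarity: shared traits over all traits seen in either set."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical query individual and cohort records.
patient_x = {"low_appetite", "poor_sleep", "knee_injury"}
cohort = {
    "rec_1": {"low_appetite", "poor_sleep"},
    "rec_2": {"knee_injury", "high_activity"},
    "rec_3": {"poor_sleep", "knee_injury", "low_appetite"},
}

# Rank cohort records by traits shared with the query individual.
ranked = sorted(cohort,
                key=lambda k: shared_trait_score(patient_x, cohort[k]),
                reverse=True)
print(ranked)  # rec_3 first: identical trait set
```

Associations found in the highest-ranked records are the candidates to be “observed for predictive relevance” in real time.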

With this system constructed as a platform, virtual simulations can be run to evaluate the predictive capability of various recurring patterns and algorithms against measured and observed real-world outcomes. This will be critical in the early development phase to deliver a robust process of built-in critical analysis. 
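At its simplest, such an evaluation is a hit-rate backtest of archetype-based predictions against observed outcomes. The sketch below is deliberately minimal and the case data hypothetical; real validation would need proper train/test splits and calibration metrics:

```python
def backtest(pattern_predictions: dict, outcomes: dict) -> float:
    """Fraction of cases where an archetype-based prediction matched the
    observed real-world outcome (a simple hit rate)."""
    hits = sum(1 for case, pred in pattern_predictions.items()
               if outcomes.get(case) == pred)
    return hits / len(pattern_predictions)

# Hypothetical predictions from one archetype vs. measured outcomes.
predictions = {"case_1": "improved", "case_2": "declined", "case_3": "improved"}
observed    = {"case_1": "improved", "case_2": "improved", "case_3": "improved"}
print(backtest(predictions, observed))  # 2 of 3 correct
```

Running this continuously per archetype is one concrete form the “in-built critical analysis” could take.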

Furthermore, with these archetypes, focused and directed evaluations of the wider database of unobserved statistical correlations (i.e. correlations science has not yet identified through the normal academic process) can be run, as simulations, to the benefit of a human individual. By identifying an individual in a location with a variety of traits as suggested, they become an item constructed in the same language as the database, and their identified group of traits (as part of the clinical process) becomes a perfectly calculable series of probability relationships.  

One obstacle presents itself: the size of the necessary database. In this case, however, the internet’s “reach” is the database, and the service providers’ desire to cut costs and optimise service has already produced the necessary archetypes. Access to data via APIs replaces the need for much data storage at all, as those data points already exist and seek commercialisation through digital access. Companies will have to find ways to respond to API requests, and may wish to make more of their archetypes as an industry of the future. A financial system will emerge once the value of archetypes (and not “data”) is realised; from a development perspective, it carries the digital hallmarks of cryptography.        

 

Development investment  

This idea would, over time, be taken on by a significant insurer or healthcare provider in both the private and governmental sectors. By: 

  1. acquiring this datatype from areas associated with an individual (healthcare-consumer parameters indicative of levels of wellbeing, e.g. strength, appetite), and  

  2. having their permission to join a movement-based data mesh (avatar) to their healthcare data,  

it will be possible for individual healthcare providers to build functioning predictive platforms. 

From a development investment cost point of view this will require: 

  • An integrated software platform to create avatar meshes, evaluate their trends and associate them with data trends in personal clinical data files 

  • A clinical software platform to capture in-clinic patient data and accept data from wearables through APIs contracted by patients 

  • An affiliation with a healthcare platform with fixed-point venues allowed to observe its users through standard CCTV facilities (possibly upgraded with LiDAR) 

  • A data-engineering team and a data-science team  

  • Acquisition of healthcare, science and ethnographic data 

  • Development of a prototype, branding and facilities management 

For the first company to take this idea to market to benefit from being “the first”, other companies must see the value of the founding company having done so. And for the founding company or companies to have value, other companies must already have, or be willing to develop, customer-experience “archetypes” of their consumers. Fortunately, this is something all of them have done, or are already doing, consciously or not. They may have some work to do, but if so, their development cost will be to produce archetypical models for the benefit of their consumers and customers; they will also be strongly motivated to do so for their own improved cost-efficacy.  

And “all” it will require is CCTV cameras and motion-based large language models with blockchain-based access technology.  

Many attempts to cross the conceptual bridge have been made through various initiatives in various data-heavy industries, with greater and lesser understanding of their users. The uniquely and intimately positioned healthcare profession, on the other hand, has not yet done so, and its inherently benevolent perspective on the purpose of its service offers the appropriate computational vantage point from which to appreciate and realise this method’s value.   

Novelty 

The development of a predictive observational perspective has so far been limited by the nature and traditional structure of healthcare data and by data restrictions inherent to the notions behind privacy laws. This proposal does not have to contend with this issue. The investment involved would be to develop learning models that associate changes in input (treatment modalities/interventions) and external stimuli (context) with: 

  1. the identification of recurring patterns of change and the associated benefit observed, and  

  2. the association of benefit to the patient with broader databases of clinical factual knowledge on which LLMs can be trained.  

Due to much-needed privacy restrictions, the way data is navigated will have to change radically in the future. This model proposes a use case for such a method, which in turn should spawn wallet-management and historical-data-evaluation industries. This use case is, in fact, the non-risk-motivated value metric behind commercial cryptography strategies. The industry’s current, predominantly financial motivation is driven by demand in the space of security and risk management. This use case, by contrast, is a positively motivated data-transfer requirement served through blockchain technology: a technology already optimised for integration with consumer and service data globally.    

Whilst undoubtedly challenging as a globally coordinated effort, benefits can rapidly be derived at local and commercial levels. With today’s capabilities in LLMs and blockchain technology, a perfectly conceivable team could build the complete predictive capability from a single site. 

The development leap proposed alters the organisation’s perspective on its relationship with the consumer. It does this by creating a transformational digital avatar from whatever real-world data the organisation holds, forming archetypes based on user- and performance-based key criteria. Joining the archetypes together to form reliable quantised reflections of “human experience” enables informed guidance toward actions, and combinations of actions, likely to be to the individual’s advantage. Mathematically speaking, it does so by introducing an additional spherical rotation, as discussed in Addendum E on spinors. This novelty joins the other, somewhat unusual relative positions offered in this paper. 

Shannon entropy: 

Where H(X) = -Σ p(xᵢ)·log p(xᵢ) is the mathematical expression for the information entropy of the set of variables that represent the object under evaluation:  

Describing a variable as a trait (height, weight, etc.) through the probability distribution of all possible values of that variable (field) for an individual person would enable a single individual’s collection of recorded traits to be represented as -Σ p(xᵢ)·log p(xᵢ). 
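A minimal computation of this quantity follows, using log base 2 so the result is in bits; the example distribution over trait values is illustrative:

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -Σ p(x_i) · log2 p(x_i), in bits; zero-probability terms skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Illustrative probability distribution over the possible values of one
# trait, e.g. a coarse height band for an individual.
height_band = [0.1, 0.2, 0.4, 0.2, 0.1]
print(round(shannon_entropy(height_band), 3))  # ≈ 2.122 bits
```

A uniform distribution over four values gives exactly 2 bits; distributions concentrated on fewer values score lower, which is the property the archetype argument below relies on.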

From there, all computation assimilates to the positive emergence of functional features, as with the features already using this equation’s derivatives in data management derived from this observation (the financial industry). 

In one sense, this means that the more we know about someone, the more variables we become aware of; and the more variables make up our understanding of the object, the greater its value of H(X). This reflects an increase in entropy with increased data, and it is the challenge our proposal overcomes by creating low-entropy, pattern-based archetypes. Data transference occurs with the creation of archetypes, which can then be used to navigate vast data fields with significantly reduced computational complexity.  
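The entropy reduction claimed here can be checked directly: collapsing many raw trait combinations into a few archetype labels lowers H(X). The category names and counts below are purely illustrative:

```python
from collections import Counter
from math import log2

def entropy(counts: Counter) -> float:
    """Shannon entropy (bits) of an empirical distribution given as counts."""
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Raw trait combinations observed across 16 records (illustrative labels).
raw = Counter({"A+B": 3, "A+C": 3, "B+C": 3, "A+B+C": 3, "A": 2, "B": 2})

# The same 16 records mapped onto two hypothetical archetypes.
archetyped = Counter({"frailty_archetype": 12, "baseline_archetype": 4})

# Fewer, broader categories carry less entropy per record.
print(round(entropy(raw), 3), round(entropy(archetyped), 3))
```

Navigating the data field by archetype label rather than raw combination is what buys the reduced computational complexity described above.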

From one perspective, it could be questioned whether this proposal offers an innovative perspective rather than any real innovation, albeit a perspective from which many ideas and applications automatically emerge and which opens a well of computational possibilities. Beyond this, an analysis emerges that has the form of logical super-asymmetry, something expanded on in Addendum B (and which, for this paper’s purposes, offers powerful options for the wellbeing of the human species and hopes to generate investor interest).  

In conclusion, the consequences of the approach to data discussed in this paper, and its translation into the logical similes of our physical understanding of the world as discussed in the laws of thermodynamics, imply that nature/reality expends energy on the development of organised, low-entropy states because, on balance, their production enables its own continued propagation. This may seem obvious, but its applicability is universal to all processes of lower-entropy-state production (higher-organisation states, aka “life”), and that implies something of note:  

“Life” self-propagates because it cannot but do so; that is the very trait that makes it “be alive” and makes it meet the criteria required to be qualified as such. But the conscious experience of “living” is limited and defined by the data-acquisition, processing and storage capabilities of the object life produces. As such, life needs living to increase entropy and fuel the need to organise and decrease it again (to give it “an existence” or “a meaning”), and living needs life to pass the useful data load on to future generations. These are not theosophical statements but simple, direct and logical conclusions, provided the system suggested in this paper’s proposal does indeed provide real-world prediction. 
