The Dot Theory
The full and complete paper, in 11 sections, is available as blog posts for comment.
1. Prologue, 2. Abstract, 3. Introduction, 4. Method, 5. Structure, 6. Discussion, 7. Conclusions,
8. Addenda A-D, 9. Addendum E, 10. Reinterpreting Spinors, 11. Addenda F-K & references
3. Introduction
Data, equations and a healthcare problem
Healthcare is in serious trouble, and that means there is a significant opportunity to make things better. Obvious clinical and commercial inefficiencies, poor compliance and low trust are only the tip of an iceberg floating in an ocean of healthcare data. The associated challenges seem many, and the number of vital moving parts even greater. This paper proposes a way of reorganising information to produce, per patient profile, an efficient evaluation of what works for most-like patients. For the method this paper suggests as a solution, please turn to page 19 and skip the introduction; rest assured that the parts needed to execute and deliver this process are already in place and require no new invention, only a different way of connecting them.
This paper's aim is to demonstrate a method of reducing information losses at the controllable data stages (acquisition, transfer and analysis) in order to improve the efficiency of healthcare decision-making. It does this by suggesting an alternative approach to healthcare data analysis and to the juxtaposition and superimposition of that data.
With the current focus on the development of AI, today's approach to improving clinical data analysis can be described as "trying to better understand the fabric of the interactions between two knowledge bases that cannot, by their very fabric, speak the same language". This is not only because one party experiences while the other observes, but also because the experience of disease and poor health is a unique individual's healthcare experience, with its own history and interpretations.
Healthcare, of course, does not exist only to cure sickness; through policy and research it works to deliver something more cost-effective and less unpleasant for the patient: prevention. To understand prevention, we must come to understand the data that precedes the development (or, for treatment, the resolution) of an observable symptom or cluster of symptoms. With this mesh of data, we can cross-reference to find patterns with probability weightings, showing us paths with a high probability of benefiting a specific individual.
This, of course, raises the problem of associating an individual with that data, and requires the data to be organised so that it is already subdivided into the groups most representative of the person under evaluation (e.g. blood-pressure variability between races, probabilities of heart attack associated with BMI, etc.).
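The kind of pre-subdivision described above can be sketched in a few lines. The field names, values and the choice of stratifying trait below are purely illustrative assumptions, not a prescribed schema:

```python
# Hypothetical sketch: stratifying patient records into "most-representative"
# groups before analysis, then summarising each group. All fields are invented.
from collections import defaultdict
from statistics import mean

patients = [
    {"id": 1, "ancestry": "A", "bmi": 31.0, "systolic_bp": 142},
    {"id": 2, "ancestry": "A", "bmi": 24.5, "systolic_bp": 121},
    {"id": 3, "ancestry": "B", "bmi": 29.8, "systolic_bp": 135},
    {"id": 4, "ancestry": "B", "bmi": 22.1, "systolic_bp": 118},
]

def stratify(records, key):
    """Group records by a shared trait, so each group is 'most-like' itself."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    return dict(groups)

cohorts = stratify(patients, "ancestry")
for trait, group in cohorts.items():
    # A per-group statistic becomes the "normal" against which an
    # individual in that group can be evaluated.
    print(trait, round(mean(r["systolic_bp"] for r in group), 1))
```

The same `stratify` call can be repeated with any other trait key, so a person can be evaluated against whichever subdivision is most representative of them.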
Whilst the dis-ease process in the person under evaluation may have many observable correlates with similar "dis-ease processes" experienced by others, it remains an inherently unique process: it is experienced by one person and therefore cannot be experienced by another. This linguistic difference represents the "otherness" of the two individuals involved in a clinical encounter.
Because of this "otherness", any evaluation cannot but be marred by data representing "subjectivity", and that subjectivity becomes the limitation of perception inherent to the "otherness". This linguistic difference must also be considered a key limiting factor on the potential quality of the clinical encounter.
The clinical encounter is a nexus point 'X' at which two databases converge, each carrying the uniquely individual traits and behaviours that make its owner who they are. The first can be seen as the collection of data describing the symptoms or health experience of an individual, whilst the other is a database of diagnostics and associated solutions and prognoses: a database of bumps and lumps, lotions and potions, with the clinician as observer and translator.
Database A <-----Clinician (X)-----> Database B
Database A: the set of all signs and symptoms understood to be relevant to sickness
Database B: the set of all processes understood to aid sickness and optimise wellbeing
This latter database is currently built, "owned" and translated by professions with individual philosophies and commercialisation approaches, and "used" by nations with differing socio-economic variables and habits. It is also, occasionally, executed amid conflicting philosophical nuances and lobbying of varying strength.
This is, as much as anything, often due to the technological and scientific limitations locally imposed on each philosophy's peripheral manifestations (language and rituals), rather than to the accuracy of its core clinical ideas. This linguistic difference between core ideas of health (and their localised execution), together with our previously necessary insistence on linear development of the knowledge base, has potentially stifled scientific development. In this sense, this paper suggests a data-analytical way to propel intelligence forward, using the same fuel (data) but through a different process, to improve the output performance of that data.
Use-case argument
Compared with those of the financial markets, these "clinical" databases and data-prediction tools would be considered to suffer severely from "lag", in user-feedback-loop terms. Current treatment-development models are step-iterative, time-consuming (years, decades, centuries) and costly, even with wearable data collection. Clinician and technician training is likewise costly: fiscally, cerebrally and in time. Its acquisition is highly reliant on human memory, as well as on subjective, interpersonal and contextually modified inputs (mood, acuity).
Objectively, healthcare today is also highly stressful for both patient and clinician, as well as for institutions and nations trying to optimise their welfare and productivity interests.
The role of the clinician is a high-liability item, both in performance terms and in direct financial and insurance terms. All of this points directly to significant areas for high-value improvement with high social impact. The system discussed here suggests an opportunity to introduce a series of known, in-use technological elements at key stages, for significant social and commercial gain.
Finally, this is of course a frustratingly inefficient situation, but a logical one for the healthcare profession to find itself in, especially considering the private nature of what surrounds health and disease. It is only natural, after all, for mankind to have devolved a special social role onto a person or institution who would keep things private and be aware of cures.
A truly globally, universally, and historically valued commodity.
Evolution, Function and Role
This level of specialisation could naturally prevent healers from acquiring skills and stability in other areas of self-provision. In return, such a person would receive culture-wide protection for the duration of unbroken trust. The role was held by a wise or sage figure who would often have been educated formally or distinctly.
Sometimes figures, symbols or rituals would be invoked as part of a belief infrastructure, to harness their power, effects now often referred to as placebo and nocebo. These are occasions where knowledge of a meaning, and the feeling of that knowledge, could be powerful enough to invoke or support cures through ritual and context. Frequent use for belief, sorcery and magic occasionally resulted in abuse, often of the vulnerable.
The pre- and post-Enlightenment periods, with their scientific methods and privacy laws, offered individual protection as an iterative accountability and safeguarding feature. This somewhat laggy evaluative process, still in use in healthcare today, has been and continues to be considered a valued commodity.
Other industries typically concerned with managing high-value commodities that have complex real-world relationships are the meteorological and financial sectors. These sectors have produced significant computational power in the form of derivative equations and population-level epidemiological trends, enabling the effective entanglement of a wide variety of key data points and naturally revealing their predictive power. Unlike the healthcare industry, however, these sectors do not have to contend with privacy constraints regarding the identity of the subject matter under evaluation (figures on businesses or commodities). Such figures offer themselves freely to specialist analytical approaches, and these sectors have produced their own forecasting and data-management models in the form of cryptographic strategies such as blockchain: a tool with significant meta-meta-data storage capability if used as suggested in this paper. Its deeply cryptographic nature and its impenetrability in the presence of perfect randomisation make it well suited to handling personal data, of course, but more importantly, it holds data as packets in the chain. Packets that are defined items in a defined location. Quanta.
Trust and Blockchain
This paper acknowledges the safeguarding offered by blockchain's cryptographic approach to data and offers the idea that, with the use of functional archetypes, a "personal wallet" could enable an individual to access, via the internet, data correlating with the data they offer up for analysis in that wallet. It would also offer an opportunity to organically integrate perfect individual randomisation, opening up hyper-exponential computing possibilities.
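The "packets in a chain" idea can be made concrete with a minimal sketch using only Python's standard library. The packet contents, the consent record and the use of SHA-256 are illustrative assumptions, not a production wallet design:

```python
# Minimal sketch: each packet of personal data is hashed together with the
# hash of the previous packet, so any tampering is detectable downstream.
import hashlib
import json

def make_packet(data, prev_hash):
    """Bind a data packet to its predecessor via a SHA-256 hash."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

# A toy "wallet": a chain of voluntarily offered data packets.
genesis = make_packet({"consent": "movement-pattern analysis"}, prev_hash="0" * 64)
entry = make_packet({"systolic_bp": 121}, prev_hash=genesis["hash"])

def verify(chain):
    """Recompute every hash in order; False if any packet was altered."""
    prev = "0" * 64
    for packet in chain:
        if packet["hash"] != make_packet(packet["data"], prev)["hash"]:
            return False
        prev = packet["hash"]
    return True

print(verify([genesis, entry]))  # altering any packet's data would print False
```

Because each hash depends on the previous one, a packet really is "a defined item in a defined location": it cannot be moved or changed without breaking verification of everything after it.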
This idea offers the opportunity to analytically question the tools and methods currently used for education and for the formal data management of that nexus point. It also recognises that:
clinical outcomes and accuracy have not grown in line with increased computational power and connectivity
for investors, return on investment in process-driven healthcare has deteriorated significantly (while increasing in service-driven healthcare)
This is because our systems for managing healthcare data have not yet adapted to the computational possibilities offered by quantum-computational models. In this proposal's sense, "wallet-brokers" will become platforms that enable individuals to gain access to curated data lakes of archetypes in order to improve their digitally mediated real-world experiences. These wallets offer access to APIs for comparing the client's data profile (the patient) with that of the avatars held by the API holder (avatars of that person existing in other services and service profiles). Software companies, as well as any retail or commercialisation operation (i.e. holders of real-world digital data with client-behaviour-pattern data), can offer their "intelligence" to the market, in a bid to appreciably improve the wallet-holder's experience by retailing pattern-altering suggestions. There are companies and organisations that already hold a great deal of such anonymised research data and use these patterns every day (and, if not, are building them as we speak), and they are keen to develop value from it.
In this sense, Web3.0-enabled cloud-computing services essentially represent wallet-holder data as customer avatars to be evaluated against reference databases, advising the best next choice for people most like you (statistically and probabilistically linked to your defining traits and behaviours).
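One plausible reading of this comparison step can be sketched under strong simplifying assumptions: here a profile is a short trait vector, "most-like" is Euclidean distance, and the archetype names and numbers are invented for illustration:

```python
# Hedged sketch of "best-next choice for people most-like you": compare a
# wallet-holder's data profile against reference avatars and return the
# closest archetype. Traits, names and values are illustrative assumptions.
from math import dist

avatars = {
    "archetype_1": [24.0, 120.0, 7.5],  # [bmi, systolic_bp, sleep_hours]
    "archetype_2": [31.0, 145.0, 5.5],
    "archetype_3": [27.0, 130.0, 6.5],
}

def most_like(profile, reference):
    """Return the archetype whose trait vector is closest to the profile."""
    return min(reference, key=lambda name: dist(profile, reference[name]))

client = [26.5, 128.0, 6.8]
print(most_like(client, avatars))  # -> archetype_3
```

A real service would use far richer profiles and a learned similarity measure, but the shape of the query is the same: a voluntarily shared profile in, the nearest reference archetype (and its associated recommendations) out.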
The retail of pattern suggestions is a broad subject, discussed briefly in Addendum D.
Archetypes and avatars
Topological Kinesiomorphology: the (suggested) process of evaluating changes in the fluctuation patterns of a data surface in relation to changes in another (any other) data surface.
Through the suggested process of "Topological KinesioMorphology" (TKM), recurring rules and patterns can be observed and allocated as "archetypes" of human behaviour. These archetypes are fundamentally the amalgamation of the reliably correlating traits and features of most-same individuals. In essence, they are what is, for a period, considered "normal" for people who share your central traits associated with that "normal" (e.g. "healthy"). TKM (a suggested name) evaluates changes in the shape of an individual's movement pattern in response to environmental changes, stimuli and stressors (including therapeutic encounters). It evaluates the patterns emerging from the disruption of patterns.
An archetype's central traits change over time as the archetype develops. Archetypes are created from the algorithmic exchanges between the data that individuals have historically offered to the analytical system. The same can be said of how financial markets and the service industry optimise their prediction of product and customer behaviour. By creating human avatars and deducing their archetypes, we open the door to predictive analysis of the associated healthcare data.
Shannon entropy, AGI and Infodynamics
The idea expanded here represents a system in which recorded reality is represented as data points within a data matrix: data probabilistically linked to other data via function-probabilities that can be questioned in turn; data representing packets of specific units located in a certain type of field (quanta).
As these fields (and their representations) coalesce through the convergence of their functions, reality changes its appearance before the very eyes of the observer, imperceptibly morphing into perception. The data-entanglement process and predictive power offered by the laws of thermodynamics are used and represented in Shannon's and Vopson's data-centric interpretations of those laws. Their applicability to the emergence of biological systems from chemical systems is held to be reliable: its probability is considered very high, supported by ample physical evidence and countered by no known computational rules.
Here, using data-entropic modelling, the method allows us to represent data points as computational relations in real-world space: quanta in relationship to each other through functions. We suggest that clinical data can be seen and analysed as the coalescence of multiple such fields of physical, emotional and physiological human function. This paper suggests that these types of data are prime candidates for such analysis, and that it can be achieved by taking an individual-centric perspective on computation. This could represent a first, or early, execution of true AGI functionality.
As such, these variously (anthropologically, epidemiologically and ethnographically) recorded and tangible (as data) notions of what constitutes the human experience of living and of being well or unwell are represented as consciousness- and awareness-inducing wave collapses across the matrix of the collective human experience.
From this computational perspective, it rapidly becomes clear that (a) the "closedness" of an entropic system defines the nature of the catastrophes that occur within it, and (b) the asynchronous sentiment and nature of a catastrophe arise because it occurs across the barriers between different perceptual fields, disrupting the linearity of one with the linearity of the other: calculationally speaking, it comes from "another dimension".
Each evaluatory layer has its own stratum of rules defining its language, from materialism to consciousness. From another perspective, they all say the same thing, using different languages to achieve different organisational ends.
This paper discusses the rules that connect all strata of the "health experience", from physical realism to personal perception and emotion. These are necessarily linked through higher-order systems interactions, indicated and stratified below (picture 2). This diagram seeks to demonstrate the relationships between conceptual databases, data sources and data-processing algorithms across the "lived-experience" spectrum of healthcare. This translates into significant differences in language, meaning and even translation. Because healthcare spans the spectrum from heat radiation and chemistry to emotion and psychology, it embodies a function that covers the entirety of the human experience.
These groups of data clusters often seem at odds with each other, yet all observably coalesce into higher-order functions. This tensile-repulsive relationship is the property that emerges from existence itself, with consciousness as its observer. The relationship is extensively discussed in the literature and coherently amalgamated across it. The subject of this paper is only to highlight that the relationship between strata is consistently inconsistent. Doing so points toward a rule, a repetition of an inherent driving force which, when algorithmically understood, reveals the opportunity to repeat a process of analysis and achieve new levels of understanding.
Picture 2 (apologies, pictures not available in this web format)
Each combination (e.g. Biological Systems + Hierarchy) offers emergent properties (up arrows) one-directionally (Social Systems), which in turn, when assisted by another (Creativity), create another (Culture). These are "emergent".
In reverse (down arrows), one-directionally, there is a convergence (as opposed to emergence) of Creativity and Social Systems, and confirmation of a more consciously constructed concept of the trait when it is observed, i.e. becomes conscious (in this case Hierarchy). This self-generative and emergent association between functions is necessarily asymmetrical, yet, as demonstrated in the diagram, bidirectional and "symmetrical from the perspective of the emergence". These are "consolidatory".
Observed, remembered or imagined, these experiences become recordable virtual data particles in that human's life experience and leave traces in the form of data and records. Whether seen as "added to" by prior computational experience and memory (inheritance, genetics, socio-cultural memes), or as having a life expectation (a religio-spiritual or humanitarian motivation), data is being stored across the world, both literally and potentially, in far more meaningful terms than are currently exploited.
This data is currently being stored both directly and indirectly. Humans share significant amounts of data with digital commercial partners to receive improved service and trade terms. They also produce a significant data trail of internet traffic displaying traits and behaviours, alongside their socio-cultural activities, art and socio-familial interactions, each marked by differently memorable and recorded changes in the framework of the world. Various meaningful discussions about this data accumulation and the meaning of its entropy are currently under way. For this paper, we consider the current availability of this data to be both:
Indirect, by virtue of research already carried out on other, "most-like" individuals; and
Direct, for research carried out at the behest of, or for the benefit of, the individual receiving treatment at any given moment, as can be found in any clinical-notes file, personal tracking app or wearable.
Therefore, it is no stretch to imagine the individual human experience as occurring within a closed data system composed of an evaluation of that data trail in the context of vast amounts of data on associated trends and confidences.
This, in essence, is what the proposal translates to: a logical and natural computational capability emerging elegantly from prior progress in the human data-mining field. It borrows well-used data- and asset-management tools and strategies already successfully deployed in financial-market prediction. The proposal is to take those financial-mathematical tools (derivatives) and apply them to healthcare data: not to assess the value of a company, but the quality of life of an individual. In this sense, group company profiles and expected performance patterns are analogous to individual archetypes; P&L sheets and individual company stock-market performance profiles are avatars; and call-option value is analogous to Quality of Life (QoL) scores. This enables significant computational possibility, in both meanings of the word "computational":
To be put into relationship with other numerical values, providing new meaning about their relationships in certain ways, using certain algorithms embodied in circuitry (via AI).
To be put into computer-logic-driven processes that manipulate various clusters of data-behavioural patterns and superimpose them over related objects in time and space for predictive outcomes and problem solving.
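The "derivative equations" invoked above can be made concrete with the standard Black-Scholes price of a European call option, computable from the standard library alone. Mapping its inputs and output onto healthcare quantities (e.g. reading the call value as a QoL analogue) is this paper's suggested analogy, not established practice:

```python
# Black-Scholes price of a European call option, the canonical "derivative
# equation" of the financial sector. The healthcare mapping is an analogy only.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Call price: S spot, K strike, T years to expiry, r rate, sigma volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# A textbook benchmark case: at-the-money, one year, 5% rate, 20% volatility.
print(round(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2), 2))
```

The structural point is that a single closed-form function condenses several entangled observables into one forward-looking value, which is exactly the kind of compression the proposal wants to borrow for individual health data.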
The need to invoke principles of thermodynamics and their "infodynamic" interpretations is to allow the tantalising understanding that, if our proposal is accepted as logical, tested and shown to be valid, then the extremely rewarding conclusions of successful thermodynamic coupling also become visible and available to humans. Classically these are: predictability, significant rewards and cost-efficiencies.
Further outlined below is a mathematical and computational shift that de-isolates the meaning of the dataset from:
The existence and privacy of an individual human (the GDPR-protected "patient file")
to that of a "patient X" avatar: a data trend identified as having been in a specific location in a database representative of reality.
The local comparison of such avatars (as permitted within privacy laws) inevitably reveals archetypical algorithmic functions in relation to the wider environment, as is already done locally for function optimisation.
This shift, and its application in the healthcare field, give the data adequate reference for valuable and valued predictive analysis. In making an avatar, something observed to change in response to treatment input, we observe the human experience (pain, wellbeing) and associate it with the reaction to the environment (as expressed by a change in movement pattern), thereby making the dataset functionally useful for improved prediction.
Legal, ethical and HIPAA/GDPR
Individual rights rightly impose legal and ethical limitations on computation over the relationships of private data. The current protection that HIPAA and GDPR offer to the private meaning of those relationships makes for the effective sharing and protection of this private data. The novelty of our proposal is that it is a data approach that creates no new conflicts to overcome, yet simultaneously enables a rapid and continuous learning and improvement cycle with real-time predictive capability, based on data from the humans "most like us", closest to "our normal". Usage would grow purely by virtue of its success at delivering what is most likely to benefit the individual for whom the computational question is being asked. If records were lost, they would be lost to all, and easily replaceable by introducing the data anew from private records.
This would not require an organisation to provide access, since access is defined by the information voluntarily given by the individual to the service provider. Wallets would be seen as holding packets of personal data: a cluster of data an individual has chosen to make available for analysis by a company they are enlisting to optimise their lived experience. This could potentially include API access to their personal data held on other platforms' databases, or a two-way exchange of archetypes.
Beyond the significant but logical leaps taken by the principles that "that which is normal does not exist but guides us" and "a thing can only ever be in one place in time and space", the proposal's moral structure safeguards all notions of individual privacy exquisitely.
Because data is collected on the basis of treatment improvements, continuation (i.e. sequencing) occurs only while there is satisfaction with the care (until completion). Where events do not go well, they are treated as "adverse events" and offer a learning opportunity to technicians and archetypes. Their occurrence triggers an instant cessation of the actions undertaken and a review of the decision-making. As such, this system cannot pursue "the outcome that would make matters worse", which offers further ethical solace.
Post-rationalisation
Historically, and on balance, the healthcare and health-research markets have been densely populated by people with high internal motivation and relatively little external incentive. Financial markets, on the other hand, have had strong incentives to acquire high-fidelity prediction models and have produced derivative equations that successfully enable an intimate understanding of the "health" of a company from numbers alone.
In one sense, the analogy and the projected success of this application are significant; yet the only innovation required was to make a human's movement pattern a reference data object.
This serves only to rationalise the elegant emergence of this theory through a healthcare proposal: the juxtaposition of a coldly motivated fiscal tool and an ethereal digital concept with the warmth, intimacy and inaccessibility of the human experience. It does so by visualising the individual under evaluation and making their individual search perspective the computational one, asking: "What, if I had access to all the information in the world, has been or worked best for other people most like me out there?"