HEALTHCARE

The Bioingine.com: “HDN = Semantic Knowledge + General Graph + Probability = Best Decision Making”


METHODS USED IN THE BIOINGINE APPROACH: ROOTS OF THE HYPERBOLIC DIRAC NETWORK (HDN) – Dr. Barry Robson

General Approach: Solving the Representation and Use of Knowledge for the Real World.

Blending Systematically Produced and Unsystematically Existing Information and Synthesizing the Knowledge.

The area of our efforts in the support of healthcare and biomedicine is essentially one in Artificial Intelligence (AI). For us, however, this means a semantic knowledge engineering approach intimately combined with principles of probability theory, information theory, number theory, theoretical physics, data analytic principles, and even linguistic theory. These contributions, and their unification in the manner described briefly below, constitute the general theory of an entity called the Hyperbolic Dirac Net (HDN), a means of representing and probabilistically quantifying networks of knowledge of both a simple probabilistic, and an even more sophisticated probabilistic semantic, nature in a way that has not been possible for previous approaches. It provides the core methodology for making use of medical knowledge in the face of considerable uncertainty and risk in the practice of medicine, and not least for the need to manage massive amounts of diverse data, including both structured data and unstructured natural language text. As described here, the ability of the HDN and its supporting Q-UEL language to handle also the kinds of interactions between things that we describe in natural language by using verbs and prepositions, to take account of the complex lacework of interactions between things, and to do so when our knowledge is of a probabilistic character, is of pressing and crucial importance to the development of a higher level of information technology in many fields, but particularly in medicine.

In a single unified strike, the mathematics of the HDN, adapted in a virtually seamless and natural way from a standard in physics due to Nobel Laureate Paul Dirac as discussed below, addresses several deficiencies (both well known and less well advertised) in current forms of automated inference. These deficiencies largely relate to assumptions and representations that are not fully representative of the real world. They are touched upon later below, but the general one of most strategic force is as follows. As is emphasized and discussed here, of essential importance to modern developments in many industries and disciplines, and not least in medicine, is the capture of large amounts of knowledge in what we call a Knowledge Representation Store (KRS)[1]. Each entry or element in such a store is a statement about the world. Whatever the name, the captured knowledge includes basic facts and definitions about the world in general, but also knowledge about specific cases (looking more like what is often meant by “data”), such as a record about the medical status of a patient or a population. From such a repository of knowledge, general and specific, end users can invoke automated reasoning and inference to predict, aid decision making, and move forward acting on current best evidence. Wide acceptance and pressing need are demonstrated (see below) by numerous efforts, from the earliest expert systems to the emerging Semantic Web, an international effort to link not just web pages (as with the World Wide Web) but also data and knowledge, and by comparable efforts such as the Never-Ending Language Learning system (NELL) at Carnegie Mellon University. The problem is that there is no single agreed way to actually use such a knowledge store in automated reasoning and inference, especially when uncertainty is involved.

In part this problem is perhaps because there is the sense that something deep is still missing in what we mean by “Artificial Intelligence” (AI), and in part because of the lack of agreement on how to reason over connections of knowledge represented as a general graph. The latter is even to the extent that the popular Bayes Net is, by its original definition, a directed acyclic graph (DAG) that ignores or denies cyclic paths in knowledge networks, in stark contrast to the multiple interactions in a “mind map” concept map in student study notes, a subway map, biochemical pathways, physiological interactions, the wiring of the human brain, and the network of interactions in ecology. Primarily, however, the difficulty is that the elements of knowledge in the Semantic Web and other KRS-like efforts are for the most part presented as authoritative assertions rather than treated probabilistically. This is despite the fact that the pioneering expert systems for medicine needed from the outset to be essentially probabilistic in order to manage uncertainty in the knowledge used to make decisions and in the combining of it, and to deduce the most probable diagnosis and select the best therapy amongst many initial options, although here too there is a lack of agreement, and almost every new method represented a different perception and use of uncertainty. Many of these aspects, such as the use of a deeper theory or the arrangement of knowledge elements into a general graph, might be addressed in the way a standard repository of knowledge is used, i.e. applied after a KRS is formed, but a proper and efficient treatment can only associate probability with the elements of represented knowledge from the outset (even though, like any aspect of knowledge, the probabilities should be allowed to evolve by refinement and updating). One cannot apply a probabilistic logic without probabilities in the axioms, or at least not to any advantage. Further, it makes no sense to have elements of knowledge, however they are used, that state unequivocally that some things are true, e.g. that obese patients are type 2 diabetics, because it is a matter of probability, in this case describing the scope of applicability of the statement to patients, i.e. only some 20–30% are so. Indeed, in that case, using only certainty or near-certainty, this medically significant association might never have appeared as a statement in the first place. Note that the importance of probabilistic thinking is also exemplified here by the fact that the reader may have been expecting or thinking in terms of “type 2 patients are obese”, which is not the same thing and has a probability of about 90%, closer to certainty, but noticeably still not 100%. All the above aspects, including the latter “two way” diabetes example, relate to matters that are directly relevant to, and the differentiating features of, an HDN. The world that humans perceive is full of interactions in all directions, yet full of uncertainty, so we cannot only say that

“HDN = Semantic Knowledge + General Graph + Probability = Best Decision Making”

but also that any alternative method runs the risk of being seriously wrong, or severely approximate, if it ignores any of semantic knowledge, the general graph, or probability. For example, the popular Bayes Net, as discussed below, is probabilistic, but it uses only conditional and prior probabilities as knowledge and is a very restricted form of graph. Conversely, an approach like that of IBM’s well-known Watson is clearly limited, and leaves a great deal to be sifted, corrected, and reasoned about by the user, if it is primarily a matter of “a super search engine” rather than of inferring from an intricate lacework of probabilistic interactions. Importantly, even if it might be argued that some areas of science and industry can for the most part avoid such subtleties relating to probability, it is certainly not true in medicine, as the above diabetes example illustrates. From the earliest days of clinical decision support it clearly made no sense to pick, for example, “a most true diagnosis” from a set of possible diagnoses each registered only, on the evidence available so far, as true or false. What is vitally important to medicine is a semantic system of the kind the real world merits, one capable of handling degree of truth and uncertainty in a quantitative way. Our larger approach, additionally building on semantic and linguistic theory, can reasonably be called probabilistic semantics. By knowledge in an HDN we also mean semantic knowledge in general, including that expressed by statements whose relationships are verbs of action. In order to be able also to draw upon the preexisting Semantic Web and other efforts that contain such statements, however, the HDN approach is capable of making use of knowledge represented as certain[2].
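To make the “two way” diabetes example concrete, here is a minimal Python sketch (not the actual HDN or Q-UEL machinery) showing that the two directions of the statement are different conditional probabilities linked by Bayes’ rule, and that a knowledge element can carry both directions as a pair. The prevalence figures p_obese and p_t2d are illustrative assumptions chosen only so that the reverse direction lands in the 20–30% range quoted above.

```python
# Minimal sketch (not the actual HDN/Q-UEL machinery): the two directions of the
# obesity/diabetes statement are different conditional probabilities linked by Bayes' rule.
# The prevalence figures below are illustrative assumptions chosen only so that the
# reverse direction lands in the ~20-30% range quoted in the text.

def bayes_reverse(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

p_obese = 0.35            # assumed population prevalence of obesity
p_t2d = 0.09              # assumed population prevalence of type 2 diabetes
p_obese_given_t2d = 0.90  # "type 2 patients are obese" (~90%, from the text)

# "obese patients are type 2 diabetics" is the other direction of the same association:
p_t2d_given_obese = bayes_reverse(p_obese_given_t2d, p_t2d, p_obese)
print(f"P(type 2 | obese) ~ {p_t2d_given_obese:.2f}")  # ~0.23, i.e. the 20-30% in the text

# An HDN-style knowledge element keeps BOTH directions together as a dual pair,
# rather than committing to a single arrow as a directed acyclic graph model would.
dual = (p_t2d_given_obese, p_obese_given_t2d)
print("dual probabilities for <obese patient | is | type 2 diabetic>:", dual)
```

The point of the pair is simply that neither direction alone captures the association; the full HDN mathematics, adapted from Dirac’s formalism, generalizes this idea well beyond a single Bayes inversion.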

Knowledge, and reasoning from it, does not stand alone from the rest of information management in the domain that generates and uses it, and this is a matter to be seriously attended to when, in comparison with many other industries such as finance, interoperability and universally accepted standards are lacking. Importantly, the application of our approach, and our strategy for healthcare and biomedicine, covers a variety of areas in healthcare information technology that we have addressed as proofs of concept in software development, welded into a single focus by a unification made possible through the above theoretical and methodological principles. These areas include digital patient records, privacy and consent mechanisms, clinical decision support, and translational research (i.e. getting the results of relevant biomedical research, such as new genomics findings, to physicians faster). All of these are obviously required to provide information for actions taken by physicians and other medical workers, but the broad sweep is also essential because no aspect stands alone: there has been a need for new semantic principles, based on the core features of the AI approach, to achieve interoperability and universal exchange.

  1. There are various terms for such a knowledge store. “Knowledge Representation Store” is actually our term emphasizing that it is (in our view) analogous to human memory as enabled and utilized by human thought and language, but now in a representation that computers can readily read directly and use efficiently (while in our case also remaining readable directly by humans in a natural way).
  2. In such cases, probability one (P=1) is the obvious assignment, but strictly speaking in our approach this technically means that it is an assertion that awaits refutation, in the manner of the philosophy of Karl Popper, and consistent with information theory in which the information content I of any statement of probability P is I = -ln(P), i.e. we find information I=0 when probability P=1. A definition such as “cats are mammals” seems an exception, but then, as long as it stands as a definition, it will not be refuted.
  3. These are the rise of medical IT (and AI in general) as the next “Toffler wave of industry”, the urgent need to greatly reduce inefficiency and the high rate of medical error, especially considering the strain on healthcare systems from the booming elderly population, the rise of genomics and personalized medicine, their impact on the pharmaceutical industry, belief systems and ethics, and their impact on the increased need for management of privacy and consent.


Part A – Healthcare Interoperability Measures: Cartesian Dilemma (Diagnosis)

Passages in blue in the content below are reproduced from the referenced links.

Definition of the Cartesian Dilemma, per Christopher Alexander

(what the eye sees and what the mind sees are two different things)

Cartesian Dilemma

http://www.worldsystema.com/worldsystema/2011/10/christopher-alexander-templeto-1.html

From above link

“Alexander has been inexorably led to the revolutionary necessity of revising our basic picture of the universe to include a conception of the personal nature of order and our belonging to the world in which the wholeness of space and the extent to which it is alive is perceived as rooted in the plenum behind the visible universe, “the luminous ground” that holds us all. This form of extended objective truth will ultimately resolve our Cartesian dilemma by teaching us a new view of order and a new cosmology in which objective reality “out there” and a personal reality “in here” are thoroughly connected and the bifurcation of nature healed.”

“To Rene Descartes the “Method” (1638) was a convenient mental trick but its success has left us with a mindset that conceives of the universe as a machine without any intrinsic value: the realms of human experience and of feeling are simply absent from the Cartesian world. Whilst inspiring generations of architects and many others from all walks of life concerned with the fate of the earth, Alexander’s ultimately life changing work has understandably provoked powerful opposition from those invested within the establishment of the old paradigm. Social disorder, mental illness, ecological degradation, these and many other problems are due to a misunderstanding of the structure of matter and the nature of the universe and, until quite recently, there has been no coherent way of explaining the order that we respond to and love in nature.”

———————————————————————-

The Affordable Care Act and the HITECH Act led to the EHR Incentive Program. Under the EHR Incentive Program, CMS has already paid out more than $24 billion to eligible participants. Whether it has driven, or will drive, the envisioned healthcare interoperability still remains a big question. Specifically, will it be possible to mine the millions of records and discover opportunities for improvement? Without emphasis on clinical decision support, will it be possible to achieve efficacy in healthcare delivery, while also advancing the opportunities for “pay for performance” outcomes?

To advance EHR adoption in the healthcare ecosystem, CMS proposed the formation of Accountable Care Organizations (ACOs).

https://www.cms.gov/Newsroom/MediaReleaseDatabase/Fact-sheets/2011-Fact-sheets-items/2011-12-19.html

From the above link

“The Pioneer ACO Model is designed for health care organizations and providers that are already experienced in coordinating care for patients across care settings. It will allow these provider groups to move more rapidly from a shared savings payment model to a population-based payment model on a track consistent with, but separate from, the Medicare Shared Savings Program. And it is designed to work in coordination with private payers by aligning provider incentives, which will improve quality and health outcomes for patients across the ACO, and achieve cost savings for Medicare, employers and patients.”

Importantly, CMS proposed a roadmap for EHR adoption based on the three Meaningful Use (MU) stages, in the hope of advancing interoperability in the healthcare ecosystem and ultimately achieving a performance-driven model, where the payment model shifts from “pay for service” towards “pay for performance”. Looking at the healthcare ecosystem, one must take note that achieving efficiency lies in healthcare management, while achieving efficacy lies in healthcare delivery.

You will see at the end of the discussion that the efforts of the EHR Incentive Program somehow lay more emphasis on healthcare efficiency without paying the required attention to clinical efficacy. This leads to a systemic entropic discontinuity that can be described by analogy with Boltzmann’s entropy formula.

This results in a missed Line of Sight, where the established “objectives” at the IT / EHR level do not deliver all of the required “business capabilities” (the output), and hence the desired “transformative outcomes” are not realized.

https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula

From the above link:-

“In statistical mechanics, Boltzmann’s equation is a probability equation relating the entropy S of an ideal gas (or, here, the healthcare ecosystem) to the quantity W, which is the number of microstates corresponding to a given macrostate.”
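For reference, the Boltzmann entropy formula from the cited article is:

```latex
S = k_B \ln W
```

where $S$ is the entropy of a macrostate, $W$ is the number of microstates consistent with that macrostate, and $k_B \approx 1.38 \times 10^{-23}\,\mathrm{J/K}$ is the Boltzmann constant. The analogy drawn in this post is that many different microstate-level arrangements of clinical activity can sit behind a single macrostate-level measure of healthcare efficiency.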

The following are the EHR adoption Meaningful Use stages:

MU Stage 1: Achieves electronic capture of patient data (data capture and sharing)

MU Stage 2: Achieves Health Information Exchange (advances coordinated clinical processes)

MU Stage 3: Targets improved outcomes (achieved by moving the payment model from pay for service to pay for performance)

The eligible participants (physicians, hospitals, and ACOs) have to demonstrate that they have met the MU criteria in stages. First of all, they must show that the data being captured adhere to a prescribed format; this is ascertained by MU attestation.

Additionally, the eligible participants are required to submit quality measures reports to CMS:

https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/sharedsavingsprogram/Quality_Measures_Standards.html

From the above link

“Quality Measures and Performance Standards

Quality data reporting and collection support quality measurement, an important part of the Shared Savings Program. Before an ACO can share in any savings generated, it must demonstrate that it met the quality performance standard for that year. There are also interactions between ACO quality reporting and other CMS initiatives, particularly the Physician Quality Reporting System (PQRS) and meaningful use. The sections below provide resources related to the program’s 33 quality measures, which span four quality domains: Patient / Caregiver Experience, Care Coordination / Patient Safety, Preventive Health, and At-Risk Population. Of the 33 measures, 7 measures of patient / caregiver experience are collected via the CAHPS survey, 3 are calculated via claims, 1 is calculated from Medicare and Medicaid Electronic Health Record (EHR) Incentive Program data, and 22 are collected via the ACO Group Practice Reporting Option (GPRO) Web Interface.”

https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/index.html?redirect=/QUALITYMEASURES/

National Quality Forum (NQF)-endorsed measures feed the following CMS reporting programs:

  • The Hospital Inpatient Quality Reporting (IQR) Program,
  • The Hospital Outpatient Quality Reporting (OQR) Program,
  • The Physician Quality Reporting System (PQRS), and
  • Others as directed by CMS, such as long-term care settings and ambulatory care settings

CMS quality reporting is based on a schema derived from HL7, termed QRDA (Quality Reporting Document Architecture).

https://www.cms.gov/regulations-and-guidance/legislation/ehrincentiveprograms/downloads/qrda_ep_hqr_guide_2015.pdf

From the above link

Overview of QRDA

“The Health Level Seven International (HL7) QRDA is a standard document format for the exchange of electronic clinical quality measure (eCQM) data. QRDA reports contain data extracted from electronic health records (EHRs) and other information technology systems. QRDA reports are used for the exchange of eCQM data between systems for a variety of quality measurement and reporting initiatives, such as the Centers for Medicare & Medicaid Services (CMS) EHR Incentive Program: Meaningful Use Stage 2 (MU2).

The Office of the National Coordinator for Health Information Technology (ONC) adopted QRDA as the standard to support both QRDA Category I (individual patient) and QRDA Category III (aggregate) data submission approaches for MU2 through final rulemaking in September 2012. CMS and ONC subsequently released an interim final rule in December 2012 that replaced the QRDA Category III standard adopted in the September 2012 final rule with an updated version of the standard. QRDA Category I and III implementation guides (IGs) are Draft Standards for Trial Use (DSTUs). DSTUs are issued at a point in the standards development life cycle when many, but not all, of the guiding requirements have been clarified. A DSTU is tested and then taken back through the HL7 ballot process to be formalized into an American National Standards Institute (ANSI)-accredited normative standard.

QRDA is a subset of the HL7 CDA standard: QRDA is a constraint on the HL7 Clinical Document Architecture (CDA), a document markup standard that specifies the structure and semantics of clinical documents for the purpose of exchange. To streamline implementations, QRDA makes use of CDA templates, which are business rules for representing clinical data consistently. Many QRDA templates are reused from the HL7 Consolidated CDA (C-CDA) standard, which contains a library of commonly used templates that have been harmonized for MU2. Templates defined in the QRDA Category I and III IGs enable consistent representations of quality reporting data to streamline implementations and promote interoperability.”
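As a rough illustration of what “a constraint on CDA” means in practice, the sketch below reads a QRDA Category I file as ordinary XML in the HL7 v3 namespace and lists its document-level template identifiers. The file name is hypothetical and the elements pulled out are only a fragment of the CDA header; a real implementation would follow the full QRDA Category I implementation guide templates.

```python
# Rough sketch of reading a QRDA Category I (CDA-based) document as plain XML using only
# the Python standard library. The file name is hypothetical, and the elements shown are
# only a fragment of the CDA header; a real implementation would follow the full
# HL7 QRDA Category I implementation guide templates.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # CDA/QRDA documents live in the HL7 v3 namespace

tree = ET.parse("qrda_category1_example.xml")  # hypothetical file name
root = tree.getroot()                          # <ClinicalDocument> is the CDA root element

# Document-level templateId elements declare which QRDA templates the document claims to follow.
for tid in root.findall("hl7:templateId", NS):
    print("templateId:", tid.get("root"))

# Patient name, per the CDA header model (recordTarget/patientRole/patient/name).
name = root.find("hl7:recordTarget/hl7:patientRole/hl7:patient/hl7:name", NS)
if name is not None:
    print("patient:", " ".join((part.text or "").strip() for part in name))
```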

In contrast, we have the Office of the National Coordinator for Health Information Technology (ONC) stipulating and regulating standards to achieve healthcare interoperability.

The ONC Roadmap vision is in the link below:

https://www.healthit.gov/policy-researchers-implementers/interoperability

From above link:-

Sadly, although evidence-based medicine is discussed, data mining and concerns around algorithm development are missing.

“Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap version 1.0 (Roadmap) [PDF – 3.7 MB] supports the vision that ONC outlined in Connecting Health and Care for the Nation: A 10 Year Vision to Achieve An Interoperable Health IT Infrastructure [PDF – 607 KB]. The Roadmap, shaped by stakeholder input, lays out a clear path to catalyze the collaboration of stakeholders who are going to build and use the health IT infrastructure. The collaborative efforts of stakeholders is crucial to achieving the vision of a learning health system where individuals are at the center of their care; providers have a seamless ability to securely access and use health information from different sources; an individual’s health information is not limited to what is stored in electronic health records (EHRs), but includes information from many different sources and portrays a longitudinal picture of their health, not just episodes of care; and where public health agencies and researchers can rapidly learn, develop, and deliver cutting edge treatments.”

http://www.healthit.gov/buzz-blog/from-the-onc-desk/onc-interoperability-roadmap-update/

There is no doubt that ONC aspires to achieve true healthcare interoperability by bringing more clarity to Health Information Exchange (HIE), as discussed in the link below.

Interoperability vs Health Information Exchange: Setting the Record Straight

ONC has under its purview the Office of Standards and Technology, which drives the interoperability standards, and it acknowledges that there are numerous challenges in realizing the ONC roadmap, as discussed in the link below.

Interoperability Standards – Shades of Gray

ONC also specifies a roadmap for achieving the MU stages for physicians, hospitals, and ACOs (HIE):
https://www.healthit.gov/providers-professionals/ehr-implementation-steps/step-5-achieve-meaningful-use

Specifically, for semantic interoperability it recommends the Consolidated Clinical Document Architecture (C-CDA).

https://www.healthit.gov/policy-researchers-implementers/consolidated-cda-overview

C-CDA helps in representing a comprehensive view of the patient: a complete birth-to-death, longitudinal record.

The ONC interoperability specification also addresses the following three levels (not adequate to achieve an EBM-driven CDSS):

There are three levels of health information technology interoperability:  1) Foundational; 2) Structural; and 3) Semantic.

1 – “Foundational” interoperability allows data exchange from one information technology system to be received by another and does not require the ability for the receiving information technology system to interpret the data.

2 – “Structural” interoperability is an intermediate level that defines the structure or format of data exchange (i.e., the message format standards) where there is uniform movement of healthcare data from one system to another such that the clinical or operational purpose and meaning of the data is preserved and unaltered. Structural interoperability defines the syntax of the data exchange. It ensures that data exchanges between information technology systems can be interpreted at the data field level.

3 – “Semantic” interoperability provides interoperability at the highest level, which is the ability of two or more systems or elements to exchange information and to use the information that has been exchanged. Semantic interoperability takes advantage of both the structuring of the data exchange and the codification of the data including vocabulary so that the receiving information technology systems can interpret the data. This level of interoperability supports the electronic exchange of patient summary information among caregivers and other authorized parties via potentially disparate electronic health record (EHR) systems and other systems to improve quality, safety, efficiency, and efficacy of healthcare delivery.

Desired or Recommended 2nd Order Semantic Interoperability

Probabilistic Ontology Driven Knowledge Engineering

Ref:- http://www.ncbi.nlm.nih.gov/pubmed/22269224

Chronically ill patients are complex health care cases that require the coordinated interaction of multiple professionals. A correct intervention for these sorts of patients entails the accurate analysis of the conditions of each concrete patient and the adaptation of evidence-based standard intervention plans to those conditions. There are also other clinical circumstances, such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases, or prevention, whose detection depends on the deductive capacities of the professionals involved.

<diagnosis> <procedures> <outcomes> [triple store]
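A toy sketch of this triple-store idea, with invented data, is shown below: patient records reduced to <diagnosis, procedure, outcome> triples from which conditional probabilities can be estimated, which is the kind of probabilistic knowledge a 2nd-order semantic layer would need.

```python
# Toy illustration with invented data (not a real ontology or patient data set):
# patient records reduced to <diagnosis, procedure, outcome> triples, from which
# conditional probabilities can be estimated for probabilistic knowledge engineering.

triples = [
    ("type2_diabetes", "metformin",      "hba1c_controlled"),
    ("type2_diabetes", "metformin",      "hba1c_uncontrolled"),
    ("type2_diabetes", "metformin",      "hba1c_controlled"),
    ("type2_diabetes", "lifestyle_only", "hba1c_controlled"),
    ("hypertension",   "ace_inhibitor",  "bp_controlled"),
]

def p_outcome_given(diagnosis: str, procedure: str, outcome: str) -> float:
    """Estimate P(outcome | diagnosis, procedure) by counting matching triples."""
    matching = [t for t in triples if t[0] == diagnosis and t[1] == procedure]
    if not matching:
        return 0.0
    return sum(1 for t in matching if t[2] == outcome) / len(matching)

# On this toy data: 2 of the 3 metformin triples end in control, so ~0.67.
print(p_outcome_given("type2_diabetes", "metformin", "hba1c_controlled"))
```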

Conclusion:

From the above points it must be noted that QRDA and C-CDA achieve different things. Unfortunately, CMS runs the EHR Incentive Program against MU attestation and quality reports that are filed by the eligible participants (physicians, hospitals, and ACOs) based on QRDA (especially PQRS). Whereas in the MU2 stage (as per ONC), the participants are also required to demonstrate that they have achieved interoperability within the ACO while implementing HIE, and this requires C-CDA. This stage must demonstrate that coordinated clinical processes have been achieved.

Also, a clinical decision support system (CDSS) has to be established addressing at least five critical or clinical priority areas. Unfortunately, this particular capability does not seem to be addressed adequately by the ACOs, who only pursue demonstrating that quality measures have been achieved, which does not necessarily mean that clinical efficacy has been addressed.

It seems an important architectural problem has been glossed over by the policy designers, who proposed the quality measures model with the motivation of capturing metrics that eventually demonstrate “pay for performance”, and somehow assumed that the proposed metrics based on QRDA also demonstrate that clinical efficacy has been achieved. This leads into a systemic entropic discontinuity, where the effort at the macrostate, which represents healthcare management leading into healthcare efficiency, is not necessarily a cumulative realization of the efforts at the microstates, which represent gaining clinical efficacy. This entropic discontinuity between the macrostate and the microstates is described, by analogy, by Boltzmann’s entropy formula.

The link below gives more discussion of microstates and macrostates within a complex system. In essence, for a given complex system, and for all the effort applied as input, entropy arrests part of that effort and creates loss, so the output is actually produced at a loss. This means the systemic efficiency incurred losses and did not realize all the benefits that should arise from clinical efficacy. This is a modeling problem that inaccurately represents the “phenomenon of interest”.

https://books.google.com/books?id=dAhQBAAAQBAJ&pg=PT295&lpg=PT295&dq=boltzmann+constant+macro+state&source=bl&ots=ubpGEUymWc&sig=cQ4Nz9f6OA0ryDGEupOHDUAyiRc&hl=en&sa=X&ved=0CCwQ6AEwA2oVChMI0qeqv4G4yAIVCzo-Ch07WAkU#v=onepage&q=boltzmann%20constant%20macro%20state&f=false

To achieve clinical decision support system capability, which plays a very important role in enhancing clinical efficacy, developing a data-mining-driven Evidence Based Medicine (EBM) capability is imperative. This capability does not seem to be achieved, because most HIE / ACO implementations are being developed around QRDA. Although this is discussed in the ONC Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap version 1.0 (Roadmap) [PDF – 3.7 MB], unless the data-mining-related algorithmic challenges are addressed, which means standards beyond the mere capture of the required data fields, interoperability efforts will be in vain.
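As a hedged illustration of the kind of mined evidence such a data-mining-driven EBM capability would hand to a CDSS, the sketch below computes a relative risk from a 2×2 count table; the counts and the alert threshold are invented for demonstration only.

```python
# Hedged sketch of the kind of mined evidence a data-mining-driven EBM capability
# could hand to a CDSS rule: a 2x2 count table from patient records and the relative
# risk it implies. The counts and the alert threshold are invented for illustration.

def relative_risk(exposed_events: int, exposed_total: int,
                  control_events: int, control_total: int) -> float:
    """RR = risk in the exposed group / risk in the control group."""
    return (exposed_events / exposed_total) / (control_events / control_total)

# Example: 30-day readmission, discharged WITHOUT medication reconciliation (exposed)
# versus WITH medication reconciliation (control).
rr = relative_risk(exposed_events=40, exposed_total=400,   # 10% readmitted
                   control_events=24, control_total=400)   # 6% readmitted
print(f"relative risk ~ {rr:.2f}")  # ~1.67 on these invented counts

# A CDSS rule might surface an alert once the mined evidence crosses a chosen threshold:
if rr > 1.5:
    print("CDSS suggestion: perform medication reconciliation at discharge")
```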

The role of EBM in achieving CDSS is discussed on the following sites:

ONC (HealthIT.gov) site

https://www.healthit.gov/providers-professionals/achieve-meaningful-use/core-measures/clinical-decision-support-rule

NIH Site

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC130063/

As such, it must be noted that preventable medical errors are among the highest risks, having become the No. 3 killer in the US.

http://www.healthcareitnews.com/news/deaths-by-medical-mistakes-hit-records

From above link

“It’s a chilling reality – one often overlooked in annual mortality statistics: Preventable medical errors persist as the No. 3 killer in the U.S. – third only to heart disease and cancer – claiming the lives of some 400,000 people each year. At a Senate hearing Thursday, patient safety officials put their best ideas forward on how to solve the crisis, with IT often at the center of discussions.”

P.S:-

Bioingine (www.bioingine.com), a cognitive computing platform, transforms the patient information (millions of records) created by the HIE into an ecosystem knowledge landscape that is inherently evidence based, allowing for the study of tacit knowledge discovered from millions of patient records (large data sets) by mining and knowledge inference in an automated way. This is achieved by employing AI, machine learning, and similar techniques, thereby creating a clinical decision support system.