CLINICAL ERRORS

2nd Order Semantic Web and A.I.-Driven Reasoning – A Crusade of 300-Plus Years

Bioingine.com | Ingine Inc


Chronology of Development of Hyperbolic Dirac Net (HDN) Inference. 

https://en.wikipedia.org/wiki/Thomas_Bayes

From Above Link:-

1. 1763. Thomas Bayes was an English statistician, philosopher and Presbyterian minister who is known for having formulated a specific case of the theorem that bears his name: Bayes’ theorem.

Bayes’s solution to a problem of inverse probability was presented in “An Essay towards solving a Problem in the Doctrine of Chances”, which was read to the Royal Society in 1763 after Bayes’s death.

https://en.wikipedia.org/wiki/Bayes%27_theorem

From Above Link:-

In probability theory and statistics, Bayes’ theorem (alternatively Bayes’ law or Bayes’ rule) describes the probability of an event, based on conditions that might be related to the event.

When applied, the probabilities involved in Bayes’ theorem may have different probability interpretations. In one of these interpretations, the theorem is used directly as part of a particular approach to statistical inference. With the Bayesian probability interpretation the theorem expresses how a subjective degree of belief should rationally change to account for evidence: this is Bayesian inference, which is fundamental to Bayesian statistics. However, Bayes’ theorem has applications in a wide range of calculations involving probabilities, not just in Bayesian inference.
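For reference, the theorem itself is compact. In the usual notation, for a hypothesis H and evidence E:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

Here P(H) is the prior degree of belief and P(H | E) is the posterior belief once the evidence is taken into account.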

https://en.wikipedia.org/wiki/Bayesian_inference

From Above Link:-

Bayesian inference is a method of statistical inference in which Bayes’ theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called “Bayesian probability”.

2. 1859. Georg Friedrich Bernhard Riemann proposed the Riemann zeta function, a function useful in number theory for investigating properties of prime numbers. Written as ζ(x), it was originally defined as the infinite series

ζ(x) = 1 + 2^(−x) + 3^(−x) + 4^(−x) + ⋯.

The theory should perhaps be distinguished from an existing purely number-theoretic area sometimes also known as Zeta Theory, which focuses on the Riemann zeta function and the ways in which it governs the distribution of prime numbers.

http://mathworld.wolfram.com/RiemannZetaFunction.html

The Riemann zeta function is an extremely important special function of mathematics and physics that arises in definite integration and is intimately related with very deep results surrounding the prime number theorem. While many of the properties of this function have been investigated, there remain important fundamental conjectures (most notably the Riemann hypothesis) that remain unproved to this day. The Riemann zeta function is defined over the complex plane for one complex variable, which is conventionally denoted s (instead of the usual z) in deference to the notation used by Riemann in his 1859 paper that founded the study of this function (Riemann 1859). It is implemented in the Wolfram Language as Zeta[s].
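The link to prime numbers alluded to above is made explicit by Euler’s product formula, valid for Re(s) > 1:

$$\zeta(s) = \sum_{n=1}^{\infty} n^{-s} = \prod_{p\ \mathrm{prime}} \frac{1}{1 - p^{-s}}$$

so statements about ζ translate directly into statements about the distribution of the primes.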

3. 1900. Ramanujan’s mathematical work was primarily in the areas of number theory and classical analysis. In particular, he worked extensively with infinite series, integrals, continued fractions, modular forms, q-series, theta functions, elliptic functions, the Riemann Zeta-Function, and other special functions.

Hardy wrote in Ramanujan’s obituary [14]:

There is always more in one of Ramanujan’s formulae than meets the eye, as anyone who sets to work to verify those which look the easiest will soon discover. In some the interest lies very deep, in others comparatively near the surface; but there is not one which is not curious and entertaining.

http://www.integralworld.net/collins18.html

From above link :-

Now there is a famous account of the gifted Indian mathematician Ramanujan who, when writing to Hardy at Cambridge regarding his early findings, included the seemingly nonsensical result,

1 + 2 + 3 + 4 + … (to infinity) = −1/12.

Initially Hardy was inclined to think that he was dealing with a fraud, but on further reflection realized that Ramanujan was in fact describing the Riemann zeta function (for s = −1). He could then appreciate his brilliance as one who, though considerably isolated and without any formal training, had independently covered much of the same ground as Riemann.

However, it still begs the question as to what the actual meaning of such a result can be, for in the standard conventional manner of mathematical interpretation the sum of the series of natural numbers clearly diverges.

The startling fact is that this result – though indirectly expressed in a quantitative manner – actually expresses a qualitative type relationship (pertaining to holistic mathematical interpretation).
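In modern terms the reconciliation is that the series defines ζ(s) only for Re(s) > 1; analytic continuation extends the function to the rest of the complex plane, and it is the continued function, not the divergent series, that gives

$$\zeta(-1) = -\tfrac{1}{12}$$

which is the value Ramanujan was reporting.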

Uncovering Ramanujan’s “Lost” Notebook: An Oral History

http://arxiv.org/pdf/1208.2694.pdf

ROBERT P. SCHNEIDER

From above link :-

Whereas Ramanujan’s earlier work dealt largely with classical number-theoretic objects such as q-series, theta functions, partitions and prime numbers—exotic, startling, breathtaking identities built up from infinite series, integrals and continued fractions—in these newfound papers, Andrews found never-before-seen work on the mysterious “mock theta functions” hinted at in a letter written to Hardy in Ramanujan’s final months, pointing to realms at the very edge of the mathematical landscape. The content of Ramanujan’s lost notebook is too rich, too ornate, too strange to be developed within the scope of the present article. We provide a handful of stunning examples below, intended only to tantalize—perhaps mystify—the reader, who is encouraged to let his or her eyes wander across the page, picking patterns like spring flowers from the wild field of symbols.

The following are two fantastic q-series identities found in the lost notebook, published by Andrews soon after his discovery, in which q is taken to be a complex number with |q| < 1.

Another surprising expression involves an example of a mock theta function provided by Ramanujan in the final letter he sent to Hardy

In the words of mathematician Ken Ono, a contemporary trailblazer in the field of mock theta functions, “Obviously Ramanujan knew much more than he revealed [14].” Indeed, Ramanujan then “miraculously claimed” that the coefficients of this mock theta function obey the asymptotic relation

The new realms pointed to by the work of Ramanujan’s final year are now understood to be ruled by bizarre mathematical structures known as harmonic Maass forms. This broader perspective was only achieved in the last ten years, and has led to cutting-edge science, ranging from cancer research to the physics of black holes to the completion of group theory. 

Yet details of George Andrews’s unearthing of Ramanujan’s notes are only sparsely sketched in the literature; one can detect but an outline of the tale surrounding one of the most fruitful mathematical discoveries of our era. In hopes of contributing to a more complete picture of this momentous event and its significance, here we weave together excerpts from interviews we conducted with Andrews documenting the memories of his trip to Trinity College, as well as from separate interviews with mathematicians Bruce Berndt and Ken Ono, who have both collaborated with Andrews in proving and extending the contents of Ramanujan’s famous lost notebook.

4. 1913. Élie Joseph Cartan developed the “Theory of Spinors”.

https://archive.org/details/TheTheoryOfSpinors

https://en.wikipedia.org/wiki/Spinor

From above link:-

In geometry and physics, spinors are elements of a (complex) vector space that can be associated with Euclidean space. Like geometric vectors and more general tensors, spinors transform linearly when the Euclidean space is subjected to a slight (infinitesimal) rotation. When a sequence of such small rotations is composed (integrated) to form an overall final rotation, however, the resulting spinor transformation depends on which sequence of small rotations was used, unlike for vectors and tensors. A spinor transforms to its negative when the space is rotated through a complete turn from 0° to 360°, and it is this property that characterizes spinors. It is also possible to associate a substantially similar notion of spinor to Minkowski space, in which case the Lorentz transformations of special relativity play the role of rotations. Spinors were introduced in geometry by Élie Cartan in 1913. In the 1920s physicists discovered that spinors are essential to describe the intrinsic angular momentum, or “spin”, of the electron and other subatomic particles.

5. 1928. Paul A. M. Dirac derived the Dirac equation, which, in particle physics, is a relativistic wave equation.

http://www.mathpages.com/home/kmath654/kmath654.htm

http://mathworld.wolfram.com/DiracEquation.html

From above link:-

The quantum electrodynamical law which applies to spin-1/2 particles and is the relativistic generalization of the Schrödinger equation. In 3+1 dimensions (three space dimensions and one time dimension), it is given by

(iħ γ^μ ∂_μ − mc) ψ = 0

6. 1930. Dirac publishes his book on his pivotal view of quantum mechanics, including his earliest mentions of an operator with the properties of the hyperbolic number h such that hh = +1. It extends the theory of wave mechanics to particle mechanics.
P. A. M. Dirac, The Principles of Quantum Mechanics, First Edition, Oxford University Press, Oxford (1930).
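To make the hyperbolic unit concrete, here is a minimal Python sketch of split-complex arithmetic in which hh = +1 (the class and names are illustrative only, not from any of the codes cited in this chronology):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SplitComplex:
    """A number a + h*b, where the hyperbolic unit h satisfies h*h = +1."""
    a: float  # real part
    b: float  # hyperbolic ("h") part

    def __add__(self, other):
        return SplitComplex(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a1 + h b1)(a2 + h b2) = (a1 a2 + b1 b2) + h (a1 b2 + b1 a2),
        # using h*h = +1 (contrast i*i = -1 for ordinary complex numbers).
        return SplitComplex(self.a * other.a + self.b * other.b,
                            self.a * other.b + self.b * other.a)

h = SplitComplex(0.0, 1.0)
print(h * h)  # SplitComplex(a=1.0, b=0.0), i.e. hh = +1
```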

7. 1933. In his Nobel Prize Dinner speech, Dirac states that mechanical methods are applicable to all forms of human thought where numbers are involved. http://www.nobelprize.org/nobel_prizes/physics/laureates/1933/dirac-speech.html

8. 1939. Dirac publishes his bra-ket notation. It is incorporated into the third edition of his book.

P.A.M. Dirac (1939). A new notation for quantum mechanics, Mathematical Proceedings of the Cambridge Philosophical Society 35 (3): 416–418
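In this notation a “bra” ⟨A| and a “ket” |B⟩ close up into a bracket ⟨A|B⟩, the inner product of two states; in ordinary i-complex quantum mechanics it obeys

$$\langle A \mid B \rangle = \langle B \mid A \rangle^{*}$$

and it is this object, with the hyperbolic h playing the role of i, that the HDN entries below reinterpret as a dual (forward and backward) conditional probability.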

9. 1974. Robson develops his Expected Information approach that preempts the Bayes Net method.

B. Robson, Analysis of the Code Relating Sequence to Conformation in Globular Proteins: Theory and Application of Expected Information, Biochem. J. 141, 853-867 (1974).

10. 1978. The Expected Information approach crystallizes as the GOR method widely used in bioinformatics.

J. Garnier, D. J. Osguthorpe, and B. Robson, Analysis of the Accuracy and Implications of Simple Methods for Predicting the Secondary Structure of Globular Proteins, J. Mol. Biol. 120, 97-120 (1978).


11. 1982. Buchanan and Shortliffe describe the first medical Expert System. It is based on probabilistic statements, but sets a tradition of innovation and diverse controversial methods in automated medical inference.

B. G. Buchanan and E. H. Shortliffe (1982) Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, Addison-Wesley: Reading, Massachusetts.

12. 1985. Pearl gives a full account of the Bayes Net method.

J. Pearl, Probabilistic Reasoning in Intelligent Systems, San Francisco, CA: Morgan Kaufmann (1985).

13. March 1989. Sir Tim Berners-Lee invented the WWW, introducing non-linear linking of information across systems.

Tim laid out his vision for what would become the Web in a document called “Information Management: A Proposal”. Believe it or not, Tim’s initial proposal was not immediately accepted. In fact, his boss at the time, Mike Sendall, noted the words “Vague but exciting” on the cover. The Web was never an official CERN project, but Mike managed to give Tim time to work on it in September 1990. He began work using a NeXT computer, one of Steve Jobs’ early products.

14. 1997. Clifford algebra becomes more widely recognized as a tool for engineers as well as scientists and physicists.

K. Gürlebeck and W. Sprössig, Quaternionic and Clifford Calculus for Physicists and Engineers, Wiley, Chichester (1997).

15. 1999. Tim Berners-Lee described the Semantic Web vision in the following terms:

I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web, the content, links, and transactions between people and computers. A Semantic Web, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The intelligent agents people have touted for ages will finally materialize. (1999)

16. 2000. Khrennikov gives a description of a primarily h-complex quantum mechanics.

A. Khrennikov, Hyperbolic quantum mechanics, Cornell University Library, arXiv:quant-ph/0101002v1 (2000).

17. 2000. Buchholz and Sommer refine work showing that neural networks, as inference systems modeled on the brain, can usefully use the hypercomplex imaginary number h.

S. Buchholz and G. Sommer, A hyperbolic multilayer perceptron, International Joint Conference on Neural Networks (IJCNN 2000), Como, Italy, Vol. 2, pp. 129-133, S.-I. Amari, C. L. Giles, M. Gori and V. Piuri, Eds., IEEE Computer Society Press (2000).

18. 2003. Robson points out that the Expected Information method in bioinformatics is really a use of the partially summated Riemann zeta function, and a best choice for the treatment of sparse data in data mining in general.

B. Robson (2003) “Clinical and Pharmacogenomic Data Mining. 1. The generalized theory of expected information and application to the development of tools”, J. Proteome Res. (Am. Chem. Soc.) 2, 283-301.
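To convey the flavor of the idea (this is an illustrative toy, not Robson’s published estimator): partial sums of the zeta series at s = 1 behave like logarithms for large counts, but remain finite and damped for the sparse counts that break naive log-ratio measures.

```python
import math

def zeta_partial(s: float, n: int) -> float:
    """Partial sum of the Riemann zeta series: 1 + 2^-s + ... + n^-s."""
    return sum(k ** -s for k in range(1, n + 1))

def sparse_association(observed: int, expected: float, s: float = 1.0) -> float:
    """Toy zeta-damped association measure (illustrative only): the
    difference of partial zeta sums for observed vs expected counts
    approaches log(observed/expected) when data are plentiful, but stays
    finite and conservative when counts are sparse."""
    return zeta_partial(s, observed) - zeta_partial(s, round(expected))

print(sparse_association(1000, 500.0), math.log(1000 / 500.0))  # ~0.693 vs 0.693
print(sparse_association(2, 1.0), math.log(2 / 1.0))            # 0.5 (damped) vs 0.693
```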

19. 2003. Nitta shows that the power of the h-complex approach in neural nets is primarily due to its ability to solve the notorious exclusive-or (XOR) logical problem in a single neuron.

T. Nitta, Solving the XOR problem and the detection of symmetry using a single complex-valued neuron, Neural Networks 16:8, 1101-1105 (2003).
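A toy illustration of why a two-component neuron manages XOR where a single real-valued threshold neuron cannot (a simplification in the spirit of Nitta’s result, not his exact construction): the decision boundary of such a neuron is a pair of orthogonal lines rather than a single line.

```python
# Inputs in {-1, +1}. Net input z = x1 + h*x2 (unit weight, zero bias); the
# neuron fires when the real and h parts of z have opposite signs, so its
# decision boundary is the PAIR of lines x1 = 0 and x2 = 0: enough for XOR.
def split_neuron(x1: int, x2: int) -> int:
    real, hpart = x1, x2  # z = x1 + h*x2
    return 1 if real * hpart < 0 else 0

for x1 in (-1, 1):
    for x2 in (-1, 1):
        print((x1, x2), split_neuron(x1, x2))  # fires exactly when x1 != x2
```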

20. 2003. Khrennikov consolidates the notion of an extensively h-complex quantum mechanics, but feels that i-complex, h-complex, and real-world mechanics are three separate systems.

A. Khrennikov, Hyperbolic quantum mechanics, Adv. in Applied Clifford Algebras, Vol. 13, 1 (2003).

21. 2004. Khrennikov notes a possible relation between h-complex quantum mechanics and mental function.

A. Khrennikov, On Quantum-Like Probabilistic Structure of Mental Information, Open Systems & Information Dynamics, Vol. 11, 3, 267-275 (2004).

22. 2004. Rochon shows that the full Riemann zeta function is both i-complex and h-complex.

D. Rochon, A Bicomplex Riemann Zeta Function, Tokyo J. of Math. (2004).

23. 2004. Robson argues that zeta theory is a solution to high dimensionality problems in data mining.

B. Robson, The Dragon on the Gold: Myths and Realities for Data Mining in Biotechnology using Digital and Molecular Libraries, J. Proteome Res. (Am. Chem. Soc.) 3(6), 1113-1119 (2004).

24. 2005. Robson argues that all statements in zeta theory and in prime number theory are really statements relevant to data and data mining, and describes the first link to Dirac’s quantum mechanics and Dirac’s braket notation.

B. Robson, Clinical and Pharmacogenomic Data Mining: 3. Zeta Theory As a General Tactic for Clinical Bioinformatics, J. Proteome Res. (Am. Chem. Soc.) 4(2), 445-455 (2005).


25. 2005. The code CliniMiner/FANO, based on Zeta Theory and prime number theory, is used in a first pioneering effort in data mining a large number of patient records.

I. M. Mullins, M. S. Siadaty, J. Lyman, K. Scully, G. T. Garrett, G. Miller, R. Muller, B. Robson, C. Apte, S. Weiss, I. Rigoutsos, D. Platt, and S. Cohen, Data mining and clinical data repositories: Insights from a 667,000 patient data set, Computers in Biology and Medicine, 36(12) 1351 (2006).


26. 2007. Robson recognizes that the imaginary number required to reconcile zeta theory with quantum mechanics and to allow Dirac notation to be used in inference is the hyperbolic imaginary number h, not the imaginary number i. Unaware of the work of Khrennikov, he makes no Khrennikov-like distinction between h-complex quantum mechanics and the everyday world.

Robson, The New Physician as Unwitting Quantum Mechanic: Is Adapting Dirac’s Inference System Best Practice for Personalized Medicine, Genomics and Proteomics, J. Proteome Res. (Am. Chem. Soc.), Vol. 6, No. 8: 3114-3126 (2007).

27. 2007. Robson presents data mining and inference systems for physician decision support in personalized medicine.


Robson, B. (2007) “Data Mining and Inference Systems for Physician Decision Support in Personalized Medicine” Lecture and Circulated Report at the 1st Annual Total Cancer Care Summit, Bahamas 2007. 


28. 2008. Data Mining techniques using the full i-complex and h-complex zeta function are developed.

B. Robson, Clinical and Pharmacogenomic Data Mining: 4. The FANO Program and Command Set as an Example of Tools for Biomedical Discovery and Evidence Based Medicine, J. Proteome Res., 7(9), pp 3922-3947 (2008).


29. 2008. Nitta and Buchholz explore the decision boundaries of h-complex neural nets.

T. Nitta and S. Buchholz, On the Decision Boundaries of Hyperbolic Neurons, in 2008 International Joint Conference on Neural Networks (IJCNN).


30. 2009. The Semantic Web starts to emerge but runs into a bottleneck regarding the best approach for probabilistic treatment.

L. Predoiu and H. Stuckenschmidt, Probabilistic Models for the SW – A Survey. http://ki.informatik.uni-mannheim.de/fileadmin/publication/Predoiu08Survey.pdf (last accessed 4/29/2010).


31. 2009. Baek and Robson propose that, for reasons of bandwidth limitations and security, the Internet should consist of data-centric computing by smart software robots. Robson indicates that these could be based on h-complex inference systems and link to semantic theory.

Robson, B. and Baek, O. K., The Engines of Hippocrates: From the Dawn of Medicine to Medical and Pharmaceutical Informatics, Wiley, 2009.

Robson B. (2009) “Towards Intelligent Internet-Roaming Agents for Mining and Inference from Medical Data”, Future of Health Technology Congress, Technology and Informatics, Vol. 149, 157-177 IOS Press 

Robson, B. (2009) “Links Between Quantum Physics and Thought” (A. I. Applications in Medicine) , Future of Health Technology Congress, Technology and Informatics, Vol. 149, 157-177 IOS Press. 

32. 2009. Savitha et al. develop new learning algorithms for complex-valued networks.

S. Savitha, S. Suresh, S. Sundararajan, and P. Saratchandran, A new learning algorithm with logarithmic performance index for complex-valued neural networks, Neurocomputing 72 (16-18), 3771-3781 (2009).

33. 2009. Khrennikov argues for the h-complex Hilbert space as providing the “contextual” structure (underlying rationale, hidden variables, etc.) for all quantum mechanics.

A. Khrennikov, Contextual Approach to Quantum Formalism, Springer (2009).

34. 2010. Robson and Vaithiligam describe how zeta theory and h-complex probabilistic algebra can resolve challenges in data mining by the pharmaceutical industry.

B. Robson and A. Vaithiligam, Drug Gold and Data Dragons: Myths and Realities of Data Mining in the Pharmaceutical Industry, pp. 25-85 in Pharmaceutical Data Mining, Ed. K. V. Balakin, John Wiley & Sons (2010).

35. 2010. PCAST. A December report by the US President’s Council of Advisors on Science and Technology calls for an XML-like Universal Exchange Language for medicine, including disaggregation of the patient record on the Internet for patient access, security, and privacy.

http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-health-it- report.pdf 

36. 2011. First description of Q-UEL in response to PCAST 2010.

Robson, B., Balis, U. G. J. and Caruso, T. P. (2011) “Considerations for a Universal Exchange Language for Healthcare.” In Proceedings of 2011 IEEE 13th International Conference on e-Health Networking, Applications and Services (Healthcom 2011), 173-176. Columbus, MO: IEEE, 2011.

37. 2011. Robson and colleagues develop the method of match-and-edit instructions for extracting information from very large numbers of patents.

Robson, B., Li, J., Dettinger, R., Peters, A., and Boyer, S.K. (2011), Drug discovery using very large numbers of patents. General strategy with extensive use of match and edit operations. Journal of Computer-Aided Molecular Design 25(5): 427-441 

38. 2011. Kuroe et al. consolidate the theory of h-complex neural nets.

Y. Kuroe, S. Tanigawa, and H. Iima, Models of Hopfield-Type Clifford Neural Networks and Their Energy Functions – Hyperbolic and Dual Valued Networks, Lecture Notes in Computer Science, 7062, 560 (2011).

39. 2012. Robson argues that h-complex algebra is an appropriate basis for Artificial Intelligence in the Pharmaceutical Industry.

Robson, B. (2012) “Towards Automated Reasoning for Drug Discovery and Pharmaceutical Business Intelligence”, Pharmaceutical Technology and Drug Research, 2012 1:3 (27 March 2012).


40. 2013. Goodman and Lassiter attempt to reconcile and restore interest in probabilistic semantics after a long period of domination by classical logic.

N. D. Goodman and D. Lassiter, Probabilistic Semantics and Pragmatics: Uncertainty in Language and Thought.

https://web.stanford.edu/~ngoodman/papers/Goodman-HCS-final.pdf

41. 2013. Robson argues for the importance of the h-complex approach for measures in epidemiology.

Robson, B. (2013) “Towards New Tools for Pharmacoepidemiology”, Advances in Pharmacoepidemiology and Drug Safety, 1:6.

http://www.omicsgroup.org/journals/towards-new-tools-for-pharmacoepidemiology-2167-1052.1000123.pdf

42. 2013. Robson promotes Q-UEL from a public health perspective.

B. Robson, Rethinking Global Interoperability in Healthcare: Reflections and Experiments of an e-Epidemiologist from Clinical Record to Smart Medical Semantic Web, Johns Hopkins Grand Rounds Lectures (last accessed 3/14/2013).


http://dhsi.med.jhmi.edu/GrandRoundsVideo/Feb15-2013/SilverlightLoader.html

43. 2013. Robson and Caruso describe the first version of Q-UEL in greater detail.

Robson, B, and TP Caruso (2013) “A Universal Exchange Language for Healthcare” MedInfo ’13: Proceedings of the 14th World Congress on Medical and Health Informatics, Copenhagen, Denmark, Edited by CU Lehmann, E Ammenwerth, and C Nohr. IOS Press, Washington, DC, USA. http://quantalsemantics.com/documents/MedInfo13-RobsonCaruso_V6.pdf; http://ebooks.iospress.nl/publication/34165

44. 2014. Robson et al. release a formal description of the consolidated second version of Q-UEL.

Robson, T. P. Caruso and U. G. J. Balis, Suggestions for a Web Based Universal Exchange and Inference Language for Medicine, Computers in Biology and Medicine, 43(12) 2297 (2013).

45. 2013. Moldoveanu expresses the view that hyperbolic quantum mechanics can’t also include wave mechanics; a possible attack on Khrennikov’s idea that hyperbolic quantum mechanics can show interference as for waves. Signs of a growing sense that hyperbolic quantum mechanics is simply the everyday world described in terms of the machinery of traditional quantum mechanics.

F. Moldoveanu, Non viability of hyperbolic quantum mechanics as a theory of Nature, Cornell University Library, arXiv:1311.6461v2 [quant-ph] (2013).

46. 2013. First full description of the Hyperbolic Dirac Net and its relation to Q-UEL and to Bayes Nets.

B. Robson, Hyperbolic Dirac Nets for Medical Decision Support: Theory, Methods, and Comparison with Bayes Nets, Computers in Biology and Medicine, 51, 183 (2013).

http://www.sciencedirect.com/science/article/pii/S0010482514000778

47. 2014. Kunegis et al. develop h-complex algorithms for dating recommender systems.

J. Kunegis, G. Gröner, and T. Gottron, Online Dating Recommender Systems: The Split-Complex Number Approach (Like/Dislike, Similar/Dissimilar), http://userpages.uni-koblenz.de/~kunegis/paper/kunegis-online-dating-recommender-systems-the-split-complex-number-approach.pdf (last accessed 6/1/2014).

48. 2015. Robson describes the extension of the Hyperbolic Dirac Net to semantic reasoning and probabilistic linguistics.


Robson, B. “POPPER, a Simple Programming Language for Probabilistic Semantic Inference in Medicine”, Computers in Biology and Medicine (in press), DOI: 10.1016/j.compbiomed.2014.10.011 (2015).


http://www.ncbi.nlm.nih.gov/pubmed/25464353

49. 2014. Yosemite Manifesto – a response to PCAST 2010 proposing that the Semantic Web should provide healthcare IT, although preempted by Q-UEL.

http://yosemitemanifesto.org/ (last accessed 7/5/2014). 

50. 2015. Robson et al. describe medical records in Q-UEL format and PCAST disaggregation for patient security and privacy.

Robson, B., Caruso, T., and Balis, U. G. J. (2015) “Suggestions for a Web Based Universal Exchange and Inference Language for Medicine. Continuity of Patient Care with PCAST Disaggregation.” Computers in Biology and Medicine (in press) 01/2015; 56:51. DOI: 10.1016/j.compbiomed.2014.10.022

51. 2015. Mathematician Steve Deckelman of U. Wisconsin-Stout and Berkeley validates the theoretical principles of the Hyperbolic Dirac Net.

Deckelman, S. and Robson, B. (2015) “Split-Complex Numbers and Dirac Bra-Kets”, Communications in Information and Systems (CIS), in press.

http://www.diracfoundation.com/?p=148

From Above Link:-

The inference net on which this dualization is performed is defined as an estimate of a probability as an expression comprising simpler probabilities and/or association measures, i.e. each with fewer attributes (i.e. arguments, events, states, observations or measurements) than the joint probability estimated, where each attribute corresponds to a node of a general graph and the probabilities or association measures represent their interdependencies as edges. It is not required that the inference net be an acyclic directed graph, but the widely used BN, which satisfies that description by definition, is a useful starting point for making use of the given probabilities to address the same or similar problems.

Specifically for the estimation of a joint probability, an HDN properly constructed with prior probabilities, whether or not it contains cyclic paths, is purely real valued, and its construction principles represent a generalization of Bayes’ Theorem. Any imaginary part indicates the degree of departure from Bayes’ Theorem over the net as a whole, and the direction of conditionality in which the departure occurs; thus the HDN provides an excellent bookkeeping tool for confirming that Bayes’ Theorem is satisfied overall.

Specifically for the estimation of a conditional probability, it follows conversely from the above that any expression for a joint probability validated by the above means can serve as the generator of an HDN for the estimation of a conditional probability, simply by dividing it by the HDN counterparts of prior probabilities, whence the resulting net is not purely real save by coincidence of probability values.
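A minimal numerical sketch of the bookkeeping described above, under a split-complex reading of the paper (the encoding and names are illustrative, not the published code): each edge carries the dual pair (P(A|B), P(B|A)); held in the idempotent basis e+ = (1 + h)/2, e- = (1 - h)/2, split-complex multiplication acts channel-wise, so a product over the net accumulates the forward chain and the backward chain at once, and a zero h-part signals that Bayes’ theorem is satisfied overall.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dual:
    """Dual probability (P(A|B), P(B|A)) stored in the idempotent basis of
    the split-complex numbers, z = fwd*e+ + bwd*e- with e+ = (1+h)/2 and
    e- = (1-h)/2, in which multiplication is channel-wise."""
    fwd: float  # forward conditional, P(A|B)
    bwd: float  # backward conditional, P(B|A)

    def __mul__(self, other):
        return Dual(self.fwd * other.fwd, self.bwd * other.bwd)

    @property
    def real_part(self) -> float:
        return (self.fwd + self.bwd) / 2

    @property
    def h_part(self) -> float:
        # Zero iff forward and backward chains agree (Bayes-symmetric);
        # its sign gives the direction of conditionality of the departure.
        return (self.fwd - self.bwd) / 2

# Two-edge chain: a nonzero h part flags the size and direction of the
# departure from Bayes' theorem over the net as a whole.
net = Dual(0.8, 0.5) * Dual(0.6, 0.6)
print(net.real_part, net.h_part)  # 0.39 0.09 -> forward-leaning departure
```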

52. 2015. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories

Barry Robson and Srinidhi Boray

http://www.computersinbiologyandmedicine.com/article/S0010-4825(15)00257-7/abstract

53. 2015. Robson, B., and S. Boray, The Structure of Reasoning in Answering Multiple Choice Medical Licensing Examination Questions: Computer Studies towards Formal Theories of Clinical Decision Support and Setting and Answering Medical Licensing Examinations, Workshop Lecture presentation, Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine, 9th-11th November, Washington DC (2015)

https://www.osehra.org/sites/default/files/Computer_Exams_V10.pdf

https://cci.drexel.edu/ieeebibm/bibm2015/BIBM2015Program.pdf


Part A – Healthcare Interoperability Measures:- Cartesian Dilemma (Diagnosis)

Those in blue in the content below are reproduced from the referenced links.

Definition of Cartesian Dilemma, per Christopher Alexander

(what the eye sees and what the mind sees are two different things)

Cartesian Dilemma

http://www.worldsystema.com/worldsystema/2011/10/christopher-alexander-templeto-1.html

From above link

“Alexander has been inexorably led to the revolutionary necessity of revising our basic picture of the universe to include a conception of the personal nature of order and our belonging to the world in which the wholeness of space and the extent to which it is alive is perceived as rooted in the plenum behind the visible universe, “the luminous ground” that holds us all. This form of extended objective truth will ultimately resolve our Cartesian dilemma by teaching us a new view of order and a new cosmology in which objective reality “out there” and a personal reality “in here” are thoroughly connected and the bifurcation of nature healed.”

“To Rene Descartes the “Method” (1638) was a convenient mental trick but its success has left us with a mindset that conceives of the universe as a machine without any intrinsic value: the realms of human experience and of feeling are simply absent from the Cartesian world. Whilst inspiring generations of architects and many others from all walks of life concerned with the fate of the earth, Alexander’s ultimately life changing work has understandably provoked powerful opposition from those invested within the establishment of the old paradigm. Social disorder, mental illness, ecological degradation, these and many other problems are due to a misunderstanding of the structure of matter and the nature of the universe and, until quite recently, there has been no coherent way of explaining the order that we respond to and love in nature.”

———————————————————————-

The Affordable Care Act and the HITECH Act led to the EHR Incentive Program. Based on the EHR Incentive Program, CMS has already paid out $24+ billion to Eligible Participants. Whether it has driven, or will drive, the envisioned Healthcare Interoperability remains a big question. Specifically, will it be possible to mine the millions of records and discover opportunities for improvement? Without emphasis on clinical decision support, will it be possible to achieve efficacy in healthcare delivery, while also advancing the opportunities for “pay for performance” outcomes?

To advance EHR adoption in the Healthcare Ecosystem, CMS proposed the formation of Accountable Care Organizations (ACOs).

https://www.cms.gov/Newsroom/MediaReleaseDatabase/Fact-sheets/2011-Fact-sheets-items/2011-12-19.html

From the above link

“The Pioneer ACO Model is designed for health care organizations and providers that are already experienced in coordinating care for patients across care settings. It will allow these provider groups to move more rapidly from a shared savings payment model to a population-based payment model on a track consistent with, but separate from, the Medicare Shared Savings Program. And it is designed to work in coordination with private payers by aligning provider incentives, which will improve quality and health outcomes for patients across the ACO, and achieve cost savings for Medicare, employers and patients.”

Importantly, CMS proposed a roadmap for EHR adoption based on the three Meaningful Use (MU) stages, in the hope of advancing interoperability in the healthcare ecosystem and ultimately achieving a performance-driven model, where payment shifts from “pay for service” towards “pay for performance”. Looking at the healthcare ecosystem, one must take note that achieving efficiency lies in healthcare management, while achieving efficacy lies in healthcare delivery.

You will see at the end of the discussion that somehow the efforts of the EHR Incentive Program lay more emphasis on healthcare efficiency without paying the required attention to clinical efficacy. This leads to the systemic entropic discontinuity that can be described by the Boltzmann constant.

This results in a missed Line of Sight, where the established “objectives” at the IT / EHR level do not deliver all the required “business capabilities” (the output), and hence the desired “transformative outcomes” are not realized.

https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula

From the above link:-

“In statistical mechanics, Boltzmann’s equation is a probability equation relating the entropy S of an ideal gas (or, for our purposes, the healthcare ecosystem) to the quantity W, which is the number of microstates corresponding to a given macrostate.”
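The formula itself, with k_B the Boltzmann constant and W the number of microstates compatible with the given macrostate:

$$S = k_B \ln W$$

In the analogy drawn here, W plays the role of the many micro-level clinical activities compatible with one macro-level management picture.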

Following are the EHR Adoption Meaningful Use Stages:-

MU Stage 1 :- Achieves electronic capture of the patient data (Data Capture and Sharing)

MU Stage 2 :- Achieves Health Information Exchanges (Advances co-ordinated clinical processes)

MU Stage 3 :- Targets Improved Outcomes (achieved by moving the payment model from pay for service to pay for performance)

The eligible participants (physicians, hospitals and ACOs) have to demonstrate that they have met the MU criteria in stages. To do so, they must first demonstrate that the data being captured adhere to a prescribed format; this is ascertained by MU attestation.

Additionally, the eligible participants are required to submit quality measures reports to CMS.

https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/sharedsavingsprogram/Quality_Measures_Standards.html

From the above link

“Quality Measures and Performance Standards

Quality data reporting and collection support quality measurement, an important part of the Shared Savings Program. Before an ACO can share in any savings generated, it must demonstrate that it met the quality performance standard for that year. There are also interactions between ACO quality reporting and other CMS initiatives, particularly the Physician Quality Reporting System (PQRS) and meaningful use. The sections below provide resources related to the program’s 33 quality measures, which span four quality domains: Patient / Caregiver Experience, Care Coordination / Patient Safety, Preventive Health, and At-Risk Population. Of the 33 measures, 7 measures of patient / caregiver experience are collected via the CAHPS survey, 3 are calculated via claims, 1 is calculated from Medicare and Medicaid Electronic Health Record (EHR) Incentive Program data, and 22 are collected via the ACO Group Practice Reporting Option (GPRO) Web Interface.”

https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/index.html?redirect=/QUALITYMEASURES/

The National Quality Forum (NQF)-endorsed CMS reporting programs are:

  • The Hospital Inpatient Quality Reporting (IQR) Program,
  • The Hospital Outpatient Quality Reporting (OQR) Program,
  • The Physician Quality Reporting System (PQRS), and
  • Others as directed by CMS, such as long-term care settings and ambulatory care settings

The CMS quality reporting is based on a schema derived from HL7, termed QRDA (Quality Reporting Document Architecture).

https://www.cms.gov/regulations-and-guidance/legislation/ehrincentiveprograms/downloads/qrda_ep_hqr_guide_2015.pdf

From the above link

Overview of QRDA

“The Health Level Seven International (HL7) QRDA is a standard document format for the exchange of electronic clinical quality measure (eCQM) data. QRDA reports contain data extracted from electronic health records (EHRs) and other information technology systems. QRDA reports are used for the exchange of eCQM data between systems for a variety of quality measurement and reporting initiatives, such as the Centers for Medicare & Medicaid Services (CMS) EHR Incentive Program: Meaningful Use Stage 2 (MU2).

The Office of the National Coordinator for Health Information Technology (ONC) adopted QRDA as the standard to support both QRDA Category I (individual patient) and QRDA Category III (aggregate) data submission approaches for MU2 through final rulemaking in September 2012. CMS and ONC subsequently released an interim final rule in December 2012 that replaced the QRDA Category III standard adopted in the September 2012 final rule with an updated version of the standard. QRDA Category I and III implementation guides (IGs) are Draft Standards for Trial Use (DSTUs). DSTUs are issued at a point in the standards development life cycle when many, but not all, of the guiding requirements have been clarified. A DSTU is tested and then taken back through the HL7 ballot process to be formalized into an American National Standards Institute (ANSI)-accredited normative standard.

QRDA is a subset of the HL7 CDA standard; QRDA is a constraint on the HL7 Clinical Document Architecture (CDA), a document markup standard that specifies the structure and semantics of clinical documents for the purpose of exchange. To streamline implementations, QRDA makes use of CDA templates, which are business rules for representing clinical data consistently. Many QRDA templates are reused from the HL7 Consolidated CDA (C-CDA) standard, which contains a library of commonly used templates that have been harmonized for MU2. Templates defined in the QRDA Category I and III IGs enable consistent representations of quality reporting data to streamline implementations and promote interoperability.”

By contrast, we have the Office of the National Coordinator (ONC), which stipulates and regulates standards to achieve Healthcare Interoperability.

ONC Roadmap Vision in the below link

https://www.healthit.gov/policy-researchers-implementers/interoperability

From above link:-

Sadly, although Evidence Based Medicine is discussed, data mining and concerns around algorithm development are missing.

“Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap version 1.0 (Roadmap) [PDF – 3.7 MB] supports the vision that ONC outlined in Connecting Health and Care for the Nation: A 10 Year Vision to Achieve An Interoperable Health IT Infrastructure [PDF – 607 KB]. The Roadmap, shaped by stakeholder input, lays out a clear path to catalyze the collaboration of stakeholders who are going to build and use the health IT infrastructure. The collaborative efforts of stakeholders is crucial to achieving the vision of a learning health system where individuals are at the center of their care; providers have a seamless ability to securely access and use health information from different sources; an individual’s health information is not limited to what is stored in electronic health records (EHRs), but includes information from many different sources and portrays a longitudinal picture of their health, not just episodes of care; and where public health agencies and researchers can rapidly learn, develop, and deliver cutting edge treatments.”

http://www.healthit.gov/buzz-blog/from-the-onc-desk/onc-interoperability-roadmap-update/

There is no doubt that ONC aspires to achieve true Healthcare Interoperability, by bringing more clarity to the Health Information Exchange (HIE) as discussed in the below link.

Interoperability vs Health Information Exchange: Setting the Record Straight

ONC has under its purview the Office of Standards and Technology, which drives the Interoperability Standards; it acknowledges that there are numerous challenges in realizing the ONC roadmap, as discussed in the below link.

Interoperability Standards – Shades of Gray

Also, ONC specifies a roadmap for achieving the MU stages for physicians, hospitals and ACOs (HIE).

https://www.healthit.gov/providers-professionals/ehr-implementation-steps/step-5-achieve-meaningful-use

Specifically for Semantic Interoperability, it recommends the Consolidated Clinical Document Architecture (C-CDA).

https://www.healthit.gov/policy-researchers-implementers/consolidated-cda-overview

CDA helps in representing a comprehensive view of the patient: a complete birth-to-death view, the Longitudinal Record.

Also, the ONC Interoperability Specification addresses the following three levels (not adequate to achieve EBM-driven CDSS):-

There are three levels of health information technology interoperability:  1) Foundational; 2) Structural; and 3) Semantic.

1 – “Foundational” interoperability allows data exchange from one information technology system to be received by another and does not require the ability for the receiving information technology system to interpret the data.

2 – “Structural” interoperability is an intermediate level that defines the structure or format of data exchange (i.e., the message format standards) where there is uniform movement of healthcare data from one system to another such that the clinical or operational purpose and meaning of the data is preserved and unaltered. Structural interoperability defines the syntax of the data exchange. It ensures that data exchanges between information technology systems can be interpreted at the data field level.

3 – “Semantic” interoperability provides interoperability at the highest level, which is the ability of two or more systems or elements to exchange information and to use the information that has been exchanged. Semantic interoperability takes advantage of both the structuring of the data exchange and the codification of the data including vocabulary so that the receiving information technology systems can interpret the data. This level of interoperability supports the electronic exchange of patient summary information among caregivers and other authorized parties via potentially disparate electronic health record (EHR) systems and other systems to improve quality, safety, efficiency, and efficacy of healthcare delivery.

Desired or Recommended 2nd Order Semantic Interoperability

Probabilistic Ontology Driven Knowledge Engineering

Ref:- http://www.ncbi.nlm.nih.gov/pubmed/22269224

Chronically ill patients are complex health care cases that require the coordinated interaction of multiple professionals. A correct intervention of these sort of patients entails the accurate analysis of the conditions of each concrete patient and the adaptation of evidence-based standard intervention plans to these conditions. There are some other clinical circumstances such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases or prevention, whose detection depends on the capacities of deduction of the professionals involved.

<diagnosis> <procedures> <outcomes> [triple store]
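As a hypothetical sketch of that structure (the schema and names here are invented for illustration): clinical assertions held as triples, from which conditional probabilities of outcomes can be read off.

```python
from collections import Counter

# Toy triple store of (diagnosis, procedure, outcome) assertions.
triples = [
    ("type-2 diabetes", "metformin therapy", "HbA1c reduced"),
    ("type-2 diabetes", "metformin therapy", "no change"),
    ("type-2 diabetes", "lifestyle counseling", "HbA1c reduced"),
]

def outcomes_for(diagnosis: str, procedure: str) -> Counter:
    """Empirical outcome counts for a (diagnosis, procedure) pair; in a
    probabilistic-ontology setting these back P(outcome | dx, procedure)."""
    return Counter(o for d, p, o in triples if d == diagnosis and p == procedure)

print(outcomes_for("type-2 diabetes", "metformin therapy"))
# Counter({'HbA1c reduced': 1, 'no change': 1}) -> P(reduced | dx, metformin) = 0.5
```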

Conclusion:-

From the above points it must be noted that QRDA and C-CDA achieve different things. Unfortunately, it is against the MU attestation and quality reports filed by the eligible participants (physicians, hospitals and ACOs) based on QRDA (especially PQRS) that CMS runs the EHR incentive program. Whereas in the MU2 stage (as per ONC) the participants are also required to demonstrate that they have achieved interoperability within the ACO while implementing HIE, and this requires C-CDA. This stage must demonstrate that coordinated clinical processes have been achieved.

Also, it is required that a clinical decision support system (CDSS) be established addressing at least 5 critical or clinical priority areas. Unfortunately, this particular capability does not seem to be addressed adequately by the ACOs, who only pursue demonstrating that quality measures have been achieved, which does not necessarily mean that clinical efficacy has been addressed.

It seems an important architectural problem has been glossed over by the policy designers, who proposed the quality measures model with the motivation of capturing metrics that eventually demonstrate “pay for performance”, and somehow assumed that the proposed metrics based on QRDA also demonstrate that clinical efficacy has been achieved. This leads into systemic entropic discontinuity, where the efforts at the macro states, which represent healthcare management leading into healthcare efficiency, are not necessarily a cumulative realization of the efforts at the micro states, which represent gaining clinical efficacy. This entropic discontinuity between the macro state and the micro states is measured by the Boltzmann constant.

The link below offers more discussion of micro states and macro states within a complex system. Basically, for a given complex system, for all the efforts applied as input, entropy arrests part of them and creates loss, so the output is actually produced while incurring loss. This means the systemic efficiency incurred losses and did not realize all the benefits arising out of the clinical efficacy. This is a model problem which inaccurately represents the “phenomenon of interest”.

https://books.google.com/books?id=dAhQBAAAQBAJ&pg=PT295&lpg=PT295&dq=boltzmann+constant+macro+state&source=bl&ots=ubpGEUymWc&sig=cQ4Nz9f6OA0ryDGEupOHDUAyiRc&hl=en&sa=X&ved=0CCwQ6AEwA2oVChMI0qeqv4G4yAIVCzo-Ch07WAkU#v=onepage&q=boltzmann%20constant%20macro%20state&f=false

To achieve Clinical Decision Support System capability, which plays a very important role in enhancing clinical efficacy, developing a data-mining-driven Evidence Based Medicine capability is imperative. This capability does not seem to be achieved, because most HIE / ACO systems are being developed around QRDA. Although this is discussed in the ONC Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap version 1.0 (Roadmap) [PDF – 3.7 MB], unless the algorithmic challenges related to data mining are addressed, which means standards beyond mere capture of the required data fields, interoperability efforts will be in vain.

The role of EBM in achieving CDSS is discussed on the following sites:

CMS Site

https://www.healthit.gov/providers-professionals/achieve-meaningful-use/core-measures/clinical-decision-support-rule

NIH Site

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC130063/

As such, it must be noted that clinical error is among the highest risks, having become the No. 3 killer in the US.

http://www.healthcareitnews.com/news/deaths-by-medical-mistakes-hit-records

From above link

“It’s a chilling reality – one often overlooked in annual mortality statistics: Preventable medical errors persist as the No. 3 killer in the U.S. – third only to heart disease and cancer – claiming the lives of some 400,000 people each year. At a Senate hearing Thursday, patient safety officials put their best ideas forward on how to solve the crisis, with IT often at the center of discussions.”

P.S:-

Bioingine (www.bioingine.com), a Cognitive Computing Platform, transforms the patient information (millions of records) created by the HIE into an Ecosystem Knowledge Landscape that is inherently evidence based, allowing for the study of the Tacit Knowledge discovered from the millions of patient records (large data sets) by mining and knowledge inference in an automated way. This is achieved by employing AI, Machine Learning and similar techniques, thereby creating a Clinical Decision Support System.