diracfoundation

2nd Order Semantic Web and A.I.-Driven Reasoning – 300-Plus Years of Crusade

Bioingine.com | Ingine Inc


Chronology of Development of Hyperbolic Dirac Net (HDN) Inference. 

https://en.wikipedia.org/wiki/Thomas_Bayes

From Above Link:-

1. 1763. Thomas Bayes was an English statistician, philosopher and Presbyterian minister who is known for having formulated a specific case of the theorem that bears his name: Bayes’ theorem.

Bayes’ solution to a problem of inverse probability was presented in “An Essay towards solving a Problem in the Doctrine of Chances”, which was read to the Royal Society in 1763 after Bayes’ death.

https://en.wikipedia.org/wiki/Bayes%27_theorem

From Above Link:-

In probability theory and statistics, Bayes’ theorem (alternatively Bayes’ law or Bayes’ rule) describes the probability of an event, based on conditions that might be related to the event.

When applied, the probabilities involved in Bayes’ theorem may have different probability interpretations. In one of these interpretations, the theorem is used directly as part of a particular approach to statistical inference. With the Bayesian probability interpretation the theorem expresses how a subjective degree of belief should rationally change to account for evidence: this is Bayesian inference, which is fundamental to Bayesian statistics. However, Bayes’ theorem has applications in a wide range of calculations involving probabilities, not just in Bayesian inference.

https://en.wikipedia.org/wiki/Bayesian_inference

From Above Link:-

Bayesian inference is a method of statistical inference in which Bayes’ theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called “Bayesian probability”.
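Bayesian updating as described above can be sketched in a few lines of Python; the prevalence and test-accuracy numbers below are hypothetical, chosen only to illustrate the update.

```python
def bayes_update(prior, likelihood, evidence_prob):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

# Hypothetical diagnostic example: disease prevalence 1%,
# test sensitivity 95%, false-positive rate 5%.
prior = 0.01
p_pos = 0.95 * prior + 0.05 * (1 - prior)      # total probability of a positive test
posterior = bayes_update(prior, 0.95, p_pos)   # belief after one positive result
print(round(posterior, 4))                     # 0.161
```

Note how a positive result from a fairly accurate test still leaves the posterior at only about 16%, because the prior (the prevalence) is so low; this inversion of conditionals is exactly what Bayes’ theorem makes explicit.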

2. 1859. Georg Friedrich Bernhard Riemann proposed the Riemann zeta function, a function useful in number theory for investigating properties of prime numbers. Written as ζ(x), it was originally defined as the infinite series

ζ(x) = 1 + 2^(−x) + 3^(−x) + 4^(−x) + ⋯.

The theory should perhaps be distinguished from an existing purely number-theoretic area sometimes also known as Zeta Theory, which focuses on the Riemann Zeta Function and the ways in which it governs the distribution of prime numbers.

http://mathworld.wolfram.com/RiemannZetaFunction.html

The Riemann zeta function is an extremely important special function of mathematics and physics that arises in definite integration and is intimately related with very deep results surrounding the prime number theorem. While many of the properties of this function have been investigated, there remain important fundamental conjectures (most notably the Riemann hypothesis) that remain unproved to this day. The Riemann zeta function is defined over the complex plane for one complex variable, which is conventionally denoted s (instead of the usual z) in deference to the notation used by Riemann in his 1859 paper that founded the study of this function (Riemann 1859). It is implemented in the Wolfram Language as Zeta[s].
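Since the rest of this chronology turns on this function, a minimal sketch of the defining series (valid for real x > 1; the full function over the complex plane requires analytic continuation):

```python
def zeta_partial(x, terms=100000):
    """Partial sum of the series 1 + 2^-x + 3^-x + ... (converges for x > 1)."""
    return sum(n ** -x for n in range(1, terms + 1))

# Known identity: zeta(2) = pi^2 / 6 ≈ 1.6449...
print(zeta_partial(2.0))
```

The partial sum approaches π²/6 to within about 1/terms, so 100,000 terms gives roughly five correct digits.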

3. 1900. Ramanujan’s mathematical work was primarily in the areas of number theory and classical analysis. In particular, he worked extensively with infinite series, integrals, continued fractions, modular forms, q-series, theta functions, elliptic functions, the Riemann Zeta-Function, and other special functions.

Hardy wrote in Ramanujan’s obituary [14]:

There is always more in one of Ramanujan’s formulae than meets the eye, as anyone who sets to work to verify those which look the easiest will soon discover. In some the interest lies very deep, in others comparatively near the surface; but there is not one which is not curious and entertaining.

http://www.integralworld.net/collins18.html

From above link :-

Now there is a famous account of the gifted Indian mathematician Ramanujan who, when writing to Hardy at Cambridge regarding his early findings, included the seemingly nonsensical result,

1 + 2 + 3 + 4 + ……(to infinity) = – 1/12.

Initially Hardy was inclined to think that he was dealing with a fraud, but on further reflection realized that Ramanujan was in fact describing the Riemann Zeta Function (for s = – 1). He could then appreciate his brilliance as one, who though considerably isolated and without any formal training, had independently covered much of the same ground as Riemann.

However it still begs the question as to what the actual meaning of such a result can be, for in the standard conventional manner of mathematical interpretation, the sum of the series of natural numbers clearly diverges.

The startling fact is that this result – though indirectly expressed in a quantitative manner – actually expresses a qualitative type relationship (pertaining to holistic mathematical interpretation).

Uncovering Ramanujan’s “Lost” Notebook: An Oral History

http://arxiv.org/pdf/1208.2694.pdf

ROBERT P. SCHNEIDER

From above link :-

Whereas Ramanujan’s earlier work dealt largely with classical number-theoretic objects such as q-series, theta functions, partitions and prime numbers—exotic, startling, breathtaking identities built up from infinite series, integrals and continued fractions—in these newfound papers, Andrews found never-before-seen work on the mysterious “mock theta functions” hinted at in a letter written to Hardy in Ramanujan’s final months, pointing to realms at the very edge of the mathematical landscape. The content of Ramanujan’s lost notebook is too rich, too ornate, too strange to be developed within the scope of the present article. We provide a handful of stunning examples below, intended only to tantalize—perhaps mystify—the reader, who is encouraged to let his or her eyes wander across the page, picking patterns like spring flowers from the wild field of symbols.

The following are two fantastic q-series identities found in the lost notebook, published by Andrews soon after his discovery, in which q is taken to be a complex number with |q| < 1

Another surprising expression involves an example of a mock theta function provided by Ramanujan in the final letter he sent to Hardy

In the words of mathematician Ken Ono, a contemporary trailblazer in the field of mock theta functions, “Obviously Ramanujan knew much more than he revealed [14].” Indeed, Ramanujan then “miraculously claimed” that the coefficients of this mock theta function obey the asymptotic relation

The new realms pointed to by the work of Ramanujan’s final year are now understood to be ruled by bizarre mathematical structures known as harmonic Maass forms. This broader perspective was only achieved in the last ten years, and has led to cutting-edge science, ranging from cancer research to the physics of black holes to the completion of group theory. 

Yet details of George Andrews’s unearthing of Ramanujan’s notes are only sparsely sketched in the literature; one can detect but an outline of the tale surrounding one of the most fruitful mathematical discoveries of our era. In hopes of contributing to a more complete picture of this momentous event and its significance, here we weave together excerpts from interviews we conducted with Andrews documenting the memories of his trip to Trinity College, as well as from separate interviews with mathematicians Bruce Berndt and Ken Ono, who have both collaborated with Andrews in proving and extending the contents of Ramanujan’s famous lost notebook.

4. 1913. Élie Joseph Cartan developed the “Theory of Spinors”.

https://archive.org/details/TheTheoryOfSpinors

https://en.wikipedia.org/wiki/Spinor

From above link:-

In geometry and physics, spinors are elements of a (complex) vector space that can be associated with Euclidean space. Like geometric vectors and more general tensors, spinors transform linearly when the Euclidean space is subjected to a slight (infinitesimal) rotation. When a sequence of such small rotations is composed (integrated) to form an overall final rotation, however, the resulting spinor transformation depends on which sequence of small rotations was used, unlike for vectors and tensors. A spinor transforms to its negative when the space is rotated through a complete turn from 0° to 360°, and it is this property that characterizes spinors. It is also possible to associate a substantially similar notion of spinor to Minkowski space, in which case the Lorentz transformations of special relativity play the role of rotations. Spinors were introduced in geometry by Élie Cartan in 1913. In the 1920s physicists discovered that spinors are essential to describe the intrinsic angular momentum, or “spin”, of the electron and other subatomic particles.

5. 1928. Paul A. M. Dirac derived the Dirac equation, which, in particle physics, is a relativistic wave equation.

http://www.mathpages.com/home/kmath654/kmath654.htm

http://mathworld.wolfram.com/DiracEquation.html

From above link:-

The quantum electrodynamical law which applies to spin-1/2 particles and is the relativistic generalization of the Schrödinger equation. In 3+1 dimensions (three space dimensions and one time dimension), it is given by

(iγ^μ ∂_μ − m)ψ = 0 (the Dirac equation in covariant form, natural units)

6. 1930. Dirac publishes his book on his pivotal view of quantum mechanics, including his earliest mentions of an operator with the properties of the hyperbolic number h such that hh = +1. It extends the theory of wave mechanics to particle mechanics.
P. A. M. Dirac, The Principles of Quantum Mechanics, First Edition, Oxford University Press, Oxford (1930).
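The hyperbolic (split-complex) number h, with hh = +1, behaves like the ordinary imaginary unit i except for the sign of its square. A minimal sketch of that arithmetic (an illustrative class, not taken from any of the cited works):

```python
class Hyperbolic:
    """Split-complex number a + h*b, where h*h = +1 (unlike i*i = -1)."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def __mul__(self, other):
        # (a + h b)(c + h d) = (ac + bd) + h(ad + bc), because h*h = +1
        return Hyperbolic(self.a * other.a + self.b * other.b,
                          self.a * other.b + self.b * other.a)
    def __repr__(self):
        return f"{self.a} + h*{self.b}"

h = Hyperbolic(0, 1)
print(h * h)   # 1 + h*0, i.e. h squared is +1
```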

7. 1933. In his Nobel Prize Dinner speech, Dirac states that mechanical methods are applicable to all forms of human thought where numbers are involved. http://www.nobelprize.org/nobel_prizes/physics/laureates/1933/dirac-speech.html

8. 1939. Dirac publishes his bra-ket notation. It is incorporated into the third edition of his book.

P.A.M. Dirac (1939). A new notation for quantum mechanics, Mathematical Proceedings of the Cambridge Philosophical Society 35 (3): 416–418

9. 1974. Robson develops his Expected Information approach that preempts the Bayes Net method.

B. Robson, Analysis of the Code Relating Sequence to Conformation in Globular Proteins: Theory and Application of Expected Information, Biochem. J. 141, 853-867 (1974).

10. 1978. The Expected Information approach crystallizes as the GOR method widely used in bioinformatics.

J. Garnier, D. J. Osguthorpe, and B. Robson, “Analysis of the Accuracy and Implications of Simple Methods for Predicting the Secondary Structure of Globular Proteins”, J. Mol. Biol. 120, 97-120 (1978).


11. 1982. Buchanan and Shortliffe describe the first medical Expert System. It is based on probabilistic statements, but sets a tradition of innovation and diverse controversial methods in automated medical inference.

B.G. Buchanan and E.H. Shortliffe (1982) Rule Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, Addison-Wesley: Reading, Massachusetts.

12. 1985. Pearl gives a full account of the Bayes Net method.

J. Pearl, Probabilistic Reasoning in Intelligent Systems, San Francisco, CA: Morgan Kaufmann (1985).

13. March 1989. Sir Tim Berners-Lee invented the WWW, introducing non-linear linking of information across systems.

Tim laid out his vision for what would become the Web in a document called “Information Management: A Proposal”. Believe it or not, Tim’s initial proposal was not immediately accepted. In fact, his boss at the time, Mike Sendall, noted the words “Vague but exciting” on the cover. The Web was never an official CERN project, but Mike managed to give Tim time to work on it in September 1990. He began work using a NeXT computer, one of Steve Jobs’ early products.

14. 1997. Clifford algebra becomes more widely recognized as a tool for engineers as well as scientists and physicists.

K. Gürlebeck and W. Sprössig, Quaternionic and Clifford Calculus for Physicists and Engineers, Wiley, Chichester (1997).

15. 1999. Tim Berners-Lee described the Semantic Web vision in the following terms:

I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web, the content, links, and transactions between people and computers. A Semantic Web, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The intelligent agents people have touted for ages will finally materialize. (1999)

16. 2000. Khrennikov gives a description of a primarily h-complex quantum mechanics.

A. Khrennikov, Hyperbolic quantum mechanics, Cornell University Library, arXiv:quant-ph/0101002v1 (2000).

17. 2000. Buchholz and Sommer refine work showing that neural networks, as inference systems modeled on the brain, can usefully use the hypercomplex imaginary number h.

S. Buchholz and G. Sommer, A hyperbolic multilayer perceptron, International Joint Conference on Neural Networks, IJCNN 2000, Como, Italy, Vol. 2, pp. 129-133, S.-I. Amari, C.L. Giles, M. Gori, and V. Piuri, Eds., IEEE Computer Society Press (2000).

18. 2003. Robson points out that the Expected Information method in bioinformatics is really the use of the partially summated Riemann Zeta function, and a best choice for the treatment of sparse data in data mining in general.

B. Robson (2003) “Clinical and Pharmacogenomic Data Mining. 1. The generalized theory of expected information and application to the development of tools”, J. Proteome Res. (Am. Chem. Soc.) 2, 283-301.
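The connection drawn here can be illustrated with partial sums of the zeta series at s = 1, i.e. harmonic numbers: the expected log of a frequency estimated from small counts comes out as a difference of such partial sums, which approaches the naive log-ratio only as the counts grow. This is only a sketch of the idea as stated in the text; the exact formulas in the cited paper differ in detail, and the counts below are hypothetical.

```python
from math import log

def harmonic(n):
    """Partial sum of the zeta series at s = 1: H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

def expected_log_ratio(n_with, n_without):
    """Illustrative expected-information estimate for sparse counts,
    as a difference of partial zeta(1) sums (harmonic numbers)."""
    return harmonic(n_with) - harmonic(n_without)

print(expected_log_ratio(3, 1))  # small-sample, zeta-based estimate
print(log(3 / 1))                # naive log-ratio for comparison
```

For counts of 3 versus 1 the zeta-based estimate (about 0.83) is noticeably more conservative than the naive log-ratio (about 1.10), which is the point of using it on sparse data.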

19. 2003. Nitta shows that the power of the h-complex approach in neural nets is primarily due to its ability to solve the notorious exclusive-or (XOR) logical problem in a single neuron.

T. Nitta, Solving the XOR problem and the detection of symmetry using a single complex-valued neuron, Neural Networks 16(8), 1101-1105 (2003).
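A toy sketch in the spirit of Nitta’s result (not his exact construction; the weights below are illustrative): a single complex-valued neuron has two orthogonal decision lines, Re(net) = 0 and Im(net) = 0, and reading the output from the pair of signs classifies XOR, which no single real-valued threshold neuron can do.

```python
# A single complex-valued "neuron": net = w1*x1 + w2*x2 + b.
# Its two decision lines Re(net)=0 and Im(net)=0 split the input
# plane into quadrants; XOR is read off from the quadrant parity.
w1, w2, b = 1 + 0j, 1j, -0.5 - 0.5j

def neuron_xor(x1, x2):
    net = w1 * x1 + w2 * x2 + b
    return int(net.real * net.imag < 0)   # opposite signs -> class 1

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, neuron_xor(x1, x2))
```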

20. 2003. Khrennikov consolidates the notion of an extensively h-complex quantum mechanics, but feels that i-complex, h-complex, and real-world mechanics are three separate systems.

A. Khrennikov, Hyperbolic quantum mechanics, Adv. in Applied Clifford Algebras, Vol. 13, 1 (2003).

21. 2004. Khrennikov notes a possible relation between h-complex quantum mechanics and mental function.

A. Khrennikov, On Quantum-Like Probabilistic Structure of Mental Information, Open Systems & Information Dynamics, Vol. 11, 3, 267-275 (2004).

22. 2004. Rochon shows that the full Riemann Zeta function is both i-complex and h-complex.

D. Rochon, A Bicomplex Riemann Zeta Function, Tokyo J. of Math. (2004).

23. 2004. Robson argues that zeta theory is a solution to high dimensionality problems in data mining.

Robson, The Dragon on the Gold: Myths and Realities for Data Mining in Biotechnology using Digital and Molecular Libraries, J. Proteome Res. (Am. Chem. Soc.) 3 (6), 1113 – 9 (2004).

24. 2005. Robson argues that all statements in zeta theory and in prime number theory are really statements relevant to data and data mining, and describes the first link to Dirac’s quantum mechanics and Dirac’s braket notation.

Robson, Clinical and Pharmacogenomic Data Mining: 3. Zeta Theory As a General Tactic for Clinical Bioinformatics, J. Proteome Res. (Am. Chem. Soc.) 4(2); 445-455 (2005) 


25. 2005. The code CliniMiner/FANO, based on zeta theory and prime number theory, is used in the first pioneering effort in data mining a large number of patient records.

Mullins, I. M., M.S. Siadaty, J. Lyman, K. Scully, G.T. Garrett, G. Miller, R. Muller, B. Robson, C. Apte, S. Weiss, I. Rigoutsos, D. Platt, and S. Cohen, Data mining and clinical data repositories: Insights from a 667,000 patient data set, Computers in Biology and Medicine, 36(12) 1351 (2006).


26. 2007. Robson recognizes that the imaginary number required to reconcile zeta theory with quantum mechanics and to allow Dirac notation to be used in inference is the hyperbolic imaginary number h, not the imaginary number i. Unaware of the work of Khrennikov, he makes no Khrennikov-like distinction between h-complex quantum mechanics and the everyday world.

Robson, B., The New Physician as Unwitting Quantum Mechanic: Is Adapting Dirac’s Inference System Best Practice for Personalized Medicine, Genomics and Proteomics, J. Proteome Res. (Am. Chem. Soc.), Vol. 6, No. 8: 3114–3126 (2007).


Robson, B. (2007) “Data Mining and Inference Systems for Physician Decision Support in Personalized Medicine” Lecture and Circulated Report at the 1st Annual Total Cancer Care Summit, Bahamas 2007. 


28. 2008. Data Mining techniques using the full i-complex and h-complex zeta function are developed.

B. Robson, “Clinical and Pharmacogenomic Data Mining: 4. The FANO Program and Command Set as an Example of Tools for Biomedical Discovery and Evidence Based Medicine”, J. Proteome Res., 7(9), pp. 3922–3947 (2008).


29. 2008. Nitta and Buchholz explore decision process boundaries of h-complex neural nets.

T. Nitta and S. Buchholz, On the Decision Boundaries of Hyperbolic Neurons, in 2008 International Joint Conference on Neural Networks (IJCNN).


30. 2009. The Semantic Web starts to emerge but runs into a bottleneck regarding the best approach for probabilistic treatment.

L. Predoiu and H. Stuckenschmidt, Probabilistic Models for the SW – A Survey. http://ki.informatik.unimannheim.de/fileadmin/publication/Predoiu08Survey.pdf (last accessed 4/29/2010).


31. 2009. Baek and Robson propose that, for reasons of bandwidth limitations and security, the Internet should consist of data-centric computing by smart software robots. Robson indicates that they could be based on h-complex inference systems and link to semantic theory.

Robson, B. and Baek, O.K., The Engines of Hippocrates: From the Dawn of Medicine to Medical and Pharmaceutical Informatics, Wiley, 2009.

Robson B. (2009) “Towards Intelligent Internet-Roaming Agents for Mining and Inference from Medical Data”, Future of Health Technology Congress, Technology and Informatics, Vol. 149, 157-177 IOS Press 

Robson, B. (2009) “Links Between Quantum Physics and Thought” (A. I. Applications in Medicine) , Future of Health Technology Congress, Technology and Informatics, Vol. 149, 157-177 IOS Press. 

32. 2009. Savitha et al. develop new learning algorithms for complex-valued networks.

S. Savitha, S. Suresh, S. Sundararajan, and P. Saratchandran, A new learning algorithm with logarithmic performance index for complex-valued neural networks, Neurocomputing 72(16-18), 3771-3781 (2009).

33. 2009. Khrennikov argues for the h-complex Hilbert space as providing the “contextual” basis (underlying rationale, hidden variables, etc.) for all quantum mechanics.

Khrennikov, Contextual Approach to Quantum Formalism, Springer (2009) 

34. 2010. Robson and Vaithiligam describe how zeta theory and h-complex probabilistic algebra can resolve challenges in data mining by the pharmaceutical industry.

B. Robson and A. Vaithiligam, Drug Gold and Data Dragons: Myths and Realities of Data Mining in the Pharmaceutical Industry, pp. 25-85 in Pharmaceutical Data Mining, Ed. K.V. Balakin, John Wiley & Sons (2010).

35. 2010. PCAST. The December report by the US President’s Council of Advisors on Science and Technology calls for an XML-like Universal Exchange Language for medicine, including disaggregation of the patient record on the Internet for patient access, security, and privacy.

http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-health-it- report.pdf 

36. 2011. First description of Q-UEL in response to PCAST 2010.

Robson, B., Balis, U. G. J. and Caruso, T. P. (2011) “Considerations for a Universal Exchange Language for Healthcare.” In Proceedings of 2011 IEEE 13th International Conference on e-Health Networking, Applications and Services (Healthcom 2011), 173–176. Columbus, MO: IEEE, 2011.

37. 2011. Robson and colleagues develop the method of match-and-edit instructions for extracting information from very large numbers of patents.

Robson, B., Li, J., Dettinger, R., Peters, A., and Boyer, S.K. (2011), Drug discovery using very large numbers of patents. General strategy with extensive use of match and edit operations. Journal of Computer-Aided Molecular Design 25(5): 427-441 

38. 2011. Kuroe et al. consolidate the theory of h-complex neural nets.

Kuroe, T. Shinpei, and H. Iima, Models of Hopfield-Type Clifford Neural Networks and Their Energy Functions – Hyperbolic and Dual Valued Networks, Lecture Notes in Computer Science, 7062, 560 (2011).

39. 2012. Robson argues that h-complex algebra is an appropriate basis for Artificial Intelligence in the Pharmaceutical Industry.

Robson, B. (2012) “Towards Automated Reasoning for Drug Discovery and Pharmaceutical Business Intelligence”, Pharmaceutical Technology and Drug Research, 2012, 1:3 (27 March 2012).


40. 2013. Goodman and Lassiter attempt to reconcile and restore interest in probabilistic semantics after a long period of domination by classical logic. 
N. D. Goodman and D. Lassiter, Probabilistic Semantics and Pragmatics: Uncertainty in Language and Thought.

https://web.stanford.edu/~ngoodman/papers/Goodman-HCS-final.pdf

41. 2013. Robson argues for the importance of the h-complex approach for measures in epidemiology.

Robson, B. (2013) “Towards New Tools for Pharmacoepidemiology”, Advances in Pharmacoepidemiology and Drug Safety, 1:6.

http://www.omicsgroup.org/journals/towards-new-tools-for-pharmacoepidemiology-2167-1052.1000123.pdf

42. 2013. Robson promotes Q-UEL from a public health perspective.
B. Robson, Rethinking Global Interoperability in Healthcare: Reflections and Experiments of an e-Epidemiologist from Clinical Record to Smart Medical Semantic Web, Johns Hopkins Grand Rounds Lectures (last accessed 3/14/2013).


http://dhsi.med.jhmi.edu/GrandRoundsVideo/Feb15-2013/SilverlightLoader.html

43. 2013. Robson and Caruso describe the first version of Q-UEL in greater detail.

Robson, B, and TP Caruso (2013) “A Universal Exchange Language for Healthcare” MedInfo ’13: Proceedings of the 14th World Congress on Medical and Health Informatics, Copenhagen, Denmark, Edited by CU Lehmann, E Ammenwerth, and C Nohr. IOS Press, Washington, DC, USA. http://quantalsemantics.com/documents/MedInfo13-RobsonCaruso_V6.pdf; http://ebooks.iospress.nl/publication/34165

44. 2014. Robson et al. release a formal description of the consolidated second version of Q-UEL.

Robson, B., T. P. Caruso and U. G. J. Balis, Suggestions for a Web Based Universal Exchange and Inference Language for Medicine, Computers in Biology and Medicine, 43(12), 2297 (2013).

45. 2013. Moldoveanu expresses the view that hyperbolic quantum mechanics cannot also include wave mechanics, a possible attack on Khrennikov’s idea that hyperbolic quantum mechanics can show interference as for waves. There are signs of a growing sense that hyperbolic quantum mechanics is simply the everyday world described in terms of the machinery of traditional quantum mechanics.

Moldoveanu, Non viability of hyperbolic quantum mechanics as a theory of Nature, Cornell University Library, arXiv:1311.6461v2 [quant-ph] (2013).

46. 2013. First full description of the Hyperbolic Dirac Net and its relation to Q-UEL and to Bayes Nets.

Robson, Hyperbolic Dirac Nets for Medical Decision Support. Theory, Methods, and Comparison with Bayes Nets, Computers in Biology and Medicine, 51, 183 (2013).

http://www.sciencedirect.com/science/article/pii/S0010482514000778

47. 2014. Kunegis et al. develop h-complex algorithms for dating recommender systems.

J. Kunegis, G. Gröner, and T. Gottron, On-Line Dating Recommender Systems: the Split-Complex Number Approach (Like/Dislike, Similar/Dissimilar), http://userpages.uni-koblenz.de/~kunegis/paper/kunegis-online-dating-recommender-systems-the-split-complex-number-approach.pdf (last accessed 6/1/2014).

48. 2015. Robson describes the extension of the Hyperbolic Dirac Net to semantic reasoning and probabilistic linguistics.


Robson, B. “POPPER, a Simple Programming Language for Probabilistic Semantic Inference in Medicine”, Computers in Biology and Medicine, (in press), DOI: 10.1016/j.compbiomed.2014.10.011 (2015).


http://www.ncbi.nlm.nih.gov/pubmed/25464353

49. 2014. Yosemite Manifesto – a response to PCAST 2010 holding that the Semantic Web should provide healthcare IT, although preempted by Q-UEL.

http://yosemitemanifesto.org/ (last accessed 7/5/2014). 

50. 2015. Robson et al. describe medical records in Q-UEL format and PCAST disaggregation for patient security and privacy.

Robson, B., Caruso, T., and Balis, U. G. J. (2015) “Suggestions for a Web Based Universal Exchange and Inference Language for Medicine. Continuity of Patient Care with PCAST Disaggregation.” Computers in Biology and Medicine (in press) 01/2015; 56:51. DOI: 10.1016/j.compbiomed.2014.10.022

51. 2015. Mathematician Steve Deckelman of U. Wisconsin-Stout and Berkeley validates the theoretical principles of the Hyperbolic Dirac Net.

Deckelman, S. and Robson, B. (2015) “Split-Complex Numbers and Dirac Bra-Kets”, Communications in Information and Systems (CIS), in press.

http://www.diracfoundation.com/?p=148

From Above Link:-

The inference net on which this dualization is performed is defined as an estimate of a probability as an expression comprising simpler probabilities and/or association measures, i.e. each with fewer attributes (i.e. arguments, events, states, observations or measurements) than the joint probability estimated, where each attribute corresponds to a node of a general graph and the probabilities or association measures represent their interdependencies as edges. It is not required that the inference net be an acyclic directed graph, but the widely used BN, which satisfies that description by definition, is a useful starting point for making use of the given probabilities to address the same or similar problems.

Specifically for the estimation of a joint probability, an HDN properly constructed with prior probabilities, whether or not it contains cyclic paths, is purely real valued, and its construction principles represent a generalization of Bayes’ Theorem. Any imaginary part indicates the degree of departure from Bayes’ Theorem over the net as a whole, and the direction of conditionality in which the departure occurs; thus the HDN provides an excellent book-keeping tool for checking that Bayes’ Theorem is satisfied overall.

Specifically for the estimation of a conditional probability, it follows conversely from the above that any expression for a joint probability validated by the above means can serve as the generator of an HDN for the estimation of a conditional probability, simply by dividing it by the HDN counterparts of prior probabilities, whence the resulting net is not purely real save by coincidence of probability values.
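As a toy illustration of the book-keeping described above (a sketch based on this description, not on Robson’s code): represent the two directions of conditionality as a split-complex “dual” estimate whose real part is their mean and whose h-part is half their difference; the h-part vanishes exactly when Bayes’ theorem holds. The probability values below are hypothetical.

```python
def dual_joint(p_a_given_b, p_b, p_b_given_a, p_a):
    """Split-complex (real, h-part) estimate of the joint P(A,B) from both
    directions of conditionality.  When Bayes' theorem holds,
    P(A|B)P(B) == P(B|A)P(A) and the h-part is zero."""
    forward = p_a_given_b * p_b
    backward = p_b_given_a * p_a
    real = (forward + backward) / 2
    h_part = (forward - backward) / 2   # departure from Bayes' theorem
    return real, h_part

# Consistent probabilities: the h-part is 0 and the net is purely real.
print(dual_joint(0.8, 0.5, 0.4, 1.0))   # (0.4, 0.0)
# Inconsistent estimates: a nonzero h-part flags the departure and its direction.
print(dual_joint(0.8, 0.5, 0.5, 1.0))
```

The second call returns a nonzero h-part because 0.8 × 0.5 ≠ 0.5 × 1.0, which is exactly the kind of inconsistency the imaginary part is said to track.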

52. 2015. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories

Barry Robson and Srinidhi Boray

http://www.computersinbiologyandmedicine.com/article/S0010-4825(15)00257-7/abstract

53. 2015. Robson, B., and S. Boray, The Structure of Reasoning in Answering Multiple Choice Medical Licensing Examination Questions: Computer Studies towards Formal Theories of Clinical Decision Support and Setting and Answering Medical Licensing Examinations, Workshop Lecture presentation, Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine, 9th-11th November, Washington DC (2015).

https://www.osehra.org/sites/default/files/Computer_Exams_V10.pdf

https://cci.drexel.edu/ieeebibm/bibm2015/BIBM2015Program.pdf


Part B – Healthcare Interoperability, Standards and Data Science (Resolving the Problem)

Srinidhi Boray | Ingine, Inc | Bioingine.com


Introducing Ingine, Inc.: a startup in the incipient stages of developing the BioIngine platform, which brings advances in data science to interoperability, particularly healthcare data mining and analytics dealing with medical knowledge extraction. Below are some of the lessons learned while dealing with healthcare transformation concerns, especially the ONC’s Interoperability vision.

As an introduction, we want to include the following passage from the book

The Engines of Hippocrates: From the Dawn of Medicine to Medical and Pharmaceutical Informatics

By Barry Robson, O. K. Baek

https://books.google.com/books?id=DVA0QouwC4YC&pg=PA8&lpg=PA8&dq=MEDICAL+FUTURE+SHOCK+barry+robson&source=bl&ots=Qv1cGRIY1L&sig=BgISEyThQS-8bXt-g7oIQ873cN4&hl=en&sa=X&ved=0CCQQ6AEwAWoVChMI0a2s-4zMyAIVSho-Ch216wrQ#v=onepage&q=MEDICAL%20FUTURE%20SHOCK%20barry%20robson&f=false

MEDICAL FUTURE SHOCK

Healthcare administration has often been viewed as one of the most conservative of institutions. This is not simply a matter of the inertia of any complex bureaucratic system. A serious body with an impressive history and profound responsibilities cannot risk unexpected disruptions to public service by changing with every fashionable new convenience, just for the sake of modernity. A strong motivation is needed to change a system on which lives depend and which, for all its faults, is still for the most part an improvement on anything that went before. However, this is also to be balanced against the obligation of healthcare, as an application of science and evolving human wisdom, to make appropriate use of the new findings and technologies available. This is doubly indicated when significant inefficiencies and accidents look as if they can be greatly relieved by upgrading the system. Sooner or later something has to give, and the pressure of many such accumulating factors can sometimes force a relatively entrenched system to change in a sudden way, just as geological pressures can precipitate an earthquake. An Executive Forum on Personalized Medicine organized by the American College of Surgeons in New York City in October 2002 similarly warned of the increasingly overwhelming accumulation of arguments demanding reform of the current healthcare system…if there is to be pain in making changes to an established system, then it makes sense to operate quickly, to incorporate all that needs to be incorporated and not spin out too much the phases of the transitions, and lay a basis for ultimately assimilating less painfully all that scientific vision can now foresee. But scientific vision is of course not known for its lack of imagination and courage, and is typically very far from conservative, still making an element of future shock inevitable in the healthcare industry.

  1. Complicated vs. Complex

A) Generally, in approaching the characterization of a system there are two views: complicated and complex. Complicated deals with problems of system operations and population management, while complex problems are about the multi-variability of an individual patient’s diagnosis.

The links below provide better scenarios regarding complicated vs. complex:

http://www.beckershospitalreview.com/healthcare-blog/healthcare-is-complex-and-we-aren-t-helping-by-making-it-more-complicated.html

https://www.bcgperspectives.com/content/articles/organization_design_human_resources_leading_complex_world_conversations_leaders_thriving_amid_uncertainty/

Generally, all management concerns around operations, payment models, healthcare ecosystem interactions, etc., deal with delivering systemic efficiencies. These are basically complicated problems residing in the system, which, when resolved, yield the hidden efficiencies.

Everything that affects the delivery of clinical efficacy, by contrast, has to deal with complex problems, mostly owing to the high dimensionality (multi-variability) of longitudinal patient data.

When both complicated and complex concerns are addressed, healthcare as an overarching complex system will begin to yield the desired performance-driven outcomes.

B) Standards around interoperability have generally dealt with the following three levels of health information technology interoperability:

Ref:-http://www.himss.org/library/interoperability-standards/what-is-interoperability

From the above link:-

1) Foundational; 2) Structural; and 3) Semantic.

1 – “Foundational” interoperability allows data exchange from one information technology system to be received by another and does not require the ability for the receiving information technology system to interpret the data.

2 – “Structural” interoperability is an intermediate level that defines the structure or format of data exchange (i.e., the message format standards) where there is uniform movement of healthcare data from one system to another such that the clinical or operational purpose and meaning of the data is preserved and unaltered. Structural interoperability defines the syntax of the data exchange. It ensures that data exchanges between information technology systems can be interpreted at the data field level.

3 – “Semantic” interoperability provides interoperability at the highest level, which is the ability of two or more systems or elements to exchange information and to use the information that has been exchanged. Semantic interoperability takes advantage of both the structuring of the data exchange and the codification of the data including vocabulary so that the receiving information technology systems can interpret the data. This level of interoperability supports the electronic exchange of patient summary information among caregivers and other authorized parties via potentially disparate electronic health record (EHR) systems and other systems to improve quality, safety, efficiency, and efficacy of healthcare delivery.
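To make the three levels concrete, here is a minimal sketch in code. The pipe-delimited message layout and the one-entry vocabulary table are invented for illustration (they are not a real HL7 segment or an official LOINC lookup): foundational interoperability is the opaque exchange of the string, structural interoperability is parsing it by a shared syntax, and semantic interoperability is resolving the code against a shared vocabulary so the receiver can act on its meaning.

```python
# Illustrative sketch of the three interoperability levels.
# The message layout and vocabulary table are hypothetical,
# not a real HL7 or LOINC artifact.

RAW_MESSAGE = "OBX|2345-7|6.1|mmol/L"  # foundational level: opaque bytes exchanged

def parse(message):
    """Structural level: a shared syntax lets us split the message
    into fields without knowing what the fields mean."""
    segment, code, value, unit = message.split("|")
    return {"segment": segment, "code": code, "value": float(value), "unit": unit}

# Semantic level: a shared vocabulary lets the receiver *interpret* the code.
VOCABULARY = {"2345-7": "Glucose measurement in serum or plasma"}

record = parse(RAW_MESSAGE)
meaning = VOCABULARY.get(record["code"], "unknown concept")
print(f"{meaning}: {record['value']} {record['unit']}")
```

The point of the sketch is that each level strictly builds on the one below it: the parse is useless without the exchanged bytes, and the vocabulary lookup is useless without the parsed field.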

The above levels of interoperability only deal with achieving semantic compatibility between systems in the data transacted by the myriad systems (EHRs) as they converge into a heterogeneous architecture (HIE / IoT). This addresses only the complicated concerns within the system; it does not necessarily deal with the extraction and discernment of the knowledge hidden in the complex health ecosystem. To achieve this, for simplicity's sake, let us define the need for a second-order semantic interoperability, one concerned with the data mining approaches required to represent systemic medical knowledge. It is this medical knowledge (implicit, explicit and tacit) that together forms the evidence-based medicine much desired to facilitate any clinical decision support system.

C) The present efforts around interoperability, which center mostly on data standards (HL7v2, HL7v3, FHIR, C-CDA, ICD-10, LOINC, SNOMED, etc.) and clinical quality measures (QRDA), have addressed only the complicated concerns, not necessarily the complex problems. This is the vexation in quality measures reporting. While this has advanced the adoption of EHRs by hospitals, it is still far from making the EHR an effective decision support tool for physicians.

It must be noted that the MU2 criteria suggest that, besides achieving the health information exchange pivotal to the creation of Accountable Care Organizations (ACOs), at least five high-priority or critical health risk conditions must be addressed employing a clinical decision support system. Deservedly, this criterion creates a need to address clinical efficacy in addition to achieving the best possible system efficiencies leading to systemic performance-driven outcomes. This means a much deeper perspective is required in the interoperability efforts, to better drive the data science around data mining that can help engage physicians in realizing performance-driven outcomes, rather than leaving physicians encumbered by reimbursement-model-driven EHRs. Also, although most EHR vendors employ C-CDA to frame the longitudinal patient view, they do not necessarily send all the data to the Health Information Exchange; this truncates the full view of the longitudinal patient record available to physicians.

D) Physicians, Primary Care, Cost of Care Delivery, and the Physician Workforce Shortage

http://www.ncbi.nlm.nih.gov/pubmed/20439861

When dealing with primary care, it is desirable that today's over-burdened physicians move toward working as team leads, engaging a variety of healthcare professionals while also better enabling trained nurse practitioners. It is likewise desirable to render this work in lower-cost environments, moving away from higher-cost settings such as hospitals and emergency care facilities. This also makes the move away from service-based models toward performance-based payment models imperative.

It must be noted that changing the way an organization works by reassigning responsibilities, both horizontally and vertically, addresses only the complicated concerns of the system, not the complex problem. Again, it must be emphasized that data mining related to evidence-based medicine, which is in a way knowledge culled from the experiences of cohorts within the health ecosystem, will play a vital role in improving the much-desired clinical efficacy, ultimately leading to better health outcomes. This begins to address the complex systemic problems, while also better engaging physicians, who find mere data entry into the EHR cumbersome and intrusive, and who are unable to derive any clinical decision support from the integration of the System of systems (SoS).

  2. Correlation vs Causation

A) While we make a case for better enabling evidence-based medicine (EBM) driven by data mining as a high priority in the interoperability scheme of things, we would also like to point out the need for thorough systematic review aided by automation, which is vital to EBM. This also means dealing with Receiver Operating Characteristic (ROC) curves: http://www.ncbi.nlm.nih.gov/pubmed/15222906
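As a reminder of what an ROC curve measures, the sketch below computes true- and false-positive rates for a toy set of diagnostic test scores at every decision threshold, then integrates the area under the curve. The scores and disease labels are invented purely for illustration.

```python
# Minimal ROC computation for a toy diagnostic score.
# Scores and disease labels below are invented for illustration.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   1,    0,   0,   1,   0,   0]  # 1 = diseased

def roc_points(scores, labels):
    """Sweep the decision threshold over every observed score and
    record (false-positive rate, true-positive rate) at each step."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve, starting from (0, 0)."""
    pts = [(0.0, 0.0)] + points
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

pts = roc_points(scores, labels)
print(f"AUC = {auc(pts):.2f}")  # 0.80 for this toy data
```

An AUC of 0.5 corresponds to a useless test and 1.0 to a perfect one; systematic reviews routinely compare diagnostic methods on exactly this summary.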

https://www.sciencebasedmedicine.org/evidence-in-medicine-correlation-and-causation/

From the above link:-

“The consensus of expert opinion based upon systematic reviews can either result in a solid and confident unanimous opinion, a reliable opinion with serious minority objections, a genuine controversy with no objective resolution, or simply the conclusion that we currently lack sufficient evidence and do not know the answer.”

Another relevant reference:

Reflections on the Nature and Future of Systematic Review in Healthcare, by Dr. Barry Robson

http://www.diracfoundation.com/?p=146

In recent times, Bayesian statistics has emerged as a gold standard for developing curated EBM (http://www.ncbi.nlm.nih.gov/pubmed/10383350). In this context we would like to draw attention to the fact that while correlation, developed from the consensus of cohorts in the medical community, is important, as discussed in the article linked above, it is also important to ascertain causation. This demands a more holistic Bayesian statistics, as proposed in the new algorithms developed by Dr. Barry Robson, including those built on proven ideas in physics that advance the scope of Bayesian statistics. The approach and its impact on healthcare interoperability and analytics are discussed in the link provided below.
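As a worked reminder of the Bayesian updating discussed throughout this piece: given a prior disease prevalence and a test's sensitivity and specificity, Bayes' rule yields the posterior probability of disease after a positive result. All three numbers below are illustrative assumptions, not figures from any cited study.

```python
# Bayes' rule applied to a diagnostic test; all numbers are illustrative.
prior = 0.01        # P(disease): assumed prevalence in the population
sensitivity = 0.95  # P(positive | disease)
specificity = 0.90  # P(negative | no disease)

def posterior_given_positive(prior, sensitivity, specificity):
    """P(disease | positive) = P(+|D) * P(D) / P(+),
    where P(+) sums over the diseased and disease-free populations."""
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

p = posterior_given_positive(prior, sensitivity, specificity)
print(f"P(disease | positive) = {p:.3f}")
```

Note how a rare condition keeps the posterior below 10% even for a fairly accurate test: the false positives from the large healthy population dominate. This is exactly the kind of reasoning a decision support system must surface to clinicians.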

http://www.ncbi.nlm.nih.gov/pubmed/26386548

From the above link: –

“Abstract

We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. There is the challenge that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier ‘Big Data’ mining efforts. An issue addressed here is how much impact on decisions should sparse data have.”
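The question of how much impact sparse data should have can be illustrated with a simple pseudocount (Beta/Dirichlet) prior: with few observations, the posterior estimate stays close to the prior expectation, and only approaches the raw frequency as counts accumulate. The function below is a generic textbook sketch of that idea, not the specific treatment used in Q-UEL.

```python
# Pseudocount (Beta prior) smoothing: a generic illustration of how a prior
# tempers conclusions drawn from sparse survey counts. Not the Q-UEL method.
def smoothed_probability(successes, trials, prior_mean=0.5, prior_weight=4):
    """Posterior mean of a Beta prior with the given mean and total
    pseudocount weight, after observing `successes` out of `trials`."""
    a = prior_mean * prior_weight          # pseudo-successes
    b = (1 - prior_mean) * prior_weight    # pseudo-failures
    return (successes + a) / (trials + a + b)

# Same observed proportion (100% positive), very different sample sizes:
print(smoothed_probability(2, 2))      # small sample: pulled toward the prior
print(smoothed_probability(200, 200))  # large sample: dominated by the data
```

With 2 of 2 positives the estimate is about 0.67, far from the naive 1.0; with 200 of 200 it is about 0.99. The prior thus automatically answers "how much should sparse data move a decision": in proportion to how much evidence it actually carries.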

B) Biostatistics, Algebra, Healthcare Analytics and Cognitive Computing

Another interesting aspect that emerges is the need for biostatistics; indeed, many doctors with MD qualifications are additionally qualifying in public health management, which also deals with biostatistics. Dealing with population health on one hand and clinical efficacy on the other, interoperability via biostatistics has to deliver both views: the macro view with respect to systemic outcomes, and clinical efficacy at the micro level. Developing such capabilities implies a much grander vision for interoperability, as discussed in OSEHRA, the VA-sponsored open source effort making VistA available to the world market at a fraction of the cost. There is more discussion on the OSEHRA forum at the link below.

https://www.osehra.org/content/joint-dod-va-virtual-patient-record-vpr-iehr-enterprise-information-architecture-eia-0

From the above link:-

Tom Munnecke (the original architect of VistA): “This move to a higher level of abstraction is a bit like thinking of things in terms of algebra, instead of arithmetic. Algebra gives us computational abilities far beyond what we can do with arithmetic. Yet, those who are entrenched in grinding through arithmetic problems have a disdain for the abstract facilities of algebra.”

An interesting point in the discussions at the above link is that a case is being made for the role of data science (called knowledge engineering over the last three decades) in driving better new algorithms, including those built on proven ideas in physics, into healthcare interoperability. This helps advance the next generation of EHR capabilities, eventually emerging as a medical-science-driven cognitive computing platform. The recommendation is to employ advances in data science to move the needle away from developing a deterministic or predicated System of systems (SoS) based on schemas such as FHIM (http://www.fhims.org), whose design proves laborious and outmoded, and toward harnessing the data locked in the heterogeneous system through advanced Bayesian statistics, new algorithms, including those built on proven ideas in physics, and especially the exploitation of algebra. Delivered on a Big Data architecture as a cognitive computing platform with schema-less approaches, this has huge benefits in cost, business capability and time to market, delivering medical reasoning from the healthcare ecosystem as realized by the interoperability architectures.
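A minimal sketch of the schema-less idea: rather than forcing records into one predetermined schema, store each record as a bag of attribute-value tags and estimate conditional probabilities, the raw material of Bayesian inference, directly from co-occurrence counts. The tag names and records below are invented for illustration and are not actual Q-UEL or FHIM syntax.

```python
# Schema-less mining sketch: heterogeneous records as attribute-value tags,
# from which conditional probabilities are estimated by counting.
# Tags and records are invented for illustration (not actual Q-UEL syntax).
records = [
    {"smoker": "yes", "bmi": "high", "diabetic": "yes"},
    {"smoker": "no",  "bmi": "high", "diabetic": "yes"},
    {"smoker": "yes", "bmi": "low",  "diabetic": "no"},
    {"smoker": "no",  "bmi": "low",  "diabetic": "no", "allergy": "penicillin"},
]

def conditional(records, target, given):
    """Estimate P(target | given) by counting records. Records missing
    either attribute simply do not match; no fixed schema is required."""
    matching = [r for r in records if all(r.get(k) == v for k, v in given.items())]
    if not matching:
        return None  # no evidence at all for this conditioning event
    hits = sum(1 for r in matching if all(r.get(k) == v for k, v in target.items()))
    return hits / len(matching)

p = conditional(records, {"diabetic": "yes"}, {"bmi": "high"})
print(f"P(diabetic | high BMI) = {p}")
```

Because each record is just a set of tags, new attributes (like the lone allergy tag above) can appear at any time without a schema migration, which is the cost-and-time argument made in the paragraph above.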