Q-UEL Toolkit for Medical Decision Making :- Science of Uncertainty and Probabilities


Quantum Universal Exchange Language

Emergent | Interoperability | Knowledge Mining | Blockchain


  1. It is a toolkit / framework
  2. It is an Algorithmic Language for constructing Complex Systems
  3. It results in an Inferential Statistical mechanism suitable for a highly complex system – the “Hyperbolic Dirac Net”
  4. It involves an approach based on the premise that a Highly Complex System driven by human social structures continuously strives to achieve a higher order in its entropic journey, by continuously discerning the knowledge hidden in the system that is in continuum.
  5. A System in Continuum seeking Higher and Higher Order is a Generative System.
  6. A Generative System brings the System itself as the Method to achieve Transformation. The same holds for the National Learning Health System.
  7. A Generative System, as such, is based on Distributed Autonomous Agents / Organizations, achieving Syndication driven by Self Regulation or Swarming behavior.
  8. Essentially, Q-UEL as a toolkit / framework algorithmically addresses interoperability, knowledge mining and blockchain, while driving the Healthcare Ecosystem into a Generative Transformation that achieves higher and higher orders in the National Learning Health System.
  9. It has capabilities to facilitate medical workflow and continuity of care; to extract and represent medical knowledge from vast sets of structured and unstructured data; and to automate bio-statistical reasoning, leading to large-data-driven evidence based medicine, which in turn leads to clinical decision support systems (including knowledge management and Artificial Intelligence) and to public health and epidemiological analysis.


A Large Chaotic System driven by Human Social Structures has two contending ways.

a. Natural Selection – Adaptive – Darwinian – Survival of the Fittest – Dominance

b. Self Regulation – Generative – Innovation – Diversity – Cambrian Explosion – Unique Peculiarities – Co Existence – Emergent

The Accountable Care Organization (ACO), driven by the Affordable Care Act, transforms the present Healthcare System from adaptive (competitive) to generative (collaborative / coordinated), to achieve inclusive success and partake in the savings achieved. This is a generative systemic response, contrasting with the functional and competitive response of an adaptive system.

Natural selection seems to have resulted in functional transformation, where adaptive is the mode; it does not account for diversity.

Self Regulation seems to be a systemic outcome of integrative influence (the ecosystem), responding to the system constraints. It accounts for rich diversity.

The observer learns generatively from the system constraints the type of reflexive response required (Refer – Generative Grammar – Immune System).

From the above observation, if the theory of self regulation seems more correct and adheres to the laws of nature, in which generative learning occurs, then the assertion is that the “method” is offered by the system itself. The System’s ontology has implicate knowledge of the processes required for transformation (David Bohm – Implicate Order).

For a very large complex system,

System itself is the method – impetus is the “constraint”.

In the video below, the ability of cells to creatively create the script is discussed, which makes the case for a self regulated and generative complex system in addition to a complex adaptive system.


Further Notes on Q-UEL / HDN :-

  1. It brings Quantum Mechanics (QM) machinery to Medical Science.
  2. It is derived from the Dirac Notation that helped define the framework for describing QM. The resulting framework or language is Q-UEL, and it delivers a mechanism for inferential statistics – the “Hyperbolic Dirac Net”.
  3. It is created from a System Dynamics and Systems Thinking perspective.
  4. It is Systemic in approach; the System is itself the Method.
  5. It engages probabilistic ontology and semantics.
  6. It creates a mathematical framework to advance Inferential Statistics for studying highly chaotic complex systems.
  7. It is an algorithmic approach that creates a Semantic Architecture of the problem or phenomenon under study.
  8. The algorithmic approach is a blend of linguistic semantics, artificial intelligence and systems theory.
  9. The algorithm creates the Semantic Architecture defined by Probabilistic Ontology :- representing the Ecosystem Knowledge distribution based on Graph Theory.
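The probabilistic-ontology-as-graph idea in point 9 can be sketched minimally. The nodes, edges and probability values below are invented purely for illustration; this is a sketch of the general idea, not the Q-UEL toolkit itself.

```python
# Minimal sketch: a "probabilistic ontology" as a directed graph whose
# edges carry conditional probabilities. All values here are invented.
edges = {
    ("smoking", "hypertension"): 0.35,   # P(hypertension | smoking), hypothetical
    ("hypertension", "stroke"): 0.20,    # P(stroke | hypertension), hypothetical
    ("smoking", "stroke"): 0.08,
}

def successors(node):
    """Nodes reachable from `node` in one step, with edge probabilities."""
    return {b: p for (a, b), p in edges.items() if a == node}

def chain_probability(path):
    """Probability along a path, assuming each step is conditionally
    independent of the earlier ones (a strong simplifying assumption)."""
    prob = 1.0
    for a, b in zip(path, path[1:]):
        prob *= edges[(a, b)]
    return prob

print(successors("smoking"))
print(chain_probability(["smoking", "hypertension", "stroke"]))  # 0.35 * 0.20
```

A real knowledge-mining pipeline would discover such edges and probabilities from data rather than hard-coding them; the point here is only that graph structure plus probabilities gives a queryable representation of ecosystem knowledge.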

To make a decision in any domain, a knowledge compendium of the domain – the system knowledge – is imperative.

A System riddled with Complexity is generally a Multivariate System, and as such creates much uncertainty.

A highly complex system, being non-deterministic, requires probabilistic approaches to discern, study and model the system.

General Characteristics of Complex System Methods

  • Descriptive statistics are employed to study the “WHAT” aspects of the System.
  • Inferential Statistics are applied to study the “HOW”, “WHEN”, “WHY” and “WHERE”, probing both spatial and temporal aspects.
  • In a highly complex system, causality becomes indeterminable; meaning the correlations or relationships between the independent and dependent variables are not obviously established. Also, they seem to interchange positions. This creates a dilemma between subject vs object, and causes vs outcomes.
  • In approaching a highly complex system, since the prior and posterior are not definable, inferential techniques in which the hypothesis is fixed before beginning the study of the system become unviable.

Review of Inferential Techniques as the Complexity is Scaled

Step 1:- Simple System (turbulence level:-1)

Frequentist :- the simplest classical or traditional statistics; employed by treating the data as random under a steady state hypothesis – the system is considered not uncertain (a simple system). In Frequentist notions of statistics, probability is dealt with as a classical measure based only on the idea of counting and proportion. This technique applies probability to data, where the data sets are rather small.
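The counting-and-proportion idea is easy to make concrete. A minimal sketch, using an invented sample of patient outcomes:

```python
# Frequentist probability as pure counting and proportion.
# The outcome list below is invented sample data.
outcomes = ["responded", "responded", "no_response", "responded",
            "no_response", "responded", "responded", "no_response"]

# P(response) is simply the observed proportion in the data
p_response = outcomes.count("responded") / len(outcomes)
print(p_response)  # 5 responders out of 8 -> 0.625
```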

Increase the complexity: larger data sets, multivariate, the hypothesis model is not established, a large variety of variables; each can combine (conditionally and jointly) in many different ways to produce the effect.

Step 2:- Complex System (turbulence level:-2)

Bayesian :- the hypothesis is considered probabilistic, while the data is held at steady state. In Bayesian notions of statistics, the probability is of the hypothesis for a given set of data that is fixed. That is, the hypothesis is random and the data is fixed. The knowledge extracted contains the more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems.

Additionally, the hypothesis can be explored only in an acyclic fashion, creating Directed Acyclic Graphs (DAGs).
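The Bayesian notion – the probability of the hypothesis given fixed data – reduces to Bayes’ rule. A minimal sketch; the prevalence and test-accuracy numbers are invented for illustration:

```python
# Bayes' rule: P(H|D) = P(D|H) * P(H) / P(D)
def posterior(p_h, p_d_given_h, p_d_given_not_h):
    """Posterior probability of hypothesis H given observed data D."""
    # Total probability of the data under both hypotheses
    p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)
    return p_d_given_h * p_h / p_d

# Hypothetical numbers: 1% disease prevalence, a test with 95% sensitivity
# and a 5% false-positive rate. A positive test is far from certainty:
print(posterior(0.01, 0.95, 0.05))  # ~0.161
```

Note how the subjectivist flavor enters through the prior `p_h`: change the assumed prevalence and the posterior belief changes with it.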

Increase the throttle on the complexity: very large data sets, both structured and unstructured; the hypothesis is random; multiple hypotheses are possible; anomalies can exist; there are hidden conditions; and the need arises to discover the “probabilistic ontology”, as it represents the system and the behavior within.

Step 3: Highly Chaotic Complex System (turbulence level:-3)

Certainly the DAG is now inadequate, since we need to check probabilities as correlations and also as causations of the variables, and whether they conform to a hypothesis-producing pattern – meaning some ontology is discovered which describes the peculiar intrinsic behavior among a specific combination of the variables to represent a hypothesis condition. And there are many such possibilities within the system – hence a very chaotic and complex system.

Now the System itself seems probabilistic, regardless of the hypothesis and the data. This demands a Multi-Lateral Cognitive approach.

Telandic …. “Point – equilibrium – steady state – periodic (oscillatory) – quasiperiodic – Chaotic – and telandic (goal seeking behavior) are examples of behavior here placed in order of increasing complexity”

A Highly Complex System demands a Dragon Slayer – Hyperbolic Dirac Net (HDN) driven Statistics (bi-directional Bayesian) – for extracting the Knowledge from a Chaotic, Uncertain System.
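A hedged sketch of the arithmetic behind such bi-directional statistics: the HDN literature pairs a forward conditional P(B|A) with its reverse P(A|B) using a hyperbolic imaginary h with h² = +1 (rather than the usual i² = −1). The class below illustrates only that split-complex multiplication rule, with invented probabilities; it is not the Q-UEL machinery itself.

```python
# Illustrative split-complex ("hyperbolic") numbers: p + q*h with h*h = +1.
# In HDN-style dual probabilities, p and q relate to the forward and
# reverse conditionals of an inference step. Values below are invented.
class Hyperbolic:
    def __init__(self, p, q):
        self.p, self.q = p, q

    def __mul__(self, other):
        # (p + qh)(p' + q'h) = (pp' + qq') + (pq' + qp')h, since h^2 = +1
        return Hyperbolic(self.p * other.p + self.q * other.q,
                          self.p * other.q + self.q * other.p)

    def __repr__(self):
        return f"{self.p:.4f} + {self.q:.4f}h"

# Two hypothetical inference steps, each carrying both directions
step1 = Hyperbolic(0.8, 0.5)   # e.g. forward 0.8, reverse 0.5 (invented)
step2 = Hyperbolic(0.6, 0.3)
print(step1 * step2)
```

The point of the illustration: unlike a plain DAG, where only the forward conditional is propagated, both directions participate in every product, which is what makes the net “bi-directional.”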


What came first: Design (Gene – DNA – Chromosomes) or Building Material (Protein)? :- The Architecture Paradox

Below is the reply to the question: what came first, DNA or Protein? One seems not to exist without the other. This is a paradox.

A system is a combination of questions (the problem domain) and answers (the solution domain) – why, how, what, when, where. It also alludes to transformation – let us say driven by entropy. Hence, system dynamics.

A. DNA precedes Protein – possible

B. Relation between Gene, DNA, Chromosome and what it does to Protein

C. Paradox of DNA vs Protein – leads into Intelligent Design Discussion

D. A method to bring our mind to study them and arrive at more hypotheses and postulates, or to just appreciate the grandeur we live in.

A. Paper presented at NIH – discusses the possibility of DNA preceding protein.

DNA before proteins? Recent discoveries in nucleic acid catalysis strengthen the case.

B. Think Architecture, Talk Language. I think we are struggling to represent a complex concept structurally – a typical architecture problem, discussed here under the constraints of linearity in language.

a. Instruction Set
A chromosome is a structure containing a DNA molecule
DNA is made up of gene segments
A gene carries the basic unit of code – a stroke on the piano
A chromosome describes the script, or the note on the piano = a set of strokes
{Chromosome [DNA (GENE)]} = information = a coded tune for a set of functions to emerge
I think chromosomes also know how to orchestrate the choreography that finally realizes the living organism.

b. Construction Material
Protein is the building block. It gets instructions from the set of chromosomes, and it builds, creating complex forms from simple building blocks. Complex forms are uniquely created with different functional capabilities.

The more complex the living organism, the more complex its set of chromosomes. Simpler organisms have smaller numbers of chromosomes. Humans have 23 pairs, that is 46 chromosomes.

DNA, Chromosomes and Gene

Discussion on DNA – Protein

C. The paradox of DNA vs Protein, leading into the discussion about Intelligent Design below

D. My own spin – studying System Dynamics and the Implicate Order proposed by David Bohm

The Design (DNA) was created either by a designer or by an “emergence process – emergent architecture”. Also, it could be affected by either causality or acausality.

The Chromosome (DNA) is the implicate order – by a generative process it creates the explicate order (everything external) in rich diversity. It is such an architecture that its ontology contains the process of creation (notes, tunes, choreography and orchestration).

Complex subjects or ideas need to be investigated employing integrative tools. To understand the how vs what paradox, we have applied ideas from what we have understood of the world through the following disciplines – biology, chemistry, physics and economics, plus mathematics. It is the fight for resources (economics) that brings in the modes by which life forms emerge from their simpler forms into complex forms. The fight for resources is both inter and intra. The Fiscal Cliff is the fight amongst ourselves – Rich and Poor: who should own the resources and who should benefit. It is a fight for control and, eventually, for our life.

The implicate order, studied within the notion of a system driven by entropy, discusses energy and the need to achieve stability via an entropic journey that creates order and chaos. This creates various social structures with varying stresses. Order and Chaos are English words to describe the state of entropy as it dynamically exists in a continuum – System Dynamics. This is very well discussed in the video below.

A very beautiful video, to see whether a different perspective on thermodynamics helps our understanding of complexity, and even more of life itself.

password “montana”

The unique form that a living organism takes could probably be described by combining macro and micro physics (quantum dynamics) and chemistry. Applying Pauli’s Exclusion Principle – different energy patterns affect composition, so we have a periodic table.

Operating Model – Have Federal Agencies abandoned creating the “Enterprise Transition Plan”? The ETP is challenging for the OCIO

Enterprise Transition Plan – Line Of Sight – Operating Model

Ref:- FEAC Institute, FEA Guidance, INGINE INC, Srinidhi Boray

Conceiving a coherent modernization plan and executing it has always been a challenge for the OCIO. Enterprise Transition Plans generally document the visions, goals and capabilities at the strategic level, and then progress to envisage the “target conceptual architecture”, while planning the tactical efforts that work through the mechanisms of the architecture lifecycle needed to transition towards the target enterprise conceptual architecture. This results in the “roadmap”. The document that holistically speaks to the organization’s capability to reach such a roadmap is the “Enterprise Transition Strategy”.

Reading the Enterprise Transition Strategy, the following should be evident:

Organization’s maturity and competence to perform and embrace the future capabilities

Organization’s present constraints in delivering the needed “capabilities”

The profile of the architecture cross sections (segments) needing modernization to develop the intended “portfolio of capabilities”

Investment Profile – planned to execute segment modernization

The governance structure and mechanism that engages the different functional capabilities, such as leadership – management, lines of business, enterprise architecture, capital planning and program management. Together this group develops and vets the strategic plans, tactical plans and implementation plans, governed by the following life cycles:

  • Performance Management (Architect, Invest, Implement)
  • Architecture Segment (Notional, Planned, In-Progress, Completed)
  • Release Architecture (sadly, most do not plan this – it is here that the interdependence of architectures or capabilities is realized and maintained)
  • Capital Planning (Pre-select, Select, Control & Evaluate)
  • Investment (Mixed, Steady State, Development / Modernization & Enhancement – DME)

A “Sequence Plan”, along with an “Implementation Plan”, is developed to reach the desired Target Architecture (a temporal perspective).

Finally, the Operating Model – the most important component. It assimilates all the decisions chosen to create operating leverage: the degree or order of leverage needed to deal with the inherent systemic complexities, set against the effort of creating economies of scale in delivering the enterprise capabilities essential to the enterprise mission.

There is more discussion on creating the Enterprise Transition Strategy at the link below.

Transformation Framework Capturing an Operating Model

Distortions lead to Cancerous Growth within the Enterprise

Programmed Cell Death is a very important function to understand, to gain insight into the way Transformation needs to occur. When distortions occur in Enterprise Engineering, they lead to an obvious cancerous growth, which has no easy remedy.

How many CIOs in the market are inadvertently responsible for distortions?! Countless.

The Case for Cautious Optimism – BusinessWeek


Leo Apotheker – SAP Executive

This lack of sustainability bothers me, and many business leaders have come to share it. My discussions with other CEOs reveal that we will not go back to our merry, pre-crisis ways of limitless consumption and exuberant investment fueled by excessive liquidity. The consensus is that we need better models for capitalism in the 21st century. This crisis has laid bare the need for more clarity in business practices, greater transparency in reporting standards, and above all, the dire need for more sustainable business models.

What we want Federal CIO to do


Vivek Kundra’s Resume: Much Ado About Nothing?
Andrea DiMaio

August 14th, 2009 · 2 Comments

Over the last two days, the blogosphere has witnessed an interesting debate about whether Vivek Kundra’s resume is entirely accurate and sufficient for his current role as U.S. Federal CIO. It all started with a blog post by John Dvorak, where he cast doubts on Vivek’s academic achievements and his experience outside the public sector, followed by a post by Gauthan Nagesh on NextGov, shedding light on Vivek’s academic records and challenging Dvorak’s positions. It is quite intriguing to look at the comments on either blog, as well as on many other blogs and online articles (just search for “Vivek Kundra qualifications” and you’ll find quite a lot).

Besides observing that politics are the same pretty much anywhere, with political appointees being regularly targeted by their opponents, the press and more recently by blogs (quite ironic in Vivek’s case, as he is a fervent believer in the power of Web 2.0), I am not really interested in whether these allegations are founded or not.

What I am more interested in is the fundamental question behind much of this discussion, i.e. is he qualified for the job at hand? In order to answer this question one has to look at (1) the exact job description and (2) his achievements in related positions.

As far as (1), the job description in the White House’s announcement of Vivek’s appointment was:

The Federal Chief Information Officer directs the policy and strategic planning of federal information technology investments and is responsible for oversight of federal technology spending. The Federal CIO establishes and oversees enterprise architecture to ensure system interoperability and information sharing and ensure information security and privacy across the federal government. The CIO will also work closely with the Chief Technology Officer to advance the President’s technology agenda

Words are important here. He is not responsible for the federal IT budget (agencies are), but for its oversight. He directs policies and planning in order to advance the President’s technology agenda. Obama’s agenda clearly is about change in a number of areas, including IT as an important enabler of change. So I guess one of the most important traits the President was looking for was the ability to be a change agent.

Here comes (2): what did he do in the past to show such a trait? Well, I would argue that his achievements in D.C. as CTO got the attention of many, ranging from how he changed portfolio management to how he made procurement more transparent, up until his venture into crowdsourcing applications. He also received a number of recognitions during his tenure in D.C.

Those who have been reading this blog for some time know that I like Vivek and wished him well when he went into some trouble shortly after his appointment. I do not think that allegations or even facts about his qualifications as a student or an entrepreneur can deny his nature as a change agent.

Where I believe the challenges lie, for him as well as for the whole administration, is in deciding how to prioritize change, and how to find the right blend between continuity and innovation.

The risk is that in between open data, cloud computing, government 2.0, support to major programs like national broadband and health IT, Obama’s IT team finds itself chewing too much too soon.

If there is one thing that needs to be sorted out now, it is how the CIO and CTO roles relate to each other. A few days ago I watched an interview that CTO Aneesh Chopra gave to CNET, and could not get a straight answer to this, although the interviewers asked him a direct question. It is quite possible that roles, responsibilities and priorities have already been sorted out. But then, could we please know?


Tags: e-government
Jeff DePasquale // Aug 14, 2009 at 7:48 am

Net-net: I support Vivek as an Executive Partner under EXP. I find his approach to be refreshing and focused. Not sure what his paper qualifications are, but he’s demonstrated to our team his calculated ability to step outside the “beltway box”.

Srinidhi Boray // Aug 14, 2009 at 11:08 am

“The Federal CIO establishes and oversees enterprise architecture to ensure system interoperability and information sharing and ensure information security and privacy across the federal government.”

Enterprise Architecture, no doubt, is a mechanism that can help facilitate change. John Zachman himself, in his FEAC training, mentions that change – a ‘step function’ – can best be anticipated via the structures described in the Enterprise Architecture. And EA is something that is learned over a period of time from experience – unless one is born a savant.

Lacking architecture experience, most CIOs in both the federal and commercial sectors have made strategic decisions that have not manifested tactically with the desired results. There are numerous cases in the Federal sector. Should the OIG in each agency work with motivation, a lot of ill-conceived plans and the associated ill-spent millions of dollars could be purged. But who cares. Talking EA has become a fashion. And planners lacking a design mind introduce an ‘empirical dilemma’ that is evident in the Federal Segment Architecture Methodology.

Within the CCA, A-130 Circular ‘information dissemination’ is just one portion of federal responsibility. Web 2.0, or any social networking although good, does not solve the country’s looming problem. Such as, the problem that HHS confronts – medicare, medicaid; that VA confronts – Veteran Patient Health Information system. So on and so forth, there are some very serious problem looming and they have no easy answers both in terms of budget planning and managing the architecture complexity. Here is where we want the Federal CIO to be active. Not dealing with superlatives and in the conceptual solutions domains. Government must be concerned with defining accurately the ‘problem domain’, not engage with ’solution domain’, but seek it instead.