systems thinking

Quantum Theory Driven (QEXL Approach) Cognitive Computing Architecture Resolving Healthcare Interoperability (Big Data – HIE/ACO)

http://www.BioIngine.com

[healthcare cognitive computing platform]

Conquering Uncertainties Creating Infinite Possibilities

(Possible application :- Achieving Algorithm Driven ACO)


Introduction

The QEXL Approach is a Systems Thinking driven technique designed with the intention of developing “Go To Market” solutions for Healthcare Big Data applications requiring integration among Payor, Provider, Health Management (Hospitals), Pharma, etc., where systemic complexities teetering on the “edge of chaos” pose enormous challenges to interoperability, owing to the plethora of healthcare system-integration standards and the need to manage unstructured data in addition to the structured data ingested from diverse sources. Additionally, The QEXL Approach targets the creation of Tacit Knowledge Sets by inductive techniques and probabilistic inference from diverse sets of data characterized by volume, velocity, and variability. In fact, The QEXL Approach facilitates algorithm-driven Proactive Public Health Management, while rendering business models for Accountable Care Organizations most effective.

The QEXL Approach is an integrative, multivariate, declarative cognitive-architecture proposition for developing Probabilistic Ontology driven Big Data applications that create interoperability among Healthcare systems. It is imperative to develop an architecture that enables systemic capabilities such as Evidence Based Medicine, Pharmacogenomics, biologics, etc., while also creating opportunities for studies such as Complex Adaptive Systems (CAS). Such an approach is vital to developing an ecosystem as a response to mitigate the Healthcare systemic complexities. CAS studies in particular make it possible to integrate both macro aspects (such as epidemiology) related to efficient Healthcare Management Outcomes, and micro aspects (such as Evidence Based Medicine and Pharmacogenomics, which help achieve personalization of medicine) that deliver efficacy in Healthcare delivery, thereby helping achieve systemic integrity. In The QEXL Approach, QEXL stands for “Quantum Exchange Language”, and Q-UEL is the initial proposed language. The QEXL Consortium embraces Quantal Semantics, Inc. (NC) and Ingine, Inc. (VA), and collaborates with The Dirac Foundation (UK), which has access to Professor Paul Dirac’s unpublished papers. The original consortium grew as a convergence of responses to four stimuli:

  1. The “re-emerging” interest in Artificial Intelligence (AI) as “computational thinking”, e.g. under the American Recovery Act;
  2. The President’s Council of Advisors on Science and Technology December 2010 call for an “XML-like” “Universal Exchange Language” (UEL) for healthcare;
  3. A desire to respond to the emerging Third World Wide Web (Semantic Web) by an initiative based on generalized probability theory  – the Thinking Web; and
  4. In the early course of these efforts, a greater understanding of what Paul Dirac meant in his Nobel Prize dinner speech, where he stated that quantum mechanics should be applicable to all aspects of human thought.

The QEXL Approach

The QEXL Approach is developed from considerable experience in Expert Systems, linguistic theory, neurocognitive science, quantum mechanics, mathematical and physics-based approaches to Enterprise Architecture, Internet Topology, Filtering Theory, the Semantic Web, Knowledge Lifecycle Management, and principles of Cloud Organization and Integration. The idea of a well-formed probabilistic reasoning language is simple. Importantly, its more essential features for reasoning and prediction are correspondingly simple, such that the programmers need not be humans but can be structured and unstructured (text-analytic) “data mining” software robots. We have constructed a research prototype Inference Engine (IE) network (and, more generally, a program) that “simply” represents a basic Dirac notation and algebra compiler, with the caveat that it extends to Clifford-Dirac algebra; notably, a Lorentz rotation of the imaginary number i (such that ii = -1) to the hyperbolic imaginary number h (such that hh = +1), corresponding to Dirac’s s (γtime, or γ5), is applied.
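The hyperbolic imaginary number mentioned above is also known as the split-complex unit. Below is a minimal arithmetic sketch (our own illustration, not Q-UEL code) showing the defining property hh = +1 and the projectors (1 ± h)/2 whose idempotence is what allows a hyperbolic value to be split into two independent directions of effect:

```python
class Hyperbolic:
    """Split-complex number a + h*b, where h*h = +1 (unlike i*i = -1)."""
    def __init__(self, a, b=0.0):
        self.a, self.b = float(a), float(b)

    def __mul__(self, other):
        # (a1 + h b1)(a2 + h b2) = (a1 a2 + b1 b2) + h (a1 b2 + b1 a2)
        return Hyperbolic(self.a * other.a + self.b * other.b,
                          self.a * other.b + self.b * other.a)

    def __repr__(self):
        return f"{self.a:+.3f} {self.b:+.3f}h"

h = Hyperbolic(0, 1)
print(h * h)            # +1.000 +0.000h : the defining property hh = +1

# The projectors (1 + h)/2 and (1 - h)/2 are idempotent and mutually
# annihilating: multiplying by them extracts two independent components.
p_plus, p_minus = Hyperbolic(0.5, 0.5), Hyperbolic(0.5, -0.5)
print(p_plus * p_plus)   # +0.500 +0.500h : idempotent
print(p_plus * p_minus)  # +0.000 +0.000h : annihilating
```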

[Outside the work of Dr. Barry Robson, this approach has not been tried in the inference and AI fields, with one highly suggestive exception: since the late 1990s it has occasionally been used in the neural network field by T. Nitta and others to solve the XOR problem in a single “neuron” and to reduce the number of “neurons” generally. Also suggestively, in particle physics it may be seen as a generalization of the Wick rotation (multiplying time by i) used by Richard Feynman and others to render wave mechanics classical. It retains the mathematical machinery and philosophy of Schrödinger’s wave mechanics but, instead of probability amplitudes as wave amplitudes, it yields classical but complex probability amplitudes encoding two directions of effect: “A acts on B, and B differently on A”. It maps to natural language, where words relate to various types of real and imaginary scalar, vector, and matrix quantities. Dirac’s <bra| … |ket> becomes the XML-like semantic triple <subject| relationship |object>.]
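The XOR result credited above to T. Nitta can be illustrated with a toy single “neuron” whose net input is complex. The weights and the quadrant-based activation below are our own illustrative choices, not Nitta's published parameters; a real linear threshold unit cannot separate XOR, but a complex-valued one can:

```python
def complex_xor_neuron(x1, x2):
    # Feed x1 into the real axis and x2 into the imaginary axis:
    # weights w1 = 1, w2 = i, bias = -(0.5 + 0.5i).
    net = complex(x1, x2) - complex(0.5, 0.5)
    # Fire when the net input lands in the 2nd or 4th quadrant of the
    # complex plane (real and imaginary parts of opposite sign).
    return 1 if net.real * net.imag < 0 else 0

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, complex_xor_neuron(x1, x2))  # reproduces the XOR truth table
```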

The QEXL Approach involves the following interdependent components.

  • Q-UEL (Probabilistic Inference + Phenomenon Of Interest): Addresses global issues that potentially pervade all human endeavors, and hence universal interoperability is of key importance
  • Kodaxil (Inference Engine + Semantic Inferencing): A project addressing universal meaning underlying diverse natural languages on the Internet, and the use of that in knowledge representation
  • Fluxology (Inference Engine + Decentralized Infra): A link infrastructure for intra- and inter-cloud interoperability and integration in a coherent high-level “metaware” environment. This component could also be replaced with simpler, industry-ready solutions such as MarkLogic® Enterprise NoSQL Database on the Hadoop Distributed File System.

In an endeavor of this kind the partitions of work are inevitably artificial; it is important that this does not impede the integrity of optimal solutions. The most important aspect of The QEXL Approach is that, architecturally, Probabilistic Inference (PI) and the Data Architecture for the Inference Engine (IE) are designed to be cooperative: software robots are created as PI and IE interact, and the inference knowledge gained by PI and IE provides rules for solvers (robots) to self-compile and conduct queries. This is the grandeur of the scheme: the approach facilitates programming by compilers so that writing the inference network is easy, yet it is not required to write the inference net as input code to compile, with the exception of reusable metarules, expressed as Dirac expressions with variables, that process other rules by categorical and higher-order logic. The robots are designed and programmed to do the remaining coding required to perform as solvers, so the notion of a compiler disappears under the hood. The robots are provided with well-formed instructions as well-formed queries. Once inferences are formed, different “what-if” questions can be asked: given that probability, or that being the case, what is the chance of…, and so on. It is as if, having acquired knowledge, the Phenomenon Of Interest (POI) is in a better state to explore what it means. Hyperbolic Dirac Networks (HDNs) are inference networks capable of overcoming the limitations imposed by Bayesian Nets (and statistics) and of creating generative models richly expressing the POI by the action of expressions containing binding variables. This may be thought of as an Expert System, but analogous to Prolog data and Prolog programs that act upon the data, albeit here a “probabilistic Prolog”.
The advantages over Bayes Nets as a commonly used inference method should be stated upfront; but rather than competing with such methods, the approach may be regarded as extending them. Indeed, a Bayes Net, as a static directed acyclic conditional-probability graph, is a subset of the Dirac Net, a static or dynamic general bidirectional graph with generalized logic and relationship operators, i.e., one empowered by the mathematical machinery of Dirac’s quantum mechanics.
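The bidirectionality claimed for Dirac Nets can be sketched with a toy encoding (our illustration, not Robson's exact formulation): a Bayes Net edge stores only P(A|B), whereas a hyperbolic number z = avg + h·hdiff can pack both P(A|B) and P(B|A) into one value, with the projectors (1 ± h)/2 recovering each direction:

```python
records = [  # toy joint observations of (finding, condition)
    ("cough", "flu"), ("cough", "flu"), ("cough", "cold"),
    ("fever", "flu"), ("fever", "flu"), ("fever", "flu"),
    ("cough", "cold"), ("fever", "cold"),
]

def dual_probability(a, b):
    """Return (avg, hdiff) so that z = avg + h*hdiff encodes both directions."""
    n_ab = sum(1 for r in records if r == (a, b))
    n_a = sum(1 for r in records if r[0] == a)
    n_b = sum(1 for r in records if r[1] == b)
    p_a_given_b, p_b_given_a = n_ab / n_b, n_ab / n_a
    return (p_a_given_b + p_b_given_a) / 2, (p_a_given_b - p_b_given_a) / 2

avg, hdiff = dual_probability("cough", "flu")
print("P(cough|flu) =", avg + hdiff)  # projection onto (1+h)/2 recovers P(a|b)
print("P(flu|cough) =", avg - hdiff)  # projection onto (1-h)/2 recovers P(b|a)
```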

 The QEXL Approach Theory :- Robson Quantitative Semantics Algebra (RQSA)

Developed by Dr. Barry Robson

Theory :- The QEXL Approach is based on Robson Quantitative Semantics Algebra (RQSA) (link to development of the algorithm: overcoming limitations of the Gold Standard Bayesian Network to resolve uncertainty while developing a probabilistic ontology)

Impact Of The QEXL Approach

The QEXL Approach, creating a Probabilistic Ontology based on Clifford-Dirac algebra, offers immense opportunity to advance architectures that tackle large looming problems involving Systems of Systems, in which vast amounts of uncertain information emerge. Generally, such systems are designed and developed employing Cartesian methods, and so do not offer a viable way to deal with vast uncertain information when ridden with complexity; especially when the complexity of the context requires multiple ontologies, such a system inherently defies Cartesian methods. The QEXL Approach develops into an ecosystem response: it overcomes the Cartesian dilemma (link to another example of the Cartesian Dilemma) and allows generative models to emerge that richly express the POI. The models develop generatively, such that the POI behavior, sufficiently abstracted, lends the IE and the Solvers to varieties of evidence-based studies, and also allows systemic studies of Complex Adaptive Systems and Complex Generative Systems afflicted by multiple cognitive challenges. In particular, The QEXL Approach has the potential to address complex challenges such as Evidence Based Medicine (EBM), a mission that DoD’s Military Health System envisions as it modernizes its Electronic Health Record System, the Veterans Health Information Systems and Technology Architecture (VistA). Vast potential also exists in addressing the Veterans Administration’s (VA) Million Veteran Program (MVP), an effort by the VA to consolidate genetic, military-exposure, health, and lifestyle information together in one single database. By identifying gene-health connections, the program could consequentially advance disease screening, diagnosis, and prognosis, and point the way toward more effective, personalized therapies.

Although The QEXL Approach is currently targeted at the healthcare and pharmaceutical domains, where recognition of uncertainty is vital in observations, measurements, and predictions, and probabilities underlie a variety of medical metrics, the scope of application is much more general. The QEXL Approach aims to create a generic multivariate architecture for complex systems, characterized by Probabilistic Ontology, that employs generative order to model the POI and facilitates the creation of “communities of interest” by self-regulation in diverse domains of interest requiring the integration of disciplines into complex studies. The metaphor of the “Cambrian Explosion” may aptly represent the enormity of the possibilities The QEXL Approach can stimulate in advancing studies that tackle large systemic concerns riddled with uncertain information and random events.


The inference engine can be conceptualized into solutions such as MarkLogic NoSQL + Hadoop (HDFS). http://www.marklogic.com/resources/marklogic-and-hadoop/

It is interesting to note that, in the genesis of the various NoSQL solutions based on Hadoop, insights have emerged about the need to design the components recognizing their cooperative existence.

The Goal of The QEXL Approach: It Is All About Contextualization

The goal in employing The QEXL Approach is to realize a cognitive multivariate architecture for Probabilistic Ontology, advancing the Probabilistic Ontology based architecture for context-specific applications such as Healthcare. Specifically, The QEXL Approach will develop PI that helps create generative models depicting the systemic behavior of a POI riddled with vast uncertain information. Generally, uncertainty in the vast information is introduced by the System-of-Systems complexity that is required to resolve multiple ontologies, standards, etc.; these further introduce cognitive challenges. The further goal of The QEXL Approach is to overcome such challenges by addressing interoperability at all levels, including the ability to communicate data and knowledge in a way that recognizes uncertainty in the world, so that automated PI and decision-making are possible. The aim is semiotic portability, i.e., the management of signs and symbols, dealing especially with their function and interactions in both artificially constructed and natural languages. Existing systems for managing semantics and language are mostly systems of symbolic, not quantitative, manipulation, with the primary exception of BayesOWL. RQSA, named Robson Quantitative Semantic Algebra after its author Dr. Barry Robson to distinguish it from other analogous systems, underlies Q-UEL. It is the development of (a) details of particular aspects of Dirac’s notation and algebra found to be of practical importance in generalizing and correctly normalizing Bayes Nets according to Bayes’ Theorem (i.e., controlling coherence, which ironically Bayes Nets usually neglect, as they are unidirectional), (b) merged with the treatment of probabilities and information based on finite data using the Riemann Zeta function, which he has employed for many years in bioinformatics and data mining (http://en.wikipedia.org/wiki/GOR_method), and (c) the extension to more flavors of hyperbolic imaginary number to encode intrinsic “dimensions of meaning” under a revised Roget’s thesaurus system.
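Point (b) can be illustrated with a hedged sketch of how partial sums of the harmonic series (the Riemann zeta function at s = 1, truncated at finite n) yield cautious log-probability estimates from finite counts. This follows our reading of the GOR-style treatment, not a verbatim RQSA formula: under a uniform prior, the Bayesian expectation of ln p after observing n successes in N trials is ψ(n+1) − ψ(N+2), and since ψ(n+1) = −γ + H_n the Euler constants cancel, leaving a difference of partial harmonic sums:

```python
import math

def harmonic(n):
    """Partial sum H_n of the harmonic series (zeta(1) truncated at n)."""
    return sum(1.0 / k for k in range(1, n + 1))

def expected_ln_p(n, N):
    """E[ln p] for n successes in N trials under a uniform prior.

    psi(n+1) - psi(N+2) = H_n - H_{N+1}: the -gamma terms cancel.
    """
    return harmonic(n) - harmonic(N + 1)

n, N = 3, 10
print(expected_ln_p(n, N))  # cautious estimate from finite data
print(math.log(n / N))      # the naive ln(n/N) is more extreme (more negative)
```

The point of the construction is that sparse counts are automatically moderated: the finite-sum estimate is less extreme than the maximum-likelihood ln(n/N), and the two converge as the data grow.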

The Layers of the Architecture Created by The QEXL Approach

The QEXL Layered View


Layer 1- Contextualization: Planning, Designing driven by Theories 

A.    Probabilistic Ontology creating Inferencing leading into Evidence Based Medicine

i.     Aspects addressed by Q-UEL Tags and Kodaxil Inferencing

  1. Autonomy / Solidarity
  2. Inferencing (Kodaxil and Q – UEL)
  3. MetaData
  4. Security / Privacy
  5. Consented vs Un-consented Data
  6. Creating Incidence Rule (predicated – Q-UEL and Kodaxil)

ii.     Kodaxil:-  Enforcing Semantics across data sources (global text and data interoperability) – universal meaning underlying diverse natural languages on the Internet

iii.     Fluxology:- Logical Meta Data Cloud (A link infrastructure for intra- and inter-cloud interoperability and integration in a international setting)

  1. Adaptive
  2. Emergent Data Usage Patterns (networks of networks – enabled by Probabilistic Ontology rules)
  3. Modeless Emergent Hierarchies
  4. Federation and Democratization Rule for Data (contract, trust, certificates, quality)

B.    Development of Probabilistic Model Representing Universal Abstraction of Phenomenon Of Interest

C.   Targeting Architecture to Application

  • Evidence Based Medicine
  • Genomics
  • Systemic Healthcare Studies
  • etc

Layer 2 – A: Operational Architecture (Logical )

A.    Reference Architecture

  1. Business Con Ops (Use cases)
  2. Conceptual Target Solution Architecture

Layer 2 – B: Data Management – Data Ingestion and Processing 

  1. The processing of entries in the source data into a form suitable for data mining
  2. The data mining of that processed data to obtain summary rules
  3. The capture of the appropriate released summary rules for inference
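The three steps above can be sketched in miniature (an illustration only; Q-UEL's actual rule format is not reproduced here): normalize raw source entries, mine conditional-probability “summary rules” from them, and release only the well-supported rules for inference:

```python
from collections import Counter

raw = ["Flu; Cough", "flu;fever", "COLD; cough", "Flu; Fever", "cold ;cough"]

# 1. Processing: normalize case/whitespace into (condition, finding) pairs.
processed = [tuple(p.strip().lower() for p in row.split(";")) for row in raw]

# 2. Data mining: estimate P(finding | condition) from the processed pairs.
pair_counts = Counter(processed)
cond_counts = Counter(c for c, _ in processed)
rules = {(c, f): n / cond_counts[c] for (c, f), n in pair_counts.items()}

# 3. Capture: release only rules with enough support to be worth keeping.
MIN_SUPPORT = 2
released = {k: p for k, p in rules.items() if pair_counts[k] >= MIN_SUPPORT}
print(released)  # well-supported conditional-probability summary rules
```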

B.    Data Storage and Retrieval, Transactions

  1. Secure Storage and Retrieval
  2. Enable Secure Transactions
  3. Secure Data Exchange among several stake-holders and data owners

C.    Data Lifecycle, Data Organization Rules, Data Traceability to the Events, 

  1. Security and privacy by encryption and disaggregation of the EHR in a manner that is balanced against authorized access for extraction of global clinical and biomedical knowledge.
  2. Mechanisms for fine-grained consent permitting sharing and data mining.
  3. Mechanisms for secure alerting of patient or physician by backtrack when an authorized researcher or specialist notes that a patient is at risk.
  4. Structure and format that allows all meaningful use cases to be applied in reasonable time, including large-scale data mining.
  5. Assemblies across sources and data users forming contextual work patterns
  6. Hardened Security Framework

D.    Large EHR repository scaling

E.    Data Mining Rules

F.     Extracting and creating Incidence Rules

G.    Experimenting, observing and creating Semantic Inferences

H.    Visualization 

The two layers below can be implemented on a variety of Big Data platforms such as Hortonworks, Pivotal, and Altiscale.

Layer 3 – Application Layer (Schema-less for structured and unstructured Knowledge Repository – KRS)

Layer 4 – Infrastructure Architecture (Physical) (Hadoop and MapReduce for Large Data File-management and Processing; and Distributed / Concurrent Computations)

For Whom The Heck is Enterprise Architecture Not?

Thinker Practitioner

While thinking through the ideas discussed in this blog, I have been probing the tenets that create the fundamental characteristics of the system, such as :-

  • Division Of Labor – the most important idea that revolutionized industrialization and for rise of capitalism
  • Commoditization vs Specialization
  • Cost of production under Economy Of Scale: Division vs Multiplexing (manual vs automation)
  • Holistic vs Reductionist
  • Organic vs Inorganic (natural dichotomy)
  • Autonomy (increased self sufficiency) vs Corporate Sovereignty
  • Natural Selection leading into Adaptive vs Self Regulation leading into Generative
  • Centralization vs Polycentrism; and Federation
  • Simplicity vs Complicated (not to be confused with Complex System)

The above premises must be used as background in probing the discipline of EA, which is essentially a Systems Theory integrating several disciplines such as sociology, economics, business management, and information technology.

Defeating Cognitive Bias and Instant Gratification

https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art12.html

Basically, I have been battling in my head over why this discipline, Enterprise Architecture, went so whacky in the corridors of corporations. I think all the stuff discussed below are symptoms. The core reason, I think, lies in the deviant behavior that corporate culture tends to promote, driven by short-term gains. Let us call this deviant theory (Corporate) Anomie, a concept developed by Emile Durkheim, the French sociologist, who introduced Anomie in his book The Division of Labor in Society, published in 1893, and later expanded on it in Suicide, published in 1897. Emile Durkheim is considered the “father of sociology”.

1. Not for those Not solving Systemic Concerns

https://ingine.wordpress.com/2012/08/03/transformative-enterprise-architecture-framework-connecting-strategy-tactical-operational-execution-implementation/

https://ingine.wordpress.com/2007/08/22/enterprise-architecture-economic-model-system-dynamics/

Enterprise Architecture (EA) as a discipline is engaged to solve problems within a systemic context: (a) the challenge of realizing business strategy by enabling relevant business capabilities, (b) delivered by a set of tactical (operational) objectives, achieved by (c) making informed and decisive investments in technologies. When such systemic concerns are not being addressed, EA is overkill. While EA strives to solve macro concerns, it does so by aligning several well-designed sets of micro objectives.

2. Not for those who lack appreciation for Order and Maturity 

Generally, EA as a discipline is most desirable when an enterprise strives to scale the order of maturity needed to manage complexity by rational means. To develop organizational skills, a decisive competencies framework is desired.

https://ingine.wordpress.com/2010/03/17/enterprise-architecture-maturity-model-framework-federal/

https://ingine.wordpress.com/2012/08/01/developing-enterprise-architecture-practitioner-competamcies/

https://ingine.wordpress.com/2012/08/02/dod-ea-competencies-development-framework/

3. Not for those who cannot deal with Abstraction

https://ingine.wordpress.com/2007/08/01/teaching-an-elephant-to-dance/

EA is not for those who have not trained their minds to think in abstraction. Especially those who are comfortable dealing with physicality and forms will require a head-wrenching exercise to hone the mind to deal with varieties of abstraction, while representing them with varying semantics that help in delineation, thus leading to layered representation in terms of a framework.

“Separation of Concerns” is one such technique that achieves delineation by the introduction of semantics, and eventually it also helps in the design of an EA Framework.

https://ingine.wordpress.com/2007/08/17/christopher-alexander-father-of-pattern-language/

https://ingine.wordpress.com/2008/03/12/ciao-interesting-pursuit-after-ea-ontology/

EA is not for those who, in abstraction, find it difficult to delineate the contextual from the conceptual, the conceptual from the logical, and the logical from the physical.

4. Not for Reductionist; instead suited for Holistic Thinkers

https://ingine.wordpress.com/2007/08/08/the-black-swan/

https://ingine.wordpress.com/2008/03/15/six-sigma-aids-only-linear-transformation-not-non-linear-radical-transformation/

EA is not for those who are only reductionist in approach and who think everything in the world can be reduced to a simple set of objects or objectives. EA is not necessarily Cartesian and certainly not for linear thinkers. EA is not about immediately discovering implementation; in fact, it is delayed gratification, introducing a decisive rationalizing process before subjective strategy turns into executable objective actions yielding the best business results by leverage.

5. Not for those who pursue Transformation without Goals

Although EA might help manage both the organic and inorganic growth of an organization, by itself it is a discipline dealing with inconsistencies owing to the structure and mechanisms found within, and their impact on transformation augmenting enterprise growth.

https://ingine.wordpress.com/2008/12/06/enterprises-in-generative-transformation-akin-to-universe-life-cycle/

6. Not for those who pursue merely Functional Goals and NOT be concerned with Ecosystem’s Harmony 

https://ingine.wordpress.com/2007/08/03/huichols-shamanic-vision-architecture-of-the-consciousness/

EA is not a mere IT role; it is an especially difficult role for those who have worked only in IT infrastructure and in delivering such services. EA is not limited to the physical layer.

EA is not for those who find it difficult to distinguish business strategy from business architecture, business architecture from application architecture, and application architecture from technology (infrastructure) architecture. Importantly, EA is an overarching and all-encompassing meta-architecture, inclusive of all those levels of architecture, that holistically represents an enterprise in relevant abstractions.

Architecture is a holistic sum striving to achieve systemic balance by aligning function, performance and cost

Math and statistics, although important, are not by themselves adequate to envision future capabilities for an organization. Likewise, advances in technology alone do not necessarily prove transformative, and knowledge of IT architecture by itself is not transformative. Instead, EA as a discipline requires integrating Architecture, Capital Planning, and Program Portfolio Management, all brought together by productive Governance; even then, the challenges of the future remain fleeting.

7. Not for those who think Taxonomy suffices to represent EA and NOT Ontology 

https://ingine.wordpress.com/2008/12/15/implicate-order-descriptive-mechanism-form-large-systems-of-systems/

https://ingine.wordpress.com/2013/06/21/implicate-order-probabilistic-ontology-complexity-theory/

There are distinct semantics used to represent each of the layers (“separation of concerns”). Semantics introduce dimensionality to the architecture layers, and the layers cannot be represented with a limited set of semantics in limited dimensions. Each layer that is semantically different from the others requires transformation in planning and design to discover the opportunities in that layer. Ontology plays an important role in developing and assimilating ideas leading to the discovery of creative transformative opportunities. EA is not for those who tend to reduce everything into simple 2D representations in the hope that simplified versions help manage complexity better.

8. Not for those who are Programmatic in Approach and Not Practitioners 

Not for those who do not understand what disposes them to be credible management consultants. Furthermore, it is also not for those management consultants who have not gained an appreciation for structure, semantics-driven architecture, and the mechanisms within, and their role together in systemic transformation, functional modernization, and economic optimization.

Consultants are those who have gained immense multilateral experience in the industry in a variety of areas, especially in conducting transformation, modernization, and optimization activities. Generally, individuals gain such experience driven by the motivation to solve large fleeting problems that are systemic in nature, not by pursuing opportunities whose sole motivation is revenue generation no matter what.

Practitioners develop insight by assiduously probing the problem and its complexity. These skills do not develop overnight. It is not swashbuckling, nor shooting from the hip; it is developing opportunity by being proactive and probing intensely.

There are no Outliers (myth destroyed by  Malcolm Gladwell http://en.wikipedia.org/wiki/Outliers_(book) )

“A common theme that appears throughout Outliers is the “10,000-Hour Rule”, based on a study by Anders Ericsson. Gladwell claims that greatness requires enormous time, using the source of The Beatles’ musical talents and Gates’ computer savvy as examples.[3] The Beatles performed live in Hamburg, Germany over 1,200 times from 1960 to 1964, amassing more than 10,000 hours of playing time, therefore meeting the 10,000-Hour Rule. Gladwell asserts that all of the time The Beatles spent performing shaped their talent, and quotes Beatles’ biographer Philip Norman as saying, “So by the time they returned to England from Hamburg, Germany, ‘they sounded like no one else. It was the making of them.’”[3] Gates met the 10,000-Hour Rule when he gained access to a high school computer in 1968 at the age of 13, and spent 10,000 hours programming on it.[3]”

9. Not for those who provide professional expertise as Contractor and NOT as Practitioner

https://ingine.wordpress.com/2012/08/01/developing-enterprise-architecture-practitioner-competamcies/

http://www.ipthree.org/frontpage-features/94-messageprofessionals

http://www.cos-mag.com/human-resources/hr-columns/whats-in-a-name-professional-vs-practitioner.html

It is not a contracting role. In theory, contracting assumes that the client understands the requirement and controls the way the project gets executed. This is an obviously flawed approach.

Generally, those who have thrived only in delivering IT services of an operational nature are not candidates for conducting management consulting, since most of their careers have been spent delivering IT solutions and services against requirements determined by the client and their consultants.

10. Not for those who do not value Professional Integrity

https://ingine.wordpress.com/2007/08/09/enterprise-architect-fighting-obfuscation/

Importantly, EA is a Practitioner Discipline introducing high standards, emphasizing quality of rendering and, most importantly, the professional ethics that promote the desired ethos for an organization’s evident growth and maturity. The results of EA achieve transparency, accountability, and line of sight, driven by “structuralism” striving to achieve “order”.

11. Not for those who build career around Tools and NOT around Discipline Integrating Science and Art

EA is not a discipline that can be developed by building a career around tools. Tools by themselves do not create Art, and neither do they advance Science. EA is integrative of architecture, capital planning, and program management, all driven by corporate governance.

https://ingine.wordpress.com/2008/12/01/creative-people-at-work/

https://ingine.wordpress.com/2008/11/23/in-lack-of-theory-planning-will-be-along-a-straight-line/

12.Not for those who engage merely in IT Operation and Implementation

EA is integrative of strategy, operations and implementation

https://ingine.wordpress.com/2012/08/03/transformative-enterprise-architecture-framework-connecting-strategy-tactical-operational-execution-implementation/

13. Not for those who DO NOT Develop Enterprise Transition Plan and Operating Model (must skill for EA)

https://ingine.wordpress.com/2010/04/12/etp/

https://ingine.wordpress.com/2008/12/06/ea-framework-for-transition-planning/

14. Not for those who think EA and Solution Architecture are synonymous

http://blogs.msdn.com/b/gabriel_morgan/archive/2007/09/02/enterprise-architect-vs-solution-architect.aspx

http://weblog.tetradian.com/2012/09/13/linking-ea-with-sa/

 15. Not for those obsessed with Dominance and NOT Balance

EA cannot help organization achieve balance and sustainability without Governance

https://ingine.wordpress.com/2008/02/22/governance-managing-conflict-change-the-intrinsic-duality-of-an-enterprise/

16. Not for those who engage Masquerading Solution Facades and Intellectual Contrives 

EA improves “Loss of Innocence” while it DEFEATS Solutions that do not align with context.

https://ingine.wordpress.com/2007/08/01/teaching-an-elephant-to-dance/

 17. Not for those who think Service Oriented Architecture, Correlation Architecture, etc by themselves constitute EA

https://ingine.wordpress.com/2007/07/30/ea-framework-to-accomodate-soa-style/

 18. Not for those who merely think Strategy alone and NOT Innovation help create Newer Opportunities

Innovation to create newer opportunities, while achieving a higher order of capabilities and business sustainability, is key to ensuring system harmony against the challenges of existing and ensuing complexities.

https://ingine.wordpress.com/2012/12/20/moving-from-darwinian-adaptive-to-generative-transformation/

List for NOT’s can be endless….

…N. Is Certainly for System Thinkers