
Q-UEL Toolkit for Medical Decision Making :- Science of Uncertainty and Probabilities


Quantum Universal Exchange Language

Emergent | Interoperability | Knowledge Mining | Blockchain

Q-UEL

  1. It is a toolkit / framework
  2. Is an Algorithmic Language for constructing Complex Systems
  3. Results in an Inferential Statistical mechanism suitable for a highly complex system – the “Hyperbolic Dirac Net”
  4. Involves an approach based on the premise that a Highly Complex System driven by human social structures continuously strives to achieve a higher order in its entropic journey, by continuously discerning the knowledge hidden in the system that is in continuum.
  5. A System in Continuum seeking Higher and Higher Order is a Generative System.
  6. A Generative System brings the System itself as the Method to achieve Transformation. Similar is the case for the National Learning Health System.
  7. A Generative System, as such, is based on Distributed Autonomous Agents / Organizations achieving Syndication driven by Self Regulation or Swarming behavior.
  8. Essentially, Q-UEL as a toolkit / framework algorithmically addresses interoperability, knowledge mining and blockchain, while driving the Healthcare Eco-system into Generative Transformation, achieving higher and higher orders in the National Learning Health System.
  9. It has capabilities to facilitate medical workflow and continuity of care; to extract and represent medical knowledge from vast sets of structured and unstructured data; to automate bio-statistical reasoning, leading into large-data-driven evidence-based medicine, which further leads into clinical decision support systems including knowledge management and Artificial Intelligence; and to support public health and epidemiological analysis.

http://www.himss.org/achieving-national-learning-health-system

GENERATIVE SYSTEM :-

https://ingine.wordpress.com/2013/01/09/generative-transformation-system-is-the-method/

A Large Chaotic System driven by Human Social Structures has two contending modes:

a. Natural Selection – Adaptive – Darwinian – Survival of the Fittest – Dominance

b. Self Regulation – Generative – Innovation – Diversity – Cambrian Explosion – Unique Peculiarities – Co-Existence – Emergent

The Accountable Care Organization (ACO), driven by the Affordable Care Act, transforms the present Healthcare System that is adaptive (competitive) into one that is generative (collaborative / coordinated), to achieve inclusive success and partake in the savings achieved. This is a generative systemic response, contrasting with the functional and competitive response of an adaptive system.

Natural selection seems to have resulted in functional transformation, where adaptive is the mode; it does not account for diversity.

Self Regulation seems to be a systemic outcome of integrative influence (ecosystem) responding to the system constraints. It accounts for rich diversity.

The observer learns generatively from the system constraints for the type of reflexive response required (Refer – Generative Grammar – Immune System – http://www.ncbi.nlm.nih.gov/pmc/articles/PMC554270/pdf/emboj00269-0006.pdf)

From the above observation, should the theory of self regulation – in which generative learning occurs – seem more correct and adherent to the laws of nature, then the assertion is that the “method” is offered by the system itself. The System’s ontology has an implicate knowledge of the processes required for transformation (David Bohm – Implicate Order).

For a very large complex system,

the System itself is the method – the impetus is the “constraint”.

In the video below, the ability of the cells to creatively create the script is discussed, which makes the case for a self regulated and generative complex system in addition to a complex adaptive system.

 

Further Notes on Q-UEL / HDN :-

  1. It brings Quantum Mechanics (QM) machinery to Medical Science.
  2. It is derived from the Dirac Notation that helped define the framework for describing QM. The resulting framework or language is Q-UEL, and it delivers a mechanism for inferential statistics – the “Hyperbolic Dirac Net”.
  3. It is created from a System Dynamics and Systems Thinking perspective.
  4. It is Systemic in approach, where the System is itself the Method.
  5. It engages probabilistic ontology and semantics.
  6. It creates a mathematical framework to advance Inferential Statistics for the study of highly chaotic complex systems.
  7. It is an algorithmic approach that creates the Semantic Architecture of the problem or phenomenon under study.
  8. The algorithmic approach is a blend of linguistic semantics, artificial intelligence and systems theory.
  9. The algorithm creates the Semantic Architecture defined by Probabilistic Ontology :- representing the Ecosystem Knowledge distribution based on Graph Theory (a minimal sketch follows this list).
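To make item 9 concrete, here is a minimal sketch, not Q-UEL itself: a probabilistic ontology held as a directed graph whose edges are subject–relation–object triples weighted by conditional probabilities. The entities, relations and numbers are illustrative assumptions only, not values from the Q-UEL literature.

```python
# A minimal sketch (not Q-UEL itself): a probabilistic ontology as a
# directed graph whose edges are subject-relation-object triples weighted
# by conditional probabilities. All names and numbers are illustrative.
from collections import defaultdict

class ProbabilisticOntology:
    def __init__(self):
        # subject -> list of (relation, object, P(object | subject))
        self.edges = defaultdict(list)

    def add_triple(self, subject, relation, obj, p_obj_given_subj):
        self.edges[subject].append((relation, obj, p_obj_given_subj))

    def relations_of(self, subject):
        return self.edges[subject]

onto = ProbabilisticOntology()
onto.add_triple("smoking", "raises risk of", "lung cancer", 0.17)
onto.add_triple("lung cancer", "is indicated by", "persistent cough", 0.65)

for rel, obj, p in onto.relations_of("smoking"):
    print(f"P({obj} | smoking) = {p} via '{rel}'")
```

An HDN-style net would carry the backward probability on each edge as well; the sketch under Step 3 below extends the idea.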

To make a decision in any domain, first of all a knowledge compendium of the domain, or the system knowledge, is imperative.

A System riddled with Complexity is generally a Multivariate System, and as such creates much uncertainty.

A highly complex system, being non-deterministic, requires probabilistic approaches to discern, study and model it.

General Characteristics of Complex System Methods

  • Descriptive statistics are employed to study “WHAT” aspects of the System
  • Inferential Statistics are applied to study “HOW”, “WHEN”, “WHY” and “WHERE” probing both spatial and temporal aspects.
  • In a highly complex system, causality becomes indeterminable; meaning the correlations or relationships between the independent and dependent variables are not obviously established. Also, they seem to interchange positions. This creates a dilemma between :- subject vs object, causes vs outcomes.
  • In approaching a highly complex system, since the prior and posterior are not definable, inferential techniques where hypotheses are fixed before beginning the study of the system become an unviable technique.

Review of Inferential Techniques as the Complexity is Scaled

Step 1:- Simple System (turbulence level:-1)

Frequentist :- the simplest classical or traditional statistics; employed treating data as random under a steady-state hypothesis – the system is considered not uncertain (a simple system). In Frequentist notions of statistics, probability is dealt with as a classical measure based only on the idea of counting and proportion. This technique applies probability to data, where the data sets are rather small. A toy sketch follows.
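A toy illustration of the frequentist mode, under the assumption of a small fixed data set; the counts are invented for illustration:

```python
# A toy frequentist estimate: probability as counting and proportion,
# with a normal-approximation (Wald) 95% confidence interval.
# The counts are made up for illustration.
import math

successes, trials = 37, 120          # e.g. patients responding to a treatment
p_hat = successes / trials           # probability as a simple proportion
se = math.sqrt(p_hat * (1 - p_hat) / trials)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"p = {p_hat:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```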

Increase complexity: larger data sets, multivariate, the hypothesis model is not established, a large variety of variables; each can combine (conditionally and jointly) in many different ways to produce the effect.

Step 2:- Complex System (turbulence level:-2)

Bayesian :- the hypothesis is considered probabilistic, while the data is held at steady state. In Bayesian notions of statistics, the probability is of the hypothesis for a given set of data that is fixed. That is, the hypothesis is random and the data is fixed. The knowledge extracted contains the more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems.

Additionally, the hypothesis can be explored only in an acyclic fashion, creating Directed Acyclic Graphs (DAGs). A toy update in this spirit is sketched below.
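A minimal sketch of a single Bayesian update, the building block such nets chain along DAG edges; the prior, sensitivity and false-positive rate are invented numbers, not from any study:

```python
# A toy Bayesian update: the hypothesis is random (a prior over H),
# the observed data are held fixed. All numbers are illustrative.
def bayes_posterior(prior_h, p_data_given_h, p_data_given_not_h):
    """P(H | data) by Bayes' rule."""
    evidence = prior_h * p_data_given_h + (1 - prior_h) * p_data_given_not_h
    return prior_h * p_data_given_h / evidence

# e.g. P(disease) = 0.01, test sensitivity 0.95, false-positive rate 0.05
print(bayes_posterior(0.01, 0.95, 0.05))   # ~0.161
```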

Increase the throttle on the complexity: very large data sets, both structured and unstructured; the hypothesis is random; multiple hypotheses are possible; anomalies can exist; there are hidden conditions; and the need arises to discover the “probabilistic ontology”, as it represents the system and the behavior within.

Step 3:- Highly Chaotic Complex System (turbulence level:-3)

Certainly DAG is now inadequate, since we need to check probabilities as correlations and also causations of the variables, and whether they conform to a hypothesis-producing pattern, meaning some ontology is discovered which describes the peculiar intrinsic behavior among specific combinations of the variables representing a hypothesis condition. And there are many such possibilities within the system – hence a very chaotic and complex system.

Now the System itself seems probabilistic, regardless of the hypothesis and the data. This demands a Multi-Lateral Cognitive approach.

Telandic …. “Point – equilibrium – steady state – periodic (oscillatory) – quasiperiodic – Chaotic – and telandic (goal seeking behavior) are examples of behavior here placed in order of increasing complexity”

A Highly Complex System demands a Dragon Slayer – Hyperbolic Dirac Net (HDN) driven Statistics (Bi-directional Bayesian) – for extracting the Knowledge from a Chaotic Uncertain System. A sketch of the bidirectional dual follows.
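A minimal sketch of the bidirectional idea, assuming the HDN convention of a hyperbolic imaginary number h with h·h = +1, so that a dual probability (P(A|B), P(B|A)) can be carried as one value and chained in both directions at once. The class name and numbers are illustrative assumptions, not Q-UEL code:

```python
# A minimal sketch of the "bidirectional Bayesian" idea behind the HDN,
# assuming a hyperbolic imaginary h with h*h = +1 (a split-complex number).
# A dual probability (P(A|B), P(B|A)) packs into r + h*i with
# r = (fwd + bwd)/2 and i = (fwd - bwd)/2; multiplying two such numbers
# multiplies the forward and backward chains independently.
class Dual:
    def __init__(self, fwd, bwd):          # forward and backward conditionals
        self.fwd, self.bwd = fwd, bwd

    def __mul__(self, other):              # chain two dual probabilities
        return Dual(self.fwd * other.fwd, self.bwd * other.bwd)

    def as_hyperbolic(self):               # the (r, i) of r + h*i
        return ((self.fwd + self.bwd) / 2, (self.fwd - self.bwd) / 2)

# Illustrative numbers only: chain A->B with B->C, keeping both directions.
net = Dual(0.8, 0.3) * Dual(0.5, 0.6)
print(net.fwd, net.bwd, net.as_hyperbolic())
```

The point of the dual is exactly the subject-vs-object dilemma noted earlier: a DAG keeps only one direction, while the hyperbolic pair keeps both without doubling the bookkeeping.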

What came first Design (Gene – DNA – Chromosomes) or Building Material (Protein) :- Architecture Paradox

Below is the reply to a question: what came first, DNA or Protein? One seems not to exist without the other. This is a paradox.

A System is a combination of questions (problem domain) and answers (solution domain) – why, how, what, when, where. It also alludes to transformation – let us say driven by entropy. So, system dynamics.

A. DNA precedes Protein – possible

B. Relation between Gene, DNA, Chromosome and what it does to Protein

C. Paradox of DNA vs Protein – leads into Intelligent Design Discussion

D. A Method to bring our mind to study them and arrive at more hypotheses or postulates, or to just appreciate the grandeur we live in.

A. Paper presented at NIH – discusses the possibility of DNA preceding protein.

DNA before proteins? Recent discoveries in nucleic acid catalysis strengthen the case.

http://www.ncbi.nlm.nih.gov/pubmed/19215202

B. Think Architecture, Talk Language. I think we are struggling to represent a complex concept structurally – a typical architecture problem, discussed here within the constraints of linearity in language.

a. Instruction Set
A chromosome is a molecule containing DNA.
DNA is made up of gene segments.
A gene is the basic unit of code, or a stroke on the piano.
A chromosome describes the script, or the notes on the piano = a set of strokes.
{Chromosome [DNA (GENE)]} = information = a coded tune for a set of functions to emerge (a toy data-structure rendering of this nesting follows below).
I think chromosomes also know how to orchestrate the choreography that finally realizes the living organism.

b. Construction Material
Protein is the building block. It gets instructions from the set of chromosomes, and it builds, creating complex forms from simple building blocks. Complex forms are uniquely created with different functional capabilities.

The more complex the living organism, the more complex the sets of chromosomes. Simpler organisms have a smaller number of chromosomes. Humans have 23 pairs, that is, 46 chromosomes.
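Purely to make the {Chromosome [DNA (GENE)]} nesting concrete, here is a toy data-structure rendering; it is the analogy in code, not a genomics model, and every name in it is illustrative:

```python
# A toy rendering of {Chromosome [DNA (GENE)]} as nested data structures,
# to make the "instruction set vs construction material" analogy concrete.
# Names and codes are illustrative only, not a genomics model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Gene:                 # basic unit of code: a "stroke on the piano"
    code: str

@dataclass
class Chromosome:           # the "script": an ordered set of strokes
    genes: List[Gene] = field(default_factory=list)

    def transcribe(self) -> str:
        """The coded 'tune' that instructs protein construction."""
        return "-".join(g.code for g in self.genes)

@dataclass
class Protein:              # construction material built per the instructions
    built_from: str

genome = [Chromosome([Gene("A"), Gene("T")]), Chromosome([Gene("G"), Gene("C")])]
proteins = [Protein(built_from=c.transcribe()) for c in genome]
print(proteins)
```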

DNA, Chromosomes and Gene
http://www.dnatutorial.com/DNAChromosomes.shtml

Discussion on DNA – Protein
http://www.nobelprize.org/educational/medicine/dna/

C. The Paradox of DNA vs Protein, leading into the discussion about Intelligent Design linked below

http://www.uncommondescent.com/intelligent-design/which-came-first-dna-or-protein/

D. My own spin – studying the Implicate Order and System Dynamics proposed by David Bohm

The Design (DNA) was created either by a designer or by an “emergence process – emergent architecture”. Also, it could be affected by either causality or acausality.

Chromosomes (DNA) are the implicate order – by a generative process they create the explicate order (everything external) in rich diversity. It is such an architecture that its ontology contains the process of creation (notes, tunes, choreography and orchestration).

Complex subjects or ideas need to be investigated employing integrative tools. To understand the how vs what paradox, we have applied ideas from what we have understood of the world through the following disciplines – biology, chemistry, physics and economics + maths. It is the fight for resources (economics) which brings in the modes of life-form emergence, from its simpler forms to complex forms. The fight for resources is both inter and intra. The Fiscal Cliff is the fight amongst ourselves – Rich and Poor, who should own the resources and who should benefit. It is a fight for control and, eventually, our life.

The implicate order, studied within the notion of a system driven by entropy, discusses energy and the need to achieve stability via an entropic journey creating order and chaos. This creates various social structures with varying stresses. Order and Chaos are English words to describe the state of entropy as it dynamically exists in a continuum – System Dynamics. This is very well discussed in the video below.

A very beautiful video to watch, to see if a different perspective on thermodynamics helps our understanding of complexity, and even more, of life itself.

password “montana”

The unique form that a living organism takes could probably be described by combining macro & micro physics (quantum dynamics) and chemistry. Applying Pauli’s Exclusion Principle – different energy patterns affect composition, so we have a periodic table.

Operating Model – Have Fed Agencies abandoned creating the “Enterprise Transition Plan”? ETP is challenging for the OCIO

Enterprise Transition Plan – Line Of Sight – Operating Model

Ref:- FEAC Institute, FEA Guidance, INGINE INC, Srinidhi Boray

Conceiving a coherent modernization plan and executing it has always been a challenge for the OCIO. Enterprise Transition Plans generally document the vision, goals and capabilities at the strategic level, and then progress to envisage a “target conceptual architecture”, while planning the tactical efforts, working through the mechanisms of the architecture lifecycle, needed to transition towards the target enterprise conceptual architecture. This results in the “roadmap”. The document that holistically speaks to the organization’s capability to reach such a roadmap is the “Enterprise Transition Strategy”.

Reading the Enterprise Transition Strategy, the following should be evident:

Organization’s maturity and competence to perform and embrace the future capabilities

Organization’s present constraints in delivering the needed “capabilities”

The profile of the architecture cross sections (segments) needing modernization to develop the intended “portfolio of capabilities”

Investment Profile – planned to execute segment modernization

The governance structure and mechanism that engages the different functional capabilities, such as leadership – management, lines of business, enterprise architecture, capital planning and program management. This group together develops and vets strategic plans, tactical plans and implementation plans, governed by the following life cycles:

  • Performance Management (Architect, Invest, Implement)
  • Architecture Segment (Notional, Planned, In-Progress, Completed)
  • Release Architecture (sadly, most do not plan this – it is here that the interdependence of architectures or capabilities is realized and maintained)
  • Capital Planning (Preselect, Select, Control & Evaluate)
  • Investment (Mixed, Steady State, Development, Modernization & Enhancement – DME)

A “Sequence Plan”, along with an “Implementation Plan”, is developed to reach the desired Target Architecture (a temporal perspective).

Finally, the Operating Model – the most important component. It assimilates all the decisions chosen for creating operating leverage: the degree or order of leverage needed to deal with the inherent systemic complexities, set against the effort of creating economy of scale in delivering the enterprise capabilities essential for delivering the enterprise mission.

More discussion on creating the Enterprise Transition Strategy in the link below:

https://ingine.wordpress.com/2008/12/06/ea-framework-for-transition-planning/

Transformation Framework Capturing an Operating Model

https://ingine.wordpress.com/2012/08/03/transformative-enterprise-architecture-framework-connecting-strategy-tactical-operational-execution-implementation/

Distortions lead to Cancerous Growth within the Enterprise

Programmed Cell Death is a very important function to understand in order to gain insight into the way Transformation needs to occur. When distortions occur in Enterprise Engineering, this leads to an obvious cancerous growth, which has no easy remedy.

How many CIOs in the market are inadvertently responsible for distortions?! Countless.

The Case for Cautious Optimism – BusinessWeek


Leo Apotheker – SAP Executive

This lack of sustainability bothers me, and many business leaders have come to share it. My discussions with other CEOs reveal that we will not go back to our merry, pre-crisis ways of limitless consumption and exuberant investment fueled by excessive liquidity. The consensus is that we need better models for capitalism in the 21st century. This crisis has laid bare the need for more clarity in business practices, greater transparency in reporting standards, and above all, the dire need for more sustainable business models.

What we want the Federal CIO to do


Vivek Kundra’s Resume: Much Ado About Nothing?
Andrea DiMaio

August 14th, 2009 · 2 Comments

Over the last two days, the blogosphere has witnessed an interesting debate about whether Vivek Kundra’s resume is entirely accurate and sufficient for his current role as U.S. Federal CIO. It all started with a blog post by John Dvorak, where he cast doubts about Vivek’s academic achievements and his experience outside the public sector, followed by a post by Gautham Nagesh on NextGov, shedding light on Vivek’s academic records and challenging Dvorak’s positions. It is quite intriguing to look at comments on either blog, as well as on many other blogs and online articles (just search for “Vivek Kundra qualifications” and you’ll find quite a lot).

Besides observing that politics are the same pretty much anywhere, with political appointees being regularly targeted by their opponents, the press and more recently by blogs (quite ironic in Vivek’s case, as he is a fervent believer in the power of Web 2.0), I am not really interested in whether these allegations are founded or not.

What I am more interested in is the fundamental question behind much of these discussions, i.e. is he qualified for the job at hand? In order to answer this question one has to look at (1) the exact job description and (2) his achievements in related positions.

As far as (1), the job description in the White House’s announcement of Vivek’s appointment was:

The Federal Chief Information Officer directs the policy and strategic planning of federal information technology investments and is responsible for oversight of federal technology spending. The Federal CIO establishes and oversees enterprise architecture to ensure system interoperability and information sharing and ensure information security and privacy across the federal government. The CIO will also work closely with the Chief Technology Officer to advance the President’s technology agenda

Words are important here. He is not responsible for the federal IT budget (agencies are), but for its oversight. He directs policies and planning in order to advance the President’s technology agenda. Obama’s agenda clearly is about change in a number of areas, including IT as an important enabler of change. So I guess one of the most important traits the President was looking for was the ability to be a change agent.

Here comes (2), what did he do in the past to show such a trait? Well, I would argue that his achievements in D.C. as a CTO got the attention of many, ranging from how he changed portfolio management to how he made procurement more transparent up until his venture into crowdsourcing applications. He also got a number of recognitions during his tenure in D.C.

Those who have been reading this blog for some time know that I like Vivek and wished him well when he went into some trouble shortly after his appointment. I do not think that allegations or even facts about his qualifications as a student or an entrepreneur can deny his nature of change agent.

Where I believe challenges are for him as well as for the whole administration, is in deciding how to prioritize change, and how to find the right blend between continuity and innovation.

The risk is that in between open data, cloud computing, government 2.0, support to major programs like national broadband and health IT, Obama’s IT team finds itself chewing too much too soon.

If there is one thing that needs to be sorted out now, it is how the CIO and CTO roles relate to each other. A few days ago I watched an interview that CTO Aneesh Chopra gave to CNET, and could not get a straight answer to this, although the interviewers asked him a direct question. It is quite possible that roles, responsibilities and priorities have already been sorted out. But then, could we please know?

2 RESPONSES SO FAR
Jeff DePasquale // Aug 14, 2009 at 7:48 am

Net-net: I support Vivek as an Executive Partner under EXP. I find his approach to be refreshing and focused. Not sure what his paper qualifications are, but he’s demonstrated to our team his calculated ability to step outside the “beltway box”.

Srinidhi Boray // Aug 14, 2009 at 11:08 am

“The Federal CIO establishes and oversees enterprise architecture to ensure system interoperability and information sharing and ensure information security and privacy across the federal government.”

Enterprise Architecture, no doubt, is a mechanism that can help facilitate change. John Zachman himself, in his FEAC training, mentions that change, a ‘step function’, can best be anticipated via the structures described in the Enterprise Architecture. And EA is something that is learned over a period of time from experience. Unless one is born a savant.

In the lack of architecture experience, most CIOs in both the federal and commercial sectors have made strategic decisions that have not manifested tactically with the desired results. There are numerous cases in the Federal government. Should the OIG in each agency work with motivation, then a lot of ill-conceived plans and the associated ill-spent millions of dollars could be purged. But who cares. Talking EA has become a fashion. And planners lacking a design mind introduce an ‘empirical dilemma’ that is evident in the Federal Segment Architecture Methodology.

Within the CCA and the A-130 Circular, ‘information dissemination’ is just one portion of federal responsibility. Web 2.0, or any social networking, although good, does not solve the country’s looming problems. Such as the problem that HHS confronts – Medicare, Medicaid; or that the VA confronts – the Veteran Patient Health Information system. So on and so forth; there are some very serious problems looming, and they have no easy answers both in terms of budget planning and managing the architecture complexity. Here is where we want the Federal CIO to be active – not dealing in superlatives and in the conceptual solution domains. Government must be concerned with accurately defining the ‘problem domain’, not engaging with the ‘solution domain’, but seeking it instead.

Regards

Game Theory : The essence of Enterprise Architecture is about establishing a “Dominant Strategy”

There is a Darwinian streak in the idea below; however, the more congenial idea is the Generative one, which has more systemically congruent results.

The same ideas can be repurposed in a better way.

The essence of Enterprise Architecture is about establishing a “Dominant Strategy” that best achieves ‘economy of scale’. The economy of scale will apply to each of the architecture design decisions selected. The set of design decisions that leads the architecture planning from strategy to tactical, and from tactical to execution, needs to converge to a dominant strategy engaging all the stakeholders, led by a cohesive mechanism of Governance.

Note: EA is about discussing the largest picture of the enterprise. Hence, any decision made must ensure that it lends itself sufficiently across the enterprise while increasing its “reuse”. This means ‘economy of scale’. All decisions, including the technical design decisions, must yield a better ‘return on investment’ from an optimized ‘performance vs cost’ perspective.

The main goal of the Governance is to lead the dialogue that an enterprise riddled with complexities is engaged in towards a “Dominant Strategy”. In many ways, the array of events that Governance triggers, while working towards converging the decisions to a cohesive set of results, is similar to the behavioral probabilities studied in Game Theory.

When a system is riddled with constraints, especially when ‘money’ as a resource is scarce, then it dominates the decisions needed to achieve the strategy. A system’s behavior is governed by both micro and macro considerations. There is a threshold up to which the system is probabilistically stable and is not affected by the micro behavior. Beyond a certain threshold, the micro behavior is unable to sustain the desired macro outcomes.

When Scarcity arises – Economics is Hot

Dominant Strategy

A strategy is dominant if, regardless of what any other players do, the strategy earns a player a larger payoff than any other. Hence, a strategy is dominant if it is always better than any other strategy, for any profile of other players’ actions. Depending on whether “better” is defined with weak or strict inequalities, the strategy is termed strictly dominant or weakly dominant. If one strategy is dominant, then all others are dominated. For example, in the prisoner’s dilemma, each player has a dominant strategy, as the sketch below checks.
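A small sketch of checking strict dominance in the prisoner’s dilemma; the payoff numbers are the usual textbook illustration, not taken from any specific source:

```python
# Checking for a strictly dominant strategy in the prisoner's dilemma.
# Payoffs are the row player's, indexed [my_move][their_move];
# higher is better. Numbers are the standard illustrative values.
COOPERATE, DEFECT = 0, 1
payoff = [
    [3, 0],   # I cooperate: (they cooperate, they defect)
    [5, 1],   # I defect:    (they cooperate, they defect)
]

def strictly_dominant(my_strategy):
    others = [s for s in (COOPERATE, DEFECT) if s != my_strategy]
    return all(
        payoff[my_strategy][their] > payoff[other][their]
        for other in others
        for their in (COOPERATE, DEFECT)
    )

print("defect is strictly dominant:", strictly_dominant(DEFECT))        # True
print("cooperate is strictly dominant:", strictly_dominant(COOPERATE))  # False
```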

Introduction to Game Theory

Extreme Linearization results in Extreme Disparities

Extreme linearization leads to an extreme exponential distribution, resulting in extreme disparities. The top 20% in the US earn 50% of income and own 85% of wealth. The remaining 80% earn the other 50% and own only 15% of wealth. The question is what or who EA is serving: the 20% or the 80% of the population, the income or the wealth, or all put together. The arithmetic below makes the per-capita gap explicit.
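Taking the quoted percentages at face value, the worked arithmetic:

```python
# Worked arithmetic behind the shares quoted above; per-capita ratios
# show how lopsided "equal" halves of income actually are.
top_pop, top_income, top_wealth = 0.20, 0.50, 0.85
rest_pop, rest_income, rest_wealth = 0.80, 0.50, 0.15

income_ratio = (top_income / top_pop) / (rest_income / rest_pop)
wealth_ratio = (top_wealth / top_pop) / (rest_wealth / rest_pop)
print(f"per-capita income ratio, top vs rest: {income_ratio:.1f}x")   # 4.0x
print(f"per-capita wealth ratio, top vs rest: {wealth_ratio:.1f}x")   # ~22.7x
```

So equal halves of income still mean a four-fold per-capita income gap, and the per-capita wealth gap is over twenty-fold.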

In the context of TARP and the percentages of tax paid by different demographics – who is paying whom? Are those who shored up losses by marginalizing the working class now receiving the bail-out? In other words, of the losses the white collar contributed, how much is the blue collar paying in the bailout?

Recent Article

Concentration of wealth in hands of rich greatest on record

BY DANIEL TENCER

Published: August 15, 2009

The wealthiest 10 percent of Americans now have a larger share of total income than they ever have in records going back nearly a century — an even larger amount than during the Roaring Twenties, the last time the US saw such similar disparities in wealth.

In recent years, the fact that differences between rich and poor are the greatest they’ve been since the Great Depression has become a popular talking point among liberal-leaning economists.

But an updated study (PDF) from University of California-Berkeley economist Emmanuel Saez shows that, in 2007, the wealth disparity grew to its highest number on record, based on US tax data going back to 1917.

According to Saez’s study, which Nobel prize-winning economist Paul Krugman drew attention to at his New York Times blog, the top 10 percent of earners in America now receive nearly 50 percent of all the income earned in the United States, a higher percentage than they did during the 1920s.

“After decades of stability in the post-war period, the top decile share has increased dramatically over the last twenty-five years and has now regained its pre-war level,” Saez writes. “Indeed, the top decile share in 2007 is equal to 49.7 percent, a level higher than any other year since [records began in] 1917 and even surpasses 1928, the peak of stock market bubble in the ‘roaring’ 1920s.”

By comparison, during most of the 1970s the top 10 percent earned around 33 percent of all the income earned in the United States.

The contrast is even starker for the super-rich. The top 0.01 percent of earners in the US are now taking home six percent of all the income, higher than the 1920s peak of five percent, and a whopping six-fold increase since the start of the Reagan administration, when the top 0.01 percent earned one percent of all the income.

There is no consensus among economists on whether large disparities in income lead to economic disruption, but it is hard to ignore the correlation between rising income inequality and the onset of economic crisis. The last time the US saw similar differences in income was in 1928 and 1929, just before the start of the Great Depression.

Saez also broke the numbers down by administration, and found that while the wealthiest few saw their incomes rise as quickly during the Bush years as they did during the Clinton years, the same was not true for the rest of the population.

Saez suggests that the economic growth seen on paper during the Bush years was little more than an illusion for the vast majority of Americans, who saw their income grow much more slowly in the 2002-2007 period than they did during the Clinton years.

During both expansions, the incomes of the top 1 percent grew extremely quickly at an annual rate over 10.3 and 10.1 percent respectively. However, while the bottom 99 percent of incomes grew at a solid pace of 2.7 percent per year from 1993–2000, these incomes grew only 1.3 percent per year from 2002–2007. As a result, in the economic expansion of 2002-2007, the top 1 percent captured two thirds of income growth.
Those results may help explain the disconnect between the economic experiences of the public and the solid macroeconomic growth posted by the US economy since 2002. Those results may also help explain why the dramatic growth in top incomes during the Clinton administration did not generate much public outcry while there has been an extraordinary level of attention to top incomes in the press and in the public debate over the last two years.
Saez, who this spring won the prestigious John Bates Clark Medal for economists under 40, links this disparity to the Bush tax cuts, noting that “top income tax rates went up in 1993 during the Clinton administration (and hence a larger share of the gains made by top incomes was redistributed) while top income tax rates went down in 2001 during the Bush administration.”

TWO MORE RECESSIONS?

The economic crisis that has taken hold over the past year isn’t over, and the world could in fact see two more recessions before the crisis is finally over, says the chief economist of Germany’s influential Deutsche Bank.

Norbert Walter told CNBC that investors are worried about the health of the US dollar, and many countries are facing difficult financial problems because of overspending by governments on bailouts and stimulus. Those things combined could push the world economy downwards not once but two more times in the near future, he said.

“I believe that the rescue packages brought on have been so costly for so many governments that the exit from this fiscal policy will be very painful, very painful indeed,” he said. “Some of us are already talking about a W-shaped recovery. I’d probably talk about a triple-U-shaped recovery because there are so many stumbling blocks here to get out of this.”

“The world is in trouble,” Walter told CNBC.

In Lack of Theories – Planning will be along a Straight Line


Einstein's prediction (1907): Light bends in a gravitational field

Enterprise Architecture as a discipline promotes a framework of theories useful for planning the enterprise behavior. It might take years to verify a theory; but in the lack of one, manifold years will be needed to salvage the wrong done. In the lack of theories, planning will proceed along a straight line, completely missing what the mind should have uncovered and instead relying on what meets the eye. Furthermore, while acknowledging that planning does not move along a straight line, it must also factor in that random action exists, giving rise to unpredictability. This means detecting actions that lead to fragments, and anticipating the consequences due to them, is very important both in structure and in behavior.

Most important among all other theories in the realm of EA is the ‘Theory of Constraints’. As with ‘Programmed Cell Death’, a Constraint provides information about the current state and why it evolved into its present state from the past. It also provides inference into the possible future state. TOC also ties into system dynamics, which can be explained by entropy. All of these together have consequences for behavior, especially with probabilistic determinism at the macroscopic level and indeterminism at the microscopic level. The all-encompassing behavior of the enterprise, as an instrument with no beginning and end, is explained by ‘implicate order’, which also accounts for generative transformations by unfoldment, the DNA of which exists enfolded. All-encompassing means an inclusive and pluralistic architectural framework.

Check out: Creative People At Work – a Tribute to Einstein’s Thought Experiment – “Traveling On a Beam of Light”.

https://ingine.wordpress.com/2008/12/01/creative-people-at-work/

Emergence of the Universe – a Thought Experiment that traces the origin of the Universe and where it might possibly go. This is a study of generative transformation.

https://ingine.wordpress.com/2008/12/06/enterprises-in-generative-transformation-akin-to-universe-life-cycle/

9 Billion Dollar Scientific Experiment – to verify the creation of the world. Why do we need this? Keep asking this billion dollar question (that is, 9,000 times the proverbial million dollar question).