Month: July 2007

Is the EA Framework adequate to accommodate the SOA style?


The Business Architecture cross-section within the EA Framework is the area where the ‘form’ for the overall EA is more or less decided. After the form for the business processes is laid out, the rest of the activities align the IT resources to the business, driven by architectural considerations. The EA Framework typically provides areas for defining and describing the ‘problem’ (contextual, conceptual) domain and then the ‘solution’ (logical, physical) domain.

For most thought processes, the EA Framework works mostly as a representation or as a reference. The architecture works unseen and is the underpinning that drives the thinking from its inception in the problem domain to its completion in the solution domain. Almost all of the design and description work occurs independently of the Framework; the architecture work products merely get organized by it. The Frameworks thus far have played the role of providing a mechanism for defining a classification scheme that is both ordered and layered, which makes separation of concerns visible. Although delayed binding and loose coupling have been touted, they are not evidently realized in the Framework. Most Frameworks follow a scheme of hierarchy.

With the advent of the SOA style, a process-centric framework is desired. The existing frameworks are either function-centric or information-centric, and these break down completely when dealing with process-centric systems. Hierarchy is itself functional in its organization, such as that found in a ‘confederation’ system. Studying the structures of biological organisms, which have evolved from unicellular into multicellular forms, it is evident that the Gene Ontology is defined as a process-centric scheme (biological processes), with the lower compositions (molecular components) retained as a function-centric scheme.

Almost all business modeling methods, approaches and techniques are either function-centric, meaning they follow ‘decomposition’, or information-centric, as found in the information value chain models. These approaches do not lend themselves well to SOA-style renderings. Of all the methods in the market, the one that comes closest to a process-centric scheme is ‘Riva’, evolved by Martin Ould ( http://www.the-old-school.demon.co.uk/vc/riva.htm ). However, it lacks a clear delineation of the problem and the solution domains. Nevertheless, it has scope for improvement and alteration to accommodate the upper layers of the framework that work in the strategy areas.
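To make the distinction concrete, here is a purely illustrative sketch in Python contrasting the two schemes. All the service names and the "order-to-cash" process are assumptions invented for illustration; they are not drawn from Riva or from any particular framework.

# Purely illustrative sketch: the same small set of services organized in a
# function-centric hierarchy versus a process-centric scheme. All names and
# the "order-to-cash" process are assumptions made up for illustration.

# Function-centric: capabilities nested by decomposition under business functions.
functional_view = {
    "Sales":     {"Quoting": ["price_quote"], "Ordering": ["capture_order"]},
    "Logistics": {"Fulfilment": ["ship_order"]},
    "Finance":   {"Billing": ["issue_invoice"], "Collections": ["post_payment"]},
}

# Process-centric: an end-to-end process is the organizing unit; each ordered
# step is realized by a service, cutting across the functional hierarchy.
order_to_cash = [
    ("receive order",    "capture_order"),
    ("fulfil order",     "ship_order"),
    ("invoice customer", "issue_invoice"),
    ("collect payment",  "post_payment"),
]

# The process view retains the functional services only as lower-level
# components it composes, loosely analogous to the Gene Ontology split
# between biological processes and molecular components mentioned above.
for step, service in order_to_cash:
    print(f"{step:>18} -> {service}")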

Any thoughts??


Effect of IT Commoditization on Corporate Competitiveness

In the articles “IT Doesn’t Matter” and “The End of Corporate Computing”, which appeared in the Harvard Business Review in 2003 and in the Spring 2005 issue of the MIT Sloan Management Review respectively, Nicholas Carr examines both the demand and the supply side of IT. In these articles he argues, based on his findings, that IT no longer offers a strategic advantage to large corporations.


In “IT Doesn’t Matter”, Carr examines the demand side of IT. His argument is that IT has become a commodity, similar to what happened to infrastructure systems such as electric power and the railways. In earlier times, when IT was first introduced, it served as infrastructure for commerce; later it proved to be of strategic advantage to corporations. But as IT reached its demand threshold, in Carr’s words, “their availability increased and their cost decreased – as they become ubiquitous – they become commodity inputs. From a strategic standpoint, they become invisible; they no longer matter”. Meaning, they preserve competitiveness but confer no special advantage.


IT commoditization is an antithesis of technological advance. Nevertheless, the accent has shifted from mere technology products, services and information to management, architecture, process and execution. There are numerous examples in the industry, such as the ISP businesses that were initially profitable and are now mere infrastructure: companies do not derive direct profits from them, but from other services bundled into the portal. This shift is also evident in the accent moving from Chief Technology Officers to Chief Information Officers, as reflected in recent salary surveys: there is a decline in the salaries of CTOs, while those of CIOs have improved.


IT commoditization has also led to the off-shoring of IT production because of the cost advantage, and the trend will continue, as it has in the manufacturing sector. This has a huge impact on the socio-economic conditions of those who rely on IT for their livelihood. In the USA there has been a constant decline in students pursuing IT. According to The IT Professionals Association of America (ITPAA, Inc., www.itpaa.org): “Salaries continue to decline in IT, and entry-level positions for new graduates are hard to come by since most of these have been offshored to India and China. Given that the average college student graduates with $50,000 in debt, it makes sense that he or she would avoid fields such as IT that are disappearing, and go into those that provide the income necessary to pay back that debt.”


In “The End of Corporate Computing”, Carr examines the supply side: how the technology industry will be organized to supply IT to companies. In particular, he examines how the wastefulness of the current, fragmented model of IT supply is unsustainable. As with the factory-owned generators that dominated electricity production a century ago, today’s private IT plants will be supplanted by large-scale, centralized utilities. The transition to the new supply model promises to bring challenges and opportunities to the users of IT while upending the status quo of the computer industry. His premise is based upon multi-tenancy on a common platform.


The supply and demand vortex is turning ‘exclusiveness’ into ‘inclusiveness’: the world is becoming flat, and the need for a pluralistic society is becoming more and more evident. The looming danger lies in the dichotomy that will afflict society, systems and self alike.


Ref: http://www.nicholasgcarr.com/articles/matter.html

Ref: ITPAA, Inc., www.itpaa.org

Econophysics

‘Econophysics’ – An Interdisciplinary Approach to Studying Behavioral Economics

All systems display behavior that is dynamic in nature, and the study of system dynamics finds many of the classical models inadequate. In the world of physics, Newtonian classical mechanics progressed into Einstein’s relativity and then into quantum mechanics. As the subject progressed, inadequacies in the discipline surfaced, and newer mechanisms were sought to represent and study the subject more accurately. Many challenges remain in studying a system as a whole by reconciling its macroscopic aspects with its microscopic behaviors; to further this cause in the realm of physics, a ‘unified theory’ is being attempted that brings together the different aspects of the macro and the micro.

One of the stark features of any system is that it is deterministic in a probabilistic sense, but remains indeterministic when the processes around individual events are followed. Also, the physical nature at the macrocosm alters completely when studied at the microscopic level: at one level it is a particle, and at another it is a quantum (or a wave). Probabilistic studies are attempted by applying the Monte Carlo computational approach, which iteratively incorporates various coefficients in its attempt to discover the probable behavior, while random occurrences are studied by applying stochastic methods.
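As a concrete illustration of the Monte Carlo idea mentioned above, here is a minimal sketch in Python. The random-walk process and the drift, volatility, step and trial counts are all illustrative assumptions; the point is only that repeated random sampling lets one estimate the probable behavior of a system that is indeterministic event by event.

# Minimal Monte Carlo sketch (all parameter values are illustrative assumptions):
# estimate the probable end state of a simple stochastic (random-walk) process
# by running many independent random trials and summarizing the outcomes.
import random
import statistics

def simulate_path(steps=250, drift=0.0005, volatility=0.01):
    """Return the final value of one random-walk path starting at 1.0."""
    value = 1.0
    for _ in range(steps):
        # each step applies a small deterministic drift plus a random shock
        value *= 1.0 + drift + random.gauss(0.0, volatility)
    return value

def monte_carlo(trials=10_000):
    """Run many independent paths and summarize their probable behavior."""
    outcomes = [simulate_path() for _ in range(trials)]
    return statistics.mean(outcomes), statistics.stdev(outcomes)

if __name__ == "__main__":
    mean, spread = monte_carlo()
    print(f"expected final value ~ {mean:.4f}, spread ~ {spread:.4f}")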

Many similarities are being discovered between the system-dynamics studies applied in physics and those applied in economics, especially when equilibrium studies are conducted in a system characterized by heterogeneous agents. It is argued by many that neoclassical economics relies on principles that worked well in a system based on Adam Smith’s simple axioms, in which the complexities of heterogeneous agents do not exist. Adam Smith’s principle states that individual greed plays a vital part in the overall economy for the greater good of the whole society, and that an invisible hand works to correct the course when disequilibrium occurs. This notion is now being contested as an incorrect assumption: in the modern economic era, the unpredictability of the market lies in the lack of understanding of complex agent behaviors, which exist more like a flux that is difficult to discern. This adds to the conundrum when predicting random events in a market that is inherently unstable. Furthermore, the flux state defies the Cartesian system on which the present computations are based; almost all applied agent-behavior models incorporate coefficients that encapsulate the supply-demand behavior triggered by one or more variables as observed within the Cartesian coordinate system.

One among many other behaviors observed in the neoclassical economics model is that wealth creation has displayed a Gaussian distribution. Such distributions are considered harmful, as they allow for wealth localization. For reasons such as these, econophysics is sought out to understand and study complex economic system dynamics more accurately.
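To make the agent-based flavor of econophysics tangible, below is a minimal sketch of a conserved random-exchange model of the kind discussed in the econophysics literature (the Yakovenko paper in the references below deals with wealth and income distributions). The number of agents, rounds and starting money are illustrative assumptions; even this trivial model tends to produce a markedly skewed, non-Gaussian distribution, with wealth concentrating in relatively few agents.

# Minimal sketch of an agent-based random-exchange model (all parameter values
# are illustrative assumptions). Total money is conserved; in each round a
# randomly chosen pair of agents transfers a random fraction of one agent's
# holdings to the other.
import random

def simulate_exchanges(num_agents=1000, rounds=100_000, initial_money=100.0):
    money = [initial_money] * num_agents
    for _ in range(rounds):
        i, j = random.sample(range(num_agents), 2)  # two distinct agents
        amount = random.uniform(0.0, money[i])      # part of agent i's money
        money[i] -= amount
        money[j] += amount
    return money

if __name__ == "__main__":
    wealth = sorted(simulate_exchanges())
    top = len(wealth) // 10
    top_decile_share = sum(wealth[-top:]) / sum(wealth)
    print(f"share of total wealth held by the richest 10%: {top_decile_share:.2%}")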

In recent times, a few universities have begun to conduct formal research in the econophysics area. One of the goals is to rewrite economic theory by reconsidering the dynamics in light of the realities of unstable behavior.

References:

http://www.phys.uh.edu/research/econophysics/index.php

http://en.wikipedia.org/wiki/Econophysics

http://www.unifr.ch/econophysics/articoli/papers/Economic_System_Dyn-joco.pdf

http://www.unifr.ch/econophysics/comments/nov_99/9911291.pdf

http://www2.physics.umd.edu/~yakovenk/papers/PhysicaA-299-213-2001.pdf

Current condition owing to