
Towards the Service Oriented Enterprise

Market Survey Q1 2011

May 10, 2011


TABLE OF CONTENTS

1 GENERAL INTRODUCTION
1.1 DESCRIPTION OF SCOPE
2 RESEARCH ANALYSIS
2.1 ANALYST ANALYSIS
2.2 TECHNOLOGICAL TRENDS
2.3 WORKFLOW AUTOMATION
2.4 COMPLEXITY
2.5 ESB ++
2.6 HUMAN WORKFLOW
2.7 GENERAL CONCLUSION
3 EVALUATE, CLASSIFY AND COMPARE
3.1 EVALUATION
3.2 VENDOR CATEGORIZATION
3.3 COMPARATIVE CRITERIA
3.4 IN PRAISE OF FOLLY (LOF DER ZOTHEID)


1 GENERAL INTRODUCTION

1.1 Description of scope

This document originates from an overview of the global market for the Enterprise Service Bus (ESB), a best-practice approach to integrating multiple applications and their underlying data sources. Integration has grown into a mature segment of the overall IT landscape. It allows for creating systems of systems, encapsulating older systems with newer protocols, modernizing legacy applications, de-duplicating data and functionality, and phasing the transition from one application to the next, and it offers easy ways of connecting functionality inside and outside a company.

One particular approach has become prominent and is currently reshaping both core technical tools and the data center as a whole: functions, data and assemblies of these are regarded as largely independent service modules. This Service Oriented Architecture (SOA) is currently receiving much attention in the form of both Clouds and Software-as-a-Service, which are both ways of exposing a 'service' such as storage, processing power, security, or even a collection of processes concerning customer contacts. Using the indirect approach of SOA, both a company's IT landscape and its business can move towards a more comprehensive way of working, where, for example, internal departments subscribe to the services offered by other departments, or certain parts of the IT infrastructure are provided by an external party.

Using a primarily qualitative assessment of these and related markets, a more in-depth study can be done involving a company's current and planned technology standards landscape, and the potential evolutionary tracks along which it will further develop. This 'map' will act as the main filter for the tool selection phase, allowing for a way forward suitable for a company's culture. Using a final shortlist, several distinctive leverage points (performance, expandability, ease of use, re-use, internal integration versus ease of maintenance, and any other additional functionality) are to be identified where migration effort can be reduced. For reasons explained during this analysis, the ESB is expected to be an intermediary solution, enabling the gradual application of a new architectural style which eliminates the need for an ESB product itself. The ESB is thus both a product and a way of working, depending on what is most suitable for a company's infrastructure, but it needs to be known first in order to be forgotten.

Next, an actual migration can be formulated and constructed, identifying the migration phases which give the largest ROI for both the existing and the future solution, using simple prioritization of these phases with different importance weightings: chronological, functional, possible synchronization with vendor roadmaps, learning curve, and strategic direction for automated operations. Essentially, a mature integration solution offers connectivity, routing, and mapping. With increasing numbers of interfaces, the aspects of monitoring and management, and of versioning and testing, become increasingly important.

To draw a suitable context, several neighboring approaches are highlighted. First, a shift is happening under the marketing term 'cloud', which involves several shifts in abstraction away from physical links towards a more logical 'service-oriented' style. Second, in dealing with the complexity of increasingly many applications, a service-oriented infrastructure network model is appearing, where the boundaries between network, computer hardware and software are also increasingly vague. Third, hardware performance blurs the boundaries of previously segmented domains. It is worth approaching application migration from these angles. Several of the ESB vendors are already active in this area, from a development compliance, governance, or operations and runtime grid angle. This document aims to give readers a reasonable handle on the market forces, so as to come to a better-informed educated guess.


2 RESEARCH ANALYSIS

2.1 Analyst Analysis

Ongoing consolidation in the world of Information Technology can be seen as the major shaping force in this survey. Not only has the traditional EAI (Enterprise Application Integration) market experienced a large vendor consolidation during 2008, with a shift towards technological consolidation on the ESB (Enterprise Service Bus) model; the major technology analyst firms have also undergone consolidation, although more gradually. Considering the highly likely prospect that the current recession will continue to ripple onwards as a series of minor crunches, vendor consolidation remains an active component within the context of several larger trends. These larger trends play out mainly in the functional, informational and technological architectural areas, and can be seen as offering a wider variety of integration forms and a pervasive deepening of Service-Oriented principles in the form of SaaS (Software-as-a-Service), Cloud and Virtualization.

It is suitable to first look at information provisioning and the availability of neutral, independent advice. During the last ten years the number of firms specialized in technology research, analysis and advisory has shrunk to such an extent that the market is dominated by a small number of firms, and it has in fact become increasingly difficult to identify unbiased bureaus. During these years Forrester Research acquired Giga Information Group as well as JupiterResearch, whereas Gartner acquired META Group, AMR Research and Burton Group. Ovum was acquired by DataMonitor near the end of 2006, which in turn was acquired by Informa in 2007. Smaller firms such as Aberdeen Group (acquired by direct-marketing firm Harte-Hanks) and Yankee Group still maintain their own identity, thus far, but do not really offer deep insights into the ESB market. Their insights on Service-Oriented approaches are oriented more towards how Cloud and SaaS are being adopted by telecommunication firms and datacenter infrastructures, supported by case studies. The fact that the upper management of these smaller firms generally comes from either Forrester or Gartner may signify the flow of ideas and mindshare infused by such incidental organizational revamping.


It suffices to say the technology analysis and advisory market is dominated by Gartner, Forrester, IDC, Ovum and Aberdeen Group, where the latter three are often positioned by vendors for an in-depth study presenting their offering as best in class, but rarely are their market survey reports offered. The two main ideators are clearly Gartner and Forrester, and it should be pointed out that both companies are noticeably moving towards providing ever more consulting services, which appears to be a more lucrative endeavor. Simultaneously the phenomenon of 'star analysts' has taken off, with people such as Vinnie Mirchandani and Ray Wang playing an important role in re-personalizing the advisory role. Superimposed on the marketing spectrum, the latter resembles the mild personality cults around figures such as Seth Godin, or any of the self-help gurus so popular amongst management consultants. Despite the self-fulfilling power of many of their predictions, and the personal mediation by senior partners, the diminishing independence can be seen as a reinforcing feedback loop gravitating towards a few concentrated ideas. There is nevertheless clear value in following and incorporating their market research, as market adoption, and the resulting lifetime of a vendor, is not defined by technical or corporate prowess alone. Due to the close contacts of the analyst community with vendors' marketeers, the extra attention resulting from being mentioned in reports can and should be seen as part of a vendor's sales strategy. This has the benefit of highlighting previously unknown vendors one may have missed, but the disadvantage that skilled marketeers can shunt and skew scores once they have a grasp of the analyst firm's evaluation criteria. The latter comes at the cost of actually buying a previous edition of such a report, and some extras: a relatively low investment considering the derived value. On the whole, none of the analyst firms deviate much from each other, although they do appear to push their own particular insights where past predictions proved accurate enough. Once the market forces are understood, the research results are not particularly surprising, as will be shown below when describing the different trend movements. In order to understand the current market movements it is worth looking at the phenomena of workflow automation, network complexity and long-term trends in computing hardware.

2.2 Technological Trends

First, the long-term trends continue to gradually shape the landscape of technical capabilities. Solutions previously impossible are now a reality; solutions previously considered inefficient or ugly become a de-facto norm; solutions which merely facilitated an industry have come to dictate it, such as high-frequency trading, which has replaced most human trading (which used to lean on gathering knowledge) with very fast, short-term quantitative analysis based on overall market dynamics. The major technological trends are listed below:

1. Moore's law: Doubling of affordable processing power every two years. Tenfold increase in processing every 6½ years.

2. Nielsen's law: Doubling of high-end network connection speed every 21 months. Tenfold increase in bandwidth every 6 years.

3. Kryder's law: Doubling of affordable magnetic storage density every year. Tenfold increase in storage every 3¼ years.

4. Grosch's law: Computer performance increases as the square of the cost. Tenfold increase in performance for triple the cost.


Interestingly enough, usage also seems to accommodate these technological trends, and the amount of information stored and used is increasing at a faster rate than network bandwidth, which, if not dealt with by smart solutions (e.g. buffering the diffs as a preprocessing step for data synchronization), will eventually create a self-sustaining race condition. On a ten-year scale, these trends mean we have processors roughly 35 times as powerful, connected over a network 50 times faster, dealing with data stores potentially a thousand times larger. Whereas in 2000 one would need a high-end server to do some complicated data analysis, nowadays the same can be done on a laptop with 64-bit Office, allowing Excel to deal with spreadsheets larger than 2 Gigabyte. Not only that, this 2 Gigabyte spreadsheet can be stored in a remote location and accessed within times acceptable to the end user. Once a capacity has risen beyond a certain threshold its availability becomes a means, and applications will diversify. For cinema-quality HDTV some 2 million pixels need to be refreshed some 50 times per second at 32-bit color depth, so a network speed exceeding 3 Gigabit/second is needed; once such bandwidth is there, the use of graphical applications will dramatically increase. The hundredfold increase in bandwidth capacity from 1999 to 2011 implies a significant lowering of the chance of transmission errors: large files, containing e.g. updates on the sales information of a whole range of items, which used to take nearly 15 minutes to be received, can now be transmitted in less than 10 seconds. Well-planned schedules for synchronizing master data, which could be challenging to fit within a 24-hour timeframe, can now be squeezed into a 15-minute burst; in 2020 it will take half a minute.

As seen with user-dense file- and video-sharing applications, content is growing about as fast as data storage capabilities, and every two days the amount of data created online is about as much as all of recorded history together up to the year 2003, some five Exabyte. Content needs will continue to drive these trends onwards, and as the recent success of IBM's Watson playing Jeopardy demonstrates, fairly simple concept mining technologies can produce dramatic results when the hardware is powerful enough. This is just an early appearance of such tools, which can simulate actual understanding fast enough to win a television game show; but package a Watson in a shipping container and put a thousand of these in a Google or Bing data center, and an online secure multitenant 'Oracle-as-a-Service' could be operational by 2012. Not only can this be useful for having different ontological snapshots for Business Intelligence purposes; within a reasonably short amount of time businesses can use such a service for knowledgeable interpretation of what previously was unstructured data, and real-time analytics and continuous event simulation will gradually enter the scene wherever they become affordable. Artificial Intelligence with reasoning skills equivalent to an average human has been predicted for 2015, and, accelerated by combining it with the interpretative knowledge-frame provisioning of a Watson, Ask Jeeves or Wolfram Alpha, or all combined, and given the rapid improvements and upgrades that come with being hosted as a cloud service, it may very well compete with the human workforce, and most middle-management roles may come to an end. Needless to say, these forces will continue to pull technological progress forward for quite a while.
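As a quick sanity check on these figures, the sketch below recomputes the ten-year growth factors implied by the doubling periods listed in section 2.2, together with the raw bandwidth needed for the HDTV example; it is plain arithmetic, with the doubling periods as the only inputs.

```java
// Worked check of the ten-year projections above, using the doubling
// periods from the four laws listed in section 2.2.
public class GrowthLaws {
    static double factor(double yearsPerDoubling, double horizonYears) {
        return Math.pow(2.0, horizonYears / yearsPerDoubling);
    }

    public static void main(String[] args) {
        System.out.printf("Processing (Moore, 2 years):    x%.0f%n", factor(2.0, 10));  // ~32
        System.out.printf("Bandwidth (Nielsen, 21 months): x%.0f%n", factor(1.75, 10)); // ~52
        System.out.printf("Storage (Kryder, 1 year):       x%.0f%n", factor(1.0, 10));  // ~1024
        // Cinema-quality HDTV: 2 million pixels, refreshed 50 times per
        // second, at 32-bit color depth.
        double gbitPerSecond = 2_000_000.0 * 50 * 32 / 1e9;
        System.out.printf("Uncompressed HDTV stream: %.1f Gbit/s%n", gbitPerSecond); // 3.2
    }
}
```

Moore's law compounds to a factor of roughly 32 over ten years, close to the 35 quoted above; the small differences stem from rounding the doubling periods.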


The reason these trends are important for application design and operations is quite simple: as time goes by, any solution which derived its value from excelling in one of these aspects will diminish in value, while technologies focused on a small niche have gained wider applications. Web Services, which used to have their peak performance around the 10 Kilobyte margin, are now capable of dealing with multi-Megabyte messages, and scheduled FTP-based batch processing, which used to deal with big bulks of data, is now moving towards so-called 'Change Data Capture' or Near Real-Time ETL (Extract, Transform and Load), with more frequent batches, smaller batches, incremental batches, compressed batches and even continuous batches; bringing it very close to the more asynchronous world of 'Event Driven Architecture', which used to be the domain of Message Oriented Middleware. The 'cloud' trend also partially owes its existence to increased processing capabilities: the cost of adding an extra 'pass by reference' step buys the enormous flexibility of virtualization. 'Cloud' also involves a further extension of the Service-Oriented paradigm, as will be addressed later on. For now it suffices to point out there are ongoing virtualization trends of increasing popularity and efficiency in the areas of hardware devices (machine emulation), Operating Systems, storage, network and the other components which used to make up a stand-alone computing device.

Yet the same enabling forces also result in misuse of technologies, outside and beyond the scope for which they were originally designed. And while the technologies, tools and/or vendors may have become outdated and unsuitable within the overall context of mature enterprise architecture, their actual usage may have been such that a major investment is needed to retro-engineer a more suitable solution into place. Ironic computer maxims have already been coined describing this trend: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness." However, a study published in December 2010 by the President's Council of Advisors on Science and Technology demonstrated a speed gain from algorithmic improvement of a factor of 43,000, benchmarked over a fifteen-year period for production planning tasks. This indicates a doubling in efficiency every year due to ingenuity. Additionally, the same study indicates data volumes are growing exponentially, which leads the council to urge for a "big data" strategy, as well as for other themes related to crosscutting concerns such as security, privacy and interoperability, while dealing with increased complexity as computer systems and daily life continue to intertwine at increasingly deeper levels.


2.3 Workflow Automation

Workflow automation, as a term, was first coined in the early eighties by FileNet. An early recognition of the value of automating document processing, such as printing, faxing, mail handling, storage and archiving, led to ways of digitizing information, optical recognition and automatic processing, which gave an increased competitive advantage to the companies doing so. As companies increasingly made use of computerized applications to run their business, advances in electronic document exchange followed a similar development, and during the nineties a number of software vendors appeared which addressed such issues both internal and external to companies. 2002 was the first year worldwide digital storage capacity overtook total analog capacity, and current estimates indicate nearly 98% of information is stored in digital form. The domains dealing with automated document workflows were most often referred to as EAI (Enterprise Application Integration) and B2B (Business to Business), and different vendors specialized in different aspects of these. Both address interoperability; however, where EAI replaces the custom-built sets of interfaces gluing together the information flows from system to system, B2B deals with regulated information exchange protocols between different companies. EAI vendors often developed from work their founders had done previously in addressing interfacing issues, and as such were primarily focused on simplifying and hosting point-to-point interfaces, the information flows between two systems. But as more and more interfaces were being hosted, the requirements for EAI started shifting and became largely similar to the B2B model: the internal systems and departments involved can be seen as trading partners, and sometimes they actually are, were or will be.

Due to the number of participants holding them together, B2B Value Added Networks proved robust enough to stand the test of time, yet over the years many B2B implementations came to suffer from competition both from EAI and from the rise of Web Service technologies. Outside certain well-defined business domains, B2B actually had to lean on and incorporate EAI functionality in order to deliver more value; otherwise it would crack under the weight of its own success, much as early Electronic Commerce online stores were more akin to an online advertising brochure where orders would be routed to a printer at the other side of the screen.


Web Services, partly a self-fulfilling promise, but primarily because of their strong focus on standardization of the exchange protocol, proved an excellent candidate for the more 'quick and dirty' way of B2B messaging, and many Web Service based point-to-point interfaces have overtaken the well-thought-out experience of decades of EDI (Electronic Data Interchange) and Managed File Transfer. Web Services also had a promise to fulfill as an interchange infrastructure for the Web, leaning on the widespread standardization of the underlying network infrastructure itself, whereas older forms of EDI would still use Teletext-based information retrieval. Leaning on three different XML-based components, SOAP (Simple Object Access Protocol) envelopes for message exchange, WSDL (Web Services Description Language) for describing the workings of the interface 'service', and UDDI (Universal Description, Discovery and Integration) as a registry, Web Services proved a very powerful concept for interfacing, and due to widespread industry support they rapidly grew to become the standard for exposing interfaces. Also due to the flexibility of XML, the ease of dealing with different languages and terms (as in namespaces), and the general assumption that the trendy new architectural paradigm SOA equated to the use of Web Services, the approach very quickly moved from a proposal to a de-facto standard. Yet, as foreseen by e.g. Anne Thomas Manes of the former Burton Group, implementations often applied shortcuts because of the lack of strategic overarching initiatives, and UDDI more or less disappeared from the scene. As Web Services were picked up so hastily, much-needed alterations to the definitions were implemented by vendors before they became agreed-upon standards, which led to all kinds of XML dialects and interoperability issues between tools of different vendors. This led to a cooperation between several vendors defining the WS-I (Web Services Interoperability) profiles, with which they chose to comply so as to ensure better interoperability. While the SOAP standards have become a mature technology, advocates of a simpler 'quick and dirty' way chose to go back to basics, and as an intermediate format REST (Representational State Transfer) became quite popular with its simpler XML-over-HTTP scheme; yet all its data exchanges are session-based synchronous invocations, and thus REST-based techniques are best suited for more real-time and stateless functionality. Lessons learned from Web Service technology and SOA have inspired a set of emerging standards, in particular the OASIS OpenSOA initiatives SCA (Service Component Architecture) and SDO (Service Data Objects), which are best seen as a new programming paradigm. And of course, facing the issue of dealing with many interfaces, registry technologies are catching up, mostly because of added features such as advanced monitoring and management. As a new way of mixing programming with architectural design takes shape, with S-RAMP (SOA Repository Artifact Model and Protocol) the overarching registry is getting a well-deserved second life. Here we have both an industry-wide, standards-based, low-level way of packaging functionality and a way forward into assembling a whole application out of these functions.
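To make the contrast between the two styles concrete, the sketch below expresses one and the same logical request as a SOAP envelope and as a REST-style call. The operation name getOrderStatus, its namespace and the URL are hypothetical illustrations, not taken from this survey; only the SOAP envelope structure itself follows the standard.

```java
// Minimal sketch: the same logical call expressed as a SOAP 1.1 envelope
// (as used by WS-* stacks) and as a REST-style HTTP request.
public class SoapVersusRest {
    public static void main(String[] args) {
        // SOAP wraps the operation and its parameters in an XML envelope;
        // the operation name and namespace below are illustrative assumptions.
        String soapEnvelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">\n" +
            "  <soap:Body>\n" +
            "    <getOrderStatus xmlns=\"http://example.com/orders\">\n" +
            "      <orderId>4711</orderId>\n" +
            "    </getOrderStatus>\n" +
            "  </soap:Body>\n" +
            "</soap:Envelope>";
        // REST expresses the same request as a stateless, synchronous call
        // on a resource identified by its URL.
        String restCall = "GET http://example.com/orders/4711/status";
        System.out.println(soapEnvelope);
        System.out.println(restCall);
    }
}
```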
Along with the increase of informatics in the business world, many early adopters gained a competitive advantage simply from applying some new technology, but this often signaled the start of a race with competitors trying to gain or maintain the leading edge of their industry. Gradually this involved more and more automation of parts of the business, and, analogous to the automation of the factory hall into a well-oiled assembly line, business functions were foreseen to become automated to a similar extent. In an attempt to recognize the growing importance of IT for any entrepreneurial endeavor, as well as the departure from data processing, document management and the term 'workflow automation' (which had gradually come to mean 'human workflow'), the term BPM (Business Process Management) was introduced. BPM is in fact the earlier discipline of Business Process Reengineering, but now with extra attention for the use of sophisticated information systems, and with technology no longer treated as a goal (hence the departure from the 'engineering' analogy) but as a means to perform.


Similar to the abusive tendency of sales and marketing to push SOA as a solution to the world's problems, BPM 'suites' quickly became popular, and whereas thorough models had been developed, such as HP's "Zero-Latency Enterprise", EDS's "Digital Business Platform" and the WfMC's (Workflow Management Coalition) Reference Model, the push towards standards-based modeling and runtime execution, as well as standardized interfacing, caused an industry shift towards the combined use of BPMN (Business Process Modeling Notation) and WS-BPEL (Web Services Business Process Execution Language). Overcoming several limitations of earlier initiatives for Web Service Orchestration and Web Service Choreography, IBM and Microsoft joined forces to combine their Web Services Flow Language and XLANG (XML LANGuage); even before standardization, the BPEL drafts were picked up by different vendors, and combined with the Web Service incarnation of Service-Oriented interfacing this proved a very strong combination, able to address many scenarios in a cost-efficient manner.

Parallel with how former 'Application Service Providers' gradually morphed into 'Software-as-a-Service' businesses, these standards are perhaps best understood by looking at the evolution of both Composite Applications (still leaning on the Portal model) and their uncrowned successor, the Enterprise Mashup. Leaning on the possibilities offered by e.g. encapsulating a mainframe with a shell of Web Services to expose widely used functions, these functions can easily be put together in a background process, or even in a personalized overview on a webpage, so that information from different sources is combined into what appears to be a whole new application. Especially with the increased use of internet technologies and the rise of the internet browser as a universal client, this way of serving up business functionality is becoming the norm instead of the exception. Add to this the rapidly growing mobility trend with the convergence of mobile telephony, personal digital assistants and portable computing, the widespread introduction of 'sensors' for just about everything measurable, and the recent market introduction of hybrid ultra-capacitor batteries, and it is clear the future of ubiquitous computing may arrive earlier than expected.


2.4 Complexity

A third major trend movement concerns network effects and complexity. Dealing with complexity is seen as the primary IT challenge for the coming decade. While the 'long tail' effect was cheerfully hailed in the management consultancy domain, the underlying ideas about self-organized criticality have a much greater impact. Besides providing a deepening of the limited range of applicability of both the 80-20 division and the often accompanying normal (Gaussian) distribution, it also points to a universal tendency of connections, of networking. Given enough choice, any distribution will stabilize towards a few hubs with many connections and many hubs with a few connections; distributions tend to cluster, just as cities arise, train stations within a city, or even the distribution of wealth.
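This clustering tendency is easy to reproduce in a toy preferential-attachment simulation, a standard textbook illustration rather than anything specific to this survey: each newly arriving node links to an existing node with probability proportional to that node's current degree, and a few early nodes grow into hubs.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Minimal preferential-attachment sketch: new nodes attach to existing
// nodes with probability proportional to current degree, so early nodes
// tend to grow into hubs while most nodes keep only a few links.
public class PreferentialAttachment {
    public static void main(String[] args) {
        Random rng = new Random(42);
        List<Integer> endpoints = new ArrayList<>();
        endpoints.add(0);               // seed network: one link between
        endpoints.add(1);               // node 0 and node 1
        for (int newNode = 2; newNode < 1000; newNode++) {
            // A uniformly random pick from the endpoint list weights each
            // node by its degree (each link lists both of its endpoints).
            int target = endpoints.get(rng.nextInt(endpoints.size()));
            endpoints.add(newNode);
            endpoints.add(target);
        }
        long seedDegree = endpoints.stream().filter(e -> e == 0).count();
        System.out.println("Links on seed node 0 after 998 arrivals: " + seedDegree);
    }
}
```

The resulting degree distribution is heavily skewed towards the oldest nodes, the 'rich get richer' dynamic behind the long tail mentioned above.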

Network effects are what create lock-in: the sheer number of entangled connections forms a critical mass, resulting in a high degree of path dependency for the future evolution of its contextual domain. In other words, once the number of participants adopting a technology, whether users, developers or providers, passes a certain threshold, its widely accepted application has become resilient. This has a positive side, e.g. when trying to establish a large enough consumer base, but the resulting inertia becomes burdensome when a next step is needed, wanted or simply more appropriate. This tendency to network is also reflected in computer systems, following the gradual evolution from special-purpose computational device to general-purpose computer, to cables connecting computers to join their strengths into networks, from dial-up services over telephone lines, similar to faxing, to virtual private networking over the internet, as interfacing becomes integration; and next, around 2020, after the sensor wave eases down, we will effectively be living inside a decentralized robot. Systems of systems continue to be constructed with interoperability-enabling technologies appropriate for only a limited number of connections until they reach a critical mass, a threshold where the costs and effort of monitoring and maintenance become so large that the technique shifts up an abstraction level towards the 'adjacent possible', where a more flexible, general-purpose variation of the original technique is adopted. This is where portability and interoperability based on open standards continue to show their value in anticipating and facilitating these emergent expansions in interchangeability. Early observations of these dynamics include for example "the iron law of oligarchy", a political theory from a century ago which states that all forms of organization, regardless of how democratic or autocratic they may be at the start, will eventually and inevitably develop into oligarchies. The reason behind this tendency towards 'rule by the few' is bureaucratization to maintain efficiency while growing, which in turn leads to delegation and thus specialization. This in turn leads to the rationalization and routinization of authority and decision making, and necessarily results in the role of a leader; along with the common tendency of leaders to have a bias towards their own interests, this will eventually result in a self-perpetuating oligarchy.


Whether technologies are seen in isolation or in a more contextual manner, lessons from decades of utilizing systems theory approaches have accumulated into a simple predictive set of twelve 'leverage points' in any system where small shifts can produce big changes. They are intentionally listed here because they provide a simple, powerful means to estimate technological shifts and usage patterns, and how these impact the IT environment. They also give a good idea of how a vendor's offering effectively fits into their overall suite, what their focus may be for this particular offering and the whole roadmap, and what the purpose and goal are behind their offering and general company policy. Are they doing what they promise, or do they have a rather liberal idea of marketing? Do they aim to be a storehouse for every possible technology, or are they highly focused and specialized? And if so, are they too unique to evolve, or are their tools of such exceptional quality that an acquisition would mean no interruption to the tools' future? The following leverage points are in increasing order of effectiveness:

12. Constants, parameters, numbers (such as subsidies, taxes, standards). Though they are the most clearly perceived among all leverages, they rarely change behaviors and therefore have little long-term effect as a means of intervening in the system.

11. The size of buffers and other stabilizing stocks, relative to their flows. A buffer's ability to stabilize a system matters when the stock amount is much higher than the potential inflows or outflows. In a lake, for example, the water is the buffer: if there is a lot more of it than the inflow/outflow, the system stays stable. Buffers can improve a system, but they are often of a critical size and cannot be changed easily.

10. Structure of material stocks and flows (such as transport network, population age structures). A system's structure may have enormous effect on operations, but may be difficult or prohibitively expensive to change. Fluctuations, limitations, and bottlenecks may be easier to address.

9. Length of delays, relative to the rate of system changes. In- and outflows occurring too fast or too late can cause over- or underreaction, even oscillations.

8. Strength of negative feedback loops, relative to the effect they are trying to correct against. A negative feedback loop slows down a process, tending to promote stability. The loop can keep a stock on target thanks to, e.g., the speed of parametric data feedback and the size of the correcting flows (a toy simulation of points 8 and 9 follows after this list).

7. Gain around driving positive feedback loops. A positive feedback loop speeds up a process. In most cases, it is preferable to slow down a positive loop, rather than speeding up a negative one.

6. Structure of information flow (who does and does not have access to what kinds of information). Information flow is neither a parameter, nor a reinforcing or slowing loop, but a loop that delivers new information. It is cheaper and easier than changing structure.


5. Rules of the system (such as incentives, punishment, constraints, and legalities). Pay attention to rules, and to who makes them.

4. Power to add, change, evolve, or self-organize system structure. Self-organization describes a system's ability to change itself by creating new structures, adding new negative and positive feedback loops, promoting new information flows, or making new rules.

3. Goal of the system. Changes every item listed above: parameters, feedback loops, information and self-organization.

2. Mindset or paradigm that the system - its goals, structure, rules, delays, parameters - arises out of. Paradigms are hard to change, but there are very few limitations on doing so. It is suggested paradigms might be changed by repeatedly and consistently pointing out anomalies, failures and alternatives.

1. Power to transcend paradigms. May go beyond challenging fundamental assumptions, into the realm of changing the values and priorities that lead to the assumptions, and being able to choose among value sets at will.
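As a minimal illustration of points 8 and 9 above, the toy stock-and-flow simulation below steers a stock towards a target with a negative feedback loop, and then lengthens the measurement delay; all numbers (the target, the loop gain, the delay) are illustrative assumptions.

```java
// Toy stock-and-flow model for leverage points 8 and 9: a negative
// feedback loop steers a stock towards a target; a delayed observation
// turns the smooth approach into over- and undershooting.
public class FeedbackDelay {
    static void simulate(int delaySteps) {
        double[] history = new double[60];
        double stock = 0.0, target = 100.0, peak = 0.0;
        for (int t = 0; t < 60; t++) {
            history[t] = stock;
            // The correcting inflow reacts to a delayed observation of the stock.
            double observed = t >= delaySteps ? history[t - delaySteps] : 0.0;
            stock += 0.5 * (target - observed); // loop gain 0.5 (assumption)
            peak = Math.max(peak, stock);
        }
        System.out.printf("delay=%d steps: peak %.1f, final %.1f%n",
                delaySteps, peak, stock);
    }

    public static void main(String[] args) {
        simulate(0); // converges smoothly, never exceeding the target
        simulate(2); // overshoots well above the target, then oscillates back
    }
}
```

With no delay the stock approaches the target smoothly; with a delay of two steps the same loop overshoots to roughly 175 before oscillating back towards 100, exactly the over- and underreaction point 9 warns about.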

These general ideas can easily be applied to technical challenges, such as using size limits on emails to avoid network congestion, blocking YouTube to avoid unneeded downloads of large video files, or putting a personalized upper limit on daily visits to YouTube to balance much-needed pauses and amusement against loafing. Or: is the patent system still appropriate for knowledge-intensive industries, which tend to accelerate due to the positive reinforcement of cross-fertilization among different knowledge domains? It has clearly been shown to slow down innovation, but is that a positive or a negative effect when keeping in mind return on investment, competition and the need to create path dependence?

Along with the accelerated change occurring due to the progress in hardware capacity and the cross-fertilization amongst knowledge-based industries, networking effects have significant consequences due to the risk of exclusion: "The more people included within and enjoying the benefits of a network, the more the costs of exclusion grow exponentially to the excluded, and spread across multiple dimensions and impose additional costs even on those who are networked included." This indicates a rather inconvenient characteristic of positive and negative competitive reinforcement for most networking phenomena, without an easy way out when sticking to the level at which the network effect applies.


Within every IT department certain unforeseen applications have grown with such immediate popularity that they have turned into an architectural patchwork which at some stage needs to be retro-engineered, or at least encapsulated until decommissioned, so as to gain the next step of value from working in conjunction with other applications. Every IT department has had to deal with the increase of emails in which business decisions are made, which would better be made official in a dedicated application, as well as with the challenge of the sheer volume of email moving through the internal network and how to handle this when scheduling system backups. Every IT department has had to deal with increases in security requirements which demand rigorous user management, authentication, authorization and access control, and monitoring all of this for accounting purposes and improvements. And now we are on the threshold of having to deal with even more information, with unified communications, and with many end-user devices. Most IT environments lack the time and budget to take a step back and create some order, and are now facing what will be large strategic projects which cannot be compartmentalized into a single monolithic application. 'Command and control' is reaching its current limits, unless something changes, and these changes are gradually becoming visible. End-to-end visibility, application discovery and dependency mapping, and software engineering automation meta-tools are just some of the tools which have appeared to address these issues, some as part of Business Service Management suites, some to simplify operations of distributed server parks, some to 'encapsulate' the act of programming and system design in a facilitated manner to optimize results. In this way what used to be the end station of development, operations, is reaching around and gradually enforcing a much more modern approach to both hosting and application development, with ongoing maintenance, search engine techniques and dependency injection, in an environment best described as a state of perpetual beta. And this is where the needs arising from sheer complexity touch on the new programming paradigm of the Service Component Architecture, a first step towards dynamic application infrastructures, and what used to be a systems management tool is now transforming into a whole suite of products very much akin to the BPM products it also hosts.

2.5 ESB ++

As recognized with the S-RAMP initiative, and with the expansion of traditional integration vendors into the Service Management area, a SOA Registry has become increasingly important: not only for governance reasons, but also for technical monitoring and management, as well as for acting as a centralized repository to store different versions of development and runtime artifacts.


This, again, is an area where Business Service Management meets SOA, as they are more or less addressing the same issues, albeit at a different resolution level of the IT environment, or seemingly so. Service granularity is often a difficult subject, as it aims to find a modality between the informational level on which the services are exposed and the fractal nature of SOA; an issue often addressed by introducing a hierarchical set of layers of increasing complexity, from the singular service upwards to an end stage where these services come together in a singular unit such as a process or a screen. Yet from a conceptual angle there is little difference between approaching a server virtualization project in a SOA manner, orchestrating tens or hundreds of batch processes for data exchange and processing, or letting a client order several samples via the updated online catalogue and replying with a range of delivery dates to choose from. All of these are dealing with 'services'. The Service Component Architecture is currently realized with the implementation of OSGi (Open Services Gateway initiative), a dynamic module system for Java which provides the standardized primitives that allow applications to be constructed from small, reusable and collaborative components. OSGi provides functions to change the composition of these components dynamically, in a 'hot pluggable' manner, without requiring restarts; a minimal sketch follows after the list below. Listing the core principles of a SOA makes it apparent why Component Based Development turned to it:

• abstraction
• autonomy
• composability
• discoverability
• formal contract
• loose coupling
• reusability
• statelessness
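The hot-pluggable behavior referred to above comes down to a few framework primitives. The sketch below shows a minimal OSGi bundle activator that registers a service on start and withdraws it on stop; the Greeter interface is an illustrative assumption, and the snippet needs the org.osgi.framework API on the classpath and an OSGi runtime to actually host it.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Minimal sketch of OSGi's hot-pluggable model: a bundle registers a
// service on start and withdraws it on stop, without a container restart.
public class GreeterActivator implements BundleActivator {
    // Illustrative service contract; any interface works the same way.
    public interface Greeter { String greet(String name); }

    private ServiceRegistration<Greeter> registration;

    @Override
    public void start(BundleContext context) {
        // Installing (or updating) this bundle makes the service available
        // to other components dynamically, at runtime.
        registration = context.registerService(
                Greeter.class, name -> "Hello, " + name, null);
    }

    @Override
    public void stop(BundleContext context) {
        registration.unregister(); // removed without restarting the framework
    }
}
```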

Such flexibility not only speeds up the development cycle, it also fits very well in the 'perpetual beta', closed-loop application lifecycle management model introduced earlier. With OSGi it is fairly simple to run multiple versions of the same functionality within the same application, allowing for gradual, phased improvements instead of sharp cutovers. This does, however, mean a shift towards service assembly with so-called Composites, consisting of one or more OSGi components; these Composites can resemble a Java object tree, a BPEL process or any other kind of hierarchy.


And whichever tool assembles the components into Composites and regulates their runtime behavior cannot simply be stateless, and along with other possible features we are again touching on Business Service Management. The need for large-scale integration solutions to mix and merge in some way with Systems Management has become apparent alongside the emergence of a new way of application development, and recognition of this shift is visible across many vendors, but it has clearly not crystallized into a consistent offering yet. Current trends at the traditional integration vendors indicate a solid understanding of the complexities of scale. During the gradual evolution most tools and product suites have merged or enveloped what used to be distinct products, such as Enterprise Application Servers, Event-Driven Applications, Message Oriented Middleware and Transaction Processing Monitors. Most vendors have moved away from a 'hub-and-spoke' architectural design model towards a 'bus' model (comparable with connecting two machines with a direct cable versus via the network), where the bus is usually provided by an, ideally clusterable, Message Queue product; although for Extreme Transaction Processing or Ultra-Low-Latency Messaging, Space-based solutions are becoming increasingly popular. Such in-memory 'spaces' are primarily provided by JavaSpaces, a virtualized memory realm consisting of a dynamic federation of Jini runtime containers. These spaces are also popular because they offer a relatively cheap, high-performance solution for caching relevant business data, an Operational Data Store, which is important in three emerging trends in the integration market.

The first trend is more traditional and involves Business Intelligence, which was one of the first uses of computing. The gradual shift towards more real-time reporting has forked into two camps: one with a centralized data store updated with batch runs of large amounts of data originating from several key applications, the other with a data store which is continuously updated with the latest information from the dataflow through the interfaces themselves.

A second trend again involves the need for reporting, but is also associated with the number of interfaces hosted. EAI projects often originated with point-to-point interfaces, but many of these dataflows need to go to multiple systems, especially when the information concerns identifiers, master data. Such broadcasting of information is better served by a subscription-based model, and the extra step involves introducing a canonical message format which incorporates a canonical data format; this is where information proprietary to the source system is renormalized to an accepted standard. Not only does this allow for much easier tracking and tracing, visibility and overall reporting across different systems, it also reduces the integration effort from N x M towards N + M, where N and M signify the number of source and target interfaces (a sketch of this subscription model follows below). This is how Enterprise Application Integration shifts a level up the OSI layers towards Enterprise Information Integration, and the area of MDM (Master Data Management) is evolving rapidly to provide the tools to enable this, as well as data synchronization between the systems on the edge, notification of changes after batch processing procedures, verification, and simply data cleansing and data quality purposes.
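A minimal sketch of the subscription model behind the N + M effect: every source system supplies one adapter that renormalizes its proprietary records into the canonical format and publishes them once, and every target supplies one subscriber, so adapters grow linearly with the number of systems rather than with the number of pairs. The CanonicalCustomer fields and the two subscribers are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Canonical-format publish/subscribe: N source adapters publish into one
// agreed format, M target adapters subscribe to it; N + M adapters total
// instead of N x M dedicated bridges.
public class CanonicalBus {
    record CanonicalCustomer(String id, String name) {} // the agreed standard format

    private final List<Consumer<CanonicalCustomer>> subscribers = new ArrayList<>();

    void subscribe(Consumer<CanonicalCustomer> target) { subscribers.add(target); }

    void publish(CanonicalCustomer msg) { // broadcast to all subscribed targets
        subscribers.forEach(s -> s.accept(msg));
    }

    public static void main(String[] args) {
        CanonicalBus bus = new CanonicalBus();
        bus.subscribe(c -> System.out.println("CRM received " + c));     // target adapter 1
        bus.subscribe(c -> System.out.println("Billing received " + c)); // target adapter 2
        // One source adapter renormalizes a proprietary record and publishes once:
        String[] proprietary = {"C-42", "ACME BV"};
        bus.publish(new CanonicalCustomer(proprietary[0], proprietary[1]));
    }
}
```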
The third trend involves CEP (Complex Event Processing), the real-time filtering, matching and correlating of small data streams, events. CEP is currently most often advocated as a form of BAM (Business Activity Monitoring), but it is easiest to understand as an inverted database: instead of having a store with structured data on which queries are executed, CEP acts as a memory space with standing queries through which data events are sent, triggering a follow-on action whenever such a query is fulfilled (see the sketch below). Processing a variety of RFID signals was one of the original use cases, but it can also be useful for monitoring the data center itself. CEP and BPM Suites offer complementary functionality and are likely to merge, if not as a technology then at least as an offering, hopefully with the appropriate level of interoperability. CEP can be used for active monitoring of compliance with highly critical Service Level Agreements, and in that sense can even be used for automated handling of SNMP traps in combination with other types of information.
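A minimal sketch of the 'inverted database' idea: standing queries are registered first, and events are then pushed through them, triggering the follow-on action on a match. The event fields and the 500 ms SLA threshold are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Inverted-database sketch of CEP: queries are registered as standing
// predicates, and the event stream flows through them, instead of data
// sitting in a store waiting for ad-hoc queries.
public class MiniCep {
    record Event(String type, long latencyMs) {}
    record StandingQuery(Predicate<Event> matches, Consumer<Event> action) {}

    private final List<StandingQuery> queries = new ArrayList<>();

    void register(Predicate<Event> matches, Consumer<Event> action) {
        queries.add(new StandingQuery(matches, action));
    }

    void publish(Event e) { // each incoming event flows through all standing queries
        for (StandingQuery q : queries) {
            if (q.matches().test(e)) q.action().accept(e);
        }
    }

    public static void main(String[] args) {
        MiniCep cep = new MiniCep();
        // Standing query: flag any acknowledgement slower than a 500 ms SLA.
        cep.register(e -> e.type().equals("ack") && e.latencyMs() > 500,
                     e -> System.out.println("SLA breach: " + e));
        cep.publish(new Event("ack", 120)); // passes silently
        cep.publish(new Event("ack", 900)); // triggers the follow-on action
    }
}
```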


Both CEP and BPM are highly suitable for dealing with the agreed-upon protocols of a B2B VAN: within a certain time a request needs to be acknowledged as received and followed by a response, or an escalation action can follow, involving a notification or the allocation of a human task. It should be clear that the remaining differences between EAI and B2B are in the area of security, but one is advised not to assume this conceptual convergence translates into a fusion of the different specialized tools. CEP and MDM are considered the main areas of development for the integration vendors in the coming years. Most vendors offer a mature ESB product and/or a mature BPM product, and these products are natural extensions as far as back-office integration goes.

2.6 Human Workflow

As far as the front office goes, integration providers have been relatively weak in this area, although it should also be recognized as highly complex. The automation of a structured process, where some steps are vital but most are barely visible background noise, is a lot simpler than dealing with the many variables of human interaction. Even Enterprise Mashups, albeit focused on one or several personalized composite screens, need fairly complicated means to combine the information from different sources, especially when bi-directional data exchange and multiple steps are involved. Several mature human workflow solutions are available from the more BPM-oriented vendors, the most advanced offering adaptive processes consisting of small atomic series of activities which are dynamically chained together depending on policy, rules, information from previous activities or marketing goals. An even more flexible sort of human workflow solution is nearing the mainstream, related to what are often completely unstructured workflows. Deeply related to other Collaborative Software, Adaptive Case Management (ACM) is a solution receiving more attention in both front-office and back-office environments. Collaborative Software can best be seen as a further evolution of groupware.

These product suites are meant to ease, facilitate and optimize all those activities which cannot simply be standardized or automated. Such features were already available within the context of e.g. Collaborative Product Lifecycle Management, where complex, knowledge-intensive activities such as product design were encapsulated with features for conferencing, document management, joint agendas and means for shared editing of the same document, even though the team itself could be located at different places on the globe.


Although solutions for Application Lifecycle Management already existed, these are gradually merging into a more collaborative wrapping, dealing with e.g. 'closed-loop incident management' or 'integrated change management'. Development and delivery have changed from an assembly-line pipeline running from the front office to the back office towards a continuous loop within a context of Business Service Management.

Front-office workers, who answer emails when they can, work on documents when needed, attend a meeting when available, and quickly use an instant messenger to settle an issue, are supported by tools which used to deal with Enterprise Content Management, but it is very difficult to extract knowledge from this information. Naming and archiving standards are widely used, allowing a little of the content's meaning to reappear through the location where a document is stored and the name it carries, and maybe meta-tags are used to label it within different categories, but there is still a large loss of information between that level of meaning and the actual content. Preferably, that is. Recent acquisitions indicate a recognition of the value BPM has for the ECM market. As more activities become automated, front-office workers will increasingly deal with the exceptions and other activities occurring at a level of complexity which may be broken down into subtasks, but which, as a knowledge domain, cannot be simplified any further without a severe loss of meaning and thus value. For example, in the scenarios above, assembling temporary teams will be part of a much more goal-oriented context, but that does not mean that psychometric tests, training in Neuro-Linguistic Programming, body language, and pre-screening by HR employees result in a better selection than a simple 'touch and feel' meet-up with someone with a lot of experience of human nature. In a simplified manner, BPM can be characterized as data flows in a process setting, whereas ACM can be seen as processes which are executed in the context of a case. The current vendor landscape shows a differentiation between back-end workflows, BPM, and the user-facing Human Workflow which is morphing into more collaborative settings, both in the area of Adaptive Case Management and of Business Service Management, but which at present can best be considered as dealing with structured versus unstructured workflow.


2.7 General Conclusion

Overall, the value of SOA appears to be clear, and both pure-play ESB offerings and those embedded in a BPM Suite offer most of the much-needed features. The current marketplace is driven by mergers and acquisitions as well as by pure technical innovation, and this leads to significant differences between one vendor's offering and another, sometimes even within the same vendor. As the market is clearly at the crossroads where SOA is becoming a new paradigm for application infrastructures, with the current cloud craze as an example, it is duly noted that the market is very competitive, and as a result fragmented; vendors with a demonstrated excellence in dealing with this challenge in a manner satisfying to their end clients will clearly stand out.

SOA has not only become the leading norm for integration projects, but also for the application design of multi-tier architectures, and in such a pervasive way that one should consider enterprise-wide application infrastructures which are 'hot pluggable'.

Several ‘attractors’ are assumed to be the primary shaping forces for the industry, and during this pre-selection, vendors with a future-compliant roadmap will be preferred above others. In addition, the ESB has to deal with the inheritance of the past and the challenges of a heterogeneous environment, which means solid support for adapters and transformation. Yet it should also be recognized that the paradigm shift for application infrastructures applies to the integration suites themselves, an ongoing retro-engineering feat which some vendors appear to be avoiding. The ESB is becoming an application platform: it is not ‘middle’-ware anymore when most in-house applications are more or less embedded in a network of interfaces, it is ‘total’-ware. SOA is a discipline and might as well be acted out with paper mail; choosing an appropriate technological foundation is a first but vital step. Adaptive, standards-based, hot-pluggable service assemblies ought to span the offerings of different vendors, and after the widespread consolidation of the EAI market vendors will have to be versatile enough to allow for solid interoperability and portability. This matters especially because the Service-Oriented paradigm shift also means the ‘build or buy’ question can often be decided towards an affordable customization: in times when merely using a new technology is no longer a competitive differentiator, few buyers are served by buying a ‘business in a box’.

In many respects migration efforts sit at the junction of these evolutionary trends: a step into an ongoing realization of a future while simultaneously forming a bridge to harvest existing systems. By encapsulating proprietary technologies with universal interoperability, implementing an ESB product more or less removes the need for an ESB architectural model. In other words, the ESB as a mediator is only needed for as long as the systems do not offer service-oriented features themselves; the role of the ESB is to enable such a Service-Oriented application infrastructure, thereby making itself redundant and ready to be replaced by a more appropriate service management solution.

3 EVALUATE, CLASSIFY AND COMPARE

3.1 Evaluation Just as the gradual introduction of databases allowed a large number of applications as well as architectural frameworks to arise, the shift to Service-Oriented technologies is akin to the introduction of the relational database in 1970. This shift has parallels even in a conceptual way, as composite applications are moving beyond the stage of a centrally orchestrated set of functions using some structured data towards a more inverted form of network-based architectural style with spontaneously emerging, purpose-driven hierarchical structures. In this sense software design is shifting from the scale of building a house, hut or mansion to building a residential area or an apartment building.

It is worth noting that many Systems Management suites have built-in capabilities for supporting ITIL (Information Technology Infrastructure Library), a set of concepts and best practices published by the UK Office of Government Commerce. Published since the 1980s, ITIL, although very powerful, also proved very complex; in 2007 it was extended with the release of ITIL v3, a more universal approach adopting the idea of a ‘Service’, with full lifecycle management for services covering the entire IT organization and all supporting components. ITIL v3 will replace the older versions as of the end of Q2 2011. Needless to say, this only reinforces the uptake of the Service-Oriented paradigm.

When looking for a particular solution making use of a service offering, it is worth using a simple overview combining SWOT (Strengths, Weaknesses, Opportunities, and Threats), VRIO (Value, Rarity, Imitability, and Organization) and Porter’s Five Forces. The latter gives a good idea of competitive rivalry using a model with three forces from ‘horizontal’ competition (the threat of substitute products, the threat of established rivals, and the threat of new entrants) and two forces from ‘vertical’ competition (the bargaining power of suppliers and the bargaining power of customers).
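To make such a combined overview concrete, the sketch below tallies a single offering against the three frameworks in one pass. It is a minimal illustration only; all criteria names, weights and scores are hypothetical placeholders rather than part of any vendor’s tooling.

    # Minimal sketch of a combined SWOT / VRIO / Porter overview.
    # All criteria, weights and scores are hypothetical placeholders.

    FRAMEWORKS = {
        "SWOT":   ["strengths", "weaknesses", "opportunities", "threats"],
        "VRIO":   ["value", "rarity", "imitability", "organization"],
        "Porter": ["substitutes", "rivals", "new_entrants",
                   "supplier_power", "customer_power"],
    }

    def score_offering(ratings, weights=None):
        """Aggregate per-framework averages from criterion ratings (-2..+2)."""
        weights = weights or {}
        summary = {}
        for framework, criteria in FRAMEWORKS.items():
            total = sum(ratings.get(c, 0) * weights.get(c, 1.0) for c in criteria)
            summary[framework] = total / len(criteria)
        return summary

    # Hypothetical ratings for a single ESB vendor offering.
    example = {"strengths": 2, "weaknesses": -1, "opportunities": 1, "threats": -1,
               "value": 2, "rarity": 1, "imitability": -1, "organization": 1,
               "substitutes": -1, "rivals": -2, "new_entrants": 0,
               "supplier_power": 1, "customer_power": -1}

    print(score_offering(example))  # {'SWOT': 0.25, 'VRIO': 0.75, 'Porter': -0.6}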

Albeit most times indirect, a vendor’s offering is assumed to be shaped by several industry methodologies, so one can safely assume the use of systemic invention methodologies such as TRIZ, as well as decision management support systems using dynamic strategy process models. Offerings definitely deserve a reality check, but because standardization has been such a strong force in this segment, best practices tend to bubble up and should be fairly easily identifiable via the provided marketing information, brand name, participation in standards bodies, public R&D and willingness to reach out. In that sense, an offering that is a face-lifted version of near-obsolete technology will gradually be filtered out by the market segment itself, due to the strong focus on standardized interoperability.

3.2 Vendor categorization When comparing vendors in their main market segment, making individual comparisons with objective criteria is often unfeasible and, more importantly, pointless. It is worth looking at what places them in a particular segment compared to the overall market and other nearby segments, but when vendors are close together other qualities start to dictate. Essentially a vendor needs to be able to deliver a valuable solution, and the question is whether it can continue to deliver this value. This translates into predictive capabilities: whether it recognizes the near future and is able to shape it, both technically and with comprehensive business cases, market research and marketing information. In that sense companies can form their own segment if they are further ahead than their usual competitors. And of course their delivery organization matters: their general ability to execute, including market share and brand name, overall and for this offering.

In that sense a vendor should be evaluated with respect to their vision, strategy, competence and ability, as an individual company and in the context of the market’s evolution. And it is with respect to the overall market that their offering may be suitable for further consideration. Their whole offering involves a triage of solution, services and support, and can be rated according to capability maturity: whether the quality levels are ad-hoc, repeatable, defined, managed or optimizing. A vendor needs to show competence on several simple capabilities: goals, commitment, ability, measurement and verification.
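As a minimal illustration of such a rating, assuming hypothetical component names and scores, and using the familiar five-level maturity scale mentioned above:

    # Sketch of a capability-maturity rating per offering component.
    # Component names and ratings are hypothetical placeholders.

    MATURITY = ["ad-hoc", "repeatable", "defined", "managed", "optimizing"]
    CAPABILITIES = ["goals", "commitment", "ability", "measurement", "verification"]

    def maturity_profile(ratings):
        """Map each capability to its maturity level (index into MATURITY)."""
        return {cap: MATURITY[ratings.get(cap, 0)] for cap in CAPABILITIES}

    # Hypothetical vendor, rated per component of the solution/services/support triage.
    vendor = {
        "solution": {"goals": 3, "commitment": 3, "ability": 2,
                     "measurement": 2, "verification": 1},
        "services": {"goals": 2, "commitment": 2, "ability": 2,
                     "measurement": 1, "verification": 1},
        "support":  {"goals": 4, "commitment": 3, "ability": 3,
                     "measurement": 2, "verification": 2},
    }

    for component, ratings in vendor.items():
        print(component, maturity_profile(ratings))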

Using a general-purpose contextual classification such as the one described here, different aspects of the offering can be made insightful; it does not try to single out a winning segment, but weights each criterion across all four segments. With that, the different angles from SWOT, VRIO and Porter’s five forces can be easily understood: SWOT aims to list internal and external competitive factors, while VRIO aims to quantify an offering’s competitive potential, and Porter’s model provides insight into the competitive intensity of an industry, a market segment or several adjacent ones. For example, positive rarity is a cohesive, qualitative, discriminating characteristic where the offering is not too far ahead of the market, while negative rarity would imply the offering is so unique within the whole (incoherent) market segment that it has difficulty demonstrating its value and return on investment. The latter can, for example, be addressed by solid support for open standards, through which the offering aligns with and gains a variety of contextual settings and use cases it would not have in its own right. The interpretation flips around in value when an offering has to deal with e.g. security, such as financial messaging networks, where uniqueness and rarity are positive attributes. Negative imitability would be an incoherent, uniting, quantitative differentiation where sufficiently many equivalent offerings exist on a market dominated by competition, while positive imitability could for example result in de-facto market leadership, counterfeit Rolex watches adding to the reputation of the official brand, or the benefits claimed by Open Source Software vendors.

Many widely used criteria are doubtful measures, such as company size, which may only be advantageous for a client within certain margins. If a company gets too big, there is the risk of lock-in; if it is too small, it may get acquired for its customer base in a particular industry. If an overall suite addresses most challenges, but is not a cohesive whole in a technical and commercial sense, is that a negative characteristic when a client wants to diversify their preferred supplier list? Is it a positive if a unified design-time and run-time is offered, when for development purposes five instances need to run simultaneously to offer a complete end-to-end view? Any such criteria need to be evaluated in themselves, in their place in the overall offering, in the context of the client’s current and future wants and needs, as well as in the context of the market dynamics, and they can be ‘weighted’ depending on the degree of separation, alignment and cohesion. Do these criteria maintain their unique value while heading in the direction of the market’s hotspots, yet at the same time avoid stranding in isolation?

Any kind of product, solution, technology or service offering, when it enters the market, shows an indeterminate yet predictable behavior to which it converges. Depending on the rate of technological maturation and market saturation, the interplay of push and pull forces results in a characteristic growth pattern. A trait of knowledge-based market segments is that they show an unusual behavior known as ‘increasing returns’. Due to the reduced dependency on physical limitations, knowledge and technologies can be distributed very quickly, e.g. via downloads or television news broadcasts, and due to network effects this creates what is known as ‘path dependence’, a self-sustaining, reinforcing feedback loop: in effect, a fashion trend.
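A minimal sketch of such growth patterns, purely illustrative: the parameters are hypothetical, the first function is the textbook logistic (S-curve) model, and the second adds a delayed-perception term as one simple way to mimic the ‘overshoot and oscillate’ behavior discussed below.

    # Illustrative simulation of technology-adoption growth patterns.
    # Parameters are hypothetical; this is the standard logistic model plus
    # a delayed-feedback variant that produces an 'overshoot and oscillate' shape.

    def logistic(steps=100, r=0.3, capacity=1.0, x0=0.01):
        """Plain S-curve: growth slows as the market saturates."""
        xs = [x0]
        for _ in range(steps):
            x = xs[-1]
            xs.append(x + r * x * (1 - x / capacity))
        return xs

    def delayed_logistic(steps=100, r=0.3, capacity=1.0, x0=0.01, delay=8):
        """Same model, but growth reacts to a stale view of saturation,
        so adoption overshoots the capacity and oscillates back."""
        xs = [x0]
        for t in range(steps):
            perceived = xs[max(0, t - delay)]   # out-of-date market perception
            xs.append(xs[-1] + r * xs[-1] * (1 - perceived / capacity))
        return xs

    print(max(logistic()))          # converges to ~1.0 (market saturation)
    print(max(delayed_logistic()))  # > 1.0: overshoot before settling back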

Whatever their degree of immanence, eventually the expectations concerning a new technology, solution, service or product will require a recalibration to get in tune with the changing market landscape again. This is where technologies demonstrate the highlighted ‘overshoot and oscillate’ pattern, or what often happens to early vendors of such a technology: ‘overshoot and collapse’. Many early vendors simply collapse because of their own success; growing too fast while experiencing a ‘combinatorial explosion’ in utilitarian possibilities, their organization cannot keep up with the growth spurt. On the other hand, inventions, innovations and technologies appear in clusters, and if a market segment or target audience is too dense the potential may be exhausted by competitors mutually crowding each other out; either individual instances form groups (mergers and acquisitions) or they perish.

However, these mechanisms are known, and even though they are not exactly deterministic, experienced entrepreneurs and organizations ought to be proficient enough to control the physical and informational factors and dampen any potentially harmful excess. Often that implies a complete overhaul of a company’s organization to move from a rapid-growth focus to a more sustaining modus operandi.

Companies that understand these dynamics, in particular the interplay of perception, expectations, maturity of the offering, fulfillment and obsolescence, can use the means described above to bypass the usual hurdles when introducing a new product. Apple’s introductions of the iPhone and iPad are excellent examples of a vendor entering an existing market with such brand recognition that it is able to grab a large piece of the client potential.

Such is the power of increasing returns. Gartner describes this with their hype cycle, which aims to represent technological maturity, adoption and widespread application. These models are a slightly anthropomorphized framework used to characterize the over-enthusiasm, or "hype", and subsequent disappointment that typically accompany the introduction of new technologies. If anything, as technologies continue to evolve towards utilizing more general-purpose, interchangeable means, the more closely they resemble an information technology with all the imaginable benefits software brings. The tendency to ‘oscillate’ is not merely a psychological effect; it is a natural systemic mechanism when a new technology is introduced and tries to ‘settle in’ within a wider population of related technologies.

3.3 Comparative Criteria Now that we have a coherent framework for gauging an offering, a context needs to be formulated for the criteria. When looking at the SOA market, and considering the importance of the mediation function within a service-oriented application landscape, a differentiation relative to the market dynamics is worthwhile. As described earlier, this involves three main trend movements. First there is the move from point solutions to SOA as a design pattern to deal with increased complexity. Second, the move from process automation of repeatable activities towards collaborative suites supporting activities which cannot be further simplified without losing value. And third, the increasingly blurring boundaries between adjacent technological domains due to continuous improvement of core technologies. Within these three trends it is useful to differentiate by degree of alignment: whether a vendor’s offering is outdated or possibly obsolete, whether it is bleeding-edge innovation, or whether the offering is gravitating around the mean. Within the latter category an additional subdivision is applied, distinguishing between challenging contenders, fit-for-purpose strong performers and leading-edge state-of-the-art (a minimal classification sketch follows after the findings below). Obviously there is an increased risk involved with bleeding-edge solutions, whether such an offering is at the forefront of functional application or at the forefront of the architectural paradigm. Likewise, a solution which will be obsolete within the foreseeable future may carry unaffordable risks as well. However, these should be seen in combination with the other trend movements, as well as the degree of alignment and coherence for the industries, and of course the company itself. A solution may be near-obsolete as far as the continuously evolving landscape of commercial offerings goes, yet fit-for-purpose for the parts of a client’s infrastructure which are planned to be updated before the vendor offering invalidates, especially in combination with other service offerings, when the offering itself is encapsulated in a service-oriented manner so that it is interchangeable to a high degree. In that context it befits to highlight some questions and findings from the individual vendor evaluations:

The foreseen convergence of ‘front-office’ and ‘back-office’: B2B, BPM and BAM can be seen as emphasizing the business application of technological adoption, and have long been initiated because of the value they offer in improving business conduct. Systems Management has gradually grown to deal with increased forms of complexity due to the number of applications and platforms, their interdependencies, and the resulting issues for maintenance schedules, upgrades, bug fixes and problems. This has grown into an advanced toolset which addresses the technical side of business applications with increasing precision. IT projects will increasingly be embedded in a context of closed-loop change management in order to enable agility, increased involvement, rapid corrections, and clarity of communication and understanding. With this ongoing change the question arises whether, if it were not for the transformation and connectivity requirements, a mature integration solution would not be better replaced by a governance solution such as Service Request Management or SOA Lifecycle Management. In an infrastructure which is becoming increasingly service-oriented, the probable outcome is that the remaining integration features will form a plug-in for a more encompassing suite, to be injected into the application where needed.

Several vendors offer a solution suite which combines much of Application Lifecycle Management with many of the application integration styles. Large vendors offer such complete approaches, whereas smaller vendors offer a valuable and flexible subset which can easily be embedded in such solution suites. Some vendors even offer solutions for integrated SOA lifecycle management with hundreds of templates and integrated best practices; these may not be integration tools in their own right, but with the effort reductions offered by continuous build, automated testing and quality assurance during the development phase, the question arises whether such a tool is still needed. Other vendors provide an integrated and comprehensive environment for the development, deployment and management of service-oriented applications in which many different integration styles are supported, such as change data capture, data consistency, micro-flows, multistep processes and composite applications. By offering such a holistic approach, these vendors can offer exciting additional features such as real-time analytics, proactive “sensing and responding” to changes, and automated process discovery along with highlighting possible inefficiencies (e.g. unneeded data moving back and forth). Most of these larger vendors are moving in such a direction via increased proficiency in dealing with IT operations.
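As a small illustration of one of these integration styles, the sketch below shows polling-based change data capture against a version column. The table and column names are hypothetical, and real products typically hook into transaction logs rather than polling, but the principle is comparable.

    # Sketch of polling-based change data capture against a version column.
    # Table/column names are hypothetical placeholders.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, version INTEGER)")
    conn.execute("INSERT INTO orders VALUES (1, 'NEW', 1), (2, 'NEW', 1)")

    last_seen = 0

    def poll_changes():
        """Emit rows changed since the last poll, then advance the watermark."""
        global last_seen
        rows = conn.execute(
            "SELECT id, status, version FROM orders WHERE version > ?",
            (last_seen,)).fetchall()
        if rows:
            last_seen = max(r[2] for r in rows)
        return rows

    print(poll_changes())                       # initial snapshot: both rows
    conn.execute("UPDATE orders SET status='SHIPPED', version=2 WHERE id=1")
    print(poll_changes())                       # only the changed row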

How to valuate vendors which demonstrate a high degree of alignment but a low degree of coherence? Many vendors are implementing SCA features by incorporating OSGi within their entire product suite, which allows for flexible interoperability between the different components, but their solution suite itself is spread over twenty-some different products. This raises questions about roadmap synchronization, the possibly low maturity of certain service assemblies, as well as license and maintenance costs. These matters need to be communicated well on all levels of the organization, or the result might very well be too overwhelmingly complex for a midsized IT department. When a vendor offers a solution suite with best-of-breed individual “fit for purpose” products, but a client’s need covers different functional areas, some with lightweight requirements and some more serious, then some demonstrable proficiency of these different functional areas to cooperate is required. Does their increased alignment result in coherence of their offering and company as well?

How to valuate vendors which have a high degree of coherence but a low degree of alignment? Their solution suite may be great at addressing the variety of integration challenges, but may be lagging behind the industry trends for accepted and emerging interoperability standards. This makes it challenging to move certain functionality to a more appropriate suite if such a need arises.

Some commercial open source vendors claim either de-facto thought leadership in their area or a gradual consolidation of close partnerships with in-house built solutions or acquisitions. Transparency of accountability and ownership continues to play an important role for more complex solutions, especially when dealing with distributed transactions. When a purchase order is lost, or a payment is performed twice, most clients abhor juridical group therapy to figure out what went wrong and who is responsible in the whole end-to-end chain; they want a single point of contact and OEM-style arrangements to make sure it does not happen again.

When a vendor provides the appropriate building blocks for addressing various styles of integration, and with a little extra effort dramatic improvements in development, testing and management can be achieved, is it worth weighing this against a more bundled offering?
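As referenced in the introduction of this section, a minimal sketch of the alignment classification; the thresholds, trend names and scores are hypothetical placeholders.

    # Minimal sketch of the alignment classification described above.
    # Thresholds and vendor scores are hypothetical placeholders.

    def classify(alignment):
        """Map an alignment score (0 = obsolete .. 1 = bleeding edge) to a category."""
        if alignment < 0.2:
            return "outdated / possibly obsolete"
        if alignment > 0.8:
            return "bleeding-edge innovation"
        # Gravitating around the mean: subdivide further.
        if alignment < 0.45:
            return "challenging contender"
        if alignment < 0.65:
            return "fit-for-purpose strong performer"
        return "leading-edge state-of-the-art"

    # Hypothetical per-trend alignment scores for one vendor offering:
    # (1) point solutions -> SOA, (2) automation -> collaboration, (3) domain blurring.
    trend_scores = {"soa_design": 0.7, "collaboration": 0.5, "domain_blurring": 0.3}
    overall = sum(trend_scores.values()) / len(trend_scores)
    print(overall, classify(overall))  # 0.5 -> 'fit-for-purpose strong performer'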

3.4 In Praise of Folly (Lof der Zotheid) Although this report aims to throw people in at the deep end, it also provides ample tools to draw one’s own conclusions. IT environments are becoming increasingly complex, and there is no stopping this trend. Business and IT are intertwined to an unforeseen level, and this means that actions triggered by cost reduction increasingly come with additional possibilities for value creation. Many decades of interfacing between applications, data stores and humans have resulted in a strong fabric of industry-wide standards and best practices. Current technological infrastructures allow for an added layer of indirection without paying a price in terms of performance, and this allows previously tightly integrated features to be used by simply referring to an address and communicating by means of some generally accepted ‘language’. While complexity increases, IT environments are also growing towards newer models which aim to address and simplify overall complexity by providing newer tools better suited to deal with it. Instead of instructing a computer to place some data at a particular location on a particular storage device, the data is now sent to a storage service with a request to adhere to a particular contract concerning service levels, and it is the service itself which tries to put it somewhere optimal. Ongoing cycles of such abstraction towards maximal interchangeability of replaceable and standard parts are now leading to a visibly similar trend for IT infrastructures themselves. This in turn leads to a situation where a company’s organization is best approached as a hierarchy of cooperative, collaborative, coopetitive and competitive arrangements. It is not getting simpler on the whole, but the complexity continues to bubble up to other layers of the company, or industry, greatly simplifying the way certain technical ‘parts’ are being used. Likewise, future business can be composed of the services offered by other companies, from outsourcing some IT functions, purchasing, accounting, manufacturing, or even sales and marketing. Enterprises can be constructed custom-fit around the actual product or service being offered, scale up to meet growing demands, or be disassembled quickly if the offer fails to meet any demand. Larger companies are expected to transform into marketplaces, hosting their own departments while providing general services such as brand recognition or others where the whole is greater than the sum of its parts.
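To make the storage example above tangible, a minimal sketch under stated assumptions: the class and field names are hypothetical, and real offerings expose this idea through their own service APIs.

    # Sketch of location-transparent storage behind a service-level contract.
    # Class and field names are hypothetical; the point is that the client
    # states *what* it needs, and the service decides *where* the data lives.
    from dataclasses import dataclass

    @dataclass
    class StorageContract:
        durability: str = "3-replicas"   # requested guarantee, not a location
        max_latency_ms: int = 50
        region: str = "any"

    class StorageService:
        def __init__(self):
            self._tiers = {"fast-ssd": {}, "bulk-archive": {}}

        def put(self, key, data, contract: StorageContract):
            # The service, not the caller, picks a placement that
            # satisfies the requested contract.
            tier = "fast-ssd" if contract.max_latency_ms < 100 else "bulk-archive"
            self._tiers[tier][key] = data
            return tier

    svc = StorageService()
    print(svc.put("order-42", b"...", StorageContract()))  # -> 'fast-ssd'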