Operational Business Intelligence for Agile Enterprises

By Kishore Jethanandani

Enterprises need real-time operational data to manage performance.

VENDOR STRATEGIES AND PRODUCT OFFERINGS

Agile enterprises need business intelligence to sharpen their sensory perception: the ability to gauge reality, to view their resource flows, and to stay on top of events. Business intelligence vendors are increasingly conscious that customers hate to be hobbled by sunk costs in information technology. Instead, customers want to rejig their existing technologies to adapt to fluid situations. Information needs to flow unimpeded by clunky technologies; in the past, technology has too often become a millstone rather than a lubricant of change. Customers are increasingly looking for technologies that read the pulse of their business activity and funnel information to the employees who can communicate, collaborate, and act in time to respond to events.

The concerted effort customers are making to lower latencies in data collection and decision making is best illustrated by investment banks, which are speeding up the processes that refresh the data driving their portfolio-management decisions. They are looking to receive information directly from stock exchanges, options exchanges and ECNs so that they can weigh the impact of events on any of the securities they hold in their portfolios. Automated trading tools enable traders to complete the calculus of risk and return when they buy or sell securities and need to evaluate the impact of major changes such as prices and interest rates. Their information architecture has to be constructed so that it can tap market data from a variety of servers, each with its own data format, convert the feeds into a single format, and move the data, aided by middleware, into the enterprise data infrastructure.
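The normalization step can be sketched simply. The example below is a minimal, hypothetical illustration in Python of how feed handlers might map ticks arriving in different vendor formats onto a single internal record before handing them to middleware; the field names and formats are assumptions, not those of any particular exchange or product.

from dataclasses import dataclass

@dataclass
class Tick:
    """Single internal format for all market-data sources."""
    symbol: str
    price: float
    volume: int
    venue: str

def from_exchange_a(raw: dict) -> Tick:
    # Hypothetical dictionary format: {"sym": "IBM", "px": "82.15", "qty": "300"}
    return Tick(raw["sym"], float(raw["px"]), int(raw["qty"]), "EXCHANGE_A")

def from_ecn_b(raw: str) -> Tick:
    # Hypothetical pipe-delimited format: "IBM|82.17|500"
    sym, px, qty = raw.split("|")
    return Tick(sym, float(px), int(qty), "ECN_B")

feed = [from_exchange_a({"sym": "IBM", "px": "82.15", "qty": "300"}),
        from_ecn_b("IBM|82.17|500")]
for tick in feed:
    print(tick)   # downstream middleware would publish these to the data infrastructure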

In the past, information systems were tenuously linked to the levers that companies could use to

act as situations changed. Increasingly, enterprises are looking to integrate decision making

processes and business processes so that the lag time between the receipt of information and

the response is minimized. They want to build in the rules for predictable responses to known

problems so that human resources can be reallocated to attend to more knotty problems. Where

human intervention is required, companies want to be able to quickly visualize a situation and

size up a problem before they act.

Above all, the best decisions happen when all the related information is brought to bear on a course of action. In addition, decision-makers want to simulate alternative scenarios and visualize them before committing to a decision.

The business intelligence industry is still evolving and the jury is still out on who will eventually win. Clearly, established players with experience in implementing large enterprise software deals have the best chance of integrating the several technologies required to gather information, analyze data and communicate decisions. Business intelligence projects also require the consulting services that are essential for successful implementations, especially of the operational applications.

INTELLIGENCE ON TAP

Business managers, in operational situations, get the "missed the bus" feeling when they are unable to take decisions at the right time; their calculations can go haywire as the ground shifts underneath them. Well-timed moves help them grab opportunities and get ahead of competitors. Companies also need to take decisions before a problem snowballs into a crisis.

Delays in decision making can cause grievous losses to businesses. In the pharmaceutical industry, for example, counterfeiting, medical errors and poor product quality can undermine confidence in companies and medical groups. The actual manufacturers can be blamed when counterfeits, or an odd batch of poorly manufactured products, harm patients. The healthcare industry is now well equipped with laser vision, RFID and other technologies that eliminate errors in data entry, and with data-gathering technologies that aggregate data so that the cause of any damage can be traced back to specific deliveries and preemptive action taken.

Edge Dynamics Inc. is one of the companies offering applications for real-time decision making in the pharmaceutical industry. Typically, a complex set of contracts, deals and regulatory policies binds the stakeholders in a supply chain consisting of manufacturers, wholesalers and retailers. Edge Dynamics' software captures transaction order-stream data originating from EDI or other sources such as a Web-services B2B network. The data is analyzed for discrepancies from forecasted numbers or deviations from agreements made at the outset. As data is received and analyzed, the partners in the supply chain discover flaws in its design and work towards managing inventories better; they find better ways to reevaluate their partners and optimize their logistical planning.

The technology that goes into gaining visibility into business operations is illustrated by the implementation of Siebel Analytics at Jostens, which sells class rings, graduation announcements, and yearbooks to schools across the country. Siebel Analytics aggregates data from an Oracle data warehouse, a Microsoft SQL Web/e-commerce application, and Microsoft Access for use by Jostens' sales staff, who can view the results on role-based, interactive dashboards. The software enables Jostens to track sales performance data for each segment of the business. With real-time data feeds, Jostens' sales staff can spot opportunities for cross-selling, up-selling and the like.

Retail stores present a familiar scenario: markdowns happen almost every day as inventories pile up unexpectedly. All too often, retail store managements are taken by surprise as preferences change, the media influences attitudes, competitors announce new promotions, seasons change or events affect consumer purchases. At the local level, consumer behavior can be quirky and the inventory in stock may not excite shoppers. Retail stores have to learn to stock, for each of their stores spread around the country, an assortment of products that is in tune with customers' tastes, ensure that the assortment will be profitable, and manage the supply chain so that the products will be available in time for the season.

When supply closely matches demand, companies can not only pass on the benefits of lower losses from stock-outs to consumers in the form of lower prices, but also offer products that closely match their needs. Zara, a Spanish clothing company, takes less time than its competitors to respond to market needs. Managers at its stores send information about customer preferences from handheld devices, and it is aggregated rapidly so that the most relevant products are displayed in the stores. Dyeing and printing are done only after the customer information is available.

The management of demand and supply has gotten more difficult as the product life cycles get

shorter and the supply chains get longer as goods are sourced from more distant places.

Increasingly, companies are looking at software that can aggregate point-of-sale data from

multiple sources, analyze it to predict demand for individual categories of products, optimize the

supply chain and help in pricing.

Several different pieces of software are used in the management of demand and supply. One of them is revenue optimization software, which takes into account information on demand and costs and determines the best price to offer based on the elasticity of demand. Conversely, it can take the prices offered by competitors as given and produce the numbers for the desired demand and supply. Such software can also pinpoint the customer segments most likely to respond to a particular offer. Manugistics Inc. is one company that leads in this segment of the market. However, revenue optimization software only takes existing demand into account before it cranks out figures on prices and potential segments to target.
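As a rough illustration of the elasticity calculation such software performs, the sketch below is a simplified, hypothetical model, not Manugistics' actual algorithm: it assumes a constant-elasticity demand curve and searches a range of candidate prices for the one that maximizes margin.

# Minimal sketch of price optimization under a constant-elasticity demand model.
# Demand(p) = base_demand * (p / base_price) ** elasticity, with elasticity < 0.
def optimize_price(base_price, base_demand, unit_cost, elasticity, candidates=None):
    if candidates is None:
        # Scan prices from 50% to 200% of the current price in 1% steps.
        candidates = [base_price * (0.5 + 0.01 * i) for i in range(151)]
    best_price, best_margin = None, float("-inf")
    for p in candidates:
        demand = base_demand * (p / base_price) ** elasticity
        margin = (p - unit_cost) * demand
        if margin > best_margin:
            best_price, best_margin = p, margin
    return best_price, best_margin

# Example: current price $20, 1,000 units sold, $8 unit cost, elasticity of -1.8.
price, margin = optimize_price(20.0, 1000, 8.0, -1.8)
print(f"suggested price: ${price:.2f}, expected margin: ${margin:,.0f}")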

Suppliers would rather forecast demand accurately and produce just as much as is needed, so that they can command better prices. This is best achieved with demand forecasting software. Successful implementation of demand forecasting tools presupposes the collection of point-of-sale data and the willingness of retailers to share that information with their vendors. Besides supply chain management software providers such as i2 and Manugistics, business intelligence vendors such as NCR Teradata, Business Objects, Cognos, and Prescient are players in this segment.

An additional piece of the real-time collaboration puzzle is supply chain management software used to collaborate with vendors, manage logistics and share information. Oracle's 11i E-Business Suite, for example, includes iSupplier and Collaborative Planning portals that communicate with offshore contract manufacturers and suppliers through web-based tools.

MACHINE LEARNING FOR INSIGHTS

The growing size of data sets has changed the analytical paradigm. Well-known statistical techniques are overwhelmed by the colossal volumes of data. Typically, statistical techniques begin with a hypothesis and a model, based on domain knowledge such as psychology, which they then seek to validate. The involvement of human beings in this slow, uncertain process precludes the use of such insights in real time.

When data sets are large and chaotic, it is much harder to decide on a methodology for verification. The number of dimensions overwhelms human cognition's ability to see the connections, let alone to do so quickly enough to make decisions. Increasingly, machine learning methods are required to reduce raw data into patterns before humans can look for the story that is relevant for decision-making purposes. These automated methods look for correlations in data over periods of time (time series), find clusters in activities such as crime, or classify data, as decision trees do. Market basket analysis, for example, looks for combinations of products that customers tend to buy.
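A minimal sketch of the idea behind market basket analysis follows: it counts how often pairs of products appear together in the same transaction and reports the pairs whose co-occurrence crosses a threshold. This is an illustrative toy, not a full association-rule miner such as Apriori, and the sample baskets are invented.

from itertools import combinations
from collections import Counter

# Toy transaction data: each basket is the set of products in one purchase.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"beer", "diapers", "bread"},
    {"beer", "diapers"},
    {"bread", "butter", "jam"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

min_support = 2  # report pairs bought together in at least 2 transactions
for pair, count in pair_counts.most_common():
    if count >= min_support:
        print(f"{pair[0]} + {pair[1]}: bought together {count} times "
              f"({count / len(baskets):.0%} of baskets)")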

One kind of situation in which machine learning has compelling value is searching the web. Intelligence agencies have to look for terrorist activity, competitive intelligence analysts look for information on rivals, and content creators have to look for violations of their intellectual property rights. Companies such as FAST have created tools that can extract insights from such a labyrinth. Reuters, for example, uses FAST's search tool to zero in on content that looks suspiciously like its own.

Machine learning plays an important role in functions such as fraud detection, stock trading, and customer segmentation, where intelligence cannot wait for an analyst to extract it. Neural network software systems, for example, have reduced fraud in UK banks by as much as 30%.

The vendors in the space include SAS, STATISTICA Data Miner, S-Plus, Fair Isaac, SPSS Clementine, IBM Intelligent Miner, Affinium Model, Insightful Miner, KXEN and Genelytics.

Tools that understand fuzzy concepts

Companies are best able to extract insights when they can search across all their data and classify and correlate it. For decision support, companies have to be able to conduct searches on both structured and unstructured information.

Searching unstructured information has become more urgent because it has a wide range of applications, especially in law enforcement, customer service, drug discovery and knowledge management. Companies are beginning to discover the enormous benefits of mining text and other unstructured information. The pharmaceutical industry, for example, is discovering that it can reduce the time required to commercialize new drugs if it can search and analyze the information pouring in from clinical trials of all drugs. When safety data is available for all clinical trials, regulatory bodies can look for patterns that help them reach decisions about approving drugs for human use faster than is the case now. XML is the bedrock for linking related databases and searching them with text-mining tools.

Customers need a common language and search tools that can parse all the relevant data and analyze it. Natural language is best able to express the nuances of human thought; inevitably, it carries a variety of meanings, synonyms, connotations and usages. New tools are required to see words in their context before any meaning can be drawn from them, and this is best achieved when search engines have semantic capabilities.

The traditional and most widely used method of searching databases, the Structured Query Language (SQL), is inadequate for heterogeneous environments where data descriptions vary between databases. This form of querying is relevant only for structured data, and it presumes knowledge of the specific information a person is searching for, whereas in most cases people know only the theme they are interested in exploring. In heterogeneous environments, searching with SQL becomes impractical, since the number of data series, as well as their heterogeneity, overwhelms its ability to extract meaningful information and knowledge.

With the advent of XML technologies, it is now possible to classify unstructured information as well. Individual elements of unstructured information can be described by tags, the metadata that describes the information content within. This detailed description of the content makes it possible to search repositories holding large volumes of content, much as SQL queries extract information from relational databases. XQuery can search both content repositories and databases and extract related quantitative and qualitative information. Microsoft's SQL Server is one product that supports XQuery and can use both structured and unstructured data for analytical purposes.

Another approach to searching unstructured data is to use search engines. However, a search conducted on unstructured data all too often yields a jumble of results, an experience all too familiar to users of the World Wide Web. Similar searches on corporate intranets fare worse, since the information is not even linked as it is on the Web.

Search technologies for corporate databases seek out the significance in a mass of words. For example, someone looking for information on a crime committed by a suspect named John Lear of San Francisco will find meaningful results only when inter-related information about the person's background, the time, the location and previous associations with the victim is presented together. Databases of unstructured information can treat variables such as time, location and biographical data as the dimensions of a data warehouse and store related facts associated with each of them. A search conducted on such databases is more likely to find related results instead of a jumble.

At the center of semantic search technologies is an ontology, the knowledge base that defines the "beings" or personae pivotal to understanding the universe under consideration. For example, students and professors form the axis of the universe of an educational institution. Semantic search tools create a taxonomy to describe the entities in a universe and their relationships with the world around them. Information about the university is then classified by these entities, which helps to create the links between the available records.
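A toy sketch of this idea follows: a small, hypothetical ontology for the university example, with entities, relationships, and a search that follows those links to return records related to a query term. It is meant only to illustrate how entity classification creates links between records, not to represent any vendor's semantic engine.

# Toy ontology: entity -> (entity type, relationships to other entities).
ontology = {
    "Prof. Rivera": {"type": "professor", "teaches": ["CS101"], "advises": ["Ana Cruz"]},
    "Ana Cruz":     {"type": "student",   "enrolled_in": ["CS101"]},
    "CS101":        {"type": "course",    "department": ["Computer Science"]},
}

# Records are classified by the entities they mention.
records = [
    {"id": 1, "text": "Grant awarded to Prof. Rivera", "entities": ["Prof. Rivera"]},
    {"id": 2, "text": "CS101 syllabus published",      "entities": ["CS101"]},
    {"id": 3, "text": "Ana Cruz wins hackathon",       "entities": ["Ana Cruz"]},
]

def related_entities(entity):
    """The entity itself plus everything it is linked to in the ontology."""
    linked = {entity}
    for key, values in ontology.get(entity, {}).items():
        if key != "type":
            linked.update(values)
    return linked

def semantic_search(entity):
    wanted = related_entities(entity)
    return [r for r in records if wanted & set(r["entities"])]

for r in semantic_search("Prof. Rivera"):
    print(r["id"], r["text"])   # finds records about the professor, the course and the advisee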

Search engines for unstructured data can now find information in an organized way by using tokenization, linking and taxonomies. In essence, these methods look for patterns in the unstructured data. The tools are designed to look for associated text: a word like "crime" is related to gang membership, a person's academic performance, incidents of drug or alcohol abuse, and so on, and the information is presented in its relevant context.

The impact of correlating structured and unstructured data is easily visualized in the decision analysis required for store location. Typically, retail companies need structured data such as the demographics of the neighborhood. They also need map information in the form of satellite imagery. In addition, they would like unstructured information, such as crime reports and lifestyle trends in the region, to gauge the attractiveness of a location.

One example of the use of intelligent search engines is the ISYS search engine from Odyssey Development. Ventura County in California uses it to search through its numerous repositories to find related information. It could, for example, take blood examination data from a structured database and find related information, held in several other repositories, on burglaries committed by the same individual.

One of the several semantic search tools on the market has been created by Semagix for searching media sources. Its ontology is a hierarchy of categories beginning with general classifications like News, Business and Entertainment and descending to more specific terms like cricket and soccer. Searches can be done by theme, such as cricket tournaments, which excludes tangential information, such as tournaments in all sports, from appearing.

IBM is one company bringing a great deal of intellectual property to the table for searching unstructured data. With WebSphere Information Integrator OmniFind Edition, IBM has pushed the envelope by launching its Unstructured Information Management Architecture (UIMA), a platform for integrating structured data and unstructured information. The platform supports a variety of functions, such as linking analytics software with enterprise applications and giving developers tools to conveniently create new or reusable text-analytics components. With this architecture, unstructured data in a host of formats and languages can be searched, whether it is located in databases, e-mail files, audio recordings, pictures or video images.

The searches are unlike familiar keyword searches; they use concepts to look for related pieces of information. Text-analytic components supported by UIMA can use WebSphere Information Integrator OmniFind Edition to define the ontology, look for relationships in data, mine text to find hidden knowledge and extract useful business information. An example of how these kinds of search engines can look for inter-related information is customer satisfaction: it becomes possible to search data in maintenance records, market research studies, call center records and warranty claims to find the products customers find most, or least, satisfactory.

Altogether, fifteen companies plan to use this architecture; they include Attensity, SPSS, Endeca, Factiva, Kana, ClearForest, Cognos, and SAS. Factiva and QL2 will provide data for analysis.

They get it with visuals

Decision-makers are constantly intimidated by information clutter and are looking for tools to help them digest information rapidly. There is a great deal of noise in large volumes of information, and the noteworthy nugget can easily elude decision makers. In industries such as securities, the value of information decays quickly unless its substance is absorbed quickly.

Visualization is an indispensable tool for the real-time assimilation of relationships in large volumes of data and their implications for decision making. One instance of this is American Water, which has to monitor the threat of hostile intrusions into its IT network. It receives thousands of alerts, the large majority of them false alarms. Visualization tools help it map a packet's source address against its destination to isolate suspicious activity.

Decision makers prefer interactive visualization tools that help them test their hypotheses visually. Excel-style static graphics have been the staple for visualization in enterprises. Decision makers need to examine alternative scenarios; they want visuals that are three-dimensional, pliable enough for impromptu reconfiguration in response to queries, and easy to flip so they can view a problem from a variety of angles. The visuals are made lifelike by the use of artifacts, colors and animation to convey the meaning of the information displayed. All of these attributes are meant to contribute to the effective communication of a message, and none of the visuals widely available with spreadsheets can achieve this.

Visual queries are one means of isolating relevant data from the clutter and portraying it visually. Much like a structured query language statement, a visual query extracts specific pieces of information from a mass of relational data and displays them on a graph. An alternative way to zero in on selected information is by the choice of dimensions; an analyst might want to compare bad-debt losses by region, such as the Midwest and the West Coast, which is possible when a cube is created. Cognos Visualizer, which works in combination with Cognos PowerPlay to aggregate data from multiple sources, is one product that lets users choose their dimensions and the corresponding numbers they want to display graphically.
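The dimensional slicing described above can be approximated in a few lines. The sketch below uses pandas (an assumption; Cognos products expose this through their own interfaces) to build a small cube-like pivot of bad-debt losses by region and quarter from invented figures.

import pandas as pd

# Invented bad-debt write-offs; region and quarter act as the cube's dimensions.
losses = pd.DataFrame({
    "region":  ["Midwest", "Midwest", "West Coast", "West Coast", "Midwest", "West Coast"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q1"],
    "bad_debt": [120_000, 95_000, 210_000, 180_000, 40_000, 55_000],
})

# Pivot: rows = region, columns = quarter, values = total bad debt.
cube = losses.pivot_table(index="region", columns="quarter",
                          values="bad_debt", aggfunc="sum", fill_value=0)
print(cube)
# A chart of this table is what a visual query would render directly.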

The vendors in this field fall into two groups: the business intelligence vendors and specialists with a focus on visualization. Among the leading BI vendors are Cognos, Business Objects and SAS. The specialists, on the other hand, are companies like Vizible Corporation and Visual Mining, and visualization platform providers such as Antarctica Systems' Visual Net and Spotfire's DecisionSite. The platform providers are the most versatile, as their products are designed to use data from any source and their analytical tools can be customized to produce the desired kind of visualization. Typically, the platform providers focus on industries that generate enormous quantities of data, such as pharmaceuticals or natural resources.

Spreadsheets are forever

One aspect of real-time access to data for decision making is giving users the option to continue using familiar tools. Spreadsheets have been the tool most widely used for the analytics behind decision making, so integrating spreadsheets with business intelligence software is critical to its widespread adoption in the enterprise.

Excel spreadsheets are ubiquitous in enterprises despite the fact that they inexorably fragment data sources. The flexibility of Excel allows users to create their own data marts and add formulas of their own choosing. On the other hand, this contributes to a fragmentation of data sources and perceptions that conflicts with the objective of gaining a consistent view of the enterprise. Excel spreadsheets are also not scalable and are not the tool of choice when large teams have to work together. In addition, they cannot manage business processes based on the analysis conducted and, above all, cannot pull together information from a diverse set of corporate databases.

Business intelligence vendors have, in the past, provided partial integration with Excel by adding features for exporting data to Excel sheets that can be loaded onto a server. However, Excel users routinely add formulas of their own to create additional data series, and these were not included in the integration. One of the exceptions was SRC Software, recently acquired by Business Objects, which had closely integrated a variety of databases with an Excel interface.

Lately, business intelligence vendors have changed course and now offer Excel as the interface to their business performance or business intelligence software. The software provides a consistent view of the data and uploads the formulas, together with information on changes made by any worker in the enterprise, so that everyone has access to the same information.

Beyond a patchwork of integration

The issue of integration has gained urgency as companies seek to manage their sprawling supply

chains, collaborate in product development, offer self-service options to customers and outsource

business processes. A common denominator in these applications is that they span several

systems and applications; enterprises need conduits for information to flow across all of them.

In the past, middleware was the accepted way to integrate applications; it was a step forward from the arduous custom coding that had been the norm. While custom coding can suffice for joining two applications, middleware simplifies matters when an application has to be connected to several others. The middleware is the intermediate junction that allows information to flow in several different directions. Along the way, the middleware prepares the data, reformatting and merging it so that it can be accepted by another database or application.

Data transfers in a message-oriented architecture are akin to e-mail, which remains in a server queue until it can move through the network and is eventually downloaded onto a desktop when desired.

Middleware created a patchwork of joins that grew in number and complexity over time and became increasingly difficult to manage. At that point, a need was felt for a single platform that would manage the whole gamut of middleware and associated software, such as business process management. Enterprise Application Integration (EAI) suites meet these needs. An EAI network has subscribing applications, each of which replicates data received by any one of them into the others.

Integration of applications is not limited to specialized EAI vendors such as Tibco, WebMethods and SeeBeyond. These vendors have strengths in integrating several applications into a continuous process but are less able to combine a broad range of complex business processes. Another group of vendors comprises the large enterprise software companies offering platforms, such as IBM, Microsoft and BEA. Finally, the application vendors are also offering integration frameworks such as SAP's NetWeaver and Siebel's Universal Application Network (UAN).

In the early stages of integration, message-oriented EAI software was the most widely used means of transporting data from one end to another. In a typical message-oriented integration technology, data is transferred asynchronously so that running applications do not have to interrupt their functions to receive new data feeds. Intermediate middleware receives the data feeds and transmits them to another application without the need to alter either application in any way. The EAI network lets data navigate across a variety of networks, programming languages and applications. When only two applications are integrated, point-to-point integration will suffice; a publish/subscribe methodology works better when data has to be transferred to multiple applications.
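The difference between point-to-point and publish/subscribe delivery can be seen in a few lines of code. Below is a minimal, in-process Python sketch of a publish/subscribe broker; real deployments would use messaging middleware such as WebSphere MQ rather than an in-memory dictionary, and the topic names and payloads are invented.

from collections import defaultdict

class Broker:
    """Toy in-memory publish/subscribe hub standing in for messaging middleware."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscribing application receives its own copy of the message.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
broker.subscribe("orders", lambda m: print("ERP received:", m))
broker.subscribe("orders", lambda m: print("Data warehouse received:", m))
broker.subscribe("orders", lambda m: print("BAM dashboard received:", m))

# One publish fans out to all three applications; point-to-point would need three sends.
broker.publish("orders", {"order_id": 1042, "sku": "AB-7", "qty": 12})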

One case study of an EAI implementation is Cincinnati-based HealthBridge, which acts as the junction for flows of information among hospitals and providers in the Cincinnati metropolitan area. The company navigates the data streams from 28 hospital clinical applications and delivers more than 940,000 clinical results per month to 2,900 physicians. The HealthBridge network uses clinical messaging software to transmit data from an application in a hospital to another in a physician's office.

EAI is a departure from the piecemeal integration of some of an enterprise's applications by tying them together with middleware. While integration by means of middleware is much less expensive than an EAI implementation, the benefits are also far more limited. By integrating the entire enterprise, an EAI suite enables companies to view their business processes and applications in their entirety. Process parameters are tracked so that companies can monitor their performance, and they can optimize their business processes by modeling them, simulating the impact of alternative designs that could lower costs, and redesigning the processes accordingly.

IBM's MQ series is the market leader in message-oriented integration technologies, with an estimated 65% market share. IBM's leading position is accounted for by its open architecture, which means that applications on a diverse range of platforms can be integrated without any special programming. Microsoft, the other major player in the market, does not support platforms other than its own without custom code. IBM's WebSphere MQ connects applications and passes messages between them; it has a library of connectors to Oracle, SAP, and Siebel Systems applications, as well as mainframe systems such as CICS and IMS. BEA Systems offers WebLogic Integration, which uses XML-based application adaptors to interlink applications.

The other method of integration is to execute it at the level of data. This method relies on database technologies like gateways, metadata, queries, data set transfers, and bulk data-loading tools such as ETL or data grids. The most commonly used are ETL tools, traditionally used to feed data into data warehouses; products in this category include Data Junction's Integration Studio and Engine and Informatica's PowerCenter. These tools extract data from operational data stores, transform the data and load it into data warehouses.
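A stripped-down sketch of that extract-transform-load cycle is shown below, purely for illustration: it pulls rows from a hypothetical operational SQLite store, standardizes them, and loads them into a warehouse table. Product ETL tools add connectivity, scheduling, lineage and error handling on top of this basic pattern.

import sqlite3

# Extract: read raw orders from a hypothetical operational store (in-memory for the demo).
ops = sqlite3.connect(":memory:")
ops.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount TEXT, region TEXT)")
ops.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, " acme corp ", "1200.50", "midwest"),
    (2, "Globex",      "980.00",  "West Coast"),
])
rows = ops.execute("SELECT id, customer, amount, region FROM orders").fetchall()

# Transform: trim names, cast amounts to numbers, standardize region labels.
def transform(row):
    oid, customer, amount, region = row
    return oid, customer.strip().title(), float(amount), region.strip().title()

clean = [transform(r) for r in rows]

# Load: write the conformed rows into the warehouse fact table.
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE fact_orders (id INTEGER, customer TEXT, amount REAL, region TEXT)")
dw.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", clean)
print(dw.execute("SELECT * FROM fact_orders").fetchall())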

Increasingly, the former ETL specialists have morphed into data integration specialists. Informatica, for example, has incorporated PowerAnalyzer into its data integration product PowerCenter Advanced Edition and is working closely with Composite Software on EII-style integration. IBM extended its data integration capability through its acquisition of Ascential and has incorporated its ETL engine, renamed DataStage TX, which also includes EAI capabilities that came with the acquisition of Mercator Software Inc. IBM WebSphere Information Integrator plays the data integration role.

Integration of data sources is crucial not only for consolidating data but also for checking its quality. This is illustrated by The Scotts Company of Marysville, Ohio, the lawn and garden products company, which needed a way to forecast consumer demand with greater accuracy. In the past, it had to depend on its own shipment data instead of point-of-sale data, which was scattered and could only be accessed from the EDI systems routinely used by retail chains. The data integration major Ascential designed a solution that allows the company to access consolidated point-of-sale data in a form that its SAP applications can read. Comparing actual shipment, production and inventory data with demand helps determine trends in consumer demand and make the relevant adjustments.

Enterprise Information Integration (EII) is a set of tools that integrates a federation of data sources rather than applications. EII provides single-point access to all the data in the enterprise, whatever its format, with metadata to describe it. The better-known products in this space include BEA's Liquid Data and IBM's DB2 Information Integrator. Among the newer companies are Attunity, Avaki, Composite, and MetaMatrix. Composite has an alliance with Cognos to integrate business intelligence software, and its product, Composite Information Server 3.0, starts at $100,000. Avaki's first product, Avaki 6.0, was launched at $50,000, while the total cost of deployment averages $175,000 to $250,000.

These systems have two important components. First, they need a data model to aid the conversion of data from one source to another. Second, they provide graphical tools that show the configuration of the network of applications and data sources, along with a directory of terms mapping to the access methods and fields in the data sources.

Among the key players are start-ups such as MetaMatrix, which has partnerships with Business Objects and Hyperion, while Composite Software has alliances with Informatica and Cognos. Both of these vendors come from a relational database background and their products support SQL. Another category of vendors, such as Ipedo, provides XML-based query techniques. Composite Software enables the aggregation of data into a portal view, while any manipulation of this data has to be done manually by the user.

A much-desired integration method is to orchestrate business processes in order to enhance control over the levers that help a company respond quickly to changes in the business environment. The ability to manipulate business processes puts business executives in control of IT, letting them direct enterprise resources independently of the IT department. The need to automate business processes has been alluded to by vendors in the content management and workflow management spaces as well as in EAI. However, the management of business processes has so far been piecemeal and has not progressed to the level where all of them can be managed from a single platform in their own right, independent of individual applications.

A single platform for business process management provides the means to adapt business processes to changing requirements rather than leaving them set in stone inside an application. One of the key barriers to configuring a series of business processes is that they have always been embedded in applications. Once decoupled from applications, business processes can be broken into their components, reused for a variety of tasks and connected to complete the job at hand. The Business Process Modeling Language (BPML) provides the means to define the path along which business processes flow, while the Business Process Execution Language (BPEL) plays a complementary role by managing their execution. The conceptual bedrock of independent business process management is the pi calculus, which provides a method for unifying processes and recombining them for another purpose. Just as properties define an object, the pi calculus acts like metadata that spells out the tasks an individual unit of a business process can complete and the roles it expects related processes to play. In such a world, business processes are akin to packets in a network, which can be made to follow different routes depending on the addresses toward which they are directed.
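The decoupling idea can be illustrated with a small sketch: business-process steps defined as standalone, reusable units and composed into different flows without touching any application code. This is an invented Python analogy for the orchestration that BPML/BPEL engines perform, not an actual BPEL example.

# Each step is a reusable unit: it takes an order (a dict) and returns an updated order.
def check_credit(order):
    order["credit_ok"] = order["amount"] <= 10_000
    return order

def reserve_stock(order):
    order["reserved"] = order["credit_ok"]          # only reserve if credit cleared
    return order

def send_invoice(order):
    order["invoiced"] = order.get("reserved", False)
    return order

def run_process(order, steps):
    """A tiny orchestration engine: execute the steps in the configured sequence."""
    for step in steps:
        order = step(order)
    return order

# The same steps recombined into two different flows, without changing the steps themselves.
quote_to_cash = [check_credit, reserve_stock, send_invoice]
credit_check_only = [check_credit]

print(run_process({"order_id": 7, "amount": 4_200}, quote_to_cash))
print(run_process({"order_id": 8, "amount": 25_000}, credit_check_only))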

Intalio is one of the pioneers in designing platforms that manage business processes in their own right rather than as a component of an application. Other notable products in the same space are Microsoft BizTalk Server and Holosofx, which was acquired by IBM, renamed Business Integration Modeler and incorporated into its WebSphere platform.

Siebel's Universal Application Network (UAN) creates a process-centric environment for the integration of applications and data in an enterprise. At the heart of this strategy is a library of business processes, such as quote-to-cash, campaign-to-lead and order-to-pay, which can be used to compose solutions. An overarching service-oriented architecture enables other vendors to tie their applications into the overall solution. UAN incorporates vendor-neutral interfaces based on SOAP (Simple Object Access Protocol), WSDL (Web Services Description Language) and XML (Extensible Markup Language) to create an environment in which a diverse range of vendors can plug in their applications. UAN uses the syntax of the BPEL4WS (Business Process Execution Language for Web Services) standard to define all business processes, which means that individual vendors can hook in their own servers.

Lately, another variety of data integration software, for interlinking data sources across a grid, has appeared. This kind of software, such as the Avaki Data Grid, spans a wide area network and allows access to resources across enterprises without the cumbersome process of using multiple passwords to reach them. Data access is made possible by a universal directory that provides the path to each data source. With just this one access point, users can extract data from numerous sources and have it delivered to them.

Some of the key initiatives in the grid computing industry are HP's Adaptive Enterprise, IBM's On Demand, Oracle's 10g and Sun's N1. IBM's On Demand program uses its WebSphere and Tivoli products for policy-based management. HP has a similar product, the OpenView platform, for managing the grid, and integrates Talking Blocks Web services into it.

The key advantage of grids is the possibility of lowering latencies: a cluster of servers and storage devices does not clog when traffic spikes unexpectedly, as tends to happen when numerous applications must operate simultaneously. An array of storage and servers spreads the load over several devices, which are also better utilized because they are not dedicated to specific applications. Charles Schwab, for example, was able to lower query response times from four minutes to as little as fifteen seconds.

One of the earliest applications of grid computing is at Hewitt Associates, a global HR company, which has deployed an IBM WebSphere-based grid to run its pension-calculation software.

Corporate radars

The payoff from integrating an enterprise's information assets is the ability to monitor any sign of a threat or an opportunity so that the enterprise can act in time. Business Activity Monitoring (BAM) servers play the role of radar: they spot exceptional events, such as missed schedules in transportation, analyze the impact of those events on related activities, and trigger responses such as contacting a truck with spare capacity to take on the load that the assigned truck missed. Business Activity Monitoring involves a series of related activities: observing business processes, comparing them with metrics, analyzing and visualizing the implications, and communicating for corrective action.

One application of real-time activity monitoring is at Brocade Communications, which needed to keep track of the performance of its contract manufacturers to inform its outsourcing decisions. It acquired a business activity monitoring tool to do this in real time.

Among the leading players is Informatica, which has incorporated its Business Activity Platform, developed jointly with WebMethods, into PowerCenter RT, its integration platform. WebMethods' application integration capabilities and Informatica's information integration have been combined, though together they offer, at best, lightweight business process management. Ascential Software's DataStage, now part of IBM WebSphere DataStage, has a Real Time Integration (RTI) Services component.

There are also entrants from the business process domain, such as Microsoft's BizTalk Server 2004, which has added a Business Activity Monitoring engine and allows users of Microsoft Office 2003 to monitor business processes from the desktop. Middleware specialist TIBCO offers BusinessFactor, technology it acquired from Praja. Celequest is among the more prominent pure-play companies in the domain with its ActivityServer suite, which streams data from operational systems and compares it with business rules and metrics to determine whether an alert needs to be sent. Among the database companies, Teradata, with its active data warehousing product, is focused on business activity monitoring.
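The rule-versus-metric comparison at the heart of such monitoring can be sketched in a few lines. The example below is a hypothetical, simplified alerting loop over a stream of operational events, not Celequest's or any other vendor's actual engine; the thresholds and event fields are invented.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # returns True when the metric breaches the rule
    message: str

rules = [
    Rule("late_shipment", lambda e: e["type"] == "shipment" and e["delay_hours"] > 4,
         "Shipment running more than 4 hours late"),
    Rule("low_stock", lambda e: e["type"] == "inventory" and e["on_hand"] < e["reorder_point"],
         "Inventory below reorder point"),
]

def monitor(event_stream):
    """Compare each operational event against the business rules and emit alerts."""
    for event in event_stream:
        for rule in rules:
            if rule.condition(event):
                print(f"ALERT [{rule.name}]: {rule.message} -> {event}")

events = [
    {"type": "shipment", "order_id": 31, "delay_hours": 6},
    {"type": "inventory", "sku": "AB-7", "on_hand": 12, "reorder_point": 40},
    {"type": "shipment", "order_id": 32, "delay_hours": 1},
]
monitor(events)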

A nose for bad data

In its early stages of growth, the data quality industry was largely populated by independent players. In more recent years, the independents have been bought up by the large business intelligence companies. Prominent examples are the acquisition of DataFlux by SAS, and Ascential's purchase of Vality before Ascential was in turn bought by IBM. FirstLogic, an innovator in the space, was bought by Pitney Bowes. Group 1, a data quality vendor, took a different course and acquired an ETL vendor, Sagent, before Pitney Bowes purchased it as well. The process of extracting, transforming and loading is expensive, and companies hope to lower their costs by merging the associated routines of validation, transformation, filtering and standardization. These processes are an optimal way of improving data quality when companies want to do data mining in a data warehouse environment, but much more needs to be achieved in a real-time environment.

An example of the functionality available in products that offer both data consolidation and data quality services is IBM's WebSphere Data Integration Suite. Its ProfileStage component automates the matching of data structures and data formats from different databases so that data from the source and the target are consistent. Similarly, the QualityStage component standardizes data for individual entities, such as a customer, and ensures that disparate conventions for storing data do not introduce inconsistencies. The data quality components are provided alongside DataStage, an ETL tool.

When data is updated in a real-time environment, using technologies such as message queues, data quality functions need to be executed early on, in the transactional databases or by other means such as metadata. Automating data quality functions presupposes a comprehensive solution that provides universal definitions of data and a means of converting from the definition used by one application to that used by another, removing duplicates and correcting errors in addresses, names and the like. Master data management is a means of creating a database of such metadata together with the rules for converting from one format to another.
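The standardize-and-deduplicate step that master data management formalizes can be sketched as follows. This is an illustrative toy in Python with invented customer records, not any vendor's matching engine; real products add survivorship rules, fuzzy matching and address verification on top of this pattern.

import re

# Customer records arriving from two applications with different conventions.
records = [
    {"source": "CRM",     "name": "ACME Corp.",       "phone": "(415) 555-0100"},
    {"source": "Billing", "name": "Acme Corporation", "phone": "415-555-0100"},
    {"source": "CRM",     "name": "Globex Inc",       "phone": "212 555 0199"},
]

def standardize(rec):
    """Apply universal definitions: canonical name form and digits-only phone."""
    name = rec["name"].lower()
    name = re.sub(r"[^a-z0-9 ]", " ", name)                  # drop punctuation
    name = re.sub(r"\b(corp|corporation|inc)\b", "", name)    # drop legal suffixes
    name = " ".join(name.split())                             # collapse whitespace
    phone = re.sub(r"\D", "", rec["phone"])
    return {**rec, "canonical_name": name, "canonical_phone": phone}

# Deduplicate: records sharing the same canonical key are treated as one master entity.
master = {}
for rec in map(standardize, records):
    key = (rec["canonical_name"], rec["canonical_phone"])
    master.setdefault(key, []).append(rec["source"])

for (name, phone), sources in master.items():
    print(f"{name} / {phone}: one master record, seen in {sources}")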

A key problem with current methodologies is that data cleansing is done after the data has already been extracted, so the source of an error is never detected. When data reconciliation takes place with the help of master data management systems, the source of the error is identified as well.

It is now possible to buy rudimentary master data management products from a variety of vendors. Among the platform vendors, the leaders are IBM and HP; PeopleSoft and SAP are the players in the applications category; Ascential and Informatica lead among the integration vendors, alongside the system integrators; and Hyperion represents the BI category.

An example of the implementation of a master data management system is the case of Unilever

which needed a centralized way of managing its data to pursue a global policy for brand

management and supply chain management. Historically, Unilever followed a decentralized policy

for the management of its subsidiaries which meant that its IT system was fragmented. It has now

implemented a master data repository which helps to align its transactional systems for a

consistent view of its data.

Sensors everywhere

Automating data collection is one way to gather data free from errors. Sensors also help companies gain visibility into their environment and expand the universe of problems they can address. Data volumes grow with the use of sensors and present new challenges in data processing. Increasingly, RFID tags and other types of sensors are available for commercial application. For mass adoption, however, the cost of RFID would have to fall substantially before it is accepted in applications such as supply chain management.

General Electric has been one of the early pioneers in the use of sensors for analytical purposes. Its sensors gather data on the state of health of its jet engines; the data is collected in one place, where analytical software looks for signs of trouble and provides early warning to customers. The military has often been a leader in the adoption of new technologies, and its willingness to accept RFID is a pointer to wider diffusion in industry at large. Cost factors are less binding where high-value activities, such as the manufacture of aircraft engines, are involved, or in the military, where security is an overriding consideration.
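An early-warning check of the kind described for engine telemetry can be reduced to a simple pattern: compare each new sensor reading with a rolling baseline and flag deviations beyond a tolerance. The sketch below is an invented, simplified illustration, not GE's monitoring system; the sensor readings and thresholds are hypothetical.

from collections import deque

def watch(readings, window=5, tolerance=0.10):
    """Flag readings that deviate more than `tolerance` from the rolling average."""
    recent = deque(maxlen=window)
    for t, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) / baseline > tolerance:
                print(f"t={t}: exhaust-gas temperature {value} deviates "
                      f">{tolerance:.0%} from baseline {baseline:.1f} -> early warning")
        recent.append(value)

# Invented exhaust-gas temperature readings; the spike at the end should be flagged.
watch([612, 615, 610, 613, 611, 614, 612, 690])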

The use of RFID in mass applications such as supply chain management will bring it closer to ubiquity. In the early days, the industry had to improve the technology, particularly the ability to read information reliably, so that it would work in commercial environments. Wal-Mart is working with a hundred of its partners to expand the use of RFID in supply chain management. NEC of Japan has reported early successes with RFID in its PC assembly plants. Unlike bar codes, RFID tags do not require manual scanning; as a result, productivity at its Yonezawa plant increased by 10%, in addition to the benefits of just-in-time replenishment of inventories.

It will take middleware or some other form of integration technology for companies to receive information from a variety of sources and funnel it to a central database. One significant initiative to popularize the use of RFID is the partnership among Oracle, Intel and Xpaseo to supply tools for managing the information received from sensors. These tools will mediate the flow of information from sensors and integrate with existing products from Oracle and Intel, including Oracle Application Server 10g, Oracle Database 10g and Oracle E-Business Suite. In addition, the partnership will extend the scope of pervasive computing by linking data from other devices, such as handhelds, PCs and mobile devices, using Intel communication and server platforms.

Other companies with RFID offerings include SAP, IBM and Sun. There are also smaller specialist companies that offer middleware to integrate RFID technology with the rest of the enterprise software stack.

THE BIG PICTURE

The real-time enterprise has to put several moving parts in place before the entire vehicle for rapid response takes shape. Vendors can now offer most of the components of an adaptive enterprise solution and are acquiring companies to consolidate their products into a complete package. Some remarkable breakthroughs have already been achieved, especially in the analysis of data. Real-time interpretation of data, aided by machine learning techniques and predictive analytics, has equipped enterprises to grasp the dimensions of the problems they encounter and to anticipate outcomes and consequences. The imminent prospect of automated data gathering and business process design will trigger as much excitement as dashboards have in the recent past. Larger vendors such as Oracle, IBM, Microsoft, SAS, SAP, Business Objects and Hyperion are emerging stronger than before and are best equipped to become leaders. In all probability, infrastructure providers from among the former ERP companies will have the strongest foundations to supply the products and the consulting skills needed to win over customers in the future.
