TRANSCRIPT
Data Migration through an Information Development Approach: A Management Overview
Introducing MIKE2.0, an Open Source Methodology for Information Development: http://www.openmethodology.org
Agenda
Data Migration through Information Development
─ Executive Summary
─ Business Drivers for Better Data Migration
─ Guiding Principles for Better Data Migration
MIKE2.0 Methodology
─ 5-phase approach to Better Business Intelligence
─ Example Task Outputs from Strategy Activities
Lessons Learned
© 2008 BearingPoint, Inc.
Data Migration through Information Development: Scope within BearingPoint's IM Solution Suite
Information Management Solution Suite
Delivered through a Collaborative Approach with the IM Profession and our Alliance Vendors
Enterprise Information Management
Supported by Solution Capabilities that provide a foundation for Suite Delivery
[Diagram: Solution Suite components: Access, Search and Content Delivery; BI and EPM; Information Asset Management; Enterprise Data Management; Enterprise Content Management; Information Strategy, Architecture and Governance. Delivered through Business Solutions & Open Solutions and Commercial Source Products]
Sets the new standard for Information Development through an Open Source Offering
Data Migration through Information Development: Executive Summary
Migrating from a legacy environment to a new system can be a straightforward activity or a very complex initiative. Migration can come in many forms:
A migration from a relatively simple system into another system
Upgrading a system to a new version through an approach that requires changing the underlying data
The convergence of multiple systems into a single composite system
Complex migration from one system to a new system, which requires the migration to be rolled out over a period of time
Multiple, concurrent systems migrations and consolidation efforts. This is referred to as "IT Transformation"
In most large organisations, migration of Enterprise Systems is very complex. To simplify this complexity, we first aim to understand the scope of the problem and then formulate some initial solution techniques.
The MIKE2.0 Solution for Data Migration provides techniques for measuring the complexity of the Data Migration initiative and determining the activities that are required. It also defines the strategic architectural capabilities as well as high-level solution architecture options for solving different data migration challenges. It then moves into the set of required Foundation Activities, Incremental Design, and Delivery steps. The Executive Summary presents some of the strategy activities.
Similar MIKE2.0 Solutions include:
The MIKE2.0 Solution for IT Transformation provides a complementary Solution Approach for dealing with these issues on a very large scale
The MIKE2.0 Solution for Master Data Management provides an approach for running multiple systems in an ongoing fashion that synchronise data sets such as customer, product, employee and locality
Data Migration through Information Development: Business Drivers for Better Data Migration
Achieve:
─ Better data quality into the new target systems
─ A systematic approach to prioritizing functionality to be moved to the target
─ Alignment of related migration initiatives
─ A standards-based approach to large-scale systems implementation
─ The ability to run co-existent applications to reduce deployment risk
─ An ability to trace the flow of information across all systems in the architecture
─ Building new analytical systems as part of the operational data migration
Avoid:
─ High-risk implementations from a business perspective
─ Very complex code that is difficult to manage and is only used "once off"
─ Issues with reconciling common data across all systems
─ Inefficient software development processes that increase cost and slow delivery
─ Inflexible systems and lock-in to specific technologies
─ Unnecessary duplication of technology spend
Change Drivers:
─ Continuously Changing Business Environment
─ Today's Systems are More Data-Dependent
─ Reduced Technical Complexity & Cost
─ Architectures Moving to Vendor-Based Systems
Data Migration through Information Development: 10 Guiding Principles for Better Data Migration
Measure the complexity of your initiative – Understand your technology requirements based on the sophistication of the migration effort. Determine the full set of capabilities that are required
Don't bite off too much at once – Establish an overall architectural blueprint for a complex programme and migrate system functionality a piece at a time. Complex systems can be progressively decommissioned through co-existent applications
Investigate & fix DQ problems early – Data quality issues discovered at a late stage often result in programme failures or significant delays. Start with data profiling to identify high-risk areas in the early stages of the project. As soon as possible, get your hands on real data
Use standards to reduce complexity – Data Migration is simplified through the use of open and common standards related to data, integration and infrastructure
Build a metadata-driven solution – A comprehensive approach to metadata management is the key to reducing complexity and promoting reusability across infrastructure. A metadata-driven approach makes it easier for users to understand the meaning of data and to understand the lineage of data across the environment
Take a diligent approach to testing – Data Migrations are complex and user expectations will be high, considering the transition is typically from a working system. A systematic testing process should be followed, from initial functional testing to final user acceptance
Don't provide an "infrastructure only" release – Unless the delivery times are short or the infrastructure issues very significant, always aim to provide some form of new business capability in a release; business users often get frustrated with technology-only initiatives. New reporting functionality is often a good complement to an infrastructure-focused release
Make sure the business case is sound – If a system is going to be replaced, make sure there is a good business reason for it. Also make sure that the business appreciates that there will likely be an initial higher cost of systems in the early stages and that a properly constructed business case actually includes a replacement plan, even for the new system
Align projects into an overall programme – If conducting multiple initiatives, there will be many commonalities across the different projects. Align these projects into an overall programme
Use a detailed, method-based approach – The MIKE2.0 Methodology provides an open source approach for Information Development
Data Migration Guiding Principles: Investigate & Fix DQ Problems Early
Eliminate the 11th Hour Fire Fight
The latter stages of testing are the most expensive and worst time to discover problems with the data. It is late in the process and there is little time to do the analysis and fix the problem. More times than not this has caused project delays. By starting with data profiling, we identify our high-risk areas in the early stages of the project.
All problems need to be worked through in the staging areas prior to further data movement. Therefore, we make as much of an effort as possible to fix the problems while the data is standing still. It costs time and resources to move data. Different types of problems are addressed in each staging area.
[Diagram: staged migration flow from Source Systems to Target System(s): (1) Migration Staging with Table Scan, Attribute Scan and Assessments; (2) Data Re-Engineering; (3) Data Integration; (4) Transformations into a Move Staging area (Light/Medium/Heavy); (5) Data Integration processes; then loads to Test Environment(s) (6-T) and Production (6-P); supported throughout by Assessments Reporting and Metadata Management]
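The Table Scan and Attribute Scan steps in the staging flow amount to basic data profiling. A minimal sketch of an attribute scan, assuming in-memory records and an illustrative 20% null-rate threshold (neither the record shape nor the threshold is prescribed by MIKE2.0):

```python
# Minimal attribute-scan sketch: profile each column of a staged extract
# for null rate and distinct-value count, flagging high-risk attributes early.
# Column names and the 20% null-rate threshold are illustrative assumptions.

def attribute_scan(rows, null_threshold=0.2):
    """rows: list of dicts sharing the same keys (one dict per record)."""
    report = {}
    if not rows:
        return report
    for col in rows[0]:
        values = [r.get(col) for r in rows]
        nulls = sum(1 for v in values if v in (None, ""))
        distinct = len({v for v in values if v not in (None, "")})
        null_rate = nulls / len(values)
        report[col] = {
            "null_rate": null_rate,
            "distinct": distinct,
            "high_risk": null_rate > null_threshold,
        }
    return report

# Hypothetical staged extract: two of three phone values are missing.
staged = [
    {"cust_id": "C1", "phone": "555-0100"},
    {"cust_id": "C2", "phone": ""},
    {"cust_id": "C3", "phone": None},
]
report = attribute_scan(staged)
print(report["phone"]["high_risk"])  # True: phone fails the null-rate check
```

Run against real data as early as possible, a report like this is what drives the "fix problems while the data is standing still" guidance above.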
Data Migration Guiding Principles: Use Standards to Reduce Complexity
[Diagram: creating data standards across current-state and future-state environments:
Current State Environments – inventory source tables, source attributes, upstream sources and downstream targets; create as-is Domain and Entity Models
Future State Environments – Enterprise Apps and iODS data models; create Domain, Entity and Entity Relationship Models
Rationalization – rationalize domains, entities and attributes across Current State and Future State Environments
Common Data Standards (Enterprise Representation) – initial Common Data Standards and creation of: an initial DQ Program, an initial Data Ownership Model, an Entity Attribute Model and initial Data Management Governance Processes
Attribute Mappings – map each application environment (Finance, iODS, DW, Customer) to the Enterprise Standard]
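The attribute-mapping idea can be sketched as a per-environment rename table; the environment keys and attribute names below are invented for illustration, not drawn from MIKE2.0:

```python
# Sketch of mapping application-specific attribute names to an enterprise
# standard. The environments ("finance", "dw") and attribute names are
# hypothetical; a real programme would hold these mappings as metadata.

ENTERPRISE_MAPPINGS = {
    "finance": {"CUST_NO": "customer_number", "CUST_NM": "customer_name"},
    "dw":      {"CUSTOMER_ID": "customer_number", "NAME": "customer_name"},
}

def to_enterprise_standard(environment, record):
    """Rename a record's attributes using the environment's mapping;
    unmapped attributes pass through unchanged."""
    mapping = ENTERPRISE_MAPPINGS[environment]
    return {mapping.get(k, k): v for k, v in record.items()}

row = to_enterprise_standard("finance", {"CUST_NO": "1001", "CUST_NM": "Acme"})
print(row)  # {'customer_number': '1001', 'customer_name': 'Acme'}
```

Holding the mapping as data rather than code is one small instance of the metadata-driven principle above.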
The MIKE2.0 Methodology: An Open Source Methodology for Information Development
What is MIKE2.0?
MIKE stands for Method for an Integrated Knowledge Environment
MIKE2.0 is our comprehensive methodology for Enterprise Information Management
MIKE2.0 brings together important concepts around Open Source and Web 2.0
The open source version of MIKE2.0 is available at: http://www.openmethodology.org
Key Constructs within MIKE2.0
SAFE (Strategic Architecture for the Federated Enterprise) is the architecture framework for the MIKE2.0 Methodology
Information Development is the key conceptual construct for MIKE2.0 – develop your information just like applications
MIKE2.0 provides a Comprehensive, Modern Approach
Scope covers Enterprise Information Management, but goes into detail in areas to be used for more tactical projects
Architecturally-driven approach that starts at the strategic conceptual level, goes to solution architecture
A comprehensive approach to Data Governance, Architecture and strategic Information Management
MIKE2.0 provides a Collaborative, Open Source Methodology for Information Development
Balances adding new content with release stability through a method that is easier to navigate and understand
Allows non-BearingPoint users to contribute
Links into BearingPoint's existing project assets on our internal knowledge management systems
A unique approach: we would like to make this "the standard" in the new area of Information Development
The MIKE2.0 Methodology: MIKE2.0 for Data Migration
The MIKE2.0 Methodology can be applied to solve different types of Data Migration problems
For simple migration activities, not all activities from the Overall Implementation Guide are required. The complete migration may take only a single release
For complex migration scenarios, most activities will be required and will be implemented over multiple increments. Complex migration scenarios often require very sophisticated architectural capabilities
Most migrations of Enterprise Applications are very complex processes
The following pages go through some of the initial strategy activities in MIKE2.0 that:
Help introduce the overall approach to Data Migration and how it is applied depending on the complexity of the problem
Provide an example of one of the tasks in the initial current-state assessment
Propose some high-level solution architecture options that can be applied to different migration scenarios
Provide an approach for prioritizing complex migrations, based on business priorities and complexity of the implementation
MIKE2.0 Methodology: Phase Overview – The 5 Phases of MIKE2.0
Information Development through the 5 Phases of MIKE2.0
[Diagram: the Strategic Programme Blueprint (Phase 1 Business Assessment, Phase 2 Technology Assessment) is done once. The Continuous Implementation Phases (Phases 3, 4, 5: Roadmap & Foundation Activities, Design, Development, Deploy, Operate) repeat for Increment 1, Increment 2, Increment 3 and so on, with an Improved Governance and Operating Model carried forward before beginning the next increment]
MIKE2.0 Methodology: Phase Overview – Typical Activities Conducted as part of the Strategy Phases
Phase 1 – Business Assessment and Strategy Definition Blueprint
1.1 Strategic Mobilisation
1.2 Enterprise Information Management Awareness
1.3 Overall Business Strategy for Information Development
1.4 Organisational QuickScan for Information Development
1.5 Future-State Vision for Information Management
1.6 Data Governance Sponsorship and Scope
1.7 Initial Data Governance Organisation
1.8 Business Blueprint Completion
1.9 Programme Review
Phase 2 – Technology Assessment and Selection Blueprint
2.1 Strategic Requirements for BI Application Development
2.2 Strategic Requirements for Technology Backplane Development
2.3 Strategic Non-Functional Requirements
2.4 Current-State Logical Architecture
2.5 Future-State Logical Architecture and Gap Analysis
2.6 Future-State Physical Architecture and Vendor Selection
2.7 Data Governance Policies
2.8 Data Standards
2.9 Software Development Lifecycle Preparation
2.10 Metadata Driven Architecture
2.11 Technology Blueprint Completion
MIKE2.0 Methodology: Task Overview – Task 1.2.1 Develop and Initiate Information Management Orientation
[Navigation diagram: the 5 Phases of MIKE2.0 with Phase 1 Business Assessment highlighted]
Activity 1.2 Enterprise Information Management Awareness – tasks, responsible party and status:
Task 1.2.1 Assess Team's Understanding of Information Management Concepts
Task 1.2.2 Develop and Initiate Information Management Orientation
MIKE2.0 Methodology: Task Overview – Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Migration in MIKE2.0 takes place across multiple stages. This means that in the continuous implementation phases (Phases 3, 4, 5) these stages are repeated.
The Transformation process is thought of in 4 stages (Acquisition, Consolidation, Move and Post-Move), supported by End to End Tool Based Transformation Capabilities. Guidelines for each stage are listed below. Some final activities are often put off until the Post-Move Stage.
Acquisition Stage – The acquisition stage is focused on the sourcing of data from the producer. The data is placed in a staging area where the data is scanned and assessed. Judgments are made on the complexity of data quality issues, and initially identified data quality problems are addressed
Consolidation Stage – The consolidation stage focuses on attribute rationalisation into an integrated data store that may be required to bring data together from multiple systems. Key transformations occur and further steps are required for re-engineering data. The data and processes are prepared for migration to the Move environment. Considerable collaboration is needed in those areas where decommissioning occurs
Move Stage – The move stage focuses on moving the data and application capabilities that have been developed to the production environment. The move stage has a staging area that is as close to production as possible. Final steps around data quality improvement are done in this environment
Post-Move Stage – The post-move stage is focused on the data transformations and quality aspects that were best done after the move to production (but before the system goes live), such as environment-specific data or reference data. Additional process changes or software upgrades may also be required. The skills and toolsets used are the same as the ones used in the prior phases. Attention is paid to the ongoing use of the interfaces created during the transition process
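The four stages above can be sketched as a sequential pipeline in which each stage hands its staging area to the next. The stage bodies below are toy placeholders (simple record tags), not MIKE2.0 deliverables:

```python
# Toy sketch of the four migration stages as a sequential pipeline.
# Each stage receives the records from the previous staging area and
# returns a new staging area; the transforms shown are placeholders.

def acquisition(source_records):
    # Source data lands in a staging area where it is scanned and assessed.
    return [dict(r, _assessed=True) for r in source_records]

def consolidation(staged):
    # Rationalise attributes from multiple sources into an integrated store.
    return [dict(r, _consolidated=True) for r in staged]

def move(consolidated):
    # Move data into a staging area as close to production as possible.
    return [dict(r, _moved=True) for r in consolidated]

def post_move(moved):
    # Apply transformations best done after the move (e.g. reference data).
    return [dict(r, _post_move=True) for r in moved]

def run_migration(source_records):
    data = source_records
    for stage in (acquisition, consolidation, move, post_move):
        data = stage(data)
    return data

result = run_migration([{"id": 1}])
print(result[0]["_post_move"])  # True: record passed through all four stages
```

The point of the shape, each stage reading only the previous staging area, is that problems can be fixed while the data is standing still, as the DQ principle above recommends.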
MIKE2.0 Methodology: Task Overview – Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Depending on the level of complexity, different migration orientations are required. At an introductory level, MIKE2.0 classifies orientations as "lite", "medium" and "heavy".
[Diagram: Acquisition, Consolidation, Move and Post-Move Stages, supported by End to End Tool Based Transformation Capabilities]
Strategy – The migration effort can start at any one of the orientations. An enterprise transformation may have parts of the effort start concurrently at each of the orientations. A migration effort may start at the Lite orientation and decide to move to the next orientation (Medium) on the fly as the results of the data scans are examined. Further, some of the data rationalization and Data Quality work may be done in the target environment after the Move Phase
Orientation – Migration Lite: A lite migration scenario is straightforward: it typically involves loading data from a single source into a single target. Few changes are required in terms of data quality improvement; mapping is relatively simple, as is the application functionality to be enabled. Data integration may be on the back-end of systems and will likely be a once-off, "big bang"
Orientation – Migration Medium: A medium migration scenario may involve loading data from a single source into a single target or to multiple systems. Data quality improvement will be performed through multiple iterations, transformation issues may be significant and integration into a common data model is typically complex
Orientation – Migration Heavy: A heavy migration scenario typically involves providing a solution for application co-existence that allows multiple systems to be run in parallel. The integration framework is formulated so the current-state and future-state can work together. The model for a heavy migration scenario is representative of an organisation in IT Transformation. As heavy migrations are long running and involve a significant data integration effort, it is useful to build a parallel analytical environment to attain a "vertical" view of information
MIKE2.0 Methodology: Task Overview – Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Different capabilities are required from the architecture depending on the level of sophistication required. Capabilities are first defined at the strategic component level in Activity 1.5.
Capability/Skills | Orientation – Lite | Orientation – Medium | Orientation – Heavy
Table and Attribute Assessment | Performed | Performed | Performed
Relationship Assessment | Direct copy of Source Systems | Key Integrity Validated | Referential Integrity Required
Data Replication | None | None | Multiple Targets
Data Transfer | To Target | To Target | Target/Downstream
Data Synchronization | None | None | For Interfaces
Data Transformation | Modest | Significant to Similar Structures | Major Activity
Data Mapping | Minimal | SME Supported | Major Activity
Data Standardization | None | Key Attributes | All Attributes
Pattern Analysis and Parsing | None | None | Yes
Record Matching | None | Based on Similar IDs | IDs and Pattern Matching
Record De-Duping | None | None | Yes
Out of Box Business Rules | As Appropriate | As Appropriate | As Appropriate
Configure Complex Rules | None | Application | Application/Infrastructure
Out of the Box Interfaces | As Appropriate | As Appropriate | As Appropriate
Configure Custom Interfaces | None | Application | Application/Infrastructure
Data Governance Process Model | Documented in High Level Form | Key or Lynchpin Processes only | End to End Models
Database Independent Functions | As Existed in Source | Few Custom APIs | Infrastructure Services
Reporting | Data Move Metrics only | DQ and DM metrics | Reporting as a Service
'Active' Metadata Repository | Specific and 'Physical' | Multiple Passive dictionaries | Initial Implementation
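Held as data, the matrix lets a project look up what a chosen orientation demands. A sketch with only a few rows reproduced from the table above (the dictionary layout is an assumption, not a MIKE2.0 artefact):

```python
# A subset of the capability matrix as a lookup structure, so a team can
# query what a given orientation requires. Values are copied from the
# table above; only three of the rows are reproduced here.

CAPABILITY_MATRIX = {
    "Data Replication":     {"lite": "None", "medium": "None",           "heavy": "Multiple Targets"},
    "Record De-Duping":     {"lite": "None", "medium": "None",           "heavy": "Yes"},
    "Data Standardization": {"lite": "None", "medium": "Key Attributes", "heavy": "All Attributes"},
}

def required_capabilities(orientation):
    """Return only the capabilities the orientation actually needs
    (i.e. rows whose value for that orientation is not 'None')."""
    return {cap: levels[orientation]
            for cap, levels in CAPABILITY_MATRIX.items()
            if levels[orientation] != "None"}

print(required_capabilities("medium"))  # {'Data Standardization': 'Key Attributes'}
```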
MIKE2.0 Methodology: Task Overview – Task 1.4.1 Assess Current-State Application Portfolio
[Navigation diagram: the 5 Phases of MIKE2.0 with Phase 1 Business Assessment highlighted]
Activity 1.4 Organisational QuickScan for Information Development – tasks, responsible party and status:
Task 1.4.1 Assess Current-State Application Portfolio
Task 1.4.2 Assess Information Maturity
Task 1.4.3 Assess Economic Value of Information
Task 1.4.4 Assess Infrastructure Maturity
Task 1.4.5 Assess Key Current-State Information Processes
Task 1.4.6 Define Current-State Conceptual Architecture
Task 1.4.7 Assess Current-State People Skills
Task 1.4.8 Assess Current-State Organisational Structure
Task 1.4.9 Assemble Findings on People, Organisation and its Capabilities
MIKE2.0 Methodology: Task Overview – Task 1.4.1 Assess Current-State Application Portfolio
The Application Portfolio documents major systems and their functionality:
System Name
Description – Brief statement that gives an overview of the system
Platform – Technologies the system consists of: Application, Database, Operating System, Programming Language
Level 1 & 2 Functions – This attribute maps the application to the end-to-end process model for both Level 1 (Functions) and Level 2 (Process). For systems that are repeated across the business, this attribute is repeated for each instance
Application Complexity – Rating of the complexity of an application. The ratings used are as follows:
Low – These systems are relatively simple, use a simple database or small number of files and are reasonably well documented. Low complexity systems may also include "black box" systems where support and documentation is provided by a vendor, the product is stable (with infrequent new releases), and there is little to no customization. These black box systems would generally apply more to areas such as production than to customer-facing or financial systems
Medium – These systems are generally more substantial in functionality than low complexity systems, or are reasonably simple in functionality but are poorly documented and/or use a number of related databases or files. Off-the-shelf vendor products may be classified as Medium complexity systems if they are generally well documented with some customization. This classification may also represent a grouping of multiple low complexity applications
High – Highly complex systems that use complex data structures and require trained staff to configure and support. This would also include those that are poorly documented and difficult to maintain, whilst containing a significant amount of business functionality. Major off-the-shelf vendor systems that have been heavily customized, have been tailored by the vendor to divisional requirements, or for which upgrades would be difficult would likely be classified as High complexity
Divisions – Divisions/business clusters supported by the system
Inter-Divisional Complexity – Rating of the inter-divisional complexity due to variations in the instances of systems supporting different divisions. The ratings used are as follows:
Low – The system is used in 1 division, or in exactly the same way in 2 or more
Medium – There are multiple instances of the system but the differences between divisions are largely configuration changes
High – There are multiple instances of the system and there are significant differences in code and configuration of these systems
Issues & Limitations – Key problems associated with the system
Application Life Expectancy – This attribute may contain the envisaged life expectancy for the system, e.g. whether it will be decommissioned or be part of the strategic architecture. Ratings may also be assigned to the system in terms of application life expectancy, using the following model:
Null – Unknown
Low – This system will be replaced in the near term (< 2 years)
Medium – This system will be replaced in the long term (2 – 5 years)
High – Strategic, long-term application (> 5 years)
Comment – Additional comments related to the system, e.g. point-of-contact, open questions
Notes on the deliverable:
From an Information Development perspective, the deliverable should only be to get a quick overview of the system to understand major application functionality
The Application Portfolio also documents the system owner and any changes the system is expected to undergo during the period of executing on the Blueprint vision
It is focused at a systems level, as opposed to infrastructure and information
Time for this task may vary greatly depending on the existing artefacts
Ideally, this content is stored in a structured repository as opposed to an unstructured document form
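Stored in a structured repository, each portfolio entry might be captured as a record type along these lines. This is a sketch: the field names follow the attributes above, while the example system and its values are invented:

```python
# Sketch of an Application Portfolio entry as a structured record.
# Field names mirror the attributes described above; the example
# system ("LegacyBilling") is hypothetical.
from dataclasses import dataclass, field

@dataclass
class PortfolioEntry:
    system_name: str
    description: str
    platform: str
    complexity: str                       # "Low" | "Medium" | "High"
    divisions: list = field(default_factory=list)
    inter_divisional_complexity: str = "Low"
    issues: list = field(default_factory=list)
    life_expectancy: str = "Null"         # Null | Low (<2y) | Medium (2-5y) | High (>5y)
    comment: str = ""

entry = PortfolioEntry(
    system_name="LegacyBilling",          # hypothetical system
    description="Batch billing for retail accounts",
    platform="COBOL / DB2 / z/OS",
    complexity="High",
    divisions=["Retail"],
    life_expectancy="Low",                # to be replaced in < 2 years
)
print(entry.complexity)  # High
```

A collection of such records is easy to filter, e.g. listing every High-complexity system with a Low life expectancy as migration candidates.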
MIKE2.0 Methodology: Task Overview – Task 1.5.10 High Level Solution Architecture Options
[Navigation diagram: the 5 Phases of MIKE2.0 with Phase 1 Business Assessment highlighted]
Activity 1.5 Future-State Vision for Information Management – tasks, responsible party and status:
Task 1.5.1 Introduce Leading Business Practices for Information Management
Task 1.5.2 Define Future-State Business Alternatives
Task 1.5.3 Define Information Management Guiding Principles
Task 1.5.4 Define Technology Architecture Guiding Principles
Task 1.5.5 Define IT Guiding Principles (Technology Backplane Delivery Principles)
Task 1.5.6 Define Future-State Information Process Model
Task 1.5.7 Define Future-State Conceptual Data Model
Task 1.5.8 Define Future-State Conceptual Architecture
Task 1.5.9 Define Source-to-Target Matrix
Task 1.5.10 Define High-Level Recommendations for Solution Architecture
MIKE2.0 Methodology: Task Overview – Task 1.5.10 High Level Solution Architecture Options
A lite migration scenario is straightforward: it typically involves loading data from a single source into a single target. Few changes are required in terms of data quality improvement; mapping is relatively simple, as is the application functionality to be enabled. Data integration may be on the back-end of systems and will likely be a once-off, "big bang".
Below is a high-level solution option for a lite migration scenario.
[Diagram: a once-off migration load from the Current-State System, through transformation and some data quality cleanup, into the Future-State Test System and then the Future-State Production System]
MIKE2.0 Methodology: Task Overview – Task 1.5.10 High Level Solution Architecture Options
A medium migration scenario may involve loading data from a single source into a single target or to multiple systems. Data quality improvement will be performed through multiple iterations, transformation issues may be significant and integration into a common data model is typically complex.
Data migration may involve multiple iterations through a gradual roll-out of capabilities. Below is a high-level solution option for a medium migration scenario.
[Diagram: from Current-State Data Producers into Migration Staging (Table Scan, Attribute Scan, Assessments Reporting), then Profiling and Data Re-Engineering (applying the '80/20 rule' for Data Re-Engineering), Metadata Management (illustrated with CUSTOMER entity attributes such as CUSTOMER NUMBER, NAME, ADDR, CITY, POST, PHONE, FAX), and Data Integration into an Integrated Data Store with a Common Data Model and Detailed Data; loads then go to the Test Target (6-T) and Production Target (6-P)]
MIKE2.0 Methodology: Task OverviewTask 1.5.10 High Level Solution Architecture Options
A heavy migration scenario will require a comprehensive strategy that develops a vision for people, process, organisation and technology. A heavy Application Co-Existence scenario shows how multiple systems can be run in parallel so the current-state and future-state can shows how multiple systems can be run in parallel so the current state and future state can work together. Below is a high level solution option for a heavy migration scenario and representative of an organisation in IT Transformation.
[Diagram: heavy migration / Application Co-Existence scenario – Current State Implementations (Systems 1-4, each classified by Native Use, Function/Data/Migration and New System use of its data) run in parallel with New Implementations (a New System plus Whole of Customer, Whole of Product, and Reporting and Analysis capabilities), fronted by Horizontal and Vertical Portals. A Technology Backplane on an Enterprise Platform provides Process Integration, Workflow Implementation, Business Services, Data Services, Interface Services, Common Messaging, a Data Mastering Model, Active Metadata, Portal Enablement, Data Mediation and Platform Mediation.]
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Information Development through the 5 Phases of MIKE2.0
[Diagram: the Strategic Programme Blueprint is done once – Phase 1 (Business Assessment) and Phase 2 (Technology Assessment) deliver the Roadmap & Foundation Activities, followed by Continuous Implementation Phases (Phases 3, 4 and 5) run as increments (Design, Development, Deploy, Operate, Begin Next Increment) under an Improved Governance and Operating Model.]

Phase 1 – Business Assessment and Strategy Definition Blueprint
1.1 Strategic Mobilisation
1.2 Enterprise Information Management Awareness
1.3 Overall Business Strategy for Information Development
1.4 Organisational QuickScan for Information Development
1.5 Future State Vision for Information Management
1.6 Data Governance Sponsorship and Scope
1.7 Initial Data Governance Organisation
1.8 Business Blueprint Completion
1.9 Programme Review

Activity 1.8 Business Blueprint Completion – Tasks
Task 1.8.1 Prioritise Requirements and Identify Immediate Work Opportunities
Task 1.8.2 Define High-Level Programme Plan
Task 1.8.3 Develop High-Level Business Case
Task 1.8.4 Assemble Key Messages to Complete Business Blueprint
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
For Data Migration initiatives that involve replacement of a number of systems, a key part of prioritisation involves balancing the desire for new business capabilities with the complexity of their implementation.
High Level Project Estimating Factors Include:
The complexity of the current-state environment
The number of critical business functions to be enabled
The level of technology sophistication that is required
The number of systems to be migrated
Amount of data within these systems to be migrated
Level of documentation on the system
Availability of Subject Matter Experts
Complexity of system interfaces
Quality of the data within the system
A key aspect of the MIKE2.0 approach is determining these Estimating Factors. The Estimating Model available as part of MIKE2.0 is described on the following pages.
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
There are many factors that will be used to estimate the time and resources required for delivering a Data Migration project. The model below can be used to make a quantitative estimate of the complexity of the project and to weigh business priorities. If multiple migrations are to take place across a large transformation programme, this model can be used to help prioritise the sequencing of the overall implementation.
These questions should be asked relative to an application or application cluster by senior staff. A large-scale transformation programme may have multiple applications or application clusters. A starter set of sample questions is listed below.
Criteria for Assessing Difficulty
─ Number and size of databases in Application
─ Number of Tables per database
─ Total Number of Attributes
─ # of attributes that have had multiple definitions over time
─ # of attributes in terms of synonyms and antonyms
─ Number of DB dependent processes
─ Number of one time Interfaces
─ Number of ongoing Interfaces
─ Number of Data Quality Problems and Issues to fix
─ Knowledge/Documentation of Data Quality issues
─ Ease of de-duping similar entities in the same DB
─ Ease of matching same entity records across multiple DBs
─ Completeness of the functional documentation
─ Availability of Subject Matter Experts (SMEs)
─ Maturity of the Enterprise in Managing their data

Alignment with Business Enablers
─ Degree to which system functions align with base capabilities
─ Degree to which system functions align with enhanced capabilities focused on the new business model
─ Degree to which the system addresses high priority customer segment growth
─ Degree to which the system addresses high priority customer segment retention
─ Degree to which the system addresses high priority areas of product growth
─ Degree to which the system addresses high priority areas of product stabilization
─ Degree to which the system is cost effective (i.e., cost takeout)
─ Degree to which the system is flexible to adding capabilities
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Difficulty Criteria – each scored on a five-point scale.

Scored 1 (Small) – 2 – 3 (Medium) – 4 – 5 (Large):
─ Number and size of databases in Application
─ Number of Tables per database
─ Total Number of Attributes
─ % of attributes that have had multiple definitions over time
─ % of attributes in terms of synonyms and antonyms
─ Number of DB dependent processes
─ Number of one time Interfaces
─ Number of ongoing Interfaces
─ Number of Data Quality problems and issues to fix

Scored 1 (Low) – 2 – 3 (Average) – 4 – 5 (High):
─ Knowledge/Documentation of Data Quality issues
─ Ease of de-duping similar entities in the same DB
─ Ease of matching same entity records across multiple DBs
─ Completeness of the functional documentation
─ Availability of Subject Matter Experts (SMEs)
─ Maturity of the Enterprise in Managing their data
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Alignment with Key Business Enablers – each scored 1 (Low) – 2 – 3 (Fair) – 4 – 5 (High):
─ Degree to which system functions align with base capabilities
─ Degree to which system functions align with enhanced capabilities focused on the new business model
─ Degree to which the system addresses high priority customer segment growth
─ Degree to which the system addresses high priority customer segment retention
─ Degree to which the system addresses high priority areas of product growth
─ Degree to which the system addresses high priority areas of product stabilization
─ Degree to which the system is cost effective (i.e., cost takeout)
─ Degree to which the system is flexible to adding capabilities
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Scoring Formulas

Business Alignment equals the sum of (criterion score × weight):
─ Degree to which system functions align with base capabilities × 1
─ Degree to which system functions align with enhanced capabilities focused on the new business model × 1
─ Degree to which the system addresses high priority customer segment growth × 1
─ Degree to which the system addresses high priority customer segment retention × 1
─ Degree to which the system addresses high priority areas of product growth × 1
─ Degree to which the system addresses high priority areas of product stabilization × 1
─ Degree to which the system is cost effective (i.e., cost takeout) × 1
─ Degree to which the system is flexible to adding capabilities × 1

Difficulty Index equals the sum of:
─ Number and size of databases in Application × 1
─ Number of Tables per database × 2
─ Total Number of Attributes × 1
─ # of attributes that have had multiple definitions over time × 2
─ # of attributes in terms of synonyms and antonyms × 1.5
─ Number of DB dependent processes × 3
─ Number of one time Interfaces × 2
─ Number of ongoing Interfaces × 2
─ Number of Data Quality Problems and Issues to fix × 1
minus the sum of:
─ Knowledge/Documentation of Data Quality issues × 1
─ Ease of de-duping similar entities in the same DB × 1
─ Ease of matching same entity records across multiple DBs × 3
─ Completeness of the functional documentation × 2
─ Availability of Subject Matter Experts (SMEs) × 1
─ Maturity of the Enterprise in Managing their data × 3
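As arithmetic, the two formulas reduce to weighted sums. The Python sketch below applies the weights from the Scoring Formulas slide; the short criterion keys are my own abbreviations (not MIKE2.0 names) and the example scores are invented.

```python
# Weights from the Scoring Formulas slide; keys are illustrative abbreviations.
DIFFICULTY_ADD = {
    "databases": 1, "tables_per_db": 2, "attributes": 1,
    "multi_definition_attrs": 2, "synonym_antonym_attrs": 1.5,
    "db_dependent_processes": 3, "one_time_interfaces": 2,
    "ongoing_interfaces": 2, "dq_problems": 1,
}
# Mitigating factors reduce the index (the "minus" section of the slide).
DIFFICULTY_SUBTRACT = {
    "dq_knowledge": 1, "dedup_ease": 1, "cross_db_matching": 3,
    "functional_docs": 2, "sme_availability": 1, "data_mgmt_maturity": 3,
}

def business_alignment(enabler_scores):
    """All eight Business Enabler criteria carry weight 1."""
    return sum(enabler_scores)

def difficulty_index(scores):
    """Weighted difficulty drivers minus weighted mitigating factors."""
    added = sum(w * scores[c] for c, w in DIFFICULTY_ADD.items())
    subtracted = sum(w * scores[c] for c, w in DIFFICULTY_SUBTRACT.items())
    return added - subtracted

# Example: every criterion scored 3 on its five-point scale.
scores = {c: 3 for c in list(DIFFICULTY_ADD) + list(DIFFICULTY_SUBTRACT)}
alignment = business_alignment([3] * 8)   # 24
difficulty = difficulty_index(scores)     # 3*15.5 - 3*11 = 13.5
```

The subtraction means a well-documented, well-staffed system can score a low Difficulty Index even with many tables and interfaces, which matches the intent of the mitigating criteria.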
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Metrics on Business Alignment and Difficulty help to formulate priorities for the overall implementation of a large-scale migration programme. This is done by starting with areas that are most important for the business and of the lowest complexity. Whilst a simple model, this helps to clearly illustrate to the business and technical community how priorities were driven for the project in an objective fashion.
Prioritisation quadrant (x-axis: Transition Difficulty Score, Easy → Hard; y-axis: Alignment with KEY Business Enablers, Low → High):

Do Now – 'early wins' (high alignment, easy): Transition to Target and use the Transformation Framework for forward and backward continuity. All process functions migrated at once. Should focus on the most sensitive and critical aspects of the new business.

Analyse and Schedule (high alignment, hard): Migration to the Transformation Framework followed by migration to Target System(s). Migration of process functions is iterative. Many existing functions may remain in the 'as is' environment for an extended period.

Targets of Opportunity (low alignment, easy): Transition to Target Systems as part of Migration Packages associated with higher priority systems. Traditional data conversion techniques used. Good sense of integration needed.

Phase Out (low alignment, hard): Functions picked up by Target System(s) as needed – other functions discontinued. Traditional data conversion techniques used. Change management represents a key set of activities.
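The quadrant lends itself to a simple classifier over the two scores. This sketch is illustrative only – the midpoint thresholds, system names and scores are all invented, and in practice the cut-offs would be set per programme:

```python
def quadrant(alignment, difficulty, align_mid, diff_mid):
    """Bucket a system into one of the four strategies on the slide."""
    high_alignment = alignment >= align_mid
    easy = difficulty < diff_mid
    if high_alignment and easy:
        return "Do Now - early wins"
    if high_alignment:
        return "Analyse and Schedule"
    if easy:
        return "Targets of Opportunity"
    return "Phase Out"

# Invented example systems: (Business Alignment, Difficulty Index).
systems = {"Billing": (32, 10), "Legacy CRM": (12, 40), "Orders": (30, 38)}
strategies = {name: quadrant(a, d, align_mid=20, diff_mid=25)
              for name, (a, d) in systems.items()}
```

Sorting the "Do Now" bucket by alignment then difficulty gives the sequencing of early wins the slide describes.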
Data Migration through Information Development
Lessons Learned
Prioritise Planning
Define business priorities and start with quick wins
Don't do everything at once – Deliver complex projects through an incremental programme
Go "big bang" if you can, but know that often you can't
Focus on the Areas of High Complexity
Get the Technology Backplane capabilities out in front
Don't wait until the 11th hour to deal with Data Quality issues – Fix them early
Follow the 80/20 rule for fixing data – Do this iteratively through multiple cycles
Understand the sophistication required for Application Co-Existence and that in the short term your systems will get more complex
Keep the Business Engaged
Communicate continuously on the planned approach defined in the strategy –The overall Blueprint is the communications document for the life of the programme
Try not to be completely infrastructure-focused for long-running releases – Always deliver some form of new business functionality
Align the migration programme with analytical initiatives to give business users more access to data