Data Migration Roadmap Bir 14112017


Data Migration Guide
(Source: https://www.qualitestgroup.com/white-papers/)

SunGard Releases Data Migration Toolkit


August 29, 2011, in Product Announcements, by Muciojad
The Mifos Data Migration Toolkit developed by the SunGard Mifos team is now available as a Beta release. The toolkit can be used to migrate legacy system data into Mifos for MFIs running a Windows environment.

This tool will help Mifos specialists reduce the time to deploy Mifos. Through the convenient upload of an Excel spreadsheet via the data migration tool, they can more efficiently migrate historical loan and client data. The tool will map the data according to your general ledger accounts and specified Mifos configuration. Currently the Data Migration Toolkit has been tested against Mifos version 2.1 and covers the following functions:
1. Migration of Centers, Groups, and Clients, entity-wise
   1. Client family details
2. Migration of Loans, both individual client loans and group loans
   1. Support for the Flat Rate interest type
   2. Support for the Declining Balance interest type
3. Migration of individual clients, i.e. clients not associated with any Group or Center
4. Migration of Groups independent of Centers
5. Migration of Additional Fields (Question Groups)
6. Validations for:
   1. Mandatory fields in Mifos
   2. Duplicate records check
   3. Range checks on the loan product offerings
   4. Hierarchy check to validate the Loan Officer -> Center -> Group -> Client link
   5. Loan product applicability: products applicable to groups should have loans created for the groups only, not for clients
   6. Loan creation with product frequency mismatch: validation that the recurrence of the loan product matches the customer/center schedule
   7. Date validations (see the sketch after this list):
      1. Loan creation date should be greater than the center creation date (or MFI joining date)
      2. Group MFI joining date should be greater than the center creation date
      3. The disbursement date in a loan account cannot be earlier than the loan creation date
      4. Future dates should not be allowed for the MFI joining date, loan creation date, etc.
7. Migration of one-time fees
8. Migration of group loans
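To illustrate the kind of date checks listed above, the following is a minimal Python sketch. The field names (center_creation_date, loan_creation_date, disbursement_date, mfi_joining_date) are assumptions made for illustration, not the toolkit's actual column headers.

```python
from datetime import date

def validate_loan_dates(record, today=None):
    """Return a list of validation errors for one migrated loan record.

    `record` is assumed to be a dict of date fields; the field names
    below are illustrative, not the toolkit's real spreadsheet columns.
    """
    today = today or date.today()
    errors = []

    if record["loan_creation_date"] <= record["center_creation_date"]:
        errors.append("Loan creation date must be after center creation (or MFI joining) date")
    if record["disbursement_date"] < record["loan_creation_date"]:
        errors.append("Disbursement date cannot be before loan creation date")
    for field in ("mfi_joining_date", "loan_creation_date"):
        if record[field] > today:
            errors.append(f"{field} cannot be in the future")
    return errors

# Example usage with one illustrative record
sample = {
    "center_creation_date": date(2016, 1, 10),
    "mfi_joining_date": date(2016, 1, 10),
    "loan_creation_date": date(2016, 3, 1),
    "disbursement_date": date(2016, 2, 20),  # earlier than creation: flagged
}
print(validate_loan_dates(sample))
```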

Background and Purpose


BGMFI is implementing the Abacus core banking software, and this implementation requires data migration. Data migration is the transfer of data from one location, storage medium, or hardware/software system to another.

A successful data migration effort requires identifying the main challenges, such as dependencies on vendor support, minimal in-house migration expertise, or insufficient understanding of legacy data and source system requirements, and mitigating these issues and risks early in the project by applying different methods so that disruption is minimized.

Benefits of a Data Migration Guide


One of the most important things that increase the chances of success in data migration is establishing and following a guideline based on best practices and lessons learned. This guideline is expected to enhance the quality of data migration efforts. Specifically, the following benefits are expected:

1. Efficient resource utilization: proper data migration planning allows for efficient use of resources.
2. Risk mitigation: it helps to improve early risk mitigation efforts.
3. Cost reduction and speed enhancement: it contributes to an overall cost reduction. In addition, it helps to speed up migration work by minimizing business disruptions, which in turn reduces cost.

Data Migration Plan


A dedicated data migration plan must be developed as a subset of the overall Abacus project plan. This plan provides a checklist that ensures all relevant points are completed.

Scope
The scope of data migration concentrates on moving the data from multiple source systems (LPF, Peachtree, and SPM) to a single target system, Abacus.
The scope of a data migration project includes:
- Planning and execution of the transfer of data between a legacy (or source) data storage and a new (or
target) storage.
- Procedures for validating the results along the way.

Identifying risks, constraints, and challenges

The better these obstacles are managed, the better the chances become for a successful data migration.
The list below shows a preliminary set of typical assumptions/dependencies, constraints, and risks to consider:
1. Assumption/Dependency: Sufficient resources are available for all aspects of data migration.
   Constraint: Time/schedule dictates what must be completed and when.
   Risk: Unexpected delay and/or downtime might occur.

2. Assumption/Dependency: All requirements from the legacy system, and what the target system requires, are fully documented, available, and accessible as planned during the necessary steps of migration.
   Risk: The team might encounter complex processes and configuration issues related to data volumes.

3. Assumption/Dependency: The team has access to subject matter experts for the current source system and data; documentation for data models, physical implementation (database), business rules, and interfaces; and future business requirements.
   Constraint: Personnel and equipment might be limited or unavailable.
   Risk: Misunderstanding or misinterpretation of requirements might result in unexpected issues.

Purpose: It is believed that data integrity is a key factor in producing quality reporting and analysis, which is why close attention to data quality detail is needed to properly perform the data migration and Abacus implementation.

Not every data migration we have worked on has succeeded at the import stage. Usually the process of running import tests leads to additional iterations on the proposed data structure in the new Abacus system as we work around limitations of the data extracted from LPF. It is also important to note that we are migrating all the data that BG NEEDS, not all the data that BG HAS; "need" is defined as what BG will actually use.

Data migration refers to the process of transferring data between data storage types, such as databases. Perhaps the most common data migration task undertaken by nonprofit organizations involves the migration of donor data from one CRM system to another.

Keep in mind that all data migrations require planning and involve risk. Data is being moved, manipulated,
and interpreted.

Goal of data migration: The goal of data migration is to maximize the quality of data in Abacus in order to
achieve higher goals within the organization, such as raising more money and improving donor relationships
long term.
Roles in data migration:
- Gather business requirements for data migration needs.
- Design, plan, and manage the data migration process.
- Work with business owners to identify, define, document, and communicate the data migration requirements.
- Prepare data migration plans including migration risks, milestones, and quality criteria.
- Manage risks and monitor potential impacts as part of the data migration plan.
- Develop best practices, processes, and standards for effectively carrying out data migration activities.
- Perform source system data analysis in order to manage source-to-target data mapping.
- Perform migration and testing of static data and transaction data from one core system to another.
- Perform data migration audit and reconciliation.
Skills Needed
- Proficiency in Microsoft Excel.
- Ability to work efficiently with datasets, e.g., Excel pivot tables, advanced scripting, and VBA.
- Knowledge of database architectures and data mapping between tables.
- Knowledge of data configuration, training, and maintenance of the system.

But what does that process really look like? There are many steps. These are the steps we think of most
frequently when we diagram a data migration project.

01. Planning
02. Analysis
03. Mapping
04. Configure new database
05. Data extraction
06. Create import files
07. Import

It is important to note that testing a data migration should start well in advance of the
actual data being migrated. It is quite typical that one of the most important commodities
that a business controls, owns, or maintains is its data and therefore any data migration
should be considered high risk and should be subjected to significant verification and
validation efforts. A business may choose to reduce the risk level against the data
migration, but it is always prudent to start off with data migration as a high-priority task
to ensure it is successful.

By: Paul Darby



A common term for exercising a data migration task is performing an Extract, Transform and Load (ETL) function, in which data extracted from various sources is transformed (cleaned and fixed) and then loaded into the target data warehouse. This document offers some key consideration points for a test team so that effective test planning and strategizing can be achieved for an ETL project.

Strategy
When strategizing the data migration testing, the following phases of testing need to be
considered:

- Pre-migration testing
- Post-migration testing
- Acceptance (not covered in this document)
- Production migration (not covered in this document)
Pre-Migration Testing
In this phase we perform pre-migration testing. These tests occur early in the migration process, before any migration, even a migration for testing purposes, is completed. The pre-migration testing options include:
- Verify the scope of source systems and data with the user community and IT. Verification should include data to be included as well as excluded and, if applicable, tied to the specific queries being used for the migration.
- Verify the source-to-target high-level mappings for each category of data or content and verify that the desired type has been defined in the destination system.
- Verify destination system data requirements such as field names, field types, mandatory fields, valid value lists, and other field-level validation checks.
- Using the source-to-destination mappings, test the source data against the requirements of the destination system (a sketch of such a check follows this list). For example, if the destination system has a mandatory field, ensure that the appropriate source field is not null, or if the destination system field has a list of valid values, test to ensure that the appropriate source fields contain only those valid values.
- Test source and target system connections from the migration platform.
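As an illustration of the mandatory-field and valid-value checks above, here is a minimal Python sketch that tests extracted source rows against destination field requirements. The rule structure and field names are assumptions made for the example, not Abacus's actual schema.

```python
# Hedged sketch: validate source rows against assumed destination-field rules.
destination_rules = {
    "client_name": {"mandatory": True},
    "branch_code": {"mandatory": True, "valid_values": {"BR01", "BR02", "BR03"}},
    "loan_status": {"mandatory": False, "valid_values": {"ACTIVE", "CLOSED"}},
}

def check_row(row, rules=destination_rules):
    """Return a list of problems found in one source row (a dict)."""
    problems = []
    for field, rule in rules.items():
        value = row.get(field)
        if rule.get("mandatory") and value in (None, ""):
            problems.append(f"{field}: mandatory field is empty")
        valid = rule.get("valid_values")
        if value not in (None, "") and valid and value not in valid:
            problems.append(f"{field}: '{value}' not in valid value list")
    return problems

# Example: one clean row, one row missing a mandatory field and using an invalid code
rows = [
    {"client_name": "A. Bekele", "branch_code": "BR01", "loan_status": "ACTIVE"},
    {"client_name": "",          "branch_code": "BR09", "loan_status": "CLOSED"},
]
for i, row in enumerate(rows, start=1):
    for problem in check_row(row):
        print(f"row {i}: {problem}")
```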

Before any migration has taken place, the following testing actions should take place:
- Ensure that the scope of the data is understood: the extraction process, how many records, tables, relationships, etc. exist. For this, it is important to get visibility of an entity relationship diagram, data dictionary, or similar for the current system being migrated from.
- Ensure that the destination as well as the load process is understood. Again, for the former at least, this should include an entity relationship diagram, data dictionary, or something similar.
- Ensure that the data schema is known, including mandatory fields, field names, field types, data types, etc. for both the original data source and the destination system.
- Understand the data cleansing requirements (see the Data Cleansing section for more detail).
- Understand any interfacing with other systems or secondary suppliers of data into the system.
Much of this information may come from a formal design review and detailed requirements or a migration specification; however, if this review does not take place or the specification does not exist, it should still remain part of the test team's planning process to source this information.

These steps will provide an understanding of the mapping of the data from its original
location to the new location and it will highlight where there are inconsistencies. These
inconsistencies may require a design change or a business process change to enable the
data to be usable and present correctly to the user.

Once this is understood, the following can take place:

- Ensure the mapping to the user interface is correct
- Ensure the mapping to business processes is correct
- Test tool configuration and script generation using field details
- Identify data subset (sampling) details for testing (note: it is often impractical to test 100% of a data set, therefore it is imperative that the scope of the test data is fully understood along with any risks presented by using a subset)
- Ensure all business cases, user stories, or use cases are understood
Data Cleansing
It is vitally important that data cleansing is performed on a system that houses and maintains data. This is most easily done through data validation during input, via field validation and error checking, as this is a cheaper and far more efficient method. Depending on how stringent previous methods of system development and user input have been, it is very likely that incorrect data will exist within the migrated data set. The following data cleansing tasks should be planned within a data migration exercise:

- Understand the error types: these may include blank fields, too-long data lengths, or bad characters, among many others
- Identify the method of checking the data: either interrogation of the database using SQL commands (a sketch appears at the end of this section) or the use of tools such as DQGlobal, Clean & Match, or DTM Data Scrubber, though many other tools are also available
- Understand the data linkages: changing the data in one table may damage the data in another linked table if the same changes are not made across the entire system.
Once this level of understanding has been achieved, the following tasks should be carried out:

- Interrogation of the data in order to find the potential error types previously ascertained
- Fix the data (note: data must be fixed consistently across the system)
- Check record counts
The list above shows things that should take place during the pre-migration and migration
phases and may be fulfilled by the development team and the tester working together. The
migrated data should be the result of the work of all parties within the development
lifecycle.

See the Post-Migration Testing section of this document to understand the data cleansing
tasks once the data has been migrated.
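As a concrete illustration of interrogating the data with SQL for the error types above, here is a minimal Python sketch using the standard library's sqlite3 module against a staged copy of the extracted data. The table and column names (clients, client_name, phone) are assumptions for the example; an actual LPF or Peachtree extract would use its own schema and database driver.

```python
import sqlite3

# Hedged sketch: look for blank fields, over-long values, and bad characters
# in a staging copy of extracted legacy data. Names are illustrative only.
conn = sqlite3.connect("staging.db")
checks = {
    "blank client names":
        "SELECT COUNT(*) FROM clients WHERE client_name IS NULL OR TRIM(client_name) = ''",
    "over-long client names (> 50 chars)":
        "SELECT COUNT(*) FROM clients WHERE LENGTH(client_name) > 50",
    "phone numbers containing non-digit characters":
        "SELECT COUNT(*) FROM clients WHERE phone GLOB '*[^0-9]*'",
}

for description, sql in checks.items():
    count = conn.execute(sql).fetchone()[0]
    print(f"{description}: {count} record(s)")
conn.close()
```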

Post-Migration Testing
Following the migration, black box testing against the business cases, use cases, or user
stories may be carried out and it is possible that the scripts produced for this work will be
reusable for Acceptance testing later on.

The tester should fulfill these tests against the subset of data defined during the pre-migration testing phase. It is recommended that the tester build test scripts that provide exact detail of data usage, either via referenced data sheets or within the text of the test itself.
In addition to testing the known business flows, the testers should carry out further testing, including the negative testing approaches listed after the verification activities below, which primarily ensure that data cleansing is being carried out at run time within the system:

Expect a significant number of errors to be identified during the initial test runs, although this will be minimized if sufficient pre-migration testing is well executed. Post-migration testing is typically performed in a test environment and includes:

- Test the throughput of the migration process (number of records per unit time). This testing will be used to verify that the planned downtime is sufficient. For planning purposes, also consider the time needed to verify that the migration process completed successfully.
- Compare migrated records to records generated by the destination system: ensure that migrated records are complete and of the appropriate context.
- Summary verification: several techniques provide summary information, including record counts and checksums. Here, the number of records migrated is compiled from the destination system and then compared to the number of records in the source (a hedged sketch follows below).
- Compare migrated records to sources: tests should verify that field values are migrated as per the migration specification. In short, source values and the field-level mappings are used to calculate the expected results at the destination. This testing can be completed using sampling if appropriate, or, if the migration includes data that poses significant business or compliance risk, 100% of the migrated data can be verified using an automated testing tool.
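A minimal sketch of such summary verification, assuming both the legacy extract and the target extract are reachable from Python through standard DB-API drivers; the connection details, table name, and the choice of closing_balance as the checksum column are assumptions for illustration.

```python
import sqlite3  # stand-in driver; a real run would use the source/target systems' own drivers

def summarize(conn, table, amount_column):
    """Return (record count, sum of a key financial column) for one table."""
    count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    total = conn.execute(f"SELECT COALESCE(SUM({amount_column}), 0) FROM {table}").fetchone()[0]
    return count, total

source = sqlite3.connect("legacy_extract.db")   # assumed staging copy of legacy data
target = sqlite3.connect("abacus_extract.db")   # assumed extract from the new system

src_count, src_sum = summarize(source, "loan_accounts", "closing_balance")
tgt_count, tgt_sum = summarize(target, "loan_accounts", "closing_balance")

print(f"record counts: source={src_count}, target={tgt_count}, match={src_count == tgt_count}")
print(f"closing balance totals: source={src_sum}, target={tgt_sum}, match={src_sum == tgt_sum}")
```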

- Input bad data: attempt to violate the validation rules using boundary value analysis, equivalence partitioning, and error guessing (see the sketch after this list). The validation at the user interface level should be in accordance with the validation at the database level. If no validation has been enforced, or the validation between the database and the user interface does not match, then this should be treated as a defect.
- Bypass mandatory data: attempt to navigate to further stages of a use case within the system before all data requirements have been fulfilled.
- Drag and drop: dragging and dropping data and even images may force bad error handling in the application, which may lead to incomplete records.
- Checking data locks: where data is being written, it should not be possible for multiple users to access the same new record within the database. Typically this level of validation is managed by the underlying database if it is provided by a third party, so the risk may be minimal, but this testing should be carried out where possible.
- Conversions testing: ensure that the data, when recalled from the database(s), is displayed correctly to the user. This will most likely be fulfilled within the coverage of business cases, use cases, and user stories; however, it is worth ensuring that all data is utilized to be certain that it will be called and displayed correctly.
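To make the "input bad data" idea concrete, here is a small Python sketch that derives boundary-value and equivalence-class test inputs for a single numeric field. The field and its 1 to 1,000,000 limits are assumptions made only for illustration.

```python
def boundary_values(minimum, maximum):
    """Classic boundary value analysis: values at, just inside, and just outside the limits."""
    return [minimum - 1, minimum, minimum + 1, maximum - 1, maximum, maximum + 1]

# Assumed rule: loan principal must be between 1 and 1,000,000 (illustrative limits)
LOAN_MIN, LOAN_MAX = 1, 1_000_000

test_inputs = boundary_values(LOAN_MIN, LOAN_MAX) + [
    0,            # equivalence class: below range
    500_000,      # equivalence class: valid mid-range value
    2_000_000,    # equivalence class: above range
    -1,           # error guessing: negative amount
]

for value in test_inputs:
    expected = "accept" if LOAN_MIN <= value <= LOAN_MAX else "reject"
    print(f"principal={value}: expected {expected}")
```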
Outside of the actual data migration tasks there are a number of non-functional test
approaches that should be considered when planning the testing. These are as follows:

- Security testing: is access to the data restricted accordingly?
- Performance testing: can the data be utilized and accessed in accordance with relevant KPIs?
- Usability of data: is the user able to utilize the data effectively? Do they have to input the same data multiple times?

Data migration is a technique which, due to its complexity, necessitates specific, careful testing at all stages of the process. Not doing so can result in improper migration (incomplete or corrupt data) or a poor user experience, which will be damaging to both the system's reputation and its inherent usability.

Key Outputs:
- Test results
- Defect report
- No Sev1 defects and an agreed number of open Sev2 defects
- UAT preparedness activities completed

User Acceptance Testing (UAT)

In this phase the UAT team is involved in testing and ensuring the following:
- Validate that the migration plan satisfies the business and technical requirements identified for the migration.
- Find and resolve potential risks to the production environment.
- Give attention to reporting, downstream feeds, and other system processes that rely on migrated data.

Key inputs:
- QA testing is complete and meets exit criteria
- No Sev1 defects in OPEN state
- UAT test cases have been written and signed off by the business SME
- UAT test data has been set up
- All necessary access has been granted to the business users
- Release notes/traceability matrix has been made available

Key Outputs:
- All in-scope tests for UAT have been executed at least once
- UAT test results have been recorded in QC
- No Sev1 defects and an agreed number of open Sev2 defects
- An action plan has been created for any agreed outstanding defects
To test the destination (DB) database:
1. Create a controlled version of the migrated database. Keep the database migration scripts in a source control system.
2. Design a set of test cases that can be used to test the DB database from unit to system level. The test cases should:

   a. Ensure the following:
      - All the users in the source database have been migrated successfully
      - Privileges and grants for users are correct
      - Tables have the correct structure, defaults are functioning correctly, and errors did not occur during mapping

   b. Validate that the data migrated successfully by doing the following:
      - Comparing the number of rows in the DB database with those in the source database
      - Calculating the sums of numerical columns in the DB database and comparing them with those in the source database

   c. Ensure the following (a hedged sketch of these constraint checks follows after this list):
      - Constraints: duplicate primary keys cannot be entered
      - Foreign keys prevent the entry of inconsistent data
      - Check constraints prevent the entry of invalid data

   d. Check that indexes and sequences are created successfully.

   e. Ensure that triggers, procedures, and functions are migrated successfully. Check that the correct values are returned by triggers and functions.

3. Run the test cases against the migrated database.

4. Create a report that evaluates the test case results. These reports allow evaluating the data to qualify the errors, filing problem reports, and providing a customer with a controlled version of the database.

5. If the tests pass, go to Step 7. If all tests in the test cases pass or contain acceptable errors, the test passes. If acceptable errors occur, document them in an error report that can be used for audit purposes.

6. If the test cases fail, do the following:

   a. Identify the cause of the error.

   b. Identify the test cases needed to check the errors.

   c. Log an issue against the controlled version of the migrated database code in the problem report.

   d. Add the test case and a description of the problem to your organization's incident tracking system, which could be a spreadsheet or bug reporting system. Aside from the test case, the incident log should include the following:
      - A clear, concise description of the incident encountered
      - A complete description of the environment, such as platform and source control version
      - The output of the test attached; if useful, indicate the frequency and predictability of the incident
      - The sequence of events leading to the incident
      - The effect on the current test, diagnostic steps taken, and results noted

   e. Attempt to fix the errors.

   f. Return to Step 1.

7. Identify acceptance tests that can be used to make sure the DB database is at an acceptable quality level.
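As an example of step 2c, the following Python sketch verifies that primary key, not-null, and check constraints actually reject bad rows. It uses the standard sqlite3 module as a stand-in for the real target database, and the clients table with its columns is an assumption made only for illustration.

```python
import sqlite3

# Hedged sketch: a throwaway in-memory table standing in for one migrated table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE clients (
        client_id INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        balance   REAL CHECK (balance >= 0)
    )
""")
conn.execute("INSERT INTO clients VALUES (1, 'A. Bekele', 150.0)")

def expect_rejection(description, sql):
    """A constraint test passes only if the database rejects the bad row."""
    try:
        conn.execute(sql)
        print(f"FAIL: {description} was accepted")
    except sqlite3.IntegrityError:
        print(f"PASS: {description} was rejected")

expect_rejection("duplicate primary key", "INSERT INTO clients VALUES (1, 'Duplicate', 10.0)")
expect_rejection("null mandatory name",   "INSERT INTO clients VALUES (2, NULL, 10.0)")
expect_rejection("negative balance",      "INSERT INTO clients VALUES (3, 'B. Tadesse', -5.0)")
```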

Risks Involved in the Data Migration Process and Solutions to Overcome Them

Data migration risk mitigation matrix:

1. Data loss risk
   Risk description: Data that is available in the legacy system but not available in the target system after the migration process is termed data loss. Data loss is the highest risk in data migration. The cost involved in rectifying data loss, plus the business cost incurred due to poor data, adds up to financial and reputational risk.
   Mitigation strategy: Reconciliation. There can be a two-fold reconciliation: count reconciliation and key financial column reconciliation. Comparing the number of records in the legacy system and the target system gives a fair assessment of data loss in the migration. The legacy record count and the target record count do not always have to match; there may be business rules that reject records based on certain parameters. If such rules were used, then the number of records in the legacy system should match the number of rejected records plus the record count in the target system. The tester should also validate that the reasons for rejection are as per the business rules. Key financial column reconciliation records the summation of all key financial columns (e.g. closing balance, available balance) and compares them between the legacy and target systems to identify data loss. If a mismatch is found, we can drill down to a granular level to find which record or records mismatch, and root cause analysis needs to be performed to find the reason for the data loss.

2. Data corruption and data integrity
   Risk description: When the format or content of data differs between the legacy system and the target system because of the migration process, the data is called corrupted. Anomalies, redundant or duplicated data, or the presence of non-meaningful data introduced by the migration are data integrity issues. Data corruption and data integrity issues affect business and operational efficiency and defeat the purpose of the migration.
   Mitigation strategy: Data validation. Validating each and every piece of data between the legacy and target systems is the best methodology to avoid data corruption. The following data validation methodologies are widely used:
   1. Sample data validation: involves random selection of records from the legacy system and comparison with the target system. Sampling is not defect-proof, as it selects a small number of random records; profiled samples achieve more data coverage than purely random samples.
   2. Subset data validation: instead of choosing random sample records for verification between the legacy and target systems, a subset of records is chosen based on row number, such as the first thousand records or records ten thousand to fifty thousand. The only advantage is that it selects more records and so expects more data coverage based on higher probability.
   3. Complete data set validation: the ideal validation method that we should strive for in migration testing. It compares every record bidirectionally: every record in the legacy system against the target system and every record in the target system against the legacy system. A minus/exception query is used to compare data between the systems (a hedged sketch follows this matrix). When two different database vendors are involved, such a comparison is not directly possible; to overcome this, an instance needs to be created in one database containing both the legacy and target system data.
   Factors to consider in data validation: 1. Stability of the project. 2. Data coverage. 3. Execution time. 4. Efficiency of the query/script.

3. Semantics risk
   Risk description: Sometimes a legacy column and a target column hold the same meaning but their unit of measurement is different, so the meaning of the data is completely changed. In this case data is not lost or corrupted; the migration is successful yet not useful in terms of the objective. For example, the source application fields deal with decimal points up to 2 but the target application does not apply any such constraint.
   Mitigation strategy: Real-time users and subject matter experts should be involved in the feasibility study, and such semantic issues should be detected very early in the project life cycle. The test scope should include test cases to identify inconsistencies and incompatibilities between the migrated data and the parameterization of the target application. Testers manually compare data present in the source and target applications by looking at the main screens of the applications.

4. Data quality issues
   Risk description: Data quality issues are not identified until late in the project, thus causing delays and cost overruns.
   Mitigation strategy: A data quality review will be conducted throughout each data migration release so that data quality issues may be identified early and addressed accordingly.

5. Any arising issues
   Risk description: Any issues arising during the life cycle of the data migration.
   Mitigation strategy: All issues need to be reported, documented, and resolved as soon as they arise. At a minimum, the procedure for handling a concern/risk/issue should include the following steps:
   - Identify the concern/risk/issue (documentation).
   - Register and communicate any risk/issue to the team.
   - Report the issue to the project manager.
   - Assign responsibility for analysis and resolution.
   - Analyze and determine a resolution strategy.
   - Ask for approval of the mitigation and/or solution.
   - Implement the approved solution, if possible.

6. Late identification of risks
   Risk description: Poor data quality of the source systems and a lack of documentation or inaccurate data models would be identified late in the migration cycle.
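A minimal sketch of the minus/exception comparison mentioned above, assuming both data sets have been staged into one SQLite database as legacy_loans and target_loans; the table and column names are assumptions for illustration. SQLite uses EXCEPT where Oracle uses MINUS.

```python
import sqlite3

# Hedged sketch: legacy and target extracts staged into one database,
# then compared bidirectionally with EXCEPT (SQLite's equivalent of MINUS).
conn = sqlite3.connect("staging.db")

missing_in_target = conn.execute("""
    SELECT account_id, closing_balance FROM legacy_loans
    EXCEPT
    SELECT account_id, closing_balance FROM target_loans
""").fetchall()

unexpected_in_target = conn.execute("""
    SELECT account_id, closing_balance FROM target_loans
    EXCEPT
    SELECT account_id, closing_balance FROM legacy_loans
""").fetchall()

print(f"rows present in legacy but not in target: {len(missing_in_target)}")
print(f"rows present in target but not in legacy: {len(unexpected_in_target)}")
```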

Data Migration Tasks


- Ad hoc data queries after the initial analysis is completed
- Data cleaning
- Data parsing, i.e. when a single legacy data field needs to be split into two or more fields in the new CRM (see the sketch after this list)
- Data extraction from the legacy system
- Building the import files for loading into the new system
- Multiple rounds of testing and re-loads of updated import files
- Archiving old backups and/or data not migrated
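As a small illustration of the data parsing task, here is a hedged Python sketch that splits a single legacy "full name" field into the two fields a new system might expect; the field layout and splitting rule are assumptions made for the example.

```python
def split_full_name(full_name):
    """Split a legacy single-field name into (first_name, last_name).

    Assumes the last whitespace-separated token is the surname; anything
    else (single token, extra spaces) falls back gracefully.
    """
    parts = full_name.strip().split()
    if not parts:
        return "", ""
    if len(parts) == 1:
        return parts[0], ""
    return " ".join(parts[:-1]), parts[-1]

# Example legacy values
for legacy in ["Abebe Kebede", "  Sara  T. Alemu ", "Madonna"]:
    print(legacy, "->", split_full_name(legacy))
```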

Definition
Data migration and data conversion
Data conversion refers to the process of changing data from one format to another. Many events can require a data conversion, only one of which is a data migration to a new data storage system.
The quality of the data that ends up in the new system is the measure of success when we get to the end of the data migration project, no matter whether it is a 6-step or 15-step process.

This data migration guide is a work in progress. It is a guide to data migration for BG. The goal of this guide is to provide information on tools and techniques that may be used for the data migration process.
Prior to considering data migration, it is advisable to go through the information concerning system deployment planning. In particular, a gap analysis that includes a detailed comparison of the organization and features of the new system to the existing system is important. This comparison should provide a good starting point for understanding the relationship between data in the existing system and data in the new system.

What Data Can Be Migrated?


The following are the data needed for migration.

1. List of branches
2. Roles
3. System Users
4. Fees
5. Staff loan and provident fund
6. List of products (loan and saving)
7. Active individual and loan clients
8. Active Accounts
Client (Amount of Last loan; Loan Cycle Number; Loan Cycle per Product)
Group (Group Approved Date, Amount of Last loan; Loan Cycle Number; Loan Cycle
per Product)
9. Schedule, opening balance, amount overdue.
10. Savings Accounts: Mandatory/Voluntary savings, deposit schedule, opening balance,
deposits overdue (if mandatory savings)
11. Loan Accounts (Amount of Last Loan; Loan Cycle per Product)
12. Savings Accounts: Date Account Opened; Total Deposits; Total Withdrawals; Total interest
earned; Missed Deposits
Data that will be ready for migration is in Excel format. This format is obtained by performing transformations on the SQL data. It will be amended once the Abacus file format is known.

The main task of the data migration project will be the transformation of cleaned data from the existing system into a format compatible with the new system (Abacus).

Data Migration Process


The basic processes that will be followed during data migration are listed below (a sketch of this pipeline follows the list):

- Extract data from the existing system
- Transform and clean the data
- Generate CSV files from the SQL data
- Load the CSV files into the new Abacus system
- Validate that the data in Abacus is correct
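A minimal end-to-end sketch of the extract, transform, and CSV-generation steps in Python, assuming the legacy data has been staged into a SQLite file and that the CSV layout shown is only a placeholder until the Abacus import format is confirmed; all table, column, and file names are assumptions.

```python
import csv
import sqlite3

# 1. Extract: read client rows from a staged copy of the legacy SQL data.
source = sqlite3.connect("legacy_extract.db")
rows = source.execute(
    "SELECT client_id, client_name, branch_code, opening_balance FROM clients"
).fetchall()

# 2. Transform and clean: trim names, normalize branch codes, drop empty records.
cleaned = []
for client_id, name, branch, balance in rows:
    name = (name or "").strip()
    if not name:
        continue  # skip records that fail the mandatory-name rule
    cleaned.append((client_id, name, (branch or "").upper(), round(balance or 0.0, 2)))

# 3. Generate a CSV file for loading into the new system (placeholder layout).
with open("clients_import.csv", "w", newline="", encoding="utf-8") as handle:
    writer = csv.writer(handle)
    writer.writerow(["client_id", "client_name", "branch_code", "opening_balance"])
    writer.writerows(cleaned)

print(f"wrote {len(cleaned)} of {len(rows)} extracted records to clients_import.csv")
```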

Tools for Data Migration


The approach preferred for a given data migration project depends on the tools and development environments most familiar to the Abacus system specialists working on the project.
As more data migration work is completed, there will be a larger body of experience from which we can draw lessons.
Possible data migration approaches include:

- Extracting data into a transitional (intermediate) database
- Performing transformations on the SQL data and then writing it out into CSV format (the approach we prefer)
- Using an ETL (Extract, Transform, and Load) tool to do the extraction, transformation, and generation of CSV data

There are a number of open source ETL tools available today. The following tools are among the best known:

- Enhydra Octopus
- Clover.ETL
- KETL
- Kettle

To date, BG specialists have not yet selected any particular tools for the data migration project.
Extraction Tools and Techniques

The form of the source data determines the type of methods we use for extracting data from the source. If the source data is already in a database, then extraction may just be a matter of copying data into a new set of tables to use for transformation, cleaning, and CSV file generation. ETL tools can be useful for this kind of activity.
Transformation Tools and Techniques

The choice of tools for transformation and cleaning of data will depend on what intermediate form is chosen for the data. ETL tools may provide a convenient way of performing transformations; doing the transformation in SQL is another option; or programming transformations directly in your programming language of choice may work best.

MDEX File Generation Tools and Techniques

Since MDEX files are just XML files, any code or tool that can generate XML files can be used to generate MDEX files. Some ETL tools include components for mapping data into XML. There are also tools available that can generate code for reading and writing XML files that follow a given XML schema; these tools ensure that valid XML is generated. Such tools and libraries exist for most programming languages.
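As a simple illustration, even standard-library code can produce such XML. The sketch below uses Python's xml.etree.ElementTree; the element and attribute names are invented placeholders, not the actual MDEX schema, which would need to be taken from the Mifos documentation.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: build a small XML document from migrated rows.
# Element and attribute names here are placeholders, not the real MDEX schema.
clients = [
    {"id": "1001", "name": "Abebe Kebede", "branch": "BR01"},
    {"id": "1002", "name": "Sara Alemu",   "branch": "BR02"},
]

root = ET.Element("clients")
for row in clients:
    client = ET.SubElement(root, "client", id=row["id"])
    ET.SubElement(client, "name").text = row["name"]
    ET.SubElement(client, "branch").text = row["branch"]

ET.ElementTree(root).write("clients.xml", encoding="utf-8", xml_declaration=True)
print(open("clients.xml", encoding="utf-8").read())
```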

The Data Migration Process

For a given MFI, data migration is likely to be a process that will need to be repeated more than once. For example, after an initial data migration, both the new BG system and the legacy system are likely to be run in parallel for some length of time to ensure that the newly migrated BG installation is correct. To support multiple migrations, it is suggested that the data migration process be automated as much as possible.
Data Migration Experiences

As data migration projects are completed, feedback from each project can be assembled and
presented here to share experiences, offer suggestions, and provide tips and help for MFIs
working on their own data migration projects.

Owner of the data migration endeavor


Who is the primary owner of the data migration endeavor?

Most of the time, it is assumed that data migration is entirely the work of IT people. The most visible activities and processes lie with the data migration team, and so issues are assumed to belong to the data migration team. But most requirements are set and validated by the business team. It must be clearly understood that any data migration without an active role for the business team has a high rate of failure.
