Applications Services Blog
Get the latest thought leadership and information about the role of Applications Services in an increasingly interconnected world at the HP Blog Hub.

Microsoft Dynamics CRM Enterprise Data Migrations – Part 2


Many of our enterprise customers have one or more legacy systems containing valuable historical data. This data is vital for identifying historical trends and patterns, so it's a logical requirement to migrate it into the new CRM system.


A data migration, however, can be a challenging, complex, and risky operation. This blog is the second in a two-part series listing the key points to consider when developing, testing, and planning a data migration.


Part one can be found here: Microsoft Dynamics CRM enterprise data migrations, part 1 – gathering requirements


Technology to use

The first step in the development process is deciding which technology to use for the data migration. There are many technologies out there that can help you get started. Microsoft Dynamics CRM provides out-of-the-box capabilities to migrate data (e.g. Excel Import), but enterprise clients will quickly reach the limits of these tools.


Instead, we tend to use SQL Server Integration Services (SSIS) with a connector such as Cozyroc, or the third-party product Scribe, with a bit of custom development on the side. These products offer numerous features and are highly recommended.


The decision comes back to reviewing the detailed requirements and choosing the best match. The main things to consider are:

  • The source (legacy) systems' technologies
  • The complexity
  • Whether the data has to be imported in a certain order
  • The number of steps
  • Future integration requirements


After you have decided on the technology, the next step is to evaluate or set up a development environment. I strongly advise a multi-core development environment with connectivity to the source data and target system(s). If not, you will lose time doing numerous builds from development to test.


After the environment is in place, you can start the development using the extensive mapping document from the analysis phase (see my previous blog post). A data migration has to be built step by step, in small pieces. Develop it so that it can run on multiple cores simultaneously while respecting dependencies (i.e. make sure action A is finished before starting action B on a separate core). Also implement an advanced logging mechanism which writes exactly what happened, and when, to a file or database table.
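As a minimal illustration (in Python, with entirely hypothetical step and batch names — real projects would typically do this inside SSIS or Scribe), the "finish step A before starting step B" rule and the logging mechanism could be sketched like this:

```python
import logging
from concurrent.futures import ThreadPoolExecutor

# Log exactly what happened, and when, to a file.
logging.basicConfig(
    filename="migration.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

def migrate_batch(step, batch):
    """Migrate one batch of records, logging start and end."""
    logging.info("step=%s start batch=%s size=%d",
                 step, batch["id"], len(batch["records"]))
    # ... transform and load the records here ...
    logging.info("step=%s done batch=%s", step, batch["id"])
    return len(batch["records"])

def run_step(step, batches, workers=4):
    """Run one migration step across multiple cores/threads.

    The function only returns once every batch has finished, which
    guarantees step A is fully complete before step B starts.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda b: migrate_batch(step, b), batches))

# Hypothetical batches: parent records first, then children.
accounts = [{"id": i, "records": [f"acct-{i}"]} for i in range(3)]
contacts = [{"id": i, "records": [f"cont-{i}"]} for i in range(3)]

run_step("accounts", accounts)   # parents first
run_step("contacts", contacts)   # then children that reference them
```

The key design point is the barrier between steps: parallelism happens *within* a step, while the ordering between dependent steps stays strictly sequential.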


When you build the data migration, create a unique link between the legacy records and the new records by carrying over a unique key from the source system. This can be a unique code or record number in a format of your choice. With this approach you will be in a much better position if you need to perform additional steps later on.
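A tiny Python sketch of this idea, assuming a hypothetical custom field `legacy_id` on the target entity that stores the source system's unique key:

```python
def to_target_record(legacy_row):
    """Map a legacy row to a target record, carrying the legacy key along.

    'legacy_id' is a hypothetical custom field on the target entity that
    stores the unique key from the source system.
    """
    return {
        "legacy_id": f"{legacy_row['system']}-{legacy_row['record_no']}",
        "name": legacy_row["name"],
    }

legacy_rows = [
    {"system": "ERP", "record_no": 1001, "name": "Acme Corp"},
    {"system": "ERP", "record_no": 1002, "name": "Globex"},
]
migrated = [to_target_record(r) for r in legacy_rows]

# Later steps (fixes, delta loads, relinking) can find records by legacy key.
by_legacy_id = {rec["legacy_id"]: rec for rec in migrated}
```

Any later correction run can then locate exactly the records it needs to touch instead of guessing by name or date.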


You may also need to create temporary mapping tables. For example, country mapping: the code value in system A is 'USA' for the United States, while in system B it's 'US'. Temporary mapping tables let you automate the lookups and conversions between these codes.
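In practice this would usually live in a database table; as an illustration only, here is the same lookup in Python, with a hypothetical mapping and a collector for codes that are not yet mapped:

```python
# Hypothetical mapping table: legacy country codes -> target system codes.
COUNTRY_MAP = {
    "USA": "US",
    "GBR": "GB",
    "NLD": "NL",
}

def map_country(legacy_code, unmapped=None):
    """Look up a legacy code; collect unknown codes instead of failing,
    so the gaps can be reviewed and added to the mapping table."""
    if legacy_code in COUNTRY_MAP:
        return COUNTRY_MAP[legacy_code]
    if unmapped is not None:
        unmapped.append(legacy_code)
    return None

unknown = []
codes = [map_country(c, unknown) for c in ["USA", "NLD", "XXX"]]
```

Collecting unmapped values rather than failing hard means one test run surfaces every missing mapping at once.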


Testing a data migration

Testing is a major success factor in data migrations. I strongly advise that testers get involved as early as possible in a data migration project. The faster issues are identified, the cheaper they are to solve. From experience, I can tell you that something always goes wrong.


When it comes to testing a data migration, we divide it into two categories: manual and automated tests.


For manual tests, we migrate an initial predefined sample set of data which combines as many different record types as possible from the legacy system. We then perform initial sanity testing on the values of specific fields to verify there are no strange values, blanks, or errors.


The second category is automated tests: we migrate a larger set of data and then develop tests which automatically look for errors by checking data formats, mapping conversions, and/or specific items.
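Such automated checks can be sketched in a few lines of Python (field names, valid country codes, and the email rule here are all hypothetical examples):

```python
import re

VALID_COUNTRIES = {"US", "GB", "NL"}  # hypothetical target code set

def validate_record(rec):
    """Return a list of problems found in one migrated record."""
    problems = []
    if not rec.get("name", "").strip():
        problems.append("name is blank")
    if rec.get("country") not in VALID_COUNTRIES:
        problems.append(f"unexpected country code: {rec.get('country')}")
    if rec.get("email") and not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+",
                                             rec["email"]):
        problems.append(f"malformed email: {rec['email']}")
    return problems

def validate_all(records):
    """Map each record's legacy key to its problems (clean records omitted)."""
    return {rec["legacy_id"]: probs
            for rec in records
            if (probs := validate_record(rec))}

sample = [
    {"legacy_id": "ERP-1001", "name": "Acme",
     "country": "US", "email": "info@acme.com"},
    {"legacy_id": "ERP-1002", "name": "",
     "country": "USA", "email": "broken@"},
]
report = validate_all(sample)
```

Because the report is keyed by the legacy key discussed earlier, every failure can be traced straight back to its source record.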


While we migrate this larger set of data, we also aim to establish a baseline for how long it takes to migrate a fixed number of records (e.g. 1,000). Using some basic calculations, we can then estimate the total duration, keeping in mind the hardware differences between your environments.
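The basic calculation is simple extrapolation; a sketch, with made-up sample numbers and a rough correction factor for hardware differences:

```python
def estimate_total_hours(baseline_seconds, baseline_records, total_records,
                         hardware_factor=1.0):
    """Extrapolate a full-migration duration from a timed sample run.

    hardware_factor adjusts for the difference between the test and
    production environments (e.g. 0.5 if production is twice as fast).
    """
    per_record = baseline_seconds / baseline_records
    return per_record * total_records * hardware_factor / 3600

# Hypothetical timed sample: 1,000 records took 120 seconds on the test
# server; production holds 2.5 million records and is assumed twice as fast.
hours = estimate_total_hours(120, 1_000, 2_500_000, hardware_factor=0.5)
# -> roughly 41.7 hours
```

Even a rough number like this tells you early whether the migration fits inside the planned cutover window, or whether you need more cores or bigger hardware.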


At this point, you should run the data migration across as many cores as possible to decrease the final run time. If the full migration will still take a very long time, consider (temporarily) scaling up your production environment. We've done projects where the environment was hosted on cloud technologies and we temporarily tripled the capacity of our production server during the data migration.



Planning the production migration

The final step is to produce a detailed plan with clear roles, responsibilities, and timings for when and how the actual production migration will happen.


You have to agree on a cutoff date at which the data from the legacy system will be moved, decide what happens while the data migration is in progress, and plan how to handle records created in the legacy system after the point of migration.


I strongly advise creating a production migration checklist that lists every step, both automated and manual.


Monitoring what is going on during the data migration is a very important factor as well. Check the log files at set times to ensure things are still moving forward as planned. I've seen projects where we had to migrate millions of records and had run extensive tests with more than ten different data sets to ensure nothing went wrong … and unexpected things still happened during the actual load.
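If the logging mechanism writes one line per completed batch, those periodic checks can be as simple as counting completion markers (the log format below is a hypothetical example, not a real product's format):

```python
def count_done(lines):
    """Count 'done' markers in migration log lines — a quick progress
    check to run at set times during the load."""
    return sum(1 for line in lines if " done batch=" in line)

# Hypothetical excerpt from a migration log file.
log_sample = [
    "2024-05-01 01:00:02 INFO step=accounts start batch=0 size=500",
    "2024-05-01 01:03:41 INFO step=accounts done batch=0",
    "2024-05-01 01:03:42 INFO step=accounts start batch=1 size=500",
]
progress = count_done(log_sample)
```

Comparing this count against the planned batch total at each checkpoint shows immediately whether the run is on schedule.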


Closing tip

One closing tip: verify whether any automated processes (workflows and plugins) are running on the target system that are not required for legacy data. Disabling them during the migration can massively reduce the time it takes to migrate your data.



Thank you for reading the second part of this blog post. I hope you gained some insight into data migrations and their importance in a successful CRM implementation. The key objectives should be to set realistic expectations, perform extensive analysis of the source and target systems, define a clear scope, and keep aiming for win-win scenarios.


To find out more:

HP Enterprise Applications Services for Microsoft Dynamics CRM



Ashish Jha | ‎05-27-2014 10:30 AM


Many CRM users have been looking to move to Dynamics CRM 2013 from previous versions. This new release arrives after two years with a host of changes and new features that work well for different user sets, and it also solves many issues that were prevalent in Dynamics CRM 2011.

There are many documented procedures for a successful migration from CRM 2011 to Dynamics CRM 2013, but here are some key points to remember.

Once the upgrade is over, the server cannot be rolled back to Dynamics CRM 2011.
Upgrading from CRM 2011 to Dynamics CRM 2013 is supported from either RU6 or RU14; it is recommended to upgrade all deployments to RU14 before upgrading to Dynamics CRM 2013.

Read More at: Nalashaa Solutions

About the Author
Over the past 5 years, I have participated in the different stages of delivering Microsoft Enterprise Applications to customers as large as ...
