In this post, I am going to share my personal experience of migrating Team Foundation Server to Visual Studio Team Services using the Database Import Service (TFS Migrator). The TFS Database Import Service, also known as the Import Service for short, provides a high-fidelity way to migrate collection databases from TFS to VSTS. If you are looking to use this service to import your collection(s), Microsoft provides a migration guide which I strongly recommend you use as a walkthrough during your migration (you can download it from the following link).
There are six phases in the migration timeline, shown in the following image. We will go through all of them.
Six phases of the Migration process of TFS to VSTS
1. Get started
In this step, let us first understand why we want to migrate from Team Foundation Server to Visual Studio Team Services.
2. Prerequisites
- The main task for this second step is to make sure that we have a working Azure Active Directory that will be used to authenticate the team members in our VSTS account.
- We cannot migrate to an existing VSTS account: the tool always creates a new one, so if you want to import into an existing account, this tool is not for you.
- VSTS supports two different process models: Inherited and Hosted XML.
- Each Team Project Collection is mapped to a separate VSTS account. Microsoft plans to provide, at a later date, an entity under which multiple VSTS accounts can be grouped.
- Always download the latest version of the TFS Migrator tool.
3. Upgrade TFS to the latest version
- I upgraded from TFS 2015 to TFS 2018 (the Import Service only supports recent TFS versions, on a six-month support window).
4. Decide and validate
- If you have any customizations, you need to decide whether you want to keep them or not.
- Note that some customizations are not supported. If you want to keep any of the supported customizations, you will need to use the Hosted XML process model. Also note that moving from Hosted XML to the Inherited process model is not planned before 2019.
- For my migration, I used the Inherited process model: first I removed all customizations, then checked the log and fixed all errors, and then validated again until no errors appeared.
To start the TFS Migrator validation, open a command line on the application tier, navigate to the tool directory, and type the following:
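A minimal invocation looks like this (the collection URL is an example; substitute your own):

```shell
TfsMigrator validate /collection:http://localhost:8080/tfs/DefaultCollection
```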
Now we should review the validation warnings and errors. I got the messages shown in the image below:
As we can see in the image, I have one error: an existing table with a size of 45 GB, which is above the recommended size of 20 GB, meaning that I am not able to use the DACPAC import method.
Here is a list of some of the customizations that I removed during my migration:
- List link types (listlinktypes)
- Delete link type (deletelinktype)
- Change work item field (changefield)
- Destroy work item definitions (destroywitd)
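These correspond to witadmin commands run against the collection; for example (the collection URL, project name, and work item type name below are placeholders):

```shell
witadmin listlinktypes /collection:http://localhost:8080/tfs/DefaultCollection
witadmin destroywitd /collection:http://localhost:8080/tfs/DefaultCollection /p:MyProject /n:MyCustomType
```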
5. Prepare
- Once the validation passes with no errors, we are ready to prepare the import.
- We need to run the prepare command in order to generate the import settings and related files:
Please note that running this command requires an Azure Active Directory tenant. The prepare command contacts your tenant, so it will prompt you to log in as a user with permission to read information about all of the users in that tenant.
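For reference, the prepare invocation looks roughly like this (the collection URL, tenant domain name, and region are placeholders; check the migration guide for the exact parameters of your tool version):

```shell
TfsMigrator prepare /collection:http://localhost:8080/tfs/DefaultCollection /tenantDomainName:contoso.onmicrosoft.com /region:CUS
```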
- It generates the default configuration file (settings.json), which is the input to the Import Service.
- In my case I have two collections. For one of them I used a DACPAC; the second one has the table exceeding 45 GB, so I had to use an Azure VM running SQL Server.
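For illustration, the generated settings file has roughly this shape; the values below are placeholders and the exact schema may differ between tool versions, so treat this as a sketch rather than the definitive format:

```json
{
  "Source": {
    "Location": "<SAS URL of the storage container, or SQL connection string>",
    "Files": {
      "Dacpac": "Tfs_DefaultCollection.dacpac"
    }
  },
  "Target": {
    "Name": "<desired VSTS account name>"
  },
  "Properties": {
    "ImportType": "DryRun"
  }
}
```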
5.1. Prepare – First collection (DACPAC)
- Prepare Azure storage container for the dacpac
- Detach collection
- Create the dacpac file from the collection using the following command:
- Upload (Azure Copy) the dacpac file to the Azure storage container
- Change the import setting in the configuration file (settings.json) to point to the URL of the storage container, and you are ready to go
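The DACPAC extraction and upload in the steps above can be sketched as follows; the database name, file paths, storage account, container, and key are placeholders for your own values:

```shell
REM Extract the detached collection database into a DACPAC (run on the SQL tier)
SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;Initial Catalog=Tfs_DefaultCollection;Integrated Security=True" /targetFile:C:\dacpac\Tfs_DefaultCollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory

REM Upload the DACPAC to the Azure storage container with AzCopy
AzCopy /Source:C:\dacpac /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /DestKey:<storage-key> /Pattern:Tfs_DefaultCollection.dacpac
```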
5.2. Prepare – Second collection (Azure VM)
- Prepare the Azure VM with SQL Server
- Detach collections
- Backup the DB from the collection
- Upload (using the Azure Copy) the DB backup file to the Azure storage container
- Download DB backup to the SQL VM and restore it (using Download manager)
- Change the configuration file (generated in the prepare step) to point to the VM's IP address and the SQL connection string, and you are ready to go
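The backup and restore steps above can be sketched as sqlcmd one-liners; the database name and file paths are placeholders:

```shell
REM On the on-premises SQL tier: back up the detached collection database
sqlcmd -S localhost -Q "BACKUP DATABASE [Tfs_SecondCollection] TO DISK = N'C:\backup\Tfs_SecondCollection.bak' WITH COMPRESSION"

REM On the Azure VM, after downloading the backup file: restore the database
sqlcmd -S localhost -Q "RESTORE DATABASE [Tfs_SecondCollection] FROM DISK = N'C:\backup\Tfs_SecondCollection.bak' WITH RECOVERY"
```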
6.1. Import – Dry-run
Up to now, I have an Azure storage container with the DACPAC file and a database backup restored on a SQL Server virtual machine (these are my sources), plus one tool settings file that holds all the information needed for the migration.
The next step is to run the tool with the import as a dry run (trial migration). Run the TfsMigrator import command and pass the path to the configuration file which contains all the parameters for the migration, as follows:
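For example (the path to the settings file is a placeholder for wherever you generated it):

```shell
TfsMigrator import /importFile:C:\TFSMigrator\settings.json
```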
Next, we are going to see the progress of the import, as shown in the image below:
Next on the screen we see a warning with the following message:
This dry run account will expire and be deleted shortly after 7:20 PM on 2/1/2018. To continue testing beyond this date you will need to repeat the dry run import.
6.2. Import – Production
- Delete dry-run
- Change the import type in the configuration file from dry-run to production
- Re-run the import in the same way
- Rename your account with the reserved name
Here is a quick overview of all the steps of the migration process. The real migration took about twelve continuous hours.
- Install and upgrade to TFS 2018
- Remove all customizations from all the projects
- Validate that the process is ready
- Generate the migration settings files (prepare)
- Detach collections
- Create a DACPAC of the 1st collection
- Take a backup of the 2nd collection
- Azure Copy the dacpac to the Azure storage container
- Azure Copy the DB backup to the Azure storage container
- Download DB backup to SQL VM and restore it
- Change the import setting and run the import for both collections
Your team is now on VSTS, will receive updates frequently, and will have many opportunities to adopt tools and processes that help it build quality software more effectively.
If you would like a walkthrough of all the steps, see the video below.