First Look: High Speed Migration to SharePoint Online


Steve Ballmer once said, “Developers, developers, developers.” As a Strategic Consultant for AvePoint, I now say, “Migrations, migrations, migrations.” Migration projects make up the majority of the efforts I am assigned to, and one trend I have noticed in my role is an uptick in migrations from various content sources to Office 365. I was lucky enough to attend the Ignite Conference in Chicago recently, where I learned a bit about the new migration option being made available to Office 365 customers. The session I’m referring to is BRK3153 – Migration to SharePoint Online Best Practices and New API Investments, presented by Simon Bourdages, Michael Jordan (not #23), and Joe Newell.


Part of this session covered migration best practices regardless of source or destination, as illustrated by this “Migrating the Collaboration Triangle” slide, which breaks down the different kinds of content involved in a migration and how difficult each is to move between environments.



The juiciest stuff comes later in the session, when the presenters discuss the new APIs that have recently gone into open beta, with the overview below describing the structure of this high speed migration.



With the new APIs offered by Microsoft, customers can offload the processing of migration data to a backend service managed by Microsoft. The big draw is that this backend is not constrained by the usual Office 365 bandwidth limits and resources. Packages uploaded via the PowerShell cmdlets or by third-party tools are delivered and queued in Azure storage, then processed in the background.


The downside is that there is now an intermediary location in Azure that holds all of the migrated data while it is queued for processing by the Migration Service, which can present the following issues for organizations:

  • Responsibility for Managing the Azure Storage Location
  • Security of Files at Rest in Azure Storage
  • Cost Burden for Storage/Processing of Data during Migration
  • Bandwidth available to Transfer to Azure Storage

Those concerns aside, this provides a new avenue for moving large amounts of data from on-premises to Office 365 without impacting your own network, especially when you consider that you can now ship Microsoft your data on a hard drive, have it loaded into the Azure service for you, and then have it processed on VMs located close to whatever zone your service is in. This is the approach I took with my test of the preview Migration Service: I spun up a VM in Azure in the US East zone, which happens to be the same zone where (I hope) my Office 365 instance is located.


I started with an Office 365 Developer account, which essentially provides a single Enterprise license for a tenant, and I planned to target my OneDrive for Business in order to take advantage of its larger size limits.


I also started with a brand new Azure trial registered to my enterprise account in order to keep things as simple as possible.

Step 1) Provisioning a Migration Machine & Storage

I used the Azure compute resources to create a simple VM in the East US 2 zone, as you can see in the screenshot. I chose the standard pricing tier since the performance of the VM was not likely to be the limiting factor in this test; your results may vary depending on the circumstances of the migration.


I also created a standard-tier storage account in the same zone as my virtual machine to provide the blob storage for the files. In addition, when you set up a storage account, a queue service is provisioned at the same time. Queues are a really great way to communicate simple information between machines, or in this case, between the Migration Service and myself.




Step 2) Get some Content

Freshly back from Ignite, I had a serious amount of content from the conference to catch up on; more often than not, my session time slots had been booked in triplicate or more. Luckily, Vlad Catrinescu from Absolute SharePoint put together a script to grab all the sessions and their slides from Channel 9 for local viewing, so I used it to download the Ignite videos as a large sample of content that could be sent to Azure. The script to download the sessions can be found here: All The Videos and Slides


Executing the script in a folder will download several hundred files to your machine, starting with the PowerPoint decks for all of the available sessions.


As you can see in the screenshot, I got a fair amount of data throughput right from the start in my Azure VM, sustaining about 200 Mbps during the download.



In the end, I wound up with about 26 GB of files and folders, all downloaded from the Channel 9 Ignite stream.



Step 3) Preparing the Environment

In order to package these files so that the Office 365 Migration Service understands their structure, you need to run several PowerShell commands against them to create a manifest that complies with what the service expects. The first step is to get the SharePoint Online cmdlets installed on your machine. To get the preview installation for the Migration Service, you must join the open beta for the service that was announced at Ignite.

    Step 1) Go to this site:

    Step 2) Sign in with your Microsoft Account

    Step 3) Accept the Terms and Conditions to nominate yourself for and join the program.

    Step 4) Visit the downloads section of the program to access the Installation Files as well as the preview documentation

Once you have the installation files from the program site, run the executable on the migration machine and follow the wizard to complete the install.



After the installation of the management shell completes, you need to create at least one file share from which the tooling will access the files. In this case I chose to make two file shares, one for the data and one for the packages, so I could get an idea of the package composition before uploading the data to Azure.
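If you prefer scripting the shares rather than creating them through Explorer, they can be set up with the built-in SmbShare module. This is just a sketch; the folder paths and the account name below are placeholders, not values from my environment:

```powershell
# Create the local folders for the data and the packages.
# Paths and the grantee account are illustrative placeholders.
New-Item -ItemType Directory -Path "D:\Ignite", "D:\MigrationPackage" -Force

# Share the data folder and the package folder so the tooling
# can reach them via UNC paths like \\hsmigrationtest\Ignite.
New-SmbShare -Name "Ignite" -Path "D:\Ignite" -FullAccess "hsmigrationtest\Administrator"
New-SmbShare -Name "MigrationPackage" -Path "D:\MigrationPackage" -FullAccess "hsmigrationtest\Administrator"
```

Requires an elevated session on the migration VM; the SmbShare module ships with Windows Server 2012 and later.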


Step 4) Packaging, Processing, and Submitting the Migration Job

At this point we have all the plumbing necessary to start processing our files for migration, and we’ll need to execute four PowerShell commands to prepare and submit the files to the cloud service.

The cmdlets we need to run are:

  • New-SPOMigrationPackage -> Create a Migration package
  • ConvertTo-SPOMigrationTargetedPackage -> Target this migration package for a specific site
  • Set-SPOMigrationPackageAzureSource -> Upload this migration package to our storage account
  • Submit-SPOMigrationJob -> Tell the Migration Service the package is ready for processing
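Under the assumptions of this walkthrough, the four steps above chain together roughly like this. Every URL, share path, container name, and key below is a placeholder rather than a working value, and the exact parameter sets may change in later builds of the preview cmdlets:

```powershell
# Rough end-to-end sketch of the four migration cmdlets.
# All URLs, UNC paths, and the storage key are placeholders.
$creds = Get-Credential   # Office 365 credentials

# 1) Build the package manifest from the source files.
$pkg = New-SPOMigrationPackage -SourceFilesPath "\\server\data" `
    -OutputPackagePath "\\server\package" -NoADLookup

# 2) Target the package at a specific site, library, and subfolder.
$tpkg = ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath "\\server\data" `
    -SourcePackagePath "\\server\package" -OutputPackagePath "\\server\package\output" `
    -TargetWebUrl "https://tenant-my.sharepoint.com/personal/user" `
    -TargetDocumentLibraryPath "Documents" -Credentials $creds

# 3) Upload the files and the targeted package to Azure blob storage,
#    capturing the result object for the final step.
$uploadresult = Set-SPOMigrationPackageAzureSource -SourceFilesPath "\\server\data" `
    -SourcePackagePath "\\server\package\output" -FileContainerName "migrationfiles" `
    -PackageContainerName "packagefiles" -AzureQueueName "migrationqueue" `
    -AccountName "storageaccount" -AccountKey "<storage key>"

# 4) Queue the job with the Migration Service.
$jobresult = Submit-SPOMigrationJob -TargetWebUrl "https://tenant-my.sharepoint.com/personal/user" `
    -MigrationPackageAzureLocations $uploadresult -Credentials $creds
```

Each of these commands is walked through individually, parameter by parameter, in the rest of this step.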

To do so, find the SharePoint Online Migration Shell in the Start Menu and launch it:

The first command I’m going to run gets and stores the credentials used to authenticate to Office 365:

    $creds = Get-Credential



In this case I am executing:


$pkg = New-SPOMigrationPackage -SourceFilesPath "\\hsmigrationtest\ignite\" -OutputPackagePath "\\hsmigrationtest\migrationpackage" -TargetWebUrl "" -TargetDocumentLibraryPath "Documents" -TargetDocumentLibrarySubFolderPath "Ignite" -NoADLookup


Command: New-SPOMigrationPackage

  • -SourceFilesPath: The file share location of the files I am migrating to SharePoint Online.
  • -OutputPackagePath: The location, also a file share, where the tool will generate the package files, including the manifest for the files.
  • -TargetWebUrl: The site where the files will be stored. In my case it’s my OneDrive for Business; however, it could be any site in a SharePoint environment.
  • -TargetDocumentLibraryPath: The document library name in SharePoint where you want to deploy the files. There may be further limitations on this that I haven’t tested, however.
  • -TargetDocumentLibrarySubFolderPath: The top-level folder in my document library where the files will be migrated.
  • -NoADLookup: This means I won’t be taking any security or user information along with me. It’s faster, and necessary since I don’t have a synchronized Active Directory.


You can see here that the tool iterates through the file structure to create a manifest of all the files in the specified directory.


Finally, we can see that the migration package has been created in the file share specified in the command above.


A closer look below at the file structure of the package. It appears some additional areas are supported for migration, though probably not in this first public release of the PowerShell cmdlets.


The next step is to create a targeted package so the Office 365 Migration Service knows where the files will be stored in SharePoint Online.


This is the command I am executing:

    $tpkg = ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath "\\hsmigrationtest\Ignite" -SourcePackagePath "\\hsmigrationtest\migrationpackage" -OutputPackagePath "\\hsmigrationtest\migrationpackage\output" -TargetWebUrl "" -TargetDocumentLibraryPath "Documents" -TargetDocumentLibrarySubFolderPath "Ignite" -Credentials $creds



Command: ConvertTo-SPOMigrationTargetedPackage

  • -SourceFilesPath: The file share location of the files I am migrating to SharePoint Online.
  • -SourcePackagePath: The location, also a file share, where the tool will find the package files, including the manifest. These files will be rewritten once they’re targeted with the migration details.
  • -OutputPackagePath: The file share output location for the package once it has been targeted for SharePoint Online.
  • -TargetWebUrl: The site where the files will be stored. In my case it’s my OneDrive for Business; however, it could be any site in a SharePoint environment.
  • -TargetDocumentLibraryPath: The document library name in SharePoint where you want to deploy the files. There may be further limitations on this that I haven’t tested, however.
  • -TargetDocumentLibrarySubFolderPath: The top-level folder in my document library where the files will be migrated.
  • -Credentials: The credentials I stored in the first step; users will be prompted for a password when running the tool.




Once the previous command has finished, the package is ready to be uploaded to the storage location that was created in the first step.



The command I am executing is:

    $uploadresult = Set-SPOMigrationPackageAzureSource -SourceFilesPath "\\hsmigrationtest\Ignite" -SourcePackagePath "\\hsmigrationtest\migrationpackage\output" -FileContainerName "MigrationFiles3" -PackageContainerName "PackageFiles3" -AzureQueueName "hsmigrationtest" -AccountName "hsmigrationtest" -AccountKey "KEY"


Command: Set-SPOMigrationPackageAzureSource

  • -SourceFilesPath: The file share with the source files that need to be migrated.
  • -SourcePackagePath: The file share with the targeted package that was created.
  • -FileContainerName: A container (folder) in which you want to store the files.
  • -PackageContainerName: A container (folder) in which you want to store the package files.
  • -AzureQueueName: The name of your queue; in most cases this will match your storage account name.
  • -AccountName: The name of your storage account.
  • -AccountKey: The private key you use for uploading files to Azure. (As for its value here: I’m not telling you 😛.) Don’t share these keys; they are meant to be private.


Note: In the script I am capturing the result of the upload to Azure in a variable. This is extremely useful in the next step.


Once executed, the PowerShell cmdlet will verify the content of the package and then proceed to upload it, as seen in the following two screenshots.





You can see, as it begins to upload to the storage containers in Azure, that it uses a fair amount of bandwidth during the sustained file transfers.





You can see in the Storage Explorer tool that blobs have already begun to populate the container for the migration files.

    Get the tool here:



Once the files for the migration finish uploading, the package files will be uploaded to the package files container as well. The two containers could be the same; however, for clarity, let’s keep them separate.



Once the upload of our files and package to Azure has completed, we can submit the migration job to the Migration Service.




This is the command that I am executing:


$jobresult = Submit-SPOMigrationJob -TargetWebUrl "" -MigrationPackageAzureLocations $uploadresult -Credentials $creds


Command: Submit-SPOMigrationJob

  • -TargetWebUrl: The site to which we will be migrating the data.
  • -MigrationPackageAzureLocations: Important: this is the result of the previous step, where we uploaded the migration files to Azure. The object carries several parameters that make it much easier to submit the job to the Migration Service.
  • -Credentials: The Office 365 credentials I specified earlier in the process.


This command executes very quickly, since it only queues the job with the service rather than processing any of the data. To monitor the progress of this step, use the Azure storage manager and look at the storage queue that was created. The messages in the queue, as shown, will give you any errors that occur during the migration, as well as the start and end of the job, to help gauge progress. Given the nature of a queue, it’s hard to read the whole thing at once like a “log”; however, if you check the message count of the queue in the Azure portal, you can watch it increase, and once it stops increasing the job is probably close to finished.
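As a rough alternative to watching the portal, the queue can also be peeked from the classic Azure PowerShell module of that era. This is a sketch only; the account name, key, and queue name are placeholders for your own values:

```powershell
# Peek at the migration queue without removing any messages.
# Account name, storage key, and queue name are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "hsmigrationtest" `
    -StorageAccountKey "<storage key>"
$queue = Get-AzureStorageQueue -Name "hsmigrationtest" -Context $ctx

# The underlying .NET CloudQueue object can peek up to 32 messages
# at a time; print each raw JSON message body.
$queue.CloudQueue.PeekMessages(32) | ForEach-Object { $_.AsString }
```

Peeking (rather than dequeuing) matters here: removing messages would hide progress information from anyone else watching the same queue.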



The real indicator that the migration process is working is that files start appearing in the document library you specified in the PowerShell commands.





Success! We see here that we finally got our files and folders created in our OneDrive, with all of our migrated content available to us.


Migration Service Errors

You may have noticed that some of the container and folder names differed slightly as the flow of execution progressed in the PowerShell command line. That’s because the first two times I tried to submit the package to the Migration Service, the queue gave me some errors I wasn’t expecting, and thus I was disappointed when my files didn’t show up in my OneDrive. I want to give you a quick look at what these errors look like so that you can remediate them accordingly when you see them.


You can see that the error below reads “The specified package does not contain the security information requested for import.”


Exporting the messages from the queue will give you a “log” of information similar to this; it comprises one run of the migration tool that ran into an error. You can use the JobId below to find the messages that apply to your current execution cycle if you’re running multiple jobs at the same time.


{"Event":"JobQueued","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:28","SiteId":"3e27d24f-f51b-490a-bc7c-61f774e11a4f"}

{"Event":"JobStart","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:53","SiteId":"3e27d24f-f51b-490a-bc7c-61f774e11a4f","WebId":"09a3a646-ea9e-4e7a-b21e-8216bbd5b319","DBId":"b615c115-d9ec-43e4-a5b4-e352e53665d3","FarmId":"6945487f-fc8b-468e-9acb-767d1d73468b","ServerId":"55d9a440-306b-43a2-b069-f2e8a36ad88b","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb"}

{"Event":"JobWarning","JobId":"b729c7bb-585a-4379-ae1b-f60d5ffbbf96","Time":"05/13/2015 08:19:54","CorrelationId":"353eac79-b071-4e73-b439-5e30cee9462a","Message":"The specified package does not contain the security information requested for import."}

{"Event":"JobError","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb","ObjectType":"List","Url":"Documents","Id":"f0a5e964-56f0-4fa7-8a8c-53bb58d6f445","ErrorCode":"-2147024809","ErrorType":"System.ArgumentException","Message":"Must specify valid information for parsing in the string."}

{"Event":"JobFatalError","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb","ErrorCode":"-2147024809","ErrorType":"System.ArgumentException","Message":"Must specify valid information for parsing in the string."}

{"Event":"JobEnd","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","FilesCreated":"0","BytesProcessed":"0","ObjectsProcessed":"1","TotalErrors":"2","TotalWarnings":"1","TotalExpectedSPObjects":"1656"}
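Since each queue message is a small JSON document, an exported log like this can be filtered by JobId with a couple of lines of PowerShell. This sketch assumes you have the raw message strings in an array (the sample messages below are abbreviated versions of the run above):

```powershell
# Filter exported queue messages (one JSON document per entry) by JobId.
$messages = @(
    '{"Event":"JobQueued","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:28"}',
    '{"Event":"JobWarning","JobId":"b729c7bb-585a-4379-ae1b-f60d5ffbbf96","Time":"05/13/2015 08:19:54","Message":"The specified package does not contain the security information requested for import."}',
    '{"Event":"JobEnd","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","TotalErrors":"2"}'
)

# The JobId of the run we care about.
$jobId = 'f625b4c7-cd44-495f-a19f-2d6ecf0b0972'

# Parse each message and keep only the events for that job.
$events = $messages |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Where-Object { $_.JobId -eq $jobId }

# Print a compact timeline: timestamp followed by event name.
$events | ForEach-Object { "{0}  {1}" -f $_.Time, $_.Event }
```

Grouping by JobId this way turns the unordered queue dump into a per-run timeline, which is much closer to the log-style view the service currently lacks.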


The end result for this error: I had skipped the second step of the migration, where the targeted package for SharePoint Online is created, so the package was missing the necessary security information.


My first thought when I finally saw the documents start trickling into my OneDrive was “Finally.” The effort was a lot higher than just dragging and dropping a bunch of folders from the desktop into my OneDrive; however, since I was only using my OneDrive as an example for this test case, I can agree that the approach definitely has its uses. The strongest use case I can think of is organizations with terabytes of file shares that just want to do a quick cutover migration and sort through the data at a later date. You can now ship a hard drive laden with your file shares to Azure, and with a little bit of PowerShell magic it shows up quickly. My next goal is to try this on a data sample of a terabyte or more to see what the real performance of a large-scale migration would be. I would also like to see an improved queue management interface, not unlike the ULS Viewer for SharePoint.

In closing, I would like to encourage everyone to check out the migration session given at Ignite by clicking the image below, and to join the open beta so you too can get started with High Speed Migration to Office 365.

5 thoughts on “First Look: High Speed Migration to SharePoint Online”

  1. Hello!

    We are planning to do a migration from SharePoint On Premises 2010 to SharePoint Online.
    I know that migrating by “drag and drop” is easier, but using this method: .

    Can we preserve the files’ metadata (Created, Created By, personalized columns of document libraries)?
    Can we preserve the security, maybe by doing some type of user mapping?

    Thank you for your help!!


    1. Hi Jonathan,

      I’m not sure yet what the migration PowerShell cmdlets are going to support for metadata and security. There is a way to take users across; however, I believe you need to do a kind of user mapping, the details of which aren’t quite available yet. That being said, the Migration API will support some of those requirements (user migration, metadata). There are some third-party products being developed to take full advantage of the API; what that full advantage will be is still to be revealed. Keep checking the content hub for the migration, which is updated here:


