
Find and Replace Formulas in SharePoint Calculated Columns


Something often overlooked during migrations is calculated fields in SharePoint lists. In many of the tools available, the value of a calculated field migrates correctly from one environment to the next. The problem, though, is that in the old environment the formula may have pointed to a URL or static asset. After the column is migrated, the formula is technically correct, just inaccurate in the new SharePoint environment.

Following a migration from SharePoint 2007 to SharePoint 2013, my colleague was looking for a way to update formulas across SharePoint sites which had references to the old domain. He essentially needed a “Find and Replace” function for columns in SharePoint. Now, this is probably not the smartest or the best solution to this problem; it is, however, a solution.

PowerShell Function:
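While this is not my colleague’s exact script, the function below is a minimal sketch of the approach, assuming the server-side object model on a SharePoint 2013 farm server; the site URL and the search/replace strings are placeholders.

```powershell
# Sketch only: run from the SharePoint 2013 Management Shell on a farm server.
# The site URL and the search/replace values below are placeholders.
function Replace-CalculatedColumnFormula {
    param(
        [Parameter(Mandatory=$true)][string]$SiteUrl,
        [Parameter(Mandatory=$true)][string]$SearchText,
        [Parameter(Mandatory=$true)][string]$ReplaceText
    )

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $web = Get-SPWeb $SiteUrl
    foreach ($list in $web.Lists) {
        # Calculated columns surface as SPFieldCalculated, with the formula
        # exposed on the Formula property
        $calcFields = $list.Fields | Where-Object { $_ -is [Microsoft.SharePoint.SPFieldCalculated] }
        foreach ($field in $calcFields) {
            if ($field.Formula -like "*$SearchText*") {
                Write-Host "Updating $($list.Title) / $($field.Title)"
                $field.Formula = $field.Formula.Replace($SearchText, $ReplaceText)
                $field.Update()
            }
        }
    }
    $web.Dispose()
}

# Example: repoint old-domain references after a migration
# Replace-CalculatedColumnFormula -SiteUrl "http://newfarm/sites/team" `
#     -SearchText "http://oldfarm" -ReplaceText "http://newfarm"
```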

And voila, you have a simple way to find and replace any text value inside of a SharePoint list’s calculated columns. While I have only tested this with SharePoint 2013, it should work in SharePoint 2010 and SharePoint 2016 if that’s your thing.

Feel free to post questions or feedback on this article, and improvements if I have botched anything.

SharePoint Online: High Speed Migration Resources


Please find a growing list of resources below, which I will update as information gets released about the Import Service for SharePoint Online.

Resources - Microsoft Articles

List of Microsoft articles about High Speed Migration:

  • SharePoint Online and OneDrive Migration Content Roadmap – Aug 7th, 2015
  • SharePoint Online and OneDrive Migration User Guide – Dec 11th, 2015
  • Import SharePoint Data to Office 365 – Apr 4th, 2016
  • SharePoint Online Migration API User Guide – n.d.


Office 365 SharePoint Parallel High Speed Migration Powershell Example Scripts


Last week there was an announcement from the Office 365 team regarding the High Speed Migration Preview. It was first announced back in May at Ignite (first look article here) and has been in trials by several ISVs as well as individuals (find the announcement here). Linked directly from this announcement is an article that talks a bit more technically about the migration process that circumvents the CSOM throttling. This article (link) gives observed performance numbers from 0.5 GB/hr all the way up to 15 GB/hr. I have seen these performance numbers and more (performance summary here), and what stands out is the importance of parallelizing the migration jobs. The single-threaded performance is impressive, especially when migrating simple file shares; for complex migrations, however, parallelization is key.

So I want to share the way I achieved parallelization during the migration tests I did, so the scripts can be modified and reused. The scripts I wrote point to a file share and get a listing of its top-level directories. Each top-level directory then becomes its own migration package, which is submitted separately to the High Speed Migration ingestion service.

Parallel migration process

Furthermore, the migration process can go two ways: you have the option of doing all four steps at once (Package, Target, Upload, & Submit), or you have the option of delaying the migration job submission until later, such as once all the packages have been uploaded. In that case, Stage 1 is (Package, Target, & Upload) and Stage 2 is (Submit). This can be useful if you have multiple packages that all need to be uploaded to Azure for performance reasons, or if you want to run the jobs over the weekend, for instance; the reason depends on your situation.

With a basic understanding of the process, please find the first-stage PowerShell script below:
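The sketch below captures the approach of that first stage under placeholder names (share paths, target URL, storage account, and the "KEY" value are all placeholders); it is not the exact script from the download links later in the post. Each top-level folder becomes its own package, processed in a background job, and the upload results are persisted so stage 2 can submit the jobs later (the downloadable scripts use a prefixed .Package.csv for this; Export-Clixml is used here so the upload-result object round-trips intact).

```powershell
# Stage 1 sketch: package, target, and upload each top-level folder in parallel.
$sourceShare = "\\fileserver\share"
$packageRoot = "\\fileserver\packages"
$targetWeb   = "https://tenant-my.sharepoint.com/personal/user"
$creds       = Get-Credential
$runId       = Get-Date -Format "yyyyMMddHHmmss"   # prefix for the output files

Start-Transcript -Path (Join-Path $packageRoot "$runId.Transcript.txt")

$jobs = foreach ($folder in Get-ChildItem $sourceShare -Directory) {
    # One background job per top-level directory = one migration package each
    Start-Job -ArgumentList $folder.FullName, $folder.Name, $packageRoot, $targetWeb, $creds -ScriptBlock {
        param($src, $name, $pkgRoot, $web, $creds)
        Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking

        $pkg = Join-Path $pkgRoot $name
        New-SPOMigrationPackage -SourceFilesPath $src -OutputPackagePath $pkg -NoADLookup | Out-Null
        ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $src -SourcePackagePath $pkg `
            -OutputPackagePath "$pkg\output" -TargetWebUrl $web `
            -TargetDocumentLibraryPath "Documents" -TargetDocumentLibrarySubFolderPath $name `
            -Credentials $creds | Out-Null
        # Emit only the Azure upload result; stage 2 needs it to submit the job
        Set-SPOMigrationPackageAzureSource -SourceFilesPath $src -SourcePackagePath "$pkg\output" `
            -FileContainerName "files-$name".ToLower() -PackageContainerName "pkg-$name".ToLower() `
            -AzureQueueName "migrationqueue" -AccountName "storageaccount" -AccountKey "KEY"
    }
}

# Wait for every packaging/upload job and record the results for stage 2
$jobs | Wait-Job | Receive-Job |
    Export-Clixml (Join-Path $packageRoot "$runId.Package.xml")
Stop-Transcript
```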

In addition to creating packages from each of the top-level folders, the script will output the results from the upload of the files to Azure Storage. It’s useful to capture the results of the upload in case the submission of the job needs to happen in a separate stage like this. It’s also good to keep a record of each of the packages that are uploaded, which is why you will also find the transcript from the session, describing which package goes where in Azure.

Once everything is uploaded, you can use the following script to submit the migration jobs to the High Speed Migration ingestion service:

This script prompts for input when run: the prefix of the files output from the previous script, which should be named something like 123456789.Package.csv. The prompt is looking for the “123456789” prefix from these files to identify which group of jobs to submit. If you don’t want to submit a package, simply change the name of the file or add a prefix.
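A sketch of that second stage, matching the stage 1 sketch above (not the exact downloadable script; the paths and target URL are placeholders): it prompts for the run prefix, loads the saved upload results, and submits one migration job per package.

```powershell
# Stage 2 sketch: submit the previously uploaded packages to the ingestion service.
$packageRoot = "\\fileserver\packages"
$targetWeb   = "https://tenant-my.sharepoint.com/personal/user"
$prefix      = Read-Host "Enter the run prefix (e.g. 123456789)"
$creds       = Get-Credential

Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking

foreach ($upload in (Import-Clixml (Join-Path $packageRoot "$prefix.Package.xml"))) {
    # Queue one migration job per uploaded package and report its ID
    $jobId = Submit-SPOMigrationJob -TargetWebUrl $targetWeb `
        -MigrationPackageAzureLocations $upload -Credentials $creds
    Write-Host "Submitted migration job $jobId"
}
```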

Hopefully these scripts can get someone started on their own migration to Office 365. You can find the download links below for the scripts.

Stage 1 – Upload Documents Parallel Generic

Stage 2  – Submit Migration Jobs Generic

Please rename the txt extensions to ps1

One more thing to note: you should take advantage of the free storage offered by the Microsoft team via the Import Service; that way you won’t have to eat the Azure storage fees. You can get your key and storage account via the Import option in the Office 365 Admin Center.

Just Published: SharePoint Online Preview Command-Lets on TechNet


Following the open beta announced at Ignite, Microsoft is now publishing more information about the High Speed Migration and the commands necessary to run a migration.

Now on TechNet: Use Windows PowerShell cmdlets for SPO Migration Public Preview


While a little light on details and error information, you will find the basic usage of the four commands for creating and submitting migration jobs, as well as the command to remove a migration job.

  • New-SPOMigrationPackage – Create a migration package from the source files
  • ConvertTo-SPOMigrationTargetedPackage – Convert the generated package to a SharePoint Online-specific one (I suspect they’re bundling credentials in the packages)
  • Set-SPOMigrationPackageAzureSource – Upload the created package to the selected Azure container
  • Submit-SPOMigrationJob – Process the uploaded job and return an ID for the migration job
  • Remove-SPOMigrationJob – Remove the migration job using the ID previously supplied by the Submit commandlet

Hopefully this is just the beginning of the high speed migration information to come out from the team. Stay tuned!


First Look: High Speed Migration to SharePoint Online


Steve Ballmer once said, “Developers, developers, developers.” Now, as a Strategic Consultant for AvePoint, I say, “Migrations, migrations, migrations.” Migration projects now comprise the majority of efforts to which I am assigned. One of the trends that I have been noticing in my role is the uptick of migrations from various content sources to Office 365. I was lucky enough to attend the Ignite Conference in Chicago recently and learned a bit about the new migration option that is being made available to Office 365 customers. The session I’m referring to is BRK3153 – Migration to SharePoint Online Best Practices and New API Investments, presented by Simon Bourdages, Michael Jordan (not #23), and Joe Newell.


Part of this session talked about best practices for migrations regardless of source or destination, as illustrated by the “Migrating the Collaboration Triangle” slide, where they discuss the different kinds of content in a migration and how hard each is to migrate between environments.



The juiciest stuff comes later in the session when they start talking about the new APIs that have recently gone into Open Beta with the overview below talking about the structure of this high speed migration.



With the new APIs offered by Microsoft, customers can outsource processing of migration data to a backend service managed by Microsoft; the big draw is that this backend will not be limited by Office 365 bandwidth limits and resources. Packages uploaded via the PowerShell commandlets or by third-party tools will process in the background after being delivered and queued in Azure Storage.


The downside is that there is now an intermediary location in Azure that will contain all the migrated data while it is queued for processing by the Migration service which can present the following issues for organizations:

  • Responsibility for Managing the Azure Storage Location
  • Security of Files at Rest in Azure Storage
  • Cost Burden for Storage/Processing of Data during Migration
  • Bandwidth available to Transfer to Azure Storage

Those concerns aside, this provides a new avenue to get large amounts of data from on-premises to Office 365 without running into the Office 365 bandwidth limits, especially if you consider that you can now ship Microsoft your data on a hard drive, which can be loaded into the Azure service for you and then processed on VMs located close to whatever zone your service is in. This is the approach I took with my test of the preview Migration Service. I loaded up a VM in Azure in the US-East zone, which happens to be the same zone where (I hope) my Office 365 instance is located.


I started with an Office 365 Developer account which basically provides a single Enterprise License for a tenant and I planned to target my OneDrive for Business in order to take advantage of the larger size limits.


I also started with a brand new Azure trial registered to my enterprise account in order to keep things as simple as possible.

Step 1) Provisioning a Migration Machine & Storage

I used the Azure compute resources to create a simple VM in the East US 2 zone, as you can see in the screenshot. I chose the standard pricing tier since the performance of the VM was not likely to be the limiting factor in this test; results may vary depending on the migration scenario.


I also created a standard-level storage resource in the same zone as my virtual machine in order to provide the blob storage for the files. In addition, when you set up the storage account, a queue is provisioned at the same time. Queues are a really great way to communicate simple information between machines, or in this case between the Migration Service and myself.




Step 2) Get some Content

Freshly back from Ignite, I had a serious amount of content from the conference to catch up on. More often than not my session time slots were booked in triplicate or more. Luckily, Vlad Catrinescu from Absolute SharePoint put together a script to grab all the sessions and their slides from Channel 9 to be viewed locally, so I used the script to download the Ignite videos as a large sample of content that could be sent to Azure. The script to download the sessions can be found here: All The Videos and Slides


Executing the script in a folder will download several hundred files to your machine, starting with the PowerPoint decks for all of the sessions that are available.


As you can see in the screenshot, I got a fair amount of data throughput right from the start in my Azure VM, sustaining about 200 Mbps during the download.



In the end, I ended up with about 26GB of files and folders, all downloaded from the Channel 9 Ignite stream.



Step 3) Preparing the Environment

In order to package these files so that the Office 365 Migration Service understands the structure, it is necessary to run several PowerShell commands against the files to create a Manifest that complies with what the service is expecting to see. The first step in running these commands is to get the SharePoint Online commandlets installed on your machine. In order to get the preview installation for the Migration Service, it is necessary to join the Open Beta for this service that was announced at Ignite.

    Step 1) Go to this site:

    Step 2) Sign in with your Microsoft Account

    Step 3) Accept the Terms and Conditions to nominate yourself for, and join, the program.

    Step 4) Visit the downloads section of the program to access the Installation Files as well as the preview documentation

Once you get the installation files from the program site, you can run the executable on the Migration machine and follow the wizard to complete the install.



After the installation for the management shell completes it is necessary to create at least one file share from which the files will be accessed by the tooling. In this case I chose to make two file shares, one for the data and one for the packages in order to get an idea what the package composition was prior to uploading the data to Azure.
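Creating those shares can be done with a couple of commands on the migration VM (Windows Server 2012 or later); the folder paths and share names below are placeholders mirroring the walkthrough that follows.

```powershell
# Create local folders and expose them as the two file shares used below.
New-Item -ItemType Directory -Path "C:\Ignite", "C:\MigrationPackage" -Force | Out-Null

# "Everyone" full access is fine for a throwaway lab VM only; lock this
# down appropriately on anything longer-lived.
New-SmbShare -Name "Ignite" -Path "C:\Ignite" -FullAccess "Everyone"
New-SmbShare -Name "MigrationPackage" -Path "C:\MigrationPackage" -FullAccess "Everyone"
```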


 Step 4) Packaging, Processing, and Submitting the Migration Job

At this point we have all the plumbing necessary to start processing our files for migration and we’ll need to execute four powershell commands in order to prepare/submit the files to the cloud service.

The commandlets we need to run are:

  • New-SPOMigrationPackage -> Create a migration package
  • ConvertTo-SPOMigrationTargetedPackage -> Target this migration package at a specific site
  • Set-SPOMigrationPackageAzureSource -> Upload this migration package to our storage account
  • Submit-SPOMigrationJob -> Tell the Migration Service the package is ready for processing

In order to do so, find the SharePoint Online Migration Shell in the Start Menu and launch it:

The first command I’m going to run gets and stores my credentials used to authenticate to Office 365:

    $creds = Get-Credential ""



In this case I am executing


    $pkg = New-SPOMigrationPackage -SourceFilesPath "\\hsmigrationtest\ignite\" -OutputPackagePath "\\hsmigrationtest\migrationpackage" -TargetWebUrl "" -TargetDocumentLibraryPath "Documents" -TargetDocumentLibrarySubFolderPath "Ignite" -NoADLookup


Command: New-SPOMigrationPackage

  • -SourceFilesPath – The location where the files I am migrating to SharePoint Online are stored, accessed via file share.
  • -OutputPackagePath – The location where the package files, including the manifest for the files, will be generated by the tool; also a file share.
  • -TargetWebUrl – The site where the files will be stored. In my case it’s my OneDrive for Business; however, it could be any site in a SharePoint environment.
  • -TargetDocumentLibraryPath – The document library name in SharePoint where you want to deploy the files. There may be further limitations on this that haven’t been tested, however.
  • -TargetDocumentLibrarySubFolderPath – The top-level folder in my document library where the files will be migrated.
  • -NoADLookup – This means I won’t be taking any security or user information along with me. Faster, and necessary since I don’t have a synchronized Active Directory.


You can see here that the tool is iterating through the file structure to create a manifest of all the files in the directory that was specified.


Finally here we can see that the migration package has been created in the file share that we specified in the above command.


A closer look below at the file structure of the package. It appears there are some additional areas supported by the migration, though probably not in this first public release of the PowerShell scripts.


The next step is to create a targeted package so the Office 365 Migration Service knows where the files will be stored in SharePoint Online.


This is the command I am executing

    $tpkg = ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath "\\hsmigrationtest\Ignite" -SourcePackagePath "\\hsmigrationtest\migrationpackage" -OutputPackagePath "\\hsmigrationtest\migrationpackage\output" -TargetWebUrl "" -TargetDocumentLibraryPath "Documents" -TargetDocumentLibrarySubFolderPath "Ignite" -Credentials $creds



Command: ConvertTo-SPOMigrationTargetedPackage

  • -SourceFilesPath – The location where the files I am migrating to SharePoint Online are stored, accessed via file share.
  • -SourcePackagePath – The location where the package files, including the manifest, will be referenced by the tool; also a file share. These files will be recoded once they’re targeted with the migration details.
  • -OutputPackagePath – The file share output location for the package once it has been targeted for SharePoint Online.
  • -TargetWebUrl – The site where the files will be stored. In my case it’s my OneDrive for Business; however, it could be any site in a SharePoint environment.
  • -TargetDocumentLibraryPath – The document library name in SharePoint where you want to deploy the files. There may be further limitations on this that haven’t been tested, however.
  • -TargetDocumentLibrarySubFolderPath – The top-level folder in my document library where the files will be migrated.
  • -Credentials – The credentials that I created in the first step; users will be prompted for a password when running the tool.




Once the previous command has finished, the package is ready to be uploaded to the storage location that was created in the first step.



The command I am executing is

    $uploadresult = Set-SPOMigrationPackageAzureSource -SourceFilesPath "\\hsmigrationtest\Ignite" -SourcePackagePath "\\hsmigrationtest\migrationpackage\output" -FileContainerName "MigrationFiles3" -PackageContainerName "PackageFiles3" -AzureQueueName "hsmigrationtest" -AccountName "hsmigrationtest" -AccountKey "KEY"


Command: Set-SPOMigrationPackageAzureSource

  • -SourceFilesPath – The file share with our source files that need to be migrated.
  • -SourcePackagePath – The file share with the targeted package that has been created.
  • -FileContainerName – A container (folder) in which you want to store the files.
  • -PackageContainerName – A container (folder) in which you want to store the package files.
  • -AzureQueueName – The name of your queue; this should be the same as your storage account name in most cases.
  • -AccountName – The name of your storage account.
  • -AccountKey – The private key that you have for uploading files to Azure. Don’t share these keys, they are meant to be private. (As for mine: I’m not telling you 😛)


Note that in the script I am capturing the result of the upload of the files to Azure. This is extremely useful in the next step.


Once executed, the PowerShell commandlet will verify the contents of the package and then proceed to upload it, as seen in the following two screenshots.





You can see, as it begins to upload to the storage containers in Azure, that it uses a fair amount of bandwidth during the sustained file transfers.





You can see in the storage explorer tool that blobs have already begun to populate in the container for the migration files.

    Get the tool here:



And once the files for the migration have finished uploading, the package files will be uploaded to the package files container as well. They could share a container, but for clarity let’s keep the containers separate.



Once the upload of our files and package to Azure has completed we are able to submit the migration job to the Migration Service.




This is the command that I am executing


    $jobresult = Submit-SPOMigrationJob -TargetWebUrl "" -MigrationPackageAzureLocations $uploadresult -Credentials $creds


Command: Submit-SPOMigrationJob

  • -TargetWebUrl – The site to which we will be migrating the data.
  • -MigrationPackageAzureLocations – Important: this is the result of the previous step, where we uploaded the migration files to Azure. The object carries several parameters that make it much easier to submit the job to the Migration Service.
  • -Credentials – The Office 365 credentials that I specified earlier in the process.


This command will execute very quickly, since it only queues the job with the service; it does not process any of the data. In order to monitor the progress of this step, it is necessary to use the Azure storage manager and look at the storage queue that was created. The messages in the queue will give you any errors that occur during the migration, as well as the start and end of the job, to help gauge progress. Given the nature of the queue, it’s hard to read the whole thing at once like a “log”; however, if you check the message count of the queue in the Azure portal you can watch it increase, and once it stops increasing the job is probably close to finished.
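One way to take that peek from PowerShell, assuming the Azure PowerShell module of the time is installed; the account, key, and queue names are the placeholders from the example above.

```powershell
# Peek at the migration queue without dequeuing anything.
$ctx   = New-AzureStorageContext -StorageAccountName "hsmigrationtest" -StorageAccountKey "KEY"
$queue = Get-AzureStorageQueue -Name "hsmigrationtest" -Context $ctx

# Watching this count stop growing is the "job is probably done" signal
Write-Host "Approximate messages in queue: $($queue.ApproximateMessageCount)"

# PeekMessages reads up to 32 messages without removing them from the queue
$queue.CloudQueue.PeekMessages(32) | ForEach-Object { $_.AsString }
```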



The real indicator that the migration process is working is that the files start appearing in the document library you specified in the PowerShell scripts.





Success! We see here that we finally got our files and folders created in our OneDrive, with all of our migrated content available to us.


Migration Service Errors

You may have noticed that some of the container and folder names differed slightly as the flow of execution happened in the PowerShell command line. This happened because the first two times I tried to submit the package to the Migration Service, the queue gave me some errors which I wasn’t expecting, and thus I was disappointed when my files didn’t show up in my OneDrive. I wanted to give you a quick look at what these errors look like so that when you see them you can remediate them accordingly.


You can see that the error below reads: “The specified package does not contain the security information requested for import”.


Exporting the messages from the queue will give you a “log” of information similar to this; it comprises one run of the migration tool which ran into an error. You can use the JobId below to find the messages that apply to your current execution cycle if you’re running multiple jobs at the same time.


    {"Event":"JobQueued","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:28","SiteId":"3e27d24f-f51b-490a-bc7c-61f774e11a4f"}

    {"Event":"JobStart","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:53","SiteId":"3e27d24f-f51b-490a-bc7c-61f774e11a4f","WebId":"09a3a646-ea9e-4e7a-b21e-8216bbd5b319","DBId":"b615c115-d9ec-43e4-a5b4-e352e53665d3","FarmId":"6945487f-fc8b-468e-9acb-767d1d73468b","ServerId":"55d9a440-306b-43a2-b069-f2e8a36ad88b","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb"}

    {"Event":"JobWarning","JobId":"b729c7bb-585a-4379-ae1b-f60d5ffbbf96","Time":"05/13/2015 08:19:54","CorrelationId":"353eac79-b071-4e73-b439-5e30cee9462a","Message":"The specified package does not contain the security information requested for import."}

    {"Event":"JobError","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb","ObjectType":"List","Url":"Documents","Id":"f0a5e964-56f0-4fa7-8a8c-53bb58d6f445","ErrorCode":"-2147024809","ErrorType":"System.ArgumentException","Message":"Must specify valid information for parsing in the string."}

    {"Event":"JobFatalError","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","CorrelationId":"15794a86-e674-4233-8438-8a1f2c5dabbb","ErrorCode":"-2147024809","ErrorType":"System.ArgumentException","Message":"Must specify valid information for parsing in the string."}

    {"Event":"JobEnd","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:54","FilesCreated":"0","BytesProcessed":"0","ObjectsProcessed":"1","TotalErrors":"2","TotalWarnings":"1","TotalExpectedSPObjects":"1656"}
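Since each queue message is a JSON object, filtering an export down to a single run is a one-liner with ConvertFrom-Json. In this sketch, the two sample messages stand in for a real export; the mismatched JobId values mirror the log above.

```powershell
# Filter exported queue messages down to one migration job by its JobId.
$messages = @(
    '{"Event":"JobStart","JobId":"f625b4c7-cd44-495f-a19f-2d6ecf0b0972","Time":"05/13/2015 08:48:53"}'
    '{"Event":"JobWarning","JobId":"b729c7bb-585a-4379-ae1b-f60d5ffbbf96","Time":"05/13/2015 08:19:54"}'
)
$jobId   = "f625b4c7-cd44-495f-a19f-2d6ecf0b0972"
$thisRun = $messages | ForEach-Object { $_ | ConvertFrom-Json } |
    Where-Object { $_.JobId -eq $jobId }

# Only the events belonging to the job of interest remain
$thisRun | Format-Table Time, Event
```

In practice you would replace the `$messages` array with the messages you exported from the queue (one JSON object per line).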


The cause of this error was that I had skipped the second step of the migration, where the targeted package for SharePoint Online is created, so the package was missing the necessary security information.


My first thought when I finally saw the documents start trickling into my OneDrive was “Finally.” The effort was a lot higher than just dragging and dropping a bunch of folders from the desktop to my OneDrive; however, since I was only using my OneDrive as an example for this test case, I can agree that the service definitely has its uses. The strongest use case I can think of is organizations with terabytes of file shares that just want to do a cutover migration quickly and sort through the data at a later date. You can now ship a hard drive laden with your file shares to Azure, and with a little bit of PowerShell magic it shows up in Azure quickly. My next goal is to try this on a data sample of a terabyte or more to see what the real performance of a large-scale migration would be. I would also like to see an improved queue management interface, not unlike the ULS Manager in SharePoint.

In closing, I would like to encourage everyone to check out the migration session that was given at Ignite by clicking the image below, and to join the open beta so you too can get started with High Speed Migration to Office 365.