When is 1MB not 1MB?


I’ve recently been working on a OneDrive for Business migration for a client to relieve them of their on-premises infrastructure. This client has home drives stored on NTFS shares which were mapped at logon and were for personal files only.  Because of this strict usage policy, the file shares were limited to 500MB to ensure that any transactional data was stored elsewhere and not hidden away for personal use.

Configuring the storage quota in the OneDrive for Business Admin Centre for 500MB is not possible, as the allowance can only be configured in 1GB increments, as shown below.


Even using the “Set-SPOSite -StorageQuota” PowerShell parameter has the same restriction, so the minimum we could set for the client was twice the preferred size.  This shouldn’t be a problem provided company policy is still followed and non-personal documents are stored elsewhere.
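For reference, a sketch of the command (the URLs are placeholders, not the client’s actual tenant). The -StorageQuota value is given in megabytes, but in our experience values under 1024 still resulted in the 1GB minimum:

```powershell
# Connect to the tenant admin centre first (placeholder URL)
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# StorageQuota is specified in MB; requesting 512MB still results in a 1GB quota
Set-SPOSite -Identity "https://contoso-my.sharepoint.com/personal/jane_contoso_com" -StorageQuota 512
```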

As part of a OneDrive for Business pilot, we moved users’ data from the NTFS shares to the client’s tenant, 115 users altogether. For this we used the SharePoint Migration Tool (available for download here).

Everything went well during the copying of data; the file shares were set as read-only, and the tool was run overnight to ensure minimal disruption to the users.  Before users arrived the next morning, the original file shares were disconnected from the login scripts and users were provided information on where their files were now located.  For the first couple of days, there were no complications until one of the users reported that they had run out of storage space.

Looking at the storage metrics (_layouts/15/storman.aspx) for the user’s OneDrive for Business they had in fact reached the 1GB limit set at the tenant.


This took us by surprise, as we had been provided a list of users who were exceptions to the 500MB limit, but this user appeared to have been omitted from the list.  To validate, I looked at the user’s existing file share to confirm the actual amount of data prior to the migration and found the following:


So, the data was nowhere near a size that would indicate a storage issue with a 1GB quota; something else must have been going on.  After looking at the storage metrics in more detail, I could see that the file sizes shown to the user did not, in many cases, match those reported by the storage metrics.

Picking a migrated file at random, I can see that one of the migrated docx files has a file size of 1.38MB.


Looking at the original file share, this appears on disk as 1,415KB, so the file size is comparable after migration.  Remember, this is from an existing file share; no versioning exists.


If we view the file in Storage Metrics, it’s been inflated to a whopping 13.5MB – almost ten times the original size!


Just for validation, the version history is recorded and confirms a single version of the file at 1.4MB.



To test what is happening, I created a 576KB Word document and copied this to another account in the same tenant using the following methods:

  • Copied with SharePoint Migration Tool
  • Uploaded from the interface
  • Dragged from desktop to browser

With each method, the result is the same, OneDrive for Business shows the file as 576KB as expected.


Looking at the storage metrics for the same files paints a different picture.


Initially, the uploaded files appear to be the correct size; however, if left, the total size then inflates to 23.8MB, approximately 43 times the original size.


Subsequently, this was tried with more files in OneDrive for Business and SharePoint sites in other tenants in various data regions. There was no difference observed in behaviour for any of the testing carried out.  We did confirm, however, that there is no relationship between the original file size and the inflated size being reported.  It was also noted that this appeared to affect, but was not limited to, docx, pptx and pdf files.


A call is currently open with Microsoft to explain this behaviour, as they have recently stated that it is by design and caused by metadata and additional data being stored for previews.

However, as storage metrics appear to be the mechanism for calculating remaining storage space when enforcing quotas, this means that when migrating or uploading data to OneDrive for Business or SharePoint Online it’s practically impossible to calculate how much storage space is required.  Taking the example given above, it’s entirely feasible that 1GB of storage could be required to store just 25MB of documents.
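To put the impact in perspective, a quick back-of-the-envelope calculation (assuming the ~43x inflation factor we observed for our test file; your figures may well differ):

```powershell
# How much "real" data fits in a 1GB quota if every file inflates ~43x?
$quotaMB     = 1024          # 1GB quota expressed in MB
$inflation   = 43            # inflation factor observed for our 576KB test file
$effectiveMB = [math]::Round($quotaMB / $inflation, 1)
"A $($quotaMB)MB quota holds roughly $($effectiveMB)MB of actual documents"
```

This works out at roughly 24MB of usable space, in line with the ~25MB figure above.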


01/03/2018 – After further testing with a tenant on the “Standard release” channel in the UK data region, the file sizes appeared to remain normal.  After switching my lab tenant to “Targeted release for everyone”, the files inflated after being viewed.  This time, one of the same files inflated considerably more: 83 times the original size.


28/03/2018 – Microsoft have confirmed that this is normal and have updated me with the following statement.
This is an expected behaviour for files that service reported size to be larger than on device size as the service size includes additional size for metadata, versions, file preview images in addition to stream size (whereas device size only includes stream size).

To meet this particular client’s quota needs, it was suggested that the preview functionality be turned off for the tenant. Removing this functionality as a means to address the issue was obviously not a valid approach.  With OneDrive for Business having unlimited storage, this doesn’t really present a problem if quotas aren’t being configured.  However, I have additionally asked for a statement on how storage metrics are used, if at all, when calculating the requirement for additional space in SharePoint Online.   Do these inflated sizes count towards the 1TB per tenant and 0.5GB limit per user?  If I buy additional storage, am I getting what I paid for?

13/08/2018 – Interestingly, Microsoft raised the storage allowance to 10GB from the existing 0.5GB per user back in July as described in “SharePoint Online Limits“. However, the storage metrics issue is still ongoing.  Microsoft confirmed to me back in July that “we started deploying this fix now first for internal farms and will monitor results on our end. If all goes as expected we should see these changes in the upcoming days in production farms”.

The latest communication today states that for “the moment we’re still monitoring and validating the results for the internal farms on our end”.  I’ll update back here when I get further news on the results.

28/09/2018 – Microsoft have confirmed that the Product team have completed testing and the fix has been released to production tenants.

Storage Metrics - Fixed

I’ve tested with the same document (577KB), and once uploaded the file is showing in Storage Metrics as 793KB.  While this isn’t a huge inflation like that we have previously seen, it’s still an increase of approx 37% on the original file size.  Microsoft are to confirm whether this is now the expected behaviour.

15/10/2018 – Microsoft have confirmed that due to metadata and preview images this increase can be expected.  The case on this has now been closed.


How do I add Geo Location information to a SharePoint 2013 list?

With SharePoint 2013 and Bing Maps, there’s the added ability to include location information in your lists.  So how do we add this functionality?

List showing Geo Location information

To enable the Geo-location column in SharePoint 2013, you need to install the file “SQLSysClrTypes.msi”, which is a free download from Microsoft.  This is a straightforward installation without configuration options, so just run with the defaults.

The file can be downloaded from the following locations depending on your environment:

SQL Server 2012 SP1 Feature Pack

SQL Server 2008 R2 SP1 Feature Pack

SQL Server 2008 R2 SP2 Feature Pack

This needs to be installed on all web servers in your farm.

Next head over to the Bing Maps Portal to get your key.  You’ll need a Windows Live ID to sign in, which can be created at the site if you don’t already have one.

Once signed in, click on “Create or view keys” in the “My Account” section and fill in the necessary details to create your key.  You’ll need to specify an “Application Name“, “Key Type” & “Application Type” as well as verify a CAPTCHA image.

Trial Keys are valid for 90 days and are limited to 10,000 transactions in any 30 day period.

Basic Keys are valid for applications which do not exceed 50,000 transactions in a 24 hour period.

Full details and the available application types can be found here.

Once your key is generated, copy it and head back to a web server in your farm; we need to run the “Set-SPBingMapsKey” command from a PowerShell console with Administrator access, as below.

Set-SPBingMapsKey -BingKey "<your_copied_bing_Key>"
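If you want to confirm the key was stored correctly, it can be read back from the same elevated console:

```powershell
# Returns the Bing Maps key currently configured for the farm
Get-SPBingMapsKey
```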

Next we need to enable the column so it can be added to your list; this again is done via PowerShell, replacing “<web_url>” with the full URL of your web application.

# Run from the SharePoint 2013 Management Shell (or load the snap-in first)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$fieldxml = "<Field Type='Geolocation' DisplayName='Geo Location' />"
$web = Get-SPWeb <web_url>
$fieldName = $web.Fields.AddFieldAsXml($fieldxml)   # returns the new field's internal name
$web.Dispose()

There is already a default site column called “Location” so choose a different name otherwise this will be replaced.  In the example above I’ve chosen to call my column “Geo Location“.

Now, from “List Settings” on any of your lists, you can select “Add from existing site columns” and under “Custom Columns” you’ll see your new column ready to add.

To add the location to your list, you’ll need to grab the Longitude and Latitude for your chosen location.  This is done by right-clicking Bing Maps at the point of the address and copying the two figures.

Highlight of Longitude and Latitude

Adding the position to your column is as simple as clicking “Specify Location” and adding the previously captured Longitude & Latitude into the pop-up dialog box.
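Alternatively, the value can be set programmatically from the server object model.  This is only a sketch, assuming the column was named “Geo Location” as above; the list name, item title and coordinates here are hypothetical:

```powershell
# Add a list item and populate the Geolocation column from code
$web  = Get-SPWeb <web_url>
$list = $web.Lists["Offices"]        # hypothetical list name
$item = $list.Items.Add()
$item["Title"] = "Head Office"
# SPFieldGeolocationValue takes (latitude, longitude)
$item["Geo Location"] = New-Object Microsoft.SharePoint.SPFieldGeolocationValue(51.5014, -0.1419)
$item.Update()
$web.Dispose()
```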

Specify Location Dialog Box

That’s it!