Considerations for using the Azure Blob Storage module
Considerations for using Sitecore blobs based on your storage requirements, installation type, performance expectations, and security requirements.
Before installing the Azure Blob Storage module to use Sitecore blobs, consider the following:
Your storage requirements.
Whether you are running a scaled installation.
Whether you are running the migration scripts on PowerShell Core.
Your performance expectations for the blob provider and migration scripts.
Your security requirements.
The Azure Blob Storage module uses shared storage for both the Content Management (CM) and Content Delivery (CD) roles. This means the media files in CM and CD do not need to be duplicated and thereby reduces storage costs.
Because the CM and CD roles share the same storage space in Azure Storage, Sitecore restricts CD access to Azure Storage with a secure connection string that uses a shared access signature (SAS) uniform resource identifier (URI). The connection string contains only List and Read permissions.
Important
When you configure an existing Azure Storage account to use with Sitecore in a scaled instance, you must ensure the CM and CD roles use the same storage account with the same storage container, and connection strings that contain different privileges:
The CM role - requires Full access.
The CD role - requires only List and Read access.
Creating an Azure Storage account to use with Sitecore
You can create a storage account to store your blobs in a variety of ways: by using the Microsoft Azure® portal, PowerShell, the Azure CLI, or an ARM template.
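For example, a minimal Azure PowerShell sketch (assuming the Az module is installed; the resource group name, account name, region, SKU, and kind shown here are placeholders, not Sitecore requirements):

```powershell
# Placeholder names and region; replace with your own values.
Connect-AzAccount

New-AzResourceGroup -Name "sc-media-rg" -Location "WestEurope"

New-AzStorageAccount -ResourceGroupName "sc-media-rg" `
    -Name "scmediastorage" `
    -Location "WestEurope" `
    -SkuName "Standard_LRS" `
    -Kind "StorageV2"
```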
After you create a storage account, you must create a container to store your blobs in. You can then use your storage account settings to access the storage account keys and connection strings.
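As an illustration, assuming the placeholder account from the previous sketch, you might create the container and list the account keys like this:

```powershell
# Assumes the placeholder account created in the previous sketch.
$account = Get-AzStorageAccount -ResourceGroupName "sc-media-rg" -Name "scmediastorage"

# Create the container that will hold the blobs.
New-AzStorageContainer -Name "blobcontainer" -Context $account.Context

# List the account keys; use one of them to build a connection string.
Get-AzStorageAccountKey -ResourceGroupName "sc-media-rg" -Name "scmediastorage"
```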
To avoid accidental deletes or overwrites of blob data, you can enable soft delete for blobs. If you enable blob soft delete for a storage account, you can specify a retention period for deleted objects. The retention period indicates how long the data remains available after it is deleted or overwritten. For more information, refer to the Microsoft documentation.
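For example, a sketch that enables blob soft delete on the same placeholder account with a placeholder 14-day retention period:

```powershell
# Enable blob soft delete with a placeholder 14-day retention period.
Enable-AzStorageBlobDeleteRetentionPolicy -ResourceGroupName "sc-media-rg" `
    -StorageAccountName "scmediastorage" `
    -RetentionDays 14
```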
If you are running a scaled Sitecore installation, you must install the Azure Blob Storage module on all the nodes that work with media items (usually the CM and CD nodes). Even though CM and CD share the same storage account, use a different connection string for each node to ensure a secure connection.
Note
If you are deploying through the Azure Marketplace or the Sitecore Azure Toolkit, different connection strings are assigned to each node automatically.
To access the Azure Storage account, the migration scripts are designed to run with the AzureRM module, using PowerShell cmdlets from the AzureRM.Storage module.
To run the scripts in PowerShell Core (using, for example, Azure Cloud Shell), you must update the scripts to use the Az.Storage module instead of the AzureRM.Storage module.
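As an illustration of the kind of change involved (this is not the actual migration script; the account, key, and container values below are placeholders), the AzureRM-era storage cmdlets map to Az.Storage cmdlets with the same parameters but a different prefix:

```powershell
# Placeholder values; replace with your own account details.
$name      = "scmediastorage"
$key       = "<storage-account-key>"
$container = "blobcontainer"

# AzureRM-era cmdlet calls (shown as comments):
#   $ctx   = New-AzureStorageContext -StorageAccountName $name -StorageAccountKey $key
#   $blobs = Get-AzureStorageBlob -Container $container -Context $ctx

# Az.Storage equivalents for PowerShell Core / Azure Cloud Shell:
$ctx   = New-AzStorageContext -StorageAccountName $name -StorageAccountKey $key
$blobs = Get-AzStorageBlob -Container $container -Context $ctx

# The Az module can also alias many old AzureRM cmdlet names for a session:
Enable-AzureRmAlias
```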
To avoid accruing unnecessary costs and to avoid migrating orphaned blobs to the Azure Blob Storage module, you must run the SQL Database Cleanup before you begin the migration.
It is important to consider your blob storage performance requirements. To configure the best solution, refer to the Blob storage performance reference.
The CM and CD roles share the same storage account, but they each require different permissions. Therefore, use a different connection string for each role, as illustrated in the sketch after the following lists.
The CM role can use either Azure Storage account keys or a SAS URI with the following permissions:
Read
Add
Create
Write
Delete
The CD role requires a secure connection string that uses a SAS URI with the following permissions:
Read
List
Note
If you are using the SXA module and want to use the Asset Optimizer functionality, the CD role can use either Azure Storage account keys or a SAS URI, and it must also have the following permissions:
Add
Create
Write
Delete
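As an illustrative sketch (the account, container, key variable, and one-year expiry below are placeholders, not Sitecore requirements), you could generate the two SAS URIs with Az.Storage like this:

```powershell
# Placeholder values; replace with your own account, container, and key.
$account   = "scmediastorage"
$container = "blobcontainer"
$key       = "<storage-account-key>"

$ctx = New-AzStorageContext -StorageAccountName $account -StorageAccountKey $key

# CM role: Read, Add, Create, Write, Delete.
New-AzStorageContainerSASToken -Name $container -Permission "racwd" `
    -ExpiryTime (Get-Date).AddYears(1) -Context $ctx -FullUri

# CD role: Read and List only.
New-AzStorageContainerSASToken -Name $container -Permission "rl" `
    -ExpiryTime (Get-Date).AddYears(1) -Context $ctx -FullUri
```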
Sitecore supports running the migration from Cloud Shell or a local machine while using a SAS URI. Sitecore does not support running the migration from Kudu while using a SAS URI.