Uploading files to Azure Blob Storage using the Azure CLI is a straightforward process that can be completed in just a few steps.
To begin, make sure you have the Azure CLI installed and configured on your machine.
First, you'll need to authenticate with your Azure account using the az login command.
Once authenticated, you can use the az storage blob upload command to upload your file.
Prepare Your Environment
To get started with this tutorial, you need an active Azure account. Install Azure CLI from the Microsoft website if you don't already have it.
Verify that the Azure CLI works by running the az command in your terminal. Log in to your account using az login, and select the subscription you want to use for this tutorial.
You can change your default subscription by running az account set with the ID of the subscription you want to switch to. Some services in this tutorial will incur charges to your account, so remember to clean up your resources when you're done.
Create a resource group to put all your resources in, which will make cleanup easier. You can create one with the az group create command.
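As a rough sketch, the setup might look like this (the subscription ID, resource group name, and region are placeholders):

```bash
# Sign in and choose the subscription to use for this tutorial
az login
az account set --subscription "<subscription-id>"

# Create a resource group to hold everything, which makes cleanup easier
az group create --name myResourceGroup --location eastus
```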
Create Account
To get somewhere to store your files, start by creating a storage account in your resource group with the az storage account create command.
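A minimal sketch with placeholder names (storage account names must be globally unique and use 3-24 lowercase letters and numbers):

```bash
# Create a general-purpose v2 storage account in the resource group
az storage account create \
    --name mystorageacct12345 \
    --resource-group myResourceGroup \
    --location eastus \
    --sku Standard_LRS \
    --kind StorageV2
```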
This will create a storage account that you can use for storing website assets. By default, Azure Storage doesn't serve your HTML files as a website; you need to enable the static website feature on the storage account and specify index and error pages.
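Here's a sketch of enabling the feature, assuming index.html and error.html as the page names:

```bash
# Enable static website hosting and set the index and error documents
az storage blob service-properties update \
    --account-name mystorageacct12345 \
    --static-website \
    --index-document index.html \
    --404-document error.html
```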
To upload your website, you can use the Azure portal, Storage Explorer, or the Azure CLI. I'm going to use the Azure CLI to upload the contents of my current terminal folder to blob storage.
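Static websites are served from the special $web container, so the upload might look like this (the account name is a placeholder):

```bash
# Recursively upload the current directory to the $web container
az storage blob upload-batch \
    --account-name mystorageacct12345 \
    --destination '$web' \
    --source .
```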
Authorize Access
You can authorize access to Blob storage from the Azure CLI either with Microsoft Entra credentials or by using a storage account access key. Using Microsoft Entra credentials is recommended.
Azure CLI commands for data operations against Blob storage support the --auth-mode parameter, which enables you to specify how to authorize a given operation. Set the --auth-mode parameter to login to authorize with Microsoft Entra credentials.
Only Blob storage data operations support the --auth-mode parameter. Management operations, such as creating a resource group or storage account, automatically use Microsoft Entra credentials for authorization.
Run the az login command to open a browser and sign in to your Azure subscription. This will allow you to authenticate with your Azure account.
You'll need to assign the Storage Blob Data Contributor role to yourself before you can create a container or perform other data operations. Even if you are the account owner, you need explicit permissions to perform data operations against the storage account, and role assignments may take a few minutes to propagate.
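A hedged example of signing in and assigning the role to your own account; the assignee and the scope resource ID below are placeholders:

```bash
# Sign in, then grant yourself data-plane access on the storage account
az login

az role assignment create \
    --assignee "user@example.com" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/myResourceGroup/providers/Microsoft.Storage/storageAccounts/mystorageacct12345"
```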
Upload File
You can upload a file to Azure Blob Storage using the Azure CLI. To follow along, you'll need a file to upload; if you're using Azure Cloud Shell, you can create one there (for example, with a text editor) before running the upload.
Once you have a file, run az storage blob upload with the required parameter values. This command creates the blob if it doesn't already exist, and overwrites it if it does.
You can also use the az storage blob upload-batch command to upload multiple files at the same time. This command is useful for uploading large numbers of files; it issues the corresponding REST API calls for each file over the HTTP or HTTPS protocol.
When you pass blob content directly with the optional --data parameter, you can also supply --length to specify the number of bytes to read from the stream; this is optional, but supplying it gives optimal performance.
The az storage blob upload command also allows you to upload a file to a blob using a blob SAS URL: pass the full URL, including the SAS token, with the --blob-url parameter.
You can use the az storage blob upload command to upload a file to a storage blob; it automatically chunks the file and provides progress notifications. If you need to upload only files whose names match a specific format, such as 'cli-201x-xx-xx.txt', use the --pattern parameter of az storage blob upload-batch instead.
To upload a file to a block blob, you'll need to specify the source path and file name with the --file parameter, and the name of the destination container with the --container-name parameter. You'll also need to supply the --account-name parameter.
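Putting those parameters together, a single-file upload might look like this (all names are placeholders):

```bash
# Upload a local file to a block blob, authorizing with Microsoft Entra credentials
az storage blob upload \
    --account-name mystorageacct12345 \
    --container-name mycontainer \
    --name myfile.txt \
    --file ./myfile.txt \
    --auth-mode login
```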
You can use the az storage blob upload-batch command to recursively upload multiple blobs to a storage container, using Unix filename pattern matching to specify a range of files to upload.
Azure CLI Commands
You can use the az storage blob upload command to upload a file to a blob storage container. This command creates a new blob or overwrites the original blob if it already exists.
You can specify the source path and file name with the --file parameter, and the name of the destination container with the --container-name parameter. The --account-name parameter is also required.
To upload multiple files at the same time, you can use the az storage blob upload-batch command. This command allows you to recursively upload multiple blobs to a storage container. You can use Unix filename pattern matching to specify a range of files to upload with the --pattern parameter.
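For example, a batch upload of every .txt file under a local folder could look like this (names are placeholders):

```bash
# Recursively upload all .txt files from ./data to the container
az storage blob upload-batch \
    --account-name mystorageacct12345 \
    --destination mycontainer \
    --source ./data \
    --pattern "*.txt" \
    --auth-mode login
```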
Create a Container
To create a container in Azure CLI, you can use the az storage container create command.
You'll need to authorize the operation, which can be done using your Microsoft Entra account or the storage account key.
Assign the Storage Blob Data Contributor role to yourself before creating the container, even if you're the account owner.
You can also use the storage account key to authorize the operation; for more information, see Authorize access to blob or queue data with Azure CLI.
A storage account needs at least one container before you can upload any data.
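A minimal sketch of creating a container with Microsoft Entra authorization (names are placeholders):

```bash
# Create a container in the storage account
az storage container create \
    --account-name mystorageacct12345 \
    --name mycontainer \
    --auth-mode login
```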
Az Generate-Sas
The az storage blob generate-sas command is a powerful tool for generating shared access signatures for blobs in Azure. It allows you to grant temporary access to your blobs for read-only or other permissions.
You can generate a SAS token for a blob with read-only permissions using the az storage blob generate-sas command. Specify the permissions the SAS grants with the --permissions parameter; permission letters can be combined when you need more than one, such as read and write.
To have the SAS signed with the user delegation key, add the --as-user flag; in that case the --expiry parameter and --auth-mode login are required. This helps ensure the SAS is only valid for a limited time.
The --expiry parameter is the UTC datetime at which the SAS becomes invalid. Don't supply it if a stored access policy referenced with --policy-name already specifies the expiry time.
The az storage blob generate-sas command also allows you to specify, with the --ip parameter, the IP address or range of IP addresses from which to accept requests. This is useful for restricting access to your blobs to requests from specific locations.
By using the az storage blob generate-sas command, you can generate a shared access signature for a blob with read-only permissions and specify the IP address range from which requests are accepted.
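Putting that together, a read-only, user-delegation SAS restricted to an IP range might be generated like this (the names, expiry, and IP range are placeholders):

```bash
# Generate a read-only SAS signed with the user delegation key
az storage blob generate-sas \
    --account-name mystorageacct12345 \
    --container-name mycontainer \
    --name myfile.txt \
    --permissions r \
    --expiry 2030-01-01T00:00Z \
    --ip "203.0.113.0-203.0.113.255" \
    --auth-mode login \
    --as-user
```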
Commands
The Azure CLI provides a range of commands for working with Azure Storage, including blob storage.
You can create a new blob from a file path using the `az storage blob upload` command. This command also allows you to update the content of an existing blob with automatic chunking and progress notifications.
The `az storage blob upload-batch` command is used to upload files from a local directory to a blob container. This command is particularly useful for recursively uploading multiple files to a storage container.
Before you can upload blobs, you need to create a container using the `az storage container create` command. This command allows you to organize groups of blobs in containers, similar to how you organize files on your computer in folders.
To upload a single file, you can use the `az storage blob upload` command, which creates a new blob or overwrites the original blob if it already exists. This command requires you to specify the source path and file name with the `--file` parameter, as well as the name of the destination container with the `--container-name` parameter.
The `az storage blob generate-sas` command is used to generate a shared access signature for a blob, which allows you to grant permissions to users or services to access the blob without sharing the account key. This command requires you to specify the permissions the SAS grants, such as read-only or write permissions.
You can also use the `--if-modified-since` parameter with the `az storage blob upload-batch` command so that only files modified since the supplied UTC datetime are uploaded.
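For instance, a batch upload that skips files unchanged since the given UTC datetime might look like this (the values are placeholders):

```bash
# Upload only files modified since the supplied UTC datetime
az storage blob upload-batch \
    --account-name mystorageacct12345 \
    --destination mycontainer \
    --source ./data \
    --if-modified-since 2030-01-01T00:00Z \
    --auth-mode login
```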
Azure URL
The az storage blob url command in the Azure CLI builds the URL used to access a blob.
To build the URL, you can give either a container name (together with the blob name) or a container URL as the source; when the source is a container URL, the storage account name is parsed from the URL automatically.
The command works with blobs in standard storage accounts, and once you have the URL you can share it, or append a SAS token to grant temporary access without exposing the account key.
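A minimal sketch (names are placeholders):

```bash
# Print the URL used to access a blob
az storage blob url \
    --account-name mystorageacct12345 \
    --container-name mycontainer \
    --name myfile.txt \
    --auth-mode login
```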
Manage Properties
A blob exposes both system properties and user-defined metadata. System properties exist on each Blob Storage resource.
Some system properties are read-only, while others can be read or set. Under the covers, some system properties map to certain standard HTTP headers.
User-defined metadata consists of one or more name-value pairs that you specify for a Blob Storage resource. You can use metadata to store additional values with the resource.
To read blob metadata, use the az storage blob metadata show command.
Metadata values are for your own purposes, and don't affect how the resource behaves.
To update blob metadata, use az storage blob metadata update and supply one or more space-separated key=value pairs with the --metadata parameter.
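A sketch of reading and then updating metadata on a blob (the key=value pairs are just examples):

```bash
# Show the blob's current metadata
az storage blob metadata show \
    --account-name mystorageacct12345 \
    --container-name mycontainer \
    --name myfile.txt \
    --auth-mode login

# Add or update space-separated key=value pairs
az storage blob metadata update \
    --account-name mystorageacct12345 \
    --container-name mycontainer \
    --name myfile.txt \
    --metadata department=finance project=demo \
    --auth-mode login
```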
Set Tier
You can change a blob's tier using the az storage blob set-tier command. This command moves the blob and all its data to the target tier.
You can change the tier between hot, cool, and archive. The Copy Blob operation is also an option, but it creates a new blob in the desired tier while leaving the source blob in the original tier.
Changing tiers from cool or hot to archive takes place almost immediately. After a blob is moved to the archive tier, it's considered offline and can't be read or modified.
You'll need to rehydrate the blob to an online tier before you can read or modify its data. This process is called Blob rehydration from the archive tier.
The az storage blob set-tier command can be used to set the tier to hot for a single, named blob within the archive container.
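That might look like this (the account and blob names are placeholders):

```bash
# Move a single named blob in the 'archive' container to the Hot tier
az storage blob set-tier \
    --account-name mystorageacct12345 \
    --container-name archive \
    --name myfile.txt \
    --tier Hot \
    --auth-mode login
```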
Deploy Updates
When using a CDN, you need to refresh its cache after uploading new files to your storage account; the CDN won't serve the new versions until you explicitly tell it to.
To do this, purge the CDN endpoint's cache with the az cdn endpoint purge command, and be aware that the purge can take a few minutes to complete.
The command below includes the --no-wait flag, so it returns immediately while the purge continues in the background.
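A hedged example of purging an Azure CDN endpoint; the profile and endpoint names are placeholders:

```bash
# Purge every cached path on the endpoint; --no-wait returns immediately
az cdn endpoint purge \
    --resource-group myResourceGroup \
    --profile-name myCdnProfile \
    --name myCdnEndpoint \
    --content-paths '/*' \
    --no-wait
```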
You have to wait until the CDN cache is refreshed for all the nodes before you can see the changes on your website.