Azure Blob Storage Python Development Guide

Azure Blob Storage is a highly scalable and secure object storage solution that can be used to store and serve large amounts of unstructured data.

To get started with Azure Blob Storage in Python, you'll need to install the Azure Storage Blobs client library (azure-storage-blob) using pip.

This library provides a simple and intuitive API for creating, reading, and deleting blobs, as well as uploading and downloading files from Blob Storage.

With the library installed, you can start interacting with Blob Storage by importing the BlobServiceClient and creating a client instance to connect to your storage account.
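
For example, a minimal sketch of that setup might look like the following, assuming the connection string is stored in an environment variable (the variable name AZURE_STORAGE_CONNECTION_STRING is just a choice made for this example):

```python
import os

from azure.storage.blob import BlobServiceClient

# The environment variable name below is an example choice for this sketch.
connection_string = os.environ["AZURE_STORAGE_CONNECTION_STRING"]

# One client for the whole storage account.
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

# Quick sanity check that the client is pointed at the right account.
print(blob_service_client.account_name)
```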

Getting Started

To get started with Azure Blob Storage in Python, you'll need to have Python 3.8 or later installed on your machine.

You can check your Python version by running `python --version` in your terminal. If you're using an older version, you'll need to upgrade to 3.8 or later.

An Azure subscription is also required to use this package. You can create one for free, which is a great way to start exploring Azure services without incurring any costs.

Having an Azure storage account is also essential. You can create a storage account easily by following the prompts in the Azure portal.

To access Azure Blob Storage from your code, you'll need to install the Azure Blob Storage client library. This can be done by running a pip install command in your terminal.

Here's a quick rundown of the prerequisites you'll need to get started:

  • Azure subscription (create one for free)
  • Azure storage account (create a storage account)
  • Python 3.8+

The azure-identity package is also needed for passwordless connections to Azure services. This can be installed along with the Azure Blob Storage client library using pip.

Azure Blob Storage Client

The Azure Blob Storage Client is a Python library that allows you to interact with Azure Blob Storage resources. To create a client object, you need the storage account's blob service account URL and a credential that allows you to access the storage account.

You can create the client from a connection string, which can be found in the Azure Portal under the "Access Keys" section or retrieved with the Azure CLI command `az storage account show-connection-string`. The connection string bundles the account name, endpoints, and access key that your code needs to connect to the storage account.

To authorize access and connect to Blob Storage, you can use a Microsoft Entra authorization token, an account access key, or a shared access signature (SAS). For optimal security, Microsoft recommends using Microsoft Entra ID with managed identities to authorize requests against blob data.

Here are the different ways to authorize access to Blob Storage:

  • Microsoft Entra ID authorization token (recommended, for example via a managed identity or DefaultAzureCredential)
  • Account access key
  • Shared access signature (SAS)
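
For the recommended passwordless option, a minimal sketch using the azure-identity package might look like this; the account URL below is a placeholder you'd replace with your own storage account's blob endpoint:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder: substitute your storage account's blob endpoint.
account_url = "https://<storage-account-name>.blob.core.windows.net"

# DefaultAzureCredential works with managed identities, the Azure CLI login,
# environment variables, and other credential sources.
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())
```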

Install the Package

To install the Azure Storage Blobs client library for Python, use pip, Python's package manager. From a terminal or command prompt in your project directory, run `pip install azure-storage-blob`.

The azure-identity package is required for passwordless connections to Azure services, so be sure to include it in the same installation: `pip install azure-storage-blob azure-identity`.

The Client

There are four different clients provided to interact with the various components of the Blob Service: the storage account itself, a specific container, a specific blob, and leases on containers or blobs.

The BlobServiceClient represents interaction with the Azure storage account itself, allowing you to acquire preconfigured client instances to access the containers and blobs within.

The ContainerClient represents interaction with a specific container, and allows you to acquire preconfigured client instances to access the blobs within.

The BlobClient represents interaction with a specific blob, and provides operations to upload, download, delete, and create snapshots of a blob.

The BlobLeaseClient represents lease interactions with a ContainerClient or BlobClient, and provides operations to acquire, renew, release, change, and break a lease on a specified resource.

Here are the four clients and their descriptions:

  • BlobServiceClient: interaction with the Azure storage account itself
  • ContainerClient: interaction with a specific container
  • BlobClient: interaction with a specific blob
  • BlobLeaseClient: lease interactions with a ContainerClient or BlobClient
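
To show how the four clients relate, here's a rough sketch that derives the more specific clients from a BlobServiceClient; the account URL, container name, and blob name are placeholders for this example:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, BlobLeaseClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# ContainerClient: scoped to a single container.
container_client = blob_service_client.get_container_client("sample-container")

# BlobClient: scoped to a single blob within that container.
blob_client = container_client.get_blob_client("sample-blob.txt")

# BlobLeaseClient: manages a lease on a container or blob.
lease_client = BlobLeaseClient(blob_client)
lease_client.acquire(lease_duration=15)  # 15-60 seconds, or -1 for an infinite lease
lease_client.release()
```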

Key Concepts

In Azure Blob Storage, there are three main components: the storage account, a container within the storage account, and a blob within a container.

Each of these components can be interacted with using a dedicated client object through the Azure Storage Blobs client library for Python.

The storage account is the top-level entity, and it's where you'll store your data. A container is a logical grouping of related blobs, and a blob is the actual data stored in the container.

Key Concepts

The Azure Blob Service is made up of three key components: the storage account, container, and blob. The storage account is the top-level entity that holds your data.

A storage account is the foundation of your Azure Blob Service, and it's where all your data is stored. You can think of it like a digital file cabinet.

Containers are nested within storage accounts and are used to organize your data. You can create multiple containers within a single storage account.

A container is a logical grouping of blobs, and it's like a folder in your file cabinet. You can think of it as a place to store related files together.

Blobs are the actual data stored in Azure Blob Service. They can be images, videos, documents, or any other type of file.

Here are the three key components of the Azure Blob Service:

  • Storage Account
  • Container
  • Blob
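
To make the hierarchy concrete, here's a small sketch (using a placeholder account URL) that walks from the storage account down through its containers to the individual blobs:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Storage account -> containers -> blobs
for container in blob_service_client.list_containers():
    container_client = blob_service_client.get_container_client(container.name)
    for blob in container_client.list_blobs():
        print(f"{container.name}/{blob.name}")
```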

Types

When working with blobs, it's essential to understand the different types available. There are three primary types of blobs: Block, Append, and Page.

Block blobs store text and binary data up to approximately 4.75 TiB. They're made up of blocks of data that can be managed individually.

Append blobs are similar to block blobs but are optimized for append operations. This makes them ideal for scenarios such as logging data from virtual machines.

Page blobs store random access files up to 8 TiB in size. They're often used to store virtual hard drive (VHD) files and serve as disks for Azure virtual machines.

Here's a quick rundown of the different blob types:

  • Block blobs: store text and binary data up to approximately 4.75 TiB, made up of blocks that can be managed individually
  • Append blobs: composed of blocks like block blobs, but optimized for append operations such as logging from virtual machines
  • Page blobs: store random access files up to 8 TiB, often used for virtual hard drive (VHD) files that serve as disks for Azure virtual machines
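
As an illustration of how append blobs differ in practice, here's a hedged sketch of writing log lines to an append blob; the account URL, container name, and blob name are placeholders, and the container is assumed to already exist:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Assumes a container named "logs" already exists in the account.
blob_client = blob_service_client.get_blob_client(container="logs", blob="vm-log.txt")

# Create the append blob once, then append new entries over time.
if not blob_client.exists():
    blob_client.create_append_blob()

blob_client.append_block(b"2024-05-01T10:00:00Z VM started\n")
blob_client.append_block(b"2024-05-01T10:05:00Z Health check OK\n")
```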

Uploading and Managing Files

You can upload a blob using the async client. To do this, you'll need to use the upload_blob method.

To upload blobs to a container, create a text file in your local data directory and pass its contents to the upload_blob method, as in the sketch below.
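
A minimal sketch of that flow might look like this; it uses the synchronous client for simplicity, and the account URL, data directory, and container name are assumptions made for the example:

```python
import os
import uuid

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Create a local text file to upload (directory and file names are example choices).
local_path = "./data"
os.makedirs(local_path, exist_ok=True)
file_name = f"quickstart-{uuid.uuid4()}.txt"
upload_file_path = os.path.join(local_path, file_name)
with open(upload_file_path, mode="w") as f:
    f.write("Hello, World!")

# Upload the file's contents to a blob with the same name
# (assumes a container named "sample-container" already exists).
blob_client = blob_service_client.get_blob_client(container="sample-container", blob=file_name)
with open(upload_file_path, mode="rb") as data:
    blob_client.upload_blob(data)
```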

You can verify that file uploads are working by starting the Flask application and navigating to localhost:5000 in your browser. After uploading a couple of files, you should see the filenames of the uploaded files on the page.

Container Upload

To upload a file to a container, you can use the async client to upload a blob. This is a straightforward process that allows you to upload a file from your local machine to the cloud.

You can also upload blobs to a container using the upload_blob method, for example by creating a text file in the local data directory and uploading its contents to the container.

To create a new container in your storage account, you can call the create_container method on the blob_service_client object. The container name must be unique within the storage account and must be in lowercase, as in the sketch below.
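
For instance, a short sketch of creating a container might look like this; the account URL is a placeholder, and appending a UUID to the name is just one way to keep the example name unique:

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Container names must be lowercase and unique within the storage account;
# appending a UUID is one simple way to keep this example name unique.
container_name = f"quickstart-{uuid.uuid4()}"

# create_container returns a ContainerClient for the new container.
container_client = blob_service_client.create_container(container_name)
```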

Once you've uploaded a file to a container, you can verify that the file upload is working by navigating to the Storage Account in the portal and checking the container for the uploaded file.

File Hashes

File hashes are a crucial aspect of uploading and managing files. They allow you to verify the integrity and authenticity of a file, which is especially important when downloading files from the internet.

You can find hashes for uploaded files in the file details section. For example, the file "azure_storage_blob-12.24.0.tar.gz" has the following hashes: SHA256 eaaaa1507c8c363d6e1d1342bd549938fdf1adec9b1ada8658c8f5bf3aea844e, MD5 1c5ba95b091889f6f04f4e1dab7135be, and BLAKE2b-256 fef65a94fa935933c8483bf27af0140e09640bd4ee5b2f346e71eee06c197482.

File hashes can be used to verify the integrity of a file by comparing the hash of the downloaded file with the hash provided by the uploader. If the hashes match, you can be confident that the file has not been tampered with or corrupted during transmission.
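
As a quick illustration using only the Python standard library, you could compute a downloaded file's SHA256 and compare it with the published value; the expected hash below is a placeholder:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "<published-sha256-hash>"  # placeholder for the hash listed by the uploader
actual = sha256_of("azure_storage_blob-12.24.0.tar.gz")
print("match" if actual == expected else "MISMATCH")
```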

The file "azure_storage_blob-12.24.0-py3-none-any.whl" has the following hashes: SHA256 4f0bb4592ea79a2d986063696514c781c9e62be240f09f6397986e01755bc071, MD5 6a2a51001dbb13f166e81068947076ef, and BLAKE2b-256 e2f8ef0f76f8c424bedd20c685409836ddfb42ac76fd8a0f21c3c3659cf7207d.

Here is a list of the file hashes for the two files:

  • azure_storage_blob-12.24.0.tar.gz
      • SHA256: eaaaa1507c8c363d6e1d1342bd549938fdf1adec9b1ada8658c8f5bf3aea844e
      • MD5: 1c5ba95b091889f6f04f4e1dab7135be
      • BLAKE2b-256: fef65a94fa935933c8483bf27af0140e09640bd4ee5b2f346e71eee06c197482
  • azure_storage_blob-12.24.0-py3-none-any.whl
      • SHA256: 4f0bb4592ea79a2d986063696514c781c9e62be240f09f6397986e01755bc071
      • MD5: 6a2a51001dbb13f166e81068947076ef
      • BLAKE2b-256: e2f8ef0f76f8c424bedd20c685409836ddfb42ac76fd8a0f21c3c3659cf7207d

Download

Downloading files is a crucial part of managing them. You can download previously uploaded blobs by calling the download_blob method, and you can add a suffix to the local file name to keep track of the original and downloaded copies.

In the example, the code adds a "DOWNLOAD" suffix to the file name to distinguish the downloaded copy from the original file.
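
A hedged sketch of that pattern might look like this; the account URL, container name, blob name, and local paths are example choices, and the blob is assumed to already exist:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Assumes the container and blob from the upload example exist.
blob_client = blob_service_client.get_blob_client(container="sample-container", blob="quickstart.txt")

# Add a DOWNLOAD suffix so the downloaded copy doesn't overwrite the original file.
download_file_path = os.path.join("./data", "quickstart-DOWNLOAD.txt")

with open(download_file_path, mode="wb") as download_file:
    download_file.write(blob_client.download_blob().readall())
```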

You can learn more about downloading files and explore more code samples by checking out the download a blob with Python resource.

Project Setup and Deployment

To set up your project for Azure Blob Storage in Python, you'll need to initialize the Azure Developer CLI template and deploy resources. This can be done from an empty directory by cloning the quickstart repository assets from GitHub and initializing the template locally using `azd init --template blob-storage-quickstart-python`.

You'll be prompted to log in to Azure using `azd auth login` and then provision and deploy the resources to Azure with `azd up`. This process may take a few minutes to complete, and the output will include the name of the newly created storage account, which you'll need later to run the code.

Once you have your project set up, you'll need to install the necessary packages, including the Azure Blob Storage and Azure Identity client libraries, using `pip install`.

The Project

To start your project, you'll want to create a new directory for it. In a console window, run the command `mkdir blob-quickstart` to create a new directory named blob-quickstart.

Next, switch to the newly created directory by running `cd blob-quickstart`. This will change the directory you're currently working in to the one you just created.

Initialize Developer CLI Template and Deploy Resources

To initialize the Azure Developer CLI template and deploy resources, clone the quickstart repository assets from GitHub and initialize the template locally by running the command `azd init --template blob-storage-quickstart-python`. You'll be prompted for some setup information during initialization.

You'll need to log in to Azure by running the command `azd auth login`. This will authenticate your Azure account and prepare it for the deployment process.

Next, you'll need to provision and deploy the resources to Azure by running the command `azd up`. You'll be prompted for additional deployment information, and a new storage account will be created as part of the deployment process.

The deployment might take a few minutes to complete, and the output from the `azd up` command will include the name of the newly created storage account, which you'll need later to run the code.

Here's a summary of the steps:

  • Clone the quickstart repository assets from GitHub
  • Initialize the template locally
  • Log in to Azure
  • Provision and deploy the resources to Azure
  • Note the name of the newly created storage account

Remember to have your Azure account ready and authenticated before starting the deployment process.
