Golang Azure Blob Storage Essentials for Developers

Azure Blob Storage is a great choice for Go applications that need to store and manage large amounts of unstructured data. It provides a highly available and durable storage solution for your Golang applications.

With Azure Blob Storage, you can store and serve large files, images, videos, and other binary data. This is particularly useful for applications that need high throughput and low-latency data access.

To get started with Golang Azure Blob Storage, you'll need to install the Azure SDK for Go, which provides a simple and intuitive API for interacting with Azure Blob Storage. This SDK is available on GitHub and can be installed using the go get command.

Getting Started

To get started with Go and Azure Blob Storage, you'll need Go version 1.18 or higher installed on your machine. Install Go if you haven't already.

You'll also need an Azure subscription, which you can create for free. This will give you access to the Azure services you'll need for blob storage.

To create a storage account, you can use the Azure portal, Azure PowerShell, or the Azure CLI. Here's an example of how to create a storage account using the Azure CLI:
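A minimal sketch of that CLI flow (the resource group, account name, and region below are placeholders, and storage account names must be globally unique):

```shell
# Create a resource group to hold the storage account
az group create --name blob-demo-rg --location eastus

# Create a general-purpose v2 storage account with locally redundant storage
az storage account create \
  --name mystorageacct12345 \
  --resource-group blob-demo-rg \
  --location eastus \
  --sku Standard_LRS
```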

To interact with Azure Blob Storage, you'll need to install the Azure Blob Storage client module for Go with `go get`. If you plan to authenticate with Azure Active Directory, you'll also need to install the `azidentity` module.
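Both modules live in the azure-sdk-for-go repository and are fetched with `go get` from inside your Go module:

```shell
# Blob Storage client module
go get github.com/Azure/azure-sdk-for-go/sdk/storage/azblob

# Optional: Azure Active Directory authentication support
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity
```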

To create an instance of the `azblob.Client` type, you'll need to use the `azidentity` module to add Azure Active Directory support for authenticating Azure SDK clients with their corresponding Azure services.

You can learn more about enabling Azure Active Directory for authentication with Azure Storage by checking out the official documentation.
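A minimal sketch of wiring the two modules together (the service URL is a placeholder for your own account's blob endpoint):

```go
package main

import (
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
)

func main() {
	// Placeholder: replace with your storage account's blob endpoint.
	serviceURL := "https://<storage-account-name>.blob.core.windows.net/"

	// DefaultAzureCredential tries environment variables, managed identity,
	// and Azure CLI sign-in, in that order.
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatal(err)
	}

	client, err := azblob.NewClient(serviceURL, cred, nil)
	if err != nil {
		log.Fatal(err)
	}
	_ = client // ready for container and blob operations
}
```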

Here are the basic authentication options for Azure Blob Storage:

  • Azure Active Directory
  • Connection strings
  • Shared key
  • Shared access signatures (SAS)
  • Anonymous public access

Choose the authentication mechanism that works best for your project and use the corresponding client constructor function.

Azure Blob Storage Overview

Azure Blob Storage is a highly available, durable, and scalable object store that can store any type of data.

It provides a secure and private way to store and serve files over the internet.

Azure Blob Storage supports three types of blobs: Block Blobs, Page Blobs, and Append Blobs, each with its own unique characteristics.

These characteristics allow for efficient storage and retrieval of large amounts of data.

Azure Blob Storage also offers a range of features such as data encryption, access control, and content delivery networks (CDNs) to ensure secure and fast data transfer.

To use Azure Blob Storage with Golang, you'll need to create a storage account and obtain the necessary credentials.

These credentials will be used to authenticate and authorize access to your storage account.

Azure Blob Storage provides a REST API that can be used to interact with the service, including creating, reading, updating, and deleting blobs.

This API can be used to implement a variety of use cases, such as storing and serving static website content.

Creating and Managing

Creating and Managing Containers in Azure Blob Storage is a breeze with the Go SDK. You can create a new container under a specified account using the CreateContainer method, which returns a client to interact with the newly created container.

To create a new container, you can use the CreateContainer method, which takes a CreateContainerOptions object. This object contains optional parameters for the container creation process.

Here are some key things to keep in mind when creating and managing containers:

  • CreateContainer returns an error if a container with the same name already exists (the service reports a ContainerAlreadyExists error).
  • CreateContainerOptions contains optional parameters for the container creation process.
  • After creating a container, you can interact with it using the returned client.

The CreateContainerResponse object contains the response from the container creation method, which can be useful for debugging or logging purposes.

Create

You can create a new container under a specified account with the CreateContainer function, which returns a client for interacting with the newly created container.

To create a container, you'll need to use the CreateContainer function, which is available in the Client object. The function takes an optional CreateContainerOptions parameter, which allows you to specify additional settings for the container.

Here are the basic steps to create a container:

  • Create a client object using the azblob.NewClient function.
  • Call the CreateContainer function on the client object, passing in the desired container name and optional settings.

Note that if a container with the same name already exists, CreateContainer will return an error rather than succeed.
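The steps above can be sketched as follows, assuming an azblob.Client named client has already been created as described in Getting Started ("uploads" is an example container name; imports such as context and log are omitted for brevity):

```go
// Create the "uploads" container; nil means default CreateContainerOptions.
ctx := context.TODO()
_, err := client.CreateContainer(ctx, "uploads", nil)
if err != nil {
	// The returned error includes the case where the container already exists.
	log.Fatal(err)
}
```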

New Client from Connection

Creating a client object is a crucial step in working with Azure resources, and one way to do this is by using the NewClientFromConnectionString function. This function creates an instance of Client with the specified values.

To use NewClientFromConnectionString, you'll need a connection string for the desired storage account. This is typically obtained from the Azure portal or created programmatically.

Here's a breakdown of the required parameters:

  • connectionString - a connection string for the desired storage account
  • options - client options; pass nil to accept the default values

You can pass in client options to customize the client object, but you can also pass nil to accept the default values. This function is a convenient way to create a client object without having to manually create a credential object.
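A minimal sketch (the connection string below is a placeholder; copy the real one from the Access keys blade of your storage account in the Azure portal):

```go
package main

import (
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
)

func main() {
	// Placeholder connection string; use the one from your storage account.
	connStr := "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

	client, err := azblob.NewClientFromConnectionString(connStr, nil)
	if err != nil {
		log.Fatal(err)
	}
	_ = client // ready for container and blob operations
}
```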

New List Pager

To list all blobs in a particular container of an Azure Storage Account, you can use a pager, which delivers items in paged results.

All available Azure SDK modules use pagers to deliver items in paged results, allowing you to access all blobs in a particular container by iterating over the pages.

You can use NewListBlobsFlatPager to list the blobs in a container, starting from an empty Marker to begin enumeration from the beginning.

This method returns blob names in lexicographic order, so you can easily find the blob you need.

By continuing to page until there are no more results, you can ensure that you've listed all the blobs in the container.

The pager is an efficient way to handle large numbers of blobs, making it a great tool for managing your Azure Storage Account.
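Paging through a container can be sketched like this, assuming an azblob.Client named client and an example container named "uploads" (imports such as context, fmt, and log are omitted for brevity):

```go
ctx := context.TODO()
pager := client.NewListBlobsFlatPager("uploads", nil)

// Keep paging until the service reports no more results.
for pager.More() {
	page, err := pager.NextPage(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, blob := range page.Segment.BlobItems {
		fmt.Println(*blob.Name) // names arrive in lexicographic order
	}
}
```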

Buffer In Version

The Client function UploadBuffer was added in version 0.5.0.

This function allows you to upload a buffer in blocks to a block blob, which is a type of binary large object.

The UploadBuffer function is a key part of managing block blobs, allowing you to upload large amounts of data in smaller chunks.

This approach can be beneficial for uploading large files, as it enables you to start uploading the file as soon as it's partially available.

By breaking up the upload into smaller blocks, you can also handle errors and interruptions more efficiently.
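A sketch of UploadBuffer, again assuming an azblob.Client named client and a container named "uploads" (the block size and concurrency values are illustrative, not recommendations):

```go
ctx := context.TODO()
data := []byte("hello, blob storage")

_, err := client.UploadBuffer(ctx, "uploads", "greeting.txt", data,
	&azblob.UploadBufferOptions{
		BlockSize:   4 * 1024 * 1024, // split the buffer into 4 MiB blocks
		Concurrency: 4,               // upload up to 4 blocks in parallel
	})
if err != nil {
	log.Fatal(err)
}
```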

Account Provision for Testing

To provision an Azure Storage Account for testing purposes, you can use Azure CLI to quickly spin up a new account and a container.

You'll need access to an Azure Storage Account for this purpose.

For demonstration purposes, a new Azure Storage Account and a container (folder) inside of that Storage Account are created using Azure CLI.

The script will also create a new role assignment that assigns the Storage Blob Data Contributor role to the user currently signed in with Azure CLI.

If you're unfamiliar with Azure CLI, you can use Azure Portal or other management interfaces to create the Azure Storage Account.

Two important variables are defined: storageAccountName and storageAccountKey.

These variables are used later to pass the necessary information to your Go application.

With the storage account and the uploads container in place, you can use the Azure Storage Account from within Go.
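A provisioning sketch along those lines (all names are placeholders; the role assignment step assumes you are signed in with `az login`):

```shell
# Placeholders: pick your own names and region.
storageAccountName="blobdemo$RANDOM"
resourceGroup="blob-demo-rg"

az group create --name "$resourceGroup" --location eastus
az storage account create --name "$storageAccountName" \
  --resource-group "$resourceGroup" --sku Standard_LRS

# Capture the primary access key for the Go application.
storageAccountKey=$(az storage account keys list \
  --account-name "$storageAccountName" \
  --resource-group "$resourceGroup" \
  --query "[0].value" -o tsv)

# Create the uploads container.
az storage container create --name uploads \
  --account-name "$storageAccountName" \
  --account-key "$storageAccountKey"

# Grant the signed-in user blob read/write access.
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee "$(az ad signed-in-user show --query id -o tsv)" \
  --scope "$(az storage account show --name "$storageAccountName" \
      --resource-group "$resourceGroup" --query id -o tsv)"
```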

Client Options

When creating a client object, you can pass options to customize its behavior. You can pass nil to accept the default values.

Client options customize the behavior of the client object. They are passed as the last argument to the client constructor functions: the second argument of NewClientWithNoCredential, or the third argument of NewClient. Pass nil to accept the default values.

For azblob.NewClientWithNoCredential, the parameters are:

  • serviceURL - the URL of the storage account
  • options - client options; pass nil to accept the default values
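As a sketch, retry behavior can be tuned through the options struct; the Retry field comes from the shared azcore client options embedded in azblob.ClientOptions (cred and serviceURL as in the earlier client setup, imports omitted for brevity):

```go
// Start from default options, then override what you need.
opts := &azblob.ClientOptions{}
opts.Retry.MaxRetries = 5 // illustrative value, not a recommendation

client, err := azblob.NewClient(serviceURL, cred, opts)
if err != nil {
	log.Fatal(err)
}
_ = client
```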

New Client in Version

Version 0.5.0 of the azblob module added the NewClient constructor function.

The NewClient function requires three parameters: serviceURL, cred, and options.

The serviceURL is the URL of the storage account, which can be found in the Azure portal.

A credential, typically obtained via the azidentity module, is also required.

Options can be passed in, or nil can be used to accept the default values.

Here's a quick rundown of the parameters for NewClient:

  • serviceURL - the URL of the storage account
  • cred - a token credential, typically created with the azidentity module
  • options - client options; pass nil to accept the default values

New Shared Key Client

When creating a client object to interact with Azure storage, you'll need to decide on the type of credential to use. One option is the SharedKeyCredential.

The SharedKeyCredential is an immutable object containing the storage account's name and either its primary or secondary key. This credential is used to authenticate with the storage account.

To create a SharedKeyCredential, you can use the NewSharedKeyCredential function. This function takes two parameters: the storage account's name and either its primary or secondary key.

Here's a step-by-step guide to creating a SharedKeyCredential:

  • Get the storage account's name and access key from the Azure portal.
  • Call the NewSharedKeyCredential function, passing in the storage account's name and access key.
  • The function returns a new SharedKeyCredential object, which you can use to authenticate with the storage account.

Once you have a SharedKeyCredential, you can use it to create a client object with the NewClientWithSharedKeyCredential function. This function takes three parameters: the service URL, the SharedKeyCredential, and client options.
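Putting the two calls together (the account name and key are placeholders; copy the real values from the Azure portal, and note that imports are omitted for brevity):

```go
// Build the shared key credential from the account name and access key.
cred, err := azblob.NewSharedKeyCredential("<account-name>", "<account-key>")
if err != nil {
	log.Fatal(err)
}

serviceURL := "https://<account-name>.blob.core.windows.net/"
client, err := azblob.NewClientWithSharedKeyCredential(serviceURL, cred, nil)
if err != nil {
	log.Fatal(err)
}
_ = client
```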

Rosemary Boyer

Writer

Rosemary Boyer is a skilled writer with a passion for crafting engaging and informative content. With a focus on technical and educational topics, she has established herself as a reliable voice in the industry. Her writing has been featured in a variety of publications, covering subjects such as CSS Precedence, where she breaks down complex concepts into clear and concise language.
