Create and Manage Azure Blob Storage Effectively

Creating Azure Blob Storage effectively requires understanding its core components. Azure Blob Storage is a highly available, durable, and scalable object store.

To create Azure Blob Storage, you need to choose a region for the storage account and a default access tier, which can be hot or cool. The hot tier is designed for frequently accessed data, while the cool tier is optimized for infrequently accessed data.

Azure Blob Storage offers a range of features, including data redundancy, which ensures data durability and availability. By default, a new storage account uses locally redundant storage (LRS), which keeps three copies of your data within a single datacenter. Geo-redundant options such as GRS and RA-GRS (Read-Access Geo-Redundant Storage) additionally replicate those copies to a paired secondary region.

Azure Blob Storage Basics

To start creating your Azure Blob Storage, you'll need to provide essential information on the Basics tab. This tab is where you'll find the fields that are required to create a new storage account.

The Basics tab is divided into two main sections: Project details and Instance details. In the Project details section, you'll need to select a subscription and a resource group for your new storage account.

You can either create a new resource group or select an existing one. For more information on resource groups, you can check out the Azure documentation.

To name your storage account, choose a name that's between 3 and 24 characters long and globally unique across Azure. Storage account names can contain only lowercase letters and numbers.
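
These naming rules are easy to check locally before you submit the form; here's a minimal sketch (the function name is illustrative):

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    """Check Azure storage account naming rules:
    3-24 characters, lowercase letters and digits only."""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None
```

Note that passing this check doesn't guarantee the name is available — global uniqueness can only be verified by Azure itself.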

The region you choose for your storage account can have a billing impact, so it's essential to select the appropriate region for your needs. Not all regions are supported for all types of storage accounts or redundancy configurations.

For instance details, you'll need to select the performance level for your storage account. The default option is Standard performance, which is recommended by Microsoft for most scenarios.

You'll also need to select the redundancy configuration for your storage account. This will determine how your data is replicated and stored.

Here's a summary of the required fields on the Basics tab:

  • Subscription: the subscription that will be billed for the account
  • Resource group: a new or existing resource group for the account
  • Storage account name: 3–24 characters, lowercase letters and numbers only
  • Region: the Azure region where the account is deployed
  • Performance: Standard (default) or Premium
  • Redundancy: how your data is replicated and stored

Accessing and Managing Data

You have several options to access and manage data in Azure Blob Storage.

The Azure Portal can be used to upload, download, and manage data in Blob Storage via the Azure Storage Explorer. This is a great option if you're new to Azure or prefer a graphical interface.

Azure Storage REST APIs can also be used to access Blob Storage data programmatically from your application or service. This is a good choice if you're building a custom application that needs to interact with Blob Storage.

Here are some of the key methods for accessing and managing data in Azure Blob Storage:

  • Azure Storage REST APIs
  • Azure PowerShell and CLI
  • Azure SDKs for .NET, Java, Python, and other programming languages
  • Azure Data Factory
  • Azure Functions
  • Third-party tools such as CloudBerry Explorer, Cyberduck, or Storage Made Easy

These tools and APIs provide a range of options for accessing and managing data in Azure Blob Storage, depending on your specific needs and preferences.

Accessing Data

You can access Azure Blob Storage data in various ways.

One of the simplest methods is to use the Azure Portal, specifically the Azure Storage Explorer, to upload, download, and manage data in Blob Storage.

Azure Storage REST APIs can also be used to access Blob Storage data programmatically from your application or service.

Azure PowerShell and CLI can be used to access Blob Storage data and perform standard storage management tasks.

Azure SDKs for .NET, Java, Python, and other programming languages can be used to access Blob Storage data and integrate it with your application or service.
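
As a concrete illustration of the SDK route, here's a minimal sketch using the azure-storage-blob Python package. It assumes the package is installed and a real connection string is supplied; the function name and arguments are illustrative:

```python
def upload_and_read_blob(connection_string: str, container: str,
                         blob_name: str, data: bytes) -> bytes:
    """Upload bytes to a blob and read them back via the azure-storage-blob SDK.

    The import lives inside the function because it requires
    `pip install azure-storage-blob` and real credentials to run.
    """
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    blob = service.get_blob_client(container=container, blob=blob_name)
    blob.upload_blob(data, overwrite=True)   # create or replace the blob
    return blob.download_blob().readall()    # stream the blob back as bytes
```

The same pattern — build a service client, get a blob client, call upload/download — applies across the .NET, Java, and other SDKs.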

Azure Data Factory can be used to transport and transform data between Blob Storage and other data sources like SQL Server or Hadoop.

Azure Functions can be used to develop serverless functions that process and alter data stored in Blob Storage.

If you want to connect Fivetran to your Azure Blob Storage container, you can do so directly, which is the simplest connection method.

If you have a firewall enabled, you'll need to create a firewall rule to allow access to Fivetran's IPs.

Here are the main methods for accessing Azure Blob Storage data:

  • Azure Portal (Azure Storage Explorer)
  • Azure Storage REST APIs
  • Azure PowerShell and CLI
  • Azure SDKs
  • Azure Data Factory
  • Azure Functions
  • Third-party tools
  • Fivetran direct connection

Media Library Files

Storing a large number of media library files in a single media library can significantly affect the performance and user experience of the Media libraries application.

Files stored in Azure Blob storage have some limitations to be aware of. They can't be mapped to subfolders, so you'll need to map either the directory containing the media libraries or individual media libraries.

If you modify a media file stored in external storage, the website may still display the old version until the cache expires. This is because the system's automatic clearing of files from the server-side cache doesn't work for external storage.

To avoid this issue, you can manually clear the application's cache.

Components and Features

Azure Blob Storage is made up of several key components that work together to help you store and manage your data.

A Storage Account is the foundation of Azure Blob Storage and defines the namespace for your data: every object stored in the account gets a unique address that incorporates the account name.

You can store multiple containers within a Storage Account, which act as directories to help you organize and manage your blobs. Container names are always specified in lowercase.

Blob Storage also offers advanced features such as object-level tiering, which allows you to store and manage data based on its frequency of use and temporal sensitivity. This includes the hot, cool, and archive access tiers.

Here are the main components of Azure Blob Storage:

  • Storage Account: specifies the namespace for your data
  • Container: acts as a directory to organize and manage your blobs
  • Blob: objects in the form of unstructured data, including images, audio, video, and files
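
These three components combine to form each blob's unique address. A small sketch of the default public-cloud URL format (the endpoint suffix differs in sovereign clouds; core.windows.net is the global Azure default):

```python
def blob_url(account: str, container: str, blob_name: str) -> str:
    """Build the default public-cloud address for a blob:
    https://<account>.blob.core.windows.net/<container>/<blob>"""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"
```

For example, a blob named logo.png in the images container of the myaccount account resolves to https://myaccount.blob.core.windows.net/images/logo.png.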

Types of Blob Storage

Blob Storage is organized into three blob types, each with its own set of features and capabilities. Let's take a closer look at each.

Block Blob is the default and most common type of Blob Storage, intended for storing vast amounts of unstructured data such as text and binary data. Older service versions cap block blobs at about 4.75 TB each; current service versions support block blobs up to roughly 190.7 TiB.

Page Blob is designed to hold random-access files, such as the VHD images used by virtual machine disks. It can store up to 8 TiB of data per blob.

Append Blob is ideal for instances where data must be added to an existing blob, such as logging and auditing. It allows users to append data to the end of the blob without changing the existing data and supports up to 195 GB of data per blob.

Here's a summary of the types of Blob Storage:

  • Block Blob: general-purpose unstructured data such as text and binary files
  • Page Blob: random-access files such as VHD disk images, up to 8 TiB per blob
  • Append Blob: append-only scenarios such as logging and auditing, up to about 195 GB per blob

Features

Azure Blob Storage has some amazing features that make it a go-to choice for storing and managing large amounts of unstructured data. Scalability is one of its key features, allowing it to accommodate data of any size, making it easy to store and manage enormous amounts of data in the cloud.

Blob Storage provides exceptional durability for your data, with multiple copies kept across various data centers to ensure data availability and recovery. This means you can rest assured that your data is safe and secure.

Security is also a top priority in Blob Storage, with numerous layers of protection, including encryption at rest and in transit, role-based access control, and shared access signatures. This ensures that your data is protected from unauthorized access and tampering.

One of the most cost-effective features of Blob Storage is its pay-as-you-go pricing model, which allows you to pay only for the storage you need, with no upfront expenditures or termination fees. This makes it a great option for businesses and individuals with varying storage needs.

Blob Storage also integrates seamlessly with other Azure services, such as Azure Data Factory, Azure Functions, and Azure Stream Analytics, as well as third-party tools, making it a versatile option for a wide range of use cases.

Here are the key features of Blob Storage:

  • Scalability: Accommodates data of any size
  • Durability: Multiple copies kept across data centers
  • Security: Encryption at rest and in transit, role-based access control, and shared access signatures
  • Cost-effectiveness: Pay-as-you-go pricing model
  • Integration: Seamless integration with other Azure services and third-party tools
  • Object-level Tiering: Hot, cool, and archive tiers for storing and managing data based on frequency of use and temporal sensitivity
  • Advanced Data Management features: Object versioning, deletion policies, lifecycle management, and event-driven workflows
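
To make the tiering and lifecycle features concrete, here's a sketch that builds a lifecycle-management rule in the JSON shape Azure's management-policy API expects. The rule name, prefix, and day thresholds are illustrative:

```python
def lifecycle_rule(name: str, cool_after_days: int,
                   archive_after_days: int, prefix: str) -> dict:
    """Build one lifecycle-management rule: move matching block blobs
    to cool, then archive, based on days since last modification."""
    return {
        "enabled": True,
        "name": name,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"], "prefixMatch": [prefix]},
            "actions": {
                "baseBlob": {
                    "tierToCool": {"daysAfterModificationGreaterThan": cool_after_days},
                    "tierToArchive": {"daysAfterModificationGreaterThan": archive_after_days},
                }
            },
        },
    }

# A policy is a list of such rules attached to the storage account.
policy = {"rules": [lifecycle_rule("age-out-logs", 30, 90, "logs/")]}
```

A policy like this would move blobs under the logs/ prefix to the cool tier after 30 days and to archive after 90, without any application code.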

Map Folders to Kentico

To map folders to Kentico, you'll need to get a basic understanding of Azure Blob storage containers.

To use a different configuration, you'll need to create, configure, and map Azure Blob storage providers and local file system providers for each folder.

This involves editing the custom module in the StorageInitializationModule.cs file that was automatically created during installation.

You'll also need to get familiar with the module code and modify it to suit your application.

Once you've made the necessary changes, rebuild the solution.

After that, create and deploy the deployment package.

The binary files will now be deployed according to the defined configuration and environment – either to Azure Blob storage or to local storage.

Setup and Configuration

To set up Azure Blob Storage, you'll need to create a storage account in the Azure portal. This involves clicking on 'Storage Accounts' and then 'New' to fill in the required details such as subscription, resource group, storage account name, region, performance, and redundancy.

The Azure portal provides a user-friendly interface to create a storage account, with options for standard and premium performance, as well as different redundancy configurations. You'll also need to select a region for your storage account, which can impact billing.

To validate your Azure Blob Storage connection, Fivetran runs several tests: checking connectivity, connecting to the container, validating the file pattern regex, and validating the archive pattern. These tests may take a couple of minutes to complete.

Steps to Set Up Blob Storage

To set up and configure Azure Blob Storage, you'll need to follow these steps.

First, log in to the Azure Portal, and if you don't have a subscription, create a free trial account.

To create a storage account, click on 'Storage Accounts' in the Azure portal, then click on 'New'.

Fill in the required details, including subscription, resource group, storage account name, region, performance, and redundancy.

Choose the performance option that suits your needs: either Standard (backed by hard disk drives, HDDs) or Premium (backed by solid-state drives, SSDs).

Select the redundancy option that suits your needs, including locally redundant storage (LRS), zone-redundant storage (ZRS), geo-redundant storage (GRS), or read-access geo-redundant storage (RA-GRS).
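
The options above differ mainly in how many copies of your data exist and where they live. This quick reference encodes the replication model, with descriptions paraphrased from Azure's documentation:

```python
# Copies maintained by each redundancy option and where they live.
REDUNDANCY = {
    "LRS":    {"copies": 3, "scope": "one datacenter in the primary region"},
    "ZRS":    {"copies": 3, "scope": "three availability zones in the primary region"},
    "GRS":    {"copies": 6, "scope": "primary region plus a paired secondary region"},
    "RA-GRS": {"copies": 6, "scope": "as GRS, with read access to the secondary region"},
}
```

Higher-redundancy options cost more per gigabyte, so the choice is a trade-off between durability guarantees and price.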

After filling in the details, click on the 'Create' button to deploy the storage account.

Once the deployment is complete, click on 'Go to resource' to access the storage account overview.

From there, click on 'Containers' to select Blob Storage, and choose the access tier that suits your needs.

To create a new container, click on the '+' Container button, and fill in the container's name and select the access level.

Finally, select the storage account and click on 'Access keys' to find the connection string, which you'll need to authenticate your website's code to interact with the created storage account.
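
A connection string is a semicolon-separated list of key=value pairs. Here's a minimal sketch of parsing one locally; the example values are placeholders, not real credentials:

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split an Azure storage connection string into its key/value parts.

    Values (notably AccountKey, which is base64) can contain '=',
    so split each segment on the first '=' only.
    """
    parts = {}
    for segment in conn_str.split(";"):
        if segment:
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

example = ("DefaultEndpointsProtocol=https;AccountName=mystore;"
           "AccountKey=abc123==;EndpointSuffix=core.windows.net")
```

Parsing out the AccountName and AccountKey this way is essentially what BlobServiceClient.from_connection_string does internally.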

Here's a summary of the required details to fill in when creating a storage account:

  • Subscription and resource group
  • Storage account name (3–24 lowercase letters and numbers, globally unique)
  • Region
  • Performance: Standard or Premium
  • Redundancy: LRS, ZRS, GRS, or RA-GRS

Note that the connection method option is only available on Fivetran's Business Critical plan; you can choose to connect Fivetran to your Azure Blob Storage container directly, using an SSH tunnel, or using Azure Private Link.

Default Kentico Configuration

The default Kentico configuration is a great starting point for many projects. It's designed to get you up and running quickly.

By default, Kentico-managed Azure Blob storage is configured to map project folders to an Azure Blob storage provider for the Qa, Uat, and Production environments. This means your content item assets and media libraries are automatically stored in Azure Blob storage.

If you've used the --cloud parameter to create your project, you'll find a sample storage initialization module and deployment script in the StorageInitializationModule.cs file and Export-DeploymentPackage.ps1 script, respectively. These files are a great starting point for customizing your configuration.

The default configuration maps the ~/assets and ~/BizFormFiles project folders to an Azure Blob storage provider. This is a convenient way to store files uploaded via the Upload file form component.

Here's a breakdown of the default mappings:

  • ~/assets (content item assets and media libraries) and ~/BizFormFiles (files uploaded via the Upload file form component) project folders to an Azure Blob storage provider for the Qa, Uat, and Production environments.
  • ~/assets/media (media libraries) project folder to a local storage provider configured for the development environment.

In development, media libraries are stored locally in the ~/$StorageAssets/default/assets/media folder (corresponding to a container named default), but are accessed under the ~/assets/media path in your application.

Self-Managed Projects

To set up self-managed projects, you'll need to map parts of your file system to Azure Blob storage. This involves specifying your storage account name and primary access key in your application configuration file, which is typically appsettings.json by default.

You'll also need to create a custom module class in a Class Library project, which you'll add to your Xperience project in Visual Studio. This module class will override the OnInit method to store files from specific folders in the blob storage.

To do this, follow these steps:

  1. Specify the storage account name and primary access key in your application configuration file (appsettings.json by default).
  2. Create a custom module class in a Class Library project and add the Kentico.Xperience.AzureStorage NuGet package as a dependency.
  3. Override the module's OnInit method and, for each folder that you want to store in the blob storage, create an Azure Blob storage provider, assign it a target container, and map the folder path to the provider.

You can store files from folders like ~/assets and ~/BizFormFiles/ in Azure Blob storage containers like myassetscontainer and myformfilescontainer.

General Purpose Account

You can create a general-purpose v1 storage account, although Microsoft recommends general-purpose v2 accounts for most scenarios. Microsoft will continue to support general-purpose v1 accounts for new and existing customers.

General-purpose v1 storage accounts can be created using PowerShell, the Azure CLI, Bicep, or Azure Templates. For the kind parameter, specify Storage, and choose a supported sku or SkuName value.

You can create general-purpose v1 storage accounts in new regions whenever Azure Storage is available in those regions. Pricing for general-purpose v1 accounts has changed and is equivalent to pricing for general-purpose v2 accounts in new regions.

General-purpose v1 storage accounts support the locally redundant storage (LRS), geo-redundant storage (GRS), and read-access geo-redundant storage (RA-GRS) redundancy configurations; zone-redundant options are not available for this account kind.

You can choose to further customize your new storage account by setting options on the other tabs, or you can select Review + create to accept the default options and proceed to validate and create the account.

Security and Access

To ensure the security and integrity of your Azure Blob Storage, it's essential to understand how to access and manage data. You can use the Azure Portal, Azure Storage REST APIs, Azure PowerShell and CLI, Azure SDKs, Azure Data Factory, Azure Functions, or third-party tools like CloudBerry Explorer, Cyberduck, or Storage Made Easy to access and manage your data.

Access to subtrees of the data can be controlled using the --restrict-paths command-line option, which is configured using the PathRestrictions document type. This allows administrators to constrain data access on the storage gateway, setting restrictions at the folder level to allow read or write access, or to deny access entirely.

To create a shared access signature in Azure, you'll need to open the Azure Portal, select your storage account, and click Shared access signature. From there, you can select the allowed services, resource types, and permissions, as well as set the start and expiry dates of your SAS. It's also recommended to safelist Fivetran's IP address range under Allowed IP addresses for enhanced security.

Here are the methods to access Azure Blob Storage data:

  1. Azure Portal
  2. Azure Storage REST APIs
  3. Azure PowerShell and CLI
  4. Azure SDKs
  5. Azure Data Factory
  6. Azure Functions
  7. Third-party tools like CloudBerry Explorer, Cyberduck, or Storage Made Easy

Shared Access Signature

Creating a Shared Access Signature (SAS) in Azure is a crucial step in securing your data. You can reuse the SAS across multiple Fivetran connectors.

To create a SAS, open the Azure Portal and select your storage account. Then, click on Shared access signature. Select Blob from the Allowed services options.

When generating the SAS, it's essential to choose the appropriate start and expiry dates. Remember, when the SAS expires, you'll need to update your Azure Blob Storage connector to resume syncing files.

To enhance security, you can safelist Fivetran's IP address range under Allowed IP addresses. Use the IP range format, such as 35.234.176.144-35.234.176.151, to safelist the IP addresses.
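
A range like that covers every address between the two endpoints. Here's a stdlib-only sketch that expands such a range so you can verify exactly which IPs you're safelisting:

```python
import ipaddress

def expand_ip_range(ip_range: str) -> list:
    """Expand a 'start-end' IPv4 range string into the individual addresses."""
    start_s, end_s = ip_range.split("-")
    start = int(ipaddress.IPv4Address(start_s.strip()))
    end = int(ipaddress.IPv4Address(end_s.strip()))
    return [str(ipaddress.IPv4Address(i)) for i in range(start, end + 1)]
```

Expanding 35.234.176.144-35.234.176.151 yields eight consecutive addresses; always check Fivetran's documentation for the current ranges before safelisting.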

Select HTTPS only from the Allowed protocols options to ensure the security of your files. This is a recommended best practice to protect your data.

Here are the steps to create a SAS:

  1. Open the Azure Portal and select your storage account.
  2. Click on Shared access signature and select Blob from the Allowed services options.
  3. Select Container and Object from the Allowed resource types options.
  4. Select Read and List from the Allowed permissions options.
  5. Choose the appropriate start and expiry dates of your SAS.
  6. (Optional) Safelist Fivetran's IP address range under Allowed IP addresses.
  7. Select HTTPS only from the Allowed protocols options.
  8. Click Generate SAS and connection string.
  9. Make a note of the Connection string value.
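
Under the hood, the portal and SDKs derive the SAS signature by HMAC-signing a newline-joined "string to sign" with your base64-decoded account key. The sketch below illustrates the account-SAS computation; the field order follows the documented string-to-sign format for the assumed service version, and the function is illustrative rather than a drop-in replacement for the SDK:

```python
import base64
import hashlib
import hmac

def account_sas_signature(account: str, key_b64: str, permissions: str,
                          services: str, resource_types: str,
                          start: str, expiry: str,
                          ip: str = "", protocol: str = "https",
                          version: str = "2018-11-09") -> str:
    """Compute an account SAS signature: HMAC-SHA256 over a
    newline-joined string-to-sign, keyed with the decoded account key."""
    string_to_sign = "\n".join([
        account, permissions, services, resource_types,
        start, expiry, ip, protocol, version,
    ]) + "\n"
    digest = hmac.new(base64.b64decode(key_b64),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")
```

Because the signature is derived from the account key, anyone holding the key can mint a SAS — which is why rotating keys invalidates outstanding signatures.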

Data Access Policies

Data access policies are crucial in ensuring that only authorized individuals can access your data in Azure Blob Storage.

To restrict access to subtrees of data, you can use the --restrict-paths command-line option, which is configured using the PathRestrictions document type. This allows administrators to constrain data access on the storage gateway.

Path restrictions are set at the folder level and can allow read access, allow write access, or deny access to data. They are expressed as absolute paths from the root of the storage gateway virtual file system.

You can also use Azure Blob Connector Storage Gateway Policies to manage application credentials and storage account settings.

Here are some key considerations for data access policies:

  • Use the --restrict-paths command-line option to restrict access to subtrees of data.
  • Configure path restrictions using the PathRestrictions document type.
  • Set path restrictions at the folder level to control access to data.

By implementing these data access policies, you can ensure that your data in Azure Blob Storage is secure and only accessible to authorized individuals.

Encryption Tab

Encryption settings are a crucial part of Azure Storage security, and the Encryption tab of the create wizard is where you configure how your data is encrypted when it's persisted to the cloud.

Encryption type is a required field, and by default, data in the storage account is encrypted by using Microsoft-managed keys. You can rely on Microsoft-managed keys for the encryption of your data.

Enable support for customer-managed keys is also a required field, and by default, customer-managed keys can be used to encrypt only blobs and files. Setting this option to All service types (blobs, files, tables, and queues) enables support for customer-managed keys for all services.

If you choose to use customer-managed keys, you'll need to provide an encryption key, which is a required field. You can either select a key vault and key or enter a key from a URI.

A user-assigned identity is also required if you're configuring customer-managed keys at create time for the storage account. This identity is used for authorizing access to the key vault.

Infrastructure encryption is an optional feature that can be enabled to encrypt your data at both the service level and the infrastructure level. This provides an additional layer of security for your data.

Frequently Asked Questions

How to create a blob storage in Azure?

To create a blob storage in Azure, log in to the Azure portal and navigate to your storage account to add a new container. From there, follow the prompts to create a container and start storing your data.

What is a blob in Azure storage?

A blob in Azure storage is a binary large object used to store data in a variety of formats. It's a flexible storage option for any type of data, from images to documents.

Gilbert Deckow

Senior Writer

Gilbert Deckow is a seasoned writer with a knack for breaking down complex technical topics into engaging and accessible content. With a focus on the ever-evolving world of cloud computing, Gilbert has established himself as a go-to expert on Azure Storage Options and related topics. Gilbert's writing style is characterized by clarity, precision, and a dash of humor, making even the most intricate concepts feel approachable and enjoyable to read.
