To get started with Azure Python programming, you'll first need to install the Azure SDK for Python libraries you plan to use, which can be done with pip, the Python package manager. This gives you access to the various Azure services and client libraries.
Azure provides a wide range of services, including Azure Storage, Azure Cosmos DB, and Azure Machine Learning, which can be used in your Python applications. These services can be accessed using the Azure SDK.
Before you can start using Azure services in your Python code, you'll need to set up an Azure account and create a resource group. A resource group is a logical container that helps you organize your resources.
To connect to your Azure account from your Python code, you'll need to use the Azure Identity library, which provides a simple way to authenticate with Azure services. This library can be used with various authentication methods, including Azure Active Directory and Azure CLI.
Getting Started
To get started with Azure Functions in Python, create a local project in Visual Studio Code. Press F1 to open the command palette, then search for and run the command Azure Functions: Create New Project.
You'll be prompted to choose the directory location for your project workspace, so select an empty folder for the project workspace. Don't choose a project folder that is already part of a workspace.
Choose Python (Programming Model V2) as your language, and select a Python interpreter to create a virtual environment. If an option isn't shown, type in the full path to your Python binary.
Select the HTTP trigger template for your project's first function, and name the function HttpExample. Choose ANONYMOUS authorization level, which lets anyone call your function endpoint.
To use the storage emulator for the storage connection required by the Python v2 model, update the AzureWebJobsStorage setting in the local.settings.json file to "UseDevelopmentStorage=true".
Here's a summary of the steps to create a local project:
- Press F1 to open the command palette and search for and run the command Azure Functions: Create New Project.
- Choose the directory location for your project workspace and select an empty folder.
- Choose Python (Programming Model V2) as your language and select a Python interpreter.
- Select the HTTP trigger template and name the function HttpExample.
- Choose ANONYMOUS authorization level.
- Update the AzureWebJobsStorage setting in the local.settings.json file to "UseDevelopmentStorage=true".
Next, you'll need to create a client to interact with Azure Storage Blobs. You'll need the storage account's blob service account URL and a credential that allows you to access the storage account.
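A minimal sketch of that step, assuming DefaultAzureCredential for the credential (the account URL below is a placeholder you'd replace with your storage account's name):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder URL: substitute your own storage account name.
account_url = "https://<storage-account-name>.blob.core.windows.net"
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())
```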
Azure Python Basics
These basics are a great starting point for anyone looking to begin Azure development in Python.
You can install the Azure SDK for Python libraries using pip, the Python package manager. The SDK ships as individual packages per service, for example `pip install azure-storage-blob azure-identity`.
Azure provides a wide range of libraries and tools to support Python development, including the Azure Identity library, which simplifies authentication and authorization for Azure services.
Key Concepts
To get started with the Azure Machine Learning Python SDK, you need Python 3.7 or later, an Azure subscription, and an Azure Machine Learning workspace.
Azure Machine Learning Python SDK v2 offers many new features, including standalone local jobs, reusable components for pipelines, and managed online and batch inferencing. SDK v2 brings consistency and ease of use across all assets of the platform.
The Azure Blob Service consists of three main components: the storage account itself, a container within the storage account, and a blob within a container.
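A sketch of how those three components map onto client objects in the azure-storage-blob library (the account name and resource names here are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder account name for illustration.
account_url = "https://<storage-account-name>.blob.core.windows.net"
credential = DefaultAzureCredential()

# One client per level of the hierarchy: storage account -> container -> blob.
blob_service_client = BlobServiceClient(account_url, credential=credential)
container_client = blob_service_client.get_container_client("sample-container")
blob_client = container_client.get_blob_client("sample-blob.txt")
```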
Programming Model
The programming model in Azure Functions is designed to be stateless, meaning your function should process input and produce output without relying on any external state.
By default, the runtime expects the method to be implemented as a global method called main() in the __init__.py file. You can also specify an alternative entry point.
Azure Functions expects a function to be a stateless method in your Python script that processes input and produces output. This is why it's essential to keep your functions simple and focused on a single task.
To bind data to your function, you use triggers and bindings, which are defined in the function.json file. For example, if your function is triggered by an HTTP request bound to a parameter named req, you use the name property in function.json to bind the request data to that parameter.
You can also explicitly declare the attribute types and return type in the function by using Python type annotations. This helps you use the IntelliSense and autocomplete features provided by many Python code editors.
Triggers and bindings can be declared and used in a function in a decorator-based approach. They're defined in the same file, function_app.py, as the functions. This makes it easy to keep your code organized and maintainable.
The azure.functions.* package provides the necessary annotations to bind the input and outputs to your methods. By using these annotations, you can make your code more readable and self-explanatory.
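A minimal sketch of the decorator-based (v2) model, reusing the function app and route names from the earlier walkthrough:

```python
import azure.functions as func

# function_app.py: triggers and bindings are declared with decorators.
app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="HttpExample")
def HttpExample(req: func.HttpRequest) -> func.HttpResponse:
    # Type annotations on req and the return value enable IntelliSense in editors.
    name = req.params.get("name")
    if name:
        return func.HttpResponse(f"Hello, {name}!")
    return func.HttpResponse("Pass a name in the query string.", status_code=200)
```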
Prerequisites
To get started with Azure Python basics, you'll need to meet a few prerequisites, which vary by service and library. Here are the specific requirements:
- Azure Functions: Azure Functions runtime version 4.34 or later and Python version 3.9 or later.
- Azure Machine Learning SDK v2 (azure-ai-ml): Python 3.7 or later, an Azure subscription, and an Azure Machine Learning workspace. You can create an account for free.
- Azure Storage Blobs (azure-storage-blob): Python 3.8 or later, an Azure subscription, and an Azure storage account. For more details, read the Azure SDK for Python version support policy page.
Context
In Azure Python, getting the invocation context of a function is crucial for logging and tracing purposes.
To get the invocation context, you need to include the context argument in your function's signature.
The Context class has several attributes that provide valuable information; the sketch after this list shows how to read them:
- function_directory: the directory in which the function is running.
- function_name: the name of the function.
- invocation_id: the ID of the current function invocation.
- thread_local_storage: contains a local invocation_id for logging from created threads.
- trace_context: the context for distributed tracing.
- retry_context: the context for retries to the function.
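A minimal sketch of reading these attributes, using the v2 decorator model from earlier (the route name is just an example):

```python
import logging

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="HttpExample")
def HttpExample(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    # Including 'context' in the signature is all that's needed to receive it.
    logging.info(
        "Running %s (invocation %s) from %s",
        context.function_name,
        context.invocation_id,
        context.function_directory,
    )
    return func.HttpResponse("OK")
```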
Version
To use Azure Functions with Python, you need to meet the prerequisites. This means your Azure Functions runtime version should be 4.34 or later, and your Python version should be 3.9 or later.
You can check the supported Python versions for your Azure Functions version in the Azure Functions documentation. For example, if you're using version 4.x, you can use Python versions 3.11, 3.10, 3.9, 3.8, or 3.7.
If you want to specify a specific Python version when creating your function app in Azure, you can use the --runtime-version option of the az functionapp create command. This sets the Python version when the function app is created, and it can't be changed for apps running in a Consumption plan.
Here's a summary of the supported Python versions for Azure Functions:
- Functions runtime 4.x: Python 3.11, 3.10, 3.9, 3.8, and 3.7
If you need to change the Python version of your function app, you can set the LinuxFxVersion field in the site configuration to specify the language and version. For example, to change the Python app to use Python 3.8, set linuxFxVersion to python|3.8.
Database and Storage
Azure Functions integrates well with Azure Cosmos DB for many use cases, including IoT, ecommerce, and gaming.
To connect to Azure Cosmos DB, you need to create an account, database, and container. Then you can connect your function code to Azure Cosmos DB using triggers and bindings.
Using the Python library for Cosmos DB can help you implement more complex app logic. An asynchronous I/O implementation can be used to read inserts and updates from the change feed functionality.
The account access key should be used with caution, as it can give anyone who has it access to all the data in the storage account. DefaultAzureCredential provides enhanced security features and benefits.
Creating a new container in your storage account can be done by calling the create_container method on the blob_service_client object. Container names must be lowercase.
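A sketch of creating a container, assuming a client built as in the earlier examples (the account URL is a placeholder):

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# Container names must be lowercase; a UUID suffix keeps the name unique.
container_name = "quickstartblobs" + str(uuid.uuid4())
container_client = blob_service_client.create_container(container_name)
```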
Database Connection
Connecting to a database is a crucial step in building scalable and efficient applications. Azure Functions integrates seamlessly with Azure Cosmos DB for various use cases, including IoT, e-commerce, and gaming.
To connect to Azure Cosmos DB, you need to create an account, database, and container first. This sets the foundation for your database connection.
For event sourcing, Azure Functions and Azure Cosmos DB are integrated to power event-driven architectures using the change feed functionality. This change feed provides downstream microservices with the ability to reliably and incrementally read inserts and updates.
To connect your function code to Azure Cosmos DB, you can use triggers and bindings. This allows you to tap into Azure Cosmos DB's capabilities and leverage its features in your application.
You can also use the Python library for Cosmos DB to implement more complex app logic. This library provides an asynchronous I/O implementation that can be used to handle database operations efficiently.
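A sketch using the azure-cosmos library directly (the endpoint, database, and container names are hypothetical); the azure.cosmos.aio package provides the asynchronous variant of the same client:

```python
from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential

# Hypothetical endpoint and resource names for illustration.
endpoint = "https://<cosmos-account-name>.documents.azure.com:443/"
client = CosmosClient(endpoint, credential=DefaultAzureCredential())

database = client.get_database_client("SampleDB")
container = database.get_container_client("Items")

# Query items; cross-partition queries must be enabled explicitly.
query = "SELECT * FROM c WHERE c.category = @category"
for item in container.query_items(
    query=query,
    parameters=[{"name": "@category", "value": "gaming"}],
    enable_cross_partition_query=True,
):
    print(item["id"])
```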
Configure Storage Connection String
To configure your storage connection string, add it as an environment variable on your machine (the quickstart does this from a Windows command window). Your code then reads the connection string from that environment variable instead of hard-coding it.
After adding the environment variable, start a new instance of the command window, because the change won't take effect in the current session.
The code that retrieves the connection string from the environment variable then uses it to construct a service client object. Make sure to add this code inside the try block, as shown below.
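A sketch of that step, assuming the connection string was stored under the environment variable name used by the Azure Storage quickstart:

```python
import os

from azure.storage.blob import BlobServiceClient

try:
    # Retrieve the connection string from the environment variable.
    connect_str = os.getenv("AZURE_STORAGE_CONNECTION_STRING")

    # Use the connection string to construct a service client object.
    blob_service_client = BlobServiceClient.from_connection_string(connect_str)
except Exception as ex:
    print("Exception:")
    print(ex)
```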
It's worth noting that the account access key should be used with caution. If it's lost or accidentally placed in an insecure location, your service may become vulnerable. Anyone who has the access key can authorize requests against the storage account, effectively gaining access to all the data.
DefaultAzureCredential provides enhanced security features and benefits, and is the recommended approach for managing authorization to Azure services.
Delete a Container
Deleting a container is straightforward: call the delete_container method to remove the entire container and clean up the resources the app created. If you also want to delete the local files the app worked with, you can do that in addition to removing the container, as in the sketch below.
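A sketch of the cleanup step; the container name and local file paths are hypothetical stand-ins for whatever your app created earlier:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())
container_client = blob_service_client.get_container_client("quickstartblobs")  # existing container

print("Deleting blob container...")
container_client.delete_container()

# Optionally delete the local files the app created (hypothetical paths).
os.remove("data/sample.txt")
os.remove("data/sample_download.txt")
```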
Functions and Extensions
Azure Functions lets you integrate third-party libraries into your function app through Python worker extensions. These extensions act as middleware that can inject specific operations during the function's execution lifecycle.
To use an extension, add the extension package to your project's requirements.txt file, install the library into your app, and add any application settings the extension requires. You then import the extension module into your function trigger and configure the extension instance, if needed. Extensions implement a Python worker extension interface, which lets the Python worker process call into the extension code during the function's execution lifecycle.
Create Function App
You'll be prompted to select your Azure subscription, which is a crucial step. Make sure to choose the correct one to avoid any issues later on.
Next, you'll need to enter a globally unique name for your function app. This name should be valid in a URL path and will be validated to ensure it's unique in Azure Functions.
The Azure extension will guide you through the process of selecting a runtime stack, which is the language version you currently run locally. This is an important decision, as it will affect how your function app behaves.
You'll also need to select a location for new resources, which is essentially choosing an Azure region. For better performance, it's recommended to select a region near you.
The Azure extension shows the status of individual resources as they're created in Azure in the Azure: Activity Log panel. This is a helpful feature that keeps you informed about the progress of your function app creation.
Here are the steps to create a function app in Azure:
- Select your Azure subscription
- Enter a globally unique name for your function app
- Select a runtime stack
- Select a location for new resources
Alternative Entry Point
You can change this default behavior by specifying the scriptFile and entryPoint properties in the function.json file.
For example, a function.json that sets scriptFile to main.py and entryPoint to customentry tells the runtime to use the customentry() method in the main.py file as the function's entry point.
In the v2 programming model, the entry point lives only in the function_app.py file, but you can still reference functions defined elsewhere in the project from function_app.py by using blueprints or by importing them.
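A sketch of such a custom entry point, assuming function.json sets "scriptFile": "main.py" and "entryPoint": "customentry" (v1 programming model):

```python
# main.py
import azure.functions as func


def customentry(req: func.HttpRequest) -> func.HttpResponse:
    # Runs instead of the default main() because entryPoint points here.
    return func.HttpResponse("Handled by customentry().")
```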
Functions Worker Dependencies
The Azure Functions Python worker requires a specific set of libraries, which the Azure Functions platform provides and manages for you.
You should remove any azure-functions-worker entry from your project's requirements.txt file, because the functions worker is automatically managed by the Azure Functions platform.
If your package contains libraries that might collide with worker's dependencies, such as protobuf, tensorflow, or grpcio, configure PYTHON_ISOLATE_WORKER_DEPENDENCIES to 1 in app settings to prevent your application from referring to worker's dependencies.
This is crucial to prevent unexpected issues when using these libraries in your functions.
Retry Policy Configuration
Retry Policy Configuration is crucial in ensuring that your application can recover from temporary errors and maintain high availability. This configuration allows you to specify the number of retries for different types of errors.
You can configure the retry policy by passing in keyword arguments when instantiating a client. This is a straightforward process that can make a big difference in the reliability of your application.
The total number of retries allowed is determined by the `retry_total` argument, which defaults to 10 if not specified. If you don't want to retry on requests at all, you can pass in `retry_total=0`.
There are different types of errors that can be retried, including connection-related errors, read errors, and bad status codes. The number of retries for each of these error types can be configured using the `retry_connect`, `retry_read`, and `retry_status` arguments, respectively.
The `retry_to_secondary` argument allows you to specify whether the request should be retried to a secondary location if able. This is particularly useful for RA-GRS accounts where stale data can be handled.
Here's a summary of the retry policy configuration options:
- `retry_total`: total number of retries allowed (defaults to 10; pass `retry_total=0` to disable retries)
- `retry_connect`: number of retries for connection-related errors
- `retry_read`: number of retries for read errors
- `retry_status`: number of retries on bad status codes
- `retry_to_secondary`: whether the request should be retried to a secondary location, if able
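A sketch of passing these keyword arguments when instantiating a Blob Storage client (the account URL is a placeholder and the retry values are arbitrary examples):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

blob_service_client = BlobServiceClient(
    "https://<storage-account-name>.blob.core.windows.net",  # placeholder URL
    credential=DefaultAzureCredential(),
    retry_total=5,            # overall cap on retries (default 10; 0 disables retries)
    retry_connect=2,          # retries for connection-related errors
    retry_read=2,             # retries for read errors
    retry_status=2,           # retries for bad status codes
    retry_to_secondary=True,  # retry against the secondary endpoint (RA-GRS accounts)
)
```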
Security and Authentication
Security and Authentication is a top priority when working with Azure Python. The Azure Identity client library provides a recommended approach for implementing passwordless connections to Azure services, including Blob Storage.
DefaultAzureCredential is the recommended way to authenticate to Azure and authorize access to blob data. This approach supports multiple authentication methods and determines which method to use at runtime.
To use DefaultAzureCredential, you need to install the azure-identity package and import the DefaultAzureCredential class. You can then authenticate to Azure with the credentials you're already signed in with in the Azure CLI or Visual Studio Code.
Here are the different types of credentials you can use to authenticate to Azure; the sketch after this list shows how each one is passed to a client:
- Azure Active Directory (AAD) token credential: obtained from the azure-identity library
- Shared access signature (SAS) token: provided as a string or generated using the Azure Portal or the generate_sas() functions
- Storage account shared key (aka account key or access key): provided as a string or found in the Azure Portal under the "Access Keys" section
- Anonymous public read access: simply omit the credential parameter
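A sketch showing how each credential type is passed to BlobServiceClient (the URL, SAS token, and key values are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account-name>.blob.core.windows.net"  # placeholder

# 1. Microsoft Entra ID / AAD token credential from azure-identity (recommended).
aad_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

# 2. Shared access signature (SAS) token provided as a string.
sas_client = BlobServiceClient(account_url, credential="<sas-token>")

# 3. Storage account shared key provided as a string.
key_client = BlobServiceClient(account_url, credential="<account-access-key>")

# 4. Anonymous public read access: simply omit the credential parameter.
anon_client = BlobServiceClient(account_url)
```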
Authenticate and Authorize
Authentication is crucial when working with Azure services, including Blob Storage. Developers must be diligent never to expose the access key in an insecure location.
To authenticate to Azure and authorize access to blob data, use the DefaultAzureCredential class provided by the Azure Identity client library. This approach offers improved management and security benefits over the account key.
The DefaultAzureCredential class supports multiple authentication methods and determines which method to use at runtime. This enables your app to use different authentication methods in different environments without implementing environment-specific code.
You can authorize access to data in your storage account using the following steps:
- Sign in to Azure using the Azure CLI, Visual Studio Code, or Azure PowerShell.
- Make sure the azure-identity package is installed and the class is imported.
- Add the following code inside the try block: `from azure.identity import DefaultAzureCredential` and `from azure.storage.blob import BlobServiceClient`.
- Create the BlobServiceClient object using the account URL and credential, as shown in the sketch after this list.
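A sketch of those steps together, following the structure of the Blob Storage quickstart (the account name is a placeholder):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

try:
    account_url = "https://<storage-account-name>.blob.core.windows.net"
    default_credential = DefaultAzureCredential()

    # Create the BlobServiceClient object using the account URL and credential.
    blob_service_client = BlobServiceClient(account_url, credential=default_credential)
except Exception as ex:
    print("Exception:")
    print(ex)
```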
Whichever of the credential types listed earlier you use, always be careful never to expose an access key in an insecure location. If your account access key is lost or accidentally placed in an insecure location, your service may become vulnerable.
Sign In and Connect App Code
To sign in and connect your app code to Azure, you'll need to authenticate with the same Microsoft Entra account you assigned the role to on your storage account. You can do this via the Azure CLI, Visual Studio Code, or Azure PowerShell.
Make sure you're signed in: run `az login` for the Azure CLI, sign in through the Azure extension in Visual Studio Code, or run `Connect-AzAccount` in Azure PowerShell.
To use DefaultAzureCredential, install the azure-identity package and import the class: `from azure.identity import DefaultAzureCredential` and `from azure.storage.blob import BlobServiceClient`.
When the code runs on your local workstation, DefaultAzureCredential uses the developer credentials of the prioritized tool you're logged into to authenticate to Azure. This includes tools like Azure CLI or Visual Studio Code.
You can create a BlobServiceClient object by passing the account URL and credential to the class: `blob_service_client = BlobServiceClient(account_url, credential=default_credential)`.
To authorize data access with the storage account access key, you'll need permissions for the Azure RBAC action `Microsoft.Storage/storageAccounts/listkeys/action`. The least privileged built-in role with permissions for this action is Reader and Data Access.
Here are the steps to sign in and connect your app code to Azure:
- Make sure you're authenticated with the same Microsoft Entra account you assigned the role to on your storage account.
- Install the azure-identity package and import the DefaultAzureCredential class.
- Use the following code to create a BlobServiceClient object: `blob_service_client = BlobServiceClient(account_url, credential=default_credential)`.
Remember to update the storage account name in the URI of your BlobServiceClient object, and enable managed identity on your app in Azure to authorize requests to Azure Storage from an application running in Azure.
Encryption Configuration
Encryption is a crucial aspect of security, and configuring it correctly is essential to protect your data.
To enforce encryption, you can set require_encryption to True when instantiating a client. This ensures that objects are encrypted when they're uploaded and decrypted when they're downloaded.
The encryption version you use is also important, with version 2.0 being the recommended choice. It's worth noting that version 1.0 is deprecated.
If you're using a custom key-encryption-key, you'll need to implement the key_resolver_function method. This function uses the kid string to return a key-encryption-key.
Here's a summary of the encryption configuration options:
- require_encryption: when True, ensures objects are encrypted and decrypted
- encryption_version: the client-side encryption version to use (2.0 is recommended; 1.0 is deprecated)
- key_encryption_key: the user-provided key-encryption-key used to wrap and unwrap the content key
- key_resolver_function: a function that takes a kid string and returns the matching key-encryption-key
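A sketch of wiring these options onto a client; SampleKEK is a hypothetical stand-in for a real key-encryption-key (for example, one backed by Azure Key Vault), and the connection string is a placeholder:

```python
from azure.storage.blob import BlobServiceClient


class SampleKEK:
    """Hypothetical key-encryption-key; a real implementation would wrap keys securely."""

    def get_kid(self):
        return "local:sample-kek"

    def get_key_wrap_algorithm(self):
        return "A256KW"

    def wrap_key(self, key):
        raise NotImplementedError("wrap the content-encryption key here")

    def unwrap_key(self, key, algorithm):
        raise NotImplementedError("unwrap the content-encryption key here")


kek = SampleKEK()
service_client = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder

service_client.require_encryption = True
service_client.encryption_version = "2.0"  # version 1.0 is deprecated
service_client.key_encryption_key = kek
# key_resolver_function maps a kid string back to the right key-encryption-key.
service_client.key_resolver_function = lambda kid: kek
```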
Frequently Asked Questions
Can I run Python scripts in Azure?
Yes, you can run Python scripts in Azure using Azure Functions for small-scale data processing or Azure Databricks for distributed environments. Both options offer scalable and efficient ways to execute Python code in the cloud.
Is Azure CLI written in Python?
Yes, the Azure CLI is written in Python. This allows for scripting and automation of Azure tasks using Python.
Sources
- https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python
- https://learn.microsoft.com/en-us/python/api/overview/azure/ai-ml-readme
- https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-python
- https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-python
- https://learn.microsoft.com/en-us/python/api/overview/azure/storage-blob-readme