
Azure OpenAI Service is a powerful tool that enables you to deploy AI models with ease. It's a cloud-based platform that provides a range of pre-built AI models and tools to help you get started.
With Azure OpenAI Service, you can deploy AI models in a fraction of the time it would take to build them from scratch, because the service provides a library of pre-trained models that you can use as a starting point.
The service also includes a range of tools and features that make it easy to fine-tune and customize these models to fit your specific needs, through Azure AI Studio and the REST API.
By using Azure OpenAI Service, you can unlock the full potential of AI and machine learning in your organization.
Azure OpenAI Configuration
Before you start fine-tuning your Azure OpenAI model, you need to configure your parameters. You can choose to leave the default configuration or customize the values to your preference.
The available parameters include batch size, learning rate multiplier, number of epochs, and seed. The batch size determines the number of training examples used in a single forward and backward pass. A larger batch size tends to work better for larger datasets.
Each of these parameters is described in more detail below.
After you finish making your configurations, select Next to proceed with the fine-tuning process.
Configure Your Parameters
You can choose to leave the default configuration or customize the values to your preference. After you finish making your configurations, select Next.
The batch size is the number of training examples used to train a single forward and backward pass. In general, larger batch sizes tend to work better for larger datasets.
A larger batch size means that model parameters are updated less frequently, but with lower variance. When set to -1, batch_size is calculated as 0.2% of examples in the training set and the max is 256.
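As an illustration, the -1 default described above can be sketched as a small helper (a hypothetical function for clarity, not part of any SDK):

```python
def default_batch_size(n_training_examples: int) -> int:
    """Mimic batch_size=-1: 0.2% of the training set, capped at 256, at least 1."""
    return max(1, min(256, round(n_training_examples * 0.002)))

print(default_batch_size(10_000))   # 0.2% of 10,000 examples -> 20
print(default_batch_size(500_000))  # 0.2% would be 1,000, so the 256 cap applies
```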
The learning rate multiplier to use for training is controlled by the learning_rate_multiplier parameter. Larger learning rates tend to perform better with larger batch sizes.
The number of epochs to train the model for is determined by the n_epochs parameter. If set to -1, the number of epochs is determined dynamically based on the input data.
The seed controls the reproducibility of the job, and passing in the same seed and job parameters should produce the same results, but may differ in rare cases.
Here's a summary of the parameters you can configure:
- batch_size: the number of training examples per forward and backward pass; -1 means 0.2% of the training set, capped at 256.
- learning_rate_multiplier: the multiplier applied to the learning rate; larger values tend to pair well with larger batch sizes.
- n_epochs: the number of epochs to train for; -1 determines the count dynamically from the input data.
- seed: controls the reproducibility of the job.
For the learning rate multiplier, we recommend experimenting with values in the range 0.02 to 0.2 to see what produces the best results.
Review the REST API Workflow
Reviewing the REST API workflow is a crucial step in configuring Azure OpenAI. The training and validation data you use must be formatted as a JSON Lines (JSONL) document.
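As a sketch, a chat-style training file in JSONL format can be produced like this (the example content is invented; the messages schema follows the standard chat-completions format):

```python
import json

# Two illustrative training examples in the chat-completions format.
examples = [
    {"messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is JSONL?"},
        {"role": "assistant", "content": "JSON Lines: one JSON object per line."},
    ]},
    {"messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Name an Azure AI service."},
        {"role": "assistant", "content": "Azure OpenAI Service."},
    ]},
]

# Write one JSON object per line -- this is what "JSON Lines" means.
with open("training.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# Sanity check: every line must parse as standalone JSON with a "messages" key.
with open("training.jsonl", encoding="utf-8") as f:
    parsed = [json.loads(line) for line in f]
assert all("messages" in ex for ex in parsed)
```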
To prepare your training data, you can either choose existing prepared training data or upload new prepared training data to use when fine-tuning your model. This includes uploading your files to the service, which can be done in two ways: by uploading a local file directly, or by importing from an Azure Blob or other shared web location.
Once your training and validation files are uploaded, you're ready to start the fine-tuning job. Here's an example of how to create a new fine-tuning job with the REST API:
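A minimal sketch of such a request, assuming a hypothetical resource name, API version, and previously uploaded file IDs (check the current REST reference for exact paths and versions):

```python
import json
import urllib.request

# Hypothetical values; substitute your own resource, key, and uploaded file IDs.
RESOURCE = "my-aoai-resource"          # assumed Azure OpenAI resource name
API_KEY = "<your-api-key>"
API_VERSION = "2024-02-01"             # example API version; check current docs

url = f"https://{RESOURCE}.openai.azure.com/openai/fine_tuning/jobs?api-version={API_VERSION}"
payload = {
    "model": "gpt-35-turbo-0613",      # base model to fine-tune (assumed)
    "training_file": "file-abc123",    # ID returned when you uploaded training data
    "validation_file": "file-def456",  # optional validation file ID
    "hyperparameters": {"n_epochs": -1, "batch_size": -1},
}

# Build the request without sending it, to show its exact shape.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
print(req.method, req.full_url)
```

Sending the prepared request with `urllib.request.urlopen(req)` would submit the job and return its ID and status.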
Note that the REST API generates a name for the deployment of your customized model; keep this in mind when using the REST API workflow.
Using the API
To use the Azure OpenAI API, your sign-in credential needs the Cognitive Services OpenAI Contributor role on your Azure OpenAI resource.
Here are the steps to follow:
- Ensure your sign-in credential has the Cognitive Services OpenAI Contributor role on your Azure OpenAI resource.
- Run az login to authenticate with Azure.
That's it! With these permissions in place, you're ready to start using the Azure OpenAI API.
Prerequisites
To get started with Azure OpenAI configuration, you'll need to meet some prerequisites.
First, make sure you've read the fine-tuning guide for Azure OpenAI. This will give you a solid understanding of how to get the most out of the service.
To use Azure OpenAI, you'll need to have an Azure subscription, which you can create for free.
You'll also need to have an Azure OpenAI resource located in a region that supports fine-tuning of the Azure OpenAI model. Check the Model summary table and region availability to see which models are available in your region.
Fine-tuning access requires Cognitive Services OpenAI Contributor permissions.
If you don't already have access to view quota and deploy models in Azure AI Studio, you'll need to request additional permissions.
Import
Importing data into Azure OpenAI is a straightforward process. You can import training and validation datasets from Azure Blob or another shared web location.
To import training data, enter the File name for the file, and then provide the Azure Blob URL, the Azure Storage shared access signature (SAS), or other link to an accessible shared web location. Select Import to import the training dataset to the service.
You can import validation data in a similar way, by entering the File name and providing the file location. This can be an Azure Blob URL or a shared access signature.
Here's a step-by-step guide to importing data:
- Enter the File name for the file.
- Provide the Azure Blob URL, the Azure Storage shared access signature (SAS), or other link to an accessible shared web location.
- Select Import to import the dataset.
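If you script the same import against the REST API rather than the Studio UI, the request might look roughly like this. The endpoint path and field names here are assumptions to be checked against the current Azure OpenAI reference:

```python
import json
import urllib.request

# Placeholder values; the endpoint path and body fields below are assumptions --
# verify them against the current Azure OpenAI REST reference.
resource = "my-aoai-resource"
api_version = "2024-02-01"
sas_url = "https://myaccount.blob.core.windows.net/data/training.jsonl?<sas-token>"

url = f"https://{resource}.openai.azure.com/openai/files/import?api-version={api_version}"
body = {
    "purpose": "fine-tune",          # files imported for fine-tuning use this purpose
    "filename": "training.jsonl",    # the File name you enter in the Import dialog
    "content_url": sas_url,          # Azure Blob URL or SAS link to the dataset
}

# Build the request without sending it, to show its shape.
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"api-key": "<your-api-key>", "Content-Type": "application/json"},
    method="POST",
)
print(req.method, req.full_url)
```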
Azure OpenAI Security
Azure OpenAI Service helps ensure the security and compliance of your data by providing private access through the integration of Private Link, enabling you to access your models within your virtual network infrastructure using private IP addresses.
To further enhance security, Azure OpenAI Service lets you set up Azure AI Search as a trusted service based on managed identity. The service verifies the claims in the JSON Web Token (JWT), which enables Azure AI Search to call your Azure OpenAI preprocessing-jobs as a custom skill web API.
By integrating Microsoft Defender for Cloud, you can protect your applications with threat protection for AI workloads, providing teams with evidence-based security alerts enriched with Microsoft threat intelligence signals.
Secure AI Apps
To ensure your Azure OpenAI Service is secure, you need to enable managed identity. This allows your Azure AI Search and Storage Account to recognize your Azure OpenAI Service via Microsoft Entra ID authentication. You can do this by toggling on system assigned managed identity on the Azure portal.
Disabling public network access is also crucial; you can do this in the Azure portal. With public access disabled, client machines (for example, when using Azure OpenAI Studio) reach the service through private endpoint connections that connect to your Azure OpenAI resource.
To let Azure AI Search reach Azure OpenAI as a trusted service, set up Azure OpenAI to use system-assigned managed identity authentication and set networkAcls.bypass to AzureServices from the management API. You can skip this step if you have a shared private link for your Azure AI Search resource.
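The network-ACL change described above amounts to an update of the account's properties through the management plane. A sketch of the request body, with placeholder identifiers:

```python
import json

# Placeholder identifiers; substitute your subscription, resource group, and account.
subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "my-rg"
account_name = "my-aoai-resource"
api_version = "2023-05-01"  # example management API version; check current docs

# Management-plane URL for the Azure OpenAI (Cognitive Services) account.
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.CognitiveServices/accounts/{account_name}"
    f"?api-version={api_version}"
)

# Deny public traffic by default, but allow trusted Azure services
# (such as Azure AI Search) to bypass the network ACLs.
body = {
    "properties": {
        "networkAcls": {
            "defaultAction": "Deny",
            "bypass": "AzureServices",
        }
    }
}
print(json.dumps(body, indent=2))
```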
Azure OpenAI Service offers private instances of OpenAI LLMs, providing private access through the integration of Private Link. This ensures the security and compliance of your data by enabling you to access your models within your virtual network infrastructure using private IP addresses.
Here's a summary of the security features:
- Enable managed identity for Azure OpenAI Service
- Disable public network access and create private endpoint connections
- Set up Azure OpenAI to use system assigned managed identity authentication
- Use Private Link for secure data access
Document-level access control is also supported for Azure AI search, allowing you to restrict the documents that can be used in responses for different users with Azure AI Search security filters. This involves registering your application, creating users and groups, and indexing your documents with their permitted groups.
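As an illustration of that filtering, a query can restrict results to the caller's groups with an OData filter over a group field. The field name group_ids below is hypothetical; your indexing pipeline defines the actual field that stores each document's permitted groups:

```python
# Build an Azure AI Search OData security filter restricting results to the
# caller's groups. "group_ids" is a hypothetical collection field that your
# indexing pipeline fills with each document's permitted group object IDs.
def security_filter(user_group_ids: list[str]) -> str:
    joined = ",".join(user_group_ids)
    return f"group_ids/any(g: search.in(g, '{joined}'))"

# Example: a user belonging to two Microsoft Entra groups.
flt = security_filter(["a1b2", "c3d4"])
print(flt)  # group_ids/any(g: search.in(g, 'a1b2,c3d4'))
```

The resulting string would be passed as the $filter parameter of a search query so that only documents tagged with one of the user's groups come back.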
Role Assignments
To set up role assignments for Azure OpenAI, you need to allow the services to authorize each other. This is done by assigning the necessary roles to the resources.
The first step is to assign the Search Index Data Reader role to Azure OpenAI on Azure AI Search. This allows the inference service to query the data from the index. You also need to assign the Search Service Contributor role to Azure OpenAI on Azure AI Search, which allows the inference service to query the index schema for auto fields mapping.
In addition to these roles, you need to assign the Storage Blob Data Contributor role to Azure OpenAI on the Storage Account. This allows the inference service to read from the input container and write the preprocessed result to the output container.
You also need to assign the Cognitive Services OpenAI Contributor role to Azure AI Search on Azure OpenAI. This allows the custom skill to be used.
Here is a summary of the roles that need to be assigned:
- Search Index Data Reader: assigned to Azure OpenAI on Azure AI Search, so the inference service can query data from the index.
- Search Service Contributor: assigned to Azure OpenAI on Azure AI Search, so the inference service can query the index schema for auto fields mapping.
- Storage Blob Data Contributor: assigned to Azure OpenAI on the Storage Account, so the inference service can read from the input container and write to the output container.
- Cognitive Services OpenAI Contributor: assigned to Azure AI Search on Azure OpenAI, so the custom skill can be used.
The admin needs to have the Owner role on these resources to add role assignments.
Service
Azure OpenAI Service is a powerful artificial intelligence service that allows users to create and deploy AI models within the Microsoft Azure platform. It integrates OpenAI's language models and services into Azure applications and platform.
This service ensures the security and compliance of your data by providing private access through the integration of Private Link. This enables you to access your models within your virtual network infrastructure using private IP addresses.
The Azure OpenAI Service is a result of the partnership between Microsoft and OpenAI, which broadens the capabilities of technology. This partnership has led to various use cases, including analyzing user inputs, chat and conversation interaction, and code generation or transformation scenarios.
Here are some of the use cases for Azure OpenAI Service:
- Reason over structured and unstructured data
- Chat and conversation interaction
- Chat and conversation creation
- Code generation or transformation scenarios
- Summarization
- Journalistic content
- Question-answering
- Search
To use the Azure OpenAI Service, you need to have the Cognitive Services OpenAI Contributor role on your Azure OpenAI resource. You can then use the Azure CLI to run az login and access the service.
The service allows you to fine-tune an existing Azure OpenAI model in an Azure AI Studio project. This involves selecting a base model, choosing a version, and uploading training data. You can also use the API to make requests and receive responses from the service.
Safety Evaluation GPT-4 Fine-Tuning - Public Preview
GPT-4, GPT-4o, and GPT-4o-mini are our most advanced models that can be fine-tuned to your needs. Evaluations are conducted in dedicated, customer-specific, private workspaces, ensuring a safe and secure evaluation process.
Evaluation endpoints are in the same geography as the Azure OpenAI resource, which helps to reduce latency and improve performance. Training data is not stored in connection with performing evaluations; only the final model assessment (deployable or not deployable) is persisted.
The fine-tuned model evaluation filters are set to predefined thresholds and cannot be modified by customers. This is done to ensure that the evaluation process is consistent and reliable.
If a fine-tuned model fails due to the detection of harmful content in model outputs, you won't be charged for the training run. This is a safety net that helps to prevent the deployment of potentially harmful models.
Frequently Asked Questions
What is OpenAI in Azure?
Azure OpenAI is a cloud-based service that uses advanced language models to generate human-like text and perform tasks such as summarizing large files and documents. It enables users to leverage these models to automate various data-related tasks with high-quality results.
Is Azure OpenAI the same as ChatGPT?
No, Azure OpenAI and ChatGPT are distinct services with different features and authentication options; for example, Azure OpenAI integrates with Microsoft Entra ID (Azure AD), while ChatGPT is OpenAI's own consumer service. Understanding the key differences between these services can help you choose the best solution for your needs.
Can I use Azure OpenAI for free?
Unfortunately, Azure's free account tier does not support OpenAI services. To access OpenAI, you'll need to upgrade to a paid Azure subscription and obtain your subscription ID through the Azure Portal.
Does Microsoft own 49% of OpenAI?
No, Microsoft does not own 49% of OpenAI, but it is entitled to a significant portion of the for-profit arm's profits. This is a key distinction that clarifies Microsoft's investment in OpenAI.
Does OpenAI use AWS or Azure?
OpenAI's models run on Microsoft Azure and are offered there through the Azure OpenAI Service, while Amazon Bedrock is AWS's comparable service for accessing models from other providers.