Azure provides a robust logging solution through Azure Monitor, which allows you to collect, analyze, and act on log data from your Azure resources.
Azure Monitor can collect platform logs and metrics from a wide range of Azure services, including Azure Storage, Azure SQL Database, and Azure Virtual Machines.
Effective logging is crucial for cloud operations, as it enables you to troubleshoot issues, track performance, and maintain security.
Azure Monitor's Log Analytics feature allows you to query and analyze log data using a SQL-like language called Kusto Query Language (KQL).
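As a quick illustration, here is a minimal sketch of running a KQL query from PowerShell with the Az.OperationalInsights module, assuming you already have a workspace; the workspace ID below is a placeholder, not a real value.

```powershell
# Requires the Az.Accounts and Az.OperationalInsights modules
Connect-AzAccount

# Placeholder workspace ID - use your workspace's customer ID (a GUID)
$workspaceId = "00000000-0000-0000-0000-000000000000"

# Return the ten most recent activity log entries stored in the AzureActivity table
$query  = "AzureActivity | sort by TimeGenerated desc | take 10"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query

# Each row comes back as an object; show a few useful columns
$result.Results | Select-Object TimeGenerated, OperationNameValue, ActivityStatusValue
```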
Azure Logging Destinations
Azure offers several destinations for sending log data, including Log Analytics workspaces, Event Hubs, and storage accounts. Each destination has its own features and benefits.
You can send log data to Log Analytics to enable Azure Monitor Logs, which allows you to correlate activity log data with other monitoring data collected by Azure Monitor. This can be done by selecting Export Activity Logs to send the activity log to a Log Analytics workspace.
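If you prefer scripting, a rough PowerShell sketch of creating that subscription-level diagnostic setting is shown below. This is only a sketch: the cmdlet and parameter names assume a recent Az.Monitor module and may differ in your version, and the workspace resource ID and categories are placeholders.

```powershell
# Sketch only: verify cmdlet names with Get-Command New-AzSubscriptionDiagnosticSetting
$workspaceResourceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"

# Enable a couple of activity log categories (add more as needed)
$logs = @(
    New-AzDiagnosticSettingSubscriptionLogSettingsObject -Category Administrative -Enabled $true
    New-AzDiagnosticSettingSubscriptionLogSettingsObject -Category Security -Enabled $true
)

# Create the subscription-level diagnostic setting that exports the activity log
New-AzSubscriptionDiagnosticSetting -Name "export-activity-log" `
    -WorkspaceId $workspaceResourceId -Log $logs
```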
Log data can be sent to Event Hubs to stream entries outside of Azure, such as to a third-party SIEM or other log analytics solutions. Activity log events from event hubs are consumed in JSON format with a records element that contains the records in each payload.
The following sections cover the different Azure logging destinations and their characteristics.
Send to Event Hubs
Sending logs to Azure Event Hubs is a powerful way to forward entries outside of Azure, allowing you to integrate with third-party SIEM or log analytics solutions.
Azure Event Hubs is a message bus service that streams time-ordered, lightweight events to multiple consumers.
Logs sent to Event Hubs are consumed in JSON format with a records element that contains the records in each payload.
The schema depends on the category and is described in Azure activity log event schema.
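To make the payload shape concrete, here is a small illustration of unpacking the records element from one consumed message body; the field values are hypothetical and only stand in for real activity log entries.

```powershell
# Hypothetical Event Hubs message body: activity log entries arrive as JSON
# with a top-level "records" array (field values below are illustrative only)
$payload = @'
{
  "records": [
    { "time": "2024-01-01T00:00:00Z", "operationName": "Microsoft.Compute/virtualMachines/start/action", "level": "Information" }
  ]
}
'@

# Parse the JSON and iterate over the individual log records
$body = $payload | ConvertFrom-Json
foreach ($record in $body.records) {
    Write-Output ("{0} {1} ({2})" -f $record.time, $record.operationName, $record.level)
}
```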
Here's a brief overview of the benefits of using Azure Event Hubs:
- Send logs outside of Azure to third-party SIEM or log analytics solutions
- Stream time-ordered, lightweight events to multiple consumers
- Consume logs in JSON format with a records element
If you're looking to send logs to Azure Event Hubs, you can do so by following the steps outlined in the Azure documentation.
Configure Destination
To configure Azure Log Analytics as the Logstream Destination (on a Barracuda CloudGen Firewall, for example), follow these steps: go to CONFIGURATION > Configuration Tree > Box > Infrastructure Services > Syslog Streaming, select Logstream Destinations, and click Lock. Add a new entry by clicking + and enter a Name. Click OK, and from the Logstream Destination list, select Microsoft OMS or Microsoft OMS Security. Click OK, then Send Changes and Activate.
You can also send the activity log to a Log Analytics workspace to enable Azure Monitor Logs, where you can correlate activity log data with other monitoring data, consolidate log entries from multiple subscriptions, and perform complex analysis.
To send resource logs to Azure Monitor, you create diagnostic settings that route logs to storage accounts, event hubs, or a Log Analytics workspace. For App Service, enabling this adds app settings to your app, which triggers an app restart.
You can also route logs to Azure Event Hubs to send entries outside of Azure, for example to a third-party SIEM or another log analytics solution.
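For a single resource, a rough PowerShell sketch of creating such a diagnostic setting might look like the following; the cmdlet names assume a recent Az.Monitor module, and the resource ID, category, and workspace ID are placeholders.

```powershell
# Sketch only: check your Az.Monitor version for the exact cmdlet names
$resourceId  = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Web/sites/<app-name>"
$workspaceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"

# Pick a log category to collect; AppServiceHTTPLogs is just one example category
$log = New-AzDiagnosticSettingLogSettingsObject -Category AppServiceHTTPLogs -Enabled $true

# Route the selected logs to a Log Analytics workspace
# (swap -WorkspaceId for -StorageAccountId or -EventHubAuthorizationRuleId for other destinations)
New-AzDiagnosticSetting -Name "send-to-workspace" -ResourceId $resourceId `
    -WorkspaceId $workspaceId -Log $log
```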
To configure the Logdata Streams to Azure Log Analytics, combine the logdata filters and the logstream destination into a logdata stream. Go to CONFIGURATION > Configuration Tree > Box > Infrastructure Services > Syslog Streaming, select Logdata Streams, and click Lock. Then, add a new syslog stream by clicking +, enter a Name, and click OK. Set Active Stream to yes, and in the Log Destinations table, click + and select the logstream destination configured earlier. In the Log Filters table, click + and select the logdata filter configured earlier. Choose either OMS or OMS Security as your log destination. Click OK, then Send Changes and Activate.
Legacy Logging Methods
Legacy logging methods are being phased out by Azure, so it's essential to understand what's happening and how to transition to new methods.
The Azure Activity Logs solution is being retired on September 15, 2026, and will be automatically converted to diagnostic settings.
If you're collecting activity logs using the legacy collection method, you should export activity logs to your Log Analytics workspace and disable the legacy collection using the Data Sources - Delete API.
To do this, first list all data sources connected to the workspace using the Data Sources - List By Workspace API and filter for activity logs by setting kind eq 'AzureActivityLog'.
You'll need to copy the name of the connection you want to disable from the API response.
Then, use the Data Sources - Delete API to stop collecting activity logs for the specific resource.
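A rough sketch of those two API calls using Invoke-AzRestMethod follows; the workspace path and api-version are assumptions, so verify them against the current Data Sources API reference before relying on them.

```powershell
# Assumed ARM path and api-version for the Log Analytics Data Sources API
$workspacePath = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
$apiVersion    = "2020-08-01"

# List data sources of kind AzureActivityLog (the legacy activity log collection)
$filter   = [uri]::EscapeDataString("kind eq 'AzureActivityLog'")
$listPath = "$workspacePath/dataSources?`$filter=$filter&api-version=$apiVersion"
$response = Invoke-AzRestMethod -Method GET -Path $listPath
$dataSources = ($response.Content | ConvertFrom-Json).value
$dataSources | Select-Object name

# Delete a connection by name to stop legacy activity log collection
$name = $dataSources[0].name
Invoke-AzRestMethod -Method DELETE -Path "$workspacePath/dataSources/${name}?api-version=$apiVersion"
```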
A log profile is a legacy method for sending the activity log to storage or event hubs, and it's also being retired on September 15, 2026.
You should transition to diagnostic settings, which provide better functionality and consistency with resource logs, before that retirement date.
If a log profile already exists, you first must remove the existing log profile, and then create a new one.
To remove a log profile, use Get-AzLogProfile to identify if a log profile exists, and if it does, note the Name property.
Then, use Remove-AzLogProfile to remove the log profile by using the value from the Name property.
Here are the properties you'll need to specify when creating a new log profile:
- Name: the name of the log profile.
- StorageAccountId: the resource ID of the storage account where the activity log should be saved.
- ServiceBusRuleId: the Service Bus authorization rule ID, if you're streaming to event hubs.
- Location: the regions (plus global) for which you collect events.
- RetentionInDays: the number of days to retain events.
- Category: the event categories to collect, such as Write, Delete, and Action.
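In PowerShell, removing an existing log profile and creating a replacement might look like the sketch below; the storage account ID, locations, and retention value are placeholders, and remember that log profiles themselves are on the retirement path described above.

```powershell
# Check whether a legacy log profile exists and note its Name property
$logProfile = Get-AzLogProfile
if ($logProfile) {
    # Only one log profile is allowed per subscription, so remove the existing one first
    Remove-AzLogProfile -Name $logProfile.Name
}

# Create a replacement log profile (placeholder values; diagnostic settings are preferred going forward)
Add-AzLogProfile -Name "default" `
    -StorageAccountId "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>" `
    -Location "westeurope", "global" `
    -RetentionInDays 90 `
    -Category Write, Delete, Action
```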
Data Retrieval and Export
You can access activity log events through various methods, including using the Get-AzLog cmdlet in PowerShell, the az monitor activity-log command in the CLI, or the Azure Monitor REST API.
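For example, with the Az PowerShell module, retrieving and summarizing the last day of activity log events could look like this (the grouping by resource group is just an illustrative choice):

```powershell
# Retrieve activity log events from the last 24 hours for the current subscription
$events = Get-AzLog -StartTime (Get-Date).AddDays(-1)

# Count events per resource group to see where the recent activity happened
$events | Group-Object ResourceGroupName |
    Sort-Object Count -Descending |
    Select-Object Count, Name
```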
To export logs, you have several options. You can export logs to a storage account, which is the cheapest option at €0.0085 per GB for locally redundant cold storage. This is ideal for security log files that are only accessed during Digital Forensics and Incident Response (DFIR).
You can also export logs to Azure Event Hubs, a message bus service for streaming time-ordered, lightweight events. This lets you make logs available for consumption by other Azure services or external log analytics platforms.
Here are some key benefits of exporting logs to a Log Analytics workspace:
- Correlate activity log data with other monitoring data collected by Azure Monitor.
- Consolidate log entries from multiple Azure subscriptions and tenants into one location for analysis together.
- Use log queries to perform complex analysis and gain deep insights on activity log entries.
- Store activity log entries for longer than the activity log retention period.
- Incur no data ingestion or retention charges for activity log data stored in a Log Analytics workspace.
Export to Storage Account
Exporting logs to a storage account is a great way to retain your data for longer periods. It's available for most Azure resources and can be set up to write activity or diagnostic logs to a storage account in the same or a different subscription.
The logs are stored in JSON format per subscription and resource, and you can set a retention period of up to one year so that logs are rotated automatically. Alternatively, you can store them indefinitely.
This is the cheapest option for storing logs, costing €0.0085 per GB for locally redundant cold storage. It's ideal for security log files that are only accessed occasionally.
The logs are written as JSON blobs to containers in Azure Blob Storage (for example, insights-activity-logs for the activity log), and you can read or download them with any Blob Storage client or tool.
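To read those blobs back, a minimal sketch with the Az.Storage module might look like this; the resource group, storage account name, and the insights-activity-logs container name are assumptions based on the typical activity log export layout.

```powershell
# Build a storage context for the account that receives the exported logs
$ctx = (Get-AzStorageAccount -ResourceGroupName "<rg>" -Name "<storageaccount>").Context

# Activity log exports typically land in the insights-activity-logs container
# as JSON blobs; adjust the container name for resource log categories
Get-AzStorageBlob -Container "insights-activity-logs" -Context $ctx |
    Sort-Object LastModified -Descending |
    Select-Object Name, LastModified -First 5
```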
Viewing and Searching
To view and search logs, you must create a Log Analytics workspace, which is part of the Azure Monitor platform designed to process and analyze log data.
Each workspace has its own data repository, allowing you to combine data from multiple sources.
The interface consists of three main components: tables, search, and results.
Multiple tables are available, including Azure tables, custom tables, search results, and restored logs.
Azure tables store logs from Azure resources and have a predefined schema, but you can add more columns if needed.
Custom tables store logs from other resources and require designing a schema based on your data collection.
Search results are generated when you run a search job and are based on your search query.
Restored log tables hold archived log data that has been restored so it can be queried again.
You can search for logs using the Kusto Query Language (KQL) in the top section, and the search result will be displayed in the bottom section.
You can use log queries to retrieve specific columns of data and feed them to other services provided by Azure Monitor.
Data Structure and Management
Data structure and management are crucial for efficient logging. In Azure Monitor, log data is organized into tables in a Log Analytics workspace, which are built to store large amounts of log data with high scalability and performance.
Each table has its own schema, which keeps the data queryable even at the high volume and velocity that log data often has.
Data Structure Changes
Data Structure Changes can be a challenge, especially when they're not clearly explained. The Export activity logs experience sends the same data as the legacy method used to send the activity log with some changes to the structure of the AzureActivity table.
Some columns in the AzureActivity table are deprecated, meaning they still exist but have no data. The replacements for these columns contain the same data, but in a different format. You might need to modify log queries that use them.
The column "Category" is one example, where the replacement is "CategoryValue". This is because the values in these columns might be all uppercase, and you'll need to use the =~ operator to do a case-insensitive comparison.
Here's how some of these columns change (activity log JSON property → deprecated column → new column):
- category → Category → CategoryValue
- status → ActivityStatus → ActivityStatusValue. The JSON values are success, start, accept, and failure; the deprecated column keeps the same values, while the new column uses succeeded, started, accepted, and failed. The valid values change as shown.
- subStatus → ActivitySubstatus → ActivitySubstatusValue
- operationName → OperationName → OperationNameValue. The REST API localizes the operation name value; the Log Analytics UI always shows English.
- resourceProviderName → ResourceProvider → ResourceProviderValue
In some cases, you might need to update your queries to use the new column names, such as "CategoryValue" instead of "Category". This will ensure you're getting the correct data from the AzureActivity table.
Visualizing Data
Visualizing Data is a crucial step in understanding how your resources are used and how they perform over time. The Metrics Explorer is a powerful tool that provides a comprehensive view of your Azure resources, allowing you to view real-time and historical metrics data from various Azure services.
You can create visualizations using the Metrics Explorer to gain insights into your resource performance. With this tool, you can view metrics from multiple resources in a single dashboard and compare metrics data to identify trends and patterns.
The Metrics Explorer allows you to export the created charts to other tools, such as Power BI, for further analysis and visualization. This makes it easier to share your findings with others and dive deeper into your data.
Workspace and Configuration
To set up a Log Analytics workspace, you'll need to create one in the Azure portal. This involves going to All services, searching for Log Analytics, and selecting Log Analytics workspaces. From there, click Create, fill in the required details such as the resource group, name, and region, choose a pricing tier, add any tags, and review your settings before creating the workspace.
You can send the activity log to a Log Analytics workspace by selecting Export Activity Logs in the portal. This lets you store activity log entries for longer than the activity log retention period without incurring data ingestion or retention charges for the activity log data stored in the workspace.
To send the activity log to a Log Analytics workspace, you can select up to five workspaces. The activity log data in a Log Analytics workspace is stored in a table called AzureActivity, which you can retrieve with a log query in Log Analytics.
Activity log data in a Log Analytics workspace is retained for 90 days at no charge. To view a count of activity log records for each category, you can use the following query: AzureActivity | summarize count() by CategoryValue.
When creating a Log Analytics workspace, the pricing tier blade lets you choose the pricing tier; you then add any tags and verify your settings before completing the creation.
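If you prefer scripting over the portal, creating a workspace with the Az.OperationalInsights module is a short sketch like the one below; the resource group, workspace name, region, and SKU are placeholders.

```powershell
# Create a Log Analytics workspace in an existing resource group;
# PerGB2018 is the common pay-as-you-go pricing tier
New-AzOperationalInsightsWorkspace -ResourceGroupName "<rg>" `
    -Name "<workspace-name>" -Location "westeurope" -Sku "PerGB2018"
```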
To configure logdata filters, you'll need to go to CONFIGURATION > Configuration Tree > Box > Infrastructure Services > Syslog Streaming, select Logdata Filters, and click Lock. From there, you can add a new filter and specify the log file types to be transferred or streamed.
To configure the logdata streams to Azure Log Analytics, you'll need to go to CONFIGURATION > Configuration Tree > Box > Infrastructure Services > Syslog Streaming, select Logdata Streams, and click Lock. From there, you can add a new syslog stream and specify the logstream destination and logdata filter.
To enable application logging for Windows apps, you'll need to navigate to your app and select App Service logs. You can then select On for either Application Logging (Filesystem) or Application Logging (Blob), or both.
To enable application logging for Linux apps or custom containers, you'll need to navigate to your app and select App Service logs. In Application logging, select File System, and specify the disk quota for the application logs and the number of days the logs should be retained.
Security and Networking
For Azure logging, networking considerations are crucial when it comes to Diagnostic Settings restrictions.
To ensure you don't exceed destination limits, refer to the official Diagnostic Settings documentation for guidance.
Enabling Detailed Firewall Reporting can provide valuable insights into your network activity.
Creating Alerts
Creating alerts is a crucial step in staying informed about critical events or conditions in your Azure resources. Azure Monitor Alerts allows you to create threshold-based or log-based alerts.
You can set up alert rules, such as when a metric exceeds a specified threshold or when a log query returns a particular result. This helps you stay on top of potential issues before they impact your customers.
Azure Alerts can be configured to send notifications via email, SMS, or to an Azure Event Grid topic. You can also integrate Azure Alerts with other tools, such as Microsoft Teams or Slack.
By grouping multiple actions together, you can form an action group that simplifies the process of managing your alert actions. This allows you to respond quickly to issues and resolve them before they become major problems.
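As a sketch of a threshold-based alert with an email action group, the PowerShell below shows one way to wire this up; the resource IDs, names, email address, and the CPU metric are placeholders, and the exact parameters vary by Az.Monitor module version.

```powershell
# Sketch only: parameter names assume a recent Az.Monitor module version.
# Action group with a single e-mail receiver
$receiver    = New-AzActionGroupReceiver -Name "ops-email" -EmailReceiver -EmailAddress "ops@example.com"
$actionGroup = Set-AzActionGroup -ResourceGroupName "<rg>" -Name "ops-alerts" `
    -ShortName "ops" -Receiver $receiver

# Alert when average CPU on a VM exceeds 80 percent over a 5-minute window
$criteria = New-AzMetricAlertRuleV2Criteria -MetricName "Percentage CPU" `
    -TimeAggregation Average -Operator GreaterThan -Threshold 80
Add-AzMetricAlertRuleV2 -Name "high-cpu" -ResourceGroupName "<rg>" `
    -TargetResourceId "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<vm>" `
    -WindowSize 00:05:00 -Frequency 00:05:00 -Severity 2 `
    -Condition $criteria -ActionGroupId $actionGroup.Id
```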
Enable Firewall Reporting
Enabling Firewall Reporting is a crucial step in monitoring and understanding network activity. This involves configuring the firewall to provide detailed reporting.
Once detailed firewall reporting is enabled, the output uses a format with various fields, such as the date, time, and protocol used, along with the source and destination IP addresses, ports, and the type of traffic, such as allow or block.
However, for streaming to OMS Security, the logs need to be in a simple pipe-separated value format for easier parsing. This format will display a condensed version of the detailed output.
The simple format will include fields like protocol, source IP, and destination IP, separated by pipes. This makes it easier to analyze and understand network activity.
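Here is a tiny illustration of working with such a pipe-separated line; the field order shown is hypothetical, not the actual firewall format.

```powershell
# Hypothetical pipe-separated line: action|protocol|source IP|destination IP|port
$line = "Allow|TCP|10.0.0.5|52.174.10.10|443"

# Split on the pipe character and name the fields for easier analysis
$action, $protocol, $srcIp, $dstIp, $port = $line -split '\|'
Write-Output "Traffic ${action}: $protocol $srcIp -> ${dstIp}:${port}"
```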
Networking Considerations
Networking and platform limits matter for a secure and efficient setup. For Diagnostic Settings restrictions, refer to the official Diagnostic Settings documentation regarding destination limits.
In practice, each Azure resource supports a limited number of diagnostic settings (currently five), and each diagnostic setting can send data to only one destination of each type, such as one Log Analytics workspace, one storage account, and one event hub.
Restricting and reviewing where logs are sent also reduces the chance of exposing sensitive log data to unintended destinations, which helps maintain the security and integrity of your system.
There may be other networking considerations to take into account, but the destination limits for Diagnostic Settings should be your first check.
Frequently Asked Questions
What is logging in Azure?
Logging in Azure refers to collecting and managing logs from Azure services, applications, and resources, typically through Azure Monitor and its related tools.
What is an Azure activity log?
The Azure activity log is a platform log that tracks subscription-level events, such as resource modifications and virtual machine startups. It provides valuable insights into changes made to your Azure resources.
What are Azure platform logs?
Azure platform logs are detailed records of diagnostic and auditing information for Azure resources and the underlying platform they rely on. They provide valuable insights for troubleshooting and monitoring Azure services.
Where are Azure logs stored?
Azure Storage analytics logs are stored in a container named $logs in the blob namespace of the storage account, which you can access at a URL of the form http://<accountname>.blob.core.windows.net/$logs. Logs routed through diagnostic settings are stored in whichever destination you configure, such as a Log Analytics workspace, storage account, or event hub.
Sources
- https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/activity-log
- https://blog.nviso.eu/2019/04/10/azure-security-logging-part-i-defining-you-logging-strategy/
- https://betterstack.com/community/guides/logging/azure-logging/
- https://learn.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs
- https://campus.barracuda.com/product/cloudgenfirewall/doc/170821249/how-to-configure-log-streaming-to-microsoft-azure-log-analytics/