Let's talk about the third category:
Resource
Azure offers a variety of logging resources to support incident response, monitoring, and security analytics. Two key components are Network Security Group (NSG) Flow Logs and Traffic Analytics—essential tools for analyzing network activity and identifying potential security incidents in your Azure environment.
Key Components of Azure Network Security
Network Security Groups (NSG): NSGs are used to control network traffic flow to and from Azure resources by setting up security rules. Rules specify the source, destination, port, and protocol, either allowing or denying traffic. Rules are prioritized numerically, with lower numbers having higher priority.
https://learn.microsoft.com/en-us/azure/network-watcher/nsg-flow-logs-overview
NSG Flow Logs: Flow logs capture important network activity at the transport layer (Layer 4) and are a vital resource for tracking and analyzing network traffic. They include:
Source and destination IP, ports, and protocol: This 5-tuple information helps identify connections and patterns.
Traffic Decision (Allow or Deny): Specifies whether traffic was permitted or blocked.
Logging Frequency: Flow logs are captured every minute.
Storage: Logs are written in JSON format to a storage account, with retention configurable for up to 365 days, and can also be streamed to Log Analytics or an Event Hub for SIEM integration.
Note: NSG flow logs are enabled through the Network Watcher service, which must be enabled for each region in use.
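To make the tuple layout concrete, here is a minimal Python sketch (an illustration, not an official parser) that reads a downloaded version 2 flow-log file, typically a PT1H.json blob in the flow-log storage container, and prints denied inbound flows. The file path is a placeholder, and the field positions follow the documented v2 flowTuples layout.

```python
import json

# Minimal sketch: parse a downloaded NSG flow log (version 2) file and print
# denied inbound flows. The path is a placeholder. Field order follows the v2
# flowTuples layout: timestamp, src IP, dst IP, src port, dst port,
# protocol (T/U), direction (I/O), decision (A/D), flow state, then counters.

PROTOCOLS = {"T": "TCP", "U": "UDP"}

def denied_inbound_flows(path):
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)
    for record in data.get("records", []):
        for rule_flows in record["properties"]["flows"]:
            rule = rule_flows["rule"]
            for flow in rule_flows["flows"]:
                for tuple_str in flow["flowTuples"]:
                    fields = tuple_str.split(",")
                    ts, src_ip, dst_ip, src_port, dst_port = fields[:5]
                    protocol, direction, decision = fields[5:8]
                    if direction == "I" and decision == "D":
                        yield {
                            "rule": rule,
                            "timestamp": ts,
                            "source": f"{src_ip}:{src_port}",
                            "destination": f"{dst_ip}:{dst_port}",
                            "protocol": PROTOCOLS.get(protocol, protocol),
                        }

if __name__ == "__main__":
    for flow in denied_inbound_flows("PT1H.json"):  # placeholder path
        print(flow)
```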
NSG Flow Log Configuration
To enable NSG flow logs:
Enable Network Watcher: Set up in each Azure region where NSG monitoring is needed.
Register Microsoft.Insights Provider: The "Insights" provider enables log capture and must be registered for each subscription.
Enable NSG Flow Logs: Use Version 2 for enhanced details, including throughput information.
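For reference, the following is a hedged Python sketch of the last step using the azure-mgmt-network SDK. Every resource name, region, and ID is a placeholder, and the model/field names should be verified against your SDK version; the portal, PowerShell, or the Azure CLI achieve the same result.

```python
# Hedged sketch: enable a version 2 NSG flow log with the azure-mgmt-network SDK.
# All names, IDs, and regions are placeholders; verify model/field names against
# your SDK version. Requires: pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

subscription_id = "<subscription-id>"
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

nsg_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/providers/"
    "Microsoft.Network/networkSecurityGroups/<nsg-name>"
)
storage_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/providers/"
    "Microsoft.Storage/storageAccounts/<storage-account>"
)

poller = client.flow_logs.begin_create_or_update(
    resource_group_name="NetworkWatcherRG",        # default Network Watcher resource group
    network_watcher_name="NetworkWatcher_eastus",  # one watcher per region in use
    flow_log_name="nsg-flowlog",
    parameters={
        "location": "eastus",
        "target_resource_id": nsg_id,              # the NSG to log
        "storage_id": storage_id,                  # destination storage account
        "enabled": True,
        "format": {"type": "JSON", "version": 2},  # version 2 adds throughput details
        "retention_policy": {"days": 365, "enabled": True},
    },
)
print(poller.result().provisioning_state)
```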
---------------------------------------------------------------------------------------------------------
Traffic Analytics
Traffic Analytics is a powerful tool that enhances NSG flow logs by providing a visual representation and deeper insights into network activity. By using a Log Analytics workspace, it allows organizations to:
Visualize Network Activity: Easily monitor traffic patterns across subscriptions.
Identify Security Threats: Detect unusual traffic patterns that could signify attacks or unauthorized access.
Optimize Network Deployment: Analyze traffic flows to adjust resource configurations for efficiency.
Pinpoint Misconfigurations: Quickly identify and correct settings that might expose resources to risk.
Setup: Traffic Analytics is configured via the Network Watcher and requires the NSG logs to be sent to a Log Analytics workspace.
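If you are scripting the flow-log deployment shown earlier, Traffic Analytics is enabled on the same flow-log resource. The fragment below is a hedged sketch of that extra configuration block; the field names follow the azure-mgmt-network FlowLog model as I understand it, and the workspace values are placeholders.

```python
# Hedged sketch: the extra configuration that turns on Traffic Analytics for the
# flow log created in the earlier sketch. Field names follow the azure-mgmt-network
# FlowLog model as I understand it; workspace values are placeholders.
flow_analytics = {
    "flow_analytics_configuration": {
        "network_watcher_flow_analytics_configuration": {
            "enabled": True,
            "workspace_id": "<log-analytics-workspace-guid>",
            "workspace_region": "eastus",
            "workspace_resource_id": (
                "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/"
                "Microsoft.OperationalInsights/workspaces/<workspace>"
            ),
            "traffic_analytics_interval": 10,  # processing interval in minutes
        }
    }
}
# Merge this dict into the "parameters" passed to flow_logs.begin_create_or_update(...).
```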
---------------------------------------------------------------------------------------------------------
Practical Applications in Incident Response and Forensics
For incident response, NSG flow logs and Traffic Analytics provide a detailed view into Azure network activity, allowing you to:
Track unusual or unauthorized traffic patterns.
Quickly spot and investigate potential lateral movement within the network.
Assess security posture by reviewing allowed and denied traffic flows, helping ensure configurations align with security policies.
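As an example of that kind of hunting, here is a hedged Python sketch that queries the workspace with azure-monitor-query. The AzureNetworkAnalytics_CL table and column names reflect a typical Traffic Analytics deployment and may need adjusting to your workspace schema; the workspace ID is a placeholder.

```python
# Hedged sketch: hunt for denied flows in the Traffic Analytics data with
# azure-monitor-query. Table/column names are those typically populated by
# Traffic Analytics and may need adjusting; the workspace ID is a placeholder.
# Requires: pip install azure-identity azure-monitor-query
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AzureNetworkAnalytics_CL
| where SubType_s == "FlowLog" and FlowStatus_s == "D"
| summarize DeniedFlows = count() by SrcIP_s, DestIP_s, DestPort_d
| top 20 by DeniedFlows desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",
    query=query,
    timespan=timedelta(days=1),
)
for table in response.tables:  # partial results are not handled in this sketch
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```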
---------------------------------------------------------------------------------------------------------
Now let's talk about the fourth category:
Storage Account Logs
In Azure, storage accounts are crucial resources for storing and managing data, but they require specific configurations to secure access and enable effective monitoring through logs. Here’s a breakdown of key practices for setting up and securing storage accounts in Azure:
Enabling Storage Account Logs
Azure does not enable logging for storage accounts by default, but you can enable logs through two main options:
Diagnostic Settings – Preview: This is the preferred option, offering granular logging settings. Logs can be configured for each data type—blob, queue, table, and file storage—and sent to various destinations such as a Log Analytics workspace, another storage account, an Event Hub, or a partner solution.
Diagnostic Settings – Classic: An older option with limited customization compared to the preview settings.
Logging Categories: Logs can capture Read, Write, and Delete operations. For security and forensic purposes, it’s especially important to enable the StorageRead log to track data access, as this can help detect data exfiltration attempts (e.g., when sensitive data is downloaded from a blob).
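To show what that configuration can look like in code, here is a hedged Python sketch using azure-mgmt-monitor to enable blob-service logs and send them to a Log Analytics workspace. All resource IDs are placeholders, and the category and field names should be checked against your environment.

```python
# Hedged sketch: enable StorageRead/StorageWrite/StorageDelete logs on the blob
# service and route them to a Log Analytics workspace with azure-mgmt-monitor.
# All resource IDs are placeholders; check category names in your environment.
# Requires: pip install azure-identity azure-mgmt-monitor
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

subscription_id = "<subscription-id>"
client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# Blob diagnostic settings live on the blobServices/default sub-resource.
blob_service_uri = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/providers/"
    "Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default"
)
workspace_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/providers/"
    "Microsoft.OperationalInsights/workspaces/<workspace>"
)

client.diagnostic_settings.create_or_update(
    resource_uri=blob_service_uri,
    name="blob-logs-to-law",
    parameters={
        "workspace_id": workspace_id,
        "logs": [
            {"category": "StorageRead", "enabled": True},   # data access / exfiltration
            {"category": "StorageWrite", "enabled": True},
            {"category": "StorageDelete", "enabled": True},
        ],
    },
)
```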
Key Logging Considerations for Security
Data Exfiltration Tracking: Monitoring Read operations is critical for detecting unauthorized data access. Filtering for specific operations, such as GetBlob, allows you to identify potential data exfiltration activities; a query sketch follows this list.
Microsoft Threat Matrix: Azure’s threat matrix for storage, based on the MITRE ATT&CK framework, highlights data exfiltration as a significant risk. Monitoring for this by configuring relevant logs helps mitigate data theft.
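Building on the GetBlob filtering point above, here is a hedged Python sketch that runs such a query with azure-monitor-query. It assumes blob diagnostic logs are flowing into a Log Analytics workspace (as in the diagnostic-settings sketch earlier); the workspace ID is a placeholder and column names may differ slightly in your workspace.

```python
# Hedged sketch: summarize GetBlob reads per caller to surface possible exfiltration.
# Assumes blob logs are flowing into Log Analytics (StorageBlobLogs table); column
# names may differ slightly in your workspace, and the workspace ID is a placeholder.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
StorageBlobLogs
| where OperationName == "GetBlob"
| summarize Reads = count(), TotalBytes = sum(ResponseBodySize)
    by CallerIpAddress, AuthenticationType
| order by TotalBytes desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",
    query=query,
    timespan=timedelta(days=7),
)
for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```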
---------------------------------------------------------------------------------------------------------
Storage Account Access Controls
Access to storage accounts can be configured at multiple levels:
Account Level: Overall access to the storage account itself.
Data Level: Specific containers, file shares, queues, or tables.
Blob Level: Individual blob (object) access, allowing the most restrictive control.
Access Keys: Each storage account comes with two access keys. Regular rotation of these keys is highly recommended to maintain security.
Shared Access Signatures (SAS): SAS tokens allow restricted access to resources for a limited time and are a safer alternative to using account keys, which grant broader access. SAS tokens can be scoped down to individual blobs for more restrictive control (see the sketch after this list).
Public Access: It’s critical to avoid public access configurations unless absolutely necessary, as this can expose sensitive data to unauthorized users.
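As an illustration of blob-scoped SAS, here is a minimal Python sketch using the azure-storage-blob package to issue a short-lived, read-only SAS for a single blob. The account, container, and blob names are placeholders.

```python
# Minimal sketch: issue a short-lived, read-only SAS for a single blob with
# azure-storage-blob, instead of sharing an account key. Names are placeholders.
# Requires: pip install azure-storage-blob
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas_token = generate_blob_sas(
    account_name="mystorageaccount",              # placeholder
    container_name="reports",                     # placeholder
    blob_name="q3-summary.pdf",                   # placeholder
    account_key="<account-key>",                  # keep secret; rotate regularly
    permission=BlobSasPermissions(read=True),     # read-only
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # short lifetime
)

url = f"https://mystorageaccount.blob.core.windows.net/reports/q3-summary.pdf?{sas_token}"
print(url)
```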
---------------------------------------------------------------------------------------------------------
Internet Access and Network Security for Storage Accounts
By default, Azure storage accounts are accessible over the internet, which poses security risks:
Global Access: Storage accounts exist in a global namespace, making them accessible worldwide via a URL (e.g., https://mystorageaccount.blob.core.windows.net). Restricting access to specific networks or enabling a private endpoint is recommended to limit exposure (see the network-rules sketch after this list).
Private Endpoints and Azure Private Link: For enhanced security, private endpoints can be used to connect securely to a storage account via Azure Private Link. This setup requires advanced planning but significantly reduces the risk of unauthorized internet access.
Network Security Groups (NSG): Although NSGs do not directly control storage account access, securing virtual networks and subnets associated with storage accounts is essential.
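For completeness, here is a hedged Python sketch using azure-mgmt-storage that flips the account firewall to deny-by-default and allows only one subnet. The IDs are placeholders and the model/field names may vary across SDK versions; private endpoints would be configured separately.

```python
# Hedged sketch: set the storage account firewall to deny by default and allow only
# one VNet subnet, using azure-mgmt-storage. IDs are placeholders and model/field
# names may vary across SDK versions; private endpoints are configured separately.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

subnet_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/providers/"
    "Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"
)

client.storage_accounts.update(
    resource_group_name="<rg>",
    account_name="mystorageaccount",
    parameters={
        "network_rule_set": {
            "default_action": "Deny",              # block public internet access by default
            "virtual_network_rules": [{"virtual_network_resource_id": subnet_id}],
            "bypass": "AzureServices",             # keep trusted Azure services working
        }
    },
)
```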
Best Practices for Incident Response and Forensics
For effective incident response:
Enable and monitor diagnostic logs for Read operations to detect data exfiltration.
Regularly review access control configurations to ensure minimal exposure.
Use private endpoints and avoid public access settings to minimize risk from the internet.
These configurations and controls enhance Azure storage security, protecting sensitive data from unauthorized access and improving overall network resilience.
---------------------------------------------------------------------------------------------------------
In Azure, protecting against data exfiltration in storage accounts requires a layered approach, involving strict control over key and SAS token generation, careful monitoring of access patterns, and policies that enforce logging for audit and response purposes. Here’s a detailed breakdown:
Data Exfiltration Prevention and Monitoring
Key and SAS Management
Key Generation: Access keys or SAS tokens are critical for accessing data in storage accounts and can be generated through various methods:
Azure Portal: Provides an intuitive UI for key generation and monitoring.
PowerShell and CLI: Useful for scripting automated key management tasks.
REST API (Azure Resource Manager): Suitable for integrating key management into custom applications or workflows.
For example:
Access Keys: Azure generates two access keys per storage account to allow for seamless key rotation (a rotation sketch follows this list).
Shared Access Signatures (SAS): SAS tokens can be generated at different levels (blob, file service, queue, and table), granting temporary, limited access. Generating SAS tokens at the most granular level, such as for individual blobs, reduces the risk of misuse.
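As a quick example of the rotation workflow, here is a hedged Python sketch that regenerates key1 with azure-mgmt-storage, assuming applications have already been switched to key2. Names are placeholders; verify the call against your SDK version.

```python
# Hedged sketch: rotate one of the two account keys with azure-mgmt-storage.
# Applications should already be using key2 before key1 is regenerated.
# Names are placeholders; verify the call against your SDK version.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

result = client.storage_accounts.regenerate_key(
    "<rg>", "mystorageaccount", {"key_name": "key1"}
)
for key in result.keys:
    print(key.key_name)  # new key values are in key.value; handle them carefully
```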
Monitoring Key Enumeration: To detect potential data exfiltration, look for specific operations that indicate credential enumeration:
LISTKEYS/ACTION Operation: Any instance of "operationName": "MICROSOFT.STORAGE/STORAGEACCOUNTS/LISTKEYS/ACTION" in the logs suggests that a principal has listed the keys. This is a red flag, as unauthorized access to these keys could enable data exfiltration.
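If the Activity log is exported to a Log Analytics workspace, that operation can be hunted with a query like the hedged Python sketch below. The AzureActivity table and column names are the usual Activity-log schema, and the workspace ID is a placeholder.

```python
# Hedged sketch: hunt for listKeys calls in the Activity log via azure-monitor-query.
# Assumes the Activity log is exported to a Log Analytics workspace (AzureActivity
# table); the workspace ID is a placeholder.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AzureActivity
| where OperationNameValue =~ "MICROSOFT.STORAGE/STORAGEACCOUNTS/LISTKEYS/ACTION"
| project TimeGenerated, Caller, CallerIpAddress, ResourceGroup, _ResourceId, ActivityStatusValue
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",
    query=query,
    timespan=timedelta(days=30),
)
for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```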
Configuring Applications for Secure Access
Once a threat actor obtains storage credentials, it becomes straightforward to access and exfiltrate data through applications like Azure Storage Explorer. This tool allows quick configuration using access keys or SAS tokens, so it’s vital to:
Limit Key Distribution: Only authorized users should have access to SAS tokens or keys, ideally with restricted permissions and limited expiry.
Enable StorageRead Logs: The StorageRead log captures read activities, providing visibility into data access. If this log isn’t enabled, data exfiltration activity goes undetected.
Automating Log Enabling with Policies
For organizations with extensive storage account usage, enabling StorageRead logs on each account individually can be infeasible. To streamline this, you can:
Create a Policy for Storage Logs: Set a policy at the management group or subscription level to automatically enable logs for all current and future storage accounts.
Predefined Policies: Azure offers several predefined policies, but currently, none enforce storage account logging by default.
Custom Policy: If needed, a custom policy can be created (e.g., to enable StorageRead logging and direct logs to an Event Hub, a Log Analytics workspace, or other storage). This policy can ensure storage accounts remain compliant with logging requirements (an outline of such a policy rule follows this section).
Policy Constraints and Configuration:
Regional Limitation: When configuring a policy to send logs to an Event Hub, both the Event Hub and the storage account must be in the same region. To capture logs across multiple regions, create corresponding Event Hubs.
Flexible Destinations: Customize the policy to send logs to various destinations, such as Log Analytics or a storage account, depending on organizational needs.
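To give a feel for the shape of such a custom policy, here is a trimmed, hypothetical outline of a deployIfNotExists rule, written as a Python dict for readability. It is not a complete definition: a real policy also needs the embedded ARM deployment template, its parameters, and a managed identity with permission to create diagnostic settings, and the alias used in the existence condition should be verified before use.

```python
# Hypothetical outline of a "deployIfNotExists" policy rule, expressed as a Python
# dict for readability. A real definition also needs the embedded ARM deployment
# template, its parameters, and a managed identity with rights to create diagnostic
# settings; the existence-condition alias should be verified before use.
policy_rule = {
    "if": {
        "field": "type",
        "equals": "Microsoft.Storage/storageAccounts",
    },
    "then": {
        "effect": "deployIfNotExists",
        "details": {
            "type": "Microsoft.Insights/diagnosticSettings",
            "existenceCondition": {
                "field": "Microsoft.Insights/diagnosticSettings/logs[*].enabled",
                "equals": "true",
            },
            # "roleDefinitionIds": [ ... ],
            # "deployment": { ... ARM template that creates the diagnostic setting,
            #                 e.g. StorageRead to an Event Hub or Log Analytics ... },
        },
    },
}
```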
---------------------------------------------------------------------------------------------------------
We will continue in the next blog. Until then, stay safe and keep learning.
Akash Patel
-------------------------------------------------------------------------------------------------------------
Special Thanks (Iqra)
I would like to extend my heartfelt gratitude to one of my dearest colleagues, a Microsoft Certified Trainer, for her invaluable assistance in creating these articles. Without her support, this would not have been possible. Thank you so much for your time, expertise, and dedication!
-------------------------------------------------------------------------------------------------------------