Automating Azure Sentinel: Using Playbooks to extract data

Robert Kitching - Senior Security Development Engineer for Bridewell Consulting


Azure Sentinel is Microsoft’s cloud-native SIEM/SOAR and is quickly becoming the security tool of choice for many security teams around the world. It is highly capable and very extensible, which we will cover in more detail in this blog.

It connects natively to numerous Microsoft products through built-in, easy-to-configure data connectors.

It also supports automation through Playbooks. You may have seen something similar across other Microsoft products in the guise of Microsoft Flow or Logic Apps.

Playbooks allow you to run automation steps to schedule, automate, and orchestrate tasks and workflows without the need to write bespoke applications. They are provisioned in the cloud, so you do not need to think about separate infrastructure requirements.

They have distinct advantages for security monitoring, as they are quick to develop and simple to understand. This can help mitigate skills gaps within the team, as users are not required to have prior knowledge of specific programming languages such as Python, C#, or PowerShell.

The problem?

Azure Sentinel is a great tool right out of the box, but it currently lacks some key features. One of these is the ability to extract all the metadata related to security incidents in a simple and effective way.

This is useful if you want to monitor KPIs, measure the effectiveness of Sentinel detections, or simply produce a data dump. The data could then be imported into a variety of tools such as Power BI or a custom workbook.

Searching through the Microsoft GitHub page and looking at some community blogs, we came across the Azure Sentinel API (currently still in preview). This API has methods for retrieving incident metadata and, importantly, also has methods to pull the corresponding comment data for each incident.
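As a rough sketch of what the playbook calls, the incidents endpoint looks something like the following. The path segments and api-version here are assumptions modelled on the preview Azure Sentinel REST API; substitute your own subscription, resource group, and workspace names.

```python
# Sketch: composing the Azure Sentinel incidents list URL.
# Path and api-version are assumptions based on the preview API.
BASE = "https://management.azure.com"

def incidents_url(subscription_id: str, resource_group: str,
                  workspace: str,
                  api_version: str = "2019-01-01-preview") -> str:
    """Compose the REST URL for listing Sentinel incidents."""
    return (
        f"{BASE}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.OperationalInsights/workspaces/{workspace}"
        f"/providers/Microsoft.SecurityInsights/incidents"
        f"?api-version={api_version}"
    )
```

A real call to this URL would need an authenticated GET request, which the playbook handles via managed identity.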

We could have created some local scripts to pull the required information and format the data, and it would have been job done; but that is not very collaborative or cloud-centric, so we decided a Playbook would be a perfect fit.

The solution…

It took a few iterations and some trial and error, but we finally came up with a functioning solution. You can find the templated version of this Playbook on the official Microsoft Sentinel GitHub.

You can find an annotated guide to the playbook included below.


This Playbook requires a managed identity. You will need to turn on managed identity for the Playbook, then assign the RBAC ‘Log Analytics Reader’ role to the Logic App at the required scope.


This is the recurrence trigger. Frequency and interval can be altered.

This is a variable used to filter the API call so that it only returns the required data.

Set another variable with an object containing information on the current workspace.

Initialise an object variable to hold the required settings. This uses values from the ‘workflow’ variable from the previous step.

NB: if your Playbook is in a different subscription or resource group, replace these with the correct values. The workflow step will then not be required.
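For illustration, the settings object might look something like this (the field names are hypothetical placeholders; use whichever keys the later steps reference):

```json
{
  "subscriptionId": "<your-subscription-id>",
  "resourceGroup": "<your-resource-group>",
  "workspaceName": "<your-workspace-name>"
}
```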

Initialise a variable to store the API results.

Initialise a URL variable for use within the GET request.


The next step is to make a GET request to the security incidents API. The API supports pagination, so we will need to handle this within the Playbook; this is why we created the URL variable and incidents array variable in the previous steps.

The HTTP action sits within an Until control. The API returns a nextLink in the response; if the link is empty, the Logic App proceeds to the next step.

If the link is populated, the URL variable is replaced with the nextLink and the steps within the Until control run again, continuing to fetch results until the nextLink is blank.

The ‘Get Incidents’ HTTP action requires managed identity.

Unions the result with the existing incidents array.

Replaces the incidents array with the union result.

Sets the URL variable to the new URL if it exists.
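The Until-loop pattern described above can be sketched as follows. Here `fetch_page` is a stand-in for the ‘Get Incidents’ HTTP action; a real implementation would issue an authenticated GET request, and the `value`/`nextLink` field names are assumptions based on the typical Azure REST list-response shape.

```python
def fetch_all_incidents(first_url, fetch_page):
    """Follow nextLink pages, unioning results, until nextLink is blank."""
    incidents = []
    url = first_url
    while url:                              # the Until control
        page = fetch_page(url)              # the 'Get Incidents' HTTP action
        incidents += page.get("value", [])  # union with the incidents array
        url = page.get("nextLink", "")      # blank nextLink exits the loop
    return incidents
```

The union-then-replace steps in the playbook collapse here into a single list append; the loop exits as soon as the response carries no nextLink.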


At this stage we have all the required incidents from the API in the incidents array placeholder. We now need to parse this data into something meaningful and extract any corresponding comments for each incident.

Parse the raw JSON response using a predefined schema.

Initialise a result variable to contain our output.

Initialise variables to hold the nested array information for each incident.


Now that we have parsed the data and set up the placeholders, we loop through each incident and attempt to pull any corresponding comments. If we do not find any comments, we continue with step 5.

A For each control loops through all the returned incidents.

Luckily, the incident metadata contains a comment count, so we can use this to check whether it’s worth making the additional HTTP request.

NB: Parallelism is set to 1 for this loop as it writes to placeholders outside the loop. 

If the incident contains comments, make an additional API call, passing the incident ID, to retrieve the comments for that incident.

Parse the raw JSON response.

Another loop is required, as the comments are returned as an array. The response is looped through and the comments are concatenated into a string variable.
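The check-count-then-fetch-and-concatenate logic might be sketched like this. The field paths (`properties.additionalData.commentsCount`, `properties.message`) are assumptions modelled on the preview incident schema, and `fetch_comments` stands in for the comments API call.

```python
def gather_comments(incident, fetch_comments):
    """Concatenate an incident's comments into one string, skipping
    the extra API call when the metadata says there are none."""
    count = (incident.get("properties", {})
                     .get("additionalData", {})
                     .get("commentsCount", 0))
    if count == 0:
        return ""                           # no comments: skip the request
    comments = fetch_comments(incident["name"])   # pass the incident ID
    return "; ".join(
        c.get("properties", {}).get("message", "") for c in comments
    )
```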


We are still within the original for each loop at this stage. The next step is to extract the required fields from the incidents record and any additional comments if found.


A loop to flatten the product names into a string placeholder, as they are returned in a nested array.
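The flattening step amounts to joining a nested array into one string, something like the sketch below. The field path (`properties.additionalData.alertProductNames`) is an assumption about where the product names sit in the incident record; adjust it to match your schema.

```python
def flatten_product_names(incident):
    """Join the incident's nested product-name array into one string."""
    names = (incident.get("properties", {})
                     .get("additionalData", {})
                     .get("alertProductNames", []))
    return ", ".join(names)
```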

This creates the required output taking data from the original incident records and the placeholders that we have previously populated.

NB: You can adjust this as required.

Clear the placeholders in preparation for the next incident loop.


Once the loop has exited, we should have a result output array containing the data we require. The final step is to convert the array to a CSV table and email it to a specific address.

Creates a CSV table based on the result array.

Sends an email to a specific address, attaching the CSV.
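The ‘Create CSV table’ action is equivalent to turning a list of uniform records into CSV text, as in this minimal sketch (the column names here are purely illustrative):

```python
import csv
import io

def to_csv_table(rows):
    """Render a list of dicts as a CSV string, headers from the first row."""
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

The resulting string is what the playbook attaches to the outgoing email.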


Azure Sentinel may not have all the features you need straight out of the box, but with some effort and understanding you can tweak it to your custom requirements. Breaking the problem down into small steps will help you understand how it all pieces together.

Microsoft and the wider community also have a big part to play, with various contributions improving its capabilities. You can find Bridewell and community contributions on the official Microsoft Sentinel GitHub page.

Your webinar host

Gavin Knapp

Thu, Jun 25, 2020 10:00 AM - 10:40 AM BST

The threat landscape has evolved dramatically over recent years. Cyber attacks are now a common occurrence and have become increasingly more sophisticated. Conversely the barrier to entry to perform such attacks has been drastically lowered due to technological advances.

In addition to the increase in targeted cyber attacks, we see organisations of all sizes rapidly embracing digital transformation programmes and adopting modern ways of working to gain a competitive advantage over peers. The majority of these organisations now find their information and assets existing far beyond a traditional corporate perimeter and, in most cases, readily accessible through the public cloud. The perimeter now extends beyond your firewalls, and identity has become the new perimeter.

It is well publicised that the dwell time between successful attacks and subsequent detection is significant. The Verizon 2019 Data Breach Investigations Report indicates that attackers gain access to systems within minutes whilst successful detection of these attacks can take months.

How confident are you that your organisation is equipped to successfully prevent, detect, respond to, and recover from cyber attacks?

This webinar will take you through how we enable our clients to detect and respond to today’s common attack scenarios using Microsoft’s flagship next gen SIEM and SOAR platform – Azure Sentinel.

The webinar will include practical scenarios covering popular techniques from the MITRE ATT&CK framework.

Core components:

  • Brief overview of Azure Sentinel
  • Analytics rule scenario
  • Hunting scenario
  • Bring your own threat intel scenario
  • Q&A

Need help with Azure Sentinel?

At Bridewell we have a number of senior consultants who are highly skilled Azure Sentinel specialists. If you have any questions or would like further information, please get in touch.
