Security Automation with Jira and Airflow

Jayden Zheng
4 min read · Apr 17, 2021

Disclaimer: I do not speak for my employer. These are my views, ideas, and opinions.

Problem

In my setup, Jira tickets are created for alerts triggered by the SIEM. I always wondered how I could improve my investigation workflow to include security automation for containment tasks.

This is where Airflow comes into the picture: it is where I maintain my containment playbooks, mostly written in Python.

In this post, I will share with you how to trigger Airflow DAGs using Jira Automation.

Jira Automation and Airflow 101

Jira Automation is a no-code rule builder that enables customers to build if-this-then-that rules based on events in Jira.

Apache Airflow is a workflow automation and scheduling system that can be used to author and manage data pipelines.

Playbook — AWS Secret Key Compromise

This is an example setup showing how you can trigger a Jira Automation rule based on a field value in the ticket, which in turn triggers the appropriate Airflow DAG.

Airflow DAG Setup

You will need to have your Airflow DAG ready before you set up the Jira Automation.

I’m using the Airflow Experimental REST API in this example for simplicity; you should use the stable REST API instead.

The example code for the Airflow DAG looks like this:
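A minimal sketch, assuming Airflow 1.x to match the Experimental API; the single task simply prints whatever arrives in conf:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def print_params(**kwargs):
    # Parameters sent via the REST API arrive in dag_run.conf
    conf = kwargs["dag_run"].conf or {}
    print(conf)


dag = DAG(
    dag_id="default",
    schedule_interval=None,  # only runs when triggered externally
    start_date=datetime(2021, 1, 1),
    catchup=False,
)

print_task = PythonOperator(
    task_id="print_params",
    python_callable=print_params,
    provide_context=True,  # Airflow 1.x: pass the context into the callable
    dag=dag,
)
```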

You can include parameters in the conf JSON key when triggering your Airflow DAG. Your DAG should then extract the parameters from the JSON key and process them as you wish.

An example of triggering the Airflow DAG with the curl command:
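Assuming Airflow is reachable on localhost:8080; the ticket key and access key below are placeholder values:

```bash
curl -X POST \
  'http://localhost:8080/api/experimental/dags/default/dag_runs' \
  -H 'Content-Type: application/json' \
  -d '{"conf": {"ticket": "SEC-123", "target": "AKIAIOSFODNN7EXAMPLE"}}'
```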

This will trigger the DAG with the dag_id of default, which then prints out the parameters.

Jira Automation Setup

First, you will need to create a custom field with a list of playbook items. In this example, I’m using a Select List with options for key-compromise and instance-compromise.

Create another custom field named target with the field type Text Field (single line).

You should now be able to see the newly created fields and their options in your Jira project.

Once done, head over to the Automation page under Project settings and create a rule that matches the one described below.

This automation rule triggers when the automate field is changed. It then checks whether the chosen value is key-compromise and the target field is not empty; if both conditions hold, it calls the Airflow REST API to run your DAG.

In this example, the rule will send the ticket number and the target value to the Airflow key_compromise DAG.

Things to note:

  • Secure your REST API and then authenticate via the Authorization header.
  • Enter the correct dag_id into the Webhook URL.
  • Select Custom data in Webhook body because we need to send parameters to our DAG via the conf JSON key (see the example body after this list).
  • Custom fields use customfield_XX as the key name; make sure to find the correct ID for your field.
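For reference, the Custom data body might look like the following, where customfield_10042 is a hypothetical ID for the target field and the double-brace expressions are Jira Automation smart values:

```json
{
  "conf": {
    "ticket": "{{issue.key}}",
    "target": "{{issue.customfield_10042}}"
  }
}
```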

Incident

A user reported that his AWS access key was exposed publicly.

A ticket was created with the details, and the access key was entered into the target field. To contain this incident, I selected the key-compromise value in the automate field.

This will trigger the Airflow DAG via the automation rule that we created above.

In my Airflow DAG, the main purpose of the containment_task is to deactivate the access key. Once done, it triggers another task to add a comment back to the Jira ticket.
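A minimal sketch of those two tasks; the Jira URL and credentials are placeholders, and I'm assuming boto3 with permissions to update IAM access keys:

```python
from datetime import datetime

import boto3
import requests
from airflow import DAG
from airflow.operators.python_operator import PythonOperator

JIRA_URL = "https://your-domain.atlassian.net"  # placeholder
JIRA_AUTH = ("bot@example.com", "API_TOKEN")    # placeholder credentials


def containment_task(**kwargs):
    conf = kwargs["dag_run"].conf or {}
    access_key_id = conf["target"]
    iam = boto3.client("iam")
    # get_access_key_last_used also returns the IAM user that owns the key
    user = iam.get_access_key_last_used(AccessKeyId=access_key_id)["UserName"]
    iam.update_access_key(
        UserName=user, AccessKeyId=access_key_id, Status="Inactive"
    )


def comment_task(**kwargs):
    conf = kwargs["dag_run"].conf or {}
    # Add a comment back to the originating Jira ticket
    requests.post(
        f"{JIRA_URL}/rest/api/2/issue/{conf['ticket']}/comment",
        json={"body": f"Access key {conf['target']} has been disabled."},
        auth=JIRA_AUTH,
    )


dag = DAG(
    dag_id="key_compromise",
    schedule_interval=None,
    start_date=datetime(2021, 1, 1),
    catchup=False,
)

contain = PythonOperator(
    task_id="containment_task",
    python_callable=containment_task,
    provide_context=True,
    dag=dag,
)

comment = PythonOperator(
    task_id="comment_task",
    python_callable=comment_task,
    provide_context=True,
    dag=dag,
)

contain >> comment  # only comment on the ticket after containment succeeds
```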

When the key_compromise DAG runs successfully, the incident is contained and the access key is disabled.

Ideally, you should also add an explicit deny IAM policy to that IAM user to deal with any temporary credentials created via AWS STS.
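Such a deny policy might look like the following, using AWS's documented pattern for revoking active sessions; the timestamp is a placeholder for the time of compromise:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Action": "*",
      "Resource": "*",
      "Condition": {
        "DateLessThan": {"aws:TokenIssueTime": "2021-04-17T00:00:00Z"}
      }
    }
  ]
}
```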

The Airflow DAG will then add a comment back to the Jira ticket stating that the access key has been disabled.

Voila! Now you have a way to trigger containment playbooks integrated into your investigation workflow.

You can also trigger an Airflow DAG to pull in additional context regarding the alert from different APIs when the ticket is created.

Hopefully, this is something useful to you. Thanks for reading.
