Azure Data Factory Bugs Expose Cloud Infrastructure

Three vulnerabilities in the service's Apache Airflow integration could have allowed attackers to take shadow administrative control over an enterprise's cloud infrastructure, gain access to and exfiltrate data, and deploy malware.


Three flaws discovered in the way Microsoft's Azure-based data integration service leverages an open source workflow orchestration platform could have allowed an attacker to achieve administrative control over companies' Azure cloud infrastructures, exposing enterprises to data exfiltration, malware deployment, and unauthorized data access.

Researchers at Palo Alto Networks' Unit 42 discovered the vulnerabilities — two of which were misconfigurations and the third involved weak authentication — in Azure Data Factory's Apache Airflow integration. Data Factory enables users to manage data pipelines when moving information between different sources, while Apache Airflow facilitates the scheduling and orchestration of complex workflows.

While Microsoft classified the flaws as low-severity vulnerabilities, Unit 42 researchers found that exploiting them successfully could allow an attacker to gain persistent access as a shadow administrator over the entire Airflow Azure Kubernetes Service (AKS) cluster, they revealed in a blog post published Dec. 17.

Specifically, the flaws discovered in Data Factory were: a misconfigured Kubernetes role-based access control (RBAC) in the Airflow cluster; misconfigured secret handling in Azure's internal Geneva service, which is responsible for managing critical logs and metrics; and weak authentication for Geneva.


Unauthorized Azure Cloud Access Already Mitigated

The Airflow instance's use of default, unchangeable configurations combined with the cluster admin role's attachment to the Airflow runner "caused a security issue" that could be manipulated "to control the Airflow cluster and related infrastructure," the researchers explained.

If an attacker were able to breach the cluster, they could also manipulate Geneva, "to potentially tamper with log data or access other sensitive Azure resources," Unit 42 AI and security research manager Ofir Balassiano and senior security researcher David Orlovsky wrote in the post.

Overall, the flaws highlight the importance of managing service permissions and monitoring the operations of critical third-party services within a cloud environment to prevent unauthorized access to a cluster.

Unit 42 informed Microsoft Azure of the flaws, which ultimately were resolved by the Microsoft Security Response Center. The researchers did not specify what fixes were made to mitigate the vulnerabilities, and Microsoft did not immediately respond to a request for comment.

How Cyberattackers Gain Initial Administrative Access


An initial exploit scenario lies in an attacker's ability to gain unauthorized write permissions to a directed acyclic graph (DAG) file used by Apache Airflow. DAG files define the workflow structure as Python code; they specify the sequence in which tasks should be executed, the dependencies between tasks, and scheduling rules.
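Airflow itself is not assumed installed here, but the core idea of a DAG file, tasks plus dependencies determining execution order, can be sketched with Python's standard library (the task names below are hypothetical):

```python
# A minimal sketch (not Airflow itself) of what a DAG file expresses:
# tasks, the dependencies between them, and the execution order they imply.
from graphlib import TopologicalSorter

# Each key lists the tasks that must finish before it may run,
# mirroring a typical extract -> transform/validate -> load pipeline.
dag = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" is first, "load" is last
```

An attacker who can edit such a file controls what code runs at each of those steps, which is why write access to DAG files is so sensitive.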

Attackers have two ways to gain access to and tamper with DAG files. The first is to gain write permissions to the storage account containing the DAG files, either by leveraging a principal account that has write permissions or by using a shared access signature (SAS) token, which grants temporary, limited access to a DAG file.
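The temporary, limited nature of a SAS token can be sketched with a simplified HMAC-signed token. This is not Azure's actual SAS format; the field layout and key here are hypothetical:

```python
import hashlib
import hmac
import time

# Hypothetical shared key; real SAS tokens are signed with the storage
# account key and use a different field layout and encoding.
ACCOUNT_KEY = b"hypothetical-shared-account-key"

def make_sas(resource: str, permissions: str, expiry: int) -> str:
    """Issue a token granting `permissions` on `resource` until `expiry`."""
    payload = f"{resource}|{permissions}|{expiry}"
    sig = hmac.new(ACCOUNT_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def check_sas(token: str, now: int) -> bool:
    """Accept the token only if the signature matches and it is unexpired."""
    resource, permissions, expiry, sig = token.split("|")
    payload = f"{resource}|{permissions}|{expiry}"
    expected = hmac.new(ACCOUNT_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < int(expiry)

token = make_sas("dags/pipeline.py", "w", expiry=int(time.time()) + 3600)
print(check_sas(token, now=int(time.time())))                         # True: valid, unexpired
print(check_sas(token.replace("|w|", "|rw|"), now=int(time.time())))  # False: tampered
```

The signature binds the resource, permissions, and expiry together, so a holder cannot widen the grant; but a leaked token of this kind is exactly the temporary write window described above.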

In this scenario, once a DAG file is tampered with, "it lies dormant until the DAG files are imported by the victim," the researchers explained.

The second way is to gain access to a Git repository using leaked credentials or a misconfigured repository. Once this occurs, the attacker can create a malicious DAG file or modify an existing one, and the directory containing the malicious DAG file is imported automatically.

In their attack flow, Unit 42 researchers used the Git repository leaked credentials scenario to access a DAG file. "In this case, once the attacker manipulates the compromised DAG file, Airflow executes it, and the attacker gets a reverse shell," they explained in the post.


The basic exploit workflow, then, involves an attacker first crafting a DAG file that opens a reverse shell to a remote server and runs automatically when imported. The malicious DAG file is then uploaded to a private GitHub repository connected to the Airflow cluster.

"Airflow imports and runs the DAG file automatically from the connected Git repository, opening a reverse shell on an Airflow worker," the researchers explained. "At this point, we gained cluster admin privileges due to a Kubernetes service account that was attached to an Airflow worker."
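The privilege escalation hinges on which cluster role the worker's service account is bound to. A toy sketch of that check, using hypothetical binding data rather than a live Kubernetes API query:

```python
# Hypothetical snapshot of ClusterRoleBindings: role name -> bound subjects,
# each subject a (kind, namespace, name) tuple. A real audit would pull this
# from the Kubernetes API rather than a hard-coded dict.
bindings = {
    "cluster-admin": [
        ("ServiceAccount", "airflow", "airflow-worker"),  # the risky binding
    ],
    "view": [
        ("ServiceAccount", "monitoring", "metrics-reader"),
    ],
}

def roles_for(kind: str, namespace: str, name: str) -> set[str]:
    """Return every cluster role bound to the given subject."""
    return {
        role
        for role, subjects in bindings.items()
        if (kind, namespace, name) in subjects
    }

# A worker pod running under this service account inherits cluster-admin,
# so a reverse shell on that pod controls the whole cluster.
print(roles_for("ServiceAccount", "airflow", "airflow-worker"))
```

Auditing for exactly this pattern, workload service accounts bound to cluster-admin, is the defensive counterpart of the escalation the researchers describe.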

The attack can then escalate from there to take over a cluster; use the shadow admin access to create shadow workloads for cryptomining or running other malware; exfiltrate data from the enterprise cloud; and exploit Geneva to reach other Azure endpoints for further malicious activity, the researchers wrote.

Cloud Security Should Extend Beyond the Cluster

Cloud-based attacks often begin with attackers pouncing on local misconfigurations, and the exploit flow again highlights how an entire cloud environment can be exposed to risk due to flaws exploited within a single node or cluster.

The scenario demonstrates the importance of going beyond merely securing the perimeter of a cloud cluster to a more comprehensive approach to cloud security that takes into consideration what happens if attackers break this boundary, according to Unit 42.

This strategy should include "securing permissions and configurations within the environment itself, and using policy and audit engines to help detect and prevent future incidents both within the cluster and in the cloud," the researchers wrote.

Enterprises also should safeguard sensitive data assets that interact with different cloud services, and map which data is processed by which service, the researchers added. This ensures that service dependencies are taken into consideration when securing the cloud.

About the Author

Elizabeth Montalbano, Contributing Writer

Elizabeth Montalbano is a freelance writer, journalist, and therapeutic writing mentor with more than 25 years of professional experience. Her areas of expertise include technology, business, and culture. Elizabeth previously lived and worked as a full-time journalist in Phoenix, San Francisco, and New York City; she currently resides in a village on the southwest coast of Portugal. In her free time, she enjoys surfing, hiking with her dogs, traveling, playing music, yoga, and cooking.

