Airflow declarative DAGs via YAML.
Compatibility:
- Python 2.7 / 3.5+
- Airflow 1.10.4+
- Declarative DAGs in plain-text YAML make it much easier to understand what a DAG will look like. Made for humans, not programmers.
- It makes it extremely hard to turn your DAGs into a code mess. Even if you build a complicated YAML generator, the result will still be readable by humans.
- No more guilt about coupling business logic with the task management system (Airflow). The two can now coexist separately.
- Static analysis becomes a trivial task.
- It's a good abstraction for creating your own scheduler/worker compatible with the original Airflow one.
Check the tests/dags directory for examples of DAGs that will work and ones that won't. Use the src/airflow_declarative/schema.py module as the reference for the YAML file schema; it should be self-descriptive.
Don't be shy to experiment: trafaret-config will help you understand what went wrong, why, and where.
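As a quick sanity check, a YAML file can also be loaded directly from a Python shell. The sketch below is illustrative: the path dags/my_dag.yml is a placeholder, and airflow_declarative.from_path() returns a list of Airflow DAG objects because a single YAML file may define several DAGs.

import airflow_declarative

# Load a single declarative DAG file; from_path() returns a list of
# Airflow DAG objects, since one YAML file may define several DAGs.
dags = airflow_declarative.from_path("dags/my_dag.yml")  # placeholder path

for dag in dags:
    print(dag.dag_id, [task.task_id for task in dag.tasks])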
We support two installation options:
- As a complementary side package for the upstream Airflow.
- As a built-in Airflow functionality using patches for Airflow.
The idea is to put a Python script into the dags_folder which would load the declarative DAGs via airflow_declarative. More details: Installation using Upstream Airflow.
An example of such a script:
import os

import airflow_declarative

# Assuming that the yaml dags are located in the same directory
# as this Python module:
root = os.path.dirname(__file__)

dags_list = [
    airflow_declarative.from_path(os.path.join(root, item))
    for item in os.listdir(root)
    if item.endswith((".yml", ".yaml"))
]
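
# Airflow picks up DAG objects from a module's global namespace,
# so expose every loaded DAG under its dag_id: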
globals().update({dag.dag_id: dag for dags in dags_list for dag in dags})
We provide ready-to-use patches in the patches directory. To use them, apply the patch for the corresponding Airflow version and then build Airflow yourself. More details: Installation using Patched Airflow.