Install Airflow with different versions in CI & run corresponding jobs #237

Merged: 15 commits, Oct 15, 2024
8 changes: 8 additions & 0 deletions .codecov.yml
@@ -0,0 +1,8 @@
---
coverage:
  status:
    project:
      default:
        target: auto
        threshold: 2%
        only_pulls: true
3 changes: 3 additions & 0 deletions .coveragerc
@@ -0,0 +1,3 @@
[run]
omit =
    tests/*
113 changes: 110 additions & 3 deletions .github/workflows/test.yml
@@ -2,7 +2,7 @@ name: test

on:
  push: # Run on pushes to the default branch
-    branches: [main]
+    branches: [main,airflow-version-tests]
  pull_request_target: # Also run on pull requests originated from forks
    branches: [main]

@@ -22,14 +22,121 @@ jobs:
  Static-Check:
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
        with:
          ref: ${{ github.event.pull_request.head.sha || github.ref }}

-      - uses: actions/setup-python@v3
+      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
          architecture: "x64"

      - run: pip3 install hatch
      - run: CONFIG_ROOT_DIR=`pwd`"/dags" hatch run tests.py3.12-2.10:static-check

  Run-Unit-Tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12" ]
        airflow-version: [ "2.2", "2.3", "2.4", "2.5", "2.6", "2.7", "2.8", "2.9", "2.10" ]
        exclude:
          # Apache Airflow versions prior to 2.3.0 have not been tested with Python 3.10
          # See: https://airflow.apache.org/docs/apache-airflow/2.2.0/installation/prerequisites.html
          - python-version: "3.10"
            airflow-version: "2.2"
          # Apache Airflow versions prior to 2.6.2 have not been tested with Python 3.11
          - python-version: "3.11"
            airflow-version: "2.2"
          - python-version: "3.11"
            airflow-version: "2.3"
          - python-version: "3.11"
            airflow-version: "2.4"
          - python-version: "3.11"
            airflow-version: "2.5"
          - python-version: "3.11"
            airflow-version: "2.6"
          # Apache Airflow versions prior to 2.9.0 have not been tested with Python 3.12.
          # Official support for Python 3.12 and the corresponding constraints.txt are available only for Apache Airflow >= 2.9.0.
          # See: https://github.com/apache/airflow/tree/2.9.0?tab=readme-ov-file#requirements
          # See: https://github.com/apache/airflow/tree/2.8.4?tab=readme-ov-file#requirements
          - python-version: "3.12"
            airflow-version: "2.2"
          - python-version: "3.12"
            airflow-version: "2.3"
          - python-version: "3.12"
            airflow-version: "2.4"
          - python-version: "3.12"
            airflow-version: "2.5"
          - python-version: "3.12"
            airflow-version: "2.6"
          - python-version: "3.12"
            airflow-version: "2.7"
          - python-version: "3.12"
            airflow-version: "2.8"
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ github.event.pull_request.head.sha || github.ref }}

      - uses: actions/cache@v4
        with:
          path: |
            ~/.cache/pip
            .local/share/hatch/
          key: unit-${{ runner.os }}-${{ matrix.python-version }}-${{ matrix.airflow-version }}-${{ hashFiles('pyproject.toml') }}-${{ hashFiles('dagfactory/__init__.py') }}

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install packages and dependencies
        run: |
          python -m pip install uv
          uv pip install --system hatch
          CONFIG_ROOT_DIR=`pwd`"/dags" hatch -e tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }} run pip freeze

      - name: Test DAG Factory against Airflow ${{ matrix.airflow-version }} and Python ${{ matrix.python-version }}
        run: |
          CONFIG_ROOT_DIR=`pwd`"/dags" hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-cov

      - name: Upload coverage to Github
        uses: actions/upload-artifact@v4
        with:
          name: coverage-unit-test-${{ matrix.python-version }}-${{ matrix.airflow-version }}
          path: .coverage
          include-hidden-files: true

  Code-Coverage:
    if: github.event.action != 'labeled'
    needs:
      - Run-Unit-Tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ github.event.pull_request.head.sha || github.ref }}
      - name: Set up Python 3.11
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install coverage
        run: |
          pip3 install coverage
      - name: Download all coverage artifacts
        uses: actions/download-artifact@v4
        with:
          path: ./coverage
      - name: Combine coverage
        run: |
          coverage combine ./coverage/coverage*/.coverage
          coverage report
          coverage xml
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          fail_ci_if_error: true
          token: ${{ secrets.CODECOV_TOKEN }}
          files: coverage.xml
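The exclusion list above encodes one rule per Python version: each has a minimum Airflow release it has been tested against. A minimal sketch of that rule (a hypothetical helper, not part of this PR) makes the matrix easy to audit:

```python
# Minimum Airflow minor release tested per Python version, taken from the
# workflow comments above (hypothetical helper, not part of this PR).
MIN_AIRFLOW_FOR_PYTHON = {
    "3.8": "2.2",
    "3.9": "2.2",
    "3.10": "2.3",  # Airflow < 2.3.0 untested with Python 3.10
    "3.11": "2.7",  # Airflow < 2.6.2 untested with Python 3.11
    "3.12": "2.9",  # Airflow < 2.9.0 has no Python 3.12 constraints
}


def in_ci_matrix(python: str, airflow: str) -> bool:
    """Return True if the (python, airflow) pair survives the exclude list."""
    def as_tuple(version: str) -> tuple:
        return tuple(int(part) for part in version.split("."))

    return as_tuple(airflow) >= as_tuple(MIN_AIRFLOW_FOR_PYTHON[python])
```

With the nine Airflow versions in the matrix, this keeps 32 of the 45 possible pairs, matching the 13 excluded combinations listed in the workflow.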
4 changes: 2 additions & 2 deletions dagfactory/dagbuilder.py
@@ -59,14 +59,14 @@
        from airflow.kubernetes.volume import Volume
        from airflow.kubernetes.volume_mount import VolumeMount
    else:
-        from kubernetes.client.models import V1ContainerPort as Port
        from kubernetes.client.models import (
+            V1ContainerPort as Port,
            V1EnvVar,
            V1EnvVarSource,
            V1ObjectFieldSelector,
            V1Volume,
+            V1VolumeMount as VolumeMount,
        )
-        from kubernetes.client.models import V1VolumeMount as VolumeMount
        from airflow.kubernetes.secret import Secret
        from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator
except ImportError:
44 changes: 37 additions & 7 deletions pyproject.toml
@@ -44,13 +44,36 @@ dev = [
    "tox",
]
tests = [
    "pytest>=6.0",
    "pytest-cov",
    "pre-commit"
]

######################################
# TESTING
######################################

[tool.hatch.envs.tests]
dependencies = [
-    "dag-factory[tests]"
-]
+    "dag-factory[tests]",
+    "apache-airflow~={matrix:airflow}.0,!=2.9.0,!=2.9.1",  # https://github.com/apache/airflow/pull/39670
+]
+pre-install-commands = ["sh scripts/test/pre-install-airflow.sh {matrix:airflow} {matrix:python}"]

[[tool.hatch.envs.tests.matrix]]
python = ["3.8", "3.9", "3.10", "3.11", "3.12"]
airflow = ["2.2", "2.3", "2.4", "2.5", "2.6", "2.7", "2.8", "2.9", "2.10"]

[tool.hatch.envs.tests.overrides]
matrix.airflow.dependencies = [
    { value = "typing_extensions<4.6", if = ["2.6"] },
]

[tool.hatch.envs.tests.scripts]
freeze = "pip freeze"
static-check = " pre-commit run --files dagfactory/*"
test = 'sh scripts/test/unit.sh'
test-cov = 'sh scripts/test/unit-cov.sh'

[project.urls]
Source = "https://github.com/astronomer/dag-factory"
@@ -64,21 +87,28 @@ include = ["dagfactory"]

[tool.hatch.build.targets.wheel]
packages = ["dagfactory"]

-[[tool.hatch.envs.tests.matrix]]
-python = ["3.9", "3.10", "3.11", "3.12"]
-airflow = ["2.8", "2.9", "2.10"]
+[tool.pytest.ini_options]
+filterwarnings = ["ignore::DeprecationWarning"]
+minversion = "6.0"

-[tool.hatch.envs.tests.scripts]
-static-check = " pre-commit run --files dagfactory/*"
+######################################
+# THIRD PARTY TOOLS
+######################################

[tool.black]
line-length = 120
target-version = ['py39', 'py310', 'py311', 'py312']

[tool.ruff]
line-length = 120

[tool.ruff.lint]
select = ["C901", "D300", "I", "F"]
ignore = ["F541", "C901"]

[tool.ruff.lint.isort]
combine-as-imports = true
known-first-party = ["dagfactory", "tests"]

[tool.ruff.lint.mccabe]
max-complexity = 10
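The hatch matrix above generates one test environment per (python, airflow) pair, named `tests.py<python>-<airflow>`; these are the same names the CI workflow invokes, e.g. `hatch run tests.py3.12-2.10:test-cov`. A short sketch of the naming scheme:

```python
from itertools import product

# Mirror of the [[tool.hatch.envs.tests.matrix]] table above.
pythons = ["3.8", "3.9", "3.10", "3.11", "3.12"]
airflows = ["2.2", "2.3", "2.4", "2.5", "2.6", "2.7", "2.8", "2.9", "2.10"]

# Hatch names each generated environment "tests.py<python>-<airflow>".
env_names = [f"tests.py{py}-{af}" for py, af in product(pythons, airflows)]
```

Note that hatch itself generates all 45 environments; the incompatible pairs are pruned CI-side by the workflow's `exclude` list.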
26 changes: 26 additions & 0 deletions scripts/test/pre-install-airflow.sh
@@ -0,0 +1,26 @@
#!/bin/bash

AIRFLOW_VERSION="$1"
PYTHON_VERSION="$2"

# Use this to set the appropriate Python environment in Github Actions,
# while also not assuming --system when running locally.
if [ "$GITHUB_ACTIONS" = "true" ] && [ -z "${VIRTUAL_ENV}" ]; then
    py_path=$(which python)
    virtual_env_dir=$(dirname "$(dirname "$py_path")")
    export VIRTUAL_ENV="$virtual_env_dir"
fi

echo "${VIRTUAL_ENV}"

CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-$AIRFLOW_VERSION.0/constraints-$PYTHON_VERSION.txt"
curl -sSL $CONSTRAINT_URL -o /tmp/constraint.txt
# Workaround to remove PyYAML constraint that will work on both Linux and MacOS
sed '/PyYAML==/d' /tmp/constraint.txt > /tmp/constraint.txt.tmp
mv /tmp/constraint.txt.tmp /tmp/constraint.txt
# Install Airflow with constraints
pip install uv
uv pip install "apache-airflow==$AIRFLOW_VERSION" --constraint /tmp/constraint.txt

pip install apache-airflow-providers-cncf-kubernetes --constraint /tmp/constraint.txt
rm /tmp/constraint.txt
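The script's two moving parts — building the constraints URL for the matrix pair, and dropping the PyYAML pin — can be sketched in Python (hypothetical helpers mirroring the shell above, not part of the PR):

```python
def constraint_url(airflow_version: str, python_version: str) -> str:
    # Airflow publishes per-Python constraint files on branches named
    # "constraints-X.Y.Z"; the script appends ".0" to the minor version.
    return (
        "https://raw.githubusercontent.com/apache/airflow/"
        f"constraints-{airflow_version}.0/constraints-{python_version}.txt"
    )


def drop_pyyaml_pin(lines):
    # Mirrors the sed workaround: remove the PyYAML pin so the package's
    # own PyYAML requirement is not clamped by Airflow's constraints.
    return [line for line in lines if not line.startswith("PyYAML==")]
```

The sed-then-mv dance in the shell script exists only because in-place `sed -i` takes different arguments on Linux and macOS; the Python equivalent has no such portability concern.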
7 changes: 7 additions & 0 deletions scripts/test/unit-cov.sh
@@ -0,0 +1,7 @@
pytest \
-vv \
--cov=dagfactory \
--cov-report=term-missing \
--cov-report=xml \
--durations=0 \
--ignore=tests/test_example_dags.py
4 changes: 4 additions & 0 deletions scripts/test/unit.sh
@@ -0,0 +1,4 @@
pytest \
-vv \
--durations=0 \
--ignore=tests/test_example_dags.py
10 changes: 7 additions & 3 deletions tests/test_dagbuilder.py
@@ -145,7 +145,7 @@
                "python_callable_name": "expand_task",
                "python_callable_file": os.path.realpath(__file__),
                "partial": {"op_kwargs": {"test_id": "test"}},
-                "expand": {"op_args": "request.output"},
+                "expand": {"op_args": {"request_output": "request.output"}},
            },
        },
    }
@@ -662,7 +662,11 @@ def test_make_timetable():
    timetable_params = {"cron": "0 8,16 * * 1-5", "timezone": "UTC"}
    actual = td.make_timetable(timetable, timetable_params)
    assert actual.periodic
-    assert actual.can_run
+    try:
+        assert actual.can_run
+    except AttributeError:
+        # can_run attribute was removed and replaced with can_be_scheduled in later versions of Airflow.
+        assert actual.can_be_scheduled


def test_make_dag_with_callback():
@@ -736,7 +740,7 @@ def test_dynamic_task_mapping():
        "python_callable_name": "expand_task",
        "python_callable_file": os.path.realpath(__file__),
        "partial": {"op_kwargs": {"test_id": "test"}},
-        "expand": {"op_args": "request.output"},
+        "expand": {"op_args": {"request_output": "request.output"}},
    }
    actual = td.make_task(operator, task_params)
    assert isinstance(actual, MappedOperator)
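The try/except added to `test_make_timetable` guards an attribute rename across Airflow versions. An equivalent, more compact pattern (a sketch of the idea, not what the PR uses) reads whichever attribute the installed version provides via `getattr`:

```python
def can_schedule(timetable) -> bool:
    # Airflow renamed Timetable.can_run to can_be_scheduled; fall back to
    # whichever attribute the installed version exposes.
    return bool(
        getattr(timetable, "can_be_scheduled", None)
        or getattr(timetable, "can_run", None)
    )
```

The explicit try/except in the test is arguably clearer for a test suite, since it asserts on the exact attribute rather than silently accepting either.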
4 changes: 2 additions & 2 deletions tests/test_dagfactory.py
@@ -28,7 +28,7 @@
            "end_date": "2020-01-01",
        },
        "default_view": "graph",
-        "schedule_interval": "daily",
+        "schedule_interval": "@daily",
    },
    "example_dag": {
        "tasks": {
@@ -376,7 +376,7 @@ def test_dagfactory_dict():
        "end_date": "2020-01-01",
    },
    "default_view": "graph",
-    "schedule_interval": "daily",
+    "schedule_interval": "@daily",
}
expected_dag = {
    "example_dag": {