Use GitHub service containers instead of Docker Compose #82

Merged
merged 12 commits, Oct 18, 2023
61 changes: 35 additions & 26 deletions .github/workflows/provider.yaml
@@ -12,6 +12,31 @@ jobs:
env:
KEY: "AKIAIOSFODNN7EXAMPLE"
SECRET: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

services:
lakefs:
image: treeverse/lakefs:latest
env:
LAKEFS_AUTH_ENCRYPT_SECRET_KEY: "some random secret string"
LAKEFS_DATABASE_TYPE: local
LAKEFS_BLOCKSTORE_TYPE: local
LAKEFS_GATEWAYS_S3_DOMAIN_NAME: s3.local.lakefs.io:8000
LAKEFS_LOGGING_LEVEL: TRACE
LAKEFS_STATS_ENABLED: false
LAKEFS_INSTALLATION_USER_NAME: docker
LAKEFS_INSTALLATION_ACCESS_KEY_ID: ${{ env.KEY }}
LAKEFS_INSTALLATION_SECRET_ACCESS_KEY: ${{ env.SECRET }}
LAKECTL_SERVER_ENDPOINT_URL: http://localhost:8000
LAKECTL_CREDENTIALS_ACCESS_KEY_ID: ${{ env.KEY }}
LAKECTL_CREDENTIALS_SECRET_ACCESS_KEY: ${{ env.SECRET }}
ports:
- 8000:8000
options: >-
--name lakefs
--health-cmd "curl --fail -LI http://localhost:8000/_health"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- name: Checkout
uses: actions/checkout@v4
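
For context on the new services block: GitHub passes each service's options string through to docker create and waits for the health check to pass before running steps. Roughly the equivalent docker run, as a sketch with only two of the env vars shown (the remaining LAKEFS_*/LAKECTL_* values are passed with -e the same way):

    # Approximate docker run implied by the `lakefs` service above (not what
    # the runner executes verbatim; most -e flags omitted for brevity)
    docker run -d --name lakefs \
      -p 8000:8000 \
      -e LAKEFS_DATABASE_TYPE=local \
      -e LAKEFS_BLOCKSTORE_TYPE=local \
      --health-cmd "curl --fail -LI http://localhost:8000/_health" \
      --health-interval 10s \
      --health-timeout 5s \
      --health-retries 5 \
      treeverse/lakefs:latest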
@@ -41,34 +66,24 @@ jobs:
run: printf "\nRUN pip install --user dist/airflow_provider_lakefs-*-py3-none-any.whl" >> astro/Dockerfile

- name: Start astro
run: |
cd astro && astro dev start

- name: spin up lakeFS
run: docker-compose -f ops/docker-compose.yaml up --quiet-pull -d && sleep 30

- name: Setup lakeFS
run: |
curl localhost:8000/api/v1/setup_lakefs -H "Content-Type: application/json" --request POST --data '{"username":"test","key":{"access_key_id":"${{ env.KEY }}","secret_access_key":"${{ env.SECRET }}"}}'
working-directory: astro
run: astro dev start

- name: Create test repo
run: |
export BASIC_AUTH=$(echo -n "${{ env.KEY }}:${{ env.SECRET }}" | base64)
curl localhost:8000/api/v1/repositories -H "Content-Type: application/json" -H "Authorization: Basic $(echo $BASIC_AUTH | tr -d ' ')" --request POST --data '{"name":"example-repo","storage_namespace":"local://data/"}'
curl -u '${{ env.KEY }}:${{ env.SECRET }}' -H 'Content-Type: application/json' -X POST --data '{"name":"example-repo","storage_namespace":"local://data/"}' 'http://localhost:8000/api/v1/repositories'
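
If a sanity check on the new repo were ever needed before triggering the DAG, lakeFS exposes a GET for a single repository. A hypothetical extra step body, not part of this change:

    # expect HTTP 200 once example-repo exists; --fail makes the step error otherwise
    curl --fail -u '${{ env.KEY }}:${{ env.SECRET }}' \
      http://localhost:8000/api/v1/repositories/example-repo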

- name: Run lakeFS DAG
working-directory: astro
run: |
cd astro
astro dev run connections add conn_lakefs --conn-type=HTTP --conn-host=http://172.17.0.1:8000 --conn-login="${{ env.KEY }}" --conn-password="${{ env.SECRET }}"
astro dev run dags unpause lakeFS_workflow
astro dev run dags trigger lakeFS_workflow
sleep 30
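
The fixed sleep 30 gives the run time to finish before the status check below; if that ever proves flaky, one alternative is to poll the run state through the Airflow CLI. A rough sketch, assuming Airflow 2's dags list-runs output format (unverified here):

    # poll up to ~150s until no run of the DAG reports 'running'
    for i in $(seq 1 30); do
      astro dev run dags list-runs -d lakeFS_workflow | grep -q running || break
      sleep 5
    done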

- name: Run DAG state check script
id: dag_status_id
run: |
chmod +x dag_status.py
python3 dag_status.py
run: python3 dag_status.py

- name: Wait until Airflow makes output file available on main
env:
@@ -87,20 +102,14 @@ jobs:
with:
timeout_minutes: 3
max_attempts: 30
command: docker-compose -f ops/docker-compose.yaml exec -T lakefs ls lakefs/data/block/data/symlinks/example-repo/example-branch/path/to/symlink.txt

- name: lakeFS logs
if: ${{ always() }}
run: docker-compose -f ops/docker-compose.yaml logs --tail=1000 lakefs
command: docker exec lakefs ls lakefs/data/block/data/symlinks/example-repo/example-branch/path/to/symlink.txt
Contributor
I think this loses the logs? Or does GitHub keep them for the service container?

Contributor Author

GitHub dumps the service container logs at the end of the job as part of its post-run processing, e.g.: https://github.com/treeverse/airflow-provider-lakeFS/actions/runs/6533424810/job/17738585733?pr=82
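
For anyone who does want the lakeFS logs inline in the job output, a minimal step body along these lines should work, since the service is started with --name lakefs (a sketch mirroring the deleted docker-compose step, and it would sit under if: always() like the steps below; untested here):

    # dump the last lines from the named service container
    docker logs --tail 1000 lakefs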


- name: airflow scheduler logs
if: ${{ always() }}
run: |
cd astro
astro dev logs --scheduler
working-directory: astro
run: astro dev logs --scheduler

- name: airflow triggerer logs
if: ${{ always() }}
run: |
cd astro
astro dev logs --triggerer
working-directory: astro
run: astro dev logs --triggerer
24 changes: 0 additions & 24 deletions ops/docker-compose.yaml

This file was deleted.