Warning
The majority of operators and sensors within this repository have been deprecated and will not receive further updates. Read more about the deprecation in the Deprecation Notice section below.
With the release 1.19.0 of the astronomer-providers package, most of the operators and sensors are deprecated and will no longer receive updates. We recommend migrating to the official Apache Airflow Providers for the latest features and support.
For the operators and sensors that are deprecated in this repository, migrating to the official Apache Airflow Providers is as simple as changing the import path from
from astronomer.providers.*.*.operator_module import SomeOperatorAsync
to
from airflow.providers.*.*.operator_module import SomeOperator
and setting the deferrable argument to True when using the operator or sensor in your DAG. Setting deferrable to True ensures that the operator or sensor uses the async version from the official Apache Airflow Providers.
For example, to migrate from astronomer.providers.amazon.aws.operators.batch.BatchOperatorAsync to airflow.providers.amazon.aws.operators.batch.BatchOperator, simply change the import path and pass the deferrable argument:
BatchOperator(
    task_id="copy_object",
    your_arguments,
    your_keyword_arguments,
    deferrable=True,
)
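For reference, the corresponding import change for this example is just the following (a sketch mirroring the module paths named above):

# Before (deprecated):
from astronomer.providers.amazon.aws.operators.batch import BatchOperatorAsync

# After (official Amazon provider):
from airflow.providers.amazon.aws.operators.batch import BatchOperator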
For more information on using the deferrable operators and sensors from the official Apache Airflow Providers, visit the following links:
- https://airflow.apache.org/docs/apache-airflow-providers/index.html
- https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/deferring.html
Note
Although the default value for the deferrable argument is False, you can configure the default across your deployment by setting the default_deferrable flag in the operators section of your Airflow configuration. Once you set default_deferrable to True, you can remove the deferrable argument from your operators and sensors, and they will use the async version from the official Apache Airflow Providers if it exists.
See more at: https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#default-deferrable
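For example, the deployment-wide default can be set in airflow.cfg as follows (a minimal sketch; the same setting can also be supplied via the AIRFLOW__OPERATORS__DEFAULT_DEFERRABLE environment variable):

[operators]
default_deferrable = True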
If you run into issues during the migration, we suggest opening a GitHub discussion.
Install and update using pip:
pip install astronomer-providers
This installs only the dependencies for the core provider. To install all dependencies, run:
pip install 'astronomer-providers[all]'
To install only the dependencies for a specific provider, specify the integration name as an extra. For example, to install the Kubernetes provider dependencies, run:
pip install 'astronomer-providers[cncf.kubernetes]'
Extra Name | Installation Command | Dependencies
---|---|---
all | pip install 'astronomer-providers[all]' | All
amazon | pip install 'astronomer-providers[amazon]' | Amazon
apache.hive | pip install 'astronomer-providers[apache.hive]' | Apache Hive
apache.livy | pip install 'astronomer-providers[apache.livy]' | Apache Livy
cncf.kubernetes | pip install 'astronomer-providers[cncf.kubernetes]' | CNCF Kubernetes
databricks | pip install 'astronomer-providers[databricks]' | Databricks
dbt.cloud | pip install 'astronomer-providers[dbt.cloud]' | Dbt Cloud
google | pip install 'astronomer-providers[google]' | Google
http | pip install 'astronomer-providers[http]' | HTTP
microsoft.azure | pip install 'astronomer-providers[microsoft.azure]' | Microsoft Azure
openlineage | pip install 'astronomer-providers[openlineage]' | Openlineage
sftp | pip install 'astronomer-providers[sftp]' | SFTP
snowflake | pip install 'astronomer-providers[snowflake]' | Snowflake
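Extras can also be combined in a single install when you need more than one provider, for example (standard pip extras syntax):
pip install 'astronomer-providers[amazon,google]'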
This repo is structured the same as Apache Airflow's source code, so, for example, if you want to import async operators, you can import them as follows:
from astronomer.providers.amazon.aws.sensors.s3 import S3KeySensorAsync as S3KeySensor
waiting_for_s3_key = S3KeySensor(
    task_id="waiting_for_s3_key",
    bucket_key="sample_key.txt",
    wildcard_match=False,
    bucket_name="sample-bucket",
)
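The sensor can then be used inside a DAG like any other task. Below is a minimal sketch; the DAG id, start date, and schedule are illustrative (the schedule argument requires Airflow 2.4+; use schedule_interval on older versions):

from datetime import datetime

from airflow import DAG
from astronomer.providers.amazon.aws.sensors.s3 import S3KeySensorAsync as S3KeySensor

with DAG(
    dag_id="example_s3_key_sensor_async",  # illustrative DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
):
    waiting_for_s3_key = S3KeySensor(
        task_id="waiting_for_s3_key",
        bucket_key="sample_key.txt",
        wildcard_match=False,
        bucket_name="sample-bucket",
    )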
Example DAGs for each provider are within the respective provider's folder. For example, the Kubernetes provider's DAGs are within the astronomer/providers/cncf/kubernetes/example_dags folder.
We will only create async operators for the "sync-version" of operators that do some level of polling (take more than a few seconds to complete). For example, we won't create an async operator for BigQueryCreateEmptyTableOperator, but we will create one for BigQueryInsertJobOperator, which actually runs queries and can take hours in the worst case for task completion.
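To illustrate, a long-running query job is exactly the kind of task worth deferring. With the official Google provider this looks roughly like the following (a sketch; the SQL and task id are illustrative, and deferrable support requires a recent apache-airflow-providers-google release):

from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

run_long_query = BigQueryInsertJobOperator(
    task_id="run_long_query",
    configuration={
        "query": {
            "query": "SELECT COUNT(*) FROM `my_project.my_dataset.my_table`",  # illustrative query
            "useLegacySql": False,
        }
    },
    deferrable=True,  # run with the deferrable (async) implementation
)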
To create async operators, we need to inherit from the corresponding Airflow sync operators. If a sync version isn't available, then inherit from the Airflow BaseOperator.
To create async sensors, we need to inherit from the corresponding sync sensors. If a sync version isn't available, then inherit from the Airflow BaseSensorOperator.
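As a rough illustration of that pattern, an async sensor subclasses the sync sensor and defers to a trigger instead of occupying a worker slot. The sketch below assumes Airflow 2.2+; FileSensorAsync and the TimeDeltaTrigger used here are illustrative, since real async sensors defer to provider-specific triggers that do the actual polling:

from datetime import timedelta

from airflow.sensors.filesystem import FileSensor
from airflow.triggers.temporal import TimeDeltaTrigger


class FileSensorAsync(FileSensor):
    """Illustrative async variant that inherits from the corresponding sync sensor."""

    def execute(self, context):
        # Instead of poking in a worker slot, hand the wait over to the triggerer.
        self.defer(
            trigger=TimeDeltaTrigger(timedelta(seconds=30)),  # placeholder trigger
            method_name="execute_complete",
        )

    def execute_complete(self, context, event=None):
        # Runs on a worker once the trigger fires; finish the task here.
        self.log.info("Trigger fired for task %s", self.task_id)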
We follow Semantic Versioning for releases. Check CHANGELOG.rst for the latest changes.
All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
A detailed overview on how to contribute can be found in the Contributing Guide.
As contributors and maintainers to this project, you are expected to abide by the Contributor Code of Conduct.
- Our focus at this stage of the project is on speed of iteration and development, so we want to be able to quickly iterate with our community members and customers and cut releases as necessary
- Airflow Providers are separate packages from the core apache-airflow package, and we would like to avoid further bloating the Airflow repo
- We want users and the community to be able to easily track features and the roadmap for individual providers that we develop
- We would love to see the Airflow community members create, maintain and share their providers to build an Ecosystem of Providers.
- In Airflow, sensors have a param mode which can be poke or reschedule. In async sensors, this param has no use since tasks get deferred to the triggerer.