
One of the benefits of being an open source community is that we can explore partnerships with other, like-minded frameworks and technologies. We are particularly excited to work with the Astronomer team, who helps organisations adopt Apache Airflow, the leading open-source data workflow orchestration platform. Workflows in Airflow are modelled and organised as DAGs, making it a suitable engine to orchestrate and execute a pipeline authored with Kedro.

To keep the workflow seamless, we are pleased to unveil the latest version of the Kedro-Airflow plugin, which simplifies deployment of a Kedro project on Airflow. Our work with Astronomer provides a simple way for our users to deploy their pipelines. We would like to continue our work and make the process even smoother and eventually achieve a “one-click-deployment” workflow for Kedro pipelines on Airflow.
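
To give a concrete picture of what that deployment looks like, the sketch below shows the general shape of an Airflow DAG for a Kedro project: one Airflow task per Kedro node, with each task opening a fresh KedroSession to run its node. It is a simplified, hypothetical example rather than the plugin’s exact output; the DAG id, node names and project path are placeholders, and in practice the plugin’s `kedro airflow create` command generates a DAG file of this kind from a real project.

```python
# Hypothetical sketch of a Kedro-on-Airflow DAG: one task per Kedro node.
from datetime import datetime
from pathlib import Path

from airflow import DAG
from airflow.models import BaseOperator
from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project


class KedroOperator(BaseOperator):
    """Run a single named Kedro node inside a fresh KedroSession."""

    def __init__(self, *, pipeline_name, node_name, project_path, env, **kwargs):
        super().__init__(**kwargs)
        self.pipeline_name = pipeline_name
        self.node_name = node_name
        self.project_path = project_path
        self.env = env

    def execute(self, context):
        bootstrap_project(Path(self.project_path))
        with KedroSession.create(project_path=self.project_path, env=self.env) as session:
            session.run(self.pipeline_name, node_names=[self.node_name])


# Placeholder value: adjust to your own project layout.
PROJECT_PATH = "/usr/local/airflow/my-kedro-project"

with DAG(
    dag_id="my_kedro_project",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    preprocess = KedroOperator(
        task_id="preprocess",
        pipeline_name="__default__",
        node_name="preprocess",
        project_path=PROJECT_PATH,
        env="local",
    )
    train = KedroOperator(
        task_id="train",
        pipeline_name="__default__",
        node_name="train",
        project_path=PROJECT_PATH,
        env="local",
    )
    # Task dependencies mirror the data dependencies between Kedro nodes.
    preprocess >> train
```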

How did you find out about Kedro? When did you realise it was compatible with Airflow for users?

I had chatted with a few data scientists who were using Kedro to author their pipelines and were looking for a good way to deploy those pipelines to Airflow. Kedro does an outstanding job of allowing data scientists to apply good software engineering principles to their code and make it modular, but Kedro pipelines need a separate scheduling and execution environment to run at scale. Given this need, there was a natural bond between Kedro pipelines and Airflow: we wanted to do everything we could to build a great developer experience at the intersection of the two tools.
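
To illustrate the modularity described above, here is a minimal, hypothetical Kedro pipeline: each step is an ordinary Python function wrapped in a node, and those nodes are the units that a scheduler such as Airflow can later execute as tasks. The function and dataset names are invented for the example.

```python
# Minimal, hypothetical Kedro pipeline with two modular nodes.
from kedro.pipeline import Pipeline, node


def preprocess(raw_data):
    """Clean the raw input table (placeholder logic)."""
    return raw_data.dropna()


def train_model(features):
    """Derive a toy summary from the preprocessed features (placeholder logic)."""
    return {"column_means": features.mean().to_dict()}


def create_pipeline(**kwargs) -> Pipeline:
    # Inputs and outputs refer to datasets declared in the Kedro catalog.
    return Pipeline(
        [
            node(preprocess, inputs="raw_data", outputs="features", name="preprocess"),
            node(train_model, inputs="features", outputs="model", name="train"),
        ]
    )
```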


We see 2.0 as a major milestone for the project: not only does it significantly improve the scalability of Airflow, but it also sets a foundation upon which we can continuously build new features. Airflow 2.0 extends and upgrades the Airflow REST API, allowing it to remain robust in the coming years.

Where do you think Kedro-Airflow could go, in terms of future development?

As the API develops, there will be new opportunities for specific abstraction layers to assist with DAG authoring and deployment, leading to a richer plugin ecosystem, and there will be further opportunity to integrate the kedro-airflow package with the Airflow API for a great developer experience. As we look towards Airflow 3.0 and beyond, building upon developer love and trust is essential. As data orchestration becomes critical to a growing number of business units, we want Airflow to become a medium for making data engineering more approachable: we seek to democratise access so that product owners and data scientists alike can leverage Airflow’s distributed execution and scheduling power without needing to be masters of Python or Kubernetes.
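
As a small illustration of the kind of integration the REST API makes possible, the snippet below triggers a run of a deployed DAG through Airflow 2.0’s stable REST API. The host, credentials and DAG id are placeholder assumptions (and it presumes the basic-auth API backend is enabled); it shows the shape of such an integration rather than an existing kedro-airflow feature.

```python
# Hypothetical sketch: trigger a DAG run via Airflow 2.0's stable REST API.
import requests

AIRFLOW_HOST = "http://localhost:8080"  # assumption: local Airflow webserver
DAG_ID = "my_kedro_project"             # assumption: DAG deployed from a Kedro project

response = requests.post(
    f"{AIRFLOW_HOST}/api/v1/dags/{DAG_ID}/dagRuns",
    json={"conf": {}},                  # optional run-level configuration
    auth=("admin", "admin"),            # assumption: basic-auth backend enabled
)
response.raise_for_status()
print(response.json())                  # metadata of the newly queued DAG run
```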
