Upgrading Apache Airflow in large, production-grade environments can be complex, especially in enterprise setups with hundreds of DAGs, custom plugins, and mission-critical pipelines. The challenge grows in decentralized organizations, where platform teams are responsible for the system’s stability but the DAG code is spread across multiple teams they don’t directly control.

You will have the chance for a personalised review of your current organizational setup, an assessment of your testing coverage, and identification of concrete ways to improve your upgrade process. This hands-on workshop will cover:

  • Environment health checks and audits (dependency checks, resource sizing)
  • DAG refactoring to replace deprecated features and apply optimizations
  • Database migrations and backward-compatibility strategies
  • Improving CI/CD validation with GenAI to increase reliability
  • Zero-downtime upgrades for both self-managed and Astronomer deployments

The workshop is supported by a battle-tested approach and guided exercises, and is recommended for platform teams, data engineers, and architects managing production Airflow deployments. By the end of the workshop, participants will have actionable strategies tailored to their specific upgrade challenges.

M. Waqas Shahid

Principal Data Platform Engineer, Delivery Hero