A hands-on workshop showing how data observability can work within your Airflow environment and the rest of the Modern Data Stack.

The workshop consists of three parts:

  • Capabilities of continuous data observability
  • Required technical tracking components for data incident alerting
  • Real-world data observability use cases in action (e.g., job failures, durations, schema changes, anomaly detection, quality checks)
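
As a rough illustration of the kinds of use cases covered in the third part, the sketch below shows how job-failure alerting and duration monitoring can be wired into a plain Airflow DAG using callbacks and SLAs. This is not material from the workshop itself; the DAG name, task, and alerting logic are hypothetical placeholders.

```python
# A minimal sketch (assumed, not from the workshop) of two of the listed use cases:
# job-failure alerting and duration monitoring in Airflow.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    """Failure callback: forward the failed task's details to an alerting channel."""
    ti = context["task_instance"]
    # Replace the print with a Slack/PagerDuty/webhook call in a real deployment.
    print(f"ALERT: {ti.dag_id}.{ti.task_id} failed on {context['ds']}")


def notify_sla_miss(dag, task_list, blocking_task_list, slas, blocking_tis):
    """SLA-miss callback: fires when tasks run longer than their declared SLA."""
    print(f"ALERT: SLA missed in {dag.dag_id}: {task_list}")


def extract():
    # Placeholder for a real pipeline step (dbt run, Spark job, warehouse load, ...).
    pass


with DAG(
    dag_id="observability_demo",          # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    sla_miss_callback=notify_sla_miss,
    default_args={
        "on_failure_callback": notify_on_failure,
        "sla": timedelta(minutes=30),     # duration threshold per task
        "retries": 1,
    },
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```

Schema-change detection, anomaly detection, and quality checks typically sit on top of metadata emitted by runs like this, which is where the dedicated observability tooling discussed in the workshop comes in.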

This workshop is primarily targeted at intermediate and advanced Airflow users who are evaluating how to achieve better data incident monitoring and resolution as they scale their Airflow pipelines. If you’re connecting Airflow pipelines with other modern data stack tools like dbt, Spark, BigQuery, and Snowflake, then this workshop is for you.