In healthcare data, standards are often anything but standard. Every new partner arrives with its own requirements for data exchange, spanning FHIR APIs, HL7 feeds, SFTP drops, and custom vendor extracts.
The result? Integration projects that stretch from weeks into months, custom pipelines that only one engineer understands, and implementation teams who are already counting down to your next missed deadline.
This session shows how Airflow can change your approach to managing data transfer for healthcare partners.
We’ll cover how to structure your pipelines with DAG Factory patterns so that partner onboarding no longer requires a custom engineering effort, and where this approach breaks down. We’ll also cover how to integrate tools like OpenMetadata (which uses Airflow under the hood) to track data assets, so your implementation team knows what is happening without having to ask an engineer.
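To make the idea concrete, here is a minimal, hypothetical sketch of the config-driven factory pattern the session builds on: each partner becomes a small config entry, and one factory function turns every entry into a pipeline definition. The partner names, config fields, and task names below are illustrative only; in a real deployment a library like dag-factory reads such configs (typically YAML) and generates actual Airflow DAGs.

```python
# Hypothetical sketch: partner onboarding as configuration, not custom code.
# Each partner is a config dict; one factory turns every config into a
# pipeline definition (a stand-in for a generated Airflow DAG).

PARTNERS = {
    # Illustrative partners and fields -- not from the session itself.
    "acme_health": {"transport": "sftp", "format": "hl7", "schedule": "@daily"},
    "beta_clinic": {"transport": "fhir_api", "format": "fhir", "schedule": "@hourly"},
}

def make_pipeline(name: str, cfg: dict) -> dict:
    """Build a pipeline spec from a partner config.

    The task list is derived from the config, so adding a partner means
    adding a config entry, not writing a new pipeline by hand.
    """
    tasks = [
        f"extract_{cfg['transport']}",  # e.g. extract_sftp, extract_fhir_api
        f"parse_{cfg['format']}",       # e.g. parse_hl7, parse_fhir
        "validate",
        "load_warehouse",
    ]
    return {"dag_id": f"{name}_ingest", "schedule": cfg["schedule"], "tasks": tasks}

# One loop generates every partner pipeline from configuration.
pipelines = {name: make_pipeline(name, cfg) for name, cfg in PARTNERS.items()}
```

Onboarding the next partner is then a one-line config change, which is the shift the session argues for: the engineering effort goes into the factory once, not into each integration.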
This session is for data engineers building or maintaining healthcare integration pipelines, and healthcare leaders who don’t want to keep hearing “the next partner integration is a few months away.”
Wyatt Shapiro
Lead Data Engineer, Zivian Health