Explore popular use cases to empower your teams

Export Postgres data to CSV, JSON, Parquet, and Avro files in S3

Learn how to easily export Postgres data to CSV, JSON, Parquet, and Avro file formats stored in AWS S3.
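Airbyte's S3 destination handles the format conversion itself; as a rough stdlib-only illustration of two of these formats, the sketch below serializes the same sample rows as CSV and as JSON Lines (the layout Airbyte uses for JSON output). Parquet and Avro need third-party libraries such as pyarrow and fastavro, and the sample rows are hypothetical stand-ins for a Postgres query result.

```python
import csv
import io
import json

# Hypothetical sample rows standing in for a Postgres query result.
rows = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

def to_csv(records):
    # Row-oriented, human-readable; column names go in a header line.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def to_json_lines(records):
    # One JSON object per line (JSONL); self-describing but more verbose.
    return "\n".join(json.dumps(r) for r in records)
```

Columnar formats like Parquet trade this simplicity for compression and fast analytical scans, which is why they are the usual choice for warehouse-bound exports.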

Version control Airbyte configurations with Octavia CLI

Use Octavia CLI to import, edit, and apply Airbyte application configurations to replicate data from Postgres to BigQuery.
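A hedged sketch of that workflow, assuming a running Airbyte instance and the command names from the Octavia CLI docs (resource names and edits are up to you):

```shell
# Initialize a local directory to hold Airbyte configuration as YAML.
octavia init

# Pull the configuration of existing sources, destinations, and
# connections from the Airbyte instance into local YAML files.
octavia import all

# ...edit the generated YAML files and commit them to git...

# Push local configuration changes back to the Airbyte instance.
octavia apply
```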

Airflow and Airbyte OSS - Better Together

Learn how to create an Airflow DAG (directed acyclic graph) that triggers Airbyte synchronizations.
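A minimal DAG sketch for this, assuming apache-airflow with the apache-airflow-providers-airbyte package installed, an Airflow connection named "airbyte_default" pointing at the Airbyte API, and a placeholder Airbyte connection UUID:

```python
import pendulum
from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="trigger_airbyte_sync",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
    catchup=False,
) as dag:
    trigger_sync = AirbyteTriggerSyncOperator(
        task_id="airbyte_sync",
        airbyte_conn_id="airbyte_default",
        # UUID of the Airbyte connection to sync (placeholder).
        connection_id="your-airbyte-connection-uuid",
        asynchronous=False,  # block until the sync job finishes
        timeout=3600,
    )
```

With `asynchronous=True`, the operator only submits the job, and a separate AirbyteJobSensor can wait for completion without tying up a worker slot.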

Validate data replication pipelines with data-diff

Learn to replicate data from Postgres to Snowflake with Airbyte, and compare replicated data with data-diff.

How to write an Airbyte Python Destination: DuckDB

A guide to creating an Airbyte Python destination for DuckDB, with code snippets linked to a single PR.

Configure Airbyte Connections with Python (Dagster)

Data integration as code: create Airbyte sources, destinations, and connections programmatically based on external factors.

Create and Monitor a Data Pipeline Using the Airbyte API

Learn how to export data from a Postgres table to a CSV file in Azure Blob Storage using the new Airbyte API.
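As a minimal stdlib-only sketch of driving the Airbyte API, the request that triggers a sync job for a connection can be built as below (the api.airbyte.com endpoint and a bearer token are assumed; the connection UUID and API key are placeholders, and actually sending the request plus polling the returned job id is what the tutorial covers):

```python
import json
import urllib.request

def build_sync_request(connection_id: str, api_key: str) -> urllib.request.Request:
    # POST /v1/jobs with jobType "sync" asks Airbyte to start a sync run
    # for the given connection.
    body = {"connectionId": connection_id, "jobType": "sync"}
    return urllib.request.Request(
        "https://api.airbyte.com/v1/jobs",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )

req = build_sync_request("00000000-0000-0000-0000-000000000000", "YOUR_API_KEY")
```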

Developer Productivity Analytics Stack with GitHub, Airbyte, dbt, Dagster and BigQuery

Kickstart your developer productivity analytics with a unified data stack. From GitHub to BigQuery via Airbyte, with the power of dbt and Dagster. Simplify, analyze, and optimize effortlessly.

GitHub Insights Stack with Airbyte, dbt, Dagster and BigQuery

Explore the fusion of Airbyte, dbt, and the GitHub API to gain deep insights into code quality, collaboration, and project vitality.

Optimizing error resolution with Sentry, dbt, Dagster and Snowflake

Configure an error analysis stack utilizing Sentry, Airbyte, Snowflake, dbt, and Dagster.

Database snapshot to S3 then to warehouse

Build a full data stack that snapshots a database table to an Amazon S3 bucket as a JSONL file using Airbyte, then loads the snapshot into your preferred data warehouse.
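To illustrate the final load step, here is a stdlib-only sketch that parses a JSONL snapshot (one JSON object per line, as Airbyte's S3 destination emits) and inserts it into a table. sqlite3 stands in for the warehouse, and the snapshot content is a hypothetical two-row sample; a real stack would bulk-load into Snowflake or BigQuery instead.

```python
import json
import sqlite3

# Hypothetical JSONL snapshot content: one JSON object per line.
snapshot = '{"id": 1, "city": "Paris"}\n{"id": 2, "city": "Lima"}\n'

def load_jsonl(text: str, conn: sqlite3.Connection, table: str) -> int:
    # Parse each non-empty line as one record.
    rows = [json.loads(line) for line in text.splitlines() if line]
    cols = list(rows[0])
    # Create the target table from the first record's keys and insert all rows.
    conn.execute(f"CREATE TABLE {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} VALUES ({', '.join('?' for _ in cols)})",
        [tuple(r[c] for c in cols) for r in rows],
    )
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load_jsonl(snapshot, conn, "snapshot")
```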

Building ETL Pipeline with Python, Docker, & Airbyte

Learn how to build robust ETL pipelines using Python, Docker, and Airbyte. A guide for data engineers covering setup, implementation, & best practices.
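The extract-transform-load shape the guide works through can be sketched in a few lines of stdlib Python. sqlite3 stands in for both the source and the destination, and the orders schema with cent-denominated amounts is an assumption for illustration; the tutorial swaps in real systems, Docker packaging, and Airbyte-managed movement.

```python
import sqlite3

def extract(conn):
    # Pull raw rows from the source system.
    return conn.execute("SELECT name, amount FROM orders").fetchall()

def transform(rows):
    # Normalize names and convert cents to dollars (assumed source schema).
    return [(name.strip().lower(), amount / 100) for name, amount in rows]

def load(conn, rows):
    # Write cleaned rows into the destination table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders_clean (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)

# Wire the stages together against in-memory stand-ins.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (name TEXT, amount INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(" Alice ", 1250), ("BOB", 300)])

dest = sqlite3.connect(":memory:")
load(dest, transform(extract(source)))
```

Keeping each stage a plain function makes the pipeline easy to unit-test before wrapping it in a container or an orchestrator.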