About the services
Airbyte is the leading open-source ELT tool, created in July 2020. Airbyte offers more than 200 data connectors and has 25,000 companies using it to sync data, moving more than 1,000B rows per month. Its ambition is to commoditize data integration by addressing the long tail of connectors through its growing contributor community. Airbyte released a Cloud offer in April 2022 with a new pricing model that distinguishes databases from APIs and files.
Apache Airflow is an open-source workflow management tool. Airflow is not an ETL tool, but you can use Airflow operators to extract, transform and load data between different systems. Airflow started in 2014 at Airbnb as a solution to manage the company’s workflows. Airflow lets you author, schedule and monitor workflows as DAGs (directed acyclic graphs) written in Python.
Pre-built connectors are the primary way to differentiate ETL / ELT solutions, as they enable data teams to focus only on the insights to build.
Within two years of inception, Airbyte already offers connectors for more than 200 data sources, and for all major data warehouses, lakes and databases as data destinations.
All Airbyte connectors are open-sourced and can be edited to address any custom needs the customers have. Airbyte users can leverage these connectors through the open-source edition or the Cloud offer.
Airbyte’s low-code Connector Development Kit also enables users to build custom connectors in a standardized way within 30 minutes (instead of 2 days), and the Airbyte team and community can help maintain them.
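To illustrate, a connector built with the low-code CDK is described by a declarative YAML manifest rather than hand-written Python. The sketch below is illustrative only: the endpoint (`api.example.com`) is hypothetical, and field names follow the declarative framework from memory, so they may differ by CDK version.

```yaml
# Illustrative low-code CDK manifest sketch (hypothetical API).
version: "0.50.0"
check:
  type: CheckStream
  stream_names: ["customers"]
streams:
  - type: DeclarativeStream
    name: customers
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        url_base: "https://api.example.com/v1"  # hypothetical source API
        path: "/customers"
        http_method: GET
      record_selector:
        type: RecordSelector
        extractor:
          type: DpathExtractor
          field_path: ["data"]
```

The point is that the connector is pure configuration: authentication, pagination and schema details slot into the same structure, which is why a working connector can take minutes rather than days.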
More than 50% of the connectors have been contributed by the growing community. Airbyte will provide an SLA for the certified connectors, and its longer-term ambition is to offer an SLA for the other connectors as well, through the community and its participative model on the long tail of connectors, and to reach 1,000+ connectors in the next few years.
Airbyte will offer reverse-ETL connectors in 2023.
You can use one of the 60 available Airflow transfer operators to move data from one system to another, like the PostgresToGCSOperator. Sources and destinations are tightly coupled: you need a different transfer operator for each pair of source and destination. This makes it hard for Airflow to cover the long tail of integrations.
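The coupling problem can be sketched in plain Python (hypothetical names, not real Airflow classes): with tightly coupled transfer operators, covering N sources and M destinations takes N × M operators, whereas decoupled sources and destinations, as in Airbyte's model, only need N + M connectors.

```python
# Hypothetical sketch: each Airflow-style transfer operator hard-codes
# both ends of the pipeline, so every (source, destination) pair needs
# its own class -- e.g. PostgresToGCSOperator, MySQLToS3Operator, ...
sources = ["postgres", "mysql", "salesforce", "stripe", "zendesk"]
destinations = ["gcs", "s3", "bigquery", "snowflake"]

# Tightly coupled model: one transfer operator per pair.
transfer_operators = [f"{src}_to_{dst}" for src in sources for dst in destinations]

# Decoupled model (Airbyte's approach): one connector per system,
# and any source can be wired to any destination at runtime.
connectors = sources + destinations

print(len(transfer_operators))  # 20 operators to maintain
print(len(connectors))          # 9 connectors for the same coverage
```

The gap widens quadratically as systems are added, which is exactly the long-tail problem described above.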
Airbyte is an ELT tool, and does not transform data prior to loading. Airbyte offers two normalization options out of the box: a serialized JSON file and some basic normalization to get the original structure of the data at the destination level.
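Conceptually, basic normalization unnests the raw serialized JSON blob into columns at the destination. Here is a minimal pure-Python sketch of that idea (not Airbyte's actual implementation, which generates dbt models):

```python
import json

# Raw record as an ELT tool might land it: one serialized JSON blob per row.
raw_record = json.dumps({"id": 42, "profile": {"name": "Ada", "country": "UK"}})

def basic_normalize(raw: str) -> dict:
    """Flatten one level of nesting into top-level columns,
    mimicking the idea behind basic normalization."""
    record = json.loads(raw)
    flat = {}
    for key, value in record.items():
        if isinstance(value, dict):
            for sub_key, sub_value in value.items():
                flat[f"{key}_{sub_key}"] = sub_value
        else:
            flat[key] = value
    return flat

print(basic_normalize(raw_record))
# {'id': 42, 'profile_name': 'Ada', 'profile_country': 'UK'}
```

With the serialized-JSON option you would keep `raw_record` as-is in a single column; with basic normalization you get queryable columns like `profile_name`.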
Airbyte also offers custom transformations via SQL and through deep integration with dbt, allowing their users and customers to trigger their own dbt packages at the destination level right after the EL.
You can transform data locally with the PythonOperator, remotely with operators like the SparkSubmitOperator and in the database with operators like the BigQueryInsertJobOperator. You can also integrate Airflow with dbt for transformations.
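A local transformation of this kind is just a Python callable; in a DAG you would hand it to a PythonOperator via its `python_callable` argument. The sketch below shows only the callable (the data and field names are made up for illustration), since running the operator requires a live Airflow deployment:

```python
# A transformation written as a plain Python callable; in an Airflow DAG
# you would wrap it with PythonOperator(task_id="transform",
# python_callable=transform). Data below is illustrative.
def transform(rows):
    """Deduplicate rows by email and normalize the casing -- the kind of
    local transformation you might run with a PythonOperator."""
    seen = set()
    out = []
    for row in rows:
        email = row["email"].strip().lower()
        if email not in seen:
            seen.add(email)
            out.append({**row, "email": email})
    return out

rows = [
    {"email": "Ada@Example.com", "plan": "pro"},
    {"email": "ada@example.com", "plan": "pro"},  # duplicate after normalization
    {"email": "grace@example.com", "plan": "free"},
]
print(transform(rows))
# [{'email': 'ada@example.com', 'plan': 'pro'}, {'email': 'grace@example.com', 'plan': 'free'}]
```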
Every company has custom data architectures and, therefore, unique data integration needs. A lot of tools don’t enable teams to address those, which results in a lot of investment in building and maintaining additional in-house scripts.
The modularity of Airbyte’s architecture means you can leverage any part of Airbyte on its own. For instance, you can use Airflow as the orchestrator that triggers Airbyte’s ELT jobs.
You can also edit any pre-built connector to fit your specific needs, or leverage the Connector Development Kit to build your own custom connectors in a matter of hours (instead of days) and share their maintenance with the community and the Airbyte team.
Airbyte’s promise is to address all your ELT needs and the long tail of integrations.
Airflow operators are split into the built-in operators and provider packages. You can modify existing operators and also create new operators on top of existing Airflow hooks.
You can scale Airflow deployments with operators and executors. For example, you can use the CeleryExecutor or the KubernetesExecutor to scale your Airflow workers.
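Switching executors is a configuration change rather than a code change. A sketch of the relevant `airflow.cfg` lines for the CeleryExecutor (the broker and backend URLs are illustrative placeholders):

```ini
[core]
# Distribute task execution across a pool of Celery workers.
executor = CeleryExecutor

[celery]
# Illustrative values -- point these at your own broker and metadata DB.
broker_url = redis://redis:6379/0
result_backend = db+postgresql://airflow:airflow@postgres/airflow
```

The KubernetesExecutor is selected the same way (`executor = KubernetesExecutor`), with its own configuration section for pod templates.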
You can also use Airflow to schedule ELT tasks and integrate it with Airbyte for the EL steps and dbt for the T step.
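Airbyte's open-source API exposes `POST /api/v1/connections/sync` to trigger a sync, and Airflow's Airbyte provider wraps the same call in its AirbyteTriggerSyncOperator. The sketch below only builds the request, without sending it, since that needs a running Airbyte instance; the host and connection id are hypothetical.

```python
import json

# Hypothetical local Airbyte deployment.
AIRBYTE_HOST = "http://localhost:8000"

def build_sync_request(connection_id: str):
    """Return the (url, body) pair for triggering one Airbyte sync
    through the open-source API's connections/sync endpoint."""
    url = f"{AIRBYTE_HOST}/api/v1/connections/sync"
    body = json.dumps({"connectionId": connection_id})
    return url, body

# Hypothetical connection id; in practice you copy it from the Airbyte UI.
url, body = build_sync_request("e1b2c3d4-0000-0000-0000-000000000000")
print(url)
print(body)
```

In a DAG you would typically let the provider operator handle this call and then chain a dbt task after it, giving Airflow the orchestration role, Airbyte the EL, and dbt the T.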
Support & docs
Data integration tools can be complex, so customers need to have great support channels. This includes online documentation as well as tutorials, email and chat support. More complicated tools may also offer training services.
Airbyte provides chat support directly on their web app, with an average time to respond of 5 minutes.
Their documentation is comprehensive and full of tutorials.
Airbyte also has a Slack and Discourse community where help is available from the Airbyte team, other users or contributors.
Airbyte does not provide any training services.
Astronomer.io is the only Airflow service to provide premium support.
Airflow documentation is comprehensive but spread across different sites. Astronomer.io also provides high-quality documentation and guides.
There is a popular Airflow Slack community.
You can get Airflow training from Astronomer.io and get the Apache Airflow certification.
Pricing
Airbyte provides a 14-day free trial or $1,000 worth of credits, whichever runs out first. Airbyte’s pricing is credit-based: you consume credits based on volume, with a different price for APIs, databases and files, which lets it adapt to all use cases, including database replication.
Airbyte doesn’t charge for failed syncs or normalization.
Airbyte offers adapted pricing to customers with large volumes.
Cloud Composer pricing is consumption based, so you pay for what you use, based on your CPU, storage and data transfer costs.
Amazon Managed Workflows for Apache Airflow pricing is based on CPU usage from the scheduler, worker and web server. You also pay for the meta database storage.
Astronomer.io pricing is not publicly available, but they provide standard, premium and custom plans.