Sync data from MySQL to Kafka using Change Data Capture

Learn how to stream changes from a MySQL database to Kafka using Change Data Capture (CDC).

Enterprise use cases often involve the integration of data from a variety of sources and analytics involving logs and activity streams. For instance, if you are building an e-commerce application backed by a relational database, such as MySQL, then you might need to combine incoming order data with other downstream data for the purpose of operational analytics. Relational databases maintain a transaction log that records every event in the database. An update, an insert, a delete - all go into the database's transaction log. Rather than moving all the order data in bulk, CDC approaches are used, allowing you to stream every single event from the database as it occurs into a streaming platform like Apache Kafka.

By consuming the transaction logs from MySQL, data integration tools such as Airbyte can extract data changes at very low latency using CDC and deliver them seamlessly to Kafka, which serves as a central integration point for numerous downstream data feeds. This recipe explains how to sync data from a MySQL database to Kafka using CDC. The process is similar for all Airbyte database sources that support CDC, such as Postgres and MSSQL.

What is Change Data Capture (CDC)?

Change Data Capture (CDC) is an efficient replication technology that allows row-level data changes at the source database to be quickly identified, captured, and delivered in real time to the destination data store. With CDC in use, only the data that has changed — categorized by insert, update, and delete operations — since the last replication is transferred.


Here are the tools you’ll need to get started on replicating data from a MySQL database to Kafka.

  1. You’ll need Airbyte to do the data sync for you. In this recipe, we are running Airbyte in a Docker container locally using the instructions in our documentation here.
  2. You will need a MySQL instance that will be used as your source for CDC. MySQL instances can be set up in a variety of ways, which can be found here. In this recipe, for the sake of getting up and running quickly, we are using a hosted MySQL instance.
  3. You will need a Kafka instance that will serve as the destination. To get started quickly with Kafka, you can follow the instructions found here. Again, for the sake of getting up and running quickly, we are using a hosted Kafka instance.

Step 1: Load data into the MySQL database

In this recipe, we will create a MySQL database in our hosted instance, and load it with some sample data. 

Use the mysql CLI to connect to the remote MySQL instance
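The exact command depends on your instance; a typical invocation (the hostname, port, and username placeholders must be filled in for your setup) looks like:

```shell
mysql -h <hostname> -P <port> -u <username> -p
```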

To set up the sample data, first download the repo at the link here. Once downloaded, run the following mysql commands to create the ‘employees’ database and its associated tables, and to insert the sample data into those tables.
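Assuming the downloaded repo follows the common layout of MySQL’s employees sample database (the script file names below are assumptions based on that layout), the two scripts can be run like this from the repo’s directory:

```shell
# Create the employees database and its tables, then load the sample data
mysql -h <hostname> -P <port> -u <username> -p < employees.sql

# Validate the loaded data with the second script
mysql -h <hostname> -P <port> -u <username> -p -t < test_employees_md5.sql
```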

You may be prompted to enter your MySQL password. The text between the angle brackets must be replaced with the hostname, port and username based on your MySQL instance. 

Once the two scripts are run, the employees database will be created. You can view the created tables by logging into the MySQL CLI and running the following commands.
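A minimal sketch of those commands, assuming the database was created with the name employees as above:

```sql
USE employees;
SHOW TABLES;
```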

Create a dedicated user with access to the tables

It's recommended to create a dedicated user for better permission control and auditing. Alternatively, you can use Airbyte with an existing user in your database.

To create a dedicated database user, run the following commands against your database.
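A sketch of such a command; the username airbyte and the password placeholder are illustrative choices, not requirements:

```sql
CREATE USER 'airbyte'@'%' IDENTIFIED BY '<password>';
```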

The right set of permissions differs between the STANDARD and CDC replication methods. For the STANDARD replication method, only the SELECT permission is required. For the CDC replication method, the SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, and REPLICATION CLIENT permissions are required.
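Sketching the grants for each method (the airbyte user name is illustrative; note that the replication-related privileges are global in MySQL, so they are granted on `*.*`):

```sql
-- STANDARD replication: read-only access to the synced tables
GRANT SELECT ON employees.* TO 'airbyte'@'%';

-- CDC replication: read access plus binlog-related privileges
GRANT SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT
  ON *.* TO 'airbyte'@'%';
```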

Your database user should now be ready for use with Airbyte.

Step 2: Set up the MySQL CDC source

It's easy to create a MySQL source through the Airbyte UI. Make sure to select CDC as the replication method. We have not used SSH in our example; if you are connecting over the public internet in production, we recommend using an SSH tunnel.

Step 3: Set up the Kafka destination

Next, we will set up a Kafka destination in Airbyte. In this recipe, we are running Kafka in our hosted instance and will connect to it using a Kafka client set up locally. To get up and running with the Apache Kafka client locally, follow the quick start steps.

Create a new Kafka topic to sync data

First, we will create a Kafka topic named 'departments' which will be used to write the CDC data.
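With the Kafka quick-start scripts on your path, the topic can be created as below (the partition and replication-factor values are illustrative single-broker defaults):

```shell
bin/kafka-topics.sh --create --topic departments \
  --bootstrap-server <host>:<port> \
  --partitions 1 --replication-factor 1
```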

Next, create a destination in Airbyte as follows. 

For all of the remaining settings, we went with the default values that were provided.

Step 4: Create a MySQL CDC to Kafka connection

Once the source and destination are set up, you can create a connection from MySQL to Kafka in Airbyte. Then, in the “Select the data you want to sync” section, choose the departments table and select Incremental under Sync mode.

Airbyte’s sync frequency and sync mode options give you control over how data is replicated incrementally and on what schedule.

Once configured, you can see your connection on the Connection tab.

Now that your connection is set up, go back into your MySQL shell and run the following commands to add, update and then delete a row in the departments table.
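A sketch of those statements against the departments table of the employees sample schema (the dept_no value is illustrative):

```sql
USE employees;

-- Add a row
INSERT INTO departments (dept_no, dept_name) VALUES ('d010', 'Data Engineering');

-- Update it
UPDATE departments SET dept_name = 'Data Platform' WHERE dept_no = 'd010';

-- Delete it
DELETE FROM departments WHERE dept_no = 'd010';
```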

Next, go back to the Airbyte UI and select the connection you just created, and trigger a manual sync.

Once the sync is complete, in a new terminal window, run the following command to read the events that were persisted to the Kafka topic.
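Assuming the Kafka quick-start layout, the console consumer can replay the topic from the beginning:

```shell
bin/kafka-console-consumer.sh --topic departments --from-beginning \
  --bootstrap-server <host>:<port>
```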

The screenshot below shows the CDC data for the row you just inserted, updated, and deleted with corresponding timestamps.

Wrapping up

Here’s what we’ve accomplished during this recipe:

  • Configured a MySQL Airbyte source
  • Configured a Kafka Airbyte destination
  • Created a connection that automatically syncs CDC log data from MySQL to Kafka

With a combination of MySQL CDC logs, Airbyte and Kafka, distributed data platforms can be kept in sync and made aware of data changes.

We know that engineering teams working on fast-moving projects need quick answers to their questions from developers who are actively developing Airbyte. Join the conversation at Airbyte’s community Slack Channel to share your ideas with over 1000 data engineers and help make everyone’s project successful. 
