
Easily replicate data from Azure SQL Database to BigQuery

Learn how to easily move your Azure SQL Database data into BigQuery where it can be combined with data from other sources to get a holistic view of your business and to gain valuable insights.

You may be able to improve your analytics capabilities by moving data from Azure SQL Database into a centralized and unified data warehouse such as BigQuery. As discussed in Airbyte’s article on data integration, moving data from across the enterprise into a centralized data analytics platform provides the following benefits:

  • A unified view of data from across the enterprise, and a single source of truth.
  • A platform that is purpose built for running large analytics jobs.
  • A single location to transform and join data from across the enterprise.
  • Improved security – limit the number of people that require access to operational systems such as Azure SQL Database, as they will be able to access and analyze the required data in BigQuery.

In this tutorial, you will learn how to build a data replication pipeline from Azure SQL Database to Google’s BigQuery, without writing any code. Let’s get started!

Prerequisites

  1. A Microsoft Azure account to create a SQL Server database using Azure. Because Azure SQL Database is built on the SQL Server database engine, the instructions in this tutorial should also apply to Microsoft SQL Server.
  2. A Google Cloud account to create a BigQuery data warehouse.
  3. This tutorial was created using Airbyte Cloud. Alternatively, you can run Airbyte locally by following the instructions to set up Airbyte on your system using docker-compose.

Step 1: Set up SQL Server as an Airbyte Source

Create a SQL Server host on Azure. Once you log in to the Azure Portal, choose to create a SQL database from the dashboard.

Choose the Create option on the SQL databases page.

Next, create a server to host the database. From the Database details section, choose Create new server.

We will use SQL authentication as the authentication method, creating a new username and password as shown below.

Once the server setup is complete, fill in the remaining details. For Subscription, choose Free Trial if you don’t already have a paid plan. Next, create a new Resource group to hold all the resources related to the database.

Choose Next: Networking to move forward. Make sure to pick Public Endpoint for the Connectivity method.

Next, under Additional settings, you can load sample tables by choosing Sample under Use existing data. This loads the AdventureWorksLT sample database, whose tables we will sync to BigQuery later in the tutorial.

Once you are done with the configuration, choose Create to create the database; this step may take a few minutes. Once Azure has created the database, you can view more details.

Let’s run a sample SQL query by going to the Query editor and logging in with the username and password you just created. Then run a SQL query as shown below:
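For example, if you loaded the sample data in the previous step, the database contains the AdventureWorksLT schema, and a query along these lines should return results:

```sql
-- List the ten most expensive products in the AdventureWorksLT sample
SELECT TOP 10 ProductID, Name, ListPrice
FROM SalesLT.Product
ORDER BY ListPrice DESC;
```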

Azure only allows connections to the database from IP addresses that are listed in the firewall settings. Therefore, we need to add Airbyte’s public IP addresses to the firewall.

To whitelist the IPs, click Add a firewall rule and enter the same IP address for both the start and end address.
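If you prefer scripting over the portal, Azure SQL also exposes a T-SQL procedure for managing server-level firewall rules. Here is a minimal sketch; the IP address below is a placeholder, so substitute Airbyte’s actual published addresses:

```sql
-- Run in the master database of your Azure SQL server.
-- 203.0.113.10 is a placeholder; use Airbyte's published IP addresses.
EXECUTE sp_set_firewall_rule
    @name = N'AirbyteIP',
    @start_ip_address = '203.0.113.10',
    @end_ip_address = '203.0.113.10';
```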

Step 2: Configure Azure SQL Database as a source

You will now add the newly created Azure SQL Database as an Airbyte source. Log in to your Airbyte Cloud dashboard, select Sources in the left bar, then click + New source. After selecting Microsoft SQL Server as your source, complete the UI as shown in the image below:

ℹ️  Since Azure SQL Database uses the SQL Server engine, using the MSSQL source connector works.

Once you have entered all the necessary details, choose Set up source.

Step 3: Set up BigQuery

To set up BigQuery as an Airbyte destination, we first need to create a BigQuery dataset into which our data will be loaded.

Log in to your Google Cloud dashboard. From the Welcome page, choose the Run a query in BigQuery button.

From the three-dot menu next to your cloud project, choose Create dataset.


For Dataset ID, choose a descriptive name such as azure_sql_dataset. Dataset IDs may contain only letters, numbers, and underscores.

Choose the dataset location from the menu and click Create dataset. You should now see the newly created dataset. Take note of the Dataset ID you just entered, since we will need it later to set up BigQuery as an Airbyte destination.
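Equivalently, you can create the dataset with SQL DDL from the BigQuery console. A minimal sketch, assuming the azure_sql_dataset name suggested above:

```sql
-- Create the dataset (called a schema in BigQuery DDL) in a chosen location
CREATE SCHEMA IF NOT EXISTS azure_sql_dataset
OPTIONS (location = 'US');
```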

The only thing left is to get the service account keys needed to access our BigQuery project. Choose APIs & Services from the Quick access menu in your cloud dashboard.

From the Credentials menu, choose to create a new service account.

Pick a valid name for your Service account and click Create and continue.

Next, we need to specify which resources this service account can access. From the available roles, BigQuery Data Editor and BigQuery Job User should be sufficient. Alternatively, BigQuery Admin also works (and is shown in the image below), but for security reasons it should not be used in production systems.

Click Done, and you should see a new service account.

Click the newly created service account’s email and add a new key by clicking the Add Key button.

From the Create private key pop-up, choose JSON.

Once you select Create, a new key file will be downloaded to your system. The final step involves creating a new Airbyte destination.

Step 4: Configure BigQuery as a destination

From your Airbyte dashboard, choose Destinations, select New destination, and pick BigQuery from the available options.

Next, under Service Account Key JSON, paste the entire contents of the credentials file you downloaded earlier. Finally, click Set up destination once you are done.

Step 5: Set up Airbyte Connection between SQL Server and BigQuery

The final step for this tutorial involves building a connection between our newly set up SQL Server and our BigQuery warehouse. To achieve this, go to Connections and choose to set up a New connection.

Select the source and destination that we just created, and Airbyte will show you the tables (called streams in Airbyte) that can be synced.

Once you have selected the streams to sync, choose Set up connection, and Airbyte should automatically start syncing data. Once the sync is complete, Airbyte will report the number of rows synced.

With the sample data in Azure SQL Database, around 6083 rows were synced to BigQuery. Once normalization has completed, you can go to the BigQuery console to verify the data by running a sample SQL query such as the following:
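For instance, a simple count over one of the replicated tables confirms the sync. The project, dataset, and table names below are assumptions based on the setup above; Airbyte derives table names from the source streams:

```sql
-- Verify the replicated data; replace your-project with your GCP project ID.
-- The table name is derived from the source stream, e.g. SalesLT_Customer.
SELECT COUNT(*) AS row_count
FROM `your-project.azure_sql_dataset.SalesLT_Customer`;
```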

Wrapping Up

To summarize, this tutorial has shown you how to:

  1. Set up Azure SQL Database.
  2. Configure Azure SQL Database as an Airbyte source.
  3. Set up Google BigQuery as a data warehouse.
  4. Configure BigQuery as an Airbyte destination.
  5. Create an Airbyte connection that replicates data from an Azure SQL Database to BigQuery.

With Airbyte, the data integration possibilities are endless, and we look forward to seeing you make use of it! We invite you to join the conversation on our community Slack channel, to participate in discussions on Airbyte’s Discourse, or to sign up for our newsletter. You may also be interested in Airbyte’s tutorials and blog!
