Learn how to easily move your Azure SQL Database data into BigQuery, where it can be combined with data from other sources to give you a holistic view of your business and valuable insights.
You may be able to improve your analytics capabilities by moving data from Azure SQL Database into a centralized and unified data warehouse such as BigQuery. As discussed in Airbyte’s article on data integration, moving data from across the enterprise into a centralized data analytics platform provides several benefits.
In this tutorial, you will learn how to build a data replication pipeline from Azure SQL Database to Google’s BigQuery, without writing any code. Let’s get started!
Start by creating an SQL server on Azure to host the database. Once you log in to the Azure Portal, choose to create an SQL database from the dashboard.
Choose the Create option from the SQL databases page.
Next, create a server to host the database. From the Database details section, choose Create new server.
We will use SQL Authentication as the authentication method, creating a new username and password as shown below.
Once the server setup is complete, let’s add the remaining details. For Subscription, choose Free Trial if you don’t already have a paid plan. Next, create a new Resource group to hold all the resources related to our database.
Choose Next: Networking to move forward. Make sure to pick Public Endpoint for the Connectivity method.
Next, under Additional Settings, you can load sample tables by choosing Sample under Use existing data; this loads the AdventureWorksLT sample database. We will sync data from these tables to BigQuery later in the tutorial.
Once you are done with the configuration, choose Create to create the database. This step may take a few minutes. Once Azure creates the database, you can view more details.
Let’s run a sample SQL query by going to the Query Editor and logging in with the username and password you just created.
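As a minimal sketch, assuming you loaded the sample tables in the previous step (Azure’s built-in sample is the AdventureWorksLT database), a query like the following lists a few products:

```sql
-- List ten products from the AdventureWorksLT sample schema
SELECT TOP 10 ProductID, Name, ListPrice
FROM SalesLT.Product
ORDER BY ProductID;
```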
Azure doesn't allow connections to the database from IPs that are not listed in the firewall settings, so we need to add Airbyte’s public IP addresses to the firewall. To whitelist the IPs, click Add a firewall rule and enter the same IP address for both the start and end address.
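If you prefer to script this step, server-level firewall rules can also be created with T-SQL by running sp_set_firewall_rule against the master database. A sketch, with a placeholder IP standing in for one of Airbyte’s published addresses and a rule name of our own choosing:

```sql
-- Run against the master database of your Azure SQL server.
-- Repeat for each of Airbyte's published public IPs; the address
-- below is a placeholder, and the rule name is arbitrary.
EXECUTE sp_set_firewall_rule
    @name = N'AllowAirbyte',
    @start_ip_address = '198.51.100.10',
    @end_ip_address = '198.51.100.10';
```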
You will now add the newly created Azure SQL Database as an Airbyte source. Log in to your Airbyte Cloud dashboard, select Sources in the left bar, then click +New source. After selecting Microsoft SQL Server as your source, you should complete the UI as shown in the image below:
ℹ️ Since Azure SQL Database uses the SQL Server engine, the MSSQL source connector works with it.
Once you have entered all the necessary details, choose Set up source.
To set up the BigQuery Airbyte destination, we first need to create a BigQuery dataset to load our data into.
Log in to your Google Cloud dashboard. From the Welcome page, choose the Run a query in BigQuery button.
From the 3-dot menu of your cloud project, choose the option to create a dataset.
For Dataset ID, choose a descriptive name like azure_sql_dataset. Dataset IDs may contain only letters, numbers, and underscores.
Choose the dataset location from the menu and click Create Dataset. You should now be able to see a newly created dataset. Take note of the Dataset ID you just entered since we will need it later to set up BigQuery as an Airbyte Destination.
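If you’d rather script this step than click through the console, the equivalent BigQuery DDL looks roughly like the following; the dataset name and location here are just examples and should match what you enter in Airbyte later:

```sql
-- Create the dataset in a chosen location; adjust the name and
-- location to match your setup.
CREATE SCHEMA IF NOT EXISTS azure_sql_dataset
OPTIONS (location = 'US');
```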
The only thing left on the Google Cloud side is to get the service account keys needed to access our BigQuery project. Choose APIs & Services from the Quick Access menu in your cloud dashboard.
From the Credentials menu, choose to create a new Service account.
Pick a valid name for your Service account and click Create and continue.
Next, we need to specify what resources can be accessed from this service account. From the available roles, BigQuery Data Editor and BigQuery Job User should be sufficient. Alternatively, BigQuery Admin also works (and is shown in the image below), but for security reasons it should not be used in production systems.
Click Done, and you should see a new service account.
Choose this newly created service account email and add a new key by clicking the Add Key button.
From the Create private key pop-up, choose JSON.
Once you select Create, a new key file will be downloaded to your system. The final step involves creating a new Airbyte destination.
From your Airbyte dashboard, choose Destinations, select New Destination, and pick BigQuery from the available options.
Next, enter the Dataset ID you noted earlier and, under Service Account Key JSON, paste the entire contents of the credentials file you just downloaded. Finally, click Set up destination once you are done with the changes.
The final step of this tutorial involves building a connection between our newly created Azure SQL Database source and our BigQuery destination. To achieve this, go to Connections and choose to set up a New connection.
Select the source and destination that we just created, and Airbyte will show you the tables (called streams in Airbyte) that can be synced.
Once you have selected the streams to sync, choose Set up connection, and Airbyte should automatically start syncing data. Once the sync is complete, Airbyte will report the number of rows synced.
With the sample data in Azure SQL Database, around 6083 rows were synced to BigQuery. Once normalization has completed, you can go to the BigQuery console and run a sample SQL query to verify the data.
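As a sketch, assuming the AdventureWorksLT product stream landed in the azure_sql_dataset dataset created earlier (your project ID and the exact table name Airbyte creates may differ), a quick row count looks like this:

```sql
-- Verify the sync by counting rows in the replicated table.
-- Replace the project, dataset, and table names with your own.
SELECT COUNT(*) AS row_count
FROM `your-project.azure_sql_dataset.product`;
```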
To summarize, this tutorial has shown you how to:
- Create an Azure SQL Database loaded with sample data.
- Configure it as an Airbyte source using the MSSQL connector.
- Create a BigQuery dataset and service account, and set up BigQuery as an Airbyte destination.
- Build an Airbyte connection that replicates data from Azure SQL Database to BigQuery.
With Airbyte, the data integration possibilities are endless, and we look forward to seeing you make use of it! We invite you to join the conversation on our community Slack Channel, to participate in discussions on Airbyte’s Discourse, or to sign up for our newsletter. You may also be interested in Airbyte tutorials and Airbyte’s blog!