Warehouses and Lakes

How to load data from GCS to BigQuery

Learn how to use Airbyte to synchronize your GCS data into BigQuery within minutes.

TL;DR

This can be done by building a data pipeline manually, usually with a Python script (you can leverage a tool such as Apache Airflow for this). This process can take more than a full week of development. Or it can be done in minutes with Airbyte in three easy steps:

  1. set up GCS as a source connector (authenticating with your Google Cloud service account credentials)
  2. set up BigQuery as a destination connector
  3. define which data you want to transfer and how frequently

You can choose to self-host the pipeline using Airbyte Open Source or have it managed for you with Airbyte Cloud.

This tutorial’s purpose is to show you how.


What is Google Cloud Storage

Google Cloud Storage is a cloud-based storage service that allows users to store and access their data from anywhere in the world. It provides a highly scalable and durable storage solution for businesses and individuals, with features such as automatic data replication, versioning, and access control. Google Cloud Storage offers different storage classes to suit different needs, including Standard, Nearline, Coldline, and Archive storage. It also integrates with other Google Cloud services, such as BigQuery and Cloud Functions, to enable data analysis and processing. Overall, Google Cloud Storage provides a reliable and flexible storage solution for businesses of all sizes.

What is BigQuery

BigQuery is Google Cloud's fully managed, serverless enterprise data warehouse, built to run fast SQL queries over massive datasets. Businesses load data into BigQuery from whichever platforms they already use; once the data has been accumulated there, the company keeps control over who can access it, while BigQuery handles storage and processing for greater speed and convenience.



Prerequisites

  1. A Google Cloud Storage account containing the data you want to transfer automatically.
  2. A BigQuery account.
  3. An active Airbyte Cloud account, or you can also choose to use Airbyte Open Source locally. You can follow the instructions to set up Airbyte on your system using docker-compose.
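If you go the Airbyte Open Source route, the local setup mentioned above typically boils down to cloning the repository and starting the platform with Docker. This is a minimal sketch; exact commands vary by release, so follow Airbyte's current quickstart:

```bash
# Clone the Airbyte repository and start the platform locally (requires Docker)
git clone https://github.com/airbytehq/airbyte.git
cd airbyte
docker compose up -d        # older releases use: docker-compose up -d

# Once the containers are up, the UI is available at http://localhost:8000
```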

Airbyte is an open-source data integration platform that consolidates and streamlines the process of extracting and loading data from multiple data sources to data warehouses. It offers pre-built connectors, including Google Cloud Storage and BigQuery, for seamless data migration.

When using Airbyte to move data from Google Cloud Storage to BigQuery, it extracts data from Google Cloud Storage using the source connector, converts it into a format BigQuery can ingest using the provided schema, and then loads it into BigQuery via the destination connector. This allows businesses to leverage their Google Cloud Storage data for advanced analytics and insights within BigQuery, simplifying the ETL process and saving significant time and resources.

Methods to Move Data From Google Cloud Storage to BigQuery

  • Method 1: Connecting Google Cloud Storage to BigQuery using Airbyte.
  • Method 2: Connecting Google Cloud Storage to BigQuery manually.

Method 1: Connecting Google Cloud Storage to BigQuery using Airbyte

Step 1: Set up Google Cloud Storage as a source connector

1. First, navigate to the Airbyte website and create an account.
2. Once you have logged in, click on the "Sources" tab on the left-hand side of the screen.
3. Scroll down until you find the "Google Cloud Storage" source connector and click on it.
4. Click on the "Create Connection" button.
5. Enter a name for your connection and click on the "Next" button.
6. Enter your Google Cloud Storage credentials, including your project ID, service account email, and private key.
7. Click on the "Test Connection" button to ensure that your credentials are correct.
8. Once your connection has been successfully tested, click on the "Create Connection" button.
9. Your Google Cloud Storage source connector is now connected to Airbyte and ready to use.

Note: It is important to ensure that your Google Cloud Storage account has the necessary permissions to allow Airbyte to access your data. Additionally, it is recommended to review Airbyte's documentation and best practices for securing your data and connections.
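For example, a dedicated read-only service account for the connector can be created and wired up from the command line. The project, bucket, and account names below are illustrative; substitute your own:

```bash
# Create a service account that Airbyte will use to read the bucket
gcloud iam service-accounts create airbyte-gcs-reader \
  --project=my-project --display-name="Airbyte GCS reader"

# Grant it read access on the bucket that holds the files to sync
gsutil iam ch \
  serviceAccount:airbyte-gcs-reader@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://my-example-bucket

# Generate the JSON key whose contents you paste into the Airbyte source form
gcloud iam service-accounts keys create airbyte-gcs-key.json \
  --iam-account=airbyte-gcs-reader@my-project.iam.gserviceaccount.com
```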

Step 2: Set up BigQuery as a destination connector

1. First, navigate to the Airbyte dashboard and select the "Destinations" tab on the left-hand side of the screen.

2. Scroll down until you find the "BigQuery" destination connector and click on it.

3. Click the "Create Destination" button to begin setting up your BigQuery destination.

4. Enter your Google Cloud Platform project ID and service account credentials in the appropriate fields (a sketch for granting that account the required BigQuery roles follows this list).

5. Next, select the dataset you want to use for your destination and enter the table prefix you want to use.

6. Choose the schema mapping for your data, which will determine how your data is organized in BigQuery.

7. Finally, review your settings and click the "Create Destination" button to complete the setup process.

8. Once your destination is created, you can begin configuring your source connectors to start syncing data to BigQuery.

9. To do this, navigate to the "Sources" tab on the left-hand side of the screen and select the source connector you want to use.

10. Follow the prompts to enter your source credentials and configure your sync settings.

11. When you reach the "Destination" step, select your BigQuery destination from the dropdown menu and choose the dataset and table prefix you want to use.

12. Review your settings and click the "Create Connection" button to start syncing data from your source to your BigQuery destination.
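The service account you supply in step 4 needs BigQuery permissions before the destination check will pass. Here is a sketch granting the roles typically required (BigQuery User and BigQuery Data Editor); check the connector docs for your version, and treat the project and account names as placeholders:

```bash
# Grant the loading service account the BigQuery roles the destination needs
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:airbyte-loader@my-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.user"

gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:airbyte-loader@my-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
```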

Step 3: Set up a connection to sync your data from GCS to BigQuery

Once you've successfully connected Google Cloud Storage as a data source and BigQuery as a destination in Airbyte, you can set up a data pipeline between them with the following steps:

  1. Create a new connection: On the Airbyte dashboard, navigate to the 'Connections' tab and click the '+ New Connection' button.
  2. Choose your source: Select Google Cloud Storage from the dropdown list of your configured sources.
  3. Select your destination: Choose BigQuery from the dropdown list of your configured destinations.
  4. Configure your sync: Define the frequency of your data syncs based on your business needs. Airbyte allows both manual and automatic scheduling for your data refreshes.
  5. Select the data to sync: Choose the specific Google Cloud Storage objects whose data you want to import into BigQuery. You can sync all data or select specific tables and fields.
  6. Select the sync mode for your streams: Choose between full refreshes or incremental syncs (with deduplication if you want), either for all streams at once or per stream. Incremental syncs are only available for streams that have a cursor field.
  7. Test your connection: Click the 'Test Connection' button to make sure that your setup works. If the connection test is successful, save your configuration.
  8. Start the sync: If the test passes, click 'Set Up Connection'. Airbyte will start moving data from Google Cloud Storage to BigQuery according to your settings.

Remember, Airbyte keeps your data in sync at the frequency you determine, ensuring your BigQuery data warehouse is always up-to-date with your Google Cloud Storage data.
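Once the first sync completes, you can spot-check the result directly from the command line. The dataset and table names below are placeholders for whatever you configured in the destination:

```bash
# Count the rows Airbyte loaded into the destination table
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS row_count FROM `airbyte_dataset.my_gcs_stream`'
```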

Method 2: Connecting Google Cloud Storage to BigQuery manually

Moving data from Google Cloud Storage to BigQuery can be accomplished using Google Cloud Platform (GCP) services without the need for third-party connectors or integrations. Here’s a step-by-step guide to achieve this:

Step 1: Set up Google Cloud Platform Project

1. Create a GCP Project: If you haven’t already, create a new project or select an existing one in the Google Cloud Console.

2. Enable Billing: Ensure that billing is enabled for your project.
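If you prefer the command line over the console, the same project setup can be sketched with the gcloud CLI; the project ID and billing account ID below are placeholders:

```bash
# Create a project and make it the active one
gcloud projects create my-gcs-bq-demo --name="GCS to BigQuery demo"
gcloud config set project my-gcs-bq-demo

# Link a billing account so load jobs and storage are allowed (ID is a placeholder)
gcloud billing projects link my-gcs-bq-demo \
  --billing-account=000000-AAAAAA-111111
```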

Step 2: Enable APIs

1. Enable BigQuery API: Navigate to the "APIs & Services" dashboard and enable the BigQuery API for your project.

2. Enable Cloud Storage API: Similarly, enable the Cloud Storage API if it isn’t already enabled.
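Both APIs can also be enabled in a single command from the gcloud CLI, assuming the project created above is the active one:

```bash
# Enable the BigQuery and Cloud Storage APIs for the active project
gcloud services enable bigquery.googleapis.com storage.googleapis.com
```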

Step 3: Prepare Your Data in Cloud Storage

1. Upload Data to Cloud Storage: If your data is not already in Cloud Storage, upload it to a bucket. Ensure that the data is in a format supported by BigQuery (CSV, JSON, Avro, Parquet, etc.).

2. Set Permissions: Make sure that the Cloud Storage bucket and the objects within it are accessible to the BigQuery service. You may need to adjust the permissions or IAM policies to grant access.
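For example, creating a bucket and uploading a CSV file looks like this with gsutil; the bucket and file names are illustrative:

```bash
# Create a bucket in the location you plan to use and upload the file
gsutil mb -l US gs://my-example-bucket
gsutil cp ./sales_data.csv gs://my-example-bucket/sales_data.csv
```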

Step 4: Set Up BigQuery

1. Create a Dataset: In the BigQuery console, create a new dataset where your tables will reside.

2. Prepare the Schema: If your data requires a specific schema, prepare it beforehand. For some file types, BigQuery can auto-detect the schema.
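A command-line sketch for this step: create the dataset with bq and, if you don't want auto-detection, describe the schema in a JSON file. The dataset name, field names, and types below are examples:

```bash
# Create a dataset to hold the loaded tables
bq --location=US mk --dataset my-gcs-bq-demo:analytics

# Optional: an explicit schema file instead of auto-detection
cat > schema.json <<'EOF'
[
  {"name": "order_id",   "type": "STRING",    "mode": "REQUIRED"},
  {"name": "amount",     "type": "NUMERIC",   "mode": "NULLABLE"},
  {"name": "created_at", "type": "TIMESTAMP", "mode": "NULLABLE"}
]
EOF
```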

Step 5: Load Data from Cloud Storage to BigQuery

You have a couple of options to load data from Cloud Storage to BigQuery:

Using the BigQuery Web UI:

1. Navigate to the BigQuery Console: Go to the BigQuery console within your project.

2. Create a Table: Click on your dataset, then click `+ Create Table`.

3. Select Source: In the `Create table from` dropdown, select `Google Cloud Storage`.

4. Specify File Location: Enter the Cloud Storage URI for your data file(s).

5. File Format: Choose the appropriate file format (CSV, JSON, Avro, Parquet, etc.).

6. Table Name: Specify the table name and dataset in which the table should be created.

7. Table Schema: Either enter the schema manually, or select `Auto-detect` if applicable.

8. Advanced Options: Configure additional options as necessary (e.g., partitioning, clustering).

9. Create Table: Click `Create Table`.

Using the `bq` Command-Line Tool:

1. Open Cloud Shell or Local Terminal: Open Cloud Shell in the Google Cloud Console or use your local terminal with `gcloud` and `bq` tools installed and configured.

2. Load Data Command: Use the `bq load` command to load data into BigQuery. Here's an example command:

```bash
bq load --source_format=[FORMAT] [DATASET].[TABLE] gs://[BUCKET_NAME]/[OBJECT_NAME] [SCHEMA]
```

Replace `[FORMAT]` with your data format, `[DATASET]` with your dataset name, `[TABLE]` with your table name, `[BUCKET_NAME]` with your Cloud Storage bucket name, `[OBJECT_NAME]` with the name of your data file, and `[SCHEMA]` with the schema for your data.

3. Run the Command: Execute the command, and the data will be loaded into BigQuery.
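For instance, a filled-in version of the command, reusing the illustrative bucket, dataset, and schema file from the earlier steps, might look like either of these:

```bash
# Load a CSV with an explicit schema file, skipping the header row
bq load --source_format=CSV --skip_leading_rows=1 \
  analytics.orders gs://my-example-bucket/sales_data.csv ./schema.json

# Or let BigQuery auto-detect the schema instead
bq load --source_format=CSV --skip_leading_rows=1 --autodetect \
  analytics.orders gs://my-example-bucket/sales_data.csv
```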

Step 6: Verify Data Import

After loading your data, verify that the import was successful:

1. Check Job History: In the BigQuery console, go to the "Job History" to see the status of your load job.

2. Query the Table: Run a simple query against the new table to ensure data has been loaded correctly.
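Both checks can also be done from the command line; the dataset and table names match the illustrative ones used above:

```bash
# Show recent jobs (including the load job) and their states
bq ls -j -n 5

# Inspect table metadata such as row count, size, and schema
bq show --format=prettyjson analytics.orders

# Run a quick sanity query against the new table
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS row_count FROM `analytics.orders`'
```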

Step 7: Clean Up

After verifying the data:

1. Delete Temporary Files: If you created temporary files in Cloud Storage, consider deleting them to avoid unnecessary storage charges.

2. Review Access Controls: Make sure that the permissions for both the Cloud Storage bucket and the BigQuery dataset are set according to your organization’s security policies.
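A minimal cleanup sketch, assuming the staging file is no longer needed and the names from the earlier steps:

```bash
# Remove the staging object (and the bucket too, if it was only for this load)
gsutil rm gs://my-example-bucket/sales_data.csv

# Review who still has access to the bucket and the dataset
gsutil iam get gs://my-example-bucket
bq show --format=prettyjson analytics   # dataset metadata includes its access list
```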

By following these steps, you should be able to move data from Google Cloud Storage to BigQuery without any third-party connectors or integrations. Always consult the latest GCP documentation for any updates or changes to the services.

Use Cases to transfer your Google Cloud Storage data to BigQuery

Integrating data from Google Cloud Storage to BigQuery provides several benefits. Here are a few use cases:

  1. Advanced Analytics: BigQuery's powerful data processing capabilities enable you to perform complex queries and data analysis on your Google Cloud Storage data, extracting insights that wouldn't be possible within Google Cloud Storage alone.
  2. Data Consolidation: If you're using multiple other sources along with Google Cloud Storage, syncing to BigQuery allows you to centralize your data for a holistic view of your operations and to keep it consistently up to date across sources.
  3. Historical Data Analysis: Querying historical data directly from files in Google Cloud Storage is cumbersome. Syncing data to BigQuery allows for long-term data retention and analysis of historical trends over time.
  4. Data Security and Compliance: BigQuery provides robust data security features. Syncing Google Cloud Storage data to BigQuery ensures your data is secured and allows for advanced data governance and compliance management.
  5. Scalability: BigQuery can handle large volumes of data without affecting performance, providing an ideal solution for growing businesses with expanding Google Cloud Storage data.
  6. Data Science and Machine Learning: By having Google Cloud Storage data in BigQuery, you can apply machine learning models to your data for predictive analytics, customer segmentation, and more.
  7. Reporting and Visualization: Data visualization tools like Tableau, Power BI, and Looker (Google Data Studio) can connect to BigQuery, providing more advanced business intelligence options. If you have files in Google Cloud Storage that need to become BigQuery tables, Airbyte can do that automatically.

Wrapping Up

To summarize, this tutorial has shown you how to:

  1. Configure a Google Cloud Storage account as an Airbyte data source connector.
  2. Configure BigQuery as a data destination connector.
  3. Create an Airbyte data pipeline that automatically moves data from Google Cloud Storage to BigQuery on the schedule you set.

With Airbyte, creating data pipelines takes minutes, and the data integration possibilities are endless. Airbyte supports the largest catalog of API tools, databases, and files among its sources. Airbyte's connectors are open-source, so you can add custom objects to a connector, or even build a new connector from scratch in about 10 minutes with the no-code connector builder, without a local dev environment or a dedicated data engineer.

We look forward to seeing you make use of it! We invite you to join the conversation on our community Slack Channel, or sign up for our newsletter. You should also check out other Airbyte tutorials, and Airbyte’s content hub!

What should you do next?

Hope you enjoyed the read. Here are three ways we can help you in your data journey:

Easily address your data movement needs with Airbyte Cloud
Take the first step towards extensible data movement infrastructure that will give a ton of time back to your data team. 
Get started with Airbyte for free
Talk to a data infrastructure expert
Get a free consultation with an Airbyte expert to significantly improve your data movement infrastructure. 
Talk to sales
Improve your data infrastructure knowledge
Subscribe to our monthly newsletter and get the community’s new enlightening content along with Airbyte’s progress in their mission to solve data integration once and for all.
Subscribe to newsletter


Frequently Asked Questions

What data can you extract from GCS?

Google Cloud Storage's API provides access to various types of data, including:

1. Object data: This includes files and other data objects stored in Google Cloud Storage buckets.

2. Metadata: This includes information about the objects stored in the buckets, such as their size, creation date, and content type.

3. Access control data: This includes information about who has access to the objects stored in the buckets and what level of access they have.

4. Bucket data: This includes information about the buckets themselves, such as their name, location, and storage class.

5. Logging data: This includes information about the activity in the buckets, such as who accessed them and when.

6. Transfer data: This includes information about data transfers to and from the buckets, such as the amount of data transferred and the transfer speed.

Overall, the Google Cloud Storage API provides access to a wide range of data related to object storage and management in the cloud.
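As a quick illustration, much of this object and bucket information is visible directly from the gsutil CLI; the bucket name below is a placeholder:

```bash
# List objects with full metadata (size, content type, creation time, ACLs)
gsutil ls -L gs://my-example-bucket

# Show bucket-level metadata such as location and storage class
gsutil ls -L -b gs://my-example-bucket
```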

What data can you transfer to BigQuery?

You can transfer a wide variety of data to BigQuery. This usually includes structured, semi-structured, and unstructured data like transaction records, log files, JSON data, CSV files, and more, allowing robust, scalable data integration and analysis.

What are top ETL tools to transfer data from GCS to BigQuery?

The most prominent ETL tools to transfer data from GCS to BigQuery include:

  • Airbyte
  • Fivetran
  • Stitch
  • Matillion
  • Talend Data Integration

These tools help in extracting data from GCS and various other sources (APIs, databases, and more), transforming it efficiently, and loading it into BigQuery and other databases, data warehouses, and data lakes, enhancing data management capabilities.