FAQs
What is ETL?
ETL, an acronym for Extract, Transform, Load, is a vital data integration process. It involves extracting data from diverse sources, transforming it into a usable format, and loading it into a database, data warehouse or data lake. This process enables meaningful data analysis, enhancing business intelligence.
What data can you extract from Salesforce?
Salesforce's API provides access to a wide range of data types, including:
1. Accounts: Information about customer accounts, including contact details, billing information, and purchase history.
2. Leads: Data on potential customers, including contact information, lead source, and lead status.
3. Opportunities: Information on potential sales deals, including deal size, stage, and probability of closing.
4. Contacts: Details on individual contacts associated with customer accounts, including contact information and activity history.
5. Cases: Information on customer service cases, including case details, status, and resolution.
6. Products: Data on products and services offered by the company, including pricing, availability, and product descriptions.
7. Campaigns: Information on marketing campaigns, including campaign details, status, and results.
8. Reports and Dashboards: Access to pre-built and custom reports and dashboards that provide insights into sales, marketing, and customer service performance.
9. Custom Objects: Ability to access and manipulate custom objects created by the organization to store specific types of data.
Overall, Salesforce's API provides access to a comprehensive set of data types that enable organizations to manage and analyze their customer relationships, sales processes, and marketing campaigns.
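As a concrete illustration, the snippet below pulls a few of these objects through the Salesforce REST API using the simple_salesforce Python library. It is a minimal sketch: the credentials are placeholders, and the SOQL queries only select a handful of fields.

```python
from simple_salesforce import Salesforce

# Placeholder credentials: use your own username, password, and security token.
sf = Salesforce(
    username="you@example.com",
    password="your_password",
    security_token="your_security_token",
)

# Query a few of the object types listed above with simple SOQL statements.
accounts = sf.query("SELECT Id, Name, Industry FROM Account LIMIT 5")
leads = sf.query("SELECT Id, FirstName, LastName, Company, Status FROM Lead LIMIT 5")
opportunities = sf.query("SELECT Id, Name, StageName, Amount FROM Opportunity LIMIT 5")

for record in leads["records"]:
    print(record["LastName"], record["Company"], record["Status"])
```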
What is ELT?
ELT, standing for Extract, Load, Transform, is a modern take on the traditional ETL data integration process. In ELT, data is first extracted from various sources, loaded directly into a data warehouse, and then transformed. This approach enhances data processing speed, analytical flexibility and autonomy.
Difference between ETL and ELT?
ETL and ELT are critical data integration strategies with key differences. ETL (Extract, Transform, Load) transforms data before loading, ideal for structured data. In contrast, ELT (Extract, Load, Transform) loads data before transformation, perfect for processing large, diverse data sets in modern data warehouses. ELT is becoming the new standard as it offers a lot more flexibility and autonomy to data analysts.
Salesforce is a cloud-based customer relationship management (CRM) platform providing business solutions software on a subscription basis. Salesforce is a huge force in the ecommerce world, helping businesses with marketing, commerce, service and sales, and enabling enterprises’ IT teams to collaborate easily from anywhere. Salesforce is the force behind many industries, offering healthcare, automotive, finance, media, communications, and manufacturing multichannel support. Its services are wide-ranging, with access to customer, partner, and developer communities as well as an app exchange marketplace.
BigQuery is an enterprise data warehouse that draws on the processing power of Google’s infrastructure to run fast SQL queries over massive datasets. It lets businesses consolidate data from the platforms they already use. Once a business’ data is accumulated, it is moved into BigQuery. The company controls access to the data, but BigQuery stores and processes it for greater speed and convenience.
1. Open the Airbyte platform and navigate to the "Sources" tab on the left-hand side of the screen.
2. Click on the "Salesforce" source connector and select "Create new connection."
3. Enter a name for your connection and click "Next."
4. Enter your Salesforce credentials, including your username, password, and security token.
5. Click "Test connection" to ensure that your credentials are correct and that Airbyte can connect to your Salesforce account.
6. Once the connection is successful, select the objects you want to replicate from Salesforce.
7. Choose the replication frequency and any other settings you want to apply to your connection.
8. Click "Create connection" to save your settings and start replicating data from Salesforce to Airbyte.
9. You can monitor the progress of your replication in the "Connections" tab and view the data in the "Dashboard" tab.
1. First, navigate to the Airbyte dashboard and select the "Destinations" tab on the left-hand side of the screen.
2. Scroll down until you find the "BigQuery" destination connector and click on it.
3. Click the "Create Destination" button to begin setting up your BigQuery destination.
4. Enter your Google Cloud Platform project ID and service account credentials in the appropriate fields.
5. Next, select the dataset you want to use for your destination and enter the table prefix you want to use.
6. Choose the schema mapping for your data, which will determine how your data is organized in BigQuery.
7. Finally, review your settings and click the "Create Destination" button to complete the setup process.
8. Once your destination is created, you can begin configuring your source connectors to start syncing data to BigQuery.
9. To do this, navigate to the "Sources" tab on the left-hand side of the screen and select the source connector you want to use.
10. Follow the prompts to enter your source credentials and configure your sync settings.
11. When you reach the "Destination" step, select your BigQuery destination from the dropdown menu and choose the dataset and table prefix you want to use.
12. Review your settings and click the "Create Connection" button to start syncing data from your source to your BigQuery destination.
With Airbyte, creating data pipelines takes minutes, and the data integration possibilities are endless. Airbyte supports the largest catalog of API tools, databases, and files, among other sources. Airbyte's connectors are open-source, so you can add custom objects to an existing connector, or even build a new connector from scratch with the no-code connector builder in about 10 minutes, without a local dev environment or a dedicated data engineer.
We look forward to seeing you make use of it! We invite you to join the conversation on our community Slack Channel, or sign up for our newsletter. You should also check out other Airbyte tutorials, and Airbyte’s content hub!
By integrating marketing, sales, service, and IT teams into one platform, Salesforce has transformed the way businesses operate. However, in today’s changing data landscape, you cannot simply rely on Salesforce itself to derive maximum value for your business. In fact, you need a combination of Salesforce and a powerful data warehouse like Google’s BigQuery so you can open the doors to more robust analytical insights and in turn, drive increased revenue and growth.
In this recipe, we will explore why you should consider moving data from Salesforce to BigQuery, and then demonstrate how easily you can use Airbyte to do the job.
Why centralize data from Salesforce in BigQuery?
Let’s look at some of the reasons why you might want to centralize your Salesforce data into a data warehouse.
Out-of-the-box reporting is insufficient and difficult to use
While Salesforce is a powerful tool, its default data capabilities are limited. There are hard restrictions on reporting and dashboard capabilities, which make them inflexible across different scenarios. Moreover, when analyzing historical data in Salesforce, the timing of snapshots has to be planned in advance and cannot be changed on the fly. On top of that, there are further limitations for large reports with over 100 fields.
Customizing Salesforce can be very expensive and time-consuming
Customizing Salesforce workflows is typically a multi-step process that involves hiring certified Salesforce consultants or building an in-house development team, and then collaborating with them to build the solution. Once the solution is built, your staff will need to be trained to use it effectively. All of this not only takes time and effort, but can also be quite expensive. The cost of highly customized Salesforce implementations can range from $10,000 to well over $50,000, depending on the complexity of the company's internal processes.
Salesforce's pricing limits access
Salesforce's pricing model is pay-per-user, and purchasing an annual license is required to get started. This will likely force you to limit the number of Salesforce users in your enterprise to bring down costs. With limited sharing options, collaboration becomes challenging, even though collaboration is one of the reasons Salesforce is adopted in the first place. For example, reports can only be shared with paid Salesforce users, so a C-level executive who only needs to view a report cannot do so without paying for a license.
Why use Airbyte to extract Salesforce data
Now that you realize the value of moving Salesforce data into a data warehouse, how do you implement that process?
In most cases, businesses start by writing custom ETL scripts, but the truth is that they rarely succeed with them. Manually writing scripts slows down your project's velocity, and these scripts are typically brittle: constant care and time must be devoted to keeping them running. With automation, you can ditch the complex hardcoded scripts that handle data wrangling and scheduling, enabling your teams to work efficiently. Airbyte's connectors are open-source and easily customizable, and they help you seamlessly integrate the data residing across your business apps and databases into your data warehouse. With Airbyte, you get full control over your data in an effortless way. Data is deduplicated and can be transformed on the fly with SQL based on custom business logic rules. Airbyte also has built-in scheduling, orchestration, and monitoring: its ready-to-go scheduler lets you replicate data either fully or incrementally, removing manual intervention and allowing your developers to focus on more critical matters.
Now, let’s get started with the how-to’s of using Airbyte to replicate data from Salesforce to BigQuery.
Prerequisites
Below are the prerequisites you’ll need to get started on backing up your Salesforce data to Google BigQuery.
- You’ll need to get Airbyte to do the data replication for you. To deploy Airbyte, follow the simple instructions in our documentation here.
- You will need a Salesforce account.
- You also need a Google Cloud Platform account with the BigQuery service enabled.
Salesforce accounts can be created by signing up at the link here. Note: You will need at least a developer account for Airbyte to be able to access Salesforce REST APIs. To set up and create a GCP account with the BigQuery service enabled, follow the instructions at the link here.
Methods to Move Data From Salesforce to BigQuery
- Method 1: Connecting Salesforce to BigQuery using Airbyte.
- Method 2: Connecting Salesforce to BigQuery manually.
Method 1: Connecting Salesforce to BigQuery using Airbyte.
Configure your Salesforce account
The first step is to configure your Salesforce account to allow external applications to connect and access data. Log in to Salesforce and go to Setup > Create > Apps, then select New in the Connected Apps section.
Fill in the required fields.
Next, enable the OAuth settings, enter https://login.salesforce.com/ under Callback URL, and select the OAuth scopes required for API access.
Save the connected app. Once saved you should be able to see your Consumer Key and Consumer Secret. Make a note of these values which will be used later.
Next, you will need to allow access to the connected app and generate an authorization code, which will then be used to obtain the refresh token required to configure the Salesforce source in Airbyte.
Log in to Salesforce and, in a new browser tab, go to the following URL, replacing <instance_name> with your Salesforce instance and <consumer_key> with the Consumer Key from your connected app: https://<instance_name>.salesforce.com/services/oauth2/authorize?response_type=code&client_id=<consumer_key>&redirect_uri=https://login.salesforce.com/
Your instance name can be found in your Salesforce account verification email. After opening the URL, you will be prompted to allow access.
Copy the value of the code parameter from the redirect URL. This will be used as the authorization code in the next step.
Next, you will need to make a POST request to https://<instance_name>.salesforce.com/services/oauth2/token with the following parameters.
Set client_id to your Consumer Key, code to the authorization code from the previous step, grant_type to authorization_code, client_secret to your Consumer Secret, and redirect_uri to https://login.salesforce.com/.
The response will contain a value for refresh_token which will be required to configure Airbyte.
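If you prefer to script this exchange, here is a minimal sketch using Python's requests library. The instance name, Consumer Key, Consumer Secret, and authorization code are placeholders you must replace with your own values.

```python
import requests

# Placeholder values: replace with your Salesforce instance and connected-app credentials.
INSTANCE = "your-instance"            # e.g. the host from your Salesforce verification email
CONSUMER_KEY = "your_consumer_key"
CONSUMER_SECRET = "your_consumer_secret"
AUTHORIZATION_CODE = "code_copied_from_redirect_url"

# Exchange the authorization code for tokens, mirroring the parameters described above.
response = requests.post(
    f"https://{INSTANCE}.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "authorization_code",
        "code": AUTHORIZATION_CODE,
        "client_id": CONSUMER_KEY,
        "client_secret": CONSUMER_SECRET,
        "redirect_uri": "https://login.salesforce.com/",
    },
)
response.raise_for_status()
tokens = response.json()

# The refresh_token is what the Airbyte Salesforce source configuration needs.
print(tokens["refresh_token"])
```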
Add sample data to Salesforce
In the next step we will generate and import sample data into Salesforce. For this example we will create Leads data. Go to https://www.mockaroo.com/ and create the Leads dataset with the following fields.
This will generate a CSV with sample data.
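If you would rather generate the sample CSV locally instead of using Mockaroo, a minimal sketch is shown below. The field names are illustrative; at a minimum, Salesforce Leads require a Last Name and a Company.

```python
import csv
import random

# Illustrative field names; adjust them to match the Lead fields you want to populate.
FIELDS = ["FirstName", "LastName", "Company", "Email", "LeadSource", "Status"]
SOURCES = ["Web", "Phone Inquiry", "Partner Referral", "Other"]
STATUSES = ["Open - Not Contacted", "Working - Contacted"]

with open("sample_leads.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    for i in range(1000):
        writer.writerow({
            "FirstName": f"Test{i}",
            "LastName": f"Lead{i}",
            "Company": f"Acme {i % 50}",
            "Email": f"test.lead{i}@example.com",
            "LeadSource": random.choice(SOURCES),
            "Status": random.choice(STATUSES),
        })
```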
Next, log in to Salesforce and go to Setup > Data > Data Import Wizard. Choose the Leads object, upload the CSV, and follow the wizard steps to import the data.
Once complete, you can view the data by going to the Salesforce homepage > Leads.
Set up Google BigQuery
The next step is to create a BigQuery dataset and generate the credentials JSON file required to configure Airbyte. Log in to your GCP account and go to BigQuery.
Select Create dataset and fill in the required information.
Next, go to the Service Accounts page and create a new service account. Enter the required information and grant the service account the BigQuery Data Owner role.
Once the service account is created, go to Keys > Add Key and create a new key. You will be prompted to download a credentials file. Download the JSON version which will be used later.
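As an alternative to the console, the dataset can also be created programmatically. Below is a minimal sketch using the google-cloud-bigquery client and the service-account JSON file downloaded above; the project ID, dataset name, and location are placeholders.

```python
from google.cloud import bigquery

# Placeholders: replace with your own project, dataset name, and location.
PROJECT_ID = "your-gcp-project"
DATASET_ID = f"{PROJECT_ID}.salesforce_data"

# Authenticate with the service-account key downloaded from the GCP console.
client = bigquery.Client.from_service_account_json("service_account.json", project=PROJECT_ID)

dataset = bigquery.Dataset(DATASET_ID)
dataset.location = "US"  # choose the region that matches your Airbyte destination settings

# exists_ok=True makes the call idempotent if the dataset was already created in the console.
client.create_dataset(dataset, exists_ok=True)
print(f"Dataset {DATASET_ID} is ready")
```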
Set up Salesforce as your Airbyte source
Next, set up the connection for the source, which will be your Salesforce account. Under Client ID, enter your Consumer Key; under Client Secret, enter your Consumer Secret; and under Refresh Token, enter the refresh token obtained at the end of Step 1.
Set up BigQuery as your Airbyte destination
Next, set up Airbyte to use BigQuery as the destination for the data replication. Enter your GCP project ID, dataset ID, and dataset location, and paste in the contents of the credentials JSON file you downloaded earlier.
Create a Salesforce to BigQuery connection
Once the connection is configured, you will see a list of Salesforce streams from which data can be backed up.
Scroll through and select the Leads stream that contains the sample data.
Note: You need to have billing enabled in your BigQuery service for normalization to work without any errors.
Once configured, you can manually trigger a sync. When it completes, your data will have been backed up to BigQuery.
You can see two tables created in your dataset. The Lead table contains the normalized data.
You can also view the raw data by navigating to the _airbyte_raw_lead table in the same dataset.
You can test out the Incremental Append sync mode by generating another sample data file and uploading it to Salesforce. Running the sync again will add the new items to BigQuery.
In this case, an additional 1065 items were added. You can verify the total row count in your dataset by running a simple COUNT query against the Lead table in the BigQuery UI, as sketched below.
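A minimal sketch of that verification, assuming the dataset is named salesforce_data and the normalized table is lead (adjust both to your own setup); the same COUNT query can be pasted directly into the BigQuery UI.

```python
from google.cloud import bigquery

# Assumed names: replace with your project, dataset, and table.
client = bigquery.Client.from_service_account_json("service_account.json")
query = "SELECT COUNT(*) AS total_rows FROM `your-gcp-project.salesforce_data.lead`"

# Run the query and print the row count; this should reflect the newly appended records.
for row in client.query(query).result():
    print(f"Total rows in lead table: {row.total_rows}")
```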
Method 2: Connecting Salesforce to BigQuery manually.
To manually connect Salesforce to BigQuery without relying on third-party tools, you will need to perform a series of steps that involve extracting data from Salesforce, preparing it for BigQuery, and then loading it into BigQuery. Below is a high-level guide on how to do this:
Step 1: Extract Data from Salesforce
- Use Salesforce Reports or Data Export Service:
- You can manually generate reports or use the data export service provided by Salesforce to extract your data.
- Schedule or perform an export of the relevant objects (e.g., Leads, Opportunities, Contacts).
- Use Salesforce APIs:
- Utilize the Salesforce REST API or Bulk API to programmatically extract data.
- Write a script or use a command-line tool like curl to make API requests and retrieve the data (see the sketch after this list).
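For the API route, here is a minimal sketch using Python and the REST query endpoint. It assumes you already have an access token (for example, from the OAuth flow described earlier); the instance name and API version are placeholders.

```python
import requests

INSTANCE = "your-instance"          # your Salesforce instance host
ACCESS_TOKEN = "your_access_token"  # obtained via OAuth or another auth flow
API_VERSION = "v58.0"               # adjust to a version available in your org

url = f"https://{INSTANCE}.salesforce.com/services/data/{API_VERSION}/query"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
params = {"q": "SELECT Id, FirstName, LastName, Company, Email FROM Lead"}

records = []
response = requests.get(url, headers=headers, params=params).json()
records.extend(response["records"])

# The query endpoint paginates large results; follow nextRecordsUrl until done.
while not response.get("done", True):
    next_url = f"https://{INSTANCE}.salesforce.com{response['nextRecordsUrl']}"
    response = requests.get(next_url, headers=headers).json()
    records.extend(response["records"])

print(f"Retrieved {len(records)} leads")
```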
Step 2: Prepare Data for BigQuery
- Format the Data:
- Ensure that the data extracted from Salesforce is in a format supported by BigQuery (CSV, JSON, Avro, or Parquet).
- Clean and transform the data if necessary, making sure to handle any data type discrepancies.
- Compress the Data (Optional):
- BigQuery supports compressed data formats, which can save on storage and improve load times.
- Use tools like gzip to compress your CSV or JSON files (a combined formatting and compression sketch follows this list).
- Split Large Data Files (Optional):
- If you have very large data files, consider splitting them into smaller chunks to make the upload process more manageable and potentially parallelize the load operation.
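To make the preparation step concrete, here is a minimal sketch that writes the extracted records as newline-delimited JSON (the JSON layout BigQuery expects for load jobs) and gzip-compresses the result, using only the Python standard library. The records variable and field handling are illustrative.

```python
import gzip
import json

# Illustrative records; in practice these would come from the extraction step above.
records = [
    {"Id": "00Q000000000001", "LastName": "Lead1", "Company": "Acme 1", "Email": "lead1@example.com"},
    {"Id": "00Q000000000002", "LastName": "Lead2", "Company": "Acme 2", "Email": "lead2@example.com"},
]

# BigQuery load jobs expect newline-delimited JSON: one JSON object per line.
with gzip.open("leads.json.gz", "wt", encoding="utf-8") as f:
    for record in records:
        # Drop Salesforce's "attributes" metadata key if present; BigQuery doesn't need it.
        record.pop("attributes", None)
        f.write(json.dumps(record) + "\n")
```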
Step 3: Upload Data to Google Cloud Storage (GCS)
- Create a Bucket:
- Go to the Google Cloud Console and create a new storage bucket in Google Cloud Storage if you don't already have one.
- Upload Files:
- Use the Google Cloud Console, gsutil, or the Google Cloud Storage API to upload your prepared data files to the GCS bucket (see the sketch after this list).
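Programmatic upload to GCS can be done with the google-cloud-storage client. The bucket name and object path below are placeholders, and the same service-account JSON used for BigQuery is assumed to have storage permissions.

```python
from google.cloud import storage

BUCKET_NAME = "your-gcs-bucket"                 # placeholder bucket name
OBJECT_PATH = "salesforce/leads/leads.json.gz"  # placeholder object path

# The service account needs write access to the bucket (e.g. Storage Object Admin).
client = storage.Client.from_service_account_json("service_account.json")
bucket = client.bucket(BUCKET_NAME)
blob = bucket.blob(OBJECT_PATH)
blob.upload_from_filename("leads.json.gz")

print(f"Uploaded to gs://{BUCKET_NAME}/{OBJECT_PATH}")
```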
Step 4: Load Data into BigQuery
- Create a Dataset and Table in BigQuery:
- In the Google Cloud Console, navigate to BigQuery and create a new dataset.
- Define a table schema that matches the structure of your Salesforce data.
- Load Data from GCS into BigQuery:
- Use the BigQuery Web UI, bq command-line tool, or the BigQuery API to create a load job.
- Specify the GCS file path, the table you're loading the data into, and any additional configurations (such as field delimiters, header rows to skip, etc.); a sketch of a programmatic load job follows this list.
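Here is a minimal sketch of such a load job using the google-cloud-bigquery client, assuming the gzipped newline-delimited JSON file uploaded above and a destination table named leads; all names are placeholders to adjust for your project.

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("service_account.json")

# Placeholder GCS path and destination table.
GCS_URI = "gs://your-gcs-bucket/salesforce/leads/leads.json.gz"
TABLE_ID = "your-gcp-project.salesforce_data.leads"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema; define it explicitly for production use
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
load_job.result()  # wait for the job to finish; raises on errors

table = client.get_table(TABLE_ID)
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")
```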
Step 5: Verify Data Integrity
- Check the Load Job:
- After the load job completes, check for any errors or warnings that may have occurred during the import process.
- Query the Data:
- Run some test queries in BigQuery to ensure that the data has been loaded correctly and matches your expectations.
Step 6: Automate the Process (Optional)
- Scripting:
- To avoid manual repetition, you can write scripts to automate the extraction, transformation, and loading processes.
- Cloud Functions or Cloud Workflows:
- Use Google Cloud Functions or Cloud Workflows to orchestrate and automate the data pipeline.
- Schedule Regular Updates:
- Set up a schedule to regularly extract data from Salesforce and update your BigQuery dataset.
Keep in mind that this manual process can be time-consuming and may require ongoing maintenance. If you find that you need to perform this operation regularly or with large volumes of data, consider using a data pipeline tool like Airbyte.
Wrapping up
Now that you have replicated your Salesforce data to Google BigQuery, you can leverage the rich analytical capabilities of BigQuery to extract more insights from this data. Good luck, and we wish you all the best using Airbyte!
Now that you've replicated your Salesforce data to Google BigQuery, check out another article to discover how the Google BigQuery Sandbox can further amplify your data insights journey.
Join the conversation at Airbyte’s community Slack Channel to share your ideas with over 1000 data engineers and help make everyone’s project successful.