

Step 1: Export your data from Webflow
1. Log in to your Webflow account and navigate to the project containing the data you want to export.
2. Go to the Collections page where your data is stored.
3. Click on the collection you want to export data from.
4. In the collection panel, look for an export button or similar option. If available, click on it to export the data.
5. Choose the CSV format for the export, as it is easily readable and can be imported to Google Sheets.
6. Download the CSV file to your local machine.
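The exported file is plain comma-separated text. If you ever want to inspect or process it programmatically rather than through the Sheets import dialog, a minimal sketch of splitting such a file into rows might look like this (it assumes simple, unquoted fields; real exports can contain quoted fields with embedded commas, which need a full CSV parser):

```javascript
// Minimal CSV splitter for simple, unquoted fields.
// Returns an array of rows, each row an array of cell strings.
function parseSimpleCsv(text) {
  return text
    .trim()
    .split('\n')
    .map(function (line) {
      return line.split(',').map(function (cell) {
        return cell.trim();
      });
    });
}

// Example: a header row plus two item rows
var rows = parseSimpleCsv('name,slug\nHome,home\nAbout,about');
```

Here `rows[0]` holds the header (`['name', 'slug']`) and the remaining entries hold one item each.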
Step 2: Set up your Google Sheet
1. Open Google Sheets and create a new spreadsheet.
2. Give your spreadsheet a name that reflects the data you will be importing.
3. If you want to automate the import process, skip ahead to Step 3. Otherwise, import the CSV file manually by clicking File > Import > Upload and selecting the CSV file you exported from Webflow.
Step 3: Import the data with Google Apps Script
1. In your Google Sheets, click on Extensions > Apps Script.
2. Delete any code in the script editor and replace it with a custom script. You will write a script that fetches data from Webflow's API and writes it to your Google Sheet.
Here's an outline of what your script should do:
- Authenticate with Webflow's API: You'll need to get an API token from Webflow and include it in your script to authenticate your requests.
- Fetch data from Webflow: Use the Webflow API to fetch the data you want to import into Google Sheets.
- Parse the fetched data: Convert the data from the API response into a format that Google Sheets can understand (usually an array of arrays).
- Write data to Google Sheets: Use Google Apps Script methods to write the parsed data into your spreadsheet.
```javascript
function importDataFromWebflow() {
  var webflowApiToken = 'YOUR_WEBFLOW_API_TOKEN';
  var webflowCollectionId = 'YOUR_COLLECTION_ID';
  var options = {
    "method": "get",
    "headers": {
      "Authorization": "Bearer " + webflowApiToken,
      "accept-version": "1.0.0"
    }
  };
  var apiUrl = "https://api.webflow.com/collections/" + webflowCollectionId + "/items";
  var response = UrlFetchApp.fetch(apiUrl, options);
  var jsonResponse = JSON.parse(response.getContentText());
  // The collection items arrive as an array of objects in the 'items' field
  var items = jsonResponse.items;
  if (!items || items.length === 0) {
    return; // nothing to import
  }
  // setValues() expects an array of arrays, so flatten each item object
  // into a row, using the first item's keys as the column order
  var fieldNames = Object.keys(items[0]);
  var rows = items.map(function (item) {
    return fieldNames.map(function (field) {
      return item[field];
    });
  });
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  // Write a header row, then the data starting at row 2
  sheet.getRange(1, 1, 1, fieldNames.length).setValues([fieldNames]);
  sheet.getRange(2, 1, rows.length, fieldNames.length).setValues(rows);
}
```
3. Replace `'YOUR_WEBFLOW_API_TOKEN'` with your actual Webflow API token and `'YOUR_COLLECTION_ID'` with the collection ID from which you want to import data.
4. Save the script with a meaningful name and then run it by clicking on the play button or by creating a trigger to run it at regular intervals.
Step 4: Schedule the script to run automatically
1. In the Google Apps Script editor, click on the clock icon on the left to open the Triggers page.
2. Click on + Add Trigger at the bottom right corner.
3. Set up the trigger to run the `importDataFromWebflow` function at the interval you desire (e.g., daily, weekly).
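As an alternative to the Triggers UI, you can create the same time-driven trigger programmatically. A sketch, to be run once from the script editor (the hour chosen here is arbitrary; `ScriptApp` exists only inside the Apps Script environment):

```javascript
// Create a daily time-driven trigger for the import function.
// Run this once from the Apps Script editor; the trigger then
// persists and fires on schedule without further action.
function createDailyImportTrigger() {
  ScriptApp.newTrigger('importDataFromWebflow')
    .timeBased()
    .everyDays(1)
    .atHour(6) // around 6 AM in the script's time zone
    .create();
}
```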
Step 5: Verify the import
1. Review your script and make sure that it is free of errors.
2. Test the script by running it manually at first to ensure it imports the data correctly.
3. Check your Google Sheet to see if the data has been imported as expected.
Important Considerations:
- Rate Limits: Be aware of Webflow's API rate limits to avoid getting temporarily blocked.
- API Changes: Keep an eye on any changes to the Webflow API that might require updates to your script.
- Error Handling: Implement error handling in your script to deal with issues like network errors or unexpected API changes.
- Security: Keep your Webflow API token secure and do not expose it in shared scripts or spreadsheets.
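One way to address both the rate-limit and error-handling points above is to wrap the API call in a retry loop with exponential backoff. A hedged sketch: `fetchFn` and `sleepFn` are stand-ins for `UrlFetchApp.fetch` and `Utilities.sleep`, injected as parameters here so the retry logic is testable outside Apps Script:

```javascript
// Retry a fetch-like function with exponential backoff.
// Doubles the wait after each failure (1s, 2s, 4s, ...) and
// rethrows the last error once maxAttempts is exhausted.
function fetchWithRetry(fetchFn, sleepFn, maxAttempts) {
  var delayMs = 1000;
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return fetchFn();
    } catch (err) {
      if (attempt === maxAttempts) {
        throw err; // give up after the last attempt
      }
      sleepFn(delayMs);
      delayMs *= 2;
    }
  }
}
```

Inside Apps Script you would call it as `fetchWithRetry(function () { return UrlFetchApp.fetch(apiUrl, options); }, Utilities.sleep, 5)`.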
By following these steps, you should be able to move data from Webflow to Google Sheets without using third-party connectors or integrations. Remember that this method requires a fair amount of technical knowledge, including familiarity with Google Apps Script and Webflow's API.
FAQs
What is ETL?
ETL, an acronym for Extract, Transform, Load, is a vital data integration process. It involves extracting data from diverse sources, transforming it into a usable format, and loading it into a database, data warehouse or data lake. This process enables meaningful data analysis, enhancing business intelligence.
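As a toy illustration of those three stages (the record shape, field names, and in-memory "warehouse" here are invented for the example):

```javascript
// Toy ETL pipeline: extract raw records from a source, transform
// them into a normalized shape, and load them into a target store.
function runEtl(sourceRecords, warehouse) {
  // Extract: pull the raw rows from the source
  var extracted = sourceRecords.slice();

  // Transform: normalize the name and compute a derived field
  var transformed = extracted.map(function (r) {
    return {
      name: r.name.trim().toLowerCase(),
      total: r.price * r.quantity
    };
  });

  // Load: append the transformed rows to the target store
  transformed.forEach(function (row) {
    warehouse.push(row);
  });
  return warehouse;
}
```

In a real pipeline the "load" step would write to a database or data warehouse rather than an array, but the ordering of the stages is the same.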
What is Webflow?
Webflow is a platform for visually designing, building, and launching production-ready, responsive websites without writing code. It also includes a built-in CMS and hosting, which makes it well suited to both production sites and prototypes. Overall, Webflow aims to simplify the work of designers and teams, helping them move faster and deliver high-quality websites.
What data can you extract from Webflow?
Webflow's API provides access to a wide range of data related to websites built on the Webflow platform. The following are the categories of data that can be accessed through the API:
1. Site data: This includes information about the website, such as its name, URL, and settings.
2. Collection data: This includes data related to collections, such as the name, description, and fields.
3. Item data: This includes data related to individual items within a collection, such as the item's ID, name, and field values.
4. Asset data: This includes data related to assets used on the website, such as images, videos, and files.
5. Form data: This includes data related to forms on the website, such as form submissions and form fields.
6. E-commerce data: This includes data related to e-commerce functionality on the website, such as products, orders, and customers.
7. CMS data: This includes data related to the content management system used on the website, such as templates, pages, and content.
Overall, the Webflow API provides access to a wide range of data that can be used to build custom integrations and applications that interact with Webflow websites.
What is ELT?
ELT, standing for Extract, Load, Transform, is a modern take on the traditional ETL data integration process. In ELT, data is first extracted from various sources, loaded directly into a data warehouse, and then transformed. This approach enhances data processing speed, analytical flexibility and autonomy.
Difference between ETL and ELT?
ETL and ELT are critical data integration strategies with key differences. ETL (Extract, Transform, Load) transforms data before loading, ideal for structured data. In contrast, ELT (Extract, Load, Transform) loads data before transformation, perfect for processing large, diverse data sets in modern data warehouses. ELT is becoming the new standard as it offers a lot more flexibility and autonomy to data analysts.