

Step 1: Export Your Data from Google Sheets
- Open your Google Sheet. Ensure the data is clean and correctly formatted. The first row should contain the column headers that will correspond to your MySQL table fields.
- Check data types. Make sure the data types in Google Sheets are compatible with MySQL data types (e.g., text, numbers, dates).
- Export the file. Click File > Download and choose a format suitable for MySQL import. CSV (comma-separated values) is the usual choice.
- Download the file. Save the CSV file to your local machine.
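Before moving on, it can help to sanity-check the exported file programmatically rather than by eye. A minimal Python sketch using only the standard-library csv module; the check it applies (every data row has the same field count as the header) is a basic consistency test, and the file name used later is just a placeholder:

```python
import csv

def check_csv(path):
    """Return the header row after verifying that every data row
    has the same number of fields as the header."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        for line_no, row in enumerate(reader, start=2):
            if len(row) != len(header):
                raise ValueError(
                    f"row {line_no} has {len(row)} fields, expected {len(header)}"
                )
        return header
```

The returned header names are the ones you will reuse as column names in the CREATE TABLE statement below.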
Step 2: Prepare the MySQL Database
- Access MySQL. Log in to your MySQL server using the command line or a database management tool such as phpMyAdmin.
- Create a database. Execute CREATE DATABASE your_database_name; to create a new database.
- Use the database. Run USE your_database_name; to select the new database.
- Create a table. Define a new table whose structure matches the data in your CSV file using the CREATE TABLE statement. For example:
CREATE TABLE your_table_name (
column1_name column1_datatype,
column2_name column2_datatype,
...
);
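If you would rather not write the DDL by hand, a rough schema can be inferred from the CSV itself. The type-guessing rules in this sketch (INT, DOUBLE, else VARCHAR(255)) are simplistic assumptions, not an official mapping, so review the generated statement before running it:

```python
import csv

def guess_type(values):
    """Crudely map a column's sample values to a MySQL type."""
    try:
        for v in values:
            int(v)
        return "INT"
    except ValueError:
        pass
    try:
        for v in values:
            float(v)
        return "DOUBLE"
    except ValueError:
        pass
    return "VARCHAR(255)"

def create_table_sql(path, table):
    """Build a CREATE TABLE statement from a CSV file's header
    and a type guess over each column's data rows."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    cols = [
        f"{name} {guess_type([r[i] for r in data])}"
        for i, name in enumerate(header)
    ]
    return f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"
```

Print the result, adjust any column that was guessed wrong (dates in particular will come out as VARCHAR), and run it in your MySQL client.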
Step 3: Check the CSV File
- Check CSV formatting. Open the CSV file in a text editor to confirm that the fields are correctly delimited (usually by commas) and that text values are quoted where necessary.
- Check line endings. Files exported on Windows often use CRLF (\r\n) line endings; the LOAD DATA command below assumes Unix-style LF (\n), so either convert the file or adjust LINES TERMINATED BY to match.
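The line-ending conversion can be scripted rather than done in an editor. A small sketch that rewrites a file in place with LF endings:

```python
def normalize_line_endings(path):
    """Rewrite a file in place, converting CRLF (and bare CR)
    line endings to Unix-style LF."""
    with open(path, "rb") as f:
        data = f.read()
    data = data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
    with open(path, "wb") as f:
        f.write(data)
```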
Step 4: Import the CSV File into MySQL
- Access the MySQL command line. Use the command line or your database management tool to connect to MySQL.
- Select the database. If it is not already selected, run USE your_database_name;.
- Disable foreign key checks (if necessary). If your table has foreign key constraints, you may need to disable foreign key checks temporarily with SET FOREIGN_KEY_CHECKS=0;.
- Import the CSV file. Use the LOAD DATA INFILE command. It will look something like this:
LOAD DATA INFILE '/path/to/your/file.csv'
INTO TABLE your_table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES; -- Use this if your CSV file includes a header row
Note: The file path should be the absolute path to the CSV file on the server. If you are importing a file from your local machine into a remote server, use LOAD DATA LOCAL INFILE instead, and make sure the local_infile option is enabled on both the client and the server.
- Re-enable foreign key checks. If you disabled foreign key checks, re-enable them with SET FOREIGN_KEY_CHECKS=1;.
- Check the import. Verify that the data was imported correctly by running a simple SELECT query against the table.
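If LOAD DATA INFILE is not available (for example, on managed hosting where file access or local_infile is restricted), the same import can be done row by row through any Python DB-API driver, such as mysql-connector-python. This is only a sketch under that assumption: the connection, table, and file names are placeholders, and the placeholder argument exists because parameter markers differ between drivers (%s for MySQL drivers, ? for others):

```python
import csv

def load_csv(conn, table, path, placeholder="%s"):
    """Insert every data row of a CSV file into `table`,
    skipping the header row. Works with any DB-API connection;
    pass placeholder="?" for drivers that use qmark style."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(header)
        marks = ", ".join([placeholder] * len(header))
        sql = f"INSERT INTO {table} ({cols}) VALUES ({marks})"
        cur = conn.cursor()
        cur.executemany(sql, list(reader))
        conn.commit()
        return cur.rowcount
```

With a real MySQL connection this would be called with something like conn = mysql.connector.connect(host=..., user=..., password=..., database=...); that connection setup is assumed, not shown.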
Step 5: Verify and Finalize
- Review the data. Check the imported data for anomalies or issues.
- Create indexes. If necessary, create indexes on the table to improve the performance of future queries.
- Test your application. Confirm that any application or service that relies on this data works correctly with the new data.
Notes:
- The LOAD DATA INFILE command may require specific MySQL permissions (such as the FILE privilege), and the MySQL server must be able to read the file.
- Always back up your MySQL database before performing bulk operations that modify data.
- Ensure that the character encoding is consistent between your CSV file and the MySQL database to avoid problems with special characters.
- You may need to convert date formats or other values to match what MySQL expects (for example, dates as YYYY-MM-DD).
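The date conversion mentioned above is easy to script. MySQL's DATE type expects YYYY-MM-DD, while sheets are often exported with dates like MM/DD/YYYY; the default source format below is an assumption to adjust for your data:

```python
from datetime import datetime

def to_mysql_date(value, source_format="%m/%d/%Y"):
    """Convert a date string from the spreadsheet's format to
    MySQL's YYYY-MM-DD. The default source format is an assumption;
    change it to match how your sheet exports dates."""
    return datetime.strptime(value, source_format).strftime("%Y-%m-%d")
```

Apply it to the relevant column while preparing the CSV, before running the import.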
FAQs
What is ETL?
ETL, an acronym for Extract, Transform, Load, is a vital data integration process. It involves extracting data from diverse sources, transforming it into a usable format, and loading it into a database, data warehouse or data lake. This process enables meaningful data analysis, enhancing business intelligence.
What is Google Sheets?
Google Sheets is a cloud-based spreadsheet program that allows users to create, edit, and share spreadsheets online. It is a free alternative to Microsoft Excel and can be accessed from any device with an internet connection. Google Sheets offers a range of features including formulas, charts, and conditional formatting, making it a powerful tool for data analysis and organization. Users can collaborate in real time, making it easy to work on projects with others. Google Sheets also integrates with other Google apps such as Google Drive and Google Forms, making it a versatile tool for personal and professional use.
What data can the Google Sheets API provide?
The Google Sheets API provides access to a wide range of data types that can be used for various purposes. Here are some of the categories of data that can be accessed through the API:
1. Spreadsheet data: This includes the data stored in the cells of a spreadsheet, such as text, numbers, and formulas.
2. Cell formatting: The API allows access to the formatting of cells, such as font size, color, and alignment.
3. Sheet properties: This includes information about the sheet, such as its title, size, and visibility.
4. Charts: The API provides access to the charts created in a sheet, including their data and formatting.
5. Named ranges: This includes the named ranges created in a sheet, which can be used to refer to specific cells or ranges of cells.
6. Filters: The API allows access to the filters applied to a sheet, which can be used to sort and filter data.
7. Comments: This includes the comments added to cells in a sheet, which can be used to provide additional context or information.
8. Permissions: The API allows access to the permissions set for a sheet, including who has access to view or edit the sheet.
What is ELT?
ELT, standing for Extract, Load, Transform, is a modern take on the traditional ETL data integration process. In ELT, data is first extracted from various sources, loaded directly into a data warehouse, and then transformed. This approach enhances data processing speed, analytical flexibility and autonomy.
What is the difference between ETL and ELT?
ETL and ELT are critical data integration strategies with key differences. ETL (Extract, Transform, Load) transforms data before loading, which suits structured data. In contrast, ELT (Extract, Load, Transform) loads data before transformation, which suits large, diverse data sets in modern data warehouses. ELT is increasingly common because it gives data analysts more flexibility and autonomy.