How to load data from Notion to S3

Learn how to use Airbyte to synchronize your Notion data into S3 within minutes.


Building your pipeline or using Airbyte

Airbyte is the only open-source solution empowering data teams to meet all their growing custom business demands in the new AI era.

Building in-house pipelines
Bespoke pipelines are:
  • Prone to inconsistent and inaccurate data
  • Laborious and expensive to build and maintain
  • Brittle and inflexible
Furthermore, you will need to build and maintain Y x Z pipelines with Y sources and Z destinations to cover all your needs.
After Airbyte
Airbyte connections are:
  • Reliable and accurate
  • Extensible and scalable for all your needs
  • Deployed and governed your way
All your pipelines in minutes, however custom they are, thanks to Airbyte’s connector marketplace and AI Connector Builder.

Start syncing with Airbyte in 3 easy steps within 10 minutes

Set up a Notion connector in Airbyte

Connect to Notion or one of 400+ pre-built or 10,000+ custom connectors through simple account authentication.

Set up S3 for your extracted Notion data

Select S3 as the destination where you want to load data from your Notion source. You can also choose other cloud data warehouses, databases, data lakes, vector databases, or any other supported Airbyte destination.

Configure the Notion to S3 connection in Airbyte

This includes selecting the data you want to extract (streams and columns), the sync frequency, and where in the destination you want the data to be loaded.

Take a virtual tour

Check out our interactive demo and our how-to videos to learn how you can sync data from any source to any destination.

Demo video of Airbyte Cloud

Demo video of AI Connector Builder

What Sets Airbyte Apart

Modern GenAI Workflows

Streamline AI workflows with Airbyte: load unstructured data into vector stores like Pinecone, Weaviate, and Milvus. Airbyte supports RAG transformations with LangChain chunking and embeddings from OpenAI, Cohere, and others, all in one operation.

Move Large Volumes, Fast

Quickly get up and running with a 5-minute setup that supports both incremental and full refreshes, for databases of any size.

An Extensible Open-Source Standard

More than 1,000 developers contribute to Airbyte’s connectors, different interfaces (UI, API, Terraform Provider, Python Library), and integrations with the rest of the stack. Airbyte’s AI Connector Builder lets you edit or add new connectors in minutes.

Full Control & Security

Airbyte secures your data with cloud-hosted, self-hosted or hybrid deployment options. Single Sign-On (SSO) and Role-Based Access Control (RBAC) ensure only authorized users have access with the right permissions. Airbyte acts as a HIPAA conduit and supports compliance with CCPA, GDPR, and SOC2.

Fully Featured & Integrated

Airbyte automates schema evolution for seamless data flow, and utilizes efficient Change Data Capture (CDC) for real-time updates. Select only the columns you need, and leverage our dbt integration for powerful data transformations.

Enterprise Support with SLAs

Airbyte Self-Managed Enterprise comes with dedicated support and guaranteed service level agreements (SLAs), ensuring that your data movement infrastructure remains reliable and performant, and expert assistance is available when needed.

What our users say

Jean-Mathieu Saponaro
Data & Analytics Senior Eng Manager

"The intake layer of Datadog’s self-serve analytics platform is largely built on Airbyte.Airbyte’s ease of use and extensibility allowed any team in the company to push their data into the platform - without assistance from the data team!"

Learn more
Chase Zieman
Chief Data Officer

“Airbyte helped us accelerate our progress by years, compared to our competitors. We don’t need to worry about connectors and focus on creating value for our users instead of building infrastructure. That’s priceless. The time and energy saved allows us to disrupt and grow faster.”

Learn more
Alexis Weill
Data Lead

“We chose Airbyte for its ease of use, its pricing scalability and its absence of vendor lock-in. Having a lean team makes them our top criteria.
The value of being able to scale and execute at a high level by maximizing resources is immense.”

Learn more

How to Sync Notion to S3 Manually

Start by exporting the content you want to migrate
Notion allows you to export individual pages (with their sub-pages) or your entire workspace
On the Notion desktop or web app:

Open the menu (click the ... in the top-right of a page) and choose Export
Select HTML format (with sub-pages and all content) for a complete backup


To export the entire workspace:

Go to Settings → Workspace → Export all workspace content (workspace owners only)


Make sure to include sub-pages and all content (files, images, etc.)
Notion will compile your data and provide a .zip file for download (large exports might be emailed to you)
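
If you prefer to script the extraction instead of using the UI export, the minimal sketch below pulls a single page's blocks through Notion's public API. It assumes a hypothetical internal integration token in the NOTION_TOKEN environment variable and a placeholder PAGE_ID; it is an illustration of the API, not a full exporter.

```python
# Minimal sketch: fetch one Notion page's blocks via the public API as an
# alternative to the UI export. NOTION_TOKEN and PAGE_ID are placeholders.
import json
import os

import requests

NOTION_TOKEN = os.environ["NOTION_TOKEN"]   # internal integration secret (assumed)
PAGE_ID = "your-page-id"                    # hypothetical page ID

headers = {
    "Authorization": f"Bearer {NOTION_TOKEN}",
    "Notion-Version": "2022-06-28",
}

blocks, cursor = [], None
while True:
    params = {"page_size": 100}
    if cursor:
        params["start_cursor"] = cursor
    resp = requests.get(
        f"https://api.notion.com/v1/blocks/{PAGE_ID}/children",
        headers=headers,
        params=params,
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    blocks.extend(data["results"])
    if not data.get("has_more"):
        break
    cursor = data["next_cursor"]

# Save the raw block JSON locally - the API-side equivalent of the HTML export.
with open("notion_page_export.json", "w") as f:
    json.dump(blocks, f, indent=2)
```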

Save the exported .zip file to your computer
Unzip it to a convenient folder
The Notion export will contain:

Pages as files (HTML or Markdown)
Subfolders for embedded images or attachments


After extraction:

Verify the folder structure is intact
Check that you can open the main HTML file to navigate your notes offline
This ensures the export is complete and organized correctly
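
If you want to script this sanity check, a minimal Python sketch (with placeholder paths) could unzip the export and count the pages and attachments:

```python
# Minimal sketch: unzip the Notion export and sanity-check its contents.
# "notion_export.zip" and the target folder are placeholder paths.
import zipfile
from pathlib import Path

archive = Path("notion_export.zip")
target = Path("notion_export")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)

# Count exported pages and attachments so you can confirm nothing is missing.
html_pages = sorted(target.rglob("*.html"))
other_files = [p for p in target.rglob("*") if p.is_file() and p.suffix != ".html"]

print(f"{len(html_pages)} HTML pages, {len(other_files)} attachments/other files")
for page in html_pages[:10]:
    print(page.relative_to(target))
```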

Navigate to the AWS Management Console
Sign in with your AWS credentials
Go to the S3 (Simple Storage Service) section:

Find it under "Storage" services or via the search bar


This will take you to the S3 dashboard where you can manage storage buckets

In the S3 console, click the Create bucket button
Provide a unique name for the bucket (must be globally unique across AWS)

Example: my-notion-backup-2025


Choose an AWS Region (typically one close to you or required by compliance needs)
Leave most settings at their defaults
Ensure "Block all public access" remains enabled (best practice for keeping your bucket private)
Click Create bucket
You should see your new bucket in the list of S3 buckets once created
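
If you prefer the AWS SDK over the console, a minimal boto3 sketch of the same step might look like the following; the bucket name and region are placeholders, and valid AWS credentials are assumed.

```python
# Minimal sketch: create the backup bucket and keep public access blocked,
# mirroring the console steps above. BUCKET and REGION are placeholders.
import boto3

BUCKET = "my-notion-backup-2025"
REGION = "eu-west-1"

s3 = boto3.client("s3", region_name=REGION)

# Outside us-east-1, the region must be passed as a LocationConstraint.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Explicitly keep "Block all public access" enabled (the console default).
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```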

Click on the name of the new bucket to open it
If you prefer to group your Notion data in folders:

Click Create folder
Name it "NotionExport" or similar


This step is optional:

You may upload files directly to the bucket's root
You can drag-and-drop your entire export folder, which preserves internal folders automatically

In your S3 bucket, click the Upload button
In the upload dialog:

Drag and drop the entire folder containing your extracted Notion data, or
Use the Add files / Add folder buttons to select files and folders manually


Ensure all subfolders (for images, sub-pages, etc.) are included
Click Upload at the bottom of the dialog to begin the transfer
Optional: Enable Bucket Versioning before uploading (recommended for backups)
The console will show progress and display a success message when complete
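
The upload can also be scripted. The minimal boto3 sketch below walks the extracted export folder and uploads every file, preserving the folder structure as key prefixes; the bucket name, prefix, and local path are placeholders.

```python
# Minimal sketch: upload the extracted Notion export to S3, keeping the
# folder structure as key prefixes. All names below are placeholders.
from pathlib import Path

import boto3

BUCKET = "my-notion-backup-2025"
PREFIX = "NotionExport"            # optional "folder" inside the bucket
local_root = Path("notion_export")

s3 = boto3.client("s3")

for path in local_root.rglob("*"):
    if path.is_file():
        key = f"{PREFIX}/{path.relative_to(local_root).as_posix()}"
        s3.upload_file(str(path), BUCKET, key)
        print(f"uploaded {key}")
```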

After upload completion, verify all files and folders are present in the S3 bucket
You should see the same folder names and file names as in your extracted export
To check file integrity:

Select one of the files (e.g., index.html)
Use the Download or Open option in the Actions menu
Open the downloaded file on your computer
Confirm the content displays as expected
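
To spot-check the upload programmatically rather than through the console, a minimal boto3 sketch (reusing the placeholder names from the previous sketches) could compare local files against the objects in the bucket:

```python
# Minimal sketch: compare the local export with the uploaded objects and
# spot-check one file's size. Names are placeholders from the earlier sketches.
from pathlib import Path

import boto3

BUCKET = "my-notion-backup-2025"
PREFIX = "NotionExport"
local_root = Path("notion_export")

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

remote_sizes = {}
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        remote_sizes[obj["Key"]] = obj["Size"]

local_files = [p for p in local_root.rglob("*") if p.is_file()]
print(f"local files: {len(local_files)}, objects in S3: {len(remote_sizes)}")

# Spot-check a single file: sizes should match byte for byte.
sample = local_files[0]
sample_key = f"{PREFIX}/{sample.relative_to(local_root).as_posix()}"
head = s3.head_object(Bucket=BUCKET, Key=sample_key)
print(sample_key, "size match:", head["ContentLength"] == sample.stat().st_size)
```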

By default, uploaded files are private (only you and authorized AWS users can access them)
Ensure you haven't made objects public unintentionally
For sensitive or personal data, keep these default settings
If sharing content or hosting as a static website:

Adjust permissions carefully (disable block public access, set specific objects to public)
Enable static website hosting if needed
Only do this for content you intend to be public
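
To confirm from code that public access is still blocked, a short boto3 sketch (placeholder bucket name) can read the bucket's public access settings:

```python
# Minimal sketch: verify that "Block all public access" is still enabled.
import boto3

BUCKET = "my-notion-backup-2025"
s3 = boto3.client("s3")

config = s3.get_public_access_block(Bucket=BUCKET)["PublicAccessBlockConfiguration"]
for setting, enabled in config.items():
    print(f"{setting}: {enabled}")

if not all(config.values()):
    print("Warning: some public access protections are disabled on this bucket.")
```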

Consider enabling additional features for data durability and security:

Enable bucket versioning via the bucket Properties tab
Versioning keeps previous versions when objects are overwritten or deleted


Encryption:

S3 automatically applies server-side encryption with AWS-managed keys (SSE-S3)
Verify this in object properties (encryption details)
For stricter requirements, configure a bucket-level default encryption policy


With versioning, encryption, and blocked public access, your Notion data backup is safely stored
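
Both settings can also be applied with boto3. The sketch below (placeholder bucket name) enables versioning and sets SSE-S3 as the bucket-level default encryption:

```python
# Minimal sketch: enable versioning and default SSE-S3 encryption on the
# backup bucket. BUCKET is a placeholder.
import boto3

BUCKET = "my-notion-backup-2025"
s3 = boto3.client("s3")

# Keep previous versions when objects are overwritten or deleted.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Set SSE-S3 (AES256) as the bucket-level default; new buckets already apply
# this, but setting it explicitly documents the requirement.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```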

FAQs

ETL, an acronym for Extract, Transform, Load, is a vital data integration process. It involves extracting data from diverse sources, transforming it into a usable format, and loading it into a database, data warehouse or data lake. This process enables meaningful data analysis, enhancing business intelligence.

Notion is an all-in-one workspace that allows users to organize their personal and professional lives in one place. It combines features of note-taking apps, project management tools, and databases to create a customizable and flexible platform. Users can create pages, databases, and boards to manage tasks, projects, and information. Notion also offers a variety of templates and integrations with other apps to enhance productivity. Its user-friendly interface and collaborative features make it a popular choice for individuals and teams looking to streamline their workflows and stay organized.

Notion's API provides access to a wide range of data types, including:  

1. Pages: This includes all the pages in a Notion workspace, including their properties and content.  
2. Databases: Notion's databases are a powerful way to organize and manage data. The API provides access to all the databases in a workspace, including their properties and content.  
3. Blocks: Notion's blocks are the building blocks of pages and databases. The API provides access to all the blocks in a workspace, including their content and properties.  
4. Users: Notion's API provides access to information about the users in a workspace, including their name, email address, and profile picture.  
5. Workspaces: The API provides access to information about the workspaces themselves, including their name and ID.  
6. Integrations: Notion's API allows developers to create integrations with other tools and services, such as Slack or Zapier.  

Overall, Notion's API provides a comprehensive set of tools for accessing and manipulating data within a workspace, making it a powerful platform for building custom applications and workflows.
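
As an illustration of working with these data types, the minimal sketch below queries a single database through the public API; the integration token and DATABASE_ID are placeholders.

```python
# Minimal sketch: query one Notion database via the public API, illustrating
# the "Databases" data type above. NOTION_TOKEN and DATABASE_ID are placeholders.
import os

import requests

NOTION_TOKEN = os.environ["NOTION_TOKEN"]
DATABASE_ID = "your-database-id"

resp = requests.post(
    f"https://api.notion.com/v1/databases/{DATABASE_ID}/query",
    headers={
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
    },
    json={"page_size": 10},
    timeout=30,
)
resp.raise_for_status()

for row in resp.json()["results"]:
    # Each result is a page object; its properties hold the database columns.
    print(row["id"], list(row["properties"].keys()))
```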

This can be done by building a data pipeline manually, usually with a Python script (you can leverage a tool such as Apache Airflow for orchestration). This process can take more than a full week of development. Or it can be done in minutes with Airbyte in three easy steps:
1. Set up Notion as a source connector (using OAuth or an API key)
2. Choose a destination (more than 50 available databases, data warehouses or lakes) to sync data to, and set it up as a destination connector
3. Define which data you want to transfer from Notion to S3 and how frequently
You can choose to self-host the pipeline using Airbyte Open Source or have it managed for you with Airbyte Cloud.

ELT, standing for Extract, Load, Transform, is a modern take on the traditional ETL data integration process. In ELT, data is first extracted from various sources, loaded directly into a data warehouse, and then transformed. This approach enhances data processing speed, analytical flexibility and autonomy.

ETL and ELT are critical data integration strategies with key differences. ETL (Extract, Transform, Load) transforms data before loading, ideal for structured data. In contrast, ELT (Extract, Load, Transform) loads data before transformation, perfect for processing large, diverse data sets in modern data warehouses. ELT is becoming the new standard as it offers a lot more flexibility and autonomy to data analysts.

What should you do next?

We hope you enjoyed this article. Here are three ways we can help you in your data journey:

Easily address your data movement needs with Airbyte Cloud
Take the first step towards extensible data movement infrastructure that will give a ton of time back to your data team. 
Get started with Airbyte for free
Talk to a data infrastructure expert
Get a free consultation with an Airbyte expert to significantly improve your data movement infrastructure. 
Talk to sales
Improve your data infrastructure knowledge
Subscribe to our monthly newsletter and get the community’s new enlightening content along with Airbyte’s progress in its mission to solve data integration once and for all.
Subscribe to newsletter