Extract data from Stripe’s REST API and send it into Snowflake.
Millions of businesses use Stripe APIs to accept payments, send payouts, and manage their online transactions. However, Stripe’s built-in analytics are limited, offering only a small set of metrics such as gross volume, net sales, new customers, and successful payments.
To improve your Stripe analytics capabilities, you can use Stripe’s REST APIs to extract your Stripe transactions and replicate them into a modern data warehouse such as Snowflake. Snowflake is purpose-built for online analytical processing (OLAP), and once your data is in Snowflake tables it can easily be combined with data from other internal systems, giving you enhanced insights into your business.
One challenge you may come across is that extracting your Stripe data can require skilled Stripe developers and extensive consultation of Stripe’s developer documentation. Furthermore, getting the data out of Stripe is only half the battle – you still need to load it into Snowflake! Luckily, Airbyte makes this easy by providing a Stripe source connector and a Snowflake destination connector, so you can build a data pipeline that moves your Stripe data to Snowflake without repeated trips to Stripe’s developer docs.
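To see the kind of work the connector abstracts away, here is a minimal, hypothetical sketch of paging through a Stripe list endpoint yourself with Python and the `requests` package. The `fetch_all` function name is illustrative (not part of any official SDK), and you would pass your own test-mode secret key.

```python
import requests

STRIPE_API = "https://api.stripe.com/v1"

def fetch_all(resource, secret_key, page_size=100, get=requests.get):
    """Page through a Stripe list endpoint (e.g. 'charges') and return
    every object. Stripe uses cursor pagination: pass the last object's
    id as `starting_after` to fetch the next page."""
    objects, cursor = [], None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["starting_after"] = cursor
        resp = get(
            f"{STRIPE_API}/{resource}",
            params=params,
            auth=(secret_key, ""),  # Stripe uses the secret key as the basic-auth username
        )
        resp.raise_for_status()
        page = resp.json()
        objects.extend(page["data"])
        if not page.get("has_more"):
            return objects
        cursor = page["data"][-1]["id"]

# charges = fetch_all("charges", "sk_test_...")  # hypothetical test-mode key
```

And that only covers one endpoint – a real pipeline must repeat this for every object type, handle rate limits, and track incremental state, which is exactly what the connector does for you.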
You will need an active Stripe account to follow this tutorial. Airbyte requires your Stripe Account ID and Secret Key. You can find the Account ID by visiting your account settings.
If you don’t already have a Snowflake account, you’ll need to pick a Snowflake edition and a cloud provider for your warehouse as part of the account creation process.
The Snowflake dashboard will appear after you create a username and password. The worksheet area is where you’ll run scripts and SQL queries to create, view, and modify resources.
For Airbyte to successfully sync data from Stripe, you need to create a warehouse and a database in Snowflake. Fortunately, Airbyte provides a ready-to-execute script that lets you do this in a few seconds. You can read more about setting up a Snowflake destination in Airbyte’s Snowflake docs.
The main steps in the following script are:
You can copy and run this whole script in a new Snowflake worksheet. Make sure to select “All Queries” when executing the worksheet.
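The script in Airbyte’s docs is the source of truth; as a rough, hypothetical sketch of what it does, the statements below create the warehouse, database, schema, role, and user, then wire up the grants (names match the defaults used in this tutorial, and the exact statements in Airbyte’s script may differ):

```python
# Hedged sketch: these statements mirror the general shape of Airbyte's
# Snowflake setup script. They are illustrative, not the official script.
SETUP_STATEMENTS = [
    "CREATE WAREHOUSE IF NOT EXISTS AIRBYTE_WAREHOUSE "
    "WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;",
    "CREATE DATABASE IF NOT EXISTS AIRBYTE_DATABASE;",
    "CREATE SCHEMA IF NOT EXISTS AIRBYTE_DATABASE.AIRBYTE_SCHEMA;",
    "CREATE ROLE IF NOT EXISTS AIRBYTE_ROLE;",
    "CREATE USER IF NOT EXISTS AIRBYTE_USER PASSWORD = '<change-me>' "
    "DEFAULT_ROLE = AIRBYTE_ROLE DEFAULT_WAREHOUSE = AIRBYTE_WAREHOUSE;",
    "GRANT USAGE ON WAREHOUSE AIRBYTE_WAREHOUSE TO ROLE AIRBYTE_ROLE;",
    "GRANT OWNERSHIP ON DATABASE AIRBYTE_DATABASE TO ROLE AIRBYTE_ROLE;",
    "GRANT ROLE AIRBYTE_ROLE TO USER AIRBYTE_USER;",
]

for stmt in SETUP_STATEMENTS:
    print(stmt)
```

The key idea is that Airbyte gets its own dedicated warehouse, database, role, and user, so its activity is isolated from the rest of your Snowflake account.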
ℹ️ Note that you should rename all the values for the database, warehouse, role, and user variables in the Snowflake setup script. For simplicity and testing purposes, this tutorial uses the default values. The example also uses the sysadmin role; you should assign reduced permissions to the service account before deploying to production.
After you run the script, Snowflake will tell you whether the command was successfully executed. You can visit the warehouse section to confirm that there is a new warehouse with the name AIRBYTE_WAREHOUSE and the configuration that you defined in the script.
Now that you have configured Snowflake, you are ready to create an ELT pipeline from your Stripe source to your Snowflake destination.
You will define a Stripe source which will be used by Airbyte to extract data from the Stripe REST API. Go to sources and create a new Stripe source from the dropdown.
You can choose whatever name you like for the Stripe source. Use the Account ID and Secret Key that you found earlier in this tutorial. Set the replication date to a date before your Stripe account became operational, so that all of your data is replicated.
Go to the destinations tab in your Airbyte cloud dashboard and select Snowflake as a new destination.
You will need the following details:
Enter the data as follows:
For the purposes of this tutorial, do not set JDBC URL Params.
ℹ️ JDBC URL Params is an optional property that may be used to pass custom key-value pairs when connecting to Snowflake.
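For illustration only (this tutorial leaves the field blank), the value is a string of `key=value` pairs joined by `&`. The snippet below assembles one in Python; the two parameter names shown are example Snowflake session parameters, and you would substitute whatever your environment requires.

```python
# Hypothetical example of assembling a JDBC URL Params string.
# The parameter names here are illustrative Snowflake session parameters.
params = {"CLIENT_SESSION_KEEP_ALIVE": "true", "QUERY_TAG": "airbyte"}
jdbc_url_params = "&".join(f"{k}={v}" for k, v in params.items())
print(jdbc_url_params)  # CLIENT_SESSION_KEEP_ALIVE=true&QUERY_TAG=airbyte
```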
The Data Staging Method defines where the data is stored (or staged) so that Snowflake can efficiently load it into tables. By default, Airbyte uses Snowflake’s Internal Stage to load data. You can also choose to load data from Amazon S3 or any other cloud host. You can read more about setting up a data loading method in Airbyte’s docs.
Create the destination. This may take a few seconds while Airbyte verifies that it can connect and authenticate.
The final step to create a Stripe data pipeline to Snowflake involves creating the connection between the source and destination. Go to the Connections tab to create a new connection. Select the Stripe source and Snowflake destination that you just created.
You will then see the following screen:
Choose a name for this connection, and choose your desired replication frequency. If you want to use the same naming as the source data in Stripe, then choose the “mirror source structure” option for your namespace configuration. Alternatively, you may define a prefix to be used in the destination.
Once you are satisfied with your configuration, press Set up connection and wait for the first sync to complete. Syncing all the namespaces can take a few minutes.
Once the sync has completed, you can look at the Databases tab in the Snowflake dashboard to confirm that the data has been replicated. You should see a page similar to the following:
You can now run advanced analytics and execute custom SQL queries on your Stripe data in Snowflake!
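As a hypothetical starting point, the query below computes monthly gross charge volume. The `CHARGES` table and column names reflect Airbyte’s Stripe streams but may differ in your schema, the connection parameters are placeholders, and the optional `snowflake-connector-python` package is one of several ways to run it.

```python
# Hypothetical analytics query over the synced Stripe data. Stripe amounts
# are in the smallest currency unit (e.g. cents) and `created` is a Unix
# timestamp, hence the division and TO_TIMESTAMP below.
MONTHLY_VOLUME_SQL = """
SELECT DATE_TRUNC('month', TO_TIMESTAMP(created)) AS month,
       currency,
       SUM(amount) / 100.0 AS gross_volume
FROM AIRBYTE_DATABASE.AIRBYTE_SCHEMA.CHARGES
WHERE status = 'succeeded'
GROUP BY 1, 2
ORDER BY 1;
"""

def run(sql):
    """Sketch: execute a query with snowflake-connector-python.
    Connection parameters below are placeholders, not real credentials."""
    import snowflake.connector  # pip install snowflake-connector-python
    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="AIRBYTE_USER",
        password="<password>",
        warehouse="AIRBYTE_WAREHOUSE",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            return cur.fetchall()
    finally:
        conn.close()

# rows = run(MONTHLY_VOLUME_SQL)
```

You could just as easily paste the SQL into a Snowflake worksheet, or join the Stripe tables against data from your other internal systems.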
Moving data from Stripe into Snowflake gives you powerful analytics capabilities and a full picture of your business. However, interacting with the Stripe REST API may require expert Stripe developers who know how to navigate Stripe’s developer docs, and loading data into Snowflake may require an understanding of Snowflake best practices along with engineering resources and developer time. Alternatively, you can use Airbyte to easily move your data from Stripe to Snowflake, as you have done in this tutorial. Specifically, in this tutorial you have learned how to:
If you have enjoyed this tutorial, you may be interested in other Airbyte tutorials, or in Airbyte’s blog. You can also join the conversation on our community Slack channel, participate in discussions on Airbyte’s Discourse, or sign up for our newsletter. Furthermore, if you are interested in Airbyte as a fully managed service, you can try Airbyte Cloud for free!
Get all your ELT data pipelines running in minutes with Airbyte.