Guide to Activating the Airbridge Service with Airbyte Data Sources and Connectors

Step-by-step example showing how to activate Stripe as the Source and Amazon S3 as the Destination

Written by Openbridge Support

In this guide, we will walk through how to activate the Airbridge service using an Airbyte data source and connectors. The guide uses Stripe as the source and Amazon S3 as the destination. While Stripe and S3 serve as our reference points, other sources and destinations follow the same steps detailed below.

Before You Begin: Prerequisites

Airbridge operates exclusively within your Amazon Web Services (AWS) account, ensuring the environment is both private and trusted. Pipelines always run in your own secure AWS environment, so you retain complete authority over the processes.

There are two steps to configuring AWS: (1) run a ready-to-go CloudFormation template that preps your environment, and (2) ensure you have an EC2 instance running that will execute your pipelines.

The outcome of these two steps is a configured AWS environment, ready to run your Airbyte source and destination connectors.
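
For orientation, here is a minimal sketch of what step (1) can look like from the AWS CLI. The stack name and template URL below are placeholders, not the actual Airbridge artifacts; use the template link provided during setup:

```bash
# Launch the CloudFormation stack (stack name and template URL are
# illustrative placeholders; use the template provided during setup).
aws cloudformation create-stack \
  --stack-name airbridge-prereqs \
  --template-url https://example.com/airbridge-template.yaml \
  --capabilities CAPABILITY_NAMED_IAM

# Wait for creation to finish, then confirm the stack succeeded.
aws cloudformation wait stack-create-complete --stack-name airbridge-prereqs
aws cloudformation describe-stacks --stack-name airbridge-prereqs \
  --query "Stacks[0].StackStatus"
```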

Once the above steps are complete, you will have the three data points used during pipeline setup:

  • AWS S3 Bucket Name: your_bucket_name

  • AWS Role Arn: your_role_arn

  • AWS Region: your_region

Make sure you have these ready!
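
If you want to sanity-check these values before continuing, a quick sketch using the AWS CLI (the bucket and role names are placeholders):

```bash
# Confirm the bucket exists and note its region (placeholder bucket name).
aws s3api get-bucket-location --bucket your_bucket_name

# Confirm the IAM role exists and copy its ARN (placeholder role name).
aws iam get-role --role-name your_role_name --query "Role.Arn"
```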

1. Select Your Source: Stripe

Log into your Openbridge account. Select "Sources" and then search for "stripe":

Select Activate

Next, you will see the pre-flight checklist. If you have created an AWS account and run the CloudFormation template as described in the prerequisites, check all the selections and click Let's Start.

2. Set Your Source and Destination

In this step, you will see inputs for two Docker image names: one for the source and one for the destination.

A quick introduction to source and destination Docker images:

  1. Source: the platform from which data is extracted.

  2. Destination: where the data extracted from the source will be stored.

Since we use Stripe in this example, the source Docker image name is pre-populated in the form (i.e., airbyte/source-stripe). For the destination we use Amazon S3, so the destination Docker image name is airbyte/destination-s3. The form will look like the image below.
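
Before you continue, you can optionally verify the image names locally. Every Airbyte connector image responds to the spec command by printing the JSON schema of the configuration it expects; this is standard Airbyte behavior, not an Airbridge requirement:

```bash
# Pull the source and destination connector images from Docker Hub.
docker pull airbyte/source-stripe
docker pull airbyte/destination-s3

# Each Airbyte connector prints the JSON schema of its expected
# configuration when invoked with the `spec` command.
docker run --rm airbyte/source-stripe spec
docker run --rm airbyte/destination-s3 spec
```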

Click the Continue button once all the details are filled in.

3. Upload Your Source and Destination Configurations

In this step, you will see that Airbyte requires three distinct configurations: Source Configuration, Source Data Catalog, and Destination Configuration. All of them are JSON files.

A quick introduction to configurations:

  • Source Configuration: reflects your authorization credentials for the source.

  • Source Data Catalog: reflects the datasets and schemas defined by the source connector.

  • Destination Configuration: reflects your authorization credentials for the destination.

In our case, we need a source configuration and a source data catalog for Stripe, and a destination configuration for Amazon S3. Download the reference configs linked for each and fill in the fields per your requirements; illustrative sketches of all three files follow below.
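
For orientation, here is a minimal sketch of what each file might contain. Field names follow the public Airbyte connector specs for source-stripe and destination-s3 at the time of writing, but treat every value as a placeholder and defer to the reference configs above:

```bash
# Sketch of source_config.json for airbyte/source-stripe. Field names per
# the public Airbyte spec; all values are placeholders.
cat > source_config.json <<'EOF'
{
  "account_id": "acct_XXXXXXXXXXXX",
  "client_secret": "sk_live_XXXXXXXXXXXX",
  "start_date": "2023-01-01T00:00:00Z"
}
EOF

# Sketch of catalog.json: a ConfiguredAirbyteCatalog selecting a single
# stream. A real catalog lists every stream you want to sync.
cat > catalog.json <<'EOF'
{
  "streams": [
    {
      "stream": {
        "name": "charges",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh", "incremental"]
      },
      "sync_mode": "incremental",
      "destination_sync_mode": "append"
    }
  ]
}
EOF

# Sketch of destination_config.json for airbyte/destination-s3, writing
# newline-delimited JSON into the bucket from the prerequisites.
cat > destination_config.json <<'EOF'
{
  "s3_bucket_name": "your_bucket_name",
  "s3_bucket_path": "stripe",
  "s3_bucket_region": "your_region",
  "format": { "format_type": "JSONL" }
}
EOF
```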

Once complete, save and upload them. Afterward, your screen will look like this:

Click on the Continue button.

4. Schedule Your Pipeline

In this step, you can define Data Pipeline Scheduling. Scheduling defines the cadence at which the data pipeline will run.

A quick introduction to Data Pipeline Scheduling:

Data source APIs are not unlimited resources; they have restrictions. If you set a schedule that exceeds those limits, the source API will block, throttle, or fail your requests. A schedule should be based only on the recommended frequencies for a given data source.

In this example, we will use the pre-defined schedule that runs the Stripe pipeline every 1 hour.
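
For readers who think in cron terms, that hourly preset corresponds to the standard cron expression sketched below. This is illustrative only; the Airbridge wizard offers preset schedules rather than a raw cron field:

```bash
# "Every 1 hour" as a standard five-field cron expression:
# minute hour day-of-month month day-of-week. Illustrative only;
# the Airbridge wizard uses preset schedules, not raw cron strings.
HOURLY_CRON="0 * * * *"
```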

Always confirm your desired schedule aligns with when the source says data is available and the frequency at which it can be requested.
Click on Continue once you set the scheduling.

5. Provide Your AWS Configuration

Earlier, in the prerequisites, we asked that you have the AWS S3 Bucket Name, AWS Role ARN, and AWS Region at the ready. Enter this information in this step as shown below:
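
For reference, the expected shape of each value, with illustrative placeholders (the bucket name, account ID, and role name below are made up; use the outputs from your own CloudFormation run):

```bash
# Illustrative placeholders only; substitute your own values.
# An IAM role ARN always follows the pattern:
#   arn:aws:iam::<12-digit-account-id>:role/<role-name>
BUCKET_NAME="my-airbridge-bucket"
ROLE_ARN="arn:aws:iam::123456789012:role/my-airbridge-role"
AWS_REGION="us-east-1"
```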

Click on Continue.

6. Label Your Pipeline

In this wizard step, you can name your pipeline. This name helps you identify the specifics of the pipeline you configured, which is helpful if you have 10 different Stripe pipelines to activate.

Click on Save Your Pipeline.

7. Complete!

After successfully creating your pipeline, you are redirected to a confirmation page, which will look like the image below.

Review

After creating your pipeline, you can see it on the pipeline page, and it will look like the image below.
