In this guide, we will show you how simple it is to add drag-and-drop loading of data to Amazon Redshift, Amazon Redshift Spectrum, Amazon Athena or Google BigQuery via Openbridge Data Pipelines. This guide will show you how to mount the Openbridge server so it appears as a disk on your computer.
Let's get started!
Getting Started: The Prerequisites
In this guide, we are using a macOS tool called Transmit. This guide will show you how to use Transmit to mount your Openbridge Data Pipeline so you can upload CSV files into your destination warehouse.
Here are a few things you need to take care of in advance:
- First, set up a Data Pipeline in the Openbridge system. Setting up a Data Pipeline is a quick and simple process.
- Second, install the Transmit software if you have not done so already.
- Lastly, install the required Transmit disk software.
We will publish another guide for other tools and operating systems shortly.
Step 1: Launch Transmit
After you have completed the setup of your data pipeline at Openbridge, launch the Transmit software and create a new connection to an SFTP server. Give the connection a name of your choice.
All set? Click “Next”
Step 2: Enter Openbridge Connection Details
In this step, you need to enter the connection details provided to you in your Openbridge data pipeline subscription:
- Hostname: pipeline-01.openbridge.io
- Port: 443 (port 443 is often used in the event a corporate firewall blocks outbound connections on the standard SFTP port)
Enter Openbridge server connection details
After you have completed entering the connection information, click “Save”
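If you would like to sanity-check the connection details outside of Transmit, the same endpoint can be reached with the standard `sftp` command-line client. The username below is a placeholder; use the credentials from your Openbridge subscription. This sketch simply prints the command so you can copy and run it interactively:

```shell
# Connection details from this guide; the username is a placeholder.
HOST="pipeline-01.openbridge.io"
PORT=443
USER="your-openbridge-username"

# Print the equivalent sftp command; run it yourself to test the connection.
echo "sftp -P ${PORT} ${USER}@${HOST}"
```

Running the printed command should prompt for your password and drop you into an `sftp>` session if the details are correct.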
Step 3: Mount Your Openbridge Server as a Disk
The next step is to mount your newly configured Openbridge server as a disk. Once the server configuration is saved, it will appear on the right side of the Transmit interface. Highlight the Openbridge server, then select “Servers” in the Transmit menu and choose the “Mount as Disk” option.
Mount your server as a “Disk”
Once mounted, you should be able to see your new drive listed in the macOS Finder:
Your server is now a “Disk”
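If you prefer the terminal to the Finder, you can confirm the mount from the command line: mounted volumes on macOS appear under `/Volumes`. The volume name below is an assumption; list `/Volumes` to see the actual name Transmit used on your machine.

```shell
# Assumed volume name; run `ls /Volumes` to find the real one on your machine.
MOUNT_POINT="/Volumes/pipeline-01.openbridge.io"

if [ -d "${MOUNT_POINT}" ]; then
  echo "mounted"
else
  echo "not mounted"
fi
```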
If you open your drive, you will see your
loyalty_purchases data pipeline directory.
Your CSV files loaded to the loyalty_purchases directory will be routed to a corresponding table in your destination data warehouse. The image below shows an example of dragging and dropping salesforce_click CSV files to your drive for loading to a destination data warehouse.
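As a sketch of the same delivery without the Finder, the `sftp` client can upload a file into the pipeline directory in batch mode. The username and CSV filename below are placeholders; the hostname and loyalty_purchases directory are the ones used in this guide.

```shell
# Write an sftp batch file that changes into the pipeline directory
# and uploads a CSV file (placeholder filename).
cat > /tmp/openbridge_batch.txt <<'EOF'
cd loyalty_purchases
put salesforce_click.csv
EOF

# Print the command to run; substitute your own username before executing.
echo "sftp -P 443 -b /tmp/openbridge_batch.txt your-openbridge-username@pipeline-01.openbridge.io"
```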
That's it! You have now activated your data pipeline by mounting the Openbridge server in Transmit. Now you can start delivering CSV files to your destination data warehouse!