Snowflake is quick to configure, especially with our fully automated Snowflake Worksheet commands. Please note that the process detailed in this document will not work without an active, billing-enabled Snowflake account. If you have not already done so, make sure your billing information in Snowflake is up to date.
Prerequisites
Before running the setup script, you need a few key connection details to configure the Snowflake destination properly:
Snowflake Account Identifier
You should have access to a Snowflake account with ACCOUNTADMIN or equivalent privileges. The Account Identifier can be found in your Snowflake account details (see the query sketch after this list).
Warehouse
The script in the docs will create one if it doesn't exist, but ensure naming conventions and sizes meet your org's policies.
Username, Password, Database & Schema
All will be created in the script. Modify names as needed.
OAuth Integration
You'll need the following values after running the setup script:
Client ID
Client Secret
Authorization URL (your Snowflake Account URL, prefixed with https:// - e.g., https://YOUR-ACCOUNT.snowflakecomputing.com)
These are generated or derived during the setup process. Detailed steps are provided later.
Stage
The script creates a named stage for storing Parquet files. Ensure your environment supports required file formats and S3 policies if using external staging.
S3 Bucket with Permissions
Use the Openbridge CloudFormation template (see below) to create a bucket with required access policies.
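If you are unsure which account identifier to use, you can also look it up from a Snowflake worksheet. A minimal sketch using standard Snowflake context functions; confirm the value against the Account Details page, since that is the identifier referenced throughout this guide:
-- Organization-qualified account identifier, e.g. MYORG-MYACCOUNT
SELECT CURRENT_ORGANIZATION_NAME() || '-' || CURRENT_ACCOUNT_NAME() AS account_identifier;
-- Legacy account locator format
SELECT CURRENT_ACCOUNT() AS account_locator;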
Option 1: Setting Up the S3 Bucket (via CloudFormation)
To support external staging for your Snowflake destination, Openbridge provides a pre-configured AWS CloudFormation template. This template will automatically create an S3 bucket with the necessary read/write permissions for Openbridge.
What It Does:
Creates a secure S3 bucket
Sets required bucket policies
Configures permissions for Openbridge access
Launch the Template:
Use the link below to deploy the CloudFormation stack in your AWS account:
(Make sure you have appropriate AWS IAM privileges to deploy CloudFormation stacks.)
After deployment, you will have:
An S3 bucket ready to use (make sure you save the bucket name)
Permissions already in place; no manual bucket policy changes needed
Option 2: Using an Existing S3 Bucket (Manual Setup)
If you're not using the CloudFormation template and prefer to manage your bucket manually, you'll need to configure two layers of access:
Allow Openbridge to Read/Write:
Add the following policy to your bucket to grant access to Openbridge's IAM role:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowExternalRoleReadWrite",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::360832902187:role/zeroadmin-dev"
},
"Action": [
"s3:GetObject",
"s3:GetObjectVersion",
"s3:PutObject",
"s3:PutObjectAcl",
"s3:DeleteObject",
"s3:DeleteObjectVersion"
],
"Resource": "arn:aws:s3:::my-bucket-name/*"
},
{
"Sid": "AllowExternalRoleListBucket",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::360832902187:role/zeroadmin-dev"
},
"Action": [
"s3:ListBucket",
"s3:ListBucketVersions",
"s3:GetBucketLocation"
],
"Resource": "arn:aws:s3:::my-bucket-name"
}
]
}
Replace my-bucket-name with your actual bucket name.
This policy ensures Openbridge's system can securely interact with the contents of your bucket as needed for data loading into Snowflake.
Required Snowflake Setup Sequence (Applies to All Bucket Setups)
Whether you create your S3 bucket manually or use the Openbridge CloudFormation template, the following Snowflake configuration steps must be completed in this order:
Step 1: Manual Setup (From Snowflake Docs)
Step 1: Configure access permissions for the S3 bucket
Add a policy that allows Snowflake to access your S3 bucket.
Step 2: Create the IAM role in AWS
Create an IAM role Snowflake will assume to access the bucket.
Note: This step is required even if you used our CloudFormation template to create the bucket.
Important: Save AWS Role Details for the Script
During Step 2 of the Snowflake setup (IAM Role creation), make sure to save:
<AWS_ACCOUNT_ID> - Your 12-digit AWS Account ID (from the AWS Console)
<YOUR_IAM_ROLE_NAME> - The IAM Role name you just created
You'll need these for this line in the Openbridge setup script:
SET openbridge_role_arn = 'arn:aws:iam::<AWS_ACCOUNT_ID>:role/<YOUR_IAM_ROLE_NAME>';
Make sure to replace:
<AWS_ACCOUNT_ID> with your actual AWS account ID
<YOUR_IAM_ROLE_NAME> with the name of the IAM role you created
Step 2: Run the Openbridge Automated Setup Script
Once Steps 1 and 2 are complete, run the Openbridge SQL script.
This script automatically handles Step 3 and Step 6 from the Snowflake docs:
You do not need to perform these steps manually.
Before You Run the Script
Make sure to replace all placeholder values with those specific to your Snowflake environment:
YOUR_OPENBRIDGE_DB
YOUR_OPENBRIDGE_SCHEMA
YOUR_OPENBRIDGE_USERNAME
YOUR_OPENBRIDGE_PASSWORD
YOUR_OPENBRIDGE_ROLE
YOUR_OPENBRIDGE_STAGE
YOUR_OPENBRIDGE_WAREHOUSE
YOUR_OPENBRIDGE_ROLE_ARN (exact value from AWS)
YOUR_OPENBRIDGE_S3_BUCKET_NAME (exact S3 bucket name from AWS)
Naming Guidelines:
Use UPPERCASE for all Snowflake object names (see the example after this list)
Use exact casing for:
openbridge_s3_bucket_name (must match bucket name in AWS)
openbridge_role_arn (must match AWS role ARN)
openbridge_password (can be mixed-case)
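For context on the UPPERCASE guideline: unquoted identifiers in Snowflake resolve to uppercase, while double-quoted identifiers preserve their exact case and become case-sensitive. A small illustration with a hypothetical table name (not part of the setup script):
-- Unquoted names resolve to uppercase, so these refer to the same object:
SELECT * FROM my_table;
SELECT * FROM MY_TABLE;
-- A double-quoted lowercase name is a different, case-sensitive object:
SELECT * FROM "my_table";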
Save These Values - You'll Need Them Later in Openbridge
For OAuth Login (during the Openbridge authorization step):
openbridge_username
openbridge_password
For Snowflake Destination Configuration (Step 3 in the UI):
Database
Schema
Warehouse
Stage
Role
S3 Bucket (created using the CloudFormation template earlier)
-- Set variables (Replace placeholders with your actual values)
SET openbridge_database = 'YOUR_OPENBRIDGE_DB'; -- e.g., 'BUXZZ_DATABASE'
SET openbridge_schema = 'YOUR_OPENBRIDGE_SCHEMA'; -- e.g., 'BUXZZ_SCHEMA'
SET openbridge_username = 'YOUR_OPENBRIDGE_USERNAME'; -- e.g., 'BUXZZ_USER'
SET openbridge_password = 'YOUR_OPENBRIDGE_PASSWORD'; -- e.g., '12BUXZZ!@'
SET openbridge_role = 'YOUR_OPENBRIDGE_ROLE'; -- e.g., 'BUXZZ_ROLE'
SET openbridge_stage = 'YOUR_OPENBRIDGE_STAGE'; -- e.g., 'BUXZZ_STAGE'
SET openbridge_warehouse = 'YOUR_OPENBRIDGE_WAREHOUSE';-- e.g., 'BUXZZ_WAREHOUSE'
SET openbridge_warehouse_type = 'STANDARD'; -- snowflake warehouse type
SET openbridge_warehouse_size = 'XSMALL'; -- snowflake warehouse size
SET openbridge_role_arn = 'arn:aws:iam::<AWS_ACCOUNT_ID>:role/<YOUR_IAM_ROLE_NAME>'; -- aws policy role arn
SET openbridge_s3_bucket_name = 'YOUR_S3_BUCKET_NAME'; -- aws bucket name
SET openbridge_namespace = $openbridge_database || '.' || $openbridge_schema;
SET openbridge_s3_uri = 's3://' || $openbridge_s3_bucket_name || '/';
-- Use the ACCOUNTADMIN role to ensure sufficient privileges
USE ROLE ACCOUNTADMIN;
-- Step 1: Setup
-- Create the Openbridge role if it doesn't exist
CREATE ROLE IF NOT EXISTS IDENTIFIER($openbridge_role);
-- Create the Openbridge user if it doesn't exist
CREATE USER IF NOT EXISTS IDENTIFIER($openbridge_username)
PASSWORD = $openbridge_password
DEFAULT_ROLE = $openbridge_role
DEFAULT_WAREHOUSE = $openbridge_warehouse
DEFAULT_NAMESPACE = $openbridge_namespace; -- Use pre-built namespace
-- Grant the role to the user
GRANT ROLE IDENTIFIER($openbridge_role) TO USER IDENTIFIER($openbridge_username);
-- Assign the current user's name to a variable
SET current_username = CURRENT_USER();
-- Grant the role to your current user (for testing purposes)
GRANT ROLE IDENTIFIER($openbridge_role) TO USER IDENTIFIER($current_username);
-- Create the warehouse if it doesn't exist
CREATE WAREHOUSE IF NOT EXISTS IDENTIFIER($openbridge_warehouse)
WAREHOUSE_SIZE = $openbridge_warehouse_size
WAREHOUSE_TYPE = $openbridge_warehouse_type
AUTO_SUSPEND = 60
AUTO_RESUME = TRUE
INITIALLY_SUSPENDED = TRUE;
-- Grant usage on the warehouse to the role
GRANT USAGE ON WAREHOUSE IDENTIFIER($openbridge_warehouse) TO ROLE IDENTIFIER($openbridge_role);
-- Create the database if it doesn't exist
CREATE DATABASE IF NOT EXISTS IDENTIFIER($openbridge_database);
-- Grant usage on the database to the role (without CREATE SCHEMA privilege)
GRANT USAGE ON DATABASE IDENTIFIER($openbridge_database) TO ROLE IDENTIFIER($openbridge_role);
-- Use the new database
USE DATABASE IDENTIFIER($openbridge_database);
-- Create the OAuth2 integration
CREATE SECURITY INTEGRATION IF NOT EXISTS
openbridge_oauth2
TYPE = OAUTH
OAUTH_CLIENT = CUSTOM
OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
OAUTH_REDIRECT_URI = 'https://oauth.api.openbridge.io/oauth/callback'
ENABLED = TRUE
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = 7776000
COMMENT = 'Openbridge OAuth';
-- Create a schema
CREATE SCHEMA IF NOT EXISTS IDENTIFIER($openbridge_schema);
-- Grant privileges on the schema to the role
-- GRANT USAGE ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
GRANT ALL PRIVILEGES ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
GRANT CREATE TABLE, CREATE VIEW, CREATE STAGE, CREATE FILE FORMAT ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
USE SCHEMA IDENTIFIER($openbridge_namespace);
CREATE STORAGE INTEGRATION IF NOT EXISTS openbridge_s3
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = 'S3'
ENABLED = TRUE
STORAGE_AWS_ROLE_ARN = $openbridge_role_arn
STORAGE_ALLOWED_LOCATIONS = ($openbridge_s3_uri);
-- Create the stage
CREATE STAGE IF NOT EXISTS IDENTIFIER($openbridge_stage)
STORAGE_INTEGRATION = openbridge_s3
URL = $openbridge_s3_uri
FILE_FORMAT = (TYPE = 'PARQUET');
GRANT ALL PRIVILEGES ON STAGE IDENTIFIER($openbridge_stage) TO ROLE IDENTIFIER($openbridge_role);
GRANT USAGE ON INTEGRATION openbridge_s3 TO ROLE IDENTIFIER($openbridge_role);
-- Step 2: Switch to the Openbridge role for testing
USE ROLE IDENTIFIER($openbridge_role);
-- Set the context to use the Openbridge warehouse, database, and schema
USE WAREHOUSE IDENTIFIER($openbridge_warehouse);
USE DATABASE IDENTIFIER($openbridge_database);
USE SCHEMA IDENTIFIER($openbridge_schema);
-- Step 3: Test Operations
-- Test 1: Create a table
CREATE OR REPLACE TABLE test_table (
id INT AUTOINCREMENT PRIMARY KEY,
name STRING,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
);
-- Test 2: Insert data into the table
INSERT INTO test_table (name) VALUES
('Alice'),
('Bob'),
('Charlie');
-- Test 3: Query data from the table
SELECT * FROM test_table;
-- Test 4: Update data in the table
UPDATE test_table
SET name = 'Alice Updated'
WHERE name = 'Alice';
-- Test 5: Delete data from the table
DELETE FROM test_table
WHERE name = 'Bob';
-- Test 6: Drop the table
DROP TABLE test_table;
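Optionally, before moving on, you can confirm that the script created the objects and grants you expect. A quick verification sketch, assuming the placeholder names; substitute the names you actually used:
-- Run as ACCOUNTADMIN; replace the names below with your actual values
SHOW GRANTS TO ROLE YOUR_OPENBRIDGE_ROLE;                        -- warehouse, database, schema, and stage grants
SHOW INTEGRATIONS LIKE 'OPENBRIDGE%';                            -- should list openbridge_oauth2 and openbridge_s3
SHOW STAGES IN SCHEMA YOUR_OPENBRIDGE_DB.YOUR_OPENBRIDGE_SCHEMA; -- should list the Openbridge stage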
Step 3: Finalize S3 Permissions for Snowflake Access
After the script completes, continue with:
Step 4: Run this command to get the STORAGE_AWS_ROLE_ARN and the STORAGE_AWS_EXTERNAL_ID that will be used to update your IAM role policy in Step 5:
DESC INTEGRATION openbridge_s3
Step 5: Grant that IAM user access to your bucket
Follow the Snowflake instructions for Step 5 to grant permissions like s3:GetObject, s3:ListBucket, etc.
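For reference, Step 4 is a single worksheet command against the integration created by the script; a minimal sketch:
-- Describe the storage integration created by the setup script
DESC INTEGRATION openbridge_s3;
-- Record the STORAGE_AWS_ROLE_ARN and STORAGE_AWS_EXTERNAL_ID values from the output;
-- Snowflake's Step 5 uses them to update the trust relationship on the IAM role you created earlier.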
Final Step: Retrieve OAuth Credentials
After completing the storage integration steps, retrieve your OAuth credentials by running:
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('OPENBRIDGE_OAUTH2');
This returns a JSON object like:
{
"OAUTH_CLIENT_SECRET_2": "YOUR-CLIENT-SECRET-2",
"OAUTH_CLIENT_SECRET": "YOUR-CLIENT-SECRET",
"OAUTH_CLIENT_ID": "YOUR-CLIENT-ID"
}
You only need the following values for Openbridge:
OAUTH_CLIENT_ID
OAUTH_CLIENT_SECRET
Ignore OAUTH_CLIENT_SECRET_2; it is not used in the setup process.
How to Find the Authorization URL
Snowflake does not explicitly display an "Authorization URL." Instead, you will find your Account URL (also referred to as Server URL or Account Locator) in the Account Details section of the Snowflake UI.
It typically looks like this:
<your-account>.snowflakecomputing.com
To construct the Authorization URL required for Openbridge:
Simply prefix it with https://
https://<your-account>.snowflakecomputing.com
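If you prefer to confirm the endpoint from SQL rather than the UI, you can describe the OAuth security integration created by the setup script. A sketch, assuming the openbridge_oauth2 name from the script (property names in the output can vary slightly by Snowflake version):
-- Describe the OAuth integration to inspect its endpoints
DESC SECURITY INTEGRATION openbridge_oauth2;
-- Look for the OAUTH_AUTHORIZATION_ENDPOINT and OAUTH_TOKEN_ENDPOINT rows in the output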
Allow Openbridge IPs via Network Policy (Optional but Recommended)
Some Snowflake accounts restrict external access through network policies. If your account has a policy or requires one, you must allow Openbridge's IP addresses to ensure successful authorization and data delivery.
Openbridge IP addresses:
52.2.68.68/32
52.54.227.22/32
You can configure this in one of two ways:
Option 1: Simple Network Policy (Legacy Method)
-- Use ACCOUNTADMIN for required privileges
USE ROLE ACCOUNTADMIN;
-- Create a new network policy with Openbridge IPs
CREATE NETWORK POLICY openbridge_ip_whitelist
ALLOWED_IP_LIST = ('52.2.68.68/32', '52.54.227.22/32');
-- OR, modify your existing policy
ALTER NETWORK POLICY <your_current_network_policy_name>
SET ALLOWED_IP_LIST = ('52.2.68.68/32', '52.54.227.22/32');
Option 2: Using Network Rule + Policy (Recommended)
-- Create a network rule for Openbridge IPs
CREATE OR REPLACE NETWORK RULE openbridge_ip_rule
TYPE = IPV4
MODE = INGRESS
VALUE_LIST = ('52.2.68.68/32', '52.54.227.22/32');
-- Link the rule to a new policy
CREATE OR REPLACE NETWORK POLICY openbridge_access_policy
ALLOWED_NETWORK_RULE_LIST = ('openbridge_ip_rule');
Verify Your Settings
-- Check your policy's allowed IPs
DESC NETWORK POLICY openbridge_access_policy;
-- Verify which policy is active on your account
SHOW PARAMETERS LIKE 'network_policy' IN ACCOUNT;
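Note that creating a network policy does not, by itself, apply it to your account. If you want it enforced account-wide (and no other policy is already in place), the activation step looks like the sketch below, assuming the openbridge_access_policy name from Option 2. Make sure your own IPs are also in the allowed list before activating, or you can lock yourself out.
-- Apply the policy at the account level (requires ACCOUNTADMIN or SECURITYADMIN)
ALTER ACCOUNT SET NETWORK_POLICY = 'openbridge_access_policy';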
For more info, see the Snowflake Network Policies documentation.
Once Setup is Complete
In the Openbridge UI:
Step 1: Confirm your Snowflake setup is ready.
Step 2:
Option A: Use Existing Authorization
If you have a saved authorization, select it from the list.
Option B: Create New Authorization
Choose Authorize → Select "Bring Your Own App" → Enter Client ID, Secret, and Authorization URL → Authorize via direct Snowflake login.
(Once complete, your Snowflake identity will appear and can be selected.)
Step 3: Fill in:
Snowflake Account Identifier (can be found in your Snowflake account details)
Database
Schema
Warehouse
Stage
Role
S3 Bucket
Step 4: Name your destination and click Save.
You're done! The Snowflake destination is now fully configured using the latest secure and scalable methods.