
Snowflake Migration Guide

Upgrading Existing Destinations to OAuth Authentication

Written by Openbridge Support

This guide is for customers who already have a Snowflake destination configured in Openbridge and wish to migrate to the new OAuth-based authentication model.

The migration process lets you reuse your existing Snowflake infrastructure (warehouse, database, schema, etc.) while enabling the new secure OAuth workflow.


Important Notices

  • Backup first! Do not run the migration script on production infrastructure that you have not backed up.

  • Openbridge is not responsible for any issues or data loss that may occur during this process.

  • The Openbridge migration script is non-destructive — it is written to detect and reuse existing objects in Snowflake without overwriting them. However, caution and backups are still strongly recommended.


Pre-Migration Cleanup

Before starting, log in to your Snowflake account and run the following command to check for and remove any leftover test tables that may have been created during your previous setup:

-- Run this in the database and schema used by your existing Openbridge destination
DROP TABLE IF EXISTS test_table;

Why:
These tables were created during the original Snowflake destination setup as part of connection testing. Keeping them may cause conflicts during the migration process.
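If you want to confirm whether such a table exists in the first place, the check below lists any leftover test table; BUXZZ_DATABASE and BUXZZ_SCHEMA are placeholders for your own Openbridge database and schema names:

-- List any leftover test table in the schema used by your existing destination (placeholder names)
SHOW TABLES LIKE 'TEST_TABLE' IN SCHEMA BUXZZ_DATABASE.BUXZZ_SCHEMA;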


Step 1: Set Up the S3 Bucket

There are two options for providing the bucket. You can use either of the following:

  • Option 1: Set Up a New S3 Bucket

  • Option 2: Use an Existing S3 Bucket

Option 1: Set Up a New S3 Bucket (via CloudFormation)

Openbridge provides a preconfigured AWS CloudFormation template that automatically creates an S3 bucket with the required read/write permissions for Snowflake external staging.

Launch the Template:

Use the provided CloudFormation template to deploy the bucket in your AWS account.

(Ensure you have sufficient IAM permissions to create CloudFormation stacks.)

Steps:

  1. Sign in to AWS Console

  2. Navigate to CloudFormation → Create Stack → With new resources (standard)

  3. Choose Upload a template file or Use a template URL

  4. Provide the Openbridge template

  5. Enter:

    • Stack Name

    • Bucket Name

  6. Click Next → Create Stack

CloudFormation will automatically create the bucket with all required permissions. Once the stack completes, you will have:

  • An S3 bucket ready for use — Save this bucket name for later steps.

  • AWS Region — The bucket is created in the Region you select.

    • This Region must match the Region you enter later in the Openbridge setup.

    • Snowflake does not support cross-region S3 buckets.

Option 2: Use an Existing S3 Bucket (Manual Setup)

If you're not using the CloudFormation template and prefer to manage your bucket manually, you’ll need to configure two layers of access: read/write access for Openbridge’s IAM role (below) and access for Snowflake’s storage integration (handled in the later steps).


Allow Openbridge to Read/Write:

Add the following policy to your existing S3 bucket to grant Openbridge’s IAM role the required read/write access. This allows Openbridge to securely interact with your bucket for data loading into Snowflake.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowExternalRoleReadWrite",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::791778434480:role/zeroadmin-production"
      },
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion"
      ],
      "Resource": "arn:aws:s3:::my-bucket-name/*"
    },
    {
      "Sid": "AllowExternalRoleListBucket",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::791778434480:role/zeroadmin-production"
      },
      "Action": [
        "s3:ListBucket",
        "s3:ListBucketVersions",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::my-bucket-name"
    }
  ]
}

Replace my-bucket-name with your actual bucket name. Be sure to save both the bucket name and the bucket’s Region for later steps.

Make sure the bucket’s AWS region matches the region configured for your Snowflake STORAGE INTEGRATION. Buckets in different regions cannot be used.


Note: Whether you create the S3 bucket manually or with the Openbridge CloudFormation template, complete the following Snowflake configuration steps in order:

Step 2: Create an IAM Policy & Role in AWS

Create the IAM policy and role by following the corresponding steps in Snowflake’s official S3 storage-integration documentation (the same guide referenced later for the trust-policy update). When granting permissions, ensure that the Snowflake integration is created in the same AWS Region as the S3 bucket.

📌 Important: Save Your AWS Role ARN

During Step 2 (IAM role creation), copy or save the full ARN of the IAM role you created. You’ll need this ARN later when you run the Openbridge setup script.



Step 3: Configure Your Existing Snowflake Environment for OAuth

The same Openbridge setup script used for new destinations can also be used to migrate an existing one.

It will safely create missing objects and integrate OAuth authentication into your environment.

Use the Same Database, Schema, and Warehouse

Use your existing values from your current Openbridge destination.

If you’re unsure, log in to your Openbridge account, open your existing Snowflake destination, and note these fields.

SET openbridge_database = 'YOUR_OPENBRIDGE_DB'; -- e.g., 'BUXZZ_DATABASE'  
SET openbridge_schema = 'YOUR_OPENBRIDGE_SCHEMA'; -- e.g., 'BUXZZ_SCHEMA'
SET openbridge_warehouse = 'YOUR_OPENBRIDGE_WAREHOUSE'; -- e.g., 'BUXZZ_WAREHOUSE'
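If you prefer to confirm these names directly in Snowflake rather than in the Openbridge UI, the SHOW commands below list the existing objects so you can copy the exact names; the BUXZZ pattern and database name are placeholders:

-- List existing objects so you can copy the exact names into the SET statements above
SHOW DATABASES LIKE '%BUXZZ%';
SHOW SCHEMAS IN DATABASE BUXZZ_DATABASE;
SHOW WAREHOUSES LIKE '%BUXZZ%';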

Do Not Reuse Username and Password

Your existing Snowflake destination includes username and password fields, but they are not the values the script asks for. The script creates a new integration user whose credentials are used exclusively to grant Openbridge a secure OAuth token for database access.

Create new ones:

SET openbridge_username = 'YOUR_OPENBRIDGE_USERNAME'; -- e.g., 'BUXZZ_USER'  
SET openbridge_password = 'YOUR_OPENBRIDGE_PASSWORD'; -- e.g., '12BUXZZ!@'

These credentials are used only for the OAuth handshake process, not for direct database logins.
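Before settling on a new username, you can confirm the name is not already in use. This is a quick check assuming you run it as ACCOUNTADMIN (or another role allowed to view users); BUXZZ_USER is a placeholder:

-- Check whether the proposed integration username already exists
SHOW USERS LIKE 'BUXZZ_USER';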

Reusing Your Existing Role

In your previous Snowflake setup, you created a role manually as part of the Openbridge configuration process.

You’ll need to log in to your Snowflake account to confirm what you named that role.
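A quick way to find it is to list the roles in your account, or the roles granted to the service user from your original setup; BUXZZ_USER is a placeholder for that user’s name:

-- List custom roles; the role from your original Openbridge setup should appear here
SHOW ROLES;

-- Or list the roles granted to the original Openbridge service user
SHOW GRANTS TO USER BUXZZ_USER;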

Once you find it, reuse the same role name here:

SET openbridge_role = 'YOUR_EXISTING_ROLE';   -- e.g., 'BUXZZ_ROLE'

Reusing the same role ensures all existing permissions remain valid during migration.

Creating a New Stage

The stage field is new to the OAuth integration.
You don’t need to create it manually — the Openbridge script will create it for you if it doesn’t exist.

SET openbridge_stage = 'YOUR_OPENBRIDGE_STAGE'; -- e.g., 'BUXZZ_STAGE'
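If you want to see whether a stage with that name already exists before running the script, a quick check (the BUXZZ values are placeholders):

-- The script only creates the stage if this returns no rows
SHOW STAGES LIKE 'BUXZZ_STAGE' IN SCHEMA BUXZZ_DATABASE.BUXZZ_SCHEMA;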

Step 4: Run the Official Openbridge Script Using Your Existing Configuration Values

At this stage, you have already defined all required variables (database, schema, warehouse, username, password, role, stage, S3 bucket name, and AWS role ARN).

Before running the script, copy it into your Snowflake worksheet and replace all placeholder variables with values from your Snowflake and AWS environment.

  • YOUR_OPENBRIDGE_DB

  • YOUR_OPENBRIDGE_SCHEMA

  • YOUR_OPENBRIDGE_USERNAME

  • YOUR_OPENBRIDGE_PASSWORD

  • YOUR_OPENBRIDGE_ROLE

  • YOUR_OPENBRIDGE_STAGE

  • YOUR_OPENBRIDGE_WAREHOUSE

  • YOUR_OPENBRIDGE_ROLE_ARN (exact AWS ARN)

  • YOUR_OPENBRIDGE_S3_BUCKET_NAME (exact AWS bucket name)

Make sure to save all variable values you will use in the script, as you will need them when configuring your Snowflake destination in the Openbridge UI.

Note: The openbridge_username and openbridge_password defined in the script will be used later in the Openbridge OAuth authorization step. Do not use your personal Snowflake login for that step.

Update the variables and run the script in your Snowflake worksheet:

-- Set variables (Replace placeholders with your actual values)
SET openbridge_database = 'YOUR_OPENBRIDGE_DB'; -- e.g., 'BUXZZ_DATABASE'
SET openbridge_schema = 'YOUR_OPENBRIDGE_SCHEMA'; -- e.g., 'BUXZZ_SCHEMA'
SET openbridge_username = 'YOUR_OPENBRIDGE_USERNAME'; -- e.g., 'BUXZZ_USER'
SET openbridge_password = 'YOUR_OPENBRIDGE_PASSWORD'; -- e.g., '12BUXZZ!@'
SET openbridge_role = 'YOUR_OPENBRIDGE_ROLE'; -- e.g., 'BUXZZ_ROLE'
SET openbridge_stage = 'YOUR_OPENBRIDGE_STAGE'; -- e.g., 'BUXZZ_STAGE'
SET openbridge_warehouse = 'YOUR_OPENBRIDGE_WAREHOUSE';-- e.g., 'BUXZZ_WAREHOUSE'

SET openbridge_warehouse_type = 'STANDARD'; -- snowflake warehouse type
SET openbridge_warehouse_size = 'XSMALL'; -- snowflake warehouse size

SET openbridge_role_arn = 'arn:aws:iam::<AWS_ACCOUNT_ID>:role/<YOUR_IAM_ROLE_NAME>'; -- aws policy role arn
SET openbridge_s3_bucket_name = 'YOUR_S3_BUCKET_NAME'; -- aws bucket name

SET openbridge_namespace = $openbridge_database || '.' || $openbridge_schema;
SET openbridge_s3_uri = 's3://' || $openbridge_s3_bucket_name || '/';

-- Use the ACCOUNTADMIN role to ensure sufficient privileges
USE ROLE ACCOUNTADMIN;

-- Step 1: Setup
-- Create the Openbridge role if it doesn't exist
CREATE ROLE IF NOT EXISTS IDENTIFIER($openbridge_role);

-- Create the Openbridge user if it doesn't exist
CREATE USER IF NOT EXISTS IDENTIFIER($openbridge_username)
PASSWORD = $openbridge_password
DEFAULT_ROLE = $openbridge_role
DEFAULT_WAREHOUSE = $openbridge_warehouse
DEFAULT_NAMESPACE = $openbridge_namespace; -- Use pre-built namespace

-- Grant the role to the user
GRANT ROLE IDENTIFIER($openbridge_role) TO USER IDENTIFIER($openbridge_username);

-- Assign the current user's name to a variable
SET current_username = CURRENT_USER();

-- Grant the role to your current user (for testing purposes)
GRANT ROLE IDENTIFIER($openbridge_role) TO USER IDENTIFIER($current_username);

-- Create the warehouse if it doesn't exist
CREATE WAREHOUSE IF NOT EXISTS IDENTIFIER($openbridge_warehouse)
WAREHOUSE_SIZE = $openbridge_warehouse_size
WAREHOUSE_TYPE = $openbridge_warehouse_type
AUTO_SUSPEND = 60
AUTO_RESUME = TRUE
INITIALLY_SUSPENDED = TRUE;

-- Grant usage on the warehouse to the role
GRANT USAGE ON WAREHOUSE IDENTIFIER($openbridge_warehouse) TO ROLE IDENTIFIER($openbridge_role);

-- Create the database if it doesn't exist
CREATE DATABASE IF NOT EXISTS IDENTIFIER($openbridge_database);

-- Grant usage on the database to the role (without CREATE SCHEMA privilege)
GRANT USAGE ON DATABASE IDENTIFIER($openbridge_database) TO ROLE IDENTIFIER($openbridge_role);

-- Use the new database
USE DATABASE IDENTIFIER($openbridge_database);

-- Create the OAuth2 integration
CREATE SECURITY INTEGRATION IF NOT EXISTS
openbridge_oauth2
TYPE = OAUTH
OAUTH_CLIENT = CUSTOM
OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
OAUTH_REDIRECT_URI = 'https://oauth.api.openbridge.io/oauth/callback'
ENABLED = TRUE
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = 7776000
COMMENT = 'Openbridge OAuth';

-- Create a schema
CREATE SCHEMA IF NOT EXISTS IDENTIFIER($openbridge_schema);

-- Grant privileges on the schema to the role
-- GRANT USAGE ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
GRANT ALL PRIVILEGES ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
GRANT CREATE TABLE, CREATE VIEW, CREATE STAGE, CREATE FILE FORMAT ON SCHEMA IDENTIFIER($openbridge_schema) TO ROLE IDENTIFIER($openbridge_role);
USE SCHEMA IDENTIFIER($openbridge_namespace);

CREATE STORAGE INTEGRATION IF NOT EXISTS openbridge_s3
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = 'S3'
ENABLED = TRUE
STORAGE_AWS_ROLE_ARN = $openbridge_role_arn
STORAGE_ALLOWED_LOCATIONS = ($openbridge_s3_uri);

-- Create the stage
CREATE STAGE IF NOT EXISTS IDENTIFIER($openbridge_stage)
STORAGE_INTEGRATION = openbridge_s3
URL = $openbridge_s3_uri
FILE_FORMAT = (TYPE = 'PARQUET');
GRANT ALL PRIVILEGES ON STAGE IDENTIFIER($openbridge_stage) TO ROLE IDENTIFIER($openbridge_role);

GRANT USAGE ON INTEGRATION openbridge_s3 TO ROLE IDENTIFIER($openbridge_role);

-- Step 2: Switch to the Openbridge role for testing
USE ROLE IDENTIFIER($openbridge_role);

-- Set the context to use the Openbridge warehouse, database, and schema
USE WAREHOUSE IDENTIFIER($openbridge_warehouse);
USE DATABASE IDENTIFIER($openbridge_database);
USE SCHEMA IDENTIFIER($openbridge_schema);

-- Step 3: Test Operations

-- Test 1: Create a table
CREATE OR REPLACE TABLE test_table (
id INT AUTOINCREMENT PRIMARY KEY,
name STRING,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
);

-- Test 2: Insert data into the table
INSERT INTO test_table (name) VALUES
('Alice'),
('Bob'),
('Charlie');

-- Test 3: Query data from the table
SELECT * FROM test_table;

-- Test 4: Update data in the table
UPDATE test_table
SET name = 'Alice Updated'
WHERE name = 'Alice';

-- Test 5: Delete data from the table
DELETE FROM test_table
WHERE name = 'Bob';

-- Test 6: Drop the table
DROP TABLE test_table;

What’s different in this migration scenario:

You are not creating a brand-new environment from scratch. Instead, you are reusing the values from your existing Snowflake configuration to create a new OAuth-based destination that connects to the same Snowflake database, schema, and warehouse.

The script will:

  • Reuse your existing database, schema, and warehouse

  • Create the new OAuth integration

  • Generate required role and stage objects if they do not already exist

  • Establish secure permissions and OAuth secrets


Step 5: Retrieve Required IAM Values

After the script finishes, run the following command to get the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID values:

DESC INTEGRATION openbridge_s3;

These values are required for Step 5 of Snowflake’s official S3 storage-integration documentation, where you update the IAM role’s trust policy.
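If you would rather not scan the full DESC output by eye, the standard RESULT_SCAN pattern below filters it down to the two properties you need:

-- Describe the integration, then keep only the two properties used in the IAM trust policy
DESC INTEGRATION openbridge_s3;

SELECT "property", "property_value"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "property" IN ('STORAGE_AWS_IAM_USER_ARN', 'STORAGE_AWS_EXTERNAL_ID');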


Step 6: Update Your IAM Role Policy

Follow Snowflake’s Step 5 instructions exactly to update the trust policy and grant required S3 permissions (s3:GetObject, s3:ListBucket, etc.).

Important: Do not create a new role.
You must update the existing IAM role you created earlier in Step 2 (Create an IAM Policy & Role in AWS).


Step 7: Retrieve OAuth Credentials

After completing the storage integration steps, retrieve your OAuth credentials by running:

SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('OPENBRIDGE_OAUTH2');

This returns a JSON object like:

{
  "OAUTH_CLIENT_SECRET_2": "YOUR-CLIENT-SECRET-2",
  "OAUTH_CLIENT_SECRET": "YOUR-CLIENT-SECRET",
  "OAUTH_CLIENT_ID": "YOUR-CLIENT-ID"
}

You only need the following values for Openbridge OAuth Login:

  • OAUTH_CLIENT_ID

  • OAUTH_CLIENT_SECRET

Ignore OAUTH_CLIENT_SECRET_2 — it is not used in the setup process.
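If you prefer to pull just those two values directly, a minimal sketch using PARSE_JSON on the same system function (assuming the integration is named OPENBRIDGE_OAUTH2, as in the script above):

-- Extract only the client ID and client secret from the JSON returned by the system function
SELECT
  PARSE_JSON(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('OPENBRIDGE_OAUTH2')):OAUTH_CLIENT_ID::STRING AS oauth_client_id,
  PARSE_JSON(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('OPENBRIDGE_OAUTH2')):OAUTH_CLIENT_SECRET::STRING AS oauth_client_secret;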


Step 8: How to Construct Authorization URL

Snowflake does not explicitly display an “Authorization URL.”

Instead, you will find your Account URL (also referred to as Server URL) in the Account Details section of the Snowflake UI.

Open “View Account Details”

  1. Log in to your Snowflake Console.

  2. Click your user profile (bottom-left corner).

  3. Select your Account name from the list.

  4. Click View account details.


Locate the “Account/Server URL”

In the Account Details window, find the field labeled Account/Server URL.

It will look similar to this:

xxxxx-xxxxxx.snowflakecomputing.com


Construct the Authorization URL

Simply prefix it with https://

https://xxxxxxxx-xxxxxxxx.snowflakecomputing.com

(Save this for later — it will be used during the Authorization step in the Openbridge UI, along with your Client ID and Client Secret.)

Understand the Correct Format

Correct example:

https://zz11222-xxxxxxxx.snowflakecomputing.com

Incorrect example:

https://zz11222.snowflakecomputing.com
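If you prefer to build the URL from a Snowflake worksheet instead of the UI, the session functions below return the org-account form of the hostname; treat this as a convenience sketch and double-check it against the Account/Server URL shown in Account Details:

-- Build the authorization URL from the current session's organization and account names
SELECT 'https://' || LOWER(CURRENT_ORGANIZATION_NAME()) || '-' || LOWER(CURRENT_ACCOUNT_NAME())
       || '.snowflakecomputing.com' AS authorization_url;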

Step 9: Finalize Configuration in Openbridge

Create a new Snowflake destination in the Openbridge UI:

Step 1: Confirm your Snowflake setup is ready.

Step 2: Authorization

  • Option A: Use Existing Authorization
    If you have a saved authorization, select it from the list.

  • Option B: Create New Authorization
    Choose Authorize → Select "Bring Your Own App"

    Select Create New Client, then enter:

    • Name

    • Client ID

    • Client Secret

    • Authorization URL

    Click Continue, then select the newly created client and authorize it via a direct Snowflake login.
    (Use the same integration username and password defined in the script, not your personal Snowflake credentials.)

Once authorized, your Snowflake identity will appear for selection.

If you attempt to authorize an identity immediately after creating your OAuth integration, you may encounter an "Error requesting access token" message. Please wait approximately 3–4 minutes before retrying to allow Snowflake's systems to fully propagate the new integration.

Step 3: Fill in:

  • Snowflake Account Identifier (can be found in your Snowflake account details)

  • Database

  • Schema

  • Warehouse

  • Stage

  • Role

  • S3 Bucket

  • S3 Bucket Region (must match the Region where your bucket was created)

Step 4: Name your destination and click Save.
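If you are unsure of the account identifier or Region values requested above, these session functions return them; note that CURRENT_REGION() reports the Snowflake Region (for example, AWS_US_EAST_1), which should correspond to your bucket's AWS Region:

-- Account identifier (org-account form) and the Region the Snowflake account runs in
SELECT CURRENT_ORGANIZATION_NAME() || '-' || CURRENT_ACCOUNT_NAME() AS account_identifier,
       CURRENT_REGION() AS snowflake_region;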


Step 10: Migrate Subscriptions Using the Destination Migration Tool

Once your new OAuth-based Snowflake destination is created, you’ll need to move your existing subscriptions from the old destination to the new one.

Openbridge provides a simple in-app tool to handle this process.

  1. Go to your Openbridge account and navigate to Account → Destination Migration Tool, or visit directly: https://app.openbridge.com/tools/destination-updater

  2. Select your old destination (legacy Snowflake setup).

  3. Select your new destination (OAuth-enabled Snowflake setup).

  4. Follow the prompts to migrate all linked subscriptions.

After migration:

  • All your previous subscriptions will now route to your new OAuth destination.

  • Please be mindful that there may be data still in transit between your old and new destinations. Allow both destinations to be active for at least 48 hours before disabling/deleting your legacy destination. This ensures a smooth and complete transition without data loss.


Migration Complete

You’ve now successfully migrated your existing Snowflake destination to the OAuth-based integration.


Your existing Snowflake resources remain intact, and Openbridge will now use the secure OAuth token flow for all future data operations.
