How To Set Up Databricks Lakehouse
Written by Openbridge Support
Updated over a week ago

This guide assumes that you have a Databricks account and warehouse already configured. If you do not, see the Databricks docs on starting one up.

Some steps in this doc may not work without an active, billing-enabled (non-trial) Databricks account. If you have not already done so, please make sure your billing information in Databricks is up to date.

Getting Started

A few key connection details are needed to properly configure Databricks as a destination:

  • Hostname

  • HTTP path

  • Token

  • Catalog

  • Schema/Database

Take note of all of these values in Databricks; you will need them during the Openbridge data destination registration process.

Connection (Hostname, HTTP Path, Token)

In your Databricks account, navigate to your warehouse (see A). This will then display a screen showing your warehouse(s). In our example, there is a warehouse called Starter Warehouse (see B). Clicking the name will load the connection screen.

Note: If you do not have a SQL Warehouse listed like Starter Warehouse, you will need to create one.

Next, in the Starter Warehouse, select Connection details as shown below:

Selecting Connection details will display the Server Hostname (A) and the HTTP path (B). Take note of these values.
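
For reference, the two values typically look something like the following. These are made-up placeholders; your actual values will differ:

    Server Hostname: dbc-a1b2c3d4-e5f6.cloud.databricks.com
    HTTP Path:       /sql/1.0/warehouses/1a2b3c4d5e6f7g8h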

Creating Your Databricks Access Token

A token is needed to authorize the Openbridge connection to Databricks. To generate a token, go to User Settings (see A), then open Personal access tokens (see B), and finally select Generate new token (see C).

When generating the token, set the expiration to zero (enter "0") so the token does not expire. A token name like "Openbridge" will also help you identify it later.
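
Before registering these values with Openbridge, you can confirm that the Hostname, HTTP path, and Token work together. Below is a minimal connectivity check using the databricks-sql-connector Python package; the hostname, path, and token values are placeholders you should replace with the values recorded in the previous steps.

    # pip install databricks-sql-connector
    from databricks import sql

    # Placeholder values -- substitute your own Server Hostname,
    # HTTP path, and personal access token.
    connection = sql.connect(
        server_hostname="dbc-a1b2c3d4-e5f6.cloud.databricks.com",
        http_path="/sql/1.0/warehouses/1a2b3c4d5e6f7g8h",
        access_token="dapi...",
    )

    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        # (1,) means the warehouse accepted the connection
        print(cursor.fetchone())

    connection.close()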

Locating Your Catalog and Schema/Database Names

To find your Catalog and Schema/Database names, go to Data in your Databricks interface (see A). This will open the data explorer. You will likely see the default catalog, which is called hive_metastore (see B). You should also see a Unity Catalog, which is the approach Databricks recommends. (Openbridge only supports Unity Catalog.)

If you have a catalog other than the default, you can use that instead. Selecting your catalog will show you the available Schema/Databases (see C).

Select Unity Catalog; this is the catalog we will register your data with.
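
If you prefer to confirm the catalog and schema names over SQL rather than in the UI, a sketch like the one below (reusing the connection from the earlier token example) lists the catalogs and schemas your token can see. The name my_unity_catalog is a placeholder.

    with connection.cursor() as cursor:
        # List every catalog visible to this token.
        cursor.execute("SHOW CATALOGS")
        print(cursor.fetchall())

        # Replace my_unity_catalog with the catalog you selected above.
        cursor.execute("SHOW SCHEMAS IN my_unity_catalog")
        print(cursor.fetchall())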

Unity Catalog

To create a Unity Catalog, configure it in your Databricks user interface:
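
Alternatively, if you prefer SQL, the minimal sketch below creates a catalog and schema, assuming your workspace is already attached to a Unity Catalog metastore and your user has the CREATE CATALOG privilege. The catalog and schema names are placeholders; choose your own.

    with connection.cursor() as cursor:
        # Placeholder names -- pick catalog/schema names that suit you.
        cursor.execute("CREATE CATALOG IF NOT EXISTS openbridge_catalog")
        cursor.execute("CREATE SCHEMA IF NOT EXISTS openbridge_catalog.openbridge")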

External Storage

For details on configuring Databricks to use external storage, see this doc:
