Databricks Lakebase is a fully managed Postgres database integrated with the Databricks lakehouse, built for modern operational workloads. It provides a Postgres-compatible OLTP database that eliminates complex ETL pipelines and ensures transactional data is seamlessly integrated with analytics and AI-driven applications. Integrate Databricks Lakebase with Superblocks to build internal tools and workflows that query, analyze, and update your data using familiar PostgreSQL syntax.

Setting up Databricks Lakebase

Configure authentication

To access your Databricks Lakebase database, you’ll need to authenticate using a Databricks account. Superblocks provides several different ways to authenticate. See the Databricks documentation below for how to configure your preferred authentication method.
| Method | Description |
| --- | --- |
| Personal access token (PAT) | Use a short- or long-lived access token for a user or service principal. |
| Machine-to-machine OAuth | Configure OAuth client credentials for a service principal. Superblocks exchanges the client credentials with Databricks for a short-lived OAuth token. |
| OAuth token federation | Use OAuth tokens issued by your identity provider when users log in to Superblocks to authenticate with Databricks using the authenticated user's permissions. |
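Under the hood, machine-to-machine OAuth is a standard client_credentials exchange against the workspace's /oidc/v1/token endpoint; Superblocks performs this exchange for you. A minimal sketch of the request being made (the host, client ID, and client secret below are placeholders):

```python
import base64
import urllib.parse


def build_m2m_token_request(host: str, client_id: str, client_secret: str):
    """Build the client_credentials token request Databricks expects.

    Databricks exposes an OAuth token endpoint at /oidc/v1/token. The
    client ID and secret are sent via HTTP Basic auth, and the grant
    requests the all-apis scope. All values here are placeholders.
    """
    url = f"https://{host}/oidc/v1/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": "all-apis",
    })
    auth = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return url, headers, body


# Placeholder values for illustration only; no request is actually sent.
url, headers, body = build_m2m_token_request(
    "xxxx.cloud.databricks.com", "my-client-id", "my-client-secret"
)
```

The short-lived token returned by this exchange is what Superblocks then presents as the Postgres password when connecting to Lakebase.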

Add integration

  1. In the web app, navigate to the Integrations page
  2. Click Add integration
  3. Search for Databricks Lakebase and select it from the list of available integrations
  4. Name the integration
  5. Fill out the integration configuration as follows:
| Field | Required | Description |
| --- | --- | --- |
| Host | | Databricks Lakebase host name (e.g. xxxx.cloud.databricks.com) |
| Port | | Port to use when connecting (default: 5432) |
| Database Name | | Name of database to connect to |
| Default schema | | An optional initial schema to use |
| Database Username | | Username to use to connect |
| Access token | | Databricks personal access token |
| Enable SSL | | Connect using SSL |
| Use a self-signed SSL certificate | | Provide Server CA, Client Key, and Client Cert |
  6. Optionally, add more configurations to set credentials for different environments
  7. Click Test Connection to check that Superblocks can connect to the data source
  8. Click Create
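The configuration fields above map directly onto a standard Postgres connection URL, so you can sanity-check the same values outside Superblocks with any Postgres client. A sketch of how the fields combine (all values are placeholders):

```python
from urllib.parse import quote


def lakebase_dsn(host, port, database, username, token, sslmode="require"):
    """Assemble a libpq-style connection URL from the integration fields.

    The access token is used as the Postgres password. The username and
    token are URL-encoded because they can contain reserved characters
    such as '@'.
    """
    return (
        f"postgresql://{quote(username, safe='')}:{quote(token, safe='')}"
        f"@{host}:{port}/{database}?sslmode={sslmode}"
    )


# Placeholder values for illustration only.
dsn = lakebase_dsn(
    host="xxxx.cloud.databricks.com",  # placeholder host
    port=5432,
    database="databricks_postgres",    # placeholder database name
    username="my-user@example.com",
    token="dapi-example-token",
)
```

With SSL enabled in the integration, the equivalent `sslmode=require` parameter keeps the connection encrypted end to end.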

Using Databricks Lakebase in APIs

Once your Databricks Lakebase integration is created, you can start using it by writing SQL in Superblocks APIs. Since Lakebase is fully Postgres-compatible, you can use standard PostgreSQL syntax for all your queries, including support for Postgres extensions like PostGIS and pgvector.
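Because Lakebase speaks the Postgres wire protocol, the same SQL you write in Superblocks APIs also runs through any standard Postgres driver. A minimal sketch using psycopg (v3) with a pgvector similarity query; the support_tickets table, its embedding column, and the DSN are hypothetical, not part of this integration:

```python
# Sketch only: assumes psycopg (v3) is installed, the pgvector extension
# is enabled, and a hypothetical table `support_tickets(id, subject,
# embedding vector)` exists.
NEAREST_TICKETS_SQL = """
    SELECT id, subject, embedding <-> %(query_vec)s::vector AS distance
    FROM support_tickets
    ORDER BY distance
    LIMIT 5;
"""


def nearest_tickets(dsn: str, query_vec: str):
    """Return the five tickets closest to query_vec by L2 distance.

    query_vec is the pgvector text form, e.g. "[0.1, 0.2, 0.3]".
    """
    import psycopg  # deferred: only needed when actually connecting

    with psycopg.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(NEAREST_TICKETS_SQL, {"query_vec": query_vec})
            return cur.fetchall()
```

In a Superblocks API step you would write only the SQL itself; the integration handles the connection for you.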

Troubleshooting

If you run into issues, first see our guide on Troubleshooting Database Integrations. There are also several common errors you may see when using Databricks Lakebase. The table below includes error messages, why they happen, and how to address them.
| Error message | Why it's happening & Resolution |
| --- | --- |
| IntegrationOAuthError: OAuth2 - "On-Behalf-Of Token Exchange" could not find identity provider token | Reason: You've selected Login identity provider as the subject token source when using OAuth token federation, but you are not currently logged in to Superblocks using an OIDC-based identity provider. Resolution: Reach out to support@superblocks.com for assistance configuring SSO or migrating your SSO configuration to OIDC. |
| Failed to process token: TOKEN_EXPIRED | Reason: The access token issued to Superblocks when you logged in, or the static token you provided, has expired. Resolution: If using Login identity provider, log out of and back into Superblocks. If using a static token, obtain a new federated JWT from your identity provider. |
| Failed to process token: TOKEN_INVALID (Ensure a valid federation policy has been configured) | Reason: Your Databricks account either does not have a federation policy configured, or the subject_token sent to Databricks by Superblocks does not satisfy the policy. This can happen if the token is not a valid JWT, or has a different aud or iss than the Databricks federation policy expects. Resolution: Make sure the aud and iss configured in Databricks match the aud and iss your IdP uses when issuing tokens to Superblocks, and that the federation policy points to a valid JWKS URI. By default, Databricks uses the URI advertised at <issuer-url>/.well-known/openid-configuration; you may need to change this if your IdP uses a non-default authorization server or does not support a /.well-known discovery URL. |
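When debugging TOKEN_INVALID errors, it helps to look at the aud and iss claims actually present in the subject token. A JWT payload is just base64url-encoded JSON, so you can inspect it with the standard library; the token below is a fabricated, unsigned example, and the claim values are placeholders:

```python
import base64
import json


def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying the signature.

    For inspection only: never skip signature verification when actually
    trusting a token.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))


# Fabricated example token: the header and signature segments are not real.
header = base64.urlsafe_b64encode(b'{"alg":"RS256"}').rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(
    json.dumps({"iss": "https://idp.example.com", "aud": "superblocks"}).encode()
).rstrip(b"=").decode()
token = f"{header}.{payload}.sig"

claims = jwt_claims(token)
```

Compare the decoded iss and aud against the values configured in your Databricks federation policy; any mismatch will cause the policy check to fail.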
If you are encountering an error that's not listed, or the provided steps are insufficient to resolve it, please contact us at support@superblocks.com.