How it works
The Databricks connector indexes table metadata for a given Unity Catalog (or Hive) database and schema: it reads column types from information_schema and builds a synthetic DDL document per table, including primary/foreign key metadata when available. It does not index row-level table data.
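To make the shape of this concrete, here is a minimal sketch of such a metadata pass using the open-source databricks-sql-connector package. The hostname, HTTP path, token, catalog, schema, and the build_ddl_document helper are all illustrative placeholders, not Hymalaia's actual implementation:

```python
from databricks import sql

CATALOG, SCHEMA = "main", "sales"  # hypothetical catalog and schema names

def build_ddl_document(table: str, columns: list[tuple[str, str]]) -> str:
    """Render a synthetic CREATE TABLE statement from column metadata."""
    body = ",\n".join(f"  {name} {dtype}" for name, dtype in columns)
    return f"CREATE TABLE {CATALOG}.{SCHEMA}.{table} (\n{body}\n);"

with sql.connect(
    server_hostname="adb-1234567890.4.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/your-warehouse-id",       # placeholder
    access_token="dapi-your-token",                          # placeholder
) as conn, conn.cursor() as cursor:
    # Column types come from information_schema, never from table rows.
    cursor.execute(
        f"""
        SELECT table_name, column_name, full_data_type
        FROM {CATALOG}.information_schema.columns
        WHERE table_schema = '{SCHEMA}'
        ORDER BY table_name, ordinal_position
        """
    )
    tables: dict[str, list[tuple[str, str]]] = {}
    for table_name, column_name, data_type in cursor.fetchall():
        tables.setdefault(table_name, []).append((column_name, data_type))

for name, cols in tables.items():
    print(build_ddl_document(name, cols))
```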
Tables whose metadata has changed (based on last_altered in information_schema.tables) are picked up on incremental runs. Like other connectors, indexing typically runs on a daily schedule.
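The incremental check itself can be as simple as filtering information_schema.tables on last_altered. A sketch under the same placeholder credentials, where the checkpoint timestamp is assumed to be persisted by the indexer between runs:

```python
from databricks import sql

LAST_RUN = "2024-01-01 00:00:00"  # hypothetical checkpoint from the previous run

with sql.connect(
    server_hostname="adb-1234567890.4.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/your-warehouse-id",       # placeholder
    access_token="dapi-your-token",                          # placeholder
) as conn, conn.cursor() as cursor:
    # Only tables altered since the last run need re-indexing.
    cursor.execute(
        f"""
        SELECT table_name, last_altered
        FROM main.information_schema.tables
        WHERE table_schema = 'sales'
          AND last_altered > TIMESTAMP '{LAST_RUN}'
        """
    )
    changed = [row.table_name for row in cursor.fetchall()]
    print(f"{len(changed)} table(s) changed since {LAST_RUN}: {changed}")
```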
Setting up
Authorization
Databricks uses a SQL warehouse connection over HTTP with a personal access token (PAT). Create a credential with the following values (you can sanity-check them with the sketch after this list):
- Server hostname — the hostname from your warehouse’s JDBC/ODBC connection details (e.g. adb-1234567890.4.azuredatabricks.net), without https://
- HTTP path — the warehouse HTTP Path from the same connection dialog (starts with /sql/1.0/warehouses/...)
- Access token — a Databricks PAT for a user that can run metadata queries on the target catalog/database/schema
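Before saving the credential, it can help to verify the three values outside Hymalaia. A minimal smoke test with the databricks-sql-connector package (all three values below are placeholders):

```python
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890.4.azuredatabricks.net",  # no https:// prefix
    http_path="/sql/1.0/warehouses/your-warehouse-id",
    access_token="dapi-your-token",
) as conn, conn.cursor() as cursor:
    # A trivial query proves the hostname, path, and PAT all line up.
    cursor.execute("SELECT current_catalog(), current_schema()")
    print(cursor.fetchone())
```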
Indexing
Configure authorization
In Step 1, set up credentials:
- Select an existing Databricks credential, or click Create New
- Enter Server hostname, HTTP path, and Access token
Specify database and schema
In Step 2, specify:
- Connector Name — a display name for this connector (e.g. Lakehouse prod)
- Database — the Databricks catalog.database name as used in SQL (the first level is often your catalog; use the value that matches information_schema for your environment; see the sketch after this list)
- Schema — the schema name within that database
- Access Type — whether indexed content is Public or Private in Hymalaia
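To see how the Database and Schema values scope the connector's metadata reads, note that the Database field supplies the first qualifier of the information_schema path. An illustrative composition, with example names only:

```python
database = "main"   # the "Database" field: the catalog/database as written in SQL
schema = "sales"    # the "Schema" field

# Metadata queries are scoped by both values like this:
query = (
    f"SELECT table_name FROM {database}.information_schema.tables "
    f"WHERE table_schema = '{schema}'"
)
print(query)
```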

