Databricks default Unity Catalog: The 1 best way to set it
Last updated: October 29, 2025
<p>The Databricks workspace packs so many features into its platform user interface that it could serve as a case study in solid product development for other SaaS platforms to learn from. One feature worth highlighting among the workspace defaults is setting the default catalog for your Databricks Unity Catalog.</p> <p>Not too long ago, the hive_metastore was the default area for working queries, landing data, and so on. With the introduction of and focus on Unity Catalog, newer Databricks accounts and clusters are set up with a Unity Catalog right away, and Databricks typically makes that catalog the default.</p> <p>But what if you want or need to change the default catalog for your workspace?</p> <h2 class="wp-block-heading">Here’s how you can set the default Unity Catalog:</h2> <ol> <li>Navigate to your <a href="https://accounts.cloud.databricks.com/workspaces" data-type="link" data-id="https://accounts.cloud.databricks.com/workspaces" target="_blank" rel="noopener">workspace</a></li> <li>In the upper right-hand corner of the platform, click your user icon</li> <li>Select “<strong>Settings</strong>” from the list of options <ul> <li><img fetchpriority="high" decoding="async" width="1578" height="498" class="wp-image-20659" style="width: 650px;" src="https://public-cdn.dlh.io/migrated/uploads/2025/04/dlh_databricks_workspace_settings.png" alt="Databricks Options" srcset="https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_settings.png 1578w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_settings-300x95.png 300w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_settings-1024x323.png 1024w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_settings-768x242.png 768w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_settings-1536x485.png 1536w" sizes="(max-width: 1578px) 100vw, 1578px" /></li> </ul> </li> <li>In the resulting Settings page, select the 
<strong>Advanced</strong> link from the left menu</li> <li>Scroll to the “Other” section in the main area to find the “Default catalog for the workspace” setting <ul> <li><img decoding="async" width="1948" height="730" class="wp-image-20660" style="width: 750px;" src="https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other.png" alt="Databricks default catalog for the workspace setting" srcset="https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other.png 1948w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other-300x112.png 300w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other-1024x384.png 1024w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other-768x288.png 768w, https://www.datalakehouse.io/wp-content/uploads/2025/04/dlh_databricks_workspace_advanced_options_other-1536x576.png 1536w" sizes="(max-width: 1948px) 100vw, 1948px" /></li> </ul> </li> <li>Enter the name of your preferred catalog in the field provided</li> <li>Click <strong>Save</strong> when ready</li> </ol> <p>As the instructions state, once you click the <strong>Save</strong> button, you will need to restart any compute (SQL Warehouses or Clusters) for the setting to apply:</p> <blockquote class="wp-block-quote"> <p>Setting the default catalog for the workspace determines the catalog that is used when queries do not reference a fully qualified 3 level name. For example, if the default catalog is set to ‘retail_prod’ then a query ‘SELECT * FROM myTable’ would reference the object ‘retail_prod.default.myTable’ (the schema ‘default’ is always assumed).</p> <p>If the default catalog is in Unity Catalog (set to any value other than ‘hive_metastore’ or ‘spark_catalog’), MLflow client code that reads or writes models will target that catalog by default. 
Otherwise, models will be written to and read from the workspace model registry</p> <p>Creating new registered models in workspace model registry is disabled if the default catalog is in Unity Catalog (set to any value other than ‘hive_metastore’ or ‘spark_catalog’)</p> <p>This setting requires a restart of clusters and SQL warehouses to take effect. Additionally, this setting only applies to Unity Catalog compatible compute i.e. when the workspace has an assigned Unity Catalog metastore, and the cluster is in access mode ‘Shared’ or ‘Single User’, or in SQL warehouses.</p> <cite>https://docs.databricks.com/aws/en/catalogs/default</cite></blockquote> <hr class="wp-block-separator has-alpha-channel-opacity"/> <p>In short, keep in mind the fully qualified three-level name (catalog.schema.table) when referencing a table in a query. There is also an impact on newer Databricks features that require Unity Catalog instead of the legacy hive_metastore. Lastly, as mentioned above, updating the default catalog requires a restart of your compute.</p> <p>Hopefully that helps as you start making more use of your Databricks environments.</p> <p>If you’d like to dig deeper into the nuances of this topic, take a look at the <a href="https://docs.databricks.com/aws/en/catalogs/default" target="_blank" rel="noopener">Databricks Manage the default catalog</a> page.</p>
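<p>The name-resolution rule quoted above can be sketched in plain Python. Note this is an illustrative, hypothetical helper to show how a partial table reference expands into a three-level name under a given default catalog; it is not part of any Databricks API, and the real resolver also considers the session's current catalog and schema.</p>

```python
def resolve_table_name(name: str, default_catalog: str = "hive_metastore") -> str:
    """Sketch of how a table reference expands into a fully qualified
    three-level name (catalog.schema.table) under a default catalog.

    Illustrative only -- not a Databricks API.
    """
    parts = name.split(".")
    if len(parts) == 3:
        # Already fully qualified: use as-is.
        return name
    if len(parts) == 2:
        # schema.table: prepend the default catalog.
        return f"{default_catalog}.{name}"
    # Bare table name: default catalog plus the assumed 'default' schema.
    return f"{default_catalog}.default.{name}"


# The example from the Databricks docs: with the default catalog set to
# 'retail_prod', a query against 'myTable' targets 'retail_prod.default.myTable'.
print(resolve_table_name("myTable", default_catalog="retail_prod"))
```

<p>If you only need a different catalog for a single session rather than workspace-wide, you can also run <code>USE CATALOG retail_prod;</code> in SQL before your queries, which changes what unqualified names resolve to without touching the workspace default.</p>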

Staff Expert Writer