...
Unity Catalog must be enabled on your Databricks cluster.
In Unity Catalog, create storage credentials and an external location with appropriate access permissions for both the source and destination locations. You must have permission to create storage credentials and external locations.
To create a storage credential, first create an access connector for Azure Databricks, and then assign that access connector when creating the storage credential.
Grant the managed identity access to the storage account. You must have the Owner or User Access Administrator Azure RBAC role on the storage account to grant this access.
Log in to your Azure Data Lake Storage Gen2 account.
Go to Access Control (IAM), click + Add, and select Add role assignment.
Select the Storage Blob Data Contributor role and click Next.
Under Assign access to, select Managed identity.
Click + Select members, and select Access connector for Azure Databricks or User-assigned managed identity.
Search for your connector name or user-assigned identity, select it, and click Review + assign.
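If you prefer to script this role assignment instead of clicking through the portal, the following is a minimal sketch assuming the azure-identity and azure-mgmt-authorization Python packages. The subscription ID, resource group, storage account, and managed identity object ID are placeholders, and the request model names can vary slightly between package versions.

```python
# Hedged sketch: assign "Storage Blob Data Contributor" on the storage account to the
# access connector's managed identity. All IDs below are placeholders.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role definition ID for Storage Blob Data Contributor.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # the role assignment name must be a new GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<managed-identity-object-id>",  # object ID of the connector's identity
        principal_type="ServicePrincipal",
    ),
)
```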
Copy the ABFSS path up to the parent directory in your storage account.
On the Unity Catalog page, go to External Locations and create a new external location by providing a name, the newly created storage credential, and the ABFSS URL copied above. The URL must be in the format abfss://my-container-name@my-storage-account.dfs.core.windows.net/<path>
Test the newly created external location by clicking the Test connection button; this validates connectivity to the external location path. On the Permissions tab, grant your user the CREATE EXTERNAL TABLE and WRITE FILES privileges.
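The same external location and grants can also be set up from a Unity Catalog enabled notebook. The sketch below assumes the storage credential from the earlier steps already exists; the location name, credential name, container, storage account, path, and user are placeholders.

```python
# Hedged sketch: create the external location, verify access, and grant privileges
# using Databricks SQL from a Unity Catalog enabled notebook.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS kyvos_external_location
    URL 'abfss://my-container-name@my-storage-account.dfs.core.windows.net/<path>'
    WITH (STORAGE CREDENTIAL kyvos_storage_credential)
""")

# Quick connectivity check, roughly what the Test connection button validates.
display(spark.sql(
    "LIST 'abfss://my-container-name@my-storage-account.dfs.core.windows.net/<path>'"
))

# Grant the privileges mentioned above to the user who builds semantic models.
spark.sql("""
    GRANT CREATE EXTERNAL TABLE, WRITE FILES
    ON EXTERNAL LOCATION kyvos_external_location
    TO `user@example.com`
""")
```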
Go to the semantic model's Advanced Properties and add the following properties:
kyvos.sqlwarehouse.catalog = <catalog for the temporary tables created during semantic model (cube) builds>
kyvos.sqlwarehouse.tempdb = <temporary database (schema) for the temporary tables created during semantic model (cube) builds>
You must have CREATE TABLE and USE SCHEMA privileges on the temp catalog.
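As a hedged sketch, assuming the placeholder names kyvos_temp_catalog and kyvos_tempdb for these two properties, the temp catalog, schema, and required grants could be created from a notebook as follows; the user name is also a placeholder.

```python
# Hedged sketch: create the temp catalog/schema referenced by the two properties above
# and grant the privileges the build user needs.
spark.sql("CREATE CATALOG IF NOT EXISTS kyvos_temp_catalog")
spark.sql("CREATE SCHEMA IF NOT EXISTS kyvos_temp_catalog.kyvos_tempdb")

spark.sql("GRANT USE CATALOG ON CATALOG kyvos_temp_catalog TO `user@example.com`")
spark.sql("""
    GRANT USE SCHEMA, CREATE TABLE
    ON SCHEMA kyvos_temp_catalog.kyvos_tempdb
    TO `user@example.com`
""")
```

With those objects and grants in place, set kyvos.sqlwarehouse.catalog to kyvos_temp_catalog and kyvos.sqlwarehouse.tempdb to kyvos_tempdb in the Advanced Properties above.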
...