...
Shared Query Engine: In this mode, the Query Engine not only serves queries but also handles semantic model processing. This mode is named SHARED because the same process performs both activities.
Note: From Kyvos 2024.9.1 onwards, if you use Query Engines as a compute server:
Dedicated Compute: In this mode, the semantic model is processed by a dedicated service. In cloud-based deployments, the semantic model is processed on Kubernetes (K8s) cluster-based nodes, while in on-premises environments, models are processed on dedicated nodes.
...
From Kyvos 2024.10, you can:
Process semantic models without Spark (no-Spark processing) using the Shared Query Engine and a dedicated Kubernetes cluster on AWS Managed Services.
Resume a failed or canceled semantic model process.
Run a test data semantic model process job.
...
After deploying Kyvos using the no-Spark processing model, perform the following post-deployment steps.
Modify the values of the following properties in the advanced properties of the semantic model job:
kyvos.process.compute.type=KYVOS_COMPUTE
kyvos.build.aggregate.type=TABULAR
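The two property values above can be sanity-checked with a short script. This is an illustrative sketch only; the inline properties text and the `parse_properties` helper are assumptions for the example, not part of Kyvos.

```python
# Minimal sketch: parse Java-style "key=value" properties text and verify
# the two no-Spark processing settings. The input is inline for illustration;
# in practice it would come from the semantic model job's advanced properties.

def parse_properties(text: str) -> dict:
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

advanced_properties = """
kyvos.process.compute.type=KYVOS_COMPUTE
kyvos.build.aggregate.type=TABULAR
"""

props = parse_properties(advanced_properties)
assert props["kyvos.process.compute.type"] == "KYVOS_COMPUTE"
assert props["kyvos.build.aggregate.type"] == "TABULAR"
```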
...
Post-deployment steps on Azure
...
To add the Storage Blob Data Contributor role,
On the Home page of the Azure portal, search for Storage Accounts.
On the Storage Accounts page, select the storage account that is used for deployment.
Navigate to Access Control (IAM).
Select the Storage Blob Data Contributor role from the list.
In the Assign Access to section, select Managed Identity.
Click Select Member.
On the Select Managed Identity dialog, select the Access Connector for Azure Databricks from the list.
Click Review + assign to save the permission.
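The portal steps above assign the role at the storage account's ARM scope. As a rough illustration of what that scope looks like when scripting the same assignment, here is a minimal sketch; the subscription, resource group, and account names are placeholders, and the role-definition GUID is Azure's well-known built-in ID for Storage Blob Data Contributor.

```python
# Illustrative sketch: build the ARM resource scope at which the
# Storage Blob Data Contributor role is assigned to the managed identity.
# All concrete names below are placeholders, not values from a real deployment.

def storage_account_scope(subscription_id: str, resource_group: str, account: str) -> str:
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
    )

# Fixed, well-known Azure built-in role definition ID for
# "Storage Blob Data Contributor".
STORAGE_BLOB_DATA_CONTRIBUTOR = "ba92f5b4-2d11-453d-a403-e96b0029c9fe"

scope = storage_account_scope(
    "00000000-0000-0000-0000-000000000000",  # placeholder subscription ID
    "kyvos-rg",                              # placeholder resource group
    "kyvossa",                               # placeholder storage account
)
print(scope)
```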
To add an external location,
Go to Databricks workspace. In the left pane, click Catalog.
Click Settings, and then click External Locations.
On the External Location page, click Create location.
On the Create Location dialog, enter a name for the external location, select the credential from the list, and enter the URL.
The URL must be in the abfss://<Container name>@<Storage name>.dfs.core.windows.net/<Cluster engine_work directory> format.
For example, abfss://kyvoscontainer@kyvossa05751.dfs.core.windows.net/user/engine_work
Click Create.
Click Grant, select the CREATE EXTERNAL TABLE and WRITE FILES privileges, and grant them to the user whose token is used while creating the SQL Warehouse connection.
Click Grant. The permission is assigned.
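Because a malformed URL is an easy mistake at this step, the required abfss:// shape can be checked with a short script. This is a sketch under stated assumptions: the regular expression and `is_valid_abfss_url` helper are illustrative, not a Kyvos or Databricks API; the valid example URL is the one from the documentation above.

```python
import re

# Illustrative sketch: check that an external location URL matches the
# abfss://<container>@<storage-account>.dfs.core.windows.net/<path> shape
# required for the Databricks external location.
ABFSS_PATTERN = re.compile(
    r"^abfss://(?P<container>[a-z0-9][a-z0-9-]*)@"   # container name
    r"(?P<account>[a-z0-9]+)\.dfs\.core\.windows\.net/"  # storage account host
    r"(?P<path>.+)$"                                  # engine_work directory path
)

def is_valid_abfss_url(url: str) -> bool:
    return ABFSS_PATTERN.match(url) is not None

# The documented example passes; a plain Blob endpoint URL does not.
assert is_valid_abfss_url(
    "abfss://kyvoscontainer@kyvossa05751.dfs.core.windows.net/user/engine_work"
)
assert not is_valid_abfss_url(
    "https://kyvossa05751.blob.core.windows.net/kyvoscontainer"
)
```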
...