Applies to: (tick) Kyvos Enterprise  (tick) Kyvos Cloud (SaaS on AWS) (tick) Kyvos AWS Marketplace

...

Note

  • From Kyvos 2024.1 onwards, you can enable or disable including the username as a comment in the SQL statement sent to Snowflake. To enable this, set the value of the kyvos.connection.sql.addcomment property to True in the Snowflake connection.

  • If you have enabled Network policies in Snowflake, you must use a Databricks workspace with the option Deploy Azure Databricks workspace with Secure Cluster Connectivity (No Public IP) as Yes. Additionally, add the NAT Gateway Public IP of Databricks VNet to the Network policies of Snowflake.

  • From the Kyvos 2023.1 release onwards, for Kyvos data security, if you want to use external security configured at the cluster level, such as user impersonation, to enforce roles and permissions while viewing a Snowflake semantic model, you must change the value of the QUERYING_SECURITY_LEVEL property (in olapengine.properties) from 2 to 1.

  • You can create multiple Snowflake connections for raw data querying. The connections are available on the semantic model designer page, where you can select the connection to be used for a particular semantic model.
    See the Working with non-materialized or raw data semantic models section for more details. 
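The first note above mentions that, from Kyvos 2024.1 onwards, the username can be included as a comment in the SQL sent to Snowflake. The exact comment format Kyvos uses is not documented here; the sketch below only illustrates the general idea of prepending a user-identifying comment to a query when a flag such as kyvos.connection.sql.addcomment is True. The function name and comment format are hypothetical.

```python
# Illustrative sketch only; not Kyvos source code. The comment format
# and function name are assumptions for demonstration purposes.
def annotate_query(sql: str, username: str, add_comment: bool) -> str:
    """Prepend the executing user's name as a SQL comment when enabled."""
    if not add_comment:
        return sql
    return f"/* user: {username} */ {sql}"

print(annotate_query("SELECT 1", "analyst1", True))
# /* user: analyst1 */ SELECT 1
```

A comment like this lets Snowflake administrators attribute queries to individual Kyvos users in the query history, rather than seeing only the shared connection user.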

Prerequisites for creating a Snowflake connection

...

Authentication Type: Snowflake
  1. From the Toolbox, click Setup, then Connections.

  2. From the Actions menu ( ⋮ ), click Add Connection.

  3. Enter a name or select it from the Connection list.

  4. Select Warehouse from the Category List.
    There may be more than one warehouse connection.

  5. For Providers, select Snowflake, and enter provider details.

  6. Specify the server on which the master node is configured. For example, df34534.us-east-1.snowflakecomputing.com. This URL is provided by Snowflake.

  7. Enter the full name of the account provided by Snowflake.

  8. Enter the name of the virtual warehouse to use for reading data once connected to Snowflake.

  9. Enter the default name of the staging database provided with your Snowflake account.

  10. Specify the access control role to use during the Snowflake session. For example, SYSADMIN.

  11. Enter the URL to access the Snowflake server. For example, jdbc:snowflake://abc-west-1.mycompany.com/.

  12. For Authentication Type, select Snowflake.

  13. Enter the User Name and Password to use.

  14. For role-based access control that respects Snowflake security, select the Enable User Impersonation checkbox. When enabled, the privileges of the logged-in user (as defined in the Snowflake account) are respected for any data source operation; otherwise, the privileges defined at the connection level are respected. Ensure that the Snowflake role is assigned to users through Custom Attributes.

  15. By default, the Is Read checkbox is selected, as this connection can only be used to read data (creating registered files) on which the semantic model will be created.

  16. To enable the connection for raw data, select the Is Default SQL Engine checkbox to set this connection as the default SQL engine.

  17. Click Properties to view or set properties.

  18. Click the Test button from the top left to validate the connection settings.

  19. If the connection is valid, click the Save button. 

See the Provider parameters table for details.

To refresh connections, click the menu ( ⋮ ) at the top of the Connections column and select Refresh.
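Steps 6 and 11 above involve two related values: the Snowflake server host and the JDBC URL built from it. As a sanity check when filling in the connection form, the JDBC URL can be derived mechanically from the host. The helper below is not part of Kyvos; it is only a sketch of the expected jdbc:snowflake://<host>/ shape.

```python
# Illustrative helper (not part of Kyvos) that derives the JDBC URL
# (step 11) from the Snowflake server host entered in step 6.
def jdbc_url(server_host: str) -> str:
    """Build a Snowflake JDBC URL of the form jdbc:snowflake://<host>/."""
    host = server_host.strip().rstrip("/")
    if host.startswith("jdbc:snowflake://"):
        return host + "/"
    return f"jdbc:snowflake://{host}/"

print(jdbc_url("df34534.us-east-1.snowflakecomputing.com"))
# jdbc:snowflake://df34534.us-east-1.snowflakecomputing.com/
```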

Authentication Type: OAuth
  1. From the Toolbox, click Setup, then Connections.

  2. From the Actions menu ( ⋮ ) click Add Connection.

  3. Enter a name or select it from the Connection list.

  4. Select Warehouse from the Category List.
    There may be more than one warehouse connection.

  5. For Providers, select Snowflake, and enter provider details.

  6. Specify the server on which the master node is configured. For example, df34534.us-east-1.snowflakecomputing.com. This URL is provided by Snowflake.

  7. Enter the full name of the account provided by Snowflake.

  8. Enter the name of the virtual warehouse to use for reading data once connected to Snowflake.

  9. Enter the default name of the staging database provided with your Snowflake account.

  10. Specify the access control role to use during the Snowflake session. For example, SYSADMIN.

  11. Enter the URL to access the Snowflake server. For example, jdbc:snowflake://abc-west-1.mycompany.com/.

  12. For Authentication Type, select OAuth.

  13. Click the clipboard icon to copy the Redirect URL so that you can set it on the Snowflake server.

    1. To set the URL, see the Snowflake documentation.
      For example, to set the URL, run the following command:

      Code Block
      ALTER SECURITY INTEGRATION OAUTH_KP_INT SET OAUTH_REDIRECT_URI='https://10.80.16.7:8443/kyvos/oauthRedirect';
  14. Enter the Client ID created when you registered your client with Snowflake.

  15. Enter the Client Secret created when you registered your client with Snowflake.

  16. Enter the Token URL used to request tokens.

  17. Click Fetch Tokens to get new tokens. You must provide the Client ID, Client Secret, and Token URL.

  18. Enter the Access Token, which represents the authorization granted to a client by a user to access their data using a specified role.

  19. If needed, click Refresh Token. When known, the expiration date is shown. 

  20. By default, the Is Read checkbox is selected, as this connection can only be used to read data (creating registered files) on which the semantic model will be created.

  21. Click Subscribe to get notifications for Refresh Token.

  22. To enable the connection for raw data, select the Is Default SQL Engine checkbox to set this connection as the default SQL engine.

  23. Click Properties to view or set properties.

  24. Click the Test button from the top left to validate the connection settings.

  25. If the connection is valid, click the Save button.

See the Provider parameters table for details.
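Steps 16 through 19 above rely on the standard OAuth 2.0 token exchange: Kyvos sends the Client ID, Client Secret, and a refresh token to the Token URL and receives a new access token in a JSON response. Kyvos performs this for you when you click Fetch Tokens or Refresh Token; the sketch below only illustrates the underlying request and response shapes. The endpoint, credentials, and field names shown are placeholders, and the exact flow Snowflake uses may differ.

```python
import json
import urllib.parse

# Placeholder endpoint; the real Token URL comes from your Snowflake
# security integration, not from this document.
TOKEN_URL = "https://df34534.us-east-1.snowflakecomputing.com/oauth/token-request"

def build_refresh_request(client_id: str, client_secret: str, refresh_token: str) -> bytes:
    """Encode the form body for a standard refresh_token grant."""
    return urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }).encode()

def parse_token_response(body: str) -> str:
    """Extract the new access token from a JSON token response."""
    return json.loads(body)["access_token"]

body = build_refresh_request("my-client-id", "my-secret", "old-refresh-token")
print(parse_token_response('{"access_token": "new-token", "expires_in": 600}'))
# new-token
```

The expires_in field in the response is what allows Kyvos to display the token expiration date mentioned in step 19.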

Configurations for building semantic models using Snowflake warehouse

To use the Snowflake connection for semantic model processing, add the following Snowflake JAR files to the Kyvos default connection properties:

kyvos.connection.classpath.jar

;../lib/snowflake-jdbc-3.6.8.jar;../lib/spark-snowflake_2.11-2.4.7-spark_2.1.jar

kyvos.connection.sparkclasspath.jar

...