
Processing semantic models with No-Spark on a Snowflake data source

Processing semantic model with source as Snowflake on AWS

To process a semantic model with Snowflake on AWS as the source, do one of the following:

  • In the semantic model advanced properties, provide the value of your secret key in the AWS_SECRET_KEY property.

  • In the semantic model advanced properties, provide the value of your access key in the AWS_KEY_ID property.
    -or-

  • Configure a Snowflake storage integration to access Amazon S3.

Important

  • To configure a Snowflake storage integration to access Amazon S3, refer to the Snowflake documentation. Before creating the external stage described there, ensure that the stage URL is of the form: s3://<your bucket name>/<kyvos work directory>/temp/
    For example, 's3://kyvos-qa/user/engine_work/temp/'

  • For AWS Marketplace deployments, add the IAM role (created in the ‘Step 2: Create the IAM Role in AWS’ section of the Snowflake documentation) to the KMS key policy.

Note

This requires AWS console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake with the role used in the Snowflake connection in Kyvos.

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.

  • In your S3 bucket policy, add the IAM role that you created while configuring the Snowflake storage integration to access Amazon S3 in the step above.
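
The storage-integration steps above can be sketched as the following Snowflake SQL. This is a sketch only: the integration name (s3_int), role ARN, bucket path, database/schema/stage names, and role name (kyvos_sf_role) are illustrative placeholders; substitute the values from your own environment and the Snowflake documentation.

    -- Storage integration that lets Snowflake assume your IAM role (placeholder ARN)
    CREATE STORAGE INTEGRATION s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
      STORAGE_ALLOWED_LOCATIONS = ('s3://kyvos-qa/user/engine_work/temp/');

    -- External stage whose URL follows the required <bucket>/<kyvos work directory>/temp/ form
    CREATE STAGE mydb.myschema.mystage
      STORAGE_INTEGRATION = s3_int
      URL = 's3://kyvos-qa/user/engine_work/temp/';

    -- Grant stage usage to the role used in the Kyvos Snowflake connection
    GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE kyvos_sf_role;

The stage created here is the one referenced by the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.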

Processing semantic model with source as Snowflake on GCP

To process a semantic model with Snowflake on GCP as the source,

  1. Configure a Snowflake storage integration to access Google Cloud Storage.

Important

To configure a Snowflake storage integration to access Google Cloud Storage, refer to the Snowflake documentation.

Before creating the external stage described there, ensure that the stage URL is of the form:
gcs://<your bucket name>/<kyvos work directory>/temp/
For example, 'gcs://kyvos-qa/user/engine_work/temp/'

Note

  • This requires GCP console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake with the role used in the Snowflake connection in Kyvos.

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.

  • If a Customer Managed Key (CMK) or Bring Your Own Key (BYOK) is used in a Terraform-based deployment, use the command below to create the stage.

    CREATE STAGE my_gcs_stage
    STORAGE_INTEGRATION = gcs_int
    URL = 'gcs://mybucket1/path1'
    ENCRYPTION = (TYPE = 'GCS_SSE_KMS');

    For more details, refer to Snowflake documentation.
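
For deployments without CMK/BYOK, the GCP steps above can be sketched as the following Snowflake SQL. As with the AWS example, the integration name (gcs_int), bucket path, database/schema/stage names, and role name (kyvos_sf_role) are illustrative placeholders; replace them with the values from your environment.

    -- Storage integration for Google Cloud Storage (Snowflake provisions the
    -- service account; grant it access on the bucket per Snowflake documentation)
    CREATE STORAGE INTEGRATION gcs_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'GCS'
      ENABLED = TRUE
      STORAGE_ALLOWED_LOCATIONS = ('gcs://kyvos-qa/user/engine_work/temp/');

    -- External stage whose URL follows the required <bucket>/<kyvos work directory>/temp/ form
    CREATE STAGE mydb.myschema.mystage
      STORAGE_INTEGRATION = gcs_int
      URL = 'gcs://kyvos-qa/user/engine_work/temp/';

    -- Grant stage usage to the role used in the Kyvos Snowflake connection
    GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE kyvos_sf_role;

The stage created here is the one referenced by the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.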

 

Copyright Kyvos, Inc. All rights reserved.