
Processing semantic model with source as Snowflake on AWS

To process a semantic model whose source is Snowflake on AWS, do one of the following:

  • In the semantic model advanced properties, provide the value of your secret key in the AWS_SECRET_KEY property.

  • In the semantic model advanced properties, provide the value of your access key in the AWS_KEY_ID property.
    -or-

  • Configure a Snowflake storage integration to access Amazon S3.

Important

Refer to the Snowflake documentation for configuring a Snowflake storage integration to access Amazon S3. Before creating the external stage (as described in the Snowflake documentation), ensure that the URL is: s3://<your bucket name>/<kyvos work directory>/temp/

For example, 's3://kyvos-qa/user/engine_work/temp/'
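For reference, the storage integration and external stage setup described above can be sketched in Snowflake SQL as follows. The integration name, IAM role ARN, and bucket path are illustrative; substitute your own values as per the Snowflake documentation:

```sql
-- Illustrative names and ARN: replace with your own values.
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::001234567890:role/my_snowflake_role'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('s3://kyvos-qa/user/engine_work/temp/');

-- Retrieve STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID,
-- which are needed for the IAM trust policy on the AWS side:
DESC INTEGRATION s3_int;

-- The stage URL must point to <your bucket name>/<kyvos work directory>/temp/
CREATE STAGE mydb.myschema.mystage
  URL = 's3://kyvos-qa/user/engine_work/temp/'
  STORAGE_INTEGRATION = s3_int;
```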

Note

This requires AWS console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake, using the role used in the Snowflake connection in Kyvos.

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.

  • Update your bucket policy with the IAM role that you created while configuring the Snowflake storage integration to access Amazon S3 in the step above.
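Taken together, the grant required by the steps above might look like the following. The stage and role names are illustrative; use the stage set in the kyvos.connection.snowflake.stage property and the role used on the Kyvos Snowflake connection:

```sql
-- Illustrative names: mydb.myschema.mystage is the stage referenced in the
-- kyvos.connection.snowflake.stage property; KYVOS_ROLE stands in for the
-- role used on the Kyvos Snowflake connection.
GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE KYVOS_ROLE;
```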

Processing semantic model with source as Snowflake on GCP

To process a semantic model whose source is Snowflake on GCP,

  1. Configure a Snowflake storage integration to access Google Cloud Storage.

Important

To configure a Snowflake storage integration to access Google Cloud Storage, refer to the Snowflake documentation.

Before creating the external stage (as described in the Snowflake documentation), ensure that the URL is:
gcs://<your bucket name>/<kyvos work directory>/temp/
For example, 'gcs://kyvos-qa/user/engine_work/temp/'
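As with the S3 setup, the GCS storage integration and external stage can be sketched in Snowflake SQL as follows. The integration name and bucket path are illustrative; substitute your own values as per the Snowflake documentation:

```sql
-- Illustrative names: replace with your own values.
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://kyvos-qa/user/engine_work/temp/');

-- Retrieve STORAGE_GCP_SERVICE_ACCOUNT, the service account to which
-- bucket access must be granted on the GCP side:
DESC STORAGE INTEGRATION gcs_int;

-- The stage URL must point to <your bucket name>/<kyvos work directory>/temp/
CREATE STAGE mydb.myschema.mystage
  URL = 'gcs://kyvos-qa/user/engine_work/temp/'
  STORAGE_INTEGRATION = gcs_int;
```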

Note

  • This requires GCP console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake, using the role used in the Snowflake connection in Kyvos.

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.
