Processing semantic model with source as Snowflake on AWS

To process the semantic model with Snowflake on AWS as the source, do one of the following:

  • In semantic model advanced properties, provide the value of your secret key in the AWS_SECRET_KEY property.

  • In semantic model advanced properties, provide the value of your access key in the AWS_KEY_ID property. (See the example after this list.)
    -or-

  • Configure a Snowflake storage integration to access Amazon S3.
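
For the first option above, a minimal example of the two advanced properties as key-value pairs (the values are placeholders; substitute your own AWS credentials):

  AWS_KEY_ID = <your AWS access key ID>
  AWS_SECRET_KEY = <your AWS secret access key>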


Important

...

  • Before creating an external stage, ensure that, as per the Snowflake documentation, the URL is: s3://<your bucket name>/<kyvos work directory>/temp/
    For example, 's3://kyvos-qa/user/engine_work/temp/' (see the sketch after this list).

  • For AWS Marketplace, add the IAM role (created in the ‘Step 2: Create the IAM Role in AWS’ section of the Snowflake documentation) to the KMS key policy.
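
A minimal sketch of the storage integration and external stage, assuming the integration name s3_int, a placeholder IAM role ARN, and the example bucket and path above (adjust all names to your environment; refer to the Snowflake documentation for the full procedure and for authorizing the IAM role):

  CREATE STORAGE INTEGRATION s3_int
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<account-id>:role/<role-name>'
    STORAGE_ALLOWED_LOCATIONS = ('s3://kyvos-qa/user/engine_work/temp/');

  CREATE STAGE mydb.myschema.mystage
    STORAGE_INTEGRATION = s3_int
    URL = 's3://kyvos-qa/user/engine_work/temp/';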


Note

This requires AWS console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake, where <role_used_on_Kyvos_SF_Connection> is the role used in the Snowflake connection in Kyvos. (See the example after this list.)

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.

  • In your bucket policy, update the IAM role that you created while configuring the Snowflake storage integration to access Amazon S3 in the step above.
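
As an illustration with placeholder names (mydb.myschema.mystage as the stage and KYVOS_ROLE as the role used in the Kyvos Snowflake connection), the grant query is:

  GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE KYVOS_ROLE;

and the corresponding property on the Snowflake connection is:

  kyvos.connection.snowflake.stage = @mydb.myschema.mystage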

Processing semantic model with source as Snowflake on GCP

To process the semantic model with Snowflake on GCP as the source:

  1. Configure a Snowflake storage integration to access Google Cloud Storage.


Important

To configure a Snowflake storage integration to access Google Cloud Storage, refer to the Snowflake documentation.

Before creating an external stage, ensure that, as per the Snowflake documentation, the URL is:
gcs://<your bucket name>/<kyvos work directory>/temp/
For example, 'gcs://kyvos-qa/user/engine_work/temp/' (see the sketch below).
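
A minimal sketch of the storage integration and external stage, assuming the integration name gcs_int and the example bucket and path above (adjust all names to your environment, and grant the service account that Snowflake generates for the integration access to the bucket, as described in the Snowflake documentation):

  CREATE STORAGE INTEGRATION gcs_int
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'GCS'
    ENABLED = TRUE
    STORAGE_ALLOWED_LOCATIONS = ('gcs://kyvos-qa/user/engine_work/temp/');

  CREATE STAGE mydb.myschema.mystage
    STORAGE_INTEGRATION = gcs_int
    URL = 'gcs://kyvos-qa/user/engine_work/temp/';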


Note

  • This requires GCP console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Grant the USAGE privilege on the stage to the role used in the Kyvos Snowflake connection by executing the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake. (See the example after this list.)

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on the Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.
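
As with the AWS setup, an illustration with placeholder names (mydb.myschema.mystage as the stage and KYVOS_ROLE as the role used in the Kyvos Snowflake connection):

  GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE KYVOS_ROLE;

  kyvos.connection.snowflake.stage = @mydb.myschema.mystage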