Processing semantic model without Spark on Snowflake Data source

To process the semantic model with Snowflake on AWS as the data source, do one of the following:

  • In the semantic model advanced properties, provide the value of your secret key in the AWS_SECRET_KEY property.

  • In the semantic model advanced properties, provide the value of your access key in the AWS_KEY_ID property.
    -or-

  • Configure a Snowflake storage integration to access Amazon S3.

Important

Refer to the Snowflake documentation for configuring a Snowflake storage integration to access Amazon S3. As per the Snowflake documentation, before creating an external stage, ensure that the URL is in the form: s3://<your bucket name>/<Kyvos work directory>/temp/

For example, 's3://kyvos-qa/user/engine_work/temp/'
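As a reference, the storage integration and external stage might be created with statements like the following sketch. The integration name, role ARN, account ID, and database/schema/stage names are placeholders; substitute the values from your environment, and note that the stage URL must follow the <your bucket name>/<Kyvos work directory>/temp/ pattern described above:

```sql
-- Placeholder names and ARN; replace with values from your environment.
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://kyvos-qa/user/engine_work/temp/');

-- The stage URL points to <bucket>/<Kyvos work directory>/temp/
CREATE STAGE mydb.myschema.mystage
  URL = 's3://kyvos-qa/user/engine_work/temp/'
  STORAGE_INTEGRATION = s3_int;
```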

Note

This requires AWS console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.

  • Execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query on Snowflake, granting to the role used in the Snowflake connection in Kyvos.
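For example, assuming a stage named mydb.myschema.mystage and a hypothetical role KYVOS_ROLE used in the Kyvos Snowflake connection:

```sql
-- KYVOS_ROLE is a placeholder for the role used in the Kyvos Snowflake connection.
GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE KYVOS_ROLE;
```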

  • Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property on Snowflake connection.
    NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.

  • Update the IAM role in your bucket policy that you created while configuring the Snowflake storage integration to access Amazon S3 in the previous step.
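The bucket policy update might look like the following sketch; the role ARN, account ID, bucket name, and work directory are placeholders for your environment:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:role/snowflake_access_role" },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::kyvos-qa",
        "arn:aws:s3:::kyvos-qa/user/engine_work/temp/*"
      ]
    }
  ]
}
```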

Copyright Kyvos, Inc. All rights reserved.