
Applies to: ✓ Kyvos Enterprise  ✓ Kyvos Cloud (SaaS on AWS)  ✗ Kyvos AWS Marketplace  ✗ Kyvos Azure Marketplace  ✗ Kyvos GCP Marketplace  ✓ Kyvos Single Node Installation (Kyvos SNI)


Kyvos supports both ROLAP and HOLAP on BigQuery connections. You can use this connection for querying data from your BigQuery data warehouse.

Points to know

  • From Kyvos 2024.2 onwards, Kyvos supports Column Level Security (CLS) defined in BigQuery. When you create a semantic model with BigQuery as the data source, any column level security applied in BigQuery is reflected in Kyvos when accessing data from the semantic model.

    • The user's email ID in Kyvos must be the same as their email ID in Google Cloud.

    • The user must have Fine-Grained Reader access to read secured columns in Kyvos and BigQuery.

    • To read CLS columns in Kyvos when accessing data from the semantic model, the following permissions must be set on the connection's service account to successfully call GCP APIs:

      o bigquery.tables.get

      o cloudasset.assets.analyzeIamPolicy

      o cloudasset.assets.searchAllIamPolicies

      o cloudasset.assets.searchAllResources

      o datacatalog.taxonomies.get

      o serviceusage.services.use

      o iam.roles.get

  • If you implement a filter on a date column with the >= or <= operator and an explicit type cast, you must add the kyvos.rf.sqlparser.enabled property and set its value to true on the Hadoop connection.

  • You can create multiple BigQuery connections for raw data querying. The connections are available on the Dataset designer page, where you can select the connection to be used for a particular semantic model. 

  • All Spark-supported functions must be mapped to their supported BigQuery equivalents. You must update the SQL and datasets to use supported BigQuery functions; otherwise, queries that use incompatible functions will fail.

  • You can connect to your BigQuery data warehouse for reading data from both GCP and EMR-based AWS clusters. 

  • You can create multiple read connections using different BigQuery data warehouse accounts from different projects. You can then use datasets from all these BigQuery warehouses in a single relationship and hence a single semantic model. If you intend to create multiple datasets from different warehouses in different projects, ensure all the Materialization Datasets specified when configuring the Connections belong to the same region.
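The function-mapping requirement above can be sketched as a small lookup table. The sketch below is a hypothetical illustration, not Kyvos's actual translation layer; the function pairs shown (for example, Spark's NVL versus BigQuery's IFNULL) are common equivalents that you should verify against the BigQuery SQL function reference before updating your datasets.

```python
# Hypothetical sketch: rewriting Spark SQL function names to BigQuery
# equivalents before registering a dataset. This is NOT Kyvos's internal
# translation layer -- only an illustration of the mapping requirement.
import re

# A few well-known Spark -> BigQuery function equivalents.
SPARK_TO_BIGQUERY = {
    "NVL": "IFNULL",        # NVL(a, b)          -> IFNULL(a, b)
    "SUBSTRING": "SUBSTR",  # SUBSTRING(s, i, n) -> SUBSTR(s, i, n)
    "INSTR": "STRPOS",      # INSTR(s, sub)      -> STRPOS(s, sub)
}

def to_bigquery_sql(spark_sql: str) -> str:
    """Replace mapped Spark function names with BigQuery equivalents."""
    def swap(match: re.Match) -> str:
        name = match.group(1).upper()
        return SPARK_TO_BIGQUERY.get(name, match.group(1)) + "("
    # Match an identifier immediately followed by an opening parenthesis.
    return re.sub(r"\b(\w+)\s*\(", swap, spark_sql)

print(to_bigquery_sql("SELECT NVL(region, 'NA'), INSTR(name, 'co') FROM sales"))
```

Function names not present in the mapping pass through unchanged, so only the incompatible calls you have identified are rewritten.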

Setting up a BigQuery warehouse connection

To set up or view a BigQuery warehouse connection, perform the following steps. 

  1. From the Toolbox, click Connections.

  2. From the Actions menu, click Add Connection.

  3. Enter a name for the connection, or select an existing connection from the Connection list.

  4. After you have entered the parameters described in the table below, click the Test button from the top left to validate the connection settings.

  5. If the connection is valid, click the Save button.

  6. To refresh connections, click the Actions menu at the top of the Connections column and select Refresh.

