
Applies to: Kyvos Enterprise, Kyvos Cloud (SaaS on AWS), Kyvos AWS Marketplace, Kyvos Azure Marketplace, Kyvos GCP Marketplace, and Kyvos Single Node Installation (Kyvos SNI).


The following are example parameters; the parameters in your environment may differ slightly.

For each connection type, the available options and their details are listed below.

Hadoop Cluster

• Name: Name of the connection, such as HadoopConnection1.
• Category: Hadoop.
• Providers: Select from the list. For example, Hadoop_Cluster.
• Version: Hadoop version number.
• Vendor: For example, Cloudera.
• Mode: YARN or Non-YARN.
• Work Directory: Path to the data. For HDFS, for example, /user/Kyvos/Path; for S3, for example, KyvosBucket.
• Is Default Build Cluster: Select the checkbox to indicate that this connection is the default cluster used for computing (running builds).
• Is Default SQL Engine: Select the checkbox to indicate that this connection runs the default SQL engine. To use Hive or Spark to query raw data, leave this option cleared.
• Default SQL Engine: Specify Hive or Spark.
• Server and URL: For Spark, you need to specify the server URL.
• Properties: Click to view or add properties, as described in the steps below.

To add parameters:

1. Click Properties. Click a filter to see a smaller set, or enter a value in Search.

2. Click Add Property to add a property, or click a property to select it and click – to remove it. Default parameters cannot be removed.

3. Provide the parameter name and value, such as mapred.child.java.opts for the on-premises environment, or kyvos.connection.s3.bucket for the cloud environment.

NOTE: The parameters that you need to add or search for in an AWS environment are listed below (see the sketch after this list):

  • kyvos.connection.s3.bucket <S3 bucket_name>

  • kyvos.connection.filesystem <S3>
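
The following is a minimal sketch of how such name/value properties might be assembled programmatically. It is an illustration only: the keys mapred.child.java.opts, kyvos.connection.s3.bucket, and kyvos.connection.filesystem come from the steps and note above, while the values and the surrounding Java code are assumed placeholders, not part of the Kyvos API.

    import java.util.Properties;

    public class ConnectionPropertiesSketch {
        public static void main(String[] args) {
            Properties props = new Properties();

            // On-premises example: JVM options for MapReduce child tasks (placeholder value).
            props.setProperty("mapred.child.java.opts", "-Xmx2048m");

            // AWS example: S3 bucket and filesystem, as listed in the note above (placeholder bucket name).
            props.setProperty("kyvos.connection.s3.bucket", "KyvosBucket");
            props.setProperty("kyvos.connection.filesystem", "S3");

            // Print the resulting name/value pairs.
            props.forEach((name, value) -> System.out.println(name + " = " + value));
        }
    }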

PostgreSQL

• Name: Name of the connection.
• Category: RDBMS.
• Providers: Select from the list. For example, POSTGRESQL.
• Driver version: PostgreSQL driver version.
• Server: IP address of the server on which PostgreSQL is running.
• Port: Port number on which PostgreSQL is configured.
• Database: Name of the database.
• User name: User name to log in to the server.
• Password: Password to log in to the server.
• Is Repository: Select this checkbox to indicate that this connection is the repository.
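
A minimal sketch of connecting with these parameters over JDBC, assuming the standard PostgreSQL JDBC driver is on the classpath. The host, port, database name, and credentials are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class PostgresConnectionSketch {
        public static void main(String[] args) throws SQLException {
            // Server, Port, and Database from the parameters above (placeholder values).
            String url = "jdbc:postgresql://10.0.0.10:5432/kyvos_repository";

            // User name and Password from the parameters above (placeholder values).
            try (Connection conn = DriverManager.getConnection(url, "kyvos_user", "secret")) {
                System.out.println("Connected to PostgreSQL: " + !conn.isClosed());
            }
        }
    }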

Hive

• Name: Name of the connection.
• Category: SQL Engine.
• Providers: Select from the list. For example, HIVE.
• URL: URL of the Hive server.
• Is Default SQL Engine: Select the checkbox to indicate that this connection runs the default SQL engine.
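
A minimal sketch of what the Hive URL might look like when used over JDBC, assuming a HiveServer2 endpoint and the Apache Hive JDBC driver on the classpath. The host, port, database, and credentials are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class HiveConnectionSketch {
        public static void main(String[] args) throws SQLException {
            // Typical HiveServer2 URL form: jdbc:hive2://<host>:<port>/<database> (placeholder values).
            String url = "jdbc:hive2://hive-server.example.com:10000/default";

            try (Connection conn = DriverManager.getConnection(url, "hive_user", "secret")) {
                System.out.println("Connected to Hive: " + !conn.isClosed());
            }
        }
    }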

Presto

• Name: Name of the connection.
• Category: SQL Engine.
• Providers: Presto.
• Server: IP address of the server on which the Presto master node is configured.
• Port: Port number on which the Presto master node is configured.
• Catalog: Hive.
• Authentication Type: None, LDAP, or Kerberos.
• User Name: User name to log in to the server.
• Version: Presto version number.
• Is Default SQL Engine: Select the checkbox to indicate that this connection runs the default SQL engine.
• Properties: Click to view or add properties.
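
A minimal sketch of a Presto JDBC connection built from the Server, Port, and Catalog values above, assuming a Presto JDBC driver that accepts the common jdbc:presto://<host>:<port>/<catalog> URL form. All values are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.Properties;

    public class PrestoConnectionSketch {
        public static void main(String[] args) throws SQLException {
            // Server and Port of the Presto master node, plus the Hive catalog (placeholder values).
            String url = "jdbc:presto://presto-master.example.com:8080/hive";

            // With Authentication Type "None", only a user name is typically required.
            Properties props = new Properties();
            props.setProperty("user", "presto_user");

            try (Connection conn = DriverManager.getConnection(url, props)) {
                System.out.println("Connected to Presto: " + !conn.isClosed());
            }
        }
    }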

Snowflake

• Name: Name of the connection.
• Category: Warehouse.
• Providers: Snowflake.
• Server: Cloud data platform account URL of the server on which the Snowflake master node is configured. For example, df34534.us-east-1.snowflakecomputing.com. This URL is provided by Snowflake.
• Account: Full name of your account (provided by Snowflake). Note that your full account name might include additional segments that identify the region and cloud platform where your account is hosted.
• Warehouse: Name of the virtual warehouse (for reading data) to use once connected to the Snowflake server.
• Staging Database: Name of the default staging database provided with your Snowflake account.
• Role: Access control role to use for the Snowflake session. For example, Sysadmin.
• URL: URL to access the server. For example, jdbc:snowflake://abc-west-1.mycompany.com/
• Authentication Type: Authenticator to use for verifying user login credentials.
  • Snowflake: Select this option to use the internal Snowflake authenticator. Enter your credentials in the User Name and Password fields.
  • OAuth: Select this option to authenticate using OAuth. You need to provide the Token parameter to specify the OAuth token.
• Redirect URL: Click the icon next to the Redirect URL to copy the URL to the clipboard. Set the redirect URL on the Snowflake server to this URL.
• Client ID: The Client ID is created when you register your client with Snowflake.
• Client Secret: The Client Secret is created when you register your client with Snowflake.
• Token URL: The URL where the token is stored.
• Fetch Tokens: Click Fetch Tokens to get new tokens. You must provide the Client ID, Client Secret, and Token URL.
• Access Token: The access token represents the authorization granted to a client by a user to access their data using a specified role.
• Refresh Token: The refresh token is a string used to obtain a new access token when the current one expires. A refresh token is optionally issued by the authorization server to the client together with an access token. When the expiration date is known, it is shown.
• Subscribe: Click Subscribe to get notifications for the Refresh Token.
• User Name: User name to log in to the server when using the Snowflake authentication type.
• Password: Password to log in to the server when using the Snowflake authentication type.
• Is Default SQL Engine: To allow querying of raw data, select the checkbox to indicate that this connection runs the default SQL engine.
• Properties: Click to view or add properties.
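
A minimal sketch of connecting with the Snowflake authentication type over JDBC, assuming the Snowflake JDBC driver and its commonly documented connection properties (user, password, warehouse, db, role). The account URL and all values are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.Properties;

    public class SnowflakeConnectionSketch {
        public static void main(String[] args) throws SQLException {
            // URL parameter from above (placeholder account locator).
            String url = "jdbc:snowflake://df34534.us-east-1.snowflakecomputing.com/";

            Properties props = new Properties();
            props.setProperty("user", "kyvos_user");      // User Name (placeholder)
            props.setProperty("password", "secret");      // Password (placeholder)
            props.setProperty("warehouse", "KYVOS_WH");   // Warehouse (placeholder)
            props.setProperty("db", "KYVOS_STAGING");     // Staging Database (placeholder)
            props.setProperty("role", "SYSADMIN");        // Role

            try (Connection conn = DriverManager.getConnection(url, props)) {
                System.out.println("Connected to Snowflake: " + !conn.isClosed());
            }
        }
    }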

BigQuery

• Name: Name of the connection.
• Category: Warehouse.
• Providers: BIGQUERY.
• Server: IP address of the server on which the BigQuery master node is configured.
• Port: Port number on which the BigQuery master node is configured.
• Project ID: The ID of the project.
• Authentication Type: By default, this is Application Default Credentials.
• Properties: Click to view or add properties.
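
A minimal sketch of authenticating to BigQuery with Application Default Credentials, assuming the Google Cloud BigQuery client library for Java (google-cloud-bigquery) is on the classpath. The project ID is a placeholder.

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;

    public class BigQueryAdcSketch {
        public static void main(String[] args) {
            // Application Default Credentials are resolved from the environment, for example
            // the GOOGLE_APPLICATION_CREDENTIALS variable or an attached service account.
            BigQuery bigquery = BigQueryOptions.newBuilder()
                    .setProjectId("my-kyvos-project") // Project ID parameter (placeholder)
                    .build()
                    .getService();

            System.out.println("Connected to project: " + bigquery.getOptions().getProjectId());
        }
    }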

Redshift

• Name: Name of the connection.
• Category: Warehouse.
• Provider: REDSHIFT.
• Authentication Type: By default, this is User Name and Password (or IAM Instance Profile Connections).
• URL: Connection to the database. The JDBC URL has the following format: jdbc:redshift://endpoint:port/database
• User Name: Redshift user name.
• Password: Redshift password.
• SSL: Select the checkbox to configure the driver to use a non-validating SSL factory.
• DB User: Redshift user that the IAM role is mapped to.
• DB Groups: Redshift group that the IAM role is mapped to. The group must already exist; if no group is specified, PUBLIC is used.
• Auto create DB User: Select the checkbox to create the DB user at runtime.
• Properties: Click to view or add properties.
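
A minimal sketch of a Redshift connection built from the URL format above, assuming the Amazon Redshift JDBC driver is on the classpath. The endpoint, database, and credentials are placeholders, and the ssl/sslfactory property names shown for a non-validating SSL factory are assumptions that should be verified against your driver version.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.Properties;

    public class RedshiftConnectionSketch {
        public static void main(String[] args) throws SQLException {
            // URL format from above: jdbc:redshift://endpoint:port/database (placeholder values).
            String url = "jdbc:redshift://my-cluster.abc123xyz.us-east-1.redshift.amazonaws.com:5439/dev";

            Properties props = new Properties();
            props.setProperty("user", "kyvos_user");  // User Name (placeholder)
            props.setProperty("password", "secret");  // Password (placeholder)

            // SSL checkbox: a non-validating SSL factory is commonly configured with
            // properties like these (assumed; check your driver's documentation).
            props.setProperty("ssl", "true");
            props.setProperty("sslfactory", "com.amazon.redshift.ssl.NonValidatingFactory");

            try (Connection conn = DriverManager.getConnection(url, props)) {
                System.out.println("Connected to Redshift: " + !conn.isClosed());
            }
        }
    }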

