To use the Ask Kyvos Copilot feature, you need to configure GenAI connection details through Kyvos Manager. The default configurations are auto-populated, and you can modify these settings as needed.

Configuring a connection

Configuring an LLM Connection

To configure the GenAI LLM connection, perform the following steps: 

  1. On the navigation pane, click Kyvos and Ecosystem > GenAI Configuration. The page displays information about the GenAI Configuration connection details.

  2. In the Connections pane, click the plus icon to add a new GenAI connection. The connection you create is listed in the Connections pane, and you can edit its details whenever needed.

  3. Select the Enable checkbox to enable the GenAI connection. This enables the following:

    1. The Kyvos Dialogues option for creating expressions for calculated measures and members.

    2. The Kyvos Dialogues option for MDX queries, which appears on the Query Playground page.

    3. The Kyvos Dialogues option for Conversation AI. This AI-powered feature helps you query your data using natural language (semantic queries).

  4. Enter the required connection details.

  5. Click Save. The GenAI connection details are saved.

Configuring an embedding connection

You can now specify the type of connection while uploading a custom provider. After specifying the connection type, upload the required JAR and configuration files as a ZIP file corresponding to the selected type.

To configure the GenAI embedding connection, enter the following details:

  • Connection Name: A unique name that identifies your GenAI connection.

  • Provider: The name of the GenAI provider the system uses to generate output. Select the required provider from the list.

  • Model: The name of the model used for generating embeddings. Select the required model from the list.

  • EndPoint: The endpoint of the provider's embedding service.
    For Azure OpenAI, provide the endpoint in the following format:
    {deployment-id}/embeddings?api-version=2024-10-21
    For example, /text-embedding-ada-002/embeddings?api-version=2023-05-15

  • Prompt Token Limit: The maximum number of tokens allowed in a single prompt request for the current model.

  • Similarity Upper Threshold: The upper similarity threshold that controls the selection of records for Retrieval Augmented Generation (RAG).

  • Similarity Lower Threshold: The lower similarity threshold that controls the selection of records for RAG.

  • Template Folder Path: The folder path for templates.

  • RAG Max Records: The maximum number of RAG records to include in a user prompt.
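The similarity thresholds and RAG Max Records together bound which records are retrieved for a prompt. Kyvos' actual selection logic is internal; the sketch below shows one plausible reading of these settings, using cosine similarity, where records scoring between the lower and upper thresholds are kept up to the maximum count (all class and method names here are illustrative, not part of the Kyvos API):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: Kyvos' real RAG record selection is internal.
// Records whose cosine similarity to the query embedding falls between the
// lower and upper thresholds are kept, capped at maxRecords.
public class RagFilterSketch {

    static double cosineSimilarity(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    static List<Integer> selectRecords(double[][] recordEmbeddings,
                                       double[] queryEmbedding,
                                       double lower, double upper,
                                       int maxRecords) {
        List<Integer> selected = new ArrayList<>();
        for (int i = 0; i < recordEmbeddings.length
                && selected.size() < maxRecords; i++) {
            double sim = cosineSimilarity(recordEmbeddings[i], queryEmbedding);
            if (sim >= lower && sim <= upper) {
                selected.add(i);
            }
        }
        return selected;
    }
}
```

Tightening the lower threshold drops loosely related records, while the upper threshold and RAG Max Records keep the prompt within the configured token budget.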

Configuring GenAI Custom provider

On the Custom Provider page, while configuring a custom provider, you must set the Applicable For field to indicate whether the details apply to an LLM connection, an embedding connection, or both.


  • Provider Name: The name of the GenAI provider the system uses to generate output.

  • GenAI Provider ZIP: Upload the GenAI provider ZIP file. The ZIP file must include two folders:

    • conf: This folder must contain a metadata file named metadata.json.

    • lib: This folder must include the required .jar files.
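Laid out on disk, the ZIP described above would look like the following (the archive and JAR file names are only examples; metadata.json must keep that exact name):

```
genai-provider.zip
├── conf/
│   └── metadata.json
└── lib/
    └── custom-provider.jar
```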

  • Callback Class Name for LLM Service: The fully qualified class name, including the package name, of the class that implements the GenAICallbackService interface for the LLM service.

  • Callback Class Name for Embedding Service: The fully qualified class name, including the package name, of the class that implements the GenAIEmbeddingCallbackService interface for embedding generation.
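A fully qualified class name is the package name plus the class name. The skeleton below shows the form Kyvos Manager expects in these fields; the package and class names are placeholder examples, and the GenAICallbackService interface (with its method signatures) comes from the Kyvos SDK, so it is only referenced in a comment here:

```java
// Placeholder example: package and class names are illustrative.
// In a real provider, this class would implement the Kyvos SDK's
// GenAICallbackService interface (signatures not shown here).
package com.example.genai;

public class MyLlmCallback /* implements GenAICallbackService */ {
    // The value to enter in "Callback Class Name for LLM Service"
    // for this class would be its fully qualified name.
    public static String fullyQualifiedName() {
        return MyLlmCallback.class.getName(); // com.example.genai.MyLlmCallback
    }
}
```

For this example, you would enter com.example.genai.MyLlmCallback in the field, not just MyLlmCallback.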
