HCatalog Configuration
Applies to: Kyvos Enterprise, Kyvos Cloud (SaaS on AWS), Kyvos AWS Marketplace, Kyvos Azure Marketplace, Kyvos GCP Marketplace, Kyvos Single Node Installation (Kyvos SNI)
HCatalog is a table storage management tool for Hadoop that exposes the tabular data of the Hive metastore to other Hadoop applications. It provides read and write interfaces for data processing tools such as Pig and MapReduce, enabling users of those tools to easily read and write data on the grid.
Note
For Azure (Databricks) deployments, only the Hive Version parameter is available for selection; the other fields are not displayed.
To configure the HCatalog properties for the cluster:
Select the Enable HCatalog checkbox to present a relational view of data in the Hadoop Distributed File System (HDFS).
Select the Use Hive as data source checkbox if your data is stored in Hive.
Enter the details as follows:
Area | Parameter/Field | Comments/Description
---|---|---
Node and Authentication | Hive Source Node | Select the Same As Name Node option if your Hive services run on the Name Node; otherwise, select the Other Node option.
| Hive Node Host Name | If you selected the Other Node option above, enter the IP address or host name of your Hive node here.
| Use different user account for accessing Hive Node | Select this checkbox to use a user account other than the Hadoop Node authentication user for accessing the Hive node. Specify the user name and the authentication type (Password or Private key).
Paths and Version | Hive Version | Select the Hive version from the list.
| HCatalog Library Path | Enter absolute library paths for JAR inclusion. Use a comma-separated list for multiple paths. Example: /home/hadoop/,/home/hadoop/lib/ Refer to the Appendix for details.
| HCatalog Configuration Path | Enter absolute paths for configuration file inclusion. Use a comma-separated list for multiple paths. Example: /home/hadoop/conf,/etc/hadoop/conf/
Custom Parameters | HCatalog Parameters | Use this to add custom HCatalog parameters for your cluster.
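Custom parameters are typically supplied as key=value pairs. The fragment below is an illustrative sketch only: the parameter names are standard Hive metastore settings (not Kyvos-specific values), and the host name is a placeholder to be replaced with your own.

```properties
# Hypothetical custom parameters; substitute values for your cluster.
hive.metastore.uris=thrift://metastore-host:9083
hive.metastore.client.socket.timeout=300s
```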
Click Validate Hive File Paths. The system validates the user authentication and the paths used to connect to the Name Node. If validation fails, click the Back button, edit the information as needed, and then click Revalidate.
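Before clicking Validate Hive File Paths, you can sanity-check the comma-separated path lists directly on the Hive node. This is a minimal sketch, assuming a POSIX-compatible bash shell; the paths used are the examples from the table above, and it simply reports whether each entry exists:

```shell
#!/usr/bin/env bash
# Report whether each entry of a comma-separated path list exists locally.
check_paths() {
  IFS=',' read -ra entries <<< "$1"
  for p in "${entries[@]}"; do
    if [ -e "$p" ]; then
      echo "OK      $p"
    else
      echo "MISSING $p"
    fi
  done
}

# Example values from the HCatalog Library/Configuration Path fields.
check_paths "/home/hadoop/,/home/hadoop/lib/"
check_paths "/home/hadoop/conf,/etc/hadoop/conf/"
```

Running this before validation can save a round trip through the UI when a path was mistyped.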
Note
The Validate Hive File Paths button is not displayed for the Azure (Databricks) environment.
Click the Save button at the top-right of the page to save your changes.
Copyright Kyvos, Inc. All rights reserved.