LLM Chat

TellusR allows you to interact with your search data through an LLM, using its powerful underlying retrieval-augmented generation (RAG) capabilities.

When the chat functionality is enabled, it can be accessed through the TellusR API or tried out interactively in the chat section of the TellusR administration interface.

[Screenshot: Chat in the TellusR administration interface]

How to set it up

To enable this functionality, you must add an authorization key for your preferred LLM. You can do this in one of the following ways:

  • Configure directly: Add the key and corresponding settings in the tellusr.config file, or set the credentials as environment variables (see the examples below).
  • Use the LLM setup widget: Configure the key via the LLM setup widget in the TellusR administration interface (LLM Setup). If a key is already set in tellusr.config or as an environment variable, the widget will use that value by default.
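
If you choose the environment-variable route, the variable names are the same as those in the configuration example below. The following is a minimal sketch, assuming the TellusR process inherits the environment of the shell or service definition that starts it:

export OPENAI_API_KEY=<openai authorization token>

# Or, for Azure OpenAI:
# export AZURE_OPENAI_API_KEY=<azure authorization token>
# export AZURE_OPENAI_API_RESOURCE=<resource configured in Azure>
# export AZURE_OPENAI_API_DEPLOYMENT=<deployment as configured in Azure>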

LLM configuration in tellusr.config

OPENAI_API_KEY=<openai authorization token>

# For fireworks.ai
# FIREWORKS_API_KEY=<fireworks authorization token>

# For Azure OpenAI
# AZURE_OPENAI_API_KEY=<azure authorization token>
# AZURE_OPENAI_API_RESOURCE=<resource configured in Azure>
# AZURE_OPENAI_API_DEPLOYMENT=<deployment as configured in Azure>
#
# If configured, a cheaper key may be used for some steps:
# AZURE2_OPENAI_API_KEY=<azure authorization token>
# AZURE2_OPENAI_API_RESOURCE=<resource configured in Azure>
# AZURE2_OPENAI_API_DEPLOYMENT=<deployment as configured in Azure>
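
If your installation runs TellusR in a container (an assumption about your deployment), the same variables can be passed to the container with Docker's standard -e option instead of editing the config file, for example:

docker run -e OPENAI_API_KEY=<openai authorization token> <your tellusr image>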

Please contact the TellusR team for assistance with advanced configuration needs.