TellusR enables you to chat with your search data via an LLM.
When the chat functionality is enabled, it can be accessed through the TellusR API or tested out in the chat section of the TellusR administration interface.
To enable this functionality, you need to add an authorization key for your preferred LLM in tellusr.config. You also need to set the name of the project that the LLM will be integrated with:
TELLUSR_DIALOGUE_PROJECT=<name of the project>
OPENAI_API_KEY=<openai authorization token>
# For fireworks.ai
# FIREWORKS_API_KEY=<fireworks authorization token>
# For Azure OpenAI
# AZURE_OPENAI_API_KEY=<azure authorization token>
# AZURE_OPENAI_API_RESOURCE=<resource configured in Azure>
# AZURE_OPENAI_API_DEPLOYMENT=<deployment as configured in Azure>
#
# If configured, a cheaper key may be used for some steps:
# AZURE2_OPENAI_API_KEY=<azure authorization token>
# AZURE2_OPENAI_API_RESOURCE=<resource configured in Azure>
# AZURE2_OPENAI_API_DEPLOYMENT=<deployment as configured in Azure>
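
Once the keys are configured and TellusR is restarted, the chat can be queried programmatically. The sketch below is a minimal example using Python's standard library; the host, port, endpoint path, and payload fields are assumptions for illustration only, so consult the TellusR API documentation for the actual request format.

# Minimal sketch of sending a chat query to TellusR.
# Host, endpoint path, and payload fields are hypothetical.
import json
import urllib.request

TELLUSR_HOST = "http://localhost:8989"   # assumed host and port
PROJECT = "my_project"                   # value of TELLUSR_DIALOGUE_PROJECT

payload = json.dumps({"query": "What does our return policy say?"}).encode("utf-8")
request = urllib.request.Request(
    f"{TELLUSR_HOST}/tellusr/api/v1/{PROJECT}/chat",   # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read()))

The same query can be issued interactively from the chat section of the administration interface, which is the easiest way to verify that the configured key works.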
Please contact the TellusR team for assistance with advanced configuration needs.