kotaemon/libs/ktem
Duc Nguyen (john) a203fc0f7c
Allow users to add LLM within the UI (#6)
* Rename AzureChatOpenAI to LCAzureChatOpenAI
* Provide vanilla ChatOpenAI and AzureChatOpenAI
* Remove the highest accuracy, lowest cost criteria

These criteria are unnecessary: the users, not the pipeline creators, should choose
which LLM to use. Furthermore, inputting this information is cumbersome and
really degrades the user experience.

* Remove the LLM selection in simple reasoning pipeline
* Provide a dedicated stream method to generate the output
* Return placeholder message to chat if the text is empty
2024-04-06 11:53:17 +07:00
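The last two bullets can be illustrated with a minimal sketch. This is a hypothetical illustration, not the actual ktem code: the `SimplePipeline` class, the `llm` callable, and the `PLACEHOLDER` text are all invented for the example, assuming a dedicated `stream` method that yields output chunks and falls back to a placeholder message when the generated text is empty.

```python
from typing import Iterator

PLACEHOLDER = "(Sorry, I could not generate an answer)"  # hypothetical placeholder text


class SimplePipeline:
    """Hypothetical pipeline with a dedicated streaming entry point."""

    def __init__(self, llm):
        self.llm = llm  # any callable that returns an iterable of text chunks

    def stream(self, prompt: str) -> Iterator[str]:
        """Yield output chunks; yield a placeholder if the output is empty."""
        produced = False
        for chunk in self.llm(prompt):
            if chunk:
                produced = True
                yield chunk
        if not produced:
            yield PLACEHOLDER


# Usage with dummy LLMs: one that produces text, one that produces nothing.
chunks = list(SimplePipeline(lambda p: iter(["Hello", " world"])).stream("hi"))
empty = list(SimplePipeline(lambda p: iter([])).stream("hi"))
```

Streaming through a generator lets the UI render partial output as it arrives, and the placeholder guarantees the chat never shows an empty message.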
ktem             | Allow users to add LLM within the UI (#6)               | 2024-04-06 11:53:17 +07:00
ktem_tests       | Allow users to add LLM within the UI (#6)               | 2024-04-06 11:53:17 +07:00
migrations       | Optionally allow database migration with Alembic        | 2024-01-28 19:54:15 +07:00
.gitignore       | Make ktem official (#134)                               | 2024-01-23 10:54:18 +07:00
alembic.ini      | Optionally allow database migration with Alembic        | 2024-01-28 19:54:15 +07:00
flowsettings.py  | Allow users to add LLM within the UI (#6)               | 2024-04-06 11:53:17 +07:00
launch.py        | Allow users to add LLM within the UI (#6)               | 2024-04-06 11:53:17 +07:00
MANIFEST.in      | Improve kotaemon based on insights from projects (#147) | 2024-02-28 22:18:29 +07:00
pyproject.toml   | make default installation faster (#2)                   | 2024-03-21 22:48:20 +07:00
README.md        | Make ktem official (#134)                               | 2024-01-23 10:54:18 +07:00
requirements.txt | Make ktem official (#134)                               | 2024-01-23 10:54:18 +07:00

Example of an MVP pipeline

Prerequisites

To run the system out-of-the-box, please supply the following environment variables:

OPENAI_API_KEY=
OPENAI_API_BASE=
OPENAI_API_VERSION=
SERPAPI_API_KEY=
COHERE_API_KEY=
OPENAI_API_KEY_EMBEDDING=

# optional
KH_APP_NAME=
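One way to supply these is to export them in the shell before launching. The values below are placeholders, not working credentials; substitute your own keys and endpoints (the Azure-style `OPENAI_API_BASE` / `OPENAI_API_VERSION` values are assumptions for illustration):

```shell
# Required variables (placeholder values; replace with your own)
export OPENAI_API_KEY="sk-..."
export OPENAI_API_BASE="https://example.openai.azure.com/"
export OPENAI_API_VERSION="2023-05-15"
export SERPAPI_API_KEY="your-serpapi-key"
export COHERE_API_KEY="your-cohere-key"
export OPENAI_API_KEY_EMBEDDING="$OPENAI_API_KEY"  # may be the same key

# Optional
export KH_APP_NAME="kotaemon"
```

Alternatively, place the same `KEY=value` lines in a file and source it before running, so the keys stay out of your shell history.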

Run

gradio launch.py