Move prompts into LLMs module (#70)

Since prompts are only used by the LLMs, it is reasonable to keep them inside the LLM module. This makes the module easier to discover and keeps the code base simpler.

Changes:

* Move prompt components into llms
* Bump version to 0.3.1
* Make pip install dependencies with the eager upgrade strategy

---------

Co-authored-by: ian <ian@cinnamon.is>
Nguyen Trung Duc (john)
2023-11-14 16:00:10 +07:00
committed by GitHub
parent 8532138842
commit 693ed39de4
24 changed files with 44 additions and 37 deletions


```diff
@@ -2,9 +2,7 @@ from typing import AnyStr, Optional, Type, Union
 from pydantic import BaseModel, Field
-from kotaemon.llms.chats.base import ChatLLM
-from kotaemon.llms.chats.openai import AzureChatOpenAI
-from kotaemon.llms.completions.base import LLM
+from kotaemon.llms import LLM, AzureChatOpenAI, ChatLLM
 from .base import BaseTool, ToolException
```
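The flat import in the diff (`from kotaemon.llms import LLM, AzureChatOpenAI, ChatLLM`) works when the package's `__init__.py` re-exports symbols from its submodules. A minimal, self-contained sketch of that pattern, using a throwaway package built in a temp directory (all file names and classes here are illustrative, not the actual kotaemon sources):

```python
import os
import sys
import tempfile
import textwrap

# Build a throwaway package mimicking the layout implied by the diff.
# The top-level __init__.py re-exports names from submodules so callers
# can write `from llms import LLM` instead of deep import paths.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "llms")
os.makedirs(os.path.join(pkg, "chats"))
os.makedirs(os.path.join(pkg, "completions"))

files = {
    "chats/__init__.py": "",
    "chats/base.py": "class ChatLLM:\n    pass\n",
    "completions/__init__.py": "",
    "completions/base.py": "class LLM:\n    pass\n",
    # The re-export shim: this is the pattern, not the real file contents.
    "__init__.py": textwrap.dedent(
        """\
        from .chats.base import ChatLLM
        from .completions.base import LLM

        __all__ = ["LLM", "ChatLLM"]
        """
    ),
}
for rel, body in files.items():
    with open(os.path.join(pkg, rel), "w") as f:
        f.write(body)

# Import from the package root; the deep module paths stay internal.
sys.path.insert(0, root)
from llms import LLM, ChatLLM

print(LLM.__name__, ChatLLM.__name__)  # → LLM ChatLLM
```

Consumers depend only on `llms`' public surface, so submodules can be reorganized (as this commit does) without breaking callers.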