Since prompts are only used by LLMs, it is reasonable to keep them within the LLM module. This makes the module easier to discover and keeps the code base simpler.

Changes:

* Move prompt components into llms
* Bump version to 0.3.1
* Make pip install dependencies in eager mode

---------

Co-authored-by: ian <ian@cinnamon.is>
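With this change, PromptTemplate is imported from kotaemon.llms, as the prompt file below shows. A minimal sketch of the new import path (the example template and its {question} placeholder are illustrative only, not part of this change):

```python
# Prompt components now live under kotaemon.llms after this refactor.
from kotaemon.llms import PromptTemplate

# Illustrative template only; the {question} placeholder is a made-up example.
qa_prompt = PromptTemplate(template="Answer the question concisely:\n{question}")
```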
29 lines · 750 B · Python
# flake8: noqa

from kotaemon.llms import PromptTemplate

zero_shot_react_prompt = PromptTemplate(
    template="""Answer the following questions as best you can. You have access to the following tools:
{tool_description}.
Use the following format:

Question: the input question you must answer
Thought: you should always think about what to do

Action: the action to take, should be one of [{tool_names}]

Action Input: the input to the action

Observation: the result of the action

... (this Thought/Action/Action Input/Observation can repeat N times)
#Thought: I now know the final answer
Final Answer: the final answer to the original input question

Begin! After each Action Input.

Question: {instruction}
Thought:{agent_scratchpad}
"""
)
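For reference, here is a hypothetical sketch of how the four placeholders ({tool_description}, {tool_names}, {instruction}, {agent_scratchpad}) might be filled. It formats the raw template string directly with str.format; the `.template` attribute access and the tool/question values are assumptions, since PromptTemplate's own fill API is not shown in this file.

```python
# Hypothetical usage sketch (not part of the file above): fill the four
# placeholders with str.format on the raw template string.
raw = zero_shot_react_prompt.template  # assumes the `template` kwarg is kept as an attribute

filled = raw.format(
    tool_description="search: look up information on the web",  # example tool description
    tool_names="search",                                         # example tool name list
    instruction="Who wrote the Zen of Python?",                  # the user question
    agent_scratchpad="",                                          # empty on the first step
)
print(filled)
```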