Simplify the BaseComponent interface (#64)
This change removes the following methods from `BaseComponent`:

- run_raw
- run_batch_raw
- run_document
- run_batch_document
- is_document
- is_batch

Each component is now expected to support multiple input types and a single output type. Supporting multiple input types lets a component work out-of-the-box in both standardized and customized use cases, while restricting each component to a single output type keeps it easier to understand and use. To accommodate these changes, the affected components are refactored to drop their run_raw, run_batch_raw, ... methods, and a common output interface is decided for each of them. Tests are updated accordingly.

Commit changes:

* Add kwargs to vector store's query
* Simplify the BaseComponent
* Update tests
* Remove support for Python 3.8 and 3.9
* Bump version 0.3.0
* Fix GitHub PR caching still using the old environment after bumping the version

---------

Co-authored-by: ian <ian@cinnamon.is>
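As an illustration of the intended contract, here is a minimal sketch assuming a single `run`/`__call__` entry point; `WordCountComponent`, its `run` signature, and the toy output type are hypothetical examples for this sketch, not the library's actual API.

```python
from abc import ABC, abstractmethod
from typing import List, Union


class BaseComponent(ABC):
    """Sketch of the slimmed-down interface: one entry point instead of
    run_raw / run_batch_raw / run_document / run_batch_document and the
    is_document / is_batch flags."""

    @abstractmethod
    def run(self, *args, **kwargs):
        """Implementations may accept multiple input types here."""

    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)


class WordCountComponent(BaseComponent):
    """Hypothetical component: accepts a string or a list of strings,
    always returns a list of ints (a single output type)."""

    def run(self, text: Union[str, List[str]]) -> List[int]:
        texts = [text] if isinstance(text, str) else text
        return [len(t.split()) for t in texts]


counter = WordCountComponent()
print(counter("hello world"))          # [2]
print(counter(["hello world", "hi"]))  # [2, 1]
```

Whether the caller passes a single item or a batch, the output type stays the same, which is the single-output-type property described above.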
Committed by: GitHub
Parent: 6095526dc7
Commit: d79b3744cb
```diff
@@ -44,11 +44,6 @@ def test_azureopenai_model(openai_completion):
         model.agent, AzureOpenAILC
     ), "Agent not wrapped in Langchain's AzureOpenAI"
 
-    output = model(["hello world"])
-    assert isinstance(output, list), "Output for batch is not a list"
-    assert isinstance(output[0], LLMInterface), "Output for text is not LLMInterface"
-    openai_completion.assert_called()
-
     output = model("hello world")
     assert isinstance(
         output, LLMInterface
@@ -72,11 +67,6 @@ def test_openai_model(openai_completion):
         model.agent, OpenAILC
     ), "Agent is not wrapped in Langchain's OpenAI"
 
-    output = model(["hello world"])
-    assert isinstance(output, list), "Output for batch is not a list"
-    assert isinstance(output[0], LLMInterface), "Output for text is not LLMInterface"
-    openai_completion.assert_called()
-
     output = model("hello world")
     assert isinstance(
         output, LLMInterface
```