Simplify the BaseComponent interface (#64)
This change removes the following from `BaseComponent`:

- run_raw
- run_batch_raw
- run_document
- run_batch_document
- is_document
- is_batch

Each component is expected to support multiple types of input and a single type of output. Since we want components to work out-of-the-box with both standardized and customized use cases, supporting multiple input types is expected. At the same time, to reduce the complexity of understanding how to use a component, we restrict each component to a single output type.

To accommodate these changes, we also refactor some components to remove their run_raw, run_batch_raw... methods and to decide on a common output interface for those components. Tests are updated accordingly.

Commit changes:

* Add kwargs to vector store's query
* Simplify the BaseComponent
* Update tests
* Remove support for Python 3.8 and 3.9
* Bump version 0.3.0
* Fix GitHub PR caching still using the old environment after bumping the version

---------

Co-authored-by: ian <ian@cinnamon.is>
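To illustrate the intent, here is a minimal sketch of the simplified contract, not the actual library code: the removed `run_raw` / `run_batch_raw` / `run_document` / `run_batch_document` / `is_document` / `is_batch` hooks collapse into a single entry point that accepts several input types but always produces one output type. The class and method names below (`SimplifiedComponent`, `run`, `EchoLLM`) are illustrative assumptions, since this commit only states which methods were removed.

```python
# Hypothetical sketch of the "multiple input types, single output type" idea.
from __future__ import annotations

from abc import ABC, abstractmethod
from typing import Any


class SimplifiedComponent(ABC):
    """Hypothetical base: many input types in, exactly one output type out."""

    @abstractmethod
    def run(self, *args: Any, **kwargs: Any) -> Any:
        """Single entry point; subclasses accept str, list[str], etc."""

    def __call__(self, *args: Any, **kwargs: Any) -> Any:
        # Calling the component simply delegates to run(); there are no
        # separate *_raw / *_batch / *_document variants anymore.
        return self.run(*args, **kwargs)


class EchoLLM(SimplifiedComponent):
    """Toy component: accepts str or list[str], always returns list[str]."""

    def run(self, text: str | list[str], **kwargs: Any) -> list[str]:
        prompts = [text] if isinstance(text, str) else text
        return [f"echo: {p}" for p in prompts]


if __name__ == "__main__":
    llm = EchoLLM()
    print(llm("hello world"))       # single input  -> one output type
    print(llm(["hello", "world"]))  # batch input   -> same output type
```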
committed by GitHub
parent 6095526dc7
commit d79b3744cb
@@ -54,12 +54,6 @@ def test_azureopenai_model(openai_completion):
     ), "Output for single text is not LLMInterface"
     openai_completion.assert_called()
 
-    # test for list[str] input - batch mode
-    output = model(["hello world"])
-    assert isinstance(output, list), "Output for batch string is not a list"
-    assert isinstance(output[0], LLMInterface), "Output for text is not LLMInterface"
-    openai_completion.assert_called()
-
     # test for list[message] input - stream mode
     messages = [
         SystemMessage(content="You are a philosohper"),
@@ -73,9 +67,3 @@ def test_azureopenai_model(openai_completion):
         output, LLMInterface
     ), "Output for single text is not LLMInterface"
     openai_completion.assert_called()
-
-    # test for list[list[message]] input - batch mode
-    output = model([messages])
-    assert isinstance(output, list), "Output for batch string is not a list"
-    assert isinstance(output[0], LLMInterface), "Output for text is not LLMInterface"
-    openai_completion.assert_called()