kotaemon

demo

User Guide | Developer Guide

Python 3.10+ | Code style: black | Built with Codeium

Build and use local RAG-based Question Answering (QA) applications.

This repository serves both end users who want to do QA on their documents and developers who want to build their own QA pipelines.

  • For end users:
    • A local Question Answering UI for RAG-based QA.
    • Supports LLM API providers (OpenAI, AzureOpenAI, Cohere, etc.) and local LLMs (currently only the GGUF format is supported, via llama-cpp-python; see the sketch after this list).
    • Easy installation scripts; no environment setup required.
  • For developers:
    • A framework for building your own RAG-based QA pipeline (a minimal sketch also follows this list).
    • See your RAG pipeline in action with the provided UI (built with Gradio).
    • Share your pipeline so that others can use it.
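
If you just want a feel for the local-LLM path, the snippet below loads a GGUF model with llama-cpp-python and asks it a single question. The model path is a placeholder for a GGUF file you have downloaded yourself, and the snippet illustrates the underlying library rather than kotaemon's own API.

    from llama_cpp import Llama

    # Placeholder path -- point this at any GGUF model you have downloaded locally.
    llm = Llama(model_path="./models/your-model.gguf", n_ctx=2048)

    response = llm.create_chat_completion(
        messages=[{"role": "user", "content": "What is retrieval-augmented generation?"}],
        max_tokens=256,
    )
    print(response["choices"][0]["message"]["content"])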
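
For the developer side, the sketch below shows the general shape of a RAG QA pipeline: retrieve context, build a prompt, generate an answer. The retriever and prompt here are deliberately toy stand-ins (plain word overlap), not kotaemon components; the framework's own building blocks are covered in the Developer Guide.

    from typing import Callable, List

    def retrieve(query: str, documents: List[str], k: int = 2) -> List[str]:
        """Toy retriever: rank documents by word overlap with the query."""
        words = set(query.lower().split())
        ranked = sorted(documents, key=lambda d: len(words & set(d.lower().split())), reverse=True)
        return ranked[:k]

    def answer(query: str, documents: List[str], llm: Callable[[str], str]) -> str:
        """Retrieve context, build a grounded prompt, and let the LLM answer."""
        context = "\n".join(retrieve(query, documents))
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}\nAnswer:"
        return llm(prompt)

    # `llm` can be any callable from prompt to completion, e.g. a thin wrapper
    # around the llama-cpp-python model above or an OpenAI-compatible client.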

This repository is under active development. Feedback, issues, and PRs are highly appreciated. Your input is valuable as it helps us persuade our business guys to support open source.

Setting up

  • Clone the repo

    git clone git@github.com:Cinnamon/kotaemon.git
    cd kotaemon
    
  • Install the environment

    • Create a conda environment (Python >= 3.10 is recommended)

      conda create -n kotaemon python=3.10
      conda activate kotaemon
      
      # install dependencies
      cd libs/kotaemon
      pip install -e ".[all]"
      
    • Or run the installer (one of the scripts/run_* scripts, depending on your OS); this installs all dependencies into a conda environment at install_dir/env, which you then activate:

      conda activate install_dir/env
      
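    Either way, you can quickly confirm the install by importing the package from the activated environment (this assumes the editable install registers the distribution under the name kotaemon, as defined in libs/kotaemon):

      from importlib.metadata import version

      import kotaemon  # noqa: F401 -- the import itself is the sanity check
      print(version("kotaemon"))  # prints the installed version string
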
  • Pre-commit

    pre-commit install
    
  • Test

    pytest tests