kotaemon

Documentation


Build and use local RAG-based Question Answering (QA) applications.

This repository serves both end users who want to do QA on their documents and developers who want to build their own QA pipelines.

  • For end users:
    • A local Question Answering UI for RAG-based QA.
    • Supports LLM API providers (OpenAI, AzureOpenAI, Cohere, etc.) and local LLMs (currently only the GGUF format is supported, via llama-cpp-python).
    • Easy installation scripts, no environment setup required.
  • For developers:
    • A framework for building your own RAG-based QA pipeline.
    • See your RAG pipeline in action with the provided UI (built with Gradio).
    • Share your pipeline so that others can use it.

This repository is under active development. Feedback, issues, and PRs are highly appreciated. Your input is valuable as it helps us persuade our business guys to support open source.

Installation

Manual installation

  • Clone the repo

    git clone git@github.com:Cinnamon/kotaemon.git
    cd kotaemon
    
  • Install the environment

    • Create a conda environment (python >= 3.10 is recommended)

      conda create -n kotaemon python=3.10
      conda activate kotaemon
      
      # install dependencies
      cd libs/kotaemon
      pip install -e ".[all]"
      
    • Or run the installer (one of the scripts/run_* scripts, depending on your OS); this installs all dependencies into a conda environment at install_dir/env.

      conda activate install_dir/env
      
  • Pre-commit

    pre-commit install
    
  • Test

    pytest tests
    

From installation scripts

  1. Clone the repository.
  2. Navigate to the scripts folder and start the installer that matches your OS (see the example after this list):
    • Linux: run_linux.sh
    • Windows: run_windows.bat
    • macOS: run_macos.sh
  3. After the installation, the installer will ask whether to launch ktem's UI; answer the prompt to continue.
  4. If launched, the application will be available at http://localhost:7860/.
  5. The conda environment is located in the install_dir/env folder.
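
For reference, step 2 roughly looks like this from a terminal, assuming a bash-compatible shell on Linux or macOS (on Windows, run run_windows.bat from cmd or by double-clicking it):

    cd scripts
    bash run_linux.sh    # or: bash run_macos.sh on macOS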

Here is the setup and update strategy:

  • Run the run_* script: This sets up the environment, including downloading Miniconda (in case Conda is not available on your machine) and installing the necessary dependencies in the install_dir folder.
  • Launch the UI: To launch ktem's UI after the initial setup or after any changes, simply run the run_* script again.
  • Reinstall dependencies: Simply delete the install_dir/env folder and run the run_* script again. The script will recreate the folder with fresh dependencies, as shown below.
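
As a concrete sketch for Linux or macOS (install_dir is created by the installer; substitute run_windows.bat on Windows):

    # wipe the environment so the installer recreates it with fresh dependencies
    rm -rf install_dir/env
    bash scripts/run_linux.sh    # or run_macos.sh, depending on your OS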