Been having too much fun using LLMs hosted locally, but I can’t seem to get Ollama’s chat-with-documents feature to work well. Lots of “what are you talking about? There are no documents here” issues. Does anyone have recommendations for either a) figuring out what’s going wrong, or b) alternative locally hosted options where chat with documents works well (GPT4All or something)?

  • Bazsalanszky@lemmy.toldi.eu
    3 months ago

    I haven’t tested it extensively, but Open WebUI also has RAG functionality (chat with documents).

    The UI itself is also kinda cool, and it has other useful features like commands (for common prompts) and searching for stuff online (e.g. with searx). It works quite well with Ollama.
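
    For anyone curious what the RAG step in these tools actually does: before the model sees your question, the tool splits your documents into chunks, retrieves the most relevant ones, and prepends them to the prompt. Here is a minimal sketch of that retrieval step using plain keyword overlap — real tools like Open WebUI use embedding vectors instead, so this is an illustration of the idea, not their implementation; the chunk size and scoring are arbitrary choices for the demo.

    ```python
    # Sketch of the retrieval step behind "chat with documents" (RAG).
    # Scoring is plain keyword overlap; production tools use embeddings.
    import re

    def chunk(text, size=10):
        """Split a document into chunks of roughly `size` words."""
        words = text.split()
        return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

    def score(query, chunk_text):
        """Count how many query words also appear in the chunk (case-insensitive)."""
        q = set(re.findall(r"\w+", query.lower()))
        c = set(re.findall(r"\w+", chunk_text.lower()))
        return len(q & c)

    def retrieve(query, document, top_k=1):
        """Return the top_k chunks most relevant to the query."""
        chunks = chunk(document)
        return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:top_k]

    doc = ("Ollama serves local models over an HTTP API. "
           "Open WebUI adds a browser frontend with document upload. "
           "RAG pipelines retrieve relevant chunks and prepend them to the prompt.")

    best = retrieve("how does RAG use chunks in the prompt", doc)[0]
    # The retrieved chunk would then be prepended to the user's question and
    # sent to the local model (e.g. via Ollama's HTTP API -- not called here).
    print(best)
    ```

    If the model keeps saying “there are no documents here,” this is the step to suspect: either no chunk was retrieved, or the retrieved chunk never made it into the prompt.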