private-gpt/private_gpt
Latest commit: Javier Martinez · 20bad17c98
feat(llm): autopull ollama models (#2019)
* chore: update ollama (llm)

* feat: allow autopulling ollama models

* fix: mypy

* chore: always install ollama client

* refactor: check connection and pull ollama method to utils

* docs: update ollama config with autopulling info
2024-07-29 13:25:42 +02:00
components feat(llm): autopull ollama models (#2019) 2024-07-29 13:25:42 +02:00
open_ai feat: Upgrade LlamaIndex to 0.10 (#1663) 2024-03-06 17:51:30 +01:00
server feat(RAG): Introduce SentenceTransformer Reranker (#1810) 2024-04-02 10:29:51 +02:00
settings feat(llm): autopull ollama models (#2019) 2024-07-29 13:25:42 +02:00
ui feat(llm): Support for Google Gemini LLMs and Embeddings (#1965) 2024-07-08 11:47:36 +02:00
utils feat(llm): autopull ollama models (#2019) 2024-07-29 13:25:42 +02:00
__init__.py feat(local): tiktoken cache within repo for offline (#1467) 2024-03-11 22:55:13 +01:00
__main__.py fix: Remove global state (#1216) 2023-11-12 22:20:36 +01:00
constants.py Next version of PrivateGPT (#1077) 2023-10-19 16:04:35 +02:00
di.py fix: Remove global state (#1216) 2023-11-12 22:20:36 +01:00
launcher.py feat(llm): adds several settings for llamacpp and ollama (#1703) 2024-03-11 22:51:05 +01:00
main.py feat: Upgrade LlamaIndex to 0.10 (#1663) 2024-03-06 17:51:30 +01:00
paths.py fix: Remove global state (#1216) 2023-11-12 22:20:36 +01:00