private-gpt/private_gpt
Last commit: 2024-06-06 21:07:07 +02:00
| Name | Last commit message | Date |
| --- | --- | --- |
| components | While ingesting, some files led to a crash due to encoding error. Even though utf-8 some characters still messed it up. | 2024-06-06 21:00:46 +02:00 |
| open_ai | feat: Upgrade to LlamaIndex to 0.10 (#1663) | 2024-03-06 17:51:30 +01:00 |
| server | added window_size setting for ingestion | 2024-06-06 21:07:07 +02:00 |
| settings | added window_size setting for ingestion | 2024-06-06 21:07:07 +02:00 |
| ui | feat(ui): Add Model Information to ChatInterface label | 2024-04-02 16:52:27 +02:00 |
| utils | feat(ingest): Created a faster ingestion mode - pipeline (#1750) | 2024-03-19 21:24:46 +01:00 |
| __init__.py | feat(local): tiktoken cache within repo for offline (#1467) | 2024-03-11 22:55:13 +01:00 |
| __main__.py | fix: Remove global state (#1216) | 2023-11-12 22:20:36 +01:00 |
| constants.py | Next version of PrivateGPT (#1077) | 2023-10-19 16:04:35 +02:00 |
| di.py | fix: Remove global state (#1216) | 2023-11-12 22:20:36 +01:00 |
| launcher.py | feat(llm): adds serveral settings for llamacpp and ollama (#1703) | 2024-03-11 22:51:05 +01:00 |
| main.py | feat: Upgrade to LlamaIndex to 0.10 (#1663) | 2024-03-06 17:51:30 +01:00 |
| paths.py | fix: Remove global state (#1216) | 2023-11-12 22:20:36 +01:00 |