e96fdbf72f | perf3ct | 2025-06-09 00:23:02 +00:00 | fix(llm): fix logging type check
f5ad5b875e | perf3ct | 2025-06-08 23:02:15 +00:00 | fix(tests): resolve LLM streaming unit test failures
    closer to fixing...
    closer...
    very close to passing...
daa32e4355 | perf3ct | 2025-06-08 22:02:56 +00:00 | Revert "fix(unit): comment out this test for now to see if the rest pass"
    This reverts commit 95a33ba3c0bd1b01c7f6f716b42864174f186698.
95a33ba3c0 | perf3ct | 2025-06-08 21:54:19 +00:00 | fix(unit): comment out this test for now to see if the rest pass
b28387bada | perf3ct | 2025-06-08 21:47:53 +00:00 | feat(llm): decrease the throttle on the chunking tests lol
93cf868dcf | perf3ct | 2025-06-08 21:38:57 +00:00 | feat(llm): last test should be passing now
224cae6db2 | perf3ct | 2025-06-08 21:03:07 +00:00 | fix(unit): resolve type errors
d60e795421 | perf3ct | 2025-06-08 20:39:35 +00:00 | feat(llm): still working on fixing tests...
c6f2124e9d | perf3ct | 2025-06-08 20:30:33 +00:00 | feat(llm): add tests for streaming
c1bcb73337 | perf3ct | 2025-06-08 18:40:20 +00:00 | feat(llm): also improve the llm streaming service, to make it cooperate with unit tests better
40cad2e886 | perf3ct | 2025-06-08 18:20:30 +00:00 | fix(unit): I believe it should pass now?
a8faf5d699 | perf3ct | 2025-06-08 18:13:27 +00:00 | fix(unit): still working on getting the LLM unit tests to pass...
e011c56715 | perf3ct | 2025-06-08 16:33:26 +00:00 | fix(unit): no more type errors hopefully
d7abd3a8ed | Jon Fuller | 2025-06-08 08:49:08 -07:00 | Merge branch 'develop' into feat/llm-unit-tests
e87789d92b | Elian Doran | 2025-06-08 10:45:58 +03:00 | Merge pull request #2208 from TriliumNext/fix/llm-chat-save-bug
    fix(llm): save to the same note that the chat request was sent from
c6062f453a | perf3ct | 2025-06-07 23:57:35 +00:00 | fix(llm): changing providers works now
414781936b | perf3ct | 2025-06-07 23:36:53 +00:00 | fix(llm): always fetch the user's selected model
b6b88dff86 | perf3ct | 2025-06-07 21:13:02 +00:00 | fix(server): increment SYNC_VERSION and APP_DB_VERSION for LLM embeddings removal
7f9ad04b57 | perf3ct | 2025-06-07 21:03:54 +00:00 | feat(llm): create unit tests for LLM services
ff37050470 | perf3ct | 2025-06-07 19:33:19 +00:00 | fix(llm): delete provider_manager for embeddings too
b0d804da08 | perf3ct | 2025-06-07 18:57:08 +00:00 | fix(llm): remove the vectorSearch stage from the pipeline
4550c12c6e | perf3ct | 2025-06-07 18:30:46 +00:00 | feat(llm): remove everything to do with embeddings, part 3
44a2e7df21 | perf3ct | 2025-06-07 18:20:06 +00:00 | feat(llm): remove everything to do with embeddings, part 2
44a45780b7 | perf3ct | 2025-06-07 18:11:12 +00:00 | feat(llm): remove everything to do with embeddings
db3bf4c12c | Jin | 2025-06-07 12:10:41 +02:00 | feat: 🎸 set SSO login logic
fa44a5343b | Jin | 2025-06-07 12:10:41 +02:00 | feat: 🎸 support custon oidc server
408dcf7713 | Elian Doran | 2025-06-07 12:46:18 +03:00 | chore(release): prepare for v0.94.1
9ead5abc62 | Elian Doran | 2025-06-07 11:38:30 +03:00 | Merge pull request #2181 from TriliumNext/feat/llm-change-to-single-provider
    LLM integration, part 4
cb3844e627 | perf3ct | 2025-06-07 00:27:56 +00:00 | fix(llm): fix duplicated text when streaming responses
6bc9b3c184 | perf3ct | 2025-06-07 00:02:26 +00:00 | feat(llm): resolve sending double headers in responses, and not being able to send requests to ollama
20ec294774 | perf3ct | 2025-06-06 20:30:24 +00:00 | feat(llm): still work on decomplicating provider creation
8f33f37de3 | perf3ct | 2025-06-06 20:11:33 +00:00 | feat(llm): for sure overcomplicate what should be a very simple thing
85cfc8fbd4 | perf3ct | 2025-06-06 19:22:39 +00:00 | feat(llm): have OpenAI provider not require API keys (for endpoints like LM Studio)
091cd7a18a | Elian Doran | 2025-06-06 16:17:21 +03:00 | fix(server): totp asked even if no authentication is enabled
63a6f00a47 | Elian Doran | 2025-06-06 09:27:51 +03:00 | chore(server): add logs to debug missing session
c26b74495c | perf3ct | 2025-06-05 22:34:20 +00:00 | feat(llm): remove LLM deprecated functions
3a4bb47cc1 | perf3ct | 2025-06-05 21:03:15 +00:00 | feat(llm): embeddings work and are created when launching for the first ever time
bb8a374ab8 | perf3ct | 2025-06-05 19:27:45 +00:00 | feat(llm): transition from initializing LLM providers, to creating them on demand
c1b10d70b8 | perf3ct | 2025-06-05 18:59:32 +00:00 | feat(llm): also add functions to clear/unregister embedding providers
49e123f399 | perf3ct | 2025-06-05 18:47:25 +00:00 | feat(llm): create endpoints for starting/stopping embeddings
a084805762 | perf3ct | 2025-06-05 18:26:40 +00:00 | Merge branch 'develop' into feat/llm-change-to-single-provider
63722a28a2 | perf3ct | 2025-06-04 22:30:16 +00:00 | feat(llm): also add embeddings options for embedding creation
fe15a0378a | perf3ct | 2025-06-04 20:23:06 +00:00 | fix(llm): have the model_selection_stage use the instance of the aiServiceManager
a20e36f4ee | perf3ct | 2025-06-04 20:13:13 +00:00 | feat(llm): change from using precedence list to using a sing specified provider for either chat and/or embeddings
9bfadd7799 | Elian Doran | 2025-06-04 22:54:10 +03:00 | Merge branch 'develop' into dateNote
4475568d19 | Elian Doran | 2025-06-04 22:36:51 +03:00 | fix(server): migration not working due to change in becca loader
86689896a1 | Elian Doran | 2025-06-04 21:44:27 +03:00 | test(server): don't do automatic backup if migrating database
df7f0d4099 | Elian Doran | 2025-06-04 11:41:49 +03:00 | Merge pull request #2110 from TriliumNext/feat/llm-integration-part3
    LLM Integration, part 3
6563601667 | Elian Doran | 2025-06-04 11:39:47 +03:00 | Merge pull request #2123 from FliegendeWurst/shortcuts-i18n
    feat(i18n): description for all keyboard shortcuts
8445ece231 | Elian Doran | 2025-06-04 11:38:49 +03:00 | Merge pull request #2106 from TriliumNext/fix/llm-becca-sync
    fix(llm): Fix Note Embeddings not being synced correctly and causing sync loops