[BUG]: loglevel 0 and 1 from llama.cpp doesn't seem to be supported #995
Labels: stale
Comments
Thanks for the detailed report! You've located the right place to fix the problem; would you be interested in putting together a PR with the necessary changes?
LoicDagnas pushed a commit to LoicDagnas/LLamaSharp that referenced this issue on Nov 26, 2024: …ned anymore (issue SciSharp#995)
This issue has been automatically marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days.
This was resolved by #997
Description
Since this PR on llama.cpp, log levels no longer range from 2 to 5 but from 0 to 4, as defined here. So I guess the log level mapping should be adapted here.
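The failure mode below is consistent with a mapping written for the old 2–5 range that throws on anything else. A hypothetical sketch of that pattern (the method name and the exact level assignment are assumptions for illustration, not LLamaSharp's actual code):

```csharp
using Microsoft.Extensions.Logging;

// Hypothetical sketch of a native-to-ILogger level mapping written for the
// old llama.cpp range (2–5). Once llama.cpp emits levels 0 and 1, the
// default arm throws, matching the ArgumentOutOfRangeException reported below.
static LogLevel ToLogLevel(int nativeLevel) => nativeLevel switch
{
    2 => LogLevel.Error,
    3 => LogLevel.Warning,
    4 => LogLevel.Information,
    5 => LogLevel.Debug,
    _ => throw new ArgumentOutOfRangeException(nameof(nativeLevel)),
};
```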
Reproduction Steps
Register this very naive custom logger with
NativeLogConfig.llama_log_set(new LlamaSharpLogger());
then any call to
LlamaWeights.LoadFromFile("...")
fails with an ArgumentOutOfRangeException.
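For illustration, a minimal ILogger along these lines exercises the same code path (a hypothetical sketch, not the reporter's exact class, and it assumes the ILogger-based overload of NativeLogConfig.llama_log_set):

```csharp
using Microsoft.Extensions.Logging;

// Hypothetical "very naive" logger: accepts every level and writes to the console.
// The exception is thrown while the native log level is translated,
// before any message reaches this class.
public sealed class LlamaSharpLogger : ILogger
{
    public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;

    public bool IsEnabled(LogLevel logLevel) => true;

    public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
        Exception? exception, Func<TState, Exception?, string> formatter)
        => Console.WriteLine($"[{logLevel}] {formatter(state, exception)}");
}
```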
Environment & Configuration
Known Workarounds
No response