
[BUG]: log levels 0 and 1 from llama.cpp don't seem to be supported #995

Open
LoicDagnas opened this issue Nov 25, 2024 · 1 comment

@LoicDagnas
Contributor

LoicDagnas commented Nov 25, 2024

Description

Since this PR on llama.cpp, log levels are no longer between 2 and 5 but between 0 and 4 (see here). So the log level mapping should be adapted here.
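
For reference, a mapping along the following lines would cover the new range. This is a minimal sketch, not LLamaSharp's actual code; the enum name, member names and exact values (NONE=0 through ERROR=4) are assumptions based on the description above.

using System;
using Microsoft.Extensions.Logging;

internal static class NativeLogLevelMapping
{
    // Hypothetical enum mirroring the post-PR llama.cpp log levels;
    // the member names and values are assumed, not taken from the sources.
    internal enum LLamaLogLevel
    {
        None = 0,
        Debug = 1,
        Info = 2,
        Warning = 3,
        Error = 4,
    }

    // Cover the full 0..4 range so that levels 0 and 1 no longer fall outside the mapping.
    internal static LogLevel ToLogLevel(LLamaLogLevel level) => level switch
    {
        LLamaLogLevel.None    => LogLevel.None,
        LLamaLogLevel.Debug   => LogLevel.Debug,
        LLamaLogLevel.Info    => LogLevel.Information,
        LLamaLogLevel.Warning => LogLevel.Warning,
        LLamaLogLevel.Error   => LogLevel.Error,
        _ => throw new ArgumentOutOfRangeException(nameof(level), level, "Unknown llama.cpp log level"),
    };
}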

Reproduction Steps

Register the following very naive custom logger with NativeLogConfig.llama_log_set(new LlamaSharpLogger()):

using System;
using Microsoft.Extensions.Logging;

// Deliberately minimal ILogger: accepts every level and discards every message.
internal sealed class LlamaSharpLogger : ILogger
{
    public IDisposable BeginScope<TState>(TState state) => default;

    public void Log<TState>(
        LogLevel logLevel,
        EventId eventId,
        TState state,
        Exception exception,
        Func<TState, Exception, string> formatter)
    {
        // No-op: the bug is triggered before this method ever matters.
    }

    public bool IsEnabled(LogLevel logLevel) => true;
}

Any call to LlamaWeights.LoadFromFile("...") will then fail with an ArgumentOutOfRangeException.
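
A minimal end-to-end repro might look like the following sketch; the model path is a placeholder and the namespaces are assumed from LLamaSharp 0.19.

using LLama;
using LLama.Common;
using LLama.Native;

// Register the no-op logger from above, then load any model.
NativeLogConfig.llama_log_set(new LlamaSharpLogger());

// Placeholder path: with LLamaSharp 0.19 and a recent llama.cpp backend, the native
// log callback receives levels 0/1 and the load throws ArgumentOutOfRangeException.
using var weights = LLamaWeights.LoadFromFile(new ModelParams("path/to/any-model.gguf"));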

Environment & Configuration

  • Operating system: Windows
  • .NET runtime version: 8.0
  • LLamaSharp version: 0.19
  • CUDA version (if you are using cuda backend): 12
  • CPU & GPU device: RTX 3000

Known Workarounds

No response

@martindevans
Member

martindevans commented Nov 25, 2024

Thanks for the detailed report!

You've found the right place to fix the problem. Would you be interested in putting together a PR with the necessary changes?

LoicDagnas pushed a commit to LoicDagnas/LLamaSharp that referenced this issue Nov 26, 2024
LoicDagnas added a commit to LoicDagnas/LLamaSharp that referenced this issue Nov 26, 2024
martindevans pushed a commit that referenced this issue Nov 26, 2024
The log levels defined on the llama.cpp and LLamaSharp sides were no longer aligned (issue #995) (#997)

* The log levels defined on the llama.cpp and LLamaSharp sides were no longer aligned (issue #995)
* Handle the continuation log level by reusing the latest log level used
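
The second bullet refers to llama.cpp's continuation log messages, which carry no severity of their own and should reuse the level of the preceding message. A rough sketch of that idea, reusing the hypothetical types from the mapping sketch above (the continuation value of 5 is an assumption):

using Microsoft.Extensions.Logging;

internal static class ContinuationLevelTracker
{
    private static LogLevel _previous = LogLevel.Information;

    internal static LogLevel Resolve(NativeLogLevelMapping.LLamaLogLevel level)
    {
        // Assumed continuation value (5): the message continues the previous line,
        // so reuse the previous message's level instead of mapping it.
        if ((int)level == 5)
            return _previous;

        var mapped = NativeLogLevelMapping.ToLogLevel(level);
        _previous = mapped; // remember for any continuation lines that follow
        return mapped;
    }
}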