
[BUG]: Error in version 0.25.0 - LLama.Exceptions.RuntimeError: Failed to load the native library. #1275

@rockofme1

Description

After upgrading to version 0.25.0, the following error occurs when loading the native library:

LLama.Exceptions.RuntimeError: Failed to load the native library. Please check the log for more information.
   at LLama.Native.NativeLibraryUtils.TryLoadLibrary(NativeLibraryConfig config, INativeLibrary& loadedLibrary)
   at LLama.Native.NativeLibraryConfig.DryRun(INativeLibrary& loadedLibrary)
   at LLama.Native.NativeLibraryConfigContainer.DryRun(INativeLibrary& loadedLLamaNativeLibrary, INativeLibrary& loadedLLavaNativeLibrary)

Specifically, the error comes from this code:

        NativeLibraryConfig
            .All
            .WithCuda()
            .WithAutoFallback(false)
            .DryRun(out INativeLibrary? _, out INativeLibrary? _);

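As a first diagnostic step, it may help to verify from inside the container that the CUDA 12 native files are actually present in the published output. Below is a minimal sketch using only standard .NET APIs; the file names are assumptions based on the log line further down and the usual backend package layout, and may differ between LLamaSharp versions:

    // Check whether the expected CUDA 12 native libraries exist next to the app.
    // NOTE: these relative paths are assumptions; adjust them to the actual
    // layout produced by the backend package you are using.
    using System;
    using System.IO;

    string[] expected =
    {
        "runtimes/linux-x64/native/cuda12/libllama.so",
        "runtimes/linux-x64/native/cuda12/libllava_shared.so",
    };

    foreach (var relative in expected)
    {
        var full = Path.Combine(AppContext.BaseDirectory, relative);
        Console.WriteLine($"{relative}: {(File.Exists(full) ? "present" : "MISSING")}");
    }
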
The project runs in Docker with the following environment variables set:

GGML_CUDA_DISABLE_GRAPHS: "1"
GGML_CUDA_FORCE_MMQ: "1"
CUDA_LAUNCH_BLOCKING: "1"

UPDATE
I found the following line in the logs:

Failed Loading 'runtimes/linux-x64/native/cuda12/libllava_shared.so'

After this, the application shuts down.
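
When the file is present but still fails to load, the most common cause on Linux is a missing dependency (for example a CUDA runtime library that is not available inside the container). Here is a small sketch to surface the underlying loader error, assuming the path from the log is relative to the application base directory:

    using System;
    using System.IO;
    using System.Runtime.InteropServices;

    var path = Path.Combine(AppContext.BaseDirectory,
                            "runtimes/linux-x64/native/cuda12/libllava_shared.so");

    try
    {
        // NativeLibrary.Load throws DllNotFoundException when the library (or one
        // of its dependencies) cannot be loaded; on Linux the exception message
        // typically includes the underlying loader error.
        var handle = NativeLibrary.Load(path);
        Console.WriteLine("libllava_shared.so loaded successfully.");
        NativeLibrary.Free(handle);
    }
    catch (DllNotFoundException ex)
    {
        Console.WriteLine($"Load failed: {ex.Message}");
    }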

Reproduction Steps

I tried adding WithAutoFallback(true), as shown in the sketch below.
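
For reference, this is the same configuration as above with auto-fallback enabled, which should allow LLamaSharp to fall back to another available backend if the CUDA 12 libraries cannot be loaded:

        NativeLibraryConfig
            .All
            .WithCuda()
            .WithAutoFallback(true)
            .DryRun(out INativeLibrary? _, out INativeLibrary? _);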

Environment & Configuration

  • Operating system: Linux
  • .NET runtime version: 8
  • LLamaSharp version: 0.25.0
  • CUDA version (if you are using cuda backend): 12
  • CPU & GPU device: A100

Known Workarounds

No response
