Is your feature request related to a problem? Please describe.
It would be really helpful if this package were able to find system-installed ggml libraries. Currently, even though I build llama.cpp and this package with CMAKE_ARGS="-DLLAMA_USE_SYSTEM_GGML=ON", the package still fails to find and import the ggml libraries.
In particular, it fails at this line when trying to load the library from the package's root dir:
llama-cpp-python/llama_cpp/_ggml.py, line 21 (at f9f8669):

```python
libggml = ctypes_ext.load_shared_library("ggml", libggml_base_path)
```
Describe the solution you'd like
Better path finding, so that external (system-installed) ggml libraries can be located and loaded; a rough sketch of the kind of fallback I have in mind is below.
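For example, something along these lines could work. This is only a sketch: `load_shared_library` and the `ctypes_ext` alias are taken from the existing `_ggml.py` line quoted above, but the import path, the exception types caught, and the `find_library` fallback are assumptions on my part, not the package's actual API.

```python
import ctypes
import ctypes.util
import pathlib

# Assumed import path for the existing helper; the real module name may differ.
from llama_cpp import _ctypes_extensions as ctypes_ext


def load_ggml(base_path: pathlib.Path) -> ctypes.CDLL:
    # 1. Current behaviour: look for the library bundled in the package dir.
    try:
        return ctypes_ext.load_shared_library("ggml", base_path)
    except (FileNotFoundError, OSError, RuntimeError):
        # Exact exception type raised by the helper is an assumption here.
        pass

    # 2. Fallback: a system-installed libggml (e.g. one built with
    #    -DLLAMA_USE_SYSTEM_GGML=ON), resolved via the usual linker search paths.
    name = ctypes.util.find_library("ggml")
    if name is not None:
        return ctypes.CDLL(name)

    raise FileNotFoundError(
        "No ggml shared library found in the package directory or on the system"
    )
```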
Describe alternatives you've considered
Modifying the code directly.
Additional context