Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects #244
Comments
That's an error where the compiler thinks your hardware supports some Intel CPU acceleration features, but in fact it doesn't. Are you by chance compiling in a VM?
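To see whether this is the problem, you can check which SIMD feature flags the CPU inside the VM actually advertises. A minimal sketch, assuming a Linux guest (the function and `REQUIRED` set are illustrative names, not part of any library; the flags listed are the extensions llama.cpp's default build enables):

```python
# Sketch: check whether the (virtual) CPU advertises the SIMD extensions
# that llama.cpp's default build assumes (AVX / AVX2 / FMA / F16C).

def parse_cpu_flags(cpuinfo_text):
    """Return the feature-flag set from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

# Default SIMD extensions assumed by the build (illustrative list).
REQUIRED = {"avx", "avx2", "fma", "f16c"}

# On a real Linux guest, use: open("/proc/cpuinfo").read()
sample = "processor\t: 0\nflags\t\t: fpu sse sse2 avx\n"
print("missing:", sorted(REQUIRED - parse_cpu_flags(sample)))
# → missing: ['avx2', 'f16c', 'fma']
```

If any of these flags are missing inside the VM while the host CPU has them, the compiled kernels will crash or the build will fail, which matches the symptom described in this thread.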
Thank you. Yes, exactly: I am running Ubuntu 22.04.2 in VirtualBox. Is there a specific setting I should turn on or off?
This might help.
Thank you, that fixed the installation. Everything works perfectly now. Much appreciated :)
Hey guys, my OS is CentOS 7. When I reinstall and upgrade llama-cpp-python, it shows a CMake error. How do I fix it?
Please open a new issue and provide the complete build output as per the issue template.
Hi @mindwellsolutions |
Yes. I disabled PAE/NX and VT-x/AMD-V (Hyper-V) in the VirtualBox settings for the VM. I have the Paravirtualization Interface set to Default. I also turned off "Hardware virtualization" and disabled nested paging (although I'm not sure if this last step is needed).
Thanks @mindwellsolutions!! It worked for me!
FYI regarding this:
Yes, it is needed. I had PAE/NX and VT-X/AMD-V already disabled and also Paravirtualization Interface set to Default. And I was still getting the error no matter what. Finally, disabling "Nested Paging" did the trick. |
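For reference, the GUI settings above can also be applied from the host command line. This is only a sketch: "UbuntuVM" is a placeholder VM name, the VM must be powered off first, and VBoxManage option spellings (e.g. `--nestedpaging` vs. `--nested-paging`) vary between VirtualBox versions, so check `VBoxManage modifyvm --help` on your install.

```shell
# Sketch: the VirtualBox settings discussed above, as a VBoxManage command.
# "UbuntuVM" is a placeholder; option names may differ by VirtualBox version.
VM="UbuntuVM"
CMD="VBoxManage modifyvm $VM --pae off --hwvirtex off --nestedpaging off --paravirtprovider default"
# Printed rather than executed here, since no VM exists in this snippet:
echo "$CMD"
```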
I have dual-booted my PC and tried building a Docker image on Linux. I did not understand what exactly I need to do to avoid this error. It would be really helpful if you could guide me through the steps.
Shortened ERROR Text:
"Building wheel for llama-cpp-python (pyproject.toml) did not run successfully. [exit code: 1]"
"Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects."
Prior to trying to install llama-cpp-python I installed CUDA, the Ubuntu build-essential package, and CMake, but I still get this error every time I try to install llama-cpp-python.
Installation methods tried:
I also tried running the dockerfile.txt that glmulder shared 4 days ago @ (Link) and got an identical error.
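Besides the VM-settings fix above, a workaround sometimes suggested for this class of failure is to rebuild the wheel with the unsupported SIMD code paths disabled via `CMAKE_ARGS`. A sketch, not a confirmed fix for this exact report: the `LLAMA_*` switches are llama.cpp CMake options, and the exact set to turn off depends on which flags your CPU or VM lacks.

```shell
# Sketch: disable the SIMD kernels the VM does not expose, then reinstall.
export CMAKE_ARGS="-DLLAMA_AVX=OFF -DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF -DLLAMA_F16C=OFF"
# The install itself is commented out so this snippet has no side effects:
# pip install --force-reinstall --no-cache-dir llama-cpp-python
echo "$CMAKE_ARGS"
```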
Full Error Text:
pip install llama-cpp-python
Defaulting to user installation because normal site-packages is not writeable
Collecting llama-cpp-python
Using cached llama_cpp_python-0.1.51.tar.gz (1.2 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0
Using cached typing_extensions-4.5.0-py3-none-any.whl (27 kB)
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [135 lines of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects