Bump llama.cpp to b5686, fix build failures #754

Merged · 3 commits · Jun 18, 2025

Conversation

AsbjornOlling
Contributor

@AsbjornOlling AsbjornOlling commented Jun 17, 2025

The main motivation for bumping llama.cpp is to close #747.

In the meantime, there were upstream changes to how build-info.cpp is generated. This caused a failure during the copy operation in build.rs. Removing the entire copy step seems to work, so I did that.
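For context, the failing step would have looked roughly like this. This is a hypothetical sketch of a build.rs-style copy step, not the crate's actual code; the paths and function names are assumptions:

```rust
use std::fs;
use std::path::Path;

/// Hypothetical sketch: copy llama.cpp's generated build-info.cpp into the
/// build output directory. Returns Ok(true) if the file was found and copied.
/// After the upstream change to build-info.cpp generation, the file no longer
/// exists at the expected location, so the copy fails -- which is why the PR
/// drops this step entirely.
fn copy_build_info(src_dir: &Path, out_dir: &Path) -> std::io::Result<bool> {
    let src = src_dir.join("common/build-info.cpp");
    if src.exists() {
        fs::copy(&src, out_dir.join("build-info.cpp"))?;
        Ok(true)
    } else {
        // File not generated here anymore: skip (or, as in the PR, remove
        // this whole step from build.rs).
        Ok(false)
    }
}

fn main() {
    // In a real build.rs, out_dir would come from env::var("OUT_DIR").
    let copied = copy_build_info(Path::new("llama.cpp"), Path::new("."))
        .expect("io error");
    println!("copied build-info.cpp: {copied}");
}
```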

@AsbjornOlling
Contributor Author

Hold off on merging this; it seems like there could be an issue with windows builds.

@AsbjornOlling
Contributor Author

I think I fixed it. At least the windows CI passes on my fork of this repo.

@AsbjornOlling
Contributor Author

AsbjornOlling commented Jun 18, 2025

Hmph.

Actually now windows/vulkan builds are broken.

Vulkan builds work on linux and macos, though - and non-vulkan builds work on all platforms I have tested, including windows.

This PR is still strictly an improvement over the current main. On the current main branch, vulkan builds don't work on any platform. So I think it is safe to merge this, @MarcusDunn. I'll keep looking into fixing windows/vulkan.

@AsbjornOlling
Contributor Author

AsbjornOlling commented Jun 18, 2025

> Actually now windows/vulkan builds are broken.

Seems like it may be caused by a path length limitation on windows :| Nevermind...
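As background on why deeply nested build trees break on Windows: the classic Win32 limit is MAX_PATH (260 characters, including the terminating NUL), and tools that don't opt into long-path support fail on anything longer. A minimal sketch of a preflight check for this (hypothetical, not part of the PR):

```rust
use std::path::Path;

/// Windows' classic MAX_PATH limit: 260 characters including the
/// terminating NUL. Paths at or beyond this length can fail in tools
/// that do not enable long-path support.
const WINDOWS_MAX_PATH: usize = 260;

/// Rough check for whether a path risks hitting the MAX_PATH limit.
fn exceeds_windows_max_path(path: &Path) -> bool {
    path.as_os_str().len() >= WINDOWS_MAX_PATH
}

fn main() {
    // A deeply nested build directory easily exceeds the limit.
    let deep = "a/".repeat(200);
    println!("deep path too long: {}", exceeds_windows_max_path(Path::new(&deep)));
    println!("short path too long: {}", exceeds_windows_max_path(Path::new("C:/short/path")));
}
```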

@AsbjornOlling
Contributor Author

AsbjornOlling commented Jun 18, 2025

I found a workaround for the windows/vulkan issue, but it introduces new problems. I had pushed the workaround to this branch, but I rolled it back for now.

Let's just get llama.cpp bumped and the linux/vulkan stuff fixed. I'll make a separate PR for the other stuff 😇

@MarcusDunn MarcusDunn merged commit 8331a39 into utilityai:main Jun 18, 2025
3 of 5 checks passed
Development

Successfully merging this pull request may close these issues.

Fails to build for x86_64-linux w/ Vulkan backend.