Commit 8efb198

mark as not working on windows
The releases from llama.cpp follow a different format on Windows. On unix-like OSes, the server binary is at `build/bin/llama-server`; on Windows it is just `llama-server.exe`. You would need to update the download-release.sh script to make this work.
1 parent 8234053
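
For reference, a minimal sketch of the kind of per-OS path handling the commit message describes. The helper name and layout below are hypothetical, not the actual code in `_binary.py` or `download-release.sh`:

```python
# Hypothetical sketch, not the actual contents of _binary.py or download-release.sh.
# Assumes a llama.cpp release archive has already been downloaded and extracted
# into `extract_dir`; this only resolves where the server binary would live.
import platform
from pathlib import Path


def server_binary_path(extract_dir: Path) -> Path:
    """Return the expected llama-server location inside an extracted release."""
    if platform.system() == "Windows":
        # Windows release zips ship llama-server.exe at the archive root.
        return extract_dir / "llama-server.exe"
    # Linux/macOS releases nest the binary under build/bin/.
    return extract_dir / "build" / "bin" / "llama-server"
```

With something like this in place, the windows-latest entry removed from the CI matrix below could be restored.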

2 files changed, +6 -1 lines changed

.github/workflows/CI.yml

+3 -1

@@ -32,7 +32,9 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        os: [ubuntu-latest, windows-latest, macos-latest]
+        # windows isn't supported in the download-release.sh script from llama.cpp.
+        # See _binary.py for more info
+        os: [ubuntu-latest, macos-latest]
         python-version: ["3.8", "3.12"] # min and max versions, skip the middles
     runs-on: ${{ matrix.os }}
     steps:

README.md

+3 -0

@@ -32,6 +32,9 @@ For detailed API, read the source code.
 
 ## Install
 
+This only currently works on Linux and Mac. File an issue if you want a pointer on
+what needs to happen to make this work.
+
 For now, install directly from source:
 
 `python -m pip install git+https://github.com/NickCrews/llama-cpp-server-python@00cc5ece8783848139d41fb7f9c5e5c9b7a62686`
