Add ability to cancel model loading #4462
Merged
Commits (31)
9abe2e4 (crasm) llama : Add ability to cancel model load
3425e62 (crasm) llama : Add test for model load cancellation
4b1f70c (crasm) Fix bool return in llama_model_load, remove std::ignore use
1160de3 (ggerganov) Update llama.cpp
32ebd52 (crasm) Fail test if model file is missing
cb8a4be (crasm) Merge branch 'cancel-model-load' of github.com:crasm/llama.cpp into c…
2796953 (crasm) Revert "Fail test if model file is missing"
068e7c4 (crasm) Add test-model-load-cancel to Makefile
fe6a6fb (crasm) Revert "Revert "Fail test if model file is missing""
6bba341 (crasm) Simplify .gitignore for tests, clang-tidy fixes
fd9d247 (crasm) Label all ctest tests
4b63355 (crasm) ci : ctest uses -L main
aed3cf8 (crasm) Attempt at writing ctest_with_model
f80ff4d (crasm) ci : get ci/run.sh working with test-model-load-cancel
121b04d (crasm) ci : restrict .github/workflows/build.yml ctest to -L main
1e79625 (crasm) update requirements.txt
9809314 (crasm) Disable test-model-load-cancel in make
9a056ed (crasm) Remove venv before creation
293d16f (crasm) Restructure requirements.txt
267cfa4 (crasm) Merge commit 'c50e40016394f124b97ce39da48148b1f6c01833' into cancel-m…
a0eab1e (crasm) Make per-python-script requirements work alone
ca122dc (crasm) Add comment
ba46057 (crasm) Merge remote-tracking branch 'upstream/master' into cancel-model-load
b853df4 (crasm) Add convert-persimmon-to-gguf.py to new requirements.txt scheme
c9a6de8 (crasm) Add check-requirements.sh script and GitHub workflow
e86b8cd (crasm) Remove shellcheck installation step from workflow
bdfe4ba (crasm) Add nocleanup special arg
6bc7411 (crasm) Merge remote-tracking branch 'upstream' into cancel-model-load
e438257 (crasm) Fix merge
f607e53 (crasm) reset to upstream/master
5f2ee1c (crasm) Redo changes for cancelling model load
@slaren do you know if this line will be a problem, since it doesn't get run if the above returns early?
The progress callback should only be called if it loaded successfully, I think. It would be weird to run it with 1.0 if the model load actually failed.
Skipping the mapping move should be fine.

The case I'm trying to avoid is: …