Refactor and add Llama Python library build #8107
Conversation
See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/8107.
No failures, 36 pending as of commit 11221a9 with merge base 1d43d91.
This pull request was exported from Phabricator. Differential Revision: D68937637
@pytorchbot label "topic: not user facing"
cccclai left a comment:
Seems reasonable to me. @winskuo-quic @shewu-quic @haowhsu-quic please take a second look
Summary: As title. To use static llama export outside the QC directory.

Reviewed By: cccclai

Differential Revision: D68937637
Force-pushed from 47d50e9 to 1b5b3ee
@limintang has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Summary: Regression from pytorch#8107: it causes `buck run` of the Python binary to fail. pytorch#7691 then introduces a dependency in source transformation.

Reviewed By: larryliu0820, kirklandsign

Differential Revision: D69942429
Summary: As title. To use static llama export outside the QC directory.
Differential Revision: D68937637