experimental_index_url can only be used in one pip.parse(), subsequent calls fail #2648
Comments
Ah yes, the sdist repos need to be prefixed/suffixed with the python version. The fix should be reasonably easy: only the repo name should be updated. PRs are welcome here if someone needs this before I can take a look at it.
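(No code was attached to that comment; purely as a hedged illustration of the idea, here is a Starlark sketch in which the helper name and every parameter are hypothetical rather than rules_python's actual internals.)

```starlark
# Hypothetical sketch: include the Python version in the sdist repo name so
# that two pip.parse() calls with different python_version values produce
# distinct repositories instead of colliding on the same name.
def _sdist_repo_name(hub_name, python_version, dist_name):
    version_label = python_version.replace(".", "")  # e.g. "3.10" -> "310"
    return "{}_{}_{}_sdist".format(
        hub_name,
        version_label,
        dist_name.lower().replace("-", "_"),
    )
```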
Thanks for the quick response. Making a reproduction was more difficult than I expected. I'm assuming you mean somewhere in the following lines, is that right?
Yes, exactly. I think passing
For reference, this comment was wrong; it is still unclear why this is happening.
More debug information on the 2nd sdist, logged on this line: https://github.com/bazelbuild/rules_python/blob/e7d2f09394dd14816310c4c661d2fefab33b2b1b/python/private/pypi/extension.bzl#L246
The key part is that the
Is there a bug in
I think we need to fix the
Another option is to handle this at the extension level, where the extension is not blindly creating sdists, but that option is less appealing to me because the former is easier to test.
Yes, this was exactly the issue. The lines are slightly different, with one hash appearing in the darwin lock file but not the linux lock file, and one hash appearing in the linux lock file but not the darwin one. When I initially tried to create the reproduction, I wasn't using the exact same set of hashes as in my main repo.
Took a stab at fixing it via #2658, but it looks like it changes some behavior on mac/universal and any whls, potentially due to extension.bzl's handling. Let me know what you think.
Heads up: this can be reproduced with a single
A workaround is to add/remove hashes so that the hashes are the same across requirements files.
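As a made-up illustration of that workaround (the package name, version, and hash values below are placeholders), the per-platform requirements files would end up carrying the same union of hashes:

```
# requirements_linux.txt and requirements_darwin.txt both list the same hashes
somepkg==1.2.3 \
    --hash=sha256:aaaaaaaa... \
    --hash=sha256:bbbbbbbb...
```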
…ust (#2871)

Summary:
- Make the requirement line the same as the one used for whls. It only contains extras and the version, if present.
- Add debug log statements if we fail to get the version from a direct URL reference.
- Move some tests from `parse_requirements_tests` to `index_sources_tests` to improve test maintenance.
- Replace the URL-encoded `+` with a regular `+` in the filename.
- Correctly handle the case when `=sha256:` is used in the URL.

Once this is merged, I plan to tackle #2648 by changing the `parse_requirements` code to de-duplicate the entries it returns. I cannot think of anything else that we can do for this as of now, so I will mark the associated issue as resolved.

Fixes #2363
Work towards #2648
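The de-duplication step mentioned there is not shown in the thread; the following is only a sketch of the general idea, with the field names (`requirement_line`, `shas`, `target_platforms`) assumed rather than taken from the real `parse_requirements` code:

```starlark
# Hypothetical sketch: collapse duplicate source entries that only differ by
# the requirements file they came from, merging hashes and target platforms.
def _deduplicate_srcs(srcs):
    by_key = {}
    for src in srcs:
        key = src.requirement_line  # assumed field name
        if key not in by_key:
            by_key[key] = src
            continue
        prev = by_key[key]
        by_key[key] = struct(
            requirement_line = prev.requirement_line,
            shas = sorted({s: None for s in prev.shas + src.shas}),
            target_platforms = sorted({
                p: None
                for p in prev.target_platforms + src.target_platforms
            }),
        )
    return by_key.values()
```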
OK, I have spent some time thinking about this and I will write down my analysis here. The problem right now is that the output of
The output of
For further analysis we can look at each value of the requirements dict and try to understand the problem more. Some of the fields can differ by target platform, so I'll denote them with the letter
Putting that aside for a little, the
Because
So ideally, we probably want to change the type that the
Maybe something like:
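The snippet that originally followed is not preserved; as a hedged sketch only, the shape being described (per the later commit message: one struct per package with a `srcs` field) might look something like this, with all field names assumed:

```starlark
# Assumed shape only: one entry per package, with a `srcs` field carrying
# everything needed to create the whl/sdist repos without extra post-processing.
example_package = struct(
    name = "foo",
    srcs = struct(
        requirement_line = "foo[extra]==1.2.3",  # extras and version only
        shas = ["deadbeef", "cafef00d"],          # union of hashes across lock files
        target_platforms = ["linux_x86_64", "osx_aarch64"],
    ),
)
```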
This structure can be created by something like:
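(The original code block here is not preserved either; the following is only a guess at its general shape, again with hypothetical field names.)

```starlark
# Hypothetical sketch: fold the parsed lines from all lock files into a single
# per-package struct, taking the union of hashes and target platforms.
def _package_srcs(parsed_lines):
    shas = {}
    platforms = {}
    requirement_line = None
    for line in parsed_lines:
        requirement_line = line.requirement  # assumed field name
        for sha in line.shas:
            shas[sha] = None
        for plat in line.target_platforms:
            platforms[plat] = None
    return struct(
        requirement_line = requirement_line,
        shas = sorted(shas),
        target_platforms = sorted(platforms),
    )
```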
Where the
And this in turn is created by something like (the
To actually validate whether this model works, we should do the refactor first, then stack the second PR on top of the refactor and see how it works. I am somewhat confident that it should work just fine.
The modeling of the data structures returned by the `parse_requirements` function was not optimal, because historically more of the logic lived in `extension.bzl` and more things were decided there. With the recent refactors it is possible to return a harder-to-misuse data structure from `parse_requirements`. For each `package` we will return a struct with a `srcs` field that contains easy-to-consume values. With this in place we can do the fix that is outlined in the referenced issue.

Work towards #2648
🐞 bug report
Affected Rule
pip.parse()
Is this a regression?
I'm not sure.
Description
I have 4 lock files for 2 different Python versions over 2 platforms. When I try to use `experimental_index_url` on both `pip.parse` calls, it fails.

🔬 Minimal Reproduction
Here's an excerpt from MODULE.bazel:
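(The original excerpt did not survive extraction; the fragment below is a reconstruction of the described setup, two Python versions with per-platform lock files, where the hub name, versions, file names, and index URL are placeholders rather than the reporter's actual configuration.)

```starlark
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")

pip.parse(
    hub_name = "pypi",
    python_version = "3.9",
    experimental_index_url = "https://pypi.org/simple",
    requirements_linux = "//:requirements_39_linux.txt",
    requirements_darwin = "//:requirements_39_darwin.txt",
)

pip.parse(
    hub_name = "pypi",
    python_version = "3.11",
    experimental_index_url = "https://pypi.org/simple",
    requirements_linux = "//:requirements_311_linux.txt",
    requirements_darwin = "//:requirements_311_darwin.txt",
)

use_repo(pip, "pypi")
```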
🔥 Exception or Error
🌍 Your Environment
Operating System:
Output of `bazel version`:

Rules_python version:
Anything else relevant?