Broken dependency evaluation in pip 24.3.1. Pip will not recognize installed prerelease versions #13089
Comments
Hang on - your requirement is `spdbtools<1.0,>=0.7.1`, but you have `0.7.1a20241118121953`. According to the version specification, a pre-release of a version orders before the corresponding final release, so `0.7.1a20241118121953 < 0.7.1`. Your alpha release therefore does not satisfy the constraint `>=0.7.1` - pip is behaving according to spec here.
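For anyone who wants to check this locally, here is a small illustration using the `packaging` library (the same version model pip follows); the package name and versions are the ones from this thread:

```python
# pip install packaging
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# A pre-release orders *before* the corresponding final release.
print(Version("0.7.1a20241118121953") < Version("0.7.1"))       # True

spec = SpecifierSet(">=0.7.1,<1.0")

# Rejected by default (pre-releases are excluded from plain specifiers)...
print(spec.contains("0.7.1a20241118121953"))                    # False
# ...and still rejected even when pre-releases are allowed, because of the ordering above.
print(spec.contains("0.7.1a20241118121953", prereleases=True))  # False
```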
Yeah, it's a part of the spec that I see a lot of people get confused by, but you must explicitly allow the pre-release in your specifier.
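For example (my own illustration of that point, reusing the versions from this thread) - naming a pre-release in the lower bound both fixes the ordering and implicitly opts that requirement in to pre-releases:

```python
from packaging.specifiers import SpecifierSet

# A lower bound that is itself a pre-release admits the alpha build...
print(SpecifierSet(">=0.7.1a0,<1.0").contains("0.7.1a20241118121953"))  # True
# ...whereas the original ">=0.7.1" bound never will, --pre or not.
print(SpecifierSet(">=0.7.1,<1.0").contains("0.7.1a20241118121953", prereleases=True))  # False
```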
Yes. I looked very closely at those in the past. It's according to the specification, and the specification is good. But it still does not make the workflow described by @maarre easy.
Yes, but this is a bit problematic, because by adding a pre-release specifier you change the requirement itself. And it makes things very tricky when you have a workflow of releasing two packages at the same time and want to make rc candidates of package A that might become the final release, while you want it to depend on the new version of package B - which you also want to release as an rc candidate at the same time.

We have this very problem quite often in Airflow - it regularly happens that we want to release a new common package with a new feature at the same time as other packages that depend on it.

And this is problematic because you effectively have to modify your dependencies, either in the git repo or in the released packages, between the RC and the final version. That's not a good workflow - for security and for the release process especially - modifying code between the rc and the final version is problematic, particularly if you have already started to develop the next version and you release from a branch.

Our solution in Airflow (really a workaround) that we found works for us is to dynamically modify the rc packages so that they carry rc dependencies for all our rc packages. In the example above, when we generate the rc packages we rewrite the cross-package requirements to point at the rc versions. This way, in our pyproject.toml/hatch_build.py we keep the final dependency (the one without the rc suffix) and only substitute it when building the rc packages.

It's a workable solution and works well for us in Airflow - so maybe you can adapt it as well @maarre - however it requires dynamic generation of requirements and quite some automation of your release process.

Another, better solution that we might perhaps see in the future would be the ability to specify selectively which packages should be treated differently with respect to pre-releases. Another option is something like what uv does with overrides.
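To make the workaround above concrete, here is a rough sketch of the idea (the package names, the `RC_SUFFIX` environment variable, and the helper function are made up for illustration - this is not Airflow's actual hatch_build.py). The committed list keeps the final lower bounds, and the rc variant only exists in the artifacts built for the rc:

```python
import os

# What is committed and what ships in the final release.
FINAL_DEPENDENCIES = [
    "my-common-package>=2.10.0",
    "requests>=2.0",
]

def rc_dependencies(deps: list[str], rc_suffix: str) -> list[str]:
    """Rewrite our own cross-package lower bounds (e.g. '>=2.10.0' -> '>=2.10.0rc1')."""
    own_packages = ("my-common-package",)
    rewritten = []
    for dep in deps:
        name, sep, version = dep.partition(">=")
        if sep and name in own_packages:
            rewritten.append(f"{name}>={version}{rc_suffix}")
        else:
            rewritten.append(dep)
    return rewritten

# e.g. invoked from a build hook as: RC_SUFFIX=rc1 python -m build
suffix = os.environ.get("RC_SUFFIX", "")
dependencies = rc_dependencies(FINAL_DEPENDENCIES, suffix) if suffix else FINAL_DEPENDENCIES
print(dependencies)
```

When the rc is promoted to a final release, nothing in the repository needs to change - the final build is simply published without the substitution.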
There's a workaround to get per-package pre-releases: create a constraints file that names the pre-release version of the package you want to allow. Then you can pass it to pip with `-c constraints.txt` (or via the `PIP_CONSTRAINT` environment variable).
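A sketch of how generating such a constraints file could look - the helper below and the choice to pin the exact installed version are my own assumptions for illustration, not an official pip recipe:

```python
# Hypothetical helper: pin the pre-release versions that are already installed
# into a constraints file, then point pip at it with `-c constraints.txt`
# or the PIP_CONSTRAINT environment variable.
from importlib.metadata import version

PRERELEASE_PACKAGES = ["spdbtools"]  # the packages you want to allow as pre-releases

with open("constraints.txt", "w") as f:
    for name in PRERELEASE_PACKAGES:
        f.write(f"{name}=={version(name)}\n")  # e.g. spdbtools==0.7.1a20241118121953
```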
This is only half of a solution. You'd still need to dynamically modify your RC package's requirements. In the example above, I want the rc of one package to depend on the rc of the other, so the only solution I see now is to dynamically modify the rc1 packages to rewrite their dependencies. Again - I want to avoid having to bump the committed requirement between the rc and the final release. The flag that could solve it would have to do two things: allow pre-releases for selected packages, and treat a pre-release of a version as satisfying a `>=` constraint on that version while only the pre-release exists.
As I read this, it appears to me that what you're asking for is a "light" override of the requirements. It's an interesting idea, but I don't see it getting any kind of traction with pip maintainers unless it became a standard, as the maintainers have strongly expressed not wanting to help users install "broken" requirements. There has been discussion of override interfaces before, e.g. #8076 (comment), where I proposed an interface exactly equivalent to uv's overrides.
Yes, but with a twist. I do not want (and I agree with the maintainers here) to allow broken dependencies - that would be very bad. What I would see as a possible solution is to handle the specific case where you want to treat pre-release version comparison differently for packages that have not yet been released (only pre-released). Simply recognising the fact that there is a use case where a pre-release of a version should be able to satisfy a `>=` requirement on that version, as long as the final version does not exist yet.
That's a spec change, so the discussion must happen on the packaging forum. As it happens, there is a discussion going on right now that has devolved into including that very question: https://discuss.python.org/t/proposal-intersect-and-disjoint-operations-for-python-version-specifiers/71888/20. I'm hoping that discussion gets moved to its own dedicated thread; I'll post back here if it does.
It will, and I intend to propose a clarified version of the spec. However, I will say right now that changing the fact that pre-releases sort before the corresponding final release is very unlikely to happen.
Description
I want to be able to test multiple prerelease versions together without having to shoehorn installations.
I really don't want to put prerelease versions in my pyproject.toml.
The current version of pip will only allow prerelease dependencies when using the `--no-deps` flag.
Expected behavior
If there is an alpha, beta, or release candidate installed, I want pip to accept this installation when doing the dependency evaluation. The installation should succeed without having to use the `--no-deps` switch.
pip version
24.3.1
Python version
3.11.2
OS
Windows, Linux
How to Reproduce
1. Create a package with an alpha version.
2. Build the first package.
3. Install the first package.
4. Create another package with a dependency on the first package.
5. The pyproject.toml file should reference the version of the first package without the alpha specifier.
6. Build the second package.
7. Install the second package.
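A scripted version of these steps, for convenience - the package names and project layout are invented for the example, and setuptools is assumed as the build backend:

```python
# Scripted version of the reproduction steps above. Package names are made up;
# any PEP 621 build backend would do, setuptools is used here for brevity.
import pathlib
import subprocess
import sys
import textwrap

def make_pkg(root: pathlib.Path, name: str, version: str, deps: tuple[str, ...] = ()) -> pathlib.Path:
    pkg_dir = root / name
    module = name.replace("-", "_")
    (pkg_dir / module).mkdir(parents=True, exist_ok=True)
    (pkg_dir / module / "__init__.py").write_text("")
    (pkg_dir / "pyproject.toml").write_text(textwrap.dedent(f"""\
        [build-system]
        requires = ["setuptools>=61"]
        build-backend = "setuptools.build_meta"

        [project]
        name = "{name}"
        version = "{version}"
        dependencies = {list(deps)!r}
    """))
    return pkg_dir

root = pathlib.Path("prerelease-repro")
first = make_pkg(root, "repro-first-pkg", "0.7.1a1")                     # steps 1-2
second = make_pkg(root, "repro-second-pkg", "1.0.0",
                  deps=("repro-first-pkg>=0.7.1",))                      # steps 4-6

pip = [sys.executable, "-m", "pip"]
subprocess.run(pip + ["install", str(first)], check=True)                # step 3: installs 0.7.1a1
# Step 7: pip will not use the installed 0.7.1a1 to satisfy 'repro-first-pkg>=0.7.1'
# and goes looking for a final release instead, so this install fails.
subprocess.run(pip + ["install", str(second)], check=True)
```

The final `pip install` is where the behaviour described in this report shows up: pip refuses to use the already-installed alpha to satisfy `repro-first-pkg>=0.7.1`.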
Output
No response