How to generate Python pip dependency ranges?

I am working on a Python library and we are currently using pip-tools to pin its dependencies.

E.g. a basic requirements.in:

black
pandas
torch

Running pip-compile on this file generates a requirements.txt that looks something like the following (simplified here):

black==20.8b1             # via -r requirements.in
click==7.1.2              # via black
numpy==1.19.4             # via pandas, torch
pandas==1.1.5             # via -r requirements.in
pathspec==0.8.1           # via black
pytz==2020.4              # via pandas
regex==2020.11.13         # via black
torch==1.7.1              # via -r requirements.in
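
For reference, the requirements.txt above is produced by the default pip-compile invocation:

pip-compile requirements.in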

My problem is that pinning exact versions is not recommended for a package that will be published on PyPI, because it leaves end users with no flexibility when resolving its dependencies against their other libraries. It appears to be common practice to specify dependency ranges instead, e.g.

numpy >=1.18.0, <1.19.0
pandas >=1.0.0
etc.
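
For a published library these ranges would typically live in the packaging metadata rather than in a pinned requirements.txt, e.g. a setup.py along these lines (the project name and the exact ranges here are just placeholders):

# Illustrative setup.py with ranged rather than pinned runtime dependencies.
from setuptools import setup, find_packages

setup(
    name="my-library",        # placeholder name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "numpy >=1.18.0, <1.19.0",
        "pandas >=1.0.0",
        "torch",              # no constraint: left as-is
    ],
)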

My question is whether there is any way to auto-generate these dependency ranges given a requirements.in like the one above. I imagine this should be relatively easy using existing dependency-resolution tools: for each cross-dependency, work out the minimum and maximum versions that are compatible with all of the libraries involved. (Any libraries that have no version constraints at all can be left as they are.)
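
To make the transformation I have in mind concrete, a rough sketch of the post-processing might look like the following (using the packaging library; the function name and the "next major version" upper bound are purely illustrative placeholders, not real dependency resolution):

# Naive sketch: widen the exact pins from a pip-compile output into
# ">= pinned, < next major" ranges. The heuristic here is a placeholder;
# a real tool would derive the bounds from actual resolution data.
from packaging.requirements import Requirement
from packaging.version import Version

def widen_pins(requirements_txt):
    ranges = []
    for line in requirements_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop the "# via ..." comments
        if not line or line.startswith("-"):  # skip blanks and options like -r
            continue
        req = Requirement(line)
        pinned = next(iter(req.specifier)).version  # e.g. "1.19.4"
        upper = Version(pinned).major + 1
        ranges.append(f"{req.name} >={pinned}, <{upper}.0.0")
    return ranges

with open("requirements.txt") as f:
    print("\n".join(widen_pins(f.read())))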

I haven't been able to find any tool that does anything like this. Are maintainers of other libraries performing this task manually? I understand that the approach I describe above still requires some manual intervention for libraries in requirements.in that have no cross-dependencies, but for the most part (certainly in my case) it should do the heavy lifting.
