Including (but not importing!) a requirements.txt file #2148
I wonder if it’s a workable solution to treat the subsystem as a local (editable?) package. It would contain a setup.py that simply reads in the requirements.txt for now, and, after PEP 518 is fully adopted, specify a custom build system in its pyproject.toml. This would also prevent Pipenv from locking its dependencies, and feels more natural to me.
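A minimal sketch of that idea, assuming a helper (`read_requirements`, a hypothetical name, not pipenv machinery) that turns the requirements.txt contents into an `install_requires` list:

```python
# Hypothetical sketch (not actual pipenv or pipfile API): a setup.py
# that simply reads requirements.txt, so the subsystem can be installed
# as a local editable package without duplicating its dependency list.
from pathlib import Path

def read_requirements(path: str) -> list[str]:
    """Return requirement lines, skipping comments and blank lines."""
    requirements = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            requirements.append(line)
    return requirements

# A real setup.py would then pass the result to setuptools, e.g.:
#   from setuptools import setup
#   setup(name="subsystem", install_requires=read_requirements("requirements.txt"))
```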
Hi. I have a similar need; some of our tools still require … I would love to see a setting in Pipfile to generate these files for me. Maybe a script/plugin for Pipfile could be good. I haven't started to look at PEP 518 yet; it will have a major impact on the Python build system. I wonder why you have to use `pipenv install --dev -r subsystem/locked-requirements.txt`; how do you handle version conflicts between two subsystems? Why don't you use …?
While I can treat things as editable packages, it really doesn't make much conceptual sense to do so, as none of the charger software is intended for publication in any of the Python-specific package formats. I just need to be able to specify a locked set of Python dependencies to be downloaded, bundled, and then installed into the on-device virtual environment. That said, reading a … It's still not a complete solution though, since …

@gsemet Your use case is the other way around from mine, as …
What if we had some kind of semantic distinction in the API to say ‘install as reference’, e.g. …?
On careful thought this seems like a reasonable thing to do. Cargo (of Rust) has something similar called a Workspace. A workspace Cargo.toml does not have a set of dependencies itself, but only references a collection of child projects (members). Each member’s Cargo.toml represents a project on its own, and has its own lock file. Note that a workspace in Cargo is all-or-nothing: a workspace Cargo.toml cannot specify any dependencies, and a non-workspace Cargo.toml cannot have any members. It would be worth discussing whether Pipfile should also be like this, or if we’d allow both … As for the command-line API, I’d say it’s better to have a separate subcommand, …
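For concreteness, a minimal top-level manifest for the kind of Cargo workspace described above looks like this (the member names here are hypothetical):

```toml
# Workspace Cargo.toml: members only, no [dependencies] of its own.
[workspace]
members = ["charger", "dev-tools"]
```

Each listed member directory then contains its own ordinary `Cargo.toml` describing that project's dependencies.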
A Cargo-style workspace-or-application model would be fine for my particular use case; I'd just define the dev utility scripts as a new subproject. However, I think doing it that way would make things significantly more complicated at the … Under that model, a "workspace-only" … As far as the question of …

Assuming that the first question is answered with "reference the directory" (so …), the options would be:

- Subproject runtime deps as runtime deps of the workspace project: …
- Subproject runtime deps as development deps of the workspace project: …
- Both development and runtime subproject deps as development deps of the workspace project: …
- Runtime subproject deps as runtime deps, both development and runtime subproject deps as development deps: …
I can’t decide whether I prefer specifying a file or a directory. It makes more sense to specify by directory (especially for Pipfile projects), but it wouldn’t make as much sense to do this for requirements.txt. Maybe it should look for a Pipfile if you specify by directory, and a requirements.txt if by file?
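That lookup rule could be sketched like this (the function name `resolve_include` is hypothetical, not an existing pipenv API):

```python
# Sketch of the proposed resolution rule: a directory argument means
# "use its Pipfile"; anything else is treated as a requirements-style
# file path. Illustrative only, not actual pipenv behavior.
from pathlib import Path

def resolve_include(arg: str) -> Path:
    """Map a user-supplied include argument to the file to read."""
    p = Path(arg)
    if p.is_dir():
        return p / "Pipfile"   # Pipfile projects referenced by directory
    return p                   # requirements.txt referenced by file
```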
Files only imo. We can parse lockfile vs pipfile vs requirements …
At least files are needed for … So it may make sense to see how that works out in practice, and use it to design native support that skips the intermediate setup.py generation step.
i think this is a terrible idea

but perhaps i'm misunderstanding.
I think it only makes sense in the context where … You're not especially likely to do that in a SaaS context (or if you are, then exporting … I don't want to pull those dependency lists directly into Pipfile, because there are other build processes that need them to be where they are. That said, if this were to be declared out of scope for …
It’s roughly the same as …
In that sense it would be roughly the same as …
Can this be closed?
I've started using pipenv to manage local development environments for Tritium's electric vehicle DC fast chargers, and it's working pretty well for that purpose (huzzah!).

However, getting `pylint` to work nicely in dev environments is proving to be something of a challenge, since the way the charger environment creation works is to: …

I don't have any intention to migrate the charger-level dependency management away from pip-compile, since the app-requirements -> env-requirements -> deployment-bundle aspects of the pipeline are all very specific to our particular use case, and any tool that was sufficiently general purpose to be able to handle this would be harder to configure than just writing the required process automation scripts directly atop `pip-compile`.
The bit that's relevant to `pipenv` is that what I'd like to be able to express via `Pipfile` is "when creating a dev environment, install the charger dependencies as well as the dev environment dependencies" (that way, all the runtime APIs will be visible to `pylint` and other static analysis tools like `mypy`).

While `pipenv install --dev -r subsystem/locked-requirements.txt` comes close to this, it embeds a static snapshot of the locked requirements directly into `Pipfile`, rather than adding a reference to the input requirements to `Pipfile` and then embedding the locked requirements themselves into `Pipfile.lock`. While that's OK as a workaround in the near term, in the long run it's a recipe for the snapshot in `Pipfile` getting out of sync with the actual charger dependencies in the subsystem-specific `locked-requirements.txt` files.

As a potential design for supporting this, my first thought is that we might actually need a new section in `Pipfile`: `[include]`.
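A hedged sketch of what such an `[include]` section might look like. The section name comes from the proposal itself; the key names and table form below are illustrative assumptions, not existing pipenv syntax:

```toml
# Hypothetical Pipfile [include] section -- illustrative only.
[include]
# Plain entry: pull in a subsystem's pinned requirements as-is.
"subsystem/locked-requirements.txt" = {}

# Mapping entry tweaking how the included dependencies are treated:
# install as dev-packages, from a particular source, under a marker.
"tools/locked-requirements.txt" = { category = "dev-packages", source = "internal-pypi", markers = "python_version >= '3.6'" }
```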
Entries in `include` would either be …, or else a mapping that tweaks certain aspects of the included dependencies (like adding them to `dev-packages` instead of `packages`, setting a particular `source` for them all, setting environment markers for the inclusion, etc.).

From the point of view of lockfile generation, included dependencies would be just like regular dependencies, with one exception: the `_meta` section would gain a new `included_files` field mapping relative path names for included files to their expected hash values.

If such a feature was deemed a potentially acceptable candidate for inclusion, then I'd focus specifically on `requirements.txt` support initially, as I think the considerations for composing a meta-Pipfile from other projects all using `Pipfile` themselves are potentially going to be different from those involved in composing a combined dev environment `Pipfile` from lower-level flat `requirements.txt` files. (If I prove to be wrong about that, that would be a fine outcome; I just think it's worth postponing the question until after we've considered the simpler requirements.txt-only concept.)
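To illustrate the lockfile side, the proposed `included_files` field in the `_meta` section of `Pipfile.lock` might look roughly like this (the field name is from the proposal above; the surrounding structure and the hash value are placeholders, not an implemented format):

```json
{
    "_meta": {
        "included_files": {
            "subsystem/locked-requirements.txt": "sha256:…"
        }
    }
}
```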