
Including (but not importing!) a requirements.txt file #2148

Closed
ncoghlan opened this issue May 6, 2018 · 15 comments
ncoghlan commented May 6, 2018

I've started using pipenv to manage local development environments for Tritium's electric vehicle DC fast chargers, and it's working pretty well for that purpose (huzzah!).

However, getting pylint to work nicely in dev environments is proving to be something of a challenge, since the way the charger environment creation works is to:

  1. Keep a local "requirements.txt" in the repo adjacent to each deployed app
  2. When those dependency declarations change (or when we want to do a dependency refresh for any other reason), generate a complete "locked-requirements.txt" for that target subsystem with pip-compile (so that we can have a single self-consistent venv-per-execution-environment, rather than a venv-per-app)
  3. Use the locked requirements files to generate a wagon archive for inclusion in the deployed firmware update bundles
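
For concreteness, step 2 boils down to a pip-compile invocation along these lines for each subsystem (paths illustrative):

pip-compile --output-file subsystem/locked-requirements.txt subsystem/requirements.txt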

I don't have any intention to migrate the charger level dependency management away from pip-compile, since the app-requirements -> env-requirements -> deployment-bundle aspects of the pipeline are all very specific to our particular use case, and any tool that was sufficiently general purpose to be able to handle this would be harder to configure than just writing the required process automation scripts directly atop pip-compile.

The bit that's relevant to pipenv is that what I'd like to be able to express via Pipfile is "When creating a dev environment, install the charger dependencies as well as the dev environment dependencies" (as that way, all the runtime APIs will be visible to pylint and other static analysis tools like mypy).

While pipenv install --dev -r subsystem/locked-requirements.txt comes close to this, it embeds a static snapshot of the locked requirements directly into Pipfile, rather than adding a reference to the input requirements to Pipfile, and then embedding the locked requirements themselves into Pipfile.lock. While that's OK as a workaround in the near term, in the long run it's a recipe for the snapshot in Pipfile getting out of sync with the actual charger dependencies in the subsystem-specific locked-requirements.txt files.

As a potential design for supporting this, my first thought is that we might actually need a new section in Pipfile: [include]

Entries in include would either be:

name = input_filename # "name" is just for debugging purposes

Or else a mapping that tweaks certain aspects of the included dependencies (like adding them to dev-packages instead of packages, setting a particular source for them all, setting environment markers for the inclusion, etc).

From the point of view of lockfile generation, included dependencies would be just like regular dependencies, with one exception: the _meta section would gain a new included_files field mapping relative path names for included files to their expected hash values.
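
As a purely illustrative sketch (none of these key names are settled, they're just stand-ins for the ideas above), an entry in such a section might look like:

[include]
charger = {file = "subsystem/locked-requirements.txt", category = "dev-packages"}

Pipfile.lock's _meta section would then gain an included_files entry mapping "subsystem/locked-requirements.txt" to its expected hash, alongside the resolved dependencies themselves.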

If such a feature was deemed a potentially acceptable candidate for inclusion, then I'd focus specifically on requirements.txt support initially, as I think the considerations for composing a meta-Pipfile from other projects all using Pipfile themselves are potentially going to be different from those involved in composing a combined dev environment Pipfile from lower level flat requirements.txt files. (If I prove to be wrong about that, that would be a fine outcome - I just think it's worth postponing the question until after we've considered the simpler requirements.txt-only concept)

@ncoghlan ncoghlan added Status: Requires Approval This issue requires additional approval to move forward. Type: Discussion This issue is open for discussion. labels May 6, 2018
uranusjr commented May 6, 2018

I wonder if it’s a workable solution to treat the subsystem as a local (editable?) package. It would contain a setup.py that simply reads in the requirements.txt for now and, after PEP 518 is fully adopted, specifies a custom build system in its pyproject.toml. This would also prevent Pipenv from locking its dependencies, and feels more natural to me.
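
A minimal version of that shim setup.py might look like this (illustrative only; the project name is a placeholder, and real requirements parsing would need to handle pip options and continuation lines more carefully):

from pathlib import Path

from setuptools import setup

# Pull the subsystem's pinned requirements in as this shim
# package's install_requires, skipping comments and pip options.
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.strip().startswith(("#", "-"))
]

setup(
    name="charger-subsystem",  # placeholder name
    version="0.0.0",
    install_requires=requirements,
)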

gsemet commented May 6, 2018

Hi. I have a similar need: some of our tools still require requirements.txt, so I usually generate requirements.txt and requirements-dev.txt to keep them available. I prefer generating version-frozen requirements for an application (I think that is your use case) and non-version-frozen requirements for a library (so that Read the Docs is happy, for instance). But the ultimate source of truth is the Pipfile (for a library) or the Pipfile.lock (for an application). I have a tool to generate these files for me. The only drawback is having to remember to update them and track them in git.

I would love to see a setting in Pipfile to generate these files for me. Maybe a script/plugin mechanism for Pipfile would be good. I haven't started looking at PEP 518 yet; it will have a major impact on the Python build system.

I wonder why you have to use "pipenv install --dev -r subsystem/locked-requirements.txt": how do you handle version conflicts between two subsystems? Why don't you use pipenv install --dev subsystem and let pip do the resolution for you on your integration environment?

ncoghlan commented May 6, 2018

While I can treat things as editable packages, it really doesn't make much conceptual sense to do so, as none of the charger software is intended for publication in any of the Python-specific package formats. I just need to be able to specify a locked set of Python dependencies to be downloaded, bundled, and then installed into the on-device virtual environment.

That said, reading a locked-requirements.txt in the dummy setup.py I already created for wagon's benefit [1] would be a significantly better near-term workaround than adding a dependency snapshot directly to Pipfile (thank you for the suggestion, @uranusjr!).

It's still not a complete solution though, since pipenv commands won't automatically detect that the lock file is out of date when locked-requirements.txt changes - I'll also need to add an explicit lock file regeneration step into the script that collects and locks the charger level dependencies.

[1] cloudify-cosmo/wagon#30

@gsemet Your use case is the other way around from mine: Pipfile is not my source of truth for charger dependencies, the locked-requirements.txt file is. (And I have no interest in changing that, as I'm only using pipenv to manage dev environments, not the deployed charger environments. I don't even deploy wagon to the target devices; I run the unpack-and-pip-install step directly in a shell script.)

@techalchemy

What if we had some kind of semantic distinction in the API to say "install as reference", e.g. pipenv install --no-import -r locked-requirements.txt, and then we store it as a path or file URI with a flag functionally similar to editable=True?
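
In Pipfile terms that might end up stored as something like this (the key and flag names here are purely hypothetical):

[dev-packages]
charger-reqs = {file = "subsystem/locked-requirements.txt", reference = true}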

uranusjr commented May 6, 2018

On careful thought this seems like a reasonable thing to do. Cargo (of Rust) has something similar called Workspace. A workspace Cargo.toml does not have a set of dependencies itself, but only references a collection of child projects (members). Each member’s Cargo.toml represents a project on its own, while the workspace as a whole shares a single lock file at its root.
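
For reference, a minimal workspace Cargo.toml is just the member list (the directory names here are made up):

[workspace]
members = ["charger", "dev-tools"]

Each listed directory then contains its own complete Cargo.toml.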

Note that a workspace in Cargo is all-or-nothing. A workspace Cargo.toml cannot specify any dependencies, and a non-workspace Cargo.toml cannot have any members. It would be worth discussing whether Pipfile should also be like this, or if we’d allow both [include] (maybe it’s better to name this includes, btw?) and [packages] to work in the same file.

As for the command-line API, I’d say it’s better to have a separate subcommand, pipenv include [subproject-to-include]. It would also make sense to be able to include another Pipfile, as well as a requirements.txt file.

ncoghlan commented May 7, 2018

A Cargo-style workspace-or-application model would be fine for my particular use case - I'd just define the dev utility scripts as a new subproject.

However, I think doing it that way would make things significantly more complicated at the Pipfile format specification level (needing new sections, and rules for making some sections mutually exclusive with other sections), whereas using @techalchemy's suggested approach should only require a flag like subproject=True (to indicate that this entry isn't a reference to a pip-installable Python package), and then the subproject links can just be listed under [packages] or [dev-packages] based on where you want their dependencies to end up in Pipfile.lock.

Under that model, a "workspace-only" Pipfile would just be a project level convention whereby you only had subproject=True references, and no direct dependency declarations of your own.

As far as the question of Pipfile managed subprojects goes, in order to support that, we'd need to answer UX questions like:

  1. Do you reference the file directly, or the directory containing it?
  2. If a Pipfile managed subproject is linked in [packages], do its dev dependencies also get added to [dev-packages]?
  3. If a Pipfile managed project is listed in [dev-packages], do both its runtime and dev dependencies get added to [dev-packages]?

Assuming that the first question is answered with "reference the directory" (so pipenv can automatically take the lockfile into account if present), and that pipenv include --dev --name sp ./subproject defines where the link gets added rather than which dependencies are included, then I'd suggest a separate pipenv include --all subproject flag that becomes an include_all=True flag on the subproject link. You'd then have the following potential configurations:

Subproject runtime deps as runtime deps of the workspace project:

[packages]
sp = {path="./subproject", subproject=True}

Subproject runtime deps as development deps of the workspace project:

[dev-packages]
sp = {path="./subproject", subproject=True}

Both development and runtime subproject deps as development deps of the workspace project:

[dev-packages]
sp = {path="./subproject", subproject=True, include_all=True}

Runtime subproject deps as runtime deps, both development and runtime subproject deps as development deps:

[packages]
sp = {path="./subproject", subproject=True}

[dev-packages]
sp = {path="./subproject", subproject=True, include_all=True}

uranusjr commented May 7, 2018

I can’t decide whether I prefer specifying a file or a directory. It makes more sense to specify a directory (especially for Pipfile projects), but it wouldn’t make as much sense for requirements.txt. Maybe it should look for a Pipfile if you specify a directory, and treat the target as a requirements.txt if you specify a file?

@techalchemy

Files only imo. We can parse lockfile vs pipfile vs requirements:

subreq = {path = "./subproject/some-reqs.txt", format = "requirements"}
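
Presumably the other two formats would then follow the same pattern, e.g. (purely illustrative):

subproj = {path = "./subproject/Pipfile", format = "pipfile"}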

@ncoghlan

At least files are needed for requirements.txt support. However, in the near term, I'm going to pursue this further with a project to automate the setup.py based approach that @uranusjr suggested: https://github.com/ncoghlan/wainwright/blob/master/README.md

So it may make sense to see how that works out in practice, and use it to design native support that skips the intermediate setup.py generation step.

@kennethreitz

i think this is a terrible idea

@kennethreitz

but perhaps i'm misunderstanding.

@kennethreitz kennethreitz added Status: Requires Approval This issue requires additional approval to move forward. and removed Status: Requires Approval This issue requires additional approval to move forward. labels Jul 4, 2018
ncoghlan commented Jul 4, 2018

I think it only makes sense in the context where Pipfile is being used to manage local development environments, but not production builds or deployments.

You're not especially likely to do that in a SaaS context (or if you are, then exporting Pipfile to a flat requirements file is likely to work), but that's not my use case: my use case is getting to a point where developers can combine "git clone" with "pipenv shell" and get everything they need to work on the charger software, including the dependencies that pylint et al need in order to make proper sense of any Python code.

I don't want to pull those dependency lists directly into Pipfile, because there are other build processes that need them to be where they are.

That said, if this were to be declared out of scope for pipenv itself, then I'd likely just pull it into scope for wainwright, and treat the Pipfile contents as just another source for a later pip-compile and pip-sync call (targeting the pipenv managed venv).

@techalchemy

It’s roughly the same as -e ./some/subpackage from a dependency perspective except it points at another Pipfile afaiu

@sybrenstuvel

> It’s roughly the same as -e ./some/subpackage from a dependency perspective except it points at another Pipfile afaiu

In that sense it would be roughly the same as -r ./some/subpackage/requirements.txt, which is what I would love to replace with a Pipfile-only approach (as described in #2798).

@matteius

Can this be closed?

@matteius matteius added the Status: Awaiting Update ⏳ This issue requires more information before assistance can be provided. label Aug 17, 2022