How to keep setup.py install_requires and Pipfile in sync #1263
Does pipenv have a Python API that could be used? I manually update the list as I work on a project, but something like the following could be nice:
the function just needs to return a list of the keys in the Pipfile's [packages] section. I imagine you could achieve this functionality already using a helper function, but it'd be nice if it was part of pipenv so we don't all have to implement it.
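Something along those lines is easy to put together as a local helper in the meantime. A minimal sketch, assuming the third-party `toml` package is available and the Pipfile sits next to the script (this is not an official Pipenv API):

```python
import toml


def pipfile_packages(path='Pipfile'):
    """Return the package names listed in the Pipfile's [packages] section."""
    data = toml.load(path)  # the Pipfile is plain TOML
    return list(data.get('packages', {}).keys())


if __name__ == '__main__':
    print(pipfile_packages())
```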
Pipfile, the implementation backing Pipenv’s Pipfile parsing, can help with this:
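```python
import pipfile

# pf.data maps 'default' to the [packages] section and 'develop' to [dev-packages]
pf = pipfile.load('LOCATION_OF_PIPFILE')
print(pf.data['default'])
```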
But I wouldn't recommend this, or depending on Pipenv in setup.py. Importing pipenv (or pipfile) means the user needs to actually install that before trying to install your package, and tools like Pipenv trying to peek into it without installing (setup.py egg_info) won't work. The setup.py should only depend on Setuptools.

A middle ground solution would be to write a tool similar to bumpversion (https://github.com/peritus/bumpversion) that automatically syncs a text file based on Pipfile. Distribute this file with your package, and read it in setup.py. Then use CI or a commit hook to make sure the files are always in sync.
Yeah, good point, ignore me.
Perhaps “pipenv install” could do the sync?
@uranusjr Just testing my assumptions here, but wouldn't it be possible to add pipenv to setup.py's setup_requires and delay the pipenv import to a setuptools Command? Or is that considered bad practice?
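For concreteness, a rough sketch of that idea (using the lighter `pipfile` library from the comment above rather than Pipenv itself; all names here are hypothetical, and this only illustrates the question, not a recommendation):

```python
from setuptools import Command, setup


class ShowPipfileDeps(Command):
    """Print the [packages] section of the Pipfile."""

    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        import pipfile  # deferred import: only needed when this command actually runs
        print(pipfile.load('Pipfile').data['default'])


setup(
    name='example-project',        # placeholder
    setup_requires=['pipfile'],    # fetched by setuptools before commands run
    cmdclass={'show_pipfile_deps': ShowPipfileDeps},
)
```

Something like `python setup.py show_pipfile_deps` would then defer the import until the command actually runs.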
@Korijn It might not be bad practice per se, but given that the current best practice is to use a separate virtualenv for each Python project, this would require the user to install a copy of Pipenv for each project, which is not very intuitive. Pipenv should only be installed once (usually globally), and is used from outside the project's virtualenv to manage it, not from inside it.
So what's the resolution to this that led to the issue's closure? Is there no means of keeping track of both the dependencies in the Pipfile and those in setup.py's install_requires?
For applications that are deployed or distributed in installers, I just use Pipfile. For applications that are distributed as packages with setup.py, I put all my dependencies in install_requires. Then I make my Pipfile depend on setup.py by running `pipenv install -e .`.

[Update 2019-08-23] I keep the dev packages in Pipfile nowadays; only the runtime dependencies get to live in setup.py.
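A minimal sketch of that strategy in command form (the dev tools named here are placeholders, not taken from the original comment):

```sh
# runtime dependencies stay in setup.py's install_requires; installing the
# project itself as editable makes Pipfile/Pipfile.lock track them
pipenv install -e .

# development-only tools go straight into the Pipfile's [dev-packages]
pipenv install --dev pytest flake8
```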
I think @Korijn's approach is best practice here. Pipfile (and requirements.txt) is for applications; setup.py is for packages. They serve different purposes. If you need to sync them, you're doing it wrong (IMO).
@uranusjr Not according to the documentation.
Maybe I'm just not getting it. Could you please elaborate on your statement? The way I understood it is that Pipfile is intended to replace requirements.txt as the place where a project's dependencies are declared.
@vascowhite The question you're asking isn't really about pipenv; it is about a fundamental separation between Python packaging tools. In the Python workflow, Pipfiles, like requirements files, are not meant to be traversed recursively. Instead, there is a single Pipfile which rules over all of the dependencies for the project you are developing. The point of this is that the old workflow generated a flattened list of pinned requirements, while Pipfiles contain top-level requirements and prefer unpinned ones where possible. When you install a package, the requirements declared in its install_requires are resolved and installed alongside it.

So if you want to know why Pipfiles aren't recursively resolved, it's because that's just not how they are used in Python. Running `pipenv install -e .` will pick up your package's install_requires and resolve it into the lock file.
@techalchemy I was halfway through a similar response before yours popped up 😂 (delete everything)

I would also like to note that @vascowhite, what you're asking is not in fact outlandish. With Pipfile and the lock file both being available, it is possible to reconcile the two distinct workflows. In an ideal world, Pipfile replaces setup.py's install_requires. Python's packaging system, however, is far from ideal at the present time, and it would require a lot of cleanup before this can ever happen. Heck, Pipenv is already having difficulties handling dependencies right now (p.s. not anyone's fault); it would probably barely work for anything but the simplest of projects if used like that.

The hope is not lost though (at least not mine). There have been a lot of PEPs proposed and implemented around this issue, and I feel things are on the right track, with setup.py and requirements.txt both moving toward a rigid, declarative format. With an ecosystem so large, things need to move slowly (or see Python 3.0), but they are indeed moving.
@techalchemy @uranusjr Having come from PHP, I have been confused by packaging in Python; Composer is a breeze in comparison. I do find Python much easier to develop in, and I love using it. Let's hope things improve; I'm sure they will, given the efforts of people like yourselves and Kenneth Reitz.
If you stick to my advice mentioned above, you can perfectly harmonize both setup.py and pipenv. No need to get all fussy. :)
Looks like I'm not the only one that's confused: #1398. Put much better than I could, though :)
Came here for info on using pipenv with setup.py. I have a Python package which …

As you can see I use … When I run … Therefore, +1 for not using …
I think you are expected to call `pipenv install -e .`.
@Korijn I'm still not sure about the correct workflow (still experimenting a bit with pipenv). As of yet, the workflow that seems to be working for me is: …

Now I can run my package build. If I enter the virtualenv (step 4) before installing the application locally (step 3), it does not work. Perhaps I just have to rewire my brain into remembering that packages should be installed before entering the virtualenv.
@apiraino I think you're not getting things right here. If you want to use (import) click in your package, you should put it in install_requires.

This discussion really does not have anything to do with Pipenv anymore. I suggest you bring this problem to a more suitable forum, such as Stack Overflow or a Python-related mailing list.
@Korijn This is still the case for pipenv 9.0.3. How can I generate …
Don't use quotation marks.
I stopped using quotation marks. However, I don't get a …
@benjaminweb I was confused by the same thing today. However, I'm starting to think that the current behavior may be correct. @techalchemy mentioned above that Pipfiles contain only top-level requirements and are not meant to be resolved recursively.
If you use the workflow mentioned in #1263 (comment), then when you run `pipenv install -e .`, the only package you explicitly requested to be installed into the virtualenv is the package itself (i.e. "."), so it makes sense that only "." is added to the Pipfile.

However, when the package installation step happens next, the dependencies from its install_requires are installed into the virtualenv as well. Note that unlike the Pipfile, the Pipfile.lock records all the exact dependencies for the entire virtualenv, which has to include the install_requires dependencies.

It's possible I'm totally misunderstanding how this is expected to work. Maybe @techalchemy or @uranusjr can confirm if this is the correct way of thinking about this?
Your line of thinking matches mine. I'll also mention that with recent Setuptools advancements and tools such as Flit, you can still specify your package's dependencies in nice TOML form (instead of requirement strings in setup.py, which is admittedly not very pretty). You just specify them in pyproject.toml instead of Pipfile.
@uranusjr It sounds like what you're saying is that the Pipfile only needs to explicitly list project dependencies if they are not already being captured by a packaging tool like Setuptools or Flit (via setup.py or pyproject.toml). For example, if setup.py looks like:
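(A hypothetical reconstruction; the package name and requirements below are placeholders, not taken from the original comment.)

```python
from setuptools import find_packages, setup

setup(
    name='example-project',          # placeholder name
    version='0.1.0',
    packages=find_packages(),
    # abstract runtime dependencies are declared here, not in the Pipfile
    install_requires=[
        'requests>=2.18',
        'click',
    ],
)
```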
Then the Pipfile only needs the following:
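(Again a hypothetical sketch. The `[packages]` entry is what `pipenv install -e .` records for the local project; depending on the Pipenv version the key may be the project name or a generated hash.)

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
# the project itself, installed editable; its install_requires comes along implicitly
example-project = {path = ".", editable = true}

[dev-packages]
pytest = "*"
flake8 = "*"
```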
Running `pipenv install -e .` then installs the package and everything in its install_requires into the virtualenv. If someone is already using Setuptools or Flit, is there ever any reason why dependencies should be added to the Pipfile under `[packages]`? It seems like the only reason why you would need to add dependencies explicitly to the Pipfile is if you're not using Setuptools or Flit. Is this correct? Are there reasons why this is not true?
I think it's just personal preference. Listing dev dependencies in … As for …
Why is this issue closed?
This is not a bug; you cannot use the same base entry multiple times in a Pipfile. If you specify a dependency in the `[dev-packages]` section and again in `[packages]`, the two entries can disagree with each other.

I would walk through my normal thought experiment, but I don't have time just now, so just take my word for it that it could cause dependency conflicts and surprises when you deploy something and find out your dev dependency was hiding a conflict.
@techalchemy So how can I manage my dev dependencies in this case? I just want to know how to use pipenv in a good way.
I've been thinking about this for my own project, and kind of came to realise I don't really need the …
Use the `pipenv install -e .` strategy to include setuptools `install_requires` dependencies in Pipenv. This way abstract requirements are expressed via `setup.cfg` while concrete dependencies are expressed via `Pipfile.lock`. Since extra requirements are not installed for editable dependencies (until this moment), `testing` dependencies are handled exclusively inside tox/pytest-runner venvs, and `dev` dependencies should be specified directly in the Pipfile (not included in `setup.cfg`).

ref: pypa/pipenv#1094 (comment), pypa/pipenv#1263 (comment)

Basic workflow:

- Add abstract dependencies to `setup.cfg`
- Proxy `setup.cfg` by doing `pipenv install -e .`
- Add dev dependencies by doing `pipenv install -d XXXX`
- Use `pipenv update -d` to compile concrete dependencies (and install them in a virtualenv)
- Add `Pipfile.lock` to source control for repeatable installations: https://caremad.io/posts/2013/07/setup-vs-requirement/
- Use `pipenv run` to run commands inside the venv (e.g. `pipenv run tox`)
- Don't expose test requirements directly to pip-tools. Instead, just rely on tox/pytest-runner to install them inside the test venv.
Check out the pipenv-setup package. It syncs the Pipfile/lockfile to setup.py.

You can do … and one command to solve them all 💯
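(If I remember the pipenv-setup CLI correctly, the two relevant commands look roughly like the following; treat this as an assumption and check the project's README for the exact interface.)

```sh
pipenv-setup check   # report whether setup.py and the Pipfile/lockfile have drifted apart
pipenv-setup sync    # rewrite setup.py's dependency list from the Pipfile/lockfile
```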
Is there any plan to merge the pipenv-setup package into pipenv?
@uranusjr @techalchemy Based on the discussion above, I think pipenv might have a somewhat different philosophy. But if the maintainers agree, I'd very much like to submit a pull request and try to integrate pipenv-setup into pipenv.
You can always parse the `Pipfile.lock` yourself.
@Kilo59 I've seen people doing this. A tip worth mentioning: don't forget to include Pipfile.lock as a data_file in setup.py (or include it in MANIFEST.in). And that's for the lockfile, with pinned dependencies. The Pipfile, on the other hand, is non-trivial to parse if you want semantic versioning in the Pipfile: the same dependency requirement can appear in multiple forms.
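For what it's worth, a minimal sketch of that lockfile-parsing approach, assuming Pipfile.lock ships next to setup.py (e.g. via MANIFEST.in as mentioned above) and that every entry in its "default" section carries a pinned "version" field:

```python
import json
import os

from setuptools import setup


def lockfile_requirements(filename='Pipfile.lock'):
    """Build pinned requirement strings from the lockfile's "default" section."""
    here = os.path.dirname(os.path.abspath(__file__))
    with open(os.path.join(here, filename)) as f:
        lock = json.load(f)
    # "default" holds runtime dependencies; "develop" holds dev-only ones.
    return [name + meta['version']
            for name, meta in lock['default'].items()
            if 'version' in meta]


setup(
    name='example-project',  # placeholder
    install_requires=lockfile_requirements(),
)
```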
Thank you @Madoshakalaka, your tool works nicely! I agree with the other peers that setup.py's dependencies are different from Pipfile's project dependencies. But still, having a programmable way to sync those without manual labor is a great time-saving feature. It also avoids typos and common errors. The blackened setup.py was a nice touch too 👍
I am working on a Python package with pipenv and am faced with the challenge of keeping

setup(install_requires=...)

in sync with my Pipfile's runtime dependencies. Is there a recommended approach?

[Answer 2019-08-23] Best practice, as also discussed below: keep the runtime dependencies in setup.py's install_requires, keep only the development packages in the Pipfile, and run `pipenv install -e .` so that the Pipfile/Pipfile.lock track setup.py.