documentation/question: package publishing #1288
Python has traditionally distinguished between library packaging and application packaging. The former uses `setup.py`. This, the discussion in #1263, and PEP 518 give me a project idea though. What if there were a pipeline that …
I’ll experiment a bit when I have time and report back if there’s progress 😎
I think you may want to take a look at pybuilder as a reference. It also produces …
I'd like to see a more general feature around this, like adding custom tasks/commands, not only …
Sorry for reviving this, but I have one more question: does setup.py have a place for development requirements?
Answering my own question: apparently people use extras_require by adding a "dev" option there.
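A minimal sketch of that pattern (the package name and dependency lists below are illustrative, not taken from this thread):

```python
# setup.py -- minimal sketch; names and versions are placeholders
from setuptools import setup, find_packages

setup(
    name="example-project",       # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.0",          # runtime dependencies
    ],
    extras_require={
        "dev": [                  # installed via `pip install -e ".[dev]"`
            "pytest",
            "flake8",
        ],
    },
)
```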
It would be cool to have …
@dimka665 @DrSensor From the documentation https://docs.pipenv.org/advanced/#pipfile-vs-setup-py …
It's a crazy idea that we cannot use …
I use …
@dimka665 Um, but you totally can? Pipenv uses Pipenv for development, and it works just fine. It’s just that you don’t specify library requirements in Pipfile, but in `setup.py`.
I use pipenv for libraries and still declare the dependencies in the Pipfile (but I don't track the lock file), reflected back to setup.py with pbr. Works fine so far (though I still have to automatically generate the …)
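For context, a pbr-based `setup.py` is usually just a stub; a minimal sketch, assuming the dependencies are mirrored into a `requirements.txt` that pbr reads (the generation step mentioned above is not shown):

```python
# setup.py -- pbr pulls metadata from setup.cfg and dependencies from requirements.txt
import setuptools

setuptools.setup(
    setup_requires=["pbr"],
    pbr=True,
)
```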
Seems like the pipenv CLI needs to adopt some command to add a package to …
@DrSensor this is not going to happen — it is a bad practice. Pipfiles and setup files are for different purposes. We don’t parse setup files and have no plans of building tooling to do this.
@DrSensor Unfortunately this is the current consensus from the core developers (not Pipenv, but PyPA, and Python as a whole). As suggested in #1851 (comment), you’d need to raise this to a broader audience than Pipenv to change the situation.
It's definitely not a bad practice. I have over 130 modules published in a different technology stack using the combined lib/bin strategy, and it's a wonderful optimization that simplifies development. It's even all in JSON, so I can parse it at runtime if desired, which ends up being incredibly useful. I ship libs, executables, or often both out of the same declaration file. I will definitely agree that it's fair to want them separate -- that's an opinion. But a bad practice? I think that's subjective, and I also think it's incorrect. Having to learn multiple formats makes Python development that much harder.
Edit -- sorry, I misread. Parsing setup.py probably indeed isn't worth the effort. But supporting some format that supports shipping software in both modalities is probably worth the investment.
@cdaringe - I'm just a little confused, why doesn't this work?
… then …
One missing thing from that is ensuring the lock file is updated when you update setup.py, but this is pretty easy to do in, say, a Makefile, like this:
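The Makefile snippet referenced above did not survive formatting. As a hedged sketch of the idea only (target names and commands are assumptions, not the original), it could look roughly like:

```make
# Hypothetical Makefile: rebuild the lock file whenever setup.py or the Pipfile changes.
# Assumes the project is listed in the Pipfile as an editable install ("-e ."),
# so re-locking picks up changes to install_requires in setup.py.
# Note: make recipes must be indented with a tab character.

Pipfile.lock: setup.py Pipfile
	pipenv lock

.PHONY: sync
sync: Pipfile.lock
	pipenv sync --dev
```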
@jtratner, sure, that looks like it could work. I may try that soon! To achieve the goal of publishing a library and an executable script, the proposed solution requires 3 tools and a handful of associated files, where neighboring technology stacks would require 1 tool and 1 or 2 files. Every time I come back to Python I feel lost about what the right tool for the job is: easy_install, pip, pipenv, make, pip_install, venv/virtualenv, pyenv, etc. They are all sorta kinda related, but all have unique roles. In node, you need just one thing -- I was hoping that this tool would be the …
@cdaringe I don't think that we are against changing things; I think it's really important to have the discussion about how to use the tooling in question and how to consolidate (and those conversations are actively occurring -- it's just that we are not solely responsible for the decisions... @dstufft or @ncoghlan might be able to say more about where to go with this). Roughly, I would be interested in a consolidated list of features you think we are missing, so that we can focus on those features directly rather than on what proportion of other ecosystems' tooling we have implemented -- since Python is its own beast, not everything node does belongs here, and vice versa.
Folks may also want to read http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#my-core-software-ecosystem-design-philosophy. Tightly coupling publishing tools to installation tools is useful for forming an initial tight-knit coherent publishing community, and for exploiting network effects as a commercial platform operator, but it's an approach with a limited lifespan as an ecosystem grows and the diversity of deployment and integration models increases.

For Python, the first phase of that lasted from around 1998->2004 (build with a distutils based setup.py; publish as a tarball if you had no dependencies, or as a Linux distro package if you wanted dependency management), the second phase from around 2004->2008 (build with a setuptools based setup.py, install with easy_install), and we're currently still in the third phase (build with a setuptools based setup.py, install with pip).

One key current focus of ecosystem level work is on eliminating the requirement for published projects to include a …
I haven't done any Python package publishing, but isn't setup.py just Python? Why not just parse the Pipfile TOML in the setup.py file? Like I said, I'm new to PyPI, but why not do that?
@frob It’s definitely viable, but Pipenv devs are not interested in including it in the project (personally I don’t think it’s a good fit). IIRC there are projects doing exactly this, but I can’t recall the name off the top of my head.
Right, I agree that it doesn't really need to be a part of the project, but even the docs on contributing to PyPI recommend reading the README.md for the long description. I don't see this as any different.
For example, one difficulty you will encounter taking your proposed approach is requiring a TOML parser just to run setup.py. You'll need to somehow postpone importing the dependency or ensure everyone already has it installed prior to running setup.py.
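To illustrate both the proposal and the difficulty, here is a hypothetical `setup.py` that derives `install_requires` from the Pipfile; nothing in it comes from this thread, and it only works if a TOML parser is importable before setup.py runs, which is exactly the problem described above:

```python
# setup.py -- hypothetical sketch: derive install_requires from the Pipfile.
# Requires the `toml` package to be available before setup.py runs.
import toml
from setuptools import setup, find_packages


def pipfile_requirements(path="Pipfile"):
    """Convert the [packages] table of a Pipfile into requirement strings."""
    packages = toml.load(path).get("packages", {})
    requirements = []
    for name, spec in packages.items():
        if isinstance(spec, str):
            # "*" means any version; otherwise the spec is a version constraint
            requirements.append(name if spec == "*" else name + spec)
        else:
            # e.g. {"version": ">=1.0"}; extras/VCS entries are not handled here
            requirements.append(name + spec.get("version", ""))
    return requirements


setup(
    name="example-project",  # placeholder
    version="0.1.0",
    packages=find_packages(),
    install_requires=pipfile_requirements(),
)
```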
If they are using pipenv, will they not also have the TOML parser?
It is not uncommon for packages to require extra packages to perform setup, and there is a standard way to declare it; see PEP 518. I would say a TOML parser is much nearer to the minor end of the spectrum in terms of build requirements :)
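For reference, PEP 518 build requirements are declared in a `[build-system]` table in `pyproject.toml`; a minimal example (the `toml` entry is only needed if setup.py actually imports a TOML parser):

```toml
# pyproject.toml -- PEP 518 build requirements
[build-system]
requires = ["setuptools", "wheel", "toml"]
build-backend = "setuptools.build_meta"
```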
Pipenv and the TOML parser are not installed in the virtual environment where the package will be installed. Also see all the issues with …
Forgive me, I haven't followed recent developments regarding …
The way I view this is that when I'm using pipenv with a packaged Python project, the "application" that pipenv is managing is the project test suite, rather than the library itself. So while you can declare pytoml as a build dependency, I prefer to go the other way around, and add an editable install of the local source package to Pipfile (this has historically required some workarounds, but I believe it picks up declared dependencies correctly on master now).
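Concretely, the pattern described here is `pipenv install -e .`, which records an editable entry in the Pipfile roughly like the sketch below (the package name is a placeholder):

```toml
# Pipfile -- editable install of the local source package
[packages]
example-project = {editable = true, path = "."}

[dev-packages]
pytest = "*"
```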
Can you expand on this? It isn't quite clear to me what you mean.
problem

`pipenv`'s opening documentation here is worded such that I expected that I could actually package and ship my project with pipenv. … `pipenv` …? Anyway, thanks! So far it feels like a much better UX than past tooling.

Describe your environment

n/a

Expected result

`pipenv publish` to kick off a publish cycle

Actual result

docs and/or feature not present

Steps to replicate

n/a