
[7.0.x] Rebuild for openssl3 & aws-sdk-cpp 1.9.379 #873


@regro-cf-autotick-bot (Contributor) commented Oct 25, 2022:

This PR has been triggered in an effort to update openssl3.

Notes and instructions for merging this PR:

  1. Please merge the PR only after the tests have passed.
  2. Feel free to push to the bot's branch to update this PR if needed.

Please note that if you close this PR we presume that the feedstock has been rebuilt, so if you are going to perform the rebuild yourself, don't close this PR until your rebuild has been merged.

Closes #892

If this PR was opened in error or needs to be updated please add the bot-rerun label to this PR. The bot will close this PR and schedule another one. If you do not have permissions to add this label, you can use the phrase @conda-forge-admin, please rerun bot in a PR comment to have the conda-forge-admin add it for you.

This PR was created by the regro-cf-autotick-bot. The regro-cf-autotick-bot is a service to automatically track the dependency graph, migrate packages, and propose package version updates for conda-forge. Feel free to drop us a line if there are any issues! This PR was generated by https://github.com/regro/autotick-bot/actions/runs/3324531783, please use this URL for debugging.

@conda-forge-linter commented:

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

@h-vetinari changed the title from "Rebuild for openssl3" to "[7.0.x] Rebuild for openssl3" on Oct 26, 2022.
@h-vetinari (Member) left a comment:

Blocked on conda-forge/aws-sdk-cpp-feedstock#577 (or unblocking aws-sdk-cpp 1.9)

@h-vetinari force-pushed the rebuild-openssl3-0-1_he77af5 branch 2 times, most recently from 1d419d8 to 80eb635, on December 10, 2022 01:41.
@h-vetinari (Member) commented:

@conda-forge-admin, please rerender

@h-vetinari force-pushed the rebuild-openssl3-0-1_he77af5 branch 2 times, most recently from 0482b97 to b9316cf, on December 10, 2022 03:25.
h-vetinari and others added 9 commits on December 10, 2022. Among the commit messages:
warning otherwise:
```
WARNING (pyarrow,lib/python3.10/site-packages/pyarrow/plasma-store-server): Needed DSO lib/libgflags.so.2.2 found in ['gflags']
WARNING (pyarrow,lib/python3.10/site-packages/pyarrow/plasma-store-server): .. but ['gflags'] not in reqs/run, (i.e. it is overlinking) (likely) or a missing dependency (less likely)
```
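The usual remedy for this class of overlinking warning is to declare the package that provides the DSO as a run requirement. A minimal sketch of what that could look like in meta.yaml (illustrative only; the actual section layout in this feedstock may differ):

```yaml
requirements:
  run:
    # declare gflags explicitly so that the linked libgflags.so.2.2
    # is a recorded runtime dependency rather than an overlink
    - gflags
```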
remove options not supported upstream for 7.0.x that
got backported accidentally (they are no-ops, but still)
it doesn't build on Windows; fails without much info:
```
[75/110] Building CXX object src\arrow\flight\sql\CMakeFiles\arrow_flight_sql_shared.dir\Unity\unity_0_cxx.cxx.obj
FAILED: src/arrow/flight/sql/CMakeFiles/arrow_flight_sql_shared.dir/Unity/unity_0_cxx.cxx.obj
```
@h-vetinari changed the title from "[7.0.x] Rebuild for openssl3" to "[7.0.x] Rebuild for openssl3 & aws-sdk-cpp 1.9.379" on Dec 10, 2022.
Review thread on recipe/meta.yaml (outdated diff):

```yaml
- cudatoolkit >=9.2                              # [cuda_compiler_version != "None"]
- cudatoolkit >={{ cuda_compiler_version_min }}  # [cuda_compiler_version != "None"]
```
Contributor:

This was intentional since Arrow only uses the CUDA driver library

Member:

Yes, that's why it's set to the minimum? I.e. on aarch it's 11, and cuda_compiler_version_min takes care of that at the conda-forge level?

Or can we be sure that something compiled with CUDA 10.2 is still compatible with CUDA 9?

@kkraus14 (Contributor) commented Dec 12, 2022:

It doesn't use nvcc or any of the libraries in the cudatoolkit package. It only uses the CUDA driver library, libcuda, which is installed and versioned separately.

I believe this was added because cudatoolkit has a __cuda virtual package dependency to ensure that the driver version sufficiently supports the toolkit.

I'm not sure if there was a reason that we didn't use __cuda directly here. Regardless, if there is a reason I'd be more comfortable handling that in a follow up PR instead of this PR.

Member:

Could be because that virtual package wasn't around yet at the time when arrow added CUDA support (not sure exactly about the timing)... So let's just use __cuda?
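For illustration, a minimal sketch of what depending on the virtual package directly might look like in meta.yaml (hypothetical lines, not a change made in this PR; the exact lower bound would still need to be agreed on):

```yaml
requirements:
  run:
    # require a CUDA driver new enough for the targeted toolkit via the
    # __cuda virtual package, without depending on cudatoolkit itself
    - __cuda >=9.2  # [cuda_compiler_version != "None"]
```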

Contributor:

I'm not sure. It may have been weird solver behavior at the time or something else. Regardless, could we defer changing this to a follow up PR and just leave this as cudatoolkit >=9.2 for now?

@h-vetinari (Member) commented Dec 12, 2022:

Which is why I put in a lot of effort to make it reviewable per-commit, and pinged several times for people to do so. So I don't agree that this is surprising, but well, we're all very busy and things slip through the cracks.

In any case, I don't see any harm done here: aside from the fact that CUDA < 10.2 is positively ancient, we can just loosen the pins again and, voilà, packages are installable again even for those on CUDA 9.2.

Member:

I understand you put a lot of effort into that PR, and I appreciate it.

I'm only suggesting that when a PR contains an unrelated change in the future, it would help to comment on it to identify that change and raise it for discussion. Could you please do this in the future?

I'm not trying to discuss this specific change; I agree with you that it's better handled in an issue.

Member:

> Could you please do this in the future?

Adding pings for the feedstock team has so far been almost entirely pointless, so adding more is hardly going to help. Again, clean-ups were necessary all over the place for the maintenance branches (e.g. the different bindings and components were not in sync with upstream, or across platforms; just look at the commit history per branch).

To be clear: I'm more than happy to have others participate in discussions and maintenance, and I keep pinging in the hope that this happens. Barring that, changes will be developed on main, and for big changes (like #875) I will leave ample time to ask for feedback. However, I will continue to backport minor clean-ups to the maintenance branches. In this particular instance I may have misjudged a minuscule edge case (which is also easy to fix afterwards), but the principle is sound. If people want to review, review on main, or simply participate on the feedstock.

Still, in an effort towards maximum transparency, I noted even the backports in #918:

> I have prepared the following PRs, which also backport some clean-ups from #875 (and make sure we enable all bindings/libs that are available for the respective version):

So all in all, I think this is making a mountain out of a molehill, and it's verging on disrespectful to insist on more communication after (almost) all previous attempts have gone unanswered. It's a two-way street.

Contributor:

@h-vetinari I truly appreciate all of the work you've been doing in helping to move this and other feedstocks forward. I was sick for ~3 weeks recently and truly out of commission and then was on vacation celebrating my 1st anniversary with my wife, so I've been MIA for quite a while and am still playing catch up.

I'm also perfectly happy to move forward as is here. With CUDA 12.0 having been released recently, supporting three major versions back feels more than generous.

Member:

> I was sick for ~3 weeks recently and truly out of commission and then was on vacation celebrating my 1st anniversary with my wife, so I've been MIA for quite a while and am still playing catch up.

Sorry to hear - hope you're feeling better now, and congrats on the anniversary 🥳

> With CUDA 12.0 having been released recently, supporting three major versions back feels more than generous.

Not sure how long it'll be until we support 12 in conda-forge, but in any case, this should now be compatible with 9.2 again (so 9, 10, 11). Not sure how many people are still on drivers more than 3 years old; I mainly wanted to avoid broken metadata (because we're using 10.2 as minimum now, and loosening restrictions after the fact is always easier than tightening them).

@h-vetinari added the automerge label (Merge the PR when CI passes) on Dec 12, 2022.
The github-actions bot merged commit f3f6178 into conda-forge:7.0.x on Dec 13, 2022.
@github-actions (bot) commented:

Hi! This is the friendly conda-forge automerge bot!

I considered the following status checks when analyzing this PR:

  • linter: passed
  • azure: passed

Thus the PR was passing and merged! Have a great day!
