
Use namedtuple to store curve results (ROC, PR) #572

Merged
5 commits merged from use_namedtuple_curves into main on Mar 5, 2024

Conversation

amrit110 (Member) commented Mar 4, 2024

PR Type

Refactor; addresses #558

Short Description

  • Use namedtuple to store ROC and PR curve results in the evaluation metrics (experimental). This avoids bugs that arise when results are accessed by positional index, while still allowing the results to be unpacked in function calls (see the sketch after this list).
  • Fixed the classification plotter to access curve results by attribute instead of by index.
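
Roughly, the change amounts to the following pattern (a minimal sketch; the field and function names below are illustrative assumptions, not the exact ones used in cyclops):

```python
from collections import namedtuple

# Hypothetical curve container; the real metric returns something analogous.
ROCCurve = namedtuple("ROCCurve", ["fpr", "tpr", "thresholds"])


def binary_roc_stub(target, preds):
    """Stand-in for the real metric: returns the curve as a namedtuple."""
    fpr, tpr, thresholds = [0.0, 0.5, 1.0], [0.0, 0.8, 1.0], [1.0, 0.5, 0.0]
    return ROCCurve(fpr, tpr, thresholds)


curve = binary_roc_stub(target=[0, 1, 1], preds=[0.1, 0.6, 0.9])
fpr, tpr, thresholds = curve  # unpacking still works, since namedtuple is a tuple
assert curve.tpr == tpr       # plotting code can read attributes instead of indices
```

Because a namedtuple is a tuple subclass, call sites that index or unpack the old result keep working, while new code reads `curve.fpr` instead of `curve[0]`.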

Tests Added

...

@amrit110 added the `refactor` label (Refactor existing code, with same or similar functionality) on Mar 4, 2024
@amrit110 requested review from rjavadi and fcogidi on Mar 4, 2024 14:46
@amrit110 self-assigned this on Mar 4, 2024
codecov bot commented Mar 4, 2024

Codecov Report

Attention: Patch coverage is 87.80488%, with 10 lines in your changes missing coverage. Please review.

Project coverage is 70.78%. Comparing base (e92bcca) to head (838bc96).

Additional details and impacted files


@@            Coverage Diff             @@
##             main     #572      +/-   ##
==========================================
+ Coverage   70.70%   70.78%   +0.08%     
==========================================
  Files         126      126              
  Lines       11220    11262      +42     
==========================================
+ Hits         7933     7972      +39     
- Misses       3287     3290       +3     
Files Coverage Δ
.../experimental/functional/precision_recall_curve.py 86.52% <100.00%> (+0.29%) ⬆️
...ps/evaluate/metrics/experimental/functional/roc.py 90.99% <100.00%> (+0.60%) ⬆️
...ate/metrics/experimental/precision_recall_curve.py 100.00% <100.00%> (ø)
cyclops/evaluate/metrics/experimental/roc.py 100.00% <100.00%> (ø)
...luate/metrics/functional/precision_recall_curve.py 64.94% <100.00%> (+1.31%) ⬆️
cyclops/evaluate/metrics/functional/roc.py 77.77% <100.00%> (+1.69%) ⬆️
cyclops/evaluate/metrics/precision_recall_curve.py 62.32% <100.00%> (+0.79%) ⬆️
cyclops/evaluate/metrics/roc.py 90.69% <100.00%> (+0.95%) ⬆️
cyclops/evaluate/metrics/utils.py 80.28% <100.00%> (+0.86%) ⬆️
cyclops/report/plot/classification.py 0.00% <0.00%> (ø)


rjavadi (Collaborator) commented Mar 4, 2024

Looks good to me.

Related commit (#574):

* Use namedtuple to store curve results (ROC, PR) for non-experimental metrics
* Remove commented-out doc example
* Handle named tuples in `_apply_function_recursively` and fix doctests (see the sketch below)

Co-authored-by: Franklin <41602287+fcogidi@users.noreply.github.com>
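
For the `_apply_function_recursively` change, the usual namedtuple gotcha is that `type(obj)(iterable)` fails, because a namedtuple constructor takes positional fields rather than a single iterable. A minimal sketch of how such a helper might handle this (the function below is a hypothetical illustration, not the cyclops implementation):

```python
from collections import namedtuple
from typing import Any, Callable


def apply_function_recursively(func: Callable[[Any], Any], obj: Any) -> Any:
    """Apply ``func`` to every leaf value inside nested containers."""
    if isinstance(obj, tuple) and hasattr(obj, "_fields"):
        # Namedtuples can't be rebuilt with type(obj)(generator); use _make.
        return type(obj)._make(apply_function_recursively(func, v) for v in obj)
    if isinstance(obj, (list, tuple)):
        return type(obj)(apply_function_recursively(func, v) for v in obj)
    if isinstance(obj, dict):
        return {k: apply_function_recursively(func, v) for k, v in obj.items()}
    return func(obj)


ROCCurve = namedtuple("ROCCurve", ["fpr", "tpr", "thresholds"])
doubled = apply_function_recursively(lambda v: v * 2, {"roc": ROCCurve(1, 2, 3)})
print(doubled)  # {'roc': ROCCurve(fpr=2, tpr=4, thresholds=6)}
```

The `_make` classmethod exists on every namedtuple and accepts an iterable, so the rebuilt object keeps its field names.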
@amrit110 requested a review from fcogidi on March 5, 2024 18:53
@amrit110 merged commit f34e373 into main on Mar 5, 2024
10 checks passed
@amrit110 deleted the use_namedtuple_curves branch on March 5, 2024 20:25