Support for rebalanced LeaveOneGroupOut#5

Open
alex-1001 wants to merge 2 commits into main from rebalanced-leave-one-group-out

Conversation

@alex-1001

Adds RebalancedLeaveOneGroupOut, a leave-one-group-out cross-validator that rebalances the training set so every fold has the same class balance, matching the behavior of the other RebalancedCV splitters when splitting by groups.

Implementation: for each fold, one group is held out as the test set and the remaining groups form the training set. The training set is then subsampled so that every fold ends up with the same number of samples per class. The target count for a class is the smallest number of training samples that class can have across folds, which equals the class's total count minus its largest per-group count (the worst case is the fold that leaves out the group holding the most samples of that class).
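A minimal sketch of the rebalancing logic described above (this is illustrative, not the PR's actual code; the function name and the generator-based API are assumptions):

```python
import numpy as np

def rebalanced_logo_splits(y, groups, rng=None):
    """Sketch of rebalanced leave-one-group-out splitting.

    For each class c, the smallest training-set count across folds is
    total_count(c) - max_per_group_count(c): the worst case is the fold
    that leaves out the group holding the most samples of c. Every fold's
    training set is subsampled so each class has exactly that many samples.
    """
    y = np.asarray(y)
    groups = np.asarray(groups)
    rng = np.random.default_rng(rng)
    classes = np.unique(y)
    uniq_groups = np.unique(groups)

    # Per-class cap: smallest count any fold's training set will have.
    cap = {}
    for c in classes:
        per_group = np.array([np.sum((y == c) & (groups == g)) for g in uniq_groups])
        cap[c] = per_group.sum() - per_group.max()

    for g in uniq_groups:
        test_idx = np.flatnonzero(groups == g)
        train_mask = groups != g
        parts = []
        for c in classes:
            idx_c = np.flatnonzero(train_mask & (y == c))
            # Subsample this class down to the shared cap.
            parts.append(rng.choice(idx_c, size=cap[c], replace=False))
        yield np.sort(np.concatenate(parts)), test_idx
```

With this scheme every training set contains identical per-class counts, so the class balance seen by the model is constant across folds.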

Supports binary and multi-class labels

Passes all tests locally (100% coverage)

support binary/multi-class. included tests (100% cov)
@alex-1001 alex-1001 requested a review from gaustin15 February 3, 2026 17:57
documentation
- more informative docstrings
- explanation of when/when not to use rebalanced LOGO

_iter_test_masks/get_n_splits
- implementation now almost identical to LeaveOneGroupOut
- uses check_array() for validation
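The commit above describes following the LeaveOneGroupOut pattern for `_iter_test_masks`/`get_n_splits`. A minimal sketch of that pattern, using numpy only (`np.asarray` stands in for scikit-learn's `check_array()` validation; the class name is hypothetical):

```python
import numpy as np

class LeaveOneGroupOutSketch:
    """Illustrative sketch of the LeaveOneGroupOut-style splitter pattern."""

    def _iter_test_masks(self, groups):
        # Stand-in for sklearn's check_array() input validation.
        groups = np.asarray(groups)
        # One boolean test mask per unique group.
        for g in np.unique(groups):
            yield groups == g

    def get_n_splits(self, groups):
        # Number of folds equals the number of unique groups.
        return len(np.unique(np.asarray(groups)))
```

Keeping the splitter almost identical to LeaveOneGroupOut, with rebalancing layered on top of the training indices, keeps the test-side behavior consistent with the upstream cross-validator.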
