
Fix issue in KL divergence estimator with non-unique values
Signed-off-by: Patrick Bloebaum <bloebp@amazon.com>
bloebp committed Oct 28, 2024
1 parent e79d012 commit 8e84307
Showing 1 changed file with 1 addition and 1 deletion.
dowhy/gcm/divergence.py

@@ -61,7 +61,7 @@ def estimate_kl_divergence_continuous_knn(
     # Making sure that X and Y have no overlapping values, which would lead to a distance of 0 with k=1 and, thus, to
     # a division by zero.
     if remove_common_elements:
-        X = setdiff2d(X, Y, assume_unique=True)
+        X = setdiff2d(X, Y, assume_unique=False)
     if X.shape[0] < k + 1:
         # All elements are equal (or at least less than k samples are different)
         return 0
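
For context, a minimal sketch of why the flag matters, assuming setdiff2d delegates the row-wise set difference to np.setdiff1d (the shared assume_unique keyword suggests this; the data below is hypothetical):

import numpy as np

# Non-unique sample: 1.0 appears twice in x.
x = np.array([1.0, 1.0, 2.0, 3.0])
y = np.array([2.0])

# assume_unique=False lets NumPy deduplicate both inputs first; the result
# is correct and contains only unique values, even for non-unique inputs.
print(np.setdiff1d(x, y, assume_unique=False))  # [1. 3.]

# assume_unique=True skips that deduplication step. Per the NumPy docs,
# non-unique inputs can then produce incorrect results, and any duplicate
# that survives into X is fatal for the k-NN estimator: an identical copy
# is a nearest neighbor at distance 0, which is exactly the division by
# zero the surrounding guard is meant to prevent.

Since np.setdiff1d with assume_unique=False returns only unique values, the fix also deduplicates X as a side effect, so no remaining sample has an exact copy among its nearest neighbors.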
