
Derive logprob of Leaky ReLU transform #7543

Open
ricardoV94 opened this issue Oct 18, 2024 · 0 comments


ricardoV94 commented Oct 18, 2024

Description

import pymc as pm

a = 0.5
x = pm.Normal.dist()
y = pm.math.switch(x > 0, x, a * x)
pm.logp(y, 2.3).eval()  # NotImplementedError

We already have a logprob derivation for mixture switches where the condition is constant, but not when the condition depends on the same measurable variable that appears in the branches. This is not a mixture but an invertible transform.

We could support arbitrary functions on the two branches as long as each domain retains the same sign after the transformation (so that it's easy to invert). To respect this, the leaky ReLU actually requires a to be positive, so a runtime check may be needed.
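For intuition, the derivation this issue asks for is the standard change-of-variables formula applied piecewise: invert each branch and add the log absolute Jacobian of the inverse. A minimal sketch with scipy (the helper name `leaky_relu_logp` is hypothetical, not a PyMC API):

```python
import numpy as np
from scipy import stats

def leaky_relu_logp(y, a=0.5):
    """Logprob of y = leaky_relu(x) where x ~ Normal(0, 1).

    Hypothetical illustration of the requested derivation, not PyMC code.
    """
    # Invertibility requires a positive slope (the runtime check mentioned above).
    assert a > 0, "leaky ReLU slope must be positive"
    # Inverse: since a > 0, y and x share a sign, so the branch is known from y.
    x = np.where(y > 0, y, y / a)
    # log|dx/dy| is 0 on the positive branch and -log(a) on the negative one.
    log_jac = np.where(y > 0, 0.0, -np.log(a))
    return stats.norm.logpdf(x) + log_jac

leaky_relu_logp(2.3)   # positive branch: same as stats.norm.logpdf(2.3)
leaky_relu_logp(-1.0)  # negative branch: logpdf(-2.0) + log(2)
```

A generic rewrite would do the same per branch: check the slopes keep each branch sign-preserving, pick the branch from the sign of the value, then apply the branch's inverse and Jacobian term.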

We could also support arbitrary cutoff points, but it becomes increasingly tricky to figure out which branch to go down when inverting the graph.

In any case, because it is such a common transformation, it would be great to at least support the special case of the leaky ReLU.

@ricardoV94 ricardoV94 changed the title Derive logprob of Leaky relu transform Derive logprob of Leaky ReLU transform Oct 18, 2024