
Ensuring the planar flow transformation is invertible #2

Open

talesa opened this issue Sep 4, 2018 · 6 comments

Comments

talesa commented Sep 4, 2018

Hey! I believe you should reparameterize u and w in the planar flow like they do in pymc3 (link below) to ensure the planar flow transformation is invertible; see the appendix of the paper https://arxiv.org/abs/1505.05770.

https://github.com/pymc-devs/pymc3/blob/1cdd1631bea48fef8d140e37c3588a8208498ba0/pymc3/variational/flows.py#L374
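
For reference, the construction in the paper's appendix (which the linked pymc3 code follows) replaces u by a vector \hat{u} satisfying w^\top \hat{u} \ge -1, which is sufficient for f(z) = z + \hat{u}\,\tanh(w^\top z + b) to be invertible:

\hat{u}(w, u) = u + \big[m(w^\top u) - w^\top u\big]\,\frac{w}{\lVert w \rVert^2}, \qquad m(x) = -1 + \log(1 + e^x)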

ex4sperans (Owner) commented

Hey! Right, I agree that a naive implementation of the planar flow is not always invertible, but I believe invertibility is not required in the case of KL minimization (which is what I do). Please correct me if my understanding is wrong.

Anyway, it should definitely be implemented to be always invertible, thank you.

kingofspace0wzz commented

I think it would not be a proper probability distribution if the flow is not invertible, since you need the determinant of the inverse Jacobian to transform the density of a random variable.
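
For completeness, the change-of-variables rule this relies on, for an invertible f with z' = f(z), is

\log q(z') = \log q(z) - \log\left|\det \frac{\partial f}{\partial z}\right|,

so without invertibility the transformed density is not well defined in general.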

xuChenSJTU commented

Hi guys, has the code been revised to make the planar flow transformation invertible?
@ex4sperans @kingofspace0wzz @talesa

talesa (Author) commented Oct 8, 2019

thipokKub commented Oct 15, 2019

I haven't tested my code, but this should do the trick:

import torch
import torch.nn as nn
import torch.nn.functional as F

class PlanarFlow(nn.Module):
    """Planar flow f(z) = z + u_hat * tanh(w^T z + b), with u reparameterized
    so that w^T u_hat >= -1, which guarantees invertibility (paper appendix)."""
    def __init__(self, dim):
        super().__init__()
        self.weight = nn.Parameter(torch.Tensor(1, dim))  # w
        self.scale = nn.Parameter(torch.Tensor(1, dim))   # u
        self.bias = nn.Parameter(torch.Tensor(1))         # b
        self.tanh = nn.Tanh()
        self.reset_parameters()
    def reset_parameters(self):
        self.weight.data.uniform_(-0.01, 0.01)
        self.scale.data.uniform_(-0.01, 0.01)
        self.bias.data.uniform_(-0.01, 0.01)
    def forward(self, z):
        activation = F.linear(z, self.weight, self.bias)
        scale = self.get_scale(self.scale, self.weight)
        return z + scale * self.tanh(activation)
    def get_scale(self, scale, weight):
        # u_hat = u + (m(w^T u) - w^T u) * w / ||w||^2, with m(x) = -1 + softplus(x)
        wu = torch.sum(weight * scale, dim=-1, keepdim=True)
        mwu = -1 + F.softplus(wu)
        u_hat = scale + (mwu - wu) * weight / (torch.norm(weight) ** 2 + 1e-8)
        return u_hat

class PlanarFlowLogDetJacobian(nn.Module):
    """log|det df/dz| = log|1 + u_hat^T psi(z)|, psi(z) = (1 - tanh^2(w^T z + b)) * w.
    Shares its parameters with the PlanarFlow instance passed in."""
    def __init__(self, affine):
        super().__init__()
        self.weight = affine.weight
        self.bias = affine.bias
        self.scale = affine.scale
        self.tanh = affine.tanh
    def forward(self, z):
        activation = F.linear(z, self.weight, self.bias)
        psi = (1 - self.tanh(activation) ** 2) * self.weight
        scale = self.get_scale(self.scale, self.weight)
        det_grad = 1 + torch.mm(psi, scale.t())
        return torch.log(det_grad.abs() + 1e-8)
    def get_scale(self, scale, weight):
        wu = torch.sum(weight * scale, dim=-1, keepdim=True)
        mwu = -1 + F.softplus(wu)
        u_hat = scale + (mwu - wu) * weight / (torch.norm(weight) ** 2 + 1e-8)
        return u_hat
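
A minimal usage sketch, assuming the two classes above (stacking several layers and accumulating the log-determinants, e.g. for the KL objective):

import torch

dim, batch_size, n_flows = 2, 64, 4
flows = [PlanarFlow(dim) for _ in range(n_flows)]
log_dets = [PlanarFlowLogDetJacobian(f) for f in flows]  # shares parameters with each flow

z = torch.randn(batch_size, dim)            # z_0 ~ N(0, I)
sum_log_det = torch.zeros(batch_size, 1)
for flow, log_det in zip(flows, log_dets):
    sum_log_det = sum_log_det + log_det(z)  # log|det df_k/dz| evaluated at z_{k-1}
    z = flow(z)                             # z_k = f_k(z_{k-1})
# log q_K(z_K) = log q_0(z_0) - sum_log_det by the change of variables formula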

Quick question: given that planar-flow invertibility is ensured, is there a closed-form formula for the inverse?

emilemathieu commented

@thipokKub Unfortunately not; one can ensure invertibility without having a closed-form expression for the inverse. See "Invertible Residual Networks" for a more extreme example.
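
For a single planar layer, the inverse can still be obtained numerically: projecting y = z + u_hat * tanh(w^T z + b) onto w gives a monotone scalar equation in a = w^T z (monotone because w^T u_hat >= -1 under the reparameterization above), which can be solved by bisection; z is then recovered as y - u_hat * tanh(a + b). A rough sketch, assuming the PlanarFlow class from the earlier comment (the planar_inverse helper below is hypothetical, not part of the repo):

import torch

def planar_inverse(flow, y, n_iter=100):
    # Hypothetical helper: numerically inverts one PlanarFlow layer at points y of shape (batch, dim).
    w = flow.weight                                   # (1, dim)
    b = flow.bias                                     # (1,)
    u_hat = flow.get_scale(flow.scale, flow.weight)   # (1, dim), satisfies w^T u_hat >= -1
    wu = torch.sum(w * u_hat)                         # scalar
    wy = torch.sum(y * w, dim=-1, keepdim=True)       # (batch, 1)
    # Solve g(a) = a + wu * tanh(a + b) = w^T y for a = w^T z by bisection;
    # g is non-decreasing since g'(a) = 1 + wu * (1 - tanh(a + b)^2) >= 0 when wu >= -1.
    lo = wy - wu.abs() - 1.0                          # g(lo) <= w^T y
    hi = wy + wu.abs() + 1.0                          # g(hi) >= w^T y
    for _ in range(n_iter):
        mid = (lo + hi) / 2
        too_small = mid + wu * torch.tanh(mid + b) < wy
        lo = torch.where(too_small, mid, lo)
        hi = torch.where(too_small, hi, mid)
    a = (lo + hi) / 2
    return y - u_hat * torch.tanh(a + b)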
