[Bug] Function RootDecompositionBackward error when batch_size > 10 with MultiTaskDeepGP #2531

ybwang75 opened this issue Jun 15, 2024 · 0 comments
In my model, I have constructed a MultiTaskDeepGP object named A. I call rsample on A to draw samples that are concatenated with the output of another model. Because of this, I initially set gpytorch.settings.num_likelihood_samples(1).

Additionally, since I need to index values when computing the MLL loss, I index the mean and covariance of A to obtain two submatrices. I then use these two submatrices to reconstruct a MultiTaskDeepGP output distribution named B, which I use to compute the MLL loss. However, I am now encountering the following error:

[screenshot: traceback showing an error in RootDecompositionBackward]

When I set my batch size to 10 or less, no error occurs; as soon as it exceeds 10, the error appears. Can anyone help me resolve this issue?
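To make the report easier to reproduce, here is a minimal stand-in sketch of the workflow described above. It uses torch.distributions.MultivariateNormal in place of the actual GPyTorch MultiTaskDeepGP objects, so it only mirrors the shape of the operations (slice A's mean/covariance, rebuild a distribution B, backpropagate through a likelihood-style loss); all names (dist_a, dist_b, idx) and sizes are illustrative assumptions, not taken from the issue.

```python
import torch
from torch.distributions import MultivariateNormal

batch_size = 16   # the error reportedly appears once this exceeds 10
n, k = 8, 4       # total output points, and the subset that is indexed

# Build a batched distribution "A" with a guaranteed-SPD covariance.
L = torch.randn(batch_size, n, n, requires_grad=True)
covar = L @ L.transpose(-1, -2) + torch.eye(n)
mean = torch.randn(batch_size, n)
dist_a = MultivariateNormal(mean, covariance_matrix=covar)

# Index submatrices of A's mean and covariance to rebuild "B".
idx = torch.arange(k)
sub_mean = dist_a.mean[..., idx]
sub_covar = dist_a.covariance_matrix[..., idx, :][..., :, idx]
dist_b = MultivariateNormal(sub_mean, covariance_matrix=sub_covar)

# Negative log-likelihood standing in for the MLL loss; backward() is
# the step where a RootDecompositionBackward error would surface when
# the covariance is a GPyTorch lazy/linear operator.
target = torch.randn(batch_size, k)
loss = -dist_b.log_prob(target).mean()
loss.backward()
```

In plain torch this runs cleanly for any batch size, which suggests the failure is specific to how GPyTorch's root decomposition handles the sliced covariance operator rather than to the indexing itself.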
