Conditional model P(.|G, q) with multiple values of global q for the same graph G #9883
KubaMichalczyk asked this question in Q&A (unanswered, 0 replies).
I'm working on a node classification problem which estimates the conditional probability $P(N = i | G, q)$, where $G$ is a graph and $q$ is a global query.

This is an inductive learning setup where each training instance consists of a temporal, fully-connected graph $G$ with $n$ nodes. The tensor dimensions are:

- `x`: `(N, C, T)`, where N = nodes, C = channels, T = time steps
- `q`: `(P, N=1, F, T=1)`, where P = number of query points, F = query feature dimension

I am wondering how I can incorporate $q$ into my model in an efficient way.
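For concreteness, a toy instance with these shapes might be set up like this (all sizes are made-up placeholders, and the fully-connected `edge_index` construction is just one way to build it):

```python
import torch

# Toy sizes (placeholders, not the real ones).
N, C, T = 10, 16, 20   # nodes, channels, time steps
P, F = 5, 8            # query points, query feature dimension

# Node features of the temporal graph: (N, C, T).
x = torch.randn(N, C, T)

# Global queries, each of shape (N=1, F, T=1), stacked to (P, 1, F, 1).
q = torch.randn(P, 1, F, 1)

# Fully-connected directed edge_index over N nodes (self-loops excluded).
row = torch.arange(N).repeat_interleave(N)
col = torch.arange(N).repeat(N)
mask = row != col
edge_index = torch.stack([row[mask], col[mask]], dim=0)  # (2, N*(N-1))
```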
So far, I tried to create P artificial copies of the graph $G$ (including adjusting `edge_index`) and concatenate `q` to `x`, as recommended here, obtaining a new `x` of shape `(P*N, C+F, T)` after concatenation. While broadcasting with the `.view` method doesn't occupy additional memory, memory usage blew up during the pass through the `GATv2Conv` layer (probably due to the multiplied number of edges and the associated computations). So this is an option when P is small, but it becomes prohibitive when P grows into the thousands. Interestingly, during testing it turned out that concatenating `q` to `x` before the graph layer was beneficial compared to concatenating it after.
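For reference, this is roughly what the copy-based attempt looks like (a simplified sketch: I drop the T dimension for clarity, and the toy sizes, hidden width and head count are arbitrary):

```python
import torch
from torch_geometric.data import Data, Batch
from torch_geometric.nn import GATv2Conv

N, C, F, P, hidden = 10, 16, 8, 4, 32  # toy sizes

x = torch.randn(N, C)                        # node features (T dropped for clarity)
q = torch.randn(P, F)                        # P global queries
row = torch.arange(N).repeat_interleave(N)
col = torch.arange(N).repeat(N)
edge_index = torch.stack([row, col], dim=0)  # fully connected

# P artificial copies of the same graph; Batch.from_data_list offsets edge_index.
batch = Batch.from_data_list([Data(x=x, edge_index=edge_index) for _ in range(P)])

# Attach query p to every node of copy p, then concatenate to the node features.
q_per_node = q.repeat_interleave(N, dim=0)        # (P*N, F)
x_cat = torch.cat([batch.x, q_per_node], dim=-1)  # (P*N, C+F)

conv = GATv2Conv(C + F, hidden, heads=1)
out = conv(x_cat, batch.edge_index)               # (P*N, hidden)
# Memory scales with P both in the node features and, more importantly,
# in the number of edges the attention layer has to process.
```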
Alternatively, I'm wondering whether representing this as a `HeteroData` object, with $q$ forming a set of P nodes of a second type, could be a solution here. However, I have to admit I've only done some initial reading on heterogeneous graph handling in PyG, and I don't yet see what the computational overhead would be in this case (in the end, this will add a lot of new edges to handle as well, though possibly significantly fewer than with the explicit copying described above). Could this be the way to go for this type of problem?
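Roughly, what I have in mind is something like the sketch below (the node/edge type names, the bipartite `GATv2Conv` wiring and the `HeteroConv` aggregation are my guesses at how this would be set up, not an established recipe):

```python
import torch
from torch_geometric.data import HeteroData
from torch_geometric.nn import HeteroConv, GATv2Conv

N, C, F, P, hidden = 10, 16, 8, 4, 32  # toy sizes

data = HeteroData()
data['node'].x = torch.randn(N, C)      # graph nodes (T dropped for clarity)
data['query'].x = torch.randn(P, F)     # q as P nodes of a second type

# Original fully-connected graph among 'node' nodes.
row = torch.arange(N).repeat_interleave(N)
col = torch.arange(N).repeat(N)
data['node', 'to', 'node'].edge_index = torch.stack([row, col], dim=0)

# Every query node sends a message to every graph node: P*N extra edges,
# instead of the P*N*N edges created by explicit copying.
q_src = torch.arange(P).repeat_interleave(N)
n_dst = torch.arange(N).repeat(P)
data['query', 'to', 'node'].edge_index = torch.stack([q_src, n_dst], dim=0)

conv = HeteroConv({
    ('node', 'to', 'node'): GATv2Conv(C, hidden, add_self_loops=False),
    ('query', 'to', 'node'): GATv2Conv((F, C), hidden, add_self_loops=False),
}, aggr='sum')

out_dict = conv(data.x_dict, data.edge_index_dict)
out = out_dict['node']                  # (N, hidden)
# Note: with this wiring, messages from all P queries are aggregated into a
# single representation per graph node, rather than P separate ones.
```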
Or maybe I'm just overthinking this and there is a better way to perform this computation efficiently, besides simply letting the graph layer work on `x` alone and blending it with (embedded) `q` afterwards?
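For completeness, this is roughly what that last option would look like (again a simplified sketch with the T dimension dropped; the additive fusion and the linear scoring head are just one arbitrary choice):

```python
import torch
from torch_geometric.nn import GATv2Conv

N, C, F, P, hidden = 10, 16, 8, 4, 32  # toy sizes

x = torch.randn(N, C)
q = torch.randn(P, F)
row = torch.arange(N).repeat_interleave(N)
col = torch.arange(N).repeat(N)
edge_index = torch.stack([row, col], dim=0)

# Run the graph layer once, on x alone: cost is independent of P.
conv = GATv2Conv(C, hidden)
h = conv(x, edge_index)                       # (N, hidden)

# Embed q and blend it with the node embeddings by broadcasting.
q_emb = torch.nn.Linear(F, hidden)(q)         # (P, hidden)
fused = h.unsqueeze(0) + q_emb.unsqueeze(1)   # (P, N, hidden)

# Per-query node scores, e.g. for P(N = i | G, q).
head = torch.nn.Linear(hidden, 1)
logits = head(fused).squeeze(-1)              # (P, N)
probs = logits.softmax(dim=-1)                # each row sums to 1 over nodes
```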