How is LRP used in ResNet? #192
Comments
I don't know if this is correct, but in my case I simply add the relevance of the two branches and divide by 2. In theory both branches receive the same input (which is later added back together), so the relevance should be the sum of the branch values divided by the number of branches.
@bernerprzemek But if what you say is true, then the so-called 'identity' that the sum of relevance values stays constant across layers would no longer hold! Anyway, I have used a different XAI method for my research, so I do not strictly need the answer.
Yes, probably not, but keep in mind that a residual connection isn't simply y = f(x) but y = f(x) + x, and this has to be accounted for in the conservation rule. The same happens in a batch-normalization layer, where the sum over all relevance values is no longer the same as in the previous layer.
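A minimal sketch (plain NumPy, hypothetical function name, not this repo's API) of how relevance arriving at an addition node y = f(x) + x can be redistributed to the two summands in proportion to their contributions. When both summands contribute equally this reduces to the divide-by-2 rule mentioned above, and in every case the total relevance at the node is conserved:

```python
import numpy as np

def lrp_add(R_out, a_branch, a_skip, eps=1e-9):
    """Split relevance at y = a_branch + a_skip proportionally to each
    summand's contribution (z-rule applied to an addition node).
    R_out: relevance arriving at the output of the addition.
    Returns the relevance assigned to each input branch."""
    z = a_branch + a_skip                           # forward activation y
    denom = z + eps * np.where(z >= 0, 1.0, -1.0)   # stabilizer against division by zero
    R_branch = a_branch / denom * R_out             # share of the residual branch f(x)
    R_skip = a_skip / denom * R_out                 # share of the identity/skip branch x
    return R_branch, R_skip

# Toy check: the split changes per element, but the total relevance is conserved.
a_branch = np.array([0.5, 2.0, -1.0])
a_skip   = np.array([0.5, 1.0,  3.0])
R_out    = np.array([1.0, 3.0,  2.0])
R_b, R_s = lrp_add(R_out, a_branch, a_skip)
print(R_b + R_s)                         # equals R_out elementwise
print(R_out.sum(), (R_b + R_s).sum())    # identical totals -> conservation holds
```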
This is actually a theoretical question: how are relevance values propagated back in a ResNet, where there are skip connections?
Is the value divided equally between the neurons of the previous layer and the neurons of the layer the skip connection originates from? If so, the assertion that the sum of the relevance of all neurons is equal in every layer would seem to be wrong, since some relevance has 'leaked' to earlier layers through the skip connection.
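To make the conservation question concrete, here is a small sketch (pure NumPy, made-up two-layer toy, not this project's API) of backpropagating relevance through one residual block. The relevance sent down the skip path is not lost: it re-merges with the relevance coming back through the main branch at the layer where the skip connection originates, so the per-layer sum stays equal to the output relevance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy residual block: x -> f(x) = relu(W x), y = f(x) + x
W = rng.normal(size=(4, 4))
x = rng.uniform(0.1, 1.0, size=4)       # positive toy activations
a = np.maximum(W @ x, 0.0)              # main-branch activation f(x)
y = a + x                               # skip connection merges here

def lrp_linear(R_out, W, a_in, z_out, eps=1e-9):
    """z-rule for a linear layer: redistribute R_out to the inputs
    in proportion to their contributions z_ij = W_ij * a_in_j."""
    denom = z_out + eps * np.where(z_out >= 0, 1.0, -1.0)
    return a_in * (W.T @ (R_out / denom))

# Start with the output relevance and split it at the addition node.
R_y = y.copy()                          # standard LRP initialization: R = output
denom = y + 1e-9
R_branch = a / denom * R_y              # relevance entering the main branch
R_skip   = x / denom * R_y              # relevance taking the shortcut

# Propagate the main-branch share back through the linear layer to x,
# then MERGE it with the skip share where the skip connection started.
R_x = lrp_linear(R_branch, W, x, W @ x) + R_skip

# All three sums are (up to the stabilizer) equal: conservation is not violated,
# the relevance is only routed along two paths and added back together.
print(R_y.sum(), (R_branch + R_skip).sum(), R_x.sum())
```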