| field | value |
|---|---|
| title | Does Geometric Structure in Convolutional Filter Space Provide Filter Redundancy Information? |
| abstract | This paper studies the geometric structure of a CNN filter space to investigate the redundancy or importance of individual filters. In particular, it analyses the convolutional-layer filter space using simplicial geometry to establish a relation between a filter's relevance and its location on the simplex. Convex combinations of the extremal points of a simplex span its entire volume; as a result, these points are inherently the most relevant components. Based on this principle, we hypothesise that filters lying near the extremal points of a simplex modelling the filter space are the least redundant, and vice versa. We validate this positional relevance hypothesis by successfully employing it for data-independent filter ranking and artificial filter fabrication in trained convolutional neural networks. Empirical analysis on different CNN architectures, such as ResNet-50 and VGG-16, provides strong evidence in favour of the postulated positional relevance hypothesis. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | thakur23a |
| month | 0 |
| tex_title | Does Geometric Structure in Convolutional Filter Space Provide Filter Redundancy Information? |
| firstpage | 111 |
| lastpage | 121 |
| page | 111-121 |
| order | 111 |
| cycles | false |
| bibtex_author | Thakur, Anshul and Abrol, Vinayak and Sharma, Pulkit |
| author | |
| date | 2023-02-07 |
| address | |
| container-title | Proceedings of the 1st NeurIPS Workshop on Symmetry and Geometry in Neural Representations |
| volume | 197 |
| genre | inproceedings |
| issued | |
| extras | |
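The abstract above describes ranking filters by their proximity to the extremal points of a simplex modelling the filter space. The sketch below is a rough, hypothetical illustration of that idea, not the authors' procedure: it flattens the filters of one conv layer of a pretrained VGG-16 (PyTorch/torchvision are assumed), approximates extremal points with greedy furthest-point sampling, and ranks filters by distance to the nearest approximate extreme.

```python
# Illustrative sketch only: approximate "extremal" filters of a trained conv
# layer and rank the remaining filters by how close they lie to those extremes.
# The paper's simplex-based formulation may differ; this is an assumption-laden stand-in.
import torch
import torchvision.models as models


def rank_filters_by_extremality(weight, num_extremes=8):
    """weight: conv weight of shape (out_channels, in_channels, k, k).
    Returns filter indices sorted from hypothetically least to most redundant
    (filters closest to an approximate extremal point come first)."""
    filters = weight.detach().reshape(weight.shape[0], -1)   # one row vector per filter
    filters = filters - filters.mean(dim=0, keepdim=True)    # centre the filter cloud

    # Greedy furthest-point sampling as a cheap stand-in for extremal points:
    # start from the filter farthest from the centroid, then repeatedly add the
    # filter farthest from the current set of extremes.
    extremes = [torch.argmax(filters.norm(dim=1)).item()]
    for _ in range(num_extremes - 1):
        dist = torch.cdist(filters, filters[extremes]).min(dim=1).values
        extremes.append(torch.argmax(dist).item())

    # Smaller distance to the nearest approximate extreme -> nearer the boundary
    # of the filter cloud -> hypothesised to be less redundant.
    dist_to_extreme = torch.cdist(filters, filters[extremes]).min(dim=1).values
    return torch.argsort(dist_to_extreme)


vgg = models.vgg16(weights="IMAGENET1K_V1")      # a trained CNN, as in the paper's setting
first_conv = vgg.features[0]                     # first convolutional layer of VGG-16
ranking = rank_filters_by_extremality(first_conv.weight)
print("hypothetically least redundant filters:", ranking[:10].tolist())
```

Furthest-point sampling is used here only because it is a simple way to pick boundary-like points; any method that recovers the extremal points of the filter simplex could be substituted.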