AltUB: Alternating Training Method to Update Base Distribution of Normalizing Flow #710
jpcbertoldo asked this question in Feature Requests.
[1] makes the following proposal:
Proposed solution: replace the fixed base distribution $N(0, 1)$ of the normalizing flow with a trainable $N(\mu, \Sigma)$.
Technical detail: the learning rate used to train the NF network is too high for training the base distribution $N(\mu, \Sigma)$, so the authors update it separately at regular intervals (an alternating training scheme).
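To make the proposal concrete, here is a minimal sketch (not from [1]'s official code) of what a trainable base distribution could look like in PyTorch; the class name and parameterization are my own assumptions:

```python
import torch
from torch import nn


class TrainableGaussianBase(nn.Module):
    """Trainable diagonal Gaussian base distribution N(mu, diag(sigma^2)).

    Sketch only: replaces the fixed N(0, I) base of a normalizing flow
    with learnable mean and scale parameters, as proposed in [1].
    """

    def __init__(self, dim: int):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        # parameterize sigma via log-sigma so it stays strictly positive
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def log_prob(self, z: torch.Tensor) -> torch.Tensor:
        # log N(z; mu, diag(sigma^2)), summed over the feature dimension;
        # this term enters the flow's change-of-variables log-likelihood
        dist = torch.distributions.Normal(self.mu, self.log_sigma.exp())
        return dist.log_prob(z).sum(dim=-1)
```

With `mu = 0` and `log_sigma = 0` this reduces exactly to the standard $N(0, 1)$ base, so it can be dropped in without changing the initial behavior of the flow.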
My idea is to implement this alternating step as a callback that can be added to the training setup of any normalizing flow.
For instance, [1] applied their method on top of [2] and [3].
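A framework-agnostic sketch of such a callback, under my own assumptions (the hook name, the idea that the flow exposes its base-distribution parameters separately, and the loss closure are all hypothetical, not an existing API):

```python
import torch


class AlternatingBaseUpdateCallback:
    """Sketch of an alternating-update callback in the spirit of [1].

    Assumes the caller passes in only the base-distribution parameters
    (e.g. mu and log_sigma) so they get their own optimizer with a
    smaller learning rate than the NF network, and are stepped only
    once every `every_n_epochs` epochs.
    """

    def __init__(self, base_parameters, lr: float = 1e-4, every_n_epochs: int = 1):
        self.optimizer = torch.optim.Adam(base_parameters, lr=lr)
        self.every_n_epochs = every_n_epochs

    def on_train_epoch_end(self, epoch: int, loss_fn) -> None:
        # skip epochs where the base distribution is not updated
        if (epoch + 1) % self.every_n_epochs != 0:
            return
        # update only the base-distribution parameters; the NF network's
        # parameters are not in this optimizer, so they are untouched here
        self.optimizer.zero_grad()
        loss = loss_fn()
        loss.backward()
        self.optimizer.step()
```

In an actual integration this would become a `Callback` subclass hooked into the trainer, but the core of the feature is just this separation of parameter groups and update frequency.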
[1] Y. Kim, H. Jang, D. Lee, and H.-J. Choi, “AltUB: Alternating Training Method to Update Base Distribution of Normalizing Flow for Anomaly Detection.” arXiv, Oct. 26, 2022. doi: 10.48550/arXiv.2210.14913.
[2] D. Gudovskiy, S. Ishizaka, and K. Kozuka, “CFLOW-AD: Real-Time Unsupervised Anomaly Detection With Localization via Conditional Normalizing Flows,” in Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2022, pp. 98–107. Accessed: Sep. 27, 2022. [Online]. Available: https://openaccess.thecvf.com/content/WACV2022/html/Gudovskiy_CFLOW-AD_Real-Time_Unsupervised_Anomaly_Detection_With_Localization_via_Conditional_Normalizing_WACV_2022_paper.html
[3] J. Yu et al., “FastFlow: Unsupervised Anomaly Detection and Localization via 2D Normalizing Flows,” arXiv:2111.07677 [cs], Nov. 2021, Accessed: Feb. 02, 2022. [Online]. Available: http://arxiv.org/abs/2111.07677