Stochastic neural networks (SNNs) are currently topical, with several paradigms being actively investigated, including dropout, Bayesian neural networks, the variational information bottleneck (VIB), and noise-regularized learning. These neural network variants impact several major considerations, including generalization, network compression, robustness against adversarial attack and label noise, and model calibration. However, many existing SNNs are complicated and expensive to train, and/or address only one or two of these practical considerations. In this paper we propose a simple and effective stochastic neural network (SE-SNN) architecture for discriminative learning that directly models activation uncertainty and encourages high activation variability. Compared to existing SNNs, our SE-SNN is simpler to implement and faster to train, and produces state-of-the-art results on network compression by pruning, adversarial defense, learning with label noise, and model calibration.
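To make the idea of "directly modeling activation uncertainty" concrete, below is a minimal, hypothetical sketch of a stochastic layer: each unit's activation is drawn from a Gaussian whose mean and variance are predicted from the input, sampled with the reparameterization trick, with a simple penalty that rewards high activation variance. The class name, the exact regularizer, and the train/test behavior are assumptions for illustration, not the paper's actual SE-SNN definition.

```python
import torch
import torch.nn as nn


class StochasticLinear(nn.Module):
    """Hypothetical linear layer whose activations are sampled from a learned Gaussian."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.mean = nn.Linear(in_features, out_features)     # predicts activation mean
        self.log_var = nn.Linear(in_features, out_features)  # predicts activation log-variance

    def forward(self, x):
        mu = self.mean(x)
        log_var = self.log_var(x)
        std = torch.exp(0.5 * log_var)
        if self.training:
            eps = torch.randn_like(std)   # reparameterization trick: sample = mu + eps * std
            out = mu + eps * std
        else:
            out = mu                      # use the mean activation at test time
        # Assumed regularizer: minimizing -log_var encourages high activation variance.
        variability_reg = -log_var.mean()
        return out, variability_reg


# Usage sketch: add the (weighted) regularizer to the task loss.
layer = StochasticLinear(128, 64)
x = torch.randn(32, 128)
out, reg = layer(x)
loss = out.pow(2).mean() + 0.01 * reg  # placeholder task loss plus variability penalty
```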
Title of host publication: AAAI Conference on Artificial Intelligence (AAAI 2021)
Publication status: Accepted/In press - 2 Dec 2020
Event: The Thirty-Fifth AAAI Conference on Artificial Intelligence (conference number 35), Virtual Conference, 2 Feb 2021 - 9 Feb 2021