Simple and Effective Stochastic Neural Networks

Tianyuan Yu, Yongxin Yang, Da Li, Timothy Hospedales, Tao Xiang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Stochastic neural networks (SNNs) are currently topical, with several paradigms being actively investigated, including dropout, Bayesian neural networks, variational information bottleneck (VIB) and noise-regularized learning. These neural network variants impact several major considerations, including generalization, network compression, robustness against adversarial attack and label noise, and model calibration. However, many existing networks are complicated and expensive to train, and/or only address one or two of these practical considerations. In this paper we propose a simple and effective stochastic neural network (SE-SNN) architecture for discriminative learning by directly modeling activation uncertainty and encouraging high activation variability. Compared to existing SNNs, our SE-SNN is simpler to implement and faster to train, and produces state-of-the-art results on network compression by pruning, adversarial defense, learning with label noise, and model calibration.
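The core idea named in the abstract, directly modeling activation uncertainty, can be illustrated with a stochastic layer that predicts a per-unit mean and log-variance and samples activations via the reparameterization trick, with a regularizer that rewards high variance. This is a minimal NumPy sketch of that general pattern, not the paper's exact formulation; the layer shapes, the `W_mu`/`W_logvar` parameterization and the specific variability regularizer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_layer(x, W_mu, W_logvar):
    """Hypothetical stochastic activation layer: predict a mean and a
    log-variance per unit, then sample with the reparameterization trick
    so the sampling step stays differentiable in a real framework."""
    mu = x @ W_mu                       # per-unit activation mean
    logvar = x @ W_logvar               # per-unit activation log-variance
    eps = rng.standard_normal(mu.shape)
    a = mu + np.exp(0.5 * logvar) * eps  # sampled activation
    # Illustrative variability-encouraging term: penalizing -mean(logvar)
    # pushes the network toward higher activation variance.
    reg = -logvar.mean()
    return a, reg

# Toy forward pass: batch of 4 inputs, 8 features, 16 stochastic units.
x = rng.standard_normal((4, 8))
W_mu = rng.standard_normal((8, 16)) * 0.1
W_logvar = rng.standard_normal((8, 16)) * 0.1
a, reg = stochastic_layer(x, W_mu, W_logvar)
print(a.shape)  # (4, 16)
```

In training, such a regularizer would be added to the task loss, so the network is discriminative while still being encouraged to keep its activations stochastic rather than collapsing to deterministic values.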
Original language: English
Title of host publication: Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-2021)
Publisher: AAAI Press
Pages: 3252–3260
Number of pages: 9
ISBN (Print): 978-1-57735-866-4
Publication status: Published - 18 May 2021
Event: The Thirty-Fifth AAAI Conference on Artificial Intelligence - Virtual Conference
Duration: 2 Feb 2021 – 9 Feb 2021
Conference number: 35

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468


Conference: The Thirty-Fifth AAAI Conference on Artificial Intelligence
Abbreviated title: AAAI-21


