Sponge Examples: Energy-Latency Attacks on Neural Networks

Ilia Shumailov, Yiren Zhao, Daniel Bates, Nicolas Papernot, Robert Mullins, Ross Anderson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The high energy costs of neural network training and inference led to the use of acceleration hardware such as GPUs and TPUs. While such devices enable us to train large-scale neural networks in datacenters and deploy them on edge devices, their designers have so far focused on average-case performance. In this work, we introduce a novel threat vector against neural networks whose energy consumption or decision latency is critical. We show how adversaries can exploit carefully crafted sponge examples, which are inputs designed to maximise energy consumption and latency, to drive machine learning (ML) systems towards their worst-case performance. Sponge examples are, to our knowledge, the first denial-of-service attack against the ML components of such systems. We mount two variants of our sponge attack on a wide range of state-of-the-art neural network models, and find that language models are surprisingly vulnerable. Sponge examples frequently increase both the latency and the energy consumption of these models by a factor of 30×. Extensive experiments show that our new attack is effective across different hardware platforms (CPU, GPU and an ASIC simulator) on a wide range of different language tasks. On vision tasks, we show that sponge examples can be produced and a latency degradation observed, but the effect is less pronounced. To demonstrate the effectiveness of sponge examples in the real world, we mount an attack against Microsoft Azure's translator and show an increase in response time from 1 ms to 6 s (6000×). We conclude by proposing a defense strategy: shifting the analysis of energy consumption in hardware from an average-case to a worst-case perspective.
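The abstract describes sponge examples as inputs searched for in a black-box fashion to maximise a cost proxy such as energy or latency. The toy sketch below illustrates the idea with hill climbing over a hypothetical cost function (here, an invented tokenizer in which out-of-vocabulary characters expand into several byte-level tokens, loosely mimicking why language models are vulnerable); the real attack measures energy or latency on actual hardware, and all names here are illustrative, not from the paper.

```python
import random

# Hypothetical cost proxy: characters in a small "vocabulary" cost one token,
# while unknown characters fall back to several byte-level tokens, inflating
# the amount of compute per input (a simplified sponge mechanism).
VOCAB = set("abcdefghijklmnopqrstuvwxyz ")

def cost(text: str) -> int:
    return sum(1 if c in VOCAB else 4 for c in text)

def mutate(text: str, alphabet: str) -> str:
    # Replace one randomly chosen character with a random alphabet character.
    i = random.randrange(len(text))
    return text[:i] + random.choice(alphabet) + text[i + 1:]

def sponge_search(seed: str, alphabet: str, iters: int = 500) -> str:
    # Black-box hill climbing: keep any mutation that raises the cost proxy.
    best, best_cost = seed, cost(seed)
    for _ in range(iters):
        candidate = mutate(best, alphabet)
        c = cost(candidate)
        if c > best_cost:
            best, best_cost = candidate, c
    return best

random.seed(0)
seed = "hello world"
sponge = sponge_search(seed, alphabet="abc€£¥")
print(cost(seed), cost(sponge))  # the found input costs strictly more
```

The search is model-agnostic: it only queries the cost function, which is what makes a black-box variant of the attack possible against deployed services.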
Original language: English
Title of host publication: 2021 IEEE European Symposium on Security and Privacy (EuroS&P)
Publisher: IEEE
Pages: 212-231
Number of pages: 20
ISBN (Electronic): 978-1-6654-1491-3
ISBN (Print): 978-1-6654-3048-7
Publication status: Published - 13 Nov 2021
Event: 6th IEEE European Symposium on Security and Privacy 2021 - Virtual Conference
Duration: 6 Sept 2021 – 10 Sept 2021
Conference number: 6
https://www.ieee-security.org/TC/EuroSP2021/index.html

Conference

Conference: 6th IEEE European Symposium on Security and Privacy 2021
Period: 6/09/21 – 10/09/21

Keywords / Materials (for Non-textual outputs)

  • availability attacks
  • adversarial machine learning
  • adversarial examples
  • sponge examples
  • latency attacks
  • denial of service
