Andreas Müller

February 7, 2024 at 11:00 AM on Zoom / Soda Hall

The Impact of Uniform Inputs on Activation Sparsity and Energy-Latency Attacks in Computer Vision

Abstract: Resource efficiency plays an important role in machine learning
today. Energy consumption and decision latency are two critical aspects
of sustainable and practical deployment. Unfortunately, neither is
robust against adversaries. Researchers have recently demonstrated that
attackers can compute and submit so-called sponge examples at inference
time to increase the energy consumption and decision latency of neural
networks. In computer vision, the proposed strategy crafts inputs with
lower activation sparsity, which could otherwise be exploited to
accelerate the computation.

This talk will analyze the inner workings of these energy-latency
attacks on image models. In particular, it will show that input
uniformity is a key enabler: a uniform image, that is, an image with
mostly flat, uniformly colored surfaces, triggers more activations due
to a specific interplay of convolution, batch normalization, and ReLU
activation. Based on these insights, two simple yet effective new
strategies for crafting sponge examples are proposed. The findings are
empirically examined in a comprehensive evaluation with multiple
state-of-the-art neural networks. The results show that the novel
attacks achieve the same sparsity effect as prior sponge-example
methods, but at a fraction of the computational effort.
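As a rough illustration of the interplay described above, the following minimal NumPy sketch passes a flat image and a noisy image through a single convolution, inference-mode batch normalization (fixed running statistics), and a ReLU, then compares post-ReLU sparsity. All layer parameters here are random placeholders, not values from the talk or its underlying paper; the point is only that a flat input yields spatially constant feature maps, so after batch normalization each channel ends up either fully active or fully inactive under ReLU.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    # naive valid-mode 2D convolution: x (H, W), w (C, k, k) -> (C, H-k+1, W-k+1)
    k = w.shape[-1]
    H, W = x.shape
    out = np.empty((w.shape[0], H - k + 1, W - k + 1))
    for c in range(w.shape[0]):
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                out[c, i, j] = np.sum(x[i:i + k, j:j + k] * w[c])
    return out

def bn_relu(z, mean, var, gamma, beta, eps=1e-5):
    # inference-mode batch norm with fixed per-channel running statistics,
    # followed by ReLU
    zn = (z - mean[:, None, None]) / np.sqrt(var[:, None, None] + eps)
    return np.maximum(gamma[:, None, None] * zn + beta[:, None, None], 0.0)

def sparsity(a):
    # fraction of inactive (zero) post-ReLU units
    return float(np.mean(a == 0.0))

# toy layer with random (hypothetical) parameters, standing in for a trained net
w = rng.standard_normal((4, 3, 3))
mean = rng.standard_normal(4)
var = rng.uniform(0.5, 1.5, 4)
gamma = rng.standard_normal(4)
beta = rng.standard_normal(4)

uniform_img = np.full((16, 16), 0.5)         # flat, uniformly colored input
noise_img = rng.uniform(0.0, 1.0, (16, 16))  # high-variation input

for name, img in [("uniform", uniform_img), ("noise", noise_img)]:
    a = bn_relu(conv2d(img, w), mean, var, gamma, beta)
    print(name, "sparsity:", round(sparsity(a), 3))
```

Whether the flat input ends up with lower sparsity depends on the learned weights and running statistics; the talk's analysis explains why trained image models tend to fall on the side where uniform inputs activate more units.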

Bio: Andreas Müller is a PhD student at the Chair for Information Security,
Ruhr-University Bochum in Germany, and currently a visiting scholar at
ICSI. His research focuses on adversarial machine learning; at ICSI, he
works with Erwin Quiring.

Security Lab