
Availability Attacks against Neural Networks
2020-06-10 11:31

"Sponge Examples: Energy-Latency Attacks on Neural Networks" shows how to find adversarial examples that cause a DNN to burn more energy, take more time, or both.

They affect a wide range of DNN applications, from image recognition to natural language processing.

Adversaries might use these examples for all sorts of mischief - from draining mobile phone batteries, through degrading the machine-vision systems on which self-driving cars rely, to jamming cognitive radar.

Our most spectacular results are against NLP systems.

There are already examples in the real world where people pause or stumble when asked hard questions, but we now have a dependable method for generating such examples automatically and at scale.
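The general idea behind such an attack can be sketched as a black-box search that treats the target's processing cost as a fitness function and evolves inputs to maximize it. The sketch below is illustrative only, not the paper's method: `toy_pipeline` is a hypothetical stand-in for a real inference pass (its cost grows with the number of distinct symbols in the input, standing in for energy or latency), and `sponge_search` is a simple genetic search over inputs.

```python
import random
import string

def toy_pipeline(text: str) -> int:
    # Hypothetical stand-in for a DNN inference pass: the work done
    # grows with the number of distinct symbols seen so far, a crude
    # proxy for the energy/latency a real model would spend.
    tokens = set()
    work = 0
    for ch in text:
        tokens.add(ch)
        work += len(tokens)
    return work

def sponge_search(length: int = 32, pop: int = 20,
                  gens: int = 30, seed: int = 0) -> str:
    """Black-box genetic search for inputs that maximize pipeline cost."""
    rng = random.Random(seed)
    alphabet = string.ascii_letters + string.digits
    population = ["".join(rng.choice(alphabet) for _ in range(length))
                  for _ in range(pop)]
    for _ in range(gens):
        # Keep the most expensive half, mutate each survivor once.
        population.sort(key=toy_pipeline, reverse=True)
        survivors = population[: pop // 2]
        children = []
        for parent in survivors:
            child = list(parent)
            child[rng.randrange(length)] = rng.choice(alphabet)
            children.append("".join(child))
        population = survivors + children
    return max(population, key=toy_pipeline)

best = sponge_search()
baseline = "a" * 32  # a cheap, highly repetitive input
print(toy_pipeline(best), toy_pipeline(baseline))
```

Against a real model the fitness function would be measured wall-clock time or hardware energy counters rather than a symbol count, but the search loop itself needs no access to the model's internals.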


News URL

https://www.schneier.com/blog/archives/2020/06/availability_at.html