Robust DCNN: The impact of approximate multipliers in defending against adversarial attacks

Research output: Contribution to journal › Article › peer-review

Abstract

Deep Convolutional Neural Networks (DCNNs) excel in various machine learning tasks across critical domains like healthcare, finance, and autonomous transportation. However, in sensitive applications they face two significant challenges: high computational cost and vulnerability to adversarial attacks. While approximate computation methods have been proposed to enhance DCNN robustness, existing approaches typically cannot maintain resistance against all attack types without compromising accuracy on unperturbed inputs. This study introduces a modified AdaPT framework that optimizes both accuracy and robustness by quantizing parameters to 8 bits and systematically evaluating the model under adversarial conditions. We employ the NSGA-II multi-objective optimization algorithm to select an appropriate approximate multiplier for each network layer and to determine the optimal extent of approximation. Unlike previous methods that prioritize either robustness or accuracy, our approach achieves a balanced trade-off between these crucial metrics. Experimental results with ResNet-50 demonstrate that identifying the optimal Pareto front of approximate multiplier combinations yields simultaneous improvements of 31% in accuracy and 30% in robustness at a perturbation budget of 0.15 compared to the accurate model.
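The core idea in the abstract can be sketched in miniature: each network layer is assigned one approximate multiplier design, each assignment is scored on two competing objectives (accuracy and robustness), and the non-dominated assignments form the Pareto front. The sketch below is illustrative only: the multiplier names and all numeric effects are invented, and an exhaustive dominance scan over a toy three-layer network stands in for the NSGA-II evolutionary search the authors actually use.

```python
from itertools import product

# Hypothetical per-layer effects of three multiplier designs
# (all numbers invented; real effects come from adversarial evaluation).
MULTIPLIERS = {
    "exact":    {"acc_loss": 0.00, "rob_gain": 0.00},
    "approx_a": {"acc_loss": 0.02, "rob_gain": 0.06},
    "approx_b": {"acc_loss": 0.05, "rob_gain": 0.11},
}

def evaluate(config):
    """Score a per-layer multiplier assignment as (accuracy, robustness)."""
    acc = 0.90 - sum(MULTIPLIERS[m]["acc_loss"] for m in config)
    rob = 0.40 + sum(MULTIPLIERS[m]["rob_gain"] for m in config)
    return acc, rob

def dominates(p, q):
    """p dominates q if p is at least as good in both objectives
    and strictly better in at least one (scores differ)."""
    return p[0] >= q[0] and p[1] >= q[1] and p != q

def pareto_front(configs):
    """Return the configs whose scores no other score dominates."""
    scored = {c: evaluate(c) for c in configs}
    return [c for c, s in scored.items()
            if not any(dominates(t, s) for t in scored.values())]

# Exhaustive search over a toy 3-layer network; NSGA-II replaces this
# enumeration with evolutionary search when the space is too large.
front = pareto_front(list(product(MULTIPLIERS, repeat=3)))
```

Note how the front is not trivial: for example, one "approx_b" layer (accuracy 0.85, robustness 0.51) is dominated by two "approx_a" layers (accuracy 0.86, robustness 0.52), so it is pruned even though neither extreme dominates the other.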

Original language: English
Article number: 108220
Number of pages: 15
Journal: Future Generation Computer Systems
Volume: 176
DOIs
State: Published - Mar 2026

Keywords

  • Approximate computing
  • Deep neural networks
  • Edge computing
  • Hardware acceleration
  • Multi-layer neural network
  • Neural network hardware
