Power-efficient Approximate Multipliers for Classification Tasks in Neural Networks

Vinicius Zanandrea, Jorge Castro-Godinez, Cristina Meinhardt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer review

Abstract

Multiplication is a key operation in neural networks. To overcome the power efficiency challenges of designing dedicated hardware for neural networks, designers can explore approximate multipliers to reduce area and power while maintaining tolerable accuracy. In this work, we evaluate the power and accuracy trade-offs of adopting two approximate multiplier structures, AxMultV1 and AxMultV2, for image classification in neural networks. In these multipliers, we explore seven approximate 4:2 compressors from the literature and compare them with our proposed MAX4:2CV1 compressor. The adoption of our proposed compressor in multipliers provides power savings of up to 56%, a delay reduction of 45.5%, and a reduction in transistor count of up to 48% compared to an exact multiplier. The multipliers based on the MAX4:2CV1 compressor can be considered suitable for classification tasks in neural networks, achieving 95.54% accuracy on the MNIST dataset with a Multilayer Perceptron and up to 81.27% accuracy on the SVHN dataset with the LeNet-5 architecture, comparable to the accuracy of an exact multiplier.
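The abstract does not describe the internal logic of AxMultV1, AxMultV2, or the proposed MAX4:2CV1 compressor, so the following Python sketch only illustrates the general idea under stated assumptions: a behavioral model of an exact 4:2 compressor, a hypothetical simplified variant (not the MAX4:2CV1 design), and an exhaustive count of the input patterns on which the approximation deviates from the exact column value.

    # Behavioral sketch (an assumption, not the circuits from the paper): a 4:2
    # compressor reduces four partial-product bits plus a carry-in to one sum bit
    # and two weight-2 outputs, so that
    #   x1 + x2 + x3 + x4 + cin == sum + 2 * (carry + cout).
    from itertools import product

    def exact_4_2_compressor(x1, x2, x3, x4, cin):
        total = x1 + x2 + x3 + x4 + cin      # column total, 0..5
        s = total & 1                        # weight-1 sum bit
        weight2 = total >> 1                 # weight-2 outputs needed (0..2)
        carry = 1 if weight2 >= 1 else 0
        cout = 1 if weight2 == 2 else 0
        return s, carry, cout

    def approx_4_2_compressor(x1, x2, x3, x4, cin):
        # Illustrative approximation (hypothetical, NOT the MAX4:2CV1 logic):
        # cout is tied to 0 and the carry is formed from two AND gates only,
        # which under-counts some input patterns in exchange for fewer gates.
        s = x1 ^ x2 ^ x3 ^ x4 ^ cin
        carry = (x1 & x2) | (x3 & x4)
        return s, carry, 0

    # Exhaustive check of how often the approximation deviates from the exact value.
    wrong = 0
    for bits in product((0, 1), repeat=5):
        exact = sum(bits)
        s, c, co = approx_4_2_compressor(*bits)
        wrong += (s + 2 * (c + co)) != exact
    print(f"Erroneous input patterns: {wrong} of 32")

In an array multiplier built from such compressors, the per-column error introduced by the approximation translates into a bounded product error, which is what allows the accuracy of the neural network to remain close to that of an exact multiplier.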

Original language: English
Title of host publication: 2025 IEEE 16th Latin American Symposium on Circuits and Systems, LASCAS 2025 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9798331522124
DOI
Publication status: Published - 2025
Event: 16th IEEE Latin American Symposium on Circuits and Systems, LASCAS 2025 - Bento Goncalves, Brazil
Duration: 25 Feb 2025 - 28 Feb 2025

Publication series

Name: 2025 IEEE 16th Latin American Symposium on Circuits and Systems, LASCAS 2025 - Proceedings

Conference

Conference: 16th IEEE Latin American Symposium on Circuits and Systems, LASCAS 2025
Country/Territory: Brazil
City: Bento Goncalves
Period: 25/02/25 - 28/02/25
