An Automated Approach for Improving the Inference Latency and Energy Efficiency of Pretrained CNNs by Removing Irrelevant Pixels with Focused Convolutions
Document Type
Conference Proceeding
Publication Date
1-25-2024
Publication Title
29th Asia and South Pacific Design Automation Conference (ASP-DAC 2024)
Pages
1-6
Abstract
Computer vision often uses highly accurate Convolutional Neural Networks (CNNs), but these deep learning models are associated with ever-increasing energy and computation requirements. Producing more energy-efficient CNNs often requires model training, which can be cost-prohibitive. We propose a novel, automated method to make a pretrained CNN more energy-efficient without re-training. Given a pretrained CNN, we insert a threshold layer that filters activations from the preceding layers to identify regions of the image that are irrelevant, i.e., that can be ignored by the following layers while maintaining accuracy. Our modified focused convolution operation saves inference latency (by up to 25%) and energy costs (by up to 22%) on various popular pretrained CNNs, with little to no loss in accuracy.
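The core idea in the abstract — threshold the activations of an earlier layer to build a relevance mask, then compute later convolutions only at relevant pixels — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the relevance criterion (mean absolute activation vs. a fixed threshold) and the function names (`relevance_mask`, `focused_conv2d`) are illustrative assumptions:

```python
import numpy as np

def relevance_mask(activations, threshold):
    """Illustrative stand-in for the inserted threshold layer:
    mark a pixel relevant if its mean absolute activation across
    channels exceeds a fixed threshold.
    activations: (C, H, W) -> boolean mask of shape (H, W)."""
    saliency = np.abs(activations).mean(axis=0)
    return saliency > threshold

def focused_conv2d(x, kernel, mask):
    """Naive 'focused' 2D convolution: compute outputs only at the
    pixels the mask marks relevant; all other outputs stay zero,
    so no arithmetic is spent on irrelevant regions.
    x: (H, W), kernel: (k, k) with odd k, 'same' zero padding."""
    k = kernel.shape[0]
    pad = k // 2
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for i, j in zip(*np.nonzero(mask)):
        out[i, j] = np.sum(xp[i:i + k, j:j + k] * kernel)
    return out

# Toy example: a 1-channel "activation map" with one bright region.
act = np.zeros((1, 6, 6))
act[0, 2:4, 2:4] = 1.0
mask = relevance_mask(act, threshold=0.5)          # 4 relevant pixels
y = focused_conv2d(act[0], np.ones((3, 3)) / 9.0, mask)
```

In this toy run only the 4 bright pixels are convolved; the savings in the paper come from skipping the multiply-accumulates at every masked-out position in the real, much larger feature maps.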
Identifier
10.6084/m9.figshare.25058516
Recommended Citation
Tung, Caleb; Eliopoulos, Nicholas; Jajal, Purvish; Ramshankar, Gowri; Yang, Chen-Yun; Synovic, Nicholas; Zhang, Xuecen; Chaudhary, Vipin; Thiruvathukal, George K.; Lu, Yung-Hsiang. "An automated approach for improving the inference latency and energy efficiency of pretrained CNNs by removing irrelevant pixels with focused convolutions." In Proceedings of the 29th Asia and South Pacific Design Automation Conference (ASP-DAC 2024). https://doi.org/10.6084/m9.figshare.25058516
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright Statement
© The Author(s), 2024.
Comments
Author Posting © The Author(s), 2024. This is the author's version of the work, a pre-print of a paper presented at the 29th Asia and South Pacific Design Automation Conference (ASP-DAC 2024). The final published version of the work is at-