AppCiP: Energy-Efficient Approximate Convolution-in-Pixel Scheme for Neural Network Acceleration

Document Type

Article

Publication Date

March 1, 2023

Abstract

Nowadays, always-on, self-powered intelligent visual perception systems have gained considerable attention and are widely deployed. However, capturing data and analyzing it on a backend/cloud processor is energy-intensive and incurs long latency, resulting in a memory bottleneck and slow feature extraction at the edge. This paper presents the AppCiP architecture, a design that integrates sensing and computing to efficiently enable Artificial Intelligence (AI) on resource-limited sensing devices. AppCiP provides a number of unique capabilities, including instant and reconfigurable RGB-to-grayscale conversion, highly parallel analog convolution-in-pixel, and support for low-precision quinary-weight neural networks. These features significantly mitigate the overhead of analog-to-digital converters and analog buffers, leading to a considerable reduction in power consumption and area. Our circuit-to-application co-simulation results demonstrate that AppCiP achieves three orders of magnitude higher power efficiency than the fastest existing designs across different CNN workloads, reaching a frame rate of 3000 frames per second and an efficiency of 4.12 TOp/s/W. The accuracy of the AppCiP architecture is evaluated on datasets such as SVHN, Pest, CIFAR-10, MHIST, and CBL Face Detection and compared with state-of-the-art designs. AppCiP achieves the best accuracy among processing-in/near-pixel architectures while degrading accuracy by less than 1% on average relative to the floating-point baseline.
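
For readers unfamiliar with the quinary-weight and convolution-in-pixel ideas above, the following minimal Python sketch emulates the dataflow digitally. The grayscale coefficients, the five-level weight grid, and all function names here are illustrative assumptions rather than the paper's exact parameters; AppCiP realizes these steps in the analog domain inside the pixel array.

import numpy as np

# Illustrative digital emulation only; AppCiP performs these stages in analog
# circuitry inside the pixel array. The coefficients and the five-level grid
# below are assumptions, not the paper's exact values.

def rgb_to_gray(rgb):
    # Weighted-sum grayscale conversion (BT.601 luma weights assumed),
    # standing in for AppCiP's reconfigurable in-pixel RGB-to-grayscale stage.
    return rgb @ np.array([0.299, 0.587, 0.114])

def quantize_quinary(w):
    # Snap floating-point weights to five symmetric levels {-2,-1,0,+1,+2}*scale,
    # one possible realization of a "quinary" weight grid.
    scale = max(np.abs(w).max(), 1e-12) / 2.0
    return np.clip(np.round(w / scale), -2, 2) * scale

def conv2d_valid(x, k):
    # Plain 'valid' 2-D sliding-window correlation, standing in for the
    # highly parallel analog convolution-in-pixel step.
    kh, kw = k.shape
    out = np.empty((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))          # toy RGB frame
kernel = rng.normal(size=(3, 3))     # toy convolution kernel
gray = rgb_to_gray(img)              # in-pixel RGB-to-grayscale stage
fmap = conv2d_valid(gray, quantize_quinary(kernel))
print(fmap.shape)                    # (6, 6) feature map

In the actual architecture these multiply-accumulate operations happen before analog-to-digital conversion, which is why the design avoids much of the ADC and analog-buffer overhead noted in the abstract.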

Identifier

85148457399 (Scopus)

Publication Title

IEEE Journal on Emerging and Selected Topics in Circuits and Systems

External Full Text Location

https://doi.org/10.1109/JETCAS.2023.3242167

e-ISSN

2156-3365

ISSN

2156-3357

First Page

225

Last Page

236

Issue

1

Volume

13

Grant

2216772

Fund Ref

National Science Foundation
