Compact CNN: Multi-Objective Optimization for Architecture Search

Authors

Wassim Kharrat¹, Khadija Bousselmi² and Ichrack Amdouni¹
¹University of Manouba, Tunisia, ²University of Savoie Mont Blanc, France

Abstract

Convolutional Neural Network (CNN) architectures have achieved remarkable success in various image analysis tasks. However, designing these architectures manually remains both labor-intensive and computationally expensive. Neural Architecture Search (NAS) has emerged as a promising approach for automating and optimizing network design. Among NAS methods, gradient-based techniques stand out for their ability to reduce computational costs while maintaining competitive performance. Nevertheless, the architectures they produce can still be demanding in terms of model size and inference time. To address this challenge, we propose a comprehensive image-analysis pipeline that combines the PC-DARTS algorithm with post-training 16-bit quantization and structured pruning. Experimental results show that our pipeline achieves an accuracy of 99.10% and an Intersection over Union (IoU) of 72.03%, while reducing the model size by up to 54%, making it well-suited for deployment in resource-constrained environments.
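
The abstract combines three techniques: a PC-DARTS-discovered architecture, post-training 16-bit quantization, and structured pruning. The sketch below is an illustrative PyTorch example of the two compression steps only, not the authors' code: the small CNN stands in for the searched architecture, and the 30% channel-pruning ratio is an assumed value chosen for the demonstration.

```python
# Illustrative sketch (not the paper's implementation): structured channel
# pruning followed by post-training FP16 quantization in PyTorch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical stand-in for an architecture produced by PC-DARTS.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),
)

# Structured pruning: zero out 30% of the output channels (dim=0) of each
# conv layer, ranked by L2 norm. The 30% ratio is an assumption for this
# sketch. Note that this only zeroes whole filters; physically removing them
# (e.g. by rebuilding the layers) is what actually shrinks the stored model.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=0.3, n=2, dim=0)
        prune.remove(module, "weight")  # bake the mask into the weights

# Post-training 16-bit quantization: store weights in half precision, which
# halves the parameter footprint. FP16 inference is typically run on a GPU
# or other accelerator backend.
model = model.half().eval()

if torch.cuda.is_available():
    x = torch.randn(1, 3, 224, 224, dtype=torch.float16, device="cuda")
    with torch.no_grad():
        print(model.cuda()(x).shape)  # torch.Size([1, 10])
```

In this sketch the pruning step removes redundant filters and the FP16 cast halves the storage per weight; together these are the kind of post-search compression the pipeline applies to the searched network.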

Keywords

Neural Architecture Search, Convolutional Neural Networks, Compression, Quantization, Pruning, Segmentation

Volume 15, Number 22