
Optimizing Convolutional Neural Network Models for Resource-Constrained Devices in Telemedicine: a Lightweight Approach

EasyChair Preprint 14660

10 pages · Date: September 3, 2024

Abstract

The growing demand for telemedicine, particularly in remote and underserved regions, necessitates the deployment of sophisticated diagnostic tools on resource-constrained devices such as smartphones and portable medical equipment. Convolutional Neural Networks (CNNs) have proven highly effective in medical image analysis and automated disease diagnosis. However, their computational complexity and large model sizes pose significant challenges for implementation on devices with limited processing power, memory, and energy resources. This work outlines a lightweight approach to optimizing CNN models, making them suitable for resource-constrained devices in telemedicine applications.


The research begins by identifying the key challenges associated with deploying CNN models on low-power devices, including the trade-offs between model accuracy, size, and computational efficiency. It emphasizes the need for CNN architectures that maintain high diagnostic accuracy while being sufficiently lightweight to operate effectively on constrained hardware.


Several optimization techniques are explored, including model pruning, quantization, knowledge distillation, and architecture search. Model pruning removes redundant parameters and layers from the CNN, significantly reducing its size and computational requirements with minimal loss in accuracy. Quantization reduces the precision of the model's weights and activations, leading to lower memory usage and faster inference. Knowledge distillation leverages a large, pre-trained "teacher" model to train a smaller "student" model, transferring knowledge while keeping the student lightweight. Architecture search focuses on designing efficient CNN architectures tailored to specific hardware constraints.
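To make the first three techniques concrete, the short PyTorch sketch below applies magnitude-based pruning and post-training dynamic quantization to a toy model and defines a standard distillation loss. The TinyCNN architecture, the 30% pruning ratio, and the temperature and weighting values are illustrative assumptions, not configurations reported in the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune

# Toy stand-in for a diagnostic CNN (hypothetical, for illustration only).
class TinyCNN(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TinyCNN()

# Pruning: zero out the 30% smallest-magnitude weights in each conv layer,
# then make the sparsification permanent.
for m in model.modules():
    if isinstance(m, nn.Conv2d):
        prune.l1_unstructured(m, name="weight", amount=0.3)
        prune.remove(m, "weight")

# Quantization: store Linear weights as int8 for a smaller memory footprint
# and faster CPU inference (post-training, dynamic).
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

# Knowledge distillation loss: the student matches the teacher's softened
# output distribution (temperature T) while also fitting the true labels.
def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Example inference with the compressed model.
output = quantized(torch.randn(1, 3, 64, 64))

Architecture search is typically driven by a framework-level search loop rather than a few lines of code, so it is omitted from this sketch.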

Keyphrases: Architecture Search, Convolutional Neural Networks (CNNs), Edge Computing, Federated Learning, Healthcare AI, Knowledge Distillation, Low-Resource Settings, Model Optimization, Model Pruning, Quantization, Real-Time Inference, Resource-Constrained Devices, Telemedicine

BibTeX entry
BibTeX does not have an entry type for preprints; the entry below is a workaround that produces the correct reference:
@booklet{EasyChair:14660,
  author    = {Dylan Stilinki},
  title     = {Optimizing Convolutional Neural Network Models for Resource-Constrained Devices in Telemedicine: a Lightweight Approach},
  howpublished = {EasyChair Preprint 14660},
  year      = {EasyChair, 2024}}