Quantization Aware Training Enabled CNN for On-Device Sleep Disorder Detection
Sunday, Aug 3: 5:35 PM - 5:50 PM
1027
Contributed Papers
Music City Center
Quantization-Aware Training (QAT) was integrated into a 6-layer 1D convolutional neural network (CNN) trained on the Sleep-EDF Database Expanded to detect sleep disorders by classifying sleep stages. This approach enables efficient on-device execution under tight memory constraints while preserving performance despite quantization. The QAT CNN consistently demonstrates reliable performance across all sleep stages: its 8-bit and 16-bit quantized versions achieve high specificity (99.7%–100%) with lower sensitivity across most stages, and peak sensitivity at Stage R (100%) with lower specificity. These findings highlight the QAT CNN's suitability for edge devices, ensuring stage-specific reliability, i.e., sensitivity for Stage R and specificity for the other stages. In contrast, the non-QAT CNN exhibits lower sensitivity and higher specificity (82.1%–99%) across all stages and inconsistent performance in its quantized forms: the 8-bit CNN shows high sensitivity and low specificity at Stage 3/4 and the reverse at the other stages, while the 16-bit CNN shows similar inconsistencies at Stage 1 instead. Overall, this QAT-guided CNN provides consistent and dependable performance for deploying quantized models on-device.
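For readers unfamiliar with the workflow, the sketch below illustrates how QAT can be applied to a 1D CNN of this kind. It is a minimal, hypothetical example using PyTorch's eager-mode quantization API (torch.ao.quantization); the layer widths, kernel sizes, and class count are assumptions for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import (
    QuantStub, DeQuantStub, get_default_qat_qconfig, prepare_qat, convert
)

class SleepCNN1D(nn.Module):
    """Hypothetical 6-layer 1D CNN for sleep-stage classification (5 classes)."""
    def __init__(self, in_channels=1, n_classes=5):
        super().__init__()
        self.quant = QuantStub()      # fake-quantizes the input during QAT
        self.dequant = DeQuantStub()  # hands the features back to float ops
        layers, ch = [], in_channels
        for out_ch in (16, 32, 64, 64, 128, 128):   # six conv blocks (assumed widths)
            layers += [nn.Conv1d(ch, out_ch, kernel_size=7, padding=3),
                       nn.ReLU(),
                       nn.MaxPool1d(2)]
            ch = out_ch
        self.features = nn.Sequential(*layers)
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.fc = nn.Linear(ch, n_classes)

    def forward(self, x):
        x = self.quant(x)           # simulate int8 quantization of the EEG input
        x = self.features(x)        # quantization-aware conv stack
        x = self.dequant(x)         # run the small classifier head in float
        x = self.pool(x).flatten(1)
        return self.fc(x)

model = SleepCNN1D()
model.train()
model.qconfig = get_default_qat_qconfig("fbgemm")  # default 8-bit QAT config
prepare_qat(model, inplace=True)                   # insert fake-quant observers

# ... standard supervised training loop on 30-s epochs would go here ...

model.eval()
int8_model = convert(model)  # materialize the integer-quantized model for on-device use
```

The key idea QAT captures is that quantization error is simulated by fake-quantization during training, so the weights adapt to the reduced precision before deployment; a 16-bit variant would follow the same flow with a custom qconfig in place of the default 8-bit one.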
Quantization Aware Training
CNN
on-device
Sleep-EDF Database Expanded
sleep disorder detection
QAT
Main Sponsor
Section on Statistical Learning and Data Science