Real-Time Model Synchronization: Decentralized and Asynchronous Strategies for Scalable Machine Learning

Zhong Chen, Speaker
Southern Illinois University
 
Monday, Aug 4: 11:35 AM - 11:55 AM
Topic-Contributed Paper Session 
Music City Center 
In distributed machine learning systems, the ability to update models dynamically across multiple nodes is critical for maintaining accuracy and responsiveness in environments with continuous data streams. Traditional batch-based or centralized training methods often struggle with scalability, latency, and synchronization bottlenecks. This talk explores cutting-edge techniques for online model updating in distributed settings, focusing on incremental learning (processing data sequentially without retraining from scratch), decentralized learning (node-specific updates with minimal coordination), consensus-based strategies (achieving global model coherence through local collaboration), and asynchronous updates (eliminating synchronization barriers to reduce latency). These approaches collectively address challenges such as dynamic data-distribution shifts, communication overhead, and system heterogeneity. By enabling real-time adaptation, they enhance scalability, fault tolerance, and resource efficiency while preserving model performance. Practical applications span federated learning, IoT networks, and large-scale analytics, where timely insights depend on seamless coordination across nodes. The talk also discusses trade-offs between consistency and speed, robustness to node failures, and open challenges in balancing theoretical guarantees with real-world deployment constraints. This synthesis of strategies provides a roadmap for building agile, resilient machine learning systems capable of thriving in fast-evolving data landscapes.
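To make the incremental and consensus-based ideas above concrete, the sketch below simulates decentralized gradient descent with gossip-style averaging: each node takes a local gradient step on its own data shard, then mixes parameters only with its ring neighbors, so the copies converge to a coherent global model without a central coordinator. This is a minimal illustration under assumed conditions, not code from the talk; the toy linear-regression objective, the ring mixing matrix MIX, and all variable names are choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumption): 4 nodes, each holding its own
# data shard for a shared linear-regression objective.
N_NODES, DIM = 4, 5
true_w = rng.normal(size=DIM)
shards = []
for _ in range(N_NODES):
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    shards.append((X, y))

# Doubly stochastic mixing matrix over a ring topology: each node
# averages with its two neighbors only (local collaboration, no
# central parameter server).
MIX = np.zeros((N_NODES, N_NODES))
for i in range(N_NODES):
    MIX[i, i] = 0.5
    MIX[i, (i - 1) % N_NODES] = 0.25
    MIX[i, (i + 1) % N_NODES] = 0.25

w = np.zeros((N_NODES, DIM))  # one model copy per node
LR = 0.05

for step in range(200):
    # Incremental step: each node updates from its own shard,
    # without retraining from scratch.
    for i, (X, y) in enumerate(shards):
        grad = X.T @ (X @ w[i] - y) / len(y)
        w[i] -= LR * grad
    # Consensus step: mix parameters with ring neighbors only.
    w = MIX @ w

# The node copies agree (consensus) and approach the true parameters.
print("max disagreement across nodes:", np.abs(w - w.mean(axis=0)).max())
print("error of averaged model vs true w:", np.linalg.norm(w.mean(axis=0) - true_w))
```

An asynchronous variant would drop the lockstep outer loop: in each round only a random subset of nodes computes a gradient and gossips with whichever neighbors are available, removing the synchronization barrier at the cost of mixing against staler neighbor parameters, which is exactly the consistency-versus-speed trade-off the abstract raises.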