[2407.15738] Parallel Split Learning with Global Sampling


Authors: Mohammad Kohankhaki and 3 other authors

Abstract: Distributed deep learning in resource-constrained environments faces scalability and generalization challenges due to large effective batch sizes and non-identically distributed client data. We introduce a server-driven sampling strategy that maintains a fixed global batch size by dynamically adjusting client-side batch sizes. This decouples the effective batch size from the number of participating devices and ensures that global batches better reflect the overall data distribution. Using standard concentration bounds, we establish tighter deviation guarantees compared to existing approaches. Empirical results on a benchmark dataset confirm that the proposed method improves model accuracy, training efficiency, and convergence stability, offering a scalable solution for learning at the network edge.
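The core mechanism described in the abstract, a server that fixes the global batch size and dynamically assigns per-client batch sizes, can be sketched in a few lines of Python. This is a minimal illustration only: the function name draw_client_batch_sizes and the proportional (multinomial) allocation rule are assumptions for exposition, not the paper's exact algorithm.

```python
import random

def draw_client_batch_sizes(client_data_sizes, global_batch_size, rng=None):
    """Server-side sampler (illustrative sketch, not the paper's exact rule):
    split one fixed-size global batch across clients, drawing each sample
    slot from a client with probability proportional to that client's share
    of the total data.

    The returned per-client batch sizes always sum to global_batch_size, so
    the effective batch size stays constant regardless of how many clients
    participate, and the global batch approximates the overall distribution.
    """
    rng = rng or random.Random(0)
    clients = list(client_data_sizes)
    total = sum(client_data_sizes.values())
    weights = [client_data_sizes[c] / total for c in clients]
    counts = dict.fromkeys(clients, 0)
    # Assign each of the global_batch_size slots to a client at random,
    # weighted by that client's share of the global data.
    for c in rng.choices(clients, weights=weights, k=global_batch_size):
        counts[c] += 1
    return counts

# Example: three clients with skewed local dataset sizes. The per-client
# batch sizes vary, but their sum is always the fixed global batch size.
sizes = {"client_a": 5000, "client_b": 1500, "client_c": 500}
batch = draw_client_batch_sizes(sizes, global_batch_size=64)
print(batch, "-> total:", sum(batch.values()))
```

Because the total is fixed at 64 here no matter how many clients join, the effective batch size is decoupled from the number of participating devices, which is the decoupling property the abstract claims.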

Submission history

From: Mohammad Kohankhaki
[v1] Mon, 22 Jul 2024 15:41:23 UTC (993 KB)
[v2] Thu, 8 Aug 2024 21:45:57 UTC (977 KB)
[v3] Sat, 3 May 2025 18:37:58 UTC (430 KB)
[v4] Thu, 31 Jul 2025 15:42:11 UTC (429 KB)
