Details

Review and comparative evaluation of resource-adaptive collaborative training for heterogeneous edge devices
Radovič, Boris (Author), Canini, Marco (Author), Pejović, Veljko (Author)

PDF - Presentation file, download (1.42 MB)
MD5: 2F04B57DBBF3A2B6ADF278C437DE4E48
URL - Source URL: https://dl.acm.org/doi/10.1145/3708983

Abstract
Growing concerns about centralized mining of personal data threaten to stifle the further proliferation of machine learning (ML) applications. Consequently, a recent trend in ML training advocates a paradigm shift: moving the computation of ML models from a centralized server to a federation of edge devices owned by the users whose data is to be mined. Although such decentralization aims to alleviate concerns related to raw data sharing, it introduces a set of challenges stemming from the hardware heterogeneity of the devices holding the data. In the most extreme cases, this heterogeneity may impede the participation of low-end devices in training, or even prevent the deployment of the ML model to such devices. Recent research in distributed collaborative machine learning (DCML) promises to address the issue of ML model training over heterogeneous devices. However, the actual extent to which the issue is solved remains unclear, especially as an independent investigation of the proposed methods' performance in realistic settings is missing. In this paper, we present a detailed survey and an evaluation of algorithms that aim to enable collaborative model training across diverse devices. We explore approaches that harness three major strategies for DCML, namely Knowledge Distillation, Split Learning, and Partial Training, and we conduct a thorough experimental evaluation of these approaches on a real-world testbed of 14 heterogeneous devices. Our analysis compares algorithms based on the resulting model accuracy, memory consumption, CPU utilization, network activity, and other relevant metrics, and provides guidelines for practitioners as well as pointers for future research in DCML.
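Of the three strategies the abstract names, Partial Training is perhaps the easiest to illustrate: each device trains only a capacity-matched slice of the global model, and the server averages each parameter over the clients that actually trained it. The sketch below is a loose illustration in that spirit (HeteroFL-style width scaling), not the paper's algorithm; the function names (`extract_submodel`, `aggregate`) and the capacity ratios are hypothetical.

```python
import numpy as np

# Illustrative sketch of width-based Partial Training (an assumption,
# not the surveyed papers' exact method): low-end devices train only a
# leading slice of each weight matrix, and the server averages each
# parameter over the clients that trained it.

rng = np.random.default_rng(0)
FULL_WIDTH = 8
full_layer = rng.standard_normal((FULL_WIDTH, FULL_WIDTH))

def extract_submodel(full, ratio):
    """Take the leading rows/columns, sized to the device's capacity ratio."""
    k = max(1, int(full.shape[0] * ratio))
    return full[:k, :k].copy()

def aggregate(full, updates):
    """Average each parameter over the clients that actually trained it."""
    acc = np.zeros_like(full)
    cnt = np.zeros_like(full)
    for sub in updates:
        k = sub.shape[0]
        acc[:k, :k] += sub
        cnt[:k, :k] += 1
    out = full.copy()
    mask = cnt > 0
    out[mask] = acc[mask] / cnt[mask]  # untrained entries keep old values
    return out

# Two heterogeneous devices: one trains the full model, one a quarter-width slice.
updates = []
for ratio in (1.0, 0.25):
    sub = extract_submodel(full_layer, ratio)
    sub += 0.01  # stand-in for a local SGD update
    updates.append(sub)

full_layer = aggregate(full_layer, updates)
```

Note the design point this exposes: parameters outside every submodel receive no updates, which is exactly the kind of heterogeneity-induced imbalance the paper's evaluation on 14 devices is positioned to measure.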

Language: English
Keywords: federated learning, split learning, distributed collaborative learning, ubiquitous and mobile computing, device heterogeneity
Type of material: Journal article
Typology: 1.01 - Original scientific article
Organization: FRI - Faculty of Computer and Information Science
Publication status: Published
Publication version: Published version
Year of publication: 2025
Number of pages: 35 pp.
Numbering: Vol. 10, no. 1, art. 4
PID: 20.500.12556/RUL-166789
UDC: 004
Article ISSN: 2376-3639
DOI: 10.1145/3708983
COBISS.SI-ID: 221443843
Date published in RUL: 24.01.2025
Number of views: 456
Number of downloads: 228
Metadata: XML DC-XML DC-RDF

This work is part of the journal

Title: ACM Transactions on Modeling and Performance Evaluation of Computing Systems
Abbreviated title: ACM trans. model. perform. eval. comput. syst.
Publisher: Association for Computing Machinery, Inc.
ISSN: 2376-3639
COBISS.SI-ID: 68955395

License

License: CC BY 4.0, Creative Commons Attribution 4.0 International
Link: http://creativecommons.org/licenses/by/4.0/deed.sl
Description: This is the standard Creative Commons license that gives users the greatest freedom to reuse the work, provided they credit the author.

Secondary language

Language: Slovenian
Keywords: federated learning, distributed learning, ubiquitous computing, mobile computing

Projects

Funder: Other - Another funder or multiple funders
Project number: ORA-CRG2021-4699

Funder: ARRS - Slovenian Research Agency
Project number: J2-3047
Title: Context-dependent approximate computing on mobile devices

