Repository of the University of Ljubljana
Train your cake and eat it too! Repurposing collaborative training to tailor LLMs to private data without sharing
Radovič, Boris (Author), Aljahdali, Mohammed (Author), Canini, Marco (Author), Pejović, Veljko (Author), Khayyat, Zuhair (Author)
PDF - Presentation file (374.25 KB), MD5: E45BC051716C6DCC50601F8294586511
URL - Source URL: https://openreview.net/forum?id=FGupKd365r
Abstract
In the emerging field of large language models (LLMs), a significant challenge arises when organizations with vast datasets lack the computational resources to independently train and fine-tune models. This issue stems from privacy, compliance, and resource constraints: organizations cannot share their sensitive data but still need external computational assistance for model training. In this paper, we implement, enhance, and empirically compare several methods, including Split Learning (SL) and select Federated Learning (FL) methods, which enable data-rich yet compute-poor clients to offload LLM training without sharing raw data. Our study evaluates these methods across multiple dimensions, including model quality and training time.
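To make the abstract's description of split learning more concrete, below is a minimal, self-contained PyTorch sketch of the general idea: the client keeps the raw tokens and the bottom of the model, the server hosts the compute-heavy remainder, and only cut-layer activations and their gradients cross the boundary. The layer sizes, optimizers, cut point, and variable names are illustrative assumptions, not the paper's implementation.

# Minimal split learning sketch (illustrative only; all shapes and names are assumptions).
import torch
import torch.nn as nn

# Client keeps the raw data and the first few layers ("bottom" of the model).
client_model = nn.Sequential(nn.Embedding(1000, 64), nn.Linear(64, 64))
# Server hosts the compute-heavy remainder ("top" of the model).
server_model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1000))

client_opt = torch.optim.SGD(client_model.parameters(), lr=0.01)
server_opt = torch.optim.SGD(server_model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Private client data (random stand-ins here); raw tokens never leave the client.
tokens = torch.randint(0, 1000, (8, 16))
labels = torch.randint(0, 1000, (8,))

# Client-side forward pass, then "send" the cut-layer activations to the server.
activations = client_model(tokens).mean(dim=1)
sent = activations.detach().requires_grad_(True)  # simulates the network boundary

# Server-side forward and backward on the received activations.
logits = server_model(sent)
loss = loss_fn(logits, labels)
server_opt.zero_grad()
loss.backward()
server_opt.step()

# Server "returns" the gradient at the cut layer; client finishes backpropagation.
client_opt.zero_grad()
activations.backward(sent.grad)
client_opt.step()
print(f"cut-layer loss after one step: {loss.item():.4f}")

In a real deployment, the detach/requires_grad_ step would be replaced by network transfer of the activations to the server and of the cut-layer gradient back to the client, which is what lets a data-rich but compute-poor client offload most of the training work without sharing its raw data.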
Language:
English
Keywords:
energy usage, digital data
Work type:
Other
Typology:
1.08 - Published Scientific Conference Contribution
Organization:
FRI - Faculty of Computer and Information Science
Publication status:
Published
Publication version:
Version of Record
Year:
2024
Number of pages:
pp. 1-10
PID:
20.500.12556/RUL-166792
UDC:
004
COBISS.SI-ID:
221660675
Copyright:
The Creative Commons licence is stated on the article's landing page (see the source URL): "Licensed under Creative Commons Attribution 4.0 International". (Note dated 24 Jan 2025)
Publication date in RUL:
24.01.2025
Views:
148
Downloads:
81
Record is a part of a monograph
Title:
Workshop on Efficient Systems for Foundation Models II : ES-FoMo-II 2024
Place of publishing:
[Massachusetts]
Publisher:
OpenReview
Year:
2024
COBISS.SI-ID:
221659651
Licences
License:
CC BY 4.0, Creative Commons Attribution 4.0 International
Link:
http://creativecommons.org/licenses/by/4.0/
Description:
This is the standard Creative Commons license that gives others maximum freedom to do what they want with the work as long as they credit the author.
Secondary language
Language:
Slovenian
Keywords:
poraba energije, digitalni podatki
Projects
Funder:
Other - Other funder or multiple funders
Name:
SDAIA-KAUST AI
Funder:
ARRS - Slovenian Research Agency
Project number:
J2-3047
Name:
Kontekstno-odvisno približno računanje na mobilnih napravah (Context-dependent approximate computing on mobile devices)
Funder:
ARRS - Slovenian Research Agency
Project number:
P2-0098
Name:
Računalniške strukture in sistemi (Computer structures and systems)