<?xml version="1.0"?>
<metadata xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:dc="http://purl.org/dc/elements/1.1/"><dc:title>Train your cake and eat it too! Repurposing collaborative training to tailor LLMs to private data without sharing</dc:title><dc:creator>Radovič, Boris (Author)</dc:creator><dc:creator>Aljahdali, Mohammed (Author)</dc:creator><dc:creator>Canini, Marco (Author)</dc:creator><dc:creator>Pejović, Veljko (Author)</dc:creator><dc:creator>Khayyat, Zuhair (Author)
</dc:creator><dc:subject>energy usage</dc:subject><dc:subject>digital data</dc:subject><dc:description>In the emerging field of large language models (LLMs), a significant challenge arises when organizations with vast datasets lack the computational resources to independently train and fine-tune models. This issue stems from privacy, compliance, and resource constraints: organizations cannot share their sensitive data but still need external computational assistance for model training. In this paper, we implement, enhance, and empirically compare several methods, including Split Learning (SL) and select Federated Learning (FL) methods, which enable data-rich yet compute-poor clients to offload LLM training without sharing raw data. Our study evaluates these methods across multiple dimensions, including model quality and training time.</dc:description><dc:date>2024</dc:date><dc:date>2025-01-24 14:17:40</dc:date><dc:type>Other</dc:type><dc:identifier>166792</dc:identifier><dc:identifier>UDK: 004</dc:identifier><dc:identifier>COBISS_ID: 221660675</dc:identifier><dc:identifier>OceCobissID: 221659651</dc:identifier><dc:language>sl</dc:language><dc:rights>The Creative Commons license is stated on the article's landing page (see source URL): "Licensed under Creative Commons Attribution 4.0 International". (Note dated 24 Jan 2025)
</dc:rights></metadata>
