<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="https://repozitorij.uni-lj.si/IzpisGradiva.php?id=175561">
    <dc:title>Parameter-Efficient Tuning of Large Language Models on Mobile Devices</dc:title>
    <dc:creator>Korelič, Martin (Author)</dc:creator>
    <dc:creator>Pejović, Veljko (Mentor)</dc:creator>
    <dc:subject>large language models</dc:subject>
    <dc:subject>deep learning</dc:subject>
    <dc:subject>natural language processing</dc:subject>
    <dc:subject>mobile computing</dc:subject>
    <dc:subject>parameter-efficient fine-tuning</dc:subject>
    <dc:description>Large language models (LLMs) have undergone massive advancements in both scale and task capabilities, yet the personalization and fine-tuning of such models on a user's personal device (e.g., their smartphone) remain largely unexplored due to their computational and memory demands. While parameter-efficient fine-tuning (PEFT) offers a promising solution, current frameworks for on-device PEFT rely on emulated environments or provide limited functionality. In this work, we present ORTransformersMobile, a novel end-to-end framework that enables LLM fine-tuning, and optimized inference from the newly trained weights, entirely within the native Android operating system on mobile hardware. We develop MARS (Multi-Adapter Rank Sharing) through an ablation study that evaluates it against 288 other PEFT configurations, showing comparable or better benchmark performance even at lower ranks and with quantized weights; this makes MARS effective in resource-constrained environments. Compared to LoRA, the most popular PEFT method, our approach accelerates on-device fine-tuning (7% speedup) and reduces memory usage. Combining ORTransformersMobile and MARS, we train generative models fully on-device, enabling personalized downstream tasks from private user data on a mobile device.</dc:description>
    <dc:date>2025</dc:date>
    <dc:date>2025-11-04 11:55:09</dc:date>
    <dc:type>Master's thesis</dc:type>
    <dc:identifier>175561</dc:identifier>
    <dc:language>sl</dc:language>
  </rdf:Description>
</rdf:RDF>
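<!--
  The description above contrasts MARS with LoRA-style adapters. For orientation, a
  minimal Python sketch of the general LoRA idea follows: a frozen pretrained weight W
  is augmented with a trainable low-rank update B A. The dimensions, rank, and names
  here are illustrative assumptions; this is not the thesis's MARS method or the
  ORTransformersMobile API.

  import numpy as np

  # LoRA-style low-rank adaptation sketch (illustrative only).
  # W (d_out x d_in) stays frozen; only the low-rank factors
  # B (d_out x r) and A (r x d_in) would be trained, shrinking the
  # trainable parameter count from d_out*d_in to r*(d_out + d_in).
  d_out, d_in, r = 768, 768, 8                   # hypothetical layer sizes and rank
  rng = np.random.default_rng(0)

  W = rng.standard_normal((d_out, d_in))         # frozen pretrained weight
  A = rng.standard_normal((r, d_in)) * 0.01      # trainable down-projection
  B = np.zeros((d_out, r))                       # trainable up-projection, zero-init

  def adapted_forward(x):
      # y = (W + B A) x, computed without materializing the merged matrix
      return W @ x + B @ (A @ x)

  x = rng.standard_normal(d_in)
  print(adapted_forward(x).shape)                # (768,)
-->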
