
Learning question answering in Slovene language
Logar, Katja (Author), Robnik Šikonja, Marko (Mentor)

PDF - Presentation file (528.59 KB)
MD5: 84B397C9E0AB78BF2030EF24D608D0F4

Abstract
Question answering is an important and frequently addressed natural language processing task in English. Less attention is devoted to low-resource languages such as Slovene. In this work, we apply one of the successful English approaches, named UnifiedQA, and test how well it works for the Slovene language. We train a generative question answering model that covers four different question types: yes/no, multiple choice, abstractive and extractive. For training, we use the existing datasets BoolQ, COPA, MultiRC and SQuAD 2.0, and we machine-translate the MCTest dataset. We show that the general model is capable of answering questions in different formats, as it performs at least as well as dedicated models, and that injecting English knowledge improves the results further.
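The abstract describes serializing all four question types into a single text-to-text input so that one generative model handles every format, following the UnifiedQA convention. Below is a minimal sketch of such serialization in Python; the function name, separator convention and example data are illustrative assumptions and are not taken from the thesis.

from string import ascii_uppercase

def to_unifiedqa_input(question, context, choices=None):
    # Serialize one QA example into a single plain-text input. UnifiedQA
    # joins the fields with a literal "\n" separator; yes/no, abstractive
    # and extractive questions share the same "question \n context" layout.
    parts = [question.strip()]
    if choices:
        # Multiple-choice questions additionally list the options as
        # "(A) ... (B) ..." between the question and the context.
        parts.append(" ".join(
            f"({letter}) {choice.strip()}"
            for letter, choice in zip(ascii_uppercase, choices)))
    parts.append(context.strip())
    return " \\n ".join(parts)

# Yes/no (BoolQ-style) example; the target output would be "yes" or "no".
print(to_unifiedqa_input(
    "Ali je Ljubljana glavno mesto Slovenije?",
    "Ljubljana je glavno in največje mesto Slovenije."))

# Multiple-choice (MCTest-style) example; the target is the chosen option.
print(to_unifiedqa_input(
    "Katero mesto je glavno mesto Slovenije?",
    "Ljubljana je glavno in največje mesto Slovenije.",
    choices=["Maribor", "Ljubljana", "Celje"]))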

Language: Slovenian
Keywords: generative question answering, Slovene language, deep neural networks, transformer architecture
Work type: Master's thesis/paper
Typology: 2.09 - Master's Thesis
Organization: FRI - Faculty of Computer and Information Science
Year: 2022
PID: 20.500.12556/RUL-140005
COBISS.SI-ID: 122046723
Publication date in RUL: 09.09.2022
Views: 1092
Downloads: 167
Metadata: XML, RDF-CHPDL, DC-XML, DC-RDF

Secondary language

Language: English
Title: Learning question answering in Slovene language
Abstract:
There have been many studies in the field of question answering for the English language. Less attention has been devoted to low-resource languages such as Slovene. In this work, we use one of the successful English approaches, named UnifiedQA, and test its viability for the Slovene language. We fine-tune a generative model for question answering that covers four different question formats: yes/no, multiple choice, abstractive and extractive. For fine-tuning, we use the existing datasets BoolQ, COPA, MultiRC and SQuAD 2.0, and we machine-translate the MCTest dataset. We show that a general model is capable of answering questions in different formats at least as well as specialized models. The results are further improved using examples in the English language.
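As a rough illustration of the generative setup described above, the sketch below runs seq2seq generation with a multilingual T5 checkpoint through the Hugging Face transformers library. The checkpoint name (google/mt5-small) and the decoding settings are placeholders for illustration; the thesis fine-tunes its own model for Slovene on the datasets listed above.

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder multilingual checkpoint; a fine-tuned Slovene QA model
# would be loaded here instead.
model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# UnifiedQA-style input: question and context joined by a literal "\n".
prompt = ("Ali je Ljubljana glavno mesto Slovenije? \\n "
          "Ljubljana je glavno in največje mesto Slovenije.")
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))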

Keywords: generative question answering, Slovene language, deep neural networks, transformer architecture
