
Deep neural networks for comma placement in Slovene
Božič, Martin (Author); Robnik Šikonja, Marko (Mentor)

PDF - Presentation file (3,14 MB)
MD5: 1C1EC00FEC53827327048A5F4DE8AA57

Abstract
Comma placement is among the most common errors in written Slovene. This thesis focuses on comma placement using deep neural networks. We present two architectures, one based on recurrent neural networks with GRU cells and the other on a pretrained language model of the BERT type. The BERT-based model achieves higher classification accuracy, owing to its better and more complex architecture and to a training process that equips the model with extensive linguistic knowledge. Using the multilingual BERT model, trained on 104 languages with only a small set of Slovene texts, we obtain a solution comparable to the one obtained with the trilingual Slovene-Croatian-English BERT model.
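The abstract frames comma placement as a prediction task over tokens, but does not spell out the preprocessing. A minimal sketch, assuming comma placement is cast as binary per-token classification (label 1 = a comma follows the token); the helper names `to_token_labels` and `restore_commas` are hypothetical, not from the thesis:

```python
def to_token_labels(sentence):
    """Convert a comma-containing sentence into (tokens, labels).

    Label 1 means a comma should follow the token, 0 otherwise.
    This is one plausible way to turn raw text into training data
    for a per-token comma classifier.
    """
    tokens, labels = [], []
    for raw in sentence.split():
        if raw.endswith(","):
            tokens.append(raw[:-1])  # strip the trailing comma
            labels.append(1)
        else:
            tokens.append(raw)
            labels.append(0)
    return tokens, labels


def restore_commas(tokens, labels):
    """Inverse operation: re-insert commas after tokens labelled 1."""
    return " ".join(t + "," if lab else t for t, lab in zip(tokens, labels))
```

With this framing, a GRU or BERT model only has to predict one binary label per token, and the round trip `restore_commas(*to_token_labels(s))` reproduces the original sentence.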

Language: Slovenian
Keywords: deep neural networks, GRU networks, BERT model, comma placement
Work type: Bachelor thesis/paper
Typology: 2.11 - Undergraduate Thesis
Organization: FRI - Faculty of Computer and Information Science
Year: 2020
PID: 20.500.12556/RUL-119034
COBISS.SI-ID: 27670787
Publication date in RUL: 01.09.2020
Views: 1585
Downloads: 568
Metadata: XML DC-XML DC-RDF

Secondary language

Language: English
Title: Deep neural networks for comma placement in Slovene
Abstract:
Comma placement is among the most frequent orthographic mistakes in Slovene. The thesis focuses on comma placement using deep neural networks. We present two architectures, one based on recurrent neural networks with GRU cells and the other on a pretrained BERT language model. The BERT-based model achieves better classification accuracy, owing to its more complex architecture and to a training process that fine-tunes a model pretrained with substantial linguistic knowledge. With the multilingual BERT, trained on 104 languages with only a small amount of Slovene text, we achieve results comparable to the trilingual Slovene-Croatian-English BERT model, which was trained on much more Slovene text.
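The abstract compares the models by classification accuracy. Since most tokens are not followed by a comma, plain accuracy is dominated by the majority class; a sketch of a metric helper that also reports precision and recall for the comma class (the thesis's exact evaluation is not reproduced here, and `comma_metrics` is a hypothetical name):

```python
def comma_metrics(gold, pred):
    """Token-level metrics for binary comma labels (1 = comma after token).

    Returns accuracy over all tokens, plus precision and recall for
    the positive (comma) class, which better reflect performance on
    an imbalanced task.
    """
    assert len(gold) == len(pred), "label sequences must align"
    correct = sum(g == p for g, p in zip(gold, pred))
    tp = sum(g == 1 and p == 1 for g, p in zip(gold, pred))  # true positives
    fp = sum(g == 0 and p == 1 for g, p in zip(gold, pred))  # false positives
    fn = sum(g == 1 and p == 0 for g, p in zip(gold, pred))  # false negatives
    return {
        "accuracy": correct / len(gold),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }
```

A classifier that never predicts a comma can still score high accuracy on such data, which is why the comma-class precision and recall are worth tracking alongside it.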

Keywords: deep neural networks, GRU networks, BERT model, comma placement
