
Use of recurrent neural networks for generating artificial text
ZORKO, LUKA (Author), Kononenko, Igor (Mentor)

PDF - Presentation file (460.28 KB)
MD5: D112B2E5B7F15310A6181A13577F5ECA

Abstract
In this thesis I investigated and compared different types of recurrent neural networks for the purpose of generating artificial articles. I described and tested several types of recurrent neural networks: the classic RNN, the long short-term memory network LSTM (Long Short-Term Memory), and a simplified version of LSTM, the GRU (Gated Recurrent Unit). I trained the models on a collection of sports articles obtained from the internet and on Shakespeare's play Romeo and Juliet. Each of the networks was run with seven different settings, and for each setting I generated 6 short texts, which I then reviewed and graded from 1 to 5. On the generated texts I also tested 5 models (3 character-based and 2 word-based) for classifying, i.e. distinguishing, human-written and generated articles. In recognizing generated texts, the word-based model using a bidirectional long short-term memory network BLSTM (Bidirectional Long Short-Term Memory) performed best, while in text generation the plain recurrent neural network RNN received the lowest grades and the LSTM network the highest.
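
The record contains no code, but the generation setup described in the abstract can be illustrated with a short, hypothetical sketch. The following assumes a Keras-style character-level model (the thesis does not state which framework was used); the layer widths, context length and sampling temperature are illustrative assumptions, and swapping the LSTM layer for SimpleRNN or GRU gives the other two variants that were compared.

import numpy as np
from tensorflow.keras import layers, models

def build_char_lstm(vocab_size, embed_dim=64, hidden=256):
    # Single-layer character-level model that predicts the next character id.
    # Replacing layers.LSTM with layers.SimpleRNN or layers.GRU gives the
    # other two recurrent variants compared in the thesis.
    model = models.Sequential([
        layers.Embedding(vocab_size, embed_dim),
        layers.LSTM(hidden),
        layers.Dense(vocab_size, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model

def sample_text(model, seed_ids, id_to_char, length=300, temperature=0.8):
    # Autoregressive sampling: repeatedly predict the next character from the
    # last 100 generated ids and draw it from the temperature-scaled softmax.
    ids = list(seed_ids)
    for _ in range(length):
        window = np.array(ids[-100:])[None, :]
        probs = model.predict(window, verbose=0)[0]
        logits = np.log(probs + 1e-9) / temperature
        probs = np.exp(logits) / np.sum(np.exp(logits))
        ids.append(int(np.random.choice(len(probs), p=probs)))
    return "".join(id_to_char[i] for i in ids)

Training such a model would amount to fitting it on (character window, next character) pairs cut from the sports-article corpus or from Romeo and Juliet; the exact preprocessing used in the thesis is not given in this record.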

Language:Slovenian
Keywords:artificial intelligence, neural networks, natural language processing
Work type:Bachelor thesis/paper
Typology:2.11 - Undergraduate Thesis
Organization:FRI - Faculty of Computer and Information Science
Year:2021
PID:20.500.12556/RUL-130488
COBISS.SI-ID:77913347
Publication date in RUL:15.09.2021
Views:8672
Downloads:120
Metadata:XML DC-XML DC-RDF

Secondary language

Language:English
Title:Use of recurrent neural networks for generating artificial text
Abstract:
In this thesis I researched and compared different types of recurrent neural networks for natural language generation. I described and tested various types of recurrent neural networks: the classic RNN, the long short-term memory network LSTM, and a simplified version of LSTM called GRU (gated recurrent unit). I trained the models on a collection of short sports articles from the internet and on Shakespeare's play Romeo and Juliet. Each type of network was run with 7 different settings; for each setting I generated 6 short text outputs and graded them from 1 to 5. I also tested the effectiveness of 5 models (3 character-based and 2 word-based) in distinguishing human-written from generated articles. In this task the word-based bidirectional long short-term memory network BLSTM performed best, while in the task of text generation the regular RNN performed worst and the LSTM performed best.
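
For the classification side, a minimal sketch of a word-based BLSTM classifier of the kind described above might look as follows. This again assumes a Keras-style setup; the vocabulary size, embedding width, dropout rate and training parameters are illustrative assumptions, not the thesis's actual configuration.

from tensorflow.keras import layers, models

def build_blstm_classifier(vocab_size=20000, embed_dim=128, hidden=128):
    # Word-level bidirectional LSTM with a sigmoid output:
    # 1 = generated article, 0 = human-written article.
    model = models.Sequential([
        layers.Embedding(vocab_size, embed_dim, mask_zero=True),
        layers.Bidirectional(layers.LSTM(hidden)),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Assumed usage: x_train is a matrix of padded word indices, y_train a 0/1
# vector marking generated articles.
# model = build_blstm_classifier()
# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=32)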

Keywords:artificial intelligence, neural networks, natural language processing
