Recurrent neural networks and their applications : final seminar paper
Dolenc, Tim (Author), Todorovski, Ljupčo (Mentor)

PDF - Presentation file (1.73 MB)
MD5: 9D80F24B36A75FC49BB986D1CE810280

Abstract
The thesis explores feedforward neural networks and recurrent neural networks (RNNs). It presents the basic characteristics and learning algorithms of both types of networks, along with an additional way of representing the loss function gradient in RNNs. The work also includes results of a custom implementation of a feedforward neural network and an application of RNNs to the problem of forecasting electricity consumption. The results show that RNNs are suitable for short-term time series forecasting, while over longer sequences they face challenges such as exploding and vanishing gradients.

Language: Slovenian
Keywords: neural networks, feedforward neural networks, recurrent neural networks, time series forecasting, loss function gradient, chain rule, backpropagation
Work type: Final seminar paper
Typology: 2.11 - Undergraduate Thesis
Organization: FMF - Faculty of Mathematics and Physics
Year: 2024
PID: 20.500.12556/RUL-160627
UDC: 004.8
COBISS.SI-ID: 206082819
Publication date in RUL: 01.09.2024
Views: 253
Downloads: 1073

Secondary language

Language: English
Title: Recurrent neural networks and their applications
Abstract:
The thesis explores feedforward neural networks and recurrent neural networks (RNNs). It presents the fundamental characteristics and learning algorithms for both types of networks and introduces an additional method for representing the loss function gradient in RNNs. The work also includes a custom implementation of a feedforward neural network and the application of RNNs to the problem of forecasting electricity consumption. The results indicate that RNNs are suitable for short-term time series forecasting, although over longer sequences they face challenges such as exploding and vanishing gradients.

Keywords: neural networks, feedforward neural networks, recurrent neural networks, time series forecasting, loss function gradient, chain rule, backpropagation
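
Illustration

The abstract describes applying RNNs to short-term forecasting of electricity consumption. For orientation only, below is a minimal sketch of a vanilla RNN one-step forecaster in Python with NumPy. It is not the thesis' own implementation; the layer sizes, window length, and weight initialisation are illustrative assumptions.

import numpy as np

# Minimal vanilla RNN cell for one-step-ahead time series forecasting.
# Illustrative sketch only; dimensions and initialisation are assumptions,
# not taken from the thesis.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 1, 16, 1                   # 1 input feature, 16 hidden units, 1 forecast

W_xh = rng.normal(0.0, 0.1, (n_hidden, n_in))      # input-to-hidden weights
W_hh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # recurrent (hidden-to-hidden) weights
W_hy = rng.normal(0.0, 0.1, (n_out, n_hidden))     # hidden-to-output weights
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def rnn_forecast(series):
    """Run the RNN over a 1-D series and return the forecast for the next step."""
    h = np.zeros(n_hidden)                         # initial hidden state
    for x_t in series:
        # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
        h = np.tanh(W_xh @ np.array([x_t]) + W_hh @ h + b_h)
    return (W_hy @ h + b_y).item()                 # y = W_hy h_T + b_y

# Toy usage: a synthetic 24-step "consumption" window (weights are untrained,
# so the value itself is meaningless; training would use backpropagation
# through time over such windows).
window = np.sin(np.linspace(0.0, 3.0, 24))
print(rnn_forecast(window))

Training such a network uses backpropagation through time; the repeated multiplication by W_hh across a long window is what makes gradients prone to exploding or vanishing, which is the limitation the abstract notes for longer sequences.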
