The thesis explores feedforward neural networks and recurrent neural networks (RNNs). It presents the fundamental characteristics and learning algorithms of both types of networks and introduces an additional method for representing the gradient of the loss function in RNNs. The work also includes a custom implementation of a feedforward neural network and the application of RNNs to the problem of forecasting electricity consumption. The results indicate that RNNs are well suited to short-term time series forecasting, although training them is complicated by challenges such as exploding and vanishing gradients.
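The exploding and vanishing gradient problem mentioned above can be illustrated with a minimal sketch (a one-dimensional linear recurrence, not the thesis's own model): in an RNN h_t = w * h_{t-1} + x_t, backpropagation through k time steps multiplies the gradient by w at every step, so the factor reaching an early state is w**k.

```python
import numpy as np

T = 50  # number of time steps (illustrative choice)

for w, label in [(0.9, "vanishing"), (1.1, "exploding")]:
    # Gradient factor of the final hidden state w.r.t. each earlier
    # state in the linear recurrence h_t = w * h_{t-1} + x_t:
    # after k steps of backpropagation the factor is w**k.
    factors = w ** np.arange(T)
    print(f"{label}: factor after {T - 1} steps = {factors[-1]:.3e}")
```

With |w| < 1 the factor decays exponentially toward zero (vanishing), and with |w| > 1 it grows exponentially (exploding); in a full RNN the scalar w is replaced by a product of Jacobians, but the same geometric behavior governs long-range credit assignment.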