This master's thesis deals with the prediction of parallel time series using machine learning. Parallel time series are time series whose values change over time at equal intervals, such as an hour or a day, simultaneously for all time series. An example of this type of time series is stock exchange rates, where each security produces a time series that runs parallel to the time series of the other securities.
The contribution of this master's thesis is a new combined algorithm for predicting parallel time series that couples a genetic algorithm with non-linear regression. The genetic algorithm is used to search for the form of the non-linear functions that describe the model, while a numerical method of non-linear regression is used to calculate the unknown function coefficients.
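The two-stage idea described above can be illustrated with a minimal sketch. This is not the thesis's implementation: the candidate function set, the random search standing in for the genetic algorithm, and the use of SciPy's `curve_fit` as the numerical non-linear regression method are all illustrative assumptions.

```python
import random
import numpy as np
from scipy.optimize import curve_fit

# Illustrative candidate function forms; in the thesis these would be
# evolved by a genetic algorithm rather than enumerated by hand.
CANDIDATES = {
    "linear":      lambda t, a, b: a * t + b,
    "exponential": lambda t, a, b: a * np.exp(b * t),
    "sinusoidal":  lambda t, a, b: a * np.sin(b * t),
}

def fit_best_model(t, y, generations=20, seed=0):
    """Toy stand-in for the combined algorithm: a random search over
    function forms (surrogate for the GA) plus non-linear regression
    (curve_fit) to estimate each form's unknown coefficients."""
    rng = random.Random(seed)
    best = (float("inf"), None, None)  # (squared error, form name, coeffs)
    for _ in range(generations):
        name = rng.choice(list(CANDIDATES))
        f = CANDIDATES[name]
        try:
            coeffs, _ = curve_fit(f, t, y, p0=[1.0, 0.1], maxfev=2000)
        except RuntimeError:
            continue  # regression did not converge for this form
        err = float(np.sum((f(t, *coeffs) - y) ** 2))
        if err < best[0]:
            best = (err, name, coeffs)
    return best

t = np.linspace(0.0, 4.0, 50)
y = 2.0 * np.exp(0.5 * t)            # synthetic series with a known form
err, name, coeffs = fit_best_model(t, y)
```

On this noiseless synthetic series the exponential form is recovered with coefficients close to the true values (a = 2.0, b = 0.5), showing how the structural search and the coefficient regression divide the work.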
The proposed algorithm is comparable in accuracy and achieved profit to existing machine learning algorithms. Its advantage is that it requires no specific preprocessing of the input data, as long as the data are complete. Another advantage is that it offers an explanation of how the data depend on one another. The downside is that the algorithm is computationally expensive, which was partly mitigated through parallelism.