Numerical optimization with line search methods : final seminar paper
Ribič, Brina (Author)
Grošelj, Jan (Mentor)
PDF - Presentation file (1.24 MB)
MD5: B456E42F9583B30F3A9F948C916C9682
Abstract
In this work we address the problem of finding the minimum value of a function with the tools of numerical optimization. We consider iterative algorithms, or methods, that should reliably lead to a solution with a high order of convergence. We begin by deriving the steepest descent method, which in general has at most a linear order of convergence and is rarely used in practice. We then analyze Newton's method, which has a higher order of convergence but the drawback that it requires computing the Hessian matrix of the function. We also present the BFGS method, which has a superlinear order of convergence while remaining computationally inexpensive. In the final chapter we compare the BFGS method and the steepest descent method on the CAPM model, where we use linear regression on data from past years to estimate the beta coefficient of a selected company. We find that the BFGS method is more reliable and efficient than the steepest descent method.
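As a concrete illustration of the baseline method the abstract mentions, the following is a minimal sketch (not code from the thesis; all function names and parameter values are hypothetical) of steepest descent with a backtracking line search enforcing the Armijo sufficient-decrease condition:

import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                     tol=1e-8, max_iter=10_000):
    """Minimize f starting from x0, moving along the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient small enough: stop
            return x
        d = -g                        # steepest descent direction
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        # f(x + alpha*d) <= f(x) + c * alpha * grad(x)^T d
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
    return x

# Convex quadratic test problem: f(x) = x^T A x, minimized at the origin.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
print(steepest_descent(lambda x: x @ A @ x, lambda x: 2.0 * A @ x, [1.0, 1.0]))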
Language:
Slovenian
Keywords:
unconstrained optimization, line search methods
Work type:
Final seminar paper
Typology:
2.11 - Undergraduate Thesis
Organization:
FMF - Faculty of Mathematics and Physics
Year:
2022
PID:
20.500.12556/RUL-139563
UDC:
519.6
COBISS.SI-ID:
120694787
Publication date in RUL:
04.09.2022
Views:
625
Downloads:
105
Secondary language
Language:
English
Title:
Numerical optimization with line search methods
Abstract:
In this work we study the problem of finding the minimum value of a function using numerical optimization tools. We seek algorithms that have a high rate of convergence and find a solution efficiently. We start by deriving the steepest descent method, which in general has at most a linear rate of convergence and is rarely used in practice. We then analyze Newton's method, which has a higher rate of convergence, but its main drawback is that it requires the computation of the Hessian matrix of the function. We also introduce the BFGS method, which has a superlinear rate of convergence and low computational complexity. At the end of this work we compare the BFGS method and the steepest descent method on the CAPM model. We use linear regression to estimate the beta coefficient of a selected company from data of previous years. We conclude that the BFGS method is more reliable and efficient than the steepest descent method.
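For the application sketched in the abstract, here is a minimal illustration (with synthetic data; the thesis's actual dataset and company are not reproduced here) of estimating the CAPM beta by minimizing the regression's sum of squared residuals with SciPy's BFGS implementation:

import numpy as np
from scipy.optimize import minimize

# Synthetic (hypothetical) excess returns; the "true" beta is set to 1.3.
rng = np.random.default_rng(seed=0)
market = rng.normal(0.01, 0.04, size=250)
stock = 0.002 + 1.3 * market + rng.normal(0.0, 0.02, size=250)

def sse(params):
    """Sum of squared residuals of the regression stock = alpha + beta * market."""
    alpha, beta = params
    residuals = stock - (alpha + beta * market)
    return residuals @ residuals

result = minimize(sse, x0=np.zeros(2), method="BFGS")
alpha_hat, beta_hat = result.x
print(f"estimated beta: {beta_hat:.3f}")  # should come out close to 1.3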
Keywords:
unconstrained optimization, line search methods