
Pruning neural networks with matrix factorization
Roštan, Teja (Author), Curk, Tomaž (Mentor)

PDF - Presentation file (1.09 MB)
MD5: 1E9DCEFBE151CE498C50B438ADF9480E
PID: 20.500.12556/rul/9e5d6cef-d5e1-41b6-b17b-db86e65d2d1d

Abstract
Matrix factorization, combined with the procedure of data fusion, is used to discover patterns, i.e. groups, in data. The factorized model maps the data into a lower-dimensional space, thereby compressing them and removing part of the noise. Such models are therefore usually more robust and have higher predictive accuracy. In neural networks this could address the problem of overfitting and improve generalization. In this thesis we examined whether the simultaneous factorization of the parameters of a neural network, which can be represented with several matrices, removes (prunes) unimportant connections and thus improves the predictive accuracy of the network. The proposed pruning procedure was tested on ordinary and deep neural networks. Its performance is comparable to the other most successful standard approaches to pruning neural networks.
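
The sketch below is only an illustration of the general idea described in the abstract: approximate a weight matrix with a low-rank factorization and drop the connections whose approximated magnitude is smallest. It is a minimal sketch assuming a single-matrix truncated SVD; the thesis factorizes several parameter matrices simultaneously via data fusion, which this sketch does not reproduce. The names lowrank_prune, rank, and keep_ratio are illustrative assumptions, not identifiers from the thesis.

import numpy as np

def lowrank_prune(W, rank, keep_ratio=0.5):
    """Hypothetical sketch: approximate W with a rank-`rank` truncated SVD
    and keep only the connections with the largest approximated magnitude."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    W_approx = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]  # low-rank reconstruction

    # Prune: zero out weights whose low-rank magnitude falls below the
    # (1 - keep_ratio) quantile, i.e. the "unimportant" connections.
    threshold = np.quantile(np.abs(W_approx), 1.0 - keep_ratio)
    mask = np.abs(W_approx) >= threshold
    return W * mask, mask

# Example: prune a random 64x32 layer to roughly half of its connections.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))
W_pruned, mask = lowrank_prune(W, rank=8, keep_ratio=0.5)
print(f"kept {mask.mean():.0%} of the connections")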

Language: Slovenian
Keywords: neural networks, matrix factorization, pruning
Work type: Bachelor thesis/paper
Organization: FRI - Faculty of Computer and Information Science
Year: 2015
PID: 20.500.12556/RUL-72321
Publication date in RUL: 11.09.2015
Views: 1515
Downloads: 332
Metadata: XML RDF-CHPDL DC-XML DC-RDF

Secondary language

Language: English
Title: Pruning neural network using matrix factorization
Abstract:
Matrix factorization and the procedure of data fusion are used to detect patterns in data. The factorized model maps the data to a low-dimensional space, therefore shrinking it and partially eliminating noise. Factorized models are thus more robust and have a higher predictive accuracy. With this procedure we could solve the problem of overfitting in neural networks and improve their ability to generalize. Here, we report on how to simultaneously factorize the parameters of a neural network, which can be represented with multiple matrices, to prune unimportant connections and therefore improve predictive accuracy. We report on empirical results of pruning ordinary and deep neural networks. The proposed method performs similarly to the best standard approaches to pruning neural networks.

Keywords: neural networks, matrix factorization, pruning
