Matrix factorization and data fusion are used to detect patterns in data. A
factorized model maps the data to a low-dimensional space, thereby compressing
it and partially eliminating noise. Factorized models are thus more robust and
achieve higher predictive accuracy. This procedure can mitigate overfitting in
neural networks and improve their ability to generalize. Here, we report on how
to simultaneously factorize the parameters of a neural network, which can be
represented as multiple matrices, in order to prune unimportant connections and
thereby improve predictive accuracy. We report empirical results of pruning
both shallow and deep neural networks. The proposed method performs comparably
to the best standard approaches to pruning neural networks.
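To illustrate the core idea of factorizing a network's weight matrices into a low-dimensional space, the following is a minimal sketch, not the paper's actual method: it compresses a single dense weight matrix with a truncated SVD, keeping only the top-k singular directions. The matrix sizes and the rank k are arbitrary choices for the example.

```python
import numpy as np

# Hypothetical illustration: approximate a dense weight matrix W (m x n)
# by a rank-k factorization U_k @ V_k obtained from truncated SVD.
# This shrinks the parameter count from m*n to k*(m + n) and discards
# the smallest singular directions, which often carry mostly noise.
rng = np.random.default_rng(0)
W = rng.standard_normal((128, 64))

k = 16  # target rank; a free hyperparameter in this sketch
U, s, Vt = np.linalg.svd(W, full_matrices=False)
U_k = U[:, :k] * s[:k]   # fold the singular values into the left factor
V_k = Vt[:k, :]
W_approx = U_k @ V_k     # low-rank reconstruction of W

original_params = W.size                  # 128 * 64 = 8192
compressed_params = U_k.size + V_k.size   # 16 * (128 + 64) = 3072
print(original_params, compressed_params)
```

In an actual pruning setting, the factors would replace the original layer (or guide which connections to drop) and the network would typically be fine-tuned afterwards; this sketch only shows the compression step.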