In recent years, artificial neural networks have grown in popularity within the field of machine learning. In specific domains they have achieved surprisingly good results compared to more traditional machine learning approaches. With this growth in popularity, however, their complexity has also increased. More complex neural networks are usually harder to train, since the training process is less stable and requires more data. We propose a new data-driven weight initialization that is based on unsupervised learning and uses the training data to approximate optimal weights. This approach is useful and in some cases yields a large boost in the learning speed and accuracy of a neural network. The initialization is also scalable, since it is easy to parallelize. The proposed initialization and the optimal values of its parameters are evaluated on an image dataset and on datasets of biological gene expressions.
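The abstract does not spell out the initialization procedure, but a common form of data-driven, unsupervised weight initialization uses cluster centroids of the training data as the first-layer weight vectors, so each hidden unit starts tuned to a mode of the data. The sketch below illustrates this generic idea only; the function name `data_driven_init`, the use of k-means, and the row normalization are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def data_driven_init(X, n_units, n_iter=10, seed=0):
    """Hypothetical sketch: initialize first-layer weights from data
    by running a simple k-means and using the centroids as weights."""
    rng = np.random.default_rng(seed)
    # Start centroids from randomly chosen training samples.
    centroids = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assign each sample to its nearest centroid.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for k in range(n_units):
            members = X[labels == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
    # Normalize rows so initial pre-activations have comparable scale.
    norms = np.linalg.norm(centroids, axis=1, keepdims=True)
    return centroids / np.maximum(norms, 1e-8)

# Toy usage: 200 samples of 16 features, 8 hidden units.
X = np.random.default_rng(1).normal(size=(200, 16))
W = data_driven_init(X, n_units=8)
print(W.shape)  # (8, 16)
```

Because each cluster can be fit independently per data shard, this kind of initialization parallelizes naturally, which matches the scalability claim in the abstract.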