Automatic text summarization is the process of extracting important information from a text and presenting it in the form of a summary. Abstractive summarization approaches have progressed with deep neural networks, but the results are not yet satisfactory, especially for languages without large training sets. In several natural language processing tasks, cross-lingual model transfer has been successfully applied to low-resource languages for which sufficiently large datasets are not available. For summarization, such cross-lingual transfer has so far not been attempted because the decoder side of neural models is not reusable across languages. In our work, we used a pretrained English summarization model, based on deep neural networks and the sequence-to-sequence architecture, to summarize Slovene news articles. We addressed the problem of the inadequate decoder by using an additional language model for target-language text generation. We developed five models with different training-sample sizes and assessed the results through automatic and human evaluation. The performance of our cross-lingual model is comparable to that of the existing Slovene abstractive summarizer. We also discuss some interdisciplinary aspects raised by our work.