Extractive single document summarization using binary differential evolution: Optimization of different sentence quality measures
Authors:
Naveen Saini aff001; Sriparna Saha aff001; Dhiraj Chakraborty aff002; Pushpak Bhattacharyya aff001
Author affiliations:
aff001: Department of Computer Science and Engineering, Indian Institute of Technology Patna, Bihar, India
aff002: Department of Computer Science and Application, University of North Bengal, Darjeeling, West Bengal, India
Published in:
PLoS ONE 14(11)
Kategorie:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0223477
Abstract
With the growing amount of textual information in real-life applications, automatic text-summarization systems are becoming increasingly important for extracting relevant information. In the current study, we formulate extractive text summarization as a binary optimization problem and employ a multi-objective binary differential evolution (DE) based strategy to solve it. Each DE solution encodes a candidate subset of sentences to include in the summary, which is then evaluated on several statistical features (objective functions): the position of each sentence in the document, the similarity of a sentence with the title, sentence length, cohesion, readability, and coverage. These objective functions, measuring different aspects of the summary, are optimized simultaneously using the search capability of DE. Newly designed self-organizing map (SOM) based genetic operators are incorporated into the optimization process to improve convergence: the SOM generates a mating pool containing solutions and their neighborhoods, and this pool takes part in the genetic operations (crossover and mutation) to create new solutions. To measure the similarity or dissimilarity between sentences, several existing measures are explored: normalized Google distance, word mover's distance, and cosine similarity. For evaluation, two standard summarization datasets, DUC2001 and DUC2002, are used, and the obtained results are compared with various supervised, unsupervised, and optimization-based summarization techniques using ROUGE measures. The results illustrate the superiority of our approach over state-of-the-art methods in terms of convergence rate and ROUGE scores: for the DUC2001 dataset, we obtain 45% and 5% improvements over two recent state-of-the-art methods on ROUGE-2 and ROUGE-1 scores, respectively, and for the DUC2002 dataset the corresponding improvements are 20% and 5%. In addition to these standard datasets, a CNN news dataset is used to evaluate the efficacy of the proposed approach. The results also show that the best performance depends not only on the objective functions used but also on the correct choice of similarity/dissimilarity measure between sentences.
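As a rough illustration of the binary formulation described above (a minimal sketch with hypothetical function names, not the authors' implementation), a DE individual can be a bit vector over the document's sentences, and objective functions such as sentence position and title similarity can be scored over the selected subset:

```python
import numpy as np

def encode_solution(n_sentences, rng):
    """A DE individual: bit i = 1 means sentence i is kept in the summary."""
    return (rng.random(n_sentences) < 0.5).astype(int)

def position_score(solution):
    """Objective sketch: favour sentences appearing early in the document.
    Each selected sentence at position p contributes 1/p; higher is better."""
    positions = np.arange(1, len(solution) + 1)
    weights = 1.0 / positions
    selected = solution.astype(bool)
    if not selected.any():
        return 0.0
    return weights[selected].mean()

def title_similarity_score(solution, sent_vecs, title_vec):
    """Objective sketch: mean cosine similarity of selected sentence
    vectors to the title vector."""
    selected = solution.astype(bool)
    if not selected.any():
        return 0.0
    vecs = sent_vecs[selected]
    cos = vecs @ title_vec / (
        np.linalg.norm(vecs, axis=1) * np.linalg.norm(title_vec))
    return cos.mean()

rng = np.random.default_rng(0)
sol = encode_solution(6, rng)  # e.g. a random candidate summary over 6 sentences
```

In the multi-objective setting, each individual receives one score per objective, and the DE survival step keeps non-dominated solutions rather than collapsing the scores into a single number.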
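Of the sentence similarity measures mentioned, the normalized Google distance (Cilibrasi and Vitányi) is the least widely known. At the term level it is computed from occurrence counts; a sketch (how the paper aggregates term-level distances into a sentence-level measure is not shown here):

```python
import math

def ngd(fx, fy, fxy, n):
    """Normalized Google distance from occurrence counts:
    fx, fy  = number of indexed items (e.g. sentences) containing each term,
    fxy     = number of items containing both terms,
    n       = total number of indexed items.
    Returns 0 for terms that always co-occur, larger values for unrelated terms."""
    if fxy == 0:
        return 1.0  # conventional cap when the terms never co-occur
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))
```

For example, with `fx = 9`, `fy = 3`, `fxy = 3` and `n = 27` the distance is 0.5, while two terms that occur in exactly the same items have distance 0.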
Keywords:
Algorithms – Optimization – Neurons – Semantics – Evolutionary algorithms – Convergent evolution – Gene pool – Cosine similarity