
Automatic Text Summarization – A Review

Behjat Reyaz, Falak Jan, Riyaz ul Haque

Abstract



We present an encoder/decoder methodology for summarizing text documents. Text summarization, a core task of natural language processing, takes two forms: abstractive and extractive. We build a hybrid abstractive-extractive model around BERT (Bidirectional Encoder Representations from Transformers). First, the abstractive summaries are transformed into their corresponding embeddings using BERT. Second, the two models are pre-trained separately on top of the pre-trained BERT model. The extraction network and the abstraction network are then combined into a unified model through reinforcement learning. Automatic text summarization compresses a long document into a shorter form while preserving its information content and overall meaning. Many existing automatic summarizers produce summaries with good content, but they do not focus on preserving the underlying meaning and semantics of the text. We therefore capture and preserve the semantics of the text as a fundamental feature for summarizing a document using a distributional semantic model, and we cluster and rank sentences using several algorithms. In this paper, automatic text summarization is performed on single documents and is generic in nature. We evaluate our summarizer with ROUGE (Recall-Oriented Understudy for Gisting Evaluation) on the DUC-2007 dataset and compare it with other state-of-the-art summarizers.
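As a rough illustration of the extractive side of such a pipeline, the sketch below embeds sentences with a pre-trained BERT model and ranks them by similarity to the document centroid. This is a minimal sketch, not the authors' implementation: the bert-base-uncased checkpoint, the mean-pooling step, and the centroid-similarity ranking heuristic are illustrative assumptions, built on the Hugging Face transformers and PyTorch libraries.

# Minimal sketch of BERT-based extractive sentence ranking (illustrative only).
# Assumptions: the "bert-base-uncased" checkpoint and the centroid-similarity
# heuristic are our own choices for the example, not details from the paper.
import torch
from transformers import AutoTokenizer, AutoModel

def embed_sentences(sentences, model_name="bert-base-uncased"):
    """Return one mean-pooled BERT embedding per input sentence."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    model.eval()
    inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state          # (batch, tokens, dim)
    mask = inputs["attention_mask"].unsqueeze(-1).float()    # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)      # mean pooling

def extract_summary(sentences, top_k=3):
    """Select the top_k sentences closest to the document centroid embedding."""
    emb = embed_sentences(sentences)
    centroid = emb.mean(dim=0, keepdim=True)
    scores = torch.nn.functional.cosine_similarity(emb, centroid)
    best = scores.topk(min(top_k, len(sentences))).indices.sort().values
    return " ".join(sentences[int(i)] for i in best)          # keep original order

if __name__ == "__main__":
    doc = [
        "Automatic text summarization compresses a long document into a shorter form.",
        "Extractive methods select salient sentences; abstractive methods generate new ones.",
        "BERT embeddings capture sentence semantics for ranking and clustering.",
        "Summaries are commonly evaluated with ROUGE against reference summaries.",
    ]
    print(extract_summary(doc, top_k=2))

In a full evaluation, the extracted summary would then be scored against the DUC-2007 reference summaries with ROUGE, for example via a standard ROUGE implementation such as the rouge_score package.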

 

Keywords: Encoder, Decoder, BERT, Summarizers, ROUGE

 

E-mail: [email protected]

 

 

