# BERT: NLG-Text Summarization

Encoder:

Input sequences are encoded into context representations using BERT.
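A minimal sketch of this encoding step, assuming the Hugging Face `transformers` library (the repository may use a different BERT implementation). The final hidden states serve as the per-token context representations consumed by the decoder.

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

def encode(text: str) -> torch.Tensor:
    """Encode an input sequence into per-token context representations."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = bert(**inputs)
    # Shape: (batch, seq_len, hidden_size) -- one context vector per token.
    return outputs.last_hidden_state
```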

Decoder:

  1. First stage: a Transformer-based decoder generates a draft output sequence.
  2. Second stage: each word of the draft sequence is masked in turn and fed to BERT. Then, by combining the input sequence with the draft representation produced by BERT, a Transformer-based decoder predicts the refined word for each masked position (see the sketch after this list).
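The two-stage decoding can be sketched as below. This is a hedged illustration of the control flow, not the repository's code: `draft_decoder` and `refine_decoder` stand in for the two Transformer-based decoders, and their `generate`/`predict` calls are hypothetical APIs; `bert`, `tokenizer`, and `encode` are reused from the encoder sketch above.

```python
import torch

MASK_ID = tokenizer.mask_token_id  # BERT's [MASK] token id

def summarize(text: str) -> list[int]:
    context = encode(text)  # BERT-encode the input sequence

    # First stage: generate a draft summary conditioned on the
    # encoder's context representations (hypothetical API).
    draft = draft_decoder.generate(context)  # list of token ids

    # Second stage: refine the draft one position at a time.
    refined = list(draft)
    for i in range(len(draft)):
        masked = list(draft)
        masked[i] = MASK_ID  # mask the i-th draft word
        with torch.no_grad():
            # BERT re-encodes the masked draft...
            draft_repr = bert(torch.tensor([masked])).last_hidden_state
        # ...and the refine decoder combines the input context with the
        # draft representation to predict the word at the masked position
        # (hypothetical API).
        refined[i] = refine_decoder.predict(context, draft_repr, position=i)
    return refined
```

The point of the second stage is that BERT sees both the left and right context of each masked draft word, whereas the first-stage decoder only conditions on words to the left.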