Finetune-Pre-Trained-Model-for-Text-Summarization

An experiment with a pre-trained model to build a text summarization model for summarizing Indonesian articles (this project uses the id_liputan6 dataset by Fajri Koto).

  • Pre-trained model: indolem/indobert-base-uncased
  • Dataset: id_liputan6 by Fajri Koto
  • Fine-tuning parameters (a training sketch using these values follows the list):
    • batch_size = 8
    • learning_rate = 2e-5
    • epochs = 8 (with early stopping)
    • weight_decay = 0.01
  • Fine-tuning evaluation:
    • Rouge1: 0.33 (+/- 0.14)
    • Rouge2: 0.15 (+/- 0.12)
    • RougeL: 0.28 (+/- 0.13)
    • RougeLsum: 0.28 (+/- 0.13)
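
Below is a minimal sketch of how the hyperparameters above could be wired up with Hugging Face Transformers, Datasets, and Evaluate. It is not the repository's actual training script: the BERT2BERT `EncoderDecoderModel` warm start, the `canonical` config and column names of `id_liputan6`, the sequence lengths, the early-stopping patience, and the exact argument names (which vary across Transformers versions) are all assumptions.

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    EncoderDecoderModel,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
    EarlyStoppingCallback,
)

MODEL_NAME = "indolem/indobert-base-uncased"

# id_liputan6 requires the raw Liputan6 data to be downloaded manually;
# "canonical" is one of its configurations.
dataset = load_dataset("id_liputan6", "canonical", data_dir="path/to/liputan6_data")

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

# IndoBERT is encoder-only, so warm-start a BERT2BERT encoder-decoder for generation.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(MODEL_NAME, MODEL_NAME)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

def preprocess(batch):
    # Tokenize the article (input) and the summary (target).
    model_inputs = tokenizer(batch["clean_article"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["clean_summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True,
                        remove_columns=dataset["train"].column_names)

rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    # Decode generated ids and references, then score with ROUGE
    # (returns rouge1 / rouge2 / rougeL / rougeLsum).
    preds, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    return rouge.compute(predictions=decoded_preds, references=decoded_labels)

args = Seq2SeqTrainingArguments(
    output_dir="indobert-liputan6-summarization",
    per_device_train_batch_size=8,   # batch_size = 8
    per_device_eval_batch_size=8,
    learning_rate=2e-5,              # learning_rate = 2e-5
    num_train_epochs=8,              # epochs = 8
    weight_decay=0.01,               # weight_decay = 0.01
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,     # required for early stopping
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    compute_metrics=compute_metrics,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=2)],  # patience is a guess
)

trainer.train()
print(trainer.evaluate())
```

The reported Rouge1 / Rouge2 / RougeL / RougeLsum scores above correspond to the keys returned by `rouge.compute(...)` in a setup like this.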
