Added new result for GEC (sebastianruder#345)
butsugiri authored and sebastianruder committed Sep 18, 2019
1 parent 8606073 commit ddf81c8
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions english/grammatical_error_correction.md
@@ -18,6 +18,7 @@ The shared task setting restricts that systems use only publicly available datas

| Model | F0.5 | Paper / Source | Code |
| ------------- | :-----:| --- | :-----: |
| Transformer + Pre-train with Pseudo Data (Kiyono et al., EMNLP 2019) | 65.0 | [An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction](https://arxiv.org/abs/1909.00502) | NA |
| Copy-Augmented Transformer + Pre-train (Zhao and Wang, NAACL 2019) | 61.15 | [Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data](https://arxiv.org/pdf/1903.00138.pdf) | [Official](https://github.com/zhawe01/fairseq-gec) |
| CNN Seq2Seq + Quality Estimation (Chollampatt and Ng, EMNLP 2018) | 56.52 | [Neural Quality Estimation of Grammatical Error Correction](http://aclweb.org/anthology/D18-1274) | [Official](https://github.com/nusnlp/neuqe/) |
| SMT + BiGRU (Grundkiewicz and Junczys-Dowmunt, 2018) | 56.25 | [Near Human-Level Performance in Grammatical Error Correction with Hybrid Machine Translation](http://aclweb.org/anthology/N18-2046) | NA |
@@ -116,6 +117,7 @@ Since current state-of-the-art systems rely on as much annotated learner data as

| Model | F0.5 | Paper / Source | Code |
| ------------- | :-----:| --- | :-----: |
| Transformer + Pre-train with Pseudo Data (Kiyono et al., EMNLP 2019) | 70.2 | [An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction](https://arxiv.org/abs/1909.00502) | NA |
| Transformer | 69.47 | [Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data](https://www.aclweb.org/anthology/W19-4427) | [Official: Code to be updated soon](https://github.com/grammatical/pretraining-bea2019) |
| Transformer | 69.00 | [A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning](https://www.aclweb.org/anthology/W19-4423) | [Official](https://github.com/kakaobrain/helo_word/) |
| Ensemble of models | 66.78 | [The LAIX Systems in the BEA-2019 GEC Shared Task](https://www.aclweb.org/anthology/W19-4416) | NA |
