# Dilated_Res_Attention_LSTM

Simple PyTorch reimplementations of Dilated LSTM, Residual LSTM, and Attentive LSTM, following the corresponding papers. Run `main.py` to test the models. Hope it is useful!
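As a rough illustration of one of the three variants, here is a minimal sketch of the residual-LSTM idea in PyTorch: the layer's output is the LSTM output plus a skip connection from its input. The class and parameter names are hypothetical and not taken from this repo; the actual modules in the code may be structured differently.

```python
import torch
import torch.nn as nn


class ResidualLSTM(nn.Module):
    """Hypothetical sketch: an LSTM layer whose output is summed with a
    (projected) skip connection from its input, as in residual networks."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        # Project the input to hidden_size when dimensions differ,
        # so the residual addition is well-defined.
        self.proj = (nn.Linear(input_size, hidden_size)
                     if input_size != hidden_size else nn.Identity())

    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, seq_len, hidden_size)
        return out + self.proj(x)      # residual connection


x = torch.randn(2, 10, 8)              # (batch, seq_len, features)
layer = ResidualLSTM(input_size=8, hidden_size=16)
y = layer(x)
print(tuple(y.shape))                   # (2, 10, 16)
```

The same skip-connection pattern stacks naturally: feeding one `ResidualLSTM` into another lets gradients flow through the additive shortcuts, which is the main motivation for residual LSTMs in deep recurrent stacks.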