
v2.0.0, update Autograd

@AkiRusProd AkiRusProd released this 21 Jul 14:17
· 26 commits to master since this release
19d2d54
  • Autograd is now tape-based (previously it was fully recursive)
  • Bug fixes and optimizations
    • Simplify nn.Embedding
    • Simplify autograd reverse_broadcast method
    • Fix the where grad_fn method
  • Add Seq2Seq Transformer example
  • Other
    • Add nn.LogSoftmax
    • Edit nn.Linear bias initialization
    • Other fixes and improvements

The new Autograd version enables transformer training, overcoming the previous limitation of slow backpropagation.
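To illustrate the difference, here is a minimal sketch of what a tape-based reverse-mode autograd looks like: the graph is walked once to record operations in topological order (the "tape"), then gradients are propagated in a single iterative reverse sweep instead of deep recursive calls. This is an assumption-laden toy, not the library's actual implementation; the Tensor class and its methods are hypothetical names.

```python
# Toy tape-based reverse-mode autograd (illustrative only;
# not the library's actual implementation).
import numpy as np

class Tensor:
    def __init__(self, data, requires_grad=False):
        self.data = np.asarray(data, dtype=float)
        self.grad = None
        self.requires_grad = requires_grad
        self._backward_fn = None   # closure that maps out-grad -> parent grads
        self._parents = ()

    def __add__(self, other):
        out = Tensor(self.data + other.data, requires_grad=True)
        out._parents = (self, other)
        out._backward_fn = lambda grad: (grad, grad)
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, requires_grad=True)
        out._parents = (self, other)
        out._backward_fn = lambda grad: (grad * other.data, grad * self.data)
        return out

    def backward(self):
        # Build the tape: one topological-order pass over the graph.
        tape, visited = [], set()
        def topo(t):
            if id(t) in visited:
                return
            visited.add(id(t))
            for p in t._parents:
                topo(p)
            tape.append(t)
        topo(self)

        # Replay the tape in reverse: iterative sweep, no recursive backprop.
        self.grad = np.ones_like(self.data)
        for t in reversed(tape):
            if t._backward_fn is None:
                continue
            for parent, g in zip(t._parents, t._backward_fn(t.grad)):
                parent.grad = g if parent.grad is None else parent.grad + g

# Usage: for z = x*y + x, dz/dx = y + 1 = 4, dz/dy = x = 2
x, y = Tensor(2.0, True), Tensor(3.0, True)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

Recording the tape up front means the backward pass is a flat loop over recorded nodes, which avoids Python's recursion overhead and stack-depth limits on deep graphs such as transformers.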