Commit

clip_grad_norm is now deprecated
AutuanLiu committed Nov 9, 2018
1 parent c48008b commit a16173f
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions tutorials/02-intermediate/language_model/main.py
@@ -3,7 +3,7 @@
 import torch
 import torch.nn as nn
 import numpy as np
-from torch.nn.utils import clip_grad_norm
+from torch.nn.utils import clip_grad_norm_
 from data_utils import Dictionary, Corpus


@@ -78,7 +78,7 @@ def detach(states):
         # Backward and optimize
         model.zero_grad()
         loss.backward()
-        clip_grad_norm(model.parameters(), 0.5)
+        clip_grad_norm_(model.parameters(), 0.5)
         optimizer.step()
 
         step = (i+1) // seq_length
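For context: the rename follows PyTorch's convention that in-place operations carry a trailing underscore. clip_grad_norm_ rescales each parameter's gradient in place so that the total norm of all gradients does not exceed max_norm, and returns the total norm measured before clipping. Below is a minimal sketch of the pattern this commit switches to; the toy model, optimizer, and dummy batch are illustrative assumptions, not code from the tutorial.

    import torch
    import torch.nn as nn
    from torch.nn.utils import clip_grad_norm_

    # Toy setup (assumption): a small linear model with an SGD optimizer.
    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = nn.MSELoss()

    # Dummy batch (assumption).
    inputs = torch.randn(4, 10)
    targets = torch.randn(4, 1)

    # One training step with gradient clipping, mirroring the tutorial's loop.
    model.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    # In-place: rescales the gradients so their total 2-norm is at most 0.5,
    # the same max_norm the tutorial passes. Returns the pre-clipping norm.
    total_norm = clip_grad_norm_(model.parameters(), max_norm=0.5)
    optimizer.step()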
