Commit 61ea801
Switched from SGD to Adam Optimizer
SullyChen authored Oct 3, 2016
1 parent cc518e6 commit 61ea801
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion train.py
@@ -8,7 +8,7 @@
 sess = tf.InteractiveSession()

 loss = tf.reduce_mean(tf.square(tf.sub(model.y_, model.y)))
-train_step = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
+train_step = tf.train.AdamOptimizer(1e-4).minimize(loss)
 sess.run(tf.initialize_all_variables())

 saver = tf.train.Saver()
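For context on what this one-line change does: unlike plain gradient descent, which applies the same fixed learning rate to every parameter, Adam keeps per-parameter running estimates of the gradient's first and second moments and scales each update accordingly. A minimal NumPy sketch of both update rules (the function names are illustrative, not TensorFlow's implementation; the beta/epsilon values match TensorFlow's `AdamOptimizer` defaults, and the learning rates match the two sides of this diff):

```python
import numpy as np

def sgd_step(theta, grad, lr=0.01):
    """Plain gradient descent, as used before this commit."""
    return theta - lr * grad

def adam_step(theta, grad, m, v, t, lr=1e-4,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for step t (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad       # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2  # running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)             # bias-correct the moment estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The per-parameter scaling by `sqrt(v_hat)` is why Adam typically tolerates a smaller, less hand-tuned learning rate (here 1e-4) than the 0.01 used with plain SGD.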
