When using a custom training loop in TensorFlow 2.0, is there any function or method to display the learning rate?
Here is an example from the TensorFlow guide:
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images)
        loss = loss_object(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    train_loss(loss)
    train_accuracy(labels, predictions)
How can I retrieve the current learning rate from the optimizer while the model is training?
Any help would be greatly appreciated. :)
In TensorFlow 2.1, the Optimizer class has an undocumented method _decayed_lr
(see the definition here), which you can call in the training loop by supplying the variable type to cast the result to:
current_learning_rate = optimizer._decayed_lr(tf.float32)
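For intuition, _decayed_lr evaluates the optimizer's learning-rate schedule at its current iteration count. A pure-Python sketch of what an exponential decay schedule computes, mirroring the formula documented for tf.keras.optimizers.schedules.ExponentialDecay (the function name decayed_lr here is just for illustration, not a TensorFlow API):

```python
def decayed_lr(initial_lr, decay_rate, step, decay_steps, staircase=False):
    # Mirrors the ExponentialDecay formula:
    #   lr = initial_lr * decay_rate ** (step / decay_steps)
    # With staircase=True, the exponent is truncated to an integer,
    # so the rate drops in discrete steps instead of continuously.
    p = step / decay_steps
    if staircase:
        p = int(p)
    return initial_lr * decay_rate ** p

print(decayed_lr(0.1, 0.5, 20, 10))  # 0.025 (two full decay periods)
```

With a schedule attached to the optimizer, _decayed_lr(tf.float32) returns this value for step = optimizer.iterations.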
Here is also a more complete example that logs the learning rate to TensorBoard.
train_step_count = 0
summary_writer = tf.summary.create_file_writer('logs/')
def train_step(images, labels):
    # The counter is defined at module level, so declare it global
    # before incrementing it inside the function.
    global train_step_count
    train_step_count += 1
    with tf.GradientTape() as tape:
        predictions = model(images)
        loss = loss_object(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    # optimizer._decayed_lr(tf.float32) is the current learning rate.
    # You can save it to TensorBoard like so:
    with summary_writer.as_default():
        tf.summary.scalar('learning_rate',
                          optimizer._decayed_lr(tf.float32),
                          step=train_step_count)