The current version of your code will randomly generate a new value for rand_var_1 and rand_var_2 on each call to sess.run() (although since you set the seed to 0, they will have the same value within a single call to sess.run()).
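To see why, note that a TensorFlow op is a recipe that is re-executed on every evaluation, not a stored value. As a rough analogy in plain Python (no TensorFlow involved; the names here are purely illustrative):

```python
import random

random.seed(0)

def sample():
    # Like evaluating a tf.random_uniform op: every call re-samples.
    return [random.randint(0, 9) for _ in range(5)]

# Two separate evaluations generally give different vectors,
# just as two calls to sess.run() re-run the random op.
draw1 = sample()
draw2 = sample()

# Caching the result (analogous to assigning the op to a tf.Variable
# and initializing it once) fixes the numbers for later reuse.
cached = sample()
use1 = list(cached)
use2 = list(cached)
assert use1 == use2  # reuses the cached values
```

The fix below applies the same idea in TensorFlow: store the random tensor in a variable so it is sampled once at initialization and reused afterwards.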
If you want to retain the value of a randomly generated tensor for later use, you should assign it to a tf.Variable (https://www.tensorflow.org/versions/master/api_docs/python/state_ops.html#Variable):
rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
# Or, alternatively:
rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(rand_var_1.initialized_value())
# Or, alternatively:
rand_t = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)
rand_var_1 = tf.Variable(rand_t)
rand_var_2 = tf.Variable(rand_t)
...then a single call to tf.initialize_all_variables() (https://www.tensorflow.org/versions/master/api_docs/python/state_ops.html#initialize_all_variables) will have the desired effect:
# Op 1
z1 = tf.add(rand_var_1, rand_var_2)
# Op 2
z2 = tf.add(rand_var_1, rand_var_2)
init = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(init)         # Random numbers generated here and cached.
    z1_op = sess.run(z1)   # Reuses cached values for rand_var_1, rand_var_2.
    z2_op = sess.run(z2)   # Reuses cached values for rand_var_1, rand_var_2.
    print(z1_op, z2_op)    # Will print two identical vectors.