What:
In TensorFlow, to tell variables apart (for example in the TensorBoard display), variables need to be named inside namespaces. The two functions most commonly used for this are tf.variable_scope and tf.name_scope.
Why:
While writing my own code, I used the following function to create variables and perform a convolution:
    import tensorflow as tf
    import numpy as np

    def my_conv2d(data, name, kh, kw, sh, sw, n_out):
        n_in = np.shape(data)[-1]
        with tf.name_scope(name):
            kernel = tf.get_variable(name="W", shape=[kh, kw, n_in, n_out],
                                     dtype=tf.float32,
                                     initializer=tf.contrib.layers.xavier_initializer())
            bias = tf.Variable(tf.constant(0.0, shape=[n_out], dtype=tf.float32), name="b")
            conv = tf.nn.conv2d(data, kernel, strides=[1, sh, sw, 1], padding='SAME', name="Conv")
            result = tf.nn.relu(tf.nn.bias_add(conv, bias), name="Act")
            return result
Running it raises:
ValueError: Variable bar already exists, disallowed. Did you mean to set reuse=True in VarScope? ...
How:
The book <TensorFlow 實戰Google深度學習框架> (中國工信出版社), p. 231, gives a detailed explanation of tf.name_scope and tf.variable_scope; the relevant part is reproduced below:
These two functions are equivalent in most cases; the only difference appears when tf.get_variable is used.
    import tensorflow as tf

    with tf.variable_scope("foo"):
        a = tf.get_variable("bar", [1])
        print a.name               # prints foo/bar:0

    with tf.variable_scope("bar"):
        b = tf.get_variable("bar", [1])
        print b.name               # prints bar/bar:0

    with tf.name_scope("a"):
        a = tf.Variable([1])
        print a.name               # prints a/Variable:0

        a = tf.get_variable("b", [1])
        print a.name               # prints b:0 -- tf.get_variable ignores tf.name_scope

    with tf.name_scope("b"):
        tf.get_variable("b", [1])  # ValueError: variable b already exists
As the output above shows, variables created with tf.get_variable() are named independently of tf.name_scope but are affected by tf.variable_scope. In my case, the problem is that repeated calls to my_conv2d try to create variables with the same names, so one fix is to switch to tf.variable_scope. Another solution, shown below, is to give the variables a different name on each call:
    def my_conv2d(data, name, kh, kw, sh, sw, n_out):
        n_in = np.shape(data)[-1]
        with tf.name_scope(name) as scope:
            kernel = tf.get_variable(name=scope + "W", shape=[kh, kw, n_in, n_out],
                                     dtype=tf.float32,
                                     initializer=tf.contrib.layers.xavier_initializer())
            bias = tf.Variable(tf.constant(0.0, shape=[n_out], dtype=tf.float32), name=scope + "b")
            conv = tf.nn.conv2d(data, kernel, strides=[1, sh, sw, 1], padding='SAME', name=scope + "Conv")
            result = tf.nn.relu(tf.nn.bias_add(conv, bias), name=scope + "Act")
            return result