As you can see, the weights we saved earlier were indeed preserved!
But here comes the problem: our network needs an interface for feeding in custom data! Otherwise, what use would it be? Don't worry, of course there is a way.
import tensorflow as tf
from tensorflow.python.framework.graph_util import convert_variables_to_constants

a = tf.Variable([[3], [4]], dtype=tf.float32, name='a')
b = tf.Variable(4, dtype=tf.float32, name='b')
input_tensor = tf.placeholder(tf.float32, name='input')
output = tf.add((a + b), input_tensor, name='out')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Freeze the variables into constants so the graph file is self-contained
    graph = convert_variables_to_constants(sess, sess.graph_def, ["out"])
    tf.train.write_graph(graph, '.', 'graph.pb', as_text=False)
Use the code above to re-save the network to graph.pb. This time we have an input placeholder. Now let's see how to restore the network and feed it custom data.
import tensorflow as tf

with tf.Session() as sess:
    with open('./graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        # Map the constant 4. onto the graph's 'input' placeholder at import time
        output = tf.import_graph_def(graph_def,
                                     input_map={'input:0': 4.},
                                     return_elements=['out:0'],
                                     name='a')
        print(sess.run(output))
The output is:
[array([[ 11.],
[ 12.]], dtype=float32)]
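As a side note, here is a minimal sketch (assuming the same graph.pb as above) of an alternative to return_elements: because import_graph_def was given name='a', the imported ops are prefixed with 'a/', so the output tensor can also be looked up by its qualified name.

import tensorflow as tf

with tf.Session() as sess:
    with open('./graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        # Import without return_elements; imported nodes get the 'a/' prefix
        tf.import_graph_def(graph_def, input_map={'input:0': 4.}, name='a')
    # Look the output tensor up by its prefixed name instead
    out = sess.graph.get_tensor_by_name('a/out:0')
    print(sess.run(out))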
As you can see, the result is correct. Of course, the value in input_map can also be replaced with a new, custom placeholder, as shown below:
import tensorflow as tf

new_input = tf.placeholder(tf.float32, shape=())

with tf.Session() as sess:
    with open('./graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        # Map our own placeholder onto the graph's 'input' node
        output = tf.import_graph_def(graph_def,
                                     input_map={'input:0': new_input},
                                     return_elements=['out:0'],
                                     name='a')
        print(sess.run(output, feed_dict={new_input: 4}))
Check the output; again, everything is fine.
[array([[ 11.],
[ 12.]], dtype=float32)]
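The point of mapping to a placeholder rather than a constant is that the input can change from run to run. Here is a minimal sketch (assuming the same graph.pb as above, where a + b evaluates to [[7.], [8.]]) that evaluates the imported graph twice with different inputs:

import tensorflow as tf

new_input = tf.placeholder(tf.float32, shape=())

with tf.Session() as sess:
    with open('./graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    output = tf.import_graph_def(graph_def,
                                 input_map={'input:0': new_input},
                                 return_elements=['out:0'],
                                 name='a')
    # The same imported graph can be evaluated with different inputs
    print(sess.run(output, feed_dict={new_input: 10}))   # [[17.], [18.]]
    print(sess.run(output, feed_dict={new_input: -7}))   # [[0.], [1.]]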
One more thing worth noting: when writing the network architecture with tf.train.write_graph, if you set as_text=True, then a small modification is needed when importing the network.
import tensorflow as tf
from google.protobuf import text_format

with tf.Session() as sess:
    # Do not use 'rb' mode here
    with open('./graph.pb', 'r') as f:
        graph_def = tf.GraphDef()
        # Do not use graph_def.ParseFromString(f.read())
        text_format.Merge(f.read(), graph_def)
        output = tf.import_graph_def(graph_def, return_elements=['out:0'])
        print(sess.run(output))
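For completeness, here is a minimal sketch of the corresponding save side, assuming the same toy graph as in the first snippet, with only the as_text flag changed:

import tensorflow as tf
from tensorflow.python.framework.graph_util import convert_variables_to_constants

a = tf.Variable([[3], [4]], dtype=tf.float32, name='a')
b = tf.Variable(4, dtype=tf.float32, name='b')
input_tensor = tf.placeholder(tf.float32, name='input')
output = tf.add((a + b), input_tensor, name='out')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    graph = convert_variables_to_constants(sess, sess.graph_def, ["out"])
    # Write the graph in protobuf text format instead of binary
    tf.train.write_graph(graph, '.', 'graph.pb', as_text=True)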