AI - TensorFlow - Visualization Tool TensorBoard


TensorBoard

TensorFlow's built-in visualization tool. It renders the whole neural network as an intuitive dataflow graph, clearly showing the model's structure, which makes the model easier to understand and problems easier to spot.

Starting TensorBoard

  • Run "tensorboard --logdir=path/to/log-directory" (or "python -m tensorboard.main");
  • The logdir argument points to the directory where the FileWriter serialized its data; it is recommended to run the command from the parent directory of logdir;
  • Once TensorBoard is running, open "localhost:6006" in a browser to view it;

Help

  • Run "tensorboard --help" to see the full list of tensorboard options

Example

Code

# coding=utf-8
from __future__ import print_function
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'


# ### Adding a layer


def add_layer(inputs, in_size, out_size, n_layer, activation_function=None):  # n_layer identifies the layer
    layer_name = 'layer{}'.format(n_layer)
    with tf.name_scope(layer_name):  # tf.name_scope groups nodes and sets the display name in the graph view
        with tf.name_scope('weights'):  # note: a child scope of the layer scope above
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')  # name sets the node's display name
            tf.summary.histogram(layer_name + '/weights', Weights)  # histogram summary: chart name and tracked variable
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b')
            tf.summary.histogram(layer_name + '/biases', biases)  # histogram summary
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.matmul(inputs, Weights) + biases
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        tf.summary.histogram(layer_name + '/outputs', outputs)  # histogram summary
        return outputs


# ### Building the data
x_data = np.linspace(-1, 1, 300, dtype=np.float32)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape).astype(np.float32)
y_data = np.square(x_data) - 0.5 + noise

# ### Building the network
with tf.name_scope('inputs'):  # node group for the inputs
    xs = tf.placeholder(tf.float32, [None, 1], name='x_input')  # name is the label shown in the graph view
    ys = tf.placeholder(tf.float32, [None, 1], name='y_input')

h1 = add_layer(xs, 1, 10, n_layer=1, activation_function=tf.nn.relu)  # hidden layer
prediction = add_layer(h1, 10, 1, n_layer=2, activation_function=None)  # output layer

with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction),
                                        reduction_indices=[1]))
    tf.summary.scalar('loss', loss)  # scalar summary; loss appears on TensorBoard's Scalars tab

with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

sess = tf.Session()
merged = tf.summary.merge_all()  # merge all summary ops defined above
writer = tf.summary.FileWriter("logs/", sess.graph)  # create the FileWriter and the event file in the given directory
init = tf.global_variables_initializer()
sess.run(init)

# ### Visualizing the result
fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
ax.scatter(x_data, y_data)
plt.ion()
plt.show()

# ### Training
for i in range(1001):
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        result = sess.run(loss, feed_dict={xs: x_data, ys: y_data})
        print("Steps:{}  Loss:{}".format(i, result))
        rs = sess.run(merged, feed_dict={xs: x_data, ys: y_data})  # evaluate the merged summaries in sess.run
        writer.add_summary(rs, i)
        try:
            ax.lines.remove(lines[0])
        except Exception:
            pass
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)
        plt.pause(0.2)
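A side note on the loss used above: because ys and prediction have shape [N, 1], reduce_sum over axis 1 followed by reduce_mean collapses to the plain mean squared error. A small numpy check (numpy stands in for the TensorFlow ops; the sample values are made up):

```python
import numpy as np

# y and pred have shape [N, 1], as in the network above
y = np.array([[0.0], [1.0], [2.0]])
pred = np.array([[0.5], [1.0], [1.0]])

# tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1])
per_sample = np.sum(np.square(y - pred), axis=1)   # shape [N]
# tf.reduce_mean(...)
loss = np.mean(per_sample)

# identical to the plain mean squared error for single-column targets
mse = np.mean(np.square(y - pred))
print(loss, mse)  # both 0.41666...
```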

 

Program Output

Figure displayed while the program runs: (animated matplotlib plot; image omitted)

Command-line output from one run:

Steps:0  Loss:0.19870562851428986
Steps:50  Loss:0.006314810831099749
Steps:100  Loss:0.0050856382586061954
Steps:150  Loss:0.0048223137855529785
Steps:200  Loss:0.004617161583155394
Steps:250  Loss:0.004429362714290619
Steps:300  Loss:0.004260621033608913
Steps:350  Loss:0.004093690309673548
Steps:400  Loss:0.003932977095246315
Steps:450  Loss:0.0038178395479917526
Steps:500  Loss:0.003722294932231307
Steps:550  Loss:0.003660505171865225
Steps:600  Loss:0.0036110866349190474
Steps:650  Loss:0.0035716891288757324
Steps:700  Loss:0.0035362064372748137
Steps:750  Loss:0.0034975067246705294
Steps:800  Loss:0.003465239657089114
Steps:850  Loss:0.003431882942095399
Steps:900  Loss:0.00339301535859704
Steps:950  Loss:0.0033665322698652744
Steps:1000  Loss:0.003349516075104475
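A loss trace like the one above can be reproduced without TensorFlow. Below is a minimal numpy sketch of the same one-hidden-layer ReLU network trained with hand-written full-batch gradient descent; the layer sizes, learning rate, and data match the script, while the variable and helper names are my own:

```python
import numpy as np

np.random.seed(0)
x = np.linspace(-1, 1, 300, dtype=np.float32)[:, np.newaxis]
y = np.square(x) - 0.5 + np.random.normal(0, 0.05, x.shape).astype(np.float32)

# parameters: 1 -> 10 (ReLU) -> 1, mirroring add_layer()
W1 = np.random.randn(1, 10).astype(np.float32)
b1 = np.full((1, 10), 0.1, dtype=np.float32)
W2 = np.random.randn(10, 1).astype(np.float32)
b2 = np.full((1, 1), 0.1, dtype=np.float32)
lr = 0.1

losses = []
for step in range(1001):
    # forward pass
    h = np.maximum(x @ W1 + b1, 0.0)      # hidden layer, ReLU
    pred = h @ W2 + b2                    # output layer, linear
    err = pred - y
    loss = np.mean(np.sum(err ** 2, axis=1))
    if step % 50 == 0:
        losses.append(loss)
    # backprop for the mean squared error
    g_pred = 2.0 * err / x.shape[0]
    g_W2 = h.T @ g_pred
    g_b2 = g_pred.sum(axis=0, keepdims=True)
    g_h = g_pred @ W2.T
    g_h[h <= 0] = 0.0                     # ReLU gradient mask
    g_W1 = x.T @ g_h
    g_b1 = g_h.sum(axis=0, keepdims=True)
    # gradient descent step
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

print(losses[0], losses[-1])  # the loss falls steadily, as in the run above
```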

 

Generated TensorBoard event file:

(mlcc) D:\Anliven\Anliven-Code\PycharmProjects\TempTest>dir logs
 Volume in drive D is Files
 Volume Serial Number is ACF9-2E0E

 Directory of D:\Anliven\Anliven-Code\PycharmProjects\TempTest\logs

2019/02/24  23:41    <DIR>          .
2019/02/24  23:41    <DIR>          ..
2019/02/24  23:41           137,221 events.out.tfevents.1551022894.DESKTOP-68OFQFP
               1 File(s)         137,221 bytes
               2 Dir(s)  219,401,887,744 bytes free

(mlcc) D:\Anliven\Anliven-Code\PycharmProjects\TempTest>
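The events.out.tfevents file shown above uses TFRecord framing: each record is an 8-byte little-endian payload length, a 4-byte masked CRC of that length, the payload (here, a serialized Event proto), and a 4-byte masked CRC of the payload. A minimal reader sketch that skips CRC verification and demonstrates the framing on fabricated in-memory data (function names are my own):

```python
import struct

def iter_tfrecord_payloads(data):
    """Yield raw payload bytes from TFRecord-framed data (CRCs not verified)."""
    offset = 0
    while offset < len(data):
        (length,) = struct.unpack_from('<Q', data, offset)  # 8-byte length
        offset += 8 + 4                                     # skip length + its CRC
        payload = data[offset:offset + length]
        offset += length + 4                                # skip payload + its CRC
        yield payload

# build a fake two-record "file" in memory (zeroed CRCs, since we skip them)
def frame(payload):
    return struct.pack('<Q', len(payload)) + b'\x00' * 4 + payload + b'\x00' * 4

blob = frame(b'event-1') + frame(b'event-2')
print(list(iter_tfrecord_payloads(blob)))  # [b'event-1', b'event-2']
```

A real reader would verify the masked CRC32C checksums and decode each payload as an Event protocol buffer.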

 

Starting TensorBoard

Run the command below, then open "http://localhost:6006/" in a browser.

(mlcc) D:\Anliven\Anliven-Code\PycharmProjects\TempTest>tensorboard --logdir=logs
TensorBoard 1.12.0 at http://DESKTOP-68OFQFP:6006 (Press CTRL+C to quit)

 

Scalars Tab

 

Graphs Tab

  • Use the mouse wheel to zoom and reposition the graph
  • Double-click a "+" node to expand it and see more detail
  • A scope can be moved out of the main graph and displayed separately

 

Distributions Tab

 

Histograms Tab

 

