Linear Regression with PaddlePaddle


In this experiment we will use PaddlePaddle to build a simple linear regression model and use it to predict how large a house (in a given area) your savings could buy. Along the way you will meet several important machine learning concepts and learn the basic workflow of a machine learning prediction task.

Basic concepts of linear regression

Linear regression is one of the simplest yet most important models in machine learning. Building it follows this workflow: obtain data, preprocess the data, train the model, apply the model.

A regression model can be understood as fitting a curve to the distribution of a set of points. If the fitted curve is a straight line, it is called linear regression; if it is a quadratic curve, quadratic regression. Linear regression is the simplest kind of regression model.

Linear regression involves a few basic concepts you need to master:

  • Hypothesis function
  • Loss function
  • Optimization algorithm

Hypothesis function:

The hypothesis function describes, mathematically, the relationship between the independent and dependent variables; it can be a linear or a nonlinear function. In this linear regression model our hypothesis function is Ŷ = aX₁ + b, where Ŷ denotes the model's prediction (the predicted house price), to distinguish it from the true value Y. The parameters the model must learn are a and b.
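The hypothesis can be evaluated directly with NumPy; the values a = 2 and b = 1 below are illustrative, not learned parameters:

```python
import numpy as np

# Evaluate the hypothesis Y_hat = a*X1 + b for a batch of inputs.
a, b = 2.0, 1.0                 # illustrative parameters
X1 = np.array([1.0, 2.0, 3.0])  # three sample inputs
Y_hat = a * X1 + b
print(Y_hat)  # [3. 5. 7.]
```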

Loss function:

The loss function measures, mathematically, the error between the hypothesis function's predictions and the true values. The smaller this gap, the more accurate the prediction; the algorithm's job is to make it ever smaller.

After building the model, we need to give it an optimization objective so that the learned parameters make the prediction Ŷ as close as possible to the true value Y. Given any data sample's target value Yᵢ and the model's prediction Ŷᵢ, the loss function outputs a non-negative real value, which usually reflects the magnitude of the model's error.

For linear models, the most commonly used loss function is the mean squared error (MSE):

MSE = (1/n) Σᵢ₌₁ⁿ (Ŷᵢ − Yᵢ)²

That is, for a test set of size n, MSE is the mean of the squared prediction errors over the n samples.
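The formula above translates directly into a few lines of NumPy (the sample values are made up for illustration):

```python
import numpy as np

def mse(y_hat, y):
    """Mean squared error: average of squared prediction errors over n samples."""
    y_hat = np.asarray(y_hat, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.mean((y_hat - y) ** 2))

# Three predictions vs. three true values: errors are -0.5, 0.5, 0.0,
# so MSE = (0.25 + 0.25 + 0) / 3 ≈ 0.1667
print(mse([2.5, 0.0, 2.0], [3.0, -0.5, 2.0]))
```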

Optimization algorithm:

The optimization algorithm is also crucial in model training: it determines the model's accuracy and training speed. The linear regression example in this chapter mainly uses gradient descent for optimization.

Now let's get into the experiment!

 

First, import the necessary packages:

paddle.fluid ----> the PaddlePaddle deep learning framework

numpy -----------> Python's fundamental library for scientific computing

os --------------> Python module for interacting with the operating system

matplotlib ------> Python plotting library, convenient for line charts, scatter plots, and other figures

 

import paddle.fluid as fluid
import paddle
import numpy as np
import os
import matplotlib.pyplot as plt

 

 

Step1: Prepare the data.

(1) The uci_housing dataset

The dataset has 506 rows of 14 columns each. The first 13 columns describe attributes of a house; the last column is the median price of houses of that kind.

PaddlePaddle provides interfaces for reading the uci_housing training and test sets: paddle.dataset.uci_housing.train() and paddle.dataset.uci_housing.test().

(2) train_reader and test_reader

paddle.reader.shuffle() buffers BUF_SIZE data items at a time and shuffles them.

paddle.batch() groups every BATCH_SIZE items into one batch.
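What the two wrappers do can be sketched in plain Python: a buffered shuffler that accumulates items and yields them in random order, fed into a batcher that groups them. This is an illustrative sketch of the idea, not Paddle's actual implementation:

```python
import random

def shuffle_reader(reader, buf_size):
    """Buffer up to buf_size items, shuffle the buffer, then yield it."""
    buf = []
    for item in reader:
        buf.append(item)
        if len(buf) >= buf_size:
            random.shuffle(buf)
            for b in buf:
                yield b
            buf = []
    random.shuffle(buf)       # flush the final partial buffer
    for b in buf:
        yield b

def batch_reader(reader, batch_size):
    """Group consecutive items into lists of batch_size (last may be short)."""
    batch = []
    for item in reader:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

batches = list(batch_reader(shuffle_reader(range(10), buf_size=4), batch_size=3))
print([len(b) for b in batches])  # [3, 3, 3, 1]
```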

BUF_SIZE=500
BATCH_SIZE=20

#Data provider for training: each call reads one batch of data at random from the buffer
train_reader = paddle.batch(
    paddle.reader.shuffle(paddle.dataset.uci_housing.train(), 
                          buf_size=BUF_SIZE),                    
    batch_size=BATCH_SIZE)   
#Data provider for testing: each call reads one batch of data at random from the buffer
test_reader = paddle.batch(
    paddle.reader.shuffle(paddle.dataset.uci_housing.test(),
                          buf_size=BUF_SIZE),
    batch_size=BATCH_SIZE)  

 

(3) Print a sample to see what the data looks like. The data returned by the PaddlePaddle interface has already been normalized and otherwise preprocessed.

(array([-0.02964322, -0.11363636, 0.39417967, -0.06916996, 0.14260276, -0.10109875, 0.30715859, -0.13176829, -0.24127857, 0.05489093, 0.29196451, -0.2368098 , 0.12850267]), array([15.6])),

# Print a sample to inspect the uci_housing data
train_data = paddle.dataset.uci_housing.train()
sampledata = next(train_data())
print(sampledata)
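A common normalization scheme rescales each feature column by its mean and range; this sketch is an assumption for illustration, not necessarily the exact transform the reader applies:

```python
import numpy as np

def normalize(features):
    """Rescale each column to (value - column mean) / (column max - column min)."""
    features = np.asarray(features, dtype=float)
    return (features - features.mean(axis=0)) / (features.max(axis=0) - features.min(axis=0))

raw = np.array([[1.0, 10.0],
                [2.0, 20.0],
                [3.0, 30.0]])
print(normalize(raw))  # each column becomes [-0.5, 0.0, 0.5]
```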

 

Step2: Network configuration

(1) Network definition: for linear regression, the network is simply a fully connected layer from input to output.

For the Boston housing dataset, we assume the relationship between the attributes and the house price can be described by a linear combination of the attributes.

 

#Define the tensor variable x, the 13-dimensional feature vector
x = fluid.layers.data(name='x', shape=[13], dtype='float32')
#Define the tensor y, the target value
y = fluid.layers.data(name='y', shape=[1], dtype='float32')
#Define a simple linear network: a fully connected layer from input to output
#input: the input tensor
#size: the number of output units of this layer
#act: the activation function
y_predict=fluid.layers.fc(input=x,size=1,act=None)

 

(2) Define the loss function

Here we use the mean squared error loss.

square_error_cost(input, label): takes the predicted and target values and returns the squared error, i.e. (y - y_predict)².

cost = fluid.layers.square_error_cost(input=y_predict, label=y) #loss for one batch
avg_cost = fluid.layers.mean(cost)                              #average loss over the batch

 

(3) Define the optimizer

Here we use stochastic gradient descent (SGD).

test_program = fluid.default_main_program().clone(for_test=True)
optimizer = fluid.optimizer.SGDOptimizer(learning_rate=0.001)
opts = optimizer.minimize(avg_cost)

Once the model is configured as above, we end up with two fluid.Programs: fluid.default_startup_program() and fluid.default_main_program().

Parameter initialization operations are written into fluid.default_startup_program().

fluid.default_main_program() returns the default (global) main program, which is used for training and testing the model. All layer functions in fluid.layers add operators and variables to default_main_program, and it is the default value of the Program argument in many Fluid APIs: for example, when the user does not pass a program, Executor.run() executes default_main_program by default.

 

Step3. Model training and Step4. Model evaluation

 

(1) Create an Executor

First choose the place of execution: fluid.CPUPlace() and fluid.CUDAPlace(0) select the CPU and the GPU respectively.

Executor: receives a program and runs it via its run() method.

use_cuda = False                         #use_cuda=False runs on the CPU; use_cuda=True runs on the GPU
place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
exe = fluid.Executor(place)              #create an Executor instance exe
exe.run(fluid.default_startup_program()) #run startup_program() to initialize the parameters

(2) Define the input data format

DataFeeder converts the data returned by the data providers (train_reader, test_reader) into a structure that can be fed into the Executor.

feed_list sets the variables (or variable names) fed into the model.

# Define the input data format
feeder = fluid.DataFeeder(place=place, feed_list=[x, y]) #feed_list: the variables fed into the model

(3) Define draw_train_process, which plots how the training loss changes over iterations

 

iter = 0
iters=[]
train_costs=[]

def draw_train_process(iters,train_costs):
    title="training cost"
    plt.title(title, fontsize=24)
    plt.xlabel("iter", fontsize=14)
    plt.ylabel("cost", fontsize=14)
    plt.plot(iters, train_costs,color='red',label='training cost') 
    plt.grid()
    plt.show()

(4) Train and save the model

The Executor receives a program and, according to the feed map (input mapping) and fetch_list (result list), adds feed operators and fetch operators to it. The feed map supplies the program's input data; fetch_list names the variables the user wants back after training.

Training data is supplied via feed: the reader's data is first converted into Tensors PaddlePaddle understands, then passed to the executor for training.

Note: enumerate() wraps an iterable (such as a list, tuple, or string) into an indexed sequence, yielding both the index and the item.
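The enumerate() behavior described in the note, in miniature:

```python
readings = ["a", "b", "c"]
pairs = list(enumerate(readings))  # pairs each item with its index
print(pairs)  # [(0, 'a'), (1, 'b'), (2, 'c')]
```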

EPOCH_NUM=50
model_save_dir = "/home/aistudio/work/fit_a_line.inference.model"

for pass_id in range(EPOCH_NUM):                                  #train for EPOCH_NUM epochs
    # train, printing the loss of every 40th batch
    train_cost = 0
    for batch_id, data in enumerate(train_reader()):              #iterate over train_reader
        train_cost = exe.run(program=fluid.default_main_program(),#run the main program
                             feed=feeder.feed(data),              #feed one batch of training data, converted per feed_list
                             fetch_list=[avg_cost])
        if batch_id % 40 == 0:
            print("Pass:%d, Cost:%0.5f" % (pass_id, train_cost[0][0]))    #print the current batch's loss
        iter = iter + BATCH_SIZE
        iters.append(iter)
        train_costs.append(train_cost[0][0])

    # test and print the loss of the last batch
    test_cost = 0
    for batch_id, data in enumerate(test_reader()):               #iterate over test_reader
        test_cost = exe.run(program=test_program,                 #run the test program
                            feed=feeder.feed(data),               #feed one batch of test data
                            fetch_list=[avg_cost])                #fetch the mean squared error
    print('Test:%d, Cost:%0.5f' % (pass_id, test_cost[0][0]))     #print the last batch's test loss

# Save the model; create the save directory if it does not exist
if not os.path.exists(model_save_dir):
    os.makedirs(model_save_dir)
print('save models to %s' % (model_save_dir))
# Save the trained parameters and build a program dedicated to inference
fluid.io.save_inference_model(model_save_dir,  #path to save the inference model
                              ['x'],           #the data the inference program needs to be fed
                              [y_predict],     #the Variables holding the inference results
                              exe)             #the executor that saves the inference model
draw_train_process(iters, train_costs)

----------------------------------------

Pass:0, Cost:730.52649
Test:0, Cost:129.28802
Pass:1, Cost:935.00702
Test:1, Cost:146.41402
Pass:2, Cost:561.50110
Test:2, Cost:188.96291
Pass:3, Cost:455.02338
Test:3, Cost:296.16476
Pass:4, Cost:347.46710
Test:4, Cost:122.57037
Pass:5, Cost:480.02325
Test:5, Cost:140.77341
Pass:6, Cost:464.05698
Test:6, Cost:127.89626
Pass:7, Cost:276.11606
Test:7, Cost:233.21486
Pass:8, Cost:337.44760
Test:8, Cost:163.82315
Pass:9, Cost:311.88654
Test:9, Cost:13.98091
Pass:10, Cost:308.88275
Test:10, Cost:170.74649
Pass:11, Cost:340.49243
Test:11, Cost:77.21281
Pass:12, Cost:301.12851
Test:12, Cost:31.04134
Pass:13, Cost:150.75267
Test:13, Cost:2.99113
Pass:14, Cost:174.88126
Test:14, Cost:39.84206
Pass:15, Cost:279.36380
Test:15, Cost:102.89651
Pass:16, Cost:184.16774
Test:16, Cost:208.48296
Pass:17, Cost:252.75090
Test:17, Cost:65.50356
Pass:18, Cost:125.28737
Test:18, Cost:5.14324
Pass:19, Cost:241.18799
Test:19, Cost:43.11307
Pass:20, Cost:333.37201
Test:20, Cost:48.84952
Pass:21, Cost:150.72885
Test:21, Cost:58.22155
Pass:22, Cost:73.52397
Test:22, Cost:113.02930
Pass:23, Cost:189.21335
Test:23, Cost:29.70313
Pass:24, Cost:182.14908
Test:24, Cost:16.74845
Pass:25, Cost:128.77292
Test:25, Cost:16.76190
Pass:26, Cost:117.02783
Test:26, Cost:10.72589
Pass:27, Cost:107.32870
Test:27, Cost:4.64500
Pass:28, Cost:138.55495
Test:28, Cost:6.51828
Pass:29, Cost:48.11888
Test:29, Cost:8.40414
Pass:30, Cost:127.07739
Test:30, Cost:123.49804
Pass:31, Cost:169.20230
Test:31, Cost:5.44257
Pass:32, Cost:88.83828
Test:32, Cost:7.61720
Pass:33, Cost:80.49153
Test:33, Cost:22.00040
Pass:34, Cost:59.16454
Test:34, Cost:46.63321
Pass:35, Cost:161.52925
Test:35, Cost:26.65326
Pass:36, Cost:81.94468
Test:36, Cost:28.30224
Pass:37, Cost:35.22042
Test:37, Cost:3.84092
Pass:38, Cost:72.79510
Test:38, Cost:16.40567
Pass:39, Cost:109.47186
Test:39, Cost:4.38933
Pass:40, Cost:59.62152
Test:40, Cost:0.58020
Pass:41, Cost:52.41791
Test:41, Cost:2.84398
Pass:42, Cost:139.88603
Test:42, Cost:11.51844
Pass:43, Cost:31.33353
Test:43, Cost:12.27122
Pass:44, Cost:33.70327
Test:44, Cost:11.24299
Pass:45, Cost:36.93304
Test:45, Cost:3.56746
Pass:46, Cost:69.01217
Test:46, Cost:12.32192
Pass:47, Cost:20.34635
Test:47, Cost:5.79740
Pass:48, Cost:37.24659
Test:48, Cost:9.30209
Pass:49, Cost:104.55357
Test:49, Cost:12.87949
save models to /home/aistudio/work/fit_a_line.inference.model

-------------------------------------

Step5. Model prediction

(1) Create an Executor for prediction

infer_exe = fluid.Executor(place)    #executor for inference
inference_scope = fluid.core.Scope() #Scope: the variable scope used for inference

(2) Define a method to visualize ground truth against predictions

infer_results=[]
groud_truths=[]

#Plot ground truth against predicted values
def draw_infer_result(groud_truths,infer_results):
    title='Boston'
    plt.title(title, fontsize=24)
    x = np.arange(1,20) 
    y = x
    plt.plot(x, y)
    plt.xlabel('ground truth', fontsize=14)
    plt.ylabel('infer result', fontsize=14)
    plt.scatter(groud_truths, infer_results,color='green',label='infer results') 
    plt.grid()
    plt.show()

 

(3) Run the prediction

Through fluid.io.load_inference_model, the predictor reads the trained model from model_save_dir and uses it to predict on data it has never seen.

with fluid.scope_guard(inference_scope): #switch the global/default scope; all runtime variables are allocated in the new scope
    #load the inference model from the save directory
    [inference_program,                             #the inference program
     feed_target_names,                             #names of the variables that must be fed in the inference program
     fetch_targets] = fluid.io.load_inference_model(#fetch_targets: the inference results
                                    model_save_dir, #model_save_dir: path the model was saved to
                                    infer_exe)      #infer_exe: the executor used for inference
    #get the prediction data
    infer_reader = paddle.batch(paddle.dataset.uci_housing.test(),  #uci_housing test data
                          batch_size=200)                           #read one batch of 200 samples from the test data
    #split x and y out of the test data
    test_data = next(infer_reader())
    test_x = np.array([data[0] for data in test_data]).astype("float32")
    test_y = np.array([data[1] for data in test_data]).astype("float32")
    results = infer_exe.run(inference_program,                              #the inference program
                            feed={feed_target_names[0]: np.array(test_x)},  #feed the x values to predict on
                            fetch_list=fetch_targets)                       #fetch the predictions

    print("infer results: (House Price)")
    for idx, val in enumerate(results[0]):
        print("%d: %.2f" % (idx, val))
        infer_results.append(val)
    print("ground truth:")
    for idx, val in enumerate(test_y):
        print("%d: %.2f" % (idx, val))
        groud_truths.append(val)
    draw_infer_result(groud_truths, infer_results)

 

---------------------------------------------------------------------------------------------------------
infer results: (House Price)
0: 13.23
1: 13.22
2: 13.23
3: 14.47
4: 13.76
5: 14.04
6: 13.25
7: 13.35
8: 11.67
9: 13.38
10: 11.03
11: 12.38
12: 12.95
13: 12.62
14: 12.13
15: 13.39
16: 14.32
17: 14.26
18: 14.61
19: 13.20
20: 13.78
21: 12.54
22: 14.22
23: 13.43
24: 13.58
25: 13.00
26: 14.03
27: 13.89
28: 14.73
29: 13.76
30: 13.53
31: 13.10
32: 13.08
33: 12.20
34: 12.07
35: 13.84
36: 13.83
37: 14.25
38: 14.40
39: 14.27
40: 13.21
41: 12.75
42: 14.15
43: 14.42
44: 14.38
45: 14.07
46: 13.33
47: 14.48
48: 14.61
49: 14.83
50: 13.20
51: 13.52
52: 13.10
53: 13.30
54: 14.40
55: 14.88
56: 14.38
57: 14.92
58: 15.04
59: 15.23
60: 15.58
61: 15.51
62: 13.54
63: 14.46
64: 15.13
65: 15.67
66: 15.32
67: 15.62
68: 15.71
69: 16.01
70: 14.49
71: 14.15
72: 14.95
73: 13.76
74: 14.71
75: 15.18
76: 16.25
77: 16.42
78: 16.55
79: 16.59
80: 16.13
81: 16.34
82: 15.44
83: 16.08
84: 15.77
85: 15.07
86: 14.48
87: 15.83
88: 16.46
89: 20.68
90: 20.86
91: 20.75
92: 19.60
93: 20.27
94: 20.51
95: 20.05
96: 20.16
97: 21.58
98: 21.32
99: 21.59
100: 21.49
101: 21.30
ground truth:
0: 8.50
1: 5.00
2: 11.90
3: 27.90
4: 17.20
5: 27.50
6: 15.00
7: 17.20
8: 17.90
9: 16.30
10: 7.00
11: 7.20
12: 7.50
13: 10.40
14: 8.80
15: 8.40
16: 16.70
17: 14.20
18: 20.80
19: 13.40
20: 11.70
21: 8.30
22: 10.20
23: 10.90
24: 11.00
25: 9.50
26: 14.50
27: 14.10
28: 16.10
29: 14.30
30: 11.70
31: 13.40
32: 9.60
33: 8.70
34: 8.40
35: 12.80
36: 10.50
37: 17.10
38: 18.40
39: 15.40
40: 10.80
41: 11.80
42: 14.90
43: 12.60
44: 14.10
45: 13.00
46: 13.40
47: 15.20
48: 16.10
49: 17.80
50: 14.90
51: 14.10
52: 12.70
53: 13.50
54: 14.90
55: 20.00
56: 16.40
57: 17.70
58: 19.50
59: 20.20
60: 21.40
61: 19.90
62: 19.00
63: 19.10
64: 19.10
65: 20.10
66: 19.90
67: 19.60
68: 23.20
69: 29.80
70: 13.80
71: 13.30
72: 16.70
73: 12.00
74: 14.60
75: 21.40
76: 23.00
77: 23.70
78: 25.00
79: 21.80
80: 20.60
81: 21.20
82: 19.10
83: 20.60
84: 15.20
85: 7.00
86: 8.10
87: 13.60
88: 20.10
89: 21.80
90: 24.50
91: 23.10
92: 19.70
93: 18.30
94: 21.20
95: 17.50
96: 16.80
97: 22.40
98: 20.60
99: 23.90
100: 22.00
101: 11.90

---------------------------------------------------------------------------------------------------------

 

data.txt

98.87, 599.0
68.74, 450.0
89.24, 440.0
129.19, 780.0
61.64, 450.0
74.0, 315.0
124.07, 998.0
65.0, 435.0
57.52, 435.0
60.42, 225.0
98.78, 685.0
63.3, 320.0
94.8, 568.0
49.95, 365.0
84.76, 530.0
127.82, 720.0
85.19, 709.0
79.91, 510.0
91.56, 600.0
59.0, 300.0
96.22, 580.0
94.69, 380.0
45.0, 210.0
68.0, 320.0
77.37, 630.0
104.0, 415.0
81.0, 798.0
64.33, 450.0
86.43, 935.0
67.29, 310.0
68.69, 469.0
109.01, 480.0
56.78, 450.0
53.0, 550.0
95.26, 625.0
142.0, 850.0
76.19, 450.0
83.42, 450.0
104.88, 580.0
149.13, 750.0
155.51, 888.0
105.15, 570.0
44.98, 215.0
87.48, 680.0
154.9, 1300.0
69.02, 340.0
74.0, 510.0
130.3, 1260.0
160.0, 640.0
88.31, 535.0
80.38, 680.0
94.99, 557.0
87.98, 435.0
88.0, 430.0
65.39, 320.0
89.0, 395.0
97.36, 535.0
53.0, 470.0
120.0, 632.0
129.0, 520.0
138.68, 878.0
81.25, 520.0
81.37, 550.0
101.98, 630.0
99.5, 540.0
59.53, 350.0
60.0, 220.0
50.51, 270.0
120.82, 745.0
107.11, 650.0
57.0, 252.0
145.08, 1350.0
111.8, 625.0
99.51, 518.0
87.09, 705.0
112.82, 510.0
148.93, 1280.0
199.24, 900.0
112.42, 600.0
94.17, 588.0
198.54, 1350.0
77.02, 460.0
153.47, 900.0
69.16, 435.0
84.54, 430.0
100.74, 590.0
110.9, 620.0
109.74, 618.0
107.98, 590.0
108.63, 700.0
67.37, 435.0
138.53, 790.0
44.28, 315.0
107.51, 816.0
59.53, 388.0
118.87, 1155.0
59.38, 260.0
55.0, 360.0
81.02, 410.0
109.68, 680.0
127.0, 630.0
109.23, 545.0
85.03, 699.0
107.27, 620.0
120.3, 480.0
127.08, 680.0
158.63, 850.0
123.25, 895.0
151.68, 1000.0
65.84, 438.0
96.87, 780.0
166.0, 1400.0
59.53, 410.0
137.4, 1150.0
45.0, 209.0
88.54, 722.0
87.22, 720.0
164.0, 2000.0
69.34, 248.0
103.67, 750.0
74.2, 595.0
71.0, 440.0
53.0, 475.0
60.86, 850.0
90.14, 530.0
57.0, 338.0
138.82, 1150.0
89.0, 760.0
98.0, 400.0
143.35, 1150.0
113.81, 575.0
152.2, 1150.0
63.32, 330.0
66.91, 218.0
44.28, 305.0
97.76, 590.0
69.0, 285.0
55.0, 380.0
148.0, 1300.0
154.59, 868.0
131.5, 1020.0
87.1, 780.0
148.68, 749.0
94.22, 590.0
96.79, 670.0
99.19, 578.0
199.0, 1380.0
125.03, 630.0
60.95, 520.0
127.04, 680.0
85.63, 460.0
77.14, 320.0
75.63, 508.0
140.18, 800.0
59.53, 365.0
109.09, 850.0
152.0, 1850.0
122.0, 980.0
111.21, 630.0
56.7, 260.0
84.46, 588.0
83.19, 500.0
132.18, 1260.0
76.14, 480.0
107.29, 585.0
137.71, 780.0
108.22, 610.0
98.81, 570.0
139.0, 1180.0
89.0, 1100.0
89.47, 800.0
75.61, 496.0
84.54, 460.0
75.87, 490.0
61.0, 450.0
83.72, 500.0
53.0, 458.0
86.0, 700.0
98.57, 760.0
84.86, 510.0
82.77, 600.0
102.49, 600.0
139.32, 730.0
145.0, 1290.0
148.0, 1100.0
65.82, 410.0
53.0, 240.0
88.96, 1000.0
86.36, 700.0
65.72, 455.0
88.0, 725.0
65.98, 600.0
99.0, 560.0
131.0, 975.0
59.53, 349.0
86.79, 508.0
110.19, 500.0
42.13, 320.0
89.91, 450.0
44.0, 320.0
107.16, 900.0
98.28, 574.0
109.68, 650.0
65.0, 450.0
103.8, 750.0
71.69, 440.0
94.38, 550.0
107.5, 760.0
85.29, 705.0
152.3, 1000.0
80.66, 665.0
88.0, 600.0
67.0, 350.0
87.0, 700.0
88.15, 430.0
104.14, 600.0
54.0, 250.0
65.44, 435.0
88.93, 525.0
51.0, 338.0
57.0, 500.0
66.94, 470.0
142.99, 800.0
69.16, 500.0
43.0, 202.0
177.0, 1120.0
131.73, 900.0
60.0, 220.0
83.0, 530.0
110.99, 630.0
49.95, 365.0
121.87, 748.0
70.0, 690.0
48.76, 230.0
88.73, 547.0
59.53, 355.0
85.49, 420.0
87.06, 570.0
77.0, 350.0
55.0, 354.0
120.94, 655.0
88.7, 560.0
76.31, 510.0
100.39, 610.0
124.88, 820.0
95.0, 480.0
44.28, 315.0
158.0, 1500.0
55.59, 235.0
87.32, 738.0
64.43, 440.0
77.2, 418.0
89.0, 750.0
130.4, 725.0
98.61, 505.0
55.34, 355.0
132.66, 944.0
88.7, 560.0
67.92, 408.0
88.88, 640.0
57.52, 370.0
71.0, 615.0
86.29, 650.0
51.0, 211.0
53.14, 350.0
63.38, 430.0
90.83, 660.0
95.05, 515.0
96.0, 650.0
135.24, 900.0
80.5, 640.0
132.0, 680.0
69.0, 450.0
56.39, 268.0
59.53, 338.0
74.22, 445.0
88.0, 780.0
112.41, 570.0
140.85, 760.0
108.33, 635.0
104.76, 612.0
86.67, 632.0
169.0, 1550.0
99.19, 570.0
95.0, 780.0
174.0, 1500.0
103.13, 565.0
107.0, 940.0
109.43, 880.0
91.93, 616.0
66.69, 296.0
57.11, 280.0
98.78, 590.0
40.09, 275.0
144.86, 850.0
110.99, 600.0
103.13, 565.0
87.0, 565.0
55.0, 347.0
53.75, 312.0
62.36, 480.0
135.4, 710.0
74.22, 450.0
96.35, 400.0
88.93, 542.0
98.0, 598.0
66.28, 530.0
119.2, 950.0
67.0, 415.0
68.9, 430.0
65.33, 400.0
55.0, 350.0
148.0, 1270.0
60.0, 307.0
88.83, 450.0
85.46, 430.0
137.57, 1130.0
90.59, 440.0
49.51, 220.0
96.87, 780.0
133.24, 820.0
84.0, 650.0
59.53, 420.0
59.16, 418.0
121.0, 670.0
53.0, 489.0
145.0, 1350.0
56.43, 260.0
71.8, 400.0
77.74, 370.0
59.16, 410.0
141.0, 820.0
87.28, 510.0
112.39, 666.0
119.0, 460.0
72.66, 395.0
120.8, 720.0
121.87, 750.0
64.43, 430.0
59.53, 400.0
106.69, 615.0
102.61, 575.0
61.64, 342.0
99.02, 750.0
88.04, 750.0
126.08, 750.0
145.0, 1280.0
84.0, 550.0
109.39, 700.0
199.96, 1850.0
88.0, 410.0
104.65, 750.0
81.0, 760.0
60.0, 230.0
108.0, 760.0
87.0, 550.0
88.15, 415.0
82.86, 490.0
152.0, 750.0
89.0, 565.0
43.5, 260.0
59.62, 356.0
96.6, 800.0
59.53, 450.0
67.48, 270.0
70.0, 455.0
102.84, 600.0
112.0, 423.0
90.23, 720.0
65.0, 380.0
89.67, 730.0
69.53, 480.0
182.36, 870.0
98.56, 569.0
65.0, 430.0
59.53, 325.0
159.83, 880.0
45.0, 260.0
92.64, 628.0
85.63, 413.0
100.43, 400.0
171.68, 1000.0
104.64, 720.0
52.46, 560.0
89.02, 420.0
166.11, 1160.0
67.21, 387.0
71.57, 420.0
68.07, 265.0
170.0, 1395.0
67.0, 455.0
73.5, 480.0
130.53, 760.0
96.04, 570.0
73.57, 265.0
128.6, 750.0
127.09, 870.0
71.0, 450.0
55.43, 230.0
103.0, 560.0
169.0, 1600.0
107.7, 815.0
153.61, 770.0
71.8, 450.0
87.76, 568.0
122.0, 970.0
58.75, 420.0
65.33, 310.0
80.63, 520.0
93.0, 350.0
59.62, 400.0
124.0, 890.0
105.79, 680.0
122.83, 668.0
67.09, 420.0
71.0, 350.0
127.23, 720.0
128.0, 780.0
77.9, 498.0
55.73, 213.0
91.39, 430.0
114.43, 860.0
125.0, 730.0
73.57, 270.0
59.56, 268.0
71.74, 395.0
88.12, 565.0
65.5, 340.0
81.0, 400.0
154.95, 900.0
67.0, 600.0
80.6, 580.0
148.0, 850.0
52.33, 475.0
122.54, 950.0
70.21, 400.0
63.0, 460.0
97.0, 750.0
100.19, 600.0
179.0, 895.0
69.8, 450.0
63.4, 480.0
65.72, 439.0
77.0, 350.0
137.68, 800.0
95.0, 590.0
68.07, 270.0
136.37, 1100.0
57.39, 218.0
83.72, 510.0
125.42, 920.0
99.51, 580.0
73.33, 625.0
53.17, 490.0
53.0, 480.0
51.0, 400.0
131.27, 780.0
95.37, 625.0
59.53, 400.0
88.8, 525.0
67.0, 310.0
129.76, 660.0
98.28, 580.0
101.44, 550.0
89.05, 710.0
157.77, 1310.0
84.73, 640.0
93.96, 540.0
55.24, 365.0
86.0, 740.0
65.8, 395.0
139.0, 1150.0
99.19, 540.0
88.0, 678.0
65.0, 440.0
138.37, 1060.0
65.33, 350.0
140.6, 850.0
90.46, 518.0
53.0, 485.0
73.9, 370.0
71.7, 280.0
80.73, 485.0
113.0, 570.0
97.0, 570.0
65.5, 340.0
77.74, 350.0
145.0, 1280.0
97.46, 800.0
88.8, 530.0
198.04, 1600.0
50.0, 270.0
60.0, 220.0
136.0, 858.0
67.07, 370.0
49.51, 220.0
67.0, 600.0
108.56, 810.0
96.52, 565.0
68.48, 435.0
65.84, 450.0
102.61, 590.0
101.69, 600.0
73.93, 520.0
57.0, 256.0
123.5, 1115.0
154.89, 1260.0
160.34, 1320.0
88.0, 715.0
71.0, 269.0
74.93, 405.0
73.6, 630.0
59.5, 380.0
84.0, 650.0
59.53, 370.0
45.0, 210.0
51.0, 350.0
107.56, 780.0
76.46, 418.0
83.0, 398.0
77.14, 305.0
71.0, 300.0
86.0, 680.0
52.37, 450.0
99.99, 530.0
52.91, 540.0
63.36, 299.0
60.51, 520.0
122.0, 950.0
96.73, 740.0
138.82, 860.0
99.0, 520.0
109.75, 570.0
112.89, 870.0
65.36, 420.0
110.19, 558.0
132.59, 980.0
128.78, 900.0
89.0, 1296.0
182.36, 980.0
146.41, 521.0
90.0, 428.0
157.0, 1280.0
89.0, 528.0
85.19, 695.0
61.0, 686.0
91.56, 720.0
126.25, 670.0
81.0, 775.0
117.31, 745.0
50.6, 350.0
138.53, 740.0
151.66, 1550.0
87.0, 745.0
65.0, 300.0
83.4, 535.0
67.06, 288.0
55.0, 340.0
79.0, 340.0
95.0, 760.0
77.0, 370.0
50.0, 360.0
97.0, 750.0
53.0, 465.0
60.0, 220.0
65.33, 293.0
153.0, 900.0
52.41, 518.0
69.0, 290.0
89.54, 700.0
53.0, 465.0
125.0, 760.0
103.0, 580.0
67.0, 310.0
69.0, 275.0
87.09, 495.0
102.71, 620.0
132.53, 820.0
62.0, 570.0
121.62, 780.0
80.03, 480.0
138.82, 950.0
107.0, 900.0
149.0, 930.0
68.74, 455.0
101.27, 488.0
55.0, 350.0
60.0, 240.0
97.0, 400.0
67.92, 450.0
87.86, 435.0
65.33, 365.0
70.23, 335.0
119.0, 940.0
85.68, 570.0
86.0, 520.0
125.0, 760.0
184.23, 1350.0
77.0, 485.0
57.0, 470.0
108.45, 750.0
107.02, 575.0
65.0, 478.0
97.0, 780.0
107.16, 920.0
53.75, 310.0
122.83, 700.0
63.3, 252.0
80.0, 560.0
123.82, 600.0
87.31, 405.0
126.54, 540.0
132.82, 800.0
152.73, 1280.0
109.0, 650.0
103.0, 845.0
59.62, 420.0
66.28, 525.0
96.33, 400.0
86.29, 350.0
78.78, 308.0
137.36, 1100.0
119.69, 530.0
126.08, 750.0
73.84, 490.0
73.0, 635.0
67.22, 400.0
87.98, 435.0
68.36, 270.0
73.0, 415.0
94.0, 600.0
107.07, 800.0
49.0, 260.0
156.0, 880.0
107.03, 830.0
198.75, 1550.0
60.92, 300.0
83.45, 540.0
57.39, 248.0
68.48, 450.0
140.86, 750.0
146.0, 1150.0
59.53, 430.0
77.14, 350.0
55.0, 360.0
80.15, 615.0
118.6, 660.0
63.3, 330.0
59.53, 350.0
59.65, 325.0
115.59, 470.0
71.0, 460.0
113.0, 500.0
94.7, 650.0
79.15, 460.0
150.04, 975.0
152.73, 1250.0
47.0, 230.0
146.67, 680.0
184.7, 980.0
60.06, 300.0
71.21, 485.0
124.88, 738.0
67.0, 600.0
167.28, 770.0
78.9, 320.0
118.4, 700.0
74.0, 620.0
61.38, 510.0
106.14, 585.0
109.0, 668.0
89.52, 800.0
130.29, 1100.0
136.0, 850.0
99.5, 580.0
84.02, 350.0
118.87, 413.0
88.31, 550.0
88.99, 610.0
65.82, 430.0
59.53, 350.0
120.94, 650.0
67.22, 410.0
184.0, 1380.0
156.45, 1200.0
79.0, 320.0
53.0, 459.0
160.7, 980.0
70.81, 360.0
110.94, 578.0
103.0, 600.0
80.66, 670.0
74.82, 315.0
140.09, 1140.0
89.62, 950.0
97.95, 570.0
88.0, 450.0
50.0, 400.0
112.39, 610.0
148.0, 1350.0
102.85, 565.0
126.71, 700.0
65.0, 350.0
69.0, 439.0
168.0, 770.0
61.88, 399.0
147.26, 1230.0
48.76, 210.0
67.22, 365.0
138.0, 1150.0
71.0, 325.0
115.46, 840.0
96.81, 740.0
90.21, 530.0
120.26, 650.0
53.0, 500.0
136.0, 1030.0
87.0, 660.0
134.6, 900.0
161.61, 850.0
88.88, 635.0
84.66, 485.0
81.79, 500.0
50.0, 259.0
121.0, 760.0
84.94, 595.0
73.32, 630.0
53.0, 500.0
97.86, 580.0
154.0, 1280.0
89.0, 395.0
163.66, 1080.0
101.95, 628.0
55.0, 348.0
68.48, 480.0
154.16, 780.0
157.21, 1350.0
111.0, 600.0
108.73, 580.0
53.0, 390.0
137.69, 810.0
170.83, 1290.0
67.0, 243.0
112.93, 600.0
161.0, 650.0
168.1, 1500.0
86.24, 670.0
63.0, 530.0
128.4, 950.0
98.28, 556.0
107.27, 570.0
84.0, 650.0
79.53, 480.0
110.96, 570.0
107.48, 810.0
56.88, 342.0
106.54, 540.0
59.62, 380.0
60.42, 222.0
88.31, 500.0
58.97, 236.0
88.84, 530.0
44.98, 208.0
167.28, 780.0
65.33, 300.0
55.0, 355.0
77.74, 350.0
78.81, 415.0
135.5, 1100.0
70.3, 380.0
57.0, 248.0
68.47, 330.0
59.42, 360.0
53.0, 202.0
42.13, 326.0
144.5, 1230.0
148.0, 650.0
81.0, 667.0
105.18, 590.0
59.53, 370.0
59.53, 450.0
133.8, 1090.0
59.53, 380.0
55.0, 380.0
53.0, 250.0
57.36, 258.0
137.68, 690.0
96.66, 550.0
49.95, 350.0
67.92, 430.0
81.33, 500.0
98.6, 600.0
76.61, 275.0
87.0, 438.0
84.0, 520.0
111.62, 920.0
153.44, 950.0
61.7, 320.0
90.21, 530.0
68.07, 260.0
98.43, 740.0
135.0, 800.0
65.0, 340.0
139.41, 670.0
72.66, 395.0
56.0, 240.0
84.02, 500.0
161.38, 1300.0
58.64, 420.0
80.42, 400.0
83.45, 560.0
103.0, 545.0
55.0, 380.0
128.0, 750.0
63.2, 440.0
138.01, 700.0
106.99, 625.0
109.01, 480.0
172.71, 1020.0
153.0, 1100.0
112.41, 578.0
138.01, 700.0
139.48, 900.0
103.13, 550.0
71.0, 283.0
96.0, 760.0
133.23, 830.0
79.43, 460.0
87.29, 700.0
65.82, 410.0
70.3, 376.0
96.36, 740.0
155.35, 1150.0
184.0, 1350.0
98.0, 580.0
71.0, 260.0
72.83, 470.0
95.85, 710.0
115.5, 640.0
89.0, 635.0
76.0, 475.0
125.86, 888.0
102.22, 638.0
78.0, 310.0
97.09, 800.0
112.0, 645.0
105.23, 570.0
100.74, 580.0
47.5, 430.0
106.54, 530.0
145.1, 1350.0
108.0, 790.0
59.79, 280.0
107.92, 800.0
124.75, 880.0
126.76, 710.0
91.14, 730.0
67.0, 620.0
137.76, 650.0
99.99, 600.0
150.67, 850.0
107.47, 750.0
138.53, 730.0
65.0, 438.0
53.23, 320.0
89.7, 780.0
134.62, 850.0
89.0, 735.0
59.53, 360.0
97.0, 600.0
