Naive Bayes Classifier and a Python Implementation


Bayes' Theorem

Bayes' theorem describes how a subjective judgment about a probability distribution (the prior probability) is revised in light of observed evidence; it holds an important place in probability theory.

The prior probability distribution (marginal probability) is a distribution based on judgment made before observing the sample rather than on the sample itself; the posterior probability (conditional probability) is the conditional distribution obtained by combining the sample with the prior distribution of the unknown parameters.

Bayes' formula:

P(A∩B) = P(A)*P(B|A) = P(B)*P(A|B)

Rearranging gives:

P(A|B) = P(B|A) * P(A) / P(B)

where

  • P(A) is the prior (or marginal) probability of A. It is called the "prior" because it does not take B into account.

  • P(A|B) is the conditional probability of A given that B has occurred, also called the posterior probability of A.

  • P(B|A) is the conditional probability of B given that A has occurred, also called the posterior probability of B; here it is referred to as the likelihood.

  • P(B) is the prior (or marginal) probability of B; here it acts as the normalizing constant.

  • P(B|A) / P(B) is called the standardized likelihood.
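To make the formula concrete, here is a minimal numerical sketch in Python; the probability values are made-up assumptions, purely for illustration:

# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)
# All numbers below are illustrative assumptions, not taken from this article.
p_A = 0.5             # prior P(A)
p_B_given_A = 0.7     # likelihood P(B|A)
p_B_given_notA = 0.2  # likelihood of B when A does not hold
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)  # P(B) by total probability
p_A_given_B = p_B_given_A * p_A / p_B                 # posterior P(A|B)
print(p_A_given_B)    # about 0.778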

Naive Bayes Classification

A naive Bayes classifier assumes that the feature attributes are conditionally independent of one another when estimating the class-conditional probabilities.

First, define

  • x = {a1, a2, ...} is a sample vector, where each a is a feature attribute

  • div = {d1 = [l1, u1], ...} is a partition of a feature attribute into intervals

  • class = {y1, y2, ...} is the set of classes a sample may belong to

Algorithm flow:

(1) From the distribution of classes in the training set, compute the prior probability p(y[i]) of each class.

(2) For each class, compute the frequency of each feature-attribute interval: p(a[j] in d[k] | y[i]).

(3) For each sample, compute p(x | y[i]):

p(x | y[i]) = p(a[1] in d | y[i]) * p(a[2] in d | y[i]) * ...

All feature attributes of the sample are known, so the interval d that each attribute falls into is also known.

The values p(a[k] in d | y[i]) can therefore be read off from step (2), which yields p(x | y[i]).

(4) By Bayes' theorem:

p(y[i] | x) = ( p(x | y[i]) * p(y[i]) ) / p(x)

Since the denominator is the same for every class, only the numerators need to be compared.

p(y[i] | x) is the probability that the observed sample belongs to class y[i]; the class with the largest value is taken as the classification result.
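The following is a minimal sketch of these four steps for discrete (already binned) feature values; the function names and data structures are my own illustration, not the implementation given later in this article:

from collections import Counter, defaultdict

def train_naive_bayes(samples, labels):
    # Step (1): class priors p(y[i]) from the class distribution.
    counts = Counter(labels)
    n = len(labels)
    priors = {y: c / n for y, c in counts.items()}
    # Step (2): per-class frequency of each feature value, p(a[j] in d[k] | y[i]).
    cond = defaultdict(lambda: defaultdict(float))
    for x, y in zip(samples, labels):
        for j, v in enumerate(x):
            cond[y][(j, v)] += 1.0
    for y in cond:
        for key in cond[y]:
            cond[y][key] /= counts[y]
    return priors, cond

def classify_naive_bayes(x, priors, cond):
    # Steps (3) and (4): score each class with p(x|y) * p(y); p(x) is common and ignored.
    scores = {}
    for y, prior in priors.items():
        p = prior
        for j, v in enumerate(x):
            p *= cond[y].get((j, v), 0.0)
        scores[y] = p
    return max(scores, key=scores.get)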

Example:

The data set:

{a1 = 0, a2 = 0, C = 0} {a1 = 0, a2 = 0, C = 1}

{a1 = 0, a2 = 0, C = 0} {a1 = 0, a2 = 0, C = 1}

{a1 = 0, a2 = 0, C = 0} {a1 = 0, a2 = 0, C = 1}

{a1 = 1, a2 = 0, C = 0} {a1 = 0, a2 = 0, C = 1}

{a1 = 1, a2 = 0, C = 0} {a1 = 0, a2 = 0, C = 1}

{a1 = 1, a2 = 0, C = 0} {a1 = 1, a2 = 0, C = 1}

{a1 = 1, a2 = 1, C = 0} {a1 = 1, a2 = 0, C = 1}

{a1 = 1, a2 = 1, C = 0} {a1 = 1, a2 = 1, C = 1}

{a1 = 1, a2 = 1, C = 0} {a1 = 1, a2 = 1, C = 1}

{a1 = 1, a2 = 1, C = 0} {a1 = 1, a2 = 1, C = 1}

Compute the prior probability of each class:

P(C = 0) = 0.5

P(C = 1) = 0.5

Compute the conditional probability of each feature attribute value for each class:

P(a1 = 0 | C = 0) = 0.3

P(a1 = 1 | C = 0) = 0.7

P(a2 = 0 | C = 0) = 0.6

P(a2 = 1 | C = 0) = 0.4

P(a1 = 0 | C = 1) = 0.5

P(a1 = 1 | C = 1) = 0.5

P(a2 = 0 | C = 1) = 0.7

P(a2 = 1 | C = 1) = 0.3

Test sample:

x = { a1 = 0, a2 = 1 }

p(x | C = 0) = P(a1 = 0 | C = 0) * P(a2 = 1 | C = 0) = 0.3 * 0.4 = 0.12

p(x | C = 1) = P(a1 = 0 | C = 1) * P(a2 = 1 | C = 1) = 0.5 * 0.3 = 0.15

Compute P(C) * p(x | C), which is proportional to P(C | x):

P(C = 0) * p(x | C = 0) = 0.5 * 0.12 = 0.06

P(C = 1) * p(x | C = 1) = 0.5 * 0.15 = 0.075

Since 0.075 > 0.06, the test sample is classified as class C = 1.
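As a check, the illustrative train_naive_bayes / classify_naive_bayes sketch given after the algorithm flow reproduces these numbers on the same 20 samples:

samples = ([(0, 0)] * 3 + [(1, 0)] * 3 + [(1, 1)] * 4    # the ten C = 0 samples
         + [(0, 0)] * 5 + [(1, 0)] * 2 + [(1, 1)] * 3)    # the ten C = 1 samples
labels = [0] * 10 + [1] * 10
priors, cond = train_naive_bayes(samples, labels)
print(priors)                                       # {0: 0.5, 1: 0.5}
print(classify_naive_bayes((0, 1), priors, cond))   # 1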

Python Implementation

Training the naive Bayes classifier amounts to computing the probability tables of steps (1) and (2); applying it amounts to evaluating steps (3) and (4) and picking the class with the largest value.

As before, the classifier is wrapped in a class with the same interface:

from numpy import *
from functools import reduce  # reduce is not a builtin in Python 3

class NaiveBayesClassifier(object):
    
    def __init__(self):
        self.dataMat = list()
        self.labelMat = list()
        self.pLabel1 = 0
        self.p0Vec = list()
        self.p1Vec = list()

    def loadDataSet(self, filename):
        # Each line of the file: whitespace-separated numeric feature values,
        # with the class label (0 or 1) in the last column.
        with open(filename) as fr:
            for line in fr.readlines():
                lineArr = line.strip().split()
                dataLine = [float(i) for i in lineArr]
                label = dataLine.pop()  # pop the last column, which holds the label
                self.dataMat.append(dataLine)
                self.labelMat.append(int(label))


    def train(self):
        # Build the probability tables: the class prior and per-feature frequencies.
        dataNum = len(self.dataMat)
        featureNum = len(self.dataMat[0])
        self.pLabel1 = sum(self.labelMat)/float(dataNum)  # prior P(label = 1)
        p0Num = zeros(featureNum)
        p1Num = zeros(featureNum)
        p0Denom = 1.0  # start the denominators at 1.0 to avoid division by zero
        p1Denom = 1.0
        for i in range(dataNum):
            if self.labelMat[i] == 1:
                p1Num += self.dataMat[i]
                p1Denom += sum(self.dataMat[i])
            else:
                p0Num += self.dataMat[i]
                p0Denom += sum(self.dataMat[i])
        self.p0Vec = p0Num/p0Denom  # per-feature frequencies for label 0
        self.p1Vec = p1Num/p1Denom  # per-feature frequencies for label 1

    def classify(self, data):
        # Compare p(x|label) * p(label) for both labels; the shared p(x) is ignored.
        p1 = reduce(lambda x, y: x * y, data * self.p1Vec) * self.pLabel1
        p0 = reduce(lambda x, y: x * y, data * self.p0Vec) * (1.0 - self.pLabel1)
        if p1 > p0:
            return 1
        else:
            return 0

    def test(self):
        self.loadDataSet('testNB.txt')
        self.train()
        print(self.classify([1, 2]))

if __name__ == '__main__':
    NB =  NaiveBayesClassifier()
    NB.test()
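The test method expects a file named testNB.txt in the working directory; its contents are not given here, but from loadDataSet each line should hold whitespace-separated numeric feature values followed by a 0/1 label. Without that file, the classifier can also be exercised directly on an invented toy set (values below are illustrative only, not the article's data):

NB = NaiveBayesClassifier()
NB.dataMat = [[1.0, 0.0], [2.0, 1.0], [0.0, 2.0], [1.0, 3.0]]  # made-up feature values
NB.labelMat = [0, 0, 1, 1]
NB.train()
print(NB.classify([1, 2]))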

MATLAB

MATLAB's Statistics and Machine Learning Toolbox provides built-in support for naive Bayes classifiers:

trainData = [0 1; -1 0; 2 2; 3 3; -2 -1;-4.5 -4; 2 -1; -1 -3];
group = [1 1 -1 -1 1 1 -1 -1]';
model = fitcnb(trainData, group)
testData = [5 2;3 1;-4 -3];
predict(model, testData)

fitcnb trains the model and predict classifies new data.
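For comparison, roughly the same train/predict flow can be written in Python with scikit-learn's GaussianNB (an assumption on my part; scikit-learn is not used elsewhere in this article):

from sklearn.naive_bayes import GaussianNB

trainData = [[0, 1], [-1, 0], [2, 2], [3, 3], [-2, -1], [-4.5, -4], [2, -1], [-1, -3]]
group = [1, 1, -1, -1, 1, 1, -1, -1]
model = GaussianNB().fit(trainData, group)          # train, analogous to fitcnb
print(model.predict([[5, 2], [3, 1], [-4, -3]]))    # predict, analogous to predict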

