The tool chosen here is mimikatz; download link:
http://www.ddooo.com/softdown/133908.htm
Our experiment environment is a Windows 7 virtual machine.
Dump the logon passwords with:
mimikatz # log
mimikatz # privilege::debug
mimikatz # sekurlsa::logonpasswords
which gives:
LM-Hash
Step 1: Convert the plaintext password to uppercase.
lsw521 -> LSW521
Step 2: Convert the uppercase string into its hexadecimal (ASCII) form.
Step 3: If the password is shorter than 14 bytes, pad it to 14 bytes with zeros.
4C53573532310000000000000000
Step 4: Split the padded value into two 7-byte (56-bit) halves.
4C535735323100 and 00000000000000
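Steps 1 through 4 are plain string handling; a quick Python sketch (the variable names are mine):
pwd = 'lsw521'
buf = pwd.upper().encode('ascii').ljust(14, b'\x00')   # steps 1-3: uppercase, pad to 14 bytes
print(buf.hex().upper())                               # 4C53573532310000000000000000
half1, half2 = buf[:7], buf[7:]                        # step 4: two 7-byte halves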
Step 5: Convert each 7-byte half from hexadecimal to binary (keeping the leading zeros of every byte), split the 56 bits into groups of 7, append a 0 parity bit to the end of each group, and convert back to hexadecimal; this yields two 8-byte DES keys.
First half:
0100110000101000110101001110011001010010100100001100010000000000
4c28d4e65290c400
Second half:
0000000000000000
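Step 5 is easy to get wrong if the leading zeros of each byte are dropped during the hex-to-binary conversion; a minimal Python sketch of the expansion (the helper name expand_des_key is mine):
# Expand a 7-byte half into an 8-byte DES key:
# take the 56 bits in groups of 7 and append a 0 parity bit to each group.
def expand_des_key(half7: bytes) -> bytes:
    bits = ''.join(format(b, '08b') for b in half7)   # keep the leading zeros
    return bytes(int(bits[i:i + 7] + '0', 2) for i in range(0, 56, 7))

print(expand_des_key(bytes.fromhex('4C535735323100')).hex())   # 4c28d4e65290c400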
Step 6: Use each of the two 8-byte keys obtained above as a DES key to encrypt the magic string "KGS!@#$%"; the two 8-byte ciphertexts concatenated form the LM hash.
Doing the encryption with OpenSSL (the file 1 contains the 8-byte magic string "KGS!@#$%"; -K passes the raw DES key in hex and -nopad keeps the output to a single 8-byte block):
OpenSSL> enc -des-ecb -K 4c28d4e65290c400 -nopad -in 1 -out 11
The second half is handled the same way.
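The same encryption can be checked in Python; a minimal sketch assuming the pycryptodome package is available:
# pip install pycryptodome   (assumption: this third-party package is used for DES)
from Crypto.Cipher import DES

KGS = b"KGS!@#$%"                               # the fixed LM plaintext
key1 = bytes.fromhex('4c28d4e65290c400')        # expanded first half from step 5
print(DES.new(key1, DES.MODE_ECB).encrypt(KGS).hex())   # first 8 bytes of the LM hash
# The second half uses the all-zero key the same way; concatenating the two
# 8-byte ciphertexts gives the 16-byte LM hash.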
Only when I went to compare the result did I find that the LM hash is no longer stored (Windows Vista/7 and later disable LM hashes by default), so there was nothing to compare against.
NTLM-Hash
Step 1: Convert the ASCII password to Unicode (UTF-16LE).
Step 2: Apply the MD4 one-way hash.
OpenSSL> dgst -md4 uni
MD4(uni)= a0e503a3f5260abadcbdf3ebf4568830
Comparing this with the value reported by the system above:
They match exactly.
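The NT hash is also easy to reproduce in Python; a minimal sketch, with the caveat that hashlib's MD4 comes from OpenSSL and may be unavailable on builds where the legacy provider is disabled (the password 'lsw521' is the one used above):
import hashlib

password = 'lsw521'
# NT hash = MD4 over the UTF-16LE ("Unicode") encoding of the password
nt_hash = hashlib.new('md4', password.encode('utf-16-le')).hexdigest()
print(nt_hash)   # should reproduce the value reported by mimikatz / OpenSSL above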
2. Use the PCFG algorithm and a partially leaked password corpus to generate a password set, and test whether it can recover the Windows login password
2.1 Data preprocessing
Opening the password corpora provided by the instructor, each record contains both an account name and a password, and the two datasets use different formats, so the first step is to extract just the passwords. In the Renren (人人網) dataset the account and password are separated by a tab character, so we simply keep the part after the tab:
# re is imported here because the 163 extraction below uses regular expressions
import re

# Renren: each record is "account<TAB>password", so keep the text after the tab
f = open('人人網.txt', 'r', encoding='latin1')
lines = f.readlines()
f.close()

f = open('人人網密碼.txt', 'w+', encoding='latin1')
for line in lines:
    line = line.rstrip('\r\n')
    pos = line.rfind('\t')          # position of the separator tab
    if pos != -1:                   # skip malformed lines without a tab
        f.write(line[pos + 1:] + '\n')
print("Finish 人人網")
f.close()
f = open('163.txt', 'r', encoding='latin1')
lines = f.readlines()
f.close()

f = open('163密碼.txt', 'w+', encoding='latin1')
for line in lines:
    # findall returns a list of matches; keep the first one if the line matched
    li = re.findall('----([0-9]*[a-z]*[A-Z]*[\x00-\xff]*)----', line)
    if li:
        f.write(li[0] + '\n')
print("Finish 163")
f.close()
The generated password list:
The 163 data contains two record formats:
the first is username + password + e-mail,
the second is e-mail + password.
Each is handled with its own regular expression:
import re

f = open('163_2.txt', 'r', encoding='latin1')
lines = f.readlines()
f.close()

f = open('163密碼_2.txt', 'w+', encoding='latin1')
for line in lines:
    li = re.findall('----([0-9]*[a-z]*[A-Z]*[\x00-\x80]*)\n', line)
    if li:                          # write the captured password only
        f.write(li[0] + '\n')
print("Finish 163_2")
f.close()
f = open('163_1.txt', 'r', encoding='latin1')
lines = f.readlines()
f.close()

f = open('163密碼_1.txt', 'w+', encoding='latin1')
for line in lines:
    li = re.findall('----([0-9]*[a-z]*[A-Z]*[\x00-\x80]*)----', line)
    if li:
        f.write(li[0] + '\n')
print("Finish 163_1")
f.close()
The result:
Finally, merge the Renren and 163 password lists into a single training dictionary.
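A small sketch of the merge, assuming the per-site files produced above and writing the combined training file train.txt that is passed to trainer.py below (adjust the file list to whichever extracted lists you keep):
# Concatenate the extracted password lists into one training file for PCFG
parts = ['人人網密碼.txt', '163密碼_1.txt', '163密碼_2.txt']
with open('train.txt', 'w', encoding='latin1') as out:
    for name in parts:
        with open(name, 'r', encoding='latin1') as src:
            out.writelines(src.readlines())
print("Wrote train.txt")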
2.2 Generating the rule set
Download the PCFG source code and import the dictionary.
Open Spyder and run trainer.py:
In [7]: runfile('D:/學習/大三下/網絡密碼應用/PCFG/pcfg_cracker-master/trainer.py',args='-t train.txt -r result',wdir='D:/學習/大三下/網絡密碼應用/PCFG/pcfg_cracker-master')
[ASCII-art banner: "Pretty Cool Fuzzy Guesser Trainer"]
Version: 4.3
-----------------------------------------------------------------
Attempting to autodetect file encoding of the training passwords
-----------------------------------------------------------------
File Encoding Detected: utf-8
Confidence for file encoding: 0.938125
If you think another file encoding might have been used please
manually specify the file encoding and run the training program again
-------------------------------------------------
Performing the first pass on the training passwords
What we are learning:
A) Identify words for use in multiword detection
B) Identify alphabet for Markov chains
C) Duplicate password detection, (duplicates are good!)
-------------------------------------------------
Printing out status after every million passwords parsed
------------
1 Million
2 Million
Number of Valid Passwords: 2253866
Number of Encoding Errors Found in Training Set: 0
-------------------------------------------------
Performing the second pass on the training passwords
What we are learning:
A) Learning Markov (OMEN) NGRAMS
B) Training the core PCFG grammar
-------------------------------------------------
Printing out status after every million passwords parsed
------------
1 Million
2 Million
-------------------------------------------------
Calculating Markov (OMEN) probabilities and keyspace
This may take a few minutes
-------------------------------------------------
OMEN Keyspace for Level : 1 : 436
OMEN Keyspace for Level : 2 : 4540
OMEN Keyspace for Level : 3 : 41678
OMEN Keyspace for Level : 4 : 305908
OMEN Keyspace for Level : 5 : 1818836
OMEN Keyspace for Level : 6 : 9407907
OMEN Keyspace for Level : 7 : 42976466
OMEN Keyspace for Level : 8 : 176582831
OMEN Keyspace for Level : 9 : 699890971
OMEN Keyspace for Level : 10 : 2578401879
OMEN Keyspace for Level : 11 : 8816523722
-------------------------------------------------
Performing third pass on the training passwords
What we are learning:
A) What Markov (OMEN) probabilities the training passwords would be created at
-------------------------------------------------
1 Million
2 Million
-------------------------------------------------
Top 5 e-mail providers
-------------------------------------------------
126.com : 246
qq.com : 35
163.com : 9
sina.com : 6
sohu.com : 4
-------------------------------------------------
Top 5 URL domains
-------------------------------------------------
6.com : 33
126.com : 15
q.com : 13
dospy.com : 11
123.com : 8
-------------------------------------------------
Top 10 Years found
-------------------------------------------------
2008 : 1363
2009 : 1148
1987 : 1063
1988 : 917
1986 : 831
1989 : 788
1985 : 648
1984 : 597
1990 : 510
1983 : 507
-------------------------------------------------