Web輕量級掃描工具Skipfish


1. Introduction to Skipfish

2. Basic Skipfish Operations

3. Authentication


1. Introduction to Skipfish

Skipfish is an active web application security reconnaissance tool. It prepares an interactive sitemap for the target site by carrying out a recursive crawl and dictionary-based probes. The resulting map is then annotated with the output of a number of active (but hopefully non-disruptive) security checks. The final report generated by the tool is meant to serve as a foundation for professional web application security assessments.

 

Key features:
High speed: pure C code, highly optimized HTTP handling, and a minimal CPU footprint, easily achieving 2000 requests per second against responsive targets.
Ease of use: heuristics support a variety of quirky web frameworks and mixed-technology sites, with automatic learning, on-the-fly wordlist creation, and form autocompletion.
Cutting-edge security logic: high-quality, low-false-positive, differential security checks able to spot a range of subtle flaws, including blind injection vectors.

More features:
Written in C
An experimental, active web security assessment tool
Recursive crawling
Dictionary-based probing
High speed:
- multiplexed, single-threaded, fully asynchronous network I/O, eliminating memory-management and scheduling overhead
- heuristic automatic content recognition
Low false-positive rate



2. Basic Skipfish Operations

1. skipfish --help

View the available options for this command:

 Authentication and access options:

      -A user:pass      - use specified HTTP authentication credentials
      -F host=IP        - pretend that 'host' resolves to 'IP'
      -C name=val       - append a custom cookie to all requests
      -H name=val       - append a custom HTTP header to all requests
      -b (i|f|p)        - use headers consistent with MSIE / Firefox / iPhone
      -N                - do not accept any new cookies
      --auth-form url   - form authentication URL
      --auth-user user  - form authentication user
      --auth-pass pass  - form authentication password
      --auth-verify-url -  URL for in-session detection

    Crawl scope options:

      -d max_depth     - maximum crawl tree depth (16)
      -c max_child     - maximum children to index per node (512)
      -x max_desc      - maximum descendants to index per branch (8192)
      -r r_limit       - max total number of requests to send (100000000)
      -p crawl%        - node and link crawl probability (100%)
      -q hex           - repeat probabilistic scan with given seed
      -I string        - only follow URLs matching 'string'
      -X string        - exclude URLs matching 'string'
      -K string        - do not fuzz parameters named 'string'
      -D domain        - crawl cross-site links to another domain
      -B domain        - trust, but do not crawl, another domain
      -Z               - do not descend into 5xx locations
      -O               - do not submit any forms
      -P               - do not parse HTML, etc, to find new links

    Reporting options:

      -o dir          - write output to specified directory (required)
      -M              - log warnings about mixed content / non-SSL passwords
      -E              - log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
      -U              - log all external URLs and e-mails seen
      -Q              - completely suppress duplicate nodes in reports
      -u              - be quiet, disable realtime progress stats
      -v              - enable runtime logging (to stderr)

    Dictionary management options:

      -W wordlist     - use a specified read-write wordlist (required)
      -S wordlist     - load a supplemental read-only wordlist
      -L              - do not auto-learn new keywords for the site
      -Y              - do not fuzz extensions in directory brute-force
      -R age          - purge words hit more than 'age' scans ago
      -T name=val     - add new form auto-fill rule
      -G max_guess    - maximum number of keyword guesses to keep (256)

      -z sigfile      - load signatures from this file

    Performance settings:

      -g max_conn     - max simultaneous TCP connections, global (40)
      -m host_conn    - max simultaneous connections, per target IP (10)
      -f max_fail     - max number of consecutive HTTP errors (100)
      -t req_tmout    - total request response timeout (20 s)
      -w rw_tmout     - individual network I/O timeout (10 s)
      -i idle_tmout   - timeout on idle HTTP connections (10 s)
      -s s_limit      - response size limit (400000 B)
      -e              - do not keep binary responses for reporting

    Other settings:

      -l max_req      - max requests per second (0.000000)
      -k duration     - stop scanning after the given duration h:m:s
      --config file   - load the specified configuration file

    Send comments and complaints to <heinenn@google.com>.

# Use skipfish's dictionaries to brute-force hidden files on the target server
# skipfish dictionaries end in .wl by default
root@kali:~# dpkg -L skipfish | grep wl   # locate the dictionary files
/usr/share/skipfish/dictionaries/medium.wl           # medium dictionary
/usr/share/skipfish/dictionaries/minimal.wl          # minimal dictionary
/usr/share/skipfish/dictionaries/extensions-only.wl  # extensions-only dictionary
/usr/share/skipfish/dictionaries/complete.wl         # complete dictionary

# -o: write scan output to the directory that follows
# -I: only follow URLs containing the given string; here, only the /dvwa directory is scanned
# -S: load a supplemental read-only wordlist used to probe for hidden files on the target
root@kali:~# skipfish -o test6 -I /dvwa -S /usr/share/skipfish/dictionaries/minimal.wl  http://192.168.128.129/dvwa

# -X: skip URLs containing the given string
# -K: do not fuzz the named parameter
# -D: crawl cross-site links to another domain.
# The command below scans the 192.168.128.129 site; any links it finds pointing to the xxx.com domain will be crawled as well.
root@kali:~# skipfish -o test7  -D xxx.com -I /dvwa -S /usr/share/skipfish/dictionaries/minimal.wl  http://192.168.128.129/dvwa

# -l: maximum requests per second. The example below caps the rate at 20 requests per second (in practice it runs slightly above that).
root@kali:~# skipfish -o test8  -l 20  -S /usr/share/skipfish/dictionaries/minimal.wl  http://192.168.128.129/dvwa

# -m: maximum simultaneous connections per target IP
root@kali:~# skipfish -o test9  -m 20  -S /usr/share/skipfish/dictionaries/minimal.wl  http://192.168.128.129/dvwa

You can set the options you need in a configuration file and then pass --config on the command line to load it.
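A minimal sketch of that workflow. The config file is assumed to hold one `option = value` pair per line with `#` comments; the key names below are assumptions, so verify them against the sample config shipped with your skipfish build before relying on them.

```shell
# Sketch: store commonly used options in a config file, then load it with
# --config. The 'option = value' key names are assumptions; verify them
# against the example config distributed with skipfish.
cat > dvwa.conf <<'EOF'
# hypothetical skipfish configuration
auth-user = admin
auth-pass = password
EOF

# Then run (on a machine with skipfish installed):
# skipfish -o test --config dvwa.conf http://192.168.128.129/dvwa
```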

**Skipfish Authentication**
# -A user:pass: use the specified HTTP authentication credentials
root@kali:~#  skipfish -o test11 -I /dvwa -A admin:password  http://192.168.128.129/dvwa

# -C: append a custom cookie to all requests
# -X: skip URLs containing the given string; here logout.php is excluded (scanning logout.php would log the session out, so it must be skipped)
root@kali:~#  skipfish -o test10 -I /dvwa -X logout.php -C "PHPSESSID=6f155b6b28fa5b88721ad9e5cbd3f08" -C "security=low"  http://192.168.128.129/dvwa

# Submit the username and password through a login form
# --auth-form: URL of the login page
# --auth-user: the username
# --auth-pass: the password
# --auth-verify-url: URL of a page shown after a successful login (used to verify that authentication succeeded)
root@kali:~# skipfish -o test12 --auth-form http://192.168.128.129/dvwa/login.php --a


2. skipfish -o test  http://1.1.1.1/dvwa/

1. Scans http://1.1.1.1/dvwa/ and stores the results in the test directory

2. Open the report in a browser

  • The whole site is scanned
  • The results are saved in test/index.html

 

root@kali:~# skipfish -o test http://192.168.14.157/dvwa/
    skipfish web application scanner - version 2.10b
    [!] WARNING: Wordlist '/dev/null' contained no valid entries.
    Welcome to skipfish. Here are some useful tips:

    1) To abort the scan at any time, press Ctrl-C. A partial report will be written
       to the specified location. To view a list of currently scanned URLs, you can
       press space at any time during the scan.

    2) Watch the number requests per second shown on the main screen. If this figure
       drops below 100-200, the scan will likely take a very long time.

    3) The scanner does not auto-limit the scope of the scan; on complex sites, you
       may need to specify locations to exclude, or limit brute-force steps.

    4) There are several new releases of the scanner every month. If you run into
       trouble, check for a newer version first, let the author know next.

    More info: http://code.google.com/p/skipfish/wiki/KnownIssues

    Press any key to continue (or wait 60 seconds)... 


3. skipfish -o test @url.txt    # specify a file listing the target URLs

Scan multiple targets: this command scans every URL listed in url.txt and stores the results in the test directory
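A minimal sketch of the multi-target workflow (the two URLs are placeholders for illustration):

```shell
# Sketch: build a target list, one URL per line, then point skipfish at it
# with the @ prefix. The URLs below are placeholders.
printf '%s\n' \
  'http://192.168.128.129/dvwa/' \
  'http://192.168.128.130/app/' > url.txt

# Then run (on a machine with skipfish installed):
# skipfish -o test @url.txt
```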


4. skipfish -o test -S complete.wl -W abc.wl http://1.1.1.1 # dictionaries

# Dictionaries shipped with skipfish
root@kali:~# dpkg -L skipfish | grep wl
    /usr/share/skipfish/dictionaries/medium.wl
    /usr/share/skipfish/dictionaries/minimal.wl
    /usr/share/skipfish/dictionaries/extensions-only.wl
    /usr/share/skipfish/dictionaries/complete.wl
# Specify a supplemental read-only dictionary (-S)
root@kali:~# skipfish -o test1 -I /dvwa/ -S /usr/share/skipfish/dictionaries/minimal.wl http://172.16.10.133/dvwa/
    NOTE: The scanner is currently configured for directory brute-force attacks,
    and will make about 65130 requests per every fuzzable location. If this is
    not what you wanted, stop now and consult the documentation.
# Save keywords learned from the target site to a read-write dictionary (-W)
root@kali:~# skipfish -o test1 -I /dvwa/ -S /usr/share/skipfish/dictionaries/minimal.wl -W abc.wl http://172.16.10.133/dvwa/


5. More operations

-I: only follow URLs containing 'string'
skipfish -o test -I /dvwa/ http://1.1.1.1/dvwa/

-X: skip URLs containing 'string', e.g. login
skipfish -o test -X /login/ http://1.1.1.1/dvwa/

-S: crawl the site with a supplemental wordlist
skipfish -o test -S complete.wl http://1.1.1.1/dvwa/

-K: do not fuzz the named parameter
Use this when you want certain parameters left out of fuzz testing

-D: crawl cross-site links to another domain
skipfish -o test -D xxx.com -I /dvwa/ http://1.1.1.1/dvwa/

-l: maximum requests per second; real throughput also depends on your network environment
skipfish -o test -l 200 -S complete.wl http://1.1.1.1/dvwa/

-m: maximum simultaneous connections per target IP
skipfish -o test -m 100 -I /dvwa/ http://1.1.1.1/dvwa/


3. Authentication

HTTP-based authentication
skipfish -A user:pass -o test http://1.1.1.1/dvwa/


Cookie-based authentication
skipfish -C "name=val" -o test http://1.1.1.1/dvwa/
If there are multiple cookies, add one more -C "name=val" per cookie:
skipfish  -o test -C "name=val"  -C "name=val" http://1.1.1.1/dvwa/



Form-based authentication
skipfish -o test --auth-form url --auth-form-target url --auth-user-field <username field name> --auth-user <username> --auth-pass-field <password field name> --auth-pass <password> --auth-verify-url url http://1.1.1.1/dvwa/

--auth-form url: the page containing the login form
--auth-form-target: the URL the form is submitted to for processing
--auth-verify-url: the URL reached after a successful submission, i.e. the page shown once authentication succeeds
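Putting the pieces together, here is a hedged sketch of a full form-authentication command against a DVWA-style target. The field names `username` and `password`, the URLs, and the logout exclusion are assumptions; inspect the login form's HTML to confirm the real input names and the form's action URL. The command is assembled into a variable and echoed so it can be reviewed before running.

```shell
# Sketch: form-based login for a DVWA-style target. Field names and URLs
# are assumptions; adjust them to match the actual login form.
CMD="skipfish -o test12 \
  --auth-form http://192.168.128.129/dvwa/login.php \
  --auth-form-target http://192.168.128.129/dvwa/login.php \
  --auth-user-field username --auth-user admin \
  --auth-pass-field password --auth-pass password \
  --auth-verify-url http://192.168.128.129/dvwa/index.php \
  -X logout.php \
  http://192.168.128.129/dvwa/"

# Review, then run on a machine with skipfish installed:
echo "$CMD"
```

Excluding logout.php (-X) keeps the scanner from logging its own session out, as in the cookie-based example above.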


