Using the skipfish Web Security Scanner on Kali Linux


0x00. Introduction to skipfish

skipfish is an open-source web application security assessment tool released by Google.

Key characteristics of skipfish: low CPU usage, fast scanning (it can comfortably handle 2000 requests per second), and a low false-positive rate.

1x00. Using skipfish

1x01 Help information

root@kali:~# skipfish --help
    skipfish web application scanner - version 2.10b
    Usage: skipfish [ options ... ] -W wordlist -o output_dir start_url [ start_url2 ... ]

    Authentication and access options:

      -A user:pass      - use specified HTTP authentication credentials
      -F host=IP        - pretend that 'host' resolves to 'IP'
      -C name=val       - append a custom cookie to all requests
      -H name=val       - append a custom HTTP header to all requests
      -b (i|f|p)        - use headers consistent with MSIE / Firefox / iPhone
      -N                - do not accept any new cookies
      --auth-form url   - form authentication URL
      --auth-user user  - form authentication user
      --auth-pass pass  - form authentication password
      --auth-verify-url -  URL for in-session detection

    Crawl scope options:

      -d max_depth     - maximum crawl tree depth (16)
      -c max_child     - maximum children to index per node (512)
      -x max_desc      - maximum descendants to index per branch (8192)
      -r r_limit       - max total number of requests to send (100000000)
      -p crawl%        - node and link crawl probability (100%)
      -q hex           - repeat probabilistic scan with given seed
      -I string        - only follow URLs matching 'string'
      -X string        - exclude URLs matching 'string'
      -K string        - do not fuzz parameters named 'string'
      -D domain        - crawl cross-site links to another domain
      -B domain        - trust, but do not crawl, another domain
      -Z               - do not descend into 5xx locations
      -O               - do not submit any forms
      -P               - do not parse HTML, etc, to find new links

    Reporting options:

      -o dir          - write output to specified directory (required)
      -M              - log warnings about mixed content / non-SSL passwords
      -E              - log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
      -U              - log all external URLs and e-mails seen
      -Q              - completely suppress duplicate nodes in reports
      -u              - be quiet, disable realtime progress stats
      -v              - enable runtime logging (to stderr)

    Dictionary management options:

      -W wordlist     - use a specified read-write wordlist (required)
      -S wordlist     - load a supplemental read-only wordlist
      -L              - do not auto-learn new keywords for the site
      -Y              - do not fuzz extensions in directory brute-force
      -R age          - purge words hit more than 'age' scans ago
      -T name=val     - add new form auto-fill rule
      -G max_guess    - maximum number of keyword guesses to keep (256)

      -z sigfile      - load signatures from this file

    Performance settings:

      -g max_conn     - max simultaneous TCP connections, global (40)
      -m host_conn    - max simultaneous connections, per target IP (10)
      -f max_fail     - max number of consecutive HTTP errors (100)
      -t req_tmout    - total request response timeout (20 s)
      -w rw_tmout     - individual network I/O timeout (10 s)
      -i idle_tmout   - timeout on idle HTTP connections (10 s)
      -s s_limit      - response size limit (400000 B)
      -e              - do not keep binary responses for reporting

    Other settings:

      -l max_req      - max requests per second (0.000000)
      -k duration     - stop scanning after the given duration h:m:s
      --config file   - load the specified configuration file

    Send comments and complaints to <heinenn@google.com>.
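
Among the options above, the form-based authentication flags benefit from a worked example. The sketch below is hypothetical (every URL and credential is a placeholder): skipfish submits the given credentials to the login form at --auth-form, then periodically requests --auth-verify-url to check that the session is still valid:

    root@kali:~# touch mydict.wl
    root@kali:~# skipfish -o auth_report -W mydict.wl \
        --auth-form http://192.168.1.100/login \
        --auth-user admin --auth-pass secret \
        --auth-verify-url http://192.168.1.100/profile \
        http://192.168.1.100/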

 

1x02 Common commands and options

• skipfish -o test [url]  # test is the output directory where the report is saved
• skipfish -o test @url.txt  # read the target URLs from the list file url.txt
• skipfish -o test -S complete.wl -W abc.wl [url]  # -S loads a supplemental read-only wordlist; -W specifies the read-write wordlist (required); see the sketch after this list
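
A concrete sketch of the wordlist pattern, assuming the Kali package layout where the bundled dictionaries live under /usr/share/skipfish/dictionaries/ (the target URL is a placeholder). The read-only complete.wl is loaded with -S, while -W points at a fresh writable file where skipfish stores keywords it learns during the scan:

    root@kali:~# touch mydict.wl
    root@kali:~# skipfish -o report_dir \
        -S /usr/share/skipfish/dictionaries/complete.wl \
        -W mydict.wl http://192.168.1.100/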

• -I only follow URLs containing 'string'
• -X exclude URLs containing 'string'
• -K do not fuzz parameters with the given name
• -D crawl cross-site links into another domain
• -l maximum requests per second
• -m maximum simultaneous connections per target IP (several of these flags are combined in the sketch after this list)
• --config load the specified configuration file
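
A sketch combining the scope and throttling flags above; the target URL, the /app/ and /logout path fragments, and the csrf_token parameter name are all hypothetical placeholders:

    root@kali:~# touch mydict.wl
    root@kali:~# skipfish -o report_dir -W mydict.wl \
        -I /app/ -X /logout -K csrf_token \
        -l 50 -m 5 http://192.168.1.100/app/

This restricts the crawl to URLs containing /app/, skips anything containing /logout, leaves the csrf_token parameter unfuzzed, and throttles the scan to at most 50 requests per second with no more than 5 connections per target IP.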

 

