Debug information in Scrapy
Setting up logging in Scrapy
1. Set the log level by adding a single line to settings.py.
Scrapy provides five logging levels:
CRITICAL - critical errors
ERROR - regular errors
WARNING - warning messages
INFO - informational messages
DEBUG - debugging messages
Scrapy shows DEBUG-level log messages by default; the sketch below raises that threshold.
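For example, a minimal settings.py entry that raises the level to WARNING (the level chosen here is only an illustration; any of the five levels above works):

# settings.py
# Only WARNING, ERROR and CRITICAL messages are shown;
# DEBUG and INFO output is suppressed.
LOG_LEVEL = 'WARNING'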
2. Save the output to a log file by adding a file path in settings.py:
LOG_FILE = './log.log'
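LOG_FILE can be combined with other logging settings in the same file. A minimal sketch, assuming the log file path and level used here are just illustrative choices (LOG_FORMAT is shown with Scrapy's default format string):

# settings.py
LOG_FILE = './log.log'      # write all log output to this file instead of the console
LOG_LEVEL = 'INFO'          # record INFO and above in the file
LOG_ENCODING = 'utf-8'      # encoding used when writing the log file
LOG_FORMAT = '%(asctime)s [%(name)s] %(levelname)s: %(message)s'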
3. Show where a log message comes from by creating a module-level logger, in pipelines.py:
import logging

# the logger is named after this module, so log lines show their origin
logger = logging.getLogger(__name__)

class DcdAppPipeline:
    def process_item(self, item, spider):
        # log the item at WARNING level so it appears even with a higher LOG_LEVEL
        logger.warning(item)
        return item
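An alternative is to use the logger attached to the spider object that Scrapy passes into process_item; a minimal sketch (the message text is only illustrative):

class DcdAppPipeline:
    def process_item(self, item, spider):
        # spider.logger is a standard logging.Logger named after the spider
        spider.logger.info('pipeline received item: %s', item)
        return item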
4. Configure logging in the spider file:
import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    custom_settings = {
        # enable the item pipeline
        'ITEM_PIPELINES': {
            'autospider.pipelines.DcdAppPipeline': 300,
        },
        # per-spider log settings
        'LOG_LEVEL': 'DEBUG',
        'LOG_FILE': './././Log/dcdapp_log.log',
    }
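Inside the spider, messages can then be emitted through self.logger, which Scrapy provides on every Spider instance. A minimal sketch, assuming the start URL and parse logic below are only placeholders for illustration:

import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    start_urls = ['https://m.dcdapp.com/']  # placeholder start URL

    def parse(self, response):
        # self.logger is a logging.Logger named after the spider ('dcdapp')
        self.logger.debug('parsing %s (status %s)', response.url, response.status)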