Using Logging in Scrapy


 

#coding:utf-8
__author__ = 'similarface'
######################
## Using logging
######################
import logging
'''
Standard severity levels, from highest to lowest:
1. logging.CRITICAL - for critical errors (highest severity)
2. logging.ERROR - for regular errors
3. logging.WARNING - for warning messages
4. logging.INFO - for informational messages
5. logging.DEBUG - for debugging messages (lowest severity)
A level also lets everything above it through: e.g. INFO shows info,
warnings and errors, while WARNING shows only warnings and errors.
'''
logging.warning("This is a warning")

logging.log(logging.WARNING, "This is a warning")

# Get the root logger instance
logger = logging.getLogger()
logger.warning("This is a warning message")
# Name the logger to identify the source of the message
logger = logging.getLogger('SimilarFace')
logger.warning("This is a warning")
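Since WARNING is the default threshold, messages below it are silently dropped. A minimal, Scrapy-free sketch of how the levels filter (the logger name "demo" and the in-memory stream are just for illustration):

```python
import io
import logging

# Capture output in memory so we can inspect what was emitted.
stream = io.StringIO()
handler = logging.StreamHandler(stream)

logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.WARNING)  # same threshold the root logger defaults to

logger.info("hidden")      # INFO < WARNING, filtered out
logger.warning("visible")  # WARNING passes the threshold

print(stream.getvalue())   # only "visible" appears
```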

# Using logging from a spider
import scrapy
class MySpider(scrapy.Spider):
    name = 'myspider'
    start_urls = ['http://scrapinghub.com']
    def parse(self, response):
        # Option 1: the logger Scrapy attaches to every spider
        self.logger.info('Parse function called on %s', response.url)
        # Option 2: a logger you define yourself (here, the module-level one above)
        logger.info('Parse function called on %s', response.url)
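A common pattern for option 2 is a module-level logger named with `__name__`, so every record shows which module produced it. A standalone sketch, with an example format string (the in-memory stream is only there so the output can be inspected):

```python
import io
import logging

# One logger per module, named after the module itself.
logger = logging.getLogger(__name__)

# A handler whose format includes the logger's name.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s %(levelname)s: %(message)s"))
logger.addHandler(handler)

logger.warning("Parse function called on %s", "http://scrapinghub.com")
print(stream.getvalue())
```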

'''
Logging settings
• LOG_FILE
• LOG_ENABLED
• LOG_ENCODING
• LOG_LEVEL
• LOG_FORMAT
• LOG_DATEFORMAT
• LOG_STDOUT

Command-line usage:
--logfile FILE
Overrides LOG_FILE

--loglevel/-L LEVEL
Overrides LOG_LEVEL

--nolog
Sets LOG_ENABLED to False
'''
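These settings normally live in the project's settings.py. A sketch with hypothetical values (the file name and format string are assumptions, not Scrapy defaults):

```python
# settings.py -- example values only
LOG_ENABLED = True       # master switch for Scrapy's logging
LOG_FILE = 'scrapy.log'  # write to a file instead of stderr
LOG_ENCODING = 'utf-8'
LOG_LEVEL = 'INFO'       # drop DEBUG messages
LOG_FORMAT = '%(asctime)s [%(name)s] %(levelname)s: %(message)s'
LOG_DATEFORMAT = '%Y-%m-%d %H:%M:%S'
LOG_STDOUT = False       # if True, process stdout (e.g. print) is redirected to the log
```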

import logging
from scrapy.utils.log import configure_logging

configure_logging(install_root_handler=False)
# Configure the root logger ourselves instead of letting Scrapy install its handler
logging.basicConfig(
    filename='/Users/similarface/PycharmProjects/FluentPython/log.txt',
    format='%(levelname)s: %(message)s',
    level=logging.INFO
)
# basicConfig opens the file in append mode, so repeated runs append to it
logging.info('This goes into the log file')
logger = logging.getLogger('SimilarFace')
logger.warning("This also goes into the log file")
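The append behaviour can be verified outside Scrapy: basicConfig defaults to filemode='a', so each run adds to the file (pass filemode='w' to truncate instead). A sketch using a temporary path, with illustrative messages:

```python
import logging
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "log.txt")

# force=True (Python 3.8+) replaces any handlers already on the root logger,
# so basicConfig takes effect even if logging was configured earlier.
logging.basicConfig(
    filename=path,
    format="%(levelname)s: %(name)s: %(message)s",
    level=logging.INFO,
    force=True,
)
logging.info("first run message")
logging.getLogger("SimilarFace").warning("named logger message")

# Flush the file handler, then read back what was written.
for h in logging.getLogger().handlers:
    h.flush()
with open(path) as f:
    print(f.read())
```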

 

