Fixing Python crawler request headers that don't take effect after being set


The problem this time came from unfamiliarity with how a function is used, but with the right analysis tool it can be pinned down quickly (I highly recommend the packet-capture tool Fiddler).

 

Here is what happened:

While scraping data from a certain app (all of the app's data is fetched over HTTP), I analyzed the request with Fiddler and copied the request header information into my Python program to make the request.

The code was as follows:

import requests
url = 'http://xxx?startDate=2017-10-19&endDate=2017-10-19&pageIndex=1&limit=50&sort=datetime&order=desc'

headers={
    "Host":"xxx.com",
    "Connection": "keep-alive",
    "Accept": "application/json, text/javascript, */*; q=0.01",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.59 Safari/537.36",
    "X-Requested-With": "XMLHttpRequest",
    "Referer": "http://app.jg.eastmoney.com/html_Report/index.html",
    "Accept-Encoding": "gzip,deflate",
    "Accept-Language": "en-us,en",
    "Cookie":"xxx"
}
r = requests.get(url, headers)
print(r.text)

The request went through, but the response was:

{"Id":"6202c187-2fad-46e8-b4c6-b72ac8de0142","ReturnMsg":"加載失敗!"}

In other words, the request was detected as not coming from a normal client and was blocked (the ReturnMsg means "load failed").

 

I then went back to Fiddler to look at the record of the request Python had just sent (in the screenshot, the two covered-up parts are the Host and the URL).

 

Looking at the request details, my headers had not been applied at all: the User-Agent was simply python-requests! (A User-Agent that identifies a Java or Python program will be blocked by any website or app with even basic anti-crawling measures.)
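That python-requests value is just the library's default. As a quick check (not part of the original post), you can print what Requests sends when you give it no headers at all and see the same User-Agent that Fiddler showed:

import requests

# Requests' built-in default headers; the User-Agent advertises the library itself.
print(requests.utils.default_headers())
# e.g. {'User-Agent': 'python-requests/2.18.4', 'Accept-Encoding': 'gzip, deflate',
#       'Accept': '*/*', 'Connection': 'keep-alive'}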

The header details were as follows:

GET http://xxx?startDate=2017-10-19&endDate=2017-10-19&pageIndex=1&limit=50&sort=datetime&order=desc
  &Host=xxx.com
  &Connection=keep-alive
  &Accept=application%2Fjson%2C+text%2Fjavascript%2C+%2A%2F%2A%3B+q%3D0.01
  &User-Agent=Mozilla%2F5.0+%28Windows+NT+6.1%3B+WOW64%29+AppleWebKit%2F537.36+%28KHTML%2C+like+Gecko%29+Chrome%2F29.0.1547.59+Safari%2F537.36
  &X-Requested-With=XMLHttpRequest
  &Referer=xxx
  &Accept-Encoding=gzip%2Cdeflate
  &Accept-Language=en-us%2Cen
  &Cookie=xxx
HTTP/1.1

Host: xxx.com
User-Agent: python-requests/2.18.4
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive

HTTP/1.1 200 OK
Server: nginx/1.2.2
Date: Sat, 21 Oct 2017 06:07:21 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 75
Connection: keep-alive
Cache-Control: private
X-AspNetMvc-Version: 5.2
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET

 

I didn't spot it at first, but once I read the request URL all the way through I realized the program had put my header fields into the URL as query parameters.
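You can also confirm this from Python itself, without Fiddler: the Response object keeps the PreparedRequest that was actually sent. Here is a minimal sketch of the same mistake against httpbin.org, which is only a stand-in endpoint and not from the original post:

import requests

url = "https://httpbin.org/get"
headers = {"User-Agent": "Mozilla/5.0", "Accept": "application/json"}

r = requests.get(url, headers)             # same mistake: the headers dict is passed positionally
print(r.request.url)                       # header names and values show up URL-encoded in the query string
print(r.request.headers["User-Agent"])     # still python-requests/<version>, the library default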

 

So I had used the header parameter of the request function incorrectly.

I went back over how the Requests library's headers parameter is used and found the line that was wrong: the second positional parameter of requests.get() is params, so the bare dict was being URL-encoded into the query string. The headers have to be passed explicitly as the keyword argument "headers=".

The corrected code:

import requests
url = 'http://xxx?startDate=2017-10-19&endDate=2017-10-19&pageIndex=1&limit=50&sort=datetime&order=desc'

headers={
    "Host":"xxx.com",
    "Connection": "keep-alive",
    "Accept": "application/json, text/javascript, */*; q=0.01",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.59 Safari/537.36",
    "X-Requested-With": "XMLHttpRequest",
    "Referer": "http://app.jg.eastmoney.com/html_Report/index.html",
    "Accept-Encoding": "gzip,deflate",
    "Accept-Language": "en-us,en",
    "Cookie":"xxx"
}
r = requests.get(url, headers=headers)
print(r.text)

 

Then I checked the request in Fiddler again.

This time the request headers sent from Python were correct; the request details were as follows:

GET http://xxx?startDate=2017-10-19&endDate=2017-10-19&pageIndex=1&limit=50&sort=datetime&order=desc HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.59 Safari/537.36
Accept-Encoding: gzip,deflate
Accept: application/json, text/javascript, */*; q=0.01
Connection: keep-alive
Host: xxx.com
X-Requested-With: XMLHttpRequest
Referer: http://xxx
Accept-Language: en-us,en
Cookie: xxx


HTTP/1.1 200 OK
Server: nginx/1.2.2
Date: Sat, 21 Oct 2017 06:42:21 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 75
Connection: keep-alive
Cache-Control: private
X-AspNetMvc-Version: 5.2
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET

 

Then I ran the request from the Python program once more. It went through, but it still returned

{"Id":"6202c187-2fad-46e8-b4c6-b72ac8de0142","ReturnMsg":"加載失敗!"}

 

Since cookies generally expire after a short time, I updated the cookie, and the request then succeeded.
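One way to avoid hard-coding a Cookie header that keeps expiring is to let a requests.Session manage cookies. This is only a sketch of that idea, not the original code; it assumes (hypothetically) that visiting the app's entry page is what issues the session cookie:

import requests

# Browser-like headers live on the Session; any cookies the server sets are
# stored in session.cookies and re-sent automatically on later requests.
session = requests.Session()
session.headers.update({
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 "
                  "(KHTML, like Gecko) Chrome/29.0.1547.59 Safari/537.36",
    "X-Requested-With": "XMLHttpRequest",
    "Referer": "http://app.jg.eastmoney.com/html_Report/index.html",
})

# Hypothetical: whichever request issues the session cookie goes first.
session.get("http://app.jg.eastmoney.com/html_Report/index.html")

# Subsequent API calls reuse the stored cookies without a hand-copied Cookie header.
url = 'http://xxx?startDate=2017-10-19&endDate=2017-10-19&pageIndex=1&limit=50&sort=datetime&order=desc'
r = session.get(url)
print(r.text)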

 

 

One final note: when writing a crawler, make sure the request headers are set up properly. If this app's anti-crawling measures banned IPs instead of just rejecting requests, things would get a lot more troublesome.

 

