While writing a crawler I made a very basic mistake with a requests call. Because the User-Agent value is too long, the browser's developer tools display it truncated with an ellipsis, and I copied that shortened value straight into my headers, so the request failed with an encoding error at runtime. The full traceback:
Traceback (most recent call last):
  File "D:/python_code/xxxx/xxxx.py", line 34, in <module>
    getCarrier()
  File "D:/python_code/xxxx/xxxx.py", line 26, in getCarrier
    r = requests.get(url, headers=headers)
  File "D:\Python\Python36\lib\site-packages\requests\api.py", line 72, in get
    return request('get', url, params=params, **kwargs)
  File "D:\Python\Python36\lib\site-packages\requests\api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "D:\Python\Python36\lib\site-packages\requests\sessions.py", line 508, in request
    resp = self.send(prep, **send_kwargs)
  File "D:\Python\Python36\lib\site-packages\requests\sessions.py", line 618, in send
    r = adapter.send(request, **kwargs)
  File "D:\Python\Python36\lib\site-packages\requests\adapters.py", line 440, in send
    timeout=timeout
  File "D:\Python\Python36\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
    chunked=chunked)
  File "D:\Python\Python36\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "D:\Python\Python36\lib\http\client.py", line 1239, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "D:\Python\Python36\lib\http\client.py", line 1280, in _send_request
    self.putheader(hdr, value)
  File "D:\Python\Python36\lib\http\client.py", line 1212, in putheader
    values[i] = one_value.encode('latin-1')
UnicodeEncodeError: 'latin-1' codec can't encode character '\u2026' in position 30: ordinal not in range(256)
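The '\u2026' in the last line is the ellipsis character itself: http.client encodes every header value as latin-1 before sending it, and the copied, truncated User-Agent contains that non-latin-1 character, so the request dies inside putheader before it is even sent. The fix is simply to copy the complete User-Agent string (expand the value in the developer tools, or grab it from chrome://version) instead of the shortened one. Below is a minimal sketch of the corrected request; the URL and the exact User-Agent string are placeholders, substitute whatever your own browser reports.

# -*- coding: utf-8 -*-
import requests

# Placeholder target; replace with the page you are actually crawling.
url = 'http://example.com'

# Use the FULL User-Agent copied from the browser (e.g. chrome://version),
# not the value DevTools shows shortened with an ellipsis.
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                  'AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/63.0.3239.132 Safari/537.36'
}

# Optional sanity check: header values must be latin-1 encodable, so a
# stray ellipsis is caught here with a clear error instead of deep inside
# http.client.putheader.
for value in headers.values():
    value.encode('latin-1')

r = requests.get(url, headers=headers)
print(r.status_code)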