【Problem】
I created a project with Scrapy:
E:\Dev_Root\python\Scrapy>scrapy startproject songtaste
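For context, startproject generates a project skeleton roughly like the one below (the exact files can vary slightly between Scrapy versions); the scrapy.cfg at the top level is what marks the tree as a Scrapy project:

songtaste/
    scrapy.cfg            # project configuration file
    songtaste/            # the project's Python package
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py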
Then, when running the project, it failed with an error:
E:\Dev_Root\python\Scrapy>scrapy crawl songtaste -t json -o h1user.json
Scrapy 0.16.2 - no active project
Unknown command: crawl
Use "scrapy" to see available commands
【Solution Process】
1. Referred to:
Trying to get Scrapy into a project to run Crawl command
Only then did I learn that you need to switch into the folder where the corresponding project lives before running crawl.
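The underlying reason is that crawl is a project-only command: Scrapy decides whether there is an "active project" by looking for a scrapy.cfg file in the current directory or one of its parent directories. That file is created by startproject; for this project it should contain roughly the following (other sections, such as [deploy], may also be present depending on the Scrapy version):

[settings]
default = songtaste.settings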
2. So I switched into the project directory, i.e. the songtaste subfolder, ran it again, and it worked:
E:\Dev_Root\python\Scrapy>cd songtaste

E:\Dev_Root\python\Scrapy\songtaste>scrapy crawl songtaste -t json -o h1user.json
2012-11-12 22:39:13+0800 [scrapy] INFO: Scrapy 0.16.2 started (bot: songtaste)
2012-11-12 22:39:13+0800 [scrapy] DEBUG: Enabled extensions: FeedExporter, LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2012-11-12 22:39:14+0800 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
2012-11-12 22:39:14+0800 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2012-11-12 22:39:14+0800 [scrapy] DEBUG: Enabled item pipelines:
2012-11-12 22:39:14+0800 [songtaste] INFO: Spider opened
2012-11-12 22:39:14+0800 [songtaste] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2012-11-12 22:39:14+0800 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
2012-11-12 22:39:14+0800 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
2012-11-12 22:39:14+0800 [songtaste] DEBUG: Crawled (200) <GET http://www.songtaste.com/user/351979/> (referer: None)
2012-11-12 22:39:14+0800 [songtaste] DEBUG: Scraped from <200 http://www.songtaste.com/user/351979/>
        {'h1user': [u'crifan']}
2012-11-12 22:39:14+0800 [songtaste] INFO: Closing spider (finished)
2012-11-12 22:39:14+0800 [songtaste] INFO: Stored json feed (1 items) in: h1user.json
2012-11-12 22:39:14+0800 [songtaste] INFO: Dumping Scrapy stats:
        {'downloader/request_bytes': 235,
         'downloader/request_count': 1,
         'downloader/request_method_count/GET': 1,
         'downloader/response_bytes': 11058,
         'downloader/response_count': 1,
         'downloader/response_status_count/200': 1,
         'finish_reason': 'finished',
         'finish_time': datetime.datetime(2012, 11, 12, 14, 39, 14, 431000),
         'item_scraped_count': 1,
         'log_count/DEBUG': 8,
         'log_count/INFO': 5,
         'response_received_count': 1,
         'scheduler/dequeued': 1,
         'scheduler/dequeued/memory': 1,
         'scheduler/enqueued': 1,
         'scheduler/enqueued/memory': 1,
         'start_time': datetime.datetime(2012, 11, 12, 14, 39, 14, 275000)}
2012-11-12 22:39:14+0800 [songtaste] INFO: Spider closed (finished)
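The spider source itself is not shown above. For reference, a minimal sketch that would produce the {'h1user': [u'crifan']} output under the Scrapy 0.16 API could look like the following; the class names, file names, and XPath expression are my assumptions, while the field name h1user, the spider name songtaste, and the start URL are taken from the log:

# songtaste/items.py -- hypothetical reconstruction
from scrapy.item import Item, Field

class SongtasteItem(Item):
    # field name taken from the scraped output in the log above
    h1user = Field()

# songtaste/spiders/songtaste_spider.py -- a minimal sketch, not the original source
from scrapy.spider import BaseSpider
from scrapy.selector import HtmlXPathSelector

from songtaste.items import SongtasteItem

class SongtasteSpider(BaseSpider):
    name = "songtaste"                    # must match the name passed to "scrapy crawl"
    allowed_domains = ["songtaste.com"]
    start_urls = ["http://www.songtaste.com/user/351979/"]

    def parse(self, response):
        hxs = HtmlXPathSelector(response)
        item = SongtasteItem()
        # assumed XPath: grab the user name from the page's <h1> heading
        item["h1user"] = hxs.select("//h1/text()").extract()
        return item

Running scrapy crawl songtaste -t json -o h1user.json from inside the project directory then writes the scraped item to h1user.json via the feed exporter, as shown in the log above.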
【Summary】
Although this is a very basic mistake, it is easy to make when you have just started with Scrapy and are not yet familiar with it.
