python - Scrapy shell doesn't work
I'm new to Scrapy and want to try the Scrapy shell to debug and learn, but strangely the shell command doesn't work at all.
It seems the website is crawled, but nothing more gets printed. The program just hangs as if dead, and I have to use Ctrl-C to end it.
Can anyone figure out what's wrong?
I'm using Anaconda with Scrapy 1.0.3.
$ ping 135.251.157.2

Pinging 135.251.157.2 with 32 bytes of data:
Reply from 135.251.157.2: bytes=32 time=13ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56

Ping statistics for 135.251.157.2:
    Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 13ms, Maximum = 14ms, Average = 13ms

$ scrapy shell "http://135.251.157.2/"
2016-01-28 21:35:18 [scrapy] INFO: Scrapy 1.0.3 started (bot: demo)
2016-01-28 21:35:18 [scrapy] INFO: Optional features available: ssl, http11, boto
2016-01-28 21:35:18 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'demo.spiders', 'SPIDER_MODULES': ['demo.spiders'], 'LOGSTATS_INTERVAL': 0, 'BOT_NAME': 'demo'}
2016-01-28 21:35:18 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2016-01-28 21:35:19 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, HttpProxyMiddleware, ChunkedTransferMiddleware, DownloaderStats
2016-01-28 21:35:19 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2016-01-28 21:35:19 [scrapy] INFO: Enabled item pipelines:
2016-01-28 21:35:19 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-01-28 21:35:19 [scrapy] INFO: Spider opened
2016-01-28 21:35:24 [scrapy] DEBUG: Crawled (200) <GET http://135.251.157.2/> (referer: None)
2016-01-28 21:35:24 [root] DEBUG: Using default logger
2016-01-28 21:35:24 [root] DEBUG: Using default logger
Ctrl-C
$
I'd like to close this thread: I found out the root cause is related to the terminal. When I use Git Bash, the shell doesn't work, but if I use the Anaconda prompt, it works quite well.
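For anyone who wants to stay in Git Bash: its mintty terminal does not allocate a Windows console, so console-mode interactive programs (python, IPython, and hence scrapy shell) can start but never show their prompt. A common workaround, assuming the winpty wrapper that ships with Git for Windows is available, is to run the command through winpty:

```shell
# Sketch of a Git Bash workaround (Windows only, assumes winpty is on PATH,
# as it is in recent Git for Windows installs). winpty allocates a real
# Windows console for the child process so its interactive prompt appears:
winpty scrapy shell "http://135.251.157.2/"
```

This does not change Scrapy itself; it only fixes how the terminal and the interactive prompt talk to each other, which matches the observation that the same command works in the Anaconda prompt (a native Windows console).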
[Anaconda2] D:\svn\tools\spider\demo>scrapy shell "http://135.251.157.2/"
2016-01-29 13:40:33 [scrapy] INFO: Scrapy 1.0.3 started (bot: demo)
2016-01-29 13:40:33 [scrapy] INFO: Optional features available: ssl, http11, boto
2016-01-29 13:40:33 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'demo.spiders', 'SPIDER_MODULES': ['demo.spiders'], 'LOGSTATS_INTERVAL': 0, 'BOT_NAME': 'demo'}
2016-01-29 13:40:33 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2016-01-29 13:40:33 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, HttpProxyMiddleware, ChunkedTransferMiddleware, DownloaderStats
2016-01-29 13:40:33 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2016-01-29 13:40:33 [scrapy] INFO: Enabled item pipelines:
2016-01-29 13:40:33 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-01-29 13:40:33 [scrapy] INFO: Spider opened
2016-01-29 13:40:41 [scrapy] DEBUG: Crawled (200) <GET http://135.251.157.2/> (referer: None)
[s] Available Scrapy objects:
[s]   crawler    <scrapy.crawler.Crawler object at 0x0136b290>
[s]   item       {}
[s]   request    <GET http://135.251.157.2/>
[s]   response   <200 http://135.251.157.2/>
[s]   settings   <scrapy.settings.Settings object at 0x034204b0>
[s]   spider     <DefaultSpider 'default' at 0x3e3c6d0>
[s] Useful shortcuts:
[s]   shelp()           Shell help (print this help)
[s]   fetch(req_or_url) Fetch request (or URL) and update local objects
[s]   view(response)    View response in a browser
2016-01-29 13:40:41 [root] DEBUG: Using default logger
2016-01-29 13:40:41 [root] DEBUG: Using default logger
In [1]:
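Once the shell starts like this, the `response` object is what you experiment with, e.g. `response.xpath('//title/text()')` to pull out the page title. As a rough standalone sketch of that kind of extraction using only the Python standard library (this is not Scrapy's API; `TitleParser` and the sample HTML below are made up for illustration):

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the <title> element, roughly what
    response.xpath('//title/text()') would return in the Scrapy shell."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        # Only accumulate text while we are inside <title>...</title>
        if self.in_title:
            self.title += data

p = TitleParser()
p.feed("<html><head><title>Demo</title></head><body></body></html>")
print(p.title)  # Demo
```

In the Scrapy shell itself you would skip all of this and just query `response` directly; the point is only to show the shape of the extraction step you are debugging interactively.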