

A Worked Scrapy Spider Example: Crawling Xiaohuar

2020-02-16 10:26:52 · Contributed by a reader

學(xué)習(xí)爬蟲有一段時間了,今天使用Scrapy框架將校花網(wǎng)的圖片爬取到本地。Scrapy爬蟲框架相對于使用requests庫進行網(wǎng)頁的爬取,擁有更高的性能。

Scrapy's official definition: Scrapy is an application framework for crawling websites and extracting structured data, which can be used for a wide range of useful applications, like data mining, information processing, or historical archival.

Creating a Scrapy project

With the Scrapy framework installed, create the project directly from the command line:

E:\ScrapyDemo>scrapy startproject xiaohuar
New Scrapy project 'xiaohuar', using template directory 'c:\users\lei\appdata\local\programs\python\python35\lib\site-packages\scrapy\templates\project', created in:
    E:\ScrapyDemo\xiaohuar

You can start your first spider with:
    cd xiaohuar
    scrapy genspider example example.com

創(chuàng)建一個Scrapy爬蟲

創(chuàng)建工程的時候,會自動創(chuàng)建一個與工程同名的目錄,進入到目錄中執(zhí)行如下命令:

E:\ScrapyDemo\xiaohuar>scrapy genspider -t basic xiaohua xiaohuar.com
Created spider 'xiaohua' using template 'basic' in module:
  xiaohuar.spiders.xiaohua

In this command, "xiaohua" is the file name of the generated spider's *.py module, and "xiaohuar.com" is the URL of the site to be crawled; it can be changed later in the code.

Writing the spider code

Edit the xiaohua.py file under E:\ScrapyDemo\xiaohuar\xiaohuar\spiders. The main tasks are configuring the URLs to crawl and defining how each fetched page is parsed.

# -*- coding: utf-8 -*-
import scrapy
from scrapy.http import Request
import re

class XiaohuaSpider(scrapy.Spider):
    name = 'xiaohua'
    allowed_domains = ['xiaohuar.com']
    start_urls = []
    for i in range(43):
        url = "http://www.xiaohuar.com/list-1-%s.html" % i
        start_urls.append(url)

    def parse(self, response):
        if "www.xiaohuar.com/list-1" in response.url:
            # The downloaded HTML source of a list page
            html = response.text
            # Images appear in the page as, e.g.:
            # src="/d/file/20160126/905e563421921adf9b6fb4408ec4e72f.jpg"
            # Regex-match all of them; this yields a list of relative paths
            img_urls = re.findall(r'/d/file/\d+/\w+\.jpg', html)
            # Request each image page
            for img_url in img_urls:
                # Complete the relative image URL
                if "http://" not in img_url:
                    img_url = "http://www.xiaohuar.com%s" % img_url
                # Yield the request; the response comes back to parse()
                yield Request(img_url)
        else:
            # An image response: save it to disk
            url = response.url
            # File name to save the image under
            title = re.findall(r'\w*\.jpg', url)[0]
            # Write the image bytes
            with open('E:\\xiaohua_img\\%s' % title, 'wb') as f:
                f.write(response.body)

Here a regular expression is used to match the image addresses. Other sites work much the same way; inspect the specific page's source and adapt the expression accordingly.
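As a quick sanity check outside Scrapy, the extraction and URL-completion steps above can be tried on a small HTML fragment. The snippet below is a hypothetical sample that reuses the src format quoted in the code comments:

```python
import re

# Hypothetical fragment of a list page, using the src format shown above
html = '<img src="/d/file/20160126/905e563421921adf9b6fb4408ec4e72f.jpg" alt="">'

# Match relative image paths of the form /d/file/<digits>/<word chars>.jpg
img_urls = re.findall(r'/d/file/\d+/\w+\.jpg', html)

# Complete each relative path against the site root, as the spider does
full_urls = ["http://www.xiaohuar.com%s" % u if "http://" not in u else u
             for u in img_urls]

# The file name the spider would save the image under
title = re.findall(r'\w*\.jpg', full_urls[0])[0]
```

Running this interactively makes it easy to confirm the pattern before wiring it into parse().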

Running the spider

E:\ScrapyDemo\xiaohuar>scrapy crawl xiaohua
2017-10-22 22:30:11 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: xiaohuar)
2017-10-22 22:30:11 [scrapy.utils.log] INFO: Overridden settings: {'BOT_NAME': 'xiaohuar', 'SPIDER_MODULES': ['xiaohuar.spiders'], 'ROBOTSTXT_OBEY': True, 'NEWSPIDER_MODULE': 'xiaohuar.spiders'}
2017-10-22 22:30:11 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.logstats.LogStats']
2017-10-22 22:30:12 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-10-22 22:30:12 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-10-22 22:30:12 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2017-10-22 22:30:12 [scrapy.core.engine] INFO: Spider opened
2017-10-22 22:30:12 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-10-22 22:30:12 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2017-10-22 22:30:12 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/robots.txt> (referer: None)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/list-1-0.html> (referer: None)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170721/cb96f1b106b3db4a6bfcf3d2e880dea0.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170824/dcc166b0eba6a37e05424cfc29023121.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170916/7f78145b1ca162eb814fbc03ad24fbc1.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170919/2f728d0f110a21fea95ce13e0b010d06.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170819/9c3dfeef7e08cc0303ce233e4ddafa7f.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170917/715515e7fe1f1cb9fd388bbbb00467c2.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170628/f3d06ef49965aedbe18286a2f221fd9f.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170513/6121e3e90ff3ba4c9398121bda1dd582.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170516/6e295fe48c33245be858c40d37fb5ee6.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170707/f7ca636f73937e33836e765b7261f036.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170528/b352258c83776b9a2462277dec375d0c.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170527/4a7a7f1e6b69f126292b981c90110d0a.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170715/61110ba027f004fb503ff09cdee44d0c.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170520/dd21a21751e24a8f161792b66011688c.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170529/8140c4ad797ca01f5e99d09c82dd8a42.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170603/e55f77fb3aa3c7f118a46eeef5c0fbbf.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170529/e5902d4d3e40829f9a0d30f7488eab84.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170604/ec3794d0d42b538bf4461a84dac32509.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170603/c34b29f68e8f96d44c63fe29bf4a66b8.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170701/fb18711a6af87f30942d6a19f6da6b3e.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170619/e0456729d4dcbea569a1acbc6a47ab69.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.xiaohuar.com/d/file/20170626/0ab1d89f54c90df477a90aa533ceea36.jpg> (referer: http://www.xiaohuar.com/list-1-0.html)
2017-10-22 22:30:15 [scrapy.core.engine] INFO: Closing spider (finished)
2017-10-22 22:30:15 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 8785,
 'downloader/request_count': 24,
 'downloader/request_method_count/GET': 24,
 'downloader/response_bytes': 2278896,
 'downloader/response_count': 24,
 'downloader/response_status_count/200': 24,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 10, 22, 14, 30, 15, 892287),
 'log_count/DEBUG': 25,
 'log_count/INFO': 7,
 'request_depth_max': 1,
 'response_received_count': 24,
 'scheduler/dequeued': 23,
 'scheduler/dequeued/memory': 23,
 'scheduler/enqueued': 23,
 'scheduler/enqueued/memory': 23,
 'start_time': datetime.datetime(2017, 10, 22, 14, 30, 12, 698874)}
2017-10-22 22:30:15 [scrapy.core.engine] INFO: Spider closed (finished)
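The stats dump records 24 requests plus the crawl's start_time and finish_time, so the elapsed time and request rate can be recomputed from those two values with the standard library:

```python
import datetime

# start_time and finish_time exactly as dumped in the stats above
start = datetime.datetime(2017, 10, 22, 14, 30, 12, 698874)
finish = datetime.datetime(2017, 10, 22, 14, 30, 15, 892287)

# Elapsed wall-clock time of the crawl, in seconds
elapsed = (finish - start).total_seconds()

# downloader/request_count from the stats
requests = 24
rate = requests / elapsed  # roughly 7-8 requests per second
```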
發(fā)表評論 共有條評論
用戶名: 密碼:
驗證碼: 匿名發(fā)表
主站蜘蛛池模板: 博乐市| 织金县| 崇阳县| 城市| 广昌县| 辽宁省| 拜泉县| 肃宁县| 唐海县| 韩城市| 阜平县| 云龙县| 大田县| 华池县| 浮梁县| 寿光市| 随州市| 横峰县| 德化县| 衢州市| 乐陵市| 潜江市| 高州市| 漳浦县| 临清市| 五台县| 赞皇县| 武宣县| 湖北省| 咸阳市| 屏南县| 台南县| 岳西县| 景德镇市| 贡山| 嘉兴市| 勃利县| 五河县| 伊宁县| 乐平市| 马关县|