
Building a Scrapy crawler image from a Dockerfile based on Alpine

2024-09-01 13:56:12

1. Pull the alpine image

[root@DockerBrian ~]# docker pull alpine
Using default tag: latest
Trying to pull repository docker.io/library/alpine ...
latest: Pulling from docker.io/library/alpine
4fe2ade4980c: Pull complete
Digest: sha256:621c2f39f8133acb8e64023a94dbdf0d5ca81896102b9e57c0dc184cadaf5528
Status: Downloaded newer image for docker.io/alpine:latest
[root@docker43 ~]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
docker.io/alpine   latest   196d12cf6ab1   3 weeks ago   4.41 MB
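Because the `latest` tag is a moving target, rebuilding this image later may pull a different Alpine base. One hedged option is to pin the base by the digest printed in the transcript above, so rebuilds stay reproducible:

```shell
# Pull the base image by its exact digest (the one shown in the pull output above):
docker pull alpine@sha256:621c2f39f8133acb8e64023a94dbdf0d5ca81896102b9e57c0dc184cadaf5528
```

The same pinned reference can be used in the Dockerfile's FROM line (`FROM alpine@sha256:...`).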

2. Write the Dockerfile

Create a scrapy directory to hold the Dockerfile:

[root@DockerBrian ~]# mkdir /opt/alpineDockerfile/
[root@DockerBrian ~]# cd /opt/alpineDockerfile/
[root@DockerBrian alpineDockerfile]# mkdir scrapy && cd scrapy && touch Dockerfile
[root@DockerBrian alpineDockerfile]# cd scrapy/
[root@DockerBrian scrapy]# ll
total 4
-rw-r--r-- 1 root root 1394 Oct 10 11:36 Dockerfile

Write the Dockerfile (note that line continuations must use a backslash `\`, not a forward slash):

# Base image
FROM alpine

# Maintainer
MAINTAINER alpine_python3_scrapy (zhujingzhi@123.com)

# Switch to the Aliyun package mirror
RUN echo "http://mirrors.aliyun.com/alpine/latest-stable/main/" > /etc/apk/repositories && \
    echo "http://mirrors.aliyun.com/alpine/latest-stable/community/" >> /etc/apk/repositories

# Update the package index, install openssh, adjust its config, generate host keys,
# set the root password, and sync the timezone
RUN apk update && \
    apk add --no-cache openssh-server tzdata && \
    cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && \
    sed -i "s/#PermitRootLogin.*/PermitRootLogin yes/g" /etc/ssh/sshd_config && \
    ssh-keygen -t rsa -P "" -f /etc/ssh/ssh_host_rsa_key && \
    ssh-keygen -t ecdsa -P "" -f /etc/ssh/ssh_host_ecdsa_key && \
    ssh-keygen -t ed25519 -P "" -f /etc/ssh/ssh_host_ed25519_key && \
    echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

# Packages Scrapy depends on (all required)
RUN apk add --no-cache python3 python3-dev gcc openssl-dev openssl libressl libc-dev linux-headers libffi-dev libxml2-dev libxml2 libxslt-dev openssh-client openssh-sftp-server

# pip packages for the environment (add or remove to suit your needs)
RUN pip3 install --default-timeout=100 --no-cache-dir --upgrade pip setuptools pymysql pymongo redis scrapy-redis ipython Scrapy requests

# Startup script that launches sshd
RUN echo "/usr/sbin/sshd -D" >> /etc/start.sh && \
    chmod +x /etc/start.sh

# Expose the SSH port
EXPOSE 22

# Run the startup script
CMD ["/bin/sh", "/etc/start.sh"]

This produces a container that is reachable remotely over SSH, with Scrapy installed on a Python 3 environment; the start.sh script launches the SSH service.
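If SSH is only needed for interactive debugging, a lighter alternative is `docker exec`, which opens a shell in a running container without any sshd at all. A minimal sketch, assuming the container name used later in this article:

```shell
# Open an interactive shell in the running container, no SSH required:
docker exec -it scrapy10086 /bin/sh
```

The SSH approach in this article is still useful when remote hosts (rather than the Docker host itself) need to reach the container.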

3. Build the image

Build the image:

[root@DockerBrian scrapy]# docker build -t scrapy_redis_ssh:v1 . 

List the images:

[root@DockerBrian scrapy]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
scrapy_redis_ssh   v1       b2c95ef95fb9   4 hours ago   282 MB
docker.io/alpine   latest   196d12cf6ab1   4 weeks ago   4.41 MB

4. Create a container

Create the container (named scrapy10086, with SSH mapped to port 10086 on the host):

docker run -itd --restart=always --name scrapy10086 -p 10086:22 scrapy_redis_ssh:v1
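To get spider code into the running container, one option is `docker cp`; another, since sshd is running, is scp over the mapped port. The project path below is hypothetical:

```shell
# Copy a local project directory into the container (path is illustrative):
docker cp ./myproject scrapy10086:/root/myproject

# Or copy it over SSH via the mapped port:
scp -P 10086 -r ./myproject root@127.0.0.1:/root/myproject
```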


Check the container:

[root@DockerBrian scrapy]# docker ps
CONTAINER ID   IMAGE          COMMAND                  CREATED       STATUS       PORTS                   NAMES
7fb9e69d79f5   b2c95ef95fb9   "/bin/sh /etc/star..."   3 hours ago   Up 3 hours   0.0.0.0:10086->22/tcp   scrapy10086

Log in to the container:

[root@DockerBrian scrapy]# ssh root@127.0.0.1 -p 10086
The authenticity of host '[127.0.0.1]:10086 ([127.0.0.1]:10086)' can't be established.
ECDSA key fingerprint is SHA256:wC46AU6SLjHyEfQWX6d6ht9MdpGKodeMOK6/cONcpxk.
ECDSA key fingerprint is MD5:6a:b7:31:3c:63:02:ca:74:5b:d9:68:42:08:be:22:fc.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[127.0.0.1]:10086' (ECDSA) to the list of known hosts.
root@127.0.0.1's password:        # this is the password set in the Dockerfile: echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

Welcome to Alpine!

The Alpine Wiki contains a large amount of how-to guides and general
information about administrating Alpine systems.
See <http://wiki.alpinelinux.org>.

You can setup the system with the command: setup-alpine

You may change this message by editing /etc/motd.

7363738cc96a:~#
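One caveat worth noting: if the container is ever rebuilt from the image, its SSH host keys are regenerated, and the next login attempt will fail with a host-key-changed warning. Removing the stale entry for the mapped port fixes this:

```shell
# Drop the old host key recorded for the mapped port:
ssh-keygen -R "[127.0.0.1]:10086"
```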

5. Test

Create a Scrapy project as a test:

7363738cc96a:~# scrapy startproject test
New Scrapy project 'test', using template directory '/usr/lib/python3.6/site-packages/scrapy/templates/project', created in:
    /root/test

You can start your first spider with:
    cd test
    scrapy genspider example example.com
7363738cc96a:~# cd test/
7363738cc96a:~/test# ls
scrapy.cfg  test
7363738cc96a:~/test# cd test/
7363738cc96a:~/test/test# ls
__init__.py     __pycache__     items.py        middlewares.py  pipelines.py    settings.py     spiders
7363738cc96a:~/test/test#
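From here the usual Scrapy workflow applies inside the container. A quick follow-up sketch, using the example spider that Scrapy itself suggests in the output above (the crawl requires outbound network access from the container):

```shell
# Generate and run the example spider inside the container:
cd /root/test
scrapy genspider example example.com
scrapy crawl example
```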

The test succeeded.

That is all for this article. We hope it helps with your learning, and we hope you will continue to support VEVB.

