Obtaining proxy IPs
A paid proxy IP service is better if you have one; if not, you can use the Docker image I built:
docker run -p 8765:8765 -d anjia0532/ipproxy-dockerfile
Wait 2-5 minutes, then visit http://${docker ip}:8765/ . If the page returns entries, the proxy IP scraping succeeded.
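The same check can be done from code. A minimal sketch, assuming the IPProxyPool-style API that returns a JSON array of [ip, port, score] triples (the addresses below are made up):

```python
import json

# Hypothetical response body in the IPProxyPool format:
# a JSON array of [ip, port, score] triples.
sample = '[["1.2.3.4", 8080, 100], ["5.6.7.8", 3128, 95]]'

def to_proxy_urls(raw):
    """Turn the pool's [ip, port, score] triples into http://ip:port URLs."""
    return ["http://{}:{}".format(ip, port)
            for ip, port, _score in json.loads(raw)]

print(to_proxy_urls(sample))  # ['http://1.2.3.4:8080', 'http://5.6.7.8:3128']
```

In practice you would fetch the JSON from http://${docker ip}:8765/ instead of the hard-coded sample.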
scrapy-proxies-tool
Installation
pip install scrapy-proxies-tool
Configuration
Modify your Scrapy settings.py. Note that the upstream repo only supports reading proxy IPs from a file.
# Retry many times since proxies often fail
RETRY_TIMES = 10
# Retry on most error codes since proxies fail for different reasons
RETRY_HTTP_CODES = [500, 503, 504, 400, 403, 404, 408]

DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90,
    'scrapy_proxies.RandomProxy': 100,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
}

PROXY_SETTINGS = {
    # Proxy list file containing entries like
    #   http://host1:port
    #   http://username:password@host2:port
    # If 'from_proxies_server' is True, 'list' holds pool server addresses
    # instead (see https://github.com/qiyeboy/IPProxyPool and
    # https://github.com/awolfly9/IPProxyTool ), e.g.
    #   'list': ['http://localhost:8765?protocol=0'],
    # Only HTTP is supported
    # (see https://github.com/qiyeboy/IPProxyPool#%E5%8F%82%E6%95%B0 )
    'list': ['/path/to/proxy/list.txt'],
    # Disable proxy settings and use the real IP when all proxies are unusable
    'use_real_when_empty': False,
    'from_proxies_server': False,
    # If proxy mode is 2, uncomment this line:
    # 'custom_proxy': "http://host1:port",
    # Proxy mode:
    #   0 = every request gets a different proxy
    #   1 = take one proxy from the list and assign it to every request
    #   2 = use the custom proxy set in 'custom_proxy'
    'mode': 0,
}
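If you run the IPProxyPool container from the first section, the file path can be swapped for the pool's HTTP endpoint. A sketch of the relevant settings, assuming the container is reachable on localhost and that protocol=0 requests HTTP proxies (per the IPProxyPool README):

```python
PROXY_SETTINGS = {
    # Point 'list' at the pool server instead of a file on disk.
    'list': ['http://localhost:8765?protocol=0'],
    'use_real_when_empty': False,
    # Tell scrapy-proxies-tool to fetch the list from the server above.
    'from_proxies_server': True,
    'mode': 0,
}
```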
The scraped proxy IP pool must be written to the configured file, one http://ip:port entry per line. (The original article does not mention this; see the scrapy-proxies-tool repo README.)
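Producing that file is a one-liner per proxy. A small sketch (the path and addresses are examples):

```python
def write_proxy_list(proxies, path):
    """Write proxies in the one-URL-per-line format scrapy-proxies-tool reads."""
    with open(path, "w") as f:
        for proxy in proxies:
            f.write(proxy + "\n")

# Hypothetical addresses; in practice these come from your proxy pool.
write_proxy_list(["http://1.2.3.4:8080", "http://5.6.7.8:3128"],
                 "/tmp/proxy_list.txt")
```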
You can verify that the proxy is taking effect by crawling myip.ipip.net and checking which IP it reports.
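Outside of Scrapy, the same check can be done with the standard library. A minimal sketch, assuming the echo page embeds the caller's IPv4 address somewhere in its text (the proxy address below is made up):

```python
import re
import urllib.request

def extract_ip(page_text):
    """Pull the first IPv4-looking address out of the echo page's text."""
    match = re.search(r"\d{1,3}(?:\.\d{1,3}){3}", page_text)
    return match.group(0) if match else None

def check_proxy(proxy_url, echo_url="http://myip.ipip.net/"):
    """Fetch the IP-echo page through the given proxy; return the reported IP."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_url}))
    with opener.open(echo_url, timeout=10) as resp:
        return extract_ip(resp.read().decode("utf-8", "replace"))

if __name__ == "__main__":
    # Hypothetical proxy; replace with an entry from your pool.
    print(check_proxy("http://1.2.3.4:8080"))
```

If the printed IP matches the proxy rather than your real address, the proxy is working.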
References
Author: 赵安家
Link: https://juejin.im/post/5ca35f616fb9a05e3c698677
Source: 掘金 (Juejin)
Slightly abridged; see the original article for the referenced code repositories.