Piaoquan Crawler Platform


Crawler Scheduling System

Startup

  1. cd ./piaoquan_crawler
  2. sh ./main/scheduling_main.sh ${crawler_dir} ${log_type} ${crawler} ${env} >>${nohup_dir} 2>&1 &

    Parameters
    ${crawler_dir}:     path of the crawler entry script, e.g. scheduling/scheduling_main/run_write_task.py
    ${log_type}:        log-name prefix, e.g. scheduling-task writes 2023-02-08-scheduling-task.log under scheduling/logs/
    ${crawler}:         which crawler, e.g. youtube / kanyikan / weixinzhishu
    ${env}:             runtime environment, production: prod / test: dev
    ${nohup_dir}:       nohup log path, e.g. scheduling/nohup-task.log
    

    Commands

    Aliyun server 102
    sh ./main/scheduling_main.sh scheduling/scheduling_main/run_write_task.py --log_type="scheduling-write" --crawler="scheduling" --env="prod" nohup-write.log 
    sh ./main/scheduling_main.sh scheduling/scheduling_main/run_scheduling_task.py --log_type="scheduling-task" --crawler="scheduling" --env="prod" nohup-task.log 
    # Read tasks and write them to Redis, once per minute
    */1 * * * * cd /data5/piaoquan_crawler && /usr/bin/sh ./main/scheduling_main.sh scheduling/scheduling_v3/run_write_task_v3.py --log_type="scheduling-write" --crawler="scheduling" --env="prod" scheduling/logs/scheduling-write.log
    # Dispatch tasks, every 5 seconds
    * * * * * for i in {1..12}; do cd /data5/piaoquan_crawler && /usr/bin/sh ./main/scheduling_main.sh scheduling/scheduling_v3/run_scheduling_task_v3.py --log_type="scheduling-task" --crawler="scheduling" --env="prod" scheduling/logs/scheduling-task.log; sleep 5; done
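    cron cannot fire more often than once per minute, so the 5-second cadence above is achieved by looping twelve times inside a single cron minute. A minimal standalone sketch of that pattern (INTERVAL and the echo are illustrative stand-ins, not part of the repo's scripts; the real job invokes scheduling_main.sh each iteration):

```shell
#!/bin/sh
# Sub-minute cron pattern: the cron line fires once per minute, then this loop
# runs the job 12 times, sleeping INTERVAL seconds between runs (12 x 5s = 60s).
INTERVAL=${INTERVAL:-5}
for i in $(seq 1 12); do
    echo "run $i"        # stand-in for: sh ./main/scheduling_main.sh ...
    sleep "$INTERVAL"
done
```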
    Hong Kong server
    sh ./main/scheduling_main.sh scheduling/scheduling_main/run_write_task.py --log_type="scheduling-write" --crawler="scheduling" --env="prod" scheduling/nohup-write.log 
    sh ./main/scheduling_main.sh scheduling/scheduling_main/run_scheduling_task.py --log_type="scheduling-task" --crawler="scheduling" --env="prod" scheduling/nohup-task.log 
    
    Local debugging
    # Read tasks and write them to Redis
    sh ./main/scheduling_main.sh scheduling/scheduling_v3/run_write_task_v3.py --log_type="scheduling-write" --crawler="scheduling" --env="dev"  scheduling/logs/scheduling-write.log 
    # Dispatch tasks
    sh ./main/scheduling_main.sh scheduling/scheduling_v3/run_scheduling_task_v3.py --log_type="scheduling-task" --crawler="scheduling" --env="dev"  scheduling/logs/scheduling-task.log 
    
    Kill processes
    ps aux | grep scheduling
    ps aux | grep scheduling | grep -v grep | awk '{print $2}' | xargs kill -9
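    The kill pipeline above works the same way everywhere in this README: grep selects matching ps lines, `grep -v grep` drops the grep process itself, and awk extracts the PID column for `xargs kill -9`. A sketch against canned, illustrative ps output:

```shell
#!/bin/sh
# Anatomy of the kill pipeline, demonstrated on canned ps-style output.
# The sample lines below are illustrative, not real processes.
ps_sample='root  1234  0.0  0.1 python3 scheduling/run_write_task.py
root  5678  0.0  0.1 python3 scheduling/run_scheduling_task.py
root  9999  0.0  0.0 grep scheduling'
echo "$ps_sample" | grep scheduling | grep -v grep | awk '{print $2}'
# prints 1234 and 5678; in production the PID list is piped on to: | xargs kill -9
```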
    

Crawler Platform

Startup

  1. cd ./piaoquan_crawler
  2. sh ./main/main.sh ${crawler_dir} ${log_type} ${crawler} ${strategy} ${oss_endpoint} ${env} ${machine} ${nohup_dir}

    Parameters
    ${crawler_dir}:     path of the crawler entry script, e.g. ./youtube/youtube_main/run_youtube_follow.py
    ${log_type}:        log-name prefix, e.g. follow writes 2023-02-08-follow.log under youtube/logs/
    ${crawler}:         which crawler, e.g. youtube / kanyikan / weixinzhishu
    ${strategy}:        crawl strategy, e.g. 定向爬虫策略 (targeted) / 小时榜爬虫策略 (hourly ranking) / 热榜爬虫策略 (hot ranking)
    # ${oss_endpoint}:    OSS endpoint, internal: inner / external: out / Hong Kong: hk
    ${env}:             runtime environment, production: prod / test: dev
    ${machine}:         machine the crawler runs on: aliyun_hk / aliyun / macpro / macair / local
    ${nohup_dir}:       nohup log path, e.g. ./youtube/nohup.log
    

    YouTube

    sh ./main/main.sh ./youtube/youtube_main/run_youtube_follow.py --log_type="follow" --crawler="youtube" --strategy="定向爬虫策略" --oss_endpoint="hk" --env="prod" --machine="aliyun_hk" youtube/nohup.log
    # sh ./main/main.sh ./youtube/youtube_main/run_youtube_follow.py --log_type="follow" --crawler="youtube" --strategy="定向爬虫策略" --env="prod" --machine="aliyun_hk" youtube/nohup.log
    Kill YouTube processes: 
    ps aux | grep run_youtube
    ps aux | grep run_youtube | grep -v grep | awk '{print $2}' | xargs kill -9
    

WeChat Index (weixinzhishu)

# WeChat Index, Mac Air
00 11 * * * cd ~ && source ./base_profile && ps aux | grep weixinzhishu | grep -v grep | awk '{print $2}' | xargs kill -9 && cd /Users/piaoquan/Desktop/piaoquan_crawler && nohup python3 -u weixinzhishu/weixinzhishu_key/search_key_mac.py >> weixinzhishu/logs/nohup-search-key.log 2>&1 &

Fetch off-site titles (crontab job, runs daily at 12:00)
00 12 * * * cd /data5/piaoquan_crawler/ && /root/anaconda3/bin/python weixinzhishu/weixinzhishu_main/run_weixinzhishu_hot_search.py >>weixinzhishu/logs/nohup-hot-search.log 2>&1 &
Fetch the WeChat Index of off-site hot terms (crontab job, runs daily at 12:30)
30 12 * * * cd /data5/piaoquan_crawler/ && /root/anaconda3/bin/python weixinzhishu/weixinzhishu_main/run_weixinzhishu_today_score.py >>weixinzhishu/logs/today-score.log 2>&1 &
Fetch the WeChat Index (crontab job, runs daily at 08:00 and 20:00)
00 08,20 * * * cd /data5/piaoquan_crawler/ && /root/anaconda3/bin/python weixinzhishu/weixinzhishu_main/run_weixinzhishu_score.py >>weixinzhishu/logs/nohup-score.log 2>&1 &
nohup python3 -u /data5/piaoquan_crawler/weixinzhishu/weixinzhishu_main/weixinzhishu_inner_long.py >>/data5/piaoquan_crawler/weixinzhishu/logs/nohup_inner_long.log 2>&1 &
nohup python3 -u /data5/piaoquan_crawler/weixinzhishu/weixinzhishu_main/weixinzhishu_out.py >>/data5/piaoquan_crawler/weixinzhishu/logs/nohup_out.log 2>&1 &
nohup python3 -u /data5/piaoquan_crawler/weixinzhishu/weixinzhishu_main/weixinzhishu_inner_sort.py >>/data5/piaoquan_crawler/weixinzhishu/logs/nohup_inner_sort.log 2>&1 &
Fetch wechat_key; device: Mac Air 
cd ~ && source ./base_profile && ps aux | grep weixinzhishu | grep -v grep | awk '{print $2}' | xargs kill -9 && cd /Users/piaoquan/Desktop/piaoquan_crawler && nohup python3 -u weixinzhishu/weixinzhishu_key/search_key_mac.py >> weixinzhishu/logs/nohup-search-key.log 2>&1 &
Local debugging
Fetch today's WeChat Index
python3 /Users/wangkun/Desktop/crawler/piaoquan_crawler/weixinzhishu/weixinzhishu_main/run_weixinzhishu_today_score.py
Check / kill processes
ps aux | grep WeChat.app
ps aux | grep weixinzhishu
ps aux | grep weixinzhishu | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep 微信 | grep -v grep | awk '{print $2}' | xargs kill -9

Offline crawlers: ganggangdouchuan / jixiangxingfu / zhiqingtiantiankan / zhongmiaoyinxin / wechat_search_key / start_appium / zhufuquanzi

# Offline crawler scheduling: checks offline crawler process status every minute
* * * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/main/process_offline.sh "prod"
# Starts Appium and checks its process status
* * * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/main/start_appium.sh "recommend" "jixiangxingfu" "prod"

Local debugging
ps aux | grep Appium.app | grep -v grep | awk '{print $2}' | xargs kill -9 && nohup /opt/homebrew/bin/node /Applications/Appium.app/Contents/Resources/app/node_modules/appium/build/lib/main.js >>/Users/wangkun/Desktop/logs/nohup.log 2>&1 &
sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_offline.sh "dev"
cd /Users/piaoquan/Desktop/piaoquan_crawler/ && nohup python3 -u weixinzhishu/weixinzhishu_key/search_key_mac.py >> weixinzhishu/logs/nohup-search-key.log 2>&1 &
Kill leftover processes
ps aux | grep run_ganggangdouchuan | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep run_jixiangxingfu | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep run_zhongmiaoyinxin | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep run_zhiqingtiantiankan | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep Appium.app | grep -v grep | awk '{print $2}' | xargs kill -9

WeChat Channels (shipinhao)

Production
00 00 * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/shipinhao/shipinhao_main/run_shipinhao.sh shipinhao/shipinhao_main/run_shipinhao_search.py --log_type="search" --crawler="shipinhao" --env="prod"
Local debugging
sh shipinhao/shipinhao_main/run_shipinhao.sh shipinhao/shipinhao_main/run_shipinhao_search.py --log_type="search" --crawler="shipinhao" --env="dev"
Check / kill processes
ps aux | grep shipinhao_search
ps aux | grep shipinhao_search | grep -v grep | awk '{print $2}' | xargs kill -9

Server 207: CPU/memory monitoring

Production
* * * * * /usr/bin/sh /root/piaoquan_crawler/monitor/monitor_main/run_monitor.sh monitor/monitor_main/run_cpu_memory.py "cpumemory" "monitor" "prod"
Local debugging
sh monitor/monitor_main/run_monitor.sh monitor/monitor_main/run_cpu_memory.py "cpumemory" "monitor" "dev"
Kill processes
ps aux | grep run_monitor | grep -v grep | awk '{print $2}' | xargs kill -9

Process daemon for crawlers using MQ: main/process_mq.sh

Local debugging
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "ssnnyfq" "suisuiniannianyingfuqi" "recommend" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "gzh1" "gongzhonghao" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "gzh2" "gongzhonghao" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "recommend" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "search" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "bszf" "benshanzhufu" "recommend" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "ks" "kuaishou" "recommend" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "ks" "kuaishou" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "dy" "douyin" "recommend" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "dy" "douyin" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "play" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "hour" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "author" "dev"
/bin/sh /Users/wangkun/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "kykjk" "kanyikan" "recommend" "dev"


Server 207: MQ crawler process daemons
# suisuiniannianyingfuqi (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "ssnnyfq" "suisuiniannianyingfuqi" "recommend" "prod"
# gongzhonghao (_luo): the number of concurrent worker processes is computed automatically from the target-user count; every 100 target users occupies one process
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "gzh" "gongzhonghao" "author" "prod"
# Xigua author
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "author" "prod"
# Xigua search
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "search" "prod"
# benshanzhufu
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "bszf" "benshanzhufu" "recommend" "prod"
# Kuaishou recommend
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "ks" "kuaishou" "recommend" "prod"
# Kuaishou author
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "ks" "kuaishou" "author" "prod"
# Douyin recommend
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "dy" "douyin" "recommend" "prod"
# Douyin author
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "dy" "douyin" "author" "prod"
# xiaoniangao play ranking (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "play" "prod"
# xiaoniangao hourly rising ranking (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "hour" "prod"
# xiaoniangao author (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "xng" "xiaoniangao" "author" "prod"
# kanyikan recommend 1
* * * * * /bin/sh /Users/lieyunye/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "kyk" "kanyikan" "recommend" "prod"
# kanyikan recommend, health category
* * * * * /bin/sh /Users/kanyikan/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "kykjk" "kanyikan" "recommend" "prod"
# Xigua recommend 1 (_luo)
* * * * * /bin/sh /Users/kanyikan/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "xg" "xigua" "recommend" "prod"
# Xigua recommend, livelihood category (_luo)
* * * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/main/process_mq.sh "xgms" "xigua" "recommend" "prod"
# Start Appium
* * * * * /bin/sh /Users/lieyunye/Desktop/crawler/piaoquan_crawler/main/start_appium.sh "recommend" "shipinhao" "prod"
# shipinhao recommend
* * * * * /bin/sh /Users/lieyunye/Desktop/crawler/piaoquan_crawler/main/process_mq.sh "sph" "shipinhao" "recommend" "prod"
# shipinhao search
* * * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/main/process_mq.sh "sph" "shipinhao" "search" "prod"
# zhufuquanzi
* * * * * /bin/sh /Users/piaoquan/Desktop/piaoquan_crawler/main/process_mq.sh "zfqz" "zhufuquanzi" "recommend" "prod"
# zhufushenghuo
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "zfsh" "zhufushenghuo" "recommend" "prod"
# fuqiwang
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "fqw" "fuqiwang" "recommend" "prod"
# ganggangdouchuan (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "ggdc" "ganggangdouchuan" "recommend" "prod"
ps aux | grep ganggangdouchuan | grep -v grep | awk '{print $2}' | xargs kill -9
# huanhuanxixizhufudao (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "hhxxzfd" "huanhuanxixizhufudao" "recommend" "prod"
ps aux | grep huanhuanxixizhufudao | grep -v grep | awk '{print $2}' | xargs kill -9
# haitunzhufu (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "htzf" "haitunzhufu" "recommend" "prod"
ps aux | grep haitunzhufu | grep -v grep | awk '{print $2}' | xargs kill -9
# zhonglaonianyule (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "zlnyl" "zhonglaonianyule" "recommend" "prod"
ps aux | grep zhonglaonianyule | grep -v grep | awk '{print $2}' | xargs kill -9
# huahaoyueyuanzhonglaonian (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "hhyyzln" "huahaoyueyuanzhonglaonian" "recommend" "prod"
ps aux | grep huahaoyueyuanzhonglaonian | grep -v grep | awk '{print $2}' | xargs kill -9
# zhufuzhonglaonianrenruyijixiang (_luo)
* * * * * /usr/bin/sh /root/piaoquan_crawler/main/process_mq.sh "zfzlnrryjx" "zhufuzhonglaonianrenruyijixiang" "recommend" "prod"
ps aux | grep zhufuzhonglaonianrenruyijixiang | grep -v grep | awk '{print $2}' | xargs kill -9
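The gongzhonghao entry above starts one worker process per 100 target users. Assuming plain ceiling division (USERS and BATCH are illustrative names, not variables read by process_mq.sh), the process count works out as:

```shell
#!/bin/sh
# Hypothetical sketch of the "one process per 100 target users" rule:
# ceiling division of the user count by the batch size.
USERS=${USERS:-250}                        # illustrative target-user count
BATCH=100
PROCS=$(( (USERS + BATCH - 1) / BATCH ))   # ceil(250 / 100) = 3
echo "$PROCS"                              # prints 3 for 250 target users
```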


Kill processes
ps aux | grep suisuiniannianyingfuqi | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep benshanzhufu | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep gongzhonghao | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep xigua | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep kuaishou | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep douyin | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep xiaoniangao | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep zhufushenghuo | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep fuqiwang | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep kanyikan | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep Appium | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep shipinhao | grep -v grep | awk '{print $2}' | xargs kill -9
ps aux | grep Python | grep -v grep | awk '{print $2}' | xargs kill -9

Generate requirements.txt

cd ./piaoquan_crawler && pipreqs ./ --force

# pip3 install Appium-Python-Client
Appium_Python_Client==2.10.1
# requires a proxy; pip3 install git+https://github.com/pyatom/pyatom/
atomac==1.2.0
# pip3 install ffmpeg-python
ffmpeg==1.4
# pip3 install loguru
loguru==0.6.0
# pip3 install lxml
lxml==4.9.1
# pip3 install mq_http_sdk; SDK v1.0.0 requires Python >= 2.5 and < 3.0, while SDK versions newer than v1.0.0 require Python 2.5 or later
mq_http_sdk==1.0.3
# sudo pip3 install oss2
oss2==2.15.0
# pip3 install psutil
psutil==5.9.2
# pip3 install PyExecJS
PyExecJS==1.5.1
# pip3 install PyMysql
PyMySQL==1.0.2
# pip3 install redis
redis==4.5.1
# pip3 install requests
requests==2.27.1
# pip3 install selenium==4.2.0
selenium==4.9.1
# pip3 install urllib3
urllib3==1.26.9
# pip3 install jieba
jieba==0.42.1
# pip3 install workalendar
workalendar==17.0.0
# pip3 install aliyun_python_sdk
# pip3 install -U aliyun-log-python-sdk
aliyun_python_sdk==2.2.0
# pip3 install opencv-python / pip3 install opencv-contrib-python
opencv-python~=4.8.0.74
# pip3 install scikit-learn
scikit-learn~=1.3.0
# pip3 install beautifulsoup4