Celery on Windows 11: Redis Setup, Asynchronous Tasks, and Troubleshooting
Contents

I. Theory
  1. Celery
II. Experiment
  1. Installing Redis on Windows 11
  2. Configuring Celery in a Python 3.8 environment
III. Problems
  1. The celery command fails
  2. Running the celery command fails
  3. ValueError when starting Celery on Windows 11

I. Theory

1. Celery

(1) Concept

Celery is a distributed system written in Python: a simple, flexible, and reliable asynchronous task queue focused on real-time processing of large volumes of messages. It also supports task scheduling.

(2) Architecture

Celery's architecture has three parts: a message broker, task execution units (workers), and a task result store.

1) Message broker: Celery provides no messaging service of its own, but integrates easily with third-party brokers such as RabbitMQ and Redis.
2) Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.
3) Task result store: holds the results of tasks executed by workers; Celery supports several storage backends, including AMQP and Redis.

(3) Features

1) Simple: Celery is easy to use and maintain, and needs no configuration files.
2) Highly available: if a task fails, or the connection drops during execution, Celery automatically retries it.
3) Fast: a single Celery process can handle millions of tasks per minute with sub-millisecond round-trip latency.
4) Flexible: almost every part of Celery can be extended, used on its own, or customized.

(4) Use cases

Celery is a powerful distributed task queue and asynchronous processing framework. It decouples task execution entirely from the main program, and tasks can even be dispatched to other hosts. It is typically used for asynchronous tasks (async task) and scheduled tasks (crontab).

1) Asynchronous tasks: hand time-consuming operations to Celery to run asynchronously, e.g. sending SMS/email, push notifications, audio/video processing.
2) Scheduled tasks: run something on a schedule, e.g. daily statistics.

II. Experiment

1. Installing Redis on Windows 11

(1) Download the latest Redis-x64-xxx.zip to drive D, unzip it, and rename the folder to Redis.

(2) List the directory:

```
D:\Redis> dir
```

(3) Open a cmd window, cd to D:\Redis, and start the server:

```
redis-server.exe redis.windows.conf
```

(4) Add the Redis path to the system PATH environment variable.

(5) Open a second cmd window (keep the first one open; it is running the Redis server), switch to the Redis directory, and connect with the client:

```
redis-cli.exe -h 127.0.0.1 -p 6379
```

(6) Verify the connection:

```
# set a key/value pair
set firstKey 123

# read it back
get firstKey

# quit
exit
```

(7) Press Ctrl+C to stop the server started earlier.

(8) Register Redis as a Windows service. From a cmd window in the Redis install directory, run:

```
redis-server.exe --service-install redis.windows.conf --loglevel verbose
```

(9) Start the Redis service:

```
redis-server --service-start
```

(10) Redis is now registered as a Windows service.

(11) In the Services panel, set the Redis service's startup type to Automatic so it starts on boot.

2. Configuring Celery in a Python 3.8 environment

(1) Install celery and redis in PyCharm

Celery is a classic producer/consumer setup: producers put tasks on a queue, and consumers take them off and execute them. It is mostly used for asynchronous or scheduled tasks.

```
# option 1
pip install celery
pip install redis

# option 2: via the Douban mirror
pip install -i https://pypi.douban.com/simple celery
pip install -i https://pypi.douban.com/simple redis
```

(2) Create the asynchronous task file celery_task.py; this registers the Celery app:

```python
# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo',
             backend='redis://localhost:6379/1',
             broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print('向%s发送邮件...' % name)
    time.sleep(5)
    print('向%s发送邮件完成' % name)
    return 'ok'
```

(3) From the project directory, start a worker to consume tasks:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 17:26:39
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         test:0x1e6fa358550
- ** ---------- .> transport:   redis://127.0.0.1:6379/2
- ** ---------- .> results:     redis://127.0.0.1:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email

[2023-11-22 17:26:39,265: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 20:30:08,249: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 20:30:15,379: INFO/MainProcess] mingle: all alone
[2023-11-22 20:30:25,608: INFO/MainProcess] celery@node1 ready.
```

(4) Ctrl+C to stop the worker.

(5) Modify celery_task.py to add a second task:

```python
# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo',
             backend='redis://localhost:6379/1',
             broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print('向%s发送邮件...' % name)
    time.sleep(5)
    print('向%s发送邮件完成' % name)
    return 'ok'

@app.task
def send_msg(name):
    print('向%s发送短信...' % name)
    time.sleep(5)
    # note: the original code prints "邮件" (email) here as well, which is
    # why the worker log later shows 向mao发送邮件完成 for the SMS task
    print('向%s发送邮件完成' % name)
    return 'ok'
```
(6) Start the worker again from the project directory:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:01:43
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         demo:0x29cea446250
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

[2023-11-22 21:01:43,381: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-23] child process 23988 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-17] child process 16184 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-21] child process 22444 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-27] child process 29480 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-24] child process 5844 calling self.run()
[2023-11-22 21:01:43,631: INFO/SpawnPoolWorker-25] child process 8896 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-29] child process 28068 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-28] child process 18952 calling self.run()
[2023-11-22 21:01:43,636: INFO/SpawnPoolWorker-26] child process 13680 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-31] child process 25472 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-30] child process 28688 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-32] child process 10072 calling self.run()
[2023-11-22 21:01:45,401: INFO/MainProcess] Connected to redis://localhost:6379/2
[2023-11-22 21:01:45,401: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:01:49,477: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 21:01:56,607: INFO/MainProcess] mingle: all alone
[2023-11-22 21:02:04,753: INFO/MainProcess] celery@node1 ready.
```

(7) Ctrl+C to stop, then create the producer file produce_task.py:

```python
# -*- coding: utf-8 -*-
from celerypro.celery_task import send_email, send_msg

result = send_email.delay('david')
print(result.id)
result2 = send_msg.delay('mao')
print(result2.id)
```

(8) Run produce_task.py; it prints the two task ids.

(9) If this step errors out, install the eventlet package:

```
PS D:\soft\pythonProject> pip install eventlet
```

(10) Start the worker again from the project directory, this time with the eventlet pool:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:29:34
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         demo:0x141511962e0
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

... CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:29:48,022: INFO/MainProcess] pidbox: Connected to redis://localhost:6379/2.
[2023-11-22 21:29:52,117: INFO/MainProcess] celery@node1 ready.
```
(11) Run produce_task.py again.

(12) Two task ids are generated.

(13) Watch the tasks in the worker log:

```
[2023-11-22 21:30:35,194: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] received
[2023-11-22 21:30:35,195: WARNING/MainProcess] 向david发送邮件...
[2023-11-22 21:30:35,197: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] received
[2023-11-22 21:30:35,198: WARNING/MainProcess] 向mao发送短信...
[2023-11-22 21:30:40,210: WARNING/MainProcess] 向david发送邮件完成
[2023-11-22 21:30:40,210: WARNING/MainProcess] 向mao发送邮件完成
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] succeeded in 7.063000000001921s: 'ok'
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] succeeded in 7.063000000001921s: 'ok'
```

(14) Create result.py to inspect a task's result, using the second id (de30d70b-9110-4dfb-bcfd-45a61403357f):

```python
# -*- coding: utf-8 -*-
from celery.result import AsyncResult
from celerypro.celery_task import app

async_result = AsyncResult(id='de30d70b-9110-4dfb-bcfd-45a61403357f', app=app)

if async_result.successful():
    result = async_result.get()
    print(result)
elif async_result.failed():
    print('执行失败')
elif async_result.status == 'PENDING':
    print('任务等待中被执行')
elif async_result.status == 'RETRY':
    print('任务异常后正在重试')
elif async_result.status == 'STARTED':
    print('任务已经开始被执行')
```

(15) Run result.py.

(16) It prints ok.

III. Problems

1. The celery command fails

(1) Error: the startup command is rejected.

(2) Cause: the command syntax differs between Celery versions. Check the help output:

```
PS D:\soft\pythonProject> celery --help
Usage: celery [OPTIONS] COMMAND [ARGS]...

  Celery command entrypoint.

Options:
  -A, --app APPLICATION
  -b, --broker TEXT
  --result-backend TEXT
  --loader TEXT
  --config TEXT
  --workdir PATH
  -C, --no-color
  -q, --quiet
  --version
  --skip-checks          Skip Django core checks on startup. Setting the
                         SKIP_CHECKS environment variable to any non-empty
                         string will have the same effect.
  --help                 Show this message and exit.

Commands:
  amqp     AMQP Administration Shell.
  beat     Start the beat periodic task scheduler.
  call     Call a task by name.
  control  Workers remote control.
  events   Event-stream utilities.
  graph    The celery graph command.
  inspect  Inspect the worker at runtime.
  list     Get info from broker.
  logtool  The celery logtool command.
  migrate  Migrate tasks from one broker to another.
  multi    Start multiple worker instances.
  purge    Erase all messages from all known task queues.
  report   Shows information useful to include in bug-reports.
  result   Print the return value for a given task id.
  shell    Start shell session with convenient access to celery symbols.
  status   Show list of workers that are online.
  upgrade  Perform upgrade between versions.
  worker   Start worker instance.

PS D:\soft\pythonProject> celery worker --help
Usage: celery worker [OPTIONS]

  Start worker instance.

  Examples
  --------

  $ celery --app=proj worker -l INFO
  $ celery -A proj worker -l INFO -Q hipri,lopri
  $ celery -A proj worker --concurrency=4
  $ celery -A proj worker --concurrency=1000 -P eventlet
  $ celery worker --autoscale=10,0

Worker Options:
  -n, --hostname HOSTNAME         Set custom hostname (e.g., 'w1@%%h').
                                  Expands: %%h (hostname), %%n (name) and
                                  %%d, (domain).
  -D, --detach                    Start worker as a background process.
  -S, --statedb PATH              Path to the state database. The extension
                                  '.db' may be appended to the filename.
  -l, --loglevel [DEBUG|INFO|WARNING|ERROR|CRITICAL|FATAL]
                                  Logging level.
  -O, --optimization [default|fair]
                                  Apply optimization profile.
  --prefetch-multiplier <prefetch multiplier>
                                  Set custom prefetch multiplier value for
                                  this worker instance.

Pool Options:
  -c, --concurrency <concurrency>
                                  Number of child processes processing the
                                  queue. The default is the number of CPUs
                                  available on your system.
  -P, --pool [prefork|eventlet|gevent|solo|processes|threads|custom]
                                  Pool implementation.
  -E, --task-events, --events     Send task-related events that can be
                                  captured by monitors like celery events,
                                  celerymon, and others.
  --time-limit FLOAT              Enables a hard time limit (in seconds
                                  int/float) for tasks.
  --soft-time-limit FLOAT         Enables a soft time limit (in seconds
                                  int/float) for tasks.
  --max-tasks-per-child INTEGER   Maximum number of tasks a pool worker can
                                  execute before it's terminated and replaced
                                  by a new worker.
  --max-memory-per-child INTEGER  Maximum amount of resident memory, in KiB,
                                  that may be consumed by a child process
                                  before it will be replaced by a new one. If
                                  a single task causes a child process to
                                  exceed this limit, the task will be
                                  completed and the child process will be
                                  replaced afterwards. Default: no limit.
  --scheduler TEXT

Daemonization Options:
  -f, --logfile TEXT  Log destination; defaults to stderr
  --pidfile TEXT
  --uid TEXT
  --gid TEXT
  --umask TEXT
  --executable TEXT

Options:
  --help  Show this message and exit.
```

(3) Fix: use the new-style command, which succeeds:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO
```

2. Running the celery command fails

(1) Error:

```
AttributeError: 'NoneType' object has no attribute 'Redis'
```

(2) Cause: the redis package is not installed in the PyCharm interpreter's environment.

(3) Fix: install the redis package (pip install redis).

3. ValueError when starting Celery on Windows 11

(1) Error: after starting the service on Windows with

```
celery --app=celerypro.celery_task worker -n node1 -l INFO
```

the worker starts normally, but calling a task with delay() raises:

```
Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
```

(2) Cause: eventlet is not installed; the default prefork pool does not work reliably on Windows.

(3) Fix: install eventlet and start the worker with the eventlet pool:

```
pip install eventlet
celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet
```