20230911.log 34 KB

  1. 2023-09-11 11:05:39 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  2. 2023-09-11 11:06:13 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  3. Traceback (most recent call last):
  4. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  5. result = context.run(
  6. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  7. return g.throw(self.type, self.value, self.tb)
  8. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  9. return (yield download_func(request=request, spider=spider))
  10. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  11. current.result = callback( # type: ignore[misc]
  12. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  13. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  14. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  15. 2023-09-11 11:07:21 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  16. 2023-09-11 11:07:21 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  17. 2023-09-11 11:07:59 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  18. Traceback (most recent call last):
  19. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  20. return (yield download_func(request=request, spider=spider))
  21. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  22. 2023-09-11 11:07:59 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  23. Traceback (most recent call last):
  24. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  25. return (yield download_func(request=request, spider=spider))
  26. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  27. 2023-09-11 11:10:23 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  28. 2023-09-11 11:10:23 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  29. 2023-09-11 11:10:48 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  30. 2023-09-11 11:11:01 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  31. Traceback (most recent call last):
  32. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  33. result = context.run(
  34. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  35. return g.throw(self.type, self.value, self.tb)
  36. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  37. return (yield download_func(request=request, spider=spider))
  38. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  39. current.result = callback( # type: ignore[misc]
  40. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  41. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  42. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  43. 2023-09-11 11:11:01 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  44. Traceback (most recent call last):
  45. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  46. result = context.run(
  47. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  48. return g.throw(self.type, self.value, self.tb)
  49. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  50. return (yield download_func(request=request, spider=spider))
  51. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  52. current.result = callback( # type: ignore[misc]
  53. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  54. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  55. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  56. 2023-09-11 11:11:31 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  57. Traceback (most recent call last):
  58. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  59. result = context.run(
  60. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  61. return g.throw(self.type, self.value, self.tb)
  62. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  63. return (yield download_func(request=request, spider=spider))
  64. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  65. current.result = callback( # type: ignore[misc]
  66. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  67. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  68. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  69. 2023-09-11 11:13:10 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  70. 2023-09-11 11:13:10 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  71. 2023-09-11 11:13:44 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  72. 2023-09-11 11:13:44 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  73. Traceback (most recent call last):
  74. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  75. return (yield download_func(request=request, spider=spider))
  76. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  77. 2023-09-11 11:13:44 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  78. Traceback (most recent call last):
  79. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  80. return (yield download_func(request=request, spider=spider))
  81. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  82. 2023-09-11 11:14:35 [scrapy.core.scraper] ERROR: Spider error processing <GET https://mp.weixin.qq.com/cgi-bin/appmsg> (referer: https://mp.weixin.qq.com/cgi-bin/searchbiz)
  83. Traceback (most recent call last):
  84. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/utils/defer.py", line 260, in iter_errback
  85. yield next(it)
  86. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/utils/python.py", line 336, in __next__
  87. return next(self.data)
  88. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/utils/python.py", line 336, in __next__
  89. return next(self.data)
  90. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
  91. for r in iterable:
  92. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
  93. return (r for r in result or () if self._filter(r, spider))
  94. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
  95. for r in iterable:
  96. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
  97. return (self._set_referer(r, response) for r in result or ())
  98. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
  99. for r in iterable:
  100. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
  101. return (r for r in result or () if self._filter(r, spider))
  102. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
  103. for r in iterable:
  104. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
  105. return (r for r in result or () if self._filter(r, response, spider))
  106. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
  107. for r in iterable:
  108. File "/Users/luojunhui/cyber/gzh_spider/gzh_spider/spiders/gzh_author.py", line 150, in parse_video
  109. item['video_url'] = functions.find_video_url(article_url)
  110. File "/Users/luojunhui/cyber/gzh_spider/gzh_spider/functions/get_video_url.py", line 54, in find_video_url
  111. video_url = get_tencent_video_url(video_id)
  112. File "/Users/luojunhui/cyber/gzh_spider/gzh_spider/functions/get_video_url.py", line 72, in get_tencent_video_url
  113. url = response["vl"]["vi"][0]["ul"]["ui"][0]["url"]
  114. KeyError: 'vl'
  115. 2023-09-11 11:14:35 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  116. Traceback (most recent call last):
  117. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  118. return (yield download_func(request=request, spider=spider))
  119. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  120. 2023-09-11 11:15:46 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  121. 2023-09-11 11:16:26 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  122. 2023-09-11 11:16:26 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  123. Traceback (most recent call last):
  124. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  125. return (yield download_func(request=request, spider=spider))
  126. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  127. 2023-09-11 11:16:46 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  128. Traceback (most recent call last):
  129. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  130. return (yield download_func(request=request, spider=spider))
  131. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  132. 2023-09-11 11:21:00 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  133. 2023-09-11 11:21:00 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  134. 2023-09-11 11:21:27 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  135. Traceback (most recent call last):
  136. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  137. result = context.run(
  138. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  139. return g.throw(self.type, self.value, self.tb)
  140. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  141. return (yield download_func(request=request, spider=spider))
  142. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  143. current.result = callback( # type: ignore[misc]
  144. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  145. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  146. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  147. 2023-09-11 11:21:27 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  148. Traceback (most recent call last):
  149. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  150. result = context.run(
  151. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  152. return g.throw(self.type, self.value, self.tb)
  153. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  154. return (yield download_func(request=request, spider=spider))
  155. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  156. current.result = callback( # type: ignore[misc]
  157. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  158. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  159. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  160. 2023-09-11 11:23:52 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  161. 2023-09-11 11:24:21 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  162. Traceback (most recent call last):
  163. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  164. return (yield download_func(request=request, spider=spider))
  165. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  166. 2023-09-11 11:25:57 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  167. 2023-09-11 11:26:03 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  168. Traceback (most recent call last):
  169. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  170. return (yield download_func(request=request, spider=spider))
  171. twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
  172. 2023-09-11 11:44:19 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  173. 2023-09-11 11:44:19 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  174. 2023-09-11 11:44:19 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  175. 2023-09-11 11:44:51 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  176. Traceback (most recent call last):
  177. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  178. result = context.run(
  179. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  180. return g.throw(self.type, self.value, self.tb)
  181. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  182. return (yield download_func(request=request, spider=spider))
  183. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  184. current.result = callback( # type: ignore[misc]
  185. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  186. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  187. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  188. 2023-09-11 11:44:51 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  189. Traceback (most recent call last):
  190. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  191. result = context.run(
  192. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  193. return g.throw(self.type, self.value, self.tb)
  194. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  195. return (yield download_func(request=request, spider=spider))
  196. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  197. current.result = callback( # type: ignore[misc]
  198. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  199. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  200. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  201. 2023-09-11 11:44:51 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  202. Traceback (most recent call last):
  203. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  204. result = context.run(
  205. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  206. return g.throw(self.type, self.value, self.tb)
  207. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  208. return (yield download_func(request=request, spider=spider))
  209. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  210. current.result = callback( # type: ignore[misc]
  211. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  212. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  213. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  214. 2023-09-11 11:50:31 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  215. 2023-09-11 11:50:31 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  216. 2023-09-11 11:51:07 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  217. Traceback (most recent call last):
  218. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  219. result = context.run(
  220. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  221. return g.throw(self.type, self.value, self.tb)
  222. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  223. return (yield download_func(request=request, spider=spider))
  224. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  225. current.result = callback( # type: ignore[misc]
  226. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  227. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  228. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  229. 2023-09-11 11:51:07 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  230. Traceback (most recent call last):
  231. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  232. result = context.run(
  233. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  234. return g.throw(self.type, self.value, self.tb)
  235. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  236. return (yield download_func(request=request, spider=spider))
  237. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  238. current.result = callback( # type: ignore[misc]
  239. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  240. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  241. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  242. 2023-09-11 12:00:01 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  243. 2023-09-11 12:00:01 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  244. 2023-09-11 12:00:01 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
  245. 2023-09-11 12:00:34 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
  246. Traceback (most recent call last):
  247. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
  248. result = context.run(
  249. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
  250. return g.throw(self.type, self.value, self.tb)
  251. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
  252. return (yield download_func(request=request, spider=spider))
  253. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
  254. current.result = callback( # type: ignore[misc]
  255. File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
  256. raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
  257. twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
2023-09-11 12:00:34 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
Traceback (most recent call last):
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
    result = context.run(
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
    current.result = callback(  # type: ignore[misc]
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
    raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
2023-09-11 12:00:34 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
Traceback (most recent call last):
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
    result = context.run(
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
    current.result = callback(  # type: ignore[misc]
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/handlers/http11.py", line 397, in _cb_timeout
    raise TimeoutError(f"Getting {url} took longer than {timeout} seconds.")
twisted.internet.error.TimeoutError: User timeout caused connection failure: Getting https://mp.weixin.qq.com/cgi-bin/searchbiz took longer than 180.0 seconds..
2023-09-11 12:03:03 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
2023-09-11 12:03:03 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
2023-09-11 12:03:03 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying <GET https://mp.weixin.qq.com/cgi-bin/searchbiz> (failed 3 times): [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
2023-09-11 12:03:42 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
Traceback (most recent call last):
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
2023-09-11 12:03:42 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
Traceback (most recent call last):
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]
2023-09-11 12:03:42 [scrapy.core.scraper] ERROR: Error downloading <GET https://mp.weixin.qq.com/cgi-bin/searchbiz>
Traceback (most recent call last):
  File "/Users/luojunhui/miniconda3/envs/Spider/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
twisted.web._newclient.ResponseNeverReceived: [<twisted.python.failure.Failure twisted.internet.error.ConnectionLost: Connection to the other side was lost in a non-clean fashion: Connection lost.>]