风的引路人 posted on 2021-7-24 22:15:24

Error: Event loop is closed

My novel-scraping crawler raised an error:
import requests
import asyncio
import aiohttp
import json
import aiofiles

"""
1. Synchronous step: call getCatalog to get every chapter's cid and title.
2. Asynchronous step: call getChapterContent to download all chapter content.
"""


async def aiodownload(cid, b_id, title):
    data = {
        "book_id": b_id,
        "cid": f"P{b_id}|{cid}",
        "need_bookinfo": 1
    }
    data = json.dumps(data)
    url = f"https://dushu.baidu.com/api/pc/getChapterContent?data={data}"

    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            dic = await resp.json()

            async with aiofiles.open(f'E:\\{title}.txt', 'w', encoding='utf-8') as f:
                await f.write(dic['data']['novel']['content'])


async def getCatalog(url):
    resp = requests.get(url)
    dic = resp.json()
    tasks = []
    for item in dic['data']['novel']['items']:
        title = item['title']
        cid = item['cid']
        # queue up the async task
        tasks.append(aiodownload(cid, b_id, title))

    await asyncio.wait(tasks)


if __name__ == '__main__':
    b_id = "4306063500"
    url = 'https://dushu.baidu.com/api/pc/getCatalog?data={"book_id":"' + b_id + '"}'
    asyncio.run(getCatalog(url))


Exception ignored in: <function _ProactorBasePipeTransport.__del__ at 0x000001D4C4FC1E50>
Traceback (most recent call last):
  File "C:\Users\86184\AppData\Local\Programs\Python\Python38\lib\asyncio\proactor_events.py", line 116, in __del__
    self.close()
  File "C:\Users\86184\AppData\Local\Programs\Python\Python38\lib\asyncio\proactor_events.py", line 108, in close
    self._loop.call_soon(self._call_connection_lost, None)
  File "C:\Users\86184\AppData\Local\Programs\Python\Python38\lib\asyncio\base_events.py", line 719, in call_soon
    self._check_closed()
  File "C:\Users\86184\AppData\Local\Programs\Python\Python38\lib\asyncio\base_events.py", line 508, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
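For context: on Windows with Python 3.8+, the default ProactorEventLoop is the usual culprit behind this traceback. asyncio.run() closes the loop on exit, and aiohttp's transports are only garbage-collected afterwards, so their __del__ tries to schedule cleanup on an already-closed loop. A minimal sketch of one common workaround, switching to the selector event loop policy before starting (the platform guard makes it a no-op elsewhere):

```python
import asyncio
import sys

# With the default ProactorEventLoop on Windows, aiohttp connection
# teardown can run after asyncio.run() has closed the loop, producing
# the "Event loop is closed" noise. The selector loop avoids that race.
if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    await asyncio.sleep(0)  # placeholder for the real aiohttp work
    return "done"

print(asyncio.run(main()))
```

Note this only silences the teardown race; the downloads themselves had already completed before the traceback was printed.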

大马强 posted on 2021-7-25 18:11:02

for item in dic['data']['novel']['items']:
    title = item['title']
    cid = item['cid']
    # queue up the async task
    tasks.append(asyncio.create_task(aiodownload(cid, b_id, title)))
Try this.
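This fix hinges on wrapping each coroutine in a Task via asyncio.create_task before handing them to asyncio.wait (passing bare coroutines to wait() is deprecated since Python 3.8 and rejected in 3.11). A self-contained sketch of the pattern, with a dummy fetch standing in for aiodownload:

```python
import asyncio

async def fetch(n):
    # stand-in for aiodownload(); just yields control once
    await asyncio.sleep(0)
    return n * 2

async def main():
    # asyncio.wait() expects Task objects, so wrap each coroutine;
    # create_task() also starts it running on the current loop.
    tasks = [asyncio.create_task(fetch(n)) for n in range(3)]
    done, pending = await asyncio.wait(tasks)
    return sorted(t.result() for t in done)

print(asyncio.run(main()))  # → [0, 2, 4]
```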

isdkz posted on 2022-4-19 14:40:13

If a coroutine isn't driven on an event loop that the main thread waits on, the main thread may finish before your coroutines have returned;

the coroutine then has no loop left to complete on, which raises this error. Change the last line of your code, asyncio.run(getCatalog(url)),

to:
    loop = asyncio.get_event_loop()
    loop.run_until_complete(getCatalog(url))

In other words, run it on an explicit event loop: run_until_complete makes the main thread wait until all the coroutines have finished.
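The pattern can be sketched with a trivial coroutine. One caveat: calling asyncio.get_event_loop() when no loop is running is deprecated from Python 3.10, so this sketch uses asyncio.new_event_loop() instead:

```python
import asyncio

async def work():
    # stand-in for getCatalog(url)
    await asyncio.sleep(0)
    return 42

# run_until_complete() drives the coroutine to completion on an explicit
# loop and, unlike asyncio.run(), does not close that loop behind your
# back -- which is why lingering transport teardown no longer hits a
# closed loop. (The loop here stays open until interpreter exit.)
loop = asyncio.new_event_loop()
result = loop.run_until_complete(work())
print(result)  # → 42
```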

So here is your code with that change applied:
import requests
import asyncio
import aiohttp
import json
import aiofiles

"""
1. Synchronous step: call getCatalog to get every chapter's cid and title.
2. Asynchronous step: call getChapterContent to download all chapter content.
"""


async def aiodownload(cid, b_id, title):
    data = {
        "book_id": b_id,
        "cid": f"P{b_id}|{cid}",
        "need_bookinfo": 1
    }
    data = json.dumps(data)
    url = f"https://dushu.baidu.com/api/pc/getChapterContent?data={data}"

    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            dic = await resp.json()

            async with aiofiles.open(f'{title}.txt', 'w', encoding='utf-8') as f:
                await f.write(dic['data']['novel']['content'])


async def getCatalog(url):
    resp = requests.get(url)
    dic = resp.json()
    tasks = []
    for item in dic['data']['novel']['items']:
        title = item['title']
        cid = item['cid']
        # queue up the async task
        tasks.append(aiodownload(cid, b_id, title))

    await asyncio.wait(tasks)


if __name__ == '__main__':
    b_id = "4306063500"
    url = 'https://dushu.baidu.com/api/pc/getCatalog?data={"book_id":"' + b_id + '"}'
    loop = asyncio.get_event_loop()
    loop.run_until_complete(getCatalog(url))
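A further improvement worth considering, separate from the loop fix (this is my suggestion, not part of the reply above): the code opens a new ClientSession per chapter, which is wasteful; sharing one session across all downloads and collecting them with asyncio.gather is cheaper. An offline sketch of the pattern, using a fake session so it runs without aiohttp or network access:

```python
import asyncio

class FakeSession:
    """Stand-in for aiohttp.ClientSession so the sketch runs offline."""
    async def get(self, cid):
        await asyncio.sleep(0)
        return f"content of {cid}"

async def download(session, cid):
    # every download shares one session instead of opening its own
    return await session.get(cid)

async def main(cids):
    session = FakeSession()
    # gather() preserves input order and propagates the first exception
    return await asyncio.gather(*(download(session, c) for c in cids))

print(asyncio.run(main(["c1", "c2"])))  # → ['content of c1', 'content of c2']
```

With real aiohttp, the session would be opened once with `async with aiohttp.ClientSession() as session:` inside main() and passed down the same way.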
