
51. Handling Timeouts and Retries

A good scraper must handle network errors gracefully.


Example

import aiohttp
import asyncio

async def fetch(url):
    # Give the whole request (connect + read) at most 5 seconds.
    timeout = aiohttp.ClientTimeout(total=5)
    try:
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.get(url) as resp:
                return await resp.text()
    except asyncio.TimeoutError:
        return "Timeout"
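The same timeout-then-fallback pattern can be exercised without a network at all using `asyncio.wait_for` from the standard library, which is handy for testing the error path. This is a sketch; `slow()` is a stand-in for a real request coroutine:

```python
import asyncio

async def slow():
    # Simulates a request that takes far too long.
    await asyncio.sleep(10)
    return "done"

async def main():
    try:
        # Cancel the coroutine if it exceeds the timeout.
        return await asyncio.wait_for(slow(), timeout=0.1)
    except asyncio.TimeoutError:
        return "Timeout"

print(asyncio.run(main()))  # prints "Timeout"
```

Catching the timeout at the call site like this keeps the slow coroutine itself free of error-handling logic.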

Retries

Many network failures are transient, so a failed request is often worth retrying rather than abandoning. A library such as tenacity lets you declare retry policies (attempt limits, exponential backoff, exception filtering) with a decorator instead of hand-writing the loop.
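If you would rather avoid an extra dependency, a hand-rolled retry loop with exponential backoff is short to write. The sketch below uses only the standard library; `fetch_once` is a hypothetical coroutine standing in for any single-attempt request function, and the flaky stub exists only to demonstrate the retry path:

```python
import asyncio
import random

async def fetch_with_retries(fetch_once, url, attempts=3, base_delay=0.5):
    """Retry a coroutine on timeout/network errors with exponential backoff."""
    for attempt in range(attempts):
        try:
            return await fetch_once(url)
        except (asyncio.TimeoutError, OSError):
            if attempt == attempts - 1:
                raise  # out of retries: propagate the last error
            # Backoff doubles each attempt (0.5s, 1s, 2s, ...) plus jitter
            # so many clients don't retry in lockstep.
            await asyncio.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))

async def main():
    calls = {"n": 0}

    async def flaky(url):
        # Hypothetical stub: times out twice, then succeeds.
        calls["n"] += 1
        if calls["n"] < 3:
            raise asyncio.TimeoutError
        return "ok"

    print(await fetch_with_retries(flaky, "https://example.com", base_delay=0.01))

asyncio.run(main())  # prints "ok" after two retried timeouts
```

Re-raising on the final attempt is a deliberate choice: the caller, not the retry helper, decides what a permanent failure means.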


Wrap-Up

Timeouts and retries make your scraper robust in real-world conditions.