How do I rotate User-Agents in my scraper?

Rotating User-Agents helps avoid detection when scraping at scale.

Why rotate:

  • Makes traffic look like multiple users
  • Reduces fingerprinting effectiveness
  • Distributes requests across different "browser profiles"
  • Harder to track and rate-limit your scraper

Simple rotation in Python:

import random
import requests

# Pool of realistic, current User-Agent strings
USER_AGENTS = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.1 Safari/605.1.15',
    'Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0'
]

def get_random_user_agent():
    return random.choice(USER_AGENTS)

# Use it in requests (url is whatever page you are scraping)
url = 'https://example.com'
headers = {'User-Agent': get_random_user_agent()}
response = requests.get(url, headers=headers)

Advanced rotation with libraries:

Using the fake-useragent library:

from fake_useragent import UserAgent

ua = UserAgent()
headers = {'User-Agent': ua.random}
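
The library draws on data about real browsers; if you need to stay within one browser family, it also exposes browser-specific attributes such as ua.chrome and ua.firefox (assuming a recent fake-useragent release):

ua = UserAgent()
headers = {'User-Agent': ua.chrome}  # a random Chrome User-Agent only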

Rotation strategies (sketched in code after this list):

  • Random: Pick a different User-Agent for each request
  • Sequential: Cycle through a list in order
  • Weighted: Favor the most common browsers (e.g. Chrome well ahead of Safari, Edge, and Firefox)
  • Session-based: Use the same User-Agent for related requests, then switch
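
A minimal sketch of the sequential, weighted, and session-based strategies, reusing the USER_AGENTS list from the first snippet (the weights are illustrative, not real market-share figures):

import itertools
import random

# Sequential: cycle through the pool in order
sequential_pool = itertools.cycle(USER_AGENTS)
ua = next(sequential_pool)

# Weighted: favor the more common browsers in the pool
ua = random.choices(USER_AGENTS, weights=[65, 20, 15], k=1)[0]

# Session-based: keep one User-Agent for a batch of related requests, then switch
def session_user_agents(requests_per_session=10):
    while True:
        ua = random.choice(USER_AGENTS)
        for _ in range(requests_per_session):
            yield ua

session_pool = session_user_agents()
ua = next(session_pool)  # stays the same for 10 calls, then changes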

Important considerations:

  • Keep your pool realistic (don't include outdated browsers)
  • Match mobile vs desktop User-Agents to the target site (see the sketch after this list)
  • Update your User-Agent pool regularly
  • Don't rotate too aggressively (may look suspicious)
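
A minimal sketch of keeping separate desktop and mobile pools and choosing by target; the mobile strings and the target_is_mobile flag are illustrative assumptions:

import random

DESKTOP_AGENTS = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36',
]

MOBILE_AGENTS = [
    'Mozilla/5.0 (iPhone; CPU iPhone OS 17_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.1 Mobile/15E148 Safari/604.1',
    'Mozilla/5.0 (Linux; Android 14; Pixel 8) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36',
]

def pick_user_agent(target_is_mobile):
    # Choose from the pool that matches the kind of site being scraped
    pool = MOBILE_AGENTS if target_is_mobile else DESKTOP_AGENTS
    return random.choice(pool)

headers = {'User-Agent': pick_user_agent(target_is_mobile=True)}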

Common mistake:

Rotating User-Agents alone isn't enough for sophisticated anti-bot systems. You also need to:

  • Match Accept headers to the browser
  • Use consistent header combinations (see the sketch after this list)
  • Handle cookies properly
  • Respect rate limits
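
A minimal sketch of rotating a full, consistent header profile rather than the User-Agent alone; the exact Accept values are illustrative (real ones vary by browser version):

import random
import requests

HEADER_PROFILES = [
    {   # Chrome on Windows
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36',
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8',
        'Accept-Language': 'en-US,en;q=0.9',
        'Accept-Encoding': 'gzip, deflate, br',
    },
    {   # Firefox on Linux
        'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0',
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'Accept-Language': 'en-US,en;q=0.5',
        'Accept-Encoding': 'gzip, deflate, br',
    },
]

# Rotate the whole profile, not just the User-Agent, and reuse a Session so cookies persist
session = requests.Session()
session.headers.update(random.choice(HEADER_PROFILES))
response = session.get('https://example.com')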

A User-Agent generator helps you build and maintain a pool of realistic, current User-Agent strings.
