Proxy Management
Rotate IPs and bypass geo-restrictions with managed proxies
About Proxy Management
ScrapeHub provides built-in proxy management with automatic rotation, geo-targeting, and intelligent routing. Our proxy network includes datacenter, residential, and mobile IPs across 195+ countries.
Proxy Types
Datacenter
Fast and affordable proxies from data centers
- Speed: Very Fast
- Cost: Low
- Best for: High-volume scraping
Residential
Real residential IPs from ISPs
- Speed: Moderate
- Cost: Medium
- Best for: Protected sites
Mobile
Mobile carrier IPs (4G/5G)
- Speed: Fast
- Cost: High
- Best for: Mobile-specific sites
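As a rough rule of thumb, the trade-offs above can be encoded in a small helper that picks a proxy type for a job. This is an illustrative sketch only; `choose_proxy_type` is not part of the ScrapeHub SDK, and its inputs are hypothetical.

```python
# Illustrative helper encoding the trade-offs above.
# Not part of the ScrapeHub SDK; names here are hypothetical.
def choose_proxy_type(protected: bool, mobile_site: bool) -> str:
    """Return a proxy type string usable in the `proxy` option."""
    if mobile_site:
        return "mobile"        # carrier IPs for mobile-specific sites
    if protected:
        return "residential"   # real ISP IPs for protected sites
    return "datacenter"        # fast and cheap for high-volume scraping

print(choose_proxy_type(protected=False, mobile_site=False))  # datacenter
```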
Basic Usage
Enable Proxy Rotation
```python
from scrapehub import ScrapeHubClient

client = ScrapeHubClient(api_key="your_api_key")

# Use automatic proxy rotation
result = client.scrape(
    url="https://example.com",
    proxy=True  # Uses default datacenter proxies
)

print(result.data)
```
Specify Proxy Type
```python
# Use residential proxies
result = client.scrape(
    url="https://example.com",
    proxy={
        "type": "residential",  # or "datacenter", "mobile"
        "rotate": True          # Automatically rotate IPs
    }
)

print(result.data)
```
Node.js Example
```javascript
const { ScrapeHubClient } = require('@scrapehub/node');

const client = new ScrapeHubClient({
  apiKey: process.env.SCRAPEHUB_API_KEY
});

async function scrapeWithProxy() {
  const result = await client.scrape({
    url: 'https://example.com',
    proxy: {
      type: 'residential',
      rotate: true
    }
  });
  console.log(result.data);
}

scrapeWithProxy();
```
Geo-Targeting
Target Specific Country
```python
result = client.scrape(
    url="https://example.com",
    proxy={
        "type": "residential",
        "country": "US",  # ISO 3166-1 alpha-2 code
        "rotate": True
    }
)

# Request appears to come from the United States
print(result.data)
```
Target Specific City
```python
result = client.scrape(
    url="https://example.com",
    proxy={
        "type": "residential",
        "country": "US",
        "city": "New York",  # or state: "NY"
        "rotate": True
    }
)

print(result.data)
```
Available Countries
```python
# Get the list of available proxy countries
countries = client.proxies.list_countries()

for country in countries:
    print(f"{country['code']}: {country['name']} ({country['available_ips']} IPs)")
```
Advanced Configuration
Custom Rotation Strategy
```python
result = client.scrape(
    url="https://example.com",
    proxy={
        "type": "residential",
        "rotation": {
            "strategy": "per_request",  # "per_request", "per_session", "per_domain"
            "max_uses": 10,  # Rotate after N uses
            "max_age": 300   # Rotate after N seconds
        }
    }
)
```
Session Persistence
Keep the same IP for multiple requests:
```python
# Create a session with a persistent proxy
session = client.create_session(
    proxy={
        "type": "residential",
        "country": "US",
        "persist": True  # Same IP for all session requests
    }
)

# All requests in this session use the same IP
result1 = session.scrape("https://example.com/page1")
result2 = session.scrape("https://example.com/page2")
result3 = session.scrape("https://example.com/page3")

# Clean up
session.close()
```
Custom Proxy Integration
Use your own proxy servers:
```python
result = client.scrape(
    url="https://example.com",
    proxy={
        "custom": True,
        "url": "http://username:password@proxy.example.com:8080",
        # Or provide a list for rotation
        "urls": [
            "http://proxy1.example.com:8080",
            "http://proxy2.example.com:8080",
            "http://proxy3.example.com:8080"
        ]
    }
)
```
Proxy Pool Management
Create Custom Pool
```python
# Create a custom proxy pool
pool = client.proxies.create_pool(
    name="my-pool",
    type="residential",
    countries=["US", "GB", "CA"],
    size=50,  # Number of IPs in the pool
    rotation="per_request"
)

# Use the pool
result = client.scrape(
    url="https://example.com",
    proxy={"pool_id": pool.id}
)

# Get pool statistics
stats = client.proxies.get_pool_stats(pool.id)
print(f"Active IPs: {stats['active_ips']}")
print(f"Success rate: {stats['success_rate']}%")
```
Health Monitoring
```python
# Check proxy health
health = client.proxies.check_health(
    proxy_type="residential",
    country="US"
)

print(f"Available IPs: {health['available']}")
print(f"Average response time: {health['avg_response_time']}ms")
print(f"Success rate: {health['success_rate']}%")
print(f"Banned IPs: {health['banned']}")
```
Handling Proxy Failures
```python
from scrapehub import ScrapeHubClient
from scrapehub.exceptions import ProxyError

client = ScrapeHubClient(api_key="your_api_key")

try:
    result = client.scrape(
        url="https://example.com",
        proxy={
            "type": "residential",
            "country": "US",
            "retry_on_failure": True,  # Auto-retry with a different IP
            "max_retries": 3
        }
    )
    print(result.data)
except ProxyError as e:
    print(f"Proxy error: {e}")
    print(f"Failed IPs: {e.failed_ips}")
    # Fall back to a direct connection or a different proxy type here
```
Best Practices
Proxy Best Practices
- Use datacenter proxies for high-volume, low-risk scraping
- Use residential proxies for protected or geo-restricted sites
- Enable session persistence when scraping related pages
- Monitor proxy health and success rates regularly
- Implement automatic failover to backup proxies
- Combine proxies with stealth features for best results
- Respect rate limits even with proxy rotation
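The last point deserves emphasis: rotating IPs does not remove the need for client-side throttling. A minimal sketch of a delay-based limiter that can wrap any scrape call (the `RateLimiter` class below is plain Python, not SDK functionality):

```python
import time

class RateLimiter:
    """Enforce a minimum interval between calls, regardless of IP rotation."""

    def __init__(self, requests_per_second: float):
        self.min_interval = 1.0 / requests_per_second
        self._last_call = 0.0

    def wait(self):
        # Sleep just long enough to honor the configured rate
        elapsed = time.monotonic() - self._last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last_call = time.monotonic()

limiter = RateLimiter(requests_per_second=2)
for url in ["https://example.com/page1", "https://example.com/page2"]:
    limiter.wait()
    # result = client.scrape(url=url, proxy=True)  # throttled scrape call
```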
Pricing
Proxy usage is billed separately based on the type and volume:
- Datacenter Proxies: $0.001 per request. Included in the Starter plan and above (up to plan limits).
- Residential Proxies: $0.005 per request. Available on the Professional plan and above.
- Mobile Proxies: $0.015 per request. Available on the Enterprise plan only.
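At the per-request rates above, estimating a proxy bill is a simple multiplication. A quick back-of-the-envelope calculation (plan inclusions and limits aside):

```python
# Per-request proxy rates from the pricing table above (USD).
RATES = {"datacenter": 0.001, "residential": 0.005, "mobile": 0.015}

def proxy_cost(requests: int, proxy_type: str) -> float:
    """Estimated proxy cost in USD for a given request volume."""
    return round(requests * RATES[proxy_type], 2)

print(proxy_cost(100_000, "datacenter"))   # 100.0
print(proxy_cost(100_000, "residential"))  # 500.0
```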
Troubleshooting
Slow Response Times
Try switching to datacenter proxies or a different country with lower latency
Frequent Proxy Bans
Increase rotation frequency, add delays between requests, or upgrade to residential proxies
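One way to add those delays is to retry with exponential backoff, so each failed attempt waits longer before the next IP is tried. A minimal wrapper (plain Python, not SDK functionality; the scrape call in the usage comment is illustrative):

```python
import time

def with_backoff(fn, max_retries=3, base_delay=1.0):
    """Call fn(), sleeping base_delay * 2**attempt between failed attempts."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; propagate the last error
            time.sleep(base_delay * 2 ** attempt)

# Usage (hypothetical):
# result = with_backoff(lambda: client.scrape(url="https://example.com", proxy=True))
```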
Geo-Location Not Working
Verify the country code is correct and that residential proxies are available for that region