Once you have a scraping URL (see How to Generate a Scraping URL from the Dashboard), you can use it in several ways.
Option 1: Browser (for testing)
Paste the scraping URL into your browser’s address bar.
What happens:
The page is loaded through Proxyrack
The raw HTML response is returned
Recommended for:
Quick testing
Verifying the page loads correctly
Option 2: Terminal or scripts (recommended)
Most users call the scraping URL using:
Terminal tools (curl, wget)
Scripts (Python, Node.js, etc.); see the example below
This allows:
Automation
Repeated requests
Integration with parsing logic
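For example, here is a minimal Python sketch using the third-party requests library. The URL value is a placeholder, not a real endpoint; replace it with the scraping URL you generated from the dashboard.

import requests

# Placeholder only: paste the scraping URL generated from your Proxyrack dashboard.
SCRAPING_URL = "PASTE_YOUR_SCRAPING_URL_HERE"

# Request the page through the scraping URL; the response body is the raw HTML.
response = requests.get(SCRAPING_URL, timeout=60)
response.raise_for_status()

print(response.status_code)
print(response.text[:500])  # first 500 characters of the returned HTML

The same request works from the command line by passing the scraping URL to curl or wget, and the returned HTML can be handed off to whatever parsing logic your workflow uses.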
Important note
The browser is handy for quick checks, but real scraping workflows almost always run from scripts or terminal tools.
