So I found this specific client through a Skool post in an AI group and ended up being the middle-man developer on this project.
They have thousands of player contacts and constantly need the latest data. I made a simple dashboard where they just hit Start once a day, and it does the scraping automatically with Puppeteer.
It runs 5 Puppeteer browsers in the background (images and extra requests blocked to keep CPU low) and has proxy support, so when rate limits occur I just rotate the proxy. In about 5 hours it scrapes 2,500+ players and updates their Monday board with stuff like nationality, current club, market value, agency, contract dates, ID, and more, all in one place.
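The lightweight-browser setup above can be sketched roughly like this. It's a minimal sketch, assuming Puppeteer is installed; `PROXY_URL` and the exact list of blocked resource types are my assumptions, not the client's actual config:

```javascript
// Hypothetical proxy endpoint -- rotate this value when rate limits hit.
const PROXY_URL = 'http://proxy.example.com:8000';

// Launch options for one lightweight browser instance routed through a proxy.
function launchOptions(proxyUrl) {
  return {
    headless: true,
    args: [`--proxy-server=${proxyUrl}`],
  };
}

// Resource types to abort so each profile page loads with minimal CPU/bandwidth.
function shouldBlockResource(resourceType) {
  return ['image', 'font', 'stylesheet', 'media'].includes(resourceType);
}

// Wiring inside Puppeteer (shown for context, not executed here):
// const puppeteer = require('puppeteer');
// const browser = await puppeteer.launch(launchOptions(PROXY_URL));
// const page = await browser.newPage();
// await page.setRequestInterception(true);
// page.on('request', (req) =>
//   shouldBlockResource(req.resourceType()) ? req.abort() : req.continue());
```

Aborting images, fonts, and stylesheets is what makes running 5 browsers in parallel cheap enough on CPU, since the scraper only needs the page's data, not its rendering.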
A small case study:
Time comparison
- Manual: 2,500 profiles × 2 mins = 5,000 min (≈ 83.33 hours)
- Automation runtime: ~5 hours (hands‑off)
- Human time saved per run: ≈ 78.33 hours
Cost framing
- At £12/hr data entry: 83.33 hrs × £12 = £999.96 saved per run
- Calendar time reduction: ~94% faster (83.33h → ~5h)
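The arithmetic behind those figures can be checked in a few lines (same inputs as the list above; nothing new assumed):

```javascript
const profiles = 2500;
const minutesPerProfile = 2;
const hourlyRate = 12; // £/hr for data entry

const manualHours = (profiles * minutesPerProfile) / 60; // 5,000 min ≈ 83.33 h
const automationHours = 5;

const humanHoursSaved = manualHours - automationHours;     // ≈ 78.33 h
const costSavedPerRun = manualHours * hourlyRate;          // ≈ £1,000
const calendarReduction = 1 - automationHours / manualHours; // ≈ 0.94

console.log({ manualHours, humanHoursSaved, costSavedPerRun, calendarReduction });
```

Note the saved cost comes out to exactly £1,000 here; the £999.96 in the list comes from rounding 83.333… down to 83.33 before multiplying.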