Web Researcher Agent
I’m working on a small project to create a research agent that can:
  1. Crawl an entire website (including all subpages under the same domain).
  2. Extract and save all the data into a single text file.
  3. Download every attachment available on the site (PDFs, docs, etc.).
Once that's collected, I'll feed all the data into an LLM-powered notebook for deep analysis and insights.
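For the first three steps, here is a minimal sketch of a same-domain crawler in Python. It assumes `requests` and `beautifulsoup4` are installed; the function names (`same_domain`, `is_attachment`, `crawl`), the output file name, and the attachment-extension list are all my own choices, not an established API. A production crawler would also need rate limiting, robots.txt handling, and error retries.

```python
import os
from collections import deque
from urllib.parse import urljoin, urlparse

# File extensions treated as downloadable attachments (assumed list; extend as needed).
ATTACHMENT_EXTS = {".pdf", ".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx"}

def same_domain(url, root):
    """True if url lives on the same host as the start page."""
    return urlparse(url).netloc == urlparse(root).netloc

def is_attachment(url):
    """True if the URL path ends in a known attachment extension."""
    path = urlparse(url).path.lower()
    return os.path.splitext(path)[1] in ATTACHMENT_EXTS

def crawl(start_url, out_file="site.txt", download_dir="attachments"):
    """BFS-crawl one domain: append page text to out_file, save attachments."""
    import requests                     # third-party, assumed installed
    from bs4 import BeautifulSoup      # third-party, assumed installed

    os.makedirs(download_dir, exist_ok=True)
    seen, queue = {start_url}, deque([start_url])
    with open(out_file, "w", encoding="utf-8") as out:
        while queue:
            url = queue.popleft()
            resp = requests.get(url, timeout=10)
            if "text/html" not in resp.headers.get("Content-Type", ""):
                continue
            soup = BeautifulSoup(resp.text, "html.parser")
            # Step 2: dump the page's visible text into one file.
            out.write(f"\n=== {url} ===\n{soup.get_text(' ', strip=True)}\n")
            # Step 1 + 3: follow same-domain links, downloading attachments.
            for a in soup.find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if link in seen or not same_domain(link, start_url):
                    continue
                seen.add(link)
                if is_attachment(link):
                    name = os.path.basename(urlparse(link).path)
                    with open(os.path.join(download_dir, name), "wb") as f:
                        f.write(requests.get(link, timeout=30).content)
                else:
                    queue.append(link)
```

A breadth-first queue with a `seen` set keeps the crawl from looping on circular links; the `netloc` check is what confines it to subpages of the same domain.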
The idea is to make information gathering automatic and efficient, so I can focus on using the data instead of spending hours collecting it manually.
If anyone has experience building similar agents or optimizing crawlers, I’d love to hear your tips and feedback!
David Sterenfeld
Automate with N8N
skool.com/automate-with-n8n-7409