
Memberships

Automate with N8N

342 members • Free

2 contributions to Automate with N8N
We're back 🎉
Hey guys, it's been a few months since I last posted in this community... The reason I disappeared for a while was to secretly focus entirely on Nodewave, my AI Automations Agency, and since then a lot has happened... I got my first paying clients, built a super powerful SEO system using N8N MCPs and Claude.ai that's currently bringing in new leads every single day, plus a few different follow-up automations and much more 😀 I want to get back on YouTube as soon as possible and give you guys this exact system so you can all test it too; I just need to finish a few things first to make sure we capitalize on the momentum that's building up... In the meantime I'd love to connect with you all and see what you're building. Feel free to connect with me on LinkedIn as well, I'll be posting updates there too: https://www.linkedin.com/in/tiago-lemos-30ba87293/
1 like • 15d
Great to have you back Tiago! Would be interesting to have a video where you go through some of the challenges and obstacles you faced when you started your agency. Just sent through an invite on LinkedIn 👍
Web Researcher Agent
I'm working on a small project to create a research agent that can: 1. Crawl an entire website (including all subpages under the same domain). 2. Extract and save all the data into a single text file. 3. Download every attachment available on the site (PDFs, docs, etc.). 4. Later, I'll feed all this collected data into an LLM-powered notebook for deep analysis and insights. The idea is to make information gathering automatic and efficient, so I can focus on using the data instead of spending hours collecting it manually. If anyone has experience building similar agents or optimizing crawlers, I'd love to hear your tips and feedback!
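For step 1 (crawling every subpage under one domain), here's a minimal stdlib-only sketch of the traversal logic. It's a hypothetical starting point, not a full agent: the `fetch` callable, `LinkExtractor`, and `crawl` names are my own, and a real version would add robots.txt handling, rate limiting, attachment downloads, and proper error reporting.

```python
# Minimal same-domain breadth-first crawler sketch (stdlib only).
# Hypothetical helper names; real projects would add rate limiting,
# robots.txt checks, attachment downloads, and retries.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, max_pages=100):
    """Visit every page under start_url's domain, breadth-first.

    `fetch` is any callable url -> HTML string (e.g. a requests wrapper),
    injected so the traversal logic can be tested without the network.
    Returns a dict mapping each visited URL to its HTML.
    """
    domain = urlparse(start_url).netloc
    queue, seen, pages = [start_url], {start_url}, {}
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        try:
            html = fetch(url)
        except Exception:
            continue  # skip unreachable pages, keep crawling
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            # Resolve relative links and drop fragments before filtering.
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages
```

Concatenating `pages.values()` (after stripping tags) would give the single text file from step 2; links ending in `.pdf`, `.docx`, etc. could be routed to a downloader instead of the queue for step 3.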
3 likes • Sep '25
I've recently built a scraper using Firecrawl; it's affordable and really quick to set up. Just make sure to set the CrawlEntireDomain property to true for your use case and it should be good to go. You might also need to limit the credits used to 100 or so, as it can wipe out 500 credits for one site if not specified.
Ole Mariri
@ole-mariri-9770
GTM Engineer. Building powered revenue architectures for venture-backed startups and unicorns.

Active 14d ago
Joined Sep 1, 2025