Day 1 : AIS#7DaysChallenge✅
What I built: I created a newsletter automation following the Day 1 video. Seeing my first automated newsletter email go live felt like a small but meaningful success, and it made me even more excited to keep building.
One thing I'd improve: In the next iteration, I want to improve the font styling, color scheme, and overall content structure to make the newsletter look more polished and professional.
Day 1 : AIS#7DaysChallenge✅
My First n8n Workflow – Hacker News Scraper 🚀
Just built my very first n8n workflow, and I'm honestly hooked! 🎉 It's a simple Hacker News scraper that pulls the top 10 items on demand with a single click. Nothing fancy, but it feels amazing to automate something for the first time. The drag-and-drop flow in n8n makes it intuitive even for a beginner. Next up, I want to automatically send these results to a Google Sheet or Telegram channel. Any tips from the community? 👇 Drop a 🔥 if you remember your first automation!
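For anyone curious what the scrape boils down to outside n8n: Hacker News exposes a public Firebase API, and the "top 10 items" fetch can be sketched in a few lines of Python. The endpoints below are the real HN API; the script is just an illustrative sketch, not the n8n workflow itself.

```python
import json
import urllib.request

# Official Hacker News Firebase API base URL
HN_API = "https://hacker-news.firebaseio.com/v0"

def fetch_json(path: str):
    """GET one JSON document from the Hacker News API."""
    with urllib.request.urlopen(f"{HN_API}/{path}.json") as resp:
        return json.load(resp)

def top_stories(n: int = 10):
    """Return the top n front-page items as dicts (title, score, url, ...)."""
    ids = fetch_json("topstories")[:n]
    return [fetch_json(f"item/{story_id}") for story_id in ids]

# Usage (requires network access):
#   for rank, story in enumerate(top_stories(), start=1):
#       print(rank, story.get("title"), story.get("score"))
```

In n8n this is essentially an HTTP Request node to `topstories.json` followed by item lookups, which is why the workflow stays so small.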
#AISChallenge-Day 2
Just ran my first scrape with Claude Code + Firecrawl MCP, and it actually worked on the first try.
What I scraped: Booking.com hotel search results for Dubai. 25 properties extracted, including hotel names, locations, ratings, review counts, property types, and direct booking URLs. All saved automatically to a clean CSV.
What surprised me: I didn't have to tell Claude Code which Firecrawl tool to use or how to call it. I just dropped the URL, and it figured out the rest: use firecrawl_scrape, add a waitFor for the JS-heavy page, parse the markdown, detect the page type (directory/listing), map the right columns, name the file correctly, and write the CSV.
Use case idea: Hotel and short-term rental research for travel clients or property investors. Scrape competitor listings across Booking.com, Airbnb, or local portals, track rating trends over time, and flag new properties entering a market. Run it on a schedule, and you've got a lightweight market intelligence feed without paying for an expensive data API.
Even connecting the MCP server is a win. Once it's wired up, the scraping part is almost trivially easy. #AISChallenge
#AISChallenge - Day 1 ✅
Just shipped my first fully working Newsletter Automation! From zero to a complete system, all built with Claude Code and no prior coding experience in this area. This marks the official start of my 7-day challenge. Day 1 in the books, and it feels damn good. Who else is in? Drop a 🔥 if you're doing the #AISChallenge with me. Let's build in public. #AISChallenge #Day1
#AISChallenge Day 6
What I scheduled: A daily Gmail summary that checks my emails automatically, so I don't have to open Gmail and scroll through everything manually.
Scheduled task or loop? I tried the loop first just to see how it works, then switched to a scheduled task so it keeps running even after I close Claude. Loops die when you close the session, which makes them useless for daily stuff.
One surprise: It saves what it has already read after each run, so the next run doesn't just repeat the same emails. I didn't think it would actually remember state between runs like that. Honestly kinda cool.
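The "remembers what it already read" behavior comes down to persisting a little state between runs. Here's a minimal sketch of that idea in Python: keep the IDs of already-summarized messages in a JSON file so each run only surfaces new mail. The file name and message IDs are made up for illustration, and this is not how the Claude scheduled task is actually implemented.

```python
import json
from pathlib import Path

# Hypothetical state file that survives between runs
STATE_FILE = Path("seen_emails.json")

def load_seen() -> set:
    """Load the set of message IDs summarized in earlier runs."""
    if STATE_FILE.exists():
        return set(json.loads(STATE_FILE.read_text()))
    return set()

def save_seen(seen: set) -> None:
    STATE_FILE.write_text(json.dumps(sorted(seen)))

def summarize_new(message_ids):
    """Return only the IDs not seen before, then record them as seen."""
    seen = load_seen()
    new = [m for m in message_ids if m not in seen]
    save_seen(seen | set(new))
    return new

STATE_FILE.unlink(missing_ok=True)  # start fresh for this demo
first = summarize_new(["msg-1", "msg-2"])   # both are new
second = summarize_new(["msg-1", "msg-2", "msg-3"])  # only msg-3 is new
```

The second call skips everything the first call already handled, which is exactly why a daily scheduled run doesn't repeat old emails.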
AI Automation Society
skool.com/ai-automation-society
Learn to get paid for AI solutions, regardless of your background.