
Owned by Dan

The Build Lab

14 members • Free

Let's build AI and Automation tools!

Memberships

Dopamine Digital

4.5k members • Free

Skoolers

174.8k members • Free

AI Agent Developer Academy

2.1k members • Free

Ai Automation Vault

12.3k members • Free

AI Money Lab

27.8k members • Free

AI Automation Society

144.4k members • Free

The Lazy Empire

4k members • Free

Outsource Mastery by Joe Rare

173 members • Free

AI Automation Agency Hub

248.9k members • Free

12 contributions to The Build Lab
Google just dropped a game-changer: Context URLs
I spent the last 3 hours building something in n8n that would've taken me 3 days to code from scratch. No web scraping. No API limits. No blocked requests.

Here's what happened: Google quietly released Context URLs, a way to pull structured content from ANY website without traditional scraping. Think of it as giving AI agents x-ray vision for web content.

In my n8n demo, I'm pulling real-time data from sites that normally block automated requests. The Context URL feature treats the request like a human browser session but returns structured JSON that my AI agents can actually understand.

The workflow I built:
- Input any website URL
- Google's Context API extracts the meaningful content
- n8n processes it through my custom AI agent
- Output: clean, structured data ready for analysis

What used to require complex proxy rotations and headless browsers now takes 3 nodes in n8n. Success rate? Not 100%, but it still works great! No rate limits hit. No captchas triggered. No IP blocks.

The real power isn't just avoiding scraping headaches; it's that Context URLs understand the semantic structure of pages. For anyone building automation workflows, this changes everything: you can now build reliable data pipelines from sites that were previously off-limits.

Has anyone else tried Context URLs yet? What's your biggest challenge with web data extraction that this might solve?
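Outside of n8n, the four-step workflow above can be sketched as plain functions. Note that `extract_content` here is a stub standing in for Google's URL-context extraction (the real call depends on your API setup), and all field names are illustrative, not any actual API schema:

```python
# Sketch of the four-step workflow as plain functions.
# extract_content() is a STUB standing in for Google's URL-context
# extraction; swap in a real API call in your own workflow.

def extract_content(url: str) -> dict:
    """Stub: pretend the extraction step returned structured page content."""
    return {"url": url, "title": "Example Page", "sections": ["intro", "pricing"]}

def process_with_agent(page: dict) -> dict:
    """Stand-in for the custom AI-agent node: tag and summarize sections."""
    return {
        "source": page["url"],
        "summary": f"{page['title']}: {len(page['sections'])} sections",
        "sections": page["sections"],
    }

def run_pipeline(url: str) -> dict:
    page = extract_content(url)       # step 2: extract meaningful content
    return process_with_agent(page)   # step 3: process through the agent

result = run_pipeline("https://example.com")  # step 1: input any URL
print(result)                                  # step 4: structured output
```

The point of the shape: each node only passes structured data forward, so any step can be swapped out without touching the others.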
🚀 N8N Just Changed the Game with Data Tables
Quick heads up on something that's about to save you hours (and probably thousands in infrastructure costs). N8N just released Data Tables, and I've spent the last 24 hours going deep on this.

What this means for us: you can now store and manage data directly inside your N8N workflows. No external database needed. No SQL knowledge required. No separate infrastructure to maintain.

Real build I just tested: a complete lead scoring system that:
- Captures form submissions
- Stores contact data in N8N's native table
- Automatically enriches with API data
- Scores leads based on behavior

Time to build: 22 minutes. Cost: $0 extra.

Technical details that excited me:
- Full CRUD operations (Create, Read, Update, Delete)
- REST API access to your tables
- Real-time syncing
- Export to any format
- Query from any node in your workflow

For the solopreneurs here: test your MVP ideas without technical debt. I watched a founder yesterday replicate what would normally be a $5k/month database setup in about an hour.

Challenge for the community: what's one manual process in your workflow that you could eliminate if your data lived inside your automations? Drop your use case below.

Who's already playing with this? Share your experiments!
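To make the lead-scoring build concrete, here's the same CRUD-plus-scoring logic expressed as a plain in-memory table. This is NOT N8N's Data Table API, just a hedged sketch of the shape of the build; the scoring weights and field names are made up for illustration:

```python
# Hedged sketch of the lead-scoring build using a plain in-memory table.
# Not N8N's Data Table API -- just the same create/update/read + scoring
# logic in Python. Weights and field names are illustrative.

class LeadTable:
    def __init__(self):
        self.rows = {}
        self.next_id = 1

    def create(self, email, **fields):
        row = {"id": self.next_id, "email": email, "score": 0, **fields}
        self.rows[self.next_id] = row
        self.next_id += 1
        return row

    def update(self, row_id, **fields):
        self.rows[row_id].update(fields)
        return self.rows[row_id]

    def read(self, row_id):
        return self.rows[row_id]

def score_lead(row):
    # Behavior-based scoring with made-up weights.
    score = 0
    if row.get("opened_email"):
        score += 10
    if row.get("visited_pricing"):
        score += 25
    if row.get("company_size", 0) > 50:   # field filled by "enrichment"
        score += 15
    return score

table = LeadTable()
lead = table.create("jane@example.com", opened_email=True, visited_pricing=True)
table.update(lead["id"], company_size=120)   # step: enrich with API data
table.update(lead["id"], score=score_lead(table.read(lead["id"])))
print(table.read(lead["id"])["score"])  # 50
```

The same four workflow steps map directly onto this: form submission = `create`, enrichment and scoring = `update`, and any downstream node just does a `read`.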
Creating a Searchable Database with Exact Citations
I just built something that completely changes how we access clinical knowledge. You know that frustration when you need exact language from a clinical guideline, but you're digging through 200-page PDFs?

I connected an AI agent in n8n directly to a Pinecone vector database containing uploaded clinical algorithms. Now I get precise citations and exact language from any document in seconds, not hours.

The best part? I recorded the entire build process from scratch. In 18 minutes, you'll see exactly how to:
- Set up the n8n workflow with Pinecone integration
- Configure the vector store for clinical documents
- Build the agent that delivers exact citations
- Test it with real clinical algorithms

This isn't theory. I've been using this system daily at RocketTools for client work, and it's cut my research time by 80%. The precision is incredible: it doesn't just give you "related content." It gives you the exact paragraph, page number, and context you need for compliance documentation.

Watch the full build here: https://vimeo.com/1120642938/d18fc27121?share=copy

Who else is building AI agents for healthcare workflows? What's your biggest challenge with clinical documentation?
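The key to "exact citations" is storing source metadata with every chunk and returning it at query time. Here's a hedged, runnable sketch of that pattern: a real build would use Pinecone and an embedding model, but here the "embeddings" are toy keyword-count vectors and the documents are invented, so only the retrieval-with-citation shape carries over:

```python
# Sketch of the exact-citation pattern: store each chunk WITH its source
# metadata, retrieve by similarity, return the metadata alongside the text.
# Toy keyword-count "embeddings" stand in for a real embedding model;
# document names and pages are invented for illustration.

def embed(text, vocab=("dosing", "renal", "pediatric", "titration")):
    words = text.lower().split()
    return [words.count(term) for term in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

# Each chunk carries the citation fields the agent should surface.
chunks = [
    {"text": "Renal dosing adjustment: reduce dose by 50 percent.",
     "doc": "nephro-guideline.pdf", "page": 112},
    {"text": "Pediatric titration schedule over four weeks.",
     "doc": "peds-algorithm.pdf", "page": 7},
]
for c in chunks:
    c["vector"] = embed(c["text"])

def search(query):
    qv = embed(query)
    best = max(chunks, key=lambda c: cosine(qv, c["vector"]))
    # Return the exact text WITH its citation, not just "related content".
    return {"text": best["text"], "citation": f'{best["doc"]}, p. {best["page"]}'}

print(search("renal dosing rules"))
```

Swap `embed` for a real embedding call and `chunks` for a Pinecone index with metadata, and the agent gets the paragraph, page number, and document name in one retrieval.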
A little stuck here...
I'm trying to get Typebot to use n8n to call OpenRouter and then respond back to Typebot, but I can't seem to figure out how to close the loop.

Context: many times when we counsel patients, especially on several medications at once, you can tell the patient isn't retaining enough of the information. Currently we ask them to call us back if they forgot something, but I'm thinking of creating a way to generate a transcript of the discussion, which would capture the patient's specific questions, and then send it to them by email. I believe this would be especially helpful for patients on specialty medications, which can be quite complicated, and for Spanish speakers (prevalent in El Paso, TX).
A little stuck here...
0 likes • 7d
Great start! First, make sure the app completely works in n8n via the chat interface. If you're going to use the automation only within n8n, you might just skip the Typebot part altogether. Then you can either a) connect your n8n to a third-party chat interface (there's a free one on GitHub, or you can use a prebuilt one like ChatDash), or b) build one yourself and host it on your own server with AI. Make sure the base automation works perfectly, then we can work on the UI! Keep up the great work!
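For closing the loop specifically: the usual n8n pattern is a Webhook trigger node paired with a Respond to Webhook node, so Typebot calls the webhook URL, n8n calls OpenRouter, and the workflow's HTTP response carries the answer back to Typebot in the same request. Stripped of the n8n nodes, the round trip looks roughly like this; `call_openrouter` is a stub, and the payload field names are illustrative, not Typebot's actual schema:

```python
# Rough shape of the Typebot -> n8n -> OpenRouter -> Typebot round trip.
# call_openrouter() is a STUB; in n8n this would be an HTTP Request node
# hitting OpenRouter. Field names are illustrative, not Typebot's schema.

def call_openrouter(prompt: str) -> str:
    """Stub for the LLM call made from inside the n8n workflow."""
    return f"(model reply to: {prompt})"

def handle_webhook(payload: dict) -> dict:
    """What runs between the Webhook node and the Respond to Webhook node."""
    question = payload.get("message", "")
    answer = call_openrouter(question)
    # Whatever dict you return here becomes the HTTP response body that
    # Typebot reads -- that response IS the "closed loop".
    return {"reply": answer, "sessionId": payload.get("sessionId")}

response = handle_webhook({"message": "How do I take this medication?",
                           "sessionId": "abc123"})
print(response["reply"])
```

The common mistake is leaving the Webhook node on "respond immediately," which returns before the LLM call finishes; setting it to respond via the Respond to Webhook node is what lets the answer travel back on the same request.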
Adding MCPs to Claude for FREE with Docker!
Hey everyone! 👋 I want to share something that's been a total game-changer for how I use Claude Desktop, and I promise, this is NOT difficult!

What's This About? I'm going to show you how to use Docker Desktop to connect Claude to a whole universe of MCP (Model Context Protocol) servers. This simple hack will absolutely transform what you can do with Claude.

Why Should You Care? Once you set this up, Claude can:
• 📝 Access and edit your local files directly
• 🔍 Search the web in real-time
• 📊 Connect to databases
• 🛠️ Run code in multiple languages
• 📚 Access specialized knowledge bases
• And SO much more!

Whether you're writing, researching, coding, or just exploring, this opens up possibilities you didn't even know existed.

The Setup (I Promise It's Simple!) I've created a step-by-step video that walks you through the entire process. No confusing technical jargon, just clear instructions that anyone can follow.

📹 Watch the tutorial here: https://vimeo.com/1118911240/14ea7bcfab?share=copy

What You'll Learn:
1. How to install Docker Desktop (if you haven't already)
2. Setting up MCP server access inside Claude Desktop
3. Connecting your first MCP servers
4. Real-world examples of how this supercharges your workflow

The Bottom Line: This isn't just another "cool trick"; it's a fundamental upgrade to how Claude works. You're essentially giving Claude superpowers to interact with the outside world in ways that weren't possible before. Trust me, once you see what this can do, you'll wonder how you ever used Claude without it.

Got questions? Drop them in the comments! I'm here to help everyone get this set up. Who's ready to level up their Claude game? 🎯

P.S. If you found this helpful, share it with others in the community who might benefit. Let's help everyone unlock Claude's full potential!
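If you're curious what the wiring looks like under the hood: Claude Desktop reads a `claude_desktop_config.json` file listing MCP servers as commands it can launch. With Docker's MCP Toolkit, the entry is typically a single gateway command like the sketch below, but check the Docker MCP Toolkit docs for the exact arguments for your version:

```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```

One gateway entry is all Claude needs; Docker Desktop then routes requests to whichever MCP servers you've enabled in its catalog.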
Dan McCoy
2
12 points to level up
@dan-mccoy-9435
Storytelling consultant for 30 years now building AI and Chatbots

Active 3d ago
Joined Jul 25, 2025