
Memberships

* Fractality Community • 137 members • Free
* Empire IA • 1.7k members • Free
* Systèmes IA Lab • 341 members • Free
* MechaPizzAI Community • 1.4k members • Free
* AISANCE • 342 members • Free
* AI First Lab - Le HUB • 3.9k members • Free
* Claude Code France • 764 members • Free
* LE LABO IA (BASIC) • 4.3k members • Free
* SOS Développeur • 33 members • Free

6 contributions to (n8n) Nodes Automation Lab
Help understanding how to properly use skills and MCP
Hello. Are there websites where people showcase what they have built with skills, MCP servers, and other tools, on GitHub or other marketplaces? That would give me some inspiration and, above all, help me better understand how to use them.

Since there are many different search tools, I figured there must be a reason for it. The LLMs I asked told me that Tavily can only perform web searches, while entering a site and searching within it requires the Brave MCP. There is also RPA and the Playwright MCP, and here again I'm trying to understand when to prefer one over the other. Should I just install these two on my computer/CLI and let the AI choose the right one based on my prompt? But then there is another style of tool, like OpenClaw or Claude Co-work (I believe OpenAI also has its own). Are these also RPA tools, or something different?

I was thinking of creating an MCP server, in n8n or on my computer, containing all the search tools in the world, which would pick the most appropriate tool based on my prompt. But LLMs tell me that's not the right solution. So I thought about connecting it to GitHub so it could read, like a database, which search tools are best, but that's still not it. Then I considered building my own database/RAG with AnythingLLM: first gather all the search MCPs with Apify, then keep this RAG updated with an RSS feed. That way my CLI would have a solid knowledge base to advise me on the best tools for what I want to do.

Concretely: I have products to resell in very different categories, and to avoid having to hunt for the specialized sites in each category myself, I would like a CLI or n8n workflow that could find the appropriate sites, register on them, and maybe even write the ads and post them on its own. Would that require different MCPs, and perhaps skills and other tools as well? Thanks.
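One way to picture why a small, curated set of search tools tends to work better than registering "all the search tools in the world" is that selection happens per task, based on each tool's description. The sketch below is purely illustrative Python with invented tool names and keyword rules standing in for the LLM's own choice; it is not a real MCP client API.

```python
# Hypothetical router: pick one search backend per task.
# Tool names and routing rules are assumptions for illustration only.

TOOLS = {
    "tavily": "general web search (summarized results)",
    "brave": "site-restricted search within a specific website",
    "playwright": "browser automation: click, log in, fill forms, scrape JS pages",
}

def pick_tool(task: str) -> str:
    """Crude keyword rules standing in for an LLM reading tool descriptions."""
    task = task.lower()
    if any(w in task for w in ("click", "login", "form", "navigate")):
        return "playwright"  # needs to interact with a page, not just search
    if "site:" in task or "specific site" in task:
        return "brave"       # search scoped to one website
    return "tavily"          # default: open web search
```

With only two or three well-described tools installed, the model's own routing usually works; the larger the toolbox, the more often it picks wrong, which is likely why the LLMs advised against the "every tool in one server" approach.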
Help understanding how to use skills and MCP correctly
Hello, could you help me understand how to search properly across different sites, for example for job hunting but also for e-commerce, i.e. sourcing and scraping?
Apify
Hello. I would like to scrape simple and variable products from marketplaces that don't have an existing actor, like Darty (a French marketplace), but I'm not sure I'm doing it correctly. Is the configuration the same as for a marketplace that does have an actor, like Amazon? Could you explain what the various options are for, and how to handle pagination and variable products?

After that, I'd like to add n8n, because I want to add conditions, avoid or delete duplicates, sort suppliers by city, build a list of all suppliers with a "contact" button, and export the CSV files in multiple formats, specifically the import format of each marketplace (I'm not sure whether this setup belongs in Apify or in n8n).

I ask LLMs all these questions and they explain, but I'm not sure I've fully understood. The main problem is that they can't show me diagrams the way the application does. They mention a "custom vector". Does that mean the only thing I have to do is paste a JSON snippet, with no other configuration? Have I understood correctly that I can add conditions, and therefore scrape any marketplace, by specifying all of this in the JSON I give to Apify? If so, what is the difference between that and the conditions I can create in n8n?

Finally, do you use Crawl4AI? I ask my chats how it differs from Apify and still can't quite understand. Apparently it's not as precise as a scraper with custom logic (like Apify), but if I added the APIs of Groq, Deepseek, or others to Crawl4AI, would that make it as capable as Apify? Thanks.
1 like • Jan 2
@David Ashby

## E-Commerce Automation Workflow & Strategy

### 1. Current Sourcing Process

I'd like to do e-commerce and use specific extensions to find products. My sourcing workflow currently relies on an image-recognition extension called **AliPrice**. When browsing a marketplace like Amazon, this tool lets me locate the same product across various other marketplaces, specifically AliExpress.

**The Bottleneck:** Once I click through AliPrice to view the product on AliExpress, I must wait for the page to load and then manually apply filters for the lowest price and specific shipping countries. I am forced to do this by hand because the extension supports neither pre-set destination-country filters nor automated ascending-price sorting to find the cheapest option.

### 2. Current Scraping & Importing Process

For the second stage, I use an extension called **DSers** to scrape product data and push it to WooCommerce or other marketplace seller accounts.

**The Limitations:**
* **Platform restriction:** it only scrapes AliExpress, not arbitrary websites.
* **Target restriction:** it cannot send data to every marketplace.
* **Bulk actions:** I cannot paste multiple URLs into the application for mass importing.
* **Manual effort:** despite the automation these two extensions provide, the process still requires a high volume of manual clicks.

### 3. Automation Objectives & Proposed Tools

My goal is to implement an **AI agent** to eliminate these repetitive clicks. I am considering two options for the sourcing and scraping phases: **Crawl4AI** or **Apify**.

**The Strategy:** As a dropshipper, I start by identifying a product available in a local market (specifically Europe), then find the cheapest version of that product within that region using filters. Once the optimal source is identified, I proceed with scraping.

I am looking for an application that makes this entire sequence as autonomous as possible.
0 likes • Jan 10
@David Ashby ok thanks
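The post-scrape cleanup this thread keeps circling around (drop duplicates, keep only chosen ship-from countries, sort by ascending price) is a small, plain post-processing step, whichever scraper produces the data. A minimal sketch, assuming the scraper emits one dict per offer; the field names `url`, `price`, and `ship_from` are invented for illustration and are not Apify's actual output schema.

```python
# Hypothetical cleanup step after scraping: dedupe by listing URL,
# filter by ship-from country, sort cheapest-first.
# Field names are assumptions, not a real scraper schema.

def clean_offers(offers, allowed_countries):
    seen = set()
    kept = []
    for o in offers:
        if o["url"] in seen:                      # duplicate listing
            continue
        seen.add(o["url"])
        if o["ship_from"] not in allowed_countries:
            continue                              # wrong shipping origin
        kept.append(o)
    return sorted(kept, key=lambda o: o["price"])  # ascending price
```

This kind of logic can live equally well in an n8n Code node or inside the scraper itself; the difference is mostly where you prefer to maintain it, not what it can do.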
help
Hello, I'm having trouble understanding how to properly create AI agents in n8n and AI Studio. I've generated a JSON file for my workflow, but there are still some things I don't understand, and I'd like to discuss this further with a real person rather than an artificial one. Thank you.
E-COMMERCE, SOURCING + SCRAPING: HOW TO DO IT WITH AN AI AGENT OR A VIBE CODING APPLICATION?
Hello.

SOURCING: I use an image-recognition extension, AliPrice, which lets me find a product on various marketplaces while I'm on a platform like Amazon. I'd like to understand what technology it uses, but more importantly whether an AI agent in n8n could replace this extension while adding automation. To maximize automation, I'm imagining a system or application that would let me paste several product URLs I'm interested in. It would then search for these products across different platforms, record the prices in an Excel file, and apply my preferred filters, such as sorting by ascending price and by country. It could even email the supplier. AliPrice doesn't let me add filters, and while I mentioned Amazon, the extension doesn't work on every marketplace.

SCRAPING: I use a second extension, DSers, to scrape products and send them to WooCommerce or another marketplace/seller account. Again, I'd like to know whether an AI agent could do the same thing while adding more features. DSers only scrapes AliExpress, not any other site, and it can't send products to just any marketplace. For bulk imports I can't paste multiple URLs at once, and even though these two extensions automate a lot, I still have to do a lot of manual clicking.

That's why I'd like to use an AI agent to avoid all these clicks. Could an AI agent in n8n help me build an application that combines both functionalities while adding more features? Or could a vibe-coding application do even better? Could you give me examples to help me understand the advantages of one over the other?

Thank you. Cyril from the Louvre, France (do you need something? ;)
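The "record the prices in an Excel file, sorted by ascending price" step described in the post above is straightforward once the per-marketplace quotes have been collected. A minimal sketch using Python's standard `csv` module (CSV opens directly in Excel); the column names are invented for illustration, and the collection step itself is out of scope here.

```python
import csv
import io

# Hypothetical reporting step: turn collected price quotes into CSV rows
# sorted by ascending price. Column names are assumptions for illustration.

def price_report(quotes):
    """quotes: list of dicts with keys product, marketplace, price, country."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["product", "marketplace", "price", "country"]
    )
    writer.writeheader()
    for row in sorted(quotes, key=lambda q: q["price"]):  # cheapest first
        writer.writerow(row)
    return buf.getvalue()
```

In an n8n workflow the same shape appears as a Sort node followed by a spreadsheet-export node; the point is that the sorting and export are simple once the hard part, gathering comparable prices per product, is solved.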
Chris Der (@chris-der-9361)
Level 2 • 14 points to level up
"Hello"
Active 8h ago • Joined Aug 11, 2025