# Build a sales prospect chat in 5 min with MCP Resources, #n8n, and Firecrawl

## Metadata

- **Published:** 3/24/2025
- **Duration:** 8 minutes
- **YouTube URL:** https://youtube.com/watch?v=28M5SwKsZpA
- **Channel:** nerding.io

## Description

Join the Community: https://nas.io/vibe-coding-retreat

50% off one-click remote-hosted MCP servers: use code NERDINGIO at https://supermachine.ai/

📞 Book a Call: https://calendar.app.google/M1iU6X2x18metzDeA
📰 Newsletter: https://sendfox.com/nerdingio

In this lightning-fast tutorial, we show you how to build a fully functional chatbot in under 5 minutes using:

🧠 Model Context Protocol (#mcp) Resources
🔁 n8n Workflows
📄 LLM full-text instructions

Whether you're building for support, automation, or internal tools, this setup gives you a flexible, AI-powered chatbot that's easy to configure and extend, with no custom backend required.

🚀 What You'll Learn:

✅ How to connect MCP Resources to your n8n workflows
✅ Using LLM full-text resources to craft rich, dynamic AI responses
✅ Sending and receiving chat messages via standardized MCP flows
✅ Deploying a lightweight, scalable chatbot that can run anywhere

💡 Why Use MCP + n8n?

Model Context Protocol gives your chatbot standardized access to tools, data, and LLMs through one unified interface. Combined with n8n, you can visually automate flows, respond to user inputs, and call external services with ease.

🎥 Chapters

- 00:00 Introduction
- 00:25 Demo
- 02:22 Logic
- 03:07 Server
- 05:22 Message Flow
- 06:13 Inspector
- 07:04 n8n
- 08:08 Conclusion

🔗 Links

- Source: https://github.com/nerding-io/mcp-sse-example
- Spec: https://spec.modelcontextprotocol.io/specification/2024-11-05/server/tools/

⤵️ Let's Connect

- https://everefficient.ai
- https://nerding.io
- https://twitter.com/nerding_io
- https://www.linkedin.com/in/jdfiscus/
- https://www.linkedin.com/company/ever-efficient-ai/

## Key Highlights

### 1. Turn Websites into Chatbots in Minutes

The video demonstrates how to quickly convert a website into a chatbot using MCP Resources, n8n, and Firecrawl, streamlining information access and doubling as a prospecting tool.

### 2. MCP Resources for Contextual AI

MCP resources provide context to AI agents, enabling them to answer questions based on scraped website data. The demo uses a static resource generated from an LLM full-text scrape.

### 3. Firecrawl for Full-Text Scraping

Firecrawl is used to scrape an entire website into a single markdown file, which then becomes a resource for the AI agent, allowing efficient information retrieval.

### 4. Resource Subscription for Updates

MCP's resource subscription feature lets clients be notified when a resource is updated, enabling real-time data integration and dynamic chatbot responses.

### 5. Resources vs. Tools for AI Agents

The video highlights the difference between tools and resources for AI agents: tools perform actions, while resources provide context directly, working the same way over SSE as they do over stdin/stdout.

## Summary

**1. Executive Summary:**

This video demonstrates how to quickly build an AI-powered chatbot using Model Context Protocol (MCP), n8n, and Firecrawl. The chatbot leverages website scraping to provide context to the AI agent, enabling it to answer questions and serve as a prospecting tool without requiring a custom backend.

**2. Main Topics Covered:**

* **Introduction to MCP Resources:** Explanation of how MCP resources provide context to AI agents, enabling them to answer questions based on scraped website data.
* **Firecrawl for Full-Text Scraping:** Demonstration of using Firecrawl to scrape an entire website into a single markdown file, which then becomes a resource for the AI agent.
* **n8n Workflow Integration:** Using n8n to connect the chat interface to the MCP resource, enabling real-time data access and dynamic chatbot responses.
* **MCP Resource Subscription:** Overview of MCP's resource subscription feature, allowing clients to be notified when a resource is updated.
* **Resources vs. Tools:** Highlighting the difference between tools and resources for AI agents; tools perform actions, while resources provide context directly.
* **Demo of the Chatbot:** Real-time demonstration of the chatbot answering questions based on scraped website data.
* **Server Configuration:** Overview of how the MCP server is configured to serve the scraped website content as a resource.

**3. Key Takeaways:**

* MCP Resources, n8n, and Firecrawl can be used to rapidly build chatbots that leverage existing website content.
* MCP provides a standardized interface for AI agents to access data and tools.
* Firecrawl enables efficient scraping of entire websites into a single, usable text file.
* n8n facilitates visual automation of flows, making it easy to connect the chatbot interface to the backend resource.
* MCP resource subscriptions allow chatbot context to be updated dynamically.

**4. Notable Quotes or Examples:**

* "So what we're going to do is we're going to take this website and turn it into a chatbot, which is a pretty simple task." (Introducing the goal of the video.)
* "Firecrawl ... [will] scrape the entire website and put it into a single text file ... that we can then take and put into our context." (Explaining the role of Firecrawl.)
* "It's very similar to a tool, except our tools have been doing actions for us, whether that be hitting an API or something else; resources work the same way on SSE as they do on standard in/standard out." (Differentiating between tools and resources.)
* Example question: "What was the latest?" (Demonstrating the chatbot's ability to extract relevant information from the scraped resource.)
* Example question: "Have contact?" (Demonstrating the chatbot's ability to find and extract contact information.)

**5. Target Audience:**

* Software developers
* AI/ML engineers
* Automation specialists
* Sales and marketing professionals
* Individuals interested in building AI-powered chatbots
* Anyone looking to streamline information access using AI

## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to continue our MCP series, specifically looking at resources. The example we're going to do uses an LLM full text: we're going to scrape either your own site, or potentially a sales prospect's site, and turn that into a chatbot. So with that, let's go ahead and get started.

All right. So what we're going to do is take this website and turn it into a chatbot, which is a pretty simple task, but we're going to do it using MCP resources. This is a side project that a buddy and I have where we do different games and one-off, standalone projects. The concept will be that we're either prospecting this site, or we could actually turn this into a chatbot for our own site. We're going to use n8n, and we're actually going to use MCP. I'm just going to go through a quick demo of what I'm talking about. Basically we have this agent and we have a resource, and this resource is static; we're looking at documentation for the I-75 corridor. So what we can do here is ask something like "what was the latest," and it goes out, gets information from the resource, and pulls it back. Now it's showing the different projects that have come out. We can also do things like, if we were prospecting, ask "have contact?" Now we get information about email. This isn't actually going out and pulling from the site; what we did is we actually scraped all of that information in one go
with an LLM.

Real quick, everyone: if you haven't already, please remember to like and subscribe; it helps more than you know. Also, please go check out Text Yourself. It's a simple application that I built that helps keep me on track: all you have to do is SMS different tasks and reminders that you need sent back to yourself. With that, let's get back to it.

...full text. What that means is that we go out and collect all the information from this site, and then we turn it into a resource on an MCP server that we're already using, and then just have a simple prompt. In our resource we're actually using the local MCP SSE server; we have that running, and we're telling it to go out and grab this URI. You can also use the AI emoji. But what does all this mean? If we go into our server, we can take a look at how we're actually serving this file up. We have our tools from last time, we have a couple of different services, and we're saying we want to pull this file. The way that I made this file (go ahead and delete it) is using Firecrawl and doing an LLM full text. I'm just going to use the command line and actually npx into this. You can see the command that I ran last time: all I did was give it the output folder and the URL that I want to scrape. This is going to scrape the entire website and put it into a single text file, and then we're going to use that text file as a resource in our AI. You do also have to pass an API key here, but this is the actual call. You can also get it through an API, but I'm just going to do it through the CLI. This will take a second and then it'll start generating. There we go; now we've already got it back. In our txt we have our documentation: we have the I-75 corridor full text, and all it is is markdown. It's literally taking the entire site and putting it into a markdown file that we can then take and put into our context. So in order to do that, what we're going to do is we look at this
server, and we have two different resources, and we're pulling this information back. Really, all we're saying is: this first one is just a config; it's just a straight, flat-text example. And the second one is basically taking the path for a file; we're then saying that's our context, and we're defining our MIME type. Remember that resources can have different MIME types; we're going to go into what that looks like in a different video. We can then serve that up, still using our file context, and we can actually pipe that into our system.

If we go and look at the way that this resource flows back and forth, and we look at the spec: basically we have our client, and we have resource discovery, which is going to get us a list of resources. Based on that list of resources, we can then do resource access: we read the resource, it gives us the contents of that resource, and we can then use that in our client or in our agent. There are also some advanced pieces, so we can actually subscribe. If any of those files are then updated (or maybe there are log files or something in the folder), we can actually get notified back in our client that something was updated. That's where we would subscribe to the resource, and then, as that resource was actually updated, we would get a response back to let us know to pull the list again.

If we want to see this in the inspector, what we can do is connect to our localhost, but also go out to the Docker implementation that we did in the previous video. We're going to go ahead and connect. We can get a list of our resources. We can see that we're going to read; this is the configuration, and it's just simple text. And with the documentation, it knows to actually pull this corridor file; this is our URI, and we know it's plain text. And here we go: we have the markdown. That's how we actually provide context to our agent. So if we want to use this URI, we can go back to n8n
and that's where our resource is. This is where we got our resource URI, and this is how we were able to pull that context back into our chat. If you notice, what happened was: it started with our chat, went to our agent, grabbed information from the resource, pulled that back, and put it into the model. It's very similar to a tool, except our tools have been doing actions for us, whether that be hitting an API or something else; resources work the same way on SSE as they do on standard in/standard out. But they're incredibly useful, and it's one of the things that I think is really awesome: you can actually pull information back directly without it being a tool. One of the things that they talk about is, again, you can use things like databases, you can use images and PDFs, and we'll go through that in the next video.

All right, that's it for us today, everyone. What we went through was using Firecrawl to pull in somebody's entire site, create an LLM full text, turn that into a resource for MCP, and connect it into n8n. With that, happy nerding!

---

*Generated for LLM consumption from nerding.io video library*
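## Appendix: Illustrative Sketches

The transcript mentions that the Firecrawl scrape can be done "through an API" instead of the CLI. The sketch below shows roughly what that HTTP call looks like; it is a hedged illustration, not the exact command from the video. The endpoint URL, the `formats` field, and the `fc-...` key prefix are assumptions about Firecrawl's v1 API and may differ from your account's version.

```python
import json
import urllib.request

# Assumed Firecrawl v1 scrape endpoint; verify against your API docs.
FIRECRAWL_API = "https://api.firecrawl.dev/v1/scrape"

def build_scrape_request(url: str, api_key: str) -> urllib.request.Request:
    """Build a request asking Firecrawl to return a page as markdown."""
    body = json.dumps({"url": url, "formats": ["markdown"]}).encode()
    return urllib.request.Request(
        FIRECRAWL_API,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Build (but do not send) a request for an example site.
req = build_scrape_request("https://example.com", "fc-YOUR-KEY")
# urllib.request.urlopen(req) would return JSON containing the markdown,
# which you would then save as the text file the MCP server serves up.
```

The CLI path shown in the video does the same job for a whole site at once; the API variant is handy when you want to refresh the scraped resource programmatically.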
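The discovery-then-access flow described in the transcript maps to two JSON-RPC methods in the MCP spec: `resources/list` and `resources/read`. Here is a minimal stdlib sketch of a server-side dispatcher serving a scraped markdown file as a `text/plain` resource; the file URI, resource name, and in-memory content are illustrative stand-ins, not the repo's actual code.

```python
# Illustrative stand-in for the scraped full-text file on disk.
SCRAPED_DOCS = {
    "file:///output/llms-full.txt": "# Example Site\nLatest project: ..."
}

def handle(request: dict) -> dict:
    """Dispatch the two resource methods from the MCP 2024-11-05 spec."""
    if request["method"] == "resources/list":
        # Discovery: advertise each resource with its URI and MIME type.
        result = {"resources": [
            {"uri": uri, "name": "Site docs", "mimeType": "text/plain"}
            for uri in SCRAPED_DOCS
        ]}
    elif request["method"] == "resources/read":
        # Access: return the file contents for the requested URI.
        uri = request["params"]["uri"]
        result = {"contents": [
            {"uri": uri, "mimeType": "text/plain", "text": SCRAPED_DOCS[uri]}
        ]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# Client side: discover the resource, then read it into the agent's context.
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "resources/list"})
uri = listing["result"]["resources"][0]["uri"]
doc = handle({"jsonrpc": "2.0", "id": 2, "method": "resources/read",
              "params": {"uri": uri}})
```

In the real setup the same messages travel over SSE or stdio; as the transcript notes, resources behave identically on either transport.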
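The subscription piece mentioned in the transcript uses `resources/subscribe`, after which the server pushes a `notifications/resources/updated` notification when the file changes, and the client re-reads the resource. A sketch of the two message shapes (the URI is the same illustrative placeholder as above):

```python
# Client asks the server to watch this resource for changes.
subscribe = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "resources/subscribe",
    "params": {"uri": "file:///output/llms-full.txt"},
}

# Later, the server pushes this when the file is updated.
# Notifications carry no "id" because they expect no response.
updated = {
    "jsonrpc": "2.0",
    "method": "notifications/resources/updated",
    "params": {"uri": "file:///output/llms-full.txt"},
}

# On receipt, the client issues a fresh resources/read for that URI,
# so the chatbot's context tracks the latest scrape.
```

This is what makes the chatbot's answers stay current without rebuilding anything: re-run the Firecrawl scrape, and subscribed clients are told to pull the new text.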