# Multiple Ways to Use MCPs with Stripe

## Metadata

- **Published:** 11/18/2025
- **Duration:** 16 minutes
- **YouTube URL:** https://youtube.com/watch?v=rtGlC0-GuJw
- **Channel:** nerding.io

## Description

In this video, we explore the ways you can use the Stripe MCP (Model Context Protocol) server to supercharge your AI agents, automations, development cycle, and workflows.

The Stripe MCP server exposes Stripe's APIs — billing, subscriptions, customers, products, and invoices — as standardized MCP tools. That means your AI model (Claude, Cursor, LangGraph, n8n, or any MCP-compatible client) can interact with Stripe safely and intelligently.

This lets your AI agents take action, not just talk. With standardized access to payment tools, you can give your assistants, dashboards, or automation bots transactional superpowers — securely, without writing direct API calls yourself. It's the bridge between AI reasoning and real-world business operations.

## Key Highlights

### 1. MCP: Bidirectional Communication Protocol

MCP facilitates two-way communication between clients (like AI agents) and servers, enabling data ingestion, action execution via tools, and notifications.

### 2. MCP vs. Traditional API Tools

MCP offers a unified, standardized way to exchange information, compared to individual, one-way API tool connections, streamlining agent integration and development.

### 3. Cursor Integration with MCP

Cursor uses MCP integrations (e.g., GitHub, Supabase, Stripe, Context7) to extend its capabilities, allowing users to interact with these services directly within the IDE.

### 4. Local vs. Remote MCP Servers

MCP servers can run locally (standard in/out) or be accessed remotely via URLs (HTTP streamable), which affects security and authentication methods.

### 5. MCP in AI Chatbots

MCP clients can be embedded in AI chatbots to access context and execute actions, such as retrieving order information or placing new orders via Stripe, enhancing user support.

## Summary

### Stripe MCP Video Summary: Supercharging AI with Standardized Payment Tools

**1. Executive Summary:**

This video explains how Stripe's Model Context Protocol (MCP) server can be used to integrate Stripe's payment functionality into AI agents and workflows. MCP provides a standardized, secure, bidirectional communication channel that lets AI interact with Stripe's APIs for actions like creating products, managing subscriptions, and retrieving customer data, enhancing AI's transactional capabilities.

**2. Main Topics Covered:**

* **Introduction to MCP:** Definition, purpose, and benefits of MCP as a communication protocol between AI agents and servers.
* **MCP vs. Traditional API Tools:** Differences between MCP's unified, standardized approach and individual API tool connections.
* **Cursor Integration:** Using MCP integrations (e.g., GitHub, Supabase, Stripe, Context7) within the Cursor IDE.
* **Local vs. Remote MCP Servers:** Differences between standard in/out (local) and HTTP streamable (remote) MCP servers, and their security implications.
* **MCP in AI Chatbots:** Integrating MCP clients within AI chatbots to retrieve context and execute actions (e.g., order information, product creation) via Stripe.
* **Practical Examples:** Demonstrations of using Context7 for Stripe documentation retrieval and creating a Stripe product directly from Cursor using MCP.
* **Implementing MCP in Code:** Using the AI SDK to add MCP clients to existing code projects.
**3. Key Takeaways:**

* **Standardized Communication:** MCP offers a unified, standardized way for AI agents to interact with various services, streamlining development and integration.
* **Bidirectional Interaction:** MCP facilitates two-way communication, enabling data ingestion, action execution, and notifications between clients and servers.
* **Action-Oriented AI:** MCP allows AI agents to take actions (e.g., creating products, managing subscriptions) instead of just providing information.
* **Enhanced Security:** MCP helps provide secure access to payment tools without direct API calls.
* **Flexibility:** MCP servers can be installed locally or accessed remotely, offering flexibility in deployment and security.
* **Improved User Experience:** MCP enhances user support and engagement by allowing chatbots to access context and execute actions via Stripe.

**4. Notable Quotes or Examples:**

* "MCP is basically a system that allows us to either ingest more data or take action on different kinds of data. It is a bidirectional communication protocol that allows us to have a client, which is something like Cursor or an AI agent, communicate with a server that has the ability to get context or take action through tools, resources, or prompts."
* "The difference between MCP and tools is pretty interesting... when you have a tool, you basically have a single connection to an API... what MCP is doing is allowing us a unified way to send information."
* Example: Creating a product directly through Cursor using the Stripe MCP integration.
* "The ability to actually plug and play these different types of MCP servers gives a lot of added value to the overall experience of the application."

**5. Target Audience:**

* AI developers and engineers
* Individuals interested in integrating AI agents with payment gateways
* Developers working with Stripe's APIs
* Those interested in using MCP to supercharge AI agents, automations, development cycles, and workflows
* Users of Cursor or other MCP-compatible IDEs and platforms

## Full Transcript

Hey everyone, welcome to Nerding.io. I'm JD, and today we're going to go through the multiple different ways that you can use MCPs to build out a payment gateway. That means we can look at how to use the Stripe documentation, how to use the Stripe MCP, and how to actually implement it into our chatbot as well. With that, let's go ahead and get started.

Okay, so the last thing that we're going to talk about is Model Context Protocol, or MCP. MCP is basically a system that allows us to either ingest more data or take action on different kinds of data. It is a bidirectional communication protocol that allows us to have a client, which is something like Cursor or an AI agent, communicate with a server that has the ability to get context or take action through tools, resources, or prompts. There are some other things it can do, like notifications, where the server can send something to the client to request elicitation or get a response.

Real quick everyone, if you haven't already, please remember to like and subscribe. It helps more than you know. Also, please check out the Vibe Coding Retreat community. We're going to have different types of AI challenges, courses, and all kinds of digital assets. And with that, let's get back to it.
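To ground the client and server pieces just described, here is a minimal sketch of an MCP client in code, using the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`). The server command, tool name, and arguments are placeholders, and the import paths follow the SDK's documented layout but should be verified against the version you install:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch a local MCP server over standard in/out (placeholder command).
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "some-mcp-server"], // hypothetical server package
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover what the server exposes...
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // ...and invoke one of them (tool name and arguments are hypothetical).
  const result = await client.callTool({
    name: "get_order_status",
    arguments: { orderId: "order_123" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```

The same client object can also receive notifications from the server, which is what makes the protocol bidirectional rather than a one-way API call.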
How we got to MCP was basically an evolution of different types of things that you can do with AI. We started with GPTs, then we had the Assistants API, which gives us assistants, threads, runs, messages, and queues, and it gives us tools. And right now we're at more of a universal standard called MCP.

The difference between MCP and tools is pretty interesting. When you have a tool, you basically have a single connection to an API. It is a one-way connection where you specifically call out to the API and then pull data or do an action, but you have to write one tool per client. This makes it really difficult because there's no universal standard for writing that tool in your agent or in things like Cursor. What MCP is doing is allowing us a unified way to send information. Each of these services sends through the MCP server, and the LLM can communicate with this MCP server, or with multiple MCP servers. So it's a way to either get data or take action.

So what we're going to look at is how you can actually use this in Cursor, and there are a few different ways. The first thing you want to do is hit Command+Shift+P and come into your Cursor settings. That will give you the MCP integrations right here. The ones that I'm going to look at are these three; they're things we've been using throughout our project. We could use the GitHub one, which gives us the ability to run commands against GitHub, either pulling information in or actually executing different types of commands. We can also run commands against Supabase, so we can look at different types of docs, execute queries, and assign this to a particular project. Stripe also gives us the ability to do things like create products. So, for instance, if we wanted to create a product or even a customer, we could do that using the Stripe MCP in Cursor. The last one is called Context7. This is actually one of my favorite tools. It allows you to go out and search for different types of libraries. For instance, if we have particular packages that we're using, say for Next.js, we can get the most relevant information, or even things like Stripe.

So we're going to go through a scenario: if I wanted to use Stripe as a connection, how could I do that? To set this up, you can come in here and either turn these servers off and on, or you can even turn off individual tools. So maybe I don't want to list my customers and I don't need to list invoices, and it will turn those off. You can only have about 40 tools on, and that's not necessarily a limitation of Cursor; it's actually a limitation of the LLM. What's happening is that, based on these functions and the description of what each tool does, the AI is now aware that it has access to these tools. They're discoverable, which means it can search the function name or even the description and say: I have a particular task I'm trying to resolve, what tools are available to me? And it will decide how and what to execute. So it's a non-deterministic path of access to tools. If you want to edit these tools and turn them on and off, you can also edit them here; you just press the configuration button, and that will pull up your servers.
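As a rough sketch, that configuration file typically looks something like the following, with one server run locally over standard in/standard out via npx and one reached remotely over a URL. The package name and URL shown follow the providers' public docs, but double-check them before use; a locally installed Stripe server would instead carry its secret key, for example through an environment-variable block in this same file.

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "stripe": {
      "url": "https://mcp.stripe.com"
    }
  }
}
```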
So you can actually install them by going to context7.com and pulling back the way to install it. Now, there are two different types of MCP servers. There's standard in/standard out, which basically means it's running on your local machine. That's why I have an npx command: it's going to run the npm executable to install this package and run it. You can also have something called HTTP streamable. This means it's actually going out to a URL; nothing is installed or run locally, and it is a remote-based MCP server. So when you see a command, that's standard in/standard out, which means you're running it locally. When you see a URL, that means you're handling it remotely. There are a lot of different security practices around this, but that's the main thing to understand. The difference is that with a URL you can authenticate through OAuth, and when you're local, you need to use things like environment variables. I'm not going to show the environment variables, but basically there's a key here for Stripe. The other thing is that if you go to the MCP that Stripe offers, you can see that it's a URL, which means it will allow you to log in without storing your environment variables locally. You can just add it, and then in your Cursor configuration it will give you a little connect button to authenticate with. So those are the main differences between standard in/standard out and HTTP streamable: basically, one runs locally and one runs remotely.

So now what I can do is actually use Context7 and try to get more information about Stripe in general. We're going to say: use the Context7 MCP to get the latest docs on Stripe metered billing. What this will do is execute a tool to get the information it needs. Now it's calling the tool it has. Remember, we have two tools here: resolve library and get docs. It's resolved that it has access to the Stripe docs by looking up the name, and it was able to pull back information. Now it's actually getting the documentation and looking for things about metered billing. So we can see metered price, and it's putting together some information. Great. And now it's actually giving us code examples based on the most recent documentation for Node.js. So as you're putting together your implementations for the buildathon, you can leverage Context7 to ask it questions: how can I do this particular task? Use the most up-to-date libraries. That way, when you're depending on the package you've installed, you're able to actually put it together.

The other thing that you can do with MCPs, for instance with Stripe right here where we have multiple different tools, is to actually take action. We're going to try to create a product. So I'm just going to ask it to create a simple product and we'll see what we get. Now we're taking action. Before, we were still using a tool, but we were bringing in context; now we're doing an action to actually send something to Stripe. We'll say a metered product, or actually let's just do an amount per mile. So now it's going to go out and plan this again. It's evaluating what tools it has available. You can see that this is the MCP Stripe create product tool.
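For reference, what that tool (and the create-price call that follows) does maps onto Stripe's regular Node.js API. A minimal sketch of the equivalent direct calls, assuming the official `stripe` npm package and a test-mode secret key in the environment, might look like this; the product name and the per-mile amount are made up for illustration:

```typescript
import Stripe from "stripe";

// Assumes STRIPE_SECRET_KEY is set in the environment (test mode).
const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

async function createMileageProduct() {
  // Roughly what the Stripe MCP "create product" tool does on our behalf.
  const product = await stripe.products.create({
    name: "Mileage",
    description: "Per-mile usage charge",
  });

  // Followed by "create price": a metered, monthly recurring price.
  // Note: on newer Stripe API versions, a metered price may also need to
  // reference a billing meter; that setup is omitted from this sketch.
  const price = await stripe.prices.create({
    product: product.id,
    currency: "usd",
    unit_amount: 50, // 50 cents per mile; illustrative value
    recurring: { interval: "month", usage_type: "metered" },
  });

  return { productId: product.id, priceId: price.id };
}

createMileageProduct().then(console.log).catch(console.error);
```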
It's also going to create a price. So not only does it know that it needs to create a product, but based on the APIs and the things it has access to, it is now going to build a mileage product. So now we have a mileage meter product, and we can see the result. It's going ahead and creating the price, and we have the created product and price IDs in case we need them. It's even given us code based on the application context we had before, along with instructions on how to implement it, and then it asks: do you want me to create a meter and wire this up? So this would be a way to use that kind of MCP; you're using MCPs to build your product.

Now there's more that we can do with MCP, because we can actually include a client in our code. Like I said, Cursor is an MCP client that can leverage MCP servers, and there's nothing that says you can't use an MCP client inside your own code. A lot of different agents are actually using MCP to communicate, grab context, and take action. So what we're going to do is look at how you could do that here.

In the case of our AI chatbot, we actually have, inside of the AI SDK, this thing called MCP tools. Now, there are regular tools inside of the AI SDK as well, and there are multiple examples of using those in here. The difference is the syntax. If you look at how a regular tool does its work, it's a bit different: we say we need a tool, we define our schema, and we post that information. So that's much more of a traditional API request. You can even use tools in multiple steps. But in the AI SDK, they have a way to do multiple things. You can either create your own MCP server, which means actually building out a server and deploying it, or you can create an MCP client using a transport. In this case a URL means we're using a remote server; above, we were using a standard in/standard out client, which assumes the server is installed locally, and we can call out to that particular client. So we could say we're calling out to Stripe, and based on the tools available for this MCP client, we want to make those available, or discoverable, to the model. What you would do is say that all of these tools are available, and when they're done, go ahead and close the transport. One of the things that's different about MCP is that it allows a session, so you can maintain some sort of state. So you have access to all your tools; they are now on your streamText call, and you can access them through your chat.

So why would you want to do that? Well, let's say we go back to our application and we're inside the consumer side of our Sky Market, or perhaps we're on the provider side. Instead of just having a chatbot that we can interface with, maybe we want an assistant that lets you pull back information about the order in the chat, so that you can assist the person and understand more about their active orders, or even offer them a new product or service by integrating with this agent. The agent itself could now call out to Stripe to either access information about their particular orders or actually order some new product for the user who's trying to get assistance.
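As a rough sketch of that pattern with the Vercel AI SDK's experimental MCP client; the import paths, option names, and model shown are assumptions that can vary across AI SDK versions, and the remote Stripe MCP URL should be confirmed against Stripe's docs:

```typescript
import { experimental_createMCPClient, streamText, type CoreMessage } from "ai";
import { openai } from "@ai-sdk/openai";

export async function answerWithStripeTools(messages: CoreMessage[]) {
  // Connect to a remote MCP server over SSE; a Streamable HTTP or stdio
  // transport could be used instead for a locally installed server.
  const mcpClient = await experimental_createMCPClient({
    transport: {
      type: "sse",
      url: "https://mcp.stripe.com", // assumed remote Stripe MCP endpoint
    },
  });

  // Expose every tool the MCP server advertises to the model.
  const tools = await mcpClient.tools();

  const result = streamText({
    model: openai("gpt-4o"), // placeholder model choice
    tools,
    messages,
    // Close the MCP transport once the response has finished streaming.
    onFinish: async () => {
      await mcpClient.close();
    },
  });

  return result.toTextStreamResponse();
}
```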
This is the huge benefit of MCP: the same tool that we're using in our Cursor client is the same tool that we can leverage inside of our AI agent. Now, of course, that adds some complexity, and there are security, authentication, and permission issues that we need to be aware of. So there are definitely some security considerations, but the ability to actually plug and play these different types of MCP servers gives a lot of added value to the overall experience of the application.

---

*Generated for LLM consumption from nerding.io video library*