# Run Your Own AI Agent Server on Vercel with MCP

## Metadata

- **Published:** 6/5/2025
- **Duration:** 7 minutes
- **YouTube URL:** https://youtube.com/watch?v=rUGvrzX0E6o
- **Channel:** nerding.io

## Description

Join the Community: https://nas.io/vibe-coding-retreat
Get 50% off from http://supermachine.ai/ for 1-click install MCP servers (NERDINGIO)
Newsletter: https://sendfox.com/nerdingio

Want to host your own AI agent server, fully MCP-compatible, with just a few commands? In this guide, we deploy a Model Context Protocol (MCP) server to Vercel, giving you a production-ready backend for Claude, LangGraph, or any other agent client.

This setup enables:

- Fast agent deployment
- Secure, public endpoints
- Live prompt streaming with SSE
- Integration with any MCP client (Claude, Cursor, etc.)

Ideal for:

- AI developer tools
- Custom LangGraph flows
- Claude Desktop support
- Serverless agent experiments

Links:

- Vercel Release: https://vercel.com/changelog/mcp-server-support-on-vercel
- Vercel MCP Docs: https://ai-sdk.dev/cookbook/node/mcp-tools#mcp-tools
- Deploy Template: https://github.com/vercel-labs/mcp-on-vercel/tree/main
- Newsletter: https://sendfox.com/nerdingio

Ready to host your own agent? Let me know in the comments.
Like & Subscribe for more AI agent tutorials!

## Key Highlights

### 1. Vercel Supports Stateless MCP

Vercel now supports MCP via a stateless HTTP adapter, enabling the use of SSE or HTTP streamable transports for AI agent servers.

### 2. Experimental MCP Client in AI SDK

Vercel's AI SDK has an experimental MCP client with standard in/standard out (stdio) support, allowing tools to be executed with arguments on a serverless backend.

### 3. Tool Creation and Integration

The video demonstrates creating custom tools and integrating them into a Vercel application using MCP transport, showcasing front-end interaction.

### 4. Connecting Front-End to MCP Server

The presenter shows how to connect a front-end application to an MCP server running on Vercel, invoking tools and displaying results in real time.

### 5. Future: Integrate with OpenAI

The presenter suggests future work could involve integrating the MCP setup with OpenAI for more complex AI agent interactions.

## Summary

### Video Summary: Run Your Own AI Agent Server on Vercel with MCP

**1. Executive Summary:**

This video tutorial demonstrates how to deploy a Model Context Protocol (MCP) server to Vercel, providing a production-ready backend for AI agent clients such as Claude or LangGraph. By leveraging Vercel's stateless MCP support and the experimental AI SDK client, users can create and integrate custom tools, connect them to a front-end application, and execute tool calls with arguments on a serverless backend.

**2. Main Topics Covered:**

* **Introduction to Vercel's MCP Support:** Explanation of Vercel's support for stateless MCP via an HTTP adapter using SSE and HTTP streamable transports.
* **Vercel AI SDK Experimental MCP Client:** Overview of the experimental client, featuring standard in/standard out (stdio) transport for tool execution on a serverless backend.
* **Creating and Integrating Custom Tools:** Building custom tools (e.g., adding numbers, echoing messages, getting the server time) within a Vercel application using MCP transport.
* **Connecting Front-End to MCP Server:** Connecting a front-end application to the deployed MCP server on Vercel, invoking tools, and displaying results in real time.
* **Future Integrations:** Suggestion to integrate the MCP setup with OpenAI for more complex AI agent interactions in the future.

**3. Key Takeaways:**

* Vercel now supports MCP through a stateless HTTP adapter, enabling SSE and HTTP streamable transports for AI agent servers.
* Vercel's AI SDK includes an experimental MCP client with standard in/standard out (stdio) support, allowing tools to be executed with arguments on a serverless backend.
* Users can create custom tools and integrate them into a Vercel application using MCP transport.
* Connecting a front end to the MCP server allows real-time interaction with tools.
* Future development could involve integrating this setup with OpenAI.

**4. Notable Quotes or Examples:**

* On Vercel's support: "Vercel is actually supporting MCP. And I found this pretty interesting since, basically, MCP is stateless."
* Example tools demonstrated: "a way to add (you can specifically go through Redis in order to maintain state and do your additions) as well as echoing and getting the server time."
* Showing front-end integration: "We need to make sure that we have our message (that's going to be our key), and we can just put in 'message: hello from front end', and we expect to see the call explicitly getting executed, which it is."
* Future direction: "What we can do from here is we could actually add our OpenAI and make it deterministic, to see, when we're asking a question rather than sending the payload, could we actually call out to OpenAI?"

**5. Target Audience:**

* AI developers
* Individuals interested in deploying AI agents on Vercel
* Developers working with LangGraph, Claude, or other MCP-compatible AI tools
* Those looking for serverless backend solutions for AI applications

## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to go through using Vercel's MCP server SDK and take a look at their experimental client. With that, let's go ahead and get started.

All right, so this came out a little bit ago: Vercel is actually supporting MCP. And I found this pretty interesting since, basically, MCP is stateless, so they released an adapter that lets you do an HTTP stateless transport, which means you can use SSE or HTTP streamable. This gives you the ability to create tools and to expose an MCP server through Vercel. As you can see right here, you basically have handlers for GET, POST, and DELETE, and they give you some code that you can go and run.

This repo gives you the API you would need to hit (the MCP server call), and it also has scripts to test. There are two different clients, and what I found kind of interesting is that they're not actually using the Vercel AI SDK client to test this; they're using the plain MCP SDK for SSE, and they also have an example for streamable HTTP.

So what I was interested in doing was seeing: not only do they have the streamable transport, but can we look at their MCP client tools and use them in the AI SDK? This is all experimental, but they have standard in/standard out, so you can pull transports directly into your Vercel back end, which is super interesting. Not a lot of platforms let you execute tools with different arguments on your server while being in a stateless environment.
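To make the adapter setup described above concrete, here is a minimal sketch of what a tool-serving MCP route on Vercel might look like, with tools similar to the ones demonstrated later in the video (echo, add, get server time). It assumes the `@vercel/mcp-adapter` package and its `createMcpHandler` export; exact package and function names may differ between versions of the template, and the Redis wiring the video uses to keep SSE state is omitted here.

```ts
// app/mcp/[transport]/route.ts (hypothetical path)
import { createMcpHandler } from "@vercel/mcp-adapter";
import { z } from "zod";

const handler = createMcpHandler((server) => {
  // Echo a message back to the caller.
  server.tool(
    "echo",
    "Echoes the provided message",
    { message: z.string() },
    async ({ message }) => ({
      content: [{ type: "text", text: `Tool echo: ${message}` }],
    })
  );

  // Add two numbers. The video's version keeps a running total in Redis;
  // that state handling is skipped to keep this sketch self-contained.
  server.tool(
    "add",
    "Adds two numbers",
    { a: z.number(), b: z.number() },
    async ({ a, b }) => ({
      content: [{ type: "text", text: `Result: ${a + b}` }],
    })
  );

  // Report the current server time.
  server.tool("get_server_time", "Returns the current server time", async () => ({
    content: [{ type: "text", text: new Date().toISOString() }],
  }));
});

// One handler exported for GET, POST, and DELETE, matching the stateless
// HTTP/SSE transports described above.
export { handler as GET, handler as POST, handler as DELETE };
```

The inspector test shown later ("hello birds" through the echo tool) exercises exactly this kind of handler over the SSE transport.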
They also have the ability to call out over SSE, but, at least in the experimental client that I could find, they don't have the type to do HTTP streamable. You have the ability to basically create a transport, and that gives you tools as well as the ability to use your provider and generate text. So this is like a basic implementation, and I wanted to try to create parts of it. I'm not going to go through the entire application and setup, but we're going to look and see how we can implement part of this and connect it.

So the first thing that we're going to do is look at some of the code that I put together. Basically, what I did is I created a front end and I created an action. This action goes out to the experimental transport; in this case, we're using SSE, and we're creating a transport ourselves. It has multiple different tools, which you can see. This is based on the tool that they provided: essentially, it's a way to add (you can specifically go through Redis in order to maintain state and do your additions) as well as echoing and getting the server time.

The other piece of this is that, based on the action, we're going to return the tools that we have access to, and we're going to look at those tools in the front end. So this is being processed from our front end with the actions file. Then we pull those in, which lets us simply look at the tools and, if we want, send back an action.

The first thing we're going to do, now that we have both the front end and the back end up and running, is test this in our inspector. We can go into an inspector, make a connection, and we're connected to MCP. With this, we'll see that we're pulling back our tools. So we know that we're connected to the transport, and we know that we're connected to our tools, because we can actually use them here. We can just say "hello birds", and we can see that it's executing.

So now what we're going to do is take a look at the front end and see if we can pull this back the same way we're doing it through our MCP tools, but on our own front end. This is the front end that I built. We can see that we're connecting to the server, that responses are getting pulled back, that we're echoing, and all the tools that we're getting back specifically. We can even add in and make a tool call. So what we're going to do is mimic what we did in the inspector. We need to make sure that we have our message (that's going to be our key), and we can just put in 'message: hello from front end', and we expect to see the call explicitly getting executed, which it is.

So again, this is how we can create a full MCP setup, both transport and client, in Vercel specifically. What I did was: I created the tools; they go out to an actions file; that actions file goes to the transport; it looks at those tools and then, based on the execution coming in, executes.
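For reference, the actions file the walkthrough describes could be sketched roughly as below, following the AI SDK MCP-tools cookbook linked in the description: `experimental_createMCPClient` with an SSE transport pointed at the deployed endpoint, listing the server's tools, and optionally handing them to `generateText` (the OpenAI step the presenter raises next as future work). The URL, model choice, and function names here are illustrative placeholders, and the experimental API surface may change between AI SDK releases.

```ts
// app/actions.ts (hypothetical): Next.js server actions talking to the MCP server
"use server";

import { experimental_createMCPClient, generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Placeholder URL for the MCP server's SSE endpoint deployed on Vercel.
const MCP_SSE_URL = process.env.MCP_SSE_URL ?? "https://your-app.vercel.app/sse";

// List the tool names exposed by the MCP server (what the front end renders).
export async function listMcpTools(): Promise<string[]> {
  const client = await experimental_createMCPClient({
    transport: { type: "sse", url: MCP_SSE_URL },
  });
  try {
    const tools = await client.tools(); // AI SDK-compatible tools keyed by name
    return Object.keys(tools);
  } finally {
    await client.close();
  }
}

// Let a model decide when to call the MCP tools -- the "add OpenAI and make it
// deterministic" direction mentioned at the end of the video.
export async function askWithTools(prompt: string): Promise<string> {
  const client = await experimental_createMCPClient({
    transport: { type: "sse", url: MCP_SSE_URL },
  });
  try {
    const tools = await client.tools();
    const { text } = await generateText({
      model: openai("gpt-4o"), // illustrative model choice
      tools,
      maxSteps: 2, // allow a tool call plus a final answer
      prompt,
    });
    return text;
  } finally {
    await client.close();
  }
}
```

In the video, the action instead forwards the front end's raw payload (for example `{ message: "hello from front end" }`) straight to the chosen tool without a model in the loop; the `generateText` variant above corresponds to the OpenAI extension discussed in the closing section.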
What we can do from here is we could actually add our OpenAI and make it deterministic, to see, when we're asking a question rather than sending the payload, could we actually call out to OpenAI? And again, here's our transport, so that we're able to look at our tools.

All right, that's it for us today, everyone. If you haven't already, please remember to like and subscribe. What we went through was the Vercel MCP server as well as the Vercel AI SDK experimental MCP client. With that, happy nerding.

---

*Generated for LLM consumption from nerding.io video library*