# How to Build ChatGPT Apps with Next.js and Vercel: Full Technical Tutorial

## Metadata

- **Published:** 10/21/2025
- **Duration:** 13 minutes
- **YouTube URL:** https://youtube.com/watch?v=RlXbdsvvxOI
- **Channel:** nerding.io

## Description

Want to build a full-featured, interactive app that runs natively inside ChatGPT? This video is your complete guide! We will show you exactly how to integrate a modern Next.js application with the OpenAI ChatGPT Apps SDK and the Model Context Protocol (MCP). This isn't just static content—it's a powerful web application that works seamlessly within the chat window.

---

Integrating a complex framework like Next.js into ChatGPT's interface poses real technical challenges because of ChatGPT's strict triple-iframe security architecture. Ready to build? Let's dive in!

**Resources:**

* Vercel Blog Post (Deep Dive Source): https://vercel.com/blog/running-next-js-inside-chatgpt-a-deep-dive-into-native-app-integration
* Vercel ChatGPT Apps SDK Starter Template: https://vercel.com/changelog/chatgpt-apps-support-on-vercel

## Key Highlights

### 1. Vercel Simplifies ChatGPT App Deployment

Vercel provides tooling to easily deploy ChatGPT apps using MCP (Model Context Protocol), making development more accessible.

### 2. MCP Enables Interactive Chatbot Experiences

ChatGPT apps use MCP to create interactive experiences. The tutorial shows how to call tools, request user info, and update webpages in real time within the ChatGPT interface.

### 3. Resource Templates for Dynamic HTML

MCP leverages resource templates, in this case HTML files, allowing developers to send dynamic data to be rendered within the ChatGPT app interface. The app registers and uses a 'content widget' resource.

### 4. Chaining Tools and Resources

MCP supports chaining actions: a tool call can trigger a resource, or a resource can trigger another tool call, enabling complex interactions within the chatbot application.

### 5. MCP Inspector for Debugging

The tutorial uses the MCP Inspector to show how to inspect resources and tool calls, which helps developers understand the data flow and debug their applications.

## Summary

Here's a comprehensive summary document for the provided video content:

**1. Executive Summary**

This video provides a technical tutorial on building and deploying ChatGPT apps using Next.js and Vercel, leveraging the Model Context Protocol (MCP). It demonstrates how to create interactive experiences within the ChatGPT interface by integrating a Next.js application, handling complex security architecture, and utilizing Vercel's ChatGPT Apps SDK.

**2. Main Topics Covered**

* **Introduction to ChatGPT App Development with Vercel:** Overview of Vercel's support for ChatGPT apps and the benefits of using MCP for interactive chatbot experiences.
* **Setting up the Development Environment:** Explanation of utilizing a Next.js template, deploying it on Vercel, and understanding the core components like the MCP route, OpenAI metadata, and CORS middleware.
* **Integrating with ChatGPT:** Demonstrating how to connect the deployed Next.js app to ChatGPT using connectors and configuring the necessary settings in developer mode.
* **MCP Deep Dive:** Detailed explanation of the Model Context Protocol (MCP), including resource templates (HTML files), tool calls, chaining actions, and how data is passed between the ChatGPT interface and the Next.js application.
* **Debugging and Inspection:** Using the MCP Inspector to examine resources, tool calls, and data flow for debugging and understanding the application's behavior.

**3. Key Takeaways**

* Vercel simplifies ChatGPT app deployment by providing tooling and support for MCP.
* MCP allows developers to create interactive and dynamic chatbot experiences by enabling real-time updates and communication with web applications within ChatGPT.
* Resource templates, specifically HTML files, are used to render dynamic data within the ChatGPT app interface.
* MCP supports chaining actions, allowing for complex interactions between tools and resources.
* The MCP Inspector is a valuable tool for debugging and understanding the data flow within ChatGPT applications.
* Next.js apps integrated with ChatGPT require specific configuration for CORS and metadata to function correctly.

**4. Notable Quotes or Examples**

* "What this means is you could actually build out an application through MCP and host it on Vercel that will actually allow you to interact with an app on ChatGPT, the website." (Explains the core purpose of the tutorial.)
* "We're going to do a bit of a hello world where it's going to call out to a tool, ask for a user's name, then display it on the web page." (Describes a practical example of MCP integration.)
* "Basically these ChatGPT apps are all running on MCP and they're returning a resource that is associated with this information that allows ChatGPT to render your HTML but also use the tools to send that information over." (Summarizes the core functionality.)
* Example: The demo application is able to show a user's name on an HTML webpage based on the name the user inputs inside of the chat window.

**5. Target Audience**

* Web developers interested in building and deploying applications for ChatGPT.
* Developers familiar with Next.js and Vercel who want to integrate their applications with the ChatGPT platform.
* Individuals seeking to understand the technical aspects of ChatGPT app development and the Model Context Protocol (MCP).
* Those looking for a practical guide on building interactive experiences within the ChatGPT interface.

## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to go through how you can actually build a ChatGPT app and deploy it on Vercel.
What this means is you could actually build out an application through MCP, host it on Vercel, and interact with it on ChatGPT, the website. And with that, let's go ahead and get started.

All right. So a little bit ago, a couple of days ago actually, this blog came out about how Vercel is going to support ChatGPT apps. This was just days after the announcement from ChatGPT, and it's really awesome. If you saw the previous video, where we built a deep researcher on your own data through ChatGPT, this follows a very similar pattern. These apps are all delivered with MCP, and this support on Vercel makes it super easy for anyone to get started. You could even vibe code your own kind of application and use Vercel to deploy it. So we're going to go through this example and see what we can come up with.

The thing to understand is that we're going to be using a Next.js template, and we're going to use the OpenAI sandbox instead of an iframe. What we're expecting is that when we have a conversation with OpenAI and we try to talk with our app, we'll actually be able to see something displayed, something that we can interact with, in the ChatGPT application. So what we're going to do is take a look at this template and see what we can come up with. Again, this is using the MCP handler that Vercel has put together.

If you look at the ChatGPT app with Next.js demo, it's pretty simple, right? It's just a landing page. But you can see right here they're saying the name is the one returned with the tool call. So what this means is we're going to actually use MCP. We're going to do a bit of a hello world where it's going to call out to a tool, ask for a user's name, then display it on the web page. And in order to do that, we're going to take a look at this template.
Right down here is the GitHub repo. The way to look at this is that it's just a regular application, but there are some very specific things here: the MCP route, the specific OpenAI metadata, and then certain configurations you're going to need, specifically around your base URL and the CORS middleware. What the CORS middleware does is attach the appropriate headers that allow the client to make requests and fetch the information it needs; we bootstrap all of this because you're going to need specific header information.

Getting started is pretty easy. As you saw up at the top, there is a deploy button. We can just click deploy. I'm going to put it in here; it's going to push directly to my GitHub so that I can pull it down later, and we'll go ahead and launch this. This is going to take a few minutes, so I'm going to pause while it's creating.

All right, awesome. This has now deployed, and we have all of the information to get started. We'll click continue, see our deployment here, and go back and grab our domain. Real quick everyone, if you haven't already, please remember to like and subscribe. It helps more than you know. Also, join the Vibe Coding retreat. And with that, let's get back to it. And it is the exact same as the demo. So now we can try to wire this up to OpenAI and see what it comes back with.

The first thing to think about is what documentation the Apps SDK has. As you can see, the core concepts are: we need an MCP server, we need some user interaction, and we need some design. We've already got our server.
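The CORS middleware described above boils down to attaching a small set of headers to responses on the MCP route so the ChatGPT client can fetch from your deployment. Here is a minimal, framework-agnostic sketch; the helper name `corsHeaders` and the exact header list are illustrative assumptions, not the template's actual code:

```typescript
// Illustrative sketch: build the CORS headers a ChatGPT-embedded app's
// server would need so the sandboxed client can call the /mcp route.
// The helper name and header set are assumptions for illustration.
type HeaderMap = Record<string, string>;

function corsHeaders(origin: string = "*"): HeaderMap {
  return {
    "Access-Control-Allow-Origin": origin,            // which origins may call us
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS", // OPTIONS covers preflight
    "Access-Control-Allow-Headers": "Content-Type, Authorization, Mcp-Session-Id",
  };
}

// In a Next.js middleware you would copy these onto the outgoing
// response for requests hitting the MCP route (sketch only).
const headers = corsHeaders();
console.log(headers["Access-Control-Allow-Origin"]); // "*"
```

In a real deployment you would typically scope the allowed origin rather than using `*`, and answer `OPTIONS` preflight requests with these same headers.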
We've already deployed it on Vercel, but now we need to take the next step of actually wiring this up to OpenAI. If you look down on the left, there's "Connect from ChatGPT." This is very similar to what we did in the previous video, where you have the ability to create a connector. There are two ways to do this: through the connectors, or through the playground. We're going to go straight into ChatGPT and set it up as a connector. You do need to have developer mode on; you can turn this on in your settings, and all that information is in the previous video.

So if you come in here and go to add sources, what you need to do is take your URL and append /mcp. I'm copying that and going back over to ChatGPT. I click add, connect more, and go to create. We'll just call it "Next.js app" and put the URL in here. I don't have authentication on this server, so I'm going to say trust this application and go ahead and create. Cool, and now I have my enabled connectors, including the Next.js app. If you look down here at the metadata for the different actions, this has a show content tool and an output template. We're going to go through the code and what that means, but we basically have different metadata associated with our tool that allows ChatGPT to use it.

Now if we go in, we can say "use Next.js app" and see what it comes back with. Cool. It's loading content, doing its tool call, and performing the show content execution. It's just saying "user" because we didn't actually give it any information; it doesn't know our name yet. So it returned the app, and as you can see, it's displaying "user." We're actually able to send information to the app.
So if we say something different, like "my name is JD," hoping this will send it to the content, now we know it's going to send it as JD. We can confirm, and it updates. What's interesting about this is that we're able to integrate with the web page through ChatGPT and display a response. We can still click the links here; if you notice down at the bottom left, I'm able to click these links and have them open in the browser. So we have an application running in the browser, and we're actually sending data to that application.

Let's look at the guts of what this is, the internals of the MCP server. In the MCP server, what's happening is that we're constructing widget content. That means we need to declare our widget metadata; these fields are specific to OpenAI. We say what our output template is going to be: a resource at a template widget URI. What information do we have that we are invoking, what has been invoked, and is it accessible or not?

When we come down here and look at what this means, if you know a good amount about MCP, what's happening is that we're registering a resource. A resource, or in this case a resource template, allows you to use a URI as a template to pass data to a particular resource. Typically, that resource could be all different kinds of things: it could be text, it could be images, I've even tested PDFs and things like that. In this case, we're sending an HTML file, and we're sending information to that HTML file to be processed. So right here is where we register our resource; we're calling it a content widget.
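Conceptually, the registered "content widget" resource is just an HTML payload plus a URI that the tool's metadata can point at. A minimal self-contained sketch of that shape follows; the `ui://` URI scheme, the `text/html+skybridge` mime type, and the field names are assumptions based on Apps SDK conventions, not copied from the template:

```typescript
// Sketch of a "content widget" resource payload. The URI scheme, mime
// type, and helper name are illustrative assumptions.
interface WidgetResource {
  uri: string;      // template URI the tool's output metadata references
  mimeType: string; // HTML variant the ChatGPT client knows how to render
  text: string;     // the HTML rendered inside ChatGPT's sandbox
}

function buildContentWidget(html: string): WidgetResource {
  return {
    uri: "ui://widget/content.html",
    mimeType: "text/html+skybridge",
    text: html,
  };
}

const widget = buildContentWidget(
  "<h1>Hello, <span id='name'>user</span></h1>"
);
```

The key design point is that the HTML is served as an MCP resource rather than fetched by an iframe, which is what lets ChatGPT render it inside its own sandbox.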
The content widget URI comes from our definition up here. This is what the MCP server looks at in order to pull this information. Then we send back our metadata for OpenAI along with the HTML we're actually serving. Along with that, we register an MCP tool. This tool gathers information from our input and sends it into the context the tool needs, and then we return structured content; you'll see the metadata points at the widget we're rendering.

Something else that's really interesting about MCP, and how this works, is that you can chain things together. That means you can have a tool call that triggers a resource, or a resource that triggers another tool call, and that's the kind of action happening here.

If we take a look at the MCP Inspector, it'll give us another idea of what's going on. I'm just going to pull up the inspector with npx (npx @modelcontextprotocol/inspector), and we'll take a look at what these calls look like. I just need my app's URL, so we take the /mcp endpoint and go back over to the inspector. You can leave everything else the same and go ahead and connect. And I think my proxy token is incorrect... all right, my problem was that I still had authentication on from the last video. So make sure your authentication is off, and then you should just be able to click connect over HTTP.

Now that I have my inspector, we can see what kind of resources we have. We have the content widget, and you can see the information getting pulled back here. You can also see the templates tab, which we don't necessarily have anything extra in; we just have this content widget. And then we also have our tools, which is show content.
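Putting it together, the tool's response pairs structured data for the widget with metadata pointing at the widget template, plus plain text for the model, which is what the inspector shows. A hedged sketch of that shape (the `openai/outputTemplate` key and the `showContent` name follow Apps SDK conventions as described here; treat them as assumptions rather than the template's exact code):

```typescript
// Sketch of a tool-call result that tells ChatGPT which widget to
// render and what data to hand it. Field names are assumptions.
interface ToolResult {
  content: { type: "text"; text: string }[];   // unstructured text for the model
  structuredContent: Record<string, unknown>;  // data delivered to the widget
  _meta: Record<string, string>;               // OpenAI widget wiring
}

function showContent(name: string): ToolResult {
  return {
    content: [{ type: "text", text: `Showing content for ${name}` }],
    structuredContent: { name },
    _meta: { "openai/outputTemplate": "ui://widget/content.html" },
  };
}

const result = showContent("JD");
```

This mirrors the demo's flow: the model extracts the name from the chat, calls the tool with it, and the `_meta` entry tells ChatGPT to render the registered HTML widget with the structured data.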
So here's what this is actually doing: we have our metadata, and we have our name field. We can enter "JD," run it, and we get that information back as a tool call, returning the structured data, the metadata, and then the unstructured content. So basically, these ChatGPT apps are all running on MCP, and they're returning a resource associated with this information, which allows ChatGPT to render your HTML but also use the tools to send that information over.

---

*Generated for LLM consumption from nerding.io video library*