# Master OpenAI's NEW ChatGPT Custom MCP Integration in Minutes

## Metadata

- **Published:** 10/11/2025
- **Duration:** 16 minutes
- **YouTube URL:** https://youtube.com/watch?v=Q8W-VBWc-g0
- **Channel:** nerding.io

## Description

Use Apify to create a data source: https://www.apify.com?fpr=s9bzxz

🚀 Learn how to integrate your own data sources directly into ChatGPT using OpenAI's brand-new Model Context Protocol (MCP)! In this step-by-step tutorial, you'll discover how to build a custom MCP server, connect it to ChatGPT's Deep Research and Responses API, and unlock powerful search and fetch capabilities — all in just a few minutes.

You'll learn:

- What the Model Context Protocol (MCP) is and why it matters
- How to create your own MCP data source with search and fetch endpoints
- How to connect your MCP server to ChatGPT or OpenAI's API
- How to extend Deep Research to use private or enterprise data
- How to deploy an MCP on Vercel

Bonus: Real examples and GitHub references to speed up your setup.

By the end, you'll have a fully working custom MCP connector that allows ChatGPT to reason over your own data — just like it does with the web.

🎓 Perfect for: developers, data engineers, AI builders, and anyone exploring OpenAI's new agentic features.

📦 Resources mentioned:

- OpenAI Deep Research API Docs: https://platform.openai.com/docs/guides/deep-research
- MCP Documentation: https://platform.openai.com/docs/mcp
- Example GitHub: https://vercel.com/templates/next.js/openai-deep-research-compatible-mcp-with-next-js

💡 Pro tip: Combine MCP with OpenAI's Responses API to create agents that can research, fetch, and analyze your own internal data securely.

## Key Highlights

### 1. Custom Data in ChatGPT via MCP Connectors

OpenAI now allows custom data source integration via MCP connectors, enabling ChatGPT to access and use specific, user-defined information for deep research.

### 2. Search & Fetch: Key Tools for Deep Research

Deep-research MCPs are limited to two core tools, `search` and `fetch`, each with specific requirements for the content they return (text, image, resource). Authorization relies on OAuth.

### 3. Vercel Template Simplifies MCP Deployment

Vercel provides a pre-built template compatible with ChatGPT Deep Research, simplifying MCP server deployment and allowing for a quick setup with minimal coding.

### 4. Prompt Dashboard for MCP Testing

OpenAI's Prompt Dashboard provides a dedicated environment to test and refine MCP server integrations, enabling you to evaluate prompt effectiveness with custom data sources.

### 5. Enable Developer Mode and Connect Custom Connectors

Developers must enable developer mode in ChatGPT's settings and then add a custom connector with the relevant API URL to make use of custom tools.

## Summary

**Document Title: Master OpenAI's NEW ChatGPT Custom MCP Integration: A Summary**

**1. Executive Summary:**

This video tutorial explains how to create a custom Model Context Protocol (MCP) connector, which allows ChatGPT to access and use custom data sources for deeper research and more tailored responses. It details the process of deploying an MCP server on Vercel, connecting it to ChatGPT, and testing its functionality via the OpenAI Prompt Dashboard.

**2. Main Topics Covered:**

* **Introduction to MCP (Model Context Protocol):** What MCP is and its significance in integrating custom data into ChatGPT.
* **MCP Architecture and Required Tools:** The two tools available to a deep-research MCP, `search` and `fetch`, with their specific requirements for content handling (text, image, resource).
* **Creating and Deploying a Custom MCP Server:** Instructions on creating search and fetch endpoints, emphasizing the use of the Vercel template for simplified deployment.
* **Connecting MCP to ChatGPT:** Step-by-step guidance on enabling developer mode in ChatGPT settings and adding a custom connector using the MCP server's API URL.
* **Testing and Refining MCP Integrations:** Leveraging OpenAI's Prompt Dashboard to test the effectiveness of prompts with custom data sources.
* **MCP Authentication:** Authorization uses OAuth.

**3. Key Takeaways:**

* OpenAI enables custom data source integration via MCP connectors, allowing ChatGPT to access specific, user-defined information.
* Deep-research MCPs are limited to the `search` and `fetch` tools, which have specific requirements for data formats and handling.
* Vercel provides a ready-made template compatible with ChatGPT's Deep Research feature, simplifying MCP server deployment.
* The Prompt Dashboard offers a dedicated environment for testing and refining MCP server integrations.
* Developers must enable developer mode in ChatGPT and add the custom connector with the appropriate API URL to use it.
* MCP tool results can carry text, image, and resource content.

**4. Notable Quotes or Examples:**

* "OpenAI has basically made this ability to add sources... you can add custom connectors." (Illustrates the customizability of ChatGPT.)
* "You can only implement two tools. You can implement a search and you can implement a fetch." (Highlights the constraints of the MCP for deep research.)
* Demonstration of using the Vercel template to quickly deploy an MCP server and connect it to ChatGPT.
* Example of using the Prompt Dashboard to test a prompt that chains both the search and fetch tools.
* "Now you have the ability to make an MCP server, pull your own documents, and share that with different people in your organization... and actually share those resources in a data connector way that allows you to aggregate different data from different sources." (Emphasizes the versatility of MCP.)

**5. Target Audience:**

* Developers
* Data Engineers
* AI Builders
* Anyone exploring OpenAI's agentic features and custom data integrations with ChatGPT.

## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to go through how you can make a custom OpenAI MCP connector for Deep Research. That means you can connect your own data sources and use them in ChatGPT, and we'll see how to set all of that up. This is a precursor to the ChatGPT apps, which I'm going to turn into a mini-series. With that, let's get started.

All right. First, I'm going to show you an example of one I've already created, and we'll look at what this means. OpenAI has basically made this ability to add sources. You have to enable it through the developer section, but you can add custom connectors. You can see there are a bunch of predefined ones like Canva, Figma, and Spotify, which we saw in their release demo, but you can also add custom ones. This is one I built based on a previous MCP of mine that you can connect to. That will be part of a longer demonstration, but here we're going to go through how you can create one and how you can use Vercel to do it. They actually give you a really nice template.

So let's go through some of the specifics for OpenAI. What this allows you to do is create an MCP server for ChatGPT with API integrations. You're going to configure things like the data source (in this case we'll be looking at a vector store), and we need to create an MCP server. Now, a really important thing about the deep-research specifics is that you can only implement two tools: you can implement a search and you can implement a fetch.
And these tools have really specific requirements. So when we're building out this deep researcher tool that lets us connect to our own data sources, we need to make sure we follow these patterns. The other really cool thing is that a result can contain text, image, and resource content. This is one of the really awesome, underutilized things in MCP: you can actually pull in images. I've done this before using n8n, and I'll leave a link to that video in the description, but it's a great way to understand that you don't have to send just text; you can send different types of data. Then there's the fetch tool. Same concept, with a small difference in the metadata associated with the result, but again it takes text, image, and resource.

So we're going to look at a specific example. They have an example you can run on Replit, but we're going to use Vercel, since that's where I spend a lot of my time. What's really interesting is that the MCP documentation on deploying MCPs to Vercel already points to a template. In a previous video I covered the fact that Vercel has a transport layer and a route specifically for MCP, and this latest template I found is the ChatGPT Deep Research compatible MCP. That means it gives us everything we need out of the box. Quick refresher: when we're looking at MCPs, we need GET, POST, and DELETE handlers, and you can run the server locally with the Model Context Protocol tooling.
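The search and fetch requirements described above can be sketched in TypeScript. This is a minimal sketch: the field names (`id`, `title`, `text`, `url`, `metadata`) are recalled from OpenAI's Deep Research MCP docs and should be verified against the current documentation, and the sample data is illustrative.

```typescript
// Sketch of the result shapes a deep-research MCP is expected to return.
// Field names follow my reading of the OpenAI Deep Research MCP docs;
// check the linked documentation before relying on them.

interface SearchResult {
  id: string;     // document ID that `fetch` can later resolve
  title: string;
  text: string;   // snippet or relevant excerpt
  url?: string;   // where the resource lives
}

interface FetchResult {
  id: string;
  title: string;
  text: string;   // full document text
  url?: string;
  metadata?: Record<string, string>; // fetch carries extra metadata
}

// MCP tool responses wrap payloads in typed content blocks; besides
// "text" they may also be "image" or "resource", as the video notes.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "image"; data: string; mimeType: string }
  | { type: "resource"; resource: { uri: string; text?: string } };

// Example of what a search tool might hand back (illustrative data).
const searchResponse: ContentBlock = {
  type: "text",
  text: JSON.stringify({
    results: [
      {
        id: "doc1",
        title: "Next.js Routing",
        text: "File-system based routing overview.",
        url: "https://example.com/doc1",
      },
    ] satisfies SearchResult[],
  }),
};
```

The same `ContentBlock` union is what lets a server return images or resources instead of plain text, which is the underutilized capability mentioned above.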
And then we're going to configure this MCP via a URL, as opposed to something like stdin/stdout or even SSE. There are options for authorization. In my example you can connect through OAuth principles, with Twitter in this case. So this is going to take a minute to spin up. All right, now that my server has spun up, all it does is go straight to the authorization for this app. You authorize it, and it sends you back to OpenAI. I'm going to make a longer video specifically on authorization, how it actually works and how you need to build it. I just want to show this as an example that you can link these things up, and then you'll see that fetch and search are actually pulling data back. So we'll go through authorization in another video; I just wanted to show a preview.

So back in our template, let's get this running on Vercel. All we really need to do is go up to the top, click this "MCP with ChatGPT Deep Research compatibility" button, and we can deploy it. We don't even need to do anything specific. But if you do want to see the repo, you can come over here, and the sample code is pretty simple. You basically have a page and a layout, and then you have your MCP transport route. Inside, it's just serving documents. Each document has some text and a URL for where that resource lives, plus a simple search over them. In this case it's not a true semantic search; we're just checking whether a word appears in a document. And then there's also the fetch tool, which retrieves a document by its document ID.
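The template's search-and-fetch logic described above can be sketched as a plain keyword match over an in-memory document list. The names and sample documents here are illustrative, not the template's actual code.

```typescript
// Minimal sketch of the template's logic: an in-memory document list,
// a case-insensitive substring "search" (not semantic search), and a
// fetch that resolves a document by ID.

interface Doc {
  id: string;
  title: string;
  text: string;
  url: string;
}

const docs: Doc[] = [
  {
    id: "doc1",
    title: "Next.js Overview",
    text: "Next.js is a React framework.",
    url: "https://example.com/doc1",
  },
  {
    id: "doc2",
    title: "Deploying on Vercel",
    text: "Deploy with a git push.",
    url: "https://example.com/doc2",
  },
];

// search: return every document whose title or text mentions the query.
function search(query: string): Doc[] {
  const q = query.toLowerCase();
  return docs.filter(
    (d) => d.title.toLowerCase().includes(q) || d.text.toLowerCase().includes(q)
  );
}

// fetch: resolve one document by the ID a previous search returned.
function fetchDoc(id: string): Doc | undefined {
  return docs.find((d) => d.id === id);
}
```

Chaining the two, e.g. `fetchDoc(search("next.js")[0].id)`, mirrors how the model later chains the search and fetch tools in the demo.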
So we'll want to pay attention to document IDs like "doc1". With that, let's deploy. If we come up to the top again, it launches me into Vercel. You'll see it in my environment; it's going to copy this over. I'm not going to make this a private repo, since to me it's just a fork. I'll go ahead and create it. I'm using a free account for this, and you should be able to get started right away. All right, now the deployment is running, and it's going to take a second to build, so I'm going to pause here.

All right, now we have this deployed. We can continue to the dashboard and add any of these things, and you can also see a preview of what we've got. I'm just going to continue and launch, because I'm going to need that URL. We get our domain, and let's try /mcp. We can see the GET method isn't allowed. Let me check, maybe it's /api... no. Okay, so I think this is correct. We're going to run it in our local MCP inspector. To do that (I'm not in Claude Code, just to clear that up), I'm going to run the Model Context Protocol inspector locally with `npx`.

Real quick, everyone, if you haven't already, please remember to like and subscribe. Also check out the Vibe Coding Retreat, which is our community. And lastly, thanks to our sponsor, Apify, which is a great way to scrape and aggregate a bunch of data that you could then use in an OpenAI MCP connector. With that, let's get started.

So now that we have that URL, we're going to put it into our MCP inspector. We'll use streamable HTTP, the default proxy is fine, we don't need any authentication, and we're just going to connect.
So when we connect locally, we should get our information. We see a list of tools, search and fetch, exactly what we're supposed to see, and you can see the different types of information. If we search for "next.js" and run the tool, we get all the documents that mention Next.js. And if we fetch "doc1", we see the resource itself and the metadata source associated with it. That should be enough for us to connect directly to ChatGPT, which is what we'll do next.

Looking at their docs, the documentation says we need to connect our connectors in developer mode. So we go back to ChatGPT and open Settings; that's how I got here. Under Apps & Connectors, you may have to go to Advanced and click on Developer mode. That gives you a little orange border to let you know you're in developer mode, but it also lets you connect custom connectors. So now we're going to create a new tool and call it "NextJS docs". Even though this isn't the official Next.js docs, we take our Vercel deployment of the MCP (we don't have authentication on this one) and say "I trust this application". That's all you need to do to add a connector. Now we click Create, and this may take a second. Perfect. We now have our NextJS Docs research connector, and we can test it in our chat window.

We can also test this in what's called the Prompts dashboard. If you go back to the docs and scroll down to the bottom, there's a "test and connect your MCP server" section, and you can click through to the Prompts dashboard. In the platform chat, which is a little different, you can add in and test your MCP. So, while waiting for this to load, we're going to go ahead and click Create.
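The platform-side setup can also be done programmatically: the video's description mentions combining MCP with OpenAI's Responses API, where the connector becomes a tool entry in the request. A hedged sketch of that configuration follows; the server URL and label are placeholders for your own Vercel deployment, and the field names follow my reading of OpenAI's MCP tool docs, so verify them against the current API reference.

```typescript
// Sketch of pointing the Responses API at the same MCP server the
// dashboard uses. URL and label are hypothetical placeholders.

const mcpTool = {
  type: "mcp",
  server_label: "nextjs-docs",                   // placeholder label
  server_url: "https://your-app.vercel.app/mcp", // placeholder deployment URL
  require_approval: "never",                     // skip per-call approval, as in the video
};

const requestBody = {
  model: "gpt-4.1",                              // any Responses-capable model
  tools: [mcpTool],
  input: "Search for the Next.js doc one and fetch it.",
};

// With the official openai SDK this would be sent roughly as:
//   const client = new OpenAI();
//   const response = await client.responses.create(requestBody);
```

Setting `require_approval` to `"never"` corresponds to the "never require approval" toggle selected in the dashboard walkthrough below.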
If you go down to Tools and then MCP server, you can click it, and it's very similar to the connectors. There are servers by others, but if we want to add our own, again we put in the URL and a label; we'll just call it "nextjs-mcp". I don't have authentication, but you'll notice that this has different types of authentication. Right now, if you're connecting the MCP server through the platform, you get none, access token, or custom headers. We're going to choose none. On the other one, you might have seen that it said none or OAuth. So if you're doing it in the traditional chat, you don't have the option to use headers; just a heads up. We'll wait for this to connect and then try it here.

Okay, great. Now, if we look at the docs, we're in the exact same state we would expect, and you need to select "never require approval" for this tool. So we go back and change this: we say never require approval for any tool, and it auto-selects them. You could also turn them off if you want. We're going to go ahead and add this.

Now we can test our prompt. You can put in any prompts or user messages you want; we're just going to say "search for the Next.js doc one and fetch it". We're being pretty specific here, but I want to see how the AI actually operates and which tools it uses. As you can see, it's pulling in the custom tool we set up over here. It lists the tools, it has the ability to fetch that document, and now it's pulling the information back. And if you notice, it did do the search, the search for "next.js", so we were able to chain this together. That's exactly what I was hoping for. It listed the tools.
It searched specifically, found the doc, and pulled it back by its doc ID. We could probably just as easily say "go fetch doc one", but this is a really good example of how this will work in the new Prompts dashboard.

So really quick, we're going to try this in the live chat as well and see if it works similarly. Keep in mind all of these things are in beta, so there may be some bugs. When we're in developer mode in our regular chat, we basically need to come down here and click "See more", and you should be able to see this NextJS docs connector now that you've created it. If you don't, click "Add sources", then "Connect more", and you should see it there; that will enable it. Now you can see that this is the NextJS docs connector. Let's use the same prompt for consistency and see if it does the same thing.

Okay, so it's looking for tools, it's calling the tools... oh, here we go. Perfect. In this case it did the fetch, and only the fetch; it didn't list the tools and search like the other one did. That's pretty normal with AI: it's going to choose its own path. But the reality is that it's still going out and fetching our document.

So again, now you have the ability to make an MCP server, pull in your own documents, and share them with different people in your organization, or with people you might be selling prompts to, and actually share those resources in a data-connector way that allows you to aggregate different data from different sources. And now you have your own custom connector in OpenAI.

---

*Generated for LLM consumption from nerding.io video library*