# How To Use Next Js 13 With Langchain And Langsmith

## Metadata

- **Published:** 10/14/2023
- **Duration:** 9 minutes
- **YouTube URL:** https://youtube.com/watch?v=_B-3XwPBzTc
- **Channel:** nerding.io

## Description

In this video, we dive into LangChain and LangSmith in a Next.js 13 application. We cover everything from the security concerns of using OpenAI in the browser to the backend setup! Learn about session storage, LangChain in the front end, traditional API protection, and more. Get insights into setting up environment variables, creating prompt templates, and connecting with LangSmith. We also touch on related topics like the Python example and upcoming videos on WebLangChain. Take advantage of these valuable insights! Like, subscribe, and join our community of nerds.

#OpenAI #Nextjs #API #LangChain #Langsmith #WebDevelopment

FREE snippets & news: https://sendfox.com/nerdingio
Ranked #1 Product of the Day: https://www.producthunt.com/posts/ever-efficient-ai
Book a Call: https://calendar.app.google/M1iU6X2x18metzDeA

Chapters:
- 0:00 Introduction
- 1:09 Getting Started
- 1:37 LangChain
- 2:31 NextJS
- 3:25 Prompt Template
- 4:03 Testing in the Browser
- 4:55 API Fetch
- 5:48 API Route
- 6:13 Chat Model
- 7:33 Result
- 8:04 Langsmith
- 9:22 Conclusion

Links:
- https://nextjs.org/docs/getting-started/installation
- https://js.langchain.com/docs/get_started/installation
- https://js.langchain.com/docs/get_started/quickstart
- https://smith.langchain.com/

Let's Connect:
- https://everefficient.ai
- https://nerding.io
- https://twitter.com/nerding_io
- https://www.linkedin.com/in/jdfiscus/
- https://www.linkedin.com/company/ever-efficient-ai/

## Key Highlights

### 1. Next.js & LangChain Integration

Demonstrates using Next.js with LangChain, showcasing JavaScript's increasing role in AI applications, including front-end implementations and server-side API routes.

### 2. Frontend LangChain Implementation

Explores using LangChain directly in the Next.js frontend, highlighting prompt templates and dynamic prompt generation, but cautioning about the security implications of exposing API keys.

### 3. Backend API Route for Security

Presents a more secure approach by creating a Next.js API route to handle LangChain logic, protecting the OpenAI API key and demonstrating data flow from frontend to backend.

### 4. LangSmith Monitoring Integration

Shows how to integrate LangSmith for tracing, monitoring, and debugging LangChain applications within Next.js, including visualizing run sequences and latency.

## Summary

**1. Executive Summary:**

This video provides a practical guide to integrating LangChain and LangSmith within a Next.js 13 application. It covers both front-end and back-end implementations, emphasizing security best practices and demonstrating how to monitor LangChain processes using LangSmith for tracing and debugging.

**2. Main Topics Covered:**

* **Introduction to Next.js and LangChain:** Briefly explains the increasing use of JavaScript in AI models and applications.
* **Setting up a Next.js Application:** Includes creating a new Next.js project using `create-next-app` with specific configurations (no TypeScript, using the App Router).
* **Frontend LangChain Implementation:** Demonstrates using LangChain directly in a Next.js component, including dynamic prompt generation.
* **Backend API Route for Security:** Discusses creating a secure Next.js API route to handle LangChain logic and protect the OpenAI API key.
* **LangSmith Integration:** Showcases how to integrate LangSmith for monitoring, tracing, and debugging LangChain applications, visualizing run sequences, and analyzing latency.
* **Working with Prompt Templates:** Creating and using prompt templates in both the front end and back end for dynamic text generation.

**3. Key Takeaways:**

* LangChain can be implemented both on the front end and the back end of a Next.js application.
* Exposing OpenAI API keys directly in the front end is a security risk; a secure back-end API route is the recommended approach.
* LangSmith provides valuable tools for monitoring, tracing, and debugging LangChain applications.
* Prompt templates allow for dynamic and reusable text generation.
* The presenter mentions JavaScript's increasing role in AI applications, citing projects like Flowise AI, Embedchain, and WebLangChain.

**4. Notable Quotes or Examples:**

* **Prompt template example (front end):** `"What is a good name for a company that makes {topic}"`.
* **API route creation:** `app/api/llm/route.js` demonstrates the Next.js file structure for API endpoints.
* **Model setup:** The model defaults to the OpenAI API key set in your environment variables.
* **JavaScript joke:** "Why do JavaScript developers prefer wearing glasses? Because they don't miss any exceptions." (an example of generated content)

**5. Target Audience:**

* Web developers interested in integrating AI functionality into their Next.js applications.
* Developers looking for a practical guide to using LangChain with Next.js.
* Individuals interested in secure API key management when working with OpenAI.
* Developers who want to use LangSmith to monitor and debug their LangChain applications.
## Full Transcript

hey everyone, welcome to nerding.io, I'm JD, and today we're going to be talking about Next.js and LangChain. One of the reasons we want to go over this is that we're starting to see JavaScript being used in different AI models, applications, and tools. You can think of things like Flowise AI as well as Embedchain, and then specifically with LangChain there's the Next.js codespace that's on their docs specifically for chat, there's also the LangChain Next.js starter chat template on Vercel, and you even have some experimental things like WebLangChain, which was written in Next.js as kind of a research tool. Last but not least, you can also track everything in LangSmith. So that's what we're going to look at today: we're going to build essentially a hello world application that allows us to connect Next.js to a LangChain application, and then visualize it and monitor it with LangSmith. All right, so the first thing we're going to do for getting started is actually build our Next.js application, and to do that we can just use npx create-next-app. It's going to ask us a series of questions, like what our project name is, etc., but there are two key things: the first is that we are not using TypeScript in this instance, and the second is that we are using the App Router. Once you have your app set up, the next thing you need to do is run an npm install for LangChain to actually get the package. In the LangChain JS documentation they have a page for Vercel and Next.js, and this is really important because they go over a couple of things: not only how to import it, but they also list out that it can be used in serverless functions, Edge Functions, and actually in the front-end components themselves. This is pretty interesting, because that means, again, that we can use the different types of AI models in the front end. So what we're going to do with this is go through a
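The setup steps described above boil down to two commands. A minimal sketch, assuming a placeholder project name (answer the prompts as the video does: no TypeScript, yes to the App Router):

```shell
# Scaffold a Next.js 13 app; "nextjs-langchain-demo" is a placeholder name.
# When prompted, choose No for TypeScript and Yes for the App Router.
npx create-next-app@latest nextjs-langchain-demo

# Install the LangChain JS package inside the new project
cd nextjs-langchain-demo
npm install langchain
```

Note that newer LangChain JS releases split provider integrations into separate packages (e.g. `@langchain/openai`); at the time of the video a single `langchain` install was enough.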
couple of different examples: we'll go through the traditional route of using an API, and then we'll also load LangChain directly in the front end. So let's go ahead and get started. I put together this application; it has pretty much everything you would need, and if you're interested, it'll be down in the comments, along with a link to subscribe to our newsletter. We're going to go through the React portion of this first. You can see here that what I'm doing is setting a topic, and I'll show you what this looks like first. If we go over to our localhost, this is what we're building: we have a form we can submit, and then we just print out the results, kind of like a hello world of how we're using LangChain. In our React portion, which is in our page over here, we're doing a quick and dirty example of a prompt template. What a prompt template does is let you take a string and put together a prompt to use in some dynamic way: you just have your curly brackets, you put in your variable name, and then you format the prompt with the value of that variable, the topic. So if we go back to test this out and we just say, I don't know, socks, it'll print out "what is a good name for a company that makes socks". Again, this is all happening in the browser. If we wanted to use OpenAI in the browser, we would need to expose our OpenAI key to Next.js, and to do that you would actually have to give your OpenAI key the NEXT_PUBLIC prefix. However, that's not really secure, so we don't necessarily want to do that. If you did want to use the key right in the browser, you could use something like session storage or local storage to let the user put in their own key, so it gets cleared after some time. Still not the best security practice, but it would be a way to
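The curly-bracket templating the transcript describes can be sketched in plain JavaScript; `formatPrompt` here is a hypothetical stand-in for what LangChain's `PromptTemplate` does, not LangChain's actual implementation:

```javascript
// Hypothetical stand-in for LangChain's PromptTemplate: replace each
// {variable} placeholder in the template with the matching value from vars.
function formatPrompt(template, vars) {
  return template.replace(/\{(\w+)\}/g, (match, name) =>
    name in vars ? vars[name] : match
  );
}

const template = "What is a good name for a company that makes {topic}?";
console.log(formatPrompt(template, { topic: "socks" }));
// → "What is a good name for a company that makes socks?"
```

In LangChain JS itself, the equivalent is `PromptTemplate.fromTemplate("What is a good name for a company that makes {topic}?")` followed by `await prompt.format({ topic: "socks" })`.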
allow you to move forward. So now that we've seen how to do this in the front end (remember, this is actually running LangChain in the front end), we're going to switch to the more traditional route and have it protected by an API. What we're doing here is creating a fetch: we're putting in our public base URL, which in this case is just localhost, and then our URI, which is /api/llm. We're doing a POST, and we're putting our topic in the body, stringified, to go to the back end. Then we await our response, at which point we set the result, which again gets shown down here as a way to kind of console.log our result. On the back end, what we need to do is, inside our app folder, create an api folder, then an llm folder, and then a route file; this is what creates the API URL right here. If we look at this, we're still using our prompt template, we're using our ChatOpenAI chat model, and we're setting up our exported POST function, which is how the route gets hit based on the HTTP method. Then we look for our data: this is how we pull the request data and destructure our topic. Here's our model; we don't have to pass anything in, because in our environment variables we have our OPENAI_API_KEY, so it defaults to whatever that is. Then we create our prompt template, in this case "tell me a joke about" our topic, we pipe our prompt template into the model, and then we take our chain, invoke it, and return our response. This is using the JavaScript Response object; we do another JSON.stringify and tell it we want a status of 200. So this should give us the data, the result we're looking
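The route file the transcript walks through might look roughly like this. This is a sketch, not the presenter's exact code, and it assumes LangChain JS import paths from around the time of the video (newer releases move these into `@langchain/core/prompts` and `@langchain/openai`) plus an `OPENAI_API_KEY` set in `.env.local`:

```javascript
// app/api/llm/route.js — sketch of the API route described above
import { ChatOpenAI } from "langchain/chat_models/openai";
import { PromptTemplate } from "langchain/prompts";

export async function POST(request) {
  // Pull the request body and destructure the topic sent from the form
  const { topic } = await request.json();

  // No key passed here: ChatOpenAI falls back to process.env.OPENAI_API_KEY
  const model = new ChatOpenAI({});

  // Dynamic prompt with a {topic} placeholder
  const prompt = PromptTemplate.fromTemplate("Tell me a joke about {topic}");

  // Pipe the prompt into the model, then invoke the chain with the topic
  const chain = prompt.pipe(model);
  const result = await chain.invoke({ topic });

  // Return the model output as JSON with a 200 status
  return new Response(JSON.stringify(result), { status: 200 });
}
```

The front end would then call it with something like `fetch("/api/llm", { method: "POST", body: JSON.stringify({ topic }) })` and set the awaited JSON into state, keeping the OpenAI key on the server.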
for. Let's go ahead and make sure this is saved, and then we'll do a test. Now if we switch this, we don't want it to be "what is a good name for": we're going to say "tell us a joke about JavaScript". As you can see down here, we're actually sending this information out; our payload is our topic of JavaScript, and we're getting our response, which we're seeing in our code block. We have our content, which is right here: "why do JavaScript developers prefer wearing glasses?" (that's a good one for me) "because they don't miss any exceptions". Nice. All right, so next we want to go in and connect this to LangSmith. This is actually really simple, and we've seen it in the Python example, which I'll link to in our other video. All you need to do is put in your environment variables for LangChain tracing, your API key, and your project, and you'll be able to monitor your sequence. We already sent this through (I already had the environment variables set up), so if we look at our project we can see we have a nextjs hello world, we have a run count of one, and it tells us our total tokens and what our latency is. If we click here, we have our runnable sequence; it shows us that we're using the prompt template and OpenAI, and we can look through at our success. What we could also do, if we had a dataset or wanted one, is add this run to our dataset. So with that, this is what we learned today: we learned how to put together a Next.js and LangChain application, and we talked about things like Embedchain and even the research project that LangChain has put out, WebLangChain; we'll actually have an upcoming video on that. And if you're interested in the Python version, you can check out our channel; I'll leave a video up here. Don't forget to like and subscribe. Happy nerding, we'll see you soon!

---
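For reference, the LangSmith hookup described in the transcript is driven entirely by environment variables. A sketch of a `.env.local`, with placeholder values (variable names as used in the LangSmith docs; the project name here is just an example):

```shell
# .env.local — placeholder values, not real keys
OPENAI_API_KEY=sk-your-openai-key
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=your-langsmith-api-key
LANGCHAIN_PROJECT=nextjs-hello-world
```

With these set, runs from the API route should appear in the named LangSmith project with token counts, latency, and the runnable sequence shown in the video.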
*Generated for LLM consumption from nerding.io video library*