# How to Use HumeAI's EVI Next.js Function Calling API #aitools

## Metadata

- **Published:** 7/29/2024
- **Duration:** 11 minutes
- **YouTube URL:** https://youtube.com/watch?v=exAICcpzXMc
- **Channel:** nerding.io

## Description

Unlock the power of emotion AI with HumeAI! In this video, we walk you through the EVI Next.js function calling API, providing step-by-step instructions on how to integrate and utilize it in your projects. Perfect for developers looking to add emotional intelligence to their applications.

📌 Key Highlights:
- Overview of HumeAI's EVI API
- Setting up Next.js for function calls
- Code examples and practical implementation
- Tips for optimizing your emotional AI integration

Don't forget to like, comment, and subscribe for more tech tutorials!

📰 News & Resources: https://sendfox.com/nerdingio

📞 Book a Call: https://calendar.app.google/M1iU6X2x18metzDeA

🎥 Chapters
00:00 Introduction
00:29 Repo
01:36 Create Configuration
03:00 Create Tool
04:30 Code
06:57 Demo
09:36 Playground

🔗 Links
https://www.hume.ai/
https://github.com/HumeAI/hume-api-examples/tree/main/evi-next-js-function-calling
https://dev.hume.ai/docs/empathic-voice-interface-evi/tool-use#create-a-configuration

⤵️ Let's Connect
https://everefficient.ai
https://nerding.io
https://twitter.com/nerding_io
https://www.linkedin.com/in/jdfiscus/
https://www.linkedin.com/company/ever-efficient-ai/

#chatgpt #voice #ai #programming

## Key Highlights

### 1. HumeAI Function Calling API Overview
The video provides an overview of HumeAI's function calling API, enabling AI models to interact with external APIs, similar to OpenAI functions.

### 2. Weather Tool Example
The tutorial demonstrates using a weather API as a practical example of how to configure and implement function calling within the HumeAI framework (see the parameter sketch following these highlights).

### 3. Debugging and Troubleshooting
The video highlights the debugging process, showing how to identify and resolve errors related to tool names and API responses within the Next.js application.

### 4. Testing in Dashboard Playground
HumeAI's dashboard offers a playground environment for testing function calls directly, allowing for interactive testing and validation of API integrations.

### 5. Voice Provider and Tool Integration
The video explains how the voice provider is integrated with the tool call, enabling the system to handle tool calls and generate responses based on the parameters received.
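To make the weather tool example in Highlight 2 concrete, here is a minimal sketch of the parameter schema the video assembles in the dashboard. The field names (`location`, `format`) and the required list come straight from the video; the description strings are illustrative, and the actual example repo may word things differently.

```typescript
// Hypothetical parameter schema for the video's weather tool.
// "location" and "format" are the fields named in the tutorial;
// the descriptions here are illustrative.
const weatherToolParameters = {
  type: "object",
  properties: {
    location: {
      type: "string",
      description: "The city and state, e.g. Detroit, MI",
    },
    format: {
      type: "string",
      enum: ["celsius", "fahrenheit"],
      description: "The temperature unit to use in the response",
    },
  },
  required: ["location", "format"],
} as const;

// The Hume dashboard expects the parameters field as a JSON string:
console.log(JSON.stringify(weatherToolParameters, null, 2));
```

Constraining `format` with an enum is what lets the model choose only Celsius or Fahrenheit instead of producing free-form text.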
## Summary

Here's a comprehensive summary document of the video "How to Use HumeAI's EVI Next.js Function Calling API #aitools":

**1. Executive Summary:**

This video tutorial explains how to integrate HumeAI's Empathic Voice Interface (EVI) function calling API into a Next.js application, enabling the AI to interact with external APIs, using a weather API as a practical example. The video covers the necessary steps from setting up the environment and creating configurations to debugging and testing the implementation, both in the Next.js application and in HumeAI's playground.

**2. Main Topics Covered:**

* **Overview of HumeAI's EVI API and Function Calling:** Explains the purpose and functionality of the EVI API, emphasizing the ability to extend AI capabilities by connecting to external services.
* **Next.js Setup and Environment Configuration:** Details the process of setting up a Next.js project, including the necessary environment variables (API keys from HumeAI and Geoapify), and downloading the necessary files.
* **Creating and Configuring Tools in HumeAI:** Demonstrates how to create a tool (e.g., a weather API) within the HumeAI dashboard, including defining the API's name, description, and required parameters (e.g., location, format).
* **Code Implementation and Explanation:** Walks through the code in a Next.js client component, explaining how it handles tool calls, fetches data from the weather API based on the provided arguments, and returns a response.
* **Debugging and Troubleshooting:** Highlights common errors that can occur during integration (e.g., incorrect tool names) and provides guidance on how to identify and resolve them.
* **Testing in the HumeAI Dashboard Playground:** Shows how to use HumeAI's playground to test function calls directly, allowing for interactive validation of API integrations, including sending success and error responses.
* **Voice Provider Integration:** Explains how the voice provider is connected to the tool-call handler, allowing the system to interpret spoken commands and generate appropriate responses based on the received parameters (see the wiring sketch after this summary).

**3. Key Takeaways:**

* HumeAI's function calling feature is similar to OpenAI's functions, allowing AI models to interact with external APIs.
* Setting up the correct environment variables and API keys is crucial for successful integration.
* Careful configuration of tools within the HumeAI dashboard is essential for defining the API's capabilities.
* Debugging tools and testing environments are available to validate the integration.
* Voice interaction is integrated with tools through call handling, allowing spoken commands to invoke external APIs.

**4. Notable Quotes or Examples:**

* "Tooling is just like it is with OpenAI functions, where you're trying to basically get or create an API call and make your prompt or your AI aware that it can actually use these types of tools..."
* Example weather tool parameters: location (string) and format (Celsius or Fahrenheit).
* Demonstrates troubleshooting by identifying that the tool name in the Next.js code must match the name defined in the HumeAI dashboard.
* Successful demonstration: "Can you tell me the temperature outside in Detroit?" "The current temperature in Detroit is 65°F."

**5. Target Audience:**

Developers with basic experience in:

* Next.js
* API integrations
* AI concepts (specifically LLMs and function calling/tool use)

and an interest in adding emotional intelligence and voice interaction to their applications using HumeAI.
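The voice-provider integration described in the summary comes down to registering a single tool-call handler on the provider component. Below is a hedged sketch assuming the `@humeai/voice-react` package used by the linked example repo; the `auth`, `configId`, and `onToolCall` props reflect how the video describes the wiring and may differ between package versions, and the `handleToolCall` import and env variable name are illustrative.

```tsx
"use client";

// Sketch of wiring the voice provider to one tool-call handler
// ("on tool call -> handle tool call", as described in the video).
// Assumes @humeai/voice-react from the linked example repo; exact
// prop and type names may vary by version.
import { VoiceProvider } from "@humeai/voice-react";
import { handleToolCall } from "./handleToolCall"; // hypothetical module, sketched later in this document

export default function ClientComponent({ accessToken }: { accessToken: string }) {
  return (
    <VoiceProvider
      auth={{ type: "accessToken", value: accessToken }}
      // The config ID created in the dashboard, read from an env
      // variable as the video suggests (the name is illustrative).
      configId={process.env.NEXT_PUBLIC_HUME_CONFIG_ID}
      // Only one handler is registered, so it must dispatch on the tool name.
      onToolCall={handleToolCall}
    >
      {/* ...voice session UI goes here... */}
    </VoiceProvider>
  );
}
```

Because only one handler is registered, it dispatches on the tool name internally, which is why the name in code must exactly match the name defined in the dashboard (the bug debugged in the video).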
## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to be diving a little bit deeper into Hume, specifically looking at how they have tooling and how you can actually call out to third-party APIs. With that, let's go ahead and get started.

So we're going to be taking a look at Hume. We went through the dashboard last time, and we're going to look at this example for function calling. The first thing we're going to need to do is get our keys, but I also want to explain what a tool is. Tooling is just like it is with OpenAI functions, where you're trying to basically get or create an API call and make your prompt or your AI aware that it can actually use these types of tools for calling outside, so different APIs and things like that. Basically, you can see the parameters: you're setting up what objects you might need in order to, in this case, get the weather. This is pretty much a hello-world example, so we're just going to go ahead and get right into it.

Let's go back to the dashboard, get our keys, create a tool, and then create a configuration. Back in the dashboard we have these API keys; we can just go ahead and copy those. Then over here, underneath EVI, is "create a configuration." The first thing we're going to do is say we want a particular kind of voice, masculine or feminine, and then you can pick which model you want to use, even custom language models. We're just going to go with the default system so that we don't necessarily have to put any of these in, but they do have Claude and OpenAI, both of which have functions inside of them. For right now I'm just going to leave the system prompt empty; we can configure that later if we want.

So go ahead and add tools. What we're going to do is click add and create a new function, and as you can see right here, we need this information from their example. We'll jump back over: we're going to say our name is this, we're going to have this description, and we'll go ahead and take these parameters. As you can see, we're getting "invalid JSON," so what we can do is actually take this object. We know that our properties are going to be right here: we need location with a type of string, a description, and, it looks like, format. So let's go ahead and put that together real quick and do a little copy and paste. What I did was take this bit and then just told ChatGPT to make it into JSON so that we can copy and paste. And here we go: we have our properties, which is location (a string), we have a description, we have what our format is going to be (Celsius or Fahrenheit), and that looks like what's required. Again, these are all things that are pretty normal in function calling in OpenAI, so it's not too different. Go ahead and create that; we now have our function, and we can still edit it. Then we'll just go ahead and name our config (we'll just say "YouTube") and create our configuration.

So now, two things: one, we can actually run this in our playground, or we can put this ID back into our configuration. Over here we're going to put the ID into our environment variable. All right, now that we have this, make sure that we download this, and we're going to go ahead and jump right into the editor, get this started, and give it a test.
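In the video the tool is created by hand in the dashboard. For reference, the tool-use documentation linked above also covers doing this step over REST; the sketch below assumes the `POST https://api.hume.ai/v0/evi/tools` endpoint and `X-Hume-Api-Key` header from those docs, along with the tool name and schema from the walkthrough, so verify both against the current documentation.

```typescript
// Hedged sketch: create the weather tool via Hume's REST API instead of
// the dashboard UI. Endpoint and header names are assumptions based on
// the tool-use docs linked in this document.
async function createWeatherTool(apiKey: string): Promise<void> {
  const response = await fetch("https://api.hume.ai/v0/evi/tools", {
    method: "POST",
    headers: {
      "X-Hume-Api-Key": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "weather_tool", // must match the name your client code checks for
      description: "Fetches the current weather for a given location",
      // The docs show the JSON Schema passed as a string:
      parameters: JSON.stringify({
        type: "object",
        properties: {
          location: { type: "string", description: "City and state, e.g. Detroit, MI" },
          format: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location", "format"],
      }),
    }),
  });

  if (!response.ok) {
    throw new Error(`Tool creation failed: ${response.status}`);
  }
  console.log(await response.json());
}
```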
So, I've already downloaded this. This is all the information that we need; the only other thing is this geocoding API key. You can actually get that for free through the Geoapify site: you can come in here and just grab an API key. It's free, and I think you get something like 5,000 requests a day.

All right, so now all we have to do is get our terminal and get started. We'll notice that in our source we're actually using the App Router, and we're going to take a look at our components, specifically the client component, and see how this function actually works. What we're doing is saying: if the tool name is "weather tool." So one of the first things we're probably going to need to go back and do is change this from "get weather tool" to just "weather tool," but let's keep going and see how we can test this and whether we get any errors.

You can see that what's basically happening is we have this handle-tool promise, and it is looking for the tool name. You could technically have multiple different tool names, and here you're only looking for a specific tool. We're going to go fetch information based on the arguments of what we're actually saying: we'll take our API key, then we're going to take information from weather.gov to get information about the weather going on, and once we get that, we'll return our tool response based on the tool call ID and the content that comes back through the forecast URL and the JSON that we're expecting. You can see on the voice provider, last time when we set this up, we were able to add a handler, and that is onToolCall: handleToolCall. It's only one call, so anytime it's looking for a tool, we're going to look for this particular name.

All right, let's take a look at our localhost and see what we've got. Real quick, everyone: if you haven't already, please remember to like and subscribe; it helps more than you know. And with that, let's get back to it.

We're going to go test this in our browser. We'll be paying attention to the console, any logs, and even the WebSockets that might come back. The first thing we're going to do is try and start the session. "Can you tell me the weather outside in Detroit?" "I'm currently unable to access the weather tool." All right, perfect. We want to see what the error was. If we look here, we see that we've got the tool and we're getting this call: it's recognizing that this is our tool, because this is the name that we put in, and it's able to pass the parameters correctly. But the problem is that we are unable to get the response, and that's because of what we thought: this name here is different than what we had in our dashboard.

So let's go back and switch a couple of things. First, go back to our configuration; we can edit both the tool and the config. I'm going to change to GPT-3.5, put in a sample prompt, and update the code in here to get our tool. Now let's go ahead and restart. All right, let's start again. "Can you tell me the temperature outside in Detroit?" "The current temperature in Detroit is 65°F." All right, cool. What happened this time is you can actually see that we're able to call the function itself: we're getting the area, we're getting the forecast, we're able to see the tool received, and we're able to see what the tool is actually returning: the parameters that we sent, the call ID, and all this information pulled back.
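Pulling the walkthrough together, here is a sketch of what the client component's handler might look like: dispatch on the tool name, geocode the spoken location with Geoapify, resolve the forecast through weather.gov, and answer with a tool response keyed by the tool call ID. The message and response shapes, the env variable name, and the function name are assumptions drawn from the video, not verbatim code from the example repo.

```typescript
// Assumed shape of an incoming tool-call message, based on the video:
// a tool name, a JSON string of arguments, and a tool call ID.
type ToolCallMessage = {
  name: string;
  parameters: string; // JSON string of { location, format }
  tool_call_id: string;
};

export async function handleToolCall(message: ToolCallMessage) {
  // The handler receives every tool call, so it checks for the one
  // tool name it knows (which must match the dashboard exactly).
  if (message.name !== "weather_tool") {
    return;
  }

  const { location, format } = JSON.parse(message.parameters);

  // 1. Geocode the spoken location with Geoapify (free tier, as noted
  //    in the video). The env variable name is illustrative.
  const geoRes = await fetch(
    `https://api.geoapify.com/v1/geocode/search?text=${encodeURIComponent(location)}` +
      `&apiKey=${process.env.NEXT_PUBLIC_GEOCODING_API_KEY}`,
  );
  const geo = await geoRes.json();
  const { lat, lon } = geo.features[0].properties;

  // 2. weather.gov resolves a lat/lon point to a forecast URL...
  const pointRes = await fetch(`https://api.weather.gov/points/${lat},${lon}`);
  const point = await pointRes.json();

  // 3. ...and fetching that URL returns the forecast periods.
  const forecastRes = await fetch(point.properties.forecast);
  const forecast = await forecastRes.json();
  const current = forecast.properties.periods[0];

  // 4. Hand the result back, keyed by the tool call ID.
  return {
    type: "tool_response",
    tool_call_id: message.tool_call_id,
    content: JSON.stringify({ temperature: current.temperature, unit: format }),
  };
}
```

The two-step weather.gov lookup (point first, then forecast URL) matches the "area, then forecast" log lines seen in the demo.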
So now what we're going to do is actually test this in the playground. If we take this information and go back to our dashboard, we can test it there as well. If we click "run in the playground," then all we need to do is get an example of that weather response, and they actually have one. So we're going to take this, copy it, and start our call. "Can you give me the current temperature?" "Sure thing." All right, so now it looks like it's recognizing our voice, and we can actually send a response. Here's the response we're going to send; we can also send an error response, and it'll reply accordingly. What's interesting about this is that you can see it trying to send it here, and via text you're able to send either a success or a failure and test this in the dashboard, which is pretty interesting. So not only can we test this through code in our Next.js application, we can test it in the dashboard as well, and we could even test it by just sending POST requests back and forth through the API.

All right, that's it for us today, everyone. What we looked at was Hume, and specifically how they have tools, which are very similar to how we've seen other third-party APIs being called in AI. With that, happy nerding!

---

*Generated for LLM consumption from nerding.io video library*