# How to create a daily Reddit Trend Finder with #langgraph #deepseek #slack #n8n

## Metadata

- **Published:** 2/3/2025
- **Duration:** 17 minutes
- **YouTube URL:** https://youtube.com/watch?v=usUTnRYloi8
- **Channel:** nerding.io

## Description

Discover how to build an AI-powered Reddit trend analyzer that automatically detects viral content, leverages Slack emoji reactions for human approval, and triggers n8n workflows to post content, all in real time! Perfect for developers, marketers, and automation enthusiasts.

- Track trending Reddit posts using LangGraph for NLP-powered trend analysis.
- Integrate Slack as a "human-in-the-loop" to approve/reject content with emoji reactions (✅ to approve, ❌ to reject).
- Automatically trigger n8n workflows to post approved content to social media, blogs, or databases.
- Combine AI efficiency with human judgment for smarter content curation.

🔧 Key Tools & Technologies

- LangGraph: Analyze Reddit posts for trends, keywords, and sentiment.
- Slack API: Set up emoji-triggered actions for human moderation.
- n8n: Build no-code automations to publish content instantly.
- Python: Script the backend logic to connect Reddit, Slack, and n8n.

📌 What You'll Learn

- LangGraph Setup: Scrape and analyze Reddit data in real time.
- Slack Integration: Configure emoji reactions to trigger approval workflows.
- Human-in-the-Loop Design: Let teams vote on content before publishing.
- n8n Automation: Post approved trends to platforms like Twitter, Discord, or CMS tools.
- Cron to set up a daily recurring research task

👉🏻 Text Yourself: https://textyourself.app
📰 Newsletter: https://sendfox.com/nerdingio
📞 Book a Call: https://calendar.app.google/M1iU6X2x18metzDeA

🎥 Chapters
00:00 Introduction
00:44 LangGraph
04:46 Studio
07:08 Slack
08:32 n8n
15:25 Cron
17:36 Conclusion

🔗 Links
https://github.com/langchain-ai/reddit-radar
https://smith.langchain.com/
https://slack.com/
http://n8n.io/

⤵️ Let's Connect
https://everefficient.ai
https://nerding.io
https://twitter.com/nerding_io
https://www.linkedin.com/in/jdfiscus/
https://www.linkedin.com/company/ever-efficient-ai/

## Key Highlights

### 1. Reddit Trend Finder with LangGraph & LLMs

The system uses LangGraph to monitor Reddit trends, leveraging LLMs (such as DeepSeek) to analyze posts and comments for relevant information based on user-defined criteria, filtering a vast stream of data down to the user's interests.

### 2. Human-in-the-Loop via Slack Blocks

The system posts Reddit trends to Slack using formatted "blocks" within a single message. This structure makes the data easy to parse programmatically in n8n, enabling human-in-the-loop approval and actions on selected trends.

### 3. Automated Actions with n8n and Slack Emojis

n8n triggers automated actions (e.g., posting to Twitter threads) based on emoji reactions in Slack, creating a seamless workflow where users can act on interesting Reddit trends. DeepSeek is integrated into the n8n workflow.

### 4. Scheduled Trend Discovery with Cron Jobs

LangGraph offers cron job capabilities to run the Reddit trend finder automatically on a schedule, delivering a daily digest of relevant trends without manual intervention. The cron's configuration can override the graph's defaults.

## Summary

**Video Summary: Reddit Trend Finder with LangGraph, Slack, and n8n**

**1.
Executive Summary:** This video demonstrates how to build an automated Reddit trend analysis system using LangGraph for trend detection, Slack for human approval, and n8n for automated posting. The system lets users identify trending topics, curate content, and schedule posts to various social media platforms, driven by AI and guided by user input.

**2. Main Topics Covered:**

* **LangGraph Setup:** Configuring LangGraph to scrape Reddit data, filter posts and comments by user-defined criteria, and analyze content for trends using large language models (LLMs). The build leverages the pre-built `reddit-radar` project, customized for specific use cases.
* **Slack Integration:** Using the Slack API to post Reddit trends in a structured block format, and configuring emoji reactions (e.g., checkmark, X, bird, clock) as triggers for automated actions.
* **Human-in-the-Loop Design:** Implementing a content approval workflow within Slack, allowing users to review AI-generated trends and approve or reject them with emoji reactions.
* **n8n Automation:** Building no-code workflows in n8n that respond to Slack emoji reactions, parse Slack block data (subreddit, post ID) to re-query the Reddit API, and perform actions based on user-defined emoji triggers (e.g., posting to Twitter/X as a thread, to LinkedIn, etc.). DeepSeek is used within the n8n workflow for content summarization and enhancement.
* **Scheduled Trend Discovery:** Setting up cron jobs within LangGraph to run the Reddit trend finder on a daily schedule and push results to Slack, ensuring consistent trend monitoring without manual intervention.

**3. Key Takeaways:**

* **AI-Powered Trend Analysis:** LangGraph provides a powerful tool for automated Reddit trend identification.
* **Human Oversight Is Critical:** Integrating Slack for human-in-the-loop moderation ensures content quality and alignment with brand guidelines.
* **Automated Content Curation:** n8n enables seamless automation of posting approved trends to various platforms, saving time and effort.
* **Slack Blocks for Data Parsing:** Slack blocks provide a structured format for sending information and parsing it within n8n workflows, allowing easy extraction of relevant data.
* **DeepSeek for Improved Processing:** DeepSeek language models within n8n improve text analysis and summarization.
* **Cron Jobs for Consistency:** Cron jobs keep daily (or otherwise scheduled) tasks running for consistent content.

**4. Notable Quotes or Examples:**

* "Blocks... it's still a single message, but it almost chunks those blocks into sections in your Slack message, so it's really just blocks inside of a message, even though it compiles into one big message." - Explaining the importance of Slack blocks for structured data within messages.
* "We are going to trigger an event, and that trigger is going to be on a reaction added." - Describing the n8n trigger based on Slack emoji reactions.
* Example of using emojis (bird, thread, silhouette, clock) to trigger posting to Twitter, Threads, or LinkedIn, or to schedule for a certain time.
* Demonstration of how to configure and modify the LLM used within the n8n workflow.

**5. Target Audience:**

* Developers interested in building AI-powered automation solutions.
* Marketers looking to identify trending topics and automate content curation.
* Automation enthusiasts seeking to integrate LangGraph, Slack, and n8n.
* Anyone seeking to enhance their social media presence and content scheduling strategies.
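The Slack-blocks round trip described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the `reddit-radar` repository: the block layout, function names, and the convention of carrying the subreddit and post ID in the final context blocks are assumptions modeled on the video's description.

```python
# Hypothetical sketch of the Slack Block Kit round trip: the graph builds one
# message out of several blocks, and the n8n side later reads the Reddit
# identifiers back out of known block positions. Layout is an assumption.

def build_trend_message(title: str, summary: str, sources: list[str],
                        subreddit: str, post_id: str) -> dict:
    """Assemble one Slack message whose sections are separate blocks,
    ending with two context blocks that carry the Reddit identifiers."""
    blocks = [
        {"type": "header", "text": {"type": "plain_text", "text": title}},
        {"type": "section", "text": {"type": "mrkdwn", "text": summary}},
    ]
    for url in sources:
        blocks.append({"type": "section",
                       "text": {"type": "mrkdwn", "text": f"<{url}>"}})
    # Extra blocks so a downstream n8n workflow can re-query the Reddit API.
    blocks.append({"type": "context",
                   "elements": [{"type": "mrkdwn", "text": subreddit}]})
    blocks.append({"type": "context",
                   "elements": [{"type": "mrkdwn", "text": post_id}]})
    return {"blocks": blocks}


def extract_reddit_refs(blocks: list[dict]) -> tuple[str, str]:
    """What the n8n expressions do: read the subreddit and post ID back
    out of the last two context blocks of the message."""
    subreddit = blocks[-2]["elements"][0]["text"]
    post_id = blocks[-1]["elements"][0]["text"]
    return subreddit, post_id
```

Posting the dict from `build_trend_message` as JSON to a Slack incoming webhook renders one message with stacked sections; when a `reaction_added` event fires, the automation fetches that message and applies something like `extract_reddit_refs` to its `blocks` array.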
## Full Transcript

Hey everyone, welcome to nerding.io. I'm JD, and today we're going to look at LangGraph, Slack, and n8n to build a Reddit trend finder where you can use a human in the loop to approve what you're interested in and have it post into an automation. With that, let's go ahead and get started.

All right, first things first: we're going to look at how we can get information coming out of Reddit. Specifically, we're going to monitor some of the top subreddits, or subreddits we've chosen, and then scan the top posts and comments. We're going to use a project called Reddit Radar. This is actually from LangChain; I just forked it and made some modifications for specific things I wanted when working with it. We're going to go through what this system is and how you can get it spun up. The readme is great, so we're just going to look under the hood and see what's going on. The idea is that it goes out, finds what's trending, and then posts back into your Slack. I wanted to take this a step further and say: based on what's actually happening in these trends, take some sort of action, with that action being the human in the loop. The reason for this is that there's so much data and information coming out; I want to understand which things I'm interested in and how I can send them to others, or just make them more actionable.

The first thing we're going to do, once you clone this code down, is take a look at the graph. Inside the graph, basically what's happening is that we're using Anthropic, and we have these configurables. The configurables are our default setup; you can see they've hardcoded some values in here, but you can configure these from outside this configuration. You also have the ability to
have a user input, and that user input allows us to keep the configuration but also search for specific terms; based on that, it maintains state. I also added some things here: I want to know what the subreddit is, because I'm going to have a lot of these running, and I want to know what the Reddit post ID is. Both of these play a part in the n8n automation, and I'm going to show you how that works with Slack blocks. Then you also have your prompt file, which is basically just the instructions for how to put everything together, plus some more utilities for how to actually get the Reddit posts, when to filter, and how to compile all that information together.

Also inside the graph: once you gather all that information, you post it back to Slack. The way Slack works is that you have these things called blocks. With blocks, it's still a single message, but it chunks those blocks into sections within your Slack message; it's really just blocks inside of a message, even though it all compiles into one big message. You can have different sections in here, like the title, the LLM's take on what's actually happening across the topics, the comments being listed, and then all the sources. We can have attachments that are either links or images, and we can even put in the Reddit URL, so if we want to go directly to the Reddit post, we can. The ones I also added were, again, the subreddit and the Reddit ID, because I need to pull those out easily from the Slack message. As you can see, at the very end we take whatever we find and post it into a Slack webhook. So let's take a look at what that actually does. First things first, we're going to run through this before we send it to Slack and see how it's actually
working in our LangGraph. I've deployed it to my LangGraph Platform in the cloud, and we're just going to look at what's going on, using DeepSeek. Right now it's going through each node on the graph to look at the information and pull it back, and now it's starting to generate all the takes. It found about 20 different articles and is pulling that information back, and it's also looking at users as well as comments, again analyzing all this information. Now it's writing back to Slack, and we've completed. The other thing I want to show you is that our manual configurations are right here; you can also change these right in the UI, so if you want to change the number of comments, the number of posts, your time frame, or your subreddit, you can. Cool. Now we can look at the traces themselves, just to see what's actually happening in the background. As you can see, it's making multiple different calls, with the output associated with each as it pulls data back. Even though we're getting all this information back, we're not necessarily going to use all of it in the Reddit Slack message. So now we're going to see what we got in Slack.

Real quick, everyone: if you haven't already, please remember to like and subscribe; it helps more than you know. Also, please go check out Text Yourself. It's a simple application I built that helps keep me on track: all you do is SMS the different tasks and reminders you need sent back to yourself. With that, let's get back to it.

All right, now that we're back in Slack, you can see I have multiple different trends here: Hacker News, Product Hunt, and Reddit. What's happening is that this webhook is posting these articles, and as you
can see, these are the blocks. Even though it's one single message, we have blocks dividing up our message, and that becomes important when we want to do something in an automation. This is where the human in the loop comes in. Let's say I want to take an article like this. I could just go to the source, but I could also trigger an action: maybe I want to send this to someone, maybe I want to post it somewhere. So what I'm going to do is create another trigger automation based on the fact that I can select an emoji and have it fire an event. To do that, we're going to use n8n, and we're going to connect n8n to Slack and a bunch of our different social media integrations.

Before I actually select the emoji, I'm going to show you how to test and how I have this n8n set up. Basically, what's happening is that we're going to trigger an event, and that trigger is going to be on a reaction added. Right now, you can see I have this pinned. What we do then is get all the Slack information, take a look at all the comments, figure out which is the best comment, and then parse all that information into more readable text, because right now it's pulling in a lot, and you can see there's a lot coming from the Reddit block and the Reddit comments. This is where the blocks become important: not only are we getting the comments, but we can go into the information from the Slack message itself and see individual pieces of the blocks. If we want to specify our prompt, we can look right here and see that this text is the part we want, separated from the summary we're getting back in the Slack message. This just makes it much more manageable to
actually get all of this information. The reason we needed to make our own blocks is that in the Reddit section we need to be able to grab the subreddit and the post ID, and all of that information is coming from the Slack blocks. If you look down here, this is where we're getting the subreddit in block 8 and the post ID in block 9, and by having both of those we're able to pull from the Reddit API again. Then we're going to parse that information, and this is where we're going to use DeepSeek. In the model itself, if you use a fixed expression, you just sign up, use a normal credential, and change your base URL in here; that's essentially how you can use DeepSeek. Then we trigger whichever action we want based on the emoji: a bird goes to Twitter, a thread goes to Threads, a silhouette goes to LinkedIn, we can do a speech balloon, or we can schedule it for a time. Another thing we can do, if we want something to go out at a specific time, is use a specific emoji that tells us the time; for instance, a clock at one means you want to send this out today at 1:00 instead of waiting, or maybe even tomorrow. You can get the current time and then send it out in UTC format. The other thing you can do in n8n is set your time zone, which makes scheduling easier. The next piece is that once we trigger off these reactions, we can send the content to whichever end destination we want. So let's look at how we can test this without it being pinned. If we come in here and click "test step," we're going to overwrite our pin; it starts listening; we come in here, and we say, make a thread. So
now we have our thread. We see that we got our reaction, we'll pin that, and then we can play this all the way through, and this is where it makes the thread. As you can see, it's going through and hitting our DeepSeek model, then eventually hitting our switch, and then we'll see it come through as a thread. Most of these nodes are just chained together, putting together a single post with the thread. What we actually did was use an execute-workflow trigger, which puts together an object of tweets, and then we basically run a continuous loop: creating the first tweet, getting its ID, and passing it to the Twitter thread again. So this is essentially a loop, though you don't actually need a loop node here; the way n8n works, when it sees an array it just keeps looping through that object. Great, so now it's completed and has posted; let's see what it came up with. Cool, so here we are on our Twitter page, which I haven't been as active on (another reason I wanted to have an automation). You can see it's actually putting together a thread right here, and it was successful all the way through. Again, all we did was have LangGraph go through and find our Reddit information, we had the ability to come in as a human in the loop and decide what we wanted to do once we knew more about the information, and then we automated that through n8n.

The other thing about this: so far we've only done one example; we basically triggered it manually. What if we want to trigger this on a schedule, or a cron? Every morning when I get up, I immediately go on Slack, so this could just be part of my morning routine, so I don't even have to think about it. If we go back and look at
the API for our assistant on LangGraph, they actually have the ability to create cron jobs so you can schedule your LangGraph to run every day, every hour, whatever. What we're going to do is set this up through a Python notebook. The first thing we do is come in here and put in our URL. I just like making it automated, so I'm going to search the assistants and look for the assistant that was created by the system. If it exists, I take that ID and use it to look up the available crons. I already have a cron set up that runs every morning, and this is how I'm able to search for those crons. Then I'm going to take this and create a new cron. You can see here that I can change this around: this configuration goes back to those defaults, and we can override those defaults. You don't actually need either of these two fields; you can just use the configurable. For the configurable, we can say that maybe our subreddit is ChatGPT, and instead of looking at the default topics we want to look at different use cases: agents, maybe voice, or maybe user-provided content, which we can just put in here as, say, "voice agents." When I run this, it sets the cron up, and now I can see which crons I have available right here. I have multiple crons, and as you can see, right there is the "voice agents" one we just put in. This will run every day at 6:00 in the morning, before I get up.

All right everyone, that's it for us today. What we went through was the LangGraph Reddit trend finder, the ability to connect to Slack as a human in the loop, and then actually triggering an n8n automation. We
also added DeepSeek as part of the model we wanted to use for reasoning. With that, happy nerding!

---

*Generated for LLM consumption from nerding.io video library*