Data Processing & Analysis · Beginner
November 6, 2025 · 6 min read · Est: 30 minutes
Never Miss a Trending Post: Build Your Social Media Monitor with n8n
Automate social media monitoring with n8n. Get daily trending posts from X, Reddit, and LinkedIn delivered to your inbox and Telegram.
By Kazi Sakib

Keeping up with trending content across multiple social platforms feels like a full-time job. You're jumping between X, Reddit, and LinkedIn, scrolling endlessly, hoping to catch that one viral post in your niche before everyone else does.
What if you could automate the entire process? What if trending posts from your niche automatically landed in your inbox and Telegram every single day, neatly organized and ready to review? That's exactly what this n8n workflow does. It monitors social media platforms for trending posts about UX design (or any topic you choose), stores them in a Google Sheet, and sends you a daily digest via email and Telegram.
No more manual scrolling. No more FOMO. Just the best content, delivered on autopilot.
What You'll Need to Get Started
Before diving into the workflow, let's gather your tools. This automation relies on a few key services working together:
Prerequisites
- Apify Account: You'll use Apify actors to scrape trending posts from X, Reddit, and LinkedIn. You'll need an API token to authenticate your requests.
- Google Sheets: Your central database where all trending posts get stored with their source, URL, and timestamp.
- Gmail Account: For sending beautifully formatted HTML email digests with clickable links.
- Telegram Bot: To receive instant notifications on your phone with a plain text list of trending posts.
- n8n Instance: The automation platform that ties everything together. You can use n8n Cloud or self-host it.
Key Components
This workflow uses several n8n nodes that handle different parts of the automation:
- Manual Trigger: Kicks off the workflow whenever you want (or swap in a Schedule Trigger node for daily automation).
- HTTP Request Nodes: Three separate nodes that call Apify APIs to scrape X, Reddit, and LinkedIn.
- Edit Fields Nodes: Transform the data from each platform into a consistent format.
- Google Sheets Nodes: Append new posts to your spreadsheet and retrieve all stored posts.
- Code Nodes: Format the data into HTML for email and plain text for Telegram.
- Gmail Node: Sends the formatted email digest.
- Telegram Node: Delivers the mobile-friendly notification.
Building Your Social Media Monitor Step by Step
Step 1: Scrape Trending Posts from Multiple Platforms
The workflow starts by fetching trending posts from three social platforms simultaneously. Each HTTP Request node calls a specific Apify actor with customized parameters.
For X (formerly Twitter), the workflow searches for posts with at least 50 likes and 10 retweets containing your keyword. For Reddit, it pulls the top 5 posts from the past day. For LinkedIn, it scrapes posts from the last 24 hours using a search URL. All three requests run in parallel, so the whole scraping stage finishes quickly.
The beauty here is flexibility. Want to track "AI tools" instead of "UX design"? Just change the search query. Need more posts? Adjust the limits. The Apify actors handle all the heavy lifting of navigating these platforms and extracting clean data.
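To make the HTTP Request setup concrete, here is a sketch of the JSON body such a node might POST to an Apify actor for X. The parameter names (searchTerms, minimumFavorites, minimumRetweets, maxItems) and the endpoint shape are assumptions for illustration; check the input schema of the actor you actually use in the Apify console.

```javascript
// Hypothetical input body for an Apify X-scraper actor, mirroring the
// thresholds described above (>= 50 likes, >= 10 retweets).
const xScrapeInput = {
  searchTerms: ["UX design"],  // swap in your own niche keyword
  minimumFavorites: 50,        // at least 50 likes
  minimumRetweets: 10,         // at least 10 retweets
  maxItems: 5,                 // raise this if you want more posts
};

// The HTTP Request node would POST this JSON to an endpoint like:
// https://api.apify.com/v2/acts/<ACTOR_ID>/run-sync-get-dataset-items?token=<APIFY_TOKEN>
const requestBody = JSON.stringify(xScrapeInput);
```

Changing the niche or the volume of results is then just a matter of editing this one object in each node.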
Step 2: Standardize the Data Structure
Each social platform returns data in its own unique format. X gives you a timestamp field, Reddit calls it createdAt, and LinkedIn uses postedAtISO. To make everything work together, you need consistency.
Three Edit Fields nodes transform each platform's data into a uniform structure with three fields: url, timestamp, and Source. This standardization makes the downstream processing seamless. Every post now speaks the same language, regardless of where it came from.
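In n8n you'd do this with the Edit Fields (Set) node's drag-and-drop mappings, but the same transform can be sketched in a few lines of Code-node JavaScript. The raw field names below (timestamp, createdAt, postedAtISO) match the ones mentioned above; any others are assumptions you should verify against what your Apify actors actually return.

```javascript
// Map each platform's raw item onto the shared { url, timestamp, Source }
// shape used by the rest of the workflow.
function standardize(item, source) {
  const fieldMap = {
    X:        { url: "url", timestamp: "timestamp"   },
    Reddit:   { url: "url", timestamp: "createdAt"   },
    LinkedIn: { url: "url", timestamp: "postedAtISO" },
  };
  const map = fieldMap[source];
  return {
    url: item[map.url],
    timestamp: item[map.timestamp],
    Source: source,
  };
}

// Example: a Reddit item before and after standardization.
const redditPost = {
  url: "https://reddit.com/r/UXDesign/abc",
  createdAt: "2025-11-06T08:00:00Z",
};
const row = standardize(redditPost, "Reddit");
```

After this step, every downstream node can treat posts identically, no matter which platform produced them.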
Step 3: Store Everything in Google Sheets
All those standardized posts flow into a single Google Sheets node that appends them to your spreadsheet. Think of this sheet as your content database. Every trending post gets logged with its source platform, direct link, and exact timestamp.
Why use a spreadsheet? Because it gives you instant visibility. You can open it anytime to see your growing collection of trending content. Plus, it creates a historical record. Want to analyze what types of posts trended last month? Your sheet has the answers.
After appending the new posts, another Google Sheets node retrieves all rows from the spreadsheet. This becomes your master list that gets formatted and sent out.
Step 4: Format for Multiple Channels
Now comes the creative part. Two Code nodes transform your spreadsheet data into channel-specific formats.
The first Code node creates an HTML version perfect for email. It builds a numbered list where each post URL becomes a clickable link. The formatting uses proper HTML tags with line breaks, making it visually appealing and easy to navigate in your inbox.
The second Code node generates a plain text version optimized for Telegram. It creates the same numbered list but uses simple newline characters instead of HTML. This ensures your mobile notification looks clean and readable.
Same data, two different presentations, each tailored to how you'll actually consume it.
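The two Code nodes can be sketched as a pair of small formatting functions. This is a minimal version under the assumption that each sheet row carries the url and Source columns described earlier; your actual Code nodes may add headers, dates, or styling on top of this.

```javascript
// HTML digest for the Gmail node: numbered list with clickable links.
function buildHtml(rows) {
  const items = rows
    .map((r, i) => `${i + 1}. <a href="${r.url}">${r.url}</a> (${r.Source})`)
    .join("<br>");
  return `<p>Today's trending posts:</p><p>${items}</p>`;
}

// Plain-text digest for the Telegram node: same list, newline-separated.
function buildPlain(rows) {
  return rows.map((r, i) => `${i + 1}. ${r.url} (${r.Source})`).join("\n");
}

const rows = [
  { url: "https://x.com/post/1", Source: "X" },
  { url: "https://reddit.com/r/ux/2", Source: "Reddit" },
];
const htmlDigest = buildHtml(rows);
const plainDigest = buildPlain(rows);
```

Keeping the two formatters side by side makes it easy to keep the channels in sync when you later tweak what each digest includes.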
Step 5: Deliver Your Daily Digest
The final stage splits into two parallel deliveries. The Gmail node takes that beautifully formatted HTML and sends it to your email address with the subject line "Trending Posts." Open your inbox, and there's your curated list of viral content, ready to click through and explore.
Simultaneously, the Telegram node sends the plain text version to your chat ID. Whether you're commuting, in a meeting, or away from your desk, you get an instant notification with all the trending posts right on your phone.
Two touchpoints, zero effort required from you.
Why This Workflow Changes the Game
This social media monitoring workflow solves a genuine pain point for content creators, marketers, researchers, and anyone who needs to stay on top of industry trends.
Imagine you're a UX designer trying to understand what the community is buzzing about. Instead of spending 30 minutes each morning checking three different platforms, you get a single, consolidated digest. That's 3.5 hours saved every week, time you can invest in creating your own content or actually doing your work.
Best of all, it runs on autopilot. Add a Schedule Trigger node at the beginning, set it to run daily at 8 AM, and you've got a personal content curator working for you 24/7. Wake up every morning to fresh trending posts, delivered exactly how you want them.
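If you use the Schedule Trigger node's custom cron mode, "daily at 8 AM" is this five-field expression (minute, hour, day-of-month, month, day-of-week):

```javascript
// Cron expression for the Schedule Trigger node: run every day at 08:00
// in the timezone configured on your n8n instance.
const dailyAt8am = "0 8 * * *";
```

Paste it into the node's cron field, and the Manual Trigger becomes optional.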
Your Next Steps
The hardest part of building automation is actually starting. You've got the blueprint now. Clone this workflow in your n8n instance, plug in your API credentials, and run it once to see the magic happen.
Start with one platform if three feels overwhelming. Get comfortable with how the data flows. Then expand. The beauty of n8n is that you can build incrementally, testing each node as you go.
Soon, you'll wonder how you ever kept up with social media trends manually. Because once you experience having the internet's best content delivered to you automatically, there's no going back.