LinkedIn Auto is a closed-loop system: writing in Notion → automatic Monday publication → stats retrieval → pattern analysis → improvement for the next post.
The founding principle: automate friction, not creation. Writing stays 100% manual. Everything mechanical — publishing, timing, tracking — is delegated.
The problem
For 3 months, I published manually every Monday. The text was always ready. But the final step — opening LinkedIn, pasting, formatting, scheduling — cost more than expected.
Not in time. In mental friction.
Opening LinkedIn to publish also means scrolling. Reading. Replying. Checking stats. 15 minutes later, the post still hasn’t gone out.
I automated that step. Not the writing.
The 3 workflows
Workflow A — Auto Publisher
The post is written in Notion. Once it's marked "Ready", there's nothing else to do.
Monday at 8:25am, the system checks if there’s a ready post. If there is, it’s automatically published via the LinkedIn API. The Notion entry is updated with “Published” status and the post URN. I get a Telegram confirmation.
If nothing is ready: Telegram alert. No silence, no forgetting.
Cron Monday 08:25
↓
Query Notion Editorial Calendar
(Status = "Ready", sorted by PlannedDate, limit 1)
↓
Post found?
No → Telegram alert "Nothing to publish this week"
Yes → POST /rest/posts (LinkedIn API)
↓
Update Notion (Status → "Published", LinkedInPostURN, PublishedAt)
↓
Telegram confirmation
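The publish step above boils down to one authenticated POST. A minimal sketch of how the request to `/rest/posts` could be assembled — field and header names follow LinkedIn's versioned Posts API, but the exact version string and helper function are illustrative, not the system's actual code:

```python
def build_publish_request(author_urn: str, commentary: str, access_token: str,
                          api_version: str = "202401") -> tuple[dict, dict]:
    """Assemble headers and body for POST https://api.linkedin.com/rest/posts.

    author_urn and api_version are placeholders; the real workflow pulls the
    post text (commentary) from the Notion Editorial Calendar entry.
    """
    headers = {
        "Authorization": f"Bearer {access_token}",
        "LinkedIn-Version": api_version,          # required versioning header
        "X-Restli-Protocol-Version": "2.0.0",
        "Content-Type": "application/json",
    }
    body = {
        "author": author_urn,                     # e.g. "urn:li:person:<id>"
        "commentary": commentary,                 # the post text from Notion
        "visibility": "PUBLIC",
        "distribution": {
            "feedDistribution": "MAIN_FEED",
            "targetEntities": [],
            "thirdPartyDistributionChannels": [],
        },
        "lifecycleState": "PUBLISHED",
        "isReshareDisabledByAuthor": False,
    }
    return headers, body
```

On success, LinkedIn returns the new post's URN in a response header, which is what gets written back to Notion as `LinkedInPostURN`.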
Workflow B — Stats Collector
A Telegram command triggers retrieval of all my published post stats via LinkedIn’s Member Post Analytics API. The data lands in Notion, and a report analyzed by GPT-4o-mini is sent to Telegram.
Telegram "/stats"
↓
Query Notion (Status = "Published", LinkedInPostURN not empty)
↓
For each post:
GET /rest/posts/{postUrn}/statistics
↓
EngagementRate = (likes + comments×3 + shares×5 + saves×4) / impressions × 100
↓
Update Notion (Impressions, Likes, Comments, Shares, EngagementRate, StatsUpdatedAt)
↓
Aggregate → GPT Analysis
(patterns: post type, length, hook, concept used, time, day of publication)
↓
Telegram Report
(Top 3 posts · Bottom 3 · Detected patterns · Recommendation for next post)
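The weighted engagement formula from the pipeline above is straightforward to express as a function. This is a direct transcription of the formula in the diagram; the guard against zero impressions is my own addition:

```python
def engagement_rate(likes: int, comments: int, shares: int, saves: int,
                    impressions: int) -> float:
    """Weighted EngagementRate, per Workflow B:
    (likes + comments*3 + shares*5 + saves*4) / impressions * 100
    """
    if impressions == 0:
        return 0.0  # defensive: avoid division by zero on fresh posts
    weighted = likes + comments * 3 + shares * 5 + saves * 4
    return weighted * 100 / impressions

# A post with 40 likes, 10 comments, 2 shares, 5 saves, 1000 impressions:
# (40 + 30 + 10 + 20) / 1000 * 100 → 10.0
```

The weights encode an assumption the protocol can later test: a comment signals more than a like, and a share or save more than a comment.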
Workflow C — Creator Analyzer
On a Telegram command, this workflow analyzes the last 10 posts from any public LinkedIn account via Apify. GPT-4o extracts implicit writing rules, hook patterns, and format/engagement correlations. Useful for benchmarking other creators or understanding what performs in a given space.
Telegram "@account_name"
↓
Apify API — Start Job (LinkedIn Profile Posts scraper, max 10 posts)
↓
Poll until status SUCCEEDED
↓
Get Results (text + likes + comments + date for each post)
↓
GPT Analysis
(implicit writing rules, hook patterns, format/engagement correlations,
recurring formulas, what can be reused)
↓
Telegram Report
(5-7 extracted rules · examples from posts · engagement score per post)
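The "poll until SUCCEEDED" step is a classic bounded polling loop. A sketch with an injectable status getter so it can be tested without hitting Apify; the terminal state names match Apify's documented run statuses, but the interval and attempt budget are arbitrary choices here:

```python
import time
from typing import Callable

def poll_until_succeeded(get_status: Callable[[], str],
                         interval_s: float = 2.0,
                         max_attempts: int = 30) -> str:
    """Poll an Apify-style run until it reaches a terminal state.

    get_status would wrap GET /v2/actor-runs/{runId} in the real workflow;
    here it is injected so the loop itself stays testable.
    """
    for _ in range(max_attempts):
        status = get_status()
        if status == "SUCCEEDED":
            return status
        if status in ("FAILED", "ABORTED", "TIMED-OUT"):
            raise RuntimeError(f"Scrape run ended with status {status}")
        time.sleep(interval_s)  # back off before re-checking
    raise TimeoutError("Run did not finish within the polling budget")
```

Once the run succeeds, the dataset items (text, likes, comments, date per post) are fetched and handed to the GPT analysis step.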
WRITING_RULES protocol — the real value
Automatic publishing is the visible part. The core of the system is the learning loop built around it:
Each post tests a single variable — one isolated editorial rule. After 3 posts with that rule, it’s validated, invalidated, or extended to 5 posts if the signal is ambiguous. Validated rules enter a permanent repository that grows with each publication.
Rules currently being tested: polarizing hook, final punch line, contrast-figure hook.
This protocol avoids Goodhart’s Law: the goal isn’t to maximize EngagementRate — it’s to understand what makes it vary, and why. The metric stays a learning tool, never an end in itself.
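The validate/invalidate/extend decision can be sketched as a small function. The document doesn't specify how "ambiguous signal" is measured, so the per-post metric (EngagementRate delta vs. baseline) and the threshold are assumptions for illustration:

```python
def rule_verdict(deltas: list[float], threshold: float = 0.5) -> str:
    """Decide a rule's fate after its test window.

    deltas: assumed to be each test post's EngagementRate difference vs. a
    baseline. The 3-post window, 5-post extension, and verdict names come
    from the protocol; the threshold value is a hypothetical choice.
    """
    if len(deltas) < 3:
        return "testing"                # window not complete yet
    avg = sum(deltas) / len(deltas)
    if avg > threshold:
        return "validated"
    if avg < -threshold:
        return "invalidated"
    # ambiguous signal: extend to 5 posts, then call it inconclusive
    return "extended" if len(deltas) < 5 else "inconclusive"
```

The point of keeping the verdict rule explicit is the anti-Goodhart stance above: the metric feeds a decision procedure about *why* engagement moves, not a number to maximize.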
Page /analytics — everything is public
All stats, the publication calendar, and the improvement timeline are made public on the /analytics page of this site — automatically updated after each sync.
What’s there:
- Publication calendar — monthly grid with each post’s status
- Global stats — total impressions, average EngagementRate, top post
- Post ranking — sorted by performance, with type, applied rules and stats
- Improvement timeline — each validated or invalidated rule with its measured impact
Transparency about your own numbers is a differentiating position. Recruiters, clients, and co-founders see the progression in real time. The improvement timeline is content in itself: it shows how a system improves through measured iterations.
Why not automate the writing?
Because it’s the only part I find genuinely useful — for myself first.
The real process: a topic surfaces from a podcast, a conversation, something read. I brainstorm with AI by voice — it feels like a debate, not writing. We find the angle, challenge the intuition, look for sources that corroborate or contradict. At the end, there’s a structured, sourced post with an angle that doesn’t look like what’s already circulating.
AI is in the loop, but it doesn’t drive. It helps dig, not fill.
Two reasons to keep my hand on the wheel:
The angle — AI proposes angles everyone already proposes. Topics everyone has an opinion on, framed the way everyone frames them. A recognizable post has an angle AI wouldn’t have produced alone — a personal observation, a counterpoint, a connection between two things that have no apparent relation.
The voice — A generated post looks like a generated post. What people read on LinkedIn is lived experience, real details, a way of thinking they recognize. That’s built through iterations, not prompts.
There’s a simpler reason too: I write for myself first. Going deep on a topic, finding the sources, discovering what holds up — that’s how I learn. The post is the byproduct. The research is the value.
Eventually, when there’s enough text for a model to genuinely recognize the voice and angles, automating the writing becomes an open question again. Not before.
Stack
| Component | Technology | Role |
|---|---|---|
| Orchestration | n8n cloud | Workflows A + B + C |
| Publishing | LinkedIn API v2 (w_member_social) | POST /rest/posts |
| Stats | LinkedIn Community Management API | Member Post Analytics |
| Data source | Notion API | Editorial Calendar |
| Notifications | Telegram Bot API | Confirmations + alerts + reports |
| Pattern analysis | GPT-4o-mini | Weekly stats report |
| Creator analysis | Apify (harvestapi/linkedin-profile-posts) | Workflow C |
| Analytics page | Astro 5 + Notion API read-only | Public stats rendering |
| Deployment | Vercel | Webhook-triggered rebuild after stats sync |
Status
| Component | Status |
|---|---|
| Workflow A — Auto Publisher | ✅ Live |
| Workflow B — Stats Collector | ✅ Live |
| Workflow C — Creator Analyzer | 🔜 Coming soon |
| WRITING_RULES | ✅ Active — 3 rules in testing |
| /analytics page | 🔜 In progress |
Built by Thomas Silliard — live since April 2026.