@lucasygu/redbook 0.1.13 → 0.1.15

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +65 -13
  2. package/SKILL.md +386 -121
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -11,6 +11,8 @@
  > **"帮我安装 `@lucasygu/redbook` 这个小红书 CLI 工具,然后运行 `redbook whoami` 验证是否能正常连接。GitHub 地址:https://github.com/lucasygu/redbook"**
  >
  > AI 会自动完成安装、验证连接、处理可能的 Cookie 问题。你只需要确保已在 Chrome 中登录 xiaohongshu.com。
+ >
+ > 安装完成后,试试:**"帮我分析'AI编程'这个话题在小红书上的竞争格局"** —— AI 会自动搜索关键词、分析互动数据、发现头部博主、给出内容建议。

  ## 安装

@@ -22,6 +24,16 @@ npm install -g @lucasygu/redbook

  安装后运行 `redbook whoami` 验证连接。如果遇到 macOS 钥匙串弹窗,请点击"始终允许"。CLI 会自动检测所有 Chrome 配置文件,找到你的小红书登录状态。

+ ## 能做什么
+
+ - **话题研究** —— 搜索关键词,分析哪些话题有流量、哪些是蓝海
+ - **竞品分析** —— 找到头部博主,对比粉丝量、互动数据、内容风格
+ - **爆款拆解** —— 分析爆款笔记的标题钩子、互动比例、评论主题
+ - **内容策划** —— 基于数据发现内容机会,生成有数据支撑的选题建议
+ - **受众洞察** —— 从互动信号推断目标用户画像
+
+ 通过 AI 助手使用时,这些工作流可以自动串联完成。直接使用 CLI 时,每个命令也可以独立运行。
+
  ## 快速开始

  ```bash
@@ -66,7 +78,7 @@ redbook post --title "测试" --body "..." --images img.png --private
  | `user <userId>` | 查看用户资料 |
  | `user-posts <userId>` | 列出用户所有笔记 |
  | `feed` | 获取推荐页内容 |
- | `post` | 发布图文笔记 |
+ | `post` | 发布图文笔记(易触发验证码,详见下方说明) |
  | `topics <关键词>` | 搜索话题/标签 |
  | `analyze-viral <url>` | 分析爆款笔记(钩子、互动、结构) |

@@ -86,7 +98,15 @@ redbook post --title "测试" --body "..." --images img.png --private
  | `--type <类型>` | `all`(全部)、`video`(视频)、`image`(图文) | `all` |
  | `--page <页码>` | 页码 | `1` |

- ### 发布选项
+ ### 分析选项(analyze-viral)
+
+ | 选项 | 说明 | 默认值 |
+ |------|------|--------|
+ | `--comment-pages <n>` | 获取评论页数 | `3` |
+
+ ### 发布选项(post)
+
+ 发布功能目前**容易触发验证码**(type=124)。图片上传正常,但发布步骤经常被拦截。如需发布笔记,建议使用浏览器自动化。

  | 选项 | 说明 |
  |------|------|
@@ -118,13 +138,19 @@ redbook post --title "测试" --body "..." --images img.png --private
  安装后自动注册为 Claude Code 技能。在 Claude Code 中使用 `/redbook` 命令:

  ```
- /redbook search "AI编程" # 搜索笔记
- /redbook read <url> # 阅读笔记
- /redbook user <userId> # 查看博主
- /redbook analyze <userId> # 完整博主分析
+ /redbook search "AI编程" # 搜索笔记
+ /redbook read <url> # 阅读笔记
+ /redbook user <userId> # 查看博主
+ /redbook analyze-viral <url> # 分析爆款笔记
  ```

- Claude 会自动调用 CLI 命令,解析结果,完成竞品分析、话题研究等复杂任务。
+ 技能内置了小红书平台专属的分析模块 —— 关键词矩阵、跨话题热力图、互动信号分类、博主画像、内容机会评分等。你可以直接用自然语言下达复杂任务:
+
+ - *"分析'AI编程'在小红书的竞争格局,找出蓝海关键词"*
+ - *"对比这三个博主的内容策略和互动数据"*
+ - *"拆解这篇爆款笔记,告诉我为什么火了"*
+
+ Claude 会自动组合多个命令,解析 JSON 数据,输出结构化分析报告。

  ## 编程接口

@@ -168,6 +194,8 @@ A fast CLI tool for [Xiaohongshu (小红书 / RED)](https://www.xiaohongshu.com)
  > **"Install the `@lucasygu/redbook` Xiaohongshu CLI tool and run `redbook whoami` to verify it works. Repo: https://github.com/lucasygu/redbook"**
  >
  > The agent will handle installation, verify the connection, and troubleshoot any cookie issues. Just make sure you're logged into xiaohongshu.com in Chrome first.
+ >
+ > Once installed, try: **"Analyze the competitive landscape for 'AI编程' on Xiaohongshu"** — the agent will search keywords, analyze engagement data, profile top creators, and suggest content opportunities.

  ## Install

@@ -179,6 +207,16 @@ Requires Node.js >= 22. Uses cookies from your Chrome browser session — you mu

  After installing, run `redbook whoami` to verify the connection. If macOS shows a Keychain prompt, click "Always Allow". The CLI auto-detects all Chrome profiles to find your XHS session.

+ ## What You Can Do
+
+ - **Topic research** — Search keywords, analyze which topics have demand vs. gaps
+ - **Competitive analysis** — Find top creators, compare followers, engagement, content style
+ - **Viral note breakdown** — Analyze title hooks, engagement ratios, comment themes
+ - **Content planning** — Discover content opportunities with data-backed topic suggestions
+ - **Audience insights** — Infer target audience from engagement signals
+
+ When used through an AI agent, these workflows chain together automatically. Each CLI command also works standalone.
+
  ## Quick Start

  ```bash
@@ -223,7 +261,7 @@ redbook post --title "测试" --body "..." --images img.png --private
  | `user <userId>` | Get user profile info |
  | `user-posts <userId>` | List a user's posted notes |
  | `feed` | Get homepage feed |
- | `post` | Publish an image note |
+ | `post` | Publish an image note (captcha-prone, see below) |
  | `topics <keyword>` | Search for topics/hashtags |
  | `analyze-viral <url>` | Analyze why a viral note works (hooks, engagement, structure) |

@@ -243,8 +281,16 @@ redbook post --title "测试" --body "..." --images img.png --private
  | `--type <type>` | `all`, `video`, `image` | `all` |
  | `--page <n>` | Page number | `1` |

+ ### Analyze-Viral Options
+
+ | Option | Description | Default |
+ |--------|-------------|---------|
+ | `--comment-pages <n>` | Number of comment pages to fetch | `3` |
+
  ### Post Options

+ Publishing **frequently triggers captcha** (type=124). Image upload works, but the publish step is unreliable. For posting, consider using browser automation instead.
+
  | Option | Description |
  |--------|-------------|
  | `--title <title>` | Note title (required) |
@@ -275,13 +321,19 @@ redbook post --title "测试" --body "..." --images img.png --private
  Installs automatically as a Claude Code skill. Use `/redbook` in Claude Code:

  ```
- /redbook search "AI编程" # Search notes
- /redbook read <url> # Read a note
- /redbook user <userId> # Creator profile
- /redbook analyze <userId> # Full creator analysis
+ /redbook search "AI编程" # Search notes
+ /redbook read <url> # Read a note
+ /redbook user <userId> # Creator profile
+ /redbook analyze-viral <url> # Analyze a viral note
  ```

- Claude will call CLI commands, parse results, and handle complex workflows like competitive analysis and topic research.
+ The skill includes XHS-native analysis modules — keyword engagement matrix, cross-topic heatmaps, engagement signal classification, creator profiling, opportunity scoring, and more. You can give natural language instructions for complex tasks:
+
+ - *"Analyze the competitive landscape for 'AI编程' on Xiaohongshu and find blue ocean keywords"*
+ - *"Compare the content strategies of these three creators"*
+ - *"Break down this viral note and tell me why it worked"*
+
+ Claude will automatically combine multiple commands, parse JSON data, and produce structured analysis reports.

  ## Programmatic Usage

package/SKILL.md CHANGED
@@ -16,11 +16,7 @@ Use the `redbook` CLI to search notes, read content, analyze creators, and resea
  /redbook analyze <userId> # Full creator analysis (profile + posts)
  ```

- ## Instructions
-
- When this command is invoked, determine the user's intent and run the appropriate CLI command(s).
-
- ### Quick Reference
+ ## Quick Reference

  | Intent | Command |
  |--------|---------|
@@ -34,13 +30,329 @@ When this command is invoked, determine the user's intent and run the appropriat
  | Analyze viral note | `redbook analyze-viral <url> --json` |
  | Check connection | `redbook whoami` |

- ### Always Use `--json`
+ **Always add `--json`** when parsing output programmatically. Without it, output is human-formatted text.
+
+ ---
+
+ ## XHS Platform Signals
+
+ XHS is not Twitter or Instagram. These platform-specific engagement ratios reveal content type and audience behavior.
+
+ ### Collect/Like Ratio (`collected_count / liked_count`)
+
+ XHS's "collect" (收藏) is a save-for-later mechanic — users build personal reference libraries. This ratio is the strongest signal of content utility.
+
+ | Ratio | Classification | Meaning |
+ |-------|---------------|---------|
+ | >40% | 工具型 (Reference) | Tutorial, checklist, template — users bookmark for reuse |
+ | 20–40% | 认知型 (Insight) | Thought-provoking but not saved for later |
+ | <20% | 娱乐型 (Entertainment) | Consumed and forgotten — engagement is passive |
+
+ ### Comment/Like Ratio (`comment_count / liked_count`)
+
+ Measures how much a note triggers conversation.
+
+ | Ratio | Classification | Meaning |
+ |-------|---------------|---------|
+ | >15% | 讨论型 (Discussion) | Debate, sharing experiences, asking questions |
+ | 5–15% | 正常互动 (Normal) | Typical engagement pattern |
+ | <5% | 围观型 (Passive) | Users like but don't engage further |
+
+ ### Share/Like Ratio (`share_count / liked_count`)

- Always add `--json` to commands when you need to parse the output programmatically. The JSON output is structured and reliable. Without `--json`, output is human-formatted text.
+ Measures social currency: whether users share to signal identity or help others.

- ### Command Details
+ | Ratio | Meaning |
+ |-------|---------|
+ | >10% | 社交货币 — people share to signal taste, identity, or help friends |
+ | <10% | Content consumed individually, not forwarded |
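Taken together, the three ratio tables above amount to one small decision rule. A minimal sketch, assuming the counts have already been parsed to plain numbers; `classifyNote` and its field names are illustrative, not part of the CLI:

```typescript
// Classify a note from raw interaction counts using the threshold
// tables above. All names here are illustrative, not CLI output.
interface InteractCounts {
  liked: number;
  collected: number;
  comments: number;
  shares: number;
}

function classifyNote(c: InteractCounts) {
  const collectRatio = c.liked > 0 ? c.collected / c.liked : 0;
  const commentRatio = c.liked > 0 ? c.comments / c.liked : 0;
  const shareRatio = c.liked > 0 ? c.shares / c.liked : 0;

  return {
    // Collect/Like: content utility
    utility:
      collectRatio > 0.4 ? "工具型" : collectRatio >= 0.2 ? "认知型" : "娱乐型",
    // Comment/Like: conversation trigger
    conversation:
      commentRatio > 0.15 ? "讨论型" : commentRatio >= 0.05 ? "正常互动" : "围观型",
    // Share/Like: social currency if shared by more than 10% of likers
    socialCurrency: shareRatio > 0.1,
  };
}
```

For example, a note with 10,000 likes, 4,500 collects, and 800 comments classifies as 工具型 with 正常互动.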
+
+ ### Search Sort Semantics
+
+ | Sort | What It Reveals |
+ |------|----------------|
+ | `--sort popular` | Proven ceiling — the best a keyword can do |
+ | `--sort latest` | Content velocity — how much is being posted now |
+ | `--sort general` | Algorithm-weighted blend (default) |
+
+ ### Content Form Dynamics
+
+ | Form | Tendency |
+ |------|----------|
+ | 图文 (image-text, `type: "normal"`) | Higher collect rate — users save reference content |
+ | 视频 (video, `type: "video"`) | Higher like rate — easier to consume passively |
+
+ ---
+
+ ## Analysis Modules
+
+ Each module is a composable building block. Combine them for different analysis depths.
+
+ ### Module A: Keyword Engagement Matrix
+
+ **Answers:** Which keywords have the highest engagement ceiling? Which are saturated vs. underserved?
+
+ **Commands:**
+ ```bash
+ redbook search "keyword1" --sort popular --json
+ redbook search "keyword2" --sort popular --json
+ # Repeat for each keyword in your list
+ ```
+
+ **Fields to extract** from each result's `items[]`:
+ - `items[].note_card.interact_info.liked_count` — likes (may use Chinese numbers: "1.5万" = 15,000)
+ - `items[].note_card.interact_info.collected_count` — collects
+ - `items[].note_card.interact_info.comment_count` — comments
+ - `items[].note_card.user.nickname` — author
+
+ **How to interpret:**
+ - **Top1 ceiling** = `items[0]` likes — the best-performing note for this keyword. This is the proven demand signal.
+ - **Top10 average** = mean likes across `items[0..9]` — how well an average top note does.
+ - A high Top1 but low Top10 avg means one outlier dominates; hard to compete.
+ - A high Top10 avg means consistent demand; easier to break in.
+
+ **Output:** Keyword × engagement table ranked by Top1 ceiling.
+
+ | Keyword | Top1 Likes | Top10 Avg | Top1 Collects | Collect/Like |
+ |---------|-----------|-----------|---------------|-------------|
+ | keyword1 | 12,000 | 3,200 | 5,400 | 45% |
+ | keyword2 | 8,500 | 4,100 | 1,200 | 14% |
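The Top1/Top10 interpretation above can be sketched as a tiny aggregator. This assumes the `liked_count` values have already been parsed to numbers and are in the order the API returned them; `keywordCeiling` and the 3x outlier cutoff are illustrative choices, not CLI output:

```typescript
// Module A aggregation sketch: Top1 ceiling and Top10 average for
// one keyword, given already-parsed like counts in API order.
function keywordCeiling(likes: number[]) {
  const top = likes.slice(0, 10);
  const top1 = top[0] ?? 0;
  const top10Avg =
    top.length > 0 ? top.reduce((a, b) => a + b, 0) / top.length : 0;
  // A Top1 far above the Top10 average suggests one outlier dominates
  // the keyword; the 3x cutoff here is an assumed heuristic.
  return { top1, top10Avg, outlierDominated: top1 > 3 * top10Avg };
}
```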
+
+ ---
+
+ ### Module B: Cross-Topic Heatmap
+
+ **Answers:** Which topic × scene intersections have demand? Where are the content gaps?
+
+ **Commands:**
+ ```bash
+ # Combine base topic with scene/angle keywords
+ redbook search "base topic + scene1" --sort popular --json
+ redbook search "base topic + scene2" --sort popular --json
+ redbook search "base topic + scene3" --sort popular --json
+ ```
+
+ **Fields to extract:** Same as Module A — Top1 `liked_count` for each combination.
+
+ **How to interpret:**
+ - High Top1 = proven demand for this intersection
+ - Zero or very low results = content gap (opportunity or no demand — check if the combination makes sense)
+ - Compare across scenes to find which angles resonate most with the base topic
+
+ **Output:** Base × Scene heatmap.
+
+ ```
+ scene1 scene2 scene3 scene4
+ base topic ████ 8K ██ 2K ████ 12K ░░ 200
+ ```
+
+ ---
+
+ ### Module C: Engagement Signal Analysis
+
+ **Answers:** What type of content is each keyword? Reference, insight, or entertainment?
+
+ **Commands:** Use search results from Module A, or for a single note:
+ ```bash
+ redbook analyze-viral "<noteUrl>" --json
+ ```
+
+ **Fields to extract:**
+ - From search results: compute ratios from `interact_info` fields
+ - From `analyze-viral`: use pre-computed `engagement.collectToLikeRatio`, `engagement.commentToLikeRatio`, `engagement.shareToLikeRatio`
+
+ **How to interpret:** Apply the ratio benchmarks from [XHS Platform Signals](#xhs-platform-signals) above.
+
+ **Output:** Per-keyword or per-note classification.
+
+ | Keyword | Collect/Like | Comment/Like | Type |
+ |---------|-------------|-------------|------|
+ | keyword1 | 45% | 8% | 工具型 + 正常互动 |
+ | keyword2 | 12% | 22% | 娱乐型 + 讨论型 |
+
+ ---
+
+ ### Module D: Creator Discovery & Profiling
+
+ **Answers:** Who are the key creators in this niche? What are their strategies?
+
+ **Commands:**
+ ```bash
+ # 1. Collect unique user_ids from search results across keywords
+ # Extract from items[].note_card.user.user_id
+
+ # 2. For each creator:
+ redbook user "<userId>" --json
+ redbook user-posts "<userId>" --json
+ ```
+
+ **Fields to extract:**
+ - From `user`: `interactions[]` where `type === "fans"` → follower count
+ - From `user-posts`: `notes[].interact_info.liked_count` for all posts → compute avg, median, max
+ - From `user-posts`: `notes[].display_title` → content patterns, posting frequency
+
+ **How to interpret:**
+ - **Avg vs. Median likes:** Large gap means viral outliers inflate the average. Median is the "true" baseline.
+ - **Max / Median ratio:** >5× means they've had breakout hits. Study those notes specifically.
+ - **Post frequency:** Count notes to estimate posting cadence. Prolific creators (>3/week) vs. quality-focused (<1/week).
+
+ **Output:** Creator comparison table.
+
+ | Creator | Followers | Avg Likes | Median | Max | Posts | Style |
+ |---------|----------|-----------|--------|-----|-------|-------|
+ | @creator1 | 12万 | 3,200 | 1,800 | 45,000 | 89 | Tutorial |
+ | @creator2 | 5.4万 | 8,100 | 6,500 | 22,000 | 34 | Story |
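The avg-vs-median and max/median reasoning above, as a hedged sketch (assumes the per-post like counts are already plain numbers; `creatorBaseline` is an illustrative name, not CLI output):

```typescript
// Creator baseline sketch (Module D): a large avg/median gap means
// viral outliers inflate the average; median is the truer baseline.
function creatorBaseline(likes: number[]) {
  const sorted = [...likes].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const median =
    sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
  const avg = sorted.reduce((a, b) => a + b, 0) / sorted.length;
  const max = sorted[sorted.length - 1];
  // Max/median > 5x = breakout hits worth studying individually.
  return { avg, median, max, hasBreakouts: median > 0 && max / median > 5 };
}
```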
+
+ ---
+
+ ### Module E: Content Form Breakdown
+
+ **Answers:** Do image-text or video notes perform better for this topic?
+
+ **Commands:**
+ ```bash
+ redbook search "keyword" --type image --sort popular --json
+ redbook search "keyword" --type video --sort popular --json
+ ```
+
+ **Fields to extract:**
+ - Compare Top1 and Top10 avg `liked_count` and `collected_count` between the two result sets
+ - Note the `type` field: `"normal"` = image-text, `"video"` = video
+
+ **Output:** Form × engagement table.
+
+ | Form | Top1 Likes | Top10 Avg | Collect/Like |
+ |------|-----------|-----------|-------------|
+ | 图文 | 8,000 | 2,400 | 42% |
+ | 视频 | 15,000 | 5,100 | 18% |
+
+ ---
+
+ ### Module F: Opportunity Scoring
+
+ **Answers:** Which keywords should I target? Where is the best effort-to-reward ratio?
+
+ **Input:** Keyword matrix from Module A.
+
+ **Scoring logic:**
+ - **Demand** = Top1 likes ceiling (proven audience size)
+ - **Competition** = density of high-engagement results (how many notes in Top10 have >1K likes)
+ - **Score** = Demand × (1 / Competition density)
+
+ **Tier thresholds** (based on Top1 likes):
+
+ | Tier | Top1 Likes | Meaning |
+ |------|-----------|---------|
+ | S | >100,000 (10万+) | Massive demand — hard to compete but huge upside |
+ | A | 20,000–100,000 | Strong demand — competitive but winnable |
+ | B | 5,000–20,000 | Moderate demand — good for growing accounts |
+ | C | <5,000 | Niche — low competition, low ceiling |
+
+ **Output:** Tiered keyword list.
+
+ | Tier | Keyword | Top1 | Competition | Opportunity |
+ |------|---------|------|-------------|------------|
+ | A | keyword1 | 45K | Medium (6/10 >1K) | High |
+ | B | keyword3 | 12K | Low (2/10 >1K) | Very High |
+ | S | keyword2 | 120K | High (10/10 >1K) | Medium |
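The scoring logic and tier table above can be sketched in a few lines. The tiers follow the thresholds listed; the density floor is an added assumption to avoid dividing by zero, and `scoreKeyword` is an illustrative name:

```typescript
// Module F sketch: tier a keyword by its Top1 ceiling and score it
// as demand / competition density (floored to avoid divide-by-zero).
function scoreKeyword(top10Likes: number[]) {
  const top1 = Math.max(0, ...top10Likes);
  const tier =
    top1 > 100_000 ? "S" : top1 >= 20_000 ? "A" : top1 >= 5_000 ? "B" : "C";
  // Competition density: share of the Top10 notes above 1K likes.
  const density =
    top10Likes.filter((n) => n > 1_000).length /
    Math.max(1, top10Likes.length);
  const score = top1 / Math.max(density, 0.1);
  return { top1, tier, density, score };
}
```

A keyword whose Top1 is 45K with only 3 of 10 notes above 1K likes lands in tier A with low-ish competition, matching the "High" opportunity row above.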
+
+ ---
+
+ ### Module G: Audience Inference
+
+ **Answers:** Who is the audience for this niche? What do they want?
+
+ **Input:** Engagement ratios from Module C + comment themes from `analyze-viral` + content patterns.
+
+ **Fields to extract** from `analyze-viral` JSON:
+ - `comments.themes[]` — recurring phrases and keywords from comment section
+ - `comments.questionRate` — % of comments that are questions (learning intent)
+ - `engagement.collectToLikeRatio` — save behavior signals intent
+ - `hook.hookPatterns[]` — what title patterns attract this audience
+
+ **Inference rules:**
+ - High collect rate + high question rate → learning-oriented audience (students, professionals)
+ - High comment rate + emotional themes → community-oriented audience (sharing experiences)
+ - High share rate → aspiration-oriented audience (lifestyle, identity signaling)
+ - Comment language patterns → age/education signals (formal = older, slang = younger)
+
+ **Output:** Audience persona summary — demographics, intent, content preferences.
+
+ ---

- #### `redbook search <keyword>`
+ ### Module H: Content Brainstorm
+
+ **Answers:** What specific content should I create, backed by data?
+
+ **Input:** Opportunity scores (Module F) + audience persona (Module G) + heatmap gaps (Module B).
+
+ **For each content idea, specify:**
+ - **Target keyword** — from opportunity scoring
+ - **Hook angle** — based on `hookPatterns` that work for this niche
+ - **Content type** — 工具型/认知型/娱乐型 based on what the audience wants
+ - **Form** — 图文 or 视频 based on Module E
+ - **Engagement target** — realistic based on Top10 avg for this keyword
+ - **Competitive reference** — specific note URL that proves this angle works
+
+ **Output:** Ranked content ideas with data backing.
+
+ | # | Keyword | Hook Angle | Type | Target Likes | Reference |
+ |---|---------|-----------|------|-------------|-----------|
+ | 1 | keyword3 | "N个方法..." (List) | 工具型 图文 | 5K+ | [top note URL] |
+ | 2 | keyword1 | "为什么..." (Question) | 认知型 视频 | 10K+ | [top note URL] |
+
+ ---
+
+ ## Composed Workflows
+
+ Combine modules for different analysis depths.
+
+ ### Quick Topic Scan (~5 min)
+ **Modules:** A → C → F
+
+ Search 3–5 keywords, classify engagement type, rank opportunities. Good for quickly validating whether a niche is worth deeper research.
+
+ ### Content Planning
+ **Modules:** A → B → E → F → H
+
+ Build keyword matrix, map topic × scene intersections, check content form performance, score opportunities, brainstorm specific content ideas.
+
+ ### Creator Competitive Analysis
+ **Modules:** A → D
+
+ Find who dominates a niche and study their content strategy, posting frequency, and engagement patterns.
+
+ ### Full Niche Analysis
+ **Modules:** A → B → C → D → E → F → G → H
+
+ The comprehensive playbook — keyword landscape, cross-topic heatmap, engagement signals, creator profiles, content form analysis, opportunity scoring, audience personas, and data-backed content ideas.
+
+ ### Single Note Deep-Dive
+ **Command:** `redbook analyze-viral "<url>" --json`
+
+ No module composition needed — `analyze-viral` returns hook analysis, engagement ratios, comment themes, author baseline comparison, and a 0-100 viral score in one call.
+
+ ### Viral Pattern Research
+ ```bash
+ # 1. Find top notes
+ redbook search "keyword" --sort popular --json
+
+ # 2. Analyze 3-5 top notes
+ redbook analyze-viral "<url1>" --json
+ redbook analyze-viral "<url2>" --json
+ redbook analyze-viral "<url3>" --json
+
+ # 3. Synthesize across notes:
+ # - Which hookPatterns[] appear most often?
+ # - What collectToLikeRatio is typical?
+ # - What content structure drives saves vs. shares?
+ ```
+
+ ---
+
+ ## Command Details
+
+ ### `redbook search <keyword>`

  Search for notes by keyword. Returns note titles, URLs, likes, author info.

@@ -56,7 +368,7 @@ redbook search "MCP Server" --page 2 --json # Pagination
  - `--type <type>`: `all` (default), `video`, `image`
  - `--page <n>`: Page number (default: 1)

- #### `redbook read <url>`
+ ### `redbook read <url>`

  Read a note's full content — title, body text, images, likes, comments count.

@@ -64,9 +376,9 @@ Read a note's full content — title, body text, images, likes, comments count.
  redbook read "https://www.xiaohongshu.com/explore/abc123" --json
  ```

- Accepts full URLs or short note IDs. If the API returns a captcha, it falls back to HTML scraping automatically.
+ Accepts full URLs or short note IDs. Falls back to HTML scraping if the API returns a captcha.

- #### `redbook comments <url>`
+ ### `redbook comments <url>`

  Get comments on a note. Use `--all` to fetch all pages.

@@ -75,10 +387,7 @@ redbook comments "https://www.xiaohongshu.com/explore/abc123" --json
  redbook comments "https://www.xiaohongshu.com/explore/abc123" --all --json
  ```

- **Options:**
- - `--all`: Fetch all comment pages (default: first page only)
-
- #### `redbook user <userId>`
+ ### `redbook user <userId>`

  Get a creator's profile — nickname, bio, follower count, note count, likes received.

@@ -88,7 +397,7 @@ redbook user "5a1234567890abcdef012345" --json

  The userId is the hex string from the creator's profile URL.

- #### `redbook user-posts <userId>`
+ ### `redbook user-posts <userId>`

  List all notes posted by a creator. Returns titles, URLs, likes, timestamps.

@@ -96,15 +405,15 @@ List all notes posted by a creator. Returns titles, URLs, likes, timestamps.
  redbook user-posts "5a1234567890abcdef012345" --json
  ```

- #### `redbook feed`
+ ### `redbook feed`

- Browse the recommendation feed. Returns a batch of recommended notes.
+ Browse the recommendation feed.

  ```bash
  redbook feed --json
  ```

- #### `redbook topics <keyword>`
+ ### `redbook topics <keyword>`

  Search for topic hashtags. Useful for finding trending topics to attach to posts.

@@ -112,9 +421,9 @@ Search for topic hashtags. Useful for finding trending topics to attach to posts
  redbook topics "Claude Code" --json
  ```

- #### `redbook analyze-viral <url>`
+ ### `redbook analyze-viral <url>`

- Analyze why a viral XHS note works — hook patterns, engagement metrics, content structure, and performance relative to the author's baseline. Returns a deterministic viral score (0-100).
+ Analyze why a viral note works. Returns a deterministic viral score (0-100).

  ```bash
  redbook analyze-viral "https://www.xiaohongshu.com/explore/abc123" --json
@@ -127,15 +436,14 @@ redbook analyze-viral "https://www.xiaohongshu.com/explore/abc123" --comment-pag
  **JSON output structure:**
  Returns `{ note, score, hook, content, visual, engagement, comments, relative, fetchedAt }`.

- - `score.overall` (0-100) — composite of hook/engagement/relative/content/comments
- - `hook.hookPatterns[]` — detected title patterns (Identity Hook, Emotion Word, etc.)
- - `engagement` — likes, comments, collects, shares + ratios
- - `relative.viralMultiplier` — performance vs author's median
- - `comments.themes[]` — top recurring phrases from comments
-
- Use `--json` when consuming from Claude Code skills for further synthesis.
+ - `score.overall` (0-100) — composite of hook (20) + engagement (20) + relative (20) + content (20) + comments (20)
+ - `hook.hookPatterns[]` — detected title patterns (Identity Hook, Emotion Word, Number Hook, Question, etc.)
+ - `engagement` — likes, comments, collects, shares + ratios (collectToLikeRatio, commentToLikeRatio, shareToLikeRatio)
+ - `relative.viralMultiplier` — this note's likes / author's median likes
+ - `relative.isOutlier` — true if viralMultiplier > 3
+ - `comments.themes[]` — top recurring keyword phrases from comments

- #### `redbook whoami`
+ ### `redbook whoami`

  Check connection status. Verifies cookies are valid and shows the logged-in user.

@@ -143,11 +451,9 @@ Check connection status. Verifies cookies are valid and shows the logged-in user
  redbook whoami
  ```

- If this fails, the user needs to log into xiaohongshu.com in Chrome.
-
- #### `redbook post` (Limited)
+ ### `redbook post` (Limited)

- Publish an image note. **Note: This command frequently triggers captcha (type=124) on the creator API.** Image upload works, but the publish step is unreliable. For posting, use Chrome browser automation instead.
+ Publish an image note. **Frequently triggers captcha (type=124) on the creator API.** Image upload works, but the publish step is unreliable. For posting, consider using browser automation instead.

  ```bash
  redbook post --title "标题" --body "正文" --images cover.png --json
@@ -168,90 +474,11 @@ All commands accept:
  - `--chrome-profile <name>`: Chrome profile directory name (e.g., "Profile 1"). Auto-discovered if omitted.
  - `--json`: Output as JSON

- ## Research Workflows
-
- ### Competitive Analysis
-
- Research competitors in a niche:
-
- ```bash
- # 1. Search for content in the niche
- redbook search "Claude Code" --sort popular --json
-
- # 2. Identify top creators from search results (extract user IDs)
-
- # 3. Deep-dive on each creator
- redbook user <userId> --json # Profile + follower stats
- redbook user-posts <userId> --json # All their content + engagement
-
- # 4. Read their top-performing notes
- redbook read <noteUrl> --json # Full content
- redbook comments <noteUrl> --all --json # What resonates with audience
- ```
-
- ### Topic Research
-
- Find trending topics and hashtags:
-
- ```bash
- # Search for topics
- redbook topics "AI编程" --json
-
- # Search notes using different sort orders to understand the landscape
- redbook search "keyword" --sort popular --json # What's proven
- redbook search "keyword" --sort latest --json # What's new
- ```
-
- ### Viral Note Research
-
- Analyze what makes top-performing notes work:
-
- ```bash
- # 1. Find viral notes in a niche
- redbook search "Claude Code" --sort popular --json
-
- # 2. Analyze the top-performing note
- redbook analyze-viral "<noteUrl>" --json
-
- # 3. Compare multiple viral notes to find common patterns
- # Run analyze-viral on 3-5 top notes, then synthesize:
- # - Which hook patterns appear most often?
- # - What engagement ratios are typical for this niche?
- # - What content structure drives saves vs shares?
- ```
-
- ### Creator Deep-Dive
-
- Analyze a specific creator's strategy:
-
- ```bash
- # Get profile overview
- redbook user <userId> --json
-
- # Get all their posts to analyze content patterns
- redbook user-posts <userId> --json
-
- # Read their top posts for content structure analysis
- redbook read <topPostUrl> --json
- redbook comments <topPostUrl> --all --json
- ```
-
- ## Programmatic API
-
- The package also exports a TypeScript client for scripting:
-
- ```typescript
- import { XhsClient } from "@lucasygu/redbook";
- import { loadCookies } from "@lucasygu/redbook/cookies";
-
- const cookies = await loadCookies("chrome");
- const client = new XhsClient(cookies);
+ ---

- const results = await client.searchNotes("AI编程", 1, 20, "popular");
- const topics = await client.searchTopics("Claude Code");
- ```
+ ## Technical Reference

- ## xsec_token — CRITICAL for Reading Notes
+ ### xsec_token — Required for Reading Notes

  The XHS API requires a valid `xsec_token` to fetch note content. Without it, `read`, `comments`, and `analyze-viral` return `{}`.

@@ -272,10 +499,9 @@ redbook read "https://www.xiaohongshu.com/explore/689da7b0?xsec_token=OLD_TOKEN"
  redbook search "AI编程" --sort popular --json
  # Extract the noteId + xsec_token from search results, then:
  redbook read "https://www.xiaohongshu.com/explore/<noteId>?xsec_token=<freshToken>" --json
- redbook analyze-viral "https://www.xiaohongshu.com/explore/<noteId>?xsec_token=<freshToken>" --json
  ```

- **For Claude Code agents:** When the user gives you a bare XHS note URL (no `xsec_token` param), extract the noteId from the URL path, search for the note title or noteId to get a fresh token, then use the full URL with the fresh token for `read`/`comments`/`analyze-viral`.
+ **For agents:** When the user gives a bare XHS note URL (no `xsec_token` param), extract the noteId from the URL path, search for the note title or noteId to get a fresh token, then use the full URL with the fresh token.

  **How to extract fresh URLs from search results (JSON):**
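One way to assemble the fresh URL once the noteId and token are in hand. The exact JSON path of `xsec_token` inside search results isn't spelled out here, so this sketch takes both values as plain string inputs; `freshNoteUrl` is an illustrative name:

```typescript
// Build a readable note URL from a noteId plus a fresh xsec_token
// pulled out of search output. Uses the standard WHATWG URL API.
function freshNoteUrl(noteId: string, xsecToken: string): string {
  const u = new URL(`https://www.xiaohongshu.com/explore/${noteId}`);
  u.searchParams.set("xsec_token", xsecToken);
  return u.toString();
}
```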

@@ -287,7 +513,7 @@ redbook analyze-viral "https://www.xiaohongshu.com/explore/<noteId>?xsec_token=<
  **Commands that need xsec_token:** `read`, `comments`, `analyze-viral`
  **Commands that do NOT need xsec_token:** `search`, `user`, `user-posts`, `feed`, `whoami`, `topics`

- ## Chinese Number Formats in API Responses
+ ### Chinese Number Formats in API Responses

  The XHS API returns abbreviated numbers with Chinese unit suffixes:

@@ -302,16 +528,55 @@ The XHS API returns abbreviated numbers with Chinese unit suffixes:

  The `analyze-viral` command handles this automatically. When parsing `--json` output manually, watch for these suffixes in `interact_info` fields (`liked_count`, `collected_count`, etc.).
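When parsing manually, a hedged converter along these lines covers the common suffixes ("1.5万" = 15,000). The CLI's internal parser may handle additional cases; `parseXhsCount` is an illustrative name:

```typescript
// Convert an XHS count field to a plain number. Handles bare digits,
// comma grouping, and the 万 (10^4) / 亿 (10^8) suffixes; this is an
// assumed sketch, not the CLI's own implementation.
function parseXhsCount(raw: string | number): number {
  if (typeof raw === "number") return raw;
  const s = raw.trim();
  // parseFloat stops at the first non-numeric character, so the
  // suffix can stay attached while we read the leading digits.
  if (s.endsWith("万")) return Math.round(parseFloat(s) * 10_000);
  if (s.endsWith("亿")) return Math.round(parseFloat(s) * 100_000_000);
  const n = parseInt(s.replace(/,/g, ""), 10);
  return Number.isNaN(n) ? 0 : n;
}
```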

- ## Error Handling
+ ### Error Handling

  | Error | Meaning | Fix |
  |-------|---------|-----|
- | `{}` empty response | Missing or expired xsec_token | Search first to get a fresh token (see above) |
+ | `{}` empty response | Missing or expired xsec_token | Search first to get a fresh token |
  | "No 'a1' cookie" | Not logged into XHS in browser | Log into xiaohongshu.com in Chrome |
  | "Session expired" | Cookie too old | Re-login in Chrome |
  | "NeedVerify" / captcha | Anti-bot triggered | Wait and retry, or reduce request frequency |
  | "IP blocked" (300012) | Rate limited | Wait or switch network |

+ ---
+
+ ## Output Format Guidance
+
+ When producing analysis reports, use these formats:
+
+ **Data tables:** Markdown tables with exact field mappings. Always include the metric unit.
+
+ **Heatmaps:** ASCII bar charts for cross-topic comparison:
+ ```
+ 职场 生活 教育 创业
+ AI编程 ████ 8K ██ 2K ████ 12K ░░ 200
+ Claude Code ██ 3K ░░ 100 ██ 4K █ 1K
+ ```
+
+ **Creator comparison:** Structured table with both quantitative metrics and qualitative style assessment.
+
+ **Final reports:** Use this section order:
+ 1. Market Overview (demand signals, content velocity)
+ 2. Keyword Landscape (engagement matrix, opportunity tiers)
+ 3. Cross-Topic Heatmap (topic × scene intersections)
+ 4. Audience Persona (demographics, intent, preferences)
+ 5. Competitive Landscape (creator profiles, strategy patterns)
+ 6. Content Opportunities (tiered recommendations with data backing)
+ 7. Content Ideas (specific hooks, angles, targets)
+
+ ## Programmatic API
+
+ ```typescript
+ import { XhsClient } from "@lucasygu/redbook";
+ import { loadCookies } from "@lucasygu/redbook/cookies";
+
+ const cookies = await loadCookies("chrome");
+ const client = new XhsClient(cookies);
+
+ const results = await client.searchNotes("AI编程", 1, 20, "popular");
+ const topics = await client.searchTopics("Claude Code");
+ ```
+
  ## Requirements

  - Node.js >= 22
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@lucasygu/redbook",
- "version": "0.1.13",
+ "version": "0.1.15",
  "description": "CLI tool for Xiaohongshu (Red Note) - read and post via private API",
  "type": "module",
  "main": "./dist/lib/client.js",