@lucasygu/redbook 0.1.13 → 0.1.14

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/SKILL.md +386 -121
  2. package/package.json +1 -1
package/SKILL.md CHANGED
@@ -16,11 +16,7 @@ Use the `redbook` CLI to search notes, read content, analyze creators, and resea
  /redbook analyze <userId> # Full creator analysis (profile + posts)
  ```

- ## Instructions
-
- When this command is invoked, determine the user's intent and run the appropriate CLI command(s).
-
- ### Quick Reference
+ ## Quick Reference

  | Intent | Command |
  |--------|---------|
@@ -34,13 +30,329 @@ When this command is invoked, determine the user's intent and run the appropriat
  | Analyze viral note | `redbook analyze-viral <url> --json` |
  | Check connection | `redbook whoami` |

- ### Always Use `--json`
+ **Always add `--json`** when parsing output programmatically. Without it, output is human-formatted text.
+
+ ---
+
+ ## XHS Platform Signals
+
+ XHS is not Twitter or Instagram. These platform-specific engagement ratios reveal content type and audience behavior.
+
+ ### Collect/Like Ratio (`collected_count / liked_count`)
+
+ XHS's "collect" (收藏) is a save-for-later mechanic — users build personal reference libraries. This ratio is the strongest signal of content utility.
+
+ | Ratio | Classification | Meaning |
+ |-------|---------------|---------|
+ | >40% | 工具型 (Reference) | Tutorial, checklist, template — users bookmark for reuse |
+ | 20–40% | 认知型 (Insight) | Thought-provoking but not saved for later |
+ | <20% | 娱乐型 (Entertainment) | Consumed and forgotten — engagement is passive |
+
+ ### Comment/Like Ratio (`comment_count / liked_count`)
+
+ Measures how much a note triggers conversation.
+
+ | Ratio | Classification | Meaning |
+ |-------|---------------|---------|
+ | >15% | 讨论型 (Discussion) | Debate, sharing experiences, asking questions |
+ | 5–15% | 正常互动 (Normal) | Typical engagement pattern |
+ | <5% | 围观型 (Passive) | Users like but don't engage further |
+
+ ### Share/Like Ratio (`share_count / liked_count`)

- Always add `--json` to commands when you need to parse the output programmatically. The JSON output is structured and reliable. Without `--json`, output is human-formatted text.
+ Measures social currency: whether users share to signal identity or to help others.

- ### Command Details
+ | Ratio | Meaning |
+ |-------|---------|
+ | >10% | 社交货币 (social currency) — people share to signal taste, identity, or help friends |
+ | <10% | Content consumed individually, not forwarded |
+
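The three benchmarks above fold into a single classifier. A minimal TypeScript sketch (the `InteractCounts` shape and `classifyNote` helper are illustrative, not part of the CLI; counts are assumed to be already parsed into plain numbers):

```typescript
interface InteractCounts {
  liked: number;
  collected: number;
  comments: number;
  shares: number;
}

// Classify a note against the XHS ratio benchmarks above.
function classifyNote(c: InteractCounts): string[] {
  if (c.liked === 0) return ["insufficient data"];
  const collectRatio = c.collected / c.liked;
  const commentRatio = c.comments / c.liked;
  const shareRatio = c.shares / c.liked;

  const labels: string[] = [];
  // Collect/Like: >40% reference, 20–40% insight, <20% entertainment.
  labels.push(collectRatio > 0.4 ? "工具型" : collectRatio >= 0.2 ? "认知型" : "娱乐型");
  // Comment/Like: >15% discussion, 5–15% normal, <5% passive.
  labels.push(commentRatio > 0.15 ? "讨论型" : commentRatio >= 0.05 ? "正常互动" : "围观型");
  // Share/Like: >10% signals social currency.
  if (shareRatio > 0.1) labels.push("社交货币");
  return labels;
}
```

For example, `classifyNote({ liked: 10000, collected: 4500, comments: 800, shares: 300 })` labels a note 工具型 with 正常互动-level discussion.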
+ ### Search Sort Semantics
+
+ | Sort | What It Reveals |
+ |------|----------------|
+ | `--sort popular` | Proven ceiling — the best a keyword can do |
+ | `--sort latest` | Content velocity — how much is being posted now |
+ | `--sort general` | Algorithm-weighted blend (default) |
+
+ ### Content Form Dynamics
+
+ | Form | Tendency |
+ |------|----------|
+ | 图文 (image-text, `type: "normal"`) | Higher collect rate — users save reference content |
+ | 视频 (video, `type: "video"`) | Higher like rate — easier to consume passively |
+
+ ---
+
+ ## Analysis Modules
+
+ Each module is a composable building block. Combine them for different analysis depths.
+
+ ### Module A: Keyword Engagement Matrix
+
+ **Answers:** Which keywords have the highest engagement ceiling? Which are saturated vs. underserved?
+
+ **Commands:**
+ ```bash
+ redbook search "keyword1" --sort popular --json
+ redbook search "keyword2" --sort popular --json
+ # Repeat for each keyword in your list
+ ```
+
+ **Fields to extract** from each result's `items[]`:
+ - `items[].note_card.interact_info.liked_count` — likes (may use Chinese numbers: "1.5万" = 15,000)
+ - `items[].note_card.interact_info.collected_count` — collects
+ - `items[].note_card.interact_info.comment_count` — comments
+ - `items[].note_card.user.nickname` — author
+
+ **How to interpret:**
+ - **Top1 ceiling** = `items[0]` likes — the best-performing note for this keyword. This is the proven demand signal.
+ - **Top10 average** = mean likes across `items[0..9]` — how well an average top note does.
+ - A high Top1 but low Top10 avg means one outlier dominates; hard to compete.
+ - A high Top10 avg means consistent demand; easier to break in.
+
+ **Output:** Keyword × engagement table ranked by Top1 ceiling.
+
+ | Keyword | Top1 Likes | Top10 Avg | Top1 Collects | Collect/Like |
+ |---------|-----------|-----------|---------------|-------------|
+ | keyword1 | 12,000 | 3,200 | 5,400 | 45% |
+ | keyword2 | 8,500 | 4,100 | 1,200 | 14% |
+
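The Top1/Top10 computation above can be sketched in a few lines. The `SearchItem` interface below is a pared-down assumption of the search JSON shape, and only the 万 suffix is handled:

```typescript
interface SearchItem {
  note_card: { interact_info: { liked_count: string } };
}

// Convert a liked_count that may carry the 万 (10,000) suffix to a number.
function toNumber(count: string): number {
  return count.endsWith("万")
    ? Math.round(parseFloat(count) * 10_000)
    : parseInt(count, 10);
}

// Top1 ceiling and Top10 average for one keyword's search results.
function keywordStats(items: SearchItem[]) {
  const likes = items
    .slice(0, 10)
    .map((i) => toNumber(i.note_card.interact_info.liked_count));
  const top1 = likes[0] ?? 0;
  const top10Avg =
    likes.length > 0
      ? Math.round(likes.reduce((a, b) => a + b, 0) / likes.length)
      : 0;
  return { top1, top10Avg };
}
```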
+ ---
+
+ ### Module B: Cross-Topic Heatmap
+
+ **Answers:** Which topic × scene intersections have demand? Where are the content gaps?
+
+ **Commands:**
+ ```bash
+ # Combine base topic with scene/angle keywords
+ redbook search "base topic + scene1" --sort popular --json
+ redbook search "base topic + scene2" --sort popular --json
+ redbook search "base topic + scene3" --sort popular --json
+ ```
+
+ **Fields to extract:** Same as Module A — Top1 `liked_count` for each combination.
+
+ **How to interpret:**
+ - High Top1 = proven demand for this intersection
+ - Zero or very low results = content gap (opportunity or no demand — check if the combination makes sense)
+ - Compare across scenes to find which angles resonate most with the base topic
+
+ **Output:** Base × Scene heatmap.
+
+ ```
+              scene1   scene2  scene3    scene4
+ base topic   ████ 8K  ██ 2K   ████ 12K  ░░ 200
+ ```
+
+ ---
+
+ ### Module C: Engagement Signal Analysis
+
+ **Answers:** What type of content is each keyword? Reference, insight, or entertainment?
+
+ **Commands:** Use search results from Module A, or for a single note:
+ ```bash
+ redbook analyze-viral "<noteUrl>" --json
+ ```
+
+ **Fields to extract:**
+ - From search results: compute ratios from `interact_info` fields
+ - From `analyze-viral`: use pre-computed `engagement.collectToLikeRatio`, `engagement.commentToLikeRatio`, `engagement.shareToLikeRatio`
+
+ **How to interpret:** Apply the ratio benchmarks from [XHS Platform Signals](#xhs-platform-signals) above.
+
+ **Output:** Per-keyword or per-note classification.
+
+ | Keyword | Collect/Like | Comment/Like | Type |
+ |---------|-------------|-------------|------|
+ | keyword1 | 45% | 8% | 工具型 + 正常互动 |
+ | keyword2 | 12% | 22% | 娱乐型 + 讨论型 |
+
+ ---
+
+ ### Module D: Creator Discovery & Profiling
+
+ **Answers:** Who are the key creators in this niche? What are their strategies?
+
+ **Commands:**
+ ```bash
+ # 1. Collect unique user_ids from search results across keywords
+ # Extract from items[].note_card.user.user_id
+
+ # 2. For each creator:
+ redbook user "<userId>" --json
+ redbook user-posts "<userId>" --json
+ ```
+
+ **Fields to extract:**
+ - From `user`: `interactions[]` where `type === "fans"` → follower count
+ - From `user-posts`: `notes[].interact_info.liked_count` for all posts → compute avg, median, max
+ - From `user-posts`: `notes[].display_title` → content patterns, posting frequency
+
+ **How to interpret:**
+ - **Avg vs. Median likes:** Large gap means viral outliers inflate the average. Median is the "true" baseline.
+ - **Max / Median ratio:** >5× means they've had breakout hits. Study those notes specifically.
+ - **Post frequency:** Count notes to estimate posting cadence. Prolific creators (>3/week) vs. quality-focused (<1/week).
+
+ **Output:** Creator comparison table.
+
+ | Creator | Followers | Avg Likes | Median | Max | Posts | Style |
+ |---------|----------|-----------|--------|-----|-------|-------|
+ | @creator1 | 12万 | 3,200 | 1,800 | 45,000 | 89 | Tutorial |
+ | @creator2 | 5.4万 | 8,100 | 6,500 | 22,000 | 34 | Story |
+
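The avg/median/max interpretation above is a few lines of code. A sketch, assuming likes have already been extracted from `notes[].interact_info.liked_count` and parsed to plain numbers:

```typescript
// Summarize a creator's per-post likes into avg, median, max.
// The avg-vs-median gap flags viral outliers; max/median > 5 = breakout hits.
function creatorStats(likes: number[]) {
  const sorted = [...likes].sort((a, b) => a - b);
  const n = sorted.length;
  if (n === 0) return { avg: 0, median: 0, max: 0, maxToMedian: 0 };
  const avg = Math.round(sorted.reduce((a, b) => a + b, 0) / n);
  const median =
    n % 2 === 1 ? sorted[(n - 1) / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2;
  const max = sorted[n - 1];
  return { avg, median, max, maxToMedian: median > 0 ? max / median : 0 };
}
```

For a creator with likes `[100, 200, 300, 10000]`, the average (2,650) is over ten times the median (250): one outlier carries the account.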
+ ---
+
+ ### Module E: Content Form Breakdown
+
+ **Answers:** Do image-text or video notes perform better for this topic?
+
+ **Commands:**
+ ```bash
+ redbook search "keyword" --type image --sort popular --json
+ redbook search "keyword" --type video --sort popular --json
+ ```
+
+ **Fields to extract:**
+ - Compare Top1 and Top10 avg `liked_count` and `collected_count` between the two result sets
+ - Note the `type` field: `"normal"` = image-text, `"video"` = video
+
+ **Output:** Form × engagement table.
+
+ | Form | Top1 Likes | Top10 Avg | Collect/Like |
+ |------|-----------|-----------|-------------|
+ | 图文 | 8,000 | 2,400 | 42% |
+ | 视频 | 15,000 | 5,100 | 18% |
+
+ ---
+
+ ### Module F: Opportunity Scoring
+
+ **Answers:** Which keywords should I target? Where is the best effort-to-reward ratio?
+
+ **Input:** Keyword matrix from Module A.
+
+ **Scoring logic:**
+ - **Demand** = Top1 likes ceiling (proven audience size)
+ - **Competition** = density of high-engagement results (how many notes in Top10 have >1K likes)
+ - **Score** = Demand × (1 / Competition density)
+
+ **Tier thresholds** (based on Top1 likes):
+
+ | Tier | Top1 Likes | Meaning |
+ |------|-----------|---------|
+ | S | >100,000 (10万+) | Massive demand — hard to compete but huge upside |
+ | A | 20,000–100,000 | Strong demand — competitive but winnable |
+ | B | 5,000–20,000 | Moderate demand — good for growing accounts |
+ | C | <5,000 | Niche — low competition, low ceiling |
+
+ **Output:** Tiered keyword list.
+
+ | Tier | Keyword | Top1 | Competition | Opportunity |
+ |------|---------|------|-------------|------------|
+ | A | keyword1 | 45K | Medium (6/10 >1K) | High |
+ | B | keyword3 | 12K | Low (2/10 >1K) | Very High |
+ | S | keyword2 | 120K | High (10/10 >1K) | Medium |
+
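The scoring logic above, sketched in TypeScript. The 0.1 density floor is an added assumption to avoid dividing by zero when no Top10 note clears 1K likes:

```typescript
// Tier + opportunity score for one keyword, per the Module F logic:
// Demand = Top1 ceiling; Competition = share of Top10 notes with >1K likes.
function opportunityScore(top10Likes: number[]) {
  const demand = Math.max(...top10Likes, 0);
  const hot = top10Likes.filter((l) => l > 1000).length;
  const density = Math.max(hot / 10, 0.1); // floor: assumption, avoids /0
  const tier =
    demand > 100_000 ? "S" : demand >= 20_000 ? "A" : demand >= 5_000 ? "B" : "C";
  return { tier, score: Math.round(demand / density) };
}
```

A keyword with Top10 likes `[12000, 2000, 800, 500]` lands in tier B with only 2/10 notes over 1K — the "low competition, moderate demand" pattern the table above flags as Very High opportunity.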
259
+ ---
260
+
261
+ ### Module G: Audience Inference
262
+
263
+ **Answers:** Who is the audience for this niche? What do they want?
264
+
265
+ **Input:** Engagement ratios from Module C + comment themes from `analyze-viral` + content patterns.
266
+
267
+ **Fields to extract** from `analyze-viral` JSON:
268
+ - `comments.themes[]` — recurring phrases and keywords from comment section
269
+ - `comments.questionRate` — % of comments that are questions (learning intent)
270
+ - `engagement.collectToLikeRatio` — save behavior signals intent
271
+ - `hook.hookPatterns[]` — what title patterns attract this audience
272
+
273
+ **Inference rules:**
274
+ - High collect rate + high question rate → learning-oriented audience (students, professionals)
275
+ - High comment rate + emotional themes → community-oriented audience (sharing experiences)
276
+ - High share rate → aspiration-oriented audience (lifestyle, identity signaling)
277
+ - Comment language patterns → age/education signals (formal = older, slang = younger)
278
+
279
+ **Output:** Audience persona summary — demographics, intent, content preferences.
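The inference rules above can be expressed as a small rule table. A sketch: the numeric cutoffs reuse the earlier ratio benchmarks, the 20% `questionRate` threshold is an assumption, and the `Signals` shape compresses the `analyze-viral` fields listed above:

```typescript
interface Signals {
  collectToLikeRatio: number;
  commentToLikeRatio: number;
  shareToLikeRatio: number;
  questionRate: number; // % of comments that are questions
}

// Apply the Module G inference rules to one note's signals.
function inferAudience(s: Signals): string[] {
  const traits: string[] = [];
  if (s.collectToLikeRatio > 0.4 && s.questionRate > 20)
    traits.push("learning-oriented");
  if (s.commentToLikeRatio > 0.15) traits.push("community-oriented");
  if (s.shareToLikeRatio > 0.1) traits.push("aspiration-oriented");
  return traits.length > 0 ? traits : ["passive consumers"];
}
```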
280
+
281
+ ---
42
282
 
43
- #### `redbook search <keyword>`
283
+ ### Module H: Content Brainstorm
284
+
285
+ **Answers:** What specific content should I create, backed by data?
286
+
287
+ **Input:** Opportunity scores (Module F) + audience persona (Module G) + heatmap gaps (Module B).
288
+
289
+ **For each content idea, specify:**
290
+ - **Target keyword** — from opportunity scoring
291
+ - **Hook angle** — based on `hookPatterns` that work for this niche
292
+ - **Content type** — 工具型/认知型/娱乐型 based on what the audience wants
293
+ - **Form** — 图文 or 视频 based on Module E
294
+ - **Engagement target** — realistic based on Top10 avg for this keyword
295
+ - **Competitive reference** — specific note URL that proves this angle works
296
+
297
+ **Output:** Ranked content ideas with data backing.
298
+
299
+ | # | Keyword | Hook Angle | Type | Target Likes | Reference |
300
+ |---|---------|-----------|------|-------------|-----------|
301
+ | 1 | keyword3 | "N个方法..." (List) | 工具型 图文 | 5K+ | [top note URL] |
302
+ | 2 | keyword1 | "为什么..." (Question) | 认知型 视频 | 10K+ | [top note URL] |
303
+
304
+ ---
305
+
306
+ ## Composed Workflows
307
+
308
+ Combine modules for different analysis depths.
309
+
310
+ ### Quick Topic Scan (~5 min)
311
+ **Modules:** A → C → F
312
+
313
+ Search 3–5 keywords, classify engagement type, rank opportunities. Good for quickly validating whether a niche is worth deeper research.
314
+
315
+ ### Content Planning
316
+ **Modules:** A → B → E → F → H
317
+
318
+ Build keyword matrix, map topic × scene intersections, check content form performance, score opportunities, brainstorm specific content ideas.
319
+
320
+ ### Creator Competitive Analysis
321
+ **Modules:** A → D
322
+
323
+ Find who dominates a niche and study their content strategy, posting frequency, and engagement patterns.
324
+
325
+ ### Full Niche Analysis
326
+ **Modules:** A → B → C → D → E → F → G → H
327
+
328
+ The comprehensive playbook — keyword landscape, cross-topic heatmap, engagement signals, creator profiles, content form analysis, opportunity scoring, audience personas, and data-backed content ideas.
329
+
330
+ ### Single Note Deep-Dive
331
+ **Command:** `redbook analyze-viral "<url>" --json`
332
+
333
+ No module composition needed — `analyze-viral` returns hook analysis, engagement ratios, comment themes, author baseline comparison, and a 0-100 viral score in one call.
334
+
335
+ ### Viral Pattern Research
336
+ ```bash
337
+ # 1. Find top notes
338
+ redbook search "keyword" --sort popular --json
339
+
340
+ # 2. Analyze 3-5 top notes
341
+ redbook analyze-viral "<url1>" --json
342
+ redbook analyze-viral "<url2>" --json
343
+ redbook analyze-viral "<url3>" --json
344
+
345
+ # 3. Synthesize across notes:
346
+ # - Which hookPatterns[] appear most often?
347
+ # - What collectToLikeRatio is typical?
348
+ # - What content structure drives saves vs. shares?
349
+ ```
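Step 3 of the workflow above (cross-note synthesis) can be sketched as a tally over the `--json` outputs. The `ViralAnalysis` interface is a minimal assumption of the relevant fields:

```typescript
interface ViralAnalysis {
  hook: { hookPatterns: string[] };
  engagement: { collectToLikeRatio: number };
}

// Tally hookPatterns[] across several analyze-viral results and compute
// the typical collect/like ratio for the niche.
function synthesize(analyses: ViralAnalysis[]) {
  const patternCounts = new Map<string, number>();
  for (const a of analyses) {
    for (const p of a.hook.hookPatterns) {
      patternCounts.set(p, (patternCounts.get(p) ?? 0) + 1);
    }
  }
  const ratios = analyses.map((a) => a.engagement.collectToLikeRatio);
  const typicalCollectRatio =
    ratios.reduce((a, b) => a + b, 0) / Math.max(ratios.length, 1);
  // Most frequent pattern first.
  const topPatterns = [...patternCounts.entries()].sort((a, b) => b[1] - a[1]);
  return { topPatterns, typicalCollectRatio };
}
```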
+
+ ---
+
+ ## Command Details
+
+ ### `redbook search <keyword>`

  Search for notes by keyword. Returns note titles, URLs, likes, author info.

@@ -56,7 +368,7 @@ redbook search "MCP Server" --page 2 --json # Pagination
  - `--type <type>`: `all` (default), `video`, `image`
  - `--page <n>`: Page number (default: 1)

- #### `redbook read <url>`
+ ### `redbook read <url>`

  Read a note's full content — title, body text, images, likes, comments count.

@@ -64,9 +376,9 @@ Read a note's full content — title, body text, images, likes, comments count.
  redbook read "https://www.xiaohongshu.com/explore/abc123" --json
  ```

- Accepts full URLs or short note IDs. If the API returns a captcha, it falls back to HTML scraping automatically.
+ Accepts full URLs or short note IDs. Falls back to HTML scraping if the API returns a captcha.

- #### `redbook comments <url>`
+ ### `redbook comments <url>`

  Get comments on a note. Use `--all` to fetch all pages.

@@ -75,10 +387,7 @@ redbook comments "https://www.xiaohongshu.com/explore/abc123" --json
  redbook comments "https://www.xiaohongshu.com/explore/abc123" --all --json
  ```

- **Options:**
- - `--all`: Fetch all comment pages (default: first page only)
-
- #### `redbook user <userId>`
+ ### `redbook user <userId>`

  Get a creator's profile — nickname, bio, follower count, note count, likes received.

@@ -88,7 +397,7 @@ redbook user "5a1234567890abcdef012345" --json

  The userId is the hex string from the creator's profile URL.

- #### `redbook user-posts <userId>`
+ ### `redbook user-posts <userId>`

  List all notes posted by a creator. Returns titles, URLs, likes, timestamps.

@@ -96,15 +405,15 @@ List all notes posted by a creator. Returns titles, URLs, likes, timestamps.
  redbook user-posts "5a1234567890abcdef012345" --json
  ```

- #### `redbook feed`
+ ### `redbook feed`

- Browse the recommendation feed. Returns a batch of recommended notes.
+ Browse the recommendation feed.

  ```bash
  redbook feed --json
  ```

- #### `redbook topics <keyword>`
+ ### `redbook topics <keyword>`

  Search for topic hashtags. Useful for finding trending topics to attach to posts.

@@ -112,9 +421,9 @@ Search for topic hashtags. Useful for finding trending topics to attach to posts
  redbook topics "Claude Code" --json
  ```

- #### `redbook analyze-viral <url>`
+ ### `redbook analyze-viral <url>`

- Analyze why a viral XHS note works — hook patterns, engagement metrics, content structure, and performance relative to the author's baseline. Returns a deterministic viral score (0-100).
+ Analyze why a viral note works. Returns a deterministic viral score (0-100).

  ```bash
  redbook analyze-viral "https://www.xiaohongshu.com/explore/abc123" --json
127
436
  **JSON output structure:**
128
437
  Returns `{ note, score, hook, content, visual, engagement, comments, relative, fetchedAt }`.
129
438
 
130
- - `score.overall` (0-100) — composite of hook/engagement/relative/content/comments
131
- - `hook.hookPatterns[]` — detected title patterns (Identity Hook, Emotion Word, etc.)
132
- - `engagement` — likes, comments, collects, shares + ratios
133
- - `relative.viralMultiplier` — performance vs author's median
134
- - `comments.themes[]` — top recurring phrases from comments
135
-
136
- Use `--json` when consuming from Claude Code skills for further synthesis.
439
+ - `score.overall` (0100) — composite of hook (20) + engagement (20) + relative (20) + content (20) + comments (20)
440
+ - `hook.hookPatterns[]` — detected title patterns (Identity Hook, Emotion Word, Number Hook, Question, etc.)
441
+ - `engagement` — likes, comments, collects, shares + ratios (collectToLikeRatio, commentToLikeRatio, shareToLikeRatio)
442
+ - `relative.viralMultiplier` — this note's likes / author's median likes
443
+ - `relative.isOutlier` — true if viralMultiplier > 3
444
+ - `comments.themes[]` — top recurring keyword phrases from comments
137
445
 
138
- #### `redbook whoami`
446
+ ### `redbook whoami`
139
447
 
140
448
  Check connection status. Verifies cookies are valid and shows the logged-in user.
141
449
 
@@ -143,11 +451,9 @@ Check connection status. Verifies cookies are valid and shows the logged-in user
143
451
  redbook whoami
144
452
  ```
145
453
 
146
- If this fails, the user needs to log into xiaohongshu.com in Chrome.
147
-
148
- #### `redbook post` (Limited)
454
+ ### `redbook post` (Limited)
149
455
 
150
- Publish an image note. **Note: This command frequently triggers captcha (type=124) on the creator API.** Image upload works, but the publish step is unreliable. For posting, use Chrome browser automation instead.
456
+ Publish an image note. **Frequently triggers captcha (type=124) on the creator API.** Image upload works, but the publish step is unreliable. For posting, consider using browser automation instead.
151
457
 
152
458
  ```bash
153
459
  redbook post --title "标题" --body "正文" --images cover.png --json
@@ -168,90 +474,11 @@ All commands accept:
  - `--chrome-profile <name>`: Chrome profile directory name (e.g., "Profile 1"). Auto-discovered if omitted.
  - `--json`: Output as JSON

- ## Research Workflows
-
- ### Competitive Analysis
-
- Research competitors in a niche:
-
- ```bash
- # 1. Search for content in the niche
- redbook search "Claude Code" --sort popular --json
-
- # 2. Identify top creators from search results (extract user IDs)
-
- # 3. Deep-dive on each creator
- redbook user <userId> --json # Profile + follower stats
- redbook user-posts <userId> --json # All their content + engagement
-
- # 4. Read their top-performing notes
- redbook read <noteUrl> --json # Full content
- redbook comments <noteUrl> --all --json # What resonates with audience
- ```
-
- ### Topic Research
-
- Find trending topics and hashtags:
-
- ```bash
- # Search for topics
- redbook topics "AI编程" --json
-
- # Search notes using different sort orders to understand the landscape
- redbook search "keyword" --sort popular --json # What's proven
- redbook search "keyword" --sort latest --json # What's new
- ```
-
- ### Viral Note Research
-
- Analyze what makes top-performing notes work:
-
- ```bash
- # 1. Find viral notes in a niche
- redbook search "Claude Code" --sort popular --json
-
- # 2. Analyze the top-performing note
- redbook analyze-viral "<noteUrl>" --json
-
- # 3. Compare multiple viral notes to find common patterns
- # Run analyze-viral on 3-5 top notes, then synthesize:
- # - Which hook patterns appear most often?
- # - What engagement ratios are typical for this niche?
- # - What content structure drives saves vs shares?
- ```
-
- ### Creator Deep-Dive
-
- Analyze a specific creator's strategy:
-
- ```bash
- # Get profile overview
- redbook user <userId> --json
-
- # Get all their posts to analyze content patterns
- redbook user-posts <userId> --json
-
- # Read their top posts for content structure analysis
- redbook read <topPostUrl> --json
- redbook comments <topPostUrl> --all --json
- ```
-
- ## Programmatic API
-
- The package also exports a TypeScript client for scripting:
-
- ```typescript
- import { XhsClient } from "@lucasygu/redbook";
- import { loadCookies } from "@lucasygu/redbook/cookies";
-
- const cookies = await loadCookies("chrome");
- const client = new XhsClient(cookies);
+ ---

- const results = await client.searchNotes("AI编程", 1, 20, "popular");
- const topics = await client.searchTopics("Claude Code");
- ```
+ ## Technical Reference

- ## xsec_token — CRITICAL for Reading Notes
+ ### xsec_token — Required for Reading Notes

  The XHS API requires a valid `xsec_token` to fetch note content. Without it, `read`, `comments`, and `analyze-viral` return `{}`.

@@ -272,10 +499,9 @@ redbook read "https://www.xiaohongshu.com/explore/689da7b0?xsec_token=OLD_TOKEN"
  redbook search "AI编程" --sort popular --json
  # Extract the noteId + xsec_token from search results, then:
  redbook read "https://www.xiaohongshu.com/explore/<noteId>?xsec_token=<freshToken>" --json
- redbook analyze-viral "https://www.xiaohongshu.com/explore/<noteId>?xsec_token=<freshToken>" --json
  ```

- **For Claude Code agents:** When the user gives you a bare XHS note URL (no `xsec_token` param), extract the noteId from the URL path, search for the note title or noteId to get a fresh token, then use the full URL with the fresh token for `read`/`comments`/`analyze-viral`.
+ **For agents:** When the user gives a bare XHS note URL (no `xsec_token` param), extract the noteId from the URL path, search for the note title or noteId to get a fresh token, then use the full URL with the fresh token.

  **How to extract fresh URLs from search results (JSON):**

@@ -287,7 +513,7 @@
  **Commands that need xsec_token:** `read`, `comments`, `analyze-viral`
  **Commands that do NOT need xsec_token:** `search`, `user`, `user-posts`, `feed`, `whoami`, `topics`

- ## Chinese Number Formats in API Responses
+ ### Chinese Number Formats in API Responses

  The XHS API returns abbreviated numbers with Chinese unit suffixes:

@@ -302,16 +528,55 @@ The XHS API returns abbreviated numbers with Chinese unit suffixes:
  The `analyze-viral` command handles this automatically. When parsing `--json` output manually, watch for these suffixes in `interact_info` fields (`liked_count`, `collected_count`, etc.).
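When parsing manually, a converter along these lines works. A sketch: 万 = 10,000 is documented above; 亿 = 100,000,000 is included on the assumption that very large counts use it:

```typescript
// Parse an XHS count string that may carry a Chinese unit suffix.
// "1.5万" → 15000; "3200" → 3200; "2亿" → 200000000.
function parseXhsCount(raw: string): number {
  const units: Record<string, number> = { 万: 10_000, 亿: 100_000_000 };
  const unit = units[raw.slice(-1)];
  if (unit !== undefined) {
    return Math.round(parseFloat(raw.slice(0, -1)) * unit);
  }
  return parseInt(raw.replace(/,/g, ""), 10) || 0;
}
```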

- ## Error Handling
+ ### Error Handling

  | Error | Meaning | Fix |
  |-------|---------|-----|
- | `{}` empty response | Missing or expired xsec_token | Search first to get a fresh token (see above) |
+ | `{}` empty response | Missing or expired xsec_token | Search first to get a fresh token |
  | "No 'a1' cookie" | Not logged into XHS in browser | Log into xiaohongshu.com in Chrome |
  | "Session expired" | Cookie too old | Re-login in Chrome |
  | "NeedVerify" / captcha | Anti-bot triggered | Wait and retry, or reduce request frequency |
  | "IP blocked" (300012) | Rate limited | Wait or switch network |

+ ---
+
+ ## Output Format Guidance
+
+ When producing analysis reports, use these formats:
+
+ **Data tables:** Markdown tables with exact field mappings. Always include the metric unit.
+
+ **Heatmaps:** ASCII bar charts for cross-topic comparison:
+ ```
+              职场     生活    教育      创业
+ AI编程       ████ 8K  ██ 2K   ████ 12K  ░░ 200
+ Claude Code  ██ 3K    ░░ 100  ██ 4K     █ 1K
+ ```
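One way to generate such a row programmatically — a hypothetical `heatmapRow` helper that scales block counts relative to the row maximum (the scaling rule is an assumption; any monotone mapping works):

```typescript
// Render one heatmap row; bar length is proportional to the row max.
function heatmapRow(label: string, cells: { scene: string; likes: number }[]): string {
  const max = Math.max(...cells.map((c) => c.likes), 1);
  const fmt = (n: number) => (n >= 1000 ? `${Math.round(n / 1000)}K` : `${n}`);
  const bar = (n: number) => {
    const blocks = Math.round((4 * n) / max);
    return blocks > 0 ? "█".repeat(blocks) : "░░"; // ░░ marks near-zero cells
  };
  const body = cells.map((c) => `${bar(c.likes)} ${fmt(c.likes)}`.padEnd(10)).join("");
  return label.padEnd(13) + body;
}
```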
+
+ **Creator comparison:** Structured table with both quantitative metrics and qualitative style assessment.
+
+ **Final reports:** Use this section order:
+ 1. Market Overview (demand signals, content velocity)
+ 2. Keyword Landscape (engagement matrix, opportunity tiers)
+ 3. Cross-Topic Heatmap (topic × scene intersections)
+ 4. Audience Persona (demographics, intent, preferences)
+ 5. Competitive Landscape (creator profiles, strategy patterns)
+ 6. Content Opportunities (tiered recommendations with data backing)
+ 7. Content Ideas (specific hooks, angles, targets)
+
+ ## Programmatic API
+
+ ```typescript
+ import { XhsClient } from "@lucasygu/redbook";
+ import { loadCookies } from "@lucasygu/redbook/cookies";
+
+ const cookies = await loadCookies("chrome");
+ const client = new XhsClient(cookies);
+
+ const results = await client.searchNotes("AI编程", 1, 20, "popular");
+ const topics = await client.searchTopics("Claude Code");
+ ```
+

  ## Requirements

  - Node.js >= 22
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@lucasygu/redbook",
- "version": "0.1.13",
+ "version": "0.1.14",
  "description": "CLI tool for Xiaohongshu (Red Note) - read and post via private API",
  "type": "module",
  "main": "./dist/lib/client.js",