react-native-pageindex 0.1.2 → 0.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/CHANGELOG.md +13 -0
  2. package/README.md +153 -2
  3. package/package.json +1 -1
package/CHANGELOG.md CHANGED
@@ -6,6 +6,19 @@ The format follows [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and
 
  ---
 
+ ## [0.1.3] — 2026-03-08
+
+ ### Added
+ - **Chat mode** (`ChatPanel.tsx`) — conversational multi-turn Q&A over any indexed document, with collapsible cited-source cards per reply (node title, node ID, page range, relevance score)
+ - `docs/screenshots/demo-chat-mode.png` — screenshot of the Chat tab in action
+
+ ### Documentation
+ - Added `## Conversational Chat Mode` section to README with minimal code example and full description of the browser demo's Chat tab implementation
+ - Updated Demo section with Chat mode screenshot and walkthrough (section 8)
+ - Updated Features table to include **Conversational chat**
+
+ ---
+
  ## [0.1.2] — 2026-03-08
 
  ### Documentation
package/README.md CHANGED
@@ -25,6 +25,12 @@ A fully interactive React demo app is included in the [`demo/`](./demo) director
 
  ![PageIndex Demo – LLM mode](docs/screenshots/demo-llm-mode.png)
 
+ ### Chat mode — conversational AI over any indexed document
+
+ > Ask natural-language questions and get cited answers backed by the reverse index. Multi-turn conversation with collapsible source references per reply.
+
+ ![PageIndex Demo – Chat mode](docs/screenshots/demo-chat-mode.png)
+
  ---
 
  ### How the demo is built
@@ -182,7 +188,74 @@ const hits = searchReverseIndex(reverseIndex, query, 20);
 
  Results are ranked by `totalScore` and each card shows the matched term, score, confidence level (High / Medium / Low), and the page range covered by that tree node.
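As a rough sketch of how a card could derive its confidence label from the hit's score (the thresholds here are made up for illustration; the demo's actual cutoffs are not shown in this diff):

```typescript
// Hypothetical mapping from a hit's totalScore to the card's confidence
// label. The 0.7 / 0.4 thresholds are illustrative, not the demo's values.
type Confidence = 'High' | 'Medium' | 'Low';

function confidenceLabel(totalScore: number): Confidence {
  if (totalScore >= 0.7) return 'High';
  if (totalScore >= 0.4) return 'Medium';
  return 'Low';
}

console.log(confidenceLabel(0.82)); // High
console.log(confidenceLabel(0.5));  // Medium
console.log(confidenceLabel(0.1));  // Low
```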
 
- #### 8. Running the demo locally
+ #### 8. Chat mode (`ChatPanel.tsx`)
+
+ After the index is built, a **💬 Chat** tab appears alongside Tree View, Search and Raw Pages. It implements a full multi-turn conversational loop over the indexed document:
+
+ 1. **Retrieve** — `searchReverseIndex(reverseIndex, question, 5)` fetches the top-5 relevant tree nodes.
+ 2. **Build context** — actual page text (or node summaries as a fallback) from those nodes is injected into the prompt, capped at 1,200 characters per node.
+ 3. **Chat history** — the last 10 conversation turns are passed as `chatHistory` for multi-turn continuity.
+ 4. **Stream answer** — the configured `LLMProvider` returns the answer, which is displayed with collapsible **citations** (node title, node ID, page range, relevance score).
+
+ ```ts
+ import { searchReverseIndex } from 'react-native-pageindex';
+ import type { LLMProvider, ReverseIndex, PageIndexResult, PageData } from 'react-native-pageindex';
+
+ async function chat(
+   question: string,
+   reverseIndex: ReverseIndex,
+   result: PageIndexResult,
+   pages: PageData[],
+   llm: LLMProvider,
+   chatHistory: { role: 'user' | 'assistant'; content: string }[] = [],
+ ) {
+   // 1. Retrieve relevant nodes
+   const hits = searchReverseIndex(reverseIndex, question, 5);
+
+   // 2. Build grounded context from page text / node summaries
+   const contextParts = hits.map(hit => {
+     const pageRange = `${hit.startIndex ?? '?'}–${hit.endIndex ?? '?'}`;
+     const body = pages
+       .slice((hit.startIndex ?? 1) - 1, hit.endIndex ?? 1)
+       .map(p => p.text)
+       .join('\n')
+       .slice(0, 1200);
+     return `[${hit.nodeTitle} | pages ${pageRange}]\n${body || hit.summary}`;
+   });
+
+   const systemPrompt =
+     `You are a helpful assistant for "${result.doc_name}". ` +
+     `Use the provided sections as your primary source. ` +
+     `Always cite which section your answer comes from.`;
+
+   const userTurn =
+     `Relevant sections:\n\n${contextParts.join('\n\n---\n\n')}` +
+     `\n\nQuestion: ${question}`;
+
+   // 3. Call the LLM with chat history
+   const response = await llm(userTurn, {
+     chatHistory: [
+       { role: 'system' as any, content: systemPrompt },
+       ...chatHistory,
+     ],
+   });
+
+   // 4. Return the answer plus citation metadata
+   return {
+     answer: response.content,
+     citations: hits.map(h => ({
+       title: h.nodeTitle,
+       nodeId: h.nodeId,
+       pages: `${h.startIndex}–${h.endIndex}`,
+       score: Math.round(h.totalScore * 100),
+     })),
+   };
+ }
+ ```
+
+ > **Chat requires an LLM provider.** In **Keyword mode** the Chat tab is visible, but the `llm` ref is `null` — configure an API key in the sidebar and switch to **Full LLM** mode before building the index to enable chat.
+
+ #### 9. Running the demo locally
 
  ```bash
  git clone https://github.com/subham11/react-native-pageindex.git
@@ -191,7 +264,7 @@ npm install
  npm run dev # → http://localhost:5173
  ```
 
- Select **Sample CSV → Keyword** for an instant zero-API-key demo, or select **Sample PDF → Full LLM**, enter an OpenAI or Anthropic key, and click **Build LLM Index** to see the full semantic-tree pipeline in action.
+ Select **Sample CSV → Keyword** for an instant zero-API-key demo, or select **Sample PDF → Full LLM**, enter an OpenAI or Anthropic key, and click **Build LLM Index** to see the full semantic-tree pipeline in action. Once the index is built, switch to the **💬 Chat** tab to start a conversation with the document.
 
  ---
 
@@ -202,6 +275,7 @@ Select **Sample CSV → Keyword** for an instant zero-API-key demo, or select **
  | **Multi-format** | PDF, Word (.docx), CSV, Spreadsheet (.xlsx/.xls), Markdown |
  | **Forward index** | Hierarchical tree: chapters → sections → subsections |
  | **Reverse index** | Inverted index: term → node locations for fast lookup |
+ | **Conversational chat** | Multi-turn Q&A with cited answers, backed by the reverse index |
  | **Provider-agnostic** | Pass any LLM (OpenAI, Anthropic, Ollama, Gemini…) |
  | **Progress tracking** | Fine-grained per-step callbacks (13 PDF steps, 8 MD steps) |
  | **Fully typed** | 100% TypeScript, `.d.ts` declarations included |
@@ -553,6 +627,83 @@ const llm: LLMProvider = async (prompt) => {
 
  ---
 
+ ## Conversational Chat Mode
+
+ Once you have a `PageIndexResult` and a `ReverseIndex`, you can add a full multi-turn chat interface to your app. The pattern is:
+
+ ```
+ User question
+   → searchReverseIndex()      ← retrieves the most relevant tree nodes
+   → build grounded context    ← page text / node summaries (no embeddings)
+   → LLMProvider()             ← any provider, with chat history
+   → cited answer              ← response + source metadata
+ ```
+
+ ### Minimal example
+
+ ```ts
+ import {
+   pageIndex,
+   buildReverseIndex,
+   searchReverseIndex,
+ } from 'react-native-pageindex';
+
+ // 1. Build the forward index (once per document)
+ const result = await pageIndex({ pages, llm, docName: 'My Docs' });
+
+ // 2. Build the reverse index (once per document)
+ const reverseIndex = await buildReverseIndex({ result, pages, options: { mode: 'keyword' } });
+
+ // 3. Chat loop
+ const history: { role: 'user' | 'assistant'; content: string }[] = [];
+
+ async function ask(question: string) {
+   // Retrieve the top-5 relevant nodes
+   const hits = searchReverseIndex(reverseIndex, question, 5);
+
+   // Build grounded context
+   const context = hits
+     .map(h => `[${h.nodeTitle}]\n${h.summary ?? ''}`)
+     .join('\n\n---\n\n');
+
+   const userTurn = `Context:\n${context}\n\nQuestion: ${question}`;
+
+   // Call the LLM with the running history
+   const { content } = await llm(userTurn, { chatHistory: history });
+
+   // Update the history for the next turn
+   history.push({ role: 'user', content: question });
+   history.push({ role: 'assistant', content });
+
+   return {
+     answer: content,
+     sources: hits.map(h => ({ title: h.nodeTitle, pages: `${h.startIndex}–${h.endIndex}` })),
+   };
+ }
+
+ // Usage
+ const { answer, sources } = await ask('Best season to grow paddy in Odisha?');
+ console.log(answer);
+ // → "According to the 'Rice (Paddy) Cultivation' section, rice is primarily
+ //    a kharif crop … the best season is during the kharif / monsoon period."
+ console.log(sources);
+ // → [{ title: 'Rice (Paddy) Cultivation', pages: '12–15' }, ...]
+ ```
+
+ ### Chat in the browser demo
+
+ The demo app's **💬 Chat** tab is a full-featured implementation built on top of the pattern above:
+
+ - **Multi-turn** — up to 10 previous messages are sent as `chatHistory`, preserving conversational context.
+ - **Cited answers** — each response includes expandable source cards with node title, node ID, page range, and relevance score (0–100).
+ - **Grounded context** — actual page text is preferred over summaries; each node's contribution is capped at 1,200 characters to stay within token budgets.
+ - **Keyboard shortcuts** — Enter to send, Shift+Enter for a newline.
+ - **LLM providers** — OpenAI (via Vite dev-server proxy), Anthropic (direct), or Ollama (local) — configured in the sidebar before building the index.
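The two budget rules in the list above (a 10-turn history window and a 1,200-character cap per node) can be sketched as follows; the helper names are hypothetical, not the demo's actual `ChatPanel.tsx` internals:

```typescript
// Illustrative sketch of the two budgets described above: keep only the
// last 10 turns of history, and cap each node's context at 1,200 chars.
type Turn = { role: 'user' | 'assistant'; content: string };

const MAX_TURNS = 10;
const MAX_NODE_CHARS = 1200;

function windowHistory(history: Turn[]): Turn[] {
  // Only the most recent MAX_TURNS messages are sent to the model.
  return history.slice(-MAX_TURNS);
}

function capNodeText(text: string): string {
  return text.slice(0, MAX_NODE_CHARS);
}

// Example: a 25-turn history is trimmed to its last 10 turns.
const longHistory: Turn[] = Array.from({ length: 25 }, (_, i) => ({
  role: i % 2 === 0 ? 'user' : 'assistant',
  content: `turn ${i}`,
}));
console.log(windowHistory(longHistory).length); // 10
console.log(capNodeText('x'.repeat(5000)).length); // 1200
```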
+
+ > **Tip:** For best chat quality, build the index in **Full LLM** mode (not Keyword mode) so each node has a rich LLM-generated summary the chat can draw on when no page text is available.
+
+ ---
+
  ## React Native Usage
 
  ```ts
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "react-native-pageindex",
- "version": "0.1.2",
+ "version": "0.1.3",
  "description": "Vectorless, reasoning-based RAG — builds a hierarchical tree index from PDF, DOCX, CSV, XLSX or Markdown using any LLM. React Native compatible.",
  "main": "dist/index.js",
  "module": "dist/index.js",