rlm-analyzer 1.3.4 → 1.5.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (62)
  1. package/CHANGELOG.md +30 -0
  2. package/README.md +275 -40
  3. package/dist/cli.d.ts +2 -0
  4. package/dist/cli.d.ts.map +1 -1
  5. package/dist/cli.js +220 -109
  6. package/dist/cli.js.map +1 -1
  7. package/dist/config.d.ts +51 -4
  8. package/dist/config.d.ts.map +1 -1
  9. package/dist/config.js +203 -8
  10. package/dist/config.js.map +1 -1
  11. package/dist/context-manager.d.ts +2 -2
  12. package/dist/context-manager.d.ts.map +1 -1
  13. package/dist/context-manager.js +1 -1
  14. package/dist/context-manager.js.map +1 -1
  15. package/dist/grounding.d.ts +4 -2
  16. package/dist/grounding.d.ts.map +1 -1
  17. package/dist/grounding.js +26 -32
  18. package/dist/grounding.js.map +1 -1
  19. package/dist/index.d.ts +6 -2
  20. package/dist/index.d.ts.map +1 -1
  21. package/dist/index.js +16 -4
  22. package/dist/index.js.map +1 -1
  23. package/dist/mcp-server.d.ts +33 -0
  24. package/dist/mcp-server.d.ts.map +1 -1
  25. package/dist/mcp-server.js +99 -17
  26. package/dist/mcp-server.js.map +1 -1
  27. package/dist/models.d.ts +60 -4
  28. package/dist/models.d.ts.map +1 -1
  29. package/dist/models.js +238 -14
  30. package/dist/models.js.map +1 -1
  31. package/dist/orchestrator.d.ts +3 -3
  32. package/dist/orchestrator.d.ts.map +1 -1
  33. package/dist/orchestrator.js +42 -45
  34. package/dist/orchestrator.js.map +1 -1
  35. package/dist/providers/bedrock.d.ts +62 -0
  36. package/dist/providers/bedrock.d.ts.map +1 -0
  37. package/dist/providers/bedrock.js +256 -0
  38. package/dist/providers/bedrock.js.map +1 -0
  39. package/dist/providers/claude.d.ts +43 -0
  40. package/dist/providers/claude.d.ts.map +1 -0
  41. package/dist/providers/claude.js +240 -0
  42. package/dist/providers/claude.js.map +1 -0
  43. package/dist/providers/factory.d.ts +44 -0
  44. package/dist/providers/factory.d.ts.map +1 -0
  45. package/dist/providers/factory.js +91 -0
  46. package/dist/providers/factory.js.map +1 -0
  47. package/dist/providers/gemini.d.ts +26 -0
  48. package/dist/providers/gemini.d.ts.map +1 -0
  49. package/dist/providers/gemini.js +139 -0
  50. package/dist/providers/gemini.js.map +1 -0
  51. package/dist/providers/index.d.ts +10 -0
  52. package/dist/providers/index.d.ts.map +1 -0
  53. package/dist/providers/index.js +11 -0
  54. package/dist/providers/index.js.map +1 -0
  55. package/dist/providers/types.d.ts +100 -0
  56. package/dist/providers/types.d.ts.map +1 -0
  57. package/dist/providers/types.js +6 -0
  58. package/dist/providers/types.js.map +1 -0
  59. package/dist/types.d.ts +4 -1
  60. package/dist/types.d.ts.map +1 -1
  61. package/dist/types.js.map +1 -1
  62. package/package.json +18 -2
package/CHANGELOG.md CHANGED
@@ -5,6 +5,36 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [1.5.0] - 2026-01-22
+
+ ### Added
+ - **Claude Provider (Anthropic API)**: Direct integration with Claude API as third provider option
+   - New `--provider claude` CLI option
+   - Supports Claude 4.5 Sonnet, Opus, and Haiku models
+   - Model aliases: `sonnet`, `opus`, `haiku`, `claude-sonnet`, `claude-opus`, `claude-haiku`
+   - Authentication via `ANTHROPIC_API_KEY` or `CLAUDE_API_KEY` environment variable
+ - **Claude Web Search**: Web grounding support for Claude provider
+   - Uses `web_search_20250305` tool for real-time web data
+   - Automatically enabled with `--grounding` flag when using Claude
+ - **Token Usage Tracking**: Added `TokenUsage` interface for tracking API token consumption
+   - Includes `inputTokens`, `outputTokens`, `totalTokens`
+   - Cache token tracking for Claude (`cacheCreationTokens`, `cacheReadTokens`)
+
+ ### Changed
+ - **Bedrock Web Grounding**: Now uses Nova 2 Lite (`us.amazon.nova-2-lite-v1:0`) instead of Nova Premier
+   - Nova 2 Lite has native web grounding support via `nova_grounding` system tool
+ - Updated provider factory to support three providers: `gemini`, `bedrock`, `claude`
+ - Enhanced MCP server with Claude provider support and status reporting
+
+ ### Removed
+ - Removed `nova-2-pro` model alias (Nova 2 Pro is in preview only, not generally available)
+
+ ### Documentation
+ - Updated `docs/models-and-commands.md` with Claude provider models and aliases
+ - Updated README with Claude provider setup instructions
+
+ ---
+
  ## [1.3.4] - 2026-01-20
 
  ### Added
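The `TokenUsage` shape added in 1.5.0 can be sketched in TypeScript. Field names come from the changelog entry above; the optionality of the cache fields and the `addUsage` helper are assumptions for illustration, not the package's actual code.

```typescript
// Sketch of the TokenUsage interface described in the 1.5.0 changelog entry.
// Field names come from the changelog; optionality is an assumption.
interface TokenUsage {
  inputTokens: number;
  outputTokens: number;
  totalTokens: number;
  // Claude-only cache accounting (hypothetically optional for other providers)
  cacheCreationTokens?: number;
  cacheReadTokens?: number;
}

// Hypothetical helper: merge usage across multiple API calls.
function addUsage(a: TokenUsage, b: TokenUsage): TokenUsage {
  return {
    inputTokens: a.inputTokens + b.inputTokens,
    outputTokens: a.outputTokens + b.outputTokens,
    totalTokens: a.totalTokens + b.totalTokens,
    cacheCreationTokens: (a.cacheCreationTokens ?? 0) + (b.cacheCreationTokens ?? 0),
    cacheReadTokens: (a.cacheReadTokens ?? 0) + (b.cacheReadTokens ?? 0),
  };
}
```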
package/README.md CHANGED
@@ -5,7 +5,7 @@
  [![npm version](https://badge.fury.io/js/rlm-analyzer.svg)](https://www.npmjs.com/package/rlm-analyzer)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 
- Analyze any codebase with AI that can process **100x beyond context limits**. Powered by **Gemini 3** and based on MIT CSAIL research on [Recursive Language Models](https://arxiv.org/abs/2512.24601).
+ Analyze any codebase with AI that can process **100x beyond context limits**. Powered by **Gemini 3**, **Amazon Bedrock (Nova/Claude/Llama)**, or **Claude (Anthropic API)**, and based on MIT CSAIL research on [Recursive Language Models](https://arxiv.org/abs/2512.24601).
 
  ## Features
 
@@ -16,8 +16,10 @@ Analyze any codebase with AI that can process **100x beyond context limits**. Po
  - **Refactoring Suggestions** - Identifies code smells and improvements
  - **Symbol Search** - Find all usages of functions, classes, variables
  - **Custom Questions** - Ask anything about your codebase
+ - **Multi-Provider Support** - Choose between Gemini (default), Amazon Bedrock (Nova/Claude/Llama), or Claude (Anthropic API)
+ - **Web Grounding** - Verify package versions with real-time web search (Gemini, Bedrock Nova, and Claude)
  - **MCP Integration** - Works with Claude Code, Cursor, and other MCP clients
- - **Cost Efficient** - Save 60-73% on API costs by offloading to Gemini
+ - **Cost Efficient** - Save 60-73% on API costs by offloading to Gemini/Nova
  - **Token Optimization** - Context compression saves an additional 50-70%
 
  ## Table of Contents
@@ -34,6 +36,11 @@ Analyze any codebase with AI that can process **100x beyond context limits**. Po
  - [Cost Savings](#cost-savings-with-mcp-integration)
  - [Troubleshooting](#troubleshooting)
 
+ ## Documentation
+
+ - **[How It Works](docs/how-it-works.md)** - Deep dive into RLM architecture, recursive analysis, and token optimization
+ - **[Models & Commands Reference](docs/models-and-commands.md)** - Complete list of CLI commands, model IDs, and aliases for Gemini, Bedrock, and Claude
+
  ---
 
  ## Installation
@@ -60,32 +67,73 @@ npx rlm-analyzer summary
 
  ## Quick Start
 
- ### 1. Configure API Key
+ ### 1. Configure Provider Credentials
+
+ #### Option A: Google Gemini (Default)
 
  Get a free API key from [Google AI Studio](https://makersuite.google.com/app/apikey), then:
 
  ```bash
- # Option 1: Use the config command
+ # Use the config command
  rlm config YOUR_GEMINI_API_KEY
 
- # Option 2: Set environment variable
+ # Or set environment variable
  export GEMINI_API_KEY=your_api_key
 
- # Option 3: Create .env file in your project
+ # Or create .env file
  echo "GEMINI_API_KEY=your_api_key" > .env
  ```
 
+ #### Option B: Amazon Bedrock
+
+ First, install the AWS SDK (required for Bedrock):
+
+ ```bash
+ npm install @aws-sdk/client-bedrock-runtime
+ ```
+
+ Then configure authentication (choose one):
+
+ ```bash
+ # Option 1: Bedrock API Key (Recommended - simplest setup)
+ # Generate at: AWS Console → Bedrock → API keys
+ export AWS_BEARER_TOKEN_BEDROCK=your_bedrock_api_key
+ export AWS_REGION=us-east-1
+
+ # Option 2: AWS Access Keys
+ export AWS_ACCESS_KEY_ID=your_access_key
+ export AWS_SECRET_ACCESS_KEY=your_secret_key
+ export AWS_REGION=us-east-1
+
+ # Option 3: AWS CLI profile
+ aws configure
+ ```
+
+ #### Option C: Claude (Anthropic API)
+
+ Get your API key from [Anthropic Console](https://console.anthropic.com/), then:
+
+ ```bash
+ export ANTHROPIC_API_KEY=your_api_key
+ ```
+
  ### 2. Analyze Your Code
 
  ```bash
- # Get a codebase summary
+ # Get a codebase summary (uses Gemini by default)
  rlm summary
 
+ # Use Amazon Bedrock instead
+ rlm summary --provider bedrock
+
+ # Use Claude (Anthropic API) instead
+ rlm summary --provider claude
+
  # Analyze architecture
  rlm arch
 
- # Security analysis
- rlm security
+ # Security analysis with web grounding
+ rlm security --grounding
 
  # Ask a question
  rlm ask "How does authentication work?"
@@ -117,6 +165,8 @@ rlm ask "How does authentication work?"
  |--------|-------------|
  | `--dir, -d <path>` | Directory to analyze (default: current) |
  | `--model, -m <name>` | Model to use (see [Model Configuration](#model-configuration)) |
+ | `--provider, -p <name>` | LLM provider: `gemini` (default), `bedrock`, or `claude` |
+ | `--grounding` | Enable web grounding for security analysis |
  | `--output, -o <file>` | Save results to a markdown file |
  | `--verbose, -v` | Show detailed turn-by-turn output |
  | `--json` | Output results as JSON |
@@ -131,6 +181,15 @@ rlm arch --dir /path/to/project
  # Use a specific model
  rlm summary --model smart
 
+ # Use Amazon Bedrock's smart model (Claude 4.5 Sonnet)
+ rlm summary --provider bedrock --model smart
+
+ # Use Bedrock with Claude (via AWS)
+ rlm arch --provider bedrock --model claude-sonnet
+
+ # Use Claude directly (Anthropic API)
+ rlm arch --provider claude --model sonnet
+
  # Find all usages of a function
  rlm find "handleSubmit"
 
@@ -140,6 +199,9 @@ rlm explain src/auth/login.ts
  # Ask about the codebase
  rlm ask "What design patterns are used in this codebase?"
 
+ # Security analysis with web grounding (verifies package versions)
+ rlm security --grounding
+
  # Get JSON output for scripting
  rlm summary --json > analysis.json
 
@@ -160,6 +222,8 @@ RLM Analyzer includes an MCP (Model Context Protocol) server for integration wit
 
  Add to your Claude Code configuration (`~/.claude.json` or project `.mcp.json`):
 
+ #### Using Gemini (Default)
+
  ```json
  {
    "mcpServers": {
@@ -174,6 +238,71 @@ Add to your Claude Code configuration (`~/.claude.json` or project `.mcp.json`):
    }
  }
  ```
 
+ #### Using Amazon Bedrock
+
+ First, install the AWS SDK in your project or globally:
+
+ ```bash
+ npm install @aws-sdk/client-bedrock-runtime
+ ```
+
+ Then configure the MCP server (choose one authentication method):
+
+ **Option 1: Bedrock API Key (Recommended)**
+
+ ```json
+ {
+   "mcpServers": {
+     "rlm-analyzer": {
+       "command": "npx",
+       "args": ["-y", "rlm-analyzer-mcp"],
+       "env": {
+         "RLM_PROVIDER": "bedrock",
+         "AWS_BEARER_TOKEN_BEDROCK": "your_bedrock_api_key",
+         "AWS_REGION": "us-east-1"
+       }
+     }
+   }
+ }
+ ```
+
+ **Option 2: AWS Access Keys**
+
+ ```json
+ {
+   "mcpServers": {
+     "rlm-analyzer": {
+       "command": "npx",
+       "args": ["-y", "rlm-analyzer-mcp"],
+       "env": {
+         "RLM_PROVIDER": "bedrock",
+         "AWS_ACCESS_KEY_ID": "your_access_key",
+         "AWS_SECRET_ACCESS_KEY": "your_secret_key",
+         "AWS_REGION": "us-east-1"
+       }
+     }
+   }
+ }
+ ```
+
+ **Option 3: AWS Profile**
+
+ ```json
+ {
+   "mcpServers": {
+     "rlm-analyzer": {
+       "command": "npx",
+       "args": ["-y", "rlm-analyzer-mcp"],
+       "env": {
+         "RLM_PROVIDER": "bedrock",
+         "AWS_PROFILE": "your_profile_name",
+         "AWS_REGION": "us-east-1"
+       }
+     }
+   }
+ }
+ ```
+
  ### Available MCP Tools
 
  | Tool | Description |
@@ -213,12 +342,12 @@ import {
    loadFiles,
  } from 'rlm-analyzer';
 
- // Analyze architecture
+ // Analyze architecture (uses default provider - Gemini)
  const result = await analyzeArchitecture('/path/to/project');
  console.log(result.answer);
 
- // Security analysis
- const security = await analyzeSecurity('/path/to/project');
+ // Security analysis with Bedrock
+ const security = await analyzeSecurity('/path/to/project', { provider: 'bedrock' });
  console.log(security.answer);
 
  // Ask a custom question
@@ -233,7 +362,8 @@ const full = await analyzeCodebase({
    directory: '/path/to/project',
    query: 'Explain the data flow',
    analysisType: 'custom',
-   model: 'gemini-3-pro-preview',
+   model: 'smart', // Uses provider-specific alias
+   provider: 'bedrock', // Use Amazon Bedrock
    verbose: true,
  });
  ```
@@ -337,12 +467,26 @@ const relevantMemories = attention.filterByAttention(memories, 10);
 
  ### Available Models
 
- | Model ID | Alias | Description |
- |----------|-------|-------------|
- | `gemini-3-flash-preview` | `fast`, `flash`, `default` | Fast and efficient (recommended) |
- | `gemini-3-pro-preview` | `smart`, `pro` | Most capable |
- | `gemini-2.5-flash` | `flash-2.5` | Stable release |
- | `gemini-2.0-flash-exp` | `flash-2` | Fallback option |
+ #### Gemini Models (Default Provider)
+
+ | Alias | Model ID | Description |
+ |-------|----------|-------------|
+ | `fast`, `default` | `gemini-3-flash-preview` | Fast and efficient (recommended) |
+ | `smart`, `pro` | `gemini-3-pro-preview` | Most capable |
+
+ #### Amazon Bedrock Models
+
+ | Alias | Model ID | Description |
+ |-------|----------|-------------|
+ | `fast`, `default` | `us.amazon.nova-2-lite-v1:0` | Nova 2 Lite (default) |
+ | `smart` | `us.anthropic.claude-sonnet-4-5-*` | Claude 4.5 Sonnet |
+ | `claude-sonnet` | `us.anthropic.claude-sonnet-4-5-*` | Claude 4.5 Sonnet |
+ | `claude-opus` | `us.anthropic.claude-opus-4-5-*` | Claude 4.5 Opus |
+ | `qwen3-coder` | `qwen.qwen3-coder-30b-*` | Qwen3 Coder - Best for coding |
+ | `gpt-oss` | `openai.gpt-oss-120b-*` | OpenAI GPT OSS |
+ | `llama-4` | `us.meta.llama4-maverick-*` | Llama 4 |
+
+ > **[See all models and aliases →](docs/models-and-commands.md)**
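The alias tables above amount to a small lookup step inside the provider factory the changelog mentions. A hedged sketch follows: the alias maps are condensed from the tables, the Claude model IDs are placeholders (the docs list aliases but not IDs), and the resolver itself is hypothetical, not the package's implementation.

```typescript
type ProviderName = "gemini" | "bedrock" | "claude";

// Alias tables condensed from the README tables above. Bedrock IDs shown
// with "*" in the tables are kept as-is; Claude IDs are placeholders.
const MODEL_ALIASES: Record<ProviderName, Record<string, string>> = {
  gemini: {
    fast: "gemini-3-flash-preview",
    default: "gemini-3-flash-preview",
    smart: "gemini-3-pro-preview",
    pro: "gemini-3-pro-preview",
  },
  bedrock: {
    fast: "us.amazon.nova-2-lite-v1:0",
    default: "us.amazon.nova-2-lite-v1:0",
    smart: "us.anthropic.claude-sonnet-4-5-*",
    "claude-sonnet": "us.anthropic.claude-sonnet-4-5-*",
    "claude-opus": "us.anthropic.claude-opus-4-5-*",
  },
  claude: {
    // Hypothetical IDs: the changelog lists the aliases but not the model IDs.
    default: "claude-sonnet-placeholder",
    sonnet: "claude-sonnet-placeholder",
    opus: "claude-opus-placeholder",
    haiku: "claude-haiku-placeholder",
  },
};

// Resolve an alias to a model ID; unknown names pass through as full IDs.
function resolveModel(provider: ProviderName, model?: string): string {
  const table = MODEL_ALIASES[provider];
  if (!model) return table["default"];
  return table[model] ?? model;
}
```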
 
  ### Configuration Priority
 
@@ -351,7 +495,7 @@ Model selection follows this priority order:
  1. **CLI `--model` flag** (highest priority)
  2. **Environment variables**: `RLM_DEFAULT_MODEL`, `RLM_FALLBACK_MODEL`
  3. **Config file**: `~/.rlm-analyzer/config.json`
- 4. **Built-in defaults**: `gemini-3-flash-preview`
+ 4. **Built-in defaults**: `gemini-3-flash-preview` (Gemini) or `us.amazon.nova-2-lite-v1:0` (Bedrock)
 
  ### Using Model Aliases
 
@@ -361,16 +505,42 @@ rlm summary --model fast
 
  # Use smart model (gemini-3-pro-preview)
  rlm arch --model smart
+
+ # Use Bedrock's smart model (Claude 4.5 Sonnet)
+ rlm summary --provider bedrock --model smart
+
+ # Use Bedrock with Claude Sonnet
+ rlm arch --provider bedrock --model claude-sonnet
  ```
 
  ### Environment Variables
 
  ```bash
+ # Set default provider (gemini, bedrock, or claude)
+ export RLM_PROVIDER=gemini
+
  # Set default model
  export RLM_DEFAULT_MODEL=gemini-3-pro-preview
 
  # Set fallback model
  export RLM_FALLBACK_MODEL=gemini-2.0-flash-exp
+
+ # Gemini API key
+ export GEMINI_API_KEY=your_api_key
+
+ # Bedrock authentication (choose one):
+ # Option 1: Bedrock API Key (recommended)
+ export AWS_BEARER_TOKEN_BEDROCK=your_bedrock_api_key
+ export AWS_REGION=us-east-1
+
+ # Option 2: AWS Access Keys
+ export AWS_ACCESS_KEY_ID=your_access_key
+ export AWS_SECRET_ACCESS_KEY=your_secret_key
+ export AWS_REGION=us-east-1
+
+ # Option 3: AWS Profile
+ export AWS_PROFILE=your_profile_name
+ export AWS_REGION=us-east-1
  ```
 
  ### Config File
@@ -379,7 +549,8 @@ Create `~/.rlm-analyzer/config.json`:
 
  ```json
  {
-   "apiKey": "your_api_key",
+   "apiKey": "your_gemini_api_key",
+   "provider": "gemini",
    "models": {
      "default": "gemini-3-flash-preview",
      "fallback": "gemini-2.0-flash-exp"
@@ -446,7 +617,9 @@ Multi-pass analysis for quality improvement on complex queries.
 
  ## Configuration
 
- ### API Key Storage
+ ### Credentials Storage
+
+ #### Gemini Credentials
 
  Your API key can be stored in multiple locations (checked in order):
 
@@ -457,6 +630,19 @@ Your API key can be stored in multiple locations (checked in order):
  5. `~/.rlm-analyzer/config.json`
  6. `~/.config/rlm-analyzer/config.json`
 
+ #### Bedrock Credentials
+
+ Authentication options (in priority order):
+
+ 1. **Bedrock API Key** (Recommended): `AWS_BEARER_TOKEN_BEDROCK` environment variable
+    - Generate at: AWS Console → Bedrock → API keys
+    - Simplest setup, no IAM configuration required
+ 2. **AWS Access Keys**: `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY`
+ 3. **AWS Profile**: `AWS_PROFILE` or `~/.aws/credentials`
+ 4. **IAM Role**: Automatic when running on AWS infrastructure
+
+ The region can be set via `AWS_REGION` environment variable (default: `us-east-1`).
+
  ### File Filtering
 
  Default file extensions analyzed:
@@ -491,32 +677,43 @@ target, .idea, .vscode, coverage, .nyc_output
 
  ## How It Works
 
- RLM Analyzer uses Recursive Language Models (RLMs) to analyze codebases that exceed traditional context limits:
+ RLM Analyzer uses Recursive Language Models (RLMs) to analyze codebases that exceed traditional context limits.
 
  ```
  ┌─────────────────────────────────────────────────────────────┐
- │                      RLM Orchestrator                       │
- ├─────────────────────────────────────────────────────────────┤
- │ 1. File Loading      Load codebase into virtual env         │
- │ 2. REPL Execution    AI writes code to explore files        │
- │ 3. Sub-LLM Calls     Delegate analysis to specialized       │
- │                      sub-queries (llm_query)                │
- │ 4. Context Mgmt      Compress, optimize, detect rot         │
- │ 5. Synthesis         Combine findings into final answer     │
+ │                         User Query                          │
+ └─────────────────────────────────────────────────────────────┘
+                               │
+                               ▼
+ ┌─────────────────────────────────────────────────────────────┐
+ │                  Orchestrator (Main LLM)                    │
+ │   Sees file tree, decides which files to read               │
+ │   Spawns Sub-LLMs for deep analysis                         │
  └─────────────────────────────────────────────────────────────┘
+                ┌──────────────┼──────────────┐
+                ▼              ▼              ▼
+          ┌──────────┐   ┌──────────┐   ┌──────────┐
+          │ Sub-LLM  │   │ Sub-LLM  │   │ Sub-LLM  │
+          └──────────┘   └──────────┘   └──────────┘
+                │              │              │
+                └──────────────┼──────────────┘
+                               ▼
+                         Final Answer
  ```
 
- ### The RLM Approach
+ ### Key Concepts
 
- 1. **File Loading** - Loads your codebase into a virtual file index
- 2. **REPL Execution** - AI writes and executes Python-like code to explore files
- 3. **Sub-LLM Calls** - Complex analysis delegated via `llm_query()` function
- 4. **Context Management** - Compression, sliding window, memory bank
- 5. **Iterative Refinement** - Multiple turns until `FINAL()` is called
- 6. **Final Answer** - Synthesized analysis based on deep exploration
+ 1. **Recursive Analysis** - Main LLM spawns sub-LLMs to analyze files in parallel
+ 2. **Context Optimization** - Shows file tree first, LLM requests only needed files
+ 3. **Multi-turn Conversation** - Multiple turns to read, analyze, and refine
+ 4. **Memory Bank** - Tracks findings to prevent "context rot"
+ 5. **Adaptive Compression** - Compresses older context as usage increases
 
  This enables analysis of codebases **100x larger** than traditional context windows.
 
+ > **[Read the full technical deep-dive →](docs/how-it-works.md)**
+
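The fan-out/fan-in flow in the diagram above can be sketched as follows. This is an illustration only, assuming a generic `llmQuery` provider call; the package's actual orchestrator also batches files, compresses context, and tracks a memory bank.

```typescript
// Hedged illustration of the recursive fan-out/fan-in described above --
// not the package's orchestrator. `llmQuery` stands in for a provider call.
type LlmQuery = (prompt: string) => Promise<string>;

async function analyzeRecursively(
  files: Record<string, string>,
  question: string,
  llmQuery: LlmQuery,
): Promise<string> {
  // Fan out: one sub-LLM call per file (the real tool batches and compresses).
  const findings = await Promise.all(
    Object.entries(files).map(([path, source]) =>
      llmQuery(`File ${path}:\n${source}\n\nQuestion: ${question}`),
    ),
  );
  // Fan in: a final call synthesizes the per-file findings into one answer.
  return llmQuery(
    `Synthesize an answer to "${question}" from these findings:\n${findings.join("\n")}`,
  );
}
```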
 
  ---
 
  ## Cost Savings with MCP Integration
@@ -579,7 +776,7 @@ When used as an MCP tool with Claude Code or Cursor, RLM Analyzer significantly
 
  ## Troubleshooting
 
- ### "API key not configured"
+ ### "API key not configured" (Gemini)
 
  ```bash
  # Check if key is set
@@ -589,6 +786,29 @@ rlm config
  rlm config YOUR_API_KEY
  ```
 
+ ### "AWS credentials not configured" (Bedrock)
+
+ ```bash
+ # Check credentials
+ rlm config
+
+ # Option 1: Set environment variables
+ export AWS_ACCESS_KEY_ID=your_key
+ export AWS_SECRET_ACCESS_KEY=your_secret
+ export AWS_REGION=us-east-1
+
+ # Option 2: Use AWS CLI to configure
+ aws configure
+ ```
+
+ ### "Amazon Bedrock provider requires @aws-sdk/client-bedrock-runtime"
+
+ The AWS SDK is an optional dependency. Install it to use Bedrock:
+
+ ```bash
+ npm install @aws-sdk/client-bedrock-runtime
+ ```
+
  ### "No files found to analyze"
 
  Make sure you're in a directory with code files, or specify a directory:
 
@@ -602,6 +822,7 @@ rlm summary --dir /path/to/code
  - Large codebases take longer (100+ files = more sub-LLM calls)
  - Use `--verbose` to see progress and token savings
  - Consider analyzing specific subdirectories
+ - Use `--model fast` for faster analysis
 
  ### Execution errors in verbose mode
 
@@ -610,9 +831,16 @@ Some codebases trigger security filters (e.g., files containing `process.env`).
  ### MCP server not connecting
 
  1. Verify the command works: `npx rlm-analyzer-mcp`
- 2. Check API key is set in the MCP config
+ 2. Check that credentials are set in the MCP config (Gemini API key or AWS credentials)
  3. Restart your MCP client (Claude Code, Cursor)
 
+ ### Bedrock throttling errors
+
+ If you see throttling errors with Bedrock, try:
+ - Using a smaller model (`--model fast`)
+ - Reducing concurrent requests
+ - Requesting higher limits from AWS
+
  ---
 
  ## TypeScript Types
@@ -628,6 +856,13 @@ import type {
    CodeAnalysisResult,
    AnalysisType,
 
+   // Provider types
+   ProviderName, // 'gemini' | 'bedrock' | 'claude'
+   LLMProvider, // Provider interface
+   Message, // Conversation message
+   GenerateOptions, // Generation options
+   GenerateResponse, // Generation response
+
    // Context management
    MemoryEntry,
    CompressedTurn,
package/dist/cli.d.ts CHANGED
@@ -2,6 +2,7 @@
  /**
   * RLM Analyzer CLI
   * Command-line interface for code analysis
+  * Supports multiple providers: Gemini (default), Amazon Bedrock, and Claude
   *
   * Usage:
   *   rlm <command> [options]
@@ -20,6 +21,7 @@
   *
   * Options:
   *   --dir, -d       Directory to analyze (default: current)
+  *   --provider, -p  Provider to use (gemini|bedrock|claude)
   *   --verbose, -v   Show detailed output
   *   --json          Output as JSON
   *   --help, -h      Show help
package/dist/cli.d.ts.map CHANGED
@@ -1 +1 @@
- {"version":3,"file":"cli.d.ts","sourceRoot":"","sources":["../src/cli.ts"],"names":[],"mappings":";AACA;;;;;;;;;;;;;;;;;;;;;;;;GAwBG;AA2kBH,wBAAsB,MAAM,IAAI,OAAO,CAAC,IAAI,CAAC,CAwE5C"}
+ {"version":3,"file":"cli.d.ts","sourceRoot":"","sources":["../src/cli.ts"],"names":[],"mappings":";AACA;;;;;;;;;;;;;;;;;;;;;;;;;;GA0BG;AAkoBH,wBAAsB,MAAM,IAAI,OAAO,CAAC,IAAI,CAAC,CA0I5C"}