recursive-llm-ts 4.3.2 → 4.3.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -353,6 +353,194 @@ const result = await rlm.completion(
 
  See the [LiteLLM documentation](https://docs.litellm.ai/docs/providers) for the complete list of supported providers and their configuration.
 
+ ## Docker Deployment
+
+ ### Basic Dockerfile with Go Build
+
+ To containerize an application that uses `recursive-llm-ts`, install Go 1.21+ in your Docker image so the binary can be built during `npm install`:
+
+ ```dockerfile
+ FROM node:20-alpine
+
+ # Install Go 1.21+ for building the RLM binary
+ RUN apk add --no-cache go
+
+ # Set Go environment
+ ENV GOPATH=/go
+ ENV PATH=$PATH:$GOPATH/bin
+
+ WORKDIR /app
+
+ COPY package*.json ./
+ RUN npm install
+
+ COPY . .
+
+ ENV OPENAI_API_KEY=""
+ ENV NODE_ENV=production
+
+ CMD ["node", "your-app.js"]
+ ```
+
+ ### Multi-Stage Build (Recommended for Production)
+
+ For optimal image size and security, use a multi-stage build:
+
+ ```dockerfile
+ # Stage 1: Build the Go binary
+ FROM golang:1.21-alpine AS go-builder
+ WORKDIR /build
+ COPY go/go.mod go/go.sum ./
+ RUN go mod download
+ COPY go/ ./
+ RUN CGO_ENABLED=0 GOOS=linux go build -ldflags="-s -w" -o rlm-go ./cmd/rlm
+
+ # Stage 2: Build Node.js dependencies
+ FROM node:20-alpine AS node-builder
+ WORKDIR /app
+ COPY package*.json ./
+ RUN npm ci --omit=dev
+
+ # Stage 3: Final runtime image
+ FROM node:20-alpine
+ WORKDIR /app
+
+ COPY --from=node-builder /app/node_modules ./node_modules
+ COPY --from=go-builder /build/rlm-go ./bin/rlm-go
+ RUN chmod +x ./bin/rlm-go
+
+ COPY package*.json ./
+ COPY dist/ ./dist/
+
+ ENV NODE_ENV=production
+ ENV RLM_GO_BINARY=/app/bin/rlm-go
+ ENV OPENAI_API_KEY=""
+
+ CMD ["node", "dist/index.js"]
+ ```
+
+ **Benefits:** a smaller image (~150 MB vs. ~500 MB), faster rebuilds through layer caching, and a more secure runtime image (no Go toolchain included).
+
+ ### Docker Compose
+
+ ```yaml
+ version: '3.8'
+ services:
+   app:
+     build: .
+     environment:
+       - OPENAI_API_KEY=${OPENAI_API_KEY}
+       - NODE_ENV=production
+     ports:
+       - "3000:3000"
+ ```
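The Compose file above wires the API key through from the host environment. For a one-off container without Compose, an equivalent `docker build` / `docker run` pair looks roughly like this (the `rlm-app` image tag is illustrative, not part of the package):

```shell
# Build the image from the Dockerfile above, then run it,
# forwarding the OpenAI key from the host environment.
docker build -t rlm-app .
docker run --rm -p 3000:3000 -e OPENAI_API_KEY="$OPENAI_API_KEY" rlm-app
```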
+
+ ### Installing Go in Different Base Images
+
+ ```dockerfile
+ # Alpine
+ RUN apk add --no-cache go
+
+ # Debian/Ubuntu
+ RUN apt-get update && apt-get install -y golang-1.21
+
+ # Or use a pre-built binary (no Go required):
+ # download it from the GitHub releases and copy it to /app/bin/rlm-go
+ ```
+
+ ## Using the Go Module Directly
+
+ The Go implementation can be used as a standalone library in Go projects.
+
+ ### Installation
+
+ ```bash
+ go get github.com/jbeck018/recursive-llm-ts/go
+ ```
+
+ ### Usage
+
+ ```go
+ package main
+
+ import (
+     "fmt"
+     "os"
+
+     "github.com/jbeck018/recursive-llm-ts/go/rlm"
+ )
+
+ func main() {
+     config := rlm.Config{
+         MaxDepth:      5,
+         MaxIterations: 30,
+         APIKey:        os.Getenv("OPENAI_API_KEY"),
+     }
+
+     engine := rlm.New("gpt-4o-mini", config)
+
+     answer, stats, err := engine.Completion(
+         "What are the key points?",
+         "Your long document here...",
+     )
+     if err != nil {
+         fmt.Fprintf(os.Stderr, "Error: %v\n", err)
+         os.Exit(1)
+     }
+
+     fmt.Printf("Answer: %s\n", answer)
+     fmt.Printf("Stats: %d LLM calls, %d iterations\n",
+         stats.LlmCalls, stats.Iterations)
+ }
+ ```
+
+ ### Structured Output
+
+ ```go
+ schema := &rlm.JSONSchema{
+     Type: "object",
+     Properties: map[string]*rlm.JSONSchema{
+         "summary": {Type: "string"},
+         "score":   {Type: "number"},
+     },
+     Required: []string{"summary", "score"},
+ }
+
+ config := &rlm.StructuredConfig{
+     Schema:     schema,
+     MaxRetries: 3,
+ }
+
+ result, stats, err := engine.StructuredCompletion(
+     "Summarize and score",
+     document,
+     config,
+ )
+ ```
+
+ ### Building from Source
+
+ ```bash
+ cd go
+
+ # Standard build
+ go build -o rlm-go ./cmd/rlm
+
+ # Optimized (smaller binary)
+ go build -ldflags="-s -w" -o rlm-go ./cmd/rlm
+
+ # Cross-compile
+ GOOS=linux GOARCH=amd64 go build -o rlm-linux-amd64 ./cmd/rlm
+ GOOS=darwin GOARCH=arm64 go build -o rlm-darwin-arm64 ./cmd/rlm
+ ```
+
+ ### Running Tests
+
+ ```bash
+ cd go
+ go test -v ./rlm/...
+ ```
+
  ## How It Works
 
  This package provides a TypeScript wrapper around a Go implementation of Recursive-LLM, enabling seamless integration into Node.js/TypeScript applications without Python dependencies. The Go binary is built locally (or supplied via `RLM_GO_BINARY`) and invoked for completions.
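A minimal sketch of the `RLM_GO_BINARY` override mentioned above, useful when the Go toolchain is unavailable at install time (the binary path is illustrative):

```shell
# Skip the build-during-npm-install step by pointing the wrapper
# at a prebuilt binary; adjust the path to wherever rlm-go lives.
export RLM_GO_BINARY=/usr/local/bin/rlm-go
node dist/index.js
```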
package/bin/rlm-go CHANGED
Binary file
package/go/README.md CHANGED
@@ -1,24 +1,92 @@
- # RLM Go Binary
+ # RLM Go Module
 
  Go implementation of Recursive Language Models (RLM) based on the [original Python implementation](https://github.com/alexzhang13/rlm).
 
  ## Overview
 
- This is a self-contained Go binary that implements the RLM algorithm, allowing language models to process extremely long contexts (100k+ tokens) by storing context as a variable and allowing recursive exploration.
+ This is both a standalone Go library and a CLI binary that implements the RLM algorithm, allowing language models to process extremely long contexts (100k+ tokens) by storing context as a variable and enabling recursive exploration.
 
  **Key Difference from Python**: Uses JavaScript REPL instead of Python REPL for code execution.
 
- ## Building
+ ## Installation
+
+ ### As a Go Library
+
+ ```bash
+ go get github.com/jbeck018/recursive-llm-ts/go
+ ```
+
+ ### Usage as a Library
+
+ ```go
+ package main
+
+ import (
+     "fmt"
+     "os"
+
+     "github.com/jbeck018/recursive-llm-ts/go/rlm"
+ )
+
+ func main() {
+     config := rlm.Config{
+         MaxDepth:      5,
+         MaxIterations: 30,
+         APIKey:        os.Getenv("OPENAI_API_KEY"),
+     }
+
+     engine := rlm.New("gpt-4o-mini", config)
+
+     answer, stats, err := engine.Completion(
+         "What are the key themes?",
+         "Your long document here...",
+     )
+     if err != nil {
+         fmt.Fprintf(os.Stderr, "Error: %v\n", err)
+         os.Exit(1)
+     }
+
+     fmt.Printf("Answer: %s\n", answer)
+     fmt.Printf("Stats: %d LLM calls, %d iterations\n",
+         stats.LlmCalls, stats.Iterations)
+ }
+ ```
+
+ ### Structured Output
+
+ ```go
+ schema := &rlm.JSONSchema{
+     Type: "object",
+     Properties: map[string]*rlm.JSONSchema{
+         "summary": {Type: "string"},
+         "score":   {Type: "number"},
+     },
+     Required: []string{"summary", "score"},
+ }
+
+ config := &rlm.StructuredConfig{
+     Schema:     schema,
+     MaxRetries: 3,
+ }
+
+ result, stats, err := engine.StructuredCompletion(
+     "Summarize and score",
+     document,
+     config,
+ )
+ ```
+
+ ## Building the CLI Binary
 
  ```bash
  # Build the binary
- go build -o rlm ./cmd/rlm
+ go build -o rlm-go ./cmd/rlm
 
  # Run tests
- go test ./internal/rlm/... -v
+ go test ./rlm/... -v
 
  # Build with optimization
- go build -ldflags="-s -w" -o rlm ./cmd/rlm
+ go build -ldflags="-s -w" -o rlm-go ./cmd/rlm
  ```
 
  ## Usage
@@ -181,13 +249,16 @@ Works with any OpenAI-compatible API:
 
  ```
  cmd/rlm/main.go          # CLI entry point (JSON I/O)
- internal/rlm/
+ rlm/                     # Public package (importable)
+ ├── doc.go               # Package documentation
  ├── rlm.go               # Core RLM logic
  ├── types.go             # Config and stats types
+ ├── structured.go        # Structured completion with schema validation
  ├── parser.go            # FINAL() extraction
  ├── prompt.go            # System prompt builder
  ├── repl.go              # JavaScript REPL (goja)
- └── openai.go            # OpenAI API client
+ ├── openai.go            # OpenAI API client
+ └── errors.go            # Error types
  ```
 
  ## Error Handling
@@ -212,16 +283,16 @@ echo '{
 
  ```bash
  # Run all tests
- go test ./internal/rlm/... -v
+ go test ./rlm/... -v
 
  # Run specific test
- go test ./internal/rlm -run TestParser -v
+ go test ./rlm -run TestParser -v
 
  # With coverage
- go test ./internal/rlm/... -cover
+ go test ./rlm/... -cover
 
  # Benchmark
- go test ./internal/rlm/... -bench=. -benchmem
+ go test ./rlm/... -bench=. -benchmem
  ```
 
  ## Performance
@@ -332,7 +403,7 @@ Check the JavaScript syntax. Use `console.log()` not `print()`, or ensure `print
  ## Contributing
 
  1. Write tests for new features
- 2. Ensure all tests pass: `go test ./internal/rlm/... -v`
+ 2. Ensure all tests pass: `go test ./rlm/... -v`
  3. Format code: `go fmt ./...`
  4. Update documentation
 
package/go/cmd/rlm/main.go CHANGED
@@ -6,7 +6,7 @@ import (
  	"io"
  	"os"
 
- 	"recursive-llm-go/internal/rlm"
+ 	"github.com/jbeck018/recursive-llm-ts/go/rlm"
  )
 
  type requestPayload struct {
package/go/go.mod CHANGED
@@ -1,4 +1,4 @@
- module recursive-llm-go
+ module github.com/jbeck018/recursive-llm-ts/go
 
  go 1.21
 
package/go/rlm/doc.go ADDED
@@ -0,0 +1,82 @@
+ // Package rlm provides a Recursive Language Model (RLM) engine for Go.
+ //
+ // RLM enables language models to use tools and make recursive calls to themselves,
+ // allowing for complex multi-step reasoning and task decomposition. This implements
+ // the technique from the paper "Recursive Language Models" by Alex Zhang and Omar Khattab (MIT, 2025).
+ //
+ // # Installation
+ //
+ // To use this package in your Go project:
+ //
+ //	go get github.com/jbeck018/recursive-llm-ts/go
+ //
+ // # Basic Usage
+ //
+ // Create an RLM engine and execute a completion:
+ //
+ //	import "github.com/jbeck018/recursive-llm-ts/go/rlm"
+ //
+ //	config := rlm.Config{
+ //		MaxDepth:      5,
+ //		MaxIterations: 30,
+ //		APIKey:        os.Getenv("OPENAI_API_KEY"),
+ //	}
+ //
+ //	engine := rlm.New("gpt-4", config)
+ //	answer, stats, err := engine.Completion("What is 2+2?", "")
+ //
+ // # Structured Output
+ //
+ // For structured JSON output with validation:
+ //
+ //	schema := &rlm.JSONSchema{
+ //		Type: "object",
+ //		Properties: map[string]*rlm.JSONSchema{
+ //			"name": {Type: "string"},
+ //			"age":  {Type: "number"},
+ //		},
+ //		Required: []string{"name", "age"},
+ //	}
+ //
+ //	config := &rlm.StructuredConfig{
+ //		Schema:     schema,
+ //		MaxRetries: 3,
+ //	}
+ //
+ //	result, stats, err := engine.StructuredCompletion(
+ //		"Extract person info",
+ //		"John is 30 years old",
+ //		config,
+ //	)
+ //
+ // # Recursive Calls
+ //
+ // The LLM can make recursive calls to itself using the recursive_llm() function
+ // available in the JavaScript REPL environment during execution. This enables
+ // complex multi-step reasoning and task decomposition.
+ //
+ // # Supported Providers
+ //
+ // RLM works with any OpenAI-compatible API:
+ //   - OpenAI (default)
+ //   - Azure OpenAI
+ //   - Ollama
+ //   - LiteLLM
+ //   - Any OpenAI-compatible endpoint
+ //
+ // Configure the provider using Config.APIBase:
+ //
+ //	config := rlm.Config{
+ //		APIBase: "https://your-azure-endpoint.openai.azure.com/v1",
+ //		APIKey:  os.Getenv("AZURE_API_KEY"),
+ //	}
+ //
+ // # Error Handling
+ //
+ // All methods return typed errors that can be checked with errors.As():
+ //
+ //	var maxDepthErr *rlm.MaxDepthError
+ //	if errors.As(err, &maxDepthErr) {
+ //		fmt.Printf("Hit max depth: %d\n", maxDepthErr.MaxDepth)
+ //	}
+ package rlm
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "recursive-llm-ts",
-   "version": "4.3.2",
+   "version": "4.3.3",
    "description": "TypeScript bridge for recursive-llm: Recursive Language Models for unbounded context processing with structured outputs",
    "main": "dist/index.js",
    "types": "dist/index.d.ts",
File without changes