ai-speedometer 1.4.3 → 2.0.1

package/README.md CHANGED
@@ -6,34 +6,50 @@ A CLI tool for benchmarking AI models across multiple providers with parallel ex
  
  ## Install
  
+ Requires [Bun](https://bun.sh) runtime.
+ 
+ ```bash
+ bun install -g ai-speedometer
+ ```
+ 
+ Or with npm (Bun still required at runtime):
+ 
  ```bash
  npm install -g ai-speedometer
  ```
  
+ Or run directly from source:
+ 
+ ```bash
+ bun src/index.ts
+ ```
+ 
  ## What It Measures
  
  - **TTFT** (Time to First Token) - How fast the first response token arrives
- - **Total Time** - Complete request duration
+ - **Total Time** - Complete request duration
  - **Tokens/Second** - Real-time throughput
  - **Token Counts** - Input, output, and total tokens used
  
- ## New Features
+ ## Features
  
- - **REST API Default** - REST API benchmarking is now the default method for better compatibility
+ - **Interactive TUI** - Full terminal UI with Tokyo Night theme, menus, search, and live benchmark progress
+ - **REST API Benchmarking** - Default method, works with all OpenAI-compatible providers
  - **Headless Mode** - Run benchmarks without interactive CLI using command-line arguments
- - **Streaming Support** - Full streaming support now available in REST API benchmarks
+ - **Parallel Execution** - Benchmark multiple models simultaneously
+ - **Provider Management** - Add verified, custom verified, and custom providers
  
  ## Quick Setup
  
  1. **Set Model**
  ```bash
  ai-speedometer
- # Select "Set Model" → "Add Verified Provider" → Choose provider (OpenAI, Anthropic, etc.)
+ # Select "Run Benchmark" → "Add Verified Provider" → Choose provider (OpenAI, Anthropic, etc.)
  # Enter your API key when prompted
  ```
  
  2. **Choose Model Provider**
- - Verified providers (OpenAI, Anthropic, Google) - auto-configured
+ - Verified providers (OpenAI, Anthropic, Google) - auto-configured via models.dev
  - Custom verified providers (pre-configured trusted providers) - add API key
  - Custom providers (Ollama, local models) - add your base URL
  
@@ -42,47 +58,64 @@ npm install -g ai-speedometer
  - Enter when prompted - stored securely in:
  - `~/.local/share/opencode/auth.json` (primary storage)
  - `~/.config/ai-speedometer/ai-benchmark-config.json` (backup storage)
- - Both files store verified and custom verified provider keys
  
  4. **Run Benchmark**
  ```bash
  ai-speedometer
- # Select "Run Benchmark (REST API)" → Choose models → Press ENTER
- # Note: REST API is now the default benchmark method
+ # Select "Run Benchmark" → choose models → press Enter
  ```
  
  ## Usage
  
  ```bash
- # Start CLI
+ # Start interactive TUI
  ai-speedometer
  
- # Or use short alias
+ # Short alias
  aispeed
  
  # Debug mode
  ai-speedometer --debug
  
- # Headless mode - run benchmark directly
+ # Headless benchmark
  ai-speedometer --bench openai:gpt-4
  # With custom API key
  ai-speedometer --bench openai:gpt-4 --api-key "sk-your-key"
- # Use AI SDK instead of REST API
- ai-speedometer --bench openai:gpt-4 --ai-sdk
+ # Custom provider
+ ai-speedometer --bench-custom myprovider:mymodel --base-url https://... --api-key "..."
+ ```
+ 
+ ## Development
+ 
+ ```bash
+ # Run from source
+ bun src/index.ts
+ 
+ # Run with auto-reload
+ bun --watch src/index.ts
+ 
+ # Run tests
+ bun test
+ 
+ # Typecheck
+ bun run typecheck
+ 
+ # Build standalone binary
+ bun run build # → dist/ai-speedometer
  ```
  
  ## Configuration Files
  
  API keys and configuration are stored in:
  
- - **Verified + Custom Verified Providers**:
+ - **Verified + Custom Verified Providers**:
  - Primary: `~/.local/share/opencode/auth.json`
  - Backup: `~/.config/ai-speedometer/ai-benchmark-config.json` (verifiedProviders section)
  - **Custom Providers**: `~/.config/ai-speedometer/ai-benchmark-config.json` (customProviders section)
- - **Provider Definitions**: `./custom-verified-providers.json`
+ - **Provider Definitions**: `./custom-verified-providers.json` (bundled at build time)
  
  ## Requirements
  
- - Node.js 18+
+ - **Runtime**: Bun 1.0+ (required — install from [bun.sh](https://bun.sh))
  - API keys for AI providers
  - Terminal with arrow keys and ANSI colors
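Both `--bench` and `--bench-custom` in the README above take a `provider:model` spec (e.g. `openai:gpt-4`). As a minimal sketch of how such a spec can be split in plain POSIX shell — my own illustration, not the package's code — splitting on the first colon keeps any later colons inside the model name:

```shell
# Illustrative only: split a `provider:model` spec on the FIRST colon.
spec="openai:gpt-4"
provider="${spec%%:*}"   # remove from the first `:` to the end  -> "openai"
model="${spec#*:}"       # remove up to and including the first `:` -> "gpt-4"
echo "$provider $model"
```

First-colon splitting matters for specs like an Ollama tag (`ollama:llama3:8b`), where the model portion itself contains a colon and should survive intact; whether ai-speedometer parses specs exactly this way is an assumption.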