@cogitator-ai/cli 0.2.0 → 0.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +553 -21
  2. package/package.json +3 -3
package/README.md CHANGED
@@ -1,69 +1,601 @@
  # @cogitator-ai/cli

- Command-line interface for the Cogitator AI agent runtime.
+ Command-line interface for the Cogitator AI agent runtime. Scaffold projects, manage Docker services, and run agents from the terminal.

  ## Installation

  ```bash
+ # Global installation (recommended)
  pnpm add -g @cogitator-ai/cli
+
+ # Or use with npx
+ npx @cogitator-ai/cli <command>
  ```

- ## Commands
+ ## Features
+
+ - **Project Scaffolding** - Create new Cogitator projects with sensible defaults
+ - **Docker Services** - Start/stop Redis, PostgreSQL, and Ollama with one command
+ - **Agent Runner** - Run agents from the command line with streaming output
+ - **Interactive Mode** - Chat with agents in a REPL environment
+ - **Model Management** - List and pull Ollama models
+ - **Service Status** - Monitor running Docker services
+ - **Log Viewer** - View logs from all services
+
+ ---

- ### Initialize a new project
+ ## Quick Start

  ```bash
+ # Create a new project
  cogitator init my-project
  cd my-project
+
+ # Start Docker services (Redis, Postgres, Ollama)
+ cogitator up
+
+ # Run the example agent
+ pnpm dev
+
+ # Or run a quick chat
+ cogitator run "What is the capital of France?"
+ ```
+
+ ---
+
+ ## Commands
+
+ ### cogitator init
+
+ Create a new Cogitator project with all necessary files.
+
+ ```bash
+ cogitator init <name> [options]
+ ```
+
+ | Option | Description |
+ | -------------- | -------------------------------------- |
+ | `--no-install` | Skip automatic dependency installation |
+
+ **Generated Project Structure:**
+
+ ```
+ my-project/
+ ├── package.json        # Dependencies and scripts
+ ├── tsconfig.json       # TypeScript configuration
+ ├── cogitator.yml       # Cogitator configuration
+ ├── docker-compose.yml  # Docker services
+ ├── .gitignore          # Git ignore rules
+ └── src/
+     └── agent.ts        # Example agent with tools
+ ```
+
+ **Example:**
+
+ ```bash
+ # Create project and install dependencies
+ cogitator init my-ai-app
+
+ # Create project without installing
+ cogitator init my-ai-app --no-install
  ```

- ### Start development services
+ ---
+
+ ### cogitator up
+
+ Start Docker services for local development.

  ```bash
+ cogitator up [options]
+ ```
+
+ | Option | Default | Description |
+ | -------------- | ------- | ---------------------------------- |
+ | `-d, --detach` | `true` | Run services in background |
+ | `--no-detach` | - | Run services in foreground |
+ | `--pull` | `false` | Pull latest images before starting |
+
+ **Services Started:**
+
+ | Service | Port | Description |
+ | ---------- | ----- | --------------------------------- |
+ | Redis | 6379 | In-memory cache and queue backend |
+ | PostgreSQL | 5432 | Vector database with pgvector |
+ | Ollama | 11434 | Local LLM inference server |
+
+ **Connection Strings:**
+
+ ```
+ Redis:     redis://localhost:6379
+ Postgres:  postgresql://cogitator:cogitator@localhost:5432/cogitator
+ Ollama:    http://localhost:11434
+ ```
+
+ **Examples:**
+
+ ```bash
+ # Start in background (default)
  cogitator up
+
+ # Pull latest images and start
+ cogitator up --pull
+
+ # Run in foreground (see all logs)
+ cogitator up --no-detach
  ```

- Starts Redis, PostgreSQL, and Ollama via Docker Compose.
+ ---
+
+ ### cogitator down

- ### Stop services
+ Stop Docker services.

  ```bash
- cogitator down
+ cogitator down [options]
  ```

- ### Run an agent
+ | Option | Description |
+ | --------------- | --------------------------------- |
+ | `-v, --volumes` | Remove volumes (deletes all data) |
+
+ **Examples:**

  ```bash
- cogitator run "Hello, what can you help me with?"
+ # Stop services (keep data)
+ cogitator down
+
+ # Stop services and delete all data
+ cogitator down --volumes
  ```

- With a specific model:
+ ---
+
+ ### cogitator run
+
+ Run an agent with a message or start interactive mode.

  ```bash
- cogitator run -m ollama/gemma3:4b "Explain quantum computing"
+ cogitator run [message] [options]
  ```

- Interactive mode:
+ | Option | Default | Description |
+ | --------------------- | --------------- | --------------------------------------- |
+ | `-c, --config <path>` | `cogitator.yml` | Config file path |
+ | `-m, --model <model>` | auto-detect | Model to use (e.g., `ollama/gemma3:4b`) |
+ | `-i, --interactive` | `false` | Force interactive mode |
+ | `-s, --stream` | `true` | Stream response tokens |
+ | `--no-stream` | - | Disable streaming |
+
+ **Model Auto-Detection:**
+
+ If no model is specified, the CLI will (see the sketch after this list):
+
+ 1. Check `COGITATOR_MODEL` environment variable
+ 2. Query Ollama for available models
+ 3. Select from preferred models: llama3.1:8b, llama3:8b, gemma3:4b, gemma2:9b, mistral:7b
+ 4. Fall back to the first available model
+
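+ A rough sketch of this lookup, for illustration only: it is not the CLI's source, the `detectModel` helper is hypothetical, and it assumes Ollama's standard `GET /api/tags` endpoint.
+
+ ```typescript
+ // Illustrative sketch only; not the CLI's actual implementation.
+ const PREFERRED = ['llama3.1:8b', 'llama3:8b', 'gemma3:4b', 'gemma2:9b', 'mistral:7b'];
+
+ async function detectModel(): Promise<string | undefined> {
+   // 1. Environment variable wins
+   if (process.env.COGITATOR_MODEL) return process.env.COGITATOR_MODEL;
+
+   // 2. Ask the local Ollama server which models are installed
+   const res = await fetch('http://localhost:11434/api/tags');
+   if (!res.ok) return undefined;
+   const { models } = (await res.json()) as { models: { name: string }[] };
+   const installed = models.map((m) => m.name);
+
+   // 3. Prefer a known-good model; 4. otherwise fall back to the first installed one
+   const pick = PREFERRED.find((p) => installed.includes(p)) ?? installed[0];
+   return pick ? `ollama/${pick}` : undefined;
+ }
+ ```
+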
+ **Examples:**

  ```bash
+ # Single message with auto-detected model
+ cogitator run "Explain quantum computing in simple terms"
+
+ # Specify a model
+ cogitator run -m ollama/gemma3:4b "Write a haiku about AI"
+
+ # Use OpenAI
+ cogitator run -m openai/gpt-4o "Analyze this code..."
+
+ # Disable streaming
+ cogitator run --no-stream "Hello"
+
+ # Interactive mode (starts automatically if no message)
  cogitator run
+ cogitator run -i
  ```

- ## Project Structure
+ ---

- After `cogitator init`, your project will have:
+ ### Interactive Mode
+
+ When running without a message or with `-i`, you enter interactive mode:

  ```
- my-project/
- ├── package.json
- ├── cogitator.yml       # Configuration
- ├── agent.ts            # Your agent definition
- └── docker-compose.yml  # Local services
+    ___            _ _        _
+   / __\___   __ _(_) |_ __ _| |_ ___  _ __
+  / /  / _ \ / _` | | __/ _` | __/ _ \| '__|
+ / /__| (_) | (_| | | || (_| | || (_) | |
+ \____/\___/ \__, |_|\__\__,_|\__\___/|_|
+             |___/
+
+ AI Agent Runtime v0.1.0
+
+ Model: llama3.1:8b
+ Commands: /model <name>, /clear, /help, exit
+
+ > Hello!
+ → Hi there! How can I help you today?
+
+ [1] > What's 2 + 2?
+ → 2 + 2 equals 4.
+
+ [2] >
+ ```
+
+ **Interactive Commands:**
+
+ | Command | Description |
+ | ---------------- | ----------------------------------------- |
+ | `/model [name]` | Show current model or switch to a new one |
+ | `/clear` | Clear conversation history (start fresh) |
+ | `/help` | Show available commands |
+ | `exit` or `quit` | Exit interactive mode |
+
+ **Examples:**
+
+ ```
+ > /model
+ Current model: ollama/llama3.1:8b
+
+ > /model gemma3:4b
+ ✓ Switched to model: ollama/gemma3:4b
+
+ > /clear
+ Conversation cleared
+
+ > exit
+ Goodbye!
+ ```
+
+ ---
+
+ ### cogitator status
+
+ Show status of all Cogitator services.
+
+ ```bash
+ cogitator status
+ # or
+ cogitator ps
+ ```
+
+ **Output Example:**

  ```
+ ℹ Cogitator Services Status
+
+ Docker Compose Services:
+
+ ● my-project-redis-1      running   Up 2 minutes
+ ● my-project-postgres-1   running   Up 2 minutes
+ ● my-project-ollama-1     running   Up 2 minutes
+
+ External Services:
+
+ ● Ollama   running   localhost:11434
+ ```
+
+ ---
+
+ ### cogitator logs
+
+ View logs from Docker services.
+
+ ```bash
+ cogitator logs [service] [options]
+ ```
+
+ | Option | Default | Description |
+ | -------------------- | ------- | ---------------------------------- |
+ | `-f, --follow` | `false` | Follow log output (like `tail -f`) |
+ | `-n, --tail <lines>` | `100` | Number of lines to show |
+ | `-t, --timestamps` | `false` | Show timestamps |
+
+ **Available Services:**
+
+ - `redis` - Redis cache/queue logs
+ - `postgres` - PostgreSQL database logs
+ - `ollama` - Ollama LLM server logs
+
+ **Examples:**
+
+ ```bash
+ # View last 100 lines from all services
+ cogitator logs
+
+ # Follow logs in real-time
+ cogitator logs -f

- ## Documentation
+ # View only Ollama logs
+ cogitator logs ollama
+
+ # Follow Ollama logs with timestamps
+ cogitator logs ollama -f -t
+
+ # Show last 50 lines
+ cogitator logs -n 50
+ ```
+
+ ---
+
+ ### cogitator models
+
+ List and manage Ollama models.
+
+ ```bash
+ cogitator models [options]
+ ```
+
+ | Option | Description |
+ | ---------------- | --------------------------------- |
+ | `--pull <model>` | Pull a model from Ollama registry |
+
+ **Output Example:**
+
+ ```
+ ✓ Found 3 model(s)
+
+ llama3.1:8b   4.7 GB   2 days ago
+ gemma3:4b     2.8 GB   1 week ago
+ mistral:7b    4.1 GB   3 weeks ago
+
+ Use with: cogitator run -m ollama/<model> "message"
+ ```
+
+ **Examples:**
+
+ ```bash
+ # List installed models
+ cogitator models
+
+ # Pull a new model
+ cogitator models --pull llama3.1:8b
+ cogitator models --pull gemma3:4b
+ cogitator models --pull mistral:7b
+ ```
+
+ ---
+
+ ## Configuration
+
+ ### cogitator.yml
+
+ The main configuration file for your Cogitator project:
+
+ ```yaml
+ # cogitator.yml
+
+ llm:
+   defaultProvider: ollama
+   providers:
+     ollama:
+       baseUrl: http://localhost:11434
+     openai:
+       apiKey: ${OPENAI_API_KEY}
+
+ memory:
+   adapter: memory
+   # Or use Redis:
+   # adapter: redis
+   # redis:
+   #   url: redis://localhost:6379
+ ```
+
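+ The `${OPENAI_API_KEY}` placeholder is presumably resolved from the environment. As a rough illustration of how this kind of interpolation typically works (not necessarily the CLI's implementation; `expandEnv` is a hypothetical helper):
+
+ ```typescript
+ // Hypothetical sketch of ${VAR} expansion; not taken from the CLI's source.
+ function expandEnv(value: string): string {
+   return value.replace(/\$\{(\w+)\}/g, (_, name) => process.env[name] ?? '');
+ }
+
+ expandEnv('${OPENAI_API_KEY}'); // -> the value of OPENAI_API_KEY, or '' if unset
+ ```
+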
+ ### Environment Variables
+
+ | Variable | Description |
+ | ------------------- | ---------------------------------------------- |
+ | `COGITATOR_CONFIG` | Path to config file (overrides auto-detection) |
+ | `COGITATOR_MODEL` | Default model to use |
+ | `OPENAI_API_KEY` | OpenAI API key |
+ | `ANTHROPIC_API_KEY` | Anthropic API key |
+
+ **Example .env:**
+
+ ```bash
+ COGITATOR_MODEL=ollama/llama3.1:8b
+ OPENAI_API_KEY=sk-...
+ ```
+
+ ---
+
+ ## Project Templates
+
+ ### Basic Agent (Generated by `init`)
+
+ ```typescript
+ // src/agent.ts
+ import { Cogitator, Agent, tool } from '@cogitator-ai/core';
+ import { z } from 'zod';
+
+ const greet = tool({
+   name: 'greet',
+   description: 'Greet someone by name',
+   parameters: z.object({
+     name: z.string().describe('Name to greet'),
+   }),
+   execute: async ({ name }) => `Hello, ${name}! 👋`,
+ });
+
+ const agent = new Agent({
+   id: 'my-agent',
+   name: 'My Agent',
+   model: 'ollama/llama3.1:8b',
+   instructions: 'You are a helpful assistant. Use the greet tool when asked to greet someone.',
+   tools: [greet],
+ });
+
+ const cog = new Cogitator();
+
+ const result = await cog.run(agent, {
+   input: 'Hello! Can you greet Alex?',
+ });
+
+ console.log('Agent:', result.output);
+
+ await cog.close();
+ ```
+
+ ### Agent with Multiple Tools
+
+ ```typescript
+ import { Cogitator, Agent, tool } from '@cogitator-ai/core';
+ import { z } from 'zod';
+
+ const calculator = tool({
+   name: 'calculator',
+   description: 'Perform mathematical calculations',
+   parameters: z.object({
+     expression: z.string().describe('Math expression to evaluate'),
+   }),
+   execute: async ({ expression }) => {
+     const result = Function(`return ${expression}`)();
+     return String(result);
+   },
+ });
+
+ const datetime = tool({
+   name: 'datetime',
+   description: 'Get current date and time',
+   parameters: z.object({}),
+   execute: async () => new Date().toISOString(),
+ });
+
+ const agent = new Agent({
+   name: 'Assistant',
+   model: 'ollama/llama3.1:8b',
+   instructions: 'You are a helpful assistant with calculator and datetime tools.',
+   tools: [calculator, datetime],
+ });
+
+ const cog = new Cogitator();
+ const result = await cog.run(agent, {
+   input: 'What is 15 * 23 + 42? Also, what time is it?',
+ });
+
+ console.log(result.output);
+ await cog.close();
+ ```
+
+ ---
+
+ ## Docker Compose
+
+ The generated `docker-compose.yml`:
+
+ ```yaml
+ name: my-project
+
+ services:
+   redis:
+     image: redis:7-alpine
+     ports:
+       - '6379:6379'
+     volumes:
+       - redis-data:/data
+
+   postgres:
+     image: pgvector/pgvector:pg16
+     ports:
+       - '5432:5432'
+     environment:
+       POSTGRES_USER: cogitator
+       POSTGRES_PASSWORD: cogitator
+       POSTGRES_DB: cogitator
+     volumes:
+       - postgres-data:/var/lib/postgresql/data
+
+   ollama:
+     image: ollama/ollama:latest
+     ports:
+       - '11434:11434'
+     volumes:
+       - ollama-data:/root/.ollama
+
+ volumes:
+   redis-data:
+   postgres-data:
+   ollama-data:
+ ```
+
+ ---
+
+ ## Troubleshooting
+
+ ### Ollama Not Running
+
+ ```
+ ✗ Cannot connect to Ollama
+ Start Ollama with: ollama serve
+ ```
+
+ **Solutions:**
+
+ 1. Start Ollama: `ollama serve`
+ 2. Or use Docker: `cogitator up`
+ 3. Install Ollama: https://ollama.ai
+
+ ### No Models Found
+
+ ```
+ ⚠ No models installed
+ Pull a model with: cogitator models --pull llama3.1:8b
+ ```
+
+ **Solution:**
+
+ ```bash
+ cogitator models --pull llama3.1:8b
+ # or
+ ollama pull llama3.1:8b
+ ```
+
+ ### Docker Not Running
+
+ ```
+ ✗ Docker is not installed or not running
+ Install Docker: https://docs.docker.com/get-docker/
+ ```
+
+ **Solutions:**
+
+ 1. Start Docker Desktop
+ 2. Or: `sudo systemctl start docker`
+
+ ### Config File Not Found
+
+ ```
+ No config file found
+ ```
+
+ The CLI searches for a config file in this order (a rough sketch follows the list):
+
+ 1. `COGITATOR_CONFIG` environment variable
+ 2. `-c` option value
+ 3. `cogitator.yml` in current directory
+ 4. `cogitator.yaml` in current directory
+ 5. `cogitator.json` in current directory
+
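+ The order above can be pictured with a short sketch. This is illustrative only, not the CLI's source; `resolveConfigPath` is a hypothetical helper.
+
+ ```typescript
+ // Illustrative sketch of the lookup order only; not the CLI's actual code.
+ import { existsSync } from 'node:fs';
+ import { resolve } from 'node:path';
+
+ function resolveConfigPath(cliOption?: string): string | undefined {
+   // 1. Explicit environment variable
+   if (process.env.COGITATOR_CONFIG) return process.env.COGITATOR_CONFIG;
+   // 2. Value passed via -c / --config
+   if (cliOption) return cliOption;
+   // 3-5. Well-known filenames in the current directory
+   for (const name of ['cogitator.yml', 'cogitator.yaml', 'cogitator.json']) {
+     const candidate = resolve(process.cwd(), name);
+     if (existsSync(candidate)) return candidate;
+   }
+   return undefined;
+ }
+ ```
+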
+ ---
+
+ ## NPM Scripts
+
+ After `cogitator init`, these scripts are available:
+
+ ```bash
+ # Run agent in watch mode (auto-reload on changes)
+ pnpm dev
+
+ # Run agent once
+ pnpm start
+
+ # Build TypeScript
+ pnpm build
+ ```

- See the [Cogitator documentation](https://github.com/eL1fe/cogitator) for full API reference.
+ ---

  ## License

package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@cogitator-ai/cli",
-   "version": "0.2.0",
+   "version": "0.2.1",
    "description": "CLI for Cogitator AI Agent Runtime",
    "type": "module",
    "main": "./dist/index.js",
@@ -21,8 +21,8 @@
      "commander": "^12.0.0",
      "chalk": "^5.3.0",
      "ora": "^8.0.0",
-     "@cogitator-ai/core": "0.2.0",
-     "@cogitator-ai/config": "0.2.0"
+     "@cogitator-ai/core": "0.3.0",
+     "@cogitator-ai/config": "0.2.1"
    },
    "devDependencies": {
      "@types/node": "^20.10.0",