yaicli 0.0.19__py3-none-any.whl → 0.1.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- pyproject.toml +10 -4
- yaicli/__init__.py +0 -0
- yaicli/api.py +324 -0
- yaicli/cli.py +332 -0
- yaicli/config.py +183 -0
- yaicli/const.py +119 -0
- yaicli/entry.py +95 -0
- yaicli/history.py +72 -0
- yaicli/printer.py +244 -0
- yaicli/utils.py +112 -0
- {yaicli-0.0.19.dist-info → yaicli-0.1.0.dist-info}/METADATA +280 -233
- yaicli-0.1.0.dist-info/RECORD +15 -0
- yaicli-0.1.0.dist-info/entry_points.txt +3 -0
- yaicli-0.0.19.dist-info/RECORD +0 -7
- yaicli-0.0.19.dist-info/entry_points.txt +0 -2
- yaicli.py +0 -667
- {yaicli-0.0.19.dist-info → yaicli-0.1.0.dist-info}/WHEEL +0 -0
- {yaicli-0.0.19.dist-info → yaicli-0.1.0.dist-info}/licenses/LICENSE +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: yaicli
-Version: 0.0.19
+Version: 0.1.0
 Summary: A simple CLI tool to interact with LLM
 Project-URL: Homepage, https://github.com/belingud/yaicli
 Project-URL: Repository, https://github.com/belingud/yaicli
@@ -222,282 +222,310 @@ Requires-Dist: socksio>=1.0.0
 Requires-Dist: typer>=0.15.2
 Description-Content-Type: text/markdown

-# YAICLI - Your AI
+# YAICLI - Your AI Command Line Interface

 [](https://pypi.org/project/yaicli/)



-YAICLI is a
+YAICLI is a powerful yet lightweight command-line AI assistant that brings the capabilities of Large Language Models (LLMs) like GPT-4o directly to your terminal. Interact with AI through multiple modes: have natural conversations, generate and execute shell commands, or get quick answers without leaving your workflow.

+**Supports both standard and deep reasoning models across all major LLM providers.**

->
+<p align="center">
+<img src="https://vhs.charm.sh/vhs-5U1BBjJkTUBReRswsSgIVx.gif" alt="YAICLI Demo" width="85%">
+</p>

+> [!NOTE]
+> YAICLI is actively developed. While core functionality is stable, some features may evolve in future releases.

-- 💬 Chat Mode: Persistent dialogue with context tracking
-- 🚀 Execute Mode: Generate & verify OS-specific commands (Windows/macOS/Linux)
-- ⚡ Quick Query: Single-shot responses without entering REPL
+## ✨ Key Features

+### 🔄 Multiple Interaction Modes
+- **💬 Chat Mode**: Engage in persistent conversations with full context tracking
+- **🚀 Execute Mode**: Generate and safely run OS-specific shell commands
+- **⚡ Quick Query**: Get instant answers without entering interactive mode

+### 🧠 Smart Environment Awareness
+- **Auto-detection**: Identifies your shell (bash/zsh/PowerShell/CMD) and OS
+- **Safe Command Execution**: 3-step verification before running any command
+- **Flexible Input**: Pipe content directly (`cat log.txt | ai "analyze this"`)

+### 🔌 Universal LLM Compatibility
+- **OpenAI-Compatible**: Works with any OpenAI-compatible API endpoint
+- **Multi-Provider Support**: Easy configuration for Claude, Gemini, Cohere, etc.
+- **Custom Response Parsing**: Extract exactly what you need with jmespath

+### 💻 Enhanced Terminal Experience
+- **Real-time Streaming**: See responses as they're generated with cursor animation
+- **Rich History Management**: LRU-based history with 500 entries by default
+- **Syntax Highlighting**: Beautiful code formatting with customizable themes

+### 🛠️ Developer-Friendly
+- **Layered Configuration**: Environment variables > Config file > Sensible defaults
+- **Debugging Tools**: Verbose mode with detailed API tracing
+- **Lightweight**: Minimal dependencies with focused functionality
+
+## 📦 Installation

 ### Prerequisites

 - Python 3.9 or higher

-### Install
+### Quick Install

 ```bash
-#
+# Using pip (recommended for most users)
 pip install yaicli

-#
+# Using pipx (isolated environment)
 pipx install yaicli

-#
+# Using uv (faster installation)
 uv tool install yaicli
 ```

 ### Install from Source

 ```bash
-git clone https://github.com/
+git clone https://github.com/belingud/yaicli.git
 cd yaicli
 pip install .
 ```

-## Configuration
+## ⚙️ Configuration
+
+YAICLI uses a simple configuration file to store your preferences and API keys.

+### First-time Setup

+1. Run `ai` once to generate the default configuration file
+2. Edit `~/.config/yaicli/config.ini` to add your API key
+3. Customize other settings as needed

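In practice, the setup steps above amount to a short shell session. This is only a sketch; the paths and flags follow the defaults documented in the sections below:

```bash
ai                                             # first run creates ~/.config/yaicli/config.ini
ai --template                                  # print the default config template for reference
"${EDITOR:-nano}" ~/.config/yaicli/config.ini  # add your API key and tweak settings
```
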
-### Configuration File
+### Configuration File Structure

-The default configuration file is located at `~/.config/yaicli/config.ini`.
+The default configuration file is located at `~/.config/yaicli/config.ini`. You can use `ai --template` to see the default settings, as shown below:

 ```ini
 [core]
-PROVIDER=
+PROVIDER=openai
 BASE_URL=https://api.openai.com/v1
-API_KEY=
+API_KEY=
 MODEL=gpt-4o

-# auto detect shell and os
+# auto detect shell and os (or specify manually, e.g., bash, zsh, powershell.exe)
 SHELL_NAME=auto
 OS_NAME=auto

-#
-COMPLETION_PATH
-# if you want to use custom answer path, you can set it here
+# API paths (usually no need to change for OpenAI compatible APIs)
+COMPLETION_PATH=chat/completions
 ANSWER_PATH=choices[0].message.content

-# true: streaming response
-# false: non-streaming response
+# true: streaming response, false: non-streaming
 STREAM=true
-CODE_THEME=monokia

+# LLM parameters
 TEMPERATURE=0.7
 TOP_P=1.0
 MAX_TOKENS=1024
+TIMEOUT=60
+
+# Interactive mode parameters
+INTERACTIVE_ROUND=25
+
+# UI/UX
+CODE_THEME=monokai
+MAX_HISTORY=500 # Max entries kept in history file
+AUTO_SUGGEST=true
 ```

-### Configuration Options
+### Configuration Options Reference
+
+| Option | Description | Default | Env Variable |
+|--------|-------------|---------|---------------|
+| `BASE_URL` | API endpoint URL | `https://api.openai.com/v1` | `YAI_BASE_URL` |
+| `API_KEY` | Your API key | - | `YAI_API_KEY` |
+| `MODEL` | LLM model to use | `gpt-4o` | `YAI_MODEL` |
+| `SHELL_NAME` | Shell type | `auto` | `YAI_SHELL_NAME` |
+| `OS_NAME` | Operating system | `auto` | `YAI_OS_NAME` |
+| `COMPLETION_PATH` | API completion path | `chat/completions` | `YAI_COMPLETION_PATH` |
+| `ANSWER_PATH` | JSON path for response | `choices[0].message.content` | `YAI_ANSWER_PATH` |
+| `STREAM` | Enable streaming | `true` | `YAI_STREAM` |
+| `TIMEOUT` | API timeout (seconds) | `60` | `YAI_TIMEOUT` |
+| `INTERACTIVE_ROUND` | Interactive mode rounds | `25` | `YAI_INTERACTIVE_ROUND` |
+| `CODE_THEME` | Syntax highlighting theme | `monokai` | `YAI_CODE_THEME` |
+| `TEMPERATURE` | Response randomness | `0.7` | `YAI_TEMPERATURE` |
+| `TOP_P` | Top-p sampling | `1.0` | `YAI_TOP_P` |
+| `MAX_TOKENS` | Max response tokens | `1024` | `YAI_MAX_TOKENS` |
+| `MAX_HISTORY` | Max history entries | `500` | `YAI_MAX_HISTORY` |
+| `AUTO_SUGGEST` | Enable history suggestions | `true` | `YAI_AUTO_SUGGEST` |
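The `YAI_*` variables in the last column override the config file (precedence, as documented above: environment variables > config file > defaults). As a rough illustration, a one-off override could look like this (the values are only examples):

```bash
# Hypothetical one-off run: environment variables override ~/.config/yaicli/config.ini
YAI_MODEL=gpt-4o YAI_TEMPERATURE=0.2 ai "Explain the difference between TCP and UDP"
```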
+
+### LLM Provider Configuration
+
+YAICLI works with all major LLM providers. The default configuration is set up for OpenAI, but you can easily switch to other providers.
+
+#### Pre-configured Provider Settings
+
+| Provider | BASE_URL | COMPLETION_PATH | ANSWER_PATH |
+|----------|----------|-----------------|-------------|
+| **OpenAI** (default) | `https://api.openai.com/v1` | `chat/completions` | `choices[0].message.content` |
+| **Claude** (native API) | `https://api.anthropic.com/v1` | `messages` | `content[0].text` |
+| **Claude** (OpenAI-compatible) | `https://api.anthropic.com/v1/openai` | `chat/completions` | `choices[0].message.content` |
+| **Cohere** | `https://api.cohere.com/v2` | `chat` | `message.content[0].text` |
+| **Google Gemini** | `https://generativelanguage.googleapis.com/v1beta/openai` | `chat/completions` | `choices[0].message.content` |
+
+> **Note**: Many providers offer OpenAI-compatible endpoints that work with the default settings.
+> - Google Gemini: https://ai.google.dev/gemini-api/docs/openai
+> - Claude: https://docs.anthropic.com/en/api/openai-sdk
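As a sketch of how the table above maps onto the settings, pointing YAICLI at Claude's native API could look roughly like this, shown with the `YAI_*` environment variables from the reference table (editing `config.ini` directly is equivalent; the API key and model name are placeholders, the model taken from the sample response shown further below):

```bash
# Illustrative Claude (native API) setup via environment variables
export YAI_BASE_URL="https://api.anthropic.com/v1"
export YAI_COMPLETION_PATH="messages"
export YAI_ANSWER_PATH="content[0].text"
export YAI_API_KEY="sk-ant-..."                 # placeholder Anthropic key
export YAI_MODEL="claude-3-7-sonnet-20250219"   # example model name
ai "What is the capital of France?"
```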
+
+#### Custom Provider Configuration Guide
+
+To configure a custom provider:
+
 1. **Find the API Endpoint**:
+   - Check the provider's API documentation for their chat completion endpoint
+
 2. **Identify the Response Structure**:
-   - Look
-"
-"model": "claude-3-7-sonnet-20250219",
-"role": "assistant",
-"stop_reason": "end_turn",
-"stop_sequence": null,
-"type": "message",
-"usage": {
-"input_tokens": 2095,
-"output_tokens": 503
-}
+   - Look at the JSON response format to find where the text content is located
+
+3. **Set the Path Expression**:
+   - Use jmespath syntax to specify the path to the text content
+
+**Example**: For Claude's native API, the response looks like:
+```json
+{
+"content": [
+{
+"text": "Hi! My name is Claude.",
+"type": "text"
 }
+],
+"id": "msg_013Zva2CMHLNnXjNJJKqJ2EF",
+"model": "claude-3-7-sonnet-20250219",
+"role": "assistant"
+}
+```
+
+The path to extract the text is: `content[0].text`
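To sanity-check a path expression before setting it as `ANSWER_PATH`, you can run it against a saved response with the `jmespath` Python package (listed below as a YAICLI dependency). This is only an illustrative one-liner and assumes the JSON above is saved as `response.json`:

```bash
python -c 'import json, jmespath; print(jmespath.search("content[0].text", json.load(open("response.json"))))'
# prints: Hi! My name is Claude.
```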
+
+### Syntax Highlighting Themes

+YAICLI supports all Pygments syntax highlighting themes. You can set your preferred theme in the config file:
+
+```ini
+CODE_THEME=monokai
+```

+Browse available themes at: https://pygments.org/styles/


-## Usage
+## 🚀 Usage

-###
+### Quick Start

 ```bash
-#
+# Get a quick answer
 ai "What is the capital of France?"

-#
+# Start an interactive chat session
 ai --chat

-#
+# Generate and execute shell commands
 ai --shell "Create a backup of my Documents folder"

-#
+# Analyze code from a file
+cat app.py | ai "Explain what this code does"
+
+# Debug with verbose mode
 ai --verbose "Explain quantum computing"
 ```

-### Command Line
-Arguments:
-- `<PROMPT>`: Argument
-Options:
-- `--install-completion`: Install completion for the current shell
-- `--show-completion`: Show completion for the current shell, to copy it or customize the installation
-- `--help` or `-h`: Show this message and exit
-- `--template`: Show the config template.
-Run Options:
-- `--verbose` or `-V`: Show verbose information
-- `--chat` or `-c`: Start in chat mode
-- `--shell` or `-s`: Generate and execute shell command
-```bash
-ai -h
+### Command Line Reference

+```
 Usage: ai [OPTIONS] [PROMPT]
-yaicli - Your AI interface in cli.
-╭─ Arguments ─╮
-│ prompt [PROMPT] The prompt send to the LLM │
-╰─╯
-╭─ Options ─╮
-│ --template Show the config template. │
-│ --install-completion Install completion for the current shell. │
-│ --show-completion Show completion for the current shell, to copy it or customize the installation. │
-│ --help -h Show this message and exit. │
-╰─╯
-╭─ Run Option ─╮
-│ --chat -c Start in chat mode │
-│ --shell -s Generate and execute shell command │
-│ --verbose -V Show verbose information │
-╰─╯
 ```

+| Option | Short | Description |
+|--------|-------|-------------|
+| `--chat` | `-c` | Start in interactive chat mode |
+| `--shell` | `-s` | Generate and execute shell commands |
+| `--help` | `-h` | Show help message and exit |
+| `--verbose` | `-V` | Show detailed debug information |
+| `--template` | | Display the config template |
+
+### Interactive Mode Features
+
+<table>
+<tr>
+<td width="50%">
+
+**Commands**
+- `/exit` - Exit the application
+- `/clear` - Clear conversation history
+- `/his` - Show command history
+- `/mode chat|exec` - Switch modes
+
+**Keyboard Shortcuts**
+- `Tab` - Toggle between Chat/Execute modes
+- `Ctrl+C` or `Ctrl+D` - Exit
+- `Ctrl+R` - Search history
+- `↑/↓` - Navigate through history
+
+</td>
+<td width="50%">
+
+**Chat Mode** (💬)
+- Natural conversations with context
+- Markdown and code formatting
+- Reasoning display for complex queries
+
+**Execute Mode** (🚀)
+- Generate shell commands from descriptions
+- Review commands before execution
+- Edit commands before running
+- Safe execution with confirmation
+
+</td>
+</tr>
+</table>
+
+### Input Methods
+
+**Direct Input**
+```bash
+ai "What is the capital of France?"
+```

-You can also pipe input to YAICLI:
+**Piped Input**
 ```bash
 echo "What is the capital of France?" | ai
 ```

+**File Analysis**
+```bash
+cat demo.py | ai "Explain this code"
+```
+
+**Combined Input**
 ```bash
-cat
+cat error.log | ai "Why am I getting these errors in my Python app?"
 ```

-### History
+### History Management
+
+YAICLI maintains a history of your interactions (default: 500 entries) stored in `~/.yaicli_history`. You can:

+- Configure history size with `MAX_HISTORY` in config
+- Search history with `Ctrl+R` in interactive mode
+- View recent commands with `/his` command

+## 📱 Examples
+
+### Quick Answer Mode

 ```bash
 $ ai "What is the capital of France?"
@@ -505,7 +533,7 @@ Assistant:
 The capital of France is Paris.
 ```

-### Command
+### Command Generation & Execution

 ```bash
 $ ai -s 'Check the current directory size'
@@ -526,46 +554,48 @@ Output:
 ```bash
 $ ai --chat

-██ ██ █████ ██ ██████ ██ ██
+██ ██ █████ ██ ██████ ██ ██
+██ ██ ██ ██ ██ ██ ██ ██
+████ ███████ ██ ██ ██ ██
+██ ██ ██ ██ ██ ██ ██
+██ ██ ██ ██ ██████ ███████ ██
+
+Welcome to YAICLI!
+Press TAB to switch mode
+/clear : Clear chat history
+/his : Show chat history
+/exit|Ctrl+D|Ctrl+C: Exit
+/mode chat|exec : Switch mode (Case insensitive)
+──────────────────────────────────────────────────
+💬 > Tell me about the solar system

-Type /his to see chat history
-Press Ctrl+C or type /exit to exit
+Assistant:
+Solar System Overview

+• Central Star: The Sun (99% of system mass, nuclear fusion).
+• Planets: 8 total.
+  • Terrestrial (rocky): Mercury, Venus, Earth, Mars.
+  • Gas Giants: Jupiter, Saturn.
+  • Ice Giants: Uranus, Neptune.
+• Moons: Over 200 (e.g., Earth: 1, Jupiter: 95).
+• Smaller Bodies:
+  • Asteroids (between Mars and Jupiter), comets (icy, distant), *dwarf planets* (Pluto, Ceres).
+• Oort Cloud: spherical shell of icy objects ~1–100,000 AU from the Sun.
+• Heliosphere: Solar wind boundary protecting Earth from cosmic radiation.

-Certainly! Here’s a brief overview of the solar system:
-
-• Sun: The central star of the solar system, providing light and energy.
-• Planets:
-• Mercury: Closest to the Sun, smallest planet.
-• Venus: Second planet, known for its thick atmosphere and high surface temperature.
-• Earth: Third planet, the only known planet to support life.
-• Mars: Fourth planet, often called the "Red Planet" due to its reddish appearance.
-• Jupiter: Largest planet, a gas giant with many moons.
-• Saturn: Known for its prominent ring system, also a gas giant.
-• Uranus: An ice giant, known for its unique axial tilt.
-• Neptune: Another ice giant, known for its deep blue color.
-• Dwarf Planets:
-• Pluto: Once considered the ninth planet, now classified as
+Key Fact: Earth is the only confirmed habitable planet.

 🚀 > Check the current directory size
 Assistant:
 du -sh .
-╭─ Command ─╮
-│ du -sh .
+╭─ Suggest Command ─╮
+│ du -sh .          │
+╰───────────────────╯
 Execute command? [e]dit, [y]es, [n]o (n): e
-Edit command
+Edit command: du -sh ./
+--- Executing ---
+55M ./
+--- Finished ---
 🚀 >
 ```

@@ -575,9 +605,9 @@ Output:
 $ ai --shell "Find all PDF files in my Downloads folder"
 Assistant:
 find ~/Downloads -type f -name "*.pdf"
-╭─ Command
-│ find ~/Downloads -type f -
+╭─ Suggest Command ───────────────────────╮
+│ find ~/Downloads -type f -iname "*.pdf" │
+╰─────────────────────────────────────────╯
 Execute command? [e]dit, [y]es, [n]o (n): y
 Output:

@@ -586,24 +616,41 @@ Output:
 ...
 ```

-## Technical
+## 💻 Technical Details
+
+### Architecture
+
+YAICLI is designed with a modular architecture that separates concerns and makes the codebase maintainable:
+
+- **CLI Module**: Handles user interaction and command parsing
+- **API Client**: Manages communication with LLM providers
+- **Config Manager**: Handles layered configuration
+- **History Manager**: Maintains conversation history with LRU functionality
+- **Printer**: Formats and displays responses with rich formatting
+
+### Dependencies

+| Library | Purpose |
+|---------|----------|
+| [Typer](https://typer.tiangolo.com/) | Command-line interface with type hints |
+| [Rich](https://rich.readthedocs.io/) | Terminal formatting and beautiful display |
+| [prompt_toolkit](https://python-prompt-toolkit.readthedocs.io/) | Interactive input with history and auto-completion |
+| [httpx](https://www.python-httpx.org/) | Modern HTTP client with async support |
+| [jmespath](https://jmespath.org/) | JSON data extraction |

-- **Rich**: Provides terminal content formatting and beautiful display
-- **prompt_toolkit**: Provides interactive command-line input experience
-- **httpx**: Handles API requests
-- **jmespath**: Parses JSON responses
+## 👨‍💻 Contributing

+Contributions are welcome! Here's how you can help:

+- **Bug Reports**: Open an issue describing the bug and how to reproduce it
+- **Feature Requests**: Suggest new features or improvements
+- **Code Contributions**: Submit a PR with your changes
+- **Documentation**: Help improve or translate the documentation

-## License
+## 📃 License

 [Apache License 2.0](LICENSE)

 ---

+<p align="center"><i>YAICLI - Your AI Command Line Interface</i></p>