yaicli 0.0.19__py3-none-any.whl → 0.2.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: yaicli
3
- Version: 0.0.19
3
+ Version: 0.2.0
4
4
  Summary: A simple CLI tool to interact with LLM
5
5
  Project-URL: Homepage, https://github.com/belingud/yaicli
6
6
  Project-URL: Repository, https://github.com/belingud/yaicli
@@ -222,282 +222,419 @@ Requires-Dist: socksio>=1.0.0
222
222
  Requires-Dist: typer>=0.15.2
223
223
  Description-Content-Type: text/markdown
224
224
 
225
- # YAICLI - Your AI Interface in Command Line
225
+ # YAICLI: Your AI assistant in the command line
226
226
 
227
227
  [![PyPI version](https://img.shields.io/pypi/v/yaicli?style=for-the-badge)](https://pypi.org/project/yaicli/)
228
228
  ![GitHub License](https://img.shields.io/github/license/belingud/yaicli?style=for-the-badge)
229
229
  ![PyPI - Downloads](https://img.shields.io/pypi/dm/yaicli?logo=pypi&style=for-the-badge)
230
230
  ![Pepy Total Downloads](https://img.shields.io/pepy/dt/yaicli?style=for-the-badge&logo=python)
231
231
 
232
- YAICLI is a compact yet potent command-line AI assistant, allowing you to engage with Large Language Models (LLMs) such as ChatGPT's gpt-4o directly via your terminal. It offers multiple operation modes for everyday conversations, generating and executing shell commands, and one-shot quick queries.
232
+ YAICLI is a powerful yet lightweight command-line AI assistant that brings the capabilities of Large Language Models (LLMs) like GPT-4o directly to your terminal. Interact with AI through multiple modes: have natural conversations, generate and execute shell commands, or get quick answers without leaving your workflow.
233
233
 
234
- Support regular and deep thinking models.
234
+ **Supports both standard and deep reasoning models across all major LLM providers.**
235
235
 
236
- > [!WARNING]
237
- > This is a work in progress, some features could change or be removed in the future.
236
+ <p align="center">
237
+ <img src="https://vhs.charm.sh/vhs-5U1BBjJkTUBReRswsSgIVx.gif" alt="YAICLI Demo" width="85%">
238
+ </p>
238
239
 
239
- ## Features
240
+ > [!NOTE]
241
+ > YAICLI is actively developed. While core functionality is stable, some features may evolve in future releases.
240
242
 
241
- - **Smart Interaction Modes**:
242
- - 💬 Chat Mode: Persistent dialogue with context tracking
243
- - 🚀 Execute Mode: Generate & verify OS-specific commands (Windows/macOS/Linux)
244
- - ⚡ Quick Query: Single-shot responses without entering REPL
243
+ ## Key Features
245
244
 
246
- - **Environment Intelligence**:
247
- - Auto-detects shell type (CMD/PowerShell/bash/zsh)
248
- - Dynamic command validation with 3-step confirmation
249
- - Pipe input support (`cat log.txt | ai "analyze errors"`)
245
+ ### 🔄 Multiple Interaction Modes
250
246
 
251
- - **Enterprise LLM Support**:
252
- - OpenAI API compatible endpoints
253
- - Claude/Gemini/Cohere integration guides
254
- - Custom JSON parsing with jmespath
247
+ - **💬 Chat Mode**: Engage in persistent conversations with full context tracking
248
+ - **🚀 Execute Mode**: Generate and safely run OS-specific shell commands
249
+ - **⚡ Quick Query**: Get instant answers without entering interactive mode
255
250
 
256
- - **Terminal Experience**:
257
- - Real-time streaming with cursor animation
258
- - LRU history management (500 entries default)
251
+ ### 🧠 Smart Environment Awareness
259
252
 
260
- - **DevOps Ready**:
261
- - Layered configuration (Env > File > Defaults)
262
- - Verbose debug mode with API tracing
253
+ - **Auto-detection**: Identifies your shell (bash/zsh/PowerShell/CMD) and OS
254
+ - **Safe Command Execution**: 3-step verification before running any command
255
+ - **Flexible Input**: Pipe content directly (`cat log.txt | ai "analyze this"`)
263
256
 
264
- ## Installation
257
+ ### 🔌 Universal LLM Compatibility
258
+
259
+ - **OpenAI-Compatible**: Works with any OpenAI-compatible API endpoint
260
+ - **Multi-Provider Support**: Easy configuration for Claude, Gemini, Cohere, etc.
261
+ - **Custom Response Parsing**: Extract exactly what you need with jmespath
262
+
263
+ ### 💻 Enhanced Terminal Experience
264
+
265
+ - **Real-time Streaming**: See responses as they're generated with cursor animation
266
+ - **Rich History Management**: LRU-based history with 500 entries by default
267
+ - **Syntax Highlighting**: Beautiful code formatting with customizable themes
268
+
269
+ ### 🛠️ Developer-Friendly
270
+
271
+ - **Layered Configuration**: Environment variables > Config file > Sensible defaults
272
+ - **Debugging Tools**: Verbose mode with detailed API tracing
273
+ - **Lightweight**: Minimal dependencies with focused functionality
274
+
275
+ ## 📦 Installation
265
276
 
266
277
  ### Prerequisites
267
278
 
268
- - Python 3.9 or higher
279
+ - Python 3.9 or higher
269
280
 
270
- ### Install from PyPI
281
+ ### Quick Install
271
282
 
272
283
  ```bash
273
- # Install by pip
284
+ # Using pip (recommended for most users)
274
285
  pip install yaicli
275
286
 
276
- # Install by pipx
287
+ # Using pipx (isolated environment)
277
288
  pipx install yaicli
278
289
 
279
- # Install by uv
290
+ # Using uv (faster installation)
280
291
  uv tool install yaicli
281
292
  ```
282
293
 
283
294
  ### Install from Source
284
295
 
285
296
  ```bash
286
- git clone https://github.com/yourusername/yaicli.git
297
+ git clone https://github.com/belingud/yaicli.git
287
298
  cd yaicli
288
299
  pip install .
289
300
  ```
290
301
 
291
- ## Configuration
302
+ ## ⚙️ Configuration
292
303
 
293
- On first run, YAICLI will create a default configuration file at `~/.config/yaicli/config.ini`. You'll need to edit this file to add your API key and customize other settings.
304
+ YAICLI uses a simple configuration file to store your preferences and API keys.
294
305
 
295
- Just run `ai`, and it will create the config file for you. Then you can edit it to add your api key.
306
+ ### First-time Setup
296
307
 
297
- ### Configuration File
308
+ 1. Run `ai` once to generate the default configuration file
309
+ 2. Edit `~/.config/yaicli/config.ini` to add your API key
310
+ 3. Customize other settings as needed
298
311
 
299
- The default configuration file is located at `~/.config/yaicli/config.ini`. Look at the example below:
312
+ ### Configuration File Structure
313
+
314
+ The default configuration file is located at `~/.config/yaicli/config.ini`. You can run `ai --template` to print the default settings, as shown below:
300
315
 
301
316
  ```ini
302
317
  [core]
303
- PROVIDER=OPENAI
318
+ PROVIDER=openai
304
319
  BASE_URL=https://api.openai.com/v1
305
- API_KEY=your_api_key_here
320
+ API_KEY=
306
321
  MODEL=gpt-4o
307
322
 
308
- # auto detect shell and os
323
+ # auto detect shell and os (or specify manually, e.g., bash, zsh, powershell.exe)
309
324
  SHELL_NAME=auto
310
325
  OS_NAME=auto
311
326
 
312
- # if you want to use custom completions path, you can set it here
313
- COMPLETION_PATH=/chat/completions
314
- # if you want to use custom answer path, you can set it here
327
+ # API paths (usually no need to change for OpenAI compatible APIs)
328
+ COMPLETION_PATH=chat/completions
315
329
  ANSWER_PATH=choices[0].message.content
316
330
 
317
- # true: streaming response
318
- # false: non-streaming response
331
+ # true: streaming response, false: non-streaming
319
332
  STREAM=true
320
- CODE_THEME=monokia
321
333
 
334
+ # LLM parameters
322
335
  TEMPERATURE=0.7
323
336
  TOP_P=1.0
324
337
  MAX_TOKENS=1024
338
+ TIMEOUT=60
339
+
340
+ # Interactive mode parameters
341
+ INTERACTIVE_ROUND=25
342
+
343
+ # UI/UX
344
+ CODE_THEME=monokai
345
+ MAX_HISTORY=500 # Max entries kept in history file
346
+ AUTO_SUGGEST=true
347
+
348
+ # Chat history settings
349
+ CHAT_HISTORY_DIR=<tempdir>/yaicli/history
350
+ MAX_SAVED_CHATS=20
325
351
  ```
326
352
 
327
- ### Configuration Options
328
-
329
- Below are the available configuration options and override environment variables:
330
-
331
- - **BASE_URL**: API endpoint URL (default: OpenAI API), env: YAI_BASE_URL
332
- - **API_KEY**: Your API key for the LLM provider, env: YAI_API_KEY
333
- - **MODEL**: The model to use (e.g., gpt-4o, gpt-3.5-turbo), default: gpt-4o, env: YAI_MODEL
334
- - **SHELL_NAME**: Shell to use (auto for automatic detection), default: auto, env: YAI_SHELL_NAME
335
- - **OS_NAME**: OS to use (auto for automatic detection), default: auto, env: YAI_OS_NAME
336
- - **COMPLETION_PATH**: Path for completions endpoint, default: /chat/completions, env: YAI_COMPLETION_PATH
337
- - **ANSWER_PATH**: Json path expression to extract answer from response, default: choices[0].message.content, env: YAI_ANSWER_PATH
338
- - **STREAM**: Enable/disable streaming responses, default: true, env: YAI_STREAM
339
- - **CODE_THEME**: Theme for code blocks, default: monokia, env: YAI_CODE_THEME
340
- - **TEMPERATURE**: Temperature for response generation (default: 0.7), env: YAI_TEMPERATURE
341
- - **TOP_P**: Top-p sampling for response generation (default: 1.0), env: YAI_TOP_P
342
- - **MAX_TOKENS**: Maximum number of tokens for response generation (default: 1024), env: YAI_MAX_TOKENS
343
- - **MAX_HISTORY**: Max history size, default: 500, env: YAI_MAX_HISTORY
344
- - **AUTO_SUGGEST**: Auto suggest from history, default: true, env: YAI_AUTO_SUGGEST
345
-
346
- Default config of `COMPLETION_PATH` and `ANSWER_PATH` is OpenAI compatible. If you are using OpenAI or other OpenAI compatible LLM provider, you can use the default config.
347
-
348
- If you wish to use other providers that are not compatible with the openai interface, you can use the following config:
349
-
350
- - claude:
351
- - BASE_URL: https://api.anthropic.com/v1
352
- - COMPLETION_PATH: /messages
353
- - ANSWER_PATH: content.0.text
354
- - cohere:
355
- - BASE_URL: https://api.cohere.com/v2
356
- - COMPLETION_PATH: /chat
357
- - ANSWER_PATH: message.content.[0].text
358
- - google:
359
- - BASE_URL: https://generativelanguage.googleapis.com/v1beta/openai
360
- - COMPLETION_PATH: /chat/completions
361
- - ANSWER_PATH: choices[0].message.content
362
-
363
- You can use google OpenAI complete endpoint and leave `COMPLETION_PATH` and `ANSWER_PATH` as default. BASE_URL: https://generativelanguage.googleapis.com/v1beta/openai. See https://ai.google.dev/gemini-api/docs/openai
364
-
365
- Claude also has a testable OpenAI-compatible interface, you can just use Calude endpoint and leave `COMPLETION_PATH` and `ANSWER_PATH` as default. See: https://docs.anthropic.com/en/api/openai-sdk
366
-
367
- If you not sure how to config `COMPLETION_PATH` and `ANSWER_PATH`, here is a guide:
353
+ ### Configuration Options Reference
354
+
355
+ | Option | Description | Default | Env Variable |
356
+ | ------------------- | ------------------------------------------- | ---------------------------- | ----------------------- |
357
+ | `PROVIDER` | LLM provider (openai, claude, cohere, etc.) | `openai` | `YAI_PROVIDER` |
358
+ | `BASE_URL` | API endpoint URL | `https://api.openai.com/v1` | `YAI_BASE_URL` |
359
+ | `API_KEY` | Your API key | - | `YAI_API_KEY` |
360
+ | `MODEL` | LLM model to use | `gpt-4o` | `YAI_MODEL` |
361
+ | `SHELL_NAME` | Shell type | `auto` | `YAI_SHELL_NAME` |
362
+ | `OS_NAME` | Operating system | `auto` | `YAI_OS_NAME` |
363
+ | `COMPLETION_PATH` | API completion path | `chat/completions` | `YAI_COMPLETION_PATH` |
364
+ | `ANSWER_PATH` | JSON path for response | `choices[0].message.content` | `YAI_ANSWER_PATH` |
365
+ | `STREAM` | Enable streaming | `true` | `YAI_STREAM` |
366
+ | `TIMEOUT` | API timeout (seconds) | `60` | `YAI_TIMEOUT` |
367
+ | `INTERACTIVE_ROUND` | Interactive mode rounds | `25` | `YAI_INTERACTIVE_ROUND` |
368
+ | `CODE_THEME` | Syntax highlighting theme | `monokai` | `YAI_CODE_THEME` |
369
+ | `TEMPERATURE` | Response randomness | `0.7` | `YAI_TEMPERATURE` |
370
+ | `TOP_P` | Top-p sampling | `1.0` | `YAI_TOP_P` |
371
+ | `MAX_TOKENS` | Max response tokens | `1024` | `YAI_MAX_TOKENS` |
372
+ | `MAX_HISTORY` | Max history entries | `500` | `YAI_MAX_HISTORY` |
373
+ | `AUTO_SUGGEST` | Enable history suggestions | `true` | `YAI_AUTO_SUGGEST` |
374
+ | `CHAT_HISTORY_DIR` | Chat history directory | `<tempdir>/yaicli/history` | `YAI_CHAT_HISTORY_DIR` |
375
+ | `MAX_SAVED_CHATS` | Max saved chats | `20` | `YAI_MAX_SAVED_CHATS` |
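+
+ Every option can also be set per-invocation through its `YAI_`-prefixed environment variable; environment variables take precedence over the config file, which in turn takes precedence over the built-in defaults. The snippet below is a minimal, illustrative sketch of that lookup order (not the actual yaicli implementation); the option names and the `[core]` section match the template shown above.
+
+ ```python
+ import os
+ from configparser import ConfigParser
+ from pathlib import Path
+
+ DEFAULTS = {"MODEL": "gpt-4o", "STREAM": "true", "MAX_HISTORY": "500"}
+
+ def get_option(name: str) -> str:
+     """Resolve an option: environment variable > config file > default."""
+     env_value = os.getenv(f"YAI_{name}")  # 1. YAI_-prefixed environment variable
+     if env_value is not None:
+         return env_value
+     parser = ConfigParser()
+     parser.read(Path.home() / ".config" / "yaicli" / "config.ini")
+     if parser.has_option("core", name):  # 2. [core] section of the config file
+         return parser.get("core", name)
+     return DEFAULTS[name]  # 3. built-in default
+
+ print(get_option("MODEL"))  # "gpt-4o" unless YAI_MODEL or the config file overrides it
+ ```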
376
+
377
+ ### LLM Provider Configuration
378
+
379
+ YAICLI works with all major LLM providers. The default configuration is set up for OpenAI, but you can easily switch to other providers.
380
+
381
+ #### Pre-configured Provider Settings
382
+
383
+ | Provider | BASE_URL | COMPLETION_PATH | ANSWER_PATH |
384
+ | ------------------------------ | --------------------------------------------------------- | ------------------ | ---------------------------- |
385
+ | **OpenAI** (default) | `https://api.openai.com/v1` | `chat/completions` | `choices[0].message.content` |
386
+ | **Claude** (native API) | `https://api.anthropic.com/v1` | `messages` | `content[0].text` |
387
+ | **Claude** (OpenAI-compatible) | `https://api.anthropic.com/v1/openai` | `chat/completions` | `choices[0].message.content` |
388
+ | **Cohere** | `https://api.cohere.com/v2` | `chat` | `message.content[0].text` |
389
+ | **Google Gemini** | `https://generativelanguage.googleapis.com/v1beta/openai` | `chat/completions` | `choices[0].message.content` |
390
+
391
+ > **Note**: Many providers offer OpenAI-compatible endpoints that work with the default settings.
392
+ >
393
+ > - Google Gemini: https://ai.google.dev/gemini-api/docs/openai
394
+ > - Claude: https://docs.anthropic.com/en/api/openai-sdk
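+
+ Conceptually, these three settings describe one request/response round trip: the request is POSTed to `BASE_URL` + `COMPLETION_PATH`, and `ANSWER_PATH` is the jmespath expression used to pull the reply text out of the JSON body. The sketch below illustrates that flow with `httpx` and `jmespath` (both yaicli dependencies); it is a simplified illustration using the OpenAI-compatible defaults, not the actual yaicli client code.
+
+ ```python
+ import httpx
+ import jmespath
+
+ # Values as they would appear in config.ini (OpenAI-compatible defaults).
+ BASE_URL = "https://api.openai.com/v1"
+ COMPLETION_PATH = "chat/completions"
+ ANSWER_PATH = "choices[0].message.content"
+ API_KEY = "your_api_key_here"
+
+ response = httpx.post(
+     f"{BASE_URL}/{COMPLETION_PATH}",
+     headers={"Authorization": f"Bearer {API_KEY}"},
+     json={
+         "model": "gpt-4o",
+         "messages": [{"role": "user", "content": "What is the capital of France?"}],
+     },
+     timeout=60,
+ )
+ answer = jmespath.search(ANSWER_PATH, response.json())
+ print(answer)  # e.g. "The capital of France is Paris."
+ ```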
395
+
396
+ #### Custom Provider Configuration Guide
397
+
398
+ To configure a custom provider:
399
+
368
400
  1. **Find the API Endpoint**:
369
- - Visit the documentation of the LLM provider you want to use.
370
- - Find the API endpoint for the completion task. This is usually under the "API Reference" or "Developer Documentation" section.
401
+
402
+ - Check the provider's API documentation for their chat completion endpoint
403
+
371
404
  2. **Identify the Response Structure**:
372
- - Look for the structure of the response. This typically includes fields like `choices`, `completion`, etc.
373
- 3. **Identify the Path Expression**:
374
- Forexample, claude response structure like this:
375
- ```json
376
- {
377
- "content": [
405
+
406
+ - Look at the JSON response format to find where the text content is located
407
+
408
+ 3. **Set the Path Expression**:
409
+ - Use jmespath syntax to specify the path to the text content
410
+
411
+ **Example**: For Claude's native API, the response looks like:
412
+
413
+ ```json
414
+ {
415
+ "content": [
378
416
  {
379
- "text": "Hi! My name is Claude.",
380
- "type": "text"
417
+ "text": "Hi! My name is Claude.",
418
+ "type": "text"
381
419
  }
382
- ],
383
- "id": "msg_013Zva2CMHLNnXjNJJKqJ2EF",
384
- "model": "claude-3-7-sonnet-20250219",
385
- "role": "assistant",
386
- "stop_reason": "end_turn",
387
- "stop_sequence": null,
388
- "type": "message",
389
- "usage": {
390
- "input_tokens": 2095,
391
- "output_tokens": 503
392
- }
393
- }
394
- ```
395
- We are looking for the `text` field, so the path should be 1.Key `content`, 2.First obj `[0]`, 3.Key `text`. So it should be `content.[0].text`.
420
+ ],
421
+ "id": "msg_013Zva2CMHLNnXjNJJKqJ2EF",
422
+ "model": "claude-3-7-sonnet-20250219",
423
+ "role": "assistant"
424
+ }
425
+ ```
426
+
427
+ The path to extract the text is: `content[0].text`
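+
+ If you are unsure whether a path expression is correct, you can check it against a sample response with the `jmespath` library before putting it in your config. A minimal sketch using a trimmed version of the response above:
+
+ ```python
+ import jmespath
+
+ # Trimmed version of the Claude response shown above.
+ response = {
+     "content": [{"text": "Hi! My name is Claude.", "type": "text"}],
+     "id": "msg_013Zva2CMHLNnXjNJJKqJ2EF",
+     "model": "claude-3-7-sonnet-20250219",
+     "role": "assistant",
+ }
+
+ print(jmespath.search("content[0].text", response))  # -> Hi! My name is Claude.
+ ```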
428
+
429
+ ### Syntax Highlighting Themes
430
+
431
+ YAICLI supports all Pygments syntax highlighting themes. You can set your preferred theme in the config file:
396
432
 
397
- **CODE_THEME**
433
+ ```ini
434
+ CODE_THEME=monokai
435
+ ```
398
436
 
399
- You can find the list of code theme here: https://pygments.org/styles/
437
+ Browse available themes at: https://pygments.org/styles/
400
438
 
401
- Default: monokia
402
- ![monikia](artwork/monokia.png)
439
+ ![monokai theme example](artwork/monokia.png)
403
440
 
404
- ## Usage
441
+ ## 🚀 Usage
405
442
 
406
- ### Basic Usage
443
+ ### Quick Start
407
444
 
408
445
  ```bash
409
- # One-shot mode
446
+ # Get a quick answer
410
447
  ai "What is the capital of France?"
411
448
 
412
- # Chat mode
449
+ # Start an interactive chat session
413
450
  ai --chat
414
451
 
415
- # Shell command generation mode
452
+ # Generate and execute shell commands
416
453
  ai --shell "Create a backup of my Documents folder"
417
454
 
418
- # Verbose mode for debugging
455
+ # Analyze code from a file
456
+ cat app.py | ai "Explain what this code does"
457
+
458
+ # Debug with verbose mode
419
459
  ai --verbose "Explain quantum computing"
420
460
  ```
421
461
 
422
- ### Command Line Options
462
+ ### Command Line Reference
463
+
464
+ ```
465
+ Usage: ai [OPTIONS] [PROMPT]
466
+ ```
467
+
468
+ | Option | Short | Description |
469
+ | ------------ | ----- | ----------------------------------- |
470
+ | `--chat` | `-c` | Start in interactive chat mode |
471
+ | `--shell` | `-s` | Generate and execute shell commands |
472
+ | `--help` | `-h` | Show help message and exit |
473
+ | `--verbose` | `-V` | Show detailed debug information |
474
+ | `--template` | | Display the config template |
475
+
476
+ ### Interactive Mode Features
477
+
478
+ <table>
479
+ <tr>
480
+ <td width="50%">
481
+
482
+ **Commands**
423
483
 
424
- Arguments:
425
- - `<PROMPT>`: Argument
484
+ - `/clear` - Clear conversation history
485
+ - `/his` - Show command history
486
+ - `/list` - List saved chats
487
+ - `/save <title>` - Save current chat with title
488
+ - `/load <index>` - Load a saved chat
489
+ - `/del <index>` - Delete a saved chat
490
+ - `/exit` - Exit the application
491
+ - `/mode chat|exec` - Switch modes
426
492
 
427
- Options:
428
- - `--install-completion`: Install completion for the current shell
429
- - `--show-completion`: Show completion for the current shell, to copy it or customize the installation
430
- - `--help` or `-h`: Show this message and exit
431
- - `--template`: Show the config template.
493
+ **Keyboard Shortcuts**
432
494
 
433
- Run Options:
434
- - `--verbose` or `-V`: Show verbose information
435
- - `--chat` or `-c`: Start in chat mode
436
- - `--shell` or `-s`: Generate and execute shell command
495
+ - `Tab` - Toggle between Chat/Execute modes
496
+ - `Ctrl+C` or `Ctrl+D` - Exit
497
+ - `Ctrl+R` - Search history
498
+ - `↑/↓` - Navigate through history
499
+
500
+ </td>
501
+ <td width="50%">
502
+
503
+ **Chat Mode** (💬)
504
+
505
+ - Natural conversations with context
506
+ - Markdown and code formatting
507
+ - Reasoning display for complex queries
508
+
509
+ **Execute Mode** (🚀)
510
+
511
+ - Generate shell commands from descriptions
512
+ - Review commands before execution
513
+ - Edit commands before running
514
+ - Safe execution with confirmation
515
+
516
+ </td>
517
+ </tr>
518
+ </table>
519
+
520
+ ### Chat Persistence
521
+
522
+ In chat mode, the `<PROMPT>` argument is used as the title under which the chat content is persisted to the file system. The save directory is a temporary directory, so it may vary between machines; it is determined on the first run.
523
+
524
+ If no `<PROMPT>` is given when entering chat mode, the session is treated as temporary and is not persisted. You can still save it at any time with the `/save <title>` command during the chat.
525
+ When you run the same `chat` command again, the previous session will be automatically loaded.
437
526
 
438
527
  ```bash
439
- ai -h
528
+ $ ai --chat "meaning of life"
529
+ ```
440
530
 
441
- Usage: ai [OPTIONS] [PROMPT]
531
+ > [!NOTE]
+ > Chat mode is not supported when input is piped or redirected to the `ai` command.
532
+ >
533
+ > ```bash
534
+ > $ cat error.log | ai --chat "Explain this error"
535
+ > ```
536
+ >
537
+ > The command above falls back to a one-shot query that combines the piped content with the prompt; no chat session is started.
442
538
 
443
- yaicli - Your AI interface in cli.
444
-
445
- ╭─ Arguments ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
446
- │ prompt [PROMPT] The prompt send to the LLM │
447
- ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
448
- ╭─ Options ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
449
- │ --template Show the config template. │
450
- │ --install-completion Install completion for the current shell. │
451
- │ --show-completion Show completion for the current shell, to copy it or customize the installation. │
452
- │ --help -h Show this message and exit. │
453
- ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
454
- ╭─ Run Option ────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
455
- │ --chat -c Start in chat mode │
456
- │ --shell -s Generate and execute shell command │
457
- │ --verbose -V Show verbose information │
458
- ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
539
+ **Start a temporary chat session**
459
540
 
541
+ ```bash
542
+ $ ai --chat
460
543
  ```
461
544
 
462
- ### Interactive Mode
545
+ **Save a temporary chat session**
463
546
 
464
- In interactive mode (chat or shell), you can:
465
- - Type your queries and get responses
466
- - Use `Tab` to switch between Chat and Execute modes
467
- - Type '/exit' to exit
468
- - Type '/clear' to clear history
469
- - Type '/his' to show history
547
+ ```bash
548
+ $ ai --chat
549
+ Starting a temporary chat session (will not be saved automatically)
550
+ ...
551
+ 💬 > hi
552
+ Assistant:
553
+ Hello! How can I assist you today?
554
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
555
+ 💬 > /save "hello"
556
+ Chat saved as: hello
557
+ Session is now marked as persistent and will be auto-saved on exit.
558
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
559
+ 💬 >
560
+ ```
470
561
 
471
- ### Shell Command Generation
562
+ **Start a persistent chat session**
472
563
 
473
- In Execute mode:
474
- 1. Enter your request in natural language
475
- 2. YAICLI will generate an appropriate shell command
476
- 3. Review the command
477
- 4. Confirm to execute or reject
564
+ ```bash
565
+ $ ai --chat "check disk usage"
566
+ ```
478
567
 
479
- ### Keyboard Shortcuts
480
- - `Tab`: Switch between Chat and Execute modes
481
- - `Ctrl+C`: Exit
482
- - `Ctrl+R`: Search history
483
- - `↑/↓`: Navigate history
568
+ **Load a saved chat session**
569
+
570
+ ```bash
571
+ $ ai --chat hello
572
+ Chat title: hello
573
+
574
+ ██ ██ █████ ██ ██████ ██ ██
575
+ ██ ██ ██ ██ ██ ██ ██ ██
576
+ ████ ███████ ██ ██ ██ ██
577
+ ██ ██ ██ ██ ██ ██ ██
578
+ ██ ██ ██ ██ ██████ ███████ ██
579
+
580
+ Welcome to YAICLI!
581
+ Current: Persistent Session: hello
582
+ Press TAB to switch mode
583
+ /clear : Clear chat history
584
+ /his : Show chat history
585
+ /list : List saved chats
586
+ /save <title> : Save current chat
587
+ /load <index> : Load a saved chat
588
+ /del <index> : Delete a saved chat
589
+ /exit|Ctrl+D|Ctrl+C: Exit
590
+ /mode chat|exec : Switch mode (Case insensitive)
591
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
592
+ 💬 > /his
593
+ Chat History:
594
+ 1 User: hi
595
+ Assistant:
596
+ Hello! How can I assist you today?
597
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
598
+ 💬 >
599
+ ```
600
+
601
+ ### Input Methods
602
+
603
+ **Direct Input**
604
+
605
+ ```bash
606
+ ai "What is the capital of France?"
607
+ ```
608
+
609
+ **Piped Input**
484
610
 
485
- ### Stdin
486
- You can also pipe input to YAICLI:
487
611
  ```bash
488
612
  echo "What is the capital of France?" | ai
489
613
  ```
490
614
 
615
+ **File Analysis**
616
+
491
617
  ```bash
492
- cat demo.py | ai "How to use this tool?"
618
+ cat demo.py | ai "Explain this code"
493
619
  ```
494
620
 
495
- ### History
496
- Support max history size. Set MAX_HISTORY in config file. Default is 500.
621
+ **Combined Input**
497
622
 
498
- ## Examples
623
+ ```bash
624
+ cat error.log | ai "Why am I getting these errors in my Python app?"
625
+ ```
626
+
627
+ ### History Management
628
+
629
+ YAICLI maintains a history of your interactions (default: 500 entries) stored in `~/.yaicli_history`. You can:
630
+
631
+ - Configure history size with `MAX_HISTORY` in config
632
+ - Search history with `Ctrl+R` in interactive mode
633
+ - View recent commands with `/his` command
499
634
 
500
- ### Have a Chat
635
+ ## 📱 Examples
636
+
637
+ ### Quick Answer Mode
501
638
 
502
639
  ```bash
503
640
  $ ai "What is the capital of France?"
@@ -505,7 +642,7 @@ Assistant:
505
642
  The capital of France is Paris.
506
643
  ```
507
644
 
508
- ### Command Gen and Run
645
+ ### Command Generation & Execution
509
646
 
510
647
  ```bash
511
648
  $ ai -s 'Check the current directory size'
@@ -525,47 +662,55 @@ Output:
525
662
 
526
663
  ```bash
527
664
  $ ai --chat
665
+ Starting a temporary chat session (will not be saved automatically)
666
+
667
+ ██ ██ █████ ██ ██████ ██ ██
668
+ ██ ██ ██ ██ ██ ██ ██ ██
669
+ ████ ███████ ██ ██ ██ ██
670
+ ██ ██ ██ ██ ██ ██ ██
671
+ ██ ██ ██ ██ ██████ ███████ ██
672
+
673
+ Welcome to YAICLI!
674
+ Current: Temporary Session (use /save to make persistent)
675
+ Press TAB to switch mode
676
+ /clear : Clear chat history
677
+ /his : Show chat history
678
+ /list : List saved chats
679
+ /save <title> : Save current chat
680
+ /load <index> : Load a saved chat
681
+ /del <index> : Delete a saved chat
682
+ /exit|Ctrl+D|Ctrl+C: Exit
683
+ /mode chat|exec : Switch mode (Case insensitive)
684
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
685
+ 💬 > Tell me about the solar system
528
686
 
529
- ██ ██ █████ ██ ██████ ██ ██
530
- ██ ██ ██ ██ ██ ██ ██ ██
531
- ████ ███████ ██ ██ ██ ██
532
- ██ ██ ██ ██ ██ ██ ██
533
- ██ ██ ██ ██ ██████ ███████ ██
687
+ Assistant:
688
+ Solar System Overview
534
689
 
535
- Press TAB to change in chat and exec mode
536
- Type /clear to clear chat history
537
- Type /his to see chat history
538
- Press Ctrl+C or type /exit to exit
690
+ Central Star: The Sun (99% of system mass, nuclear fusion).
691
+ Planets: 8 total.
692
+ Terrestrial (rocky): Mercury, Venus, Earth, Mars.
693
+ Gas Giants: Jupiter, Saturn.
694
+ • Ice Giants: Uranus, Neptune.
695
+ • Moons: Over 200 (e.g., Earth: 1, Jupiter: 95).
696
+ • Smaller Bodies:
697
+ • Asteroids (between Mars and Jupiter), comets (icy, distant), *dwarf planets* (Pluto, Ceres).
698
+ • Oort Cloud: spherical shell of icy objects ~1–100,000 AU from the Sun.
699
+ • Heliosphere: Solar wind boundary protecting Earth from cosmic radiation.
539
700
 
540
- 💬 > Tell me about the solar system
541
-
542
- Assistant:
543
- Certainly! Here’s a brief overview of the solar system:
544
-
545
- • Sun: The central star of the solar system, providing light and energy.
546
- • Planets:
547
- • Mercury: Closest to the Sun, smallest planet.
548
- • Venus: Second planet, known for its thick atmosphere and high surface temperature.
549
- • Earth: Third planet, the only known planet to support life.
550
- • Mars: Fourth planet, often called the "Red Planet" due to its reddish appearance.
551
- • Jupiter: Largest planet, a gas giant with many moons.
552
- • Saturn: Known for its prominent ring system, also a gas giant.
553
- • Uranus: An ice giant, known for its unique axial tilt.
554
- • Neptune: Another ice giant, known for its deep blue color.
555
- • Dwarf Planets:
556
- • Pluto: Once considered the ninth planet, now classified as
701
+ Key Fact: Earth is the only confirmed habitable planet.
557
702
 
558
703
  🚀 > Check the current directory size
559
704
  Assistant:
560
705
  du -sh .
561
- ╭─ Command ─╮
562
- │ du -sh .
563
- ╰───────────╯
706
+ ╭─ Suggest Command ─╮
707
+ │ du -sh .
708
+ ╰───────────────────╯
564
709
  Execute command? [e]dit, [y]es, [n]o (n): e
565
- Edit command, press enter to execute:
566
- du -sh ./
567
- Output:
568
- 109M ./
710
+ Edit command: du -sh ./
711
+ --- Executing ---
712
+ 55M ./
713
+ --- Finished ---
569
714
  🚀 >
570
715
  ```
571
716
 
@@ -575,9 +720,9 @@ Output:
575
720
  $ ai --shell "Find all PDF files in my Downloads folder"
576
721
  Assistant:
577
722
  find ~/Downloads -type f -name "*.pdf"
578
- ╭─ Command ──────────────────────────────╮
579
- │ find ~/Downloads -type f -name "*.pdf" │
580
- ╰────────────────────────────────────────╯
723
+ ╭─ Suggest Command ───────────────────────╮
724
+ │ find ~/Downloads -type f -iname "*.pdf" │
725
+ ╰─────────────────────────────────────────╯
581
726
  Execute command? [e]dit, [y]es, [n]o (n): y
582
727
  Output:
583
728
 
@@ -586,24 +731,41 @@ Output:
586
731
  ...
587
732
  ```
588
733
 
589
- ## Technical Implementation
734
+ ## 💻 Technical Details
735
+
736
+ ### Architecture
737
+
738
+ YAICLI is designed with a modular architecture that separates concerns and makes the codebase maintainable:
739
+
740
+ - **CLI Module**: Handles user interaction and command parsing
741
+ - **API Client**: Manages communication with LLM providers
742
+ - **Config Manager**: Handles layered configuration
743
+ - **History Manager**: Maintains conversation history with LRU functionality
744
+ - **Printer**: Formats and displays responses with rich formatting
745
+
746
+ ### Dependencies
590
747
 
591
- YAICLI is built using several Python libraries:
748
+ | Library | Purpose |
749
+ | --------------------------------------------------------------- | -------------------------------------------------- |
750
+ | [Typer](https://typer.tiangolo.com/) | Command-line interface with type hints |
751
+ | [Rich](https://rich.readthedocs.io/) | Terminal formatting and beautiful display |
752
+ | [prompt_toolkit](https://python-prompt-toolkit.readthedocs.io/) | Interactive input with history and auto-completion |
753
+ | [httpx](https://www.python-httpx.org/) | Modern HTTP client with async support |
754
+ | [jmespath](https://jmespath.org/) | JSON data extraction |
592
755
 
593
- - **Typer**: Provides the command-line interface
594
- - **Rich**: Provides terminal content formatting and beautiful display
595
- - **prompt_toolkit**: Provides interactive command-line input experience
596
- - **httpx**: Handles API requests
597
- - **jmespath**: Parses JSON responses
756
+ ## 👨‍💻 Contributing
598
757
 
599
- ## Contributing
758
+ Contributions are welcome! Here's how you can help:
600
759
 
601
- Contributions of code, issue reports, or feature suggestions are welcome.
760
+ - **Bug Reports**: Open an issue describing the bug and how to reproduce it
761
+ - **Feature Requests**: Suggest new features or improvements
762
+ - **Code Contributions**: Submit a PR with your changes
763
+ - **Documentation**: Help improve or translate the documentation
602
764
 
603
- ## License
765
+ ## 📃 License
604
766
 
605
767
  [Apache License 2.0](LICENSE)
606
768
 
607
769
  ---
608
770
 
609
- *YAICLI - Making your terminal smarter*
771
+ <p align="center"><i>YAICLI - Your AI Command Line Interface</i></p>