yaicli 0.1.0__tar.gz → 0.3.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: yaicli
3
- Version: 0.1.0
3
+ Version: 0.3.0
4
4
  Summary: A simple CLI tool to interact with LLM
5
5
  Project-URL: Homepage, https://github.com/belingud/yaicli
6
6
  Project-URL: Repository, https://github.com/belingud/yaicli
@@ -222,19 +222,21 @@ Requires-Dist: socksio>=1.0.0
222
222
  Requires-Dist: typer>=0.15.2
223
223
  Description-Content-Type: text/markdown
224
224
 
225
- # YAICLI - Your AI Command Line Interface
225
+ # YAICLI: Your AI assistant in the command line.
226
226
 
227
227
  [![PyPI version](https://img.shields.io/pypi/v/yaicli?style=for-the-badge)](https://pypi.org/project/yaicli/)
228
228
  ![GitHub License](https://img.shields.io/github/license/belingud/yaicli?style=for-the-badge)
229
229
  ![PyPI - Downloads](https://img.shields.io/pypi/dm/yaicli?logo=pypi&style=for-the-badge)
230
230
  ![Pepy Total Downloads](https://img.shields.io/pepy/dt/yaicli?style=for-the-badge&logo=python)
231
231
 
232
- YAICLI is a powerful yet lightweight command-line AI assistant that brings the capabilities of Large Language Models (LLMs) like GPT-4o directly to your terminal. Interact with AI through multiple modes: have natural conversations, generate and execute shell commands, or get quick answers without leaving your workflow.
232
+ YAICLI is a powerful yet lightweight command-line AI assistant that brings the capabilities of Large Language Models
233
+ (LLMs) like GPT-4o directly to your terminal. Interact with AI through multiple modes: have natural conversations,
234
+ generate and execute shell commands, or get quick answers without leaving your workflow.
233
235
 
234
236
  **Supports both standard and deep reasoning models across all major LLM providers.**
235
237
 
236
238
  <p align="center">
237
- <img src="https://vhs.charm.sh/vhs-5U1BBjJkTUBReRswsSgIVx.gif" alt="YAICLI Demo" width="85%">
239
+ <img src="https://vhs.charm.sh/vhs-5U1BBjJkTUBReRswsSgIVx.gif" alt="YAICLI Chat Demo" width="85%">
238
240
  </p>
239
241
 
240
242
  > [!NOTE]
@@ -243,30 +245,37 @@ YAICLI is a powerful yet lightweight command-line AI assistant that brings the c
243
245
  ## ✨ Key Features
244
246
 
245
247
  ### 🔄 Multiple Interaction Modes
248
+
246
249
  - **💬 Chat Mode**: Engage in persistent conversations with full context tracking
247
250
  - **🚀 Execute Mode**: Generate and safely run OS-specific shell commands
248
251
  - **⚡ Quick Query**: Get instant answers without entering interactive mode
249
252
 
250
253
  ### 🧠 Smart Environment Awareness
254
+
251
255
  - **Auto-detection**: Identifies your shell (bash/zsh/PowerShell/CMD) and OS
252
256
  - **Safe Command Execution**: 3-step verification before running any command
253
257
  - **Flexible Input**: Pipe content directly (`cat log.txt | ai "analyze this"`)
254
258
 
255
259
  ### 🔌 Universal LLM Compatibility
260
+
256
261
  - **OpenAI-Compatible**: Works with any OpenAI-compatible API endpoint
257
262
  - **Multi-Provider Support**: Easy configuration for Claude, Gemini, Cohere, etc.
258
263
  - **Custom Response Parsing**: Extract exactly what you need with jmespath
259
264
 
260
265
  ### 💻 Enhanced Terminal Experience
266
+
261
267
  - **Real-time Streaming**: See responses as they're generated with cursor animation
262
268
  - **Rich History Management**: LRU-based history with 500 entries by default
263
269
  - **Syntax Highlighting**: Beautiful code formatting with customizable themes
264
270
 
265
271
  ### 🛠️ Developer-Friendly
272
+
266
273
  - **Layered Configuration**: Environment variables > Config file > Sensible defaults
267
274
  - **Debugging Tools**: Verbose mode with detailed API tracing
268
275
  - **Lightweight**: Minimal dependencies with focused functionality
269
276
 
277
+ ![What is life](artwork/reasoning_example.png)
278
+
270
279
  ## 📦 Installation
271
280
 
272
281
  ### Prerequisites
@@ -306,77 +315,94 @@ YAICLI uses a simple configuration file to store your preferences and API keys.
306
315
 
307
316
  ### Configuration File Structure
308
317
 
309
- The default configuration file is located at `~/.config/yaicli/config.ini`. You can use `ai --template` to see default settings, just as below:
318
+ The default configuration file is located at `~/.config/yaicli/config.ini`. You can use `ai --template` to see default
319
+ settings, as shown below:
310
320
 
311
321
  ```ini
312
322
  [core]
313
- PROVIDER=openai
314
- BASE_URL=https://api.openai.com/v1
315
- API_KEY=
316
- MODEL=gpt-4o
323
+ PROVIDER = openai
324
+ BASE_URL = https://api.openai.com/v1
325
+ API_KEY =
326
+ MODEL = gpt-4o
317
327
 
318
328
  # auto detect shell and os (or specify manually, e.g., bash, zsh, powershell.exe)
319
- SHELL_NAME=auto
320
- OS_NAME=auto
329
+ SHELL_NAME = auto
330
+ OS_NAME = auto
321
331
 
322
332
  # API paths (usually no need to change for OpenAI compatible APIs)
323
- COMPLETION_PATH=chat/completions
324
- ANSWER_PATH=choices[0].message.content
333
+ COMPLETION_PATH = chat/completions
334
+ ANSWER_PATH = choices[0].message.content
325
335
 
326
336
  # true: streaming response, false: non-streaming
327
- STREAM=true
337
+ STREAM = true
328
338
 
329
339
  # LLM parameters
330
- TEMPERATURE=0.7
331
- TOP_P=1.0
332
- MAX_TOKENS=1024
333
- TIMEOUT=60
340
+ TEMPERATURE = 0.7
341
+ TOP_P = 1.0
342
+ MAX_TOKENS = 1024
343
+ TIMEOUT = 60
334
344
 
335
345
  # Interactive mode parameters
336
- INTERACTIVE_ROUND=25
346
+ INTERACTIVE_ROUND = 25
337
347
 
338
348
  # UI/UX
339
- CODE_THEME=monokai
340
- MAX_HISTORY=500 # Max entries kept in history file
341
- AUTO_SUGGEST=true
349
+ CODE_THEME = monokai
350
+ # Max entries kept in history file
351
+ MAX_HISTORY = 500
352
+ AUTO_SUGGEST = true
353
+ # Print reasoning content or not
354
+ SHOW_REASONING = true
355
+ # Text alignment (default, left, center, right, full)
356
+ JUSTIFY = default
357
+
358
+ # Chat history settings
359
+ CHAT_HISTORY_DIR = <tempdir>/yaicli/chats
360
+ MAX_SAVED_CHATS = 20
342
361
  ```
343
362
 
344
363
  ### Configuration Options Reference
345
364
 
346
- | Option | Description | Default | Env Variable |
347
- |--------|-------------|---------|---------------|
348
- | `BASE_URL` | API endpoint URL | `https://api.openai.com/v1` | `YAI_BASE_URL` |
349
- | `API_KEY` | Your API key | - | `YAI_API_KEY` |
350
- | `MODEL` | LLM model to use | `gpt-4o` | `YAI_MODEL` |
351
- | `SHELL_NAME` | Shell type | `auto` | `YAI_SHELL_NAME` |
352
- | `OS_NAME` | Operating system | `auto` | `YAI_OS_NAME` |
353
- | `COMPLETION_PATH` | API completion path | `chat/completions` | `YAI_COMPLETION_PATH` |
354
- | `ANSWER_PATH` | JSON path for response | `choices[0].message.content` | `YAI_ANSWER_PATH` |
355
- | `STREAM` | Enable streaming | `true` | `YAI_STREAM` |
356
- | `TIMEOUT` | API timeout (seconds) | `60` | `YAI_TIMEOUT` |
357
- | `INTERACTIVE_ROUND` | Interactive mode rounds | `25` | `YAI_INTERACTIVE_ROUND` |
358
- | `CODE_THEME` | Syntax highlighting theme | `monokai` | `YAI_CODE_THEME` |
359
- | `TEMPERATURE` | Response randomness | `0.7` | `YAI_TEMPERATURE` |
360
- | `TOP_P` | Top-p sampling | `1.0` | `YAI_TOP_P` |
361
- | `MAX_TOKENS` | Max response tokens | `1024` | `YAI_MAX_TOKENS` |
362
- | `MAX_HISTORY` | Max history entries | `500` | `YAI_MAX_HISTORY` |
363
- | `AUTO_SUGGEST` | Enable history suggestions | `true` | `YAI_AUTO_SUGGEST` |
365
+ | Option | Description | Default | Env Variable |
366
+ |---------------------|---------------------------------------------|------------------------------|-------------------------|
367
+ | `PROVIDER` | LLM provider (openai, claude, cohere, etc.) | `openai` | `YAI_PROVIDER` |
368
+ | `BASE_URL` | API endpoint URL | `https://api.openai.com/v1` | `YAI_BASE_URL` |
369
+ | `API_KEY` | Your API key | - | `YAI_API_KEY` |
370
+ | `MODEL` | LLM model to use | `gpt-4o` | `YAI_MODEL` |
371
+ | `SHELL_NAME` | Shell type | `auto` | `YAI_SHELL_NAME` |
372
+ | `OS_NAME` | Operating system | `auto` | `YAI_OS_NAME` |
373
+ | `COMPLETION_PATH` | API completion path | `chat/completions` | `YAI_COMPLETION_PATH` |
374
+ | `ANSWER_PATH` | JSON path for response | `choices[0].message.content` | `YAI_ANSWER_PATH` |
375
+ | `STREAM` | Enable streaming | `true` | `YAI_STREAM` |
376
+ | `TIMEOUT` | API timeout (seconds) | `60` | `YAI_TIMEOUT` |
377
+ | `INTERACTIVE_ROUND` | Interactive mode rounds | `25` | `YAI_INTERACTIVE_ROUND` |
378
+ | `CODE_THEME` | Syntax highlighting theme | `monokai` | `YAI_CODE_THEME` |
379
+ | `TEMPERATURE` | Response randomness | `0.7` | `YAI_TEMPERATURE` |
380
+ | `TOP_P` | Top-p sampling | `1.0` | `YAI_TOP_P` |
381
+ | `MAX_TOKENS` | Max response tokens | `1024` | `YAI_MAX_TOKENS` |
382
+ | `MAX_HISTORY` | Max history entries | `500` | `YAI_MAX_HISTORY` |
383
+ | `AUTO_SUGGEST` | Enable history suggestions | `true` | `YAI_AUTO_SUGGEST` |
384
+ | `SHOW_REASONING` | Enable reasoning display | `true` | `YAI_SHOW_REASONING` |
385
+ | `JUSTIFY` | Text alignment | `default` | `YAI_JUSTIFY` |
386
+ | `CHAT_HISTORY_DIR` | Chat history directory | `<tempdir>/yaicli/history` | `YAI_CHAT_HISTORY_DIR` |
387
+ | `MAX_SAVED_CHATS` | Max saved chats | `20` | `YAI_MAX_SAVED_CHATS` |
364
388
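
To make the precedence behind this table concrete, here is a minimal Python sketch of the documented order (environment variable, then config file, then default). It is illustrative only, not yaicli's actual code; the `YAI_` prefix, the `[core]` section, and the defaults are taken from the documentation above.

```python
# Illustrative only: mimics the documented precedence
# (env var > config file > default), not yaicli's implementation.
import os
from configparser import ConfigParser
from pathlib import Path

DEFAULTS = {"MODEL": "gpt-4o", "CODE_THEME": "monokai"}  # documented defaults


def resolve(option: str) -> str:
    env_value = os.getenv(f"YAI_{option}")  # 1. YAI_* environment variable
    if env_value is not None:
        return env_value
    config = ConfigParser()
    config.read(Path.home() / ".config/yaicli/config.ini")
    if config.has_section("core") and config.has_option("core", option):
        return config.get("core", option)    # 2. [core] section of the config file
    return DEFAULTS[option]                  # 3. documented default


print(resolve("MODEL"))  # prints the env override if YAI_MODEL is exported
```
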
 
365
389
  ### LLM Provider Configuration
366
390
 
367
- YAICLI works with all major LLM providers. The default configuration is set up for OpenAI, but you can easily switch to other providers.
391
+ YAICLI works with all major LLM providers. The default configuration is set up for OpenAI, but you can easily switch to
392
+ other providers.
368
393
 
369
394
  #### Pre-configured Provider Settings
370
395
 
371
- | Provider | BASE_URL | COMPLETION_PATH | ANSWER_PATH |
372
- |----------|----------|-----------------|-------------|
373
- | **OpenAI** (default) | `https://api.openai.com/v1` | `chat/completions` | `choices[0].message.content` |
374
- | **Claude** (native API) | `https://api.anthropic.com/v1` | `messages` | `content[0].text` |
375
- | **Claude** (OpenAI-compatible) | `https://api.anthropic.com/v1/openai` | `chat/completions` | `choices[0].message.content` |
376
- | **Cohere** | `https://api.cohere.com/v2` | `chat` | `message.content[0].text` |
377
- | **Google Gemini** | `https://generativelanguage.googleapis.com/v1beta/openai` | `chat/completions` | `choices[0].message.content` |
396
+ | Provider | BASE_URL | COMPLETION_PATH | ANSWER_PATH |
397
+ |--------------------------------|-----------------------------------------------------------|--------------------|------------------------------|
398
+ | **OpenAI** (default) | `https://api.openai.com/v1` | `chat/completions` | `choices[0].message.content` |
399
+ | **Claude** (native API) | `https://api.anthropic.com/v1` | `messages` | `content[0].text` |
400
+ | **Claude** (OpenAI-compatible) | `https://api.anthropic.com/v1/openai` | `chat/completions` | `choices[0].message.content` |
401
+ | **Cohere** | `https://api.cohere.com/v2` | `chat` | `message.content[0].text` |
402
+ | **Google Gemini** | `https://generativelanguage.googleapis.com/v1beta/openai` | `chat/completions` | `choices[0].message.content` |
378
403
 
379
404
  > **Note**: Many providers offer OpenAI-compatible endpoints that work with the default settings.
405
+ >
380
406
  > - Google Gemini: https://ai.google.dev/gemini-api/docs/openai
381
407
  > - Claude: https://docs.anthropic.com/en/api/openai-sdk
382
408
 
@@ -385,15 +411,18 @@ YAICLI works with all major LLM providers. The default configuration is set up f
385
411
  To configure a custom provider:
386
412
 
387
413
  1. **Find the API Endpoint**:
388
- - Check the provider's API documentation for their chat completion endpoint
414
+
415
+ - Check the provider's API documentation for their chat completion endpoint
389
416
 
390
417
  2. **Identify the Response Structure**:
391
- - Look at the JSON response format to find where the text content is located
418
+
419
+ - Look at the JSON response format to find where the text content is located
392
420
 
393
421
  3. **Set the Path Expression**:
394
- - Use jmespath syntax to specify the path to the text content
422
+ - Use jmespath syntax to specify the path to the text content
395
423
 
396
424
  **Example**: For Claude's native API, the response looks like:
425
+
397
426
  ```json
398
427
  {
399
428
  "content": [
@@ -408,14 +437,14 @@ To configure a custom provider:
408
437
  }
409
438
  ```
410
439
 
411
- The path to extract the text is: `content.0.text`
440
+ The path to extract the text is: `content[0].text`
412
441
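
As a hedged illustration (not yaicli's internal code) of how `BASE_URL`, `COMPLETION_PATH`, and a jmespath `ANSWER_PATH` fit together, the sketch below calls Claude's native API using the values from the provider table; the API key and model name are placeholders.

```python
# Sketch only: a generic "POST BASE_URL/COMPLETION_PATH, then extract
# ANSWER_PATH with jmespath" request, mirroring the settings documented above.
import httpx
import jmespath

BASE_URL = "https://api.anthropic.com/v1"    # from the provider table
COMPLETION_PATH = "messages"
ANSWER_PATH = "content[0].text"

response = httpx.post(
    f"{BASE_URL.rstrip('/')}/{COMPLETION_PATH}",
    headers={
        "x-api-key": "YOUR_API_KEY",          # placeholder
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-latest",  # placeholder model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
# jmespath pulls the reply text out of the JSON shown in the example above.
print(jmespath.search(ANSWER_PATH, response.json()))
```
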
 
413
442
  ### Syntax Highlighting Themes
414
443
 
415
444
  YAICLI supports all Pygments syntax highlighting themes. You can set your preferred theme in the config file:
416
445
 
417
446
  ```ini
418
- CODE_THEME=monokai
447
+ CODE_THEME = monokai
419
448
  ```
420
449
 
421
450
  Browse available themes at: https://pygments.org/styles/
@@ -449,13 +478,13 @@ ai --verbose "Explain quantum computing"
449
478
  Usage: ai [OPTIONS] [PROMPT]
450
479
  ```
451
480
 
452
- | Option | Short | Description |
453
- |--------|-------|-------------|
454
- | `--chat` | `-c` | Start in interactive chat mode |
455
- | `--shell` | `-s` | Generate and execute shell commands |
456
- | `--help` | `-h` | Show help message and exit |
457
- | `--verbose` | `-V` | Show detailed debug information |
458
- | `--template` | | Display the config template |
481
+ | Option | Short | Description |
482
+ |--------------|-------|-------------------------------------|
483
+ | `--chat` | `-c` | Start in interactive chat mode |
484
+ | `--shell` | `-s` | Generate and execute shell commands |
485
+ | `--help` | `-h` | Show help message and exit |
486
+ | `--verbose` | `-V` | Show detailed debug information |
487
+ | `--template` | | Display the config template |
459
488
 
460
489
  ### Interactive Mode Features
461
490
 
@@ -464,12 +493,18 @@ Usage: ai [OPTIONS] [PROMPT]
464
493
  <td width="50%">
465
494
 
466
495
  **Commands**
467
- - `/exit` - Exit the application
496
+
468
497
  - `/clear` - Clear conversation history
469
498
  - `/his` - Show command history
499
+ - `/list` - List saved chats
500
+ - `/save <title>` - Save current chat with title
501
+ - `/load <index>` - Load a saved chat
502
+ - `/del <index>` - Delete a saved chat
503
+ - `/exit` - Exit the application
470
504
  - `/mode chat|exec` - Switch modes
471
505
 
472
506
  **Keyboard Shortcuts**
507
+
473
508
  - `Tab` - Toggle between Chat/Execute modes
474
509
  - `Ctrl+C` or `Ctrl+D` - Exit
475
510
  - `Ctrl+R` - Search history
@@ -479,11 +514,13 @@ Usage: ai [OPTIONS] [PROMPT]
479
514
  <td width="50%">
480
515
 
481
516
  **Chat Mode** (💬)
517
+
482
518
  - Natural conversations with context
483
519
  - Markdown and code formatting
484
520
  - Reasoning display for complex queries
485
521
 
486
522
  **Execute Mode** (🚀)
523
+
487
524
  - Generate shell commands from descriptions
488
525
  - Review commands before execution
489
526
  - Edit commands before running
@@ -493,24 +530,112 @@ Usage: ai [OPTIONS] [PROMPT]
493
530
  </tr>
494
531
  </table>
495
532
 
533
+ ### Chat Persistence
534
+
535
+ In chat mode, the `<PROMPT>` argument is used as the title under which the chat is persisted to the file system.
536
+ The save directory is a temporary directory, so it may vary between machines; it is determined on the first run.
537
+
538
+ If the `<PROMPT>` parameter is not specified when entering `chat` mode, the session will be treated as a temporary
539
+ session and will not be persisted. Of course, you can also manually call the `/save <title>` command to save during the
540
+ chat.
541
+ When you run the same `chat` command again, the previous session will be automatically loaded.
542
+
543
+ ```bash
544
+ $ ai --chat "meaning of life"
545
+ ```
546
+
547
+ > !NOTE: Chat mode is not supported when input is redirected to the `ai` command.
548
+ >
549
+ > ```bash
550
+ > $ cat error.log | ai --chat "Explain this error"
551
+ > ```
552
+ >
553
+ > The above command will be treated as a single one-shot query (the piped content plus the prompt), not a chat session.
554
+
555
+ **Start a temporary chat session**
556
+
557
+ ```bash
558
+ $ ai --chat
559
+ ```
560
+
561
+ **Save a temporary chat session**
562
+
563
+ ```bash
564
+ $ ai --chat
565
+ Starting a temporary chat session (will not be saved automatically)
566
+ ...
567
+ 💬 > hi
568
+ Assistant:
569
+ Hello! How can I assist you today?
570
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
571
+ 💬 > /save "hello"
572
+ Chat saved as: hello
573
+ Session is now marked as persistent and will be auto-saved on exit.
574
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
575
+ 💬 >
576
+ ```
577
+
578
+ **Start a persistent chat session**
579
+
580
+ ```bash
581
+ $ ai --chat "check disk usage"
582
+ ```
583
+
584
+ **Load a saved chat session**
585
+
586
+ ```bash
587
+ $ ai --chat hello
588
+ Chat title: hello
589
+
590
+ ██ ██ █████ ██ ██████ ██ ██
591
+ ██ ██ ██ ██ ██ ██ ██ ██
592
+ ████ ███████ ██ ██ ██ ██
593
+ ██ ██ ██ ██ ██ ██ ██
594
+ ██ ██ ██ ██ ██████ ███████ ██
595
+
596
+ Welcome to YAICLI!
597
+ Current: Persistent Session: hello
598
+ Press TAB to switch mode
599
+ /clear : Clear chat history
600
+ /his : Show chat history
601
+ /list : List saved chats
602
+ /save <title> : Save current chat
603
+ /load <index> : Load a saved chat
604
+ /del <index> : Delete a saved chat
605
+ /exit|Ctrl+D|Ctrl+C: Exit
606
+ /mode chat|exec : Switch mode (Case insensitive)
607
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
608
+ 💬 > /his
609
+ Chat History:
610
+ 1 User: hi
611
+ Assistant:
612
+ Hello! How can I assist you today?
613
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
614
+ 💬 >
615
+ ```
616
+
496
617
  ### Input Methods
497
618
 
498
619
  **Direct Input**
620
+
499
621
  ```bash
500
622
  ai "What is the capital of France?"
501
623
  ```
502
624
 
503
625
  **Piped Input**
626
+
504
627
  ```bash
505
628
  echo "What is the capital of France?" | ai
506
629
  ```
507
630
 
508
631
  **File Analysis**
632
+
509
633
  ```bash
510
634
  cat demo.py | ai "Explain this code"
511
635
  ```
512
636
 
513
637
  **Combined Input**
638
+
514
639
  ```bash
515
640
  cat error.log | ai "Why am I getting these errors in my Python app?"
516
641
  ```
@@ -553,6 +678,7 @@ Output:
553
678
 
554
679
  ```bash
555
680
  $ ai --chat
681
+ Starting a temporary chat session (will not be saved automatically)
556
682
 
557
683
  ██ ██ █████ ██ ██████ ██ ██
558
684
  ██ ██ ██ ██ ██ ██ ██ ██
@@ -561,12 +687,17 @@ $ ai --chat
561
687
  ██ ██ ██ ██ ██████ ███████ ██
562
688
 
563
689
  Welcome to YAICLI!
690
+ Current: Temporary Session (use /save to make persistent)
564
691
  Press TAB to switch mode
565
692
  /clear : Clear chat history
566
693
  /his : Show chat history
694
+ /list : List saved chats
695
+ /save <title> : Save current chat
696
+ /load <index> : Load a saved chat
697
+ /del <index> : Delete a saved chat
567
698
  /exit|Ctrl+D|Ctrl+C: Exit
568
699
  /mode chat|exec : Switch mode (Case insensitive)
569
- ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
700
+ ───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
570
701
  💬 > Tell me about the solar system
571
702
 
572
703
  Assistant:
@@ -630,13 +761,13 @@ YAICLI is designed with a modular architecture that separates concerns and makes
630
761
 
631
762
  ### Dependencies
632
763
 
633
- | Library | Purpose |
634
- |---------|----------|
635
- | [Typer](https://typer.tiangolo.com/) | Command-line interface with type hints |
636
- | [Rich](https://rich.readthedocs.io/) | Terminal formatting and beautiful display |
764
+ | Library | Purpose |
765
+ |-----------------------------------------------------------------|----------------------------------------------------|
766
+ | [Typer](https://typer.tiangolo.com/) | Command-line interface with type hints |
767
+ | [Rich](https://rich.readthedocs.io/) | Terminal formatting and beautiful display |
637
768
  | [prompt_toolkit](https://python-prompt-toolkit.readthedocs.io/) | Interactive input with history and auto-completion |
638
- | [httpx](https://www.python-httpx.org/) | Modern HTTP client with async support |
639
- | [jmespath](https://jmespath.org/) | JSON data extraction |
769
+ | [httpx](https://www.python-httpx.org/) | Modern HTTP client with async support |
770
+ | [jmespath](https://jmespath.org/) | JSON data extraction |
640
771
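
Tying the httpx and jmespath rows above back to the `STREAM = true` option, here is a minimal, hedged sketch of consuming an OpenAI-compatible streaming response (server-sent `data:` lines). It is illustrative only; the endpoint, model, and key are placeholders, and this is not yaicli's implementation.

```python
# Sketch only: reads an OpenAI-compatible SSE stream and prints tokens
# as they arrive, which is what STREAM = true amounts to on the wire.
import json
import httpx

payload = {
    "model": "gpt-4o",
    "stream": True,
    "messages": [{"role": "user", "content": "Say hello"}],
}
headers = {"Authorization": "Bearer YOUR_API_KEY"}  # placeholder

with httpx.Client(base_url="https://api.openai.com/v1", timeout=60) as client:
    with client.stream("POST", "chat/completions", json=payload, headers=headers) as resp:
        for line in resp.iter_lines():
            # Streamed events look like: data: {...}; the final event is data: [DONE]
            if not line.startswith("data: ") or line.endswith("[DONE]"):
                continue
            chunk = json.loads(line[len("data: "):])
            piece = chunk["choices"][0]["delta"].get("content") or ""
            print(piece, end="", flush=True)
print()
```
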
 
641
772
  ## 👨‍💻 Contributing
642
773