yaicli 0.0.14.tar.gz → 0.0.16.tar.gz

This diff compares the contents of two publicly released versions of the package, as they appear in their public registry. It is provided for informational purposes only.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: yaicli
-Version: 0.0.14
+Version: 0.0.16
 Summary: A simple CLI tool to interact with LLM
 Project-URL: Homepage, https://github.com/belingud/yaicli
 Project-URL: Repository, https://github.com/belingud/yaicli
@@ -233,7 +233,7 @@ YAICLI is a compact yet potent command-line AI assistant, allowing you to engage
 
 Support regular and deep thinking models.
 
-> [!WARNING]
+> [!WARNING]
 > This is a work in progress, some features could change or be removed in the future.
 
 ## Features
@@ -259,6 +259,8 @@ Support regular and deep thinking models.
 
 - **Keyboard Shortcuts**:
 - Tab to switch between Chat and Execute modes
+- `↑/↓` to navigate history
+- `Ctrl+R` to search history
 
 - **History**:
 - Save and recall previous queries
@@ -319,6 +321,7 @@ ANSWER_PATH=choices[0].message.content
 # true: streaming response
 # false: non-streaming response
 STREAM=true
+CODE_THEME=monokia
 
 TEMPERATURE=0.7
 TOP_P=1.0
@@ -337,9 +340,11 @@ Below are the available configuration options and override environment variables
 - **COMPLETION_PATH**: Path for completions endpoint, default: /chat/completions, env: YAI_COMPLETION_PATH
 - **ANSWER_PATH**: Json path expression to extract answer from response, default: choices[0].message.content, env: YAI_ANSWER_PATH
 - **STREAM**: Enable/disable streaming responses, default: true, env: YAI_STREAM
+- **CODE_THEME**: Theme for code blocks, default: monokia, env: YAI_CODE_THEME
 - **TEMPERATURE**: Temperature for response generation (default: 0.7), env: YAI_TEMPERATURE
 - **TOP_P**: Top-p sampling for response generation (default: 1.0), env: YAI_TOP_P
 - **MAX_TOKENS**: Maximum number of tokens for response generation (default: 1024), env: YAI_MAX_TOKENS
+- **MAX_HISTORY**: Max history size, default: 500, env: YAI_MAX_HISTORY
 
 Default config of `COMPLETION_PATH` and `ANSWER_PATH` is OpenAI compatible. If you are using OpenAI or other OpenAI compatible LLM provider, you can use the default config.
 
@@ -392,6 +397,12 @@ If you not sure how to config `COMPLETION_PATH` and `ANSWER_PATH`, here is a gui
 ```
 We are looking for the `text` field, so the path should be 1.Key `content`, 2.First obj `[0]`, 3.Key `text`. So it should be `content.[0].text`.
 
+**CODE_THEME**
+
+You can find the list of code theme here: https://pygments.org/styles/
+
+Default: monokia
+![monikia](artwork/monokia.png)
 
 ## Usage
 
@@ -430,10 +441,10 @@ Run Options:
 ```bash
 ai -h
 
-Usage: ai [OPTIONS] [PROMPT]
-
-yaicli - Your AI interface in cli.
-
+Usage: ai [OPTIONS] [PROMPT]
+
+yaicli - Your AI interface in cli.
+
 ╭─ Arguments ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
 │ prompt [PROMPT] The prompt send to the LLM │
 ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
@@ -468,6 +479,25 @@ In Execute mode:
 3. Review the command
 4. Confirm to execute or reject
 
+### Keyboard Shortcuts
+- `Tab`: Switch between Chat and Execute modes
+- `Ctrl+C`: Exit
+- `Ctrl+R`: Search history
+- `↑/↓`: Navigate history
+
+### Stdin
+You can also pipe input to YAICLI:
+```bash
+echo "What is the capital of France?" | ai
+```
+
+```bash
+cat demo.py | ai "How to use this tool?"
+```
+
+### History
+Support max history size. Set MAX_HISTORY in config file. Default is 500.
+
 ## Examples
 
 ### Have a Chat
@@ -484,9 +514,10 @@ The capital of France is Paris.
 $ ai -s 'Check the current directory size'
 Assistant:
 du -sh .
-
-Generated command: du -sh .
-Execute this command? [y/n/e] (n): e
+╭─ Command ─╮
+du -sh .
+╰───────────╯
+Execute command? [e]dit, [y]es, [n]o (n): e
 Edit command, press enter to execute:
 du -sh ./
 Output:
@@ -530,9 +561,10 @@ Certainly! Here’s a brief overview of the solar system:
 🚀 > Check the current directory size
 Assistant:
 du -sh .
-
-Generated command: du -sh .
-Execute this command? [y/n/e] (n): e
+╭─ Command ─╮
+du -sh .
+╰───────────╯
+Execute command? [e]dit, [y]es, [n]o (n): e
 Edit command, press enter to execute:
 du -sh ./
 Output:
@@ -544,11 +576,13 @@ Output:
 
 ```bash
 $ ai --shell "Find all PDF files in my Downloads folder"
-
-Generated command: find ~/Downloads -type f -name "*.pdf"
-Execute this command? [y/n]: y
-
-Executing command: find ~/Downloads -type f -name "*.pdf"
+Assistant:
+find ~/Downloads -type f -name "*.pdf"
+╭─ Command ──────────────────────────────╮
+│ find ~/Downloads -type f -name "*.pdf" │
+╰────────────────────────────────────────╯
+Execute command? [e]dit, [y]es, [n]o (n): y
+Output:
 
 /Users/username/Downloads/document1.pdf
 /Users/username/Downloads/report.pdf
@@ -562,7 +596,7 @@ YAICLI is built using several Python libraries:
 - **Typer**: Provides the command-line interface
 - **Rich**: Provides terminal content formatting and beautiful display
 - **prompt_toolkit**: Provides interactive command-line input experience
-- **requests**: Handles API requests
+- **httpx**: Handles API requests
 - **jmespath**: Parses JSON responses
 
 ## Contributing
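The configuration guide above describes `ANSWER_PATH` as a jmespath expression applied to the raw API response. A minimal standalone sketch of that extraction with the OpenAI-compatible default path follows; the response dict is made up for illustration and is not part of the package diff.

```python
import jmespath

# Fake OpenAI-style response used only to illustrate the default ANSWER_PATH.
response = {"choices": [{"message": {"content": "The capital of France is Paris."}}]}

answer = jmespath.search("choices[0].message.content", response)
print(answer)  # -> The capital of France is Paris.
```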
@@ -9,7 +9,7 @@ YAICLI is a compact yet potent command-line AI assistant, allowing you to engage
 
 Support regular and deep thinking models.
 
-> [!WARNING]
+> [!WARNING]
 > This is a work in progress, some features could change or be removed in the future.
 
 ## Features
@@ -35,6 +35,8 @@ Support regular and deep thinking models.
 
 - **Keyboard Shortcuts**:
 - Tab to switch between Chat and Execute modes
+- `↑/↓` to navigate history
+- `Ctrl+R` to search history
 
 - **History**:
 - Save and recall previous queries
@@ -95,6 +97,7 @@ ANSWER_PATH=choices[0].message.content
 # true: streaming response
 # false: non-streaming response
 STREAM=true
+CODE_THEME=monokia
 
 TEMPERATURE=0.7
 TOP_P=1.0
@@ -113,9 +116,11 @@ Below are the available configuration options and override environment variables
 - **COMPLETION_PATH**: Path for completions endpoint, default: /chat/completions, env: YAI_COMPLETION_PATH
 - **ANSWER_PATH**: Json path expression to extract answer from response, default: choices[0].message.content, env: YAI_ANSWER_PATH
 - **STREAM**: Enable/disable streaming responses, default: true, env: YAI_STREAM
+- **CODE_THEME**: Theme for code blocks, default: monokia, env: YAI_CODE_THEME
 - **TEMPERATURE**: Temperature for response generation (default: 0.7), env: YAI_TEMPERATURE
 - **TOP_P**: Top-p sampling for response generation (default: 1.0), env: YAI_TOP_P
 - **MAX_TOKENS**: Maximum number of tokens for response generation (default: 1024), env: YAI_MAX_TOKENS
+- **MAX_HISTORY**: Max history size, default: 500, env: YAI_MAX_HISTORY
 
 Default config of `COMPLETION_PATH` and `ANSWER_PATH` is OpenAI compatible. If you are using OpenAI or other OpenAI compatible LLM provider, you can use the default config.
 
@@ -168,6 +173,12 @@ If you not sure how to config `COMPLETION_PATH` and `ANSWER_PATH`, here is a gui
 ```
 We are looking for the `text` field, so the path should be 1.Key `content`, 2.First obj `[0]`, 3.Key `text`. So it should be `content.[0].text`.
 
+**CODE_THEME**
+
+You can find the list of code theme here: https://pygments.org/styles/
+
+Default: monokia
+![monikia](artwork/monokia.png)
 
 ## Usage
 
@@ -206,10 +217,10 @@ Run Options:
 ```bash
 ai -h
 
-Usage: ai [OPTIONS] [PROMPT]
-
-yaicli - Your AI interface in cli.
-
+Usage: ai [OPTIONS] [PROMPT]
+
+yaicli - Your AI interface in cli.
+
 ╭─ Arguments ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
 │ prompt [PROMPT] The prompt send to the LLM │
 ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
@@ -244,6 +255,25 @@ In Execute mode:
 3. Review the command
 4. Confirm to execute or reject
 
+### Keyboard Shortcuts
+- `Tab`: Switch between Chat and Execute modes
+- `Ctrl+C`: Exit
+- `Ctrl+R`: Search history
+- `↑/↓`: Navigate history
+
+### Stdin
+You can also pipe input to YAICLI:
+```bash
+echo "What is the capital of France?" | ai
+```
+
+```bash
+cat demo.py | ai "How to use this tool?"
+```
+
+### History
+Support max history size. Set MAX_HISTORY in config file. Default is 500.
+
 ## Examples
 
 ### Have a Chat
@@ -260,9 +290,10 @@ The capital of France is Paris.
 $ ai -s 'Check the current directory size'
 Assistant:
 du -sh .
-
-Generated command: du -sh .
-Execute this command? [y/n/e] (n): e
+╭─ Command ─╮
+du -sh .
+╰───────────╯
+Execute command? [e]dit, [y]es, [n]o (n): e
 Edit command, press enter to execute:
 du -sh ./
 Output:
@@ -306,9 +337,10 @@ Certainly! Here’s a brief overview of the solar system:
 🚀 > Check the current directory size
 Assistant:
 du -sh .
-
-Generated command: du -sh .
-Execute this command? [y/n/e] (n): e
+╭─ Command ─╮
+du -sh .
+╰───────────╯
+Execute command? [e]dit, [y]es, [n]o (n): e
 Edit command, press enter to execute:
 du -sh ./
 Output:
@@ -320,11 +352,13 @@ Output:
 
 ```bash
 $ ai --shell "Find all PDF files in my Downloads folder"
-
-Generated command: find ~/Downloads -type f -name "*.pdf"
-Execute this command? [y/n]: y
-
-Executing command: find ~/Downloads -type f -name "*.pdf"
+Assistant:
+find ~/Downloads -type f -name "*.pdf"
+╭─ Command ──────────────────────────────╮
+│ find ~/Downloads -type f -name "*.pdf" │
+╰────────────────────────────────────────╯
+Execute command? [e]dit, [y]es, [n]o (n): y
+Output:
 
 /Users/username/Downloads/document1.pdf
 /Users/username/Downloads/report.pdf
@@ -338,7 +372,7 @@ YAICLI is built using several Python libraries:
 - **Typer**: Provides the command-line interface
 - **Rich**: Provides terminal content formatting and beautiful display
 - **prompt_toolkit**: Provides interactive command-line input experience
-- **requests**: Handles API requests
+- **httpx**: Handles API requests
 - **jmespath**: Parses JSON responses
 
 ## Contributing
@@ -1,6 +1,6 @@
 [project]
 name = "yaicli"
-version = "0.0.14"
+version = "0.0.16"
 description = "A simple CLI tool to interact with LLM"
 authors = [{ name = "belingud", email = "im.victor@qq.com" }]
 readme = "README.md"
@@ -2,9 +2,10 @@ import configparser
 import json
 import platform
 import subprocess
+import sys
 import time
 from os import getenv
-from os.path import basename, pathsep
+from os.path import basename, exists, pathsep, devnull
 from pathlib import Path
 from typing import Annotated, Optional, Union
 
@@ -14,7 +15,7 @@ import typer
 from distro import name as distro_name
 from prompt_toolkit import PromptSession, prompt
 from prompt_toolkit.completion import WordCompleter
-from prompt_toolkit.history import FileHistory
+from prompt_toolkit.history import FileHistory, _StrOrBytesPath
 from prompt_toolkit.key_binding import KeyBindings, KeyPressEvent
 from prompt_toolkit.keys import Keys
 from rich.console import Console
@@ -56,9 +57,11 @@ DEFAULT_CONFIG_MAP = {
     "COMPLETION_PATH": {"value": "chat/completions", "env_key": "YAI_COMPLETION_PATH"},
     "ANSWER_PATH": {"value": "choices[0].message.content", "env_key": "YAI_ANSWER_PATH"},
     "STREAM": {"value": "true", "env_key": "YAI_STREAM"},
+    "CODE_THEME": {"value": "monokia", "env_key": "YAI_CODE_THEME"},
     "TEMPERATURE": {"value": "0.7", "env_key": "YAI_TEMPERATURE"},
     "TOP_P": {"value": "1.0", "env_key": "YAI_TOP_P"},
     "MAX_TOKENS": {"value": "1024", "env_key": "YAI_MAX_TOKENS"},
+    "MAX_HISTORY": {"value": "500", "env_key": "YAI_MAX_HISTORY"},
 }
 
 DEFAULT_CONFIG_INI = """[core]
@@ -79,10 +82,13 @@ ANSWER_PATH=choices[0].message.content
 # true: streaming response
 # false: non-streaming response
 STREAM=true
+CODE_THEME=monokia
 
 TEMPERATURE=0.7
 TOP_P=1.0
-MAX_TOKENS=1024"""
+MAX_TOKENS=1024
+
+MAX_HISTORY=500"""
 
 app = typer.Typer(
     name="yaicli",
@@ -98,6 +104,64 @@ class CasePreservingConfigParser(configparser.RawConfigParser):
         return optionstr
 
 
+class LimitedFileHistory(FileHistory):
+    def __init__(self, filename: _StrOrBytesPath, max_entries: int = 500, trim_every: int = 5):
+        """Limited file history
+        Args:
+            filename (str): path to history file
+            max_entries (int): maximum number of entries to keep
+            trim_every (int): trim history every `trim_every` appends
+
+        Example:
+            >>> history = LimitedFileHistory("~/.yaicli_history", max_entries=500, trim_every=10)
+            >>> history.append_string("echo hello")
+            >>> history.append_string("echo world")
+            >>> session = PromptSession(history=history)
+        """
+        self.max_entries = max_entries
+        self._append_count = 0
+        self._trim_every = trim_every
+        super().__init__(filename)
+
+    def store_string(self, string: str) -> None:
+        # Call the original method to deposit a new record
+        super().store_string(string)
+
+        self._append_count += 1
+        if self._append_count >= self._trim_every:
+            self._trim_history()
+            self._append_count = 0
+
+    def _trim_history(self):
+        if not exists(self.filename):
+            return
+
+        with open(self.filename, "r", encoding="utf-8") as f:
+            lines = f.readlines()
+
+        # By record: each record starts with "# timestamp" followed by a number of "+lines".
+        entries = []
+        current_entry = []
+
+        for line in lines:
+            if line.startswith("# "):
+                if current_entry:
+                    entries.append(current_entry)
+                current_entry = [line]
+            elif line.startswith("+") or line.strip() == "":
+                current_entry.append(line)
+
+        if current_entry:
+            entries.append(current_entry)
+
+        # Keep the most recent max_entries row (the next row is newer)
+        trimmed_entries = entries[-self.max_entries :]
+
+        with open(self.filename, "w", encoding="utf-8") as f:
+            for entry in trimmed_entries:
+                f.writelines(entry)
+
+
 class CLI:
     CONFIG_PATH = Path("~/.config/yaicli/config.ini").expanduser()
 
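The `LimitedFileHistory` class added above trims prompt_toolkit's history file by grouping it into entries. The sketch below is a standalone illustration, not code from the package: it shows the on-disk `FileHistory` format that `_trim_history` parses (a `# <timestamp>` header followed by `+`-prefixed input lines per entry) and a hypothetical way to wire the class into a `PromptSession`.

```python
from prompt_toolkit import PromptSession

# prompt_toolkit's FileHistory stores one record per input, e.g.:
#
#   # 2025-01-01 10:00:00.000000
#   +echo hello
#
#   # 2025-01-01 10:01:00.000000
#   +What is the capital of France?
#
# _trim_history() splits the file on the "# " header lines and keeps only
# the newest max_entries records.

# Hypothetical usage of the class defined in the diff above; the path is an example.
history = LimitedFileHistory("/tmp/yaicli_history_demo", max_entries=500, trim_every=5)
history.append_string("echo hello")        # every trim_every-th append triggers a trim
session = PromptSession(history=history)   # the REPL then uses the bounded history
```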
 
@@ -105,7 +169,16 @@ class CLI:
         self.verbose = verbose
         self.console = Console()
         self.bindings = KeyBindings()
+        # Disable nonatty warning
+        _origin_stderr = None
+        if not sys.stderr.isatty():
+            _origin_stderr = sys.stderr
+            sys.stderr = open(devnull, "w")
         self.session = PromptSession(key_bindings=self.bindings)
+        # Restore stderr
+        if _origin_stderr:
+            sys.stderr.close()
+            sys.stderr = _origin_stderr
         self.config = {}
         self.history: list[dict[str, str]] = []
         self.max_history_length = 25
@@ -124,7 +197,9 @@ class CLI:
             key_bindings=self.bindings,
             completer=WordCompleter(["/clear", "/exit", "/his"]),
             complete_while_typing=True,
-            history=FileHistory(Path("~/.yaicli_history").expanduser()),
+            history=LimitedFileHistory(
+                Path("~/.yaicli_history").expanduser(), max_entries=int(self.config["MAX_HISTORY"])
+            ),
             enable_history_search=True,
         )
 
@@ -194,7 +269,7 @@ class CLI:
         if current_platform in ("Windows", "nt"):
             is_powershell = len(getenv("PSModulePath", "").split(pathsep)) >= 3
             return "powershell.exe" if is_powershell else "cmd.exe"
-        return basename(getenv("SHELL", "/bin/sh"))
+        return basename(getenv("SHELL", None) or "/bin/sh")
 
     def _filter_command(self, command: str) -> Optional[str]:
         """Filter out unwanted characters from command
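The one-line change above affects shell detection when `SHELL` is set but empty: the old default argument never applies in that case, while `or "/bin/sh"` still falls back. A standalone sketch of the difference (not package code):

```python
from os import environ, getenv
from os.path import basename

environ["SHELL"] = ""  # e.g. an environment where SHELL exists but is empty

print(basename(getenv("SHELL", "/bin/sh")))           # old behaviour -> "" (empty shell name)
print(basename(getenv("SHELL", None) or "/bin/sh"))   # new behaviour -> "sh"
```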
@@ -285,8 +360,7 @@ class CLI:
         if not line:
             return None
 
-        if isinstance(line, bytes):
-            line = line.decode("utf-8")
+        line = str(line)
         if not line.startswith("data: "):
             return None
 
@@ -325,12 +399,12 @@ class CLI:
     def _print_stream(self, response: httpx.Response) -> str:
         """Print response from LLM in streaming mode"""
         self.console.print("Assistant:", style="bold green")
-        full_completion = ""
+        full_content = ""
         in_reasoning = False
         cursor_chars = ["_", " "]
         cursor_index = 0
 
-        with Live() as live:
+        with Live(console=self.console) as live:
             for line in response.iter_lines():
                 json_data = self._parse_stream_line(line)
                 if not json_data:
@@ -340,26 +414,25 @@ class CLI:
                 reason = self.get_reasoning_content(delta)
 
                 if reason is not None:
-                    full_completion, in_reasoning = self._process_reasoning_content(
-                        reason, full_completion, in_reasoning
-                    )
+                    full_content, in_reasoning = self._process_reasoning_content(reason, full_content, in_reasoning)
                 else:
-                    full_completion, in_reasoning = self._process_regular_content(
-                        delta.get("content", "") or "", full_completion, in_reasoning
+                    full_content, in_reasoning = self._process_regular_content(
+                        delta.get("content", "") or "", full_content, in_reasoning
                     )
 
-                live.update(Markdown(markup=full_completion + cursor_chars[cursor_index]), refresh=True)
+                cursor = cursor_chars[cursor_index]
+                live.update(Markdown(markup=full_content + cursor, code_theme=self.config["CODE_THEME"]), refresh=True)
                 cursor_index = (cursor_index + 1) % 2
                 time.sleep(0.005)  # Slow down the printing speed, avoiding screen flickering
-            live.update(Markdown(markup=full_completion), refresh=True)
-        return full_completion
+            live.update(Markdown(markup=full_content, code_theme=self.config["CODE_THEME"]), refresh=True)
+        return full_content
 
     def _print_normal(self, response: httpx.Response) -> str:
         """Print response from LLM in non-streaming mode"""
         self.console.print("Assistant:", style="bold green")
-        full_completion = jmespath.search(self.config.get("ANSWER_PATH", "choices[0].message.content"), response.json())
-        self.console.print(Markdown(full_completion + "\n"))
-        return full_completion
+        full_content = jmespath.search(self.config.get("ANSWER_PATH", "choices[0].message.content"), response.json())
+        self.console.print(Markdown(full_content + "\n", code_theme=self.config["CODE_THEME"]))
+        return full_content
 
     def get_prompt_tokens(self) -> list[tuple[str, str]]:
         """Return prompt tokens for current mode"""
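The hunk above threads the new `CODE_THEME` option into rich's `Markdown` renderer, which hands it to Pygments for fenced code blocks. A minimal standalone sketch of that rich API; the theme name and snippet are only examples, not taken from the package:

```python
from rich.console import Console
from rich.markdown import Markdown

console = Console()
# code_theme selects the Pygments style used when rendering fenced code blocks.
console.print(Markdown("```python\nprint('hello')\n```", code_theme="monokai"))
```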
@@ -447,6 +520,7 @@ class CLI:
 ██ ██ ██ ██ ██ ██ ██
 ██ ██ ██ ██ ██████ ███████ ██
 """)
+        self.console.print("↑/↓: navigate in history")
         self.console.print("Press TAB to change in chat and exec mode", style="bold")
         self.console.print("Type /clear to clear chat history", style="bold")
         self.console.print("Type /his to see chat history", style="bold")
@@ -489,6 +563,18 @@ class CLI:
     def run(self, chat: bool, shell: bool, prompt: str) -> None:
         """Run the CLI"""
         self.load_config()
+        if self.verbose:
+            self.console.print(f"CODE_THEME: {self.config['CODE_THEME']}")
+            self.console.print(f"ANSWER_PATH: {self.config['ANSWER_PATH']}")
+            self.console.print(f"COMPLETION_PATH: {self.config['COMPLETION_PATH']}")
+            self.console.print(f"BASE_URL: {self.config['BASE_URL']}")
+            self.console.print(f"MODEL: {self.config['MODEL']}")
+            self.console.print(f"SHELL_NAME: {self.config['SHELL_NAME']}")
+            self.console.print(f"OS_NAME: {self.config['OS_NAME']}")
+            self.console.print(f"STREAM: {self.config['STREAM']}")
+            self.console.print(f"TEMPERATURE: {self.config['TEMPERATURE']}")
+            self.console.print(f"TOP_P: {self.config['TOP_P']}")
+            self.console.print(f"MAX_TOKENS: {self.config['MAX_TOKENS']}")
         if not self.config.get("API_KEY"):
             self.console.print(
                 "[yellow]API key not set. Please set in ~/.config/yaicli/config.ini or AI_API_KEY env[/]"
@@ -518,6 +604,11 @@ def main(
     template: Annotated[bool, typer.Option("--template", help="Show the config template.")] = False,
 ):
     """yaicli - Your AI interface in cli."""
+    # Check for stdin input (from pipe or redirect)
+    if not sys.stdin.isatty():
+        stdin_content = sys.stdin.read()
+        prompt = f"{stdin_content}\n\n{prompt}"
+
     if prompt == "":
         typer.echo("Empty prompt, ignored")
         return
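The final hunk prepends piped stdin to the prompt argument before the empty-prompt check. A standalone sketch of that behaviour, assuming a small script run outside the package:

```python
import sys

prompt = "How to use this tool?"
# When input arrives via a pipe or redirect (e.g. `cat demo.py | ai "..."`),
# stdin is not a TTY and its contents are prepended to the prompt.
if not sys.stdin.isatty():
    prompt = f"{sys.stdin.read()}\n\n{prompt}"
print(prompt)
```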