llm-shell 0.7.0 → 0.7.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 89ffe0d7aacb89de310acc7012fb8b383d9c27e0059536ff9ce7ea855f604c37
- data.tar.gz: eb1cd003da504a27f6859898c21339406536a4979cf0fea6f1d6740bb2ea9328
+ metadata.gz: '0985d061b38dccb46cdeebd598133d17b634cf035f6b9ca688f48539df553c09'
+ data.tar.gz: fe594a519302eb963f727e967b6f5bd596d7b97a3b3cebe52b56a1a466b8faf8
  SHA512:
- metadata.gz: aa30fcd97acd567fa2c53d7c5936063e0442a3b9e8a2a4a2d721557dd704af57710f4e3d6389c0da9d9e2bf9babacb2e74f51f800f239c65a18d8da4d429f9c2
- data.tar.gz: '0900fcbd3c39c95c3941494f93863b8a11f78a69305c4d8dce31c3a87a2391f4940b97743c3f6812e3acb185be40d85f2a3003a5d75de19cc7f94153773f380c'
+ metadata.gz: 0e59a342f4356617da679699f483babae1eba0f05cabb4c9ae72ef0993a8d5d71d6565aa508c764f5e4e41f4267159ded1685821425624c99215b3a0c5257a9d
+ data.tar.gz: 7239d794cb6dc952d61286c510347d18da9bc15115ab4c45ab2fb51a44a05a0d5e347309629da927ee7965b93f9fab870f8dcb52a909680a2dc7499ee79f9ed3
data/README.md CHANGED
@@ -5,7 +5,7 @@ console that can interact with multiple Large Language Models
  (LLMs). It serves as both a demo of the [llmrb/llm](https://github.com/llmrb/llm)
  library and a tool to help improve the library through real-world
  usage and feedback. Jump to the [Demos](#demos) section to see
- it in action!
+ it in action.
 
  ## Features
 
@@ -30,20 +30,15 @@ it in action!
  ## Demos
 
  <details>
- <summary><b>1. Tools: "system" function</b></summary>
+ <summary><b>1. An introduction to tool calls</b></summary>
  <img src="share/llm-shell/examples/toolcalls.gif/">
  </details>
 
  <details>
- <summary><b>2. Files: import at runtime</b></summary>
+ <summary><b>2. Add files as conversation context</b></summary>
  <img src="share/llm-shell/examples/files-runtime.gif">
  </details>
 
- <details>
- <summary><b>3. Files: import at boot time</b></summary>
- <img src="share/llm-shell/examples/files-boottime.gif">
- </details>
-
  ## Customization
 
  #### Functions
@@ -79,13 +74,10 @@ end
 
  llm-shell can be extended with your own console commands. This can be
  done by creating a Ruby file in the `~/.llm-shell/commands/` directory &ndash;
- with one file per command. The commands are loaded at boot time. See the
- [file-import](lib/llm/shell/commands/file_import.rb),
- [dir-import](lib/llm/shell/commands/dir_import.rb),
- [show-history](lib/llm/shell/commands/show_history.rb),
- [clear-screen](lib/llm/shell/commands/clear_screen.rb)
- and [system-prompt](lib/llm/shell/commands/system_prompt.rb)
- commands for more realistic examples:
+ with one file per command. The commands are loaded at boot time.
+ See the
+ [commands/](lib/llm/shell/commands/)
+ directory for more examples:
 
  ```ruby
  LLM.command "say-hello" do |cmd|
@@ -127,16 +119,12 @@ path `${HOME}/.llm-shell/config.yml` and it has the following format:
  # ~/.config/llm-shell.yml
  openai:
    key: YOURKEY
- model: gpt-4o-mini
  gemini:
    key: YOURKEY
- model: gemini-2.0-flash-001
  anthropic:
    key: YOURKEY
- model: claude-3-7-sonnet-20250219
  deepseek:
    key: YOURKEY
- model: deepseek-chat
  ollama:
    host: localhost
    model: deepseek-coder:6.7b
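For reference, the configuration that results from this hunk — with the hosted providers' `model:` keys removed while ollama keeps its pinned model — would look roughly like this (a sketch; the key values are placeholders from the original example):

```yaml
# ~/.config/llm-shell.yml (after 0.7.2)
openai:
  key: YOURKEY
gemini:
  key: YOURKEY
anthropic:
  key: YOURKEY
deepseek:
  key: YOURKEY
ollama:
  host: localhost
  model: deepseek-coder:6.7b
```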
@@ -160,6 +148,8 @@ Usage: llm-shell [OPTIONS]
  -o, --port [PORT] Optional. Sometimes required by ollama.
  -f, --files [GLOB] Optional. Glob pattern(s) separated by a comma.
  -t, --tools [TOOLS] Optional. One or more tool names to load automatically.
+ -r, --prompt [PROMPT] Optional. The prompt to use.
+ -v, --version Optional. Print the version and exit
  ```
 
  ## Install
@@ -3,7 +3,14 @@
  class LLM::Shell
  class Command
  require_relative "commands/utils"
- Context = Struct.new(:bot, :io)
+
+ ##
+ # @api private
+ class Context < Struct.new(:bot, :io)
+ def pager(...)
+ LLM::Shell.pager(...)
+ end
+ end
 
  ##
  # Returns the underlying command object
@@ -0,0 +1,58 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell::Command
+ class Help
+ require_relative "utils"
+ include Utils
+
+ ##
+ # @param [LLM::Shell::Context] context
+ # The context of the command
+ # @return [LLM::Shell::Command::Help]
+ def initialize(context)
+ @context = context
+ end
+
+ ##
+ # Prints help
+ # @return [void]
+ def call
+ pager do |io|
+ render_commands(io)
+ render_functions(io)
+ end
+ end
+
+ private
+
+ def render_commands(io)
+ io.print(Paint["Commands", :bold, :underline], "\n\n")
+ commands.each.with_index(1) do |command, index|
+ io.puts(command_name(command, index, :red))
+ io.puts(command_desc(command), "\n\n")
+ end
+ end
+
+ def render_functions(io)
+ io.print(Paint["Functions", :bold, :underline], "\n\n")
+ if functions.empty?
+ io.print(Paint["No functions available", :yellow], "\n\n")
+ else
+ functions.each.with_index(1) do |fn, index|
+ io.print(command_name(fn, index, :green), "\n")
+ io.print(command_desc(fn), "\n\n")
+ end
+ end
+ end
+
+ def commands = LLM.commands.values.sort_by(&:name)
+ def functions = LLM.functions.values.sort_by(&:name)
+ def command_name(command, index, bgcolor) = [Paint[" #{index} ", :white, bgcolor, :bold], " ", Paint[command.name, :bold]].join
+ def command_desc(command) = command.description || "No description"
+
+ LLM.command "help" do |cmd|
+ cmd.description "Shows help"
+ cmd.register(self)
+ end
+ end
+ end
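The new `help` command registers itself through the `LLM.command` DSL at the bottom of the file. A minimal, self-contained mock of that registration flow (a hypothetical stand-in for illustration — the real registry and DSL live inside llm-shell) might look like:

```ruby
# Hypothetical mock of a command-registration DSL, for illustration only.
# A command declares a name and description, then registers its class.
REGISTRY = {}

CommandSpec = Struct.new(:name, :desc, :klass) do
  # Mirrors the fluent style used inside the LLM.command block.
  def description(text)
    self.desc = text
  end

  def register(klass)
    self.klass = klass
    REGISTRY[name] = self
  end
end

def command(name, &block)
  block.call(CommandSpec.new(name))
end

class Help; end

command "help" do |cmd|
  cmd.description "Shows help"
  cmd.register(Help)
end
```

With this shape, `cmd.register(self)` inside a class body is what links the command name to the class that implements `call`.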
@@ -24,7 +24,7 @@ class LLM::Shell::Command
  private
 
  def emit
- IO.popen("less -FRX", "w") do |io|
+ pager do |io|
  messages.each.with_index do |message, index|
  next if index <= 1
  io << render(message) << "\n"
@@ -16,7 +16,11 @@ class LLM::Shell::Command
  ##
  # Emits the system prompt to standard output
  # @return [void]
- def call = puts render(bot.messages.to_a[0])
+ def call
+ pager do |io|
+ io.write render(bot.messages[0])
+ end
+ end
 
  private
 
@@ -13,8 +13,9 @@ class LLM::Shell::Command
  ].join("\n")
  end
 
- def file_pattern = /\A<file path=(.+?)>/
  def bot = @context.bot
  def io = @context.io
+ def pager(...) = @context.pager(...)
+ def file_pattern = /\A<file path=(.+?)>/
  end
  end
@@ -49,6 +49,8 @@ class LLM::Shell
49
  "\n\n", coderay(code, lang),
  "\n", Paint["<<< #{lang}", :blue, :bold]].join
  end
+ when :smart_quote
+ smart_quotes[node.value]
  when :text
  node.value
  else
@@ -56,13 +58,6 @@ class LLM::Shell
  end
  end
 
- def levels
- {
- 1 => :green, 2 => :blue, 3 => :green,
- 4 => :yellow, 5 => :red, 6 => :purple
- }
- end
-
  def preprocessor(text)
  text
  .gsub(/([^\n])\n(#+ )/, "\\1\n\n\\2")
@@ -77,5 +72,19 @@ class LLM::Shell
  lang = "text"
  retry
  end
+
+ def levels
+ {
+ 1 => :green, 2 => :blue, 3 => :green,
+ 4 => :yellow, 5 => :red, 6 => :purple
+ }
+ end
+
+ def smart_quotes
+ {
+ lsquo: "'", rsquo: "'",
+ ldquo: '"', rdquo: '"'
+ }
+ end
  end
  end
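The `smart_quotes` table added here maps kramdown's `:smart_quote` element values back to plain ASCII quotes for terminal output. A reduced sketch of that dispatch (simplified: node types and values are modeled as bare symbols rather than kramdown element objects):

```ruby
# Simplified sketch of the formatter's dispatch on node type.
SMART_QUOTES = {
  lsquo: "'", rsquo: "'",
  ldquo: '"', rdquo: '"'
}.freeze

def render_node(type, value)
  case type
  when :smart_quote then SMART_QUOTES[value] # e.g. :ldquo renders as a plain double quote
  when :text        then value               # plain text passes through
  else ""                                    # other node types handled elsewhere
  end
end
```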
@@ -85,9 +85,9 @@ class LLM::Shell
  end
 
  def emit
- IO.popen("less -FRX", "w") do
- _1.write formatter(unread).format!(:user), "\n"
- _1.write formatter(unread).format!(:assistant), "\n"
+ LLM::Shell.pager do |io|
+ io.write formatter(unread).format!(:user), "\n"
+ io.write formatter(unread).format!(:assistant), "\n"
  end unless unread.empty?
  end
 
@@ -4,5 +4,5 @@ module LLM
  end unless defined?(LLM)
 
  class LLM::Shell
- VERSION = "0.7.0"
+ VERSION = "0.7.2"
  end
data/lib/llm/shell.rb CHANGED
@@ -24,6 +24,13 @@ class LLM::Shell
  # Load all commands
  Dir[File.join(__dir__, "shell", "commands", "*.rb")].each { require(_1) }
 
+ ##
+ # Opens a pager
+ # @return [void]
+ def self.pager(...)
+ IO.popen("less -FRX", "w", ...)
+ end
+
  ##
  # @return [String]
  def self.home
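The new `LLM::Shell.pager` helper centralizes the `IO.popen("less -FRX", "w")` calls previously inlined in each command (the flags are standard less options: `-F` quits if the output fits on one screen, `-R` passes ANSI colors through, `-X` skips clearing the screen on exit). A sketch of the same pattern, with the pager command parameterized — an assumption made here for testability; llm-shell itself hardcodes `less -FRX`:

```ruby
# Sketch of the pager pattern: spawn a pager process and yield its
# stdin as an IO, so callers can stream formatted output into it.
# IO.popen's block form waits for the child process to exit.
def with_pager(command = "less -FRX", &block)
  IO.popen(command, "w", &block)
end
```

A command's `call` can then write through the yielded IO, e.g. `with_pager { |io| io.puts help_text }`, and short output never opens a full-screen pager thanks to `-F`.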
data/llm-shell.gemspec ADDED
@@ -0,0 +1,45 @@
+ # frozen_string_literal: true
+
+ require_relative "lib/llm/shell/version"
+
+ Gem::Specification.new do |spec|
+ spec.name = "llm-shell"
+ spec.version = LLM::Shell::VERSION
+ spec.authors = ["Antar Azri", "0x1eef"]
+ spec.email = ["azantar@proton.me", "0x1eef@proton.me"]
+
+ spec.summary = "llm-shell is an extensible, developer-oriented " \
+ "command-line console that can interact with multiple " \
+ "Large Language Models (LLMs)."
+ spec.description = spec.summary
+ spec.homepage = "https://github.com/llmrb/llm-shell"
+ spec.license = "0BSD"
+ spec.required_ruby_version = ">= 3.2"
+
+ spec.metadata["homepage_uri"] = spec.homepage
+ spec.metadata["source_code_uri"] = spec.homepage
+
+ spec.files = Dir[
+ "README.md", "LICENSE",
+ "lib/*.rb", "lib/**/*.rb",
+ "libexec/*", "libexec/**/*",
+ "share/llm-shell/prompts/*",
+ "bin/*", "llm-shell.gemspec"
+ ]
+ spec.require_paths = ["lib"]
+ spec.executables = ["llm-shell"]
+ spec.add_dependency "llm.rb", "~> 0.10.1"
+ spec.add_dependency "paint", "~> 2.1"
+ spec.add_dependency "kramdown", "~> 2.5"
+ spec.add_dependency "coderay", "~> 1.1"
+ spec.add_development_dependency "webmock", "~> 3.24.0"
+ spec.add_development_dependency "yard", "~> 0.9.37"
+ spec.add_development_dependency "kramdown", "~> 2.4"
+ spec.add_development_dependency "webrick", "~> 1.8"
+ spec.add_development_dependency "test-cmd.rb", "~> 0.12.0"
+ spec.add_development_dependency "rake", "~> 13.0"
+ spec.add_development_dependency "rspec", "~> 3.0"
+ spec.add_development_dependency "standard", "~> 1.40"
+ spec.add_development_dependency "vcr", "~> 6.0"
+ spec.add_development_dependency "dotenv", "~> 2.8"
+ end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: llm-shell
  version: !ruby/object:Gem::Version
- version: 0.7.0
+ version: 0.7.2
  platform: ruby
  authors:
  - Antar Azri
@@ -16,14 +16,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.9'
+ version: 0.10.1
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.9'
+ version: 0.10.1
  - !ruby/object:Gem::Dependency
  name: paint
  requirement: !ruby/object:Gem::Requirement
@@ -226,6 +226,7 @@ files:
  - lib/llm/shell/commands/clear_screen.rb
  - lib/llm/shell/commands/dir_import.rb
  - lib/llm/shell/commands/file_import.rb
+ - lib/llm/shell/commands/help.rb
  - lib/llm/shell/commands/show_history.rb
  - lib/llm/shell/commands/system_prompt.rb
  - lib/llm/shell/commands/utils.rb
@@ -239,6 +240,7 @@ files:
  - lib/llm/shell/repl.rb
  - lib/llm/shell/version.rb
  - libexec/llm-shell/shell
+ - llm-shell.gemspec
  - share/llm-shell/prompts/default.txt
  homepage: https://github.com/llmrb/llm-shell
  licenses:
@@ -260,7 +262,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.8
+ rubygems_version: 3.7.1
  specification_version: 4
  summary: llm-shell is an extensible, developer-oriented command-line console that
  can interact with multiple Large Language Models (LLMs).