llm-shell 0.2.0 → 0.4.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 629a62d1b7f2fb7c4de9149c1af8e640c1ff08ccbef3a581d006cd99d331e4e2
- data.tar.gz: 9e8808cba408a0474a938b7de8df00e1b72032635b00928bdd0a7a78f57d1521
+ metadata.gz: 033fc0fed08fe04d77063406e076a946ca986b700706f7a4e1ee9ed252fd4f48
+ data.tar.gz: 0e1fb76971b1012cf669cef11fc3ede3b1c8ed03e6c99324b296d686782c36de
  SHA512:
- metadata.gz: 4552e7ef324acf1f800c42cffc5fe7812e581389f1a544a447c6963611011d66c4a3e19ec2fb1d5f142fbf822b8dd8a0264ed8de6a40427113b597541d099170
- data.tar.gz: 8c453dfd38ff0c2f94707ab460a47afeba52efeccc808ff68c5a9c1007040855104e1954548222367459eca4a6245a174b78cd9f034baa8f301d2451a78b0288
+ metadata.gz: f047af335441b63debb894d1fee2b2eb0dd817fa56b180a6bf1e820d52deeb6093d26b02c98b2997aac80e67f0b2b74dc213f7e26a407399f4832840531c717e
+ data.tar.gz: f0a08f41e54e86e3c838418da505fca61a596f2a242847ccd1547468fecdef33d6674277c5ffa94b959bddce867686ef9e74b735660f68a097992f1ce777095e
data/README.md CHANGED
@@ -1,7 +1,7 @@
  ## About

  llm-shell is an extensible, developer-oriented command-line
- utility that can interact with multiple Large Language Models
+ console that can interact with multiple Large Language Models
  (LLMs). It serves as both a demo of the [llmrb/llm](https://github.com/llmrb/llm)
  library and a tool to help improve the library through real-world
  usage and feedback. Jump to the [Demos](#demos) section to see
@@ -9,12 +9,23 @@ it in action!

  ## Features

+ #### General
+
  - 🌟 Unified interface for multiple Large Language Models (LLMs)
  - 🤝 Supports Gemini, OpenAI, Anthropic, LlamaCpp and Ollama
+
+ #### Customize
+
  - 📤 Attach local files as conversation context
  - 🔧 Extend with your own functions and tool calls
- - 📝 Advanced Markdown formatting and output
+ - 🚀 Extend with your own console commands
+
+ #### Shell
+
+ - 🤖 Builtin auto-complete powered by Readline
+ - 🎨 Builtin syntax highlighting powered by Coderay
  - 📄 Deploys the less pager for long outputs
+ - 📝 Advanced Markdown formatting and output

  ## Demos

@@ -37,22 +48,16 @@ it in action!

  #### Functions

+ > For security and safety reasons, a user must confirm the execution of
+ > all function calls before they happen and also add the function to
+ > an allowlist before it will be loaded by llm-shell automatically
+ > at boot time.
+
  The `~/.llm-shell/tools/` directory can contain one or more
  [llmrb/llm](https://github.com/llmrb/llm) functions that the
  LLM can call once you confirm you are okay with executing the
  code locally (along with any arguments it provides). See the
- earlier demo for an example.
-
- For security and safety reasons, a user must confirm the execution of
- all function calls before they happen and also add the function to
- an allowlist before it will be loaded by llm-shell automatically
- at boot time. See below for more details on how this can be done.
-
- An LLM function generally looks like this, and it can be dropped
- into the `~/.llm-shell/tools/` directory. This function is the one
- from the demo earlier, and I saved it as `~/.llm-shell/tools/system.rb`.
- The function's return value is relayed back to the LLM.
-
+ earlier demo for an example:

  ```ruby
  LLM.function(:system) do |fn|
@@ -61,11 +66,31 @@ LLM.function(:system) do |fn|
  schema.object(command: schema.string.required)
  end
  fn.define do |params|
- `#{params.command}`
+ ro, wo = IO.pipe
+ re, we = IO.pipe
+ Process.wait Process.spawn(params.command, out: wo, err: we)
+ [wo,we].each(&:close)
+ {stderr: re.read, stdout: ro.read}
  end
  end
  ```
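
The replacement body in the hunk above swaps the old backtick call for a pipe-based capture that returns stdout and stderr separately. A self-contained sketch of that pattern (note: because both pipes are read only after `Process.wait`, a child that writes more than the pipe buffer can stall; the stdlib's `Open3.capture3` reads concurrently and avoids this):

```ruby
# Sketch of the capture pattern used by the new function body: wire the
# child's stdout/stderr to pipes, wait for it to exit, then read both.
def run(command)
  ro, wo = IO.pipe
  re, we = IO.pipe
  Process.wait Process.spawn(command, out: wo, err: we)
  [wo, we].each(&:close) # close write ends so the reads can reach EOF
  {stdout: ro.read, stderr: re.read}
end
```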

+ #### Commands
+
+ llm-shell can be extended with your own console commands. This can be
+ done by creating a Ruby file in the `~/.llm-shell/commands/` directory –
+ with one file per command. The commands are loaded at boot time. See the
+ [import-file](lib/llm/shell/commands/import_file.rb)
+ command for a realistic example:
+
+ ```ruby
+ LLM.command "say-hello" do |cmd|
+ cmd.description "Say hello to somebody"
+ cmd.define do |name|
+ io.rewind.print "Hello #{name}!"
+ end
+ end
+ ```
  ## Settings

  #### YAML
@@ -103,13 +128,13 @@ tools:

  ```bash
  Usage: llm-shell [OPTIONS]
- -p, --provider NAME Required. Options: gemini, openai, anthropic, or ollama.
- -k, --key [KEY] Optional. Required by gemini, openai, and anthropic.
- -m, --model [MODEL] Optional. The name of a model.
- -h, --host [HOST] Optional. Sometimes required by ollama.
- -o, --port [PORT] Optional. Sometimes required by ollama.
- -f, --files [GLOB] Optional. Glob pattern(s) separated by a comma.
- -t, --tools [TOOLS] Optional. One or more tool names to load automatically.
+ -p, --provider NAME Required. Options: gemini, openai, anthropic, ollama or llamacpp.
+ -k, --key [KEY] Optional. Required by gemini, openai, and anthropic.
+ -m, --model [MODEL] Optional. The name of a model.
+ -h, --host [HOST] Optional. Sometimes required by ollama.
+ -o, --port [PORT] Optional. Sometimes required by ollama.
+ -f, --files [GLOB] Optional. Glob pattern(s) separated by a comma.
+ -t, --tools [TOOLS] Optional. One or more tool names to load automatically.
  ```

  ## Install
@@ -4,17 +4,19 @@ class LLM::Shell::Command
  module Extension
  ##
  # @example
- # LLM.command do |cmd|
- # cmd.name "hello"
+ # LLM.command(:hello) do |cmd|
  # cmd.define do |name|
  # io.rewind.print("Hello #{name}")
  # end
  # end
+ # @param [String] name
+ # The name of the command
  # @yieldparam [LLM::Shell::Command] cmd
  # Yields an instance of LLM::Shell::Command
  # @return [void]
- def command
+ def command(name)
  cmd = LLM::Shell::Command.new
+ cmd.name(name) if name
  yield cmd
  commands[cmd.name] = cmd
  end
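
The hunk above changes `command` to take the name as an argument, so `LLM.command(:hello)` seeds `cmd.name` before the block runs. A simplified, self-contained sketch of that registration flow (`Command` and `REGISTRY` here are illustrative stand-ins, not the real classes):

```ruby
# Illustrative stand-ins for LLM::Shell::Command and the commands hash.
class Command
  attr_reader :object
  def name(n = nil) = n ? (@name = n.to_s) : @name
  def define(&blk) = (@object = blk)
end

REGISTRY = {}

# Mirrors Extension#command: seed the name from the argument, then let
# the block configure (or override) the rest before registration.
def command(name)
  cmd = Command.new
  cmd.name(name) if name
  yield cmd
  REGISTRY[cmd.name] = cmd
end

command("say-hello") do |cmd|
  cmd.define { |who| "Hello #{who}!" }
end
```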
@@ -4,10 +4,16 @@ class LLM::Shell
  class Command
  Context = Struct.new(:bot, :io)

+ ##
+ # Returns the underlying command object
+ # @return [Class, #call]
+ attr_reader :object
+
  ##
  # Set or get the command name
  # @param [String, nil] name
  # The name of the command
+ # @return [String]
  def name(name = nil)
  if name
  @name = name
@@ -16,6 +22,19 @@ class LLM::Shell
  end
  end

+ ##
+ # Set or get the command description
+ # @param [String, nil] desc
+ # The description of the command
+ # @return [String]
+ def description(desc = nil)
+ if desc
+ @description = desc
+ else
+ @description
+ end
+ end
+
  ##
  # Setup the command context
  # @return [void]
@@ -27,7 +46,7 @@ class LLM::Shell
  # Define the command
  # @return [void]
  def define(klass = nil, &b)
- @runner = klass || b
+ @object = klass || b
  end
  alias_method :register, :define

@@ -37,10 +56,10 @@ class LLM::Shell
  def call(*argv)
  if @context.nil?
  raise "context has not been setup"
- elsif Class === @runner
- @runner.new(@context).call(*argv)
+ elsif Class === @object
+ @object.new(@context).call(*argv)
  else
- @context.instance_exec(*argv, &@runner)
+ @context.instance_exec(*argv, &@object)
  end
  end
  end
@@ -2,6 +2,20 @@

  class LLM::Shell::Command
  class ImportFile
+ ##
+ # Completes a path with a wildcard.
+ # @param path [String]
+ # The path to complete.
+ # @return [Array<String>]
+ # Returns the completed path(s)
+ def self.complete(path)
+ Dir["#{path}*"]
+ end
+
+ ##
+ # @param [LLM::Shell::Context] context
+ # The context of the command
+ # @return [LLM::Shell::Command::ImportFile]
  def initialize(context)
  @context = context
  end
@@ -24,8 +38,8 @@ class LLM::Shell::Command
  def io = @context.io
  end

- LLM.command do |command|
- command.name "import-file"
- command.register ImportFile
+ LLM.command "import-file" do |cmd|
+ cmd.description "Share one or more files with the LLM"
+ cmd.register ImportFile
  end
  end
@@ -0,0 +1,40 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell
+ class Completion
+ ##
+ # Returns a proc suitable for Readline completion.
+ # @return [Proc]
+ def self.to_proc
+ new.to_proc
+ end
+
+ ##
+ # @return [LLM::Shell::Completion]
+ def initialize
+ @commands = LLM.commands
+ end
+
+ ##
+ # Returns a proc suitable for Readline completion.
+ # @return [Proc]
+ def to_proc
+ method(:complete).to_proc
+ end
+
+ private
+
+ def complete(input)
+ words = Readline.line_buffer.split(" ")
+ command = words[0]
+ if commands[command]
+ object = commands[command].object
+ object.respond_to?(:complete) ? object.complete(input) : []
+ else
+ commands.keys.grep(/\A#{command}/)
+ end
+ end
+
+ attr_reader :commands
+ end
+ end
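
The private `complete` method above drives Readline completion: when the first word on the line is a registered command, completion is delegated to that command's `complete` class method (as `ImportFile.complete` does with a `Dir` glob); otherwise candidate command names are matched by prefix. The dispatch can be sketched without Readline state (the registry and completer below are hypothetical stand-ins for the real objects):

```ruby
# Hypothetical completer standing in for ImportFile: globs the filesystem.
class FileCompleter
  def self.complete(path) = Dir["#{path}*"]
end

# Hypothetical registry mapping command names to completer objects.
COMMANDS = {"import-file" => FileCompleter}

# Mirrors Completion#complete, with the line buffer passed in explicitly
# instead of read from Readline.line_buffer.
def complete(input, line_buffer)
  command = line_buffer.split(" ")[0]
  if COMMANDS[command]
    object = COMMANDS[command]
    object.respond_to?(:complete) ? object.complete(input) : []
  else
    COMMANDS.keys.grep(/\A#{Regexp.escape(command.to_s)}/)
  end
end
```

(`Regexp.escape` is a defensive addition in this sketch; the original interpolates the word into the pattern directly.)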
@@ -3,6 +3,7 @@
  class LLM::Shell
  class Markdown
  require "kramdown"
+ require "coderay"

  ##
  # @param [String] text
@@ -37,7 +38,14 @@ class LLM::Shell
  Paint[node.children.map { visit(_1) }.join, :bold]
  when :br
  "\n"
- when :text, :codespan
+ when :codespan, :codeblock
+ lines = node.value.each_line.to_a
+ lang = lines[0].strip
+ code = lines[1..].join
+ ["\n", Paint[">>> #{lang}", :blue, :bold],
+ "\n\n", coderay(code, lang),
+ "\n", Paint["<<< #{lang}", :blue, :bold]].join
+ when :text
  node.value
  else
  node.children.map { visit(_1) }.join
@@ -58,5 +66,12 @@ class LLM::Shell
  .gsub(/\A<think>[\n]*<\/think>(?:\n)/, "")
  .gsub(/\A\n{2,}/, "")
  end
+
+ def coderay(code, lang)
+ CodeRay.scan(code, lang).terminal
+ rescue ArgumentError
+ lang = "text"
+ retry
+ end
  end
  end
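
The new `coderay` helper relies on a compact Ruby idiom: a method-level `rescue` that reassigns the `lang` argument and calls `retry`, re-running the whole method body with the fallback value. A generic sketch of the idiom, with a hypothetical lexer table in place of CodeRay:

```ruby
# Hypothetical lexer lookup standing in for CodeRay.scan: raises
# ArgumentError for unknown languages, as the rescue above anticipates.
LEXERS = {"ruby" => :ruby_tokens, "text" => :plain_tokens}

def scan(code, lang)
  LEXERS.fetch(lang) { raise ArgumentError, "unknown language: #{lang}" }
end

# Mirrors Markdown#coderay: on an unknown language, reassign lang to
# "text" and retry, so the body runs again with the safe fallback.
def highlight(code, lang)
  scan(code, lang)
rescue ArgumentError
  lang = "text"
  retry
end
```

The retry is safe here only because the `"text"` entry is guaranteed to succeed; if the fallback could also raise, the method would loop forever.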
@@ -21,6 +21,8 @@ class LLM::Shell
  # Performs initial setup
  # @return [void]
  def setup
+ LLM::Shell.commands.each { |file| require file }
+ Readline.completion_proc = Completion.to_proc
  chat options.prompt, role: options.default.role
  files.each { bot.chat ["--- START: #{_1} ---", File.read(_1), "--- END: #{_1} ---"].join("\n") }
  bot.messages.each(&:read!)
@@ -4,5 +4,5 @@ module LLM
  end unless defined?(LLM)

  class LLM::Shell
- VERSION = "0.2.0"
+ VERSION = "0.4.0"
  end
data/lib/llm/shell.rb CHANGED
@@ -16,6 +16,7 @@ class LLM::Shell
  require_relative "shell/options"
  require_relative "shell/repl"
  require_relative "shell/config"
+ require_relative "shell/completion"
  require_relative "shell/version"

  ##
@@ -34,6 +35,12 @@ class LLM::Shell
  Dir[File.join(home, "tools", "*.rb")]
  end

+ ##
+ # @return [Array<String>]
+ def self.commands
+ Dir[File.join(home, "commands", "*.rb")]
+ end
+
  ##
  # @param [Hash] options
  # @return [LLM::Shell]
@@ -17,7 +17,7 @@ end
  def option_parser
  OptionParser.new do |o|
  o.banner = "Usage: llm-shell [OPTIONS]"
- o.on("-p PROVIDER", "--provider NAME", "Required. Options: gemini, openai, anthropic, or ollama.", String)
+ o.on("-p PROVIDER", "--provider NAME", "Required. Options: gemini, openai, anthropic, ollama or llamacpp.", String)
  o.on("-k [KEY]", "--key [KEY]", "Optional. Required by gemini, openai, and anthropic.", String)
  o.on("-m [MODEL]", "--model [MODEL]", "Optional. The name of a model.", Array)
  o.on("-h [HOST]", "--host [HOST]", "Optional. Sometimes required by ollama.", String)
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: llm-shell
  version: !ruby/object:Gem::Version
- version: 0.2.0
+ version: 0.4.0
  platform: ruby
  authors:
  - Antar Azri
@@ -9,7 +9,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-05-09 00:00:00.000000000 Z
+ date: 2025-05-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: llm.rb
@@ -53,6 +53,20 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '2.5'
+ - !ruby/object:Gem::Dependency
+ name: coderay
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.1'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.1'
  - !ruby/object:Gem::Dependency
  name: webmock
  requirement: !ruby/object:Gem::Requirement
@@ -193,7 +207,7 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '2.8'
- description: llm-shell is an extensible, developer-oriented command-line utility that
+ description: llm-shell is an extensible, developer-oriented command-line console that
  can interact with multiple Large Language Models (LLMs).
  email:
  - azantar@proton.me
@@ -211,6 +225,7 @@ files:
  - lib/llm/shell/command.rb
  - lib/llm/shell/command/extension.rb
  - lib/llm/shell/commands/import_file.rb
+ - lib/llm/shell/completion.rb
  - lib/llm/shell/config.rb
  - lib/llm/shell/default.rb
  - lib/llm/shell/formatter.rb
@@ -244,6 +259,6 @@ requirements: []
  rubygems_version: 3.5.23
  signing_key:
  specification_version: 4
- summary: llm-shell is an extensible, developer-oriented command-line utility that
+ summary: llm-shell is an extensible, developer-oriented command-line console that
  can interact with multiple Large Language Models (LLMs).
  test_files: []