llm-shell 0.2.0 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 629a62d1b7f2fb7c4de9149c1af8e640c1ff08ccbef3a581d006cd99d331e4e2
-  data.tar.gz: 9e8808cba408a0474a938b7de8df00e1b72032635b00928bdd0a7a78f57d1521
+  metadata.gz: 2dfeeceab3ca11d0c832f8114b903d83791e237fabf4bb2dd0ce4d840be0e1ee
+  data.tar.gz: 119072ae06110415da13cee558ce8cdfe6613b4ad0198e91d207b91235824e5d
 SHA512:
-  metadata.gz: 4552e7ef324acf1f800c42cffc5fe7812e581389f1a544a447c6963611011d66c4a3e19ec2fb1d5f142fbf822b8dd8a0264ed8de6a40427113b597541d099170
-  data.tar.gz: 8c453dfd38ff0c2f94707ab460a47afeba52efeccc808ff68c5a9c1007040855104e1954548222367459eca4a6245a174b78cd9f034baa8f301d2451a78b0288
+  metadata.gz: 391dbf0bc2937545a475aaade248ebdfe495fd88c61ecd4691623220276f35dca1d438b55dbaf20826e30e554013493ca98044d40816396a45f9f3d5e89d678a
+  data.tar.gz: 63f8201b0aebd0e260788df46b90e96afdd4442f1b27bf9d4e025a6c6b6c6ca2d2338b6140fd69d616cc121cd8fa91bd1e71b3d9043e143af78a0494041ea563
data/README.md CHANGED
@@ -1,7 +1,7 @@
 ## About
 
 llm-shell is an extensible, developer-oriented command-line
-utility that can interact with multiple Large Language Models
+console that can interact with multiple Large Language Models
 (LLMs). It serves as both a demo of the [llmrb/llm](https://github.com/llmrb/llm)
 library and a tool to help improve the library through real-world
 usage and feedback. Jump to the [Demos](#demos) section to see
@@ -9,12 +9,22 @@ it in action!
 
 ## Features
 
+#### General
+
 - 🌟 Unified interface for multiple Large Language Models (LLMs)
 - 🤝 Supports Gemini, OpenAI, Anthropic, LlamaCpp and Ollama
+
+#### Customize
+
 - 📤 Attach local files as conversation context
 - 🔧 Extend with your own functions and tool calls
-- 📝 Advanced Markdown formatting and output
+- 🚀 Extend with your own console commands
+
+#### Shell
+
+- 🤖 Builtin auto-complete powered by Readline
 - 📄 Deploys the less pager for long outputs
+- 📝 Advanced Markdown formatting and output
 
 ## Demos
 
@@ -37,22 +47,16 @@ it in action!
 
 #### Functions
 
+> For security and safety reasons, a user must confirm the execution of
+> all function calls before they happen and also add the function to
+> an allowlist before it will be loaded by llm-shell automatically
+> at boot time.
+
 The `~/.llm-shell/tools/` directory can contain one or more
 [llmrb/llm](https://github.com/llmrb/llm) functions that the
 LLM can call once you confirm you are okay with executing the
 code locally (along with any arguments it provides). See the
-earlier demo for an example.
-
-For security and safety reasons, a user must confirm the execution of
-all function calls before they happen and also add the function to
-an allowlist before it will be loaded by llm-shell automatically
-at boot time. See below for more details on how this can be done.
-
-An LLM function generally looks like this, and it can be dropped
-into the `~/.llm-shell/tools/` directory. This function is the one
-from the demo earlier, and I saved it as `~/.llm-shell/tools/system.rb`.
-The function's return value is relayed back to the LLM.
-
+earlier demo for an example:
 
 ```ruby
 LLM.function(:system) do |fn|
@@ -61,11 +65,30 @@ LLM.function(:system) do |fn|
     schema.object(command: schema.string.required)
   end
   fn.define do |params|
-    `#{params.command}`
+    ro, wo = IO.pipe
+    re, we = IO.pipe
+    Process.wait Process.spawn(params.command, out: wo, err: we)
+    [wo,we].each(&:close)
+    {stderr: re.read, stdout: ro.read}
   end
 end
 ```
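The new function body above replaces backtick interpolation with pipe-based capture of both output streams. As an editor's standalone sketch of that pattern (not part of the package; the spawned command `echo hello` is just an example input, and the `Process.wait`-before-read order only suits small outputs that fit the pipe buffer):

```ruby
ro, wo = IO.pipe   # parent reads the child's stdout from ro
re, we = IO.pipe   # parent reads the child's stderr from re
Process.wait Process.spawn("echo hello", out: wo, err: we)
[wo, we].each(&:close)  # close write ends so the reads can reach EOF
result = { stderr: re.read, stdout: ro.read }
```

Returning a hash with both streams lets the LLM see stderr as well as stdout, which the old backtick version discarded.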
 
+#### Commands
+
+llm-shell can be extended with your own console commands. This can be
+done by creating a Ruby file in the `~/.llm-shell/commands/` directory –
+with one file per command. The commands are loaded at boot time. See the
+[import-file](lib/llm/shell/commands/import_file.rb)
+command for a realistic example:
+
+```ruby
+LLM.command "say-hello" do |cmd|
+  cmd.define do |name|
+    io.rewind.print "Hello #{name}!"
+  end
+end
+```
 ## Settings
 
 #### YAML
data/lib/llm/shell/command/extension.rb CHANGED
@@ -4,17 +4,19 @@ class LLM::Shell::Command
   module Extension
     ##
     # @example
-    #   LLM.command do |cmd|
-    #     cmd.name "hello"
+    #   LLM.command(:hello) do |cmd|
     #     cmd.define do |name|
     #       io.rewind.print("Hello #{name}")
     #     end
     #   end
+    # @param [String] name
+    #  The name of the command
     # @yieldparam [LLM::Shell::Command] cmd
     #  Yields an instance of LLM::Shell::Command
     # @return [void]
-    def command
+    def command(name)
       cmd = LLM::Shell::Command.new
+      cmd.name(name) if name
       yield cmd
       commands[cmd.name] = cmd
     end
data/lib/llm/shell/command.rb CHANGED
@@ -4,6 +4,11 @@ class LLM::Shell
   class Command
     Context = Struct.new(:bot, :io)
 
+    ##
+    # Returns the underlying command object
+    # @return [Class, #call]
+    attr_reader :object
+
     ##
     # Set or get the command name
     # @param [String, nil] name
@@ -27,7 +32,7 @@ class LLM::Shell
     # Define the command
     # @return [void]
     def define(klass = nil, &b)
-      @runner = klass || b
+      @object = klass || b
     end
     alias_method :register, :define
 
@@ -37,10 +42,10 @@ class LLM::Shell
     def call(*argv)
       if @context.nil?
         raise "context has not been setup"
-      elsif Class === @runner
-        @runner.new(@context).call(*argv)
+      elsif Class === @object
+        @object.new(@context).call(*argv)
       else
-        @context.instance_exec(*argv, &@runner)
+        @context.instance_exec(*argv, &@object)
       end
     end
   end
data/lib/llm/shell/commands/import_file.rb CHANGED
@@ -2,6 +2,16 @@
 
 class LLM::Shell::Command
   class ImportFile
+    ##
+    # Completes a path with a wildcard.
+    # @param path [String]
+    #  The path to complete.
+    # @return [Array<String>]
+    #  Returns the completed path(s)
+    def self.complete(path)
+      Dir["#{path}*"]
+    end
+
     def initialize(context)
       @context = context
     end
@@ -24,8 +34,7 @@ class LLM::Shell::Command
     def io = @context.io
   end
 
-  LLM.command do |command|
-    command.name "import-file"
-    command.register ImportFile
+  LLM.command "import-file" do |cmd|
+    cmd.register ImportFile
   end
 end
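The new `ImportFile.complete` hook added above is a plain trailing-wildcard glob. A quick sketch of that behavior (editor's sketch; the temp directory and file names are illustrative, not from the package):

```ruby
require "tmpdir"

# Mirrors ImportFile.complete: expand a partial path with a trailing wildcard.
def complete(path)
  Dir["#{path}*"]
end

matches = Dir.mktmpdir do |dir|
  File.write(File.join(dir, "notes.md"), "")
  File.write(File.join(dir, "notes.txt"), "")
  complete(File.join(dir, "no"))  # partial path "no…" matches both files
end
```

Because the hook is a class method on the command object, the shell's completion layer can call it without instantiating the command.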
data/lib/llm/shell/completion.rb ADDED
@@ -0,0 +1,40 @@
+# frozen_string_literal: true
+
+class LLM::Shell
+  class Completion
+    ##
+    # Returns a proc suitable for Readline completion.
+    # @return [Proc]
+    def self.to_proc
+      new.to_proc
+    end
+
+    ##
+    # @return [LLM::Shell::Completion]
+    def initialize
+      @commands = LLM.commands
+    end
+
+    ##
+    # Returns a proc suitable for Readline completion.
+    # @return [Proc]
+    def to_proc
+      method(:complete).to_proc
+    end
+
+    private
+
+    def complete(input)
+      words = Readline.line_buffer.split(" ")
+      command = words[0]
+      if commands[command]
+        object = commands[command].object
+        object.respond_to?(:complete) ? object.complete(input) : []
+      else
+        commands.keys.grep(/\A#{command}/)
+      end
+    end
+
+    attr_reader :commands
+  end
+end
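The fallback branch of `Completion#complete` greps registered command names by prefix. A minimal sketch of that branch (editor's sketch; the command table is hypothetical, and `Regexp.escape` is added here for safety where the package interpolates the prefix directly):

```ruby
# Hypothetical command table; the real shell reads this from LLM.commands.
commands = { "import-file" => true, "say-hello" => true }

# Prefix match over command names, as in Completion#complete's else branch.
def candidates(commands, prefix)
  commands.keys.grep(/\A#{Regexp.escape(prefix)}/)
end

candidates(commands, "im")  # matches "import-file" only
candidates(commands, "")    # an empty prefix matches every command
```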
@@ -21,6 +21,8 @@ class LLM::Shell
   # Performs initial setup
   # @return [void]
   def setup
+    LLM::Shell.commands.each { |file| require file }
+    Readline.completion_proc = Completion.to_proc
     chat options.prompt, role: options.default.role
     files.each { bot.chat ["--- START: #{_1} ---", File.read(_1), "--- END: #{_1} ---"].join("\n") }
     bot.messages.each(&:read!)
data/lib/llm/shell/version.rb CHANGED
@@ -4,5 +4,5 @@ module LLM
 end unless defined?(LLM)
 
 class LLM::Shell
-  VERSION = "0.2.0"
+  VERSION = "0.3.0"
 end
data/lib/llm/shell.rb CHANGED
@@ -16,6 +16,7 @@ class LLM::Shell
   require_relative "shell/options"
   require_relative "shell/repl"
   require_relative "shell/config"
+  require_relative "shell/completion"
   require_relative "shell/version"
 
   ##
@@ -34,6 +35,12 @@ class LLM::Shell
     Dir[File.join(home, "tools", "*.rb")]
   end
 
+  ##
+  # @return [Array<String>]
+  def self.commands
+    Dir[File.join(home, "commands", "*.rb")]
+  end
+
   ##
   # @param [Hash] options
   # @return [LLM::Shell]
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: llm-shell
 version: !ruby/object:Gem::Version
-  version: 0.2.0
+  version: 0.3.0
 platform: ruby
 authors:
 - Antar Azri
@@ -9,7 +9,7 @@ authors:
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2025-05-09 00:00:00.000000000 Z
+date: 2025-05-10 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: llm.rb
@@ -193,7 +193,7 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: '2.8'
-description: llm-shell is an extensible, developer-oriented command-line utility that
+description: llm-shell is an extensible, developer-oriented command-line console that
   can interact with multiple Large Language Models (LLMs).
 email:
 - azantar@proton.me
@@ -211,6 +211,7 @@ files:
 - lib/llm/shell/command.rb
 - lib/llm/shell/command/extension.rb
 - lib/llm/shell/commands/import_file.rb
+- lib/llm/shell/completion.rb
 - lib/llm/shell/config.rb
 - lib/llm/shell/default.rb
 - lib/llm/shell/formatter.rb
@@ -244,6 +245,6 @@ requirements: []
 rubygems_version: 3.5.23
 signing_key:
 specification_version: 4
-summary: llm-shell is an extensible, developer-oriented command-line utility that
+summary: llm-shell is an extensible, developer-oriented command-line console that
   can interact with multiple Large Language Models (LLMs).
 test_files: []