llm-shell 0.1.0 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: bad036ff98c154b18cabda0e2b598b40c242907e6eda28b6d2cfaa1fba66a265
- data.tar.gz: 85d5fbc076495609562221a2296eabb06ec03c422ce1a2ea48661a3cf23213e9
+ metadata.gz: 2dfeeceab3ca11d0c832f8114b903d83791e237fabf4bb2dd0ce4d840be0e1ee
+ data.tar.gz: 119072ae06110415da13cee558ce8cdfe6613b4ad0198e91d207b91235824e5d
  SHA512:
- metadata.gz: 182781650d0281008f8741ef389b1c7eaaf815191df2a3e8153415d6b65fdf64ce65d5d83ec67debfd05bde2f408f40c9ad79a1f21299dac699ba7c24f580f74
- data.tar.gz: cebfa126c00d63d9927a9cfd6684f382016bd2c805aadf0da3ae5798d4bc136ccea3682af9c3c4e89236cfcbb7a837891b092a3744e537581eb676de9ecbaa9d
+ metadata.gz: 391dbf0bc2937545a475aaade248ebdfe495fd88c61ecd4691623220276f35dca1d438b55dbaf20826e30e554013493ca98044d40816396a45f9f3d5e89d678a
+ data.tar.gz: 63f8201b0aebd0e260788df46b90e96afdd4442f1b27bf9d4e025a6c6b6c6ca2d2338b6140fd69d616cc121cd8fa91bd1e71b3d9043e143af78a0494041ea563
data/README.md CHANGED
@@ -1,7 +1,7 @@
  ## About

  llm-shell is an extensible, developer-oriented command-line
- utility that can interact with multiple Large Language Models
+ console that can interact with multiple Large Language Models
  (LLMs). It serves as both a demo of the [llmrb/llm](https://github.com/llmrb/llm)
  library and a tool to help improve the library through real-world
  usage and feedback. Jump to the [Demos](#demos) section to see
@@ -9,44 +9,54 @@ it in action!

  ## Features

+ #### General
+
  - 🌟 Unified interface for multiple Large Language Models (LLMs)
- - 🤝 Supports Gemini, OpenAI, Anthropic, and Ollama
+ - 🤝 Supports Gemini, OpenAI, Anthropic, LlamaCpp and Ollama
+
+ #### Customize
+
  - 📤 Attach local files as conversation context
  - 🔧 Extend with your own functions and tool calls
+ - 🚀 Extend with your own console commands
+
+ #### Shell
+
+ - 🤖 Builtin auto-complete powered by Readline
+ - 📄 Deploys the less pager for long outputs
  - 📝 Advanced Markdown formatting and output

  ## Demos

  <details>
- <summary><b>1. Tool calls</b></summary>
- <img src="share/llm-shell/examples/example2.gif/">
+ <summary><b>1. Tools: "system" function</b></summary>
+ <img src="share/llm-shell/examples/toolcalls.gif/">
  </details>

  <details>
- <summary><b>2. File discussion</b></summary>
- <img src="share/llm-shell/examples/example1.gif">
+ <summary><b>2. Files: import at boot time</b></summary>
+ <img src="share/llm-shell/examples/files-boottime.gif">
+ </details>
+
+ <details>
+ <summary><b>3. Files: import at runtime</b></summary>
+ <img src="share/llm-shell/examples/files-runtime.gif">
  </details>

  ## Customization

  #### Functions

+ > For security and safety reasons, a user must confirm the execution of
+ > all function calls before they happen and also add the function to
+ > an allowlist before it will be loaded by llm-shell automatically
+ > at boot time.
+
  The `~/.llm-shell/tools/` directory can contain one or more
  [llmrb/llm](https://github.com/llmrb/llm) functions that the
  LLM can call once you confirm you are okay with executing the
  code locally (along with any arguments it provides). See the
- earlier demo for an example.
-
- For security and safety reasons, a user must confirm the execution of
- all function calls before they happen and also add the function to
- an allowlist before it will be loaded by llm-shell automatically
- at boot time. See below for more details on how this can be done.
-
- An LLM function generally looks like this, and it can be dropped
- into the `~/.llm-shell/tools/` directory. This function is the one
- from the demo earlier, and I saved it as `~/.llm-shell/tools/system.rb`.
- The function's return value is relayed back to the LLM.
-
+ earlier demo for an example:

  ```ruby
  LLM.function(:system) do |fn|
@@ -55,11 +65,30 @@ LLM.function(:system) do |fn|
      schema.object(command: schema.string.required)
    end
    fn.define do |params|
-     `#{params.command}`
+     ro, wo = IO.pipe
+     re, we = IO.pipe
+     Process.wait Process.spawn(params.command, out: wo, err: we)
+     [wo,we].each(&:close)
+     {stderr: re.read, stdout: ro.read}
    end
  end
  ```

+ #### Commands
+
+ llm-shell can be extended with your own console commands. This can be
+ done by creating a Ruby file in the `~/.llm-shell/commands/` directory &ndash;
+ with one file per command. The commands are loaded at boot time. See the
+ [import-file](lib/llm/shell/commands/import_file.rb)
+ command for a realistic example:
+
+ ```ruby
+ LLM.command "say-hello" do |cmd|
+   cmd.define do |name|
+     io.rewind.print "Hello #{name}!"
+   end
+ end
+ ```

  ## Settings

  #### YAML
@@ -84,6 +113,9 @@ anthropic:
  ollama:
    host: localhost
    model: deepseek-coder:6.7b
+ llamacpp:
+   host: localhost
+   model: qwen3
  tools:
    - system
  ```
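The rewritten `system` function above replaces naive backtick interpolation with explicit pipes so that stdout and stderr are captured separately. The same capture pattern can be exercised on its own; the sketch below lifts the body into a standalone method (`run_command` is an illustrative name, not part of the gem):

```ruby
# Spawn a command with stdout/stderr redirected to pipes, wait for it
# to exit, then read both streams back as a Hash -- the same pattern
# the updated "system" function uses.
def run_command(command)
  ro, wo = IO.pipe
  re, we = IO.pipe
  Process.wait Process.spawn(command, out: wo, err: we)
  [wo, we].each(&:close) # close the write ends so the reads can hit EOF
  {stderr: re.read, stdout: ro.read}
end

run_command("echo hello")
```

Returning a Hash of both streams lets the LLM see error output too, rather than only whatever the backtick form put on stdout.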
data/lib/llm/shell/command/extension.rb ADDED
@@ -0,0 +1,31 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell::Command
+   module Extension
+     ##
+     # @example
+     #   LLM.command(:hello) do |cmd|
+     #     cmd.define do |name|
+     #       io.rewind.print("Hello #{name}")
+     #     end
+     #   end
+     # @param [String] name
+     #  The name of the command
+     # @yieldparam [LLM::Shell::Command] cmd
+     #  Yields an instance of LLM::Shell::Command
+     # @return [void]
+     def command(name)
+       cmd = LLM::Shell::Command.new
+       cmd.name(name) if name
+       yield cmd
+       commands[cmd.name] = cmd
+     end
+
+     ##
+     # @return [Hash<String, LLM::Shell::Command>]
+     def commands
+       @commands ||= {}
+     end
+   end
+   LLM.extend(Extension)
+ end
data/lib/llm/shell/command.rb ADDED
@@ -0,0 +1,52 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell
+   class Command
+     Context = Struct.new(:bot, :io)
+
+     ##
+     # Returns the underlying command object
+     # @return [Class, #call]
+     attr_reader :object
+
+     ##
+     # Set or get the command name
+     # @param [String, nil] name
+     #  The name of the command
+     def name(name = nil)
+       if name
+         @name = name
+       else
+         @name
+       end
+     end
+
+     ##
+     # Setup the command context
+     # @return [void]
+     def setup(bot, io)
+       @context = Context.new(bot, io)
+     end
+
+     ##
+     # Define the command
+     # @return [void]
+     def define(klass = nil, &b)
+       @object = klass || b
+     end
+     alias_method :register, :define
+
+     ##
+     # Call the command
+     # @return [void]
+     def call(*argv)
+       if @context.nil?
+         raise "context has not been setup"
+       elsif Class === @object
+         @object.new(@context).call(*argv)
+       else
+         @context.instance_exec(*argv, &@object)
+       end
+     end
+   end
+ end
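`Command#call` above dispatches on what `define` registered: a class is instantiated with the context and `#call`'d, while a block is `instance_exec`'d against the context so `bot` and `io` resolve naturally. A reduced, self-contained sketch of that dual dispatch (the `dispatch` helper and `Shout` class are illustrative, not part of the gem):

```ruby
# A command body may be a class (instantiated with the context, then
# #call'd) or a block (run via instance_exec on the context), mirroring
# the Class === @object branch in LLM::Shell::Command#call.
Context = Struct.new(:bot, :io)

def dispatch(object, context, *argv)
  if Class === object
    object.new(context).call(*argv)
  else
    context.instance_exec(*argv, &object)
  end
end

class Shout
  def initialize(context)
    @context = context
  end

  def call(word)
    word.upcase
  end
end

ctx = Context.new(nil, nil)
dispatch(Shout, ctx, "hi")                         # class-based command
dispatch(proc { |w| "hello #{w}" }, ctx, "world")  # block-based command
```

The class form suits commands with state and helpers (like `ImportFile`), while the block form keeps one-liners terse.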
data/lib/llm/shell/commands/import_file.rb ADDED
@@ -0,0 +1,40 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell::Command
+   class ImportFile
+     ##
+     # Completes a path with a wildcard.
+     # @param path [String]
+     #  The path to complete.
+     # @return [Array<String>]
+     #  Returns the completed path(s)
+     def self.complete(path)
+       Dir["#{path}*"]
+     end
+
+     def initialize(context)
+       @context = context
+     end
+
+     def call(*files)
+       Dir[*files].each { import(_1) }
+     end
+
+     private
+
+     def import(file)
+       bot.chat [
+         "--- START: #{file} ---",
+         File.read(file),
+         "--- END: #{file} ---"
+       ].join("\n")
+     end
+
+     def bot = @context.bot
+     def io = @context.io
+   end
+
+   LLM.command "import-file" do |cmd|
+     cmd.register ImportFile
+   end
+ end
data/lib/llm/shell/completion.rb ADDED
@@ -0,0 +1,40 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell
+   class Completion
+     ##
+     # Returns a proc suitable for Readline completion.
+     # @return [Proc]
+     def self.to_proc
+       new.to_proc
+     end
+
+     ##
+     # @return [LLM::Shell::Completion]
+     def initialize
+       @commands = LLM.commands
+     end
+
+     ##
+     # Returns a proc suitable for Readline completion.
+     # @return [Proc]
+     def to_proc
+       method(:complete).to_proc
+     end
+
+     private
+
+     def complete(input)
+       words = Readline.line_buffer.split(" ")
+       command = words[0]
+       if commands[command]
+         object = commands[command].object
+         object.respond_to?(:complete) ? object.complete(input) : []
+       else
+         commands.keys.grep(/\A#{command}/)
+       end
+     end
+
+     attr_reader :commands
+   end
+ end
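When the first word on the line is not yet a known command, `Completion#complete` falls back to offering every registered command name that starts with the typed prefix, via `grep` on a regex. A minimal sketch of that fallback (with `Regexp.escape` added here for safety; the original interpolates the prefix directly):

```ruby
# Prefix completion fallback: offer every registered command name
# that begins with what the user has typed so far.
def complete_command(commands, prefix)
  commands.keys.grep(/\A#{Regexp.escape(prefix)}/)
end

registry = {"import-file" => :cmd, "say-hello" => :cmd}
complete_command(registry, "im")  # => ["import-file"]
```

Once a full command name is recognized, completion is delegated to the command object itself when it responds to `complete`, which is how `ImportFile.complete` supplies path candidates.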
data/lib/llm/shell/default.rb CHANGED
@@ -7,18 +7,7 @@ class LLM::Shell
      end

      def prompt
-       "You are a helpful assistant." \
-       "Answer the user's questions as best as you can." \
-       "The user's environment is a terminal." \
-       "Provide short and concise answers that are suitable for a terminal." \
-       "Do not provide long answers." \
-       "One or more files might be provided at the start of the conversation. " \
-       "The user might ask you about them, you should try to understand them and what they are. " \
-       "If you don't understand something, say so. " \
-       "Respond in markdown format." \
-       "Each file will be surrounded by the following markers: " \
-       "'# START: /path/to/file'" \
-       "'# END: /path/to/file'"
+       File.read File.join(SHAREDIR, "prompts", "default.txt")
      end

      def role
@@ -27,5 +16,8 @@ class LLM::Shell
        else :user
        end
      end
+
+     SHAREDIR = File.join(__dir__, "..", "..", "..", "share", "llm-shell")
+     private_constant :SHAREDIR
    end
  end
data/lib/llm/shell/formatter.rb CHANGED
@@ -3,6 +3,7 @@
  class LLM::Shell
    class Formatter
      FormatError = Class.new(RuntimeError)
+     FILE_REGEXP = /\A--- START: (.+?) ---/

      def initialize(messages)
        @messages = messages.reject(&:tool_call?)
@@ -21,25 +22,26 @@ class LLM::Shell
      attr_reader :messages

      def format_user(messages)
-       messages.flat_map do |message|
+       messages.filter_map do |message|
          next unless message.user?
          next unless String === message.content
+         next unless message.content !~ FILE_REGEXP
          role = Paint[message.role, :bold, :yellow]
          title = "#{role} says: "
          body = wrap(message.tap(&:read!).content)
-         [title, render(body), ""].join("\n")
-       end.join
+         [title, "\n", render(body), "\n"].join
+       end.join("\n")
      end

      def format_assistant(messages)
-       messages.flat_map do |message|
+       messages.filter_map do |message|
          next unless message.assistant?
          next unless String === message.content
          role = Paint[message.role, :bold, :green]
          title = "#{role} says: "
          body = wrap(message.tap(&:read!).content)
-         [title, render(body)].join("\n")
-       end.join
+         [title, "\n", render(body)].join
+       end.join("\n")
      end

      def render(text)
@@ -55,6 +55,8 @@ class LLM::Shell
        text
          .gsub(/([^\n])\n(#+ )/, "\\1\n\n\\2")
          .gsub(/(#+ .+?)\n(?!\n)/, "\\1\n\n")
+         .gsub(/\A<think>[\n]*<\/think>(?:\n)/, "")
+         .gsub(/\A\n{2,}/, "")
      end
    end
  end
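The two added `gsub` calls scrub output before rendering: the first drops an empty leading `<think></think>` block (presumably from reasoning models such as the qwen3 model added to the sample config, whose new prompt begins with `/no_think`), and the second collapses any blank lines left at the start. In isolation:

```ruby
# The two added substitutions: remove an empty leading <think></think>
# block, then strip any run of blank lines left at the start of the text.
def scrub(text)
  text
    .gsub(/\A<think>[\n]*<\/think>(?:\n)/, "")
    .gsub(/\A\n{2,}/, "")
end

scrub("<think>\n</think>\nHello")  # => "Hello"
```

Note the anchors: both patterns use `\A`, so only a think-block or blank run at the very beginning of the response is touched.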
data/lib/llm/shell/options.rb CHANGED
@@ -26,5 +26,6 @@ class LLM::Shell
    def llm = @options
    def chat = @chat_options
    def default = @default
+   def prompt = default.prompt
  end
end
data/lib/llm/shell/repl.rb CHANGED
@@ -14,15 +14,17 @@ class LLM::Shell
      @bot = bot
      @console = IO.console
      @options = options
-     @line = IO::Line.new($stdout)
+     @io = IO::Line.new($stdout)
    end

    ##
    # Performs initial setup
    # @return [void]
    def setup
-     chat options.default.prompt, role: options.default.role
-     files.each { bot.chat ["# START: #{_1}", File.read(_1), "# END: #{_1}"].join("\n") }
+     LLM::Shell.commands.each { |file| require file }
+     Readline.completion_proc = Completion.to_proc
+     chat options.prompt, role: options.default.role
+     files.each { bot.chat ["--- START: #{_1} ---", File.read(_1), "--- END: #{_1} ---"].join("\n") }
      bot.messages.each(&:read!)
      clear_screen
    end
@@ -50,7 +52,7 @@ class LLM::Shell
    private

    attr_reader :bot, :console,
-               :line, :default,
+               :io, :default,
                :options

    def formatter(messages) = Formatter.new(messages)
@@ -61,9 +63,17 @@ class LLM::Shell

    def read
      input = Readline.readline("llm> ", true) || throw(:exit, 0)
-     chat input.tap { clear_screen }
-     line.rewind.print(Paint["Thinking", :bold])
-     unread.tap { line.rewind }
+     words = input.split(" ")
+     if LLM.commands[words[0]]
+       cmd = LLM.commands[words[0]]
+       argv = words[1..]
+       cmd.setup(bot, io)
+       cmd.call(*argv)
+     else
+       chat input.tap { clear_screen }
+       io.rewind.print(Paint["Thinking", :bold])
+       unread.tap { io.rewind }
+     end
    end

    def eval
@@ -74,18 +84,21 @@ class LLM::Shell
        print "Do you want to call it? "
        input = $stdin.gets.chomp.downcase
        puts
-       if %w(y yes yeah ok).include?(input)
+       if %w(y yes yep yeah ok).include?(input)
          bot.chat function.call
-         unread.tap { line.rewind }
+         unread.tap { io.rewind }
        else
-         print "Skipping function call", "\n"
+         bot.chat function.cancel
+         bot.chat "I decided to not run the function this time. Maybe next time."
        end
      end
    end

    def emit
-     print formatter(unread).format!(:user), "\n"
-     print formatter(unread).format!(:assistant), "\n"
+     IO.popen("less -FRX", "w") do
+       _1.write formatter(unread).format!(:user), "\n"
+       _1.write formatter(unread).format!(:assistant), "\n"
+     end
    end

    def chat(...)
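`emit` now pages the formatted transcript through `less -FRX` (`-F` quits immediately if the output fits one screen, `-R` passes raw color codes through, `-X` avoids clearing the screen on exit). The `IO.popen` mechanics can be seen with a non-interactive stand-in; here `cat` replaces `less` so the sketch runs unattended:

```ruby
# IO.popen in "r+" mode gives both ends of the child's stdio; `cat`
# stands in for `less -FRX` so this example runs non-interactively.
output = IO.popen("cat", "r+") do |io|
  io.write "assistant says: hello\n"
  io.close_write # signal EOF so cat flushes and exits
  io.read
end
```

In the real REPL the pipe is opened write-only (`"w"`), so the pager inherits the terminal and handles scrolling itself.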
data/lib/llm/shell/version.rb CHANGED
@@ -4,5 +4,5 @@ module LLM
  end unless defined?(LLM)

  class LLM::Shell
-   VERSION = "0.1.0"
+   VERSION = "0.3.0"
  end
data/lib/llm/shell.rb CHANGED
@@ -8,14 +8,21 @@ require "paint"
  class LLM::Shell
    require_relative "../io/line"
+   require_relative "shell/command"
+   require_relative "shell/command/extension"
    require_relative "shell/markdown"
    require_relative "shell/formatter"
    require_relative "shell/default"
    require_relative "shell/options"
    require_relative "shell/repl"
    require_relative "shell/config"
+   require_relative "shell/completion"
    require_relative "shell/version"

+   ##
+   # Load all commands
+   Dir[File.join(__dir__, "shell", "commands", "*.rb")].each { require(_1) }
+
    ##
    # @return [String]
    def self.home
@@ -28,6 +35,12 @@ class LLM::Shell
      Dir[File.join(home, "tools", "*.rb")]
    end

+   ##
+   # @return [Array<String>]
+   def self.commands
+     Dir[File.join(home, "commands", "*.rb")]
+   end
+
    ##
    # @param [Hash] options
    # @return [LLM::Shell]
@@ -55,7 +68,7 @@ class LLM::Shell
        print Paint["llm-shell: ", :green], "load #{name} tool", "\n"
        eval File.read(path), TOPLEVEL_BINDING, path, 1
      else
-       print Paint["llm-shell:: ", :yellow], "skip #{name} tool", "\n"
+       print Paint["llm-shell: ", :yellow], "skip #{name} tool", "\n"
      end
    end.grep(LLM::Function)
  end
data/lib/llm-shell.rb CHANGED
@@ -1 +1,3 @@
+ # frozen_string_literal: true
+
  require_relative "llm/shell"
data/share/llm-shell/prompts/default.txt ADDED
@@ -0,0 +1,27 @@
+ /no_think
+
+ ## General
+
+ You are a helpful assistant.
+ Answer the user's questions as best as you can.
+
+ The user's environment is a terminal.
+ Provide short and concise answers that are suitable for a terminal.
+ Do not provide long answers.
+
+ ## Files
+
+ One or more files *MIGHT* be provided at the start of the conversation.
+ One file will be provided per message, *IF* any files are provided at all.
+ *IF* a file is provided, it will be in this format:
+
+ --- START: /path/to/file ---
+ <contents>
+ --- END: /path/to/file ---
+
+ Otherwise, no files will be provided and you shouldn't mention them.
+ On receipt of one or more files, you will respond with: Got it. And with nothing else.
+
+ ## Format
+
+ Respond in markdown.
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: llm-shell
  version: !ruby/object:Gem::Version
-   version: 0.1.0
+   version: 0.3.0
  platform: ruby
  authors:
  - Antar Azri
@@ -9,7 +9,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-05-06 00:00:00.000000000 Z
+ date: 2025-05-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: llm.rb
@@ -17,14 +17,14 @@ dependencies:
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
-       version: '0.6'
+       version: '0.7'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '0.6'
+         version: '0.7'
  - !ruby/object:Gem::Dependency
    name: paint
    requirement: !ruby/object:Gem::Requirement
@@ -193,7 +193,7 @@ dependencies:
      - - "~>"
        - !ruby/object:Gem::Version
          version: '2.8'
- description: llm-shell is an extensible, developer-oriented command-line utility that
+ description: llm-shell is an extensible, developer-oriented command-line console that
    can interact with multiple Large Language Models (LLMs).
  email:
  - azantar@proton.me
@@ -208,6 +208,10 @@ files:
  - lib/io/line.rb
  - lib/llm-shell.rb
  - lib/llm/shell.rb
+ - lib/llm/shell/command.rb
+ - lib/llm/shell/command/extension.rb
+ - lib/llm/shell/commands/import_file.rb
+ - lib/llm/shell/completion.rb
  - lib/llm/shell/config.rb
  - lib/llm/shell/default.rb
  - lib/llm/shell/formatter.rb
@@ -216,6 +220,7 @@ files:
  - lib/llm/shell/repl.rb
  - lib/llm/shell/version.rb
  - libexec/llm-shell/shell
+ - share/llm-shell/prompts/default.txt
  homepage: https://github.com/llmrb/llm-shell
  licenses:
  - 0BSD
@@ -240,6 +245,6 @@ requirements: []
  rubygems_version: 3.5.23
  signing_key:
  specification_version: 4
- summary: llm-shell is an extensible, developer-oriented command-line utility that
+ summary: llm-shell is an extensible, developer-oriented command-line console that
    can interact with multiple Large Language Models (LLMs).
  test_files: []