llm-shell 0.7.1 → 0.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 1b4d556b621ab4ce91e5c3c14f50ffa906e0158b10f7cbf619eab797dc3b53ee
- data.tar.gz: 0f3dbaa8ebed80f9bb4f22559deddc34960ac3682b4c3e2712efb304c9f8be0e
+ metadata.gz: 12bc22975d67ab780255d8c35fc91d6578ddf880db5e10ae639e70d67be7b275
+ data.tar.gz: ebb7794394c964e7b6ec0e0a18b72678b857d4f22369c05cc6a799de73b6f0db
  SHA512:
- metadata.gz: da51e7bc421daaaf972ac3e4439100517d31810461dd9d7851895e95704537365fedac0aaad39d1a2a1b788fecd0eec3d7a49040e9f8a90e90b809e8e4416e57
- data.tar.gz: 73bfabf14dee823e4bd6792210f29a4efc03a915591c853d7862d76f32f0f5fdc307c999321aef913a1977a4a56df19c6b299555dc287c6de9e51a515c4bd4b8
+ metadata.gz: c9b748161d32e911d952a1f326f5be0df9be043a70ceefbaccdf92e7456b5bc2dab2ac54d35e74ca1f68275fa16715f672665e248a1b0621ba030a8a072106d9
+ data.tar.gz: 6fe177aea6672502327c4b1bbfd3a4e27ebe9c1d527c93911876c3f8213a87cbe131353a6407563221b2ed44db112215a85e0a18e5c108fd66dea26fdc59e6b9
data/README.md CHANGED
@@ -5,7 +5,7 @@ console that can interact with multiple Large Language Models
  (LLMs). It serves as both a demo of the [llmrb/llm](https://github.com/llmrb/llm)
  library and a tool to help improve the library through real-world
  usage and feedback. Jump to the [Demos](#demos) section to see
- it in action!
+ it in action.
 
  ## Features
 
@@ -30,18 +30,18 @@ it in action!
  ## Demos
 
  <details>
- <summary><b>1. Tools: "system" function</b></summary>
+ <summary><b>1. An introduction to tool calls</b></summary>
  <img src="share/llm-shell/examples/toolcalls.gif/">
  </details>
 
  <details>
- <summary><b>2. Files: import at runtime</b></summary>
- <img src="share/llm-shell/examples/files-runtime.gif">
+ <summary><b>2. Add files as conversation context</b></summary>
+ <img src="share/llm-shell/examples/files.gif">
  </details>
 
  <details>
- <summary><b>3. Files: import at boot time</b></summary>
- <img src="share/llm-shell/examples/files-boottime.gif">
+ <summary><b>3. Advanced features: markdown, syntax highlighting</b></summary>
+ <img src="share/llm-shell/examples/codegen.gif">
  </details>
 
  ## Customization
@@ -49,15 +49,18 @@ it in action!
  #### Functions
 
  > For security and safety reasons, a user must confirm the execution of
- > all function calls before they happen and also add the function to
- > an allowlist before it will be loaded by llm-shell automatically
- > at boot time.
+ > all function calls before they happen
+
+ llm-shell can be extended with your own functions (also known as tool calls).
+ This can be done by creating a Ruby file in the `~/.llm-shell/functions/`
+ directory &ndash; with one file per function. The functions are
+ loaded at boot time. The functions are shared with the LLM and the LLM
+ can request their execution. The LLM is also made aware of a function's
+ return value after it has been called.
+ See the
+ [functions/](lib/llm/shell/functions/)
+ directory for more examples:
 
- The `~/.llm-shell/tools/` directory can contain one or more
- [llmrb/llm](https://github.com/llmrb/llm) functions that the
- LLM can call once you confirm you are okay with executing the
- code locally (along with any arguments it provides). See the
- earlier demo for an example:
 
  ```ruby
  LLM.function(:system) do |fn|
@@ -65,10 +68,10 @@ LLM.function(:system) do |fn|
  fn.params do |schema|
  schema.object(command: schema.string.required)
  end
- fn.define do |params|
+ fn.define do |command:|
  ro, wo = IO.pipe
  re, we = IO.pipe
- Process.wait Process.spawn(params.command, out: wo, err: we)
+ Process.wait Process.spawn(command, out: wo, err: we)
  [wo,we].each(&:close)
  {stderr: re.read, stdout: ro.read}
  end
@@ -79,13 +82,10 @@ end
 
  llm-shell can be extended with your own console commands. This can be
  done by creating a Ruby file in the `~/.llm-shell/commands/` directory &ndash;
- with one file per command. The commands are loaded at boot time. See the
- [file-import](lib/llm/shell/commands/file_import.rb),
- [dir-import](lib/llm/shell/commands/dir_import.rb),
- [show-history](lib/llm/shell/commands/show_history.rb),
- [clear-screen](lib/llm/shell/commands/clear_screen.rb)
- and [system-prompt](lib/llm/shell/commands/system_prompt.rb)
- commands for more realistic examples:
+ with one file per command. The commands are loaded at boot time.
+ See the
+ [commands/](lib/llm/shell/commands/)
+ directory for more examples:
 
  ```ruby
  LLM.command "say-hello" do |cmd|
@@ -127,16 +127,12 @@ path `${HOME}/.llm-shell/config.yml` and it has the following format:
  # ~/.config/llm-shell.yml
  openai:
  key: YOURKEY
- model: gpt-4o-mini
  gemini:
  key: YOURKEY
- model: gemini-2.0-flash-001
  anthropic:
  key: YOURKEY
- model: claude-3-7-sonnet-20250219
  deepseek:
  key: YOURKEY
- model: deepseek-chat
  ollama:
  host: localhost
  model: deepseek-coder:6.7b
@@ -159,7 +155,6 @@ Usage: llm-shell [OPTIONS]
  -h, --host [HOST] Optional. Sometimes required by ollama.
  -o, --port [PORT] Optional. Sometimes required by ollama.
  -f, --files [GLOB] Optional. Glob pattern(s) separated by a comma.
- -t, --tools [TOOLS] Optional. One or more tool names to load automatically.
  -r, --prompt [PROMPT] Optional. The prompt to use.
  -v, --version Optional. Print the version and exit
  ```
data/lib/llm/function.rb ADDED
@@ -0,0 +1,17 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ ##
+ # Returns true when a function is a built-in function
+ # @return [Boolean]
+ def builtin?
+ @builtin
+ end
+
+ ##
+ # Mark a function as a built-in function
+ # @return [void]
+ def builtin!
+ @builtin = true
+ end
+ end
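The new `builtin?`/`builtin!` pair is a plain boolean-flag idiom: a one-way switch plus a query method. A self-contained sketch of the same idiom (`Fn` is a stand-in class invented for this example, not part of llm-shell):

```ruby
# A one-way boolean flag: builtin! sets it, builtin? reads it.
# Fn is a hypothetical stand-in for LLM::Function / LLM::Shell::Command.
class Fn
  def initialize(name)
    @name = name
  end

  # Query method; !! coerces the unset (nil) case to false.
  def builtin?
    !!@builtin
  end

  # Marker method; once set, the flag stays set.
  def builtin!
    @builtin = true
  end
end

fn = Fn.new(:read_file)
p fn.builtin?  # false before marking
fn.builtin!
p fn.builtin?  # true afterwards
```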
data/lib/llm/shell/command.rb CHANGED
@@ -3,7 +3,14 @@
  class LLM::Shell
  class Command
  require_relative "commands/utils"
- Context = Struct.new(:bot, :io)
+
+ ##
+ # @api private
+ class Context < Struct.new(:bot, :io)
+ def pager(...)
+ LLM::Shell.pager(...)
+ end
+ end
 
  ##
  # Returns the underlying command object
@@ -63,5 +70,19 @@ class LLM::Shell
  @context.instance_exec(*argv, &@object)
  end
  end
+
+ ##
+ # @return [Boolean]
+ # Returns true if this is a builtin command
+ def builtin?
+ @builtin
+ end
+
+ ##
+ # Mark this command as builtin command
+ # @return [void]
+ def builtin!
+ @builtin = true
+ end
  end
  end
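`Context` changed from a bare `Struct.new(:bot, :io)` to a subclass so it can carry behaviour, here a `pager` method that forwards everything to `LLM::Shell.pager` with `(...)` argument forwarding. A self-contained sketch of the idiom, with a stub module standing in for `LLM::Shell`:

```ruby
# Screen is a hypothetical stand-in for LLM::Shell; its pager simply
# yields $stdout instead of spawning `less`.
module Screen
  def self.pager(...)
    yield $stdout
  end
end

# Subclassing an anonymous Struct keeps the value-object behaviour
# (members, equality) while allowing extra methods.
class Context < Struct.new(:bot, :io)
  # Forward positional, keyword, and block arguments unchanged.
  def pager(...)
    Screen.pager(...)
  end
end

ctx = Context.new(:fake_bot, :fake_io)
ctx.pager { |io| io.puts "paged output" }
```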
data/lib/llm/shell/commands/clear_screen.rb CHANGED
@@ -24,8 +24,9 @@ class LLM::Shell::Command
  def clear_screen = console.clear_screen
 
  LLM.command "clear-screen" do |cmd|
- cmd.description "Clears the screen"
+ cmd.description "Clear the screen"
  cmd.register(self)
+ cmd.builtin!
  end
  end
  end
data/lib/llm/shell/commands/dir_import.rb CHANGED
@@ -41,8 +41,9 @@ class LLM::Shell::Command
  private
 
  LLM.command "dir-import" do |cmd|
- cmd.description "Share the contents of a directory with the LLM"
+ cmd.description "Share a directory with the LLM"
  cmd.register(self)
+ cmd.builtin!
  end
  end
  end
data/lib/llm/shell/commands/file_import.rb CHANGED
@@ -33,8 +33,9 @@ class LLM::Shell::Command
  private
 
  LLM.command "file-import" do |cmd|
- cmd.description "Share one or more files with the LLM"
+ cmd.description "Share a file with the LLM"
  cmd.register(self)
+ cmd.builtin!
  end
  end
  end
data/lib/llm/shell/commands/help.rb ADDED
@@ -0,0 +1,66 @@
+ # frozen_string_literal: true
+
+ class LLM::Shell::Command
+ class Help
+ require_relative "utils"
+ include Utils
+
+ ##
+ # @param [LLM::Shell::Context] context
+ # The context of the command
+ # @return [LLM::Shell::Command::Help]
+ def initialize(context)
+ @context = context
+ end
+
+ ##
+ # Prints help
+ # @return [void]
+ def call
+ pager do |io|
+ render_commands(io)
+ render_functions(io)
+ end
+ end
+
+ private
+
+ def render_commands(io)
+ io.print(Paint["Commands", :bold, :underline], "\n\n")
+ io.print(Paint["Builtin", :bold], "\n\n")
+ render_group commands.select(&:builtin?), io, :cyan
+ io.print(Paint["User", :bold], "\n\n")
+ render_group commands.reject(&:builtin?), io, :cyan
+ end
+
+ def render_functions(io)
+ io.print(Paint["Functions", :bold, :underline], "\n\n")
+ io.print(Paint["Builtin", :bold], "\n\n")
+ render_group functions.select(&:builtin?), io, :blue
+ io.print(Paint["User", :bold], "\n\n")
+ render_group functions.reject(&:builtin?), io, :blue
+ end
+
+ def render_group(commands, io, bgcolor)
+ if commands.empty?
+ io.print(Paint["None available", :yellow], "\n\n")
+ else
+ commands.each.with_index(1) do |command, index|
+ io.print(name(command, index, bgcolor), "\n")
+ io.print(desc(command), "\n\n")
+ end
+ end
+ end
+
+ def commands = LLM.commands.values.sort_by(&:name)
+ def functions = LLM.functions.values.sort_by(&:name)
+ def name(command, index, bgcolor) = [Paint[" #{index} ", :white, bgcolor, :bold], " ", Paint[command.name, :bold]].join
+ def desc(command) = command.description || "No description"
+
+ LLM.command "help" do |cmd|
+ cmd.description "Show the help menu"
+ cmd.register(self)
+ cmd.builtin!
+ end
+ end
+ end
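The help menu splits entries into builtin and user groups with `select(&:builtin?)` / `reject(&:builtin?)` and numbers each group from 1 via `each.with_index(1)`. A stripped-down sketch of that grouping logic without the Paint colouring (`Cmd` is a hypothetical stand-in for a registered command object):

```ruby
# Cmd stands in for a registered command; the builtin member mirrors
# the builtin?/builtin! flag from the diff above.
Cmd = Struct.new(:name, :description, :builtin) do
  def builtin?
    builtin
  end
end

# Render one group as a numbered list, roughly as the help command
# does (minus colours); empty groups get a placeholder line.
def render_group(cmds)
  return "None available\n" if cmds.empty?
  cmds.each.with_index(1).map do |cmd, i|
    " #{i}  #{cmd.name}\n#{cmd.description || "No description"}\n"
  end.join
end

cmds = [
  Cmd.new("say-hello", nil, false),
  Cmd.new("help", "Show the help menu", true)
].sort_by(&:name)

puts "Builtin"
puts render_group(cmds.select(&:builtin?))
puts "User"
puts render_group(cmds.reject(&:builtin?))
```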
data/lib/llm/shell/commands/show_chat.rb RENAMED (from show_history.rb)
@@ -1,14 +1,14 @@
  # frozen_string_literal: true
 
  class LLM::Shell::Command
- class ShowHistory
+ class ShowChat
  require_relative "utils"
  include Utils
 
  ##
  # @param [LLM::Shell::Context] context
  # The context of the command
- # @return [LLM::Shell::Command::ShowHistory]
+ # @return [LLM::Shell::Command::ShowChat]
  def initialize(context)
  @context = context
  end
@@ -24,7 +24,7 @@ class LLM::Shell::Command
  private
 
  def emit
- IO.popen("less -FRX", "w") do |io|
+ pager do |io|
  messages.each.with_index do |message, index|
  next if index <= 1
  io << render(message) << "\n"
@@ -37,9 +37,10 @@ class LLM::Shell::Command
  def messages = bot.messages
  def render(message) = LLM::Shell::Renderer.new(message).render
 
- LLM.command "show-history" do |cmd|
- cmd.description "Show the full chat history"
+ LLM.command "show-chat" do |cmd|
+ cmd.description "Show the chat"
  cmd.register(self)
+ cmd.builtin!
  end
  end
  end
data/lib/llm/shell/commands/system_prompt.rb CHANGED
@@ -16,7 +16,11 @@ class LLM::Shell::Command
  ##
  # Emits the system prompt to standard output
  # @return [void]
- def call = puts render(bot.messages[0])
+ def call
+ pager do |io|
+ io.write render(bot.messages[0])
+ end
+ end
 
  private
 
@@ -25,6 +29,7 @@ class LLM::Shell::Command
  LLM.command "system-prompt" do |cmd|
  cmd.description "Show the system prompt"
  cmd.register(self)
+ cmd.builtin!
  end
  end
  end
data/lib/llm/shell/commands/utils.rb CHANGED
@@ -13,8 +13,9 @@ class LLM::Shell::Command
  ].join("\n")
  end
 
- def file_pattern = /\A<file path=(.+?)>/
  def bot = @context.bot
  def io = @context.io
+ def pager(...) = @context.pager(...)
+ def file_pattern = /\A<file path=(.+?)>/
  end
  end
data/lib/llm/shell/functions/read_file.rb ADDED
@@ -0,0 +1,22 @@
+ # frozen_string_literal: true
+
+ module LLM::Shell::Functions
+ class ReadFile
+ def call(path:)
+ {ok: true, content: File.read(path)}
+ rescue => ex
+ {ok: false, error: {class: ex.class.to_s, message: ex.message}}
+ end
+
+ private
+
+ LLM.function(:read_file) do |fn|
+ fn.description "Read the contents of a file"
+ fn.params do |schema|
+ schema.object(path: schema.string.required)
+ end
+ fn.register(self)
+ fn.builtin!
+ end
+ end
+ end
data/lib/llm/shell/functions/write_file.rb ADDED
@@ -0,0 +1,22 @@
+ # frozen_string_literal: true
+
+ module LLM::Shell::Functions
+ class WriteFile
+ def call(path:, content:)
+ {ok: true, content: File.binwrite(path, content)}
+ rescue => ex
+ {ok: false, error: {class: ex.class.to_s, message: ex.message}}
+ end
+
+ private
+
+ LLM.function(:write_file) do |fn|
+ fn.description "Write the contents of a file"
+ fn.params do |schema|
+ schema.object(path: schema.string.required, content: schema.string.required)
+ end
+ fn.register(self)
+ fn.builtin!
+ end
+ end
+ end
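Both built-in functions funnel success and failure into a `{ok: ...}` hash instead of raising, so the LLM receives errors as ordinary return values. A self-contained sketch of the convention using only the standard library (the temp-file setup is for illustration):

```ruby
require "tempfile"

# Success: {ok: true, content: ...}. Failure: {ok: false, error: {...}}.
# Rescuing broadly keeps exceptions from escaping to the tool-call layer.
def read_file(path:)
  {ok: true, content: File.read(path)}
rescue => ex
  {ok: false, error: {class: ex.class.to_s, message: ex.message}}
end

file = Tempfile.new("demo")
file.write("hello")
file.flush

p read_file(path: file.path)[:content]   # the file's contents
p read_file(path: "/no/such/file")[:ok]  # false: the error became data
```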
data/lib/llm/shell/markdown.rb CHANGED
@@ -1,71 +1,71 @@
  # frozen_string_literal: true
 
- class LLM::Shell
- class Markdown
- require "kramdown"
- require "coderay"
+ require "redcarpet"
+ require "coderay"
 
+ class LLM::Shell
+ ##
+ # @api private
+ # @see redcarpet https://github.com/vmg/redcarpet/blob/master/ext/redcarpet/markdown.h#L69-L110
+ class Markdown < Redcarpet::Render::Base
  ##
- # @param [String] text
- # @return [LLM::Shell::Markdown]
- def initialize(text)
- @document = Kramdown::Document.new preprocessor(text)
+ # Renders markdown text to a terminal-friendly format.
+ # @return [String]
+ def self.render(text)
+ renderer = Redcarpet::Markdown.new(self, options)
+ renderer.render(wrap(p: text)).strip
  end
 
  ##
- # @return [String]
- def to_ansi
- @document.root.children.map { |node| visit(node) }.join("\n")
+ # @api private
+ def self.wrap(p:, width: 80)
+ in_code = false
+ p.lines.map do |line|
+ if line =~ /^(\s*)(```|~~~)/
+ in_code = !in_code
+ line
+ elsif in_code || line =~ /^\s{4}/
+ line
+ else
+ line.gsub(/(.{1,#{width}})(\s+|\Z)/, "\\1\n")
+ end
+ end.join.strip + "\n"
  end
 
- private
+ ##
+ # @api private
+ def self.options
+ {
+ autolink: false, no_intra_emphasis: true,
+ fenced_code_blocks: true, lax_spacing: true,
+ strikethrough: true, superscript: true,
+ tables: true, with_toc_data: true
+ }
+ end
 
- def visit(node)
- case node.type
- when :header
- level = node.options[:level]
- color = levels[level]
- Paint[("#" * level) + " " + node.children.map { visit(_1) }.join, color]
- when :p
- node.children.map { visit(_1) }.join
- when :ul
- node.children.map { visit(_1) }.join("\n")
- when :li
- "• " + node.children.map { visit(_1) }.join
- when :em
- Paint[node.children.map { visit(_1) }.join, :italic]
- when :strong
- Paint[node.children.map { visit(_1) }.join, :bold]
- when :br
- "\n"
- when :codespan, :codeblock
- lines = node.value.each_line.to_a
- lang = lines[0].strip
- code = lines[1..].join
- if lines.size == 1
- Paint[node.value, :italic]
- else
- ["\n", Paint[">>> #{lang}", :blue, :bold],
- "\n\n", coderay(code, lang),
- "\n", Paint["<<< #{lang}", :blue, :bold]].join
- end
- when :smart_quote
- smart_quotes[node.value]
- when :text
- node.value
- else
- node.children.map { visit(_1) }.join
- end
+ def block_code(code, lang)
+ ["\n", Paint["#{lang}:", :blue, :bold],
+ "\n", coderay(code, lang),
+ "\n"].join
  end
 
- def preprocessor(text)
- text
- .gsub(/([^\n])\n(#+ )/, "\\1\n\n\\2")
- .gsub(/(#+ .+?)\n(?!\n)/, "\\1\n\n")
- .gsub(/\A<think>[\n]*<\/think>(?:\n)/, "")
- .gsub(/\A\n{2,}/, "")
+ def header(text, level)
+ color = levels.fetch(level, :white)
+ "\n" + Paint[("#" * level) + " " + text, color] + "\n"
  end
 
+ def paragraph(p) = "#{p.strip}\n\n"
+ def list(items, _type) = items
+ def list_item(item, _type) = "\n• #{item.strip}\n"
+ def emphasis(text) = Paint[text, :italic]
+ def double_emphasis(text) = Paint[text, :bold]
+ def codespan(code) = Paint[code, :yellow, :underline]
+ def block_quote(quote) = Paint[quote, :italic]
+ def normal_text(text) = text
+ def linebreak = "\n"
+
+ private
+
  def coderay(code, lang)
  CodeRay.scan(code, lang).terminal
  rescue ArgumentError
@@ -79,12 +79,5 @@ class LLM::Shell
  4 => :yellow, 5 => :red, 6 => :purple
  }
  end
-
- def smart_quotes
- {
- :lsquo => "'", :rsquo => "'",
- :ldquo => '"', :rdquo => '"'
- }
- end
  end
  end
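The `self.wrap` helper added above is plain Ruby with no Redcarpet involvement: it re-breaks prose at a column width while passing fenced or four-space-indented code through untouched. A standalone sketch of the same approach:

```ruby
# Re-break prose at `width` columns; leave code lines untouched.
# A fence line (``` or ~~~) toggles code mode; 4-space indentation
# also marks a code line. Mirrors the wrap helper in the diff above.
def wrap(text, width: 80)
  in_code = false
  text.lines.map do |line|
    if line =~ /^(\s*)(```|~~~)/
      in_code = !in_code
      line
    elsif in_code || line =~ /^\s{4}/
      line
    else
      line.gsub(/(.{1,#{width}})(\s+|\Z)/, "\\1\n")
    end
  end.join.strip + "\n"
end

long = ("lorem ipsum " * 20).strip
puts wrap(long, width: 40)  # every output line fits in 40 columns
```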
data/lib/llm/shell/renderer.rb CHANGED
@@ -28,20 +28,23 @@ class LLM::Shell
 
  private
 
+ attr_reader :message
+
  def render_message(message, color)
- role = Paint[message.role, :bold, color]
- title = "#{role} says: "
- if message.content =~ file_pattern
- path = message.content.match(file_pattern) ? Regexp.last_match[1] : nil
+ role = Paint[message.role, :bold, color]
+ title = "#{role} says: "
+ content = message.content
+ if message.tool_call?
+ body = "Tool call(s) request"
+ elsif message.tool_return?
+ body = "Tool call(s) return"
+ elsif content =~ file_pattern
+ path = content.match(file_pattern) ? Regexp.last_match[1] : nil
  body = "<file path=#{path} />"
  else
- body = markdown(wrap(message.content))
+ body = Markdown.render(content)
  end
  [title, "\n", body, "\n"].join
  end
-
- attr_reader :message
- def markdown(text) = Markdown.new(text).to_ansi
- def wrap(text, width = 80) = text.gsub(/(.{1,#{width}})(\s+|\Z)/, "\\1\n")
  end
  end
data/lib/llm/shell/repl.rb CHANGED
@@ -85,9 +85,9 @@ class LLM::Shell
  end
 
  def emit
- IO.popen("less -FRX", "w") do
- _1.write formatter(unread).format!(:user), "\n"
- _1.write formatter(unread).format!(:assistant), "\n"
+ LLM::Shell.pager do |io|
+ io.write formatter(unread).format!(:user), "\n"
+ io.write formatter(unread).format!(:assistant), "\n"
  end unless unread.empty?
  end
 
data/lib/llm/shell/version.rb CHANGED
@@ -4,5 +4,5 @@ module LLM
  end unless defined?(LLM)
 
  class LLM::Shell
- VERSION = "0.7.1"
+ VERSION = "0.8.0"
  end
data/lib/llm/shell.rb CHANGED
@@ -7,6 +7,7 @@ require "llm"
  require "paint"
 
  class LLM::Shell
+ require_relative "function"
  require_relative "../io/line"
  require_relative "shell/command"
  require_relative "shell/command/extension"
@@ -24,6 +25,13 @@ class LLM::Shell
  # Load all commands
  Dir[File.join(__dir__, "shell", "commands", "*.rb")].each { require(_1) }
 
+ ##
+ # Opens a pager
+ # @return [void]
+ def self.pager(...)
+ IO.popen("less -FRX", "w", ...)
+ end
+
  ##
  # @return [String]
  def self.home
@@ -33,7 +41,7 @@
  ##
  # @return [Array<String>]
  def self.tools
- Dir[File.join(home, "tools", "*.rb")]
+ Dir[*TOOLGLOBS]
  end
 
  ##
@@ -42,6 +50,12 @@
  Dir[File.join(home, "commands", "*.rb")]
  end
 
+ TOOLGLOBS = [
+ File.join(home, "tools", "*.rb"),
+ File.join(__dir__, "shell", "functions", "*.rb")
+ ].freeze
+ private_constant :TOOLGLOBS
+
  ##
  # @param [Hash] options
  # @return [LLM::Shell]
@@ -63,14 +77,8 @@ class LLM::Shell
  private
 
  def tools
- LLM::Shell.tools.filter_map do |path|
- name = File.basename(path, File.extname(path))
- if options.tools.include?(name)
- print Paint["llm-shell: ", :green], "load #{name} tool", "\n"
- eval File.read(path), TOPLEVEL_BINDING, path, 1
- else
- print Paint["llm-shell: ", :yellow], "skip #{name} tool", "\n"
- end
+ LLM::Shell.tools.map do |path|
+ eval File.read(path), TOPLEVEL_BINDING, path, 1
  end.grep(LLM::Function)
  end
data/lib/llm/shell/options.rb CHANGED
@@ -27,7 +27,6 @@ def option_parser
  o.on("-h [HOST]", "--host [HOST]", "Optional. Sometimes required by ollama.", String)
  o.on("-o [PORT]", "--port [PORT]", "Optional. Sometimes required by ollama.", Integer)
  o.on("-f [GLOB]", "--files [GLOB]", "Optional. Glob pattern(s) separated by a comma.", Array)
- o.on("-t [TOOLS]", "--tools [TOOLS]", "Optional. One or more tool names to load automatically.", Array)
  o.on("-r [PROMPT]", "--prompt [PROMPT]", "Optional. The prompt to use.", String)
  o.on("-v", "--version", "Optional. Print the version and exit.")
  end
data/llm-shell.gemspec ADDED
@@ -0,0 +1,45 @@
+ # frozen_string_literal: true
+
+ require_relative "lib/llm/shell/version"
+
+ Gem::Specification.new do |spec|
+ spec.name = "llm-shell"
+ spec.version = LLM::Shell::VERSION
+ spec.authors = ["Antar Azri", "0x1eef"]
+ spec.email = ["azantar@proton.me", "0x1eef@proton.me"]
+
+ spec.summary = "llm-shell is an extensible, developer-oriented " \
+ "command-line console that can interact with multiple " \
+ "Large Language Models (LLMs)."
+ spec.description = spec.summary
+ spec.homepage = "https://github.com/llmrb/llm-shell"
+ spec.license = "0BSD"
+ spec.required_ruby_version = ">= 3.2"
+
+ spec.metadata["homepage_uri"] = spec.homepage
+ spec.metadata["source_code_uri"] = spec.homepage
+
+ spec.files = Dir[
+ "README.md", "LICENSE",
+ "lib/*.rb", "lib/**/*.rb",
+ "libexec/*", "libexec/**/*",
+ "share/llm-shell/prompts/*",
+ "bin/*", "llm-shell.gemspec"
+ ]
+ spec.require_paths = ["lib"]
+ spec.executables = ["llm-shell"]
+ spec.add_dependency "llm.rb", "~> 0.11"
+ spec.add_dependency "paint", "~> 2.1"
+ spec.add_dependency "redcarpet", "~> 3.6"
+ spec.add_dependency "coderay", "~> 1.1"
+ spec.add_development_dependency "webmock", "~> 3.24.0"
+ spec.add_development_dependency "yard", "~> 0.9.37"
+ spec.add_development_dependency "kramdown", "~> 2.4"
+ spec.add_development_dependency "webrick", "~> 1.8"
+ spec.add_development_dependency "test-cmd.rb", "~> 0.12.0"
+ spec.add_development_dependency "rake", "~> 13.0"
+ spec.add_development_dependency "rspec", "~> 3.0"
+ spec.add_development_dependency "standard", "~> 1.40"
+ spec.add_development_dependency "vcr", "~> 6.0"
+ spec.add_development_dependency "dotenv", "~> 2.8"
+ end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: llm-shell
  version: !ruby/object:Gem::Version
- version: 0.7.1
+ version: 0.8.0
  platform: ruby
  authors:
  - Antar Azri
@@ -16,14 +16,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.10'
+ version: '0.11'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.10'
+ version: '0.11'
  - !ruby/object:Gem::Dependency
  name: paint
  requirement: !ruby/object:Gem::Requirement
@@ -39,19 +39,19 @@ dependencies:
  - !ruby/object:Gem::Version
  version: '2.1'
  - !ruby/object:Gem::Dependency
- name: kramdown
+ name: redcarpet
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.5'
+ version: '3.6'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.5'
+ version: '3.6'
  - !ruby/object:Gem::Dependency
  name: coderay
  requirement: !ruby/object:Gem::Requirement
@@ -220,25 +220,30 @@ files:
  - bin/llm-shell
  - lib/io/line.rb
  - lib/llm-shell.rb
+ - lib/llm/function.rb
  - lib/llm/shell.rb
  - lib/llm/shell/command.rb
  - lib/llm/shell/command/extension.rb
  - lib/llm/shell/commands/clear_screen.rb
  - lib/llm/shell/commands/dir_import.rb
  - lib/llm/shell/commands/file_import.rb
- - lib/llm/shell/commands/show_history.rb
+ - lib/llm/shell/commands/help.rb
+ - lib/llm/shell/commands/show_chat.rb
  - lib/llm/shell/commands/system_prompt.rb
  - lib/llm/shell/commands/utils.rb
  - lib/llm/shell/completion.rb
  - lib/llm/shell/config.rb
  - lib/llm/shell/default.rb
  - lib/llm/shell/formatter.rb
+ - lib/llm/shell/functions/read_file.rb
+ - lib/llm/shell/functions/write_file.rb
  - lib/llm/shell/markdown.rb
  - lib/llm/shell/options.rb
  - lib/llm/shell/renderer.rb
  - lib/llm/shell/repl.rb
  - lib/llm/shell/version.rb
  - libexec/llm-shell/shell
+ - llm-shell.gemspec
  - share/llm-shell/prompts/default.txt
  homepage: https://github.com/llmrb/llm-shell
  licenses: