ollama_chat 0.0.72 → 0.0.73

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: aba37f4f59280b1d21b76df79cb20b1e6d4f1b7e05fd91dfd6940368ff8a168f
- data.tar.gz: 88d8d50a503c98973159f2874c0f04a4139e27469363c77b66560e0a965cf890
+ metadata.gz: a7303f13e1f7757bf9679066210841e544a07e4ff6b913d13b89d824e4b9eae5
+ data.tar.gz: 96cfd6b581dee08207fb3b42e3b4777d106692dd4a08774c268fad61ed0b18cc
  SHA512:
- metadata.gz: 9bc913077bc16ded309925188841e5f803521df0582a47c747fd20c31b03b4a15cfdabf324f683a910d3f9de4f2003b8b31867a4da052e84c072736033f613a2
- data.tar.gz: 9df32e077b73b959ddff9754241fe10a507b8b861e917c59dd6a6032ed4eb4d520ce8bf3047d4401f06e528c17de55d2cba12d26137d52b8fc69eeddf0d17045
+ metadata.gz: f247764f56a31e93b7dd0d38e7069dac61291bbfce9c7d2718aa2492a7786d72d1ef63e82130e8b39148eb9c335d4e46dca74506d95aacc8432646ef46c6ce3a
+ data.tar.gz: 2a6de055cc188e56b939261f6866affd3de469585d7f9d897e4e658055555427c140a7c3e2709fd22d554c5f9bd28c0748636477a94a597c939e2e6fbfce513e
data/CHANGES.md CHANGED
@@ -1,5 +1,45 @@
  # Changes
 
+ ## 2026-03-05 v0.0.73
+
+ - Renamed the `"/regenerate"` command to `"/revise"` with an optional `edit`
+   subcommand in `chat.rb`.
+ - Replaced the `revise_last` method with `change_response` in
+   `message_editing.rb`.
+ - Added a new `edit_text` method that uses temporary files for external editor
+   integration.
+ - Updated help text and documentation references throughout the codebase.
+ - Adjusted tests in `chat_spec.rb` and `message_editing_spec.rb` accordingly.
+ - Added a `ModelMetadata` struct to encapsulate model information, including
+   capabilities.
+ - Introduced `OllamaChat::UnknownModelError` for better error handling when
+   models are not found.
+ - Implemented a `can?` method on `ModelMetadata` for clean capability checks,
+   e.g., `@model_metadata.can?('thinking')`.
+ - Updated `pull_model_unless_present` to return a `ModelMetadata` instance
+   instead of a system prompt string.
+ - Enhanced the `"/model"` command with proper error handling using the new
+   exception class.
+ - Added a capabilities display in the information output: `STDOUT.puts
+   "Capabilities: #{Array(@model_metadata.capabilities) * ', '}"`.
+ - Refactored the `use_model` method for cleaner model selection flow.
+ - Updated tests to validate capability parsing from API responses (e.g.,
+   `"completion","tools"` for **8.0B** models).
+ - Modified `setup_system_prompt` to use the new metadata system
+   (`@model_metadata.system`) instead of the old approach.
+ - Improved error handling in chat initialization and the `"/model"` command
+   with specific model-not-found errors.
+ - Removed an unused parameter from the `pull_model_unless_present` method
+   signature.
+ - Updated `OllamaChat::Chat` to handle both `think?` and `tools_support?`
+   conditions in error recovery logic.
+ - Modified the rescue block to check `(think? || tools_support.on?)` instead of
+   just `think?`.
+ - Added a `tools_support.set false` call alongside disabling think mode when
+   errors occur.
+ - Improved the error message to reflect that both modes are disabled during
+   retry attempts.
+
  ## 2026-03-04 v0.0.72
 
  - Introduced `OllamaChat::PersonaeManagement` module and `/persona` command
data/README.md CHANGED
@@ -152,47 +152,47 @@ subject - the young, blue-eyed cat.
  The following commands can be given inside the chat, if prefixed by a `/`:
 
  ```
- /reconnect reconnect to current ollama server
- /copy to copy last response to clipboard
- /paste to paste content
- /markdown toggle markdown output
- /stream toggle stream output
- /location toggle location submission
- /voice [change] toggle voice output or change the voice
- /last [n] show the last n / 1 system/assistant message
- /list [n] list the last n / all conversation exchanges
- /clear [what] clear what=messages|links|history|tags|all
- /clobber clear the conversation, links, and collection
- /drop [n] drop the last n exchanges, defaults to 1
- /model change the model
- /system [show] change/show system prompt
- /prompt prefill user prompt with preset prompts
- /persona add|delete|edit|file|info|list|load|play manage and load/play personas for roleplay
- /regenerate the last answer message
- /collection [clear|change] change (default) collection or clear
- /info show information for current session
- /config [edit|reload] output/edit/reload current configuration ("/Users/flori/.config/ollama_chat/config.yml")
- /document_policy pick a scan policy for document references
- /think choose ollama think mode setting for models
- /think_loud enable to think out loud instead of silently
- /import source import the source's content
- /summarize [n] source summarize the source's content in n words
- /embedding toggle embedding paused or not
- /embed source embed the source's content
- /web [n] query query web & for n(=1) results (policy: importing)
- /links [clear] display (or clear) links used in the chat
- /save filename store conversation messages
- /load filename load conversation messages
- /compose compose content using an EDITOR
- /input [pattern] select and read content from a file (default: **/*)
- /context [pattern...] collect context with glob patterns
- /revise_last edit the last response in an external editor
- /output filename save last response to filename
- /pipe command write last response to command's stdin
- /tools [enable|disable|on|off] list enabled, enable/disable tools, support on/off
- /vim insert the last message into a vim server
- /quit to quit
- /help [me] to view this help (me = interactive ai help)
+ /reconnect reconnect to current ollama server
+ /copy to copy last response to clipboard
+ /paste to paste content
+ /markdown toggle markdown output
+ /stream toggle stream output
+ /location toggle location submission
+ /voice [change] toggle voice output or change the voice
+ /last [n] show the last n / 1 system/assistant message
+ /list [n] list the last n / all conversation exchanges
+ /clear [what] clear what=messages|links|history|tags|all
+ /clobber clear the conversation, links, and collection
+ /drop [n] drop the last n exchanges, defaults to 1
+ /model change the model
+ /system [show] change/show system prompt
+ /prompt prefill user prompt with preset prompts
+ /persona [add|delete|edit|file|info|list|load|play] manage and load/play personas for roleplay
+ /revise [edit] the last answer message
+ /collection [clear|change] change (default) collection or clear
+ /info show information for current session
+ /config [edit|reload] output/edit/reload current configuration ("/Users/flori/.config/ollama_chat/config.yml")
+ /document_policy pick a scan policy for document references
+ /think choose ollama think mode setting for models
+ /think_loud enable to think out loud instead of silently
+ /import source import the source's content
+ /summarize [n] source summarize the source's content in n words
+ /embedding toggle embedding paused or not
+ /embed source embed the source's content
+ /web [n] query query web & for n(=1) results (policy: importing)
+ /links [clear] display (or clear) links used in the chat
+ /save filename store conversation messages
+ /load filename load conversation messages
+ /compose compose content using an EDITOR
+ /input [pattern] select and read content from a file (default: **/*)
+ /context [pattern...] collect context with glob patterns
+ /change_response edit the last response in an external editor
+ /output filename save last response to filename
+ /pipe command write last response to command's stdin
+ /tools [enable|disable|on|off] list enabled, enable/disable tools, support on/off
+ /vim insert the last message into a vim server
+ /quit to quit
+ /help [me] to view this help (me = interactive ai help)
  ```
 
  ### Using `ollama_chat_send` to send input to a running `ollama_chat`
@@ -97,9 +97,12 @@ class OllamaChat::Chat
    setup_switches(config)
    setup_state_selectors(config)
    connect_ollama
-   @model = choose_model(@opts[?m], config.model.name)
-   @model_options = Ollama::Options[config.model.options]
-   @model_system = pull_model_unless_present(@model, @model_options)
+   begin
+     use_model(@opts[?m].full? || config.model.name)
+   rescue OllamaChat::UnknownModelError => e
+     abort "Failed to use to model: #{e}"
+   end
+   @model_options = Ollama::Options[config.model.options]
    if conversation_file = @opts[?c]
      messages.load_conversation(conversation_file)
    elsif backup_file = OC::XDG_CACHE_HOME + 'backup.json' and backup_file.exist?
@@ -162,6 +165,14 @@ class OllamaChat::Chat
    # The start method initializes the chat session by displaying information,
    # then prompts the user for input to begin interacting with the chat.
    def start
+     if think? && !@model_metadata.can?('thinking')
+       think_mode.selected = 'disabled'
+     end
+
+     if tools_support.on? && !@model_metadata.can?('tools')
+       tools_support.set false
+     end
+
      info
      STDOUT.puts "\nType /help to display the chat help."
@@ -240,7 +251,11 @@ class OllamaChat::Chat
      messages.show_last
      :next
    when %r(^/model$)
-     @model = choose_model('', @model)
+     begin
+       use_model
+     rescue OllamaChatError::UnknownModelError => e
+       STDERR.puts "Caught #{e.class}: #{e}"
+     end
      :next
    when %r(^/system(?:\s+(show))?$)
      if $1 != 'show'
@@ -251,10 +266,14 @@ class OllamaChat::Chat
    when %r(^/prompt)
      @prefill_prompt = choose_prompt
      :next
-   when %r(^/regenerate$)
+   when %r(^/revise(?:\s+(edit))?$)
+     subcommand = $1
      if content = messages.second_last&.content
        content.gsub!(/\nConsider these chunks for your answer.*\z/, '')
        messages.drop(1)
+       if subcommand == 'edit'
+         content = edit_text(content)
+       end
      else
        STDOUT.puts "Not enough messages in this conversation."
        return :redo
@@ -328,8 +347,8 @@ class OllamaChat::Chat
      context_spook(patterns) or :next
    when %r(^/compose$)
      compose or :next
-   when %r(^/revise_last$)
-     revise_last
+   when %r(^/change_response$)
+     change_response
      :next
    when %r(^/save\s+(.+)$)
      save_conversation($1)
@@ -689,10 +708,11 @@ class OllamaChat::Chat
      &handler
    )
  rescue Ollama::Errors::BadRequestError
-   if think? && !retried
-     STDOUT.puts "#{bold('Error')}: in think mode, switch thinking off and retry."
+   if (think? || tools_support.on?) && !retried
+     STDOUT.puts "#{bold('Error')}: in think mode/with tool support, switch both off and retry."
      sleep 1
      think_mode.selected = 'disabled'
+     tools_support.set false
      retried = true
      retry
    else
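The rescue block above uses a flag-guarded `retry`: on the first `BadRequestError` it disables both modes and retries the request exactly once; a second failure propagates. A minimal self-contained sketch of that control flow, with a simulated server rejection standing in for the real Ollama client call (the method name and error class here are illustrative, not from the gem):

```ruby
# Flag-guarded single retry: the fake backend rejects requests with
# thinking or tools enabled; after one failure both are switched off
# and the begin block re-runs via `retry`.
def send_request(think:, tools:)
  retried = false
  begin
    # Simulated stand-in for Ollama::Errors::BadRequestError.
    raise ArgumentError, 'bad request' if think || tools
    :ok
  rescue ArgumentError
    if (think || tools) && !retried
      think   = false    # mirrors think_mode.selected = 'disabled'
      tools   = false    # mirrors tools_support.set false
      retried = true
      retry
    else
      raise
    end
  end
end
```

Because `retried` is set before `retry`, the rescue can never loop: a request that still fails with both modes off re-raises instead of retrying forever.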
@@ -777,7 +797,7 @@ class OllamaChat::Chat
    # retrieves the system prompt from a file or uses the default value, then
    # sets it in the message history.
    def setup_system_prompt
-     default = config.system_prompts.default? || @model_system
+     default = config.system_prompts.default? || @model_metadata.system
      if @opts[?s] =~ /\A\?/
        change_system_prompt(default, system: @opts[?s])
      else
@@ -797,7 +817,7 @@ class OllamaChat::Chat
    if embedding.on?
      @embedding_model = config.embedding.model.name
      @embedding_model_options = Ollama::Options[config.embedding.model.options]
-     pull_model_unless_present(@embedding_model, @embedding_model_options)
+     pull_model_unless_present(@embedding_model)
      collection = @opts[?C] || config.embedding.collection
      @documents = Documentrix::Documents.new(
        ollama:,
@@ -13,6 +13,27 @@ module OllamaChat::FileEditing
      system Shellwords.join([ editor, filename ])
    end
 
+   # The edit_text method temporarily writes the given text to a file,
+   # attempts to edit it using an external editor, and returns the edited
+   # content if successful.
+   #
+   # @param text [String] the text to be edited
+   #
+   # @return [String, nil] the edited text or nil if editing failed
+   def edit_text(text)
+     Tempfile.open do |tmp|
+       tmp.write(text)
+       tmp.flush
+
+       if result = edit_file(tmp.path)
+         new_text = File.read(tmp.path)
+         return new_text
+       else
+         STDERR.puts "Editor failed to edit message."
+       end
+     end
+   end
+
    # The vim method creates and returns a new Vim instance for interacting with
    # a Vim server.
    #
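The new `edit_text` follows a common Tempfile round-trip: write the text out, flush so the editor sees it, let the editor mutate the file, then read the result back. A runnable sketch of that pattern, where a block stands in for the external editor spawned by `edit_file` (the helper name `edit_text_with` is illustrative, not part of the gem):

```ruby
require 'tempfile'

# Tempfile round-trip as used by edit_text: the block receives the temp
# file's path and edits it in place, standing in for the $EDITOR call.
def edit_text_with(text)
  Tempfile.open('ollama_chat') do |tmp|
    tmp.write(text)
    tmp.flush                    # make the content visible to the "editor"
    yield tmp.path               # stands in for edit_file(tmp.path)
    return File.read(tmp.path)   # read back whatever the editor left behind
  end
end

edited = edit_text_with("draft answer") do |path|
  File.write(path, File.read(path).upcase)
end
# edited == "DRAFT ANSWER"
```

`Tempfile.open` with a block unlinks the temp file when the block exits, so the round-trip leaves nothing behind even when returning early.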
@@ -85,6 +85,7 @@ module OllamaChat::Information
    STDOUT.puts "Running ollama_chat version: #{bold(OllamaChat::VERSION)}"
    STDOUT.puts "Connected to ollama server version: #{bold(server_version)} on: #{bold(server_url)}"
    STDOUT.puts "Current conversation model is #{bold{@model}}."
+   STDOUT.puts " Capabilities: #{Array(@model_metadata.capabilities) * ', '}"
    if @model_options.present?
      STDOUT.puts " Options: #{JSON.pretty_generate(@model_options).gsub(/(?<!\A)^/, ' ')}"
    end
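The added capabilities line leans on two small Ruby idioms: `Array()` coerces `nil` to `[]`, so a model without capability data prints an empty list instead of raising, and `*` with a string argument is shorthand for `Array#join`:

```ruby
# Array() turns nil into [] so the display never crashes on missing data;
# ary * 'sep' is equivalent to ary.join('sep').
Array(nil) * ', '                       # => ""
Array(%w[completion tools]) * ', '      # => "completion, tools"
```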
@@ -117,47 +118,47 @@ module OllamaChat::Information
    # interface.
    private def display_chat_help_message
      <<~EOT
- /reconnect reconnect to current ollama server
- /copy to copy last response to clipboard
- /paste to paste content
- /markdown toggle markdown output
- /stream toggle stream output
- /location toggle location submission
- /voice [change] toggle voice output or change the voice
- /last [n] show the last n / 1 system/assistant message
- /list [n] list the last n / all conversation exchanges
- /clear [what] clear what=messages|links|history|tags|all
- /clobber clear the conversation, links, and collection
- /drop [n] drop the last n exchanges, defaults to 1
- /model change the model
- /system [show] change/show system prompt
- /prompt prefill user prompt with preset prompts
- /persona add|delete|edit|file|info|list|load|play manage and load/play personas for roleplay
- /regenerate the last answer message
- /collection [clear|change] change (default) collection or clear
- /info show information for current session
- /config [edit|reload] output/edit/reload current configuration (#{@ollama_chat_config.filename.to_s.inspect})
- /document_policy pick a scan policy for document references
- /think choose ollama think mode setting for models
- /think_loud enable to think out loud instead of silently
- /import source import the source's content
- /summarize [n] source summarize the source's content in n words
- /embedding toggle embedding paused or not
- /embed source embed the source's content
- /web [n] query query web & for n(=1) results (policy: #{document_policy})
- /links [clear] display (or clear) links used in the chat
- /save filename store conversation messages
- /load filename load conversation messages
- /compose compose content using an EDITOR
- /input [pattern] select and read content from a file (default: **/*)
- /context [pattern...] collect context with glob patterns
- /revise_last edit the last response in an external editor
- /output filename save last response to filename
- /pipe command write last response to command's stdin
- /tools [enable|disable|on|off] list enabled, enable/disable tools, support on/off
- /vim insert the last message into a vim server
- /quit to quit
- /help [me] to view this help (me = interactive ai help)
+ /reconnect reconnect to current ollama server
+ /copy to copy last response to clipboard
+ /paste to paste content
+ /markdown toggle markdown output
+ /stream toggle stream output
+ /location toggle location submission
+ /voice [change] toggle voice output or change the voice
+ /last [n] show the last n / 1 system/assistant message
+ /list [n] list the last n / all conversation exchanges
+ /clear [what] clear what=messages|links|history|tags|all
+ /clobber clear the conversation, links, and collection
+ /drop [n] drop the last n exchanges, defaults to 1
+ /model change the model
+ /system [show] change/show system prompt
+ /prompt prefill user prompt with preset prompts
+ /persona [add|delete|edit|file|info|list|load|play] manage and load/play personas for roleplay
+ /revise [edit] the last answer message
+ /collection [clear|change] change (default) collection or clear
+ /info show information for current session
+ /config [edit|reload] output/edit/reload current configuration (#{@ollama_chat_config.filename.to_s.inspect})
+ /document_policy pick a scan policy for document references
+ /think choose ollama think mode setting for models
+ /think_loud enable to think out loud instead of silently
+ /import source import the source's content
+ /summarize [n] source summarize the source's content in n words
+ /embedding toggle embedding paused or not
+ /embed source embed the source's content
+ /web [n] query query web & for n(=1) results (policy: #{document_policy})
+ /links [clear] display (or clear) links used in the chat
+ /save filename store conversation messages
+ /load filename load conversation messages
+ /compose compose content using an EDITOR
+ /input [pattern] select and read content from a file (default: **/*)
+ /context [pattern...] collect context with glob patterns
+ /change_response edit the last response in an external editor
+ /output filename save last response to filename
+ /pipe command write last response to command's stdin
+ /tools [enable|disable|on|off] list enabled, enable/disable tools, support on/off
+ /vim insert the last message into a vim server
+ /quit to quit
+ /help [me] to view this help (me = interactive ai help)
      EOT
    end
 
@@ -1,14 +1,12 @@
  # A module that provides message editing functionality for OllamaChat.
  #
  # The MessageEditing module encapsulates methods for modifying existing chat
- # messages using an external editor. It allows users to edit the last message
- # in the conversation, whether it's a system prompt, user message, or assistant
- # response.
+ # messages using an external editor.
  module OllamaChat::MessageEditing
    private
 
-   # The revise_last method opens the last message in an external editor for
-   # modification.
+   # The change_response method opens the last message (usually the assistant's
+   # response) in an external editor for modification.
    #
    # This method retrieves the last message from the conversation, writes its
    # content to a temporary file, opens that file in the configured editor,
@@ -16,7 +14,7 @@ module OllamaChat::MessageEditing
    # completion.
    #
    # @return [String, nil] the edited content if successful, nil otherwise
-   def revise_last
+   def change_response
      if message = @messages.last
        Tempfile.open do |tmp|
          tmp.write(message.content)
@@ -34,7 +32,7 @@ module OllamaChat::MessageEditing
        end
      end
    else
-     STDERR.puts "No message available to revise."
+     STDERR.puts "No message available to change."
    end
    nil
  end
@@ -16,16 +16,39 @@
  # @example Ensuring a model is available locally
  #   chat.pull_model_unless_present('phi3', {})
  module OllamaChat::ModelHandling
+
+   # A simple data structure representing metadata about a model.
+   #
+   # @attr_reader name [String] the name of the model
+   # @attr_reader system [String] the system prompt associated with the model
+   # @attr_reader capabilities [Array<String>] the capabilities supported by the model
+   class ModelMetadata < Struct.new(:name, :system, :capabilities)
+     # Checks if the given capability is included in the object's capabilities.
+     #
+     # @param capability [String] the capability to check for
+     # @return [true, false] true if the capability is present, false otherwise
+     def can?(capability)
+       Array(capabilities).member?(capability)
+     end
+   end
+
    private
 
-   # The model_present? method checks if the specified Ollama model is available.
+   # The model_present? method checks if the specified Ollama model is
+   # available.
    #
    # @param model [ String ] the name of the Ollama model
    #
-   # @return [ String, FalseClass ] the system prompt if the model is present,
+   # @return [ ModelMetadata, FalseClass ] if the model is present,
    #   false otherwise
    def model_present?(model)
-     ollama.show(model:) { return _1.system.to_s }
+     ollama.show(model:) do |md|
+       return ModelMetadata.new(
+         model,
+         md.system,
+         md.capabilities,
+       )
+     end
    rescue Ollama::Errors::NotFoundError
      false
    end
@@ -39,37 +62,27 @@ module OllamaChat::ModelHandling
      ollama.pull(model:)
    end
 
-   # The pull_model_unless_present method checks if the specified model is
-   # present on the system.
-   #
-   # If the model is already present, it returns the system prompt if it is
-   # present.
+   # The pull_model_unless_present method ensures that a specified model is
+   # available on the Ollama server. It first checks if the model metadata
+   # exists locally; if not, it pulls the model from a remote source and
+   # verifies its presence again. If the model still cannot be found, it raises
+   # an UnknownModelError indicating the missing model name.
    #
-   # Otherwise, it attempts to pull the model from the remote server using the
-   # pull_model_from_remote method. If the model is still not found after
-   # pulling, it exits the program with a message indicating that the model was
-   # not found remotely.
+   # @param model [String] the name of the model to ensure is present
    #
-   # @param model [ String ] The name of the model to check for presence.
-   # @param options [ Hash ] Options for the pull_model_from_remote method.
-   #
-   # @return [ String, FalseClass ] the system prompt if the model and it are
-   #   present, false otherwise.
-   def pull_model_unless_present(model, options)
-     if system = model_present?(model)
-       return system.full?
+   # @return [ModelMetadata] the metadata for the available model
+   # @raise [OllamaChat::UnknownModelError] if the model cannot be found after
+   #   attempting to pull it from remote
+   def pull_model_unless_present(model)
+     if model_metadata = model_present?(model)
+       return model_metadata
      else
        pull_model_from_remote(model)
-       if system = model_present?(model)
-         return system.full?
-       else
-         STDOUT.puts "Model #{bold{model}} not found remotely. => Exiting."
-         exit 1
+       if model_metadata = model_present?(model)
+         return model_metadata
        end
+       raise OllamaChat::UnknownModelError, "unknown model named #{@model.inspect}"
      end
-   rescue Ollama::Errors::Error => e
-     warn "Caught #{e.class} while pulling model: #{e} => Exiting."
-     exit 1
    end
 
    # The model_with_size method formats a model's size for display
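The rewritten method implements a check → pull → re-check → raise flow, replacing the old `exit 1` with a recoverable exception. A stubbed sketch of that flow, where plain hashes stand in for the local registry and the remote server (the `ensure_model` helper and hash-based "pull" are illustrative, not the gem's API):

```ruby
# Check locally, "pull" from remote on a miss, re-check, and raise a
# recoverable error instead of exiting when the model is still unknown.
class UnknownModelError < StandardError; end

def ensure_model(name, local, remote)
  return local[name] if local.key?(name)          # first check
  local[name] = remote[name] if remote.key?(name) # "pull" from remote
  local[name] or                                  # re-check, then raise
    raise UnknownModelError, "unknown model named #{name.inspect}"
end

local  = { 'llama3.1' => :cached }
remote = { 'phi3' => :pulled }
ensure_model('llama3.1', local, remote)  # => :cached, no pull needed
ensure_model('phi3', local, remote)      # => :pulled, now cached in local
```

Raising instead of exiting is what lets the `/model` command catch the error and keep the chat session alive rather than terminating the process.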
@@ -91,6 +104,15 @@ module OllamaChat::ModelHandling
      result
    end
 
+   def use_model(model = nil)
+     if model.nil?
+       @model = choose_model('', @model)
+     else
+       @model = choose_model(model, config.model.name)
+     end
+     @model_metadata = pull_model_unless_present(@model)
+   end
+
    # The choose_model method selects a model from the available list based on
    # CLI input or user interaction.
    # It processes the provided CLI model parameter to determine if a regex