ollama_chat 0.0.17 → 0.0.19

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 1ca8b4d15da972d1600ccab9349f1852ccb75f2733cd9a85ccfb57f2890cb786
-  data.tar.gz: 3ec2450f156b98ac722b58358de3190e58d30ebf6f8e09dd435804b409c1e1e0
+  metadata.gz: 56f1fbbeb7e84fe906636ac0e16d5223f4968a063ee73c862398bdd575b8409b
+  data.tar.gz: f9ffc5725d73ad54efbd55239c55a615f0cd630c682d293c11fa461b72be17d9
 SHA512:
-  metadata.gz: 98a1335f95549196c6a6a486465c3dad12694b8006c18a7ff2619be8a866b826869ed35dde99d810fea330101e67361142000ac705e5784450bc9ae11952f8d7
-  data.tar.gz: dce73464d416d095eee8d86723ca241cc7affdbc87b4472bf4e0cf0caf084f52d2c47810fd3fe8ec67cb883df85e58098caae1ad45c4088850d98cc37705c2b0
+  metadata.gz: b331c3ca96198b45e8cf6531a1e7e0f7542607a13512d516767180ae7a21c4cc6ba3eda280a24ee0472a9bf24a45a2acaba586f635dc8e1fe3f6e459b567f574
+  data.tar.gz: 0f2da5cc8c319d414d812af7067701567f463c8b9db6507f1cdb34eec880bb05204ddd1111a448bad289e8299858a59534315da2091287b5b033e0a85d28c7f1
data/CHANGES.md CHANGED
@@ -1,5 +1,52 @@
 # Changes
 
+## 2025-08-11 v0.0.19
+
+* Added `/last` command to show the last assistant message:
+  * Introduced `show_last` method in the `MessageList` class to display the
+    last non-user message.
+  * Extracted message formatting logic into a `message_text_for` method for
+    better code organization.
+  * Updated documentation comments for improved clarity.
+  * Updated `README.md` to document the new `/last` command.
+* Added `/output` and `/pipe` commands for response handling:
+  * Introduced the `OllamaChat::MessageOutput` module with `pipe` and
+    `output` methods.
+  * Updated `MessageList#save_conversation` and
+    `MessageList#load_conversation` to use `STDERR` for errors.
+  * Added comprehensive error handling with exit code checking for pipe
+    operations.
+  * Updated help text to document the new `/output` and `/pipe` commands.
+* Sorted prompt lists for consistent ordering:
+  * Ensured predictable prompt selection order in the dialog interface.
+* Removed the `RSpec.describe` syntax in favor of bare `describe`.
+* Supported the application/xml content type for RSS parsing:
+  * Added `application/xml` MIME type support alongside the existing
+    `text/xml`.
+  * Updated condition matching in the `OllamaChat::Parsing` module.
+  * Added a test case for `application/xml` RSS parsing.
+* Maintained other development dependencies at their current versions.
+* Updated error message wording in the parsing module.
+
+## 2025-07-31 v0.0.18
+
+* **Added /prompt command**: The `/prompt` command was added to the list of
+  supported commands, allowing users to prefill their input with text from
+  predefined prompts.
+  + Integrated prompt handling in `lib/ollama_chat/chat.rb`, where a new
+    case statement for `/prompt` sets up the prefill functionality.
+  + Implemented prompt selection using the `choose_prompt` method in
+    `lib/ollama_chat/dialog.rb`.
+  + Set up input hooks using `Reline.pre_input_hook` to insert selected
+    prompts before user input.
+* **Improved user interaction**:
+  - Added model size display during model selection via the
+    `model_with_size` method in `lib/ollama_chat/dialog.rb`.
+  - Updated the model selection logic to include formatted sizes in the
+    display.
+* **Optimized voice list generation**: In
+  `lib/ollama_chat/ollama_chat_config/default_config.yml`, updated the voice
+  list generation logic to use a more efficient method of retrieving voice
+  names.
+
 ## 2025-07-14 v0.0.17
 
 * Implement Pager Support for List Command
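The `/prompt` prefill described in the v0.0.18 entry strips trailing newlines from the chosen prompt before inserting it into the input line, so the cursor lands at the end of the text. A minimal standalone sketch of that normalization (hypothetical variable names; in the gem this runs inside a `Reline.pre_input_hook`):

```ruby
# Strip trailing newlines from a prompt before it is prefilled into the
# readline buffer, as the chat does with `gsub(/\n*\z/, '')`.
prefill    = "Summarize the following text:\n\n"
normalized = prefill.gsub(/\n*\z/, '')  # remove any run of newlines at the end
```

Without this step the inserted text would end with a newline and the prefilled input could be submitted before the user gets to edit it.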
data/README.md CHANGED
@@ -125,12 +125,14 @@ The following commands can be given inside the chat, if prefixed by a `/`:
 /stream toggle stream output
 /location toggle location submission
 /voice [change] toggle voice output or change the voice
+/last show the last system/assistant message
 /list [n] list the last n / all conversation exchanges
 /clear [what] clear what=messages|links|history|tags|all
 /clobber clear the conversation, links, and collection
 /drop [n] drop the last n exchanges, defaults to 1
 /model change the model
 /system [show] change/show system prompt
+/prompt prefill user prompt with preset prompts
 /regenerate the last answer message
 /collection [clear|change] change (default) collection or clear
 /info show information for current session
@@ -145,6 +147,8 @@ The following commands can be given inside the chat, if prefixed by a `/`:
 /links [clear] display (or clear) links used in the chat
 /save filename store conversation messages
 /load filename load conversation messages
+/output filename save last response to filename
+/pipe command write last response to command's stdin
 /quit to quit
 /help to view this help
 ```
data/VERSION CHANGED
@@ -1 +1 @@
-0.0.17
+0.0.19
data/lib/ollama_chat/chat.rb CHANGED
@@ -28,6 +28,7 @@ class OllamaChat::Chat
   include OllamaChat::WebSearching
   include OllamaChat::Dialog
   include OllamaChat::Information
+  include OllamaChat::MessageOutput
   include OllamaChat::Clipboard
   include OllamaChat::MessageFormat
   include OllamaChat::History
@@ -140,6 +141,9 @@ class OllamaChat::Chat
       last = 2 * $1.to_i if $1
       messages.list_conversation(last)
       :next
+    when %r(^/last$)
+      messages.show_last
+      :next
     when %r(^/clear(?:\s+(messages|links|history|tags|all))?$)
       clean($1)
       :next
@@ -159,6 +163,9 @@ class OllamaChat::Chat
       end
       @messages.show_system_prompt
       :next
+    when %r(^/prompt)
+      @prefill_prompt = choose_prompt
+      :next
     when %r(^/regenerate$)
       if content = messages.second_last&.content
         content.gsub!(/\nConsider these chunks for your answer.*\z/, '')
@@ -224,18 +231,33 @@ class OllamaChat::Chat
       @parse_content = false
       web($1, $2)
     when %r(^/save\s+(.+)$)
-      messages.save_conversation($1)
-      STDOUT.puts "Saved conversation to #$1."
+      filename = $1
+      if messages.save_conversation(filename)
+        STDOUT.puts "Saved conversation to #{filename.inspect}."
+      else
+        STDOUT.puts "Saving conversation to #{filename.inspect} failed."
+      end
       :next
     when %r(^/links(?:\s+(clear))?$)
       manage_links($1)
       :next
     when %r(^/load\s+(.+)$)
-      messages.load_conversation($1)
+      filename = $1
+      success = messages.load_conversation(filename)
       if messages.size > 1
         messages.list_conversation(2)
       end
-      STDOUT.puts "Loaded conversation from #$1."
+      if success
+        STDOUT.puts "Loaded conversation from #{filename.inspect}."
+      else
+        STDOUT.puts "Loading conversation from #{filename.inspect} failed."
+      end
+      :next
+    when %r(^/pipe\s+(.+)$)
+      pipe($1)
+      :next
+    when %r(^/output\s+(.+)$)
+      output($1)
       :next
     when %r(^/config$)
       display_config
@@ -350,6 +372,14 @@ class OllamaChat::Chat
     input_prompt = bold { color(172) { message_type(@images) + " user" } } + bold { "> " }
     begin
       content = enable_command_completion do
+        if prefill_prompt = @prefill_prompt.full?
+          Reline.pre_input_hook = -> {
+            Reline.insert_text prefill_prompt.gsub(/\n*\z/, '')
+            @prefill_prompt = nil
+          }
+        else
+          Reline.pre_input_hook = nil
+        end
         Reline.readline(input_prompt, true)&.chomp
       end
     rescue Interrupt
data/lib/ollama_chat/dialog.rb CHANGED
@@ -1,10 +1,21 @@
 module OllamaChat::Dialog
+  private def model_with_size(model)
+    result = model.name
+    formatted_size = Term::ANSIColor.bold {
+      Tins::Unit.format(model.size, unit: ?B, prefix: 1024, format: '%.1f %U')
+    }
+    result.singleton_class.class_eval do
+      define_method(:to_s) { "%s %s" % [ model.name, formatted_size ] }
+    end
+    result
+  end
+
   def choose_model(cli_model, current_model)
     selector = if cli_model =~ /\A\?+(.*)\z/
                  cli_model = ''
                  Regexp.new($1)
                end
-    models = ollama.tags.models.map(&:name).sort
+    models = ollama.tags.models.sort_by(&:name).map { |m| model_with_size(m) }
     selector and models = models.grep(selector)
     model = if cli_model == ''
               OllamaChat::Utils::Chooser.choose(models) || current_model
@@ -68,7 +79,7 @@ module OllamaChat::Dialog
               else
                 Regexp.new(system.to_s)
               end
-    prompts = config.system_prompts.attribute_names.compact.grep(selector)
+    prompts = config.system_prompts.attribute_names.compact.grep(selector).sort
    if prompts.size == 1
      system = config.system_prompts.send(prompts.first)
    else
@@ -92,6 +103,18 @@ module OllamaChat::Dialog
     @messages.set_system_prompt(system)
   end
 
+  def choose_prompt
+    prompts = config.prompts.attribute_names.sort
+    prompts.unshift('[EXIT]')
+    case chosen = OllamaChat::Utils::Chooser.choose(prompts)
+    when '[EXIT]', nil
+      STDOUT.puts "Exiting chooser."
+      return
+    when *prompts
+      config.prompts.send(chosen)
+    end
+  end
+
   def change_voice
     chosen = OllamaChat::Utils::Chooser.choose(config.voice.list)
     @current_voice = chosen.full? || config.voice.default
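The `model_with_size` helper above returns the model's name string but overrides its `to_s` on the singleton class, so the chooser displays the size while regexp filtering via `grep(selector)` still matches the bare name. A minimal sketch of that trick (illustrative values, not the gem's actual model objects):

```ruby
# Keep the original string for comparisons, but give this one object a
# custom display form by reopening its singleton class.
name    = +'llama3'        # unfrozen copy so the singleton class can be modified
display = 'llama3 4.7 GB'  # made-up formatted size for illustration
name.singleton_class.class_eval do
  define_method(:to_s) { display }  # only affects this one object
end
```

Since `String#==` and `Regexp#===` operate on the string's contents, selection and filtering behave exactly as before; only the rendered label changes.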
data/lib/ollama_chat/information.rb CHANGED
@@ -66,12 +66,14 @@ module OllamaChat::Information
 /stream toggle stream output
 /location toggle location submission
 /voice [change] toggle voice output or change the voice
+/last show the last system/assistant message
 /list [n] list the last n / all conversation exchanges
 /clear [what] clear what=messages|links|history|tags|all
 /clobber clear the conversation, links, and collection
 /drop [n] drop the last n exchanges, defaults to 1
 /model change the model
 /system [show] change/show system prompt
+/prompt prefill user prompt with preset prompts
 /regenerate the last answer message
 /collection [clear|change] change (default) collection or clear
 /info show information for current session
@@ -86,6 +88,8 @@ module OllamaChat::Information
 /links [clear] display (or clear) links used in the chat
 /save filename store conversation messages
 /load filename load conversation messages
+/output filename save last response to filename
+/pipe command write last response to command's stdin
 /quit to quit
 /help to view this help
 EOT
data/lib/ollama_chat/message_list.rb CHANGED
@@ -66,7 +66,7 @@ class OllamaChat::MessageList
   # @return [ OllamaChat::MessageList ] self
   def load_conversation(filename)
     unless File.exist?(filename)
-      STDOUT.puts "File #{filename} doesn't exist. Choose another filename."
+      STDERR.puts "File #{filename.inspect} doesn't exist. Choose another filename."
       return
     end
     @messages =
@@ -83,10 +83,10 @@ class OllamaChat::MessageList
   # @return [ OllamaChat::MessageList ] self
   def save_conversation(filename)
     if File.exist?(filename)
-      STDOUT.puts "File #{filename} already exists. Choose another filename."
+      STDERR.puts "File #{filename.inspect} already exists. Choose another filename."
       return
     end
-    File.open(filename, 'w') do |output|
+    File.open(filename, ?w) do |output|
       output.puts JSON(@messages)
     end
     self
@@ -100,36 +100,26 @@ class OllamaChat::MessageList
   def list_conversation(last = nil)
     last = (last || @messages.size).clamp(0, @messages.size)
     use_pager do |output|
-      @messages[-last..-1].to_a.each do |m|
-        role_color = case m.role
-                     when 'user' then 172
-                     when 'assistant' then 111
-                     when 'system' then 213
-                     else 210
-                     end
-        thinking = if @chat.think.on?
-                     think_annotate do
-                       m.thinking.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
-                     end
-                   end
-        content = m.content.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
-        message_text = message_type(m.images) + " "
-        message_text += bold { color(role_color) { m.role } }
-        if thinking
-          message_text += [ ?:, thinking, talk_annotate { content } ].compact.
-            map { _1.chomp } * ?\n
-        else
-          message_text += ":\n#{content}"
-        end
-        m.images.full? { |images|
-          message_text += "\nImages: " + italic { images.map(&:path) * ', ' }
-        }
-        output.puts message_text
+      @messages[-last..-1].to_a.each do |message|
+        output.puts message_text_for(message)
       end
     end
     self
   end
 
+  # The show_last method displays the text of the last message if it is not
+  # from the user. It uses a pager for output and returns the instance itself.
+  #
+  # @return [ OllamaChat::MessageList ] returns the instance of the class
+  def show_last
+    message = last
+    message&.role == 'user' and return
+    use_pager do |output|
+      output.puts message_text_for(message)
+    end
+    self
+  end
+
   # Removes the last `n` exchanges from the message list. An exchange consists
   # of a user and an assistant message. If only a single user message is
   # present at the end, it will be removed first before proceeding with
@@ -260,10 +250,21 @@ class OllamaChat::MessageList
 
   private
 
+  # The config method provides access to the chat configuration object.
+  #
+  # @return [ Object ] the configuration object associated with the chat instance
   def config
     @chat.config
   end
 
+  # The determine_pager_command method identifies an appropriate pager command
+  # for displaying content.
+  # It first checks for a default pager specified by the PAGER environment
+  # variable. If no default is found, it attempts to locate 'less' or 'more'
+  # in the system PATH as fallback options.
+  # The method returns the selected pager command, ensuring it includes the
+  # '-r' flag for proper handling of raw control characters when a fallback
+  # pager is used.
   def determine_pager_command
     default_pager = ENV['PAGER'].full?
     if fallback_pager = `which less`.chomp.full? || `which more`.chomp.full?
@@ -272,6 +273,11 @@ class OllamaChat::MessageList
     default_pager || fallback_pager
   end
 
+  # The use_pager method wraps the given block with a pager context.
+  # If the output would exceed the terminal's line capacity, it pipes the
+  # content through an appropriate pager command (like 'less' or 'more').
+  #
+  # @param block [Proc] A block that yields an IO object to write output to
   def use_pager
     command = determine_pager_command
     output_buffer = StringIO.new
@@ -281,4 +287,41 @@ class OllamaChat::MessageList
       output.puts messages
     end
   end
+
+  # The message_text_for method generates formatted text representation of a
+  # message including its role, content, thinking annotations, and associated
+  # images.
+  # It applies color coding to different message roles and uses markdown
+  # parsing when enabled. The method also handles special formatting for
+  # thinking annotations and image references within the message.
+  #
+  # @param message [Object] the message object containing role, content, thinking, and images
+  #
+  # @return [String] the formatted text representation of the message
+  def message_text_for(message)
+    role_color = case message.role
+                 when 'user' then 172
+                 when 'assistant' then 111
+                 when 'system' then 213
+                 else 210
+                 end
+    thinking = if @chat.think.on?
+                 think_annotate do
+                   message.thinking.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
+                 end
+               end
+    content = message.content.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
+    message_text = message_type(message.images) + " "
+    message_text += bold { color(role_color) { message.role } }
+    if thinking
+      message_text += [ ?:, thinking, talk_annotate { content } ].compact.
+        map { _1.chomp } * ?\n
+    else
+      message_text += ":\n#{content}"
+    end
+    message.images.full? { |images|
+      message_text += "\nImages: " + italic { images.map(&:path) * ', ' }
+    }
+    message_text
+  end
 end
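The new `determine_pager_command` documentation above describes a simple preference order: honor `PAGER`, otherwise fall back to `less` or `more` with `-r` appended for raw control characters. A standalone sketch of that selection logic (`pick_pager` is a hypothetical helper, not the gem's method):

```ruby
# Choose a pager: an explicitly configured one wins as-is; a discovered
# fallback gets the -r flag so ANSI color sequences pass through.
def pick_pager(env_pager, fallback_pager)
  fallback_pager &&= "#{fallback_pager} -r"  # only decorate the fallback
  env_pager || fallback_pager                # may be nil if neither exists
end
```

Returning `nil` when no pager is available lets the caller fall back to plain `STDOUT`, which is what the surrounding `use_pager` logic relies on.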
data/lib/ollama_chat/message_output.rb ADDED
@@ -0,0 +1,50 @@
+module OllamaChat::MessageOutput
+  def pipe(cmd)
+    cmd.present? or return
+    if message = @messages.last and message.role == 'assistant'
+      begin
+        IO.popen(cmd, ?w) do |output|
+          output.write(message.content)
+        end
+        exit_code = $?&.exitstatus
+        if exit_code == 0
+          STDOUT.puts "Last response was piped to #{cmd.inspect}."
+        else
+          STDERR.puts "Executing #{cmd.inspect}, failed with exit code #{exit_code}."
+        end
+        self
+      rescue => e
+        STDERR.puts "Executing #{cmd.inspect}, caused #{e.class}: #{e}."
+      end
+    else
+      STDERR.puts "No response available to output to pipe command #{cmd.inspect}."
+    end
+  end
+
+  def output(filename)
+    if message = @messages.last and message.role == 'assistant'
+      begin
+        write_file_unless_exist(filename)
+        STDOUT.puts "Last response was written to #{filename.inspect}."
+        self
+      rescue => e
+        STDERR.puts "Writing to #{filename.inspect}, caused #{e.class}: #{e}."
+      end
+    else
+      STDERR.puts "No response available to write to #{filename.inspect}."
+    end
+  end
+
+  private
+
+  def write_file_unless_exist(filename)
+    if File.exist?(filename)
+      STDERR.puts "File #{filename.inspect} already exists. Choose another filename."
+      return
+    end
+    File.open(filename, ?w) do |output|
+      output.write(message.content)
+    end
+    true
+  end
+end
data/lib/ollama_chat/ollama_chat_config/default_config.yml CHANGED
@@ -32,7 +32,7 @@ system_prompts:
 voice:
   enabled: false
   default: Samantha
-  list: <%= `say -v ? 2>/dev/null`.lines.map { _1[/^(.+?)\s+[a-z]{2}_[a-zA-Z0-9]{2,}/, 1] }.uniq.sort.to_s.force_encoding('ASCII-8BIT') %>
+  list: <%= `say -v ? 2>/dev/null`.lines.map { |l| l.force_encoding('ASCII-8BIT'); l[/^(.+?)\s+[a-z]{2}_[a-zA-Z0-9]{2,}/, 1] }.uniq.sort.to_s %>
 markdown: true
 stream: true
 document_policy: importing
data/lib/ollama_chat/parsing.rb CHANGED
@@ -3,7 +3,7 @@ module OllamaChat::Parsing
     case source_io&.content_type
     when 'text/html'
       reverse_markdown(source_io.read)
-    when 'text/xml'
+    when 'text/xml', 'application/xml'
       if source_io.read(8192) =~ %r(^\s*<rss\s)
         source_io.rewind
         return parse_rss(source_io)
@@ -23,7 +23,7 @@ module OllamaChat::Parsing
     when %r(\Aapplication/(json|ld\+json|x-ruby|x-perl|x-gawk|x-python|x-javascript|x-c?sh|x-dosexec|x-shellscript|x-tex|x-latex|x-lyx|x-bibtex)), %r(\Atext/), nil
       source_io.read
     else
-      STDERR.puts "Cannot embed #{source_io&.content_type} document."
+      STDERR.puts "Cannot parse #{source_io&.content_type} document."
       return
     end
   end
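The broadened `when` clause above routes both XML MIME types into the RSS sniff, which inspects the first bytes of the document for an opening `<rss` element. A small sketch of that dispatch (`rss_feed?` is a hypothetical helper distilled from the case statement, not the gem's API):

```ruby
# Decide whether an XML document should be treated as an RSS feed: the
# content type must be one of the two XML MIME types, and the leading
# bytes must contain an opening <rss element.
def rss_feed?(content_type, head)
  case content_type
  when 'text/xml', 'application/xml'
    !!(head =~ %r(^\s*<rss\s))   # ^ matches at any line start in Ruby
  else
    false
  end
end
```

Note that Ruby's `^` anchors at every line start, so the sniff still matches when the `<rss` element follows an XML declaration on its own line.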
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
 module OllamaChat
   # OllamaChat version
-  VERSION = '0.0.17'
+  VERSION = '0.0.19'
   VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
   VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
   VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat.rb CHANGED
@@ -17,6 +17,7 @@ require 'ollama_chat/source_fetching'
 require 'ollama_chat/web_searching'
 require 'ollama_chat/dialog'
 require 'ollama_chat/information'
+require 'ollama_chat/message_output'
 require 'ollama_chat/clipboard'
 require 'ollama_chat/document_cache'
 require 'ollama_chat/history'
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama_chat 0.0.17 ruby lib
+# stub: ollama_chat 0.0.19 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama_chat".freeze
-  s.version = "0.0.17".freeze
+  s.version = "0.0.19".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
@@ -12,19 +12,19 @@ Gem::Specification.new do |s|
   s.description = "The app provides a command-line interface (CLI) to an Ollama AI model,\nallowing users to engage in text-based conversations and generate\nhuman-like responses. Users can import data from local files or web pages,\nwhich are then processed through three different modes: fully importing the\ncontent into the conversation context, summarizing the information for\nconcise reference, or storing it in an embedding vector database for later\nretrieval based on the conversation.\n".freeze
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
-  s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
-  s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+  s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
+  s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
   s.homepage = "https://github.com/flori/ollama_chat".freeze
   s.licenses = ["MIT".freeze]
   s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
   s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
   s.rubygems_version = "3.6.9".freeze
   s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
-  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
+  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
   s.specification_version = 4
 
-  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.20".freeze])
+  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.27".freeze])
   s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
   s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
   s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
spec/ollama_chat/chat_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Chat do
+describe OllamaChat::Chat do
   let :argv do
     %w[ -C test ]
   end

spec/ollama_chat/clipboard_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Clipboard do
+describe OllamaChat::Clipboard do
   let :chat do
     OllamaChat::Chat.new
   end

spec/ollama_chat/follow_chat_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::FollowChat do
+describe OllamaChat::FollowChat do
   let :messages do
     [
       Ollama::Message.new(role: 'user', content: 'hello', images: []),

spec/ollama_chat/information_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Information do
+describe OllamaChat::Information do
   let :chat do
     OllamaChat::Chat.new
   end

spec/ollama_chat/message_list_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::MessageList do
+describe OllamaChat::MessageList do
   let :config do
     double(
       location: double(
@@ -70,6 +70,16 @@ RSpec.describe OllamaChat::MessageList do
     expect(list).to receive(:determine_pager_command).and_return nil
   end
 
+  it 'can show last message' do
+    expect(chat).to receive(:markdown).
+      and_return(double(on?: true)).at_least(:once)
+    expect(chat).to receive(:think).
+      and_return(double(on?: false)).at_least(:once)
+    expect(STDOUT).to receive(:puts).
+      with("📨 \e[1m\e[38;5;213msystem\e[0m\e[0m:\nhello\n")
+    list.show_last
+  end
+
   it 'can list conversations without thinking' do
     expect(chat).to receive(:markdown).
       and_return(double(on?: true)).at_least(:once)
spec/ollama_chat/message_output_spec.rb ADDED
@@ -0,0 +1,26 @@
+require 'spec_helper'
+
+describe OllamaChat::MessageOutput do
+  let :chat do
+    OllamaChat::Chat.new
+  end
+
+  connect_to_ollama_server
+
+  it 'output can write to file' do
+    expect(STDERR).to receive(:puts).with(/No response available to write to "foo.txt"/)
+    expect(chat.output('foo.txt')).to be_nil
+    chat.instance_variable_get(:@messages).load_conversation(asset('conversation.json'))
+    expect(chat).to receive(:write_file_unless_exist).and_return true
+    expect(STDOUT).to receive(:puts).with(/Last response was written to \"foo.txt\"./)
+    expect(chat.output('foo.txt')).to eq chat
+  end
+
+  it 'pipe can write to command stdin' do
+    expect(STDERR).to receive(:puts).with(/No response available to output to pipe command "true"/)
+    expect(chat.pipe('true')).to be_nil
+    chat.instance_variable_get(:@messages).load_conversation(asset('conversation.json'))
+    expect(STDOUT).to receive(:puts).with(/Last response was piped to \"true\"./)
+    expect(chat.pipe('true')).to eq chat
+  end
+end
spec/ollama_chat/model_handling_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::ModelHandling do
+describe OllamaChat::ModelHandling do
   let :chat do
     OllamaChat::Chat.new
   end

spec/ollama_chat/parsing_spec.rb CHANGED
@@ -1,7 +1,7 @@
 require 'spec_helper'
 require 'pathname'
 
-RSpec.describe OllamaChat::Parsing do
+describe OllamaChat::Parsing do
   let :chat do
     OllamaChat::Chat.new.tap do |chat|
       chat.document_policy = 'importing'
@@ -31,6 +31,19 @@ RSpec.describe OllamaChat::Parsing do
     end
   end
 
+  it 'can parse RSS with application/xml content type' do
+    asset_io('example.rss') do |io|
+      def io.content_type
+        'application/xml'
+      end
+      expect(chat.parse_source(io)).to start_with(<<~EOT)
+        # Example News Feed
+
+        ## [New Study Shows Benefits of Meditation](https://example.com/article/meditation-benefits)
+      EOT
+    end
+  end
+
   it 'can parse CSV' do
     asset_io('example.csv') do |io|
       def io.content_type
spec/ollama_chat/source_fetching_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::SourceFetching do
+describe OllamaChat::SourceFetching do
   let :chat do
     OllamaChat::Chat.new
   end

spec/ollama_chat/switches_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Switches do
+describe OllamaChat::Switches do
   describe OllamaChat::Switches::Switch do
     let :switch do
       described_class.new(

spec/ollama_chat/utils/cache_fetcher_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Utils::CacheFetcher do
+describe OllamaChat::Utils::CacheFetcher do
   let :url do
     'https://www.example.com/hello'
   end

spec/ollama_chat/utils/fetcher_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Utils::Fetcher do
+describe OllamaChat::Utils::Fetcher do
   let :url do
     'https://www.example.com/hello'
   end

spec/ollama_chat/utils/file_argument_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::Utils::FileArgument do
+describe OllamaChat::Utils::FileArgument do
   it 'it can return content' do
     expect(described_class.get_file_argument('foo')).to eq 'foo'
   end

spec/ollama_chat/web_searching_spec.rb CHANGED
@@ -1,6 +1,6 @@
 require 'spec_helper'
 
-RSpec.describe OllamaChat::WebSearching do
+describe OllamaChat::WebSearching do
   let :chat do
     OllamaChat::Chat.new
   end
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: ollama_chat
 version: !ruby/object:Gem::Version
-  version: 0.0.17
+  version: 0.0.19
 platform: ruby
 authors:
 - Florian Frank
@@ -15,14 +15,14 @@ dependencies:
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.20'
+        version: '1.27'
   type: :development
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
      - !ruby/object:Gem::Version
-        version: '1.20'
+        version: '1.27'
 - !ruby/object:Gem::Dependency
   name: all_images
   requirement: !ruby/object:Gem::Requirement
@@ -388,6 +388,7 @@ extra_rdoc_files:
 - lib/ollama_chat/information.rb
 - lib/ollama_chat/message_format.rb
 - lib/ollama_chat/message_list.rb
+- lib/ollama_chat/message_output.rb
 - lib/ollama_chat/model_handling.rb
 - lib/ollama_chat/ollama_chat_config.rb
 - lib/ollama_chat/parsing.rb
@@ -424,6 +425,7 @@ files:
 - lib/ollama_chat/information.rb
 - lib/ollama_chat/message_format.rb
 - lib/ollama_chat/message_list.rb
+- lib/ollama_chat/message_output.rb
 - lib/ollama_chat/model_handling.rb
 - lib/ollama_chat/ollama_chat_config.rb
 - lib/ollama_chat/ollama_chat_config/default_config.yml
@@ -461,6 +463,7 @@ files:
 - spec/ollama_chat/follow_chat_spec.rb
 - spec/ollama_chat/information_spec.rb
 - spec/ollama_chat/message_list_spec.rb
+- spec/ollama_chat/message_output_spec.rb
 - spec/ollama_chat/model_handling_spec.rb
 - spec/ollama_chat/parsing_spec.rb
 - spec/ollama_chat/source_fetching_spec.rb
@@ -503,6 +506,7 @@ test_files:
 - spec/ollama_chat/follow_chat_spec.rb
 - spec/ollama_chat/information_spec.rb
 - spec/ollama_chat/message_list_spec.rb
+- spec/ollama_chat/message_output_spec.rb
 - spec/ollama_chat/model_handling_spec.rb
 - spec/ollama_chat/parsing_spec.rb
 - spec/ollama_chat/source_fetching_spec.rb