ollama_chat 0.0.13 → 0.0.15

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: a6a03b8af7c470d83520d269829f51fe480f7a5813bc1b4df50c63d339e28953
- data.tar.gz: cb86dd9896d6948fb736c889a672fb29e4f21c8174a3ddb62d6fcc93be59f020
+ metadata.gz: c05a3e4579e75ba1c724d7128581de7b96dac914c2bd633368920a207be24b3a
+ data.tar.gz: 6d4cf892e3a3ec3d00dc1b4cf00e686f311240893a5cd2af561afdb7ea6946de
  SHA512:
- metadata.gz: ef19594894b1bb11217e710e1a302a2d30add6140da629c66e3e60d990e30f45a29196d310e06f7c06b36d8ddf33f84e7bd4278f87705b48e501b06d5f53d290
- data.tar.gz: 332c66cdca39a187ecb3aff193e5a2522ad11deabaedb584c32a78a620d4c068be3cc062d181a5e6c1d0bf44fed371b3409010f2a4e404614ba142e37d5afbdc
+ metadata.gz: 9afec82d69645e4b8400b63c8f5bf51bd4b4d0471cb59c419b304e9a532fe874dce33f403625abb4fac4543892ed5c0e40b476a56da7023eda4ea99a15ac6275
+ data.tar.gz: 611da289b8ac3d9bef78d7453418ec9e90123e5f7af36d386b62a7646a578adbe90ebc5396a09248152d117085ee0598a743cf4746e4e1e9a402a6e32475bd02
data/CHANGES.md CHANGED
@@ -1,8 +1,36 @@
  # Changes
 
+ ## 2025-07-02 v0.0.15
+
+ - **Enhanced `ollama_chat_send` and Unix Domain Socket Support:**
+   - Added support for advanced parameters:
+     - `-t`: Sends input as terminal commands.
+     - `-r`: Enables two-way communication by waiting for and returning the server's response.
+     - `-h` or `--help`: Displays usage information and available options.
+   - Improved socket management using the `unix_socks` gem.
+   - Enhanced message processing logic to handle different types of messages
+     (`:socket_input`, `:terminal_input`, `:socket_input_with_response`).
+ - **Selector Support for Model and System Prompt Selection:**
+   - Introduced `?selector` syntax to filter models and prompts.
+   - Updated documentation to reflect this new feature.
+   - Added a chooser dialog when multiple options match the selector.
+
+ ## 2025-06-07 v0.0.14
+
+ * **Message List Improvements**:
+   * Added thinking status to messages when chat is in think mode
+   * Improved system prompt handling with new method documentation
+ * **Improved /system command handling for OllamaChat chat system**:
+   * Added support for '/system [show]' command to show or change the system prompt.
+   * Added conversation length to chat information display
+ * **Improvements to OllamaChat::SourceFetching**:
+   * Fixed bug where document type concatenation could cause errors when `full?`
+     returns `nil`, ensuring proper string formatting and avoiding potential
+     crashes.
+
  ## 2025-06-05 v0.0.13
 
- * Improved chat command handling
+ * **Improved chat command handling**
    - Added support for '/clear tags' to clear all tags.
    - Updated cases for 'history', 'all' and added case for 'tags'.
    - Added commands to clear documents collection and print a message in `information.rb`.
data/README.md CHANGED
@@ -24,8 +24,8 @@ Usage: ollama_chat [OPTIONS]
 
  -f CONFIG config file to read
  -u URL the ollama base url, OLLAMA_URL
- -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL
- -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM
+ -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL, ?selector
+ -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM, ?selector
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to embeddings collection (multiple)
@@ -34,6 +34,9 @@ Usage: ollama_chat [OPTIONS]
  -S open a socket to receive input from ollama_chat_send
  -V display the current version number and quit
  -h this help
+
+ Use `?selector` with `-m` or `-s` to filter options. Multiple matches
+ will open a chooser dialog.
  ```
 
  The base URL can be either set by the environment variable `OLLAMA_URL` or it
@@ -123,11 +126,11 @@ The following commands can be given inside the chat, if prefixed by a `/`:
  /location toggle location submission
  /voice [change] toggle voice output or change the voice
  /list [n] list the last n / all conversation exchanges
- /clear [messages|links|history] clear the all messages, links, or the chat history (defaults to messages)
+ /clear [what] clear what=messages|links|history|tags|all
  /clobber clear the conversation, links, and collection
  /drop [n] drop the last n exchanges, defaults to 1
  /model change the model
- /system change system prompt (clears conversation)
+ /system [show] change/show system prompt
  /regenerate the last answer message
  /collection [clear|change] change (default) collection or clear
  /info show information for current session
@@ -167,6 +170,32 @@ function! OllamaChatSend(input)
  endfunction
  ```
 
+ #### Advanced Parameters for `ollama_chat_send`
+
+ The `ollama_chat_send` command now supports additional parameters to enhance functionality:
+
+ - **Terminal Input (`-t`)**: Sends input as terminal commands, enabling special commands like `/import`.
+
+   ```bash
+   $ echo "/import https://example.com/some-content" | ollama_chat_send -t
+   ```
+
+ - **Wait for Response (`-r`)**: Enables two-way communication by waiting for and returning the server's response.
+
+   ```bash
+   $ response=$(echo "Tell me a joke." | ollama_chat_send -r)
+   $ echo "$response"
+   ```
+
+ - **Help (`-h` or `--help`)**: Displays usage information and available options.
+
+   ```bash
+   $ ollama_chat_send -h
+   ```
+
+ These parameters provide greater flexibility in how you interact with
+ `ollama_chat`, whether from the command line or integrated tools like `vim`.
+
  ## Download
 
  The homepage of this app is located at
data/Rakefile CHANGED
@@ -32,6 +32,7 @@ GemHadar do
  dependency 'excon', '~> 1.0'
  dependency 'ollama-ruby', '~> 1.2'
  dependency 'documentrix', '~> 0.0', '>= 0.0.2'
+ dependency 'unix_socks'
  dependency 'rss', '~> 0.3'
  dependency 'term-ansicolor', '~> 1.11'
  dependency 'redis', '~> 5.0'
data/VERSION CHANGED
@@ -1 +1 @@
- 0.0.13
+ 0.0.15
data/bin/ollama_chat_send CHANGED
@@ -1,10 +1,35 @@
  #!/usr/bin/env ruby
 
  require 'ollama_chat'
+ require 'tins/go'
+ include Tins::GO
 
+ opts = go 'rth', ARGV
+
+ def usage(rc = 0)
+   puts <<~EOT
+     Usage: #{File.basename($0)} [OPTIONS]
+
+     Options:
+       -r Wait for the response from Ollama Chat and output it
+       -t Send input as terminal input including commands, e.g. /import
+       -h Show this help message
+
+     Send data to a running Ollama Chat client via standard input.
+   EOT
+   exit rc
+ end
+
+ opts[?h] and usage
  begin
-   type = (ARGV.shift || 'socket_input').to_sym
-   OllamaChat::ServerSocket.send_to_server_socket(STDIN.read, type:)
+   type = if opts[?t]
+            :terminal_input
+          else
+            opts[?r] ? :socket_input_with_response : :socket_input
+          end
+   response = OllamaChat::ServerSocket.send_to_server_socket(STDIN.read, type:)
+   type == :socket_input_with_response and puts response.content
  rescue => e
    warn "Caught #{e.class}: #{e}"
    exit 1
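The flag-to-type mapping above is the heart of the new CLI. The real script gets its flag hash from Tins::GO's `go 'rth', ARGV` and checks `opts[?t]` / `opts[?r]`; the `message_type` helper below is a hypothetical, self-contained sketch of the same precedence (`-t` wins over `-r`):

```ruby
# Illustrative only: maps CLI flags to the message type symbol sent over the
# Unix socket. The actual script uses Tins::GO's parsed option hash instead
# of a raw flag array.
def message_type(flags)
  if flags.include?('-t')
    :terminal_input                 # input is treated as terminal commands
  elsif flags.include?('-r')
    :socket_input_with_response     # two-way: wait for the server's reply
  else
    :socket_input                   # default: fire-and-forget input
  end
end

message_type(%w[-t])     # => :terminal_input
message_type(%w[-r])     # => :socket_input_with_response
message_type([])         # => :socket_input
message_type(%w[-t -r])  # => :terminal_input (terminal input takes precedence)
```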
data/lib/ollama_chat/chat.rb CHANGED
@@ -1,6 +1,7 @@
  require 'tins'
  require 'tins/secure_write'
  require 'tins/xt/string_version'
+ require 'tins/xt/full'
  require 'json'
  require 'term/ansicolor'
  require 'reline'
@@ -152,9 +153,11 @@ class OllamaChat::Chat
  when %r(^/model$)
    @model = choose_model('', @model)
    :next
- when %r(^/system$)
-   change_system_prompt(@system)
-   info
+ when %r(^/system(?:\s+(show))?$)
+   if $1 != 'show'
+     change_system_prompt(@system)
+   end
+   @messages.show_system_prompt
    :next
  when %r(^/regenerate$)
    if content = messages.second_last&.content
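The optional `(?:\s+(show))?` group in the new `/system` pattern is what distinguishes "show only" from "change, then show". A quick check of the pattern's behavior (the `system_action` helper is only for illustration):

```ruby
# Same regexp as the new /system case in chat.rb.
SYSTEM_RE = %r(^/system(?:\s+(show))?$)

def system_action(input)
  input =~ SYSTEM_RE or return :no_match
  # $1 captures the optional "show" argument.
  $1 == 'show' ? :show_only : :change_and_show
end

system_action('/system')       # => :change_and_show
system_action('/system show')  # => :show_only
system_action('/system foo')   # => :no_match (anchored by ^ and $)
```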
@@ -350,15 +353,14 @@ class OllamaChat::Chat
    content = Reline.readline(input_prompt, true)&.chomp
  rescue Interrupt
    if message = server_socket_message
-     self.server_socket_message = nil
-     type = message.fetch('type', 'socket_input').to_sym
-     content = message['content']
+     type = message.type.full?(:to_sym) || :socket_input
+     content = message.content
    else
      raise
    end
  end
 
- unless type == :socket_input
+ if type == :terminal_input
    case next_action = handle_input(content)
    when :next
      next
@@ -371,6 +373,8 @@ class OllamaChat::Chat
    end
  end
 
+ content = content.encode(invalid: :replace)
+
  content, tags = if @parse_content
    parse_content(content, @images)
  else
@@ -417,10 +421,22 @@ class OllamaChat::Chat
    }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
    config.debug and jj messages.to_ary
  end
+
+ case type
+ when :socket_input
+   server_socket_message&.disconnect
+ when :socket_input_with_response
+   if message = handler.messages.last
+     server_socket_message.respond({ role: message.role, content: message.content })
+   end
+   server_socket_message&.disconnect
+ end
  rescue Ollama::Errors::TimeoutError
    STDOUT.puts "#{bold('Error')}: Currently lost connection to ollama server and cannot send command."
  rescue Interrupt
    STDOUT.puts "Type /quit to quit."
+ ensure
+   self.server_socket_message = nil
  end
  0
rescue ComplexConfig::AttributeMissing, ComplexConfig::ConfigurationSyntaxError => e
data/lib/ollama_chat/dialog.rb CHANGED
@@ -1,6 +1,11 @@
  module OllamaChat::Dialog
    def choose_model(cli_model, current_model)
+     selector = if cli_model =~ /\A\?+(.*)\z/
+                  cli_model = ''
+                  Regexp.new($1)
+                end
      models = ollama.tags.models.map(&:name).sort
+     selector and models = models.grep(selector)
      model = if cli_model == ''
        OllamaChat::Utils::Chooser.choose(models) || current_model
      else
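The `?selector` feature reduces to two steps: strip the leading `?` into a regexp, then `grep` the candidate list with it. A self-contained sketch (the `filter_by_selector` helper is hypothetical; the real code feeds the survivors to the chooser dialog when more than one remains):

```ruby
# Mirrors the /\A\?+(.*)\z/ parsing from choose_model: a leading "?" turns
# the rest of the argument into a regexp that filters the candidates.
def filter_by_selector(cli_arg, candidates)
  if cli_arg =~ /\A\?+(.*)\z/
    candidates.grep(Regexp.new($1))
  else
    candidates  # no selector syntax: leave the list untouched
  end
end

models = %w[codellama llama3.1 mistral]
filter_by_selector('?llama', models)   # => ["codellama", "llama3.1"]
filter_by_selector('mistral', models)  # => ["codellama", "llama3.1", "mistral"]
```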
@@ -58,8 +63,12 @@ module OllamaChat::Dialog
  end
 
  def change_system_prompt(default, system: nil)
-   selector = Regexp.new(system.to_s[1..-1].to_s)
-   prompts = config.system_prompts.attribute_names.compact.grep(selector)
+   selector = if system =~ /\A\?(.+)\z/
+                Regexp.new($1)
+              else
+                Regexp.new(system.to_s)
+              end
+   prompts = config.system_prompts.attribute_names.compact.grep(selector)
    if prompts.size == 1
      system = config.system_prompts.send(prompts.first)
    else
data/lib/ollama_chat/follow_chat.rb CHANGED
@@ -13,6 +13,8 @@ class OllamaChat::FollowChat
    @user = nil
  end
 
+ attr_reader :messages
+
  def call(response)
    debug_output(response)
 
data/lib/ollama_chat/history.rb CHANGED
@@ -1,8 +1,12 @@
  module OllamaChat::History
+   # Returns the full path of the chat history filename based on the
+   # configuration.
    def chat_history_filename
      File.expand_path(config.chat_history_filename)
    end
 
+   # Initializes the chat history by loading it from a file if it exists, and
+   # then loads the history into Readline::HISTORY.
    def init_chat_history
      if File.exist?(chat_history_filename)
        File.open(chat_history_filename, ?r) do |history|
@@ -13,10 +17,12 @@ module OllamaChat::History
      end
    end
 
+   # Saves the current chat history to a file in JSON format.
    def save_history
      File.secure_write(chat_history_filename, JSON.dump(Readline::HISTORY))
    end
 
+   # Clears all entries from Readline::HISTORY.
    def clear_history
      Readline::HISTORY.clear
    end
data/lib/ollama_chat/information.rb CHANGED
@@ -29,31 +29,32 @@ module OllamaChat::Information
 
  def info
    STDOUT.puts "Running ollama_chat version: #{bold(OllamaChat::VERSION)}"
-   STDOUT.puts "Connected to ollama server version: #{bold(server_version)}"
-   STDOUT.puts "Current model is #{bold{@model}}."
+   STDOUT.puts "Connected to ollama server version: #{bold(server_version)} on: #{bold(server_url)}"
+   STDOUT.puts "Current conversation model is #{bold{@model}}."
    if @model_options.present?
      STDOUT.puts " Options: #{JSON.pretty_generate(@model_options).gsub(/(?<!\A)^/, ' ')}"
    end
    @embedding.show
    if @embedding.on?
-     STDOUT.puts "Embedding model is #{bold{@embedding_model}}"
+     STDOUT.puts "Current embedding model is #{bold{@embedding_model}}"
      if @embedding_model_options.present?
        STDOUT.puts " Options: #{JSON.pretty_generate(@embedding_model_options).gsub(/(?<!\A)^/, ' ')}"
      end
      STDOUT.puts "Text splitter is #{bold{config.embedding.splitter.name}}."
      collection_stats
    end
-   STDOUT.puts "Documents database cache is #{@documents.nil? ? 'n/a' : bold{@documents.cache.class}}"
    markdown.show
    stream.show
+   think.show
    location.show
-   STDOUT.puts "Document policy for references in user text: #{bold{@document_policy}}"
-   STDOUT.puts "Thinking is #{bold(think.on? ? 'enabled' : 'disabled')}."
-   STDOUT.puts "Currently selected search engine is #{bold(search_engine)}."
+   voice.show
    if @voice.on?
-     STDOUT.puts "Using voice #{bold{@current_voice}} to speak."
+     STDOUT.puts " Using voice #{bold{@current_voice}} to speak."
    end
-   @messages.show_system_prompt
+   STDOUT.puts "Documents database cache is #{@documents.nil? ? 'n/a' : bold{@documents.cache.class}}"
+   STDOUT.puts "Document policy for references in user text: #{bold{@document_policy}}"
+   STDOUT.puts "Currently selected search engine is #{bold(search_engine)}."
+   STDOUT.puts "Conversation length: #{bold(@messages.size.to_s)} message(s)."
    nil
  end
 
@@ -70,7 +71,7 @@ module OllamaChat::Information
  /clobber clear the conversation, links, and collection
  /drop [n] drop the last n exchanges, defaults to 1
  /model change the model
- /system change system prompt (clears conversation)
+ /system [show] change/show system prompt
  /regenerate the last answer message
  /collection [clear|change] change (default) collection or clear
  /info show information for current session
@@ -97,8 +98,8 @@ module OllamaChat::Information
 
  -f CONFIG config file to read
  -u URL the ollama base url, OLLAMA_URL
- -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL
- -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM
+ -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL, ?selector
+ -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM, ?selector
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to embeddings collection (multiple)
@@ -108,6 +109,8 @@ module OllamaChat::Information
  -V display the current version number and quit
  -h this help
 
+ Use `?selector` with `-m` or `-s` to filter options. Multiple matches
+ will open a chooser dialog.
  EOT
  0
end
@@ -120,4 +123,8 @@ module OllamaChat::Information
    def server_version
      @server_version ||= ollama.version.version
    end
+
+   def server_url
+     @server_url ||= ollama.base_url
+   end
  end
data/lib/ollama_chat/message_list.rb CHANGED
@@ -106,10 +106,20 @@ class OllamaChat::MessageList
    when 'system' then 213
    else 210
  end
+ thinking = if @chat.think.on?
+   think_annotate do
+     m.thinking.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
+   end
+ end
  content = m.content.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
  message_text = message_type(m.images) + " "
  message_text += bold { color(role_color) { m.role } }
- message_text += ":\n#{content}"
+ if thinking
+   message_text += [ ?:, thinking, talk_annotate { content } ].compact.
+     map { _1.chomp } * ?\n
+ else
+   message_text += ":\n#{content}"
+ end
  m.images.full? { |images|
    message_text += "\nImages: " + italic { images.map(&:path) * ', ' }
  }
@@ -118,35 +128,64 @@
    self
  end
 
- # The drop method removes the last n exchanges from the message list and returns the number of removed exchanges.
+ # Removes the last `n` exchanges from the message list. An exchange consists
+ # of a user and an assistant message. If only a single user message is
+ # present at the end, it will be removed first before proceeding with
+ # complete exchanges.
  #
- # @param n [ Integer ] the number of exchanges to remove
+ # @param n [Integer] The number of exchanges to remove.
+ # @return [Integer] The actual number of complete exchanges removed.
+ #   This may be less than `n` if there are not enough messages.
  #
- # @return [ Integer ] the number of removed exchanges, or 0 if there are no more exchanges to pop
+ # @note
+ #   - System messages are preserved and not considered part of an exchange.
+ #   - If only one incomplete exchange (a single user message) exists, it will
+ #     be dropped first before removing complete exchanges.
  def drop(n)
-   if @messages.reject { _1.role == 'system' }.size > 1
-     n = n.to_i.clamp(1, Float::INFINITY)
-     r = @messages.pop(2 * n)
-     m = r.size / 2
-     STDOUT.puts "Dropped the last #{m} exchanges."
-     m
-   else
-     STDOUT.puts "No more exchanges you can drop."
-     0
+   n = n.to_i.clamp(1, Float::INFINITY)
+   non_system_messages = @messages.reject { _1.role == 'system' }
+   if non_system_messages&.last&.role == 'user'
+     @messages.pop
+     n -= 1
+   end
+   if n == 0
+     STDOUT.puts "Dropped the last exchange."
+     return 1
    end
+   if non_system_messages.empty?
+     STDOUT.puts "No more exchanges can be dropped."
+     return 0
+   end
+   m = 0
+   while @messages.size > 1 && n > 0
+     @messages.pop(2)
+     m += 1
+     n -= 1
+   end
+   STDOUT.puts "Dropped the last #{m} exchanges."
+   m
  end
 
- # The set_system_prompt method sets the system prompt for the chat session.
- # This implies deleting all of the messages in the message list, so it only
- # contains the system prompt at the end.
+ # Sets the system prompt for the chat session.
+ #
+ # @param system [String, nil] The new system prompt. If `nil` or `false`, clears the system prompt.
  #
- # @param system [ String ] the new system prompt
+ # @return [OllamaChat::MessageList] Returns `self` to allow chaining of method calls.
  #
- # @return [ OllamaChat::MessageList ] the message list instance itself, allowing for chaining.
+ # @note This method:
+ #   - Removes all existing system prompts from the message list
+ #   - Adds the new system prompt to the beginning of the message list if provided
+ #   - Handles edge cases such as clearing prompts when `system` is `nil` or `false`
  def set_system_prompt(system)
-   @system = system.to_s
-   @messages.clear
-   @messages << Ollama::Message.new(role: 'system', content: self.system)
+   @messages.reject! { |msg| msg.role == 'system' }
+   if new_system_prompt = system.full?(:to_s)
+     @system = new_system_prompt
+     @messages.unshift(
+       Ollama::Message.new(role: 'system', content: self.system)
+     )
+   else
+     @system = nil
+   end
    self
  end
 
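The revised `drop(n)` semantics are easier to see on plain data: a trailing, unanswered user message is removed first (and counts toward `n`), then complete user/assistant pairs are popped. A condensed, slightly simplified model (the real method works on `Ollama::Message` objects, preserves system messages, and prints a summary):

```ruby
# Simplified sketch of the new drop(n) behavior on an array of role symbols.
def drop_exchanges(roles, n)
  if roles.last == :user          # an unanswered user message is dropped first
    roles.pop
    n -= 1
  end
  dropped = 0
  while roles.size >= 2 && n > 0  # one exchange = user + assistant pair
    roles.pop(2)
    dropped += 1
    n -= 1
  end
  dropped
end

conv = [:user, :assistant, :user]
drop_exchanges(conv, 2)  # => 1 (the lone :user plus one full exchange)
conv                     # => []
```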
data/lib/ollama_chat/model_handling.rb CHANGED
@@ -1,15 +1,44 @@
  module OllamaChat::ModelHandling
+
+   # The model_present? method checks if the specified Ollama model is available.
+   #
+   # @param model [ String ] the name of the Ollama model
+   #
+   # @return [ String, FalseClass ] the system prompt if the model is present,
+   #   false otherwise
    def model_present?(model)
      ollama.show(model:) { return _1.system.to_s }
    rescue Ollama::Errors::NotFoundError
      false
    end
 
+   # The pull_model_from_remote method attempts to retrieve a model from the
+   # remote server if it is not found locally.
+   #
+   # @param model [ String ] the name of the model to be pulled
+   #
+   # @return [ nil ]
    def pull_model_from_remote(model)
      STDOUT.puts "Model #{bold{model}} not found locally, attempting to pull it from remote now…"
      ollama.pull(model:)
    end
 
+   # The pull_model_unless_present method checks if the specified model is
+   # present on the system.
+   #
+   # If the model is already present, it returns its system prompt.
+   #
+   # Otherwise, it attempts to pull the model from the remote server using the
+   # pull_model_from_remote method. If the model is still not found after
+   # pulling, it exits the program with a message indicating that the model was
+   # not found remotely.
+   #
+   # @param model [ String ] The name of the model to check for presence.
+   # @param options [ Hash ] Options for the pull_model_from_remote method.
+   #
+   # @return [ String, FalseClass ] the system prompt if the model is present,
+   #   false otherwise.
    def pull_model_unless_present(model, options)
      if system = model_present?(model)
        return system.full?
data/lib/ollama_chat/server_socket.rb CHANGED
@@ -1,65 +1,42 @@
  module OllamaChat::ServerSocket
    class << self
-     # Returns the path to the XDG runtime directory, or a default path if not set.
-     # @return [String] the expanded path to the XDG runtime directory
-     def runtime_dir
-       File.expand_path(ENV.fetch('XDG_RUNTIME_DIR', '~/.local/run'))
-     end
-
-     # Constructs the full path to the server socket file.
-     # @return [String] the full path to the Unix socket
-     def server_socket_path
-       File.join(runtime_dir, 'ollama_chat.sock')
-     end
+     # The send_to_server_socket method sends content to the server socket and returns
+     # the response if type is :socket_input_with_response, otherwise it returns nil.
 
-     # Sends a message to the server socket.
+     # @param content [ String ] the message to be sent to the server
+     # @param type [ Symbol ] the type of message being sent (default: :socket_input)
      #
-     # @param content [String] the content to send
-     # @param type [Symbol] the type of message (default: :socket_input)
-     # @raise [Errno::ENOENT] if the socket file does not exist
-     # @raise [Errno::ECONNREFUSED] if the socket is not listening (server not running)
+     # @return [ String, NilClass ] the response from the server if type is
+     #   :socket_input_with_response, otherwise nil.
      def send_to_server_socket(content, type: :socket_input)
-       FileUtils.mkdir_p runtime_dir
+       server = UnixSocks::Server.new(socket_name: 'ollama_chat.sock')
        message = { content:, type: }
-       socket = UNIXSocket.new(server_socket_path)
-       socket.puts JSON(message)
-       socket.close
+       if type.to_sym == :socket_input_with_response
+         return server.transmit_with_response(message)
+       else
+         server.transmit(message)
+         nil
+       end
      end
    end
 
-   # Accessor for the server socket message.
-   # Holds the last message received from the Unix socket.
-   # @return [String, nil] the message content, or nil if not set
-   # @see OllamaChat::ServerSocket#init_server_socket
-   # @see OllamaChat::ServerSocket#send_to_server_socket
    attr_accessor :server_socket_message
 
-   # Initializes a Unix domain socket server for OllamaChat.
+   # Initializes the server socket to receive messages from the Ollama Chat
+   # client.
    #
-   # Creates the necessary runtime directory, checks for existing socket file,
-   # and starts a server loop in a new thread. Listens for incoming connections,
-   # reads JSON data, and terminates the server upon receiving a message.
+   # This method sets up a Unix domain socket server that listens for incoming
+   # messages in the background. When a message is received, it updates the
+   # instance variable `server_socket_message` and sends an interrupt signal
+   # to the current process in order to handle the message.
    #
-   # Raises Errno::EEXIST if the socket path already exists.
+   # @return [ nil ] This method does not return any value; it only sets up the
+   #   server socket and interrupts the process when a message is received.
    def init_server_socket
-     FileUtils.mkdir_p OllamaChat::ServerSocket.runtime_dir
-     if File.exist?(OllamaChat::ServerSocket.server_socket_path)
-       raise Errno::EEXIST, "Path already exists #{OllamaChat::ServerSocket.server_socket_path.inspect}"
-     end
-     Thread.new do
-       Socket.unix_server_loop(OllamaChat::ServerSocket.server_socket_path) do |sock, client_addrinfo|
-         begin
-           data = sock.readline.chomp
-           self.server_socket_message = JSON.load(data)
-           Process.kill :INT, $$
-         rescue JSON::ParserError
-         ensure
-           sock.close
-         end
-       end
-     rescue Errno::ENOENT
-     ensure
-       FileUtils.rm_f OllamaChat::ServerSocket.server_socket_path
+     server = UnixSocks::Server.new(socket_name: 'ollama_chat.sock')
+     server.receive_in_background do |message|
+       self.server_socket_message = message
+       Process.kill :INT, $$
      end
    end
  end
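Under the hood this is a JSON-over-Unix-socket exchange; the `unix_socks` gem now hides the socket setup and framing. A self-contained stdlib sketch of the round trip, with a socketpair standing in for the gem's named socket (the message keys mirror the `{ content:, type: }` hash built above):

```ruby
require 'socket'
require 'json'

# Client and server ends of an anonymous Unix socket pair (illustrative
# stand-in for the real ollama_chat.sock in the runtime directory).
client, server = UNIXSocket.pair

# Client side: send one newline-delimited JSON message.
client.puts JSON.generate(content: 'Tell me a joke.', type: :socket_input_with_response)

# Server side: parse the request, then answer with a role/content message.
request = JSON.parse(server.gets)
request['type']  # => "socket_input_with_response"
server.puts JSON.generate(role: 'assistant', content: 'A joke.')

# Client side: read the response, as ollama_chat_send -r does.
reply = JSON.parse(client.gets)
reply['role']    # => "assistant"
```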
data/lib/ollama_chat/source_fetching.rb CHANGED
@@ -50,7 +50,7 @@ module OllamaChat::SourceFetching
 
  def import_source(source_io, source)
    source = source.to_s
-   document_type = source_io&.content_type.full? { |ct| italic { ct } } + ' '
+   document_type = source_io&.content_type.full? { |ct| italic { ct } + ' ' }
    STDOUT.puts "Importing #{document_type}document #{source.to_s.inspect} now."
    source_content = parse_source(source_io)
    "Imported #{source.inspect}:\n\n#{source_content}\n\n"
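The one-character move of the `+ ' '` into the block is the v0.0.15 crash fix: Tins' `full?` returns `nil` for blank values, so appending outside the block meant `nil + ' '`. A sketch with a simplified stand-in for `full?` (the `full` helper below is hypothetical, for illustration only):

```ruby
# Simplified stand-in for Tins' full?: yields only non-empty values,
# returns nil otherwise.
def full(value)
  value && !value.strip.empty? ? yield(value) : nil
end

content_type = nil
# Old shape: full(content_type) { |ct| ct } + ' '  -> nil + ' ' raises NoMethodError.
# New shape: the space is appended inside the block, so nil propagates safely.
document_type = full(content_type) { |ct| ct + ' ' }  # => nil
"Importing #{document_type}document"                   # => "Importing document"

document_type = full('text/html') { |ct| ct + ' ' }
"Importing #{document_type}document"                   # => "Importing text/html document"
```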
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
  module OllamaChat
    # OllamaChat version
-   VERSION = '0.0.13'
+   VERSION = '0.0.15'
    VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
    VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
    VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat.rb CHANGED
@@ -3,6 +3,7 @@ end
 
  require 'ollama'
  require 'documentrix'
+ require 'unix_socks'
  require 'ollama_chat/version'
  require 'ollama_chat/utils'
  require 'ollama_chat/message_format'
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
  # -*- encoding: utf-8 -*-
- # stub: ollama_chat 0.0.13 ruby lib
+ # stub: ollama_chat 0.0.15 ruby lib
 
  Gem::Specification.new do |s|
    s.name = "ollama_chat".freeze
-   s.version = "0.0.13".freeze
+   s.version = "0.0.15".freeze
 
    s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
    s.require_paths = ["lib".freeze]
@@ -18,7 +18,7 @@ Gem::Specification.new do |s|
    s.licenses = ["MIT".freeze]
    s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
    s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
-   s.rubygems_version = "3.6.7".freeze
+   s.rubygems_version = "3.6.9".freeze
    s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
    s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
@@ -34,6 +34,7 @@ Gem::Specification.new do |s|
    s.add_runtime_dependency(%q<excon>.freeze, ["~> 1.0".freeze])
    s.add_runtime_dependency(%q<ollama-ruby>.freeze, ["~> 1.2".freeze])
    s.add_runtime_dependency(%q<documentrix>.freeze, ["~> 0.0".freeze, ">= 0.0.2".freeze])
+   s.add_runtime_dependency(%q<unix_socks>.freeze, [">= 0".freeze])
    s.add_runtime_dependency(%q<rss>.freeze, ["~> 0.3".freeze])
    s.add_runtime_dependency(%q<term-ansicolor>.freeze, ["~> 1.11".freeze])
    s.add_runtime_dependency(%q<redis>.freeze, ["~> 5.0".freeze])
spec/ollama_chat/chat_spec.rb CHANGED
@@ -5,6 +5,10 @@ RSpec.describe OllamaChat::Chat do
    %w[ -C test ]
  end
 
+ before do
+   ENV['OLLAMA_CHAT_MODEL'] = 'llama3.1'
+ end
+
  let :chat do
    OllamaChat::Chat.new(argv: argv).expose
  end
@@ -86,7 +90,7 @@ RSpec.describe OllamaChat::Chat do
 
  it 'returns :next when input is "/system"' do
    expect(chat).to receive(:change_system_prompt).with(nil)
-   expect(chat).to receive(:info)
+   expect(chat.messages).to receive(:show_system_prompt)
    expect(chat.handle_input("/system")).to eq :next
  end
 
@@ -288,7 +292,8 @@ RSpec.describe OllamaChat::Chat do
  /
  Running\ ollama_chat\ version|
  Connected\ to\ ollama\ server|
- Current\ model|
+ Current\ conversation\ model|
+ Current\ embedding\ model|
  Options|
  Embedding|
  Text\ splitter|
@@ -297,8 +302,10 @@ RSpec.describe OllamaChat::Chat do
  Streaming|
  Location|
  Document\ policy|
- Thinking\ is|
- Currently\ selected\ search\ engine
+ Thinking|
+ Voice\ output|
+ Currently\ selected\ search\ engine|
+ Conversation\ length
  /x
  ).at_least(1)
  expect(chat.info).to be_nil
spec/ollama_chat/information_spec.rb CHANGED
@@ -24,7 +24,7 @@ RSpec.describe OllamaChat::Information do
 
  it 'can show info' do
    expect(STDOUT).to receive(:puts).with(/Connected to ollama server version/)
-   expect(STDOUT).to receive(:puts).with(/Current model is/)
+   expect(STDOUT).to receive(:puts).with(/Current conversation model is/)
    expect(STDOUT).to receive(:puts).at_least(1)
    expect(chat.info).to be_nil
  end
@@ -39,7 +39,16 @@ RSpec.describe OllamaChat::Information do
    expect(chat.usage).to eq 0
  end
 
- it 'can show version' do
+ it 'can show version' do
+   expect(STDOUT).to receive(:puts).with(/^ollama_chat \d+\.\d+\.\d+$/)
    expect(chat.version).to eq 0
  end
+
+ it 'can show server version' do
+   expect(chat.server_version).to eq '6.6.6'
+ end
+
+ it 'can show server URL' do
+   expect(chat.server_url).to be_a URI::HTTP
+ end
end
@@ -64,9 +64,11 @@ RSpec.describe OllamaChat::MessageList do
64
64
  FileUtils.rm_f 'tmp/test-conversation.json'
65
65
  end
66
66
 
67
- it 'can list conversations' do
67
+ it 'can list conversations without thinking' do
68
68
  expect(chat).to receive(:markdown).
69
69
  and_return(double(on?: true)).at_least(:once)
70
+ expect(chat).to receive(:think).
71
+ and_return(double(on?: false)).at_least(:once)
70
72
  list << Ollama::Message.new(role: 'user', content: 'world')
71
73
  expect(STDOUT).to receive(:puts).
72
74
  with("📨 \e[1m\e[38;5;213msystem\e[0m\e[0m:\nhello\n")
@@ -75,6 +77,21 @@ RSpec.describe OllamaChat::MessageList do
  list.list_conversation
  end

+ it 'can list conversations with thinking' do
+ expect(chat).to receive(:markdown).
+ and_return(double(on?: true)).at_least(:once)
+ expect(chat).to receive(:think).
+ and_return(double(on?: true)).at_least(:once)
+ expect(STDOUT).to receive(:puts).
+ with("📨 \e[1m\e[38;5;213msystem\e[0m\e[0m:\n💭\nI need to say something nice…\n\n💬\nhello\n")
+ expect(STDOUT).to receive(:puts).
+ with("📨 \e[1m\e[38;5;172muser\e[0m\e[0m:\nworld\n")
+ list.set_system_prompt nil
+ list << Ollama::Message.new(role: 'system', content: 'hello', thinking: 'I need to say something nice…')
+ list << Ollama::Message.new(role: 'user', content: 'world')
+ list.list_conversation
+ end
+
  it 'can show_system_prompt' do
  expect(list).to receive(:system).and_return 'test **prompt**'
  expect(Kramdown::ANSI).to receive(:parse).with('test **prompt**').
@@ -82,10 +99,27 @@ RSpec.describe OllamaChat::MessageList do
  expect(list.show_system_prompt).to eq list
  end

- it 'can set_system_prompt' do
+ it 'can set_system_prompt if unset' do
+ list.messages.clear
+ expect(list.messages.count { _1.role == 'system' }).to eq 0
  expect {
  expect(list.set_system_prompt('test prompt')).to eq list
  }.to change { list.system }.from(nil).to('test prompt')
+ expect(list.messages.count { _1.role == 'system' }).to eq 1
+ end
+
+ it 'can set_system_prompt if already set' do
+ list.messages.clear
+ expect(list.messages.count { _1.role == 'system' }).to eq 0
+ list.set_system_prompt('first prompt')
+ expect(list.system).to eq('first prompt')
+ expect(list.messages.count { _1.role == 'system' }).to eq 1
+ #
+ list.set_system_prompt('new prompt')
+ expect(list.system).to eq('new prompt')
+ expect(list.messages.count { _1.role == 'system' }).to eq 1
+ expect(list.messages.first.role).to eq('system')
+ expect(list.messages.first.content).to eq('new prompt')
  end

  it 'can drop n conversations exhanges' do
@@ -94,11 +128,26 @@ RSpec.describe OllamaChat::MessageList do
  expect(list.size).to eq 1
  list << Ollama::Message.new(role: 'user', content: 'world')
  expect(list.size).to eq 2
- expect(list.drop(1)).to eq 0
  list << Ollama::Message.new(role: 'assistant', content: 'hi')
  expect(list.size).to eq 3
  expect(list.drop(1)).to eq 1
  expect(list.size).to eq 1
+ expect(list.drop(1)).to eq 0
+ expect(list.size).to eq 1
+ expect(list.drop(1)).to eq 0
+ expect(list.size).to eq 1
+ end
+
+ it 'drops the last user message when there is no assistant response' do
+ expect(list.size).to eq 1
+ list << Ollama::Message.new(role: 'user', content: 'hello')
+ list << Ollama::Message.new(role: 'assistant', content: 'hi')
+ list << Ollama::Message.new(role: 'user', content: 'world')
+ expect(list.size).to eq 4
+ expect(list.drop(1)).to eq 1
+ expect(list.size).to eq 3
+ expect(list.drop(1)).to eq 1
+ expect(list.size).to eq 1
  end

  it 'can determine location for system prompt' do
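The updated `drop` expectations above suggest the following semantics: `drop(n)` removes up to `n` complete exchanges from the end of the list, where an exchange is a user message plus its assistant reply, or a trailing unanswered user message on its own, and returns how many exchanges were actually removed. A hedged sketch of that behavior (names mirror the specs, but this is not the gem's actual implementation):

```ruby
# Hypothetical sketch of the drop semantics exercised by the specs above;
# not the gem's real MessageList class.
Message = Struct.new(:role, :content)

class MessageList
  attr_reader :messages

  def initialize(messages = [])
    @messages = messages
  end

  def size
    @messages.size
  end

  def <<(message)
    @messages << message
    self
  end

  # Removes up to n exchanges from the end; returns the number removed.
  def drop(n)
    dropped = 0
    n.times do
      case @messages.last&.role
      when 'assistant'
        @messages.pop(2) # assistant reply plus the user message before it
        dropped += 1
      when 'user'
        @messages.pop    # unanswered user message counts as one exchange
        dropped += 1
      else
        break            # only a system prompt (or nothing) left to drop
      end
    end
    dropped
  end
end

list = MessageList.new([Message.new('system', 'hello')])
list << Message.new('user', 'hello') << Message.new('assistant', 'hi') << Message.new('user', 'world')
p list.drop(1) # trailing unanswered user message → 1
p list.size    # → 3
```

Under this reading, dropping past the system prompt returns 0 and leaves the list unchanged, which matches the repeated `expect(list.drop(1)).to eq 0` checks in the spec.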
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ollama_chat
  version: !ruby/object:Gem::Version
- version: 0.0.13
+ version: 0.0.15
  platform: ruby
  authors:
  - Florian Frank
@@ -155,6 +155,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: 0.0.2
+ - !ruby/object:Gem::Dependency
+ name: unix_socks
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  - !ruby/object:Gem::Dependency
  name: rss
  requirement: !ruby/object:Gem::Requirement
@@ -479,7 +493,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.7
+ rubygems_version: 3.6.9
  specification_version: 4
  summary: A command-line interface (CLI) for interacting with an Ollama AI model.
  test_files: