ollama_chat 0.0.54 → 0.0.56

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: f0a01656cc0ad8d063570956971ee34ebf94929e90feb3c5f8207879d8367a12
-  data.tar.gz: 87ce3880db6bfcbdaf049618a8922536bdcbdd20b15366187612fb0e33594fa0
+  metadata.gz: 683c0bf5fdb8756030c0c5b69767d95f8ae852094a0048cb61b4cf7166f6a045
+  data.tar.gz: 8e7f9022cfd1e9c32f987b76196222d346100cd5ceaa6d59e6dc5c30257e5dbb
 SHA512:
-  metadata.gz: c53b7fa09b117940de484fda33145e8f68ef7340da1e392e335f7aca5952c11ecbcad30a32e78551df1ff7972b20d7519a58a64dd808b5e9fecb309445ab12f8
-  data.tar.gz: c83bb2ae1c82af0aa66a99262c94d0fc7060807d3333d6ffd261bd558d0d338502419ca6ddf3a8b9b31f5a7ef48dc2c15ad6bd42d227a86290506bab12f3f7de
+  metadata.gz: 854af0ea07ced05e3f2f36741dd98592a2f6968e3307662e10c14bcb9a10b2168a1dde962de32f44e73719d177547cbccd8eee76c0554878a0de7ce2e1811772
+  data.tar.gz: 86b71f7b0748be19d0f772213b1a21ea8c7f838296bcd48348076ef39372e0a0f07db20c59a1c1b56f8a7592d100993a7ffa6b6507b4febbbca2977f0375baff
data/CHANGES.md CHANGED
@@ -1,5 +1,57 @@
 # Changes
 
+## 2026-01-17 v0.0.56
+
+- Updated `context_spook` dependency from version **1.4** to **1.5**
+  - Expanded context file inclusion to support YAML files
+- Updated `context_spook` method to pass `format` parameter to
+  `ContextSpook::generate_context` calls
+- Added `context` section to default configuration with `format: JSON` setting
+- Added `/reconnect` command to reset Ollama connection
+- Introduced `connect_ollama` method to create new Ollama client instances with
+  current configuration
+- Added `base_url` method to resolve connection URL from command-line or
+  environment config
+- Updated `handle_input` to process `/reconnect` command and trigger
+  reconnection
+- Enhanced `OllamaChat::InputContent#input` method to select and read multiple
+  files matching a glob pattern
+- Updated `OllamaChat::InputContent#choose_filename` to accept a `chosen`
+  parameter for tracking selections
+- Modified test cases in `spec/ollama_chat/input_content_spec.rb` to verify
+  multiple file selection behavior
+- Files are now concatenated with filename headers in the output
+- Maintains backward compatibility with single file selection
+- Uses `Set` for efficient duplicate prevention during selection
+- Removed specialized CSV parsing functionality from `OllamaChat::Parsing`
+  module
+- Handle nil from `STDIN.gets` to prevent `NoMethodError`
+
+## 2026-01-08 v0.0.55
+
+- Added `OllamaChat::Vim` class for inserting text into Vim buffers using the
+  clientserver protocol
+- Implemented `server_name` and `clientserver` attribute readers for the
+  `OllamaChat::Vim` class
+- Enhanced error message for missing vim server to include the specific server
+  name being used
+- Added error handling to the `insert` method with fallback to `STDERR` output
+  when vim command fails
+- Added comprehensive specs for `OllamaChat::Vim` module with **100%** coverage
+- Fixed typo in help text for `ollama_chat_send` command ("reqired" →
+  "required")
+- Added comprehensive tests for `ThinkControl` module with **100%** coverage
+- Updated `README.md` and `OllamaChat::Information` to document the new
+  `/revise_last` command
+- Improved test coverage for Vim integration
+- Ensured proper state management and no side effects during selection in
+  `ThinkControl` tests
+- All tests use `connect_to_ollama_server` for proper setup
+- Fixed edge cases including exit conditions and nil selections in
+  `ThinkControl` tests
+- Included tests for combined logic with `think_loud` switch in `ThinkControl`
+  tests
+
 ## 2026-01-08 v0.0.54
 
 ### New Features
data/README.md CHANGED
@@ -151,6 +151,7 @@ subject - the young, blue-eyed cat.
 The following commands can be given inside the chat, if prefixed by a `/`:
 
 ```
+/reconnect reconnect to current ollama server
 /copy to copy last response to clipboard
 /paste to paste content
 /markdown toggle markdown output
@@ -183,6 +184,7 @@ The following commands can be given inside the chat, if prefixed by a `/`:
 /compose compose content using an EDITOR
 /input [pattern] select and read content from a file (default: **/*)
 /context [pattern...] collect context with glob patterns
+/revise_last edit the last response in an external editor
 /output filename save last response to filename
 /pipe command write last response to command's stdin
 /vim insert the last message into a vim server
data/Rakefile CHANGED
@@ -58,7 +58,7 @@ GemHadar do
   dependency 'bigdecimal', '~> 3.1'
   dependency 'csv', '~> 3.0'
   dependency 'const_conf', '~> 0.3'
-  dependency 'context_spook', '~> 1.1'
+  dependency 'context_spook', '~> 1.5'
   development_dependency 'all_images', '~> 0.6'
   development_dependency 'rspec', '~> 3.2'
   development_dependency 'kramdown', '~> 2.0'
data/bin/ollama_chat_send CHANGED
@@ -25,7 +25,7 @@ def usage(rc = 0)
     -d DIR the working directory to derive the socket file from
     -h Show this help message
 
-    Send data to a running Ollame Chat client via standard input.
+    Send data to a running Ollama Chat client via standard input.
   EOT
   exit rc
 end
data/lib/ollama_chat/chat.rb CHANGED
@@ -88,15 +88,7 @@ class OllamaChat::Chat
     @ollama_chat_config = OllamaChat::OllamaChatConfig.new(@opts[?f])
     self.config = @ollama_chat_config.config
     setup_switches(config)
-    base_url = @opts[?u] || OllamaChat::EnvConfig::OLLAMA::URL
-    @ollama = Ollama::Client.new(
-      connect_timeout: config.timeouts.connect_timeout?,
-      read_timeout: config.timeouts.read_timeout?,
-      write_timeout: config.timeouts.write_timeout?,
-      base_url: base_url,
-      debug:,
-      user_agent:
-    )
+    @ollama = connect_ollama
     if server_version.version < '0.9.0'.version
       raise ArgumentError, 'require ollama API version 0.9.0 or higher'
     end
@@ -213,6 +205,11 @@ class OllamaChat::Chat
   # the content to be processed, or nil for no action needed
   def handle_input(content)
     case content
+    when %r(^/reconnect)
+      STDERR.print green { "Reconnecting to ollama #{base_url.to_s.inspect}…" }
+      @ollama = connect_ollama
+      STDERR.puts green { " Done." }
+      :next
     when %r(^/copy$)
       copy_to_clipboard
       :next
@@ -680,6 +677,21 @@ class OllamaChat::Chat
 
   private
 
+  def base_url
+    @opts[?u] || OllamaChat::EnvConfig::OLLAMA::URL
+  end
+
+  def connect_ollama
+    Ollama::Client.new(
+      connect_timeout: config.timeouts.connect_timeout?,
+      read_timeout: config.timeouts.read_timeout?,
+      write_timeout: config.timeouts.write_timeout?,
+      base_url: base_url,
+      debug:,
+      user_agent:
+    )
+  end
+
   # The setup_documents method initializes the document processing pipeline by
   # configuring the embedding model and database connection.
   # It then loads specified documents into the system and returns the
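The `/reconnect` branch added to `handle_input` above uses the same `case`/`when Regexp` dispatch as the surrounding commands. A minimal, self-contained sketch of that pattern (the `dispatch` method and its return values are hypothetical, not from the gem):

```ruby
# Regexp-based command dispatch, as in handle_input (hypothetical names).
def dispatch(content)
  case content
  when %r(^/reconnect) then :reconnect # prefix match, trailing text allowed
  when %r(^/copy$)     then :copy      # anchored match, exact command only
  else                      :chat      # anything else is a normal chat message
  end
end

p dispatch('/reconnect') # => :reconnect
p dispatch('/copy')      # => :copy
p dispatch('/copy now')  # => :chat (fails the anchored /copy$ match)
```

Note the asymmetry visible in the real diff as well: `/reconnect` is matched as a prefix while `/copy` is anchored with `$`.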
data/lib/ollama_chat/dialog.rb CHANGED
@@ -76,7 +76,7 @@ module OllamaChat::Dialog
   # @return [ String ] the user's response with trailing newline removed
   def ask?(prompt:)
     print prompt
-    STDIN.gets.chomp
+    STDIN.gets.to_s.chomp
   end
 
   # The choose_collection method presents a menu to select or create a document
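The `STDIN.gets.to_s.chomp` change above guards against `IO#gets` returning nil at end of input (for example when stdin is closed or empty), which is the `NoMethodError` mentioned in the changelog. A small sketch of the failure mode, using `StringIO` to stand in for `STDIN`:

```ruby
require 'stringio'

empty_input = StringIO.new('')

line = empty_input.gets  # gets returns nil at EOF
# line.chomp             # would raise NoMethodError: undefined method 'chomp' for nil

safe = line.to_s.chomp   # nil.to_s => "", so chomp is always safe
p safe # => ""
```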
data/lib/ollama_chat/information.rb CHANGED
@@ -118,6 +118,7 @@ module OllamaChat::Information
  # interface.
  private def display_chat_help_message
    <<~EOT
+      /reconnect reconnect to current ollama server
      /copy to copy last response to clipboard
      /paste to paste content
      /markdown toggle markdown output
@@ -150,6 +151,7 @@ module OllamaChat::Information
      /compose compose content using an EDITOR
      /input [pattern] select and read content from a file (default: **/*)
      /context [pattern...] collect context with glob patterns
+      /revise_last edit the last response in an external editor
      /output filename save last response to filename
      /pipe command write last response to command's stdin
      /vim insert the last message into a vim server
data/lib/ollama_chat/input_content.rb CHANGED
@@ -8,35 +8,40 @@ require 'tempfile'
 # interactive file selection and context collection for enhancing chat
 # interactions with local or remote content.
 module OllamaChat::InputContent
-  # The input method reads and returns the content of a selected file.
+  # The input method selects and reads content from files matching a pattern.
   #
-  # This method searches for files matching the given pattern and presents them
-  # in an interactive chooser menu. If a file is selected, its content is read
-  # and returned. If the user chooses to exit or no file is selected, the
-  # method returns nil.
+  # This method prompts the user to select files matching the given glob
+  # pattern, reads their content, and returns a concatenated string with each
+  # file's content preceded by its filename.
   #
-  # @param pattern [ String ] the glob pattern to search for files (defaults to '**/*')
+  # @param pattern [String] the glob pattern to search for files (defaults to '**/*')
   #
-  # @return [ String, nil ] the content of the selected file or nil if no file
-  #   was chosen
+  # @return [String] a concatenated string of file contents with filenames as headers
   def input(pattern)
     pattern ||= '**/*'
-    if filename = choose_filename(pattern)
-      File.read(filename)
+    files = Set[]
+    while filename = choose_filename(pattern, chosen: files)
+      files << filename
     end
+    result = ''
+    files.each do |filename|
+      result << ("%s:\n\n%s\n\n" % [ filename, File.read(filename) ])
+    end
+    result.full?
   end
 
-  # The choose_filename method selects a file from a list of matching files.
-  #
-  # This method searches for files matching the given glob pattern, presents
-  # them in an interactive chooser menu, and returns the selected filename. If
-  # the user chooses to exit or no file is selected, the method returns nil.
+  # The choose_filename method selects a file from a list of matching files. It
+  # searches for files matching the given pattern, excludes already chosen
+  # files, and presents them in an interactive chooser menu.
   #
-  # @param pattern [ String ] the glob pattern to search for files (defaults to '**/*')
+  # @param pattern [ String ] the glob pattern to search for files
+  # @param chosen [ Set ] a set of already chosen filenames to exclude from
+  #   selection
   #
-  # @return [ String, nil ] the path to the selected file or nil if no file was chosen
-  def choose_filename(pattern)
-    files = Dir.glob(pattern).select { File.file?(_1) }
+  # @return [ String, nil ] the selected filename or nil if no file was chosen or user exited
+  def choose_filename(pattern, chosen: nil)
+    files = Dir.glob(pattern).reject { chosen&.member?(_1) }.
+      select { File.file?(_1) }
     files.unshift('[EXIT]')
    case chosen = OllamaChat::Utils::Chooser.choose(files)
    when '[EXIT]', nil
@@ -75,8 +80,9 @@ module OllamaChat::InputContent
   # @example Load default context
   #   context_spook(nil)
   def context_spook(patterns)
+    format = config.context.format
     if patterns
-      ContextSpook::generate_context(verbose: true) do |context|
+      ContextSpook::generate_context(verbose: true, format:) do |context|
        context do
          Dir.glob(patterns).each do |filename|
            File.file?(filename) or next
@@ -86,7 +92,7 @@ module OllamaChat::InputContent
      end.to_json
    else
      if context_filename = choose_filename('.contexts/*.rb')
-        ContextSpook.generate_context(context_filename, verbose: true).to_json
+        ContextSpook.generate_context(context_filename, verbose: true, format:).to_json
      end
    end
  end
@@ -102,7 +108,7 @@ module OllamaChat::InputContent
   # @return [ String, nil ] the composed content if successful, nil otherwise
   def compose
     unless editor = OllamaChat::EnvConfig::EDITOR?
-      STDERR.puts "Editor reqired for compose, set env var "\
+      STDERR.puts "Editor required for compose, set env var "\
        "#{OllamaChat::EnvConfig::EDITOR!.env_var.inspect}."
      return
    end
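The `Set`-based multi-file selection added to `#input` above can be sketched in isolation. The `pick` lambda below is a purely hypothetical stand-in for the interactive `OllamaChat::Utils::Chooser` menu; the header-concatenation format matches the `"%s:\n\n%s\n\n"` template in the diff:

```ruby
require 'set'
require 'tmpdir'

result = nil
Dir.mktmpdir do |dir|
  File.write(File.join(dir, 'a.txt'), "alpha\n")
  File.write(File.join(dir, 'b.txt'), "beta\n")

  # Stand-in for the interactive chooser: returns each not-yet-chosen file
  # once, then nil to end the loop (hypothetical, for illustration only).
  pick = ->(pattern, chosen) do
    Dir.glob(pattern).select { File.file?(_1) }.find { !chosen.member?(_1) }
  end

  files = Set[]
  while filename = pick.(File.join(dir, '*.txt'), files)
    files << filename # Set membership prevents offering the same file twice
  end

  # Concatenate each file's content under a filename header, as #input does
  result = +''
  files.each do |filename|
    result << ("%s:\n\n%s\n\n" % [filename, File.read(filename)])
  end
end
puts result
```

The real method then calls `result.full?` (from the `tins` utility gem this codebase uses) so an empty selection yields nil, preserving the old single-file nil behavior.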
data/lib/ollama_chat/ollama_chat_config/default_config.yml CHANGED
@@ -43,6 +43,8 @@ stream: true
 document_policy: importing
 think: false
 think_loud: true
+context:
+  format: JSON
 embedding:
   enabled: true
   paused: false
data/lib/ollama_chat/parsing.rb CHANGED
@@ -30,8 +30,6 @@ module OllamaChat::Parsing
      end
      source_io.rewind
      source_io.read
-    when 'text/csv'
-      parse_csv(source_io)
    when 'application/rss+xml'
      parse_rss(source_io)
    when 'application/atom+xml'
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
 module OllamaChat
   # OllamaChat version
-  VERSION = '0.0.54'
+  VERSION = '0.0.56'
   VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
   VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
   VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat/vim.rb CHANGED
@@ -27,6 +27,18 @@ class OllamaChat::Vim
     @clientserver = clientserver || 'socket'
   end
 
+  # The server_name attribute reader returns the name of the Vim server to
+  # connect to.
+  #
+  # @return [ String ] the name of the Vim server
+  attr_reader :server_name
+
+  # The clientserver attribute reader returns the clientserver protocol to be
+  # used.
+  #
+  # @return [ String ] the clientserver protocol identifier
+  attr_reader :clientserver
+
   # The default_server_name method generates a standardized server name
   # based on a given name or the current working directory.
   #
@@ -54,14 +66,24 @@ class OllamaChat::Vim
   # The text is automatically indented to match the current column position.
   #
   # @param text [String] The text to be inserted into the Vim buffer
+  # @return [OllamaChat::Vim, nil] Returns self if successful or nil.
   def insert(text)
     spaces = (col - 1).clamp(0..)
     text = text.gsub(/^/, ' ' * spaces)
     Tempfile.open do |tmp|
       tmp.write(text)
       tmp.flush
-      system %{vim --clientserver "#@clientserver" --servername "#@server_name" --remote-send "<ESC>:r #{tmp.path}<CR>"}
+      result = system %{
+        vim --clientserver "#@clientserver" --servername "#@server_name" --remote-send "<ESC>:r #{tmp.path}<CR>"
+      }
+      unless result
+        STDERR.puts <<~EOT
+          Failed! vim is required in path and running with server name "#@server_name".
+        EOT
+        return
+      end
     end
+    self
   end
 
   # Returns the current column position of the cursor in the Vim server
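The error handling added to `insert` above hinges on `Kernel#system`'s return value: true for exit status 0, false for a nonzero exit, and nil when the command could not be executed at all (the "vim is required in path" case). A quick illustration, using the `ruby` executable itself as the test command:

```ruby
ok      = system('ruby', '-e', 'exit 0')  # command ran and exited 0
failed  = system('ruby', '-e', 'exit 1')  # command ran but exited nonzero
missing = system('no-such-command-xyzzy') # command could not be executed

p ok      # => true
p failed  # => false
p missing # => nil
```

Both false and nil are falsy, so a single `unless result` covers "vim not running with that server name" and "vim not installed" alike.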
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama_chat 0.0.54 ruby lib
+# stub: ollama_chat 0.0.56 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama_chat".freeze
-  s.version = "0.0.54".freeze
+  s.version = "0.0.56".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
@@ -13,18 +13,18 @@ Gem::Specification.new do |s|
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
   s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/input_content.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_editing.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/redis_cache.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/think_control.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
-  s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/input_content.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_editing.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/redis_cache.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/think_control.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/input_content_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_editing_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/redis_cache_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+  s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/input_content.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_editing.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/redis_cache.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/think_control.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/input_content_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_editing_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/redis_cache_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/think_control_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/vim_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
   s.homepage = "https://github.com/flori/ollama_chat".freeze
   s.licenses = ["MIT".freeze]
   s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
   s.required_ruby_version = Gem::Requirement.new(">= 3.2".freeze)
-  s.rubygems_version = "4.0.2".freeze
+  s.rubygems_version = "4.0.3".freeze
   s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
-  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/input_content_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_editing_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/redis_cache_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
+  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/input_content_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_editing_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/redis_cache_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/think_control_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/vim_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
   s.specification_version = 4
 
-  s.add_development_dependency(%q<gem_hadar>.freeze, [">= 2.16.3".freeze])
+  s.add_development_dependency(%q<gem_hadar>.freeze, [">= 2.17.0".freeze])
   s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
   s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
   s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
@@ -50,5 +50,5 @@ Gem::Specification.new do |s|
   s.add_runtime_dependency(%q<bigdecimal>.freeze, ["~> 3.1".freeze])
   s.add_runtime_dependency(%q<csv>.freeze, ["~> 3.0".freeze])
   s.add_runtime_dependency(%q<const_conf>.freeze, ["~> 0.3".freeze])
-  s.add_runtime_dependency(%q<context_spook>.freeze, ["~> 1.1".freeze])
+  s.add_runtime_dependency(%q<context_spook>.freeze, ["~> 1.5".freeze])
 end
data/spec/ollama_chat/chat_spec.rb CHANGED
@@ -28,6 +28,11 @@ describe OllamaChat::Chat, protect_env: true do
   describe 'handle_input' do
     connect_to_ollama_server
 
+    it 'returns :next when input is "/reconnect"' do
+      expect(chat).to receive(:connect_ollama).and_return double('ollama')
+      expect(chat.handle_input("/reconnect")).to eq :next
+    end
+
     it 'returns :next when input is "/copy"' do
       expect(chat).to receive(:copy_to_clipboard)
       expect(chat.handle_input("/copy")).to eq :next
data/spec/ollama_chat/input_content_spec.rb CHANGED
@@ -9,9 +9,12 @@ describe OllamaChat::InputContent do
 
   describe '#input' do
     it 'can read content from a selected file' do
+      selected_filename = 'spec/assets/example.rb'
       # Mock the file selection process
-      expect(chat).to receive(:choose_filename).with('**/*').
-        and_return('spec/assets/example.rb')
+      expect(chat).to receive(:choose_filename).with('**/*', chosen: Set[]).
+        and_return(selected_filename)
+      expect(chat).to receive(:choose_filename).with('**/*', chosen: Set[selected_filename]).
+        and_return nil
 
       # Test that it returns the file content
       result = chat.input(nil)
@@ -19,13 +22,19 @@ describe OllamaChat::InputContent do
     end
 
     it 'returns nil when no file is selected' do
-      expect(chat).to receive(:choose_filename).with('**/*').and_return(nil)
+      expect(chat).to receive(:choose_filename).with('**/*', chosen: Set[]).
+        and_return(nil)
       expect(chat.input(nil)).to be_nil
     end
 
     it 'can read content with specific pattern' do
-      expect(chat).to receive(:choose_filename).with('spec/assets/*').
-        and_return('spec/assets/example.rb')
+      selected_filename = 'spec/assets/example.rb'
+      expect(chat).to receive(:choose_filename).
+        with('spec/assets/*', chosen: Set[]).
+        and_return(selected_filename)
+      expect(chat).to receive(:choose_filename).
+        with('spec/assets/*', chosen: Set[selected_filename]).
+        and_return nil
       result = chat.input('spec/assets/*')
       expect(result).to include('puts "Hello World!"')
     end
@@ -108,7 +117,7 @@ describe OllamaChat::InputContent do
     it 'handles missing editor gracefully' do
       const_conf_as('OllamaChat::EnvConfig::EDITOR' => nil)
 
-      expect(STDERR).to receive(:puts).with(/Editor reqired for compose/)
+      expect(STDERR).to receive(:puts).with(/Editor required for compose/)
       expect(chat.compose).to be_nil
     end
 
data/spec/ollama_chat/parsing_spec.rb CHANGED
@@ -49,24 +49,7 @@ describe OllamaChat::Parsing do
       def io.content_type
         'text/csv'
       end
-      expect(chat.parse_source(io)).to eq(<<EOT)
-name: John Doe
-age: 32
-occupation: Software Engineer
-
-name: Jane Smith
-age: 28
-occupation: Marketing Manager
-
-name: Bob Johnson
-age: 45
-occupation: Retired
-
-name: Alice Brown
-age: 25
-occupation: Student
-
-EOT
+      expect(chat.parse_source(io)).to eq(asset_content('example.csv'))
     end
   end
 
@@ -0,0 +1,154 @@
+ require 'spec_helper'
+
+ describe OllamaChat::ThinkControl do
+ let :chat do
+ OllamaChat::Chat.new(
+ argv: %w[ -f lib/ollama_chat/ollama_chat_config/default_config.yml ]
+ )
+ end
+
+ connect_to_ollama_server
+
+ describe '#think' do
+ it 'returns the current think mode state' do
+ expect(chat.think).to be false
+ chat.instance_variable_set(:@think, true)
+ expect(chat.think).to be true
+ chat.instance_variable_set(:@think, false)
+ expect(chat.think).to be false
+ chat.instance_variable_set(:@think, 'low')
+ expect(chat.think).to eq 'low'
+ end
+ end
+
+ describe '#think?' do
+ it 'returns true when think mode is enabled (boolean true)' do
+ chat.instance_variable_set(:@think, true)
+ expect(chat.think?).to be true
+ end
+
+ it 'returns true when think mode is enabled (string value)' do
+ chat.instance_variable_set(:@think, 'high')
+ expect(chat.think?).to be true
+ end
+
+ it 'returns false when think mode is disabled (boolean false)' do
+ chat.instance_variable_set(:@think, false)
+ expect(chat.think?).to be false
+ end
+
+ it 'returns false when think mode is nil' do
+ chat.instance_variable_set(:@think, nil)
+ expect(chat.think?).to be false
+ end
+ end
+
+ describe '#think_mode' do
+ it 'returns "enabled" when think is true' do
+ chat.instance_variable_set(:@think, true)
+ expect(chat.think_mode).to eq 'enabled'
+ end
+
+ it 'returns the think value when it is a string' do
+ chat.instance_variable_set(:@think, 'medium')
+ expect(chat.think_mode).to eq 'medium'
+ end
+
+ it 'returns "disabled" when think is false' do
+ chat.instance_variable_set(:@think, false)
+ expect(chat.think_mode).to eq 'disabled'
+ end
+
+ it 'returns "disabled" when think is nil' do
+ chat.instance_variable_set(:@think, nil)
+ expect(chat.think_mode).to eq 'disabled'
+ end
+ end
+
+ describe '#think_show' do
+ it 'displays the current think mode status' do
+ chat.instance_variable_set(:@think, true)
+ expect(STDOUT).to receive(:puts).with(/Think mode is \e\[1menabled\e\[0m\./)
+ chat.think_show
+ end
+
+ it 'displays the think mode level when set to string' do
+ chat.instance_variable_set(:@think, 'high')
+ expect(STDOUT).to receive(:puts).with(/Think mode is \e\[1mhigh\e\[0m\./)
+ chat.think_show
+ end
+
+ it 'displays "disabled" when think is false' do
+ chat.instance_variable_set(:@think, false)
+ expect(STDOUT).to receive(:puts).with(/Think mode is \e\[1mdisabled\e\[0m\./)
+ chat.think_show
+ end
+
+ it 'displays "disabled" when think is nil' do
+ chat.instance_variable_set(:@think, nil)
+ expect(STDOUT).to receive(:puts).with(/Think mode is \e\[1mdisabled\e\[0m\./)
+ chat.think_show
+ end
+ end
+
+ describe '#think_loud?' do
+ it 'returns false when think is disabled' do
+ chat.instance_variable_set(:@think, false)
+ expect(chat.think_loud?).to be false
+ end
+
+ it 'returns false when think_loud is off' do
+ chat.instance_variable_set(:@think, true)
+ allow(chat).to receive(:think_loud).and_return(double(on?: false))
+ expect(chat.think_loud?).to be false
+ end
+
+ it 'returns true when both think and think_loud are enabled' do
+ chat.instance_variable_set(:@think, true)
+ allow(chat).to receive(:think_loud).and_return(double(on?: true))
+ expect(chat.think_loud?).to be true
+ end
+ end
+
+ describe '#choose_think_mode' do
+ it 'can select "off" mode' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('off')
+ chat.choose_think_mode
+ expect(chat.think).to be false
+ end
+
+ it 'can select "on" mode' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('on')
+ chat.choose_think_mode
+ expect(chat.think).to be true
+ end
+
+ it 'can select "low" mode' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('low')
+ chat.choose_think_mode
+ expect(chat.think).to eq 'low'
+ end
+
+ it 'can select "medium" mode' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('medium')
+ chat.choose_think_mode
+ expect(chat.think).to eq 'medium'
+ end
+
+ it 'can select "high" mode' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('high')
+ chat.choose_think_mode
+ expect(chat.think).to eq 'high'
+ end
+
+ it 'can exit selection' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return('[EXIT]')
+ expect { chat.choose_think_mode }.not_to change { chat.think }
+ end
+
+ it 'can handle nil selection' do
+ expect(OllamaChat::Utils::Chooser).to receive(:choose).and_return(nil)
+ expect { chat.choose_think_mode }.not_to change { chat.think }
+ end
+ end
+ end
@@ -0,0 +1,88 @@
+ require 'spec_helper'
+
+ describe OllamaChat::Vim do
+ let(:server_name) { 'TEST_SERVER' }
+ let(:vim) { described_class.new(server_name) }
+
+ describe '#initialize' do
+ it 'can be initialized with a server name' do
+ expect(vim.server_name).to eq server_name
+ end
+
+ it 'can be initialized without a server name' do
+ vim_without_name = described_class.new(nil)
+ expect(vim_without_name.server_name).to be_a(String)
+ end
+
+ it 'uses socket as default clientserver protocol' do
+ expect(vim.clientserver).to eq 'socket'
+ end
+
+ it 'can specify clientserver protocol' do
+ vim_with_protocol = described_class.new(server_name, clientserver: 'pipe')
+ expect(vim_with_protocol.clientserver).to eq 'pipe'
+ end
+ end
+
+ describe '.default_server_name' do
+ it 'generates a standardized server name from a directory' do
+ name = described_class.default_server_name('/path/to/project')
+ expect(name).to match(/\A[A-Z0-9]+-[A-Z0-9]+\z/)
+ expect(name).to include('PROJECT')
+ end
+
+ it 'handles current working directory' do
+ name = described_class.default_server_name
+ expect(name).to be_a(String)
+ expect(name).to_not be_empty
+ end
+
+ it 'generates consistent names for same path' do
+ name1 = described_class.default_server_name('/tmp/test')
+ name2 = described_class.default_server_name('/tmp/test')
+ expect(name1).to eq name2
+ end
+ end
+
+ describe '#insert' do
+ it 'can insert text into vim' do
+ expect(vim).to receive(:`).with(
+ /vim.*--remote-expr.*col\('\.'\)/
+ ).and_return("5\n")
+ # Mock the system call to avoid actual vim interaction
+ expect(vim).to receive(:system).with(
+ /vim.*--servername.*#{server_name}.*--remote-send/
+ )
+ vim.insert('test content')
+ end
+
+ it 'handles text indentation' do
+ # Mock the col method to return a specific column
+ expect(vim).to receive(:`).with(
+ /vim.*--remote-expr.*col\('\.'\)/
+ ).and_return("5\n")
+ tmp = double('Tempfile', flush: true, path: '/tmp/test')
+ expect(Tempfile).to receive(:open).and_yield(tmp)
+ expect(tmp).to receive(:write).with(' test content')
+ expect(vim).to receive(:system).with(
+ /vim --clientserver.*--servername.*--remote-send.*\/tmp\/test/
+ ).and_return true
+ vim.insert('test content')
+ end
+ end
+
+ describe '#col' do
+ it 'can get current column position' do
+ # Mock the system call to return a specific column
+ expect(vim).to receive(:`).with(
+ /vim.*--remote-expr.*col\('\.'\)/
+ ).and_return("5\n")
+ expect(vim.col).to eq 5
+ end
+
+ it 'handles empty response' do
+ expect(vim).to receive(:`).and_return("\n")
+ expect(vim.col).to eq 0
+ end
+ end
+ end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ollama_chat
  version: !ruby/object:Gem::Version
- version: 0.0.54
+ version: 0.0.56
  platform: ruby
  authors:
  - Florian Frank
@@ -15,14 +15,14 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 2.16.3
+ version: 2.17.0
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 2.16.3
+ version: 2.17.0
  - !ruby/object:Gem::Dependency
  name: all_images
  requirement: !ruby/object:Gem::Requirement
@@ -385,14 +385,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '1.1'
+ version: '1.5'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '1.1'
+ version: '1.5'
  description: |
  The app provides a command-line interface (CLI) to an Ollama AI model,
  allowing users to engage in text-based conversations and generate
@@ -517,9 +517,11 @@ files:
  - spec/ollama_chat/server_socket_spec.rb
  - spec/ollama_chat/source_fetching_spec.rb
  - spec/ollama_chat/switches_spec.rb
+ - spec/ollama_chat/think_control_spec.rb
  - spec/ollama_chat/utils/cache_fetcher_spec.rb
  - spec/ollama_chat/utils/fetcher_spec.rb
  - spec/ollama_chat/utils/file_argument_spec.rb
+ - spec/ollama_chat/vim_spec.rb
  - spec/ollama_chat/web_searching_spec.rb
  - spec/spec_helper.rb
  - tmp/.keep
@@ -545,7 +547,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 4.0.2
+ rubygems_version: 4.0.3
  specification_version: 4
  summary: A command-line interface (CLI) for interacting with an Ollama AI model.
  test_files:
@@ -565,8 +567,10 @@ test_files:
  - spec/ollama_chat/server_socket_spec.rb
  - spec/ollama_chat/source_fetching_spec.rb
  - spec/ollama_chat/switches_spec.rb
+ - spec/ollama_chat/think_control_spec.rb
  - spec/ollama_chat/utils/cache_fetcher_spec.rb
  - spec/ollama_chat/utils/fetcher_spec.rb
  - spec/ollama_chat/utils/file_argument_spec.rb
+ - spec/ollama_chat/vim_spec.rb
  - spec/ollama_chat/web_searching_spec.rb
  - spec/spec_helper.rb