ollama_chat 0.0.33 → 0.0.35

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 7f2ffa19191743f1d9b41bfb72893dba9ca6e1e26836ed4c553a52b1c13f29ca
- data.tar.gz: 83bc423d7f4daf9cda26a69c05858a300957c82cc74071285ee05c4afc734281
+ metadata.gz: a0c32096e0846e22b31ec8e27fa8aa20087ac0d88cb421c9e697d17e61bfad0b
+ data.tar.gz: 9b828f24b207fbea21c5b9a2a13c572c5f45765cf6b88b6a4624ec923bc5f109
  SHA512:
- metadata.gz: 9064c4a0b0ada0f57b2fca76c1665296f60607ec284a93f9e172aa83e6c36876a025caaec37dc84aa95ca1ad00e29a19bd93e93046bd8b07d21d54c31cfa5d7e
- data.tar.gz: 37f74dfca1c4bee0946f56136d00b098303b69fe735bd84f9e3a6d1bb3f6ff59c4d1f03be9de7b451714b4909ece0a769cf9931c46b7c89d87bf0c3b53ebc98e
+ metadata.gz: 4a351e48b81cca88fde7299da6bab2f9e4fdcc10ce98b3af1cc8d7733dd58675bb5b7fe75896ad369169e33e7c91d858ff85ad69406e019845eaaa6d8c6d4614
+ data.tar.gz: 9951320316b3fcd7c804dfdd18a2540f5b56399c131677f29aa011995b25877346e83d550ab222e1b67f8bf92f5828bbd510bda5aed4df5dcfb9e1f92995a41d
data/CHANGES.md CHANGED
@@ -1,5 +1,41 @@
  # Changes
 
+ ## 2025-09-18 v0.0.35
+
+ - Replaced ad-hoc ENV handling with the `const_conf` gem for structured
+   configuration management
+ - Bumped required Ruby version from **3.1** to **3.2**
+ - Added `const_conf (~> 0.3)` as a runtime dependency
+ - Introduced `OllamaChat::EnvConfig` module to centralize environment variables
+ - Updated `OllamaChat::Chat`, `OllamaChat::FollowChat`, and related classes to
+   use the new configuration system
+ - Replaced direct ENV access with `EnvConfig::OLLAMA::URL`,
+   `EnvConfig::PAGER?`, etc.
+ - Refactored default configuration values in YAML files to use `const_conf`
+   constants
+ - Removed legacy debug flag handling and simplified the `debug` method
+   implementation
+ - Updated the test suite to use the `const_conf_as` helper and removed old ENV
+   stubbing logic
+ - Adjusted the gemspec to include new files and dependencies, and updated the
+   required Ruby version
+ - Updated the regex pattern to match only escaped spaces in file paths
+
+ ## 2025-09-17 v0.0.34
+
+ - Modified `-d` flag semantics to use the working directory instead of the
+   runtime directory for socket path derivation
+ - Updated the `send_to_server_socket` and `create_socket_server` methods to
+   accept a `working_dir` parameter
+ - Changed argument parsing in the `ollama_chat_send` binary to use
+   `working_dir` instead of `runtime_dir`
+ - Updated documentation and help text to reflect the new `-d` option semantics
+ - Added tests covering the new `working_dir` functionality with default
+   fallback to the current directory
+ - Maintained backward compatibility by preserving existing `runtime_dir`
+   behavior when specified
+ - Modified method signatures to support both old and new parameter styles
+
  ## 2025-09-15 v0.0.33
 
  - Enhanced `CONTENT_REGEXP` to support escaped spaces in file paths using
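The v0.0.35 entries above describe moving scattered `ENV` reads into a single `OllamaChat::EnvConfig` module. As a rough, self-contained sketch of that centralization pattern (a hypothetical stand-in in plain Ruby, not the `const_conf` DSL this release actually uses):

```ruby
# Hypothetical stand-in illustrating the centralization idea: one module
# resolves each setting (ENV override first, then a default), so call
# sites never touch ENV directly.
module EnvConfigSketch
  DEFAULTS = {
    'OLLAMA_HOST'       => 'localhost:11434',
    'OLLAMA_CHAT_MODEL' => 'llama3.1',
  }.freeze

  # Returns the ENV value when set, otherwise the registered default.
  def self.[](name)
    ENV.fetch(name) { DEFAULTS.fetch(name) }
  end
end

EnvConfigSketch['OLLAMA_CHAT_MODEL'] # "llama3.1" unless overridden via ENV
```

The real `const_conf`-based module (added as `lib/ollama_chat/env_config.rb` later in this diff) layers descriptions, decoding, and sensitivity flags on top of the same idea.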
data/README.md CHANGED
@@ -228,6 +228,16 @@ The `ollama_chat_send` command now supports additional parameters to enhance fun
  $ echo "Visit https://example.com for more info" | ollama_chat_send -p
  ```
 
+ - **Working Directory (`-d`)**: Specifies the working directory used to derive
+   the Unix socket file path. When the ollama chat configuration is set to use a
+   working-directory-dependent socket (via `working_dir_dependent_socket: true`),
+   this option determines the base path for socket naming. If not specified, the
+   current working directory is assumed.
+
+   ```bash
+   $ echo "Hello world" | ollama_chat_send -d /tmp/my_working_dir -r
+   ```
+
  - **Runtime Directory (`-d`)**: Specifies the directory where the Unix socket
    file of `ollama_chat` was created, if you want to send to a specific
    `ollama_chat`.
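The working-directory-dependent naming can be sketched as follows, mirroring the MD5 derivation used by `create_socket_server` later in this diff (the helper name `socket_name_for` is illustrative, not part of the gem's API):

```ruby
require 'digest/md5'

# Maps a working directory to a per-directory socket name: the expanded
# path is hashed, so the same directory always yields the same socket.
def socket_name_for(working_dir = Dir.pwd)
  path = File.expand_path(working_dir)
  "ollama_chat-#{Digest::MD5.hexdigest(path)}.sock"
end

socket_name_for('/tmp/my_working_dir')
# deterministic: two invocations from the same directory find each other
```

This is why `ollama_chat_send -d DIR` can address the `ollama_chat` instance that was started in `DIR`: both sides derive the identical socket name from the same path.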
data/Rakefile CHANGED
@@ -29,7 +29,7 @@ GemHadar do
 
  readme 'README.md'
 
- required_ruby_version '~> 3.1'
+ required_ruby_version '~> 3.2'
 
  executables << 'ollama_chat' << 'ollama_chat_send'
 
@@ -50,6 +50,7 @@ GemHadar do
  dependency 'amatch', '~> 0.4.1'
  dependency 'pdf-reader', '~> 2.0'
  dependency 'csv', '~> 3.0'
+ dependency 'const_conf', '~> 0.3'
  development_dependency 'all_images', '~> 0.6'
  development_dependency 'rspec', '~> 3.2'
  development_dependency 'kramdown', '~> 2.0'
data/bin/ollama_chat_send CHANGED
@@ -4,9 +4,15 @@ require 'ollama_chat'
  require 'tins/go'
  include Tins::GO
 
-
  opts = go 'f:d:rtph', ARGV
 
+ # Displays the usage information for the ollama_chat_send command.
+ #
+ # This method outputs a formatted help message that describes the available
+ # options and usage of the ollama_chat_send command, including details about
+ # sending input to a running Ollama Chat client.
+ #
+ # @param rc [ Integer ] the exit code to use when exiting the program
  def usage(rc = 0)
    puts <<~EOT
      Usage: #{File.basename($0)} [OPTIONS]
@@ -16,7 +22,7 @@ def usage(rc = 0)
      -t Send input as terminal input including commands, e. g. /import
      -p Send input with source parsing enabled (defaults to disabled)
      -f CONFIG file to read
-     -d DIR the runtime directory to look for the socket file
+     -d DIR the working directory to derive the socket file from
      -h Show this help message
 
      Send data to a running Ollama Chat client via standard input.
@@ -36,8 +42,8 @@ begin
    STDIN.read,
    type:,
    config:,
-   runtime_dir: opts[?d],
-   parse: !!opts[?p]
+   working_dir: opts[?d],
+   parse: !!opts[?p]
  )
  type == :socket_input_with_response and puts response.content
rescue => e
@@ -84,13 +84,13 @@ class OllamaChat::Chat
  @ollama_chat_config = OllamaChat::OllamaChatConfig.new(@opts[?f])
  self.config = @ollama_chat_config.config
  setup_switches(config)
- base_url = @opts[?u] || config.url
+ base_url = @opts[?u] || OllamaChat::EnvConfig::OLLAMA::URL
  @ollama = Ollama::Client.new(
    connect_timeout: config.timeouts.connect_timeout?,
    read_timeout: config.timeouts.read_timeout?,
    write_timeout: config.timeouts.write_timeout?,
    base_url: base_url,
-   debug: config.debug,
+   debug: ,
    user_agent:
  )
  if server_version.version < '0.9.0'.version
@@ -124,6 +124,13 @@ class OllamaChat::Chat
    fix_config(e)
  end
 
+ # The debug method accesses the debug configuration setting.
+ #
+ # @return [TrueClass, FalseClass] the current debug mode status
+ def debug
+   OllamaChat::EnvConfig::OLLAMA::CHAT::DEBUG
+ end
+
  # The ollama reader returns the Ollama API client instance.
  #
  # @return [Ollama::Client] the configured Ollama API client
@@ -497,11 +504,7 @@ class OllamaChat::Chat
  # and available system commands, then uses Kramdown::ANSI::Pager to show the
  # formatted configuration output.
  def display_config
-   default_pager = ENV['PAGER'].full?
-   if fallback_pager = `which less`.chomp.full? || `which more`.chomp.full?
-     fallback_pager << ' -r'
-   end
-   my_pager = default_pager || fallback_pager
+   command = OllamaChat::EnvConfig::PAGER?
    rendered = config.to_s
    Kramdown::ANSI::Pager.pager(
      lines: rendered.count(?\n),
@@ -610,7 +613,7 @@ class OllamaChat::Chat
      end
      [ link, ?# + record.tags.first ]
    }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
-   config.debug and jj messages.to_ary
+   debug and jj messages.to_ary
  end
 
  case type
@@ -660,7 +663,7 @@ class OllamaChat::Chat
    collection: ,
    cache: configure_cache,
    redis_url: config.redis.documents.url?,
-   debug: config.debug
+   debug:
  )
 
  document_list = @opts[?D].to_a
@@ -750,9 +753,12 @@ class OllamaChat::Chat
    save_conversation('backup.json')
    STDOUT.puts "When reading the config file, a #{exception.class} "\
      "exception was caught: #{exception.message.inspect}"
+   unless diff_tool = OllamaChat::EnvConfig::DIFF_TOOL?
+     exit 1
+   end
    if ask?(prompt: 'Do you want to fix the config? (y/n) ') =~ /\Ay/i
      system Shellwords.join([
-       @ollama_chat_config.diff_tool,
+       diff_tool,
        @ollama_chat_config.filename,
        @ollama_chat_config.default_config_path,
      ])
@@ -58,7 +58,15 @@ module OllamaChat::Dialog
      cli_model || current_model
    end
  ensure
-   STDOUT.puts green { "Connecting to #{model}@#{ollama.base_url} now…" }
+   connect_message(model, ollama.base_url)
+ end
+
+ # The connect_message method displays a connection status message.
+ #
+ # @param model [String] the model name to connect to
+ # @param base_url [String] the base URL of the connection
+ def connect_message(model, base_url)
+   STDOUT.puts green { "Connecting to #{model}@#{base_url} now…" }
  end
 
  # The ask? method prompts the user with a question and returns their input.
@@ -0,0 +1,100 @@
+ require 'const_conf'
+
+ module OllamaChat
+   module EnvConfig
+     include ConstConf
+
+     description 'Environment config for OllamaChat'
+     prefix ''
+
+     PAGER = set do
+       description 'Pager command to use in case terminal lines are exceeded by output'
+
+       default do
+         if fallback_pager = `which less`.full?(:chomp) || `which more`.full?(:chomp)
+           fallback_pager << ' -r'
+         end
+       end
+     end
+
+     DIFF_TOOL = set do
+       description 'Diff tool to apply changes with'
+
+       default do
+         if diff = `which vimdiff`.full?(:chomp)
+           diff
+         else
+           warn 'Need a diff tool configured via env var "DIFF_TOOL"'
+         end
+       end
+     end
+
+     KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES = set do
+       description 'Styles to use for kramdown-ansi markdown'
+
+       default ENV['KRAMDOWN_ANSI_STYLES'].full?
+     end
+
+     module OLLAMA
+       description 'Ollama Configuration'
+       prefix 'OLLAMA'
+
+       HOST = set do
+         description 'Ollama "host" to connect to'
+         default 'localhost:11434'
+       end
+
+       URL = set do
+         description 'Ollama base URL to connect to'
+         default { 'http://%s' % OllamaChat::EnvConfig::OLLAMA::HOST }
+         sensitive true
+       end
+
+       SEARXNG_URL = set do
+         description 'URL for the SearXNG service for searches'
+         default 'http://localhost:8088/search?q=%{query}&language=en&format=json'
+         sensitive true
+       end
+
+       REDIS_URL = set do
+         description 'Redis URL for documents'
+         default { ENV['REDIS_URL'].full? }
+         sensitive true
+       end
+
+       REDIS_EXPIRING_URL = set do
+         description 'Redis URL for caching'
+         default { EnvConfig::OLLAMA::REDIS_URL? || ENV['REDIS_URL'].full? }
+         sensitive true
+       end
+
+       module CHAT
+         description 'OllamaChat Configuration'
+
+         DEBUG = set do
+           description 'Enable debugging for chat client'
+           decode { it.to_i == 1 }
+           default 0
+         end
+
+         MODEL = set do
+           description 'Default model to use for the chat'
+           default 'llama3.1'
+         end
+
+         SYSTEM = set do
+           description 'Default system prompt'
+         end
+
+         COLLECTION = set do
+           description 'Default collection for embeddings'
+         end
+
+         HISTORY = set do
+           description 'File to save the chat history in'
+           default '~/.ollama_chat_history'
+         end
+       end
+     end
+   end
+ end
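The `decode { it.to_i == 1 }` block in the `DEBUG` setting above turns the raw `OLLAMA_CHAT_DEBUG` string into a boolean. Extracted as a standalone lambda (a sketch, outside the `const_conf` DSL), the decode logic is simply:

```ruby
# The raw env string counts as "on" only when it coerces to the integer 1,
# matching the old `ENV['OLLAMA_CHAT_DEBUG'].to_i == 1` check this replaces.
debug_decode = ->(raw) { raw.to_i == 1 }

debug_decode.call('1')   # => true
debug_decode.call('0')   # => false
debug_decode.call('yes') # => false ("yes".to_i is 0)
```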
@@ -174,6 +174,6 @@ class OllamaChat::FollowChat
  #
  # @param response [ Object ] the response object to be outputted
  def debug_output(response)
-   OllamaChat::Chat.config.debug and jj response
+   @chat.debug and jj response
  end
end
@@ -26,7 +26,7 @@ module OllamaChat::History
  # @return [String] the absolute file path to the chat history file as
  #   specified in the configuration
  def chat_history_filename
-   File.expand_path(config.chat_history_filename)
+   File.expand_path(OllamaChat::EnvConfig::OLLAMA::CHAT::HISTORY)
  end
 
  # The init_chat_history method initializes the chat session by loading
@@ -18,8 +18,8 @@ module OllamaChat::KramdownANSI
  # @return [ Hash ] a hash of ANSI styles configured either from environment
  #   variables or using default settings
  def configure_kramdown_ansi_styles
-   if env_var = %w[ KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES KRAMDOWN_ANSI_STYLES ].find { ENV.key?(_1) }
-     Kramdown::ANSI::Styles.from_env_var(env_var).ansi_styles
+   if json = OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES?
+     Kramdown::ANSI::Styles.from_json(json).ansi_styles
    else
      Kramdown::ANSI::Styles.new.ansi_styles
    end
@@ -312,11 +312,7 @@ class OllamaChat::MessageList
  # '-r' flag for proper handling of raw control characters when a fallback
  # pager is used.
  def determine_pager_command
-   default_pager = ENV['PAGER'].full?
-   if fallback_pager = `which less`.chomp.full? || `which more`.chomp.full?
-     fallback_pager << ' -r'
-   end
-   default_pager || fallback_pager
+   OllamaChat::EnvConfig::PAGER?
  end
 
  # The use_pager method wraps the given block with a pager context.
@@ -1,8 +1,7 @@
  ---
- url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
  proxy: null # http://localhost:8080
  model:
-   name: <%= ENV.fetch('OLLAMA_CHAT_MODEL', 'llama3.1') %>
+   name: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::MODEL %>
  options:
    num_ctx: 8192
  timeouts:
@@ -33,7 +32,7 @@ prompts:
    %{results}
  location: You are at %{location_name}, %{location_decimal_degrees}, on %{localtime}, preferring %{units}
system_prompts:
- default: <%= ENV.fetch('OLLAMA_CHAT_SYSTEM', 'null') %>
+ default: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::SYSTEM || 'null' %>
  assistant: You are a helpful assistant.
voice:
  enabled: false
@@ -54,7 +53,7 @@ embedding:
  prompt: 'Represent this sentence for searching relevant passages: %s'
  batch_size: 10
  database_filename: null # ':memory:'
- collection: <%= ENV['OLLAMA_CHAT_COLLECTION'] %>
+ collection: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::COLLECTION %>
  found_texts_size: 4096
  found_texts_count: 10
  splitter:
@@ -63,13 +62,11 @@ embedding:
  cache: Documentrix::Documents::SQLiteCache
redis:
  documents:
-   url: <%= ENV.fetch('REDIS_URL', 'null') %>
+   url: <%= OllamaChat::EnvConfig::OLLAMA::REDIS_URL %>
  expiring:
-   url: <%= ENV.fetch('REDIS_EXPIRING_URL', 'null') %>
+   url: <%= OllamaChat::EnvConfig::OLLAMA::REDIS_EXPIRING_URL %>
    ex: 86400
- chat_history_filename: <%= ENV.fetch('OLLAMA_CHAT_HISTORY', '~/.ollama_chat_history') %>
working_dir_dependent_socket: true
- debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
request_headers:
  Accept: 'text/*,application/*,image/*'
ssl_no_verify: []
@@ -80,4 +77,4 @@ web_search:
  duckduckgo:
    url: 'https://www.duckduckgo.com/html/?q=%{query}'
  searxng:
-   url: <%= ENV.fetch('OLLAMA_SEARXNG_URL', 'http://localhost:8088/search?q=%{query}&language=en&format=json') %>
+   url: <%= OllamaChat::EnvConfig::OLLAMA::SEARXNG_URL %>
@@ -112,6 +112,6 @@ class OllamaChat::OllamaChatConfig
  #
  # @return [ String ] the command name of the diff tool to be used
  def diff_tool
-   ENV.fetch('DIFF_TOOL', 'vimdiff')
+   OllamaChat::EnvConfig::DIFF_TOOL?
  end
end
@@ -200,7 +200,7 @@ module OllamaChat::Parsing
  | # OR
  "((?:\.\.|[~.]?)/(?:\\"|\\|[^"\\]+)+)" # Quoted file path with escaped " quotes
  | # OR
- ((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\ ]+)+) # File path with escaped spaces
+ ((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\\s]+)+) # File path with escaped spaces
}x
private_constant :CONTENT_REGEXP
 
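The `CONTENT_REGEXP` change above tightens the unquoted-path branch: `[^\\ ]` (anything but backslash or space) becomes `[^\\\s]` (anything but backslash or whitespace), so a path now stops at tabs and newlines as well. A simplified sketch of just that branch, isolated from the full pattern:

```ruby
# Only the file-path alternative is reproduced here; the full pattern in
# parsing.rb has additional quoted-path branches.
old_re = %r{((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\ ]+)+)}
new_re = %r{((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\\s]+)+)}

input = "./docs/my\\ file.txt\tother"
old_re.match(input)[1] # runs past the tab, swallowing "other"
new_re.match(input)[1] # => "./docs/my\\ file.txt" — stops at the tab
```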
@@ -38,12 +38,13 @@ module OllamaChat::ServerSocket
  # @param config [ ComplexConfig::Settings ] the configuration object containing server settings
  # @param type [ Symbol ] the type of message transmission, defaults to :socket_input
  # @param runtime_dir [ String ] pathname to runtime_dir of socket file
+ # @param working_dir [ String ] pathname to working_dir used for deriving socket file
  # @param parse [ TrueClass, FalseClass ] whether to parse the response, defaults to false
  #
  # @return [ UnixSocks::Message, nil ] the response from transmit_with_response if type
  #   is :socket_input_with_response, otherwise nil
- def send_to_server_socket(content, config:, type: :socket_input, runtime_dir: nil, parse: false)
-   server = create_socket_server(config:, runtime_dir:)
+ def send_to_server_socket(content, config:, type: :socket_input, runtime_dir: nil, working_dir: nil, parse: false)
+   server = create_socket_server(config:, runtime_dir:, working_dir:)
    message = { content:, type:, parse: }
    if type.to_sym == :socket_input_with_response
      server.transmit_with_response(message)
@@ -65,15 +66,18 @@ module OllamaChat::ServerSocket
  #
  # @param config [ComplexConfig::Settings] the configuration object
  #   containing server settings
+ # @param runtime_dir [ String ] pathname to runtime_dir of socket file
+ # @param working_dir [ String ] pathname to working_dir used for deriving socket file
  #
  # @return [UnixSocks::Server] a configured Unix domain socket server
  #   instance ready to receive messages
- def create_socket_server(config:, runtime_dir: nil)
+ def create_socket_server(config:, runtime_dir: nil, working_dir: nil)
+   working_dir ||= Dir.pwd
    if runtime_dir
      return UnixSocks::Server.new(socket_name: 'ollama_chat.sock', runtime_dir:)
    end
    if config.working_dir_dependent_socket
-     path = File.expand_path(Dir.pwd)
+     path = File.expand_path(working_dir)
      digest = Digest::MD5.hexdigest(path)
      UnixSocks::Server.new(socket_name: "ollama_chat-#{digest}.sock")
    else
@@ -65,7 +65,7 @@ module OllamaChat::SourceFetching
    source,
    headers: config.request_headers?.to_h,
    cache: @cache,
-   debug: config.debug,
+   debug: ,
    http_options: http_options(OllamaChat::Utils::Fetcher.normalize_url(source))
  ) do |tmp|
    block.(tmp)
@@ -1,6 +1,6 @@
module OllamaChat
  # OllamaChat version
- VERSION = '0.0.33'
+ VERSION = '0.0.35'
  VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
  VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
  VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
@@ -58,7 +58,7 @@ module OllamaChat::WebSearching
  OllamaChat::Utils::Fetcher.get(
    url,
    headers: config.request_headers?.to_h,
-   debug: config.debug
+   debug:
  ) do |tmp|
    data = JSON.parse(tmp.read, object_class: JSON::GenericObject)
    data.results.first(n).map(&:url)
@@ -79,7 +79,7 @@ module OllamaChat::WebSearching
  OllamaChat::Utils::Fetcher.get(
    url,
    headers: config.request_headers?.to_h,
-   debug: config.debug
+   debug:
  ) do |tmp|
    result = []
    doc = Nokogiri::HTML(tmp)
data/lib/ollama_chat.rb CHANGED
@@ -34,4 +34,5 @@ require 'ollama_chat/history'
  require 'ollama_chat/server_socket'
  require 'ollama_chat/kramdown_ansi'
  require 'ollama_chat/conversation'
+ require 'ollama_chat/env_config'
  require 'ollama_chat/chat'
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
  # -*- encoding: utf-8 -*-
- # stub: ollama_chat 0.0.33 ruby lib
+ # stub: ollama_chat 0.0.35 ruby lib
 
  Gem::Specification.new do |s|
    s.name = "ollama_chat".freeze
-   s.version = "0.0.33".freeze
+   s.version = "0.0.35".freeze
 
    s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
    s.require_paths = ["lib".freeze]
@@ -12,12 +12,12 @@ Gem::Specification.new do |s|
    s.description = "The app provides a command-line interface (CLI) to an Ollama AI model,\nallowing users to engage in text-based conversations and generate\nhuman-like responses. Users can import data from local files or web pages,\nwhich are then processed through three different modes: fully importing the\ncontent into the conversation context, summarizing the information for\nconcise reference, or storing it in an embedding vector database for later\nretrieval based on the conversation.\n".freeze
    s.email = "flori@ping.de".freeze
    s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
-   s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
-   s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+   s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
+   s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
    s.homepage = "https://github.com/flori/ollama_chat".freeze
    s.licenses = ["MIT".freeze]
    s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
-   s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
+   s.required_ruby_version = Gem::Requirement.new("~> 3.2".freeze)
    s.rubygems_version = "3.6.9".freeze
    s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
    s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
@@ -49,4 +49,5 @@ Gem::Specification.new do |s|
    s.add_runtime_dependency(%q<amatch>.freeze, ["~> 0.4.1".freeze])
    s.add_runtime_dependency(%q<pdf-reader>.freeze, ["~> 2.0".freeze])
    s.add_runtime_dependency(%q<csv>.freeze, ["~> 3.0".freeze])
+   s.add_runtime_dependency(%q<const_conf>.freeze, ["~> 0.3".freeze])
  end
@@ -8,7 +8,7 @@ describe OllamaChat::FollowChat do
  end
 
  let :chat do
-   double('Chat', markdown: double(on?: false), think: double(on?: false))
+   double('Chat', markdown: double(on?: false), think: double(on?: false), debug: false)
  end
 
  let :follow_chat do
@@ -19,10 +19,6 @@ describe OllamaChat::FollowChat do
    double('output', :sync= => true)
  end
 
- before do
-   allow(OllamaChat::Chat).to receive(:config).and_return(double(debug: false))
- end
-
  it 'has .call' do
    expect(follow_chat).to receive(:call).with(:foo)
    follow_chat.call(:foo)
@@ -7,21 +7,21 @@ describe OllamaChat::KramdownANSI do
 
  describe '#configure_kramdown_ansi_styles', protect_env: true do
    it 'can be configured via env var' do
-     ENV['KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES'] = '{}'
-     ENV.delete('KRAMDOWN_ANSI_STYLES')
-
+     const_conf_as(
+       'OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES' => '{"foo":"bar"}'
+     )
      styles = { bold: '1' }
-     expect(Kramdown::ANSI::Styles).to receive(:from_env_var).
-       with('KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES').
+     expect(Kramdown::ANSI::Styles).to receive(:from_json).
+       with('{"foo":"bar"}').
        and_return(double(ansi_styles: styles))
 
      expect(chat.configure_kramdown_ansi_styles).to eq(styles)
    end
 
    it 'has a default configuration' do
-     ENV.delete('KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES')
-     ENV.delete('KRAMDOWN_ANSI_STYLES')
-
+     const_conf_as(
+       'OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES' => nil
+     )
      expect(chat.configure_kramdown_ansi_styles).to be_a(Hash)
    end
  end
data/spec/ollama_chat/server_socket_spec.rb CHANGED
@@ -13,7 +13,7 @@ describe OllamaChat::ServerSocket do
 
  before do
  expect(OllamaChat::ServerSocket).to receive(:create_socket_server).
- with(config: config, runtime_dir: nil).and_return(server)
+ with(config: config, runtime_dir: nil, working_dir: nil).and_return(server)
  end
 
  context 'with default parameters' do
@@ -50,7 +50,8 @@ describe OllamaChat::ServerSocket do
  message = { content: 'test', type: :socket_input_with_response, parse: false }
  response = double('Response')
 
- expect(server).to receive(:transmit_with_response).with(message).and_return(response)
+ expect(server).to receive(:transmit_with_response).with(message).
+ and_return(response)
 
  result = OllamaChat::ServerSocket.send_to_server_socket(
  'test',
@@ -68,7 +69,8 @@ describe OllamaChat::ServerSocket do
  message = { content: 'test', type: :socket_input_with_response, parse: true }
  response = double('Response')
 
- expect(server).to receive(:transmit_with_response).with(message).and_return(response)
+ expect(server).to receive(:transmit_with_response).with(message).
+ and_return(response)
 
  result = OllamaChat::ServerSocket.send_to_server_socket(
  'test',
@@ -82,10 +84,32 @@ describe OllamaChat::ServerSocket do
  end
  end
 
+ context 'with working_dir' do
+ before do
+ expect(OllamaChat::ServerSocket).to receive(:create_socket_server).
+ with(config: config, runtime_dir: nil, working_dir: 'foo/path').
+ and_return(server)
+ end
+
+ context 'with working_dir parameter' do
+ it 'uses correct parameter' do
+ message = { content: 'test', type: :socket_input, parse: false }
+ expect(server).to receive(:transmit).with(message).and_return(nil)
+
+ result = OllamaChat::ServerSocket.send_to_server_socket(
+ 'test', config: config, working_dir: 'foo/path'
+ )
+
+ expect(result).to be_nil
+ end
+ end
+ end
+
  context 'with runtime_dir parameter' do
  before do
  expect(OllamaChat::ServerSocket).to receive(:create_socket_server).
- with(config: config, runtime_dir: '/foo/bar').and_return(server)
+ with(config: config, runtime_dir: '/foo/bar', working_dir: nil).
+ and_return(server)
  end
 
  it 'uses correct defaults' do
@@ -94,7 +118,9 @@ describe OllamaChat::ServerSocket do
  expect(server).to receive(:transmit).with(message).and_return(nil)
 
- result = OllamaChat::ServerSocket.send_to_server_socket('test', config: config, runtime_dir: '/foo/bar')
+ result = OllamaChat::ServerSocket.send_to_server_socket(
+ 'test', config: config, runtime_dir: '/foo/bar'
+ )
 
  expect(result).to be_nil
  end
data/spec/spec_helper.rb CHANGED
@@ -11,6 +11,7 @@ rescue LoadError
  end
  require 'webmock/rspec'
  WebMock.disable_net_connect!
+ require 'const_conf/spec'
  require 'ollama_chat'
 
  ComplexConfig::Provider.deep_freeze = false
@@ -108,6 +109,7 @@ module StubOllamaServer
  to_return(status: 200, body: asset_json('api_show.json'))
  stub_request(:get, %r(/api/version\z)).
  to_return(status: 200, body: asset_json('api_version.json'))
+ allow_any_instance_of(OllamaChat::Chat).to receive(:connect_message)
  instantiate and chat
  end
  end
@@ -145,12 +147,10 @@ RSpec.configure do |config|
  config.include AssetHelpers
  config.extend StubOllamaServer
 
- config.around(&ProtectEnvVars.apply)
- config.before(:each) do
- ENV['OLLAMA_HOST'] = 'localhost:11434'
- end
-
  config.before(:suite) do
  infobar.show = nil
  end
+
+ config.around(&ProtectEnvVars.apply)
+ config.include(ConstConf::ConstConfHelper)
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ollama_chat
  version: !ruby/object:Gem::Version
- version: 0.0.33
+ version: 0.0.35
  platform: ruby
  authors:
  - Florian Frank
@@ -371,6 +371,20 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '3.0'
+ - !ruby/object:Gem::Dependency
+ name: const_conf
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.3'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.3'
  description: |
  The app provides a command-line interface (CLI) to an Ollama AI model,
  allowing users to engage in text-based conversations and generate
@@ -392,6 +406,7 @@ extra_rdoc_files:
  - lib/ollama_chat/conversation.rb
  - lib/ollama_chat/dialog.rb
  - lib/ollama_chat/document_cache.rb
+ - lib/ollama_chat/env_config.rb
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
@@ -429,6 +444,7 @@ files:
  - lib/ollama_chat/conversation.rb
  - lib/ollama_chat/dialog.rb
  - lib/ollama_chat/document_cache.rb
+ - lib/ollama_chat/env_config.rb
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
@@ -503,7 +519,7 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '3.1'
+ version: '3.2'
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="