ollama_chat 0.0.34 → 0.0.36

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 1285d76eebf4b0382986bb3bcf9d3d3df08bbf95e2d8b7fd5bf82c60baf82e79
-  data.tar.gz: '0384dce9df4bbb3ebe371dedb09fb8ef8697950f8b0b2e3960038e93fc0a67b6'
+  metadata.gz: 3549c1258a47d46e043e045573d51e41729cc8531ecb5c9746eceb385ff01b99
+  data.tar.gz: cda7120617c155a20326f9a8445207e4b74cbd7faf100ac194b30cdbe4524293
 SHA512:
-  metadata.gz: d161dba8d86bdd135395b4b5949dd721544b4bdb475904cb2642733762c5aee00a0176399048582b304aa02140aa9317f2c7a78cae0df8082b4776d146105ebb
-  data.tar.gz: ac519d43fb532ac993fe396c9afbdd71e65fef7fd4ac42b4be9f04067ae6df684b8cbd0574d409ebe3388ce9d5603abbbc725a2ae323ce8725200efab3f8f539
+  metadata.gz: 88d787dbb1e4e8aa9756b21c5e0a64ab01ab52fdb2ebf84f3f4df6aa65a8b0b4aa03c77c21384ce6e98f156c9dcc9cb711d83d5baf99504e41879f05cc877131
+  data.tar.gz: 55097e08afa04486918e5939062b71d729ec27c14fcc186078f11c631f5c7dba05d8fca6f39a1823abd521acdd6712c06e0f5812c548321cdb3449cae1351549
data/CHANGES.md CHANGED
@@ -1,5 +1,44 @@
 # Changes
 
+## 2025-10-11 v0.0.36
+
+- Added `openssl-dev` package to apk packages in Dockerfile
+- Replaced explicit block parameter syntax with the implicit `_1` syntax for
+  compatibility with older Ruby versions
+- Removed `require 'xdg'` statement from `chat.rb`
+- Removed `xdg` gem dependency and implemented direct XDG directory usage
+- Added a documentation link to the README pointing to the GitHub.io
+  documentation site
+- Introduced GitHub Actions workflow for static content deployment to GitHub
+  Pages
+- Updated `gem_hadar` development dependency to version **2.8**
+- Reordered menu options in dialog prompts to place `[EXIT]` first
+- Corrected YARD documentation guidelines for `initialize` methods
+- Updated documentation comments with consistent formatting
+- Updated Redis (ValKey) image version from **8.1.1** to **8.1.3**
+- Removed deprecated `REDIS_EXPRING_URL` environment variable from `.envrc`
+
+## 2025-09-18 v0.0.35
+
+- Replaced ad-hoc ENV handling with `const_conf` gem for structured
+  configuration management
+- Bumped required Ruby version from **3.1** to **3.2**
+- Added `const_conf (~> 0.3)` as a runtime dependency
+- Introduced `OllamaChat::EnvConfig` module to centralize environment variables
+- Updated `OllamaChat::Chat`, `OllamaChat::FollowChat`, and related classes to
+  use the new configuration system
+- Replaced direct ENV access with `EnvConfig::OLLAMA::URL`,
+  `EnvConfig::PAGER?`, etc.
+- Refactored default configuration values in YAML files to use `const_conf`
+  constants
+- Removed legacy debug flag handling and simplified `debug` method
+  implementation
+- Updated test suite to use `const_conf_as` helper and removed old ENV stubbing
+  logic
+- Adjusted gemspec to include new files and dependencies, updated required Ruby
+  version
+- Updated regex pattern to match only escaped spaces in file paths
+
 ## 2025-09-17 v0.0.34
 
 - Modified `-d` flag semantics to use working directory instead of runtime
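The `_1` compatibility entry above can be illustrated with a short snippet; the names here are illustrative and not taken from the gem itself:

```ruby
# Numbered block parameters such as _1 are implicit: no |name| list is
# declared. They have been available since Ruby 2.7, so blocks written this
# way also run on the older Rubies the changelog targets.
paths = %w[config cache]

explicit = paths.map { |name| "~/.#{name}" } # explicit block parameter
implicit = paths.map { "~/.#{_1}" }          # implicit numbered parameter

raise 'styles should be equivalent' unless explicit == implicit
puts implicit.inspect
```

This is the same style the new `EnvConfig` code uses in its `decode { Pathname.new(_1) + 'ollama_chat' }` blocks.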
data/README.md CHANGED
@@ -5,6 +5,10 @@
 **ollama_chat** is a chat client, that can be used to connect to an ollama
 server and enter chat conversations with the LLMs provided by it.
 
+## Documentation
+
+Complete API documentation is available at: [GitHub.io](https://flori.github.io/ollama_chat/)
+
 ## Installation (gem)
 
 To install **ollama_chat**, you can type
data/Rakefile CHANGED
@@ -24,15 +24,18 @@ GemHadar do
   '.yardoc', 'doc', 'tags', 'corpus', 'coverage', '/config/searxng/*',
   '.starscope.db', 'cscope.out'
   package_ignore '.all_images.yml', '.tool-versions', '.gitignore', 'VERSION',
-    '.rspec', '.github', *FileList['.contexts/*'], '.envrc'
-
+    '.rspec', '.github', '.contexts', '.envrc', '.yardopts'
 
   readme 'README.md'
 
-  required_ruby_version '~> 3.1'
+  required_ruby_version '~> 3.2'
 
   executables << 'ollama_chat' << 'ollama_chat_send'
 
+  github_workflows(
+    'static.yml' => {}
+  )
+
   dependency 'excon', '~> 1.0'
   dependency 'ollama-ruby', '~> 1.7'
   dependency 'documentrix', '~> 0.0', '>= 0.0.2'
@@ -42,7 +45,6 @@ GemHadar do
   dependency 'redis', '~> 5.0'
   dependency 'mime-types', '~> 3.0'
   dependency 'reverse_markdown', '~> 3.0'
-  dependency 'xdg'
   dependency 'kramdown-ansi', '~> 0.2'
   dependency 'complex_config', '~> 0.22', '>= 0.22.2'
   dependency 'tins', '~> 1.41'
@@ -50,6 +52,7 @@ GemHadar do
   dependency 'amatch', '~> 0.4.1'
   dependency 'pdf-reader', '~> 2.0'
   dependency 'csv', '~> 3.0'
+  dependency 'const_conf', '~> 0.3'
   development_dependency 'all_images', '~> 0.6'
   development_dependency 'rspec', '~> 3.2'
   development_dependency 'kramdown', '~> 2.0'
data/docker-compose.yml CHANGED
@@ -1,7 +1,7 @@
1
1
  services:
2
2
  redis:
3
3
  container_name: redis
4
- image: valkey/valkey:8.1.1-alpine
4
+ image: valkey/valkey:8.1.3-alpine
5
5
  restart: unless-stopped
6
6
  ports: [ "127.0.0.1:9736:6379" ]
7
7
  volumes:
data/lib/ollama_chat/chat.rb CHANGED
@@ -13,7 +13,6 @@ require 'nokogiri'
 require 'rss'
 require 'pdf/reader'
 require 'csv'
-require 'xdg'
 require 'socket'
 require 'shellwords'
 
@@ -84,13 +83,13 @@ class OllamaChat::Chat
     @ollama_chat_config = OllamaChat::OllamaChatConfig.new(@opts[?f])
     self.config = @ollama_chat_config.config
     setup_switches(config)
-    base_url = @opts[?u] || config.url
+    base_url = @opts[?u] || OllamaChat::EnvConfig::OLLAMA::URL
     @ollama = Ollama::Client.new(
       connect_timeout: config.timeouts.connect_timeout?,
       read_timeout: config.timeouts.read_timeout?,
       write_timeout: config.timeouts.write_timeout?,
       base_url: base_url,
-      debug: config.debug,
+      debug:,
       user_agent:
     )
     if server_version.version < '0.9.0'.version
@@ -124,6 +123,13 @@ class OllamaChat::Chat
     fix_config(e)
   end
 
+  # The debug method accesses the debug configuration setting.
+  #
+  # @return [TrueClass, FalseClass] the current debug mode status
+  def debug
+    OllamaChat::EnvConfig::OLLAMA::CHAT::DEBUG
+  end
+
   # The ollama reader returns the Ollama API client instance.
   #
   # @return [Ollama::Client] the configured Ollama API client
@@ -136,7 +142,7 @@ class OllamaChat::Chat
   # Documentrix::Documents instance.
   #
   # @return [Documentrix::Documents] A Documentrix::Documents object containing
-  # all documents associated with this instance
+  #   all documents associated with this instance
   attr_reader :documents
 
   # Returns the messages set for this object, initializing it lazily if needed.
@@ -146,7 +152,7 @@ class OllamaChat::Chat
   # OllamaChat::MessageList instance.
   #
   # @return [OllamaChat::MessageList] A MessageList object containing all
-  # messages associated with this instance
+  #   messages associated with this instance
   attr_reader :messages
 
   # Returns the links set for this object, initializing it lazily if needed.
@@ -195,6 +201,11 @@ class OllamaChat::Chat
 
   private
 
+  # Handles user input commands and processes chat interactions.
+  #
+  # @param content [String] The input content to process
+  # @return [Symbol, String, nil] Returns a symbol indicating next action,
+  #   the content to be processed, or nil for no action needed
   def handle_input(content)
     case content
     when %r(^/copy$)
@@ -463,7 +474,7 @@ class OllamaChat::Chat
   # specified parameter.
   #
   # @param what [ String, nil ] the type of data to clear, defaults to
-  # 'messages' if nil
+  #   'messages' if nil
   def clean(what)
     case what
     when 'messages', nil
@@ -497,11 +508,7 @@ class OllamaChat::Chat
   # and available system commands, then uses Kramdown::ANSI::Pager to show the
   # formatted configuration output.
   def display_config
-    default_pager = ENV['PAGER'].full?
-    if fallback_pager = `which less`.chomp.full? || `which more`.chomp.full?
-      fallback_pager << ' -r'
-    end
-    my_pager = default_pager || fallback_pager
+    command = OllamaChat::EnvConfig::PAGER?
     rendered = config.to_s
     Kramdown::ANSI::Pager.pager(
       lines: rendered.count(?\n),
@@ -610,7 +617,7 @@ class OllamaChat::Chat
       end
       [ link, ?# + record.tags.first ]
     }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
-    config.debug and jj messages.to_ary
+    debug and jj messages.to_ary
   end
 
   case type
@@ -644,7 +651,7 @@ class OllamaChat::Chat
   # configured document collection.
   #
   # @return [ Documentrix::Documents, NULL ] the initialized document
-  # collection if embedding is enabled, otherwise NULL
+  #   collection if embedding is enabled, otherwise NULL
   def setup_documents
     if embedding.on?
       @embedding_model = config.embedding.model.name
@@ -660,7 +667,7 @@ class OllamaChat::Chat
       collection:,
       cache: configure_cache,
       redis_url: config.redis.documents.url?,
-      debug: config.debug
+      debug:
     )
 
     document_list = @opts[?D].to_a
@@ -726,7 +733,7 @@ class OllamaChat::Chat
   # expiring keys if a Redis URL is configured.
   #
   # @return [ Documentrix::Documents::RedisCache, nil ] the configured Redis
-  # cache instance or nil if no URL is set.
+  #   cache instance or nil if no URL is set.
   def setup_cache
     if url = config.redis.expiring.url?
       ex = config.redis.expiring.ex?.to_i
@@ -745,14 +752,17 @@ class OllamaChat::Chat
   # This method exits the program after handling the configuration error
   #
   # @param exception [ Exception ] the exception that occurred while reading
-  # the config file
+  #   the config file
   def fix_config(exception)
     save_conversation('backup.json')
     STDOUT.puts "When reading the config file, a #{exception.class} "\
       "exception was caught: #{exception.message.inspect}"
+    unless diff_tool = OllamaChat::EnvConfig::DIFF_TOOL?
+      exit 1
+    end
    if ask?(prompt: 'Do you want to fix the config? (y/n) ') =~ /\Ay/i
      system Shellwords.join([
-       @ollama_chat_config.diff_tool,
+       diff_tool,
        @ollama_chat_config.filename,
        @ollama_chat_config.default_config_path,
      ])
data/lib/ollama_chat/dialog.rb CHANGED
@@ -58,7 +58,15 @@ module OllamaChat::Dialog
       cli_model || current_model
     end
   ensure
-    STDOUT.puts green { "Connecting to #{model}@#{ollama.base_url} now…" }
+    connect_message(model, ollama.base_url)
+  end
+
+  # The connect_message method displays a connection status message.
+  #
+  # @param model [String] the model name to connect to
+  # @param base_url [String] the base URL of the connection
+  def connect_message(model, base_url)
+    STDOUT.puts green { "Connecting to #{model}@#{base_url} now…" }
   end
 
   # The ask? method prompts the user with a question and returns their input.
@@ -150,7 +158,7 @@ module OllamaChat::Dialog
     if prompts.size == 1
       system = config.system_prompts.send(prompts.first)
     else
-      prompts.unshift('[EXIT]').unshift('[NEW]')
+      prompts.unshift('[NEW]').unshift('[EXIT]')
       chosen = OllamaChat::Utils::Chooser.choose(prompts)
       system =
         case chosen
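The `[EXIT]`-first change above is just a swap of two `unshift` calls; since `unshift` prepends, the last call ends up first. A quick illustration (the prompt names are made up):

```ruby
prompts = %w[default assistant]

# unshift prepends, so the *last* unshift determines the first menu entry.
old_order = prompts.dup.unshift('[EXIT]').unshift('[NEW]') # v0.0.34 behavior
new_order = prompts.dup.unshift('[NEW]').unshift('[EXIT]') # v0.0.36 behavior

p old_order # ["[NEW]", "[EXIT]", "default", "assistant"]
p new_order # ["[EXIT]", "[NEW]", "default", "assistant"]
```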
data/lib/ollama_chat/env_config.rb ADDED
@@ -0,0 +1,113 @@
+require 'const_conf'
+require 'pathname'
+
+module OllamaChat
+  module EnvConfig
+    include ConstConf
+
+    description 'Environment config for OllamaChat'
+    prefix ''
+
+    XDG_CONFIG_HOME = set do
+      description 'XDG Configuration directory path'
+      default { '~/.config' }
+      decode { Pathname.new(_1) + 'ollama_chat' }
+    end
+
+    XDG_CACHE_HOME = set do
+      description 'XDG Cache directory path'
+      default { '~/.cache' }
+      decode { Pathname.new(_1) + 'ollama_chat' }
+    end
+
+    PAGER = set do
+      description 'Pager command to use in case terminal lines are exceeded by output'
+
+      default do
+        if fallback_pager = `which less`.full?(:chomp) || `which more`.full?(:chomp)
+          fallback_pager << ' -r'
+        end
+      end
+    end
+
+    DIFF_TOOL = set do
+      description 'Diff tool to apply changes with'
+
+      default do
+        if diff = `which vimdiff`.full?(:chomp)
+          diff
+        else
+          warn 'Need a diff tool configured via env var "DIFF_TOOL"'
+        end
+      end
+    end
+
+    KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES = set do
+      description 'Styles to use for kramdown-ansi markdown'
+
+      default ENV['KRAMDOWN_ANSI_STYLES'].full?
+    end
+
+    module OLLAMA
+      description 'Ollama Configuration'
+      prefix 'OLLAMA'
+
+      HOST = set do
+        description 'Ollama "host" to connect to'
+        default 'localhost:11434'
+      end
+
+      URL = set do
+        description 'Ollama base URL to connect to'
+        default { 'http://%s' % OllamaChat::EnvConfig::OLLAMA::HOST }
+        sensitive true
+      end
+
+      SEARXNG_URL = set do
+        description 'URL for the SearXNG service for searches'
+        default 'http://localhost:8088/search?q=%{query}&language=en&format=json'
+        sensitive true
+      end
+
+      REDIS_URL = set do
+        description 'Redis URL for documents'
+        default { ENV['REDIS_URL'].full? }
+        sensitive true
+      end
+
+      REDIS_EXPIRING_URL = set do
+        description 'Redis URL for caching'
+        default { EnvConfig::OLLAMA::REDIS_URL? || ENV['REDIS_URL'].full? }
+        sensitive true
+      end
+
+      module CHAT
+        description 'OllamaChat Configuration'
+
+        DEBUG = set do
+          description 'Enable debugging for chat client'
+          decode { _1.to_i == 1 }
+          default 0
+        end
+
+        MODEL = set do
+          description 'Default model to use for the chat'
+          default 'llama3.1'
+        end
+
+        SYSTEM = set do
+          description 'Default system prompt'
+        end
+
+        COLLECTION = set do
+          description 'Default collection for embeddings'
+        end
+
+        HISTORY = set do
+          description 'File to save the chat history in'
+          default '~/.ollama_chat_history'
+        end
+      end
+    end
+  end
+end
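Outside the `const_conf` DSL, the XDG lookup these constants express can be sketched in plain Ruby. The `xdg_dir` helper below is hypothetical — the gem itself relies on `const_conf`'s `set`/`default`/`decode` machinery shown above — but it mirrors the same env-var-with-fallback-plus-subdirectory logic:

```ruby
require 'pathname'

# Plain-Ruby sketch of what XDG_CONFIG_HOME / XDG_CACHE_HOME compute: use the
# environment variable when present and non-empty, otherwise the XDG default,
# then append the ollama_chat subdirectory, mirroring
# `decode { Pathname.new(_1) + 'ollama_chat' }`.
def xdg_dir(env_var, default)
  base = ENV[env_var].to_s
  base = default if base.empty?
  Pathname.new(base) + 'ollama_chat'
end

puts xdg_dir('XDG_CONFIG_HOME', '~/.config')
puts xdg_dir('XDG_CACHE_HOME', '~/.cache')
```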
data/lib/ollama_chat/follow_chat.rb CHANGED
@@ -174,6 +174,6 @@ class OllamaChat::FollowChat
   #
   # @param response [ Object ] the response object to be outputted
   def debug_output(response)
-    OllamaChat::Chat.config.debug and jj response
+    @chat.debug and jj response
   end
 end
data/lib/ollama_chat/history.rb CHANGED
@@ -26,7 +26,7 @@ module OllamaChat::History
   # @return [String] the absolute file path to the chat history file as
   #   specified in the configuration
   def chat_history_filename
-    File.expand_path(config.chat_history_filename)
+    File.expand_path(OllamaChat::EnvConfig::OLLAMA::CHAT::HISTORY)
   end
 
   # The init_chat_history method initializes the chat session by loading
data/lib/ollama_chat/kramdown_ansi.rb CHANGED
@@ -18,8 +18,8 @@ module OllamaChat::KramdownANSI
   # @return [ Hash ] a hash of ANSI styles configured either from environment
   #   variables or using default settings
   def configure_kramdown_ansi_styles
-    if env_var = %w[ KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES KRAMDOWN_ANSI_STYLES ].find { ENV.key?(_1) }
-      Kramdown::ANSI::Styles.from_env_var(env_var).ansi_styles
+    if json = OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES?
+      Kramdown::ANSI::Styles.from_json(json).ansi_styles
     else
       Kramdown::ANSI::Styles.new.ansi_styles
     end
data/lib/ollama_chat/message_list.rb CHANGED
@@ -312,11 +312,7 @@ class OllamaChat::MessageList
   # '-r' flag for proper handling of raw control characters when a fallback
   # pager is used.
   def determine_pager_command
-    default_pager = ENV['PAGER'].full?
-    if fallback_pager = `which less`.chomp.full? || `which more`.chomp.full?
-      fallback_pager << ' -r'
-    end
-    default_pager || fallback_pager
+    OllamaChat::EnvConfig::PAGER?
   end
 
   # The use_pager method wraps the given block with a pager context.
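The deleted fallback, now provided centrally by `EnvConfig::PAGER`, boiled down to the lookup below. This is a behavioral sketch of the removed code in plain Ruby, not the `const_conf` implementation:

```ruby
# Honor $PAGER when set and non-empty; otherwise fall back to the first of
# `less` or `more` found on the PATH, with -r appended so raw ANSI escape
# sequences pass through to the terminal.
def pager_command
  default_pager = ENV['PAGER'].to_s
  return default_pager unless default_pager.empty?
  fallback = %w[less more].map { `which #{_1}`.chomp }.find { !_1.empty? }
  fallback && "#{fallback} -r"
end

puts pager_command.inspect
```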
data/lib/ollama_chat/ollama_chat_config/default_config.yml CHANGED
@@ -1,8 +1,7 @@
 ---
-url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
 proxy: null # http://localhost:8080
 model:
-  name: <%= ENV.fetch('OLLAMA_CHAT_MODEL', 'llama3.1') %>
+  name: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::MODEL %>
   options:
     num_ctx: 8192
 timeouts:
@@ -33,7 +32,7 @@ prompts:
     %{results}
   location: You are at %{location_name}, %{location_decimal_degrees}, on %{localtime}, preferring %{units}
 system_prompts:
-  default: <%= ENV.fetch('OLLAMA_CHAT_SYSTEM', 'null') %>
+  default: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::SYSTEM || 'null' %>
   assistant: You are a helpful assistant.
 voice:
   enabled: false
@@ -54,7 +53,7 @@ embedding:
   prompt: 'Represent this sentence for searching relevant passages: %s'
   batch_size: 10
   database_filename: null # ':memory:'
-  collection: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::COLLECTION %>
+  collection: <%= OllamaChat::EnvConfig::OLLAMA::CHAT::COLLECTION %>
   found_texts_size: 4096
   found_texts_count: 10
 splitter:
@@ -63,13 +62,11 @@ embedding:
   cache: Documentrix::Documents::SQLiteCache
 redis:
   documents:
-    url: <%= ENV.fetch('REDIS_URL', 'null') %>
+    url: <%= OllamaChat::EnvConfig::OLLAMA::REDIS_URL %>
   expiring:
-    url: <%= ENV.fetch('REDIS_EXPIRING_URL', 'null') %>
+    url: <%= OllamaChat::EnvConfig::OLLAMA::REDIS_EXPIRING_URL %>
     ex: 86400
-chat_history_filename: <%= ENV.fetch('OLLAMA_CHAT_HISTORY', '~/.ollama_chat_history') %>
 working_dir_dependent_socket: true
-debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
 request_headers:
   Accept: 'text/*,application/*,image/*'
 ssl_no_verify: []
@@ -80,4 +77,4 @@ web_search:
   duckduckgo:
     url: 'https://www.duckduckgo.com/html/?q=%{query}'
   searxng:
-    url: <%= ENV.fetch('OLLAMA_SEARXNG_URL', 'http://localhost:8088/search?q=%{query}&language=en&format=json') %>
+    url: <%= OllamaChat::EnvConfig::OLLAMA::SEARXNG_URL %>
data/lib/ollama_chat/ollama_chat_config.rb CHANGED
@@ -87,7 +87,7 @@ class OllamaChat::OllamaChatConfig
   # @return [ Pathname ] the pathname object representing the configuration
   #   directory
   def config_dir_path
-    XDG.new.config_home + 'ollama_chat'
+    OllamaChat::EnvConfig::XDG_CONFIG_HOME
   end
 
   # The cache_dir_path method returns the path to the ollama_chat cache
@@ -95,7 +95,7 @@ class OllamaChat::OllamaChatConfig
   #
   # @return [ Pathname ] the pathname object representing the cache directory path
   def cache_dir_path
-    XDG.new.cache_home + 'ollama_chat'
+    OllamaChat::EnvConfig::XDG_CACHE_HOME
   end
 
   # The database_path method constructs the full path to the documents database
@@ -112,6 +112,6 @@ class OllamaChat::OllamaChatConfig
   #
   # @return [ String ] the command name of the diff tool to be used
   def diff_tool
-    ENV.fetch('DIFF_TOOL', 'vimdiff')
+    OllamaChat::EnvConfig::DIFF_TOOL?
   end
 end
data/lib/ollama_chat/parsing.rb CHANGED
@@ -200,7 +200,7 @@ module OllamaChat::Parsing
     |                                                # OR
     "((?:\.\.|[~.]?)/(?:\\"|\\|[^"\\]+)+)"           # Quoted file path with escaped " quotes
     |                                                # OR
-    ((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\ ]+)+)            # File path with escaped spaces
+    ((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\\s]+)+)           # File path with escaped spaces
   }x
   private_constant :CONTENT_REGEXP
 
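The tightened pattern can be exercised in isolation. The fragment below is a simplified standalone version of the third `CONTENT_REGEXP` alternative only (unquoted path with escaped spaces), not the full multi-branch regex:

```ruby
# Standalone form of the third alternative from the hunk above: an unquoted
# file path whose spaces must be backslash-escaped. Replacing [^\\ ] with
# [^\\\s] makes any unescaped whitespace terminate the match.
PATH_RE = %r{((?:\.\.|[~.]?)/(?:\\\ |\\|[^\\\s]+)+)}

escaped   = PATH_RE.match('./my\ file.txt')[1] # whole path, escaped space kept
unescaped = PATH_RE.match('./my file.txt')[1]  # match stops at the bare space

puts escaped
puts unescaped
```

With the escaped space the whole `./my\ file.txt` is captured; with a bare space the capture ends at `./my`.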
data/lib/ollama_chat/source_fetching.rb CHANGED
@@ -65,7 +65,7 @@ module OllamaChat::SourceFetching
       source,
       headers: config.request_headers?.to_h,
       cache: @cache,
-      debug: config.debug,
+      debug:,
       http_options: http_options(OllamaChat::Utils::Fetcher.normalize_url(source))
     ) do |tmp|
       block.(tmp)
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
 module OllamaChat
   # OllamaChat version
-  VERSION = '0.0.34'
+  VERSION = '0.0.36'
   VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
   VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
   VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat/web_searching.rb CHANGED
@@ -58,7 +58,7 @@ module OllamaChat::WebSearching
     OllamaChat::Utils::Fetcher.get(
       url,
       headers: config.request_headers?.to_h,
-      debug: config.debug
+      debug:
     ) do |tmp|
       data = JSON.parse(tmp.read, object_class: JSON::GenericObject)
       data.results.first(n).map(&:url)
@@ -79,7 +79,7 @@ module OllamaChat::WebSearching
     OllamaChat::Utils::Fetcher.get(
       url,
       headers: config.request_headers?.to_h,
-      debug: config.debug
+      debug:
     ) do |tmp|
       result = []
       doc = Nokogiri::HTML(tmp)
data/lib/ollama_chat.rb CHANGED
@@ -34,4 +34,5 @@ require 'ollama_chat/history'
 require 'ollama_chat/server_socket'
 require 'ollama_chat/kramdown_ansi'
 require 'ollama_chat/conversation'
+require 'ollama_chat/env_config'
 require 'ollama_chat/chat'
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama_chat 0.0.34 ruby lib
+# stub: ollama_chat 0.0.36 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama_chat".freeze
-  s.version = "0.0.34".freeze
+  s.version = "0.0.36".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
@@ -12,19 +12,19 @@ Gem::Specification.new do |s|
  s.description = "The app provides a command-line interface (CLI) to an Ollama AI model,\nallowing users to engage in text-based conversations and generate\nhuman-like responses. Users can import data from local files or web pages,\nwhich are then processed through three different modes: fully importing the\ncontent into the conversation context, summarizing the information for\nconcise reference, or storing it in an embedding vector database for later\nretrieval based on the conversation.\n".freeze
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
- s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
- s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, 
"spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+ s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
+ s.files = [".utilsrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/conversation.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/env_config.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, 
"spec/assets/example_with_quote.html".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
   s.homepage = "https://github.com/flori/ollama_chat".freeze
   s.licenses = ["MIT".freeze]
   s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
-  s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
+  s.required_ruby_version = Gem::Requirement.new("~> 3.2".freeze)
   s.rubygems_version = "3.6.9".freeze
   s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/kramdown_ansi_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
24
24
 
25
25
  s.specification_version = 4
26
26
 
27
- s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 2.6".freeze])
27
+ s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 2.8".freeze])
28
28
  s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
29
29
  s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
30
30
  s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
@@ -41,7 +41,6 @@ Gem::Specification.new do |s|
  s.add_runtime_dependency(%q<redis>.freeze, ["~> 5.0".freeze])
  s.add_runtime_dependency(%q<mime-types>.freeze, ["~> 3.0".freeze])
  s.add_runtime_dependency(%q<reverse_markdown>.freeze, ["~> 3.0".freeze])
- s.add_runtime_dependency(%q<xdg>.freeze, [">= 0".freeze])
  s.add_runtime_dependency(%q<kramdown-ansi>.freeze, ["~> 0.2".freeze])
  s.add_runtime_dependency(%q<complex_config>.freeze, ["~> 0.22".freeze, ">= 0.22.2".freeze])
  s.add_runtime_dependency(%q<tins>.freeze, ["~> 1.41".freeze])
@@ -49,4 +48,5 @@ Gem::Specification.new do |s|
  s.add_runtime_dependency(%q<amatch>.freeze, ["~> 0.4.1".freeze])
  s.add_runtime_dependency(%q<pdf-reader>.freeze, ["~> 2.0".freeze])
  s.add_runtime_dependency(%q<csv>.freeze, ["~> 3.0".freeze])
+ s.add_runtime_dependency(%q<const_conf>.freeze, ["~> 0.3".freeze])
  end
data/spec/ollama_chat/follow_chat_spec.rb CHANGED
@@ -8,7 +8,7 @@ describe OllamaChat::FollowChat do
  end
 
  let :chat do
- double('Chat', markdown: double(on?: false), think: double(on?: false))
+ double('Chat', markdown: double(on?: false), think: double(on?: false), debug: false)
  end
 
  let :follow_chat do
@@ -19,10 +19,6 @@ describe OllamaChat::FollowChat do
  double('output', :sync= => true)
  end
 
- before do
- allow(OllamaChat::Chat).to receive(:config).and_return(double(debug: false))
- end
-
  it 'has .call' do
  expect(follow_chat).to receive(:call).with(:foo)
  follow_chat.call(:foo)
data/spec/ollama_chat/kramdown_ansi_spec.rb CHANGED
@@ -7,21 +7,21 @@ describe OllamaChat::KramdownANSI do
 
  describe '#configure_kramdown_ansi_styles', protect_env: true do
  it 'can be configured via env var' do
- ENV['KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES'] = '{}'
- ENV.delete('KRAMDOWN_ANSI_STYLES')
-
+ const_conf_as(
+ 'OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES' => '{"foo":"bar"}'
+ )
  styles = { bold: '1' }
- expect(Kramdown::ANSI::Styles).to receive(:from_env_var).
- with('KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES').
+ expect(Kramdown::ANSI::Styles).to receive(:from_json).
+ with('{"foo":"bar"}').
  and_return(double(ansi_styles: styles))
 
  expect(chat.configure_kramdown_ansi_styles).to eq(styles)
  end
 
  it 'has a default configuration' do
- ENV.delete('KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES')
- ENV.delete('KRAMDOWN_ANSI_STYLES')
-
+ const_conf_as(
+ 'OllamaChat::EnvConfig::KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES' => nil
+ )
  expect(chat.configure_kramdown_ansi_styles).to be_a(Hash)
  end
  end
data/spec/spec_helper.rb CHANGED
@@ -11,6 +11,7 @@ rescue LoadError
  end
  require 'webmock/rspec'
  WebMock.disable_net_connect!
+ require 'const_conf/spec'
  require 'ollama_chat'
 
  ComplexConfig::Provider.deep_freeze = false
@@ -108,6 +109,7 @@ module StubOllamaServer
  to_return(status: 200, body: asset_json('api_show.json'))
  stub_request(:get, %r(/api/version\z)).
  to_return(status: 200, body: asset_json('api_version.json'))
+ allow_any_instance_of(OllamaChat::Chat).to receive(:connect_message)
  instantiate and chat
  end
  end
@@ -145,12 +147,10 @@ RSpec.configure do |config|
  config.include AssetHelpers
  config.extend StubOllamaServer
 
- config.around(&ProtectEnvVars.apply)
- config.before(:each) do
- ENV['OLLAMA_HOST'] = 'localhost:11434'
- end
-
  config.before(:suite) do
  infobar.show = nil
  end
+
+ config.around(&ProtectEnvVars.apply)
+ config.include(ConstConf::ConstConfHelper)
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ollama_chat
  version: !ruby/object:Gem::Version
- version: 0.0.34
+ version: 0.0.36
  platform: ruby
  authors:
  - Florian Frank
@@ -15,14 +15,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.6'
+ version: '2.8'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.6'
+ version: '2.8'
  - !ruby/object:Gem::Dependency
  name: all_images
  requirement: !ruby/object:Gem::Requirement
@@ -253,20 +253,6 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '3.0'
- - !ruby/object:Gem::Dependency
- name: xdg
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
  - !ruby/object:Gem::Dependency
  name: kramdown-ansi
  requirement: !ruby/object:Gem::Requirement
@@ -371,6 +357,20 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '3.0'
+ - !ruby/object:Gem::Dependency
+ name: const_conf
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.3'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.3'
  description: |
  The app provides a command-line interface (CLI) to an Ollama AI model,
  allowing users to engage in text-based conversations and generate
@@ -392,6 +392,7 @@ extra_rdoc_files:
  - lib/ollama_chat/conversation.rb
  - lib/ollama_chat/dialog.rb
  - lib/ollama_chat/document_cache.rb
+ - lib/ollama_chat/env_config.rb
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
@@ -429,6 +430,7 @@ files:
  - lib/ollama_chat/conversation.rb
  - lib/ollama_chat/dialog.rb
  - lib/ollama_chat/document_cache.rb
+ - lib/ollama_chat/env_config.rb
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
@@ -503,7 +505,7 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '3.1'
+ version: '3.2'
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="