ollama_chat 0.0.5 → 0.0.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: fc27a8640616134647d04ee2d9b806e41806a55dc8a492e24bcf90509a84f8c8
4
- data.tar.gz: b560ce8230eb463f38881605c29f73cab6a601dc9194b79cacec2ce443b1a4bc
3
+ metadata.gz: b2f4b23386064513dc339fa0b3479e70c8e15b091b146bfcf700602fd1497b87
4
+ data.tar.gz: ce9857a08de2bec94a1e19b5e28a13535657ebdab3ca977fe2b5582a86708bb7
5
5
  SHA512:
6
- metadata.gz: 92e51935010fcace611baa132f5619b2894da2509854eb20d428e2a5d74b6bc658fa90edd577633a887b381f24861c7d3c580a79891e7cfc348a11b6b9191a08
7
- data.tar.gz: d74878f18ff236f99c75fee50b40a91c69a3aaa245c2c681d33f68d8869d5a745afb7afcc8219e8cb4a1f443fe3c055661a3a25b5f03f6007fa8f9042ffaa0d6
6
+ metadata.gz: c7bcfbc3966c7189cc3d93b6ac3e09a49d79b1dd3bbfd5ed729a7ae5484ef7fcdadd194705363c574f18dd43f11c6c8d204577292516c3f9b286ba4380417e41
7
+ data.tar.gz: 855accf3d5d72aa6c8a6cdd9fc8e534db40d97a4533bee03fc6a240cc5a12b8da46f304d02cf1c71378f592347866b84a95e9b18f9046975f630902996dd409a
data/CHANGES.md CHANGED
@@ -1,5 +1,40 @@
1
1
  # Changes
2
2
 
3
+ ## 2025-05-22 v0.0.7
4
+
5
+ * Added `ollama_chat_send` executable in `/bin`: it requires the 'ollama_chat'
6
+ gem, sends user input to the Ollama server via the
7
+ `OllamaChat::ServerSocket.send_to_server_socket` method, and handles
8
+ exceptions, exiting with a non-zero status code if an error occurs.
9
+ * Added new `server_socket.rb` file containing the module and methods for
10
+ server-side socket handling; modified `chat.rb` to include the `ServerSocket`
11
+ module, use its `init_server_socket` method to start the server socket, and
12
+ handle incoming messages sent via `send_to_server_socket` in the interactive
13
+ loop.
14
+ * Refactored chat history management into a separate module: added the
15
+ `OllamaChat::History` module, included it in `OllamaChat::Chat`, and moved
16
+ the chat history methods into the new module.
17
+ * Refactored chat history configuration: the chat history filename now comes from
18
+ the config setting first instead of the `OLLAMA_CHAT_HISTORY` environment variable.
19
+ * Updated chat commands and documentation: the `/clear`, `/collection`, and
20
+ `/links` command help texts now use more consistent syntax, and the
21
+ `OllamaChat::Information` module in `lib/ollama_chat/information.rb` was
22
+ updated to reflect these changes.
23
+ * Added support for chat history loading and saving: added `require
24
+ 'tins/secure_write'` and `require 'json'`, modified the `OllamaChat::Chat`
25
+ class to initialize, save, and clear the chat history, and used
26
+ `File.secure_write` to save the chat history securely.
27
+
28
+ ## 2025-04-15 v0.0.6
29
+
30
+ * Updated Rakefile to use `ollama-ruby` version **1.0**.
31
+ * Modified `model_present?` method in `lib/ollama_chat/model_handling.rb` to use `ollama.show(model:)`.
32
+ * Changed `pull_model_from_remote` method in `lib/ollama_chat/model_handling.rb` to use `ollama.pull(model:)`.
33
+ * Updated `ollama_chat.gemspec` to use `ollama-ruby` version **1.0** and updated date to **2025-04-14**.
34
+ * Attempt to capture stderr as well by redirecting it to stdout for commands
35
+ that write to stderr either always or only in the error case.
36
+ * Updated development dependencies in `ollama_chat.gemspec`.
37
+
3
38
  ## 2025-03-22 v0.0.5
4
39
 
5
40
  * Updated default config to use environment variable for Searxng URL:
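For context on the `ollama_chat_send` executable and server socket described in the v0.0.7 entries above: with an `ollama_chat` session running (so `init_server_socket` has created the Unix socket), input can be piped in from another terminal, e.g. `echo "Hello" | ollama_chat_send`, or sent programmatically. A minimal sketch, assuming the gem is installed and a chat session is listening; the prompt text is made up:

```ruby
# Sketch (not part of the diff): send a prompt to a running ollama_chat
# session via the server socket introduced in v0.0.7.
require 'ollama_chat'

begin
  OllamaChat::ServerSocket.send_to_server_socket('Please summarize the last answer.')
rescue Errno::ENOENT, Errno::ECONNREFUSED => e
  # No chat session is listening on the socket path.
  warn "Could not reach ollama_chat: #{e.class}: #{e}"
  exit 1
end
```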
data/README.md CHANGED
@@ -122,13 +122,13 @@ The following commands can be given inside the chat, if prefixed by a `/`:
122
122
  /location toggle location submission
123
123
  /voice( change) toggle voice output or change the voice
124
124
  /list [n] list the last n / all conversation exchanges
125
- /clear clear the whole conversation
126
- /clobber clear the conversation and collection
127
- /drop [n] drop the last n exchanges, defaults to 1
125
+ /clear [messages|links|history] clear all messages, links, or the chat history (defaults to messages)
126
+ /clobber clear the conversation, links, and collection
127
+ /drop [n] drop the last n exchanges, defaults to 1
128
128
  /model change the model
129
129
  /system change system prompt (clears conversation)
130
130
  /regenerate the last answer message
131
- /collection( clear|change) change (default) collection or clear
131
+ /collection [clear|change] change (default) collection or clear
132
132
  /info show information for current session
133
133
  /config output current configuration ("/Users/flori/.config/ollama_chat/config.yml")
134
134
  /document_policy pick a scan policy for document references
@@ -137,7 +137,7 @@ The following commands can be given inside the chat, if prefixed by a `/`:
137
137
  /embedding toggle embedding paused or not
138
138
  /embed source embed the source's content
139
139
  /web [n] query query web search & return n or 1 results
140
- /links( clear) display (or clear) links used in the chat
140
+ /links [clear] display (or clear) links used in the chat
141
141
  /save filename store conversation messages
142
142
  /load filename load conversation messages
143
143
  /quit to quit
data/Rakefile CHANGED
@@ -27,10 +27,10 @@ GemHadar do
27
27
 
28
28
  required_ruby_version '~> 3.1'
29
29
 
30
- executables << 'ollama_chat'
30
+ executables << 'ollama_chat' << 'ollama_chat_send'
31
31
 
32
32
  dependency 'excon', '~> 1.0'
33
- dependency 'ollama-ruby', '~> 0.15'
33
+ dependency 'ollama-ruby', '~> 1.0'
34
34
  dependency 'documentrix', '~> 0.0'
35
35
  dependency 'rss', '~> 0.3'
36
36
  dependency 'term-ansicolor', '~> 1.11'
data/VERSION CHANGED
@@ -1 +1 @@
1
- 0.0.5
1
+ 0.0.7
data/bin/ollama_chat_send ADDED
@@ -0,0 +1,11 @@
1
+ #!/usr/bin/env ruby
2
+
3
+ require 'ollama_chat'
4
+
5
+ begin
6
+ OllamaChat::ServerSocket.send_to_server_socket(STDIN.read)
7
+ rescue => e
8
+ warn "Caught #{e.class}: #{e}"
9
+ exit 1
10
+ end
11
+ exit 0
data/docker-compose.yml CHANGED
@@ -1,7 +1,7 @@
1
1
  services:
2
2
  redis:
3
3
  container_name: redis
4
- image: valkey/valkey:7.2.8-alpine
4
+ image: valkey/valkey:7.2.9-alpine
5
5
  restart: unless-stopped
6
6
  ports: [ "127.0.0.1:9736:6379" ]
7
7
  volumes:
data/lib/ollama_chat/chat.rb CHANGED
@@ -1,4 +1,6 @@
1
1
  require 'tins'
2
+ require 'tins/secure_write'
3
+ require 'json'
2
4
  require 'term/ansicolor'
3
5
  require 'reline'
4
6
  require 'reverse_markdown'
@@ -10,6 +12,7 @@ require 'rss'
10
12
  require 'pdf/reader'
11
13
  require 'csv'
12
14
  require 'xdg'
15
+ require 'socket'
13
16
 
14
17
  class OllamaChat::Chat
15
18
  include Tins::GO
@@ -24,6 +27,8 @@ class OllamaChat::Chat
24
27
  include OllamaChat::Information
25
28
  include OllamaChat::Clipboard
26
29
  include OllamaChat::MessageType
30
+ include OllamaChat::History
31
+ include OllamaChat::ServerSocket
27
32
 
28
33
  def initialize(argv: ARGV.dup)
29
34
  @opts = go 'f:u:m:s:c:C:D:MEVh', argv
@@ -60,6 +65,8 @@ class OllamaChat::Chat
60
65
  @cache = setup_cache
61
66
  @current_voice = config.voice.default
62
67
  @images = []
68
+ init_chat_history
69
+ init_server_socket
63
70
  end
64
71
 
65
72
  attr_reader :ollama
@@ -100,7 +107,17 @@ class OllamaChat::Chat
100
107
  loop do
101
108
  parse_content = true
102
109
  input_prompt = bold { color(172) { message_type(@images) + " user" } } + bold { "> " }
103
- content = Reline.readline(input_prompt, true)&.chomp
110
+
111
+ begin
112
+ content = Reline.readline(input_prompt, true)&.chomp
113
+ rescue Interrupt
114
+ if message = server_socket_message
115
+ self.server_socket_message = nil
116
+ content = message['content']
117
+ else
118
+ raise
119
+ end
120
+ end
104
121
 
105
122
  case content
106
123
  when %r(^/copy$)
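The new `rescue Interrupt` branch above works together with the server-socket thread shown later in `server_socket.rb`: that thread stores the parsed message and sends `SIGINT` to its own process, which interrupts the blocking `Reline.readline` call. A stripped-down sketch of the handoff pattern, independent of the gem's classes (the injected text is hypothetical):

```ruby
# Illustration only: a background thread injects input by storing a message
# and interrupting the blocking readline; the rescue consumes the message
# instead of letting the Interrupt propagate.
require 'reline'

injected = nil

Thread.new do
  sleep 5
  injected = { 'content' => 'hello from another process' }
  Process.kill :INT, $$ # wake up Reline.readline
end

begin
  line = Reline.readline('> ', true)&.chomp
rescue Interrupt
  raise unless injected
  line = injected['content']
  injected = nil
end

puts "input: #{line}"
```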
@@ -128,15 +145,26 @@ class OllamaChat::Chat
128
145
  last = 2 * $1.to_i if $1
129
146
  messages.list_conversation(last)
130
147
  next
131
- when %r(^/clear$)
132
- messages.clear
133
- STDOUT.puts "Cleared messages."
148
+ when %r(^/clear(?:\s+(messages|links|history))?$)
149
+ what = $1.nil? ? 'messages' : $1
150
+ case what
151
+ when 'messages'
152
+ messages.clear
153
+ STDOUT.puts "Cleared messages."
154
+ when 'links'
155
+ links.clear
156
+ STDOUT.puts "Cleared links."
157
+ when 'history'
158
+ clear_history
159
+ STDOUT.puts "Cleared history."
160
+ end
134
161
  next
135
162
  when %r(^/clobber$)
136
163
  if ask?(prompt: 'Are you sure to clear messages and collection? (y/n) ') =~ /\Ay/i
137
164
  messages.clear
138
165
  @documents.clear
139
166
  links.clear
167
+ clear_history
140
168
  STDOUT.puts "Cleared messages and collection #{bold{@documents.collection}}."
141
169
  else
142
170
  STDOUT.puts 'Cancelled.'
@@ -348,8 +376,12 @@ class OllamaChat::Chat
348
376
  STDOUT.puts "Type /quit to quit."
349
377
  end
350
378
  0
379
+ ensure
380
+ save_history
351
381
  end
352
382
 
383
+ private
384
+
353
385
  def setup_documents
354
386
  if embedding.on?
355
387
  @embedding_model = config.embedding.model.name
@@ -405,10 +437,11 @@ class OllamaChat::Chat
405
437
 
406
438
  def setup_cache
407
439
  if url = config.redis.expiring.url?
440
+ ex = config.redis.expiring.ex?.to_i
408
441
  Documentrix::Documents::RedisCache.new(
409
442
  prefix: 'Expiring-',
410
443
  url:,
411
- ex: config.redis.expiring.ex?.to_i,
444
+ ex:
412
445
  )
413
446
  end
414
447
  end
data/lib/ollama_chat/history.rb ADDED
@@ -0,0 +1,23 @@
1
+ module OllamaChat::History
2
+ def chat_history_filename
3
+ File.expand_path(config.chat_history_filename)
4
+ end
5
+
6
+ def init_chat_history
7
+ if File.exist?(chat_history_filename)
8
+ File.open(chat_history_filename, ?r) do |history|
9
+ history_data = JSON.load(history)
10
+ clear_history
11
+ Readline::HISTORY.push(*history_data)
12
+ end
13
+ end
14
+ end
15
+
16
+ def save_history
17
+ File.secure_write(chat_history_filename, JSON.dump(Readline::HISTORY))
18
+ end
19
+
20
+ def clear_history
21
+ Readline::HISTORY.clear
22
+ end
23
+ end
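The new module persists `Readline::HISTORY` as a plain JSON array. A small sketch of the same round trip, with an ordinary array standing in for the history object and using the same requires as the `chat.rb` changes above; the entries and the `~/.ollama_chat_history` path (the default suggested by the config change below) are examples:

```ruby
# Illustration of the history round trip: dump the entries as a JSON array
# with File.secure_write, then load and re-push them on startup.
require 'tins'
require 'tins/secure_write'
require 'json'

history  = [ 'hello', '/list 3', '/clear links' ]      # stands in for Readline::HISTORY
filename = File.expand_path('~/.ollama_chat_history')

File.secure_write(filename, JSON.dump(history))        # save_history

restored = File.open(filename, ?r) { |f| JSON.load(f) }
history.clear                                          # clear_history
history.push(*restored)                                # init_chat_history
```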
data/lib/ollama_chat/information.rb CHANGED
@@ -63,15 +63,15 @@ module OllamaChat::Information
63
63
  /markdown toggle markdown output
64
64
  /stream toggle stream output
65
65
  /location toggle location submission
66
- /voice( change) toggle voice output or change the voice
66
+ /voice [change] toggle voice output or change the voice
67
67
  /list [n] list the last n / all conversation exchanges
68
- /clear clear the whole conversation
68
+ /clear [messages|links|history] clear all messages, links, or the chat history (defaults to messages)
69
69
  /clobber clear the conversation, links, and collection
70
70
  /drop [n] drop the last n exchanges, defaults to 1
71
71
  /model change the model
72
72
  /system change system prompt (clears conversation)
73
73
  /regenerate the last answer message
74
- /collection( clear|change) change (default) collection or clear
74
+ /collection [clear|change] change (default) collection or clear
75
75
  /info show information for current session
76
76
  /config output current configuration (#{@ollama_chat_config.filename.to_s.inspect})
77
77
  /document_policy pick a scan policy for document references
data/lib/ollama_chat/model_handling.rb CHANGED
@@ -1,13 +1,13 @@
1
1
  module OllamaChat::ModelHandling
2
2
  def model_present?(model)
3
- ollama.show(name: model) { return _1.system.to_s }
3
+ ollama.show(model:) { return _1.system.to_s }
4
4
  rescue Ollama::Errors::NotFoundError
5
5
  false
6
6
  end
7
7
 
8
8
  def pull_model_from_remote(model)
9
9
  STDOUT.puts "Model #{bold{model}} not found locally, attempting to pull it from remote now…"
10
- ollama.pull(name: model)
10
+ ollama.pull(model:)
11
11
  end
12
12
 
13
13
  def pull_model_unless_present(model, options)
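These changes track the keyword rename that comes with `ollama-ruby` 1.0 (see the Rakefile and gemspec updates elsewhere in this diff): `show` and `pull` now take `model:` instead of `name:`. A hedged sketch of the call sites against a local Ollama server; the base URL and model name are placeholders:

```ruby
# Sketch of the ollama-ruby 1.0 keywords assumed by this diff; `name:` was
# the 0.15.x keyword, `model:` is the 1.0 keyword.
require 'ollama'

ollama = Ollama::Client.new(base_url: 'http://localhost:11434')

ollama.show(model: 'llama3.1') { |response| puts response.system } # was name:
ollama.pull(model: 'llama3.1')                                     # was name:
```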
data/lib/ollama_chat/ollama_chat_config/default_config.yml CHANGED
@@ -55,6 +55,7 @@ redis:
55
55
  expiring:
56
56
  url: <%= ENV.fetch('REDIS_EXPIRING_URL', 'null') %>
57
57
  ex: 86400
58
+ chat_history_filename: <%= ENV.fetch('OLLAMA_CHAT_HISTORY', '~/.ollama_chat_history') %>
58
59
  debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
59
60
  request_headers:
60
61
  Accept: 'text/*,application/*,image/*'
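The new `chat_history_filename` entry is an ERB template, so `OLLAMA_CHAT_HISTORY` still works as an override of the shipped `~/.ollama_chat_history` default. Roughly, the value resolves like this (a sketch, not the gem's config-loading code):

```ruby
# Approximate resolution of the new setting: environment override first,
# then the shipped default, expanded to an absolute path as in
# OllamaChat::History#chat_history_filename.
filename = ENV.fetch('OLLAMA_CHAT_HISTORY', '~/.ollama_chat_history')
File.expand_path(filename) # e.g. "/home/user/.ollama_chat_history"
```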
data/lib/ollama_chat/server_socket.rb ADDED
@@ -0,0 +1,38 @@
1
+ module OllamaChat::ServerSocket
2
+ class << self
3
+ def runtime_dir
4
+ File.expand_path(ENV.fetch('XDG_RUNTIME_DIR', '~/.local/run'))
5
+ end
6
+
7
+ def server_socket_path
8
+ File.join(runtime_dir, 'ollama_chat.sock')
9
+ end
10
+
11
+ def send_to_server_socket(content)
12
+ FileUtils.mkdir_p runtime_dir
13
+ message = { content: }
14
+ socket = UNIXSocket.new(server_socket_path)
15
+ socket.puts JSON(message)
16
+ socket.close
17
+ end
18
+ end
19
+
20
+ attr_accessor :server_socket_message
21
+
22
+ def init_server_socket
23
+ FileUtils.mkdir_p OllamaChat::ServerSocket.runtime_dir
24
+ Thread.new do
25
+ Socket.unix_server_loop(OllamaChat::ServerSocket.server_socket_path) do |sock, client_addrinfo|
26
+ begin
27
+ data = sock.readline.chomp
28
+ self.server_socket_message = JSON.load(data)
29
+ Process.kill :INT, $$
30
+ rescue JSON::ParserError
31
+ ensure
32
+ sock.close
33
+ end
34
+ end
35
+ rescue Errno::ENOENT
36
+ end
37
+ end
38
+ end
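The wire format above is one JSON object per connection, written as a single line to a Unix domain socket under `XDG_RUNTIME_DIR` (falling back to `~/.local/run`). A minimal hand-rolled client equivalent to `send_to_server_socket`, for illustration; the message text is made up:

```ruby
# Illustration of the protocol: connect to the Unix socket and write one
# JSON-encoded line of the form {"content":"..."}.
require 'socket'
require 'json'
require 'fileutils'

runtime_dir = File.expand_path(ENV.fetch('XDG_RUNTIME_DIR', '~/.local/run'))
socket_path = File.join(runtime_dir, 'ollama_chat.sock')

FileUtils.mkdir_p runtime_dir
socket = UNIXSocket.new(socket_path)
socket.puts JSON.generate(content: 'What did we decide about the Redis cache?')
socket.close
```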
data/lib/ollama_chat/source_fetching.rb CHANGED
@@ -49,8 +49,9 @@ module OllamaChat::SourceFetching
49
49
  end
50
50
 
51
51
  def import_source(source_io, source)
52
- source = source.to_s
53
- STDOUT.puts "Importing #{italic { source_io&.content_type }} document #{source.to_s.inspect} now."
52
+ source = source.to_s
53
+ document_type = source_io&.content_type.full? { |ct| italic { ct } } + ' '
54
+ STDOUT.puts "Importing #{document_type}document #{source.to_s.inspect} now."
54
55
  source_content = parse_source(source_io)
55
56
  "Imported #{source.inspect}:\n\n#{source_content}\n\n"
56
57
  end
data/lib/ollama_chat/utils/fetcher.rb CHANGED
@@ -59,9 +59,12 @@ class OllamaChat::Utils::Fetcher
59
59
 
60
60
  def self.execute(command, &block)
61
61
  Tempfile.open do |tmp|
62
- IO.popen(command) do |command|
63
- until command.eof?
64
- tmp.write command.read(1 << 14)
62
+ unless command =~ /2>&1/
63
+ command += ' 2>&1'
64
+ end
65
+ IO.popen(command) do |io|
66
+ until io.eof?
67
+ tmp.write io.read(1 << 14)
65
68
  end
66
69
  tmp.rewind
67
70
  tmp.extend(OllamaChat::Utils::Fetcher::HeaderExtension)
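The appended `2>&1` only has an effect because `IO.popen` passes a command string through the shell; it folds the child's stderr into the stream that `execute` captures. A tiny standalone illustration of the difference (the sample command is arbitrary):

```ruby
# Without the redirection stderr bypasses the pipe and goes to the terminal;
# with "2>&1" it is captured alongside stdout.
cmd = %q{ruby -e 'STDOUT.puts "out"; STDERR.puts "err"'}

IO.popen(cmd) { |io| p io.read }           # => "out\n" ("err" leaks to the terminal)
IO.popen("#{cmd} 2>&1") { |io| p io.read } # both lines captured; order depends on buffering
```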
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
1
1
  module OllamaChat
2
2
  # OllamaChat version
3
- VERSION = '0.0.5'
3
+ VERSION = '0.0.7'
4
4
  VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
5
5
  VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
6
6
  VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat.rb CHANGED
@@ -18,4 +18,6 @@ require 'ollama_chat/dialog'
18
18
  require 'ollama_chat/information'
19
19
  require 'ollama_chat/clipboard'
20
20
  require 'ollama_chat/document_cache'
21
+ require 'ollama_chat/history'
22
+ require 'ollama_chat/server_socket'
21
23
  require 'ollama_chat/chat'
data/ollama_chat.gemspec CHANGED
@@ -1,24 +1,24 @@
1
1
  # -*- encoding: utf-8 -*-
2
- # stub: ollama_chat 0.0.5 ruby lib
2
+ # stub: ollama_chat 0.0.7 ruby lib
3
3
 
4
4
  Gem::Specification.new do |s|
5
5
  s.name = "ollama_chat".freeze
6
- s.version = "0.0.5".freeze
6
+ s.version = "0.0.7".freeze
7
7
 
8
8
  s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
9
9
  s.require_paths = ["lib".freeze]
10
10
  s.authors = ["Florian Frank".freeze]
11
- s.date = "2025-03-22"
11
+ s.date = "1980-01-02"
12
12
  s.description = "The app provides a command-line interface (CLI) to an Ollama AI model,\nallowing users to engage in text-based conversations and generate\nhuman-like responses. Users can import data from local files or web pages,\nwhich are then processed through three different modes: fully importing the\ncontent into the conversation context, summarizing the information for\nconcise reference, or storing it in an embedding vector database for later\nretrieval based on the conversation.\n".freeze
13
13
  s.email = "flori@ping.de".freeze
14
- s.executables = ["ollama_chat".freeze]
15
- s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_type.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
16
- s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_type.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
14
+ s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
15
+ s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_type.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
16
+ s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_type.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
17
17
  s.homepage = "https://github.com/flori/ollama_chat".freeze
18
18
  s.licenses = ["MIT".freeze]
19
19
  s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
20
20
  s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
21
- s.rubygems_version = "3.6.2".freeze
21
+ s.rubygems_version = "3.6.7".freeze
22
22
  s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
23
23
  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
24
24
 
@@ -32,7 +32,7 @@ Gem::Specification.new do |s|
32
32
  s.add_development_dependency(%q<debug>.freeze, [">= 0".freeze])
33
33
  s.add_development_dependency(%q<simplecov>.freeze, [">= 0".freeze])
34
34
  s.add_runtime_dependency(%q<excon>.freeze, ["~> 1.0".freeze])
35
- s.add_runtime_dependency(%q<ollama-ruby>.freeze, ["~> 0.15".freeze])
35
+ s.add_runtime_dependency(%q<ollama-ruby>.freeze, ["~> 1.0".freeze])
36
36
  s.add_runtime_dependency(%q<documentrix>.freeze, ["~> 0.0".freeze])
37
37
  s.add_runtime_dependency(%q<rss>.freeze, ["~> 0.3".freeze])
38
38
  s.add_runtime_dependency(%q<term-ansicolor>.freeze, ["~> 1.11".freeze])
data/spec/ollama_chat/chat_spec.rb CHANGED
@@ -15,6 +15,33 @@ RSpec.describe OllamaChat::Chat do
15
15
  expect(chat).to be_a described_class
16
16
  end
17
17
 
18
+ describe 'chat history' do
19
+ it 'derives chat_history_filename' do
20
+ expect(chat.send(:chat_history_filename)).to_not be_nil
21
+ end
22
+
23
+ it 'can save chat history' do
24
+ expect(File).to receive(:secure_write).with(
25
+ chat.send(:chat_history_filename),
26
+ kind_of(String)
27
+ )
28
+ chat.send(:save_history)
29
+ end
30
+
31
+ it 'can initialize chat history' do
32
+ expect(File).to receive(:exist?).with(chat.send(:chat_history_filename)).
33
+ and_return true
34
+ expect(File).to receive(:open).with(chat.send(:chat_history_filename), ?r)
35
+ chat.send(:init_chat_history)
36
+ end
37
+
38
+ it 'can clear history' do
39
+ chat
40
+ expect(Readline::HISTORY).to receive(:clear)
41
+ chat.send(:clear_history)
42
+ end
43
+ end
44
+
18
45
  context 'loading conversations' do
19
46
  let :argv do
20
47
  %w[ -C test -c ] << asset('conversation.json')
data/spec/ollama_chat/parsing_spec.rb CHANGED
@@ -3,7 +3,9 @@ require 'pathname'
3
3
 
4
4
  RSpec.describe OllamaChat::Parsing do
5
5
  let :chat do
6
- OllamaChat::Chat.new
6
+ OllamaChat::Chat.new.tap do |chat|
7
+ chat.document_policy = 'importing'
8
+ end
7
9
  end
8
10
 
9
11
  connect_to_ollama_server
@@ -212,6 +214,7 @@ EOT
212
214
  Pathname.pwd.join('spec/assets/example.html').to_s
213
215
  )
214
216
  content, = chat.parse_content(c, [])
217
+ expect(content).to eq c
215
218
  end
216
219
 
217
220
  it 'can be summarizing' do
data/spec/ollama_chat/utils/fetcher_spec.rb CHANGED
@@ -113,9 +113,9 @@ RSpec.describe OllamaChat::Utils::Fetcher do
113
113
  end
114
114
 
115
115
  it 'can .execute' do
116
- described_class.execute('echo -n hello world') do |file|
116
+ described_class.execute('echo hello world') do |file|
117
117
  expect(file).to be_a Tempfile
118
- expect(file.read).to eq 'hello world'
118
+ expect(file.read).to eq "hello world\n"
119
119
  expect(file.content_type).to eq 'text/plain'
120
120
  end
121
121
  end
metadata CHANGED
@@ -1,13 +1,13 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: ollama_chat
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.0.5
4
+ version: 0.0.7
5
5
  platform: ruby
6
6
  authors:
7
7
  - Florian Frank
8
8
  bindir: bin
9
9
  cert_chain: []
10
- date: 2025-03-22 00:00:00.000000000 Z
10
+ date: 1980-01-02 00:00:00.000000000 Z
11
11
  dependencies:
12
12
  - !ruby/object:Gem::Dependency
13
13
  name: gem_hadar
@@ -127,14 +127,14 @@ dependencies:
127
127
  requirements:
128
128
  - - "~>"
129
129
  - !ruby/object:Gem::Version
130
- version: '0.15'
130
+ version: '1.0'
131
131
  type: :runtime
132
132
  prerelease: false
133
133
  version_requirements: !ruby/object:Gem::Requirement
134
134
  requirements:
135
135
  - - "~>"
136
136
  - !ruby/object:Gem::Version
137
- version: '0.15'
137
+ version: '1.0'
138
138
  - !ruby/object:Gem::Dependency
139
139
  name: documentrix
140
140
  requirement: !ruby/object:Gem::Requirement
@@ -354,6 +354,7 @@ description: |
354
354
  email: flori@ping.de
355
355
  executables:
356
356
  - ollama_chat
357
+ - ollama_chat_send
357
358
  extensions: []
358
359
  extra_rdoc_files:
359
360
  - README.md
@@ -363,12 +364,14 @@ extra_rdoc_files:
363
364
  - lib/ollama_chat/dialog.rb
364
365
  - lib/ollama_chat/document_cache.rb
365
366
  - lib/ollama_chat/follow_chat.rb
367
+ - lib/ollama_chat/history.rb
366
368
  - lib/ollama_chat/information.rb
367
369
  - lib/ollama_chat/message_list.rb
368
370
  - lib/ollama_chat/message_type.rb
369
371
  - lib/ollama_chat/model_handling.rb
370
372
  - lib/ollama_chat/ollama_chat_config.rb
371
373
  - lib/ollama_chat/parsing.rb
374
+ - lib/ollama_chat/server_socket.rb
372
375
  - lib/ollama_chat/source_fetching.rb
373
376
  - lib/ollama_chat/switches.rb
374
377
  - lib/ollama_chat/utils.rb
@@ -388,6 +391,7 @@ files:
388
391
  - Rakefile
389
392
  - VERSION
390
393
  - bin/ollama_chat
394
+ - bin/ollama_chat_send
391
395
  - config/searxng/settings.yml
392
396
  - docker-compose.yml
393
397
  - lib/ollama_chat.rb
@@ -396,6 +400,7 @@ files:
396
400
  - lib/ollama_chat/dialog.rb
397
401
  - lib/ollama_chat/document_cache.rb
398
402
  - lib/ollama_chat/follow_chat.rb
403
+ - lib/ollama_chat/history.rb
399
404
  - lib/ollama_chat/information.rb
400
405
  - lib/ollama_chat/message_list.rb
401
406
  - lib/ollama_chat/message_type.rb
@@ -403,6 +408,7 @@ files:
403
408
  - lib/ollama_chat/ollama_chat_config.rb
404
409
  - lib/ollama_chat/ollama_chat_config/default_config.yml
405
410
  - lib/ollama_chat/parsing.rb
411
+ - lib/ollama_chat/server_socket.rb
406
412
  - lib/ollama_chat/source_fetching.rb
407
413
  - lib/ollama_chat/switches.rb
408
414
  - lib/ollama_chat/utils.rb
@@ -467,7 +473,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
467
473
  - !ruby/object:Gem::Version
468
474
  version: '0'
469
475
  requirements: []
470
- rubygems_version: 3.6.2
476
+ rubygems_version: 3.6.7
471
477
  specification_version: 4
472
478
  summary: A command-line interface (CLI) for interacting with an Ollama AI model.
473
479
  test_files: