ollama_chat 0.0.21 → 0.0.23

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 91a8452d2970be06a595b83e8dfdc4ee8be0388b37171967ef884139ad97b492
-  data.tar.gz: 116ca9055525145d36a65abc5d38341a603fbc62ccf4d52257876ddaa3a3f0d9
+  metadata.gz: 7811d92313881ed278cca428c652ec13b6e52288a7f403f0dd88d771f8c7f494
+  data.tar.gz: 950605c809843f1b81ff94795f0db0ae062b9e8485f85dbe4ad23eded4001507
 SHA512:
-  metadata.gz: 710139d1db7d25ed49218eaf6c22f61cea64352bbd2d8d1c418ccaba67f190c6157b14d8c5ec41da18e83669177310143b0e3683d5df7b2d7e9c68e3edf72df7
-  data.tar.gz: ef6f2705cfacb4c4523ead9e3361ce980aef227e06b6bc3a7a639d6e492d82cbd03f6c1ca4b82383b29cb95d740d9b815fff3d5c57336baf5f885e1e43e81277
+  metadata.gz: 60980bfcfd8aca4de1c7bbbe4451d08d99512bd70b438e63f7fca5dbb4bbbcf9e2855477f7e6cf26d38ed9d673a36beca565d8fe29bacd99e66a34a01de114bc
+  data.tar.gz: e1b5a35b0a24c870668d59670d405fce3f6d654bc801e280677a43b04ad720c564d1dbc48e058464b00e159bac1e5d27d5782ec045fce97d490046064ff5a560
data/CHANGES.md CHANGED
@@ -1,5 +1,50 @@
 # Changes
 
+## 2025-08-17 v0.0.23
+
+- Added `OllamaChat::KramdownANSI` module with `configure_kramdown_ansi_styles` and `kramdown_ansi_parse` methods for consistent Markdown formatting
+- Replaced direct calls to `Kramdown::ANSI.parse` with `@chat.kramdown_ansi_parse` in `FollowChat` and `MessageList`
+- Integrated `OllamaChat::KramdownANSI` module into `OllamaChat::Chat` class
+- Configured `@kramdown_ansi_styles` during chat initialization
+- Added support for environment variables `KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES` and `KRAMDOWN_ANSI_STYLES` for styling configuration
+- Updated tests to mock `kramdown_ansi_parse` instead of direct `Kramdown::ANSI.parse`
+- Documented environment variables for customizing Markdown formatting with example JSON format
+- Added `lib/ollama_chat/kramdown_ansi.rb` to `extra_rdoc_files` and `files` list in gemspec
+- Escaped dot in regex pattern in `parsing_spec.rb` for proper image file matching
+- Implemented `File.expand_path` to resolve `~` shortcuts before existence check in parsing module
+- Added error handling for malformed paths by rescuing `ArgumentError` exceptions
+- Skipped invalid file paths during processing loop using `next` statement
+- Maintained backward compatibility for standard file paths
+- Added comprehensive list of supported environment variables in documentation
+
+## 2025-08-13 v0.0.22
+
+- Added new `-p` command line flag for enabling source parsing functionality
+- Enhanced `send_to_server_socket` method to accept and pass a `parse` parameter
+- Modified `chat.rb` to handle the `parse` content flag from server messages
+- Updated documentation in `README.md` with example usage of the new `-p` flag
+- Added comprehensive tests for the new parsing functionality in `server_socket_spec.rb`
+- Improved method documentation in `server_socket.rb` with detailed parameter descriptions
+- Replaced `messages.list_conversation(2)` with `messages.show_last` in `/drop` command behavior
+- Updated `gem_hadar` development dependency from version **1.27** to **2.0**
+- Simplified SimpleCov setup by using `GemHadar::SimpleCov.start` instead of manual configuration
+
+## 2025-08-11 v0.0.21
+
+* **Vim Integration**: The `/vim` command allows users to insert the last chat
+  message into a Vim server, improving workflow integration. It uses
+  `--servername` and `--remote-send` to insert text at the cursor position and
+  automatically indents based on the current column.
+* **Improved Documentation**: Comprehensive documentation has been added to
+  various modules and classes, making it easier for developers to understand
+  and use the gem's features.
+* **Model Selection Logic**: When only a single model is available, the code
+  now automatically selects that model instead of showing a prompt, improving
+  usability.
+* **Configuration Handling**: Configuration file error handling has been
+  updated to use `STDERR` for output, ensuring errors are displayed
+  appropriately.
+
 ## 2025-08-11 v0.0.20
 
 ### Documentation
data/README.md CHANGED
@@ -15,6 +15,34 @@ gem install ollama_chat
 
 in your terminal.
 
+## Configuration
+
+### Environment Variables
+
+The following environment variables can be used to configure behavior:
+
+- `OLLAMA_URL` - Base URL for the Ollama server (default: `http://localhost:11434`)
+- `OLLAMA_HOST` - Host and port of the Ollama server (default: `localhost:11434`)
+- `OLLAMA_MODEL` - Default model to use (e.g. `llama3.1`)
+- `KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES` - Custom ANSI styles for Markdown formatting
+- `KRAMDOWN_ANSI_STYLES` - Fallback ANSI styles configuration
+- `OLLAMA_CHAT_SYSTEM` - System prompt file or content (default: `null`)
+- `OLLAMA_CHAT_COLLECTION` - Collection name for embeddings
+- `PAGER` - Default pager for output
+- `REDIS_URL` - Redis connection URL for caching
+- `REDIS_EXPIRING_URL` - Redis connection URL for expiring data
+- `OLLAMA_CHAT_HISTORY` - Chat history filename (default: `~/.ollama_chat_history`)
+- `OLLAMA_CHAT_DEBUG` - Debug mode toggle (1 = enabled)
+- `DIFF_TOOL` - Tool for diff operations (default: `vimdiff`)
+- `OLLAMA_SEARXNG_URL` - SearxNG search endpoint URL
+
+Example usage for `KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES`:
+
+```bash
+# Set custom ANSI colors for Markdown output as a JSON object:
+export KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES='{"header":["bold","on_color241","white"],"strong":["bold","color76"],"em":["italic","color227"],"code":["bold","color214"]}'
+```
+
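For illustration, a JSON value like the one exported above decodes into a plain Ruby hash of style names to ANSI attribute lists. This is a hypothetical sketch of the data shape only; the gem's actual loading is done by `Kramdown::ANSI::Styles`, not this code.

```ruby
require 'json'

# Hypothetical sketch: decode a styles JSON value (as exported above) into a
# Hash with Symbol keys, the general shape Kramdown::ANSI styling works with.
json   = '{"header":["bold","white"],"code":["bold","color214"]}'
styles = JSON.parse(json).transform_keys(&:to_sym)

styles[:code] # => ["bold", "color214"]
```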
 ## Usage
 
 It can be started with the following arguments:
@@ -191,6 +219,14 @@ The `ollama_chat_send` command now supports additional parameters to enhance fun
 $ echo "$response"
 ```
 
+- **Source Parsing (`-p`)**: Enables automatic parsing of URLs, file paths, and
+  similar tokens in input content. When enabled, the system will attempt to
+  resolve and include external resources.
+
+  ```bash
+  $ echo "Visit https://example.com for more info" | ollama_chat_send -p
+  ```
+
 - **Help (`-h` or `--help`)**: Displays usage information and available options.
 
 ```bash
data/VERSION CHANGED
@@ -1 +1 @@
1
- 0.0.21
1
+ 0.0.23
data/bin/ollama_chat_send CHANGED
@@ -5,7 +5,7 @@ require 'tins/go'
 include Tins::GO
 
 
-opts = go 'f:rth', ARGV
+opts = go 'f:rtph', ARGV
 
 def usage(rc = 0)
   puts <<~EOT
@@ -14,6 +14,7 @@ def usage(rc = 0)
     Options:
      -r Wait for the response from Ollama Chat and output it
      -t Send input as terminal input including commands, e. g. /import
+     -p Send input with source parsing enabled (defaults to disabled)
      -f CONFIG file to read
      -h Show this help message
 
@@ -30,7 +31,7 @@ begin
   else
     opts[?r] ? :socket_input_with_response : :socket_input
   end
-  response = OllamaChat::ServerSocket.send_to_server_socket(STDIN.read, type:, config:)
+  response = OllamaChat::ServerSocket.send_to_server_socket(STDIN.read, type:, config:, parse: !!opts[?p])
  type == :socket_input_with_response and puts response.content
rescue => e
  warn "Caught #{e.class}: #{e}"
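The option string `'f:rtph'` passed to `go` follows the usual getopt-like convention: a letter followed by `:` takes an argument, a bare letter is a boolean flag. The sketch below emulates only that classification for illustration; it is not Tins::GO itself.

```ruby
# Sketch of the option-string semantics behind `go 'f:rtph', ARGV`:
# letters followed by ':' take an argument, bare letters are boolean flags.
def classify(spec)
  pairs = spec.scan(/([a-z])(:?)/i)
  {
    with_arg: pairs.select { |_, colon| colon == ':' }.map(&:first),
    boolean:  pairs.reject { |_, colon| colon == ':' }.map(&:first),
  }
end

classify('f:rtph') # => {with_arg: ["f"], boolean: ["r", "t", "p", "h"]}
```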
data/lib/ollama_chat/chat.rb CHANGED
@@ -33,6 +33,7 @@ class OllamaChat::Chat
   include OllamaChat::MessageFormat
   include OllamaChat::History
   include OllamaChat::ServerSocket
+  include OllamaChat::KramdownANSI
 
   # Initializes a new OllamaChat::Chat instance with the given command-line
   # arguments.
@@ -97,10 +98,11 @@ class OllamaChat::Chat
         system.present? and messages.set_system_prompt(system)
       end
     end
-    @documents = setup_documents
-    @cache = setup_cache
-    @current_voice = config.voice.default
-    @images = []
+    @documents            = setup_documents
+    @cache                = setup_cache
+    @current_voice        = config.voice.default
+    @images               = []
+    @kramdown_ansi_styles = configure_kramdown_ansi_styles
     init_chat_history
     @opts[?S] and init_server_socket
   rescue ComplexConfig::AttributeMissing, ComplexConfig::ConfigurationSyntaxError => e
@@ -217,7 +219,7 @@ class OllamaChat::Chat
       :next
     when %r(^/drop(?:\s+(\d*))?$)
       messages.drop($1)
-      messages.list_conversation(2)
+      messages.show_last
       :next
     when %r(^/model$)
       @model = choose_model('', @model)
@@ -495,8 +497,9 @@ class OllamaChat::Chat
     end
   rescue Interrupt
     if message = server_socket_message
-      type    = message.type.full?(:to_sym) || :socket_input
-      content = message.content
+      type           = message.type.full?(:to_sym) || :socket_input
+      content        = message.content
+      @parse_content = message.parse
       STDOUT.puts color(112) { "Received a server socket message. Processing now…" }
     else
       raise
data/lib/ollama_chat/follow_chat.rb CHANGED
@@ -102,9 +102,9 @@ class OllamaChat::FollowChat
   def display_formatted_terminal_output
     content, thinking = @messages.last.content, @messages.last.thinking
     if @chat.markdown.on?
-      content = talk_annotate { Kramdown::ANSI.parse(content) }
+      content = talk_annotate { @chat.kramdown_ansi_parse(content) }
       if @chat.think.on?
-        thinking = think_annotate { Kramdown::ANSI.parse(thinking) }
+        thinking = think_annotate { @chat.kramdown_ansi_parse(thinking) }
       end
     else
       content = talk_annotate { content }
@@ -0,0 +1,31 @@
1
+ module OllamaChat::KramdownANSI
2
+ # The configure_kramdown_ansi_styles method sets up ANSI styling for
3
+ # Kramdown::ANSI output by checking for specific environment variables and
4
+ # falling back to default styles.
5
+ #
6
+ # @return [ Hash ] a hash of ANSI styles configured either from environment
7
+ # variables or using default settings
8
+ def configure_kramdown_ansi_styles
9
+ if env_var = %w[ KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES KRAMDOWN_ANSI_STYLES ].find { ENV.key?(_1) }
10
+ Kramdown::ANSI::Styles.from_env_var(env_var).ansi_styles
11
+ else
12
+ Kramdown::ANSI::Styles.new.ansi_styles
13
+ end
14
+ end
15
+
16
+ # The kramdown_ansi_parse method processes content using Kramdown::ANSI with
17
+ # custom ANSI styles.
18
+ #
19
+ # This method takes raw content and converts it into formatted ANSI output by
20
+ # applying the instance's configured ANSI styles. It is used to render
21
+ # content with appropriate terminal formatting based on the application's
22
+ # styling configuration.
23
+ #
24
+ # @param content [ String ] the raw content to be parsed and formatted
25
+ #
26
+ # @return [ String ] the content formatted with ANSI escape sequences
27
+ # according to the configured styles
28
+ def kramdown_ansi_parse(content)
29
+ Kramdown::ANSI.parse(content, ansi_styles: @kramdown_ansi_styles)
30
+ end
31
+ end
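The lookup order in `configure_kramdown_ansi_styles` can be isolated into a few lines of plain Ruby: the chat-specific variable wins over the generic one, and with neither set the defaults apply. A minimal sketch of just that fallback selection (the `Kramdown::ANSI::Styles` calls themselves are omitted):

```ruby
# Sketch of the env-var fallback order used above: the more specific
# variable wins, then the generic one; nil means "use default styles".
CANDIDATE_VARS = %w[KRAMDOWN_ANSI_OLLAMA_CHAT_STYLES KRAMDOWN_ANSI_STYLES]

def styles_env_var(env)
  CANDIDATE_VARS.find { |name| env.key?(name) }
end
```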
data/lib/ollama_chat/message_list.rb CHANGED
@@ -193,7 +193,7 @@ class OllamaChat::MessageList
   #
   # @return [self, NilClass] nil if the system prompt is empty, otherwise self.
   def show_system_prompt
-    system_prompt = Kramdown::ANSI.parse(system.to_s).gsub(/\n+\z/, '').full?
+    system_prompt = @chat.kramdown_ansi_parse(system.to_s).gsub(/\n+\z/, '').full?
     system_prompt or return
     STDOUT.puts <<~EOT
       Configured system prompt is:
@@ -307,10 +307,10 @@ class OllamaChat::MessageList
     end
     thinking = if @chat.think.on?
                  think_annotate do
-                   message.thinking.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
+                   message.thinking.full? { @chat.markdown.on? ? @chat.kramdown_ansi_parse(_1) : _1 }
                  end
                end
-    content = message.content.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
+    content = message.content.full? { @chat.markdown.on? ? @chat.kramdown_ansi_parse(_1) : _1 }
     message_text = message_type(message.images) + " "
     message_text += bold { color(role_color) { message.role } }
     if thinking
data/lib/ollama_chat/parsing.rb CHANGED
@@ -200,6 +200,11 @@ module OllamaChat::Parsing
         when file
           file = file.sub(/#.*/, '')
           file =~ %r(\A[~./]) or file.prepend('./')
+          file = begin
+                   File.expand_path(file)
+                 rescue ArgumentError
+                   next
+                 end
          File.exist?(file) or next
          source = file
        when url
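The guard added above relies on a documented behavior of Ruby core: `File.expand_path` resolves `~` shortcuts, but raises `ArgumentError` for a malformed one (e.g. a `~user` referring to an unknown user). A standalone sketch of the same rescue-and-skip pattern, with the loop's `next` replaced by returning `nil`:

```ruby
# Sketch of the guard above: expand the path, but treat malformed paths
# (File.expand_path raises ArgumentError) as "skip this candidate".
def safe_expand(path)
  File.expand_path(path)
rescue ArgumentError
  nil
end
```

In the parsing loop itself the rescue clause uses `next` instead, so the malformed candidate is simply skipped.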
data/lib/ollama_chat/server_socket.rb CHANGED
@@ -1,24 +1,48 @@
 module OllamaChat::ServerSocket
   class << self
-    # The send_to_server_socket method sends content to the server socket and returns
-    # the response if type is :socket_input_with_response, otherwise it returns nil.
-
-    # @param content [ String ] the message to be sent to the server
-    # @param type [ Symbol ] the type of message being sent (default: :socket_input)
+    # The send_to_server_socket method transmits a message to a Unix domain
+    # socket server for processing by the Ollama Chat client.
+    #
+    # This method creates a socket server instance using the provided
+    # configuration, prepares a message with the given content, type, and parse
+    # flag, then sends it either as a simple transmission or with a response
+    # expectation depending on the message type. It is used to enable
+    # communication between external processes and the chat session via a named
+    # Unix socket.
     #
-    # @return [ String, NilClass ] the response from the server if type is
-    # :socket_input_with_response, otherwise nil.
-    def send_to_server_socket(content, config:, type: :socket_input)
+    # @param content [ String ] the message content to be sent
+    # @param config [ ComplexConfig::Settings ] the configuration object containing server settings
+    # @param type [ Symbol ] the type of message transmission, defaults to :socket_input
+    # @param parse [ TrueClass, FalseClass ] whether to parse the response, defaults to false
+    #
+    # @return [ UnixSocks::Message, nil ] the response from transmit_with_response if type
+    #   is :socket_input_with_response, otherwise nil
+    def send_to_server_socket(content, config:, type: :socket_input, parse: false)
       server = create_socket_server(config:)
-      message = { content:, type: }
+      message = { content:, type:, parse: }
       if type.to_sym == :socket_input_with_response
-        return server.transmit_with_response(message)
+        server.transmit_with_response(message)
       else
-        server.transmit(message)
-        nil
+        server.transmit(message)
+        nil
       end
     end
 
+    # The create_socket_server method constructs and returns a Unix domain
+    # socket server instance for communication with the Ollama Chat client.
+    #
+    # This method initializes a UnixSocks::Server object configured to listen
+    # for incoming messages on a named socket file. It supports specifying a
+    # custom runtime directory for the socket, which is useful for isolating
+    # multiple instances or environments. If no runtime directory is provided
+    # in the configuration, it defaults to using the standard system location
+    # for Unix domain sockets.
+    #
+    # @param config [ComplexConfig::Settings] the configuration object
+    #   containing server settings
+    #
+    # @return [UnixSocks::Server] a configured Unix domain socket server
+    #   instance ready to receive messages
     def create_socket_server(config:)
       if runtime_dir = config.server_socket_runtime_dir
         UnixSocks::Server.new(socket_name: 'ollama_chat.sock', runtime_dir:)
@@ -37,9 +61,6 @@ module OllamaChat::ServerSocket
   # messages in the background. When a message is received, it updates the
   # instance variable `server_socket_message` and sends an interrupt signal
   # to the current process in order to handle the message.
-  #
-  # @return [ nil ] This method does not return any value, it only sets up the
-  #   server socket and kills the process when a message is received.
  def init_server_socket
    server = OllamaChat::ServerSocket.create_socket_server(config:)
    server.receive_in_background do |message|
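The payload that `send_to_server_socket` hands to the UnixSocks transport is just a hash with the three fields visible in the diff above. A minimal sketch of that construction, with the transport itself omitted:

```ruby
# Minimal sketch of the message payload built by send_to_server_socket
# (field names taken from the diff above; UnixSocks transmission omitted).
def build_message(content, type: :socket_input, parse: false)
  { content: content, type: type, parse: parse }
end
```

The `parse: false` default is what keeps older callers that never pass the flag working unchanged.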
data/lib/ollama_chat/version.rb CHANGED
@@ -1,6 +1,6 @@
 module OllamaChat
   # OllamaChat version
-  VERSION = '0.0.21'
+  VERSION = '0.0.23'
  VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
  VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
  VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama_chat.rb CHANGED
@@ -23,4 +23,5 @@ require 'ollama_chat/vim'
 require 'ollama_chat/document_cache'
 require 'ollama_chat/history'
 require 'ollama_chat/server_socket'
+require 'ollama_chat/kramdown_ansi'
 require 'ollama_chat/chat'
data/ollama_chat.gemspec CHANGED
@@ -1,9 +1,9 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama_chat 0.0.21 ruby lib
+# stub: ollama_chat 0.0.23 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama_chat".freeze
-  s.version = "0.0.21".freeze
+  s.version = "0.0.23".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
@@ -12,19 +12,19 @@ Gem::Specification.new do |s|
   s.description = "The app provides a command-line interface (CLI) to an Ollama AI model,\nallowing users to engage in text-based conversations and generate\nhuman-like responses. Users can import data from local files or web pages,\nwhich are then processed through three different modes: fully importing the\ncontent into the conversation context, summarizing the information for\nconcise reference, or storing it in an embedding vector database for later\nretrieval based on the conversation.\n".freeze
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_chat".freeze, "ollama_chat_send".freeze]
-  s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
-  s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+  s.extra_rdoc_files = ["README.md".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze]
+  s.files = [".all_images.yml".freeze, ".envrc".freeze, ".gitignore".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "README.md".freeze, "Rakefile".freeze, "VERSION".freeze, "bin/ollama_chat".freeze, "bin/ollama_chat_send".freeze, "config/searxng/settings.yml".freeze, "docker-compose.yml".freeze, "lib/ollama_chat.rb".freeze, "lib/ollama_chat/chat.rb".freeze, "lib/ollama_chat/clipboard.rb".freeze, "lib/ollama_chat/dialog.rb".freeze, "lib/ollama_chat/document_cache.rb".freeze, "lib/ollama_chat/follow_chat.rb".freeze, "lib/ollama_chat/history.rb".freeze, "lib/ollama_chat/information.rb".freeze, "lib/ollama_chat/kramdown_ansi.rb".freeze, "lib/ollama_chat/message_format.rb".freeze, "lib/ollama_chat/message_list.rb".freeze, "lib/ollama_chat/message_output.rb".freeze, "lib/ollama_chat/model_handling.rb".freeze, "lib/ollama_chat/ollama_chat_config.rb".freeze, "lib/ollama_chat/ollama_chat_config/default_config.yml".freeze, "lib/ollama_chat/parsing.rb".freeze, "lib/ollama_chat/server_socket.rb".freeze, "lib/ollama_chat/source_fetching.rb".freeze, "lib/ollama_chat/switches.rb".freeze, "lib/ollama_chat/utils.rb".freeze, "lib/ollama_chat/utils/cache_fetcher.rb".freeze, "lib/ollama_chat/utils/chooser.rb".freeze, "lib/ollama_chat/utils/fetcher.rb".freeze, "lib/ollama_chat/utils/file_argument.rb".freeze, "lib/ollama_chat/version.rb".freeze, "lib/ollama_chat/vim.rb".freeze, "lib/ollama_chat/web_searching.rb".freeze, "ollama_chat.gemspec".freeze, "redis/redis.conf".freeze, "spec/assets/api_show.json".freeze, "spec/assets/api_tags.json".freeze, "spec/assets/api_version.json".freeze, "spec/assets/conversation.json".freeze, "spec/assets/duckduckgo.html".freeze, "spec/assets/example.atom".freeze, "spec/assets/example.csv".freeze, "spec/assets/example.html".freeze, "spec/assets/example.pdf".freeze, "spec/assets/example.ps".freeze, "spec/assets/example.rb".freeze, "spec/assets/example.rss".freeze, "spec/assets/example.xml".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/assets/searxng.json".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
   s.homepage = "https://github.com/flori/ollama_chat".freeze
   s.licenses = ["MIT".freeze]
   s.rdoc_options = ["--title".freeze, "OllamaChat - A command-line interface (CLI) for interacting with an Ollama AI model.".freeze, "--main".freeze, "README.md".freeze]
   s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
   s.rubygems_version = "3.6.9".freeze
   s.summary = "A command-line interface (CLI) for interacting with an Ollama AI model.".freeze
-  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
+  s.test_files = ["spec/assets/example.rb".freeze, "spec/ollama_chat/chat_spec.rb".freeze, "spec/ollama_chat/clipboard_spec.rb".freeze, "spec/ollama_chat/follow_chat_spec.rb".freeze, "spec/ollama_chat/information_spec.rb".freeze, "spec/ollama_chat/message_list_spec.rb".freeze, "spec/ollama_chat/message_output_spec.rb".freeze, "spec/ollama_chat/model_handling_spec.rb".freeze, "spec/ollama_chat/parsing_spec.rb".freeze, "spec/ollama_chat/server_socket_spec.rb".freeze, "spec/ollama_chat/source_fetching_spec.rb".freeze, "spec/ollama_chat/switches_spec.rb".freeze, "spec/ollama_chat/utils/cache_fetcher_spec.rb".freeze, "spec/ollama_chat/utils/fetcher_spec.rb".freeze, "spec/ollama_chat/utils/file_argument_spec.rb".freeze, "spec/ollama_chat/web_searching_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
   s.specification_version = 4
 
-  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.27".freeze])
+  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 2.0".freeze])
   s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
   s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
   s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
spec/ollama_chat/chat_spec.rb CHANGED
@@ -79,7 +79,7 @@ describe OllamaChat::Chat do
 
   it 'returns :next when input is "/drop(?:\s+(\d*))?"' do
     expect(chat.messages).to receive(:drop).with(?2)
-    expect(chat.messages).to receive(:list_conversation).with(2)
+    expect(chat.messages).to receive(:show_last)
     expect(chat.handle_input("/drop 2")).to eq :next
   end
 
spec/ollama_chat/message_list_spec.rb CHANGED
@@ -23,6 +23,12 @@ describe OllamaChat::MessageList do
     double('Chat', config:)
   end
 
+  before do
+    allow(chat).to receive(:kramdown_ansi_parse) do |content|
+      Kramdown::ANSI.parse(content)
+    end
+  end
+
   let :list do
     described_class.new(chat).tap do |list|
       list << Ollama::Message.new(role: 'system', content: 'hello')
@@ -133,8 +139,6 @@ describe OllamaChat::MessageList do
 
   it 'can show_system_prompt' do
     expect(list).to receive(:system).and_return 'test **prompt**'
-    expect(Kramdown::ANSI).to receive(:parse).with('test **prompt**').
-      and_call_original
     expect(list.show_system_prompt).to eq list
   end
 
spec/ollama_chat/parsing_spec.rb CHANGED
@@ -191,7 +191,7 @@ EOT
   it 'can add images' do
     images = []
     expect(chat).to receive(:add_image).
-      with(images, kind_of(IO), %r(/spec/assets/kitten.jpg\z)).
+      with(images, kind_of(IO), %r(/spec/assets/kitten\.jpg\z)).
       and_call_original
     chat.parse_content('./spec/assets/kitten.jpg', images)
     expect(images.size).to eq 1
@@ -0,0 +1,133 @@
+ require 'spec_helper'
+
+ describe OllamaChat::ServerSocket do
+   let :instance do
+     Object.extend(described_class)
+   end
+
+   describe '#send_to_server_socket' do
+     let(:config) { double('Config') }
+     let(:server) { double('Server') }
+
+     before do
+       expect(OllamaChat::ServerSocket).to receive(:create_socket_server).with(config: config).and_return(server)
+     end
+
+     context 'with default parameters' do
+       it 'uses correct defaults' do
+         message = { content: 'test', type: :socket_input, parse: false }
+
+         expect(server).to receive(:transmit).with(message).and_return(nil)
+
+         result = OllamaChat::ServerSocket.send_to_server_socket('test', config: config)
+
+         expect(result).to be_nil
+       end
+     end
+
+     context 'with :socket_input type and parse: true' do
+       it 'sends message with parse flag and returns nil' do
+         message = { content: 'test', type: :socket_input, parse: true }
+
+         expect(server).to receive(:transmit).with(message).and_return(nil)
+
+         result = OllamaChat::ServerSocket.send_to_server_socket(
+           'test',
+           config: config,
+           type: :socket_input,
+           parse: true
+         )
+
+         expect(result).to be_nil
+       end
+     end
+
+     context 'with :socket_input_with_response type and parse: false' do
+       it 'sends message and returns response with parse flag' do
+         message = { content: 'test', type: :socket_input_with_response, parse: false }
+         response = double('Response')
+
+         expect(server).to receive(:transmit_with_response).with(message).and_return(response)
+
+         result = OllamaChat::ServerSocket.send_to_server_socket(
+           'test',
+           config: config,
+           type: :socket_input_with_response,
+           parse: false
+         )
+
+         expect(result).to eq(response)
+       end
+     end
+
+     context 'with :socket_input_with_response type and parse: true' do
+       it 'sends message and returns response with parse flag' do
+         message = { content: 'test', type: :socket_input_with_response, parse: true }
+         response = double('Response')
+
+         expect(server).to receive(:transmit_with_response).with(message).and_return(response)
+
+         result = OllamaChat::ServerSocket.send_to_server_socket(
+           'test',
+           config: config,
+           type: :socket_input_with_response,
+           parse: true
+         )
+
+         expect(result).to eq(response)
+       end
+     end
+
+   end
+
+   describe '#create_socket_server' do
+     context 'with configured runtime_dir' do
+       it 'can be created with configured runtime_dir' do
+         config = double('Config', server_socket_runtime_dir: '/custom/runtime')
+         expect(UnixSocks::Server).to receive(:new).with(
+           socket_name: 'ollama_chat.sock',
+           runtime_dir: '/custom/runtime'
+         ).and_return :unix_socks_server
+
+         result = OllamaChat::ServerSocket.create_socket_server(config: config)
+         expect(result).to eq :unix_socks_server
+       end
+     end
+
+     context 'with default runtime_dir' do
+       it 'can be created with default runtime_dir' do
+         config = double('Config', server_socket_runtime_dir: nil)
+         expect(UnixSocks::Server).to receive(:new).with(
+           socket_name: 'ollama_chat.sock'
+         ).and_return :unix_socks_server
+
+         result = OllamaChat::ServerSocket.create_socket_server(config: config)
+         expect(result).to eq :unix_socks_server
+       end
+     end
+   end
+
+   describe '#server_socket_message' do
+     it 'can be set' do
+       message = double('message')
+       instance.server_socket_message = message
+       expect(instance.server_socket_message).to eq(message)
+     end
+
+     it 'can be read' do
+       message = double('message')
+       instance.server_socket_message = message
+       expect(instance.server_socket_message).to eq(message)
+     end
+   end
+
+   describe '#init_server_socket' do
+     it 'can be initialized' do
+       config = double('Config')
+       expect(instance).to receive(:config).and_return config
+       server = double('Server', receive_in_background: :receive_in_background)
+       expect(described_class).to receive(:create_socket_server).and_return server
+       expect(instance.init_server_socket).to eq :receive_in_background
+     end
+   end
+ end
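Read together, the specs above pin down the payload contract that `send_to_server_socket` hands to the Unix socket server: a hash with `content`, a `type` of either `:socket_input` (fire-and-forget via `transmit`) or `:socket_input_with_response` (round-trip via `transmit_with_response`), and a `parse` flag. A minimal sketch of that shape in plain Ruby — the `socket_message` helper is illustrative only, not an ollama_chat API:

```ruby
# Illustrative helper mirroring the message hashes the specs assert;
# not part of the ollama_chat gem itself.
def socket_message(content, type: :socket_input, parse: false)
  { content: content, type: type, parse: parse }
end

# Defaults match the 'with default parameters' example above.
p socket_message('test')
# Response-bearing variant, as exercised in the
# :socket_input_with_response contexts.
p socket_message('test', type: :socket_input_with_response, parse: true)
```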
data/spec/spec_helper.rb CHANGED
@@ -1,9 +1,5 @@
- if ENV['START_SIMPLECOV'].to_i == 1
-   require 'simplecov'
-   SimpleCov.start do
-     add_filter "#{File.basename(File.dirname(__FILE__))}/"
-   end
- end
+ require 'gem_hadar/simplecov'
+ GemHadar::SimpleCov.start
  require 'rspec'
  require 'tins/xt/expose'
  begin
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ollama_chat
  version: !ruby/object:Gem::Version
-   version: 0.0.21
+   version: 0.0.23
  platform: ruby
  authors:
  - Florian Frank
@@ -15,14 +15,14 @@ dependencies:
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '1.27'
+         version: '2.0'
    type: :development
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '1.27'
+         version: '2.0'
  - !ruby/object:Gem::Dependency
    name: all_images
    requirement: !ruby/object:Gem::Requirement
@@ -386,6 +386,7 @@ extra_rdoc_files:
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
+ - lib/ollama_chat/kramdown_ansi.rb
  - lib/ollama_chat/message_format.rb
  - lib/ollama_chat/message_list.rb
  - lib/ollama_chat/message_output.rb
@@ -424,6 +425,7 @@ files:
  - lib/ollama_chat/follow_chat.rb
  - lib/ollama_chat/history.rb
  - lib/ollama_chat/information.rb
+ - lib/ollama_chat/kramdown_ansi.rb
  - lib/ollama_chat/message_format.rb
  - lib/ollama_chat/message_list.rb
  - lib/ollama_chat/message_output.rb
@@ -468,6 +470,7 @@ files:
  - spec/ollama_chat/message_output_spec.rb
  - spec/ollama_chat/model_handling_spec.rb
  - spec/ollama_chat/parsing_spec.rb
+ - spec/ollama_chat/server_socket_spec.rb
  - spec/ollama_chat/source_fetching_spec.rb
  - spec/ollama_chat/switches_spec.rb
  - spec/ollama_chat/utils/cache_fetcher_spec.rb
@@ -511,6 +514,7 @@ test_files:
  - spec/ollama_chat/message_output_spec.rb
  - spec/ollama_chat/model_handling_spec.rb
  - spec/ollama_chat/parsing_spec.rb
+ - spec/ollama_chat/server_socket_spec.rb
  - spec/ollama_chat/source_fetching_spec.rb
  - spec/ollama_chat/switches_spec.rb
  - spec/ollama_chat/utils/cache_fetcher_spec.rb