ollama-ruby 0.13.0 → 0.14.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: d7ec8e3eb645bcb195053ea275a2c124b66ab7633a31b9795462252ed4c5580e
- data.tar.gz: b480ff0ff9cfad97a43e3131498df71c94ba855a1bc91ee2764fa06a53c84c58
+ metadata.gz: ca130b81b4baf0c93d2b727beef4f87a13fe7a1b97006963e780620372091d7f
+ data.tar.gz: 95dcad426a570bd1b78abf24bc868be1b1ce3836fd52c7a02ae4d9b857371500
  SHA512:
- metadata.gz: 71bc10942c4f0dc23e12501ea2928e2d5e05c73bbdf7b438cd3ebf259c5a07214bec6db1e3fbc80a6fef284b6e58daa96c7c3ee5af9084898fe0d186762fb330
- data.tar.gz: d8d527588398323bb3f8f9172d0be22fa41e2aeeca7f1b5ce9aecc6be9981e13d9abd047b34ca1c13375d55de61f6be2e29449e2d86e3cb5f755cc7f84493b11
+ metadata.gz: 208bbe4f45666e567e4f1ddb34c642490efd62ba2945386f83b4fdfeedbcfc9bda30780ded742be631d371c26b6e69c2b7fa487470f70a19491017f44eab9773
+ data.tar.gz: 6f272e0d919b65a567c67f509e2e5571d1358f3357087578e6931b1a0dc6db0ede94f181d012268dea0c303016206e349634496bc02f790b04789b5e790e7709
data/CHANGES.md CHANGED
@@ -1,5 +1,44 @@
  # Changes
 
+ ## 2025-01-29 v0.14.1
+
+ * Removed dependency on `Documentrix`:
+   * Dependency removed from Rakefile
+   * Dependency removed from `ollama-ruby.gemspec`
+
+ ## 2025-01-29 v0.14.0
+
+ * Removed `term-ansicolor`, `redis`, `mime-types`, `reverse_markdown`,
+   `complex_config`, `search_ui`, `amatch`, `pdf-reader`, and `logger`
+   dependencies from gemspec.
+ * Added `kramdown-ansi` dependency to gemspec.
+ * Moved `ollama_chat` executable to its own gem.
+ * Refactored Ollama library by removing no longer used utils files, and specs.
+ * Removed test prompt from spec/assets/prompt.txt file.
+ * Removed Redis configuration.
+ * Removed docker-compose.yml.
+ * Removed corpus from .gitignore.
+ * Updated Docker setup for Gem installation:
+   + Updated `.all_images.yml` to remove unnecessary `gem update --system` and
+     `gem install bundler` commands.
+ * Added new image configuration for **ruby:3.4-alpine** container:
+   + Update `.all_images.yml` to include `ruby:3.4-alpine` environment.
+ * Implemented equals method for Ollama::DTO and added tests for it:
+   + Added `==` method to `Ollama::DTO` class.
+   + Added two new specs to `message_spec.rb`:
+     - Test that a message is equal to itself using the `eq` matcher.
+     - Test that a message is not equal to its duplicate using the `equal` matcher.
+ * Simplified System Prompt Changer:
+   + Removed redundant variable assignments for `chosen` and `system`.
+   + Introduced a simple prompt selection logic when only one option is available.
+   + Added options to exit or create a new prompt in the chooser interface.
+ * Improvements to system prompt display:
+   + Added system prompt length display using **bold** formatting.
+ * Added new command to output current configuration:
+   + Added `/config` command to display current chat configuration in `ollama_chat.rb`
+   + Modified `display_chat_help` method to include `/config` option
+   + Implemented pager functionality for displaying large configurations.
+
  ## 2024-12-07 v0.13.0
 
  * Refactor: Extract documents database logic into separate gem `documentrix`.
data/README.md CHANGED
@@ -12,7 +12,7 @@ To install Ollama, you can use the following methods:
 
  1. Type
 
- ```
+ ```bash
  gem install ollama-ruby
  ```
 
@@ -20,7 +20,7 @@ in your terminal.
 
  1. Or add the line
 
- ```
+ ```bash
  gem 'ollama-ruby'
  ```
 
@@ -48,8 +48,8 @@ This is an interactive console where you can try out the different commands
  provided by an `Ollama::Client` instance. For example, this command generates a
  response and displays it on the screen using the Markdown handler:
 
- ```
- $ ollama_console
+ ```bash
+ ollama_console
  Commands: chat,copy,create,delete,embeddings,generate,help,ps,pull,push,show,tags
  >> generate(model: 'llama3.1', stream: true, prompt: 'tell story w/ emoji and markdown', &Markdown)
  ```
@@ -312,159 +312,85 @@ For more generic errors an `Ollama::Errors::Error` is raised.
 
  ## Other executables
 
- ### ollama\_chat
+ ### ollama\_cli
 
- This a chat client, that can be used to connect to an ollama server and enter a
- chat converstation with a LLM. It can be called with the following arguments:
+ The `ollama_cli` executable is a command-line interface for interacting with
+ the Ollama API. It allows users to generate text, and perform other tasks using
+ a variety of options.
 
- ```
- Usage: ollama_chat [OPTIONS]
-
- -f CONFIG config file to read
- -u URL the ollama base url, OLLAMA_URL
- -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL
- -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM
- -c CHAT a saved chat conversation to load
- -C COLLECTION name of the collection used in this conversation
- -D DOCUMENT load document and add to embeddings collection (multiple)
- -M use (empty) MemoryCache for this chat session
- -E disable embeddings for this chat session
- -V display the current version number and quit
- -h this help
+ #### Usage
+
+ To use `ollama_cli`, simply run it from the command line and follow the usage
+ instructions:
+
+ ```bash
+ ollama_cli [OPTIONS]
  ```
 
- The base URL can be either set by the environment variable `OLLAMA_URL` or it
- is derived from the environment variable `OLLAMA_HOST`. The default model to
- connect can be configured in the environment variable `OLLAMA_MODEL`.
+ The available options are:
 
- The YAML config file in `$XDG_CONFIG_HOME/ollama_chat/config.yml`, that you can
- use for more complex settings, it looks like this:
+ * `-u URL`: The Ollama base URL. Can be set as an environment variable
+   `OLLAMA_URL`.
+ * `-m MODEL`: The Ollama model to chat with. Defaults to `llama3.1` if not
+   specified.
+ * `-M OPTIONS`: The Ollama model options to use. Can be set as an environment
+   variable `OLLAMA_MODEL_OPTIONS`.
+ * `-s SYSTEM`: The system prompt to use as a file. Can be set as an environment
+   variable `OLLAMA_SYSTEM`.
+ * `-p PROMPT`: The user prompt to use as a file. If it contains `%{stdin}`, it
+   will be substituted with the standard input. If not given, stdin will be used
+   as the prompt.
+ * `-P VARIABLE`: Sets a prompt variable, e.g. `foo=bar`. Can be used multiple
+   times.
+ * `-H HANDLER`: The handler to use for the response. Defaults to `Print`.
+ * `-S`: Use streaming for generation.
 
- ```
- ---
- url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
- model:
-   name: <%= ENV.fetch('OLLAMA_CHAT_MODEL', 'llama3.1') %>
-   options:
-     num_ctx: 8192
- system: <%= ENV.fetch('OLLAMA_CHAT_SYSTEM', 'null') %>
- voice: Samantha
- markdown: true
- embedding:
-   enabled: true
-   model:
-     name: mxbai-embed-large
-     options: {}
-   collection: <%= ENV.fetch('OLLAMA_CHAT_COLLECTION', 'ollama_chat') %>
-   found_texts_size: 4096
-   splitter:
-     name: RecursiveCharacter
-     chunk_size: 1024
- cache: Ollama::Documents::RedisCache
- redis:
-   url: <%= ENV.fetch('REDIS_URL', 'null') %>
- debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
- ```
+ #### Environment Variables
 
- If you want to store embeddings persistently, set an environment variable
- `REDIS_URL` or update the `redis.url` setting in your `config.yml` file to
- connect to a Redis server. Without this setup, embeddings will only be stored
- in process memory, which is less durable.
+ The following environment variables can be set to customize the behavior of
+ `ollama_cli`:
 
- Some settings can be passed as arguments as well, e. g. if you want to choose a
- specific system prompt:
+ * `OLLAMA_URL`: The Ollama base URL.
+ * `OLLAMA_MODEL`: The Ollama model to chat with.
+ * `OLLAMA_MODEL_OPTIONS`: The Ollama model options to use.
+ * `OLLAMA_SYSTEM`: The system prompt to use as a file.
+ * `OLLAMA_PROMPT`: The user prompt to use as a file.
 
- ```
- $ ollama_chat -s sherlock.txt
- Model with architecture llama found.
- Connecting to llama3.1@http://ollama.local.net:11434 now…
- Configured system prompt is:
- You are Sherlock Holmes and the user is your new client, Dr. Watson is also in
- the room. You will talk and act in the typical manner of Sherlock Holmes do and
- try to solve the user's case using logic and deduction.
-
- Type /help to display the chat help.
- 📨 user:
- Good morning.
- 📨 assistant:
- Ah, good morning, my dear fellow! It is a pleasure to make your acquaintance. I
- am Sherlock Holmes, the renowned detective, and this is my trusty sidekick, Dr.
- Watson. Please, have a seat and tell us about the nature of your visit. What
- seems to be the problem that has brought you to our humble abode at 221B Baker
- Street?
-
- (Watson nods in encouragement as he takes notes)
-
- Now, pray tell, what is it that puzzles you, my dear client? A missing item,
- perhaps? Or a mysterious occurrence that requires clarification? The game, as
- they say, is afoot!
- ```
+ #### Debug Mode
 
- This example shows how an image like this can be sent to a vision model for
- analysis:
+ If the `DEBUG` environment variable is set to `1`, `ollama_cli` will print out
+ the values of various variables, including the base URL, model, system prompt,
+ and options. This can be useful for debugging purposes.
 
- ![cat](spec/assets/kitten.jpg)
+ #### Handler Options
 
- ```
- $ ollama_chat -m llava-llama3
- Model with architecture llama found.
- Connecting to llava-llama3@http://localhost:11434 now…
- Type /help to display the chat help.
- 📸 user> What's on this image? ./spec/assets/kitten.jpg
- 📨 assistant:
- The image captures a moment of tranquility featuring a young cat. The cat,
- adorned with gray and white fur marked by black stripes on its face and legs,
- is the central figure in this scene. Its eyes, a striking shade of blue, are
- wide open and directed towards the camera, giving an impression of curiosity or
- alertness.
-
- The cat is comfortably nestled on a red blanket, which contrasts vividly with
- its fur. The blanket, soft and inviting, provides a sense of warmth to the
- image. In the background, partially obscured by the cat's head, is another
- blanket of similar red hue. The repetition of the color adds a sense of harmony
- to the composition.
-
- The cat's position on the right side of the photo creates an interesting
- asymmetry with the camera lens, which occupies the left side of the frame. This
- visual balance enhances the overall composition of the image.
-
- There are no discernible texts or other objects in the image. The focus is
- solely on the cat and its immediate surroundings. The image does not provide
- any information about the location or setting beyond what has been described.
- The simplicity of the scene allows the viewer to concentrate on the main
- subject - the young, blue-eyed cat.
- ```
+ The `-H` option specifies the handler to use for the response. The available
+ handlers are:
 
- The following commands can be given inside the chat, if prefixed by a `/`:
+ * `Print`: Prints the response to the console. This is the default.
+ * `Markdown`: Prints the response to the console as markdown.
+ * `DumpJSON`: Dumps all responses as JSON to the output.
+ * `DumpYAML`: Dumps all responses as YAML to the output.
+ * `Say`: Outputs the response with a voice.
 
+ #### Streaming
+
+ The `-S` option enables streaming for generation. This allows the model to
+ generate text in chunks, rather than waiting for the entire response to be
+ generated.
+
+ ### ollama\_chat
+
+ This is a chat client that allows you to connect to an Ollama server and engage
+ in conversations with Large Language Models (LLMs). It can be installed using
+ the following command:
+
+ ```bash
+ gem install ollama-chat
  ```
- /copy to copy last response to clipboard
- /paste to paste content
- /markdown toggle markdown output
- /stream toggle stream output
- /location toggle location submission
- /voice( change) toggle voice output or change the voice
- /list [n] list the last n / all conversation exchanges
- /clear clear the whole conversation
- /clobber clear the conversation and collection
- /pop [n] pop the last n exchanges, defaults to 1
- /model change the model
- /system change system prompt (clears conversation)
- /regenerate the last answer message
- /collection( clear|change) change (default) collection or clear
- /info show information for current session
- /document_policy pick a scan policy for document references
- /import source import the source's content
- /summarize [n] source summarize the source's content in n words
- /embedding toggle embedding paused or not
- /embed source embed the source's content
- /web [n] query query web search & return n or 1 results
- /links( clear) display (or clear) links used in the chat
- /save filename store conversation messages
- /load filename load conversation messages
- /quit to quit
- /help to view this help
- ```
+
+ Once installed, you can run `ollama_chat` from your terminal or command prompt.
+ This will launch a chat interface where you can interact with an LLM.
 
  ## Download
 
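The `%{stdin}` and `-P`-variable substitution that the README describes for `ollama_cli` boils down to Ruby's named references in `Kernel#format`; a minimal sketch (the prompt text and variable names here are illustrative, not taken from the executable's source):

```ruby
# Sketch of ollama_cli-style prompt templating: %{name} placeholders in the
# prompt template are filled from a hash via Kernel#format. A -P words=10
# style option would supply one such key/value pair; %{stdin} is replaced by
# whatever was read from standard input.
template    = 'Summarize in %{words} words: %{stdin}'
stdin_input = 'Ollama is a tool for running LLMs locally.'

filled = format(template, words: 10, stdin: stdin_input)
puts filled
# Summarize in 10 words: Ollama is a tool for running LLMs locally.
```

Note that `format` raises a `KeyError` when a referenced variable is missing, which is a useful failure mode for catching typos in `-P` options.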
data/Rakefile CHANGED
@@ -14,34 +14,22 @@ GemHadar do
  test_dir 'spec'
  ignore '.*.sw[pon]', 'pkg', 'Gemfile.lock', '.AppleDouble', '.bundle',
  '.yardoc', 'doc', 'tags', 'errors.lst', 'cscope.out', 'coverage', 'tmp',
- 'corpus', 'yard'
+ 'yard'
  package_ignore '.all_images.yml', '.tool-versions', '.gitignore', 'VERSION',
  '.rspec', *Dir.glob('.github/**/*', File::FNM_DOTMATCH)
  readme 'README.md'
 
- executables << 'ollama_console' << 'ollama_chat' <<
- 'ollama_update' << 'ollama_cli'
+ executables << 'ollama_console' << 'ollama_update' << 'ollama_cli'
 
  required_ruby_version '~> 3.1'
 
  dependency 'excon', '~> 1.0'
  dependency 'infobar', '~> 0.8'
- dependency 'term-ansicolor', '~> 1.11'
- dependency 'redis', '~> 5.0'
- dependency 'mime-types', '~> 3.0'
- dependency 'reverse_markdown', '~> 3.0'
- dependency 'complex_config', '~> 0.22', '>= 0.22.2'
- dependency 'search_ui', '~> 0.0'
- dependency 'amatch', '~> 0.4.1'
- dependency 'pdf-reader', '~> 2.0'
- dependency 'logger', '~> 1.0'
  dependency 'json', '~> 2.0'
- dependency 'xdg', '~> 7.0'
  dependency 'tins', '~> 1.34'
+ dependency 'term-ansicolor', '~> 1.11'
  dependency 'kramdown-ansi', '~> 0.0', '>= 0.0.1'
  dependency 'ostruct', '~> 0.0'
- dependency 'rss', '~> 0.3'
- dependency 'documentrix', '~> 0.0'
  development_dependency 'all_images', '~> 0.6'
  development_dependency 'rspec', '~> 3.2'
  development_dependency 'kramdown', '~> 2.0'
data/bin/ollama_cli CHANGED
@@ -2,20 +2,51 @@
 
  require 'ollama'
  include Ollama
- include Ollama::Utils::FileArgument
  require 'tins'
  include Tins::GO
  require 'json'
 
+ # Returns the contents of a file or string, or a default value if neither is provided.
+ #
+ # @param [String] path_or_content The path to a file or a string containing
+ #   the content.
+ #
+ # @param [String] default The default value to return if no valid input is
+ #   given. Defaults to nil.
+ #
+ # @return [String] The contents of the file, the string, or the default value.
+ #
+ # @example Get the contents of a file
+ #   get_file_argument('path/to/file')
+ #
+ # @example Use a string as content
+ #   get_file_argument('string content')
+ #
+ # @example Return a default value if no valid input is given
+ #   get_file_argument(nil, default: 'default content')
+ def get_file_argument(path_or_content, default: nil)
+   if path_or_content.present? && path_or_content.size < 2 ** 15 &&
+       File.basename(path_or_content).size < 2 ** 8 &&
+       File.exist?(path_or_content)
+   then
+     File.read(path_or_content)
+   elsif path_or_content.present?
+     path_or_content
+   else
+     default
+   end
+ end
+
+ # Outputs usage information for the `ollama_cli`.
  def usage
    puts <<~EOT
    Usage: #{File.basename($0)} [OPTIONS]
 
- -u URL the ollama base url, OLLAMA_URL
- -m MODEL the ollama model to chat with, OLLAMA_MODEL
- -M OPTIONS the ollama model options to use, OLLAMA_MODEL_OPTIONS
- -s SYSTEM the system prompt to use as a file, OLLAMA_SYSTEM
- -p PROMPT the user prompt to use as a file, OLLAMA_PROMPT
+ -u URL the ollama base url, $OLLAMA_URL
+ -m MODEL the ollama model to chat with, $OLLAMA_MODEL
+ -M OPTIONS the ollama model options to use, $OLLAMA_MODEL_OPTIONS
+ -s SYSTEM the system prompt to use as a file, $OLLAMA_SYSTEM
+ -p PROMPT the user prompt to use as a file, $OLLAMA_PROMPT
  if it contains %{stdin} it is substituted by stdin input
  -P VARIABLE sets prompt var %{foo} to "bar" if VARIABLE is foo=bar
  -H HANDLER the handler to use for the response, defaults to Print
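The `get_file_argument` helper inlined above can be exercised in isolation. The sketch below substitutes plain-Ruby emptiness checks for tins' `present?` (which the real executable gets via its requires); otherwise the branching matches the diff: short strings that name an existing file are read from disk, other non-empty strings are used verbatim, and nil falls back to the default:

```ruby
require 'tempfile'

# Sketch of ollama_cli's get_file_argument without the tins dependency:
# present? is replaced by a nil/empty check, everything else mirrors the
# logic added in this release.
def get_file_argument(path_or_content, default: nil)
  if path_or_content && !path_or_content.empty? &&
      path_or_content.size < 2 ** 15 &&
      File.basename(path_or_content).size < 2 ** 8 &&
      File.exist?(path_or_content)
    File.read(path_or_content)        # small, existing file => its contents
  elsif path_or_content && !path_or_content.empty?
    path_or_content                   # any other non-empty string => verbatim
  else
    default                           # nil or empty => the default value
  end
end

Tempfile.create('prompt') do |f|
  f.write('You are a helpful assistant.')
  f.flush
  puts get_file_argument(f.path)      # prints the file's contents
end
puts get_file_argument('inline prompt text')      # prints the string itself
puts get_file_argument(nil, default: 'fallback')  # prints the default
```

This dual behavior is what lets `-s` and `-p` accept either a file path or a literal prompt on the command line.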
data/lib/ollama/dto.rb CHANGED
@@ -31,6 +31,10 @@ module Ollama::DTO
  reject { _2.nil? || _2.ask_and_send(:size) == 0 }
  end
 
+ def ==(other)
+   as_json == other.as_json
+ end
+
  alias to_hash as_json
 
  def empty?
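The semantics of the new `Ollama::DTO#==` can be illustrated without the gem installed. This minimal sketch mirrors the `as_json` comparison shown in the hunk above; `Demo` is a hypothetical stand-in class, not part of the library:

```ruby
# Sketch of the equality added to Ollama::DTO in this release: two objects
# compare equal (==) when their as_json representations match, while equal?
# (object identity) still tells duplicates apart -- which is exactly what the
# new message_spec.rb examples assert with the eq and equal matchers.
class Demo
  def initialize(role:, content:)
    @role, @content = role, content
  end

  # JSON-style hash representation, standing in for DTO#as_json
  def as_json
    { role: @role, content: @content }
  end

  # Same definition as the new Ollama::DTO#==
  def ==(other)
    as_json == other.as_json
  end
end

a = Demo.new(role: 'user', content: 'hello world')
b = a.dup

puts a == b       # true  - same as_json representation
puts a.equal?(b)  # false - different object identity
```

Basing `==` on `as_json` keeps value equality consistent with the wire format: any two DTOs that would serialize identically are treated as the same message.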
data/lib/ollama/version.rb CHANGED
@@ -1,6 +1,6 @@
  module Ollama
    # Ollama version
-   VERSION = '0.13.0'
+   VERSION = '0.14.1'
    VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
    VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
    VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/lib/ollama.rb CHANGED
@@ -1,5 +1,4 @@
  require 'json'
- require 'logger'
  require 'excon'
  require 'tins'
  require 'tins/xt/full'
@@ -14,12 +13,6 @@ module Ollama
    include Ollama::Handlers
  end
 
- module Ollama::Utils
- end
- require 'ollama/utils/fetcher'
- require 'ollama/utils/chooser'
- require 'ollama/utils/file_argument'
-
  require 'ollama/version'
  require 'ollama/errors'
  require 'ollama/dto'
data/ollama-ruby.gemspec CHANGED
@@ -1,26 +1,26 @@
  # -*- encoding: utf-8 -*-
- # stub: ollama-ruby 0.13.0 ruby lib
+ # stub: ollama-ruby 0.14.1 ruby lib
 
  Gem::Specification.new do |s|
  s.name = "ollama-ruby".freeze
- s.version = "0.13.0".freeze
+ s.version = "0.14.1".freeze
 
  s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
  s.require_paths = ["lib".freeze]
  s.authors = ["Florian Frank".freeze]
- s.date = "2024-12-07"
+ s.date = "2025-01-29"
  s.description = "Library that allows interacting with the Ollama API".freeze
  s.email = "flori@ping.de".freeze
- s.executables = ["ollama_console".freeze, "ollama_chat".freeze, "ollama_update".freeze, "ollama_cli".freeze]
- s.extra_rdoc_files = ["README.md".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/utils/cache_fetcher.rb".freeze, "lib/ollama/utils/chooser.rb".freeze, "lib/ollama/utils/fetcher.rb".freeze, "lib/ollama/utils/file_argument.rb".freeze, "lib/ollama/version.rb".freeze]
- s.files = [".envrc".freeze, ".yardopts".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "LICENSE".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_cli".freeze, "bin/ollama_console".freeze, "bin/ollama_update".freeze, "config/redis.conf".freeze, "docker-compose.yml".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/utils/cache_fetcher.rb".freeze, "lib/ollama/utils/chooser.rb".freeze, "lib/ollama/utils/fetcher.rb".freeze, "lib/ollama/utils/file_argument.rb".freeze, "lib/ollama/version.rb".freeze, "ollama-ruby.gemspec".freeze, "spec/assets/kitten.jpg".freeze, "spec/assets/prompt.txt".freeze, "spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/ollama/utils/cache_fetcher_spec.rb".freeze, "spec/ollama/utils/fetcher_spec.rb".freeze, "spec/ollama/utils/file_argument_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
+ s.executables = ["ollama_console".freeze, "ollama_update".freeze, "ollama_cli".freeze]
+ s.extra_rdoc_files = ["README.md".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/version.rb".freeze]
+ s.files = [".envrc".freeze, ".yardopts".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "LICENSE".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_cli".freeze, "bin/ollama_console".freeze, "bin/ollama_update".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/version.rb".freeze, "ollama-ruby.gemspec".freeze, "spec/assets/kitten.jpg".freeze, "spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
  s.homepage = "https://github.com/flori/ollama-ruby".freeze
  s.licenses = ["MIT".freeze]
  s.rdoc_options = ["--title".freeze, "Ollama-ruby - Interacting with the Ollama API".freeze, "--main".freeze, "README.md".freeze]
  s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
- s.rubygems_version = "3.5.23".freeze
+ s.rubygems_version = "3.6.2".freeze
  s.summary = "Interacting with the Ollama API".freeze
- s.test_files = ["spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/ollama/utils/cache_fetcher_spec.rb".freeze, "spec/ollama/utils/fetcher_spec.rb".freeze, "spec/ollama/utils/file_argument_spec.rb".freeze, "spec/spec_helper.rb".freeze]
+ s.test_files = ["spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
  s.specification_version = 4
 
@@ -33,20 +33,9 @@ Gem::Specification.new do |s|
  s.add_development_dependency(%q<simplecov>.freeze, [">= 0".freeze])
  s.add_runtime_dependency(%q<excon>.freeze, ["~> 1.0".freeze])
  s.add_runtime_dependency(%q<infobar>.freeze, ["~> 0.8".freeze])
- s.add_runtime_dependency(%q<term-ansicolor>.freeze, ["~> 1.11".freeze])
- s.add_runtime_dependency(%q<redis>.freeze, ["~> 5.0".freeze])
- s.add_runtime_dependency(%q<mime-types>.freeze, ["~> 3.0".freeze])
- s.add_runtime_dependency(%q<reverse_markdown>.freeze, ["~> 3.0".freeze])
- s.add_runtime_dependency(%q<complex_config>.freeze, ["~> 0.22".freeze, ">= 0.22.2".freeze])
- s.add_runtime_dependency(%q<search_ui>.freeze, ["~> 0.0".freeze])
- s.add_runtime_dependency(%q<amatch>.freeze, ["~> 0.4.1".freeze])
- s.add_runtime_dependency(%q<pdf-reader>.freeze, ["~> 2.0".freeze])
- s.add_runtime_dependency(%q<logger>.freeze, ["~> 1.0".freeze])
  s.add_runtime_dependency(%q<json>.freeze, ["~> 2.0".freeze])
- s.add_runtime_dependency(%q<xdg>.freeze, ["~> 7.0".freeze])
  s.add_runtime_dependency(%q<tins>.freeze, ["~> 1.34".freeze])
+ s.add_runtime_dependency(%q<term-ansicolor>.freeze, ["~> 1.11".freeze])
  s.add_runtime_dependency(%q<kramdown-ansi>.freeze, ["~> 0.0".freeze, ">= 0.0.1".freeze])
  s.add_runtime_dependency(%q<ostruct>.freeze, ["~> 0.0".freeze])
- s.add_runtime_dependency(%q<rss>.freeze, ["~> 0.3".freeze])
- s.add_runtime_dependency(%q<documentrix>.freeze, ["~> 0.0".freeze])
  end
data/spec/ollama/message_spec.rb CHANGED
@@ -33,4 +33,13 @@ RSpec.describe Ollama::Message do
  {"role":"user","content":"hello world","images":["dGVzdA==\n"]}
  end
  end
+
+ it 'can be eq(ual) to itself' do
+   expect(message).to eq message
+ end
+
+ it 'can be eq(ual) to other message' do
+   expect(message).to eq message.dup
+   expect(message).not_to equal message.dup
+ end
  end