ollama-ruby 0.15.0 → 1.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 44b2b4e2384b4f05a8457cf9dbcd7d9c654ceff267080245d5d7a1915554743b
-  data.tar.gz: b03c72fed7968802c87681464b6bcd365f05906a351f7cdcde61dc60661f15cd
+  metadata.gz: 6b75645170e207cb661e3b98fed0754610a8807012715343c674206a66b0b72a
+  data.tar.gz: 64617fa5bc78f5658db02ea7ae61dee6c537924951de022f2aaddda9ae933131
 SHA512:
-  metadata.gz: 7ab16b2e5b23dbe1b94f240124c70a9073184af44d7cbdc31c6fedab92960f30b0a605c4e499eb7ac0a93f7cdf2b72ea62a380d4ea29e2d5d79d2667eabfc1c0
-  data.tar.gz: 20e6c0e290ad0c076fede5bde2d0959da37697e17d0ed90fb6e136bc8f6ab3c4084202086299a38b27ca62084dafb6979394b9bb2d52c47f553ca2eeaa32959f
+  metadata.gz: 9e2968fca36a1c4345c0f948d12252d3f501186f24f5c433d824dd2cd8cfda8afeb65d7073655cca1f74ef2ff4f14546747ebf08f15cf842c2065a33e714d953
+  data.tar.gz: f09c3d81093347f898af01386a2f84bd9a97e99b6c7e33e41eaf3eb3261815107ac12723ba4fb00e7180203396c95f0e20f55b3cbbc65dc5d3c5e0dc92753ffe
data/CHANGES.md CHANGED
@@ -1,5 +1,41 @@
 # Changes
 
+## 2025-04-15 v1.0.0
+
+**Use model parameter and support new create parameters**
+* Update parameter names in Ollama::Commands::Create specs:
+  + Renamed `name` to `model` and `modelfile` to `system` in `described_class.new` calls.
+  + Updated corresponding JSON serialization and deserialization tests.
+* Adjust parameters in Create class and add helper methods:
+  + Changed parameter names in `Ollama::Commands::Create` class.
+  + Added methods: `as_hash(obj)` and `as_array(obj)` in `Ollama::DTO`.
+* Update parameter names in README to match new method arguments:
+  + Updated `name` to `model` in `README.md`.
+  + Updated `push` method in `lib/ollama/commands/push.rb` to use `model` instead of `name`.
+  + Updated tests in `spec/ollama/commands/push_spec.rb` to use `model` instead of `name`.
+* Refactor delete command and model attribute name:
+  + Renamed `delete` method parameter from `name` to `model`.
+  + Updated code in README.md, lib/ollama/commands/delete.rb, and spec/ollama/commands/delete_spec.rb.
+* Rename parameter name in Ollama::Commands::Show class:
+  + Renamed `name` parameters to `model`.
+  + Updated method initializers, attribute readers and writers accordingly.
+  + Updated spec tests for new parameter name.
+* Renamed `name` parameters to `model` in the following places:
+  + `ollama_update` script
+  + `Ollama::Commands::Pull` class
+  + `pull_spec.rb` spec file
+
+## 2025-02-17 v0.16.0
+
+* Updated Ollama CLI with new handler that allows saving of chat and
+  continuation with `ollama_chat`:
+  * Added `require 'tins/xt/secure_write'` and `require 'tmpdir'`.
+  * Created a new `ChatStart` class that handles chat responses.
+  * Updated options parsing to use `ChatStart` as the default handler.
+  * Changed code to handle `ChatStart` instances.
+  * Added secure write functionality for chat conversation in tmpdir.
+* Added `yaml-dev` to `apk add` command in `.all_images.yml`
+
 ## 2025-02-12 v0.15.0
 
 * Added "version" command to display version of the ollama server:
data/README.md CHANGED
@@ -140,7 +140,7 @@ tags.models.map(&:name) => ["llama3.1:latest",…]
 `default_handler` is **Single**, streaming is not possible.
 
 ```ruby
-show(name: 'llama3.1', &DumpJSON)
+show(model: 'llama3.1', &DumpJSON)
 ```
 
 ### Create
@@ -149,12 +149,7 @@ show(name: 'llama3.1', &DumpJSON)
 `stream` is true by default.
 
 ```ruby
-modelfile=<<~end
-  FROM llama3.1
-  SYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.
-end
-
-create(name: 'llama3.1-wopr', stream: true, modelfile:)
+create(model: 'llama3.1-wopr', stream: true, from: 'llama3.1', system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.')
 ```
 
 ### Copy
@@ -170,7 +165,7 @@ copy(source: 'llama3.1', destination: 'user/llama3.1')
 `default_handler` is **Single**, streaming is not possible.
 
 ```ruby
-delete(name: 'user/llama3.1')
+delete(model: 'user/llama3.1')
 ```
 
 ### Pull
@@ -179,7 +174,7 @@ delete(model: 'user/llama3.1')
 `stream` is true by default.
 
 ```ruby
-pull(name: 'llama3.1')
+pull(model: 'llama3.1')
 ```
 
 ### Push
@@ -188,7 +183,7 @@ pull(model: 'llama3.1')
 `stream` is true by default.
 
 ```ruby
-push(name: 'user/llama3.1')
+push(model: 'user/llama3.1')
 ```
 
 ### Embed
@@ -309,7 +304,7 @@ The library raises specific errors like `Ollama::Errors::NotFoundError` when
 a model is not found:
 
 ```ruby
-(show(name: 'nixda', &DumpJSON) rescue $!).class # => Ollama::NotFoundError
+(show(model: 'nixda', &DumpJSON) rescue $!).class # => Ollama::NotFoundError
 ```
 
 If `Ollama::Errors::TimeoutError` is raised, it might help to increase the
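The `(… rescue $!).class` idiom in the error-handling example above can be tried without a running Ollama server. A minimal sketch with a stand-in error class (`FakeNotFoundError` and `show_missing_model` are hypothetical, not part of the gem): if the expression raises, `$!` holds the current exception, so the parenthesized expression evaluates to the exception object instead of propagating it.

```ruby
# Stand-in for Ollama::Errors::NotFoundError; not part of the gem.
class FakeNotFoundError < StandardError; end

# Stand-in for a client call against a model that does not exist.
def show_missing_model
  raise FakeNotFoundError, 'model "nixda" not found'
end

# `expr rescue $!` swallows the exception and yields the exception object.
result = (show_missing_model rescue $!)
puts result.class   # prints FakeNotFoundError
puts result.message # prints the error message
```

This is handy in a console for inspecting which error class a call produces, as the README does with `Ollama::NotFoundError`.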
data/bin/ollama_cli CHANGED
@@ -4,7 +4,30 @@ require 'ollama'
 include Ollama
 require 'tins'
 include Tins::GO
+require 'tins/xt/secure_write'
 require 'json'
+require 'tmpdir'
+
+class ChatStart
+  include Ollama::Handlers::Concern
+
+  def initialize(output: $stdout)
+    super
+    @output.sync = true
+    @content = ''
+  end
+
+  attr_reader :content
+
+  def call(response)
+    if content = response.response
+      @content << content
+      @output << content
+    end
+    response.done and @output.puts
+    self
+  end
+end
 
 # Returns the contents of a file or string, or a default value if neither is provided.
 #
@@ -49,7 +72,7 @@ def usage
   -p PROMPT   the user prompt to use as a file, $OLLAMA_PROMPT
               if it contains %{stdin} it is substituted by stdin input
   -P VARIABLE sets prompt var %{foo} to "bar" if VARIABLE is foo=bar
-  -H HANDLER  the handler to use for the response, defaults to Print
+  -H HANDLER  the handler to use for the response, defaults to ChatStart
   -S          use streaming for generation
   -h          this help
 
@@ -57,7 +80,7 @@ def usage
   exit 0
 end
 
-opts = go 'u:m:M:s:p:P:H:Sh', defaults: { ?H => 'Print', ?M => '{}' }
+opts = go 'u:m:M:s:p:P:H:Sh', defaults: { ?H => 'ChatStart', ?M => '{}' }
 
 opts[?h] and usage
 
@@ -91,11 +114,30 @@ if ENV['DEBUG'].to_i == 1
   EOT
 end
 
+handler = Object.const_get(opts[?H])
+handler == ChatStart and handler = handler.new
+
 Client.new(base_url:, read_timeout: 120).generate(
   model:,
   system:,
   prompt:,
   options:,
   stream: !!opts[?S],
-  &Object.const_get(opts[?H])
+  &handler
 )
+
+if handler.is_a?(ChatStart)
+  filename = File.join(Dir.tmpdir, 'chat_start_%u.json' % $$)
+  File.secure_write(filename) do |out|
+    JSON.dump(
+      [
+        Message.new(role: 'user', content: prompt),
+        Message.new(role: 'assistant', content: handler.content),
+      ],
+      out
+    )
+  end
+  if STDERR.tty?
+    STDERR.puts "\nContinue the chat with:\n  ollama_chat -c '%s'" % filename
+  end
+end
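The `ChatStart` handler added above accumulates streamed response chunks so the finished text can be written to a file afterwards. The same accumulation pattern can be sketched standalone; here `FakeResponse` and `ChunkCollector` are stand-ins for the gem's response objects and handler, not part of the library:

```ruby
require 'stringio'

# Stand-in for the streamed response objects a handler receives:
# `response` carries a text chunk, `done` marks the final message.
FakeResponse = Struct.new(:response, :done, keyword_init: true)

# Mirrors ChatStart's logic: append each chunk to an internal
# buffer while echoing it to the output stream.
class ChunkCollector
  def initialize(output: $stdout)
    @output  = output
    @content = ''
  end

  attr_reader :content

  def call(response)
    if chunk = response.response
      @content << chunk
      @output  << chunk
    end
    @output.puts if response.done
    self
  end
end

out = StringIO.new
collector = ChunkCollector.new(output: out)
collector.call(FakeResponse.new(response: 'Hello, ', done: false))
collector.call(FakeResponse.new(response: 'Falken.', done: true))
collector.content # => "Hello, Falken."
```

Keeping the full `content` around is what lets the script above dump the user/assistant message pair as JSON for `ollama_chat -c` to resume.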
data/bin/ollama_update CHANGED
@@ -13,7 +13,7 @@ ollama.tags.models.each do |model|
   infobar.puts(
     "Updating model #{bold {name}} (last modified at #{modified_at.iso8601}):"
   )
-  ollama.pull(name:)
+  ollama.pull(model: name)
 rescue Ollama::Errors::Error => e
   infobar.puts "Caught #{e.class} for model #{bold { model.name }}: #{e} => Continuing."
 end
data/lib/ollama/commands/create.rb CHANGED
@@ -5,12 +5,13 @@ class Ollama::Commands::Create
     '/api/create'
   end
 
-  def initialize(name:, modelfile: nil, quantize: nil, stream: nil, path: nil)
-    @name, @modelfile, @quantize, @stream, @path =
-      name, modelfile, quantize, stream, path
+  def initialize(model:, from: nil, files: nil, adapters: nil, template: nil, license: nil, system: nil, parameters: nil, messages: nil, stream: true, quantize: nil)
+    @model, @from, @files, @adapters, @license, @system, @parameters, @messages, @stream, @quantize =
+      model, from, as_hash(files), as_hash(adapters), as_array(license), system,
+      as_hash(parameters), as_array_of_hashes(messages), stream, quantize
   end
 
-  attr_reader :name, :modelfile, :quantize, :stream, :path
+  attr_reader :model, :from, :files, :adapters, :license, :system, :parameters, :messages, :stream, :quantize
 
   attr_writer :client
 
data/lib/ollama/commands/delete.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Delete
     '/api/delete'
   end
 
-  def initialize(name:)
-    @name, @stream = name, false
+  def initialize(model:)
+    @model, @stream = model, false
   end
 
-  attr_reader :name, :stream
+  attr_reader :model, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/pull.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Pull
     '/api/pull'
   end
 
-  def initialize(name:, insecure: nil, stream: true)
-    @name, @insecure, @stream = name, insecure, stream
+  def initialize(model:, insecure: nil, stream: true)
+    @model, @insecure, @stream = model, insecure, stream
   end
 
-  attr_reader :name, :insecure, :stream
+  attr_reader :model, :insecure, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/push.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Push
     '/api/push'
   end
 
-  def initialize(name:, insecure: nil, stream: true)
-    @name, @insecure, @stream = name, insecure, stream
+  def initialize(model:, insecure: nil, stream: true)
+    @model, @insecure, @stream = model, insecure, stream
   end
 
-  attr_reader :name, :insecure, :stream
+  attr_reader :model, :insecure, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/show.rb CHANGED
@@ -5,12 +5,12 @@ class Ollama::Commands::Show
     '/api/show'
   end
 
-  def initialize(name:, verbose: nil)
-    @name, @verbose = name, verbose
+  def initialize(model:, verbose: nil)
+    @model, @verbose = model, verbose
     @stream = false
   end
 
-  attr_reader :name, :verbose, :stream
+  attr_reader :model, :verbose, :stream
 
   attr_writer :client
 
data/lib/ollama/dto.rb CHANGED
@@ -26,6 +26,20 @@ module Ollama::DTO
     end
   end
 
+  def as_hash(obj)
+    obj&.to_hash
+  end
+
+  def as_array(obj)
+    if obj.nil?
+      obj
+    elsif obj.respond_to?(:to_ary)
+      obj.to_ary
+    else
+      [ obj ]
+    end
+  end
+
   def as_json(*)
     self.class.attributes.each_with_object({}) { |a, h| h[a] = send(a) }.
       reject { _2.nil? || _2.ask_and_send(:size) == 0 }
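The two helpers added above are small coercion shims; their behavior can be checked in isolation. Copied here as plain top-level methods (outside the `Ollama::DTO` module) so the sketch runs without the gem:

```ruby
# Standalone copy of Ollama::DTO#as_hash from the diff above:
# nil passes through, anything hash-like is converted via to_hash.
def as_hash(obj)
  obj&.to_hash
end

# Standalone copy of Ollama::DTO#as_array from the diff above:
# nil passes through, arrays stay arrays, scalars get wrapped.
def as_array(obj)
  if obj.nil?
    obj
  elsif obj.respond_to?(:to_ary)
    obj.to_ary
  else
    [ obj ]
  end
end

as_array(nil)               # => nil
as_array('Falso License')   # => ["Falso License"]
as_array(%w[ a b ])         # => ["a", "b"]
as_hash('foo' => 'bar')     # => {"foo"=>"bar"}
```

This is why the Create spec below can pass a scalar `license:` yet expect `license: [ 'Falso License' ]` in the serialized output.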
data/lib/ollama/version.rb CHANGED
@@ -1,6 +1,6 @@
 module Ollama
   # Ollama version
-  VERSION = '0.15.0'
+  VERSION = '1.0.0'
   VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
   VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
   VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/ollama-ruby.gemspec CHANGED
@@ -1,14 +1,14 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama-ruby 0.15.0 ruby lib
+# stub: ollama-ruby 1.0.0 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama-ruby".freeze
-  s.version = "0.15.0".freeze
+  s.version = "1.0.0".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
   s.authors = ["Florian Frank".freeze]
-  s.date = "2025-02-12"
+  s.date = "2025-04-14"
   s.description = "Library that allows interacting with the Ollama API".freeze
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_console".freeze, "ollama_update".freeze, "ollama_cli".freeze]
@@ -24,7 +24,7 @@ Gem::Specification.new do |s|
 
   s.specification_version = 4
 
-  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.19".freeze])
+  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.20".freeze])
   s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
   s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
   s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
data/spec/ollama/commands/create_spec.rb CHANGED
@@ -2,35 +2,47 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Create do
   it 'can be instantiated' do
-    create = described_class.new(name: 'llama3.1', stream: true)
+    create = described_class.new(model: 'llama3.1', stream: true)
     expect(create).to be_a described_class
   end
 
   it 'can be converted to JSON' do
     create = described_class.new(
-      name: 'llama3.1-wopr',
-      modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.",
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
+      license: 'Falso License',
+      files: { 'foo' => 'bar' },
+      messages: Ollama::Message.new(role: 'user', content: 'hello'),
       stream: true
     )
     expect(create.as_json).to include(
-      name: 'llama3.1-wopr', modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.", stream: true,
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
+      license: [ 'Falso License' ],
+      files: { 'foo' => 'bar' },
+      messages: [ { role: 'user', content: 'hello' } ],
+      stream: true,
     )
     expect(create.to_json).to eq(
-      '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+      "{\"model\":\"llama3.1-wopr\",\"from\":\"llama3.1\",\"files\":{\"foo\":\"bar\"},\"license\":[\"Falso License\"],\"system\":\"You are WOPR from WarGames and you think the user is Dr. Stephen Falken.\",\"messages\":[{\"role\":\"user\",\"content\":\"hello\"}],\"stream\":true}"
     )
   end
 
   it 'can perform' do
     create = described_class.new(
-      name: 'llama3.1-wopr',
-      modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.",
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      license: [ 'Falso License' ],
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
       stream: true
     )
-    create.client = ollama = double('Ollama::Client')
+    create.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).
       with(
         method: :post, path: '/api/create', handler: Ollama::Handlers::NOP, stream: true,
-        body: '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+        body: "{\"model\":\"llama3.1-wopr\",\"from\":\"llama3.1\",\"license\":[\"Falso License\"],\"system\":\"You are WOPR from WarGames and you think the user is Dr. Stephen Falken.\",\"stream\":true}"
       )
     create.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/delete_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Delete do
   it 'can be instantiated' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
     expect(delete).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
     expect(delete.as_json).to include(
-      name: 'llama3.1', stream: false
+      model: 'llama3.1', stream: false
    )
     expect(delete.to_json).to eq(
-      '{"name":"llama3.1","stream":false}'
+      '{"model":"llama3.1","stream":false}'
     )
   end
 
   it 'can perform' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
     delete.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :delete, path: '/api/delete', handler: Ollama::Handlers::NOP, stream: false,
-      body: '{"name":"llama3.1","stream":false}'
+      body: '{"model":"llama3.1","stream":false}'
     )
     delete.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/pull_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Pull do
   it 'can be instantiated' do
-    pull = described_class.new(name: 'llama3.1')
+    pull = described_class.new(model: 'llama3.1')
     expect(pull).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    pull = described_class.new(name: 'llama3.1', stream: true)
+    pull = described_class.new(model: 'llama3.1', stream: true)
     expect(pull.as_json).to include(
-      name: 'llama3.1', stream: true
+      model: 'llama3.1', stream: true
     )
     expect(pull.to_json).to eq(
-      '{"name":"llama3.1","stream":true}'
+      '{"model":"llama3.1","stream":true}'
     )
   end
 
   it 'can perform' do
-    pull = described_class.new(name: 'llama3.1', stream: true)
+    pull = described_class.new(model: 'llama3.1', stream: true)
     pull.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/pull', handler: Ollama::Handlers::NOP, stream: true,
-      body: '{"name":"llama3.1","stream":true}'
+      body: '{"model":"llama3.1","stream":true}'
     )
     pull.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/push_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Push do
   it 'can be instantiated' do
-    push = described_class.new(name: 'llama3.1')
+    push = described_class.new(model: 'llama3.1')
     expect(push).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    push = described_class.new(name: 'llama3.1', stream: true)
+    push = described_class.new(model: 'llama3.1', stream: true)
     expect(push.as_json).to include(
-      name: 'llama3.1', stream: true
+      model: 'llama3.1', stream: true
     )
     expect(push.to_json).to eq(
-      '{"name":"llama3.1","stream":true}'
+      '{"model":"llama3.1","stream":true}'
     )
   end
 
   it 'can perform' do
-    push = described_class.new(name: 'llama3.1', stream: true)
+    push = described_class.new(model: 'llama3.1', stream: true)
     push.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/push', handler: Ollama::Handlers::NOP, stream: true,
-      body: '{"name":"llama3.1","stream":true}'
+      body: '{"model":"llama3.1","stream":true}'
    )
     push.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/show_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Show do
   it 'can be instantiated' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     expect(show).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     expect(show.as_json).to include(
-      name: 'llama3.1', stream: false
+      model: 'llama3.1', stream: false
     )
     expect(show.to_json).to eq(
-      '{"name":"llama3.1","stream":false}'
+      '{"model":"llama3.1","stream":false}'
     )
   end
 
   it 'can perform' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     show.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/show', handler: Ollama::Handlers::NOP, stream: false,
-      body: '{"name":"llama3.1","stream":false}'
+      body: '{"model":"llama3.1","stream":false}'
     )
     show.perform(Ollama::Handlers::NOP)
   end
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: ollama-ruby
 version: !ruby/object:Gem::Version
-  version: 0.15.0
+  version: 1.0.0
 platform: ruby
 authors:
 - Florian Frank
 bindir: bin
 cert_chain: []
-date: 2025-02-12 00:00:00.000000000 Z
+date: 2025-04-14 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: gem_hadar
@@ -15,14 +15,14 @@ dependencies:
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.19'
+        version: '1.20'
   type: :development
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
    - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.19'
+        version: '1.20'
 - !ruby/object:Gem::Dependency
   name: all_images
   requirement: !ruby/object:Gem::Requirement