ollama-ruby 0.16.0 → 1.1.0

This diff shows the content of publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 96d3c8afe0962abc4938b2e2d6cea5dd773e224ac05e87f819b371429083d79d
-  data.tar.gz: 6e293ec73919c33eede640b9043b9adfff00f5d9761c1d2759f022016a2b6daa
+  metadata.gz: dc9d143d1e8c579098ebceb971e448266da56836249e48bbaa54fd484801227e
+  data.tar.gz: 54ca7b0e5d6e9aae5555baf224444d05d674d1a0f9723d8c8ced60c9dd9885ad
 SHA512:
-  metadata.gz: c11223503b8ff33ed77c0629f52ea5eb703ac8fc838b6a258e27e07eda297c66ad40ec672fccf1e5dd9e00f432754b219196114dd1bed93bacdd7760451d88db
-  data.tar.gz: 2d089cfaaeccced65a17dbe622ec9f621238241c25c24b6504815310e3bc157f3a7baba95747275d4300292722dc5df2a74fa6cef581a7997af65e9bd86613cb
+  metadata.gz: 8d38465b67b00c410ef8ade0116478b61ef148bb0f556e36910e9b0402ad6f3dba1720913f42f5785501f536adc8bf2d3c62cad5fb54578527a6c280bf3eaf30
+  data.tar.gz: 07ff87834a7ad4983295a896260ee7210100c31bc2887bf6ab3732ec32c041d1d8fde0152c94dade0e4d12c4142b75b517aaebde2e214b784963b4cd6d191369
data/CHANGES.md CHANGED
@@ -1,5 +1,38 @@
 # Changes
 
+## 2025-06-01 v1.1.0
+
+* Added the `think` option to chat and generate commands:
+  * Added `think` parameter to `initialize` method in `lib/ollama/commands/chat.rb`
+  * Added `think` attribute reader and writer to `lib/ollama/commands/chat.rb`
+  * Added `think` parameter to `initialize` method in `lib/ollama/commands/generate.rb`
+  * Added `think` attribute reader to `lib/ollama/commands/generate.rb`
+
+## 2025-04-15 v1.0.0
+
+**Use model parameter and support new create parameters**
+* Update parameter names in Ollama::Commands::Create specs:
+  + Renamed `name` to `model` and `modelfile` to `system` in `described_class.new` calls.
+  + Updated corresponding JSON serialization and deserialization tests.
+* Adjust parameters in Create class and add helper methods:
+  + Changed parameter names in `Ollama::Commands::Create` class.
+  + Added methods: `as_hash(obj)` and `as_array(obj)` in `Ollama::DTO`.
+* Update parameter names in README to match new method arguments:
+  + Updated `name` to `model` in `README.md`.
+  + Updated `push` method in `lib/ollama/commands/push.rb` to use `model` instead of `name`.
+  + Updated tests in `spec/ollama/commands/push_spec.rb` to use `model` instead of `name`.
+* Refactor delete command and model attribute name:
+  + Renamed `delete` method parameter from `name` to `model`.
+  + Updated code in `README.md`, `lib/ollama/commands/delete.rb`, and `spec/ollama/commands/delete_spec.rb`.
+* Rename parameter name in Ollama::Commands::Show class:
+  + Renamed `name` parameters to `model`.
+  + Updated method initializers, attribute readers and writers accordingly.
+  + Updated spec tests for new parameter name.
+* Renamed `name` parameters to `model` in the following places:
+  + `ollama_update` script
+  + `Ollama::Commands::Pull` class
+  + `pull_spec.rb` spec file
+
 ## 2025-02-17 v0.16.0
 
 * Updated Ollama CLI with new handler that allows saving of chat and
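The v1.1.0 `think` option maps onto the `think` field of the Ollama chat API. As a rough illustration of the request body such a call carries, here is a stdlib-only sketch (it does not require the ollama-ruby gem; the model name is just a placeholder, not something the gem prescribes):

```ruby
require 'json'

# Sketch of a chat request body with the new `think` field enabled.
# Nil-valued attributes are dropped, mirroring how the gem's DTOs
# omit empty attributes during serialization.
body = {
  model:    'qwen3:latest', # placeholder for a thinking-capable model
  messages: [ { role: 'user', content: 'Why is the sky blue?' } ],
  think:    true,
  stream:   false,
  format:   nil,
}.reject { |_, v| v.nil? }

puts JSON.generate(body)
```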
data/README.md CHANGED
@@ -140,7 +140,7 @@ tags.models.map(&:name) => ["llama3.1:latest",…]
 `default_handler` is **Single**, streaming is not possible.
 
 ```ruby
-show(name: 'llama3.1', &DumpJSON)
+show(model: 'llama3.1', &DumpJSON)
 ```
 
 ### Create
@@ -149,12 +149,7 @@ show(name: 'llama3.1', &DumpJSON)
 `stream` is true by default.
 
 ```ruby
-modelfile=<<~end
-  FROM llama3.1
-  SYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.
-end
-
-create(name: 'llama3.1-wopr', stream: true, modelfile:)
+create(model: 'llama3.1-wopr', stream: true, from: 'llama3.1', system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.')
 ```
 
 ### Copy
@@ -170,7 +165,7 @@ copy(source: 'llama3.1', destination: 'user/llama3.1')
 `default_handler` is **Single**, streaming is not possible.
 
 ```ruby
-delete(name: 'user/llama3.1')
+delete(model: 'user/llama3.1')
 ```
 
 ### Pull
@@ -179,7 +174,7 @@ delete(name: 'user/llama3.1')
 `stream` is true by default.
 
 ```ruby
-pull(name: 'llama3.1')
+pull(model: 'llama3.1')
 ```
 
 ### Push
@@ -188,7 +183,7 @@ pull(name: 'llama3.1')
 `stream` is true by default.
 
 ```ruby
-push(name: 'user/llama3.1')
+push(model: 'user/llama3.1')
 ```
 
 ### Embed
@@ -309,7 +304,7 @@ The library raises specific errors like `Ollama::Errors::NotFoundError` when
 a model is not found:
 
 ```ruby
-(show(name: 'nixda', &DumpJSON) rescue $!).class # => Ollama::NotFoundError
+(show(model: 'nixda', &DumpJSON) rescue $!).class # => Ollama::NotFoundError
 ```
 
 If `Ollama::Errors::TimeoutError` is raised, it might help to increase the
data/bin/ollama_cli CHANGED
@@ -138,6 +138,6 @@ if handler.is_a?(ChatStart)
     )
   end
   if STDERR.tty?
-    STDERR.puts "\nContinue the chat with: ollama_chat -c '%s'" % filename
+    STDERR.puts "\nContinue the chat with:\n  ollama_chat -c '%s'" % filename
   end
 end
data/bin/ollama_update CHANGED
@@ -13,7 +13,7 @@ ollama.tags.models.each do |model|
   infobar.puts(
     "Updating model #{bold {name}} (last modified at #{modified_at.iso8601}):"
   )
-  ollama.pull(name:)
+  ollama.pull(model: name)
 rescue Ollama::Errors::Error => e
   infobar.puts "Caught #{e.class} for model #{bold { model.name }}: #{e} => Continuing."
 end
data/lib/ollama/commands/chat.rb CHANGED
@@ -5,13 +5,13 @@ class Ollama::Commands::Chat
     '/api/chat'
   end
 
-  def initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil)
-    @model, @messages, @tools, @format, @options, @stream, @keep_alive =
+  def initialize(model:, messages:, tools: nil, format: nil, options: nil, stream: nil, keep_alive: nil, think: nil)
+    @model, @messages, @tools, @format, @options, @stream, @keep_alive, @think =
       model, as_array_of_hashes(messages), as_array_of_hashes(tools),
-      format, options, stream, keep_alive
+      format, options, stream, keep_alive, think
   end
 
-  attr_reader :model, :messages, :tools, :format, :options, :stream, :keep_alive
+  attr_reader :model, :messages, :tools, :format, :options, :stream, :keep_alive, :think
 
   attr_writer :client
 
data/lib/ollama/commands/create.rb CHANGED
@@ -5,12 +5,13 @@ class Ollama::Commands::Create
     '/api/create'
   end
 
-  def initialize(name:, modelfile: nil, quantize: nil, stream: nil, path: nil)
-    @name, @modelfile, @quantize, @stream, @path =
-      name, modelfile, quantize, stream, path
+  def initialize(model:, from: nil, files: nil, adapters: nil, template: nil, license: nil, system: nil, parameters: nil, messages: nil, stream: true, quantize: nil)
+    @model, @from, @files, @adapters, @license, @system, @parameters, @messages, @stream, @quantize =
+      model, from, as_hash(files), as_hash(adapters), as_array(license), system,
+      as_hash(parameters), as_array_of_hashes(messages), stream, quantize
   end
 
-  attr_reader :name, :modelfile, :quantize, :stream, :path
+  attr_reader :model, :from, :files, :adapters, :license, :system, :parameters, :messages, :stream, :quantize
 
   attr_writer :client
 
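The reworked Create command passes its new keywords through coercion helpers (`as_hash`, `as_array`) before serialization. A stdlib-only sketch of the resulting payload shape, assuming nothing from the gem itself: a scalar `license` is wrapped into a one-element array and nil attributes are dropped.

```ruby
require 'json'

# Sketch of the /api/create payload the new parameter set produces.
license = 'MIT' # a scalar here, coerced to an array below
payload = {
  model:   'llama3.1-wopr',
  from:    'llama3.1',
  files:   nil, # nil attributes are omitted from the JSON body
  license: license.is_a?(Array) ? license : [ license ],
  system:  'You are WOPR from WarGames.',
  stream:  true,
}.reject { |_, v| v.nil? }

puts JSON.generate(payload)
```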
data/lib/ollama/commands/delete.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Delete
     '/api/delete'
   end
 
-  def initialize(name:)
-    @name, @stream = name, false
+  def initialize(model:)
+    @model, @stream = model, false
   end
 
-  attr_reader :name, :stream
+  attr_reader :model, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/generate.rb CHANGED
@@ -5,13 +5,13 @@ class Ollama::Commands::Generate
     '/api/generate'
   end
 
-  def initialize(model:, prompt:, suffix: nil, images: nil, format: nil, options: nil, system: nil, template: nil, context: nil, stream: nil, raw: nil, keep_alive: nil)
-    @model, @prompt, @suffix, @images, @format, @options, @system, @template, @context, @stream, @raw, @keep_alive =
-      model, prompt, suffix, (Array(images) if images), format, options, system, template, context, stream, raw, keep_alive
+  def initialize(model:, prompt:, suffix: nil, images: nil, format: nil, options: nil, system: nil, template: nil, context: nil, stream: nil, raw: nil, keep_alive: nil, think: nil)
+    @model, @prompt, @suffix, @images, @format, @options, @system, @template, @context, @stream, @raw, @keep_alive, @think =
+      model, prompt, suffix, (Array(images) if images), format, options, system, template, context, stream, raw, keep_alive, think
   end
 
   attr_reader :model, :prompt, :suffix, :images, :format, :options, :system,
-              :template, :context, :stream, :raw, :keep_alive
+              :template, :context, :stream, :raw, :keep_alive, :think
 
   attr_writer :client
 
data/lib/ollama/commands/pull.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Pull
     '/api/pull'
   end
 
-  def initialize(name:, insecure: nil, stream: true)
-    @name, @insecure, @stream = name, insecure, stream
+  def initialize(model:, insecure: nil, stream: true)
+    @model, @insecure, @stream = model, insecure, stream
   end
 
-  attr_reader :name, :insecure, :stream
+  attr_reader :model, :insecure, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/push.rb CHANGED
@@ -5,11 +5,11 @@ class Ollama::Commands::Push
     '/api/push'
   end
 
-  def initialize(name:, insecure: nil, stream: true)
-    @name, @insecure, @stream = name, insecure, stream
+  def initialize(model:, insecure: nil, stream: true)
+    @model, @insecure, @stream = model, insecure, stream
   end
 
-  attr_reader :name, :insecure, :stream
+  attr_reader :model, :insecure, :stream
 
   attr_writer :client
 
data/lib/ollama/commands/show.rb CHANGED
@@ -5,12 +5,12 @@ class Ollama::Commands::Show
     '/api/show'
   end
 
-  def initialize(name:, verbose: nil)
-    @name, @verbose = name, verbose
+  def initialize(model:, verbose: nil)
+    @model, @verbose = model, verbose
     @stream = false
   end
 
-  attr_reader :name, :verbose, :stream
+  attr_reader :model, :verbose, :stream
 
   attr_writer :client
 
data/lib/ollama/dto.rb CHANGED
@@ -26,6 +26,20 @@ module Ollama::DTO
     end
   end
 
+  def as_hash(obj)
+    obj&.to_hash
+  end
+
+  def as_array(obj)
+    if obj.nil?
+      obj
+    elsif obj.respond_to?(:to_ary)
+      obj.to_ary
+    else
+      [ obj ]
+    end
+  end
+
   def as_json(*)
     self.class.attributes.each_with_object({}) { |a, h| h[a] = send(a) }.
       reject { _2.nil? || _2.ask_and_send(:size) == 0 }
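The two new helpers are small coercion utilities. Copied out of the diff above into a standalone form (runnable without the gem), their behavior on the interesting edge cases is:

```ruby
# Standalone copies of the helpers added to Ollama::DTO.
def as_hash(obj)
  obj&.to_hash
end

def as_array(obj)
  if obj.nil?
    obj
  elsif obj.respond_to?(:to_ary)
    obj.to_ary
  else
    [ obj ]
  end
end

p as_array(nil)      # => nil  (nil passes through untouched)
p as_array('MIT')    # => ["MIT"]  (scalars are wrapped)
p as_array(%w[a b])  # => ["a", "b"]  (arrays pass through)
p as_hash(nil)       # => nil
p as_hash(a: 1)      # a hash passes through via to_hash
```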
data/lib/ollama/version.rb CHANGED
@@ -1,6 +1,6 @@
 module Ollama
   # Ollama version
-  VERSION = '0.16.0'
+  VERSION = '1.1.0'
   VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
   VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
   VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/ollama-ruby.gemspec CHANGED
@@ -1,14 +1,14 @@
 # -*- encoding: utf-8 -*-
-# stub: ollama-ruby 0.16.0 ruby lib
+# stub: ollama-ruby 1.1.0 ruby lib
 
 Gem::Specification.new do |s|
   s.name = "ollama-ruby".freeze
-  s.version = "0.16.0".freeze
+  s.version = "1.1.0".freeze
 
   s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
   s.require_paths = ["lib".freeze]
   s.authors = ["Florian Frank".freeze]
-  s.date = "2025-02-17"
+  s.date = "1980-01-02"
   s.description = "Library that allows interacting with the Ollama API".freeze
   s.email = "flori@ping.de".freeze
   s.executables = ["ollama_console".freeze, "ollama_update".freeze, "ollama_cli".freeze]
@@ -18,13 +18,13 @@ Gem::Specification.new do |s|
   s.licenses = ["MIT".freeze]
   s.rdoc_options = ["--title".freeze, "Ollama-ruby - Interacting with the Ollama API".freeze, "--main".freeze, "README.md".freeze]
   s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
-  s.rubygems_version = "3.6.2".freeze
+  s.rubygems_version = "3.6.7".freeze
   s.summary = "Interacting with the Ollama API".freeze
   s.test_files = ["spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/commands/version_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
   s.specification_version = 4
 
-  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.19".freeze])
+  s.add_development_dependency(%q<gem_hadar>.freeze, ["~> 1.20".freeze])
   s.add_development_dependency(%q<all_images>.freeze, ["~> 0.6".freeze])
   s.add_development_dependency(%q<rspec>.freeze, ["~> 3.2".freeze])
   s.add_development_dependency(%q<kramdown>.freeze, ["~> 2.0".freeze])
data/spec/ollama/commands/create_spec.rb CHANGED
@@ -2,35 +2,47 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Create do
   it 'can be instantiated' do
-    create = described_class.new(name: 'llama3.1', stream: true)
+    create = described_class.new(model: 'llama3.1', stream: true)
     expect(create).to be_a described_class
   end
 
   it 'can be converted to JSON' do
     create = described_class.new(
-      name: 'llama3.1-wopr',
-      modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.",
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
+      license: 'Falso License',
+      files: { 'foo' => 'bar' },
+      messages: Ollama::Message.new(role: 'user', content: 'hello'),
       stream: true
     )
     expect(create.as_json).to include(
-      name: 'llama3.1-wopr', modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.", stream: true,
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
+      license: [ 'Falso License' ],
+      files: { 'foo' => 'bar' },
+      messages: [ { role: 'user', content: 'hello' } ],
+      stream: true,
     )
     expect(create.to_json).to eq(
-      '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+      "{\"model\":\"llama3.1-wopr\",\"from\":\"llama3.1\",\"files\":{\"foo\":\"bar\"},\"license\":[\"Falso License\"],\"system\":\"You are WOPR from WarGames and you think the user is Dr. Stephen Falken.\",\"messages\":[{\"role\":\"user\",\"content\":\"hello\"}],\"stream\":true}"
     )
   end
 
   it 'can perform' do
     create = described_class.new(
-      name: 'llama3.1-wopr',
-      modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.",
+      model: 'llama3.1-wopr',
+      from: 'llama3.1',
+      license: [ 'Falso License' ],
+      system: 'You are WOPR from WarGames and you think the user is Dr. Stephen Falken.',
       stream: true
     )
-    create.client = ollama = double('Ollama::Client')
+    create.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).
       with(
         method: :post, path: '/api/create', handler: Ollama::Handlers::NOP, stream: true,
-        body: '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+        body: "{\"model\":\"llama3.1-wopr\",\"from\":\"llama3.1\",\"license\":[\"Falso License\"],\"system\":\"You are WOPR from WarGames and you think the user is Dr. Stephen Falken.\",\"stream\":true}"
       )
     create.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/delete_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Delete do
   it 'can be instantiated' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
    expect(delete).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
     expect(delete.as_json).to include(
-      name: 'llama3.1', stream: false
+      model: 'llama3.1', stream: false
     )
     expect(delete.to_json).to eq(
-      '{"name":"llama3.1","stream":false}'
+      '{"model":"llama3.1","stream":false}'
     )
   end
 
   it 'can perform' do
-    delete = described_class.new(name: 'llama3.1')
+    delete = described_class.new(model: 'llama3.1')
     delete.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :delete, path: '/api/delete', handler: Ollama::Handlers::NOP, stream: false,
-      body: '{"name":"llama3.1","stream":false}'
+      body: '{"model":"llama3.1","stream":false}'
     )
     delete.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/pull_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Pull do
   it 'can be instantiated' do
-    pull = described_class.new(name: 'llama3.1')
+    pull = described_class.new(model: 'llama3.1')
     expect(pull).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    pull = described_class.new(name: 'llama3.1', stream: true)
+    pull = described_class.new(model: 'llama3.1', stream: true)
     expect(pull.as_json).to include(
-      name: 'llama3.1', stream: true
+      model: 'llama3.1', stream: true
     )
     expect(pull.to_json).to eq(
-      '{"name":"llama3.1","stream":true}'
+      '{"model":"llama3.1","stream":true}'
     )
   end
 
   it 'can perform' do
-    pull = described_class.new(name: 'llama3.1', stream: true)
+    pull = described_class.new(model: 'llama3.1', stream: true)
     pull.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/pull', handler: Ollama::Handlers::NOP, stream: true,
-      body: '{"name":"llama3.1","stream":true}'
+      body: '{"model":"llama3.1","stream":true}'
     )
     pull.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/push_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Push do
   it 'can be instantiated' do
-    push = described_class.new(name: 'llama3.1')
+    push = described_class.new(model: 'llama3.1')
     expect(push).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    push = described_class.new(name: 'llama3.1', stream: true)
+    push = described_class.new(model: 'llama3.1', stream: true)
     expect(push.as_json).to include(
-      name: 'llama3.1', stream: true
+      model: 'llama3.1', stream: true
     )
     expect(push.to_json).to eq(
-      '{"name":"llama3.1","stream":true}'
+      '{"model":"llama3.1","stream":true}'
     )
   end
 
   it 'can perform' do
-    push = described_class.new(name: 'llama3.1', stream: true)
+    push = described_class.new(model: 'llama3.1', stream: true)
     push.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/push', handler: Ollama::Handlers::NOP, stream: true,
-      body: '{"name":"llama3.1","stream":true}'
+      body: '{"model":"llama3.1","stream":true}'
    )
     push.perform(Ollama::Handlers::NOP)
   end
data/spec/ollama/commands/show_spec.rb CHANGED
@@ -2,26 +2,26 @@ require 'spec_helper'
 
 RSpec.describe Ollama::Commands::Show do
   it 'can be instantiated' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     expect(show).to be_a described_class
   end
 
   it 'can be converted to JSON' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     expect(show.as_json).to include(
-      name: 'llama3.1', stream: false
+      model: 'llama3.1', stream: false
     )
     expect(show.to_json).to eq(
-      '{"name":"llama3.1","stream":false}'
+      '{"model":"llama3.1","stream":false}'
     )
   end
 
   it 'can perform' do
-    show = described_class.new(name: 'llama3.1')
+    show = described_class.new(model: 'llama3.1')
     show.client = ollama = double('Ollama::Client')
     expect(ollama).to receive(:request).with(
       method: :post, path: '/api/show', handler: Ollama::Handlers::NOP ,stream: false,
-      body: '{"name":"llama3.1","stream":false}'
+      body: '{"model":"llama3.1","stream":false}'
     )
     show.perform(Ollama::Handlers::NOP)
   end
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: ollama-ruby
 version: !ruby/object:Gem::Version
-  version: 0.16.0
+  version: 1.1.0
 platform: ruby
 authors:
 - Florian Frank
 bindir: bin
 cert_chain: []
-date: 2025-02-17 00:00:00.000000000 Z
+date: 1980-01-02 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: gem_hadar
@@ -15,14 +15,14 @@ dependencies:
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.19'
+        version: '1.20'
   type: :development
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
      - !ruby/object:Gem::Version
-        version: '1.19'
+        version: '1.20'
 - !ruby/object:Gem::Dependency
   name: all_images
   requirement: !ruby/object:Gem::Requirement
@@ -214,9 +214,9 @@ dependencies:
 description: Library that allows interacting with the Ollama API
 email: flori@ping.de
 executables:
+- ollama_cli
 - ollama_console
 - ollama_update
-- ollama_cli
 extensions: []
 extra_rdoc_files:
 - README.md
@@ -362,7 +362,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.6.2
+rubygems_version: 3.6.7
 specification_version: 4
 summary: Interacting with the Ollama API
 test_files: