ollama-ruby 0.1.0 → 0.2.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 9f88e75feb900230387f4b960f5762daa8622af0f9562c0145d6b22c4a01fb3d
- data.tar.gz: 4c6fea6ddc54c83ac6d003d4e5f3ccb6b90195b79a97aea6d98a95e32db36f99
+ metadata.gz: 1f80ae8ee6e8acbbedfff8b56923b25b583fc60b96c733985e32908874d542bb
+ data.tar.gz: 80834beb676929f08f4216e373e56cadd12d16c0b50e95b1a599bdd48bf29c86
  SHA512:
- metadata.gz: 9002b36cd15680f89c50d6f3ee34b3a4e5d77362cd4982d4ca41fc9abbd0bcd1d44483579686f3bf9f20f9e10b1309c28ae8da789ba8ada66c7b5f6384643f87
- data.tar.gz: e2566f950aea15cf47d60381340db2b0e7a8e3a2fd79aaa8ee421f6cbd367baeae0f64b24e5d9baa801bd715695edf80c5a88d0ba41867c9358bc3076d1ed112
+ metadata.gz: 94823ec618f940056bcee6ac7d706c633087404efcb49122cdccc89a4596b4a5f14a0d1c1d60fd7bb13a28e10f3ba4dc2027cd96cfe384169b25acaf75f532b7
+ data.tar.gz: ef8d4a7001c5502bc787f074f10dfaf16692b8fa90c4a6e32ae3c0c75b94da45b4691094229106cec72438837eef57b933cd5e5c9b28e63d0d4a598c152a9c32
data/CHANGES.md ADDED
@@ -0,0 +1,78 @@
+ # Changes
+
+ ## 2024-09-03 v0.2.0
+
+ ### Changes
+
+ * **Added Web Search Functionality to `ollama_chat`**
+ + Added `/web` command to fetch search results from DuckDuckGo
+ + Updated `/summarize` command to handle cases where summarization fails
+ + Fixed a bug in parsing the content type of source documents
+ * **Refactored Options Class and Usage**
+ + Renamed `options` variable to use `Options[]` method in ollama_chat script
+ + Added `[](value)` method to Ollama::Options class for casting hashes
+ + Updated options_spec.rb with tests for casting hashes and error handling
+ * **Refactored Web Search Command**
+ + Added support for specifying a page number in `/web` command
+ + Updated regular expression to match the new format
+ + Passed page number as an argument to the `search_web` method
+ + Updated content string to reference the query and sources correctly
+ * **DTO Class Changes**
+ + Renamed `json_create` method to `from_hash` in Ollama::DTO class
+ + Updated `as_json` method to remove the now unnecessary hash creation
+ * **Message and Tool Spec Changes**
+ + Removed `json_class` from JSON serialization in message_spec
+ + Removed `json_class` from JSON serialization in tool_spec
+ * **Command Spec Changes**
+ + Removed `json_class` from JSON serialization in various command specs (generate_spec, pull_spec, and others)
+ * **Miscellaneous Changes**
+ + Improved width calculation for text truncation
+ + Updated FollowChat class to display evaluation statistics
+ + Updated OllamaChatConfig to use EOT instead of end for heredoc syntax
+ + Added .keep file to tmp directory
+
+ ## 2024-08-30 v0.1.0
+
+ ### Change Log for New Version
+
+ #### Significant Changes
+
+ * **Document Splitting and Embedding Functionality**: Added `Ollama::Documents` class with methods for adding documents, checking existence, deleting documents, and finding similar documents.
+ + Introduced two types of caches: `MemoryCache` and `RedisCache`
+ + Implemented `SemanticSplitter` class to split text into sentences based on semantic similarity
+ * **Improved Ollama Chat Client**: Added support for document embeddings and web/file RAG
+ + Allowed configuration via YAML file
+ + Parsed user input for URLs or files to send images to multimodal models
+ * **Redis Docker Service**: Set `REDIS_URL` environment variable to `redis://localhost:9736`
+ + Added Redis service to `docker-compose.yml`
+ * **Status Display and Progress Updates**: Set `infobar.label` to `response.status` when available
+ + Updated infobar with a progress message on each call if total and completed are set
+ + Displayed the error message from `response.error` if present
+ * **Refactored Chat Commands**: Simplified regular expression patterns for `/pop`, `/save`, `/load`, and `/image` commands
+ + Added whitespace to some command patterns for better readability
+
+ #### Other Changes
+
+ * Added `Character` and `RecursiveCharacter` splitter classes to split text into chunks based on character separators
+ * Added RSpec tests for the Ollama::Documents class(es)
+ * Updated dependencies and added new methods for calculating breakpoint thresholds and sentence embeddings
+ * Added 'ollama_update' to executables in Rakefile
+ * Started using webmock
+ * Refactored chooser and added fetcher specs
+ * Added tests for Ollama::Utils::Fetcher
+ * Updated README.md
+
+ ## 2024-08-16 v0.0.1
+
+ * **New Features**
+ + Added missing options parameter to Embed command
+ + Documented new `/api/embed` endpoint
+ * **Improvements**
+ + Improved example in README.md
+ * **Code Refactoring**
+ + Renamed `client` to `ollama` in client and command specs
+ + Updated expectations to use `ollama` instead of `client`
+
+ ## 2024-08-12 v0.0.0
+
+ * Start
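The `json_class`-free serialization described above boils down to a plain-hash round trip. The sketch below illustrates the pattern with a hypothetical `Msg` class (not the gem's actual code): `to_json` emits an ordinary hash, and a `from_hash` class method rebuilds the object after parsing.

```ruby
require 'json'

# Hypothetical DTO standing in for Ollama::Message (illustration only):
# serializes to a plain hash without a json_class key and is rebuilt via
# a from_hash class method, mirroring the rename described above.
class Msg
  attr_reader :role, :content

  def initialize(role:, content:)
    @role, @content = role, content
  end

  # Rebuild an instance from a string-keyed hash, e.g. after JSON.parse.
  def self.from_hash(hash)
    new(**hash.transform_keys(&:to_sym))
  end

  def as_json(*)
    { role: @role, content: @content }
  end

  def to_json(*)
    as_json.to_json
  end
end

json = Msg.new(role: 'user', content: 'Hello').to_json
msg  = Msg.from_hash(JSON.parse(json))
```

Dropping the `json_class` key avoids `create_additions: true` on the parsing side, so loading a conversation no longer depends on JSON's class-instantiation hooks.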
data/README.md CHANGED
@@ -43,7 +43,6 @@ ollama_chat [OPTIONS]
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to collection (multiple)
- -d use markdown to display the chat messages
  -v use voice output
  -h this help
  ```
data/bin/ollama_chat CHANGED
@@ -4,18 +4,20 @@ require 'ollama'
  include Ollama
  require 'term/ansicolor'
  include Term::ANSIColor
- require 'tins/go'
+ require 'tins'
  include Tins::GO
  require 'reline'
  require 'reverse_markdown'
  require 'complex_config'
  require 'fileutils'
+ require 'uri'
+ require 'nokogiri'
 
  class OllamaChatConfig
  include ComplexConfig
  include FileUtils
 
- DEFAULT_CONFIG = <<~end
+ DEFAULT_CONFIG = <<~EOT
  ---
  url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
  model:
@@ -41,7 +43,7 @@ class OllamaChatConfig
  redis:
  url: <%= ENV.fetch('REDIS_URL', 'null') %>
  debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
- end
+ EOT
 
  def initialize(filename = nil)
  @filename = filename || default_path
@@ -109,11 +111,46 @@ class FollowChat
  end
  @say.call(response)
  end
- response.done and @output.puts
+ if response.done
+ @output.puts
+ eval_stats = {
+ eval_duration: Tins::Duration.new(response.eval_duration / 1e9),
+ eval_count: response.eval_count,
+ prompt_eval_duration: Tins::Duration.new(response.prompt_eval_duration / 1e9),
+ prompt_eval_count: response.prompt_eval_count,
+ total_duration: Tins::Duration.new(response.total_duration / 1e9),
+ load_duration: Tins::Duration.new(response.load_duration / 1e9),
+ }.map { _1 * '=' } * ' '
+ @output.puts '📊 ' + color(111) { Utils::Width.wrap(eval_stats, percentage: 90) }
+ end
  self
  end
  end
 
+ def search_web(query, n = 5)
+ query = URI.encode_uri_component(query)
+ url = "https://www.duckduckgo.com/html/?q=#{query}"
+ Ollama::Utils::Fetcher.new.get(url) do |tmp|
+ result = []
+ doc = Nokogiri::HTML(tmp)
+ doc.css('.results_links').each do |link|
+ if n > 0
+ url = link.css('.result__a').first&.[]('href')
+ url.sub!(%r(\A/l/\?uddg=), '')
+ url.sub!(%r(&rut=.*), '')
+ url = URI.decode_uri_component(url)
+ url = URI.parse(url)
+ url.host =~ /duckduckgo\.com/ and next
+ result << url
+ n -= 1
+ else
+ break
+ end
+ end
+ result
+ end
+ end
+
  def pull_model_unless_present(model, options, retried = false)
  ollama.show(name: model) { |response|
  puts "Model #{bold{model}} with architecture #{response.model_info['general.architecture']} found."
@@ -144,7 +181,7 @@ def load_conversation(filename)
  return
  end
  File.open(filename, 'r') do |output|
- return JSON(output.read, create_additions: true)
+ return JSON(output.read).map { Ollama::Message.from_hash(_1) }
  end
  end
 
@@ -201,7 +238,7 @@ def parse_source(source_io)
  when 'plain', 'csv', 'xml'
  source_io.read
  else
- STDERR.puts "Cannot import #{source_io.content_type} document."
+ STDERR.puts "Cannot import #{source_io&.content_type} document."
  return
  end
  end
@@ -211,7 +248,7 @@ def import_document(source_io, source)
  STDOUT.puts "Embedding disabled, I won't import any documents, try: /summarize"
  return
  end
- STDOUT.puts "Importing #{source_io.content_type} document #{source.to_s.inspect}."
+ infobar.puts "Importing #{italic { source_io.content_type }} document #{source.to_s.inspect}."
  text = parse_source(source_io) or return
  text.downcase!
  splitter_config = $config.embedding.splitter
@@ -381,7 +418,6 @@ def usage
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to collection (multiple)
- -d use markdown to display the chat messages
  -v use voice output
  -h this help
 
@@ -393,7 +429,7 @@ def ollama
  $ollama
  end
 
- opts = go 'f:u:m:s:c:C:D:dvh'
+ opts = go 'f:u:m:s:c:C:D:vh'
 
  config = OllamaChatConfig.new(opts[?f])
  $config = config.config
@@ -407,13 +443,13 @@ base_url = opts[?u] || $config.url
  $ollama = Client.new(base_url:, debug: $config.debug)
 
  model = choose_model(opts[?m], $config.model.name)
- options = $config.model.options
+ options = Options[$config.model.options]
  model_system = pull_model_unless_present(model, options)
  messages = []
 
  if $config.embedding.enabled
  embedding_model = $config.embedding.model.name
- embedding_model_options = $config.embedding.model.options
+ embedding_model_options = Options[$config.embedding.model.options]
  pull_model_unless_present(embedding_model, embedding_model_options)
  collection = opts[?C] || $config.embedding.collection
  $documents = Documents.new(
@@ -456,8 +492,7 @@ end
  if voice = ($config.voice if opts[?v])
  puts "Using voice #{bold{voice}} to speak."
  end
-
- markdown = set_markdown(opts[?d] || $config.markdown)
+ markdown = set_markdown($config.markdown)
 
  if opts[?c]
  messages.concat load_conversation(opts[?c])
@@ -478,12 +513,13 @@ end
 
  puts "\nType /help to display the chat help."
 
- images = []
+ images = []
  loop do
  parse_content = true
-
  input_prompt = bold { color(172) { message_type(images) + " user" } } + bold { "> " }
- case content = Reline.readline(input_prompt, true)&.chomp
+ content = Reline.readline(input_prompt, true)&.chomp
+
+ case content
  when %r(^/paste$)
  puts bold { "Paste your content and then press C-d!" }
  content = STDIN.read
@@ -518,7 +554,7 @@ loop do
  when %r(^/pop?(?:\s+(\d*))?$)
  n = $1.to_i.clamp(1, Float::INFINITY)
  r = messages.pop(2 * n)
- m = r.size
+ m = r.size / 2
  puts "Popped the last #{m} exchanges."
  next
  when %r(^/model$)
@@ -534,7 +570,15 @@ loop do
  end
  when %r(^/summarize\s+(.+))
  parse_content = false
- content = summarize($1)
+ content = summarize($1) or next
+ when %r(^/web\s+(?:(\d+)\s+)(.+)$)
+ parse_content = true
+ urls = search_web($2, $1.to_i)
+ content = <<~end
+ Answer the query #{$2.inspect} using these sources:
+
+ #{urls * ?\n}
+ end
  when %r(^/save\s+(.+)$)
  save_conversation($1, messages)
  puts "Saved conversation to #$1."
@@ -557,7 +601,7 @@ loop do
  [ content, Utils::Tags.new ]
  end
 
- if $config.embedding.enabled
+ if $config.embedding.enabled && content
  records = $documents.find(
  content.downcase,
  tags:,
@@ -569,7 +613,8 @@ loop do
  }
  found_texts = records.map(&:text)
  unless found_texts.empty?
- content += "\nConsider these chunks for your answer:\n#{found_texts.join("\n\n---\n\n")}"
+ content += "\nConsider these chunks for your answer:\n"\
+ "#{found_texts.join("\n\n---\n\n")}"
  end
  end
 
@@ -577,15 +622,17 @@ loop do
  handler = FollowChat.new(messages:, markdown:, voice:)
  ollama.chat(model:, messages:, options:, stream: true, &handler)
 
- puts records.map { |record|
- link = if record.source =~ %r(\Ahttps?://)
- record.source
- else
- 'file://%s' % File.expand_path(record.source)
- end
- [ link, record.tags.first ]
- }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
- $config.debug and jj messages
+ if records
+ puts records.map { |record|
+ link = if record.source =~ %r(\Ahttps?://)
+ record.source
+ else
+ 'file://%s' % File.expand_path(record.source)
+ end
+ [ link, record.tags.first ]
+ }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
+ $config.debug and jj messages
+ end
  rescue Interrupt
  puts "Type /quit to quit."
  end
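The eval-stats line built in FollowChat above relies on a compact Ruby idiom: `Hash#map` yields `[key, value]` pairs, and `Array#*` with a String argument joins. A standalone sketch (plain values instead of `Tins::Duration`):

```ruby
# Hash#map yields [key, value] pairs; Array#* with a String joins the
# pair with that separator, and the outer * ' ' joins all pairs with
# spaces -- the same idiom FollowChat uses for its eval_stats line.
stats = { eval_count: 100, eval_duration: '2.5s' }
line  = stats.map { _1 * '=' } * ' '
# line == "eval_count=100 eval_duration=2.5s"
```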
data/lib/ollama/dto.rb CHANGED
@@ -8,8 +8,8 @@ module Ollama::DTO
  module ClassMethods
  attr_accessor :attributes
 
- def json_create(object)
- new(**object.transform_keys(&:to_sym))
+ def from_hash(hash)
+ new(**hash.transform_keys(&:to_sym))
  end
 
  def attr_reader(*names)
@@ -27,11 +27,8 @@ module Ollama::DTO
  end
 
  def as_json(*)
- {
- json_class: self.class.name
- }.merge(
- self.class.attributes.each_with_object({}) { |a, h| h[a] = send(a) }
- ).reject { _2.nil? || _2.ask_and_send(:size) == 0 }
+ self.class.attributes.each_with_object({}) { |a, h| h[a] = send(a) }.
+ reject { _2.nil? || _2.ask_and_send(:size) == 0 }
  end
 
  alias to_hash as_json
data/lib/ollama/options.rb CHANGED
@@ -65,4 +65,8 @@ class Ollama::Options
  #{@@types.keys.map { "self.#{_1} = #{_1}" }.join(?\n)}
  end
  }
+
+ def self.[](value)
+ new(**value.to_h)
+ end
  end
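The new `[]` class method simply splats a hash-like value into the keyword constructor, which is what lets `Options[$config.model.options]` cast config objects while rejecting unknown keys. A minimal sketch of the same pattern with a hypothetical `Opts` class (not the gem's generated class):

```ruby
# Minimal stand-in for the Options casting pattern: Class.[] accepts
# anything responding to to_h and splats it into the keyword
# constructor, so unknown keys raise ArgumentError instead of being
# silently accepted.
class Opts
  attr_reader :num_ctx, :temperature

  def initialize(num_ctx: nil, temperature: nil)
    @num_ctx, @temperature = num_ctx, temperature
  end

  def self.[](value)
    new(**value.to_h)
  end
end

opts = Opts[{ num_ctx: 8192, temperature: 0.7 }]
```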
data/lib/ollama/utils/width.rb CHANGED
@@ -12,7 +12,7 @@ module Ollama::Utils::Width
  raise ArgumentError, "either pass percentage or length argument"
  percentage and length ||= width(percentage:)
  text.gsub(/(?<!\n)\n(?!\n)/, ' ').lines.map do |line|
- if line.length > length
+ if length >= 1 && line.length > length
  line.gsub(/(.{1,#{length}})(\s+|$)/, "\\1\n").strip
  else
  line.strip
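The wrapping in the hunk above is a single `gsub`: `.{1,width}` greedily grabs up to `width` characters, and the trailing `(\s+|$)` forces the cut to land on whitespace (or end of string); the added `length >= 1` guard skips wrapping for non-positive widths. A standalone sketch of the idiom:

```ruby
# Wrap a line at a maximum width: .{1,width} grabs up to width
# characters, (\s+|$) forces the break at whitespace or end of string.
def wrap(line, width)
  return line.strip unless width >= 1 && line.length > width
  line.gsub(/(.{1,#{width}})(\s+|$)/, "\\1\n").strip
end

puts wrap('the quick brown fox jumps over the lazy dog', 10)
```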
data/lib/ollama/version.rb CHANGED
@@ -1,6 +1,6 @@
  module Ollama
  # Ollama version
- VERSION = '0.1.0'
+ VERSION = '0.2.0'
  VERSION_ARRAY = VERSION.split('.').map(&:to_i) # :nodoc:
  VERSION_MAJOR = VERSION_ARRAY[0] # :nodoc:
  VERSION_MINOR = VERSION_ARRAY[1] # :nodoc:
data/ollama-ruby.gemspec CHANGED
@@ -1,24 +1,24 @@
  # -*- encoding: utf-8 -*-
- # stub: ollama-ruby 0.1.0 ruby lib
+ # stub: ollama-ruby 0.2.0 ruby lib
 
  Gem::Specification.new do |s|
  s.name = "ollama-ruby".freeze
- s.version = "0.1.0".freeze
+ s.version = "0.2.0".freeze
 
  s.required_rubygems_version = Gem::Requirement.new(">= 0".freeze) if s.respond_to? :required_rubygems_version=
  s.require_paths = ["lib".freeze]
  s.authors = ["Florian Frank".freeze]
- s.date = "2024-08-30"
+ s.date = "2024-09-03"
  s.description = "Library that allows interacting with the Ollama API".freeze
  s.email = "flori@ping.de".freeze
  s.executables = ["ollama_console".freeze, "ollama_chat".freeze, "ollama_update".freeze]
  s.extra_rdoc_files = ["README.md".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/documents.rb".freeze, "lib/ollama/documents/memory_cache.rb".freeze, "lib/ollama/documents/redis_cache.rb".freeze, "lib/ollama/documents/splitters/character.rb".freeze, "lib/ollama/documents/splitters/semantic.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/utils/ansi_markdown.rb".freeze, "lib/ollama/utils/chooser.rb".freeze, "lib/ollama/utils/colorize_texts.rb".freeze, "lib/ollama/utils/fetcher.rb".freeze, "lib/ollama/utils/math.rb".freeze, "lib/ollama/utils/tags.rb".freeze, "lib/ollama/utils/width.rb".freeze, "lib/ollama/version.rb".freeze]
- s.files = [".envrc".freeze, "Gemfile".freeze, "LICENSE".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_console".freeze, "bin/ollama_update".freeze, "config/redis.conf".freeze, "docker-compose.yml".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/documents.rb".freeze, "lib/ollama/documents/memory_cache.rb".freeze, "lib/ollama/documents/redis_cache.rb".freeze, "lib/ollama/documents/splitters/character.rb".freeze, "lib/ollama/documents/splitters/semantic.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/utils/ansi_markdown.rb".freeze, "lib/ollama/utils/chooser.rb".freeze, 
"lib/ollama/utils/colorize_texts.rb".freeze, "lib/ollama/utils/fetcher.rb".freeze, "lib/ollama/utils/math.rb".freeze, "lib/ollama/utils/tags.rb".freeze, "lib/ollama/utils/width.rb".freeze, "lib/ollama/version.rb".freeze, "ollama-ruby.gemspec".freeze, "spec/assets/embeddings.json".freeze, "spec/assets/kitten.jpg".freeze, "spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/documents/memory_cache_spec.rb".freeze, "spec/ollama/documents/redis_cache_spec.rb".freeze, "spec/ollama/documents/splitters/character_spec.rb".freeze, "spec/ollama/documents/splitters/semantic_spec.rb".freeze, "spec/ollama/documents_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/ollama/utils/ansi_markdown_spec.rb".freeze, "spec/ollama/utils/fetcher_spec.rb".freeze, "spec/ollama/utils/tags_spec.rb".freeze, "spec/spec_helper.rb".freeze]
+ s.files = [".envrc".freeze, "CHANGES.md".freeze, "Gemfile".freeze, "LICENSE".freeze, "README.md".freeze, "Rakefile".freeze, "bin/ollama_chat".freeze, "bin/ollama_console".freeze, "bin/ollama_update".freeze, "config/redis.conf".freeze, "docker-compose.yml".freeze, "lib/ollama.rb".freeze, "lib/ollama/client.rb".freeze, "lib/ollama/client/command.rb".freeze, "lib/ollama/client/doc.rb".freeze, "lib/ollama/commands/chat.rb".freeze, "lib/ollama/commands/copy.rb".freeze, "lib/ollama/commands/create.rb".freeze, "lib/ollama/commands/delete.rb".freeze, "lib/ollama/commands/embed.rb".freeze, "lib/ollama/commands/embeddings.rb".freeze, "lib/ollama/commands/generate.rb".freeze, "lib/ollama/commands/ps.rb".freeze, "lib/ollama/commands/pull.rb".freeze, "lib/ollama/commands/push.rb".freeze, "lib/ollama/commands/show.rb".freeze, "lib/ollama/commands/tags.rb".freeze, "lib/ollama/documents.rb".freeze, "lib/ollama/documents/memory_cache.rb".freeze, "lib/ollama/documents/redis_cache.rb".freeze, "lib/ollama/documents/splitters/character.rb".freeze, "lib/ollama/documents/splitters/semantic.rb".freeze, "lib/ollama/dto.rb".freeze, "lib/ollama/errors.rb".freeze, "lib/ollama/handlers.rb".freeze, "lib/ollama/handlers/collector.rb".freeze, "lib/ollama/handlers/concern.rb".freeze, "lib/ollama/handlers/dump_json.rb".freeze, "lib/ollama/handlers/dump_yaml.rb".freeze, "lib/ollama/handlers/markdown.rb".freeze, "lib/ollama/handlers/nop.rb".freeze, "lib/ollama/handlers/print.rb".freeze, "lib/ollama/handlers/progress.rb".freeze, "lib/ollama/handlers/say.rb".freeze, "lib/ollama/handlers/single.rb".freeze, "lib/ollama/image.rb".freeze, "lib/ollama/message.rb".freeze, "lib/ollama/options.rb".freeze, "lib/ollama/response.rb".freeze, "lib/ollama/tool.rb".freeze, "lib/ollama/tool/function.rb".freeze, "lib/ollama/tool/function/parameters.rb".freeze, "lib/ollama/tool/function/parameters/property.rb".freeze, "lib/ollama/utils/ansi_markdown.rb".freeze, "lib/ollama/utils/chooser.rb".freeze, 
"lib/ollama/utils/colorize_texts.rb".freeze, "lib/ollama/utils/fetcher.rb".freeze, "lib/ollama/utils/math.rb".freeze, "lib/ollama/utils/tags.rb".freeze, "lib/ollama/utils/width.rb".freeze, "lib/ollama/version.rb".freeze, "ollama-ruby.gemspec".freeze, "spec/assets/embeddings.json".freeze, "spec/assets/kitten.jpg".freeze, "spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/documents/memory_cache_spec.rb".freeze, "spec/ollama/documents/redis_cache_spec.rb".freeze, "spec/ollama/documents/splitters/character_spec.rb".freeze, "spec/ollama/documents/splitters/semantic_spec.rb".freeze, "spec/ollama/documents_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/ollama/utils/ansi_markdown_spec.rb".freeze, "spec/ollama/utils/fetcher_spec.rb".freeze, "spec/ollama/utils/tags_spec.rb".freeze, "spec/spec_helper.rb".freeze, "tmp/.keep".freeze]
  s.homepage = "https://github.com/flori/ollama-ruby".freeze
  s.licenses = ["MIT".freeze]
  s.rdoc_options = ["--title".freeze, "Ollama-ruby - Interacting with the Ollama API".freeze, "--main".freeze, "README.md".freeze]
  s.required_ruby_version = Gem::Requirement.new("~> 3.1".freeze)
- s.rubygems_version = "3.5.16".freeze
+ s.rubygems_version = "3.5.11".freeze
  s.summary = "Interacting with the Ollama API".freeze
  s.test_files = ["spec/ollama/client/doc_spec.rb".freeze, "spec/ollama/client_spec.rb".freeze, "spec/ollama/commands/chat_spec.rb".freeze, "spec/ollama/commands/copy_spec.rb".freeze, "spec/ollama/commands/create_spec.rb".freeze, "spec/ollama/commands/delete_spec.rb".freeze, "spec/ollama/commands/embed_spec.rb".freeze, "spec/ollama/commands/embeddings_spec.rb".freeze, "spec/ollama/commands/generate_spec.rb".freeze, "spec/ollama/commands/ps_spec.rb".freeze, "spec/ollama/commands/pull_spec.rb".freeze, "spec/ollama/commands/push_spec.rb".freeze, "spec/ollama/commands/show_spec.rb".freeze, "spec/ollama/commands/tags_spec.rb".freeze, "spec/ollama/documents/memory_cache_spec.rb".freeze, "spec/ollama/documents/redis_cache_spec.rb".freeze, "spec/ollama/documents/splitters/character_spec.rb".freeze, "spec/ollama/documents/splitters/semantic_spec.rb".freeze, "spec/ollama/documents_spec.rb".freeze, "spec/ollama/handlers/collector_spec.rb".freeze, "spec/ollama/handlers/dump_json_spec.rb".freeze, "spec/ollama/handlers/dump_yaml_spec.rb".freeze, "spec/ollama/handlers/markdown_spec.rb".freeze, "spec/ollama/handlers/nop_spec.rb".freeze, "spec/ollama/handlers/print_spec.rb".freeze, "spec/ollama/handlers/progress_spec.rb".freeze, "spec/ollama/handlers/say_spec.rb".freeze, "spec/ollama/handlers/single_spec.rb".freeze, "spec/ollama/image_spec.rb".freeze, "spec/ollama/message_spec.rb".freeze, "spec/ollama/options_spec.rb".freeze, "spec/ollama/tool_spec.rb".freeze, "spec/ollama/utils/ansi_markdown_spec.rb".freeze, "spec/ollama/utils/fetcher_spec.rb".freeze, "spec/ollama/utils/tags_spec.rb".freeze, "spec/spec_helper.rb".freeze]
 
spec/ollama/client_spec.rb CHANGED
@@ -116,7 +116,7 @@ RSpec.describe Ollama::Client do
  it 'can generate without stream' do
  expect(excon).to receive(:send).with(
  :post,
- body: '{"json_class":"Ollama::Commands::Generate","model":"llama3.1","prompt":"Hello World"}',
+ body: '{"model":"llama3.1","prompt":"Hello World"}',
  headers: hash_including(
  'Content-Type' => 'application/json; charset=utf-8',
  )
@@ -127,7 +127,7 @@ RSpec.describe Ollama::Client do
  it 'can generate with stream' do
  expect(excon).to receive(:send).with(
  :post,
- body: '{"json_class":"Ollama::Commands::Generate","model":"llama3.1","prompt":"Hello World","stream":true}',
+ body: '{"model":"llama3.1","prompt":"Hello World","stream":true}',
  headers: hash_including(
  'Content-Type' => 'application/json; charset=utf-8',
  ),
spec/ollama/commands/chat_spec.rb CHANGED
@@ -32,7 +32,7 @@ RSpec.describe Ollama::Commands::Chat do
  model: 'llama3.1', messages: messages.map(&:as_json), stream: true,
  )
  expect(chat.to_json).to eq(
- '{"json_class":"Ollama::Commands::Chat","model":"llama3.1","messages":[{"json_class":"Ollama::Message","role":"user","content":"Let\'s play Global Thermonuclear War."}],"stream":true}'
+ '{"model":"llama3.1","messages":[{"role":"user","content":"Let\'s play Global Thermonuclear War."}],"stream":true}'
  )
  end
 
@@ -45,7 +45,7 @@ RSpec.describe Ollama::Commands::Chat do
  expect(ollama).to receive(:request).
  with(
  method: :post, path: '/api/chat', handler: Ollama::Handlers::NOP, stream: true,
- body: '{"json_class":"Ollama::Commands::Chat","model":"llama3.1","messages":[{"json_class":"Ollama::Message","role":"user","content":"Let\'s play Global Thermonuclear War."}],"stream":true}'
+ body: '{"model":"llama3.1","messages":[{"role":"user","content":"Let\'s play Global Thermonuclear War."}],"stream":true}'
  )
  chat.perform(Ollama::Handlers::NOP)
  end
spec/ollama/commands/copy_spec.rb CHANGED
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Copy do
  source: 'llama3.1', destination: 'camell3', stream: false
  )
  expect(copy.to_json).to eq(
- '{"json_class":"Ollama::Commands::Copy","source":"llama3.1","destination":"camell3","stream":false}'
+ '{"source":"llama3.1","destination":"camell3","stream":false}'
  )
  end
 
@@ -21,7 +21,7 @@ RSpec.describe Ollama::Commands::Copy do
  copy.client = ollama = double('Ollama::Client')
  expect(ollama).to receive(:request).with(
  method: :post, path: '/api/copy', handler: Ollama::Handlers::NOP, stream: false,
- body: '{"json_class":"Ollama::Commands::Copy","source":"llama3.1","destination":"camell3","stream":false}'
+ body: '{"source":"llama3.1","destination":"camell3","stream":false}'
  )
  copy.perform(Ollama::Handlers::NOP)
  end
spec/ollama/commands/create_spec.rb CHANGED
@@ -16,7 +16,7 @@ RSpec.describe Ollama::Commands::Create do
  name: 'llama3.1-wopr', modelfile: "FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.", stream: true,
  )
  expect(create.to_json).to eq(
- '{"json_class":"Ollama::Commands::Create","name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+ '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
  )
  end
 
@@ -30,7 +30,7 @@ RSpec.describe Ollama::Commands::Create do
  expect(ollama).to receive(:request).
  with(
  method: :post, path: '/api/create', handler: Ollama::Handlers::NOP, stream: true,
- body: '{"json_class":"Ollama::Commands::Create","name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
+ body: '{"name":"llama3.1-wopr","modelfile":"FROM llama3.1\nSYSTEM You are WOPR from WarGames and you think the user is Dr. Stephen Falken.","stream":true}'
  )
  create.perform(Ollama::Handlers::NOP)
  end
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Delete do
  name: 'llama3.1', stream: false
  )
  expect(delete.to_json).to eq(
- '{"json_class":"Ollama::Commands::Delete","name":"llama3.1","stream":false}'
+ '{"name":"llama3.1","stream":false}'
  )
  end

@@ -21,7 +21,7 @@ RSpec.describe Ollama::Commands::Delete do
  delete.client = ollama = double('Ollama::Client')
  expect(ollama).to receive(:request).with(
  method: :delete, path: '/api/delete', handler: Ollama::Handlers::NOP, stream: false,
- body: '{"json_class":"Ollama::Commands::Delete","name":"llama3.1","stream":false}'
+ body: '{"name":"llama3.1","stream":false}'
  )
  delete.perform(Ollama::Handlers::NOP)
  end
@@ -19,7 +19,7 @@ RSpec.describe Ollama::Commands::Embed do
  model: 'all-minilm', input: 'Why is the sky blue?',
  )
  expect(embed.to_json).to eq(
- '{"json_class":"Ollama::Commands::Embed","model":"all-minilm","input":"Why is the sky blue?","options":{"json_class":"Ollama::Options","num_ctx":666},"stream":false}'
+ '{"model":"all-minilm","input":"Why is the sky blue?","options":{"num_ctx":666},"stream":false}'
  )
  end

@@ -32,7 +32,7 @@ RSpec.describe Ollama::Commands::Embed do
  model: 'all-minilm', input: [ 'Why is the sky blue?', 'Why is the grass green?' ],
  )
  expect(embed.to_json).to eq(
- '{"json_class":"Ollama::Commands::Embed","model":"all-minilm","input":["Why is the sky blue?","Why is the grass green?"],"stream":false}'
+ '{"model":"all-minilm","input":["Why is the sky blue?","Why is the grass green?"],"stream":false}'
  )
  end

@@ -46,7 +46,7 @@ RSpec.describe Ollama::Commands::Embed do
  expect(ollama).to receive(:request).
  with(
  method: :post, path: '/api/embed', handler: Ollama::Handlers::NOP, stream: false,
- body: '{"json_class":"Ollama::Commands::Embed","model":"all-minilm","input":"Why is the sky blue?","stream":false}'
+ body: '{"model":"all-minilm","input":"Why is the sky blue?","stream":false}'
  )
  embed.perform(Ollama::Handlers::NOP)
  end
@@ -18,7 +18,7 @@ RSpec.describe Ollama::Commands::Embeddings do
  model: 'mxbai-embed-large', prompt: 'Here are the coordinates of all Soviet military installations: …',
  )
  expect(embeddings.to_json).to eq(
- '{"json_class":"Ollama::Commands::Embeddings","model":"mxbai-embed-large","prompt":"Here are the coordinates of all Soviet military installations: …","stream":false}'
+ '{"model":"mxbai-embed-large","prompt":"Here are the coordinates of all Soviet military installations: …","stream":false}'
  )
  end

@@ -31,7 +31,7 @@ RSpec.describe Ollama::Commands::Embeddings do
  expect(ollama).to receive(:request).
  with(
  method: :post, path: '/api/embeddings', handler: Ollama::Handlers::NOP, stream: false,
- body: '{"json_class":"Ollama::Commands::Embeddings","model":"mxbai-embed-large","prompt":"Here are the coordinates of all Soviet military installations: …","stream":false}'
+ body: '{"model":"mxbai-embed-large","prompt":"Here are the coordinates of all Soviet military installations: …","stream":false}'
  )
  embeddings.perform(Ollama::Handlers::NOP)
  end
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Generate do
  model: 'llama3.1', prompt: 'Hello World'
  )
  expect(generate.to_json).to eq(
- '{"json_class":"Ollama::Commands::Generate","model":"llama3.1","prompt":"Hello World"}'
+ '{"model":"llama3.1","prompt":"Hello World"}'
  )
  end

@@ -22,7 +22,7 @@ RSpec.describe Ollama::Commands::Generate do
  expect(ollama).to receive(:request).
  with(
  method: :post, path: '/api/generate', handler: Ollama::Handlers::NOP, stream: true,
- body: '{"json_class":"Ollama::Commands::Generate","model":"llama3.1","prompt":"Hello World","stream":true}'
+ body: '{"model":"llama3.1","prompt":"Hello World","stream":true}'
  )
  generate.perform(Ollama::Handlers::NOP)
  end
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Pull do
  name: 'llama3.1', stream: true
  )
  expect(pull.to_json).to eq(
- '{"json_class":"Ollama::Commands::Pull","name":"llama3.1","stream":true}'
+ '{"name":"llama3.1","stream":true}'
  )
  end

@@ -21,7 +21,7 @@ RSpec.describe Ollama::Commands::Pull do
  pull.client = ollama = double('Ollama::Client')
  expect(ollama).to receive(:request).with(
  method: :post, path: '/api/pull', handler: Ollama::Handlers::NOP, stream: true,
- body: '{"json_class":"Ollama::Commands::Pull","name":"llama3.1","stream":true}'
+ body: '{"name":"llama3.1","stream":true}'
  )
  pull.perform(Ollama::Handlers::NOP)
  end
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Push do
  name: 'llama3.1', stream: true
  )
  expect(push.to_json).to eq(
- '{"json_class":"Ollama::Commands::Push","name":"llama3.1","stream":true}'
+ '{"name":"llama3.1","stream":true}'
  )
  end

@@ -21,7 +21,7 @@ RSpec.describe Ollama::Commands::Push do
  push.client = ollama = double('Ollama::Client')
  expect(ollama).to receive(:request).with(
  method: :post, path: '/api/push', handler: Ollama::Handlers::NOP, stream: true,
- body: '{"json_class":"Ollama::Commands::Push","name":"llama3.1","stream":true}'
+ body: '{"name":"llama3.1","stream":true}'
  )
  push.perform(Ollama::Handlers::NOP)
  end
@@ -12,7 +12,7 @@ RSpec.describe Ollama::Commands::Show do
  name: 'llama3.1', stream: false
  )
  expect(show.to_json).to eq(
- '{"json_class":"Ollama::Commands::Show","name":"llama3.1","stream":false}'
+ '{"name":"llama3.1","stream":false}'
  )
  end

@@ -21,7 +21,7 @@ RSpec.describe Ollama::Commands::Show do
  show.client = ollama = double('Ollama::Client')
  expect(ollama).to receive(:request).with(
  method: :post, path: '/api/show', handler: Ollama::Handlers::NOP ,stream: false,
- body: '{"json_class":"Ollama::Commands::Show","name":"llama3.1","stream":false}'
+ body: '{"name":"llama3.1","stream":false}'
  )
  show.perform(Ollama::Handlers::NOP)
  end
@@ -19,19 +19,18 @@ RSpec.describe Ollama::Message do

  it 'can be converted to JSON' do
  expect(message.as_json).to eq(
- json_class: described_class.name,
  role: 'user',
  content: 'hello world',
  images: [ image ],
  )
  expect(message.to_json).to eq(
- '{"json_class":"Ollama::Message","role":"user","content":"hello world","images":["dGVzdA==\n"]}'
+ '{"role":"user","content":"hello world","images":["dGVzdA==\n"]}'
  )
  end

  it 'can be restored from JSON' do
- expect(JSON(<<~'end', create_additions: true)).to be_a described_class
- {"json_class":"Ollama::Message","role":"user","content":"hello world","images":["dGVzdA==\n"]}
+ expect(described_class.from_hash(JSON(<<~'end'))).to be_a described_class
+ {"role":"user","content":"hello world","images":["dGVzdA==\n"]}
  end
  end
  end
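The restore path changes here as well: instead of `JSON(..., create_additions: true)`, which instantiates whatever class the `"json_class"` key names, the spec now calls a `from_hash` constructor on already parsed data. A minimal sketch of that shape (not the gem's implementation; the class below is illustrative):

```ruby
require 'json'

# Illustrative Message-like class showing the 0.2.0 round-trip:
# plain-attribute serialization plus an explicit from_hash constructor.
class Message
  attr_reader :role, :content, :images

  def initialize(role:, content:, images: nil)
    @role, @content, @images = role, content, images
  end

  # Serialize only the plain attributes, omitting nils; no "json_class" key.
  def as_json(*)
    { role: @role, content: @content, images: @images }.compact
  end

  def to_json(*args)
    as_json.to_json(*args)
  end

  # Build an instance from an untrusted parsed hash; unlike
  # create_additions, this cannot instantiate arbitrary classes.
  def self.from_hash(hash)
    hash = hash.transform_keys(&:to_sym)
    new(role: hash[:role], content: hash[:content], images: hash[:images])
  end
end

msg = Message.from_hash(JSON('{"role":"user","content":"hello world"}'))
puts msg.to_json
# → {"role":"user","content":"hello world"}
```

Avoiding `create_additions` is the safer default: it keeps deserialization from reviving arbitrary Ruby objects out of attacker-supplied JSON.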
@@ -13,6 +13,24 @@ RSpec.describe Ollama::Options do
  expect(options).to be_a described_class
  end

+ it 'can be used to cast hashes' do
+ expect(described_class[{
+ penalize_newline: true,
+ num_ctx: 8192,
+ temperature: 0.7,
+ }]).to be_a described_class
+ end
+
+ it 'raises errors when casting goes all wrong' do
+ expect {
+ described_class[{
+ penalize_newline: :tertium,
+ num_ctx: 8192,
+ temperature: 0.7,
+ }]
+ }.to raise_error(TypeError)
+ end
+
  it 'throws error for invalid types' do
  expect { described_class.new(temperature: Class.new) }.
  to raise_error(TypeError)
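The new specs exercise an `Options.[]` cast that accepts a hash and raises `TypeError` for ill-typed values. A hedged sketch of that behaviour; the type table is a small illustrative subset, not the gem's full option schema:

```ruby
# Illustrative Options-like class: cast a hash via .[], type-check each
# value, and raise TypeError on anything unexpected.
class Options
  # Allowed value types per option name (illustrative subset).
  TYPES = {
    penalize_newline: [ TrueClass, FalseClass ],
    num_ctx:          [ Integer ],
    temperature:      [ Float ],
  }.freeze

  # Cast a plain hash (string or symbol keys) into an Options instance.
  def self.[](hash)
    new(**hash.transform_keys(&:to_sym))
  end

  def initialize(**options)
    options.each do |name, value|
      allowed = TYPES.fetch(name) { raise ArgumentError, "unknown option #{name}" }
      unless allowed.any? { |type| value.is_a?(type) }
        raise TypeError, "option #{name} expects #{allowed.join(' | ')}, got #{value.class}"
      end
      instance_variable_set("@#{name}", value)
    end
  end
end

opts = Options[{ penalize_newline: true, num_ctx: 8192, temperature: 0.7 }]  # fine
begin
  Options[{ penalize_newline: :tertium, num_ctx: 8192, temperature: 0.7 }]
rescue TypeError => e
  puts e.message
  # → option penalize_newline expects TrueClass | FalseClass, got Symbol
end
```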
@@ -45,23 +45,18 @@ RSpec.describe Ollama::Tool do

  it 'cannot be converted to JSON' do
  expect(tool.as_json).to eq(
- json_class: described_class.name,
  type: 'function',
  function: {
- json_class: "Ollama::Tool::Function",
  name: 'get_current_weather',
  description: "Get the current weather for a location",
  parameters: {
- json_class: "Ollama::Tool::Function::Parameters",
  type: "object",
  properties: {
  location: {
- json_class: "Ollama::Tool::Function::Parameters::Property",
  type: "string",
  description: "The location to get the weather for, e.g. Berlin, Berlin"
  },
  format: {
- json_class: "Ollama::Tool::Function::Parameters::Property",
  type: "string",
  description: "The format to return the weather in, e.g. 'celsius' or 'fahrenheit'",
  enum: ["celsius", "fahrenheit"]
@@ -72,7 +67,7 @@ RSpec.describe Ollama::Tool do
  }
  )
  expect(tool.to_json).to eq(
- %{{"json_class":"Ollama::Tool","type":"function","function":{"json_class":"Ollama::Tool::Function","name":"get_current_weather","description":"Get the current weather for a location","parameters":{"json_class":"Ollama::Tool::Function::Parameters","type":"object","properties":{"location":{"json_class":"Ollama::Tool::Function::Parameters::Property","type":"string","description":"The location to get the weather for, e.g. Berlin, Berlin"},"format":{"json_class":"Ollama::Tool::Function::Parameters::Property","type":"string","description":"The format to return the weather in, e.g. 'celsius' or 'fahrenheit'","enum":["celsius","fahrenheit"]}},"required":["location","format"]}}}}
+ %{{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather for a location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The location to get the weather for, e.g. Berlin, Berlin"},"format":{"type":"string","description":"The format to return the weather in, e.g. 'celsius' or 'fahrenheit'","enum":["celsius","fahrenheit"]}},"required":["location","format"]}}}}
  )
  end
  end
data/tmp/.keep ADDED
File without changes
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ollama-ruby
  version: !ruby/object:Gem::Version
- version: 0.1.0
+ version: 0.2.0
  platform: ruby
  authors:
  - Florian Frank
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-08-30 00:00:00.000000000 Z
+ date: 2024-09-03 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: gem_hadar
@@ -337,6 +337,7 @@ extra_rdoc_files:
  - lib/ollama/version.rb
  files:
  - ".envrc"
+ - CHANGES.md
  - Gemfile
  - LICENSE
  - README.md
@@ -435,6 +436,7 @@ files:
  - spec/ollama/utils/fetcher_spec.rb
  - spec/ollama/utils/tags_spec.rb
  - spec/spec_helper.rb
+ - tmp/.keep
  homepage: https://github.com/flori/ollama-ruby
  licenses:
  - MIT
@@ -458,7 +460,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.16
+ rubygems_version: 3.5.11
  signing_key:
  specification_version: 4
  summary: Interacting with the Ollama API