ollama-ai 1.0.1 → 1.2.1

This diff shows the changes between two publicly released versions of the package, as published to their public registry. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: fde2a3054fbb7ae9b7dc02b3bb0f083a18496fa9af299299ca138a28ba0663e7
-  data.tar.gz: 371bda0d8c2db9e82cee85162d665a2ccf4bb204a487f5b0c19a156f7ac4e292
+  metadata.gz: 2b2cbe0dd2d75850aea0dddf66463e295dc3941205d66bbd5abbb1c8590fc8a7
+  data.tar.gz: 802c5423e24119488d9112726704e056bbed29c70c0c98247dd89a757fdefaaf
 SHA512:
-  metadata.gz: bdf82b0c94f7011f6d35b4997db27f65c1f87ddd60b05adb221d84f7504934eec3492be675647fd448e6157fa880a149d7a98d7066c597ab76559477101744d1
-  data.tar.gz: de8ddb9883d714f226edfa5f12a343002703f619717c653f7ab58f1757bb43f6662d451f77abc850cf1880c41165e64aa35589cb52a695c73d5452d6a25baf56
+  metadata.gz: acaa09929dfd956842f026ff504cfc6cc3213c7ae829a39a901a6302b9a45694e23795698375e25f0773780275811575dc5f6eef8b22d940849d9fdf855c4bf2
+  data.tar.gz: bd2758a1306d2a1a44b67c4662cb029f403100b3c2a34f44925d1d6b1bb514b2015154b004035927a5320ef9e137959d872d2f5361e13e6d8fd5767182ced122
data/Gemfile CHANGED
@@ -6,5 +6,5 @@ gemspec
 
 group :test, :development do
   gem 'pry-byebug', '~> 3.10', '>= 3.10.1'
-  gem 'rubocop', '~> 1.58'
+  gem 'rubocop', '~> 1.63', '>= 1.63.1'
 end
data/Gemfile.lock CHANGED
@@ -1,8 +1,9 @@
 PATH
   remote: .
   specs:
-    ollama-ai (1.0.1)
+    ollama-ai (1.2.1)
       faraday (~> 2.9)
+      faraday-typhoeus (~> 1.1)
 
 GEM
   remote: https://rubygems.org/
@@ -10,17 +11,23 @@ GEM
     ast (2.4.2)
     byebug (11.1.3)
     coderay (1.1.3)
+    ethon (0.16.0)
+      ffi (>= 1.15.0)
     faraday (2.9.0)
       faraday-net_http (>= 2.0, < 3.2)
     faraday-net_http (3.1.0)
       net-http
-    json (2.7.1)
+    faraday-typhoeus (1.1.0)
+      faraday (~> 2.0)
+      typhoeus (~> 1.4)
+    ffi (1.16.3)
+    json (2.7.2)
     language_server-protocol (3.17.0.3)
     method_source (1.0.0)
     net-http (0.4.1)
       uri
     parallel (1.24.0)
-    parser (3.3.0.3)
+    parser (3.3.0.5)
       ast (~> 2.4.1)
       racc
     pry (0.14.2)
@@ -33,20 +40,22 @@ GEM
     rainbow (3.1.1)
     regexp_parser (2.9.0)
     rexml (3.2.6)
-    rubocop (1.59.0)
+    rubocop (1.63.1)
       json (~> 2.3)
       language_server-protocol (>= 3.17.0)
       parallel (~> 1.10)
-      parser (>= 3.2.2.4)
+      parser (>= 3.3.0.2)
       rainbow (>= 2.2.2, < 4.0)
       regexp_parser (>= 1.8, < 3.0)
       rexml (>= 3.2.5, < 4.0)
-      rubocop-ast (>= 1.30.0, < 2.0)
+      rubocop-ast (>= 1.31.1, < 2.0)
       ruby-progressbar (~> 1.7)
       unicode-display_width (>= 2.4.0, < 3.0)
-    rubocop-ast (1.30.0)
-      parser (>= 3.2.1.0)
+    rubocop-ast (1.31.2)
+      parser (>= 3.3.0.4)
     ruby-progressbar (1.13.0)
+    typhoeus (1.4.1)
+      ethon (>= 0.9.0)
     unicode-display_width (2.5.0)
     uri (0.13.0)
 
@@ -56,7 +65,7 @@ PLATFORMS
 DEPENDENCIES
   ollama-ai!
   pry-byebug (~> 3.10, >= 3.10.1)
-  rubocop (~> 1.58)
+  rubocop (~> 1.63, >= 1.63.1)
 
 BUNDLED WITH
    2.4.22
data/README.md CHANGED
@@ -9,7 +9,7 @@ A Ruby gem for interacting with [Ollama](https://ollama.ai)'s API that allows yo
 ## TL;DR and Quick Start
 
 ```ruby
-gem 'ollama-ai', '~> 1.0.1'
+gem 'ollama-ai', '~> 1.2.1'
 ```
 
 ```ruby
@@ -62,40 +62,41 @@ Result:
 - [TL;DR and Quick Start](#tldr-and-quick-start)
 - [Index](#index)
 - [Setup](#setup)
-    - [Installing](#installing)
+  - [Installing](#installing)
 - [Usage](#usage)
-    - [Client](#client)
-    - [Methods](#methods)
-        - [generate: Generate a completion](#generate-generate-a-completion)
-            - [Without Streaming Events](#without-streaming-events)
-            - [Receiving Stream Events](#receiving-stream-events)
-        - [chat: Generate a chat completion](#chat-generate-a-chat-completion)
-            - [Back-and-Forth Conversations](#back-and-forth-conversations)
-        - [embeddings: Generate Embeddings](#embeddings-generate-embeddings)
-        - [Models](#models)
-            - [create: Create a Model](#create-create-a-model)
-            - [tags: List Local Models](#tags-list-local-models)
-            - [show: Show Model Information](#show-show-model-information)
-            - [copy: Copy a Model](#copy-copy-a-model)
-            - [delete: Delete a Model](#delete-delete-a-model)
-            - [pull: Pull a Model](#pull-pull-a-model)
-            - [push: Push a Model](#push-push-a-model)
-    - [Modes](#modes)
-        - [Text](#text)
-        - [Image](#image)
-    - [Streaming and Server-Sent Events (SSE)](#streaming-and-server-sent-events-sse)
-        - [Server-Sent Events (SSE) Hang](#server-sent-events-sse-hang)
-    - [New Functionalities and APIs](#new-functionalities-and-apis)
-    - [Request Options](#request-options)
-        - [Timeout](#timeout)
-    - [Error Handling](#error-handling)
-        - [Rescuing](#rescuing)
-        - [For Short](#for-short)
-        - [Errors](#errors)
+  - [Client](#client)
+  - [Methods](#methods)
+    - [generate: Generate a completion](#generate-generate-a-completion)
+      - [Without Streaming Events](#without-streaming-events)
+      - [Receiving Stream Events](#receiving-stream-events)
+    - [chat: Generate a chat completion](#chat-generate-a-chat-completion)
+      - [Back-and-Forth Conversations](#back-and-forth-conversations)
+    - [embeddings: Generate Embeddings](#embeddings-generate-embeddings)
+    - [Models](#models)
+      - [create: Create a Model](#create-create-a-model)
+      - [tags: List Local Models](#tags-list-local-models)
+      - [show: Show Model Information](#show-show-model-information)
+      - [copy: Copy a Model](#copy-copy-a-model)
+      - [delete: Delete a Model](#delete-delete-a-model)
+      - [pull: Pull a Model](#pull-pull-a-model)
+      - [push: Push a Model](#push-push-a-model)
+  - [Modes](#modes)
+    - [Text](#text)
+    - [Image](#image)
+  - [Streaming and Server-Sent Events (SSE)](#streaming-and-server-sent-events-sse)
+    - [Server-Sent Events (SSE) Hang](#server-sent-events-sse-hang)
+  - [New Functionalities and APIs](#new-functionalities-and-apis)
+  - [Request Options](#request-options)
+    - [Adapter](#adapter)
+    - [Timeout](#timeout)
+  - [Error Handling](#error-handling)
+    - [Rescuing](#rescuing)
+    - [For Short](#for-short)
+    - [Errors](#errors)
 - [Development](#development)
-    - [Purpose](#purpose)
-    - [Publish to RubyGems](#publish-to-rubygems)
-    - [Updating the README](#updating-the-readme)
+  - [Purpose](#purpose)
+  - [Publish to RubyGems](#publish-to-rubygems)
+  - [Updating the README](#updating-the-readme)
 - [Resources and References](#resources-and-references)
 - [Disclaimer](#disclaimer)
 
@@ -104,11 +105,11 @@ Result:
 ### Installing
 
 ```sh
-gem install ollama-ai -v 1.0.1
+gem install ollama-ai -v 1.2.1
 ```
 
 ```sh
-gem 'ollama-ai', '~> 1.0.1'
+gem 'ollama-ai', '~> 1.2.1'
 ```
 
 ## Usage
@@ -767,6 +768,21 @@ result = client.request(
 
 ### Request Options
 
+#### Adapter
+
+The gem uses [Faraday](https://github.com/lostisland/faraday) with the [Typhoeus](https://github.com/typhoeus/typhoeus) adapter by default.
+
+You can use a different adapter if you want:
+
+```ruby
+require 'faraday/net_http'
+
+client = Ollama.new(
+  credentials: { address: 'http://localhost:11434' },
+  options: { connection: { adapter: :net_http } }
+)
+```
+
 #### Timeout
 
 You can set the maximum number of seconds to wait for the request to complete with the `timeout` option:
@@ -855,6 +871,7 @@ bundle
 rubocop -A
 
 bundle exec ruby spec/tasks/run-client.rb
+bundle exec ruby spec/tasks/test-encoding.rb
 ```
 
 ### Purpose
@@ -868,7 +885,7 @@ gem build ollama-ai.gemspec
 
 gem signin
 
-gem push ollama-ai-1.0.1.gem
+gem push ollama-ai-1.2.1.gem
 ```
 
 ### Updating the README
data/controllers/client.rb CHANGED
@@ -1,6 +1,7 @@
 # frozen_string_literal: true
 
 require 'faraday'
+require 'faraday/typhoeus'
 require 'json'
 
 require_relative '../components/errors'
@@ -12,6 +13,8 @@ module Ollama
     ALLOWED_REQUEST_OPTIONS = %i[timeout open_timeout read_timeout write_timeout].freeze
 
+    DEFAULT_FARADAY_ADAPTER = :typhoeus
+
     def initialize(config)
       @server_sent_events = config.dig(:options, :server_sent_events)
 
@@ -30,6 +33,8 @@ module Ollama
       else
        {}
      end
+
+      @faraday_adapter = config.dig(:options, :connection, :adapter) || DEFAULT_FARADAY_ADAPTER
     end
 
     def generate(payload, server_sent_events: nil, &callback)
@@ -87,9 +92,10 @@ module Ollama
       method_to_call = request_method.to_s.strip.downcase.to_sym
 
-      partial_json = ''
+      partial_json = String.new.force_encoding('UTF-8')
 
       response = Faraday.new(request: @request_options) do |faraday|
+        faraday.adapter @faraday_adapter
         faraday.response :raise_error
       end.send(method_to_call) do |request|
         request.url url
@@ -104,7 +110,13 @@ module Ollama
            raise_error.on_complete(env.merge(body: chunk))
          end
 
-          partial_json += chunk
+          utf8_chunk = chunk.force_encoding('UTF-8')
+
+          partial_json += if utf8_chunk.valid_encoding?
+                            utf8_chunk
+                          else
+                            utf8_chunk.encode('UTF-8', invalid: :replace, undef: :replace)
+                          end
 
           parsed_json = safe_parse_json(partial_json)
 
@@ -115,7 +127,7 @@ module Ollama
 
           results << result
 
-          partial_json = ''
+          partial_json = String.new.force_encoding('UTF-8')
         end
       end
     end
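The encoding changes in the client diff above exist because a streamed response can split a multibyte UTF-8 character across two network chunks, and concatenating raw chunks into the JSON buffer could then raise an encoding error mid-stream. A minimal sketch of that per-chunk normalization in isolation (the method name `normalize_chunk` is illustrative, not part of the gem):

```ruby
# Sketch of the per-chunk normalization the diff introduces: tag the
# incoming bytes as UTF-8, and replace any byte sequences that do not
# form valid UTF-8 (for example, the first half of a multibyte
# character whose second half is still in flight).
def normalize_chunk(chunk)
  utf8 = chunk.dup.force_encoding('UTF-8')
  return utf8 if utf8.valid_encoding?

  # Invalid or undefined bytes become U+FFFD instead of raising later.
  utf8.encode('UTF-8', invalid: :replace, undef: :replace)
end

partial_json = String.new.force_encoding('UTF-8')
partial_json += normalize_chunk('{"response":"ol')  # valid chunk, kept as-is
partial_json += normalize_chunk("\xC3")             # stray half of a multibyte char
```

Note the trade-off this implies: a character split exactly on a chunk boundary is scrubbed to U+FFFD rather than reassembled, in exchange for the buffer always remaining valid UTF-8.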
data/ollama-ai.gemspec CHANGED
@@ -30,6 +30,7 @@ Gem::Specification.new do |spec|
   spec.require_paths = ['ports/dsl']
 
   spec.add_dependency 'faraday', '~> 2.9'
+  spec.add_dependency 'faraday-typhoeus', '~> 1.1'
 
   spec.metadata['rubygems_mfa_required'] = 'true'
 end
data/static/gem.rb CHANGED
@@ -3,7 +3,7 @@
 module Ollama
   GEM = {
     name: 'ollama-ai',
-    version: '1.0.1',
+    version: '1.2.1',
     author: 'gbaptista',
     summary: 'Interact with Ollama API to run open source AI models locally.',
     description: "A Ruby gem for interacting with Ollama's API that allows you to run open source AI LLMs (Large Language Models) locally.",
data/tasks/generate-readme.clj CHANGED
@@ -23,7 +23,7 @@
                (remove nil?))]
     (->> processed-lines
          (map (fn [{:keys [level title link]}]
-                (str (apply str (repeat (* 4 (- level 2)) " "))
+                (str (apply str (repeat (* 2 (- level 2)) " "))
                      "- ["
                      title
                      "](#"
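The one-character change above halves the generated index indentation: each heading level below `##` now contributes two leading spaces instead of four, which is what the README index hunk earlier in this diff reflects. A hypothetical Ruby rendering of the before/after formulas (not code from the gem):

```ruby
# Hypothetical Ruby mirror of the Clojure indentation formula changed
# above: spaces prepended to an index entry for a given heading level.
old_indent = ->(level) { ' ' * (4 * (level - 2)) }  # 1.0.1: (repeat (* 4 (- level 2)) " ")
new_indent = ->(level) { ' ' * (2 * (level - 2)) }  # 1.2.1: (repeat (* 2 (- level 2)) " ")
```

A level-3 entry such as `Installing` drops from four leading spaces to two; top-level (`##`) entries are unaffected.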
data/template.md CHANGED
@@ -9,7 +9,7 @@ A Ruby gem for interacting with [Ollama](https://ollama.ai)'s API that allows yo
 ## TL;DR and Quick Start
 
 ```ruby
-gem 'ollama-ai', '~> 1.0.1'
+gem 'ollama-ai', '~> 1.2.1'
 ```
 
 ```ruby
@@ -66,11 +66,11 @@ Result:
 ### Installing
 
 ```sh
-gem install ollama-ai -v 1.0.1
+gem install ollama-ai -v 1.2.1
 ```
 
 ```sh
-gem 'ollama-ai', '~> 1.0.1'
+gem 'ollama-ai', '~> 1.2.1'
 ```
 
 ## Usage
@@ -729,6 +729,21 @@ result = client.request(
 
 ### Request Options
 
+#### Adapter
+
+The gem uses [Faraday](https://github.com/lostisland/faraday) with the [Typhoeus](https://github.com/typhoeus/typhoeus) adapter by default.
+
+You can use a different adapter if you want:
+
+```ruby
+require 'faraday/net_http'
+
+client = Ollama.new(
+  credentials: { address: 'http://localhost:11434' },
+  options: { connection: { adapter: :net_http } }
+)
+```
+
 #### Timeout
 
 You can set the maximum number of seconds to wait for the request to complete with the `timeout` option:
@@ -817,6 +832,7 @@ bundle
 rubocop -A
 
 bundle exec ruby spec/tasks/run-client.rb
+bundle exec ruby spec/tasks/test-encoding.rb
 ```
 
 ### Purpose
@@ -830,7 +846,7 @@ gem build ollama-ai.gemspec
 
 gem signin
 
-gem push ollama-ai-1.0.1.gem
+gem push ollama-ai-1.2.1.gem
 ```
 
 ### Updating the README
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: ollama-ai
 version: !ruby/object:Gem::Version
-  version: 1.0.1
+  version: 1.2.1
 platform: ruby
 authors:
 - gbaptista
-autorequire:
+autorequire:
 bindir: bin
 cert_chain: []
-date: 2024-01-13 00:00:00.000000000 Z
+date: 2024-04-13 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: faraday
@@ -24,9 +24,23 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: '2.9'
+- !ruby/object:Gem::Dependency
+  name: faraday-typhoeus
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.1'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.1'
 description: A Ruby gem for interacting with Ollama's API that allows you to run open
   source AI LLMs (Large Language Models) locally.
-email:
+email:
 executables: []
 extensions: []
 extra_rdoc_files: []
@@ -54,7 +68,7 @@ metadata:
   homepage_uri: https://github.com/gbaptista/ollama-ai
   source_code_uri: https://github.com/gbaptista/ollama-ai
   rubygems_mfa_required: 'true'
-post_install_message:
+post_install_message:
 rdoc_options: []
 require_paths:
 - ports/dsl
@@ -70,7 +84,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     version: '0'
 requirements: []
 rubygems_version: 3.3.3
-signing_key:
+signing_key:
 specification_version: 4
 summary: Interact with Ollama API to run open source AI models locally.
 test_files: []