omniai-mistral 0.2.0 → 1.0.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-  metadata.gz: ecfcd380d66660c2b8cb71d7cec580dd10c9df8030b25c9ee60e78de5181c173
-  data.tar.gz: a2230cb09db232a7024acd6677e206e47e345e3daa07081b0c0e2acde3622d96
+  metadata.gz: 289a9603e7ba3e918348526995497971d87dcf49adba10092f8c30776b112974
+  data.tar.gz: 958d329ccd5ad40493b335adb4cc82f63cfa04887d2c7cd73864176a671dc00f
  SHA512:
-  metadata.gz: 34442af7ff4d2c376e6959d86d5f43b7a02bb57397882f1ec834b2f1c40448458573560abec4e21ac8c79b1c42646d073aac058f5be7ad39332cf74db96c1855
-  data.tar.gz: 446b60f9e02e37c822bf5d9728db0059a9926485afb961cafc6ff4ae79938baf5e6a10de6e371a4c687ee89956258e78a0f679220e68cafc5e110e6d0fb163b3
+  metadata.gz: ed46c824ac8094bb7a32e454681205cc6cd3d83c7a075359df8a9ea5efbf8ea01dda8f8650afea79145cbd15a61520d3dd24c5fdeeb0e6bdf5aa52e89d717183
+  data.tar.gz: 60ae3047124bd0455cb5a937d55e782e09d43fe739d2014571ad087f65dd68e100c7c94850410074ae666398e30b25b51aa3e56f4f90a781816088dfad0108b5
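The published digests above can be reproduced locally before trusting a downloaded artifact. A minimal sketch using Ruby's stdlib `Digest` (the input below is a known test vector, not the actual gem contents):

```ruby
require 'digest'

# Compute the SHA256 / SHA512 digests a registry publishes for an artifact.
# In practice the input would be the bytes of the downloaded .gem file.
def checksums(data)
  {
    sha256: Digest::SHA256.hexdigest(data),
    sha512: Digest::SHA512.hexdigest(data),
  }
end

# Known vector: SHA256('hello') begins with '2cf24dba'.
digests = checksums('hello')
```

A mismatch between the computed and published digest indicates a corrupted or tampered artifact.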
data/Gemfile CHANGED
@@ -11,5 +11,4 @@ gem 'rspec_junit_formatter'
  gem 'rubocop'
  gem 'rubocop-rake'
  gem 'rubocop-rspec'
- gem 'vcr'
  gem 'webmock'
data/README.md CHANGED
@@ -39,22 +39,22 @@ end
  A chat completion is generated by passing in prompts using any a variety of formats:
 
  ```ruby
- completion = client.chat.completion('Tell me a joke!')
+ completion = client.chat('Tell me a joke!')
  completion.choice.message.content # 'Why did the chicken cross the road? To get to the other side.'
  ```
 
  ```ruby
- completion = client.chat.completion({
-   role: OmniAI::Mistral::Chat::Role::USER,
+ completion = client.chat({
+   role: OmniAI::Chat::Role::USER,
    content: 'Is it wise to jump off a bridge?'
  })
  completion.choice.message.content # 'No.'
  ```
 
  ```ruby
- completion = client.chat.completion([
+ completion = client.chat([
    {
-     role: OmniAI::Mistral::Chat::Role::SYSTEM,
+     role: OmniAI::Chat::Role::SYSTEM,
      content: 'You are a helpful assistant.'
    },
    'What is the capital of Canada?',
@@ -67,19 +67,23 @@ completion.choice.message.content # 'The capital of Canada is Ottawa.'
  `model` takes an optional string (default is `mistral-medium-latest`):
 
  ```ruby
- completion = client.chat.completion('Provide code for fibonacci', model: OmniAI::Mistral::Chat::Model::CODESTRAL)
+ completion = client.chat('Provide code for fibonacci', model: OmniAI::Mistral::Chat::Model::CODESTRAL)
  completion.choice.message.content # 'def fibonacci(n)...end'
  ```
 
+ [Mistral API Reference `model`](https://docs.mistral.ai/getting-started/models/)
+
  #### Temperature
 
  `temperature` takes an optional float between `0.0` and `1.0` (defaults is `0.7`):
 
  ```ruby
- completion = client.chat.completion('Pick a number between 1 and 5', temperature: 1.0)
+ completion = client.chat('Pick a number between 1 and 5', temperature: 1.0)
  completion.choice.message.content # '3'
  ```
 
+ [Mistral API Reference `temperature`](https://docs.mistral.ai/api/)
+
  #### Stream
 
  `stream` takes an optional a proc to stream responses in real-time chunks instead of waiting for a complete response:
@@ -88,14 +92,23 @@ completion.choice.message.content # '3'
  stream = proc do |chunk|
    print(chunk.choice.delta.content) # 'Better', 'three', 'hours', ...
  end
- client.chat.completion('Be poetic.', stream:)
+ client.chat('Be poetic.', stream:)
  ```
 
+ [Mistral API Reference `stream`](https://docs.mistral.ai/api/)
+
  #### Format
 
- `format` takes an optional symbol (i.e. `:json`) and switches the client to parsing message content as `json`:
+ `format` takes an optional symbol (`:json`) and that sets the `response_format` to `json_object`:
 
  ```ruby
- completion = client.chat.completion('Please provide a color name / hex / hsl as JSON.', format: :json)
- completion.choice.message.content # { "name": "Black", "hex": "#000", "hsl": { "h": 0, "s": 0, "l": 0 } }
+ completion = client.chat([
+   { role: OmniAI::Chat::Role::SYSTEM, content: OmniAI::Chat::JSON_PROMPT },
+   { role: OmniAI::Chat::Role::USER, content: 'What is the name of the drummer for the Beatles?' }
+ ], format: :json)
+ JSON.parse(completion.choice.message.content) # { "name": "Ringo" }
  ```
+
+ [Mistral API Reference `response_format`](https://docs.mistral.ai/api/)
+
+ > When using JSON mode you MUST also instruct the model to produce JSON yourself with a system or a user message.
data/lib/omniai/mistral/chat.rb CHANGED
@@ -25,18 +25,24 @@ module OmniAI
      SYSTEM = 'system'
    end
 
-   # @raise [OmniAI::Error]
-   #
-   # @param prompt [String]
-   # @param model [String] optional
-   # @param format [Symbol] optional :text or :json
-   # @param temperature [Float, nil] optional
-   # @param stream [Proc, nil] optional
-   #
-   # @return [OmniAI::Mistral::Chat::Response]
-   def completion(messages, model: Model::MEDIUM, temperature: nil, format: nil, stream: nil)
-     request = OmniAI::Mistral::Chat::Request.new(client: @client, messages:, model:, temperature:, format:, stream:)
-     request.process!
+   JSON_RESPONSE_FORMAT = { type: 'json_object' }.freeze
+
+   protected
+
+   # @return [Hash]
+   def payload
+     OmniAI::Mistral.config.chat_options.merge({
+       messages:,
+       model: @model,
+       stream: @stream.nil? ? nil : !@stream.nil?,
+       temperature: @temperature,
+       response_format: (JSON_RESPONSE_FORMAT if @format.eql?(:json)),
+     }).compact
+   end
+
+   # @return [String]
+   def path
+     "/#{OmniAI::Mistral::Client::VERSION}/chat/completions"
    end
  end
  end
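The new `payload` merges globally configured `chat_options` with per-request values and strips `nil`s via `compact`. A standalone sketch of that merge-and-compact behavior, with the surrounding class reduced to a plain method (`safe_prompt` is a hypothetical global option, not confirmed by the diff):

```ruby
# Mirrors the payload construction above: global defaults are merged with
# per-request values, then nil entries are dropped via Hash#compact.
def build_payload(chat_options, messages:, model:, temperature: nil, format: nil)
  chat_options.merge({
    messages: messages,
    model: model,
    temperature: temperature,
    response_format: (format == :json ? { type: 'json_object' } : nil),
  }).compact
end

payload = build_payload(
  { safe_prompt: true }, # hypothetical default set via config.chat_options
  messages: [{ role: 'user', content: 'Hi' }],
  model: 'mistral-medium-latest',
  format: :json
)
# temperature was nil, so compact omits the key entirely from the request body.
```

Omitting `nil` keys (rather than sending `temperature: null`) lets the API apply its own defaults.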
data/lib/omniai/mistral/client.rb CHANGED
@@ -41,9 +41,17 @@ module OmniAI
      @connection ||= HTTP.auth("Bearer #{api_key}").persistent(@host)
    end
 
-   # @return [OmniAI::Mistral::Chat]
-   def chat
-     Chat.new(client: self)
+   # @raise [OmniAI::Error]
+   #
+   # @param messages [String, Array, Hash]
+   # @param model [String] optional
+   # @param format [Symbol] optional :text or :json
+   # @param temperature [Float, nil] optional
+   # @param stream [Proc, nil] optional
+   #
+   # @return [OmniAI::Chat::Completion]
+   def chat(messages, model: Chat::Model::MEDIUM, temperature: nil, format: nil, stream: nil)
+     Chat.process!(messages, model:, temperature:, format:, stream:, client: self)
    end
  end
  end
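The entry point collapses from `client.chat.completion(...)` to `client.chat(...)`, delegating to a class-level `Chat.process!`. A toy sketch of that delegation pattern (the `Fake*` classes are illustrative stand-ins, not the gem's internals):

```ruby
# A class-level .process! builds an instance and runs it in one step,
# so callers write client.chat(...) instead of client.chat.completion(...).
class FakeChat
  def self.process!(messages, client:, **options)
    new(messages, client: client, **options).process!
  end

  def initialize(messages, client:, **options)
    @messages = messages
    @client = client
    @options = options
  end

  def process!
    # A real implementation would POST to the chat/completions endpoint here.
    { messages: @messages, options: @options }
  end
end

class FakeClient
  def chat(messages, **options)
    FakeChat.process!(messages, client: self, **options)
  end
end

result = FakeClient.new.chat('Tell me a joke!', temperature: 0.7)
```

The one-shot class method keeps the request object an internal detail instead of part of the public API.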
data/lib/omniai/mistral/config.rb CHANGED
@@ -2,12 +2,15 @@
 
  module OmniAI
    module Mistral
-     # Config for the Mistral `api_key` / `host` / `logger`.
+     # Config for the Mistral `api_key` / `host` / `logger`, `chat_options`.
      class Config < OmniAI::Config
+       attr_accessor :chat_options
+
        def initialize
          super
          @api_key = ENV.fetch('MISTRAL_API_KEY', nil)
          @host = ENV.fetch('MISTRAL_HOST', 'https://api.mistral.ai')
+         @chat_options = {}
        end
      end
    end
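The new `chat_options` accessor gives users a hash of defaults that is merged into every chat payload. A simplified sketch of the config pattern above (`safe_prompt` is a hypothetical option, not confirmed by the diff):

```ruby
# Simplified version of the Config class in the diff: credentials come from
# ENV with fallbacks, and chat_options starts as an empty, user-populated hash.
class MistralConfig
  attr_accessor :api_key, :host, :chat_options

  def initialize
    @api_key = ENV.fetch('MISTRAL_API_KEY', nil)
    @host = ENV.fetch('MISTRAL_HOST', 'https://api.mistral.ai')
    @chat_options = {}
  end
end

config = MistralConfig.new
config.chat_options = { safe_prompt: true } # hypothetical gem-wide default
```

Because the payload merge starts from `chat_options`, any per-call argument with the same key takes precedence over the global default.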
data/lib/omniai/mistral/version.rb CHANGED
@@ -2,6 +2,6 @@
 
  module OmniAI
    module Mistral
-     VERSION = '0.2.0'
+     VERSION = '1.0.2'
    end
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: omniai-mistral
  version: !ruby/object:Gem::Version
-   version: 0.2.0
+   version: 1.0.2
  platform: ruby
  authors:
  - Kevin Sylvestre
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2024-06-14 00:00:00.000000000 Z
+ date: 2024-06-15 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: event_stream_parser
@@ -63,7 +63,6 @@ files:
  - README.md
  - lib/omniai/mistral.rb
  - lib/omniai/mistral/chat.rb
- - lib/omniai/mistral/chat/request.rb
  - lib/omniai/mistral/client.rb
  - lib/omniai/mistral/config.rb
  - lib/omniai/mistral/version.rb
data/lib/omniai/mistral/chat/request.rb DELETED
@@ -1,46 +0,0 @@
- # frozen_string_literal: true
-
- module OmniAI
-   module Mistral
-     class Chat
-       # An implementation of OmniAI::Chat::Request for Mistral.
-       class Request < OmniAI::Chat::Request
-         protected
-
-         # @return [Hash]
-         def payload
-           { messages: }.tap do |payload|
-             payload[:model] = @model
-             payload[:stream] = !@stream.nil? unless @stream.nil?
-             payload[:temperature] = @temperature if @temperature
-             payload[:response_format] = { type: 'json_object' } if @format.eql?(:json)
-           end
-         end
-
-         # @return [Array<Hash>]
-         def messages
-           arrayify(@messages).map do |content|
-             case content
-             when String then { role: OmniAI::Mistral::Chat::Role::USER, content: }
-             when Hash then content
-             else raise Error, "Unsupported content=#{content.inspect}"
-             end
-           end
-         end
-
-         # @return [String]
-         def path
-           "/#{OmniAI::Mistral::Client::VERSION}/chat/completions"
-         end
-
-         private
-
-         # @param value [Object, Array<Object>]
-         # @return [Array<Object>]
-         def arrayify(value)
-           value.is_a?(Array) ? value : [value]
-         end
-       end
-     end
-   end
- end
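The deleted `Request` class carried the message-normalization logic (plain strings become user-role hashes; hashes pass through; anything else raises). That logic can be reproduced as a standalone sketch:

```ruby
# Reproduction of the normalization the deleted Request class performed.
# Note Kernel#Array is avoided: Array({...}) would explode a Hash into pairs.
def arrayify(value)
  value.is_a?(Array) ? value : [value]
end

def normalize_messages(messages)
  arrayify(messages).map do |content|
    case content
    when String then { role: 'user', content: content }
    when Hash then content
    else raise ArgumentError, "Unsupported content=#{content.inspect}"
    end
  end
end

# A bare string and a role hash can be mixed in one conversation.
normalized = normalize_messages([
  { role: 'system', content: 'You are a helpful assistant.' },
  'What is the capital of Canada?'
])
```

This explains why the README examples can mix strings and hashes freely: everything is coerced into role hashes before the request is built.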