gen-ai 0.0.1 → 0.2.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: e8b3a8ca11b07cadcaf16a1fce985173cd14982a2f575189bbdc631f2306d745
-  data.tar.gz: f64ecb2bcf5ef48bd66598261db1005bbffcba45596c6339c35ae51f1b257232
+  metadata.gz: 20e16166f4ff21fb5130cf198be040b3d89019ba18cebd55e25f02720b32de62
+  data.tar.gz: 0eb3c0b7aace3f1dc97cc6f28635b45dae67aac3304d9fc9926a2f6a517d42cf
 SHA512:
-  metadata.gz: 105c83eee425a422b64bb257dc9fc20c5f6619daf3dd12165ac03e585cf6ffa0752dd3b45118bf1330db57cfa8a24df025e6de9c2e9cf9d2360d8decc4440878
-  data.tar.gz: 166c15562a30f0af948a18ad4773d4704c594d9a28e1aaab948338867a634c42d4937596114c13cdbaefe8efc7901125e71484c3743a865d0d204fb05306cc24
+  metadata.gz: '02380f33e49b058b40c62c07b2470d5cebb36e8301f4d985077fbd27c9279948cf981227b6f8293bcac74a7b455f17e0ce64310601afcc7cfcd4bf81e3293fee'
+  data.tar.gz: 51e6bd0e3950fa62ae26f7f4103fbdd457e262212a840a9dab3abf84ed8337a6f50b60f011fbc717b891b4ac78483cbc60224784f4dfc78cdc7b938ac29f22a6
data/.rubocop.yml ADDED
@@ -0,0 +1,38 @@
+AllCops:
+  NewCops: enable
+  TargetRubyVersion: 2.7
+
+Documentation:
+  Enabled: false
+
+Gemspec/DevelopmentDependencies:
+  Enabled: false
+
+Lint/MissingSuper:
+  Enabled: false
+
+Lint/NonDeterministicRequireOrder:
+  Enabled: false
+
+Lint/UnusedMethodArgument:
+  Enabled: false
+
+Layout/FirstArrayElementIndentation:
+  Enabled: false
+
+Layout/FirstHashElementIndentation:
+  Enabled: false
+
+Layout/SpaceInsideHashLiteralBraces:
+  Enabled: false
+
+Layout/ArgumentAlignment:
+  Enabled: false
+
+Layout/LineLength:
+  Exclude:
+    - spec/**/*.rb
+
+Metrics/BlockLength:
+  Exclude:
+    - spec/**/*.rb
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2023 Alex Chaplinsky
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
data/README.md CHANGED
@@ -1,24 +1,160 @@
-# Gen::Ai
+# GenAI
 
-TODO: Delete this and the text below, and describe your gem
+Generative AI toolset for Ruby
 
-Welcome to your new gem! In this directory, you'll find the files you need to be able to package up your Ruby library into a gem. Put your Ruby code in the file `lib/gen/ai`. To experiment with that code, run `bin/console` for an interactive prompt.
+GenAI allows you to easily integrate Generative AI model providers like OpenAI, Google Vertex AI, Stability AI, etc. Easily add Large Language Models, Stable Diffusion image generation, and other AI models into your application!
 
-## Installation
+![Tests](https://github.com/alchaplinsky/gen-ai/actions/workflows/main.yml/badge.svg?branch=main)
+[![Gem Version](https://badge.fury.io/rb/gen-ai.svg)](https://badge.fury.io/rb/gen-ai)
+[![License](https://img.shields.io/badge/license-MIT-green.svg)](https://github.com/alchaplinsky/gen-ai/blob/main/LICENSE.txt)
 
-TODO: Replace `UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG` with your gem name right after releasing it to RubyGems.org. Please do not do it earlier due to security reasons. Alternatively, replace this section with instructions to install your gem from git if you don't plan to release to RubyGems.org.
+## Installation
 
 Install the gem and add to the application's Gemfile by executing:
 
-    $ bundle add UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG
+    $ bundle add gen-ai
 
 If bundler is not being used to manage dependencies, install the gem by executing:
 
-    $ gem install UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG
+    $ gem install gen-ai
 
 ## Usage
 
-TODO: Write usage instructions here
+### Feature support
+
+✅ - Supported | ❌ - Not supported | 🛠️ - Work in progress
+
+Language model capabilities
+
+| Provider         | Embedding | Completion | Conversation | Sentiment | Summarization |
+| ---------------- | :-------: | :--------: | :----------: | :-------: | :-----------: |
+| **OpenAI**       | ✅        | ✅         | ✅           | 🛠️        | 🛠️            |
+| **Google Palm2** | ✅        | ✅         | ✅           | 🛠️        | 🛠️            |
+
+Image generation model capabilities
+
+| Provider        | Generate | Variations | Edit | Upscale |
+| --------------- | :------: | :--------: | :--: | :-----: |
+| **OpenAI**      | ✅       | ✅         | ✅   | ❌      |
+| **StabilityAI** | ✅       | ❌         | ✅   | 🛠️      |
+
+### Language
+
+Instantiate a language model client by passing a provider name and an API token.
+
+```ruby
+model = GenAI::Language.new(:open_ai, ENV['OPEN_AI_TOKEN'])
+```
+
+Generate **embedding(s)** for text using the provider/model that fits your needs
+
+```ruby
+result = model.embed('Hi, how are you?')
+# => #<GenAI::Result:0x0000000110be6f20...>
+
+result.value
+# => [-0.013577374, 0.0021624255, 0.0019274801, ...]
+
+result = model.embed(['Hello', 'Bonjour', 'Cześć'])
+# => #<GenAI::Result:0x0000000110be6f34...>
+
+result.values
+# => [[-0.021834826, -0.007176527, -0.02836839, ...], [...], [...]]
+```
+
+Generate text **completions** using Large Language Models
+
+```ruby
+result = model.complete('London is a ', temperature: 0, max_tokens: 11)
+# => #<GenAI::Result:0x0000000110be6d21...>
+
+result.value
+# => "vibrant and diverse city located in the United Kingdom"
+
+result = model.complete('London is a ', max_tokens: 12, n: 2)
+# => #<GenAI::Result:0x0000000110c25c70...>
+
+result.values
+# => ["thriving, bustling city known for its rich history.", "major global city and the capital of the United Kingdom."]
+```
+
+Have a **conversation** with a Large Language Model.
+
+```ruby
+result = model.chat('Hi, how are you?')
+# => #<GenAI::Result:0x0000000106ff3d20...>
+
+result.value
+# => "Hello! I'm an AI, so I don't have feelings, but I'm here to help. How can I assist you today?"
+
+history = [
+  { role: 'user', content: 'What is the capital of Great Britain?' },
+  { role: 'assistant', content: 'London' }
+]
+
+result = model.chat('what about France?', history: history)
+# => #<GenAI::Result:0x00000001033c3bc0...>
+
+result.value
+# => "Paris"
+```
+
+### Image
+
+Instantiate an image generation model client by passing a provider name and an API token.
+
+```ruby
+model = GenAI::Image.new(:open_ai, ENV['OPEN_AI_TOKEN'])
+```
+
+Generate **image(s)** using the provider/model that fits your needs
+
+```ruby
+result = model.generate('A painting of a dog')
+# => #<GenAI::Result:0x0000000110be6f20...>
+
+result.value
+# => Base64 encoded image
+
+# Save image to file
+File.open('dog.jpg', 'wb') do |f|
+  f.write(Base64.decode64(result.value))
+end
+```
+
+Get more **variations** of the same image
+
+```ruby
+result = model.variations('./dog.jpg')
+# => #<GenAI::Result:0x0000000116a1ec50...>
+
+result.value
+# => Base64 encoded image
+
+# Save image to file
+File.open('dog_variation.jpg', 'wb') do |f|
+  f.write(Base64.decode64(result.value))
+end
+```
+
+**Edit** existing images with an additional prompt
+
+```ruby
+result = model.edit('./llama.jpg', 'A cute llama wearing a beret', mask: './mask.png')
+# => #<GenAI::Result:0x0000000116a1ec50...>
+
+result.value
+# => Base64 encoded image
+
+# Save image to file
+File.open('llama_edited.jpg', 'wb') do |f|
+  f.write(Base64.decode64(result.value))
+end
+```
 
 ## Development
 
@@ -28,8 +164,8 @@ To install this gem onto your local machine, run `bundle exec rake install`. To
 
 ## Contributing
 
-Bug reports and pull requests are welcome on GitHub at https://github.com/[USERNAME]/gen-ai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/[USERNAME]/gen-ai/blob/main/CODE_OF_CONDUCT.md).
+Bug reports and pull requests are welcome on GitHub at https://github.com/alchaplinsky/gen-ai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/alchaplinsky/gen-ai/blob/main/CODE_OF_CONDUCT.md).
 
 ## Code of Conduct
 
-Everyone interacting in the Gen::Ai project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/[USERNAME]/gen-ai/blob/main/CODE_OF_CONDUCT.md).
+Everyone interacting in the GenAI project's codebases, issue trackers, chat rooms, and mailing lists is expected to follow the [code of conduct](https://github.com/alchaplinsky/gen-ai/blob/main/CODE_OF_CONDUCT.md).
data/Rakefile CHANGED
@@ -1,7 +1,7 @@
 # frozen_string_literal: true
 
-require "bundler/gem_tasks"
-require "rspec/core/rake_task"
+require 'bundler/gem_tasks'
+require 'rspec/core/rake_task'
 
 RSpec::Core::RakeTask.new(:spec)
 
data/gen-ai.gemspec CHANGED
@@ -1,23 +1,23 @@
 # frozen_string_literal: true
 
-require_relative "lib/version"
+require_relative 'lib/gen_ai/version'
 
 Gem::Specification.new do |spec|
-  spec.name = "gen-ai"
+  spec.name = 'gen-ai'
   spec.version = GenAI::VERSION
-  spec.authors = ["Alex Chaplinsky"]
-  spec.email = ["alchaplinsky@gmail.com"]
+  spec.authors = ['Alex Chaplinsky']
+  spec.email = ['alchaplinsky@gmail.com']
 
-  spec.summary = "Generative AI toolset for Ruby."
-  spec.description = "Generative AI toolset for Ruby."
-  spec.homepage = "https://github.com/alchaplinsky/gen-ai"
-  spec.required_ruby_version = ">= 2.6.0"
+  spec.summary = 'Generative AI toolset for Ruby.'
+  spec.description = 'Generative AI toolset for Ruby.'
+  spec.homepage = 'https://github.com/alchaplinsky/gen-ai'
+  spec.required_ruby_version = '>= 2.7.0'
 
   # spec.metadata["allowed_push_host"] = "TODO: Set to your gem server 'https://example.com'"
 
-  spec.metadata["homepage_uri"] = spec.homepage
-  spec.metadata["source_code_uri"] = "https://github.com/alchaplinsky/gen-ai"
-  spec.metadata["changelog_uri"] = "https://github.com/alchaplinsky/gen-ai/blob/master/CHANGELOG.md"
+  spec.metadata['homepage_uri'] = spec.homepage
+  spec.metadata['source_code_uri'] = 'https://github.com/alchaplinsky/gen-ai'
+  spec.metadata['changelog_uri'] = 'https://github.com/alchaplinsky/gen-ai/blob/master/CHANGELOG.md'
 
   # Specify which files should be added to the gem when it is released.
   # The `git ls-files -z` loads the files in the RubyGem that have been added into git.
@@ -27,13 +27,18 @@ Gem::Specification.new do |spec|
       f.start_with?(*%w[bin/ test/ spec/ features/ .git .circleci appveyor Gemfile])
     end
   end
-  spec.bindir = "exe"
+  spec.bindir = 'exe'
   spec.executables = spec.files.grep(%r{\Aexe/}) { |f| File.basename(f) }
-  spec.require_paths = ["lib"]
+  spec.require_paths = ['lib']
 
   # Uncomment to register a new dependency of your gem
-  # spec.add_dependency "example-gem", "~> 1.0"
+  spec.add_dependency 'faraday', '~> 2.7'
+  spec.add_dependency 'faraday-multipart'
+  spec.add_dependency 'zeitwerk', '~> 2.6'
 
+  spec.add_development_dependency 'google_palm_api', '~> 0.1'
+  spec.add_development_dependency 'ruby-openai', '~> 5.1'
   # For more information and examples about making a new gem, check out our
   # guide at: https://bundler.io/guides/creating_gem.html
+  spec.metadata['rubygems_mfa_required'] = 'true'
 end
data/lib/gen_ai/api/client.rb ADDED
@@ -0,0 +1,53 @@
+# frozen_string_literal: true
+
+require 'faraday'
+require 'faraday/multipart'
+
+module GenAI
+  module Api
+    class Client
+      def initialize(url:, token:)
+        @url = url
+        @token = token
+      end
+
+      def post(path, body, options = {})
+        multipart = options.delete(:multipart) || false
+        payload = multipart ? body : JSON.generate(body)
+
+        handle_response do
+          connection(multipart: multipart).post(path, payload)
+        end
+      end
+
+      def get(path, options)
+        handle_response do
+          connection.get(path, options)
+        end
+      end
+
+      private
+
+      attr_reader :url, :token
+
+      def connection(multipart: false)
+        Faraday.new(url: url, headers: {
+          'Accept' => 'application/json',
+          'Content-Type' => multipart ? 'multipart/form-data' : 'application/json',
+          'Authorization' => "Bearer #{token}"
+        }) do |conn|
+          conn.request :multipart if multipart
+          conn.request :url_encoded
+        end
+      end
+
+      def handle_response
+        response = yield
+
+        raise GenAI::ApiError, "API error: #{JSON.parse(response.body)}" unless response.success?
+
+        JSON.parse(response.body)
+      end
+    end
+  end
+end
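The `handle_response` contract above — parse the JSON body on success, raise with the parsed error payload otherwise — can be exercised without a live HTTP call. A minimal standalone sketch, assuming a stubbed response object in place of the real Faraday response (`StubResponse` is hypothetical, not part of the gem):

```ruby
require 'json'

# Hypothetical stand-in for a Faraday response: exposes #body and #success?.
StubResponse = Struct.new(:status, :body) do
  def success?
    (200..299).cover?(status)
  end
end

# Same shape as the client's handle_response, with a plain RuntimeError
# in place of GenAI::ApiError.
def handle_response
  response = yield
  raise "API error: #{JSON.parse(response.body)}" unless response.success?

  JSON.parse(response.body)
end

ok = handle_response { StubResponse.new(200, '{"id":1}') }
# ok == { "id" => 1 }

begin
  handle_response { StubResponse.new(401, '{"message":"unauthorized"}') }
rescue RuntimeError => e
  e.message # includes the parsed error body
end
```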
data/lib/gen_ai/base.rb ADDED
@@ -0,0 +1,32 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Base
+    include GenAI::Dependency
+
+    private
+
+    attr_reader :client
+
+    def handle_errors
+      response = yield
+      return if !response || response.empty?
+
+      raise GenAI::ApiError, "#{api_provider_name} API error: #{response.dig('error', 'message')}" if response['error']
+
+      response
+    end
+
+    def provider_name
+      api_provider_name.gsub(/(.)([A-Z])/, '\1_\2').downcase
+    end
+
+    def api_provider_name
+      self.class.name.split('::').last
+    end
+
+    def build_result(model:, raw:, parsed:)
+      GenAI::Result.new(provider: provider_name.to_sym, model: model, raw: raw, values: parsed)
+    end
+  end
+end
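`provider_name` derives a snake_case identifier from the class name with a single regex: insert an underscore before each interior capital, then downcase. The conversion in isolation:

```ruby
# Same transformation as GenAI::Base#provider_name, extracted as a function.
def snake_case(class_name)
  class_name.gsub(/(.)([A-Z])/, '\1_\2').downcase
end

snake_case('GooglePalm')  # => "google_palm"
snake_case('OpenAI')      # => "open_ai"
snake_case('StabilityAI') # => "stability_ai"
```

Note that consecutive capitals like `AI` survive as one segment, because gsub's non-overlapping scan consumes the first capital of each match.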
data/lib/gen_ai/dependency.rb ADDED
@@ -0,0 +1,42 @@
+# frozen_string_literal: true
+
+module GenAI
+  module Dependency
+    class VersionError < StandardError; end
+
+    def depends_on(*names)
+      names.each { |name| load_dependency(name) }
+    end
+
+    private
+
+    def load_dependency(name)
+      gem(name)
+
+      return true unless defined? Bundler
+
+      gem_spec = Gem::Specification.find_by_name(name)
+      gem_requirement = dependencies.find { |gem| gem.name == gem_spec.name }.requirement
+
+      unless gem_requirement.satisfied_by?(gem_spec.version)
+        raise VersionError, version_error(gem_spec, gem_requirement)
+      end
+
+      require_gem(gem_spec)
+    end
+
+    def version_error(gem_spec, gem_requirement)
+      "'#{gem_spec.name}' gem version is #{gem_spec.version}, but your Gemfile specified #{gem_requirement}."
+    end
+
+    def require_gem(gem_spec)
+      gem_spec.full_require_paths.each do |path|
+        Dir.glob("#{path}/*.rb").each { |file| require file }
+      end
+    end
+
+    def dependencies
+      Bundler.load.dependencies
+    end
+  end
+end
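The version guard in `load_dependency` boils down to RubyGems' requirement matching, which ships with Ruby and can be checked directly:

```ruby
# Gem::Requirement / Gem::Version are part of RubyGems (loaded by default).
requirement = Gem::Requirement.new('~> 2.7')

requirement.satisfied_by?(Gem::Version.new('2.7.4')) # => true  (within ~> 2.7)
requirement.satisfied_by?(Gem::Version.new('3.0.0')) # => false (outside ~> 2.7)
```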
data/lib/gen_ai/image/base.rb ADDED
@@ -0,0 +1,15 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Image
+    class Base < GenAI::Base
+      def generate(...)
+        raise NotImplementedError, "#{self.class.name} does not support generate"
+      end
+
+      def variations(...)
+        raise NotImplementedError, "#{self.class.name} does not support variations"
+      end
+    end
+  end
+end
data/lib/gen_ai/image/open_ai.rb ADDED
@@ -0,0 +1,79 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Image
+    class OpenAI < Base
+      DEFAULT_SIZE = '256x256'
+      RESPONSE_FORMAT = 'b64_json'
+
+      def initialize(token:, options: {})
+        depends_on 'ruby-openai'
+
+        @client = ::OpenAI::Client.new(access_token: token)
+      end
+
+      def generate(prompt, options = {})
+        parameters = build_generation_options(prompt, options)
+
+        response = handle_errors { @client.images.generate(parameters: parameters) }
+
+        build_result(
+          raw: response,
+          model: 'dall-e',
+          parsed: response['data'].map { |datum| datum[RESPONSE_FORMAT] }
+        )
+      end
+
+      def variations(image, options = {})
+        parameters = build_variations_options(image, options)
+
+        response = handle_errors { @client.images.variations(parameters: parameters) }
+
+        build_result(
+          raw: response,
+          model: 'dall-e',
+          parsed: response['data'].map { |datum| datum[RESPONSE_FORMAT] }
+        )
+      end
+
+      def edit(image, prompt, options = {})
+        parameters = build_edit_options(image, prompt, options)
+
+        response = handle_errors { @client.images.edit(parameters: parameters) }
+
+        build_result(
+          raw: response,
+          model: 'dall-e',
+          parsed: response['data'].map { |datum| datum[RESPONSE_FORMAT] }
+        )
+      end
+
+      private
+
+      def build_generation_options(prompt, options)
+        {
+          prompt: prompt,
+          size: options.delete(:size) || DEFAULT_SIZE,
+          response_format: options.delete(:response_format) || RESPONSE_FORMAT
+        }.merge(options)
+      end
+
+      def build_variations_options(image, options)
+        {
+          image: image,
+          size: options.delete(:size) || DEFAULT_SIZE,
+          response_format: options.delete(:response_format) || RESPONSE_FORMAT
+        }.merge(options)
+      end
+
+      def build_edit_options(image, prompt, options)
+        {
+          image: image,
+          prompt: prompt,
+          size: options.delete(:size) || DEFAULT_SIZE,
+          response_format: options.delete(:response_format) || RESPONSE_FORMAT
+        }.merge(options)
+      end
+    end
+  end
+end
data/lib/gen_ai/image/stability_ai.rb ADDED
@@ -0,0 +1,72 @@
+# frozen_string_literal: true
+
+require 'faraday'
+
+module GenAI
+  class Image
+    class StabilityAI < Base
+      DEFAULT_SIZE = '256x256'
+      API_BASE_URL = 'https://api.stability.ai'
+      DEFAULT_MODEL = 'stable-diffusion-xl-beta-v2-2-2'
+
+      def initialize(token:, options: {})
+        build_client(token)
+      end
+
+      def generate(prompt, options = {})
+        model = options[:model] || DEFAULT_MODEL
+        url = "/v1/generation/#{model}/text-to-image"
+
+        response = client.post url, build_generation_body(prompt, options)
+
+        build_result(
+          raw: response,
+          model: model,
+          parsed: response['artifacts'].map { |artifact| artifact['base64'] }
+        )
+      end
+
+      def edit(image, prompt, options = {})
+        model = options[:model] || DEFAULT_MODEL
+        url = "/v1/generation/#{model}/image-to-image"
+
+        response = client.post url, build_edit_body(image, prompt, options), multipart: true
+
+        build_result(
+          raw: response,
+          model: model,
+          parsed: response['artifacts'].map { |artifact| artifact['base64'] }
+        )
+      end
+
+      private
+
+      def build_client(token)
+        @client = GenAI::Api::Client.new(url: API_BASE_URL, token: token)
+      end
+
+      def build_generation_body(prompt, options)
+        w, h = size(options)
+
+        {
+          text_prompts: [{ text: prompt }],
+          height: h,
+          width: w
+        }.merge(options)
+      end
+
+      def build_edit_body(image, prompt, options)
+        {
+          init_image: File.binread(image),
+          'text_prompts[0][text]' => prompt,
+          'text_prompts[0][weight]' => 1.0
+        }.merge(options)
+      end
+
+      def size(options)
+        size = options.delete(:size) || DEFAULT_SIZE
+        size.split('x').map(&:to_i)
+      end
+    end
+  end
+end
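The `size` helper turns an OpenAI-style `'WxH'` string into the separate width/height integers the Stability endpoint expects. The parsing in isolation:

```ruby
# Same parsing as StabilityAI#size, without the options-hash plumbing.
def parse_size(size_string)
  size_string.split('x').map(&:to_i)
end

width, height = parse_size('512x768')
# width == 512, height == 768
```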
data/lib/gen_ai/image.rb ADDED
@@ -0,0 +1,37 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Image
+    def initialize(provider, token, options: {})
+      build_client(provider, token, options)
+    end
+
+    def generate(prompt, options = {})
+      client.generate(prompt, options)
+    end
+
+    def variations(image, options = {})
+      client.variations(image, options)
+    end
+
+    def edit(image, prompt, options = {})
+      client.edit(image, prompt, options)
+    end
+
+    # def upscale; end
+
+    private
+
+    attr_reader :client
+
+    def build_client(provider, token, options)
+      klass = GenAI::Image.constants.find do |const|
+        const.to_s.downcase == provider.to_s.downcase.gsub('_', '')
+      end
+
+      raise UnsupportedProvider, "Unsupported Image provider '#{provider}'" unless klass
+
+      @client = GenAI::Image.const_get(klass).new(token: token, options: options)
+    end
+  end
+end
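The constant lookup in `build_client` maps a snake_case provider symbol onto a class by stripping underscores and comparing case-insensitively — the same scheme `GenAI::Language` uses. A standalone sketch (the `Providers` module here is illustrative, not part of the gem):

```ruby
module Providers
  class OpenAI; end
  class StabilityAI; end
end

# :open_ai → "openai" matches OpenAI → "openai"; no case/underscore table needed.
def resolve_provider(provider)
  klass = Providers.constants.find do |const|
    const.to_s.downcase == provider.to_s.downcase.gsub('_', '')
  end
  raise ArgumentError, "Unsupported provider '#{provider}'" unless klass

  Providers.const_get(klass)
end

resolve_provider(:open_ai)      # => Providers::OpenAI
resolve_provider(:stability_ai) # => Providers::StabilityAI
```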
data/lib/gen_ai/language/base.rb ADDED
@@ -0,0 +1,21 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Language
+    class Base < GenAI::Base
+      DEFAULT_ROLE = 'user'
+
+      def embed(...)
+        raise NotImplementedError, "#{self.class.name} does not support embedding"
+      end
+
+      def complete(...)
+        raise NotImplementedError, "#{self.class.name} does not support completion"
+      end
+
+      def chat(...)
+        raise NotImplementedError, "#{self.class.name} does not support conversations"
+      end
+    end
+  end
+end
data/lib/gen_ai/language/google_palm.rb ADDED
@@ -0,0 +1,103 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Language
+    class GooglePalm < Base
+      DEFAULT_ROLE = '0'
+      EMBEDDING_MODEL = 'textembedding-gecko-001'
+      COMPLETION_MODEL = 'text-bison-001'
+      CHAT_COMPLETION_MODEL = 'chat-bison-001'
+
+      def initialize(token:, options: {})
+        depends_on 'google_palm_api'
+
+        @client = ::GooglePalmApi::Client.new(api_key: token)
+      end
+
+      def embed(input, model: nil)
+        responses = array_wrap(input).map do |text|
+          handle_errors { client.embed(text: text, model: model) }
+        end
+
+        build_result(
+          model: model || EMBEDDING_MODEL,
+          raw: { 'data' => responses, 'usage' => {} },
+          parsed: extract_embeddings(responses)
+        )
+      end
+
+      def complete(prompt, options: {})
+        parameters = build_completion_options(prompt, options)
+
+        response = handle_errors { client.generate_text(**parameters) }
+
+        build_result(
+          model: parameters[:model],
+          raw: response.merge('usage' => {}),
+          parsed: extract_completions(response)
+        )
+      end
+
+      def chat(message, context: nil, history: [], examples: [], options: {})
+        parameters = build_chat_options(message, context, history, examples, options)
+
+        response = handle_errors { client.generate_chat_message(**parameters) }
+
+        build_result(
+          model: parameters[:model],
+          raw: response.merge('usage' => {}),
+          parsed: extract_chat_messages(response)
+        )
+      end
+
+      private
+
+      def build_chat_options(message, context, history, examples, options)
+        {
+          model: options.delete(:model) || CHAT_COMPLETION_MODEL,
+          messages: history.append({ author: DEFAULT_ROLE, content: message }),
+          examples: compose_examples(examples),
+          context: context
+        }.merge(options)
+      end
+
+      def build_completion_options(prompt, options)
+        {
+          prompt: prompt,
+          model: options.delete(:model) || COMPLETION_MODEL
+        }.merge(options)
+      end
+
+      def compose_examples(examples)
+        examples.each_slice(2).map do |example|
+          {
+            input: { content: symbolize(example.first)[:content] },
+            output: { content: symbolize(example.last)[:content] }
+          }
+        end
+      end
+
+      def symbolize(hash)
+        hash.transform_keys(&:to_sym)
+      end
+
+      def array_wrap(object)
+        return [] if object.nil?
+
+        object.respond_to?(:to_ary) ? object.to_ary || [object] : [object]
+      end
+
+      def extract_embeddings(responses)
+        responses.map { |response| response.dig('embedding', 'value') }
+      end
+
+      def extract_completions(response)
+        response['candidates'].map { |candidate| candidate['output'] }
+      end
+
+      def extract_chat_messages(response)
+        response['candidates'].map { |candidate| candidate['content'] }
+      end
+    end
+  end
+end
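Two helpers above are easy to sanity-check in isolation: `array_wrap` normalizes a single string or a list of strings into an array (so `embed` accepts either), and `compose_examples` pairs up a flat user/assistant list with `each_slice(2)`:

```ruby
# Same logic as GooglePalm#array_wrap: nil → [], scalar → [scalar],
# array-like → itself.
def array_wrap(object)
  return [] if object.nil?

  object.respond_to?(:to_ary) ? object.to_ary || [object] : [object]
end

array_wrap(nil)               # => []
array_wrap('Hello')           # => ["Hello"]
array_wrap(%w[Hello Bonjour]) # => ["Hello", "Bonjour"]

# A flat [user, assistant, user, assistant, ...] list becomes PaLM's
# input/output example hashes via each_slice(2).
examples = [{ content: 'Hi' }, { content: 'Hello there' }]
pairs = examples.each_slice(2).map do |input, output|
  { input: { content: input[:content] }, output: { content: output[:content] } }
end
# pairs == [{ input: { content: 'Hi' }, output: { content: 'Hello there' } }]
```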
data/lib/gen_ai/language/open_ai.rb ADDED
@@ -0,0 +1,72 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Language
+    class OpenAI < Base
+      EMBEDDING_MODEL = 'text-embedding-ada-002'
+      COMPLETION_MODEL = 'gpt-3.5-turbo'
+
+      def initialize(token:, options: {})
+        depends_on 'ruby-openai'
+
+        @client = ::OpenAI::Client.new(access_token: token)
+      end
+
+      def embed(input, model: nil)
+        parameters = { input: input, model: model || EMBEDDING_MODEL }
+
+        response = handle_errors { client.embeddings(parameters: parameters) }
+
+        build_result(model: parameters[:model], raw: response, parsed: extract_embeddings(response))
+      end
+
+      def complete(prompt, options: {})
+        parameters = build_completion_options(prompt, options)
+
+        response = handle_errors { client.chat(parameters: parameters) }
+
+        build_result(model: parameters[:model], raw: response, parsed: extract_completions(response))
+      end
+
+      def chat(message, context: nil, history: [], examples: [], options: {})
+        parameters = build_chat_options(message, context, history, examples, options)
+
+        response = handle_errors { client.chat(parameters: parameters) }
+
+        build_result(model: parameters[:model], raw: response, parsed: extract_completions(response))
+      end
+
+      private
+
+      def build_chat_options(message, context, history, examples, options)
+        messages = []
+        messages.concat(examples)
+        messages.concat(history)
+
+        messages.prepend({ role: 'system', content: context }) if context
+
+        messages.append({ role: DEFAULT_ROLE, content: message })
+
+        {
+          messages: messages,
+          model: options.delete(:model) || COMPLETION_MODEL
+        }.merge(options)
+      end
+
+      def build_completion_options(prompt, options)
+        {
+          messages: [{ role: DEFAULT_ROLE, content: prompt }],
+          model: options.delete(:model) || COMPLETION_MODEL
+        }.merge(options)
+      end
+
+      def extract_embeddings(response)
+        response['data'].map { |datum| datum['embedding'] }
+      end
+
+      def extract_completions(response)
+        response['choices'].map { |choice| choice.dig('message', 'content') }
+      end
+    end
+  end
+end
data/lib/gen_ai/language.rb ADDED
@@ -0,0 +1,47 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Language
+    def initialize(provider, token, options: {})
+      build_llm(provider, token, options)
+    end
+
+    def embed(text, model: nil)
+      llm.embed(text, model: model)
+    end
+
+    def complete(prompt, options = {})
+      llm.complete(prompt, options: options)
+    end
+
+    def chat(message, context: nil, history: [], examples: [], **options)
+      llm.chat(message, context: context, history: history, examples: examples, options: options)
+    end
+
+    # def answer(prompt); end
+
+    # def sentiment(text); end
+
+    # def keywords(text); end
+
+    # def summarization(text); end
+
+    # def translation(text, _target:); end
+
+    # def correction(text); end
+
+    private
+
+    attr_reader :llm
+
+    def build_llm(provider, token, options)
+      klass = GenAI::Language.constants.find do |const|
+        const.to_s.downcase == provider.to_s.downcase.gsub('_', '')
+      end
+
+      raise UnsupportedProvider, "Unsupported LLM provider '#{provider}'" unless klass
+
+      @llm = GenAI::Language.const_get(klass).new(token: token, options: options)
+    end
+  end
+end
data/lib/gen_ai/result.rb ADDED
@@ -0,0 +1,43 @@
+# frozen_string_literal: true
+
+module GenAI
+  class Result
+    attr_reader :raw, :provider, :model, :values
+
+    def initialize(provider:, model:, raw:, values:)
+      @raw = raw
+      @provider = provider
+      @model = model
+      @values = values
+    end
+
+    def value
+      values.first
+    end
+
+    def prompt_tokens
+      usage['prompt_tokens']
+    end
+
+    def completion_tokens
+      usage['completion_tokens'] ||
+        (total_tokens.to_i - prompt_tokens.to_i if total_tokens && prompt_tokens)
+    end
+
+    def total_tokens
+      usage['total_tokens']
+    end
+
+    private
+
+    def usage
+      raw['usage'] || {
+        'prompt_tokens' => nil,
+        'completion_tokens' => nil,
+        'total_tokens' => nil
+      }
+    end
+  end
+end
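The token helpers read from the raw response's `usage` hash; when a provider reports only prompt and total counts, the completion count falls back to subtraction. The accounting in isolation:

```ruby
# A usage hash as a provider might report it, minus the completion count —
# the derived value matches Result#completion_tokens' fallback path.
usage = { 'prompt_tokens' => 9, 'total_tokens' => 30 }

completion_tokens =
  usage['completion_tokens'] ||
  (usage['total_tokens'].to_i - usage['prompt_tokens'].to_i)
# completion_tokens == 21
```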
data/lib/gen_ai/version.rb ADDED
@@ -0,0 +1,5 @@
+# frozen_string_literal: true
+
+module GenAI
+  VERSION = '0.2.0'
+end
data/lib/gen_ai.rb CHANGED
@@ -1,13 +1,18 @@
 # frozen_string_literal: true
 
-require_relative "version"
+require 'zeitwerk'
+require 'pry'
+
+loader = Zeitwerk::Loader.for_gem
+loader.inflector.inflect(
+  'gen_ai' => 'GenAI',
+  'open_ai' => 'OpenAI',
+  'stability_ai' => 'StabilityAI'
+)
+loader.setup
 
 module GenAI
   class Error < StandardError; end
-  class UnsupportedConfiguration < Error; end
-  # Your code goes here...
+  class ApiError < Error; end
+  class UnsupportedProvider < Error; end
 end
-
-require_relative "language/google_palm"
-require_relative "language/open_ai"
-require_relative "language"
metadata CHANGED
@@ -1,15 +1,85 @@
 --- !ruby/object:Gem::Specification
 name: gen-ai
 version: !ruby/object:Gem::Version
-  version: 0.0.1
+  version: 0.2.0
 platform: ruby
 authors:
 - Alex Chaplinsky
 autorequire:
 bindir: exe
 cert_chain: []
-date: 2023-10-14 00:00:00.000000000 Z
-dependencies: []
+date: 2023-10-22 00:00:00.000000000 Z
+dependencies:
+- !ruby/object:Gem::Dependency
+  name: faraday
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.7'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.7'
+- !ruby/object:Gem::Dependency
+  name: faraday-multipart
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+- !ruby/object:Gem::Dependency
+  name: zeitwerk
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.6'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.6'
+- !ruby/object:Gem::Dependency
+  name: google_palm_api
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.1'
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.1'
+- !ruby/object:Gem::Dependency
+  name: ruby-openai
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '5.1'
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '5.1'
 description: Generative AI toolset for Ruby.
 email:
 - alchaplinsky@gmail.com
@@ -18,16 +88,27 @@ extensions: []
 extra_rdoc_files: []
 files:
 - ".rspec"
+- ".rubocop.yml"
 - CHANGELOG.md
 - CODE_OF_CONDUCT.md
+- LICENSE
 - README.md
 - Rakefile
 - gen-ai.gemspec
 - lib/gen_ai.rb
-- lib/language.rb
-- lib/language/google_palm.rb
-- lib/language/open_ai.rb
-- lib/version.rb
+- lib/gen_ai/api/client.rb
+- lib/gen_ai/base.rb
+- lib/gen_ai/dependency.rb
+- lib/gen_ai/image.rb
+- lib/gen_ai/image/base.rb
+- lib/gen_ai/image/open_ai.rb
+- lib/gen_ai/image/stability_ai.rb
+- lib/gen_ai/language.rb
+- lib/gen_ai/language/base.rb
+- lib/gen_ai/language/google_palm.rb
+- lib/gen_ai/language/open_ai.rb
+- lib/gen_ai/result.rb
+- lib/gen_ai/version.rb
 - sig/gen_ai.rbs
 homepage: https://github.com/alchaplinsky/gen-ai
 licenses: []
@@ -35,6 +116,7 @@ metadata:
   homepage_uri: https://github.com/alchaplinsky/gen-ai
   source_code_uri: https://github.com/alchaplinsky/gen-ai
   changelog_uri: https://github.com/alchaplinsky/gen-ai/blob/master/CHANGELOG.md
+  rubygems_mfa_required: 'true'
 post_install_message:
 rdoc_options: []
 require_paths:
@@ -43,14 +125,14 @@ required_ruby_version: !ruby/object:Gem::Requirement
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 2.6.0
+      version: 2.7.0
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
       version: '0'
 requirements: []
-rubygems_version: 3.2.33
+rubygems_version: 3.3.7
 signing_key:
 specification_version: 4
 summary: Generative AI toolset for Ruby.
data/lib/language/google_palm.rb DELETED
@@ -1,9 +0,0 @@
-module GenAI
-  class Language
-    class GooglePalm
-      def initialize(token:, options: {})
-        @token = token
-      end
-    end
-  end
-end
data/lib/language/open_ai.rb DELETED
@@ -1,9 +0,0 @@
-module GenAI
-  class Language
-    class OpenAI
-      def initialize(token:, options: {})
-        @token = token
-      end
-    end
-  end
-end
data/lib/language.rb DELETED
@@ -1,48 +0,0 @@
-module GenAI
-  class Language
-    def initialize(provider, token, options: {})
-      @provider = provider
-      @token = token
-    end
-
-    def answer(question, context: {})
-      return 'Yes, it is.'
-    end
-
-    def completion(prompt, options: {})
-    end
-
-    def conversation(prompt, options: {})
-    end
-
-    def embedding(text)
-    end
-
-    def sentiment(text)
-    end
-
-    def keywords(text)
-    end
-
-    def summarization(text)
-    end
-
-    def translation(text, target:)
-    end
-
-    def correction(text)
-    end
-
-
-    def llm
-      case @provider
-      when :openai
-        GenAI::Language::OpenAI.new(token: @token)
-      when :google_palm
-        GenAI::Language::GooglePalm.new(token: @token)
-      else
-        raise UnsupportedConfiguration.new "Unknown LLM provider"
-      end
-    end
-  end
-end
data/lib/version.rb DELETED
@@ -1,3 +0,0 @@
-module GenAI
-  VERSION = "0.0.1"
-end