cloudflare-ai 0.2.0 → 0.4.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4a98e10b8688a9f4a0853534b104c9a05ceaa91e4405c71d4dde52fdd1953ce7
- data.tar.gz: 5921d2c627b91dfcf58375b39e9d0781f839e880d8c996146d7002f9ba48f365
+ metadata.gz: f8c78786e06d2a3cc05b177aedc99672b4be56016c33d6be2e718533b6b3f507
+ data.tar.gz: 535f23f9e6ffedbe0e090ded97459e021e57b84025456dab77d3d85dd2da194d
  SHA512:
- metadata.gz: a2c1f4389bea67083ef6e19e12620b29850bdf818e4bbce1e3dcd4e5916175fabf69f04fe33b36ae539ec43f622aeb3456f3fff34ec9f236e745fe7f853cdbdc
- data.tar.gz: c3e48f425b89449cae0bcbc480ee23c94d6494972f950070f3bb18e013711da95b45d7b6e3b2c2e7c9069ef458cc384ddf76ef3ea3a55cc61e46453bd190e189
+ metadata.gz: 6d35d01ae35a9407b97acfc0643c66c33b26d608d90373daf3c00001f86d7d2d2893162b370c417e2db7a96b43cec74d66e2652caa11ed89bbf31b968b4fd34d
+ data.tar.gz: 5af4ef84fdfc0ed649f951f18c10105b97ee4ae55b5d951f90551f7b0e5e452ca98ce408ded86980e67da05bfaf4e42321097b0b3f1cb2cfd224d67da74f8d44
data/README.md CHANGED
@@ -1,35 +1,40 @@
- cloudflare-ai: a Cloudflare Workers AI client for ruby developers
+ Cloudflare Workers AI API client for ruby
  ---
  Cloudflare is testing its [Workers AI](https://blog.cloudflare.com/workers-ai) API.
  Hopefully this project makes it easier for ruby-first developers to consume
  Cloudflare's latest and greatest.
 
- I'm a lawyer who codes, and I'm really interested in applying retrieval-augmented
+
+ ![Tests status](https://github.com/ajaynomics/cloudflare-ai/actions/workflows/ci.yml/badge.svg?branch=main)
+ [![Gem Version](https://badge.fury.io/rb/cloudflare-ai.svg)](https://badge.fury.io/rb/cloudflare-ai)
+ ![GitHub License](https://img.shields.io/github/license/ajaynomics/cloudflare-ai)
+
+ I'm really interested in applying retrieval-augmented
  generation to make legal services more accessible. [Email me](mailto:cloudflare-ai@krishnan.ca).
 
  If you're looking for legal help, it's best to book a slot via https://www.krishnan.ca.
 
- ## Todo
+ # Todo
  It's still early days, and here are my immediate priorities:
- * [ ] Support for streamed responses
- * [ ] CI pipeline and guidance for contributors
+ * [x] Support for streamed responses
+ * [x] CI pipeline
  * [ ] Support for more AI model categories
  * [x] [Text Generation](https://developers.cloudflare.com/workers-ai/models/text-generation/)
- * [ ] [Text Embeddings](https://developers.cloudflare.com/workers-ai/models/text-embeddings/)
+ * [x] [Text Embeddings](https://developers.cloudflare.com/workers-ai/models/text-embeddings/)
  * [ ] [Text Classification](https://developers.cloudflare.com/workers-ai/models/text-classification/)
  * [ ] [Image Classification](https://developers.cloudflare.com/workers-ai/models/image-classification/)
  * [ ] [Translation](https://developers.cloudflare.com/workers-ai/models/translation/)
  * [ ] [Text-to-Image](https://developers.cloudflare.com/workers-ai/models/text-to-image/)
  * [ ] [Automatic Speech Recognition](https://developers.cloudflare.com/workers-ai/models/speech-recognition/)
 
- ## Table of Contents
+ # Table of Contents
 
  - [Installation](#installation)
  - [Usage](#usage)
  - [Logging](#logging)
  - [Development](#development)
 
- ## Installation
+ # Installation
 
  Install the gem and add to the application's Gemfile by executing:
 
@@ -39,45 +44,38 @@ If bundler is not being used to manage dependencies, install the gem by executing:
 
  gem install cloudflare-ai
 
- ## Usage
+ # Usage
 
  ```ruby
  require "cloudflare/ai"
  ```
 
- ### Cloudflare Workers AI
+ ## Cloudflare Workers AI
  Please visit the [Cloudflare Workers AI website](https://developers.cloudflare.com/workers-ai/) for more details.
  This gem provides a client that wraps around [Cloudflare's REST API](https://developers.cloudflare.com/workers-ai/get-started/rest-api/).
 
 
- ### Client
+ ## Client
 
  ```ruby
  client = Cloudflare::AI::Client.new(account_id: ENV["CLOUDFLARE_ACCOUNT_ID"], api_token: ENV["CLOUDFLARE_API_TOKEN"])
  ```
 
- #### Result object
- All invocations of the client return a `Cloudflare::AI::Result` object. This object's serializable JSON output is
- based on the raw response from the Cloudflare API.
-
+ ### Model selection
+ The model name is an optional parameter to every one of the client methods described below.
+ For example, if an example is documented as
  ```ruby
- result = client.complete(prompt: "What is your name?")
-
- # Successful
- puts result.response # => "My name is John."
- puts result.success? # => true
- puts result.failure? # => false
- puts result.to_json # => {"result":{"response":"My name is John"},"success":true,"errors":[],"messages":[]}
-
- # Unsuccessful
- puts result.response # => nil
- puts result.success? # => false
- puts result.failure? # => true
- puts result.to_json # => {"result":null,"success":false,"errors":[{"code":7009,"message":"Upstream service unavailable"}],"messages":[]}
+ result = client.complete(prompt: "Hello my name is")
  ```
+ this is implicitly the same as
+ ```ruby
+ result = client.complete(prompt: "Hello my name is", model_name: "@cf/meta/llama-2-7b-chat-fp16")
+ ```
+ The full list of supported models is available here: [models.rb](lib/cloudflare/ai/models.rb).
+ More information is available [in the cloudflare documentation](https://developers.cloudflare.com/workers-ai/models/).
+ The default model used is the first enumerated model in the applicable set in [models.rb](lib/cloudflare/ai/models.rb).
 
-
- #### Text generation (chat / scoped prompt)
+ ### Text generation (chat / scoped prompt)
  ```ruby
  messages = [
  Cloudflare::AI::Message.new(role: "system", content: "You are a big fan of Cloudflare and Ruby."),
@@ -89,7 +87,7 @@ result = client.chat(messages: messages)
  puts result.response # => "Yes, I love Cloudflare!"
  ```
 
- ##### Streaming responses
+ #### Streaming responses
  Responses will be streamed back to the client using Server-Sent Events (SSE) if a block is passed to the `chat` or `complete` method.
  ```ruby
  result = client.complete(prompt: "Hi!") { |data| puts data }
@@ -99,8 +97,54 @@ result = client.complete(prompt: "Hi!") { |data| puts data }
  # {"response":"!"}
  # {"response":""}
  # [DONE]
+
  ```
- ## Logging
+ #### Result object
+ All invocations of the `complete` and `chat` methods return a `Cloudflare::AI::Results::TextGeneration` object. This object's serializable JSON output is
+ based on the raw response from the Cloudflare API.
+
+ ```ruby
+ result = client.complete(prompt: "What is your name?")
+
+ # Successful
+ puts result.response # => "My name is John."
+ puts result.success? # => true
+ puts result.failure? # => false
+ puts result.to_json # => {"result":{"response":"My name is John"},"success":true,"errors":[],"messages":[]}
+
+ # Unsuccessful
+ puts result.response # => nil
+ puts result.success? # => false
+ puts result.failure? # => true
+ puts result.to_json # => {"result":null,"success":false,"errors":[{"code":7009,"message":"Upstream service unavailable"}],"messages":[]}
+ ```
+
+
+ ### Text embedding
+ ```ruby
+ result = client.embed(text: "Hello")
+ p result.shape # => [1, 768] # (1 embedding, 768 dimensions per embedding)
+ p result.data # => [[-0.008496830239892006, 0.001376907923258841, -0.0323275662958622, ...]]
+ ```
+
+ The input can be either a string (as above) or an array of strings:
+ ```ruby
+ result = client.embed(text: ["Hello", "World"])
+ ```
+
+ #### Result object
+ All invocations of the `embed` method return a `Cloudflare::AI::Results::TextEmbedding`.
+
+ ### Text classification
+ ```ruby
+ result = client.classify(text: "You meanie!")
+ p result.result # => [{"label"=>"NEGATIVE", "score"=>0.6647962927818298}, {"label"=>"POSITIVE", "score"=>0.3352036774158478}]
+ ```
+
+ #### Result object
+ All invocations of the `classify` method return a `Cloudflare::AI::Results::TextClassification`.
+
+ # Logging
 
  This gem uses standard logging mechanisms and defaults to `:warn` level. Most messages are at info level, but we will add debug or warn statements as needed.
  To show all log messages:
@@ -113,12 +157,12 @@ You can use this logger as you would the default ruby logger. For example:
  ```ruby
  Cloudflare::AI.logger = Logger.new($stdout)
  ```
- ## Development
+ # Development
 
  1. `git clone https://github.com/ajaynomics/cloudflare-ai.git`
  2. `bundle exec rake` to ensure that the tests pass and to run standardrb
 
- ## Contributing
+ # Contributing
 
  Bug reports and pull requests are welcome on GitHub at https://github.com/ajaynomics/cloudflare-ai.
 
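The chat examples in the README above pass an array of message objects that the client serializes into role/content hashes before posting. A stdlib-only sketch of that payload shape (the `Message` Struct here is a hypothetical stand-in for `Cloudflare::AI::Message`, which uses ActiveModel in the gem):

```ruby
require "json"

# Hypothetical stand-in for Cloudflare::AI::Message.
Message = Struct.new(:role, :content, keyword_init: true) do
  def serializable_hash
    {role: role, content: content}
  end
end

messages = [
  Message.new(role: "system", content: "You are a big fan of Cloudflare and Ruby."),
  Message.new(role: "user", content: "Do you like Cloudflare?")
]

# Same merge-and-serialize shape the client posts for chat requests.
payload = {messages: messages.map(&:serializable_hash), stream: false}.to_json
puts payload
```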
data/lib/cloudflare/ai/client.rb CHANGED
@@ -2,6 +2,8 @@ require "event_stream_parser"
  require "faraday"
 
  class Cloudflare::AI::Client
+ include Cloudflare::AI::Clients::TextGenerationHelpers
+
  attr_reader :url, :account_id, :api_token
 
  def initialize(account_id:, api_token:)
@@ -9,50 +11,35 @@ class Cloudflare::AI::Client
  @api_token = api_token
  end
 
- def complete(prompt:, model_name: models[:text_generation].first, &block)
+ def chat(messages:, model_name: default_text_generation_model_name, &block)
  url = service_url_for(account_id: account_id, model_name: model_name)
  stream = block ? true : false
- payload = create_payload({prompt: prompt}, stream: stream)
- post_request(url, payload, &block)
+ payload = create_streamable_payload({messages: messages.map(&:serializable_hash)}, stream: stream)
+ post_streamable_request(url, payload, &block)
  end
 
- def chat(messages:, model_name: models[:text_generation].first, &block)
+ def classify(text:, model_name: Cloudflare::AI::Models.text_classification.first)
  url = service_url_for(account_id: account_id, model_name: model_name)
- stream = block ? true : false
- payload = create_payload({messages: messages.map(&:serializable_hash)}, stream: stream)
- post_request(url, payload, &block)
+ payload = {text: text}.to_json
+
+ Cloudflare::AI::Results::TextClassification.new(connection.post(url, payload).body)
  end
 
- def models
- {
- text_generation: %w[@cf/meta/llama-2-7b-chat-fp16 @cf/meta/llama-2-7b-chat-int8 @cf/mistral/mistral-7b-instruct-v0.1 @hf/thebloke/codellama-7b-instruct-awq],
- speech_recognition: %w[@cf/openai/whisper],
- translation: %w[@cf/meta/m2m100-1.2b],
- text_classification: %w[@cf/huggingface/distilbert-sst-2-int8],
- image_classification: %w[@cf/huggingface/distilbert-sst-2-int8],
- text_to_image: %w[@cf/stabilityai/stable-diffusion-xl-base-1.0],
- text_embeddings: %w[@cf/baai/bge-base-en-v1.5 @cf/baai/bge-large-en-v1.5 @cf/baai/bge-small-en-v1.5]
- }.freeze
+ def complete(prompt:, model_name: default_text_generation_model_name, &block)
+ url = service_url_for(account_id: account_id, model_name: model_name)
+ stream = block ? true : false
+ payload = create_streamable_payload({prompt: prompt}, stream: stream)
+ post_streamable_request(url, payload, &block)
  end
 
- private
+ def embed(text:, model_name: Cloudflare::AI::Models.text_embedding.first)
+ url = service_url_for(account_id: account_id, model_name: model_name)
+ payload = {text: text}.to_json
 
- def create_payload(data, stream: false)
- data.merge({stream: stream}).to_json
+ Cloudflare::AI::Results::TextEmbedding.new(connection.post(url, payload).body)
  end
 
- def post_request(url, payload, &block)
- if block
- parser = EventStreamParser::Parser.new
- connection.post(url, payload) do |response|
- response.options.on_data = parser.stream do |_type, data, _id, _reconnection_time, _size|
- yield data
- end
- end
- else
- Cloudflare::AI::Result.new(connection.post(url, payload).body)
- end
- end
+ private
 
  def connection
  @connection ||= ::Faraday.new(headers: {Authorization: "Bearer #{api_token}"})
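When a block is given, `chat` and `complete` stream the response as Server-Sent Events and yield each `data:` payload to the block. The real client delegates parsing to the `event_stream_parser` gem; the core extraction step can be sketched gem-free over a canned SSE stream (the stream contents below are illustrative):

```ruby
# Gem-free sketch of pulling "data:" payloads out of a canned SSE stream.
sse = "data: {\"response\":\"Hi\"}\n\ndata: {\"response\":\"!\"}\n\ndata: [DONE]\n\n"

# SSE events are separated by blank lines; each carries a "data: " field.
events = sse.split("\n\n").filter_map { |chunk| chunk[/^data: (.*)$/, 1] }

events.each { |e| puts e }
# => {"response":"Hi"}
# => {"response":"!"}
# => [DONE]
```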
data/lib/cloudflare/ai/clients/text_generation_helpers.rb ADDED
@@ -0,0 +1,28 @@
+ module Cloudflare
+ module AI
+ module Clients
+ module TextGenerationHelpers
+ def default_text_generation_model_name
+ Cloudflare::AI::Models.text_generation.first
+ end
+
+ def create_streamable_payload(data, stream: false)
+ data.merge({stream: stream}).to_json
+ end
+
+ def post_streamable_request(url, payload, &block)
+ if block
+ parser = EventStreamParser::Parser.new
+ connection.post(url, payload) do |response|
+ response.options.on_data = parser.stream do |_type, data, _id, _reconnection_time, _size|
+ yield data
+ end
+ end
+ else
+ Cloudflare::AI::Results::TextGeneration.new(connection.post(url, payload).body)
+ end
+ end
+ end
+ end
+ end
+ end
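The `create_streamable_payload` helper above only merges a `stream` flag into the request body and serializes it to JSON. Its behavior can be reproduced with stdlib JSON alone (an illustrative re-implementation, not the gem's own code path):

```ruby
require "json"

# Illustrative re-implementation of create_streamable_payload's
# merge-and-serialize step.
def create_streamable_payload(data, stream: false)
  data.merge({stream: stream}).to_json
end

puts create_streamable_payload({prompt: "Hi!"}, stream: true)
# => {"prompt":"Hi!","stream":true}
```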
data/lib/cloudflare/ai/models.rb ADDED
@@ -0,0 +1,43 @@
+ class Cloudflare::AI::Models
+ class << self
+ def text_generation
+ %w[@cf/meta/llama-2-7b-chat-fp16 @cf/meta/llama-2-7b-chat-int8 @cf/mistral/mistral-7b-instruct-v0.1 @hf/thebloke/codellama-7b-instruct-awq]
+ end
+
+ def speech_recognition
+ %w[@cf/openai/whisper]
+ end
+
+ def translation
+ %w[@cf/meta/m2m100-1.2b]
+ end
+
+ def text_classification
+ %w[@cf/huggingface/distilbert-sst-2-int8]
+ end
+
+ def image_classification
+ %w[@cf/huggingface/distilbert-sst-2-int8]
+ end
+
+ def text_to_image
+ %w[@cf/stabilityai/stable-diffusion-xl-base-1.0]
+ end
+
+ def text_embedding
+ %w[@cf/baai/bge-base-en-v1.5 @cf/baai/bge-large-en-v1.5 @cf/baai/bge-small-en-v1.5]
+ end
+
+ def all
+ {
+ text_generation: text_generation,
+ speech_recognition: speech_recognition,
+ translation: translation,
+ text_classification: text_classification,
+ image_classification: image_classification,
+ text_to_image: text_to_image,
+ text_embeddings: text_embedding
+ }
+ end
+ end
+ end
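Client methods default to the first entry of the applicable list, e.g. `Cloudflare::AI::Models.text_generation.first`. A tiny sketch of that convention, with the text-generation list copied from the file above (the constant name here is illustrative):

```ruby
# Mirrors Models.text_generation above; client defaults take the first entry.
TEXT_GENERATION_MODELS = %w[
  @cf/meta/llama-2-7b-chat-fp16
  @cf/meta/llama-2-7b-chat-int8
  @cf/mistral/mistral-7b-instruct-v0.1
  @hf/thebloke/codellama-7b-instruct-awq
].freeze

puts TEXT_GENERATION_MODELS.first # => @cf/meta/llama-2-7b-chat-fp16
```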
data/lib/cloudflare/ai/result.rb CHANGED
@@ -1,32 +1,45 @@
  require "active_support/core_ext/hash/indifferent_access"
+ require "json"
 
  class Cloudflare::AI::Result
- attr_reader :result, :success, :errors, :messages
-
- def initialize(json)
- @json = json
- @json = JSON.parse(@json) unless @json.is_a?(Hash)
- @json = @json.with_indifferent_access
+ def initialize(json_string_or_ruby_hash)
+ @result_data = parse_data(json_string_or_ruby_hash)
+ end
 
- @result = @json["result"]
- @success = @json["success"]
- @errors = @json["errors"]
- @messages = @json["messages"]
+ def result
+ result_data[:result]
  end
 
- def to_json
- @json.to_json
+ def success?
+ success == true
  end
 
  def failure?
  !success?
  end
 
- def success?
- success
+ def errors
+ result_data.dig(:errors)
+ end
+
+ def messages
+ result_data.dig(:messages)
+ end
+
+ def to_json
+ result_data.to_json
+ end
+
+ private
+
+ attr_reader :result_data
+
+ def success
+ result_data[:success]
  end
 
- def response
- result.with_indifferent_access["response"]
+ def parse_data(input)
+ input = JSON.parse(input) if input.is_a?(String)
+ input.with_indifferent_access
  end
  end
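The reworked `Result` accepts either a JSON string or an already-parsed Hash. A simplified stdlib-only stand-in of that parse-and-query behavior (`SketchResult` is illustrative; the real class also layers on ActiveSupport's indifferent access):

```ruby
require "json"

# Simplified stand-in for Cloudflare::AI::Result: accepts a JSON string
# or a Hash, then exposes success?/failure?/errors.
class SketchResult
  def initialize(json_string_or_hash)
    input = json_string_or_hash
    input = JSON.parse(input) if input.is_a?(String)
    @data = input
  end

  def success?
    @data["success"] == true
  end

  def failure?
    !success?
  end

  def errors
    @data["errors"]
  end
end

ok = SketchResult.new('{"result":{"response":"Hi"},"success":true,"errors":[],"messages":[]}')
puts ok.success? # => true
```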
data/lib/cloudflare/ai/results/text_classification.rb ADDED
@@ -0,0 +1,3 @@
+ class Cloudflare::AI::Results::TextClassification < Cloudflare::AI::Result
+ # Empty seam kept for consistency with other result objects that have more complexity.
+ end
data/lib/cloudflare/ai/results/text_embedding.rb ADDED
@@ -0,0 +1,9 @@
+ class Cloudflare::AI::Results::TextEmbedding < Cloudflare::AI::Result
+ def shape
+ result&.dig(:shape) # nil if no shape
+ end
+
+ def data
+ result&.dig(:data) # nil if no data
+ end
+ end
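A typical next step after `embed` is comparing the returned vectors, for instance with cosine similarity. This is not part of the gem; a minimal stdlib sketch over made-up stand-in vectors:

```ruby
# Cosine similarity between two embedding vectors, stdlib only.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

# Fabricated vectors pointing the same direction, so similarity is 1.0.
v1 = [1.0, 0.0, 2.0]
v2 = [2.0, 0.0, 4.0]
puts cosine_similarity(v1, v2) # => 1.0
```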
data/lib/cloudflare/ai/results/text_generation.rb ADDED
@@ -0,0 +1,5 @@
+ class Cloudflare::AI::Results::TextGeneration < Cloudflare::AI::Result
+ def response
+ result&.dig(:response) # nil if no response
+ end
+ end
data/lib/cloudflare/ai/version.rb CHANGED
@@ -2,6 +2,6 @@
 
  module Cloudflare
  module AI
- VERSION = "0.2.0"
+ VERSION = "0.4.0"
  end
  end
data/lib/cloudflare/ai.rb CHANGED
@@ -1,5 +1,3 @@
- # frozen_string_literal: true
-
  require_relative "ai/version"
  require "logger"
  require "pathname"
@@ -27,6 +25,5 @@ module Cloudflare
  @root = Pathname.new(__dir__)
 
  class Error < StandardError; end
- # Your code goes here...
  end
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: cloudflare-ai
  version: !ruby/object:Gem::Version
- version: 0.2.0
+ version: 0.4.0
  platform: ruby
  authors:
  - Ajay Krishnan
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2024-01-21 00:00:00.000000000 Z
+ date: 2024-01-22 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activemodel
@@ -93,9 +93,14 @@ files:
  - README.md
  - lib/cloudflare/ai.rb
  - lib/cloudflare/ai/client.rb
+ - lib/cloudflare/ai/clients/text_generation_helpers.rb
  - lib/cloudflare/ai/contextual_logger.rb
  - lib/cloudflare/ai/message.rb
+ - lib/cloudflare/ai/models.rb
  - lib/cloudflare/ai/result.rb
+ - lib/cloudflare/ai/results/text_classification.rb
+ - lib/cloudflare/ai/results/text_embedding.rb
+ - lib/cloudflare/ai/results/text_generation.rb
  - lib/cloudflare/ai/version.rb
  homepage: https://rubygems.org/gems/cloudflare-ai
  licenses: