cloudflare-ai 0.2.0 → 0.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.md +56 -34
- data/lib/cloudflare/ai/client.rb +13 -33
- data/lib/cloudflare/ai/clients/text_generation_helpers.rb +28 -0
- data/lib/cloudflare/ai/models.rb +43 -0
- data/lib/cloudflare/ai/result.rb +29 -16
- data/lib/cloudflare/ai/results/text_embedding.rb +9 -0
- data/lib/cloudflare/ai/results/text_generation.rb +5 -0
- data/lib/cloudflare/ai/version.rb +1 -1
- data/lib/cloudflare/ai.rb +0 -3
- metadata +5 -1
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b0a6b80e669a5c4d64ec87433ed1c3426ee0a2a1087016773c5648c799b34912
+  data.tar.gz: fe1d3b2afb5f70cbe476b1b8e006819b2e6e469184169993a15c7c4c34b89ca8
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 84dcec88bcd44e4a7b004235b15b559d924c1e8dfce552234b2d9286762ab2a12d282954e8d863523b1c9170c24f49a772da7320b5b475b66b79b141ac2c532d
+  data.tar.gz: 0f676b2accb65edc63b2b9b1c2b8a532ebfedba588dfdcb1ae38ccfdf7edb4869c1ebee2060e887e5ac166772cda91d3e43e2ca650e878eb63b8c964d7fc19e1
data/README.md
CHANGED
@@ -1,35 +1,40 @@
-
+Cloudflare Workers AI API client for ruby
 ---
 Cloudflare is testing its [Workers AI](https://blog.cloudflare.com/workers-ai) API.
 Hopefully this project makes it easier for ruby-first developers to consume
 Cloudflare's latest and greatest.
 
-
+
+
+[](https://badge.fury.io/rb/cloudflare-ai)
+
+
+I'm really interested in applying retrieval-augmented
 generation to make legal services more accessible. [Email me](mailto:cloudflare-ai@krishnan.ca).
 
 If you're looking for legal help, it's best to book a slot via https://www.krishnan.ca.
 
-
+# Todo
 It's still early days, and here are my immediate priorities:
-* [
-* [
+* [x] Support for streamed responses
+* [x] CI pipeline
 * [ ] Support for more AI model categories
 * [x] [Text Generation](https://developers.cloudflare.com/workers-ai/models/text-generation/)
-* [
+* [x] [Text Embeddings](https://developers.cloudflare.com/workers-ai/models/text-embeddings/)
 * [ ] [Text Classification](https://developers.cloudflare.com/workers-ai/models/text-classification/)
 * [ ] [Image Classification](https://developers.cloudflare.com/workers-ai/models/image-classification/)
 * [ ] [Translation](https://developers.cloudflare.com/workers-ai/models/translation/)
 * [ ] [Text-to-Image](https://developers.cloudflare.com/workers-ai/models/text-to-image/)
 * [ ] [Automatic Speech Recognition](https://developers.cloudflare.com/workers-ai/models/speech-recognition/)
 
-
+# Table of Contents
 
 - [Installation](#installation)
 - [Usage](#usage)
 - [Logging](#logging)
 - [Development](#development)
 
-
+# Installation
 
 Install the gem and add to the application's Gemfile by executing:
 
@@ -39,26 +44,51 @@ If bundler is not being used to manage dependencies, install the gem by executing:
 
 gem install cloudflare-ai
 
-
+# Usage
 
 ```ruby
 require "cloudflare/ai"
 ```
 
-
+## Cloudflare Workers AI
 Please visit the [Cloudflare Workers AI website](https://developers.cloudflare.com/workers-ai/) for more details.
 This gem provides a client that wraps around [Cloudflare's REST API](https://developers.cloudflare.com/workers-ai/get-started/rest-api/).
 
 
-
+## Client
 
 ```ruby
 client = Cloudflare::AI::Client.new(account_id: ENV["CLOUDFLARE_ACCOUNT_ID"], api_token: ENV["CLOUDFLARE_API_TOKEN"])
 ```
 
+
+### Text generation (chat / scoped prompt)
+```ruby
+messages = [
+  Cloudflare::AI::Message.new(role: "system", content: "You are a big fan of Cloudflare and Ruby."),
+  Cloudflare::AI::Message.new(role: "user", content: "What is your favourite tech stack?"),
+  Cloudflare::AI::Message.new(role: "assistant", content: "I love building with Ruby on Rails and Cloudflare!"),
+  Cloudflare::AI::Message.new(role: "user", content: "Really? You like Cloudflare even though there isn't great support for Ruby?"),
+]
+result = client.chat(messages: messages)
+puts result.response # => "Yes, I love Cloudflare!"
+```
+
+#### Streaming responses
+Responses will be streamed back to the client using Server Side Events (SSE) if a block is passed to the `chat` or `complete` method.
+```ruby
+result = client.complete(prompt: "Hi!") { |data| puts data}
+# {"response":" "}
+# {"response":" Hello"}
+# {"response":" there"}
+# {"response":"!"}
+# {"response":""}
+# [DONE]
+
+```
 #### Result object
-All invocations of the
-based on the raw response from the Cloudflare API.
+All invocations of the `prompt` and `chat` methods return a `Cloudflare::AI::Results::TextGeneration` object. This object's serializable JSON output is
+based on the raw response from the Cloudflare API.
 
 ```ruby
 result = client.complete(prompt: "What is your name?")
@@ -77,30 +107,22 @@ puts result.to_json # => {"result":null,"success":false,"errors":[{"code":7009,"
 ```
 
 
-
+### Text embedding
 ```ruby
-
-
-
-  Cloudflare::AI::Message.new(role: "assistant", content: "I love building with Ruby on Rails and Cloudflare!"),
-  Cloudflare::AI::Message.new(role: "user", content: "Really? You like Cloudflare even though there isn't great support for Ruby?"),
-]
-result = client.chat(messages: messages)
-puts result.response # => "Yes, I love Cloudflare!"
+result = client.embed(text: "Hello")
+p result.shape # => [1, 768] # (1 embedding, 768 dimensions per embedding)
+p result.embedding # => [[-0.008496830239892006, 0.001376907923258841, -0.0323275662958622, ...]]
 ```
 
-
-Responses will be streamed back to the client using Server Side Events (SSE) if a block is passed to the `chat` or `complete` method.
+The input can be either a string (as above) or an array of strings:
 ```ruby
-result = client.
-# {"response":" "}
-# {"response":" Hello"}
-# {"response":" there"}
-# {"response":"!"}
-# {"response":""}
-# [DONE]
+result = client.embed(text: ["Hello", "World"])
 ```
-
+
+#### Result object
+All invocations of the `embedding` methods return a `Cloudflare::AI::Results::TextEmbedding`.
+
+# Logging
 
 This gem uses standard logging mechanisms and defaults to `:warn` level. Most messages are at info level, but we will add debug or warn statements as needed.
 To show all log messages:
@@ -113,12 +135,12 @@ You can use this logger as you would the default ruby logger. For example:
 ```ruby
 Cloudflare::AI.logger = Logger.new($stdout)
 ```
-
+# Development
 
 1. `git clone https://github.com/ajaynomics/cloudflare-ai.git`
 2. `bundle exec rake` to ensure that the tests pass and to run standardrb
 
-
+# Contributing
 
 Bug reports and pull requests are welcome on GitHub at https://github.com/ajaynomics/cloudflare-ai.
 
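The streaming example added to the README prints each raw SSE data chunk. A minimal sketch of collecting those chunks into a single string, assuming each yielded `data` value is the bare JSON fragment shown above and `[DONE]` is the terminal sentinel:

```ruby
require "json"
require "cloudflare/ai"

client = Cloudflare::AI::Client.new(account_id: ENV["CLOUDFLARE_ACCOUNT_ID"], api_token: ENV["CLOUDFLARE_API_TOKEN"])

buffer = +""
client.complete(prompt: "Hi!") do |data|
  next if data.strip.empty? || data.strip == "[DONE]" # skip empty chunks and the terminal sentinel
  buffer << JSON.parse(data)["response"].to_s         # each chunk looks like {"response":" Hello"}
end
puts buffer # e.g. " Hello there!"
```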
data/lib/cloudflare/ai/client.rb
CHANGED
@@ -2,6 +2,8 @@ require "event_stream_parser"
 require "faraday"
 
 class Cloudflare::AI::Client
+  include Cloudflare::AI::Clients::TextGenerationHelpers
+
   attr_reader :url, :account_id, :api_token
 
   def initialize(account_id:, api_token:)
@@ -9,50 +11,28 @@
     @api_token = api_token
   end
 
-  def
+  def chat(messages:, model_name: default_text_generation_model_name, &block)
     url = service_url_for(account_id: account_id, model_name: model_name)
     stream = block ? true : false
-    payload =
-
+    payload = create_streamable_payload({messages: messages.map(&:serializable_hash)}, stream: stream)
+    post_streamable_request(url, payload, &block)
   end
 
-  def
+  def complete(prompt:, model_name: default_text_generation_model_name, &block)
     url = service_url_for(account_id: account_id, model_name: model_name)
     stream = block ? true : false
-    payload =
-
-  end
-
-  def models
-    {
-      text_generation: %w[@cf/meta/llama-2-7b-chat-fp16 @cf/meta/llama-2-7b-chat-int8 @cf/mistral/mistral-7b-instruct-v0.1 @hf/thebloke/codellama-7b-instruct-awq],
-      speech_recognition: %w[@cf/openai/whisper],
-      translation: %w[@cf/meta/m2m100-1.2b],
-      text_classification: %w[@cf/huggingface/distilbert-sst-2-int8],
-      image_classification: %w[@cf/huggingface/distilbert-sst-2-int8],
-      text_to_image: %w[@cf/stabilityai/stable-diffusion-xl-base-1.0],
-      text_embeddings: %w[@cf/baai/bge-base-en-v1.5 @cf/baai/bge-large-en-v1.5 @cf/baai/bge-small-en-v1.5]
-    }.freeze
+    payload = create_streamable_payload({prompt: prompt}, stream: stream)
+    post_streamable_request(url, payload, &block)
   end
 
-
+  def embed(text:, model_name: Cloudflare::AI::Models.text_embedding.first)
+    url = service_url_for(account_id: account_id, model_name: model_name)
+    payload = {text: text}.to_json
 
-
-    data.merge({stream: stream}).to_json
+    Cloudflare::AI::Results::TextEmbedding.new(connection.post(url, payload).body)
   end
 
-
-    if block
-      parser = EventStreamParser::Parser.new
-      connection.post(url, payload) do |response|
-        response.options.on_data = parser.stream do |_type, data, _id, _reconnection_time, _size|
-          yield data
-        end
-      end
-    else
-      Cloudflare::AI::Result.new(connection.post(url, payload).body)
-    end
-  end
+  private
 
   def connection
     @connection ||= ::Faraday.new(headers: {Authorization: "Bearer #{api_token}"})
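With this refactor, `chat` and `complete` default `model_name` to the first entry of `Cloudflare::AI::Models.text_generation`, and the new `embed` method defaults to the first text-embedding model; each still accepts an explicit model per call. A brief sketch, using model identifiers from the `models.rb` file added below:

```ruby
# Pin specific models instead of relying on the defaults.
result = client.chat(
  messages: [Cloudflare::AI::Message.new(role: "user", content: "Hello!")],
  model_name: "@cf/mistral/mistral-7b-instruct-v0.1"
)

embedding = client.embed(text: "Hello", model_name: "@cf/baai/bge-small-en-v1.5")
```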
data/lib/cloudflare/ai/clients/text_generation_helpers.rb
ADDED
@@ -0,0 +1,28 @@
+module Cloudflare
+  module AI
+    module Clients
+      module TextGenerationHelpers
+        def default_text_generation_model_name
+          Cloudflare::AI::Models.text_generation.first
+        end
+
+        def create_streamable_payload(data, stream: false)
+          data.merge({stream: stream}).to_json
+        end
+
+        def post_streamable_request(url, payload, &block)
+          if block
+            parser = EventStreamParser::Parser.new
+            connection.post(url, payload) do |response|
+              response.options.on_data = parser.stream do |_type, data, _id, _reconnection_time, _size|
+                yield data
+              end
+            end
+          else
+            Cloudflare::AI::Results::TextGeneration.new(connection.post(url, payload).body)
+          end
+        end
+      end
+    end
+  end
+end
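`create_streamable_payload` merely merges the `stream` flag into the request body and serialises it; an equivalent standalone snippet (the prompt value is illustrative):

```ruby
require "json"

# Equivalent of create_streamable_payload({prompt: "Hi!"}, stream: true)
{prompt: "Hi!"}.merge({stream: true}).to_json
# => "{\"prompt\":\"Hi!\",\"stream\":true}"
```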
data/lib/cloudflare/ai/models.rb
ADDED
@@ -0,0 +1,43 @@
+class Cloudflare::AI::Models
+  class << self
+    def text_generation
+      %w[@cf/meta/llama-2-7b-chat-fp16 @cf/meta/llama-2-7b-chat-int8 @cf/mistral/mistral-7b-instruct-v0.1 @hf/thebloke/codellama-7b-instruct-awq]
+    end
+
+    def speech_recognition
+      %w[@cf/openai/whisper]
+    end
+
+    def translation
+      %w[@cf/meta/m2m100-1.2b]
+    end
+
+    def text_classification
+      %w[@cf/huggingface/distilbert-sst-2-int8]
+    end
+
+    def image_classification
+      %w[@cf/huggingface/distilbert-sst-2-int8]
+    end
+
+    def text_to_image
+      %w[@cf/stabilityai/stable-diffusion-xl-base-1.0]
+    end
+
+    def text_embedding
+      %w[@cf/baai/bge-base-en-v1.5 @cf/baai/bge-large-en-v1.5 @cf/baai/bge-small-en-v1.5]
+    end
+
+    def all
+      {
+        text_generation: text_generation,
+        speech_recognition: speech_recognition,
+        translation: translation,
+        text_classification: text_classification,
+        image_classification: image_classification,
+        text_to_image: text_to_image,
+        text_embeddings: text_embedding
+      }
+    end
+  end
+end
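`Cloudflare::AI::Models` replaces the frozen hash previously returned by `Client#models`, exposing the same lists per category or all at once, for example:

```ruby
Cloudflare::AI::Models.text_generation.first
# => "@cf/meta/llama-2-7b-chat-fp16"

Cloudflare::AI::Models.all.keys
# => [:text_generation, :speech_recognition, :translation, :text_classification,
#     :image_classification, :text_to_image, :text_embeddings]
```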
data/lib/cloudflare/ai/result.rb
CHANGED
@@ -1,32 +1,45 @@
 require "active_support/core_ext/hash/indifferent_access"
+require "json"
 
 class Cloudflare::AI::Result
-
-
-
-    @json = json
-    @json = JSON.parse(@json) unless @json.is_a?(Hash)
-    @json = @json.with_indifferent_access
+  def initialize(json_string_or_ruby_hash)
+    @result_data = parse_data(json_string_or_ruby_hash)
+  end
 
-
-
-    @errors = @json["errors"]
-    @messages = @json["messages"]
+  def result
+    result_data[:result]
   end
 
-  def
-
+  def success?
+    success == true
   end
 
   def failure?
     !success?
   end
 
-  def
-
+  def errors
+    result_data.dig(:errors)
+  end
+
+  def messages
+    result_data.dig(:messages)
+  end
+
+  def to_json
+    result_data.to_json
+  end
+
+  private
+
+  attr_reader :result_data
+
+  def success
+    result_data[:success]
   end
 
-  def
-
+  def parse_data(input)
+    input = JSON.parse(input) if input.is_a?(String)
+    input.with_indifferent_access
   end
 end
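The refactored `Cloudflare::AI::Result` keeps its public readers (`result`, `success?`, `failure?`, `errors`, `messages`, `to_json`) while holding the parsed response in a private hash with indifferent access. A sketch of how a caller might branch on it, with the error shape taken from the README example above:

```ruby
result = client.complete(prompt: "What is your name?")

if result.success?
  puts result.result           # the raw "result" payload from the Cloudflare API
else
  warn result.errors.inspect   # e.g. [{"code" => 7009, ...}] as in the README
end

puts result.to_json            # re-serialises the parsed response
```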
data/lib/cloudflare/ai.rb
CHANGED
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: cloudflare-ai
 version: !ruby/object:Gem::Version
-  version: 0.2.0
+  version: 0.3.0
 platform: ruby
 authors:
 - Ajay Krishnan
@@ -93,9 +93,13 @@ files:
 - README.md
 - lib/cloudflare/ai.rb
 - lib/cloudflare/ai/client.rb
+- lib/cloudflare/ai/clients/text_generation_helpers.rb
 - lib/cloudflare/ai/contextual_logger.rb
 - lib/cloudflare/ai/message.rb
+- lib/cloudflare/ai/models.rb
 - lib/cloudflare/ai/result.rb
+- lib/cloudflare/ai/results/text_embedding.rb
+- lib/cloudflare/ai/results/text_generation.rb
 - lib/cloudflare/ai/version.rb
 homepage: https://rubygems.org/gems/cloudflare-ai
 licenses: