groq 0.2.0
- checksums.yaml +7 -0
- data/.tool-versions +1 -0
- data/CHANGELOG.md +5 -0
- data/CODE_OF_CONDUCT.md +84 -0
- data/LICENSE.txt +21 -0
- data/README.md +328 -0
- data/Rakefile +8 -0
- data/docs/images/groq-speed-price-20240421.png +0 -0
- data/lib/groq/client.rb +77 -0
- data/lib/groq/configuration.rb +28 -0
- data/lib/groq/helpers.rb +29 -0
- data/lib/groq/model.rb +71 -0
- data/lib/groq/version.rb +5 -0
- data/lib/groq-ruby.rb +1 -0
- data/lib/groq.rb +20 -0
- data/sig/groq.rbs +4 -0
- metadata +130 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA256:
  metadata.gz: 8d1461971dedb839a98ceba16edeec695a3fbc48216295314e6c319e5976f621
  data.tar.gz: ac0437a0a14d79c9faab3c88054100928970606e90997187c6e908e67a67dc8c
SHA512:
  metadata.gz: 422b5c160196127928397e568aa15e76dc4f63d1388391bce9cae4ad4d6d0b0fb4063fb52126f04a5b32667179532ea62af96f64acb785ffffed875d4c0646cb
  data.tar.gz: a537f489dedaa533e9fdb444c6e6d3007dab7164c8306260b0409cd5d2b8bf8802373320d7a79016f04a8e71295845f2504162aad8d7f4997db30c2cad7e32f5
data/.tool-versions
ADDED
@@ -0,0 +1 @@
ruby 3.1.4
data/CHANGELOG.md
ADDED
data/CODE_OF_CONDUCT.md
ADDED
@@ -0,0 +1,84 @@
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or
  advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
  address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at drnicwilliams@gmail.com. All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0,
available at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.
data/LICENSE.txt
ADDED
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2024 Dr Nic Williams

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
data/README.md
ADDED
@@ -0,0 +1,328 @@
# Groq

Groq Cloud runs LLMs fast and cheaply. Llama 3, Mixtral, Gemma, and more at hundreds of tokens per second, at cents per million tokens.

[![speed-pricing](docs/images/groq-speed-price-20240421.png)](https://wow.groq.com/)

Speed and pricing as of 2024-04-21. Also see their [changelog](https://console.groq.com/docs/changelog) for new models and features.

## Groq Cloud API

You can interact with their API using any Ruby HTTP library by following their documentation at <https://console.groq.com/docs/quickstart>. You can also use their [Playground](https://console.groq.com/playground) and watch the API traffic in the browser's developer tools.

The Groq Cloud API mirrors a subset of the OpenAI API. For example, you perform chat completions at `https://api.groq.com/openai/v1/chat/completions` with the same POST body schema as OpenAI. Tool support uses the same schema for defining tools/functions.

So you can write your own Ruby client code to interact with the Groq Cloud API.
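
For example, here is a rough sketch of calling the chat completions endpoint directly with Faraday, using the same endpoint, headers, and body schema this gem uses internally; treat it as a starting point rather than canonical client code:

```ruby
require "faraday"

# Build a connection that encodes requests and decodes responses as JSON.
conn = Faraday.new(url: "https://api.groq.com") do |f|
  f.request :json
  f.response :json
end

response = conn.post("/openai/v1/chat/completions") do |req|
  req.headers["Authorization"] = "Bearer #{ENV.fetch("GROQ_API_KEY")}"
  req.body = {
    model: "llama3-8b-8192",
    messages: [{role: "user", content: "Hello, world!"}]
  }
end

puts response.body.dig("choices", 0, "message", "content")
```
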
Or you can use this convenience RubyGem with some nice helpers to get you started.

```ruby
@client = Groq::Client.new
@client.chat("Hello, world!")
=> {"role"=>"assistant", "content"=>"Hello there! It's great to meet you!"}

include Groq::Helpers
@client.chat([
  User("Hi"),
  Assistant("Hello back. Ask me anything. I'll reply with 'cat'"),
  User("Favourite food?")
])
# => {"role"=>"assistant", "content"=>"Um... CAT"}
# => {"role"=>"assistant", "content"=>"Not a cat! It's a pizza!"}
# => {"role"=>"assistant", "content"=>"Pizza"}
# => {"role"=>"assistant", "content"=>"Cat"}

@client.chat([
  System("I am an obedient AI"),
  U("Hi"),
  A("Hello back. Ask me anything. I'll reply with 'cat'"),
  U("Favourite food?")
])
# => {"role"=>"assistant", "content"=>"Cat"}
# => {"role"=>"assistant", "content"=>"cat"}
# => {"role"=>"assistant", "content"=>"Cat"}
```

JSON mode:

```ruby
response = @client.chat([
  S("Reply with JSON. Use {\"number\": 7} for the answer."),
  U("What's 3+4?")
], json: true)
# => {"role"=>"assistant", "content"=>"{\"number\": 7}"}

JSON.parse(response["content"])
# => {"number"=>7}
```

## Installation

Install the gem and add to the application's Gemfile by executing:

```plain
bundle add groq
```

If bundler is not being used to manage dependencies, install the gem by executing:

```plain
gem install groq
```

## Usage

- Get your API key from [console.groq.com/keys](https://console.groq.com/keys)
- Place it in the `GROQ_API_KEY` env var, or pass it explicitly into the configuration below.
- Use the `Groq::Client` to interact with Groq and your favourite model.

```ruby
client = Groq::Client.new # uses ENV["GROQ_API_KEY"] and "llama3-8b-8192"
client = Groq::Client.new(api_key: "...", model_id: "llama3-8b-8192")

Groq.configure do |config|
  config.api_key = "..."
  config.model_id = "llama3-70b-8192"
end
client = Groq::Client.new
```

There is a simple chat function to send messages to a model:

```ruby
# either pass a single message and get a single response
client.chat("Hello, world!")
=> {"role"=>"assistant", "content"=>"Hello there! It's great to meet you!"}

# or pass in a messages array containing multiple messages between user and assistant
client.chat([
  {role: "user", content: "What's the next day after Wednesday?"},
  {role: "assistant", content: "The next day after Wednesday is Thursday."},
  {role: "user", content: "What's the next day after that?"}
])
# => {"role" => "assistant", "content" => "The next day after Thursday is Friday."}
```

### Interactive console (IRb)

```plain
bin/console
```

This repository has a `bin/console` script to start an interactive console to play with the Groq API. The `@client` variable is set up using the `$GROQ_API_KEY` environment variable, and the `U`, `A`, `T` helpers are already included.

```ruby
@client.chat("Hello, world!")
{"role"=>"assistant",
 "content"=>"Hello there! It's great to meet you! Is there something you'd like to talk about or ask? I'm here to listen and help if I can!"}
```

The remaining examples below use the `@client` variable so you can copy and paste them into `bin/console`.

### Message helpers

We also have some handy `U`, `A`, `S`, and `F` methods to produce the `{role:, content:}` hashes:

```ruby
include Groq::Helpers
@client.chat([
  S("I am an obedient AI"),
  U("Hi"),
  A("Hello back. Ask me anything. I'll reply with 'cat'"),
  U("Favourite food?")
])
# => {"role"=>"assistant", "content"=>"Cat"}
```

The `T()` helper provides function/tool responses:

```ruby
T("25 degrees celsius", tool_call_id: "call_b790", name: "get_weather_report")
# => {"role"=>"function", "tool_call_id"=>"call_b790", "name"=>"get_weather_report", "content"=>"25 degrees celsius"}
```

There are also aliases for each helper function (a short example follows this list):

* `U(content)` is also `User(content)`
* `A(content)` is also `Assistant(content)`
* `S(content)` is also `System(content)`
* `T(content, ...)` is also `Tool`, `ToolReply`, `Function`, `F`
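
For example, the short and long forms build identical message hashes:

```ruby
U("Hi")
# => {:role=>"user", :content=>"Hi"}
U("Hi") == User("Hi")
# => true
```
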
### Specifying an LLM model

At the time of writing, the Groq Cloud service supports a limited number of models. They've suggested they'll allow uploading custom models in the future.

To get the list of known model IDs:

```ruby
Groq::Model.model_ids
=> ["llama3-8b-8192", "llama3-70b-8192", "llama2-70b-4096", "mixtral-8x7b-32768", "gemma-7b-it"]
```

To get more data about each model, see `Groq::Model::MODELS`.
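
For example, you can look up a model's attributes, such as its context window, directly from that constant:

```ruby
model = Groq::Model::MODELS.find { |m| m[:model_id] == "llama3-70b-8192" }
model[:context_window]
# => 8192
model[:developer]
# => "Meta"
```
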
As above, you can specify the default model to use for all `chat()` calls:

```ruby
client = Groq::Client.new(model_id: "llama3-70b-8192")
# or
Groq.configure do |config|
  config.model_id = "llama3-70b-8192"
end
```

You can also specify the model within the `chat()` call:

```ruby
@client.chat("Hello, world!", model_id: "llama3-70b-8192")
```

To see a reply from each known model:

```ruby
puts "User message: Hello, world!"
Groq::Model.model_ids.each do |model_id|
  puts "Assistant reply with model #{model_id}:"
  p @client.chat("Hello, world!", model_id: model_id)
end
```

The output might look similar to:

```plain
User message: Hello, world!
Assistant reply with model llama3-8b-8192:
{"role"=>"assistant", "content"=>"Hello, world! It's great to meet you! Is there something I can help you with, or would you like to chat?"}
Assistant reply with model llama3-70b-8192:
{"role"=>"assistant", "content"=>"The classic \"Hello, world!\" It's great to see you here! Is there something I can help you with, or would you like to just chat?"}
Assistant reply with model llama2-70b-4096:
{"role"=>"assistant", "content"=>"Hello, world!"}
Assistant reply with model mixtral-8x7b-32768:
{"role"=>"assistant", "content"=>"Hello! It's nice to meet you. Is there something specific you would like to know or talk about? I'm here to help answer any questions you have to the best of my ability. I can provide information on a wide variety of topics, so feel free to ask me anything. I'm here to assist you."}
Assistant reply with model gemma-7b-it:
{"role"=>"assistant", "content"=>"Hello to you too! 👋🌎 It's great to hear from you. What would you like to talk about today? 😊"}
```

### JSON mode

JSON mode is a beta feature that guarantees all chat completions are valid JSON.

To use JSON mode:

1. Pass `json: true` to the `chat()` call
2. Provide a system message that contains `JSON` in the content, e.g. `S("Reply with JSON")`

It is a good idea to include in the system message an example of the JSON structure you'd prefer to receive.

There are other suggestions on the [JSON mode (beta)](https://console.groq.com/docs/text-chat#json-mode-object-object) page of the Groq docs.

```ruby
response = @client.chat([
  S("Reply with JSON. Use {\n\"number\": 7\n} for the answer."),
  U("What's 3+4?")
], json: true)
# => {"role"=>"assistant", "content"=>"{\n\"number\": 7\n}"}

JSON.parse(response["content"])
# => {"number"=>7}
```
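
Building on the suggestion above, you can spell out the preferred structure in the system message. A sketch, where the schema and field names are invented for illustration:

```ruby
system_prompt = <<~PROMPT
  Reply with JSON only, using this structure:
  {"answer": <integer>, "working": "<one-sentence explanation>"}
PROMPT

response = @client.chat([S(system_prompt), U("What's 6*7?")], json: true)
JSON.parse(response["content"])
# e.g. => {"answer"=>42, "working"=>"6 multiplied by 7 is 42"} (actual wording will vary)
```
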
### Tools/Functions

LLMs increasingly support deferring to tools or functions to fetch data, perform calculations, or store structured data. Groq Cloud in turn supports tool use through its API.

See the [Using Tools](https://console.groq.com/docs/tool-use) documentation for the list of models that currently support tools. Other models might support tools some of the time and raise errors at other times.

```ruby
@client = Groq::Client.new(model_id: "mixtral-8x7b-32768")
```

The Groq/OpenAI schema for defining a tool/function (which differs from the Anthropic/Claude3 schema) is:

```ruby
tools = [{
  type: "function",
  function: {
    name: "get_weather_report",
    description: "Get the weather report for a city",
    parameters: {
      type: "object",
      properties: {
        city: {
          type: "string",
          description: "The city or region to get the weather report for"
        }
      },
      required: ["city"]
    }
  }
}]
```

Pass the `tools` array into the `chat()` call:

```ruby
@client = Groq::Client.new(model_id: "mixtral-8x7b-32768")

include Groq::Helpers
messages = [U("What's the weather in Paris?")]
response = @client.chat(messages, tools: tools)
# => {"role"=>"assistant", "tool_calls"=>[{"id"=>"call_b790", "type"=>"function", "function"=>{"name"=>"get_weather_report", "arguments"=>"{\"city\":\"Paris\"}"}}]}
```

You'd then invoke the Ruby implementation of `get_weather_report` to return the weather report for Paris as the next message in the chat.

```ruby
messages << response

tool_call_id = response["tool_calls"].first["id"]
messages << T("25 degrees celsius", tool_call_id: tool_call_id, name: "get_weather_report")
@client.chat(messages)
# => {"role"=>"assistant", "content"=> "I'm glad you called the function!\n\nAs of your current location, the weather in Paris is indeed 25°C (77°F)..."}
```
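
In practice you would derive the tool reply from the tool call rather than hard-coding it. A sketch of that dispatch step, where `get_weather_report` is a hypothetical local implementation you would replace with a real lookup:

```ruby
# Hypothetical tool implementation; swap in a real weather service call.
def get_weather_report(city:)
  "25 degrees celsius and sunny in #{city}"
end

messages << response # the assistant message containing tool_calls

tool_call = response["tool_calls"].first
args = JSON.parse(tool_call["function"]["arguments"]) # => {"city"=>"Paris"}

messages << T(get_weather_report(city: args["city"]),
  tool_call_id: tool_call["id"],
  name: tool_call["function"]["name"])
@client.chat(messages)
```
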
### Max Tokens & Temperature

Max tokens is the maximum number of tokens the model can generate in a single response. This limit ensures computational efficiency and resource management.

The temperature setting for each API call controls the randomness of responses. A lower temperature leads to more predictable outputs, while a higher temperature results in more varied and sometimes more creative outputs. The range of values is 0 to 2.

Each API call includes `max_tokens:` and `temperature:` values.

The defaults are:

```ruby
@client.max_tokens
=> 1024
@client.temperature
=> 1
```

You can override them in the `Groq.configure` block, or with each `chat()` call:

```ruby
Groq.configure do |config|
  config.max_tokens = 512
  config.temperature = 0.5
end
# or
@client.chat("Hello, world!", max_tokens: 512, temperature: 0.5)
```

## Development

After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake test` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/drnic/groq-ruby. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/drnic/groq-ruby/blob/develop/CODE_OF_CONDUCT.md).

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Code of Conduct

Everyone interacting in the Groq project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/drnic/groq-ruby/blob/develop/CODE_OF_CONDUCT.md).
data/Rakefile
ADDED
Binary file
data/lib/groq/client.rb
ADDED
@@ -0,0 +1,77 @@
require "faraday"

class Groq::Client
  CONFIG_KEYS = %i[
    api_key
    api_url
    model_id
    max_tokens
    temperature
  ].freeze
  attr_reader(*CONFIG_KEYS, :faraday_middleware)

  class Error < StandardError; end

  def initialize(config = {}, &faraday_middleware)
    CONFIG_KEYS.each do |key|
      # Set instance variables like api_key.
      # Fall back to global config if not present.
      instance_variable_set(:"@#{key}", config[key] || Groq.configuration.send(key))
    end
    @faraday_middleware = faraday_middleware
  end

  # TODO: support stream: true; or &stream block
  def chat(messages, model_id: nil, tools: nil, max_tokens: nil, temperature: nil, json: false)
    unless messages.is_a?(Array) || messages.is_a?(String)
      raise ArgumentError, "require messages to be an Array or String"
    end

    if messages.is_a?(String)
      messages = [{role: "user", content: messages}]
    end

    model_id ||= @model_id

    body = {
      model: model_id,
      messages: messages,
      tools: tools,
      max_tokens: max_tokens || @max_tokens,
      temperature: temperature || @temperature,
      response_format: json ? {type: "json_object"} : nil
    }.compact
    response = post(path: "/openai/v1/chat/completions", body: body)
    if response.status == 200
      response.body.dig("choices", 0, "message")
    else
      # TODO: send the response.body back in Error object
      puts "Error: #{response.status}"
      pp response.body
      raise Error, "Request failed with status #{response.status}: #{response.body}"
    end
  end

  def get(path:)
    client.get do |req|
      req.url path
      req.headers["Authorization"] = "Bearer #{@api_key}"
    end
  end

  def post(path:, body:)
    client.post do |req|
      req.url path
      req.headers["Authorization"] = "Bearer #{@api_key}"
      req.body = body
    end
  end

  def client
    @client ||= Faraday.new(url: @api_url) do |f|
      f.request :json # automatically encode the request body as JSON
      f.response :json # automatically decode JSON responses
      f.adapter Faraday.default_adapter
    end
  end
end
data/lib/groq/configuration.rb
ADDED
@@ -0,0 +1,28 @@
class Groq::Configuration
  attr_writer :api_key
  attr_accessor :model_id, :max_tokens, :temperature
  attr_accessor :api_url, :request_timeout, :extra_headers

  DEFAULT_API_URL = "https://api.groq.com"
  DEFAULT_REQUEST_TIMEOUT = 5
  DEFAULT_MAX_TOKENS = 1024
  DEFAULT_TEMPERATURE = 1

  class Error < StandardError; end

  def initialize
    @api_key = ENV["GROQ_API_KEY"]
    @api_url = DEFAULT_API_URL
    @request_timeout = DEFAULT_REQUEST_TIMEOUT
    @extra_headers = {}

    @model_id = Groq::Model.default_model_id
    @max_tokens = DEFAULT_MAX_TOKENS
    @temperature = DEFAULT_TEMPERATURE
  end

  def api_key
    return @api_key if @api_key
    raise Error, "No GROQ API key provided. Set via $GROQ_API_KEY or Groq.configuration.api_key"
  end
end
data/lib/groq/helpers.rb
ADDED
@@ -0,0 +1,29 @@
require "active_support/concern"

module Groq::Helpers
  extend ActiveSupport::Concern
  included do
    def U(content)
      {role: "user", content: content}
    end
    alias_method :User, :U

    def A(content)
      {role: "assistant", content: content}
    end
    alias_method :Assistant, :A

    def S(content)
      {role: "system", content: content}
    end
    alias_method :System, :S

    def T(content, tool_call_id:, name:)
      {role: "function", tool_call_id: tool_call_id, name: name, content: content}
    end
    alias_method :Tool, :T
    alias_method :ToolReply, :T
    alias_method :Function, :T
    alias_method :F, :T
  end
end
data/lib/groq/model.rb
ADDED
@@ -0,0 +1,71 @@
class Groq::Model
  MODELS = [
    {
      name: "LLaMA3 8b",
      model_id: "llama3-8b-8192",
      developer: "Meta",
      context_window: 8192,
      model_card: "https://huggingface.co/meta-llama/Meta-Llama-3-8B"
    },
    {
      name: "LLaMA3 70b",
      model_id: "llama3-70b-8192",
      developer: "Meta",
      context_window: 8192,
      model_card: "https://huggingface.co/meta-llama/Meta-Llama-3-70B"
    },
    {
      name: "LLaMA2 70b",
      model_id: "llama2-70b-4096",
      developer: "Meta",
      context_window: 4096,
      model_card: "https://huggingface.co/meta-llama/Llama-2-70b"
    },
    {
      name: "Mixtral 8x7b",
      model_id: "mixtral-8x7b-32768",
      developer: "Mistral",
      context_window: 32768,
      model_card: "https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1"
    },
    {
      name: "Gemma 7b",
      model_id: "gemma-7b-it",
      developer: "Google",
      context_window: 8192,
      model_card: "https://huggingface.co/google/gemma-1.1-7b-it"
    }
  ]

  class << self
    def model_ids
      MODELS.map { |m| m[:model_id] }
    end

    def default_model
      MODELS.first
    end

    def default_model_id
      default_model[:model_id]
    end

    # https://api.groq.com/openai/v1/models
    # Output:
    # {"object": "list",
    #  "data": [
    #    {
    #      "id": "gemma-7b-it",
    #      "object": "model",
    #      "created": 1693721698,
    #      "owned_by": "Google",
    #      "active": true,
    #      "context_window": 8192
    #    },
    def load_models(client:)
      client ||= Groq::Client.new
      response = client.get(path: "/openai/v1/models")
      response.body
    end
  end
end
data/lib/groq/version.rb
ADDED
data/lib/groq-ruby.rb
ADDED
@@ -0,0 +1 @@
require_relative "groq"
data/lib/groq.rb
ADDED
@@ -0,0 +1,20 @@
# frozen_string_literal: true

require_relative "groq/version"

module Groq
  autoload :Configuration, "groq/configuration"
  autoload :Client, "groq/client"
  autoload :Model, "groq/model"
  autoload :Helpers, "groq/helpers"

  class << self
    def configuration
      @configuration ||= Configuration.new
    end

    def configure
      yield configuration
    end
  end
end
data/sig/groq.rbs
ADDED
metadata
ADDED
@@ -0,0 +1,130 @@
--- !ruby/object:Gem::Specification
name: groq
version: !ruby/object:Gem::Version
  version: 0.2.0
platform: ruby
authors:
- Dr Nic Williams
autorequire:
bindir: exe
cert_chain: []
date: 2024-04-20 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: faraday
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
- !ruby/object:Gem::Dependency
  name: json
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: activesupport
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">"
      - !ruby/object:Gem::Version
        version: '5'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">"
      - !ruby/object:Gem::Version
        version: '5'
- !ruby/object:Gem::Dependency
  name: vcr
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '6.0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '6.0'
- !ruby/object:Gem::Dependency
  name: webmock
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.0'
description: Client library for Groq API for fast LLM inference.
email:
- drnicwilliams@gmail.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- ".tool-versions"
- CHANGELOG.md
- CODE_OF_CONDUCT.md
- LICENSE.txt
- README.md
- Rakefile
- docs/images/groq-speed-price-20240421.png
- lib/groq-ruby.rb
- lib/groq.rb
- lib/groq/client.rb
- lib/groq/configuration.rb
- lib/groq/helpers.rb
- lib/groq/model.rb
- lib/groq/version.rb
- sig/groq.rbs
homepage: https://github.com/drnic/groq-ruby
licenses:
- MIT
metadata:
  allowed_push_host: https://rubygems.org
  homepage_uri: https://github.com/drnic/groq-ruby
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: 3.1.0
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.5.5
signing_key:
specification_version: 4
summary: Client library for Groq API for fast LLM inference.
test_files: []