groq 0.3.0 → 0.3.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: af8963c428e4a7760f76a17a48d6c92cdae50363c96d6e97eef21293a3321beb
- data.tar.gz: 92d619e893e9fa727c76f42c85f095cc1e874ba78474840f7c6b1d139d691077
+ metadata.gz: a27c810eca4d98436dc29bad4d48332a0f00440b3da642895d15739e43f9d9b0
+ data.tar.gz: 021b1ca81acc6d07d059b49b9e4a4771997fbcf2f5f8b026872a28a17052b936
  SHA512:
- metadata.gz: bc25feb5cfdaf37955932e59f079568ce4f135dcd61eade7d13bf1d62777fec08642f246bbe2b90d4b0c0ffaaa28e9bc18092814cd19b2da3f9c06486d713417
- data.tar.gz: 2c436ded0ab2152c5625a7b0690942b7d24b71cb36aef56a95984a5faa31ee8b959230f7f5389c4941ed613f19608971a00c415c22b7b6199d04f6a8674f2a14
+ metadata.gz: beafd45e9e8d1fa716f7fbe409b700d5759d55969fc35fef905aff1e02a340b969170ebc4c19ae267755df74325aa88ad9076cb799877de6497b9c2bd393b6df
+ data.tar.gz: f99f6a845c58eec585508f7e90bb88717152d72a5e535462e2d2a470fd1742974968dac0c901626d03acc39a4cbef5a3fe43a00c49c1a87d66ad0ea966c5bd3d
data/README.md CHANGED
@@ -60,11 +60,14 @@ JSON.parse(response["content"])
 
  Install the gem and add to the application's Gemfile by executing:
 
- > bundle add groq
+ ```plain
+ bundle add groq
  ```
+
  If bundler is not being used to manage dependencies, install the gem by executing:
 
- > gem install groq
+ ```plain
+ gem install groq
  ```
  ## Usage
 
@@ -76,13 +79,19 @@ If bundler is not being used to manage dependencies, install the gem by executin
  client = Groq::Client.new # uses ENV["GROQ_API_KEY"] and "llama3-8b-8192"
  client = Groq::Client.new(api_key: "...", model_id: "llama3-8b-8192")
 
- Groq.configuration do |config|
+ Groq.configure do |config|
    config.api_key = "..."
    config.model_id = "llama3-70b-8192"
  end
  client = Groq::Client.new
  ```
 
+ In a Rails application, you can generate a `config/initializers/groq.rb` file with:
+
+ ```plain
+ rails g groq:install
+ ```
+
  There is a simple chat function to send messages to a model:
 
  ```ruby
@@ -101,8 +110,10 @@ client.chat([
 
  ### Interactive console (IRb)
 
- > bin/console
+ ```plain
+ bin/console
  ```
+
  This repository has a `bin/console` script to start an interactive console to play with the Groq API. The `@client` variable is set up using the `$GROQ_API_KEY` environment variable, and the `U`, `A`, `T` helpers are already included.
 
  ```ruby
@@ -160,7 +171,7 @@ As above, you can specify the default model to use for all `chat()` calls:
  ```ruby
  client = Groq::Client.new(model_id: "llama3-70b-8192")
  # or
- Groq.configuration do |config|
+ Groq.configure do |config|
    config.model_id = "llama3-70b-8192"
  end
  ```
@@ -318,10 +329,10 @@ The defaults are:
  => 1
  ```
 
- You can override them in the `Groq.configuration` block, or with each `chat()` call:
+ You can override them in the `Groq.configure` block, or with each `chat()` call:
 
  ```ruby
- Groq.configuration do |config|
+ Groq.configure do |config|
    config.max_tokens = 512
    config.temperature = 0.5
  end
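The `Groq.configuration` → `Groq.configure` rename above follows the common block-based configuration pattern. As a generic, standalone sketch of that pattern (not the gem's actual internals; `TinyConfigurable` is a made-up name):

```ruby
# Generic sketch of the block-based configuration pattern that
# `Groq.configure` follows. TinyConfigurable is illustrative only,
# not the gem's actual implementation.
module TinyConfigurable
  Configuration = Struct.new(:api_key, :model_id, :max_tokens, :temperature)

  # Memoized configuration object shared by all callers
  def self.configuration
    @configuration ||= Configuration.new
  end

  # Yields the configuration so callers can set defaults in a block
  def self.configure
    yield configuration
  end
end

TinyConfigurable.configure do |config|
  config.model_id = "llama3-70b-8192"
  config.max_tokens = 512
  config.temperature = 0.5
end

puts TinyConfigurable.configuration.model_id
puts TinyConfigurable.configuration.max_tokens
```

Because the configuration object is memoized, later `configure` blocks or direct reads see the same defaults.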
data/examples/README.md CHANGED
@@ -5,9 +5,9 @@
  Chat with a pre-defined agent using the following command:
 
  ```bash
- bundle exec examples/groq-user-chat.rb
+ bundle exec examples/user-chat.rb
  # or
- bundle exec examples/groq-user-chat.rb --agent-prompt examples/agent-prompts/helloworld.yml
+ bundle exec examples/user-chat.rb --agent-prompt examples/agent-prompts/helloworld.yml
  ```
 
  There are two example agent prompts available:
@@ -20,14 +20,47 @@ At the prompt, either talk to the AI agent, or use some special commands:
  - `exit` to exit the conversation
  - `summary` to get a summary of the conversation so far
 
- ### Streaming
+ ### Streaming text chunks
 
  There is also an example of streaming the conversation to the terminal as it is received from the Groq API.
 
  It defaults to the slower `llama3-70b-8192` model so that the streaming is more noticeable.
 
  ```bash
- bundle exec examples/groq-user-chat-streaming.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml
+ bundle exec examples/user-chat-streaming.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml
+ ```
+
+ ### Streaming useful chunks (e.g. JSON)
+
+ If the response returns a list of objects, such as a sequence of JSON objects, you can stream the chunks that make up each JSON object and process it as soon as it is complete.
+
+ ```bash
+ bundle exec examples/streaming-to-json-objects.rb
+ ```
+
+ This will produce JSON for each planet in the solar system, one at a time. The API does not return each JSON object as a single chunk; rather, it returns fragments such as `{` and `"` and `name` as distinct chunks. But the example code [`examples/streaming-to-json-objects.rb`](examples/streaming-to-json-objects.rb) shows how you might build up JSON objects from chunks and process each one (e.g. store it to a DB) as soon as it is complete.
+
+ The system prompt used is:
+
+ ```plain
+ Write out the names of the planets of our solar system, and a brief description of each one.
+
+ Return JSON object for each one:
+
+ { "name": "Mercury", "position": 1, "description": "Mercury is ..." }
+
+ Between each response, say "NEXT" to clearly delineate each JSON response.
+
+ Don't say anything else except the JSON objects above.
+ ```
+
+ The code in the repo uses the `NEXT` token to know when a JSON object is complete and ready to process.
+
+ The output will look like the following, with each JSON object printed (or saved to a DB) only when it has been completely built from chunks.
+
+ ```json
+ {"name":"Mercury","position":1,"description":"Mercury is the smallest planet in our solar system, with a highly elliptical orbit that takes it extremely close to the sun."}
+ {"name":"Venus","position":2,"description":"Venus is often called Earth's twin due to their similar size and mass, but it has a thick atmosphere that traps heat, making it the hottest planet."}
  ```
 
  ### Pizzeria
@@ -35,7 +68,7 @@ bundle exec examples/groq-user-chat-streaming.rb --agent-prompt examples/agent-p
  Run the pizzeria example with the following command:
 
  ```bash
- bundle exec examples/groq-user-chat.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml
+ bundle exec examples/user-chat.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml
  ```
 
  > 🍕 Hello! Thank you for calling our pizzeria. I'm happy to help you with your inquiry. Do you have a question about our menu or would you like to place an order?
@@ -59,3 +92,29 @@ bundle exec examples/groq-user-chat.rb --agent-prompt examples/agent-prompts/piz
  > The conversation started with a customer calling the pizzeria and speaking with an AI assistant. The assistant offered to help with menu inquiries or taking an order. The customer was considering ordering from the menu. The assistant presented the cheapest menu item, Garlic Knots, and asked if the customer wanted to add it to their order. The customer may have been interested in other options as well. The assistant then presented the cheapest pizza option, the Veggie Pizza, and asked if the customer wanted to order it along with the Garlic Knots. The customer agreed, and the assistant took note of the order, which consisted of a Veggie Pizza and 6 Garlic Knots for a total of $18. The assistant asked how the customer would like to pay for their order.
  >
  > 😋 exit
+
+ ## Two Agents
+
+ Here is an example of two agents talking to each other, without any user input.
+
+ ### Buying Pizza
+
+ ```bash
+ bundle exec examples/groq-two-agents-chatting.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml --agent-prompt examples/agent-prompts/food-customer.yml
+ ```
+
+ This will proceed for 10 turns, with the agents taking turns to speak. Pass the `-i 15` option to increase the number of turns.
+
+ > 🍕 Hello! Welcome to Pizza Palace. Thank you for reaching out to us. How can I assist you today?
+ >
+ > 😋 Hi! Thanks for having me. I'm actually pretty hungry, so I'm looking to order some food. Can you tell me a bit about your menu? What are some of your popular items?
+ >
+ > 🍕 I'd be happy to tell you more about our menu. We have a variety of delicious options to choose from. Our most popular items include our Margherita Pizza, Pepperoni Pizza, and BBQ Chicken Pizza. The Margherita is a classic with tomato sauce, mozzarella, and fresh basil. The Pepperoni Pizza is a crowd-pleaser with a generous layer of pepperoni on top. And our BBQ Chicken Pizza has a sweet and tangy BBQ sauce, topped with chicken, onions, and cilantro.
+ >
+ > We also have some great non-pizza options, such as our Garlic Knots, which are a favorite among our customers. And for dessert, our Cannoli are a must-try - they're filled with creamy ricotta cheese and chocolate chips.
+ >
+ > What sounds good to you? Would you like me to walk you through our entire menu or is there something specific you're in the mood for?
+ >
+ > 😋 Mmm, everything sounds delicious! I think I'll go for something a bit hearty. Can you tell me more about the BBQ Chicken Pizza? What kind of chicken is used? And is the pepperoni on the Pepperoni Pizza thick-cut or thin-cut?
+ >
+ > Also, how would you recommend ordering the Garlic Knots? Are they a side dish or can I get them as part of a combo?
examples/agent-prompts/food-customer.yml ADDED
@@ -0,0 +1,12 @@
+ ---
+ name: "Food Customer"
+ system_prompt: |-
+   You are a hungry customer looking to order some food.
+
+   You can ask about the menu, place an order, or inquire about delivery options.
+
+   When asked about delivery, you say you'll pick up.
+   When asked about payment, you confirm you'll pay when you pick up.
+   You have $25 to spend.
+ agent_emoji: "😋"
+ can_go_first: true
examples/agent-prompts/helloworld.yml CHANGED
@@ -1,5 +1,6 @@
  ---
- system: |-
+ name: "Hello World"
+ system_prompt: |-
    I am a friendly agent who always replies to any prompt
    with a pleasant "Hello" and wishing them well.
  agent_emoji: "🤖"
examples/agent-prompts/pizzeria-sales.yml CHANGED
@@ -1,5 +1,6 @@
  ---
- system: |-
+ name: "Pizzeria Sales"
+ system_prompt: |-
    You are a phone operator at a busy pizzeria. Your responsibilities include answering calls and online chats from customers who may ask about the menu, wish to place or change orders, or inquire about opening hours.
 
    Here are some of our popular menu items:
examples/groq-two-agents-chatting.rb ADDED
@@ -0,0 +1,124 @@
+ #!/usr/bin/env ruby
+ #
+ # This is a variation of user-chat.rb but without any user prompting.
+ # Just two agents chatting with each other.
+
+ require "optparse"
+ require "groq"
+ require "yaml"
+
+ include Groq::Helpers
+
+ @options = {
+   model: "llama3-8b-8192",
+   # model: "llama3-70b-8192",
+   agent_prompt_paths: [],
+   timeout: 20,
+   interaction_count: 10 # total count of interactions between agents
+ }
+ OptionParser.new do |opts|
+   opts.banner = "Usage: ruby script.rb [options]"
+
+   opts.on("-m", "--model MODEL", "Model name") do |v|
+     @options[:model] = v
+   end
+
+   opts.on("-a", "--agent-prompt PATH", "Path to an agent prompt file") do |v|
+     @options[:agent_prompt_paths] << v
+   end
+
+   opts.on("-t", "--timeout TIMEOUT", "Timeout in seconds") do |v|
+     @options[:timeout] = v.to_i
+   end
+
+   opts.on("-d", "--debug", "Enable debug mode") do |v|
+     @options[:debug] = v
+   end
+
+   opts.on("-i", "--interaction-count COUNT", "Total count of interactions between agents") do |v|
+     @options[:interaction_count] = v.to_i
+   end
+ end.parse!
+
+ raise "Need two --agent-prompt paths" if @options[:agent_prompt_paths]&.length&.to_i != 2
+
+ def debug?
+   @options[:debug]
+ end
+
+ # Will be instantiated from the agent prompt file
+ class Agent
+   def initialize(args = {})
+     args.each do |k, v|
+       instance_variable_set(:"@#{k}", v)
+     end
+     @messages = [S(@system_prompt)]
+   end
+   attr_reader :messages
+   attr_reader :name, :can_go_first, :user_emoji, :agent_emoji, :system_prompt
+   def can_go_first?
+     @can_go_first
+   end
+
+   def self.load_from_file(path)
+     new(YAML.load_file(path))
+   end
+ end
+
+ # Read the agent prompts from the files
+ agents = @options[:agent_prompt_paths].map do |agent_prompt_path|
+   Agent.load_from_file(agent_prompt_path)
+ end
+ go_first = agents.find { |agent| agent.can_go_first? } || agents.first
+
+ # check that each agent contains a system prompt
+ agents.each do |agent|
+   raise "Agent #{agent.name} is missing a system prompt" if agent.system_prompt.nil?
+ end
+
+ # Initialize the Groq client
+ @client = Groq::Client.new(model_id: @options[:model], request_timeout: @options[:timeout]) do |f|
+   if debug?
+     require "logger"
+
+     # Create a logger instance
+     logger = Logger.new($stdout)
+     logger.level = Logger::DEBUG
+
+     f.response :logger, logger, bodies: true # Log request and response bodies
+   end
+ end
+
+ puts "Welcome to a conversation between #{agents.map(&:name).join(", ")}. Our first speaker will be #{go_first.name}."
+
+ agent_speaking_index = agents.index(go_first)
+ loop_count = 0
+
+ loop do
+   speaking_agent = agents[agent_speaking_index]
+   # Show speaking agent emoji immediately to indicate request going to Groq API
+   print("#{speaking_agent.agent_emoji} ")
+
+   # Use Groq to generate a response
+   response = @client.chat(speaking_agent.messages)
+
+   # Finish the speaking agent line on screen with message response
+   puts(message = response.dig("content"))
+
+   # speaking agent tracks its own message as the Assistant
+   speaking_agent.messages << A(message)
+
+   # other agent tracks the message as the User
+   other_agents = agents.reject { |agent| agent == speaking_agent }
+   other_agents.each do |agent|
+     agent.messages << U(message)
+   end
+
+   agent_speaking_index = (agent_speaking_index + 1) % agents.length
+   loop_count += 1
+   break if loop_count > @options[:interaction_count]
+ rescue Faraday::TooManyRequestsError
+   warn "...\n\nGroq API error: too many requests. Exiting."
+   exit 1
+ end
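The round-robin bookkeeping in the loop above can be sketched without any API calls. `StubAgent` and the canned replies below are made up for illustration: each reply is stored as an "assistant" message for the speaker and as a "user" message for every other agent.

```ruby
# Standalone sketch of the turn-taking bookkeeping in
# groq-two-agents-chatting.rb, with the Groq call replaced by a
# canned reply. StubAgent is illustrative only.
StubAgent = Struct.new(:name, :messages)

agents = [StubAgent.new("pizzeria", []), StubAgent.new("customer", [])]
speaking_index = 0
transcript = []

4.times do |turn|
  speaker = agents[speaking_index]
  message = "#{speaker.name} speaks on turn #{turn}"
  transcript << message

  # speaker records its own words as the Assistant...
  speaker.messages << { role: "assistant", content: message }
  # ...and every other agent hears them as the User
  (agents - [speaker]).each { |other| other.messages << { role: "user", content: message } }

  # rotate to the next speaker
  speaking_index = (speaking_index + 1) % agents.length
end

puts transcript
```

Each agent therefore ends up with an alternating assistant/user history from its own point of view, which is exactly what the chat API expects.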
@@ -0,0 +1,87 @@
+ #!/usr/bin/env ruby
+
+ require "optparse"
+ require "groq"
+ require "json"
+ require "yaml"
+
+ include Groq::Helpers
+
+ @options = {
+   model: "llama3-70b-8192",
+   timeout: 20
+ }
+ OptionParser.new do |opts|
+   opts.banner = "Usage: ruby script.rb [options]"
+
+   opts.on("-m", "--model MODEL", "Model name") do |v|
+     @options[:model] = v
+   end
+
+   opts.on("-t", "--timeout TIMEOUT", "Timeout in seconds") do |v|
+     @options[:timeout] = v.to_i
+   end
+
+   opts.on("-d", "--debug", "Enable debug mode") do |v|
+     @options[:debug] = v
+   end
+ end.parse!
+
+ raise "Missing --model option" if @options[:model].nil?
+
+ # Initialize the Groq client
+ @client = Groq::Client.new(model_id: @options[:model], request_timeout: @options[:timeout]) do |f|
+   if @options[:debug]
+     require "logger"
+
+     # Create a logger instance
+     logger = Logger.new($stdout)
+     logger.level = Logger::DEBUG
+
+     f.response :logger, logger, bodies: true # Log request and response bodies
+   end
+ end
+
+ prompt = <<~TEXT
+   Write out the names of the planets of our solar system, and a brief description of each one.
+
+   Return JSON object for each one:
+
+   { "name": "Mercury", "position": 1, "description": "Mercury is ..." }
+
+   Between each response, say "NEXT" to clearly delineate each JSON response.
+
+   Don't say anything else except the JSON objects above.
+ TEXT
+
+ # Handle each JSON object once it has been fully streamed
+
+ class PlanetStreamer
+   def initialize
+     @buffer = ""
+   end
+
+   def call(content)
+     if !content || content.include?("NEXT")
+       json = JSON.parse(@buffer)
+
+       # do something with JSON, e.g. save to database
+       puts json.to_json
+
+       # reset buffer
+       @buffer = ""
+       return
+     end
+     # if @buffer is empty and content is not a JSON start {, then ignore and return
+     if @buffer.empty? && !content.start_with?("{")
+       return
+     end
+
+     # build JSON
+     @buffer << content
+   end
+ end
+
+ streamer = PlanetStreamer.new
+
+ @client.chat([S(prompt)], stream: streamer)
+ puts
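To see the buffering idea in `PlanetStreamer` without hitting the API, here is a self-contained variant fed simulated chunks. `JsonChunkAssembler` and the sample chunks are made up for illustration; it collects the parsed objects instead of printing their JSON.

```ruby
require "json"

# Same buffering idea as PlanetStreamer above, but collecting parsed
# objects so it can be exercised offline. JsonChunkAssembler and the
# simulated chunks are illustrative only.
class JsonChunkAssembler
  attr_reader :objects

  def initialize
    @buffer = ""
    @objects = []
  end

  def call(content)
    # nil marks end of stream; "NEXT" delimits complete objects
    if content.nil? || content.include?("NEXT")
      @objects << JSON.parse(@buffer) unless @buffer.empty?
      @buffer = ""
      return
    end
    # ignore noise arriving before an object starts
    return if @buffer.empty? && !content.start_with?("{")
    @buffer << content
  end
end

assembler = JsonChunkAssembler.new
# The API streams tiny fragments, not whole objects:
chunks = ['{"name"', ': "Mercury", "position": 1}', "\nNEXT\n",
          '{"name": "Venus", "position": 2}', nil]
chunks.each { |chunk| assembler.call(chunk) }

assembler.objects.each { |obj| puts obj["name"] }
# prints:
# Mercury
# Venus
```

The object is only parsed once a delimiter (or end of stream) arrives, so partial fragments never reach `JSON.parse`.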
@@ -34,10 +34,6 @@ end.parse!
  raise "Missing --model option" if @options[:model].nil?
  raise "Missing --agent-prompt option" if @options[:agent_prompt_path].nil?
 
- def debug?
-   @options[:debug]
- end
-
  # Read the agent prompt from the file
  agent_prompt = YAML.load_file(@options[:agent_prompt_path])
  user_emoji = agent_prompt["user_emoji"]
@@ -47,7 +43,7 @@ can_go_first = agent_prompt["can_go_first"]
 
  # Initialize the Groq client
  @client = Groq::Client.new(model_id: @options[:model], request_timeout: @options[:timeout]) do |f|
- if debug?
+ if @options[:debug]
    require "logger"
 
    # Create a logger instance
@@ -35,10 +35,6 @@ end.parse!
  raise "Missing --model option" if @options[:model].nil?
  raise "Missing --agent-prompt option" if @options[:agent_prompt_path].nil?
 
- def debug?
-   @options[:debug]
- end
-
  # Read the agent prompt from the file
  agent_prompt = YAML.load_file(@options[:agent_prompt_path])
  user_emoji = agent_prompt["user_emoji"]
@@ -48,7 +44,7 @@ can_go_first = agent_prompt["can_go_first"]
 
  # Initialize the Groq client
  @client = Groq::Client.new(model_id: @options[:model], request_timeout: @options[:timeout]) do |f|
- if debug?
+ if @options[:debug]
    require "logger"
 
    # Create a logger instance
lib/generators/groq/install_generator.rb ADDED
@@ -0,0 +1,20 @@
+ require "rails/generators/base"
+
+ module Groq
+   module Generators
+     class InstallGenerator < Rails::Generators::Base
+       source_root File.expand_path("templates", __dir__)
+
+       def create_groq_init_file
+         create_file "config/initializers/groq.rb", <<~RUBY
+           # frozen_string_literal: true
+
+           Groq.configure do |config|
+             config.api_key = ENV["GROQ_API_KEY"]
+             config.model_id = "llama3-70b-8192"
+           end
+         RUBY
+       end
+     end
+   end
+ end
data/lib/groq/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module Groq
- VERSION = "0.3.0"
+ VERSION = "0.3.1"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: groq
  version: !ruby/object:Gem::Version
-   version: 0.3.0
+   version: 0.3.1
  platform: ruby
  authors:
  - Dr Nic Williams
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2024-04-25 00:00:00.000000000 Z
+ date: 2024-05-05 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: faraday
@@ -123,10 +123,14 @@ files:
  - Rakefile
  - docs/images/groq-speed-price-20240421.png
  - examples/README.md
+ - examples/agent-prompts/food-customer.yml
  - examples/agent-prompts/helloworld.yml
  - examples/agent-prompts/pizzeria-sales.yml
- - examples/groq-user-chat-streaming.rb
- - examples/groq-user-chat.rb
+ - examples/groq-two-agents-chatting.rb
+ - examples/streaming-to-json-objects.rb
+ - examples/user-chat-streaming.rb
+ - examples/user-chat.rb
+ - lib/generators/groq/install_generator.rb
  - lib/groq-ruby.rb
  - lib/groq.rb
  - lib/groq/client.rb