promptcraft 0.1.0 → 0.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: ccfea1d708de15e5bc3e3d8b2bf179813014ba45f84d4a2b950fcb91109c1698
- data.tar.gz: 36acdfa7e390788b0cc62a3f88bd6259cbc8cab0140d8d1b0c68b9eb197f9839
+ metadata.gz: 7759927da72f5b771061dbe0b7a465e1509728f9ced3b6ad5efc7b341c754576
+ data.tar.gz: e4a89b721bdae7fde071f8c13aea70259bc909c35ab5c77ce82ea9091130441d
  SHA512:
- metadata.gz: 23b0f0febcf5574f183b6f35a0452c75ba4f4be43e1edb35dfa96c0b957d0d9db42e4bf9a7edaf212d080e5e41b91a959d81d5a56e8addca7819a85552bc43ba
- data.tar.gz: be1c072618fe888cae89f2e1ef2a5e781b9ce65c3bc38b7359d507487ee95d114a7c76286920fba2cc170adfc7a19f2099581668dde80518e213ac5c83672282
+ metadata.gz: 5a8eaaac272a535c1f9e8b704b05a3871fa0ca681f42657257cdc66778ad69b84a4cbd74b9ecdd0317f7b80b074d6f9b88dded966b43ec1c8a9a43d0198ee289
+ data.tar.gz: 8fc6e1a05a7a9a16a29b48fc47dd9006a06cd487fb476e53a456d39982aad2df8242a66393c3fa7865583c315d2af17a2be569e6125f21451d01ab66c0a6ec46
data/README.md CHANGED
@@ -19,7 +19,7 @@ messages:
  Let's replay this single conversation with a new system prompt:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --prompt "I'm terrible at maths. If I'm asked a maths question, I reply with a question." \
  --conversation examples/maths/start/already_answered.yml
  ```
@@ -69,7 +69,7 @@ messages:
  The CLI will replay each conversation with the new system prompt.

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/already_answered_multiple.yml \
  --prompt "I like cats. Answer any questions using cats."
  ```
@@ -118,7 +118,7 @@ When you're getting started, you don't even need to know the conversation file f

  ```plain
  echo "---\nWhat is 2+2?\n---\nWhat is 6 divided by 2?" | \
- bundle exec exe/promptcraft --prompt "I solve maths using pizza metaphors."
+ promptcraft --prompt "I solve maths using pizza metaphors."
  ```

  The output will be our conversation YAML format, with the system prompt, the incoming user messages as separate conversations, and the assistant replies within each conversation:
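
For readers new to the format, here is a rough, hand-written sketch of such a conversation document, not output captured from the tool: the keys mirror the `Conversation#to_yaml` method further down this diff, and the assistant reply is invented for illustration.

```ruby
# Illustrative only: build one conversation document with the keys promptcraft
# emits (system_prompt, llm, messages) and print it as YAML.
require "yaml"

conversation = {
  "system_prompt" => "I solve maths using pizza metaphors.",
  "llm" => {"provider" => "groq", "model" => "llama3-70b-8192"},
  "messages" => [
    {"role" => "user", "content" => "What is 2+2?"},
    {"role" => "assistant", "content" => "Picture a pizza cut into 4 slices..."} # made-up reply
  ]
}

puts YAML.dump(conversation)
```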
@@ -164,7 +164,7 @@ You'll notice, the LLM used (which defaults to Groq's `llama3-70b-8192` because
  Of course, you could pass each plain text user message using the `--conversation` argument too:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation "What is 2+2?" \
  --conversation "What is 6 divided by 2?" \
  --prompt "I solve maths using pizza metaphors."
@@ -173,13 +173,13 @@ bundle exec exe/promptcraft \
  Why does it output YAML (or JSON, if you pass the `--json` flag)? So that you can save it to a file, and then replay (or rechat) this new set of conversations in a minute with a new system prompt.

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation "What is 2+2?" \
  --conversation "What is 6 divided by 2?" \
  --prompt "I am a happy person." \
  > tmp/maths-as-happy-person.yml

- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation tmp/maths-as-happy-person.yml \
  --prompt "I solve maths using pizza metaphors." \
  > tmp/maths-with-pizza.yml
@@ -190,7 +190,7 @@ echo "I am an excellent maths tutor.
  When I'm asked a maths question, I will first
  ask a question in return to help the student." > tmp/prompt-maths-tutor.txt

- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation tmp/maths-with-pizza.yml \
  --prompt tmp/prompt-maths-tutor.txt
  ```
@@ -210,7 +210,7 @@ messages:
  With each one separated new line. Say nothing else except producing YAML.
  " > tmp/prompt-list-20-hellos.txt

- bundle exec exe/promptcraft \
+ promptcraft \
  -c "Generate a list of 20 things a customer might say when they first ring into a hair salon phone service" \
  -p tmp/prompt-list-20-hellos.txt \
  --format json > tmp/hair-salon-20-hellos.json
@@ -222,7 +222,7 @@ cat tmp/hair-salon-20-hellos.json | jq -r ".messages[1].content" \
  The file `tmp/hair-salon-20-0000.txt` now contains 20 user messages that you can use to initiate a conversation with your AI assistant system prompt.

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  -p "I'm a hair salon phone service. I sell haircuts" \
  -c tmp/hair-salon-20-0000.txt \
  > tmp/hair-salon-20-replies-0001.yml
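
The `jq -r ".messages[1].content"` step shown above pulls the assistant reply out of that JSON document. For anyone who would rather stay in Ruby, a roughly equivalent sketch (standard library only; assumes the JSON file generated above exists):

```ruby
# Sketch of the jq extraction: read the JSON conversation document and write
# the second message (the assistant reply) out as the plain-text user messages file.
require "json"

doc = JSON.parse(File.read("tmp/hair-salon-20-hellos.json"))
File.write("tmp/hair-salon-20-0000.txt", doc["messages"][1]["content"])
```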
@@ -263,7 +263,27 @@ Tools you might want to use in conjunction with `promptcraft`:

  ## Installation

- Right now, you need to run the CLI from the source code.
+ Right now, you can install it in one of two ways:
+
+ * Homebrew
+ * Running the CLI from the source code
+
+ Whilst this is a RubyGem, it currently depends on some Git branches that have not yet been released. The Homebrew recipe takes a big tarball of all source code and dependencies and installs it. The tarball is also available via [GitHub Releases](https://github.com/drnic/promptcraft/releases).
+
+ Once those Git branches are released, the `promptcraft` gem will also be installable via RubyGems.
+
+ It requires Ruby 3.3 or later. Mostly because I like the new syntax features.
+
+ ### Homebrew
+
+ The project is currently distributed by the Homebrew tap [`drnic/ai`](https://github.com/drnic/homebrew-ai).
+
+ ```plain
+ brew tap drnic/ai
+ brew install promptcraft
+ ```
+
+ ### Run from Source

  ```plain
  git clone https://github.com/drnic/promptcraft
@@ -275,7 +295,9 @@ bundle exec exe/promptcraft \
  --provider groq
  ```

- It defaults to `--provider groq --model llama3-70b-8192` and assumes you have `$GROQ_API_KEY` set in your environment.
+ ### Configuration
+
+ The `promptcraft` CLI defaults to `--provider groq --model llama3-70b-8192` and assumes you have `$GROQ_API_KEY` set in your environment.

  You can also use [OpenAI](https://openai.com/) with `--provider openai`, which defaults to `--model gpt-3.5-turbo`. It assumes you have `$OPENAI_API_KEY` set in your environment.
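
In code, those defaults correspond to the `Promptcraft::Llm` class changed later in this diff. A minimal sketch, assuming the gem is loadable with `require "promptcraft"` and the relevant API keys are set in the environment:

```ruby
# Sketch: constructing the LLM wrappers behind the CLI defaults described above.
require "promptcraft"

groq = Promptcraft::Llm.new                       # provider "groq", model "llama3-70b-8192"
openai = Promptcraft::Llm.new(provider: "openai") # model defaults to "gpt-3.5-turbo"

p groq.to_h   # {provider: "groq", model: "llama3-70b-8192"}
p openai.to_h # {provider: "openai", model: "gpt-3.5-turbo"}
```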
 
@@ -286,7 +308,7 @@ You can also use [Ollama](https://ollama.com/) locally with `--provider ollama`,
  If the conversation file has an `llm` key with `provider` and `model` keys, then those will be used instead of the defaults.

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/already_answered_gpt4.yml \
  --prompt "I always reply with a question"
 
@@ -311,7 +333,7 @@ The following example commands assume you have `$GROQ_API_KEY` and will use Groq
  Run `promptcraft` with no arguments to get a default prompt and an initial assistant message.

  ```plain
- bundle exec exe/promptcraft
+ promptcraft
  ```

  The output might be:
@@ -327,13 +349,13 @@ messages:
  Provide a different provider, such as `openai` (which assumes you have `$OPENAI_API_KEY` set in your environment).

  ```plain
- bundle exec exe/promptcraft --provider openai
+ promptcraft --provider openai
  ```

  Or you could provide your own system prompt, and it will generate an initial assistant message.

  ```plain
- bundle exec exe/promptcraft --prompt "I like to solve maths problems."
+ promptcraft --prompt "I like to solve maths problems."
  ```

  The output might be:
@@ -367,26 +389,26 @@ The primary point of `promptcraft` is to replay conversations with a new system
  An example of the `--conversation` option:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/basic.yml
  ```

  You can also pipe a stream of conversation YAML into `promptcraft` via STDIN:

  ```plain
- echo "---\nsystem_prompt: I like to solve maths problems.\nmessages:\n- role: \"user\"\n content: \"What is 2+2?\"" | bundle exec exe/promptcraft
+ echo "---\nsystem_prompt: I like to solve maths problems.\nmessages:\n- role: \"user\"\n content: \"What is 2+2?\"" | promptcraft
  ```

  JSON is valid YAML, so you can also use JSON:

  ```plain
- echo "{\"system_prompt\": \"I like to solve maths problems.\", \"messages\": [{\"role\": \"user\", \"content\": \"What is 2+2?\"}]}" | bundle exec exe/promptcraft
+ echo "{\"system_prompt\": \"I like to solve maths problems.\", \"messages\": [{\"role\": \"user\", \"content\": \"What is 2+2?\"}]}" | promptcraft
  ```

  Or pipe one or more files into `promptcraft`:

  ```plain
- ( cat examples/maths/start/basic.yml ; cat examples/maths/start/already_answered.yml ) | bundle exec exe/promptcraft
+ ( cat examples/maths/start/basic.yml ; cat examples/maths/start/already_answered.yml ) | promptcraft
  ```

  As long as the input is a stream of YAML documents (separated by `---`), it will be processed.
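
A small sketch of what such a stream looks like from Ruby's side (standard library only; the documents echo the README's own examples):

```ruby
# Sketch: two conversation documents joined into one YAML stream, then parsed
# back with YAML.load_stream; the same "documents separated by ---" shape
# promptcraft reads from STDIN.
require "yaml"

docs = [
  {"system_prompt" => "I like to solve maths problems.",
   "messages" => [{"role" => "user", "content" => "What is 2+2?"}]},
  {"system_prompt" => "I like to solve maths problems.",
   "messages" => [{"role" => "user", "content" => "What is 6 divided by 2?"}]}
]

stream = docs.map { |doc| YAML.dump(doc) }.join
puts stream                        # each document starts with "---"
p YAML.load_stream(stream).length  # => 2
```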
@@ -421,7 +443,7 @@ messages:
  When we replay the conversation with the same system prompt (by omitting the `--prompt` option), it will add the missing assistant reply:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/basic.yml
  ```

@@ -440,12 +462,38 @@ messages:
  content: That's an easy one! The answer is... 4!
  ```

+ ### Set the temperature
+
+ The `--temperature` option controls the randomness of the assistant's responses. The default for each provider is typically `0.0`. Lower values will produce more deterministic responses, while higher values will produce more creative responses.
+
+ ```plain
+ promptcraft \
+   --conversation examples/maths/start/basic.yml \
+   --temperature 0.5
+ ```
+
+ The output YAML for each conversation will store the `temperature` value used in the `llm:` section:
+
+ ```yaml
+ ---
+ system_prompt: I like to solve maths problems.
+ llm:
+   provider: groq
+   model: llama3-70b-8192
+   temperature: 0.5
+ messages:
+ - role: user
+   content: What is 2+2?
+ - role: assistant
+   content: That's an easy one! The answer is... 4!
+ ```
+
  ### Limericks

  Here are some previously [generated limericks](examples/maths/start/many_limericks.yml). To regenerate them to start with letter "E" on each line:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/many_limericks.yml \
  --prompt "I am excellent at limericks. I always start each line with the letter E."
  ```
@@ -453,7 +501,7 @@ bundle exec exe/promptcraft \
  It might still include some preamble in each response. To try to encourage the LLM to remove it:

  ```plain
- bundle exec exe/promptcraft \
+ promptcraft \
  --conversation examples/maths/start/many_limericks.yml \
  --prompt "I am excellent at limericks. I always start each line with the letter E. This is very important. Only return the limerick without any other comments."
  ```
@@ -472,12 +520,22 @@ To set new version number:
  gem bump --version [patch|minor|major]
  ```

- To tag and release:
+ To tag and release the gem to <https://rubygems.org>:

  ```plain
  rake release
  ```

+ To update the Homebrew formula:
+
+ ```plain
+ rake release:build_package
+ rake release:upload_package
+ rake release:generate_homebrew_formula
+ ```
+
+ Now copy the `tmp/promptcraft.rb` formula into <https://github.com/drnic/homebrew-ai> and push.
+
  ## Contributing

  Bug reports and pull requests are welcome on GitHub at https://github.com/drnic/promptcraft. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/drnic/promptcraft/blob/develop/CODE_OF_CONDUCT.md).
data/Rakefile CHANGED
@@ -2,6 +2,7 @@

  require "bundler/gem_tasks"
  require "minitest/test_task"
+ Dir["#{File.dirname(__FILE__)}/lib/tasks/**/*.rake"].each { |file| load file }

  Minitest::TestTask.create

data/lib/promptcraft/cli/run_command.rb CHANGED
@@ -33,6 +33,13 @@ class Promptcraft::Cli::RunCommand
    desc "String or filename containing system prompt"
  end

+ option :temperature do
+   short "-t"
+   long "--temperature temperature"
+   desc "Temperature for chat completion"
+   convert :float
+ end
+
  flag :help do
    short "-h"
    long "--help"
@@ -72,14 +79,16 @@ class Promptcraft::Cli::RunCommand
    desc "Enable debug mode"
  end

- def run(stdin: nil)
+ # Arguments are for the benefit of test suite
+ def run(stdin: nil, threads: nil)
    if params[:help]
      warn help
    elsif params.errors.any?
      warn params.errors.summary
    else
      # Load files in threads
-     pool = Concurrent::FixedThreadPool.new(params[:threads])
+     threads ||= params[:threads]
+     pool = Concurrent::FixedThreadPool.new(threads)
      conversations = Concurrent::Array.new
      # TODO: load in thread pool
      (params[:conversation] || []).each do |filename|
@@ -123,7 +132,7 @@ class Promptcraft::Cli::RunCommand
      end

      # Process each conversation in a concurrent thread via a thread pool
-     pool = Concurrent::FixedThreadPool.new(params[:threads])
+     pool = Concurrent::FixedThreadPool.new(threads)
      mutex = Mutex.new

      updated_conversations = Concurrent::Array.new
@@ -138,6 +147,7 @@ class Promptcraft::Cli::RunCommand
          Promptcraft::Llm.new
        end
        llm.model = params[:model] if params[:model]
+       llm.temperature = params[:temperature] if params[:temperature]

        system_prompt = new_system_prompt || conversation.system_prompt
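
The `run` method above fans the per-conversation work out over a `Concurrent::FixedThreadPool`. As a standalone illustration of that pattern, using the concurrent-ruby gem the command already relies on, with a toy job standing in for an LLM call:

```ruby
# Sketch of the thread-pool pattern used in RunCommand#run: post one job per
# item, collect results in a thread-safe array, then wait for the pool to drain.
require "concurrent"

pool = Concurrent::FixedThreadPool.new(4)
results = Concurrent::Array.new

(1..10).each do |i|
  pool.post do
    results << i * i # stand-in for "replay one conversation against the LLM"
  end
end

pool.shutdown
pool.wait_for_termination
p results.sort
```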
 
data/lib/promptcraft/conversation.rb CHANGED
@@ -4,13 +4,14 @@ class Promptcraft::Conversation
  include Promptcraft::Helpers
  extend Promptcraft::Helpers

- attr_accessor :system_prompt, :messages
+ attr_accessor :system_prompt, :messages, :temperature
  attr_accessor :llm

- def initialize(system_prompt:, messages: [], llm: nil)
+ def initialize(system_prompt:, messages: [], llm: nil, temperature: 0)
    @system_prompt = system_prompt
    @messages = messages
    @llm = llm
+   @temperature = temperature
  end

  def add_message(role:, content:)
@@ -85,9 +86,9 @@ class Promptcraft::Conversation
  # content: 2 + 2 = 4
  def to_yaml
    YAML.dump(deep_stringify_keys({
-     system_prompt: @system_prompt&.strip,
-     llm: @llm&.to_h,
-     messages: @messages
+     system_prompt: system_prompt&.strip,
+     llm: llm&.to_h,
+     messages:
    }.compact))
  end
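
Putting the pieces of this class together, a minimal usage sketch; the constructor keywords and method names come from the hunks above, and loading the gem with `require "promptcraft"` is an assumption:

```ruby
# Sketch: build a conversation by hand and dump it with the to_yaml shown above.
require "promptcraft"

conversation = Promptcraft::Conversation.new(system_prompt: "I like to solve maths problems.")
conversation.add_message(role: "user", content: "What is 2+2?")
conversation.add_message(role: "assistant", content: "That's an easy one! The answer is... 4!")

puts conversation.to_yaml
```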
 
data/lib/promptcraft/llm.rb CHANGED
@@ -4,12 +4,13 @@ class Promptcraft::Llm
  DEFAULT_PROVIDER = "groq"

  attr_reader :langchain
- attr_accessor :provider, :model
+ attr_accessor :provider, :model, :temperature

  delegate_missing_to :langchain

- def initialize(provider: DEFAULT_PROVIDER, model: nil, api_key: nil)
+ def initialize(provider: DEFAULT_PROVIDER, model: nil, temperature: nil, api_key: nil)
    @provider = provider
+   @temperature = temperature
    @langchain = case provider
    when "groq"
      @model = model || "llama3-70b-8192"
@@ -17,14 +18,20 @@ class Promptcraft::Llm
      Langchain::LLM::OpenAI.new(
        api_key: api_key || ENV.fetch("GROQ_API_KEY"),
        llm_options: {uri_base: "https://api.groq.com/openai/"},
-       default_options: {chat_completion_model_name: @model}
+       default_options: {
+         temperature: temperature,
+         chat_completion_model_name: @model
+       }.compact
      )
    when "openai"
      @model = model || "gpt-3.5-turbo"
      require "openai"
      Langchain::LLM::OpenAI.new(
        api_key: api_key || ENV.fetch("OPENAI_API_KEY"),
-       default_options: {chat_completion_model_name: @model}
+       default_options: {
+         temperature: temperature,
+         chat_completion_model_name: @model
+       }.compact
      )
    when "openrouter"
      @model = model || "meta-llama/llama-3-8b-instruct:free"
@@ -32,7 +39,10 @@ class Promptcraft::Llm
      Langchain::LLM::OpenAI.new(
        api_key: api_key || ENV.fetch("OPENROUTER_API_KEY"),
        llm_options: {uri_base: "https://openrouter.ai/api/"},
-       default_options: {chat_completion_model_name: @model}
+       default_options: {
+         temperature: temperature,
+         chat_completion_model_name: @model
+       }.compact
      )
    when "ollama"
      @model = model || "llama3"
@@ -46,10 +56,10 @@ class Promptcraft::Llm
  end

  def to_h
-   {provider: provider, model: model}
+   {provider:, model:, temperature:}.compact
  end

  def self.from_h(hash)
-   new(provider: hash[:provider], model: hash[:model])
+   new(provider: hash[:provider], model: hash[:model], temperature: hash[:temperature])
  end
end
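
The `to_h` body above uses the value-omission hash shorthand (available since Ruby 3.1, and part of the newer syntax the README's Ruby 3.3 requirement alludes to). A tiny standalone illustration:

```ruby
# {provider:, model:, temperature:} expands to {provider: provider, ...};
# .compact then drops nil values, so an unset temperature never appears
# in the serialized llm: section.
provider = "groq"
model = "llama3-70b-8192"
temperature = nil

p({provider:, model:, temperature:}.compact)
# => {provider: "groq", model: "llama3-70b-8192"} (inspect format varies by Ruby version)
```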
data/lib/promptcraft/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module Promptcraft
-   VERSION = "0.1.0"
+   VERSION = "0.1.1"
  end
data/lib/tasks/release.rake ADDED
@@ -0,0 +1,69 @@
+ # lib/tasks/homebrew_formula.rake
+
+ require "rubygems"
+ require "rake"
+
+ namespace :release do
+   task :build_package do
+     system "bundle config set cache_all true"
+     system "bundle package"
+     name = Gem::Specification.load(Dir.glob("*.gemspec").first).name
+     version = Gem::Specification.load(Dir.glob("*.gemspec").first).version
+     license = Gem::Specification.load(Dir.glob("*.gemspec").first).license
+     system "rm -f #{name}*.tar* #{name}*.sha256"
+     system "fpm -s dir -t tar --name #{name} --version #{version} --license #{license} --exclude .git --exclude test --exclude spec ."
+     system "mv #{name}.tar #{name}-#{version}.tar"
+     system "xz -z #{name}-#{version}.tar"
+     sha = `shasum -a 256 #{name}-#{version}.tar.xz`.split(" ").first
+     File.write("#{name}-#{version}.sha256", sha)
+   end
+
+   task :upload_package do
+     name = Gem::Specification.load(Dir.glob("*.gemspec").first).name
+     version = Gem::Specification.load(Dir.glob("*.gemspec").first).version
+     file = "#{name}-#{version}.tar.xz"
+     file_sha256 = "#{name}-#{version}.sha256"
+     system "gh release create v#{version} #{file} #{file_sha256} --title 'v#{version}' --notes ''"
+   end
+
+   desc "Generate Homebrew formula"
+   task :generate_homebrew_formula do
+     spec = Gem::Specification.load(Dir.glob("*.gemspec").first)
+     name = spec.name
+     version = spec.version
+     sha256sum = File.read("#{name}-#{version}.sha256").strip
+     url = `gh release view v#{version} --json assets | jq -r '.assets[] | select(.name == "#{name}-#{version}.tar.xz") | .url'`.strip
+
+     formula_name = spec.name.capitalize
+     class_name = formula_name.gsub(/_\w/) { |match| match[1].upcase }.to_s
+
+     formula = <<~RUBY
+       class #{class_name} < Formula
+         desc "#{spec.summary}"
+         homepage "#{spec.homepage}"
+         version "#{spec.version}"
+         url "#{url}"
+         sha256 "#{sha256sum}"
+
+         depends_on "ruby@3.3"
+
+         def install
+           # Extract all files to libexec, which is a common Homebrew practice for third-party tools
+           libexec.install Dir["*"]
+           # Create a symbolic link for the executable in Homebrew's bin directory
+           bin.install_symlink "\#{libexec}/exe/promptcraft" => "promptcraft"
+         end
+
+         test do
+           # Simple test to check the version or a help command
+           system "\#{bin}/promptcraft", "--version"
+         end
+       end
+     RUBY
+
+     Dir.mkdir("tmp") unless Dir.exist?("tmp")
+     path = "tmp/#{spec.name.downcase}.rb"
+     File.write(path, formula)
+     puts "Generated Homebrew formula for #{spec.name} version #{spec.version} at #{path}"
+   end
+ end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: promptcraft
  version: !ruby/object:Gem::Version
-   version: 0.1.0
+   version: 0.1.1
  platform: ruby
  authors:
  - Dr Nic Williams
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2024-05-13 00:00:00.000000000 Z
+ date: 2024-05-14 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: activesupport
@@ -44,14 +44,14 @@ dependencies:
      requirements:
      - - ">="
        - !ruby/object:Gem::Version
-         version: '0'
+         version: 0.12.1
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - ">="
        - !ruby/object:Gem::Version
-         version: '0'
+         version: 0.12.1
  - !ruby/object:Gem::Dependency
    name: faraday
    requirement: !ruby/object:Gem::Requirement
@@ -185,6 +185,7 @@ files:
  - lib/promptcraft/helpers.rb
  - lib/promptcraft/llm.rb
  - lib/promptcraft/version.rb
+ - lib/tasks/release.rake
  - sig/promptcraft.rbs
  homepage: https://github.com/drnic/promptcraft
  licenses: