promptcraft 0.1.0 → 0.1.2
- checksums.yaml +4 -4
- data/README.md +92 -23
- data/Rakefile +1 -0
- data/exe/promptcraft +3 -0
- data/lib/promptcraft/cli/run_command.rb +13 -3
- data/lib/promptcraft/conversation.rb +6 -5
- data/lib/promptcraft/llm.rb +17 -7
- data/lib/promptcraft/version.rb +1 -1
- data/lib/tasks/release.rake +72 -0
- metadata +5 -4
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 03ceb8ec85a855e2bc5fd2e23983c7a975dca780e4f449a70742c7451fc0d7f3
+  data.tar.gz: 937d12936d5c7349f4d3f1596c579d7b2c5065760f937e7627ac4b59db5d75fd
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 36d690630e1383037a2273a3e9cf7c18914e313c36b25d3f26f00b42af63e732980011b3f19e2045a0b2b710e4023b445db3540ab78992eb74130de14d50a6bc
+  data.tar.gz: 9f81ce9b1e40fb93e24b85fe16ff9fabfc6d1cf93ee961e552365f6d24aaa34ab346b8a7a32ab4724feb80ac716c93ef569df1af3a4a17e3b7c688e5e0ab463c
data/README.md
CHANGED
@@ -19,7 +19,7 @@ messages:
 Let's replay this single conversation with a new system prompt:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --prompt "I'm terrible at maths. If I'm asked a maths question, I reply with a question." \
 --conversation examples/maths/start/already_answered.yml
 ```
@@ -69,7 +69,7 @@ messages:
 The CLI will replay each conversation with the new system prompt.
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/already_answered_multiple.yml \
 --prompt "I like cats. Answer any questions using cats."
 ```
@@ -118,7 +118,7 @@ When you're getting started, you don't even need to know the conversation file format:
 
 ```plain
 echo "---\nWhat is 2+2?\n---\nWhat is 6 divided by 2?" | \
-bundle exec exe/promptcraft --prompt "I solve maths using pizza metaphors."
+promptcraft --prompt "I solve maths using pizza metaphors."
 ```
 
 The output will be our conversation YAML format, with the system prompt, the incoming user messages as separate conversations, and the assistant replies within each conversation:
@@ -164,7 +164,7 @@ You'll notice, the LLM used (which defaults to Groq's `llama3-70b-8192` because
 Of course, you could pass each plain text user message using the `--conversation` argument too:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation "What is 2+2?" \
 --conversation "What is 6 divided by 2?" \
 --prompt "I solve maths using pizza metaphors."
@@ -173,13 +173,13 @@ bundle exec exe/promptcraft \
 Why does it output YAML? (or JSON if you pass the `--json` flag) So that you can save it to a file, and then replay (or rechat) this new set of conversations in a minute with a new system prompt.
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation "What is 2+2?" \
 --conversation "What is 6 divided by 2?" \
 --prompt "I am a happy person." \
 > tmp/maths-as-happy-person.yml
 
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation tmp/maths-as-happy-person.yml \
 --prompt "I solve maths using pizza metaphors." \
 > tmp/maths-with-pizza.yml
@@ -190,7 +190,7 @@ echo "I am an excellent maths tutor.
 When I'm asked a maths question, I will first
 ask a question in return to help the student." > tmp/prompt-maths-tutor.txt
 
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation tmp/maths-with-pizza.yml \
 --prompt tmp/prompt-maths-tutor.txt
 ```
@@ -210,7 +210,7 @@ messages:
 With each one separated by a new line. Say nothing else except producing YAML.
 " > tmp/prompt-list-20-hellos.txt
 
-bundle exec exe/promptcraft \
+promptcraft \
 -c "Generate a list of 20 things a customer might say when they first ring into a hair salon phone service" \
 -p tmp/prompt-list-20-hellos.txt \
 --format json > tmp/hair-salon-20-hellos.json
@@ -222,7 +222,7 @@ cat tmp/hair-salon-20-hellos.json | jq -r ".messages[1].content" \
 The file `tmp/hair-salon-20-0000.txt` now contains 20 user messages that you can use to initiate a conversation with your AI assistant system prompt.
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 -p "I'm a hair salon phone service. I sell haircuts" \
 -c tmp/hair-salon-20-0000.txt \
 > tmp/hair-salon-20-replies-0001.yml
@@ -263,7 +263,27 @@ Tools you might want to use in conjunction with `promptcraft`:
 
 ## Installation
 
-Right now, you
+Right now, you can install it in one of two ways:
+
+* Homebrew
+* Running the CLI from the source code
+
+Whilst this is a RubyGem, it currently requires some Git branches that are not yet released. The Homebrew recipe takes a big tarball of all source code and dependencies and installs it. The tarball is also available via [GitHub Releases](https://github.com/drnic/promptcraft/releases).
+
+Once the Git branches are released, the `promptcraft` gem will also be installable via RubyGems.
+
+It requires Ruby 3.3 or later, mostly because I like the new syntax features.
+
+### Homebrew
+
+The project is currently distributed by the Homebrew tap [`drnic/ai`](https://github.com/drnic/homebrew-ai).
+
+```plain
+brew tap drnic/ai
+brew install promptcraft
+```
+
+### Run from Source
 
 ```plain
 git clone https://github.com/drnic/promptcraft
@@ -275,7 +295,9 @@ bundle exec exe/promptcraft \
 --provider groq
 ```
 
-
+### Configuration
+
+The `promptcraft` CLI defaults to `--provider groq --model llama3-70b-8192` and assumes you have `$GROQ_API_KEY` set in your environment.
 
 You can also use [OpenAI](https://openai.com/) with `--provider openai`, which defaults to `--model gpt-3.5-turbo`. It assumes you have `$OPENAI_API_KEY` set in your environment.
 
@@ -286,7 +308,7 @@ You can also use [Ollama](https://ollama.com/) locally with `--provider ollama`,
 If the conversation file has an `llm` key with `provider` and `model` keys, then those will be used instead of the defaults.
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/already_answered_gpt4.yml \
 --prompt "I always reply with a question"
 
@@ -311,7 +333,7 @@ The following example commands assume you have `$GROQ_API_KEY` and will use Groq
 Run `promptcraft` with no arguments to get a default prompt and an initial assistant message.
 
 ```plain
-bundle exec exe/promptcraft
+promptcraft
 ```
 
 The output might be:
@@ -327,13 +349,13 @@ messages:
 Provide a different provider, such as `openai` (which assumes you have `$OPENAI_API_KEY` set in your environment).
 
 ```plain
-bundle exec exe/promptcraft --provider openai
+promptcraft --provider openai
 ```
 
 Or you could provide your own system prompt, and it will generate an initial assistant message.
 
 ```plain
-bundle exec exe/promptcraft --prompt "I like to solve maths problems."
+promptcraft --prompt "I like to solve maths problems."
 ```
 
 The output might be:
@@ -367,26 +389,26 @@ The primary point of `promptcraft` is to replay conversations with a new system
 An example of the `--conversation` option:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/basic.yml
 ```
 
 You can also pipe a stream of conversation YAML into `promptcraft` via STDIN:
 
 ```plain
-echo "---\nsystem_prompt: I like to solve maths problems.\nmessages:\n- role: \"user\"\n  content: \"What is 2+2?\"" | bundle exec exe/promptcraft
+echo "---\nsystem_prompt: I like to solve maths problems.\nmessages:\n- role: \"user\"\n  content: \"What is 2+2?\"" | promptcraft
 ```
 
 JSON is valid YAML, so you can also use JSON:
 
 ```plain
-echo "{\"system_prompt\": \"I like to solve maths problems.\", \"messages\": [{\"role\": \"user\", \"content\": \"What is 2+2?\"}]}" | bundle exec exe/promptcraft
+echo "{\"system_prompt\": \"I like to solve maths problems.\", \"messages\": [{\"role\": \"user\", \"content\": \"What is 2+2?\"}]}" | promptcraft
 ```
 
 Or pipe one or more files into `promptcraft`:
 
 ```plain
-( cat examples/maths/start/basic.yml ; cat examples/maths/start/already_answered.yml ) | bundle exec exe/promptcraft
+( cat examples/maths/start/basic.yml ; cat examples/maths/start/already_answered.yml ) | promptcraft
 ```
 
 As long as the input is a stream of YAML documents (separated by `---`), it will be processed.
@@ -421,7 +443,7 @@ messages:
 When we replay the conversation with the same system prompt (by omitting the `--prompt` option), it will add the missing assistant reply:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/basic.yml
 ```
 
@@ -440,12 +462,38 @@ messages:
   content: That's an easy one! The answer is... 4!
 ```
 
+### Set the temperature
+
+The `--temperature` option controls the randomness of the assistant's responses. The default for each provider is typically `0.0`. Lower values will produce more deterministic responses, while higher values will produce more creative responses.
+
+```plain
+promptcraft \
+--conversation examples/maths/start/basic.yml \
+--temperature 0.5
+```
+
+The output YAML for each conversation will store the `temperature` value used in the `llm:` section:
+
+```yaml
+---
+system_prompt: I like to solve maths problems.
+llm:
+  provider: groq
+  model: llama3-70b-8192
+  temperature: 0.5
+messages:
+- role: user
+  content: What is 2+2?
+- role: assistant
+  content: That's an easy one! The answer is... 4!
+```
+
 ### Limericks
 
 Here are some previously [generated limericks](examples/maths/start/many_limericks.yml). To regenerate them to start with the letter "E" on each line:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/many_limericks.yml \
 --prompt "I am excellent at limericks. I always start each line with the letter E."
 ```
@@ -453,7 +501,7 @@ bundle exec exe/promptcraft \
 It might still include some preamble in each response. To try to encourage the LLM to remove it:
 
 ```plain
-bundle exec exe/promptcraft \
+promptcraft \
 --conversation examples/maths/start/many_limericks.yml \
 --prompt "I am excellent at limericks. I always start each line with the letter E. This is very important. Only return the limerick without any other comments."
 ```
@@ -469,15 +517,36 @@ To install this gem onto your local machine, run `bundle exec rake install`. To
 To set a new version number:
 
 ```plain
+gem install gem-release
 gem bump --version [patch|minor|major]
+bundle install
+git add Gemfile.lock; git commit --amend --no-edit
+git push
 ```
 
-To tag and release
+To tag and release the gem to <https://rubygems.org>:
 
 ```plain
 rake release
 ```
 
+To update the Homebrew formula:
+
+```plain
+rake release:build_package
+rake release:upload_package
+rake release:generate_homebrew_formula
+```
+
+Now copy the generated `tmp/promptcraft.rb` formula into <https://github.com/drnic/homebrew-ai> and push:
+
+```plain
+git clone https://github.com/drnic/homebrew-ai tmp/homebrew-ai
+cp tmp/promptcraft.rb tmp/homebrew-ai
+( cd tmp/homebrew-ai; git add .; git commit -m "Bump promptcraft"; git push )
+rm -rf tmp/homebrew-ai
+```
+
 ## Contributing
 
 Bug reports and pull requests are welcome on GitHub at https://github.com/drnic/promptcraft. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/drnic/promptcraft/blob/develop/CODE_OF_CONDUCT.md).
data/Rakefile
CHANGED
data/exe/promptcraft
CHANGED
data/lib/promptcraft/cli/run_command.rb
CHANGED
@@ -33,6 +33,13 @@ class Promptcraft::Cli::RunCommand
     desc "String or filename containing system prompt"
   end
 
+  option :temperature do
+    short "-t"
+    long "--temperature temperature"
+    desc "Temperature for chat completion"
+    convert :float
+  end
+
   flag :help do
     short "-h"
     long "--help"
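The new `--temperature` option above follows the same option DSL as the rest of `run_command.rb`; the DSL matches the tty-option gem's. A minimal sketch of what `convert :float` buys, assuming that library and a hypothetical `Demo` command class:

```ruby
require "tty-option"

# Hypothetical command class, only to exercise the option declaration shown above.
class Demo
  include TTY::Option

  option :temperature do
    short "-t"
    long "--temperature temperature"
    desc "Temperature for chat completion"
    convert :float
  end
end

cmd = Demo.new
cmd.parse(%w[--temperature 0.5])
cmd.params[:temperature] # => 0.5 — a Float, not the raw "0.5" string
```

So by the time `run` reads `params[:temperature]`, it is already a number it can hand to the LLM client.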
@@ -72,14 +79,16 @@ class Promptcraft::Cli::RunCommand
     desc "Enable debug mode"
   end
 
-  def run
+  # Arguments are for the benefit of test suite
+  def run(stdin: nil, threads: nil)
     if params[:help]
       warn help
     elsif params.errors.any?
       warn params.errors.summary
     else
       # Load files in threads
-      pool = Concurrent::FixedThreadPool.new(params[:threads])
+      threads ||= params[:threads]
+      pool = Concurrent::FixedThreadPool.new(threads)
       conversations = Concurrent::Array.new
       # TODO: load in thread pool
       (params[:conversation] || []).each do |filename|
|
|
123
132
|
end
|
124
133
|
|
125
134
|
# Process each conversation in a concurrent thread via a thread pool
|
126
|
-
pool = Concurrent::FixedThreadPool.new(
|
135
|
+
pool = Concurrent::FixedThreadPool.new(threads)
|
127
136
|
mutex = Mutex.new
|
128
137
|
|
129
138
|
updated_conversations = Concurrent::Array.new
|
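Both hunks above route work through one `Concurrent::FixedThreadPool` sized by the new `threads` argument. For readers unfamiliar with concurrent-ruby, here is a self-contained sketch of the pool-plus-thread-safe-array pattern the diff relies on (placeholder data, not promptcraft's actual processing):

```ruby
require "concurrent"

threads = 4
pool = Concurrent::FixedThreadPool.new(threads)
updated_conversations = Concurrent::Array.new # safe to append from many threads

# Placeholders standing in for loaded conversations.
conversations = ["conversation 1", "conversation 2", "conversation 3"]

conversations.each do |conversation|
  pool.post do
    # Stand-in for the per-conversation LLM chat completion.
    updated_conversations << "#{conversation} (replayed)"
  end
end

pool.shutdown              # stop accepting new work
pool.wait_for_termination  # block until queued work finishes
puts updated_conversations.size # => 3
```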
@@ -138,6 +147,7 @@ class Promptcraft::Cli::RunCommand
         Promptcraft::Llm.new
       end
       llm.model = params[:model] if params[:model]
+      llm.temperature = params[:temperature] if params[:temperature]
 
       system_prompt = new_system_prompt || conversation.system_prompt
 
data/lib/promptcraft/conversation.rb
CHANGED
@@ -4,13 +4,14 @@ class Promptcraft::Conversation
   include Promptcraft::Helpers
   extend Promptcraft::Helpers
 
-  attr_accessor :system_prompt, :messages
+  attr_accessor :system_prompt, :messages, :temperature
   attr_accessor :llm
 
-  def initialize(system_prompt:, messages: [], llm: nil)
+  def initialize(system_prompt:, messages: [], llm: nil, temperature: 0)
     @system_prompt = system_prompt
     @messages = messages
     @llm = llm
+    @temperature = temperature
   end
 
   def add_message(role:, content:)
@@ -85,9 +86,9 @@ class Promptcraft::Conversation
   # content: 2 + 2 = 4
   def to_yaml
     YAML.dump(deep_stringify_keys({
-      system_prompt:
-      llm:
-      messages:
+      system_prompt: system_prompt&.strip,
+      llm: llm&.to_h,
+      messages:
     }.compact))
   end
 
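The `&.strip` and `&.to_h` guards plus the outer `.compact` mean a conversation without an `llm` simply omits that key from the YAML. An illustrative round trip, using the constructor from the earlier hunk (the expected output is inferred from the README examples):

```ruby
conversation = Promptcraft::Conversation.new(
  system_prompt: "I like to solve maths problems.\n", # trailing newline gets stripped
  messages: [{role: "user", content: "What is 2+2?"}]
)

puts conversation.to_yaml
# llm is nil, so .compact drops the llm key entirely:
# ---
# system_prompt: I like to solve maths problems.
# messages:
# - role: user
#   content: What is 2+2?
```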
data/lib/promptcraft/llm.rb
CHANGED
@@ -4,12 +4,13 @@ class Promptcraft::Llm
   DEFAULT_PROVIDER = "groq"
 
   attr_reader :langchain
-  attr_accessor :provider, :model
+  attr_accessor :provider, :model, :temperature
 
   delegate_missing_to :langchain
 
-  def initialize(provider: DEFAULT_PROVIDER, model: nil, api_key: nil)
+  def initialize(provider: DEFAULT_PROVIDER, model: nil, temperature: nil, api_key: nil)
     @provider = provider
+    @temperature = temperature
     @langchain = case provider
     when "groq"
       @model = model || "llama3-70b-8192"
@@ -17,14 +18,20 @@ class Promptcraft::Llm
       Langchain::LLM::OpenAI.new(
         api_key: api_key || ENV.fetch("GROQ_API_KEY"),
         llm_options: {uri_base: "https://api.groq.com/openai/"},
-        default_options: {
+        default_options: {
+          temperature: temperature,
+          chat_completion_model_name: @model
+        }.compact
       )
     when "openai"
       @model = model || "gpt-3.5-turbo"
       require "openai"
       Langchain::LLM::OpenAI.new(
         api_key: api_key || ENV.fetch("OPENAI_API_KEY"),
-        default_options: {
+        default_options: {
+          temperature: temperature,
+          chat_completion_model_name: @model
+        }.compact
       )
     when "openrouter"
       @model = model || "meta-llama/llama-3-8b-instruct:free"
@@ -32,7 +39,10 @@ class Promptcraft::Llm
       Langchain::LLM::OpenAI.new(
         api_key: api_key || ENV.fetch("OPENROUTER_API_KEY"),
         llm_options: {uri_base: "https://openrouter.ai/api/"},
-        default_options: {
+        default_options: {
+          temperature: temperature,
+          chat_completion_model_name: @model
+        }.compact
       )
     when "ollama"
       @model = model || "llama3"
@@ -46,10 +56,10 @@ class Promptcraft::Llm
   end
 
   def to_h
-    {provider
+    {provider:, model:, temperature:}.compact
   end
 
   def self.from_h(hash)
-    new(provider: hash[:provider], model: hash[:model])
+    new(provider: hash[:provider], model: hash[:model], temperature: hash[:temperature])
   end
 end
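Together, `to_h` and `from_h` now round-trip the temperature, which is how a `temperature:` value stored under a conversation file's `llm:` key survives a replay. A sketch (it assumes `$GROQ_API_KEY` is set, since the constructor eagerly builds the Langchain client):

```ruby
llm = Promptcraft::Llm.new(provider: "groq", temperature: 0.5)
llm.to_h
# => {provider: "groq", model: "llama3-70b-8192", temperature: 0.5}

restored = Promptcraft::Llm.from_h(llm.to_h)
restored.temperature # => 0.5

# Without a temperature, .compact leaves the key out:
Promptcraft::Llm.new(provider: "groq").to_h
# => {provider: "groq", model: "llama3-70b-8192"}
```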
data/lib/promptcraft/version.rb
CHANGED

data/lib/tasks/release.rake
ADDED
@@ -0,0 +1,72 @@
+# lib/tasks/homebrew_formula.rake
+
+require "rubygems"
+require "rake"
+
+namespace :release do
+  task :build_package do
+    system "bundle config set cache_all true"
+    system "bundle package"
+    name = Gem::Specification.load(Dir.glob("*.gemspec").first).name
+    version = Gem::Specification.load(Dir.glob("*.gemspec").first).version
+    license = Gem::Specification.load(Dir.glob("*.gemspec").first).license
+    system "rm -f #{name}*.tar* #{name}*.sha256"
+    system "fpm -s dir -t tar --name #{name} --version #{version} --license #{license} --exclude .git --exclude test --exclude spec ."
+    system "mv #{name}.tar #{name}-#{version}.tar"
+    system "xz -z #{name}-#{version}.tar"
+    sha = `shasum -a 256 #{name}-#{version}.tar.xz`.split(" ").first
+    File.write("#{name}-#{version}.sha256", sha)
+  end
+
+  task :upload_package do
+    name = Gem::Specification.load(Dir.glob("*.gemspec").first).name
+    version = Gem::Specification.load(Dir.glob("*.gemspec").first).version
+    file = "#{name}-#{version}.tar.xz"
+    file_sha256 = "#{name}-#{version}.sha256"
+    system "gh release create v#{version} #{file} #{file_sha256} --title 'v#{version}' --notes ''"
+  end
+
+  desc "Generate Homebrew formula"
+  task :generate_homebrew_formula do
+    spec = Gem::Specification.load(Dir.glob("*.gemspec").first)
+    name = spec.name
+    version = spec.version
+    sha256sum = File.read("#{name}-#{version}.sha256").strip
+    url = `gh release view v#{version} --json assets | jq -r '.assets[] | select(.name == "#{name}-#{version}.tar.xz") | .url'`.strip
+
+    formula_name = spec.name.capitalize
+    class_name = formula_name.gsub(/_\w/) { |match| match[1].upcase }.to_s
+
+    formula = <<~RUBY
+      class #{class_name} < Formula
+        desc "#{spec.summary}"
+        homepage "#{spec.homepage}"
+        version "#{spec.version}"
+        url "#{url}"
+        sha256 "#{sha256sum}"
+
+        depends_on "ruby"
+
+        def install
+          ENV["GEM_HOME"] = libexec
+
+          # Extract all files to libexec, which is a common Homebrew practice for third-party tools
+          libexec.install Dir["*"]
+
+          bin.install libexec/"exe/promptcraft"
+          bin.env_script_all_files(libexec/"bin", GEM_HOME: ENV.fetch("GEM_HOME"))
+        end
+
+        test do
+          # Simple test to check the version or a help command
+          system "\#{bin}/promptcraft", "--version"
+        end
+      end
+    RUBY
+
+    Dir.mkdir("tmp") unless Dir.exist?("tmp")
+    path = "tmp/#{spec.name.downcase}.rb"
+    File.write(path, formula)
+    puts "Generated Homebrew formula for #{spec.name} version #{spec.version} at #{path}"
+  end
+end
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: promptcraft
 version: !ruby/object:Gem::Version
-  version: 0.1.0
+  version: 0.1.2
 platform: ruby
 authors:
 - Dr Nic Williams
 autorequire:
 bindir: exe
 cert_chain: []
-date: 2024-05-
+date: 2024-05-14 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: activesupport
@@ -44,14 +44,14 @@ dependencies:
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
-        version:
+        version: 0.12.1
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
      - !ruby/object:Gem::Version
-        version:
+        version: 0.12.1
 - !ruby/object:Gem::Dependency
   name: faraday
   requirement: !ruby/object:Gem::Requirement
@@ -185,6 +185,7 @@ files:
 - lib/promptcraft/helpers.rb
 - lib/promptcraft/llm.rb
 - lib/promptcraft/version.rb
+- lib/tasks/release.rake
 - sig/promptcraft.rbs
 homepage: https://github.com/drnic/promptcraft
 licenses: