smart_prompt 0.1.3 → 0.1.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.cn.md +113 -0
- data/README.md +90 -11
- data/lib/smart_prompt/llm_adapter.rb +5 -1
- data/lib/smart_prompt/version.rb +1 -1
- metadata +2 -1
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 846477d8ddb503119a9233f9ce85d2094c1755be52adcfc0515b0a7209524a3d
+  data.tar.gz: 4f165cedebc54e9d8ae292e3fd03329ec04150a8e0804fcb90234b7837135e6d
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 7f3521861b4690b233c9b82d6f08123b9859c0fed28e40274fb4eaa014aa93b2df63121aff77451c9b1f5f93f9eeb683c5b044c94cef5e7d11a8e2151c5e97f8
+  data.tar.gz: 9c78f290101fd158403d2feb9e84312d8ffb90a22a32a57e91d96f5c287a38cfd8cb8c318817e860d7fb09bd5584310e6d397fce490deaa29ca52bafb3db7fba
data/README.cn.md ADDED (Chinese-language README; content translated below)
@@ -0,0 +1,113 @@
+# SmartPrompt
+
+SmartPrompt is a powerful Ruby gem that provides a domain-specific language (DSL), enabling other Ruby programs to conveniently and naturally call upon the capabilities of various large language models (LLMs).
+
+## Key Features
+
+- Flexible task composition: Combine various tasks using specific service providers + specific LLMs + specific prompts
+- Nested subtasks: Support for composing and calling other subtasks in DSL form
+- Performance optimization: Provide performance-optimal or cost-effective solutions while ensuring quality
+
+## Installation
+
+To install the gem and add it to your application's Gemfile, execute the following command:
+
+```
+$ bundle add smart_prompt
+```
+
+If you don't use Bundler to manage dependencies, you can install the gem by executing the following command:
+
+```
+$ gem install smart_prompt
+```
+
+## Usage
+
+The following are some examples of basic usage:
+
+### Configuration file
+
+```
+adapters:
+  openai: OpenAIAdapter
+  ollama: OllamaAdapter
+llms:
+  siliconflow:
+    adapter: openai
+    url: https://api.siliconflow.cn/v1/
+    api_key: ENV["APIKey"]
+    default_model: Qwen/Qwen2.5-7B-Instruct
+  llamacpp:
+    adapter: openai
+    url: http://localhost:8080/
+  ollama:
+    adapter: ollama
+    url: http://localhost:11434/
+    default_model: qwen2.5
+default_llm: siliconflow
+worker_path: "./workers"
+template_path: "./templates"
+```
+
+### Basic usage
+
+```
+require 'smart_prompt'
+engine = SmartPrompt::Engine.new('./config/llm_config.yml')
+result = engine.call_worker(:daily_report, {location: "Shanghai"})
+puts result
+```
+
+### workers/daily_report.rb
+
+```
+SmartPrompt.define_worker :daily_report do
+  use "ollama"
+  model "gemma2"
+  system "You are a helpful report writer."
+  weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+  prompt :daily_report, { weather: weather, location: params[:location] }
+  send_msg
+end
+```
+
+### workers/weather_summary.rb
+
+```
+SmartPrompt.define_worker :weather_summary do
+  use "ollama"
+  model "gemma2"
+  sys_msg "You are a helpful weather assistant."
+  prompt :weather, { location: params[:location], date: params[:date] }
+  weather_info = send_msg
+  prompt :summarize, { text: weather_info }
+  send_msg
+end
+```
+
+### templates/daily_report.erb
+
+```
+Please create a brief daily report for <%= location %> based on the following weather information:
+
+<%= weather %>
+
+The report should include:
+1. A summary of the weather
+2. Any notable events or conditions
+3. Recommendations for residents
+```
+### templates/weather.erb
+
+```
+What's the weather like in <%= location %> <%= date %>? Please provide a brief description including temperature and general conditions.
+```
+
+### templates/summarize.erb
+
+```
+Please summarize the following text in one sentence:
+
+<%= text %>
+```
data/README.md CHANGED
@@ -1,36 +1,115 @@
+EN | [中文](./README.cn.md)
+
 # SmartPrompt
 
-SmartPrompt
+SmartPrompt is a powerful Ruby gem that provides a domain-specific language (DSL), enabling other Ruby programs to conveniently and naturally call upon the capabilities of various large language models (LLMs).
 
-##
+## Key Features
 
--
--
--
+- Flexible task composition: Combine various tasks using specific service providers + specific LLMs + specific prompts
+- Nested subtasks: Support for composing and calling other subtasks in DSL form
+- Performance optimization: Provide performance-optimal or cost-effective solutions while ensuring quality
 
-##
+## Installation
 
-
+To install the gem and add it to your application's Gemfile, execute the following command:
 
 ```
 $ bundle add smart_prompt
 ```
 
-
+If you don't use Bundler to manage dependencies, you can install the gem by executing the following command:
 
 ```
 $ gem install smart_prompt
 ```
 
-##
+## Usage
 
-
+The following are some examples of basic usage:
 
-###
+### llm_config.yml
+
+```
+adapters:
+  openai: OpenAIAdapter
+  ollama: OllamaAdapter
+llms:
+  siliconflow:
+    adapter: openai
+    url: https://api.siliconflow.cn/v1/
+    api_key: ENV["APIKey"]
+    default_model: Qwen/Qwen2.5-7B-Instruct
+  llamacpp:
+    adapter: openai
+    url: http://localhost:8080/
+  ollama:
+    adapter: ollama
+    url: http://localhost:11434/
+    default_model: qwen2.5
+default_llm: siliconflow
+worker_path: "./workers"
+template_path: "./templates"
+```
+
+### Basic usage
 
 ```
 require 'smart_prompt'
 engine = SmartPrompt::Engine.new('./config/llm_config.yml')
 result = engine.call_worker(:daily_report, {location: "Shanghai"})
 puts result
+```
+
+### workers/daily_report.rb
+
+```
+SmartPrompt.define_worker :daily_report do
+  use "ollama"
+  model "gemma2"
+  system "You are a helpful report writer."
+  weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+  prompt :daily_report, { weather: weather, location: params[:location] }
+  send_msg
+end
+```
+
+### workers/weather_summary.rb
+
+```
+SmartPrompt.define_worker :weather_summary do
+  use "ollama"
+  model "gemma2"
+  sys_msg "You are a helpful weather assistant."
+  prompt :weather, { location: params[:location], date: params[:date] }
+  weather_info = send_msg
+  prompt :summarize, { text: weather_info }
+  send_msg
+end
+```
+
+### templates/daily_report.erb
+
+```
+Please create a brief daily report for <%= location %> based on the following weather information:
+
+<%= weather %>
+
+The report should include:
+1. A summary of the weather
+2. Any notable events or conditions
+3. Recommendations for residents
+```
+### templates/weather.erb
+
+```
+What's the weather like in <%= location %> <%= date %>? Please provide a brief description including temperature and general conditions.
+```
+
+### templates/summarize.erb
+
+```
+Please summarize the following text in one sentence:
+
+<%= text %>
 ```
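The templates above are plain ERB, which ships with Ruby's standard library. As a minimal illustration of how such a template renders (this is generic ERB usage, not SmartPrompt's internal rendering code, which this diff does not show), a template in the style of `templates/summarize.erb` can be exercised like this:

```ruby
require "erb"

# A template in the style of templates/summarize.erb above.
template = <<~ERB
  Please summarize the following text in one sentence:

  <%= text %>
ERB

# The local variable `text` is exposed to the template through `binding`.
text = "Sunny in Shanghai, 25 degrees, light breeze."
result = ERB.new(template).result(binding)
puts result
```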
data/lib/smart_prompt/llm_adapter.rb CHANGED
@@ -18,8 +18,12 @@ module SmartPrompt
   class OpenAIAdapter < LLMAdapter
     def initialize(config)
      super
+      api_key = @config['api_key']
+      if api_key.is_a?(String) && api_key.start_with?('ENV[') && api_key.end_with?(']')
+        api_key = eval(api_key)
+      end
       @client = OpenAI::Client.new(
-        access_token:
+        access_token: api_key,
         uri_base: @config['url'],
         request_timeout: 240
       )
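The change above makes the adapter recognize a config value written as the literal string `ENV["..."]` (as in the README's `api_key: ENV["APIKey"]`) and evaluate it so the key is read from the environment at startup. A sketch of the same idea with a hypothetical helper (not the gem's code) that uses a regexp lookup instead of calling `eval` on configuration content:

```ruby
# Hypothetical helper mirroring the check added in OpenAIAdapter#initialize:
# a YAML value that is the literal string 'ENV["APIKey"]' resolves to the
# value of that environment variable; anything else passes through unchanged.
def resolve_env_ref(value)
  return value unless value.is_a?(String)
  match = value.match(/\AENV\[["'](?<name>[^"']+)["']\]\z/)
  match ? ENV[match[:name]] : value
end

ENV["APIKey"] = "sk-test"
puts resolve_env_ref('ENV["APIKey"]')  # -> sk-test
puts resolve_env_ref("plain-token")    # -> plain-token
```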
data/lib/smart_prompt/version.rb
CHANGED
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: smart_prompt
 version: !ruby/object:Gem::Version
-  version: 0.1.
+  version: 0.1.4
 platform: ruby
 authors:
 - zhuang biaowei
@@ -89,6 +89,7 @@ extensions: []
 extra_rdoc_files: []
 files:
 - LICENSE.txt
+- README.cn.md
 - README.md
 - Rakefile
 - lib/smart_prompt.rb