smart_prompt 0.1.2 → 0.1.4

This diff shows the changes between publicly released versions of this package, as they appear in the supported public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 5405e5a08476567f1737276d32ac24849eaa693c9ed302ca1c633a372e230996
- data.tar.gz: f5720ede56fc9eff6afa79990e25538dd6749b67bf85638f23ed94527e302401
+ metadata.gz: 846477d8ddb503119a9233f9ce85d2094c1755be52adcfc0515b0a7209524a3d
+ data.tar.gz: 4f165cedebc54e9d8ae292e3fd03329ec04150a8e0804fcb90234b7837135e6d
  SHA512:
- metadata.gz: a2cff6947e229493681d555ca407563d6aff0e9543d8fecb843fb1ee5c2f2534ff9e9994eb44a1ea4f9ddacdfc12a95da7b119aa2c2cd55ee638a65682a250c6
- data.tar.gz: 96db920d1c87b839dadeecf3b2552e6e9179d6013d3867f72e5e23370d63294261d134b9d9eb13ad6aa51743e9282d3dceac4743333d35e2554dbb8138e50eca
+ metadata.gz: 7f3521861b4690b233c9b82d6f08123b9859c0fed28e40274fb4eaa014aa93b2df63121aff77451c9b1f5f93f9eeb683c5b044c94cef5e7d11a8e2151c5e97f8
+ data.tar.gz: 9c78f290101fd158403d2feb9e84312d8ffb90a22a32a57e91d96f5c287a38cfd8cb8c318817e860d7fb09bd5584310e6d397fce490deaa29ca52bafb3db7fba
data/README.cn.md ADDED
@@ -0,0 +1,113 @@
+ # SmartPrompt
+
+ SmartPrompt 是一个强大的 Ruby gem,提供了一种领域特定语言(DSL),使其他 Ruby 程序能够更加方便、自然地调用各种大型语言模型(LLM)的能力。
+
+ ## 主要特性
+
+ - 灵活的任务组合:以特定服务提供商 + 特定 LLM + 特定 prompt 的方式组合各种任务
+ - 子任务嵌套:支持以 DSL 形式组合调用其他子任务
+ - 性能优化:在保证质量的同时,提供性能最优或成本最低的解决方案
+
+ ## 安装
+
+ 将 gem 安装并添加到应用程序的 Gemfile 中,执行以下命令:
+
+ ```
+ $ bundle add smart_prompt
+ ```
+
+ 如果不使用 bundler 来管理依赖,可以通过执行以下命令来安装 gem:
+
+ ```
+ $ gem install smart_prompt
+ ```
+
+ ## 用法
+
+ 以下是一些基本用法示例:
+
+ ### 配置文件
+
+ ```
+ adapters:
+   openai: OpenAIAdapter
+   ollama: OllamaAdapter
+ llms:
+   siliconflow:
+     adapter: openai
+     url: https://api.siliconflow.cn/v1/
+     api_key: ENV["APIKey"]
+     default_model: Qwen/Qwen2.5-7B-Instruct
+   llamacpp:
+     adapter: openai
+     url: http://localhost:8080/
+   ollama:
+     adapter: ollama
+     url: http://localhost:11434/
+     default_model: qwen2.5
+ default_llm: siliconflow
+ worker_path: "./workers"
+ template_path: "./templates"
+ ```
+
+ ### 基本使用
+
+ ```
+ require 'smart_prompt'
+ engine = SmartPrompt::Engine.new('./config/llm_config.yml')
+ result = engine.call_worker(:daily_report, {location: "Shanghai"})
+ puts result
+ ```
+
+ ### workers/daily_report.rb
+
+ ```
+ SmartPrompt.define_worker :daily_report do
+   use "ollama"
+   model "gemma2"
+   system "You are a helpful report writer."
+   weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+   prompt :daily_report, { weather: weather, location: params[:location] }
+   send_msg
+ end
+ ```
+
+ ### workers/weather_summary.rb
+
+ ```
+ SmartPrompt.define_worker :weather_summary do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful weather assistant."
+   prompt :weather, { location: params[:location], date: params[:date] }
+   weather_info = send_msg
+   prompt :summarize, { text: weather_info }
+   send_msg
+ end
+ ```
+
+ ### templates/daily_report.erb
+
+ ```
+ Please create a brief daily report for <%= location %> based on the following weather information:
+
+ <%= weather %>
+
+ The report should include:
+ 1. A summary of the weather
+ 2. Any notable events or conditions
+ 3. Recommendations for residents
+ ```
+ ### templates/weather.erb
+
+ ```
+ What's the weather like in <%= location %> <%= date %>? Please provide a brief description including temperature and general conditions.
+ ```
+
+ ### templates/summarize.erb
+
+ ```
+ Please summarize the following text in one sentence:
+
+ <%= text %>
+ ```
data/README.md CHANGED
@@ -1,36 +1,115 @@
+ EN | [中文](./README.cn.md)
+
  # SmartPrompt

- SmartPrompt 是一个强大的 Ruby gem,提供了一种领域特定语言(DSL),使其他 Ruby 程序能够更加方便、自然地调用各种大型语言模型(LLM)的能力。
+ SmartPrompt is a powerful Ruby gem that provides a domain-specific language (DSL), enabling other Ruby programs to conveniently and naturally call upon the capabilities of various large language models (LLMs).

- ## 主要特性
+ ## Key Features

- - 灵活的任务组合:以特定服务提供商 + 特定 LLM + 特定 prompt 的方式组合各种任务
- - 子任务嵌套:支持以 DSL 形式组合调用其他子任务
- - 性能优化:在保证质量的同时,提供性能最优或成本最低的解决方案
+ - Flexible task composition: Combine various tasks using specific service providers + specific LLMs + specific prompts
+ - Nested subtasks: Support for composing and calling other subtasks in DSL form
+ - Performance optimization: Provide performance-optimal or cost-effective solutions while ensuring quality

- ## 安装
+ ## Installation

- gem 安装并添加到应用程序的 Gemfile 中,执行以下命令:
+ To install the gem and add it to your application's Gemfile, execute the following command:

  ```
  $ bundle add smart_prompt
  ```

- 如果不使用 bundler 来管理依赖,可以通过执行以下命令来安装 gem
+ If you don't use a bundler to manage dependencies, you can install the gem by executing the following command:

  ```
  $ gem install smart_prompt
  ```

- ## 用法
+ ## Usage

- 以下是一些基本用法示例:
+ The following are some examples of basic usage:

- ### 基本使用
+ ### llm_config.yml
+
+ ```
+ adapters:
+   openai: OpenAIAdapter
+   ollama: OllamaAdapter
+ llms:
+   siliconflow:
+     adapter: openai
+     url: https://api.siliconflow.cn/v1/
+     api_key: ENV["APIKey"]
+     default_model: Qwen/Qwen2.5-7B-Instruct
+   llamacpp:
+     adapter: openai
+     url: http://localhost:8080/
+   ollama:
+     adapter: ollama
+     url: http://localhost:11434/
+     default_model: qwen2.5
+ default_llm: siliconflow
+ worker_path: "./workers"
+ template_path: "./templates"
+ ```
+
+ ### Basic usage

  ```
  require 'smart_prompt'
  engine = SmartPrompt::Engine.new('./config/llm_config.yml')
  result = engine.call_worker(:daily_report, {location: "Shanghai"})
  puts result
+ ```
+
+ ### workers/daily_report.rb
+
+ ```
+ SmartPrompt.define_worker :daily_report do
+   use "ollama"
+   model "gemma2"
+   system "You are a helpful report writer."
+   weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+   prompt :daily_report, { weather: weather, location: params[:location] }
+   send_msg
+ end
+ ```
+
+ ### workers/weather_summary.rb
+
+ ```
+ SmartPrompt.define_worker :weather_summary do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful weather assistant."
+   prompt :weather, { location: params[:location], date: params[:date] }
+   weather_info = send_msg
+   prompt :summarize, { text: weather_info }
+   send_msg
+ end
+ ```
+
+ ### templates/daily_report.erb
+
+ ```
+ Please create a brief daily report for <%= location %> based on the following weather information:
+
+ <%= weather %>
+
+ The report should include:
+ 1. A summary of the weather
+ 2. Any notable events or conditions
+ 3. Recommendations for residents
+ ```
+ ### templates/weather.erb
+
+ ```
+ What's the weather like in <%= location %> <%= date %>? Please provide a brief description including temperature and general conditions.
+ ```
+
+ ### templates/summarize.erb
+
+ ```
+ Please summarize the following text in one sentence:
+
+ <%= text %>
  ```
@@ -8,14 +8,15 @@ module SmartPrompt
      @messages = []
      @engine = engine
      @adapters = engine.adapters
+     @llms = engine.llms
      @templates = engine.templates
      @current_adapter = engine.current_adapter
      @last_response = nil
    end

-   def use(adapter_name)
-     raise "Adapter #{adapter_name} not configured" unless @adapters.key?(adapter_name)
-     @current_adapter = adapter_name
+   def use(llm_name)
+     raise "Adapter #{adapter_name} not configured" unless @llms.key?(llm_name)
+     @current_llm = @llms[llm_name]
      self
    end

@@ -38,8 +39,8 @@ module SmartPrompt
    end

    def send_msg
-     raise "No adapter selected" if @current_adapter.nil?
-     @last_response = @adapters[@current_adapter].send_request(@messages, @model_name)
+     raise "No LLM selected" if @current_llm.nil?
+     @last_response = @current_llm.send_request(@messages, @model_name)
      @messages=[]
      @messages << { role: 'system', content: @sys_msg }
      @last_response
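Editorial note: the rewritten `use` in the hunk above still interpolates `adapter_name` in its error message, but that parameter was renamed to `llm_name`, so `adapter_name` appears to be out of scope and the failure path would likely raise a `NameError` instead of the intended message. A minimal sketch of the presumably intended check (hypothetical helper, not the gem's code — the real `use` is a DSL method on the conversation object):

```ruby
# Hypothetical standalone version of the lookup: raise a clear error
# for unknown LLM names, interpolating the name that was actually passed in.
def use(llm_name, llms)
  raise "LLM #{llm_name} not configured" unless llms.key?(llm_name)
  llms[llm_name]
end
```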
@@ -1,9 +1,10 @@
  module SmartPrompt
    class Engine
-     attr_reader :config_file, :config, :adapters, :current_adapter, :templates
+     attr_reader :config_file, :config, :adapters, :current_adapter, :llms, :templates
      def initialize(config_file)
        @config_file = config_file
        @adapters={}
+       @llms={}
        @templates={}
        load_config(config_file)
      end
@@ -11,13 +12,18 @@ module SmartPrompt
      def load_config(config_file)
        @config_file = config_file
        @config = YAML.load_file(config_file)
-       @config['adapters'].each do |adapter_name, adapter_config|
-         adapter_class = SmartPrompt.const_get("#{adapter_name.capitalize}Adapter")
-         @adapters[adapter_name] = adapter_class.new(adapter_config)
+       @config['adapters'].each do |adapter_name, adapter_class|
+         adapter_class = SmartPrompt.const_get(adapter_class)
+         @adapters[adapter_name] = adapter_class
        end
-       @current_adapter = @config['default_adapter'] if @config['default_adapter']
-       @config['templates'].each do |template_name, template_file|
-         @templates[template_name] = PromptTemplate.new(template_file)
+       @config['llms'].each do |llm_name,llm_config|
+         adapter_class = @adapters[llm_config['adapter']]
+         @llms[llm_name]=adapter_class.new(llm_config)
+       end
+       @current_llm = @config['default_llm'] if @config['default_llm']
+       Dir.glob(File.join(@config['template_path'], '*.erb')).each do |file|
+         template_name = file.gsub(@config['template_path']+"/","").gsub("\.erb","")
+         @templates[template_name] = PromptTemplate.new(file)
        end
        load_workers
      end
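Editorial note: the template-loading loop above derives template names with two chained `gsub` calls; note that `"\.erb"` is an ordinary double-quoted string in which `\.` is just `.`, so this is a plain substring replacement. A `File.basename`-based sketch (hypothetical refactor, not the gem's code) yields the same names without depending on the exact path prefix:

```ruby
# Derive a template name from its file path,
# e.g. "./templates/weather.erb" -> "weather".
def template_name_for(file)
  File.basename(file, ".erb")
end
```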
@@ -15,11 +15,15 @@ module SmartPrompt
      end
    end

-   class OpenaiAdapter < LLMAdapter
+   class OpenAIAdapter < LLMAdapter
      def initialize(config)
        super
+       api_key = @config['api_key']
+       if api_key.is_a?(String) && api_key.start_with?('ENV[') && api_key.end_with?(']')
+         api_key = eval(api_key)
+       end
        @client = OpenAI::Client.new(
-         access_token: @config['api_key'],
+         access_token: api_key,
          uri_base: @config['url'],
          request_timeout: 240
        )
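Editorial note: the new `OpenAIAdapter` above resolves config strings like `ENV["APIKey"]` via `eval`, which executes arbitrary Ruby from the config file. A pattern-matching sketch that avoids `eval` (hypothetical alternative, not the gem's code):

```ruby
# Resolve config values of the form ENV["NAME"] (or ENV['NAME'])
# by looking up the environment variable directly, without eval.
def resolve_api_key(value)
  if value.is_a?(String) && (m = value.match(/\AENV\[["'](\w+)["']\]\z/))
    ENV[m[1]]
  else
    value  # literal keys pass through unchanged
  end
end
```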
@@ -1,3 +1,3 @@
  module SmartPrompt
-   VERSION = "0.1.2"
+   VERSION = "0.1.4"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: smart_prompt
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.1.2
4
+ version: 0.1.4
5
5
  platform: ruby
6
6
  authors:
7
7
  - zhuang biaowei
8
8
  autorequire:
9
9
  bindir: exe
10
10
  cert_chain: []
11
- date: 2024-09-30 00:00:00.000000000 Z
11
+ date: 2024-10-02 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  name: yaml
@@ -89,6 +89,7 @@ extensions: []
89
89
  extra_rdoc_files: []
90
90
  files:
91
91
  - LICENSE.txt
92
+ - README.cn.md
92
93
  - README.md
93
94
  - Rakefile
94
95
  - lib/smart_prompt.rb