smart_prompt 0.1.3 → 0.1.5

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 48814b17b9e142698d6cb49efd4ab15195bddce4fca7ac5f111b52e5e8b3e04b
- data.tar.gz: c0e23945a97549403ab6b5d3225e3b281b0d2f6f6f6097f5158ae98edb70d722
+ metadata.gz: 844c87d47dccd945bedcab69c58377ae144238ac9087b49219041382db539146
+ data.tar.gz: 3e70f88fbe8d26a0a15408e73ae9060b9f711a18e6b44eb561910b90156ca14b
  SHA512:
- metadata.gz: 1dac2c898d593345d1f1d8c9508faabda41c2954938171f99aa9999fbe4f34532124e441ad2ccbb265bc6ceb65fb188fd7b60d8e8f1439ad478f6d2d62a38018
- data.tar.gz: 851707ebcac1db79aa0fc29a8a92dd9e1571dd0d1207d4cd9ac75fb5992ecfac1bac09eec8599594e247066e586d4bd48b6fddf3da073e168f2039823dcce747
+ metadata.gz: 398854ce070f96f794ae944cc656a10d2f4752268cb35b4f9a857fcd8a722489b3ee112de97e5cdfe1cc1efb75cf92e4cd4a9b36cb338cbd8de6e22f1d6cc7dc
+ data.tar.gz: babac8b7d0479eb376df22b00cd693e6e657293e6b28b654aaa6b2ef4f9fcfdccb9709d6f1dd0b066cb6f22c07eed0173426816121061a80c91fc9afdd62ba06
data/README.cn.md ADDED
@@ -0,0 +1,113 @@
+ # SmartPrompt
+
+ SmartPrompt is a powerful Ruby gem that provides a domain-specific language (DSL), enabling other Ruby programs to conveniently and naturally call upon the capabilities of various large language models (LLMs).
+
+ ## Key Features
+
+ - Flexible task composition: Combine various tasks using specific service providers + specific LLMs + specific prompts
+ - Nested subtasks: Support for composing and calling other subtasks in DSL form
+ - Performance optimization: Provide performance-optimal or cost-effective solutions while ensuring quality
+
+ ## Installation
+
+ To install the gem and add it to your application's Gemfile, execute the following command:
+
+ ```
+ $ bundle add smart_prompt
+ ```
+
+ If you don't use Bundler to manage dependencies, you can install the gem by executing the following command:
+
+ ```
+ $ gem install smart_prompt
+ ```
+
+ ## Usage
+
+ The following are some examples of basic usage:
+
+ ### Configuration file
+
+ ```
+ adapters:
+   openai: OpenAIAdapter
+   ollama: OllamaAdapter
+ llms:
+   siliconflow:
+     adapter: openai
+     url: https://api.siliconflow.cn/v1/
+     api_key: ENV["APIKey"]
+     default_model: Qwen/Qwen2.5-7B-Instruct
+   llamacpp:
+     adapter: openai
+     url: http://localhost:8080/
+   ollama:
+     adapter: ollama
+     url: http://localhost:11434/
+     default_model: qwen2.5
+ default_llm: siliconflow
+ worker_path: "./workers"
+ template_path: "./templates"
+ ```
+
+ ### Basic usage
+
+ ```
+ require 'smart_prompt'
+ engine = SmartPrompt::Engine.new('./config/llm_config.yml')
+ result = engine.call_worker(:daily_report, {location: "Shanghai"})
+ puts result
+ ```
+
+ ### workers/daily_report.rb
+
+ ```
+ SmartPrompt.define_worker :daily_report do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful report writer."
+   weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+   prompt :daily_report, { weather: weather, location: params[:location] }
+   send_msg
+ end
+ ```
+
+ ### workers/weather_summary.rb
+
+ ```
+ SmartPrompt.define_worker :weather_summary do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful weather assistant."
+   prompt :weather, { location: params[:location], date: params[:date] }
+   weather_info = send_msg
+   prompt :summarize, { text: weather_info }
+   send_msg
+ end
+ ```
+
+ ### templates/daily_report.erb
+
+ ```
+ Please create a brief daily report for <%= location %> based on the following weather information:
+
+ <%= weather %>
+
+ The report should include:
+ 1. A summary of the weather
+ 2. Any notable events or conditions
+ 3. Recommendations for residents
+ ```
+
+ ### templates/weather.erb
+
+ ```
+ What's the weather like in <%= location %> on <%= date %>? Please provide a brief description including temperature and general conditions.
+ ```
+
+ ### templates/summarize.erb
+
+ ```
+ Please summarize the following text in one sentence:
+
+ <%= text %>
+ ```
data/README.md CHANGED
@@ -1,36 +1,115 @@
+ EN | [中文](./README.cn.md)
+
  # SmartPrompt

- SmartPrompt 是一个强大的 Ruby gem,提供了一种领域特定语言(DSL),使其他 Ruby 程序能够更加方便、自然地调用各种大型语言模型(LLM)的能力。
+ SmartPrompt is a powerful Ruby gem that provides a domain-specific language (DSL), enabling other Ruby programs to conveniently and naturally call upon the capabilities of various large language models (LLMs).

- ## 主要特性
+ ## Key Features

- - 灵活的任务组合:以特定服务提供商 + 特定 LLM + 特定 prompt 的方式组合各种任务
- - 子任务嵌套:支持以 DSL 形式组合调用其他子任务
- - 性能优化:在保证质量的同时,提供性能最优或成本最低的解决方案
+ - Flexible task composition: Combine various tasks using specific service providers + specific LLMs + specific prompts
+ - Nested subtasks: Support for composing and calling other subtasks in DSL form
+ - Performance optimization: Provide performance-optimal or cost-effective solutions while ensuring quality

- ## 安装
+ ## Installation

- gem 安装并添加到应用程序的 Gemfile 中,执行以下命令:
+ To install the gem and add it to your application's Gemfile, execute the following command:

  ```
  $ bundle add smart_prompt
  ```

- 如果不使用 bundler 来管理依赖,可以通过执行以下命令来安装 gem
+ If you don't use Bundler to manage dependencies, you can install the gem by executing the following command:

  ```
  $ gem install smart_prompt
  ```

- ## 用法
+ ## Usage

- 以下是一些基本用法示例:
+ The following are some examples of basic usage:

- ### 基本使用
+ ### llm_config.yml
+
+ ```
+ adapters:
+   openai: OpenAIAdapter
+   ollama: OllamaAdapter
+ llms:
+   siliconflow:
+     adapter: openai
+     url: https://api.siliconflow.cn/v1/
+     api_key: ENV["APIKey"]
+     default_model: Qwen/Qwen2.5-7B-Instruct
+   llamacpp:
+     adapter: openai
+     url: http://localhost:8080/
+   ollama:
+     adapter: ollama
+     url: http://localhost:11434/
+     default_model: qwen2.5
+ default_llm: siliconflow
+ worker_path: "./workers"
+ template_path: "./templates"
+ ```
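Note that `api_key: ENV["APIKey"]` stays a literal string in the YAML; the adapter resolves it at startup (see the `OpenAIAdapter` hunk below), so the variable must be exported before the engine is created. A minimal sketch, assuming the config above:

```
require 'smart_prompt'

# APIKey must be present in the process environment before the engine
# parses llm_config.yml and builds its adapters.
abort 'Set APIKey first, e.g. `export APIKey=sk-...`' unless ENV["APIKey"]
engine = SmartPrompt::Engine.new("./config/llm_config.yml")
```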
+
+ ### Basic usage

  ```
  require 'smart_prompt'
  engine = SmartPrompt::Engine.new('./config/llm_config.yml')
  result = engine.call_worker(:daily_report, {location: "Shanghai"})
  puts result
+ ```
+
+ ### workers/daily_report.rb
+
+ ```
+ SmartPrompt.define_worker :daily_report do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful report writer."
+   weather = call_worker(:weather_summary, { location: params[:location], date: "today" })
+   prompt :daily_report, { weather: weather, location: params[:location] }
+   send_msg
+ end
+ ```
+
+ ### workers/weather_summary.rb
+
+ ```
+ SmartPrompt.define_worker :weather_summary do
+   use "ollama"
+   model "gemma2"
+   sys_msg "You are a helpful weather assistant."
+   prompt :weather, { location: params[:location], date: params[:date] }
+   weather_info = send_msg
+   prompt :summarize, { text: weather_info }
+   send_msg
+ end
+ ```
+
+ ### templates/daily_report.erb
+
+ ```
+ Please create a brief daily report for <%= location %> based on the following weather information:
+
+ <%= weather %>
+
+ The report should include:
+ 1. A summary of the weather
+ 2. Any notable events or conditions
+ 3. Recommendations for residents
+ ```
+
+ ### templates/weather.erb
+
+ ```
+ What's the weather like in <%= location %> on <%= date %>? Please provide a brief description including temperature and general conditions.
+ ```
+
+ ### templates/summarize.erb
+
+ ```
+ Please summarize the following text in one sentence:
+
+ <%= text %>
  ```
data/lib/smart_prompt/conversation.rb CHANGED
@@ -5,6 +5,7 @@ module SmartPrompt
      attr_reader :messages, :last_response, :config_file

      def initialize(engine)
+       SmartPrompt.logger.info "Create Conversation"
        @messages = []
        @engine = engine
        @adapters = engine.adapters
@@ -26,6 +27,7 @@ module SmartPrompt

      def prompt(template_name, params = {})
        template_name = template_name.to_s
+       SmartPrompt.logger.info "Use template #{template_name}"
        raise "Template #{template_name} not found" unless @templates.key?(template_name)
        content = @templates[template_name].render(params)
        @messages << { role: 'user', content: content }
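`Conversation#prompt` renders the named template with the given params and appends the result as a user message. A standalone sketch of the same mechanism using stdlib ERB (the gem's own template class may differ in detail):

```
require 'erb'

# Render templates/weather.erb by hand, as Conversation#prompt does internally.
template = ERB.new(File.read("./templates/weather.erb"))
content  = template.result_with_hash(location: "Shanghai", date: "today")
message  = { role: 'user', content: content }
# => the hash appended to @messages before send_msg dispatches the conversation
```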
data/lib/smart_prompt/engine.rb CHANGED
@@ -2,6 +2,7 @@ module SmartPrompt
    class Engine
      attr_reader :config_file, :config, :adapters, :current_adapter, :llms, :templates
      def initialize(config_file)
+       SmartPrompt.logger.info "Start creating the SmartPrompt engine."
        @config_file = config_file
        @adapters = {}
        @llms = {}
@@ -10,6 +11,7 @@ module SmartPrompt
      end

      def load_config(config_file)
+       SmartPrompt.logger.info "Loading configuration from file: #{config_file}"
        @config_file = config_file
        @config = YAML.load_file(config_file)
        @config['adapters'].each do |adapter_name, adapter_class|
@@ -35,14 +37,25 @@ module SmartPrompt
      end

      def call_worker(worker_name, params = {})
+       SmartPrompt.logger.info "Calling worker: #{worker_name} with params: #{params}"
        worker = get_worker(worker_name)
-       worker.execute(params)
+
+       begin
+         result = worker.execute(params)
+         SmartPrompt.logger.info "Worker #{worker_name} executed successfully"
+         result
+       rescue => e
+         SmartPrompt.logger.error "Error executing worker #{worker_name}: #{e.message}"
+         SmartPrompt.logger.debug e.backtrace.join("\n")
+         raise
+       end
      end

      private

      def get_worker(worker_name)
-       worker = Worker.new(worker_name, self)
+       SmartPrompt.logger.info "Creating worker instance for: #{worker_name}"
+       Worker.new(worker_name, self)
      end
    end
  end
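Because `call_worker` now logs failures and re-raises, callers can layer their own handling on top. A minimal sketch using the README's `daily_report` worker:

```
require 'smart_prompt'

engine = SmartPrompt::Engine.new('./config/llm_config.yml')
begin
  puts engine.call_worker(:daily_report, { location: "Shanghai" })
rescue => e
  # The engine has already logged the message and backtrace; the caller
  # only decides how the application reacts (retry, fallback, surface).
  warn "daily_report failed: #{e.message}"
end
```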
data/lib/smart_prompt/llm_adapter.rb CHANGED
@@ -7,10 +7,12 @@ require 'ollama-ai'
  module SmartPrompt
    class LLMAdapter
      def initialize(config)
+       SmartPrompt.logger.info "Start creating the SmartPrompt LLMAdapter."
        @config = config
      end

      def send_request(messages)
+       SmartPrompt.logger.error "LLMAdapter: Subclasses must implement send_request"
        raise NotImplementedError, "Subclasses must implement send_request"
      end
    end
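The base class fixes the adapter contract: subclasses receive the config hash and must implement `send_request`, and the `adapters:` map in the YAML binds names to classes. A sketch of a custom adapter; `EchoAdapter` is hypothetical, for illustration only:

```
module SmartPrompt
  # Hypothetical adapter that echoes the last user message back, enough
  # to exercise workers without a real LLM behind them.
  class EchoAdapter < LLMAdapter
    def send_request(messages, model = nil)
      messages.last[:content]
    end
  end
end
```

Registered as `echo: EchoAdapter` under `adapters:`, it can back an `llms:` entry like any other provider.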
@@ -18,19 +20,25 @@ module SmartPrompt
    class OpenAIAdapter < LLMAdapter
      def initialize(config)
        super
+       api_key = @config['api_key']
+       if api_key.is_a?(String) && api_key.start_with?('ENV[') && api_key.end_with?(']')
+         api_key = eval(api_key)
+       end
        @client = OpenAI::Client.new(
-         access_token: @config['api_key'],
+         access_token: api_key,
          uri_base: @config['url'],
          request_timeout: 240
        )
      end

      def send_request(messages, model=nil)
+       SmartPrompt.logger.info "OpenAIAdapter: Sending request to OpenAI"
        if model
          model_name = model
        else
          model_name = @config['model']
        end
+       SmartPrompt.logger.info "OpenAIAdapter: Using model #{model_name}"
        response = @client.chat(
          parameters: {
            model: model_name,
@@ -38,6 +46,7 @@ module SmartPrompt
            temperature: @config['temperature'] || 0.7
          }
        )
+       SmartPrompt.logger.info "OpenAIAdapter: Received response from OpenAI"
        response.dig("choices", 0, "message", "content")
      end
    end
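The new initializer resolves `api_key` values of the form `ENV["..."]` with `eval`, which executes whatever Ruby the config happens to contain. A hedged alternative sketch that extracts the variable name without evaluating code; `resolve_api_key` is a hypothetical helper, not part of the gem:

```
# Hypothetical helper: honors the same ENV["..."] convention, no eval.
def resolve_api_key(value)
  return value unless value.is_a?(String)
  match = value.match(/\AENV\["([^"]+)"\]\z/)
  return value unless match
  ENV.fetch(match[1]) # raises KeyError if the variable is unset
end

resolve_api_key('ENV["APIKey"]') # => the value of ENV["APIKey"]
```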
@@ -50,12 +59,14 @@ module SmartPrompt
        )
      end
      def send_request(messages, model=nil)
+       SmartPrompt.logger.info "LlamacppAdapter: Sending request to Llamacpp"
        response = @client.chat(
          parameters: {
            messages: messages,
            temperature: @config['temperature'] || 0.7
          }
        )
+       SmartPrompt.logger.info "LlamacppAdapter: Received response from Llamacpp"
        response.dig("choices", 0, "message", "content")
      end
    end
@@ -67,11 +78,13 @@ module SmartPrompt
      end

      def send_request(messages, model=nil)
+       SmartPrompt.logger.info "OllamaAdapter: Sending request to Ollama"
        if model
          model_name = model
        else
          model_name = @config['model']
        end
+       SmartPrompt.logger.info "OllamaAdapter: Using model #{model_name}"
        response = @client.generate(
          {
            model: model_name,
@@ -79,6 +92,7 @@ module SmartPrompt
            stream: false
          }
        )
+       SmartPrompt.logger.info "OllamaAdapter: Received response from Ollama"
        return response[0]["response"]
      end
    end
data/lib/smart_prompt/version.rb CHANGED
@@ -1,3 +1,3 @@
  module SmartPrompt
-   VERSION = "0.1.3"
+   VERSION = "0.1.5"
  end
data/lib/smart_prompt/worker.rb CHANGED
@@ -3,6 +3,7 @@ module SmartPrompt
      attr_reader :name, :config_file

      def initialize(name, engine)
+       SmartPrompt.logger.info "Create worker with name: #{name}"
        @name = name
        @engine = engine
        @config = engine.config
data/lib/smart_prompt.rb CHANGED
@@ -7,6 +7,7 @@ require File.expand_path('../smart_prompt/worker', __FILE__)

  module SmartPrompt
    class Error < StandardError; end
+   attr_writer :logger

    def self.define_worker(name, &block)
      Worker.define(name, &block)
@@ -16,4 +17,10 @@ module SmartPrompt
      worker = Worker.new(name, config_file)
      worker.execute(params)
    end
+
+   def self.logger
+     @logger ||= Logger.new($stdout).tap do |log|
+       log.progname = self.name
+     end
+   end
  end
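`SmartPrompt.logger` memoizes a `$stdout` logger whose progname is the module name. A small usage sketch; note that `attr_writer :logger` as diffed defines an instance-level writer, so replacing the module's logger from outside would likely need `class << self; attr_writer :logger; end` (an assumption about intent, not observed behavior):

```
require 'logger'
require 'smart_prompt'

SmartPrompt.logger.info "engine starting" # defaults to $stdout, progname "SmartPrompt"

# Intended override, assuming a module-level writer is exposed:
# SmartPrompt.logger = Logger.new("log/smart_prompt.log")
```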
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: smart_prompt
  version: !ruby/object:Gem::Version
-   version: 0.1.3
+   version: 0.1.5
  platform: ruby
  authors:
  - zhuang biaowei
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2024-10-02 00:00:00.000000000 Z
+ date: 2024-10-07 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: yaml
@@ -89,6 +89,7 @@ extensions: []
  extra_rdoc_files: []
  files:
  - LICENSE.txt
+ - README.cn.md
  - README.md
  - Rakefile
  - lib/smart_prompt.rb