luogu 0.1.19 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: fdfa637f68a431adad34ebf8e7349f74dec597f61949c56c9e2cd98b8adfd11e
-  data.tar.gz: 2c70ae6c20b90e694a73406f5b99cc37b432647e364a19d7a2ec850da9376910
+  metadata.gz: 50079e5c23717d6caf07800b5878ba3b6398252caea1ff033175e16cbbe360b7
+  data.tar.gz: 4fa6b63d827b806de779caa38e085126c0fd112b02baad9984fece843801d7f7
 SHA512:
-  metadata.gz: 0f0b8d18cc5d7781600f37e810ede2d4983c4e4d0ef6373dfd4ac8fdf09f8761331fa93a38e734ad564c515da01283f7558a9b59b2cc7396aa4fe38f10aa191f
-  data.tar.gz: 0d29516aad6e9b1adb0111b730651a8c95ed3c462064b1f99bbed18476e604b587aca6510560b86b91603dc69931edfcfa001bea38838b220e166902551f5745
+  metadata.gz: eb4de03374038e36faa54427c667017e03f1ab6a1980b606e69d8975f28395adba2c6be9f3065b2d1a578f43c3c89385abe9ac429d888529f172734167bad0dd
+  data.tar.gz: 18c4ee842a0e12b64cce016798a52f6efd9b89d97018e2b800c64226d743f042f866977418c550a0d2e2db7fa2bd041a3fe2729f28fbbc7345048611a96696a1
data/Gemfile.lock CHANGED
@@ -1,9 +1,10 @@
 PATH
   remote: .
   specs:
-    luogu (0.1.18)
+    luogu (0.2.0)
       dotenv (~> 2.8, >= 2.8.1)
       dry-cli (~> 1.0)
+      dry-configurable (~> 1.0, >= 1.0.1)
       http (~> 5.1, >= 5.1.1)
       tty-prompt (~> 0.23.1)

@@ -12,10 +13,17 @@ GEM
   specs:
     addressable (2.8.4)
       public_suffix (>= 2.0.2, < 6.0)
+    concurrent-ruby (1.2.2)
     domain_name (0.5.20190701)
       unf (>= 0.0.5, < 1.0.0)
     dotenv (2.8.1)
     dry-cli (1.0.0)
+    dry-configurable (1.0.1)
+      dry-core (~> 1.0, < 2)
+      zeitwerk (~> 2.6)
+    dry-core (1.0.0)
+      concurrent-ruby (~> 1.0)
+      zeitwerk (~> 2.6)
     ffi (1.15.5)
     ffi-compiler (1.0.1)
       ffi (>= 1.0.0)
@@ -49,6 +57,7 @@ GEM
       unf_ext
     unf_ext (0.0.8.2)
     wisper (2.0.1)
+    zeitwerk (2.6.7)

 PLATFORMS
   arm64-darwin-22
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2023 Mj
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
data/README.md CHANGED
@@ -1,19 +1,20 @@
 # Luogu
-> 锣鼓,现在的写代码就如古代的求神仪式一样,以前敲锣打鼓求天,现在敲锣打鼓求大模型

-用来开发 prompt 工程的工具
+> 锣鼓,现在的写代码就如古代的求神仪式一样,以前敲锣打鼓求天,现在敲锣打鼓求大模型,PS.本文档也是通过GPT来做优化的,具体查看`prompt.md`和`prompt.plugin.rb`

-### 更新记录
-- 0.1.15 http库替换成HTTP.rb并且加入了重试机制,默认为3次,可通过设置环境变量 OPENAI_REQUEST_RETRIES 来设置次数
-- 0.1.16 增加对agent的支持
+Luogu是一个针对产品经理的Prompt设计工具,可以帮助你更方便地编写和测试Prompt。

-### 安装
-- 安装ruby,要求2.6以上,Mac自带不需要再安装
-- gem install luogu
-- 如果使用 mac 可能你需要使用sudu权限
-- 如果需要在终端显示markdown,需要 [glow](https://github.com/charmbracelet/glow)
+## 安装
+
+1. 安装Ruby,要求2.6以上,Mac自带不需要再安装
+2. 在终端中运行`gem install luogu`命令安装Luogu
+3. 如果使用Mac可能需要使用sudo权限
+4. 如果需要在终端显示markdown,需要安装[glow](https://github.com/charmbracelet/glow)
+
+## 使用
+
+Luogu提供了以下命令:

-### 使用
 ```Bash
 Commands:
   luogu build PROMPT_FILE [TARGET_FILE] # 编译 Prompt.md 成能够提交给 ChatGPT API 的 messages. 默认输出为 <同文件名>.json
@@ -23,52 +24,28 @@ Commands:
   luogu version # 打印版本
 ```

-你可以在项目目录的.env中设置下面的环境变量,或者直接系统设置
+你可以在项目目录的.env中设置下面的环境变量,或者直接系统设置:
+
 ```
 OPENAI_ACCESS_TOKEN=zheshiyigetoken
 OPENAI_TEMPERATURE=0.7
 OPENAI_LIMIT_HISTORY=6
 ```

-prompt.md 示例
-```
-@s
-你是一个罗纳尔多的球迷
+## 进入run模式

-@a
-好的
-
-@u
-罗纳尔多是谁?
+在run模式下,你可以使用以下命令:

-@a
-是大罗,不是C罗
-```
-
-如果需要处理历史记录的是否进入上下文可以使用,在 prompt.md写入
-
-
-@callback
-```ruby
-if assistant_message =~ /```ruby/
-  puts "记录本次记录"
-  self.push_history(user_message, assistant_message)
-else
-  puts "抛弃本次记录"
-end
-```
+- save:保存对话
+- row history:查看对话历史
+- history:查看当前上下文
+- exit:退出

+## 插件模式

-### 进入run模式
-- save 保存对话
-- row history 查看对话历史
-- history 查看当前上下文
-- exit 退出
+run和test中可以使用插件模式,可以使用--plugin=<file>.plugin.rb。默认情况下,你可以使用<文件名>.plugin.rb来实现一个prompt.md的插件。

-## 插件模式
-在 run 和 test 中可以使用,可以使用 --plugin=<file>.plugin.rb
-默认情况下你可以使用 <文件名>.plugin.rb 来实现一个prompt.md的插件
-在插件中有两个对象
+在插件中有两个对象:

 ```ruby
 #gpt
@@ -88,7 +65,8 @@ OpenStruct.new(
 # 如果需要在方法中使用使用变量传递,必须使用context包括你要的变量名,比如 context.name = "luogu"
 ```

-支持的回调
+支持的回调:
+
 ```ruby
 # 所有方法都必须返回context
 # 可以使用
@@ -127,8 +105,44 @@ end
 after_save_historydo |gpt, context|
   context
 end
+```
+
+## 示例
+
+下面是一个prompt.md的示例:

+```
+@s
+你是一个罗纳尔多的球迷

+@a
+好的
+
+@u
+罗纳尔多是谁?
+
+@a
+是大罗,不是C罗
 ```

-## MIT 协议
+如果需要处理历史记录的是否进入上下文可以使用,在prompt.md写入:
+
+@callback
+```ruby
+if assistant_message =~ /```ruby/
+  puts "记录本次记录"
+  self.push_history(user_message, assistant_message)
+else
+  puts "抛弃本次记录"
+end
+```
+
+## 更新记录
+
+- 0.1.15:http库替换成HTTP.rb并且加入了重试机制,默认为3次,可通过设置环境变量OPENAI_REQUEST_RETRIES来设置次数
+- 0.1.16:增加对agent的支持
+- 0.2.0:重构了代码,agent的支持基本达到可用状态,具体看`bin/agent.rb`示例,有破坏性更新,不兼容0.1.x的agent写法
+
+## License
+
+This project is licensed under the terms of the MIT license. See the [LICENSE](LICENSE) file for details.
data/lib/luogu/agent_runner.rb CHANGED
@@ -1,27 +1,44 @@
-module Luogu
-  class AgentRunner
-    attr_accessor :system_prompt_template, :user_input_prompt_template
-    attr_reader :session, :histories
-
-    def initialize(system_prompt_template: nil, user_input_prompt_template: nil, tools_response_prompt_template: nil, session: Session.new)
-      @system_prompt_template = system_prompt_template || load_system_prompt_default_template
-      @user_input_prompt_template = user_input_prompt_template || load_user_input_prompt_default_template
-      @tools_response_prompt_template = tools_response_prompt_template || load_tools_response_prompt_default
-
-      @agents = []
-      @session = session
-
-      @chatgpt_request_body = Luogu::OpenAI::ChatRequestBody.new(temperature: 0)
-      @chatgpt_request_body.stop = ["\nObservation:", "\n\tObservation:"]
-      @limit_history = ENV.fetch('OPENAI_LIMIT_HISTORY', '6').to_i * 2
-      @histories = HistoryQueue.new @limit_history
+# frozen_string_literal: true

+module Luogu
+  class AgentRunner < Base
+    setting :templates do
+      setting :system, default: PromptTemplate.load_template('agent_system.md.erb')
+      setting :user, default: PromptTemplate.load_template('agent_input.md.erb')
+      setting :tool, default: PromptTemplate.load_template('agent_tool_input.md.erb')
+    end
+
+    setting :run_agent_retries, default: Application.config.run_agent_retries
+
+    setting :provider do
+      setting :parameter_model, default: ->() {
+        OpenAI::ChatRequestParams.new(
+          model: 'gpt-3.5-turbo',
+          stop: %W[\nObservation: \n\tObservation:],
+          temperature: 0
+        )
+      }
+      setting :request, default: ->(params) { OpenAI.chat(params: params) }
+      setting :parse, default: ->(response) { OpenAI.chat_response_handle(response) }
+      setting :find_final_answer, default: ->(content) { OpenAI.find_final_answer(content) }
+      setting :history_limit, default: Application.config.openai.history_limit
+    end
+
+    attr_reader :request_params, :agents
+    def initialize()
+      @request_params = provider.parameter_model.call
+      @histories = HistoryQueue.new provider.history_limit
       @last_user_input = ''
+      @agents = []
       @tools_response = []
     end

-    def openai_configuration(&block)
-      block.call @chatgpt_request_body
+    def provider
+      config.provider
+    end
+
+    def templates
+      config.templates
     end

     def register(agent)
@@ -30,126 +47,69 @@ module Luogu
       self
     end

-    def request(messages)
-      @chatgpt_request_body.messages = messages
-      response = client.chat(params: @chatgpt_request_body.to_h)
-      if response.code == 200
-        response.parse
-      else
-        logger.error response.body.to_s
-        raise OpenAI::RequestError
-      end
-    end
-
-    def user_input_prompt_template
-      @user_input_prompt_template.result binding
-    end
-
-    def system_prompt_template
-      @system_prompt_template.result binding
+    def run(text)
+      @last_user_input = text
+      messages = create_messages(
+        [{role: "user", content: templates.user.result(binding)}]
+      )
+      request(messages)
     end
+    alias_method :chat, :run

-    def tools_response_prompt_template
-      @tools_response_prompt_template.result binding
+    def create_messages(messages)
+      [
+        { role: "system", content: templates.system.result(binding) }
+      ] + @histories.to_a + messages
     end

-    def find_final_answer(content)
-      if content.is_a?(Hash) && content['action'] == 'Final Answer'
-        content['action_input']
+    def request(messages, run_agent_retries: 0)
+      logger.debug "request chat: #{messages}"
+      @request_params.messages = messages
+      response = provider.request.call(@request_params.to_h)
+      unless response.code == 200
+        logger.error response.body
+        raise RequestError
+      end
+      content = provider.parse.call(response)
+      logger.debug content
+      if (answer = self.find_and_save_final_answer(content))
+        logger.info "final answer: #{answer}"
+        answer
       elsif content.is_a?(Array)
-        result = content.find { |element| element["action"] == "Final Answer" }
-        if result
-          result["action_input"]
-        else
-          nil
-        end
-      else
-        nil
+        run_agents(content, messages, run_agent_retries: run_agent_retries)
       end
     end

     def find_and_save_final_answer(content)
-      if answer = self.find_final_answer(content)
-        self.save_history(answer)
+      if (answer = provider.find_final_answer.call(content))
+        @histories.enqueue({role: "user", content: @last_user_input})
+        @histories.enqueue({role: "assistant", content: answer})
         answer
       else
         nil
       end
     end

-    def save_history(finnal_answer)
-      @histories.enqueue({role: "user", content: @last_user_input})
-      @histories.enqueue({role: "assistant", content: finnal_answer})
-    end
-
-    def parse_json(markdown)
-      json_regex = /```json(.+?)```/im
-      json_blocks = markdown.scan(json_regex)
-      result_json = nil
-      json_blocks.each do |json_block|
-        json_string = json_block[0]
-        result_json = JSON.parse(json_string)
-      end
-
-      if result_json.nil?
-        JSON.parse markdown
-      else
-        result_json
-      end
-    end
-
-    def create_messages(messages)
-      [{role: "system", content: self.system_prompt_template}] + @histories.to_a + messages
-    end
-
-    def request_chat(messages)
-      logger.debug "request chat: #{messages}"
-      response = request messages
-      content = self.parse_json response.dig("choices", 0, "message", "content")
-      logger.debug content
-      if answer = self.find_and_save_final_answer(content)
-        logger.info "finnal answer: #{answer}"
-      elsif content.is_a?(Array)
-        self.run_agents(content, messages)
-      end
-    end
-
-    def chat(message)
-      @last_user_input = message
-      messages = self.create_messages([{role: "user", content: self.user_input_prompt_template}])
-      self.request_chat(messages)
-    end
-
-    def run_agents(agents, _messages_)
-      if answer = self.find_and_save_final_answer(agents)
-        logger.info "finnal answer: #{answer}"
+    def run_agents(agents, _messages_, run_agent_retries: 0)
+      return if run_agent_retries > config.run_agent_retries
+      run_agent_retries += 1
+      if (answer = find_and_save_final_answer(agents))
+        logger.info "final answer: #{answer}"
         return
       end
       @tools_response = []
       agents.each do |agent|
         agent_class = Module.const_get(agent['action'])
-        logger.info "running #{agent_class}"
+        logger.info "#{run_agent_retries} running #{agent_class} input: #{agent['action_input']}"
         response = agent_class.new.call(agent['action_input'])
         @tools_response << {name: agent['action'], response: response}
       end
-      messages = _messages_ + [{role: "assistant", content: agents.to_json}, {role: "user", content: self.tools_response_prompt_template}]
-      self.request_chat messages
-    end
-
-    private
-    def load_system_prompt_default_template
-      PromptTemplate.load_template('agent_system.md.erb')
+      messages = _messages_ + [
+        { role: "assistant", content: agents.to_json },
+        { role: "user", content: templates.tool.result(binding) }
+      ]
+      request messages, run_agent_retries: run_agent_retries
     end

-    def load_user_input_prompt_default_template
-      PromptTemplate.load_template('agent_input.md.erb')
-    end
-
-    def load_tools_response_prompt_default
-      PromptTemplate.load_template('agent_tool_input.md.erb')
-    end
-  end
-
-  class AssertionError < StandardError
   end
-end
+end
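A notable change in this hunk: `run_agents` now threads a `run_agent_retries` counter through `request`, so the tool loop gives up after `config.run_agent_retries` rounds instead of recursing until it finds a final answer. A minimal standalone sketch of that guard (simplified names, plain symbols standing in for agent steps — not the gem's actual classes):

```ruby
# Sketch of AgentRunner's bounded agent loop. MAX_RETRIES stands in for
# config.run_agent_retries; :final stands in for a "Final Answer" action.
MAX_RETRIES = 5

def run_agents(steps, run_agent_retries: 0, trace: [])
  return trace if run_agent_retries > MAX_RETRIES  # stop runaway loops

  run_agent_retries += 1
  step = steps.shift
  return trace << :final_answer if step == :final   # answer found: done

  trace << run_agent_retries                        # one tool round executed
  run_agents(steps, run_agent_retries: run_agent_retries, trace: trace)
end
```

Even if the model never emits a final answer, the recursion is cut off once the counter passes the limit, which is the behavioral difference from the 0.1.x `run_agents`/`request_chat` pair.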
data/lib/luogu/application.rb ADDED
@@ -0,0 +1,25 @@
+# frozen_string_literal: true
+
+module Luogu
+  module Application
+    extend Dry::Configurable
+
+    setting :openai do
+      setting :access_token, default: ENV.fetch('OPENAI_ACCESS_TOKEN')
+      setting :retries, default: ENV.fetch('OPENAI_REQUEST_RETRIES', 3).to_i
+      setting :host, default: ENV.fetch('OPENAI_HOST', 'https://api.openai.com')
+      setting :history_limit, default: ENV.fetch('OPENAI_LIMIT_HISTORY', '6').to_i * 2
+      setting :temperature, default: ENV.fetch('OPENAI_TEMPERATURE', 1).to_i
+    end
+
+    setting :run_agent_retries, default: ENV.fetch('RUN_AGENT_RETRIES', 5).to_i
+
+    setting :logger, reader: true,
+            default: ENV.fetch('LOG_LEVEL', Logger::INFO),
+            constructor: proc { |value|
+              logger = Logger.new(STDOUT)
+              logger.level = value
+              logger
+            }
+  end
+end
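The new `Application` module centralizes what were scattered `ENV.fetch` calls (in `init.rb`, `ChatGPT#initialize`, and the old `OpenAI::Client`) into one dry-configurable tree read once at load time. The same shape can be sketched without the gem, using nested `OpenStruct`s to stand in for the `setting` blocks (illustrative only; the real code uses `Dry::Configurable` and also reads `OPENAI_ACCESS_TOKEN`, omitted here because it has no default):

```ruby
require 'ostruct'

# Plain-Ruby stand-in for Luogu::Application's nested settings:
# every knob gets a default pulled from ENV once, at load time.
OPENAI_DEFAULTS = {
  retries: ENV.fetch('OPENAI_REQUEST_RETRIES', 3).to_i,
  host: ENV.fetch('OPENAI_HOST', 'https://api.openai.com'),
  history_limit: ENV.fetch('OPENAI_LIMIT_HISTORY', '6').to_i * 2
}.freeze

config = OpenStruct.new(
  openai: OpenStruct.new(OPENAI_DEFAULTS),
  run_agent_retries: ENV.fetch('RUN_AGENT_RETRIES', 5).to_i
)
```

Callers then read `config.openai.history_limit` instead of re-parsing ENV, which is exactly how `AgentRunner` and `ChatLLM` consume `Application.config` elsewhere in this diff.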
data/lib/luogu/base.rb ADDED
@@ -0,0 +1,17 @@
+# frozen_string_literal: true
+
+module Luogu
+
+  class Base
+    extend Dry::Configurable
+
+    def logger
+      Application.logger
+    end
+
+    def config
+      self.class.config
+    end
+  end
+
+end
data/lib/luogu/chatgpt.rb → data/lib/luogu/chatllm.rb RENAMED
@@ -1,64 +1,76 @@
 module Luogu
-  class ChatGPT
+  class ChatLLM < Base
+
+    setting :provider do
+      setting :parameter_model, default: ->() {
+        OpenAI::ChatRequestParams.new(
+          model: 'gpt-3.5-turbo',
+          temperature: Application.config.openai.temperature
+        )
+      }
+      setting :request, default: ->(params) { OpenAI.chat(params: params) }
+      setting :parse, default: ->(response) { OpenAI.get_content(response) }
+      setting :find_final_answer, default: ->(content) { OpenAI.find_final_answer(content) }
+      setting :history_limit, default: Application.config.openai.history_limit
+    end

-    attr_accessor :limit_history, :prompt, :row_history, :history, :temperature, :model_name, :context
+    attr_accessor :context
+    attr_reader :plugin

     def initialize(file, history_path='.', plugin_file_path=nil)
       @plugin_file_path = plugin_file_path || file.sub(File.extname(file), ".plugin.rb")
-
       if File.exist?(@plugin_file_path)
         @plugin = Plugin.new(@plugin_file_path).load()
       else
         @plugin = Plugin.new(@plugin_file_path)
       end
-
-      @temperature = ENV.fetch('OPENAI_TEMPERATURE', '0.7').to_f
-      @limit_history = ENV.fetch('OPENAI_LIMIT_HISTORY', '6').to_i * 2
-      @model_name = "gpt-3.5-turbo"
-
       @history_path = history_path
       @prompt_file = file
-
       @prompt = PromptParser.new(file)
       @row_history = []
-      @history = HistoryQueue.new @limit_history
+      @histories = HistoryQueue.new provider.history_limit
+
+      @request_params = provider.parameter_model.call

       @context = OpenStruct.new

-      if @plugin.setup_proc
-        @plugin.setup_proc.call(self, @context)
-      end
+      run_plugin :setup
+    end
+
+    def run_plugin(method_name, &block)
+      plugin.run method_name: method_name, llm: self, context: @context, &block
+    end
+
+    def provider
+      config.provider
     end

     def request(messages)
-      params = {
-        model: @model_name,
-        messages: messages,
-        temperature: @temperature,
-      }
-
-      if @plugin.before_request_proc
-        @context.request_params = params
-        params = @plugin.before_request_proc.call(self, @context).request_params
+      @request_params.messages = messages
+      @context.request_params = @request_params
+      run_plugin :before_request
+      response = provider.request.call(@request_params.to_h)
+      unless response.code == 200
+        logger.error response.body.to_s
+        raise RequestError
       end
-      response = client.chat(parameters: params).parse
       @context.response = response
-      @plugin.after_request_proc.call(self, @context) if @plugin.after_request_proc
-      response.dig("choices", 0, "message", "content")
+      run_plugin :after_request
+
+      provider.parse.call(response)
     end

     def chat(user_message)
-      if @plugin.before_input_proc
-        @context.user_input = user_message
-        user_message = @plugin.before_input_proc.call(self, @context).user_input
-      end
-      messages = (@prompt.render + @history.to_a) << {role: "user", content: user_message}
-      if @plugin.after_input_proc
-        @context.request_messages = messages
-        messages = @plugin.after_input_proc.call(self, @context).request_messages
+      @context.user_input = user_message
+      run_plugin :before_input do |context|
+        user_message = context.user_input
       end

-      assistant_message = self.request(messages)
+      messages = (@prompt.render + @histories.to_a) << { role: "user", content: user_message}
+      run_plugin :after_input do
+        messages = @context.request_messages
+      end
+      assistant_message = request(messages)

       self.push_row_history(user_message, assistant_message)

@@ -68,12 +80,15 @@ module Luogu
       elsif @plugin.before_save_history_proc
         @context.user_input = user_message
         @context.response_message = assistant_message
-        @plugin.before_save_history_proc.call(self, @context)
+
+        run_plugin :before_save_history
       else
         puts "执行默认的历史记录"
         self.push_history(user_message, assistant_message)
       end

+      run_plugin :after_save_history
+
       assistant_message
     end

@@ -83,8 +98,8 @@ module Luogu
     end

     def push_history(user_message, assistant_message)
-      @history.enqueue({role: "user", content: user_message})
-      @history.enqueue({role: "assistant", content: assistant_message})
+      @histories.enqueue({ role: "user", content: user_message})
+      @histories.enqueue({ role: "assistant", content: assistant_message})
       if @plugin.after_save_history_proc
         @context.user_input = user_message
         @context.response_message = response_message
@@ -118,11 +133,11 @@ module Luogu
       when "save"
         file_name = File.basename(@prompt_file, ".*")
         self.class.save @row_history, File.join(@history_path, "#{file_name}.row_history.md")
-        self.class.save @history.to_a, File.join(@history_path, "#{file_name}.history.md")
+        self.class.save @histories.to_a, File.join(@history_path, "#{file_name}.history.md")
       when "row history"
         p @row_history
       when "history"
-        p @history.to_a
+        p @histories.to_a
       when "exit"
         puts "再见!"
         break
@@ -147,7 +162,7 @@ module Luogu
       file_name = File.basename(@prompt_file, ".*")

       self.class.save @row_history, File.join(@history_path, "#{file_name}-#{now}.row_history.md")
-      self.class.save @history.to_a, File.join(@history_path, "#{file_name}-#{now}.history.md")
+      self.class.save @histories.to_a, File.join(@history_path, "#{file_name}-#{now}.history.md")
     end

     class << self
data/lib/luogu/cli.rb CHANGED
@@ -35,7 +35,7 @@ module Luogu
     option :plugin, type: :string, desc: "运行的时候载入对应的插件"

     def call(prompt_file: nil, **options)
-      chatgpt = ChatGPT.new(prompt_file, options.fetch(:out), options.fetch(:plugin, nil))
+      chatgpt = ChatLLM.new(prompt_file, options.fetch(:out), options.fetch(:plugin, nil))
       chatgpt.run
     end

@@ -51,7 +51,7 @@ module Luogu
       json = JSON.parse(File.read(json_file), symbolize_names: true)
       prompt_file ||= json_file.sub(File.extname(json_file), ".md")

-      chatgpt = ChatGPT.save(json, prompt_file)
+      chatgpt = ChatLLM.save(json, prompt_file)
     end

   end
@@ -67,7 +67,7 @@ module Luogu
     def call(prompt_file: nil, test_file: nil, **options)
       test_file ||= prompt_file.sub(File.extname(prompt_file), ".test.yml")

-      chatgpt = ChatGPT.new(prompt_file, options.fetch(:out), options.fetch(:plugin, nil))
+      chatgpt = ChatLLM.new(prompt_file, options.fetch(:out), options.fetch(:plugin, nil))
       messages = YAML.load_file(test_file)
       chatgpt.playload messages
     end
@@ -83,7 +83,7 @@ module Luogu
     end
   end

-  register "version", Version, aliases: ["v", "-v", "--version"]
+  register "version", Version, aliases: %w[v -v --version]
   register "build", Build, aliases: ["b"]
   register "run", Run, aliases: ["r"]
   register "generate", Generate, aliases: ["g"]
data/lib/luogu/error.rb ADDED
@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+module Luogu
+  class AssertionError < StandardError
+  end
+
+  class RequestError < StandardError
+  end
+
+  class PluginError < StandardError
+  end
+end
data/lib/luogu/init.rb CHANGED
@@ -3,40 +3,31 @@ require 'dotenv/load'
 require "tty-prompt"
 require 'json'
 require 'yaml'
-require "dry/cli"
 require 'fileutils'
 require 'ostruct'
 require 'benchmark'
 require 'erb'
 require 'logger'

+require "dry/cli"
+require 'dry-configurable'
+
+require_relative 'base'
+require_relative 'error'
+require_relative 'application'
+
 require_relative "prompt_template"
-require_relative 'openai'
 require_relative 'plugin'
 require_relative 'history_queue'
 require_relative "prompt_parser"
-require_relative "chatgpt"
+require_relative "chatllm"
 require_relative "cli"

 require_relative "agent"
-require_relative "session"
 require_relative "agent_runner"

-
-
-def client
-  $client ||= Luogu::OpenAI::Client.new
-end
-
-def logger_init
-  logger = Logger.new(STDOUT)
-  logger.level = ENV['LOG_LEVEL'] ? Logger.const_get(ENV['LOG_LEVEL']) : Logger::INFO
-  logger
-end
-
-def logger
-  $logger ||= logger_init
-end
+require_relative 'openai'
+require_relative 'terminal'

 class String
   def cover_chinese
data/lib/luogu/openai.rb CHANGED
@@ -1,14 +1,10 @@
-module Luogu::OpenAI
-
-  class ChatRequestBody < Struct.new(:model, :messages, :temperature,
-    :top_p, :n, :stream, :stop, :max_tokens,
-    :presence_penalty, :frequency_penalty, :logit_bias, :user)
-
-    def initialize(*args)
-      defaults = { model: "gpt-3.5-turbo", messages: []}
-      super(*defaults.merge(args.first || {}).values_at(*self.class.members))
-    end
+# frozen_string_literal: true

+module Luogu::OpenAI
+  class ChatRequestParams < Struct.new(:model, :messages, :temperature,
+                                       :top_p, :n, :stream, :stop,
+                                       :max_tokens, :presence_penalty,
+                                       :frequency_penalty, :logit_bias, :user)
     def to_h
       super.reject { |_, v| v.nil? }
     end
@@ -16,31 +12,67 @@ module Luogu::OpenAI
     alias to_hash to_h
   end

-  class RequestError < StandardError
+  def chat(parameters: nil, params: nil, retries: nil)
+    params ||= parameters
+    retries_left = retries || Luogu::Application.config.openai.retries
+    begin
+      client.post('/v1/chat/completions', json: params)
+    rescue HTTP::Error => e
+      if retries_left > 0
+        puts "retrying ..."
+        retries_left -= 1
+        sleep(1)
+        retry
+      else
+        puts "Connection error #{e}"
+        return nil
+      end
+    end
+  end
+
+  def client
+    @client ||= HTTP.auth("Bearer #{Luogu::Application.config.openai.access_token}")
+                    .persistent Luogu::Application.config.openai.host
   end

-  class Client
-    def initialize
-      @access_token = ENV.fetch('OPENAI_ACCESS_TOKEN')
-      @client = HTTP.auth("Bearer #{@access_token}").persistent "https://api.openai.com"
+  def parse_json(markdown)
+    json_regex = /```json(.+?)```/im
+    json_blocks = markdown.scan(json_regex)
+    result_json = nil
+    json_blocks.each do |json_block|
+      json_string = json_block[0]
+      result_json = JSON.parse(json_string)
     end

-    def chat(parameters: nil, params: nil, retries: 3)
-      params ||= parameters
-      retries_left = ENV.fetch('OPENAI_REQUEST_RETRIES', retries)
-      begin
-        @client.post('/v1/chat/completions', json: params)
-      rescue HTTP::Error => e
-        if retries_left > 0
-          puts "retrying ..."
-          retries_left -= 1
-          sleep(1)
-          retry
-        else
-          puts "Connection error #{e}"
-          return nil
-        end
+    if result_json.nil?
+      JSON.parse markdown
+    else
+      result_json
+    end
+  end
+
+  def chat_response_handle(response)
+    parse_json get_content(response)
+  end
+
+  def get_content(response)
+    response.parse.dig("choices", 0, "message", "content")
+  end
+
+  def find_final_answer(content)
+    if content.is_a?(Hash) && content['action'] == 'Final Answer'
+      content['action_input']
+    elsif content.is_a?(Array)
+      result = content.find { |element| element["action"] == "Final Answer" }
+      if result
+        result["action_input"]
+      else
+        nil
       end
+    else
+      nil
     end
   end
+
+  module_function :chat, :client, :parse_json, :chat_response_handle, :find_final_answer, :get_content
 end
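The helpers moved into `Luogu::OpenAI` as module functions are pure and easy to exercise on their own: `parse_json` prefers a fenced ```json block and falls back to parsing the whole string, and `find_final_answer` digs the `Final Answer` action out of either a hash or an array of actions. A condensed, self-contained version of the two (same regex and dig logic as the diff, reformatted):

```ruby
require 'json'

# Condensed from Luogu::OpenAI.parse_json: take the last fenced ```json
# block if any, otherwise treat the whole string as JSON.
def parse_json(markdown)
  result_json = nil
  markdown.scan(/```json(.+?)```/im).each do |block|
    result_json = JSON.parse(block[0])
  end
  result_json.nil? ? JSON.parse(markdown) : result_json
end

# Condensed from Luogu::OpenAI.find_final_answer: a "Final Answer" action
# may arrive as a single hash or as one element of an action array.
def find_final_answer(content)
  if content.is_a?(Hash) && content['action'] == 'Final Answer'
    content['action_input']
  elsif content.is_a?(Array)
    content.find { |e| e['action'] == 'Final Answer' }&.fetch('action_input', nil)
  end
end
```

Returning `nil` from `find_final_answer` is what routes `AgentRunner#request` into another `run_agents` round instead of finishing.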
data/lib/luogu/plugin.rb CHANGED
@@ -1,52 +1,40 @@
 module Luogu
   class Plugin
-    attr_reader :before_input_proc, :before_save_history_proc, :after_input_proc, :after_save_history_proc,
-                :setup_proc, :before_request_proc, :after_request_proc
+    attr_reader :plugin_file_path, :before_input_proc, :before_save_history_proc, :after_input_proc,
+                :after_save_history_proc, :setup_proc, :before_request_proc, :after_request_proc

-    def initialize(plugin_file_path)
-      @plugin_file_path = plugin_file_path
-
-      @before_input_proc = nil
-      @before_save_history_proc = nil
-
-      @after_input_proc = nil
-      @after_save_history_proc = nil
-
-      @setup_proc = nil
-    end
-
-    def before_input(&block)
-      @before_input_proc = block
-    end
-
-    def before_save_history(&block)
-      @before_save_history_proc = block
-    end
-
-    def after_input(&block)
-      @after_input_proc = block
-    end
-
-    def after_save_history(&block)
-      @after_save_history_proc = block
-    end
-
-    def setup(&block)
-      @setup_proc = block
+    # 定义一个元编程方法来动态定义属性设置方法
+    def self.define_attr_setter(attr_name)
+      define_method("#{attr_name}") do |&block|
+        instance_variable_set("@#{attr_name}_proc", block)
+      end
     end

-    def before_request(&block)
-      @before_request_proc = block
+    def initialize(plugin_file_path)
+      @plugin_file_path = plugin_file_path
     end

-    def after_request(&block)
-      @after_request_proc = block
-    end
+    # 使用元编程方法来动态定义属性设置方法
+    define_attr_setter :before_input
+    define_attr_setter :before_save_history
+    define_attr_setter :after_input
+    define_attr_setter :after_save_history
+    define_attr_setter :setup
+    define_attr_setter :before_request
+    define_attr_setter :after_request

     def load()
-      self.instance_eval File.read(@plugin_file_path)
+      self.instance_eval File.read(plugin_file_path)
       self
     end
-
+
+    def run(method_name: nil, llm: nil, context: nil, &block)
+      method_name = "#{method_name}_proc".to_sym
+      if send(method_name).respond_to?(:call)
+        rt = send(method_name).call(llm, context)
+        block.call(rt) unless block.nil?
+        rt
+      end
+    end
   end
-end
+end
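The `define_attr_setter` metaprogramming above collapses seven hand-written hook-setter methods into one class macro: each declared name becomes a DSL method that captures its block into `@<name>_proc`, and `run` only invokes a hook if one was registered. A self-contained sketch of the same pattern (a stand-in class, not the gem's `Plugin`, and with simpler `run` arguments):

```ruby
# Standalone sketch of Plugin's hook DSL: each declared hook name becomes
# a method that stores its block in @<name>_proc, read back via attr_reader.
class MiniPlugin
  def self.define_attr_setter(attr_name)
    attr_reader "#{attr_name}_proc"
    define_method(attr_name) do |&block|
      instance_variable_set("@#{attr_name}_proc", block)
    end
  end

  define_attr_setter :before_input
  define_attr_setter :after_request

  # Mirrors Plugin#run: call the stored proc only if one was registered.
  def run(name, *args)
    hook = send("#{name}_proc")
    hook.call(*args) if hook.respond_to?(:call)
  end
end

plugin = MiniPlugin.new
plugin.before_input { |text| text.upcase }   # register a hook, DSL-style
```

An unregistered hook simply returns `nil` from `run`, which is why `ChatLLM#run_plugin` can call every lifecycle hook unconditionally.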
data/lib/luogu/terminal.rb ADDED
@@ -0,0 +1,40 @@
+# frozen_string_literal: true
+
+module Luogu
+  class Terminal < Base
+    def initialize(desc: "请输入你的指令>")
+      @desc = desc
+      @default_action = nil
+      @actions = []
+    end
+
+    def action(action_name, &block)
+      @actions << { action_name: action_name, action_method: block }
+    end
+
+    def default(&block)
+      @default_action = block
+    end
+
+    def run
+      loop do
+        # 从命令行读取输入
+        input = ask(@desc).cover_chinese
+
+        if input == 'exit'
+          break
+        end
+
+        if (action = @actions.find { |h| h[:action_name].to_s == input })
+          action.fetch(:action_method)&.call(input)
+        else
+          @default_action&.call(input)
+        end
+      end
+    end
+
+    def ask(message)
+      TTY::Prompt.new.ask(message)
+    end
+  end
+end
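`Terminal`'s dispatch order is: named actions first, then the `default` block, with `exit` breaking the loop. Since the real class blocks on `TTY::Prompt`, here is a non-interactive sketch of the same dispatch that feeds inputs from an array instead (stand-in class for illustration, not the gem's `Terminal`):

```ruby
# Non-interactive sketch of Terminal's dispatch loop: named actions are
# matched first, anything else falls through to the default block, and
# 'exit' ends the loop. Inputs come from an array instead of TTY::Prompt.
class MiniTerminal
  attr_reader :log

  def initialize
    @actions = []
    @default_action = nil
    @log = []
  end

  def action(name, &block)
    @actions << { name: name.to_s, block: block }
  end

  def default(&block)
    @default_action = block
  end

  def run(inputs)
    inputs.each do |input|
      break if input == 'exit'
      if (action = @actions.find { |h| h[:name] == input })
        action[:block]&.call(input)
      else
        @default_action&.call(input)
      end
    end
  end
end

term = MiniTerminal.new
term.action(:save) { |_| term.log << "saved" }
term.default { |input| term.log << "chat:#{input}" }
term.run(["hello", "save", "exit", "never reached"])
```

This is the shape the `run` REPL in `ChatLLM` (`save` / `row history` / `history` / `exit`) can now be expressed in, instead of a hand-written `case` statement.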
data/lib/luogu/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true

 module Luogu
-  VERSION = "0.1.19"
+  VERSION = "0.2.0"
 end
data/luogu.gemspec CHANGED
@@ -36,6 +36,7 @@ Gem::Specification.new do |spec|
   spec.add_dependency 'tty-prompt', '~> 0.23.1'
   spec.add_dependency 'dry-cli', '~> 1.0'
   spec.add_dependency 'http', '~> 5.1', '>= 5.1.1'
+  spec.add_dependency 'dry-configurable', '~> 1.0', '>= 1.0.1'

   # For more information and examples about making a new gem, check out our
   # guide at: https://bundler.io/guides/creating_gem.html
data/prompt.md ADDED
@@ -0,0 +1,2 @@
+@s
+你是一个Ruby开源方向的专家,接下来我发给你我的ReadME,你帮我优化
data/prompt.plugin.rb ADDED
@@ -0,0 +1,9 @@
+before_input do |gpt, context|
+  context.user_input = File.read("./README.md")
+  context
+end
+
+after_request do |gpt, context|
+  puts context.response
+  context
+end
data/sig/luogu/client.rbs ADDED
@@ -0,0 +1,9 @@
+module Luogu
+  class Client
+    self.@client: untyped
+
+    def self.client: -> untyped
+
+    def chat: -> untyped
+  end
+end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: luogu
 version: !ruby/object:Gem::Version
-  version: 0.1.19
+  version: 0.2.0
 platform: ruby
 authors:
 - MJ
 autorequire:
 bindir: exe
 cert_chain: []
-date: 2023-04-21 00:00:00.000000000 Z
+date: 2023-04-22 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: dotenv
@@ -78,6 +78,26 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: 5.1.1
+- !ruby/object:Gem::Dependency
+  name: dry-configurable
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.0'
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.0.1
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.0'
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.0.1
 description: 使用markdown来快速实现 Prompt工程研发
 email:
 - tywf91@gmail.com
@@ -88,27 +108,34 @@ extra_rdoc_files: []
 files:
 - Gemfile
 - Gemfile.lock
+- LICENSE
 - README.md
 - Rakefile
 - exe/luogu
 - lib/luogu.rb
 - lib/luogu/agent.rb
 - lib/luogu/agent_runner.rb
-- lib/luogu/chatgpt.rb
+- lib/luogu/application.rb
+- lib/luogu/base.rb
+- lib/luogu/chatllm.rb
 - lib/luogu/cli.rb
+- lib/luogu/error.rb
 - lib/luogu/history_queue.rb
 - lib/luogu/init.rb
 - lib/luogu/openai.rb
 - lib/luogu/plugin.rb
 - lib/luogu/prompt_parser.rb
 - lib/luogu/prompt_template.rb
-- lib/luogu/session.rb
 - lib/luogu/templates/agent_input.md.erb
 - lib/luogu/templates/agent_system.md.erb
 - lib/luogu/templates/agent_tool_input.md.erb
+- lib/luogu/terminal.rb
 - lib/luogu/version.rb
 - luogu.gemspec
+- prompt.md
+- prompt.plugin.rb
 - sig/luogu.rbs
+- sig/luogu/client.rbs
 homepage: https://github.com/mjason/luogu
 licenses: []
 metadata:
data/lib/luogu/session.rb DELETED
@@ -1,4 +0,0 @@
-module Luogu
-  class Session
-  end
-end