action_ai 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+ metadata.gz: 026dc5bbd0ff082b147fc05e6e9e2784d0f89da393249857c383e38da57c79ac
+ data.tar.gz: ffeec65357f212fd305ad480a7a5f35c2b96ba9880acfcafbdaf437885f532b0
+ SHA512:
+ metadata.gz: 742b0b2a62dc5ab713b5fa3cb64aa255a197835b0ef027f6e79bc0903f4c3497f1b13836a2792dfd75852423d08dd7fd65dbc81108c92d143b6f812a64ac9172
+ data.tar.gz: 9999ac05bfd4eca661dbe84156b2d06a8a6ab0006ae64d7bc2d2f0c334b125e62ea8ed4f03ba24c6248f90b0d305ca9848a10cfa19040c06815face10bdc251a
data/CHANGELOG.md ADDED
@@ -0,0 +1,23 @@
+ # Changelog
+
+ All notable changes to this project will be documented in this file.
+
+ ## [0.1.0] - 2026-05-05
+
+ Refactored from Action Mailer.
+
+ ### Added
+
+ - In-memory AI testing provider (see `RubyLLM::Tester`).
+
+ ### Changed
+
+ - Forked from Action Mailer and renamed framework internals to `ActionAI`.
+ - Decoupled the package identity and gem structure toward standalone Action AI usage.
+ - Introduced `ActionAI::Agent` as the primary base class and rewired the public API around `ask` / prompt execution instead of email delivery.
+ - Updated execution pipeline, callbacks, rescuable flow, previews, log subscriber, parameterized execution, test helpers, and railtie integration to the new agent lifecycle.
+ - Renamed helper/generator surface from mailer to agent (`mail_helper` -> `prompt_helper`; `rails g mailer` templates replaced with `rails g ai` templates).
+
+ ### Removed
+
+ - Email-delivery-first concepts from the primary API surface (headers/delivery method workflow) in favor of prompt and model execution.
data/MIT-LICENSE ADDED
@@ -0,0 +1,21 @@
+ Copyright (c) David Heinemeier Hansson
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
data/README.rdoc ADDED
@@ -0,0 +1,107 @@
+ = Action AI -- Easy AI prompt execution and testing
+
+ Action AI is a framework for designing AI interaction layers. These layers
+ are used to consolidate prompt generation and execution in one place,
+ instead of scattering provider calls across controllers, jobs, and models.
+
+ Action AI is in essence a wrapper around Action Controller and the
+ RubyLLM gem. It provides a way to make AI prompts using templates in the same
+ way that Action Controller renders views using templates.
+
+ The architecture is intentionally modeled after Action Mailer: class-level
+ actions, view-backed templates, and lazy execution. Action AI rebuilds that
+ shape for AI interactions rather than building an unrelated API from scratch.
+
+ == Sending prompts
+
+ The framework works by initializing any instance variables you want to be
+ available in the prompt template.
+
+ This can be as simple as:
+
+   class Generator < ApplicationAI
+     default model: "gpt-4o"
+
+     def code(task, language)
+       @task = task
+       @language = language
+       ask
+     end
+   end
+
+ After the action method completes, the framework will automatically:
+ 1. Render the prompt from the corresponding template (e.g., +app/ai/prompts/generator/code.erb+)
+ 2. Send it to the configured AI model via +ask+
+ 3. Return an +ActionAI::Interaction+ object
+
+ The prompt text is created by using an Action View template (regular
+ ERB) that has access to the instance variables declared in the agent action.
+
+ So the corresponding template for the +code+ method above could look like this:
+
+   You are an expert <%= @language.to_s.camelize %> developer.
+   Write clean, well-commented code for the following task:
+
+   <%= @task %>
+
+ If the task description was "Parse a CSV file and return unique values", the rendered prompt
+ would look like this:
+
+   You are an expert Ruby developer.
+   Write clean, well-commented code for the following task:
+
+   Parse a CSV file and return unique values
+
+ To execute a prompt, call the agent method and then call +content+ on the
+ return value to get the result, or +run+ to execute it without reading the response.
+
+ Calling the method returns an +ActionAI::Interaction+ object:
+
+   prompt = Generator.code("Parse CSV and dedupe", :ruby) # => Returns an ActionAI::Interaction object
+   prompt.run # => executes the prompt
+
+ Or you can just chain the methods together like:
+
+   Generator.code("Parse CSV and dedupe", :ruby).content # Returns AI's response for the prompt
+
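The lazy execution described above can be sketched in plain Ruby, without Action AI or RubyLLM installed. `FakeAgent` and the simplified `Interaction` below are illustrative stand-ins, not the gem's real classes:

```ruby
# Minimal sketch of lazy prompt execution: calling the class method
# builds a wrapper, and nothing runs until #run or #content is called.
class Interaction
  def initialize(&block)
    @block = block
    @result = nil
  end

  # Executes the wrapped action once and memoizes the result.
  def run
    @result ||= @block.call
  end
  alias content run
end

class FakeAgent
  # Stand-in for an agent action: returns an Interaction instead of
  # contacting a model immediately.
  def self.code(task, language)
    Interaction.new { "#{language}: #{task}" }
  end
end

prompt = FakeAgent.code("Parse CSV", :ruby) # nothing executed yet
prompt.content # => "ruby: Parse CSV"
```

The real gem layers template rendering and a model call inside the deferred block, but the call-site ergonomics are the same.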
+ == Setting defaults
+
+ It is possible to set default values that will be used by every method in your
+ Action AI agent class. To do this, call the public class method +default+,
+ which you get for free from ActionAI::Agent. This method accepts a Hash as
+ its parameter. You can use any options supported by +RubyLLM::Chat+, such as
+ +:provider+ and +:model+. Finally, it is also possible to pass in a Proc that
+ will be evaluated when it is needed.
+
+ Note that every value you set with this method will be overwritten if you use
+ the same key in your agent method.
+
+ Example:
+
+   class Generator < ApplicationAI
+     default model: proc { Current.user.preferred_model }
+   end
+
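The merge-and-override semantics above can be sketched in plain Ruby. This is an illustrative approximation of the behavior (per-call options win over class defaults, procs are evaluated lazily), not the gem's actual implementation:

```ruby
# Class-level defaults; a proc is only evaluated at prompt-generation time.
DEFAULTS = { model: proc { "gpt-4o" }, provider: :openai }

# Merge defaults into per-call config: keys given per call override
# defaults, and remaining proc defaults are resolved to values.
def apply_defaults(config = {})
  computed = DEFAULTS.except(*config.keys).transform_values do |value|
    value.is_a?(Proc) ? value.call : value
  end
  computed.merge(config)
end

apply_defaults
# => { model: "gpt-4o", provider: :openai }
apply_defaults(model: "gpt-4o-mini")
# => { provider: :openai, model: "gpt-4o-mini" }
```

Note that a proc default for a key supplied per call is never evaluated at all, since it is dropped before the `transform_values` pass.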
+ == Configuration
+
+ The Agent class has the full list of configuration options. Here's an example:
+
+   ActionAI::Agent.default_options = {
+     provider: :openai,
+     model: "gpt-4o-mini"
+   }
+
+
+ == Download and installation
+
+ The latest version of Action AI can be installed with RubyGems:
+
+   $ gem install action_ai
+
+
+ == License
+
+ Action AI is released under the MIT license:
+
+ * https://opensource.org/licenses/MIT
@@ -0,0 +1,465 @@
+ # frozen_string_literal: true
+
+ require "ruby_llm"
+ require "active_support/core_ext/string/inflections"
+ require "active_support/core_ext/hash/except"
+ require "active_support/core_ext/module/anonymous"
+ require "memery"
+
+ require "action_ai/log_subscriber"
+ require "action_ai/rescuable"
+
+ module ActionAI
+   # = Action AI \Agent
+   #
+   # Action AI allows you to send AI prompts from your application using an agent model and views.
+   #
+   # == Agent Models
+   #
+   # To use Action AI, you need to create an agent model.
+   #
+   #   $ bin/rails generate ai:agent Generator
+   #
+   # The generated model inherits from <tt>ApplicationAI</tt> which in turn
+   # inherits from +ActionAI::Agent+. An agent model defines methods
+   # used to generate an AI prompt. In these methods, you can set up variables to be used in
+   # the prompt views, options on the AI model used such as the <tt>:model</tt> ID, and attachments.
+   #
+   #   class ApplicationAI < ActionAI::Agent
+   #     default model: 'gpt-4o-mini',
+   #             provider: :openai
+   #   end
+   #
+   #   class Generator < ApplicationAI
+   #     default model: 'gpt-4o'
+   #
+   #     def code(task, language)
+   #       @task = task
+   #       @language = language
+   #     end
+   #   end
+   #
+   # Within the agent method, you have access to the following methods:
+   #
+   # * <tt>attachments</tt> - Allows you to add attachments to your prompt in an intuitive
+   #   manner; <tt>attachments << 'url_or_path/to/filename.png'</tt>
+   #
+   # * <tt>ask</tt> - Allows you to specify a prompt to be sent.
+   #   Like +render+ in Action Controller, this is optional.
+   #
+   # The +ask+ method uses +RubyLLM::Chat#ask+ under the hood and provides the same API,
+   # except that +prompt+ is optional. If the +prompt+ is not specified, it will be rendered from the view.
+   #
+   # == Prompt views
+   #
+   # Like Action Controller, each agent class has a corresponding view directory in which each
+   # method of the class looks for a template with its name.
+   #
+   # To define a template to be used with an agent, create an <tt>.erb</tt> file with the same
+   # name as the method in your agent model. For example, in the agent defined above, the template at
+   # <tt>app/ai/prompts/generator/code.erb</tt> would be used to generate the prompt.
+   #
+   # Variables defined in the methods of your agent model are accessible as instance variables in their
+   # corresponding view.
+   #
+   # Prompts by default are sent in plain text, so a sample view for our model example might look like this:
+   #
+   #   You are an expert <%= @language.to_s.camelize %> developer.
+   #   Write clean, well-commented code to accomplish the following:
+   #   <%= @task %>
+   #
+   # You can even use Action View helpers in these views. For example:
+   #
+   #   You are an expert <%= t @language, scope: 'languages' %> developer.
+   #   Write clean, well-commented code to accomplish the following:
+   #   <%= @task %>
+   #
+   # If you need to access the AI provider or model in the view, you can do that through the chat object:
+   #
+   #   You are a <%= chat.model.name %> model, an expert <%= @language.to_s.camelize %> developer.
+   #   Write clean, well-commented code to accomplish the following:
+   #   <%= @task %>
+   #
+   # == Generating URLs
+   #
+   # URLs can be generated in agent views using <tt>url_for</tt> or named routes. Unlike controllers from
+   # Action Pack, the agent instance doesn't have any context about the incoming request, so you'll need
+   # to provide all of the details needed to generate a URL.
+   #
+   # When using <tt>url_for</tt> you'll need to provide the <tt>:host</tt>, <tt>:controller</tt>, and <tt>:action</tt>:
+   #
+   #   <%= url_for(host: "example.com", controller: "projects", action: "show", id: @project.id) %>
+   #
+   # When using named routes you only need to supply the <tt>:host</tt>:
+   #
+   #   <%= project_url(@project, host: "example.com") %>
+   #
+   # You should use the <tt>named_route_url</tt> style (which generates absolute URLs) and avoid using the
+   # <tt>named_route_path</tt> style (which generates relative URLs), since a model reading the prompt will
+   # have no concept of a current URL from which to determine a relative path.
+   #
+   # It is also possible to set a default host that will be used in all agents by setting the <tt>:host</tt>
+   # option as a configuration option in <tt>config/application.rb</tt>:
+   #
+   #   config.action_ai.default_url_options = { host: "example.com" }
+   #
+   # You can also define a <tt>default_url_options</tt> method on individual agents to override these
+   # default settings per-agent.
+   #
+   # By default when <tt>config.force_ssl</tt> is +true+, URLs generated for hosts will use the HTTPS protocol.
+   #
+   # == Sending prompts
+   #
+   # Once an agent action and template are defined, you can chat with an AI model using your prompt
+   # or defer its creation and all the interactions for later:
+   #
+   #   Generator.code("Parse CSV", :ruby).content # returns the generated code
+   #   prompt = Generator.code("Parse CSV", :ruby) # => an ActionAI::Interaction object
+   #   prompt.run # generates and executes the prompt now
+   #
+   # The ActionAI::Interaction class is a wrapper around a lazily evaluated delegate that will call
+   # your method to generate the prompt. If you want direct access to the underlying +RubyLLM::Message+,
+   # you can call the <tt>message</tt> method on the ActionAI::Interaction object.
+   #
+   #   Generator.code("Parse CSV", :ruby).message # => a RubyLLM::Message object
+   #
+   # Action AI is nicely integrated with Active Job so you can generate and send prompts in the background
+   # (for example, outside of the request-response cycle, so the user doesn't have to wait on it):
+   #
+   #   Generator.code("Parse CSV", :ruby).later # enqueue the AI processing to Active Job
+   #
+   # Note that <tt>later</tt> will execute your method from the background job.
+   #
+   # You never instantiate your agent class. Rather, you just call the method you defined on the class itself.
+   #
+   # == Attachments
+   #
+   # Sending attachments with prompts is easy:
+   #
+   #   class Generator < ApplicationAI
+   #     def code(task, language, spec_file = nil)
+   #       @task = task
+   #       @language = language
+   #       attachments << spec_file if spec_file
+   #     end
+   #   end
+   #
+   # If you need to send attachments with no prompt text, you need to create an empty view for it,
+   # or pass an empty prompt explicitly:
+   #
+   #   class Generator < ApplicationAI
+   #     def code(spec_file)
+   #       attachments << spec_file
+   #       ask ""
+   #     end
+   #   end
+   #
+   # == Default \Hash
+   #
+   # Action AI provides some intelligent defaults for your AI interactions. These are usually specified in a
+   # +default+ method inside the class definition:
+   #
+   #   class Generator < ApplicationAI
+   #     default model: 'gpt-4o'
+   #   end
+   #
+   # You can pass in any config value that a +RubyLLM::Chat+ accepts.
+   #
+   # Finally, Action AI also supports passing <tt>Proc</tt> and <tt>Lambda</tt> objects into the default hash,
+   # so you can define methods that evaluate as the message is being generated:
+   #
+   #   class Generator < ApplicationAI
+   #     default model: -> { Current.user.preferred_model },
+   #             api_key: proc { Current.user.ai_api_key }
+   #   end
+   #
+   # Note that the proc/lambda is evaluated right at the start of the prompt generation, so if you
+   # set something in the default hash using a proc, and then set the same thing inside of your
+   # agent method, it will get overwritten by the agent method.
+   #
+   # It is also possible to set default options that will be used by all agents through
+   # the <tt>default_options=</tt> configuration in <tt>config/application.rb</tt>:
+   #
+   #   config.action_ai.default_options = { provider: :openai, model: "gpt-4o-mini" }
+   #
+   # == \Callbacks
+   #
+   # You can specify callbacks using <tt>before_action</tt> and <tt>after_action</tt> to manage your AI interactions,
+   # and <tt>before_execution</tt> and <tt>after_execution</tt> to wrap the prompt execution process.
+   # For example, when you want to add default attachments and log execution for all prompts
+   # executed by a certain agent class:
+   #
+   #   class Generator < ApplicationAI
+   #     before_action :add_shared_context!
+   #     after_execution :log_costs
+   #
+   #     def code(task, language)
+   #       @task = task
+   #       @language = language
+   #     end
+   #
+   #     private
+   #       def add_shared_context!
+   #         @context = Rails.root.join('ARCHITECTURE.md').read
+   #       end
+   #
+   #       def log_costs
+   #         Rails.logger.info "Generated code using #{message.input_tokens} input and #{message.output_tokens} output tokens."
+   #       end
+   #   end
+   #
+   # Action callbacks in Action AI Agent are implemented using
+   # AbstractController::Callbacks, so you can define and configure
+   # callbacks in the same manner that you would use callbacks in classes that
+   # inherit from ActionController::Base.
+   #
+   # Note that unless you have a specific reason to do so, you should prefer
+   # using <tt>before_action</tt> rather than <tt>after_action</tt> in your
+   # Action AI Agent classes for setup.
+   #
+   # == Rescuing Errors
+   #
+   # +rescue+ blocks inside of an agent method cannot rescue errors that occur
+   # outside of rendering -- for example, record deserialization errors in a
+   # background job.
+   #
+   # To rescue errors that occur during any part of the AI interaction process, use
+   # {rescue_from}[rdoc-ref:ActiveSupport::Rescuable::ClassMethods#rescue_from]:
+   #
+   #   class Generator < ApplicationAI
+   #     rescue_from RubyLLM::ApiQuotaExceededError do |error|
+   #       Rails.logger.warn "API quota exceeded: #{error.message}"
+   #     end
+   #
+   #     def code(task, language)
+   #       @task = task
+   #       @language = language
+   #     end
+   #   end
+   #
+   # == Previewing prompts
+   #
+   # You can preview your prompt templates visually by adding a prompt preview file to the
+   # <tt>ActionAI::Agent.preview_paths</tt>. Since prompts may do something interesting
+   # with database data, you may need to write some scenarios to load messages with fake data:
+   #
+   #   class GeneratorPreview < ActionAI::Preview
+   #     def code
+   #       Generator.code("Sort an array efficiently", :ruby)
+   #     end
+   #   end
+   #
+   # Methods must return a +RubyLLM::Message+ object, which can be generated by calling the agent
+   # method without the additional <tt>content</tt> / <tt>later</tt>. The location of the
+   # agent preview directories can be configured using the <tt>preview_paths</tt> option, which defaults
+   # to <tt>test/ai/agents/previews</tt>:
+   #
+   #   config.action_ai.preview_paths << "#{Rails.root}/lib/ai/previews"
+   #
+   # An overview of all previews is accessible at <tt>http://localhost:3000/rails/ai/agents</tt>
+   # on a running development server instance.
+   #
+   # == Configuration options
+   #
+   # These options are specified on the class level, like
+   # <tt>ActionAI::Agent.raise_execution_errors = true</tt>
+   #
+   # * <tt>default_options</tt> - You can pass this in at a class level as well as within the class itself as
+   #   per the above section.
+   #
+   # * <tt>logger</tt> - the logger is used for generating information on prompt execution if available.
+   #   Can be set to +nil+ for no logging. Compatible with both Ruby's own +Logger+ and Log4r loggers.
+   #
+   # * <tt>execution_job</tt> - The job class used with <tt>later</tt>. Agents can set this to use a
+   #   custom execution job. Defaults to +ActionAI::ExecutionJob+.
+   #
+   # * <tt>execute_later_queue_name</tt> - The queue name used by <tt>later</tt> with the default
+   #   <tt>execution_job</tt>. Agents can set this to use a custom queue name.
+   class Agent < AbstractController::Base
+     include Callbacks
+     include QueuedExecution
+     include Rescuable
+     include Parameterized
+     include Previews
+
+     abstract!
+
+     include AbstractController::Rendering
+
+     include AbstractController::Logger
+     include AbstractController::Helpers
+     include AbstractController::Translation
+     include AbstractController::AssetPaths
+     include AbstractController::Callbacks
+     include AbstractController::Caching
+
+     include ActionView::Layouts
+
+     include Memery
+
+     PROTECTED_IVARS = AbstractController::Rendering::DEFAULT_PROTECTED_INSTANCE_VARIABLES + [:@_action_has_layout]
+
+     helper ActionAI::PromptHelper
+
+     class_attribute :default_params, default: {
+       # none
+     }.freeze
+
+     class << self
+       # Returns the name of the current agent. This method is also used as a path for view lookup.
+       # If this is an anonymous agent, this method will return +anonymous+ instead.
+       def agent_name
+         @agent_name ||= anonymous? ? "anonymous" : name.underscore
+       end
+       # Allows setting the name of the current agent.
+       attr_writer :agent_name
+       alias :controller_path :agent_name
+
+       # Sets class-level defaults. Also allows setting defaults through app configuration:
+       #
+       #   config.action_ai.default_options = { provider: :openai }
+       def default(value = nil)
+         self.default_params = default_params.merge(value).freeze if value
+         default_params
+       end
+       alias :default_options= :default
+
+       private
+         def method_missing(method_name, ...)
+           if action_methods.include?(method_name.name)
+             Interaction.new(self, method_name, ...)
+           else
+             super
+           end
+         end
+
+         def respond_to_missing?(method, include_all = false)
+           action_methods.include?(method.name) || super
+         end
+     end
+
+     attr_internal :message
+
+     memoize def chat = RubyLLM.chat(**apply_defaults)
+
+     def process(method_name, *args) # :nodoc:
+       payload = {
+         agent: self.class.name,
+         action: method_name,
+         args: args
+       }
+
+       ActiveSupport::Notifications.instrument("process.action_ai", payload) do
+         super
+       end
+     end
+     ruby2_keywords(:process)
+
+     def response_body = message&.content
+
+     # Returns the name of the agent object.
+     def agent_name = self.class.agent_name
+
+     def prompt = render_to_string
+
+     # Allows you to add attachments to a prompt, like so:
+     #
+     #   attachments << '/path/to/filename.jpg'
+     #
+     def attachments = @attachments ||= []
+
+     # The main method that sends the rendered prompt to the AI model. It can be
+     # called with or without a block.
+     # If +prompt+ is omitted, it is rendered from the matching template.
+     #
+     # It accepts an optional +with:+ keyword for attachments:
+     #
+     # * <tt>:with</tt> - Array of file paths or URLs to attach to the prompt.
+     #
+     # You can set default model options using the ::default class method:
+     #
+     #   class Generator < ActionAI::Agent
+     #     default model: 'gpt-4o', provider: :openai
+     #   end
+     #
+     # By default, it will find a template in the view paths using the agent name and the
+     # name of the method it is being called from; it will then call
+     # +RubyLLM::Chat#ask+ and return the resulting +RubyLLM::Message+.
+     #
+     # For example:
+     #
+     #   class Generator < ActionAI::Agent
+     #     default model: 'gpt-4o'
+     #
+     #     def code(task, language)
+     #       @task = task
+     #       @language = language
+     #
+     #       ask # can be omitted, like +render+ in action controllers
+     #     end
+     #   end
+     #
+     # Will look for all templates at "app/ai/prompts/generator" with name "code".
+     # If no code template exists, it will raise an ActionView::MissingTemplate error.
+     #
+     # However, those can be customized:
+     #
+     #   ask render(template: 'shared/prompt')
+     #
+     # You can even render plain text directly without using a template:
+     #
+     #   ask("Write Ruby code for the following task: #{task}")
+     #
+     def ask(prompt = self.prompt, with: use_attachments, &)
+       @_message = chat.ask(prompt, with:, &)
+     end
+
+     private
+       # Prompts do not support relative path links.
+       def self.supports_path? # :doc:
+         false
+       end
+
+       def apply_defaults(config = {})
+         default_values = self.class.default.except(*config.keys).transform_values do |value|
+           compute_default(value)
+         end
+
+         config.reverse_merge(default_values)
+       end
+
+       def compute_default(value)
+         return value unless value.is_a?(Proc)
+
+         if value.arity == 1
+           instance_exec(self, &value)
+         else
+           instance_exec(&value)
+         end
+       end
+
+       def use_attachments
+         attachments.presence
+           .tap { @attachments = nil } # reset
+       end
+
+       # This and #instrument_name are used for caching instrumentation.
+       def instrument_payload(key)
+         {
+           agent: agent_name,
+           key: key
+         }
+       end
+
+       def instrument_name
+         "action_ai"
+       end
+
+       def _protected_ivars
+         PROTECTED_IVARS
+       end
+
+     ActiveSupport.run_load_hooks(:action_ai, self)
+   end
+ end
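The arity check in `compute_default` above (a one-argument proc receives the agent instance, a zero-argument proc is simply instance-evaluated) can be sketched standalone in plain Ruby; `Sketch` here is an illustrative stand-in, not part of the gem:

```ruby
# Plain-Ruby sketch of the compute_default arity handling.
class Sketch
  attr_reader :name

  def initialize(name)
    @name = name
  end

  # Non-procs pass through untouched; procs are evaluated in the
  # context of this instance, receiving it as an argument when arity is 1.
  def compute_default(value)
    return value unless value.is_a?(Proc)

    if value.arity == 1
      instance_exec(self, &value)
    else
      instance_exec(&value)
    end
  end
end

agent = Sketch.new("generator")
agent.compute_default("gpt-4o")         # => "gpt-4o" (plain value passes through)
agent.compute_default(-> { name })      # => "generator" (evaluated in the instance)
agent.compute_default(->(a) { a.name }) # => "generator" (receives the instance)
```

Accepting both arities lets callers write `proc { ... }` against instance state or `->(agent) { ... }` against an explicit argument, whichever reads better.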
@@ -0,0 +1,31 @@
+ # frozen_string_literal: true
+
+ module ActionAI
+   module Callbacks
+     extend ActiveSupport::Concern
+
+     included do
+       include ActiveSupport::Callbacks
+       define_callbacks :execution, skip_after_callbacks_if_terminated: true
+     end
+
+     module ClassMethods
+       # Defines a callback that will get called right before the
+       # prompt is sent to the execution method.
+       def before_execution(*filters, &blk)
+         set_callback(:execution, :before, *filters, &blk)
+       end
+
+       # Defines a callback that will get called right after the
+       # prompt's execution method is finished.
+       def after_execution(*filters, &blk)
+         set_callback(:execution, :after, *filters, &blk)
+       end
+
+       # Defines a callback that will get called around the prompt's execution method.
+       def around_execution(*filters, &blk)
+         set_callback(:execution, :around, *filters, &blk)
+       end
+     end
+   end
+ end
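The ordering these execution callbacks guarantee (before hooks, then the execution body, then after hooks) can be sketched without ActiveSupport. This is a toy implementation for illustration, not `ActiveSupport::Callbacks`:

```ruby
# Toy before/after execution hooks, run in registration order around a body.
class MiniCallbacks
  attr_reader :log

  def initialize
    @before = []
    @after = []
    @log = []
  end

  def before_execution(&blk)
    @before << blk
  end

  def after_execution(&blk)
    @after << blk
  end

  # Runs all before hooks, then the "execution" step, then all after hooks.
  def execute
    @before.each { |cb| cb.call(@log) }
    @log << :run
    @after.each { |cb| cb.call(@log) }
    @log
  end
end

agent = MiniCallbacks.new
agent.before_execution { |log| log << :before }
agent.after_execution { |log| log << :after }
agent.execute # => [:before, :run, :after]
```

The real module additionally supports `around_execution` and halting (`skip_after_callbacks_if_terminated`), which this sketch omits.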
@@ -0,0 +1,7 @@
+ # frozen_string_literal: true
+
+ module ActionAI
+   def self.deprecator # :nodoc:
+     @deprecator ||= ActiveSupport::Deprecation.new
+   end
+ end