ask_chatgpt 0.2.1 → 0.3.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 66462915e1e106d09ff5990c0e892e02401c8a157306705bff4a6338bf775f1d
-   data.tar.gz: a747d8a310b2861ccd088a1ac27accde74acd2e4732c2f41938ef6536bcdbc76
+   metadata.gz: 81b5ff785019505f049a92996ae9a8f2afbb6bda16375df14a9b101d110a4e3f
+   data.tar.gz: 0c4c037173a02db028500cf6dbafe67331b3f3b90b2bcd80327f3e0336e678e6
  SHA512:
-   metadata.gz: 287029104e1b14dd1f2f9b36e8cf11e55082424ebda4f88a767bd29088faccbdbadf79d3314075e9f0868bebbe38801cce9202c108147617c644454584162115
-   data.tar.gz: c1bc943ae6ae4e6195a1e98dd53e82d3d72afbcbbf170e9d8590b92fc62b740e2a6ca30b9a67698a9113b04c575eb74ed6bd57b1162ba78cd19f918777af4259
+   metadata.gz: e7d950e4d0cd28e23a37a1987ae6600d8627eea1974360ae138dadf556a9976ced78423afa69b6da94d5f7a6d7d6fac9c41d057d5f9c78b40419cc779a003e93
+   data.tar.gz: 3062aee654d3d3505af33762a2c781d02a02b6b50edd1539a6ab248914ffaac31169899c1eda184782683c14fbeacad361859ae4b2930f739d90055d71597ee6
data/README.md CHANGED
@@ -5,10 +5,10 @@
 
  AI-Powered Assistant Gem right in your Rails console.
 
- ![AskChatGPT](docs/gpt.gif)
+ ![AskChatGPT](docs/interactive.gif)
 
  A Gem that leverages the power of AI to make your development experience more efficient and enjoyable. With this gem, you can streamline your coding process, effortlessly refactor and improve your code, and even generate tests on the fly.
-
+ +
  See more [examples](#examples) below.
 
  ## Usage
@@ -33,8 +33,23 @@ Go to Rails console and run:
  }
  ```
 
+ OR with the CLI tool:
+
+ ```shell
+ >ask_chatgpt -q "134*1245"
+ 166830
+
+ >ask_chatgpt base64 this string "hello world"
+ aGVsbG8gd29ybGQ=
+
+ >ask_chatgpt decode base64 this string "aGVsbG8gd29ybGQ="
+ hello world
+ ```
+
  See some examples below. You can also create your own prompts with just a few lines of code [here](#options--configurations).
 
+ You can also use the CLI tool, see [how to use it](#cli-tool).
+
  ## Examples
 
  Typical use-cases showing how you can use this plugin
@@ -88,14 +103,18 @@ And you can edit:
  ```ruby
  AskChatGPT.setup do |config|
    # config.access_token = ENV["OPENAI_API_KEY"]
-   # config.debug = false
-   # config.model = "gpt-3.5-turbo"
-   # config.temperature = 0.1
-   # config.max_tokens = 3000 # or nil by default
-   # config.included_prompt = []
+
+   # async mode will use the OpenAI streaming feature and will return results as they come
+   # config.mode = :async # or :sync
+   # config.markdown = true # try to output the Markdown response nicely
+   # config.debug = false
+   # config.model = "gpt-3.5-turbo"
+   # config.temperature = 0.1
+   # config.max_tokens = 3000 # or nil by default
+   # config.included_prompts = []
 
    # Examples of custom prompts:
-   # you can use them `gpt.ask(:extract_email, "some string")`
+   # you can use them with `gpt.extract_email("some string")`
 
    # config.register_prompt :extract_email do |arg|
    #   "Extract email from: #{arg} as JSON"
@@ -111,7 +130,15 @@ And you can edit:
  end
  ```
 
- Note: that you need to setup your API Key https://platform.openai.com/account/api-keys. You can store it in the .env or .bash_profile. BUT make sure it won't be committed to the Github. Is must be private.
+ Note that you need to set up your API key https://platform.openai.com/account/api-keys. You can store it in .env or .bash_profile.
+
+ Example with `nano ~/.bash_profile`:
+
+ ```
+ export OPENAI_API_KEY=key
+ ```
+
+ BUT make sure it won't be committed to GitHub. It must be private.
 
  You can define your own prompts and use them using `.register_prompt`. For example:
 
@@ -122,7 +149,7 @@ You can define your own prompts and use them using `.register_prompt`. For exampl
  ```
 
  And later you can call it with `gpt.extract_email("some text with email@site.com, user@email.com")`.
- If you believe your custom promts will be useful - create a PR for this gem.
+ If you believe your custom prompts will be useful, create a PR for this gem.
 
  If you want to get source code use this helper `AskChatGPT::Helpers.extract_source(str)`.
 
@@ -136,6 +163,8 @@ You can pass:
  AskChatGPT::Helpers.extract_source("a = b")
  ```
 
+ By default, when used in a Rails app, one prompt is included (`.included_prompts`) that sends the Ruby/Rails versions and the name of the database adapter.
+
  ## Debug Mode
 
  You can enable debug mode to see the request/response from OpenAI in two ways:
@@ -144,21 +173,65 @@ You can enable debug mode to see the request/response from OpenAI in two ways
  AskChatGPT.setup do |config|
    config.debug = false
  end
+
+ # or
+
+ # gpt.on!(:debug)
+ # gpt.off!(:debug)
  ```
 
  or directly in the console with `gpt.debug!` (and `gpt.debug!(:off)` to finish)
 
+ ## CLI Tool
+
+ Example 1:
+ ![AskChatGPT](docs/unzip.gif)
+
+ Example 2:
+ ![AskChatGPT](docs/avg_user_age_json.gif)
+
+ How to use:
+
+ ```
+ ask_chatgpt -q "How to parse JSON file in Ruby?"
+ ask_chatgpt -f app/models/user.rb -q "find a bug in this Rails model"
+ ask_chatgpt -f app/models/user.rb -q "create RSpec spec for this model"
+ ask_chatgpt -f test/dummy/Gemfile -q "sort Ruby gems alphabetically"
+ ```
+
+ ## Streaming (async vs sync mode)
+
+ Control the mode from the console, or from the initializer using `config.mode = :async` (or `:sync`).
+
+ ```ruby
+ gpt.async!
+ gpt.sync!
+ ```
+
+ ## Markdown
+
+ Try to format the response as Markdown and print it nicely in the console.
+
+ ```ruby
+ AskChatGPT.setup do |config|
+   config.markdown = true
+ end
+
+ # or
+
+ # gpt.on!(:markdown)
+ # gpt.off!(:markdown)
+ ```
+
  ## TODO
 
- - cli app? `ask_gpt <something> --file <file>` ...
+ - better CLI?
  - more prompts (cover controllers, sql, etc?), e.g. `with_controller`, `with_class`, ...
  - tests(rspec, vcr)
- - CI (but first specs)
  - can it be used with pry/byebug/etc?
  - print tokens usage? `.with_usage`
  - support org_id? in the configs
  - use `gpt` in the code of the main app (e.g. model/controller)
- - remove dependency on `ruby-openai` and just do a `Net::HTTP` call
 
  ## Contributing
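Taken together, the options documented above can be exercised from a Rails console roughly like this. This is a minimal sketch based on the README, assuming the gem is installed, `OPENAI_API_KEY` is exported, and `gpt` is the console helper the gem provides; the `:extract_email` prompt is the README's own example, not a built-in.

```ruby
AskChatGPT.setup do |config|
  config.mode     = :async  # stream chunks as they arrive
  config.markdown = true    # render the reply with TTY::Markdown

  # custom prompt, mirroring the README example
  config.register_prompt :extract_email do |arg|
    "Extract email from: #{arg} as JSON"
  end
end

gpt.debug!    # print request/response payloads
gpt.sync!     # switch back to waiting for the whole reply
gpt.extract_email("ping admin@example.com about the invoice")
```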
 
data/bin/ask_chatgpt ADDED
@@ -0,0 +1,63 @@
+ #!/usr/bin/env ruby
+
+ require 'optparse'
+ require 'active_support'
+
+ require_relative "../lib/ask_chatgpt"
+
+ AskChatgpt.setup do |config|
+   config.included_prompts = []
+ end
+
+ options = {}
+
+ parser = OptionParser.new do |opts|
+   opts.banner = <<-USAGE
+     Usage: ask_chatgpt [options]
+
+     Make sure you have set OPENAI_API_KEY environment variable.
+     You can put it in your ~/.bashrc, ~/.bash_profile, ~/.zshrc file or using plenty of other options.
+
+     Examples:
+       ask_chatgpt -q "How to parse JSON file in Ruby?"
+       ask_chatgpt -f app/models/user.rb -q "find a bug in this Rails model"
+       ask_chatgpt -f app/models/user.rb -q "create RSpec spec for this model"
+       ask_chatgpt -f test/dummy/Gemfile -q "sort Ruby gems alphabetically"
+
+   USAGE
+
+   opts.on("-q", "--question \"Your Prompt\"", String, "Specify your prompt with full context, language, etc.") do |prompt|
+     options[:prompt] = prompt
+   end
+
+   opts.on("-f", "--file FILE", String, "Specify file with prompt") do |file|
+     options[:file_path] = file
+   end
+
+   opts.on("-d", "--debug", "Output request/response") do |debug|
+     options[:debug] = true
+   end
+
+   opts.on("-h", "--help", "Prints this help") do
+     puts opts
+     exit
+   end
+ end
+
+ parser.parse!
+
+ AskChatGPT.debug = !!options[:debug]
+
+ options[:prompt] = ARGV.join(" ") if options[:prompt].blank?
+
+ if options[:prompt].blank?
+   puts parser
+   exit
+ end
+
+ include AskChatGPT::Console
+
+ instance = gpt.ask(options[:prompt])
+ instance = instance.payload(File.read(options[:file_path])) if options[:file_path].present?
+
+ puts instance.inspect
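The executable is a thin wrapper over the same console API. A rough hand-written equivalent of `ask_chatgpt -f app/models/user.rb -q "find a bug in this Rails model"` might look like the sketch below (assuming the gem and `OPENAI_API_KEY` are available; the file path is only illustrative):

```ruby
#!/usr/bin/env ruby
require "active_support"
require "ask_chatgpt"

AskChatGPT.included_prompts = []   # like the CLI, skip the Rails-specific default prompt

include AskChatGPT::Console        # provides the `gpt` helper used above

request = gpt.ask("find a bug in this Rails model")
request = request.payload(File.read("app/models/user.rb"))

puts request.inspect               # `inspect` is what triggers the actual API call
```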
@@ -0,0 +1,31 @@
+ module AskChatgpt
+   module DefaultBehavior
+     DEFAULT_PROMPTS = [:improve, :refactor, :question, :find_bug, :code_review, :rspec_test, :unit_test, :explain]
+
+     def with_model(*models)
+       self.tap do
+         models.each do |model|
+           add_prompt AskChatGPT::Prompts::Model.new(model)
+         end
+       end
+     end
+     alias :with_models :with_model
+
+     DEFAULT_PROMPTS.each do |method|
+       define_method(method) do |*args|
+         # camelize method name and get constant from AskChatGPT::Prompts
+         add_prompt(AskChatGPT::Prompts.const_get(ActiveSupport::Inflector.camelize(method.to_s)).new(*args))
+       end
+     end
+     alias :ask :question
+     alias :payload :question
+     alias :how :question
+     alias :find :question
+     alias :review :code_review
+
+     def add_prompt(prompt)
+       scope << prompt
+       self
+     end
+   end
+ end
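The `define_method` loop above is the core trick: every symbol in `DEFAULT_PROMPTS` becomes a chainable method that camelizes its own name, looks the prompt class up under `AskChatGPT::Prompts`, and appends an instance to the scope. A stripped-down, standalone illustration of the same pattern (the `Mini*` names are invented for this example, not part of the gem):

```ruby
require "active_support/core_ext/string/inflections" # String#camelize

module MiniPrompts
  class Improve
    def initialize(*args)
      @args = args
    end
  end

  class FindBug < Improve; end
end

class MiniExecutor
  attr_reader :scope

  def initialize
    @scope = []
  end

  # :find_bug -> MiniPrompts::FindBug; returning self makes the calls chainable
  [:improve, :find_bug].each do |name|
    define_method(name) do |*args|
      scope << MiniPrompts.const_get(name.to_s.camelize).new(*args)
      self
    end
  end
end

p MiniExecutor.new.improve("def a; end").find_bug.scope.map(&:class)
# => [MiniPrompts::Improve, MiniPrompts::FindBug]
```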
@@ -1,5 +1,8 @@
+ require_relative "sugar"
  require_relative "prompts/base"
  require_relative "prompts/improve"
+ require_relative "default_behavior"
+
  Dir[File.join(__dir__, "prompts", "*.rb")].each do |file|
    require file
  end
@@ -8,70 +11,93 @@ end
  # https://www.greataiprompts.com/chat-gpt/best-coding-prompts-for-chat-gpt/
 
  module AskChatgpt
+   class InputError < StandardError; end
+
    class Executor
-     DEFAULT_PROMPTS = [:improve, :refactor, :question, :find_bug, :code_review, :rspec_test, :unit_test, :explain]
+     include AskChatgpt::Sugar, AskChatgpt::DefaultBehavior
 
-     attr_reader :scope, :client
+     attr_reader :scope, :client, :spinner, :cursor
 
      def initialize(client)
-       @scope = AskChatGPT.included_prompt.dup
-       @client = client
+       @scope = AskChatGPT.included_prompts.dup
+       @client = client
+       @spinner = TTY::Spinner.new(format: :classic)
+       @cursor = TTY::Cursor
      end
 
-     def debug!(mode = :on)
-       AskChatGPT.debug = mode == :on
-     end
-
-     def with_model(*models)
-       self.tap do
-         models.each do |model|
-           add_prompt AskChatGPT::Prompts::Model.new(model)
-         end
-       end
-     end
-     alias :with_models :with_model
-
-     DEFAULT_PROMPTS.each do |method|
-       define_method(method) do |*args|
-         add_prompt(AskChatGPT::Prompts.const_get(method.to_s.camelize).new(*args))
-       end
-     end
-     alias :ask :question
-     alias :payload :question
-     alias :how :question
-     alias :find :question
-     alias :review :code_review
-
      def inspect
        pp(executor_parameters) if AskChatGPT.debug
-       puts(call); nil
+       call_with_validations do
+         case AskChatGPT.mode
+         when :async
+           call_async
+         else
+           call_sync
+         end
+       end
+     rescue InputError => e
+       puts e.message
      rescue StandardError => e
        puts e.message
        puts e.backtrace.take(5).join("\n")
+     ensure
        nil
      end
 
-     def call
+     private
+
+     def call_with_validations
        if scope.empty? || (scope.size == 1 && scope.first.is_a?(AskChatGPT::Prompts::App))
-         return puts("No prompts given")
+         raise InputError, "No prompts given"
        end
-
-       spinner = TTY::Spinner.new(format: :classic)
+       print cursor.save
        spinner.auto_spin
-       response = client.chat(parameters: executor_parameters)
-       spinner.stop
+       yield
+     ensure
+       spinner.stop if spinner&.spinning?
+     end
+
+     # we will collect all chunks and print them at once later with Markdown
+     def call_async
+       content = []
+       response = client.chat(parameters: executor_parameters.merge({
+         stream: proc do |chunk, _bytesize|
+           spinner.stop if spinner&.spinning?
+           content_part = chunk.dig("choices", 0, "delta", "content")
+           content << content_part
+           print content_part
+         end
+       }))
+       if AskChatGPT.markdown
+         # re-draw the screen
+         # go back to the top by the number of new lines previously printed
+         print cursor.clear_lines(content.compact.join.split("\n").size + 1, :up)
+         # print cursor.restore
+         # print cursor.down
+         # print cursor.clear_screen_down
+         # $content = content.compact.join
+         puts(TTY::Markdown.parse(content.compact.join))
+       else
+         # nothing, content is already printed in the stream
+       end
+     end
 
+     # wait for the whole response and print it at once
+     def call_sync
+       response = client.chat(parameters: executor_parameters)
        pp(response) if AskChatGPT.debug
+       spinner&.stop
 
        if response["error"]
-         puts response["error"]["message"]
+         puts(response["error"]["message"])
        else
          content = response.dig("choices", 0, "message", "content")
-         parsed = TTY::Markdown.parse(content)
-         parsed
+         if AskChatGPT.markdown
+           puts(TTY::Markdown.parse(content))
+         else
+           puts(content)
+         end
        end
-     ensure
-       spinner.stop if spinner&.spinning?
      end
 
      def executor_parameters
@@ -80,13 +106,7 @@ module AskChatgpt
        temperature: AskChatGPT.temperature,
        max_tokens: AskChatGPT.max_tokens,
        messages: scope.map { |e| { role: "user", content: e.content } }.reject { |e| e[:content].blank? },
-     }.compact_blank
-     end
-
-     def add_prompt(prompt)
-       @scope << prompt
-       self
+     }.reject { |_, v| v.blank? }
      end
-
    end
  end
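For context, the `call_async` branch follows the `ruby-openai` streaming convention of passing a `stream:` proc in the request parameters. A condensed sketch of the same idea outside the gem (assuming `ruby-openai` >= 4.0 and `tty-markdown`, with `OPENAI_API_KEY` set; the prompt text is arbitrary):

```ruby
require "openai"
require "tty-markdown"

client = OpenAI::Client.new(access_token: ENV["OPENAI_API_KEY"])
chunks = []

client.chat(parameters: {
  model:       "gpt-3.5-turbo",
  temperature: 0.1,
  messages:    [{ role: "user", content: "Explain Ruby's Enumerable#lazy briefly" }],
  # called once per streamed chunk; role/finish chunks carry no content
  stream: proc do |chunk, _bytesize|
    part = chunk.dig("choices", 0, "delta", "content")
    next if part.nil?
    chunks << part
    print part # echo tokens as they arrive
  end
})

# after the stream ends, re-render the collected text, as the executor
# does when AskChatGPT.markdown is enabled
puts "\n" + TTY::Markdown.parse(chunks.join)
```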
@@ -6,7 +6,7 @@ module AskChatgpt
      general_info,
      version_info,
      database_info
-   ].compact_blank.join(", ")
+   ].reject { |v| v.blank? }.join(", ")
    end
 
    private
@@ -2,7 +2,7 @@ module AskChatgpt
    module Prompts
      class CodeReview < Improve
        private def action_info
-         "Analyze the given Ruby code for code smells and suggest improvements:"
+         "Analyze the given Ruby code for code smells and suggest improvements, use Markdown for code snippets:"
        end
      end
    end
@@ -2,7 +2,7 @@ module AskChatgpt
    module Prompts
      class Explain < Improve
        private def action_info
-         "Explain me this Ruby code snippet:"
+         "Explain me this Ruby code snippet, use Markdown for code snippets:"
        end
      end
    end
@@ -6,7 +6,7 @@ module AskChatgpt
      [
        action_info,
        method_info,
-     ].compact_blank.join("\n")
+     ].reject { |v| v.blank? }.join("\n")
    end
 
    private
@@ -7,7 +7,7 @@ module AskChatgpt
      schema_info,
      associations_info,
      instructions_info
-   ].compact_blank.join("\n\n")
+   ].reject { |v| v.blank? }.join("\n\n")
    end
 
    private
@@ -2,7 +2,7 @@ module AskChatgpt
    module Prompts
      class RspecTest < Improve
        private def action_info
-         "Write a rspec test for the given Ruby on Rails code:"
+         "Write a rspec test for the given Ruby on Rails code in Ruby code, return as Markdown:"
        end
      end
    end
@@ -2,7 +2,7 @@ module AskChatgpt
    module Prompts
      class UnitTest < Improve
        private def action_info
-         "Write a unit test for the given Ruby on Rails code:"
+         "Write a unit test for the given Ruby on Rails code in Ruby code, return as Markdown:"
        end
      end
    end
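All of the prompt classes touched here share one template-method shape: a `content` method joins the non-blank pieces, and each subclass only overrides `action_info`. A standalone sketch of that shape (illustrative classes, not the gem's exact code; the real classes also pull in source extraction, schema details, and so on):

```ruby
class BasePrompt
  def initialize(code)
    @code = code
  end

  def content
    # the gem uses ActiveSupport's #blank?; plain Ruby checks are used here instead
    [action_info, @code].reject { |v| v.nil? || v.strip.empty? }.join("\n")
  end

  private

  def action_info
    "Analyze the given Ruby code:"
  end
end

class ExplainPrompt < BasePrompt
  private def action_info
    "Explain me this Ruby code snippet, use Markdown for code snippets:"
  end
end

puts ExplainPrompt.new("def a; end").content
# Explain me this Ruby code snippet, use Markdown for code snippets:
# def a; end
```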
@@ -0,0 +1,47 @@
+ module AskChatgpt
+   module Sugar
+
+     def debug!(mode = :on)
+       AskChatGPT.debug = mode == :on
+     end
+
+     def sync!
+       AskChatGPT.mode = :sync
+     end
+
+     def async!
+       AskChatGPT.mode = :async
+     end
+
+     def on!(feature)
+       case feature
+       when :markdown
+         AskChatGPT.markdown = true
+       when :debug
+         AskChatGPT.debug = true
+       when :async
+         AskChatGPT.mode = :async
+       when :sync
+         AskChatGPT.mode = :sync
+       else
+         raise InputError, "Unknown feature: #{feature}"
+       end
+     end
+
+     def off!(feature)
+       case feature
+       when :markdown
+         AskChatGPT.markdown = false
+       when :debug
+         AskChatGPT.debug = false
+       when :async
+         AskChatGPT.mode = :sync
+       when :sync
+         AskChatGPT.mode = :async
+       else
+         raise InputError, "Unknown feature: #{feature}"
+       end
+     end
+
+   end
+ end
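In console terms, the new Sugar module boils down to a handful of toggles on the module-level settings (mirroring the README; `gpt` is the console helper, and unknown feature names raise `AskChatgpt::InputError`):

```ruby
gpt.on!(:markdown)   # AskChatGPT.markdown = true
gpt.off!(:debug)     # AskChatGPT.debug    = false
gpt.async!           # AskChatGPT.mode     = :async (stream chunks as they come)
gpt.sync!            # AskChatGPT.mode     = :sync  (wait for the full reply)
gpt.debug!           # debug on; gpt.debug!(:off) switches it back off
```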
@@ -1,3 +1,3 @@
  module AskChatgpt
-   VERSION = "0.2.1"
+   VERSION = "0.3.0"
  end
data/lib/ask_chatgpt.rb CHANGED
@@ -1,10 +1,14 @@
- require "ask_chatgpt/version"
- require "ask_chatgpt/railtie"
+ # to make sure we have a good gem version
+ gem "ruby-openai", '>= 4.0.0'
+
+ require_relative "ask_chatgpt/version"
+ require "ask_chatgpt/railtie" if defined?(Rails)
  require "net/http"
  require "json"
  require "openai"
  require "tty-markdown"
  require "tty-spinner"
+ require "tty-cursor"
 
  require_relative "ask_chatgpt/console"
  require_relative "ask_chatgpt/executor"
@@ -14,6 +18,13 @@ require_relative "ask_chatgpt/core"
  module AskChatgpt
    ::AskChatGPT = AskChatgpt
 
+   # async mode will use the OpenAI streaming feature and will return results as they come
+   mattr_accessor :mode
+   @@mode = :async # or :sync
+
+   mattr_accessor :markdown
+   @@markdown = true
+
    mattr_accessor :debug
    @@debug = false
 
@@ -35,8 +46,8 @@ module AskChatgpt
 
    # this prompt is always included
    # it contains info that you have a Rails app, the Rails/Ruby versions, and the DB adapter name
-   mattr_accessor :included_prompt
-   @@included_prompt = [AskChatGPT::Prompts::App.new]
+   mattr_accessor :included_prompts
+   @@included_prompts = [AskChatGPT::Prompts::App.new]
 
    def self.setup
      yield(self)
@@ -1,10 +1,14 @@
  AskChatGPT.setup do |config|
    # config.access_token = ENV["OPENAI_API_KEY"]
-   # config.debug = false
-   # config.model = "gpt-3.5-turbo"
-   # config.max_tokens = 3000 # or nil by default
-   # config.temperature = 0.1
-   # config.included_prompt = []
+
+   # async mode will use the OpenAI streaming feature and will return results as they come
+   # config.mode = :async # or :sync
+   # config.markdown = true # try to render the response as Markdown
+   # config.debug = false
+   # config.model = "gpt-3.5-turbo"
+   # config.max_tokens = 3000 # or nil by default
+   # config.temperature = 0.1
+   # config.included_prompts = []
 
    # Examples of custom prompts:
    # you can use them `gpt.ask(:extract_email, "some string")`
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ask_chatgpt
  version: !ruby/object:Gem::Version
-   version: 0.2.1
+   version: 0.3.0
  platform: ruby
  authors:
  - Igor Kasyanchuk
@@ -9,7 +9,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-04-26 00:00:00.000000000 Z
+ date: 2023-04-29 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: rails
@@ -31,14 +31,14 @@ dependencies:
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
-       version: '0'
+       version: 4.0.0
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
-       version: '0'
+       version: 4.0.0
  - !ruby/object:Gem::Dependency
    name: tty-markdown
    requirement: !ruby/object:Gem::Requirement
@@ -67,6 +67,20 @@ dependencies:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
+ - !ruby/object:Gem::Dependency
+   name: tty-cursor
+   requirement: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
  - !ruby/object:Gem::Dependency
    name: irb
    requirement: !ruby/object:Gem::Requirement
@@ -113,16 +127,19 @@ description: AI-Powered Assistant Gem right in your Rails console.
  email:
  - igorkasyanchuk@gmail.com
  - manastyretskyi@gmail.com
- executables: []
+ executables:
+ - ask_chatgpt
  extensions: []
  extra_rdoc_files: []
  files:
  - MIT-LICENSE
  - README.md
  - Rakefile
+ - bin/ask_chatgpt
  - lib/ask_chatgpt.rb
  - lib/ask_chatgpt/console.rb
  - lib/ask_chatgpt/core.rb
+ - lib/ask_chatgpt/default_behavior.rb
  - lib/ask_chatgpt/executor.rb
  - lib/ask_chatgpt/helpers.rb
  - lib/ask_chatgpt/prompts/app.rb
@@ -138,6 +155,7 @@ files:
  - lib/ask_chatgpt/prompts/rspec_test.rb
  - lib/ask_chatgpt/prompts/unit_test.rb
  - lib/ask_chatgpt/railtie.rb
+ - lib/ask_chatgpt/sugar.rb
  - lib/ask_chatgpt/version.rb
  - lib/generators/ask_chatgpt/USAGE
  - lib/generators/ask_chatgpt/ask_chatgpt_generator.rb