aia 0.5.13 → 0.5.15

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 1487b5005351fcb62b10d7ccdfdfe0153a40b931b633562671cc28e9f554ad8f
- data.tar.gz: 9d34fc975adb14a52d0cf62a541a7568af61b4da5cb963c326a83253613d6b30
+ metadata.gz: 2d027907d70cb497a761f25ad65c2e9429f312cc9694466bf9a8db612a1ead0a
+ data.tar.gz: b76efb7bd589a685e9380d969db85f3df88c36fe888fd03e44819d26e339a353
  SHA512:
- metadata.gz: '00994916d15ef59a91ee9ae8b73acee3e10a8cc7deb1c41cbfa5fff4997fd97622deb2d16fdbc555da17bddba48dc349db21115333aac9bdf8d9a3dfaaa220c4'
- data.tar.gz: b40df4e189d676dbee86f6593fac9fb90153d392105c07af7a53a3a1be1e3961b9978e05e29738e94c5ee135f7f9af0abdb7de19475edd4e7f7615f42d701327
+ metadata.gz: b83635891018c810bf7c794a66bb7c0842e28d9152ceb444f5105468928b419575ce7f3f88e70528ef8dc87b46ab2246525e43726f9480a2f2ce8be00a850270
+ data.tar.gz: 979137d859737b3dec4f264d1e6c6611131b5fbbcb4682ec1dd5843a69c147754b0937414f9c891cc1e8b0a09d95129af83ba1432bc9b01e8465ff7562245ca9
data/.semver CHANGED
@@ -1,6 +1,6 @@
  ---
  :major: 0
  :minor: 5
- :patch: 13
+ :patch: 15
  :special: ''
  :metadata: ''
data/CHANGELOG.md CHANGED
@@ -1,5 +1,14 @@
  ## [Unreleased]

+ ## [0.5.15] 2024-03-30
+ - Added the ability to accept piped-in text, which is appended to the end of the prompt text: curl $URL | aia ad_hoc
+ - Fixed bugs with entering directives as follow-up prompts during a chat session
+
+ ## [0.5.14] 2024-03-09
+ - Directly access OpenAI to do text to speech when using the `--speak` option
+ - Added --voice to specify which voice to use
+ - Added --speech_model to specify which TTS model to use
+
  ## [0.5.13] 2024-03-03
  - Added CLI-utility `llm` as a backend processor

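The headline change in v0.5.15 is that `aia` now accepts piped-in text and appends it to the prompt, as in `curl $URL | aia ad_hoc`. The `data/lib/aia/main.rb` hunks below show the gem's actual implementation; here is a minimal standalone sketch of the same pattern, with illustrative names rather than the gem's API, and assuming the process has a controlling terminal at `/dev/tty`:

```ruby
# Sketch of the piped-input pattern added in v0.5.15 (illustrative, not aia's code).
piped_content = nil

unless $stdin.tty?
  piped_content = $stdin.readlines.join.chomp # capture whatever was piped in
  $stdin.reopen("/dev/tty")                   # reattach stdin so interactive chat still works
end

prompt_text = "Summarize the following input:"
prompt_text += "\n#{piped_content}" unless piped_content.nil?
puts prompt_text
```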
data/README.md CHANGED
@@ -6,14 +6,15 @@ It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt

  **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)

- > v0.5.13
- > - Added an initial integration for CLI-tool `llm` as a backend processor
- > Its primary feature is its **ability to use local LLMs and APIs to keep all processing within your local workstation.**
- >
- > v0.5.12
- > - Supports Prompt Sequencing
- > - Added --next option
- > - Added --pipeline option
+ > v0.5.15
+ > - Support piped content by appending to end of prompt text
+ > - Fixed bugs with directives entered as follow-up while in chat mode
+ >
+ > v0.5.14
+ > - Directly access OpenAI to do text to speech when using the `--speak` option
+ > - Added --voice to specify which voice to use
+ > - Added --speech_model to specify which TTS model to use
+ >


  <!-- Tocer[start]: Auto-generated, don't remove. -->
@@ -106,9 +107,12 @@ The `aia` configuration defaults can be over-ridden by system environment variab
  | log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
  | markdown | true | AIA_MARKDOWN |
  | model | gpt-4-1106-preview | AIA_MODEL |
- | out_file | STDOUT | AIA_OUT_FILE |
+ | out_file | STDOUT | AIA_OUT_FILE |
  | prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
- | VERBOSE | FALSE | AIA_VERBOSE |
+ | speech_model. | tts-1 | AIA_SPEECH_MODEL |
+ | verbose | FALSE | AIA_VERBOSE |
+ | voice | alloy | AIA_VOICE |
+


  See the `@options` hash in the `cli.rb` file for a complete list. There are some config items that do not necessarily make sense for use as an envar over-ride. For example, if you set `export AIA_DUMP_FILE=config.yaml` then `aia` would dump the current configuration to config.yaml and exit every time it is run until you finally `unset AIA_DUMP_FILE`.
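Each row in the table pairs a configuration default with the envar that overrides it. A minimal sketch of that resolution order, assuming a hypothetical `config_value` helper and a hard-coded defaults hash (neither is aia's actual code; the real logic lives in `cli.rb`):

```ruby
# Hypothetical illustration of envar-over-default resolution.
DEFAULTS = {
  speech_model: "tts-1",
  voice:        "alloy",
  verbose:      false
}.freeze

def config_value(key)
  envar = "AIA_#{key.to_s.upcase}"         # e.g. :voice -> "AIA_VOICE"
  ENV.key?(envar) ? ENV[envar] : DEFAULTS[key]
end

puts config_value(:voice)   # "alloy" unless AIA_VOICE is exported
```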
data/lib/aia/cli.rb CHANGED
@@ -155,6 +155,8 @@ class AIA::Cli
  extra: [''], #
  #
  model: ["gpt-4-1106-preview", "--llm --model"],
+ speech_model: ["tts-1", "--sm --spech_model"],
+ voice: ["alloy", "--voice"],
  #
  dump_file: [nil, "--dump"],
  completion: [nil, "--completion"],
data/lib/aia/main.rb CHANGED
@@ -20,11 +20,16 @@ class AIA::Main
  include AIA::DynamicContent
  include AIA::UserQuery

- attr_accessor :logger, :tools, :backend, :directive_output
+ attr_accessor :logger, :tools, :backend, :directive_output, :piped_content

  attr_reader :spinner

  def initialize(args= ARGV)
+ unless $stdin.tty?
+ @piped_content = $stdin.readlines.join.chomp
+ $stdin.reopen("/dev/tty")
+ end
+
  @directive_output = ""
  AIA::Tools.load_tools

@@ -48,22 +53,13 @@

  @prompt = AIA::Prompt.new.prompt

+ @prompt.text += piped_content unless piped_content.nil?
+
  # TODO: still should verify that the tools are ion the $PATH
  # tools.class.verify_tools
  end


- def speak(what)
- return false unless AIA.config.speak?
- # TODO: Consider putting this into a thread
- # so that it can speak at the same time
- # the output is going to the screen
- # MacOS uses the say command
- system "say #{Shellwords.escape(what)}"
- true
- end
-
-
  # Function to setup the Reline history with a maximum depth
  def setup_reline_history(max_history_size=5)
  Reline::HISTORY.clear
@@ -74,7 +70,7 @@
  # This will be recursive with the new options
  # --next and --pipeline
  def call
- directive_output = @directives_processor.execute_my_directives
+ @directive_output = @directives_processor.execute_my_directives

  if AIA.config.chat?
  AIA.config.out_file = STDOUT
@@ -106,7 +102,7 @@

  the_prompt = @prompt.to_s

- the_prompt.prepend(directive_output + "\n") unless directive_output.nil? || directive_output.empty?
+ the_prompt.prepend(@directive_output + "\n") unless @directive_output.nil? || @directive_output.empty?

  if AIA.config.terse?
  the_prompt.prepend "Be terse in your response. "
@@ -123,7 +119,7 @@

  if AIA.config.chat?
  setup_reline_history
- speak result
+ AIA.speak result
  lets_chat
  end

@@ -202,9 +198,10 @@
  directive = parts.shift
  parameters = parts.join(' ')
  AIA.config.directives << [directive, parameters]
- directive_output = @directives_processor.execute_my_directives
+
+ @directive_output = @directives_processor.execute_my_directives
  else
- directive_output = ""
+ @directive_output = ""
  end

  result
@@ -221,20 +218,21 @@
  the_prompt_text = render_env(the_prompt_text) if AIA.config.shell?

  if handle_directives(the_prompt_text)
- unless directive_output.nil?
- the_prompt_text = insert_terse_phrase(the_prompt_text)
- the_prompt_text << directive_output
+ if @directive_output.nil? || @directive_output.empty?
+ # Do nothing
+ else
+ the_prompt_text = @directive_output
+ the_prompt_text = render_erb(the_prompt_text) if AIA.config.erb?
+ the_prompt_text = render_env(the_prompt_text) if AIA.config.shell?
  result = get_and_display_result(the_prompt_text)
-
  log_the_follow_up(the_prompt_text, result)
- speak result
+ AIA.speak result
  end
  else
  the_prompt_text = insert_terse_phrase(the_prompt_text)
  result = get_and_display_result(the_prompt_text)
-
  log_the_follow_up(the_prompt_text, result)
- speak result
+ AIA.speak result
  end

  the_prompt_text = ask_question_with_reline("\nFollow Up: ")
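A recurring change through these hunks is replacing assignments to the local variable `directive_output` with the instance variable `@directive_output`. In Ruby, a bare assignment inside a method always creates a new local variable, even when an `attr_accessor` of the same name exists, so directive output computed during a chat follow-up was silently discarded; that is the bug the 0.5.15 changelog entry refers to. A small self-contained illustration of the gotcha (not aia's code):

```ruby
# Demonstrates why `directive_output = ...` inside a method never updates the
# object's state, while `@directive_output = ...` (or self.directive_output=) does.
class Demo
  attr_accessor :directive_output

  def broken
    directive_output = "directive result"   # creates a throwaway local variable
  end

  def fixed
    @directive_output = "directive result"  # writes the instance variable
  end
end

d = Demo.new
d.broken
p d.directive_output  # => nil
d.fixed
p d.directive_output  # => "directive result"
```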
data/lib/aia.rb CHANGED
@@ -20,11 +20,12 @@ tramp_require('debug_me') {
  }

  require 'hashie'
+ require 'openai'
  require 'os'
  require 'pathname'
  require 'reline'
  require 'shellwords'
- require 'tempfile' # SMELL: is this still being used?
+ require 'tempfile'

  require 'tty-spinner'

@@ -45,8 +46,15 @@ require_relative "core_ext/string_wrap"
  module AIA
  class << self
  attr_accessor :config
+ attr_accessor :client

  def run(args=ARGV)
+ begin
+ @client = OpenAI::Client.new(access_token: ENV["OPENAI_API_KEY"])
+ rescue OpenAI::ConfigurationError
+ @client = nil
+ end
+
  args = args.split(' ') if args.is_a?(String)

  # TODO: Currently this is a one and done architecture.
@@ -59,8 +67,47 @@ module AIA


  def speak(what)
- return unless AIA.config.speak?
- system "say #{Shellwords.escape(what)}" if OS.osx?
+ return unless config.speak?
+
+ if OS.osx? && 'siri' == config.voice.downcase
+ system "say #{Shellwords.escape(what)}"
+ else
+ use_openai_tts(what)
+ end
+ end
+
+
+ def use_openai_tts(what)
+ if client.nil?
+ puts "\nWARNING: OpenAI's text to speech capability is not available at this time."
+ return
+ end
+
+ player = if OS.osx?
+ 'afplay'
+ elsif OS.linux?
+ 'mpg123'
+ elsif OS.windows?
+ 'cmdmp3'
+ else
+ puts "\nWARNING: There is no MP3 player available"
+ return
+ end
+
+ response = client.audio.speech(
+ parameters: {
+ model: config.speech_model,
+ input: what,
+ voice: config.voice
+ }
+ )
+
+ Tempfile.create(['speech', '.mp3']) do |f|
+ f.binmode
+ f.write(response)
+ f.close
+ `#{player} #{f.path}`
+ end
  end
  end
  end
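The new module-level `AIA.speak` uses the macOS `say` command when the voice is set to "siri" and otherwise synthesizes speech through OpenAI and plays the resulting MP3. A stripped-down standalone sketch of that synthesize-and-play path, assuming the `ruby-openai` gem, an `OPENAI_API_KEY` in the environment, and macOS's `afplay` as the player (all per the code above; the input string is illustrative):

```ruby
require 'openai'    # ruby-openai gem, added as a runtime dependency in this release
require 'tempfile'

client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))

# Request an MP3 rendering of the text; model and voice mirror aia's new defaults.
audio = client.audio.speech(
  parameters: {
    model: "tts-1",
    input: "Hello from aia",
    voice: "alloy"
  }
)

# Write the bytes to a temp file and hand the path to a local player.
Tempfile.create(['speech', '.mp3']) do |f|
  f.binmode
  f.write(audio)
  f.close
  system("afplay", f.path)   # aia picks afplay, mpg123, or cmdmp3 by platform
end
```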
data/man/aia.1 CHANGED
@@ -1,6 +1,6 @@
  .\" Generated by kramdown-man 1.0.1
  .\" https://github.com/postmodern/kramdown-man#readme
- .TH aia 1 "v0.5.13" AIA "User Manuals"
+ .TH aia 1 "v0.5.14" AIA "User Manuals"
  .SH NAME
  .PP
  aia \- command\-line interface for an AI assistant
@@ -98,6 +98,12 @@ A role ID is the same as a prompt ID\. A \[lq]role\[rq] is a specialized prompt
  .TP
  \fB\-v\fR, \fB\-\-verbose\fR
  Be Verbose \- default is false
+ .TP
+ \fB\-\-voice\fR
+ The voice to use when the option \fB\-\-speak\fR is used\. If you are on a Mac, then setting voice to \[lq]siri\[rq] will use your Mac\[cq]s default siri voice and not access OpenAI \- default is \[lq]alloy\[rq] from OpenAI
+ .TP
+ \fB\-\-sm\fR, \fB\-\-speech\[ru]model\fR
+ The OpenAI speech model to use when converting text into speech \- default is \[lq]tts\-1\[rq]
  .SH CONFIGURATION HIERARCHY
  .PP
  System Environment Variables (envars) that are all uppercase and begin with \[lq]AIA\[ru]\[rq] can be used to over\-ride the default configuration settings\. For example setting \[lq]export AIA\[ru]PROMPTS\[ru]DIR\[eq]\[ti]\[sl]Documents\[sl]prompts\[rq] will over\-ride the default configuration; however, a config value provided by a command line options will over\-ride an envar setting\.
data/man/aia.1.md CHANGED
@@ -1,4 +1,4 @@
- # aia 1 "v0.5.13" AIA "User Manuals"
+ # aia 1 "v0.5.14" AIA "User Manuals"

  ## NAME

@@ -103,6 +103,11 @@ The aia command-line tool is an interface for interacting with an AI model backe
  `-v`, `--verbose`
  : Be Verbose - default is false

+ `--voice`
+ : The voice to use when the option `--speak` is used. If you are on a Mac, then setting voice to "siri" will use your Mac's default siri voice and not access OpenAI - default is "alloy" from OpenAI
+
+ `--sm`, `--speech_model`
+ : The OpenAI speech model to use when converting text into speech - default is "tts-1"

  ## CONFIGURATION HIERARCHY

metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: aia
  version: !ruby/object:Gem::Version
- version: 0.5.13
+ version: 0.5.15
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-03-03 00:00:00.000000000 Z
+ date: 2024-03-30 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: hashie
@@ -66,6 +66,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: ruby-openai
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  - !ruby/object:Gem::Dependency
  name: semver2
  requirement: !ruby/object:Gem::Requirement
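The only new runtime dependency in this release is ruby-openai, which backs the OpenAI text-to-speech calls in lib/aia.rb. The gemspec itself is not part of this diff, but a dependency recorded this way would typically come from a single `add_dependency` line; an illustrative fragment, using only values visible in the metadata above:

```ruby
# Illustrative gemspec fragment; aia.gemspec is not shown in this diff.
Gem::Specification.new do |spec|
  spec.name    = "aia"
  spec.version = "0.5.15"
  spec.summary = "AI Assistant (aia) a command-line (CLI) utility"
  spec.authors = ["Dewayne VanHoozer"]

  # Records ruby-openai as a runtime dependency with no version constraint (">= 0").
  spec.add_dependency "ruby-openai"
end
```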
@@ -302,7 +316,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.6
+ rubygems_version: 3.5.7
  signing_key:
  specification_version: 4
  summary: AI Assistant (aia) a command-line (CLI) utility