aia 0.5.12 → 0.5.13

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 8fa0d67e36209d8ac1840c94820ce58e0b6c08a6b01e47e579dfe90f04b94420
- data.tar.gz: 382d34ab554077b0e81d3a724362b41463e5ccc88f888dbba00b1ad65ff0c528
+ metadata.gz: 1487b5005351fcb62b10d7ccdfdfe0153a40b931b633562671cc28e9f554ad8f
+ data.tar.gz: 9d34fc975adb14a52d0cf62a541a7568af61b4da5cb963c326a83253613d6b30
  SHA512:
- metadata.gz: 109f27eb85889450bd19bb78a123d02ba789bd99708cf7be20528a8da8c090a7ef68655e60f7fda31a0da12a506a38f3fcdf8c9d2b4a2ea9096006ab81c5d494
- data.tar.gz: 4b46125f0937d74ea4fe446c343b474f7c3b7d0443def5fea9782a5eea9a7c7bf4997e866871723dc2e4fe2d4fd9db07fe7384f9a95655891efbbfac5711a40f
+ metadata.gz: '00994916d15ef59a91ee9ae8b73acee3e10a8cc7deb1c41cbfa5fff4997fd97622deb2d16fdbc555da17bddba48dc349db21115333aac9bdf8d9a3dfaaa220c4'
+ data.tar.gz: b40df4e189d676dbee86f6593fac9fb90153d392105c07af7a53a3a1be1e3961b9978e05e29738e94c5ee135f7f9af0abdb7de19475edd4e7f7615f42d701327
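The checksum change above can be sanity-checked locally. A minimal Ruby sketch using only the standard library (the digests come from the new `checksums.yaml`; the directory layout of the extracted gem is an assumption):

```ruby
require 'digest'

# SHA256 digests recorded in the updated checksums.yaml for aia 0.5.13
EXPECTED_SHA256 = {
  'metadata.gz' => '1487b5005351fcb62b10d7ccdfdfe0153a40b931b633562671cc28e9f554ad8f',
  'data.tar.gz' => '9d34fc975adb14a52d0cf62a541a7568af61b4da5cb963c326a83253613d6b30',
}.freeze

# Returns true only when every listed file exists and matches its digest.
def checksums_ok?(dir, expected = EXPECTED_SHA256)
  expected.all? do |name, digest|
    path = File.join(dir, name)
    File.exist?(path) && Digest::SHA256.file(path).hexdigest == digest
  end
end
```

`gem fetch aia -v 0.5.13` followed by `tar -xf aia-0.5.13.gem` yields the `metadata.gz` and `data.tar.gz` members this helper would verify.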
data/.semver CHANGED
@@ -1,6 +1,6 @@
  ---
  :major: 0
  :minor: 5
- :patch: 12
+ :patch: 13
  :special: ''
  :metadata: ''
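`.semver` stores the version as separate YAML components. A small sketch of how they compose into the release string, using stdlib YAML (aia's own semver tooling may differ):

```ruby
require 'yaml'

# The new contents of data/.semver
SEMVER = <<~YAML
  ---
  :major: 0
  :minor: 5
  :patch: 13
  :special: ''
  :metadata: ''
YAML

# Assemble "major.minor.patch", appending pre-release/build parts when present
def semver_string(yaml_text)
  v = YAML.safe_load(yaml_text, permitted_classes: [Symbol])
  s = "#{v[:major]}.#{v[:minor]}.#{v[:patch]}"
  s += "-#{v[:special]}"  unless v[:special].to_s.empty?
  s += "+#{v[:metadata]}" unless v[:metadata].to_s.empty?
  s
end
```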
data/CHANGELOG.md CHANGED
@@ -1,5 +1,8 @@
  ## [Unreleased]
 
+ ## [0.5.13] 2024-03-03
+ - Added CLI-utility `llm` as a backend processor
+
  ## [0.5.12] 2024-02-24
  - Happy Birthday Ruby!
  - Added --next CLI option
data/README.md CHANGED
@@ -6,16 +6,15 @@ It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt
 
  **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
 
+ > v0.5.13
+ > - Added an initial integration for CLI-tool `llm` as a backend processor
+ >   Its primary feature is its **ability to use local LLMs and APIs to keep all processing within your local workstation.**
+ >
  > v0.5.12
  > - Supports Prompt Sequencing
  > - Added --next option
  > - Added --pipeline option
- >
- > v0.5.11
- > - Allow directives to prepend content into the prompt text
- > - Added //include path_to_file
- > - Added //shell shell_command
- > - Added //ruby ruby code
+
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -48,6 +47,10 @@ It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt
  - [The --role Option](#the---role-option)
  - [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
  - [External CLI Tools Used](#external-cli-tools-used)
+ - [Optional External CLI-tools](#optional-external-cli-tools)
+   - [Backend Processor `llm`](#backend-processor-llm)
+   - [Backend Processor `sgpt`](#backend-processor-sgpt)
+   - [Occasionally Useful Tool `plz`](#occasionally-useful-tool-plz)
  - [Shell Completion](#shell-completion)
  - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [My Configuration](#my-configuration)
@@ -431,7 +434,28 @@ system environment variable 'EDITOR' like this:
 
  export EDITOR="subl -w"
 
+ ### Optional External CLI-tools
+
+ #### Backend Processor `llm`
+
+ ```
+ llm Access large language models from the command-line
+ | brew install llm
+ |__ https://llm.datasette.io/
+ ```
+
+ As of `aia v0.5.13` the `llm` backend processor is available in a limited integration. It is a very powerful Python-based implementation that has its own prompt-templating system. It is included in the `aia` environment primarily for its ability to make use of local LLM models.
+
+
+ #### Backend Processor `sgpt`
+
+ `shell-gpt` aka `sgpt` is also a Python implementation of a CLI tool that processes prompts through OpenAI. It has fewer features than either `mods` or `llm` and is less flexible.
+
+ #### Occasionally Useful Tool `plz`
+
+ `plz-cli` aka `plz` is not integrated with `aia`; however, it earns an honorable mention for its ability to accept a prompt tailored to doing something on the command line. Its response is a CLI command (sometimes a piped sequence) that accomplishes the task set forth in the prompt. It returns the commands to be executed against the data files you specified, along with a query asking whether to execute them.
 
+ - brew install plz-cli
 
  ## Shell Completion
 
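Each backend processor described above is ultimately just an external command. A hypothetical sketch of shelling one out from Ruby — the helper name and argument handling are illustrative, not aia's actual interface:

```ruby
require 'open3'

# Run an external backend CLI (e.g. "llm" or "mods") with the prompt as its
# final argument and return its stdout. Purely illustrative; aia's real
# dispatch lives in its AIA::Tools classes.
def run_backend(command, prompt, extra_args = [])
  stdout, stderr, status = Open3.capture3(command, *extra_args, prompt)
  raise "#{command} failed: #{stderr}" unless status.success?
  stdout
end

# Usage (requires the tool to be installed):
#   run_backend('llm', 'Five outrageous names for a pet pelican')
```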
lib/aia/tools/llm.rb ADDED
@@ -0,0 +1,77 @@
+ # lib/aia/tools/llm.rb
+
+ require_relative 'backend_common'
+
+ class AIA::Llm < AIA::Tools
+   include AIA::BackendCommon
+
+   meta(
+     name: 'llm',
+     role: :backend,
+     desc: "llm on the command line using local and remote models",
+     url: "https://llm.datasette.io/",
+     install: "brew install llm",
+   )
+
+
+   DEFAULT_PARAMETERS = [
+     # "--verbose", # enable verbose logging (if applicable)
+     # Add default parameters here
+   ].join(' ').freeze
+
+   DIRECTIVES = %w[
+     api_key
+     frequency_penalty
+     max_tokens
+     model
+     presence_penalty
+     stop_sequence
+     temperature
+     top_p
+   ]
+ end
+
+ __END__
+
+ #########################################################
+
+ llm, version 0.13.1
+
+ Usage: llm [OPTIONS] COMMAND [ARGS]...
+
+   Access large language models from the command-line
+
+   Documentation: https://llm.datasette.io/
+
+   To get started, obtain an OpenAI key and set it like this:
+
+     $ llm keys set openai
+     Enter key: ...
+
+   Then execute a prompt like this:
+
+     llm 'Five outrageous names for a pet pelican'
+
+ Options:
+   --version  Show the version and exit.
+   --help     Show this message and exit.
+
+ Commands:
+   prompt*       Execute a prompt
+   aliases       Manage model aliases
+   chat          Hold an ongoing chat with a model.
+   collections   View and manage collections of embeddings
+   embed         Embed text and store or return the result
+   embed-models  Manage available embedding models
+   embed-multi   Store embeddings for multiple strings at once
+   install       Install packages from PyPI into the same environment as LLM
+   keys          Manage stored API keys for different models
+   logs          Tools for exploring logged prompts and responses
+   models        Manage available models
+   openai        Commands for working directly with the OpenAI API
+   plugins       List installed plugins
+   similar       Return top N similar IDs from a collection
+   templates     Manage stored prompt templates
+   uninstall     Uninstall Python packages from the LLM environment
+
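The `DIRECTIVES` array in the new file lists the option names a prompt may set for this backend. A standalone sketch of the kind of name-to-flag translation a backend wrapper performs (simplified; the real logic lives in `AIA::BackendCommon`, which is not shown in this diff, and the generated flag spellings may not match `llm`'s actual CLI):

```ruby
# Option names accepted by the llm backend, as declared above
DIRECTIVES = %w[
  api_key
  frequency_penalty
  max_tokens
  model
  presence_penalty
  stop_sequence
  temperature
  top_p
].freeze

# Turn {"model" => "gpt-4", ...} into "--model gpt-4 ...", dropping any
# directive the backend does not recognize. Illustration only.
def directives_to_flags(settings, allowed = DIRECTIVES)
  settings
    .select { |name, _| allowed.include?(name.to_s) }
    .map    { |name, value| "--#{name.to_s.tr('_', '-')} #{value}" }
    .join(' ')
end
```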
lib/aia/tools/mods.rb CHANGED
@@ -8,7 +8,7 @@ class AIA::Mods < AIA::Tools
  meta(
    name: 'mods',
    role: :backend,
-   desc: 'AI on the command-line',
+   desc: 'GPT on the command line. Built for pipelines.',
    url: 'https://github.com/charmbracelet/mods',
    install: 'brew install mods',
  )
@@ -16,25 +16,35 @@ class AIA::Mods < AIA::Tools
 
  DEFAULT_PARAMETERS = [
    # "--no-cache", # do not save prompt and response
-   "--no-limit" # no limit on input context
+   "--no-limit", # no limit on input context
+   "--quiet", # Quiet mode (hide the spinner while loading and stderr messages for success).
  ].join(' ').freeze
 
 
  DIRECTIVES = %w[
-   api
-   fanciness
-   http-proxy
+   api
+   ask-model
+   continue
+   continue-last
+   fanciness
+   format-as
+   http-proxy
    max-retries
+   max-retries
+   max-tokens
    max-tokens
    model
    no-cache
    no-limit
-   quiet
-   raw
+   prompt
+   prompt-args
+   quiet
+   raw
    status-text
-   temp
-   title
-   topp
+   temp
+   title
+   topp
+   word-wrap
  ]
 end
 
@@ -43,6 +53,8 @@ __END__
 
  ##########################################################
 
+ mods version 1.2.1 (Homebrew)
+
  GPT on the command line. Built for pipelines.
 
  Usage:
@@ -50,9 +62,11 @@ Usage:
 
  Options:
  -m, --model Default model (gpt-3.5-turbo, gpt-4, ggml-gpt4all-j...).
+ -M, --ask-model Ask which model to use with an interactive prompt.
  -a, --api OpenAI compatible REST API (openai, localai).
  -x, --http-proxy HTTP proxy to use for API requests.
  -f, --format Ask for the response to be formatted as markdown unless otherwise set.
+ --format-as
  -r, --raw Render output as raw text when connected to a TTY.
  -P, --prompt Include the prompt from the arguments and stdin, truncate stdin to specified number of lines.
  -p, --prompt-args Include the prompt from the arguments in the response.
@@ -61,14 +75,16 @@ Options:
  -l, --list Lists saved conversations.
  -t, --title Saves the current conversation with the given title.
  -d, --delete Deletes a saved conversation with the given title or ID.
+ --delete-older-than Deletes all saved conversations older than the specified duration. Valid units are: ns, us, µs, μs, ms, s, m, h, d, w, mo, and y.
  -s, --show Show a saved conversation with the given title or ID.
- -S, --show-last Show a the last saved conversation.
+ -S, --show-last Show the last saved conversation.
  -q, --quiet Quiet mode (hide the spinner while loading and stderr messages for success).
  -h, --help Show help and exit.
  -v, --version Show version and exit.
  --max-retries Maximum number of times to retry API calls.
  --no-limit Turn off the client-side limit on the size of the input into the model.
  --max-tokens Maximum number of tokens in response.
+ --word-wrap Wrap formatted output at specific width (default is 80)
  --temp Temperature (randomness) of results, from 0.0 to 2.0.
  --topp TopP, an alternative to temperature that narrows response, from 0.0 to 1.0.
  --fanciness Your desired level of fanciness.
@@ -81,3 +97,4 @@ Options:
  Example:
  # Editorialize your video files
  ls ~/vids | mods -f "summarize each of these titles, group them by decade" | glow
+
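The `ls ~/vids | mods -f "…" | glow` example above can also be driven from Ruby. A minimal sketch of feeding lines through an arbitrary filter command; the `mods` invocation in the comment assumes the tool is installed and configured:

```ruby
require 'open3'

# Send a list of lines to a filter command's stdin and return its stdout,
# mirroring a shell pipeline such as `ls ~/vids | mods -f "summarize ..."`.
def pipe_through(lines, *command)
  out, status = Open3.capture2(*command, stdin_data: lines.join("\n"))
  raise "#{command.first} exited #{status.exitstatus}" unless status.success?
  out
end

# Usage sketch (assumes mods is installed and configured):
#   pipe_through(Dir.children(File.expand_path('~/vids')),
#                'mods', '-f', 'summarize each of these titles, group them by decade')
```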
data/man/aia.1 CHANGED
@@ -1,6 +1,6 @@
  .\" Generated by kramdown-man 1.0.1
  .\" https://github.com/postmodern/kramdown-man#readme
- .TH aia 1 "v0.5.12" AIA "User Manuals"
+ .TH aia 1 "v0.5.13" AIA "User Manuals"
  .SH NAME
  .PP
  aia \- command\-line interface for an AI assistant
@@ -180,6 +180,11 @@ OpenAI Platform Documentation
  .UE
  and working with OpenAI models\.
  .IP \(bu 2
+ llm
+ .UR https:\[sl]\[sl]llm\.datasette\.io\[sl]
+ .UE
+ for more information on \fBllm\fR \- A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine\.
+ .IP \(bu 2
  mods
  .UR https:\[sl]\[sl]github\.com\[sl]charmbracelet\[sl]mods
  .UE
data/man/aia.1.md CHANGED
@@ -1,4 +1,4 @@
- # aia 1 "v0.5.12" AIA "User Manuals"
+ # aia 1 "v0.5.13" AIA "User Manuals"
 
  ## NAME
 
@@ -176,6 +176,8 @@ if you want to specify them one at a time.
 
  - [OpenAI Platform Documentation](https://platform.openai.com/docs/overview) for more information on [obtaining access tokens](https://platform.openai.com/account/api-keys) and working with OpenAI models.
 
+ - [llm](https://llm.datasette.io/) for more information on `llm` - A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine.
+
  - [mods](https://github.com/charmbracelet/mods) for more information on `mods` - AI for the command line, built for pipelines. LLM based AI is really good at interpreting the output of commands and returning the results in CLI friendly text formats like Markdown. Mods is a simple tool that makes it super easy to use AI on the command line and in your pipelines. Mods works with [OpenAI](https://platform.openai.com/account/api-keys) and [LocalAI](https://github.com/go-skynet/LocalAI)
 
  - [sgpt](https://github.com/tbckr/sgpt) (aka shell-gpt) is a powerful command-line interface (CLI) tool designed for seamless interaction with OpenAI models directly from your terminal. Effortlessly run queries, generate shell commands or code, create images from text, and more, using simple commands. Streamline your workflow and enhance productivity with this powerful and user-friendly CLI tool.
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: aia
  version: !ruby/object:Gem::Version
-   version: 0.5.12
+   version: 0.5.13
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-02-25 00:00:00.000000000 Z
+ date: 2024-03-03 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: hashie
@@ -220,18 +220,17 @@ dependencies:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
- description: |
-   A command-line AI Assistante (aia) that provides pre-compositional
-   template prompt management to various backend gen-AI processes.
-   Complete shell integration allows a prompt to access system
-   environment variables and execut shell commands as part of the
-   prompt content. In addition full embedded Ruby support is provided
-   given even more dynamic prompt conditional content. It is a
-   generalized power house that rivals specialized gen-AI tools. aia
-   currently supports "mods" and "sgpt" CLI tools. aia uses "ripgrep"
-   and "fzf" CLI utilities to search for and select prompt files to
-   send to the backend gen-AI tool along with supported context
-   files.
+ description: A command-line AI Assistant (aia) that provides pre-compositional template
+   prompt management to various backend gen-AI processes such as llm, mods, and sgpt,
+   which support processing of prompts both via remote API calls and by keeping everything
+   local through the use of locally managed models and the LocalAI API. Complete shell
+   integration allows a prompt to access system environment variables and execute shell
+   commands as part of the prompt content. In addition, full embedded Ruby support
+   is provided, giving even more dynamic prompt conditional content. It is a generalized
+   powerhouse that rivals specialized gen-AI tools. aia currently supports the "mods"
+   and "sgpt" CLI tools. aia uses the "ripgrep" and "fzf" CLI utilities to search for
+   and select prompt files to send to the backend gen-AI tool along with supported
+   context files.
  email:
  - dvanhoozer@gmail.com
  executables:
@@ -268,6 +267,7 @@ files:
  - lib/aia/tools/editor.rb
  - lib/aia/tools/fzf.rb
  - lib/aia/tools/glow.rb
+ - lib/aia/tools/llm.rb
  - lib/aia/tools/mods.rb
  - lib/aia/tools/sgpt.rb
  - lib/aia/tools/subl.rb