gpt-cli 0.1.2 → 0.1.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: b82b15bff40eaab0d8c15e2273b275841d3501e8c0ae085443f35782175d3ab8
- data.tar.gz: de41d86f71219206252e937180099cbd07b1c77abef0b6e71271e3faf3d170d9
+ metadata.gz: e970f253447aaac501f343585e284d0b7254b8fc696f4a9ad74430b38861c833
+ data.tar.gz: f1f40e893b47c5bd66d47b8a7156b855156bcb8bda8334340e0a0659746f4c01
  SHA512:
- metadata.gz: 26ff8b088978790eebdc85904dc3c994a96ffa20d854fa0ba5123f52d3c8ff86b96e271bf975bb4b585f544c79becb6447dae9dd10cabb8a5cb28e030fc42104
- data.tar.gz: fb1d336d91b5efe7ebf7c48659dbb3bdf4e240c2f54b19e4001ec4806b730ad145686852ce1b5768cf75d29f92f39ac61393098cc9c3db5ba4601ed00e022404
+ metadata.gz: 3f0e9f58ee5f1f5a03fb20c217d25d3d6da505d3481e9fad56d53dcb465e759da3a48d524aff9e093ca3c69d3dac72fa688e6c9e4301c66d603e67a5bfd4d099
+ data.tar.gz: 20d1bc99c963c155bbd396890cf216d06a665983709a76d2bdeb5179842cef1b5f923b024c5f7ebe794477362156666bc60f67ba2b663564313de6736008e451
data/CHANGELOG.md CHANGED
@@ -4,7 +4,11 @@

  - Initial release

- ## [0.1.1] - 2023-03-23
+ ## [0.1.2] - 2023-03-23

  - Define open ai model in env var
- - Input context pompts
+ - Input context prompts
+
+ ## [0.1.3] - 2023-03-23
+
+ - Custom context prompts support
data/README.md CHANGED
@@ -1,7 +1,7 @@
  # GPTCLI

  This library provides a UNIX-ey interface to OpenAI.
- It is a fork of [openai_pipe](https://github.com/Aesthetikx/openai_pipe), created by [Aesthetikx](https://github.com/Aesthetikx/openai_pipe), kudos to him for his awesome work!
+ It is a fork of [openai_pipe](https://github.com/Aesthetikx/openai_pipe), created by [Aesthetikx](https://github.com/Aesthetikx/), kudos to him for his awesome work!
  The goal of this fork was to add the ability to choose which OpenAI model to use, as well as adding context prompts support. Thanks to these new features, you can have your own highly specialized GPT4 personal assistant, directly in your terminal! Productivity stonks 📈📈📈

  See [Installation](#installation) and [Setup](#setup) below, but first, some examples.
@@ -185,7 +185,11 @@ Set the model you want to use in ENV:
  ```bash
  export OPENAI_MODEL="gpt-3.5-turbo"
  ```
- (Optional) set the default context prompt you want to use (for now, there's only 3 basic prompts available `python`, `fullstack` and `blockchain` in `lib/contexts.rb`):
+ Copy `gpt_contexts.sample.json` somewhere, for example `~/Documents/gpt_contexts.json`, then put the file path in ENV:
+ ```bash
+ export OPENAI_CONTEXTS_PATH="path to gpt_contexts.json"
+ ```
+ (Optional) set the default context prompt you want to use (for now, there are only 3 basic prompts available, `python`, `fullstack` and `blockchain`, in `gpt_contexts.sample.json`):
  ```bash
  export OPENAI_DEFAULT_CONTEXT="python"
  ```
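
Taken together, a minimal setup sketch could look like this (the contexts path and the `python` default are illustrative values only, assuming the sample file was copied as described above):

```bash
# Illustrative values, not requirements
export OPENAI_MODEL="gpt-3.5-turbo"
export OPENAI_CONTEXTS_PATH="$HOME/Documents/gpt_contexts.json"
export OPENAI_DEFAULT_CONTEXT="python"   # optional
```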
@@ -198,8 +202,11 @@ alias gpt="gpt-cli"

  There's two optional parameters you can set when running `gpt`:
  `gpt -c <custom_context_prompt> -p <your prompt>`
- `--context -c`: this will be the context prompt, see basic contexts in `lib/contexts.rb`. You can put a key ofrom the contexts hash of `lib/contexts.rb` or a custom context.
+
+ `--context -c`: this will be the context prompt; see basic contexts in `gpt_contexts.sample.json`. You can put a key from `gpt_contexts.json` or a custom context. Defaults to ENV `OPENAI_DEFAULT_CONTEXT` if not set. Add your own context prompts in `gpt_contexts.json`.
+
  `--prompt -p`: your actual prompt.
+
  You can also run gpt without any arguments, just your question. In this case, the context prompt will default to the one defined by ENV var `OPENAI_DEFAULT_CONTEXT` if it exists. If not, no context will be added.
  See examples above for an overview of some usecases. Possibilities are endless.
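
For example, two illustrative invocations (assuming the `gpt` alias above and a `python` key present in your `gpt_contexts.json`):

```bash
# Use a context key defined in gpt_contexts.json
gpt -c python -p "Write a one-liner that counts the lines of Ruby code in this repo"

# Or pass a free-form custom context instead of a key
gpt -c "You are a terse senior code reviewer" -p "Review: puts 1/0"
```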
205
212
 
@@ -207,7 +214,13 @@ See examples above for an overview of some usecases. Possibilities are endless.

  Be aware that there is a cost associated every time GPT3 is invoked, so be mindful of your account usage. Also be wary of sending sensitive data to OpenAI, and also wary of arbitrarily executing scripts or programs that GPT3 generates.
  Also, this is my very first time working in Ruby. So please be indulgent 🙏
- A future release will replace `lib/contexts.rb` by a JSON file so user will be able to easily add/modify custom context prompts.
+
+ ## TODO
+
+ - Add conversation history support
+ - Add internet search support to feed prompt more accurately
+ - Add dall-e support
+ - Add proper documentation

  ## Development

data/gpt_contexts.sample.json ADDED
@@ -0,0 +1,5 @@
+ {
+ "python": "Imagine you are a senior software developer specialized in Python, Django, SQL and Machine Learning with 20 years of experience. You are very proficient at writing high quality, clean, concise and highly optimized code that follows the PEP recommendations. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
+ "fullstack": "Imagine you are a Senior Fullstack Engineer with 20 years of experience in UNIX systems, bash, Python, Django, SQL, Javascript, ReactJS. You are very proficient at writing high quality, clean, concise and well optimized code that follows the PEP recommendations. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
+ "blockchain": "Imagine you are Vitalik Buterin. You created Ethereum and you are the most famous blockchain developer in the world. You have a very deep knowledge about EVM, Solidity, and the whole blockchain ecosystem. You know everything about the latest trends in the blockchain space and you are very proficient at writing high quality, clean, concise and well optimized code that follows the good practices. You are my personal assistant. You are here to help me in various tasks and answer my questions."
+ }
data/lib/gpt-cli/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module GPTCLI
- VERSION = "0.1.2"
+ VERSION = "0.1.3"
  end
data/lib/gpt-cli.rb CHANGED
@@ -1,15 +1,16 @@
  # frozen_string_literal: true

  require_relative "gpt-cli/version"
- require_relative "contexts"

+ require "json"
  require "quick_openai"
  require "optparse"

  class ChatGPT
  def contexts
- file_contents = File.read(File.join(File.dirname(__FILE__), 'contexts.rb'))
- instance_eval(file_contents)
+ file_contents = File.read(ENV["OPENAI_CONTEXTS_PATH"])
+ contexts = JSON.parse(file_contents)
+ contexts.transform_keys(&:to_sym)
  end

  def gpt3(prompt, options)
@@ -44,7 +45,7 @@ module GPTCLI
  options = {}
  chatgpt = ChatGPT.new
  parser = OptionParser.new do |opts|
- opts.on('-c', '--context CONTEXT_KEY', 'Context key from contexts.rb') do |context_input|
+ opts.on('-c', '--context CONTEXT_KEY', 'Context key from contexts.json') do |context_input|
  options[:context] = chatgpt.contexts.key?(context_input.to_sym) ? chatgpt.contexts[context_input.to_sym] : context_input
  end
  opts.on('-p', '--prompt PROMPT_TEXT', 'Prompt text to be passed to GPT-3') do |prompt_text|
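
Since `contexts` now simply parses the JSON file referenced by `OPENAI_CONTEXTS_PATH`, a quick sanity check that a custom contexts file is valid JSON, and a way to list its keys, is a one-liner like the following. This is a convenience sketch, not part of the gem:

```bash
# Prints the available context keys; raises JSON::ParserError if the file is malformed
ruby -rjson -e 'puts JSON.parse(File.read(ENV.fetch("OPENAI_CONTEXTS_PATH"))).keys'
```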
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: gpt-cli
  version: !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.1.3
  platform: ruby
  authors:
  - John DeSilva
@@ -9,7 +9,7 @@ authors:
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-03-24 00:00:00.000000000 Z
+ date: 2023-03-25 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: quick_openai
@@ -43,7 +43,7 @@ files:
  - README.md
  - Rakefile
  - exe/gpt-cli
- - lib/contexts.rb
+ - gpt_contexts.sample.json
  - lib/gpt-cli.rb
  - lib/gpt-cli/version.rb
  - sig/gpt-cli.rbs
@@ -69,7 +69,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.3.26
+ rubygems_version: 3.3.15
  signing_key:
  specification_version: 4
  summary: A UNIX-ey interface to OpenAI
data/lib/contexts.rb DELETED
@@ -1,14 +0,0 @@
- contexts = {
- :python => %q(Imagine you are a senior software developer specialized in Python, Django, SQL and Machine Learning with 20 years of experience.
- You are very proficient at writing high quality, clean, concise and highly optimized code that follows the PEP recommendations.
- You are my personal assistant. You are here to help me in various tasks and answer my questions.),
-
- :fullstack => %q(Imagine you are an Senior Fullstack Engineer with 20 of experience in UNIX systems, bash, Python, Django, SQL, Javascript, ReactJS, NextJS.
- You are very proficient at writing high quality, clean, concise and well optimized code that follows the PEP recommendations.
- You are my personal assistant. You are here to help me in various tasks and answer my questions.),
-
- :blockchain => %q(Imagine you are Vitalik Buterin. You created Ethereum and you are the most famous blockchain developer in the world.
- You have a very deep knowledge about EVM, Solidity, and the whole blockchain ecosystem.
- You know everything about the latest trends in the blockchain space and you are very proficient at writing high quality, clean, concise and well optimized code that follows the good practices.
- You are my personal assistant. You are here to help me in various tasks and answer my questions.)
- }