gpt-cli 0.1.3 → 0.1.4

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: e970f253447aaac501f343585e284d0b7254b8fc696f4a9ad74430b38861c833
-  data.tar.gz: f1f40e893b47c5bd66d47b8a7156b855156bcb8bda8334340e0a0659746f4c01
+  metadata.gz: 50f343ce40c9b6b1a320305d13c63f53c27565e78b3eba260e24812ca4a59922
+  data.tar.gz: df9ed612ec576b66b925835a5ec8399104937261c5e86f4dac43573b7c842d30
 SHA512:
-  metadata.gz: 3f0e9f58ee5f1f5a03fb20c217d25d3d6da505d3481e9fad56d53dcb465e759da3a48d524aff9e093ca3c69d3dac72fa688e6c9e4301c66d603e67a5bfd4d099
-  data.tar.gz: 20d1bc99c963c155bbd396890cf216d06a665983709a76d2bdeb5179842cef1b5f923b024c5f7ebe794477362156666bc60f67ba2b663564313de6736008e451
+  metadata.gz: a8631cd80e9b5dd4609103b06309c578d06edbb2535b3ca9aa0d6a255126273e34fc84aadc06a13ef0cc51ac8b3ff8897855934508ea6bb9c1acac09eead7e48
+  data.tar.gz: ef833c19b1d265b4b41e7227f3bacadb2a7eba0546cef4598170d1cec7424665416bdd265def89fa6f36565d663cc497d3fd6d226acaebd2a0f68ec7c007d70c
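
As an aside, the digests above can be reproduced locally with Ruby's stdlib. A minimal sketch (the file paths are hypothetical; a `.gem` is a tar archive that contains `metadata.gz` and `data.tar.gz`):

```ruby
require 'digest'

# SHA256 of a file on disk, as listed under SHA256: in checksums.yaml.
# The path is a placeholder for an unpacked gem component.
def file_digest(path)
  Digest::SHA256.file(path).hexdigest
end

# The same API digests in-memory strings; a SHA256 hex digest is
# always 64 characters, a SHA512 hex digest 128.
puts Digest::SHA256.hexdigest("gpt-cli").length  # => 64
```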
data/CHANGELOG.md CHANGED
@@ -9,6 +9,12 @@
 - Define OpenAI model in env var
 - Input context prompts
 
-## [0.1.3] - 2023-03-23
+## [0.1.3] - 2023-03-25
 
-- Custom contexts prompts support
+- Custom contexts prompts support
+
+## [0.1.4] - 2023-03-25
+
+- Added messages history
+- Added more context prompts
+- Added Dall-E support
data/README.md CHANGED
@@ -1,17 +1,33 @@
 # GPTCLI
 
-This library provides a UNIX-ey interface to OpenAI.
-It is a fork of [openai_pipe](https://github.com/Aesthetikx/openai_pipe), created by [Aesthetikx](https://github.com/Aesthetikx/), kudos to him for his awesome work!
-The goal of this fork was to add the ability to choose which OpenAI model to use, as well as adding context prompts support. Thanks to these new features, you can have your own highly specialized GPT4 personal assistant, directly in your terminal! Productivity stonks 📈📈📈
+This library provides a UNIX-ey interface to OpenAI.
+Say hello to your own highly specialized GPT4 personal assistant, directly in your terminal!
+Productivity stonks 📈📈📈
+
+Consider 🌟 this repo if you find this tool useful 🔥
+
+## Features
+- Use the pipe to pass the output of commands to GPT or Dall-E.
+- Choose your GPT model; GPT4 supported if you have access.
+- Dall-E support: generate images directly in your terminal.
+- Context prompts tailored to your needs: add/update predefined context prompts or write your own when running the command.
+- Infinite conversation history: have chat sessions with GPT just like [chat.openai.com](https://chat.openai.com/) but in your terminal.
 
 See [Installation](#installation) and [Setup](#setup) below, but first, some examples.
 
 ## Examples
 
 ```console
-$ gpt what is two plus two
-Two plus two is equal to four.
+$ gpt "what is your role?"
+As your personal assistant, my role is to assist you in various tasks and answer your questions related to my areas of expertise, which include UNIX systems, bash, Python, Django, SQL, Javascript, ReactJS. I can help you with programming and development, server administration, debugging your code or scripts, optimizing performance, code review, providing recommendations for best practices, and more.
+```
+
+Use Dall-E:
+```console
+$ gpt give me a very short description of the moon landscape | gpt --dalle
+Image saved in current directory to 7db7ab2e8914175d4f4819e033226563.png
 ```
+<img src="https://raw.githubusercontent.com/FlorianMgs/gpt-cli/master/.github/7db7ab2e8914175d4f4819e033226563.png" height=256 width=256></img>
 
 ```console
 $ uptime | gpt convert this to json
@@ -27,26 +43,6 @@ $ uptime | gpt convert this to json
 }
 ```
 
-```console
-$ gpt -c you are Vitalik Buterin, the creator of Ethereum. You know very well the whole EVM ecosystem and how to write perfectly optimized Solidity smart contracts -p Write a simple ERC20 token smart contract using OpenZeppelin library, respond only by the smart contract, do no write explanations > erc20.sol
-pragma solidity ^0.8.0;
-
-import "@openzeppelin/contracts/token/ERC20/ERC20.sol";
-
-contract MyToken is ERC20 {
-  constructor(uint256 initialSupply) ERC20("MyToken", "MTK") {
-    _mint(msg.sender, initialSupply);
-  }
-}
-```
-
-Setting default context to `fullstack`:
-```console
-$ gpt "what is your role?"
-As your personal assistant, my role is to assist you in various tasks and answer your questions related to my areas of expertise, which include UNIX systems, bash, Python, Django, SQL, Javascript, ReactJS, and NextJS. I can help you with programming and development, server administration, debugging your code or scripts, optimizing performance, code review, providing recommendations for best practices, and more.
-
-```
-
 ```console
 $ gpt list the nine planets as JSON | gpt convert this to XML but in French | tee planets.fr.xml
 <Planètes>
@@ -62,6 +58,19 @@ $ gpt list the nine planets as JSON | gpt convert this to XML but in French | te
 </Planètes>
 ```
 
+```console
+$ gpt -c "you are Vitalik Buterin, the creator of Ethereum. You know very well the whole EVM ecosystem and how to write perfectly optimized Solidity smart contracts" -p "Write a simple ERC20 token smart contract using the OpenZeppelin library, respond only with the smart contract, do not write explanations" > erc20.sol
+pragma solidity ^0.8.0;
+
+import "@openzeppelin/contracts/token/ERC20/ERC20.sol";
+
+contract MyToken is ERC20 {
+  constructor(uint256 initialSupply) ERC20("MyToken", "MTK") {
+    _mint(msg.sender, initialSupply);
+  }
+}
+```
+
 ```console
 $ curl -sL "https://en.wikipedia.org/wiki/cats" | head -n 5 | gpt extract just the title of this webpage | figlet
 ____ _ __ ___ _ _ _ _
@@ -185,11 +194,11 @@ Set the model you want to use in ENV:
 ```bash
 export OPENAI_MODEL="gpt-3.5-turbo"
 ```
-Copy `gpt_contexts.sample.json` somewhere, for example `~/Documents/gpt_contexts.json`, then put the file path in ENV:
+Copy `gpt_contexts.sample.json` somewhere, for example `~/Documents/gpt_contexts.json`, then put the file path in ENV (don't forget to rename the file to `gpt_contexts.json`):
 ```bash
-export OPENAI_CONTEXTS_PATH="path to gpt_contexts.json"
+export OPENAI_CONTEXTS_PATH="path/to/gpt_contexts.json"
 ```
-(Optional) set the default context prompt you want to use (for now, there's only 3 basic prompts available `python`, `fullstack` and `blockchain` in `gpt_contexts.sample.json`):
+(Optional) set the default context prompt you want to use; see `gpt_contexts.sample.json` for examples:
 ```bash
 export OPENAI_DEFAULT_CONTEXT="python"
 ```
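
The setup above is entirely ENV-driven. Illustrative only (the gem reads `ENV["OPENAI_MODEL"]` directly; the fallback default below is an assumption, not the gem's behavior), the lookup pattern is:

```ruby
# ENV.fetch with a default is a common hardening of a plain ENV[...]
# lookup: it falls back instead of returning nil when the var is unset.
model   = ENV.fetch("OPENAI_MODEL", "gpt-3.5-turbo")
context = ENV["OPENAI_DEFAULT_CONTEXT"] # may be nil; then no context is added
puts model
```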
@@ -200,26 +209,31 @@ alias gpt="gpt-cli"
 
 ## Usage
 
-There's two optional parameters you can set when running `gpt`:
+There are optional parameters you can set when running `gpt`:
 `gpt -c <custom_context_prompt> -p <your prompt>`
 
-`--context -c`: this will be the context prompt, see basic contexts in `gpt_contexts.sample.json`. You can put a key from `gpt_contexts.json` or a custom context. Default to ENV `OPENAI_DEFAULT_CONTEXT` if not set. Add your own context prompts in `gpt_contexts.json`.
+`--context -c`: the context prompt; see basic contexts in `gpt_contexts.sample.json`. You can pass a key from `gpt_contexts.json` or a custom context. Defaults to the ENV var `OPENAI_DEFAULT_CONTEXT` if not set. Add your own context prompts in `gpt_contexts.json`.
 
 `--prompt -p`: your actual prompt.
 
-You can also run gpt without any arguments, just your question. In this case, the context prompt will default to the one defined by ENV var `OPENAI_DEFAULT_CONTEXT` if it exists. If not, no context will be added.
-See examples above for an overview of some usecases. Possibilities are endless.
+`--history -h`: print the conversation history.
+
+`--dalle -d`: generate an image from your prompt with Dall-E.
+
+`--clear`: clear the conversation history.
+
+You can also run gpt without any arguments, just with your prompt. In this case, the context prompt will default to the one defined by the ENV var `OPENAI_DEFAULT_CONTEXT` if it exists. If not, no context will be added.
+See the examples above for an overview of some use cases. Possibilities are endless.
 
 ## Notes
 
-Be aware that there is a cost associated every time GPT3 is invoked, so be mindful of your account usage. Also be wary of sending sensitive data to OpenAI, and also wary of arbitrarily executing scripts or programs that GPT3 generates.
+This is built on top of [openai_pipe](https://github.com/Aesthetikx/openai_pipe), created by [Aesthetikx](https://github.com/Aesthetikx/), kudos to him for his awesome work!
+Be aware that there is a cost associated every time GPT is invoked, so be mindful of your account usage. Also be wary of sending sensitive data to OpenAI, and of arbitrarily executing scripts or programs that GPT generates.
 Also, this is my very first time working in Ruby. So please be indulgent 🙏
 
 ## TODO
 
-- Add conversation history support
 - Add internet search support to feed prompt more accurately
-- Add dall-e support
 - Add proper documentation
 
 ## Development
data/gpt_contexts.sample.json CHANGED
@@ -1,5 +1,12 @@
 {
-  "python": "Imagine you are a senior software developer specialized in Python, Django, SQL and Machine Learning with 20 years of experience. You are very proficient at writing high quality, clean, concise and highly optimized code that follows the PEP recommendations. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
-  "fullstack": "Imagine you are an Senior Fullstack Engineer with 20 of experience in UNIX systems, bash, Python, Django, SQL, Javascript, ReactJS. You are very proficient at writing high quality, clean, concise and well optimized code that follows the PEP recommendations. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
-  "blockchain": "Imagine you are Vitalik Buterin. You created Ethereum and you are the most famous blockchain developer in the world.You have a very deep knowledge about EVM, Solidity, and the whole blockchain ecosystem.You know everything about the latest trends in the blockchain space and you are very proficient at writing high quality, clean, concise and well optimized code that follows the good practices.You are my personal assistant. You are here to help me in various tasks and answer my questions."
+  "python": "Imagine you are a senior software developer specialized in Python, Django, and SQL with 20 years of experience. You are very proficient at writing high quality, clean, concise and highly optimized code that follows the PEP recommendations. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
+  "fullstack": "Imagine you are a Senior Fullstack Engineer with 20 years of experience. You have a very deep knowledge of various backend tech stacks and frontend frameworks. You are very proficient at writing high quality, clean, concise and well optimized code. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
+  "blockchain": "Imagine you are Vitalik Buterin. You created Ethereum and you are the most famous blockchain developer in the world. You have a very deep knowledge of EVM, Solidity, and the whole blockchain ecosystem. You know everything about the latest trends in the blockchain space and you are very proficient at writing high quality, clean, concise and well optimized code that follows good practices. You are my personal assistant. You are here to help me in various tasks and answer my questions.",
+  "it_expert": "I want you to act as an IT Expert. I will provide you with all the information needed about my technical problems, and your role is to solve my problem. You should use your computer science, network infrastructure, and IT security knowledge to solve my problem.",
+  "ml_expert": "I want you to act as a machine learning engineer. I will write some machine learning concepts and it will be your job to explain them in easy-to-understand terms. This could include providing step-by-step instructions for building a model, demonstrating various techniques with visuals, or suggesting online resources for further study.",
+  "hacker": "I want you to act as a cyber security specialist. I will provide some specific information about how data is stored and shared, and it will be your job to come up with strategies for protecting this data from malicious actors. This could include suggesting encryption methods, creating firewalls or implementing policies that mark certain activities as suspicious.",
+  "math": "I want you to act as a math teacher. I will provide some mathematical equations or concepts, and it will be your job to explain them in easy-to-understand terms. This could include providing step-by-step instructions for solving a problem, demonstrating various techniques with visuals or suggesting online resources for further study.",
+  "tech_transferer": "I want you to act as a Technology Transferer. I will provide resume bullet points and you will map each bullet point from one technology to a different technology. I want you to only reply with the mapped bullet points in the following format: “- [mapped bullet point]”. Do not write explanations. Do not provide additional actions unless instructed. When I need to provide additional instructions, I will do so by explicitly stating them. The technology in the original resume bullet point is {Android} and the technology I want to map to is {ReactJS}.",
+  "prompt_engineer": "I want you to act as a ChatGPT prompt generator. I will send a topic, and you have to generate a ChatGPT prompt based on the content of the topic. The prompt should start with “I want you to act as”, guess what I might do, and expand the prompt accordingly to describe the content and make it useful.",
+  "linux_terminal": "I want you to act as a linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. Do not write explanations. Do not type commands unless I instruct you to do so. When I need to tell you something in English, I will do so by putting text inside curly brackets {like this}."
 }
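
The sample file above is a flat name-to-prompt JSON object; looking a context up by key (as the CLI does with `OPENAI_DEFAULT_CONTEXT`) is a one-liner. A minimal sketch with a made-up context:

```ruby
require 'json'

# A made-up context file in the same shape as gpt_contexts.sample.json.
raw = '{"python": "Imagine you are a senior Python developer."}'

# The CLI parses the file and looks contexts up by symbol key.
contexts = JSON.parse(raw, symbolize_names: true)
puts contexts[:python]  # => Imagine you are a senior Python developer.
```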
data/lib/gpt-cli/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module GPTCLI
-  VERSION = "0.1.3"
+  VERSION = "0.1.4"
 end
data/lib/gpt-cli.rb CHANGED
@@ -2,11 +2,32 @@
 
 require_relative "gpt-cli/version"
 
+require 'digest/md5'
 require "json"
 require "quick_openai"
 require "optparse"
 
 class ChatGPT
+  attr_accessor :messages
+  MESSAGES_FILE = 'message_history.json'
+
+  def initialize
+    @messages = load_messages
+  end
+
+  def save_messages
+    File.open(MESSAGES_FILE, 'w') do |file|
+      file.write(JSON.dump(@messages))
+    end
+  end
+
+  def load_messages
+    return [] unless File.exist?(MESSAGES_FILE)
+
+    file_contents = File.read(MESSAGES_FILE)
+    JSON.parse(file_contents, symbolize_names: true)
+  end
+
   def contexts
     file_contents = File.read(ENV["OPENAI_CONTEXTS_PATH"])
     contexts = JSON.parse(file_contents)
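
The history persistence added above is a plain JSON round-trip. The same pattern can be sketched in isolation (using a Tempfile instead of the gem's `message_history.json`):

```ruby
require 'json'
require 'tempfile'

messages = [{role: "user", content: "hello"}]

# Dump to disk, then parse back with symbolize_names: true so each
# message hash keeps its :role/:content symbol keys, as the gem expects.
file = Tempfile.new('message_history')
file.write(JSON.dump(messages))
file.rewind
restored = JSON.parse(file.read, symbolize_names: true)
puts restored == messages  # => true
```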
@@ -15,15 +36,14 @@ class ChatGPT
 
   def gpt3(prompt, options)
     context = options[:context] || contexts[ENV["OPENAI_DEFAULT_CONTEXT"].to_sym]
-    messages = [
-      {role: "user", content: prompt}
-    ]
-    messages.unshift({role: "system", content: context.gsub("\n", ' ').squeeze(' ')}) if context
+    context_message = {role: "system", content: context.gsub("\n", ' ').squeeze(' ')} if context
+    @messages << context_message if context_message && !@messages.include?(context_message)
+    @messages << {role: "user", content: prompt}
 
     parameters = {
       model: ENV["OPENAI_MODEL"],
       max_tokens: 2048,
-      messages: messages,
+      messages: @messages,
     }
 
     response = QuickOpenAI.fetch_response_from_client do |client|
@@ -33,7 +53,8 @@ class ChatGPT
     text = response.dig("choices", 0, "message", "content")
 
     raise QuickOpenAI::Error, "Unable to parse response." if text.nil? || text.empty?
-
+
+    @messages << {role: "assistant", content: text.chomp.strip}
     text.chomp.strip
   end
 end
@@ -51,20 +72,51 @@ module GPTCLI
       opts.on('-p', '--prompt PROMPT_TEXT', 'Prompt text to be passed to GPT-3') do |prompt_text|
         options[:prompt] = prompt_text
       end
+      opts.on('-h', '--history', 'Print the message history') do
+        options[:history] = true
+      end
+      opts.on('--clear', 'Clear the message history') do
+        options[:clear] = true
+      end
+      opts.on('-d', '--dalle', 'Use DALL-E instead of GPT. Prompt should be no more than 1000 characters.') do
+        options[:dalle] = true
+      end
     end
 
     remaining_args = parser.parse!(ARGV)
 
-    input_from_pipe = $stdin.read if $stdin.stat.pipe?
-    input_from_remaining_args = remaining_args.join(" ") if remaining_args.any?
+    if options[:history]
+      puts chatgpt.messages
+    elsif options[:clear]
+      File.delete(ChatGPT::MESSAGES_FILE) if File.exist?(ChatGPT::MESSAGES_FILE)
+      puts "Message history cleared."
+    else
+      input_from_pipe = $stdin.read if $stdin.stat.pipe?
+      input_from_remaining_args = remaining_args.join(" ") if remaining_args.any?
 
-    prompt = options[:prompt] || input_from_remaining_args || ""
-    full_prompt = [prompt, input_from_pipe].compact.join("\n\n")
-    full_prompt.strip!
+      prompt = options[:prompt] || input_from_remaining_args || ""
+      full_prompt = [prompt, input_from_pipe].compact.join("\n\n")
+      full_prompt.strip!
 
-    puts chatgpt.gpt3(full_prompt, options)
+      if options[:dalle]
+        if full_prompt.length > 1000
+          puts "Prompt is too long. Truncating to 1000 characters."
+          full_prompt = full_prompt[0...1000]
+        end
+        full_prompt.dalle2.then { |tempfile|
+          current_working_directory = Dir.pwd
+          filename = Digest::MD5.hexdigest(full_prompt)
+          output_file_path = File.join(current_working_directory, "#{filename}.png")
+          File.write(output_file_path, tempfile.read)
+          puts "Image saved in current directory to #{filename}.png"
+        }
+      else
+        puts chatgpt.gpt3(full_prompt, options)
+        chatgpt.save_messages
+      end
+    end
   rescue QuickOpenAI::Error => e
     warn e.message
     exit 1
   end
-end
+end
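
The Dall-E branch above derives the saved image's file name from an MD5 of the prompt, so identical prompts map to the same file instead of piling up. Just that naming step, in isolation:

```ruby
require 'digest/md5'

prompt = "a very short description of the moon landscape"

# MD5 hex digests are always 32 characters, matching the
# 7db7ab2e...png example file name in the README.
filename = "#{Digest::MD5.hexdigest(prompt)}.png"
puts filename.length  # => 36
```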
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: gpt-cli
 version: !ruby/object:Gem::Version
-  version: 0.1.3
+  version: 0.1.4
 platform: ruby
 authors:
 - John DeSilva