ruby-shell 2.8.0 → 2.9.0

This diff shows the changes between two publicly released versions of this package, as they appear in their public registry. It is provided for informational purposes only.
Files changed (4)
  1. checksums.yaml +4 -4
  2. data/README.md +25 -1
  3. data/bin/rsh +262 -1
  4. metadata +6 -6
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 55c80db8b211f00c2226a2bc38de34ca8bc448ff38545043f24931bde8441de6
- data.tar.gz: 7f70a5858de2c5d64800202a35af757b9bef6d864ccf76ab23dc673b82451fc1
+ metadata.gz: 3bfe09de4762828310479cc0535ec0942b575fc331d4f16bfc1614861cf14b8c
+ data.tar.gz: 264e440a9257eb49e7985a034fbfc8ec7a3be5fb1efc9e046d4ef7cf94116872
  SHA512:
- metadata.gz: 6ad59802df4fc00d2d39191080b4a8df1d90ad5c567078c16562992b64d0aa8ad3ca59fdd9a0ab0bcefc35f6b02774305bec3f83872d463d27c2570887376901
- data.tar.gz: 0a4cd352e6e20d25af578a689bffb6d460cc24f0b40328461b8a9ada40d3b03bda6e6c79674944170b752b7dc3787b9397869644025bc7925e90d54ce3bcff4e
+ metadata.gz: bc12e932035ded646ff4459c7d3ad299582102c3d208b8ddb1b529691f8cc21d02ec3e7d7c21b4a346b716eba495540361291b5e483438b712afe9c6fb0ea381
+ data.tar.gz: bf29a4a08d04835d7275a09c9bb0970d597a72a207a68e57e42755b433613dddac4d93bcae3a0152b74f6a8093562b8f748d8f33844bb736f0fdff66da83bfeb
data/README.md CHANGED
@@ -33,7 +33,16 @@ Or simply `gem install ruby-shell`.
  * All colors are themeable in .rshrc (see github link for possibilities)
  * Copy current command line to primary selection (paste w/middle button) with `Ctrl-y`

- ## NEW in v2.8.0 - Enhanced Help System & Nick Management
+ ## NEW in v2.9.0 - AI Integration
+ * **AI-powered command assistance**: Get help with commands using natural language
+ * **`@ <question>`**: Ask questions and get AI-generated text responses
+ * **`@@ <request>`**: Describe what you want to do, and AI suggests the command (see the example below)
+ * **Smart command suggestion**: AI suggestions appear directly on the command line, ready to execute
+ * **Local AI support**: Works with Ollama for privacy-focused local AI
+ * **External AI support**: Configure OpenAI or other providers via `.rshrc`
+ * **Syntax highlighting**: @ and @@ commands are highlighted in blue
+
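+ For example (an illustrative session; actual AI output varies with the model used):
+ ```
+ @ What is the capital of France?
+ The capital of France is Paris.
+
+ @@ list files sorted by size
+ ls -lS        # suggested command appears at the next prompt, ready to edit or run
+ ```
+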
+ ## Enhanced Help System & Nick Management (v2.8.0)
  * **Two-column help display**: Compact, organized help that fits on one screen
  * **New `:info` command**: Shows introduction and feature overview
  * **`:nickdel` and `:gnickdel`**: Intuitive commands to delete nicks and gnicks
@@ -81,6 +90,21 @@ Background jobs:
  * Use `:fg` or `:fg job_id` to bring jobs to foreground
  * Use `Ctrl-Z` to suspend running jobs, `:bg job_id` to resume them

+ ## AI Configuration
+ The AI features work out of the box with Ollama for local AI processing. To set up:
+
+ ### Local AI (Recommended)
+ 1. Install Ollama: `curl -fsSL https://ollama.com/install.sh | sh`
+ 2. Pull a model: `ollama pull llama3.2`
+ 3. That's it! Use `@ What is the capital of France?` or `@@ list files by size`
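+
+ If `@aimodel` is not set in `.rshrc`, rsh queries `ollama list` and uses the first model it finds. For example (illustrative output; your models will differ):
+ ```
+ $ ollama list
+ NAME               ID              SIZE      MODIFIED
+ llama3.2:latest    a80c4f17acb5    2.0 GB    2 days ago
+ ```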
+
+ ### External AI (OpenAI)
+ Add to your `.rshrc`:
+ ```ruby
+ @aimodel = "gpt-4"
+ @aikey = "your-api-key-here"
+ ```
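+ Note: only model names matching `gpt*` (together with `@aikey`) are routed to the OpenAI chat completions API; other external model names are currently reported as unsupported.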
+
  ## Moving around
  While you `cd` around to different directories, you can see the last 10 directories visited via the command `:dirs` or the convenient shortcut `#`. Entering the number in the list (like `6` and ENTER) will jump you to that directory. Entering `-` will jump you back to the previous dir (equivalent of entering `1`). Entering `~` will get you to your home dir. If you want to bookmark a special directory, you can do that via a general nick like this: `:gnick "x = /path/to/a/dir/"` - this would bookmark the directory to the single letter `x`.

data/bin/rsh CHANGED
@@ -8,7 +8,7 @@
  # Web_site: http://isene.com/
  # Github: https://github.com/isene/rsh
  # License: Public domain
- @version = "2.8.0" # Feature release: nickdel functions, improved help system, info command
+ @version = "2.9.0" # Feature release: AI integration with @ and @@ commands

  # MODULES, CLASSES AND EXTENSIONS
  class String # Add coloring to strings (with escaping for Readline)
@@ -109,6 +109,7 @@ begin # Initialization
  @dirs = ["."]*10
  @jobs = {} # Background jobs tracking
  @job_id = 0 # Job counter
+ @ai_suggestion = nil # Store AI command suggestion
  @last_exit = 0 # Last command exit status
  def pre_cmd; end # User-defined function to be run BEFORE command execution
  def post_cmd; end # User-defined function to be run AFTER command execution
@@ -138,6 +139,7 @@ end
  * rsh specific commands and full set of Ruby commands available via :<command>
  * All colors are themeable in .rshrc (see github link for possibilities)
  * Copy current command line to primary selection (paste w/middle button) with `Ctrl-y`
+ * AI integration: Use @ for text responses and @@ for command suggestions (requires ollama or OpenAI)

  Use `:help` for command reference.

@@ -234,6 +236,12 @@ def getstr # A custom Readline-like function
  @pos = 0
  chr = ""
  @history.unshift("")
+ # Check if we have an AI suggestion to pre-fill
+ if @ai_suggestion
+ @history[0] = @ai_suggestion
+ @pos = @ai_suggestion.length
+ @ai_suggestion = nil
+ end
  @row0, p = @c.pos
  while chr != "ENTER" # Keep going with readline until user presses ENTER
  @ci = nil
@@ -547,6 +555,14 @@ def hist_clean # Clean up @history
  end
  def cmd_check(str) # Check if each element on the readline matches commands, nicks, paths; color them
  return if str.nil?
+
+ # Special handling for @ and @@ commands
+ if str =~ /^(@@?)\s+(.*)$/
+ prefix = $1
+ rest = $2
+ return prefix.c(4) + " " + rest # Color @ or @@ in blue (4), rest uncolored
+ end
+
  str.gsub(/(?:\S'[^']*'|[^ '])+/) do |el|
  if @exe.include?(el)
  el.c(@c_cmd)
@@ -646,6 +662,10 @@ def help
  right_col << "= <expr> xrpn calculator"
  right_col << ":<ruby code> Execute Ruby"
  right_col << ""
+ right_col << "AI FEATURES:".c(@c_prompt).b
+ right_col << "@ <question> AI text response"
+ right_col << "@@ <request> AI command → prompt"
+ right_col << ""
  right_col << "EXPANSIONS:".c(@c_prompt).b
  right_col << "~ Home directory"
  right_col << "$VAR, ${VAR} Environment var"
@@ -859,6 +879,226 @@ def expand_braces(str)
  end
  end

+ # AI INTEGRATION FUNCTIONS
+ def get_ollama_model
+ begin
+ # Try to get list of available models
+ output = `ollama list 2>/dev/null`
+ return nil if output.empty? || $?.exitstatus != 0
+
+ # Parse the output to find a suitable model
+ lines = output.split("\n")
+ return nil if lines.length < 2
+
+ # Skip header line and get first available model
+ model_line = lines[1]
+ return nil if model_line.nil?
+
+ # Extract model name (first column)
+ model_name = model_line.split(/\s+/)[0]
+ return model_name
+ rescue => e
+ return nil
+ end
+ end
+
+ def ai_query(prompt)
+ # Get AI model configuration
+ model = @aimodel || nil
+ key = @aikey || nil
+
+ if model.nil? || model.empty?
+ # Try ollama first
+ if File.exist?("/usr/local/bin/ollama") || File.exist?("/usr/bin/ollama") || system("command -v ollama >/dev/null 2>&1")
+ begin
+ require 'json'
+ require 'net/http'
+
+ # First, get available models
+ ollama_model = get_ollama_model()
+ return ai_setup_help unless ollama_model
+
+ uri = URI('http://localhost:11434/api/generate')
+ http = Net::HTTP.new(uri.host, uri.port)
+ http.read_timeout = 30
+
+ request = Net::HTTP::Post.new(uri)
+ request['Content-Type'] = 'application/json'
+ request.body = {
+ model: ollama_model,
+ prompt: prompt,
+ stream: false,
+ options: {
+ num_predict: 200
+ }
+ }.to_json
+
+ response = http.request(request)
+ if response.code == '200'
+ result = JSON.parse(response.body)
+ return result['response']
+ else
+ return ai_setup_help
+ end
+ rescue => e
+ return ai_setup_help
+ end
+ else
+ return ai_setup_help
+ end
+ else
+ # Use external model
+ if model =~ /^gpt/i && key
+ begin
+ require 'json'
+ require 'net/http'
+ uri = URI('https://api.openai.com/v1/chat/completions')
+ http = Net::HTTP.new(uri.host, uri.port)
+ http.use_ssl = true
+ http.read_timeout = 30
+
+ request = Net::HTTP::Post.new(uri)
+ request['Content-Type'] = 'application/json'
+ request['Authorization'] = "Bearer #{key}"
+ request.body = {
+ model: model,
+ messages: [{role: 'user', content: prompt}],
+ max_tokens: 200,
+ temperature: 0.7
+ }.to_json
+
+ response = http.request(request)
+ if response.code == '200'
+ result = JSON.parse(response.body)
+ return result['choices'][0]['message']['content']
+ else
+ return "Error: #{response.code} - #{response.body}"
+ end
+ rescue => e
+ return "Error connecting to OpenAI: #{e.message}"
+ end
+ else
+ return "Unsupported model: #{model}. Currently only ollama and OpenAI models are supported."
+ end
+ end
+ end
+
+ def ai_command_suggest(prompt)
+ # Get AI model configuration
+ model = @aimodel || nil
+ key = @aikey || nil
+
+ # Modify prompt to request command output
+ cmd_prompt = "You are a Linux/Unix command line expert. Given this request: '#{prompt}', output ONLY the exact shell command that would accomplish this task. Output just the command itself with no explanation, no backticks, no markdown. For example, if asked 'list files' you would output: ls"
+
+ if model.nil? || model.empty?
+ # Try ollama first
+ if File.exist?("/usr/local/bin/ollama") || File.exist?("/usr/bin/ollama") || system("command -v ollama >/dev/null 2>&1")
+ begin
+ require 'json'
+ require 'net/http'
+
+ # First, get available models
+ ollama_model = get_ollama_model()
+ unless ollama_model
+ puts ai_setup_help
+ return nil
+ end
+
+ uri = URI('http://localhost:11434/api/generate')
+ http = Net::HTTP.new(uri.host, uri.port)
+ http.read_timeout = 30
+
+ request = Net::HTTP::Post.new(uri)
+ request['Content-Type'] = 'application/json'
+ request.body = {
+ model: ollama_model,
+ prompt: cmd_prompt,
+ stream: false,
+ options: {
+ num_predict: 50,
+ temperature: 0.3
+ }
+ }.to_json
+
+ response = http.request(request)
+ if response.code == '200'
+ result = JSON.parse(response.body)
+ cmd = result['response'].strip.split("\n")[0]
+ return cmd
+ else
+ puts ai_setup_help
+ return nil
+ end
+ rescue => e
+ puts ai_setup_help
+ return nil
+ end
+ else
+ puts ai_setup_help
+ return nil
+ end
+ else
+ # Use external model
+ if model =~ /^gpt/i && key
+ begin
+ require 'json'
+ require 'net/http'
+ uri = URI('https://api.openai.com/v1/chat/completions')
+ http = Net::HTTP.new(uri.host, uri.port)
+ http.use_ssl = true
+ http.read_timeout = 30
+
+ request = Net::HTTP::Post.new(uri)
+ request['Content-Type'] = 'application/json'
+ request['Authorization'] = "Bearer #{key}"
+ request.body = {
+ model: model,
+ messages: [{role: 'user', content: cmd_prompt}],
+ max_tokens: 50,
+ temperature: 0.3
+ }.to_json
+
+ response = http.request(request)
+ if response.code == '200'
+ result = JSON.parse(response.body)
+ cmd = result['choices'][0]['message']['content'].strip.split("\n")[0]
+ return cmd
+ else
+ puts "Error: #{response.code} - #{response.body}"
+ return nil
+ end
+ rescue => e
+ puts "Error connecting to OpenAI: #{e.message}"
+ return nil
+ end
+ else
+ puts "Unsupported model: #{model}. Currently only ollama and OpenAI models are supported."
+ return nil
+ end
+ end
+ end
+
+ def ai_setup_help
+ help_text = <<~HELP
+
+ AI is not configured. To use AI features, you have two options:
+
+ 1. Install Ollama (recommended for local AI):
+ curl -fsSL https://ollama.com/install.sh | sh
+ ollama pull llama3.2 # or any model you prefer
+
+ 2. Configure external AI model in ~/.rshrc:
+ @aimodel = "gpt-4"
+ @aikey = "your-api-key-here"
+
+ Once configured:
+ - Use @ for AI text responses: @ What is the GDP of Norway?
+ - Use @@ for AI command suggestions: @@ list files sorted by size
+ HELP
+ return help_text.c(@c_path)
+ end
+
  # INITIAL SETUP
  begin # Load .rshrc and populate @history
  trap "SIGINT" do end
@@ -935,6 +1175,10 @@ loop do
  hi = @history[@cmd.sub(/^!(\d+)$/, '\1').to_i+1]
  @cmd = hi if hi
  end
+ # Move cursor to end of line and print the full command before clearing
+ @c.row(@row0)
+ @c.clear_line
+ print @prompt + @cmd
  print "\n"; @c.clear_screen_down
  if @cmd == "r" # Integration with rtfm (https://github.com/isene/RTFM)
  t = Time.now
@@ -951,6 +1195,23 @@ loop do
  @cmd.gsub!(" ", ",")
  @cmd = "echo \"#{@cmd[1...]},prx,off\" | xrpn"
  end
+ # AI integration with @ and @@
+ if @cmd =~ /^@@\s+(.+)/ # AI command suggestion
+ prompt = $1
+ response = ai_command_suggest(prompt)
+ if response
+ # Store the suggestion for the next prompt
+ @ai_suggestion = response
+ # Also add to history for record keeping
+ @history.unshift(response)
+ end
+ next
+ elsif @cmd =~ /^@\s+(.+)/ # AI text response
+ prompt = $1
+ response = ai_query(prompt)
+ puts response if response
+ next
+ end
  if @cmd.match(/^\s*:/) # Ruby commands are prefixed with ":"
  begin
  eval(@cmd[1..-1])
metadata CHANGED
@@ -1,21 +1,21 @@
  --- !ruby/object:Gem::Specification
  name: ruby-shell
  version: !ruby/object:Gem::Version
- version: 2.8.0
+ version: 2.9.0
  platform: ruby
  authors:
  - Geir Isene
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-07-08 00:00:00.000000000 Z
+ date: 2025-07-28 00:00:00.000000000 Z
  dependencies: []
  description: 'A shell written in Ruby with extensive tab completions, aliases/nicks,
  history, syntax highlighting, theming, auto-cd, auto-opening files and more. UPDATE
- v2.8.0: Enhanced help system with two-column display, new :info command, :nickdel/:gnickdel
- commands for easier nick management. v2.7.0: Ruby Functions - define custom shell
- commands using full Ruby power! Also: job control, command substitution, variable
- expansion, conditional execution, and login shell support.'
+ v2.9.0: AI integration! Use @ for AI text responses and @@ for AI command suggestions.
+ Works with local Ollama or external providers like OpenAI. v2.8.0: Enhanced help
+ system, :info command, and easier nick management. v2.7.0: Ruby Functions, job control,
+ command substitution, and more.'
  email: g@isene.com
  executables:
  - rsh