ruby-shell 2.8.0 → 2.9.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (4)
  1. checksums.yaml +4 -4
  2. data/README.md +25 -1
  3. data/bin/rsh +267 -2
  4. metadata +6 -6
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 55c80db8b211f00c2226a2bc38de34ca8bc448ff38545043f24931bde8441de6
- data.tar.gz: 7f70a5858de2c5d64800202a35af757b9bef6d864ccf76ab23dc673b82451fc1
+ metadata.gz: 9ceae5891b4b0deebdf2e2876232999e8c64374f4730ab4d5e8ef09ff2dc3835
+ data.tar.gz: bee38e735fb5f9eb1b4b133471ea649faec9531c7138ef035c6c9a089db87f06
  SHA512:
- metadata.gz: 6ad59802df4fc00d2d39191080b4a8df1d90ad5c567078c16562992b64d0aa8ad3ca59fdd9a0ab0bcefc35f6b02774305bec3f83872d463d27c2570887376901
- data.tar.gz: 0a4cd352e6e20d25af578a689bffb6d460cc24f0b40328461b8a9ada40d3b03bda6e6c79674944170b752b7dc3787b9397869644025bc7925e90d54ce3bcff4e
+ metadata.gz: 7e586a0d583b1d3c33f441ce1dcabf001dc1a12d3a375581ad2337db8cab2822850455cf599b5a5e7ba21a31d750f2133c10b0a71c3e9bdc6ff62454c98296ee
+ data.tar.gz: 8b7ae87743f8fe9f0414dda5ae607f632ece3f8289131519d21d8ebc20d54d13d7d7361364b8eeef7d9c68d6af9006731790decbee4eb0862a159997af0f306c
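The checksums above are SHA256/SHA512 digests of the `metadata.gz` and `data.tar.gz` archives packed inside the `.gem` file. Recomputing such digests takes only Ruby's stdlib `digest`; a minimal sketch (the helper name and path handling are illustrative, not part of the gem):

```ruby
require 'digest'

# Recompute the digests recorded in checksums.yaml for a downloaded file.
def gem_digests(path)
  data = File.binread(path)
  { sha256: Digest::SHA256.hexdigest(data),
    sha512: Digest::SHA512.hexdigest(data) }
end

# Known-answer sanity check for the digest routine itself:
Digest::SHA256.hexdigest("abc")
# => "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
```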
data/README.md CHANGED
@@ -33,7 +33,16 @@ Or simply `gem install ruby-shell`.
  * All colors are themeable in .rshrc (see github link for possibilities)
  * Copy current command line to primary selection (paste w/middle button) with `Ctrl-y`
 
- ## NEW in v2.8.0 - Enhanced Help System & Nick Management
+ ## NEW in v2.9.0 - AI Integration
+ * **AI-powered command assistance**: Get help with commands using natural language
+ * **`@ <question>`**: Ask questions and get AI-generated text responses
+ * **`@@ <request>`**: Describe what you want to do, and AI suggests the command
+ * **Smart command suggestion**: AI suggestions appear directly on the command line, ready to execute
+ * **Local AI support**: Works with Ollama for privacy-focused local AI
+ * **External AI support**: Configure OpenAI or other providers via `.rshrc`
+ * **Syntax highlighting**: @ and @@ commands are highlighted in blue
+
+ ## Enhanced Help System & Nick Management (v2.8.0)
  * **Two-column help display**: Compact, organized help that fits on one screen
  * **New `:info` command**: Shows introduction and feature overview
  * **`:nickdel` and `:gnickdel`**: Intuitive commands to delete nicks and gnicks
@@ -81,6 +90,21 @@ Background jobs:
  * Use `:fg` or `:fg job_id` to bring jobs to foreground
  * Use `Ctrl-Z` to suspend running jobs, `:bg job_id` to resume them
 
+ ## AI Configuration
+ The AI features work out of the box with Ollama for local AI processing. To set up:
+
+ ### Local AI (Recommended)
+ 1. Install Ollama: `curl -fsSL https://ollama.com/install.sh | sh`
+ 2. Pull a model: `ollama pull llama3.2`
+ 3. That's it! Use `@ What is the capital of France?` or `@@ list files by size`
+
+ ### External AI (OpenAI)
+ Add to your `.rshrc`:
+ ```ruby
+ @aimodel = "gpt-4"
+ @aikey = "your-api-key-here"
+ ```
+
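The two `.rshrc` settings above are all rsh needs to talk to OpenAI: they get interpolated into a standard chat-completions request body. A minimal sketch of that body, assuming only the `json` stdlib (the variable values are examples):

```ruby
require 'json'

aimodel = "gpt-4"                       # value of @aimodel from .rshrc
prompt  = "list files sorted by size"   # what the user typed after @@

body = {
  model: aimodel,
  messages: [{ role: 'user', content: prompt }],
  max_tokens: 200,
  temperature: 0.7
}.to_json

# The body round-trips cleanly through JSON:
parsed = JSON.parse(body)
parsed["model"]                    # "gpt-4"
parsed["messages"][0]["content"]   # "list files sorted by size"
```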
  ## Moving around
  While you `cd` around to different directories, you can see the last 10 directories visited via the command `:dirs` or the convenient shortcut `#`. Entering the number in the list (like `6` and ENTER) will jump you to that directory. Entering `-` will jump you back to the previous dir (equivalent of `1`). Entering `~` will get you to your home dir. If you want to bookmark a special directory, you can do that via a general nick like this: `:gnick "x = /path/to/a/dir/"` - this would bookmark the directory to the single letter `x`.
data/bin/rsh CHANGED
@@ -8,7 +8,7 @@
  # Web_site: http://isene.com/
  # Github: https://github.com/isene/rsh
  # License: Public domain
- @version = "2.8.0" # Feature release: nickdel functions, improved help system, info command
+ @version = "2.9.1" # Bug fix: Improved command coloring for local executables and after ENTER
 
  # MODULES, CLASSES AND EXTENSIONS
  class String # Add coloring to strings (with escaping for Readline)
@@ -109,6 +109,7 @@ begin # Initialization
  @dirs = ["."]*10
  @jobs = {} # Background jobs tracking
  @job_id = 0 # Job counter
+ @ai_suggestion = nil # Store AI command suggestion
  @last_exit = 0 # Last command exit status
  def pre_cmd; end # User-defined function to be run BEFORE command execution
  def post_cmd; end # User-defined function to be run AFTER command execution
@@ -138,6 +139,7 @@ end
  * rsh specific commands and full set of Ruby commands available via :<command>
  * All colors are themeable in .rshrc (see github link for possibilities)
  * Copy current command line to primary selection (paste w/middle button) with `Ctrl-y`
+ * AI integration: Use @ for text responses and @@ for command suggestions (requires ollama or OpenAI)
 
  Use `:help` for command reference.
 
@@ -234,6 +236,12 @@ def getstr # A custom Readline-like function
  @pos = 0
  chr = ""
  @history.unshift("")
+ # Check if we have an AI suggestion to pre-fill
+ if @ai_suggestion
+   @history[0] = @ai_suggestion
+   @pos = @ai_suggestion.length
+   @ai_suggestion = nil
+ end
  @row0, p = @c.pos
  while chr != "ENTER" # Keep going with readline until user presses ENTER
  @ci = nil
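The pre-fill step above is consumed exactly once: the pending suggestion becomes the editable history entry, the cursor jumps to its end, and the suggestion is cleared so it cannot reappear. A standalone sketch, with plain local variables standing in for rsh's instance state:

```ruby
history = [""]               # getstr unshifts a fresh empty entry first
ai_suggestion = "ls -lhS"    # hypothetical suggestion stored by @@ earlier

if ai_suggestion
  history[0] = ai_suggestion       # pre-fill the editable line
  pos = ai_suggestion.length       # cursor lands at end of the suggestion
  ai_suggestion = nil              # consume it so it only appears once
end

history[0]   # "ls -lhS"
pos          # 7
```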
@@ -547,12 +555,24 @@ def hist_clean # Clean up @history
  end
  def cmd_check(str) # Check if each element on the readline matches commands, nicks, paths; color them
  return if str.nil?
+
+ # Special handling for @ and @@ commands
+ if str =~ /^(@@?)\s+(.*)$/
+   prefix = $1
+   rest = $2
+   return prefix.c(4) + " " + rest # Color @ or @@ in blue (4), rest uncolored
+ end
+
  str.gsub(/(?:\S'[^']*'|[^ '])+/) do |el|
+   clean_el = el.gsub("'", "")
  if @exe.include?(el)
  el.c(@c_cmd)
  elsif el == "cd"
  el.c(@c_cmd)
- elsif File.exist?(el.gsub("'", ""))
+ elsif clean_el =~ /^\.\/(.+)/ && File.exist?(clean_el) && File.executable?(clean_el)
+   # Color local executables starting with ./
+   el.c(@c_cmd)
+ elsif File.exist?(clean_el)
  el.c(@c_path)
  elsif @nick.include?(el)
  el.c(@c_nick)
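The added `^(@@?)\s+(.*)$` check captures the `@`/`@@` prefix separately from the rest of the line, so only the prefix gets the blue color. A quick demonstration of what the pattern does and does not match:

```ruby
AI_PREFIX = /^(@@?)\s+(.*)$/

"@ What is the capital of France?" =~ AI_PREFIX
[$1, $2]   # ["@", "What is the capital of France?"]

"@@ list files by size" =~ AI_PREFIX
[$1, $2]   # ["@@", "list files by size"]

"user@host" =~ AI_PREFIX   # nil: no leading @ followed by a space, so normal coloring applies
```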
@@ -646,6 +666,10 @@ def help
  right_col << "= <expr> xrpn calculator"
  right_col << ":<ruby code> Execute Ruby"
  right_col << ""
+ right_col << "AI FEATURES:".c(@c_prompt).b
+ right_col << "@ <question> AI text response"
+ right_col << "@@ <request> AI command → prompt"
+ right_col << ""
  right_col << "EXPANSIONS:".c(@c_prompt).b
  right_col << "~ Home directory"
  right_col << "$VAR, ${VAR} Environment var"
@@ -859,6 +883,226 @@ def expand_braces(str)
  end
  end
 
+ # AI INTEGRATION FUNCTIONS
+ def get_ollama_model
+   begin
+     # Try to get list of available models
+     output = `ollama list 2>/dev/null`
+     return nil if output.empty? || $?.exitstatus != 0
+
+     # Parse the output to find a suitable model
+     lines = output.split("\n")
+     return nil if lines.length < 2
+
+     # Skip header line and get first available model
+     model_line = lines[1]
+     return nil if model_line.nil?
+
+     # Extract model name (first column)
+     model_name = model_line.split(/\s+/)[0]
+     return model_name
+   rescue => e
+     return nil
+   end
+ end
+
+ def ai_query(prompt)
+   # Get AI model configuration
+   model = @aimodel || nil
+   key = @aikey || nil
+
+   if model.nil? || model.empty?
+     # Try ollama first
+     if File.exist?("/usr/local/bin/ollama") || File.exist?("/usr/bin/ollama") || system("command -v ollama >/dev/null 2>&1")
+       begin
+         require 'json'
+         require 'net/http'
+
+         # First, get available models
+         ollama_model = get_ollama_model()
+         return ai_setup_help unless ollama_model
+
+         uri = URI('http://localhost:11434/api/generate')
+         http = Net::HTTP.new(uri.host, uri.port)
+         http.read_timeout = 30
+
+         request = Net::HTTP::Post.new(uri)
+         request['Content-Type'] = 'application/json'
+         request.body = {
+           model: ollama_model,
+           prompt: prompt,
+           stream: false,
+           options: {
+             num_predict: 200
+           }
+         }.to_json
+
+         response = http.request(request)
+         if response.code == '200'
+           result = JSON.parse(response.body)
+           return result['response']
+         else
+           return ai_setup_help
+         end
+       rescue => e
+         return ai_setup_help
+       end
+     else
+       return ai_setup_help
+     end
+   else
+     # Use external model
+     if model =~ /^gpt/i && key
+       begin
+         require 'json'
+         require 'net/http'
+         uri = URI('https://api.openai.com/v1/chat/completions')
+         http = Net::HTTP.new(uri.host, uri.port)
+         http.use_ssl = true
+         http.read_timeout = 30
+
+         request = Net::HTTP::Post.new(uri)
+         request['Content-Type'] = 'application/json'
+         request['Authorization'] = "Bearer #{key}"
+         request.body = {
+           model: model,
+           messages: [{role: 'user', content: prompt}],
+           max_tokens: 200,
+           temperature: 0.7
+         }.to_json
+
+         response = http.request(request)
+         if response.code == '200'
+           result = JSON.parse(response.body)
+           return result['choices'][0]['message']['content']
+         else
+           return "Error: #{response.code} - #{response.body}"
+         end
+       rescue => e
+         return "Error connecting to OpenAI: #{e.message}"
+       end
+     else
+       return "Unsupported model: #{model}. Currently only ollama and OpenAI models are supported."
+     end
+   end
+ end
+
+ def ai_command_suggest(prompt)
+   # Get AI model configuration
+   model = @aimodel || nil
+   key = @aikey || nil
+
+   # Modify prompt to request command output
+   cmd_prompt = "You are a Linux/Unix command line expert. Given this request: '#{prompt}', output ONLY the exact shell command that would accomplish this task. Output just the command itself with no explanation, no backticks, no markdown. For example, if asked 'list files' you would output: ls"
+
+   if model.nil? || model.empty?
+     # Try ollama first
+     if File.exist?("/usr/local/bin/ollama") || File.exist?("/usr/bin/ollama") || system("command -v ollama >/dev/null 2>&1")
+       begin
+         require 'json'
+         require 'net/http'
+
+         # First, get available models
+         ollama_model = get_ollama_model()
+         unless ollama_model
+           puts ai_setup_help
+           return nil
+         end
+
+         uri = URI('http://localhost:11434/api/generate')
+         http = Net::HTTP.new(uri.host, uri.port)
+         http.read_timeout = 30
+
+         request = Net::HTTP::Post.new(uri)
+         request['Content-Type'] = 'application/json'
+         request.body = {
+           model: ollama_model,
+           prompt: cmd_prompt,
+           stream: false,
+           options: {
+             num_predict: 50,
+             temperature: 0.3
+           }
+         }.to_json
+
+         response = http.request(request)
+         if response.code == '200'
+           result = JSON.parse(response.body)
+           cmd = result['response'].strip.split("\n")[0]
+           return cmd
+         else
+           puts ai_setup_help
+           return nil
+         end
+       rescue => e
+         puts ai_setup_help
+         return nil
+       end
+     else
+       puts ai_setup_help
+       return nil
+     end
+   else
+     # Use external model
+     if model =~ /^gpt/i && key
+       begin
+         require 'json'
+         require 'net/http'
+         uri = URI('https://api.openai.com/v1/chat/completions')
+         http = Net::HTTP.new(uri.host, uri.port)
+         http.use_ssl = true
+         http.read_timeout = 30
+
+         request = Net::HTTP::Post.new(uri)
+         request['Content-Type'] = 'application/json'
+         request['Authorization'] = "Bearer #{key}"
+         request.body = {
+           model: model,
+           messages: [{role: 'user', content: cmd_prompt}],
+           max_tokens: 50,
+           temperature: 0.3
+         }.to_json
+
+         response = http.request(request)
+         if response.code == '200'
+           result = JSON.parse(response.body)
+           cmd = result['choices'][0]['message']['content'].strip.split("\n")[0]
+           return cmd
+         else
+           puts "Error: #{response.code} - #{response.body}"
+           return nil
+         end
+       rescue => e
+         puts "Error connecting to OpenAI: #{e.message}"
+         return nil
+       end
+     else
+       puts "Unsupported model: #{model}. Currently only ollama and OpenAI models are supported."
+       return nil
+     end
+   end
+ end
+
+ def ai_setup_help
+   help_text = <<~HELP
+
+     AI is not configured. To use AI features, you have two options:
+
+     1. Install Ollama (recommended for local AI):
+        curl -fsSL https://ollama.com/install.sh | sh
+        ollama pull llama3.2  # or any model you prefer
+
+     2. Configure external AI model in ~/.rshrc:
+        @aimodel = "gpt-4"
+        @aikey = "your-api-key-here"
+
+     Once configured:
+     - Use @ for AI text responses: @ What is the GDP of Norway?
+     - Use @@ for AI command suggestions: @@ list files sorted by size
+   HELP
+   return help_text.c(@c_path)
+ end
+
  # INITIAL SETUP
  begin # Load .rshrc and populate @history
  trap "SIGINT" do end
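`get_ollama_model` above simply takes the first column of the second line of `ollama list` output (header row, then one row per model). The parsing step can be seen on canned input; the sample rows below are illustrative of the table format, not real output:

```ruby
# Typical `ollama list` shape: a header row, then one row per model.
sample = <<~OUT
  NAME            ID            SIZE    MODIFIED
  llama3.2:latest a80c4f17acd5  2.0 GB  3 days ago
OUT

lines = sample.split("\n")
model = lines.length < 2 ? nil : lines[1].split(/\s+/)[0]
model   # "llama3.2:latest"
```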
@@ -935,6 +1179,10 @@ loop do
  hi = @history[@cmd.sub(/^!(\d+)$/, '\1').to_i+1]
  @cmd = hi if hi
  end
+ # Move cursor to end of line and print the full command before clearing
+ @c.row(@row0)
+ @c.clear_line
+ print @prompt + cmd_check(@cmd)
  print "\n"; @c.clear_screen_down
  if @cmd == "r" # Integration with rtfm (https://github.com/isene/RTFM)
  t = Time.now
@@ -951,6 +1199,23 @@ loop do
  @cmd.gsub!(" ", ",")
  @cmd = "echo \"#{@cmd[1...]},prx,off\" | xrpn"
  end
+ # AI integration with @ and @@
+ if @cmd =~ /^@@\s+(.+)/ # AI command suggestion
+   prompt = $1
+   response = ai_command_suggest(prompt)
+   if response
+     # Store the suggestion for the next prompt
+     @ai_suggestion = response
+     # Also add to history for record keeping
+     @history.unshift(response)
+   end
+   next
+ elsif @cmd =~ /^@\s+(.+)/ # AI text response
+   prompt = $1
+   response = ai_query(prompt)
+   puts response if response
+   next
+ end
  if @cmd.match(/^\s*:/) # Ruby commands are prefixed with ":"
  begin
  eval(@cmd[1..-1])
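The dispatch above checks the more specific `@@` pattern before `@`, then falls through to normal command handling. The same classification in a standalone, testable form (the helper name is mine, not rsh's):

```ruby
# Classify an input line the way the main loop above does.
def ai_kind(cmd)
  if cmd =~ /^@@\s+(.+)/
    [:suggest, $1]    # @@ -> AI command suggestion, pre-filled at next prompt
  elsif cmd =~ /^@\s+(.+)/
    [:text, $1]       # @  -> AI text response, printed directly
  else
    [:normal, cmd]    # everything else runs as a shell command
  end
end

ai_kind("@@ list files by size")   # [:suggest, "list files by size"]
ai_kind("@ What is Ruby?")         # [:text, "What is Ruby?"]
ai_kind("ls -l")                   # [:normal, "ls -l"]
```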
metadata CHANGED
@@ -1,21 +1,21 @@
  --- !ruby/object:Gem::Specification
  name: ruby-shell
  version: !ruby/object:Gem::Version
- version: 2.8.0
+ version: 2.9.1
  platform: ruby
  authors:
  - Geir Isene
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-07-08 00:00:00.000000000 Z
+ date: 2025-09-13 00:00:00.000000000 Z
  dependencies: []
  description: 'A shell written in Ruby with extensive tab completions, aliases/nicks,
  history, syntax highlighting, theming, auto-cd, auto-opening files and more. UPDATE
- v2.8.0: Enhanced help system with two-column display, new :info command, :nickdel/:gnickdel
- commands for easier nick management. v2.7.0: Ruby Functions - define custom shell
- commands using full Ruby power! Also: job control, command substitution, variable
- expansion, conditional execution, and login shell support.'
+ v2.9.0: AI integration! Use @ for AI text responses and @@ for AI command suggestions.
+ Works with local Ollama or external providers like OpenAI. v2.8.0: Enhanced help
+ system, :info command, and easier nick management. v2.7.0: Ruby Functions, job control,
+ command substitution, and more.'
  email: g@isene.com
  executables:
  - rsh