elelem 0.4.1 → 0.5.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 0a92bbacb94cf0c5d56ce42a83f48efc0079f59cbafb066a7fdd3a8c382cadc1
- data.tar.gz: 8bdf5148c6e082a070dd528bde2165093cb0e34a82d5c9e7260e24eef0c5b9ca
+ metadata.gz: 5358e2fd855f1b8848a191c4db7160cc7eb296a73451d56f69da7e67b1b83313
+ data.tar.gz: 728dcd83f11d131af93a99cc08b68c3141b57c7ee36d69fb038736220e628eb1
  SHA512:
- metadata.gz: f9f80b2af42fc637d70dc1fbffc72af9fdb1e1465c7db4a54adf789ae02861e8c306319cb35ebe3152a28bd22fed9202fdd21dba54415492593ea98cfb5fed4d
- data.tar.gz: 6fb3ea9891e2ae9f9e5377791bae7ac46d89303134d8a1d65de87e550be7ddb38f1f12d83cc3e257b32623e232317eb2fc4b48ed3a2d92adf6e2d402d80ae04b
+ metadata.gz: 2bcc2bb0271cb1188a68d390922b59a0c1007aaf8f8980c0981c26d60b75d6cb666c481702d72db92d61db4824402d88a9bbfffce0623b73ba133ca8557270df
+ data.tar.gz: 1b36b16c23223f574942de9bf72ea46f34d5e090eae8523c53baeccbd64ce26239de21938f80adf673f76326e87c9344e011723cb53407ed5f3df7d9647a1c3c
data/CHANGELOG.md CHANGED
@@ -1,9 +1,38 @@
  ## [Unreleased]

+ ## [0.5.0] - 2025-01-07
+
+ ### Added
+ - Multi-provider support: Ollama, Anthropic, OpenAI, and VertexAI
+ - `--provider` CLI option to select LLM provider (default: ollama)
+ - `--model` CLI option to override default model
+ - Tool aliases (`bash` also accepts `exec`, `shell`, `command`, `terminal`, `run`)
+ - Thinking text output for models that support extended thinking
+
+ ### Changed
+ - Requires net-llm >= 0.5.0 with unified fetch interface
+ - Updated gem description to reflect multi-provider support
+
+ ## [0.4.2] - 2025-12-01
+
+ ### Changed
+ - Renamed `exec` tool to `bash` for clarity
+ - Improved system prompt with iterative refinements
+ - Added environment context variables to system prompt
+
  ## [0.4.1] - 2025-11-26

  ### Added
- - Updated version to 0.4.1
+ - `elelem files` subcommand: generates Claude‑compatible XML file listings.
+ - Rake task `files:prompt` to output a ready‑to‑copy list of files for prompts.
+
+ ### Changed
+ - Refactor tool‑call formatting to a more compact JSON payload for better LLM parsing.
+ - Updated CI and documentation to use GitHub instead of previous hosting.
+ - Runtime validation of command‑line parameters against a JSON schema.
+
+ ### Fixed
+ - Minor documentation and CI workflow adjustments.

  ## [0.4.0] - 2025-11-10

@@ -122,16 +151,3 @@

  - Initial release

- ## [0.4.2] - 2025-11-27
-
- ### Added
- - `elelem files` subcommand: generates Claude‑compatible XML file listings.
- - Rake task `files:prompt` to output a ready‑to‑copy list of files for prompts.
-
- ### Changed
- - Refactor tool‑call formatting to a more compact JSON payload for better LLM parsing.
- - Updated CI and documentation to use GitHub instead of previous hosting.
- - Runtime validation of command‑line parameters against a JSON schema.
-
- ### Fixed
- - Minor documentation and CI workflow adjustments.
data/README.md CHANGED
@@ -63,7 +63,7 @@ gem install elelem

  ## Usage

- Start an interactive chat session with an Ollama model:
+ Start an interactive chat session:

  ```bash
  elelem chat
@@ -71,20 +71,36 @@ elelem chat

  ### Options

- * `--host` – Ollama host (default: `localhost:11434`).
- * `--model` – Ollama model (default: `gpt-oss`).
- * `--token` – Authentication token.
+ * `--provider` – LLM provider: `ollama`, `anthropic`, `openai`, or `vertex-ai` (default: `ollama`).
+ * `--model` – Override the default model for the selected provider.

  ### Examples

  ```bash
- # Default model
+ # Default (Ollama)
  elelem chat

- # Specific model and host
- elelem chat --model llama2 --host remote-host:11434
+ # Anthropic Claude
+ ANTHROPIC_API_KEY=sk-... elelem chat --provider anthropic
+
+ # OpenAI
+ OPENAI_API_KEY=sk-... elelem chat --provider openai
+
+ # VertexAI (uses gcloud ADC)
+ elelem chat --provider vertex-ai --model claude-sonnet-4@20250514
  ```

+ ### Provider Configuration
+
+ Each provider reads its configuration from environment variables:
+
+ | Provider | Environment Variables |
+ |-------------|---------------------------------------------------|
+ | ollama | `OLLAMA_HOST` (default: localhost:11434) |
+ | anthropic | `ANTHROPIC_API_KEY` |
+ | openai | `OPENAI_API_KEY`, `OPENAI_BASE_URL` |
+ | vertex-ai | `GOOGLE_CLOUD_PROJECT`, `GOOGLE_CLOUD_REGION` |
+
  ## Mode System

  The agent exposes seven built‑in tools. You can switch which ones are
@@ -125,13 +141,13 @@ seven tools, each represented by a JSON schema that the LLM can call.

  | Tool | Purpose | Parameters |
  | ---- | ------- | ---------- |
+ | `bash` | Run shell commands | `cmd`, `args`, `env`, `cwd`, `stdin` |
  | `eval` | Dynamically create new tools | `code` |
  | `grep` | Search Git‑tracked files | `query` |
  | `list` | List tracked files | `path` (optional) |
+ | `patch` | Apply a unified diff via `git apply` | `diff` |
  | `read` | Read file contents | `path` |
  | `write` | Overwrite a file | `path`, `content` |
- | `patch` | Apply a unified diff via `git apply` | `diff` |
- | `execute` | Run shell commands | `cmd`, `args`, `env`, `cwd`, `stdin` |

  ## Tool Definition

@@ -148,8 +164,7 @@ arguments as a hash.

  ## Contributing

- Feel free to open issues or pull requests. The repository follows the
- GitHub Flow.
+ Send me an email. For instructions see https://git-send-email.io/.

  ## License
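The new Provider Configuration table pairs with the examples above: for `vertex-ai`, the two Google Cloud variables select the project and region. A minimal sketch, assuming a placeholder project ID and region (only the variable names come from the table; the values are illustrative):

```bash
# vertex-ai reads project and region from the environment; values below are placeholders
export GOOGLE_CLOUD_PROJECT=my-project
export GOOGLE_CLOUD_REGION=us-east5
elelem chat --provider vertex-ai --model claude-sonnet-4@20250514
```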
 
data/lib/elelem/agent.rb CHANGED
@@ -66,6 +66,7 @@ module Elelem
  end

  def format_tool_call_result(result)
+ return if result.nil?
  return result["stdout"] if result["stdout"]
  return result["stderr"] if result["stderr"]
  return result[:error] if result[:error]
@@ -73,6 +74,21 @@ module Elelem
  ""
  end

+ def format_tool_calls_for_api(tool_calls)
+ tool_calls.map do |tc|
+ args = openai_client? ? JSON.dump(tc[:arguments]) : tc[:arguments]
+ {
+ id: tc[:id],
+ type: "function",
+ function: { name: tc[:name], arguments: args }
+ }
+ end
+ end
+
+ def openai_client?
+ client.is_a?(Net::Llm::OpenAI)
+ end
+
  def execute_turn(messages, tools:)
  turn_context = []

@@ -80,32 +96,31 @@
  content = ""
  tool_calls = []

- print "Thinking..."
- client.chat(messages + turn_context, tools) do |chunk|
- msg = chunk["message"]
- if msg
- if msg["content"] && !msg["content"].empty?
- print "\r\e[K" if content.empty?
- print msg["content"]
- content += msg["content"]
- end
-
- tool_calls += msg["tool_calls"] if msg["tool_calls"]
+ print "Thinking> "
+ client.fetch(messages + turn_context, tools) do |chunk|
+ case chunk[:type]
+ when :delta
+ print chunk[:thinking] if chunk[:thinking]
+ content += chunk[:content] if chunk[:content]
+ when :complete
+ content = chunk[:content] if chunk[:content]
+ tool_calls = chunk[:tool_calls] || []
  end
  end

- puts
- turn_context << { role: "assistant", content: content, tool_calls: tool_calls }.compact
+ puts "\nAssistant> #{content}" unless content.to_s.empty?
+ api_tool_calls = tool_calls.any? ? format_tool_calls_for_api(tool_calls) : nil
+ turn_context << { role: "assistant", content: content, tool_calls: api_tool_calls }.compact

  if tool_calls.any?
  tool_calls.each do |call|
- name = call.dig("function", "name")
- args = call.dig("function", "arguments")
+ name = call[:name]
+ args = call[:arguments]

- puts "Tool> #{name}(#{args})"
+ puts "\nTool> #{name}(#{args})"
  result = toolbox.run_tool(name, args)
  puts format_tool_call_result(result)
- turn_context << { role: "tool", content: JSON.dump(result) }
+ turn_context << { role: "tool", tool_call_id: call[:id], content: JSON.dump(result) }
  end

  tool_calls = []
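The rewritten `execute_turn` above switches from the Ollama-specific `client.chat` stream to the unified `fetch` interface in net-llm >= 0.5.0. A minimal consumer sketch, using only the chunk shapes visible in this hunk (`:delta` carrying optional `:thinking`/`:content`, `:complete` carrying the final `:content` and `:tool_calls`); the require path and the zero-argument `Ollama.new` default are assumptions taken from the gem name and from `build_client` further down:

```ruby
# Sketch of the unified streaming interface used in execute_turn above.
require "net/llm"   # require path assumed from the gem name
require "json"

client   = Net::Llm::Ollama.new   # provider default, as in build_client
messages = [{ role: "user", content: "Summarize this repository." }]
tools    = []                     # tool schemas would normally come from Toolbox

content    = ""
tool_calls = []

client.fetch(messages, tools) do |chunk|
  case chunk[:type]
  when :delta      # streamed thinking text and partial content
    print chunk[:thinking] if chunk[:thinking]
    content += chunk[:content] if chunk[:content]
  when :complete   # final message wins; tool calls arrive here
    content    = chunk[:content] if chunk[:content]
    tool_calls = chunk[:tool_calls] || []
  end
end

puts content
tool_calls.each { |call| puts "#{call[:name]}(#{JSON.dump(call[:arguments])})" }
```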
@@ -2,27 +2,40 @@

  module Elelem
  class Application < Thor
+ PROVIDERS = %w[ollama anthropic openai vertex-ai].freeze
+
  desc "chat", "Start the REPL"
- method_option :host,
- aliases: "--host",
+ method_option :provider,
+ aliases: "-p",
  type: :string,
- desc: "Ollama host",
- default: ENV.fetch("OLLAMA_HOST", "localhost:11434")
+ desc: "LLM provider (#{PROVIDERS.join(', ')})",
+ default: ENV.fetch("ELELEM_PROVIDER", "ollama")
  method_option :model,
- aliases: "--model",
+ aliases: "-m",
  type: :string,
- desc: "Ollama model",
- default: ENV.fetch("OLLAMA_MODEL", "gpt-oss")
+ desc: "Model name (uses provider default if not specified)"
  def chat(*)
- client = Net::Llm::Ollama.new(
- host: options[:host],
- model: options[:model],
- )
- say "Agent (#{options[:model]})", :green
+ client = build_client
+ say "Agent (#{options[:provider]}/#{client.model})", :green
  agent = Agent.new(client, Toolbox.new)
  agent.repl
  end

+ private
+
+ def build_client
+ model_opts = options[:model] ? { model: options[:model] } : {}
+
+ case options[:provider]
+ when "ollama" then Net::Llm::Ollama.new(**model_opts)
+ when "anthropic" then Net::Llm::Anthropic.new(**model_opts)
+ when "openai" then Net::Llm::OpenAI.new(**model_opts)
+ when "vertex-ai" then Net::Llm::VertexAI.new(**model_opts)
+ else
+ raise Error, "Unknown provider: #{options[:provider]}. Use: #{PROVIDERS.join(', ')}"
+ end
+ end
+
  desc "files", "Generate CXML of the files"
  def files
  puts '<documents>'
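Beyond what the README documents, the options above gain short aliases (`-p`, `-m`) and an `ELELEM_PROVIDER` environment default. A usage sketch (the model value reuses the README example; API keys are assumed to be exported as described in the provider table):

```bash
# short aliases added in this hunk
elelem chat -p vertex-ai -m claude-sonnet-4@20250514

# provider taken from the ELELEM_PROVIDER default instead of --provider
ELELEM_PROVIDER=openai elelem chat
```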
@@ -45,19 +45,19 @@ module Elelem

  case mode.sort
  when [:read]
- "#{base}\n\nRead and analyze. Understand before suggesting action."
+ "#{base}\n\nYou may read files on the system."
  when [:write]
- "#{base}\n\nWrite clean, thoughtful code."
+ "#{base}\n\nYou may write files on the system."
  when [:execute]
- "#{base}\n\nUse shell commands creatively to understand and manipulate the system."
+ "#{base}\n\nYou may execute shell commands on the system."
  when [:read, :write]
- "#{base}\n\nFirst understand, then build solutions that integrate well."
+ "#{base}\n\nYou may read and write files on the system."
  when [:execute, :read]
- "#{base}\n\nUse commands to deeply understand the system."
+ "#{base}\n\nYou may execute shell commands and read files on the system."
  when [:execute, :write]
- "#{base}\n\nCreate and execute freely. Have fun. Be kind."
+ "#{base}\n\nYou may execute shell commands and write files on the system."
  when [:execute, :read, :write]
- "#{base}\n\nYou have all tools. Use them wisely."
+ "#{base}\n\nYou may read files, write files and execute shell commands on the system."
  else
  base
  end
@@ -1,5 +1,15 @@
- You are a reasoning coding and system agent working from: <%= Dir.pwd %>.
+ You are a reasoning coding and system agent.

- - Less is more
- - No code comments
- - No trailing whitespace
+ ## System
+
+ Operating System: <%= `uname -a` %>
+ USER: <%= ENV['USER'] %>
+ HOME: <%= ENV['HOME'] %>
+ SHELL: <%= ENV['SHELL'] %>
+ PATH: <%= ENV['PATH'] %>
+ PWD: <%= ENV['PWD'] %>
+ LANG: <%= ENV['LANG'] %>
+ EDITOR: <%= ENV['EDITOR'] %>
+ LOGNAME: <%= ENV['LOGNAME'] %>
+ TERM: <%= ENV['TERM'] %>
+ MAIL: <%= ENV['MAIL'] %>
data/lib/elelem/tool.rb CHANGED
@@ -11,7 +11,9 @@ module Elelem
  end

  def call(args)
- return ArgumentError.new(args) unless valid?(args)
+ unless valid?(args)
+ return { error: "Invalid args for #{@name}", received: args.keys, expected: @schema.dig(:function, :parameters, :required) }
+ end

  @block.call(args)
  end
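With this change, an invalid call returns a structured error hash instead of an `ArgumentError` instance. A hedged sketch of the new behaviour against the `read` tool defined in toolbox.rb, assuming the tool schema keeps its required list under `function.parameters.required` (the exact `expected` value depends on how `Tool.build` stores the schema):

```ruby
require "elelem"

tool = Elelem::Toolbox::READ_TOOL

# "path" is required; passing a different key should fail validation.
result = tool.call({ "file" => "README.md" })

# Expected shape, per the `unless valid?(args)` branch above (illustrative):
# { error: "Invalid args for read", received: ["file"], expected: ["path"] }
puts result.inspect
```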
@@ -2,13 +2,14 @@

  module Elelem
  class Toolbox
+
  READ_TOOL = Tool.build("read", "Read complete contents of a file. Requires exact file path.", { path: { type: "string" } }, ["path"]) do |args|
  path = args["path"]
  full_path = Pathname.new(path).expand_path
  full_path.exist? ? { content: full_path.read } : { error: "File not found: #{path}" }
  end

- EXEC_TOOL = Tool.build("execute", "Run shell commands. For git: execute({\"cmd\": \"git\", \"args\": [\"log\", \"--oneline\"]}). Returns stdout/stderr/exit_status.", { cmd: { type: "string" }, args: { type: "array", items: { type: "string" } }, env: { type: "object", additionalProperties: { type: "string" } }, cwd: { type: "string", description: "Working directory (defaults to current)" }, stdin: { type: "string" } }, ["cmd"]) do |args|
+ BASH_TOOL = Tool.build("bash", "Run shell commands. For git: bash({\"cmd\": \"git\", \"args\": [\"log\", \"--oneline\"]}). Returns stdout/stderr/exit_status.", { cmd: { type: "string" }, args: { type: "array", items: { type: "string" } }, env: { type: "object", additionalProperties: { type: "string" } }, cwd: { type: "string", description: "Working directory (defaults to current)" }, stdin: { type: "string" } }, ["cmd"]) do |args|
  Elelem.shell.execute(
  args["cmd"],
  args: args["args"] || [],
@@ -36,13 +37,20 @@ module Elelem
  { bytes_written: full_path.write(args["content"]) }
  end

+ TOOL_ALIASES = {
+ "exec" => "bash",
+ "execute" => "bash",
+ "open" => "read",
+ "search" => "grep",
+ }
+
  attr_reader :tools

  def initialize
  @tools_by_name = {}
  @tools = { read: [], write: [], execute: [] }
  add_tool(eval_tool(binding), :execute)
- add_tool(EXEC_TOOL, :execute)
+ add_tool(BASH_TOOL, :execute)
  add_tool(GREP_TOOL, :read)
  add_tool(LIST_TOOL, :read)
  add_tool(PATCH_TOOL, :write)
@@ -64,7 +72,8 @@
  end

  def run_tool(name, args)
- @tools_by_name[name]&.call(args) || { error: "Unknown tool", name: name, args: args }
+ resolved_name = TOOL_ALIASES.fetch(name, name)
+ @tools_by_name[resolved_name]&.call(args) || { error: "Unknown tool", name: name, args: args }
  rescue => error
  { error: error.message, name: name, args: args, backtrace: error.backtrace.first(5) }
  end
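The alias table and the `run_tool` change above mean that models still emitting the pre-0.4.2 tool names keep working. A short sketch of the resolution path, assuming the gem is loadable and the shell backend is configured (the commands are harmless `echo` calls):

```ruby
require "elelem"

toolbox = Elelem::Toolbox.new

# Both names resolve to the bash tool via TOOL_ALIASES before dispatch.
toolbox.run_tool("exec",    { "cmd" => "echo", "args" => ["hello"] })
toolbox.run_tool("execute", { "cmd" => "echo", "args" => ["hello"] })

# Names with no alias and no registered tool fall through to the error hash.
toolbox.run_tool("compile", { "cmd" => "make" })
# => { error: "Unknown tool", name: "compile", args: { "cmd" => "make" } }
```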
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module Elelem
- VERSION = "0.4.1"
+ VERSION = "0.5.0"
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: elelem
  version: !ruby/object:Gem::Version
- version: 0.4.1
+ version: 0.5.0
  platform: ruby
  authors:
  - mo khan
@@ -13,171 +13,177 @@ dependencies:
  name: erb
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '6.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '6.0'
  - !ruby/object:Gem::Dependency
  name: fileutils
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  name: json
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '2.0'
  - !ruby/object:Gem::Dependency
  name: json-schema
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '6.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '6.0'
  - !ruby/object:Gem::Dependency
  name: logger
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  name: net-llm
  requirement: !ruby/object:Gem::Requirement
  requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.5'
  - - ">="
  - !ruby/object:Gem::Version
- version: '0'
+ version: 0.5.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.5'
  - - ">="
  - !ruby/object:Gem::Version
- version: '0'
+ version: 0.5.0
  - !ruby/object:Gem::Dependency
  name: open3
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.1'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.1'
  - !ruby/object:Gem::Dependency
  name: pathname
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.1'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.1'
  - !ruby/object:Gem::Dependency
  name: reline
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.6'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.6'
  - !ruby/object:Gem::Dependency
  name: set
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  name: thor
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  name: timeout
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
+ version: '0.1'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: '0'
- description: A REPL for Ollama.
+ version: '0.1'
+ description: A minimal coding agent supporting Ollama, Anthropic, OpenAI, and VertexAI.
  email:
  - mo@mokhan.ca
  executables:
@@ -198,14 +204,14 @@ files:
  - lib/elelem/tool.rb
  - lib/elelem/toolbox.rb
  - lib/elelem/version.rb
- homepage: https://github.com/xlgmokha/elelem
+ homepage: https://src.mokhan.ca/xlgmokha/elelem
  licenses:
  - MIT
  metadata:
  allowed_push_host: https://rubygems.org
- homepage_uri: https://github.com/xlgmokha/elelem
- source_code_uri: https://github.com/xlgmokha/elelem
- changelog_uri: https://github.com/xlgmokha/elelem/blob/main/CHANGELOG.md
+ homepage_uri: https://src.mokhan.ca/xlgmokha/elelem
+ source_code_uri: https://src.mokhan.ca/xlgmokha/elelem
+ changelog_uri: https://src.mokhan.ca/xlgmokha/elelem/blob/main/CHANGELOG.md.html
  rdoc_options: []
  require_paths:
  - lib
@@ -222,5 +228,5 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  requirements: []
  rubygems_version: 3.7.2
  specification_version: 4
- summary: A REPL for Ollama.
+ summary: A minimal coding agent for LLMs.
  test_files: []