vsm 0.1.0 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (49)
  1. checksums.yaml +4 -4
  2. data/README.md +144 -0
  3. data/Rakefile +1 -5
  4. data/examples/01_echo_tool.rb +5 -24
  5. data/examples/02_openai_streaming.rb +3 -3
  6. data/examples/02b_anthropic_streaming.rb +1 -4
  7. data/examples/03b_anthropic_tools.rb +1 -4
  8. data/examples/05_mcp_server_and_chattty.rb +63 -0
  9. data/examples/06_mcp_mount_reflection.rb +45 -0
  10. data/examples/07_connect_claude_mcp.rb +78 -0
  11. data/examples/08_custom_chattty.rb +63 -0
  12. data/examples/09_mcp_with_llm_calls.rb +49 -0
  13. data/examples/10_meta_read_only.rb +56 -0
  14. data/exe/vsm +17 -0
  15. data/lib/vsm/async_channel.rb +26 -3
  16. data/lib/vsm/capsule.rb +2 -0
  17. data/lib/vsm/cli.rb +78 -0
  18. data/lib/vsm/dsl.rb +41 -11
  19. data/lib/vsm/dsl_mcp.rb +36 -0
  20. data/lib/vsm/generator/new_project.rb +154 -0
  21. data/lib/vsm/generator/templates/Gemfile.erb +9 -0
  22. data/lib/vsm/generator/templates/README_md.erb +40 -0
  23. data/lib/vsm/generator/templates/Rakefile.erb +5 -0
  24. data/lib/vsm/generator/templates/bin_console.erb +11 -0
  25. data/lib/vsm/generator/templates/bin_setup.erb +7 -0
  26. data/lib/vsm/generator/templates/exe_name.erb +34 -0
  27. data/lib/vsm/generator/templates/gemspec.erb +24 -0
  28. data/lib/vsm/generator/templates/gitignore.erb +10 -0
  29. data/lib/vsm/generator/templates/lib_name_rb.erb +9 -0
  30. data/lib/vsm/generator/templates/lib_organism_rb.erb +44 -0
  31. data/lib/vsm/generator/templates/lib_ports_chat_tty_rb.erb +12 -0
  32. data/lib/vsm/generator/templates/lib_tools_read_file_rb.erb +32 -0
  33. data/lib/vsm/generator/templates/lib_version_rb.erb +6 -0
  34. data/lib/vsm/mcp/client.rb +80 -0
  35. data/lib/vsm/mcp/jsonrpc.rb +92 -0
  36. data/lib/vsm/mcp/remote_tool_capsule.rb +35 -0
  37. data/lib/vsm/meta/snapshot_builder.rb +121 -0
  38. data/lib/vsm/meta/snapshot_cache.rb +25 -0
  39. data/lib/vsm/meta/support.rb +35 -0
  40. data/lib/vsm/meta/tools.rb +498 -0
  41. data/lib/vsm/meta.rb +59 -0
  42. data/lib/vsm/ports/chat_tty.rb +112 -0
  43. data/lib/vsm/ports/mcp/server_stdio.rb +101 -0
  44. data/lib/vsm/roles/intelligence.rb +6 -2
  45. data/lib/vsm/version.rb +1 -1
  46. data/lib/vsm.rb +10 -0
  47. data/mcp_update.md +162 -0
  48. metadata +38 -18
  49. data/.rubocop.yml +0 -8
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 9bdbe6ab3817a25d90829e9880efb85860948748281eedd885004411ff0221b3
-  data.tar.gz: 7b47991a4af3167b8c3968e41e062dedad7f6050a74af7ad5f4f48952ccedb72
+  metadata.gz: cc432d13c4f757abb289f153f28cfb6c77a2585f5607a135579225009610eac4
+  data.tar.gz: beb5f88da44e032ce7f135c66d2fa0cc314eef3a05c1d76338f6bf8686d4ef41
 SHA512:
-  metadata.gz: 1dd4ee44eaef8be76d7a02890312f7b3a4a99216ddcd613655cd8c47f81c9c188c471f164bfc2bd69071d9a5b392d89dc803e7600ae811f49f2db8b7b55bdc3f
-  data.tar.gz: 7fe63b320f208604cfe40e0440bec76f9ef8dbceb0a419cb2f7501fce7f4d8097c42ce01d6d9e61345e272a3f6d438e663de94ca8af3eee52f44fc8d401780d4
+  metadata.gz: 808ebdb904246a2ef7ed4e6a5e2f6b66792ab8b6794e04e96e56a4ab5eb05be5fbe7d36d1bc1edf2ea87f64a243491151c9478c8eebb813b90fc0847cf1efc74
+  data.tar.gz: 0bd7328481b990604fbc40abe6050ee179734e144d9f6e7f70ace363752e0b245cc2aaf026b2c5175b68a22b175238e83c0570a70a75ee5dd56298ba9395c443
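
For context only (not part of the diff): the values swapped above are plain hex digests of the two archives inside the `.gem` file, reproducible with Ruby's standard `Digest` library. A minimal sketch, using an illustrative stand-in string rather than the real archive bytes:

```ruby
require "digest"

# checksums.yaml records SHA256 and SHA512 digests of metadata.gz and
# data.tar.gz. Given the raw bytes of an archive, the recorded values
# are just hex digests:
def gem_checksums(bytes)
  { "SHA256" => Digest::SHA256.hexdigest(bytes),
    "SHA512" => Digest::SHA512.hexdigest(bytes) }
end

# Illustrative stand-in for real archive bytes:
sums = gem_checksums("example archive contents")
puts sums["SHA256"].length # 64 hex characters, as in the diff above
puts sums["SHA512"].length # 128 hex characters
```

Real verification would compare these against the archives downloaded from the registry.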
data/README.md CHANGED
@@ -165,6 +165,28 @@ ruby quickstart.rb
 # Tool> you said: hello
 ```
 
+## Project Generator (CLI)
+
+Scaffold a new VSM app with a ChatTTY interface:
+
+```bash
+gem install vsm # or build/install locally
+vsm new my_agent
+cd my_agent
+bundle install
+bundle exec exe/my-agent
+```
+
+Options:
+- `--with-llm openai|anthropic|gemini` — choose LLM provider (default: openai)
+- `--model <name>` — default model
+- `--git` — initialize git and commit
+- `--bundle` — run `bundle install`
+- `--path <dir>` — target directory (default: `./<name>`)
+- `--force` — overwrite an existing non-empty directory
+
+Generated layout mirrors the `airb` example: an `Organism.build` to assemble the capsule, a default `ChatTTY` port, and a sample `echo` tool ready to extend.
+
 ## Building a Real Agent
 
 For a real agent with LLM integration:
@@ -225,6 +247,37 @@ Your `MyLLMIntelligence` would:
 - **Observability**: append‑only JSONL ledger you can feed into a UI later
 - **POODR/SOLID**: small objects, high cohesion, low coupling
 
+## Meta Tools
+
+VSM includes a set of read‑only meta tools you can attach to any capsule to inspect its structure and code:
+
+- `meta_summarize_self` — Summarize the current capsule including roles and tools
+- `meta_list_tools` — List all tools available in the organism (descriptors and paths)
+- `meta_explain_tool` — Show code and context for a specific tool
+- `meta_explain_role` — Explain a role implementation for a capsule, with source snippets
+
+Attach them when building your capsule:
+
+```ruby
+capsule = VSM::DSL.define(:my_agent) do
+  identity     klass: VSM::Identity, args: { identity: "my_agent" }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: VSM::Intelligence
+  monitoring   klass: VSM::Monitoring
+  operations do
+    meta_tools # registers the four meta tools above on this capsule
+  end
+end
+```
+
+Example calls:
+
+- `meta_summarize_self {}` → high‑level snapshot and counts
+- `meta_list_tools {}` → array of tools with descriptors
+- `meta_explain_tool { "tool": "some_tool" }` → code snippet + descriptor
+- `meta_explain_role { "role": "coordination" }` → role class, constructor args, source locations, and code blocks
+
 ## Core Concepts
 
 ### Capsule
@@ -350,6 +403,91 @@ Start everything:
 VSM::Runtime.start(capsule, ports: [MyPort.new(capsule:)])
 ```
 
+### Built-in Ports
+
+- `VSM::Ports::ChatTTY` — A generic, customizable chat terminal UI. Safe to run alongside MCP stdio; prefers `IO.console` so it won’t pollute stdout.
+- `VSM::Ports::MCP::ServerStdio` — Exposes your capsule as an MCP server on stdio implementing `tools/list` and `tools/call`.
+
+Enable them:
+
+```ruby
+require "vsm/ports/chat_tty"
+require "vsm/ports/mcp/server_stdio"
+
+ports = [
+  VSM::Ports::MCP::ServerStdio.new(capsule: capsule), # machine IO (stdio)
+  VSM::Ports::ChatTTY.new(capsule: capsule)           # human IO (terminal)
+]
+VSM::Runtime.start(capsule, ports: ports)
+```
+
+### MCP Client (reflect and wrap tools)
+
+Reflect tools from an external MCP server and expose them as local tools using the DSL. This uses a tiny stdio JSON‑RPC client under the hood.
+
+```ruby
+require "vsm/dsl_mcp"
+
+cap = VSM::DSL.define(:mcp_client) do
+  identity     klass: VSM::Identity, args: { identity: "mcp_client", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: VSM::Intelligence # or your own
+  monitoring   klass: VSM::Monitoring
+  operations do
+    # Prefix helps avoid name collisions
+    mcp_server :smith, cmd: "smith-server --stdio", prefix: "smith_", include: %w[search read]
+  end
+end
+```
+
+See `examples/06_mcp_mount_reflection.rb` and `examples/07_connect_claude_mcp.rb`.
+
+Note: Many MCP servers speak LSP-style `Content-Length` framing on stdio. The
+current minimal transport uses NDJSON for simplicity. If a server hangs or
+doesn't respond, switch the transport to LSP framing in `lib/vsm/mcp/jsonrpc.rb`.
+
+### Customizing ChatTTY
+
+You can customize ChatTTY via options or by subclassing to override only the banner and rendering methods, while keeping the input loop.
+
+```ruby
+class FancyTTY < VSM::Ports::ChatTTY
+  def banner(io)
+    io.puts "\e[95m\n ███ CUSTOM CHAT ███\n\e[0m"
+  end
+
+  def render_out(m)
+    super # or implement your own formatting
+  end
+end
+
+VSM::Runtime.start(capsule, ports: [FancyTTY.new(capsule: capsule, prompt: "Me> ")])
+```
+
+See `examples/08_custom_chattty.rb`.
+
+### LLM-driven MCP tools
+
+Use an LLM driver (e.g., OpenAI) to automatically call tools reflected from an MCP server:
+
+```ruby
+driver = VSM::Drivers::OpenAI::AsyncDriver.new(api_key: ENV.fetch("OPENAI_API_KEY"), model: ENV["AIRB_MODEL"] || "gpt-4o-mini")
+cap = VSM::DSL.define(:mcp_with_llm) do
+  identity     klass: VSM::Identity, args: { identity: "mcp_with_llm", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: VSM::Intelligence, args: { driver: driver, system_prompt: "Use tools when helpful." }
+  monitoring   klass: VSM::Monitoring
+  operations do
+    mcp_server :server, cmd: ["claude","mcp","serve"] # reflect tools
+  end
+end
+VSM::Runtime.start(cap, ports: [VSM::Ports::ChatTTY.new(capsule: cap)])
+```
+
+See `examples/09_mcp_with_llm_calls.rb`.
+
 ## Observability
 
 VSM ships a tiny Monitoring role that writes an append‑only JSONL ledger:
@@ -364,6 +502,12 @@ VSM ships a tiny Monitoring role that writes an append‑only JSONL ledger:
 
 Use it to power a TUI/HTTP "Lens" later. Because everything flows over the bus, you get consistent events across nested capsules and sub‑agents.
 
+### MCP and ChatTTY Coexistence
+
+- MCP stdio port only reads stdin and writes strict JSON to stdout.
+- ChatTTY prefers `IO.console` or falls back to stderr and disables input if no TTY.
+- You can run both in the same process: machine protocol on stdio, human UI on the terminal.
+
 ## Writing an Intelligence
 
 The Intelligence role is where you plan/decide. It might:
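
Aside (not part of the shipped diff): the NDJSON vs LSP-style `Content-Length` framing distinction the README note describes is easy to prototype in isolation. A minimal sketch of both framings over an in-memory stream; this is illustrative only and not the gem's actual `lib/vsm/mcp/jsonrpc.rb` implementation:

```ruby
require "json"
require "stringio"

# NDJSON framing: one JSON document per line.
def ndjson_write(io, msg) = io.puts(JSON.generate(msg))
def ndjson_read(io) = JSON.parse(io.gets)

# LSP-style framing: a Content-Length header, a blank line, then the body.
def lsp_write(io, msg)
  body = JSON.generate(msg)
  io.write("Content-Length: #{body.bytesize}\r\n\r\n#{body}")
end

def lsp_read(io)
  header = io.gets("\r\n\r\n")                    # read through the blank line
  len = header[/Content-Length: (\d+)/i, 1].to_i  # parse the declared body size
  JSON.parse(io.read(len))                        # read exactly that many bytes
end

msg = { "jsonrpc" => "2.0", "id" => 1, "method" => "tools/list" }

buf = StringIO.new
lsp_write(buf, msg)
buf.rewind
p lsp_read(buf) # round-trips the request
```

A server expecting one framing will silently wait forever when sent the other, which is exactly the hang the README note warns about.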
data/Rakefile CHANGED
@@ -5,8 +5,4 @@ require "rspec/core/rake_task"
 
 RSpec::Core::RakeTask.new(:spec)
 
-require "rubocop/rake_task"
-
-RuboCop::RakeTask.new
-
-task default: %i[spec rubocop]
+task default: %i[spec]
data/examples/01_echo_tool.rb CHANGED
@@ -1,6 +1,8 @@
 # frozen_string_literal: true
 $LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
 require "vsm"
+require "vsm/ports/chat_tty"
+require "securerandom"
 
 class EchoTool < VSM::ToolCapsule
   tool_name "echo"
@@ -44,27 +46,6 @@ cap = VSM::DSL.define(:demo) do
   end
 end
 
-# Simple CLI port
-class StdinPort < VSM::Port
-  def loop
-    sid = SecureRandom.uuid
-    print "You: "
-    while (line = $stdin.gets&.chomp)
-      @capsule.bus.emit VSM::Message.new(kind: :user, payload: line, meta: { session_id: sid })
-      @capsule.roles[:coordination].wait_for_turn_end(sid)
-      print "You: "
-    end
-  end
-
-  def render_out(msg)
-    case msg.kind
-    when :assistant
-      puts "\nBot: #{msg.payload}"
-    when :tool_result
-      puts "\nTool> #{msg.payload}"
-    end
-  end
-end
-
-VSM::Runtime.start(cap, ports: [StdinPort.new(capsule: cap)])
-
+# Use the built-in, customizable ChatTTY port
+banner = ->(io) { io.puts "\e[96mEcho demo\e[0m — type 'echo: hello' (Ctrl-C to exit)" }
+VSM::Runtime.start(cap, ports: [VSM::Ports::ChatTTY.new(capsule: cap, banner: banner)])
data/examples/02_openai_streaming.rb CHANGED
@@ -57,8 +57,9 @@ class StreamTTY < VSM::Port
       print msg.payload
       $stdout.flush
     when :assistant
-      puts "" # end the line
-      puts msg.payload.to_s unless msg.payload.to_s.empty?
+      puts "" # end the line after streaming
+      # The :assistant event carries the full final text again; avoid re-printing it
+      # because we've already streamed the deltas above. Just show the turn marker.
       puts "(turn #{msg.meta&.dig(:turn_id)})"
     when :tool_result
       puts "\nTool> #{msg.payload}"
@@ -70,4 +71,3 @@ end
 
 VSM::Runtime.start(cap, ports: [StreamTTY.new(capsule: cap)])
 
-
data/examples/02b_anthropic_streaming.rb CHANGED
@@ -10,9 +10,7 @@ MODEL = ENV["AIRB_MODEL"] || "claude-sonnet-4-0"
 
 driver = VSM::Drivers::Anthropic::AsyncDriver.new(
   api_key: ENV.fetch("ANTHROPIC_API_KEY"),
-  model: MODEL,
-  streaming: true,
-  transport: :nethttp
+  model: MODEL
 )
 
 system_prompt = "You are a concise assistant. Answer briefly."
@@ -58,4 +56,3 @@ end
 
 VSM::Runtime.start(cap, ports: [StreamTTY.new(capsule: cap)])
 
-
data/examples/03b_anthropic_tools.rb CHANGED
@@ -37,9 +37,7 @@ end
 
 driver = VSM::Drivers::Anthropic::AsyncDriver.new(
   api_key: ENV.fetch("ANTHROPIC_API_KEY"),
-  model: MODEL,
-  streaming: true,
-  transport: :nethttp
+  model: MODEL
 )
 
 system_prompt = <<~PROMPT
@@ -93,4 +91,3 @@ end
 
 VSM::Runtime.start(cap, ports: [ToolTTY.new(capsule: cap)])
 
-
data/examples/05_mcp_server_and_chattty.rb ADDED
@@ -0,0 +1,63 @@
+# frozen_string_literal: true
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+require "vsm"
+require "securerandom"
+require "vsm/ports/chat_tty"
+require "vsm/ports/mcp/server_stdio"
+
+# A simple local tool we can expose to both ChatTTY and MCP stdio.
+class EchoTool < VSM::ToolCapsule
+  tool_name "echo"
+  tool_description "Echoes back the provided text"
+  tool_schema({ type: "object", properties: { text: { type: "string" } }, required: ["text"] })
+  def run(args)
+    "you said: #{args["text"]}"
+  end
+end
+
+# Minimal intelligence that triggers the echo tool when user types: echo: ...
+class DemoIntelligence < VSM::Intelligence
+  def handle(message, bus:, **)
+    case message.kind
+    when :user
+      if message.payload =~ /\Aecho:\s*(.+)\z/
+        bus.emit VSM::Message.new(kind: :tool_call, payload: { tool: "echo", args: { "text" => $1 } }, corr_id: SecureRandom.uuid, meta: message.meta)
+      else
+        bus.emit VSM::Message.new(kind: :assistant, payload: "Try: echo: hello", meta: message.meta)
+      end
+      true
+    when :tool_result
+      bus.emit VSM::Message.new(kind: :assistant, payload: "(done)", meta: message.meta)
+      true
+    else
+      false
+    end
+  end
+end
+
+cap = VSM::DSL.define(:demo_mcp_server_and_chat) do
+  identity     klass: VSM::Identity, args: { identity: "demo", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: DemoIntelligence
+  monitoring   klass: VSM::Monitoring
+  operations do
+    capsule :echo, klass: EchoTool
+  end
+end
+
+# Run both ports together: MCP stdio (machine) + ChatTTY (human).
+banner = ->(io) { io.puts "\e[96mVSM demo\e[0m — type 'echo: hi' (Ctrl-C to exit)" }
+ports = [VSM::Ports::MCP::ServerStdio.new(capsule: cap)]
+if $stdout.tty?
+  # Only enable interactive ChatTTY when attached to a TTY to avoid
+  # interfering when this example is spawned as a background MCP server.
+  begin
+    tty = File.open("/dev/tty", "r+")
+  rescue StandardError
+    tty = nil
+  end
+  ports << VSM::Ports::ChatTTY.new(capsule: cap, banner: banner, input: tty, output: tty)
+end
+
+VSM::Runtime.start(cap, ports: ports)
data/examples/06_mcp_mount_reflection.rb ADDED
@@ -0,0 +1,45 @@
+# frozen_string_literal: true
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+require "vsm"
+require "vsm/dsl_mcp"
+require "vsm/ports/chat_tty"
+require "securerandom"
+
+# This example mounts a remote MCP server (we use example 05 as the server)
+# and exposes its tools locally via dynamic reflection. Type: echo: hello
+
+class DemoIntelligence < VSM::Intelligence
+  def handle(message, bus:, **)
+    case message.kind
+    when :user
+      if message.payload =~ /\Aecho:\s*(.+)\z/
+        bus.emit VSM::Message.new(kind: :tool_call, payload: { tool: "echo", args: { "text" => $1 } }, corr_id: SecureRandom.uuid, meta: message.meta)
+      else
+        bus.emit VSM::Message.new(kind: :assistant, payload: "Try: echo: hello", meta: message.meta)
+      end
+      true
+    when :tool_result
+      bus.emit VSM::Message.new(kind: :assistant, payload: "(done)", meta: message.meta)
+      true
+    else
+      false
+    end
+  end
+end
+
+server_cmd = "ruby #{File.expand_path("05_mcp_server_and_chattty.rb", __dir__)}"
+
+cap = VSM::DSL.define(:mcp_mount_demo) do
+  identity     klass: VSM::Identity, args: { identity: "mcp_mount_demo", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: DemoIntelligence
+  monitoring   klass: VSM::Monitoring
+  operations do
+    # Reflect the remote server's tools; include only :echo and expose as local name "echo"
+    mcp_server :demo_server, cmd: server_cmd, include: %w[echo]
+  end
+end
+
+banner = ->(io) { io.puts "\e[96mMCP mount demo\e[0m — type 'echo: hi' (Ctrl-C to exit)" }
+VSM::Runtime.start(cap, ports: [VSM::Ports::ChatTTY.new(capsule: cap, banner: banner)])
data/examples/07_connect_claude_mcp.rb ADDED
@@ -0,0 +1,78 @@
+# frozen_string_literal: true
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+require "json"
+require "securerandom"
+require "vsm"
+require "vsm/dsl_mcp"
+require "vsm/ports/chat_tty"
+
+# Example: Connect to an external MCP server (Claude Code)
+#
+# Prereqs:
+# - Install Claude CLI and log in.
+# - Ensure `claude mcp serve` works in your shell.
+#
+# IMPORTANT: Many MCP servers (including Claude) use LSP-style Content-Length
+# framing over stdio. The minimal transport in this repo currently uses NDJSON
+# (one JSON per line). If this example hangs or fails, it's due to framing
+# mismatch; swap the transport to LSP framing in lib/vsm/mcp/jsonrpc.rb.
+#
+# Usage:
+#   ruby examples/07_connect_claude_mcp.rb
+# Then type:
+#   list
+#   call: some_tool {"arg1":"value"}
+#
+# This example avoids requiring any LLM API keys by letting you call tools manually
+# via a simple chat convention.
+
+# Intelligence that recognizes two commands:
+# - "list" → prints available tools
+# - "call: NAME {json}" → invokes the reflected tool with JSON args
+class ManualMCPIntelligence < VSM::Intelligence
+  def handle(message, bus:, **)
+    return false unless message.kind == :user
+    line = message.payload.to_s.strip
+    if line == "list"
+      # Inspect operations children for tool descriptors
+      ops = bus.context[:operations_children] || {}
+      tools = ops.values.select { _1.respond_to?(:tool_descriptor) }.map { _1.tool_descriptor.name }
+      bus.emit VSM::Message.new(kind: :assistant, payload: tools.any? ? "tools: #{tools.join(", ")}" : "(no tools)", meta: message.meta)
+      return true
+    elsif line.start_with?("call:")
+      if line =~ /\Acall:\s*(\S+)\s*(\{.*\})?\z/
+        tool = $1
+        json = $2
+        args = json ? (JSON.parse(json) rescue {}) : {}
+        bus.emit VSM::Message.new(kind: :tool_call, payload: { tool: tool, args: args }, corr_id: SecureRandom.uuid, meta: message.meta)
+        return true
+      else
+        bus.emit VSM::Message.new(kind: :assistant, payload: "usage: call: NAME {json}", meta: message.meta)
+        return true
+      end
+    else
+      bus.emit VSM::Message.new(kind: :assistant, payload: "Commands: list | call: NAME {json}", meta: message.meta)
+      return true
+    end
+  end
+end
+
+cap = VSM::DSL.define(:claude_mcp_client) do
+  identity     klass: VSM::Identity, args: { identity: "claude_mcp_client", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: ManualMCPIntelligence
+  monitoring   klass: VSM::Monitoring
+  operations do
+    # Reflect all available tools from the external server.
+    # Tip: if tool names collide with locals, use prefix: "claude_".
+    mcp_server :claude, cmd: ["claude", "mcp", "serve"]
+  end
+end
+
+banner = ->(io) do
+  io.puts "\e[96mMCP client (Claude)\e[0m"
+  io.puts "Type 'list' or 'call: NAME {json}'"
+end
+
+VSM::Runtime.start(cap, ports: [VSM::Ports::ChatTTY.new(capsule: cap, banner: banner, prompt: "You> ")])
data/examples/08_custom_chattty.rb ADDED
@@ -0,0 +1,63 @@
+# frozen_string_literal: true
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+require "vsm"
+require "vsm/ports/chat_tty"
+require "securerandom"
+
+# Demonstrates subclassing ChatTTY to customize the banner and output formatting.
+
+class EchoTool < VSM::ToolCapsule
+  tool_name "echo"
+  tool_description "Echoes back the provided text"
+  tool_schema({ type: "object", properties: { text: { type: "string" } }, required: ["text"] })
+  def run(args)
+    "you said: #{args["text"]}"
+  end
+end
+
+class DemoIntelligence < VSM::Intelligence
+  def handle(message, bus:, **)
+    return false unless message.kind == :user
+    if message.payload =~ /\Aecho:\s*(.+)\z/
+      bus.emit VSM::Message.new(kind: :tool_call, payload: { tool: "echo", args: { "text" => $1 } }, corr_id: SecureRandom.uuid, meta: message.meta)
+    else
+      bus.emit VSM::Message.new(kind: :assistant, payload: "Try: echo: hello", meta: message.meta)
+    end
+    true
+  end
+end
+
+class FancyTTY < VSM::Ports::ChatTTY
+  def banner(io)
+    io.puts "\e[95m\n ███ CUSTOM CHAT ███\n\e[0m"
+  end
+
+  def render_out(m)
+    case m.kind
+    when :assistant_delta
+      @streaming = true
+      @out.print m.payload
+      @out.flush
+    when :assistant
+      @out.puts unless @streaming
+      @streaming = false
+    when :tool_call
+      @out.puts "\n\e[90m→ calling #{m.payload[:tool]}\e[0m"
+    when :tool_result
+      @out.puts "\e[92m✓ #{m.payload}\e[0m"
+    end
+  end
+end
+
+cap = VSM::DSL.define(:fancy_chat) do
+  identity     klass: VSM::Identity, args: { identity: "fancy_chat", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: DemoIntelligence
+  monitoring   klass: VSM::Monitoring
+  operations do
+    capsule :echo, klass: EchoTool
+  end
+end
+
+VSM::Runtime.start(cap, ports: [FancyTTY.new(capsule: cap, prompt: "Me: ")])
data/examples/09_mcp_with_llm_calls.rb ADDED
@@ -0,0 +1,49 @@
+# frozen_string_literal: true
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+require "vsm"
+require "vsm/dsl_mcp"
+require "vsm/ports/chat_tty"
+
+# Example: Use an LLM driver (OpenAI) to automatically call tools exposed by an MCP server.
+#
+# Prereqs:
+# - OPENAI_API_KEY must be set
+# - An MCP server available on your PATH, e.g. `claude mcp serve`
+#
+# Usage:
+#   OPENAI_API_KEY=... AIRB_MODEL=gpt-4o-mini ruby examples/09_mcp_with_llm_calls.rb
+# Type a question; the model will choose tools from the reflected MCP server.
+
+MODEL = ENV["AIRB_MODEL"] || "gpt-4o-mini"
+
+driver = VSM::Drivers::OpenAI::AsyncDriver.new(
+  api_key: ENV.fetch("OPENAI_API_KEY"),
+  model: MODEL
+)
+
+system_prompt = <<~PROMPT
+  You are a helpful assistant. You have access to the listed tools.
+  When a tool can help, call it with appropriate JSON arguments.
+  Keep final answers concise.
+PROMPT
+
+cap = VSM::DSL.define(:mcp_with_llm) do
+  identity     klass: VSM::Identity, args: { identity: "mcp_with_llm", invariants: [] }
+  governance   klass: VSM::Governance
+  coordination klass: VSM::Coordination
+  intelligence klass: VSM::Intelligence, args: { driver: driver, system_prompt: system_prompt }
+  monitoring   klass: VSM::Monitoring
+  operations do
+    # Reflect tools from an external MCP server (e.g., Claude Code).
+    # If your server requires strict LSP framing, run with VSM_MCP_LSP=1.
+    # You can also prefix names to avoid collisions: prefix: "claude_"
+    mcp_server :claude, cmd: ["claude", "mcp", "serve"]
+  end
+end
+
+banner = ->(io) do
+  io.puts "\e[96mLLM + MCP tools\e[0m — Ask a question; model may call tools."
+end
+
+VSM::Runtime.start(cap, ports: [VSM::Ports::ChatTTY.new(capsule: cap, banner: banner, prompt: "You> ")])
+
data/examples/10_meta_read_only.rb ADDED
@@ -0,0 +1,56 @@
+# frozen_string_literal: true
+
+# Demo: use OpenAI tool-calling to let an LLM inspect the running capsule via
+# the read-only meta tools. Set OPENAI_API_KEY (and optionally AIRB_MODEL) then:
+#   bundle exec ruby examples/10_meta_read_only.rb
+# Ask things like "What can you do?" or "Explain meta_demo_tool" and the model
+# will call the meta tools to gather context before replying.
+
+$LOAD_PATH.unshift(File.expand_path("../lib", __dir__))
+
+require "securerandom"
+require "vsm"
+
+MODEL = ENV["AIRB_MODEL"] || "gpt-4o-mini"
+API_KEY = ENV["OPENAI_API_KEY"] or abort "OPENAI_API_KEY required for this demo"
+
+class MetaDemoTool < VSM::ToolCapsule
+  tool_name "meta_demo_tool"
+  tool_description "Simple tool included alongside meta tools"
+  tool_schema({ type: "object", properties: {}, additionalProperties: false })
+
+  def run(_args)
+    "hello from demo tool"
+  end
+end
+
+driver = VSM::Drivers::OpenAI::AsyncDriver.new(api_key: API_KEY, model: MODEL)
+
+SYSTEM_PROMPT = <<~PROMPT
+  You are the steward of a VSM capsule. You have access to built-in reflection
+  tools that describe the organism and its operations:
+  - meta_summarize_self: overview of the current capsule and its roles
+  - meta_list_tools: list available tools with schemas
+  - meta_explain_tool: show implementation details for a named tool
+  - meta_explain_role: show capsule-specific details and code for a VSM role
+  When the user asks about capabilities, available tools, or how something
+  works, call the appropriate meta_* tool first, then respond with a clear,
+  human-friendly summary that cites relevant tool names. Be concise but
+  complete.
+PROMPT
+
+cap = VSM::DSL.define(:meta_demo_llm) do
+  identity     klass: VSM::Identity, args: { identity: "meta_demo_llm", invariants: [] }
+  governance   klass: VSM::Governance, args: {}
+  coordination klass: VSM::Coordination, args: {}
+  intelligence klass: VSM::Intelligence, args: { driver: driver, system_prompt: SYSTEM_PROMPT }
+  monitoring   klass: VSM::Monitoring, args: {}
+  operations do
+    meta_tools
+    capsule :meta_demo_tool, klass: MetaDemoTool
+  end
+end
+
+ports = [VSM::Ports::ChatTTY.new(capsule: cap, banner: ->(io) { io.puts "Meta demo ready. Try asking 'What can you do?'" })]
+
+VSM::Runtime.start(cap, ports: ports)
data/exe/vsm ADDED
@@ -0,0 +1,17 @@
+#!/usr/bin/env ruby
+# frozen_string_literal: true
+
+# Keep CLI independent of any project's Bundler context so we resolve this
+# gem's dependencies rather than a host app's Gemfile.
+ENV.delete('BUNDLE_GEMFILE')
+ENV.delete('BUNDLE_BIN_PATH')
+if (rubyopt = ENV['RUBYOPT'])
+  ENV['RUBYOPT'] = rubyopt.split.reject { |x| x.include?('bundler/setup') }.join(' ')
+end
+ENV.delete('RUBYGEMS_GEMDEPS')
+
+require 'vsm'
+require 'vsm/cli'
+
+VSM::CLI.start(ARGV)
+
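
For context (commentary, not part of the diff): the `RUBYOPT` scrubbing in `exe/vsm` above can be exercised in isolation. A standalone sketch of the same filtering logic, with the helper name `scrub_rubyopt` being illustrative:

```ruby
# Strip any `-r bundler/setup` injection from a RUBYOPT value so the
# executable resolves its own gem dependencies instead of being pinned
# to a host application's Gemfile.
def scrub_rubyopt(rubyopt)
  return nil if rubyopt.nil?
  rubyopt.split.reject { |x| x.include?("bundler/setup") }.join(" ")
end

puts scrub_rubyopt("-rbundler/setup -W0").inspect # => "-W0"
```

Together with deleting `BUNDLE_GEMFILE`, `BUNDLE_BIN_PATH`, and `RUBYGEMS_GEMDEPS`, this prevents Bundler from re-activating a host app's dependency set in the CLI's child process.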