indirecttek-vibe-engine 2.6.32 → 2.6.33

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +20 -40
  2. package/package.json +4 -4
package/README.md CHANGED
@@ -6,60 +6,40 @@
 
  ---
 
- ## 🚀 Client Setup (For Users)
+ ## 🚀 Quick Start (Direct Mode)
 
  **Prerequisites:**
- - Node.js 18+ installed on your laptop.
- - Connection to an Ollama Server (Local or VPN).
+ - **Ollama** installed and running (`http://localhost:11434`).
+ - **Model**: `qwen2.5-coder:14b` (Recommended). Pull it via `ollama pull qwen2.5-coder:14b`.
 
- ### Step 1: Install the Extension
- Search for **"IndirectTek Vibe Engine"** in the VS Code Marketplace and install it.
+ ### Step 1: Install & Go
+ 1. Install **"IndirectTek Vibe Engine"** from the VS Code Marketplace.
+ 2. Open the **IndirectTek AI** chat panel.
+ 3. Type: *"Create a hello world python script"*.
+ 4. If Ollama is running on localhost, it just works.
 
- ### Step 2: Start the Agent Controller
- The brain of the operation runs separately to ensure stability. Open a terminal and run:
-
- ```bash
- npx indirecttek-vibe-engine@latest
- ```
-
- It will print a security token:
- ```text
- 🚀 Starting IndirectTek Vibe Agent...
- 🔑 GENERATED TOKEN: vibe-abcd-1234
- 👉 Copy this token...
- ```
- *Keep this terminal running!*
-
- ### Step 3: Connect
- 1. Open VS Code Settings (`Cmd+,`).
- 2. Search for **"IndirectTek"**.
- 3. Configure:
-    - **Ollama Url**: `http://<YOUR_SERVER_IP>:11434` (Ask your Admin for this IP)
-    - **Model**: `qwen2.5-coder:32b` (Or whatever works on your server)
-    - **Use Controller**: ✅ **CHECKED** (This connects to the npx process)
-    - **Controller Token**: Paste the `vibe-xxxx` token from Step 2.
-
- ### Step 4: Vibe Check
- Open the **IndirectTek AI** chat panel.
- You should see: `✅ Agent Controller Connected`.
-
- Type: *"Create a hello world python script"* to test it out!
+ _Note: The agent connects to `http://localhost:11434` by default._
 
  ---
 
- ## 🛠️ Advanced: Hosting the Server
+ ## 🛠️ Advanced: Vibe Controller (Optional)
+
+ For **Telemetry**, **Dashboards**, and **Enterprise Features**, run the standalone Vibe Controller:
 
- Do you want to host the AI models yourself?
- 👉 **See [SERVER_SETUP.md](SERVER_SETUP.md)** for instructions on setting up Ollama and GPU infrastructure on Linux/Mac.
+ 1. Run: `npx indirecttek-vibe-engine@latest`
+ 2. Copy the **Token**.
+ 3. In VS Code Settings:
+    - Set **Use Controller** to `true`.
+    - Paste the **Controller Token**.
+    - (Optional) Update **Ollama URL** if using a remote server.
 
  ---
 
  ## ✨ Key Features (v2.0)
  - **100% Data Privacy**: Your code never leaves your network.
+ - **Capability-Based Permissions**: Interactive "Yellow Card" approval for all file edits and commands.
+ - **Visible Tool Calls**: See exactly what the agent is doing with granular logs.
  - **Intent Router**: Automatically switches models based on task (Chat vs Refactor vs Plan).
- - **Context Hygiene**: Enforces strict token limits to prevent "poisoning" the model with too much history.
- - **Deep Telemetry**: Tracks real-time Token/s and Latency for every request.
- - **Visual Dashboard**: View live metrics and request logs at `http://localhost:43110/stats`.
  - **Auto-Summary**: Automatically summarizes changes (Edits, Commands) at the end of every task.
  - **Autonomous Editing**: Strict "Hunk Only" editing to prevent creating massive diffs.
 
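The direct-mode prerequisites above (Ollama reachable on `http://localhost:11434`, recommended model pulled) can be sanity-checked with a short script. This is a minimal sketch, not part of the extension: it assumes Ollama's standard `GET /api/tags` endpoint, which lists locally pulled models, and the helper names `model_available` and `check_ollama` are hypothetical.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

# Defaults from this release's package.json (partnerBot.ollamaUrl) and README.
OLLAMA_URL = "http://localhost:11434"
RECOMMENDED_MODEL = "qwen2.5-coder:14b"


def model_available(tags_json: str, model: str = RECOMMENDED_MODEL) -> bool:
    """Check an /api/tags response body for the recommended model."""
    models = json.loads(tags_json).get("models", [])
    return any(m.get("name") == model for m in models)


def check_ollama(url: str = OLLAMA_URL) -> str:
    """Return a short status string for the direct-mode prerequisites."""
    try:
        # /api/tags is Ollama's model-listing endpoint.
        with urlopen(f"{url}/api/tags", timeout=3) as resp:
            body = resp.read().decode()
    except (URLError, OSError):
        return "ollama-unreachable"
    return "model-ready" if model_available(body) else "model-missing"


if __name__ == "__main__":
    print(check_ollama())
```

If this prints `model-ready`, the "it just works" path in Step 1 should hold; `ollama-unreachable` means Ollama is not listening on the default URL.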
package/package.json CHANGED
@@ -2,7 +2,7 @@
  "name": "indirecttek-vibe-engine",
  "displayName": "IndirectTek Vibe Engine",
  "description": "Autonomous Local AI Agent. Vibe Coding v1.0.",
- "version": "2.6.32",
+ "version": "2.6.33",
  "icon": "media/icon.png",
  "bin": {
    "vibe-agent": "bin/vibe-agent.js"
@@ -85,7 +85,7 @@
  "properties": {
    "partnerBot.ollamaUrl": {
      "type": "string",
-     "default": "http://192.168.86.44:11434",
+     "default": "http://localhost:11434",
      "description": "URL of your local Ollama instance"
    },
    "partnerBot.model": {
@@ -166,7 +166,7 @@
    },
    "partnerBot.useController": {
      "type": "boolean",
-     "default": true,
+     "default": false,
      "description": "Route agent conversations through the local Vibe Agent Controller instead of calling Ollama directly (experimental)."
    },
    "partnerBot.controllerPort": {
@@ -181,7 +181,7 @@
    },
    "partnerBot.fooocusUrl": {
      "type": "string",
-     "default": "http://192.168.86.39:8888",
+     "default": "http://localhost:8888",
      "description": "Base URL of the Fooocus API. Do not include a trailing slash or path (e.g. http://192.168.86.39:8888)."
    }
  }
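Taken together, the hunks above flip a fresh install to direct mode: `partnerBot.ollamaUrl` now points at localhost and `partnerBot.useController` defaults to `false`. A user on a remote Ollama server would override these in VS Code's `settings.json`. A hedged sketch, using only keys that appear in this diff plus the README's recommended model; the IP shown is the old 2.6.32 default and purely illustrative:

```json
{
  "partnerBot.ollamaUrl": "http://192.168.86.44:11434",
  "partnerBot.model": "qwen2.5-coder:14b",
  "partnerBot.useController": false
}
```

Setting `partnerBot.useController` to `true` additionally requires the Vibe Controller token described in the README's Advanced section.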