@kimchitest/opencode-otel-plugin 1.0.4 → 1.0.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +11 -12
  2. package/package.json +4 -4
  3. package/plugin.ts +4 -4
package/README.md CHANGED
@@ -1,6 +1,6 @@
-# OpenCode OTEL Plugin for AI Enabler
+# OpenCode OTEL Plugin for Kimchi
 
-Sends usage telemetry from OpenCode to the AI Enabler service.
+Sends usage telemetry from OpenCode to the Kimchi service.
 
 OpenCode version 1.2.20+
 
@@ -50,8 +50,8 @@ mkdir -p .opencode/plugins
 | Variable | Required | Description |
 |----------|----------|-------------|
 | `OPENCODE_ENABLE_TELEMETRY` | Yes | Set to `1` to enable telemetry |
-| `OPENCODE_OTLP_ENDPOINT` | Yes | AI Enabler logs ingest endpoint URL |
-| `OPENCODE_OTLP_HEADERS` | Yes | Authorization header with your AI Enabler API key |
+| `OPENCODE_OTLP_ENDPOINT` | Yes | Kimchi logs ingest endpoint URL |
+| `OPENCODE_OTLP_HEADERS` | Yes | Authorization header with your Kimchi API key |
 
 #### Example Environment Variables
 
@@ -61,10 +61,10 @@ Add these to your shell config (`~/.zshrc`, `~/.bashrc`, etc.):
 # Enable the plugin
 export OPENCODE_ENABLE_TELEMETRY=1
 
-# AI Enabler endpoint for log ingestion
+# Kimchi endpoint for log ingestion
 export OPENCODE_OTLP_ENDPOINT=https://api.cast.ai/ai-optimizer/v1beta/logs:ingest
 
-# Authorization header with your AI Enabler API key
+# Authorization header with your Kimchi API key
 export OPENCODE_OTLP_HEADERS="Authorization=Bearer YOUR_API_KEY_HERE"
 ```
 
@@ -95,7 +95,7 @@ The plugin reads provider information from your OpenCode config (`~/.config/open
 | `perplexity` | Perplexity |
 | `hosted_vllm` | Hosted vLLM |
 | `bedrock` | AWS Bedrock |
-| `ai-enabler` | AI Enabler (serverless models) |
+| `kimchi` | Kimchi (serverless models) |
 
 > **Important:** If your provider key does not match one of the valid values listed above, the request will be **rejected** and you will see an error toast notification in OpenCode. Make sure to use a provider key from the list above to ensure your usage data is recorded correctly.
 
@@ -103,11 +103,11 @@ The plugin reads provider information from your OpenCode config (`~/.config/open
 
 ```json
 {
-  "model": "ai-enabler/glm-5-fp8",
+  "model": "kimchi/glm-5-fp8",
   "provider": {
-    "ai-enabler": {
+    "kimchi": {
       "npm": "@ai-sdk/openai-compatible",
-      "name": "AI Enabler",
+      "name": "Kimchi",
       "options": {
         "baseURL": "https://llm.cast.ai/openai/v1",
         "apiKey": "your-api-key"
@@ -150,7 +150,6 @@ The plugin shows error notifications via OpenCode toasts when issues occur:
 ### Debugging Steps
 
 1. Verify environment variables are set correctly
-2. Check your AI Enabler API key is valid
+2. Check your Kimchi API key is valid
 3. Ensure the provider key in your OpenCode config matches a valid value
 4. Verify the endpoint URL is correct
-
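The README documents `OPENCODE_OTLP_HEADERS` in the form `"Authorization=Bearer <token>"`. As a minimal sketch of how a plugin could turn that string into request headers — `parseOtlpHeaders` is a hypothetical helper, not part of this package, and comma-separated multi-header support is an assumption borrowed from the common OTLP exporter convention:

```typescript
// Hypothetical helper (not from the plugin source): parse the documented
// "Key=Value" format of OPENCODE_OTLP_HEADERS into a headers object
// suitable for a fetch() call to the logs ingest endpoint.
// Assumes multiple headers are comma-separated, per common OTLP convention.
function parseOtlpHeaders(raw: string): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const pair of raw.split(",")) {
    const idx = pair.indexOf("=");
    if (idx === -1) continue; // skip malformed entries with no "="
    // Split only on the first "=" so values like "Bearer abc" stay intact
    headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  }
  return headers;
}
```

Splitting only on the first `=` matters because the value itself (`Bearer <token>`) contains no delimiter but could in principle contain `=` padding in a base64 token.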
package/package.json CHANGED
@@ -1,17 +1,17 @@
 {
   "name": "@kimchitest/opencode-otel-plugin",
-  "version": "1.0.4",
-  "description": "OpenCode OTEL plugin for AI Enabler - sends usage telemetry for cost tracking",
+  "version": "1.0.6",
+  "description": "OpenCode OTEL plugin for Kimchi - sends usage telemetry",
   "main": "plugin.ts",
   "type": "module",
   "keywords": [
     "opencode",
     "otel",
     "opentelemetry",
-    "ai-enabler",
+    "kimchi",
     "telemetry"
   ],
-  "author": "AI Enabler Team",
+  "author": "Kimchi Team",
   "license": "MIT",
   "repository": {
     "type": "git",
package/plugin.ts CHANGED
@@ -1,7 +1,7 @@
 /**
- * OpenCode OTEL Plugin for AI Enabler
+ * OpenCode OTEL Plugin for Kimchi
  *
- * Sends api_request events to the AI Enabler service for usage tracking.
+ * Sends api_request events to the Kimchi service for usage tracking.
  *
  * REQUIRED ATTRIBUTES:
  * - model: The model identifier (e.g., "glm-5-fp8", "claude-sonnet-4-6")
@@ -28,11 +28,11 @@
  * - "perplexity" - Perplexity
  * - "hosted_vllm" - Hosted vLLM
  * - "bedrock" - AWS Bedrock
- * - "ai-enabler" - AI Enabler (serverless models)
+ * - "kimchi" - Kimchi (serverless models)
  *
  * ENVIRONMENT VARIABLES:
  * - OPENCODE_ENABLE_TELEMETRY: Set to enable telemetry
- * - OPENCODE_OTLP_ENDPOINT: The AI Enabler logs ingest endpoint
+ * - OPENCODE_OTLP_ENDPOINT: The Kimchi logs ingest endpoint
  * - OPENCODE_OTLP_HEADERS: Authorization header (format: "Authorization=Bearer <token>")
  *
  * Tested with OpenCode version: 1.2.20
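The plugin header and README both warn that an unrecognized provider key is rejected server-side with an error toast. A minimal sketch of the client-side check this implies — the function name and set are illustrative (the set below lists only the providers visible in this diff, not the full table), and the plugin's actual internals may differ:

```typescript
// Illustrative sketch, not the plugin's actual code: validate the provider
// key from the OpenCode config before emitting an api_request event, so an
// invalid key is caught locally instead of being rejected by the service.
// Partial set: only the provider keys visible in this diff's hunks.
const VALID_PROVIDERS: ReadonlySet<string> = new Set([
  "perplexity",
  "hosted_vllm",
  "bedrock",
  "kimchi",
]);

function isValidProvider(key: string): boolean {
  return VALID_PROVIDERS.has(key);
}
```

Note that after this release the old `ai-enabler` key would fail such a check, which is consistent with the README's warning that mismatched keys are rejected.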