lynkr 3.2.1 β†’ 4.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/docs/index.md CHANGED
@@ -6,7 +6,7 @@
 
  # Lynkr - Production-Ready Claude Code Proxy with Multi-Provider Support, MCP Integration & Token Optimization
 
- #### Lynkr is an open-source, production-ready Claude Code proxy that enables the Claude Code CLI to work with any LLM provider (Databricks, OpenRouter, Ollama, Azure, OpenAI, llama.cpp) without losing Anthropic backend features. It features MCP server orchestration, Git workflows, repo intelligence, workspace tools, prompt caching, and 60-80% token optimization for cost-effective LLM-powered development.
+ #### Lynkr is an open-source, production-ready Claude Code proxy that enables the Claude Code CLI to work with any LLM provider (Databricks, AWS Bedrock with 100+ models, OpenRouter, Ollama, Azure, OpenAI, llama.cpp) without losing Anthropic backend features. It features MCP server orchestration, Git workflows, repo intelligence, workspace tools, prompt caching, and 60-80% token optimization for cost-effective LLM-powered development.
  <!--
  SEO Keywords:
  Databricks, Claude Code, Anthropic, Azure Anthropic,
@@ -17,7 +17,7 @@ prompt caching, Node.js
 
  ## 🔖 Keywords
 
- `claude-code` `claude-proxy` `anthropic-api` `databricks-llm` `openrouter-integration` `ollama-local` `llama-cpp` `azure-openai` `azure-anthropic` `mcp-server` `prompt-caching` `token-optimization` `ai-coding-assistant` `llm-proxy` `self-hosted-ai` `git-automation` `code-generation` `developer-tools` `ci-cd-automation` `llm-gateway` `cost-reduction` `multi-provider-llm`
+ `claude-code` `claude-proxy` `anthropic-api` `databricks-llm` `aws-bedrock` `bedrock-models` `deepseek-r1` `qwen3` `openrouter-integration` `ollama-local` `llama-cpp` `azure-openai` `azure-anthropic` `mcp-server` `prompt-caching` `token-optimization` `ai-coding-assistant` `llm-proxy` `self-hosted-ai` `git-automation` `code-generation` `developer-tools` `ci-cd-automation` `llm-gateway` `cost-reduction` `multi-provider-llm`
 
  ---
 
 
@@ -32,7 +32,7 @@ prompt caching, Node.js
 
  # 🚀 What is Lynkr?
 
- **Lynkr** is an open-source **Claude Code-compatible backend proxy** that lets you run the **Claude Code CLI** and Claude-style tools **directly against [Databricks, Azure, OpenRouter, Ollama, and llama.cpp](#-configuration-guide-for-multi-provider-support-databricks-azure-openrouter-ollama-llamacpp)** instead of the default Anthropic cloud.
+ **Lynkr** is an open-source **Claude Code-compatible backend proxy** that lets you run the **Claude Code CLI** and Claude-style tools **directly against [Databricks, AWS Bedrock, Azure, OpenRouter, Ollama, and llama.cpp](#-configuration-guide-for-multi-provider-support-databricks-aws-bedrock-azure-openrouter-ollama-llamacpp)** instead of the default Anthropic cloud.
 
  It enables full repo-aware LLM workflows:
 
@@ -58,6 +58,9 @@ Emulates Anthropic’s backend so the **Claude Code CLI works without modificati
  ### ✔ Works with Databricks LLM Serving
  Supports **Databricks-hosted Claude Sonnet / Haiku models**, or any LLM served from Databricks.
 
+ ### ✔ Supports AWS Bedrock (100+ models) 🆕
+ Access Claude 4.5, DeepSeek R1, Qwen3 (480B), OpenAI GPT-OSS, Google Gemma, Nova, Titan, Llama, Mistral, and more through AWS's unified Converse API.
+
  ### ✔ Supports Azure Anthropic models
  Route Claude Code requests into Azure's `/anthropic/v1/messages` endpoint.
 
@@ -137,7 +140,7 @@ Claude Code CLI
  ↓
  Lynkr Proxy
  ↓
- Databricks / Azure Anthropic / OpenRouter / Ollama / llama.cpp / MCP / Tools
+ Databricks / AWS Bedrock / Azure Anthropic / OpenRouter / Ollama / llama.cpp / MCP / Tools
 
  ```
 
@@ -265,7 +268,7 @@ npm start
 
  ---
 
- # 🔧 Configuration Guide for Multi-Provider Support (Databricks, Azure, OpenRouter, Ollama, llama.cpp)
+ # 🔧 Configuration Guide for Multi-Provider Support (Databricks, AWS Bedrock, Azure, OpenRouter, Ollama, llama.cpp)
 
  ## Databricks Setup
 
@@ -299,6 +302,47 @@ AZURE_OPENAI_DEPLOYMENT=gpt-4o
  PORT=8080
  ```
 
+ ## AWS Bedrock Setup (🆕 100+ Models)
+
+ **What is AWS Bedrock?**
+
+ AWS Bedrock provides serverless access to **100+ foundation models** from leading AI companies through a unified Converse API. Benefits:
+ - ✅ **Massive model variety** – Claude, DeepSeek R1, Qwen3, OpenAI GPT-OSS, Google Gemma, Nova, Titan, Llama, Mistral, Cohere, and more
+ - ✅ **Enterprise-grade** – AWS infrastructure, security, and compliance
+ - ✅ **Pay-per-use** – no monthly fees; you pay only for the tokens you use
+ - ✅ **Claude tool calling** – full Read/Write/Bash tool support with Claude models
+ - ✅ **Multi-cloud flexibility** – no lock-in to a single provider
+
+ **Configuration:**
+
+ ```env
+ MODEL_PROVIDER=bedrock
+ AWS_BEDROCK_API_KEY=your-bearer-token   # Get from AWS Console → Bedrock → API Keys
+ AWS_BEDROCK_REGION=us-east-2            # e.g. us-east-1, us-west-2, us-east-2
+ AWS_BEDROCK_MODEL_ID=us.anthropic.claude-sonnet-4-5-20250929-v1:0
+ PORT=8080
+ WORKSPACE_ROOT=/path/to/your/repo
+ ```
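The env values above map directly onto Bedrock's Converse REST endpoint. As a rough sketch of what a proxy does with them (assuming the documented `https://bedrock-runtime.<region>.amazonaws.com/model/<modelId>/converse` endpoint shape and bearer-token auth; `bedrockConverseRequest` is a hypothetical helper, not Lynkr's actual internals):

```javascript
// Hypothetical sketch: build a Bedrock Converse API request from the env config.
// Endpoint shape and Bearer auth follow AWS's documented Converse REST API.
function bedrockConverseRequest({ region, modelId, apiKey }) {
  // Model IDs contain ':' and must be URL-encoded in the request path.
  const url = `https://bedrock-runtime.${region}.amazonaws.com/model/${encodeURIComponent(modelId)}/converse`;
  return {
    url,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Bedrock API keys are sent as bearer tokens.
      Authorization: `Bearer ${apiKey}`,
    },
  };
}

const req = bedrockConverseRequest({
  region: 'us-east-2',
  modelId: 'us.anthropic.claude-sonnet-4-5-20250929-v1:0',
  apiKey: 'your-bearer-token',
});
console.log(req.url);
// https://bedrock-runtime.us-east-2.amazonaws.com/model/us.anthropic.claude-sonnet-4-5-20250929-v1%3A0/converse
```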
+
+ **Popular Models:**
+ - `us.anthropic.claude-sonnet-4-5-20250929-v1:0` – Best for coding with tool calling
+ - `us.deepseek.r1-v1:0` – Reasoning model (o1-style)
+ - `qwen.qwen3-coder-480b-a35b-v1:0` – 480B-parameter coding specialist
+ - `openai.gpt-oss-120b-1:0` – Open-weight GPT model
+ - `us.amazon.nova-pro-v1:0` – Multimodal, 300K context
+ - `meta.llama3-1-70b-instruct-v1:0` – Open-source Llama 3.1
+ - `mistral.mistral-large-2407-v1:0` – Multilingual, coding
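The model IDs above follow a consistent pattern: an optional cross-region inference prefix (`us.`, `eu.`, `apac.`), a vendor segment, and the model name with a version suffix. A hypothetical helper (not part of Lynkr) that splits them apart:

```javascript
// Hypothetical helper: break a Bedrock model ID into its segments.
function parseBedrockModelId(id) {
  const parts = id.split('.');
  // A leading 'us'/'eu'/'apac' segment marks a cross-region inference profile.
  const hasRegionPrefix = ['us', 'eu', 'apac'].includes(parts[0]);
  const regionPrefix = hasRegionPrefix ? parts[0] : null;
  const rest = hasRegionPrefix ? parts.slice(1) : parts;
  return { regionPrefix, vendor: rest[0], model: rest.slice(1).join('.') };
}

console.log(parseBedrockModelId('us.deepseek.r1-v1:0'));
// { regionPrefix: 'us', vendor: 'deepseek', model: 'r1-v1:0' }
```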
+
+ 📖 See [BEDROCK_MODELS.md](https://github.com/vishalveerareddy123/Lynkr/blob/main/BEDROCK_MODELS.md) for the complete model catalog (100+ models), pricing, capabilities, and use cases.
+
+ **Getting Started:**
+ 1. Visit the [AWS Bedrock Console](https://console.aws.amazon.com/bedrock/)
+ 2. Navigate to the API Keys section
+ 3. Generate a new API key (bearer token)
+ 4. Configure Lynkr as shown above
+ 5. Start using any of the 100+ available models!
+
+ ⚠️ **Tool Calling Note**: Only Claude models support tool calling (Read, Write, Bash) on Bedrock. Other models work via the Converse API but won't execute tools.
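One way a proxy can act on this limitation up front is to strip tool definitions from requests aimed at non-Claude Bedrock models instead of sending tools they cannot execute. A sketch of that idea (assumed behavior for illustration, not Lynkr's documented API):

```javascript
// Hypothetical gate: only Claude models execute tools on Bedrock.
function supportsBedrockTools(modelId) {
  return modelId.includes('anthropic.claude');
}

// Drop tool definitions for non-Claude models; keep everything else intact.
function prepareRequest(modelId, payload) {
  if (supportsBedrockTools(modelId)) return payload;
  const { tools, tool_choice, ...rest } = payload;
  return rest;
}
```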
 
  ## OpenRouter Setup
 
package/final-test.js ADDED
@@ -0,0 +1,33 @@
+ const http = require('http');
+
+ const data = JSON.stringify({
+   model: "claude-sonnet-4-5",
+   max_tokens: 100,
+   messages: [{ role: "user", content: "Say hello" }]
+ });
+
+ const req = http.request({
+   hostname: 'localhost',
+   port: 8081,
+   path: '/v1/messages',
+   method: 'POST',
+   headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(data) }
+ }, (res) => {
+   let body = '';
+   res.on('data', chunk => body += chunk);
+   res.on('end', () => {
+     console.log('Status:', res.statusCode);
+     if (res.statusCode === 200) {
+       const json = JSON.parse(body);
+       console.log('✅ SUCCESS!');
+       console.log('Model:', json.model);
+       console.log('Response:', json.content[0].text.substring(0, 150));
+     } else {
+       console.log('❌ Error:', body.substring(0, 300));
+     }
+   });
+ });
+
+ req.on('error', e => console.error('Request failed:', e.message));
+ req.write(data);
+ req.end();
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
    "name": "lynkr",
-   "version": "3.2.1",
-   "description": "Self-hosted Claude Code proxy with Databricks,Azure adapters, openrouter, Ollama,llamacpp, workspace tooling, and MCP integration.",
+   "version": "4.0.0",
+   "description": "Self-hosted Claude Code & Cursor proxy with Databricks, AWS Bedrock, and Azure adapters, OpenRouter, Ollama, llama.cpp, LM Studio, workspace tooling, and MCP integration.",
    "main": "index.js",
    "bin": {
      "lynkr": "./bin/cli.js",