@curaious/uno 0.1.17 → 0.1.18

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,188 +1,155 @@
  # Uno
+ This project primarily provides two things:
+ * [Golang SDK](https://github.com/curaious/uno?tab=readme-ov-file#golang-sdk) for building LLM-powered applications.
+   * [Make LLM calls](https://github.com/curaious/uno?tab=readme-ov-file#making-llm-calls) to multiple providers through a unified interface.
+   * [Build agents](https://github.com/curaious/uno?tab=readme-ov-file#building-agents) with custom tools, MCP servers, conversation history, and durable execution.

- A high-performance LLM Gateway and Agent Framework written in Go.
+ * [AI Gateway](https://github.com/curaious/uno?tab=readme-ov-file#ai-gateway) for managing access to and rate limiting LLM calls, and for building and deploying agents without writing code, with built-in observability for LLM calls and the agent loop.

- Uno provides a unified interface for interacting with LLMs from OpenAI, Anthropic, Gemini, xAI, and Ollama. Use it as a standalone gateway with virtual keys and observability, or embed the SDK directly in your Go applications.
+ ## Golang SDK

- ## Features
+ ### Quickstart

- - **Unified API** Single request/response format across all providers
- - **Virtual Keys** — Protect provider API keys with Uno-generated keys
- - **Observability** — Built-in tracing with OpenTelemetry and ClickHouse
- - **Agent Framework** — Build agents with tool calling, MCP integration, and conversation history
- - **Durable Execution** — Run agents with [Restate](https://restate.dev) for fault-tolerant workflows
+ **Add the SDK to your project**

- ## Quickstart
-
- ### Gateway Mode
-
- Start the gateway with Docker:
-
- ```bash
- npx @curaious/uno
- ```
-
- Open `http://localhost:3000` to configure providers and create virtual keys.
-
- Point your existing SDK to the gateway:
-
- ```python
- from openai import OpenAI
-
- client = OpenAI(
- base_url="http://localhost:6060/api/gateway/openai",
- api_key="your-virtual-key",
- )
-
- response = client.responses.create(
- model="gpt-4.1-mini",
- input="Hello!",
- )
  ```
-
- ### SDK Mode
-
- Install:
-
- ```bash
  go get -u github.com/curaious/uno
  ```

- Use the SDK directly:
-
+ **Initialize the SDK**
  ```go
- package main
-
- import (
- "context"
- "fmt"
- "os"
-
- "github.com/curaious/uno/pkg/gateway"
- "github.com/curaious/uno/pkg/llm"
- "github.com/curaious/uno/pkg/llm/responses"
- "github.com/curaious/uno/pkg/sdk"
- "github.com/curaious/uno/internal/utils"
- )
-
- func main() {
- client, _ := sdk.New(&sdk.ClientOptions{
- LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
- {
- ProviderName: llm.ProviderNameOpenAI,
- ApiKeys: []*gateway.APIKeyConfig{
- {Name: "default", APIKey: os.Getenv("OPENAI_API_KEY")},
- },
- },
- }),
- })
-
- model := client.NewLLM(sdk.LLMOptions{
- Provider: llm.ProviderNameOpenAI,
- Model: "gpt-4.1-mini",
- })
-
- resp, _ := model.NewResponses(context.Background(), &responses.Request{
- Input: responses.InputUnion{
- OfString: utils.Ptr("What is the capital of France?"),
- },
- })
-
- fmt.Println(resp.Output[0].OfOutputMessage.Content[0].OfOutputText.Text)
- }
+ client, err := sdk.New(&sdk.ClientOptions{
+ LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
+ {
+ ProviderName: llm.ProviderNameOpenAI,
+ ApiKeys: []*gateway.APIKeyConfig{
+ {
+ Name: "Key 1",
+ APIKey: os.Getenv("OPENAI_API_KEY"),
+ },
+ },
+ },
+ }),
+ })
  ```
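The new snippet omits imports and error handling. For reference, a minimal sketch of a complete setup, using the import paths shown in the pre-0.1.18 example removed above (paths taken from that removed example, not re-verified against 0.1.18):

```go
package main

import (
	"log"
	"os"

	// Import paths as used in the pre-0.1.18 README example removed above.
	"github.com/curaious/uno/pkg/gateway"
	"github.com/curaious/uno/pkg/llm"
	"github.com/curaious/uno/pkg/sdk"
)

func main() {
	// Same configuration as the "Initialize the SDK" snippet, with the
	// error actually checked instead of discarded.
	client, err := sdk.New(&sdk.ClientOptions{
		LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
			{
				ProviderName: llm.ProviderNameOpenAI,
				ApiKeys: []*gateway.APIKeyConfig{
					{Name: "Key 1", APIKey: os.Getenv("OPENAI_API_KEY")},
				},
			},
		}),
	})
	if err != nil {
		log.Fatal(err)
	}
	_ = client // used in the sections that follow
}
```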

- ## Provider Support
-
- | Provider | Text | Image Gen | Image Input | Tool Calls | Reasoning | Streaming | Structured Output | Embeddings |
- | :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
- | OpenAI | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
- | Anthropic | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
- | Gemini | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
- | xAI | ✅ | ❌ | ❌ | ✅ | ❌ | ✅ | ✅ | ❌ |
- | Ollama | ✅ | ❌ | ✅ | ✅ | ❌ | ✅ | ✅ | ✅ |
-
- ## SDK Capabilities
+ ---

- ### Responses API
-
- Generate text, images, and structured output:
+ ### Making LLM Calls

+ **Step 1: Create a model instance**
  ```go
- resp, _ := model.NewResponses(ctx, &responses.Request{
- Instructions: utils.Ptr("You are a helpful assistant."),
- Input: responses.InputUnion{OfString: utils.Ptr("Hello!")},
+ // OpenAI
+ model := client.NewLLM(sdk.LLMOptions{
+ Provider: llm.ProviderNameOpenAI,
+ Model: "gpt-4.1-mini",
  })
- ```
-
- ### Agents
-
- Build agents with tools and conversation memory:

+ // Anthropic
+ model := client.NewLLM(sdk.LLMOptions{
+ Provider: llm.ProviderNameAnthropic,
+ Model: "claude-haiku-4-5",
+ })
+ ```
+ **Step 2: Make the LLM Call**
  ```go
- agent := client.NewAgent(&sdk.AgentOptions{
- Name: "Assistant",
- Instruction: client.Prompt("You are a helpful assistant."),
- LLM: client.NewLLM(sdk.LLMOptions{Provider: llm.ProviderNameOpenAI, Model: "gpt-4o"}),
- Tools: []core.Tool{weatherTool, searchTool},
+ // Responses (text generation)
+ resp, err := model.NewResponses(
+ context.Background(),
+ &responses.Request{
+ Instructions: utils.Ptr("You are a helpful assistant. You greet the user with a light joke."),
+ Input: responses.InputUnion{
+ OfString: utils.Ptr("Hello!"),
+ },
+ },
+ )
+
+ // Embeddings
+ resp, err := model.NewEmbedding(context.Background(), &embeddings.Request{
+ Input: embeddings.InputUnion{
+ OfString: utils.Ptr("The food was delicious and the waiter..."),
+ },
  })

- output, _ := agent.Execute(ctx, []responses.InputMessageUnion{
- responses.UserMessage("What's the weather in Tokyo?"),
- }, callback)
+ // Text-to-speech
+ resp, err := model.NewSpeech(context.Background(), &speech.Request{
+ Input: "Hello, this is a test of the text-to-speech system.",
+ Model: "tts-1",
+ Voice: "alloy",
+ })
  ```
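None of the calls above shows how to read the generated text back out of `resp`. The pre-0.1.18 example removed above printed the first output message; a sketch in the same fragment style as the snippets above, assuming that field path is unchanged in 0.1.18:

```go
// Make the call as in Step 2, then check the error before reading output.
resp, err := model.NewResponses(context.Background(), &responses.Request{
	Input: responses.InputUnion{OfString: utils.Ptr("Hello!")},
})
if err != nil {
	log.Fatal(err)
}
// Output field path as printed in the pre-0.1.18 README example removed above.
fmt.Println(resp.Output[0].OfOutputMessage.Content[0].OfOutputText.Text)
```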

- ### MCP Tools
+ **Refer to documentation for advanced usage:**
+ * [Text Generation](https://curaious.techinscribed.com/uno-sdk/responses/text-generation)
+ * [Tool Calling](https://curaious.techinscribed.com/uno-sdk/responses/tool-calling)
+ * [Reasoning](https://curaious.techinscribed.com/uno-sdk/responses/reasoning)
+ * [Structured Output](https://curaious.techinscribed.com/uno-sdk/responses/structured-output)
+ * [Image Generation](https://curaious.techinscribed.com/uno-sdk/responses/image-generation)
+ * [Web Search Tool](https://curaious.techinscribed.com/uno-sdk/responses/web-search-tool)
+ * [Code Execution Tool](https://curaious.techinscribed.com/uno-sdk/responses/code-execution-tool)

- Connect to MCP servers:
+ ---

+ ## Building Agents
  ```go
  agent := client.NewAgent(&sdk.AgentOptions{
- Name: "MCP Agent",
- Instruction: client.Prompt("You have access to external tools."),
+ Name: "Hello world agent",
+ Instruction: client.Prompt("You are a helpful assistant. You are interacting with the user named {{name}}"),
  LLM: model,
- MCPServers: []*core.MCPServer{
- {Name: "filesystem", Command: "npx", Args: []string{"-y", "@anthropic/mcp-filesystem"}},
+ Parameters: responses.Parameters{
+ Temperature: utils.Ptr(0.2),
  },
  })
- ```

- ### Embeddings
-
- Generate text embeddings:
-
- ```go
- resp, _ := model.NewEmbedding(ctx, &embeddings.Request{
- Input: embeddings.InputUnion{
- OfString: utils.Ptr("The food was delicious"),
+ out, err := agent.Execute(context.Background(), &agents.AgentInput{
+ Messages: []responses.InputMessageUnion{
+ responses.UserMessage("Hello!"),
  },
  })
  ```
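The agent above only sets a system prompt and model parameters. The pre-0.1.18 README (removed above) also attached function tools and MCP servers directly on `AgentOptions`; a sketch of those field shapes, which may have changed in 0.1.18, with `weatherTool`, `searchTool`, and the MCP server entry as placeholders (see the Function Tools and MCP Tools links further below):

```go
// Field shapes taken from the pre-0.1.18 README example removed above.
agent := client.NewAgent(&sdk.AgentOptions{
	Name:        "Tool-using agent",
	Instruction: client.Prompt("You have access to external tools."),
	LLM:         model,
	// weatherTool and searchTool are illustrative placeholders.
	Tools: []core.Tool{weatherTool, searchTool},
	MCPServers: []*core.MCPServer{
		{Name: "filesystem", Command: "npx", Args: []string{"-y", "@anthropic/mcp-filesystem"}},
	},
})
```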

- ## Examples
-
- See the [`examples/`](./examples) directory:
-
- | Example | Description |
- | :--- | :--- |
- | `1_text_generation` | Basic text generation with streaming |
- | `2_tool_calling` | Function calling with LLMs |
- | `3_reasoning` | Chain-of-thought reasoning |
- | `4_image_processing` | Image input processing |
- | `5_image_generation` | Generate images with DALL-E/Imagen |
- | `6_simple_agent` | Basic agent setup |
- | `7_tool_calling_agent` | Agent with function tools |
- | `8_agent_multi_turn_conversation` | Multi-turn conversations |
- | `9_agent_with_mcp_tools` | MCP server integration |
- | `10_agent_as_a_tool` | Compose agents as tools |
- | `11_human_in_the_loop` | Human approval workflows |
- | `12_embeddings` | Text embeddings |
-
- ## Documentation
-
- Full documentation: [docs](./docs)
+ **Refer to documentation for more advanced usage:**
+ * [System Prompt](https://curaious.techinscribed.com/uno-sdk/agents/system-instruction)
+ * [Function Tools](https://curaious.techinscribed.com/uno-sdk/agents/tools/function-tools)
+ * [MCP Tools](https://curaious.techinscribed.com/uno-sdk/agents/tools/mcp-tools)
+ * [Agent as a Tool](https://curaious.techinscribed.com/uno-sdk/agents/tools/agent-as-a-tool)
+ * [Human in the loop](https://curaious.techinscribed.com/uno-sdk/agents/tools/human-in-the-loop)
+ * [Conversation History](https://curaious.techinscribed.com/uno-sdk/agents/conversations/history)
+ * [History Compaction or Summarization](https://curaious.techinscribed.com/uno-sdk/agents/conversations/summarization)
+ * [Durable Execution via Restate](https://curaious.techinscribed.com/uno-sdk/agents/durable/restate)
+ * [Durable Execution via Temporal](https://curaious.techinscribed.com/uno-sdk/agents/durable/temporal)
+ * [Serving Agent through HTTP](https://curaious.techinscribed.com/uno-sdk/agents/serving-agents/serving-agents-http)
+
+ ## AI Gateway
+
+ ### Quickstart
+
+ **Prerequisite:**
+ _Docker and Docker Compose installed and running._
+
+ **Install AI Gateway and run locally**
+ ```bash
+ npx -y @curaious/uno
+ ```
+ Then visit http://localhost:3000
+
+ **Refer to documentation for advanced usage:**
+ * LLM Gateway
+   * [Configuring Providers](https://curaious.techinscribed.com/gateway/llm/providers)
+   * [Managing Virtual Keys & Rate Limits](https://curaious.techinscribed.com/gateway/llm/virtual-keys)
+   * [Using with OpenAI SDK](https://curaious.techinscribed.com/gateway/llm/sdk-integrations/openai)
+   * [Observability](https://curaious.techinscribed.com/gateway/llm/tracing)
+ * No-code Agent Builder
+   * [Choosing agent runtime](https://curaious.techinscribed.com/gateway/agent-builder/agent-runtime)
+   * [Setting model parameters](https://curaious.techinscribed.com/gateway/agent-builder/model-configuration)
+   * [Configure system prompt](https://curaious.techinscribed.com/gateway/agent-builder/system-prompt)
+   * [Using built-in tools](https://curaious.techinscribed.com/gateway/agent-builder/built-in-tools)
+   * [Connecting MCP servers](https://curaious.techinscribed.com/gateway/agent-builder/mcp-server)
+   * [Structured Output](https://curaious.techinscribed.com/gateway/agent-builder/structured-output)
+   * [Setup history & summarization](https://curaious.techinscribed.com/gateway/agent-builder/conversation-history)
+   * [Versioning agents](https://curaious.techinscribed.com/gateway/agent-builder/versioning)
+   * [Chat with the agent](https://curaious.techinscribed.com/gateway/agent-builder/conversing-with-the-agent)

  ## License

  Apache 2.0
-
@@ -16,9 +16,9 @@ REDIS_PORT=6379

  ALLOWED_HEADERS="Content-Type, Authorization"

- RESTATE_WORKER_HOST_PORT="0.0.0.0:9080"
  RESTATE_SERVER_ENDPOINT="http://host.docker.internal:8081"

  TEMPORAL_SERVER_HOST_PORT="host.docker.internal:7233"

- RUNTIME_ENABLED=""
+ SANDBOX_ENABLED="true"
+ SANDBOX_DEFAULT_IMAGE="praveenraj9495/uno-sandbox"
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@curaious/uno",
- "version": "0.1.17",
+ "version": "0.1.18",
  "description": "Uno - AI Gateway and Agent Framework",
  "bin": {
  "uno": "./bin/cli.js"