langsmith 0.1.2 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -90,8 +90,9 @@ Langsmith's `traceable` wrapper function makes it easy to trace any function or
  ### OpenAI SDK

  <!-- markdown-link-check-disable -->
- The easiest ways to trace calls from the [OpenAI SDK](https://platform.openai.com/docs/api-reference) with LangSmith
- is using the `traceable` wrapper function available in LangSmith 0.1.0 and up.
+
+ The easiest way to trace calls from the [OpenAI SDK](https://platform.openai.com/docs/api-reference) with LangSmith
+ is using the `wrapOpenAI` wrapper function available in LangSmith 0.1.3 and up.

  In order to use, you first need to set your LangSmith API key:

@@ -105,72 +106,41 @@ Next, you will need to install the LangSmith SDK and the OpenAI SDK:
  npm install langsmith openai
  ```

- After that, initialize your OpenAI client:
+ After that, initialize your OpenAI client and wrap the client with `wrapOpenAI` method to enable tracing for the completions and chat completions methods:

  ```ts
  import { OpenAI } from "openai";
+ import { wrapOpenAI } from "langsmith/wrappers";
+
+ const openai = wrapOpenAI(new OpenAI());

- const client = new OpenAI();
+ await openai.chat.completions.create({
+   model: "gpt-3.5-turbo",
+   messages: [{ content: "Hi there!", role: "user" }],
+ });
  ```

- Then, you can wrap the client methods you want to use by passing it to the `traceable` function like this:
+ Alternatively, you can use the `traceable` function to wrap the client methods you want to use:

  ```ts
  import { traceable } from "langsmith/traceable";

+ const openai = new OpenAI();
+
  const createCompletion = traceable(
    openai.chat.completions.create.bind(openai.chat.completions),
    { name: "OpenAI Chat Completion", run_type: "llm" }
  );
- ```
-
- Note the use of `.bind` to preserve the function's context. The `run_type` field in the extra config object
- marks the function as an LLM call, and enables token usage tracking for OpenAI.
-
- This new method takes the same exact arguments and has the same return type as the original method,
- but will log everything to LangSmith!

- ```ts
  await createCompletion({
    model: "gpt-3.5-turbo",
    messages: [{ content: "Hi there!", role: "user" }],
  });
  ```

- ```
- {
-   id: 'chatcmpl-8sOWEOYVyehDlyPcBiaDtTxWvr9v6',
-   object: 'chat.completion',
-   created: 1707974654,
-   model: 'gpt-3.5-turbo-0613',
-   choices: [
-     {
-       index: 0,
-       message: { role: 'assistant', content: 'Hello! How can I help you today?' },
-       logprobs: null,
-       finish_reason: 'stop'
-     }
-   ],
-   usage: { prompt_tokens: 10, completion_tokens: 9, total_tokens: 19 },
-   system_fingerprint: null
- }
- ```
-
- This also works for streaming:
-
- ```ts
- const stream = await createCompletion({
-   model: "gpt-3.5-turbo",
-   stream: true,
-   messages: [{ content: "Hi there!", role: "user" }],
- });
- ```
-
- ```ts
- for await (const chunk of stream) {
-   console.log(chunk);
- }
- ```
+ Note the use of `.bind` to preserve the function's context. The `run_type` field in the
+ extra config object marks the function as an LLM call, and enables token usage tracking
+ for OpenAI.

  Oftentimes, you use the OpenAI client inside of other functions or as part of a longer
  sequence. You can automatically get nested traces by using this wrapped method
@@ -178,7 +148,7 @@ within other functions wrapped with `traceable`.

  ```ts
  const nestedTrace = traceable(async (text: string) => {
-   const completion = await createCompletion({
+   const completion = await openai.chat.completions.create({
      model: "gpt-3.5-turbo",
      messages: [{ content: text, role: "user" }],
    });
@@ -230,25 +200,22 @@ import { NextRequest, NextResponse } from "next/server";

  import { OpenAI } from "openai";
  import { traceable } from "langsmith/traceable";
+ import { wrapOpenAI } from "langsmith/wrappers";

  export const runtime = "edge";

  const handler = traceable(
    async function () {
-     const openai = new OpenAI();
-     const createCompletion = traceable(
-       openai.chat.completions.create.bind(openai.chat.completions),
-       { name: "OpenAI Chat Completion", run_type: "llm" }
-     );
+     const openai = wrapOpenAI(new OpenAI());

-     const completion = await createCompletion({
+     const completion = await openai.chat.completions.create({
        model: "gpt-3.5-turbo",
        messages: [{ content: "Why is the sky blue?", role: "user" }],
      });

      const response1 = completion.choices[0].message.content;

-     const completion2 = await createCompletion({
+     const completion2 = await openai.chat.completions.create({
        model: "gpt-3.5-turbo",
        messages: [
          { content: "Why is the sky blue?", role: "user" },
@@ -287,28 +254,25 @@ The [Vercel AI SDK](https://sdk.vercel.ai/docs) contains integrations with a var
  Here's an example of how you can trace outputs in a Next.js handler:

  ```ts
- import { traceable } from 'langsmith/traceable';
- import { OpenAIStream, StreamingTextResponse } from 'ai';
+ import { traceable } from "langsmith/traceable";
+ import { OpenAIStream, StreamingTextResponse } from "ai";

  // Note: There are no types for the Mistral API client yet.
- import MistralClient from '@mistralai/mistralai';
+ import MistralClient from "@mistralai/mistralai";

- const client = new MistralClient(process.env.MISTRAL_API_KEY || '');
+ const client = new MistralClient(process.env.MISTRAL_API_KEY || "");

  export async function POST(req: Request) {
    // Extract the `messages` from the body of the request
    const { messages } = await req.json();

-   const mistralChatStream = traceable(
-     client.chatStream.bind(client),
-     {
-       name: "Mistral Stream",
-       run_type: "llm",
-     }
-   );
+   const mistralChatStream = traceable(client.chatStream.bind(client), {
+     name: "Mistral Stream",
+     run_type: "llm",
+   });

    const response = await mistralChatStream({
-     model: 'mistral-tiny',
+     model: "mistral-tiny",
      maxTokens: 1000,
      messages,
    });
@@ -324,7 +288,6 @@ export async function POST(req: Request) {

  See the [AI SDK docs](https://sdk.vercel.ai/docs) for more examples.

-
  #### Alternatives: **Log traces using a RunTree.**

  A RunTree tracks your application. Each RunTree object is required to have a name and run_type. These and other important attributes are as follows:
@@ -413,7 +376,7 @@ try {
    await childChainRun.end({
      error: `I errored again ${e.message}`,
    });
-   await childChainRun.patchRun();
+   await childChainRun.patchRun();
    throw e;
  }

@@ -431,7 +394,7 @@ await parentRun.patchRun();

  ## Evaluation

- #### Create a Dataset from Existing Runs
+ #### Create a Dataset from Existing Runs

  Once your runs are stored in LangSmith, you can convert them into a dataset.
  For this example, we will do so using the Client, but you can also do this using
@@ -29,6 +29,8 @@ services:
        condition: service_healthy
      clickhouse-setup:
        condition: service_completed_successfully
+     postgres-setup:
+       condition: service_completed_successfully
    restart: always
  langchain-queue:
    image: langchain/${_LANGSMITH_IMAGE_PREFIX-}langchainplus-backend:${_LANGSMITH_IMAGE_VERSION:-latest}
@@ -44,6 +46,8 @@ services:
        condition: service_healthy
      clickhouse-setup:
        condition: service_completed_successfully
+     postgres-setup:
+       condition: service_completed_successfully
    restart: always
  langchain-hub:
    image: langchain/${_LANGSMITH_IMAGE_PREFIX-}langchainhub-backend:${_LANGSMITH_IMAGE_VERSION:-latest}
@@ -55,7 +59,15 @@ services:
    ports:
      - 1985:1985
    depends_on:
-     - langchain-db
+     langchain-db:
+       condition: service_healthy
+     langchain-redis:
+       condition: service_healthy
+     clickhouse-setup:
+       condition: service_completed_successfully
+     postgres-setup:
+       condition: service_completed_successfully
+   restart: always
  langchain-db:
    image: postgres:14.7
    command:
@@ -114,13 +126,29 @@ services:
    depends_on:
      langchain-clickhouse:
        condition: service_healthy
-   restart: "no"
+   restart: "on-failure:10"
    entrypoint:
      [
        "bash",
        "-c",
        "migrate -source file://clickhouse/migrations -database 'clickhouse://langchain-clickhouse:9000?username=default&password=password&database=default&x-multi-statement=true&x-migrations-table-engine=MergeTree' up",
      ]
+ postgres-setup:
+   image: langchain/${_LANGSMITH_IMAGE_PREFIX-}langchainplus-backend:${_LANGSMITH_IMAGE_VERSION:-latest}
+   depends_on:
+     langchain-db:
+       condition: service_healthy
+   environment:
+     - LANGCHAIN_ENV=local_docker
+     - LOG_LEVEL=warning
+     - LANGSMITH_LICENSE_KEY=${LANGSMITH_LICENSE_KEY}
+   restart: "on-failure:10"
+   entrypoint:
+     [
+       "bash",
+       "-c",
+       "alembic upgrade head",
+     ]
  volumes:
    langchain-db-data:
    langchain-redis-data:
package/dist/client.cjs CHANGED
@@ -244,7 +244,6 @@ class Client {
    this.apiUrl = trimQuotes(config.apiUrl ?? defaultConfig.apiUrl) ?? "";
    this.apiKey = trimQuotes(config.apiKey ?? defaultConfig.apiKey);
    this.webUrl = trimQuotes(config.webUrl ?? defaultConfig.webUrl);
-   this.validateApiKeyIfHosted();
    this.timeout_ms = config.timeout_ms ?? 12000;
    this.caller = new async_caller_js_1.AsyncCaller(config.callerOptions ?? {});
    this.hideInputs = config.hideInputs ?? defaultConfig.hideInputs;
@@ -267,12 +266,6 @@ class Client {
      hideOutputs: hideOutputs,
    };
  }
- validateApiKeyIfHosted() {
-   const isLocal = isLocalhost(this.apiUrl);
-   if (!isLocal && !this.apiKey) {
-     throw new Error("API key must be provided when using hosted LangSmith API");
-   }
- }
  getHostUrl() {
    if (this.webUrl) {
      return this.webUrl;
@@ -569,13 +562,18 @@ class Client {
      "Content-Type": "application/json",
      Accept: "application/json",
    };
-   const response = await this.caller.call(fetch, `${this.apiUrl}/runs/batch`, {
-     method: "POST",
-     headers,
-     body: JSON.stringify(body),
-     signal: AbortSignal.timeout(this.timeout_ms),
-   });
-   await raiseForStatus(response, "batch create run");
+   try {
+     const response = await this.caller.call(fetch, `${this.apiUrl}/runs/batch`, {
+       method: "POST",
+       headers,
+       body: JSON.stringify(body),
+       signal: AbortSignal.timeout(this.timeout_ms),
+     });
+     await raiseForStatus(response, "batch create run");
+   }
+   catch (e) {
+     console.error(`Failed to batch create runs: ${e}`);
+   }
  }
  async updateRun(runId, run) {
    assertUuid(runId);
package/dist/client.d.ts CHANGED
@@ -98,7 +98,6 @@ export declare class Client {
    hideInputs?: boolean;
    hideOutputs?: boolean;
  };
- private validateApiKeyIfHosted;
  private getHostUrl;
  private get headers();
  private processInputs;
package/dist/client.js CHANGED
@@ -217,7 +217,6 @@ export class Client {
    this.apiUrl = trimQuotes(config.apiUrl ?? defaultConfig.apiUrl) ?? "";
    this.apiKey = trimQuotes(config.apiKey ?? defaultConfig.apiKey);
    this.webUrl = trimQuotes(config.webUrl ?? defaultConfig.webUrl);
-   this.validateApiKeyIfHosted();
    this.timeout_ms = config.timeout_ms ?? 12000;
    this.caller = new AsyncCaller(config.callerOptions ?? {});
    this.hideInputs = config.hideInputs ?? defaultConfig.hideInputs;
@@ -240,12 +239,6 @@ export class Client {
      hideOutputs: hideOutputs,
    };
  }
- validateApiKeyIfHosted() {
-   const isLocal = isLocalhost(this.apiUrl);
-   if (!isLocal && !this.apiKey) {
-     throw new Error("API key must be provided when using hosted LangSmith API");
-   }
- }
  getHostUrl() {
    if (this.webUrl) {
      return this.webUrl;
@@ -542,13 +535,18 @@ export class Client {
      "Content-Type": "application/json",
      Accept: "application/json",
    };
-   const response = await this.caller.call(fetch, `${this.apiUrl}/runs/batch`, {
-     method: "POST",
-     headers,
-     body: JSON.stringify(body),
-     signal: AbortSignal.timeout(this.timeout_ms),
-   });
-   await raiseForStatus(response, "batch create run");
+   try {
+     const response = await this.caller.call(fetch, `${this.apiUrl}/runs/batch`, {
+       method: "POST",
+       headers,
+       body: JSON.stringify(body),
+       signal: AbortSignal.timeout(this.timeout_ms),
+     });
+     await raiseForStatus(response, "batch create run");
+   }
+   catch (e) {
+     console.error(`Failed to batch create runs: ${e}`);
+   }
  }
  async updateRun(runId, run) {
    assertUuid(runId);
package/dist/index.cjs CHANGED
@@ -6,4 +6,4 @@ Object.defineProperty(exports, "Client", { enumerable: true, get: function () {
  var run_trees_js_1 = require("./run_trees.cjs");
  Object.defineProperty(exports, "RunTree", { enumerable: true, get: function () { return run_trees_js_1.RunTree; } });
  // Update using yarn bump-version
- exports.__version__ = "0.1.2";
+ exports.__version__ = "0.1.5";
package/dist/index.d.ts CHANGED
@@ -1,4 +1,4 @@
  export { Client } from "./client.js";
  export type { Dataset, Example, TracerSession, Run, Feedback, } from "./schemas.js";
  export { RunTree, type RunTreeConfig } from "./run_trees.js";
- export declare const __version__ = "0.1.2";
+ export declare const __version__ = "0.1.5";
package/dist/index.js CHANGED
@@ -1,4 +1,4 @@
  export { Client } from "./client.js";
  export { RunTree } from "./run_trees.js";
  // Update using yarn bump-version
- export const __version__ = "0.1.2";
+ export const __version__ = "0.1.5";
@@ -23,7 +23,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
    return result;
  };
  Object.defineProperty(exports, "__esModule", { value: true });
- exports.isRunTree = exports.RunTree = exports.convertToDottedOrderFormat = void 0;
+ exports.isRunnableConfigLike = exports.isRunTree = exports.RunTree = exports.convertToDottedOrderFormat = void 0;
  const uuid = __importStar(require("uuid"));
  const env_js_1 = require("./utils/env.cjs");
  const client_js_1 = require("./client.cjs");
@@ -98,6 +98,12 @@ class RunTree {
    writable: true,
    value: void 0
  });
+ Object.defineProperty(this, "tags", {
+   enumerable: true,
+   configurable: true,
+   writable: true,
+   value: void 0
+ });
  Object.defineProperty(this, "error", {
    enumerable: true,
    configurable: true,
@@ -153,10 +159,11 @@ class RunTree {
    value: void 0
  });
  const defaultConfig = RunTree.getDefaultConfig();
- Object.assign(this, { ...defaultConfig, ...config });
+ const client = config.client ?? new client_js_1.Client();
+ Object.assign(this, { ...defaultConfig, ...config, client });
  if (!this.trace_id) {
    if (this.parent_run) {
-     this.trace_id = this.parent_run.trace_id;
+     this.trace_id = this.parent_run.trace_id ?? this.id;
    }
    else {
      this.trace_id = this.id;
@@ -173,6 +180,35 @@ class RunTree {
    }
  }
  }
+ static fromRunnableConfig(config, props) {
+   // We only handle the callback manager case for now
+   const callbackManager = config?.callbacks;
+   let parentRun;
+   let projectName;
+   if (callbackManager) {
+     const parentRunId = callbackManager?.getParentRunId?.() ?? "";
+     const langChainTracer = callbackManager?.handlers?.find((handler) => handler?.name == "langchain_tracer");
+     parentRun = langChainTracer?.getRun?.(parentRunId);
+     projectName = langChainTracer?.projectName;
+   }
+   const deduppedTags = [
+     ...new Set((parentRun?.tags ?? []).concat(config?.tags ?? [])),
+   ];
+   const dedupedMetadata = {
+     ...parentRun?.extra?.metadata,
+     ...config?.metadata,
+   };
+   const rt = new RunTree({
+     name: props?.name ?? "<lambda>",
+     parent_run: parentRun,
+     tags: deduppedTags,
+     extra: {
+       metadata: dedupedMetadata,
+     },
+     project_name: projectName,
+   });
+   return rt;
+ }
  static getDefaultConfig() {
    return {
      id: uuid.v4(),
@@ -188,7 +224,6 @@ class RunTree {
      serialized: {},
      inputs: {},
      extra: {},
-     client: new client_js_1.Client({}),
    };
  }
  async createChild(config) {
@@ -244,6 +279,7 @@ class RunTree {
      parent_run_id: parent_run_id,
      trace_id: run.trace_id,
      dotted_order: run.dotted_order,
+     tags: run.tags,
    };
    return persistedRun;
  }
@@ -268,6 +304,7 @@ class RunTree {
      events: this.events,
      dotted_order: this.dotted_order,
      trace_id: this.trace_id,
+     tags: this.tags,
    };
    await this.client.updateRun(this.id, runUpdate);
  }
@@ -279,3 +316,22 @@ function isRunTree(x) {
    typeof x.postRun === "function");
  }
  exports.isRunTree = isRunTree;
+ function containsLangChainTracerLike(x) {
+   return (Array.isArray(x) &&
+     x.some((callback) => {
+       return (typeof callback.name === "string" &&
+         callback.name === "langchain_tracer");
+     }));
+ }
+ function isRunnableConfigLike(x) {
+   // Check that it's an object with a callbacks arg
+   // that has either a CallbackManagerLike object with a langchain tracer within it
+   // or an array with a LangChainTracerLike object within it
+   return (x !== undefined &&
+     typeof x.callbacks === "object" &&
+     // Callback manager with a langchain tracer
+     (containsLangChainTracerLike(x.callbacks?.handlers) ||
+       // Or it's an array with a LangChainTracerLike object within it
+       containsLangChainTracerLike(x.callbacks)));
+ }
+ exports.isRunnableConfigLike = isRunnableConfigLike;
@@ -7,10 +7,12 @@ export interface RunTreeConfig {
    id?: string;
    project_name?: string;
    parent_run?: RunTree;
+   parent_run_id?: string;
    child_runs?: RunTree[];
    start_time?: number;
    end_time?: number;
    extra?: KVMap;
+   tags?: string[];
    error?: string;
    serialized?: object;
    inputs?: KVMap;
@@ -18,16 +20,34 @@ export interface RunTreeConfig {
    reference_example_id?: string;
    client?: Client;
  }
+ export interface RunnableConfigLike {
+   /**
+    * Tags for this call and any sub-calls (eg. a Chain calling an LLM).
+    * You can use these to filter calls.
+    */
+   tags?: string[];
+   /**
+    * Metadata for this call and any sub-calls (eg. a Chain calling an LLM).
+    * Keys should be strings, values should be JSON-serializable.
+    */
+   metadata?: Record<string, unknown>;
+   /**
+    * Callbacks for this call and any sub-calls (eg. a Chain calling an LLM).
+    * Tags are passed to all callbacks, metadata is passed to handle*Start callbacks.
+    */
+   callbacks?: any;
+ }
  export declare class RunTree implements BaseRun {
    id: string;
    name: RunTreeConfig["name"];
    run_type: string;
    project_name: string;
-   parent_run?: RunTree;
+   parent_run?: BaseRun;
    child_runs: RunTree[];
    start_time: number;
    end_time?: number;
    extra: KVMap;
+   tags?: string[];
    error?: string;
    serialized: object;
    inputs: KVMap;
@@ -38,6 +58,11 @@ export declare class RunTree implements BaseRun {
    trace_id: string;
    dotted_order: string;
    constructor(config: RunTreeConfig);
+   static fromRunnableConfig(config: RunnableConfigLike, props: {
+     name: string;
+     tags?: string[];
+     metadata?: KVMap;
+   }): RunTree;
    private static getDefaultConfig;
    createChild(config: RunTreeConfig): Promise<RunTree>;
    end(outputs?: KVMap, error?: string, endTime?: number): Promise<void>;
@@ -46,3 +71,4 @@ export declare class RunTree implements BaseRun {
    patchRun(): Promise<void>;
  }
  export declare function isRunTree(x?: unknown): x is RunTree;
+ export declare function isRunnableConfigLike(x?: unknown): x is RunnableConfigLike;
package/dist/run_trees.js CHANGED
@@ -71,6 +71,12 @@ export class RunTree {
    writable: true,
    value: void 0
  });
+ Object.defineProperty(this, "tags", {
+   enumerable: true,
+   configurable: true,
+   writable: true,
+   value: void 0
+ });
  Object.defineProperty(this, "error", {
    enumerable: true,
    configurable: true,
@@ -126,10 +132,11 @@ export class RunTree {
    value: void 0
  });
  const defaultConfig = RunTree.getDefaultConfig();
- Object.assign(this, { ...defaultConfig, ...config });
+ const client = config.client ?? new Client();
+ Object.assign(this, { ...defaultConfig, ...config, client });
  if (!this.trace_id) {
    if (this.parent_run) {
-     this.trace_id = this.parent_run.trace_id;
+     this.trace_id = this.parent_run.trace_id ?? this.id;
    }
    else {
      this.trace_id = this.id;
@@ -146,6 +153,35 @@ export class RunTree {
    }
  }
  }
+ static fromRunnableConfig(config, props) {
+   // We only handle the callback manager case for now
+   const callbackManager = config?.callbacks;
+   let parentRun;
+   let projectName;
+   if (callbackManager) {
+     const parentRunId = callbackManager?.getParentRunId?.() ?? "";
+     const langChainTracer = callbackManager?.handlers?.find((handler) => handler?.name == "langchain_tracer");
+     parentRun = langChainTracer?.getRun?.(parentRunId);
+     projectName = langChainTracer?.projectName;
+   }
+   const deduppedTags = [
+     ...new Set((parentRun?.tags ?? []).concat(config?.tags ?? [])),
+   ];
+   const dedupedMetadata = {
+     ...parentRun?.extra?.metadata,
+     ...config?.metadata,
+   };
+   const rt = new RunTree({
+     name: props?.name ?? "<lambda>",
+     parent_run: parentRun,
+     tags: deduppedTags,
+     extra: {
+       metadata: dedupedMetadata,
+     },
+     project_name: projectName,
+   });
+   return rt;
+ }
  static getDefaultConfig() {
    return {
      id: uuid.v4(),
@@ -161,7 +197,6 @@ export class RunTree {
      serialized: {},
      inputs: {},
      extra: {},
-     client: new Client({}),
    };
  }
  async createChild(config) {
@@ -217,6 +252,7 @@ export class RunTree {
      parent_run_id: parent_run_id,
      trace_id: run.trace_id,
      dotted_order: run.dotted_order,
+     tags: run.tags,
    };
    return persistedRun;
  }
@@ -241,6 +277,7 @@ export class RunTree {
      events: this.events,
      dotted_order: this.dotted_order,
      trace_id: this.trace_id,
+     tags: this.tags,
    };
    await this.client.updateRun(this.id, runUpdate);
  }
@@ -250,3 +287,21 @@ export function isRunTree(x) {
    typeof x.createChild === "function" &&
    typeof x.postRun === "function");
  }
+ function containsLangChainTracerLike(x) {
+   return (Array.isArray(x) &&
+     x.some((callback) => {
+       return (typeof callback.name === "string" &&
+         callback.name === "langchain_tracer");
+     }));
+ }
+ export function isRunnableConfigLike(x) {
+   // Check that it's an object with a callbacks arg
+   // that has either a CallbackManagerLike object with a langchain tracer within it
+   // or an array with a LangChainTracerLike object within it
+   return (x !== undefined &&
+     typeof x.callbacks === "object" &&
+     // Callback manager with a langchain tracer
+     (containsLangChainTracerLike(x.callbacks?.handlers) ||
+       // Or it's an array with a LangChainTracerLike object within it
+       containsLangChainTracerLike(x.callbacks)));
+ }
package/dist/schemas.d.ts CHANGED
@@ -118,6 +118,7 @@ export interface RunUpdate {
    id?: string;
    end_time?: number;
    extra?: KVMap;
+   tags?: string[];
    error?: string;
    inputs?: KVMap;
    outputs?: KVMap;
@@ -36,6 +36,10 @@ function traceable(wrappedFunc, config) {
    currentRunTree = args[0];
    rawInputs = args.slice(1);
  }
+ else if ((0, run_trees_js_1.isRunnableConfigLike)(args[0])) {
+   currentRunTree = run_trees_js_1.RunTree.fromRunnableConfig(args[0], ensuredConfig);
+   rawInputs = args.slice(1);
+ }
  else if (previousRunTree !== undefined) {
    currentRunTree = await previousRunTree.createChild(ensuredConfig);
    rawInputs = args;
@@ -1,4 +1,4 @@
- import { RunTree, RunTreeConfig } from "./run_trees.js";
+ import { RunTree, RunTreeConfig, RunnableConfigLike } from "./run_trees.js";
  export type RunTreeLike = RunTree;
  type WrapArgReturnPair<Pair> = Pair extends [
    infer Args extends any[],
@@ -6,6 +6,7 @@ type WrapArgReturnPair<Pair> = Pair extends [
  ] ? {
    (...args: Args): Promise<Return>;
    (...args: [runTree: RunTreeLike, ...rest: Args]): Promise<Return>;
+   (...args: [config: RunnableConfigLike, ...rest: Args]): Promise<Return>;
  } : never;
  type UnionToIntersection<U> = (U extends any ? (x: U) => void : never) extends (x: infer I) => void ? I : never;
  export type TraceableFunction<Func extends (...args: any[]) => any> = Func extends {
package/dist/traceable.js CHANGED
@@ -1,5 +1,5 @@
  import { AsyncLocalStorage } from "async_hooks";
- import { RunTree, isRunTree } from "./run_trees.js";
+ import { RunTree, isRunTree, isRunnableConfigLike, } from "./run_trees.js";
  const asyncLocalStorage = new AsyncLocalStorage();
  const isAsyncIterable = (x) => x != null &&
    typeof x === "object" &&
@@ -33,6 +33,10 @@ export function traceable(wrappedFunc, config) {
    currentRunTree = args[0];
    rawInputs = args.slice(1);
  }
+ else if (isRunnableConfigLike(args[0])) {
+   currentRunTree = RunTree.fromRunnableConfig(args[0], ensuredConfig);
+   rawInputs = args.slice(1);
+ }
  else if (previousRunTree !== undefined) {
    currentRunTree = await previousRunTree.createChild(ensuredConfig);
    rawInputs = args;
@@ -0,0 +1,12 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.wrapOpenAI = void 0;
+ const traceable_js_1 = require("./traceable.cjs");
+ const wrapOpenAI = (openai, options) => {
+   // @ts-expect-error Promise<APIPromise<...>> != APIPromise<...>
+   openai.chat.completions.create = (0, traceable_js_1.traceable)(openai.chat.completions.create.bind(openai.chat.completions), Object.assign({ name: "ChatOpenAI", run_type: "llm" }, options?.client));
+   // @ts-expect-error Promise<APIPromise<...>> != APIPromise<...>
+   openai.completions.create = (0, traceable_js_1.traceable)(openai.completions.create.bind(openai.completions), Object.assign({ name: "OpenAI", run_type: "llm" }, options?.client));
+   return openai;
+ };
+ exports.wrapOpenAI = wrapOpenAI;
@@ -0,0 +1,5 @@
+ import type { OpenAI } from "openai";
+ import type { Client } from "./index.js";
+ export declare const wrapOpenAI: (openai: OpenAI, options?: {
+   client?: Client;
+ }) => OpenAI;
@@ -0,0 +1,8 @@
+ import { traceable } from "./traceable.js";
+ export const wrapOpenAI = (openai, options) => {
+   // @ts-expect-error Promise<APIPromise<...>> != APIPromise<...>
+   openai.chat.completions.create = traceable(openai.chat.completions.create.bind(openai.chat.completions), Object.assign({ name: "ChatOpenAI", run_type: "llm" }, options?.client));
+   // @ts-expect-error Promise<APIPromise<...>> != APIPromise<...>
+   openai.completions.create = traceable(openai.completions.create.bind(openai.completions), Object.assign({ name: "OpenAI", run_type: "llm" }, options?.client));
+   return openai;
+ };
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "langsmith",
-   "version": "0.1.2",
+   "version": "0.1.5",
    "description": "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform.",
    "packageManager": "yarn@1.22.19",
    "files": [
@@ -20,6 +20,9 @@
      "schemas.cjs",
      "schemas.js",
      "schemas.d.ts",
+     "wrappers.cjs",
+     "wrappers.js",
+     "wrappers.d.ts",
      "index.cjs",
      "index.js",
      "index.d.ts"
@@ -66,7 +69,8 @@
    "devDependencies": {
      "@babel/preset-env": "^7.22.4",
      "@jest/globals": "^29.5.0",
-     "@langchain/core": "^0.1.28",
+     "@langchain/core": "^0.1.32",
+     "@langchain/langgraph": "^0.0.8",
      "@tsconfig/recommended": "^1.0.2",
      "@types/jest": "^29.5.1",
      "@typescript-eslint/eslint-plugin": "^5.59.8",
@@ -80,6 +84,7 @@
      "eslint-plugin-no-instanceof": "^1.0.1",
      "eslint-plugin-prettier": "^4.2.1",
      "jest": "^29.5.0",
+     "openai": "^4.28.0",
      "prettier": "^2.8.8",
      "ts-jest": "^29.1.0",
      "ts-node": "^10.9.1",
@@ -129,6 +134,11 @@
      "import": "./schemas.js",
      "require": "./schemas.cjs"
    },
+   "./wrappers": {
+     "types": "./wrappers.d.ts",
+     "import": "./wrappers.js",
+     "require": "./wrappers.cjs"
+   },
    "./package.json": "./package.json"
  }
  }
package/wrappers.cjs ADDED
@@ -0,0 +1 @@
+ module.exports = require('./dist/wrappers.cjs');
package/wrappers.d.ts ADDED
@@ -0,0 +1 @@
+ export * from './dist/wrappers.js'
package/wrappers.js ADDED
@@ -0,0 +1 @@
+ export * from './dist/wrappers.js'