@rheonic/sdk 0.1.0-beta.1 → 0.1.0-beta.11

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,12 +1,24 @@
  # Changelog

- All notable changes to `rheonic-node` will be documented in this file.
+ All notable changes to `@rheonic/sdk` will be documented in this file.

  ## Unreleased

- - Publish-ready changelog entries will be added here for the next release.
+ ### Changed
+ - Google provider instrumentation and examples now align with the current official Google Gen AI JavaScript SDK shape: `@google/genai` with `ai.models.generateContent({...})`.
+
+ ### Fixed
+ - SDK debug logs now always emit non-empty `trace_id` and `span_id`, including warmup, token-estimation, and protect preflight paths.
+ - Provider instrumentation now keeps the full protected call lifecycle under one trace so SDK debug logs correlate cleanly with backend requests.
+ - Quickstart and README examples now use the published `@rheonic/sdk` package name consistently and show the current Anthropic and Google instrumentation patterns.
+
+ ## 0.1.0-beta.7
+
+ ### Changed
+ - `RHEONICBlockedError` now exposes structured block feedback for apps and agents: `reason`, `retry_after_seconds`, `blocked_until`, `trace_id`, `request_id`, and `snapshot`.
+ - Fail-closed protect fallback now reports `reason="fail_closed"` instead of the generic `decision_unavailable` on blocked requests.

- ## 0.1.0
+ ## 0.1.0-beta.6

  ### Added
  - Initial public beta release of the Rheonic Node SDK.
package/README.md CHANGED
@@ -1,150 +1,179 @@
  # Rheonic Node SDK

- The Rheonic Node SDK runs inside your app process, captures provider telemetry, and can request protect preflight decisions before provider calls.
+ Rheonic captures provider telemetry and applies protect preflight decisions before provider calls.

  ## Install

- Beta prerelease install:
-
  ```bash
- npm install rheonic-node@next
+ npm install @rheonic/sdk
  ```

- Compatibility:
- - Node.js 18+
- - One of the supported provider SDKs: `openai`, `@anthropic-ai/sdk`, or `@google/generative-ai`
-
- Beta note:
- - Public beta releases may add guardrail fields and provider wrappers before `1.0.0`.
+ ## Required environment variables

- ## Configuration
+ ```bash
+ RHEONIC_INGEST_KEY=<your_project_ingest_key>
+ RHEONIC_BASE_URL=<value_shown_in_dashboard>
+ ```

- - Required: `ingestKey`
- - Optional for local development: `baseUrl` (defaults to `RHEONIC_BASE_URL`, else `http://localhost:8000`)
- - Optional: `environment` (default `dev`)
+ ## Instrument provider calls

- For hosted beta, staging, or production deployments, set `RHEONIC_BASE_URL` or pass `baseUrl` explicitly. The localhost default is intended only for local development.
+ Wrap your provider SDK once.

- Provider/model validation: SDK wrappers fail fast with `RHEONICValidationError` when provider is missing/unsupported or model is missing/empty. Supported providers are `openai`, `anthropic`, and `google`. Model naming is not pattern-validated so future vendor naming changes remain compatible.
+ Telemetry is captured automatically after each provider call.

- ## Integration Recommendation
+ Enforcement follows Project mode in the dashboard (`Observe` / `Protect`).

- Create one long-lived SDK client at app startup and reuse it for all provider calls. The SDK prewarms tokenizer state and the backend connection on client initialization, so reusing a single client avoids paying protect cold-start cost on every request.
+ OpenAI:

- ## Logging
+ ```ts
+ import OpenAI from "openai";
+ import { createClient, instrumentOpenAI, RHEONICBlockedError } from "@rheonic/sdk";

- The SDK emits structured JSON logs to stdout. You do not need to configure file logging.
+ const rheonic = createClient({
+ baseUrl: process.env.RHEONIC_BASE_URL!,
+ ingestKey: process.env.RHEONIC_INGEST_KEY!,
+ });

- Example log:
+ const openai = instrumentOpenAI(new OpenAI({ apiKey: process.env.OPENAI_API_KEY! }), {
+ client: rheonic,
+ endpoint: "/chat/completions",
+ feature: "assistant",
+ });

- ```json
- {
- "timestamp": "2026-03-18T09:20:15.145102+00:00",
- "level": "info",
- "service": "sdk-node",
- "env": "staging",
- "trace_id": "f4ac8b6b-6f8d-4f4c-b54f-3c2c2f76a27b",
- "span_id": "9f12db3a1d204f8f",
- "event": "sdk_client_initialized",
- "message": "SDK client initialized",
- "metadata": {}
+ try {
+ await openai.chat.completions.create({
+ model: "gpt-4o-mini",
+ messages: [{ role: "user", content: "hello" }],
+ max_tokens: 256,
+ });
+ } catch (error) {
+ if (error instanceof RHEONICBlockedError) {
+ console.log(
+ JSON.stringify(
+ {
+ reason: error.reason,
+ retry_after_seconds: error.retry_after_seconds,
+ blocked_until: error.blocked_until,
+ trace_id: error.trace_id,
+ request_id: error.request_id,
+ },
+ null,
+ 2,
+ ),
+ );
+ }
  }
  ```

- Notes:
- - backend requests automatically include `X-Trace-ID`,
- - SDK logs share that `trace_id` so you can correlate SDK, backend, worker, and webhook activity,
- - sensitive fields such as API keys and tokens are redacted.
-
- ## Integration Path 1: Manual Capture (generic)
+ Anthropic:

  ```ts
- import { buildEvent, createClient } from "@rheonic/sdk";
+ import Anthropic from "@anthropic-ai/sdk";
+ import { createClient, instrumentAnthropic, RHEONICBlockedError } from "@rheonic/sdk";

- const client = createClient({
+ const rheonic = createClient({
  baseUrl: process.env.RHEONIC_BASE_URL!,
  ingestKey: process.env.RHEONIC_INGEST_KEY!,
  });

- await client.captureEvent(
- buildEvent({
- provider: "openai",
- model: "gpt-4o-mini",
- request: { endpoint: "/chat", input_tokens: 12 },
- response: { total_tokens: 32, latency_ms: 140, http_status: 200 },
- }),
- );
- ```
-
- Initialize `client` once during app startup, then reuse that same instance for manual capture and provider instrumentation.
-
- Minimal protect preflight usage:
-
- ```ts
- const decision = await client.protect({
- provider: "openai",
- model: "gpt-4o-mini",
+ const anthropic = instrumentAnthropic(new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY! }), {
+ client: rheonic,
+ endpoint: "/v1/messages",
  feature: "assistant",
- inputTokensEstimate: 32,
- maxOutputTokens: 256,
  });
+
+ try {
+ await anthropic.messages.create({
+ model: "claude-3-5-sonnet-latest",
+ max_tokens: 256,
+ messages: [{ role: "user", content: "hello" }],
+ });
+ } catch (error) {
+ if (error instanceof RHEONICBlockedError) {
+ console.log(JSON.stringify({
+ reason: error.reason,
+ retry_after_seconds: error.retry_after_seconds,
+ blocked_until: error.blocked_until,
+ trace_id: error.trace_id,
+ request_id: error.request_id,
+ }, null, 2));
+ }
+ }
  ```

- ## Integration Path 2: OpenAI instrumentation (convenience wrapper)
+ Google:

  ```ts
- import OpenAI from "openai";
- import { createClient, instrumentOpenAI } from "rheonic-node";
+ import { GoogleGenAI } from "@google/genai";
+ import { createClient, instrumentGoogle, RHEONICBlockedError } from "@rheonic/sdk";

- const rheonicClient = createClient({
+ const rheonic = createClient({
  baseUrl: process.env.RHEONIC_BASE_URL!,
  ingestKey: process.env.RHEONIC_INGEST_KEY!,
  });
- const openai = instrumentOpenAI(new OpenAI({ apiKey: process.env.OPENAI_API_KEY }), {
- client: rheonicClient,
- endpoint: "/chat/completions",
+
+ const ai = instrumentGoogle(new GoogleGenAI({ apiKey: process.env.GOOGLE_API_KEY! }), {
+ client: rheonic,
+ endpoint: "/v1beta/models/generateContent",
  feature: "assistant",
  });
+
+ try {
+ await ai.models.generateContent({
+ model: "gemini-1.5-pro",
+ contents: "hello",
+ config: {
+ maxOutputTokens: 256,
+ },
+ });
+ } catch (error) {
+ if (error instanceof RHEONICBlockedError) {
+ console.log(JSON.stringify({
+ reason: error.reason,
+ retry_after_seconds: error.retry_after_seconds,
+ blocked_until: error.blocked_until,
+ trace_id: error.trace_id,
+ request_id: error.request_id,
+ }, null, 2));
+ }
+ }
  ```

- ## Integration Path 3: Anthropic and Google wrappers
+ `RHEONICBlockedError.reason` is meant to be operator-relevant. The main values are:
+ - `tok_cap_breach`
+ - `req_cap_breach`
+ - `cooldown_active`
+ - `fail_closed`
+
+ If Protect is `fail_open`, timeout or availability problems stay internal and the provider call continues. If Protect is `fail_closed`, the SDK raises `RHEONICBlockedError` with the feedback fields shown above.
+
+ Keep one long-lived SDK client per app process. Initialize it during app startup and reuse it for all capture and instrumentation calls so Rheonic can avoid repeated protect cold-start latency.
+
+ ## Optional: custom event capture
+
+ Use this only if you can't instrument a provider SDK or need custom events.

  ```ts
- import Anthropic from "@anthropic-ai/sdk";
- import { GoogleGenerativeAI } from "@google/generative-ai";
- import { createClient } from "rheonic-node";
+ import { createClient, buildEvent } from "@rheonic/sdk";

  const client = createClient({
  baseUrl: process.env.RHEONIC_BASE_URL!,
  ingestKey: process.env.RHEONIC_INGEST_KEY!,
  });

- const anthropic = client.instrumentAnthropic(new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY }));
- await anthropic.messages.create({
- model: "claude-3-5-sonnet-latest",
- max_tokens: 256,
- messages: [{ role: "user", content: "Hello Claude" }],
- });
-
- const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY!);
- const googleModel = client.instrumentGoogle(genAI.getGenerativeModel({ model: "gemini-1.5-pro" }));
- await googleModel.generateContent("Hello Google model");
+ await client.captureEvent(
+ buildEvent({
+ provider: "openai",
+ model: "gpt-4o-mini",
+ request: { endpoint: "/chat/completions", feature: "assistant", token_explosion_tokens: 64 },
+ response: { total_tokens: 64, latency_ms: 120, http_status: 200 },
+ }),
+ );
  ```

- Runtime call path:
- - SDK instrumentation calls `POST /api/v1/protect/decision` then `POST /api/v1/events`.
- - Project mode in dashboard controls decision behavior:
- - Observe: allow only.
- - Protect: allow/warn/block with cooldown.
-
- ## Provider SDKs
+ `token_explosion_tokens` is optional. Set it only for custom/manual events when you want token-explosion detection to use the same request-context signal that the SDK instrumentation sends to both protect and ingest.

- Install only the provider SDKs you actually use alongside `rheonic-node`:
-
- ```bash
- npm install openai
- npm install @anthropic-ai/sdk
- npm install @google/generative-ai
- ```
+ ## Reference

- Beta prereleases use semver prerelease format such as `0.2.0-beta.1` and are published under the `next` tag.
+ Full quickstart:
+ - `https://beta.rheonic.dev/quickstart`
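The new README documents `retry_after_seconds` and `blocked_until` on blocked calls but leaves retry handling to the caller. A minimal sketch of turning that feedback into a wait time; the `BlockFeedback` shape and `retryDelayMs` helper are hypothetical, not part of the SDK:

```typescript
// Hypothetical helper: derive a wait time (ms) from the block feedback
// fields shown on RHEONICBlockedError in the README examples.
interface BlockFeedback {
  reason: string;
  retry_after_seconds?: number;
  blocked_until?: string; // ISO-8601 timestamp
}

function retryDelayMs(feedback: BlockFeedback, now: Date = new Date()): number {
  // Prefer the explicit relative hint when present.
  if (feedback.retry_after_seconds !== undefined) {
    return Math.max(0, feedback.retry_after_seconds * 1000);
  }
  // Fall back to the absolute cooldown end, if provided.
  if (feedback.blocked_until) {
    return Math.max(0, Date.parse(feedback.blocked_until) - now.getTime());
  }
  // No hint: treat the block as non-retryable for now.
  return 0;
}

console.log(retryDelayMs({ reason: "cooldown_active", retry_after_seconds: 30 })); // → 30000
```

An agent loop could sleep for this delay before re-attempting the provider call, or surface `reason` to the operator when the delay is zero.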
package/dist/client.d.ts CHANGED
@@ -42,6 +42,7 @@ export declare class Client {
  private warmupCompleted;
  constructor(config: ClientConfig);
  captureEvent(event: EventPayload): Promise<void>;
+ captureEventAndFlush(event: EventPayload): Promise<void>;
  getStats(): ClientStats;
  flush(): Promise<void>;
  evaluateProtectDecision(context: ProtectContext): Promise<ProtectEvaluation>;
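The new `captureEventAndFlush` combines enqueue and immediate delivery, which matters for short-lived processes that may exit before the periodic flusher runs. A toy sketch of that capture-then-flush semantics; `ToyQueue` is illustrative only, not the SDK's internal queue:

```typescript
// Illustrative buffered queue showing capture-then-flush semantics.
// Names and behavior are assumptions, not the package's internals.
class ToyQueue<T> {
  private queue: T[] = [];
  private sent: T[] = [];

  capture(event: T): void {
    this.queue.push(event); // enqueue for a periodic flusher
  }

  flush(): void {
    this.sent.push(...this.queue.splice(0)); // drain everything queued
  }

  // capture + immediate flush, e.g. just before process exit
  captureAndFlush(event: T): void {
    this.capture(event);
    this.flush();
  }

  get pending(): number {
    return this.queue.length;
  }
  get delivered(): number {
    return this.sent.length;
  }
}

const q = new ToyQueue<string>();
q.capture("a");
q.captureAndFlush("b");
console.log(q.pending, q.delivered); // → 0 2
```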
package/dist/client.js CHANGED
@@ -30,7 +30,13 @@ export class Client {
  constructor(config) {
  this.baseUrl = config.baseUrl ?? process.env.RHEONIC_BASE_URL ?? sdkNodeConfig.defaultBaseUrl;
  this.ingestKey = config.ingestKey;
- this.environment = config.environment ?? sdkNodeConfig.defaultEnvironment;
+ this.environment =
+ config.environment ??
+ process.env.NODE_ENV ??
+ process.env.APP_ENV ??
+ process.env.ENVIRONMENT ??
+ process.env.ENV ??
+ sdkNodeConfig.defaultEnvironment;
  this.flushIntervalMs = config.flushIntervalMs ?? sdkNodeConfig.defaultFlushIntervalMs;
  this.maxQueueSize = config.maxQueueSize ?? sdkNodeConfig.defaultMaxQueueSize;
  this.overflowPolicy = config.overflowPolicy ?? "drop_oldest";
@@ -92,6 +98,10 @@ export class Client {
  return;
  }
  }
+ async captureEventAndFlush(event) {
+ await this.captureEvent(event);
+ await this.flushWithTimeout();
+ }
  getStats() {
  return {
  queued: this.queue.length,
@@ -120,7 +130,11 @@ export class Client {
  }
  async evaluateProtectDecision(context) {
  await this.ensureWarmup();
- return this.protectEngine.evaluate(context);
+ return this.protectEngine.evaluate({
+ ...context,
+ trace_id: context.trace_id ?? getTraceId() ?? undefined,
+ span_id: context.span_id ?? getSpanId() ?? undefined,
+ });
  }
  async warmConnections() {
  if (this.warmupCompleted) {
@@ -146,21 +160,29 @@ export class Client {
  }
  async runWarmup() {
  try {
- const response = await bindTraceContext(generateTraceId(), generateSpanId(), async () => await requestJson(`${this.baseUrl}/health`, {
- method: "GET",
- headers: {
- "X-Trace-ID": getTraceId(),
- "X-Span-ID": getSpanId(),
- },
- }));
- this.debugLog("SDK connection warmup completed", { status_code: response.status });
+ const traceId = generateTraceId();
+ const spanId = generateSpanId();
+ await bindTraceContext(traceId, spanId, async () => {
+ const response = await requestJson(`${this.baseUrl}/health`, {
+ method: "GET",
+ headers: {
+ "X-Trace-ID": getTraceId(),
+ "X-Span-ID": getSpanId(),
+ },
+ });
+ this.debugLog("SDK connection warmup completed", { status_code: response.status });
+ });
  }
  catch {
  this.debugLog("SDK connection warmup failed");
  }
  try {
- await this.protectEngine.bootstrap();
- this.debugLog("SDK protect config bootstrap completed");
+ const traceId = generateTraceId();
+ const spanId = generateSpanId();
+ await bindTraceContext(traceId, spanId, async () => {
+ await this.protectEngine.bootstrap();
+ this.debugLog("SDK protect config bootstrap completed");
+ });
  }
  catch {
  this.debugLog("SDK protect config bootstrap failed");
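The constructor change above replaces a fixed `"dev"` default with a fallback chain over common environment variables. A self-contained sketch of that resolution order (the `resolveEnvironment` function is illustrative; the real code reads `process.env` directly):

```typescript
// Mirrors the fallback order from the diff: explicit config value,
// then NODE_ENV / APP_ENV / ENVIRONMENT / ENV, then "unknown".
type Env = Record<string, string | undefined>;

function resolveEnvironment(configEnvironment: string | undefined, env: Env): string {
  return (
    configEnvironment ??
    env.NODE_ENV ??
    env.APP_ENV ??
    env.ENVIRONMENT ??
    env.ENV ??
    "unknown"
  );
}

console.log(resolveEnvironment(undefined, { APP_ENV: "staging" })); // → staging
console.log(resolveEnvironment(undefined, {})); // → unknown
```

Note that `??` only skips `undefined`/`null`, so an explicitly empty string set in config or an env var still wins over later fallbacks.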
package/dist/config.d.ts CHANGED
@@ -1,12 +1,12 @@
  export declare const sdkNodeConfig: {
  readonly defaultBaseUrl: "http://localhost:8000";
- readonly defaultEnvironment: "dev";
+ readonly defaultEnvironment: "unknown";
  readonly defaultFlushIntervalMs: 1000;
  readonly defaultMaxQueueSize: 1000;
  readonly defaultFlushTimeoutMs: 500;
  readonly defaultRequestTimeoutMs: 1000;
  readonly defaultProtectFailMode: "open";
- readonly internalProtectDecisionTimeoutMs: 150;
+ readonly internalProtectDecisionTimeoutMs: 300;
  readonly retryDelayMinMs: 200;
  readonly retryDelayMaxMs: 400;
  readonly defaultTokenizerEncoding: "cl100k_base";
package/dist/config.js CHANGED
@@ -1,12 +1,12 @@
  export const sdkNodeConfig = {
  defaultBaseUrl: "http://localhost:8000",
- defaultEnvironment: "dev",
+ defaultEnvironment: "unknown",
  defaultFlushIntervalMs: 1000,
  defaultMaxQueueSize: 1000,
  defaultFlushTimeoutMs: 500,
  defaultRequestTimeoutMs: 1000,
  defaultProtectFailMode: "open",
- internalProtectDecisionTimeoutMs: 150,
+ internalProtectDecisionTimeoutMs: 300,
  retryDelayMinMs: 200,
  retryDelayMaxMs: 400,
  defaultTokenizerEncoding: "cl100k_base",
@@ -1,8 +1,10 @@
  export interface EventRequest {
  endpoint?: string;
  feature?: string;
+ request_fingerprint?: string;
  input_tokens?: number;
  input_tokens_estimate?: number;
+ token_explosion_tokens?: number;
  max_output_tokens?: number;
  protect_decision?: string;
  protect_reason?: string;
@@ -25,19 +25,18 @@ class BufferedJsonResponse {
  }
  }
  export async function requestJson(url, options) {
- const mockedFetch = getMockedFetchOverride();
- if (mockedFetch) {
- const response = await mockedFetch(url, {
+ if (typeof globalThis.fetch === "function") {
+ const response = await globalThis.fetch(url, {
  method: options.method,
  headers: options.headers,
  body: options.body,
  signal: options.signal,
  });
- return {
- ok: response.ok,
- status: response.status,
- json: async () => await response.json(),
- };
+ const textReader = response.text;
+ const payload = typeof textReader === "function"
+ ? await textReader.call(response)
+ : await serializeFetchJson(response);
+ return new BufferedJsonResponse(response.status, payload);
  }
  const target = new URL(url);
  const useHttps = target.protocol === "https:";
@@ -88,13 +87,10 @@ function createAbortError() {
  error.name = "AbortError";
  return error;
  }
- function getMockedFetchOverride() {
- if (typeof globalThis.fetch !== "function") {
- return null;
+ async function serializeFetchJson(response) {
+ if (typeof response.json !== "function") {
+ return "";
  }
- const source = Function.prototype.toString.call(globalThis.fetch);
- if (source.includes("[native code]")) {
- return null;
- }
- return globalThis.fetch.bind(globalThis);
+ const payload = await response.json();
+ return payload == null ? "" : JSON.stringify(payload);
  }
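The hunk above switches `requestJson` from returning a thin wrapper around the live `fetch` response to buffering the body text once and wrapping it in `BufferedJsonResponse`. A fetch body stream can only be consumed once, so buffering lets `json()` be called safely any number of times. An illustrative reconstruction of that idea (not the package's actual class):

```typescript
// Sketch: buffer the body text once, parse on demand.
// A real fetch Response body can only be read a single time;
// this class makes json() repeatable and side-effect free.
class BufferedResponse {
  constructor(
    readonly status: number,
    private readonly payload: string,
  ) {}

  get ok(): boolean {
    return this.status >= 200 && this.status < 300;
  }

  json(): unknown {
    // An empty buffered payload maps to null rather than a parse error.
    return this.payload === "" ? null : JSON.parse(this.payload);
  }
}

const res = new BufferedResponse(200, '{"ok":true}');
console.log(res.ok, res.json()); // json() is safe to call repeatedly
```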
package/dist/logger.js CHANGED
@@ -19,13 +19,20 @@ export function getSpanId() {
  return contextStorage.getStore()?.spanId ?? "";
  }
  export function emitLog(params) {
+ const traceId = params.traceId ?? getTraceId();
+ const spanId = params.spanId ?? getSpanId();
  const payload = {
  timestamp: new Date().toISOString(),
  level: params.level,
  service: SERVICE_NAME,
- env: (params.environment ?? process.env.RHEONIC_ENV ?? process.env.NODE_ENV ?? "dev").toLowerCase(),
- trace_id: params.traceId ?? getTraceId(),
- span_id: params.spanId ?? getSpanId(),
+ env: (params.environment ??
+ process.env.NODE_ENV ??
+ process.env.APP_ENV ??
+ process.env.ENVIRONMENT ??
+ process.env.ENV ??
+ "unknown").toLowerCase(),
+ trace_id: traceId || generateTraceId(),
+ span_id: spanId || generateSpanId(),
  event: sanitizeEvent(params.event),
  message: params.message,
  metadata: sanitizeMetadata(params.metadata ?? {}),
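The logger change above backs the changelog's "always emit non-empty `trace_id` and `span_id`" fix: when no trace context is bound, `getTraceId()`/`getSpanId()` return `""`, and `traceId || generateTraceId()` generates a fresh id instead of logging an empty string. A self-contained sketch of that guarantee (`ensureId`/`generateId` are stand-ins for illustration):

```typescript
// Stand-in for the trace/span id fallback in the logger diff:
// keep the bound id if one exists, otherwise generate a fresh one
// so no log line ever carries an empty identifier.
import { randomUUID } from "node:crypto";

function generateId(): string {
  return randomUUID();
}

function ensureId(bound: string): string {
  // "" means no trace context was bound on this call path
  return bound || generateId();
}

console.log(ensureId("f4ac8b6b-6f8d-4f4c-b54f-3c2c2f76a27b")); // keeps the bound id
console.log(ensureId("").length > 0); // → true
```

The `||` (rather than `??`) is what makes the empty-string case fall through to generation.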
@@ -1,4 +1,4 @@
- export type ProtectDecision = "allow" | "warn" | "block";
+ export type ProtectDecision = "allow" | "clamp" | "block";
  export type ProtectFailMode = "open" | "closed";
  export interface ProtectContext {
  provider: string;
@@ -6,10 +6,16 @@ export interface ProtectContext {
  feature?: string;
  max_output_tokens?: number;
  input_tokens_estimate?: number;
+ trace_id?: string;
+ span_id?: string;
  }
  export interface ProtectEvaluation {
  decision: ProtectDecision;
  reason: string;
+ trace_id: string;
+ request_id: string;
+ blocked_until?: string;
+ retry_after_seconds?: number;
  snapshot?: Record<string, unknown>;
  applyClampEnabled?: boolean;
  clamp?: {
@@ -19,7 +25,19 @@ export interface ProtectEvaluation {
  }
  export declare class RHEONICBlockedError extends Error {
  readonly reason: string;
- constructor(reason: string);
+ readonly trace_id: string;
+ readonly request_id: string;
+ readonly blocked_until?: string;
+ readonly retry_after_seconds?: number;
+ readonly snapshot?: Record<string, unknown>;
+ constructor(params: {
+ reason: string;
+ trace_id: string;
+ request_id: string;
+ blocked_until?: string;
+ retry_after_seconds?: number;
+ snapshot?: Record<string, unknown>;
+ });
  }
  export declare class ProtectEngine {
  private readonly baseUrl;
@@ -41,8 +59,9 @@ export declare class ProtectEngine {
  debugLog?: (message: string, meta?: Record<string, unknown>) => void;
  });
  evaluate(context: ProtectContext): Promise<ProtectEvaluation>;
+ private fallbackEvaluation;
  bootstrap(): Promise<void>;
  private reportDecisionTimeout;
  private reportDecisionUnavailable;
  }
- export declare const defaultProtectTimeoutMs: 150;
+ export declare const defaultProtectTimeoutMs: 300;
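The declaration diff above moves `RHEONICBlockedError` from a single-string constructor to a structured params object. A minimal sketch of an error class in that shape; `BlockedError` here is illustrative, not the SDK source:

```typescript
// Illustrative error carrying structured block feedback, following
// the constructor shape declared in the diff above.
class BlockedError extends Error {
  readonly reason: string;
  readonly trace_id: string;
  readonly request_id: string;
  readonly retry_after_seconds?: number;

  constructor(params: {
    reason: string;
    trace_id: string;
    request_id: string;
    retry_after_seconds?: number;
  }) {
    super(`blocked: ${params.reason}`);
    this.name = "BlockedError";
    this.reason = params.reason;
    this.trace_id = params.trace_id;
    this.request_id = params.request_id;
    this.retry_after_seconds = params.retry_after_seconds;
  }
}

const err = new BlockedError({
  reason: "cooldown_active",
  trace_id: "t-1",
  request_id: "r-1",
  retry_after_seconds: 30,
});
console.log(err instanceof Error, err.reason); // → true cooldown_active
```

Keeping the fields as readonly properties (rather than packing them into the message string) is what lets callers branch on `reason` and `retry_after_seconds` programmatically, as the README examples do.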