backtest-kit 3.6.0 → 3.7.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +221 -61
- package/build/index.cjs +989 -184
- package/build/index.mjs +980 -185
- package/package.json +1 -1
- package/types.d.ts +508 -7
package/README.md
CHANGED

@@ -204,72 +204,181 @@ Customize via `setConfig()`:

Backtest Kit is **not a data-processing library** - it is a **time execution engine**. Think of the engine as an **async stream of time**, where your strategy is evaluated step by step.

### 💰 How PNL Works

Three public functions work together to manage a position dynamically. To keep PNL weighting independent of position size, the framework treats every DCA entry as a fixed **$100 unit** regardless of price — this flattens the effective entry curve.

**Public API:**

- **`commitAverageBuy`** — adds a new DCA entry. For LONG, **only accepted when the current price is below the position's effective entry price**. Silently rejected otherwise. This prevents averaging up.
- **`commitPartialProfit`** — closes X% of the position at a profit. Locks in gains while keeping exposure.
- **`commitPartialLoss`** — closes X% of the position at a loss. Cuts exposure before the stop-loss is hit.

<details>
<summary>
The Math
</summary>

**Scenario:** LONG entry @ 1000, 4 DCA attempts (1 rejected), 3 partials, closed at TP.

`totalInvested = $400` (4 × $100, the rejected attempt is not counted).

**Entries**

```
entry#1 @ 1000 → 0.10000 coins
commitPartialProfit(30%) @ 1150 ← cnt=1
entry#2 @ 950 → 0.10526 coins
entry#3 @ 880 → 0.11364 coins
commitPartialLoss(20%) @ 860 ← cnt=3
entry#4 @ 920 → 0.10870 coins
commitPartialProfit(40%) @ 1050 ← cnt=4
entry#5 @ 980 ✗ REJECTED (980 > ep3≈929.90)
totalInvested = $400
```

**Partial#1 — commitPartialProfit @ 1150, 30%, cnt=1**

```
effectivePrice = hm(1000) = 1000
costBasis = $100
partialDollarValue = 30% × 100 = $30 → weight = 30/400 = 0.075
pnl = (1150−1000)/1000 × 100 = +15.00%
costBasis → $70
coins sold: 0.03000 × 1150 = $34.50
remaining: 0.07000
```

**DCA after Partial#1**

```
entry#2 @ 950 (950 < ep1=1000 ✓ accepted)
entry#3 @ 880 (880 < ep1=1000 ✓ accepted)
coins: 0.07000 + 0.10526 + 0.11364 = 0.28890
```

**Partial#2 — commitPartialLoss @ 860, 20%, cnt=3**

```
costBasis = 70 + 100 + 100 = $270
ep2 = 270 / 0.28890 ≈ 934.58
partialDollarValue = 20% × 270 = $54 → weight = 54/400 = 0.135
pnl = (860−934.58)/934.58 × 100 ≈ −7.98%
costBasis → $216
coins sold: 0.05778 × 860 = $49.69
remaining: 0.23112
```

**DCA after Partial#2**

```
entry#4 @ 920 (920 < ep2≈934.58 ✓ accepted)
coins: 0.23112 + 0.10870 = 0.33982
```

**Partial#3 — commitPartialProfit @ 1050, 40%, cnt=4**

```
costBasis = 216 + 100 = $316
ep3 = 316 / 0.33982 ≈ 929.90
partialDollarValue = 40% × 316 = $126.4 → weight = 126.4/400 = 0.316
pnl = (1050−929.90)/929.90 × 100 ≈ +12.92%
costBasis → $189.6
coins sold: 0.13593 × 1050 = $142.72
remaining: 0.20389
```

**DCA after Partial#3 — rejected**

```
entry#5 @ 980 (980 > ep3≈929.90 ✗ REJECTED)
```

**Close at TP @ 1200**

```
ep_final = ep3 ≈ 929.90 (no new entries)
coins: 0.20389

remainingDollarValue = 400 − 30 − 54 − 126.4 = $189.6
weight = 189.6/400 = 0.474
pnl = (1200−929.90)/929.90 × 100 ≈ +29.05%
coins sold: 0.20389 × 1200 = $244.67
```

**Result (toProfitLossDto)**

```
0.075 × (+15.00) = +1.125
0.135 × (−7.98) = −1.077
0.316 × (+12.92) = +4.083
0.474 × (+29.05) = +13.770
─────────────────────────────
≈ +17.90%

Cross-check (coins):
34.50 + 49.69 + 142.72 + 244.67 = $471.58
(471.58 − 400) / 400 × 100 = +17.90% ✓
```

</details>

**`priceOpen`** is the harmonic mean of all accepted DCA entries. After each partial close (`commitPartialProfit` or `commitPartialLoss`), the remaining cost basis is carried forward into the harmonic-mean calculation for subsequent entries — so `priceOpen` shifts after every partial, which in turn changes whether the next `commitAverageBuy` call will be accepted.
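The accounting above can be condensed into a small sketch. The `Position` class and method names here are hypothetical — not the package's real API — and it simplifies acceptance by checking against the live effective price rather than the one recorded at the last partial (same outcomes for this scenario):

```typescript
// Hypothetical sketch of the $100-unit PNL accounting (assumed names).
const UNIT = 100; // every accepted DCA entry is a fixed $100

class Position {
  totalInvested = 0; // all dollars ever committed
  costBasis = 0;     // dollars still in the position
  coins = 0;
  private legs: { dollars: number; pnlPct: number }[] = [];

  // costBasis / coins — the dollar-weighted harmonic mean of entry prices
  get effectivePrice(): number {
    return this.costBasis / this.coins;
  }

  // LONG rule: only accept entries below the current effective price
  averageBuy(price: number): boolean {
    if (this.coins > 0 && price >= this.effectivePrice) return false;
    this.totalInvested += UNIT;
    this.costBasis += UNIT;
    this.coins += UNIT / price;
    return true;
  }

  // close `pct` (0..1) of the position at `price`, booking a weighted PNL leg
  partialClose(pct: number, price: number): void {
    const ep = this.effectivePrice;
    const dollars = pct * this.costBasis;
    this.legs.push({ dollars, pnlPct: ((price - ep) / ep) * 100 });
    this.costBasis -= dollars;
    this.coins *= 1 - pct;
  }

  // close the remainder and return the dollar-weighted total PNL (%)
  close(price: number): number {
    this.partialClose(1, price);
    return this.legs.reduce(
      (sum, l) => sum + (l.dollars / this.totalInvested) * l.pnlPct, 0);
  }
}

// Replay the worked scenario
const p = new Position();
p.averageBuy(1000);        // entry#1
p.partialClose(0.3, 1150); // partial#1
p.averageBuy(950);         // entry#2
p.averageBuy(880);         // entry#3
p.partialClose(0.2, 860);  // partial#2
p.averageBuy(920);         // entry#4
p.partialClose(0.4, 1050); // partial#3
const accepted = p.averageBuy(980); // entry#5 → rejected
const totalPnl = p.close(1200);     // ≈ +17.9%, matching the cross-check
```

Because each leg's weighted pnl equals its proceeds minus its removed cost basis, summing the legs reproduces the coin-level dollar cross-check exactly.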

### 🔍 How getCandles Works

backtest-kit uses Node.js `AsyncLocalStorage` to automatically provide temporal context to your strategies.

<details>
<summary>
The Math
</summary>

For a candle with:
- `timestamp` = candle open time (openTime)
- `stepMs` = interval duration (e.g., 60000ms for "1m")
- Candle close time = `timestamp + stepMs`

**Alignment:** All timestamps are aligned down to the interval boundary.
For example, for a 15m interval: 00:17 → 00:15, 00:44 → 00:30

**Adapter contract:**
- First `candle.timestamp` must equal the aligned `since`
- Adapter must return exactly `limit` candles
- Sequential timestamps: `since + i * stepMs` for i = 0..limit-1

**How `since` is calculated from `when`:**
- `when` = current execution context time (from AsyncLocalStorage)
- `alignedWhen` = `Math.floor(when / stepMs) * stepMs` (aligned down to the interval boundary)
- `since` = `alignedWhen - limit * stepMs` (go back `limit` candles from the aligned `when`)

**Boundary semantics (inclusive/exclusive):**
- `since` is always **inclusive** — the first candle has `timestamp === since`
- Exactly `limit` candles are returned
- The last candle has `timestamp === since + (limit - 1) * stepMs` — **inclusive**
- For `getCandles`: `alignedWhen` is **exclusive** — the candle at that timestamp is NOT included (it's a pending/incomplete candle)
- For `getRawCandles`: `eDate` is **exclusive** — the candle at that timestamp is NOT included (it's a pending/incomplete candle)
- For `getNextCandles`: `alignedWhen` is **inclusive** — the first candle starts at `alignedWhen` (it's the current candle for a backtest, already closed in historical data)

- `getCandles(symbol, interval, limit)` - Returns exactly `limit` candles
  - Aligns `when` down to the interval boundary
  - Calculates `since = alignedWhen - limit * stepMs`
  - **since — inclusive**, first `candle.timestamp === since`
  - **alignedWhen — exclusive**, the candle at `alignedWhen` is NOT returned
  - Range: `[since, alignedWhen)` — half-open interval
  - Example: `getCandles("BTCUSDT", "1m", 100)` returns 100 candles ending before the aligned `when`

- `getNextCandles(symbol, interval, limit)` - Returns exactly `limit` candles (backtest only)
  - Aligns `when` down to the interval boundary
  - `since = alignedWhen` (starts from the aligned `when`, going forward)
  - **since — inclusive**, first `candle.timestamp === since`
  - Range: `[alignedWhen, alignedWhen + limit * stepMs)` — half-open interval
  - Throws an error in live mode to prevent look-ahead bias
  - Example: `getNextCandles("BTCUSDT", "1m", 10)` returns the next 10 candles starting from the aligned `when`

- `getRawCandles(symbol, interval, limit?, sDate?, eDate?)` - Flexible parameter combinations:
  - `(limit)` - `since = alignedWhen - limit * stepMs`, range `[since, alignedWhen)`
  - `(limit, sDate)` - `since = align(sDate)`, returns `limit` candles forward, range `[since, since + limit * stepMs)`
  - `(limit, undefined, eDate)` - `since = align(eDate) - limit * stepMs`, **eDate — exclusive**, range `[since, eDate)`
  - `(undefined, sDate, eDate)` - `since = align(sDate)`, `limit` calculated from the range, **sDate — inclusive, eDate — exclusive**, range `[sDate, eDate)`
  - `(limit, sDate, eDate)` - `since = align(sDate)`, returns `limit` candles, **sDate — inclusive**
  - All combinations respect look-ahead bias protection (`eDate`/`endTime` <= `when`)
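A minimal sketch of how these combinations might resolve to a half-open range. `resolveRange` is a hypothetical helper (not an export of this package); timestamps are epoch milliseconds:

```typescript
// Hypothetical range resolution for the getRawCandles parameter combinations.
function resolveRange(
  when: number, stepMs: number,
  limit?: number, sDate?: number, eDate?: number,
): { since: number; limit: number } {
  const align = (t: number) => Math.floor(t / stepMs) * stepMs;
  if (limit !== undefined && sDate === undefined && eDate === undefined) {
    // (limit): go back `limit` candles from aligned `when`
    return { since: align(when) - limit * stepMs, limit };
  }
  if (limit !== undefined && sDate !== undefined) {
    // (limit, sDate) and (limit, sDate, eDate): `limit` candles forward from sDate
    return { since: align(sDate), limit };
  }
  if (limit !== undefined && eDate !== undefined) {
    // (limit, undefined, eDate): eDate is exclusive
    return { since: align(eDate) - limit * stepMs, limit };
  }
  if (sDate !== undefined && eDate !== undefined) {
    // (undefined, sDate, eDate): limit derived from the [sDate, eDate) range
    const since = align(sDate);
    return { since, limit: Math.floor((align(eDate) - since) / stepMs) };
  }
  throw new Error("unsupported parameter combination");
}
```

Every branch yields the same `[since, since + limit * stepMs)` half-open shape, so callers downstream never need to special-case the combination used.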

**Persistent Cache:**
- Cache lookup calculates expected timestamps: `since + i * stepMs` for i = 0..limit-1
- Returns all candles if found, null if any are missing (cache miss)
- Cache and runtime use identical timestamp calculation logic
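That lookup rule can be sketched as follows (a `Map` keyed by timestamp is an assumption for illustration, not the package's actual storage):

```typescript
// Hypothetical cache lookup: all `limit` expected timestamps must be present.
type Candle = { timestamp: number /* …ohlcv fields… */ };

function lookupCache(
  cache: Map<number, Candle>,
  since: number,
  stepMs: number,
  limit: number,
): Candle[] | null {
  const out: Candle[] = [];
  for (let i = 0; i < limit; i++) {
    const candle = cache.get(since + i * stepMs); // expected timestamp
    if (!candle) return null; // any gap → cache miss, fall through to adapter
    out.push(candle);
  }
  return out;
}
```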

</details>
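The alignment and windowing rules above can be sketched in a few lines (`candleWindow` is an assumed helper name, not an exported function):

```typescript
// Hypothetical sketch of the getCandles windowing math.
function candleWindow(when: number, stepMs: number, limit: number) {
  const alignedWhen = Math.floor(when / stepMs) * stepMs; // align down to boundary
  const since = alignedWhen - limit * stepMs;             // go back `limit` candles
  // Expected timestamps: since + i * stepMs for i = 0..limit-1,
  // i.e. the half-open range [since, alignedWhen).
  const timestamps = Array.from({ length: limit }, (_, i) => since + i * stepMs);
  return { since, alignedWhen, timestamps };
}

// 15m example from the text: 00:17 aligns down to 00:15
const stepMs = 15 * 60_000;
const w = candleWindow(Date.UTC(2024, 0, 1, 0, 17), stepMs, 4);
// w.alignedWhen is 00:15; the last returned candle opens at 00:00,
// so the pending candle at alignedWhen is excluded.
```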
#### Candle Timestamp Convention:

@@ -328,6 +437,57 @@ Unlike candles, most exchanges (e.g. Binance `GET /api/v3/depth`) only expose th
- `depth` defaults to `CC_ORDER_BOOK_MAX_DEPTH_LEVELS`
- Adapter receives `(symbol, depth, from, to, backtest)` — may ignore `from`/`to` in live mode

### 🔍 How getAggregatedTrades Works

Aggregated trades fetching uses the same look-ahead bias protection as candles - `to` is always aligned down to the nearest minute boundary so future trades are never visible to the strategy.

**Key principles:**
- `to` is always aligned down to the 1-minute boundary — prevents look-ahead bias
- Without `limit`: returns one full window (`CC_AGGREGATED_TRADES_MAX_MINUTES`)
- With `limit`: paginates backwards until enough trades are collected, then slices to the most recent `limit`
- Adapter receives `(symbol, from, to, backtest)` — may ignore `from`/`to` in live mode

<details>
<summary>
The Math
</summary>

**Time range calculation:**
- `when` = current execution context time (from AsyncLocalStorage)
- `alignedTo` = `Math.floor(when / 60000) * 60000` (aligned down to the 1-minute boundary)
- `windowMs` = `CC_AGGREGATED_TRADES_MAX_MINUTES * 60000 − 60000`
- `to` = `alignedTo`, `from` = `alignedTo − windowMs`

**Without `limit`:** fetches a single window and returns it as-is.

**With `limit`:** paginates backwards in `CC_AGGREGATED_TRADES_MAX_MINUTES` chunks until at least `limit` trades are collected, then slices to the most recent `limit` trades.

**Example with CC_AGGREGATED_TRADES_MAX_MINUTES = 60, limit = 200:**
```
when      = 1704067955000   // 2024-01-01 00:12:35 UTC
alignedTo = 1704067920000   // aligned down to 00:12:00
windowMs  = 59 * 60000      // 3540000ms = 59 minutes

Window 1: from = 00:12:00 − 59m = 23:13:00
          to   = 00:12:00
          → got 120 trades — not enough

Window 2: from = 23:13:00 − 59m = 22:14:00
          to   = 23:13:00
          → got 100 more → total 220 trades

result = last 200 of 220 (most recent)
```

**Adapter contract:**
- `getAggregatedTrades(symbol, from, to, backtest)` is called on the exchange schema
- `from`/`to` are `Date` objects
- The schema implementation may use the time range (backtest) or ignore it (live trading)

</details>
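The windowing and backward pagination can be sketched like this. The `FetchFn` adapter shape, `fetchAggregatedTrades` name, and the `to > 0` stop guard are assumptions for illustration, not the package's internals:

```typescript
// Hypothetical sketch of aggregated-trades windowing with backward pagination.
type Trade = { timestamp: number; price: number; qty: number };
type FetchFn = (from: number, to: number) => Trade[];

const MINUTE = 60_000;

function fetchAggregatedTrades(
  fetchWindow: FetchFn,
  when: number,
  maxMinutes: number, // CC_AGGREGATED_TRADES_MAX_MINUTES
  limit?: number,
): Trade[] {
  const alignedTo = Math.floor(when / MINUTE) * MINUTE; // no look-ahead
  const windowMs = maxMinutes * MINUTE - MINUTE;
  let to = alignedTo;
  let trades: Trade[] = [];
  // Without a limit: a single window. With a limit: page backwards until enough.
  do {
    const from = to - windowMs;
    trades = fetchWindow(from, to).concat(trades); // older windows go in front
    to = from;
  } while (limit !== undefined && trades.length < limit && to > 0);
  return limit === undefined ? trades : trades.slice(-limit); // most recent `limit`
}
```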

**Compatible with:** [garch](https://www.npmjs.com/package/garch) for volatility modelling and [volume-anomaly](https://www.npmjs.com/package/volume-anomaly) for detecting abnormal trade volume — both accept the same `from`/`to` time range format that `getAggregatedTrades` produces.

### 🔬 Technical Details: Timestamp Alignment

**Why align timestamps to interval boundaries?**