defistream 1.2.0__tar.gz → 1.2.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: defistream
- Version: 1.2.0
+ Version: 1.2.1
  Summary: Python client for the DeFiStream API
  Project-URL: Homepage, https://defistream.dev
  Project-URL: Documentation, https://docs.defistream.dev
@@ -78,6 +78,7 @@ print(df.head())
  ## Features

  - **Builder pattern**: Fluent query API with chainable methods
+ - **Aggregate queries**: Bucket events into time or block intervals with summary statistics
  - **Type-safe**: Full type hints and Pydantic models
  - **Multiple formats**: DataFrame (pandas/polars), CSV, Parquet, JSON
  - **Async support**: Native async/await with `AsyncDeFiStream`
@@ -245,6 +246,57 @@ df = (
  )
  ```

+ ### Aggregate Queries
+
+ Use `.aggregate()` to bucket raw events into time or block intervals with summary statistics. All existing filters work before `.aggregate()` is called.
+
+ ```python
+ # Aggregate USDT transfers into 2-hour buckets
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="time", period="2h")
+     .as_df()
+ )
+
+ # Aggregate by block intervals
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="block", period="100b")
+     .as_df()
+ )
+
+ # Combine with filters — large transfers from exchanges, bucketed hourly
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .sender_category("exchange")
+     .min_amount(10000)
+     .aggregate(group_by="time", period="1h")
+     .as_df()
+ )
+
+ # Aggregate Uniswap swaps
+ df = (
+     client.uniswap.swaps("WETH", "USDC", 500)
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="time", period="1h")
+     .as_df()
+ )
+ ```
+
+ You can also discover what aggregate fields are available for a protocol:
+
+ ```python
+ schema = client.aggregate_schema("erc20")
+ print(schema)
+ ```
+
  ### Verbose Mode

  By default, responses omit metadata fields to reduce payload size. Use `.verbose()` to include all fields:
@@ -507,6 +559,13 @@ Filter events by entity names or categories using the labels database. Available

  **Mutual exclusivity:** Within each slot (involving/sender/receiver), only one of address/label/category can be set. `involving*` filters cannot be combined with `sender*`/`receiver*` filters.

+ ### Aggregate Methods
+
+ | Method | Description |
+ |--------|-------------|
+ | `.aggregate(group_by, period)` | Transition to aggregate query. `group_by`: `"time"` or `"block"`. `period`: bucket size (e.g. `"1h"`, `"100b"`). Returns an `AggregateQueryBuilder` that supports all the same terminal and filter methods. |
+ | `client.aggregate_schema(protocol)` | Get available aggregate fields for a protocol (e.g. `"erc20"`, `"aave"`). |
+
  ### Terminal Methods

  | Method | Description |
@@ -44,6 +44,7 @@ print(df.head())
  ## Features

  - **Builder pattern**: Fluent query API with chainable methods
+ - **Aggregate queries**: Bucket events into time or block intervals with summary statistics
  - **Type-safe**: Full type hints and Pydantic models
  - **Multiple formats**: DataFrame (pandas/polars), CSV, Parquet, JSON
  - **Async support**: Native async/await with `AsyncDeFiStream`
@@ -211,6 +212,57 @@ df = (
  )
  ```

+ ### Aggregate Queries
+
+ Use `.aggregate()` to bucket raw events into time or block intervals with summary statistics. All existing filters work before `.aggregate()` is called.
+
+ ```python
+ # Aggregate USDT transfers into 2-hour buckets
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="time", period="2h")
+     .as_df()
+ )
+
+ # Aggregate by block intervals
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="block", period="100b")
+     .as_df()
+ )
+
+ # Combine with filters — large transfers from exchanges, bucketed hourly
+ df = (
+     client.erc20.transfers("USDT")
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .sender_category("exchange")
+     .min_amount(10000)
+     .aggregate(group_by="time", period="1h")
+     .as_df()
+ )
+
+ # Aggregate Uniswap swaps
+ df = (
+     client.uniswap.swaps("WETH", "USDC", 500)
+     .network("ETH")
+     .block_range(21000000, 21100000)
+     .aggregate(group_by="time", period="1h")
+     .as_df()
+ )
+ ```
+
+ You can also discover what aggregate fields are available for a protocol:
+
+ ```python
+ schema = client.aggregate_schema("erc20")
+ print(schema)
+ ```
+
  ### Verbose Mode

  By default, responses omit metadata fields to reduce payload size. Use `.verbose()` to include all fields:
@@ -473,6 +525,13 @@ Filter events by entity names or categories using the labels database. Available

  **Mutual exclusivity:** Within each slot (involving/sender/receiver), only one of address/label/category can be set. `involving*` filters cannot be combined with `sender*`/`receiver*` filters.

+ ### Aggregate Methods
+
+ | Method | Description |
+ |--------|-------------|
+ | `.aggregate(group_by, period)` | Transition to aggregate query. `group_by`: `"time"` or `"block"`. `period`: bucket size (e.g. `"1h"`, `"100b"`). Returns an `AggregateQueryBuilder` that supports all the same terminal and filter methods. |
+ | `client.aggregate_schema(protocol)` | Get available aggregate fields for a protocol (e.g. `"erc20"`, `"aave"`). |
+
  ### Terminal Methods

  | Method | Description |
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"

  [project]
  name = "defistream"
- version = "1.2.0"
+ version = "1.2.1"
  description = "Python client for the DeFiStream API"
  readme = "README.md"
  license = "MIT"
@@ -36,7 +36,7 @@ from .query import (
      QueryBuilder,
  )

- __version__ = "1.2.0"
+ __version__ = "1.2.1"

  __all__ = [
      # Clients
4 files without changes
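The README additions above document the new aggregate API introduced in 1.2.1. The sketch below simply combines the documented pieces end to end: inspecting the aggregate schema, then running an hourly aggregate query. The `DeFiStream(api_key=...)` constructor and the columns of the returned DataFrame are assumptions not shown in this diff.

```python
# Minimal sketch based on the 1.2.1 README additions shown in the diff above.
# Assumptions: the sync client is constructed as DeFiStream(api_key=...), and
# the aggregate result's column names follow the protocol's aggregate schema.
from defistream import DeFiStream

client = DeFiStream(api_key="YOUR_API_KEY")  # assumed constructor signature

# Discover which aggregate fields the erc20 protocol exposes (per the README)
schema = client.aggregate_schema("erc20")
print(schema)

# Hourly buckets of large USDT transfers, returned as a DataFrame
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21100000)
    .min_amount(10000)
    .aggregate(group_by="time", period="1h")
    .as_df()
)
print(df.columns)  # columns depend on the aggregate schema above
print(df.head())
```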