cachebox 5.2.1__cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,880 @@
1
+ Metadata-Version: 2.4
2
+ Name: cachebox
3
+ Version: 5.2.1
4
+ Classifier: Programming Language :: Python :: Implementation :: CPython
5
+ Classifier: Programming Language :: Python :: Implementation :: PyPy
6
+ Classifier: Programming Language :: Python :: 3
7
+ Classifier: Programming Language :: Python :: 3 :: Only
8
+ Classifier: Programming Language :: Python :: 3.9
9
+ Classifier: Programming Language :: Python :: 3.10
10
+ Classifier: Programming Language :: Python :: 3.11
11
+ Classifier: Programming Language :: Python :: 3.12
12
+ Classifier: Programming Language :: Python :: 3.13
13
+ Classifier: Programming Language :: Python :: 3.14
14
+ Classifier: Programming Language :: Python
15
+ Classifier: Programming Language :: Rust
16
+ Classifier: Intended Audience :: Developers
17
+ Classifier: License :: OSI Approved :: MIT License
18
+ Classifier: Operating System :: POSIX :: Linux
19
+ Classifier: Operating System :: Microsoft :: Windows
20
+ Classifier: Operating System :: MacOS
21
+ Classifier: Typing :: Typed
22
+ License-File: LICENSE
23
+ Summary: The fastest memoizing and caching Python library written in Rust
24
+ Keywords: caching,cached,cachebox,cache,in-memory-caching,memoizing
25
+ Home-Page: https://github.com/awolverp/cachebox
26
+ Author-email: awolverp <awolverp@gmail.com>
27
+ License: MIT
28
+ Requires-Python: >=3.9
29
+ Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
30
+ Project-URL: Homepage, https://github.com/awolverp/cachebox
31
+
32
+ <div align="center">
33
+
34
+ # Cachebox
35
+
36
+ *The fastest caching Python library written in Rust*
37
+
38
+ [**Releases**](https://github.com/awolverp/cachebox/releases) |
39
+ [**Benchmarks**](https://github.com/awolverp/cachebox-benchmark) |
40
+ [**Issues**](https://github.com/awolverp/cachebox/issues/new)
41
+
42
+ [![License](https://img.shields.io/github/license/awolverp/cachebox.svg?style=flat-square)](https://github.com/awolverp/cachebox/blob/main/LICENSE)
43
+ [![Release](https://img.shields.io/github/v/release/awolverp/cachebox.svg?style=flat-square)](https://github.com/awolverp/cachebox/releases)
44
+ [![Python Versions](https://img.shields.io/pypi/pyversions/cachebox.svg?style=flat-square)](https://pypi.org/project/cachebox/)
45
+ [![Downloads](https://img.shields.io/pypi/dm/cachebox?style=flat-square&color=%23314bb5)](https://pepy.tech/projects/cachebox)
46
+
47
+ </div>
48
+
49
+ -------
50
+
51
+ ### What does it do?
52
+ Cachebox lets you perform caching operations in Python easily and as fast as possible.
53
+ It can make your application significantly faster and is a good choice for large applications.
54
+ **Ideal for optimizing large-scale applications** with efficient, low-overhead caching.
55
+
56
+ **Key Features:**
57
+ - 🚀 Extremely fast (10-50x faster than other caching libraries -- [*benchmarks*](https://github.com/awolverp/cachebox-benchmark))
58
+ - 📊 Minimal memory footprint (50% of standard dictionary memory usage)
59
+ - 🔥 Full-featured and user-friendly
60
+ - 🧶 Completely thread-safe
61
+ - 🔧 Tested and correct
62
+ - 🦀 Written in Rust for maximum performance
63
+ - 🤝 Compatible with Python 3.9+ (PyPy and CPython)
64
+ - 📦 Supports 7 advanced caching algorithms
65
+
66
+ ### Page Contents
67
+ - โ“ [**When i need caching and cachebox**](#when-i-need-caching-and-cachebox)
68
+ - ๐ŸŒŸ [**Why `cachebox`**](#why-cachebox)
69
+ - ๐Ÿ”ง [**Installation**](#installation)
70
+ - ๐Ÿ’ก [**Preview**](#examples)
71
+ - ๐ŸŽ“ [**Getting started**](#getting-started)
72
+ - โœ๏ธ [**Incompatible changes**](#%EF%B8%8F-incompatible-changes)
73
+ - ๐Ÿ“Œ [**Tips & Notes**](#tips-and-notes)
74
+
75
+ ### When I need caching and cachebox
76
+ - 📈 **Frequent Data Access** \
77
+ If you need to access the same data multiple times, caching can help reduce the number of database queries or API calls, improving performance.
78
+
79
+ - 💎 **Expensive Operations** \
80
+ If you have operations that are computationally expensive, caching can help reduce the number of times these operations need to be performed.
81
+
82
+ - 🚗 **High Traffic Scenarios** \
83
+ If your application has high user traffic, caching can help reduce the load on your server by reducing the number of requests that need to be processed.
84
+
85
+ - #๏ธโƒฃ **Web Page Rendring** \
86
+ If you are rendering web pages, caching can help reduce the time it takes to generate the page by caching the results of expensive operations. Caching HTML pages can speed up the delivery of static content.
87
+
88
+ - 🚧 **Rate Limiting** \
89
+ If you have a rate limiting system in place, caching can help reduce the number of requests that need to be processed by the rate limiter. Also, caching can help you to manage rate limits imposed by third-party APIs by reducing the number of requests sent.
90
+
91
+ - 🤖 **Machine Learning Models** \
92
+ If your application frequently makes predictions using the same input data, caching the results can save computation time.
93
+
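The rate-limiting use case above can be sketched in a few lines. This is a hypothetical, stdlib-only fixed-window limiter (the `RateLimiter` name and `allow` method are illustrative, not cachebox APIs); a TTL cache such as cachebox's `TTLCache` could replace the manual window bookkeeping with automatic expiry.

```python
import time

# Hypothetical sketch (not a cachebox API): a fixed-window rate limiter.
# A TTL cache could store the per-key counters and expire them automatically.
class RateLimiter:
    def __init__(self, limit: int, window: float) -> None:
        self.limit = limit    # max allowed calls per window
        self.window = window  # window length in seconds
        self.hits = {}        # key -> (window_start, count)

    def allow(self, key: str) -> bool:
        now = time.monotonic()
        start, count = self.hits.get(key, (now, 0))
        if now - start >= self.window:  # window elapsed: reset the counter
            start, count = now, 0
        if count >= self.limit:         # over the limit inside this window
            return False
        self.hits[key] = (start, count + 1)
        return True

limiter = RateLimiter(limit=2, window=60.0)
assert limiter.allow("client-1") is True
assert limiter.allow("client-1") is True
assert limiter.allow("client-1") is False  # third call in the window is rejected
```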
94
+ ### Why cachebox?
95
+ - **⚡ Rust** \
96
+ It is written in *Rust* for high performance.
97
+
98
+ - **🧮 SwissTable** \
99
+ It uses Google's high-performance SwissTable hash map, thanks to [hashbrown](https://github.com/rust-lang/hashbrown).
100
+
101
+ - **✨ Low memory usage** \
102
+ It has very low memory usage.
103
+
104
+ - **⭐ Zero Dependency** \
105
+ Since `cachebox` is written in Rust, you don't have to install any other dependencies.
106
+
107
+ - **🧶 Thread safe** \
108
+ It's completely thread-safe, using locks to prevent data races.
109
+
110
+ - **👌 Easy To Use** \
111
+ Just import it, choose your implementation, and use it like a dictionary.
112
+
113
+ - **🚫 Avoids Cache Stampede** \
114
+ It avoids [cache stampede](https://en.wikipedia.org/wiki/Cache_stampede) by using a distributed lock system.
115
+
116
+
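To make the cache-stampede point above concrete, here is a minimal, hypothetical pure-Python sketch of per-key locking (not cachebox's actual Rust internals): when many threads miss the same key at once, only one of them computes the value.

```python
import threading

# Hypothetical sketch (not cachebox internals): a per-key lock ensures that
# concurrent misses on the same key trigger only one expensive computation.
class StampedeSafeCache:
    def __init__(self) -> None:
        self._data = {}
        self._locks = {}
        self._guard = threading.Lock()

    def get_or_compute(self, key, compute):
        with self._guard:
            if key in self._data:
                return self._data[key]
            lock = self._locks.setdefault(key, threading.Lock())
        with lock:                      # only one thread per key enters here
            with self._guard:
                if key in self._data:   # another thread finished first
                    return self._data[key]
            value = compute()           # expensive call happens exactly once
            with self._guard:
                self._data[key] = value
            return value

calls = []
cache = StampedeSafeCache()
threads = [
    threading.Thread(target=lambda: cache.get_or_compute("k", lambda: calls.append(1) or 42))
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert len(calls) == 1  # the expensive computation ran only once
```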
117
+ ## Installation
118
+ cachebox is installable by `pip`:
119
+ ```bash
120
+ pip3 install -U cachebox
121
+ ```
122
+
123
+ > [!WARNING]\
124
+ > Version 5 has some incompatibilities with v4; for more info, please see [Incompatible changes](#incompatible-changes)
125
+
126
+ ## Examples
127
+ The simplest example of **cachebox** could look like this:
128
+ ```python
129
+ import cachebox
130
+
131
+ # If maxsize is set to 0, the cache can grow without bound (like functools.lru_cache with maxsize=None).
132
+ @cachebox.cached(cachebox.FIFOCache(maxsize=128))
133
+ def factorial(number: int) -> int:
134
+ fact = 1
135
+ for num in range(2, number + 1):
136
+ fact *= num
137
+ return fact
138
+
139
+ assert factorial(5) == 120
140
+ assert len(factorial.cache) == 1
141
+
142
+ # Async functions are also supported
143
+ @cachebox.cached(cachebox.LRUCache(maxsize=128))
144
+ async def make_request(method: str, url: str) -> dict:
145
+ response = await client.request(method, url)
146
+ return response.json()
147
+ ```
148
+
149
+ Also, unlike functools.lru_cache and other caching libraries, cachebox can copy `dict`, `list`, and `set` objects.
150
+ ```python
151
+ @cachebox.cached(cachebox.LRUCache(maxsize=128))
152
+ def make_dict(name: str, age: int) -> dict:
153
+ return {"name": name, "age": age}
154
+ >
155
+ d = make_dict("cachebox", 10)
156
+ assert d == {"name": "cachebox", "age": 10}
157
+ d["new-key"] = "new-value"
158
+
159
+ d2 = make_dict("cachebox", 10)
160
+ # `d2` will be `{"name": "cachebox", "age": 10, "new-key": "new-value"}` if you use other libraries
161
+ assert d2 == {"name": "cachebox", "age": 10}
162
+ ```
163
+
164
+ You can use the cache algorithms without the `cached` decorator -- just import the algorithm you want and use it like a dictionary.
165
+ ```python
166
+ from cachebox import FIFOCache
167
+
168
+ cache = FIFOCache(maxsize=128)
169
+ cache["key"] = "value"
170
+ assert cache["key"] == "value"
171
+
172
+ # You can also use `cache.get(key, default)`
173
+ assert cache.get("key") == "value"
174
+ ```
175
+
176
+ ## Getting started
177
+ There are two useful functions:
178
+ - [**cached**](#cached--decorator): a decorator that helps you cache your functions and calculations, with a lot of options.
179
+ - [**is_cached**](#is_cached--function): checks whether a function/method is cached by cachebox
180
+
181
+ And 9 classes:
182
+ - [**BaseCacheImpl**](#basecacheimpl-️-class): the base class for all cache classes.
183
+ - [**Cache**](#cache-️-class): a simple cache that has no algorithm; this is only a hashmap.
184
+ - [**FIFOCache**](#fifocache-️-class): the FIFO cache will remove the element that has been in the cache the longest.
185
+ - [**RRCache**](#rrcache-️-class): the RR cache will randomly choose an element to remove when space is needed.
186
+ - [**LRUCache**](#lrucache-️-class): the LRU cache will remove the element in the cache that has not been accessed in the longest time.
187
+ - [**LFUCache**](#lfucache-️-class): the LFU cache will remove the element in the cache that has been accessed the least, regardless of time.
188
+ - [**TTLCache**](#ttlcache-️-class): the TTL cache will automatically remove the element in the cache that has expired.
189
+ - [**VTTLCache**](#vttlcache-️-class): the VTTL cache will automatically remove expired elements; each key can have its own time-to-live.
190
+ - [**Frozen**](#frozen-️-class): you can use this class to freeze your caches.
191
+
192
+ You only need to import the class you want and use it like a dictionary (except for [VTTLCache](#vttlcache-️-class), which has some differences).
193
+
194
+ The examples below introduce these classes and their methods.
195
+ **All the methods you will see in the examples are common across all classes (except for a few of them).**
196
+
197
+ * * *
198
+
199
+ ### `cached` (🎀 decorator)
200
+ Decorator to wrap a function with a memoizing callable that saves results in a cache.
201
+
202
+ **Parameters:**
203
+ - `cache`: Specifies a cache that handles and stores the results. If `None` or a `dict` is given, a `FIFOCache` will be used.
204
+
205
+ - `key_maker`: Specifies a function that will be called with the same positional and keyword
206
+ arguments as the wrapped function itself, and which has to return a suitable
207
+ cache key (must be hashable).
208
+
209
+ - `clear_reuse`: The wrapped function has a method named `cache_clear` that uses the `cache.clear`
210
+ method to clear the cache. This parameter will be passed to the cache's `clear` method.
211
+
212
+ - `callback`: Every time the `cache` is used, callback is also called.
213
+ The callback arguments are: event number (see `EVENT_MISS` or `EVENT_HIT` variables), key, and then result.
214
+
215
+ - `copy_level`: The wrapped function always copies the result of your function and then returns it.
216
+ This parameter specifies which types of results the wrapped function should copy.
217
+ `0` means "never copy", `1` means "only copy `dict`, `list`, and `set` results" and
218
+ `2` means "always copy the results".
219
+
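To see why `copy_level` exists, compare with the standard library's `functools.lru_cache`, which never copies results: the cached object itself is handed back, so mutating it silently corrupts the cached entry. This is the hazard cachebox's result copying is designed to avoid.

```python
import functools

# functools.lru_cache returns the cached object itself, so a mutation of the
# returned dict leaks back into the cache on the next hit.
@functools.lru_cache(maxsize=128)
def make_dict(name: str, age: int) -> dict:
    return {"name": name, "age": age}

d = make_dict("x", 1)
d["extra"] = True

# The "cached" value now carries the mutation:
assert make_dict("x", 1) == {"name": "x", "age": 1, "extra": True}
```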
220
+ <details>
221
+ <summary><b>Examples</b></summary>
222
+
223
+
224
+ A simple example:
225
+ ```python
226
+ import cachebox
227
+
228
+ @cachebox.cached(cachebox.LRUCache(128))
229
+ def sum_as_string(a, b):
230
+ return str(a+b)
231
+
232
+ assert sum_as_string(1, 2) == "3"
233
+
234
+ assert len(sum_as_string.cache) == 1
235
+ sum_as_string.cache_clear()
236
+ assert len(sum_as_string.cache) == 0
237
+ ```
238
+
239
+ A key_maker example:
240
+ ```python
241
+ import cachebox
242
+
243
+ def simple_key_maker(args: tuple, kwds: dict):
244
+ return args[0].path
245
+
246
+ # Async methods are supported
247
+ @cachebox.cached(cachebox.LRUCache(128), key_maker=simple_key_maker)
248
+ async def request_handler(request: Request):
249
+ return Response("hello man")
250
+ ```
251
+
252
+ A typed key_maker example:
253
+ ```python
254
+ import cachebox
255
+
256
+ @cachebox.cached(cachebox.LRUCache(128), key_maker=cachebox.make_typed_key)
257
+ def sum_as_string(a, b):
258
+ return str(a+b)
259
+
260
+ sum_as_string(1.0, 1)
261
+ sum_as_string(1, 1)
262
+ print(len(sum_as_string.cache)) # 2
263
+ ```
264
+
265
+ You can also manage a function's cache through its `.cache` attribute, as you saw in the examples.
266
+ There are more attributes and methods you can use:
267
+ ```python
268
+ import cachebox
269
+
270
+ @cachebox.cached(cachebox.LRUCache(0))
271
+ def sum_as_string(a, b):
272
+ return str(a+b)
273
+
274
+ print(sum_as_string.cache)
275
+ # LRUCache(0 / 9223372036854775807, capacity=0)
276
+
277
+ print(sum_as_string.cache_info())
278
+ # CacheInfo(hits=0, misses=0, maxsize=9223372036854775807, length=0, memory=8)
279
+
280
+ # `.cache_clear()` clears the cache
281
+ sum_as_string.cache_clear()
282
+ ```
283
+
284
+ method example: *(Added in v5.1.0)*
285
+ ```python
286
+ import cachebox
287
+
288
+ class Example:
289
+ def __init__(self, num) -> None:
290
+ self.num = num
291
+ self._cache = cachebox.TTLCache(20, 10)
292
+
293
+ @cachebox.cached(lambda self: self._cache)
294
+ def method(self, char: str):
295
+ return char * self.num
296
+
297
+ ex = Example(10)
298
+ assert ex.method("a") == "a" * 10
299
+ ```
300
+
301
+ callback example: *(Added in v4.2.0)*
302
+ ```python
303
+ import cachebox
304
+
305
+ def callback_func(event: int, key, value):
306
+ if event == cachebox.EVENT_MISS:
307
+ print("callback_func: miss event", key, value)
308
+ elif event == cachebox.EVENT_HIT:
309
+ print("callback_func: hit event", key, value)
310
+ else:
311
+ # unreachable code
312
+ raise NotImplementedError
313
+
314
+ @cachebox.cached(cachebox.LRUCache(0), callback=callback_func)
315
+ def func(a, b):
316
+ return a + b
317
+
318
+ assert func(1, 2) == 3
319
+ # callback_func: miss event (1, 2) 3
320
+
321
+ assert func(1, 2) == 3 # hit
322
+ # callback_func: hit event (1, 2) 3
323
+
324
+ assert func(1, 2) == 3 # hit again
325
+ # callback_func: hit event (1, 2) 3
326
+
327
+ assert func(5, 4) == 9
328
+ # callback_func: miss event (5, 4) 9
329
+ ```
330
+
331
+ </details>
332
+
333
+ > [!TIP]\
334
+ > Since **`v4.1.0`**, you can tell a cached function not to use the cache for a single call:
335
+ > ```python
336
+ > # with `cachebox__ignore=True` parameter, cachebox does not use cache and only calls the function and returns its result.
337
+ > sum_as_string(10, 20, cachebox__ignore=True)
338
+ > ```
339
+
340
+ * * *
341
+
342
+ ### `cachedmethod` (🎀 decorator)
343
+ This works exactly like `cached()`, but ignores the `self` parameter in hashing and key making.
344
+
345
+ > [!WARNING]\
346
+ > This function has been deprecated since `v5.1.0`, use `cached` function instead.
347
+
348
+ <details>
349
+ <summary><b>Example</b></summary>
350
+
351
+ ```python
352
+ import cachebox
353
+
354
+ class MyClass:
355
+ @cachebox.cachedmethod(cachebox.TTLCache(0, ttl=10))
356
+ def my_method(self, name: str):
357
+ return "Hello, " + name + "!"
358
+
359
+ c = MyClass()
360
+ c.my_method("world")
361
+ ```
362
+
363
+ </details>
364
+
365
+ * * *
366
+
367
+ ### `is_cached` (📦 function)
368
+ Checks whether a function/method is cached by cachebox.
369
+
370
+ **Parameters:**
371
+ - `func`: The function/method to check.
372
+
373
+ <details>
374
+ <summary><b>Example</b></summary>
375
+
376
+ ```python
377
+ import cachebox
378
+
379
+ @cachebox.cached(cachebox.FIFOCache(0))
380
+ def func():
381
+ pass
382
+
383
+ assert cachebox.is_cached(func)
384
+ ```
385
+
386
+ </details>
387
+
388
+ * * *
389
+
390
+ ### `BaseCacheImpl` (🏗️ class)
391
+ Base implementation for cache classes in the cachebox library.
392
+
393
+ This abstract base class defines the generic structure for cache implementations,
394
+ supporting different key and value types through generic type parameters.
395
+ Serves as a foundation for specific cache variants like Cache and FIFOCache.
396
+
397
+ <details>
398
+ <summary><b>Example</b></summary>
399
+
400
+ ```python
401
+ import cachebox
402
+
403
+ # subclass
404
+ class ClassName(cachebox.BaseCacheImpl):
405
+ ...
406
+
407
+ # type-hint
408
+ def func(cache: cachebox.BaseCacheImpl):
409
+ ...
410
+
411
+ # isinstance
412
+ cache = cachebox.LFUCache(0)
413
+ assert isinstance(cache, cachebox.BaseCacheImpl)
414
+ ```
415
+
416
+ </details>
417
+
418
+ * * *
419
+
420
+ ### `Cache` (🏗️ class)
421
+ A thread-safe, memory-efficient hashmap-like cache with configurable maximum size.
422
+
423
+ Provides a flexible key-value storage mechanism with:
424
+ - Configurable maximum size (zero means unlimited)
425
+ - Lower memory usage compared to standard dict
426
+ - Thread-safe operations
427
+ - Useful memory management methods
428
+
429
+ Supports initialization with optional initial data and capacity,
430
+ and provides dictionary-like access with additional cache-specific operations.
431
+
432
+ > [!TIP]\
433
+ > Differs from standard `dict` by:
434
+ > - it is thread-safe and unordered, while `dict` is not thread-safe and is ordered (Python 3.6+).
434
+ > - it uses much less memory than `dict`.
435
+ > - it supports useful new methods for managing memory, while `dict` does not.
436
+ > - it does not support `popitem`, while `dict` does.
437
+ > - you can limit the size of a `Cache`, but you cannot limit a `dict`.
439
+
440
+ | | get | insert | delete | popitem |
441
+ | ------------ | ----- | ------- | ------ | ------- |
442
+ | Worst-case | O(1) | O(1) | O(1) | N/A |
443
+
444
+ <details>
445
+ <summary><b>Example</b></summary>
446
+
447
+ ```python
448
+ from cachebox import Cache
449
+
450
+ # These parameters are common in classes:
451
+ # The `maxsize` param specifies the size limit of the cache (zero means unbounded); this is unchangeable.
452
+ # The `iterable` param lets you initialize the cache from a dict or an iterable.
453
+ # If `capacity` param is given, cache attempts to allocate a new hash table with at
454
+ # least enough capacity for inserting the given number of elements without reallocating.
455
+ cache = Cache(maxsize=100, iterable=None, capacity=100)
456
+
457
+ # you can use it like a dictionary
458
+ cache["key"] = "value"
459
+ # or you can use `.insert(key, value)` instead of that (recommended)
460
+ cache.insert("key", "value")
461
+
462
+ print(cache["key"]) # value
463
+
464
+ del cache["key"]
465
+ cache["key"] # KeyError: key
466
+
467
+ # cachebox.Cache has no eviction policy, so it raises OverflowError when the bound is reached.
468
+ cache.update({i:i for i in range(200)})
469
+ # OverflowError: The cache has reached the bound.
470
+ ```
471
+
472
+ </details>
473
+
474
+ * * *
475
+
476
+ ### `FIFOCache` (🏗️ class)
477
+ A First-In-First-Out (FIFO) cache implementation with configurable maximum size and optional initial capacity.
478
+
479
+ This cache provides a fixed-size container that automatically removes the oldest items when the maximum size is reached.
480
+
481
+ **Key features**:
482
+ - Deterministic item eviction order (oldest items removed first)
483
+ - Efficient key-value storage and retrieval
484
+ - Supports dictionary-like operations
485
+ - Allows optional initial data population
486
+
487
+ | | get | insert | delete | popitem |
488
+ | ------------ | ----- | ------- | ------------- | ------- |
489
+ | Worst-case | O(1) | O(1) | O(min(i, n-i)) | O(1) |
490
+
491
+ <details>
492
+ <summary><b>Example</b></summary>
493
+
494
+ ```python
495
+ from cachebox import FIFOCache
496
+
497
+ cache = FIFOCache(5, {i:i*2 for i in range(5)})
498
+
499
+ print(len(cache)) # 5
500
+ cache["new-key"] = "new-value"
501
+ print(len(cache)) # 5
502
+
503
+ print(cache.get(3, "default-val")) # 6
504
+ print(cache.get(6, "default-val")) # default-val
505
+
506
+ print(cache.popitem()) # (1, 2)
507
+
508
+ # insert method returns a value:
509
+ # - If the cache did not have this key present, None is returned.
510
+ # - If the cache did have this key present, the value is updated, and the old value is returned.
511
+ print(cache.insert(3, "val")) # 6
512
+ print(cache.insert("new-key", "val")) # None
513
+
514
+ # Returns the first key in cache; this is the one which will be removed by `popitem()`.
515
+ print(cache.first())
516
+ ```
517
+
518
+ </details>
519
+
520
+ * * *
521
+
522
+ ### `RRCache` (🏗️ class)
523
+ A thread-safe cache implementation with Random Replacement (RR) policy.
524
+
525
+ This cache randomly selects and removes elements when the cache reaches its maximum size,
526
+ ensuring a simple and efficient caching mechanism with configurable capacity.
527
+
528
+ Supports operations like insertion, retrieval, deletion, and iteration with O(1) complexity.
529
+
530
+ | | get | insert | delete | popitem |
531
+ | ------------ | ----- | ------- | ------ | ------- |
532
+ | Worst-case | O(1) | O(1) | O(1) | O(1) |
533
+
534
+ <details>
535
+ <summary><b>Example</b></summary>
536
+
537
+ ```python
538
+ from cachebox import RRCache
539
+
540
+ cache = RRCache(10, {i:i for i in range(10)})
541
+ print(cache.is_full()) # True
542
+ print(cache.is_empty()) # False
543
+
544
+ # Returns the number of elements the map can hold without reallocating.
545
+ print(cache.capacity()) # 28
546
+
547
+ # Shrinks the cache to fit len(self) elements.
548
+ cache.shrink_to_fit()
549
+ print(cache.capacity()) # 10
550
+
551
+ # Returns a random key
552
+ print(cache.random_key()) # 4
553
+ ```
554
+
555
+ </details>
556
+
557
+ * * *
558
+
559
+ ### `LRUCache` (🏗️ class)
560
+ Thread-safe Least Recently Used (LRU) cache implementation.
561
+
562
+ Provides a cache that automatically removes the least recently used items when
563
+ the cache reaches its maximum size. Supports various operations like insertion,
564
+ retrieval, and management of cached items with configurable maximum size and
565
+ initial capacity.
566
+
567
+ | | get | insert | delete(i) | popitem |
568
+ | ------------ | ----- | ------- | --------- | ------- |
569
+ | Worst-case | O(1)~ | O(1)~ | O(1)~ | O(1)~ |
570
+
571
+ <details>
572
+ <summary><b>Example</b></summary>
573
+
574
+ ```python
575
+ from cachebox import LRUCache
576
+
577
+ cache = LRUCache(0, {i:i*2 for i in range(10)})
578
+
579
+ # access `0`
580
+ print(cache[0]) # 0
581
+ print(cache.least_recently_used()) # 1
582
+ print(cache.popitem()) # (1, 2)
583
+
584
+ # .peek() searches for a key-value in the cache and returns it without moving the key to recently used.
585
+ print(cache.peek(2)) # 4
586
+ print(cache.popitem()) # (3, 6)
587
+
588
+ # Calls `popitem()` `n` times and returns the count of removed items.
589
+ print(cache.drain(5)) # 5
590
+ ```
591
+
592
+ </details>
593
+
594
+ * * *
595
+
596
+ ### `LFUCache` (🏗️ class)
597
+ A thread-safe Least Frequently Used (LFU) cache implementation.
598
+
599
+ This cache removes elements that have been accessed the least number of times,
600
+ regardless of their access time. It provides methods for inserting, retrieving,
601
+ and managing cache entries with configurable maximum size and initial capacity.
602
+
603
+ | | get | insert | delete(i) | popitem |
604
+ | ------------ | ----- | ------- | --------- | ------- |
605
+ | Worst-case | O(1)~ | O(1)~ | O(min(i, n-i)) | O(1)~ |
606
+
607
+ <details>
608
+ <summary><b>Example</b></summary>
609
+
610
+ ```python
611
+ from cachebox import LFUCache
612
+
613
+ cache = LFUCache(5)
614
+ cache.insert('first', 'A')
615
+ cache.insert('second', 'B')
616
+
617
+ # access 'first' twice
618
+ cache['first']
619
+ cache['first']
620
+
621
+ # access 'second' once
622
+ cache['second']
623
+
624
+ assert cache.least_frequently_used() == 'second'
625
+ assert cache.least_frequently_used(2) is None # 2 is out of range
626
+
627
+ for item in cache.items_with_frequency():
628
+ print(item)
629
+ # ('second', 'B', 1)
630
+ # ('first', 'A', 2)
631
+ ```
632
+
633
+ </details>
634
+
635
+ * * *
636
+
637
+ ### `TTLCache` (🏗️ class)
638
+ A thread-safe Time-To-Live (TTL) cache implementation with configurable maximum size and expiration.
639
+
640
+ This cache automatically removes elements that have expired based on their time-to-live setting.
641
+ Supports various operations like insertion, retrieval, and iteration.
642
+
643
+ | | get | insert | delete(i) | popitem |
644
+ | ------------ | ----- | ------- | --------- | ------- |
645
+ | Worst-case | O(1)~ | O(1)~ | O(min(i, n-i)) | O(n) |
646
+
647
+ <details>
648
+ <summary><b>Example</b></summary>
649
+
650
+ ```python
651
+ from cachebox import TTLCache
652
+ import time
653
+
654
+ # The `ttl` param specifies the time-to-live value for each element in cache (in seconds); cannot be zero or negative.
655
+ cache = TTLCache(0, ttl=2)
656
+ cache.update({i:str(i) for i in range(10)})
657
+
658
+ print(cache.get_with_expire(2)) # ('2', 1.99)
659
+
660
+ # Returns the oldest key in cache; this is the one which will be removed by `popitem()`
661
+ print(cache.first()) # 0
662
+
663
+ cache["mykey"] = "value"
664
+ time.sleep(2)
665
+ cache["mykey"] # KeyError
666
+ ```
667
+
668
+ </details>
669
+
670
+ * * *
671
+
672
+ ### `VTTLCache` (🏗️ class)
673
+ A thread-safe, time-to-live (TTL) cache implementation with per-key expiration policy.
674
+
675
+ This cache allows storing key-value pairs with optional expiration times. When an item expires,
676
+ it is automatically removed from the cache. The cache supports a maximum size and provides
677
+ various methods for inserting, retrieving, and managing cached items.
678
+
679
+ Key features:
680
+ - Per-key time-to-live (TTL) support
681
+ - Configurable maximum cache size
682
+ - Thread-safe operations
683
+ - Automatic expiration of items
684
+
685
+ Supports dictionary-like operations such as get, insert, update, and iteration.
686
+
687
+ | | get | insert | delete(i) | popitem |
688
+ | ------------ | ----- | ------- | --------- | ------- |
689
+ | Worst-case | O(1)~ | O(1)~ | O(min(i, n-i)) | O(1)~ |
690
+
691
+ > [!TIP]\
692
+ > `VTTLCache` vs `TTLCache`:
693
+ > - In `VTTLCache` each item has its own unique time-to-live, unlike `TTLCache`.
694
+ > - `VTTLCache` is generally slower than `TTLCache`.
695
+
696
+ <details>
697
+ <summary><b>Example</b></summary>
698
+
699
+ ```python
700
+ from cachebox import VTTLCache
701
+ import time
702
+
703
+ # The `ttl` param specifies the time-to-live value for `iterable` (in seconds); cannot be zero or negative.
704
+ cache = VTTLCache(100, iterable={i:i for i in range(4)}, ttl=3)
705
+ print(len(cache)) # 4
706
+ time.sleep(3)
707
+ print(len(cache)) # 0
708
+
709
+ # "key1" exists for 5 seconds
710
+ cache.insert("key1", "value", ttl=5)
711
+ # "key2" exists for 2 seconds
712
+ cache.insert("key2", "value", ttl=2)
713
+
714
+ time.sleep(2)
715
+ # "key1" will live for 3 more seconds
716
+ print(cache.get("key1")) # value
717
+
718
+ # "key2" has expired
719
+ print(cache.get("key2")) # None
720
+ ```
721
+
722
+ </details>
723
+
724
+ * * *
725
+
726
+ ### `Frozen` (🏗️ class)
727
+ **This is not a cache**; it is a wrapper class that prevents modifications to an underlying cache implementation.
728
+
729
+ This class provides a read-only view of a cache, optionally allowing silent
730
+ suppression of modification attempts instead of raising exceptions.
731
+
732
+ <details>
733
+ <summary><b>Example</b></summary>
734
+
735
+ ```python
736
+ from cachebox import Frozen, FIFOCache
737
+
738
+ cache = FIFOCache(10, {1:1, 2:2, 3:3})
739
+
740
+ # parameters:
741
+ # cls: your cache
742
+ # ignore: if False, a TypeError is raised on any attempt to change the cache; otherwise such attempts are silently ignored.
743
+ frozen = Frozen(cache, ignore=True)
744
+ print(frozen[1]) # 1
745
+ print(len(frozen)) # 3
746
+
747
+ # Frozen ignores this action and does nothing
748
+ frozen.insert("key", "value")
749
+ print(len(frozen)) # 3
750
+
751
+ # Let's try with ignore=False
752
+ frozen = Frozen(cache, ignore=False)
753
+
754
+ frozen.insert("key", "value")
755
+ # TypeError: This cache is frozen.
756
+ ```
757
+
758
+ </details>
759
+
760
+ > [!NOTE]\
761
+ > The **Frozen** class can't prevent expiration in [TTLCache](#ttlcache-️-class) or [VTTLCache](#vttlcache-️-class).
762
+ >
763
+ > For example:
764
+ > ```python
765
+ > cache = TTLCache(0, ttl=3, iterable={i:i for i in range(10)})
766
+ > frozen = Frozen(cache)
767
+ >
768
+ > time.sleep(3)
769
+ > print(len(frozen)) # 0
770
+ > ```
771
+
772
+ ## ⚠️ Incompatible Changes
773
+ These are changes that are not compatible with the previous version:
774
+
775
+ **You can see more info about changes in [Changelog](CHANGELOG.md).**
776
+
777
+ #### CacheInfo's cachememory attribute renamed!
778
+ The `CacheInfo.cachememory` was renamed to `CacheInfo.memory`.
779
+
780
+ ```python
781
+ @cachebox.cached({})
782
+ def func(a: int, b: int) -> str:
783
+ ...
784
+
785
+ info = func.cache_info()
786
+
787
+ # Older versions
788
+ print(info.cachememory)
789
+
790
+ # New version
791
+ print(info.memory)
792
+ ```
793
+
794
+ #### Errors in the `__eq__` method will not be ignored!
795
+ Now, errors raised during `__eq__` operations are no longer ignored.
796
+
797
+ ```python
798
+ class A:
799
+ def __hash__(self):
800
+ return 1
801
+
802
+ def __eq__(self, other):
803
+ raise NotImplementedError("not implemeneted")
804
+
805
+ cache = cachebox.FIFOCache(0, {A(): 10})
806
+
807
+ # Older versions:
808
+ cache[A()] # => KeyError
809
+
810
+ # New version:
811
+ cache[A()]
812
+ # Traceback (most recent call last):
813
+ # File "script.py", line 11, in <module>
814
+ # cache[A()]
815
+ # ~~~~~^^^^^
816
+ # File "script.py", line 7, in __eq__
817
+ # raise NotImplementedError("not implemeneted")
818
+ # NotImplementedError: not implemeneted
819
+ ```
820
+
821
+ #### Cache comparisons will not be strict!
822
+ In older versions, cache comparisons depended on the caching algorithm. Now, they work just like dictionary comparisons.
823
+
824
+ ```python
825
+ cache1 = cachebox.FIFOCache(10)
826
+ cache2 = cachebox.FIFOCache(10)
827
+
828
+ cache1.insert(1, 'first')
829
+ cache1.insert(2, 'second')
830
+
831
+ cache2.insert(2, 'second')
832
+ cache2.insert(1, 'first')
833
+
834
+ # Older versions:
835
+ cache1 == cache2 # False
836
+
837
+ # New version:
838
+ cache1 == cache2 # True
839
+ ```
840
+
841
+ ## Tips and Notes
842
+ #### How to save caches in files?
843
+ There's no built-in file-based implementation, but you can use `pickle` to save caches to files. For example:
844
+ ```python
845
+ import cachebox
846
+ import pickle
847
+ c = cachebox.LRUCache(100, {i:i for i in range(78)})
848
+
849
+ with open("file", "wb") as fd:
850
+ pickle.dump(c, fd)
851
+
852
+ with open("file", "rb") as fd:
853
+ loaded = pickle.load(fd)
854
+
855
+ assert c == loaded
856
+ assert c.capacity() == loaded.capacity()
857
+ ```
858
+
859
+ > [!TIP]\
860
+ > For more, see this [issue](https://github.com/awolverp/cachebox/issues/8).
861
+
862
+ * * *
863
+
864
+ #### How to copy the caches?
865
+ You can use `copy.deepcopy` or `cache.copy` for copying caches. For example:
866
+ ```python
867
+ import cachebox
868
+ cache = cachebox.LRUCache(100, {i:i for i in range(78)})
869
+
870
+ # shallow copy
871
+ shallow = cache.copy()
872
+
873
+ # deep copy
874
+ import copy
875
+ deep = copy.deepcopy(cache)
876
+ ```
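The shallow/deep distinction matters when cached values are mutable. This stdlib-only illustration (independent of cachebox) shows the difference:

```python
import copy

original = {"k": [1, 2]}
shallow = dict(original)        # new dict, but the inner list is shared
deep = copy.deepcopy(original)  # fully independent structure

original["k"].append(3)
assert shallow["k"] == [1, 2, 3]  # shallow copy sees the mutation
assert deep["k"] == [1, 2]        # deep copy does not
```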
877
+
878
+ ## License
879
+ This repository is licensed under the [MIT License](LICENSE).
880
+