django-fast-treenode 2.1.3__py3-none-any.whl → 2.1.5__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
- Metadata-Version: 2.2
+ Metadata-Version: 2.4
  Name: django-fast-treenode
- Version: 2.1.3
+ Version: 2.1.5
  Summary: Application for supporting tree (hierarchical) data structure in Django projects
  Home-page: https://django-fast-treenode.readthedocs.io/
  Author: Timur Kady
@@ -53,22 +53,24 @@ Requires-Python: >=3.9
  Description-Content-Type: text/markdown
  License-File: LICENSE
  Requires-Dist: Django>=4.0
- Requires-Dist: pympler>=1.0
  Requires-Dist: django-widget-tweaks>=1.5
+ Requires-Dist: msgpack>=1.1
  Provides-Extra: import-export
  Requires-Dist: openpyxl; extra == "import-export"
  Requires-Dist: pyyaml; extra == "import-export"
  Requires-Dist: xlsxwriter; extra == "import-export"
+ Dynamic: license-file
 
  # Django-fast-treenode
- **Combining Adjacency List and Closure Table for Optimal Performance**
+ **Hybrid Tree Storage**
 
  [![Tests](https://github.com/TimurKady/django-fast-treenode/actions/workflows/test.yaml/badge.svg?branch=main)](https://github.com/TimurKady/django-fast-treenode/actions/workflows/test.yaml)
  [![Docs](https://readthedocs.org/projects/django-fast-treenode/badge/?version=latest)](https://django-fast-treenode.readthedocs.io/)
  [![PyPI](https://img.shields.io/pypi/v/django-fast-treenode.svg)](https://pypi.org/project/django-fast-treenode/)
  [![Published on Django Packages](https://img.shields.io/badge/Published%20on-Django%20Packages-0c3c26)](https://djangopackages.org/packages/p/django-fast-treenode/)
+ [![Sponsor](https://img.shields.io/github/sponsors/TimurKady)](https://github.com/sponsors/TimurKady)
 
- **Django Fast TreeNode** is a high-performance Django application for working with tree structures, combining **Adjacency List** and **Closure Table** models. Each **TreeNodeModel** instance maintains two synchronized tables, enabling most operations to be performed with a single database query.
+ **Django Fast TreeNode** is a high-performance Django application for working with tree structures.
 
  ## Features
  - **Hybrid storage model**: Combines Adjacency List and Closure Table for optimal performance.
@@ -79,6 +81,8 @@ Requires-Dist: xlsxwriter; extra == "import-export"
  - **Admin panel integration**: Full compatibility with Django's admin panel, allowing intuitive management of tree structures.
  - **Import & Export functionality**: Built-in support for importing and exporting tree structures in multiple formats (CSV, JSON, XLSX, YAML, TSV), including integration with the Django admin panel.
 
+ It seems that django-fast-treenode is currently the most balanced and performant solution for most tasks, especially those related to dynamic hierarchical data structures. Check out the results of (comparison tests)[#] with other Django packages.
+
  ## Use Cases
  Django Fast TreeNode is suitable for a wide range of applications, from simple directories to complex systems with deep hierarchical structures:
  - **Categories and taxonomies**: Manage product categories, tags, and classification systems.
@@ -87,7 +91,6 @@ Django Fast TreeNode is suitable for a wide range of applications, from simple d
  - **Geographical data**: Represent administrative divisions, regions, and areas of influence.
  - **Organizational and Business Structures**: Model company hierarchies, business processes, employees and departments.
 
- In all applications, `django-fast-treenode` models show excellent performance and stability.
 
  ## Quick start
  1. Run `pip install django-fast-treenode`.
@@ -159,4 +162,4 @@ Released under [MIT License](https://github.com/TimurKady/django-fast-treenode/b
  ## Credits
  Thanks to everyone who contributed to the development and testing of this package, as well as the Django community for their inspiration and support.
 
- Special thanks to [Fabio Caccamo](https://github.com/fabiocaccamo) for the idea behind creating a fast Django application for handling hierarchies and [Mathieu Leplatre](https://github.com/leplatrem) for the advice used in writing this application.
+ Special thanks to [Fabio Caccamo](https://github.com/fabiocaccamo) for the idea behind creating a fast Django application for handling hierarchies.
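For orientation, here is a minimal sketch of the kind of model the README above describes. The `Category` model, its `name` field, and the `treenode_display_field` attribute are illustrative assumptions, not code shipped in this package; only the `TreeNodeModel` base class comes from the package layout shown below.

```python
# Minimal sketch of a tree model as described in the README above.
# `Category`, `name` and `treenode_display_field` are illustrative assumptions.
from django.db import models
from treenode.models import TreeNodeModel


class Category(TreeNodeModel):
    name = models.CharField(max_length=100)

    treenode_display_field = "name"  # assumed admin display convention

    class Meta:
        verbose_name = "Category"
        verbose_name_plural = "Categories"
```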
@@ -1,34 +1,35 @@
- treenode/__init__.py,sha256=CZ-0uZyhi8OZJP8xmqbTqXrZUjcQ4SNAXhKFgM0qw2M,99
+ django_fast_treenode-2.1.5.dist-info/licenses/LICENSE,sha256=T0evsb7y-63fg18ovdNSx3wwWWRwyluQvN9J4zFSvfE,1093
+ treenode/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
  treenode/apps.py,sha256=a7UasXiZZudPccjmHEudP79TkhR_53Mvnb-dBXLHRRQ,862
- treenode/cache.py,sha256=JLbI0EWq1XmH24SF46glC6e7sdizmfObbFA5vQJvMiY,7213
+ treenode/cache.py,sha256=GoN2J-ypEQWIK05WSw9LYo7boKHGPXNFxqHorFPUqX8,12481
  treenode/forms.py,sha256=Mjrpuyd1CPsitcElDVagE3k-p2kU4xIlRuy1f5Zgt3c,3800
  treenode/signals.py,sha256=ERrlKjGqhYaPYVKKRk1JBBlPFOmJKpJ6bXsJavcTlo0,518
  treenode/urls.py,sha256=CsgX0hRyDVrMS8YnRlr_CxmDlgGIhDpqZ9ldoMYZCac,866
- treenode/version.py,sha256=PBeSQ_6jPFoS_aURwvCBCsE-VlTS5vbGFlvnO41XfMY,222
+ treenode/version.py,sha256=mv001KtDXkO8dnqphEI_VEhvLuHmmmhI515DitY0q2U,222
  treenode/views.py,sha256=rEZEgdbEA3AJDHrvtrAm-t60QTJcJ4JEhNsNMR1Y_I4,5549
  treenode/widgets.py,sha256=Mi0F-AK_UcmU6C50ENK9vv6xGQNuDtrtzXSnXSOXhLM,4760
- treenode/admin/__init__.py,sha256=TdlPIyRW8i9qTVqGLmLWiBw4DyoGHUYZErE6rCyGOPE,119
- treenode/admin/admin.py,sha256=6H3N2Dg6l-MrFwIcyKR5YENg0cEo-I4uKCP9MuhHkqo,10580
+ treenode/admin/__init__.py,sha256=K5GgagrfLwzF8GvOYfwXpJYLCexM8DbEoK1bhsqIBvc,119
+ treenode/admin/admin.py,sha256=iVi8s8mPVVDlbbJFqCcuXzDbE29KPj6XntFDDRECkmY,10580
  treenode/admin/changelist.py,sha256=YZm3zNniX75CgLjnbHpVr0OIP91halDEBHmrcS8m5Og,2128
  treenode/admin/mixins.py,sha256=-dVZwEjKsfRzMkBe87dkI0SZ9MH45YE_o39SIhSJWy4,11194
  treenode/managers/__init__.py,sha256=EG_tj9P1Hama3kaqMfHck4lfzUWoPaJJVOXe3qaKMUo,585
- treenode/managers/adjacency.py,sha256=NVN8dq5z7gIh90yqW2uxV7MokmUfXTOT7crqMDyMaH0,7889
+ treenode/managers/adjacency.py,sha256=OOjHCSTo0aAcSxOOwz7OsQTGdTRkM1mAxSN7jlzRpho,7896
  treenode/managers/closure.py,sha256=PcScdJJUnLcKe8Y1wqROYPsRtAnBMUO4xn5sILk9AIM,10638
  treenode/models/__init__.py,sha256=pBiMlEpC_Thh7asraNzA7W_7BKu2oAHtcn-K6_sdJe8,112
- treenode/models/adjacency.py,sha256=ijStfIQDSd48L3nA8OnLD1nHGYo5YsnokqUVfzDt68w,12422
+ treenode/models/adjacency.py,sha256=QWGOidd4tH3afqVedPNQqeh-W-zUTNs1m-iAhCAXub4,12396
  treenode/models/classproperty.py,sha256=J4W6snsfsEUSHKHkIlM9yOJYQ_FSrp3P3oEYMKJengg,571
- treenode/models/closure.py,sha256=NEC8pi9QIxNtUnctEi0lPHBHEPX2K3V1oeSs9JrDGA0,4588
+ treenode/models/closure.py,sha256=eZtLbnCOR1xYWhgbo1Pml_K0Pd0MM2DjiZl3SWMVe2A,3712
  treenode/models/factory.py,sha256=10FEGGC5PGWaR58qErs0oOrCS0KeI8x9H-SknZAAWqw,2291
  treenode/models/mixins/__init__.py,sha256=gTdMZFh1slNHMvxrnu-hGl46xqnWd4W7TOEFWTVJq40,757
- treenode/models/mixins/ancestors.py,sha256=_nF8V99SBjT-G88BmcRtiTC0DgSEIgHrI0g02jqTnss,2075
- treenode/models/mixins/children.py,sha256=OchaH6m6pOr6uuiZRRBHoZXoCSWM-ENTNWu1iLHXaBM,2564
- treenode/models/mixins/descendants.py,sha256=2mhnIhC8VJomTqntzzAwFFW_CcMiwujzQoD5_mfMsK0,2208
+ treenode/models/mixins/ancestors.py,sha256=QZywMcIVZK82j13QsgevVN2ZhRLa86DfRIt2BsiM2to,1526
+ treenode/models/mixins/children.py,sha256=xgenQFyZBG7_S33QQlznSmNhXEdeo9DeLyi7dKmvFhw,2637
+ treenode/models/mixins/descendants.py,sha256=PYYfd7oqlv3Gnfahm0u9ACHjpWSDNM6Z8oJaJXPVQ8w,1910
  treenode/models/mixins/family.py,sha256=h2IRRADkQxve97QqBHKv0evVz4cFQtcNR8CbPi9Ri_w,1645
  treenode/models/mixins/logical.py,sha256=jlhBSq3AfCYNyNjqyKM9siyioS3SYcGD-aG2b4MV2RM,2169
- treenode/models/mixins/node.py,sha256=wgLbFKA99QuLV4l32GR3JZ5gYIVKUbfDzM8iDw4C8Bs,7694
+ treenode/models/mixins/node.py,sha256=VpLiFI1olvj5Gp2yV4n-aG4z4mZ7vOS6ytloWwO5s6w,7149
  treenode/models/mixins/properties.py,sha256=pfv80KLXcPeGx00IFCBcst1_cf0AmzhjshFjq1XQWMY,3876
  treenode/models/mixins/roots.py,sha256=MoFQq1fph70awc26UMUbfeTpt0ToUOvMz1c7LlDyIP8,2956
- treenode/models/mixins/siblings.py,sha256=fh0ZrlFXKxOQ4Qrp6sElTMvRhU5PyRRykLHDcbH-3Rk,3113
+ treenode/models/mixins/siblings.py,sha256=JTQjaxnDH9t-AVMCQFiuc0nHLdIsE4v5vJ5z6LcUZLY,3236
  treenode/models/mixins/tree.py,sha256=CsO0ynwcwkrWgQbTzvF4yws-y7n1GGM2zImJH0hgV00,13042
  treenode/static/.gitkeep,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
  treenode/static/treenode/.gitkeep,sha256=frcCV1k9oG9oKj3dpUqdJg1PxRT2RSN_XKdLCPjaYaY,2
@@ -53,11 +54,10 @@ treenode/utils/aid.py,sha256=o8Jgc1vDRtQpx4XYdv0qR5Lqvens55Jfbdca1nr-EOA,1013
  treenode/utils/base16.py,sha256=U1PMit2aZOpYusG_u1c7eVpXO-cFrFPyVyk9zdHrehg,817
  treenode/utils/base36.py,sha256=yICmyPE-yyPNO9T2oALOt-b6uYf37ahFfx0R4tXn3X0,847
  treenode/utils/db.py,sha256=36q4OckKmEd6uHTbMTxdKpV9nOIZ55DAantRWR9bxWg,4297
- treenode/utils/exporter.py,sha256=mV6Gch7XfW8f_1x3WqWgtV0qekMLdo-_n9gz6GJjXjw,7259
+ treenode/utils/exporter.py,sha256=LGC5VfJj7wMFp7BkaWjmfrImgCVRpJ8gjkDpn4IDTEs,7258
  treenode/utils/importer.py,sha256=Hvirbd6NyZ2MHa56_jOrUF3kYFeby1DbSLR3mhHy-9s,12891
  treenode/utils/radix.py,sha256=zHpOuDxsebiv9Gza6snNhAtBKiex6CDrAVRtB6esaWo,1642
- django_fast_treenode-2.1.3.dist-info/LICENSE,sha256=T0evsb7y-63fg18ovdNSx3wwWWRwyluQvN9J4zFSvfE,1093
- django_fast_treenode-2.1.3.dist-info/METADATA,sha256=glnnVS6RVwKFZkbQOLVkcow74o-MKFS_AsPTY_smBd4,8165
- django_fast_treenode-2.1.3.dist-info/WHEEL,sha256=52BFRY2Up02UkjOa29eZOS2VxUrpPORXg1pkohGGUS8,91
- django_fast_treenode-2.1.3.dist-info/top_level.txt,sha256=fmgxHbXyx1O2MPi_9kjx8aL9L-8TmV0gre4Go8XgqFk,9
- django_fast_treenode-2.1.3.dist-info/RECORD,,
+ django_fast_treenode-2.1.5.dist-info/METADATA,sha256=KpOVLmk1TDKx5_iXE1geP9_GlfmVIvJK-IhuSh7Lu8w,8103
+ django_fast_treenode-2.1.5.dist-info/WHEEL,sha256=DK49LOLCYiurdXXOXwGJm6U4DkHkg4lcxjhqwRa0CP4,91
+ django_fast_treenode-2.1.5.dist-info/top_level.txt,sha256=fmgxHbXyx1O2MPi_9kjx8aL9L-8TmV0gre4Go8XgqFk,9
+ django_fast_treenode-2.1.5.dist-info/RECORD,,
@@ -1,5 +1,5 @@
  Wheel-Version: 1.0
- Generator: setuptools (76.0.0)
+ Generator: setuptools (78.0.2)
  Root-Is-Purelib: true
  Tag: py3-none-any
 
treenode/__init__.py CHANGED
@@ -1,5 +0,0 @@
- """
- Django Fast TreeNode.
-
- 📖 Documentation: https://django-fast-treenode.readthedocs.io/
- """
treenode/admin/__init__.py CHANGED
@@ -1,9 +1,9 @@
  # -*- coding: utf-8 -*-
 
- from .admin import TreeNodeAdminModel
+ from .admin import TreeNodeModelAdmin
 
 
- __all__ = ["TreeNodeAdminModel"]
+ __all__ = ["TreeNodeModelAdmin"]
 
 
  # The end
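The rename above and in `treenode/admin/admin.py` below (`TreeNodeAdminModel` → `TreeNodeModelAdmin`) changes the public import path. A hedged sketch of admin registration against 2.1.5, reusing the illustrative `Category` model from the earlier sketch:

```python
# Registration sketch against the renamed admin class in 2.1.5.
# `Category` and `CategoryAdmin` are illustrative; only the class rename
# is taken from the diff above.
from django.contrib import admin
from treenode.admin import TreeNodeModelAdmin  # 2.1.3 exported TreeNodeAdminModel

from .models import Category


@admin.register(Category)
class CategoryAdmin(TreeNodeModelAdmin):
    list_display = ("name",)
```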
treenode/admin/admin.py CHANGED
@@ -29,7 +29,7 @@ import logging
  logger = logging.getLogger(__name__)
 
 
- class TreeNodeAdminModel(AdminMixin, admin.ModelAdmin):
+ class TreeNodeModelAdmin(AdminMixin, admin.ModelAdmin):
  """
  TreeNodeAdmin class.
 
treenode/cache.py CHANGED
@@ -12,218 +12,339 @@ Features:
  - Automatic cache eviction when memory limits are exceeded.
  - Decorator `@cached_method` for caching method results.
 
- Version: 2.1.0
+ Version: 2.2.0
  Author: Timur Kady
  Email: timurkady@yandex.com
  """
 
-
- from django.core.cache import caches
- from django.conf import settings
- import threading
  import hashlib
- import json
- import logging
- from pympler import asizeof
+ import msgpack
+ import sys
+ import threading
+ from collections import deque, defaultdict, OrderedDict
+ from django.conf import settings
+ from django.core.cache import caches
+ from functools import lru_cache
+ from functools import wraps
 
- from .utils.base36 import to_base36
 
- logger = logging.getLogger(__name__)
+ # ---------------------------------------------------
+ # Utilities
+ # ---------------------------------------------------
+
+ _DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
+ _CLEARINT_THESHOLD = 0.8
+ _EVICT_INTERVAL = 50
+
+
+ @lru_cache(maxsize=1000)
+ def to_base36(num):
+ """
+ Convert an integer to a base36 string.
+
+ For example: 10 -> 'A', 35 -> 'Z', 36 -> '10', etc.
+ """
+ if num == 0:
+ return '0'
+ sign = '-' if num < 0 else ''
+ num = abs(num)
+ result = []
+ while num:
+ num, rem = divmod(num, 36)
+ result.append(_DIGITS[rem])
+ return sign + ''.join(reversed(result))
 
 
  # ---------------------------------------------------
  # Caching
  # ---------------------------------------------------
 
- class TreeNodeCache:
- """Singleton-класс для управления кэшем TreeNode."""
+ class TreeCache:
+ """Singleton class for managing the TreeNode cache."""
 
  _instance = None
- _lock = threading.Lock()
- _keys = dict()
- _total_size = 0
- _cache_limit = 0
-
- def __new__(cls):
- """Create only one instance of the class (Singleton)."""
- with cls._lock:
+ _instance_lock = threading.Lock()
+
+ def __new__(cls, *args, **kwargs):
+ """Singleton new."""
+ with cls._instance_lock:
  if cls._instance is None:
- cls._instance = super(TreeNodeCache, cls).__new__(cls)
- cls._instance._initialize()
+ cls._instance = super(TreeCache, cls).__new__(cls)
  return cls._instance
 
- def _initialize(self):
- """Initialize cache."""
- self.cache_timeout = None
- limit = getattr(settings, 'TREENODE_CACHE_LIMIT', 100)*1024*1024
- self._cache_limit = limit
- self.cache_timeout = None
- cache_name = 'treenode' if 'treenode' in settings.CACHES else 'default'
- self.cache = caches[cache_name]
- self._total_size = 0
- self.cache.clear()
+ def __init__(self, cache_limit=100 * 1024 * 1024):
+ """
+ Initialize the cache.
+
+ If the 'treenode' key is present in settings.CACHES, the corresponding
+ backend is used.
+ Otherwise, the custom dictionary is used.
+ The cache size (in bytes) is taken from
+ settings.TREENODE_CACHE_LIMIT (MB), by default 100 MB.
+ """
+ if hasattr(self, '_initialized') and self._initialized:
+ return
+
+ # Get the cache limit (MB), then convert to bytes.
+ cache_limit_mb = getattr(settings, 'TREENODE_CACHE_LIMIT', 100)
+ self.cache_limit = cache_limit_mb * 1024 * 1024
+
+ # Select backend: if there is 'treenode' in settings.CACHES, use it.
+ # Otherwise, use our own dictionary.
+ if hasattr(settings, 'CACHES') and 'treenode' in settings.CACHES:
+ self.cache = caches['treenode']
+ else:
+ # We use our dictionary as a backend.
+ self.cache = OrderedDict()
+
+ self.order = deque() # Queue for FIFO implementation.
+ self.total_size = 0 # Current cache size in bytes.
+ self.lock = threading.Lock() # Lock for thread safety.
+
+ # Additional index for fast search of keys by prefix
+ # Format: {prefix: {key1, key2, ...}}
+ self.prefix_index = defaultdict(set)
+ # Dictionary to store the sizes of each key (key -> size in bytes)
+ self.sizes = {}
+ # Dictionary to store the prefix for each key to avoid repeated
+ # splitting
+ self.key_prefix = {}
+
+ # Counter for number of set operations for periodic eviction
+ self._set_counter = 0
+ # Evict cache every _evict_interval set operations when using external
+ # backend
+ self._evict_interval = _EVICT_INTERVAL
+
+ self._initialized = True
 
  def generate_cache_key(self, label, func_name, unique_id, *args, **kwargs):
  """
- Generate Cache Key.
-
- Generates a cache key of the form:
- <model_name>_<func_name>_<id>_<hash>,
- where <hash> is calculated from the function parameters
- (args and kwargs).
- If the parameters can be serialized via JSON, use this, otherwise we
- use repr to generate the string.
+ Generate a cache key.
+
+ <label>_<func_name>_<unique_id>_<hash>
  """
- try:
- # Sort dictionary keys to ensure determinism.
- params_repr = json.dumps(
- (args, kwargs),
- sort_keys=True,
- default=str
- )
- except (TypeError, ValueError) as e:
- # If JSON serialization fails, use repr.
- params_repr = repr((args, kwargs))
- logger.warning(f"Failed to serialize cache key params: {e}")
-
- # Calculate the MD5 hash from the received string.
- hash_value = hashlib.sha256(params_repr.encode("utf-8")).hexdigest()
-
- # Forming the final key.
- cache_key = f"{label}_{func_name}_{unique_id}_{hash_value}"
-
- return cache_key
+ # If using custom dict backend, use simple key generation without
+ # serialization.
+ if isinstance(self.cache, dict):
+ sorted_kwargs = sorted(kwargs.items())
+ return f"{label}_{func_name}_{unique_id}_{args}_{sorted_kwargs}"
+ else:
+ try:
+ # Using msgpack for fast binary representation of arguments
+ sorted_kwargs = sorted(kwargs.items())
+ params_bytes = msgpack.packb(
+ (args, sorted_kwargs), use_bin_type=True)
+ except Exception:
+ params_bytes = repr((args, kwargs)).encode('utf-8')
+ # Using MD5 for speed (no cryptographic strength)
+ hash_value = hashlib.md5(params_bytes).hexdigest()
+ return f"{label}_{func_name}_{unique_id}_{hash_value}"
 
  def get_obj_size(self, value):
- """Determine the size of the object in bytes."""
- try:
- return len(json.dumps(value).encode("utf-8"))
- except (TypeError, ValueError):
- return asizeof.asizeof(value)
-
- def cache_size(self):
- """Return the total size of the cache in bytes."""
- return self._total_size
-
- def set(self, cache_key, value):
- """Push to cache."""
- size = self.get_obj_size(value)
- self.cache.set(cache_key, value, timeout=self.cache_timeout)
-
- # Update cache size
- if cache_key in self._keys:
- self._total_size -= self._keys[cache_key]
- self._keys[cache_key] = size
- self._total_size += size
-
- # Check if the limit has been exceeded
- self._evict_cache()
-
- def get(self, cache_key):
- """Get from cache."""
- return self.cache.get(cache_key)
-
- def invalidate(self, label):
- """Clear cache for a specific model only."""
- prefix = f"{label}_"
- keys_to_remove = [key for key in self._keys if key.startswith(prefix)]
- for key in keys_to_remove:
- self.cache.delete(key)
- self._total_size -= self._keys.pop(key, 0)
- if self._total_size < 0:
- self._total_size = 0
+ """
+ Determine the size of the object in bytes.
 
- def clear(self):
- """Full cache clearing."""
- self.cache.clear()
- self._keys.clear()
- self._total_size = 0
+ If the value is already in bytes or bytearray, simply returns its
+ length. Otherwise, uses sys.getsizeof for an approximate estimate.
+ """
+ if isinstance(value, (bytes, bytearray)):
+ return len(value)
+ return sys.getsizeof(value)
 
- def _evict_cache(self):
- """Delete old entries if the cache has exceeded the limit."""
- if self._total_size <= self._cache_limit:
- # If the size is within the limit, do nothing
- return
+ def set(self, key, value):
+ """
+ Store the value in the cache.
+
+ Stores the value in the cache, updates the FIFO queue, prefix index,
+ size dictionary, and total cache size.
+ """
+ # Idea 1: Store raw object if using custom dict backend, otherwise
+ # serialize using msgpack.
+ if isinstance(self.cache, dict):
+ stored_value = value
+ else:
+ try:
+ stored_value = msgpack.packb(value, use_bin_type=True)
+ except Exception:
+ stored_value = value
+
+ # Calculate the size of the stored value
+ if isinstance(stored_value, (bytes, bytearray)):
+ size = len(stored_value)
+ else:
+ size = sys.getsizeof(stored_value)
 
- if not self._keys:
- self.clear()
+ # Store the value in the cache backend
+ if isinstance(self.cache, dict):
+ self.cache[key] = stored_value
+ else:
+ self.cache.set(key, stored_value)
+
+ # Update internal structures under lock
+ with self.lock:
+ if key in self.sizes:
+ # If the key already exists, adjust the total size
+ old_size = self.sizes[key]
+ self.total_size -= old_size
+ else:
+ # New key: add to FIFO queue
+ self.order.append(key)
+ # Compute prefix once and store it in key_prefix
+ if "_" in key:
+ prefix = key.split('_', 1)[0] + "_"
+ else:
+ prefix = key
+ self.key_prefix[key] = prefix
+ self.prefix_index[prefix].add(key)
+ # Save the size for this key and update total_size
+ self.sizes[key] = size
+ self.total_size += size
+
+ # Increment the set counter for periodic eviction
+ self._set_counter += 1
+
+ # Idea 3: If using external backend, evict cache every _evict_interval
+ # sets. Otherwise, always evict immediately.
+ if self._set_counter >= self._evict_interval:
+ with self.lock:
+ self._set_counter = 0
+ self._evict_cache()
+
+ def get(self, key):
+ """
+ Get a value from the cache by key.
 
- logger.warning(f"Cache limit exceeded! Current size: \
- {self._total_size}, Limit: {self._cache_limit}")
+ Quickly retrieves a value from the cache by key.
+ Here we simply request a value from the backend (either a dictionary or
+ Django cache-backend) and return it without any additional operations.
+ """
+ if isinstance(self.cache, dict):
+ return self.cache.get(key)
+ else:
+ packed_value = self.cache.get(key)
+ if packed_value is None:
+ return None
+ try:
+ return msgpack.unpackb(packed_value, raw=False)
+ except Exception:
+ # If unpacking fails, return what we got
+ return packed_value
+
+ def invalidate(self, prefix):
+ """
+ Invalidate model cache.
+
+ Quickly removes all items from the cache whose keys start with prefix.
+ Uses prefix_index for instant access to keys.
+ When removing, each key's size is retrieved from self.sizes,
+ and total_size is reduced by the corresponding amount.
+ """
+ prefix += '_'
+ with self.lock:
+ keys_to_remove = self.prefix_index.get(prefix, set())
+ if not keys_to_remove:
+ return
+
+ # Remove keys from main cache and update total_size via sizes
+ # dictionary
+ if isinstance(self.cache, dict):
+ for key in keys_to_remove:
+ self.cache.pop(key, None)
+ size = self.sizes.pop(key, 0)
+ self.total_size -= size
+ # Remove key from key_prefix as well
+ self.key_prefix.pop(key, None)
+ else:
+ # If using Django backend
+ self.cache.delete_many(list(keys_to_remove))
+ for key in keys_to_remove:
+ size = self.sizes.pop(key, 0)
+ self.total_size -= size
+ self.key_prefix.pop(key, None)
+
+ # Remove prefix from index and update FIFO queue
+ del self.prefix_index[prefix]
+ self.order = deque(k for k in self.order if k not in keys_to_remove)
 
- # Sort keys by insertion order (FIFO)
- keys_sorted = list(self._keys.keys())
+ def clear(self):
+ """Clear cache completely."""
+ with self.lock:
+ if isinstance(self.cache, dict):
+ self.cache.clear()
+ else:
+ self.cache.clear()
+ self.order.clear()
+ self.prefix_index.clear()
+ self.sizes.clear()
+ self.key_prefix.clear()
+ self.total_size = 0
 
- keys_to_delete = []
- freed_size = 0
+ def _evict_cache(self):
+ """
+ Perform FIFO cache evacuation.
 
- # Delete old keys until we reach the limit
- for key in keys_sorted:
- freed_size += self._keys[key]
- keys_to_delete.append(key)
- if self._total_size - freed_size <= self._cache_limit:
- break
+ Removes old items until the total cache size is less than
+ _CLEARINT_THESHOLD of the limit.
+ """
+ with self.lock:
+ # Evict until total_size is below 80% of cache_limit
+ target_size = _CLEARINT_THESHOLD * self.cache_limit
+ while self.total_size > target_size and self.order:
+ # Extract the oldest key from the queue (FIFO)
+ key = self.order.popleft()
 
- # Delete keys in batches (delete_many)
- self.cache.delete_many(keys_to_delete)
+ # Delete entry from backend cache
+ if isinstance(self.cache, dict):
+ self.cache.pop(key, None)
+ else:
+ self.cache.delete(key)
 
- # Update data in `_keys` and `_total_size`
- for key in keys_to_delete:
- self._total_size -= self._keys.pop(key, 0)
+ # Extract the size of the entry to be deleted and reduce
+ # the overall cache size
+ size = self.sizes.pop(key, 0)
+ self.total_size -= size
 
- logger.info(f"Evicted {len(keys_to_delete)} keys from cache, \
- freed {freed_size} bytes.")
+ # Retrieve prefix from key_prefix without splitting
+ prefix = self.key_prefix.pop(key, None)
+ if prefix is not None:
+ self.prefix_index[prefix].discard(key)
+ if not self.prefix_index[prefix]:
+ del self.prefix_index[prefix]
 
 
- # Create a global cache object (there is only one for the entire system)
- treenode_cache = TreeNodeCache()
+ # Global cache object (unique for the system)
+ treenode_cache = TreeCache()
 
 
  # ---------------------------------------------------
  # Decorator
  # ---------------------------------------------------
 
-
  def cached_method(func):
- """
- Decorate instance methods for caching.
-
- The decorator caches the results of the decorated class or instance method.
- If the cache is cleared or invalidated, the cached results will be
- recalculated.
-
- Usage:
- @cached_tree_method
- def model_method(self):
- # Tree method logic
- """
-
+ """Decorate instance or class methods."""
+ @wraps(func)
  def wrapper(self, *args, **kwargs):
- # Generate a cache key.
+ cache = treenode_cache
+
  if isinstance(self, type):
- # Если self — класс, используем его имя
  unique_id = to_base36(id(self))
  label = getattr(self._meta, 'label', self.__name__)
  else:
- unique_id = getattr(self, "pk", id(self))
+ unique_id = getattr(self, "pk", None) or to_base36(id(self))
  label = self._meta.label
 
- cache_key = treenode_cache.generate_cache_key(
+ cache_key = cache.generate_cache_key(
  label,
  func.__name__,
  unique_id,
  *args,
  **kwargs
  )
-
- # Retrieving from cache
- value = treenode_cache.get(cache_key)
-
+ value = cache.get(cache_key)
  if value is None:
  value = func(self, *args, **kwargs)
-
- # Push to cache
- treenode_cache.set(cache_key, value)
+ cache.set(cache_key, value)
  return value
  return wrapper
 
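As a standalone restatement of the key scheme introduced above (not part of the package API), the external-backend branch effectively computes the following; `demo_cache_key` is an illustrative name:

```python
# Standalone restatement of the external-backend key scheme above:
# <label>_<func_name>_<unique_id>_<md5 of msgpack-packed (args, sorted kwargs)>.
# `demo_cache_key` is an illustrative name, not a package API.
import hashlib
import msgpack


def demo_cache_key(label, func_name, unique_id, *args, **kwargs):
    params = msgpack.packb((args, sorted(kwargs.items())), use_bin_type=True)
    return f"{label}_{func_name}_{unique_id}_{hashlib.md5(params).hexdigest()}"


print(demo_cache_key("app.Category", "get_ancestors_pks", 42,
                     include_self=True, depth=None))
```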
treenode/managers/adjacency.py CHANGED
@@ -14,6 +14,7 @@ Email: timurkady@yandex.com
  from collections import deque, defaultdict
  from django.db import models, transaction
  from django.db import connection
+ from django.db.models import F
 
 
  class TreeNodeQuerySet(models.QuerySet):
@@ -137,10 +138,11 @@ class TreeNodeModelManager(models.Manager):
 
  def get_queryset(self):
  """Return a sorted QuerySet."""
- queryset = TreeNodeQuerySet(self.model, using=self._db)\
- .annotate(_depth_db=models.Max("parents_set__depth"))\
- .order_by("_depth_db", "tn_parent", "tn_priority")
- return queryset
+ return TreeNodeQuerySet(self.model, using=self._db)\
+ .order_by(
+ # F('tn_parent').asc(nulls_first=True),
+ 'tn_parent', 'tn_priority'
+ )
 
  # Service methods -------------------
 
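The commented-out line in the new `get_queryset()` hints at explicit NULL placement for root nodes. On backends that support it, that ordering would look roughly like the sketch below; `Category` is the illustrative model from the earlier sketches, not package code.

```python
# Hedged sketch of the NULL-first ordering hinted at by the commented-out line
# above; `Category` is the illustrative model from the earlier sketches.
from django.db.models import F

roots_first = Category.objects.order_by(
    F("tn_parent").asc(nulls_first=True),  # roots (no parent) come first
    "tn_priority",
)
```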
treenode/models/adjacency.py CHANGED
@@ -74,8 +74,6 @@ class TreeNodeModel(
 
  abstract = True
  indexes = [
- models.Index(fields=["tn_parent"]),
- models.Index(fields=["tn_parent", "id"]),
  models.Index(fields=["tn_parent", "tn_priority"]),
  ]
 
@@ -151,43 +149,45 @@
  using=self._state.db,
  update_fields=kwargs.get("update_fields", None)
  )
-
- # If the object already exists, get the old parent and priority values
- is_new = self.pk is None
- if not is_new:
- old_parent, old_priority = model.objects\
- .filter(pk=self.pk)\
- .values_list('tn_parent', 'tn_priority')\
- .first()
- is_move = (old_priority != self.tn_priority)
- else:
- force_insert = True
- is_move = False
- old_parent = None
-
- # Check if we are trying to move a node to a child
- if old_parent and old_parent != self.tn_parent and self.tn_parent:
- # Get pk of children via values_list to avoid creating full
- # set of objects
- if self.tn_parent.pk in self.get_descendants_pks():
- raise ValueError("You cannot move a node into its own child.")
-
- # Save the object and synchronize with the closing table
- # Disable signals
- with (disable_signals(pre_save, model),
- disable_signals(post_save, model)):
-
- if is_new or is_move:
- self._update_priority()
- super().save(force_insert=force_insert, *args, **kwargs)
- # Run synchronize
- if is_new:
- self.closure_model.insert_node(self)
- elif is_move:
- subtree_nodes = self.get_descendants(include_self=True)
- self.closure_model.move_node(subtree_nodes)
- # Update priorities among neighbors or clear cache if there was
- # no movement
+ with transaction.atomic():
+ # If the object already exists, get the old parent and priority
+ # values
+ is_new = self.pk is None
+ if not is_new:
+ old_parent, old_priority = model.objects\
+ .filter(pk=self.pk)\
+ .values_list('tn_parent', 'tn_priority')\
+ .first()
+ is_move = (old_priority != self.tn_priority)
+ else:
+ force_insert = True
+ is_move = False
+ old_parent = None
+
+ descendants = self.get_descendants(include_self=True)
+
+ # Check if we are trying to move a node to a child
+ if old_parent and old_parent != self.tn_parent and self.tn_parent:
+ # Get pk of children via values_list to avoid creating full
+ # set of objects
+ if self.tn_parent in descendants:
+ raise ValueError(
+ "You cannot move a node into its own child."
+ )
+
+ # Save the object and synchronize with the closing table
+ # Disable signals
+ with (disable_signals(pre_save, model),
+ disable_signals(post_save, model)):
+
+ if is_new or is_move:
+ self._update_priority()
+ super().save(force_insert=force_insert, *args, **kwargs)
+ # Run synchronize
+ if is_new:
+ self.closure_model.insert_node(self)
+ elif is_move:
+ self.closure_model.move_node(descendants)
 
  # Clear model cache
@@ -203,7 +203,7 @@
 
  def _update_priority(self):
  """Update tn_priority field for siblings."""
- siblings = self.get_siblings()
+ siblings = self.get_siblings(include_self=False)
  siblings = sorted(siblings, key=lambda x: x.tn_priority)
  insert_pos = min(self.tn_priority, len(siblings))
  siblings.insert(insert_pos, self)
@@ -214,7 +214,6 @@
  # Save changes
  model = self._meta.model
  model.objects.bulk_update(siblings, ['tn_priority'])
- model.clear_cache()
 
  @classmethod
  def _get_place(cls, target, position=0):
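A behavior sketch of the cycle guard retained in the rewritten `save()`: moving a node under one of its own descendants raises `ValueError`. The model and node names below are illustrative; only the exception and its message come from the code above.

```python
# Illustrative reproduction of the cycle guard in save(); `Category` and the
# node names are assumptions, the ValueError and its message are from the diff.
root = Category.objects.create(name="root")
child = Category.objects.create(name="child", tn_parent=root)

root.tn_parent = child  # this would make the tree cyclic
try:
    root.save()
except ValueError as exc:
    print(exc)  # You cannot move a node into its own child.
```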
treenode/models/closure.py CHANGED
@@ -62,7 +62,8 @@ class ClosureModel(models.Model):
  unique_together = (("parent", "child"),)
  indexes = [
  models.Index(fields=["parent", "child"]),
- models.Index(fields=["child", "parent"]),
+ models.Index(fields=["parent", "depth"]),
+ models.Index(fields=["child", "depth"]),
  models.Index(fields=["parent", "child", "depth"]),
  ]
 
@@ -72,28 +73,6 @@ class ClosureModel(models.Model):
 
  # ----------- Methods of working with tree structure ----------- #
 
- @classmethod
- def get_ancestors_pks(cls, node, include_self=True, depth=None):
- """Get the ancestors pks list."""
- options = dict(child_id=node.pk, depth__gte=0 if include_self else 1)
- if depth:
- options["depth__lte"] = depth
- queryset = cls.objects.filter(**options)\
- .order_by('depth')\
- .values_list('parent_id', flat=True)
- return list(queryset.values_list("parent_id", flat=True))
-
- @classmethod
- def get_descendants_pks(cls, node, include_self=False, depth=None):
- """Get a list containing all descendants."""
- options = dict(parent_id=node.pk, depth__gte=0 if include_self else 1)
- if depth:
- options.update({'depth__lte': depth})
- queryset = cls.objects.filter(**options)\
- .order_by('depth')\
- .values_list('child_id', flat=True)
- return queryset
-
  @classmethod
  def get_root(cls, node):
  """Get the root node pk for the current node."""
treenode/models/mixins/ancestors.py CHANGED
@@ -2,12 +2,13 @@
  """
  TreeNode Ancestors Mixin
 
- Version: 2.1.0
+ Version: 2.1.4
  Author: Timur Kady
  Email: timurkady@yandex.com
  """
 
  from django.db import models
+ from django.db.models import OuterRef, Subquery, IntegerField, Case, When, Value
  from ...cache import treenode_cache, cached_method
 
 
@@ -22,41 +23,23 @@ class TreeNodeAncestorsMixin(models.Model):
  @cached_method
  def get_ancestors_queryset(self, include_self=True, depth=None):
  """Get the ancestors queryset (ordered from root to parent)."""
- qs = self._meta.model.objects.filter(tn_closure__child=self.pk)
+ options = dict(child_id=self.pk, depth__gte=0 if include_self else 1)
+ if depth:
+ options.update({'depth__lte': depth})
 
- if depth is not None:
- qs = qs.filter(tn_closure__depth__lte=depth)
-
- if include_self:
- qs = qs | self._meta.model.objects.filter(pk=self.pk)
-
- return qs.distinct().order_by("tn_closure__depth")
+ return self.closure_model.objects\
+ .filter(**options)\
+ .order_by('-depth')
 
  @cached_method
  def get_ancestors_pks(self, include_self=True, depth=None):
  """Get the ancestors pks list."""
- cache_key = treenode_cache.generate_cache_key(
- label=self._meta.label,
- func_name=getattr(self, "get_ancestors_queryset").__name__,
- unique_id=self.pk,
- arg={
- "include_self": include_self,
- "depth": depth
- }
- )
- queryset = treenode_cache.get(cache_key)
- if queryset is not None:
- return list(queryset.values_list("id", flat=True))
- elif hasattr(self, "closure_model"):
- return self.closure_model.get_ancestors_pks(
- self, include_self, depth
- )
- return []
+ return self.get_ancestors_queryset(include_self, depth)\
+ .values_list('id', flat=True)
 
  def get_ancestors(self, include_self=True, depth=None):
  """Get a list with all ancestors (ordered from root to self/parent)."""
- queryset = self.get_ancestors_queryset(include_self, depth)
- return list(queryset)
+ return list(self.get_ancestors_queryset(include_self, depth))
 
  def get_ancestors_count(self, include_self=True, depth=None):
  """Get the ancestors count."""
treenode/models/mixins/children.py CHANGED
@@ -60,7 +60,8 @@ class TreeNodeChildrenMixin(models.Model):
  @cached_method
  def get_children_queryset(self):
  """Get the children queryset with prefetch."""
- return self.tn_children.prefetch_related('tn_children')
+ # return self.tn_children.prefetch_related('tn_children')
+ return self._meta.model.objects.filter(tn_parent__pk=self.id)
 
  def get_children(self):
  """Get a list containing all children."""
treenode/models/mixins/descendants.py CHANGED
@@ -8,6 +8,8 @@ Email: timurkady@yandex.com
  """
 
  from django.db import models
+ from django.db.models import OuterRef, Subquery, Min
+
  from treenode.cache import treenode_cache, cached_method
 
 
@@ -22,37 +24,29 @@ class TreeNodeDescendantsMixin(models.Model):
  @cached_method
  def get_descendants_queryset(self, include_self=False, depth=None):
  """Get the descendants queryset."""
- queryset = self._meta.model.objects\
- .annotate(min_depth=models.Min("parents_set__depth"))\
- .filter(parents_set__parent=self.pk)
+ Closure = self.closure_model
+ desc_qs = Closure.objects.filter(child=OuterRef('pk'), parent=self.pk)
+ desc_qs = desc_qs.values('child').annotate(
+ mdepth=Min('depth')).values('mdepth')[:1]
+
+ queryset = self._meta.model.objects.annotate(
+ min_depth=Subquery(desc_qs)
+ ).filter(min_depth__isnull=False)
 
  if depth is not None:
  queryset = queryset.filter(min_depth__lte=depth)
- if include_self and not queryset.filter(pk=self.pk).exists():
+
+ # add self if needed
+ if include_self:
  queryset = queryset | self._meta.model.objects.filter(pk=self.pk)
 
- return queryset.order_by("min_depth", "tn_priority")
+ return queryset.order_by('min_depth', 'tn_priority')
 
  @cached_method
  def get_descendants_pks(self, include_self=False, depth=None):
  """Get the descendants pks list."""
- cache_key = treenode_cache.generate_cache_key(
- label=self._meta.label,
- func_name=getattr(self, "get_descendants_queryset").__name__,
- unique_id=self.pk,
- arg={
- "include_self": include_self,
- "depth": depth
- }
- )
- queryset = treenode_cache.get(cache_key)
- if queryset is not None:
- return list(queryset.values_list("id", flat=True))
- elif hasattr(self, "closure_model"):
- return self.closure_model.get_descendants_pks(
- self, include_self, depth
- )
- return []
+ return self.get_descendants_queryset(include_self, depth)\
+ .values_list("id", flat=True)
 
  def get_descendants(self, include_self=False, depth=None):
  """Get a list containing all descendants."""
treenode/models/mixins/node.py CHANGED
@@ -2,7 +2,7 @@
  """
  TreeNode Node Mixin
 
- Version: 2.1.0
+ Version: 2.1.3
  Author: Timur Kady
  Email: timurkady@yandex.com
  """
@@ -25,30 +25,14 @@ class TreeNodeNodeMixin(models.Model):
  @cached_method
  def get_breadcrumbs(self, attr='id'):
  """Optimized breadcrumbs retrieval with direct cache check."""
+
  try:
  self._meta.get_field(attr)
  except FieldDoesNotExist:
  raise ValueError(f"Invalid attribute name: {attr}")
 
- # Easy logics for roots
- if self.tn_parent is None:
- return [getattr(self, attr)]
-
- # Generate parents cache key
- cache_key = treenode_cache.generate_cache_key(
- self._meta.label,
- self.get_breadcrumbs.__name__,
- self.tn_parent.pk,
- attr
- )
-
- # Try get value from cache
- breadcrumbs = treenode_cache.get(cache_key)
- if breadcrumbs is not None:
- return breadcrumbs + [getattr(self, attr)]
-
- queryset = self.get_ancestors_queryset(include_self=True).only(attr)
- return [getattr(item, attr) for item in queryset]
+ ancestors = self.get_ancestors(include_self=True)
+ return [getattr(node, attr) for node in ancestors]
 
  @cached_method
  def get_depth(self):
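The rewritten `get_breadcrumbs()` keeps its attribute validation; a small sketch of that contract, with the illustrative `Category` model from earlier:

```python
# get_breadcrumbs() validates `attr` against the model's fields and raises
# ValueError otherwise; `Category` is the illustrative model from earlier.
node = Category.objects.get(name="child")
try:
    node.get_breadcrumbs(attr="no_such_field")
except ValueError as exc:
    print(exc)  # Invalid attribute name: no_such_field
```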
treenode/models/mixins/siblings.py CHANGED
@@ -45,25 +45,25 @@ class TreeNodeSiblingsMixin(models.Model):
  return instance
 
  @cached_method
- def get_siblings_queryset(self):
+ def get_siblings_queryset(self, include_self=True):
  """Get the siblings queryset with prefetch."""
  if self.tn_parent:
- qs = self.tn_parent.tn_children.prefetch_related('tn_children')
+ qs = self._meta.model.objects.filter(tn_parent=self.tn_parent)
  else:
  qs = self._meta.model.objects.filter(tn_parent__isnull=True)
- return qs.exclude(pk=self.pk)
+ return qs if include_self else qs.exclude(pk=self.pk)
 
- def get_siblings(self):
+ def get_siblings(self, include_self=True):
  """Get a list with all the siblings."""
  return list(self.get_siblings_queryset())
 
- def get_siblings_count(self):
+ def get_siblings_count(self, include_self=True):
  """Get the siblings count."""
- return self.get_siblings_queryset().count()
+ return self.get_siblings_queryset(include_self).count()
 
- def get_siblings_pks(self):
+ def get_siblings_pks(self, include_self=True):
  """Get the siblings pks list."""
- return [item.pk for item in self.get_siblings_queryset()]
+ return [item.pk for item in self.get_siblings_queryset(include_self)]
 
  def get_first_sibling(self):
  """
treenode/utils/exporter.py CHANGED
@@ -12,7 +12,7 @@ Features:
  - Provides optimized data extraction for QuerySets.
  - Generates downloadable files with appropriate HTTP responses.
 
- Version: 2.0.11
+ Version: 3.0.0
  Author: Timur Kady
  Email: timurkady@yandex.com
  """
treenode/version.py CHANGED
@@ -4,10 +4,10 @@ TreeNode Version Module
 
  This module defines the current version of the TreeNode package.
 
- Version: 2.1.2
+ Version: 2.1.5
  Author: Timur Kady
  Email: timurkady@yandex.com
  """
 
 
- __version__ = '2.1.3'
+ __version__ = '2.1.5'
+ __version__ = '2.1.5'