json-duplicate-keys 2024.3.24__tar.gz → 2024.7.17__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {json-duplicate-keys-2024.3.24/json_duplicate_keys.egg-info → json-duplicate-keys-2024.7.17}/PKG-INFO +29 -1
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/README.md +28 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/json_duplicate_keys/__init__.py +279 -204
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17/json_duplicate_keys.egg-info}/PKG-INFO +29 -1
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/setup.py +1 -1
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/LICENSE +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/MANIFEST.in +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/json_duplicate_keys.egg-info/SOURCES.txt +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/json_duplicate_keys.egg-info/dependency_links.txt +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/json_duplicate_keys.egg-info/top_level.txt +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/requirements.txt +0 -0
- {json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/setup.cfg +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: json-duplicate-keys
-Version: 2024.3.24
+Version: 2024.7.17
 Summary: Flatten/ Unflatten and Load(s)/ Dump(s) JSON File/ Object with Duplicate Keys
 Home-page: https://github.com/truocphan/json-duplicate-keys
 Author: TP Cyber Security
@@ -265,6 +265,27 @@ Description: # JSON Duplicate Keys - PyPI
 ```
 ---
 
+### JSON_DUPLICATE_KEYS.filter_values(`value`, `separator`="||", `parse_index`="$", `ordered_dict`=False)
+
+- `value`:
+- `separator`:
+- `parse_index`:
+- `ordered_dict`:
+```python
+import json_duplicate_keys as jdks
+
+Jstr = '{"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}'
+
+JDKSObject = jdks.loads(Jstr)
+
+print(JDKSObject.filter_values("latest").dumps())
+# OUTPUT: {"version": "latest", "release||$0$||version": "latest", "snapshot||release||$0$||version": "latest"}
+
+print(JDKSObject.dumps())
+# OUTPUT: {"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}
+```
+---
+
 ### JSON_DUPLICATE_KEYS.dumps(`dupSign_start`="{{{", `dupSign_end`="}}}", `_isDebug_`=False, `skipkeys`=False, `ensure_ascii`=True, `check_circular`=True, `allow_nan`=True, `cls`=None, `indent`=None, `separators`=None, `default`=None, `sort_keys`=False)
 _Serialize a JSON object to a JSON format string_
 - `dupSign_start`:
@@ -367,6 +388,13 @@ Description: # JSON Duplicate Keys - PyPI
 ---
 
 ## CHANGELOG
+#### [json-duplicate-keys v2024.7.17](https://github.com/truocphan/json-duplicate-keys/tree/2024.7.17)
+- **Fixed**: issue [#3](https://github.com/truocphan/json-duplicate-keys/issues/3) break the set function when the key's value is empty. Thanks [ptth222](https://github.com/ptth222) for reporting this issue.
+
+#### [json-duplicate-keys v2024.4.20](https://github.com/truocphan/json-duplicate-keys/tree/2024.4.20)
+- **New**: _filter_values_
+- **Updated**: _filter_keys_
+
 #### [json-duplicate-keys v2024.3.24](https://github.com/truocphan/json-duplicate-keys/tree/2024.3.24)
 - **Updated**: _normalize_key_, _loads_, _get_, _set_, _update_, _delete_
 ---
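The v2024.7.17 entry above references issue #3 (set() breaking when a key's value is empty). Below is a minimal, hedged sketch of exercising that path; the key name and value are hypothetical example data, and the expected outputs are assumptions rather than captured program output.

```python
# Hedged sketch: exercise set() with an empty value (the case referenced by issue #3).
# "note" and "" are hypothetical example data; the expectations are not captured output.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"author": "truocphan"}')

print(JDKSObject.set("note", ""))   # expected with 2024.7.17: True
print(JDKSObject.dumps())           # expected: {"author": "truocphan", "note": ""}
```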
@@ -257,6 +257,27 @@ print(JDKSObject.dumps())
 ```
 ---
 
+### JSON_DUPLICATE_KEYS.filter_values(`value`, `separator`="||", `parse_index`="$", `ordered_dict`=False)
+
+- `value`:
+- `separator`:
+- `parse_index`:
+- `ordered_dict`:
+```python
+import json_duplicate_keys as jdks
+
+Jstr = '{"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}'
+
+JDKSObject = jdks.loads(Jstr)
+
+print(JDKSObject.filter_values("latest").dumps())
+# OUTPUT: {"version": "latest", "release||$0$||version": "latest", "snapshot||release||$0$||version": "latest"}
+
+print(JDKSObject.dumps())
+# OUTPUT: {"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}
+```
+---
+
 ### JSON_DUPLICATE_KEYS.dumps(`dupSign_start`="{{{", `dupSign_end`="}}}", `_isDebug_`=False, `skipkeys`=False, `ensure_ascii`=True, `check_circular`=True, `allow_nan`=True, `cls`=None, `indent`=None, `separators`=None, `default`=None, `sort_keys`=False)
 _Serialize a JSON object to a JSON format string_
 - `dupSign_start`:
@@ -359,6 +380,13 @@ print(JDKSObject.getObject())
 ---
 
 ## CHANGELOG
+#### [json-duplicate-keys v2024.7.17](https://github.com/truocphan/json-duplicate-keys/tree/2024.7.17)
+- **Fixed**: issue [#3](https://github.com/truocphan/json-duplicate-keys/issues/3) break the set function when the key's value is empty. Thanks [ptth222](https://github.com/ptth222) for reporting this issue.
+
+#### [json-duplicate-keys v2024.4.20](https://github.com/truocphan/json-duplicate-keys/tree/2024.4.20)
+- **New**: _filter_values_
+- **Updated**: _filter_keys_
+
 #### [json-duplicate-keys v2024.3.24](https://github.com/truocphan/json-duplicate-keys/tree/2024.3.24)
 - **Updated**: _normalize_key_, _loads_, _get_, _set_, _update_, _delete_
 ---
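The filter_values example above returns flattened key paths such as release||$0$||version, where "||" separates nested keys and "$0$" is a list index. A small, hedged sketch of reading values back with that notation via get() (behaviour inferred from the get() implementation later in this diff; outputs are expectations, not captured output):

```python
# Hedged sketch of the "||" / "$i$" key-path notation used by the flattened keys above.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"release": [{"version": "latest"}], "snapshot": {"version": "22.3.3"}}')

print(JDKSObject.get("release||$0$||version"))   # expected: latest
print(JDKSObject.get("snapshot||version"))       # expected: 22.3.3
```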
{json-duplicate-keys-2024.3.24 → json-duplicate-keys-2024.7.17}/json_duplicate_keys/__init__.py
RENAMED
@@ -6,18 +6,21 @@ def normalize_key(name, dupSign_start="{{{", dupSign_end="}}}", _isDebug_=False)
 
     # User input data type validation
     if type(_isDebug_) != bool: _isDebug_ = False
+
     try:
-        if type(name) not in [str, unicode]:
-
+        if type(name) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
         if type(dupSign_start) not in [str, unicode]: dupSign_start = "{{{"
+
         if type(dupSign_end) not in [str, unicode]: dupSign_end = "}}}"
     except Exception as e:
-        if type(name) not in [str]:
-
+        if type(name) not in [str]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
         if type(dupSign_start) not in [str]: dupSign_start = "{{{"
+
         if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
-    return re.sub('{dupSign_start}_
+    return re.sub('{dupSign_start}_\\d+_{dupSign_end}$'.format(dupSign_start=re.escape(dupSign_start), dupSign_end=re.escape(dupSign_end)), "", name)
 # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
 # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
 # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
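The rewritten return statement above strips a trailing duplicate-key marker built from dupSign_start and dupSign_end. A minimal sketch of the expected effect (outputs are expectations, not captured output):

```python
# Hedged sketch: normalize_key() should strip a trailing "{{{_N_}}}" marker, per the regex above.
import json_duplicate_keys as jdks

print(jdks.normalize_key("version{{{_2_}}}"))   # expected: version
print(jdks.normalize_key("version"))            # expected: version (no marker to strip)
```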
@@ -32,22 +35,26 @@ def loads(Jstr, dupSign_start="{{{", dupSign_end="}}}", ordered_dict=False, _isD
 
     # User input data type validation
     if type(_isDebug_) != bool: _isDebug_ = False
+
     if type(ordered_dict) != bool: ordered_dict = False
+
     try:
-        if type(Jstr) not in [str, unicode]:
-
+        if type(Jstr) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the JSON object must be str or unicode, not {}\x1b[0m".format(type(Jstr)))
+
         if type(dupSign_start) not in [str, unicode]: dupSign_start = "{{{"
+
         if type(dupSign_end) not in [str, unicode]: dupSign_end = "}}}"
     except Exception as e:
-        if type(Jstr) not in [str]:
-
+        if type(Jstr) not in [str]: exit("\x1b[31m[-] DataTypeError: the JSON object must be str or unicode, not {}\x1b[0m".format(type(Jstr)))
+
         if type(dupSign_start) not in [str]: dupSign_start = "{{{"
+
         if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
     def __convert_Jloads_to_Jobj(Jloads, Jobj):
         if type(Jloads) in [dict, OrderedDict]:
             for k in Jloads.keys():
-                _key = re.split(dupSign_start_escape_regex+"_
+                _key = re.split(dupSign_start_escape_regex+"_\\d+_"+dupSign_end_escape_regex+"$", k)[0]
 
                 if _key not in Jobj.keys():
                     if type(Jloads[k]) not in [list, dict, OrderedDict]:
@@ -62,7 +69,7 @@ def loads(Jstr, dupSign_start="{{{", dupSign_end="}}}", ordered_dict=False, _isD
 
                         __convert_Jloads_to_Jobj(Jloads[k], Jobj[_key])
                 else:
-                    countObj = len([i for i in Jobj.keys() if _key==re.split(dupSign_start_escape_regex+"_
+                    countObj = len([i for i in Jobj.keys() if _key==re.split(dupSign_start_escape_regex+"_\\d+_"+dupSign_end_escape_regex+"$", i)[0]])
                     if type(Jloads[k]) not in [list, dict, OrderedDict]:
                         Jobj[_key+dupSign_start+"_"+str(countObj+1)+"_"+dupSign_end] = Jloads[k]
                     else:
@@ -90,8 +97,7 @@ def loads(Jstr, dupSign_start="{{{", dupSign_end="}}}", ordered_dict=False, _isD
 
     try:
         Jloads = json.loads(Jstr)
-        if ordered_dict:
-            Jloads = json.loads(Jstr, object_pairs_hook=OrderedDict)
+        if ordered_dict: Jloads = json.loads(Jstr, object_pairs_hook=OrderedDict)
 
         if type(Jloads) in [list, dict, OrderedDict]:
             dupSign_start_escape = "".join(["\\\\u"+hex(ord(c))[2:].zfill(4) for c in dupSign_start])
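__convert_Jloads_to_Jobj above stores a repeated key under a suffixed name (key + dupSign_start + "_N_" + dupSign_end), and dumps() later strips that suffix again. A hedged sketch of the round trip; the internal representation shown in the comments is an expectation, not captured output:

```python
# Hedged sketch of how loads() keeps duplicate keys internally and dumps() restores them.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"version": "22.3.3", "version": "latest"}')

print(JDKSObject.getObject())   # expected: {'version': '22.3.3', 'version{{{_2_}}}': 'latest'}
print(JDKSObject.dumps())       # expected: {"version": "22.3.3", "version": "latest"}
```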
@@ -189,40 +195,43 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         try:
-            if type(name) not in [str, unicode]:
-
+            if type(name) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
         except Exception as e:
-            if type(name) not in [str]:
-
+            if type(name) not in [str]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
 
-        if type(self.getObject()) in [list, dict, OrderedDict]:
-            try:
-                Jobj = self.__Jobj
-                Jval = "JSON_DUPLICATE_KEYS_ERROR"
-                name_split = name.split(separator)
-
-                for i in range(len(name_split)):
-                    if type(Jobj) in [dict, OrderedDict] and name_split[i] in Jobj.keys():
-                        Jval = Jobj[name_split[i]]
-                        Jobj = Jobj[name_split[i]]
-                    elif type(Jobj) in [list] and re.search("^"+re.escape(parse_index)+"\d+"+re.escape(parse_index)+"$", name_split[i]):
-                        Jval = Jobj[int(name_split[i].split(parse_index)[1])]
-                        Jobj = Jobj[int(name_split[i].split(parse_index)[1])]
-                    else:
-                        if _isDebug_: print("\x1b[31m[-] KeyNotFoundError: \x1b[0m"+separator.join(name_split[:i+1]))
-                        return "JSON_DUPLICATE_KEYS_ERROR"
-                return Jval
-            except Exception as e:
-                if _isDebug_: print("\x1b[31m[-] ExceptionError: {}\x1b[0m".format(e))
-                return "JSON_DUPLICATE_KEYS_ERROR"
-        else:
+        if type(self.getObject()) not in [list, dict, OrderedDict]:
             if _isDebug_: print("\x1b[31m[-] DataTypeError: the JSON object must be list, dict or OrderedDict, not {}\x1b[0m".format(type(self.getObject())))
             return "JSON_DUPLICATE_KEYS_ERROR"
+
+        try:
+            Jobj = self.__Jobj
+            Jval = "JSON_DUPLICATE_KEYS_ERROR"
+            name_split = name.split(separator)
+
+            for i in range(len(name_split)):
+                if type(Jobj) in [dict, OrderedDict] and name_split[i] in Jobj.keys():
+                    Jval = Jobj[name_split[i]]
+                    Jobj = Jobj[name_split[i]]
+                elif type(Jobj) in [list] and re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", name_split[i]):
+                    Jval = Jobj[int(name_split[i].split(parse_index)[1])]
+                    Jobj = Jobj[int(name_split[i].split(parse_index)[1])]
+                else:
+                    if _isDebug_: print("\x1b[31m[-] KeyNotFoundError: \x1b[0m"+separator.join(name_split[:i+1]))
+                    return "JSON_DUPLICATE_KEYS_ERROR"
+            return Jval
+        except Exception as e:
+            if _isDebug_: print("\x1b[31m[-] ExceptionError: {}\x1b[0m".format(e))
+            return "JSON_DUPLICATE_KEYS_ERROR"
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
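The refactored method above validates the object type first and otherwise walks the key path, returning the "JSON_DUPLICATE_KEYS_ERROR" sentinel when a segment is missing. A hedged usage sketch (outputs are expectations, not captured output):

```python
# Hedged sketch of get(): nested lookup with "||" and "$i$", sentinel string on a missing path.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"snapshot": {"release": [{"version": "latest"}]}}')

print(JDKSObject.get("snapshot||release||$0$||version"))    # expected: latest
print(JDKSObject.get("snapshot||missing", _isDebug_=True))  # expected: JSON_DUPLICATE_KEYS_ERROR (plus a KeyNotFoundError debug line)
```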
@@ -237,125 +246,145 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         if type(ordered_dict) != bool: ordered_dict = False
+
         try:
-            if type(name) not in [str, unicode]:
-
+            if type(name) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
+
             if type(dupSign_start) not in [str, unicode]: dupSign_start = "{{{"
+
             if type(dupSign_end) not in [str, unicode]: dupSign_end = "}}}"
         except Exception as e:
-            if type(name) not in [str]:
-
+            if type(name) not in [str]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
+
             if type(dupSign_start) not in [str]: dupSign_start = "{{{"
-            if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
-
-        try:
-            name_split = name.split(separator)
-            name_split_first = name_split[:-1]
-            name_split_lastKey = name_split[-1]
-
-            if not re.search("^"+re.escape(parse_index)+"\d+"+re.escape(parse_index)+"$", name_split_lastKey):
-                """
-                name = name_split_first||name_split_lastKey
-
-                if "name" exist in "self.getObject()"
-                    => Add duplicate key
-                else if "name_split_first" exist in "self.getObject()"
-                    if typeof "name_split_first" is list
-                        if length of "name_split_lastKey" is 0
-                            => Add new key (append "value" to "name_split_first")
-                        else
-                            => Add new key (append dict "name_split_lastKey"/"value" to "name_split_first")
-                    else if typeof "name_split_first" is dict
-                        => Add new key ( name_split_first[name_split_lastKey] = value )
-                else if length of "name_split_first" is 0 => Add new key
-                    if typeof self.getObject() is list
-                        if length of "name_split_lastKey" is 0
-                            => Add new key (append "value" to self.__Jobj)
-                        else
-                            => Add new key (append dict "name_split_lastKey"/"value" to self.__Jobj)
-                    else if typeof self.getObject() is dict
-                        => Add new key ( self.__Jobj[name_split_lastKey] = value )
-                """
-                # Add duplicate key
-                if self.get(separator.join(name_split), separator=separator, parse_index=parse_index) != "JSON_DUPLICATE_KEYS_ERROR":
-                    index = 2
-                    while True:
-                        if self.get(separator.join(name_split)+dupSign_start+"_"+str(index)+"_"+dupSign_end, separator=separator, parse_index=parse_index) == "JSON_DUPLICATE_KEYS_ERROR":
-                            break
-                        index += 1
+            if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
-
+        if type(self.getObject()) not in [list, dict, OrderedDict]:
+            if _isDebug_: print("\x1b[31m[-] DataTypeError: the JSON object must be list, dict or OrderedDict, not {}\x1b[0m".format(type(self.getObject())))
+            return False
 
-
-
-
-
-
-
+        try:
+            name_split = name.split(separator)
+            name_split_first = name_split[:-1]
+            name_split_lastKey = name_split[-1]
+
+            if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", name_split_lastKey):
+                if _isDebug_: print("\x1b[31m[-] KeyNameInvalidError: The key name does not end with the list index\x1b[0m")
+                return False
+
+            """
+            name = name_split_first||name_split_lastKey
+
+            if "name" exist in "self.getObject()"
+                => Add duplicate key
+            else if "name_split_first" exist in "self.getObject()"
+                if typeof "name_split_first" is list
+                    if length of "name_split_lastKey" is 0
+                        => Add new key (append "value" to "name_split_first")
+                    else
+                        => Add new key (append dict "name_split_lastKey"/"value" to "name_split_first")
+                else if typeof "name_split_first" is dict
+                    => Add new key ( name_split_first[name_split_lastKey] = value )
+            else if length of "name_split_first" is 0 => Add new key
+                if typeof self.getObject() is list
+                    if length of "name_split_lastKey" is 0
+                        => Add new key (append "value" to self.__Jobj)
+                    else
+                        => Add new key (append dict "name_split_lastKey"/"value" to self.__Jobj)
+                else if typeof self.getObject() is dict
+                    => Add new key ( self.__Jobj[name_split_lastKey] = value )
+            """
+            # Add duplicate key
+            if self.get(separator.join(name_split), separator=separator, parse_index=parse_index) != "JSON_DUPLICATE_KEYS_ERROR":
+                index = 2
+                while True:
+                    if self.get(separator.join(name_split)+dupSign_start+"_"+str(index)+"_"+dupSign_end, separator=separator, parse_index=parse_index) == "JSON_DUPLICATE_KEYS_ERROR":
+                        break
+                    index += 1
 
-
-                    return True
-            # Add new key
-            elif self.get(separator.join(name_split_first), separator=separator, parse_index=parse_index) != "JSON_DUPLICATE_KEYS_ERROR":
-                if type(self.get(separator.join(name_split_first), separator=separator, parse_index=parse_index)) == list:
-                    if name_split_lastKey == "":
-                        exec_expression = "self.getObject()"
-
-                        for k in name_split_first:
-                            if re.search("^"+re.escape(parse_index)+"\d+"+re.escape(parse_index)+"$", k):
-                                exec_expression += "["+k.split(parse_index)[1]+"]"
-                            else:
-                                exec_expression += "["+repr(k)+"]"
-
-                        exec(exec_expression+".append("+repr(value)+")")
-                    else:
-                        exec_expression = "self.getObject()"
+                exec_expression = "self.getObject()"
 
-
-
-
-
-
+                name_split[-1] = name_split[-1]+dupSign_start+"_"+str(index)+"_"+dupSign_end
+                for k in name_split:
+                    if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
+                        exec_expression += "["+k.split(parse_index)[1]+"]"
+                    else:
+                        exec_expression += "["+repr(k)+"]"
 
-
-
-
+                exec(exec_expression+"="+repr(value))
+                return True
+            # Add new key
+            elif self.get(separator.join(name_split_first), separator=separator, parse_index=parse_index) != "JSON_DUPLICATE_KEYS_ERROR":
+                if len(name_split_first) > 0:
+                    if type(self.get(separator.join(name_split_first), separator=separator, parse_index=parse_index)) == list:
+                        if name_split_lastKey == "":
                            exec_expression = "self.getObject()"
 
                            for k in name_split_first:
-                                if re.search("^"+re.escape(parse_index)+"
+                                if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
                                    exec_expression += "["+k.split(parse_index)[1]+"]"
                                else:
                                    exec_expression += "["+repr(k)+"]"
 
-                                exec(exec_expression+"
-                            return True
+                            exec(exec_expression+".append("+repr(value)+")")
                        else:
-
-
-
-
-
-
+                            exec_expression = "self.getObject()"
+
+                            for k in name_split_first:
+                                if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
+                                    exec_expression += "["+k.split(parse_index)[1]+"]"
+                                else:
+                                    exec_expression += "["+repr(k)+"]"
+
+                            exec(exec_expression+".append({"+repr(name_split_lastKey)+":"+repr(value)+"})")
+                            return True
+                    elif type(self.get(separator.join(name_split_first), separator=separator, parse_index=parse_index)) == dict:
+                        exec_expression = "self.getObject()"
+
+                        for k in name_split_first:
+                            if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
+                                exec_expression += "["+k.split(parse_index)[1]+"]"
                            else:
-
-
-
+                                exec_expression += "["+repr(k)+"]"
+
+                        exec(exec_expression+"["+repr(name_split_lastKey)+"]="+repr(value))
                        return True
                else:
-                    if _isDebug_: print("\x1b[31m[-]
+                    if _isDebug_: print("\x1b[31m[-] KeyNameNotExistError: {}\x1b[0m".format(separator.join(name_split_first)))
            else:
-                if
-
-
-
-
+                if type(self.getObject()) == list:
+                    if name_split_lastKey == "":
+                        self.__Jobj.append(value)
+                    else:
+                        self.__Jobj.append({name_split_lastKey: value})
+                else:
+                    self.__Jobj[name_split_lastKey] = value
+                return True
+            # Add new key
+            elif len(name_split_first) == 0:
+                if type(self.getObject()) == list:
+                    if name_split_lastKey == "":
+                        self.__Jobj.append(value)
+                    else:
+                        self.__Jobj.append({name_split_lastKey: value})
+                else:
+                    self.__Jobj[name_split_lastKey] = value
+                return True
+            else:
+                if _isDebug_: print("\x1b[31m[-] KeyNameInvalidError: {}\x1b[0m".format(separator.join(name_split_first)))
+        except Exception as e:
+            if _isDebug_: print("\x1b[31m[-] ExceptionError: {}\x1b[0m".format(e))
 
         return False
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
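Per the docstring and branches above, set() either appends a duplicate entry when the key path already exists or adds a new key otherwise. A hedged usage sketch; the expected output follows the README examples and is not captured output:

```python
# Hedged sketch of set(): a second assignment to an existing name becomes a duplicate key.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"author": "truocphan"}')

JDKSObject.set("version", "22.3.3")   # new key
JDKSObject.set("version", "latest")   # existing name -> stored as a duplicate key
print(JDKSObject.dumps())             # expected: {"author": "truocphan", "version": "22.3.3", "version": "latest"}
```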
@@ -371,15 +400,18 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         try:
-            if type(name) not in [str, unicode]:
-
+            if type(name) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
         except Exception as e:
-            if type(name) not in [str]:
-
+            if type(name) not in [str]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
 
         if self.get(name, separator=separator, parse_index=parse_index, _isDebug_=_isDebug_) != "JSON_DUPLICATE_KEYS_ERROR":
@@ -387,7 +419,7 @@ class JSON_DUPLICATE_KEYS:
             exec_expression = "self.getObject()"
 
             for k in name.split(separator):
-                if re.search("^"+re.escape(parse_index)+"
+                if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
                     exec_expression += "["+k.split(parse_index)[1]+"]"
                 else:
                     exec_expression += "["+repr(k)+"]"
@@ -413,14 +445,16 @@ class JSON_DUPLICATE_KEYS:
         if type(_isDebug_) != bool: _isDebug_ = False
 
         try:
-            if type(name) not in [str, unicode]:
-
+            if type(name) not in [str, unicode]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
         except Exception as e:
-            if type(name) not in [str]:
-
+            if type(name) not in [str]: exit("\x1b[31m[-] DataTypeError: the KEY name must be str or unicode, not {}\x1b[0m".format(type(name)))
+
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
 
         if self.get(name, separator=separator, parse_index=parse_index, _isDebug_=_isDebug_) != "JSON_DUPLICATE_KEYS_ERROR":
@@ -428,7 +462,7 @@ class JSON_DUPLICATE_KEYS:
             exec_expression = "del self.getObject()"
 
             for k in name.split(separator):
-                if re.search("^"+re.escape(parse_index)+"
+                if re.search("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$", k):
                     exec_expression += "["+k.split(parse_index)[1]+"]"
                 else:
                     exec_expression += "["+repr(k)+"]"
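The two hunks above touch the key-path handling used when rewriting and deleting entries (per the project CHANGELOG these are the update and delete methods). A hedged sketch follows; the signatures are assumed from the project README rather than shown in this diff, and the output is an expectation:

```python
# Hedged sketch of update()/delete() on a key path; signatures assumed, not taken from this diff.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"author": "truocphan", "version": "22.3.3"}')

JDKSObject.update("version", "latest")   # rewrite the value at an existing key path
JDKSObject.delete("author")              # remove a key path
print(JDKSObject.dumps())                # expected: {"version": "latest"}
```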
@@ -454,9 +488,37 @@ class JSON_DUPLICATE_KEYS:
         JDKSObject.flatten(separator=separator, parse_index=parse_index, ordered_dict=ordered_dict)
         newJDKSObject = loads("{}", ordered_dict=ordered_dict)
 
-        for k in JDKSObject.getObject():
-            if
-
+        for k, v in JDKSObject.getObject().items():
+            if type(k) == str and type(name) == str:
+                if re.search(name, k):
+                    newJDKSObject.set(k, v, separator="§§"+separator+"§§", parse_index="§§"+parse_index+"§§", ordered_dict=ordered_dict)
+            else:
+                if name == k:
+                    newJDKSObject.set(k, v, separator="§§"+separator+"§§", parse_index="§§"+parse_index+"§§", ordered_dict=ordered_dict)
+
+        return newJDKSObject
+    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+
+
+    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+    # # # # # # # # # # # # filter_values # # # # # # # # # # # # #
+    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
+    def filter_values(self, value, separator="||", parse_index="$", ordered_dict=False):
+        import re, copy
+
+        JDKSObject = copy.deepcopy(self)
+        JDKSObject.flatten(separator=separator, parse_index=parse_index, ordered_dict=ordered_dict)
+        newJDKSObject = loads("{}", ordered_dict=ordered_dict)
+
+        for k, v in JDKSObject.getObject().items():
+            if type(v) == str and type(value) == str:
+                if re.search(value, v):
+                    newJDKSObject.set(k, v, separator="§§"+separator+"§§", parse_index="§§"+parse_index+"§§", ordered_dict=ordered_dict)
+            else:
+                if value == v:
+                    newJDKSObject.set(k, v, separator="§§"+separator+"§§", parse_index="§§"+parse_index+"§§", ordered_dict=ordered_dict)
 
         return newJDKSObject
     # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
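filter_keys (updated above) matches flattened key names, while the new filter_values matches values; for strings both use re.search, otherwise an equality check. A hedged comparison sketch, with outputs modelled on the README example rather than captured output:

```python
# Hedged sketch contrasting filter_keys (match on key names) and filter_values (match on values).
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"version": "22.3.3", "version": "latest", "release": [{"version": "latest"}]}')

print(JDKSObject.filter_keys("version").dumps())
# expected: {"version": "22.3.3", "version": "latest", "release||$0$||version": "latest"}

print(JDKSObject.filter_values("latest").dumps())
# expected: {"version": "latest", "release||$0$||version": "latest"}
```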
@@ -473,22 +535,25 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         try:
             if type(dupSign_start) not in [str, unicode]: dupSign_start = "{{{"
+
             if type(dupSign_end) not in [str, unicode]: dupSign_end = "}}}"
         except Exception as e:
             if type(dupSign_start) not in [str]: dupSign_start = "{{{"
-            if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
-
-        dupSign_start_escape_regex = re.escape(json.dumps({dupSign_start:""})[2:-6])
-
-        dupSign_end_escape_regex = re.escape(json.dumps({dupSign_end:""})[2:-6])
+            if type(dupSign_end) not in [str]: dupSign_end = "}}}"
 
-
-        else:
+        if type(self.getObject()) not in [list, dict, OrderedDict]:
             if _isDebug_: print("\x1b[31m[-] DataTypeError: the JSON object must be list, dict or OrderedDict, not {}\x1b[0m".format(type(self.getObject())))
             return "JSON_DUPLICATE_KEYS_ERROR"
+
+        dupSign_start_escape_regex = re.escape(json.dumps({dupSign_start:""})[2:-6])
+
+        dupSign_end_escape_regex = re.escape(json.dumps({dupSign_end:""})[2:-6])
+
+        return re.sub(r'{dupSign_start}_\\d+_{dupSign_end}":'.format(dupSign_start=dupSign_start_escape_regex, dupSign_end=dupSign_end_escape_regex), '":', json.dumps(self.getObject(), skipkeys=skipkeys, ensure_ascii=ensure_ascii, check_circular=check_circular, allow_nan=allow_nan, cls=cls, indent=indent, separators=separators, default=default, sort_keys=sort_keys))
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
@@ -519,52 +584,57 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         if type(ordered_dict) != bool: ordered_dict = False
+
         try:
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
         except Exception as e:
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
 
-        if type(self.getObject()) in [list, dict, OrderedDict]:
-            if
-
-            Jflat = dict()
-            if ordered_dict:
-                Jflat = OrderedDict()
-
-            def __convert_Jobj_to_Jflat(Jobj, key=None):
-                if type(Jobj) in [dict, OrderedDict]:
-                    if len(Jobj) == 0:
-                        Jflat[key] = dict()
-                        if ordered_dict:
-                            Jflat[key] = OrderedDict()
-                    else:
-                        for k,v in Jobj.items():
-                            _Jobj = v
-                            _key = "{key}{separator}{k}".format(key=key,separator=separator,k=k) if key != None else "{k}".format(k=k)
-
-                            __convert_Jobj_to_Jflat(_Jobj, _key)
-                elif type(Jobj) == list:
-                    if len(Jobj) == 0:
-                        Jflat[key] = list()
-                    else:
-                        for i,v in enumerate(Jobj):
-                            _Jobj = v
-                            _key = "{key}{separator}{parse_index}{i}{parse_index}".format(key=key, separator=separator, parse_index=parse_index, i=i) if key != None else "{parse_index}{i}{parse_index}".format(parse_index=parse_index, i=i)
+        if type(self.getObject()) not in [list, dict, OrderedDict]:
+            if _isDebug_: print("\x1b[31m[-] DataTypeError: the JSON object must be list, dict or OrderedDict, not {}\x1b[0m".format(type(self.getObject())))
+            exit()
 
-
+        if len(self.getObject()) > 0:
+            try:
+                Jflat = dict()
+                if ordered_dict:
+                    Jflat = OrderedDict()
+
+                def __convert_Jobj_to_Jflat(Jobj, key=None):
+                    if type(Jobj) in [dict, OrderedDict]:
+                        if len(Jobj) == 0:
+                            Jflat[key] = dict()
+                            if ordered_dict:
+                                Jflat[key] = OrderedDict()
+                        else:
+                            for k,v in Jobj.items():
+                                _Jobj = v
+                                _key = "{key}{separator}{k}".format(key=key,separator=separator,k=k) if key != None else "{k}".format(k=k)
+
+                                __convert_Jobj_to_Jflat(_Jobj, _key)
+                    elif type(Jobj) == list:
+                        if len(Jobj) == 0:
+                            Jflat[key] = list()
                        else:
-
+                            for i,v in enumerate(Jobj):
+                                _Jobj = v
+                                _key = "{key}{separator}{parse_index}{i}{parse_index}".format(key=key, separator=separator, parse_index=parse_index, i=i) if key != None else "{parse_index}{i}{parse_index}".format(parse_index=parse_index, i=i)
 
-
+                                __convert_Jobj_to_Jflat(_Jobj, _key)
+                    else:
+                        Jflat[key] = Jobj
 
-
-
-
-
-
+                __convert_Jobj_to_Jflat(self.getObject())
+
+                self.__Jobj = Jflat
+            except Exception as e:
+                if _isDebug_: print("\x1b[31m[-] ExceptionError: {}\x1b[0m".format(e))
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
@@ -579,42 +649,47 @@ class JSON_DUPLICATE_KEYS:
 
         # User input data type validation
         if type(_isDebug_) != bool: _isDebug_ = False
+
         if type(ordered_dict) != bool: ordered_dict = False
+
         try:
             if type(separator) not in [str, unicode]: separator = "||"
+
             if type(parse_index) not in [str, unicode]: parse_index = "$"
         except Exception as e:
             if type(separator) not in [str]: separator = "||"
+
             if type(parse_index) not in [str]: parse_index = "$"
 
-        if type(self.getObject()) in [dict, OrderedDict]:
-            if
-
-            Jobj = list() if len([k for k in self.__Jobj.keys() if re.compile("^"+re.escape(parse_index)+"\d+"+re.escape(parse_index)+"$").match(str(k).split(separator)[0])]) == len(self.__Jobj.keys()) else OrderedDict() if ordered_dict else dict()
+        if type(self.getObject()) not in [dict, OrderedDict]:
+            if _isDebug_: print("\x1b[31m[-] DataTypeError: the JSON object must be dict or OrderedDict, not {}\x1b[0m".format(type(self.getObject())))
+            exit()
 
-
-
-
+        if len(self.getObject()) > 0:
+            try:
+                Jobj = list() if len([k for k in self.__Jobj.keys() if re.compile("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$").match(str(k).split(separator)[0])]) == len(self.__Jobj.keys()) else OrderedDict() if ordered_dict else dict()
 
-
-
+                for k, v in self.__Jobj.items():
+                    Jtmp = Jobj
+                    Jkeys = k.split(separator)
 
-
-
+                    for count, (Jkey, next_Jkeys) in enumerate(zip(Jkeys, Jkeys[1:] + [v]), 1):
+                        v = next_Jkeys if count == len(Jkeys) else list() if re.compile("^"+re.escape(parse_index)+"\\d+"+re.escape(parse_index)+"$").match(next_Jkeys) else OrderedDict() if ordered_dict else dict()
 
-
-
+                        if type(Jtmp) == list:
+                            Jkey = int(re.compile(re.escape(parse_index)+"(\\d+)"+re.escape(parse_index)).match(Jkey).group(1))
 
-
-            Jtmp
+                            while Jkey >= len(Jtmp):
+                                Jtmp.append(v)
 
-
+                        elif Jkey not in Jtmp:
+                            Jtmp[Jkey] = v
 
-
-
-
-
-
+                        Jtmp = Jtmp[Jkey]
+
+                self.__Jobj = Jobj
+            except Exception as e:
+                if _isDebug_: print("\x1b[31m[-] ExceptionError: {}\x1b[0m".format(e))
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
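flatten() collapses the object into single-level keys joined by the separator (with "$i$" for list indexes), and unflatten() rebuilds the nested structure. A hedged round-trip sketch with the default separator "||" and parse_index "$"; the printed structures are expectations, not captured output:

```python
# Hedged sketch of a flatten()/unflatten() round trip.
import json_duplicate_keys as jdks

JDKSObject = jdks.loads('{"snapshot": {"release": [{"version": "latest"}]}}')

JDKSObject.flatten()
print(JDKSObject.getObject())   # expected: {'snapshot||release||$0$||version': 'latest'}

JDKSObject.unflatten()
print(JDKSObject.getObject())   # expected: {'snapshot': {'release': [{'version': 'latest'}]}}
```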
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: json-duplicate-keys
-Version: 2024.3.24
+Version: 2024.7.17
 Summary: Flatten/ Unflatten and Load(s)/ Dump(s) JSON File/ Object with Duplicate Keys
 Home-page: https://github.com/truocphan/json-duplicate-keys
 Author: TP Cyber Security
@@ -265,6 +265,27 @@ Description: # JSON Duplicate Keys - PyPI
 ```
 ---
 
+### JSON_DUPLICATE_KEYS.filter_values(`value`, `separator`="||", `parse_index`="$", `ordered_dict`=False)
+
+- `value`:
+- `separator`:
+- `parse_index`:
+- `ordered_dict`:
+```python
+import json_duplicate_keys as jdks
+
+Jstr = '{"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}'
+
+JDKSObject = jdks.loads(Jstr)
+
+print(JDKSObject.filter_values("latest").dumps())
+# OUTPUT: {"version": "latest", "release||$0$||version": "latest", "snapshot||release||$0$||version": "latest"}
+
+print(JDKSObject.dumps())
+# OUTPUT: {"author": "truocphan", "version": "22.3.3", "version": "latest", "release": [{"version": "latest"}], "snapshot": {"author": "truocphan", "version": "22.3.3", "release": [{"version": "latest"}]}}
+```
+---
+
 ### JSON_DUPLICATE_KEYS.dumps(`dupSign_start`="{{{", `dupSign_end`="}}}", `_isDebug_`=False, `skipkeys`=False, `ensure_ascii`=True, `check_circular`=True, `allow_nan`=True, `cls`=None, `indent`=None, `separators`=None, `default`=None, `sort_keys`=False)
 _Serialize a JSON object to a JSON format string_
 - `dupSign_start`:
@@ -367,6 +388,13 @@ Description: # JSON Duplicate Keys - PyPI
 ---
 
 ## CHANGELOG
+#### [json-duplicate-keys v2024.7.17](https://github.com/truocphan/json-duplicate-keys/tree/2024.7.17)
+- **Fixed**: issue [#3](https://github.com/truocphan/json-duplicate-keys/issues/3) break the set function when the key's value is empty. Thanks [ptth222](https://github.com/ptth222) for reporting this issue.
+
+#### [json-duplicate-keys v2024.4.20](https://github.com/truocphan/json-duplicate-keys/tree/2024.4.20)
+- **New**: _filter_values_
+- **Updated**: _filter_keys_
+
 #### [json-duplicate-keys v2024.3.24](https://github.com/truocphan/json-duplicate-keys/tree/2024.3.24)
 - **Updated**: _normalize_key_, _loads_, _get_, _set_, _update_, _delete_
 ---
Files without changes (7): LICENSE, MANIFEST.in, json_duplicate_keys.egg-info/SOURCES.txt, json_duplicate_keys.egg-info/dependency_links.txt, json_duplicate_keys.egg-info/top_level.txt, requirements.txt, setup.cfg