@hyperjump/json-schema 1.6.5 → 1.6.7
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/bundle/README.md +134 -0
- package/bundle/index.js +50 -2
- package/lib/keywords/enum.js +8 -1
- package/lib/keywords.js +15 -2
- package/lib/schema.js +10 -9
- package/openapi-3-0/index.js +1 -1
- package/openapi-3-1/index.d.ts +1 -1
- package/package.json +5 -7
- package/annotations/tests/applicators.json +0 -375
- package/annotations/tests/content.json +0 -57
- package/annotations/tests/core.json +0 -33
- package/annotations/tests/format.json +0 -20
- package/annotations/tests/meta-data.json +0 -128
- package/annotations/tests/unevaluated.json +0 -557
- package/annotations/tests/unknown.json +0 -87
- package/annotations/tests/validation.json +0 -328
package/bundle/README.md
ADDED
@@ -0,0 +1,134 @@
# Bundler Tests

The testing strategy used by this Test Suite is based on the concept that it
should be indistinguishable whether the user evaluated the schema using the
bundle or using individually loaded schemas. Test Runners should validate each
Test twice, once with all schemas loaded individually and again using just the
bundle. The Test passes if both evaluations produce identical output.

The accuracy of this testing approach is dependent on how detailed the output
is. It's recommended that Test Runners use a validator implementation that
supports the
["verbose"](https://json-schema.org/draft/2020-12/json-schema-core#name-verbose)
official output format or something similarly detailed that can be used for
comparison.
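
As a rough sketch of that strategy, a Test Runner's inner loop might look like
the following. The hooks (`newSession`, `loadSchema`, `evaluateVerbose`,
`createBundle`) and the `instance` property on a Test are hypothetical
placeholders for whatever the implementation under test provides, not part of
this package's API; the Test Case fields used here (`schema`,
`externalSchemas`) are described under "Test Case Components" below.

```javascript
import assert from "node:assert/strict";

// Hypothetical runner for a single Test: evaluate once with individually
// loaded schemas and once with the bundle, then require identical output.
export const runTest = async (testCase, test, { newSession, loadSchema, evaluateVerbose, createBundle }) => {
  // Pass 1: the entry schema and every external schema loaded individually.
  const unbundled = newSession();
  const entryUri = await loadSchema(unbundled, testCase.schema);
  for (const [retrievalUri, schema] of Object.entries(testCase.externalSchemas ?? {})) {
    await loadSchema(unbundled, schema, retrievalUri);
  }
  const unbundledOutput = await evaluateVerbose(unbundled, entryUri, test.instance);

  // Pass 2: only the bundled schema is loaded.
  const bundled = newSession();
  const bundleUri = await loadSchema(bundled, await createBundle(unbundled, entryUri));
  const bundledOutput = await evaluateVerbose(bundled, bundleUri, test.instance);

  // The Test passes if both evaluations produce identical (verbose) output.
  assert.deepEqual(bundledOutput, unbundledOutput);
};
```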

## Supported Dialects

Although the topic of bundling didn't appear in the spec until 2020-12, the
process described is applicable since `id` (now `$id`) was introduced in
draft-03. Test Cases in this Test Suite are designed to be compatible with as
many releases of JSON Schema as possible. They do not include a `$schema`
keyword so that implementations can run the same Test Suite for each dialect
they support. The Test Runner should define what dialect it's using to run the
Test Suite.

Since this Test Suite can be used for a variety of dialects, there are a couple
of options that can be used by Test Runners to filter out Test Cases that don't
apply to the dialect under test.

## Test Case Components

### description

A short description of what behavior the Test Case is covering.

### compatibility

The `compatibility` option allows you to set which dialects the Test Case is
compatible with. Test Runners can use this value to filter out Test Cases that
don't apply to the dialect currently under test. Dialects are indicated by the
number corresponding to their release. Date-based releases use just the year.

If this option isn't present, it means the Test Case is compatible with draft-03
and above.

If this option is present with a number, the number indicates the minimum
release the Test Case is compatible with. This example indicates that the Test
Case is compatible with draft-07 and up.

**Example**: `"compatibility": "7"`

You can use a `<=` operator to indicate that the Test Case is compatible with
releases less than or equal to the given release. This example indicates that
the Test Case is compatible with 2019-09 and under.

**Example**: `"compatibility": "<=2019"`

You can use comma-separated values to indicate multiple constraints if needed.
This example indicates that the Test Case is compatible with releases between
draft-06 and 2019-09.

**Example**: `"compatibility": "6,<=2019"`

For convenience, you can use the `=` operator to indicate a Test Case is only
compatible with a single release. This example indicates that the Test Case is
compatible only with 2020-12.

**Example**: `"compatibility": "=2020"`
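
Taken together, a `compatibility` value is a comma-separated list of
constraints, each of which is a bare minimum release, a `<=` maximum, or an `=`
exact match. Below is a minimal sketch of how a Test Runner could evaluate the
option; the `isCompatible` helper is illustrative and not part of this Test
Suite or package.

```javascript
// Check a Test Case's `compatibility` value against the dialect under test.
// Dialects are identified by their release number (3, 4, 6, 7, 2019, 2020, ...).
export const isCompatible = (compatibility, dialect) => {
  if (compatibility === undefined) {
    return dialect >= 3; // No option means draft-03 and above
  }

  return compatibility.split(",").every((constraint) => {
    if (constraint.startsWith("<=")) {
      return dialect <= Number(constraint.slice(2)); // Maximum release
    } else if (constraint.startsWith("=")) {
      return dialect === Number(constraint.slice(1)); // Exact release
    } else {
      return dialect >= Number(constraint); // Bare number is a minimum release
    }
  });
};

isCompatible("7", 2020);      // true  -- draft-07 and up
isCompatible("<=2019", 2020); // false -- 2019-09 and under
isCompatible("6,<=2019", 7);  // true  -- between draft-06 and 2019-09
isCompatible("=2020", 2019);  // false -- only 2020-12
```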

### requiredDialects

Since 2020-12, it's been allowed to have bundles that include schemas using
dialects different from the parent schema. In order to test this, it's necessary
to have an external schema that declares a dialect with `$schema`. The
`requiredDialects` option can be included to tell the Test Runner which dialects
are required to run the Test Case other than the dialect under test. This option
is an array of numbers corresponding to release numbers. Date-based releases use
just the year.

**Example**: `"requiredDialects": [2019]`

### schema

The schema that will serve as the entry point of the bundle. This schema
shouldn't include `$schema` or `id`/`$id` because Test Cases should be designed
to work with as many releases as possible.

### externalSchemas

`externalSchemas` is where you define the schemas that the schema in `schema`
references and that you expect to get embedded into the bundled schema. The
value is an object where the keys are retrieval URIs and the values are schemas.
Most external schemas aren't self-identifying (using `id`/`$id`) and rely solely
on the retrieval URI for identification. This is done to increase the number of
dialects that the test is compatible with. Because `id` changed to `$id` in
draft-06, if you use `$id`, the test becomes incompatible with draft-03/04 and
in most cases, that's not necessary.

### tests

`tests` are a collection of Tests to run to verify the Test Case. Tests don't
include an expected pass/fail result for the instance because we don't care
whether the instance is valid or not; we care only whether the bundle produces
the same results as its unbundled equivalent.

Tests should be designed to expose potential differences between the bundled and
unbundled evaluations. You should have at least one Test that's expected to pass
to ensure that annotations are equivalent and at least one Test that's expected
to fail to ensure errors are equivalent.
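
Putting the components together, a complete Test Case might look like the
following hypothetical example. The top-level field names come from the
sections above; the specific schemas, the retrieval URI, and the shape of each
Test (an object with an `instance` property) are illustrative assumptions
rather than excerpts from this Test Suite. `compatibility` and
`requiredDialects` are omitted because this example would apply to draft-03 and
above.

```json
{
  "description": "Property schema referencing an external string schema",
  "schema": {
    "properties": {
      "name": { "$ref": "https://example.com/string" }
    }
  },
  "externalSchemas": {
    "https://example.com/string": { "type": "string" }
  },
  "tests": [
    { "instance": { "name": "expected to pass" } },
    { "instance": { "name": 42 } }
  ]
}
```

The first Test should evaluate successfully and the second should fail, which
exercises both the annotation comparison and the error comparison described
above.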

## Fixtures

Often we don't care what's in the externally referenced schemas and end up using
the same simple and generic schemas in a lot of tests. In order to reduce
duplication, there are a couple of generic schemas defined that test runners
should load in addition to `externalSchemas` for each Test Case.

Fixtures are expected to be simple, compatible with all dialects, and not
critical to understanding what is being tested. Anything that doesn't meet those
criteria is better off declared in `externalSchemas`.

## Directory Structure

* tests - A directory containing the full Test Suite.
  * tests.json - A Test Suite covering functionality in the latest spec release.
  * legacy-tests.json - A Test Suite covering functionality that was removed in
    a previous release. This is mostly adaptations of Test Cases that use `$id`
    or `$defs` to cover the same functionality for older releases that use `id`
    or `definitions`. Test Cases involving other removed keywords such as
    `dependencies` and `recursiveRef` can be found here as well.
* fixtures - A collection of simple and reusable schemas
  * string.schema.json
  * number.schema.json
  * ... etc
package/bundle/index.js
CHANGED
@@ -223,8 +223,8 @@ const loadKeywordSupport = () => {

   const dependencies = getKeyword("https://json-schema.org/keyword/draft-04/dependencies");
   if (dependencies) {
-    dependencies.collectExternalIds = (
-      Object.values(
+    dependencies.collectExternalIds = (dependencies, externalIds, ast, dynamicAnchors) => {
+      Object.values(dependencies).forEach(([, dependency]) => {
        if (typeof dependency === "string") {
          Validation.collectExternalIds(dependency, externalIds, ast, dynamicAnchors);
        }
@@ -265,4 +265,52 @@ const loadKeywordSupport = () => {
       Validation.collectExternalIds(contains, externalIds, ast, dynamicAnchors);
     };
   }
+
+  // Experimental
+
+  const propertyDependencies = getKeyword("https://json-schema.org/keyword/propertyDependencies");
+  if (propertyDependencies) {
+    propertyDependencies.collectExternalIds = (propertyDependencies, externalIds, ast, dynamicAnchors) => {
+      for (const key in propertyDependencies) {
+        for (const value in propertyDependencies[key]) {
+          Validation.collectExternalIds(propertyDependencies[key][value], externalIds, ast, dynamicAnchors);
+        }
+      }
+    };
+  }
+
+  const conditional = getKeyword("https://json-schema.org/keyword/conditional");
+  if (conditional) {
+    conditional.collectExternalIds = (conditional, externalIds, ast, dynamicAnchors) => {
+      for (const schema of conditional) {
+        Validation.collectExternalIds(schema, externalIds, ast, dynamicAnchors);
+      }
+    };
+  }
+
+  const itemPattern = getKeyword("https://json-schema.org/keyword/itemPattern");
+  if (itemPattern) {
+    itemPattern.collectExternalIds = (nfa, externalIds, ast, dynamicAnchors) => {
+      for (const itemSchema of collectNfaSchemas(nfa.start)) {
+        Validation.collectExternalIds(itemSchema, externalIds, ast, dynamicAnchors);
+      }
+    };
+  }
+
+  const collectNfaSchemas = function* (node, visited = new Set()) {
+    if (visited.has(node)) {
+      return;
+    }
+
+    visited.add(node);
+
+    for (const schema in node.transition) {
+      yield schema;
+      yield* collectNfaSchemas(node.transition[schema], visited);
+    }
+
+    for (const epsilon of node.epsilonTransitions) {
+      yield* collectNfaSchemas(epsilon, visited);
+    }
+  };
 };
package/lib/keywords/enum.js
CHANGED
@@ -1,3 +1,4 @@
+import { pipe, asyncMap, asyncCollectArray } from "@hyperjump/pact";
 import jsonStringify from "fastest-stable-stringify";
 import * as Schema from "../schema.js";
 import * as Instance from "../instance.js";
@@ -5,7 +6,13 @@ import * as Instance from "../instance.js";

 const id = "https://json-schema.org/keyword/enum";

-const compile = (schema) =>
+const compile = (schema) => pipe(
+  Schema.iter(schema),
+  asyncMap(Schema.value),
+  asyncMap(jsonStringify),
+  asyncCollectArray
+);
+
 const interpret = (enum_, instance) => enum_.some((enumValue) => jsonStringify(Instance.value(instance)) === enumValue);

 export default { id, compile, interpret };
package/lib/keywords.js
CHANGED
@@ -15,9 +15,22 @@ export const defineVocabulary = (id, keywords) => {

 const _dialects = {};
 const _allowUnknownKeywords = {};
-
-
+
+export const getKeywordId = (dialectId, keyword) => {
+  if (!hasDialect(dialectId)) {
+    throw Error(`Encountered unknown dialect '${dialectId}'`);
+  }
+
+  return _dialects[dialectId]?.[keyword]
+    || (_allowUnknownKeywords[dialectId] || keyword.startsWith("x-"))
+      && `https://json-schema.org/keyword/unknown#${keyword}`;
+};
+
 export const getKeywordName = (dialectId, keywordId) => {
+  if (!hasDialect(dialectId)) {
+    throw Error(`Encountered unknown dialect '${dialectId}'`);
+  }
+
   for (const keyword in _dialects[dialectId]) {
     if (_dialects[dialectId][keyword] === keywordId) {
       return keyword;
package/lib/schema.js
CHANGED
@@ -22,14 +22,10 @@ export const add = (schema, retrievalUri = undefined, contextDialectId = undefin
   const dialectId = toAbsoluteIri(schema.$schema || contextDialectId);
   delete schema.$schema;

-  if (!hasDialect(dialectId)) {
-    throw Error(`Encountered unknown dialect '${dialectId}'`);
-  }
-
   // Identifiers
   const idToken = getKeywordName(dialectId, "https://json-schema.org/keyword/id")
     || getKeywordName(dialectId, "https://json-schema.org/keyword/draft-04/id");
-  if (retrievalUri
+  if (!retrievalUri && typeof schema?.[idToken] !== "string") {
     throw Error(`Unable to determine an identifier for the schema. Use the '${idToken}' keyword or pass a retrievalUri when loading the schema.`);
   }
   const internalUrl = resolveUri(schema[idToken] || retrievalUri, retrievalUri);
@@ -77,9 +73,6 @@ const processSchema = (subject, id, dialectId, pointer, anchors, dynamicAnchors)
   if (jsonTypeOf(subject, "object")) {
     // Embedded Schema
     const embeddedDialectId = typeof subject.$schema === "string" ? toAbsoluteIri(subject.$schema) : dialectId;
-    if (!hasDialect(embeddedDialectId)) {
-      throw Error(`Encountered unknown dialect '${embeddedDialectId}'`);
-    }

     const idToken = getKeywordName(embeddedDialectId, "https://json-schema.org/keyword/id");
     if (typeof subject[idToken] === "string") {
@@ -259,6 +252,7 @@ export const toSchema = (schemaDoc, options = {}) => {
   const idToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/id")
     || getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/draft-04/id");
   const anchorToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/anchor");
+  const legacyAnchorToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/draft-04/id");
   const dynamicAnchorToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/dynamicAnchor");
   const legacyDynamicAnchorToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/draft-2020-12/dynamicAnchor");
   const recursiveAnchorToken = getKeywordName(schemaDoc.dialectId, "https://json-schema.org/keyword/recursiveAnchor");
@@ -284,7 +278,14 @@ export const toSchema = (schemaDoc, options = {}) => {
       }
     } else {
       if (pointer in anchors) {
-
+        if (anchorToken) {
+          value = { [anchorToken]: anchors[pointer], ...value };
+        }
+
+        // Legacy anchor
+        if (legacyAnchorToken) {
+          value = { [legacyAnchorToken]: `#${anchors[pointer]}`, ...value };
+        }
       }
       if (pointer in dynamicAnchors) {
         if (dynamicAnchorToken) {
package/openapi-3-0/index.js
CHANGED
package/openapi-3-1/index.d.ts
CHANGED
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@hyperjump/json-schema",
-  "version": "1.6.5",
+  "version": "1.6.7",
   "description": "A JSON Schema validator with support for custom keywords, vocabularies, and dialects",
   "type": "module",
   "main": "./stable/index.js",
@@ -27,7 +27,7 @@
   "scripts": {
     "clean": "xargs -a .gitignore rm -rf",
     "lint": "eslint lib stable draft-* openapi-* bundle annotations",
-    "test": "
+    "test": "vitest --watch=false"
   },
   "repository": "github:hyperjump-io/json-schema",
   "keywords": [
@@ -51,20 +51,18 @@
     "url": "https://github.com/sponsors/jdesrosiers"
   },
   "devDependencies": {
-    "@types/chai": "*",
-    "@types/mocha": "*",
     "@types/node": "*",
     "@typescript-eslint/eslint-plugin": "*",
     "@typescript-eslint/parser": "*",
-    "
+    "@vitest/coverage-v8": "^1.0.4",
     "eslint": "*",
+    "eslint-import-resolver-exports": "*",
     "eslint-import-resolver-node": "*",
     "eslint-import-resolver-typescript": "*",
     "eslint-plugin-import": "*",
     "json-schema-test-suite": "github:json-schema-org/JSON-Schema-Test-Suite",
-    "mocha": "*",
-    "ts-node": "*",
     "typescript": "*",
+    "vitest": "*",
     "yaml": "*"
   },
   "dependencies": {