vis-chronicle 0.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,5 @@
1
+ {
2
+ "editor.insertSpaces": false,
3
+ "editor.tabSize": 4,
4
+ "editor.detectIndentation": false
5
+ }
package/LICENSE.md ADDED
@@ -0,0 +1,5 @@
1
+ Copyright 2025 Brian MacIntosh
2
+
3
+ Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
4
+
5
+ THE SOFTWARE IS PROVIDED “AS IS” AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,95 @@
1
+ # vis-chronicle
2
+
3
+ vis-chronicle is a tool to generate web-based timelines from Wikidata SPARQL queries. You feed the tool a specification JSON file that describes what items to put on the timeline. The tool gathers the needed data from Wikidata and produces a JSON file suitable for feeding directly to [vis-timeline](https://github.com/visjs/vis-timeline).
4
+
5
+ [vis-chronicle-demo](https://github.com/BrianMacIntosh/vis-chronicle-demo) is a demo application using this tool.
6
+
7
+ ## Example
8
+
9
+ This specification file generates a timeline containing the lifespans of every United States president.
10
+
11
+ ```json
12
+ {
13
+ "groups":
14
+ [
15
+ {
16
+ "id": "presidents",
17
+ "content": ""
18
+ }
19
+ ],
20
+ "options":
21
+ {
22
+ "width": "100%",
23
+ "preferZoom": false,
24
+ "showCurrentTime": true,
25
+ "stack": false,
26
+ "stackSubgroups": false,
27
+ "verticalScroll": true
28
+ },
29
+ "queryTemplates":
30
+ {
31
+ "dateOfBirth": "{entity} p:P569 ?_prop. ?_prop psv:P569 ?_value.",
32
+ "dateOfDeath": "{entity} p:P570 ?_prop. ?_prop psv:P570 ?_value."
33
+ },
34
+ "items":
35
+ [
36
+ {
37
+ "comment": "All United States presidents",
38
+ "itemQuery": "?_node wdt:P39 wd:Q11696. ?_node wdt:P31 wd:Q5.",
39
+ "group": "presidents",
40
+
41
+ "startQuery": "#dateOfBirth",
42
+ "endQuery": "#dateOfDeath"
43
+ },
44
+ {
45
+ "label": "Ed Kealty (fictional)",
46
+ "entity": "Q5335019",
47
+ "group": "presidents",
48
+ "startQuery": "#dateOfBirth",
49
+ "endQuery": "#dateOfDeath"
50
+ }
51
+ ]
52
+ }
53
+
54
+ ```
55
+
56
+ `groups` and `options` are passed through transparently to vis-timeline - see the vis-timeline documentation. `"stack": false` and `"stackSubgroups": false` are necessary for chronicle's open-ended ranges to display correctly.
57
+
58
+ `queryTemplates` contains template queries that can be reused by multiple items. vis-chronicle ships with a few template queries by default. Queries need to match a property (`p:`) using the variable `?_prop` and a value (such as `psv:` or `pqv:`) using the variable `?_value`. Any format variable like `{format}` will be replaced by the same-named value on the item entry, allowing templates to be parameterized.
59
+
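As a sketch, the wildcard-substitution step can be pictured like this (a simplified reimplementation for illustration, not the tool's exact code; the `wd:` prefixing of bare Q-ids mirrors the behavior in `src/wikidata.js`):

```javascript
// Simplified sketch of template wildcard substitution: every {key} in the
// template is replaced by the same-named value from the item entry.
function fillTemplate(template, item) {
  let term = template;
  for (const key in item) {
    let value = item[key];
    // bare Q-ids become wd:-prefixed SPARQL entity references
    if (typeof value === "string" && value.startsWith("Q")) {
      value = "wd:" + value;
    }
    term = term.replaceAll(`{${key}}`, value);
  }
  return term;
}

const dateOfBirth = "{entity} p:P569 ?_prop. ?_prop psv:P569 ?_value.";
console.log(fillTemplate(dateOfBirth, { entity: "Q5335019" }));
// wd:Q5335019 p:P569 ?_prop. ?_prop psv:P569 ?_value.
```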
60
+ `items` contains any number of items to generate. An item can be a single, literal item, or a multi-item generator (using `itemQuery` or `items`).
61
+
62
+ Item properties:
63
+ * `label`: Display label for the item. Can contain HTML. For generators, wildcards can be used:
64
+ * `{_LABEL}` will be replaced by the Wikidata label.
65
+ * `{_QID}` will be replaced by the Wikidata Q-id.
66
+ * `itemQuery`: Generators only. A SPARQL query segment that selects all the items to generate. `?_node` stands in for the output variable. The `entity` property is added to each generated item with the item's entity id. Item labels are automatically fetched from Wikidata.
67
+ * `items`: Generators only. An array of Wikidata ids. Works like `itemQuery` but with an explicit list.
68
+ * `itemRange`: Generators only. Only generates items whose time spans fall at least partially within the provided range.
69
+ * `min`
70
+ * `max`
71
+ * `group`: The vis-timeline group to put the item(s) in.
72
+ * `className`: CSS class to apply to the timeline div.
73
+ * `type`: vis-timeline type of the item.
74
+ * `startQuery`: A SPARQL query segment that selects the desired start time of the item. The query should bind the statement using the variable `?_prop` and the value node using `?_value`. Can be an inline query term like `{entity} p:P569 ?_prop. ?_prop psv:P569 ?_value.` or a reference to a template query like `#dateOfBirth`.
75
+ * `endQuery`: Same as `startQuery`, but for the end time.
76
+ * `startEndQuery`: Use instead of `startQuery` and `endQuery` separately if the start and end times are from qualifiers. This guarantees they will be drawn from the same statement.
77
+ * `general`: General query segment, usually for selecting the item and property.
78
+ * `start`: Query segment for selecting the start value.
79
+ * `end`: Query segment for selecting the end value.
80
+ * `expectedDuration`: Describes the expected duration, for hinting if the start or end is missing.
81
+ * `min`: The absolute minimum duration.
82
+ * `max`: The absolute maximum duration.
83
+ * `avg`: The average duration.
84
+
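For example, a generator entry combining several of these properties might look like the following (a hypothetical entry; `#positionHeldStartEnd` is one of the built-in templates, and the extra `position` property fills that template's `{position}` wildcard):

```json
{
  "comment": "Terms held by United States presidents",
  "itemQuery": "?_node wdt:P39 wd:Q11696. ?_node wdt:P31 wd:Q5.",
  "group": "presidents",
  "position": "Q11696",
  "startEndQuery": "#positionHeldStartEnd"
}
```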
85
+ ## Parameters
86
+
87
+ The tool can be run as a package script like `vis-chronicle ./src/timeline.json -v`.
88
+
89
+ The tool takes these parameters on the command line:
90
+ * (Mandatory) The name of the input file or directory. If a directory, all JSON files in the directory will be combined.
91
+ * (Optional) The name of the output file. Defaults to `intermediate/timeline.json`.
92
+ * `-v` / `--verbose`: Print verbose output, including the actual queries being run.
93
+ * `--skip-wd-cache`: Do not read anything from the local cache; query all data fresh.
94
+ * `-q` / `--query-url`: The URL of the SPARQL endpoint. Defaults to `https://query.wikidata.org/sparql`.
95
+ * `-l` / `--lang`: Language priorities to use when fetching labels. Defaults to `en,mul`. See [SPARQL/SERVICE - Label](https://en.wikibooks.org/wiki/SPARQL/SERVICE_-_Label).
package/package.json ADDED
@@ -0,0 +1,30 @@
1
+ {
2
+ "name": "vis-chronicle",
3
+ "version": "0.0.2",
4
+ "description": "Generates JSON for populating a vis.js timeline from Wikidata queries.",
5
+ "keywords": [
6
+ "wikidata", "visjs", "timeline"
7
+ ],
8
+ "homepage": "https://github.com/BrianMacIntosh/vis-chronicle",
9
+ "bugs": "https://github.com/BrianMacIntosh/vis-chronicle/issues",
10
+ "repository": {
11
+ "type": "git",
12
+ "url": "git+https://github.com/BrianMacIntosh/vis-chronicle.git"
13
+ },
14
+ "bin": {
15
+ "vis-chronicle": "src/fetch.js"
16
+ },
17
+ "scripts": {
18
+ },
19
+ "author": {
20
+ "name": "Brian MacIntosh",
21
+ "email": "brianamacintosh@gmail.com",
22
+ "url": "https://brianmacintosh.com"
23
+ },
24
+ "license": "ISC",
25
+ "dependencies": {
26
+ "moment": "2.30.1"
27
+ },
28
+ "devDependencies": {
29
+ }
30
+ }
package/src/fetch.js ADDED
@@ -0,0 +1,544 @@
1
+ #!/usr/bin/env node
2
+
3
+ // Uses an input specification file to produce an output file for vis.js Timeline.
4
+
5
+ const path = require('path')
6
+ const nodeutil = require('node:util')
7
+ const assert = require('node:assert/strict');
8
+
9
+ // parse args
10
+ const { values, positionals } = nodeutil.parseArgs({
11
+ allowPositionals: true,
12
+ options: {
13
+ "verbose": { type: 'boolean', short: 'v', default: false },
14
+ "skip-wd-cache": { type: 'boolean', default: false },
15
+ "query-url": { type: 'string', short: 'q', default: "https://query.wikidata.org/sparql" },
16
+ "lang": { type: 'string', short: 'l', default: "en,mul" },
17
+ "cachebuster": { type: 'string', default: undefined}
18
+ }})
19
+
20
+ var specFile = positionals[0]
21
+ if (!specFile)
22
+ {
23
+ console.error(`No specification file or directory provided.`)
24
+ console.log(`Usage: vis-chronicle IN [OUT]`)
25
+ process.exit(1)
26
+ }
26
+ else
27
+ {
28
+ console.log(`Fetching using spec '${specFile}'.`)
29
+ }
30
+
31
+ var outputFile = positionals[1]
32
+ if (!outputFile)
33
+ {
34
+ outputFile = "intermediate/timeline.json"
35
+ }
36
+
37
+ const moment = require('moment')
38
+ const fs = require('fs');
39
+ const wikidata = require('./wikidata.js')
40
+ const mypath = require("./mypath.js")
41
+ const globalData = require("./global-data.json")
42
+
43
+ function postprocessDuration(duration)
44
+ {
45
+ if (duration.min)
46
+ duration.min = moment.duration(duration.min)
47
+ if (duration.max)
48
+ duration.max = moment.duration(duration.max)
49
+ if (duration.avg)
50
+ duration.avg = moment.duration(duration.avg)
51
+ return duration
52
+ }
53
+ function postprocessGlobalData()
54
+ {
55
+ for (const expectation of globalData.expectations)
56
+ {
57
+ if (expectation.duration)
58
+ {
59
+ postprocessDuration(expectation.duration)
60
+ }
61
+ }
62
+ }
63
+ postprocessGlobalData()
64
+
65
+ wikidata.skipCache = values["skip-wd-cache"]
66
+ wikidata.cacheBuster = values["cachebuster"]
67
+ wikidata.sparqlUrl = values["query-url"]
68
+ wikidata.verboseLogging = values["verbose"]
69
+ wikidata.setLang(values["lang"])
70
+ wikidata.initialize()
71
+
72
+ // produces a moment from a Wikidata time
73
+ function wikidataToMoment(inTime)
74
+ {
75
+ if (!inTime || !inTime.value)
76
+ {
77
+ // missing value
78
+ return undefined
79
+ }
80
+
81
+ // moment has trouble with negative years unless they're six digits
82
+ const date = moment.utc(inTime.value, 'YYYYYY-MM-DDTHH:mm:ss')
83
+
84
+ //TODO: do something nice in the GUI to indicate imprecision of times
85
+ switch (inTime.precision)
86
+ {
87
+ case 0: case 1: case 2: case 3: case 4: case 5: case 6: case 7: case 8: case 9:
88
+ const yearBase = Math.pow(10, 9 - inTime.precision)
89
+ const roundedYear = Math.round(date.year() / yearBase) * yearBase
90
+ return date.year(roundedYear).startOf('year')
91
+ case 10: // month precision
92
+ return date.startOf('month')
93
+ case 11: // day precision
94
+ return date.startOf('day')
95
+ case 12: // hour precision
96
+ return date.startOf('hour')
97
+ case 13: // minute precision
98
+ return date.startOf('minute')
99
+ case 14: // second precision
100
+ return date.startOf('second')
101
+ default:
102
+ throw `Unrecognized date precision ${inTime.precision}`
103
+ }
104
+ }
105
+
106
+ function getExpectation(item)
107
+ {
108
+ if (item.expectedDuration)
109
+ {
110
+ return { "duration": item.expectedDuration };
111
+ }
112
+
113
+ for (const expectation of globalData.expectations)
114
+ {
115
+ if (expectation.startQuery && item.startQuery != expectation.startQuery)
116
+ {
117
+ continue;
118
+ }
119
+ if (expectation.endQuery && item.endQuery != expectation.endQuery)
120
+ {
121
+ continue;
122
+ }
123
+ if (expectation.startEndQuery && item.startEndQuery != expectation.startEndQuery)
124
+ {
125
+ continue;
126
+ }
127
+ return expectation;
128
+ }
129
+ assert(false) // expect at least a universal fallback expectation
130
+ return undefined
131
+ }
132
+
133
+ // produces JSON output from the queried data
134
+ function produceOutput(items)
135
+ {
136
+ const finalizeItem = function(item)
137
+ {
138
+ if (item.start)
139
+ item.start = item.start.format("YYYYYY-MM-DDTHH:mm:ss")
140
+ if (item.end)
141
+ item.end = item.end.format("YYYYYY-MM-DDTHH:mm:ss")
142
+ outputObject.items.push(item)
143
+ }
144
+
145
+ var outputObject = { items: [], groups: wikidata.inputSpec.groups, options: wikidata.inputSpec.options }
146
+ for (const item of items)
147
+ {
148
+ var outputItem = {
149
+ id: item.id,
150
+ content: item.label,
151
+ className: item.className,
152
+ comment: item.comment,
153
+ type: item.type
154
+ }
155
+ if (item.group)
156
+ {
157
+ outputItem.group = item.group
158
+ outputItem.subgroup = item.subgroup ? item.subgroup : item.entity
159
+ }
160
+
161
+ const isRangeType = !outputItem.type || outputItem.type == "range" || outputItem.type == "background"
162
+
163
+ // for debugging
164
+ outputItem.className = [ outputItem.className, item.entity ].join(' ')
165
+
166
+ // look up duration expectations
167
+ const expectation = getExpectation(item)
168
+
169
+ assert(expectation && expectation.duration && expectation.duration.avg) // expect at least a universal fallback expectation
170
+
171
+ if (!item.start && !item.end
172
+ && !item.start_min && !item.start_max
173
+ && !item.end_min && !item.end_max)
174
+ {
175
+ console.warn(`Item ${item.id} has no date data at all.`)
176
+ continue
177
+ }
178
+
179
+ outputItem.start = wikidataToMoment(item.start)
180
+ outputItem.end = wikidataToMoment(item.end)
181
+ const start_min = item.start_min ? wikidataToMoment(item.start_min) : outputItem.start
182
+ const start_max = item.start_max ? wikidataToMoment(item.start_max) : outputItem.start
183
+ const end_min = item.end_min ? wikidataToMoment(item.end_min) : outputItem.end
184
+ const end_max = item.end_max ? wikidataToMoment(item.end_max) : outputItem.end
185
+
186
+ // exclude items that violate itemRange constraints
187
+ //OPT: do this at an earlier stage? (e.g. when running the first query)
188
+ if (item.itemRange)
189
+ {
190
+ if (item.itemRange.min && moment(item.itemRange.min).isAfter(end_max))
191
+ continue
192
+ if (item.itemRange.max && moment(item.itemRange.max).isBefore(start_min))
193
+ continue
194
+ }
195
+
196
+ // no certainty at all
197
+ if (start_max >= end_min)
198
+ {
199
+ outputItem.className = [ outputItem.className, 'visc-uncertain' ].join(' ')
200
+ outputItem.start = start_min
201
+ outputItem.end = end_max
202
+
203
+ finalizeItem(outputItem)
204
+ continue
205
+ }
206
+
207
+ if (!isRangeType)
208
+ {
209
+ finalizeItem(outputItem)
210
+ continue
211
+ }
212
+
213
+ // handle end date
214
+ if (end_min && end_max && end_min < end_max)
215
+ {
216
+ // uncertain end
217
+
218
+ // find lower bound of uncertain region
219
+ var uncertainMin
220
+ if (item.end_min)
221
+ uncertainMin = end_min
222
+ else if (outputItem.end)
223
+ uncertainMin = outputItem.end
224
+ else
225
+ uncertainMin = outputItem.start //TODO: use min/max start
226
+ assert(uncertainMin)
227
+
228
+ // add uncertain range
229
+ outputObject.items.push({
230
+ id: outputItem.id + "-unc-end",
231
+ className: [outputItem.className, "visc-uncertain", "visc-left-connection"].join(' '),
232
+ content: item.label ? "&nbsp;" : "",
233
+ start: uncertainMin.format("YYYYYY-MM-DDTHH:mm:ss"),
234
+ end: end_max.format("YYYYYY-MM-DDTHH:mm:ss"),
235
+ group: item.group,
236
+ subgroup: outputItem.subgroup
237
+ })
238
+
239
+ // adjust normal range to match
240
+ outputItem.end = uncertainMin
241
+ outputItem.className = [ outputItem.className, 'visc-right-connection' ].join(' ')
242
+ }
243
+ else if (!outputItem.end)
244
+ {
245
+ // open-ended end
246
+ const useMax = expectation.duration.max ? expectation.duration.max : moment.duration(expectation.duration.avg.asMilliseconds() * 2)
247
+ if (outputItem.start.clone().add(useMax) < moment())
248
+ {
249
+ // 'max' is less than 'now'; it is likely this duration is not ongoing but has an unknown end
250
+ //TODO: accommodate wikidata special 'no value'
251
+ outputItem.end = outputItem.start.clone().add(expectation.duration.avg)
252
+ }
253
+ else
254
+ {
255
+ // 'now' is within 'max' and so it is a reasonable guess that this duration is ongoing
256
+ outputItem.end = moment()
257
+ }
258
+
259
+ const avgDuration = moment.duration(expectation.duration.avg) //HACK: TODO: consistently postprocess expectations, or don't
260
+ const actualDuration = moment.duration(outputItem.end.diff(outputItem.start))
261
+ var excessDuration = moment.duration(avgDuration.asMilliseconds()).subtract(actualDuration)
262
+ excessDuration = moment.duration(Math.max(excessDuration.asMilliseconds(), avgDuration.asMilliseconds() * 0.25))
263
+
264
+ // add a "tail" item after the end
265
+ outputObject.items.push({
266
+ id: outputItem.id + "-tail",
267
+ className: [outputItem.className, "visc-right-tail"].join(' '),
268
+ content: item.label ? "&nbsp;" : "",
269
+ start: outputItem.end.format("YYYYYY-MM-DDTHH:mm:ss"),
270
+ end: outputItem.end.clone().add(excessDuration).format("YYYYYY-MM-DDTHH:mm:ss"), //HACK: magic number
271
+ group: item.group,
272
+ subgroup: outputItem.subgroup
273
+ })
274
+
275
+ outputItem.className = [ outputItem.className, 'visc-right-connection' ].join(' ')
276
+ }
277
+
278
+ // handle start date
279
+ if (start_min && start_max && start_max > start_min)
280
+ {
281
+ // uncertain start
282
+
283
+ // find upper bound of uncertain region
284
+ var uncertainMax
285
+ if (item.start_max)
286
+ uncertainMax = start_max
287
+ else if (outputItem.start)
288
+ uncertainMax = outputItem.start
289
+ else
290
+ uncertainMax = outputItem.end //TODO: use min/max end
291
+ assert(uncertainMax)
292
+
293
+ // add uncertain range
294
+ outputObject.items.push({
295
+ id: outputItem.id + "-unc-start",
296
+ className: [outputItem.className, "visc-uncertain", "visc-right-connection"].join(' '),
297
+ content: item.label ? "&nbsp;" : "",
298
+ start: start_min.format("YYYYYY-MM-DDTHH:mm:ss"),
299
+ end: uncertainMax.format("YYYYYY-MM-DDTHH:mm:ss"),
300
+ group: item.group,
301
+ subgroup: outputItem.subgroup
302
+ })
303
+
304
+ // adjust normal range to match
305
+ outputItem.start = uncertainMax
306
+ outputItem.className = [ outputItem.className, 'visc-left-connection' ].join(' ')
307
+ }
308
+ else if (!outputItem.start)
309
+ {
310
+ // open-ended start
311
+ outputItem.start = outputItem.end.clone().subtract(expectation.duration.avg)
312
+ outputItem.className = [outputItem.className, "visc-open-left"].join(' ')
313
+ }
314
+
315
+ //TODO: missing death dates inside expected duration: solid to NOW, fade after NOW
316
+ //TODO: accept expected durations and place uncertainly before/after those
317
+
318
+ finalizeItem(outputItem)
319
+ }
320
+ return JSON.stringify(outputObject, undefined, "\t") //TODO: configure space
321
+ }
322
+
323
+ async function entryPoint() {}
324
+
325
+ entryPoint()
326
+ .then(async () => {
327
+
328
+ await wikidata.readInputSpec(path.join(process.cwd(), specFile));
329
+
330
+ })
331
+ .then(async () => {
332
+
333
+ await wikidata.readCache();
334
+
335
+ })
336
+ .then(async () => {
337
+
338
+ // replace template items using their item-generating queries
339
+ for (var i = wikidata.inputSpec.items.length - 1; i >= 0; --i)
340
+ {
341
+ var templateItem = wikidata.inputSpec.items[i]
342
+ if (templateItem.itemQuery || templateItem.items)
343
+ {
344
+ //TODO: caching for item queries
345
+ wikidata.inputSpec.items.splice(i, 1)
346
+ const newItems = await wikidata.createTemplateItems(templateItem)
347
+ for (const newItem of newItems)
348
+ wikidata.inputSpec.items.push(newItem)
349
+ }
350
+ }
351
+
352
+ })
353
+ .then(() => {
354
+
355
+ var hasError = false;
356
+
357
+ // collect all existing ids
358
+ var itemIds = new Set()
359
+ for (const item of wikidata.inputSpec.items)
360
+ {
361
+ if (item.id)
362
+ {
363
+ if (itemIds.has(item.id))
364
+ {
365
+ console.error(`Item id '${item.id}' appears multiple times.`)
366
+ hasError = true
367
+ }
368
+ else
369
+ {
370
+ itemIds.add(item.id)
371
+ }
372
+ }
373
+ }
374
+
375
+ // generate ids for items that don't have one
376
+ for (const item of wikidata.inputSpec.items)
377
+ {
378
+ if (!item.id)
379
+ {
380
+ const baseId = item.entity ? item.entity : "anonymous"
381
+ var prospectiveId = baseId
382
+ var i = 0
383
+ while (itemIds.has(prospectiveId))
384
+ {
385
+ i++
386
+ prospectiveId = baseId + "-" + i
387
+ }
388
+ item.id = prospectiveId
389
+ itemIds.add(prospectiveId)
390
+ }
391
+ }
392
+
393
+ if (hasError) throw "Error generating item ids."
394
+
395
+ })
396
+ .then(() => {
397
+
398
+ // batch up queries using templates
399
+ //TODO:
400
+
401
+ })
402
+ .then(async () => {
403
+
404
+ // run all the Wikidata queries
405
+
406
+ const isQueryProperty = function(key)
407
+ {
408
+ if (key == "startEndQuery" || key == "startQuery" || key == "endQuery") return true
409
+
410
+ //TODO: look at the queries for parameter names
411
+ return key != "comment"
412
+ && key != "id"
413
+ && key != "content"
414
+ && key != "group"
415
+ && key != "subgroup"
416
+ && key != "itemQuery"
417
+ && key != "type"
418
+ && key != "label"
419
+ && key != "className"
420
+ && key != "entity"
421
+ && key != "skipCache"
422
+ }
423
+
424
+ // bundle items that use the same queries
425
+ const queryBundles = {}
426
+ for (const item of wikidata.inputSpec.items)
427
+ {
428
+ if (item.finished) continue
429
+
430
+ // the bundle key is the queries, as well as any wildcard parameters
431
+ const keyObject = {}
432
+ for (const itemKey in item)
433
+ {
434
+ if (isQueryProperty(itemKey))
435
+ {
436
+ keyObject[itemKey] = item[itemKey]
437
+ }
438
+ }
439
+
440
+ const keyStr = JSON.stringify(keyObject)
441
+ var targetBundle = queryBundles[keyStr]
442
+ if (targetBundle)
443
+ {
444
+ targetBundle.push(item)
445
+ }
446
+ else
447
+ {
448
+ targetBundle = [ item ]
449
+ queryBundles[keyStr] = targetBundle
450
+ }
451
+ }
452
+
453
+ console.log(`There are ${Object.keys(queryBundles).length} query bundles.`)
454
+ for (const bundleKey in queryBundles)
455
+ {
456
+ const bundle = queryBundles[bundleKey]
457
+ console.log(`\tBundle (${bundle.length} items): ${bundleKey}.`)
458
+
459
+ const representativeItem = bundle[0]
460
+
461
+ const copyResult = function(result, func)
462
+ {
463
+ for (const entityId in result)
464
+ {
465
+ var entityResult = result[entityId]
466
+
467
+ // there may be multiple source items making the same query
468
+ for (const item of bundle)
469
+ {
470
+ if (item.entity == entityId)
471
+ {
472
+ if (entityResult instanceof Array)
473
+ {
474
+ // clone the item for each result beyond the first
475
+ for (var i = 1; i < entityResult.length; ++i)
476
+ {
477
+ const newItem = structuredClone(item)
478
+ newItem.id = `${newItem.id}-v${i}`
479
+ wikidata.inputSpec.items.push(newItem) //HACK: modifying original array
480
+ func(newItem, entityResult[i])
481
+ newItem.finished = true
482
+ }
483
+
484
+ // populate the first result into the original item
485
+ entityResult = entityResult[0]
486
+ }
487
+
488
+ func(item, entityResult)
489
+ item.finished = true
490
+ }
491
+ }
492
+ }
493
+ }
494
+
495
+ if (representativeItem.startEndQuery)
496
+ {
497
+ const result = await wikidata.runTimeQueryTerm(representativeItem.startEndQuery, bundle)
498
+ copyResult(result, function(item, result) {
499
+ Object.assign(item, result)
500
+ })
501
+ }
502
+ else
503
+ {
504
+ if (representativeItem.startQuery)
505
+ {
506
+ const result = await wikidata.runTimeQueryTerm(representativeItem.startQuery, bundle)
507
+ copyResult(result, function(item, result) {
508
+ item.start = result.value;
509
+ item.start_min = result.min;
510
+ item.start_max = result.max;
511
+ })
512
+ }
513
+ if (representativeItem.endQuery)
514
+ {
515
+ const result = await wikidata.runTimeQueryTerm(representativeItem.endQuery, bundle)
516
+ copyResult(result, function(item, result) {
517
+ item.end = result.value;
518
+ item.end_min = result.min;
519
+ item.end_max = result.max;
520
+ })
521
+ }
522
+ }
523
+ }
524
+ })
525
+ .then(async () => {
526
+
527
+ await mypath.ensureDirectoryForFile(outputFile)
528
+
529
+ // write the output file
530
+ const output = produceOutput(wikidata.inputSpec.items)
531
+ try {
532
+ await fs.promises.writeFile(outputFile, output)
533
+ } catch (err) {
534
+ console.error(`Error writing output file:`)
535
+ console.error(err)
536
+ }
537
+
538
+ })
539
+ .catch((reason) => {
540
+
541
+ console.error(reason)
542
+
543
+ })
544
+ .finally(async () => { await wikidata.writeCache(); })
@@ -0,0 +1,52 @@
1
+ {
2
+ "queryTemplates":
3
+ {
4
+ "dateOfBirth": "{entity} p:P569 ?_prop. ?_prop psv:P569 ?_value.",
5
+ "dateOfDeath": "{entity} p:P570 ?_prop. ?_prop psv:P570 ?_value.",
6
+
7
+ "inception": "{entity} p:P571 ?_prop. ?_prop psv:P571 ?_value.",
8
+ "dissolved": "{entity} p:P576 ?_prop. ?_prop psv:P576 ?_value.",
9
+
10
+ "startTime": "{entity} p:P580 ?_prop. ?_prop psv:P580 ?_value.",
11
+ "endTime": "{entity} p:P582 ?_prop. ?_prop psv:P582 ?_value.",
12
+
13
+ "pointInTime": "{entity} p:P585 ?_prop. ?_prop psv:P585 ?_value.",
14
+
15
+ "positionHeldStartEnd": {
16
+ "general": "{entity} p:P39 ?_prop. ?_prop ps:P39 {position}.",
17
+ "start": "?_prop pqv:P580 ?_start_value.",
18
+ "start_min": "?_prop pqv:P1319 ?_start_min_value.",
19
+ "start_max": "?_prop pqv:P8555 ?_start_max_value.",
20
+ "end": "?_prop pqv:P582 ?_end_value.",
21
+ "end_min": "?_prop pqv:P8554 ?_end_min_value.",
22
+ "end_max": "?_prop pqv:P12506 ?_end_max_value."
23
+ },
24
+ "positionJurisdictionHeldStartEnd": {
25
+ "general": "{entity} p:P39 ?_prop. ?_prop ps:P39 {position}; pq:P1001 {jurisdiction}.",
26
+ "start": "?_prop pqv:P580 ?_start_value.",
27
+ "start_min": "?_prop pqv:P1319 ?_start_min_value.",
28
+ "start_max": "?_prop pqv:P8555 ?_start_max_value.",
29
+ "end": "?_prop pqv:P582 ?_end_value.",
30
+ "end_min": "?_prop pqv:P8554 ?_end_min_value.",
31
+ "end_max": "?_prop pqv:P12506 ?_end_max_value."
32
+ }
33
+ },
34
+ "itemQueryTemplates":
35
+ {
36
+ "withPosition": "?_node wdt:P39 {position}.",
37
+
38
+ "humansWithPosition": "?_node wdt:P39 {position}. ?_node wdt:P31 wd:Q5.",
39
+ "humansWithPositionJurisdiction": "?_node p:P39 [ ps:P39 {position}; pq:P1001 {jurisdiction} ]. ?_node wdt:P31 wd:Q5."
40
+ },
41
+ "expectations":
42
+ [
43
+ {
44
+ "startQuery": "#dateOfBirth",
45
+ "endQuery": "#dateOfDeath",
46
+ "duration": { "avg": "P80Y", "max": "P150Y" }
47
+ },
48
+ {
49
+ "duration": { "avg": "P100Y" }
50
+ }
51
+ ]
52
+ }
package/src/mypath.js ADDED
@@ -0,0 +1,19 @@
1
+
2
+ const path = require('path')
3
+ const fs = require('fs');
4
+
5
+ module.exports = {
6
+
7
+ ensureDirectoryForFile: async function(filePath)
8
+ {
9
+ const directoryName = path.dirname(filePath);
10
+ try {
11
+ await fs.promises.mkdir(directoryName, { recursive: true })
12
+ }
13
+ catch (err) {
14
+ console.error(`Error creating output directory: ${err}`);
15
+ throw err;
16
+ }
17
+ }
18
+
19
+ }
@@ -0,0 +1,73 @@
1
+
2
+ const assert = require('node:assert/strict')
3
+
4
+ module.exports = class SparqlBuilder
5
+ {
6
+ constructor()
7
+ {
8
+ this.outParams = []
9
+ this.queryTerms = []
10
+ }
11
+
12
+ // Adds an output parameter to the query
13
+ addOutParam(paramName)
14
+ {
15
+ //TODO: validate name more
16
+ assert(paramName)
17
+ if (this.outParams.indexOf(paramName) < 0)
18
+ {
19
+ this.outParams.push(paramName)
20
+ }
21
+ }
22
+
23
+ addQueryTerm(term)
24
+ {
25
+ assert(term)
26
+ this.queryTerms.push(term)
27
+ }
28
+
29
+ addWikibaseLabel(lang)
30
+ {
31
+ if (!lang) lang = "mul"
32
+ this.addQueryTerm(`SERVICE wikibase:label{bd:serviceParam wikibase:language "${lang}".}`)
33
+ }
34
+
35
+ addOptionalQueryTerm(term)
36
+ {
37
+ assert(term)
38
+ this.queryTerms.push(`OPTIONAL{${term}}`)
39
+ }
40
+
41
+ addTimeTerm(term, valueVar, timeVar, precisionVar)
42
+ {
43
+ assert(term)
44
+
45
+ this.addOutParam(timeVar)
46
+ this.addOutParam(precisionVar)
47
+ this.addQueryTerm(`${term} ${valueVar} wikibase:timeValue ${timeVar}. ${valueVar} wikibase:timePrecision ${precisionVar}.`)
48
+ }
49
+
50
+ addOptionalTimeTerm(term, valueVar, timeVar, precisionVar)
51
+ {
52
+ assert(term)
53
+
54
+ this.addOutParam(timeVar)
55
+ this.addOutParam(precisionVar)
56
+ this.addOptionalQueryTerm(`${term} ${valueVar} wikibase:timeValue ${timeVar}. ${valueVar} wikibase:timePrecision ${precisionVar}.`)
57
+ }
58
+
59
+ addCacheBuster(str)
60
+ {
61
+ if (str)
62
+ this.addQueryTerm(`#${str}\n`)
63
+ }
64
+
65
+ build()
66
+ {
67
+ assert(this.outParams.length > 0)
68
+ assert(this.queryTerms.length > 0)
69
+
70
+ //TODO: prevent injection
71
+ return `SELECT ${this.outParams.join(" ")} WHERE{${this.queryTerms.join(" ")}}`
72
+ }
73
+ }
@@ -0,0 +1,484 @@
1
+
2
+ const mypath = require("./mypath.js")
3
+ const fs = require('fs');
4
+ const nodepath = require('node:path');
5
+ const globalData = require("./global-data.json")
6
+ const assert = require('node:assert/strict')
7
+ const SparqlBuilder = require("./sparql-builder.js")
8
+
9
+ const wikidata = module.exports = {
10
+
11
+ inputSpec: null,
12
+ verboseLogging: false,
13
+
14
+ cache: {},
15
+ skipCache: false,
16
+ cacheBuster: undefined,
17
+ termCacheFile: "intermediate/wikidata-term-cache.json",
18
+
19
+ sparqlUrl: "https://query.wikidata.org/sparql",
20
+ lang: "en,mul",
21
+
22
+ rankDeprecated: "http://wikiba.se/ontology#DeprecatedRank",
23
+ rankNormal: "http://wikiba.se/ontology#NormalRank",
24
+ rankPreferred: "http://wikiba.se/ontology#PreferredRank",
25
+
26
+ initialize: function()
27
+ {
28
+ const chroniclePackage = require("../package.json")
29
+ this.options = {
30
+ method: 'POST',
31
+ headers: {
32
+ 'User-Agent': `vis-chronicle/${chroniclePackage.version} (https://github.com/BrianMacIntosh/vis-chronicle) Node.js/${process.version}`,
33
+ 'Content-Type': 'application/sparql-query',
34
+ 'Accept': 'application/sparql-results+json'
35
+ }
36
+ }
37
+ },
38
+
39
+ setLang: function(inLang)
40
+ {
41
+ //TODO: escape
42
+ this.lang = inLang
43
+ },
44
+
45
+ readInputSpec: async function(path)
46
+ {
47
+ const pathStat = await fs.promises.stat(path)
48
+ if (pathStat.isDirectory())
49
+ {
50
+ // read multi-file config
51
+ this.inputSpec = {}
52
+
53
+ const files = await fs.promises.readdir(path)
54
+ for (const fileName of files)
55
+ {
56
+ if (fileName.endsWith(".json"))
57
+ {
58
+ const contents = await fs.promises.readFile(nodepath.join(path, fileName))
59
+ const specPart = JSON.parse(contents)
60
+ for (const key in specPart)
61
+ {
62
+ if (specPart[key] instanceof Array)
63
+ {
64
+ if (this.inputSpec[key])
65
+ {
66
+ this.inputSpec[key].push(...specPart[key])
67
+ }
68
+ else
69
+ {
70
+ this.inputSpec[key] = specPart[key]
71
+ }
72
+ }
73
+ else if (this.inputSpec[key])
74
+ {
75
+ throw `Key '${key}' appears in multiple input spec files.`
76
+ }
77
+ else
78
+ {
79
+ this.inputSpec[key] = specPart[key]
80
+ }
81
+ }
82
+ }
83
+ }
84
+ }
85
+ else
86
+ {
87
+ // read single-file config
88
+ const contents = await fs.promises.readFile(path)
89
+ this.inputSpec = JSON.parse(contents)
90
+ }
91
+ },
92
+
93
+ readCache: async function()
94
+ {
95
+ try
96
+ {
97
+ const contents = await fs.promises.readFile(this.termCacheFile)
98
+ this.cache = JSON.parse(contents)
99
+ }
100
+ catch
101
+ {
102
+ // cache doesn't exist or is invalid; continue without it
103
+ }
104
+ },
105
+
106
+ writeCache: async function()
107
+ {
108
+ await mypath.ensureDirectoryForFile(this.termCacheFile)
109
+
110
+ try {
111
+ await fs.promises.writeFile(this.termCacheFile, JSON.stringify(this.cache))
112
+ } catch (err) {
113
+ console.error(`Error writing wikidata cache:`)
114
+ console.error(err)
115
+ }
116
+ },
117
+
118
+ getQueryTemplate: function(templateName, templateSetName)
119
+ {
120
+ assert(templateName)
121
+ assert(templateSetName)
122
+ if (this.inputSpec[templateSetName] && this.inputSpec[templateSetName][templateName])
123
+ {
124
+ return this.inputSpec[templateSetName][templateName]
125
+ }
126
+ else if (globalData && globalData[templateSetName] && globalData[templateSetName][templateName])
127
+ {
128
+ return globalData[templateSetName][templateName]
129
+ }
130
+ else
131
+ {
132
+ return undefined
133
+ }
134
+ },
135
+
136
+ postprocessQueryTerm: function(context, term, item)
+ {
+ if (!term)
+ {
+ return term
+ }
+
+ // replace query wildcards
+ for (const key in item)
+ {
+ var insertValue = item[key]
+ // only prefix actual QIDs (Q followed by digits) with the wd: namespace
+ if (typeof insertValue === "string" && /^Q\d+$/.test(insertValue))
+ insertValue = "wd:" + insertValue
+ term = term.replaceAll(`{${key}}`, insertValue)
+ }
+
+ // detect unreplaced wildcards
+ //TODO:
+
+ // terminate term
+ if (!term.trim().endsWith("."))
+ {
+ term += "."
+ }
+
+ return term
+ },
+
+ // Dereferences the query term if it's a pointer to a template.
+ // Expects simple string terms (start or end)
+ getQueryTermHelper: function(inQueryTerm, item, templateSetName)
+ {
+ var queryTerm
+
+ if (inQueryTerm.startsWith("#"))
+ {
+ const templateName = inQueryTerm.substring(1).trim()
+ var queryTemplate = this.getQueryTemplate(templateName, templateSetName)
+ if (queryTemplate)
+ {
+ queryTerm = queryTemplate
+ }
+ else
+ {
+ throw new Error(`Query template '${templateName}' not found (on item ${item.id}).`)
+ }
+ }
+ else
+ {
+ queryTerm = inQueryTerm
+ }
+
+ //TODO: validate query has required wildcards
+
+ queryTerm = this.postprocessQueryTerm(inQueryTerm, queryTerm, item)
+ return queryTerm
+ },
+
+ // Dereferences the query term if it's a pointer to a template.
+ // Expects item-generating terms
+ getItemQueryTerm: function(queryTerm, item)
+ {
+ return this.getQueryTermHelper(queryTerm, item, "itemQueryTemplates")
+ },
+
+ // Dereferences the query term if it's a pointer to a template.
+ getValueQueryTerm: function(inQueryTerm, item)
+ {
+ var queryTerm
+
+ // the term may be an object of sub-terms rather than a string, so check before calling startsWith
+ if (typeof inQueryTerm === "string" && inQueryTerm.startsWith("#"))
+ {
+ const templateName = inQueryTerm.substring(1).trim()
+ var queryTemplate = this.getQueryTemplate(templateName, "queryTemplates")
+ if (queryTemplate)
+ {
+ queryTerm = queryTemplate
+ }
+ else
+ {
+ throw new Error(`Query template '${templateName}' not found (on item ${item.id}).`)
+ }
+ }
+ else
+ {
+ queryTerm = inQueryTerm
+ }
+
+ if (typeof queryTerm === 'string' || queryTerm instanceof String)
+ {
+ return {
+ value: this.postprocessQueryTerm(inQueryTerm, queryTerm, item),
+ min: "?_prop pqv:P1319 ?_min_value.",
+ max: "?_prop pqv:P1326 ?_max_value."
+ }
+ }
+ else
+ {
+ const result = {}
+ for (const key in queryTerm)
+ {
+ result[key] = this.postprocessQueryTerm(inQueryTerm, queryTerm[key], item)
+ }
+ return result
+ }
+ },
+
+ extractQidFromUrl: function(url)
+ {
+ const lastSlashIndex = url.lastIndexOf("/")
+ if (lastSlashIndex >= 0)
+ return url.substring(lastSlashIndex + 1)
+ else
+ return url
+ },
+
+ // Sequential list of filters to narrow down a list of bindings to the best result
+ bindingFilters: [
+ (binding, index, array) => {
+ return binding._rank.value != wikidata.rankDeprecated;
+ },
+ (binding, index, array) => {
+ return binding._rank.value === wikidata.rankPreferred;
+ },
+ ],
+
+ // From the specified set of bindings, returns only the best ones to use
+ filterBestBindings: function(inBindings)
+ {
+ // apply each filter in turn, keeping the last non-empty result set
+ var lastBindings = inBindings.slice()
+ for (const filter of this.bindingFilters)
+ {
+ const workingBindings = lastBindings.filter(filter)
+ if (workingBindings.length == 0)
+ {
+ break
+ }
+ lastBindings = workingBindings
+ }
+ return lastBindings
+ },
+
+ // runs a SPARQL query for time values on an item or set of items
+ runTimeQueryTerm: async function (queryTermStr, items)
+ {
+ const entityVarName = '_entity'
+ const entityVar = `?${entityVarName}`
+ const propVar = '?_prop'
+ const rankVar = '?_rank'
+
+ const queryBuilder = new SparqlBuilder()
+ queryBuilder.addCacheBuster(this.cacheBuster)
+
+ // create a dummy item representing the collective items
+ //TODO: validate that they match
+ const item = { ...items[0] }
+ item.entity = entityVar
+ item.id = "DUMMY"
+
+ const queryTerm = this.getValueQueryTerm(queryTermStr, item)
+
+ // assemble query targets
+ const targetEntities = new Set()
+ for (const item of items)
+ {
+ targetEntities.add(`wd:${item.entity}`)
+ }
+ queryBuilder.addQueryTerm(`VALUES ${entityVar}{${[...targetEntities].join(' ')}}`)
+ queryBuilder.addOutParam(entityVar)
+
+ queryBuilder.addOutParam(rankVar)
+ if (queryTerm.general)
+ {
+ queryBuilder.addQueryTerm(queryTerm.general)
+ }
+ if (queryTerm.value) //TODO: could unify better with loop below?
+ {
+ queryBuilder.addTimeTerm(queryTerm.value, "?_value", "?_value_ti", "?_value_pr")
+ }
+ for (const termKey in queryTerm) //TODO: only pass recognized terms
+ {
+ if (termKey == "general" || termKey == "value") continue
+ queryBuilder.addOptionalTimeTerm(queryTerm[termKey], `?_${termKey}_value`, `?_${termKey}_ti`, `?_${termKey}_pr`)
+ }
+ queryBuilder.addOptionalQueryTerm(`${propVar} wikibase:rank ${rankVar}.`)
+
+ const query = queryBuilder.build()
+
+ // read cache
+ const cacheKey = query
+ if (!this.skipCache && !item.skipCache && this.cache[cacheKey])
+ {
+ return this.cache[cacheKey]
+ }
+
+ const data = await this.runQuery(query)
+ if (!data)
+ {
+ // query failed; runQuery already logged the error
+ return {}
+ }
+ console.log(`\tQuery for ${item.id} returned ${data.results.bindings.length} results.`)
+
+ const readBinding = function(binding)
+ {
+ const result = {}
+ for (const termKey in queryTerm) //TODO: only use recognized terms
+ {
+ if (binding[`_${termKey}_ti`])
+ {
+ result[termKey] = {
+ value: binding[`_${termKey}_ti`].value,
+ precision: parseInt(binding[`_${termKey}_pr`].value)
+ }
+ }
+ }
+ return result
+ }
+
+ const readBindings = function(bindings)
+ {
+ const results = []
+ for (const binding of bindings)
+ {
+ results.push(readBinding(binding))
+ }
+ return results
+ }
+
+ // arrays of bindings, grouped by entity id
+ const bindingsByEntity = {}
+
+ // sort out the bindings by entity
+ for (const binding of data.results.bindings)
+ {
+ // read entity id
+ assert(binding[entityVarName].type == 'uri')
+ const entityId = this.extractQidFromUrl(binding[entityVarName].value)
+
+ // get array for entity-specific results
+ var entityBindings = bindingsByEntity[entityId]
+ if (!entityBindings)
+ {
+ entityBindings = []
+ bindingsByEntity[entityId] = entityBindings
+ }
+
+ entityBindings.push(binding)
+ }
+
+ // filter results down to best per entity
+ const result = {}
+ for (const entityId in bindingsByEntity)
+ {
+ const entityBindings = bindingsByEntity[entityId]
+ if (entityBindings.length == 1)
+ {
+ result[entityId] = readBinding(entityBindings[0])
+ }
+ else // entityBindings.length > 1
+ {
+ var lastBindings = this.filterBestBindings(entityBindings)
+ result[entityId] = readBindings(lastBindings)
+ }
+ }
+
+ this.cache[cacheKey] = result
+ return result
+ },
+
+ // runs a SPARQL query that generates multiple items
+ createTemplateItems: async function(templateItem)
+ {
+ //TODO: caching
+
+ const itemVarName = "_node"
+ const itemVar = `?${itemVarName}`
+ templateItem.entity = itemVar
+
+ var itemQueryTerm
+ if (templateItem.itemQuery)
+ {
+ if (templateItem.items)
+ {
+ console.warn(`Template item '${templateItem.comment}' has both 'itemQuery' and 'items': 'items' will be ignored.`)
+ }
+ itemQueryTerm = this.getItemQueryTerm(templateItem.itemQuery, templateItem)
+ }
+ else if (templateItem.items)
+ {
+ itemQueryTerm = `VALUES ${itemVar}{${templateItem.items.map(id => `wd:${id}`).join(' ')}}`
+ }
+ else
+ {
+ console.error(`Template item '${templateItem.comment}' has no 'itemQuery' or 'items'.`)
+ return []
+ }
+
+ const queryBuilder = new SparqlBuilder()
+ queryBuilder.addCacheBuster(this.cacheBuster)
+ queryBuilder.addOutParam(itemVar)
+ queryBuilder.addOutParam(itemVar + "Label")
+ queryBuilder.addQueryTerm(itemQueryTerm)
+ queryBuilder.addWikibaseLabel(this.lang)
+
+ const query = queryBuilder.build()
+ const data = await this.runQuery(query)
+ if (!data)
+ {
+ // query failed; runQuery already logged the error
+ return []
+ }
+
+ const newItems = []
+
+ // clone the template for each result
+ for (const binding of data.results.bindings)
+ {
+ const newItem = structuredClone(templateItem)
+ delete newItem.comment
+ delete newItem.itemQuery
+ delete newItem.items
+ newItem.entity = this.extractQidFromUrl(binding[itemVarName].value)
+ newItem.generated = true
+ const wikidataLabel = binding[itemVarName + "Label"].value
+ newItem.label = templateItem.label
+ ? templateItem.label.replaceAll("{_LABEL}", wikidataLabel).replaceAll("{_QID}", newItem.entity)
+ : `<a target="_blank" href="https://www.wikidata.org/wiki/${newItem.entity}">${wikidataLabel}</a>`
+ newItems.push(newItem)
+ }
+
+ templateItem.finished = true
+ console.log(`Item template '${templateItem.comment}' created ${newItems.length} items.`)
+
+ return newItems
+ },
+
+ // runs a SPARQL query
+ runQuery: async function(query)
+ {
+ if (this.verboseLogging) console.log(query)
+
+ assert(this.options)
+ const options = { ...this.options, body: query }
+ const response = await fetch(this.sparqlUrl, options)
+ if (response.status != 200)
+ {
+ console.log(response)
+ console.error(`${response.status}: ${response.statusText}`)
+ return null
+ }
+ else
+ {
+ const data = await response.json()
+ return data
+ }
+ }
+ }
@@ -0,0 +1,40 @@
+ .vis-item.vis-range.visc-open-right
+ {
+ mask-image: linear-gradient(to right, rgba(0, 0, 0, 1) 50%, rgba(0, 0, 0, 0));
+ }
+ .vis-item.vis-range.visc-open-left
+ {
+ mask-image: linear-gradient(to right, rgba(0, 0, 0, 0), rgba(0, 0, 0, 1) 50%);
+ }
+ .vis-item.vis-range.visc-open-both
+ {
+ mask-image: linear-gradient(to right, rgba(0, 0, 0, 0), rgba(0, 0, 0, 1) 50%, rgba(0, 0, 0, 0));
+ }
+ .visc-uncertain
+ {
+ background: repeating-linear-gradient(45deg, #00000060, #00000060 8px, transparent 8px, transparent 16px);
+ }
+ .vis-item.vis-range.visc-right-connection, .vis-item.vis-range.visc-left-tail
+ {
+ border-right-style: none;
+ border-top-right-radius: 0;
+ border-bottom-right-radius: 0;
+ }
+ .vis-item.vis-range.visc-right-tail
+ {
+ mask-image: linear-gradient(to right, rgba(0, 0, 0, 1), rgba(0, 0, 0, 0));
+ }
+ .vis-item.vis-range.visc-left-connection, .vis-item.vis-range.visc-right-tail
+ {
+ border-left-style: none;
+ border-top-left-radius: 0;
+ border-bottom-left-radius: 0;
+ }
+ .vis-item.vis-range.visc-left-tail
+ {
+ mask-image: linear-gradient(to right, rgba(0, 0, 0, 0), rgba(0, 0, 0, 1));
+ }
+ .vis-item-content a {
+ color: inherit;
+ text-decoration: inherit;
+ }