eyeling 1.22.16 → 1.23.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/HANDBOOK.md CHANGED
@@ -27,7 +27,7 @@
  - [Epilogue](#epilogue)
  - [Appendix A — Eyeling user notes](#app-a)
  - [Appendix B — Notation3: when facts can carry their own logic](#app-b)
- - [Appendix C — N3 beyond Prolog: logic that survives the open web](#app-c)
+ - [Appendix C — Why N3 fits the Eyeling examples](#app-c)
  - [Appendix D — LLM + Eyeling: A Repeatable Logic Toolchain](#app-d)
  - [Appendix E — How Eyeling reaches 100% on `notation3tests`](#app-e)
  - [Appendix F — The ARC approach: Answer • Reason Why • Check](#app-f)
@@ -1871,7 +1871,7 @@ echo '@prefix : <http://example.org/> .
  { ?x a :Man } => { ?x a :Mortal } .' | npx eyeling
  ```

- You can also pass a file path, or `-` to read explicitly from stdin.
+ You can also pass one or more file paths/URLs, or `-` to read explicitly from stdin. When multiple inputs are given, Eyeling parses each source separately, merges the parsed ASTs, and then runs one reasoning pass over the combined facts and rules. This avoids constructing one giant N3 source string.

  Show the available options:

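The parse-then-merge flow described above can be sketched with a toy stand-in for the real pipeline. `toyParse` and `toyMerge` below are illustrative names, not Eyeling APIs; the mini "parser" just sorts lines into fact and rule buckets to show why merging parsed structures avoids concatenating N3 strings.

```javascript
// Toy sketch (illustrative only): parse each source separately, then merge the
// parsed structures so one reasoning pass can see facts and rules from all sources.
function toyParse(text) {
  const facts = [];
  const rules = [];
  for (const line of text.split('\n')) {
    const t = line.trim();
    if (!t) continue;
    // A crude heuristic: anything containing "=>" is treated as a rule.
    if (t.includes('=>')) rules.push(t);
    else facts.push(t);
  }
  return { facts, rules };
}

function toyMerge(parsedDocs) {
  // One combined document: rules from any source can match facts from any other.
  return parsedDocs.reduce(
    (acc, doc) => ({
      facts: acc.facts.concat(doc.facts),
      rules: acc.rules.concat(doc.rules),
    }),
    { facts: [], rules: [] },
  );
}

const merged = toyMerge([
  toyParse(':Socrates a :Man .'),
  toyParse('{ ?x a :Man } => { ?x a :Mortal } .'),
]);

console.log(merged.facts.length, merged.rules.length); // 1 1
```

The real engine merges full ASTs rather than strings, but the shape of the flow is the same: parse per source, concatenate the parsed pieces, reason once.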
@@ -1895,7 +1895,7 @@ npx eyeling --builtin lib/builtin-sudoku.js examples/sudoku.n3

  The bundle contains the whole engine. The CLI path is the “canonical behavior”:

- - parse input file
+ - parse one or more input sources; with multiple sources, parse each source independently and merge the ASTs
  - reason to closure
  - print derived triples, or render `log:outputString` strings when present
  - optional proof comments
@@ -1916,6 +1916,7 @@ The current CLI supports a small set of flags (see `lib/cli.js`):
  - `-h`, `--help` — show usage.
  - With no positional argument, Eyeling reads from stdin when input is piped.
  - Use `-` as the input path to read explicitly from stdin.
+ - Multiple positional inputs are allowed, for example `eyeling facts.n3 rules.n3`; rules from any input can match facts from any other input after the merge.

  ### 14.3 Package entrypoint split for Node, browser, and CLI

@@ -2024,13 +2025,34 @@ Notes:

  #### 14.4.2 RDF-JS and Eyeling rule-object interoperability

- The JavaScript APIs accept three input styles:
+ The JavaScript APIs accept four input styles:

  1. plain N3 text
- 2. RDF/JS fact input (`quads`, `facts`, or `dataset`)
- 3. Eyeling rule objects or full AST bundles
+ 2. a multi-source N3 object (`{ sources: [...] }`)
+ 3. RDF/JS fact input (`quads`, `facts`, or `dataset`)
+ 4. Eyeling rule objects or full AST bundles

- If you want to use N3 source text, pass the whole input as a plain string.
+ If you want to use one N3 source text, pass the whole input as a plain string. If you want to avoid concatenating several N3 sources into one large string, pass them as a source list instead.
+
+ For example:
+
+ ```js
+ const { reason } = require('eyeling');
+
+ const out = reason(
+   { proofComments: false },
+   {
+     sources: [
+       '@prefix : <http://example.org/> .\n:Socrates a :Man .\n',
+       '@prefix : <http://example.org/> .\n{ ?x a :Man } => { ?x a :Mortal } .\n',
+     ],
+   },
+ );
+
+ console.log(out);
+ ```
+
+ In a source list, each source is parsed with its own blank-node scope and optional base IRI. That means the same explicit blank label, such as `_:x`, in two different sources does not accidentally become the same blank node after merging. Prefix declarations are merged mainly for readable output; IRI expansion has already happened while each source was parsed.

  For RDF/JS facts, the graph must be the default graph. Named-graph quads are rejected.

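The per-source blank-node scoping can be illustrated with a small relabeling helper modeled on the `sourceBlankPrefix` scheme in the bundled `lib/multisource.js`. This is a sketch of the idea, not the engine's actual code path:

```javascript
// Sketch: relabel a blank-node label so that it is unique to its source.
// The "_:src<N>_" prefix mirrors the scheme used in lib/multisource.js.
function scopeLabel(label, sourceIndex) {
  const bare = label.startsWith('_:') ? label.slice(2) : label;
  return `_:src${sourceIndex}_${bare}`;
}

// The same explicit label `_:x` in two different sources stays distinct
// after merging, so the sources cannot accidentally share a blank node.
const a = scopeLabel('_:x', 1);
const b = scopeLabel('_:x', 2);
console.log(a, b, a !== b); // _:src1_x _:src2_x true
```

This is why a source list behaves like parsing separate documents rather than one concatenated document: blank-node identity never leaks across source boundaries.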
@@ -2467,6 +2489,12 @@ The authoritative list is always:
  eyeling --help
  ```

+ Usage:
+
+ ```bash
+ eyeling [options] [file-or-url.n3|- ...]
+ ```
+
  Options:

  ```
@@ -2481,6 +2509,8 @@ Options:
  -v, --version Print version and exit.
  ```

+ Input note: with multiple positional inputs, Eyeling reads and parses each source separately, then merges facts, forward rules, backward rules, and `log:query` directives before reasoning. Blank node labels are scoped per input document.
+
  Note: when `log:query` directives are present, or when the program may produce `log:outputString` facts, Eyeling cannot stream its final user-facing output from partial derivations, so `--stream` has no effect in those cases. In the latter case Eyeling saturates first and then renders the collected output strings.

  See also:
@@ -2631,23 +2661,69 @@ In that sense, N3 is less a bid to make the web “smarter” than a bid to make

  <a id="app-c"></a>

- ## Appendix C — N3 beyond Prolog: logic that survives the open web
+ ## Appendix C — Why N3 fits the Eyeling examples
+
+ The Eyeling examples combine several things at once. They contain facts about a situation, rules that derive new facts, checks that make the result testable, and an answer that can be shown to a human. That combination matters. It means that an Eyeling example is not only a data exercise and not only a logic exercise. It needs a notation in which data and rules can remain together.
+
+ This raises a practical question: which language fits these examples best?
+
+ SQL is a natural candidate when the main task is storing data and querying it. Prolog is a natural candidate when the main task is writing rules and deriving consequences from facts. N3 is interesting because it tries to keep those two sides together. The point of this appendix is not to rank SQL, Prolog, and N3 in general. The point is to explain why N3 works especially well for Eyeling-style examples.
+
+ ### What the examples need
+
+ A typical Eyeling example is not just a small dataset. It is also not just a set of inference rules. It is a compact artifact in which several layers belong together.
+
+ There is usually a description of a situation: products, airports, organisms, policies, signatures, dates, or other entities. There are rules that derive new facts from those inputs. There are explicit checks that say whether the intended conclusions hold. And there is often a final answer or explanation that is part of the example itself.
+
+ This is the real design problem. If the language handles only one of these layers well, then the example has to be split up. The data ends up in one notation, the rules in another, the checks somewhere else, and the final answer in yet another place. Once that happens, the example becomes harder to read and harder to maintain.
+
+ ### What SQL contributes
+
+ SQL is strong when the main task is structured data and queries over that data. It is excellent for tables, filtering, aggregation, joins, and efficient execution. When an Eyeling example is translated into DuckDB, SQL can do a surprising amount. Recursive queries can express route search. Views can express derived facts. Checks can be written as boolean queries. Output can be assembled from query results.
+
+ That is useful, and it shows that the examples can be operationalized in a relational setting.
+
+ However, SQL is not the original shape of these examples. To get there, the graph-like source has to be mapped into tables, and the rule logic has to be reconstructed with joins, common table expressions, macros, and views. The result can work well, but the conceptual structure becomes more indirect. Data and reasoning are still connected, but they are no longer expressed in the same native form.
+
+ In other words, SQL is a good execution target, but it is not always the clearest authoring language for this kind of material.
+
+ ### What Prolog contributes
+
+ Prolog is strong when the main task is expressing facts and rules directly. An Eyeling example often looks much closer to Prolog than to SQL once the focus shifts to derivation. Facts become predicates. Rules become clauses. Recursive reasoning becomes natural. This makes Prolog a very good target when the aim is to express the logical behavior of an example clearly.
+
+ That is why Prolog translations of Eyeling examples often feel much cleaner than SQL translations. The rule layer fits naturally.
+
+ However, Prolog is not primarily a graph data notation. Eyeling examples often use a linked-data style in which named entities and relations remain visible as part of the knowledge representation. In Prolog this can certainly be modeled, but it is usually represented as application-specific predicates rather than as a graph-native notation. That means the rule side is natural, while the original data style becomes less central.
+
+ So Prolog captures the inference well, but it does not preserve the same linked-data feel as naturally as N3 does.
+
+ ### Why N3 fits these examples well
+
+ N3 fits Eyeling well because it keeps the data model and the rule model close together.
+
+ The facts remain graph-shaped. Entities and relations can be written directly. Rules can be added in the same notation. Checks can be expressed next to the derivations they depend on. Even the final answer can remain part of the same artifact. This allows an example to stay compact from beginning to end.
+
+ That compactness is important. It means the reader can inspect one example and see the situation, the derivation, the checks, and the answer without mentally switching between several different layers of representation.
+
+ This is the main reason N3 feels like a sweet spot in Eyeling. Compared with SQL, it avoids the split between graph-shaped knowledge and relational encoding. Compared with Prolog, it avoids the split between logic programming and linked-data representation. It keeps both sides close enough that the whole example can stay in one place.
+
+ ### Where this matters in practice

- Notation3 and Prolog resemble each other at first glance. Both use variables, unification, and implication-shaped rules. Both can be used to derive consequences from a set of premises. But they were shaped for different native environments. Prolog was designed around executing logic programs. N3 was designed around expressing logic in the same world as RDF: a world of IRIs, triples, graphs, linked documents, and shared vocabularies. That difference in native setting gives N3 a wider built-in reach for web-scale knowledge work.
+ This matters most in examples where the structure of the knowledge is part of the point.

- The first difference is the data model. In Prolog, the basic units are terms and clauses. In N3, the basic units are RDF-style triples and graphs, extended with variables, lists, graph terms, and logical implication. N3 is explicitly defined as an extension of RDF, not as a separate programming language that merely happens to manipulate graphs. That means identifiers, data, and inferred results all live in one interoperable form. A rule consumes graph patterns and produces graph patterns, so the output of reasoning can be published, merged, quoted, exchanged, and reasoned over again without changing representation.
+ In the path-discovery example, the facts describe airports and routes, and the rule describes how a connection can be found through a bounded number of stopovers. In SQL, this becomes a recursive query over tables. In Prolog, it becomes a recursive predicate over facts. In N3, the graph and the rule remain in one notation.

- The second difference is more profound: N3 can treat graphs as first-class terms. A graph term is a quoted graph. It can appear in subject, predicate, or object position, and it does not automatically assert its contents as true. It is a resource in its own right. This makes it natural to represent claims, reports, beliefs, policies, source snapshots, and provenance. One can say not only “this is true,” but also “this document says this,” “this authority concludes that,” or “this rule applies to that quoted context.” In other words, N3 can reason about statements as statements, not only about the world those statements describe.
+ In the barley-seed-becoming example, the facts describe stages, transitions, and constraints, and the rules determine what can and cannot become something else. In SQL and Prolog, this can be translated, but N3 preserves the original structure more directly.

- That leads directly to a third advantage: scope is part of the logic. In N3, implicit universal quantification is global, while implicit existential quantification is local to the graph in which it occurs. Once graphs can be nested and quoted, that distinction matters. It allows a reasoner to keep track of what is asserted here, what is asserted inside a quoted subgraph, and what variables belong to which level. This gives N3 a native way to express context-sensitive reasoning that would feel external or encoded-by-convention in classic Prolog.
+ In the delfour example, the same pattern becomes even clearer. The example combines facts about products and household needs, rules about authorization and recommendation, checks over the derived conclusions, and a final human-readable answer. That kind of example is exactly where a language that keeps data, rules, checks, and answers together becomes valuable.

- A fourth difference is that N3 connects logic to the web inside the language itself. The N3 vocabulary includes `log:semantics` and `log:conclusion` for retrieving and reasoning over local or online sources. It also includes scoped operators such as `log:forAllIn` and `log:notIncludes`, which support scoped universal reasoning and scoped negation-as-failure over a chosen graph or document. So web retrieval, source-bounded reasoning, and reasoning over quoted content are not afterthoughts. They are part of the logic vocabulary. This is one of the clearest senses in which N3 has greater inherent potential: it is not only a language for deriving consequences from facts, but a language for locating, quoting, constraining, and comparing the sources of those facts.
+ ### Conclusion

- By contrast, classic Prolog is centered on a proof procedure over definite clauses. In the usual operational reading, Prolog applies a restricted form of SLD-resolution, chooses the leftmost goal, tries clauses top-to-bottom, and searches depth-first. Traditional negation is negation-as-failure, described in the SWI-Prolog glossary as a weak negation. This makes Prolog extremely effective as an executable logic language, but it also means that much of its practical power comes from the operational behavior of the engine, not from a web-native knowledge model.
+ N3 is not the best language for every task. SQL is stronger as a database query language. Prolog is stronger as a pure rule language. But the Eyeling examples are not only database exercises and not only rule exercises.

- This does not make Prolog inferior. Modern Prolog systems add important capabilities beyond the classic core, including tabling, which improves termination behavior and can make Prolog behave more like a bottom-up theorem prover for some classes of problems. But that actually sharpens the comparison: when Prolog needs to work more comfortably in graph-heavy or knowledge-integration settings, it typically gains that power through extensions, libraries, or host-environment choices. N3 starts there. Graphs, quoted formulas, scoped reasoning, and web-connected semantics are part of its native conceptual center.
+ They are compact knowledge artifacts in which facts, rules, checks, and answers belong together.

- So the strongest claim is not that N3 replaces Prolog in every domain. The stronger and more precise claim is this: Prolog is a very strong language for executing logic programs, while N3 has greater inherent potential as a language for publishable, linkable, inspectable, and source-aware logic on the open web. It lets facts, rules, quoted claims, retrieved documents, and derived conclusions all live in one semantic space. For a system like Eyeling, that is not a cosmetic difference. It is the reason the language can scale from local inference to web-shaped reasoning without changing its basic form.
+ That is why N3 fits them so well. It is not because N3 wins an abstract language competition. It is because these examples need a form in which data and reasoning can remain unified. For Eyeling, that is exactly what N3 provides.

  ---

@@ -3686,7 +3762,7 @@ Taken together, these positions support a straightforward attitude toward Eyelin

  ## Appendix K — Whitehead-inspired becoming examples

- A small family of examples in the repository explores a common idea: that logic can describe not only what **is** the case, but what a thing, system, lineage, or device can **become**. The inspiration is Whiteheadian in a broad sense. The examples do not attempt to formalize Whitehead’s metaphysics as scholarship. Instead, they borrow one guiding intuition from it: reality is often better understood as a structured passage from one state to another than as a mere inventory of static objects.
+ A small family of examples in the repository (`examples/*-becoming.n3`) explores a common idea: that logic can describe not only what **is** the case, but what a thing, system, lineage, or device can **become**. The inspiration is Whiteheadian in a broad sense. The examples do not attempt to formalize Whitehead’s metaphysics as scholarship. Instead, they borrow one guiding intuition from it: reality is often better understood as a structured passage from one state to another than as a mere inventory of static objects.

  In N3 terms, this means the examples are written so that rules describe **state-transition potential**. Earlier examples in the handbook often use predicates such as `:can`, `:cannot`, `:supports`, or `:requires`. The becoming family shifts the emphasis toward predicates such as `:canBecome` and `:cannotBecome`, along with intermediate states such as protected dormancy, germination, negative differential response, or adaptive persistence. This is still ordinary Horn-style reasoning. The novelty is not in the engine, but in the modeling style.

@@ -4935,6 +4935,7 @@
  const engine = require('./engine');
  const deref = require('./deref');
  const { PrefixEnv } = require('./prelude');
+ const { parseN3Text, mergeParsedDocuments } = require('./multisource');

  function offsetToLineCol(text, offset) {
    const chars = Array.from(text);
@@ -5018,8 +5019,9 @@

  function printHelp(toStderr = false) {
    const msg =
-     `Usage: ${prog} [options] [file.n3|-]\n\n` +
-     `When no file is given and stdin is piped, read N3 from stdin.\n\n` +
+     `Usage: ${prog} [options] [file-or-url.n3|- ...]\n\n` +
+     `When no file is given and stdin is piped, read N3 from stdin.\n` +
+     `When multiple inputs are given, parse each source separately, merge ASTs, then reason once.\n\n` +
      `Options:\n` +
      ` -a, --ast Print parsed AST as JSON and exit.\n` +
      ` --builtin <module.js> Load a custom builtin module (repeatable).\n` +
@@ -5063,7 +5065,7 @@
      builtinModules.push(a.slice('--builtin='.length));
      continue;
    }
-   if (!a.startsWith('-')) positional.push(a);
+   if (a === '-' || !a.startsWith('-')) positional.push(a);
  }

  const showAst = argv.includes('--ast') || argv.includes('-a');
@@ -5089,17 +5091,12 @@
    if (typeof engine.setSuperRestrictedMode === 'function') engine.setSuperRestrictedMode(true);
  }

- // Positional args (the N3 file)
+ // Positional args (one or more N3 sources).
  const useImplicitStdin = positional.length === 0 && !process.stdin.isTTY;
  if (positional.length === 0 && !useImplicitStdin) {
    printHelp(false);
    process.exit(0);
  }
- if (positional.length > 1) {
-   console.error('Error: expected at most one input [file.n3|-].');
-   printHelp(true);
-   process.exit(1);
- }

  for (const spec of builtinModules) {
    try {
@@ -5113,35 +5110,47 @@
    }
  }

- const sourceLabel = useImplicitStdin || positional[0] === '-' ? '<stdin>' : positional[0];
- const baseIri = __sourceLabelToBaseIri(sourceLabel);
-
- let text;
- try {
-   text = __readInputSourceSync(sourceLabel);
- } catch (e) {
-   if (sourceLabel === '<stdin>') console.error(`Error reading stdin: ${e.message}`);
-   else console.error(`Error reading file ${JSON.stringify(sourceLabel)}: ${e.message}`);
+ const sourceLabels = useImplicitStdin ? ['<stdin>'] : positional.map((item) => (item === '-' ? '<stdin>' : item));
+ if (sourceLabels.filter((item) => item === '<stdin>').length > 1) {
+   console.error('Error: stdin can only be used once.');
    process.exit(1);
  }

- let toks;
- let prefixes, triples, frules, brules, qrules;
- try {
-   toks = engine.lex(text);
-   const parser = new engine.Parser(toks);
-   if (baseIri) parser.prefixes.setBase(baseIri);
-   [prefixes, triples, frules, brules, qrules] = parser.parseDocument();
-   // Make the parsed prefixes available to log:trace output (CLI path)
-   engine.setTracePrefixes(prefixes);
- } catch (e) {
-   if (e && e.name === 'N3SyntaxError') {
-     console.error(formatN3SyntaxError(e, text, sourceLabel));
+ const parsedSources = [];
+ for (const sourceLabel of sourceLabels) {
+   let text;
+   try {
+     text = __readInputSourceSync(sourceLabel);
+   } catch (e) {
+     if (sourceLabel === '<stdin>') console.error(`Error reading stdin: ${e.message}`);
+     else console.error(`Error reading source ${JSON.stringify(sourceLabel)}: ${e.message}`);
      process.exit(1);
    }
-   throw e;
+
+   try {
+     parsedSources.push(
+       parseN3Text(text, {
+         baseIri: __sourceLabelToBaseIri(sourceLabel),
+         label: sourceLabel,
+       }),
+     );
+   } catch (e) {
+     if (e && e.name === 'N3SyntaxError') {
+       console.error(formatN3SyntaxError(e, text, sourceLabel));
+       process.exit(1);
+     }
+     throw e;
+   }
  }

+ const mergedDocument = mergeParsedDocuments(parsedSources);
+ const prefixes = mergedDocument.prefixes;
+ const triples = mergedDocument.triples;
+ const frules = mergedDocument.frules;
+ const brules = mergedDocument.brules;
+ const qrules = mergedDocument.logQueryRules;
+ const tokenSets = parsedSources.map((source) => ({ tokens: source.tokens, prefixes: source.prefixes }));
+
  if (showAst) {
    function astReplacer(unusedJsonKey, value) {
      if (value instanceof Set) return Array.from(value);
@@ -5285,7 +5294,10 @@
  const mayAutoRenderOutputStrings = programMayProduceOutputStrings(triples, frules, qrules);

  if (streamMode && !hasQueries && !mayAutoRenderOutputStrings) {
-   const usedInInput = prefixesUsedInInputTokens(toks, prefixes);
+   const usedInInput = new Set();
+   for (const source of tokenSets) {
+     for (const pfx of prefixesUsedInInputTokens(source.tokens, source.prefixes)) usedInInput.add(pfx);
+   }
    const outPrefixes = restrictPrefixEnv(prefixes, usedInInput);

    // Ensure log:trace uses the same compact prefix set as the output.
@@ -5879,6 +5891,7 @@
  const { lex, N3SyntaxError } = require('./lexer');
  const { Parser } = require('./parser');
  const { liftBlankRuleVars } = require('./rules');
+ const { parseN3SourceList } = require('./multisource');

  const {
    makeBuiltins,
@@ -9228,7 +9241,8 @@ ${triples.map((tr) => ` ${tripleToN3(tr, prefixes)}`).join('\n')}
    builtinModules = null,
  } = opts;

- const parsedInput = normalizeParsedReasonerInputSync(input);
+ const parsedSourceList = parseN3SourceList(input, { baseIri });
+ const parsedInput = parsedSourceList || normalizeParsedReasonerInputSync(input);
  const rdfFactory = rdfjs ? getDataFactory(dataFactory) : null;

  const __oldEnforceHttps = deref.getEnforceHttpsEnabled();
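The input-shape detection behind this fallback — return the parsed source list only when the input is a plain object with a `sources` array, otherwise fall through to the existing normalization path — mirrors `isN3SourceListInput` from the bundled `lib/multisource.js`:

```javascript
// Shape test for a multi-source input, mirroring lib/multisource.js:
// a source-list input is a non-array object whose `sources` field is an array.
function isN3SourceListInput(input) {
  return !!(input && typeof input === 'object' && !Array.isArray(input) && Array.isArray(input.sources));
}

console.log(isN3SourceListInput({ sources: ['<a> <b> <c>.'] })); // true
console.log(isN3SourceListInput('<a> <b> <c>.'));                // false (plain N3 text)
console.log(isN3SourceListInput(['not', 'a', 'source list']));   // false (arrays are rejected)
```

Because the test is purely structural, existing call sites that pass plain strings, RDF/JS facts, or rule objects are unaffected by the new input style.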
@@ -9371,7 +9385,7 @@ ${triples.map((tr) => ` ${tripleToN3(tr, prefixes)}`).join('\n')}

  Promise.resolve().then(async () => {
    try {
-     const normalizedInput = await normalizeReasonerInputAsync(input);
+     const normalizedInput = parseN3SourceList(input, restOpts) || (await normalizeReasonerInputAsync(input));
      reasonStream(normalizedInput, {
        ...restOpts,
        rdfjs: false,
@@ -10566,6 +10580,206 @@ ${triples.map((tr) => ` ${tripleToN3(tr, prefixes)}`).join('\n')}

  module.exports = { Token, N3SyntaxError, lex, decodeN3StringEscapes };
  };
+ __modules['lib/multisource.js'] = function (require, module, exports) {
+ /**
+  * Eyeling Reasoner — multi-source parsing helpers
+  *
+  * These helpers let the CLI/API parse several N3 documents independently and
+  * merge their parsed ASTs before reasoning. This avoids building one giant N3
+  * string while preserving the existing lexer/parser/engine pipeline.
+  */
+
+ 'use strict';
+
+ const { lex } = require('./lexer');
+ const { Parser } = require('./parser');
+ const {
+   Blank,
+   ListTerm,
+   OpenListTerm,
+   GraphTerm,
+   Triple,
+   Rule,
+   PrefixEnv,
+   annotateQuotedGraphTerm,
+ } = require('./prelude');
+
+ function emptyParsedDocument() {
+   return {
+     prefixes: PrefixEnv.newDefault(),
+     triples: [],
+     frules: [],
+     brules: [],
+     logQueryRules: [],
+   };
+ }
+
+ function parseN3Text(text, opts = {}) {
+   const { baseIri = '', label = '<input>' } = opts || {};
+   const tokens = lex(text);
+   const parser = new Parser(tokens);
+   if (baseIri) parser.prefixes.setBase(baseIri);
+   const [prefixes, triples, frules, brules, logQueryRules] = parser.parseDocument();
+   return { prefixes, triples, frules, brules, logQueryRules, tokens, text, label };
+ }
+
+ function sourceBlankPrefix(sourceIndex) {
+   return `_:src${sourceIndex}_`;
+ }
+
+ function scopedBlankLabel(label, sourceIndex, mapping) {
+   const key = String(label || '');
+   let out = mapping.get(key);
+   if (out) return out;
+
+   const bare = key.startsWith('_:') ? key.slice(2) : key;
+   out = sourceBlankPrefix(sourceIndex) + bare;
+   mapping.set(key, out);
+   return out;
+ }
+
+ function scopeBlankNodesInDocument(doc, sourceIndex) {
+   const mapping = new Map();
+
+   function cloneTerm(term) {
+     if (term instanceof Blank) return new Blank(scopedBlankLabel(term.label, sourceIndex, mapping));
+     if (term instanceof ListTerm) return new ListTerm(term.elems.map(cloneTerm));
+     if (term instanceof OpenListTerm) return new OpenListTerm(term.prefix.map(cloneTerm), term.tailVar);
+     if (term instanceof GraphTerm) return annotateQuotedGraphTerm(new GraphTerm(term.triples.map(cloneTriple)));
+     return term;
+   }
+
+   function cloneTriple(triple) {
+     return new Triple(cloneTerm(triple.s), cloneTerm(triple.p), cloneTerm(triple.o));
+   }
+
+   function cloneRule(rule) {
+     const headBlankLabels = new Set();
+     if (rule && rule.headBlankLabels instanceof Set) {
+       for (const label of rule.headBlankLabels) headBlankLabels.add(scopedBlankLabel(label, sourceIndex, mapping));
+     }
+
+     const out = new Rule(
+       (rule.premise || []).map(cloneTriple),
+       (rule.conclusion || []).map(cloneTriple),
+       rule.isForward,
+       rule.isFuse,
+       headBlankLabels,
+     );
+
+     if (rule && Object.prototype.hasOwnProperty.call(rule, '__dynamicConclusionTerm')) {
+       Object.defineProperty(out, '__dynamicConclusionTerm', {
+         value: cloneTerm(rule.__dynamicConclusionTerm),
+         enumerable: false,
+         writable: false,
+         configurable: true,
+       });
+     }
+
+     return out;
+   }
+
+   return {
+     prefixes: doc.prefixes,
+     triples: (doc.triples || []).map(cloneTriple),
+     frules: (doc.frules || []).map(cloneRule),
+     brules: (doc.brules || []).map(cloneRule),
+     logQueryRules: (doc.logQueryRules || []).map(cloneRule),
+     tokens: doc.tokens,
+     text: doc.text,
+     label: doc.label,
+   };
+ }
+
+ function mergePrefixEnvs(target, source) {
+   if (!source) return target;
+   const map = source.map || {};
+   for (const [prefix, iri] of Object.entries(map)) {
+     // Every parser starts with an empty default namespace. Do not let a later
+     // source that never declared ':' erase a useful default namespace from an
+     // earlier source; prefix merging is for output readability only.
+     if (iri || !Object.prototype.hasOwnProperty.call(target.map, prefix)) target.set(prefix, iri);
+   }
+   if (source.baseIri) target.setBase(source.baseIri);
+   return target;
+ }
+
+ function mergeParsedDocuments(docs, opts = {}) {
+   const documents = Array.isArray(docs) ? docs : [];
+   const scopeBlankNodes = typeof opts.scopeBlankNodes === 'boolean' ? opts.scopeBlankNodes : documents.length > 1;
+
+   const merged = emptyParsedDocument();
+   const mergedSources = [];
+
+   for (let i = 0; i < documents.length; i++) {
+     const originalDoc = documents[i] || emptyParsedDocument();
+     const doc = scopeBlankNodes ? scopeBlankNodesInDocument(originalDoc, i + 1) : originalDoc;
+
+     mergePrefixEnvs(merged.prefixes, doc.prefixes);
+     merged.triples.push(...(doc.triples || []));
+     merged.frules.push(...(doc.frules || []));
+     merged.brules.push(...(doc.brules || []));
+     merged.logQueryRules.push(...(doc.logQueryRules || []));
+     mergedSources.push(doc);
+   }
+
+   Object.defineProperty(merged, 'sources', {
+     value: mergedSources,
+     enumerable: false,
+     writable: false,
+     configurable: true,
+   });
+
+   return merged;
+ }
+
+ function isN3SourceListInput(input) {
+   return !!(input && typeof input === 'object' && !Array.isArray(input) && Array.isArray(input.sources));
+ }
+
+ function normalizeN3SourceItem(source, index) {
+   const sourceNumber = index + 1;
+   if (typeof source === 'string') {
+     return { text: source, label: `<source ${sourceNumber}>`, baseIri: '' };
+   }
+   if (!source || typeof source !== 'object' || Array.isArray(source)) {
+     throw new TypeError('Each N3 source must be a string or an object with an n3/text field');
+   }
+
+   const text = typeof source.n3 === 'string' ? source.n3 : typeof source.text === 'string' ? source.text : null;
+   if (text === null) throw new TypeError('Each N3 source object must provide an n3 or text string');
+
+   return {
+     text,
+     label: typeof source.label === 'string' && source.label ? source.label : `<source ${sourceNumber}>`,
+     baseIri: typeof source.baseIri === 'string' ? source.baseIri : '',
+   };
+ }
+
+ function parseN3SourceList(input, opts = {}) {
+   if (!isN3SourceListInput(input)) return null;
+   const sources = input.sources.map(normalizeN3SourceItem);
+   const defaultBaseIri = typeof opts.baseIri === 'string' ? opts.baseIri : '';
+   const parsed = sources.map((source, index) =>
+     parseN3Text(source.text, {
+       label: source.label,
+       baseIri: source.baseIri || (sources.length === 1 ? defaultBaseIri : ''),
+     }),
+   );
+   return mergeParsedDocuments(parsed, {
+     scopeBlankNodes: typeof input.scopeBlankNodes === 'boolean' ? input.scopeBlankNodes : parsed.length > 1,
+   });
+ }
+
+ module.exports = {
+   emptyParsedDocument,
+   parseN3Text,
+   mergeParsedDocuments,
+   scopeBlankNodesInDocument,
+   isN3SourceListInput,
+   parseN3SourceList,
+ };
+ };
  __modules['lib/parser.js'] = function (require, module, exports) {
  /**
   * Eyeling Reasoner — parser