@unispechq/unispec-core 0.2.0 → 0.2.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +223 -223
- package/dist/cjs/converters/index.js +206 -0
- package/dist/cjs/diff/index.js +236 -0
- package/dist/cjs/index.js +22 -0
- package/dist/cjs/loader/index.js +22 -0
- package/dist/cjs/normalizer/index.js +107 -0
- package/dist/cjs/package.json +3 -0
- package/dist/cjs/types/index.js +3 -0
- package/dist/cjs/validator/index.js +81 -0
- package/dist/index.cjs +22 -0
- package/dist/validator/index.js +10 -8
- package/package.json +11 -2
package/README.md
CHANGED
@@ -1,223 +1,223 @@
(The entire README is removed and re-added in this release; the old and new text are identical, so the content is shown once.)

# UniSpec Core Platform

**The official UniSpec Core Engine package — parser, validator, normalizer, diff engine, and converters for the UniSpec format**

---

## 🚀 Overview

**UniSpec Core Engine** is the central runtime package of the UniSpec ecosystem.
It implements the mechanics of the **UniSpec Format**: loading, JSON Schema validation, normalization, diffing, and conversion to other formats.

This repository contains **only the Core Engine**, which is consumed by other UniSpec platform components (CLI, Registry, Portal, adapters, SDKs) that live in separate repositories.

Core Engine capabilities:

- 🧠 **Core Engine** — parser, validator, normalizer, diff engine, converters
- 🔄 **Format conversion** — UniSpec → OpenAPI, UniSpec → GraphQL SDL, UniSpec → WebSocket channel models
- 🧩 **Shared types and utilities** — types and helper functions used by the CLI, adapters, Registry, and Portal

UniSpec is designed to unify documentation for:

- REST
- GraphQL
- WebSocket
- Event-driven APIs (future)
- Multi-service architectures

---

## 📦 Repository Structure

```text
unispec-core/
├─ src/
│  ├─ loader/        # File loading (YAML/JSON → JS object)
│  ├─ validator/     # JSON Schema validation
│  ├─ normalizer/    # UniSpec structure normalization
│  ├─ diff/          # Comparison of two UniSpec documents
│  ├─ converters/    # OpenAPI, GraphQL SDL, WS, etc.
│  ├─ types/         # UniSpec types/interfaces
│  ├─ utils/         # Small helpers
│  └─ index.ts       # Public API
├─ tests/            # Unit tests
├─ package.json
└─ README.md
```

---

## 🧠 Core Concepts

### 📐 1. UniSpec Format
Defined in a separate repository: **`unispec-spec`**.
UniSpec Core Engine *implements* this format but does **not** define it.

### 🧱 2. Core Engine
Located in this repository, under the `src/` directory.

Core Engine provides:

- YAML/JSON loader
- JSON Schema validator
- Normalizer → canonical UniSpec output (REST routes, GraphQL operations, WebSocket channels/messages)
- Diff engine (with basic breaking / non-breaking classification for REST, GraphQL and WebSocket)
- Converters:
  - UniSpec → OpenAPI (REST-centric)
  - UniSpec → GraphQL SDL
  - UniSpec → WebSocket channel models for dashboards

This is the foundation used by:

- the CLI (in a separate repository/package)
- framework adapters
- Registry and Portal (as separate UniSpec platform services)

### 🌉 3. Protocol Coverage in Core

At the level of the Core Engine, protocol support is intentionally minimal but deterministic:

- **REST**
  - Typed REST surface defined by `service.protocols.rest.routes[]` and reusable `service.schemas`.
  - Normalizer orders routes by `name` (or `path + method`) for stable diffs.
  - Diff engine annotates route additions and removals with breaking/non-breaking severity.
  - Converter exposes REST as an OpenAPI 3.1 document built from routes, schemas and environments, while preserving the original UniSpec under `x-unispec`.

- **GraphQL**
  - Typed protocol with `schema` (SDL string) and `queries`/`mutations`/`subscriptions` as operation lists.
  - Normalizer orders operation names within each bucket.
  - Diff engine annotates operation additions/removals as non-breaking/breaking.
  - Converter either passes through user-provided SDL or generates a minimal, deterministic SDL shell.

- **WebSocket**
  - Typed protocol with channels and messages backed by reusable schemas and security schemes.
  - Normalizer orders channels by name and messages by `name` within each channel.
  - Diff engine annotates channel/message additions/removals as non-breaking/breaking changes.
  - Converter produces a dashboard-friendly model with service metadata, a normalized channel list and the raw protocol.

### 💻 4. CLI (separate platform component)
The CLI uses UniSpec Core Engine for validation, conversion, and working with UniSpec specifications.

Example UniSpec CLI commands:

```bash
unispec validate
unispec push
unispec dev
unispec open
unispec diff
unispec convert
```

### 🗂 5. Registry API (separate service)
Stores UniSpec specs from multiple services and uses Core Engine for validation, normalization, and change analysis.
Effectively acts as the "system of record" for all APIs in the company.

### 🌐 6. Portal Web + API (separate service)
API documentation portal built on top of UniSpec and Core Engine:

- service catalog
- API endpoints
- schemas
- interactive playgrounds
- version comparison

---

## 🛒 Use Cases

### 🟦 Monolith mode (Swagger-like)
A backend service includes an adapter →
`/unispec.json` + `/docs` are served automatically.

### 🟧 Microservice mode
Each service pushes its UniSpec into Registry →
Portal Web assembles your entire company's API landscape.

### 🟩 Enterprise mode
Includes:

- SSO / permissions
- RBAC
- auditing
- topology map
- advanced analytics

(Handled via private repos.)

---

## 🏁 Getting Started

### 1. Clone the repo

```bash
git clone https://github.com/unispec/unispec-core.git
cd unispec-core
```

### 2. Install dependencies

```bash
pnpm install
```

### 3. Build all packages

```bash
pnpm build
```

### 4. Run locally (dev mode)

---

## 🤝 Contributing

Contributions are welcome!
Before contributing, please review:

- `docs/development.md`
- `.windsurfrules`

All core changes must comply with:

- the UniSpec format from `unispec-spec`
- compatibility rules
- test coverage
- platform architecture

---

## Related Repositories

| Repository             | Purpose |
|------------------------|---------|
| `unispec-spec`         | UniSpec format definition (schemas, examples) |
| `unispec-core`         | **This repo** — UniSpec Core Engine implementation |
| `unispec-docs`         | Documentation site |
| `unispec-js-adapters`  | Framework integrations (Express/Nest/Fastify) built on top of Core Engine |
| `unispec-infra`        | Helm charts, Docker, Terraform for deploying the UniSpec platform |

---

## License

UniSpec Core Engine is open-source and free to use under the MIT License.

---

## 🔥 Summary

`unispec-core` is the **heart of the UniSpec ecosystem** at the code level.
It provides the Core Engine used to:

- validate
- normalize
- diff
- convert
- and integrate

API specifications in the UniSpec format.

If you're building tools, adapters, or platform services that rely on UniSpec → **this is where the engine lives.**
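To make the workflow described in the README concrete, here is a minimal usage sketch built only from the functions exported by the dist files shown later in this diff (loadUniSpec, validateUniSpec, normalizeUniSpec, toOpenAPI). The document shape below is hypothetical and invented for illustration; it is not part of the package.

```js
import { loadUniSpec, validateUniSpec, normalizeUniSpec, toOpenAPI } from "@unispechq/unispec-core";

// Hypothetical minimal UniSpec document (not taken from the package).
const doc = await loadUniSpec({
  service: {
    name: "orders",
    title: "Orders Service",
    protocols: { rest: { routes: [] } },
  },
});

const result = await validateUniSpec(doc);
if (!result.valid) {
  console.error(result.errors);
} else {
  // Normalization is deterministic, so diffs and conversions are stable.
  const normalized = normalizeUniSpec(doc);
  const openapi = toOpenAPI(normalized);
  console.log(openapi.openapi); // "3.1.0"
}
```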
package/dist/cjs/converters/index.js
ADDED
@@ -0,0 +1,206 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.toOpenAPI = toOpenAPI;
exports.toGraphQLSDL = toGraphQLSDL;
exports.toWebSocketModel = toWebSocketModel;
function toOpenAPI(doc) {
    const service = doc.service;
    const rest = (service.protocols?.rest ?? {});
    const info = {
        title: service.title ?? service.name,
        description: service.description,
    };
    // Derive OpenAPI servers from UniSpec environments when available.
    const servers = Array.isArray(service.environments)
        ? service.environments.map((env) => ({
            url: env.baseUrl,
            description: env.name,
        }))
        : [];
    const paths = {};
    const components = {};
    // Map service.schemas into OpenAPI components.schemas
    const schemas = (service.schemas ?? {});
    const componentsSchemas = {};
    for (const [name, def] of Object.entries(schemas)) {
        componentsSchemas[name] = def.jsonSchema;
    }
    if (Object.keys(componentsSchemas).length > 0) {
        components.schemas = componentsSchemas;
    }
    // Helper to build a $ref into components.schemas when schemaRef is present.
    function schemaRefToOpenAPI(schemaRef) {
        if (!schemaRef) {
            return {};
        }
        return { $ref: `#/components/schemas/${schemaRef}` };
    }
    // Build paths from REST routes.
    if (Array.isArray(rest.routes)) {
        for (const route of rest.routes) {
            const pathItem = (paths[route.path] ?? {});
            const method = route.method.toLowerCase();
            const parameters = [];
            // Path params
            for (const param of route.pathParams ?? []) {
                parameters.push({
                    name: param.name,
                    in: "path",
                    required: param.required ?? true,
                    description: param.description,
                    schema: schemaRefToOpenAPI(param.schemaRef),
                });
            }
            // Query params
            for (const param of route.queryParams ?? []) {
                parameters.push({
                    name: param.name,
                    in: "query",
                    required: param.required ?? false,
                    description: param.description,
                    schema: schemaRefToOpenAPI(param.schemaRef),
                });
            }
            // Header params
            for (const param of route.headers ?? []) {
                parameters.push({
                    name: param.name,
                    in: "header",
                    required: param.required ?? false,
                    description: param.description,
                    schema: schemaRefToOpenAPI(param.schemaRef),
                });
            }
            // Request body
            let requestBody;
            if (route.requestBody && route.requestBody.content) {
                const content = {};
                for (const [mediaType, media] of Object.entries(route.requestBody.content)) {
                    content[mediaType] = {
                        schema: schemaRefToOpenAPI(media.schemaRef),
                    };
                }
                requestBody = {
                    description: route.requestBody.description,
                    required: route.requestBody.required,
                    content,
                };
            }
            // Responses
            const responses = {};
            for (const [status, resp] of Object.entries(route.responses ?? {})) {
                const content = {};
                if (resp.content) {
                    for (const [mediaType, media] of Object.entries(resp.content)) {
                        content[mediaType] = {
                            schema: schemaRefToOpenAPI(media.schemaRef),
                        };
                    }
                }
                responses[status] = {
                    description: resp.description ?? "",
                    ...(Object.keys(content).length > 0 ? { content } : {}),
                };
            }
            const operation = {
                operationId: route.name,
                summary: route.summary,
                description: route.description,
                ...(parameters.length > 0 ? { parameters } : {}),
                responses: Object.keys(responses).length > 0 ? responses : { default: { description: "" } },
            };
            if (requestBody) {
                operation.requestBody = requestBody;
            }
            // Security requirements
            if (Array.isArray(route.security) && route.security.length > 0) {
                operation.security = route.security.map((req) => {
                    const obj = {};
                    for (const schemeName of req) {
                        obj[schemeName] = [];
                    }
                    return obj;
                });
            }
            pathItem[method] = operation;
            paths[route.path] = pathItem;
        }
    }
    // Security schemes pass-through from REST protocol
    const componentsSecuritySchemes = rest.securitySchemes ?? {};
    if (Object.keys(componentsSecuritySchemes).length > 0) {
        components.securitySchemes = componentsSecuritySchemes;
    }
    return {
        openapi: "3.1.0",
        info,
        servers,
        paths,
        ...(Object.keys(components).length > 0 ? { components } : {}),
        "x-unispec": doc,
    };
}
function toGraphQLSDL(doc) {
    // Minimal implementation: generate a basic SDL that exposes service metadata
    // via a Query field. This does not attempt to interpret the full GraphQL
    // protocol structure yet, but provides a stable, deterministic SDL shape
    // based on top-level UniSpec document fields.
    const graphql = doc.service.protocols?.graphql;
    const customSDL = graphql?.schema;
    if (typeof customSDL === "string" && customSDL.trim()) {
        return { sdl: customSDL };
    }
    const service = doc.service;
    const title = service.title ?? service.name;
    const description = service.description ?? "";
    const lines = [];
    if (title || description) {
        lines.push("\"\"");
        if (title) {
            lines.push(title);
        }
        if (description) {
            lines.push("");
            lines.push(description);
        }
        lines.push("\"\"");
    }
    lines.push("schema {");
    lines.push(" query: Query");
    lines.push("}");
    lines.push("");
    lines.push("type Query {");
    lines.push(" _serviceInfo: String!\n");
    lines.push("}");
    const sdl = lines.join("\n");
    return { sdl };
}
function toWebSocketModel(doc) {
    // Base WebSocket model intended for a modern, dashboard-oriented UI.
    // It exposes service metadata, a normalized list of channels and the raw
    // websocket protocol object, while also embedding the original UniSpec
    // document under a technical key for debugging and introspection.
    const service = doc.service;
    const websocket = (service.protocols?.websocket ?? {});
    const channelsArray = Array.isArray(websocket.channels) ? websocket.channels : [];
    const channels = [...channelsArray]
        .sort((a, b) => (a.name ?? "").localeCompare(b.name ?? ""))
        .map((channel) => ({
            name: channel.name,
            summary: undefined,
            description: channel.description,
            direction: channel.direction,
            messages: channel.messages,
            raw: channel,
        }));
    return {
        service: {
            name: service.name,
            title: service.title,
            description: service.description,
        },
        channels,
        rawProtocol: websocket,
        "x-unispec-ws": doc,
    };
}
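A small usage sketch for the converter above. The route, schema, and environment values are hypothetical and only illustrate how `toOpenAPI` maps routes, schemas, and environments into an OpenAPI 3.1 document while preserving the source under `x-unispec`.

```js
import { toOpenAPI } from "@unispechq/unispec-core";

// Hypothetical UniSpec document with one REST route and one reusable schema.
const doc = {
  service: {
    name: "orders",
    title: "Orders Service",
    environments: [{ name: "prod", baseUrl: "https://api.example.com" }],
    schemas: { Order: { jsonSchema: { type: "object" } } },
    protocols: {
      rest: {
        routes: [
          {
            name: "getOrder",
            method: "GET",
            path: "/orders/{id}",
            pathParams: [{ name: "id", required: true }],
            responses: { "200": { content: { "application/json": { schemaRef: "Order" } } } },
          },
        ],
      },
    },
  },
};

const openapi = toOpenAPI(doc);
// openapi.paths["/orders/{id}"].get.operationId === "getOrder"
// openapi.components.schemas.Order is the Order jsonSchema, referenced via $ref
// openapi.servers[0].url === "https://api.example.com"
// The original document is preserved under openapi["x-unispec"].
```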
package/dist/cjs/diff/index.js
ADDED
@@ -0,0 +1,236 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.diffUniSpec = diffUniSpec;
function isPlainObject(value) {
    return Object.prototype.toString.call(value) === "[object Object]";
}
function diffValues(oldVal, newVal, basePath, out) {
    if (oldVal === newVal) {
        return;
    }
    // Both plain objects → recurse by keys
    if (isPlainObject(oldVal) && isPlainObject(newVal)) {
        const oldKeys = new Set(Object.keys(oldVal));
        const newKeys = new Set(Object.keys(newVal));
        // Removed keys
        for (const key of oldKeys) {
            if (!newKeys.has(key)) {
                out.push({
                    path: `${basePath}/${key}`,
                    description: "Field removed",
                    severity: "unknown",
                });
            }
        }
        // Added / changed keys
        for (const key of newKeys) {
            const childPath = `${basePath}/${key}`;
            if (!oldKeys.has(key)) {
                out.push({
                    path: childPath,
                    description: "Field added",
                    severity: "unknown",
                });
                continue;
            }
            diffValues(oldVal[key], newVal[key], childPath, out);
        }
        return;
    }
    // Arrays
    if (Array.isArray(oldVal) && Array.isArray(newVal)) {
        // Special handling for UniSpec collections identified by "name"
        const isNamedCollection = basePath === "/service/protocols/rest/routes" ||
            basePath === "/service/protocols/websocket/channels" ||
            basePath === "/service/protocols/graphql/queries" ||
            basePath === "/service/protocols/graphql/mutations" ||
            basePath === "/service/protocols/graphql/subscriptions";
        if (isNamedCollection) {
            const oldByName = new Map();
            const newByName = new Map();
            for (const item of oldVal) {
                if (item && typeof item === "object" && typeof item.name === "string") {
                    oldByName.set(item.name, item);
                }
            }
            for (const item of newVal) {
                if (item && typeof item === "object" && typeof item.name === "string") {
                    newByName.set(item.name, item);
                }
            }
            // Removed
            for (const [name, oldItem] of oldByName.entries()) {
                if (!newByName.has(name)) {
                    out.push({
                        path: `${basePath}/${name}`,
                        description: "Item removed",
                        severity: "unknown",
                    });
                }
                else {
                    const newItem = newByName.get(name);
                    diffValues(oldItem, newItem, `${basePath}/${name}`, out);
                }
            }
            // Added
            for (const [name] of newByName.entries()) {
                if (!oldByName.has(name)) {
                    out.push({
                        path: `${basePath}/${name}`,
                        description: "Item added",
                        severity: "unknown",
                    });
                }
            }
            return;
        }
        // Generic shallow index-based compare
        const maxLen = Math.max(oldVal.length, newVal.length);
        for (let i = 0; i < maxLen; i++) {
            const childPath = `${basePath}/${i}`;
            if (i >= oldVal.length) {
                out.push({
                    path: childPath,
                    description: "Item added",
                    severity: "unknown",
                });
            }
            else if (i >= newVal.length) {
                out.push({
                    path: childPath,
                    description: "Item removed",
                    severity: "unknown",
                });
            }
            else {
                diffValues(oldVal[i], newVal[i], childPath, out);
            }
        }
        return;
    }
    // Primitive or mismatched types → treat as value change
    out.push({
        path: basePath,
        description: "Value changed",
        severity: "unknown",
    });
}
function annotateRestChange(change) {
    if (!change.path.startsWith("/service/protocols/rest/routes/")) {
        return change;
    }
    const segments = change.path.split("/").filter(Boolean);
    // Expected shape: ["service", "protocols", "rest", "routes", index]
    if (segments[0] !== "service" || segments[1] !== "protocols" || segments[2] !== "rest" || segments[3] !== "routes") {
        return change;
    }
    const index = segments[4];
    if (typeof index === "undefined") {
        return change;
    }
    const annotated = {
        ...change,
        protocol: "rest",
    };
    if (change.description === "Item removed" || change.description === "Field removed") {
        annotated.kind = "rest.route.removed";
        annotated.severity = "breaking";
    }
    else if (change.description === "Item added" || change.description === "Field added") {
        annotated.kind = "rest.route.added";
        annotated.severity = "non-breaking";
    }
    return annotated;
}
function annotateWebSocketChange(change) {
    if (!change.path.startsWith("/service/protocols/websocket/channels/")) {
        return change;
    }
    const segments = change.path.split("/").filter(Boolean);
    // Expected base: ["service","protocols","websocket","channels", channelIndex, ...]
    if (segments[0] !== "service" || segments[1] !== "protocols" || segments[2] !== "websocket" || segments[3] !== "channels") {
        return change;
    }
    const channelIndex = segments[4];
    const next = segments[5];
    const annotated = {
        ...change,
        protocol: "websocket",
    };
    if (typeof channelIndex === "undefined") {
        return annotated;
    }
    // Channel-level changes: /service/protocols/websocket/channels/{index}
    if (!next) {
        if (change.description === "Item removed" || change.description === "Field removed") {
            annotated.kind = "websocket.channel.removed";
            annotated.severity = "breaking";
        }
        else if (change.description === "Item added" || change.description === "Field added") {
            annotated.kind = "websocket.channel.added";
            annotated.severity = "non-breaking";
        }
        return annotated;
    }
    // Message-level changes: /service/protocols/websocket/channels/{index}/messages/{msgIndex}
    if (next === "messages") {
        const messageIndex = segments[6];
        if (typeof messageIndex === "undefined") {
            return annotated;
        }
        if (change.description === "Item removed") {
            annotated.kind = "websocket.message.removed";
            annotated.severity = "breaking";
        }
        else if (change.description === "Item added") {
            annotated.kind = "websocket.message.added";
            annotated.severity = "non-breaking";
        }
    }
    return annotated;
}
function annotateGraphQLChange(change) {
    if (!change.path.startsWith("/service/protocols/graphql/")) {
        return change;
    }
    const segments = change.path.split("/").filter(Boolean);
    // Expected: ["service","protocols","graphql", kind, index, ...]
    if (segments[0] !== "service" || segments[1] !== "protocols" || segments[2] !== "graphql") {
        return change;
    }
    const opKind = segments[3];
    const index = segments[4];
    if (!opKind || typeof index === "undefined") {
        return change;
    }
    if (opKind !== "queries" && opKind !== "mutations" && opKind !== "subscriptions") {
        return change;
    }
    const annotated = {
        ...change,
        protocol: "graphql",
    };
    if (change.description === "Item removed" || change.description === "Field removed") {
        annotated.kind = "graphql.operation.removed";
        annotated.severity = "breaking";
    }
    else if (change.description === "Item added" || change.description === "Field added") {
        annotated.kind = "graphql.operation.added";
        annotated.severity = "non-breaking";
    }
    return annotated;
}
/**
 * Compute a structural diff between two UniSpec documents.
 *
 * Current behavior:
 * - Tracks added, removed, and changed fields and array items.
 * - Uses JSON Pointer-like paths rooted at "" (e.g., "/info/title").
 * - Marks all changes with severity "unknown" for now.
 */
function diffUniSpec(oldDoc, newDoc) {
    const changes = [];
    diffValues(oldDoc, newDoc, "", changes);
    const annotated = changes.map((change) => annotateWebSocketChange(annotateGraphQLChange(annotateRestChange(change))));
    return { changes: annotated };
}
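A usage sketch for the diff engine above. The two documents are hypothetical and show how removing a named REST route is reported and annotated as breaking by the code just shown.

```js
import { diffUniSpec } from "@unispechq/unispec-core";

// Hypothetical old and new documents: one REST route is removed.
const oldDoc = {
  service: {
    protocols: {
      rest: { routes: [{ name: "getOrder", method: "GET", path: "/orders/{id}" }] },
    },
  },
};
const newDoc = {
  service: { protocols: { rest: { routes: [] } } },
};

const { changes } = diffUniSpec(oldDoc, newDoc);
// changes contains an entry like:
// {
//   path: "/service/protocols/rest/routes/getOrder",
//   description: "Item removed",
//   protocol: "rest",
//   kind: "rest.route.removed",
//   severity: "breaking",
// }
```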
package/dist/cjs/index.js
ADDED
@@ -0,0 +1,22 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./types/index.js"), exports);
__exportStar(require("./loader/index.js"), exports);
__exportStar(require("./validator/index.js"), exports);
__exportStar(require("./normalizer/index.js"), exports);
__exportStar(require("./diff/index.js"), exports);
__exportStar(require("./converters/index.js"), exports);
package/dist/cjs/loader/index.js
ADDED
@@ -0,0 +1,22 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.loadUniSpec = loadUniSpec;
/**
 * Load a UniSpec document from a raw input value.
 * Currently supports:
 * - JavaScript objects (treated as already parsed UniSpec)
 * - JSON strings
 *
 * YAML and filesystem helpers will be added later, keeping this API stable.
 */
async function loadUniSpec(input, _options = {}) {
    if (typeof input === "string") {
        const trimmed = input.trim();
        if (!trimmed) {
            throw new Error("Cannot load UniSpec: input string is empty");
        }
        // For now we assume JSON; YAML support will be added later.
        return JSON.parse(trimmed);
    }
    return input;
}
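A brief usage sketch for the loader above; the document values are hypothetical and only demonstrate the three input cases the current implementation handles.

```js
import { loadUniSpec } from "@unispechq/unispec-core";

// Already-parsed objects are returned as-is.
const fromObject = await loadUniSpec({ service: { name: "orders" } });

// JSON strings are parsed; YAML input is not handled yet.
const fromJson = await loadUniSpec('{"service":{"name":"orders"}}');

console.log(fromObject.service.name, fromJson.service.name); // "orders orders"

// Empty or whitespace-only strings are rejected.
await loadUniSpec("   ").catch((err) => console.error(err.message));
```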
package/dist/cjs/normalizer/index.js
ADDED
@@ -0,0 +1,107 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.normalizeUniSpec = normalizeUniSpec;
function isPlainObject(value) {
    return Object.prototype.toString.call(value) === "[object Object]";
}
function normalizeValue(value) {
    if (Array.isArray(value)) {
        return value.map((item) => normalizeValue(item));
    }
    if (isPlainObject(value)) {
        const entries = Object.entries(value).sort(([a], [b]) => a.localeCompare(b));
        const normalized = {};
        for (const [key, val] of entries) {
            normalized[key] = normalizeValue(val);
        }
        return normalized;
    }
    return value;
}
function normalizeRestRoutes(doc) {
    if (!doc || !doc.service || !doc.service.protocols) {
        return doc;
    }
    const protocols = doc.service.protocols;
    const rest = protocols.rest;
    if (!rest || !Array.isArray(rest.routes)) {
        return doc;
    }
    const routes = [...rest.routes];
    routes.sort((a, b) => {
        const keyA = a.name || `${a.path} ${a.method}`;
        const keyB = b.name || `${b.path} ${b.method}`;
        return keyA.localeCompare(keyB);
    });
    rest.routes = routes;
    return doc;
}
function normalizeWebSocket(doc) {
    if (!doc || !doc.service || !doc.service.protocols) {
        return doc;
    }
    const protocols = doc.service.protocols;
    const websocket = protocols.websocket;
    if (!websocket || !Array.isArray(websocket.channels)) {
        return doc;
    }
    const channels = websocket.channels.map((channel) => {
        if (!channel || !Array.isArray(channel.messages)) {
            return channel;
        }
        const sortedMessages = [...channel.messages].sort((a, b) => {
            const aName = a?.name ?? "";
            const bName = b?.name ?? "";
            return aName.localeCompare(bName);
        });
        return {
            ...channel,
            messages: sortedMessages,
        };
    });
    channels.sort((a, b) => {
        const aName = a?.name ?? "";
        const bName = b?.name ?? "";
        return aName.localeCompare(bName);
    });
    websocket.channels = channels;
    return doc;
}
function normalizeGraphqlOperations(doc) {
    if (!doc || !doc.service || !doc.service.protocols) {
        return doc;
    }
    const protocols = doc.service.protocols;
    const graphql = protocols.graphql;
    if (!graphql) {
        return doc;
    }
    const kinds = [
        "queries",
        "mutations",
        "subscriptions",
    ];
    for (const kind of kinds) {
        const ops = graphql[kind];
        if (!Array.isArray(ops)) {
            continue;
        }
        graphql[kind] = [...ops].sort((a, b) => {
            const aName = a?.name ?? "";
            const bName = b?.name ?? "";
            return aName.localeCompare(bName);
        });
    }
    return doc;
}
/**
 * Normalize a UniSpec document into a canonical, deterministic form.
 *
 * Current behavior:
 * - Recursively sorts object keys lexicographically.
 * - Preserves values as-is.
 */
function normalizeUniSpec(doc, _options = {}) {
    const normalized = normalizeValue(doc);
    return normalizeWebSocket(normalizeGraphqlOperations(normalizeRestRoutes(normalized)));
}
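A usage sketch for the normalizer above, with a hypothetical two-route document, showing the stable ordering that the diff engine relies on.

```js
import { normalizeUniSpec } from "@unispechq/unispec-core";

// Hypothetical document with routes in non-alphabetical order.
const doc = {
  service: {
    protocols: {
      rest: {
        routes: [
          { name: "listOrders", method: "GET", path: "/orders" },
          { name: "createOrder", method: "POST", path: "/orders" },
        ],
      },
    },
  },
};

const normalized = normalizeUniSpec(doc);
// Object keys are sorted lexicographically and REST routes are ordered by name,
// so "createOrder" now comes before "listOrders":
console.log(normalized.service.protocols.rest.routes.map((r) => r.name));
// → ["createOrder", "listOrders"]
```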
package/dist/cjs/validator/index.js
ADDED
@@ -0,0 +1,81 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.validateUniSpec = validateUniSpec;
exports.validateUniSpecTests = validateUniSpecTests;
const _2020_js_1 = __importDefault(require("ajv/dist/2020.js"));
const fs_1 = __importDefault(require("fs"));
const path_1 = __importDefault(require("path"));
const unispec_schema_1 = require("@unispechq/unispec-schema");
const ajv = new _2020_js_1.default({
    allErrors: true,
    strict: true,
});
// Register minimal URI format to satisfy UniSpec schemas (service.environments[*].baseUrl)
ajv.addFormat("uri", true);
// Register all UniSpec subschemas so that Ajv can resolve internal $ref links
try {
    const schemaDir = path_1.default.join(process.cwd(), "node_modules", "@unispechq", "unispec-schema", "schema");
    const types = unispec_schema_1.manifest?.types ?? {};
    const typeSchemaPaths = Object.values(types).map((rel) => String(rel));
    const loadedTypeSchemas = typeSchemaPaths
        .map((relPath) => path_1.default.join(schemaDir, relPath))
        .filter((filePath) => fs_1.default.existsSync(filePath))
        .map((filePath) => JSON.parse(fs_1.default.readFileSync(filePath, "utf8")));
    ajv.addSchema(loadedTypeSchemas);
}
catch {
    // If subschemas cannot be loaded for some reason, validation will still work for
    // parts of the schema that do not rely on those $ref references.
}
const validateFn = ajv.compile(unispec_schema_1.unispec);
let validateTestsFn;
function mapAjvErrors(errors) {
    if (!errors)
        return [];
    return errors.map((error) => ({
        message: error.message || "UniSpec validation error",
        path: error.instancePath || error.schemaPath,
        code: error.keyword,
    }));
}
/**
 * Validate a UniSpec document against the UniSpec JSON Schema.
 */
async function validateUniSpec(doc, _options = {}) {
    const valid = validateFn(doc);
    if (valid) {
        return {
            valid: true,
            errors: [],
        };
    }
    return {
        valid: false,
        errors: mapAjvErrors(validateFn.errors),
    };
}
/**
 * Validate a UniSpec Tests document against the UniSpec Tests JSON Schema.
 */
async function validateUniSpecTests(doc, _options = {}) {
    if (!validateTestsFn) {
        const schemaDir = path_1.default.join(process.cwd(), "node_modules", "@unispechq", "unispec-schema", "schema");
        const testsSchemaPath = path_1.default.join(schemaDir, "unispec-tests.schema.json");
        const testsSchema = JSON.parse(fs_1.default.readFileSync(testsSchemaPath, "utf8"));
        validateTestsFn = ajv.compile(testsSchema);
    }
    const valid = validateTestsFn(doc);
    if (valid) {
        return {
            valid: true,
            errors: [],
        };
    }
    return {
        valid: false,
        errors: mapAjvErrors(validateTestsFn.errors),
    };
}
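A usage sketch for the validator above. The document is hypothetical; whether it passes depends on the UniSpec JSON Schema shipped in @unispechq/unispec-schema, whose subschemas are resolved from node_modules under the current working directory, as the schemaDir logic in the code shows.

```js
import { validateUniSpec } from "@unispechq/unispec-core";

// Hypothetical document; the real required shape is defined by the UniSpec schema.
const result = await validateUniSpec({ service: { name: "orders" } });

if (!result.valid) {
  for (const error of result.errors) {
    // Each error carries { message, path, code } mapped from Ajv errors.
    console.error(`${error.path}: ${error.message} [${error.code}]`);
  }
}
```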
package/dist/index.cjs
ADDED
@@ -0,0 +1,22 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./types/index.js"), exports);
__exportStar(require("./loader/index.js"), exports);
__exportStar(require("./validator/index.js"), exports);
__exportStar(require("./normalizer/index.js"), exports);
__exportStar(require("./diff/index.js"), exports);
__exportStar(require("./converters/index.js"), exports);
package/dist/validator/index.js
CHANGED
@@ -1,7 +1,7 @@
 import Ajv2020 from "ajv/dist/2020.js";
-import
+import fs from "fs";
+import path from "path";
 import { unispec as unispecSchema, manifest as unispecManifest } from "@unispechq/unispec-schema";
-const require = createRequire(import.meta.url);
 const ajv = new Ajv2020({
     allErrors: true,
     strict: true,
@@ -10,11 +10,13 @@ const ajv = new Ajv2020({
 ajv.addFormat("uri", true);
 // Register all UniSpec subschemas so that Ajv can resolve internal $ref links
 try {
-    const
-    const schemaDir = schemaRootPath.replace(/index\.(cjs|mjs|js)$/u, "schema/");
+    const schemaDir = path.join(process.cwd(), "node_modules", "@unispechq", "unispec-schema", "schema");
     const types = unispecManifest?.types ?? {};
     const typeSchemaPaths = Object.values(types).map((rel) => String(rel));
-    const loadedTypeSchemas = typeSchemaPaths
+    const loadedTypeSchemas = typeSchemaPaths
+        .map((relPath) => path.join(schemaDir, relPath))
+        .filter((filePath) => fs.existsSync(filePath))
+        .map((filePath) => JSON.parse(fs.readFileSync(filePath, "utf8")));
     ajv.addSchema(loadedTypeSchemas);
 }
 catch {
@@ -53,9 +55,9 @@ export async function validateUniSpec(doc, _options = {}) {
  */
 export async function validateUniSpecTests(doc, _options = {}) {
     if (!validateTestsFn) {
-        const
-        const
-        const testsSchema =
+        const schemaDir = path.join(process.cwd(), "node_modules", "@unispechq", "unispec-schema", "schema");
+        const testsSchemaPath = path.join(schemaDir, "unispec-tests.schema.json");
+        const testsSchema = JSON.parse(fs.readFileSync(testsSchemaPath, "utf8"));
         validateTestsFn = ajv.compile(testsSchema);
     }
     const valid = validateTestsFn(doc);
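A usage sketch for the tests validator touched by this change; the document shape is hypothetical, and, per the new code above, `unispec-tests.schema.json` is read lazily from node_modules under the current working directory on first call.

```js
import { validateUniSpecTests } from "@unispechq/unispec-core";

// Hypothetical UniSpec Tests document; the actual required shape is defined by
// unispec-tests.schema.json in @unispechq/unispec-schema.
const testsDoc = { tests: [] };

const result = await validateUniSpecTests(testsDoc);
console.log(result.valid, result.errors);
```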
package/package.json
CHANGED
@@ -1,18 +1,27 @@
 {
   "name": "@unispechq/unispec-core",
-  "version": "0.2.
+  "version": "0.2.2",
   "description": "Central UniSpec Core Engine providing parsing, validation, normalization, diffing, and conversion of UniSpec specs.",
   "license": "MIT",
   "type": "module",
   "main": "dist/index.cjs",
   "module": "dist/index.js",
+  "exports": {
+    ".": {
+      "types": "./dist/index.d.ts",
+      "import": "./dist/index.js",
+      "require": "./dist/index.cjs"
+    }
+  },
   "types": "dist/index.d.ts",
   "files": [
     "dist",
     "README.md"
   ],
   "scripts": {
-    "build": "
+    "build": "npm run build:esm && npm run build:cjs",
+    "build:esm": "tsc -p tsconfig.json",
+    "build:cjs": "tsc -p tsconfig.cjs.json && node scripts/postbuild-cjs.cjs",
     "test": "npm run build && node --test tests/*.test.mjs",
     "release:patch": "node scripts/release.js patch",
     "release:minor": "node scripts/release.js minor",