face-validator-sdk 1.0.0
- package/CHANGELOG.md +109 -0
- package/README.md +281 -0
- package/dist/face-validator-sdk.cjs.js +2 -0
- package/dist/face-validator-sdk.cjs.js.map +1 -0
- package/dist/face-validator-sdk.esm.js +2 -0
- package/dist/face-validator-sdk.esm.js.map +1 -0
- package/dist/face-validator-sdk.umd.js +2 -0
- package/dist/face-validator-sdk.umd.js.map +1 -0
- package/dist/types/FaceValidator.d.ts +30 -0
- package/dist/types/i18n.d.ts +14 -0
- package/dist/types/index.d.ts +7 -0
- package/dist/types/types.d.ts +91 -0
- package/dist/types/utils.d.ts +57 -0
- package/package.json +85 -0
package/CHANGELOG.md
ADDED
@@ -0,0 +1,109 @@
# Changelog

All notable changes to Face Validator SDK will be documented in this file.

## [1.0.0] – 2026-02-06

### 🎉 Initial Release

Face Validator SDK is now available! A production-ready, real-time selfie validation component powered by **MediaPipe**.

### ✨ Features

#### Face Detection & Validation

- **478 facial landmarks** for precise face analysis
- **Distance validation**: Detects when the face is TOO_CLOSE or TOO_FAR from the camera
- **Centering validation**: Ensures the face is properly centered in the oval guide
- **Head pose detection**: Validates that the head is straight (max 28° tilt)
- **Illumination validation**: Checks for adequate lighting (brightness > 70)
- **Stability detection**: Requires 1 second of stillness before capture
- **Multiple face detection**: Rejects frames with more than one face

#### Hand Detection (NEW)

- **21 landmarks per hand** for high-precision hand tracking
- **Hand obstruction detection**: Prevents the face from being obstructed by hands
- **Real-time hand proximity analysis**: Validates hand distance from the face

#### Internationalization (i18n)

- **3 languages supported**: Portuguese (pt-BR), English (en), Spanish (es)
- **Customizable messages**: Override any validation message (see the sketch below)
- **Dynamic language switching**: Change language at runtime

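The snippet below is a minimal sketch of combining the documented `locale` and `customMessages` options; the overridden wording, element ID, and callback bodies are illustrative assumptions, not part of the SDK.

```typescript
import { FaceValidator, ValidationStatus } from 'face-validator-sdk';

// Sketch: Spanish UI with one status message overridden.
// Wire the callbacks to your own UI; these are placeholders.
const validator = new FaceValidator({
  videoElement: document.getElementById('video') as HTMLVideoElement,
  locale: 'es',
  customMessages: {
    [ValidationStatus.STAY_STILL]: 'No se mueva, capturando en un segundo...',
  },
  onStatusUpdate: (status, message) => console.log(status, message),
  onCaptureSuccess: (blob) => console.log('captured', blob.size, 'bytes'),
  onError: (errorType, error) => console.error(errorType, error),
});
```
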
#### Developer Experience

- **Multiple builds**: ESM, CJS, UMD for maximum compatibility
- **TypeScript support**: Full type definitions included
- **Debug mode**: Visualize landmarks and validation overlays
- **GPU acceleration**: Powered by MediaPipe with GPU support
- **Flexible configuration**: 15+ validation thresholds for fine-tuning

#### Demo & Documentation

- **Live interactive demo**: [https://face-validator-sdk.vercel.app](https://face-validator-sdk.vercel.app)
- **Comprehensive README**: Installation, quick start, configuration guide
- **Validation checklist**: Detailed explanation of all validation states
- **Code examples**: TypeScript examples for common use cases

### 📦 Installation

Single-command installation; MediaPipe is included:

```bash
npm install face-validator-sdk
```

### 🚀 Quick Start

```typescript
import { FaceValidator, ValidationStatus } from 'face-validator-sdk';

const validator = new FaceValidator({
  videoElement: document.getElementById('video') as HTMLVideoElement,
  overlayCanvasElement: document.getElementById('overlay') as HTMLCanvasElement,
  locale: 'pt-BR',

  onStatusUpdate: (status, message) => {
    console.log(message);
  },

  onCaptureSuccess: (blob) => {
    // Upload the captured selfie
  },

  onError: (errorType, error) => {
    console.error(error);
  }
});
```

### 🔧 Configuration

15+ validation thresholds available:

- `minDetectionConfidence` (default: 0.5)
- `minIlluminationThreshold` (default: 70)
- `minFaceSizeFactor` (default: 0.25)
- `maxFaceSizeFactor` (default: 0.65)
- `maxHeadTiltDegrees` (default: 28)
- `maxHandFaceDistance` (default: 0.15)
- And more...

### 📚 What's Included

- Real-time video validation with visual feedback
- Automatic photo capture on successful validation
- Blob output for direct API upload
- LocalStorage support for demo captures
- ESM, CJS, and UMD builds for any environment

### 🎯 Key Achievements

✅ **Production Ready**: Tested and optimized for real-world use
✅ **Accessible**: Works on desktop, tablet, and mobile browsers
✅ **Customizable**: Fine-tune validation parameters for your use case
✅ **Fast**: GPU-accelerated inference with MediaPipe
✅ **Reliable**: High accuracy face and hand detection
✅ **Modern**: Built with TypeScript and modern web standards

package/README.md
ADDED
@@ -0,0 +1,281 @@
# Face Validator SDK

Real-time selfie validation SDK with face detection, powered by **MediaPipe**. Detects faces and hands, and validates pose, lighting, and occlusions in real time.

🎭 **[Live Demo](https://face-validator-sdk.vercel.app)** | 📦 [NPM Package](#installation) | 📖 [Documentation](#usage) | 🤝 [Contributing](#contributing)

## ✨ Features

### Face Detection (478 landmarks)

- ✅ **Distance validation**: TOO_CLOSE / TOO_FAR
- ✅ **Centering**: Face must be centered in the oval guide
- ✅ **Head pose**: Detects a tilted or turned head
- ✅ **Illumination**: Validates proper lighting
- ✅ **Stability**: Ensures the user stays still before capture
- ✅ **Multiple faces**: Rejects frames when more than one face is detected

### Hand Detection

- ✅ **Hand near face detection**: Prevents a hand covering the face (obstructions)
- ✅ **21 landmarks per hand**: High-precision tracking
- ✅ **Real-time validation**: Instant feedback

### Additional Features

- 🌐 **i18n**: Portuguese (pt-BR), English (en), Spanish (es)
- 🎨 **Visual feedback**: Oval guide with color-coded status
- 🐛 **Debug mode**: Visualize landmarks and bounding boxes
- 📦 **Multiple builds**: ESM, CJS, UMD
- 🚀 **GPU accelerated**: Powered by MediaPipe with GPU support

## 📦 Installation

```bash
npm install face-validator-sdk
```

The SDK automatically includes `@mediapipe/tasks-vision` as a dependency.

## 📊 Validation Checklist

The SDK validates multiple conditions before capturing the selfie. Here's what each status means (a status-handling sketch follows the table):

| Status | Description | User Action | Validation Threshold |
|--------|-------------|-------------|----------------------|
| **INITIALIZING** | Loading MediaPipe models from CDN | Wait, models loading... | N/A |
| **NO_FACE_DETECTED** | Camera is active but no face found | Move closer to camera, ensure good lighting | Requires 1 face |
| **FACE_DETECTED** | Face detected, starting validation | Hold still for validation | Confidence > 50% |
| **TOO_CLOSE** | Face is too large in frame (too close) | Move camera away | Face height < 65% viewport |
| **TOO_FAR** | Face is too small in frame (too far) | Move camera closer | Face height > 25% viewport |
| **OFF_CENTER** | Face not properly centered in oval | Center face in the oval guide | Within center zone |
| **FACE_OBSTRUCTED** | **Hand, glasses, or low visibility** | Remove hands from face, ensure visibility | Hand distance > 15% |
| **HEAD_NOT_STRAIGHT** | Head is tilted or turned | Face camera directly, keep head straight | Yaw/Pitch < 28° |
| **MULTIPLE_FACES** | More than one face detected | Ensure only you are in frame | Exactly 1 face required |
| **POOR_ILLUMINATION** | Not enough light to see face clearly | Increase lighting (natural/lamp light) | Brightness avg > 70 |
| **STAY_STILL** | Movement detected, hold still | Stop moving, keep steady position | Movement < 5px, 1s |
| **CAPTURING** | Validation passed, taking photo... | Keep position, don't move | Auto-capture in progress |
| **SUCCESS** | ✅ Selfie captured successfully! | Photo saved and ready to upload | Capture completed |
| **ERROR** | An error occurred during validation | Check camera permissions, try again | Check logs for details |

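The statuses in the table map directly onto the `ValidationStatus` values passed to `onStatusUpdate`. Below is a minimal sketch of driving a UI from them; the `#status` element, the color cues, and the grouping of statuses are assumptions made for this example.

```typescript
import { ValidationStatus } from 'face-validator-sdk';

// Sketch: map validation statuses onto a status line and a simple color cue.
function handleStatus(status: ValidationStatus, message: string): void {
  const statusEl = document.getElementById('status') as HTMLDivElement;
  statusEl.textContent = message;

  switch (status) {
    case ValidationStatus.CAPTURING:
    case ValidationStatus.SUCCESS:
      statusEl.style.color = 'green'; // validation passed, capture in progress or done
      break;
    case ValidationStatus.ERROR:
      statusEl.style.color = 'red'; // check camera permissions / logs
      break;
    default:
      statusEl.style.color = 'orange'; // user needs to adjust (distance, pose, lighting...)
  }
}

// Pass it to the constructor: new FaceValidator({ ..., onStatusUpdate: handleStatus });
```
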
## 🚀 Quick Start

### Basic Usage

```typescript
import { FaceValidator, ValidationStatus } from 'face-validator-sdk';

// Get DOM elements
const videoElement = document.getElementById('video') as HTMLVideoElement;
const canvasElement = document.getElementById('overlay') as HTMLCanvasElement;

// Initialize validator
const validator = new FaceValidator({
  videoElement,
  overlayCanvasElement: canvasElement,
  locale: 'pt-BR', // 'pt-BR' | 'en' | 'es'
  debugMode: true, // Show landmarks for debugging

  // Called whenever the validation status changes
  onStatusUpdate: (status, message) => {
    document.getElementById('status')!.textContent = message;
    console.log(`Status: ${status} - ${message}`);
  },

  // Called when the user passes all validations and the photo is captured
  onCaptureSuccess: (imageBlob) => {
    // imageBlob is a Blob with the captured selfie
    const url = URL.createObjectURL(imageBlob);
    (document.getElementById('preview') as HTMLImageElement).src = url;

    // Send to backend
    const formData = new FormData();
    formData.append('selfie', imageBlob, 'selfie.jpg');
    fetch('/api/upload-selfie', { method: 'POST', body: formData });
  },

  // Called if something goes wrong
  onError: (errorType, error) => {
    console.error(`Validation Error: ${errorType}`, error);
    document.getElementById('status')!.textContent = error.message;
  }
});

// The validator starts capturing automatically once initialized
// To stop the validator: validator.stop();
```

### HTML Setup

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Face Validator SDK</title>
  <style>
    body { font-family: sans-serif; margin: 0; padding: 20px; }
    #status { margin: 10px 0; padding: 10px; background: #f0f0f0; border-radius: 4px; }
    #preview { max-width: 300px; border-radius: 8px; margin-top: 20px; }
  </style>
</head>
<body>
  <h1>Face Validator SDK Demo</h1>

  <!-- Video element for camera feed (will be mirrored) -->
  <video id="video" width="512" height="384" autoplay playsinline muted></video>

  <!-- Canvas for validation feedback (landmarks, oval guide, etc.) -->
  <canvas id="overlay" width="512" height="384" style="border: 1px solid #ccc;"></canvas>

  <!-- Status display -->
  <div id="status">Loading...</div>

  <!-- Captured selfie preview -->
  <img id="preview" alt="Captured selfie" />

  <!-- Load SDK (MediaPipe models are loaded automatically) -->
  <script type="module" src="./app.js"></script>
</body>
</html>
```

## ⚙️ Configuration Options

```typescript
interface FaceValidatorOptions {
  // ===== REQUIRED =====
  videoElement: HTMLVideoElement;
  onStatusUpdate: (status: ValidationStatus, message: string) => void;
  onCaptureSuccess: (imageBlob: Blob) => void;
  onError: (errorType: ValidationStatus, error: Error) => void;

  // ===== OPTIONAL =====
  // Display
  overlayCanvasElement?: HTMLCanvasElement;
  locale?: 'pt-BR' | 'en' | 'es'; // Default: 'en'
  debugMode?: boolean; // Show landmarks and bounding boxes. Default: false

  // Validation Thresholds
  minDetectionConfidence?: number; // Face detection threshold. Default: 0.5 (50%)
  minIlluminationThreshold?: number; // Minimum brightness (0-255). Default: 70
  minFaceSizeFactor?: number; // Minimum face size relative to viewport. Default: 0.25 (25%)
  maxFaceSizeFactor?: number; // Maximum face size relative to viewport. Default: 0.65 (65%)

  // Stability & Capture
  stabilizationTimeThreshold?: number; // Time to hold still before capture (ms). Default: 1000
  stabilityMovementThreshold?: number; // Max allowed movement (pixels). Default: 5
  minFaceVisibilityScore?: number; // Minimum face visibility (0-1). Default: 0.5

  // Head Pose
  maxHeadTiltDegrees?: number; // Maximum head tilt allowed. Default: 28°

  // Hand Detection
  maxHandFaceDistance?: number; // Maximum hand distance from face (0-1, normalized). Default: 0.15

  // Advanced
  modelPath?: string; // Custom path to the MediaPipe WASM files. Defaults to a public CDN (see the sketch below)
  customMessages?: Partial<Record<ValidationStatus, string>>; // Override status messages
}
```

### Example with Custom Thresholds

```typescript
const validator = new FaceValidator({
  videoElement,
  overlayCanvasElement,
  locale: 'pt-BR',

  // Stricter validation for high-security use cases
  minDetectionConfidence: 0.8, // 80% confidence required
  minIlluminationThreshold: 100, // Very bright environment required
  maxHeadTiltDegrees: 15, // Almost perfectly straight
  stabilizationTimeThreshold: 2000, // 2 seconds of stillness

  onStatusUpdate,
  onCaptureSuccess,
  onError
});
```

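For deployments that cannot rely on the default CDN, the documented `modelPath` option can point at a self-hosted copy of the MediaPipe WASM files. The sketch below is an assumption-laden illustration: `/static/mediapipe/wasm` is a hypothetical path, and the callbacks are placeholders.

```typescript
import { FaceValidator } from 'face-validator-sdk';

// Sketch: serve the MediaPipe WASM runtime yourself instead of the default CDN.
// '/static/mediapipe/wasm' is a hypothetical path; copy the wasm/ folder from
// the @mediapipe/tasks-vision package there as part of your build.
const validator = new FaceValidator({
  videoElement: document.getElementById('video') as HTMLVideoElement,
  modelPath: '/static/mediapipe/wasm',
  onStatusUpdate: (status, message) => console.log(status, message),
  onCaptureSuccess: (blob) => console.log('captured selfie:', blob),
  onError: (errorType, error) => console.error(errorType, error),
});
```
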
## 🏗️ Architecture

### MediaPipe Integration

The SDK uses two MediaPipe models running in parallel (a simplified initialization sketch follows the diagram):

1. **FaceLandmarker**: 478 facial landmarks + face detection
2. **HandLandmarker**: 21 hand landmarks per hand

```
┌─────────────────────────────────────────┐
│              FaceValidator              │
├─────────────────────────────────────────┤
│  ┌─────────────────┐  ┌───────────────┐ │
│  │ FaceLandmarker  │  │ HandLandmarker│ │
│  │  (478 points)   │  │ (21 pts/hand) │ │
│  └─────────────────┘  └───────────────┘ │
│           ↓                  ↓          │
│  ┌──────────────────────────────────┐   │
│  │      Validation Pipeline         │   │
│  │  1. Distance                     │   │
│  │  2. Centering                    │   │
│  │  3. Face geometry                │   │
│  │  4. Head pose                    │   │
│  │  5. Hand proximity               │   │
│  │  6. Illumination                 │   │
│  │  7. Stability                    │   │
│  └──────────────────────────────────┘   │
└─────────────────────────────────────────┘
```

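Conceptually, the published build wires the two models up roughly as sketched below (simplified from the bundled code). This is an illustration of the internal architecture, not something you need to write yourself; the SDK performs these steps during initialization.

```typescript
import { FilesetResolver, FaceLandmarker, HandLandmarker } from '@mediapipe/tasks-vision';

// Simplified sketch of the internal setup: one WASM fileset, two landmarkers
// running in VIDEO mode with GPU delegation, queried frame by frame.
// The published build points modelAssetPath at Google-hosted .task model files;
// short names are used here only for readability.
async function createLandmarkers(wasmPath: string) {
  const vision = await FilesetResolver.forVisionTasks(wasmPath);

  const faceLandmarker = await FaceLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: 'face_landmarker.task', delegate: 'GPU' },
    runningMode: 'VIDEO',
    numFaces: 2, // more than one face must be detectable to flag MULTIPLE_FACES
  });

  const handLandmarker = await HandLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: 'hand_landmarker.task', delegate: 'GPU' },
    runningMode: 'VIDEO',
    numHands: 2,
  });

  return { faceLandmarker, handLandmarker };
}

// Each animation frame, both models are queried with detectForVideo(video, timestamp)
// and their landmark sets feed the validation pipeline shown above.
```
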
## 🔧 Development

### Scripts

```bash
npm run dev          # Start local dev server (webpack)
npm run build        # Build SDK (CJS, ESM, UMD)
npm run build:demo   # Build production demo
npm run lint         # Run ESLint
npm run format       # Format code with Prettier
npm run test         # Run tests (Jest)
```

### Project Structure

```
face-validator-sdk/
├── src/
│   ├── FaceValidator.ts      # Main validator class
│   ├── types.ts              # TypeScript types
│   ├── utils.ts              # Validation functions
│   ├── i18n.ts               # Internationalization
│   └── index.ts              # Public API
├── demo/
│   ├── demo.ts               # Local development demo
│   ├── demo-standalone.ts    # Production demo
│   └── public/index.html     # Demo HTML
├── dist/                     # Built SDK (generated)
└── tests/                    # Test files
```

## 📄 License

MIT License - see [LICENSE](LICENSE) file for details.

## 🙏 Acknowledgments

- [MediaPipe](https://developers.google.com/mediapipe) by Google for the powerful machine learning models

## 📞 Support

- 🐛 [Report Bug](https://github.com/rwmsousa/face-validator-sdk/issues)
- 💡 [Request Feature](https://github.com/rwmsousa/face-validator-sdk/issues)
- 📧 Contact: [GitHub Profile](https://github.com/rwmsousa)

---

Made with ❤️ using MediaPipe

package/dist/face-validator-sdk.cjs.js
ADDED
@@ -0,0 +1,2 @@
(()=>{"use strict";var e={d:(t,n)=>{for(var a in n)e.o(n,a)&&!e.o(t,a)&&Object.defineProperty(t,a,{enumerable:!0,get:n[a]})},o:(e,t)=>Object.prototype.hasOwnProperty.call(e,t),r:e=>{"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})}},t={};e.r(t),e.d(t,{FaceValidator:()=>T,ValidationStatus:()=>a,default:()=>S,getLoadingModelsMessage:()=>l,getMessage:()=>r,getValidationMessages:()=>s});const n=require("@mediapipe/tasks-vision");var a;!function(e){e.INITIALIZING="INITIALIZING",e.NO_FACE_DETECTED="NO_FACE_DETECTED",e.FACE_DETECTED="FACE_DETECTED",e.TOO_CLOSE="TOO_CLOSE",e.TOO_FAR="TOO_FAR",e.OFF_CENTER="OFF_CENTER",e.FACE_OBSTRUCTED="FACE_OBSTRUCTED",e.HEAD_NOT_STRAIGHT="HEAD_NOT_STRAIGHT",e.MULTIPLE_FACES="MULTIPLE_FACES",e.POOR_ILLUMINATION="POOR_ILLUMINATION",e.NOT_NEUTRAL_EXPRESSION="NOT_NEUTRAL_EXPRESSION",e.DARK_GLASSES="DARK_GLASSES",e.STAY_STILL="STAY_STILL",e.CAPTURING="CAPTURING",e.SUCCESS="SUCCESS",e.ERROR="ERROR"}(a||(a={}));const i={"pt-BR":{[a.INITIALIZING]:"Inicializando câmera e detector...",[a.NO_FACE_DETECTED]:"Posicione seu rosto no centro do oval.",[a.FACE_DETECTED]:"Analisando...",[a.TOO_CLOSE]:"Afaste-se um pouco",[a.TOO_FAR]:"Aproxime-se da câmera",[a.OFF_CENTER]:"Centralize o rosto no centro do oval",[a.FACE_OBSTRUCTED]:"Mantenha o rosto totalmente visível. Remova as mãos do rosto.",[a.HEAD_NOT_STRAIGHT]:"Olhe diretamente para a câmera e mantenha a cabeça reta.",[a.MULTIPLE_FACES]:"Mantenha apenas uma pessoa no quadro.",[a.POOR_ILLUMINATION]:"Procure um ambiente com boa iluminação e centralize seu rosto no centro do oval.",[a.NOT_NEUTRAL_EXPRESSION]:"Mantenha expressão neutra: boca fechada, sem sorrir e olhos abertos.",[a.DARK_GLASSES]:"Remova os óculos escuros. Óculos de grau são permitidos.",[a.STAY_STILL]:"Fique imóvel para capturar a foto",[a.CAPTURING]:"Capturando...",[a.SUCCESS]:"Captura realizada!",[a.ERROR]:"Ocorreu um erro."},en:{[a.INITIALIZING]:"Initializing camera and detector...",[a.NO_FACE_DETECTED]:"Position your face in the center of the oval.",[a.FACE_DETECTED]:"Analyzing...",[a.TOO_CLOSE]:"Move back a little",[a.TOO_FAR]:"Move closer to the camera",[a.OFF_CENTER]:"Center your face in the center of the oval",[a.FACE_OBSTRUCTED]:"Keep your face fully visible. Remove your hands from your face.",[a.HEAD_NOT_STRAIGHT]:"Look directly at the camera and keep your head straight.",[a.MULTIPLE_FACES]:"Keep only one person in the frame.",[a.POOR_ILLUMINATION]:"Find a well-lit environment and center your face in the oval.",[a.NOT_NEUTRAL_EXPRESSION]:"Keep a neutral expression: mouth closed, no smiling, and eyes open.",[a.DARK_GLASSES]:"Remove sunglasses. Prescription glasses are allowed.",[a.STAY_STILL]:"Stay still to capture the photo",[a.CAPTURING]:"Capturing...",[a.SUCCESS]:"Capture complete!",[a.ERROR]:"An error occurred."},es:{[a.INITIALIZING]:"Inicializando cámara y detector...",[a.NO_FACE_DETECTED]:"Coloque su rostro en el centro del óvalo.",[a.FACE_DETECTED]:"Analizando...",[a.TOO_CLOSE]:"Aléjese un poco",[a.TOO_FAR]:"Acérquese a la cámara",[a.OFF_CENTER]:"Centre el rostro en el centro del óvalo",[a.FACE_OBSTRUCTED]:"Mantenga el rostro totalmente visible. 
Quite las manos del rostro.",[a.HEAD_NOT_STRAIGHT]:"Mire directamente a la cámara y mantenga la cabeza recta.",[a.MULTIPLE_FACES]:"Mantenga solo una persona en el encuadre.",[a.POOR_ILLUMINATION]:"Busque un ambiente con buena iluminación y centre su rostro en el óvalo.",[a.NOT_NEUTRAL_EXPRESSION]:"Mantenga expresión neutra: boca cerrada, sin sonreír y ojos abiertos.",[a.DARK_GLASSES]:"Quite las gafas de sol. Las gafas graduadas están permitidas.",[a.STAY_STILL]:"Permanezca quieto para capturar la foto",[a.CAPTURING]:"Capturando...",[a.SUCCESS]:"¡Captura realizada!",[a.ERROR]:"Ocurrió un error."}},o={"pt-BR":"Status desconhecido.",en:"Unknown status.",es:"Estado desconhecido."};function s(e){return Object.assign({},i[e])}function r(e,t){var n;return null!==(n=i[t][e])&&void 0!==n?n:o[t]}function l(e){return{"pt-BR":"Carregando...",en:"Loading...",es:"Cargando..."}[e]}function c(e){const t=e.data;let n=0;for(let e=0;e<t.length;e+=4)n+=.2126*t[e]+.7152*t[e+1]+.0722*t[e+2];return n/(t.length/4)}const h=[33,133,159,145],d=[263,362,386,374],u=[61,291,0,17,39,269,270,409],m=.38;var E=function(e,t,n,a){return new(n||(n=Promise))(function(i,o){function s(e){try{l(a.next(e))}catch(e){o(e)}}function r(e){try{l(a.throw(e))}catch(e){o(e)}}function l(e){var t;e.done?i(e.value):(t=e.value,t instanceof n?t:new n(function(e){e(t)})).then(s,r)}l((a=a.apply(e,t||[])).next())})};const g={overlayCanvasElement:void 0,videoWidth:512,videoHeight:384,minDetectionConfidence:.4,minIlluminationThreshold:50,minFaceSizeFactor:.15,maxFaceSizeFactor:.75,stabilizationTimeThreshold:1e3,stabilityMovementThreshold:5,minFaceVisibilityScore:.4,maxHeadTiltDegrees:30,maxHandFaceDistance:.15,debugMode:!1,locale:"en",customMessages:{}};class T{constructor(e){this.faceLandmarker=null,this.handLandmarker=null,this.animationFrameId=null,this.lastDetection=null,this.stableSince=null,this.isCapturing=!1,this.options=this.resolveOptions(e),this.setStatus(a.INITIALIZING),this.init()}resolveOptions(e){const t=e.modelPath||"https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm";return Object.assign(Object.assign(Object.assign({},g),e),{modelPath:t,locale:e.locale||"en",customMessages:e.customMessages||{}})}init(){return E(this,void 0,void 0,function*(){try{const e=l(this.options.locale);this.setStatus(a.INITIALIZING,void 0,e);const t=yield n.FilesetResolver.forVisionTasks(this.options.modelPath);this.faceLandmarker=yield n.FaceLandmarker.createFromOptions(t,{baseOptions:{modelAssetPath:"https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task",delegate:"GPU"},runningMode:"VIDEO",numFaces:2,minFaceDetectionConfidence:this.options.minDetectionConfidence,minFacePresenceConfidence:this.options.minFaceVisibilityScore,minTrackingConfidence:this.options.minFaceVisibilityScore}),this.handLandmarker=yield n.HandLandmarker.createFromOptions(t,{baseOptions:{modelAssetPath:"https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task",delegate:"GPU"},runningMode:"VIDEO",numHands:2,minHandDetectionConfidence:.5,minHandPresenceConfidence:.5,minTrackingConfidence:.5}),this.startDetectionLoop()}catch(e){const t=e instanceof Error?e:new Error(String(e));this.setStatus(a.ERROR,t)}})}getMessageForStatus(e,t){return t||(this.options.customMessages[e]?this.options.customMessages[e]:r(e,this.options.locale))}setStatus(e,t,n){const 
i=this.getMessageForStatus(e,n);this.options.onStatusUpdate(e,i),e===a.ERROR&&t&&this.options.onError(e,t)}startDetectionLoop(){const e=this.options.videoElement,t=this.options.videoWidth||640,n=this.options.videoHeight||480,i=()=>E(this,void 0,void 0,function*(){var o;if(this.faceLandmarker&&this.handLandmarker&&e.videoWidth){try{const i=performance.now();let s=a.NO_FACE_DETECTED,r=null,l=[];const E=this.faceLandmarker.detectForVideo(e,i),g=this.handLandmarker.detectForVideo(e,i);if(g.landmarks&&g.landmarks.length>0&&(l=g.landmarks.map((e,t)=>{var n,a,i;return{landmarks:e,handedness:(null===(i=null===(a=null===(n=g.handednesses)||void 0===n?void 0:n[t])||void 0===a?void 0:a[0])||void 0===i?void 0:i.categoryName)||"Unknown"}})),E.faceLandmarks&&E.faceLandmarks.length>1){s=a.MULTIPLE_FACES,this.stableSince=null;const e=E.faceLandmarks[0],t=(null===(o=E.faceBlendshapes)||void 0===o?void 0:o[0])?this.estimateBoundingBox(e):null;t&&(r={boundingBox:t,landmarks:e,timestamp:i})}else if(E.faceLandmarks&&1===E.faceLandmarks.length){const o=E.faceLandmarks[0],g=this.estimateBoundingBox(o);r={boundingBox:g,landmarks:o,timestamp:i};const T=function(e,t=.18,n=.7){const a=e.width;return a<t?"TOO_FAR":a>n?"TOO_CLOSE":"OK"}(g,this.options.minFaceSizeFactor,this.options.maxFaceSizeFactor);if("OK"!==T)s="TOO_CLOSE"===T?a.TOO_CLOSE:a.TOO_FAR,this.stableSince=null;else{const E=o[4],T=function(e,t,n,a){const i=(e*n-n/2)/(.2*n),o=(t*a-a/2)/(a*m);return i*i+o*o<=1}(E.x,E.y,t,n);if(function(e,t,n){const a=t/2,i=n/2,o=.2*t,s=n*m,r=e.xMin*t,l=(e.xMin+e.width)*t,c=e.yMin*n,h=(e.yMin+e.height)*n,d=((r+l)/2-a)/o,u=((c+h)/2-i)/s;if(d*d+u*u>1)return!1;const E=[{x:r,y:c},{x:l,y:c},{x:r,y:h},{x:l,y:h}];for(const e of E){const t=(e.x-a)/o,n=(e.y-i)/s;t*t+n*n>1.2&&0}}(g,t,n),T)if(function(e,t){if(e.length<478)return!1;const n=e[4],a=u.map(t=>e[t]),i=a.reduce((e,t)=>e+t.y,0)/a.length,o=Math.min(...a.map(e=>e.y)),s=Math.max(...a.map(e=>e.y))-o,r=t.height;return!(i<n.y-.01||i-n.y<.06*r||s<.02*r)}(o,g))if(function(e,t=25){if(e.length<478)return!1;const n=e[h[0]],a=e[d[0]],i=e[4],o=e[13],s=e[14],r=e[152],l=e[10],c=Math.abs(n.y-a.y),u=Math.abs(n.x-a.x);if(u<.01)return!1;const m=c/u;if(Math.atan(m)*(180/Math.PI)>t)return!1;const E=(n.x+a.x)/2,g=i.x-E,T=Math.abs(n.x-a.x);if(T<.01)return!1;const S=Math.abs(g)/T;if(Math.atan(S)*(180/Math.PI)>t)return!1;if(!function(e){if(e.length<478)return!1;const t=e[234],n=e[454],a=e[4],i=Math.abs(t.x-a.x),o=Math.abs(n.x-a.x);return!((i>.01&&o>.01?Math.max(i,o)/Math.min(i,o):1)>1.4||void 0!==t.z&&void 0!==n.z&&Math.abs(t.z-n.z)>.05)}(e))return!1;const f=(n.y+a.y)/2,O=(o.y+s.y)/2,p=r.y-l.y;if(p<.1)return!1;if(l.y>f+.02)return!1;if(f>i.y+.02)return!1;if(i.y>O+.02)return!1;if(O>=r.y)return!1;const I=(f-l.y)/p,y=(i.y-f)/p,C=(O-i.y)/p,A=(r.y-O)/p;return!(I<.06||I>.38||y<.03||y>.3||C<.02||C>.25||A<.04||A>.38)}(o,this.options.maxHeadTiltDegrees))if(l.length>0&&function(e,t,n=.15){const a=t.xMin+t.width/2,i=t.yMin+t.height/2;for(const t of e.landmarks){const e=t.x-a,o=t.y-i;if(Math.sqrt(e*e+o*o)<n)return!0}return!1}(l[0],g,this.options.maxHandFaceDistance))s=a.FACE_OBSTRUCTED,this.stableSince=null;else if(function(e){if(e.length<478)return!1;const t=e[159],n=e[144],a=e[386],i=e[373],o=Math.abs(t.y-n.y),s=Math.abs(a.y-i.y);if(o<.01||s<.01)return!1;const r=e[13],l=e[14];if(Math.abs(r.y-l.y)>.025)return!1;const c=e[61],h=e[291],d=e[4];return!((c.y+h.y)/2-d.y<.05)}(o))if(function(e,t){if(t.length<478)return!1;try{const n=document.createElement("canvas"),a=n.getContext("2d");if(!a)return!1;const 
i=e.videoWidth,o=e.videoHeight,s=[t[33],t[133],t[159],t[144],t[145]],r=[t[263],t[362],t[386],t[373],t[374]],l=e=>{const t=e.map(e=>e.x*i),n=e.map(e=>e.y*o),a=Math.max(0,Math.min(...t)-5),s=Math.min(i,Math.max(...t)+5),r=Math.max(0,Math.min(...n)-5);return{x:a,y:r,width:s-a,height:Math.min(o,Math.max(...n)+5)-r}},h=t=>(n.width=t.width,n.height=t.height,a.drawImage(e,t.x,t.y,t.width,t.height,0,0,t.width,t.height),c(a.getImageData(0,0,t.width,t.height))),d=l(s),u=l(r);return(h(d)+h(u))/2<35}catch(e){return console.warn("Erro ao detectar óculos escuros:",e),!1}}(e,o))s=a.DARK_GLASSES,this.stableSince=null;else{const o=document.createElement("canvas"),l=g.xMin*e.videoWidth,h=g.yMin*e.videoHeight,d=g.width*e.videoWidth,u=g.height*e.videoHeight;o.width=d,o.height=u;const m=o.getContext("2d",{willReadFrequently:!0});if(m){m.drawImage(e,l,h,d,u,0,0,d,u);c(m.getImageData(0,0,o.width,o.height))<this.options.minIlluminationThreshold?(s=a.POOR_ILLUMINATION,this.stableSince=null):function(e,t,n=5,a=512,i=384){if(!e||!t)return!1;const o=(e.boundingBox.xMin+e.boundingBox.width/2)*a,s=(e.boundingBox.yMin+e.boundingBox.height/2)*i,r=(t.boundingBox.xMin+t.boundingBox.width/2)*a,l=(t.boundingBox.yMin+t.boundingBox.height/2)*i,c=Math.abs(o-r),h=Math.abs(s-l),d=Math.abs(e.boundingBox.width-t.boundingBox.width)*a,u=Math.abs(e.boundingBox.height-t.boundingBox.height)*i;return c<=n&&h<=n&&d<=2*n&&u<=2*n}(r,this.lastDetection,this.options.stabilityMovementThreshold,t,n)?(this.stableSince||(this.stableSince=i),s=i-this.stableSince>=this.options.stabilizationTimeThreshold?a.CAPTURING:a.STAY_STILL):(this.stableSince=null,s=a.STAY_STILL)}else s=a.FACE_DETECTED,this.stableSince=null}else s=a.NOT_NEUTRAL_EXPRESSION,this.stableSince=null;else s=a.HEAD_NOT_STRAIGHT,this.stableSince=null;else s=a.FACE_OBSTRUCTED,this.stableSince=null;else s=a.OFF_CENTER,this.stableSince=null}}else this.lastDetection=null,this.stableSince=null;if(this.lastDetection=r,this.setStatus(s),this.options.overlayCanvasElement&&function(e,t,n,i,o){const s=e.getContext("2d");if(!s)return;const r=e.width,l=e.height,c=r/2,u=l/2;s.clearRect(0,0,r,l);const E=.2*r,g=l*m;if(s.fillStyle="rgba(255, 255, 255, 0.35)",s.fillRect(0,0,r,l),s.save(),s.beginPath(),s.ellipse(c,u,E,g,0,0,2*Math.PI),s.closePath(),s.globalCompositeOperation="destination-out",s.fill(),s.restore(),s.strokeStyle="rgba(255, 255, 255, 0.9)",s.lineWidth=3,s.beginPath(),s.ellipse(c,u,E,g,0,0,2*Math.PI),s.stroke(),s.strokeStyle="rgba(255, 255, 255, 0.45)",s.lineWidth=1,s.beginPath(),s.moveTo(c-6,u),s.lineTo(c+6,u),s.moveTo(c,u-6),s.lineTo(c,u+6),s.stroke(),t&&i){const e=i.landmarks;if(e.length>=478){const t=e[10],i=e[152],o=e[234],c=e[454],u=e.map(e=>e.x),m=e.map(e=>e.y),E=Math.min(...u),g=Math.max(...u),T=Math.min(...m),S=g-E,f=Math.max(...m)-T,O=.08,p=(E-S*O)*r,I=(T-f*O)*l,y=S*(1+2*O)*r,C=f*(1+2*O)*l;let A="red";n===a.STAY_STILL||n===a.CAPTURING?A="lime":n===a.FACE_DETECTED&&(A="yellow"),s.strokeStyle=A,s.lineWidth=3,s.strokeRect(p,I,y,C);const 
_=e[4];e[h[0]],e[d[0]],s.fillStyle="cyan",s.beginPath(),s.arc(_.x*r,_.y*l,5,0,2*Math.PI),s.fill(),s.fillStyle="magenta",s.beginPath(),s.arc(t.x*r,t.y*l,4,0,2*Math.PI),s.fill(),s.fillStyle="lime",s.beginPath(),s.arc(i.x*r,i.y*l,4,0,2*Math.PI),s.fill(),s.fillStyle="yellow",[e[33],e[133],e[159],e[144],e[145]].forEach(e=>{s.beginPath(),s.arc(e.x*r,e.y*l,3,0,2*Math.PI),s.fill()}),s.fillStyle="yellow",[e[263],e[362],e[386],e[373],e[374]].forEach(e=>{s.beginPath(),s.arc(e.x*r,e.y*l,3,0,2*Math.PI),s.fill()}),s.fillStyle="purple",s.beginPath(),s.arc(o.x*r,o.y*l,3,0,2*Math.PI),s.fill(),s.beginPath(),s.arc(c.x*r,c.y*l,3,0,2*Math.PI),s.fill()}}t&&o&&o.length>0&&o.forEach(e=>{s.fillStyle="orange",e.landmarks.forEach(e=>{s.beginPath(),s.arc(e.x*r,e.y*l,3,0,2*Math.PI),s.fill()})})}(this.options.overlayCanvasElement,this.options.debugMode||!1,s,r||void 0,l.length>0?l:void 0),s===a.CAPTURING&&!this.isCapturing)return this.isCapturing=!0,yield this.captureImage(),this.setStatus(a.SUCCESS),void this.stop()}catch(e){const t=e instanceof Error?e:new Error(String(e));this.setStatus(a.ERROR,t)}this.animationFrameId=requestAnimationFrame(i)}else this.animationFrameId=requestAnimationFrame(i)});this.animationFrameId=requestAnimationFrame(i)}estimateBoundingBox(e){const t=e.map(e=>e.x),n=e.map(e=>e.y),a=Math.min(...t),i=Math.max(...t),o=Math.min(...n);return{xMin:a,yMin:o,width:i-a,height:Math.max(...n)-o}}captureImage(){return E(this,void 0,void 0,function*(){const e=this.options.videoElement,t=document.createElement("canvas");t.width=e.videoWidth,t.height=e.videoHeight;const n=t.getContext("2d");n?(n.drawImage(e,0,0,t.width,t.height),t.toBlob(e=>{e?this.options.onCaptureSuccess(e):this.setStatus(a.ERROR,new Error("Failed to generate image blob"))},"image/jpeg",.95)):this.setStatus(a.ERROR,new Error("Failed to get canvas context"))})}stop(){null!==this.animationFrameId&&(cancelAnimationFrame(this.animationFrameId),this.animationFrameId=null),this.faceLandmarker&&this.faceLandmarker.close(),this.handLandmarker&&this.handLandmarker.close()}}const S=T;module.exports=t})();
//# sourceMappingURL=face-validator-sdk.cjs.js.map
package/dist/face-validator-sdk.cjs.js.map
ADDED
@@ -0,0 +1 @@
{"version":3,"file":"face-validator-sdk.cjs.js","mappings":"mBACA,IAAIA,EAAsB,CCA1BA,EAAwB,CAACC,EAASC,KACjC,IAAI,IAAIC,KAAOD,EACXF,EAAoBI,EAAEF,EAAYC,KAASH,EAAoBI,EAAEH,EAASE,IAC5EE,OAAOC,eAAeL,EAASE,EAAK,CAAEI,YAAY,EAAMC,IAAKN,EAAWC,MCJ3EH,EAAwB,CAACS,EAAKC,IAAUL,OAAOM,UAAUC,eAAeC,KAAKJ,EAAKC,GCClFV,EAAyBC,IACH,oBAAXa,QAA0BA,OAAOC,aAC1CV,OAAOC,eAAeL,EAASa,OAAOC,YAAa,CAAEC,MAAO,WAE7DX,OAAOC,eAAeL,EAAS,aAAc,CAAEe,OAAO,M,yJCLvD,MAAM,EAA+BC,QAAQ,2BCAtC,IAAIC,GACX,SAAWA,GACPA,EAA+B,aAAI,eACnCA,EAAmC,iBAAI,mBACvCA,EAAgC,cAAI,gBACpCA,EAA4B,UAAI,YAChCA,EAA0B,QAAI,UAC9BA,EAA6B,WAAI,aACjCA,EAAkC,gBAAI,kBACtCA,EAAoC,kBAAI,oBACxCA,EAAiC,eAAI,iBACrCA,EAAoC,kBAAI,oBACxCA,EAAyC,uBAAI,yBAC7CA,EAA+B,aAAI,eACnCA,EAA6B,WAAI,aACjCA,EAA4B,UAAI,YAChCA,EAA0B,QAAI,UAC9BA,EAAwB,MAAI,OAC/B,CAjBD,CAiBGA,IAAqBA,EAAmB,CAAC,ICjB5C,MAAMC,EAAW,CACb,QAAS,CACL,CAACD,EAAiBE,cAAe,qCACjC,CAACF,EAAiBG,kBAAmB,yCACrC,CAACH,EAAiBI,eAAgB,gBAClC,CAACJ,EAAiBK,WAAY,qBAC9B,CAACL,EAAiBM,SAAU,wBAC5B,CAACN,EAAiBO,YAAa,uCAC/B,CAACP,EAAiBQ,iBAAkB,gEACpC,CAACR,EAAiBS,mBAAoB,2DACtC,CAACT,EAAiBU,gBAAiB,wCACnC,CAACV,EAAiBW,mBAAoB,mFACtC,CAACX,EAAiBY,wBAAyB,uEAC3C,CAACZ,EAAiBa,cAAe,2DACjC,CAACb,EAAiBc,YAAa,oCAC/B,CAACd,EAAiBe,WAAY,gBAC9B,CAACf,EAAiBgB,SAAU,qBAC5B,CAAChB,EAAiBiB,OAAQ,oBAE9BC,GAAI,CACA,CAAClB,EAAiBE,cAAe,sCACjC,CAACF,EAAiBG,kBAAmB,gDACrC,CAACH,EAAiBI,eAAgB,eAClC,CAACJ,EAAiBK,WAAY,qBAC9B,CAACL,EAAiBM,SAAU,4BAC5B,CAACN,EAAiBO,YAAa,6CAC/B,CAACP,EAAiBQ,iBAAkB,kEACpC,CAACR,EAAiBS,mBAAoB,2DACtC,CAACT,EAAiBU,gBAAiB,qCACnC,CAACV,EAAiBW,mBAAoB,gEACtC,CAACX,EAAiBY,wBAAyB,sEAC3C,CAACZ,EAAiBa,cAAe,uDACjC,CAACb,EAAiBc,YAAa,kCAC/B,CAACd,EAAiBe,WAAY,eAC9B,CAACf,EAAiBgB,SAAU,oBAC5B,CAAChB,EAAiBiB,OAAQ,sBAE9BE,GAAI,CACA,CAACnB,EAAiBE,cAAe,qCACjC,CAACF,EAAiBG,kBAAmB,4CACrC,CAACH,EAAiBI,eAAgB,gBAClC,CAACJ,EAAiBK,WAAY,kBAC9B,CAACL,EAAiBM,SAAU,wBAC5B,CAACN,EAAiBO,YAAa,0CAC/B,CAACP,EAAiBQ,iBAAkB,qEACpC,CAACR,EAAiBS,mBAAoB,4DACtC,CAACT,EAAiBU,gBAAiB,4CACnC,CAACV,EAAiBW,mBAAoB,2EACtC,CAACX,EAAiBY,wBAAyB,wEAC3C,CAACZ,EAAiBa,cAAe,gEACjC,CAACb,EAAiBc,YAAa,0CAC/B,CAACd,EAAiBe,WAAY,gBAC9B,CAACf,EAAiBgB,SAAU,sBAC5B,CAAChB,EAAiBiB,OAAQ,sBAG5BG,EAAwB,CAC1B,QAAS,uBACTF,GAAI,kBACJC,GAAI,wBAKD,SAASE,EAAsBC,GAClC,OAAOnC,OAAOoC,OAAO,CAAC,EAAGtB,EAASqB,GACtC,CAIO,SAASE,EAAWC,EAAQH,GAC/B,IAAII,EACJ,OAA2C,QAAnCA,EAAKzB,EAASqB,GAAQG,UAAiC,IAAZC,EAAgBA,EAAKN,EAAsBE,EAClG,CAIO,SAASK,EAAwBL,GAMpC,MALgB,CACZ,QAAS,gBACTJ,GAAI,aACJC,GAAI,eAEOG,EACnB,CCjFO,SAASM,EAA2BC,GACvC,MAAMC,EAAOD,EAAUC,KACvB,IAAIC,EAAM,EACV,IAAK,IAAIC,EAAI,EAAGA,EAAIF,EAAKG,OAAQD,GAAK,EAMlCD,GADkB,MAJRD,EAAKE,GAIgB,MAHrBF,EAAKE,EAAI,GAGyB,MAFlCF,EAAKE,EAAI,GAKvB,OAAOD,GAAOD,EAAKG,OAAS,EAChC,CA0BA,MACMC,EAAqB,CAAC,GAAI,IAAK,IAAK,KACpCC,EAAsB,CAAC,IAAK,IAAK,IAAK,KACtCC,EAAwB,CAAC,GAAI,IAAK,EAAG,GAAI,GAAI,IAAK,IAAK,KAgBvDC,EAAuB,IC7D7B,IAAIC,EAAwC,SAAUC,EAASC,EAAYC,EAAGC,GAE1E,OAAO,IAAKD,IAAMA,EAAIE,UAAU,SAAUC,EAASC,GAC/C,SAASC,EAAUhD,GAAS,IAAMiD,EAAKL,EAAUM,KAAKlD,GAAS,CAAE,MAAOmD,GAAKJ,EAAOI,EAAI,CAAE,CAC1F,SAASC,EAASpD,GAAS,IAAMiD,EAAKL,EAAiB,MAAE5C,GAAS,CAAE,MAAOmD,GAAKJ,EAAOI,EAAI,CAAE,CAC7F,SAASF,EAAKI,GAJlB,IAAerD,EAIaqD,EAAOC,KAAOR,EAAQO,EAAOrD,QAJ1CA,EAIyDqD,EAAOrD,MAJhDA,aAAiB2C,EAAI3C,EAAQ,IAAI2C,EAAE,SAAUG,GAAWA,EAAQ9C,EAAQ,IAIjBuD,KAAKP,EAAWI,EAAW,CAC7GH,GAAML,EAAYA,EAAUY,MAAMf,EAASC,GAAc,KAAKQ,OAClE,EACJ,EAKA,MACMO,EAAiB,CACnBC,0BAAsBC,EACtBC,WAAY,IACZC,YAAa,IACbC,uBAAwB,GACxBC,yBAA0B,GAC1BC,kBAAmB,IACnBC,kBAAmB,IACnBC,2BAA4B,IAC5BC,2BAA4B,EAC5BC,uBAAwB,GACxBC,mBAAoB,GACpBC,oBAAqB,IACrBC,WAAW,EACX/C,OAfmB,KAgBnBgD,eAAgB,CAAC,GAKd,MAAMC,EACT,WAAAC,CAAYC,GACRC,KAAKC,eAAiB,KACtBD,KAAKE,eAAiB,KA
CtBF,KAAKG,iBAAmB,KACxBH,KAAKI,cAAgB,KACrBJ,KAAKK,YAAc,KACnBL,KAAKM,aAAc,EACnBN,KAAKD,QAAUC,KAAKO,eAAeR,GACnCC,KAAKQ,UAAUlF,EAAiBE,cAChCwE,KAAKS,MACT,CACA,cAAAF,CAAeR,GACX,MAAMW,EAAYX,EAAQW,WAAa,mEACvC,OAAOjG,OAAOoC,OAAOpC,OAAOoC,OAAOpC,OAAOoC,OAAO,CAAC,EAAGgC,GAAiBkB,GAAU,CAAEW,YAAW9D,OAAQmD,EAAQnD,QAnC9F,KAmCwHgD,eAAgBG,EAAQH,gBAAkB,CAAC,GACtL,CACA,IAAAa,GACI,OAAO7C,EAAUoC,UAAW,OAAQ,EAAG,YACnC,IACI,MAAMW,EAAa1D,EAAwB+C,KAAKD,QAAQnD,QACxDoD,KAAKQ,UAAUlF,EAAiBE,kBAAcuD,EAAW4B,GAEzD,MAAMC,QAAe,EAAAC,gBAAgBC,eAAed,KAAKD,QAAQW,WAEjEV,KAAKC,qBAAuB,EAAAc,eAAeC,kBAAkBJ,EAAQ,CACjEK,YAAa,CACTC,eAAgB,iHAChBC,SAAU,OAEdC,YAAa,QACbC,SAAU,EACVC,2BAA4BtB,KAAKD,QAAQb,uBACzCqC,0BAA2BvB,KAAKD,QAAQP,uBACxCgC,sBAAuBxB,KAAKD,QAAQP,yBAGxCQ,KAAKE,qBAAuB,EAAAuB,eAAeT,kBAAkBJ,EAAQ,CACjEK,YAAa,CACTC,eAAgB,iHAChBC,SAAU,OAEdC,YAAa,QACbM,SAAU,EACVC,2BAA4B,GAC5BC,0BAA2B,GAC3BJ,sBAAuB,KAE3BxB,KAAK6B,oBACT,CACA,MAAOC,GACH,MAAMC,EAAQD,aAAeE,MAAQF,EAAM,IAAIE,MAAMC,OAAOH,IAC5D9B,KAAKQ,UAAUlF,EAAiBiB,MAAOwF,EAC3C,CACJ,EACJ,CACA,mBAAAG,CAAoBnF,EAAQoF,GACxB,OAAIA,IAEAnC,KAAKD,QAAQH,eAAe7C,GACrBiD,KAAKD,QAAQH,eAAe7C,GAEhCD,EAAWC,EAAQiD,KAAKD,QAAQnD,QAC3C,CACA,SAAA4D,CAAUzD,EAAQgF,EAAOI,GACrB,MAAMC,EAAUpC,KAAKkC,oBAAoBnF,EAAQoF,GACjDnC,KAAKD,QAAQsC,eAAetF,EAAQqF,GAChCrF,IAAWzB,EAAiBiB,OAASwF,GACrC/B,KAAKD,QAAQuC,QAAQvF,EAAQgF,EAErC,CACA,kBAAAF,GACI,MAAMU,EAAQvC,KAAKD,QAAQyC,aACrBC,EAAazC,KAAKD,QAAQf,YAAc,IACxC0D,EAAc1C,KAAKD,QAAQd,aAAe,IAC1C0D,EAAS,IAAM/E,EAAUoC,UAAW,OAAQ,EAAG,YACjD,IAAIhD,EACJ,GAAKgD,KAAKC,gBAAmBD,KAAKE,gBAAmBqC,EAAMvD,WAA3D,CAIA,IACI,MAAM4D,EAAMC,YAAYD,MACxB,IAAIE,EAAgBxH,EAAiBG,iBACjCsH,EAAW,KACXC,EAAW,GAEf,MAAMC,EAAcjD,KAAKC,eAAeiD,eAAeX,EAAOK,GAExDO,EAAcnD,KAAKE,eAAegD,eAAeX,EAAOK,GAW9D,GATIO,EAAYC,WAAaD,EAAYC,UAAU7F,OAAS,IACxDyF,EAAWG,EAAYC,UAAUC,IAAI,CAACD,EAAWE,KAC7C,IAAItG,EAAIuG,EAAIC,EACZ,MAAO,CACHJ,YACAK,YAAuJ,QAAzID,EAA6F,QAAvFD,EAAyC,QAAnCvG,EAAKmG,EAAYO,oBAAsC,IAAZ1G,OAAqB,EAAIA,EAAGsG,UAA8B,IAAZC,OAAqB,EAAIA,EAAG,UAA4B,IAAZC,OAAqB,EAAIA,EAAGG,eAAiB,cAIpNV,EAAYW,eAAiBX,EAAYW,cAAcrG,OAAS,EAAG,CAEnEuF,EAAgBxH,EAAiBU,eACjCgE,KAAKK,YAAc,KAEnB,MAAM+C,EAAYH,EAAYW,cAAc,GACtCC,GAA8C,QAAtC7G,EAAKiG,EAAYa,uBAAyC,IAAZ9G,OAAqB,EAAIA,EAAG,IAAMgD,KAAK+D,oBAAoBX,GAAa,KAChIS,IACAd,EAAW,CAAEiB,YAAaH,EAAKT,YAAWa,UAAWrB,GAE7D,MACK,GAAIK,EAAYW,eAAsD,IAArCX,EAAYW,cAAcrG,OAAc,CAE1E,MAAM6F,EAAYH,EAAYW,cAAc,GACtCI,EAAchE,KAAK+D,oBAAoBX,GAC7CL,EAAW,CACPiB,cACAZ,YACAa,UAAWrB,GAGf,MAAMsB,EDtInB,SAA2BF,EAAa5E,EAAoB,IAAMC,EAAoB,IAEzF,MAAM8E,EAAiBH,EAAYI,MACnC,OAAID,EAAiB/E,EACV,UACP+E,EAAiB9E,EACV,YACJ,IACX,CC8H2CgF,CAAkBL,EAAahE,KAAKD,QAAQX,kBAAmBY,KAAKD,QAAQV,mBACnG,GAAuB,OAAnB6E,EACApB,EAAmC,cAAnBoB,EAAiC5I,EAAiBK,UAAYL,EAAiBM,QAC/FoE,KAAKK,YAAc,SAElB,CAGD,MAAMiE,EAAOlB,EAAU,GACjBmB,EDlGvB,SAA2BC,EAAQC,EAAQhC,EAAYC,GAE1D,MAMMgC,GANKF,EAAS/B,EAETA,EAAa,IATC,GAWdA,GAGLkC,GANKF,EAAS/B,EAETA,EAAc,IAEdA,EAAc/E,GAGzB,OAAO+G,EAAKA,EAAKC,EAAKA,GAAM,CAChC,CCuF+CC,CAAkBN,EAAKO,EAAGP,EAAKQ,EAAGrC,EAAYC,GAIrE,GDtFjB,SAAqCsB,EAAavB,EAAYC,GACjE,MAAMqC,EAAKtC,EAAa,EAClBuC,EAAKtC,EAAc,EACnBuC,EAxBmB,GAwBdxC,EACLyC,EAAKxC,EAAc/E,EAEnBwH,EAAWnB,EAAYoB,KAAO3C,EAC9B4C,GAAarB,EAAYoB,KAAOpB,EAAYI,OAAS3B,EACrD6C,EAAUtB,EAAYuB,KAAO7C,EAC7B8C,GAAcxB,EAAYuB,KAAOvB,EAAYyB,QAAU/C,EAIvDgD,IAHeP,EAAWE,GAAa,EAGbN,GAAME,EAChCU,IAHeL,EAAUE,GAAc,EAGbR,GAAME,EACtC,GAAIQ,EAAWA,EAAWC,EAAWA,EAAW,EAE5C,OAAO,EAGX,MAAMC,EAAU,CACZ,CAAEf,EAAGM,EAAUL,EAAGQ,GAClB,CAAET,EAAGQ,EAAWP,EAAGQ,GACnB,CAAET,EAAGM,EAAUL,EAAGU,GAClB,CAAEX,EAAGQ,EAAWP,EAAGU,IAIvB,IAAK,MAAMK,KAAUD,EAAS,CAC1B,MAAMlB,GAAMmB,EAAOhB,EAAIE,GAAME,EACvBN,GAAMkB,EAAOf,EAAIE,GAAME,EACzBR,EAAKA,EAAKC,EAAKA
,EAAK,KAEpBmB,CAER,CAGJ,CC6CiDC,CAA4B/B,EAAavB,EAAYC,GAGzE6B,EAIA,GDmDtB,SAAiCnB,EAAWY,GAC/C,GAAIZ,EAAU7F,OAAS,IACnB,OAAO,EACX,MAAM+G,EAAOlB,EAvLU,GAyLjB4C,EAActI,EAAsB2F,IAAIC,GAAOF,EAAUE,IACzD2C,EAAeD,EAAYE,OAAO,CAACC,EAAGC,IAAMD,EAAIC,EAAEtB,EAAG,GAAKkB,EAAYzI,OACtE8I,EAAYC,KAAKC,OAAOP,EAAY3C,IAAI+C,GAAKA,EAAEtB,IAE/C0B,EADYF,KAAKG,OAAOT,EAAY3C,IAAI+C,GAAKA,EAAEtB,IACbuB,EAClCK,EAAY1C,EAAYyB,OAE9B,QAAIQ,EAAe3B,EAAKQ,EAAI,KAGJmB,EAAe3B,EAAKQ,EACtB,IAAO4B,GAGzBF,EAAsB,IAAOE,EAGrC,CCzEkCC,CAAwBvD,EAAWY,GAIxC,GDnDtB,SAAwBZ,EAAWwD,EAAiB,IACvD,GAAIxD,EAAU7F,OAAS,IACnB,OAAO,EAEX,MAAMsJ,EAAUzD,EAAU5F,EAAmB,IACvCsJ,EAAW1D,EAAU3F,EAAoB,IACzC6G,EAAOlB,EAxFU,GA0FjB2D,EAAW3D,EAAU,IACrB4D,EAAW5D,EAAU,IACrB6D,EAAO7D,EAAU,KACjB8D,EAAW9D,EAAU,IAErB+D,EAAYb,KAAKc,IAAIP,EAAQ/B,EAAIgC,EAAShC,GAC1CuC,EAAYf,KAAKc,IAAIP,EAAQhC,EAAIiC,EAASjC,GAChD,GAAIwC,EAAY,IACZ,OAAO,EACX,MAAMC,EAAYH,EAAYE,EAE9B,GADqBf,KAAKiB,KAAKD,IAAc,IAAMhB,KAAKkB,IACrCZ,EACf,OAAO,EAGX,MAAMa,GAAYZ,EAAQhC,EAAIiC,EAASjC,GAAK,EACtC6C,EAAcpD,EAAKO,EAAI4C,EACvBE,EAAUrB,KAAKc,IAAIP,EAAQhC,EAAIiC,EAASjC,GAC9C,GAAI8C,EAAU,IACV,OAAO,EACX,MAAMC,EAAWtB,KAAKc,IAAIM,GAAeC,EAEzC,GADoBrB,KAAKiB,KAAKK,IAAa,IAAMtB,KAAKkB,IACpCZ,EACd,OAAO,EAEX,IAoMG,SAAyBxD,GAC5B,GAAIA,EAAU7F,OAAS,IACnB,OAAO,EACX,MAAMsK,EAAUzE,EAtTO,KAuTjB0E,EAAW1E,EAtTO,KAuTlBkB,EAAOlB,EA5TU,GA8TjB2E,EAAiBzB,KAAKc,IAAIS,EAAQhD,EAAIP,EAAKO,GAC3CmD,EAAkB1B,KAAKc,IAAIU,EAASjD,EAAIP,EAAKO,GAOnD,SALuBkD,EAAiB,KAAQC,EAAkB,IAC5D1B,KAAKG,IAAIsB,EAAgBC,GAAmB1B,KAAKC,IAAIwB,EAAgBC,GACrE,GAGe,UAKHjJ,IAAd8I,EAAQI,QAAkClJ,IAAf+I,EAASG,GAChB3B,KAAKc,IAAIS,EAAQI,EAAIH,EAASG,GAEhC,IAK1B,CAhOSC,CAAgB9E,GACjB,OAAO,EAEX,MAAM+E,GAAYtB,EAAQ/B,EAAIgC,EAAShC,GAAK,EACtCsD,GAAUrB,EAASjC,EAAIkC,EAASlC,GAAK,EAIrCuD,EAAapB,EAAKnC,EAAIoC,EAASpC,EACrC,GAAIuD,EAAa,GAEb,OAAO,EAKX,GAAInB,EAASpC,EAAIqD,EAAW,IACxB,OAAO,EAGX,GAAIA,EAAW7D,EAAKQ,EAAI,IACpB,OAAO,EAGX,GAAIR,EAAKQ,EAAIsD,EAAS,IAClB,OAAO,EAGX,GAAIA,GAAUnB,EAAKnC,EACf,OAAO,EAGX,MAIMwD,GAJiBH,EAAWjB,EAASpC,GAIAuD,EACrCE,GAJajE,EAAKQ,EAAIqD,GAIOE,EAC7BG,GAJcJ,EAAS9D,EAAKQ,GAIGuD,EAC/BI,GAJcxB,EAAKnC,EAAIsD,GAIQC,EAIrC,QAAIC,EAAoB,KAAQA,EAAoB,KAIhDC,EAAgB,KAAQA,EAAgB,IAIxCC,EAAiB,KAAQA,EAAiB,KAK1CC,EAAiB,KAAQA,EAAiB,IAIlD,CC1CkCC,CAAetF,EAAWpD,KAAKD,QAAQN,oBAI5C,GAAIuD,EAASzF,OAAS,GDmO5C,SAAwByF,EAAU2F,EAAiBC,EAAc,KACpE,MAAMC,EAAcF,EAAgBvD,KAAOuD,EAAgBvE,MAAQ,EAC7D0E,EAAcH,EAAgBpD,KAAOoD,EAAgBlD,OAAS,EAEpE,IAAK,MAAMsD,KAAY/F,EAASI,UAAW,CACvC,MAAMsB,EAAKqE,EAASlE,EAAIgE,EAClBlE,EAAKoE,EAASjE,EAAIgE,EAExB,GADiBxC,KAAK0C,KAAKtE,EAAKA,EAAKC,EAAKA,GAC3BiE,EACX,OAAO,CAEf,CACA,OAAO,CACX,CChPwDK,CAAejG,EAAS,GAAIgB,EAAahE,KAAKD,QAAQL,qBAElFoD,EAAgBxH,EAAiBQ,gBACjCkE,KAAKK,YAAc,UAElB,GDiItB,SAA6B+C,GAChC,GAAIA,EAAU7F,OAAS,IACnB,OAAO,EAEX,MAAM2L,EAAa9F,EA7QQ,KA8QrB+F,EAAgB/F,EA7QQ,KA8QxBgG,EAAchG,EA7QQ,KA8QtBiG,EAAiBjG,EA7QQ,KA8QzBkG,EAAkBhD,KAAKc,IAAI8B,EAAWpE,EAAIqE,EAAcrE,GACxDyE,EAAmBjD,KAAKc,IAAIgC,EAAYtE,EAAIuE,EAAevE,GAEjE,GAAIwE,EAAkB,KAAQC,EAAmB,IAC7C,OAAO,EAGX,MAAMC,EAAWpG,EAlRO,IAmRlBqG,EAAcrG,EAlRO,IAqR3B,GAFsBkD,KAAKc,IAAIoC,EAAS1E,EAAI2E,EAAY3E,GAEpC,KAChB,OAAO,EAGX,MAAM4E,EAAkBtG,EA5RQ,IA6R1BuG,EAAmBvG,EA5RQ,KA6R3BkB,EAAOlB,EAxSU,GA8SvB,SAJ0BsG,EAAgB5E,EAAI6E,EAAiB7E,GAAK,EACvBR,EAAKQ,EAG1B,IAI5B,CCpKkC8E,CAAoBxG,GAKzB,GD4DtB,SAAwBb,EAAOa,GAClC,GAAIA,EAAU7F,OAAS,IACnB,OAAO,EACX,IAEI,MAAMsM,EAAaC,SAASC,cAAc,UACpCC,EAAMH,EAAWI,WAAW,MAClC,IAAKD,EACD,OAAO,EACX,MAAMhL,EAAauD,EAAMvD,WACnBC,EAAcsD,EAAMtD,YAEpBiL,EAAmB,CACrB9G,EAAU,IACVA,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,MAER+G,EAAoB,CACtB/G,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,MAGRgH,EAAkBC,IACpB,MAAMC,EA
AKD,EAAahH,IAAIkH,GAAKA,EAAE1F,EAAI7F,GACjCwL,EAAKH,EAAahH,IAAIkH,GAAKA,EAAEzF,EAAI7F,GACjCwL,EAAOnE,KAAKG,IAAI,EAAGH,KAAKC,OAAO+D,GAAM,GACrCI,EAAOpE,KAAKC,IAAIvH,EAAYsH,KAAKG,OAAO6D,GAAM,GAC9CK,EAAOrE,KAAKG,IAAI,EAAGH,KAAKC,OAAOiE,GAAM,GAE3C,MAAO,CAAE3F,EAAG4F,EAAM3F,EAAG6F,EAAMvG,MAAOsG,EAAOD,EAAMhF,OADlCa,KAAKC,IAAItH,EAAaqH,KAAKG,OAAO+D,GAAM,GACSG,IAG5DC,EAAuB/G,IACzBgG,EAAWzF,MAAQP,EAAIO,MACvByF,EAAWpE,OAAS5B,EAAI4B,OACxBuE,EAAIa,UAAUtI,EAAOsB,EAAIgB,EAAGhB,EAAIiB,EAAGjB,EAAIO,MAAOP,EAAI4B,OAAQ,EAAG,EAAG5B,EAAIO,MAAOP,EAAI4B,QAExEvI,EADW8M,EAAIc,aAAa,EAAG,EAAGjH,EAAIO,MAAOP,EAAI4B,UAItDsF,EAAaX,EAAeF,GAC5Bc,EAAcZ,EAAeD,GAOnC,OAN0BS,EAAoBG,GACnBH,EAAoBI,IACqB,EAI1C,EAC9B,CACA,MAAOjJ,GAEH,OADAkJ,QAAQC,KAAK,mCAAoCnJ,IAC1C,CACX,CACJ,CCvHiCoJ,CAAe5I,EAAOa,GAE3BN,EAAgBxH,EAAiBa,aACjC6D,KAAKK,YAAc,SAElB,CAED,MAAMwJ,EAAaC,SAASC,cAAc,UACpCqB,EAAOpH,EAAYoB,KAAO7C,EAAMvD,WAChCqM,EAAOrH,EAAYuB,KAAOhD,EAAMtD,YAChCqM,EAAOtH,EAAYI,MAAQ7B,EAAMvD,WACjCuM,EAAOvH,EAAYyB,OAASlD,EAAMtD,YACxC4K,EAAWzF,MAAQkH,EACnBzB,EAAWpE,OAAS8F,EACpB,MAAMC,EAAU3B,EAAWI,WAAW,KAAM,CAAEwB,oBAAoB,IAClE,GAAID,EAAS,CACTA,EAAQX,UAAUtI,EAAO6I,EAAMC,EAAMC,EAAMC,EAAM,EAAG,EAAGD,EAAMC,GAE1CrO,EADGsO,EAAQV,aAAa,EAAG,EAAGjB,EAAWzF,MAAOyF,EAAWpE,SAE7DzF,KAAKD,QAAQZ,0BAC1B2D,EAAgBxH,EAAiBW,kBACjC+D,KAAKK,YAAc,MD+KhD,SAAsBqL,EAAaC,EAAcC,EAAoB,EAAGnJ,EAAa,IAAKC,EAAc,KAC3G,IAAKgJ,IAAgBC,EACjB,OAAO,EAEX,MAAME,GAAkBH,EAAY1H,YAAYoB,KAAOsG,EAAY1H,YAAYI,MAAQ,GAAK3B,EACtFqJ,GAAkBJ,EAAY1H,YAAYuB,KAAOmG,EAAY1H,YAAYyB,OAAS,GAAK/C,EACvFqJ,GAAmBJ,EAAa3H,YAAYoB,KAAOuG,EAAa3H,YAAYI,MAAQ,GAAK3B,EACzFuJ,GAAmBL,EAAa3H,YAAYuB,KAAOoG,EAAa3H,YAAYyB,OAAS,GAAK/C,EAC1FuJ,EAAS3F,KAAKc,IAAIyE,EAAiBE,GACnCG,EAAS5F,KAAKc,IAAI0E,EAAiBE,GACnCG,EAAa7F,KAAKc,IAAIsE,EAAY1H,YAAYI,MAAQuH,EAAa3H,YAAYI,OAAS3B,EACxF2J,EAAc9F,KAAKc,IAAIsE,EAAY1H,YAAYyB,OAASkG,EAAa3H,YAAYyB,QAAU/C,EACjG,OAAQuJ,GAAUL,GACdM,GAAUN,GACVO,GAAkC,EAApBP,GACdQ,GAAmC,EAApBR,CACvB,CC3LwCS,CAAatJ,EAAU/C,KAAKI,cAAeJ,KAAKD,QAAQR,2BAA4BkD,EAAYC,IAC3F1C,KAAKK,cACNL,KAAKK,YAAcuC,GAEnBE,EADAF,EAAM5C,KAAKK,aAAeL,KAAKD,QAAQT,2BACvBhE,EAAiBe,UAGjBf,EAAiBc,aAIrC4D,KAAKK,YAAc,KACnByC,EAAgBxH,EAAiBc,WAG7C,MAEI0G,EAAgBxH,EAAiBI,cACjCsE,KAAKK,YAAc,IAE3B,MAhDIyC,EAAgBxH,EAAiBY,uBACjC8D,KAAKK,YAAc,UAXnByC,EAAgBxH,EAAiBS,kBACjCiE,KAAKK,YAAc,UALnByC,EAAgBxH,EAAiBQ,gBACjCkE,KAAKK,YAAc,UALnByC,EAAgBxH,EAAiBO,WACjCmE,KAAKK,YAAc,IAkE3B,CACJ,MAGIL,KAAKI,cAAgB,KACrBJ,KAAKK,YAAc,KASvB,GAPAL,KAAKI,cAAgB2C,EACrB/C,KAAKQ,UAAUsC,GAEX9C,KAAKD,QAAQjB,sBDkL1B,SAAqBwN,EAAQ3M,EAAW5C,EAAQgG,EAAUC,GAC7D,MAAMgH,EAAMsC,EAAOrC,WAAW,MAC9B,IAAKD,EACD,OACJ,MAAMvH,EAAa6J,EAAOlI,MACpB1B,EAAc4J,EAAO7G,OACrB8G,EAAe9J,EAAa,EAC5B+J,EAAe9J,EAAc,EACnCsH,EAAIyC,UAAU,EAAG,EAAGhK,EAAYC,GAEhC,MAAMgK,EArXmB,GAqXTjK,EACVkK,EAAUjK,EAAc/E,EA4B9B,GA1BAqM,EAAI4C,UAAY,4BAChB5C,EAAI6C,SAAS,EAAG,EAAGpK,EAAYC,GAC/BsH,EAAI8C,OACJ9C,EAAI+C,YACJ/C,EAAIgD,QAAQT,EAAcC,EAAcE,EAASC,EAAS,EAAG,EAAG,EAAIrG,KAAKkB,IACzEwC,EAAIiD,YACJjD,EAAIkD,yBAA2B,kBAC/BlD,EAAImD,OACJnD,EAAIoD,UAEJpD,EAAIqD,YAAc,2BAClBrD,EAAIsD,UAAY,EAChBtD,EAAI+C,YACJ/C,EAAIgD,QAAQT,EAAcC,EAAcE,EAASC,EAAS,EAAG,EAAG,EAAIrG,KAAKkB,IACzEwC,EAAIuD,SAGJvD,EAAIqD,YAAc,4BAClBrD,EAAIsD,UAAY,EAChBtD,EAAI+C,YACJ/C,EAAIwD,OAAOjB,EAJa,EAImBC,GAC3CxC,EAAIyD,OAAOlB,EALa,EAKmBC,GAC3CxC,EAAIwD,OAAOjB,EAAcC,EAND,GAOxBxC,EAAIyD,OAAOlB,EAAcC,EAPD,GAQxBxC,EAAIuD,SAEA5N,GAAaoD,EAAU,CAEvB,MAAMK,EAAYL,EAASK,UAC3B,GAAIA,EAAU7F,QAAU,IAAK,CAEzB,MAAM2J,EAAW9D,EAAU,IACrB6D,EAAO7D,EAAU,KACjByE,EAAUzE,EAAU,KACpB0E,EAAW1E,EAAU,KAErBsK,EAAatK,EAAUC,IAAIkH,GAAKA,EAAE1F,GAClC8I,EAAavK,EAAUC,IAAIkH,GAAKA,EAAEzF,GAClC2F,EAAOnE,KAAKC,OAAOmH,GAC
nBhD,EAAOpE,KAAKG,OAAOiH,GACnB/C,EAAOrE,KAAKC,OAAOoH,GAGnBvJ,EAAQsG,EAAOD,EACfhF,EAHOa,KAAKG,OAAOkH,GAGHhD,EAChBiD,EAAS,IACT/I,GAAK4F,EAAOrG,EAAQwJ,GAAUnL,EAC9BqC,GAAK6F,EAAOlF,EAASmI,GAAUlL,EAC/BmL,EAAIzJ,GAAS,EAAI,EAAIwJ,GAAUnL,EAC/BqL,EAAIrI,GAAU,EAAI,EAAImI,GAAUlL,EAEtC,IAAIqL,EAAW,MACXhR,IAAWzB,EAAiBc,YAAcW,IAAWzB,EAAiBe,UACtE0R,EAAW,OAENhR,IAAWzB,EAAiBI,gBACjCqS,EAAW,UAEf/D,EAAIqD,YAAcU,EAClB/D,EAAIsD,UAAY,EAChBtD,EAAIgE,WAAWnJ,EAAGC,EAAG+I,EAAGC,GAExB,MAAMxJ,EAAOlB,EAxcE,GAycMA,EAAU5F,EAAmB,IAC5B4F,EAAU3F,EAAoB,IAEpDuM,EAAI4C,UAAY,OAChB5C,EAAI+C,YACJ/C,EAAIiE,IAAI3J,EAAKO,EAAIpC,EAAY6B,EAAKQ,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAClEwC,EAAImD,OAEJnD,EAAI4C,UAAY,UAChB5C,EAAI+C,YACJ/C,EAAIiE,IAAI/G,EAASrC,EAAIpC,EAAYyE,EAASpC,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAC1EwC,EAAImD,OAEJnD,EAAI4C,UAAY,OAChB5C,EAAI+C,YACJ/C,EAAIiE,IAAIhH,EAAKpC,EAAIpC,EAAYwE,EAAKnC,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAClEwC,EAAImD,OAGJnD,EAAI4C,UAAY,SACS,CACrBxJ,EAAU,IACVA,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,MAEG8K,QAAQnF,IACrBiB,EAAI+C,YACJ/C,EAAIiE,IAAIlF,EAASlE,EAAIpC,EAAYsG,EAASjE,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAC1EwC,EAAImD,SAGRnD,EAAI4C,UAAY,SACU,CACtBxJ,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,KACVA,EAAU,MAEI8K,QAAQnF,IACtBiB,EAAI+C,YACJ/C,EAAIiE,IAAIlF,EAASlE,EAAIpC,EAAYsG,EAASjE,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAC1EwC,EAAImD,SAGRnD,EAAI4C,UAAY,SAChB5C,EAAI+C,YACJ/C,EAAIiE,IAAIpG,EAAQhD,EAAIpC,EAAYoF,EAAQ/C,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IACxEwC,EAAImD,OACJnD,EAAI+C,YACJ/C,EAAIiE,IAAInG,EAASjD,EAAIpC,EAAYqF,EAAShD,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAC1EwC,EAAImD,MACR,CACJ,CAEIxN,GAAaqD,GAAYA,EAASzF,OAAS,GAC3CyF,EAASkL,QAASC,IACdnE,EAAI4C,UAAY,SAChBuB,EAAK/K,UAAU8K,QAASnF,IACpBiB,EAAI+C,YACJ/C,EAAIiE,IAAIlF,EAASlE,EAAIpC,EAAYsG,EAASjE,EAAIpC,EAAa,EAAG,EAAG,EAAI4D,KAAKkB,IAC1EwC,EAAImD,UAIpB,CChUoBiB,CAAYpO,KAAKD,QAAQjB,qBAAsBkB,KAAKD,QAAQJ,YAAa,EAAOmD,EAAeC,QAAYhE,EAAWiE,EAASzF,OAAS,EAAIyF,OAAWjE,GAGvJ+D,IAAkBxH,EAAiBe,YAAc2D,KAAKM,YAKtD,OAJAN,KAAKM,aAAc,QACbN,KAAKqO,eACXrO,KAAKQ,UAAUlF,EAAiBgB,cAChC0D,KAAKsO,MAGb,CACA,MAAOxM,GACH,MAAMC,EAAQD,aAAeE,MAAQF,EAAM,IAAIE,MAAMC,OAAOH,IAC5D9B,KAAKQ,UAAUlF,EAAiBiB,MAAOwF,EAC3C,CACA/B,KAAKG,iBAAmBoO,sBAAsB5L,EApJ9C,MAFI3C,KAAKG,iBAAmBoO,sBAAsB5L,EAuJtD,GACA3C,KAAKG,iBAAmBoO,sBAAsB5L,EAClD,CAIA,mBAAAoB,CAAoBX,GAChB,MAAMkH,EAAKlH,EAAUC,IAAKkH,GAAMA,EAAE1F,GAC5B2F,EAAKpH,EAAUC,IAAKkH,GAAMA,EAAEzF,GAC5BM,EAAOkB,KAAKC,OAAO+D,GACnBkE,EAAOlI,KAAKG,OAAO6D,GACnB/E,EAAOe,KAAKC,OAAOiE,GAEzB,MAAO,CACHpF,OACAG,OACAnB,MAAOoK,EAAOpJ,EACdK,OALSa,KAAKG,OAAO+D,GAKNjF,EAEvB,CACA,YAAA8I,GACI,OAAOzQ,EAAUoC,UAAW,OAAQ,EAAG,YACnC,MAAMuC,EAAQvC,KAAKD,QAAQyC,aACrB8J,EAASxC,SAASC,cAAc,UACtCuC,EAAOlI,MAAQ7B,EAAMvD,WACrBsN,EAAO7G,OAASlD,EAAMtD,YACtB,MAAM+K,EAAMsC,EAAOrC,WAAW,MACzBD,GAILA,EAAIa,UAAUtI,EAAO,EAAG,EAAG+J,EAAOlI,MAAOkI,EAAO7G,QAChD6G,EAAOmC,OAAQC,IACPA,EACA1O,KAAKD,QAAQ4O,iBAAiBD,GAG9B1O,KAAKQ,UAAUlF,EAAiBiB,MAAO,IAAIyF,MAAM,mCAEtD,aAAc,MAXbhC,KAAKQ,UAAUlF,EAAiBiB,MAAO,IAAIyF,MAAM,gCAYzD,EACJ,CACA,IAAAsM,GACkC,OAA1BtO,KAAKG,mBACLyO,qBAAqB5O,KAAKG,kBAC1BH,KAAKG,iBAAmB,MAExBH,KAAKC,gBACLD,KAAKC,eAAe4O,QAEpB7O,KAAKE,gBACLF,KAAKE,eAAe2O,OAE5B,ECrTJ,U","sources":["webpack://face-validator-sdk/webpack/bootstrap","webpack://face-validator-sdk/webpack/runtime/define property getters","webpack://face-validator-sdk/webpack/runtime/hasOwnProperty shorthand","webpack://face-validator-sdk/webpack/runtime/make namespace object","webpack://face-validator-sdk/external commonjs2 
\"@mediapipe/tasks-vision\"","webpack://face-validator-sdk/./src/types.ts","webpack://face-validator-sdk/./src/i18n.ts","webpack://face-validator-sdk/./src/utils.ts","webpack://face-validator-sdk/./src/FaceValidator.ts","webpack://face-validator-sdk/./src/index.ts"],"sourcesContent":["// The require scope\nvar __webpack_require__ = {};\n\n","// define getter functions for harmony exports\n__webpack_require__.d = (exports, definition) => {\n\tfor(var key in definition) {\n\t\tif(__webpack_require__.o(definition, key) && !__webpack_require__.o(exports, key)) {\n\t\t\tObject.defineProperty(exports, key, { enumerable: true, get: definition[key] });\n\t\t}\n\t}\n};","__webpack_require__.o = (obj, prop) => (Object.prototype.hasOwnProperty.call(obj, prop))","// define __esModule on exports\n__webpack_require__.r = (exports) => {\n\tif(typeof Symbol !== 'undefined' && Symbol.toStringTag) {\n\t\tObject.defineProperty(exports, Symbol.toStringTag, { value: 'Module' });\n\t}\n\tObject.defineProperty(exports, '__esModule', { value: true });\n};","const __WEBPACK_NAMESPACE_OBJECT__ = require(\"@mediapipe/tasks-vision\");","export var ValidationStatus;\n(function (ValidationStatus) {\n ValidationStatus[\"INITIALIZING\"] = \"INITIALIZING\";\n ValidationStatus[\"NO_FACE_DETECTED\"] = \"NO_FACE_DETECTED\";\n ValidationStatus[\"FACE_DETECTED\"] = \"FACE_DETECTED\";\n ValidationStatus[\"TOO_CLOSE\"] = \"TOO_CLOSE\";\n ValidationStatus[\"TOO_FAR\"] = \"TOO_FAR\";\n ValidationStatus[\"OFF_CENTER\"] = \"OFF_CENTER\";\n ValidationStatus[\"FACE_OBSTRUCTED\"] = \"FACE_OBSTRUCTED\";\n ValidationStatus[\"HEAD_NOT_STRAIGHT\"] = \"HEAD_NOT_STRAIGHT\";\n ValidationStatus[\"MULTIPLE_FACES\"] = \"MULTIPLE_FACES\";\n ValidationStatus[\"POOR_ILLUMINATION\"] = \"POOR_ILLUMINATION\";\n ValidationStatus[\"NOT_NEUTRAL_EXPRESSION\"] = \"NOT_NEUTRAL_EXPRESSION\";\n ValidationStatus[\"DARK_GLASSES\"] = \"DARK_GLASSES\";\n ValidationStatus[\"STAY_STILL\"] = \"STAY_STILL\";\n ValidationStatus[\"CAPTURING\"] = \"CAPTURING\";\n ValidationStatus[\"SUCCESS\"] = \"SUCCESS\";\n ValidationStatus[\"ERROR\"] = \"ERROR\";\n})(ValidationStatus || (ValidationStatus = {}));\n","import { ValidationStatus } from './types';\nconst messages = {\n 'pt-BR': {\n [ValidationStatus.INITIALIZING]: 'Inicializando câmera e detector...',\n [ValidationStatus.NO_FACE_DETECTED]: 'Posicione seu rosto no centro do oval.',\n [ValidationStatus.FACE_DETECTED]: 'Analisando...',\n [ValidationStatus.TOO_CLOSE]: 'Afaste-se um pouco',\n [ValidationStatus.TOO_FAR]: 'Aproxime-se da câmera',\n [ValidationStatus.OFF_CENTER]: 'Centralize o rosto no centro do oval',\n [ValidationStatus.FACE_OBSTRUCTED]: 'Mantenha o rosto totalmente visível. Remova as mãos do rosto.',\n [ValidationStatus.HEAD_NOT_STRAIGHT]: 'Olhe diretamente para a câmera e mantenha a cabeça reta.',\n [ValidationStatus.MULTIPLE_FACES]: 'Mantenha apenas uma pessoa no quadro.',\n [ValidationStatus.POOR_ILLUMINATION]: 'Procure um ambiente com boa iluminação e centralize seu rosto no centro do oval.',\n [ValidationStatus.NOT_NEUTRAL_EXPRESSION]: 'Mantenha expressão neutra: boca fechada, sem sorrir e olhos abertos.',\n [ValidationStatus.DARK_GLASSES]: 'Remova os óculos escuros. 
Óculos de grau são permitidos.',\n [ValidationStatus.STAY_STILL]: 'Fique imóvel para capturar a foto',\n [ValidationStatus.CAPTURING]: 'Capturando...',\n [ValidationStatus.SUCCESS]: 'Captura realizada!',\n [ValidationStatus.ERROR]: 'Ocorreu um erro.',\n },\n en: {\n [ValidationStatus.INITIALIZING]: 'Initializing camera and detector...',\n [ValidationStatus.NO_FACE_DETECTED]: 'Position your face in the center of the oval.',\n [ValidationStatus.FACE_DETECTED]: 'Analyzing...',\n [ValidationStatus.TOO_CLOSE]: 'Move back a little',\n [ValidationStatus.TOO_FAR]: 'Move closer to the camera',\n [ValidationStatus.OFF_CENTER]: 'Center your face in the center of the oval',\n [ValidationStatus.FACE_OBSTRUCTED]: 'Keep your face fully visible. Remove your hands from your face.',\n [ValidationStatus.HEAD_NOT_STRAIGHT]: 'Look directly at the camera and keep your head straight.',\n [ValidationStatus.MULTIPLE_FACES]: 'Keep only one person in the frame.',\n [ValidationStatus.POOR_ILLUMINATION]: 'Find a well-lit environment and center your face in the oval.',\n [ValidationStatus.NOT_NEUTRAL_EXPRESSION]: 'Keep a neutral expression: mouth closed, no smiling, and eyes open.',\n [ValidationStatus.DARK_GLASSES]: 'Remove sunglasses. Prescription glasses are allowed.',\n [ValidationStatus.STAY_STILL]: 'Stay still to capture the photo',\n [ValidationStatus.CAPTURING]: 'Capturing...',\n [ValidationStatus.SUCCESS]: 'Capture complete!',\n [ValidationStatus.ERROR]: 'An error occurred.',\n },\n es: {\n [ValidationStatus.INITIALIZING]: 'Inicializando cámara y detector...',\n [ValidationStatus.NO_FACE_DETECTED]: 'Coloque su rostro en el centro del óvalo.',\n [ValidationStatus.FACE_DETECTED]: 'Analizando...',\n [ValidationStatus.TOO_CLOSE]: 'Aléjese un poco',\n [ValidationStatus.TOO_FAR]: 'Acérquese a la cámara',\n [ValidationStatus.OFF_CENTER]: 'Centre el rostro en el centro del óvalo',\n [ValidationStatus.FACE_OBSTRUCTED]: 'Mantenga el rostro totalmente visible. Quite las manos del rostro.',\n [ValidationStatus.HEAD_NOT_STRAIGHT]: 'Mire directamente a la cámara y mantenga la cabeza recta.',\n [ValidationStatus.MULTIPLE_FACES]: 'Mantenga solo una persona en el encuadre.',\n [ValidationStatus.POOR_ILLUMINATION]: 'Busque un ambiente con buena iluminación y centre su rostro en el óvalo.',\n [ValidationStatus.NOT_NEUTRAL_EXPRESSION]: 'Mantenga expresión neutra: boca cerrada, sin sonreír y ojos abiertos.',\n [ValidationStatus.DARK_GLASSES]: 'Quite las gafas de sol. Las gafas graduadas están permitidas.',\n [ValidationStatus.STAY_STILL]: 'Permanezca quieto para capturar la foto',\n [ValidationStatus.CAPTURING]: 'Capturando...',\n [ValidationStatus.SUCCESS]: '¡Captura realizada!',\n [ValidationStatus.ERROR]: 'Ocurrió un error.',\n },\n};\nconst unknownStatusByLocale = {\n 'pt-BR': 'Status desconhecido.',\n en: 'Unknown status.',\n es: 'Estado desconhecido.',\n};\n/**\n * Returns the validation messages for the given locale.\n */\nexport function getValidationMessages(locale) {\n return Object.assign({}, messages[locale]);\n}\n/**\n * Returns the message for a given status and locale.\n */\nexport function getMessage(status, locale) {\n var _a;\n return (_a = messages[locale][status]) !== null && _a !== void 0 ? 
_a : unknownStatusByLocale[locale];\n}\n/**\n * Returns the \"Loading models...\" message for a given locale (used during model load).\n */\nexport function getLoadingModelsMessage(locale) {\n const loading = {\n 'pt-BR': 'Carregando...',\n en: 'Loading...',\n es: 'Cargando...',\n };\n return loading[locale];\n}\n","import { ValidationStatus } from './types';\n/**\n * Calcula o brilho médio de uma região da imagem (0-255).\n */\nexport function calculateAverageBrightness(imageData) {\n const data = imageData.data;\n let sum = 0;\n for (let i = 0; i < data.length; i += 4) {\n const r = data[i];\n const g = data[i + 1];\n const b = data[i + 2];\n // Luminance formula: ITU-R BT.709\n const luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b;\n sum += luminance;\n }\n return sum / (data.length / 4);\n}\n/**\n * Verifica se a face está na distância adequada (baseado no tamanho do bounding box).\n */\nexport function checkFaceDistance(boundingBox, minFaceSizeFactor = 0.18, maxFaceSizeFactor = 0.70) {\n // boundingBox.width é normalizado (0-1)\n const faceWidthRatio = boundingBox.width;\n if (faceWidthRatio < minFaceSizeFactor)\n return 'TOO_FAR';\n if (faceWidthRatio > maxFaceSizeFactor)\n return 'TOO_CLOSE';\n return 'OK';\n}\n/**\n * MediaPipe Face Mesh landmark indices (478 pontos):\n * - Nose tip: 4\n * - Left eye: 33, 133, 159, 145\n * - Right eye: 263, 362, 386, 374\n * - Mouth: 61, 291, 0, 17 (outer lips)\n * - Face oval: contorno do rosto\n * - Ears: left ear (234), right ear (454)\n * - Eye iris: left (468-471), right (473-476)\n * - Eyelids: upper left (159, 145), lower left (144, 153), upper right (386, 374), lower right (373, 380)\n * - Mouth corners: left (61), right (291)\n * - Lips: upper (13), lower (14)\n */\nconst MEDIAPIPE_NOSE_TIP = 4;\nconst MEDIAPIPE_LEFT_EYE = [33, 133, 159, 145];\nconst MEDIAPIPE_RIGHT_EYE = [263, 362, 386, 374];\nconst MEDIAPIPE_MOUTH_OUTER = [61, 291, 0, 17, 39, 269, 270, 409];\nconst MEDIAPIPE_LEFT_EAR = 234;\nconst MEDIAPIPE_RIGHT_EAR = 454;\nconst MEDIAPIPE_LEFT_EYE_TOP = 159;\nconst MEDIAPIPE_LEFT_EYE_BOTTOM = 144;\nconst MEDIAPIPE_RIGHT_EYE_TOP = 386;\nconst MEDIAPIPE_RIGHT_EYE_BOTTOM = 373;\nconst MEDIAPIPE_MOUTH_LEFT_CORNER = 61;\nconst MEDIAPIPE_MOUTH_RIGHT_CORNER = 291;\nconst MEDIAPIPE_MOUTH_TOP = 13;\nconst MEDIAPIPE_MOUTH_BOTTOM = 14;\n/**\n * Proporções do oval da moldura para centralização.\n * Ajustado para o tamanho do container (512x384).\n */\nconst OVAL_RADIUS_X_FACTOR = 0.20; // Raio horizontal do oval\nconst OVAL_RADIUS_Y_FACTOR = 0.38; // Raio vertical do oval\n/**\n * Verifica se um ponto (normalizado 0-1) está dentro do oval de enquadramento.\n */\nexport function isPointInsideOval(pointX, pointY, frameWidth, frameHeight) {\n // Converter ponto normalizado para pixels\n const px = pointX * frameWidth;\n const py = pointY * frameHeight;\n const cx = frameWidth / 2;\n const cy = frameHeight / 2;\n const rx = frameWidth * OVAL_RADIUS_X_FACTOR;\n const ry = frameHeight * OVAL_RADIUS_Y_FACTOR;\n const dx = (px - cx) / rx;\n const dy = (py - cy) / ry;\n return dx * dx + dy * dy <= 1;\n}\n/**\n * Verifica se o bounding box da face cabe dentro do oval de enquadramento.\n * Versão simplificada: verifica apenas o centro e limites principais sem margem excessiva.\n */\nexport function isFaceBoundingBoxInsideOval(boundingBox, frameWidth, frameHeight) {\n const cx = frameWidth / 2;\n const cy = frameHeight / 2;\n const rx = frameWidth * OVAL_RADIUS_X_FACTOR;\n const ry = frameHeight * OVAL_RADIUS_Y_FACTOR;\n // Converter bbox normalizado para 
pixels\n const faceLeft = boundingBox.xMin * frameWidth;\n const faceRight = (boundingBox.xMin + boundingBox.width) * frameWidth;\n const faceTop = boundingBox.yMin * frameHeight;\n const faceBottom = (boundingBox.yMin + boundingBox.height) * frameHeight;\n const faceCenterX = (faceLeft + faceRight) / 2;\n const faceCenterY = (faceTop + faceBottom) / 2;\n // 1. Centro da face deve estar dentro do oval (relaxado)\n const centerDx = (faceCenterX - cx) / rx;\n const centerDy = (faceCenterY - cy) / ry;\n if (centerDx * centerDx + centerDy * centerDy > 1.0) {\n // Centro pode estar até no limite do oval (era 0.8)\n return false;\n }\n // 2. Verificar apenas os cantos SEM margem adicional\n const corners = [\n { x: faceLeft, y: faceTop }, // Top-left\n { x: faceRight, y: faceTop }, // Top-right\n { x: faceLeft, y: faceBottom }, // Bottom-left\n { x: faceRight, y: faceBottom }, // Bottom-right\n ];\n // Permitir que até 2 cantos fiquem fora (MUITO mais flexível)\n let cornersOutside = 0;\n for (const corner of corners) {\n const dx = (corner.x - cx) / rx;\n const dy = (corner.y - cy) / ry;\n if (dx * dx + dy * dy > 1.2) {\n // Usar 1.2 ao invés de 1.1 para dar 20% de tolerância\n cornersOutside++;\n }\n }\n // Permitir até 2 cantos fora (era 1)\n return cornersOutside <= 2;\n}\n/**\n * Verifica se a cabeça está reta (sem inclinação lateral, horizontal ou vertical).\n * MediaPipe: usa landmarks dos olhos, nariz e boca.\n */\nexport function isHeadStraight(landmarks, maxTiltDegrees = 25) {\n if (landmarks.length < 478)\n return false;\n // Pontos dos olhos\n const leftEye = landmarks[MEDIAPIPE_LEFT_EYE[0]]; // 33\n const rightEye = landmarks[MEDIAPIPE_RIGHT_EYE[0]]; // 263\n const nose = landmarks[MEDIAPIPE_NOSE_TIP]; // 4\n // Pontos da boca para pitch\n const upperLip = landmarks[13]; // Lábio superior central\n const lowerLip = landmarks[14]; // Lábio inferior central\n const chin = landmarks[152]; // Queixo\n const forehead = landmarks[10]; // Testa\n // Roll: inclinação lateral (olhos desalinhados verticalmente)\n const eyeDeltaY = Math.abs(leftEye.y - rightEye.y);\n const eyeDeltaX = Math.abs(leftEye.x - rightEye.x);\n if (eyeDeltaX < 0.01)\n return false; // Proteção divisão por zero\n const rollRatio = eyeDeltaY / eyeDeltaX;\n const rollAngleDeg = Math.atan(rollRatio) * (180 / Math.PI);\n if (rollAngleDeg > maxTiltDegrees)\n return false;\n // Yaw: desvio horizontal (nariz deslocado do centro dos olhos)\n // NOTA: validação adicional usando orelhas é feita em isYawAcceptable()\n const midEyesX = (leftEye.x + rightEye.x) / 2;\n const noseOffsetX = nose.x - midEyesX;\n const eyeDist = Math.abs(leftEye.x - rightEye.x);\n if (eyeDist < 0.01)\n return false;\n const yawRatio = Math.abs(noseOffsetX) / eyeDist;\n const yawAngleDeg = Math.atan(yawRatio) * (180 / Math.PI);\n if (yawAngleDeg > maxTiltDegrees)\n return false;\n // Validação adicional de yaw usando orelhas (mais precisa para rostos na diagonal)\n if (!isYawAcceptable(landmarks))\n return false;\n // Pitch: inclinação vertical (cabeça para cima/baixo)\n const midEyesY = (leftEye.y + rightEye.y) / 2;\n const mouthY = (upperLip.y + lowerLip.y) / 2;\n // Verificações SIMPLIFICADAS apenas para inclinações EXTREMAS\n // Permitir variações naturais de postura\n // 1. Verificar altura total da face é plausível\n const faceHeight = chin.y - forehead.y;\n if (faceHeight < 0.10) {\n // Face extremamente \"achatada\" verticalmente = inclinação MUITO severa\n return false;\n }\n // 2. 
Verificar apenas ordem básica dos elementos principais\n // Apenas rejeitar casos EXTREMOS onde a ordem está completamente invertida\n // Testa deve estar ACIMA dos olhos (com margem de tolerância)\n if (forehead.y > midEyesY + 0.02) {\n return false; // Testa abaixo dos olhos = MUITO inclinado para trás\n }\n // Olhos devem estar ACIMA do nariz (com margem)\n if (midEyesY > nose.y + 0.02) {\n return false; // Olhos abaixo do nariz = inclinação extrema\n }\n // Nariz deve estar ACIMA da boca (com margem)\n if (nose.y > mouthY + 0.02) {\n return false; // Nariz abaixo da boca = inclinação extrema\n }\n // Boca deve estar ACIMA do queixo (sempre deve ser verdade)\n if (mouthY >= chin.y) {\n return false; // Geometria impossível\n }\n // 3. Verificar proporções - detectar inclinações extremas\n const foreheadToEyes = midEyesY - forehead.y;\n const eyesToNose = nose.y - midEyesY;\n const noseToMouth = mouthY - nose.y;\n const mouthToChin = chin.y - mouthY;\n const foreheadEyesRatio = foreheadToEyes / faceHeight;\n const eyesNoseRatio = eyesToNose / faceHeight;\n const noseMouthRatio = noseToMouth / faceHeight;\n const mouthChinRatio = mouthToChin / faceHeight;\n // Testa-olhos:\n // - Se MUITO GRANDE (>38%) = cabeça inclinada para FRENTE (testa dominante)\n // - Se MUITO PEQUENO (<6%) = cabeça inclinada para TRÁS (testa oculta)\n if (foreheadEyesRatio < 0.06 || foreheadEyesRatio > 0.38) {\n return false;\n }\n // Olhos-nariz: aceitar de 3% a 30%\n if (eyesNoseRatio < 0.03 || eyesNoseRatio > 0.30) {\n return false;\n }\n // Nariz-boca: aceitar de 2% a 25%\n if (noseMouthRatio < 0.02 || noseMouthRatio > 0.25) {\n return false;\n }\n // Boca-queixo: MUITO flexível (barba pode interferir)\n // Apenas rejeitar casos extremos\n if (mouthChinRatio < 0.04 || mouthChinRatio > 0.38) {\n return false;\n }\n return true;\n}\n/**\n * Verifica se a geometria do rosto é plausível (boca visível, não obstruída por mão).\n * MediaPipe: analisa distância nariz-boca e extensão vertical da boca.\n */\nexport function isFaceGeometryPlausible(landmarks, boundingBox) {\n if (landmarks.length < 478)\n return false;\n const nose = landmarks[MEDIAPIPE_NOSE_TIP];\n // Pontos da boca (contorno externo)\n const mouthPoints = MEDIAPIPE_MOUTH_OUTER.map(idx => landmarks[idx]);\n const mouthCenterY = mouthPoints.reduce((s, p) => s + p.y, 0) / mouthPoints.length;\n const mouthMinY = Math.min(...mouthPoints.map(p => p.y));\n const mouthMaxY = Math.max(...mouthPoints.map(p => p.y));\n const mouthVerticalSpread = mouthMaxY - mouthMinY;\n const boxHeight = boundingBox.height;\n // Boca deve estar abaixo do nariz (com margem de tolerância)\n if (mouthCenterY < nose.y - 0.01)\n return false;\n // Distância nariz–centro da boca: mínimo 6% da altura (reduzido de 10% para aceitar barbas e óculos)\n const noseToMouthDist = mouthCenterY - nose.y;\n if (noseToMouthDist < 0.06 * boxHeight)\n return false;\n // Extensão vertical da boca: mínimo 2% da altura (reduzido de 3%)\n if (mouthVerticalSpread < 0.02 * boxHeight)\n return false;\n return true;\n}\n/**\n * Detecta se a pessoa está usando óculos escuros através da análise de luminosidade dos olhos.\n * Óculos de grau geralmente não bloqueiam completamente a luz, permitindo ver os olhos.\n */\nexport function hasDarkGlasses(video, landmarks) {\n if (landmarks.length < 478)\n return false;\n try {\n // Criar canvas temporário para capturar regiões dos olhos\n const tempCanvas = document.createElement('canvas');\n const ctx = tempCanvas.getContext('2d');\n if (!ctx)\n return false;\n const 
videoWidth = video.videoWidth;\n const videoHeight = video.videoHeight;\n // Definir landmarks dos olhos (área maior que inclui região ao redor)\n const leftEyeLandmarks = [\n landmarks[33], // Canto externo\n landmarks[133], // Canto interno\n landmarks[159], // Superior\n landmarks[144], // Inferior\n landmarks[145], // Centro\n ];\n const rightEyeLandmarks = [\n landmarks[263], // Canto externo\n landmarks[362], // Canto interno\n landmarks[386], // Superior\n landmarks[373], // Inferior\n landmarks[374], // Centro\n ];\n // Função para calcular bounding box de uma região\n const getBoundingBox = (eyeLandmarks) => {\n const xs = eyeLandmarks.map(l => l.x * videoWidth);\n const ys = eyeLandmarks.map(l => l.y * videoHeight);\n const minX = Math.max(0, Math.min(...xs) - 5);\n const maxX = Math.min(videoWidth, Math.max(...xs) + 5);\n const minY = Math.max(0, Math.min(...ys) - 5);\n const maxY = Math.min(videoHeight, Math.max(...ys) + 5);\n return { x: minX, y: minY, width: maxX - minX, height: maxY - minY };\n };\n // Função para calcular luminosidade média de uma região\n const getRegionBrightness = (box) => {\n tempCanvas.width = box.width;\n tempCanvas.height = box.height;\n ctx.drawImage(video, box.x, box.y, box.width, box.height, 0, 0, box.width, box.height);\n const imageData = ctx.getImageData(0, 0, box.width, box.height);\n return calculateAverageBrightness(imageData);\n };\n // Analisar ambos os olhos\n const leftEyeBox = getBoundingBox(leftEyeLandmarks);\n const rightEyeBox = getBoundingBox(rightEyeLandmarks);\n const leftEyeBrightness = getRegionBrightness(leftEyeBox);\n const rightEyeBrightness = getRegionBrightness(rightEyeBox);\n const avgEyeBrightness = (leftEyeBrightness + rightEyeBrightness) / 2;\n // Threshold: se a região dos olhos está muito escura (< 40 em escala 0-255)\n // isso indica óculos escuros. Óculos de grau não bloqueiam tanto a luz.\n // Ajustado para 35 para ser mais sensível a óculos escuros\n return avgEyeBrightness < 35;\n }\n catch (error) {\n console.warn('Erro ao detectar óculos escuros:', error);\n return false; // Em caso de erro, não bloqueia a captura\n }\n}\n/**\n * Verifica se a expressão facial é neutra (sem sorriso, boca fechada, olhos abertos).\n * Rejeita: sorriso, boca aberta, olhos fechados.\n */\nexport function isNeutralExpression(landmarks) {\n if (landmarks.length < 478)\n return false;\n // 1. Verificar se olhos estão abertos\n const leftEyeTop = landmarks[MEDIAPIPE_LEFT_EYE_TOP];\n const leftEyeBottom = landmarks[MEDIAPIPE_LEFT_EYE_BOTTOM];\n const rightEyeTop = landmarks[MEDIAPIPE_RIGHT_EYE_TOP];\n const rightEyeBottom = landmarks[MEDIAPIPE_RIGHT_EYE_BOTTOM];\n const leftEyeOpenness = Math.abs(leftEyeTop.y - leftEyeBottom.y);\n const rightEyeOpenness = Math.abs(rightEyeTop.y - rightEyeBottom.y);\n // Olhos devem estar abertos (mínimo 1% de abertura em coordenadas normalizadas)\n if (leftEyeOpenness < 0.01 || rightEyeOpenness < 0.01) {\n return false; // Olho(s) fechado(s)\n }\n // 2. Verificar se boca está fechada\n const mouthTop = landmarks[MEDIAPIPE_MOUTH_TOP];\n const mouthBottom = landmarks[MEDIAPIPE_MOUTH_BOTTOM];\n const mouthOpenness = Math.abs(mouthTop.y - mouthBottom.y);\n // Boca deve estar relativamente fechada (máximo 2.5% de abertura)\n if (mouthOpenness > 0.025) {\n return false; // Boca aberta\n }\n // 3. 
Verificar se há sorriso (cantos da boca elevados)\n const mouthLeftCorner = landmarks[MEDIAPIPE_MOUTH_LEFT_CORNER];\n const mouthRightCorner = landmarks[MEDIAPIPE_MOUTH_RIGHT_CORNER];\n const nose = landmarks[MEDIAPIPE_NOSE_TIP];\n // Calcular posição vertical média dos cantos da boca relativo ao nariz\n const mouthCornersAvgY = (mouthLeftCorner.y + mouthRightCorner.y) / 2;\n const noseMouthDistance = mouthCornersAvgY - nose.y;\n // Se os cantos da boca estão muito elevados (próximos ao nariz), é sorriso\n // Em expressão neutra, os cantos devem estar significativamente abaixo do nariz\n if (noseMouthDistance < 0.05) {\n return false; // Sorriso (cantos da boca elevados)\n }\n return true;\n}\n/**\n * Verifica yaw (inclinação lateral) usando visibilidade das orelhas.\n * Quando o rosto está virado para o lado, uma orelha fica mais visível que a outra.\n */\nexport function isYawAcceptable(landmarks) {\n if (landmarks.length < 478)\n return false;\n const leftEar = landmarks[MEDIAPIPE_LEFT_EAR];\n const rightEar = landmarks[MEDIAPIPE_RIGHT_EAR];\n const nose = landmarks[MEDIAPIPE_NOSE_TIP];\n // Calcular distância de cada orelha ao nariz (em coordenadas normalizadas)\n const leftEarToNoseX = Math.abs(leftEar.x - nose.x);\n const rightEarToNoseX = Math.abs(rightEar.x - nose.x);\n // Calcular ratio de assimetria\n const asymmetryRatio = leftEarToNoseX > 0.01 && rightEarToNoseX > 0.01\n ? Math.max(leftEarToNoseX, rightEarToNoseX) / Math.min(leftEarToNoseX, rightEarToNoseX)\n : 1.0;\n // Se uma orelha está muito mais longe do nariz que a outra, o rosto está na diagonal\n // Permitir até 40% de assimetria (1.4 ratio)\n if (asymmetryRatio > 1.4) {\n return false; // Rosto muito virado para o lado\n }\n // Verificar visibilidade Z (profundidade) se disponível\n // Em MediaPipe, coordenada Z indica profundidade relativa\n if (leftEar.z !== undefined && rightEar.z !== undefined) {\n const zDifference = Math.abs(leftEar.z - rightEar.z);\n // Se a diferença de profundidade é muito grande, o rosto está na diagonal\n if (zDifference > 0.05) {\n return false; // Rosto virado lateralmente (detectado por profundidade)\n }\n }\n return true;\n}\n/**\n * Verifica se a face está estável (sem movimento significativo entre frames).\n */\nexport function isFaceStable(currentFace, previousFace, movementThreshold = 5, frameWidth = 512, frameHeight = 384) {\n if (!currentFace || !previousFace)\n return false;\n // Converter coordenadas normalizadas para pixels\n const currentCenterX = (currentFace.boundingBox.xMin + currentFace.boundingBox.width / 2) * frameWidth;\n const currentCenterY = (currentFace.boundingBox.yMin + currentFace.boundingBox.height / 2) * frameHeight;\n const previousCenterX = (previousFace.boundingBox.xMin + previousFace.boundingBox.width / 2) * frameWidth;\n const previousCenterY = (previousFace.boundingBox.yMin + previousFace.boundingBox.height / 2) * frameHeight;\n const deltaX = Math.abs(currentCenterX - previousCenterX);\n const deltaY = Math.abs(currentCenterY - previousCenterY);\n const deltaWidth = Math.abs(currentFace.boundingBox.width - previousFace.boundingBox.width) * frameWidth;\n const deltaHeight = Math.abs(currentFace.boundingBox.height - previousFace.boundingBox.height) * frameHeight;\n return (deltaX <= movementThreshold &&\n deltaY <= movementThreshold &&\n deltaWidth <= movementThreshold * 2 &&\n deltaHeight <= movementThreshold * 2);\n}\n/**\n * Calcula distância entre um ponto da mão e o centro do rosto (normalizado).\n * Retorna true se a mão estiver próxima ao rosto 
(indicando possível obstrução).\n */\nexport function isHandNearFace(handData, faceBoundingBox, maxDistance = 0.15) {\n const faceCenterX = faceBoundingBox.xMin + faceBoundingBox.width / 2;\n const faceCenterY = faceBoundingBox.yMin + faceBoundingBox.height / 2;\n // Verificar se algum ponto da mão está próximo ao centro do rosto\n for (const landmark of handData.landmarks) {\n const dx = landmark.x - faceCenterX;\n const dy = landmark.y - faceCenterY;\n const distance = Math.sqrt(dx * dx + dy * dy);\n if (distance < maxDistance) {\n return true;\n }\n }\n return false;\n}\n/**\n * Desenha overlay de feedback visual (oval, status, debug info).\n */\nexport function drawOverlay(canvas, debugMode, status, faceData, handData) {\n const ctx = canvas.getContext('2d');\n if (!ctx)\n return;\n const frameWidth = canvas.width;\n const frameHeight = canvas.height;\n const frameCenterX = frameWidth / 2;\n const frameCenterY = frameHeight / 2;\n ctx.clearRect(0, 0, frameWidth, frameHeight);\n // Oval: guia de enquadramento\n const radiusX = frameWidth * OVAL_RADIUS_X_FACTOR;\n const radiusY = frameHeight * OVAL_RADIUS_Y_FACTOR;\n // 1) Área fora do oval esmaecida (mais transparente)\n ctx.fillStyle = 'rgba(255, 255, 255, 0.35)';\n ctx.fillRect(0, 0, frameWidth, frameHeight);\n ctx.save();\n ctx.beginPath();\n ctx.ellipse(frameCenterX, frameCenterY, radiusX, radiusY, 0, 0, 2 * Math.PI);\n ctx.closePath();\n ctx.globalCompositeOperation = 'destination-out';\n ctx.fill();\n ctx.restore();\n // 2) Borda do oval\n ctx.strokeStyle = 'rgba(255, 255, 255, 0.9)';\n ctx.lineWidth = 3;\n ctx.beginPath();\n ctx.ellipse(frameCenterX, frameCenterY, radiusX, radiusY, 0, 0, 2 * Math.PI);\n ctx.stroke();\n // 3) Mira central\n const crosshairLength = 6;\n ctx.strokeStyle = 'rgba(255, 255, 255, 0.45)';\n ctx.lineWidth = 1;\n ctx.beginPath();\n ctx.moveTo(frameCenterX - crosshairLength, frameCenterY);\n ctx.lineTo(frameCenterX + crosshairLength, frameCenterY);\n ctx.moveTo(frameCenterX, frameCenterY - crosshairLength);\n ctx.lineTo(frameCenterX, frameCenterY + crosshairLength);\n ctx.stroke();\n // Debug: desenhar face bounding box adaptável e landmarks\n if (debugMode && faceData) {\n // Calcular bounding box adaptável baseado nos landmarks reais\n const landmarks = faceData.landmarks;\n if (landmarks.length >= 478) {\n // Pontos importantes para definir limites da face\n const forehead = landmarks[10];\n const chin = landmarks[152];\n const leftEar = landmarks[234];\n const rightEar = landmarks[454];\n // Calcular limites da face com margem\n const allXCoords = landmarks.map(l => l.x);\n const allYCoords = landmarks.map(l => l.y);\n const minX = Math.min(...allXCoords);\n const maxX = Math.max(...allXCoords);\n const minY = Math.min(...allYCoords);\n const maxY = Math.max(...allYCoords);\n // Adicionar margem de 8% para incluir toda a cabeça\n const width = maxX - minX;\n const height = maxY - minY;\n const margin = 0.08;\n const x = (minX - width * margin) * frameWidth;\n const y = (minY - height * margin) * frameHeight;\n const w = width * (1 + 2 * margin) * frameWidth;\n const h = height * (1 + 2 * margin) * frameHeight;\n // Bounding box colorido por status (adaptável)\n let boxColor = 'red';\n if (status === ValidationStatus.STAY_STILL || status === ValidationStatus.CAPTURING) {\n boxColor = 'lime';\n }\n else if (status === ValidationStatus.FACE_DETECTED) {\n boxColor = 'yellow';\n }\n ctx.strokeStyle = boxColor;\n ctx.lineWidth = 3;\n ctx.strokeRect(x, y, w, h);\n // Desenhar pontos de referência chave\n const 
nose = landmarks[MEDIAPIPE_NOSE_TIP];\n const leftEyePoint = landmarks[MEDIAPIPE_LEFT_EYE[0]];\n const rightEyePoint = landmarks[MEDIAPIPE_RIGHT_EYE[0]];\n // Nariz (cyan)\n ctx.fillStyle = 'cyan';\n ctx.beginPath();\n ctx.arc(nose.x * frameWidth, nose.y * frameHeight, 5, 0, 2 * Math.PI);\n ctx.fill();\n // Testa (magenta) - importante para validação de inclinação\n ctx.fillStyle = 'magenta';\n ctx.beginPath();\n ctx.arc(forehead.x * frameWidth, forehead.y * frameHeight, 4, 0, 2 * Math.PI);\n ctx.fill();\n // Queixo (verde)\n ctx.fillStyle = 'lime';\n ctx.beginPath();\n ctx.arc(chin.x * frameWidth, chin.y * frameHeight, 4, 0, 2 * Math.PI);\n ctx.fill();\n // Landmarks dos olhos para visualização da detecção de óculos\n // Olho esquerdo (amarelo)\n ctx.fillStyle = 'yellow';\n const leftEyeLandmarks = [\n landmarks[33], // Canto externo\n landmarks[133], // Canto interno\n landmarks[159], // Superior (pálpebra superior)\n landmarks[144], // Inferior (pálpebra inferior)\n landmarks[145], // Centro\n ];\n leftEyeLandmarks.forEach(landmark => {\n ctx.beginPath();\n ctx.arc(landmark.x * frameWidth, landmark.y * frameHeight, 3, 0, 2 * Math.PI);\n ctx.fill();\n });\n // Olho direito (amarelo)\n ctx.fillStyle = 'yellow';\n const rightEyeLandmarks = [\n landmarks[263], // Canto externo\n landmarks[362], // Canto interno\n landmarks[386], // Superior (pálpebra superior)\n landmarks[373], // Inferior (pálpebra inferior)\n landmarks[374], // Centro\n ];\n rightEyeLandmarks.forEach(landmark => {\n ctx.beginPath();\n ctx.arc(landmark.x * frameWidth, landmark.y * frameHeight, 3, 0, 2 * Math.PI);\n ctx.fill();\n });\n // Orelhas (roxo) - usadas na detecção de yaw\n ctx.fillStyle = 'purple';\n ctx.beginPath();\n ctx.arc(leftEar.x * frameWidth, leftEar.y * frameHeight, 3, 0, 2 * Math.PI);\n ctx.fill();\n ctx.beginPath();\n ctx.arc(rightEar.x * frameWidth, rightEar.y * frameHeight, 3, 0, 2 * Math.PI);\n ctx.fill();\n }\n }\n // Debug: desenhar mãos\n if (debugMode && handData && handData.length > 0) {\n handData.forEach((hand) => {\n ctx.fillStyle = 'orange';\n hand.landmarks.forEach((landmark) => {\n ctx.beginPath();\n ctx.arc(landmark.x * frameWidth, landmark.y * frameHeight, 3, 0, 2 * Math.PI);\n ctx.fill();\n });\n });\n }\n}\n","var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\n return new (P || (P = Promise))(function (resolve, reject) {\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n step((generator = generator.apply(thisArg, _arguments || [])).next());\n });\n};\nimport { FaceLandmarker, HandLandmarker, FilesetResolver } from '@mediapipe/tasks-vision';\nimport { ValidationStatus, } from './types';\nimport { getMessage, getLoadingModelsMessage } from './i18n';\nimport { calculateAverageBrightness, checkFaceDistance, isFaceStable, isHeadStraight, isFaceGeometryPlausible, isPointInsideOval, isFaceBoundingBoxInsideOval, isHandNearFace, isNeutralExpression, hasDarkGlasses, drawOverlay, } from './utils';\nconst DEFAULT_LOCALE = 'en';\nconst defaultOptions = {\n overlayCanvasElement: undefined,\n videoWidth: 512,\n videoHeight: 384,\n minDetectionConfidence: 0.4,\n minIlluminationThreshold: 50,\n minFaceSizeFactor: 0.15,\n maxFaceSizeFactor: 0.75,\n stabilizationTimeThreshold: 1000,\n stabilityMovementThreshold: 5,\n minFaceVisibilityScore: 0.4,\n maxHeadTiltDegrees: 30,\n maxHandFaceDistance: 0.15,\n debugMode: false,\n locale: DEFAULT_LOCALE,\n customMessages: {},\n};\n/**\n * FaceValidator SDK - Real-time selfie validation with MediaPipe\n */\nexport class FaceValidator {\n constructor(options) {\n this.faceLandmarker = null;\n this.handLandmarker = null;\n this.animationFrameId = null;\n this.lastDetection = null;\n this.stableSince = null;\n this.isCapturing = false;\n this.options = this.resolveOptions(options);\n this.setStatus(ValidationStatus.INITIALIZING);\n this.init();\n }\n resolveOptions(options) {\n const modelPath = options.modelPath || 'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm';\n return Object.assign(Object.assign(Object.assign({}, defaultOptions), options), { modelPath, locale: options.locale || DEFAULT_LOCALE, customMessages: options.customMessages || {} });\n }\n init() {\n return __awaiter(this, void 0, void 0, function* () {\n try {\n const loadingMsg = getLoadingModelsMessage(this.options.locale);\n this.setStatus(ValidationStatus.INITIALIZING, undefined, loadingMsg);\n // Initialize MediaPipe FilesetResolver\n const vision = yield FilesetResolver.forVisionTasks(this.options.modelPath);\n // Initialize FaceLandmarker\n this.faceLandmarker = yield FaceLandmarker.createFromOptions(vision, {\n baseOptions: {\n modelAssetPath: 'https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task',\n delegate: 'GPU'\n },\n runningMode: 'VIDEO',\n numFaces: 2, // Detectar até 2 faces para MULTIPLE_FACES\n minFaceDetectionConfidence: this.options.minDetectionConfidence,\n minFacePresenceConfidence: this.options.minFaceVisibilityScore,\n minTrackingConfidence: this.options.minFaceVisibilityScore,\n });\n // Initialize HandLandmarker\n this.handLandmarker = yield HandLandmarker.createFromOptions(vision, {\n baseOptions: {\n modelAssetPath: 'https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task',\n delegate: 'GPU'\n },\n runningMode: 'VIDEO',\n numHands: 2,\n minHandDetectionConfidence: 0.5,\n minHandPresenceConfidence: 0.5,\n minTrackingConfidence: 0.5,\n });\n this.startDetectionLoop();\n }\n catch (err) {\n const error = err instanceof Error ? 
err : new Error(String(err));\n this.setStatus(ValidationStatus.ERROR, error);\n }\n });\n }\n getMessageForStatus(status, messageOverride) {\n if (messageOverride)\n return messageOverride;\n if (this.options.customMessages[status]) {\n return this.options.customMessages[status];\n }\n return getMessage(status, this.options.locale);\n }\n setStatus(status, error, messageOverride) {\n const message = this.getMessageForStatus(status, messageOverride);\n this.options.onStatusUpdate(status, message);\n if (status === ValidationStatus.ERROR && error) {\n this.options.onError(status, error);\n }\n }\n startDetectionLoop() {\n const video = this.options.videoElement;\n const frameWidth = this.options.videoWidth || 640;\n const frameHeight = this.options.videoHeight || 480;\n const detect = () => __awaiter(this, void 0, void 0, function* () {\n var _a;\n if (!this.faceLandmarker || !this.handLandmarker || !video.videoWidth) {\n this.animationFrameId = requestAnimationFrame(detect);\n return;\n }\n try {\n const now = performance.now();\n let currentStatus = ValidationStatus.NO_FACE_DETECTED;\n let faceData = null;\n let handData = [];\n // Detectar faces\n const faceResults = this.faceLandmarker.detectForVideo(video, now);\n // Detectar mãos\n const handResults = this.handLandmarker.detectForVideo(video, now);\n // Processar mãos detectadas\n if (handResults.landmarks && handResults.landmarks.length > 0) {\n handData = handResults.landmarks.map((landmarks, idx) => {\n var _a, _b, _c;\n return ({\n landmarks,\n handedness: ((_c = (_b = (_a = handResults.handednesses) === null || _a === void 0 ? void 0 : _a[idx]) === null || _b === void 0 ? void 0 : _b[0]) === null || _c === void 0 ? void 0 : _c.categoryName) || 'Unknown'\n });\n });\n }\n if (faceResults.faceLandmarks && faceResults.faceLandmarks.length > 1) {\n // Múltiplas faces detectadas\n currentStatus = ValidationStatus.MULTIPLE_FACES;\n this.stableSince = null;\n // Usar primeira face para overlay\n const landmarks = faceResults.faceLandmarks[0];\n const box = ((_a = faceResults.faceBlendshapes) === null || _a === void 0 ? void 0 : _a[0]) ? this.estimateBoundingBox(landmarks) : null;\n if (box) {\n faceData = { boundingBox: box, landmarks, timestamp: now };\n }\n }\n else if (faceResults.faceLandmarks && faceResults.faceLandmarks.length === 1) {\n // Uma face detectada\n const landmarks = faceResults.faceLandmarks[0];\n const boundingBox = this.estimateBoundingBox(landmarks);\n faceData = {\n boundingBox,\n landmarks,\n timestamp: now,\n };\n // Validações sequenciais\n const distanceStatus = checkFaceDistance(boundingBox, this.options.minFaceSizeFactor, this.options.maxFaceSizeFactor);\n if (distanceStatus !== 'OK') {\n currentStatus = distanceStatus === 'TOO_CLOSE' ? 
ValidationStatus.TOO_CLOSE : ValidationStatus.TOO_FAR;\n this.stableSince = null;\n }\n else {\n // Verificar centralização: nariz no oval OU bounding box dentro do oval\n // Relaxado para aceitar quando pelo menos uma condição é verdadeira\n const nose = landmarks[4]; // MediaPipe nose tip\n const isNoseCentered = isPointInsideOval(nose.x, nose.y, frameWidth, frameHeight);\n const isFaceInsideOval = isFaceBoundingBoxInsideOval(boundingBox, frameWidth, frameHeight);\n // Aceitar se nariz está centrado (validação principal)\n // Validação do bounding box é adicional mas não obrigatória\n if (!isNoseCentered) {\n currentStatus = ValidationStatus.OFF_CENTER;\n this.stableSince = null;\n }\n else if (!isFaceGeometryPlausible(landmarks, boundingBox)) {\n currentStatus = ValidationStatus.FACE_OBSTRUCTED;\n this.stableSince = null;\n }\n else if (!isHeadStraight(landmarks, this.options.maxHeadTiltDegrees)) {\n currentStatus = ValidationStatus.HEAD_NOT_STRAIGHT;\n this.stableSince = null;\n }\n else if (handData.length > 0 && isHandNearFace(handData[0], boundingBox, this.options.maxHandFaceDistance)) {\n // Mão detectada próxima ao rosto\n currentStatus = ValidationStatus.FACE_OBSTRUCTED;\n this.stableSince = null;\n }\n else if (!isNeutralExpression(landmarks)) {\n // Expressão não neutra (sorriso, boca aberta, olhos fechados)\n currentStatus = ValidationStatus.NOT_NEUTRAL_EXPRESSION;\n this.stableSince = null;\n }\n else if (hasDarkGlasses(video, landmarks)) {\n // Óculos escuros detectados\n currentStatus = ValidationStatus.DARK_GLASSES;\n this.stableSince = null;\n }\n else {\n // Verificar iluminação\n const tempCanvas = document.createElement('canvas');\n const boxX = boundingBox.xMin * video.videoWidth;\n const boxY = boundingBox.yMin * video.videoHeight;\n const boxW = boundingBox.width * video.videoWidth;\n const boxH = boundingBox.height * video.videoHeight;\n tempCanvas.width = boxW;\n tempCanvas.height = boxH;\n const tempCtx = tempCanvas.getContext('2d', { willReadFrequently: true });\n if (tempCtx) {\n tempCtx.drawImage(video, boxX, boxY, boxW, boxH, 0, 0, boxW, boxH);\n const faceImageData = tempCtx.getImageData(0, 0, tempCanvas.width, tempCanvas.height);\n const brightness = calculateAverageBrightness(faceImageData);\n if (brightness < this.options.minIlluminationThreshold) {\n currentStatus = ValidationStatus.POOR_ILLUMINATION;\n this.stableSince = null;\n }\n else {\n // Verificar estabilidade\n if (isFaceStable(faceData, this.lastDetection, this.options.stabilityMovementThreshold, frameWidth, frameHeight)) {\n if (!this.stableSince)\n this.stableSince = now;\n if (now - this.stableSince >= this.options.stabilizationTimeThreshold) {\n currentStatus = ValidationStatus.CAPTURING;\n }\n else {\n currentStatus = ValidationStatus.STAY_STILL;\n }\n }\n else {\n this.stableSince = null;\n currentStatus = ValidationStatus.STAY_STILL;\n }\n }\n }\n else {\n currentStatus = ValidationStatus.FACE_DETECTED;\n this.stableSince = null;\n }\n }\n }\n }\n else {\n // Nenhuma face detectada\n this.lastDetection = null;\n this.stableSince = null;\n }\n this.lastDetection = faceData;\n this.setStatus(currentStatus);\n // Desenhar overlay\n if (this.options.overlayCanvasElement) {\n drawOverlay(this.options.overlayCanvasElement, this.options.debugMode || false, currentStatus, faceData || undefined, handData.length > 0 ? 
handData : undefined);\n }\n // Capturar se status é CAPTURING\n if (currentStatus === ValidationStatus.CAPTURING && !this.isCapturing) {\n this.isCapturing = true;\n yield this.captureImage();\n this.setStatus(ValidationStatus.SUCCESS);\n this.stop();\n return;\n }\n }\n catch (err) {\n const error = err instanceof Error ? err : new Error(String(err));\n this.setStatus(ValidationStatus.ERROR, error);\n }\n this.animationFrameId = requestAnimationFrame(detect);\n });\n this.animationFrameId = requestAnimationFrame(detect);\n }\n /**\n * Estima bounding box a partir dos landmarks (MediaPipe não fornece bbox diretamente).\n */\n estimateBoundingBox(landmarks) {\n const xs = landmarks.map((l) => l.x);\n const ys = landmarks.map((l) => l.y);\n const xMin = Math.min(...xs);\n const xMax = Math.max(...xs);\n const yMin = Math.min(...ys);\n const yMax = Math.max(...ys);\n return {\n xMin,\n yMin,\n width: xMax - xMin,\n height: yMax - yMin,\n };\n }\n captureImage() {\n return __awaiter(this, void 0, void 0, function* () {\n const video = this.options.videoElement;\n const canvas = document.createElement('canvas');\n canvas.width = video.videoWidth;\n canvas.height = video.videoHeight;\n const ctx = canvas.getContext('2d');\n if (!ctx) {\n this.setStatus(ValidationStatus.ERROR, new Error('Failed to get canvas context'));\n return;\n }\n ctx.drawImage(video, 0, 0, canvas.width, canvas.height);\n canvas.toBlob((blob) => {\n if (blob) {\n this.options.onCaptureSuccess(blob);\n }\n else {\n this.setStatus(ValidationStatus.ERROR, new Error('Failed to generate image blob'));\n }\n }, 'image/jpeg', 0.95);\n });\n }\n stop() {\n if (this.animationFrameId !== null) {\n cancelAnimationFrame(this.animationFrameId);\n this.animationFrameId = null;\n }\n if (this.faceLandmarker) {\n this.faceLandmarker.close();\n }\n if (this.handLandmarker) {\n this.handLandmarker.close();\n }\n }\n}\n","import { FaceValidator } from './FaceValidator';\nimport { ValidationStatus, } from './types';\nimport { getValidationMessages, getMessage, getLoadingModelsMessage, } from './i18n';\nexport { FaceValidator };\nexport { ValidationStatus };\nexport { getValidationMessages, getMessage, getLoadingModelsMessage };\nexport default 
FaceValidator;\n"],"names":["__webpack_require__","exports","definition","key","o","Object","defineProperty","enumerable","get","obj","prop","prototype","hasOwnProperty","call","Symbol","toStringTag","value","require","ValidationStatus","messages","INITIALIZING","NO_FACE_DETECTED","FACE_DETECTED","TOO_CLOSE","TOO_FAR","OFF_CENTER","FACE_OBSTRUCTED","HEAD_NOT_STRAIGHT","MULTIPLE_FACES","POOR_ILLUMINATION","NOT_NEUTRAL_EXPRESSION","DARK_GLASSES","STAY_STILL","CAPTURING","SUCCESS","ERROR","en","es","unknownStatusByLocale","getValidationMessages","locale","assign","getMessage","status","_a","getLoadingModelsMessage","calculateAverageBrightness","imageData","data","sum","i","length","MEDIAPIPE_LEFT_EYE","MEDIAPIPE_RIGHT_EYE","MEDIAPIPE_MOUTH_OUTER","OVAL_RADIUS_Y_FACTOR","__awaiter","thisArg","_arguments","P","generator","Promise","resolve","reject","fulfilled","step","next","e","rejected","result","done","then","apply","defaultOptions","overlayCanvasElement","undefined","videoWidth","videoHeight","minDetectionConfidence","minIlluminationThreshold","minFaceSizeFactor","maxFaceSizeFactor","stabilizationTimeThreshold","stabilityMovementThreshold","minFaceVisibilityScore","maxHeadTiltDegrees","maxHandFaceDistance","debugMode","customMessages","FaceValidator","constructor","options","this","faceLandmarker","handLandmarker","animationFrameId","lastDetection","stableSince","isCapturing","resolveOptions","setStatus","init","modelPath","loadingMsg","vision","FilesetResolver","forVisionTasks","FaceLandmarker","createFromOptions","baseOptions","modelAssetPath","delegate","runningMode","numFaces","minFaceDetectionConfidence","minFacePresenceConfidence","minTrackingConfidence","HandLandmarker","numHands","minHandDetectionConfidence","minHandPresenceConfidence","startDetectionLoop","err","error","Error","String","getMessageForStatus","messageOverride","message","onStatusUpdate","onError","video","videoElement","frameWidth","frameHeight","detect","now","performance","currentStatus","faceData","handData","faceResults","detectForVideo","handResults","landmarks","map","idx","_b","_c","handedness","handednesses","categoryName","faceLandmarks","box","faceBlendshapes","estimateBoundingBox","boundingBox","timestamp","distanceStatus","faceWidthRatio","width","checkFaceDistance","nose","isNoseCentered","pointX","pointY","dx","dy","isPointInsideOval","x","y","cx","cy","rx","ry","faceLeft","xMin","faceRight","faceTop","yMin","faceBottom","height","centerDx","centerDy","corners","corner","cornersOutside","isFaceBoundingBoxInsideOval","mouthPoints","mouthCenterY","reduce","s","p","mouthMinY","Math","min","mouthVerticalSpread","max","boxHeight","isFaceGeometryPlausible","maxTiltDegrees","leftEye","rightEye","upperLip","lowerLip","chin","forehead","eyeDeltaY","abs","eyeDeltaX","rollRatio","atan","PI","midEyesX","noseOffsetX","eyeDist","yawRatio","leftEar","rightEar","leftEarToNoseX","rightEarToNoseX","z","isYawAcceptable","midEyesY","mouthY","faceHeight","foreheadEyesRatio","eyesNoseRatio","noseMouthRatio","mouthChinRatio","isHeadStraight","faceBoundingBox","maxDistance","faceCenterX","faceCenterY","landmark","sqrt","isHandNearFace","leftEyeTop","leftEyeBottom","rightEyeTop","rightEyeBottom","leftEyeOpenness","rightEyeOpenness","mouthTop","mouthBottom","mouthLeftCorner","mouthRightCorner","isNeutralExpression","tempCanvas","document","createElement","ctx","getContext","leftEyeLandmarks","rightEyeLandmarks","getBoundingBox","eyeLandmarks","xs","l","ys","minX","maxX","minY","getRegionBrightness","drawImage","getImageData","le
ftEyeBox","rightEyeBox","console","warn","hasDarkGlasses","boxX","boxY","boxW","boxH","tempCtx","willReadFrequently","currentFace","previousFace","movementThreshold","currentCenterX","currentCenterY","previousCenterX","previousCenterY","deltaX","deltaY","deltaWidth","deltaHeight","isFaceStable","canvas","frameCenterX","frameCenterY","clearRect","radiusX","radiusY","fillStyle","fillRect","save","beginPath","ellipse","closePath","globalCompositeOperation","fill","restore","strokeStyle","lineWidth","stroke","moveTo","lineTo","allXCoords","allYCoords","margin","w","h","boxColor","strokeRect","arc","forEach","hand","drawOverlay","captureImage","stop","requestAnimationFrame","xMax","toBlob","blob","onCaptureSuccess","cancelAnimationFrame","close"],"sourceRoot":""}
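The embedded `i18n.ts` keeps one message catalog per locale (pt-BR, en, es) with an "unknown status" fallback, and `FaceValidator` checks `customMessages[status]` before falling back to that catalog. The sketch below shows how overrides and lookups might be combined; the `Partial<Record<…>>` typing is an assumption about the published type definitions rather than something visible in this map.

```ts
// Message lookup sketch based on the i18n source embedded in the map above.
import { ValidationStatus, getMessage, getValidationMessages } from 'face-validator-sdk';

// Per-status overrides: the validator uses customMessages[status] when present,
// otherwise the built-in catalog for the configured locale.
// Pass this object as the customMessages option when constructing FaceValidator.
const customMessages: Partial<Record<ValidationStatus, string>> = {
  [ValidationStatus.STAY_STILL]: 'Hold still for a moment…',
  [ValidationStatus.SUCCESS]: 'All done!',
};

// The built-in catalogs remain available for inspection or custom UIs.
const en = getValidationMessages('en');
console.log(en[ValidationStatus.TOO_FAR]);                 // "Move closer to the camera"
console.log(getMessage(ValidationStatus.TOO_CLOSE, 'es'));  // "Aléjese un poco"
```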