loomlarge 0.1.0 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +654 -541
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -1,714 +1,827 @@
1
- ![Latticework by Lovelace LOL](LoomLarge.png)
1
+ # LoomLarge
2
2
 
3
- # LoomLarge (Latticework Animation Platform)
3
+ A FACS-based morph and bone mapping library for controlling high-definition 3D characters in Three.js.
4
4
 
5
- **LoomLarge** is a next-generation interactive 3D character animation platform built on **Latticework**, featuring reactive state management with XState, facial animation control via ARKit FACS (Facial Action Coding System), and modular agency-based architecture for lip-sync, eye/head tracking, prosodic expression, and conversational AI.
5
+ LoomLarge provides pre-built mappings that connect [Facial Action Coding System (FACS)](https://en.wikipedia.org/wiki/Facial_Action_Coding_System) Action Units to the morph targets and bone transforms found in Character Creator 4 (CC4) characters. Instead of manually figuring out which blend shapes correspond to which facial movements, you can simply say `setAU(12, 0.8)` and the library handles the rest.
6
6
 
7
7
  ---
8
8
 
9
9
  ## Table of Contents
10
- 1. [Quick Start](#quick-start)
11
- 2. [Core Concepts](#core-concepts)
12
- - [Latticework Architecture](#latticework-architecture)
13
- - [Composite Rotation System](#composite-rotation-system)
14
- - [XState & Reactive Services](#xstate--reactive-services)
15
- 3. [Installation](#installation)
16
- 4. [Project Structure](#project-structure)
17
- 5. [How It Works](#how-it-works)
18
- - [Animation Service](#animation-service)
19
- - [Eye & Head Tracking](#eye--head-tracking)
20
- - [Lip-Sync Agency](#lip-sync-agency)
21
- - [Prosodic Expression](#prosodic-expression)
22
- 6. [Modules](#modules)
23
- 7. [Development](#development)
24
- 8. [Deployment](#deployment)
25
- 9. [License & Acknowledgments](#license--acknowledgments)
10
+
11
+ 1. [Installation & Setup](#1-installation--setup)
12
+ 2. [Using Presets](#2-using-presets)
13
+ 3. [Extending & Custom Presets](#3-extending--custom-presets)
14
+ 4. [Action Unit Control](#4-action-unit-control)
15
+ 5. [Mix Weight System](#5-mix-weight-system)
16
+ 6. [Composite Rotation System](#6-composite-rotation-system)
17
+ 7. [Continuum Pairs](#7-continuum-pairs)
18
+ 8. [Direct Morph Control](#8-direct-morph-control)
19
+ 9. [Viseme System](#9-viseme-system)
20
+ 10. [Transition System](#10-transition-system)
21
+ 11. [Playback & State Control](#11-playback--state-control)
22
+ 12. [Hair Physics](#12-hair-physics)
26
23
 
27
24
  ---
28
25
 
29
- ## Quick Start
26
+ ## 1. Installation & Setup
30
27
 
31
- ```bash
32
- # Clone the repository
33
- git clone https://github.com/meekmachine/LoomLarge.git
34
- cd LoomLarge
28
+ ### Install the package
35
29
 
36
- # Install dependencies
37
- yarn install
30
+ ```bash
31
+ npm install loomlarge
32
+ ```
38
33
 
39
- # Start development server (Vite)
40
- yarn dev
34
+ ### Peer dependency
41
35
 
42
- # Build for production
43
- yarn build
36
+ LoomLarge requires Three.js as a peer dependency:
44
37
 
45
- # Deploy to GitHub Pages
46
- yarn deploy
38
+ ```bash
39
+ npm install three
47
40
  ```
48
41
 
49
- The development server will start at `http://localhost:5173` with hot module replacement enabled.
42
+ ### Basic setup
50
43
 
51
- ---
44
+ ```typescript
45
+ import * as THREE from 'three';
46
+ import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
47
+ import { LoomLargeThree, collectMorphMeshes, CC4_PRESET } from 'loomlarge';
52
48
 
53
- ## Core Concepts
49
+ // 1. Create the LoomLarge controller with a preset
50
+ const loom = new LoomLargeThree({ auMappings: CC4_PRESET });
54
51
 
55
- ### Latticework Architecture
52
+ // 2. Set up your Three.js scene
53
+ const scene = new THREE.Scene();
54
+ const camera = new THREE.PerspectiveCamera(35, window.innerWidth / window.innerHeight, 0.1, 100);
55
+ const renderer = new THREE.WebGLRenderer({ antialias: true });
56
+ renderer.setSize(window.innerWidth, window.innerHeight);
57
+ document.body.appendChild(renderer.domElement);
56
58
 
57
- - **Agency-Based Design**: Independent services (agencies) handle specialized tasks:
58
- - **Animation Agency**: Core snippet scheduling and playback
59
- - **Lip-Sync Agency**: Phoneme prediction and viseme animation
60
- - **Prosodic Expression Agency**: Emotional head gestures and speech timing
61
- - **Eye/Head Tracking Agency**: Gaze control with mouse, webcam, or manual modes
62
- - **Conversation Agency**: Multi-modal conversational AI orchestration
59
+ // 3. Load your character model
60
+ const loader = new GLTFLoader();
61
+ loader.load('/character.glb', (gltf) => {
62
+ scene.add(gltf.scene);
63
63
 
64
- - **Immutable State**: Reactive state management ensures predictable updates and time-travel debugging
65
- - **XState Machines**: Declarative state machines replace callback hell and ad-hoc timers
64
+ // 4. Collect all meshes that have morph targets
65
+ const meshes = collectMorphMeshes(gltf.scene);
66
66
 
67
- ### Composite Rotation System
67
+ // 5. Initialize LoomLarge with the meshes and model
68
+ loom.onReady({ meshes, model: gltf.scene });
68
69
 
69
- The **Composite Rotation System** in EngineThree allows smooth blending of blendshapes (morphs) and bone rotations:
70
+ console.log(`Loaded ${meshes.length} meshes with morph targets`);
71
+ });
70
72
 
71
- - **Continuum Values** (-1 to +1): Single value controls paired AUs (e.g., Head Left ↔ Right)
72
- - **Mix Weights** (0 to 1): Blend between 100% morph (0) and 100% bone (1)
73
- - **Unified Rotation State**: Prevents axis conflicts when multiple systems control the same bones
73
+ // 6. Animation loop - call loom.update() every frame
74
+ let lastTime = performance.now();
75
+ function animate() {
76
+ requestAnimationFrame(animate);
74
77
 
75
- Example:
76
- ```typescript
77
- // Eyes: -1 (look left) to +1 (look right)
78
- engine.applyEyeComposite(yaw, pitch);
78
+ const now = performance.now();
79
+ const deltaSeconds = (now - lastTime) / 1000;
80
+ lastTime = now;
79
81
 
80
- // Head: yaw/pitch/roll with mix control
81
- engine.applyHeadComposite(yaw, pitch, roll);
82
+ // Update LoomLarge transitions
83
+ loom.update(deltaSeconds);
84
+
85
+ renderer.render(scene, camera);
86
+ }
87
+ animate();
82
88
  ```
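+
+ Once `onReady` has run, a quick way to confirm everything is wired up is to trigger a single expression from the loader callback (a minimal check using the AU API covered in [Action Unit Control](#4-action-unit-control) below):
+
+ ```typescript
+ // Inside the GLTFLoader callback, after loom.onReady(...):
+ // animate AU12 (smile) to 80% over 300 ms as a quick sanity check.
+ loom.transitionAU(12, 0.8, 300);
+ ```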
83
89
 
84
- ### XState & Reactive Services
90
+ ### The `collectMorphMeshes` helper
85
91
 
86
- - **XState 5.x**: Modern state machines with TypeScript support
87
- - **Service Pattern**: Each agency exposes a service with start/stop/update lifecycle
88
- - **Global Context**: Services registered in `ModulesContext` for cross-component access
92
+ This utility function traverses a Three.js scene and returns all meshes that have `morphTargetInfluences` (i.e., blend shapes). It's the recommended way to gather meshes for LoomLarge:
93
+
94
+ ```typescript
95
+ import { collectMorphMeshes } from 'loomlarge';
96
+
97
+ const meshes = collectMorphMeshes(gltf.scene);
98
+ // Returns: Array of THREE.Mesh objects with morph targets
99
+ ```
89
100
 
90
101
  ---
91
102
 
92
- ## Installation
103
+ ## 2. Using Presets
93
104
 
94
- ### Prerequisites
105
+ Presets define how FACS Action Units map to your character's morph targets and bones. LoomLarge ships with `CC4_PRESET` for Character Creator 4 characters.
95
106
 
96
- - **Node.js** 18+ (LTS recommended)
97
- - **Yarn** 1.22+ or npm 8+
98
- - Modern browser with WebGL 2.0 support
107
+ ### What's in a preset?
99
108
 
100
- ### Setup Steps
109
+ ```typescript
110
+ import { CC4_PRESET } from 'loomlarge';
111
+
112
+ // CC4_PRESET contains:
113
+ {
114
+ auToMorphs: {
115
+ // AU number → array of morph target names
116
+ 1: ['Brow_Raise_Inner_L', 'Brow_Raise_Inner_R'],
117
+ 12: ['Mouth_Smile_L', 'Mouth_Smile_R'],
118
+ 45: ['Eye_Blink_L', 'Eye_Blink_R'],
119
+ // ... 87 AUs total
120
+ },
121
+
122
+ auToBones: {
123
+ // AU number → array of bone bindings
124
+ 51: [{ node: 'HEAD', channel: 'ry', scale: -1, maxDegrees: 30 }],
125
+ 61: [{ node: 'EYE_L', channel: 'rz', scale: 1, maxDegrees: 25 }],
126
+ // ... 32 bone bindings
127
+ },
128
+
129
+ boneNodes: {
130
+ // Logical bone name → actual node name in skeleton
131
+ 'HEAD': 'CC_Base_Head',
132
+ 'JAW': 'CC_Base_JawRoot',
133
+ 'EYE_L': 'CC_Base_L_Eye',
134
+ 'EYE_R': 'CC_Base_R_Eye',
135
+ 'TONGUE': 'CC_Base_Tongue01',
136
+ },
137
+
138
+ visemeKeys: [
139
+ // 15 viseme morph names for lip-sync
140
+ 'V_EE', 'V_Er', 'V_IH', 'V_Ah', 'V_Oh',
141
+ 'V_W_OO', 'V_S_Z', 'V_Ch_J', 'V_F_V', 'V_TH',
142
+ 'V_T_L_D_N', 'V_B_M_P', 'V_K_G_H_NG', 'V_AE', 'V_R'
143
+ ],
101
144
 
102
- 1. **Clone and install**:
103
- ```bash
104
- git clone https://github.com/meekmachine/LoomLarge.git
105
- cd LoomLarge
106
- yarn install
107
- ```
145
+ morphToMesh: {
146
+ // Routes morph categories to specific meshes
147
+ 'face': ['CC_Base_Body'],
148
+ 'tongue': ['CC_Base_Tongue'],
149
+ 'eye': ['CC_Base_EyeOcclusion_L', 'CC_Base_EyeOcclusion_R'],
150
+ },
151
+
152
+ auMixDefaults: {
153
+ // Default morph/bone blend weights (0 = morph, 1 = bone)
154
+ 26: 0.5, // Jaw drop: 50% morph, 50% bone
155
+ 51: 0.7, // Head turn: 70% bone
156
+ },
157
+
158
+ auInfo: {
159
+ // Metadata about each AU
160
+ '12': {
161
+ name: 'Lip Corner Puller',
162
+ muscularBasis: 'zygomaticus major',
163
+ faceArea: 'Lower',
164
+ facePart: 'Mouth',
165
+ },
166
+ // ...
167
+ }
168
+ }
169
+ ```
108
170
 
109
- 2. **Add your 3D model**:
110
- - Place GLB file in `public/characters/`
111
- - Update model path in `src/App.tsx`:
112
- ```typescript
113
- const glbSrc = import.meta.env.BASE_URL + "characters/your-model.glb";
114
- ```
171
+ ### Passing a preset to LoomLarge
115
172
 
116
- 3. **Configure API keys** (optional, for AI modules):
117
- - Create `.env.local`:
118
- ```
119
- VITE_ANTHROPIC_API_KEY=sk-ant-...
120
- ```
173
+ ```typescript
174
+ import { LoomLargeThree, CC4_PRESET } from 'loomlarge';
121
175
 
122
- 4. **Start development**:
123
- ```bash
124
- yarn dev
125
- ```
176
+ const loom = new LoomLargeThree({ auMappings: CC4_PRESET });
177
+ ```
126
178
 
127
179
  ---
128
180
 
129
- ## Project Structure
130
-
131
- ```
132
- LoomLarge/
133
- ├── README.md # This file
134
- ├── package.json
135
- ├── vite.config.ts # Vite bundler config
136
- ├── public/
137
- │ ├── characters/ # GLB 3D models
138
- │ ├── animations/ # Pre-baked animation JSON
139
- │ ├── models/ # ML models (face-api.js)
140
- │ └── skyboxes/ # Environment maps
141
- ├── src/
142
- │ ├── App.tsx # Main React app entry
143
- │ ├── main.tsx # Vite entry point
144
- │ ├── engine/
145
- │ │ ├── EngineThree.ts # Three.js engine (AU/morph control)
146
- │ │ ├── EngineWind.ts # Wind physics for hair/cloth
147
- │ │ └── arkit/
148
- │ │ └── shapeDict.ts # ARKit FACS AU → morph mappings
149
- │ ├── latticework/ # Core agencies
150
- │ │ ├── animation/ # Animation scheduler (XState)
151
- │ │ ├── lipsync/ # Lip-sync phoneme predictor
152
- │ │ ├── prosodic/ # Prosodic expression (head gestures)
153
- │ │ ├── eyeHeadTracking/ # Eye/head tracking service
154
- │ │ ├── conversation/ # Conversational AI orchestration
155
- │ │ └── transcription/ # Speech-to-text services
156
- │ ├── components/
157
- │ │ ├── au/ # AU control UI components
158
- │ │ │ ├── AUSection.tsx # AU sliders
159
- │ │ │ ├── VisemeSection.tsx # Viseme controls
160
- │ │ │ ├── EyeHeadTrackingSection.tsx # Eye/head tracking UI
161
- │ │ │ └── ContinuumSlider.tsx # Bidirectional AU slider
162
- │ │ ├── SliderDrawer.tsx # Main dockable UI drawer
163
- │ │ ├── PlaybackControls.tsx # Animation playback controls
164
- │ │ ├── CurveEditor.tsx # Visual curve editor
165
- │ │ └── ModulesMenu.tsx # Module activation UI
166
- │ ├── modules/ # Pluggable modules
167
- │ │ ├── aiChat/ # AI chat module (Anthropic)
168
- │ │ ├── frenchQuiz/ # French quiz demo module
169
- │ │ └── config.ts # Module registry
170
- │ ├── context/
171
- │ │ ├── threeContext.tsx # Global engine/animation context
172
- │ │ └── ModulesContext.tsx # Module state management
173
- │ ├── hooks/
174
- │ │ └── useWebcamEyeTracking.ts # Webcam face tracking hook
175
- │ ├── scenes/
176
- │ │ └── CharacterGLBScene.tsx # Three.js React scene
177
- │ └── utils/
178
- │ └── animationLoader.ts # Load animation JSON files
179
- └── docs/ # Documentation
180
- ├── QUICK_START.md
181
- ├── LIPSYNC_COMPLETE_GUIDE.md
182
- ├── BACKEND_INTEGRATION.md
183
- └── DEPLOYMENT.md
181
+ ## 3. Extending & Custom Presets
182
+
183
+ ### Extending an existing preset
184
+
185
+ Use spread syntax to override specific mappings while keeping the rest:
186
+
187
+ ```typescript
188
+ import { CC4_PRESET } from 'loomlarge';
189
+
190
+ const MY_PRESET = {
191
+ ...CC4_PRESET,
192
+
193
+ // Override AU12 (smile) with custom morph names
194
+ auToMorphs: {
195
+ ...CC4_PRESET.auToMorphs,
196
+ 12: ['MySmile_Left', 'MySmile_Right'],
197
+ },
198
+
199
+ // Add a new bone binding
200
+ auToBones: {
201
+ ...CC4_PRESET.auToBones,
202
+ 99: [{ node: 'CUSTOM_BONE', channel: 'ry', scale: 1, maxDegrees: 45 }],
203
+ },
204
+
205
+ // Update bone node paths
206
+ boneNodes: {
207
+ ...CC4_PRESET.boneNodes,
208
+ 'CUSTOM_BONE': 'MyRig_CustomBone',
209
+ },
210
+ };
211
+
212
+ const loom = new LoomLargeThree({ auMappings: MY_PRESET });
213
+ ```
214
+
215
+ ### Creating a preset from scratch
216
+
217
+ ```typescript
218
+ import { AUMappingConfig } from 'loomlarge';
219
+
220
+ const CUSTOM_PRESET: AUMappingConfig = {
221
+ auToMorphs: {
222
+ 1: ['brow_inner_up_L', 'brow_inner_up_R'],
223
+ 2: ['brow_outer_up_L', 'brow_outer_up_R'],
224
+ 12: ['mouth_smile_L', 'mouth_smile_R'],
225
+ 45: ['eye_blink_L', 'eye_blink_R'],
226
+ },
227
+
228
+ auToBones: {
229
+ 51: [{ node: 'HEAD', channel: 'ry', scale: -1, maxDegrees: 30 }],
230
+ 52: [{ node: 'HEAD', channel: 'ry', scale: 1, maxDegrees: 30 }],
231
+ },
232
+
233
+ boneNodes: {
234
+ 'HEAD': 'head_bone',
235
+ 'JAW': 'jaw_bone',
236
+ },
237
+
238
+ visemeKeys: ['aa', 'ee', 'ih', 'oh', 'oo'],
239
+
240
+ morphToMesh: {
241
+ 'face': ['body_mesh'],
242
+ },
243
+ };
244
+ ```
245
+
246
+ ### Changing presets at runtime
247
+
248
+ ```typescript
249
+ // Switch to a different preset
250
+ loom.setAUMappings(ANOTHER_PRESET);
251
+
252
+ // Get current mappings
253
+ const current = loom.getAUMappings();
184
254
  ```
185
255
 
186
256
  ---
187
257
 
188
- ## How It Works
189
-
190
- ### Animation Service
191
-
192
- The **Animation Service** (`latticework/animation/animationService.ts`) is the core scheduler:
193
-
194
- 1. **Load Snippets**: JSON files defining AU/morph keyframe curves
195
- ```typescript
196
- const anim = createAnimationService(host);
197
- anim.loadSnippet('smile', smileSnippetJSON);
198
- ```
199
-
200
- 2. **Schedule Playback**: Queue snippets with priority and duration
201
- ```typescript
202
- anim.schedule('smile', {
203
- duration: 1000,
204
- priority: 10,
205
- loop: false
206
- });
207
- ```
208
-
209
- 3. **XState Machine**: Manages snippet lifecycle (`idle` → `playing` → `paused`)
210
- - Driven by central frame loop (`threeContext.tsx`)
211
- - Handles overlapping snippets with priority-based blending
212
-
213
- 4. **Host Interface**: Abstraction layer for AU/morph application
214
- ```typescript
215
- const host = {
216
- applyAU: (id, value) => engine.setAU(id, value),
217
- setMorph: (key, value) => engine.setMorph(key, value),
218
- transitionAU: (id, value, duration) => engine.transitionAU(id, value, duration)
219
- };
220
- ```
221
-
222
- ### Eye & Head Tracking
223
-
224
- The **Eye/Head Tracking Service** (`latticework/eyeHeadTracking/eyeHeadTrackingService.ts`) provides three tracking modes:
225
-
226
- 1. **Manual Mode**: Direct slider control of gaze direction
227
- 2. **Mouse Mode**: Character follows cursor with mirror behavior
228
- - Mouse left → Character looks right (at user)
229
- - Negative x coordinate for natural gaze
230
- 3. **Webcam Mode**: Face tracking using BlazeFace model
231
- - Real-time eye position detection
232
- - Normalized coordinates (-1 to 1)
233
-
234
- **Key Features**:
235
- - **Composite Methods**: Uses `applyEyeComposite(yaw, pitch)` and `applyHeadComposite(yaw, pitch, roll)`
236
- - **Intensity Control**: Separate sliders for eye and head movement intensity
237
- - **Head Follow Eyes**: Optional delayed head movement matching eye gaze
238
- - **Global Service**: Created in App.tsx and shared via ModulesContext
239
-
240
- **Usage**:
241
- ```typescript
242
- // Initialize service with engine reference
243
- const service = createEyeHeadTrackingService({
244
- eyeTrackingEnabled: true,
245
- headTrackingEnabled: true,
246
- headFollowEyes: true,
247
- eyeIntensity: 1.0,
248
- headIntensity: 0.5,
249
- engine: engine
258
+ ## 4. Action Unit Control
259
+
260
+ Action Units are the core of FACS. Each AU represents a specific muscular movement of the face.
261
+
262
+ ### Setting an AU immediately
263
+
264
+ ```typescript
265
+ // Set AU12 (smile) to 80% intensity
266
+ loom.setAU(12, 0.8);
267
+
268
+ // Set AU45 (blink) to full intensity
269
+ loom.setAU(45, 1.0);
270
+
271
+ // Set to 0 to deactivate
272
+ loom.setAU(12, 0);
273
+ ```
274
+
275
+ ### Transitioning an AU over time
276
+
277
+ ```typescript
278
+ // Animate AU12 to 0.8 over 200ms
279
+ const handle = loom.transitionAU(12, 0.8, 200);
280
+
281
+ // Wait for completion
282
+ await handle.promise;
283
+
284
+ // Or chain transitions
285
+ loom.transitionAU(12, 1.0, 200).promise.then(() => {
286
+ loom.transitionAU(12, 0, 300); // Fade out
250
287
  });
288
+ ```
251
289
 
252
- service.start();
253
- service.setMode('mouse'); // or 'webcam' or 'manual'
290
+ ### Getting the current AU value
254
291
 
255
- // Set gaze target manually
256
- service.setGazeTarget({ x: 0.5, y: -0.2, z: 0 });
292
+ ```typescript
293
+ const smileAmount = loom.getAU(12);
294
+ console.log(`Current smile: ${smileAmount}`);
257
295
  ```
258
296
 
259
- ### Lip-Sync Agency
297
+ ### Asymmetric control with balance
260
298
 
261
- The **Lip-Sync Agency** (`latticework/lipsync/`) generates viseme animations from text:
299
+ Many AUs have left and right variants (e.g., `Mouth_Smile_L` and `Mouth_Smile_R`). The `balance` parameter lets you control them independently:
262
300
 
263
- 1. **Phoneme Prediction**: Enhanced predictor with coarticulation model
264
- ```typescript
265
- const predictor = new EnhancedPhonemePredictor();
266
- const phonemes = predictor.predict('Hello world');
267
- ```
301
+ ```typescript
302
+ // Balance range: -1 (left only) to +1 (right only), 0 = both equal
268
303
 
269
- 2. **Viseme Mapping**: Phonemes → ARKit visemes (AA, CH_J, DD, etc.)
270
- - Timing based on phoneme duration and speech rate
271
- - Coarticulation smoothing between adjacent phonemes
304
+ // Smile on both sides equally
305
+ loom.setAU(12, 0.8, 0);
272
306
 
273
- 3. **Animation Snippet Generation**: Creates JSON snippets for animation service
274
- ```typescript
275
- const snippet = generateLipsyncSnippet(text, {
276
- speechRate: 1.0,
277
- intensity: 0.8,
278
- style: 'relaxed' // or 'precise', 'theatrical', etc.
279
- });
280
- ```
307
+ // Smile only on left side
308
+ loom.setAU(12, 0.8, -1);
281
309
 
282
- 4. **Integration**: Scheduled via animation service with high priority (30)
310
+ // Smile only on right side
311
+ loom.setAU(12, 0.8, 1);
283
312
 
284
- ### Prosodic Expression
313
+ // 70% left, 30% right
314
+ loom.setAU(12, 0.8, -0.4);
315
+ ```
285
316
 
286
- The **Prosodic Expression Agency** (`latticework/prosodic/prosodicService.ts`) adds emotional head movements:
317
+ ### String-based side selection
287
318
 
288
- 1. **XState Machine**: Models prosodic states (idle → analyzing → expressing)
289
- 2. **Gesture Library**: Pre-defined head nods, tilts, and shakes
290
- - Nod: Positive affirmation (head pitch down)
291
- - Shake: Negation (head yaw side-to-side)
292
- - Tilt: Curiosity/emphasis (head roll)
319
+ You can also specify the side directly in the AU ID:
293
320
 
294
- 3. **Emotion Mapping**: Text analysis triggers appropriate gestures
295
- ```typescript
296
- // Question → slight head tilt
297
- // Exclamation → head nod emphasis
298
- // Negation words → head shake
299
- ```
321
+ ```typescript
322
+ // These are equivalent:
323
+ loom.setAU('12L', 0.8); // Left side only
324
+ loom.setAU(12, 0.8, -1); // Left side only
300
325
 
301
- 4. **Scheduling**: Prosodic snippets scheduled with medium priority (20)
326
+ loom.setAU('12R', 0.8); // Right side only
327
+ loom.setAU(12, 0.8, 1); // Right side only
328
+ ```
302
329
 
303
330
  ---
304
331
 
305
- ## Modules
306
-
307
- LoomLarge supports pluggable modules for extended functionality:
308
-
309
- ### AI Chat Module
310
- - **Description**: Real-time conversational AI using Anthropic Claude
311
- - **Location**: `src/modules/aiChat/`
312
- - **Features**:
313
- - Streaming text-to-speech synthesis
314
- - Lip-sync integration with prosodic expression
315
- - Eye/head tracking during conversation
316
- - WebSocket or LiveKit audio streaming
317
-
318
- **Activation**:
319
- ```typescript
320
- // Via ModulesMenu UI or programmatically:
321
- import { AIChatApp } from './modules/aiChat';
322
- <AIChatApp animationManager={anim} />
323
- ```
324
-
325
- ### French Quiz Module
326
- - **Description**: Interactive language learning demo
327
- - **Location**: `src/modules/frenchQuiz/`
328
- - **Features**:
329
- - Survey-style question flow
330
- - Facial expressions tied to correct/incorrect answers
331
- - Modal-based UI with progress tracking
332
-
333
- ### Custom Modules
334
-
335
- Create your own modules by following this pattern:
336
-
337
- 1. **Define module config** (`src/modules/config.ts`):
338
- ```typescript
339
- export default {
340
- modules: [
341
- {
342
- name: 'My Module',
343
- description: 'Custom module description',
344
- component: './modules/myModule/index.tsx'
345
- }
346
- ]
347
- };
348
- ```
349
-
350
- 2. **Create module component**:
351
- ```typescript
352
- // src/modules/myModule/index.tsx
353
- import React from 'react';
354
- import { useModulesContext } from '../../context/ModulesContext';
355
-
356
- export default function MyModule({ animationManager }: any) {
357
- const { eyeHeadTrackingService } = useModulesContext();
358
-
359
- // Your module logic here
360
- return <div>My Module UI</div>;
361
- }
362
- ```
332
+ ## 5. Mix Weight System
363
333
 
364
- ---
334
+ Some AUs can be driven by both morph targets (blend shapes) AND bone rotations. The mix weight controls the blend between them.
365
335
 
366
- ## Development
336
+ ### Why mix weights?
367
337
 
368
- ### Running the Dev Server
338
+ Take jaw opening (AU26) as an example:
339
+ - **Morph-only (weight 0)**: Vertices deform to show open mouth, but jaw bone doesn't move
340
+ - **Bone-only (weight 1)**: Jaw bone rotates down, but no soft tissue deformation
341
+ - **Mixed (weight 0.5)**: Both contribute equally for realistic results
369
342
 
370
- ```bash
371
- yarn dev
343
+ ### Setting mix weights
344
+
345
+ ```typescript
346
+ // Get the default mix weight for AU26
347
+ const weight = loom.getAUMixWeight(26); // e.g., 0.5
348
+
349
+ // Set to pure morph
350
+ loom.setAUMixWeight(26, 0);
351
+
352
+ // Set to pure bone
353
+ loom.setAUMixWeight(26, 1);
354
+
355
+ // Set to 70% bone, 30% morph
356
+ loom.setAUMixWeight(26, 0.7);
372
357
  ```
373
358
 
374
- Access at `http://localhost:5173` with:
375
- - Hot module replacement (HMR)
376
- - Source maps for debugging
377
- - Console logging for all services
359
+ ### Which AUs support mixing?
378
360
 
379
- ### Testing Animation Snippets
361
+ Only AUs that have both `auToMorphs` AND `auToBones` entries support mixing. Common examples:
362
+ - AU26 (Jaw Drop)
363
+ - AU27 (Mouth Stretch)
364
+ - AU51-56 (Head movements)
365
+ - AU61-64 (Eye movements)
380
366
 
381
- Load test animations in the browser console:
367
+ ```typescript
368
+ import { isMixedAU } from 'loomlarge';
382
369
 
383
- ```javascript
384
- // Global handles (auto-exposed in dev mode)
385
- window.engine // EngineThree instance
386
- window.anim // Animation service
370
+ if (isMixedAU(26)) {
371
+ console.log('AU26 supports morph/bone mixing');
372
+ }
373
+ ```
387
374
 
388
- // Load and play a snippet
389
- anim.loadSnippet('test', {
390
- duration: 2000,
391
- keyframes: {
392
- 'AU_12': [[0, 0], [1000, 1], [2000, 0]], // Smile curve
393
- 'AU_6': [[0, 0], [1000, 0.8], [2000, 0]] // Cheek raise
394
- }
395
- });
396
- anim.schedule('test', { priority: 10 });
397
- anim.play();
375
+ ---
376
+
377
+ ## 6. Composite Rotation System
378
+
379
+ Bones like the head and eyes need multi-axis rotation (pitch, yaw, roll). The composite rotation system handles this automatically.
380
+
381
+ ### How it works
382
+
383
+ When you set an AU that affects a bone rotation, LoomLarge:
384
+ 1. Queues the rotation update in `pendingCompositeNodes`
385
+ 2. At the end of `update()`, calls `flushPendingComposites()`
386
+ 3. Applies all three axes (pitch, yaw, roll) together to prevent gimbal issues
387
+
388
+ ### Supported bones and their axes
389
+
390
+ | Bone | Pitch (X) | Yaw (Y) | Roll (Z) |
391
+ |------|-----------|---------|----------|
392
+ | HEAD | AU53 (up) / AU54 (down) | AU51 (left) / AU52 (right) | AU55 (tilt left) / AU56 (tilt right) |
393
+ | EYE_L | AU63 (up) / AU64 (down) | AU61 (left) / AU62 (right) | - |
394
+ | EYE_R | AU63 (up) / AU64 (down) | AU61 (left) / AU62 (right) | - |
395
+ | JAW | AU25-27 (open) | AU30 (left) / AU35 (right) | - |
396
+ | TONGUE | AU37 (up) / AU38 (down) | AU39 (left) / AU40 (right) | AU41 / AU42 (tilt) |
397
+
398
+ ### Example: Moving the head
399
+
400
+ ```typescript
401
+ // Turn head left 50%
402
+ loom.setAU(51, 0.5);
403
+
404
+ // Turn head right 50%
405
+ loom.setAU(52, 0.5);
406
+
407
+ // Tilt head up 30%
408
+ loom.setAU(53, 0.3);
409
+
410
+ // Combine: turn left AND tilt up
411
+ loom.setAU(51, 0.5);
412
+ loom.setAU(53, 0.3);
413
+ // Both are applied together in a single composite rotation
398
414
  ```
399
415
 
400
- ### Debugging Eye/Head Tracking
416
+ ### Example: Eye gaze
401
417
 
402
- The service includes comprehensive diagnostic logging:
418
+ ```typescript
419
+ // Look left
420
+ loom.setAU(61, 0.7);
403
421
 
404
- ```javascript
405
- // Check current mode
406
- window.eyeHeadTrackingService?.getMode(); // 'manual' | 'mouse' | 'webcam'
422
+ // Look right
423
+ loom.setAU(62, 0.7);
407
424
 
408
- // Set gaze manually
409
- window.eyeHeadTrackingService?.setGazeTarget({ x: 0.5, y: -0.3, z: 0 });
425
+ // Look up
426
+ loom.setAU(63, 0.5);
410
427
 
411
- // Update configuration
412
- window.eyeHeadTrackingService?.updateConfig({
413
- eyeIntensity: 1.0,
414
- headIntensity: 0.7,
415
- headFollowEyes: true
416
- });
428
+ // Look down-right (combined)
429
+ loom.setAU(62, 0.6);
430
+ loom.setAU(64, 0.4);
417
431
  ```
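+
+ If you track a 2D gaze target, a small helper (not part of the library) can map a normalized position onto the eye AUs from the table above:
+
+ ```typescript
+ // Hypothetical helper: x, y in [-1, 1]; positive x looks right, positive y looks up.
+ function lookAt(x: number, y: number) {
+   loom.setAU(61, Math.max(0, -x)); // eyes left
+   loom.setAU(62, Math.max(0, x));  // eyes right
+   loom.setAU(63, Math.max(0, y));  // eyes up
+   loom.setAU(64, Math.max(0, -y)); // eyes down
+ }
+
+ lookAt(0.6, -0.4); // look down-right, the same pose as the combined example above
+ ```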
418
432
 
419
- ### TypeScript Type Checking
433
+ ---
420
434
 
421
- ```bash
422
- yarn typecheck
435
+ ## 7. Continuum Pairs
436
+
437
+ Continuum pairs are bidirectional AU pairs that represent opposite directions on the same axis. Because the two directions are mutually exclusive, only one side of a pair should be active at a time: when you raise one, set its partner to 0.
438
+
439
+ ### Pair mappings
440
+
441
+ | Pair | Description |
442
+ |------|-------------|
443
+ | AU51 ↔ AU52 | Head turn left / right |
444
+ | AU53 ↔ AU54 | Head up / down |
445
+ | AU55 ↔ AU56 | Head tilt left / right |
446
+ | AU61 ↔ AU62 | Eyes look left / right |
447
+ | AU63 ↔ AU64 | Eyes look up / down |
448
+ | AU30 ↔ AU35 | Jaw shift left / right |
449
+ | AU37 ↔ AU38 | Tongue up / down |
450
+ | AU39 ↔ AU40 | Tongue left / right |
451
+ | AU73 ↔ AU74 | Tongue narrow / wide |
452
+ | AU76 ↔ AU77 | Tongue tip up / down |
453
+
454
+ ### Working with pairs
455
+
456
+ When using continuum pairs, set one AU from the pair and leave the other at 0:
457
+
458
+ ```typescript
459
+ // Head looking left at 50%
460
+ loom.setAU(51, 0.5);
461
+ loom.setAU(52, 0); // Right should be 0
462
+
463
+ // Head looking right at 70%
464
+ loom.setAU(51, 0); // Left should be 0
465
+ loom.setAU(52, 0.7);
423
466
  ```
424
467
 
425
- Runs `tsc --noEmit` to validate types without building.
468
+ ### The CONTINUUM_PAIRS_MAP
469
+
470
+ You can access pair information programmatically:
471
+
472
+ ```typescript
473
+ import { CONTINUUM_PAIRS_MAP } from 'loomlarge';
474
+
475
+ const pair = CONTINUUM_PAIRS_MAP[51];
476
+ // { pairId: 52, isNegative: true, axis: 'yaw', node: 'HEAD' }
477
+ ```
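+
+ Using the map, you can write a small helper (not part of the library) that drives a whole pair from a single signed value, so the "leave the other side at 0" rule is handled automatically:
+
+ ```typescript
+ import { CONTINUUM_PAIRS_MAP } from 'loomlarge';
+
+ // Hypothetical helper: positive values drive auId, negative values drive its pair.
+ function setContinuum(auId: number, value: number) {
+   const pair = CONTINUUM_PAIRS_MAP[auId];
+   if (!pair) {
+     loom.setAU(auId, Math.abs(value));
+     return;
+   }
+   loom.setAU(auId, Math.max(0, value));
+   loom.setAU(pair.pairId, Math.max(0, -value));
+ }
+
+ setContinuum(51, 0.5);  // head turn left 50%
+ setContinuum(51, -0.7); // head turn right 70% (drives the paired AU52)
+ ```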
426
478
 
427
479
  ---
428
480
 
429
- ## Deployment
481
+ ## 8. Direct Morph Control
430
482
 
431
- ### GitHub Pages Deployment
483
+ Sometimes you need to control morph targets directly by name, bypassing the AU system.
432
484
 
433
- The project is configured for automatic GitHub Pages deployment:
485
+ ### Setting a morph immediately
434
486
 
435
- ```bash
436
- yarn deploy
487
+ ```typescript
488
+ // Set a specific morph to 50%
489
+ loom.setMorph('Mouth_Smile_L', 0.5);
490
+
491
+ // Set on specific meshes only
492
+ loom.setMorph('Mouth_Smile_L', 0.5, ['CC_Base_Body']);
437
493
  ```
438
494
 
439
- This script:
440
- 1. Builds production bundle (`yarn build`)
441
- 2. Deploys to `gh-pages` branch
442
- 3. Publishes to `https://meekmachine.github.io/LoomLarge`
495
+ ### Transitioning a morph
443
496
 
444
- **Configuration** (`vite.config.ts`):
445
497
  ```typescript
446
- export default defineConfig({
447
- base: '/LoomLarge/', // GitHub repo name
448
- build: {
449
- outDir: 'dist',
450
- assetsDir: 'assets'
451
- }
452
- });
453
- ```
498
+ // Animate morph over 200ms
499
+ const handle = loom.transitionMorph('Mouth_Smile_L', 0.8, 200);
454
500
 
455
- ### Custom Domain
501
+ // With mesh targeting
502
+ loom.transitionMorph('Eye_Blink_L', 1.0, 100, ['CC_Base_Body']);
456
503
 
457
- To use a custom domain:
504
+ // Wait for completion
505
+ await handle.promise;
506
+ ```
458
507
 
459
- 1. Add `CNAME` file to `public/`:
460
- ```
461
- your-domain.com
462
- ```
508
+ ### Reading current morph value
509
+
510
+ ```typescript
511
+ const value = loom.getMorphValue('Mouth_Smile_L');
512
+ ```
463
513
 
464
- 2. Configure DNS:
465
- ```
466
- A 185.199.108.153
467
- A 185.199.109.153
468
- A 185.199.110.153
469
- A 185.199.111.153
470
- CNAME www your-username.github.io
471
- ```
514
+ ### Morph caching
472
515
 
473
- 3. Deploy:
474
- ```bash
475
- yarn deploy
476
- ```
516
+ LoomLarge caches morph target lookups for performance. The first time you access a morph, it searches all meshes and caches the index. Subsequent accesses are O(1).
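+
+ Conceptually, what gets cached is the usual Three.js name-to-index indirection; the equivalent manual lookup, using the `meshes` array from the setup example, looks like this (the library's internal cache layout may differ):
+
+ ```typescript
+ // Manual Three.js morph lookup: name → index → influence.
+ const mesh = meshes.find((m) => m.morphTargetDictionary?.['Mouth_Smile_L'] !== undefined);
+ if (mesh) {
+   const index = mesh.morphTargetDictionary!['Mouth_Smile_L'];
+   mesh.morphTargetInfluences![index] = 0.5; // roughly what loom.setMorph('Mouth_Smile_L', 0.5) does
+ }
+ ```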
477
517
 
478
518
  ---
479
519
 
480
- ## API Reference
520
+ ## 9. Viseme System
521
+
522
+ Visemes are mouth shapes used for lip-sync. LoomLarge includes 15 visemes with automatic jaw coupling.
523
+
524
+ ### The 15 visemes
481
525
 
482
- ### Animation Service API
526
+ | Index | Key | Phoneme Example |
527
+ |-------|-----|-----------------|
528
+ | 0 | EE | "b**ee**" |
529
+ | 1 | Er | "h**er**" |
530
+ | 2 | IH | "s**i**t" |
531
+ | 3 | Ah | "f**a**ther" |
532
+ | 4 | Oh | "g**o**" |
533
+ | 5 | W_OO | "t**oo**" |
534
+ | 6 | S_Z | "**s**un, **z**oo" |
535
+ | 7 | Ch_J | "**ch**ip, **j**ump" |
536
+ | 8 | F_V | "**f**un, **v**an" |
537
+ | 9 | TH | "**th**ink" |
538
+ | 10 | T_L_D_N | "**t**op, **l**ip, **d**og, **n**o" |
539
+ | 11 | B_M_P | "**b**at, **m**an, **p**op" |
540
+ | 12 | K_G_H_NG | "**k**ite, **g**o, **h**at, si**ng**" |
541
+ | 13 | AE | "c**a**t" |
542
+ | 14 | R | "**r**ed" |
543
+
544
+ ### Setting a viseme
483
545
 
484
546
  ```typescript
485
- interface AnimationService {
486
- loadSnippet(name: string, snippet: AnimationSnippet): void;
487
- schedule(name: string, options?: ScheduleOptions): void;
488
- play(): void;
489
- pause(): void;
490
- stop(): void;
491
- setLoop(loop: boolean): void;
492
- scrub(time: number): void;
493
- step(deltaSeconds: number): void;
494
- dispose(): void;
495
- }
547
+ // Set viseme 3 (Ah) to full intensity
548
+ loom.setViseme(3, 1.0);
549
+
550
+ // With jaw scale (0-1, default 1)
551
+ loom.setViseme(3, 1.0, 0.5); // Half jaw opening
496
552
  ```
497
553
 
498
- ### Eye/Head Tracking Service API
554
+ ### Transitioning visemes
499
555
 
500
556
  ```typescript
501
- interface EyeHeadTrackingService {
502
- start(): void;
503
- stop(): void;
504
- setGazeTarget(target: GazeTarget): void;
505
- setMode(mode: 'manual' | 'mouse' | 'webcam'): void;
506
- getMode(): 'manual' | 'mouse' | 'webcam';
507
- updateConfig(config: Partial<EyeHeadTrackingConfig>): void;
508
- setSpeaking(isSpeaking: boolean): void;
509
- setListening(isListening: boolean): void;
510
- blink(): void;
511
- dispose(): void;
512
- }
557
+ // Animate to viseme over 80ms (typical for speech)
558
+ const handle = loom.transitionViseme(3, 1.0, 80);
559
+
560
+ // Disable jaw coupling
561
+ loom.transitionViseme(3, 1.0, 80, 0);
513
562
  ```
514
563
 
515
- ### EngineThree Composite Methods
564
+ ### Automatic jaw coupling
516
565
 
517
- ```typescript
518
- class EngineThree {
519
- // Eye composite rotation (yaw/pitch)
520
- applyEyeComposite(yaw: number, pitch: number): void;
566
+ Each viseme has a predefined jaw opening amount. When you set a viseme, the jaw automatically opens proportionally:
521
567
 
522
- // Head composite rotation (yaw/pitch/roll)
523
- applyHeadComposite(yaw: number, pitch: number, roll?: number): void;
568
+ | Viseme | Jaw Amount |
569
+ |--------|------------|
570
+ | EE | 0.15 |
571
+ | Ah | 0.70 |
572
+ | Oh | 0.50 |
573
+ | B_M_P | 0.20 |
524
574
 
525
- // Get/Set AU mix weight (morph ↔ bone blend)
526
- getAUMixWeight(auId: number): number | undefined;
527
- setAUMixWeight(auId: number, mix: number): void;
575
+ The `jawScale` parameter multiplies this amount:
576
+ - `jawScale = 1.0`: Normal jaw opening
577
+ - `jawScale = 0.5`: Half jaw opening
578
+ - `jawScale = 0`: No jaw movement (viseme only)
528
579
 
529
- // Direct AU control
530
- setAU(id: number | string, value: number): void;
531
- setMorph(key: string, value: number): void;
580
+ ### Lip-sync example
581
+
582
+ ```typescript
583
+ async function speak(phonemes: number[]) {
584
+ for (const viseme of phonemes) {
585
+ // Clear previous viseme
586
+ for (let i = 0; i < 15; i++) loom.setViseme(i, 0);
532
587
 
533
- // Smooth transitions
534
- transitionAU(id: number | string, target: number, duration?: number): void;
535
- transitionMorph(key: string, target: number, duration?: number): void;
588
+ // Transition to new viseme
589
+ await loom.transitionViseme(viseme, 1.0, 80).promise;
590
+
591
+ // Hold briefly
592
+ await new Promise(r => setTimeout(r, 100));
593
+ }
594
+
595
+ // Return to neutral
596
+ for (let i = 0; i < 15; i++) loom.setViseme(i, 0);
536
597
  }
598
+
599
+ // "Hello" approximation
600
+ speak([5, 0, 10, 4]);
537
601
  ```
538
602
 
539
603
  ---
540
604
 
541
- ## Troubleshooting
605
+ ## 10. Transition System
542
606
 
543
- ### Character Not Moving
607
+ All animated changes in LoomLarge go through the transition system, which provides smooth interpolation with easing.
544
608
 
545
- 1. **Check engine initialization**:
546
- ```javascript
547
- console.log(window.engine); // Should show EngineThree instance
548
- ```
609
+ ### TransitionHandle
549
610
 
550
- 2. **Verify service startup**:
551
- ```javascript
552
- console.log(window.anim?.getSnapshot?.().value); // Should show 'playing' or 'idle'
553
- ```
611
+ Every transition method returns a `TransitionHandle`:
554
612
 
555
- 3. **Check eye/head tracking**:
556
- ```javascript
557
- window.eyeHeadTrackingService?.getState(); // Should show current gaze/status
558
- ```
613
+ ```typescript
614
+ interface TransitionHandle {
615
+ promise: Promise<void>; // Resolves when transition completes
616
+ pause(): void; // Pause this transition
617
+ resume(): void; // Resume this transition
618
+ cancel(): void; // Cancel immediately
619
+ }
620
+ ```
559
621
 
560
- ### Eye Tracking Direction Issues
622
+ ### Using handles
561
623
 
562
- - **Eyes looking wrong way**: Check coordinate negation in `eyeHeadTrackingService.ts:283`
563
- - **Head not following**: Verify `headFollowEyes` is enabled in config
564
- - **Intensity too low**: Increase `eyeIntensity` and `headIntensity` sliders
624
+ ```typescript
625
+ // Start a transition
626
+ const handle = loom.transitionAU(12, 1.0, 500);
565
627
 
566
- ### Animation Not Loading
628
+ // Pause it
629
+ handle.pause();
567
630
 
568
- 1. **Check JSON format**:
569
- - Must have `duration` and `keyframes` fields
570
- - AU keys must match ARKit spec (e.g., 'AU_12', not '12')
631
+ // Resume later
632
+ handle.resume();
571
633
 
572
- 2. **Verify snippet name**:
573
- ```javascript
574
- anim.listSnippets(); // Show all loaded snippets
575
- ```
634
+ // Or cancel entirely
635
+ handle.cancel();
576
636
 
577
- 3. **Check console for errors**:
578
- - Look for `[Animation]` prefixed logs
579
- - Validate keyframe curve format: `[[time, value], ...]`
637
+ // Wait for completion
638
+ await handle.promise;
639
+ ```
580
640
 
581
- ### TensorFlow.js Bundling Errors
641
+ ### Combining multiple transitions
582
642
 
583
- If you encounter errors related to TensorFlow.js modules (e.g., `@tensorflow/tfjs-core/dist/ops/ops_for_converter`), this is a **known issue** with TensorFlow.js 4.x and Vite.
643
+ When you call `transitionAU`, it may create multiple internal transitions (one per morph target). The returned handle controls all of them:
584
644
 
585
- **The Problem:**
586
- - TensorFlow.js references internal module paths that don't actually exist in the npm package
587
- - Vite's esbuild optimizer cannot resolve these phantom paths
588
- - Results in 100+ bundling errors about missing exports and modules
645
+ ```typescript
646
+ // AU12 might affect Mouth_Smile_L and Mouth_Smile_R
647
+ const handle = loom.transitionAU(12, 1.0, 200);
589
648
 
590
- **Solution (Already Implemented):**
591
- LoomLarge loads TensorFlow.js and BlazeFace from **CDN** instead of npm packages:
649
+ // Pausing the handle pauses both morph transitions
650
+ handle.pause();
651
+ ```
652
+
653
+ ### Easing
654
+
655
+ The default easing is `easeInOutQuad`. Custom easing can be provided when using the Animation system directly:
592
656
 
593
- ```html
594
- <!-- index.html -->
595
- <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@4.22.0/dist/tf.min.js"></script>
596
- <script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/blazeface@0.0.7/dist/blazeface.js"></script>
657
+ ```typescript
658
+ // The AnimationThree class supports custom easing
659
+ animation.addTransition(
660
+ 'custom',
661
+ 0,
662
+ 1,
663
+ 200,
664
+ (v) => console.log(v),
665
+ (t) => t * t // Custom ease-in quadratic
666
+ );
597
667
  ```
598
668
 
599
- The code uses global `blazeface` object with TypeScript declarations:
669
+ ### Active transition count
670
+
600
671
  ```typescript
601
- // useWebcamEyeTracking.ts
602
- declare const blazeface: any;
603
- const model = await blazeface.load();
672
+ const count = loom.getActiveTransitionCount();
673
+ console.log(`${count} transitions in progress`);
604
674
  ```
605
675
 
606
- Vite config excludes TensorFlow packages from optimization:
676
+ ### Clearing all transitions
677
+
607
678
  ```typescript
608
- // vite.config.ts
609
- optimizeDeps: {
610
- exclude: [
611
- '@tensorflow/tfjs',
612
- '@tensorflow/tfjs-core',
613
- '@tensorflow/tfjs-converter',
614
- '@tensorflow/tfjs-backend-cpu',
615
- '@tensorflow/tfjs-backend-webgl',
616
- '@tensorflow-models/blazeface',
617
- ],
618
- }
679
+ // Cancel everything immediately
680
+ loom.clearTransitions();
619
681
  ```
620
682
 
621
- **If you still see errors:**
622
- 1. Ensure TensorFlow packages are NOT in `package.json` dependencies
623
- 2. Clear Vite cache: `rm -rf node_modules/.vite`
624
- 3. Restart dev server: `yarn dev`
683
+ ---
625
684
 
626
- See [src/hooks/README_useWebcamEyeTracking.md](src/hooks/README_useWebcamEyeTracking.md) for full documentation.
685
+ ## 11. Playback & State Control
627
686
 
628
- ### Build Errors
687
+ ### Pausing and resuming
629
688
 
630
- ```bash
631
- # Clear cache and rebuild
632
- rm -rf node_modules dist .vite
633
- yarn install
634
- yarn build
689
+ ```typescript
690
+ // Pause all animation updates
691
+ loom.pause();
692
+
693
+ // Check pause state
694
+ if (loom.getPaused()) {
695
+ console.log('Animation is paused');
696
+ }
697
+
698
+ // Resume
699
+ loom.resume();
635
700
  ```
636
701
 
637
- ---
702
+ When paused, `loom.update()` stops processing transitions, but you can still call `setAU()` for immediate changes.
638
703
 
639
- ## Performance Optimization
704
+ ### Resetting to neutral
640
705
 
641
- ### Animation Loop
706
+ ```typescript
707
+ // Reset everything to rest state
708
+ loom.resetToNeutral();
709
+ ```
642
710
 
643
- - Central frame loop runs at 60 FPS (`threeContext.tsx`)
644
- - Animation scheduler ticks every frame via `step(deltaTime)`
645
- - Morph/bone updates batched per frame
711
+ This:
712
+ - Clears all AU values to 0
713
+ - Cancels all active transitions
714
+ - Resets all morph targets to 0
715
+ - Returns all bones to their original position/rotation
646
716
 
647
- ### Composite Rotation Caching
717
+ ### Mesh visibility
648
718
 
649
- - Rotation state cached in `compositeRotationState` map
650
- - Only recalculates when values change
651
- - Avoids redundant Three.js object updates
719
+ ```typescript
720
+ // Get list of all meshes
721
+ const meshes = loom.getMeshList();
722
+ // Returns: [{ name: 'CC_Base_Body', visible: true, morphCount: 80 }, ...]
652
723
 
653
- ### Snippet Scheduling
724
+ // Hide a mesh
725
+ loom.setMeshVisible('CC_Base_Hair', false);
654
726
 
655
- - Priority-based scheduler prevents conflicts
656
- - Lower priority snippets paused when higher priority plays
657
- - Automatic cleanup of completed snippets
727
+ // Show it again
728
+ loom.setMeshVisible('CC_Base_Hair', true);
729
+ ```
658
730
 
659
- ---
731
+ ### Cleanup
660
732
 
661
- ## Contributing
662
-
663
- We welcome contributions! Please follow these guidelines:
664
-
665
- 1. **Fork and clone** the repository
666
- 2. **Create a feature branch**: `git checkout -b feature/my-feature`
667
- 3. **Follow code style**:
668
- - TypeScript strict mode
669
- - ESLint/Prettier for formatting
670
- - Descriptive variable names
671
- 4. **Test thoroughly**:
672
- - Manual testing in dev mode
673
- - TypeScript type checking (`yarn typecheck`)
674
- 5. **Commit with descriptive messages**:
675
- ```
676
- feat: Add webcam eye tracking support
677
- fix: Correct head yaw direction in mouse mode
678
- docs: Update README with API reference
679
- ```
680
- 6. **Push and create PR** to `main` branch
733
+ ```typescript
734
+ // When done, dispose of resources
735
+ loom.dispose();
736
+ ```
681
737
 
682
738
  ---
683
739
 
684
- ## License & Acknowledgments
740
+ ## 12. Hair Physics
685
741
 
686
- **© 2025 Jonathan Sutton Fields, Lovelace LOL**
687
- Licensed under the **Loom Large, Latticework copyleft license**
742
+ LoomLarge includes an experimental hair physics system that simulates hair movement based on head motion.
688
743
 
689
- ### Acknowledgments
744
+ ### Basic setup
690
745
 
691
- - **Three.js** – 3D rendering engine
692
- - **XState** – State machine library
693
- - **React** – UI framework
694
- - **Vite** – Lightning-fast bundler
695
- - **ARKit** – Facial Action Coding System specification
696
- - **BlazeFace** – Webcam face detection model
746
+ ```typescript
747
+ import { HairPhysics } from 'loomlarge';
697
748
 
698
- ### Related Projects
749
+ const hair = new HairPhysics();
750
+ ```
699
751
 
700
- - **VISOS** – Predecessor architecture (object-oriented)
701
- - **eEVA Workbench** – Original survey/conversation platform
702
- - **Latticework** – Core agency framework
752
+ ### Updating in animation loop
703
753
 
704
- ---
754
+ ```typescript
755
+ function animate(deltaTime: number) { // deltaTime: seconds since last frame
756
+ // Get current head state (from your tracking system or AU values)
757
+ const headState = {
758
+ yaw: 0, // Head rotation in radians
759
+ pitch: 0,
760
+ roll: 0,
761
+ yawVelocity: 0.5, // Angular velocity
762
+ pitchVelocity: 0,
763
+ };
764
+
765
+ // Update hair physics
766
+ const hairMorphs = hair.update(deltaTime, headState);
767
+
768
+ // Apply hair morphs
769
+ for (const [morphName, value] of Object.entries(hairMorphs)) {
770
+ loom.setMorph(morphName, value);
771
+ }
772
+ }
773
+ ```
774
+
775
+ ### Output morphs
776
+
777
+ The physics system outputs 6 morph values:
705
778
 
706
- ## Support
779
+ | Morph | Description |
780
+ |-------|-------------|
781
+ | L_Hair_Left | Left side, swing left |
782
+ | L_Hair_Right | Left side, swing right |
783
+ | L_Hair_Front | Left side, swing forward |
784
+ | R_Hair_Left | Right side, swing left |
785
+ | R_Hair_Right | Right side, swing right |
786
+ | R_Hair_Front | Right side, swing forward |
707
787
 
708
- - **Issues**: [GitHub Issues](https://github.com/meekmachine/LoomLarge/issues)
709
- - **Discussions**: [GitHub Discussions](https://github.com/meekmachine/LoomLarge/discussions)
710
- - **Email**: jonathan@lovelacelol.com
788
+ ### Physics forces
789
+
790
+ The simulation models 5 forces:
791
+
792
+ 1. **Spring restoration** - Pulls hair back to rest position
793
+ 2. **Damping** - Air resistance prevents infinite oscillation
794
+ 3. **Gravity** - Hair swings based on head tilt
795
+ 4. **Inertia** - Hair lags behind head movement
796
+ 5. **Wind** - Optional oscillating wind force
797
+
798
+ ### Configuration
799
+
800
+ ```typescript
801
+ const hair = new HairPhysics({
802
+ mass: 1.0,
803
+ stiffness: 50,
804
+ damping: 5,
805
+ gravity: 9.8,
806
+ headInfluence: 0.8, // How much head movement affects hair
807
+ wind: {
808
+ strength: 0,
809
+ direction: { x: 1, y: 0, z: 0 },
810
+ turbulence: 0.2,
811
+ frequency: 1.0,
812
+ },
813
+ });
814
+ ```
711
815
 
712
816
  ---
713
817
 
714
- **Built with ❤️ by the Latticework team**
818
+ ## Resources
819
+
820
+ - [FACS on Wikipedia](https://en.wikipedia.org/wiki/Facial_Action_Coding_System)
821
+ - [Paul Ekman Group - FACS](https://www.paulekman.com/facial-action-coding-system/)
822
+ - [Character Creator 4](https://www.reallusion.com/character-creator/)
823
+ - [Three.js Documentation](https://threejs.org/docs/)
824
+
825
+ ## License
826
+
827
+ MIT License - see LICENSE file for details.