@djangocfg/ui-nextjs 2.1.81 → 2.1.83

Files changed (36)
  1. package/package.json +4 -4
  2. package/src/tools/AudioPlayer/@refactoring3/00-IMPLEMENTATION-ROADMAP.md +1146 -0
  3. package/src/tools/AudioPlayer/@refactoring3/01-WAVESURFER-STREAMING-ANALYSIS.md +611 -0
  4. package/src/tools/AudioPlayer/@refactoring3/02-MEDIA-VIEWER-ANALYSIS.md +560 -0
  5. package/src/tools/AudioPlayer/@refactoring3/03-HYBRID-ARCHITECTURE-PROPOSAL.md +769 -0
  6. package/src/tools/AudioPlayer/@refactoring3/04-CRACKLING-ISSUE-DIAGNOSIS.md +373 -0
  7. package/src/tools/AudioPlayer/README.md +177 -205
  8. package/src/tools/AudioPlayer/components/AudioPlayer.tsx +9 -4
  9. package/src/tools/AudioPlayer/components/HybridAudioPlayer.tsx +251 -0
  10. package/src/tools/AudioPlayer/components/HybridSimplePlayer.tsx +291 -0
  11. package/src/tools/AudioPlayer/components/HybridWaveform.tsx +279 -0
  12. package/src/tools/AudioPlayer/components/SimpleAudioPlayer.tsx +16 -26
  13. package/src/tools/AudioPlayer/components/index.ts +6 -1
  14. package/src/tools/AudioPlayer/context/AudioProvider.tsx +16 -8
  15. package/src/tools/AudioPlayer/context/HybridAudioProvider.tsx +121 -0
  16. package/src/tools/AudioPlayer/context/index.ts +14 -2
  17. package/src/tools/AudioPlayer/hooks/index.ts +11 -0
  18. package/src/tools/AudioPlayer/hooks/useHybridAudio.ts +387 -0
  19. package/src/tools/AudioPlayer/hooks/useHybridAudioAnalysis.ts +95 -0
  20. package/src/tools/AudioPlayer/hooks/useSharedWebAudio.ts +6 -3
  21. package/src/tools/AudioPlayer/index.ts +31 -0
  22. package/src/tools/AudioPlayer/progressive/ProgressiveAudioPlayer.tsx +8 -0
  23. package/src/tools/ImageViewer/hooks/useImageLoading.ts +33 -9
  24. package/src/tools/VideoPlayer/hooks/useVideoPositionCache.ts +13 -6
  25. package/src/tools/VideoPlayer/providers/StreamProvider.tsx +38 -22
  26. package/src/tools/index.ts +22 -0
  27. package/src/tools/AudioPlayer/@refactoring/00-PLAN.md +0 -148
  28. package/src/tools/AudioPlayer/@refactoring/01-TYPES.md +0 -301
  29. package/src/tools/AudioPlayer/@refactoring/02-HOOKS.md +0 -281
  30. package/src/tools/AudioPlayer/@refactoring/03-CONTEXT.md +0 -328
  31. package/src/tools/AudioPlayer/@refactoring/04-COMPONENTS.md +0 -251
  32. package/src/tools/AudioPlayer/@refactoring/05-EFFECTS.md +0 -427
  33. package/src/tools/AudioPlayer/@refactoring/06-UTILS-AND-INDEX.md +0 -193
  34. package/src/tools/AudioPlayer/@refactoring/07-EXECUTION-CHECKLIST.md +0 -146
  35. package/src/tools/AudioPlayer/@refactoring2/ISSUE_ANALYSIS.md +0 -187
  36. package/src/tools/AudioPlayer/@refactoring2/PLAN.md +0 -372
@@ -0,0 +1,1146 @@
# Audio Player Implementation Roadmap

**Date:** 2025-12-30
**Status:** In Progress
**Goal:** Create a robust audio player with streaming support and crackle-free reactive visualizations

---

## Progress Summary

| Phase | Status | Description |
|-------|--------|-------------|
| Phase 0 | ✅ **DONE** | Quick Fix for Crackling |
| Phase 1 | ✅ **DONE** | Core Infrastructure |
| Phase 2 | ✅ **DONE** | Visualization Integration |
| Phase 3 | ✅ **DONE** | Player Components |
| Phase 4 | ⏳ Pending | Integration & Testing |
| Phase 5 | ✅ **DONE** | Cleanup |

### Completed Files (2025-12-30)

**Phase 0:**
- `hooks/useSharedWebAudio.ts` - Fixed double audio routing

**Phase 1:**
- `hooks/useHybridAudio.ts` - Core hybrid audio hook
- `hooks/useHybridAudioAnalysis.ts` - Audio frequency analysis for the hybrid player
- `context/HybridAudioProvider.tsx` - Context provider with state and controls

**Phase 2:**
- `components/HybridWaveform.tsx` - Real-time frequency visualization

**Phase 3:**
- `components/HybridAudioPlayer.tsx` - Main player with controls
- `components/HybridSimplePlayer.tsx` - All-in-one wrapper with reactive cover

**Phase 5 (Cleanup):**
- Added `@deprecated` notices to legacy components (SimpleAudioPlayer, AudioPlayer, AudioProvider)
- Added documentation notes to ProgressiveAudioPlayer
- Removed old `@refactoring/` and `@refactoring2/` folders
- Updated README.md with player comparison and migration guide

---

## Table of Contents

1. [Executive Summary](#executive-summary)
2. [Key Findings from Analysis](#key-findings-from-analysis)
3. [Proposed Solution: Hybrid Audio Player](#proposed-solution-hybrid-audio-player)
4. [Implementation Phases](#implementation-phases)
5. [Files to Modify/Create](#files-to-modifycreate)
6. [Code Examples](#code-examples)
7. [Migration Guide](#migration-guide)
8. [Testing Strategy](#testing-strategy)
9. [Risk Assessment](#risk-assessment)
10. [Timeline Estimate](#timeline-estimate)

---

## Executive Summary

### Current Problems

1. **Seek Limitation (>2 minutes):** The backend enforces a 2 MB chunk limit, preventing seeks beyond the buffered content
2. **Audio Crackling:** Web Audio API double-routing causes audio artifacts when reactive effects are enabled
3. **WaveSurfer Streaming:** WaveSurfer.js requires a full file download for the waveform - no true streaming support
4. **Dual Player Complexity:** Maintaining both the WaveSurfer and Progressive players creates duplication

### Proposed Solution

Implement a **Hybrid Audio Player** that:
- Uses native HTML5 `<audio>` for playback (no crackling, native streaming)
- Connects the Web Audio API only for visualization (AnalyserNode)
- Preserves all reactive cover effects (glow, orbs, spotlight, mesh)
- Supports chunked/streaming audio with proper Range request handling

---

## Key Findings from Analysis

### From WaveSurfer Streaming Analysis (01)

| Feature | Support | Notes |
|---------|---------|-------|
| True streaming (MediaSource API) | **NO** | Not implemented |
| Chunked fetch with progressive rendering | **NO** | Full download required |
| Partial audio playback | **LIMITED** | Only with pre-decoded peaks |
| Pre-computed peaks + streaming URL | **YES** | Recommended approach |

**Key Insight:** WaveSurfer requires the **entire audio file** to be downloaded and decoded before rendering the waveform. For streaming, pre-computed peaks are required.

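Because WaveSurfer can render from pre-computed peaks without decoding the full file, those peaks have to be produced ahead of time (typically server-side). A minimal sketch of the downsampling step - the `computePeaks` helper and its max-amplitude bucketing are illustrative assumptions, not code from this package:

```typescript
// Downsample raw PCM samples into N peak values that a waveform
// renderer can consume alongside a streaming URL.
// ILLUSTRATIVE: `computePeaks` is not part of this package.
export function computePeaks(samples: Float32Array, peakCount: number): number[] {
  const peaks: number[] = [];
  const samplesPerPeak = Math.ceil(samples.length / peakCount);

  for (let i = 0; i < peakCount; i++) {
    const start = i * samplesPerPeak;
    const end = Math.min(start + samplesPerPeak, samples.length);
    // Take the maximum absolute amplitude inside this bucket
    let max = 0;
    for (let j = start; j < end; j++) {
      const abs = Math.abs(samples[j]);
      if (abs > max) max = abs;
    }
    peaks.push(max);
  }
  return peaks;
}
```

The resulting array can then be handed to the waveform renderer together with a streaming URL, so the client never has to decode the full file.
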
### From Media Viewer Analysis (02)

| Feature | SimpleAudioPlayer (WaveSurfer) | ProgressiveAudioPlayer |
|---------|------------------------------|----------------------|
| Audio Routing | Web Audio API | Native Browser |
| Waveform | Pre-computed (needs full file) | Progressive (chunks) |
| Seek | Limited by buffered data | Any position |
| Reactive Effects | Yes | No |
| Crackling Risk | High (Web Audio routing) | Low (direct playback) |
| Streaming | Requires prefetch | Native support |

### From Hybrid Architecture Proposal (03)

The hybrid approach combines the best of both:

```
HTML5 <audio> --> MediaElementSource --> AnalyserNode --> Destination
(native playback)      (bridge)        (visualization)    (speakers)
```

- Native audio handles all playback concerns (buffering, seeking, streaming)
- Web Audio is used only for frequency analysis
- Reactive effects continue to work unchanged

### From Crackling Issue Diagnosis (04)

**Root Cause:** Double audio routing in `useSharedWebAudio.ts`:

```typescript
// PROBLEM: Audio routed TWICE to destination
sourceRef.current.connect(audioContext.destination); // First connection
// ...
sourceRef.current.connect(analyser);
analyser.connect(audioContext.destination); // DUPLICATE path!
```

**Fix:** The analyser only needs a connection from the source for `getByteFrequencyData()`; it does NOT need to connect to the destination:

```typescript
// CORRECT: Analyser as passive listener
sourceRef.current.connect(analyser);
// analyser.connect(destination) - NOT NEEDED!
sourceRef.current.connect(audioContext.destination); // Single path
```

### From Problem Analysis (01-cmdop)

The backend chunk size limitation causes seek issues:

| Bitrate | 2 MB Duration | Seek Limit |
|---------|---------------|------------|
| 128 kbps | 125 sec | ~2 min |
| 192 kbps | 87 sec | ~1.5 min |
| 256 kbps | 65 sec | ~1 min |

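The table is just chunk size divided by bitrate. A quick sketch of the arithmetic, assuming a 2 MiB chunk and decimal kilobits per second (the table's 128 kbps row appears to assume a 2,000,000-byte chunk instead, so it rounds slightly differently):

```typescript
// Approximate seconds of audio that fit into one backend chunk.
// ASSUMPTION: 2 MiB chunk, bitrate given in decimal kilobits/second.
const CHUNK_BYTES = 2 * 1024 * 1024;

export function seekLimitSeconds(bitrateKbps: number): number {
  const chunkBits = CHUNK_BYTES * 8;
  return Math.floor(chunkBits / (bitrateKbps * 1000));
}
```

This relationship is why raising the backend chunk limit (Option A) directly extends the seekable window at every bitrate.
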
### From Solution Options (02-cmdop)

Recommended priority:
1. **Option A** - Backend limit increase (Low effort, High impact)
2. **Option B** - Progressive player (Done)
3. **Option E** - Blob for small files (Low effort, Medium impact)
4. **Option C** - MSE (High effort, High impact)
5. **Option D** - HLS/DASH (High effort, High impact)

---

## Proposed Solution: Hybrid Audio Player

### Architecture Overview

```
                       HYBRID AUDIO PLAYER ARCHITECTURE
+=============================================================================+
|                                                                             |
| +-------------------------------------------------------------------------+|
| |                          HybridAudioProvider                            ||
| |     (React Context - manages all state and coordinates components)      ||
| +-------------------------------------------------------------------------+|
|                                     |                                       |
|         +---------------------------+---------------------------+           |
|         |                           |                           |           |
|         v                           v                           v           |
| +---------------+         +----------------+         +------------------+   |
| |   PLAYBACK    |         | VISUALIZATION  |         |   REACTIVE FX    |   |
| |     LAYER     |         |     LAYER      |         |      LAYER       |   |
| +---------------+         +----------------+         +------------------+   |
| | HTML5 <audio> |  --->   | MediaElement   |  --->   | useAudioAnalysis |   |
| |   (native)    |         | SourceNode     |         | (bass/mid/high)  |   |
| |               |         |       |        |         |        |         |   |
| | - Range req   |         |       v        |         |        v         |   |
| | - Buffering   |         | AnalyserNode   |         | AudioReactive    |   |
| | - Seek        |         |       |        |         | Cover            |   |
| | - No crackling|         |       v        |         | - GlowEffect     |   |
| +-------+-------+         | Destination    |         | - OrbsEffect     |   |
|         |                 +-------+--------+         | - SpotlightEffect|   |
|         |                         |                  | - MeshEffect     |   |
|         v                         v                  +------------------+   |
| +-------+-------+         +-------+--------+                                |
| | useAudioState |         | useSharedWeb   |                                |
| | - currentTime |         | Audio (fixed)  |                                |
| | - duration    |         +----------------+                                |
| | - buffered    |                                                           |
| | - isPlaying   |                                                           |
| +---------------+                                                           |
|                                                                             |
+=============================================================================+
```

### Key Benefits

1. **No Crackling:** HTML5 audio handles playback natively
2. **Streaming Support:** The browser handles Range requests automatically
3. **Reactive Effects Preserved:** AnalyserNode provides frequency data
4. **Simpler Codebase:** A single player architecture instead of two
5. **Better Memory Usage:** No full-file blob needed for playback

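Benefit 2 also means buffered regions come for free from the element's `buffered` TimeRanges. A small sketch of summarizing them into a single progress fraction - the `BufferedLike` interface just mirrors the DOM `TimeRanges` shape so the helper stays testable outside a browser:

```typescript
// Mirrors the DOM TimeRanges shape (illustrative, for testability).
interface BufferedLike {
  length: number;
  start(i: number): number;
  end(i: number): number;
}

// Fraction of the track (0..1) the browser has already buffered,
// summed across all ranges reported by the <audio> element.
export function bufferedFraction(buffered: BufferedLike, duration: number): number {
  if (!duration || duration <= 0) return 0;
  let seconds = 0;
  for (let i = 0; i < buffered.length; i++) {
    seconds += buffered.end(i) - buffered.start(i);
  }
  return Math.min(1, seconds / duration);
}
```

The hybrid waveform component consumes this same data when drawing buffered regions; the helper here is only an illustration of reading it.
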
---

## Implementation Phases

### Phase 0: Quick Fix for Crackling ✅ DONE

**Goal:** Fix the crackling issue in the existing WaveSurfer player without a full refactoring

**Changes:**
- ✅ Modified `useSharedWebAudio.ts` to fix double-routing
- ✅ Analyser connects only to the source, not to the destination

**Completed:** 2025-12-30
**File Changed:** `hooks/useSharedWebAudio.ts` (lines 80-84)
**Impact:** Eliminates crackling immediately

### Phase 1: Core Infrastructure ✅ DONE

**Goal:** Create the core hybrid audio hooks and context

**Deliverables:**
- ✅ `useHybridAudio.ts` - Combined HTML5 audio + Web Audio hook
- ✅ `useHybridAudioAnalysis.ts` - Frequency analysis for the hybrid context
- ✅ `HybridAudioProvider.tsx` - Context provider

**Completed:** 2025-12-30

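The full `useHybridAudioAnalysis.ts` is not reproduced in this roadmap, but the bass/mid/high split it performs over `getByteFrequencyData()` output can be sketched as a pure function. The 10% / 40% bin boundaries and the `splitBands` name are illustrative assumptions; the shipped hook may use different cutoffs:

```typescript
// Mirrors the shape of the package's AudioLevels (assumption).
export interface AudioLevels {
  bass: number;
  mid: number;
  high: number;
}

// Average a slice of frequency bins into a 0..1 level.
function bandLevel(data: Uint8Array, from: number, to: number): number {
  let sum = 0;
  for (let i = from; i < to; i++) sum += data[i];
  return to > from ? sum / ((to - from) * 255) : 0;
}

// Split an AnalyserNode's getByteFrequencyData() output into rough
// bass/mid/high levels. The 10% / 40% boundaries are illustrative.
export function splitBands(data: Uint8Array): AudioLevels {
  const bassEnd = Math.max(1, Math.floor(data.length * 0.1));
  const midEnd = Math.max(bassEnd + 1, Math.floor(data.length * 0.4));
  return {
    bass: bandLevel(data, 0, bassEnd),
    mid: bandLevel(data, bassEnd, midEnd),
    high: bandLevel(data, midEnd, data.length),
  };
}
```

Feeding these levels to the reactive cover effects is what keeps glow/orbs/spotlight/mesh working unchanged on top of native playback.
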
### Phase 2: Visualization Integration ✅ DONE

**Goal:** Connect visualization components to the hybrid system

**Deliverables:**
- ✅ `HybridWaveform.tsx` - Real-time frequency visualization with seek support
- ✅ Integrated with existing canvas rendering patterns

**Completed:** 2025-12-30

### Phase 3: Player Components ✅ DONE

**Goal:** Build the new player components

**Deliverables:**
- ✅ `HybridAudioPlayer.tsx` - Main player with controls (play, pause, seek, volume, loop)
- ✅ `HybridSimplePlayer.tsx` - All-in-one wrapper with reactive cover support

**Completed:** 2025-12-30

### Phase 4: Integration & Testing (Days 7-8)

**Goal:** Integrate with the existing application and test thoroughly

**Deliverables:**
- Update `AudioViewer.tsx` to use the hybrid player
- Migration guide for consumers
- Comprehensive testing

### Phase 5: Cleanup (Days 9-10)

**Goal:** Remove deprecated code and finalize

**Deliverables:**
- Deprecate/remove the WaveSurfer-based player
- Update documentation
- Performance optimization

---

## Files to Modify/Create

### Files to Create

| File | Purpose |
|------|---------|
| `hooks/useHybridAudio.ts` | Core hook combining HTML5 audio + Web Audio |
| `context/HybridAudioProvider.tsx` | Context provider for hybrid player |
| `components/HybridAudioPlayer.tsx` | Main hybrid player component |
| `components/HybridSimplePlayer.tsx` | Simplified wrapper |
| `components/HybridWaveform.tsx` | Real-time frequency visualization |

### Files to Modify

| File | Changes |
|------|---------|
| `hooks/useSharedWebAudio.ts` | **CRITICAL:** Fix double-routing issue |
| `hooks/useAudioAnalysis.ts` | Minor updates for hybrid context |
| `context/index.ts` | Export new HybridAudioProvider |
| `components/index.ts` | Export new components |
| `types/audio.ts` | Add hybrid types |

### Files in cmdop to Modify

| File | Changes |
|------|---------|
| `viewers/media/AudioViewer.tsx` | Use HybridSimplePlayer instead of mode toggle |

### Backend Files to Consider

| File | Changes |
|------|---------|
| `django/apps/terminal/views/api/media_stream/viewsets.py` | Increase audio chunk limit to 50MB |

---

## Code Examples

### Fix 1: useSharedWebAudio.ts (Critical - Immediate)

```typescript
// hooks/useSharedWebAudio.ts

// BEFORE (problematic):
const initAudio = () => {
  // ...
  sourceRef.current = audioContext.createMediaElementSource(audioElement);
  sourceRef.current.connect(audioContext.destination); // First connection
  // ...
};

const createAnalyser = useCallback(() => {
  // ...
  sourceRef.current.connect(analyser);
  analyser.connect(audioContextRef.current.destination); // DUPLICATE!
  // ...
});

// AFTER (fixed):
const initAudio = () => {
  // ...
  sourceRef.current = audioContext.createMediaElementSource(audioElement);
  sourceRef.current.connect(audioContext.destination); // Single path to destination
  // ...
};

const createAnalyser = useCallback(() => {
  // ...
  // Analyser only needs connection to source for getByteFrequencyData()
  // It does NOT need to connect to destination
  sourceRef.current.connect(analyser);
  // NO: analyser.connect(destination) - causes double routing!

  analyserNodesRef.current.add(analyser);
  return analyser;
});
```

### Fix 2: useHybridAudio.ts (New)

```typescript
// hooks/useHybridAudio.ts
import { useRef, useState, useCallback, useEffect } from 'react';

export interface UseHybridAudioOptions {
  src: string;
  autoPlay?: boolean;
  initialVolume?: number;
  loop?: boolean;
  crossOrigin?: 'anonymous' | 'use-credentials';
  onPlay?: () => void;
  onPause?: () => void;
  onEnded?: () => void;
  onTimeUpdate?: (time: number) => void;
  onError?: (error: Error) => void;
}

export interface AudioState {
  isReady: boolean;
  isPlaying: boolean;
  currentTime: number;
  duration: number;
  volume: number;
  isMuted: boolean;
  isLooping: boolean;
  buffered: TimeRanges | null;
  error: Error | null;
}

export interface AudioControls {
  play: () => Promise<void>;
  pause: () => void;
  togglePlay: () => void;
  seek: (time: number) => void;
  seekTo: (progress: number) => void;
  skip: (seconds: number) => void;
  setVolume: (vol: number) => void;
  toggleMute: () => void;
  toggleLoop: () => void;
}

export interface WebAudioAPI {
  context: AudioContext | null;
  analyser: AnalyserNode | null;
  sourceNode: MediaElementAudioSourceNode | null;
}

export function useHybridAudio(options: UseHybridAudioOptions) {
  const {
    src,
    autoPlay = false,
    initialVolume = 1,
    loop = false,
    crossOrigin = 'anonymous',
    onPlay,
    onPause,
    onEnded,
    onTimeUpdate,
    onError,
  } = options;

  // Refs
  const audioRef = useRef<HTMLAudioElement | null>(null);
  const audioContextRef = useRef<AudioContext | null>(null);
  const sourceNodeRef = useRef<MediaElementAudioSourceNode | null>(null);
  const analyserRef = useRef<AnalyserNode | null>(null);

  // State
  const [state, setState] = useState<AudioState>({
    isReady: false,
    isPlaying: false,
    currentTime: 0,
    duration: 0,
    volume: initialVolume,
    isMuted: false,
    isLooping: loop,
    buffered: null,
    error: null,
  });

  // Initialize Web Audio for visualization
  const initWebAudio = useCallback(() => {
    const audio = audioRef.current;
    if (!audio || audioContextRef.current) return;

    try {
      const AudioContextClass = window.AudioContext ||
        (window as any).webkitAudioContext;
      const ctx = new AudioContextClass();
      audioContextRef.current = ctx;

      // Create source from audio element
      const source = ctx.createMediaElementSource(audio);
      sourceNodeRef.current = source;

      // Create analyser for visualization
      const analyser = ctx.createAnalyser();
      analyser.fftSize = 256;
      analyser.smoothingTimeConstant = 0.8;
      analyserRef.current = analyser;

      // Connect: source -> analyser (for visualization)
      source.connect(analyser);

      // Connect: source -> destination (for playback)
      // Note: analyser does NOT connect to destination - this prevents double audio
      source.connect(ctx.destination);

    } catch (error) {
      console.warn('[useHybridAudio] Web Audio init failed:', error);
    }
  }, []);

  // Resume AudioContext on user interaction
  const resumeAudioContext = useCallback(async () => {
    const ctx = audioContextRef.current;
    if (ctx && ctx.state === 'suspended') {
      await ctx.resume();
    }
  }, []);

  // Controls
  const controls: AudioControls = {
    play: useCallback(async () => {
      const audio = audioRef.current;
      if (!audio) return;

      try {
        await resumeAudioContext();
        await audio.play();
      } catch (error) {
        console.error('[useHybridAudio] Play failed:', error);
        onError?.(error as Error);
      }
    }, [resumeAudioContext, onError]),

    pause: useCallback(() => {
      audioRef.current?.pause();
    }, []),

    togglePlay: useCallback(() => {
      if (state.isPlaying) {
        controls.pause();
      } else {
        controls.play();
      }
    }, [state.isPlaying]),

    seek: useCallback((time: number) => {
      const audio = audioRef.current;
      if (audio && isFinite(time)) {
        audio.currentTime = Math.max(0, Math.min(time, state.duration));
      }
    }, [state.duration]),

    seekTo: useCallback((progress: number) => {
      controls.seek(state.duration * progress);
    }, [state.duration]),

    skip: useCallback((seconds: number) => {
      controls.seek(state.currentTime + seconds);
    }, [state.currentTime]),

    setVolume: useCallback((vol: number) => {
      const audio = audioRef.current;
      if (audio) {
        const clampedVol = Math.max(0, Math.min(1, vol));
        audio.volume = clampedVol;
        setState(prev => ({ ...prev, volume: clampedVol }));
      }
    }, []),

    toggleMute: useCallback(() => {
      const audio = audioRef.current;
      if (audio) {
        audio.muted = !audio.muted;
        setState(prev => ({ ...prev, isMuted: audio.muted }));
      }
    }, []),

    toggleLoop: useCallback(() => {
      const audio = audioRef.current;
      if (audio) {
        audio.loop = !audio.loop;
        setState(prev => ({ ...prev, isLooping: audio.loop }));
      }
    }, []),
  };

  // Event handlers
  useEffect(() => {
    const audio = audioRef.current;
    if (!audio) return;

    const handlers = {
      loadedmetadata: () => {
        setState(prev => ({
          ...prev,
          duration: audio.duration,
          isReady: true,
        }));
      },
      canplay: () => {
        setState(prev => ({ ...prev, isReady: true }));
        if (autoPlay) controls.play();
      },
      play: () => {
        setState(prev => ({ ...prev, isPlaying: true }));
        onPlay?.();
      },
      pause: () => {
        setState(prev => ({ ...prev, isPlaying: false }));
        onPause?.();
      },
      ended: () => {
        setState(prev => ({ ...prev, isPlaying: false }));
        onEnded?.();
      },
      timeupdate: () => {
        setState(prev => ({ ...prev, currentTime: audio.currentTime }));
        onTimeUpdate?.(audio.currentTime);
      },
      progress: () => {
        setState(prev => ({ ...prev, buffered: audio.buffered }));
      },
      error: () => {
        const error = new Error(audio.error?.message || 'Audio error');
        setState(prev => ({ ...prev, error }));
        onError?.(error);
      },
    };

    Object.entries(handlers).forEach(([event, handler]) => {
      audio.addEventListener(event, handler);
    });

    return () => {
      Object.entries(handlers).forEach(([event, handler]) => {
        audio.removeEventListener(event, handler);
      });
    };
  }, [autoPlay, onPlay, onPause, onEnded, onTimeUpdate, onError]);

  // Load new source
  useEffect(() => {
    const audio = audioRef.current;
    if (!audio) return;

    setState(prev => ({
      ...prev,
      isReady: false,
      currentTime: 0,
      duration: 0,
      error: null,
    }));

    audio.src = src;
    audio.load();
    initWebAudio();
  }, [src, initWebAudio]);

  // Create audio element
  useEffect(() => {
    const audio = document.createElement('audio');
    audio.preload = 'metadata';
    audio.crossOrigin = crossOrigin;
    audio.volume = initialVolume;
    audio.loop = loop;
    audioRef.current = audio;

    return () => {
      audio.pause();
      audio.src = '';
      audioContextRef.current?.close();
    };
  }, []);

  return {
    audioRef,
    state,
    controls,
    webAudio: {
      context: audioContextRef.current,
      analyser: analyserRef.current,
      sourceNode: sourceNodeRef.current,
    },
  };
}
```

### Fix 3: HybridAudioProvider.tsx (New)

```typescript
// context/HybridAudioProvider.tsx
import React, { createContext, useContext, useRef, useMemo } from 'react';
import { useHybridAudio, UseHybridAudioOptions, AudioState, AudioControls, WebAudioAPI } from '../hooks/useHybridAudio';
import { useAudioAnalysis, AudioLevels } from '../hooks/useAudioAnalysis';

interface HybridAudioContextValue {
  // Audio state
  state: AudioState;

  // Controls
  controls: AudioControls;

  // Web Audio (for visualizations)
  webAudio: WebAudioAPI;

  // Audio levels (for reactive effects)
  audioLevels: AudioLevels;

  // Audio element ref (for custom integrations)
  audioRef: React.RefObject<HTMLAudioElement | null>;
}

const HybridAudioContext = createContext<HybridAudioContextValue | null>(null);

export interface HybridAudioProviderProps extends UseHybridAudioOptions {
  children: React.ReactNode;
}

export function HybridAudioProvider({ children, ...options }: HybridAudioProviderProps) {
  const { audioRef, state, controls, webAudio } = useHybridAudio(options);

  // Audio analysis for reactive effects
  const audioLevels = useAudioAnalysis(webAudio.analyser, state.isPlaying);

  const value = useMemo<HybridAudioContextValue>(() => ({
    state,
    controls,
    webAudio,
    audioLevels,
    audioRef,
  }), [state, controls, webAudio, audioLevels, audioRef]);

  return (
    <HybridAudioContext.Provider value={value}>
      {children}
    </HybridAudioContext.Provider>
  );
}

export function useHybridAudioContext() {
  const context = useContext(HybridAudioContext);
  if (!context) {
    throw new Error('useHybridAudioContext must be used within HybridAudioProvider');
  }
  return context;
}
```

### Fix 4: HybridWaveform.tsx (New)

```typescript
// components/HybridWaveform.tsx
import React, { useRef, useEffect, useCallback } from 'react';
import { useHybridAudioContext } from '../context/HybridAudioProvider';

interface HybridWaveformProps {
  mode?: 'frequency' | 'waveform';
  peaks?: number[];
  height?: number;
  barWidth?: number;
  barGap?: number;
  progressColor?: string;
  waveColor?: string;
  bufferedColor?: string;
  className?: string;
  onSeek?: (time: number) => void;
}

export function HybridWaveform({
  mode = 'frequency',
  peaks,
  height = 64,
  barWidth = 3,
  barGap = 1,
  progressColor = 'rgba(59, 130, 246, 1)',
  waveColor = 'rgba(59, 130, 246, 0.4)',
  bufferedColor = 'rgba(255, 255, 255, 0.1)',
  className,
  onSeek,
}: HybridWaveformProps) {
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const { state, controls, webAudio } = useHybridAudioContext();
  const animationRef = useRef<number>();

  // Handle seek on click
  const handleClick = useCallback((e: React.MouseEvent<HTMLCanvasElement>) => {
    const canvas = canvasRef.current;
    if (!canvas || !state.duration) return;

    const rect = canvas.getBoundingClientRect();
    const x = e.clientX - rect.left;
    const progress = x / rect.width;
    const time = state.duration * progress;

    controls.seek(time);
    onSeek?.(time);
  }, [state.duration, controls, onSeek]);

  // Render frequency bars (real-time)
  const renderFrequency = useCallback(() => {
    const canvas = canvasRef.current;
    const analyser = webAudio.analyser;
    if (!canvas || !analyser) return;

    const ctx = canvas.getContext('2d');
    if (!ctx) return;

    const { width, height: canvasHeight } = canvas;
    const bufferLength = analyser.frequencyBinCount;
    const dataArray = new Uint8Array(bufferLength);
    analyser.getByteFrequencyData(dataArray);

    ctx.clearRect(0, 0, width, canvasHeight);

    // Draw buffered regions
    if (state.buffered && state.duration > 0) {
      ctx.fillStyle = bufferedColor;
      for (let i = 0; i < state.buffered.length; i++) {
        const start = (state.buffered.start(i) / state.duration) * width;
        const end = (state.buffered.end(i) / state.duration) * width;
        ctx.fillRect(start, canvasHeight - 2, end - start, 2);
      }
    }

    // Draw frequency bars
    const barCount = Math.floor(width / (barWidth + barGap));
    const step = Math.floor(bufferLength / barCount);
    const progress = state.duration > 0 ? state.currentTime / state.duration : 0;
    const progressX = width * progress;

    for (let i = 0; i < barCount; i++) {
      const value = dataArray[i * step] / 255;
      const barHeight = value * canvasHeight * 0.9;
      const x = i * (barWidth + barGap);
      const y = canvasHeight - barHeight;

      ctx.fillStyle = x < progressX ? progressColor : waveColor;
      ctx.fillRect(x, y, barWidth, barHeight);
    }

    // Continue animation if playing
    if (state.isPlaying) {
      animationRef.current = requestAnimationFrame(renderFrequency);
    }
  }, [webAudio.analyser, state, barWidth, barGap, progressColor, waveColor, bufferedColor]);
804
+
805
+ // Render pre-computed waveform
806
+ const renderWaveform = useCallback(() => {
807
+ const canvas = canvasRef.current;
808
+ if (!canvas || !peaks?.length) return;
809
+
810
+ const ctx = canvas.getContext('2d');
811
+     if (!ctx) return;
+
+     const { width, height: canvasHeight } = canvas;
+
+     ctx.clearRect(0, 0, width, canvasHeight);
+
+     // Draw buffered regions
+     if (state.buffered && state.duration > 0) {
+       ctx.fillStyle = bufferedColor;
+       for (let i = 0; i < state.buffered.length; i++) {
+         const start = (state.buffered.start(i) / state.duration) * width;
+         const end = (state.buffered.end(i) / state.duration) * width;
+         ctx.fillRect(start, canvasHeight - 2, end - start, 2);
+       }
+     }
+
+     // Draw waveform
+     const barCount = Math.floor(width / (barWidth + barGap));
+     const peaksPerBar = Math.ceil(peaks.length / barCount);
+     const progress = state.duration > 0 ? state.currentTime / state.duration : 0;
+     const progressX = width * progress;
+
+     for (let i = 0; i < barCount; i++) {
+       // Average peaks for this bar (guard against empty slices when
+       // peaks.length < barCount)
+       const startPeak = i * peaksPerBar;
+       const endPeak = Math.min(startPeak + peaksPerBar, peaks.length);
+       let sum = 0;
+       for (let j = startPeak; j < endPeak; j++) {
+         sum += Math.abs(peaks[j]);
+       }
+       const avg = endPeak > startPeak ? sum / (endPeak - startPeak) : 0;
+
+       const barHeight = avg * canvasHeight * 0.9;
+       const x = i * (barWidth + barGap);
+       const y = (canvasHeight - barHeight) / 2;
+
+       ctx.fillStyle = x < progressX ? progressColor : waveColor;
+       ctx.fillRect(x, y, barWidth, barHeight);
+     }
+   }, [peaks, state, barWidth, barGap, progressColor, waveColor, bufferedColor]);
+
+   // Animation loop
+   useEffect(() => {
+     if (mode === 'frequency') {
+       renderFrequency();
+     } else {
+       renderWaveform();
+     }
+
+     return () => {
+       if (animationRef.current) {
+         cancelAnimationFrame(animationRef.current);
+       }
+     };
+   }, [mode, renderFrequency, renderWaveform, state.isPlaying]);
+
+   // Re-render waveform on time change (for static waveform mode)
+   useEffect(() => {
+     if (mode === 'waveform') {
+       renderWaveform();
+     }
+   }, [mode, state.currentTime, renderWaveform]);
+
+   return (
+     <canvas
+       ref={canvasRef}
+       height={height}
+       className={className}
+       onClick={handleClick}
+       style={{ width: '100%', cursor: 'pointer' }}
+     />
+   );
+ }
+ ```
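The per-bar averaging in the loop above can be lifted into a pure function with no canvas dependency, which makes it easy to unit-test. A sketch (the `averagePeakBars` helper is hypothetical, not part of the package):

```typescript
// Hypothetical pure helper mirroring the per-bar averaging logic above,
// extracted so it can be tested without a canvas or React.
export function averagePeakBars(peaks: number[], barCount: number): number[] {
  if (barCount <= 0 || peaks.length === 0) return [];
  const peaksPerBar = Math.ceil(peaks.length / barCount);
  const bars: number[] = [];
  for (let i = 0; i < barCount; i++) {
    const start = i * peaksPerBar;
    const end = Math.min(start + peaksPerBar, peaks.length);
    let sum = 0;
    for (let j = start; j < end; j++) sum += Math.abs(peaks[j]);
    // Trailing bars may cover no peaks when peaks.length < barCount
    bars.push(end > start ? sum / (end - start) : 0);
  }
  return bars;
}
```

Because the helper is pure, the empty-slice and short-input edge cases can be asserted directly instead of eyeballing canvas output.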
+
+ ### Fix 5: HybridSimplePlayer.tsx (New)
+
+ ```typescript
+ // components/HybridSimplePlayer.tsx
+ import React from 'react';
+ import { HybridAudioProvider, HybridAudioProviderProps } from '../context/HybridAudioProvider';
+ import { HybridAudioPlayer, HybridAudioPlayerProps } from './HybridAudioPlayer';
+ import { AudioReactiveCover } from './ReactiveCover/AudioReactiveCover';
+ import { VisualizationProvider } from '../context/VisualizationProvider';
+
+ export interface HybridSimplePlayerProps extends Omit<HybridAudioProviderProps, 'children'> {
+   // Display options
+   title?: string;
+   artist?: string;
+   coverArt?: string;
+
+   // Features
+   showWaveform?: boolean;
+   showControls?: boolean;
+   showTimer?: boolean;
+   showVolume?: boolean;
+   showLoop?: boolean;
+
+   // Reactive cover
+   reactiveCover?: boolean;
+   variant?: 'glow' | 'orbs' | 'spotlight' | 'mesh' | 'none';
+
+   // Waveform mode
+   waveformMode?: 'frequency' | 'waveform';
+
+   // Styling
+   className?: string;
+ }
+
+ export function HybridSimplePlayer({
+   src,
+   title,
+   artist,
+   coverArt,
+   showWaveform = true,
+   showControls = true,
+   showTimer = true,
+   showVolume = true,
+   showLoop = true,
+   reactiveCover = false,
+   variant = 'spotlight',
+   waveformMode = 'frequency',
+   className,
+   ...audioOptions
+ }: HybridSimplePlayerProps) {
+   return (
+     <VisualizationProvider>
+       <HybridAudioProvider src={src} {...audioOptions}>
+         <div className={className}>
+           {/* Reactive Cover Art */}
+           {coverArt && reactiveCover && variant !== 'none' && (
+             <AudioReactiveCover variant={variant}>
+               <img src={coverArt} alt={title || 'Cover'} />
+             </AudioReactiveCover>
+           )}
+
+           {/* Non-reactive Cover Art */}
+           {coverArt && (!reactiveCover || variant === 'none') && (
+             <img src={coverArt} alt={title || 'Cover'} />
+           )}
+
+           {/* Title and Artist */}
+           {(title || artist) && (
+             <div className="mb-2">
+               {title && <div className="font-medium">{title}</div>}
+               {artist && <div className="text-sm text-muted-foreground">{artist}</div>}
+             </div>
+           )}
+
+           {/* Player */}
+           <HybridAudioPlayer
+             showWaveform={showWaveform}
+             showControls={showControls}
+             showTimer={showTimer}
+             showVolume={showVolume}
+             showLoop={showLoop}
+             waveformMode={waveformMode}
+           />
+         </div>
+       </HybridAudioProvider>
+     </VisualizationProvider>
+   );
+ }
+ ```
+
+ ---
+
+ ## Migration Guide
+
+ ### From WaveSurfer-based Player
+
+ ```tsx
+ // BEFORE
+ import { SimpleAudioPlayer } from '@djangocfg/ui-nextjs/tools/AudioPlayer';
+
+ <SimpleAudioPlayer
+   src={url}
+   showWaveform
+   reactiveCover
+   variant="spotlight"
+ />
+
+ // AFTER
+ import { HybridSimplePlayer } from '@djangocfg/ui-nextjs/tools/AudioPlayer';
+
+ <HybridSimplePlayer
+   src={url}
+   showWaveform
+   reactiveCover
+   variant="spotlight"
+ />
+ ```
+
+ ### From Progressive Player
+
+ ```tsx
+ // BEFORE
+ import { ProgressiveAudioPlayer } from '@djangocfg/ui-nextjs/tools/AudioPlayer';
+
+ <ProgressiveAudioPlayer
+   src={streamUrl}
+   showWaveform
+ />
+
+ // AFTER
+ import { HybridSimplePlayer } from '@djangocfg/ui-nextjs/tools/AudioPlayer';
+
+ <HybridSimplePlayer
+   src={streamUrl}
+   showWaveform
+   waveformMode="frequency" // Real-time visualization
+ />
+ ```
+
+ ### AudioViewer Migration
+
+ ```tsx
+ // BEFORE (AudioViewer.tsx)
+ {useProgressivePlayer ? (
+   <ProgressiveAudioPlayer src={src} ... />
+ ) : (
+   <SimpleAudioPlayer src={src} ... />
+ )}
+
+ // AFTER (AudioViewer.tsx)
+ <HybridSimplePlayer
+   src={src}
+   reactiveCover
+   variant="spotlight"
+   ...
+ />
+ // No mode toggle needed - hybrid player handles everything
+ ```
+
+ ---
+
+ ## Testing Strategy
+
+ ### Unit Tests
+
+ 1. `useHybridAudio` hook behavior
+    - State transitions (loading -> ready -> playing -> paused)
+    - Control functions (play, pause, seek, volume)
+    - Web Audio initialization
+
+ 2. Audio routing
+    - Source connects to the destination exactly once
+    - Analyser taps the source only (it is not routed to the destination)
+    - No double routing
+
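The routing invariants above (source created once, routed to the destination once) can be enforced with a small cache keyed by the media element, since `createMediaElementSource()` throws when called a second time for the same element, and connecting the source to the destination twice doubles the signal. A testable sketch with an injected context; `getOrCreateSource` and its shapes are assumptions, not the package API:

```typescript
// Hypothetical guard against double routing: one source node per media
// element, connected to the destination exactly once. The context is a
// structural parameter so the logic can be tested with a mock.
interface SourceLike {
  connect(dest: unknown): void;
}

interface ContextLike {
  destination: unknown;
  createMediaElementSource(el: object): SourceLike;
}

const sourceCache = new WeakMap<object, { connected: boolean; source: SourceLike }>();

export function getOrCreateSource(ctx: ContextLike, element: object): SourceLike {
  let entry = sourceCache.get(element);
  if (!entry) {
    // First call for this element: create the source node once.
    entry = { connected: false, source: ctx.createMediaElementSource(element) };
    sourceCache.set(element, entry);
  }
  if (!entry.connected) {
    // Route to the speakers exactly once; repeat calls are no-ops.
    entry.source.connect(ctx.destination);
    entry.connected = true;
  }
  return entry.source;
}
```

A unit test can then assert that repeated calls neither recreate the source nor add a second connection.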
+ ### Integration Tests
+
+ 1. Full player rendering
+ 2. Reactive effects with audio levels
+ 3. Waveform visualization modes
+ 4. Buffered region display
+
+ ### Manual Testing Checklist
+
+ - [ ] No crackling during normal playback
+ - [ ] No crackling when seeking
+ - [ ] No crackling with reactive effects enabled
+ - [ ] Seek works beyond 2 minutes (with backend fix)
+ - [ ] Volume control works correctly
+ - [ ] Loop toggle works
+ - [ ] Keyboard shortcuts work
+ - [ ] Mobile touch controls work
+ - [ ] Cross-browser: Chrome, Firefox, Safari, Edge
+
+ ### Test Files
+
+ Test with various audio scenarios:
+ - Small files (< 5MB)
+ - Large files (> 50MB)
+ - Streaming URLs
+ - Different bitrates (128, 192, 256, 320 kbps)
+ - Different formats (MP3, AAC, OGG)
+
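One way to keep the format/bitrate coverage above exhaustive is to generate the combinations rather than enumerate them by hand. A sketch with a hypothetical `buildTestMatrix` helper:

```typescript
// Hypothetical helper producing a full cross-product of test scenarios,
// so no format/bitrate combination is silently skipped.
export interface AudioScenario {
  format: string;
  bitrate: number;
}

export function buildTestMatrix(formats: string[], bitrates: number[]): AudioScenario[] {
  const matrix: AudioScenario[] = [];
  for (const format of formats) {
    for (const bitrate of bitrates) {
      matrix.push({ format, bitrate });
    }
  }
  return matrix;
}
```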
+ ---
+
+ ## Risk Assessment
+
+ | Risk | Likelihood | Impact | Mitigation |
+ |------|------------|--------|------------|
+ | Safari AudioContext issues | Medium | High | Test webkit prefix, add fallbacks |
+ | CORS with streaming | Medium | Medium | Ensure crossOrigin="anonymous" |
+ | Breaking existing code | Medium | High | Keep old components, provide migration guide |
+ | Performance on mobile | Medium | Medium | Lower fftSize, reduce frame rate |
+ | Memory leaks | Low | Medium | Proper cleanup in useEffect |
+
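For the Safari row above: older Safari versions expose the constructor only as `webkitAudioContext`, so the fallback can be isolated in a helper that takes the global object as a parameter, which keeps it testable outside a browser. A sketch (the `resolveAudioContextCtor` name is hypothetical):

```typescript
// Hypothetical fallback resolver for the Safari webkit prefix. Pass in
// `window` (or any object) so the choice of constructor is unit-testable.
export function resolveAudioContextCtor(g: {
  AudioContext?: unknown;
  webkitAudioContext?: unknown;
}): unknown {
  // Prefer the standard constructor, fall back to the webkit-prefixed one,
  // and return null when neither exists (caller should degrade gracefully).
  return g.AudioContext ?? g.webkitAudioContext ?? null;
}
```

In the player, a `null` result would mean skipping Web Audio analysis entirely and playing through the bare `<audio>` element.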
+ ---
+
+ ## Timeline Estimate
+
+ | Phase | Duration | Dependencies |
+ |-------|----------|--------------|
+ | Phase 0: Quick Fix | 1 hour | None |
+ | Phase 1: Core Infrastructure | 2 days | Phase 0 |
+ | Phase 2: Visualization | 2 days | Phase 1 |
+ | Phase 3: Player Components | 2 days | Phase 2 |
+ | Phase 4: Integration & Testing | 2 days | Phase 3 |
+ | Phase 5: Cleanup | 2 days | Phase 4 |
+
+ **Total: ~10 days** (can be parallelized)
+
+ ### Recommended Approach
+
+ 1. **Immediately:** Apply Phase 0 fix to eliminate crackling
+ 2. **Short-term:** Implement Phases 1-3 for hybrid player
+ 3. **Medium-term:** Phase 4 integration and testing
+ 4. **Long-term:** Phase 5 cleanup and deprecation
+
+ ---
+
+ ## Appendix: Related Files Reference
+
+ ### Analysis Documents
+ - `01-WAVESURFER-STREAMING-ANALYSIS.md` - WaveSurfer.js streaming limitations
+ - `02-MEDIA-VIEWER-ANALYSIS.md` - Current player architecture comparison
+ - `03-HYBRID-ARCHITECTURE-PROPOSAL.md` - Detailed hybrid design
+ - `04-CRACKLING-ISSUE-DIAGNOSIS.md` - Root cause and fixes
+
+ ### External References
+ - `cmdop/.../01-PROBLEM-ANALYSIS.md` - Backend chunk limit issue
+ - `cmdop/.../02-SOLUTION-OPTIONS.md` - Solution comparison
+
+ ### Source Files to Reference
+ - `hooks/useSharedWebAudio.ts` - Current Web Audio routing
+ - `hooks/useAudioAnalysis.ts` - Audio level calculation
+ - `progressive/useAudioElement.ts` - Native audio hook reference
+ - `progressive/WaveformCanvas.tsx` - Canvas rendering reference
+ - `context/AudioProvider.tsx` - Current WaveSurfer context
+
+ ---
+
+ **End of Implementation Roadmap**