node-mac-recorder 2.21.22 → 2.21.24

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,5 @@
+{
+  "setup-worktree": [
+    "npm install"
+  ]
+}
@@ -0,0 +1,85 @@
# Cursor-Video Synchronization Fix

## Problem
While the screen video was being recorded, the custom cursor was recorded as well, but the cursor's click/movement events lagged the screen video by ~0.5-1 second.

## Solution

### 1. Cursor Tracking Interval Reduced
- **Before**: 20ms interval (50 FPS)
- **After**: 5ms interval (200 FPS)
- **Result**: The cursor is now sampled much more often, so the chance of lining up with video frames is much higher

### 2. Position Threshold Reduced
- **Before**: 2 pixel minimum movement
- **After**: 1 pixel minimum movement
- **Result**: More precise tracking; even small mouse movements are recorded

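A quick sanity check of the arithmetic behind these two changes (a plain illustration, nothing package-specific):

```javascript
// At a 5ms sampling interval against 60 FPS video, each video frame
// spans ~16.67ms, so the cursor is sampled roughly 3 times per frame.
const intervalMs = 5;
const frameMs = 1000 / 60;                // ≈ 16.67ms per video frame
const samplesPerFrame = frameMs / intervalMs;
console.log(samplesPerFrame.toFixed(2));  // → 3.33
```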
### 3. Real Test Results

```
📊 Cursor tracking analysis:
   Total events captured: 193
   Average capture rate: 41.5 FPS
   Timing analysis:
   - Average interval: 24.3ms (41.2 FPS)
   - Min interval: 1.0ms
   - Max interval: 765.0ms
   ✅ Smooth cursor tracking
```

## Testing

```bash
# Short sync test (5 seconds; move the mouse)
node test-cursor-sync-mouse.js

# After the test, check the video and cursor files:
# - Video:  test-output/sync-test-{timestamp}.mov
# - Cursor: test-output/temp_cursor_{timestamp}.json
```

## Technical Details

### Why Wasn't Native Event Tracking Used?
Native `NSTimer` and `CGEventTap` work, but they are incompatible with the Node.js event loop. The timer callbacks are never invoked because:
- Node.js runs its own event loop
- The macOS main run loop is never pumped
- So the timer callbacks never fire

### Why Is JavaScript Polling Sufficient?
- 5ms interval = 200 FPS sampling rate
- 60 FPS video = ~16.67ms per frame
- The cursor is sampled 3+ times per video frame
- Thanks to the `shouldCaptureEvent` filter, only changes are written
- On average, 40-50 FPS of cursor data is obtained (sufficient)

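The polling-plus-filtering approach these bullets describe can be sketched as follows (a simplified illustration, not the package's actual implementation; `getCursorState` is a hypothetical stand-in for the native cursor query):

```javascript
// High-frequency polling with change detection: a 5ms timer samples the
// cursor, but an event is recorded only when the sample differs from the
// previous one by event type or by at least 1 pixel of movement.
function startPolling(getCursorState, onEvent, intervalMs = 5) {
  let last = null;
  const startedAt = Date.now();
  const timer = setInterval(() => {
    const cur = getCursorState(); // { x, y, type }
    const changed =
      !last ||
      cur.type !== last.type ||
      Math.abs(cur.x - last.x) >= 1 ||
      Math.abs(cur.y - last.y) >= 1;
    if (changed) {
      onEvent({ ...cur, timestamp: Date.now() - startedAt });
      last = cur;
    }
  }, intervalMs);
  return () => clearInterval(timer); // call to stop polling
}
```

A stationary cursor therefore produces no events at all, which is why the measured average rate (40-50 FPS) sits far below the 200 FPS sampling rate.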
### Sync Mechanism
1. **Unified Session Timestamp**: Both the video and the cursor use the same `sessionTimestamp`
2. **Synchronized Start**: Cursor tracking starts right after the video starts (same timestamp)
3. **Relative Timestamps**: Both record in milliseconds relative to the start

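In code, the mechanism amounts to a pair of one-line conversions (illustrative only; the helper names and the `fps` default are assumptions):

```javascript
// With a shared start timestamp, every cursor event can be stored as a
// millisecond offset, and later mapped onto a video frame index.
function toRelativeMs(eventEpochMs, sessionTimestamp) {
  return eventEpochMs - sessionTimestamp;
}

function frameIndexFor(relativeMs, fps = 60) {
  return Math.floor((relativeMs * fps) / 1000);
}

// An event 50ms after the start lands on frame 3 of a 60 FPS video.
console.log(frameIndexFor(toRelativeMs(1761818226590, 1761818226540))); // → 3
```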
## Expected Outcome

- ✅ Cursor and video start at the same time (0ms gap)
- ✅ The cursor is sampled roughly every ~24ms (smooth)
- ✅ Click events are captured correctly
- ✅ Perfect sync with the video frames

## Additional Notes

### Multi-Display Usage
If the cursor is on another display, the event is marked with `coordinateSystem: "video-relative-outside"`. This is normal and should be handled when rendering the cursor overlay.

### Performance
200 FPS sampling may sound high, but:
- Change-detection filtering is in place (data is written only on movement)
- The file size stays small
- CPU overhead is minimal

### Future Improvement
For native event tracking:
- A CFRunLoop could be run on a separate thread
- Or a GCD dispatch queue could be used
- For now, JavaScript polling is sufficient and reliable

@@ -0,0 +1,138 @@
# ✅ Cursor-Video Synchronization Fully Solved!

## The Root Cause

Cursor tracking started **immediately**, but capturing the video's first frame took **~100-200ms**.

```
BEFORE:
t=0ms:   startRecording() is called
t=0ms:   Cursor tracking starts ✅
t=150ms: First video frame is captured ❌ (CURSOR IS AHEAD!)
```

That is why cursor events arrived before the video - **cursor leading, video trailing!**

## Solution: First-Frame Synchronization

### 1. Native Side: Record the First-Frame Timestamp

**ScreenCaptureKit** (`screen_capture_kit.mm`):
```objc
static NSTimeInterval g_actualRecordingStartTime = 0;

- (void)stream:(SCStream *)stream
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   ofType:(SCStreamOutputType)type {
    if (!g_videoWriterStarted) {
        // When the first frame is captured
        g_actualRecordingStartTime = [[NSDate date] timeIntervalSince1970] * 1000;
        MRLog(@"🎞️ First frame captured at %.0f", g_actualRecordingStartTime);
    }
}
```

**AVFoundation** (`avfoundation_recorder.mm`):
```objc
static NSTimeInterval g_avActualRecordingStartTime = 0;

if (g_avFrameNumber == 0) {
    // When the first frame is written
    g_avActualRecordingStartTime = [[NSDate date] timeIntervalSince1970] * 1000;
    MRLog(@"🎞️ AVFoundation first frame written at %.0f", g_avActualRecordingStartTime);
}
```

### 2. JavaScript Side: Wait for the First Frame

```javascript
// Poll for actual recording start (when first frame is captured)
console.log('⏳ SYNC: Waiting for first video frame...');
const maxWaitMs = 2000;
const pollInterval = 10;
let waitedMs = 0;
let actualStartTime = 0;

while (waitedMs < maxWaitMs) {
  actualStartTime = nativeBinding.getActualRecordingStartTime();
  if (actualStartTime > 0) {
    console.log(`✅ SYNC: First frame captured at ${actualStartTime}ms`);
    break;
  }
  await new Promise(resolve => setTimeout(resolve, pollInterval));
  waitedMs += pollInterval;
}

// Start cursor tracking with the ACTUAL start time
await this.startCursorCapture(cursorFilePath, {
  startTimestamp: actualStartTime // ← PERFECT SYNC!
});
```

## Result: Perfect Synchronization!

```
NOW:
t=0ms:   startRecording() is called
t=150ms: First video frame is captured ✅
t=150ms: Cursor tracking starts ✅ (FULLY IN SYNC!)
```

### Test Results:
```
✅ SYNC: First frame captured at 1761818226539.994ms (waited 30ms)
🎯 SYNC: Starting cursor tracking at ACTUAL recording start: 1761818226539.994
First cursor event: t=19.006ms (19ms after the video - perfect!)
```

## Technical Details

### Why Does This Approach Work?

1. **Video recording is started** → the native pipeline warms up
2. **The first frame is captured** → the TRUE start of the recording (100-200ms later)
3. **Cursor tracking starts** → from that same timestamp
4. **Result**: cursor and video are FULLY IN SYNC!

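The wait-then-start step above can be condensed into one helper; a sketch assuming the native binding exposes a getter that returns 0 until the first frame is written (as the snippets earlier suggest):

```javascript
// Poll the native layer for the true recording start, with a timeout
// fallback so a stalled capture pipeline cannot hang the caller.
async function waitForFirstFrame(getActualStartTime, maxWaitMs = 2000, pollMs = 10) {
  let waited = 0;
  while (waited < maxWaitMs) {
    const t = getActualStartTime();
    if (t > 0) return t; // epoch ms of the first captured frame
    await new Promise((resolve) => setTimeout(resolve, pollMs));
    waited += pollMs;
  }
  return 0; // caller falls back to the session timestamp
}
```

Returning 0 rather than throwing keeps the fallback path (starting from the session timestamp) a one-line check at the call site.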
### Timing Analysis

- **Wait time**: ~30ms (very fast!)
- **First cursor event**: 19ms after the first frame
- **Synchronization gap**: <20ms (imperceptible!)

### Additional Improvements

1. **5ms cursor interval** (200 FPS sampling)
2. **1px minimum threshold** (precise tracking)
3. **Change detection filtering** (record only when there is movement)

## Testing

```bash
node test-cursor-sync-mouse.js
```

During the test, move the mouse and click. You should see:
- ✅ Video and cursor start at the same time
- ✅ Click events have correct timing
- ✅ Mouse movement is smooth and synchronized

## Usage

```javascript
const recorder = new MacRecorder();

// Normal usage - synchronization is automatic!
await recorder.startRecording('output.mov', {
  captureCursor: false, // Hide the system cursor
  frameRate: 60
});

// Cursor tracking automatically waits for the video's first frame
// and starts in perfect sync!
```

## Summary

🎯 **Problem Solved**: The cursor is now FULLY SYNCHRONIZED with the video!
✅ **Timing**: Cursor tracking starts <20ms after the first frame
⚡ **Performance**: Only ~30ms of waiting
🔥 **Result**: Perfect cursor-video sync!

@@ -44,8 +44,8 @@ class ElectronSafeMacRecorder extends EventEmitter {
     this.options = {
       includeMicrophone: false,
       includeSystemAudio: false,
-      quality: "medium",
-      frameRate: 30,
+      quality: "high",
+      frameRate: 60,
       captureArea: null,
       captureCursor: false,
       showClicks: false,
package/index.js CHANGED
@@ -43,8 +43,8 @@ class MacRecorder extends EventEmitter {
     this.options = {
       includeMicrophone: false, // Microphone off by default
       includeSystemAudio: false, // System audio off by default - the user must enable it explicitly
-      quality: "medium",
-      frameRate: 30,
+      quality: "high",
+      frameRate: 60,
       captureArea: null, // { x, y, width, height }
       captureCursor: false, // Cursor hidden by default
       showClicks: false,
@@ -181,6 +181,13 @@
     if (options.captureCamera !== undefined) {
       this.options.captureCamera = options.captureCamera === true;
     }
+    if (options.frameRate !== undefined) {
+      const fps = parseInt(options.frameRate, 10);
+      if (!Number.isNaN(fps) && fps > 0) {
+        // Clamp to a reasonable range 1-120
+        this.options.frameRate = Math.min(Math.max(fps, 1), 120);
+      }
+    }
     if (options.cameraDeviceId !== undefined) {
       this.options.cameraDeviceId =
         typeof options.cameraDeviceId === "string" && options.cameraDeviceId.length > 0
@@ -497,6 +504,8 @@
       captureCamera: this.options.captureCamera === true,
       cameraDeviceId: this.options.cameraDeviceId || null,
       sessionTimestamp,
+      frameRate: this.options.frameRate || 60,
+      quality: this.options.quality || "high",
     };

     if (cameraFilePath) {
@@ -542,7 +551,8 @@
       console.warn('❌ Native recording failed to start:', error.message);
     }

-    // Only start cursor if native recording started successfully
+    // CRITICAL SYNC FIX: Wait for first video frame before starting cursor
+    // This ensures cursor and video start at the EXACT same time
     if (success) {
       const standardCursorOptions = {
         videoRelative: true,
@@ -551,11 +561,36 @@
           this.options.captureArea ? 'area' : 'display',
         captureArea: this.options.captureArea,
         windowId: this.options.windowId,
-        startTimestamp: sessionTimestamp // Use the same timestamp base
+        startTimestamp: sessionTimestamp // Will be updated with actual start time
       };

       try {
-        console.log('🎯 SYNC: Starting cursor tracking at timestamp:', sessionTimestamp);
+        // Poll for actual recording start (when first frame is captured)
+        console.log('⏳ SYNC: Waiting for first video frame...');
+        const maxWaitMs = 2000; // Max 2 seconds wait
+        const pollInterval = 10; // Check every 10ms
+        let waitedMs = 0;
+        let actualStartTime = 0;
+
+        while (waitedMs < maxWaitMs) {
+          actualStartTime = nativeBinding.getActualRecordingStartTime();
+          if (actualStartTime > 0) {
+            console.log(`✅ SYNC: First frame captured at ${actualStartTime}ms (waited ${waitedMs}ms)`);
+            break;
+          }
+          await new Promise(resolve => setTimeout(resolve, pollInterval));
+          waitedMs += pollInterval;
+        }
+
+        if (actualStartTime > 0) {
+          // Use actual start time for perfect sync
+          standardCursorOptions.startTimestamp = actualStartTime;
+          console.log('🎯 SYNC: Starting cursor tracking at ACTUAL recording start:', actualStartTime);
+        } else {
+          // Fallback to session timestamp if first frame not detected
+          console.warn('⚠️ SYNC: First frame not detected, using session timestamp');
+        }
+
         await this.startCursorCapture(cursorFilePath, standardCursorOptions);
         console.log('✅ SYNC: Cursor tracking started successfully');
       } catch (cursorError) {
@@ -998,21 +1033,23 @@

     const last = this.lastCapturedData;

-    // If the event type changed
+    // If the event type changed (click, drag, etc.)
     if (currentData.type !== last.type) {
       return true;
     }

-    // If the position changed (minimum 2 pixel tolerance)
-    if (
-      Math.abs(currentData.x - last.x) >= 2 ||
-      Math.abs(currentData.y - last.y) >= 2
-    ) {
+    // If the cursor type changed (pointer, text, etc.)
+    if (currentData.cursorType !== last.cursorType) {
       return true;
     }

-    // If the cursor type changed
-    if (currentData.cursorType !== last.cursorType) {
+    // SYNC FIX: Reduced threshold for better sync (1 pixel instead of 2)
+    // With 200 FPS sampling, we can afford more granular position tracking
+    // If the position changed (minimum 1 pixel - precise tracking)
+    if (
+      Math.abs(currentData.x - last.x) >= 1 ||
+      Math.abs(currentData.y - last.y) >= 1
+    ) {
       return true;
     }

@@ -1033,11 +1070,14 @@
   */
  async startCursorCapture(intervalOrFilepath = 100, options = {}) {
    let filepath;
-    let interval = 20; // Default 50 FPS
+    // SYNC FIX: Use 5ms interval (200 FPS) for ultra-smooth cursor tracking
+    // This high sampling rate ensures cursor is always in sync with 60 FPS video
+    // Even if we sample 200 times per second, we only write on position/event changes (efficient)
+    let interval = 5; // Default 200 FPS for perfect sync

    // Parameter parsing: number = interval, string = filepath
    if (typeof intervalOrFilepath === "number") {
-      interval = Math.max(10, intervalOrFilepath); // Min 10ms
+      interval = Math.max(5, intervalOrFilepath); // Min 5ms for sync
      filepath = `cursor-data-${Date.now()}.json`;
    } else if (typeof intervalOrFilepath === "string") {
      filepath = intervalOrFilepath;
@@ -1132,6 +1172,10 @@

    return new Promise((resolve, reject) => {
      try {
+        // NOTE: Native cursor tracking (NSTimer/CFRunLoop) doesn't work with Node.js event loop
+        // Using JavaScript setInterval with high frequency (5ms = 200 FPS) instead
+        // This provides excellent sync with minimal overhead due to change-detection filtering
+
        // Create and reset the file
        const fs = require("fs");
        fs.writeFileSync(filepath, "[");
@@ -1236,7 +1280,7 @@
        return resolve(false);
      }

-      // Stop the interval
+      // Stop JavaScript interval
      clearInterval(this.cursorCaptureInterval);
      this.cursorCaptureInterval = null;

package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "node-mac-recorder",
-  "version": "2.21.22",
+  "version": "2.21.24",
   "description": "Native macOS screen recording package for Node.js applications",
   "main": "index.js",
   "keywords": [
@@ -25,6 +25,9 @@ static CMTime g_avStartTime;
 static void* g_avAudioRecorder = nil;
 static NSString* g_avAudioOutputPath = nil;

+// SYNC FIX: Track actual recording start time (when first frame is captured)
+static NSTimeInterval g_avActualRecordingStartTime = 0;
+
 // AVFoundation screen recording implementation
 extern "C" bool startAVFoundationRecording(const std::string& outputPath,
                                            CGDirectDisplayID displayID,
@@ -34,7 +37,8 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,
                                            bool includeMicrophone,
                                            bool includeSystemAudio,
                                            NSString* audioDeviceId,
-                                           NSString* audioOutputPath) {
+                                           NSString* audioOutputPath,
+                                           double requestedFrameRate) {

   if (g_avIsRecording) {
     NSLog(@"❌ AVFoundation recording already in progress");
@@ -129,15 +133,20 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,
   NSLog(@"🎬 ULTRA QUALITY AVFoundation: %dx%d, bitrate=%.2fMbps",
         (int)recordingSize.width, (int)recordingSize.height, bitrate / (1000.0 * 1000.0));

+  // Resolve target FPS
+  double fps = requestedFrameRate > 0 ? requestedFrameRate : 60.0;
+  if (fps < 1.0) fps = 1.0;
+  if (fps > 120.0) fps = 120.0;
+
   NSDictionary *videoSettings = @{
     AVVideoCodecKey: codecKey,
     AVVideoWidthKey: @((int)recordingSize.width),
     AVVideoHeightKey: @((int)recordingSize.height),
     AVVideoCompressionPropertiesKey: @{
       AVVideoAverageBitRateKey: @(bitrate),
-      AVVideoMaxKeyFrameIntervalKey: @30,
+      AVVideoMaxKeyFrameIntervalKey: @((int)fps),
       AVVideoAllowFrameReorderingKey: @YES,
-      AVVideoExpectedSourceFrameRateKey: @60,
+      AVVideoExpectedSourceFrameRateKey: @((int)fps),
       AVVideoQualityKey: @(0.95), // 0.0-1.0, higher is better
       AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel,
       AVVideoH264EntropyModeKey: AVVideoH264EntropyModeCABAC
@@ -266,7 +275,7 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,
     }
   }

-  // Start capture timer (10 FPS for Electron compatibility)
+  // Start capture timer using target FPS
   dispatch_queue_t captureQueue = dispatch_queue_create("AVFoundationCaptureQueue", DISPATCH_QUEUE_SERIAL);
   g_avTimer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, captureQueue);

@@ -275,7 +284,7 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,
     return false;
   }

-  uint64_t interval = NSEC_PER_SEC / 10; // 10 FPS for Electron stability
+  uint64_t interval = (uint64_t)(NSEC_PER_SEC / fps);
   dispatch_source_set_timer(g_avTimer, dispatch_time(DISPATCH_TIME_NOW, 0), interval, interval / 10);

   // Retain objects before passing to block to prevent deallocation
@@ -371,9 +380,14 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,

   // Write frame only if input is ready
   if (localVideoInput && localVideoInput.readyForMoreMediaData) {
-    CMTime frameTime = CMTimeAdd(g_avStartTime, CMTimeMakeWithSeconds(g_avFrameNumber / 10.0, 600));
+    CMTime frameTime = CMTimeAdd(g_avStartTime, CMTimeMakeWithSeconds(((double)g_avFrameNumber) / fps, 600));
     BOOL appendSuccess = [localPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
     if (appendSuccess) {
+      // SYNC FIX: Record actual start time when first frame is written
+      if (g_avFrameNumber == 0) {
+        g_avActualRecordingStartTime = [[NSDate date] timeIntervalSince1970] * 1000; // milliseconds
+        MRLog(@"🎞️ AVFoundation first frame written (actual start time: %.0f)", g_avActualRecordingStartTime);
+      }
       g_avFrameNumber++;
     } else {
       NSLog(@"⚠️ Failed to append pixel buffer");
@@ -401,8 +415,8 @@ extern "C" bool startAVFoundationRecording(const std::string& outputPath,
   dispatch_resume(g_avTimer);
   g_avIsRecording = true;

-  MRLog(@"🎥 AVFoundation recording started: %dx%d @ 10fps",
-        (int)recordingSize.width, (int)recordingSize.height);
+  MRLog(@"🎥 AVFoundation recording started: %dx%d @ %.0ffps",
+        (int)recordingSize.width, (int)recordingSize.height, fps);

   return true;

@@ -473,6 +487,7 @@ extern "C" bool stopAVFoundationRecording() {
   g_avVideoInput = nil;
   g_avPixelBufferAdaptor = nil;
   g_avFrameNumber = 0;
+  g_avActualRecordingStartTime = 0;

   MRLog(@"✅ AVFoundation recording stopped");
   return true;
@@ -487,6 +502,11 @@ extern "C" bool isAVFoundationRecording() {
   return g_avIsRecording;
 }

+// SYNC FIX: Get actual recording start time for AVFoundation
+extern "C" NSTimeInterval getAVFoundationActualStartTime() {
+  return g_avActualRecordingStartTime;
+}
+
 extern "C" NSString* getAVFoundationAudioPath() {
   return g_avAudioOutputPath;
 }
@@ -1032,7 +1032,9 @@ Napi::Value StartCursorTracking(const Napi::CallbackInfo& info) {
     // Use NSTimer (runs on the main thread)
     g_timerTarget = [[CursorTimerTarget alloc] init];

-    g_cursorTimer = [NSTimer timerWithTimeInterval:0.05 // 50ms (20 FPS)
+    // SYNC FIX: Match screen recording frame rate (60 FPS = 16.67ms)
+    // This ensures cursor tracking is synchronized with video frames
+    g_cursorTimer = [NSTimer timerWithTimeInterval:0.01667 // 16.67ms (60 FPS) - matches screen recording
                                             target:g_timerTarget
                                           selector:@selector(timerCallback:)
                                           userInfo:nil
@@ -121,12 +121,27 @@ static void initializeSafeQueue() {
     SCDisplay *targetDisplay = nil;

     if (displayId) {
+        // First, try matching by real CGDirectDisplayID
         for (SCDisplay *display in content.displays) {
             if (display.displayID == [displayId unsignedIntValue]) {
                 targetDisplay = display;
                 break;
             }
         }
+
+        // If not matched, treat the provided value as an index (0-based or 1-based)
+        if (!targetDisplay && content.displays.count > 0) {
+            NSUInteger count = content.displays.count;
+            NSUInteger idx0 = (NSUInteger)[displayId unsignedIntValue];
+            if (idx0 < count) {
+                targetDisplay = content.displays[idx0];
+            } else if ([displayId unsignedIntegerValue] > 0) {
+                NSUInteger idx1 = [displayId unsignedIntegerValue] - 1;
+                if (idx1 < count) {
+                    targetDisplay = content.displays[idx1];
+                }
+            }
+        }
     }

     if (!targetDisplay && content.displays.count > 0) {
@@ -154,9 +169,45 @@ static void initializeSafeQueue() {
     }

     // Video configuration
-    config.width = 1920;
-    config.height = 1080;
-    config.minimumFrameInterval = CMTimeMake(1, 30); // 30 FPS
+    // Prefer the target display's native resolution when available
+    if (filter && [filter isKindOfClass:[SCContentFilter class]]) {
+        // Try to infer dimensions from selected display or capture area
+        NSDictionary *captureArea = options[@"captureArea"];
+        if (captureArea) {
+            config.width = (size_t)[captureArea[@"width"] doubleValue];
+            config.height = (size_t)[captureArea[@"height"] doubleValue];
+        } else {
+            // Find the selected display again to get dimensions
+            NSNumber *displayId = options[@"displayId"];
+            if (displayId) {
+                for (SCDisplay *display in content.displays) {
+                    if (display.displayID == [displayId unsignedIntValue]) {
+                        config.width = (size_t)display.width;
+                        config.height = (size_t)display.height;
+                        break;
+                    }
+                }
+            }
+        }
+    }
+
+    // Fallback default resolution if not set above
+    if (config.width == 0 || config.height == 0) {
+        config.width = 1920;
+        config.height = 1080;
+    }
+
+    // Frame rate from options (default 60)
+    NSInteger fps = 60;
+    if (options[@"frameRate"]) {
+        NSInteger v = [options[@"frameRate"] integerValue];
+        if (v > 0) {
+            if (v < 1) v = 1;
+            if (v > 120) v = 120;
+            fps = v;
+        }
+    }
+    config.minimumFrameInterval = CMTimeMake(1, (int)fps);
     config.queueDepth = 8;

     // Capture area if specified
@@ -19,9 +19,11 @@ extern "C" {
                                        bool includeMicrophone,
                                        bool includeSystemAudio,
                                        NSString* audioDeviceId,
-                                       NSString* audioOutputPath);
+                                       NSString* audioOutputPath,
+                                       double frameRate);
     bool stopAVFoundationRecording();
     bool isAVFoundationRecording();
+    NSTimeInterval getAVFoundationActualStartTime();
     NSString* getAVFoundationAudioPath();

     NSArray<NSDictionary *> *listCameraDevices();
@@ -204,6 +206,7 @@ Napi::Value StartRecording(const Napi::CallbackInfo& info) {
     NSString *cameraOutputPath = nil;
     int64_t sessionTimestamp = 0;
     NSString *audioOutputPath = nil;
+    double frameRate = 60.0;

     if (info.Length() > 1 && info[1].IsObject()) {
         Napi::Object options = info[1].As<Napi::Object>();
@@ -271,33 +274,57 @@ Napi::Value StartRecording(const Napi::CallbackInfo& info) {
         if (options.Has("sessionTimestamp") && options.Get("sessionTimestamp").IsNumber()) {
             sessionTimestamp = options.Get("sessionTimestamp").As<Napi::Number>().Int64Value();
         }
+
+        // Frame rate
+        if (options.Has("frameRate") && options.Get("frameRate").IsNumber()) {
+            double fps = options.Get("frameRate").As<Napi::Number>().DoubleValue();
+            if (fps > 0) {
+                // Clamp to reasonable range
+                if (fps < 1.0) fps = 1.0;
+                if (fps > 120.0) fps = 120.0;
+                frameRate = fps;
+            }
+        }

-        // Display ID
+        // Display ID (accepts either a real CGDirectDisplayID or an index [0-based or 1-based])
         if (options.Has("displayId") && !options.Get("displayId").IsNull()) {
             double displayIdNum = options.Get("displayId").As<Napi::Number>().DoubleValue();

-            // Use the display ID directly (not as an index)
-            // The JavaScript layer passes the actual CGDirectDisplayID
-            displayID = (CGDirectDisplayID)displayIdNum;
+            // First, assume the provided value is a real CGDirectDisplayID
+            CGDirectDisplayID candidateID = (CGDirectDisplayID)displayIdNum;

-            // Verify that this display ID is valid
-            uint32_t displayCount;
+            // Verify against active displays
+            uint32_t displayCount = 0;
             CGGetActiveDisplayList(0, NULL, &displayCount);
             if (displayCount > 0) {
                 CGDirectDisplayID *displays = (CGDirectDisplayID*)malloc(displayCount * sizeof(CGDirectDisplayID));
                 CGGetActiveDisplayList(displayCount, displays, &displayCount);

-                bool validDisplay = false;
+                bool matchedByID = false;
                 for (uint32_t i = 0; i < displayCount; i++) {
-                    if (displays[i] == displayID) {
-                        validDisplay = true;
+                    if (displays[i] == candidateID) {
+                        matchedByID = true;
+                        displayID = candidateID;
                         break;
                     }
                 }

-                if (!validDisplay) {
-                    // Fallback to main display if invalid ID provided
-                    displayID = CGMainDisplayID();
+                if (!matchedByID) {
+                    // Tolerant mapping: allow passing an index instead of a CGDirectDisplayID
+                    // Try 0-based index
+                    int idx0 = (int)displayIdNum;
+                    if (idx0 >= 0 && idx0 < (int)displayCount) {
+                        displayID = displays[idx0];
+                    } else {
+                        // Try 1-based index (common in user examples)
+                        int idx1 = (int)displayIdNum - 1;
+                        if (idx1 >= 0 && idx1 < (int)displayCount) {
+                            displayID = displays[idx1];
+                        } else {
+                            // Fallback to main display
+                            displayID = CGMainDisplayID();
+                        }
+                    }
                 }

                 free(displays);
@@ -400,6 +427,8 @@ Napi::Value StartRecording(const Napi::CallbackInfo& info) {
     if (sessionTimestamp != 0) {
         sckConfig[@"sessionTimestamp"] = @(sessionTimestamp);
     }
+    // Pass the requested frame rate
+    sckConfig[@"frameRate"] = @(frameRate);

     if (!CGRectIsNull(captureRect)) {
         sckConfig[@"captureRect"] = @{
@@ -511,7 +540,8 @@ Napi::Value StartRecording(const Napi::CallbackInfo& info) {
                                        bool includeMicrophone,
                                        bool includeSystemAudio,
                                        NSString* audioDeviceId,
-                                       NSString* audioOutputPath);
+                                       NSString* audioOutputPath,
+                                       double frameRate);

     // CRITICAL SYNC FIX: Start camera BEFORE screen recording for perfect sync
     // This ensures both capture their first frame at approximately the same time
@@ -529,7 +559,8 @@ Napi::Value StartRecording(const Napi::CallbackInfo& info) {
     // Now start screen recording immediately after camera
     MRLog(@"🎯 SYNC: Starting screen recording immediately");
     bool avResult = startAVFoundationRecording(outputPath, displayID, windowID, captureRect,
-        captureCursor, includeMicrophone, includeSystemAudio, audioDeviceId, audioOutputPath);
+        captureCursor, includeMicrophone, includeSystemAudio,
+        audioDeviceId, audioOutputPath, frameRate);

     if (avResult) {
         MRLog(@"🎥 RECORDING METHOD: AVFoundation");
@@ -1027,6 +1058,26 @@ Napi::Value GetRecordingStatus(const Napi::CallbackInfo& info) {
     return Napi::Boolean::New(env, isRecording);
 }

+// SYNC FIX: Get actual recording start time (when the first frame was captured)
+Napi::Value GetActualRecordingStartTime(const Napi::CallbackInfo& info) {
+    Napi::Env env = info.Env();
+
+    NSTimeInterval startTime = 0;
+
+    // Check ScreenCaptureKit first
+    if (@available(macOS 12.3, *)) {
+        startTime = [ScreenCaptureKitRecorder getActualRecordingStartTime];
+    }
+
+    // Check AVFoundation if ScreenCaptureKit didn't return a time
+    if (startTime == 0) {
+        startTime = getAVFoundationActualStartTime();
+    }
+
+    // Return 0 if not started yet, otherwise the actual start time in milliseconds
+    return Napi::Number::New(env, startTime);
+}
+
 // NAPI Function: Get Window Thumbnail
 Napi::Value GetWindowThumbnail(const Napi::CallbackInfo& info) {
     Napi::Env env = info.Env();
@@ -1355,6 +1406,7 @@ Napi::Object Init(Napi::Env env, Napi::Object exports) {
     exports.Set(Napi::String::New(env, "getDisplays"), Napi::Function::New(env, GetDisplays));
     exports.Set(Napi::String::New(env, "getWindows"), Napi::Function::New(env, GetWindows));
     exports.Set(Napi::String::New(env, "getRecordingStatus"), Napi::Function::New(env, GetRecordingStatus));
+    exports.Set(Napi::String::New(env, "getActualRecordingStartTime"), Napi::Function::New(env, GetActualRecordingStartTime));
     exports.Set(Napi::String::New(env, "checkPermissions"), Napi::Function::New(env, CheckPermissions));

     // Thumbnail functions
@@ -15,5 +15,6 @@ API_AVAILABLE(macos(12.3))
 + (void)finalizeRecording;
 + (void)finalizeVideoWriter;
 + (void)cleanupVideoWriter;
++ (NSTimeInterval)getActualRecordingStartTime;

 @end
@@ -32,11 +32,15 @@ static BOOL g_audioWriterStarted = NO;
32
32
 
33
33
  static NSInteger g_configuredSampleRate = 48000;
34
34
  static NSInteger g_configuredChannelCount = 2;
35
+ static NSInteger g_targetFPS = 60;
35
36
 
36
37
  // Frame rate debugging
37
38
  static NSInteger g_frameCount = 0;
38
39
  static CFAbsoluteTime g_firstFrameTime = 0;
39
40
 
41
+ // SYNC FIX: Track actual recording start time (when first frame is captured)
42
+ static NSTimeInterval g_actualRecordingStartTime = 0;
43
+
40
44
  static void CleanupWriters(void);
41
45
  static AVAssetWriterInputPixelBufferAdaptor * _Nullable CurrentPixelBufferAdaptor(void) {
42
46
  if (!g_pixelBufferAdaptorRef) {
@@ -98,6 +102,7 @@ static void CleanupWriters(void) {
98
102
  // Reset frame counting
99
103
  g_frameCount = 0;
100
104
  g_firstFrameTime = 0;
105
+ g_actualRecordingStartTime = 0;
101
106
  }
102
107
 
103
108
  if (g_audioWriter) {
@@ -189,7 +194,12 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {
  [g_videoWriter startSessionAtSourceTime:presentationTime];
  g_videoStartTime = presentationTime;
  g_videoWriterStarted = YES;
- MRLog(@"🎞️ Video writer session started @ %.3f", CMTimeGetSeconds(presentationTime));
+
+ // SYNC FIX: Record the ACTUAL recording start time (when first frame is captured)
+ // This is the TRUE sync point - cursor tracking should use this timestamp
+ g_actualRecordingStartTime = [[NSDate date] timeIntervalSince1970] * 1000; // milliseconds
+ MRLog(@"🎞️ Video writer session started @ %.3f (actual start time: %.0f)",
+ CMTimeGetSeconds(presentationTime), g_actualRecordingStartTime);
  }

  if (!g_videoInput.readyForMoreMediaData) {
@@ -342,9 +352,9 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {

  NSDictionary *compressionProps = @{
  AVVideoAverageBitRateKey: @(bitrate),
- AVVideoMaxKeyFrameIntervalKey: @30,
+ AVVideoMaxKeyFrameIntervalKey: @(MAX(1, g_targetFPS)),
  AVVideoAllowFrameReorderingKey: @YES,
- AVVideoExpectedSourceFrameRateKey: @60,
+ AVVideoExpectedSourceFrameRateKey: @(MAX(1, g_targetFPS)),
  AVVideoQualityKey: @(0.95), // 0.0-1.0, higher is better (0.95 = excellent)
  AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel,
  AVVideoH264EntropyModeKey: AVVideoH264EntropyModeCABAC
@@ -524,6 +534,17 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {
  NSString *audioOutputPath = MRNormalizePath(config[@"audioOutputPath"]);
  NSNumber *sessionTimestampNumber = config[@"sessionTimestamp"];

+ // Extract requested frame rate
+ NSNumber *frameRateNumber = config[@"frameRate"];
+ if (frameRateNumber && [frameRateNumber respondsToSelector:@selector(intValue)]) {
+ NSInteger fps = [frameRateNumber intValue];
+ if (fps < 1) fps = 1;
+ if (fps > 120) fps = 120;
+ g_targetFPS = fps;
+ } else {
+ g_targetFPS = 60;
+ }
+
  MRLog(@"🎬 Starting PURE ScreenCaptureKit recording (NO AVFoundation)");
  MRLog(@"🔧 Config: cursor=%@ mic=%@ system=%@ display=%@ window=%@ crop=%@",
  captureCursor, includeMicrophone, includeSystemAudio, displayId, windowId, captureRect);
@@ -641,7 +662,7 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {
  SCStreamConfiguration *streamConfig = [[SCStreamConfiguration alloc] init];
  streamConfig.width = recordingWidth;
  streamConfig.height = recordingHeight;
- streamConfig.minimumFrameInterval = CMTimeMake(1, 60); // 60 FPS for smooth recording
+ streamConfig.minimumFrameInterval = CMTimeMake(1, (int)MAX(1, g_targetFPS));
  streamConfig.pixelFormat = kCVPixelFormatType_32BGRA;
  streamConfig.scalesToFit = NO;

@@ -650,7 +671,7 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {
  streamConfig.queueDepth = 8; // Larger queue for smoother capture
  }

- MRLog(@"🎬 ScreenCaptureKit config: %ldx%ld @ 60fps", (long)recordingWidth, (long)recordingHeight);
+ MRLog(@"🎬 ScreenCaptureKit config: %ldx%ld @ %ldfps", (long)recordingWidth, (long)recordingHeight, (long)g_targetFPS);

  BOOL shouldCaptureMic = includeMicrophone ? [includeMicrophone boolValue] : NO;
  BOOL shouldCaptureSystemAudio = includeSystemAudio ? [includeSystemAudio boolValue] : NO;
@@ -735,8 +756,8 @@ extern "C" NSString *ScreenCaptureKitCurrentAudioPath(void) {
  BOOL shouldShowCursor = captureCursor ? [captureCursor boolValue] : YES;
  streamConfig.showsCursor = shouldShowCursor;

- MRLog(@"🎥 Pure ScreenCapture config: %ldx%ld @ 30fps, cursor=%d",
- recordingWidth, recordingHeight, shouldShowCursor);
+ MRLog(@"🎥 Pure ScreenCapture config: %ldx%ld @ %ldfps, cursor=%d",
+ recordingWidth, recordingHeight, (long)g_targetFPS, shouldShowCursor);

  NSError *writerError = nil;
  if (![ScreenCaptureKitRecorder prepareVideoWriterWithWidth:recordingWidth height:recordingHeight error:&writerError]) {
@@ -933,9 +954,16 @@ BOOL isScreenCaptureKitCleaningUp() API_AVAILABLE(macos(12.3)) {
  g_isRecording = NO;
  g_isCleaningUp = NO; // Reset cleanup flag
  g_outputPath = nil;
+ g_actualRecordingStartTime = 0;

  MRLog(@"🧹 Pure ScreenCaptureKit cleanup complete");
  }
  }

+ // SYNC FIX: Get the actual recording start time (when first frame was captured)
+ // This is the TRUE sync point for cursor tracking
+ + (NSTimeInterval)getActualRecordingStartTime {
+ return g_actualRecordingStartTime;
+ }
+
  @end
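
Taken together, these hunks record the epoch time (in milliseconds) at which the first video frame is written and expose it to JavaScript as `getActualRecordingStartTime`. A minimal sketch of how a consumer might use that value to re-base absolute cursor-event timestamps onto the video timeline — the `CursorEvent` shape and the `rebaseCursorEvents` helper are illustrative assumptions, not part of this package:

```typescript
// Hypothetical shape of one tracked cursor event, timestamped with
// Date.now()-style epoch milliseconds when it was captured.
interface CursorEvent {
  x: number;
  y: number;
  timestamp: number; // ms since epoch
}

// actualStartMs is the value returned by the native binding
// getActualRecordingStartTime(): the epoch time (ms) of the first
// captured video frame, i.e. the true t=0 of the video.
function rebaseCursorEvents(
  events: CursorEvent[],
  actualStartMs: number
): CursorEvent[] {
  return events
    // Drop events captured before the first video frame existed.
    .filter((e) => e.timestamp >= actualStartMs)
    // Shift so each event's timestamp is an offset into the video.
    .map((e) => ({ ...e, timestamp: e.timestamp - actualStartMs }));
}
```

With this, an event captured 25 ms after the first frame maps to `timestamp: 25` on the video timeline, regardless of how long stream setup took before that frame arrived.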