kinemotion 0.10.6__py3-none-any.whl → 0.67.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.

This version of kinemotion might be problematic.

Files changed (48)
  1. kinemotion/__init__.py +31 -6
  2. kinemotion/api.py +39 -598
  3. kinemotion/cli.py +2 -0
  4. kinemotion/cmj/__init__.py +5 -0
  5. kinemotion/cmj/analysis.py +621 -0
  6. kinemotion/cmj/api.py +563 -0
  7. kinemotion/cmj/cli.py +324 -0
  8. kinemotion/cmj/debug_overlay.py +457 -0
  9. kinemotion/cmj/joint_angles.py +307 -0
  10. kinemotion/cmj/kinematics.py +360 -0
  11. kinemotion/cmj/metrics_validator.py +767 -0
  12. kinemotion/cmj/validation_bounds.py +341 -0
  13. kinemotion/core/__init__.py +28 -0
  14. kinemotion/core/auto_tuning.py +71 -37
  15. kinemotion/core/cli_utils.py +60 -0
  16. kinemotion/core/debug_overlay_utils.py +385 -0
  17. kinemotion/core/determinism.py +83 -0
  18. kinemotion/core/experimental.py +103 -0
  19. kinemotion/core/filtering.py +9 -6
  20. kinemotion/core/formatting.py +75 -0
  21. kinemotion/core/metadata.py +231 -0
  22. kinemotion/core/model_downloader.py +172 -0
  23. kinemotion/core/pipeline_utils.py +433 -0
  24. kinemotion/core/pose.py +298 -141
  25. kinemotion/core/pose_landmarks.py +67 -0
  26. kinemotion/core/quality.py +393 -0
  27. kinemotion/core/smoothing.py +250 -154
  28. kinemotion/core/timing.py +247 -0
  29. kinemotion/core/types.py +42 -0
  30. kinemotion/core/validation.py +201 -0
  31. kinemotion/core/video_io.py +135 -50
  32. kinemotion/dropjump/__init__.py +1 -1
  33. kinemotion/dropjump/analysis.py +367 -182
  34. kinemotion/dropjump/api.py +665 -0
  35. kinemotion/dropjump/cli.py +156 -466
  36. kinemotion/dropjump/debug_overlay.py +136 -206
  37. kinemotion/dropjump/kinematics.py +232 -255
  38. kinemotion/dropjump/metrics_validator.py +240 -0
  39. kinemotion/dropjump/validation_bounds.py +157 -0
  40. kinemotion/models/__init__.py +0 -0
  41. kinemotion/models/pose_landmarker_lite.task +0 -0
  42. kinemotion-0.67.0.dist-info/METADATA +726 -0
  43. kinemotion-0.67.0.dist-info/RECORD +47 -0
  44. {kinemotion-0.10.6.dist-info → kinemotion-0.67.0.dist-info}/WHEEL +1 -1
  45. kinemotion-0.10.6.dist-info/METADATA +0 -561
  46. kinemotion-0.10.6.dist-info/RECORD +0 -20
  47. {kinemotion-0.10.6.dist-info → kinemotion-0.67.0.dist-info}/entry_points.txt +0 -0
  48. {kinemotion-0.10.6.dist-info → kinemotion-0.67.0.dist-info}/licenses/LICENSE +0 -0
@@ -0,0 +1,726 @@
1
+ Metadata-Version: 2.4
2
+ Name: kinemotion
3
+ Version: 0.67.0
4
+ Summary: Video-based kinematic analysis for athletic performance
5
+ Project-URL: Homepage, https://github.com/feniix/kinemotion
6
+ Project-URL: Repository, https://github.com/feniix/kinemotion
7
+ Project-URL: Source, https://github.com/feniix/kinemotion
8
+ Project-URL: Issues, https://github.com/feniix/kinemotion/issues
9
+ Author-email: Sebastian Otaegui <feniix@gmail.com>
10
+ License: MIT
11
+ License-File: LICENSE
12
+ Keywords: athletic-performance,drop-jump,kinemetry,kinemotion,mediapipe,pose-tracking,video-analysis
13
+ Classifier: Development Status :: 3 - Alpha
14
+ Classifier: Intended Audience :: Science/Research
15
+ Classifier: License :: OSI Approved :: MIT License
16
+ Classifier: Programming Language :: Python :: 3
17
+ Classifier: Programming Language :: Python :: 3.10
18
+ Classifier: Programming Language :: Python :: 3.11
19
+ Classifier: Programming Language :: Python :: 3.12
20
+ Classifier: Topic :: Multimedia :: Video
21
+ Classifier: Topic :: Scientific/Engineering :: Image Recognition
22
+ Requires-Python: <3.13,>=3.10
23
+ Requires-Dist: click>=8.1.7
24
+ Requires-Dist: mediapipe>=0.10.30
25
+ Requires-Dist: numpy>=1.26.0
26
+ Requires-Dist: opencv-python>=4.9.0
27
+ Requires-Dist: platformdirs>=4.0.0
28
+ Requires-Dist: scipy>=1.11.0
29
+ Requires-Dist: typing-extensions>=4.15.0
30
+ Description-Content-Type: text/markdown
31
+
32
+ # Kinemotion
33
+
34
+ [![PyPI version](https://img.shields.io/pypi/v/kinemotion.svg)](https://pypi.org/project/kinemotion/)
35
+ [![Python Version](https://img.shields.io/pypi/pyversions/kinemotion.svg)](https://pypi.org/project/kinemotion/)
36
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
37
+
38
+ [![Tests](https://github.com/feniix/kinemotion/workflows/Test%20%26%20Quality/badge.svg)](https://github.com/feniix/kinemotion/actions)
39
+ [![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=feniix_kinemotion&metric=alert_status)](https://sonarcloud.io/summary/overall?id=feniix_kinemotion)
40
+ [![Coverage](https://sonarcloud.io/api/project_badges/measure?project=feniix_kinemotion&metric=coverage)](https://sonarcloud.io/summary/overall?id=feniix_kinemotion)
41
+ [![OpenSSF Best Practices](https://www.bestpractices.dev/projects/11506/badge)](https://www.bestpractices.dev/projects/11506)
42
+
43
+ [![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
44
+ [![Type checked with pyright](https://img.shields.io/badge/type%20checked-pyright-blue.svg)](https://github.com/microsoft/pyright)
45
+ [![pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit)](https://github.com/pre-commit/pre-commit)
46
+
47
+ > A video-based kinematic analysis tool for athletic performance. Analyzes vertical jump videos to estimate key performance metrics using MediaPipe pose tracking and advanced kinematics.
48
+
49
+ **Supported jump types:**
50
+
51
+ - **Drop Jump**: Ground contact time, flight time, reactive strength index
52
+ - **Counter Movement Jump (CMJ)**: Jump height, flight time, countermovement depth, triple extension biomechanics
53
+
54
+ ## Features
55
+
56
+ ### Core Features
57
+
58
+ - **Automatic pose tracking** using MediaPipe Pose landmarks
59
+ - **Derivative-based velocity** - smooth velocity calculation from position trajectory
60
+ - **Trajectory curvature analysis** - acceleration patterns for refined event detection
61
+ - **Sub-frame interpolation** - precise timing beyond frame boundaries
62
+ - **Intelligent auto-tuning** - automatic parameter optimization based on video characteristics
63
+ - **JSON output** for easy integration with other tools
64
+ - **Debug video overlays** with visual analysis
65
+ - **Batch processing** - CLI and Python API for parallel processing
66
+ - **Python library API** - use kinemotion programmatically
67
+ - **CSV export** - aggregated results for research
68
+
69
+ ### Drop Jump Analysis
70
+
71
+ - **Ground contact detection** based on foot velocity and position
72
+ - **Automatic drop jump detection** - identifies box → drop → landing → jump phases
73
+ - **Metrics**: Ground contact time, flight time, jump height (calculated from flight time)
74
+ - **Reactive strength index** calculations
75
+
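+ The reactive strength index can also be derived from the returned metrics yourself. Below is a minimal post-processing sketch using the documented `flight_time`, `ground_contact_time`, and `jump_height` attributes; it shows the two common RSI definitions, and which of them kinemotion reports internally is not specified here:
+
+ ```python
+ from kinemotion import process_dropjump_video
+
+ metrics = process_dropjump_video(video_path="athlete_jump.mp4", quality="balanced")
+
+ # Guard against missing phases, as the CSV export example below does.
+ if metrics.ground_contact_time and metrics.flight_time and metrics.jump_height:
+     rsi_time = metrics.flight_time / metrics.ground_contact_time      # flight time / contact time
+     rsi_height = metrics.jump_height / metrics.ground_contact_time    # jump height (m) / contact time (s)
+     print(f"RSI (flight/contact): {rsi_time:.2f}")
+     print(f"RSI (height/contact): {rsi_height:.2f} m/s")
+ ```
+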
76
+ ### Counter Movement Jump (CMJ) Analysis
77
+
78
+ - **Backward search algorithm** - robust phase detection from peak height
79
+ - **Flight time method** - force plate standard (h = g×t²/8); see the worked example after this list
80
+ - **Triple extension tracking** - ankle, knee, hip joint angles
81
+ - **Skeleton overlay** - biomechanical visualization
82
+ - **Metrics**: Jump height, flight time, countermovement depth, eccentric/concentric durations
83
+ - **Accuracy spot-check**: 50.6 cm jump (±1 frame precision); see Validation Status below
84
+
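+ As a quick illustration of the flight-time method, the height follows directly from the measured flight time. A minimal sketch (the 534 ms figure matches the sample CMJ output later in this README):
+
+ ```python
+ G = 9.81  # m/s²
+
+ def height_from_flight_time(flight_time_s: float) -> float:
+     """Force-plate convention: h = g * t² / 8."""
+     return G * flight_time_s**2 / 8
+
+ # A 534 ms flight time corresponds to roughly 0.35 m of jump height.
+ print(f"{height_from_flight_time(0.534):.3f} m")
+ ```
+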
85
+ ## ⚠️ Validation Status
86
+
87
+ **Current Status:** Pre-validation (not validated against force plates or motion capture systems)
88
+
89
+ ### What This Tool IS Suitable For
90
+
91
+ ✅ **Training monitoring** - Track relative changes within the same athlete over time
92
+ ✅ **Educational purposes** - Learn about jump biomechanics and video analysis
93
+ ✅ **Exploratory analysis** - Initial investigation before formal testing
94
+ ✅ **Proof-of-concept research** - Demonstrate feasibility of video-based methods
95
+
96
+ ### What This Tool IS NOT Suitable For
97
+
98
+ ❌ **Research publications** - As a validated measurement instrument
99
+ ❌ **Clinical decision-making** - Injury assessment, return-to-play decisions
100
+ ❌ **Talent identification** - Absolute performance comparisons between athletes
101
+ ❌ **Legal/insurance assessments** - Any context requiring validated measurements
102
+ ❌ **High-stakes testing** - Draft combines, professional athlete evaluation
103
+
104
+ ### Known Limitations
105
+
106
+ - **No force plate validation** - Accuracy claims are theoretical, not empirical
107
+ - **MediaPipe constraints** - Accuracy affected by lighting, clothing, occlusion, camera quality
108
+ - **Lower sampling rate** - Typical video runs at 30-60 fps vs. 120-240 fps capture in validated apps
109
+ - **Indirect measurement** - Landmarks → CoM estimation introduces potential error
110
+ - **No correction factors** - Unlike validated tools (e.g., MyJump), no systematic bias corrections applied
111
+
112
+ ### Recommended Use
113
+
114
+ If you need validated measurements for research or clinical use, consider:
115
+
116
+ - **Commercial validated apps**: MyJump 2, MyJumpLab (smartphone-based, force plate validated)
117
+ - **Laboratory equipment**: Force plates, optical motion capture systems
118
+ - **Validation testing**: Compare kinemotion against validated equipment in your specific use case
119
+
120
+ For detailed validation status and roadmap, see [`docs/validation-status.md`](docs/validation-status.md).
121
+
122
+ ## Setup
123
+
124
+ ### System Requirements
125
+
126
+ **All Platforms:**
127
+
128
+ - Python 3.10, 3.11, or 3.12
129
+
130
+ **Platform-Specific:**
131
+
132
+ #### Windows
133
+
134
+ **Required system dependencies:**
135
+
136
+ - [Microsoft Visual C++ 2022 Redistributable](https://visualstudio.microsoft.com/visual-cpp-build-tools/) - Runtime libraries for OpenCV/MediaPipe
137
+ - Python 3.10-3.12 (64-bit) - MediaPipe requires 64-bit Python
138
+
139
+ **Recommended for mobile video support:**
140
+
141
+ - [FFmpeg](https://ffmpeg.org/download.html) - Download and add to PATH for full video codec support
142
+
143
+ #### macOS
144
+
145
+ **Required system dependencies:**
146
+
147
+ - Xcode Command Line Tools - Provides compilers and system frameworks
148
+
149
+ ```bash
150
+ xcode-select --install
151
+ ```
152
+
153
+ **Recommended for mobile video support:**
154
+
155
+ ```bash
156
+ brew install ffmpeg
157
+ ```
158
+
159
+ #### Linux (Ubuntu/Debian)
160
+
161
+ **Recommended system libraries:**
162
+
163
+ ```bash
164
+ sudo apt-get update
165
+ # libgl1        - OpenGL library for OpenCV
+ # libglib2.0-0  - GLib library for MediaPipe
+ # libgomp1      - OpenMP library for multi-threading
+ # ffmpeg        - video codec support and metadata extraction
+ sudo apt-get install -y libgl1 libglib2.0-0 libgomp1 ffmpeg
170
+ ```
171
+
172
+ **Note:** `ffmpeg` provides the `ffprobe` tool for video metadata extraction (rotation, aspect ratio). Kinemotion works without it, but mobile/rotated videos may not process correctly. A warning will be shown if `ffprobe` is not available.
173
+
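+ To confirm up front that `ffprobe` is on the PATH (for example before a long batch run), a quick standard-library check from Python:
+
+ ```python
+ import shutil
+
+ # ffprobe ships with ffmpeg; without it, rotated mobile videos may be misread.
+ ffprobe = shutil.which("ffprobe")
+ print(f"ffprobe found at {ffprobe}" if ffprobe else "ffprobe not found - install ffmpeg")
+ ```
+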
174
+ ### Installation Methods
175
+
176
+ #### From PyPI (Recommended)
177
+
178
+ ```bash
179
+ pip install kinemotion
180
+ ```
181
+
182
+ #### From Source (Development)
183
+
184
+ **Step 1:** Install asdf plugins (if not already installed):
185
+
186
+ ```bash
187
+ asdf plugin add python
188
+ asdf plugin add uv
189
+ ```
190
+
191
+ **Step 2:** Install versions specified in `.tool-versions`:
192
+
193
+ ```bash
194
+ asdf install
195
+ ```
196
+
197
+ **Step 3:** Install project dependencies using uv:
198
+
199
+ ```bash
200
+ uv sync
201
+ ```
202
+
203
+ This will install all dependencies and make the `kinemotion` command available.
204
+
205
+ ## Usage
206
+
207
+ Kinemotion supports two jump types with intelligent auto-tuning that automatically optimizes parameters based on video characteristics.
208
+
209
+ ### Analyzing Drop Jumps
210
+
211
+ Analyzes reactive strength and ground contact time:
212
+
213
+ ```bash
214
+ # Automatic parameter tuning based on video characteristics
215
+ kinemotion dropjump-analyze video.mp4
216
+ ```
217
+
218
+ ### Analyzing CMJ
219
+
220
+ Analyzes jump height and biomechanics:
221
+
222
+ ```bash
223
+ # No drop height needed (floor level)
224
+ kinemotion cmj-analyze video.mp4
225
+
226
+ # With triple extension visualization
227
+ kinemotion cmj-analyze video.mp4 --output debug.mp4
228
+ ```
229
+
230
+ ### Common Options (Both Jump Types)
231
+
232
+ ```bash
233
+ # Save metrics to JSON
234
+ kinemotion cmj-analyze video.mp4 --json-output results.json
235
+
236
+ # Generate debug video
237
+ kinemotion cmj-analyze video.mp4 --output debug.mp4
238
+
239
+ # Complete analysis with all outputs
240
+ kinemotion cmj-analyze video.mp4 \
241
+ --output debug.mp4 \
242
+ --json-output results.json \
243
+ --verbose
244
+ ```
245
+
246
+ ### Quality Presets
247
+
248
+ ```bash
249
+ # Fast (50% faster, good for batch)
250
+ kinemotion cmj-analyze video.mp4 --quality fast
251
+
252
+ # Balanced (default)
253
+ kinemotion cmj-analyze video.mp4 --quality balanced
254
+
255
+ # Accurate (research-grade)
256
+ kinemotion cmj-analyze video.mp4 --quality accurate --verbose
257
+ ```
258
+
259
+ ### Batch Processing
260
+
261
+ Process multiple videos in parallel:
262
+
263
+ ```bash
264
+ # Drop jumps
265
+ kinemotion dropjump-analyze videos/*.mp4 --batch --workers 4
266
+
267
+ # CMJ with output directories
268
+ kinemotion cmj-analyze videos/*.mp4 --batch --workers 4 \
269
+ --json-output-dir results/ \
270
+ --csv-summary summary.csv
271
+ ```
272
+
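+ The same batch run can be driven from the Python API (see the Python API section below). A small sketch that builds one config per video from a glob; the `videos/` directory name and quality value are illustrative:
+
+ ```python
+ from pathlib import Path
+
+ from kinemotion import DropJumpVideoConfig, process_dropjump_videos_bulk
+
+ # One config per video; max_workers mirrors the --workers CLI flag above.
+ configs = [
+     DropJumpVideoConfig(str(p), quality="balanced")
+     for p in sorted(Path("videos").glob("*.mp4"))
+ ]
+ results = process_dropjump_videos_bulk(configs, max_workers=4)
+
+ for r in results:
+     if r.success and r.metrics and r.metrics.flight_time:
+         print(f"{Path(r.video_path).name}: flight {r.metrics.flight_time * 1000:.1f} ms")
+     else:
+         print(f"{Path(r.video_path).name}: check quality warnings")
+ ```
+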
273
+ ### Quality Assessment
274
+
275
+ All analysis outputs include automatic quality assessment in the metadata section to help you know when to trust results:
276
+
277
+ ```json
278
+ {
279
+ "data": {
280
+ "jump_height_m": 0.352,
281
+ "flight_time_ms": 534.2
282
+ },
283
+ "metadata": {
284
+ "quality": {
285
+ "confidence": "high",
286
+ "quality_score": 87.3,
287
+ "quality_indicators": {
288
+ "avg_visibility": 0.89,
289
+ "min_visibility": 0.82,
290
+ "tracking_stable": true,
291
+ "phase_detection_clear": true,
292
+ "outliers_detected": 2,
293
+ "outlier_percentage": 1.5,
294
+ "position_variance": 0.0008,
295
+ "fps": 60.0
296
+ },
297
+ "warnings": []
298
+ }
299
+ }
300
+ }
301
+ ```
302
+
303
+ **Confidence Levels:**
304
+
305
+ - **High** (score ≥75): Trust these results, good tracking quality
306
+ - **Medium** (score 50-74): Use with caution, check quality indicators
307
+ - **Low** (score \<50): Results may be unreliable, review warnings
308
+
309
+ **Common Warnings:**
310
+
311
+ - Poor lighting or occlusion detected
312
+ - Unstable landmark tracking (jitter)
313
+ - High outlier rate (tracking glitches)
314
+ - Low frame rate (\<30fps)
315
+ - Unclear phase transitions
316
+
317
+ **Filtering by Quality:**
318
+
319
+ ```python
320
+ from kinemotion import process_cmj_video
+
+ # Only use high-confidence results
+ metrics = process_cmj_video("video.mp4")
322
+ if metrics.quality_assessment is not None and metrics.quality_assessment.confidence == "high":
323
+ print(f"Reliable jump height: {metrics.jump_height:.3f}m")
324
+ elif metrics.quality_assessment is not None:
325
+ print(f"Low quality - warnings: {metrics.quality_assessment.warnings}")
326
+ ```
327
+
328
+ ## Python API
329
+
330
+ Use kinemotion as a library for automated pipelines and custom analysis.
331
+
332
+ ### Drop Jump API
333
+
334
+ ```python
335
+ from kinemotion import process_dropjump_video
336
+
337
+ # Process a single video
338
+ metrics = process_dropjump_video(
339
+ video_path="athlete_jump.mp4",
340
+ quality="balanced",
341
+ verbose=True
342
+ )
343
+
344
+ # Access results
345
+ print(f"Jump height: {metrics.jump_height:.3f} m")
346
+ print(f"Ground contact time: {metrics.ground_contact_time * 1000:.1f} ms")
347
+ print(f"Flight time: {metrics.flight_time * 1000:.1f} ms")
348
+ ```
349
+
350
+ ### Bulk Video Processing
351
+
352
+ ```python
353
+ # Drop jump bulk processing
354
+ from kinemotion import DropJumpVideoConfig, process_dropjump_videos_bulk
355
+
356
+ configs = [
357
+ DropJumpVideoConfig("video1.mp4", quality="balanced"),
358
+ DropJumpVideoConfig("video2.mp4", quality="accurate"),
359
+ ]
360
+
361
+ results = process_dropjump_videos_bulk(configs, max_workers=4)
362
+
363
+ # CMJ bulk processing
364
+ from kinemotion import CMJVideoConfig, process_cmj_videos_bulk
365
+
366
+ cmj_configs = [
367
+ CMJVideoConfig("cmj1.mp4"),
368
+ CMJVideoConfig("cmj2.mp4", quality="accurate"),
369
+ ]
370
+
371
+ cmj_results = process_cmj_videos_bulk(cmj_configs, max_workers=4)
372
+
373
+ for result in cmj_results:
374
+ if result.success:
375
+ print(f"{result.video_path}: {result.metrics.jump_height*100:.1f}cm")
376
+ ```
377
+
378
+ See `examples/bulk/README.md` for comprehensive API documentation.
379
+
380
+ ### CMJ-Specific Features
381
+
382
+ ```python
383
+ from kinemotion import process_cmj_video
+
+ # Triple extension angles available in metrics
+ metrics = process_cmj_video("video.mp4", output_video="debug.mp4")
385
+
386
+ # Debug video shows:
387
+ # - Skeleton overlay (foot→shin→femur→trunk)
388
+ # - Joint angles (ankle, knee, hip, trunk)
389
+ # - Phase-coded visualization
390
+ ```
391
+
392
+ ### CSV Export Example
393
+
394
+ ```python
395
+ # See examples/bulk/ for complete CSV export examples
396
+ import csv
+ from pathlib import Path
+
+ from kinemotion import process_cmj_video
+
399
+ # ... process videos ...
400
+ with open("results.csv", "w", newline="") as f:
401
+ writer = csv.writer(f)
402
+ writer.writerow(["Video", "GCT (ms)", "Flight (ms)", "Jump (m)"])
403
+
404
+ for r in results:
405
+ if r.success and r.metrics:
406
+ writer.writerow([
407
+ Path(r.video_path).name,
408
+ f"{r.metrics.ground_contact_time * 1000:.1f}" if r.metrics.ground_contact_time else "N/A",
409
+ f"{r.metrics.flight_time * 1000:.1f}" if r.metrics.flight_time else "N/A",
410
+ f"{r.metrics.jump_height:.3f}" if r.metrics.jump_height else "N/A",
411
+ ])
412
+ ```
413
+
414
+ **See [examples/bulk/README.md](examples/bulk/README.md) for comprehensive API documentation and more examples.**
415
+
416
+ ## Configuration Options
417
+
418
+ ### Intelligent Auto-Tuning
419
+
420
+ Kinemotion automatically optimizes parameters based on your video:
421
+
422
+ - **FPS-based scaling**: 30fps, 60fps, 120fps videos use different thresholds automatically (illustrated after this list)
423
+ - **Quality-based adjustments**: Adapts smoothing based on MediaPipe tracking confidence
424
+ - **Always enabled**: Outlier rejection, curvature analysis, drop start detection
425
+
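+ As an illustration of why FPS-based scaling matters: a velocity threshold expressed in normalized units per frame has to shrink as the frame rate grows, or high-FPS footage never crosses it. This is not kinemotion's actual tuning logic (which lives in `kinemotion/core/auto_tuning.py`), and the reference numbers below are hypothetical:
+
+ ```python
+ # Hypothetical reference values - for illustration only.
+ REFERENCE_FPS = 30.0
+ REFERENCE_THRESHOLD = 0.02  # normalized units per frame at 30fps
+
+ def scaled_velocity_threshold(fps: float) -> float:
+     """Hold the threshold constant in units-per-second by rescaling the per-frame value."""
+     return REFERENCE_THRESHOLD * (REFERENCE_FPS / fps)
+
+ for fps in (30.0, 60.0, 120.0):
+     print(f"{fps:.0f} fps -> {scaled_velocity_threshold(fps):.4f} per frame")
+ ```
+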
426
+ ### Parameters
427
+
428
+ All parameters are optional. Kinemotion uses intelligent auto-tuning to select optimal settings based on video characteristics.
429
+
430
+ - `--quality [fast|balanced|accurate]` (default: balanced)
431
+
432
+ - **fast**: Quick analysis, less precise (~50% faster)
433
+ - **balanced**: Good accuracy/speed tradeoff (recommended)
434
+ - **accurate**: Research-grade analysis, slower (maximum precision)
435
+
436
+ - `--verbose` / `-v`
437
+
438
+ - Show auto-selected parameters and analysis details
439
+ - Useful for understanding what the tool is doing
440
+
441
+ - `--output <path>` / `-o`
442
+
443
+ - Generate annotated debug video with pose tracking visualization
444
+
445
+ - `--json-output <path>` / `-j`
446
+
447
+ - Save metrics to JSON file instead of stdout
448
+
449
+ ### Expert Overrides (Rarely Needed)
450
+
451
+ For advanced users who need manual control:
452
+
453
+ - `--drop-start-frame <int>`: Manually specify where drop begins (if auto-detection fails)
454
+ - `--smoothing-window <int>`: Override auto-tuned smoothing window
455
+ - `--velocity-threshold <float>`: Override auto-tuned velocity threshold
456
+ - `--min-contact-frames <int>`: Override auto-tuned minimum contact frames
457
+ - `--visibility-threshold <float>`: Override visibility threshold
458
+ - `--detection-confidence <float>`: Override MediaPipe detection confidence
459
+ - `--tracking-confidence <float>`: Override MediaPipe tracking confidence
460
+
461
+ > **📖 For detailed parameter explanations, see [docs/reference/parameters.md](docs/reference/parameters.md)**
462
+ >
463
+ > **Note:** Most users never need expert parameters - auto-tuning handles optimization automatically!
464
+
465
+ ## Output Format
466
+
467
+ ### Drop Jump JSON Output
468
+
469
+ ```json
470
+ {
471
+ "data": {
472
+ "ground_contact_time_ms": 245.67,
473
+ "flight_time_ms": 456.78,
474
+ "jump_height_m": 0.339,
475
+ "jump_height_kinematic_m": 0.339,
476
+ "jump_height_trajectory_normalized": 0.0845,
477
+ "contact_start_frame": 45,
478
+ "contact_end_frame": 67,
479
+ "flight_start_frame": 68,
480
+ "flight_end_frame": 95,
481
+ "peak_height_frame": 82,
482
+ "contact_start_frame_precise": 45.234,
483
+ "contact_end_frame_precise": 67.891,
484
+ "flight_start_frame_precise": 68.123,
485
+ "flight_end_frame_precise": 94.567
486
+ },
487
+ "metadata": {
488
+ "quality": { },
489
+ "processing_info": { }
490
+ }
491
+ }
492
+ ```
493
+
494
+ **Data Fields**:
495
+
496
+ - `ground_contact_time_ms`: Duration of ground contact phase in milliseconds
497
+ - `flight_time_ms`: Duration of flight phase in milliseconds
498
+ - `jump_height_m`: Jump height calculated from flight time: h = g × t² / 8
499
+ - `jump_height_kinematic_m`: Kinematic estimate (same as `jump_height_m`)
500
+ - `jump_height_trajectory_normalized`: Position-based measurement in normalized coordinates (0-1 range)
501
+ - `contact_start_frame`: Frame index where contact begins (integer, for visualization)
502
+ - `contact_end_frame`: Frame index where contact ends (integer, for visualization)
503
+ - `flight_start_frame`: Frame index where flight begins (integer, for visualization)
504
+ - `flight_end_frame`: Frame index where flight ends (integer, for visualization)
505
+ - `peak_height_frame`: Frame index at maximum jump height (integer, for visualization)
506
+ - `contact_start_frame_precise`: Sub-frame precise timing for contact start (fractional, for calculations)
507
+ - `contact_end_frame_precise`: Sub-frame precise timing for contact end (fractional, for calculations)
508
+ - `flight_start_frame_precise`: Sub-frame precise timing for flight start (fractional, for calculations)
509
+ - `flight_end_frame_precise`: Sub-frame precise timing for flight end (fractional, for calculations)
510
+
511
+ **Note**: Integer frame indices are provided for visualization in debug videos. Precise fractional frames are used for all timing calculations and provide sub-frame accuracy (±10ms at 30fps).
512
+
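+ A short sketch of consuming this output downstream, assuming it was saved with `--json-output results.json` and that the quality block follows the structure shown in the Quality Assessment section (which reports the video's fps):
+
+ ```python
+ import json
+
+ with open("results.json") as f:
+     report = json.load(f)
+
+ data = report["data"]
+ fps = report["metadata"]["quality"]["quality_indicators"]["fps"]
+
+ # Recompute phase durations from the sub-frame (fractional) indices.
+ contact_ms = (data["contact_end_frame_precise"] - data["contact_start_frame_precise"]) / fps * 1000
+ flight_ms = (data["flight_end_frame_precise"] - data["flight_start_frame_precise"]) / fps * 1000
+
+ print(f"contact {contact_ms:.1f} ms (reported {data['ground_contact_time_ms']:.1f} ms)")
+ print(f"flight  {flight_ms:.1f} ms (reported {data['flight_time_ms']:.1f} ms)")
+ ```
+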
513
+ ### CMJ JSON Output
514
+
515
+ ```json
516
+ {
517
+ "data": {
518
+ "jump_height_m": 0.352,
519
+ "flight_time_ms": 534.2,
520
+ "countermovement_depth_m": 0.285,
521
+ "eccentric_duration_ms": 612.5,
522
+ "concentric_duration_ms": 321.8,
523
+ "total_movement_time_ms": 934.3,
524
+ "peak_eccentric_velocity_m_s": -2.145,
525
+ "peak_concentric_velocity_m_s": 3.789,
526
+ "transition_time_ms": 125.4,
527
+ "standing_start_frame": 12.5,
528
+ "lowest_point_frame": 45.2,
529
+ "takeoff_frame": 67.8,
530
+ "landing_frame": 102.3,
531
+ "tracking_method": "foot"
532
+ },
533
+ "metadata": {
534
+ "quality": { },
535
+ "processing_info": { }
536
+ }
537
+ }
538
+ ```
539
+
540
+ **Data Fields**:
541
+
542
+ - `jump_height_m`: Jump height calculated from flight time: h = g × t² / 8
543
+ - `flight_time_ms`: Duration of flight phase in milliseconds
544
+ - `countermovement_depth_m`: Maximum downward displacement during eccentric (descent) phase
545
+ - `eccentric_duration_ms`: Time from start of countermovement to lowest point
546
+ - `concentric_duration_ms`: Time from lowest point to takeoff
547
+ - `total_movement_time_ms`: Total time from countermovement start to takeoff
548
+ - `peak_eccentric_velocity_m_s`: Maximum downward velocity during descent (negative value)
549
+ - `peak_concentric_velocity_m_s`: Maximum upward velocity during propulsion (positive value)
550
+ - `transition_time_ms`: Duration at lowest point (amortization phase between descent and propulsion)
551
+ - `standing_start_frame`: Frame where standing phase ends and countermovement begins
552
+ - `lowest_point_frame`: Frame at the lowest point of the countermovement
553
+ - `takeoff_frame`: Frame where athlete leaves ground
554
+ - `landing_frame`: Frame where athlete lands after jump
555
+ - `tracking_method`: Tracking method used - "foot" (foot landmarks) or "com" (center of mass estimation)
556
+
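+ These fields make simple derived quantities easy to compute downstream. A sketch assuming the output was saved as `cmj.json` (the eccentric-to-concentric ratio is a common follow-up metric, not a kinemotion output field):
+
+ ```python
+ import json
+
+ with open("cmj.json") as f:
+     data = json.load(f)["data"]
+
+ ecc = data["eccentric_duration_ms"]
+ con = data["concentric_duration_ms"]
+
+ print(f"jump height:    {data['jump_height_m'] * 100:.1f} cm")
+ print(f"depth:          {data['countermovement_depth_m'] * 100:.1f} cm")
+ print(f"ecc:con ratio:  {ecc / con:.2f}")
+ print(f"total check:    {data['total_movement_time_ms']:.0f} ms vs ecc+con {ecc + con:.0f} ms")
+ ```
+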
557
+ ### Debug Video
558
+
559
+ The debug video includes:
560
+
561
+ - **Green circle**: Average foot position when on ground
562
+ - **Red circle**: Average foot position when in air
563
+ - **Yellow circles**: Individual foot landmarks (ankles, heels)
564
+ - **State indicator**: Current contact state (on_ground/in_air)
565
+ - **Phase labels**: "GROUND CONTACT" and "FLIGHT PHASE" during relevant periods
566
+ - **Peak marker**: "PEAK HEIGHT" at maximum jump height
567
+ - **Frame number**: Current frame index
568
+
569
+ ## Troubleshooting
570
+
571
+ ### Poor Tracking Quality
572
+
573
+ **Symptoms**: Erratic landmark positions, missing detections, incorrect contact states
574
+
575
+ **Solutions**:
576
+
577
+ 1. **Check video quality**: Ensure the athlete is clearly visible in profile view
578
+ 1. **Increase smoothing**: Use `--smoothing-window 7` or higher
579
+ 1. **Adjust detection confidence**: Try `--detection-confidence 0.6` or `--tracking-confidence 0.6`
580
+ 1. **Generate debug video**: Use `--output` to visualize what's being tracked
581
+
582
+ ### No Pose Detected
583
+
584
+ **Symptoms**: "No frames processed" error or all null landmarks
585
+
586
+ **Solutions**:
587
+
588
+ 1. **Verify video format**: OpenCV must be able to read the video
589
+ 1. **Check framing**: Ensure full body is visible in side view
590
+ 1. **Lower confidence thresholds**: Try `--detection-confidence 0.3 --tracking-confidence 0.3`
591
+ 1. **Test video playback**: Verify video opens correctly with standard video players
592
+
593
+ ### Incorrect Contact Detection
594
+
595
+ **Symptoms**: Wrong ground contact times, flight phases not detected
596
+
597
+ **Solutions**:
598
+
599
+ 1. **Generate debug video**: Visualize contact states to diagnose the issue
600
+ 1. **Adjust velocity threshold**:
601
+ - If missing contacts: decrease to `--velocity-threshold 0.01`
602
+ - If false contacts: increase to `--velocity-threshold 0.03`
603
+ 1. **Adjust minimum frames**: `--min-contact-frames 5` for longer required contact
604
+ 1. **Check visibility**: Lower `--visibility-threshold 0.3` if feet are partially obscured
605
+
606
+ ### Jump Height Seems Wrong
607
+
608
+ **Symptoms**: Unrealistic jump height values
609
+
610
+ **Solutions**:
611
+
612
+ 1. **Check video quality**: Ensure video frame rate is adequate (30fps or higher recommended)
613
+ 1. **Verify flight time detection**: Check `flight_start_frame` and `flight_end_frame` in JSON
614
+ 1. **Compare measurements**: JSON output includes both `jump_height_m` (primary) and `jump_height_kinematic_m` (kinematic-only)
615
+ 1. **Check for drop jump detection**: If doing a drop jump, ensure first phase is elevated enough (>5% of frame height)
616
+
617
+ ### Video Codec Issues
618
+
619
+ **Symptoms**: Cannot write debug video or corrupted output
620
+
621
+ **Solutions**:
622
+
623
+ 1. **Install additional codecs**: Ensure OpenCV has proper video codec support
624
+ 1. **Try different output format**: Use `.avi` extension instead of `.mp4`
625
+ 1. **Check output path**: Ensure write permissions for output directory
626
+
627
+ ## How It Works
628
+
629
+ 1. **Pose Tracking**: MediaPipe extracts 2D pose landmarks (foot points: ankles, heels, foot indices) from each frame
630
+ 1. **Position Calculation**: Averages ankle, heel, and foot index positions to determine foot location
631
+ 1. **Smoothing**: Savitzky-Golay filter reduces tracking jitter while preserving motion dynamics
632
+ 1. **Contact Detection**: Analyzes vertical position velocity to identify ground contact vs. flight phases
633
+ 1. **Phase Identification**: Finds continuous ground contact and flight periods
634
+ - Automatically detects drop jumps vs regular jumps
635
+ - For drop jumps: identifies box → drop → ground contact → jump sequence
636
+ 1. **Sub-Frame Interpolation**: Estimates exact transition times between frames (sketched in code after this list)
637
+ - Uses Savitzky-Golay derivative for smooth velocity calculation
638
+ - Linear interpolation of velocity to find threshold crossings
639
+ - Achieves sub-frame timing precision (at 30fps: ±10ms vs ±33ms)
640
+ - Reduces timing error by 60-70% for contact and flight measurements
641
+ - Smoother velocity curves eliminate false threshold crossings
642
+ 1. **Trajectory Curvature Analysis**: Refines transitions using acceleration patterns
643
+ - Computes second derivative (acceleration) from position trajectory
644
+ - Detects landing impact by acceleration spike
645
+ - Identifies takeoff by acceleration change patterns
646
+ - Provides independent validation and refinement of velocity-based detection
647
+ 1. **Metric Calculation**:
648
+ - Ground contact time = contact phase duration (using fractional frames)
649
+ - Flight time = flight phase duration (using fractional frames)
650
+ - Jump height = kinematic estimate from flight time: (g × t²) / 8
651
+
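+ A stripped-down sketch of the smoothing, velocity, and sub-frame interpolation steps described above - not kinemotion's actual implementation; the window, polynomial order, threshold, and synthetic trace are all placeholders:
+
+ ```python
+ import numpy as np
+ from scipy.signal import savgol_filter
+
+ def subframe_crossing(y: np.ndarray, threshold: float, window: int = 7) -> float | None:
+     """Fractional frame where smoothed |velocity| first drops below the threshold."""
+     # Savitzky-Golay derivative: smoothing and per-frame velocity estimation in one pass.
+     speed = np.abs(savgol_filter(y, window_length=window, polyorder=2, deriv=1))
+     for i in range(1, len(speed)):
+         if speed[i] < threshold <= speed[i - 1]:
+             # Linear interpolation between frames i-1 and i for the exact crossing.
+             return (i - 1) + (speed[i - 1] - threshold) / (speed[i - 1] - speed[i])
+     return None
+
+ # Synthetic normalized foot-height trace: descent, then stillness on the ground.
+ rng = np.random.default_rng(0)
+ y = np.concatenate([np.linspace(0.6, 0.8, 20), np.full(40, 0.8)]) + rng.normal(0, 0.001, 60)
+ frame = subframe_crossing(y, threshold=0.005)
+ print(f"contact begins near frame {frame:.2f}" if frame is not None else "no crossing found")
+ ```
+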
652
+ ## Development
653
+
654
+ ### Code Quality Standards
655
+
656
+ This project enforces strict code quality standards:
657
+
658
+ - **Type safety**: Full pyright strict mode compliance with complete type annotations
659
+ - **Linting**: Comprehensive ruff checks (pycodestyle, pyflakes, isort, pep8-naming, etc.)
660
+ - **Formatting**: Black code style
661
+ - **Testing**: pytest with 261 comprehensive tests (74.27% coverage)
662
+ - **PEP 561 compliant**: Includes py.typed marker for type checking support
663
+
664
+ ### Development Commands
665
+
666
+ ```bash
667
+ # Run the tool
668
+ uv run kinemotion dropjump-analyze <video_path>
669
+
670
+ # Run all tests
671
+ uv run pytest
672
+
673
+ # Run tests with verbose output
674
+ uv run pytest -v
675
+
676
+ # Format code
677
+ uv run black src/
678
+
679
+ # Lint code
680
+ uv run ruff check
681
+
682
+ # Auto-fix linting issues
683
+ uv run ruff check --fix
684
+
685
+ # Type check
686
+ uv run pyright
687
+
688
+ # Run all checks
689
+ uv run ruff check && uv run pyright && uv run pytest
690
+ ```
691
+
692
+ ### Contributing
693
+
694
+ Before committing code, ensure all checks pass:
695
+
696
+ 1. Format with Black
697
+ 1. Fix linting issues with ruff
698
+ 1. Ensure type safety with pyright
699
+ 1. Run all tests with pytest
700
+
701
+ See [CONTRIBUTING.md](CONTRIBUTING.md) for contribution guidelines and requirements, or [CLAUDE.md](CLAUDE.md) for detailed development guidelines.
702
+
703
+ ## Limitations
704
+
705
+ - **2D Analysis**: Only analyzes motion in the camera's view plane
706
+ - **Validation Status**: ⚠️ Accuracy has not been validated against gold standard measurements (force plates, 3D motion capture)
707
+ - **Side View Required**: Must film from the side to accurately track vertical motion
708
+ - **Single Athlete**: Designed for analyzing one athlete at a time
709
+ - **Timing precision**:
710
+ - 30fps videos: ±10ms with sub-frame interpolation (vs ±33ms without)
711
+ - 60fps videos: ±5ms with sub-frame interpolation (vs ±17ms without)
712
+ - Higher frame rates still beneficial for better temporal resolution
713
+ - **Drop jump detection**: Requires first ground phase to be >5% higher than second ground phase
714
+
715
+ ## Future Enhancements
716
+
717
+ - Advanced camera calibration (intrinsic parameters, lens distortion)
718
+ - Multi-angle analysis support
719
+ - Automatic camera orientation detection
720
+ - Real-time analysis from webcam
721
+ - Comparison with reference values
722
+ - Force plate integration for validation
723
+
724
+ ## License
725
+
726
+ MIT License - feel free to use for personal experiments and research.