kinemotion 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.


@@ -0,0 +1,62 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Virtual environments
venv/
ENV/
env/
.venv

# uv
.uv/
uv.lock

# IDEs
.vscode/
.idea/
*.swp
*.swo
*~

# Testing
.pytest_cache/
.coverage
htmlcov/

# Output files
*.mp4
*.avi
*.json
!pyproject.toml

# Logs
*.log
combined.log
error.log

# OS
.DS_Store
Thumbs.db

*.mp4
*.jpeg
*.jpg
@@ -0,0 +1,2 @@
uv 0.8.17
python 3.12.7
@@ -0,0 +1,548 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Repository Purpose

Kinemetry: a video-based kinematic analysis tool for athletic performance. It analyzes drop-jump videos to estimate ground contact time, flight time, and jump height by tracking the athlete's feet with MediaPipe pose tracking.

## Project Setup

### Dependencies

Managed with `uv` and `asdf`:
- Python version: 3.12.7 (specified in `.tool-versions`)
- **Important**: MediaPipe requires Python 3.12 or earlier (no 3.13 support yet)
- Install dependencies: `uv sync`
- Run CLI: `kinemetry dropjump-analyze <video.mp4>`

**Production dependencies:**
- click: CLI framework
- opencv-python: Video processing
- mediapipe: Pose detection and tracking
- numpy: Numerical operations
- scipy: Signal processing (Savitzky-Golay filter)

**Development dependencies:**
- pytest: Testing framework
- black: Code formatting
- ruff: Fast Python linter
- mypy: Static type checking

### Development Commands

- **Run tool**: `uv run kinemetry dropjump-analyze <video_path>`
- **Install/sync deps**: `uv sync`
- **Run tests**: `uv run pytest`
- **Run specific test**: `uv run pytest tests/test_aspect_ratio.py -v`
- **Format code**: `uv run black src/`
- **Lint code**: `uv run ruff check`
- **Auto-fix lint issues**: `uv run ruff check --fix`
- **Type check**: `uv run mypy src/dropjump`
- **Run all checks**: `uv run ruff check && uv run mypy src/dropjump && uv run pytest`

## Architecture

### Module Structure

```
src/dropjump/
├── cli.py               # Click-based CLI entry point
├── pose_tracker.py      # MediaPipe Pose integration
├── smoothing.py         # Savitzky-Golay landmark smoothing
├── contact_detection.py # Ground contact state detection
├── kinematics.py        # Metric calculations (contact time, flight time, jump height)
└── video_io.py          # Video processing and debug overlay rendering

tests/
├── test_contact_detection.py # Contact detection unit tests
├── test_kinematics.py        # Metrics calculation tests
└── test_aspect_ratio.py      # Aspect ratio preservation tests

docs/
└── PARAMETERS.md # Comprehensive guide to all CLI parameters
```

### Analysis Pipeline

1. **Pose Tracking** (pose_tracker.py): MediaPipe extracts foot landmarks (ankles, heels, foot indices) from each frame
2. **Smoothing** (smoothing.py): Savitzky-Golay filter reduces jitter while preserving dynamics
3. **Contact Detection** (contact_detection.py): Analyzes vertical foot velocity to classify ground contact vs. flight
4. **Phase Identification**: Finds continuous ground contact and flight periods
   - Automatically detects drop jumps vs. regular jumps
   - For drop jumps: identifies standing on box → drop → ground contact → jump
5. **Sub-Frame Interpolation** (contact_detection.py): Estimates exact transition times
   - Computes velocity from the Savitzky-Golay derivative (smoothing.py)
   - Linear interpolation of the smooth velocity to find threshold crossings
   - Returns fractional frame indices (e.g., 48.78 instead of 49)
   - Reduces timing error from ±33ms to ±10ms at 30fps (60-70% improvement)
   - Eliminates false threshold crossings caused by velocity noise
6. **Trajectory Curvature Analysis** (contact_detection.py): Refines transitions
   - Computes acceleration (second derivative) using the Savitzky-Golay filter
   - Detects landing events by acceleration spikes (impact deceleration)
   - Identifies takeoff events by acceleration changes
   - Blends curvature-based refinement with velocity-based estimates (70/30)
   - Provides independent validation based on physical motion patterns
7. **Metrics Calculation** (kinematics.py):
   - Ground contact time from phase duration (using fractional frames)
   - Flight time from phase duration (using fractional frames)
   - Jump height from position tracking with optional calibration
   - Fallback: kinematic estimate from flight time: h = (g × t²) / 8
8. **Output**: JSON metrics + optional debug video overlay

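
The fallback formula in step 7 comes from constant-acceleration projectile motion: the athlete rises for half the flight time, so h = ½ · g · (t/2)² = g · t² / 8. A standalone sanity check of the arithmetic (illustrative only, not the project's code):

```python
def jump_height_from_flight_time(flight_time_s: float, g: float = 9.81) -> float:
    """Kinematic fallback estimate: h = g * t^2 / 8 for total flight time t."""
    return g * flight_time_s**2 / 8.0


# A 0.5 s flight time corresponds to roughly a 0.31 m jump.
print(round(jump_height_from_flight_time(0.5), 4))  # 0.3066
```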
### Key Design Decisions

- **Normalized coordinates**: All positions use MediaPipe's 0-1 normalized coordinates (independent of video resolution)
- **Velocity-based contact detection**: More robust than absolute position thresholds
- **Configurable thresholds**: CLI flags allow tuning for different video qualities and athletes
- **Calibrated jump height**: Position-based measurement with drop-height calibration for accuracy
  - Optional `--drop-height` parameter uses the known drop box height to calibrate measurements
  - Achieves ~88% accuracy (vs. 71% with the kinematic-only method)
  - Falls back to an empirically corrected kinematic formula when no calibration is provided
- **Aspect ratio preservation**: Output video ALWAYS matches source video dimensions
  - Handles SAR (Sample Aspect Ratio) metadata from mobile videos
  - No hardcoded aspect ratios

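
The drop-height calibration above reduces to a scale factor: the known box height divided by the observed drop displacement in normalized coordinates. A minimal sketch, assuming the displacement values have already been extracted (function name and signature are hypothetical; the real logic lives in `kinematics.py` and may differ):

```python
def calibrated_jump_height(
    drop_displacement_norm: float,  # observed drop distance in normalized (0-1) coords
    jump_displacement_norm: float,  # observed jump rise in normalized coords
    drop_height_m: float,           # known box height, e.g. 0.40 for a 40 cm box
) -> float:
    """Scale a normalized displacement to meters using the known drop height."""
    meters_per_unit = drop_height_m / drop_displacement_norm
    return jump_displacement_norm * meters_per_unit


# If a 0.40 m drop spans 0.20 normalized units, a 0.15-unit rise is 0.30 m.
print(calibrated_jump_height(0.20, 0.15, 0.40))
```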
## Code Quality & Type Safety

The codebase enforces strict code quality standards using multiple tools:

### Type Checking with mypy

- **Strict mode enabled**: All functions require type annotations
- Configuration in `pyproject.toml` under `[tool.mypy]`
- Key settings:
  - `disallow_untyped_defs`: All functions must have complete type annotations
  - `disallow_incomplete_defs`: Partial type hints not allowed
  - `warn_return_any`: Warns on `Any` return types
- Third-party stubs: Ignores missing imports for cv2, mediapipe, scipy
- Run with: `uv run mypy src/dropjump`

### Linting with ruff

- **Comprehensive rule set**: pycodestyle, pyflakes, isort, pep8-naming, pyupgrade, flake8-bugbear, flake8-comprehensions
- Configuration in `pyproject.toml` under `[tool.ruff]`
- Key settings:
  - Line length: 100 characters
  - Target version: Python 3.11+
- Auto-fixable issues: Use `uv run ruff check --fix`
- Run with: `uv run ruff check`

### Code Formatting with black

- Consistent code style across the project
- Run with: `uv run black src/`

### When Contributing Code

Always run before committing:
```bash
# Format code
uv run black src/

# Check and fix linting issues
uv run ruff check --fix

# Type check
uv run mypy src/dropjump

# Run tests
uv run pytest
```

Or run all checks at once:
```bash
uv run ruff check && uv run mypy src/dropjump && uv run pytest
```

## Critical Implementation Details

### Aspect Ratio Preservation & SAR Handling (video_io.py)

**IMPORTANT**: The tool preserves the exact aspect ratio of the source video, including SAR (Sample Aspect Ratio) metadata. No dimensions are hardcoded.

#### VideoProcessor (`video_io.py:15-110`)

- Reads the **first actual frame** to get the true encoded dimensions (not OpenCV properties)
  - Critical for mobile videos with rotation metadata
  - `CAP_PROP_FRAME_WIDTH` and `CAP_PROP_FRAME_HEIGHT` can return incorrect dimensions
  - Always use `frame.shape[:2]` to get the actual (height, width)
- **SAR Metadata Extraction**: Uses `ffprobe` to extract Sample Aspect Ratio metadata
  - Many mobile videos use non-square pixels (e.g., 1080x1080 encoded, but 616x1080 display)
  - Calculates display dimensions: `display_width = width × SAR_width / SAR_height`
  - Falls back to encoded dimensions if ffprobe is unavailable or SAR = 1:1

```python
# Correct approach (current implementation)
ret, first_frame = self.cap.read()
if ret:
    self.height, self.width = first_frame.shape[:2]  # From actual frame data
```

**Never do this:**
```python
# Wrong - may return incorrect dimensions with rotated videos
self.width = int(self.cap.get(cv2.CAP_PROP_FRAME_WIDTH))
self.height = int(self.cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
```
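
The display-width computation described above can be sketched as follows. The `num:den` string format mirrors how ffprobe reports `sample_aspect_ratio` (including `0:1` for unknown SAR); the helper name is hypothetical:

```python
def display_width(encoded_width: int, sar: str) -> int:
    """Display width from encoded width and a SAR string such as '4:3'."""
    num, den = (int(part) for part in sar.split(":"))
    if num <= 0 or den <= 0 or num == den:
        return encoded_width  # unknown SAR (e.g. '0:1') or square pixels
    return round(encoded_width * num / den)


print(display_width(1440, "4:3"))  # anamorphic 1440 -> 1920 display
print(display_width(1920, "1:1"))  # square pixels, unchanged
```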

#### DebugOverlayRenderer (`video_io.py:130-330`)

- Creates the output video with **display dimensions** (respecting SAR)
- Resizes frames from encoded to display dimensions when needed (INTER_LANCZOS4)
- Output video uses square pixels (SAR 1:1) at the correct display size
- H.264 codec (avc1) with fallback to mp4v
- Runtime validation in `write_frame()` ensures every frame matches the expected encoded dimensions
- Raises `ValueError` if the aspect ratio would be corrupted

### Sub-Frame Interpolation (contact_detection.py:113-227)

**IMPORTANT**: The tool uses sub-frame interpolation with derivative-based velocity to achieve timing precision beyond frame boundaries.

#### Derivative-Based Velocity Calculation (smoothing.py:126-172)

Instead of simple frame-to-frame differences, velocity is computed as the derivative of the smoothed position trajectory using the Savitzky-Golay filter:

**Advantages:**
- **Smoother velocity curves**: Eliminates noise from frame-to-frame jitter
- **More accurate threshold crossings**: Clean transitions without false positives
- **Better interpolation**: Smoother velocity gradient for sub-frame precision
- **Consistent with smoothing**: Uses the same polynomial fit as position smoothing

**Implementation:**
```python
# OLD: Simple differences (noisy)
velocities = np.abs(np.diff(foot_positions, prepend=foot_positions[0]))

# NEW: Derivative of the smoothed trajectory (smooth)
velocities = savgol_filter(positions, window_length=5, polyorder=2, deriv=1, delta=1.0)
```
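
The difference is easy to see on synthetic data: for a locally quadratic trajectory, the Savitzky-Golay derivative recovers the true velocity exactly. A quick standalone check (assumes `scipy` is installed):

```python
import numpy as np
from scipy.signal import savgol_filter

t = np.arange(20, dtype=float)
positions = 0.5 * t**2  # quadratic trajectory whose true velocity is exactly t

velocities = savgol_filter(positions, window_length=5, polyorder=2, deriv=1, delta=1.0)

# The local quadratic fit reproduces the data, so the derivative matches t.
print(np.allclose(velocities, t))  # True
```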

**Key Function:**
- `compute_velocity_from_derivative()`: Computes the first derivative using the Savitzky-Golay filter

#### Sub-Frame Interpolation Algorithm

At 30fps, each frame represents 33.3ms. Contact events (landing, takeoff) rarely occur exactly at frame boundaries. Sub-frame interpolation estimates the exact moment between frames when the velocity crosses the threshold.

**Algorithm:**
1. Calculate smooth velocity using the derivative: `v = derivative(smooth_position)`
2. Find frames where velocity crosses the threshold (e.g., from 0.025 to 0.015, threshold 0.020)
3. Use linear interpolation to find the exact crossing point:
   ```python
   # If v[10] = 0.025 and v[11] = 0.015, threshold = 0.020
   t = (0.020 - 0.025) / (0.015 - 0.025)  # = 0.5
   # Crossing at frame 10.5
   ```
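
That step can be written as a small runnable helper (the project's `interpolate_threshold_crossing()` presumably does something similar; this name and signature are assumptions):

```python
def interpolate_crossing(v0: float, v1: float, threshold: float, frame: int) -> float:
    """Fractional frame index where velocity crosses `threshold` between
    consecutive frames with velocities v0 (at `frame`) and v1 (at `frame + 1`)."""
    if v0 == v1:  # flat segment: no crossing to interpolate
        raise ValueError("velocity does not change across the interval")
    t = (threshold - v0) / (v1 - v0)
    return frame + t


print(interpolate_crossing(0.025, 0.015, 0.020, frame=10))  # ~10.5
```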

**Key Functions:**
- `interpolate_threshold_crossing()`: Linear interpolation of the velocity crossing
- `find_interpolated_phase_transitions()`: Returns fractional frame indices for all phases

**Accuracy Improvement:**
```
30fps without interpolation: ±33ms (1 frame on each boundary)
30fps with interpolation:    ±10ms (sub-frame precision)
60fps without interpolation: ±17ms
60fps with interpolation:    ±5ms
```

**Velocity Comparison:**
```python
# Frame-to-frame differences: noisy, discontinuous jumps
v_simple = [0.01, 0.03, 0.02, 0.04, 0.02, 0.01]  # Jittery

# Derivative-based: smooth, continuous curve
v_deriv = [0.015, 0.022, 0.025, 0.024, 0.018, 0.012]  # Smooth
```

**Example:**
```python
# Integer frames: contact from frame 49 to 53 inclusive (5 frames ≈ 167ms at 30fps)
# With derivative velocity: contact from 49.0 to 53.0 (4 frame intervals ≈ 133ms)
# Result: cleaner threshold crossings, less sub-frame offset
```

### Trajectory Curvature Analysis (contact_detection.py:242-394)

**IMPORTANT**: The tool uses acceleration patterns (trajectory curvature) to refine event timing.

#### Acceleration-Based Event Detection (smoothing.py:175-223)

Acceleration (the second derivative) reveals characteristic patterns at contact events:

**Physical Patterns:**
- **Landing impact**: Large acceleration spike as the feet decelerate on impact
- **Takeoff**: Acceleration change as the body transitions from static to upward motion
- **In flight**: Constant acceleration (gravity ≈ -9.81 m/s²)
- **On ground**: Near-zero acceleration (stationary position)

**Implementation:**
```python
# Compute acceleration using the Savitzky-Golay second derivative
acceleration = savgol_filter(positions, window_length=5, polyorder=2, deriv=2, delta=1.0)

# Landing: find the maximum absolute acceleration (impact deceleration)
landing_frame = np.argmax(np.abs(acceleration[search_window]))

# Takeoff: find the maximum acceleration change (transition from static)
accel_change = np.abs(np.diff(acceleration))
takeoff_frame = np.argmax(accel_change[search_window])
```
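
The in-flight pattern is straightforward to verify on synthetic data: the second derivative of a parabolic arc is constant. A standalone check in frame-index units (assumes `scipy` is installed):

```python
import numpy as np
from scipy.signal import savgol_filter

frames = np.arange(30, dtype=float)
# Parabolic flight arc in normalized units; true curvature is -9.81 / 900 per frame^2
positions = 2.0 * frames - 0.5 * 9.81 * (frames / 30.0) ** 2

acceleration = savgol_filter(positions, window_length=5, polyorder=2, deriv=2, delta=1.0)

# Constant second derivative everywhere, as expected in flight.
print(np.allclose(acceleration, -9.81 / 900.0))  # True
```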

**Key Functions:**
- `compute_acceleration_from_derivative()`: Computes the second derivative using Savitzky-Golay
- `refine_transition_with_curvature()`: Searches for acceleration patterns near transitions
- `find_interpolated_phase_transitions_with_curvature()`: Combines velocity + curvature

#### Refinement Strategy

Curvature analysis refines velocity-based estimates through blending:

1. **Velocity estimate**: Initial sub-frame transition from the velocity threshold crossing
2. **Curvature search**: Look for acceleration patterns within ±3 frames
3. **Blending**: 70% curvature-based + 30% velocity-based

**Why Blending?**
- Velocity is reliable for coarse timing
- Curvature provides fine detail but can be noisy at boundaries
- Blending prevents large deviations while incorporating physical insight

**Algorithm:**
```python
# 1. Get the velocity-based estimate
velocity_estimate = 49.0  # from interpolation

# 2. Search for an acceleration peak near the estimate (±3 frames)
search_window = acceleration[46:52]
peak_idx = np.argmax(np.abs(search_window))
curvature_estimate = 46 + peak_idx  # integer peak, e.g. 47; sub-frame refinement gives 47.2

# 3. Blend the estimates
blend = 0.7 * 47.2 + 0.3 * 49.0  # = 47.74
```
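
A self-contained sketch of the search-and-blend step (window size and 70/30 weights follow the description above; the function name and signature are assumptions, and this version blends against the integer peak without sub-frame refinement):

```python
import numpy as np


def refine_with_curvature(
    velocity_estimate: float,
    acceleration: np.ndarray,
    window: int = 3,
    curvature_weight: float = 0.7,
) -> float:
    """Blend a velocity-based frame estimate with the nearest acceleration peak."""
    center = int(round(velocity_estimate))
    lo = max(0, center - window)
    hi = min(len(acceleration), center + window)
    peak = lo + int(np.argmax(np.abs(acceleration[lo:hi])))
    return curvature_weight * peak + (1.0 - curvature_weight) * velocity_estimate


# An impact spike at frame 47 pulls the 49.0 estimate toward it: 0.7*47 + 0.3*49 = 47.6
accel = np.zeros(60)
accel[47] = 5.0
print(round(refine_with_curvature(49.0, accel), 2))  # 47.6
```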

**Accuracy Improvement:**
```python
# Example: landing detection
# Velocity only:  frame 49.0 (when velocity drops below the threshold)
# With curvature: frame 46.9 (when the acceleration spike occurs at impact)
# Result: 2.1 frames earlier (70ms at 30fps) - more physically accurate
```

**Optional Feature:**
- Enabled by default (`--use-curvature`, default: True)
- Can be disabled with the `--no-curvature` flag for pure velocity-based detection
- Negligible performance impact (reuses the smoothed trajectory)

### JSON Serialization (kinematics.py:29-100)

**IMPORTANT**: NumPy integer types (int64, int32) are not JSON serializable.

Always convert to Python `int()` in the `to_dict()` method:

```python
"contact_start_frame": (
    int(self.contact_start_frame) if self.contact_start_frame is not None else None
)
```

**Never do this:**
```python
# Wrong - will fail with "Object of type int64 is not JSON serializable"
"contact_start_frame": self.contact_start_frame
```
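
The failure mode and the fix, demonstrated standalone:

```python
import json

import numpy as np

frame_index = np.int64(49)  # e.g. what np.argmax() returns

# Converting to a built-in int first makes the payload serializable.
print(json.dumps({"contact_start_frame": int(frame_index)}))  # {"contact_start_frame": 49}

# Without the conversion, json.dumps raises TypeError.
try:
    json.dumps({"contact_start_frame": frame_index})
except TypeError as exc:
    print(f"Raises: {exc}")
```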

### Video Codec Handling (video_io.py:78-94)

- Primary codec: H.264 (avc1) - better quality, smaller file size
- Fallback codec: MPEG-4 (mp4v) - broader compatibility
- Raises an error if both fail to open

### Frame Dimensions (Throughout)

OpenCV and NumPy use different dimension ordering:

- **NumPy array shape**: `(height, width, channels)`
- **OpenCV VideoWriter size**: `(width, height)` tuple

Example:
```python
frame.shape                        # (1080, 1920, 3) - height first
cv2.VideoWriter(..., (1920, 1080)) # width first
```

Always be careful with dimension ordering to avoid squashed/stretched videos.

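The two orderings side by side, as a runnable snippet:

```python
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # NumPy: (height, width, channels)

height, width = frame.shape[:2]
writer_size = (width, height)  # what cv2.VideoWriter expects: (width, height)

print(frame.shape[:2], writer_size)  # (1080, 1920) (1920, 1080)
```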

## Common Development Tasks

### Adding New Metrics

1. Update the `DropJumpMetrics` class in `kinematics.py:10-19`
2. Add calculation logic in the `calculate_drop_jump_metrics()` function
3. Update the `to_dict()` method for JSON serialization (remember to convert NumPy types to Python types)
4. Optionally add visualization in `DebugOverlayRenderer.render_frame()` in `video_io.py:96`
5. Add tests in `tests/test_kinematics.py`

### Modifying Contact Detection Logic

Edit `detect_ground_contact()` in `contact_detection.py:14`. Key parameters:
- `velocity_threshold`: Tune for different surface/athlete combinations (default: 0.02)
- `min_contact_frames`: Adjust for frame rate and expected contact duration (default: 3)
- `visibility_threshold`: Minimum landmark visibility score (default: 0.5)

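A minimal sketch of how the first two knobs interact (illustrative only; the project's `detect_ground_contact()` also uses `visibility_threshold` and the smoothed trajectory):

```python
import numpy as np


def detect_contact(
    velocities: np.ndarray,
    velocity_threshold: float = 0.02,
    min_contact_frames: int = 3,
) -> np.ndarray:
    """Boolean mask of frames in ground contact; short runs are discarded as noise."""
    contact = np.abs(velocities) < velocity_threshold
    result = contact.copy()
    start = None
    for i, c in enumerate(contact):
        if c and start is None:
            start = i  # a contact run begins
        elif not c and start is not None:
            if i - start < min_contact_frames:
                result[start:i] = False  # run too short: treat as noise
            start = None
    if start is not None and len(contact) - start < min_contact_frames:
        result[start:] = False
    return result


v = np.array([0.05, 0.01, 0.05, 0.01, 0.01, 0.01, 0.01, 0.05])
print(detect_contact(v).tolist())
# [False, False, False, True, True, True, True, False]
```

The single low-velocity frame at index 1 is discarded by the `min_contact_frames` filter, while the four-frame run starting at index 3 survives.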

### Adjusting Smoothing

Modify `smooth_landmarks()` in `smoothing.py:9`:
- `window_length`: Controls smoothing strength (must be odd, default: 5)
- `polyorder`: Polynomial order for the Savitzky-Golay filter (default: 2)

### Parameter Tuning

**IMPORTANT**: See `docs/PARAMETERS.md` for a comprehensive guide to all 7 CLI parameters.

Quick reference:
- **smoothing-window**: Trajectory smoothness (↑ for noisy video)
- **velocity-threshold**: Contact sensitivity (↓ to detect brief contacts)
- **min-contact-frames**: Temporal filter (↑ to remove false contacts)
- **visibility-threshold**: Landmark confidence (↓ for occluded feet)
- **detection-confidence**: Pose detection strictness (MediaPipe)
- **tracking-confidence**: Tracking persistence (MediaPipe)
- **drop-height**: Drop box height in meters for calibration (e.g., 0.40 for 40cm)

The detailed guide includes:
- How each parameter works internally
- Frame rate considerations
- Scenario-based recommended settings
- Debugging workflow with visual indicators
- Parameter interaction effects

### Working with Different Video Formats

The tool handles various video formats and aspect ratios:
- 16:9 landscape (1920x1080)
- 4:3 standard (640x480)
- 9:16 portrait (1080x1920)
- Mobile videos with rotation metadata

Tests in `tests/test_aspect_ratio.py` verify this behavior.

## Testing

### Running Tests

```bash
# All tests (9 tests total)
uv run pytest

# Specific test modules
uv run pytest tests/test_aspect_ratio.py -v
uv run pytest tests/test_contact_detection.py -v
uv run pytest tests/test_kinematics.py -v

# With verbose output
uv run pytest -v
```

### Test Coverage

- **Aspect ratio preservation**: 4 tests covering 16:9, 4:3, 9:16, and validation
- **Contact detection**: 3 tests for ground contact detection and phase identification
- **Kinematics**: 2 tests for metrics calculation and JSON serialization

### Code Quality

All code passes:
- ✅ **Type checking**: Full mypy strict mode compliance
- ✅ **Linting**: ruff checks with comprehensive rule sets
- ✅ **Tests**: 9/9 tests passing
- ✅ **Formatting**: Black code style

## Troubleshooting

### MediaPipe Version Compatibility

- MediaPipe 0.10.x requires Python ≤ 3.12
- If you see "no matching distribution" errors, check the Python version in `.tool-versions`

### Video Dimension Issues

If the output video has the wrong aspect ratio:
1. Check that `VideoProcessor` is reading the first frame correctly
2. Verify `DebugOverlayRenderer` receives the correct width/height from `VideoProcessor`
3. Check that `write_frame()` validation is enabled (it should raise an error on a dimension mismatch)
4. Run `tests/test_aspect_ratio.py` to verify the mechanism

### JSON Serialization Errors

If you see "Object of type X is not JSON serializable":
1. Check the `to_dict()` method in `kinematics.py`
2. Ensure all NumPy types are converted to Python types with `int()` or `float()`
3. Run `tests/test_kinematics.py::test_metrics_to_dict` to verify

### Video Codec Issues

If the output video won't play:
1. Try a different output format: `.avi` instead of `.mp4`
2. Check OpenCV codec support: `cv2.getBuildInformation()`
3. `DebugOverlayRenderer` falls back from H.264 to MPEG-4 automatically

### Type Checking Issues

If mypy reports errors:
1. Ensure all function signatures have complete type annotations (parameters and return types)
2. For NumPy types, use explicit casts (`int()`, `float()`) when converting to Python types
3. For third-party libraries without stubs (cv2, mediapipe, scipy), use `# type: ignore` comments sparingly
4. Check `pyproject.toml` under `[tool.mypy]` for configuration
5. Run `uv run mypy src/dropjump` to verify fixes

## CLI Usage Examples

```bash
# Show main command help
uv run kinemetry --help

# Show subcommand help
uv run kinemetry dropjump-analyze --help

# Basic analysis (JSON to stdout)
uv run kinemetry dropjump-analyze video.mp4

# Save metrics to file
uv run kinemetry dropjump-analyze video.mp4 --json-output results.json

# Generate debug video
uv run kinemetry dropjump-analyze video.mp4 --output debug.mp4

# Drop jump with calibration (40cm box)
uv run kinemetry dropjump-analyze video.mp4 --drop-height 0.40

# Custom parameters for noisy video
uv run kinemetry dropjump-analyze video.mp4 \
    --smoothing-window 7 \
    --velocity-threshold 0.01 \
    --min-contact-frames 5

# Full analysis with calibration and all outputs
uv run kinemetry dropjump-analyze video.mp4 \
    --output debug.mp4 \
    --json-output metrics.json \
    --drop-height 0.40 \
    --smoothing-window 7

# Regular jump (no calibration, uses the corrected kinematic method)
uv run kinemetry dropjump-analyze jump.mp4 \
    --output debug.mp4 \
    --json-output metrics.json
```

## MCP Server Configuration

The repository includes MCP server configuration in `.mcp.json`:
- **web-search**: DuckDuckGo search via @dannyboy2042/freebird-mcp
- **sequential**: Sequential thinking via @smithery-ai/server-sequential-thinking
- **context7**: Library documentation via @upstash/context7-mcp
- **terraform**: Terraform registry via terraform-mcp-server
- **basic-memory**: Note-taking for the sr-quant-iac project

Enabled via `.claude/settings.local.json` with `enableAllProjectMcpServers: true`.
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 Drop-Jump Analysis Contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.