sports2d 0.8.20.tar.gz → 0.8.21.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (36)
  1. {sports2d-0.8.20 → sports2d-0.8.21}/PKG-INFO +66 -36
  2. {sports2d-0.8.20 → sports2d-0.8.21}/README.md +65 -35
  3. sports2d-0.8.21/Sports2D/Demo/Calib_demo.toml +12 -0
  4. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Demo/Config_demo.toml +2 -2
  5. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Sports2D.py +13 -4
  6. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Utilities/tests.py +30 -8
  7. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/PKG-INFO +66 -36
  8. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/SOURCES.txt +1 -0
  9. {sports2d-0.8.20 → sports2d-0.8.21}/.github/workflows/continuous-integration.yml +0 -0
  10. {sports2d-0.8.20 → sports2d-0.8.21}/.github/workflows/joss_pdf.yml +0 -0
  11. {sports2d-0.8.20 → sports2d-0.8.21}/.github/workflows/publish-on-release.yml +0 -0
  12. {sports2d-0.8.20 → sports2d-0.8.21}/.gitignore +0 -0
  13. {sports2d-0.8.20 → sports2d-0.8.21}/CITATION.cff +0 -0
  14. {sports2d-0.8.20 → sports2d-0.8.21}/Content/Demo_plots.png +0 -0
  15. {sports2d-0.8.20 → sports2d-0.8.21}/Content/Demo_results.png +0 -0
  16. {sports2d-0.8.20 → sports2d-0.8.21}/Content/Demo_terminal.png +0 -0
  17. {sports2d-0.8.20 → sports2d-0.8.21}/Content/Person_selection.png +0 -0
  18. {sports2d-0.8.20 → sports2d-0.8.21}/Content/Video_tuto_Sports2D_Colab.png +0 -0
  19. {sports2d-0.8.20 → sports2d-0.8.21}/Content/joint_convention.png +0 -0
  20. {sports2d-0.8.20 → sports2d-0.8.21}/Content/paper.bib +0 -0
  21. {sports2d-0.8.20 → sports2d-0.8.21}/Content/paper.md +0 -0
  22. {sports2d-0.8.20 → sports2d-0.8.21}/Content/sports2d_blender.gif +0 -0
  23. {sports2d-0.8.20 → sports2d-0.8.21}/Content/sports2d_opensim.gif +0 -0
  24. {sports2d-0.8.20 → sports2d-0.8.21}/LICENSE +0 -0
  25. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Demo/demo.mp4 +0 -0
  26. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Sports2D.ipynb +0 -0
  27. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Utilities/__init__.py +0 -0
  28. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Utilities/common.py +0 -0
  29. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/__init__.py +0 -0
  30. {sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/process.py +0 -0
  31. {sports2d-0.8.20 → sports2d-0.8.21}/pyproject.toml +0 -0
  32. {sports2d-0.8.20 → sports2d-0.8.21}/setup.cfg +0 -0
  33. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/dependency_links.txt +0 -0
  34. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/entry_points.txt +0 -0
  35. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/requires.txt +0 -0
  36. {sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/top_level.txt +0 -0
{sports2d-0.8.20 → sports2d-0.8.21}/PKG-INFO

@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: sports2d
- Version: 0.8.20
+ Version: 0.8.21
  Summary: Compute 2D human pose and angles from a video or a webcam.
  Author-email: David Pagnon <contact@david-pagnon.com>
  Maintainer-email: David Pagnon <contact@david-pagnon.com>
@@ -67,6 +67,7 @@ Dynamic: license-file
  </br>

  > **`Announcements:`**
+ > - Generate or import a calibration file, OpenSim skeleton overlay **New in v0.9!**
  > - Select only the persons you want to analyze **New in v0.8!**
  > - MarkerAugmentation and Inverse Kinematics for accurate 3D motion with OpenSim. **New in v0.7!**
  > - Any detector and pose estimation model can be used. **New in v0.6!**
@@ -218,16 +219,19 @@ The Demo video is voluntarily challenging to demonstrate the robustness of the p

  1. **Install the Pose2Sim_Blender add-on.**\
  Follow instructions on the [Pose2Sim_Blender](https://github.com/davidpagnon/Pose2Sim_Blender) add-on page.
+ 2. **Import the camera and video.**
+ - **Cameras -> Import**: Open your `demo_calib.toml` file from your `result_dir` folder.
+ - **Images/Videos -> Show**: open your video file (e.g., `demo_Sports2D.mp4`).\
+ -> **Other tools -> See through camera**
  2. **Open your point coordinates.**\
- **Add Markers**: open your trc file(e.g., `coords_m.trc`) from your `result_dir` folder.
-
+ **OpenSim data -> Markers**: Open your trc file(e.g., `demo_Sports2D_m_person00.trc`) from your `result_dir` folder.\
  This will optionally create **an animated rig** based on the motion of the captured person.
  3. **Open your animated skeleton:**\
  Make sure you first set `--do_ik True` ([full install](#full-install) required). See [inverse kinematics](#run-inverse-kinematics) section for more details.
- - **Add Model**: Open your scaled model (e.g., `Model_Pose2Sim_LSTM.osim`).
- - **Add Motion**: Open your motion file (e.g., `angles.mot`). Make sure the skeleton is selected in the outliner.
+ - **OpenSim data -> Model**: Open your scaled model (e.g., `demo_Sports2D_m_person00_LSTM.osim`).
+ - **OpenSim data -> Motion**: Open your motion file (e.g., `demo_Sports2D_m_person00_LSTM_ik.mot`).

- The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)**
+ The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)** [![Discord](https://img.shields.io/discord/1183750225471492206?logo=Discord&label=Discord%20community)](https://discord.com/invite/4mXUdSFjmt)

  <img src="Content/sports2d_blender.gif" width="760">

@@ -284,7 +288,7 @@ If you only want to analyze a subset of the detected persons, you can use the `-
  sports2d --nb_persons_to_detect 2 --person_ordering_method highest_likelihood
  ```

- We recommend to use the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.
+ We recommend using the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.

  Otherwise, if you want to run Sports2D automatically for example, you can choose other ordering methods such as 'highest_likelihood', 'largest_size', 'smallest_size', 'greatest_displacement', 'least_displacement', 'first_detected', or 'last_detected'.

@@ -301,28 +305,32 @@ sports2d --person_ordering_method on_click


  #### Get coordinates in meters:
- > **N.B.:** Depth is estimated from a neutral pose.
+ > **N.B.:** The Z coordinate (depth) should not be overly trusted.

- <!-- You either need to provide a calibration file, or simply the height of a person (Note that the latter will not take distortions into account, and that it will be less accurate for motion in the frontal plane).\-->
- You may need to convert pixel coordinates to meters.\
- Just provide the height of the reference person (and their ID in case of multiple person detection).
+ You may want coordinates in meters rather than pixels. 2 options to do so:

- You can also specify whether the visible side of the person is left, right, front, or back. Set it to 'auto' if you do not want to find it automatically (only works for motion in the sagittal plane), or to 'none' if you want to keep 2D instead of 3D coordinates (if the person goes right, and then left for example).
+ 1. **Just provide the height of a reference person**:
+ - Their height in meters is be compared with their height in pixels to get a pixel-to-meter conversion factor.
+ - To estimate the depth coordinates, specify which side of the person is visible: `left`, `right`, `front`, or `back`. Use `auto` if you want it to be automatically determined (only works for motions in the sagittal plane), or `none` if you want to keep 2D coordinates instead of 3D (if the person turns around, for example).
+ - The floor angle is automatically estimated from gait, as well as the origin of the xy axis. The person trajectory is corrected accordingly. You can use the `--floor_angle` and `--xy_origin` parameters to manually specify them if your subject is not travelling horizontally or if you want the origin not to be under their feet (note that the `y` axis points down).
+
+ **N.B.: A calibration file will be generated.** By convention, the camera-to-subject distance is set to 10 meters.

- The floor angle and the origin of the xy axis are computed automatically from gait. If you analyze another type of motion, you can manually specify them. Note that `y` points down.\
- Also note that distortions are not taken into account, and that results will be less accurate for motions in the frontal plane.
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none
+ ```
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none `
+ --person_ordering_method on_click `
+ --floor_angle 0 --xy_origin 0 940
+ ```

- <!-- ``` cmd
- sports2d --to_meters True --calib_file calib_demo.toml
- ``` -->
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none
- ```
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none `
- --person_ordering_method on_click `
- --floor_angle 0 --xy_origin 0 940
- ```
+ 2. **Or use a calibration file**:\
+ It can either be a `.toml` calibration file previously generated by Sports2D, or a more accurate one coming from another system. For example, [Pose2Sim](https://github.com/perfanalytics/pose2sim) can be used to accurately calculate calibration, or to convert calibration files from Qualisys, Vicon, OpenCap, FreeMoCap, etc.
+
+ ``` cmd
+ sports2d --calib_file Calib_demo.toml --visible_side auto front none
+ ```

  <br>

@@ -337,18 +345,22 @@ OpenSim inverse kinematics allows you to set joint constraints, joint angle limi
  This is done via [Pose2Sim](https://github.com/perfanalytics/pose2sim).\
  Model scaling is done according to the mean of the segment lengths, across a subset of frames. We remove the 10% fastest frames (potential outliers), the frames where the speed is 0 (person probably out of frame), the frames where the average knee and hip flexion angles are above 45° (pose estimation is not precise when the person is crouching) and the 20% most extreme segment values after the previous operations (potential outliers). All these parameters can be edited in your Config.toml file.

+ **N.B.: This will not work on sections where the person is not moving in a single plane. You can split your video into several time ranges if needed.**
+
  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side auto front
  ```

  You can optionally use the LSTM marker augmentation to improve the quality of the output motion.\
- You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).
+ You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).\
+ Optionally again, you can [visualize the overlaid results in Blender](#visualize-in-blender). The automatic calibration won't be accurate with such a small time range, so you need to use the provided calibration file (or one that has been generated from the full walk).

  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side left front `
- --use_augmentation True --participant_mass 55.0 67.0
+ --use_augmentation True --participant_mass 55.0 67.0 `
+ --calib_file Calib_demo.toml
  ```

  <br>
@@ -376,14 +388,31 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ``` cmd
  sports2d --config Config_demo.toml
  ```
- - Run within Python:
- ``` python
- from Sports2D import Sports2D; Sports2D.process('Config_demo.toml')
- ```
- - Run within Python with a dictionary (for example, `config_dict = toml.load('Config_demo.toml')`):
- ``` python
- from Sports2D import Sports2D; Sports2D.process(config_dict)
- ```
+ - Run within Python, for example:\
+ - Edit `Demo/Config_demo.toml` and run:
+ ```python
+ from Sports2D import Sports2D
+ from pathlib import Path
+ import toml
+
+ config_path = Path(Sports2D.__file__).parent / 'Demo'/'Config_demo.toml'
+ config_dict = toml.load(config_path)
+ Sports2D.process(config_dict)
+ ```
+ - Or you can pass the non default values only:
+ ```python
+ from Sports2D import Sports2D
+ config_dict = {
+ 'base': {
+ 'nb_persons_to_detect': 1,
+ 'person_ordering_method': 'greatest_displacement'
+ },
+ 'pose': {
+ 'mode': 'lightweight',
+ 'det_frequency': 50
+ }}
+ Sports2D.process(config_dict)
+ ```

  <br>

@@ -407,7 +436,7 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ```cmd
  sports2d --flip_left_right true # Default
  ```
- - Correct segment angles according to the estimated camera tild angle.\
+ - Correct segment angles according to the estimated camera tilt angle.\
  **N.B.:** *The camera tilt angle is automatically estimated. Set to false if it is actually the floor which is tilted rather than the camera.*
  ```cmd
  sports2d --correct_segment_angles_with_floor_angle true # Default
@@ -477,6 +506,7 @@ sports2d --help
  'show_realtime_results': ["R", "show results in real-time. true if not specified"],
  'display_angle_values_on': ["a", '"body", "list", "body" "list", or "none". body list if not specified'],
  'show_graphs': ["G", "show plots of raw and processed results. true if not specified"],
+ 'save_graphs': ["", "save position and angle plots of raw and processed results. false if not specified"],
  'joint_angles': ["j", '"Right ankle" "Left ankle" "Right knee" "Left knee" "Right hip" "Left hip" "Right shoulder" "Left shoulder" "Right elbow" "Left elbow" if not specified'],
  'segment_angles': ["s", '"Right foot" "Left foot" "Right shank" "Left shank" "Right thigh" "Left thigh" "Pelvis" "Trunk" "Shoulders" "Head" "Right arm" "Left arm" "Right forearm" "Left forearm" if not specified'],
  'save_vid': ["V", "save processed video. true if not specified"],
{sports2d-0.8.20 → sports2d-0.8.21}/README.md

@@ -24,6 +24,7 @@
  </br>

  > **`Announcements:`**
+ > - Generate or import a calibration file, OpenSim skeleton overlay **New in v0.9!**
  > - Select only the persons you want to analyze **New in v0.8!**
  > - MarkerAugmentation and Inverse Kinematics for accurate 3D motion with OpenSim. **New in v0.7!**
  > - Any detector and pose estimation model can be used. **New in v0.6!**
@@ -175,16 +176,19 @@ The Demo video is voluntarily challenging to demonstrate the robustness of the p

  1. **Install the Pose2Sim_Blender add-on.**\
  Follow instructions on the [Pose2Sim_Blender](https://github.com/davidpagnon/Pose2Sim_Blender) add-on page.
+ 2. **Import the camera and video.**
+ - **Cameras -> Import**: Open your `demo_calib.toml` file from your `result_dir` folder.
+ - **Images/Videos -> Show**: open your video file (e.g., `demo_Sports2D.mp4`).\
+ -> **Other tools -> See through camera**
  2. **Open your point coordinates.**\
- **Add Markers**: open your trc file(e.g., `coords_m.trc`) from your `result_dir` folder.
-
+ **OpenSim data -> Markers**: Open your trc file(e.g., `demo_Sports2D_m_person00.trc`) from your `result_dir` folder.\
  This will optionally create **an animated rig** based on the motion of the captured person.
  3. **Open your animated skeleton:**\
  Make sure you first set `--do_ik True` ([full install](#full-install) required). See [inverse kinematics](#run-inverse-kinematics) section for more details.
- - **Add Model**: Open your scaled model (e.g., `Model_Pose2Sim_LSTM.osim`).
- - **Add Motion**: Open your motion file (e.g., `angles.mot`). Make sure the skeleton is selected in the outliner.
+ - **OpenSim data -> Model**: Open your scaled model (e.g., `demo_Sports2D_m_person00_LSTM.osim`).
+ - **OpenSim data -> Motion**: Open your motion file (e.g., `demo_Sports2D_m_person00_LSTM_ik.mot`).

- The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)**
+ The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)** [![Discord](https://img.shields.io/discord/1183750225471492206?logo=Discord&label=Discord%20community)](https://discord.com/invite/4mXUdSFjmt)

  <img src="Content/sports2d_blender.gif" width="760">

@@ -241,7 +245,7 @@ If you only want to analyze a subset of the detected persons, you can use the `-
  sports2d --nb_persons_to_detect 2 --person_ordering_method highest_likelihood
  ```

- We recommend to use the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.
+ We recommend using the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.

  Otherwise, if you want to run Sports2D automatically for example, you can choose other ordering methods such as 'highest_likelihood', 'largest_size', 'smallest_size', 'greatest_displacement', 'least_displacement', 'first_detected', or 'last_detected'.

@@ -258,28 +262,32 @@ sports2d --person_ordering_method on_click


  #### Get coordinates in meters:
- > **N.B.:** Depth is estimated from a neutral pose.
+ > **N.B.:** The Z coordinate (depth) should not be overly trusted.

- <!-- You either need to provide a calibration file, or simply the height of a person (Note that the latter will not take distortions into account, and that it will be less accurate for motion in the frontal plane).\-->
- You may need to convert pixel coordinates to meters.\
- Just provide the height of the reference person (and their ID in case of multiple person detection).
+ You may want coordinates in meters rather than pixels. 2 options to do so:

- You can also specify whether the visible side of the person is left, right, front, or back. Set it to 'auto' if you do not want to find it automatically (only works for motion in the sagittal plane), or to 'none' if you want to keep 2D instead of 3D coordinates (if the person goes right, and then left for example).
+ 1. **Just provide the height of a reference person**:
+ - Their height in meters is be compared with their height in pixels to get a pixel-to-meter conversion factor.
+ - To estimate the depth coordinates, specify which side of the person is visible: `left`, `right`, `front`, or `back`. Use `auto` if you want it to be automatically determined (only works for motions in the sagittal plane), or `none` if you want to keep 2D coordinates instead of 3D (if the person turns around, for example).
+ - The floor angle is automatically estimated from gait, as well as the origin of the xy axis. The person trajectory is corrected accordingly. You can use the `--floor_angle` and `--xy_origin` parameters to manually specify them if your subject is not travelling horizontally or if you want the origin not to be under their feet (note that the `y` axis points down).
+
+ **N.B.: A calibration file will be generated.** By convention, the camera-to-subject distance is set to 10 meters.

- The floor angle and the origin of the xy axis are computed automatically from gait. If you analyze another type of motion, you can manually specify them. Note that `y` points down.\
- Also note that distortions are not taken into account, and that results will be less accurate for motions in the frontal plane.
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none
+ ```
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none `
+ --person_ordering_method on_click `
+ --floor_angle 0 --xy_origin 0 940
+ ```

- <!-- ``` cmd
- sports2d --to_meters True --calib_file calib_demo.toml
- ``` -->
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none
- ```
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none `
- --person_ordering_method on_click `
- --floor_angle 0 --xy_origin 0 940
- ```
+ 2. **Or use a calibration file**:\
+ It can either be a `.toml` calibration file previously generated by Sports2D, or a more accurate one coming from another system. For example, [Pose2Sim](https://github.com/perfanalytics/pose2sim) can be used to accurately calculate calibration, or to convert calibration files from Qualisys, Vicon, OpenCap, FreeMoCap, etc.
+
+ ``` cmd
+ sports2d --calib_file Calib_demo.toml --visible_side auto front none
+ ```

  <br>

@@ -294,18 +302,22 @@ OpenSim inverse kinematics allows you to set joint constraints, joint angle limi
  This is done via [Pose2Sim](https://github.com/perfanalytics/pose2sim).\
  Model scaling is done according to the mean of the segment lengths, across a subset of frames. We remove the 10% fastest frames (potential outliers), the frames where the speed is 0 (person probably out of frame), the frames where the average knee and hip flexion angles are above 45° (pose estimation is not precise when the person is crouching) and the 20% most extreme segment values after the previous operations (potential outliers). All these parameters can be edited in your Config.toml file.

+ **N.B.: This will not work on sections where the person is not moving in a single plane. You can split your video into several time ranges if needed.**
+
  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side auto front
  ```

  You can optionally use the LSTM marker augmentation to improve the quality of the output motion.\
- You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).
+ You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).\
+ Optionally again, you can [visualize the overlaid results in Blender](#visualize-in-blender). The automatic calibration won't be accurate with such a small time range, so you need to use the provided calibration file (or one that has been generated from the full walk).

  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side left front `
- --use_augmentation True --participant_mass 55.0 67.0
+ --use_augmentation True --participant_mass 55.0 67.0 `
+ --calib_file Calib_demo.toml
  ```

  <br>
@@ -333,14 +345,31 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ``` cmd
  sports2d --config Config_demo.toml
  ```
- - Run within Python:
- ``` python
- from Sports2D import Sports2D; Sports2D.process('Config_demo.toml')
- ```
- - Run within Python with a dictionary (for example, `config_dict = toml.load('Config_demo.toml')`):
- ``` python
- from Sports2D import Sports2D; Sports2D.process(config_dict)
- ```
+ - Run within Python, for example:\
+ - Edit `Demo/Config_demo.toml` and run:
+ ```python
+ from Sports2D import Sports2D
+ from pathlib import Path
+ import toml
+
+ config_path = Path(Sports2D.__file__).parent / 'Demo'/'Config_demo.toml'
+ config_dict = toml.load(config_path)
+ Sports2D.process(config_dict)
+ ```
+ - Or you can pass the non default values only:
+ ```python
+ from Sports2D import Sports2D
+ config_dict = {
+ 'base': {
+ 'nb_persons_to_detect': 1,
+ 'person_ordering_method': 'greatest_displacement'
+ },
+ 'pose': {
+ 'mode': 'lightweight',
+ 'det_frequency': 50
+ }}
+ Sports2D.process(config_dict)
+ ```

  <br>

@@ -364,7 +393,7 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ```cmd
  sports2d --flip_left_right true # Default
  ```
- - Correct segment angles according to the estimated camera tild angle.\
+ - Correct segment angles according to the estimated camera tilt angle.\
  **N.B.:** *The camera tilt angle is automatically estimated. Set to false if it is actually the floor which is tilted rather than the camera.*
  ```cmd
  sports2d --correct_segment_angles_with_floor_angle true # Default
@@ -434,6 +463,7 @@ sports2d --help
  'show_realtime_results': ["R", "show results in real-time. true if not specified"],
  'display_angle_values_on': ["a", '"body", "list", "body" "list", or "none". body list if not specified'],
  'show_graphs': ["G", "show plots of raw and processed results. true if not specified"],
+ 'save_graphs': ["", "save position and angle plots of raw and processed results. false if not specified"],
  'joint_angles': ["j", '"Right ankle" "Left ankle" "Right knee" "Left knee" "Right hip" "Left hip" "Right shoulder" "Left shoulder" "Right elbow" "Left elbow" if not specified'],
  'segment_angles': ["s", '"Right foot" "Left foot" "Right shank" "Left shank" "Right thigh" "Left thigh" "Pelvis" "Trunk" "Shoulders" "Head" "Right arm" "Left arm" "Right forearm" "Left forearm" if not specified'],
  'save_vid': ["V", "save processed video. true if not specified"],
sports2d-0.8.21/Sports2D/Demo/Calib_demo.toml

@@ -0,0 +1,12 @@
+ [demo]
+ name = "demo"
+ size = [ 1768, 994]
+ matrix = [ [ 2520.0897058227038, 0.0, 884.0], [ 0.0, 2520.0897058227038, 497.0], [ 0.0, 0.0, 1.0]]
+ distortions = [ 0.0, 0.0, 0.0, 0.0]
+ rotation = [ 1.2082126924727719, 1.2098328575850605, -1.2082126924727719]
+ translation = [ -3.510103521992233, 1.7079310029359385, 10.0]
+ fisheye = false
+
+ [metadata]
+ adjusted = false
+ error = 0.0
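For reference, here is a minimal sketch (not part of the package) of how this new demo calibration could be read back with Python's `toml` module. The table and field names follow the `Calib_demo.toml` shown above; the file path is an assumption, and the field-of-view line is plain pinhole-camera arithmetic.

```python
import math
import toml

# Load the demo calibration (path assumed relative to the Sports2D Demo folder).
calib = toml.load('Calib_demo.toml')
cam = calib['demo']

width, height = cam['size']                          # image size in pixels
fx = cam['matrix'][0][0]                             # focal length in pixels (pinhole model)
cx, cy = cam['matrix'][0][2], cam['matrix'][1][2]    # principal point

# Horizontal field of view from the pinhole model: fov = 2 * atan(width / (2 * fx)).
fov_deg = math.degrees(2 * math.atan(width / (2 * fx)))
print(f'{width}x{height} px, fx = {fx:.1f} px, principal point = ({cx}, {cy})')
print(f'horizontal field of view ≈ {fov_deg:.1f} deg')
print(f'camera-to-subject distance (z translation): {cam["translation"][2]} m')
```

The `translation` z value of 10.0 matches the convention stated in the README diff above: the camera-to-subject distance is set to 10 meters.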
{sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Demo/Config_demo.toml

@@ -103,7 +103,7 @@ keypoint_number_threshold = 0.3 # Person will be ignored if the number of go
  # Pixel to meters conversion
  to_meters = true
  make_c3d = true
- save_calib = true # Coming soon!
+ save_calib = false

  # If conversion from first_person_height
  floor_angle = 'auto' # 'auto' or a value in degrees, eg 2.3. If 'auto', estimated from the line formed by the toes when they are on the ground (where speed = 0)
@@ -139,7 +139,7 @@ reject_outliers = true # Hampel filter for outlier rejection before other f

  filter = true
  show_graphs = true # Show plots of raw and processed results
- save_graphs = false # Save position and angle plots of raw and processed results
+ save_graphs = true # Save position and angle plots of raw and processed results
  filter_type = 'butterworth' # butterworth, kalman, gcv_spline, gaussian, loess, median, butterworth_on_speed

  # Most intuitive and standard filter in biomechanics
{sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Sports2D.py

@@ -196,7 +196,7 @@ DEFAULT_CONFIG = {'base': {'video_input': ['demo.mp4'],
  'calib_file': '',
  'floor_angle': 'auto',
  'xy_origin': ['auto'],
- 'save_calib': True
+ 'save_calib': False
  },
  'angles': {'display_angle_values_on': ['body', 'list'],
  'fontSize': 0.3,
@@ -236,7 +236,7 @@ DEFAULT_CONFIG = {'base': {'video_input': ['demo.mp4'],
  'reject_outliers': True,
  'filter': True,
  'show_graphs': True,
- 'save_graphs': False,
+ 'save_graphs': True,
  'filter_type': 'butterworth',
  'butterworth': {'order': 4, 'cut_off_frequency': 6.0},
  'kalman': {'trust_ratio': 500.0, 'smooth':True},
@@ -280,7 +280,7 @@ CONFIG_HELP = {'config': ["C", "path to a toml configuration file"],
  'show_realtime_results': ["R", "show results in real-time. true if not specified"],
  'display_angle_values_on': ["a", '"body", "list", "body" "list", or "none". body list if not specified'],
  'show_graphs': ["G", "show plots of raw and processed results. true if not specified"],
- 'save_graphs': ["", "save position and angle plots of raw and processed results. false if not specified"],
+ 'save_graphs': ["", "save position and angle plots of raw and processed results. true if not specified"],
  'joint_angles': ["j", '"Right ankle" "Left ankle" "Right knee" "Left knee" "Right hip" "Left hip" "Right shoulder" "Left shoulder" "Right elbow" "Left elbow" if not specified'],
  'segment_angles': ["s", '"Right foot" "Left foot" "Right shank" "Left shank" "Right thigh" "Left thigh" "Pelvis" "Trunk" "Shoulders" "Head" "Right arm" "Left arm" "Right forearm" "Left forearm" if not specified'],
  'save_vid': ["V", "save processed video. true if not specified"],
@@ -473,6 +473,14 @@ def set_nested_value(config, flat_key, value):
  d[keys[-1]] = value


+ def merge_dicts(original, overrides):
+ for key, value in overrides.items():
+ if isinstance(value, dict) and isinstance(original.get(key), dict):
+ merge_dicts(original[key], value)
+ else:
+ original[key] = value
+
+
  def str2bool(v):
  '''
  Convert a string to a boolean value.
@@ -500,7 +508,8 @@ def process(config='Config_demo.toml'):
  from Sports2D.process import process_fun

  if type(config) == dict:
- config_dict = config
+ config_dict = DEFAULT_CONFIG.copy()
+ merge_dicts(config_dict, config)
  else:
  config_dict = read_config_file(config)
  video_dir, video_files, frame_rates, time_ranges, result_dir = base_params(config_dict)
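The `merge_dicts` helper and the updated `process()` above change how dictionary configs are handled: a partial dict is now merged over the packaged defaults instead of being used as-is. Below is a minimal standalone sketch of that behavior; the defaults shown are a tiny stand-in for the real `DEFAULT_CONFIG`, and a deep copy is used here only to keep the stand-in untouched (the diff itself uses `DEFAULT_CONFIG.copy()`).

```python
import copy

def merge_dicts(original, overrides):
    # Recursively overlay `overrides` onto `original` (same logic as in the diff above).
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(original.get(key), dict):
            merge_dicts(original[key], value)
        else:
            original[key] = value

# Tiny stand-in for the package's DEFAULT_CONFIG (the real one has many more keys).
DEFAULT_CONFIG = {
    'base': {'nb_persons_to_detect': 'all', 'person_ordering_method': 'on_click'},
    'pose': {'mode': 'balanced', 'det_frequency': 4},
}

# Caller passes only the non-default values, as in the updated README example.
overrides = {'base': {'nb_persons_to_detect': 1}, 'pose': {'mode': 'lightweight'}}

config_dict = copy.deepcopy(DEFAULT_CONFIG)
merge_dicts(config_dict, overrides)
print(config_dict)
# {'base': {'nb_persons_to_detect': 1, 'person_ordering_method': 'on_click'},
#  'pose': {'mode': 'lightweight', 'det_frequency': 4}}
```

This merge step is what allows the README example and the updated tests below to pass only the values they want to override.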
{sports2d-0.8.20 → sports2d-0.8.21}/Sports2D/Utilities/tests.py

@@ -45,7 +45,7 @@ def test_workflow():
  ## From Python ##
  #############################

- # Default
+ # Default from the demo config file
  config_path = Path(__file__).resolve().parent.parent / 'Demo' / 'Config_demo.toml'
  config_dict = toml.load(config_path)
  video_dir = Path(__file__).resolve().parent.parent / 'Demo'
@@ -53,6 +53,28 @@ def test_workflow():
  config_dict.get("base").update({"person_ordering_method": "highest_likelihood"})
  config_dict.get("base").update({"show_realtime_results":False})
  config_dict.get("post-processing").update({"show_graphs":False})
+ config_dict.get("post-processing").update({"save_graphs":False})
+
+ Sports2D.process(config_dict)
+
+
+ # Only passing the updated values
+ video_dir = Path(__file__).resolve().parent.parent / 'Demo'
+ config_dict = {
+ 'base': {
+ 'nb_persons_to_detect': 1,
+ 'person_ordering_method': 'greatest_displacement',
+ "show_realtime_results":False
+ },
+ 'pose': {
+ 'mode': 'lightweight',
+ 'det_frequency': 50
+ },
+ 'post-processing': {
+ 'show_graphs':False,
+ 'save_graphs':False
+ }
+ }

  Sports2D.process(config_dict)

@@ -62,28 +84,28 @@ def test_workflow():
  #############################

  # Default
- demo_cmd = ["sports2d", "--person_ordering_method", "highest_likelihood", "--show_realtime_results", "False", "--show_graphs", "False"]
+ demo_cmd = ["sports2d", "--person_ordering_method", "highest_likelihood", "--show_realtime_results", "False", "--show_graphs", "False", "--save_graphs", "False"]
  subprocess.run(demo_cmd, check=True, capture_output=True, text=True, encoding='utf-8', errors='replace')

  # With loading a trc file, visible_side 'front', first_person_height '1.76", floor_angle 0, xy_origin [0, 928]
- demo_cmd2 = ["sports2d", "--show_realtime_results", "False", "--show_graphs", "False",
+ demo_cmd2 = ["sports2d", "--show_realtime_results", "False", "--show_graphs", "False", "--save_graphs", "False",
  "--load_trc_px", os.path.join(root_dir, "demo_Sports2D", "demo_Sports2D_px_person01.trc"),
  "--visible_side", "front", "--first_person_height", "1.76", "--time_range", "1.2", "2.7",
  "--floor_angle", "0", "--xy_origin", "0", "928"]
  subprocess.run(demo_cmd2, check=True, capture_output=True, text=True, encoding='utf-8', errors='replace')

  # With no pixels to meters conversion, one person to select, lightweight mode, detection frequency, slowmo factor, gaussian filter, RTMO body pose model
- demo_cmd3 = ["sports2d", "--show_realtime_results", "False", "--show_graphs", "False",
- "--to_meters", "False",
+ demo_cmd3 = ["sports2d", "--show_realtime_results", "False", "--show_graphs", "False", "--save_graphs", "False",
+ # "--calib_file", "calib_demo.toml",
  "--nb_persons_to_detect", "1", "--person_ordering_method", "greatest_displacement",
  "--mode", "lightweight", "--det_frequency", "50",
  "--slowmo_factor", "4",
- "--filter_type", "gaussian",
+ "--filter_type", "gaussian", "--use_augmentation", "False",
  "--pose_model", "body", "--mode", """{'pose_class':'RTMO', 'pose_model':'https://download.openmmlab.com/mmpose/v1/projects/rtmo/onnx_sdk/rtmo-m_16xb16-600e_body7-640x640-39e78cc4_20231211.zip', 'pose_input_size':[640, 640]}"""]
  subprocess.run(demo_cmd3, check=True, capture_output=True, text=True, encoding='utf-8', errors='replace')

  # With a time range, inverse kinematics, marker augmentation
- demo_cmd4 = ["sports2d", "--person_ordering_method", "greatest_displacement", "--show_realtime_results", "False", "--show_graphs", "False",
+ demo_cmd4 = ["sports2d", "--person_ordering_method", "greatest_displacement", "--show_realtime_results", "False", "--show_graphs", "False", "--save_graphs", "False",
  "--time_range", "1.2", "2.7",
  "--do_ik", "True", "--use_augmentation", "True",
  "--nb_persons_to_detect", "all", "--first_person_height", "1.65",
@@ -97,7 +119,7 @@ def test_workflow():
  config_dict.get("base").update({"video_dir": str(video_dir)})
  config_dict.get("base").update({"person_ordering_method": "highest_likelihood"})
  with open(config_path, 'w') as f: toml.dump(config_dict, f)
- demo_cmd5 = ["sports2d", "--config", str(config_path), "--show_realtime_results", "False", "--show_graphs", "False"]
+ demo_cmd5 = ["sports2d", "--config", str(config_path), "--show_realtime_results", "False", "--show_graphs", "False", "--save_graphs", "False",]
  subprocess.run(demo_cmd5, check=True, capture_output=True, text=True, encoding='utf-8', errors='replace')

{sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/PKG-INFO

@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: sports2d
- Version: 0.8.20
+ Version: 0.8.21
  Summary: Compute 2D human pose and angles from a video or a webcam.
  Author-email: David Pagnon <contact@david-pagnon.com>
  Maintainer-email: David Pagnon <contact@david-pagnon.com>
@@ -67,6 +67,7 @@ Dynamic: license-file
  </br>

  > **`Announcements:`**
+ > - Generate or import a calibration file, OpenSim skeleton overlay **New in v0.9!**
  > - Select only the persons you want to analyze **New in v0.8!**
  > - MarkerAugmentation and Inverse Kinematics for accurate 3D motion with OpenSim. **New in v0.7!**
  > - Any detector and pose estimation model can be used. **New in v0.6!**
@@ -218,16 +219,19 @@ The Demo video is voluntarily challenging to demonstrate the robustness of the p

  1. **Install the Pose2Sim_Blender add-on.**\
  Follow instructions on the [Pose2Sim_Blender](https://github.com/davidpagnon/Pose2Sim_Blender) add-on page.
+ 2. **Import the camera and video.**
+ - **Cameras -> Import**: Open your `demo_calib.toml` file from your `result_dir` folder.
+ - **Images/Videos -> Show**: open your video file (e.g., `demo_Sports2D.mp4`).\
+ -> **Other tools -> See through camera**
  2. **Open your point coordinates.**\
- **Add Markers**: open your trc file(e.g., `coords_m.trc`) from your `result_dir` folder.
-
+ **OpenSim data -> Markers**: Open your trc file(e.g., `demo_Sports2D_m_person00.trc`) from your `result_dir` folder.\
  This will optionally create **an animated rig** based on the motion of the captured person.
  3. **Open your animated skeleton:**\
  Make sure you first set `--do_ik True` ([full install](#full-install) required). See [inverse kinematics](#run-inverse-kinematics) section for more details.
- - **Add Model**: Open your scaled model (e.g., `Model_Pose2Sim_LSTM.osim`).
- - **Add Motion**: Open your motion file (e.g., `angles.mot`). Make sure the skeleton is selected in the outliner.
+ - **OpenSim data -> Model**: Open your scaled model (e.g., `demo_Sports2D_m_person00_LSTM.osim`).
+ - **OpenSim data -> Motion**: Open your motion file (e.g., `demo_Sports2D_m_person00_LSTM_ik.mot`).

- The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)**
+ The OpenSim skeleton is not rigged yet. **[Feel free to contribute!](https://github.com/perfanalytics/pose2sim/issues/40)** [![Discord](https://img.shields.io/discord/1183750225471492206?logo=Discord&label=Discord%20community)](https://discord.com/invite/4mXUdSFjmt)

  <img src="Content/sports2d_blender.gif" width="760">

@@ -284,7 +288,7 @@ If you only want to analyze a subset of the detected persons, you can use the `-
  sports2d --nb_persons_to_detect 2 --person_ordering_method highest_likelihood
  ```

- We recommend to use the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.
+ We recommend using the `on_click` method if you can afford a manual input. This lets the user handle both the person number and their order in the same stage. When prompted, select the persons you are interested in in the desired order. In our case, lets slide to a frame where both people are visible, and select the woman first, then the man.

  Otherwise, if you want to run Sports2D automatically for example, you can choose other ordering methods such as 'highest_likelihood', 'largest_size', 'smallest_size', 'greatest_displacement', 'least_displacement', 'first_detected', or 'last_detected'.

@@ -301,28 +305,32 @@ sports2d --person_ordering_method on_click


  #### Get coordinates in meters:
- > **N.B.:** Depth is estimated from a neutral pose.
+ > **N.B.:** The Z coordinate (depth) should not be overly trusted.

- <!-- You either need to provide a calibration file, or simply the height of a person (Note that the latter will not take distortions into account, and that it will be less accurate for motion in the frontal plane).\-->
- You may need to convert pixel coordinates to meters.\
- Just provide the height of the reference person (and their ID in case of multiple person detection).
+ You may want coordinates in meters rather than pixels. 2 options to do so:

- You can also specify whether the visible side of the person is left, right, front, or back. Set it to 'auto' if you do not want to find it automatically (only works for motion in the sagittal plane), or to 'none' if you want to keep 2D instead of 3D coordinates (if the person goes right, and then left for example).
+ 1. **Just provide the height of a reference person**:
+ - Their height in meters is be compared with their height in pixels to get a pixel-to-meter conversion factor.
+ - To estimate the depth coordinates, specify which side of the person is visible: `left`, `right`, `front`, or `back`. Use `auto` if you want it to be automatically determined (only works for motions in the sagittal plane), or `none` if you want to keep 2D coordinates instead of 3D (if the person turns around, for example).
+ - The floor angle is automatically estimated from gait, as well as the origin of the xy axis. The person trajectory is corrected accordingly. You can use the `--floor_angle` and `--xy_origin` parameters to manually specify them if your subject is not travelling horizontally or if you want the origin not to be under their feet (note that the `y` axis points down).
+
+ **N.B.: A calibration file will be generated.** By convention, the camera-to-subject distance is set to 10 meters.

- The floor angle and the origin of the xy axis are computed automatically from gait. If you analyze another type of motion, you can manually specify them. Note that `y` points down.\
- Also note that distortions are not taken into account, and that results will be less accurate for motions in the frontal plane.
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none
+ ```
+ ``` cmd
+ sports2d --first_person_height 1.65 --visible_side auto front none `
+ --person_ordering_method on_click `
+ --floor_angle 0 --xy_origin 0 940
+ ```

- <!-- ``` cmd
- sports2d --to_meters True --calib_file calib_demo.toml
- ``` -->
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none
- ```
- ``` cmd
- sports2d --to_meters True --first_person_height 1.65 --visible_side auto front none `
- --person_ordering_method on_click `
- --floor_angle 0 --xy_origin 0 940
- ```
+ 2. **Or use a calibration file**:\
+ It can either be a `.toml` calibration file previously generated by Sports2D, or a more accurate one coming from another system. For example, [Pose2Sim](https://github.com/perfanalytics/pose2sim) can be used to accurately calculate calibration, or to convert calibration files from Qualisys, Vicon, OpenCap, FreeMoCap, etc.
+
+ ``` cmd
+ sports2d --calib_file Calib_demo.toml --visible_side auto front none
+ ```

  <br>

@@ -337,18 +345,22 @@ OpenSim inverse kinematics allows you to set joint constraints, joint angle limi
  This is done via [Pose2Sim](https://github.com/perfanalytics/pose2sim).\
  Model scaling is done according to the mean of the segment lengths, across a subset of frames. We remove the 10% fastest frames (potential outliers), the frames where the speed is 0 (person probably out of frame), the frames where the average knee and hip flexion angles are above 45° (pose estimation is not precise when the person is crouching) and the 20% most extreme segment values after the previous operations (potential outliers). All these parameters can be edited in your Config.toml file.

+ **N.B.: This will not work on sections where the person is not moving in a single plane. You can split your video into several time ranges if needed.**
+
  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side auto front
  ```

  You can optionally use the LSTM marker augmentation to improve the quality of the output motion.\
- You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).
+ You can also optionally give the participants proper masses. Mass has no influence on motion, only on forces (if you decide to further pursue kinetics analysis).\
+ Optionally again, you can [visualize the overlaid results in Blender](#visualize-in-blender). The automatic calibration won't be accurate with such a small time range, so you need to use the provided calibration file (or one that has been generated from the full walk).

  ```cmd
  sports2d --time_range 1.2 2.7 `
  --do_ik true --first_person_height 1.65 --visible_side left front `
- --use_augmentation True --participant_mass 55.0 67.0
+ --use_augmentation True --participant_mass 55.0 67.0 `
+ --calib_file Calib_demo.toml
  ```

  <br>
@@ -376,14 +388,31 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ``` cmd
  sports2d --config Config_demo.toml
  ```
- - Run within Python:
- ``` python
- from Sports2D import Sports2D; Sports2D.process('Config_demo.toml')
- ```
- - Run within Python with a dictionary (for example, `config_dict = toml.load('Config_demo.toml')`):
- ``` python
- from Sports2D import Sports2D; Sports2D.process(config_dict)
- ```
+ - Run within Python, for example:\
+ - Edit `Demo/Config_demo.toml` and run:
+ ```python
+ from Sports2D import Sports2D
+ from pathlib import Path
+ import toml
+
+ config_path = Path(Sports2D.__file__).parent / 'Demo'/'Config_demo.toml'
+ config_dict = toml.load(config_path)
+ Sports2D.process(config_dict)
+ ```
+ - Or you can pass the non default values only:
+ ```python
+ from Sports2D import Sports2D
+ config_dict = {
+ 'base': {
+ 'nb_persons_to_detect': 1,
+ 'person_ordering_method': 'greatest_displacement'
+ },
+ 'pose': {
+ 'mode': 'lightweight',
+ 'det_frequency': 50
+ }}
+ Sports2D.process(config_dict)
+ ```

  <br>

@@ -407,7 +436,7 @@ sports2d --video_input demo.mp4 other_video.mp4 --time_range 1.2 2.7 0 3.5
  ```cmd
  sports2d --flip_left_right true # Default
  ```
- - Correct segment angles according to the estimated camera tild angle.\
+ - Correct segment angles according to the estimated camera tilt angle.\
  **N.B.:** *The camera tilt angle is automatically estimated. Set to false if it is actually the floor which is tilted rather than the camera.*
  ```cmd
  sports2d --correct_segment_angles_with_floor_angle true # Default
@@ -477,6 +506,7 @@ sports2d --help
  'show_realtime_results': ["R", "show results in real-time. true if not specified"],
  'display_angle_values_on': ["a", '"body", "list", "body" "list", or "none". body list if not specified'],
  'show_graphs': ["G", "show plots of raw and processed results. true if not specified"],
+ 'save_graphs': ["", "save position and angle plots of raw and processed results. false if not specified"],
  'joint_angles': ["j", '"Right ankle" "Left ankle" "Right knee" "Left knee" "Right hip" "Left hip" "Right shoulder" "Left shoulder" "Right elbow" "Left elbow" if not specified'],
  'segment_angles': ["s", '"Right foot" "Left foot" "Right shank" "Left shank" "Right thigh" "Left thigh" "Pelvis" "Trunk" "Shoulders" "Head" "Right arm" "Left arm" "Right forearm" "Left forearm" if not specified'],
  'save_vid': ["V", "save processed video. true if not specified"],
{sports2d-0.8.20 → sports2d-0.8.21}/sports2d.egg-info/SOURCES.txt

@@ -20,6 +20,7 @@ Sports2D/Sports2D.ipynb
  Sports2D/Sports2D.py
  Sports2D/__init__.py
  Sports2D/process.py
+ Sports2D/Demo/Calib_demo.toml
  Sports2D/Demo/Config_demo.toml
  Sports2D/Demo/demo.mp4
  Sports2D/Utilities/__init__.py