waveorder 1.0.0rc2__tar.gz → 2.0.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (82)
  1. {waveorder-1.0.0rc2 → waveorder-2.0.0}/PKG-INFO +19 -16
  2. {waveorder-1.0.0rc2 → waveorder-2.0.0}/README.md +8 -11
  3. waveorder-2.0.0/docs/valuable-prs/2023-02-27.110.pr.open.md +308 -0
  4. waveorder-2.0.0/examples/README.md +7 -0
  5. waveorder-2.0.0/examples/documentation/PTI_experiment/PTI_Experiment_Recon3D_anisotropic_target_small.ipynb +1073 -0
  6. waveorder-2.0.0/examples/documentation/PTI_experiment/README.md +6 -0
  7. waveorder-2.0.0/examples/documentation/README.md +1 -0
  8. {waveorder-1.0.0rc2/examples → waveorder-2.0.0/examples/maintenance}/PTI_simulation/PTI_Simulation_Forward_2D3D.py +42 -31
  9. {waveorder-1.0.0rc2/examples → waveorder-2.0.0/examples/maintenance}/PTI_simulation/PTI_Simulation_Recon2D.py +18 -15
  10. {waveorder-1.0.0rc2/examples → waveorder-2.0.0/examples/maintenance}/PTI_simulation/PTI_Simulation_Recon3D.py +20 -16
  11. waveorder-2.0.0/examples/maintenance/PTI_simulation/README.md +4 -0
  12. {waveorder-1.0.0rc2/examples/2D_QLIPP_simulation → waveorder-2.0.0/examples/maintenance/QLIPP_simulation}/2D_QLIPP_forward.py +19 -10
  13. {waveorder-1.0.0rc2/examples/2D_QLIPP_simulation → waveorder-2.0.0/examples/maintenance/QLIPP_simulation}/2D_QLIPP_recon.py +51 -44
  14. waveorder-2.0.0/examples/maintenance/README.md +1 -0
  15. waveorder-2.0.0/examples/models/README.md +9 -0
  16. waveorder-2.0.0/examples/models/inplane_oriented_thick_pol3D.py +62 -0
  17. waveorder-2.0.0/examples/models/isotropic_fluorescent_thick_3d.py +73 -0
  18. waveorder-2.0.0/examples/models/isotropic_thin_3d.py +88 -0
  19. waveorder-2.0.0/examples/models/phase_thick_3d.py +80 -0
  20. waveorder-2.0.0/readme.png +0 -0
  21. {waveorder-1.0.0rc2 → waveorder-2.0.0}/setup.cfg +1 -6
  22. waveorder-2.0.0/tests/conftest.py +13 -0
  23. waveorder-2.0.0/tests/models/test_inplane_oriented_thick_pol3D.py +42 -0
  24. waveorder-2.0.0/tests/models/test_isotropic_fluorescent_thick_3d.py +52 -0
  25. waveorder-2.0.0/tests/models/test_isotropic_thin_3d.py +21 -0
  26. waveorder-2.0.0/tests/models/test_phase_thick_3d.py +22 -0
  27. waveorder-2.0.0/tests/test_correction.py +42 -0
  28. {waveorder-1.0.0rc2 → waveorder-2.0.0}/tests/test_examples.py +22 -12
  29. waveorder-2.0.0/tests/test_focus_estimator.py +77 -0
  30. waveorder-2.0.0/tests/test_optics.py +94 -0
  31. waveorder-2.0.0/tests/test_stokes.py +204 -0
  32. waveorder-2.0.0/tests/test_util.py +46 -0
  33. waveorder-2.0.0/waveorder/_version.py +16 -0
  34. waveorder-2.0.0/waveorder/correction.py +107 -0
  35. waveorder-2.0.0/waveorder/focus.py +180 -0
  36. waveorder-2.0.0/waveorder/models/inplane_oriented_thick_pol3d.py +159 -0
  37. waveorder-2.0.0/waveorder/models/isotropic_fluorescent_thick_3d.py +175 -0
  38. waveorder-2.0.0/waveorder/models/isotropic_thin_3d.py +259 -0
  39. waveorder-2.0.0/waveorder/models/phase_thick_3d.py +226 -0
  40. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/optics.py +127 -242
  41. waveorder-2.0.0/waveorder/stokes.py +458 -0
  42. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/util.py +209 -268
  43. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/visual.py +1 -1
  44. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/waveorder_reconstructor.py +95 -295
  45. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/waveorder_simulator.py +33 -58
  46. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder.egg-info/PKG-INFO +19 -16
  47. waveorder-2.0.0/waveorder.egg-info/SOURCES.txt +70 -0
  48. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder.egg-info/requires.txt +1 -6
  49. waveorder-1.0.0rc2/Fig_Readme.png +0 -0
  50. waveorder-1.0.0rc2/examples/3D_PODT_phase_simulation/3D_PODT_Phase_forward.py +0 -169
  51. waveorder-1.0.0rc2/examples/3D_PODT_phase_simulation/3D_PODT_Phase_recon.py +0 -119
  52. waveorder-1.0.0rc2/examples/README.md +0 -12
  53. waveorder-1.0.0rc2/examples/experimental_reconstructions/PTI_experiment/PTI_Experiment_Recon3D_anisotropic_target_small.ipynb +0 -1128
  54. waveorder-1.0.0rc2/tests/reconstructor/test_2D_QLIPP_recon.py +0 -131
  55. waveorder-1.0.0rc2/tests/test_focus_estimator.py +0 -43
  56. waveorder-1.0.0rc2/waveorder/__init__.py +0 -6
  57. waveorder-1.0.0rc2/waveorder/_version.py +0 -4
  58. waveorder-1.0.0rc2/waveorder/focus.py +0 -105
  59. waveorder-1.0.0rc2/waveorder.egg-info/SOURCES.txt +0 -49
  60. {waveorder-1.0.0rc2 → waveorder-2.0.0}/.git-blame-ignore-revs +0 -0
  61. {waveorder-1.0.0rc2 → waveorder-2.0.0}/.github/workflows/pytests.yml +0 -0
  62. {waveorder-1.0.0rc2 → waveorder-2.0.0}/.gitignore +0 -0
  63. {waveorder-1.0.0rc2 → waveorder-2.0.0}/CITATION.cff +0 -0
  64. {waveorder-1.0.0rc2 → waveorder-2.0.0}/LICENSE +0 -0
  65. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_anisotropic_target.ipynb +0 -0
  66. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_cardiac_muscle.ipynb +0 -0
  67. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_cardiomyocyte_infected_1.ipynb +0 -0
  68. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_cardiomyocyte_infected_2.ipynb +0 -0
  69. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_cardiomyocyte_mock.ipynb +0 -0
  70. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_human_uterus.ipynb +0 -0
  71. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/PTI_experiment/PTI_full_FOV_mouse_brain_aco.ipynb +0 -0
  72. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/QLIPP_experiment/2D_QLIPP_recon_experiment.ipynb +0 -0
  73. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/QLIPP_experiment/3D_QLIPP_recon_experiment.ipynb +0 -0
  74. {waveorder-1.0.0rc2/examples/experimental_reconstructions → waveorder-2.0.0/examples/documentation}/fluorescence_deconvolution/fluorescence_deconv.ipynb +0 -0
  75. {waveorder-1.0.0rc2/examples → waveorder-2.0.0/examples/maintenance}/PTI_simulation/PTI_formulation.html +0 -0
  76. {waveorder-1.0.0rc2 → waveorder-2.0.0}/pyproject.toml +0 -0
  77. {waveorder-1.0.0rc2 → waveorder-2.0.0}/setup.py +0 -0
  78. {waveorder-1.0.0rc2 → waveorder-2.0.0}/tests/__init__.py +0 -0
  79. {waveorder-1.0.0rc2/tests/reconstructor → waveorder-2.0.0/waveorder}/__init__.py +0 -0
  80. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder/background_estimator.py +0 -0
  81. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder.egg-info/dependency_links.txt +0 -0
  82. {waveorder-1.0.0rc2 → waveorder-2.0.0}/waveorder.egg-info/top_level.txt +0 -0
@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: waveorder
- Version: 1.0.0rc2
+ Version: 2.0.0
  Summary: Wave optical simulations and deconvolution of optical properties
  Home-page: https://github.com/mehta-lab/waveorder
  Author: Computational Microscopy Platform, CZ Biohub
@@ -10,7 +10,6 @@ Project-URL: Bug Tracker, https://github.com/mehta-lab/waveorder/issues
  Project-URL: Documentation, https://github.com/mehta-lab/waveorder
  Project-URL: Source Code, https://github.com/mehta-lab/waveorder
  Project-URL: User Support, https://github.com/mehta-lab/waveorder/issues
- Platform: UNKNOWN
  Classifier: Development Status :: 4 - Beta
  Classifier: Intended Audience :: Science/Research
  Classifier: License :: OSI Approved :: BSD License
@@ -29,8 +28,17 @@ Classifier: Operating System :: Unix
  Classifier: Operating System :: MacOS
  Requires-Python: >=3.8
  Description-Content-Type: text/markdown
- Provides-Extra: dev
  License-File: LICENSE
+ Requires-Dist: numpy>=1.21
+ Requires-Dist: matplotlib>=3.1.1
+ Requires-Dist: scipy>=1.3.0
+ Requires-Dist: pywavelets>=1.1.1
+ Requires-Dist: ipywidgets>=7.5.1
+ Requires-Dist: torch>=2.0.0
+ Provides-Extra: dev
+ Requires-Dist: flake8; extra == "dev"
+ Requires-Dist: pytest>=5.0.0; extra == "dev"
+ Requires-Dist: pytest-cov; extra == "dev"
 
  # waveorder
 
@@ -61,9 +69,10 @@ In addition to PTI, `waveorder` enables simulations and reconstructions of subse
 
  PTI provides volumetric reconstructions of mean permittivity ($\propto$ material density), differential permittivity ($\propto$ material anisotropy), 3D orientation, and optic sign. The following figure summarizes PTI acquisition and reconstruction with a small optical section of the mouse brain tissue:
 
- ![Data_flow](https://github.com/mehta-lab/waveorder/blob/main/Fig_Readme.png?raw=true)
+ ![Data_flow](https://github.com/mehta-lab/waveorder/blob/main/readme.png?raw=true)
+
 
- The [example notebooks](examples/) illustrate simulations and reconstruction for 2D QLIPP, 3D PODT, and 2D/3D PTI methods.
+ The [examples](https://github.com/mehta-lab/waveorder/tree/main/examples) illustrate simulations and reconstruction for 2D QLIPP, 3D PODT, and 2D/3D PTI methods.
 
  If you are interested in deploying QLIPP or PODT for label-free imaging at scale, checkout our [napari plugin](https://www.napari-hub.org/plugins/recOrder-napari), [`recOrder-napari`](https://github.com/mehta-lab/recOrder).
 
@@ -73,7 +82,7 @@ In addition to label-free reconstruction algorithms, `waveorder` also implements
 
  1. Correlative measurements of biomolecular density and orientation from polarization-diverse widefield imaging ([multimodal Instant PolScope](https://opg.optica.org/boe/fulltext.cfm?uri=boe-13-5-3102&id=472350))
 
- We provide an [example notebook](examples/experimental_reconstructions/fluorescence_deconvolution/) for widefield fluorescence deconvolution.
+ We provide an [example notebook](https://github.com/mehta-lab/waveorder/blob/main/examples/documentation/fluorescence_deconvolution/fluorescence_deconv.ipynb) for widefield fluorescence deconvolution.
 
  ## Citation
 
@@ -109,13 +118,7 @@ pip install jupyter
  jupyter notebook ./waveorder/examples/
  ```
 
- (Optional) Use NVIDIA GPUs by installing `cupy` with [these instructions](https://docs.cupy.dev/en/stable/install.html). Check that `cupy` is properly installed with
-
- ```sh
- python
- >>> import cupy
- ```
-
- To use GPUs in `waveorder` set ```use_gpu=True``` when initializing the simulator and reconstructor classes.
-
-
+ (M1 users) `pytorch` has [incomplete GPU support](https://github.com/pytorch/pytorch/issues/77764),
+ so please use `export PYTORCH_ENABLE_MPS_FALLBACK=1`
+ to allow some operators to fallback to CPU
+ if you plan to use GPU acceleration for polarization reconstruction.
@@ -27,9 +27,10 @@ In addition to PTI, `waveorder` enables simulations and reconstructions of subse
 
  PTI provides volumetric reconstructions of mean permittivity ($\propto$ material density), differential permittivity ($\propto$ material anisotropy), 3D orientation, and optic sign. The following figure summarizes PTI acquisition and reconstruction with a small optical section of the mouse brain tissue:
 
- ![Data_flow](https://github.com/mehta-lab/waveorder/blob/main/Fig_Readme.png?raw=true)
+ ![Data_flow](https://github.com/mehta-lab/waveorder/blob/main/readme.png?raw=true)
 
- The [example notebooks](examples/) illustrate simulations and reconstruction for 2D QLIPP, 3D PODT, and 2D/3D PTI methods.
+
+ The [examples](https://github.com/mehta-lab/waveorder/tree/main/examples) illustrate simulations and reconstruction for 2D QLIPP, 3D PODT, and 2D/3D PTI methods.
 
  If you are interested in deploying QLIPP or PODT for label-free imaging at scale, checkout our [napari plugin](https://www.napari-hub.org/plugins/recOrder-napari), [`recOrder-napari`](https://github.com/mehta-lab/recOrder).
 
@@ -39,7 +40,7 @@ In addition to label-free reconstruction algorithms, `waveorder` also implements
 
  1. Correlative measurements of biomolecular density and orientation from polarization-diverse widefield imaging ([multimodal Instant PolScope](https://opg.optica.org/boe/fulltext.cfm?uri=boe-13-5-3102&id=472350))
 
- We provide an [example notebook](examples/experimental_reconstructions/fluorescence_deconvolution/) for widefield fluorescence deconvolution.
+ We provide an [example notebook](https://github.com/mehta-lab/waveorder/blob/main/examples/documentation/fluorescence_deconvolution/fluorescence_deconv.ipynb) for widefield fluorescence deconvolution.
 
  ## Citation
 
@@ -75,11 +76,7 @@ pip install jupyter
  jupyter notebook ./waveorder/examples/
  ```
 
- (Optional) Use NVIDIA GPUs by installing `cupy` with [these instructions](https://docs.cupy.dev/en/stable/install.html). Check that `cupy` is properly installed with
-
- ```sh
- python
- >>> import cupy
- ```
-
- To use GPUs in `waveorder` set ```use_gpu=True``` when initializing the simulator and reconstructor classes.
+ (M1 users) `pytorch` has [incomplete GPU support](https://github.com/pytorch/pytorch/issues/77764),
+ so please use `export PYTORCH_ENABLE_MPS_FALLBACK=1`
+ to allow some operators to fallback to CPU
+ if you plan to use GPU acceleration for polarization reconstruction.
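A minimal sketch of the M1 note added above, assuming PyTorch 2.x and that the environment variable should be visible before `torch` is imported (the README's `export` line accomplishes the same thing from the shell):

```python
import os

# Equivalent to `export PYTORCH_ENABLE_MPS_FALLBACK=1` in the shell; set it
# before importing torch so unsupported MPS operators can fall back to the CPU.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

import torch

device = "mps" if torch.backends.mps.is_available() else "cpu"
print(f"waveorder tensors will run on: {device}")
```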
@@ -0,0 +1,308 @@
+ # [\#110 PR](https://github.com/mehta-lab/waveorder/pull/110) `open`: New Stokes and Mueller module
+
+ #### <img src="https://avatars.githubusercontent.com/u/9554101?u=7ab5421e9a6613c01e9c1d3261fa6f93645d48f9&v=4" width="50">[talonchandler](https://github.com/talonchandler) opened issue at [2023-02-27 22:13](https://github.com/mehta-lab/waveorder/pull/110):
+
+ This PR adds a completely rewritten version of all Stokes- and Mueller-related calculations in `waveorder`.
+
+ This rewrite was motivated by the following questions:
+ - Q: How should we handle background correction of S3 to prepare it for deconvolution? A: Keep track of S3 and use Mueller matrices to correct backgrounds.
+ - Q: How should we handle normalization (see the closed #103 for an earlier discussion)? A: No need to normalize in a separate step.
+ - Q: Can we handle larger background retardances? A: Yes, by estimating Mueller matrices.
+
+ **Highlighted improvements:**
+ - Removal of normalization steps: instead of a two-step reconstruction with a normalization followed by reconstruction, this implementation goes straight from Stokes parameters to (retardance, orientation, transmittance, depolarization).
+ - A natively ND implementation: the Stokes and Mueller indices go in the first/second axes, and the remaining indices are arbitrary. A convenience `mmul` (Mueller multiply) function that uses `einsum` is the key simplifying design choice (see the sketch after this list).
+ - A Mueller-matrix-based reconstruction scheme that can handle larger background retardances.
+
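A minimal sketch of the einsum-based Mueller multiply described in the list above (illustrative only; the actual `waveorder.stokes.mmul` may differ in its details): the Stokes/Mueller indices occupy the leading axes, and all trailing sample axes broadcast.

```python
import numpy as np

def mmul(matrix, vector):
    # Contract the leading matrix/vector indices; broadcast over remaining axes.
    return np.einsum("ij...,j...->i...", matrix, vector)

# 5 intensity channels over a (Z, Y, X) volume -> 4 Stokes parameters per voxel
A_inv = np.random.rand(4, 5)              # stand-in for np.linalg.pinv(A)
czyx_data = np.random.rand(5, 8, 64, 64)
S = mmul(A_inv, czyx_data)                # shape (4, 8, 64, 64)

# A per-pixel (4, 4, Y, X) background Mueller matrix broadcasts against (4, Z, Y, X)
M_bg_inv = np.random.rand(4, 4, 64, 64)
S_corrected = mmul(M_bg_inv, S)           # shape (4, 8, 64, 64)
```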
+ **What does the new API look like?** Here's an example snippet from `/hpc/projects/compmicro/projects/infected_cell_imaging/Image_preprocessing/Exp_2023_02_07_A549MemNucl_PolPhase3D/Background_correction_trial/bg-corr-with-mask.py`:
+ ```python
+ # Calculate A and A_inv
+ A = stokes.A_matrix(swing=0.1, scheme="5-State")
+ A_inv = np.linalg.pinv(A)
+
+ # Apply A_inv to background and sample data
+ S_bg = stokes.mmul(A_inv, cyx_bg_data)
+ S_sm = stokes.mmul(A_inv, czyx_data)
+
+ # Calculate background correction matrix from S_bg
+ M_inv = stokes.inv_AR_mueller_from_CPL_projection(*S_bg)
+
+ # Apply background correction to sample data
+ bg_corr_S_sm = stokes.mmul(M_inv, S_sm)
+
+ # Reconstruct parameters
+ ret, ori, tra, dop = stokes.inverse_s0123_CPL_after_ADR(*bg_corr_S_sm)
+ ```
+
+ **Limitations compared to the current `waveorder_reconstructor` implementation:**
+ - No GPU implementation. @ziw-liu maybe you have ideas for flipping a GPU switch for this whole module? The `waveorder_reconstructor` class' parallel `np` and `cp` implementations seem clunky.
+ - Not yet optimized: instead of using differences and ratios to apply background corrections, this implementation uses Mueller matrices and their inverses. This implementation is not slower than the phase reconstructions, and I've added comments in the places where further optimization can improve performance.
+
+ I have not removed the existing implementation in the `waveorder_reconstructor` class. My current plan is to discuss the technical parts of this reimplementation and compare with the existing implementation here, then later I can complete the refactor by removing the Stokes parts of the `waveorder_reconstructor` class and updating the `recOrder` calls.
+
+ Note: this PR is to merge into `alg-dev`, so we have a bit more flexibility in the changes. Temporarily breaking changes/suggestions are okay while we iterate.
+
+ **Specific feedback requests:**
+ - @ziw-liu your comments on a GPU switch and on documentation/testing practice are very welcome. I wasn't sure if I should use type annotations & numpy-style docstrings, so I stayed with only docstrings for now.
+ - @mattersoflight I'm particularly interested in your thoughts on naming. For example, `inverse_s0123_CPL_after_ADR` doesn't roll off the tongue like the earlier `Polarization_recon`, but I think it's important to be very specific at this level. Later layers of abstraction can reintroduce more abstract names like `Polarization_recon` if we think they're helpful.
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-27 23:04](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447249286):
+
+ > @ziw-liu your comments on a GPU switch and on documentation/testing practice are very welcome. I wasn't sure if I should use type annotations & numpy-style docstrings, so I stayed with only docstrings for now.
+
+ Re: GPU implementation, the API between pytorch and numpy seems quite consistent, i.e., `object.operation` runs on CPU if `object` is a numpy array and on GPU if `object` is a torch tensor on GPU, e.g., [pinv](https://pytorch.org/docs/stable/generated/torch.linalg.pinv.html). If we are going to replace the GPU library, I'd first consider pytorch. Can you do a census of the numpy methods you use and see if the same methods are available for torch tensors?
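A small parity check along the lines of the comment above (illustrative sketch only; it uses standard NumPy/PyTorch calls, not `waveorder` code):

```python
import numpy as np
import torch

A = np.random.rand(5, 4)                  # e.g. a 5-state instrument matrix

A_inv_numpy = np.linalg.pinv(A)           # NumPy, CPU
A_inv_torch = torch.linalg.pinv(torch.from_numpy(A))  # same call shape in torch

if torch.cuda.is_available():
    # The identical call runs on the GPU when the tensor lives there.
    A_inv_gpu = torch.linalg.pinv(torch.from_numpy(A).to("cuda"))
```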
+
+ > @mattersoflight I'm particularly interested in your thoughts on naming. For example, inverse_s0123_CPL_after_ADR doesn't roll off the tongue like the earlier Polarization_recon, but I think it's important to be very specific at this level. Later layers of abstraction can reintroduce more abstract names like Polarization_recon if we think they're helpful.
+
+ If we find that all the assumptions in the forward model related to polarization transfer can be covered by two properties, a) the instrument matrix and b) the sample model, we can use a generic name and specify the assumptions via arguments. I'll think more about this.
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-27 23:17](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447267572):
+
+ `fft` is not relevant for this PR, but for deconvolution operations later. This benchmark reports that pytorch fft works almost as fast as cupy: https://thomasaarholt.github.io/fftspeedtest/fftspeedtest.html. pytorch is accelerated on M1, but cupy will require an NVIDIA GPU.
+
+ #### <img src="https://avatars.githubusercontent.com/u/67518483?v=4" width="50">[ziw-liu](https://github.com/ziw-liu) commented at [2023-02-27 23:28](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447278771):
+
+ > Can you do a census of the numpy methods you use and see if the same methods are available for torch tensors?
+
+ I wouldn't be too concerned about NumPy methods. However, the SciPy signal processing API may have much lower coverage. Will have to check in detail.
+
+ #### <img src="https://avatars.githubusercontent.com/u/67518483?v=4" width="50">[ziw-liu](https://github.com/ziw-liu) commented at [2023-02-27 23:36](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447285464):
+
+ Using torch is an interesting idea, in that it is 'accelerated' for CPUs too, so *in theory* the same code can work for CPU and GPU. However, in addition to API coverage, lack of optimization and extra overhead can be [potential issues](https://discuss.pytorch.org/t/torch-is-slow-compared-to-numpy/117502).
+
+ #### <img src="https://avatars.githubusercontent.com/u/67518483?v=4" width="50">[ziw-liu](https://github.com/ziw-liu) commented at [2023-02-27 23:56](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447300776):
+
+ > I wasn't sure if I should use type annotations & numpy-style docstrings, so I stayed with only docstrings for now.
+
+ We don't currently have type checking infra set up, so type hints serve mainly two purposes:
+
+ 1. Help devs as well as users call the API in code elsewhere, because autocompletion and other in-editor static analysis work better.
+ 2. Help generate the docstring. I personally use tools that populate the type info in the docstring automatically from the type hints.
+
+ I like to write type hints because they help me code faster (e.g. I get syntax-highlighted and linted types that are copied over, so fewer typos in the docstring type field). But as long as the code is consistent in style and well documented, I think it's all fine.
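For reference, one way the annotation-plus-numpy-docstring combination discussed here could look (a hypothetical helper written for illustration, not a function from `waveorder`):

```python
import numpy as np
import numpy.typing as npt

def stack_retardance_orientation(
    retardance: npt.NDArray[np.floating],
    orientation: npt.NDArray[np.floating],
) -> npt.NDArray[np.floating]:
    """Stack retardance and orientation along a new leading axis.

    Parameters
    ----------
    retardance : npt.NDArray[np.floating]
        Retardance in radians, any shape.
    orientation : npt.NDArray[np.floating]
        In-plane orientation in radians, same shape as ``retardance``.

    Returns
    -------
    npt.NDArray[np.floating]
        Array with a leading axis of size 2.
    """
    return np.stack([retardance, orientation])
```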
+
+ #### <img src="https://avatars.githubusercontent.com/u/101817974?v=4" width="50">[Soorya19Pradeep](https://github.com/Soorya19Pradeep) commented at [2023-02-28 01:31](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1447428605):
+
+ @talonchandler, the cell membrane signal from the new orientation image computed with the additional background correction definitely has more contrast and a more continuous signal compared to the earlier version with just measured background correction. Thank you!
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-28 15:41](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448405744):
+
+ @talonchandler
+
+ > @Soorya19Pradeep asked about the consistency of this convention with the existing waveorder implementation.
+
+ Could you clarify which convention we are discussing here: the convention for what is called right vs. left circularly polarized light, or the convention for axes of orientation, or maybe both?
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-28 15:47](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448414978):
+
+ @ziw-liu, @talonchandler
+ > Using torch is an interesting idea, in that it is 'accelerated' for CPUs too, so in theory the same code can work for CPU and GPU.
+ My thought was to keep using numpy for CPU, and use torch instead of cupy for GPU. There can be code branches depending on whether you use CPU or GPU, but only when matrices (tensors) are instantiated.
+
+ Let's focus on the model (which is making a lot of sense as I read it), the naming convention, and the numpy implementation in this PR, and start a separate issue to work on GPU acceleration. We should refactor the whole codebase (including deconvolution code) if we change the GPU backend.
+
+ #### <img src="https://avatars.githubusercontent.com/u/101817974?v=4" width="50">[Soorya19Pradeep](https://github.com/Soorya19Pradeep) commented at [2023-02-28 16:29](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448483920):
+
+ > @talonchandler
+ >
+ > > @Soorya19Pradeep asked about the consistency of this convention with the existing waveorder implementation.
+ >
+ > Could you clarify which convention we are discussing here: the convention for what is called right vs. left circularly polarized light, or the convention for axes of orientation, or maybe both?
+
+ @mattersoflight, I am trying to understand how to read the orientation measurement of the cell membrane and whether it makes physical sense. The value of orientation changes with the new implementation and further background correction, so I was curious.
+ This is the orientation information from a cell from the earlier version with measured background correction; colorbar range of values (-0.87, +0.87):
+ <img width="444" alt="image" src="https://user-images.githubusercontent.com/101817974/221914487-29cf0229-ff9b-46a7-b26a-92bfb40dcfc7.png">
+ This is from the new version with just measured background correction, range (+0.87, +2.5). I realized the information here is the same, just inverted and offset by 90 degrees.
+ <img width="444" alt="image" src="https://user-images.githubusercontent.com/101817974/221913601-0e56f4ac-e94c-40c6-a7ac-52fcffaa02d8.png">
+ After the extra background correction, the range changes and more information is visible, range (0, +3.14).
+ <img width="444" alt="image" src="https://user-images.githubusercontent.com/101817974/221913869-0201bc9a-b4e4-46a5-8202-666240768d92.png">
+
+ #### <img src="https://avatars.githubusercontent.com/u/9554101?u=7ab5421e9a6613c01e9c1d3261fa6f93645d48f9&v=4" width="50">[talonchandler](https://github.com/talonchandler) commented at [2023-02-28 16:50](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448514682):
+
+ @mattersoflight
+ > Could you clarify which convention we are discussing here: the convention for what is called right vs. left circularly polarized light, or the convention for axes of orientation, or maybe both?
+
+ Good question...I think we should discuss both.
+
+ I can think of two paths to take here:
+
+ - **Decouple the reconstruction from the orientation convention.** We can give ourselves some flexibility in RCP vs. LCP, relative orientations of the camera wrt the polarizers, axis convention, etc., then give the `recOrder` user (and the scripts) enough "knobs" to fix any deviation from convention. For example, we currently have one knob in recOrder that rotates by 90 degrees. To set the knobs, image a known sample (ideally a Kazansky target, but we can point our users to a more available alternative), twiddle the knobs until your colors match your expectations, then keep those knobs as default reconstruction parameters. **This is effectively what we're doing now, and I think it's workable.**
+
+ - **Couple the reconstruction and orientation convention.** We can choose to be strict with our conventions: make the user specify RCP vs. LCP, the camera orientation, axis convention, etc., then use those parameters as inputs to the reconstruction code. This will lead to the same results as above, but requires more diligence from the `recOrder` user. In practice, I expect that this approach will result in the same workflow as above: fiddle with these (physically motivated) knobs until you match your expectations of a known sample.
+
+ #### <img src="https://avatars.githubusercontent.com/u/9554101?u=7ab5421e9a6613c01e9c1d3261fa6f93645d48f9&v=4" width="50">[talonchandler](https://github.com/talonchandler) commented at [2023-02-28 16:53](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448519731):
+
+ Thanks for the comparison, @Soorya19Pradeep. Very helpful.
+
+ > I realized the information here is the same, just inverted and offset by 90 degrees.
+
+ These are the types of knobs that we might provide in the "decoupling" approach: one checkbox/function for "invert orientation" and one for "rotate by 90 degrees".
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-28 16:55](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448521331):
+
+ > Decouple the reconstruction from the orientation convention. We can give ourselves some flexibility in RCP vs. LCP, relative orientations of the camera wrt the polarizers, axis convention etc., then give the recOrder user (and the scripts) enough "knobs" to fix any deviation from convention. For example, we currently have one knob in recOrder that rotates by 90 degrees. To set the knobs, image a known sample (ideally a Kazansky target, but we can point our users to a more available alternative), twiddle the knobs until your colors match your expectations, then keep those knobs as default reconstruction parameters. This is effectively what we're doing now, and I think it's workable.
+
+ This is the most general approach for any light path. I think two knobs suffice to register the angular coordinate system in the data with the angular coordinate system on the stage: one flip and one rotation.
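A sketch of how the two registration "knobs" discussed above could act on a reconstructed orientation map (an illustrative assumption, not recOrder's actual implementation):

```python
import numpy as np

def register_orientation(orientation, flip=False, rotate90=False):
    # Orientation is only defined modulo 180 degrees (pi radians).
    ori = -orientation if flip else orientation
    if rotate90:
        ori = ori + np.pi / 2
    return np.mod(ori, np.pi)

# e.g. the "inverted and offset by 90 degrees" case observed above:
# registered = register_orientation(ori, flip=True, rotate90=True)
```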
146
+
147
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-02-28 18:34](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1448671628):
148
+
149
+ > After the extra background correction the range changes and more information is visible, range (0,+3.14).
150
+
151
+ Thanks, @Soorya19Pradeep for the examples. It is great that you are taking a close look at FOVs.
152
+
153
+ Seeing the full dynamic range of orientation after correcting background bias is promising! To be sure of the accuracy of the measurement, I suggest finding some patches where you see strong cortical actin bundles. If the background correction in this (arguably challenging) case is accurate, you'd see that orientation is parallel to the actin bundle. Once you have reconstructed retardance and orientation, you can call [`waveorder.visual.plotVectorField` ](https://github.com/mehta-lab/waveorder/blob/4b3b13364f313f752e23dde8bf9cf2080367acb4/waveorder/visual.py#LL995C18-L995C18) to visualize the orientation.
154
+
155
+ #### <img src="https://avatars.githubusercontent.com/u/9554101?u=7ab5421e9a6613c01e9c1d3261fa6f93645d48f9&v=4" width="50">[talonchandler](https://github.com/talonchandler) commented at [2023-03-07 02:23](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1457401823):
156
+
157
+ I've just completed the renaming/refactoring. @mattersoflight this is ready for your re-review.
158
+
159
+ The latest API (at this level) looks like:
160
+ ```
161
+ # Calculate I2S
162
+ I2S = stokes.I2S_matrix(swing, scheme="5-State")
163
+
164
+ # Calculate Stokes vectors
165
+ S_sm = stokes.mmul(I2S, czyx_data) # Apply I2S to sample data
166
+ S_bg_meas = stokes.mmul(I2S, cyx_bg_data) # Apply I2S to measured background
167
+
168
+ # Calculate background correction matrix from S_bg
169
+ M_bg_inv = stokes.mueller_from_stokes(*S_bg_meas)
170
+
171
+ # Apply measured background correction to sample data
172
+ bg_corr_S_sm = stokes.mmul(M_bg_inv, S_sm)
173
+
174
+ # Recover ADR parameters
175
+ ret, ori, tra, dop = stokes.estimate_ADR_from_stokes(*bg_corr_S_sm)
176
+ ```
177
+
178
+ I've also spent some time characterizing the old (green profiles) vs. new algorithms (white profiles).
179
+
180
+ **Soorya's cells - retardance on y axis - measured bkg correction only**
181
+ ![Screenshot 2023-03-06 at 5 53 03 PM](https://user-images.githubusercontent.com/9554101/223301263-a580ca6c-e835-4aaf-85cc-e4712134d70f.png)
182
+ At most 2% difference in retardance.
183
+
184
+ **Kazansky target - retardance on y axis - measured bkg correction only**
185
+ ![Screenshot 2023-03-06 at 5 31 36 PM](https://user-images.githubusercontent.com/9554101/223301450-7e6e5638-03f3-4560-9273-76eeb8a59790.png)
186
+ At most 1% difference in retardance.
187
+
188
+ **Kazansky target - orientation on y axis - measured bkg correction only**
189
+ ![Screenshot 2023-03-06 at 5 35 18 PM](https://user-images.githubusercontent.com/9554101/223301559-46b1ebab-9350-4c7e-90a1-3a1ed560b6ed.png)
190
+ At most 2% difference in non-background regions when the different orientation convention is accounted for. This main difference here is from a difference in orientation conventions which we'll be handling with two user-facing switches as discussed above.
191
+
192
+ **Timing**
193
+ Current performance bottleneck is the pre-calculation of `mueller_from_stokes` from the background stokes vectors, which can be further optimized (I expect factors of 2-4x). For now, here are a couple benchmarks:
194
+
195
+ 1 x 2048 x 2048:
196
+ old algorithm: 1.0 s
197
+ new algorithm: 15.4 s
198
+
199
+ 8 x 2048 x 2048:
200
+ old algorithm: 17.8 s
201
+ new algorithm: 19.6 s
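Benchmarks of this kind can be reproduced with a generic timing harness such as the sketch below (illustrative only; `new_bg_correction` and `old_bg_correction` refer to the comparison script that follows):

```python
import time

def best_wall_time(fn, *args, repeats=3):
    # Report the fastest of a few runs to reduce warm-up and I/O noise.
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - start)
    return min(times)

# e.g. best_wall_time(new_bg_correction, czyx_data, cyx_bg_data, swing, scheme)
```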
+
+ **Example comparison script (generates Kaz target comparison above)**
+
+ <details>
+ <summary>Full example script (click to expand):</summary>
+
+ ```python
+ import numpy as np
+ from waveorder import stokes
+ from recOrder.io.utils import load_bg
+ from recOrder.io.metadata_reader import MetadataReader
+ from recOrder.compute.reconstructions import (
+     initialize_reconstructor,
+     reconstruct_qlipp_stokes,
+     reconstruct_qlipp_birefringence,
+ )
+ from iohub.reader import imread
+ from iohub.ngff import open_ome_zarr
+ import napari
+
+ # Set paths
+ base_path = "/hpc/projects/compmicro/rawdata/hummingbird/Talon/2023_02_08_kaz/"
+ data_subpath = "kaz-raw_recOrderPluginSnap_0/kaz-raw_RawPolDataSnap.zarr"
+ bg_subpath = "BG"
+ cal_subpath = "calibration_metadata.txt"
+
+ # Read data
+ reader = imread(base_path + data_subpath)
+ T, C, Z, Y, X = reader.shape
+ czyx_data = reader.get_array(position=0)[0, ...]
+
+ # Read background data
+ cyx_bg_data = load_bg(base_path + bg_subpath, height=Y, width=X)
+
+ # Read calibration metadata
+ md = MetadataReader(base_path + cal_subpath)
+
+ def new_bg_correction(czyx_data, cyx_bg_data, swing, scheme):
+     # Calculate I2S
+     I2S = stokes.I2S_matrix(md.Swing, scheme=md.get_calibration_scheme())
+
+     # Calculate Stokes vectors
+     S_sm = stokes.mmul(I2S, czyx_data)  # Apply I2S to sample data
+     S_bg_meas = stokes.mmul(I2S, cyx_bg_data)  # Apply I2S to measured background
+
+     # Calculate background correction matrix from S_bg
+     M_bg_inv = stokes.mueller_from_stokes(*S_bg_meas)
+
+     # Apply measured background correction to sample data
+     bg_corr_S_sm = stokes.mmul(M_bg_inv, S_sm)
+
+     # Recover ADR parameters
+     ret, ori, tra, dop = stokes.estimate_ADR_from_stokes(*bg_corr_S_sm)
+
+     ret = ret / (2 * np.pi) * 532
+
+     return ret, ori, tra, dop
+
+ def old_bg_correction(czyx_data, cyx_bg_data, swing, scheme):
+     reconstructor_args = {
+         "image_dim": (Y, X),
+         "n_slices": 1,  # number of slices in z-stack
+         "wavelength_nm": 532,
+         "swing": swing,
+         "calibration_scheme": scheme,  # "4-State" or "5-State"
+         "bg_correction": "global",
+     }
+     reconstructor = initialize_reconstructor(
+         pipeline="birefringence", **reconstructor_args
+     )
+     # Reconstruct background Stokes
+     bg_stokes = reconstruct_qlipp_stokes(cyx_bg_data, reconstructor)
+
+     # Reconstruct data Stokes w/ background correction
+     stokes = reconstruct_qlipp_stokes(czyx_data, reconstructor, bg_stokes)
+
+     birefringence = reconstruct_qlipp_birefringence(stokes, reconstructor)
+     birefringence[0] = (
+         birefringence[0] / (2 * np.pi) * reconstructor_args["wavelength_nm"]
+     )
+     return birefringence
+
+ oldADR = old_bg_correction(czyx_data, cyx_bg_data, md.Swing, md.Calibration_scheme)
+ newADR = new_bg_correction(czyx_data, cyx_bg_data, md.Swing, md.Calibration_scheme)
+
+
+ v = napari.Viewer()
+ v.add_image(oldADR[..., 890:1220, 790:1370], name="old")
+ v.add_image(np.stack(newADR)[..., 890:1220, 790:1370], "new")
+ import pdb; pdb.set_trace()
+ ```
+ </details>
+
+ #### <img src="https://avatars.githubusercontent.com/u/9554101?u=7ab5421e9a6613c01e9c1d3261fa6f93645d48f9&v=4" width="50">[talonchandler](https://github.com/talonchandler) commented at [2023-03-07 21:12](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1458882074):
+
+ I neglected to commit one small change: `stokes.mueller_from_stokes` should be `direction=inverse` by default since this is the most common usage mode (and the usage mode I showed in the snippet from last night).
+
+ #### <img src="https://avatars.githubusercontent.com/u/2934183?u=e638cee77e71afc6e0579e50c2712dfe8a707869&v=4" width="50">[mattersoflight](https://github.com/mattersoflight) commented at [2023-03-07 23:07](https://github.com/mehta-lab/waveorder/pull/110#issuecomment-1459006194):
+
+ Thanks for the offline discussion. Looks great to me!
+
+
+ -------------------------------------------------------------------------------
+
+
+
+ [Export of Github issue for [mehta-lab/waveorder](https://github.com/mehta-lab/waveorder). Generated on 2023.03.07 at 15:20:38.]
@@ -0,0 +1,7 @@
+ `waveorder` is undergoing a significant refactor, and this `examples/` folder serves as a good place to understand the current state of the repository.
+
+ The `models/` folder demonstrates the latest functionality of `waveorder`. These scripts will run as is in an environment with `waveorder` and `napari` installed. Each script demonstrates a simulation and reconstruction with a **model**: a specific set of assumptions about the sample and the data being acquired.
+
+ The `maintenance/` folder demonstrates the functionality of `waveorder` that we plan to move to `models/`. These scripts can be run as is, and they are being maintained with tests.
+
+ The `documentation/` folder consists of examples that demonstrate reconstruction with real data. These examples require access to the complete datasets, so they are not being actively maintained and serve primarily as documentation.
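Since the `models/` scripts are plain Python entry points, one way to run one end-to-end (assuming an environment with `waveorder` and `napari` installed, per the note above) is with the standard library; the path below is one of the files listed in this diff:

```python
import runpy

# Runs the example exactly as `python examples/models/phase_thick_3d.py` would.
runpy.run_path("examples/models/phase_thick_3d.py", run_name="__main__")
```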