pyrception 0.3.2__py3-none-any.whl → 0.3.4__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,120 @@
+ Metadata-Version: 2.4
+ Name: pyrception
+ Version: 0.3.4
+ Author: Alexander Hadjiivanov
+ Requires-Python: >=3.11
+ Description-Content-Type: text/markdown
+ License-File: LICENCE.md
+ Requires-Dist: av
+ Requires-Dist: bokeh
+ Requires-Dist: loguru
+ Requires-Dist: matplotlib
+ Requires-Dist: moviepy
+ Requires-Dist: numpy
+ Requires-Dist: omegaconf
+ Requires-Dist: platformdirs
+ Requires-Dist: scikit-image
+ Requires-Dist: scipy
+ Requires-Dist: tqdm
+ Dynamic: license-file
+
+ [![PyPI - Version](https://img.shields.io/pypi/v/pyrception)](https://pypi.org/project/pyrception/)
+ [![Read The Docs](https://readthedocs.org/projects/pyrception/badge/?version=latest)](https://pyrception.readthedocs.io/en/latest/)
+
+ # 🌄 Overview
+ Pyrception is a simulation framework for biosensors. Currently, it provides the building blocks for simulating key structural and functional elements of visual processing observed in the mammalian retina. The long-term goal of Pyrception is to support multiple sensory modalities (such as auditory, olfactory and tactile) and to provide methods for integrating those inputs into a unified multisensory signal (such as spike trains). Alongside this, Pyrception can also serve as an input *conversion* layer for encoding raw multimodal sensory input into a uniform spike train suitable for processing with spiking neural networks.
+
+ <!-- ![Herd of zebra - multi-stage retinal processing]() -->
+ <div align="center">
+ <video align="center" src="https://github.com/user-attachments/assets/faeea832-3832-48d0-8085-d71e62acc794" type="video/mp4" controls autoplay muted loop></video>
+ <p align="center">An RGB video processed with different layers (see the full <a href="https://pyrception.readthedocs.io/en/latest/notebooks/video/">example</a>).</p>
+ </div>
+
+ ## 🪛 Installation
+
+ You can install Pyrception from PyPI:
+
+ ```shell
+ pip install pyrception
+ ```
+
+ or directly from GitHub (optionally in development mode):
+
+ ```shell
+ git clone git@github.com:cantordust/pyrception.git
+ cd pyrception
+ pip install -e .
+ ```
+
+ ### ♻️ Optional dependencies
+
+ Pyrception supports several dependency groups:
+
+ - `cli`: Command-line interface.
+ - `events`: Support for processing events (including from event cameras).
+ - `dev`: Development tools (for testing, profiling, etc.).
+ - `torch`: PyTorch support.
+ - `ipy`: ipykernel & ipywidgets (for running inside notebooks).
+ - `docs`: Tools for building the documentation.
+   - NOTE: The documentation is built with [MkDocs](https://www.mkdocs.org/), which has been discontinued. The documentation will likely move to [Zensical](https://zensical.org/) soon.
+ - `all`: All of the above.
+
+ Use the `--group` option with `pip` to enable a dependency group (repeat it for each group). For instance:
+
+ ```shell
+ pip install -e . --group events --group docs
+ ```
+
+ will pull in all dependencies necessary for event-based input and for building the documentation.
+
+ ## ⏯️ Usage
+
+ Please refer to the [documentation](https://pyrception.readthedocs.io/en/latest/), which contains step-by-step notebooks demonstrating how to use Pyrception with a [static image](https://pyrception.readthedocs.io/en/latest/notebooks/image/) and an [RGB video](https://pyrception.readthedocs.io/en/latest/notebooks/video/). More notebooks are currently being developed, including one on sparse event input from an event camera. Stay tuned.
+
+ ## 📈 Development
+
+ Please open an issue if you discover a bug or would like to request a feature. Contributions are welcome!
+
+ To generate the documentation locally, clone the repository, install Pyrception with the [`docs` dependency group](#optional-dependencies), and run the MkDocs build pipeline:
+
+ ```shell
+ git clone git@github.com:cantordust/pyrception.git
+ cd pyrception
+ pip install -e . --group docs
+ cd docs
+ mkdocs build
+ ```
+
+ Then, to view the documentation locally, start the MkDocs server:
+
+ ```shell
+ mkdocs serve
+ ```
+
+ # 📋 ToDo
+
+ ## Short-term
+ ### 👁️ Visual package
+ - [X] All major types of retinal cells.
+ - [X] Receptors (raw input, Weber's law).
+ - [X] Horizontal cells (mean local brightness, normalising feedback).
+ - [X] Bipolar cells (positive and negative contrast, temporal filter, excitatory input to ganglion cells).
+ - [X] Amacrine cells (inhibitory input to ganglion cells, modulatory signal to bipolar cells).
+ - [X] Ganglion cells (spiking).
+ - [X] Logpolar kernel arrangement.
+ - [X] Uniform or Gaussian kernels.
+ - [X] Arbitrary kernel size, shape and orientation.
+ - 🚧 Colour vision (with colour opponency).
+ - 🚧 Temporal dynamics.
+ - 🚧 Events as input.
+ - [ ] Saccadic movements.
+
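The receptor item above mentions Weber's law. As a rough illustration only (not Pyrception's actual implementation), a Weber–Fechner-style receptor response scales perceived magnitude with the logarithm of the stimulus relative to its adapting background, so equal intensity *ratios* yield equal response increments:

```python
import math

def weber_fechner_response(intensity: float, background: float, k: float = 1.0) -> float:
    """Illustrative Weber-Fechner response: perceived magnitude grows with
    the log of the stimulus relative to the adapting background level.
    The function name and signature are hypothetical, for illustration only."""
    return k * math.log(intensity / background)

# Doubling the stimulus gives the same increment regardless of background:
r1 = weber_fechner_response(2.0, background=1.0)
r2 = weber_fechner_response(4.0, background=2.0)
# r1 == r2 == log(2)
```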
+ ### 👂 Auditory package
+ WIP.
+
+ ### 👃 Olfactory package
+ WIP.
+
+ ### 🔧 Others
+ - 🚧 Support alternative backends for sparse matrix operations ([CuPy](https://cupy.dev/), [PyTorch](https://pytorch.org/docs/stable/sparse.html), [Sparse](https://sparse.pydata.org/en/stable/)).
+ - 🚧 Interfacing with (neuromorphic) hardware, such as event cameras.
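On the alternative-backends item: a common pattern for supporting CuPy, PyTorch or Sparse interchangeably is to code against a small backend interface and swap implementations behind it. The sketch below uses hypothetical names (`SparseBackend`, `DictBackend`) and a pure-Python dict-of-keys stand-in; it is not Pyrception's actual design.

```python
from typing import Protocol

class SparseBackend(Protocol):
    """Hypothetical minimal interface a sparse backend would implement."""
    def matvec(self, shape: tuple[int, int], entries: dict, vec: list) -> list: ...

class DictBackend:
    """Reference backend: dict-of-keys sparse matrix {(row, col): value}."""
    def matvec(self, shape: tuple[int, int], entries: dict, vec: list) -> list:
        # Accumulate only the non-zero entries, as a sparse backend would.
        out = [0.0] * shape[0]
        for (r, c), v in entries.items():
            out[r] += v * vec[c]
        return out

# 2x3 matrix with two non-zeros: A[0, 1] = 2, A[1, 2] = 3
backend: SparseBackend = DictBackend()
y = backend.matvec((2, 3), {(0, 1): 2.0, (1, 2): 3.0}, [1.0, 1.0, 1.0])
# y == [2.0, 3.0]
```

A CuPy or PyTorch backend would implement the same `matvec` shape, letting the rest of the pipeline stay backend-agnostic.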
@@ -0,0 +1,7 @@
+ pyrception/__init__.py,sha256=WUk9ebEuT3MfgaQIrOUWRGvg1W9M76rUGuJZEQ3-lTE,84
+ pyrception-0.3.4.dist-info/licenses/LICENCE.md,sha256=sMjXzdudBXMsZj3NRSrmovs-xqcYWRmpwUkZmuD8zhA,1070
+ pyrception-0.3.4.dist-info/METADATA,sha256=_93Cev3POLlV1lSp2xux0UDCN9OD3RLJNm6MSwY2y2M,4930
+ pyrception-0.3.4.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
+ pyrception-0.3.4.dist-info/entry_points.txt,sha256=otFE1Ls0p9Hrb4dXjIRruuSLaUFjsKJDjf9R0tDUxmw,55
+ pyrception-0.3.4.dist-info/top_level.txt,sha256=0BQnVt58i3lOTA_eEir2Fz2KxDcruQoBmMAxgMWlfrU,11
+ pyrception-0.3.4.dist-info/RECORD,,
@@ -1,110 +0,0 @@
- Metadata-Version: 2.4
- Name: pyrception
- Version: 0.3.2
- Author: Alexander Hadjiivanov
- Requires-Python: >=3.11
- Description-Content-Type: text/markdown
- License-File: LICENCE.md
- Requires-Dist: av
- Requires-Dist: bokeh
- Requires-Dist: environs
- Requires-Dist: ipykernel
- Requires-Dist: ipywidgets
- Requires-Dist: loguru
- Requires-Dist: matplotlib
- Requires-Dist: moviepy
- Requires-Dist: numpy
- Requires-Dist: omegaconf
- Requires-Dist: platformdirs
- Requires-Dist: scikit-image
- Requires-Dist: scipy
- Requires-Dist: tqdm
- Provides-Extra: gui
- Requires-Dist: pyqtgraph; extra == "gui"
- Requires-Dist: PySide6; extra == "gui"
- Provides-Extra: cli
- Requires-Dist: cloup; extra == "cli"
- Provides-Extra: events
- Requires-Dist: tonic; extra == "events"
- Provides-Extra: dev
- Requires-Dist: black; extra == "dev"
- Requires-Dist: py-spy; extra == "dev"
- Provides-Extra: torch
- Requires-Dist: torch; extra == "torch"
- Provides-Extra: docs
- Requires-Dist: mkdocs; extra == "docs"
- Requires-Dist: mike; extra == "docs"
- Requires-Dist: mkdocs-jupyter; extra == "docs"
- Requires-Dist: mkdocs-material; extra == "docs"
- Requires-Dist: mkdocstrings[python]; extra == "docs"
- Dynamic: license-file
-
- [![PyPI - Version](https://img.shields.io/pypi/v/pyrception)](https://pypi.org/project/pyrception/)
- [![Read The Docs](https://readthedocs.org/projects/pyrception/badge/?version=latest)](https://pyrception.readthedocs.io/en/latest/)
-
- # Overview
- Pyrception is a simulation framework for bio-plausible simulation of perceptual modalities. Currently, it supports visual pathways of the mammalian retina, but the long-term goal is to support modalities such as auditory, olfactory and so forth. It can also serve as an input conversion library for encoding raw multimodal sensory input into a uniform spike train suitable for processing with spiking neural networks.
-
- ## Installation
-
- You can install Pyrception from PyPI:
-
- ```shell
- pip install pyrception
- ```
-
- or directly from GitHub (optionally in development mode):
-
- ```shell
- git clone git@github.com:cantordust/pyrception.git
- cd pyrception
- pip install -e .
- ```
-
- ## Usage
-
- Please refer to the [documentation](https://pyrception.readthedocs.io/en/latest/), which contains a [step-by-step notebook](https://pyrception.readthedocs.io/en/latest/notebooks/image/) demonstrating how to use `pyrception` with a static image. More notebooks are currently being developed, including frame-based RGB input and sparse event input from an event camera. Watch this space.
-
- ## Documentation
-
- To generate the documentation, run the MkDocs build pipeline. Note that to build and view the documentation locally, you have to install `pyrception` from GitHub with the optional `docs` modifier:
-
- ```shell
- pip install -e .[dev]
- cd docs
- mkdocs build
- ```
-
- Then, to view the documentation locally, start the MkDocs server:
-
- ```shell
- mkdocs serve
- ```
-
- # ToDo
-
- ## Short-term
- ### Visual package
- - [X] All major types of retinal cells.
- - [X] Receptors (raw input, Weber's law).
- - [X] Horizontal cells (mean local brightness, normalising feedback).
- - [X] Bipolar cells (positive and negative contrast, temporal filter, excitatory input to ganglion cells).
- - [X] Amacrine cells (inhibitory input to ganglion cells, modulatory signal to bipolar cells).
- - [X] Ganglion cells (spiking).
- - [X] Logpolar kernel arrangement.
- - [X] Uniform or Gaussian kernels.
- - [X] Arbitrary kernel, size, shape and orientation.
- - [ ] Saccadic movements [WIP].
- - [ ] Colour vision (with colour opponency) [WIP].
- - [ ] Temporal dynamics [WIP].
- - [ ] Events as input [WIP].
-
- ### Auditory package
- WIP.
-
- ### Olfactory package
- WIP.
-
- ### Overall functionality
- - [WIP] Support alternative backends for sparse matrix operations ([CuPy](https://cupy.dev/), [PyTorch](https://pytorch.org/docs/stable/sparse.html), [Sparse](https://sparse.pydata.org/en/stable/)).
- - [ ] Interfacing with (neuromorphic) hardware, such as event cameras.
@@ -1,7 +0,0 @@
- pyrception/__init__.py,sha256=WUk9ebEuT3MfgaQIrOUWRGvg1W9M76rUGuJZEQ3-lTE,84
- pyrception-0.3.2.dist-info/licenses/LICENCE.md,sha256=sMjXzdudBXMsZj3NRSrmovs-xqcYWRmpwUkZmuD8zhA,1070
- pyrception-0.3.2.dist-info/METADATA,sha256=weLEAFdSqU3Fb5YLPe0Pcra-9o5Qqkkg92K3bvNPtfk,3885
- pyrception-0.3.2.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
- pyrception-0.3.2.dist-info/entry_points.txt,sha256=otFE1Ls0p9Hrb4dXjIRruuSLaUFjsKJDjf9R0tDUxmw,55
- pyrception-0.3.2.dist-info/top_level.txt,sha256=0BQnVt58i3lOTA_eEir2Fz2KxDcruQoBmMAxgMWlfrU,11
- pyrception-0.3.2.dist-info/RECORD,,