neural-feature-importance 0.5.0__tar.gz → 0.5.2__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (21)
  1. neural_feature_importance-0.5.2/PKG-INFO +126 -0
  2. neural_feature_importance-0.5.2/README.md +113 -0
  3. neural_feature_importance-0.5.2/neural_feature_importance.egg-info/PKG-INFO +126 -0
  4. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/pyproject.toml +2 -0
  5. neural_feature_importance-0.5.0/PKG-INFO +0 -10
  6. neural_feature_importance-0.5.0/README.md +0 -95
  7. neural_feature_importance-0.5.0/neural_feature_importance.egg-info/PKG-INFO +0 -10
  8. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/.github/workflows/python-publish.yml +0 -0
  9. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/AGENTS.md +0 -0
  10. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/compare_feature_importance.py +0 -0
  11. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/full_experiment.py +0 -0
  12. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance/__init__.py +0 -0
  13. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance/callbacks.py +0 -0
  14. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance/utils/__init__.py +0 -0
  15. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance/utils/monitors.py +0 -0
  16. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance.egg-info/SOURCES.txt +0 -0
  17. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance.egg-info/dependency_links.txt +0 -0
  18. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance.egg-info/requires.txt +0 -0
  19. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/neural_feature_importance.egg-info/top_level.txt +0 -0
  20. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/setup.cfg +0 -0
  21. {neural_feature_importance-0.5.0 → neural_feature_importance-0.5.2}/variance-based feature importance in artificial neural networks.ipynb +0 -0
@@ -0,0 +1,126 @@
+ Metadata-Version: 2.4
+ Name: neural-feature-importance
+ Version: 0.5.2
+ Summary: Variance-based feature importance for Neural Networks using callbacks for Keras and PyTorch
+ Author: CR de Sá
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ Requires-Dist: numpy
+ Provides-Extra: tensorflow
+ Requires-Dist: tensorflow; extra == "tensorflow"
+ Provides-Extra: torch
+ Requires-Dist: torch; extra == "torch"
+
+ # neural-feature-importance
+
+ [![PyPI version](https://img.shields.io/pypi/v/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+ [![Python versions](https://img.shields.io/pypi/pyversions/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+
+ Variance-based feature importance for deep learning models.
+
+ `neural-feature-importance` implements the method described in
+ [CR de Sá, *Variance-based Feature Importance in Neural Networks*](https://doi.org/10.1007/978-3-030-33778-0_24).
+ It tracks the variance of the first trainable layer using Welford's algorithm
+ and produces normalized importance scores for each feature.
+
+ ## Features
+
+ - `VarianceImportanceKeras` — drop-in callback for TensorFlow/Keras models
+ - `VarianceImportanceTorch` — helper class for PyTorch training loops
+ - `MetricThreshold` — early-stopping callback based on a monitored metric
+ - Example scripts to reproduce the experiments from the paper
+
+ ## Installation
+
+ ```bash
+ pip install "neural-feature-importance[tensorflow]"  # for Keras
+ pip install "neural-feature-importance[torch]"       # for PyTorch
+ ```
+
+ Retrieve the package version via:
+
+ ```python
+ from neural_feature_importance import __version__
+ print(__version__)
+ ```
+
+ ## Quick start
+
+ ### Keras
+
+ ```python
+ from neural_feature_importance import VarianceImportanceKeras
+ from neural_feature_importance.utils import MetricThreshold
+
+ viann = VarianceImportanceKeras()
+ monitor = MetricThreshold(monitor="val_accuracy", threshold=0.95)
+ model.fit(X, y, validation_split=0.05, epochs=30, callbacks=[viann, monitor])
+ print(viann.feature_importances_)
+ ```
+
+ ### PyTorch
+
+ ```python
+ from neural_feature_importance import VarianceImportanceTorch
+
+ tracker = VarianceImportanceTorch(model)
+ tracker.on_train_begin()
+ for epoch in range(num_epochs):
+     train_one_epoch(model, optimizer, dataloader)
+     tracker.on_epoch_end()
+ tracker.on_train_end()
+ print(tracker.feature_importances_)
+ ```
+
+ ## Example scripts
+
+ Run `compare_feature_importance.py` to train a small network on the Iris dataset
+ and compare the scores with a random forest baseline:
+
+ ```bash
+ python compare_feature_importance.py
+ ```
+
+ Run `full_experiment.py` to reproduce the experiments from the paper:
+
+ ```bash
+ python full_experiment.py
+ ```
+
+ ## Development
+
+ After making changes, run the following checks:
+
+ ```bash
+ python -m py_compile neural_feature_importance/callbacks.py
+ python -m py_compile "variance-based feature importance in artificial neural networks.ipynb" 2>&1 | head
+ jupyter nbconvert --to script "variance-based feature importance in artificial neural networks.ipynb" --stdout | head
+ ```
+
+ ## Citation
+
+ If you use this package in your research, please cite:
+
+ ```bibtex
+ @inproceedings{DBLP:conf/dis/Sa19,
+   author    = {Cl{\'a}udio Rebelo de S{\'a}},
+   editor    = {Petra Kralj Novak and
+                Tomislav Smuc and
+                Saso Dzeroski},
+   title     = {Variance-Based Feature Importance in Neural Networks},
+   booktitle = {Discovery Science - 22nd International Conference, {DS} 2019, Split,
+                Croatia, October 28-30, 2019, Proceedings},
+   series    = {Lecture Notes in Computer Science},
+   volume    = {11828},
+   pages     = {306--315},
+   publisher = {Springer},
+   year      = {2019},
+   url       = {https://doi.org/10.1007/978-3-030-33778-0\_24},
+   doi       = {10.1007/978-3-030-33778-0\_24},
+   timestamp = {Thu, 07 Nov 2019 09:20:36 +0100},
+   biburl    = {https://dblp.org/rec/conf/dis/Sa19.bib},
+   bibsource = {dblp computer science bibliography, https://dblp.org}
+ }
+ ```
+
+ We appreciate citations as they help the community discover this work.
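The README shipped in 0.5.2 (above) says the importance scores come from tracking the variance of the first trainable layer's weights with Welford's algorithm. As a point of reference, a minimal sketch of that online update over per-epoch weight snapshots is shown below. It is an illustration only, not the package's `callbacks.py` implementation; the `RunningVariance` class, the `feature_scores` helper, and the exact way variances and final weights are combined are assumptions.

```python
import numpy as np


class RunningVariance:
    """Welford's online algorithm over successive weight snapshots (hypothetical helper)."""

    def __init__(self) -> None:
        self.count = 0
        self.mean = None
        self.m2 = None

    def update(self, weights: np.ndarray) -> None:
        """Fold one snapshot of the first layer's weight matrix into the running stats."""
        if self.mean is None:
            self.mean = np.zeros_like(weights, dtype=float)
            self.m2 = np.zeros_like(weights, dtype=float)
        self.count += 1
        delta = weights - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (weights - self.mean)

    def variance(self) -> np.ndarray:
        """Sample variance of each weight across the recorded snapshots."""
        return self.m2 / max(self.count - 1, 1)


def feature_scores(var: np.ndarray, final_weights: np.ndarray) -> np.ndarray:
    """One plausible reading of the paper's recipe: weight the per-weight variances
    by the magnitude of the final weights, sum over the layer's output units,
    and min-max normalise to get one score per input feature."""
    raw = np.sum(var * np.abs(final_weights), axis=1)
    return (raw - raw.min()) / (raw.max() - raw.min() + 1e-12)
```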
@@ -0,0 +1,113 @@
+ # neural-feature-importance
+
+ [![PyPI version](https://img.shields.io/pypi/v/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+ [![Python versions](https://img.shields.io/pypi/pyversions/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+
+ Variance-based feature importance for deep learning models.
+
+ `neural-feature-importance` implements the method described in
+ [CR de Sá, *Variance-based Feature Importance in Neural Networks*](https://doi.org/10.1007/978-3-030-33778-0_24).
+ It tracks the variance of the first trainable layer using Welford's algorithm
+ and produces normalized importance scores for each feature.
+
+ ## Features
+
+ - `VarianceImportanceKeras` — drop-in callback for TensorFlow/Keras models
+ - `VarianceImportanceTorch` — helper class for PyTorch training loops
+ - `MetricThreshold` — early-stopping callback based on a monitored metric
+ - Example scripts to reproduce the experiments from the paper
+
+ ## Installation
+
+ ```bash
+ pip install "neural-feature-importance[tensorflow]"  # for Keras
+ pip install "neural-feature-importance[torch]"       # for PyTorch
+ ```
+
+ Retrieve the package version via:
+
+ ```python
+ from neural_feature_importance import __version__
+ print(__version__)
+ ```
+
+ ## Quick start
+
+ ### Keras
+
+ ```python
+ from neural_feature_importance import VarianceImportanceKeras
+ from neural_feature_importance.utils import MetricThreshold
+
+ viann = VarianceImportanceKeras()
+ monitor = MetricThreshold(monitor="val_accuracy", threshold=0.95)
+ model.fit(X, y, validation_split=0.05, epochs=30, callbacks=[viann, monitor])
+ print(viann.feature_importances_)
+ ```
+
+ ### PyTorch
+
+ ```python
+ from neural_feature_importance import VarianceImportanceTorch
+
+ tracker = VarianceImportanceTorch(model)
+ tracker.on_train_begin()
+ for epoch in range(num_epochs):
+     train_one_epoch(model, optimizer, dataloader)
+     tracker.on_epoch_end()
+ tracker.on_train_end()
+ print(tracker.feature_importances_)
+ ```
+
+ ## Example scripts
+
+ Run `compare_feature_importance.py` to train a small network on the Iris dataset
+ and compare the scores with a random forest baseline:
+
+ ```bash
+ python compare_feature_importance.py
+ ```
+
+ Run `full_experiment.py` to reproduce the experiments from the paper:
+
+ ```bash
+ python full_experiment.py
+ ```
+
+ ## Development
+
+ After making changes, run the following checks:
+
+ ```bash
+ python -m py_compile neural_feature_importance/callbacks.py
+ python -m py_compile "variance-based feature importance in artificial neural networks.ipynb" 2>&1 | head
+ jupyter nbconvert --to script "variance-based feature importance in artificial neural networks.ipynb" --stdout | head
+ ```
+
+ ## Citation
+
+ If you use this package in your research, please cite:
+
+ ```bibtex
+ @inproceedings{DBLP:conf/dis/Sa19,
+   author    = {Cl{\'a}udio Rebelo de S{\'a}},
+   editor    = {Petra Kralj Novak and
+                Tomislav Smuc and
+                Saso Dzeroski},
+   title     = {Variance-Based Feature Importance in Neural Networks},
+   booktitle = {Discovery Science - 22nd International Conference, {DS} 2019, Split,
+                Croatia, October 28-30, 2019, Proceedings},
+   series    = {Lecture Notes in Computer Science},
+   volume    = {11828},
+   pages     = {306--315},
+   publisher = {Springer},
+   year      = {2019},
+   url       = {https://doi.org/10.1007/978-3-030-33778-0\_24},
+   doi       = {10.1007/978-3-030-33778-0\_24},
+   timestamp = {Thu, 07 Nov 2019 09:20:36 +0100},
+   biburl    = {https://dblp.org/rec/conf/dis/Sa19.bib},
+   bibsource = {dblp computer science bibliography, https://dblp.org}
+ }
+ ```
+
+ We appreciate citations as they help the community discover this work.
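The PyTorch quick start in the new README calls a `train_one_epoch` helper that it leaves undefined. A minimal sketch of such a helper, assuming a plain supervised loop with a dataloader yielding `(inputs, targets)` batches and a cross-entropy loss, could look like this (it is not part of the package):

```python
import torch
from torch import nn


def train_one_epoch(model: nn.Module,
                    optimizer: torch.optim.Optimizer,
                    dataloader: torch.utils.data.DataLoader) -> float:
    """One pass over the training data; returns the mean loss (assumed setup)."""
    criterion = nn.CrossEntropyLoss()
    model.train()
    total_loss, batches = 0.0, 0
    for inputs, targets in dataloader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()  # tracker reads the updated first-layer weights at epoch end
        total_loss += loss.item()
        batches += 1
    return total_loss / max(batches, 1)
```

With a helper along these lines, the quick-start loop runs as written: `tracker.on_epoch_end()` is called once per epoch, so the running variance sees the first-layer weights after every epoch of updates.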
@@ -0,0 +1,126 @@
+ Metadata-Version: 2.4
+ Name: neural-feature-importance
+ Version: 0.5.2
+ Summary: Variance-based feature importance for Neural Networks using callbacks for Keras and PyTorch
+ Author: CR de Sá
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ Requires-Dist: numpy
+ Provides-Extra: tensorflow
+ Requires-Dist: tensorflow; extra == "tensorflow"
+ Provides-Extra: torch
+ Requires-Dist: torch; extra == "torch"
+
+ # neural-feature-importance
+
+ [![PyPI version](https://img.shields.io/pypi/v/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+ [![Python versions](https://img.shields.io/pypi/pyversions/neural-feature-importance.svg)](https://pypi.org/project/neural-feature-importance/)
+
+ Variance-based feature importance for deep learning models.
+
+ `neural-feature-importance` implements the method described in
+ [CR de Sá, *Variance-based Feature Importance in Neural Networks*](https://doi.org/10.1007/978-3-030-33778-0_24).
+ It tracks the variance of the first trainable layer using Welford's algorithm
+ and produces normalized importance scores for each feature.
+
+ ## Features
+
+ - `VarianceImportanceKeras` — drop-in callback for TensorFlow/Keras models
+ - `VarianceImportanceTorch` — helper class for PyTorch training loops
+ - `MetricThreshold` — early-stopping callback based on a monitored metric
+ - Example scripts to reproduce the experiments from the paper
+
+ ## Installation
+
+ ```bash
+ pip install "neural-feature-importance[tensorflow]"  # for Keras
+ pip install "neural-feature-importance[torch]"       # for PyTorch
+ ```
+
+ Retrieve the package version via:
+
+ ```python
+ from neural_feature_importance import __version__
+ print(__version__)
+ ```
+
+ ## Quick start
+
+ ### Keras
+
+ ```python
+ from neural_feature_importance import VarianceImportanceKeras
+ from neural_feature_importance.utils import MetricThreshold
+
+ viann = VarianceImportanceKeras()
+ monitor = MetricThreshold(monitor="val_accuracy", threshold=0.95)
+ model.fit(X, y, validation_split=0.05, epochs=30, callbacks=[viann, monitor])
+ print(viann.feature_importances_)
+ ```
+
+ ### PyTorch
+
+ ```python
+ from neural_feature_importance import VarianceImportanceTorch
+
+ tracker = VarianceImportanceTorch(model)
+ tracker.on_train_begin()
+ for epoch in range(num_epochs):
+     train_one_epoch(model, optimizer, dataloader)
+     tracker.on_epoch_end()
+ tracker.on_train_end()
+ print(tracker.feature_importances_)
+ ```
+
+ ## Example scripts
+
+ Run `compare_feature_importance.py` to train a small network on the Iris dataset
+ and compare the scores with a random forest baseline:
+
+ ```bash
+ python compare_feature_importance.py
+ ```
+
+ Run `full_experiment.py` to reproduce the experiments from the paper:
+
+ ```bash
+ python full_experiment.py
+ ```
+
+ ## Development
+
+ After making changes, run the following checks:
+
+ ```bash
+ python -m py_compile neural_feature_importance/callbacks.py
+ python -m py_compile "variance-based feature importance in artificial neural networks.ipynb" 2>&1 | head
+ jupyter nbconvert --to script "variance-based feature importance in artificial neural networks.ipynb" --stdout | head
+ ```
+
+ ## Citation
+
+ If you use this package in your research, please cite:
+
+ ```bibtex
+ @inproceedings{DBLP:conf/dis/Sa19,
+   author    = {Cl{\'a}udio Rebelo de S{\'a}},
+   editor    = {Petra Kralj Novak and
+                Tomislav Smuc and
+                Saso Dzeroski},
+   title     = {Variance-Based Feature Importance in Neural Networks},
+   booktitle = {Discovery Science - 22nd International Conference, {DS} 2019, Split,
+                Croatia, October 28-30, 2019, Proceedings},
+   series    = {Lecture Notes in Computer Science},
+   volume    = {11828},
+   pages     = {306--315},
+   publisher = {Springer},
+   year      = {2019},
+   url       = {https://doi.org/10.1007/978-3-030-33778-0\_24},
+   doi       = {10.1007/978-3-030-33778-0\_24},
+   timestamp = {Thu, 07 Nov 2019 09:20:36 +0100},
+   biburl    = {https://dblp.org/rec/conf/dis/Sa19.bib},
+   bibsource = {dblp computer science bibliography, https://dblp.org}
+ }
+ ```
+
+ We appreciate citations as they help the community discover this work.
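The Keras quick start repeated in this PKG-INFO assumes `model`, `X`, and `y` already exist. A hypothetical setup that would make that snippet runnable on the Iris data, in the spirit of the bundled `compare_feature_importance.py` (whose exact architecture may differ), is sketched below:

```python
import numpy as np
from sklearn.datasets import load_iris
from tensorflow import keras

# Small classification problem with four input features.
data = load_iris()
X = data.data.astype("float32")
y = keras.utils.to_categorical(data.target)

# A compact network; the first Dense layer is the one whose weights
# the variance callback tracks during training.
model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(y.shape[1], activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```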
@@ -6,8 +6,10 @@ build-backend = "setuptools.build_meta"
  name = "neural-feature-importance"
  description = "Variance-based feature importance for Neural Networks using callbacks for Keras and PyTorch"
  authors = [{name = "CR de Sá"}]
+ readme = "README.md"
  dependencies = ["numpy"]
  dynamic = ["version"]
+ requires-python = ">=3.10"

  [project.optional-dependencies]
  tensorflow = ["tensorflow"]
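The two added `pyproject.toml` fields above are what produce the new `Requires-Python` and readme-backed metadata visible in the 0.5.2 PKG-INFO. To confirm them on an installed copy, the standard library's `importlib.metadata` can read the same headers; a small check is sketched below (the commented values reflect the 0.5.2 metadata shown in this diff):

```python
from importlib.metadata import metadata, requires

meta = metadata("neural-feature-importance")
print(meta["Requires-Python"])           # expected: >=3.10
print(meta["Description-Content-Type"])  # expected: text/markdown

# Optional dependencies declared via extras show up as environment markers.
for req in requires("neural-feature-importance") or []:
    print(req)  # e.g. 'tensorflow; extra == "tensorflow"'
```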
@@ -1,10 +0,0 @@
- Metadata-Version: 2.4
- Name: neural-feature-importance
- Version: 0.5.0
- Summary: Variance-based feature importance for Neural Networks using callbacks for Keras and PyTorch
- Author: CR de Sá
- Requires-Dist: numpy
- Provides-Extra: tensorflow
- Requires-Dist: tensorflow; extra == "tensorflow"
- Provides-Extra: torch
- Requires-Dist: torch; extra == "torch"
@@ -1,95 +0,0 @@
- # Variance-based Feature Importance in Neural Networks / Deep Learning
-
- This file provides a working example of how to measure the importance of features (inputs) in neural networks.
-
- This method is a new method to measure the relative importance of features in Artificial Neural Networks (ANN) models. Its underlying principle assumes that the more important a feature is, the more the weights, connected to the respective input neuron, will change during the training of the model. To capture this behavior, a running variance of every weight connected to the input layer is measured during training. For that, an adaptation of Welford's online algorithm for computing the online variance is proposed.
-
- When the training is finished, for each input, the variances of the weights are combined with the final weights to obtain the measure of relative importance for each feature.
-
- The file **variance-based feature importance in artificial neural networks.ipynb** includes the code to fully replicate the results obtained in the paper:
-
- CR de Sá [**Variance-based Feature Importance in Neural Networks**](https://doi.org/10.1007/978-3-030-33778-0_24)
- 22st International Conference on Discovery Science (DS 2019) Split, Croatia, October 28-30, 2019
-
-
- ## VIANN
- #### Variance-based Feature Importance of Artificial Neural Networks
-
- This repository exposes the feature importance callback as a small Python package named `neural-feature-importance`.
- It will automatically track the first layer that contains trainable weights so you can use it with models that start with an `InputLayer` or other preprocessing layers.
- There is also a helper for PyTorch models that follows the same API.
-
- Install with pip and select the extras that match your framework:
-
- ```bash
- pip install "neural-feature-importance[tensorflow]"  # for Keras
- pip install "neural-feature-importance[torch]"       # for PyTorch
- ```
-
- The package uses `setuptools_scm` to derive its version from Git tags. Access it
- via:
-
- ```python
- from neural_feature_importance import __version__
-
- print(__version__)
- ```
-
- ```python
- from neural_feature_importance import VarianceImportanceCallback, AccuracyMonitor
-
- import logging
-
- logging.basicConfig(level=logging.INFO)
-
- VIANN = VarianceImportanceCallback()
- monitor = AccuracyMonitor(baseline=0.95)
- ```
-
- For a PyTorch model, use ``VarianceImportanceTorch`` and call its
- ``on_train_begin``, ``on_epoch_end`` and ``on_train_end`` methods inside your
- training loop:
-
- ```python
- from neural_feature_importance import VarianceImportanceTorch
-
- tracker = VarianceImportanceTorch(model)
- tracker.on_train_begin()
- for epoch in range(num_epochs):
-     train_one_epoch(model, optimizer, data_loader)
-     tracker.on_epoch_end()
- tracker.on_train_end()
- print(tracker.var_scores)
- ```
-
- Use this callback during model training:
-
- ```python
- model = Sequential()
- model.add(Dense(50, input_dim=input_dim, activation='relu', kernel_initializer='normal', kernel_regularizer=l2(0.01)))
- model.add(Dense(100, activation='relu', kernel_initializer='normal', kernel_regularizer=l2(0.01)))
- model.add(Dense(50, activation='relu', kernel_initializer='normal', kernel_regularizer=l2(0.01)))
- model.add(Dense(5, activation='softmax', kernel_initializer='normal'))
-
- model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
- model.fit(X, Y, validation_split=0.05, epochs=30, batch_size=64, shuffle=True,
-           verbose=1, callbacks=[VIANN, monitor])
-
- print(VIANN.var_scores)
- ```
-
- ## Comparing with Random Forest
-
- To verify the variance-based scores, run `compare_feature_importance.py`. The
- script trains a small neural network on the Iris dataset and compares the scores
- with those from a `RandomForestClassifier`.
-
- ```bash
- python compare_feature_importance.py
- ```
-
- For a larger experiment across several datasets, run `full_experiment.py`. The script builds a simple network for each dataset, applies the `AccuracyMonitor` for early stopping, and prints the correlation between neural network importances and a random forest baseline.
-
- ```bash
- python full_experiment.py
- ```
@@ -1,10 +0,0 @@
- Metadata-Version: 2.4
- Name: neural-feature-importance
- Version: 0.5.0
- Summary: Variance-based feature importance for Neural Networks using callbacks for Keras and PyTorch
- Author: CR de Sá
- Requires-Dist: numpy
- Provides-Extra: tensorflow
- Requires-Dist: tensorflow; extra == "tensorflow"
- Provides-Extra: torch
- Requires-Dist: torch; extra == "torch"