functioneer 0.2.0__tar.gz → 0.2.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.2
  Name: functioneer
- Version: 0.2.0
+ Version: 0.2.1
  Summary: A library providing easy automated analysis for your functions.
  Author-email: Quinn Marsh <quinnmarsh@hotmail.com>
  Maintainer-email: Quinn Marsh <quinnmarsh@hotmail.com>
@@ -31,7 +31,7 @@ Requires-Dist: pandas>=1.0.5
  # Functioneer
  
  **Author**: Quinn Marsh
- **Date**: January 08, 2025
+ **Date**: February 02, 2025
  
  Functioneer is a Python package that automates the analysis of ANY function, enabling you to test and optimize with unlimited combinations of parameters. Whether you're performing parameter sweeps, sensitivity testing, or optimizing digital twins, Functioneer lets you queue up thousands or even millions of tests in seconds. Easily retrieve and analyze results in formats like pandas for seamless integration into your analysis workflows.
  
@@ -159,7 +159,7 @@ Output:
  The parameters `x` and `y` were given 3 and 2 fork values respectively, this created 6 total *leaves* (end of each branch) in the analysis. `rosen` has been evaluated for each *leaf*. Essentially you have begun to map the Rosenbrock function over the x-y domain.
  
  ### Example 3: Optimization
- Lets say you want to find the local minimum of the Rosenbrock (optimize `x` and `y`) for several different flavors Rosenbrock functions (each with different `a` nnd `b` parameters). You would then fork the analysis at parameters `a` and `b` then after the forks perform the optimization on each branch.
+ Let's say you want to find the local minimum of the Rosenbrock function (optimize `x` and `y`) for several variations of `a` and `b` (different flavors of the Rosenbrock function). You would fork the analysis at parameters `a` and `b`, then perform an optimization on each branch.
  ```
  # Create new analysis
  anal = fn.AnalysisModule(dict(x=0, y=0))
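As background for the Example 3 hunk above: each `(a, b)` flavor of the Rosenbrock function has its minimum analytically at `(x, y) = (a, a**2)`, which is what a per-branch optimization should converge to. A minimal plain-Python sketch with hypothetical `(a, b)` values (this is not functioneer's API, just the underlying math):

```python
def rosen(x, y, a, b):
    """Rosenbrock function; minimum value 0 at (x, y) = (a, a**2)."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# Hypothetical (a, b) flavors, mirroring a fork over both parameters.
for a, b in [(1, 100), (2, 100), (2, 200)]:
    x_opt, y_opt = a, a ** 2  # analytic minimizer for this flavor
    print((a, b), (x_opt, y_opt), rosen(x_opt, y_opt, a, b))  # residual 0
```

This is what the per-branch optimizer output should approach for each flavor, up to numerical tolerance.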
@@ -212,8 +212,10 @@ Output:
  1 0.0 1 100 1 10 8100
  2 0.0 1 100 2 20 25601
  ```
+ Notice 3 branches have been created, one for each combination of `x` and `y`: `(x=0, y=0), (x=1, y=10), (x=2, y=20)`
+ 
  ### Example 5: Analysis Steps can be Conditional
- Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. One use case for this is when you want to skip an expensive *analysis step* if the parameters aren't looking good.
+ Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. An example use case is skipping an expensive *analysis step* when the parameters aren't looking "good".
  
  As an arbitrary example, assume that we only care about cases where the optimized value of `y` is above 0.5. Also assume `expensive_func` is costly to run and we want to avoid running it when `y<0.5`.
  ```
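The output rows in the Example 4 hunk above can be spot-checked by hand: with `a=1, b=100`, evaluating the Rosenbrock function at the branch points `(1, 10)` and `(2, 20)` reproduces the tabulated values 8100 and 25601. A short standalone check (plain Python, independent of functioneer):

```python
def rosen(x, y, a=1, b=100):
    """Rosenbrock function with the default flavor a=1, b=100."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# The three branches from the Example 4 output table:
for x, y in [(0, 0), (1, 10), (2, 20)]:
    print(x, y, rosen(x, y))
```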
@@ -1,7 +1,7 @@
  # Functioneer
  
  **Author**: Quinn Marsh
- **Date**: January 08, 2025
+ **Date**: February 02, 2025
  
  Functioneer is a Python package that automates the analysis of ANY function, enabling you to test and optimize with unlimited combinations of parameters. Whether you're performing parameter sweeps, sensitivity testing, or optimizing digital twins, Functioneer lets you queue up thousands or even millions of tests in seconds. Easily retrieve and analyze results in formats like pandas for seamless integration into your analysis workflows.
  
@@ -129,7 +129,7 @@ Output:
  The parameters `x` and `y` were given 3 and 2 fork values respectively, this created 6 total *leaves* (end of each branch) in the analysis. `rosen` has been evaluated for each *leaf*. Essentially you have begun to map the Rosenbrock function over the x-y domain.
  
  ### Example 3: Optimization
- Lets say you want to find the local minimum of the Rosenbrock (optimize `x` and `y`) for several different flavors Rosenbrock functions (each with different `a` nnd `b` parameters). You would then fork the analysis at parameters `a` and `b` then after the forks perform the optimization on each branch.
+ Let's say you want to find the local minimum of the Rosenbrock function (optimize `x` and `y`) for several variations of `a` and `b` (different flavors of the Rosenbrock function). You would fork the analysis at parameters `a` and `b`, then perform an optimization on each branch.
  ```
  # Create new analysis
  anal = fn.AnalysisModule(dict(x=0, y=0))
@@ -182,8 +182,10 @@ Output:
  1 0.0 1 100 1 10 8100
  2 0.0 1 100 2 20 25601
  ```
+ Notice 3 branches have been created, one for each combination of `x` and `y`: `(x=0, y=0), (x=1, y=10), (x=2, y=20)`
+ 
  ### Example 5: Analysis Steps can be Conditional
- Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. One use case for this is when you want to skip an expensive *analysis step* if the parameters aren't looking good.
+ Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. An example use case is skipping an expensive *analysis step* when the parameters aren't looking "good".
  
  As an arbitrary example, assume that we only care about cases where the optimized value of `y` is above 0.5. Also assume `expensive_func` is costly to run and we want to avoid running it when `y<0.5`.
  ```
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
  
  [project]
  name = "functioneer"
- version = "0.2.0"
+ version = "0.2.1"
  authors = [{ name = "Quinn Marsh", email = "quinnmarsh@hotmail.com" }]
  maintainers = [{ name = "Quinn Marsh", email = "quinnmarsh@hotmail.com" }]
  description = "A library providing easy automated analysis for your functions."
@@ -1,6 +1,6 @@
  Metadata-Version: 2.2
  Name: functioneer
- Version: 0.2.0
+ Version: 0.2.1
  Summary: A library providing easy automated analysis for your functions.
  Author-email: Quinn Marsh <quinnmarsh@hotmail.com>
  Maintainer-email: Quinn Marsh <quinnmarsh@hotmail.com>
@@ -31,7 +31,7 @@ Requires-Dist: pandas>=1.0.5
  # Functioneer
  
  **Author**: Quinn Marsh
- **Date**: January 08, 2025
+ **Date**: February 02, 2025
  
  Functioneer is a Python package that automates the analysis of ANY function, enabling you to test and optimize with unlimited combinations of parameters. Whether you're performing parameter sweeps, sensitivity testing, or optimizing digital twins, Functioneer lets you queue up thousands or even millions of tests in seconds. Easily retrieve and analyze results in formats like pandas for seamless integration into your analysis workflows.
  
@@ -159,7 +159,7 @@ Output:
  The parameters `x` and `y` were given 3 and 2 fork values respectively, this created 6 total *leaves* (end of each branch) in the analysis. `rosen` has been evaluated for each *leaf*. Essentially you have begun to map the Rosenbrock function over the x-y domain.
  
  ### Example 3: Optimization
- Lets say you want to find the local minimum of the Rosenbrock (optimize `x` and `y`) for several different flavors Rosenbrock functions (each with different `a` nnd `b` parameters). You would then fork the analysis at parameters `a` and `b` then after the forks perform the optimization on each branch.
+ Let's say you want to find the local minimum of the Rosenbrock function (optimize `x` and `y`) for several variations of `a` and `b` (different flavors of the Rosenbrock function). You would fork the analysis at parameters `a` and `b`, then perform an optimization on each branch.
  ```
  # Create new analysis
  anal = fn.AnalysisModule(dict(x=0, y=0))
@@ -212,8 +212,10 @@ Output:
  1 0.0 1 100 1 10 8100
  2 0.0 1 100 2 20 25601
  ```
+ Notice 3 branches have been created, one for each combination of `x` and `y`: `(x=0, y=0), (x=1, y=10), (x=2, y=20)`
+ 
  ### Example 5: Analysis Steps can be Conditional
- Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. One use case for this is when you want to skip an expensive *analysis step* if the parameters aren't looking good.
+ Any *analysis step* can be given a conditional function that must return true at runtime or else the *analysis step* will be skipped. An example use case is skipping an expensive *analysis step* when the parameters aren't looking "good".
  
  As an arbitrary example, assume that we only care about cases where the optimized value of `y` is above 0.5. Also assume `expensive_func` is costly to run and we want to avoid running it when `y<0.5`.
  ```
File without changes
File without changes