fxn-0.0.38.tar.gz → fxn-0.0.40.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {fxn-0.0.38 → fxn-0.0.40}/PKG-INFO +5 -5
- {fxn-0.0.38 → fxn-0.0.40}/README.md +4 -4
- fxn-0.0.40/fxn/lib/linux/arm64/libFunction.so +0 -0
- fxn-0.0.40/fxn/lib/linux/x86_64/libFunction.so +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/lib/macos/arm64/Function.dylib +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/lib/macos/x86_64/Function.dylib +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/lib/windows/arm64/Function.dll +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/lib/windows/x86_64/Function.dll +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/services/prediction.py +9 -14
- {fxn-0.0.38 → fxn-0.0.40}/fxn/version.py +1 -1
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/PKG-INFO +5 -5
- fxn-0.0.38/fxn/lib/linux/arm64/libFunction.so +0 -0
- fxn-0.0.38/fxn/lib/linux/x86_64/libFunction.so +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/LICENSE +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/api/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/api/client.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/configuration.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/dtype.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/fxnc.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/map.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/prediction.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/predictor.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/status.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/stream.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/value.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/c/version.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/auth.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/env.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/misc.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/predict.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/cli/predictors.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/function.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/lib/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/services/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/services/predictor.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/services/user.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/types/__init__.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/types/dtype.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/types/prediction.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/types/predictor.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn/types/user.py +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/SOURCES.txt +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/dependency_links.txt +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/entry_points.txt +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/requires.txt +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/fxn.egg-info/top_level.txt +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/pyproject.toml +0 -0
- {fxn-0.0.38 → fxn-0.0.40}/setup.cfg +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: fxn
-Version: 0.0.38
+Version: 0.0.40
 Summary: Run prediction functions locally in Python. Register at https://fxn.ai.
 Author-email: "NatML Inc." <hi@fxn.ai>
 License: Apache License
@@ -229,7 +229,10 @@ Requires-Dist: typer
 
 [](https://fxn.ai/community)
 
-Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+
+> [!TIP]
+> [Join our waitlist](https://fxn.ai/waitlist) to bring your custom Python functions and run them on-device across Android, iOS, macOS, Linux, web, and Windows.
 
 ## Installing Function
 Function is distributed on PyPi. This distribution contains both the Python client and the command line interface (CLI). To install, open a terminal and run the following command:
@@ -266,9 +269,6 @@ prediction = fxn.predictions.create(
 print(prediction.results[0])
 ```
 
-> [!TIP]
-> Explore public predictors [on Function](https://fxn.ai/explore) or [create your own](https://fxn.ai/waitlist).
-
 ## Using the Function CLI
 Open up a terminal and login to the Function CLI:
 ```sh
@@ -4,7 +4,10 @@
 
 [](https://fxn.ai/community)
 
-Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+
+> [!TIP]
+> [Join our waitlist](https://fxn.ai/waitlist) to bring your custom Python functions and run them on-device across Android, iOS, macOS, Linux, web, and Windows.
 
 ## Installing Function
 Function is distributed on PyPi. This distribution contains both the Python client and the command line interface (CLI). To install, open a terminal and run the following command:
@@ -41,9 +44,6 @@ prediction = fxn.predictions.create(
 print(prediction.results[0])
 ```
 
-> [!TIP]
-> Explore public predictors [on Function](https://fxn.ai/explore) or [create your own](https://fxn.ai/waitlist).
-
 ## Using the Function CLI
 Open up a terminal and login to the Function CLI:
 ```sh
Binary file
Binary file
Binary file
Binary file
Binary file
Binary file
@@ -31,6 +31,8 @@ class PredictionService:
         self.client = client
         self.__fxnc = PredictionService.__load_fxnc()
         self.__cache = { }
+        self.__cache_dir = self.__class__.__get_resource_dir() / ".fxn" / "cache"
+        self.__cache_dir.mkdir(parents=True, exist_ok=True)
 
     def create (
         self,
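The hunk above moves cache-directory creation into the constructor and adds `parents=True`, so the full `.fxn/cache` chain is created even when `.fxn` itself does not exist yet. A minimal sketch of that behavior (the helper name `ensure_cache_dir` is ours, not part of the package):

```python
from pathlib import Path
from tempfile import TemporaryDirectory

def ensure_cache_dir(base: Path) -> Path:
    # parents=True creates missing ancestors (".fxn"); exist_ok=True
    # makes repeat calls no-ops, so this is safe to run on every init.
    cache_dir = base / ".fxn" / "cache"
    cache_dir.mkdir(parents=True, exist_ok=True)
    return cache_dir
```

Without `parents=True`, `mkdir` raises `FileNotFoundError` when `.fxn` is missing, which is the failure mode the old per-call `cache_dir.mkdir(exist_ok=True)` was exposed to.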
@@ -89,7 +91,7 @@ class PredictionService:
         self,
         tag: str,
         *,
-        inputs: Dict[str,
+        inputs: Dict[str, float | int | str | bool | NDArray | List[Any] | Dict[str, Any] | Path | Image.Image] = {},
         acceleration: Acceleration=Acceleration.Default,
         client_id: str=None,
         configuration_id: str=None
@@ -127,12 +129,12 @@ class PredictionService:
             yield prediction
 
     @classmethod
-    def __load_fxnc (
+    def __load_fxnc (cls) -> Optional[CDLL]:
         os = system().lower()
         os = "macos" if os == "darwin" else os
-        arch = machine()
+        arch = machine().lower()
         arch = "arm64" if arch == "aarch64" else arch
-        arch = "x86_64" if arch
+        arch = "x86_64" if arch in ["x64", "amd64"] else arch
         package = f"fxn.lib.{os}.{arch}"
         resource = "libFunction.so"
         resource = "Function.dylib" if os == "macos" else resource
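The `__load_fxnc` change lowercases `machine()` before matching, so Windows machines reporting `"AMD64"` and other mixed-case values now map onto the library's `linux/macos/windows` × `arm64/x86_64` directory layout. A sketch of the normalization as a pure function (the name `normalize_platform` is ours):

```python
def normalize_platform(os_name: str, arch: str) -> tuple[str, str]:
    # Lowercase first, then map platform-specific aliases, mirroring
    # the 0.0.40 fix: "Darwin" -> "macos", "aarch64" -> "arm64",
    # "x64"/"amd64" -> "x86_64".
    os_name = os_name.lower()
    os_name = "macos" if os_name == "darwin" else os_name
    arch = arch.lower()
    arch = "arm64" if arch == "aarch64" else arch
    arch = "x86_64" if arch in ["x64", "amd64"] else arch
    return os_name, arch
```

In the real code the inputs come from `platform.system()` and `platform.machine()`, and the result selects the package path `fxn.lib.{os}.{arch}` holding the native library.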
@@ -141,11 +143,6 @@ class PredictionService:
             return load_fxnc(fxnc_path)
 
     def __get_client_id (self) -> str:
-        # Fallback if fxnc failed to load
-        if not self.__fxnc:
-            os = system().lower()
-            os = "macos" if os == "darwin" else os
-            return f"{os}-{machine()}"
         # Get
         buffer = create_string_buffer(64)
         status = self.__fxnc.FXNConfigurationGetClientID(buffer, len(buffer))
@@ -278,7 +275,7 @@ class PredictionService:
 
     def __to_value (
         self,
-        value:
+        value: float | int | bool | str | NDArray | List[Any] | Dict[str, Any] | Image.Image | bytes | bytearray | memoryview | BytesIO | None
     ) -> type[FXNValueRef]:
         value = PredictionService.__try_ensure_serializable(value)
         fxnc = self.__fxnc
@@ -336,7 +333,7 @@ class PredictionService:
     def __to_object (
         self,
         value: type[FXNValueRef]
-    ) ->
+    ) -> float | int | bool | str | NDArray | List[Any] | Dict[str, Any] | Image.Image | BytesIO | None:
         # Type
         fxnc = self.__fxnc
         dtype = FXNDtype()
@@ -376,10 +373,8 @@ class PredictionService:
         raise RuntimeError(f"Failed to convert Function value to Python value because Function value has unsupported type: {dtype}")
 
     def __get_resource_path (self, resource: PredictionResource) -> Path:
-        cache_dir = self.__class__.__get_resource_dir() / ".fxn" / "cache"
-        cache_dir.mkdir(exist_ok=True)
         res_name = Path(urlparse(resource.url).path).name
-        res_path =
+        res_path = self.__cache_dir / res_name
         if res_path.exists():
             return res_path
         req = get(resource.url)
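The `__get_resource_path` hunk keys the on-disk cache by the final path component of the resource URL, now rooted at the `__cache_dir` created in the constructor. A sketch of the filename derivation (helper name and example URL are ours, for illustration only):

```python
from pathlib import Path
from urllib.parse import urlparse

def resource_cache_path(cache_dir: Path, url: str) -> Path:
    # urlparse(url).path drops the query string and fragment, so the
    # cached filename is just the URL's last path segment.
    res_name = Path(urlparse(url).path).name
    return cache_dir / res_name
```

If the resulting path already exists, the service returns it directly; otherwise it downloads the resource and writes it there.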
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: fxn
-Version: 0.0.38
+Version: 0.0.40
 Summary: Run prediction functions locally in Python. Register at https://fxn.ai.
 Author-email: "NatML Inc." <hi@fxn.ai>
 License: Apache License
@@ -229,7 +229,10 @@ Requires-Dist: typer
 
 [](https://fxn.ai/community)
 
-Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+Run prediction functions (a.k.a "predictors") locally in your Python apps, with full GPU acceleration and zero dependencies.
+
+> [!TIP]
+> [Join our waitlist](https://fxn.ai/waitlist) to bring your custom Python functions and run them on-device across Android, iOS, macOS, Linux, web, and Windows.
 
 ## Installing Function
 Function is distributed on PyPi. This distribution contains both the Python client and the command line interface (CLI). To install, open a terminal and run the following command:
@@ -266,9 +269,6 @@ prediction = fxn.predictions.create(
 print(prediction.results[0])
 ```
 
-> [!TIP]
-> Explore public predictors [on Function](https://fxn.ai/explore) or [create your own](https://fxn.ai/waitlist).
-
 ## Using the Function CLI
 Open up a terminal and login to the Function CLI:
 ```sh
Binary file
Binary file
File without changes (×38)