clarifai 11.7.5rc1__py3-none-any.whl → 11.8.1__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- clarifai/__init__.py +1 -1
- clarifai/client/user.py +0 -172
- clarifai/runners/models/model_builder.py +0 -133
- clarifai/runners/models/model_runner.py +21 -3
- clarifai/runners/models/openai_class.py +18 -0
- {clarifai-11.7.5rc1.dist-info → clarifai-11.8.1.dist-info}/METADATA +1 -1
- clarifai-11.8.1.dist-info/RECORD +129 -0
- {clarifai-11.7.5rc1.dist-info → clarifai-11.8.1.dist-info}/WHEEL +1 -1
- clarifai/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/__pycache__/__init__.cpython-312.pyc +0 -0
- clarifai/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/__pycache__/errors.cpython-311.pyc +0 -0
- clarifai/__pycache__/errors.cpython-39.pyc +0 -0
- clarifai/__pycache__/versions.cpython-311.pyc +0 -0
- clarifai/__pycache__/versions.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/base.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/base.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/compute_cluster.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/compute_cluster.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/deployment.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/deployment.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/model.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/model.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/nodepool.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/nodepool.cpython-39.pyc +0 -0
- clarifai/cli/__pycache__/pipeline.cpython-311.pyc +0 -0
- clarifai/cli/__pycache__/pipeline_step.cpython-311.pyc +0 -0
- clarifai/cli/model_templates.py +0 -243
- clarifai/cli/pipeline_step_templates.py +0 -64
- clarifai/cli/templates/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/cli/templates/__pycache__/pipeline_templates.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/__init__.cpython-312.pyc +0 -0
- clarifai/client/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/app.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/app.cpython-312.pyc +0 -0
- clarifai/client/__pycache__/app.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/base.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/base.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/compute_cluster.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/dataset.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/dataset.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/deployment.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/deployment.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/input.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/input.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/lister.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/lister.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/model.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/model.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/model_client.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/model_client.cpython-39.pyc +0 -0
- clarifai/client/__pycache__/module.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/nodepool.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/pipeline.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/pipeline_step.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/runner.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/search.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/user.cpython-311.pyc +0 -0
- clarifai/client/__pycache__/workflow.cpython-311.pyc +0 -0
- clarifai/client/auth/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/client/auth/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/client/auth/__pycache__/helper.cpython-311.pyc +0 -0
- clarifai/client/auth/__pycache__/helper.cpython-39.pyc +0 -0
- clarifai/client/auth/__pycache__/register.cpython-311.pyc +0 -0
- clarifai/client/auth/__pycache__/register.cpython-39.pyc +0 -0
- clarifai/client/auth/__pycache__/stub.cpython-311.pyc +0 -0
- clarifai/client/auth/__pycache__/stub.cpython-39.pyc +0 -0
- clarifai/constants/__pycache__/base.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/base.cpython-39.pyc +0 -0
- clarifai/constants/__pycache__/dataset.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/dataset.cpython-39.pyc +0 -0
- clarifai/constants/__pycache__/input.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/input.cpython-39.pyc +0 -0
- clarifai/constants/__pycache__/model.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/model.cpython-39.pyc +0 -0
- clarifai/constants/__pycache__/rag.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/search.cpython-311.pyc +0 -0
- clarifai/constants/__pycache__/workflow.cpython-311.pyc +0 -0
- clarifai/datasets/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/datasets/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/datasets/export/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/datasets/export/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/datasets/export/__pycache__/inputs_annotations.cpython-311.pyc +0 -0
- clarifai/datasets/export/__pycache__/inputs_annotations.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/base.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/base.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/features.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/features.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/image.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/image.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/multimodal.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/multimodal.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/text.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/text.cpython-39.pyc +0 -0
- clarifai/datasets/upload/__pycache__/utils.cpython-311.pyc +0 -0
- clarifai/datasets/upload/__pycache__/utils.cpython-39.pyc +0 -0
- clarifai/datasets/upload/loaders/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/datasets/upload/loaders/__pycache__/coco_detection.cpython-311.pyc +0 -0
- clarifai/datasets/upload/loaders/__pycache__/imagenet_classification.cpython-311.pyc +0 -0
- clarifai/models/model_serving/README.md +0 -158
- clarifai/models/model_serving/__init__.py +0 -14
- clarifai/models/model_serving/cli/__init__.py +0 -12
- clarifai/models/model_serving/cli/_utils.py +0 -53
- clarifai/models/model_serving/cli/base.py +0 -14
- clarifai/models/model_serving/cli/build.py +0 -79
- clarifai/models/model_serving/cli/clarifai_clis.py +0 -33
- clarifai/models/model_serving/cli/create.py +0 -171
- clarifai/models/model_serving/cli/example_cli.py +0 -34
- clarifai/models/model_serving/cli/login.py +0 -26
- clarifai/models/model_serving/cli/upload.py +0 -179
- clarifai/models/model_serving/constants.py +0 -21
- clarifai/models/model_serving/docs/cli.md +0 -161
- clarifai/models/model_serving/docs/concepts.md +0 -229
- clarifai/models/model_serving/docs/dependencies.md +0 -11
- clarifai/models/model_serving/docs/inference_parameters.md +0 -139
- clarifai/models/model_serving/docs/model_types.md +0 -19
- clarifai/models/model_serving/model_config/__init__.py +0 -16
- clarifai/models/model_serving/model_config/base.py +0 -369
- clarifai/models/model_serving/model_config/config.py +0 -312
- clarifai/models/model_serving/model_config/inference_parameter.py +0 -129
- clarifai/models/model_serving/model_config/model_types_config/multimodal-embedder.yaml +0 -25
- clarifai/models/model_serving/model_config/model_types_config/text-classifier.yaml +0 -19
- clarifai/models/model_serving/model_config/model_types_config/text-embedder.yaml +0 -20
- clarifai/models/model_serving/model_config/model_types_config/text-to-image.yaml +0 -19
- clarifai/models/model_serving/model_config/model_types_config/text-to-text.yaml +0 -19
- clarifai/models/model_serving/model_config/model_types_config/visual-classifier.yaml +0 -22
- clarifai/models/model_serving/model_config/model_types_config/visual-detector.yaml +0 -32
- clarifai/models/model_serving/model_config/model_types_config/visual-embedder.yaml +0 -19
- clarifai/models/model_serving/model_config/model_types_config/visual-segmenter.yaml +0 -19
- clarifai/models/model_serving/model_config/output.py +0 -133
- clarifai/models/model_serving/model_config/triton/__init__.py +0 -14
- clarifai/models/model_serving/model_config/triton/serializer.py +0 -136
- clarifai/models/model_serving/model_config/triton/triton_config.py +0 -182
- clarifai/models/model_serving/model_config/triton/wrappers.py +0 -281
- clarifai/models/model_serving/repo_build/__init__.py +0 -14
- clarifai/models/model_serving/repo_build/build.py +0 -198
- clarifai/models/model_serving/repo_build/static_files/_requirements.txt +0 -2
- clarifai/models/model_serving/repo_build/static_files/base_test.py +0 -169
- clarifai/models/model_serving/repo_build/static_files/inference.py +0 -26
- clarifai/models/model_serving/repo_build/static_files/sample_clarifai_config.yaml +0 -25
- clarifai/models/model_serving/repo_build/static_files/test.py +0 -40
- clarifai/models/model_serving/repo_build/static_files/triton/model.py +0 -75
- clarifai/models/model_serving/utils.py +0 -23
- clarifai/rag/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/rag/__pycache__/rag.cpython-311.pyc +0 -0
- clarifai/rag/__pycache__/utils.cpython-311.pyc +0 -0
- clarifai/runners/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/runners/models/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/runners/models/__pycache__/dummy_openai_model.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/mcp_class.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/model_builder.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/model_builder.cpython-39.pyc +0 -0
- clarifai/runners/models/__pycache__/model_class.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/model_run_locally.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/model_runner.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/model_servicer.cpython-311.pyc +0 -0
- clarifai/runners/models/__pycache__/openai_class.cpython-311.pyc +0 -0
- clarifai/runners/models/base_typed_model.py +0 -238
- clarifai/runners/models/model_upload.py +0 -607
- clarifai/runners/pipeline_steps/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/pipeline_steps/__pycache__/pipeline_step_builder.cpython-311.pyc +0 -0
- clarifai/runners/pipelines/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/pipelines/__pycache__/pipeline_builder.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/runners/utils/__pycache__/code_script.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/code_script.cpython-39.pyc +0 -0
- clarifai/runners/utils/__pycache__/const.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/data_utils.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/data_utils.cpython-39.pyc +0 -0
- clarifai/runners/utils/__pycache__/loader.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/method_signatures.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/model_utils.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/openai_convertor.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/pipeline_validation.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/serializers.cpython-311.pyc +0 -0
- clarifai/runners/utils/__pycache__/url_fetcher.cpython-311.pyc +0 -0
- clarifai/runners/utils/data_handler.py +0 -231
- clarifai/runners/utils/data_types/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/runners/utils/data_types/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/runners/utils/data_types/__pycache__/data_types.cpython-311.pyc +0 -0
- clarifai/runners/utils/data_types/__pycache__/data_types.cpython-39.pyc +0 -0
- clarifai/runners/utils/data_types.py +0 -471
- clarifai/runners/utils/temp.py +0 -59
- clarifai/schema/__pycache__/search.cpython-311.pyc +0 -0
- clarifai/urls/__pycache__/helper.cpython-311.pyc +0 -0
- clarifai/urls/__pycache__/helper.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/__init__.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/cli.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/cli.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/config.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/config.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/constants.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/constants.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/logging.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/logging.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/misc.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/misc.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/model_train.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/protobuf.cpython-311.pyc +0 -0
- clarifai/utils/__pycache__/protobuf.cpython-39.pyc +0 -0
- clarifai/utils/__pycache__/secrets.cpython-311.pyc +0 -0
- clarifai/utils/evaluation/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/utils/evaluation/__pycache__/helpers.cpython-311.pyc +0 -0
- clarifai/utils/evaluation/__pycache__/main.cpython-311.pyc +0 -0
- clarifai/utils/evaluation/__pycache__/testset_annotation_parser.cpython-311.pyc +0 -0
- clarifai/workflows/__pycache__/__init__.cpython-311.pyc +0 -0
- clarifai/workflows/__pycache__/export.cpython-311.pyc +0 -0
- clarifai/workflows/__pycache__/utils.cpython-311.pyc +0 -0
- clarifai/workflows/__pycache__/validate.cpython-311.pyc +0 -0
- clarifai-11.7.5rc1.dist-info/RECORD +0 -339
- {clarifai-11.7.5rc1.dist-info → clarifai-11.8.1.dist-info}/entry_points.txt +0 -0
- {clarifai-11.7.5rc1.dist-info → clarifai-11.8.1.dist-info}/licenses/LICENSE +0 -0
- {clarifai-11.7.5rc1.dist-info → clarifai-11.8.1.dist-info}/top_level.txt +0 -0
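Per the summary above, the release itself is a one-line version bump in `clarifai/__init__.py`. A minimal sketch for checking which side of this diff is installed locally (the pip command is an assumption, not part of the diff):

```python
# Assumes a standard install, e.g. `pip install clarifai==11.8.1`.
import clarifai

# The package exposes its version string (the templates in this diff rely on it too).
print(clarifai.__version__)  # "11.8.1" after upgrading, "11.7.5rc1" before
```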
clarifai/cli/model_templates.py
DELETED
@@ -1,243 +0,0 @@
-"""Templates for model initialization."""
-
-from clarifai import __version__
-
-
-def get_model_class_template() -> str:
-    """Return the template for a basic ModelClass-based model."""
-    return '''from typing import Iterator, List
-from clarifai.runners.models.model_class import ModelClass
-from clarifai.runners.util.data_utils import Param
-
-class MyModel(ModelClass):
-    """A custom model implementation using ModelClass."""
-
-    def load_model(self):
-        """Load the model here.
-        # TODO: please fill in
-        # Add your model loading logic here
-        """
-        pass
-
-    @ModelClass.method
-    def predict(
-        self,
-        prompt: str = "",
-        chat_history: List[dict] = None,
-        max_tokens: int = Param(default=256, description="The maximum number of tokens to generate. Shorter token lengths will provide faster performance."),
-        temperature: float = Param(default=1.0, description="A decimal number that determines the degree of randomness in the response"),
-        top_p: float = Param(default=1.0, description="An alternative to sampling with temperature, where the model considers the results of the tokens with top_p probability mass."),
-    ) -> str:
-        """This is the method that will be called when the runner is run. It takes in an input and returns an output."""
-        # TODO: please fill in
-        # Implement your prediction logic here
-        pass  # Replace with your actual logic
-
-    @ModelClass.method
-    def generate(
-        self,
-        prompt: str = "",
-        chat_history: List[dict] = None,
-        max_tokens: int = Param(default=256, description="The maximum number of tokens to generate. Shorter token lengths will provide faster performance."),
-        temperature: float = Param(default=1.0, description="A decimal number that determines the degree of randomness in the response"),
-        top_p: float = Param(default=1.0, description="An alternative to sampling with temperature, where the model considers the results of the tokens with top_p probability mass."),
-    ) -> Iterator[str]:
-        """Example yielding a streamed response."""
-        # TODO: please fill in
-        # Implement your generation logic here
-        pass  # Replace with your actual logic
-'''
-
-
-def get_mcp_model_class_template() -> str:
-    """Return the template for an MCPModelClass-based model."""
-    return '''from typing import Any
-
-from fastmcp import FastMCP  # use fastmcp v2 not the built in mcp
-from pydantic import Field
-
-from clarifai.runners.models.mcp_class import MCPModelClass
-
-# TODO: please fill in
-# Configure your FastMCP server
-server = FastMCP("my-mcp-server", instructions="", stateless_http=True)
-
-
-# TODO: please fill in
-# Add your tools, resources, and prompts here
-@server.tool("example_tool", description="An example tool")
-def example_tool(input_param: Any = Field(description="Example input parameter")):
-    """Example tool implementation."""
-    # TODO: please fill in
-    # Implement your tool logic here
-    return f"Processed: {input_param}"
-
-
-# Static resource example
-@server.resource("config://version")
-def get_version():
-    """Example static resource."""
-    # TODO: please fill in
-    # Return your resource data
-    return "1.0.0"
-
-
-@server.prompt()
-def example_prompt(text: str) -> str:
-    """Example prompt template."""
-    # TODO: please fill in
-    # Define your prompt template
-    return f"Process this text: {text}"
-
-
-class MyModel(MCPModelClass):
-    """A custom model implementation using MCPModelClass."""
-
-    def get_server(self) -> FastMCP:
-        """Return the FastMCP server instance."""
-        return server
-'''
-
-
-def get_openai_model_class_template() -> str:
-    """Return the template for an OpenAIModelClass-based model."""
-    return '''from typing import List
-from openai import OpenAI
-from clarifai.runners.models.openai_class import OpenAIModelClass
-from clarifai.runners.util.data_utils import Param
-from clarifai.runners.utils.openai_convertor import build_openai_messages
-
-class MyModel(OpenAIModelClass):
-    """A custom model implementation using OpenAIModelClass."""
-
-    # TODO: please fill in
-    # Configure your OpenAI-compatible client for local model
-    client = OpenAI(
-        api_key="local-key",  # TODO: please fill in - use your local API key
-        base_url="http://localhost:8000/v1",  # TODO: please fill in - your local model server endpoint
-    )
-
-    # TODO: please fill in
-    # Specify the model name to use
-    model = "my-local-model"  # TODO: please fill in - replace with your local model name
-
-    def load_model(self):
-        """Optional: Add any additional model loading logic here."""
-        # TODO: please fill in (optional)
-        # Add any initialization logic if needed
-        pass
-
-    @OpenAIModelClass.method
-    def predict(
-        self,
-        prompt: str = "",
-        chat_history: List[dict] = None,
-        max_tokens: int = Param(default=256, description="The maximum number of tokens to generate. Shorter token lengths will provide faster performance."),
-        temperature: float = Param(default=1.0, description="A decimal number that determines the degree of randomness in the response"),
-        top_p: float = Param(default=1.0, description="An alternative to sampling with temperature, where the model considers the results of the tokens with top_p probability mass."),
-    ) -> str:
-        """Run a single prompt completion using the OpenAI client."""
-        # TODO: please fill in
-        # Implement your prediction logic here
-        messages = build_openai_messages(prompt, chat_history)
-        response = self.client.chat.completions.create(
-            model=self.model,
-            messages=messages,
-            max_completion_tokens=max_tokens,
-            temperature=temperature,
-            top_p=top_p,
-        )
-        return response.choices[0].message.content
-
-    @OpenAIModelClass.method
-    def generate(
-        self,
-        prompt: str = "",
-        chat_history: List[dict] = None,
-        max_tokens: int = Param(default=256, description="The maximum number of tokens to generate. Shorter token lengths will provide faster performance."),
-        temperature: float = Param(default=1.0, description="A decimal number that determines the degree of randomness in the response"),
-        top_p: float = Param(default=1.0, description="An alternative to sampling with temperature, where the model considers the results of the tokens with top_p probability mass."),
-    ):
-        """Stream a completion response using the OpenAI client."""
-        # TODO: please fill in
-        # Implement your streaming logic here
-        messages = build_openai_messages(prompt, chat_history)
-        stream = self.client.chat.completions.create(
-            model=self.model,
-            messages=messages,
-            max_completion_tokens=max_tokens,
-            temperature=temperature,
-            top_p=top_p,
-            stream=True,
-        )
-        for chunk in stream:
-            if chunk.choices:
-                text = (chunk.choices[0].delta.content
-                        if (chunk and chunk.choices[0].delta.content) is not None else '')
-                yield text
-'''
-
-
-def get_config_template(model_type_id: str = "text-to-text") -> str:
-    """Return the template for config.yaml."""
-    return f'''# Configuration file for your Clarifai model
-
-model:
-  id: "my-model"  # TODO: please fill in - replace with your model ID
-  user_id: "user_id"  # TODO: please fill in - replace with your user ID
-  app_id: "app_id"  # TODO: please fill in - replace with your app ID
-  model_type_id: "{model_type_id}"  # TODO: please fill in - replace if different model type ID
-
-build_info:
-  python_version: "3.12"
-
-# TODO: please fill in - adjust compute requirements for your model
-inference_compute_info:
-  cpu_limit: "1"  # TODO: please fill in - Amount of CPUs to use as a limit
-  cpu_memory: "1Gi"  # TODO: please fill in - Amount of CPU memory to use as a limit
-  cpu_requests: "0.5"  # TODO: please fill in - Amount of CPUs to use as a minimum
-  cpu_memory_requests: "512Mi"  # TODO: please fill in - Amount of CPU memory to use as a minimum
-  num_accelerators: 1  # TODO: please fill in - Amount of GPU/TPUs to use
-  accelerator_type: ["NVIDIA-*"]  # TODO: please fill in - type of accelerators requested
-  accelerator_memory: "1Gi"  # TODO: please fill in - Amount of accelerator/GPU memory to use as a minimum
-
-# TODO: please fill in (optional) - add checkpoints section if needed
-# checkpoints:
-#   type: "huggingface"  # supported type
-#   repo_id: "your-model-repo"  # for huggingface
-#   when: "build"  # or "runtime", "upload"
-'''
-
-
-def get_requirements_template(model_type_id: str = None) -> str:
-    """Return the template for requirements.txt."""
-    requirements = f'''# Clarifai SDK - required
-clarifai>={__version__}
-'''
-    if model_type_id == "mcp":
-        requirements += "fastmcp\n"
-    elif model_type_id == "openai":
-        requirements += "openai\n"
-    requirements += '''
-# TODO: please fill in - add your model's dependencies here
-# Examples:
-# torch>=2.0.0
-# transformers>=4.30.0
-# numpy>=1.21.0
-# pillow>=9.0.0
-'''
-    return requirements
-
-
-# Mapping of model type IDs to their corresponding templates
-MODEL_TYPE_TEMPLATES = {
-    "mcp": get_mcp_model_class_template,
-    "openai": get_openai_model_class_template,
-}
-
-
-def get_model_template(model_type_id: str = None) -> str:
-    """Get the appropriate model template based on model_type_id."""
-    if model_type_id in MODEL_TYPE_TEMPLATES:
-        return MODEL_TYPE_TEMPLATES[model_type_id]()
-    return get_model_class_template()
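For context on how the deleted module fit together: each helper returns a ready-to-write string, and `get_model_template` dispatches on `model_type_id`, falling back to the basic ModelClass template. A hedged sketch of a driver that scaffolds a model directory with these helpers under clarifai <= 11.7.x (the directory layout and driver code are assumptions, not part of the deleted file):

```python
# Hypothetical driver for the deleted template helpers (clarifai <= 11.7.x);
# the function names come from the module above, the scaffolding logic is assumed.
from pathlib import Path

from clarifai.cli.model_templates import (
    get_config_template,
    get_model_template,
    get_requirements_template,
)

model_dir = Path("my_model/1")  # assumed layout: model code under a "1/" subfolder
model_dir.mkdir(parents=True, exist_ok=True)

# "openai" and "mcp" select the specialized templates; any other value
# returns the basic ModelClass template.
(model_dir / "model.py").write_text(get_model_template("openai"))
(model_dir.parent / "config.yaml").write_text(get_config_template("text-to-text"))
(model_dir.parent / "requirements.txt").write_text(get_requirements_template("openai"))
```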
clarifai/cli/pipeline_step_templates.py
DELETED
@@ -1,64 +0,0 @@
-"""Templates for initializing pipeline step projects."""
-
-from clarifai.versions import CLIENT_VERSION
-
-
-def get_config_template():
-    """Get the config.yaml template for pipeline steps."""
-    return """pipeline_step:
-  id: "text-classifier-train-upload-step"  # TODO: please fill in
-  user_id: "your_user_id"  # TODO: please fill in
-  app_id: "your_app_id"  # TODO: please fill in
-
-pipeline_step_input_params:
-  - name: param_a
-  - name: param_b
-    default: "param_b_allowed_value1"
-    description: "param_b is the second parameter of the pipeline step"
-    accepted_values:  # list of accepted values for param_b
-      - "param_b_allowed_value1"
-      - "param_b_allowed_value2"
-      - "param_b_allowed_value3"
-
-build_info:
-  python_version: "3.12"
-
-pipeline_step_compute_info:
-  cpu_limit: "500m"
-  cpu_memory: "500Mi"
-  num_accelerators: 0
-"""
-
-
-def get_pipeline_step_template():
-    """Get the pipeline_step.py template."""
-    return '''import argparse
-
-import clarifai
-
-
-def main():
-    parser = argparse.ArgumentParser(description='Concatenate two strings.')
-    parser.add_argument('--param_a', type=str, required=True, help='First string to concatenate')
-    parser.add_argument('--param_b', type=str, required=True, help='Second string to concatenate')
-
-    args = parser.parse_args()
-
-    print(clarifai.__version__)
-
-    print(f"Concatenation Output: {args.param_a + args.param_b}")
-
-
-if __name__ == "__main__":
-    main()
-'''
-
-
-def get_requirements_template():
-    """Get the requirements.txt template."""
-    return f'''clarifai=={CLIENT_VERSION}
-# Add your pipeline step dependencies here
-# Example:
-# torch>=1.9.0
-# transformers>=4.20.0
-'''
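Since the generated `pipeline_step.py` above is a plain argparse script, it can be smoke-tested outside of a Clarifai pipeline. A minimal sketch, assuming the rendered template was saved as `pipeline_step.py` in the current directory:

```python
# Hypothetical local smoke test of a pipeline_step.py generated from the
# template above (file path and parameter values assumed; in production the
# step runs inside a Clarifai pipeline).
import subprocess
import sys

result = subprocess.run(
    [sys.executable, "pipeline_step.py", "--param_a", "Hello, ", "--param_b", "world"],
    capture_output=True,
    text=True,
    check=True,
)
# The script prints the clarifai version, then the concatenation output.
assert "Concatenation Output: Hello, world" in result.stdout
```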
Binary files: the removed compiled bytecode caches (`__pycache__/*.pyc`) listed above have no textual diff.
clarifai/models/model_serving/README.md
DELETED
@@ -1,158 +0,0 @@
-# Clarifai Model Serving
-
-## Overview
-
-Model Serving is the part of the Clarifai user journey that offers a user-friendly interface for deploying your local model into production with Clarifai, featuring:
-
-* A convenient command-line interface (CLI)
-* Easy implementation and testing in Python
-* No need for MLOps expertise.
-
-## Quickstart Guide
-
-A quick example for deploying a `text-to-text` model.
-
-### Initialize a Clarifai model repository
-
-Suppose your working directory name is `your_model_dir`. Then run
-
-```bash
-$ clarifai create model --type text-to-text --working-dir your_model_dir
-$ cd your_model_dir
-```
-
-In the `your_model_dir` folder you will see the essential files for the deployment process:
-
-```bash
-your_model_dir
-├── clarifai_config.yaml
-├── inference.py
-├── test.py
-└── requirements.txt
-```
-
-### Implementation
-
-Write your code in the `InferenceModel` class in `inference.py`, which is the interface between your model and the Clarifai server. There are 2 functions you must implement:
-
-* `__init__`: load your model checkpoint once.
-* `predict`: make predictions; called every time you make a request from the API.
-
-For example, a complete implementation of a HF text-generation model:
-
-```python
-import os
-from typing import Dict, Union
-from clarifai.models.model_serving.model_config import *
-
-import torch
-from transformers import AutoTokenizer
-import transformers
-
-class InferenceModel(TextToText):
-  """User model inference class."""
-
-  def __init__(self) -> None:
-    """
-    Load inference-time artifacts that are called frequently, e.g. models, tokenizers, etc.,
-    in this method so they are loaded only once for faster inference.
-    """
-    # current directory
-    self.base_path = os.path.dirname(__file__)
-    # where you save the hf checkpoint in your working dir, i.e. `your_model_dir`
-    model_path = os.path.join(self.base_path, "checkpoint")
-    self.tokenizer = AutoTokenizer.from_pretrained(model_path)
-    self.pipeline = transformers.pipeline(
-        "text-generation",
-        model=model_path,
-        torch_dtype=torch.float16,
-        device_map="auto",
-    )
-
-  def predict(self, input_data: list,
-              inference_parameters: Dict[str, Union[str, float, int]]) -> list:
-    """Custom prediction function for a `text-to-text` (also called `text generation`) model.
-
-    Args:
-      input_data (List[str]): List of text
-      inference_parameters (Dict[str, Union[str, float, int]]): your inference parameters
-
-    Returns:
-      list of TextOutput
-
-    """
-    output_sequences = self.pipeline(
-        input_data,
-        eos_token_id=self.tokenizer.eos_token_id,
-        **inference_parameters)
-
-    # wrap outputs in Clarifai-defined output
-    return [TextOutput(each[0]) for each in output_sequences]
-```
-
-Update dependencies in `requirements.txt`:
-
-```
-clarifai
-torch==2.1.1
-transformers==4.36.2
-accelerate==0.26.1
-```
-
-### Test (optional)
-
-> NOTE: Running `test` is also part of the `build` and `upload` commands.
-
-Test and play with your implementation by executing `test.py`.
-
-Install pytest
-
-```bash
-$ pip install pytest
-```
-
-Execute the test
-
-```bash
-$ pytest test.py
-```
-
-### Build
-
-To prepare for the deployment step, run:
-
-```bash
-$ clarifai build model
-```
-
-You will obtain a `*.clarifai` file; it is simply a zip containing all the necessary files to get your model working on the Clarifai platform.
-
-`NOTE`: you need to upload your built file to cloud storage to get a direct download `url` for the next step.
-
-### Deployment
-
-Log in to Clarifai
-
-```bash
-$ clarifai login
-Get your PAT from https://clarifai.com/settings/security and pass it here: <insert your pat here>
-```
-
-Upload
-
-```bash
-# upload built file directly
-$ clarifai upload model <your-working-dir> --user-app <your_user_id>/<your_app_id> --id <your_model_id>
-# or using a direct download url from cloud storage
-$ clarifai upload model --url <url> --user-app <your_user_id>/<your_app_id> --id <your_model_id>
-```
-
-## Learn More
-
-* [Detailed instructions](./docs/concepts.md)
-* [Examples](https://github.com/Clarifai/examples/tree/main/model_upload)
-* [Initialize from example](./docs/cli.md)
-* [CLI usage](./docs/cli.md)
-* [Inference parameters](./docs/inference_parameters.md)
-* [Model Types](./docs/model_types.md)
-* [Dependencies](./docs/dependencies.md)