dm-aioaiagent 0.5.7__tar.gz → 0.6.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: dm-aioaiagent
-Version: 0.5.7
+Version: 0.6.1
 Summary: This is my custom aioaiagent client
 Home-page: https://pypi.org/project/dm-aioaiagent
 Author: dimka4621
@@ -15,14 +15,33 @@ Description-Content-Type: text/markdown
 Requires-Dist: dm-logger<0.7.0,>=0.6.6
 Requires-Dist: python-dotenv>=1.0.0
 Requires-Dist: pydantic<3.0.0,>=2.9.2
-Requires-Dist: langchain<0.4.0,>=0.3.0
-Requires-Dist: langchain-core<0.4.0,>=0.3.5
-Requires-Dist: langchain-community<0.4.0,>=0.3.0
-Requires-Dist: langchain-openai<0.4.0,>=0.3.0
-Requires-Dist: langchain-anthropic<0.4.0,>=0.3.0
-Requires-Dist: langgraph<0.4.0,>=0.3.23
-Requires-Dist: langsmith<0.4.0,>=0.3.45
+Requires-Dist: langchain<2.0.0,>=1.0.0
+Requires-Dist: langchain-core<2.0.0,>=1.0.0
+Requires-Dist: langchain-openai<2.0.0,>=1.0.0
+Requires-Dist: langgraph<2.0.0,>=1.0.0
+Requires-Dist: langsmith<1.0.0,>=0.4.0
 Requires-Dist: grandalf<0.9.0,>=0.8.0
+Provides-Extra: anthropic
+Requires-Dist: langchain-anthropic<2.0.0,>=1.0.0; extra == "anthropic"
+Provides-Extra: gemini
+Requires-Dist: langchain-google-genai<5.0.0,>=4.0.0; extra == "gemini"
+Requires-Dist: langchain-google-vertexai<4.0.0,>=3.0.0; extra == "gemini"
+Provides-Extra: groq
+Requires-Dist: langchain-groq<2.0.0,>=1.0.0; extra == "groq"
+Provides-Extra: mistral
+Requires-Dist: langchain-mistralai<2.0.0,>=1.0.0; extra == "mistral"
+Provides-Extra: deepseek
+Requires-Dist: langchain-deepseek<2.0.0,>=1.0.0; extra == "deepseek"
+Provides-Extra: ollama
+Requires-Dist: langchain-ollama<2.0.0,>=1.0.0; extra == "ollama"
+Provides-Extra: all
+Requires-Dist: langchain-anthropic<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-google-genai<5.0.0,>=4.0.0; extra == "all"
+Requires-Dist: langchain-google-vertexai<4.0.0,>=3.0.0; extra == "all"
+Requires-Dist: langchain-groq<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-mistralai<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-deepseek<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-ollama<2.0.0,>=1.0.0; extra == "all"
 Dynamic: author
 Dynamic: author-email
 Dynamic: classifier
@@ -31,6 +50,7 @@ Dynamic: description-content-type
 Dynamic: home-page
 Dynamic: keywords
 Dynamic: project-url
+Dynamic: provides-extra
 Dynamic: requires-dist
 Dynamic: requires-python
 Dynamic: summary
@@ -44,6 +64,59 @@ Dynamic: summary
 
 ### * Package contains both `asynchronous` and `synchronous` clients
 
+## Installation
+
+By default, the package ships with **OpenAI** support. Other providers are optional extras:
+
+```bash
+pip install dm-aioaiagent                    # OpenAI only
+pip install dm-aioaiagent[anthropic]         # + Anthropic
+pip install dm-aioaiagent[anthropic,gemini]  # several at once
+pip install dm-aioaiagent[all]               # every supported provider
+```
+
+Available extras: `anthropic`, `gemini`, `groq`, `mistral`, `deepseek`, `ollama`, `all`.
+
+If you call a model from a provider whose package is not installed, `init_chat_model` will raise an `ImportError` with the exact `pip install` command you need.
+
+## Providers
+
+Provider resolution is delegated to LangChain's [`init_chat_model`](https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html) — the agent picks the provider automatically by model name prefix when possible. For everything else, use the `"provider:model"` mask.
+
+```python
+# Auto-detected from model prefix (rules come from LangChain's init_chat_model)
+agent = DMAioAIAgent(model="gpt-4o-mini")               # → openai
+agent = DMAioAIAgent(model="claude-3-5-sonnet-latest")  # → anthropic
+agent = DMAioAIAgent(model="gemini-2.0-flash")          # → google_vertexai (see note below)
+
+# Explicit provider via "provider:model" mask
+agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+agent = DMAioAIAgent(model="groq:llama-3.1-70b-versatile")
+agent = DMAioAIAgent(model="mistralai:mistral-large-latest")
+agent = DMAioAIAgent(model="deepseek:deepseek-chat")
+agent = DMAioAIAgent(model="ollama:llama3.1")
+
+# OpenAI-compatible gateway (OpenRouter, Together, vLLM, LiteLLM proxy, ...)
+# Works without installing any extra — just point to the OpenAI-compatible URL.
+agent = DMAioAIAgent(
+    model="meta-llama/llama-3.1-70b-instruct",
+    llm_provider_base_url="https://openrouter.ai/api/v1",
+    llm_provider_api_key="sk-or-...",
+)
+```
+
+> **Note about Gemini.** LangChain's auto-detect maps the `gemini*` prefix to **`google_vertexai`** (Google Cloud Vertex AI, requires a GCP service account). If you have a regular **Google AI Studio** API key (`GOOGLE_API_KEY`), use the `google_genai:` mask explicitly:
+>
+> ```python
+> agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+> ```
+
+Supported provider keys for the `"provider:model"` mask (list inherited from LangChain): `openai`, `anthropic`, `azure_openai`, `azure_ai`, `google_vertexai`, `google_genai`, `bedrock`, `bedrock_converse`, `cohere`, `fireworks`, `together`, `mistralai`, `huggingface`, `groq`, `ollama`, `google_anthropic_vertex`, `deepseek`, `ibm`, `nvidia`, `xai`, `perplexity`.
+
+### Note about parallel tool calls
+
+`parallel_tool_calls` is currently mapped only for **OpenAI** and **Anthropic** (their APIs use different formats). For other providers the parameter is silently ignored — extend per-provider mapping if you need it.
+
 ## Usage
 
 Analogue to `DMAioAIAgent` is the synchronous client `DMAIAgent`.
@@ -60,7 +133,7 @@ if sys.platform == "win32":
 
 ### Api Key Setup
 
-You can set your OpenAI API key in the environment variable `OPENAI_API_KEY` or pass it as an argument to the agent.
+Each provider reads its API key from a dedicated environment variable, e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `GROQ_API_KEY`, `MISTRAL_API_KEY`, etc. Alternatively, pass the key explicitly via the `llm_provider_api_key` argument — useful for multi-tenant setups, custom gateways, or runtime key rotation.
 
 **Use load_dotenv to load the `.env` file.**
 
@@ -1,40 +1,3 @@
-Metadata-Version: 2.4
-Name: dm-aioaiagent
-Version: 0.5.7
-Summary: This is my custom aioaiagent client
-Home-page: https://pypi.org/project/dm-aioaiagent
-Author: dimka4621
-Author-email: mismartconfig@gmail.com
-Project-URL: GitHub, https://github.com/MykhLibs/dm-aioaiagent
-Keywords: dm aioaiagent
-Classifier: Programming Language :: Python :: 3.9
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Operating System :: OS Independent
-Requires-Python: >=3.9
-Description-Content-Type: text/markdown
-Requires-Dist: dm-logger<0.7.0,>=0.6.6
-Requires-Dist: python-dotenv>=1.0.0
-Requires-Dist: pydantic<3.0.0,>=2.9.2
-Requires-Dist: langchain<0.4.0,>=0.3.0
-Requires-Dist: langchain-core<0.4.0,>=0.3.5
-Requires-Dist: langchain-community<0.4.0,>=0.3.0
-Requires-Dist: langchain-openai<0.4.0,>=0.3.0
-Requires-Dist: langchain-anthropic<0.4.0,>=0.3.0
-Requires-Dist: langgraph<0.4.0,>=0.3.23
-Requires-Dist: langsmith<0.4.0,>=0.3.45
-Requires-Dist: grandalf<0.9.0,>=0.8.0
-Dynamic: author
-Dynamic: author-email
-Dynamic: classifier
-Dynamic: description
-Dynamic: description-content-type
-Dynamic: home-page
-Dynamic: keywords
-Dynamic: project-url
-Dynamic: requires-dist
-Dynamic: requires-python
-Dynamic: summary
-
 # DM-aioaiagent
 
 ## Urls
@@ -44,6 +7,59 @@ Dynamic: summary
 
 ### * Package contains both `asynchronous` and `synchronous` clients
 
+## Installation
+
+By default, the package ships with **OpenAI** support. Other providers are optional extras:
+
+```bash
+pip install dm-aioaiagent                    # OpenAI only
+pip install dm-aioaiagent[anthropic]         # + Anthropic
+pip install dm-aioaiagent[anthropic,gemini]  # several at once
+pip install dm-aioaiagent[all]               # every supported provider
+```
+
+Available extras: `anthropic`, `gemini`, `groq`, `mistral`, `deepseek`, `ollama`, `all`.
+
+If you call a model from a provider whose package is not installed, `init_chat_model` will raise an `ImportError` with the exact `pip install` command you need.
+
+## Providers
+
+Provider resolution is delegated to LangChain's [`init_chat_model`](https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html) — the agent picks the provider automatically by model name prefix when possible. For everything else, use the `"provider:model"` mask.
+
+```python
+# Auto-detected from model prefix (rules come from LangChain's init_chat_model)
+agent = DMAioAIAgent(model="gpt-4o-mini")               # → openai
+agent = DMAioAIAgent(model="claude-3-5-sonnet-latest")  # → anthropic
+agent = DMAioAIAgent(model="gemini-2.0-flash")          # → google_vertexai (see note below)
+
+# Explicit provider via "provider:model" mask
+agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+agent = DMAioAIAgent(model="groq:llama-3.1-70b-versatile")
+agent = DMAioAIAgent(model="mistralai:mistral-large-latest")
+agent = DMAioAIAgent(model="deepseek:deepseek-chat")
+agent = DMAioAIAgent(model="ollama:llama3.1")
+
+# OpenAI-compatible gateway (OpenRouter, Together, vLLM, LiteLLM proxy, ...)
+# Works without installing any extra — just point to the OpenAI-compatible URL.
+agent = DMAioAIAgent(
+    model="meta-llama/llama-3.1-70b-instruct",
+    llm_provider_base_url="https://openrouter.ai/api/v1",
+    llm_provider_api_key="sk-or-...",
+)
+```
+
+> **Note about Gemini.** LangChain's auto-detect maps the `gemini*` prefix to **`google_vertexai`** (Google Cloud Vertex AI, requires a GCP service account). If you have a regular **Google AI Studio** API key (`GOOGLE_API_KEY`), use the `google_genai:` mask explicitly:
+>
+> ```python
+> agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+> ```
+
+Supported provider keys for the `"provider:model"` mask (list inherited from LangChain): `openai`, `anthropic`, `azure_openai`, `azure_ai`, `google_vertexai`, `google_genai`, `bedrock`, `bedrock_converse`, `cohere`, `fireworks`, `together`, `mistralai`, `huggingface`, `groq`, `ollama`, `google_anthropic_vertex`, `deepseek`, `ibm`, `nvidia`, `xai`, `perplexity`.
+
+### Note about parallel tool calls
+
+`parallel_tool_calls` is currently mapped only for **OpenAI** and **Anthropic** (their APIs use different formats). For other providers the parameter is silently ignored — extend per-provider mapping if you need it.
+
 ## Usage
 
 Analogue to `DMAioAIAgent` is the synchronous client `DMAIAgent`.
@@ -60,7 +76,7 @@ if sys.platform == "win32":
 
 ### Api Key Setup
 
-You can set your OpenAI API key in the environment variable `OPENAI_API_KEY` or pass it as an argument to the agent.
+Each provider reads its API key from a dedicated environment variable, e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `GROQ_API_KEY`, `MISTRAL_API_KEY`, etc. Alternatively, pass the key explicitly via the `llm_provider_api_key` argument — useful for multi-tenant setups, custom gateways, or runtime key rotation.
 
 **Use load_dotenv to load the `.env` file.**
 
@@ -4,6 +4,7 @@ from typing import Any
 from pydantic import SecretStr
 from itertools import dropwhile
 from threading import Thread
+from langchain.chat_models import init_chat_model
 from langchain_core.tools import BaseTool
 from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
 from langchain_core.messages import SystemMessage, HumanMessage, AIMessage, ToolMessage
@@ -242,29 +243,28 @@ class DMAIAgent:
 
     def _init_agent(self) -> None:
         base_kwargs = {"model": self._model}
-        if isinstance(self._temperature, float):
-            base_kwargs["temperature"] = self._temperature
-        else:
-            ValueError("Temperature must be a float value.")
+        if self._temperature is not None:
+            if not isinstance(self._temperature, (int, float)):
+                raise ValueError("Temperature must be a float value.")
+            base_kwargs["temperature"] = float(self._temperature)
         if self._llm_provider_api_key:
             base_kwargs["api_key"] = SecretStr(self._llm_provider_api_key)
         if self._llm_provider_base_url:
             base_kwargs["base_url"] = self._llm_provider_base_url
 
-        if self._model.startswith("claude"):
-            from langchain_anthropic import ChatAnthropic
+        llm = init_chat_model(**base_kwargs)
 
-            llm = ChatAnthropic(**base_kwargs)
+        provider = self._detect_provider(self._model)
+        if provider == "anthropic":
             bind_tool_kwargs = {"tool_choice": {"type": "auto"}}
             if isinstance(self._parallel_tool_calls, bool):
                 bind_tool_kwargs["tool_choice"]["disable_parallel_tool_use"] = not self._parallel_tool_calls
-        else:
-            from langchain_openai import ChatOpenAI
-
-            llm = ChatOpenAI(**base_kwargs)
+        elif provider == "openai":
             bind_tool_kwargs = {}
             if isinstance(self._parallel_tool_calls, bool):
                 bind_tool_kwargs["parallel_tool_calls"] = self._parallel_tool_calls
+        else:
+            bind_tool_kwargs = {}
 
         if self._is_tools_exists:
             self._tool_map = {t.name: t for t in self._tools}
@@ -277,6 +277,16 @@ class DMAIAgent:
                                            MessagesPlaceholder(variable_name="messages")])
         self._agent = prompt | llm
 
+    @staticmethod
+    def _detect_provider(model: str) -> str:
+        if ":" in model:
+            return model.split(":", 1)[0].replace("-", "_").lower()
+        if model.startswith(("gpt-", "o1", "o3")):
+            return "openai"
+        if model.startswith("claude"):
+            return "anthropic"
+        return ""
+
     def _init_graph(self) -> None:
         workflow = StateGraph(State)
         workflow.add_node("Prepare messages", self._prepare_messages_node)
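The provider-routing change above replaces the hard-coded `ChatAnthropic`/`ChatOpenAI` branches with `init_chat_model` plus a small prefix heuristic. The heuristic can be exercised on its own; the following standalone sketch mirrors `_detect_provider` from the diff (a free function named `detect_provider` here for illustration, not the packaged implementation):

```python
def detect_provider(model: str) -> str:
    """Mirror of the _detect_provider heuristic introduced in 0.6.1."""
    if ":" in model:
        # An explicit "provider:model" mask wins; normalize the provider key.
        return model.split(":", 1)[0].replace("-", "_").lower()
    if model.startswith(("gpt-", "o1", "o3")):
        return "openai"
    if model.startswith("claude"):
        return "anthropic"
    return ""  # unknown: resolution is left entirely to init_chat_model


print(detect_provider("google_genai:gemini-2.0-flash"))  # → google_genai
print(detect_provider("gpt-4o-mini"))                    # → openai
print(detect_provider("claude-3-5-sonnet-latest"))       # → anthropic
print(detect_provider("gemini-2.0-flash"))               # → "" (delegated)
```

Note that an empty result is deliberate: it routes the model into the `else` branch where `bind_tool_kwargs` stays empty, matching the README's warning that `parallel_tool_calls` is only mapped for OpenAI and Anthropic.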
@@ -0,0 +1,265 @@
+Metadata-Version: 2.4
+Name: dm-aioaiagent
+Version: 0.6.1
+Summary: This is my custom aioaiagent client
+Home-page: https://pypi.org/project/dm-aioaiagent
+Author: dimka4621
+Author-email: mismartconfig@gmail.com
+Project-URL: GitHub, https://github.com/MykhLibs/dm-aioaiagent
+Keywords: dm aioaiagent
+Classifier: Programming Language :: Python :: 3.9
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: OS Independent
+Requires-Python: >=3.9
+Description-Content-Type: text/markdown
+Requires-Dist: dm-logger<0.7.0,>=0.6.6
+Requires-Dist: python-dotenv>=1.0.0
+Requires-Dist: pydantic<3.0.0,>=2.9.2
+Requires-Dist: langchain<2.0.0,>=1.0.0
+Requires-Dist: langchain-core<2.0.0,>=1.0.0
+Requires-Dist: langchain-openai<2.0.0,>=1.0.0
+Requires-Dist: langgraph<2.0.0,>=1.0.0
+Requires-Dist: langsmith<1.0.0,>=0.4.0
+Requires-Dist: grandalf<0.9.0,>=0.8.0
+Provides-Extra: anthropic
+Requires-Dist: langchain-anthropic<2.0.0,>=1.0.0; extra == "anthropic"
+Provides-Extra: gemini
+Requires-Dist: langchain-google-genai<5.0.0,>=4.0.0; extra == "gemini"
+Requires-Dist: langchain-google-vertexai<4.0.0,>=3.0.0; extra == "gemini"
+Provides-Extra: groq
+Requires-Dist: langchain-groq<2.0.0,>=1.0.0; extra == "groq"
+Provides-Extra: mistral
+Requires-Dist: langchain-mistralai<2.0.0,>=1.0.0; extra == "mistral"
+Provides-Extra: deepseek
+Requires-Dist: langchain-deepseek<2.0.0,>=1.0.0; extra == "deepseek"
+Provides-Extra: ollama
+Requires-Dist: langchain-ollama<2.0.0,>=1.0.0; extra == "ollama"
+Provides-Extra: all
+Requires-Dist: langchain-anthropic<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-google-genai<5.0.0,>=4.0.0; extra == "all"
+Requires-Dist: langchain-google-vertexai<4.0.0,>=3.0.0; extra == "all"
+Requires-Dist: langchain-groq<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-mistralai<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-deepseek<2.0.0,>=1.0.0; extra == "all"
+Requires-Dist: langchain-ollama<2.0.0,>=1.0.0; extra == "all"
+Dynamic: author
+Dynamic: author-email
+Dynamic: classifier
+Dynamic: description
+Dynamic: description-content-type
+Dynamic: home-page
+Dynamic: keywords
+Dynamic: project-url
+Dynamic: provides-extra
+Dynamic: requires-dist
+Dynamic: requires-python
+Dynamic: summary
+
+# DM-aioaiagent
+
+## Urls
+
+* [PyPI](https://pypi.org/project/dm-aioaiagent)
+* [GitHub](https://github.com/MykhLibs/dm-aioaiagent)
+
+### * Package contains both `asynchronous` and `synchronous` clients
+
+## Installation
+
+By default, the package ships with **OpenAI** support. Other providers are optional extras:
+
+```bash
+pip install dm-aioaiagent                    # OpenAI only
+pip install dm-aioaiagent[anthropic]         # + Anthropic
+pip install dm-aioaiagent[anthropic,gemini]  # several at once
+pip install dm-aioaiagent[all]               # every supported provider
+```
+
+Available extras: `anthropic`, `gemini`, `groq`, `mistral`, `deepseek`, `ollama`, `all`.
+
+If you call a model from a provider whose package is not installed, `init_chat_model` will raise an `ImportError` with the exact `pip install` command you need.
+
+## Providers
+
+Provider resolution is delegated to LangChain's [`init_chat_model`](https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html) — the agent picks the provider automatically by model name prefix when possible. For everything else, use the `"provider:model"` mask.
+
+```python
+# Auto-detected from model prefix (rules come from LangChain's init_chat_model)
+agent = DMAioAIAgent(model="gpt-4o-mini")               # → openai
+agent = DMAioAIAgent(model="claude-3-5-sonnet-latest")  # → anthropic
+agent = DMAioAIAgent(model="gemini-2.0-flash")          # → google_vertexai (see note below)
+
+# Explicit provider via "provider:model" mask
+agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+agent = DMAioAIAgent(model="groq:llama-3.1-70b-versatile")
+agent = DMAioAIAgent(model="mistralai:mistral-large-latest")
+agent = DMAioAIAgent(model="deepseek:deepseek-chat")
+agent = DMAioAIAgent(model="ollama:llama3.1")
+
+# OpenAI-compatible gateway (OpenRouter, Together, vLLM, LiteLLM proxy, ...)
+# Works without installing any extra — just point to the OpenAI-compatible URL.
+agent = DMAioAIAgent(
+    model="meta-llama/llama-3.1-70b-instruct",
+    llm_provider_base_url="https://openrouter.ai/api/v1",
+    llm_provider_api_key="sk-or-...",
+)
+```
+
+> **Note about Gemini.** LangChain's auto-detect maps the `gemini*` prefix to **`google_vertexai`** (Google Cloud Vertex AI, requires a GCP service account). If you have a regular **Google AI Studio** API key (`GOOGLE_API_KEY`), use the `google_genai:` mask explicitly:
+>
+> ```python
+> agent = DMAioAIAgent(model="google_genai:gemini-2.0-flash")
+> ```
+
+Supported provider keys for the `"provider:model"` mask (list inherited from LangChain): `openai`, `anthropic`, `azure_openai`, `azure_ai`, `google_vertexai`, `google_genai`, `bedrock`, `bedrock_converse`, `cohere`, `fireworks`, `together`, `mistralai`, `huggingface`, `groq`, `ollama`, `google_anthropic_vertex`, `deepseek`, `ibm`, `nvidia`, `xai`, `perplexity`.
+
+### Note about parallel tool calls
+
+`parallel_tool_calls` is currently mapped only for **OpenAI** and **Anthropic** (their APIs use different formats). For other providers the parameter is silently ignored — extend per-provider mapping if you need it.
+
+## Usage
+
+Analogue to `DMAioAIAgent` is the synchronous client `DMAIAgent`.
+
+### Windows Setup
+
+```python
+import asyncio
+import sys
+
+if sys.platform == "win32":
+    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
+```
+
+### Api Key Setup
+
+Each provider reads its API key from a dedicated environment variable, e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `GROQ_API_KEY`, `MISTRAL_API_KEY`, etc. Alternatively, pass the key explicitly via the `llm_provider_api_key` argument — useful for multi-tenant setups, custom gateways, or runtime key rotation.
+
+**Use load_dotenv to load the `.env` file.**
+
+```python
+from dotenv import load_dotenv
+load_dotenv()
+```
+
+### Use agent *with* inner memory and run *single* message
+
+By default, agent use inner memory to store the conversation history.
+
+(You can set *max count messages in memory* by `max_memory_messages` init argument)
+
+```python
+import asyncio
+from dm_aioaiagent import DMAioAIAgent
+
+
+async def main():
+    # define a system message
+    system_message = "Your custom system message with role, backstory and goal"
+
+    # (optional) define a list of tools, if you want to use them
+    tools = [...]
+
+    # define a openai model, default is "gpt-4o-mini"
+    model_name = "gpt-4o"
+
+    # create an agent
+    ai_agent = DMAioAIAgent(system_message, tools, model=model_name)
+    # if you don't want to see the input and output messages from agent
+    # you can set `input_output_logging=False` init argument
+
+    # call an agent
+    answer = await ai_agent.run("Hello!")
+
+    # call an agent
+    answer = await ai_agent.run("I want to know the weather in Kyiv")
+
+    # get full conversation history
+    conversation_history = ai_agent.memory_messages
+
+    # clear conversation history
+    ai_agent.clear_memory_messages()
+
+
+if __name__ == "__main__":
+    asyncio.run(main())
+```
+
+### Use agent *without* inner memory and run *multiple* messages
+
+If you want to control the memory of the agent, you can disable it by setting `is_memory_enabled=False`
+
+```python
+import asyncio
+from dm_aioaiagent import DMAioAIAgent
+
+
+async def main():
+    # define a system message
+    system_message = "Your custom system message with role, backstory and goal"
+
+    # (optional) define a list of tools, if you want to use them
+    tools = [...]
+
+    # define a openai model, default is "gpt-4o-mini"
+    model_name = "gpt-4o"
+
+    # create an agent
+    ai_agent = DMAioAIAgent(system_message, tools, model=model_name,
+                            is_memory_enabled=False)
+    # if you don't want to see the input and output messages from agent
+    # you can set input_output_logging=False
+
+    # define the conversation message(s)
+    messages = [
+        {"role": "user", "content": "Hello!"}
+    ]
+
+    # call an agent
+    new_messages = await ai_agent.run_messages(messages)
+
+    # add new_messages to messages
+    messages.extend(new_messages)
+
+    # define the next conversation message
+    messages.append(
+        {"role": "user", "content": "I want to know the weather in Kyiv"}
+    )
+
+    # call an agent
+    new_messages = await ai_agent.run_messages(messages)
+
+
+if __name__ == "__main__":
+    asyncio.run(main())
+```
+
+### Image vision
+
+```python
+from dm_aioaiagent import DMAIAgent, OpenAIImageMessageContent
+
+
+def main():
+    # create an agent
+    ai_agent = DMAIAgent(agent_name="image_vision", model="gpt-4o")
+
+    # create an image message content
+    # NOTE: text argument is optional
+    img_content = OpenAIImageMessageContent(image_url="https://your.domain/image",
+                                            text="Hello, what is shown in the photo?")
+
+    # define the conversation messages
+    messages = [
+        {"role": "user", "content": "Hello!"},
+        {"role": "user", "content": img_content},
+    ]
+
+    # call an agent
+    new_messages = ai_agent.run_messages(messages)
+    answer = new_messages[-1].content
+
+
+if __name__ == "__main__":
+    main()
+```
@@ -0,0 +1,37 @@
+dm-logger<0.7.0,>=0.6.6
+python-dotenv>=1.0.0
+pydantic<3.0.0,>=2.9.2
+langchain<2.0.0,>=1.0.0
+langchain-core<2.0.0,>=1.0.0
+langchain-openai<2.0.0,>=1.0.0
+langgraph<2.0.0,>=1.0.0
+langsmith<1.0.0,>=0.4.0
+grandalf<0.9.0,>=0.8.0
+
+[all]
+langchain-anthropic<2.0.0,>=1.0.0
+langchain-google-genai<5.0.0,>=4.0.0
+langchain-google-vertexai<4.0.0,>=3.0.0
+langchain-groq<2.0.0,>=1.0.0
+langchain-mistralai<2.0.0,>=1.0.0
+langchain-deepseek<2.0.0,>=1.0.0
+langchain-ollama<2.0.0,>=1.0.0
+
+[anthropic]
+langchain-anthropic<2.0.0,>=1.0.0
+
+[deepseek]
+langchain-deepseek<2.0.0,>=1.0.0
+
+[gemini]
+langchain-google-genai<5.0.0,>=4.0.0
+langchain-google-vertexai<4.0.0,>=3.0.0
+
+[groq]
+langchain-groq<2.0.0,>=1.0.0
+
+[mistral]
+langchain-mistralai<2.0.0,>=1.0.0
+
+[ollama]
+langchain-ollama<2.0.0,>=1.0.0
@@ -0,0 +1,60 @@
+from setuptools import setup, find_packages
+
+
+def readme():
+    with open('README.md', 'r') as f:
+        return f.read()
+
+
+setup(
+    name='dm-aioaiagent',
+    version='v0.6.1',
+    author='dimka4621',
+    author_email='mismartconfig@gmail.com',
+    description='This is my custom aioaiagent client',
+    long_description=readme(),
+    long_description_content_type='text/markdown',
+    url='https://pypi.org/project/dm-aioaiagent',
+    packages=find_packages(),
+    install_requires=[
+        'dm-logger>=0.6.6, <0.7.0',
+        'python-dotenv>=1.0.0',
+        'pydantic>=2.9.2, <3.0.0',
+        'langchain>=1.0.0, <2.0.0',
+        'langchain-core>=1.0.0, <2.0.0',
+        'langchain-openai>=1.0.0, <2.0.0',
+        'langgraph>=1.0.0, <2.0.0',
+        'langsmith>=0.4.0, <1.0.0',
+        'grandalf>=0.8.0, <0.9.0',
+    ],
+    extras_require={
+        'anthropic': ['langchain-anthropic>=1.0.0, <2.0.0'],
+        'gemini': [
+            'langchain-google-genai>=4.0.0, <5.0.0',
+            'langchain-google-vertexai>=3.0.0, <4.0.0',
+        ],
+        'groq': ['langchain-groq>=1.0.0, <2.0.0'],
+        'mistral': ['langchain-mistralai>=1.0.0, <2.0.0'],
+        'deepseek': ['langchain-deepseek>=1.0.0, <2.0.0'],
+        'ollama': ['langchain-ollama>=1.0.0, <2.0.0'],
+        'all': [
+            'langchain-anthropic>=1.0.0, <2.0.0',
+            'langchain-google-genai>=4.0.0, <5.0.0',
+            'langchain-google-vertexai>=3.0.0, <4.0.0',
+            'langchain-groq>=1.0.0, <2.0.0',
+            'langchain-mistralai>=1.0.0, <2.0.0',
+            'langchain-deepseek>=1.0.0, <2.0.0',
+            'langchain-ollama>=1.0.0, <2.0.0',
+        ],
+    },
+    classifiers=[
+        'Programming Language :: Python :: 3.9',
+        'License :: OSI Approved :: MIT License',
+        'Operating System :: OS Independent'
+    ],
+    keywords='dm aioaiagent',
+    project_urls={
+        'GitHub': 'https://github.com/MykhLibs/dm-aioaiagent'
+    },
+    python_requires='>=3.9'
+)
@@ -1,155 +0,0 @@
-# DM-aioaiagent
-
-## Urls
-
-* [PyPI](https://pypi.org/project/dm-aioaiagent)
-* [GitHub](https://github.com/MykhLibs/dm-aioaiagent)
-
-### * Package contains both `asynchronous` and `synchronous` clients
-
-## Usage
-
-Analogue to `DMAioAIAgent` is the synchronous client `DMAIAgent`.
-
-### Windows Setup
-
-```python
-import asyncio
-import sys
-
-if sys.platform == "win32":
-    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
-```
-
-### Api Key Setup
-
-You can set your OpenAI API key in the environment variable `OPENAI_API_KEY` or pass it as an argument to the agent.
-
-**Use load_dotenv to load the `.env` file.**
-
-```python
-from dotenv import load_dotenv
-load_dotenv()
-```
-
-### Use agent *with* inner memory and run *single* message
-
-By default, agent use inner memory to store the conversation history.
-
-(You can set *max count messages in memory* by `max_memory_messages` init argument)
-
-```python
-import asyncio
-from dm_aioaiagent import DMAioAIAgent
-
-
-async def main():
-    # define a system message
-    system_message = "Your custom system message with role, backstory and goal"
-
-    # (optional) define a list of tools, if you want to use them
-    tools = [...]
-
-    # define a openai model, default is "gpt-4o-mini"
-    model_name = "gpt-4o"
-
-    # create an agent
-    ai_agent = DMAioAIAgent(system_message, tools, model=model_name)
-    # if you don't want to see the input and output messages from agent
-    # you can set `input_output_logging=False` init argument
-
-    # call an agent
-    answer = await ai_agent.run("Hello!")
-
-    # call an agent
-    answer = await ai_agent.run("I want to know the weather in Kyiv")
-
-    # get full conversation history
-    conversation_history = ai_agent.memory_messages
-
-    # clear conversation history
-    ai_agent.clear_memory_messages()
-
-
-if __name__ == "__main__":
-    asyncio.run(main())
-```
-
-### Use agent *without* inner memory and run *multiple* messages
-
-If you want to control the memory of the agent, you can disable it by setting `is_memory_enabled=False`
-
-```python
-import asyncio
-from dm_aioaiagent import DMAioAIAgent
-
-
-async def main():
-    # define a system message
-    system_message = "Your custom system message with role, backstory and goal"
-
-    # (optional) define a list of tools, if you want to use them
-    tools = [...]
-
-    # define a openai model, default is "gpt-4o-mini"
-    model_name = "gpt-4o"
-
-    # create an agent
-    ai_agent = DMAioAIAgent(system_message, tools, model=model_name,
-                            is_memory_enabled=False)
-    # if you don't want to see the input and output messages from agent
-    # you can set input_output_logging=False
-
-    # define the conversation message(s)
-    messages = [
-        {"role": "user", "content": "Hello!"}
-    ]
-
-    # call an agent
-    new_messages = await ai_agent.run_messages(messages)
-
-    # add new_messages to messages
-    messages.extend(new_messages)
-
-    # define the next conversation message
-    messages.append(
-        {"role": "user", "content": "I want to know the weather in Kyiv"}
-    )
-
-    # call an agent
-    new_messages = await ai_agent.run_messages(messages)
-
-
-if __name__ == "__main__":
-    asyncio.run(main())
-```
-
-### Image vision
-
-```python
-from dm_aioaiagent import DMAIAgent, OpenAIImageMessageContent
-
-
-def main():
-    # create an agent
-    ai_agent = DMAIAgent(agent_name="image_vision", model="gpt-4o")
-
-    # create an image message content
-    # NOTE: text argument is optional
-    img_content = OpenAIImageMessageContent(image_url="https://your.domain/image",
-                                            text="Hello, what is shown in the photo?")
-
-    # define the conversation messages
-    messages = [
-        {"role": "user", "content": "Hello!"},
-        {"role": "user", "content": img_content},
-    ]
-
-    # call an agent
-    new_messages = ai_agent.run_messages(messages)
-    answer = new_messages[-1].content
-
-
-if __name__ == "__main__":
-    main()
-```
@@ -1,11 +0,0 @@
-dm-logger<0.7.0,>=0.6.6
-python-dotenv>=1.0.0
-pydantic<3.0.0,>=2.9.2
-langchain<0.4.0,>=0.3.0
-langchain-core<0.4.0,>=0.3.5
-langchain-community<0.4.0,>=0.3.0
-langchain-openai<0.4.0,>=0.3.0
-langchain-anthropic<0.4.0,>=0.3.0
-langgraph<0.4.0,>=0.3.23
-langsmith<0.4.0,>=0.3.45
-grandalf<0.9.0,>=0.8.0
@@ -1,42 +0,0 @@
-from setuptools import setup, find_packages
-
-
-def readme():
-    with open('README.md', 'r') as f:
-        return f.read()
-
-
-setup(
-    name='dm-aioaiagent',
-    version='v0.5.7',
-    author='dimka4621',
-    author_email='mismartconfig@gmail.com',
-    description='This is my custom aioaiagent client',
-    long_description=readme(),
-    long_description_content_type='text/markdown',
-    url='https://pypi.org/project/dm-aioaiagent',
-    packages=find_packages(),
-    install_requires=[
-        'dm-logger>=0.6.6, <0.7.0',
-        'python-dotenv>=1.0.0',
-        'pydantic>=2.9.2, <3.0.0',
-        'langchain>=0.3.0, <0.4.0',
-        'langchain-core>=0.3.5, <0.4.0',
-        'langchain-community>=0.3.0, <0.4.0',
-        'langchain-openai>=0.3.0, <0.4.0',
-        'langchain-anthropic>=0.3.0, <0.4.0',
-        'langgraph>=0.3.23, <0.4.0',
-        'langsmith>=0.3.45, <0.4.0',
-        'grandalf>=0.8.0, <0.9.0',
-    ],
-    classifiers=[
-        'Programming Language :: Python :: 3.9',
-        'License :: OSI Approved :: MIT License',
-        'Operating System :: OS Independent'
-    ],
-    keywords='dm aioaiagent',
-    project_urls={
-        'GitHub': 'https://github.com/MykhLibs/dm-aioaiagent'
-    },
-    python_requires='>=3.9'
-)
File without changes