euriai 0.3.3__tar.gz → 0.3.5__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
euriai-0.3.5/PKG-INFO ADDED
@@ -0,0 +1,133 @@
+ Metadata-Version: 2.4
+ Name: euriai
+ Version: 0.3.5
+ Summary: Python client for EURI LLM API (euron.one) with CLI and interactive wizard
+ Author: euron.one
+ Author-email: sudhanshu@euron.one
+ License: MIT
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Operating System :: OS Independent
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Intended Audience :: Developers
+ Requires-Python: >=3.6
+ Description-Content-Type: text/markdown
+ Requires-Dist: requests
+ Requires-Dist: langchain-core
+ Dynamic: author
+ Dynamic: author-email
+ Dynamic: classifier
+ Dynamic: description
+ Dynamic: description-content-type
+ Dynamic: license
+ Dynamic: requires-dist
+ Dynamic: requires-python
+ Dynamic: summary
+
+ # euriai 🧠
+
+ **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one). Supports completions, streaming responses, embeddings, CLI interaction, and an interactive guided wizard!
+
+ ---
+
+ ## 🔧 Installation
+
+ ```bash
+ pip install euriai
+ ```
+
+ ## 🚀 Python Usage
+
+ ### Text Generation
+
+ ```python
+ from euriai import EuriaiClient
+
+ client = EuriaiClient(
+     api_key="your_api_key_here",
+     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
+ )
+
+ response = client.generate_completion(
+     prompt="Write a short poem about artificial intelligence.",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(response)
+ ```
+
+ ### Embeddings
+
+ ```python
+ from euriai.embedding import EuriaiEmbeddingClient
+
+ client = EuriaiEmbeddingClient(api_key="your_key")
+ embedding = client.embed("Hello world")
+ print(embedding[:5]) # Print first 5 dimensions of the embedding vector
+ ```
+
+ ## 💻 Command-Line Interface (CLI) Usage
+
+ Run prompts directly from the terminal:
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
+ ```
+
+ Enable streaming output (if supported by the model):
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
+ ```
+
+ List all supported model IDs with recommended use-cases and temperature/token advice:
+
+ ```bash
+ euriai --models
+ ```
+
+ ## 🤖 LangChain Integration
+
+ ### Text Generation
+
+ Use Euriai with LangChain directly:
+
+ ```python
+ from euriai import EuriaiLangChainLLM
+
+ llm = EuriaiLangChainLLM(
+     api_key="your_api_key",
+     model="gpt-4.1-nano",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(llm.invoke("Write a poem about time travel."))
+ ```
+
+ ### Embeddings
+
+ Use Euriai embeddings with LangChain:
+
+ ```python
+ from euriai.langchain_embed import EuriaiEmbeddings
+
+ embedding_model = EuriaiEmbeddings(api_key="your_key")
+ print(embedding_model.embed_query("What's AI?")[:5]) # Print first 5 dimensions
+ ```
+
+ ## 📘 Documentation
+
+ For full documentation, visit our [official docs site](https://docs.euron.one).
+
+ ## 🔑 Getting an API Key
+
+ Sign up for an API key at [Euron AI Platform](https://app.euron.one).
+
+ ## 🤝 Contributing
+
+ Contributions are welcome! Please feel free to submit a Pull Request.
+
+ ## 📄 License
+
+ This project is licensed under the MIT License - see the LICENSE file for details.
euriai-0.3.5/README.md ADDED
@@ -0,0 +1,108 @@
+ # euriai 🧠
+
+ **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one). Supports completions, streaming responses, embeddings, CLI interaction, and an interactive guided wizard!
+
+ ---
+
+ ## 🔧 Installation
+
+ ```bash
+ pip install euriai
+ ```
+
+ ## 🚀 Python Usage
+
+ ### Text Generation
+
+ ```python
+ from euriai import EuriaiClient
+
+ client = EuriaiClient(
+     api_key="your_api_key_here",
+     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
+ )
+
+ response = client.generate_completion(
+     prompt="Write a short poem about artificial intelligence.",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(response)
+ ```
+
+ ### Embeddings
+
+ ```python
+ from euriai.embedding import EuriaiEmbeddingClient
+
+ client = EuriaiEmbeddingClient(api_key="your_key")
+ embedding = client.embed("Hello world")
+ print(embedding[:5]) # Print first 5 dimensions of the embedding vector
+ ```
+
+ ## 💻 Command-Line Interface (CLI) Usage
+
+ Run prompts directly from the terminal:
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
+ ```
+
+ Enable streaming output (if supported by the model):
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
+ ```
+
+ List all supported model IDs with recommended use-cases and temperature/token advice:
+
+ ```bash
+ euriai --models
+ ```
+
+ ## 🤖 LangChain Integration
+
+ ### Text Generation
+
+ Use Euriai with LangChain directly:
+
+ ```python
+ from euriai import EuriaiLangChainLLM
+
+ llm = EuriaiLangChainLLM(
+     api_key="your_api_key",
+     model="gpt-4.1-nano",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(llm.invoke("Write a poem about time travel."))
+ ```
+
+ ### Embeddings
+
+ Use Euriai embeddings with LangChain:
+
+ ```python
+ from euriai.langchain_embed import EuriaiEmbeddings
+
+ embedding_model = EuriaiEmbeddings(api_key="your_key")
+ print(embedding_model.embed_query("What's AI?")[:5]) # Print first 5 dimensions
+ ```
+
+ ## 📘 Documentation
+
+ For full documentation, visit our [official docs site](https://docs.euron.one).
+
+ ## 🔑 Getting an API Key
+
+ Sign up for an API key at [Euron AI Platform](https://app.euron.one).
+
+ ## 🤝 Contributing
+
+ Contributions are welcome! Please feel free to submit a Pull Request.
+
+ ## 📄 License
+
+ This project is licensed under the MIT License - see the LICENSE file for details.
euriai-0.3.5/euriai/__init__.py ADDED
@@ -0,0 +1,6 @@
+ from .client import EuriaiClient
+ from .langchain_llm import EuriaiLangChainLLM
+ from .embedding import EuriaiEmbeddingClient
+ from .langchain_embed import EuriaiEmbeddings
+
+ __all__ = ["EuriaiClient", "EuriaiLangChainLLM","EuriaiEmbeddingClient", "EuriaiEmbeddings"]
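Because 0.3.5 re-exports the new embedding classes through `__all__`, they can now be imported from the package top level as well as from their submodules. A minimal sketch (the API key is a placeholder):

```python
from euriai import EuriaiClient, EuriaiEmbeddingClient, EuriaiEmbeddings, EuriaiLangChainLLM

# Top-level imports now cover the completion client and the new embedding helpers.
client = EuriaiClient(api_key="your_api_key_here", model="gpt-4.1-nano")
embedder = EuriaiEmbeddingClient(api_key="your_api_key_here")
```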
euriai-0.3.5/euriai/embedding.py ADDED
@@ -0,0 +1,40 @@
+ import requests
+ import numpy as np
+
+ class EuriaiEmbeddingClient:
+     def __init__(self, api_key: str, model: str = "text-embedding-3-small"):
+         self.api_key = api_key
+         self.model = model
+         self.url = "https://api.euron.one/api/v1/euri/alpha/embeddings"
+
+     def embed(self, text: str) -> np.ndarray:
+         headers = {
+             "Content-Type": "application/json",
+             "Authorization": f"Bearer {self.api_key}"
+         }
+         payload = {
+             "input": text,
+             "model": self.model
+         }
+
+         response = requests.post(self.url, headers=headers, json=payload)
+         response.raise_for_status()
+         data = response.json()
+
+         return np.array(data["data"][0]["embedding"])
+
+     def embed_batch(self, texts: list[str]) -> list[np.ndarray]:
+         headers = {
+             "Content-Type": "application/json",
+             "Authorization": f"Bearer {self.api_key}"
+         }
+         payload = {
+             "input": texts,
+             "model": self.model
+         }
+
+         response = requests.post(self.url, headers=headers, json=payload)
+         response.raise_for_status()
+         data = response.json()
+
+         return [np.array(obj["embedding"]) for obj in data["data"]]
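A minimal usage sketch of the `EuriaiEmbeddingClient` added above: `embed()` returns a NumPy vector for a single text and `embed_batch()` returns a list of vectors. The API key is a placeholder, and the cosine-similarity helper is illustrative only, not part of the package:

```python
import numpy as np

from euriai.embedding import EuriaiEmbeddingClient

client = EuriaiEmbeddingClient(api_key="your_api_key_here")  # placeholder key

# Single text -> np.ndarray; batch of texts -> list of np.ndarray (see embed/embed_batch above).
query_vec = client.embed("What is machine learning?")
doc_vecs = client.embed_batch([
    "Machine learning builds models from data.",
    "Paris is the capital of France.",
])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Illustrative helper, not part of euriai.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Score each document against the query by cosine similarity.
print([cosine_similarity(query_vec, v) for v in doc_vecs])
```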
euriai-0.3.5/euriai/langchain_embed.py ADDED
@@ -0,0 +1,14 @@
+ from langchain_core.embeddings import Embeddings
+ from typing import List
+ from euriai.embedding import EuriaiEmbeddingClient
+
+
+ class EuriaiEmbeddings(Embeddings):
+     def __init__(self, api_key: str, model: str = "text-embedding-3-small"):
+         self.client = EuriaiEmbeddingClient(api_key=api_key, model=model)
+
+     def embed_documents(self, texts: List[str]) -> List[List[float]]:
+         return [embedding.tolist() for embedding in self.client.embed_batch(texts)]
+
+     def embed_query(self, text: str) -> List[float]:
+         return self.client.embed(text).tolist()
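A short sketch of the `EuriaiEmbeddings` wrapper added above, using only the two methods defined in this file; any LangChain component that accepts an `Embeddings` instance can consume it the same way. The API key is a placeholder:

```python
from euriai.langchain_embed import EuriaiEmbeddings

embeddings = EuriaiEmbeddings(api_key="your_api_key_here")  # placeholder key

# embed_documents() returns one list[float] per input text.
doc_vectors = embeddings.embed_documents([
    "EURI provides an LLM API.",
    "Embeddings map text to vectors.",
])

# embed_query() returns a single list[float] for a query string.
query_vector = embeddings.embed_query("What does EURI offer?")

print(len(doc_vectors), len(query_vector))
```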
euriai-0.3.5/euriai.egg-info/PKG-INFO ADDED
@@ -0,0 +1,133 @@
+ Metadata-Version: 2.4
+ Name: euriai
+ Version: 0.3.5
+ Summary: Python client for EURI LLM API (euron.one) with CLI and interactive wizard
+ Author: euron.one
+ Author-email: sudhanshu@euron.one
+ License: MIT
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Operating System :: OS Independent
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Intended Audience :: Developers
+ Requires-Python: >=3.6
+ Description-Content-Type: text/markdown
+ Requires-Dist: requests
+ Requires-Dist: langchain-core
+ Dynamic: author
+ Dynamic: author-email
+ Dynamic: classifier
+ Dynamic: description
+ Dynamic: description-content-type
+ Dynamic: license
+ Dynamic: requires-dist
+ Dynamic: requires-python
+ Dynamic: summary
+
+ # euriai 🧠
+
+ **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one). Supports completions, streaming responses, embeddings, CLI interaction, and an interactive guided wizard!
+
+ ---
+
+ ## 🔧 Installation
+
+ ```bash
+ pip install euriai
+ ```
+
+ ## 🚀 Python Usage
+
+ ### Text Generation
+
+ ```python
+ from euriai import EuriaiClient
+
+ client = EuriaiClient(
+     api_key="your_api_key_here",
+     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
+ )
+
+ response = client.generate_completion(
+     prompt="Write a short poem about artificial intelligence.",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(response)
+ ```
+
+ ### Embeddings
+
+ ```python
+ from euriai.embedding import EuriaiEmbeddingClient
+
+ client = EuriaiEmbeddingClient(api_key="your_key")
+ embedding = client.embed("Hello world")
+ print(embedding[:5]) # Print first 5 dimensions of the embedding vector
+ ```
+
+ ## 💻 Command-Line Interface (CLI) Usage
+
+ Run prompts directly from the terminal:
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
+ ```
+
+ Enable streaming output (if supported by the model):
+
+ ```bash
+ euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
+ ```
+
+ List all supported model IDs with recommended use-cases and temperature/token advice:
+
+ ```bash
+ euriai --models
+ ```
+
+ ## 🤖 LangChain Integration
+
+ ### Text Generation
+
+ Use Euriai with LangChain directly:
+
+ ```python
+ from euriai import EuriaiLangChainLLM
+
+ llm = EuriaiLangChainLLM(
+     api_key="your_api_key",
+     model="gpt-4.1-nano",
+     temperature=0.7,
+     max_tokens=300
+ )
+
+ print(llm.invoke("Write a poem about time travel."))
+ ```
+
+ ### Embeddings
+
+ Use Euriai embeddings with LangChain:
+
+ ```python
+ from euriai.langchain_embed import EuriaiEmbeddings
+
+ embedding_model = EuriaiEmbeddings(api_key="your_key")
+ print(embedding_model.embed_query("What's AI?")[:5]) # Print first 5 dimensions
+ ```
+
+ ## 📘 Documentation
+
+ For full documentation, visit our [official docs site](https://docs.euron.one).
+
+ ## 🔑 Getting an API Key
+
+ Sign up for an API key at [Euron AI Platform](https://app.euron.one).
+
+ ## 🤝 Contributing
+
+ Contributions are welcome! Please feel free to submit a Pull Request.
+
+ ## 📄 License
+
+ This project is licensed under the MIT License - see the LICENSE file for details.
{euriai-0.3.3 → euriai-0.3.5}/euriai.egg-info/SOURCES.txt RENAMED
@@ -3,6 +3,8 @@ setup.py
  euriai/__init__.py
  euriai/cli.py
  euriai/client.py
+ euriai/embedding.py
+ euriai/langchain_embed.py
  euriai/langchain_llm.py
  euriai.egg-info/PKG-INFO
  euriai.egg-info/SOURCES.txt
{euriai-0.3.3 → euriai-0.3.5}/setup.py RENAMED
@@ -2,7 +2,7 @@ from setuptools import setup, find_packages
 
  setup(
      name="euriai",
-     version="0.3.3",
+     version="0.3.5",
      description="Python client for EURI LLM API (euron.one) with CLI and interactive wizard",
      long_description=open("README.md", encoding="utf-8").read(),
      long_description_content_type="text/markdown",
euriai-0.3.3/PKG-INFO DELETED
@@ -1,76 +0,0 @@
- Metadata-Version: 2.1
- Name: euriai
- Version: 0.3.3
- Summary: Python client for EURI LLM API (euron.one) with CLI and interactive wizard
- Author: euron.one
- Author-email: sudhanshu@euron.one
- License: MIT
- Classifier: Programming Language :: Python :: 3
- Classifier: Operating System :: OS Independent
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Intended Audience :: Developers
- Requires-Python: >=3.6
- Description-Content-Type: text/markdown
- Requires-Dist: requests
- Requires-Dist: langchain-core
-
- # euriai 🧠
-
- **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one).
- Supports completions, streaming responses, CLI interaction, and an interactive guided wizard!
-
- ---
-
- ## 🔧 Installation
-
- ```bash
- pip install euriai
-
- ## python sample Usage
-
- from euriai import EuriaiClient
-
- client = EuriaiClient(
-     api_key="your_api_key_here",
-     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
- )
-
- response = client.generate_completion(
-     prompt="Write a short poem about artificial intelligence.",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(response)
-
-
- ## 💻 Command-Line Interface (CLI) Usage
- Run prompts directly from the terminal:
-
- euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
-
-
- ## Enable streaming output (if supported by the model):
-
- euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
-
-
- ##List all supported model IDs with recommended use-cases and temperature/token advice:
-
- euriai --models
-
- ## 🤖 LangChain Integration
-
- Use Euriai with LangChain directly:
-
- ```python
- from euriai import EuriaiLangChainLLM
-
- llm = EuriaiLangChainLLM(
-     api_key="your_api_key",
-     model="gpt-4.1-nano",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(llm.invoke("Write a poem about time travel."))
euriai-0.3.3/README.md DELETED
@@ -1,60 +0,0 @@
- # euriai 🧠
-
- **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one).
- Supports completions, streaming responses, CLI interaction, and an interactive guided wizard!
-
- ---
-
- ## 🔧 Installation
-
- ```bash
- pip install euriai
-
- ## python sample Usage
-
- from euriai import EuriaiClient
-
- client = EuriaiClient(
-     api_key="your_api_key_here",
-     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
- )
-
- response = client.generate_completion(
-     prompt="Write a short poem about artificial intelligence.",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(response)
-
-
- ## 💻 Command-Line Interface (CLI) Usage
- Run prompts directly from the terminal:
-
- euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
-
-
- ## Enable streaming output (if supported by the model):
-
- euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
-
-
- ##List all supported model IDs with recommended use-cases and temperature/token advice:
-
- euriai --models
-
- ## 🤖 LangChain Integration
-
- Use Euriai with LangChain directly:
-
- ```python
- from euriai import EuriaiLangChainLLM
-
- llm = EuriaiLangChainLLM(
-     api_key="your_api_key",
-     model="gpt-4.1-nano",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(llm.invoke("Write a poem about time travel."))
euriai-0.3.3/euriai/__init__.py DELETED
@@ -1,4 +0,0 @@
- from .client import EuriaiClient
- from .langchain_llm import EuriaiLangChainLLM
-
- __all__ = ["EuriaiClient", "EuriaiLangChainLLM"]
euriai-0.3.3/euriai.egg-info/PKG-INFO DELETED
@@ -1,76 +0,0 @@
- Metadata-Version: 2.1
- Name: euriai
- Version: 0.3.3
- Summary: Python client for EURI LLM API (euron.one) with CLI and interactive wizard
- Author: euron.one
- Author-email: sudhanshu@euron.one
- License: MIT
- Classifier: Programming Language :: Python :: 3
- Classifier: Operating System :: OS Independent
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Intended Audience :: Developers
- Requires-Python: >=3.6
- Description-Content-Type: text/markdown
- Requires-Dist: requests
- Requires-Dist: langchain-core
-
- # euriai 🧠
-
- **EURI AI Python Client** – A simple wrapper and CLI tool for the [Euron LLM API](https://api.euron.one).
- Supports completions, streaming responses, CLI interaction, and an interactive guided wizard!
-
- ---
-
- ## 🔧 Installation
-
- ```bash
- pip install euriai
-
- ## python sample Usage
-
- from euriai import EuriaiClient
-
- client = EuriaiClient(
-     api_key="your_api_key_here",
-     model="gpt-4.1-nano" # You can also try: "gemini-2.0-flash-001", "llama-4-maverick", etc.
- )
-
- response = client.generate_completion(
-     prompt="Write a short poem about artificial intelligence.",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(response)
-
-
- ## 💻 Command-Line Interface (CLI) Usage
- Run prompts directly from the terminal:
-
- euriai --api_key YOUR_API_KEY --prompt "Tell me a joke"
-
-
- ## Enable streaming output (if supported by the model):
-
- euriai --api_key YOUR_API_KEY --prompt "Stream a fun fact" --stream
-
-
- ##List all supported model IDs with recommended use-cases and temperature/token advice:
-
- euriai --models
-
- ## 🤖 LangChain Integration
-
- Use Euriai with LangChain directly:
-
- ```python
- from euriai import EuriaiLangChainLLM
-
- llm = EuriaiLangChainLLM(
-     api_key="your_api_key",
-     model="gpt-4.1-nano",
-     temperature=0.7,
-     max_tokens=300
- )
-
- print(llm.invoke("Write a poem about time travel."))