OpenHosta 1.0.1 (tar.gz)

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 hand-e
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,193 @@
+ Metadata-Version: 2.1
+ Name: OpenHosta
+ Version: 1.0.1
+ Summary: Open-source programming project: AI integration in the development environment
+ Author: Léandre Ramos, Merlin Devillard, William Jolivet, Emmanuel Batt
+ License: MIT License
+
+ Copyright (c) 2024 hand-e
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
+
+ Project-URL: Homepage, https://github.com/hand-e-fr/OpenHosta
+ Project-URL: Issues, https://github.com/hand-e-fr/OpenHosta/issues
+ Keywords: AI,GPT,Natural language,Automatic,Easy
+ Classifier: Development Status :: 5 - Production/Stable
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Classifier: Intended Audience :: Developers
+ Classifier: Natural Language :: French
+ Classifier: Topic :: Software Development :: Code Generators
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: requests>=2.32.3
+ Requires-Dist: pydantic>=2.8.2
+ Requires-Dist: tiktoken>=0.7.0
+
+ # OpenHosta
+ v1.0 - Open-Source Project
+
+ **- The future of development is human -**
+
+ Welcome to the OpenHosta documentation. OpenHosta is a powerful tool that facilitates the integration of LLMs into the development environment: it emulates functions using AI while respecting Python's native paradigms and syntax.
+
+ For this project, we have adopted a [Code of Conduct](CODE_OF_CONDUCT.md) to ensure a respectful and inclusive environment for all contributors. Please take a moment to read it.
+
+ ## Table of Content
+
+ - [OpenHosta](#openhosta)
+   - [Table of Content](#table-of-content)
+   - [How to install OpenHosta ?](#how-to-install-openhosta-)
+     - [Prerequisites](#prerequisites)
+     - [Installation](#installation)
+       - [Via pip](#via-pip)
+       - [Via git (Developper version)](#via-git-developper-version)
+     - [Example](#example)
+   - [Further information](#further-information)
+     - [Contributing](#contributing)
+     - [License](#license)
+     - [Authors \& Contact](#authors--contact)
+
+ ---
+
+ ## How to install OpenHosta ?
+
+ ### Prerequisites
+
+ 1. **Python 3.8+**
+    - Download and install Python from [python.org](https://www.python.org/downloads/).
+
+ 2. **pip**
+    - pip is generally included with Python. Verify its installation with:
+      ```sh
+      pip --version
+      ```
+
+ 3. **Git**
+    - Download and install Git from [git-scm.com](https://git-scm.com/downloads).
+
+ 4. **Virtual Environment (optional)**
+    - Create a virtual environment:
+      ```bash
+      python -m venv env
+      ```
+    - Activate the virtual environment:
+      ```bash
+      .\env\Scripts\activate   # Windows
+      source env/bin/activate  # macOS/Linux
+      ```
+
+ 5. **API Key**
+    - Log in to your OpenAI account at [openai.com](https://openai.com/), then create your API key. For further information, see this [tutorial](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key).
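A common pattern (a sketch, not part of OpenHosta itself) is to keep the key in an environment variable rather than hard-coding it in your script; `OPENHOSTA_API_KEY` below is a hypothetical variable name:

```python
import os

# Hypothetical variable name; any name works.
api_key = os.environ.get("OPENHOSTA_API_KEY")

if api_key is None:
    print("OPENHOSTA_API_KEY is not set; export it before running OpenHosta.")
else:
    # Hand the key to OpenHosta at startup, e.g.:
    #   from openhosta import config
    #   config.set_default_apiKey(api_key)
    print("API key loaded from the environment.")
```

`config.set_default_apiKey` is the setter shown in the Example section.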
+ ### Installation
+
+ #### Via pip
+
+ 1. Run the following command to install OpenHosta directly:
+
+    ```sh
+    pip install openhosta
+    ```
+
+ 2. After the installation, verify that OpenHosta is installed correctly by running:
+
+    ```sh
+    pip show openhosta
+    ```
+
+ #### Via git (Developper version)
+
+ 1. Clone the **Git repository** to your local machine:
+
+    ```bash
+    git clone git@github.com:hand-e-fr/OpenHosta-dev.git
+    ```
+
+ 2. Navigate to the **directory** of the cloned project:
+
+    ```bash
+    cd OpenHosta-dev
+    ```
+
+ 3. Install the necessary **dependencies** before starting:
+
+    ```bash
+    pip install -r requirements.txt
+    ```
+
+ This way you have all the documentation and source code to understand our project.
+
+ ### Example
+
+ ```python
+ from openhosta import *
+
+ config.set_default_apiKey("example-apikey")
+
+ def my_func(a: int, b: str) -> dict:
+     """
+     This function does something.
+     """
+     return emulate()
+
+ my_func(5, "Hello World!")
+
+ my_lambda = thought("Do something")
+ my_lambda(5)
+ ```
+ Check OpenHosta's [documentation](doc/Docs.md) for more detailed information and examples.
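Under the hood, the emulator asks the model to answer in JSON of the form `{"return": ..., "confidence": ...}` (see the bundled prompt). A minimal sketch of reading such a reply:

```python
import json

# Example reply in the emulator's JSON format (the values are illustrative).
raw = '{"return": 10, "confidence": "medium-unique"}'

data = json.loads(raw)
result = data["return"]          # the emulated function's value
confidence = data["confidence"]  # the model's self-reported confidence
print(result, confidence)  # 10 medium-unique
```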
+
+ ## Further information
+
+ ### Contributing
+
+ We warmly welcome contributions from the community. Whether you are an experienced developer or a beginner, your contributions are welcome.
+
+ If you wish to contribute to this project, please refer to our [Contribution Guide](CONTRIBUTING.md) and our [Code of Conduct](CODE_OF_CONDUCT.md).
+
+ Browse the existing [issues](https://github.com/hand-e-fr/OpenHosta/issues) to see if someone is already working on what you have in mind or to find contribution ideas.
+
+ ### License
+
+ This project is licensed under the MIT License. This means you are free to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, subject to the following condition:
+
+ - The text of the license must be included in all copies or substantial portions of the software.
+
+ See the [LICENSE](LICENSE) file for more details.
+
+ ### Authors & Contact
+
+ For further questions or assistance, please reach out to our partner hand-e or contact us directly via GitHub.
+
+ **Authors**:
+ - Emmanuel Batt: Manager and Coordinator, Founder of Hand-e
+ - William Jolivet: DevOps, SysAdmin
+ - Léandre Ramos: MLOps, AI developer
+ - Merlin Devillard: UX designer, Product Owner
+
+ GitHub: https://github.com/hand-e-fr/OpenHosta
+
+ ---
+
+ Thank you for your interest in our project and your potential contributions!
+
+ **The OpenHosta Team**
@@ -0,0 +1,149 @@
+ # OpenHosta
+ v1.0 - Open-Source Project
+
+ **- The future of development is human -**
+
+ Welcome to the OpenHosta documentation. OpenHosta is a powerful tool that facilitates the integration of LLMs into the development environment: it emulates functions using AI while respecting Python's native paradigms and syntax.
+
+ For this project, we have adopted a [Code of Conduct](CODE_OF_CONDUCT.md) to ensure a respectful and inclusive environment for all contributors. Please take a moment to read it.
+
+ ## Table of Content
+
+ - [OpenHosta](#openhosta)
+   - [Table of Content](#table-of-content)
+   - [How to install OpenHosta ?](#how-to-install-openhosta-)
+     - [Prerequisites](#prerequisites)
+     - [Installation](#installation)
+       - [Via pip](#via-pip)
+       - [Via git (Developper version)](#via-git-developper-version)
+     - [Example](#example)
+   - [Further information](#further-information)
+     - [Contributing](#contributing)
+     - [License](#license)
+     - [Authors \& Contact](#authors--contact)
+
+ ---
+
+ ## How to install OpenHosta ?
+
+ ### Prerequisites
+
+ 1. **Python 3.8+**
+    - Download and install Python from [python.org](https://www.python.org/downloads/).
+
+ 2. **pip**
+    - pip is generally included with Python. Verify its installation with:
+      ```sh
+      pip --version
+      ```
+
+ 3. **Git**
+    - Download and install Git from [git-scm.com](https://git-scm.com/downloads).
+
+ 4. **Virtual Environment (optional)**
+    - Create a virtual environment:
+      ```bash
+      python -m venv env
+      ```
+    - Activate the virtual environment:
+      ```bash
+      .\env\Scripts\activate   # Windows
+      source env/bin/activate  # macOS/Linux
+      ```
+
+ 5. **API Key**
+    - Log in to your OpenAI account at [openai.com](https://openai.com/), then create your API key. For further information, see this [tutorial](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key).
+
+ ### Installation
+
+ #### Via pip
+
+ 1. Run the following command to install OpenHosta directly:
+
+    ```sh
+    pip install openhosta
+    ```
+
+ 2. After the installation, verify that OpenHosta is installed correctly by running:
+
+    ```sh
+    pip show openhosta
+    ```
+
+ #### Via git (Developper version)
+
+ 1. Clone the **Git repository** to your local machine:
+
+    ```bash
+    git clone git@github.com:hand-e-fr/OpenHosta-dev.git
+    ```
+
+ 2. Navigate to the **directory** of the cloned project:
+
+    ```bash
+    cd OpenHosta-dev
+    ```
+
+ 3. Install the necessary **dependencies** before starting:
+
+    ```bash
+    pip install -r requirements.txt
+    ```
+
+ This way you have all the documentation and source code to understand our project.
+
+ ### Example
+
+ ```python
+ from openhosta import *
+
+ config.set_default_apiKey("example-apikey")
+
+ def my_func(a: int, b: str) -> dict:
+     """
+     This function does something.
+     """
+     return emulate()
+
+ my_func(5, "Hello World!")
+
+ my_lambda = thought("Do something")
+ my_lambda(5)
+ ```
+ Check OpenHosta's [documentation](doc/Docs.md) for more detailed information and examples.
+
+ ## Further information
+
+ ### Contributing
+
+ We warmly welcome contributions from the community. Whether you are an experienced developer or a beginner, your contributions are welcome.
+
+ If you wish to contribute to this project, please refer to our [Contribution Guide](CONTRIBUTING.md) and our [Code of Conduct](CODE_OF_CONDUCT.md).
+
+ Browse the existing [issues](https://github.com/hand-e-fr/OpenHosta/issues) to see if someone is already working on what you have in mind or to find contribution ideas.
+
+ ### License
+
+ This project is licensed under the MIT License. This means you are free to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, subject to the following condition:
+
+ - The text of the license must be included in all copies or substantial portions of the software.
+
+ See the [LICENSE](LICENSE) file for more details.
+
+ ### Authors & Contact
+
+ For further questions or assistance, please reach out to our partner hand-e or contact us directly via GitHub.
+
+ **Authors**:
+ - Emmanuel Batt: Manager and Coordinator, Founder of Hand-e
+ - William Jolivet: DevOps, SysAdmin
+ - Léandre Ramos: MLOps, AI developer
+ - Merlin Devillard: UX designer, Product Owner
+
+ GitHub: https://github.com/hand-e-fr/OpenHosta
+
+ ---
+
+ Thank you for your interest in our project and your potential contributions!
+
+ **The OpenHosta Team**
@@ -0,0 +1,45 @@
+ [build-system]
+ requires = ["setuptools>=61.0"]
+ build-backend = "setuptools.build_meta"
+
+ [project]
+ name = "OpenHosta"
+ version = "1.0.1"
+ description = "Open-source programming project: AI integration in the development environment"
+ keywords = ["AI", "GPT", "Natural language", "Automatic", "Easy"]
+ authors = [
+     { name="Léandre Ramos" },
+     { name="Merlin Devillard" },
+     { name="William Jolivet" },
+     { name="Emmanuel Batt" },
+ ]
+ readme = "README.md"
+ license = {file = "LICENSE"}
+ requires-python = ">=3.8"
+ classifiers = [
+     "Development Status :: 5 - Production/Stable",
+     "Programming Language :: Python :: 3.8",
+     "License :: OSI Approved :: MIT License",
+     "Operating System :: OS Independent",
+     "Intended Audience :: Developers",
+     "Natural Language :: French",
+     "Topic :: Software Development :: Code Generators"
+ ]
+ dependencies = [
+     "requests>=2.32.3",
+     "pydantic>=2.8.2",
+     "tiktoken>=0.7.0"
+ ]
+
+ [project.urls]
+ Homepage = "https://github.com/hand-e-fr/OpenHosta"
+ Issues = "https://github.com/hand-e-fr/OpenHosta/issues"
+
+ [tool.setuptools]
+ package-dir = { "" = "src" }
+
+ [tool.setuptools.packages.find]
+ where = ["src"]
+
+ [tool.setuptools.package-data]
+ "OpenHosta" = ["*.json"]
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1 @@
+ from .openhosta import *
@@ -0,0 +1,145 @@
+ import tiktoken
+ import time as t
+ import sys
+ import requests
+ import json
+
+ from .config import Model, _default_model
+ from .prompt import PromptMananger
+
+ _x = PromptMananger()
+
+ _estimate_prompt = _x.get_prompt("estimate")
+
+
+ class ModelAnalizer(Model):
+
+     _default_input_cost: float = 0.005
+     _default_output_cost: float = 0.015
+     _default_token_perSec: float = 63.32
+     _default_latency: float = 0.48
+
+     def __init__(
+         self,
+         name: str,
+         input_cost: float,
+         output_cost: float,
+         latency: float,
+         token_perSec: float,
+     ):
+         self.name = name
+         self.input_cost = self._default_input_cost if input_cost is None else input_cost
+         self.output_cost = (
+             self._default_output_cost if output_cost is None else output_cost
+         )
+         self.latency = self._default_latency if latency is None else latency
+         self.token_perSec = (
+             self._default_token_perSec if token_perSec is None else token_perSec
+         )
+         self.tokenizer = tiktoken.get_encoding("cl100k_base")
+
+     def get_input_cost(self):
+         return self.input_cost
+
+     def get_output_cost(self):
+         return self.output_cost
+
+     def get_latency(self):
+         return self.latency
+
+     def get_token_perSec(self):
+         return self.token_perSec
+
+     def _estimate_output_token(self, function_doc: str, function_call: str):
+         global _estimate_prompt, _default_model
+
+         try:
+             if not _estimate_prompt:
+                 raise ValueError("ValueError -> estimate empty values")
+         except ValueError as v:
+             sys.stderr.write(f"[ESTIMATE_ERROR]: {v}")
+             return None
+
+         api_key = _default_model.api_key
+         l_body = {
+             "model": _default_model.model,
+             "messages": [
+                 {
+                     "role": "system",
+                     "content": [
+                         {
+                             "type": "text",
+                             "text": _estimate_prompt
+                             + "---\n"
+                             + str(function_doc)
+                             + "\n---",
+                         }
+                     ],
+                 },
+                 {
+                     "role": "user",
+                     "content": [{"type": "text", "text": str(function_call)}],
+                 },
+             ],
+             "response_format": {"type": "json_object"},
+             "temperature": 0.1,
+             "top_p": 0.1,
+         }
+
+         headers = {
+             "Content-Type": "application/json",
+             "Authorization": f"Bearer {api_key}",
+         }
+
+         response = requests.post(_default_model.base_url, json=l_body, headers=headers)
+
+         if response.status_code == 200:
+             data = response.json()
+             json_string = data["choices"][0]["message"]["content"]
+             try:
+                 l_ret_data = json.loads(json_string)
+             except json.JSONDecodeError as e:
+                 sys.stderr.write(f"JSONDecodeError: {e}")
+                 # The model may wrap the JSON in a markdown fence; strip it and retry.
+                 l_cleand = "\n".join(json_string.split("\n")[1:-1])
+                 l_ret_data = json.loads(l_cleand)
+             l_ret = l_ret_data["tokens"]
+         else:
+             sys.stderr.write(f"Error {response.status_code}: {response.text}")
+             l_ret = None
+
+         return l_ret
+
+     def _compute_request_cost(self, input_text, output_token):
+         input_tokens = self.tokenizer.encode(input_text)
+         num_input_tokens = len(input_tokens)
+         num_output_tokens = output_token
+         cost_input = (num_input_tokens / 1000) * self.input_cost
+         cost_output = (num_output_tokens / 1000) * self.output_cost
+         total_cost = cost_input + cost_output
+         return total_cost
+
+     def _compute_request_duration(self, output_token):
+         total = self.latency
+         total += output_token / self.token_perSec  # generation time: tokens / (tokens per second)
+         total += 0.5  # processing duration margin
+         return total
+
+
+ def request_timer(func):
+     def wrapper(*args, **kwargs):
+         g_c = "\033[94m"
+         n = "\033[0m"
+         bold = "\033[1m"
+
+         start = t.time()
+         rv = func(*args, **kwargs)
+         end = t.time()
+
+         duration = end - start
+         print(f"{g_c}{bold}Execution time of {func.__name__}: {duration:.2f}s{n}")
+         return rv
+
+     return wrapper
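The cost and duration arithmetic in `ModelAnalizer` can be sketched on its own (prices are per 1,000 tokens; the figures mirror the class defaults, and generation time is taken as output tokens divided by the tokens-per-second rate):

```python
# Standalone sketch of ModelAnalizer's cost/duration math.
input_cost_per_1k = 0.005    # USD per 1K input tokens
output_cost_per_1k = 0.015   # USD per 1K output tokens
tokens_per_sec = 63.32       # generation throughput
latency = 0.48               # seconds before the first token

def request_cost(num_input_tokens: int, num_output_tokens: int) -> float:
    return (num_input_tokens / 1000) * input_cost_per_1k \
         + (num_output_tokens / 1000) * output_cost_per_1k

def request_duration(num_output_tokens: int) -> float:
    # latency + generation time (tokens / tokens-per-second) + 0.5 s margin
    return latency + num_output_tokens / tokens_per_sec + 0.5

print(request_cost(1000, 500))   # 0.005 + 0.0075 = 0.0125
print(request_duration(6332))    # 0.48 + 100.0 + 0.5
```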
@@ -0,0 +1,122 @@
+ import sys
+ import requests
+ import json
+ import re
+
+
+ def is_valid_url(url):
+     regex = re.compile(
+         r"^(?:http|ftp)s?://"
+         r"(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|"
+         r"localhost|"
+         r"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}|"
+         r"\[?[A-F0-9]*:[A-F0-9:]+\]?)"
+         r"(?::\d+)?"
+         r"(?:/?|[/?]\S+)$",
+         re.IGNORECASE,
+     )
+     return re.match(regex, url) is not None
+
+
+ class Model:
+
+     _SYS_PROMPT = ""
+
+     def __init__(self, model: str = None, base_url: str = None, api_key: str = None):
+         self.model = model
+         self.base_url = base_url
+         self.api_key = api_key
+
+         if any(var is None for var in (model, base_url)):
+             sys.stderr.write("[CONFIG_ERROR] Empty values.")
+             return
+         elif not is_valid_url(self.base_url):
+             sys.stderr.write("[CONFIG_ERROR] Invalid URL.")
+             return
+
+     def __str__(self) -> str:
+         return (
+             f"[OpenHosta] <{self.__class__.__module__}.{self.__class__.__name__} object at {hex(id(self))}>\n"
+             f"Model: {self.model}\n"
+             f"Base_url: {self.base_url}\n"
+             "Infos:\n"
+         )
+
+     def _api_call(
+         self, sys_prompt: str, user_prompt: str, creativity: float, diversity: float
+     ):
+         if self.api_key is None or not self.api_key:
+             sys.stderr.write("[CALL_ERROR] Unknown API key.")
+             return None
+
+         l_body = {
+             "model": self.model,
+             "messages": [
+                 {
+                     "role": "system",
+                     "content": [{"type": "text", "text": str(sys_prompt)}],
+                 },
+                 {
+                     "role": "user",
+                     "content": [{"type": "text", "text": str(user_prompt)}],
+                 },
+             ],
+             "response_format": {"type": "json_object"},
+             "temperature": creativity if creativity is not None else 0.7,
+             "top_p": diversity if diversity is not None else 0.7,
+         }
+         headers = {
+             "Content-Type": "application/json",
+             "Authorization": f"Bearer {self.api_key}",
+         }
+
+         try:
+             response = requests.post(self.base_url, json=l_body, headers=headers)
+         except requests.RequestException as e:
+             sys.stderr.write(f"[CALL_ERROR] Request failed: {e}")
+             return None
+         if response.status_code != 200:
+             sys.stderr.write("[CALL_ERROR] API call: the request was unsuccessful.")
+         return response
+
+     def _request_handler(self, response):
+         data = response.json()
+         json_string = data["choices"][0]["message"]["content"]
+
+         try:
+             l_ret_data = json.loads(json_string)
+         except json.JSONDecodeError as e:
+             sys.stderr.write(f"JSONDecodeError: {e}")
+             # The model may wrap the JSON in a markdown fence; strip it and retry.
+             l_cleand = "\n".join(json_string.split("\n")[1:-1])
+             l_ret_data = json.loads(l_cleand)
+
+         l_ret = l_ret_data["return"]
+
+         return l_ret
+
+
+ _default_model = Model(
+     model="gpt-4o",
+     base_url="https://api.openai.com/v1/chat/completions",
+ )
+
+
+ def set_default_model(new: Model):
+     global _default_model
+
+     _default_model = new
+
+
+ def set_default_apiKey(api_key=None):
+     global _default_model
+
+     if api_key is not None and isinstance(api_key, str):
+         _default_model.api_key = api_key
+     else:
+         sys.stderr.write("[CONFIG_ERROR] Invalid API key.")
+
+
+ __all__ = ["Model", "_default_model", "set_default_model", "set_default_apiKey"]
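`_request_handler`'s two-step parse (raw JSON first, then stripping a surrounding markdown fence) can be sketched standalone; `parse_llm_json` is a hypothetical helper name:

```python
import json

def parse_llm_json(json_string: str):
    """Parse a model reply, tolerating a surrounding markdown fence."""
    # Try the raw string first; on failure assume the model wrapped the
    # JSON in a fence, drop the first and last lines, and retry.
    try:
        return json.loads(json_string)
    except json.JSONDecodeError:
        cleaned = "\n".join(json_string.split("\n")[1:-1])
        return json.loads(cleaned)

fenced = '```json\n{"return": "ok", "confidence": "high-unique"}\n```'
print(parse_llm_json(fenced)["return"])  # ok
```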
@@ -0,0 +1,75 @@
+ import sys
+
+ from .analytics import request_timer
+ from .prompt import PromptMananger
+ from .config import _default_model, Model, set_default_apiKey
+
+ _x = PromptMananger()
+
+ _emulator_pre_prompt = _x.get_prompt("emulate")
+
+
+ def _exec_emulate(
+     _function_doc=None,
+     _function_call=None,
+     _function_return=None,
+     model: Model = _default_model,
+     warn: bool = False,
+     l_creativity: float = None,
+     l_diversity: float = None,
+ ):
+     global _emulator_pre_prompt
+
+     try:
+         if not isinstance(_emulator_pre_prompt, str) or not _emulator_pre_prompt:
+             raise ValueError("Invalid prompt.")
+         if (l_creativity is not None and (l_creativity < 0 or l_creativity > 1)) or (
+             l_diversity is not None and (l_diversity < 0 or l_diversity > 1)
+         ):
+             raise ValueError("Emulate out-of-range values (0 <= creativity|diversity <= 1)")
+     except ValueError as v:
+         sys.stderr.write(f"[EMULATE_ERROR]: {v}")
+         return None
+
+     l_user_prompt = (
+         "Here's the function definition:\n"
+         + _function_doc
+         + "\nAnd this is the function call:\n"
+         + _function_call
+     )
+
+     if _function_return is not None:
+         l_user_prompt = (
+             l_user_prompt
+             + "\nTo fill the \"return\" value in the output JSON, build your response as defined in the following JSON schema. Do not change the key \"return\".\n"
+             + str(_function_return)
+         )
+
+     response = model._api_call(
+         sys_prompt=_emulator_pre_prompt,
+         user_prompt=l_user_prompt,
+         creativity=l_creativity,
+         diversity=l_diversity,
+     )
+
+     if response is None:
+         return None
+     if response.status_code == 200:
+         l_ret = model._request_handler(response)
+     else:
+         sys.stderr.write(f"Error {response.status_code}: {response.text}")
+         l_ret = None
+     return l_ret
+
+
+ def thought(key):
+     def inner_func(*args, **kwargs):
+         try:
+             result = _exec_emulate(_function_doc=key, _function_call=str(args[0]))
+         except Exception as e:
+             sys.stderr.write(f"[LMDA_ERROR] {e}")
+             result = None
+         return result
+
+     return inner_func
@@ -0,0 +1,123 @@
+ import requests
+ import json
+ import sys
+
+ from .prompt import PromptMananger
+ from .config import _default_model
+
+ _x = PromptMananger()
+ _enhancer_pre_prompt = _x.get_prompt("enhance")
+
+
+ def _ai_call_enh(sys_prompt: str, func_prot: str, func_doc: str):
+     global _default_model
+
+     api_key = _default_model.api_key
+     url = _default_model.base_url
+
+     headers = {
+         "Content-Type": "application/json",
+         "Authorization": f"Bearer {api_key}",
+     }
+
+     data = {
+         "model": _default_model.model,
+         "messages": [
+             {"role": "system", "content": [{"type": "text", "text": sys_prompt}]},
+             {
+                 "role": "user",
+                 "content": [
+                     {
+                         "type": "text",
+                         "text": "\nHere's my Python function's prototype:\n---\n"
+                         + func_prot
+                         + "\n---\n",
+                     }
+                 ],
+             },
+             {
+                 "role": "user",
+                 "content": [
+                     {
+                         "type": "text",
+                         "text": "\nHere's my Python function's prompt:\n---\n"
+                         + func_doc
+                         + "\n---\n",
+                     }
+                 ],
+             },
+         ],
+         "temperature": 0.7,
+         "top_p": 0.7,
+     }
+
+     response = requests.post(url, headers=headers, data=json.dumps(data))
+
+     if response.status_code == 200:
+         response_data = response.json()
+         return response_data["choices"][0]["message"]["content"]
+     else:
+         sys.stderr.write(
+             "[CALL_ERROR] The request was unsuccessful or one of the parameters is invalid"
+         )
+         sys.stderr.write(f"Status: {response.status_code}")
+         return None
+
+
+ def _parse_data(response: str, last_enh: dict) -> dict:
+     current_section = None
+     current_text = []
+
+     for line in response.splitlines():
+         if line.startswith("->"):
+             if current_section:
+                 last_enh[current_section] = "\n".join(current_text).strip()
+             current_section = line[3:].strip(":")
+             current_text = []
+         else:
+             current_text.append(line)
+     if current_section:
+         last_enh[current_section] = "\n".join(current_text).strip()
+     return last_enh
+
+
+ def _build_attributes(func: object, last_enh) -> int:
+     try:
+         if not isinstance(func.__name__, str) or not func.__name__:
+             raise ValueError("ValueError -> function name")
+         for key in ("enhanced", "review", "advanced", "mermaid"):
+             if not isinstance(last_enh[key], str) or not last_enh[key]:
+                 raise ValueError(f"ValueError -> {key} output")
+     except ValueError as e:
+         sys.stderr.write(f"[BUILD_ERROR] {e}")
+         return -1
+     # Only attach the attributes once every section has been validated.
+     func.enhanced_prompt = last_enh["enhanced"]
+     func.review = last_enh["review"]
+     func.advanced = last_enh["advanced"]
+     func.diagramm = last_enh["mermaid"]
+     return 0
+
+
+ def enhance(func):
+     global _enhancer_pre_prompt
+
+     last_enh: dict = {
+         "enhanced": None,
+         "review": None,
+         "advanced": None,
+         "mermaid": None,
+     }
+
+     func_doc = func.__doc__
+
+     last_return = _ai_call_enh(_enhancer_pre_prompt, func._prot, func_doc)
+
+     last_enh = _parse_data(last_return, last_enh)
+
+     _build_attributes(func, last_enh)
@@ -0,0 +1,8 @@
+ class OhCustomError(Exception):
+     """Base class for other custom exceptions"""
+     pass
+
+
+ class RequestError(OhCustomError):
+     """Raised when a request to an LLM went wrong"""
+     pass
+
@@ -0,0 +1,114 @@
+ import inspect
+ import sys
+ from typing import Callable, Any, Dict, Tuple
+ from pydantic import BaseModel, create_model
+
+ from .enhancer import enhance
+
+
+ class HostaInjector:
+
+     def __init__(self, exec):
+         if not callable(exec):
+             raise TypeError("Executive function must be a function.")
+         self.exec = exec
+
+     def __call__(self, *args, **kwargs):
+         infos = {"def": "", "call": "", "return_type": ""}
+
+         func_obj, caller = self._extend_scope()
+         infos["def"], func_prot = self._get_functionDef(func_obj)
+         infos["call"] = self._get_functionCall(func_obj, caller)
+         infos["return_type"] = self._get_functionReturnType(func_obj)
+
+         self._attach_attributs(func_obj, func_prot)
+         return self.exec(
+             infos["def"], infos["call"], infos["return_type"], *args, **kwargs
+         )
+
+     def _extend_scope(self) -> Tuple[Callable, Any]:
+         try:
+             x = inspect.currentframe()
+             caller = x.f_back.f_back
+             name = caller.f_code.co_name
+             func = caller.f_globals.get(name)
+
+             if func is None:
+                 raise Exception("Scope can't be extended.")
+             if not callable(func):
+                 raise Exception("Larger scope isn't a callable.")
+         except Exception as e:
+             sys.stderr.write(f"[FRAME_ERROR]: {e}")
+             return None, None
+         return func, caller
+
+     def _get_functionDef(self, func: Callable) -> Tuple[str, str]:
+         sig = inspect.signature(func)
+
+         func_name = func.__name__
+         func_params = ", ".join(
+             [
+                 (
+                     f"{param_name}: {param.annotation.__name__}"
+                     if param.annotation != inspect.Parameter.empty
+                     else param_name
+                 )
+                 for param_name, param in sig.parameters.items()
+             ]
+         )
+         func_return = (
+             f" -> {sig.return_annotation.__name__}"
+             if sig.return_annotation != inspect.Signature.empty
+             else ""
+         )
+         definition = (
+             f"def {func_name}({func_params}){func_return}:\n"
+             f"    '''\n    {func.__doc__}\n    '''"
+         )
+         prototype = f"def {func_name}({func_params}){func_return}:"
+         return definition, prototype
+
+     def _get_functionCall(self, func: Callable, caller) -> str:
+         _, _, _, values = inspect.getargvalues(caller)
+
+         sig = inspect.signature(func)
+         bound_args = sig.bind_partial(**values)
+         bound_args.apply_defaults()
+
+         args_str = ", ".join(
+             f"{name}={value!r}" if name in bound_args.kwargs else f"{value!r}"
+             for name, value in bound_args.arguments.items()
+         )
+
+         call = f"{func.__name__}({args_str})"
+         return call
+
+     def _inspect_returnType(self, func: Callable):
+         sig = inspect.signature(func)
+
+         if sig.return_annotation != inspect.Signature.empty:
+             return sig.return_annotation
+         else:
+             return None
+
+     def _get_functionReturnType(self, func: Callable) -> Dict[str, Any]:
+         return_type = self._inspect_returnType(func)
+         return_json = None
+
+         if return_type is not None:
+             if isinstance(return_type, type) and issubclass(return_type, BaseModel):
+                 return_json = return_type.model_json_schema()
+             else:
+                 new_model = create_model(
+                     "Hosta_return_specified", return_hosta_type=(return_type, ...)
+                 )
+                 return_json = new_model.model_json_schema()
+         else:
+             no_return_specified = create_model(
+                 "Hosta_return_no_specified", return_hosta_type=(Any, ...)
+             )
+             return_json = no_return_specified.model_json_schema()
+
+         return return_json
+
+     def _attach_attributs(self, func: Callable, prototype: str):
+         func.__suggest__ = enhance
+         func._prot = prototype
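The signature introspection used here can be sketched standalone with `inspect`; `greet` is a made-up example function:

```python
import inspect

def greet(name: str, times: int = 1) -> str:
    """Say hello a number of times."""

# Rebuild a "def ..." prototype string from the signature, along the
# lines of what the injector derives for the LLM prompt.
sig = inspect.signature(greet)
params = ", ".join(
    f"{n}: {p.annotation.__name__}"
    if p.annotation is not inspect.Parameter.empty
    else n
    for n, p in sig.parameters.items()
)
ret = (
    f" -> {sig.return_annotation.__name__}"
    if sig.return_annotation is not inspect.Signature.empty
    else ""
)
prototype = f"def {greet.__name__}({params}){ret}:"
print(prototype)  # def greet(name: str, times: int) -> str:
```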
@@ -0,0 +1,9 @@
1
+ from .emulate import _exec_emulate, thought
2
+ from . import config
3
+ from .config import Model
4
+ from .exec import HostaInjector
5
+
6
+
7
+ emulate = HostaInjector(_exec_emulate)
8
+
9
+ __all__ = ["emulate", "thought", "config", "Model"]
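`HostaInjector` wraps `_exec_emulate` so that the public `emulate` is a callable object rather than a plain function. A minimal, hypothetical sketch of that wrapper pattern (the real injector also inspects the calling frame, which is omitted here; `Injector`, `fake_exec`, and `greet` are made-up names):

```python
class Injector:
    """Sketch of the injector pattern: hold an executor and forward the
    target function plus its arguments to it on every call."""

    def __init__(self, exec_fn):
        self.exec_fn = exec_fn

    def __call__(self, func, *args, **kwargs):
        # The real HostaInjector resolves `func` from the calling frame;
        # this sketch takes it explicitly for clarity.
        return self.exec_fn(func, *args, **kwargs)

def fake_exec(func, *args, **kwargs):
    # Stand-in for _exec_emulate: report what would be emulated.
    return f"emulating {func.__name__}{args}"

emulate_sketch = Injector(fake_exec)

def greet(name: str) -> str:
    """Say hello."""

print(emulate_sketch(greet, "world"))  # emulating greet('world',)
```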
@@ -0,0 +1,22 @@
1
+ {
2
+ "prompts": [
3
+ {
4
+ "key": "emulate",
5
+ "text": "You will act as an emulator of impossible-to-code functions. I will provide you with the description of the function using Python's way of declaring functions, but I won't provide the function body as I don't know how to code it. It might even be impossible to code. Therefore, you should not try to write the body. Instead, directly imagine the function output.\n\nIn the conversation, I will directly write the function call as if it was called in Python. You should directly answer with whatever you believe would be a good return for the function.\n\nWhen you produce an answer, you should estimate the confidence level:\n\n \"low\": You did your best, but with the provided description and your knowledge, you are not confident about the answer.\n \"medium-instance\": You did your best, and with the provided description and your knowledge, you are pretty sure this answer is valid but not the only valid answer.\n \"medium-unique\": You did your best, and with the provided description and your knowledge, you are pretty sure this answer is the unique and valid answer.\n \"high-instance\": You did your best, and you are sure that your provided answer is a valid answer. It is a well-known function or you can easily implement Python code that yields elements from the list of valid answers. This answer is randomly chosen from the list of valid answers.\n \"high-unique\": You did your best, and you are sure that your provided answer is the unique valid answer. It is a well-known function or you can easily implement Python code that solves the question and calculates this answer given this input.\n\nIf the output is documented as a Python structure, you should translate it to JSON.\nYou should encode the return in valid JSON format, without comments, using the following format:\n{\"return\":..., \"confidence\":...}\n\nThe output must be of the same type as that specified in the function call.\n\nAny assumptions made should be reasonable based on the provided function description and should take into account the error handling of the function.\n\nConsistency in the confidence levels is important to ensure accurate responses.\n\nExample function call:\n\ndef example_function(a: int, b: dict) -> int:\n \"\"\"\n This is an example function.\n It adds two numbers.\n \"\"\"\n pass\n\nExample imagined function output:\n\nresult = example_function(3, {\"value\": 7})\n\nExpected JSON output:\n\n{\"return\": 10, \"confidence\": \"medium-unique\"}\n\nThis is the function documentation:",
6
+ "category": "executive",
7
+ "version": "v1.0"
8
+ },
9
+ {
10
+ "key": "enhance",
11
+ "text": "I want you to become my Expert Prompt Creator for developers. Your goal is to help me craft the best possible prompt for my programming needs. The prompt you provide should be written from the perspective of me making the request to GPT-4o. Consider in your prompt creation that this prompt will be entered into an interface for GPT-4o. Apart from diagrams, you must write in plain text, without following any particular syntax. The process is as follows: You will generate the following sections:\n -> enhanced:\n{provide the best possible prompt according to my request. The prompt is used to describe a function to be performed in Python as precisely as possible. You can add error handling, as the function needs it to work properly. But don't code the function in the prompt. The prompt should not ask to create the function, but describe how it works.}\n-> review:\n{provide a concise paragraph on how to improve the prompt. Be very critical in your response. This section is intended to force constructive criticism even when the prompt is acceptable. Any assumptions and/or issues should be included. Don't forget that you are speaking to a developer}\n-> advanced:\n{rewrite the prompt with the suggested improvements you made in the review section. The aim is to make a proposal, an example. Make logical assumptions to solve the problem based on the context and all the information you have. You have to respond to the problems you formulated in the previous section. But don't code the function in the prompt.}\n-> mermaid:\n{Make a mermaid diagram explaining the function described by the prompt. You need to break the reasoning into several steps for ease of understanding and clarity. You must make it logical and easy to look at. You have to write it in mermaid syntax. You must not use markdown syntax}\n",
12
+ "category": "analytics",
13
+ "version": "v1.0"
14
+ },
15
+ {
16
+ "key": "estimate",
17
+ "text": "You're a prompt engineer tasked with estimating the number of output tokens an AI would return when executing a given function.\nThe functions are written in Python, so function returns must use Python typing.\n\nEach time, I'll give you the following elements:\n- The definition of the function.\n- Its call with arguments.\n- The function's docstring.\n\nYou need to take all these elements into account when formulating your answer.\nTo estimate the output tokens, you need to use a tokenization algorithm: take the one in GPT-3.\n\nTo make your estimate, you need to go through this chain of thought:\n\n1. Understand the Function.\n\t- With the given definition, the function prototype and its docstring.\n2. Analyze the Function Call.\n\t- With the function name and arguments provided.\n3. Guess the expected result without calculating it.\n\t- You MUST NOT calculate the result of the function.\n\t- Simply make a prediction with all the elements you have at your disposal.\n\t- Pay attention to the type of output. If it's not specified in the function prototype, then guess it based on the description and type of the input arguments.\n4. Estimate the number of tokens.\n\t- Use the information you have available for this step.\n\t- If the estimate is complex or impossible, make a realistic prediction using the context elements.\n5. Formulate your answer.\n\t- Synthesize your answer into a single number.\n\t- Follow the answer format I'll give you below.\n\nYou should encode your response in valid JSON format, without comments, using the following format:\n{\"tokens\": ...}\nYour answer in the \"tokens\" category must be a number only.\nNothing should appear other than this JSON structure.\n\nAny assumptions made should be reasonable based on the provided function description and should take into account the error handling of the function.\n\nI'll give you an example:\nFunction definition:\n```python\ndef reverse_string(a:str)->str:\n\t\"\"\"\n\tThis function reverses the string passed as a parameter.\n\t\"\"\"\n\treturn emulate()\n```\n\nFunction call:\n```python\nreverse_string(\"Hello World!\")\n```\n\nExpected output:\n```\n{\"tokens\": 13}\n```\n\nHere's all the function documentation for you to estimate:",
18
+ "category": "analytics",
19
+ "version": "v1.0"
20
+ }
21
+ ]
22
+ }
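The `emulate` prompt above asks the model to reply with a `{"return": ..., "confidence": ...}` JSON envelope. A caller would decode it roughly like this (a sketch; `parse_emulated_reply` is not part of the package):

```python
import json

def parse_emulated_reply(raw: str):
    """Decode the {"return": ..., "confidence": ...} envelope that the
    'emulate' prompt instructs the model to produce."""
    data = json.loads(raw)
    return data["return"], data["confidence"]

# Example reply in the format shown in the prompt's own example.
value, confidence = parse_emulated_reply('{"return": 10, "confidence": "medium-unique"}')
print(value, confidence)
```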
@@ -0,0 +1,41 @@
1
+ import os
2
+ import json
3
+ import sys
4
+
5
+
6
+ class PromptMananger:
7
+ def __init__(self, json_path=None):
8
+ if json_path is None:
9
+ try:
10
+ self.path = os.path.join(os.path.dirname(__file__), "prompt.json")
11
+ except Exception as e:
12
+ self.path = ""
13
+ sys.stderr.write(f"[JSON_ERROR] Unable to locate prompt.json:\n{e}")
14
+ return
15
+ else:
16
+ self.path = json_path
17
+
18
+ try:
19
+ with open(self.path, "r", encoding="utf-8") as file:
20
+ self.json = json.load(file)
21
+ self.prompts = {item["key"]: item for item in self.json["prompts"]}
22
+ except FileNotFoundError:
23
+ sys.stderr.write(f"[JSON_ERROR] File not found: {self.path}\n")
24
+ self.prompts = {}
25
+ except json.JSONDecodeError as e:
26
+ sys.stderr.write(f"[JSON_ERROR] JSON decode error:\n{e}\n")
27
+ self.prompts = {}
28
+
29
+ def get_prompt(self, key):
30
+ prompt = self.prompts.get(key)
31
+ if prompt:
32
+ return prompt["text"]
33
+ sys.stderr.write(f"[JSON_ERROR] Prompt not found: {key}\n")
34
+ return None
35
+
36
+ def get_prompt_details(self, key):
37
+ prompt = self.prompts.get(key)
38
+ if prompt:
39
+ return prompt
40
+ sys.stderr.write(f"[JSON_ERROR] Prompt not found: {key}\n")
41
+ return None
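`PromptMananger` indexes `prompt.json` by each entry's `key`. The same loading strategy can be sketched self-contained, writing a throwaway file rather than shipping one (the `payload` content here is a truncated stand-in, not the real prompt text):

```python
import json
import os
import tempfile

# A tiny file in the same shape as src/OpenHosta/prompt.json.
payload = {
    "prompts": [
        {"key": "emulate", "text": "You will act as...",
         "category": "executive", "version": "v1.0"}
    ]
}

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "prompt.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump(payload, f)
    # Same indexing strategy as PromptMananger.__init__: key -> entry.
    with open(path, "r", encoding="utf-8") as f:
        prompts = {item["key"]: item for item in json.load(f)["prompts"]}

print(prompts["emulate"]["version"])  # v1.0
```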
@@ -0,0 +1,193 @@
1
+ Metadata-Version: 2.1
2
+ Name: OpenHosta
3
+ Version: 1.0.1
4
+ Summary: Open-source project for AI integration in the development environment
5
+ Author: Léandre Ramos, Merlin Devillard, William Jolivet, Emmanuel Batt
6
+ License: MIT License
7
+
8
+ Copyright (c) 2024 hand-e
9
+
10
+ Permission is hereby granted, free of charge, to any person obtaining a copy
11
+ of this software and associated documentation files (the "Software"), to deal
12
+ in the Software without restriction, including without limitation the rights
13
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
14
+ copies of the Software, and to permit persons to whom the Software is
15
+ furnished to do so, subject to the following conditions:
16
+
17
+ The above copyright notice and this permission notice shall be included in all
18
+ copies or substantial portions of the Software.
19
+
20
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
21
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
22
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
23
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
24
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
25
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
26
+ SOFTWARE.
27
+
28
+ Project-URL: Homepage, https://github.com/hand-e-fr/OpenHosta
29
+ Project-URL: Issues, https://github.com/hand-e-fr/OpenHosta/issues
30
+ Keywords: AI,GPT,Natural language,Automatic,Easy
31
+ Classifier: Development Status :: 5 - Production/Stable
32
+ Classifier: Programming Language :: Python :: 3.8
33
+ Classifier: License :: OSI Approved :: MIT License
34
+ Classifier: Operating System :: OS Independent
35
+ Classifier: Intended Audience :: Developers
36
+ Classifier: Natural Language :: French
37
+ Classifier: Topic :: Software Development :: Code Generators
38
+ Requires-Python: >=3.8
39
+ Description-Content-Type: text/markdown
40
+ License-File: LICENSE
41
+ Requires-Dist: requests>=2.32.3
42
+ Requires-Dist: pydantic>=2.8.2
43
+ Requires-Dist: tiktoken>=0.7.0
44
+
45
+ # OpenHosta
46
+ v1.0 - Open-Source Project
47
+
48
+ **- The future of development is human -**
49
+
50
+ Welcome to the OpenHosta documentation, a powerful tool that facilitates the integration of LLMs into the development environment. OpenHosta is used to emulate functions with AI, while respecting Python's native paradigm and syntax.
51
+
52
+ For this project, we have adopted a [Code of Conduct](CODE_OF_CONDUCT.md) to ensure a respectful and inclusive environment for all contributors. Please take a moment to read it.
53
+
54
+ ## Table of Contents
55
+
56
+ - [OpenHosta](#openhosta)
57
+ - [Table of Contents](#table-of-contents)
58
+ - [How to install OpenHosta ?](#how-to-install-openhosta-)
59
+ - [Prerequisites](#prerequisites)
60
+ - [Installation](#installation)
61
+ - [Via pip](#via-pip)
62
+ - [Via git (Developper version)](#via-git-developper-version)
63
+ - [Example](#example)
64
+ - [Further information](#further-information)
65
+ - [Contributing](#contributing)
66
+ - [License](#license)
67
+ - [Authors \& Contact](#authors--contact)
68
+
69
+ ---
70
+
71
+ ## How to install OpenHosta ?
72
+
73
+ ### Prerequisites
74
+
75
+ 1. **Python 3.8+**
76
+ - Download and install Python from [python.org](https://www.python.org/downloads/).
77
+
78
+ 2. **pip**
79
+ - pip is generally included with Python. Verify its installation with:
80
+ ```sh
81
+ pip --version
82
+ ```
83
+
84
+ 3. **Git**
85
+ - Download and install Git from [git-scm.com](https://git-scm.com/downloads).
86
+
87
+ 4. **Virtual Environment (optional)**
88
+ - Create and activate a virtual environment:
89
+ ```bash
90
+ python -m venv env
91
+ ```
92
+ - Activate the virtual environment:
93
+ ```bash
94
+ .\env\Scripts\activate # Windows
95
+ source env/bin/activate # macOS/Linux
96
+ ```
97
+
98
+ 5. **API Key**
99
+ - **API Key**: Log in to your OpenAI account from [openai.com](https://openai.com/), then create your API key. For further information, you can check this [tutorial](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key).
100
+
101
+ ### Installation
102
+
103
+ #### Via pip
104
+
105
+ 1. Run the following command to install OpenHosta directly:
106
+
107
+ ```sh
108
+ pip install openhosta
109
+ ```
110
+
111
+ 2. After the installation, you can verify that OpenHosta is installed correctly by running:
112
+
113
+ ```sh
114
+ pip show openhosta
115
+ ```
116
+
117
+ #### Via git (Developper version)
118
+
119
+ 1. Clone the **Git repository** to your local machine using the following command:
120
+
121
+ ```bash
122
+ git clone git@github.com:hand-e-fr/OpenHosta-dev.git
123
+ ```
124
+
125
+ 2. Navigate to the **directory** of the cloned project:
126
+
127
+ ```bash
128
+ cd OpenHosta-dev
129
+ ```
130
+
131
+ 3. Ensure you have installed the necessary **dependencies** before starting.
132
+
133
+ ```bash
134
+ pip install -r requirements.txt
135
+ ```
136
+
137
+ This way, you have all the documentation and source code needed to understand the project.
138
+
139
+ ### Example
140
+
141
+ ```python
142
+ from OpenHosta import *
143
+
144
+ config.set_default_apiKey("example-apikey")
145
+
146
+ def my_func(a:int, b:str)->dict:
147
+ """
148
+ This Function does something.
149
+ """
150
+ return emulate()
151
+
152
+ my_func(5, "Hello World!")
153
+
154
+ my_lambda = thought("Do something")
155
+ my_lambda(5)
156
+ ```
157
+ You can check OpenHosta's [documentation](doc/Docs.md) for more detailed information and examples.
158
+
159
+ ## Further information
160
+
161
+ ### Contributing
162
+
163
+ We warmly welcome contributions from the community. Whether you are an experienced developer or a beginner, your contributions are welcome.
164
+
165
+ If you wish to contribute to this project, please refer to our [Contribution Guide](CONTRIBUTING.md) and our [Code of Conduct](CODE_OF_CONDUCT.md).
166
+
167
+ Browse the existing [issues](https://github.com/hand-e-fr/OpenHosta/issues) to see if someone is already working on what you have in mind or to find contribution ideas.
168
+
169
+ ### License
170
+
171
+ This project is licensed under the MIT License. This means you are free to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, subject to the following conditions:
172
+
173
+ - The text of the license below must be included in all copies or substantial portions of the software.
174
+
175
+ See the [LICENSE](LICENSE) file for more details.
176
+
177
+ ### Authors & Contact
178
+
179
+ For further questions or assistance, please refer to our partner hand-e or contact us directly via GitHub.
180
+
181
+ **Authors**:
182
+ - Emmanuel Batt: Manager and Coordinator, Founder of Hand-e
183
+ - William Jolivet: DevOps, SysAdmin
184
+ - Léandre Ramos: MLOps, AI developer
185
+ - Merlin Devillard: UX designer, Product Owner
186
+
187
+ GitHub: https://github.com/hand-e-fr/OpenHosta
188
+
189
+ ---
190
+
191
+ Thank you for your interest in our project and your potential contributions!
192
+
193
+ **The OpenHosta Team**
@@ -0,0 +1,18 @@
1
+ LICENSE
2
+ README.md
3
+ pyproject.toml
4
+ src/OpenHosta/__init__.py
5
+ src/OpenHosta/analytics.py
6
+ src/OpenHosta/config.py
7
+ src/OpenHosta/emulate.py
8
+ src/OpenHosta/enhancer.py
9
+ src/OpenHosta/errors.py
10
+ src/OpenHosta/exec.py
11
+ src/OpenHosta/openhosta.py
12
+ src/OpenHosta/prompt.json
13
+ src/OpenHosta/prompt.py
14
+ src/OpenHosta.egg-info/PKG-INFO
15
+ src/OpenHosta.egg-info/SOURCES.txt
16
+ src/OpenHosta.egg-info/dependency_links.txt
17
+ src/OpenHosta.egg-info/requires.txt
18
+ src/OpenHosta.egg-info/top_level.txt
@@ -0,0 +1,3 @@
1
+ requests>=2.32.3
2
+ pydantic>=2.8.2
3
+ tiktoken>=0.7.0
@@ -0,0 +1 @@
1
+ OpenHosta