ai-prompter 0.2.3__tar.gz → 0.3.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: ai-prompter
- Version: 0.2.3
+ Version: 0.3.0
  Summary: A prompt management library using Jinja2 templates to build complex prompts easily.
  Author-email: LUIS NOVO <lfnovo@gmail.com>
  License: MIT
@@ -22,8 +22,10 @@ A prompt management library using Jinja2 templates to build complex prompts easi
  - Define prompts as Jinja templates.
  - Load default templates from `src/ai_prompter/prompts`.
  - Override templates via `PROMPTS_PATH` environment variable.
+ - Automatic project root detection for prompt templates.
  - Render prompts with arbitrary data or Pydantic models.
  - Export to LangChain `ChatPromptTemplate`.
+ - Automatic output parser integration for structured outputs.

  ## Installation

@@ -85,32 +87,70 @@ Set the `PROMPTS_PATH` environment variable to point to your custom prompts dire
  export PROMPTS_PATH=/path/to/your/prompts
  ```

- The `Prompter` class will check this path if no custom directory is provided in the constructor. If not set, it will also look in the current working directory and `~/ai-prompter/` as fallback options before using the default package prompts.
+ You can specify multiple directories separated by `:` (colon):

- ### Raw text template
+ ```bash
+ export PROMPTS_PATH=/path/to/templates1:/path/to/templates2
+ ```

- ```python
- from ai_prompter import Prompter
+ ### Template Search Order

- template = """Write an article about {{ topic }}."""
- prompter = Prompter(template_text=template)
- prompt = prompter.render({"topic": "AI"})
- print(prompt)  # Write an article about AI.
+ The `Prompter` class searches for templates in the following locations (in order of priority):
+
+ 1. **Custom directory** - If you provide the `prompt_dir` parameter when initializing `Prompter`
+ 2. **Environment variable paths** - Directories specified in `PROMPTS_PATH` (colon-separated)
+ 3. **Current directory prompts** - The `./prompts` subfolder of your current working directory
+ 4. **Project root prompts** - Automatically detects your Python project root (by looking for `pyproject.toml`, `setup.py`, `setup.cfg`, or `.git`) and checks for a `prompts` folder there
+ 5. **Home directory** - The `~/ai-prompter` folder
+ 6. **Package defaults** - Built-in templates at `src/ai_prompter/prompts`
+
+ This allows you to organize your project with prompts at the root level, regardless of your package structure:
+ ```
+ my-project/
+ ├── prompts/              # <- Templates here will be found automatically
+ │   └── my_template.jinja
+ ├── src/
+ │   └── my_package/
+ │       └── main.py
+ └── pyproject.toml
  ```

  ### Using File-based Templates

- You can store your templates in files and reference them by name (without the `.jinja` extension). The library looks for templates in the `prompts` directory by default, or you can set a custom directory with the `PROMPTS_PATH` environment variable. You can specify multiple directories separated by `:` (colon), and the library will search through them in order until a matching template is found.
+ You can store your templates in files and reference them by name (without the `.jinja` extension). The library will search through all configured paths (see Template Search Order above) until a matching template is found.

  ```python
  from ai_prompter import Prompter

+ # Will search for 'greet.jinja' in all configured paths
+ prompter = Prompter(prompt_template="greet")
+ result = prompter.render({"name": "World"})
+ print(result)  # Output depends on the content of greet.jinja
+ ```
+
+ You can also specify multiple search paths via an environment variable:
+
+ ```python
+ import os
+ from ai_prompter import Prompter
+
  # Set multiple search paths
  os.environ["PROMPTS_PATH"] = "/path/to/templates1:/path/to/templates2"

  prompter = Prompter(prompt_template="greet")
  result = prompter.render({"name": "World"})
- print(result)  # Output depends on the content of greet.jinja in the first found path
+ print(result)  # Uses greet.jinja from the first path where it's found
+ ```
+
+ ### Raw text template
+
+ ```python
+ from ai_prompter import Prompter
+
+ template = """Write an article about {{ topic }}."""
+ prompter = Prompter(template_text=template)
+ prompt = prompter.render({"topic": "AI"})
+ print(prompt)  # Write an article about AI.
  ```

  ### Using Raw Text Templates
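The first-match resolution described under Template Search Order above can be sketched in isolation. `resolve_template` is a hypothetical helper for illustration, not the library's API; it resolves a template name against an ordered, `PROMPTS_PATH`-style list of directories:

```python
import os
import tempfile

def resolve_template(name, search_dirs):
    # Hypothetical helper: return the first matching <name>.jinja
    # across the ordered search directories, or None if absent.
    for directory in search_dirs:
        candidate = os.path.join(directory, name + ".jinja")
        if os.path.exists(candidate):
            return candidate
    return None

# Demo: the same template name in two directories resolves to the first.
with tempfile.TemporaryDirectory() as first, tempfile.TemporaryDirectory() as second:
    for directory in (first, second):
        with open(os.path.join(directory, "greet.jinja"), "w") as f:
            f.write("Hello {{ name }}!")
    # Split a colon-joined string, mirroring PROMPTS_PATH handling
    search_dirs = "{}:{}".format(first, second).split(":")
    resolved = resolve_template("greet", search_dirs)
    winner = resolved == os.path.join(first, "greet.jinja")
print(winner)  # True
```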
@@ -149,6 +189,61 @@ lc_file_prompt = file_prompter.to_langchain()

  **Note**: LangChain integration requires the `langchain-core` package. Install it with `pip install .[langchain]`.

+ ### Using Output Parsers
+
+ The Prompter class supports LangChain output parsers to automatically inject formatting instructions into your prompts. When you provide a parser, it will call the parser's `get_format_instructions()` method and make the result available as `{{ format_instructions }}` in your template.
+
+ ```python
+ from ai_prompter import Prompter
+ from langchain.output_parsers import PydanticOutputParser
+ from pydantic import BaseModel, Field
+
+ # Define your output model
+ class Article(BaseModel):
+     title: str = Field(description="Article title")
+     summary: str = Field(description="Brief summary")
+     tags: list[str] = Field(description="Relevant tags")
+
+ # Create a parser
+ parser = PydanticOutputParser(pydantic_object=Article)
+
+ # Create a prompter with the parser
+ prompter = Prompter(
+     template_text="""Write an article about {{ topic }}.
+
+ {{ format_instructions }}""",
+     parser=parser
+ )
+
+ # Render the prompt - format instructions are automatically included
+ prompt = prompter.render({"topic": "AI Safety"})
+ print(prompt)
+ # Output will include the topic AND the parser's format instructions
+ ```
+
+ This works with file-based templates too:
+
+ ```jinja
+ # article_structured.jinja
+ Write an article about {{ topic }}.
+
+ Please format your response according to these instructions:
+ {{ format_instructions }}
+ ```
+
+ ```python
+ prompter = Prompter(
+     prompt_template="article_structured",
+     parser=parser
+ )
+ ```
+
+ The parser integration supports any LangChain output parser that implements `get_format_instructions()`, including:
+ - `PydanticOutputParser` - For structured Pydantic model outputs
+ - `OutputFixingParser` - For fixing malformed outputs
+ - `RetryOutputParser` - For retrying failed parsing attempts
+ - `StructuredOutputParser` - For dictionary-based structured outputs
+
  ### Including Other Templates

  You can include other template files within a template using Jinja2's `{% include %}` directive. This allows you to build modular templates.
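The parser wiring described in this hunk can be emulated without LangChain installed. `StubParser` and `render_with_parser` below are hypothetical stand-ins, and `string.Template` replaces Jinja2 to keep the sketch dependency-free; the only assumption from the source is that anything exposing `get_format_instructions()` feeds the `format_instructions` template variable:

```python
from string import Template

class StubParser:
    # Hypothetical stand-in for a LangChain output parser: the only
    # contract used here is the get_format_instructions() method.
    def get_format_instructions(self):
        return "Return a JSON object with keys: title, summary, tags."

def render_with_parser(template_text, data, parser):
    # Mirror the documented behavior: the parser's instructions are
    # exposed to the template as the format_instructions variable.
    variables = dict(data)
    variables["format_instructions"] = parser.get_format_instructions()
    return Template(template_text).substitute(variables)

prompt = render_with_parser(
    "Write an article about ${topic}.\n\n${format_instructions}",
    {"topic": "AI Safety"},
    StubParser(),
)
print(prompt)
```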
@@ -1,6 +1,6 @@
  [project]
  name = "ai-prompter"
- version = "0.2.3"
+ version = "0.3.0"
  description = "A prompt management library using Jinja2 templates to build complex prompts easily."
  readme = "README.md"
  homepage = "https://github.com/lfnovo/ai-prompter"
@@ -30,6 +30,7 @@ package-dir = {"ai_prompter" = "src/ai_prompter"}
  dev = [
      "ipykernel>=4.0.1",
      "ipywidgets>=4.0.0",
+     "langchain-core>=0.3.54",
      "pyperclip>=1.9.0",
      "pytest>=7.2.0",
      "pytest-asyncio>=0.21.0",
@@ -97,8 +97,27 @@ class Prompter:
          prompts_path = os.getenv("PROMPTS_PATH")
          if prompts_path is not None:
              prompt_dirs.extend(prompts_path.split(":"))
-         # Fallback to local folder and ~/ai-prompter
-         prompt_dirs.extend([os.getcwd(), os.path.expanduser("~/ai-prompter")])
+
+         # Add current working directory + /prompts
+         cwd_prompts = os.path.join(os.getcwd(), "prompts")
+         if os.path.exists(cwd_prompts):
+             prompt_dirs.append(cwd_prompts)
+
+         # Try to find project root and add its prompts folder
+         current_path = os.getcwd()
+         while current_path != os.path.dirname(current_path):  # Stop at root
+             # Check for common project indicators
+             if any(os.path.exists(os.path.join(current_path, indicator))
+                    for indicator in ['pyproject.toml', 'setup.py', 'setup.cfg', '.git']):
+                 project_prompts = os.path.join(current_path, "prompts")
+                 if os.path.exists(project_prompts) and project_prompts not in prompt_dirs:
+                     prompt_dirs.append(project_prompts)
+                 break
+             current_path = os.path.dirname(current_path)
+
+         # Fallback to ~/ai-prompter
+         prompt_dirs.append(os.path.expanduser("~/ai-prompter"))
+
          # Default package prompts folder
          if os.path.exists(prompt_path_default):
              prompt_dirs.append(prompt_path_default)
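The project-root walk added in this hunk can be exercised on its own. `find_project_root` is a hypothetical extraction of the loop above, not a function the package exports:

```python
import os
import tempfile

def find_project_root(start_dir):
    # Walk upward from start_dir looking for the same project markers
    # the 0.3.0 code checks; return None if none is found before the
    # filesystem root.
    current = os.path.abspath(start_dir)
    while current != os.path.dirname(current):  # stop at filesystem root
        for indicator in ("pyproject.toml", "setup.py", "setup.cfg", ".git"):
            if os.path.exists(os.path.join(current, indicator)):
                return current
        current = os.path.dirname(current)
    return None

# Demo: a marker two levels above the starting directory is found.
with tempfile.TemporaryDirectory() as root:
    open(os.path.join(root, "pyproject.toml"), "w").close()
    nested = os.path.join(root, "src", "pkg")
    os.makedirs(nested)
    found = find_project_root(nested)
print(found == root)  # True
```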
@@ -1,5 +1,4 @@
  version = 1
- revision = 1
  requires-python = ">=3.10"
  resolution-markers = [
      "python_full_version < '3.12.4'",
@@ -8,7 +7,7 @@ resolution-markers = [

  [[package]]
  name = "ai-prompter"
- version = "0.2.3"
+ version = "0.3.0"
  source = { editable = "." }
  dependencies = [
      { name = "jinja2" },
@@ -25,6 +24,7 @@ langchain = [
  dev = [
      { name = "ipykernel" },
      { name = "ipywidgets" },
+     { name = "langchain-core" },
      { name = "pyperclip" },
      { name = "pytest" },
      { name = "pytest-asyncio" },
@@ -38,12 +38,12 @@ requires-dist = [
      { name = "pip", specifier = ">=25.0.1" },
      { name = "pydantic", specifier = ">=2.0" },
  ]
- provides-extras = ["langchain"]

  [package.metadata.requires-dev]
  dev = [
      { name = "ipykernel", specifier = ">=4.0.1" },
      { name = "ipywidgets", specifier = ">=4.0.0" },
+     { name = "langchain-core", specifier = ">=0.3.54" },
      { name = "pyperclip", specifier = ">=1.9.0" },
      { name = "pytest", specifier = ">=7.2.0" },
      { name = "pytest-asyncio", specifier = ">=0.21.0" },
@@ -352,7 +352,7 @@ name = "ipykernel"
  version = "6.29.5"
  source = { registry = "https://pypi.org/simple" }
  dependencies = [
-     { name = "appnope", marker = "sys_platform == 'darwin'" },
+     { name = "appnope", marker = "platform_system == 'Darwin'" },
      { name = "comm" },
      { name = "debugpy" },
      { name = "ipython" },