codegnipy 0.0.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (32)
  1. codegnipy-0.0.1/LICENSE +21 -0
  2. codegnipy-0.0.1/PKG-INFO +417 -0
  3. codegnipy-0.0.1/README.md +386 -0
  4. codegnipy-0.0.1/codegnipy/__init__.py +190 -0
  5. codegnipy-0.0.1/codegnipy/cli.py +153 -0
  6. codegnipy-0.0.1/codegnipy/decorator.py +151 -0
  7. codegnipy-0.0.1/codegnipy/determinism.py +631 -0
  8. codegnipy-0.0.1/codegnipy/memory.py +276 -0
  9. codegnipy-0.0.1/codegnipy/providers.py +1160 -0
  10. codegnipy-0.0.1/codegnipy/reflection.py +244 -0
  11. codegnipy-0.0.1/codegnipy/runtime.py +197 -0
  12. codegnipy-0.0.1/codegnipy/scheduler.py +498 -0
  13. codegnipy-0.0.1/codegnipy/streaming.py +387 -0
  14. codegnipy-0.0.1/codegnipy/tools.py +481 -0
  15. codegnipy-0.0.1/codegnipy/transformer.py +155 -0
  16. codegnipy-0.0.1/codegnipy/validation.py +961 -0
  17. codegnipy-0.0.1/codegnipy.egg-info/PKG-INFO +417 -0
  18. codegnipy-0.0.1/codegnipy.egg-info/SOURCES.txt +30 -0
  19. codegnipy-0.0.1/codegnipy.egg-info/dependency_links.txt +1 -0
  20. codegnipy-0.0.1/codegnipy.egg-info/entry_points.txt +2 -0
  21. codegnipy-0.0.1/codegnipy.egg-info/requires.txt +9 -0
  22. codegnipy-0.0.1/codegnipy.egg-info/top_level.txt +1 -0
  23. codegnipy-0.0.1/pyproject.toml +63 -0
  24. codegnipy-0.0.1/setup.cfg +4 -0
  25. codegnipy-0.0.1/tests/test_determinism.py +321 -0
  26. codegnipy-0.0.1/tests/test_memory.py +235 -0
  27. codegnipy-0.0.1/tests/test_providers.py +312 -0
  28. codegnipy-0.0.1/tests/test_scheduler.py +274 -0
  29. codegnipy-0.0.1/tests/test_streaming.py +87 -0
  30. codegnipy-0.0.1/tests/test_tools.py +282 -0
  31. codegnipy-0.0.1/tests/test_transformer.py +118 -0
  32. codegnipy-0.0.1/tests/test_validation.py +354 -0
**codegnipy-0.0.1/LICENSE** (new file, +21 lines)

MIT License

Copyright (c) 2026 Chidc

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
**codegnipy-0.0.1/PKG-INFO** (new file, +417 lines)

Metadata-Version: 2.4
Name: codegnipy
Version: 0.0.1
Summary: AI-native Python language extension - making non-deterministic AI a first-class citizen in Python
Author: Codegnipy Team
License: MIT
Project-URL: Homepage, https://github.com/ChidcGithub/CodegniPy
Project-URL: Documentation, https://github.com/ChidcGithub/CodegniPy#readme
Project-URL: Repository, https://github.com/ChidcGithub/CodegniPy
Keywords: ai,llm,openai,language-extension,cognitive-computing
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: openai>=1.0.0
Requires-Dist: pydantic>=2.0.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.18.0; extra == "anthropic"
Dynamic: license-file
# Codegnipy

[![PyPI version](https://img.shields.io/pypi/v/codegnipy.svg)](https://pypi.org/project/codegnipy/)
[![Python](https://img.shields.io/pypi/pyversions/codegnipy.svg)](https://pypi.org/project/codegnipy/)
[![License](https://img.shields.io/github/license/ChidcGithub/CodegniPy)](LICENSE)
[![Build Status](https://github.com/ChidcGithub/CodegniPy/actions/workflows/ci.yml/badge.svg)](https://github.com/ChidcGithub/CodegniPy/actions/workflows/ci.yml)
[![Coverage](https://img.shields.io/codecov/c/github/ChidcGithub/CodegniPy.svg)](https://codecov.io/gh/ChidcGithub/CodegniPy)

**AI-Native Python Language Extension**

Codegnipy integrates Large Language Models (LLMs) into Python, making non-deterministic AI capabilities first-class citizens in your code. Write logic close to natural language while keeping production-grade performance and debuggability.

## Features

- **Syntactic Extension**: `~` operator for natural language prompts directly in code
- **Cognitive Decorator**: `@cognitive` decorator to let LLMs implement functions
- **Memory Management**: Session-level memory with pluggable storage backends
- **Reflection Loop**: Built-in self-correction and quality assurance
- **Async Scheduler**: High-performance concurrent LLM calls with priority queuing
- **Deterministic Guarantees**: Type constraints, simulation mode, and hallucination detection

## Installation

```bash
pip install codegnipy
```

For development (quoted so the extra also works in shells like zsh):

```bash
pip install "codegnipy[dev]"
```

## Quick Start

### The `~` Operator

The `~` operator on string literals is rewritten into a `cognitive_call()` during AST preprocessing (see Architecture below), so scripts that use it are executed via `codegnipy run`:

```python
from codegnipy import CognitiveContext, cognitive_call

with CognitiveContext(api_key="your-api-key"):
    # Natural language prompts directly in code
    result = ~"Translate to English: Hello World"
    print(result)
```
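The `~` rewrite can be illustrated with a self-contained `ast.NodeTransformer`. This is a sketch of the technique only; codegnipy's actual transformer lives in `transformer.py` and may differ, and `cognitive_call` here is just the target name the README uses:

```python
import ast

class TildeTransformer(ast.NodeTransformer):
    """Rewrite ~"prompt" (unary invert on a string literal) into cognitive_call("prompt")."""

    def visit_UnaryOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Invert) and isinstance(node.operand, ast.Constant) \
                and isinstance(node.operand.value, str):
            return ast.Call(
                func=ast.Name(id="cognitive_call", ctx=ast.Load()),
                args=[node.operand],
                keywords=[],
            )
        return node

source = 'result = ~"Translate to English: Hello World"'
tree = ast.fix_missing_locations(TildeTransformer().visit(ast.parse(source)))
print(ast.unparse(tree))  # result = cognitive_call('Translate to English: Hello World')
```

Because the rewrite happens on the AST, plain `python script.py` would raise a `TypeError` on `~"..."`; a preprocessing step like this is what makes the syntax legal.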
### The `@cognitive` Decorator

```python
from codegnipy import cognitive, CognitiveContext

@cognitive
def summarize(text: str) -> str:
    """Summarize the key points of this text in no more than two sentences."""

with CognitiveContext(api_key="your-api-key"):
    summary = summarize("Python is a high-level programming language...")
    print(summary)
```
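The shape of such a decorator can be sketched by treating the docstring as the instruction and the bound call arguments as the input. The `_backend` stub below stands in for a real LLM provider call and is not codegnipy's implementation:

```python
import functools
import inspect

def cognitive(fn, _backend=lambda prompt: f"[stub LLM reply to]\n{prompt}"):
    """Build a prompt from the docstring plus the bound arguments, send it to a backend."""
    sig = inspect.signature(fn)

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        prompt = f"{fn.__doc__}\nInputs: {dict(bound.arguments)}"
        return _backend(prompt)  # a real version would call the LLM provider here

    return wrapper

@cognitive
def summarize(text: str) -> str:
    """Summarize the key points of this text in no more than two sentences."""

reply = summarize("Python is a high-level programming language...")
print(reply)
```

The function body stays empty on purpose: the docstring carries the specification, and the decorator supplies the execution.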
### Async Batch Processing

```python
import asyncio
from codegnipy import batch_call, CognitiveContext

async def main():
    prompts = [
        "Translate: Hello",
        "Translate: World",
        "Translate: Python"
    ]
    results = await batch_call(prompts, max_concurrent=3)
    print(results)

asyncio.run(main())
```
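Bounded-concurrency batching of this kind is typically a semaphore around `asyncio.gather`. A minimal sketch with a stand-in `fake_llm` backend (the real `batch_call` talks to a provider):

```python
import asyncio

async def batch_call(prompts, call, max_concurrent=5):
    """Run call(prompt) for every prompt, at most max_concurrent at a time."""
    sem = asyncio.Semaphore(max_concurrent)

    async def one(prompt):
        async with sem:           # blocks while max_concurrent calls are in flight
            return await call(prompt)

    # gather preserves the input order regardless of completion order
    return await asyncio.gather(*(one(p) for p in prompts))

async def fake_llm(prompt):       # stand-in for a real provider call
    await asyncio.sleep(0)
    return prompt.upper()

results = asyncio.run(batch_call(["hello", "world"], fake_llm, max_concurrent=2))
print(results)  # ['HELLO', 'WORLD']
```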
### With Memory Persistence

```python
from codegnipy import CognitiveContext, FileStore, cognitive_call

with CognitiveContext(
    api_key="your-api-key",
    memory_store=FileStore("session_memory.json")
):
    cognitive_call("My name is Alice")
    response = cognitive_call("What is my name?")
    # The LLM will remember: "Alice"
```
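A file-backed memory store of this kind reduces to a message list serialized to JSON on every write. An illustrative sketch, not codegnipy's actual `FileStore`:

```python
import json
import os
import tempfile

class FileStore:
    """Illustrative persistent memory: a message list serialized to a JSON file."""

    def __init__(self, path):
        self.path = path
        self.messages = []
        if os.path.exists(path):          # resume an earlier session
            with open(path) as f:
                self.messages = json.load(f)

    def append(self, role, content):
        self.messages.append({"role": role, "content": content})
        with open(self.path, "w") as f:   # persist after every write
            json.dump(self.messages, f)

path = os.path.join(tempfile.mkdtemp(), "session_memory.json")
store = FileStore(path)
store.append("user", "My name is Alice")
store.append("assistant", "Nice to meet you, Alice")

reloaded = FileStore(path)                # a new session sees the old messages
print(len(reloaded.messages))  # 2
```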
### With Type Constraints

```python
from codegnipy import PrimitiveConstraint, deterministic_call

# Ensure the LLM output is a valid integer between 0 and 100
constraint = PrimitiveConstraint(
    int,
    min_value=0,
    max_value=100
)

result = deterministic_call(
    "Generate a random number between 1 and 100",
    constraint
)

if result.status == "valid":
    print(result.value)  # Guaranteed valid integer
```
### With Reflection

```python
from codegnipy import CognitiveContext, with_reflection

with CognitiveContext(api_key="your-api-key") as ctx:
    result = with_reflection(
        "Explain quantum entanglement",
        context=ctx,
        max_iterations=2
    )

    if result.status == "passed":
        print(result.corrected_response or result.original_response)
```
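A reflection loop is generate, critique, revise, repeated up to a budget. The sketch below shows the control flow with toy `generate`/`critique` functions standing in for LLM calls; it is not codegnipy's implementation:

```python
def with_reflection(prompt, generate, critique, max_iterations=2):
    """Illustrative reflect-and-correct loop: critique each draft, revise until it passes."""
    response = generate(prompt)
    for _ in range(max_iterations):
        passed, feedback = critique(prompt, response)
        if passed:
            return ("passed", response)
        # feed the critique back in and try again
        response = generate(f"{prompt}\nRevise the previous answer, fixing: {feedback}")
    return ("failed", response)

# Toy backends: the critique passes once the draft names the phenomenon
def generate(p):
    return "quantum entanglement links particle states" if "Revise" in p else "particles are linked"

def critique(p, r):
    return ("entanglement" in r, "name the phenomenon explicitly")

status, answer = with_reflection("Explain quantum entanglement", generate, critique)
print(status, answer)
```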
## Architecture

```
Python Source Code
        |
        v
AST Preprocessing
        |
        v
Transformed Code + cognitive_call()
        |
        v
Runtime Layer
        |
        v
Scheduler (async)
        |
        v
LLM APIs
        |
        v
Validation Layer
        |
        v
Deterministic Result
```
## API Reference

### Core Functions

| Function | Description |
|----------|-------------|
| `cognitive_call(prompt, context=None, model=None, temperature=None)` | Execute a cognitive call to the LLM |
| `deterministic_call(prompt, constraint, context=None)` | Call the LLM with type constraints |
| `batch_call(prompts, max_concurrent=5)` | Execute multiple prompts concurrently |

### Decorators

| Decorator | Description |
|-----------|-------------|
| `@cognitive` | Have the decorated function implemented by the LLM |
| `@cognitive(model="gpt-4")` | Same, with a specific model |

### Context Manager

```python
CognitiveContext(
    api_key=None,         # OpenAI API key (or use the OPENAI_API_KEY env var)
    model="gpt-4o-mini",  # Default model
    base_url=None,        # Custom API endpoint
    temperature=0.7,      # Sampling temperature
    max_tokens=1024,      # Maximum response tokens
    memory_store=None     # Memory storage backend
)
```

### Memory Backends

| Class | Description |
|-------|-------------|
| `InMemoryStore` | Volatile in-memory storage |
| `FileStore(path)` | Persistent file-based storage |

### Type Constraints

| Class | Description |
|-------|-------------|
| `PrimitiveConstraint(type, min_value=None, max_value=None, min_length=None, max_length=None, pattern=None)` | Validate primitive types |
| `EnumConstraint(values)` | Validate enum values |
| `SchemaConstraint(pydantic_model)` | Validate against a Pydantic schema |
| `ListConstraint(item_constraint, min_items=None, max_items=None)` | Validate list items |
### Scheduler

```python
from codegnipy import CognitiveScheduler, Priority, RetryPolicy

scheduler = CognitiveScheduler(
    max_concurrent=5,
    default_timeout=30.0,
    retry_policy=RetryPolicy(max_retries=3, base_delay=1.0)
)

# Submit with priority
task_id = await scheduler.submit(
    my_coroutine,
    priority=Priority.HIGH,
    timeout=60.0
)

# Get the result
result = await scheduler.get_result(task_id, timeout=10.0)
```
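The scheduling pattern here, a priority queue drained by a bounded pool of workers, can be sketched with `asyncio.PriorityQueue`. The class below is illustrative only (codegnipy's `CognitiveScheduler` also handles retries and timeouts):

```python
import asyncio
import itertools

class Scheduler:
    """Illustrative sketch: a priority queue drained by a bounded pool of workers."""

    def __init__(self, max_concurrent=1):
        self.queue = asyncio.PriorityQueue()
        self.results = {}
        self._ids = itertools.count()   # monotonically increasing FIFO tiebreaker
        self.max_concurrent = max_concurrent

    async def submit(self, coro_fn, priority=1):
        task_id = next(self._ids)
        await self.queue.put((priority, task_id, coro_fn))
        return task_id

    async def drain(self):
        async def worker():
            while not self.queue.empty():
                _, task_id, coro_fn = await self.queue.get()
                self.results[task_id] = await coro_fn()
        await asyncio.gather(*(worker() for _ in range(self.max_concurrent)))

order = []
async def job(name):
    order.append(name)
    return name

async def main():
    s = Scheduler(max_concurrent=1)
    await s.submit(lambda: job("background"), priority=5)
    await s.submit(lambda: job("urgent"), priority=0)
    await s.drain()

asyncio.run(main())
print(order)  # ['urgent', 'background'] — the lower priority value runs first
```

Note the `(priority, task_id, fn)` tuple: the unique `task_id` breaks priority ties in submission order and keeps the queue from ever comparing the (non-comparable) callables.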
### Hallucination Detection

```python
from codegnipy import HallucinationDetector

detector = HallucinationDetector()
check = detector.detect(llm_response)

print(check.confidence)  # 0.0 - 1.0
print(check.issues)      # List of detected issues
```
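One simple way such a detector can work is heuristic rules that each lower a confidence score. The rules and scoring below are invented for illustration and say nothing about codegnipy's actual detection logic:

```python
import re
from dataclasses import dataclass, field

@dataclass
class HallucinationCheck:
    confidence: float
    issues: list = field(default_factory=list)

def detect(response: str) -> HallucinationCheck:
    """Toy heuristic detector: each triggered rule lowers the confidence score."""
    issues = []
    if re.search(r"\b(as an ai|i cannot verify)\b", response, re.IGNORECASE):
        issues.append("model disclaimed its own answer")
    if re.search(r"\b\d{1,3}(\.\d+)?%", response):
        issues.append("suspiciously precise unverified statistic")
    return HallucinationCheck(confidence=max(0.0, 1.0 - 0.4 * len(issues)), issues=issues)

check = detect("Exactly 73.2% of users agree.")
print(check.confidence, check.issues)
```

Production detectors usually combine heuristics like these with external verification, as the Roadmap below suggests.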
## Configuration

### Environment Variables

| Variable | Description |
|----------|-------------|
| `OPENAI_API_KEY` | OpenAI API key |
| `CODEGNIPY_MODEL` | Default model to use |
| `CODEGNIPY_TEMPERATURE` | Default temperature |
| `CODEGNIPY_MAX_TOKENS` | Default max tokens |

### Programmatic Configuration

```python
from codegnipy import CognitiveContext

ctx = CognitiveContext(
    api_key="sk-...",
    model="gpt-4",
    temperature=0.5,
    max_tokens=2048
)
```
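When both sources are available, a natural precedence is explicit argument over environment variable over built-in default. A sketch of that resolution order (assumed behavior, not verified against codegnipy's source):

```python
import os

def resolve_config(api_key=None, model=None, temperature=None):
    """Illustrative precedence: explicit argument > environment variable > default."""
    return {
        "api_key": api_key or os.environ.get("OPENAI_API_KEY"),
        "model": model or os.environ.get("CODEGNIPY_MODEL", "gpt-4o-mini"),
        "temperature": temperature if temperature is not None
        else float(os.environ.get("CODEGNIPY_TEMPERATURE", "0.7")),
    }

os.environ["CODEGNIPY_MODEL"] = "gpt-4"
cfg = resolve_config(temperature=0.5)   # explicit temperature wins; model comes from env
print(cfg["model"], cfg["temperature"])  # gpt-4 0.5
```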
## CLI Usage

```bash
# Run a script with cognitive features
codegnipy run script.py

# Start an interactive REPL
codegnipy repl

# Show the version
codegnipy version

# With options
codegnipy run script.py --model gpt-4 --api-key sk-...
```

## Testing

### Unit Tests

```bash
# Run all tests
pytest tests/ -v

# With coverage
pytest tests/ --cov=codegnipy --cov-report=html
```
### Simulation Mode

For testing without actual LLM calls:

```python
from codegnipy import Simulator, SimulationMode

simulator = Simulator(mode=SimulationMode.MOCK)

# Mock responses
simulator.add_mock("Hello", "Hi there!")

# Or record real responses for replay
simulator = Simulator(mode=SimulationMode.RECORD)
# ... make real calls ...
simulator.save_recordings("recordings.json")

# Later, replay them
simulator = Simulator(mode=SimulationMode.REPLAY)
simulator.load_recordings("recordings.json")
```
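At its core, mock/record/replay is a prompt-keyed cache that can be serialized to disk. A self-contained sketch of that mechanism (the method names mirror the README; the implementation is illustrative, not codegnipy's):

```python
import json
import os
import tempfile

class Simulator:
    """Illustrative mock/record/replay cache keyed by prompt."""

    def __init__(self):
        self.recordings = {}

    def add_mock(self, prompt, response):
        self.recordings[prompt] = response

    def call(self, prompt, backend=None):
        if prompt in self.recordings:       # mock or replay: no real call made
            return self.recordings[prompt]
        response = backend(prompt)          # record mode: make the real call once
        self.recordings[prompt] = response
        return response

    def save_recordings(self, path):
        with open(path, "w") as f:
            json.dump(self.recordings, f)

    def load_recordings(self, path):
        with open(path) as f:
            self.recordings = json.load(f)

sim = Simulator()
sim.add_mock("Hello", "Hi there!")
sim.call("Bye", backend=lambda p: p[::-1])  # "recorded" via a stand-in backend
path = os.path.join(tempfile.mkdtemp(), "recordings.json")
sim.save_recordings(path)

replay = Simulator()
replay.load_recordings(path)
print(replay.call("Hello"), replay.call("Bye"))  # Hi there! eyB
```

Replayed runs are deterministic and free, which is what makes LLM-dependent code unit-testable.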
## Project Structure

```
Codegnipy/
    codegnipy/
        __init__.py     # Package exports
        runtime.py      # Core runtime (cognitive_call, CognitiveContext)
        decorator.py    # @cognitive decorator
        transformer.py  # AST transformer for ~ operator
        memory.py       # Memory storage backends
        reflection.py   # Reflection loop implementation
        scheduler.py    # Async scheduler with retry/timeout
        determinism.py  # Type constraints, simulator, hallucination detection
        cli.py          # Command-line interface
    tests/
        test_transformer.py
        test_memory.py
        test_scheduler.py
        test_determinism.py
    examples/
        demo.py
    pyproject.toml
    README.md
```
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

### Development Setup

```bash
# Clone the repository
git clone https://github.com/ChidcGithub/Codegnipy.git
cd Codegnipy

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install development dependencies
pip install -e ".[dev]"

# Run the tests
pytest tests/ -v

# Run linting
ruff check codegnipy/
mypy codegnipy/
```
## Roadmap

- [ ] Rust extension for high-performance scheduling
- [ ] Support for more LLM providers (Anthropic, local models)
- [ ] Enhanced hallucination detection with external verification
- [ ] Visual debugging tools
- [ ] Distributed execution support

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

Codegnipy is inspired by the vision of making AI a natural part of programming, bridging the gap between deterministic code and probabilistic intelligence.

## Links

- [Documentation](https://github.com/ChidcGithub/Codegnipy#readme)
- [Issue Tracker](https://github.com/ChidcGithub/Codegnipy/issues)
- [PyPI Package](https://pypi.org/project/codegnipy/)