thonny-codemate 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. thonny_codemate-0.1.0/CLAUDE.md +169 -0
  2. thonny_codemate-0.1.0/LICENSE +21 -0
  3. thonny_codemate-0.1.0/MANIFEST.in +13 -0
  4. thonny_codemate-0.1.0/PKG-INFO +307 -0
  5. thonny_codemate-0.1.0/README.ja.md +269 -0
  6. thonny_codemate-0.1.0/README.md +260 -0
  7. thonny_codemate-0.1.0/docs_for_ai/copilot_features_analysis.md +187 -0
  8. thonny_codemate-0.1.0/docs_for_ai/copy_insert_implementation.md +99 -0
  9. thonny_codemate-0.1.0/docs_for_ai/demelopment_memo_from_chatgpt.md +216 -0
  10. thonny_codemate-0.1.0/docs_for_ai/implementation_techniques.md +401 -0
  11. thonny_codemate-0.1.0/docs_for_ai/plugin_analysis.md +153 -0
  12. thonny_codemate-0.1.0/docs_for_ai/project_goal.md +18 -0
  13. thonny_codemate-0.1.0/docs_for_ai/todo.md +119 -0
  14. thonny_codemate-0.1.0/pyproject.toml +102 -0
  15. thonny_codemate-0.1.0/setup.cfg +4 -0
  16. thonny_codemate-0.1.0/thonny_codemate.egg-info/PKG-INFO +307 -0
  17. thonny_codemate-0.1.0/thonny_codemate.egg-info/SOURCES.txt +40 -0
  18. thonny_codemate-0.1.0/thonny_codemate.egg-info/dependency_links.txt +1 -0
  19. thonny_codemate-0.1.0/thonny_codemate.egg-info/requires.txt +27 -0
  20. thonny_codemate-0.1.0/thonny_codemate.egg-info/top_level.txt +1 -0
  21. thonny_codemate-0.1.0/thonnycontrib/__init__.py +1 -0
  22. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/__init__.py +397 -0
  23. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/api.py +154 -0
  24. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/context_manager.py +296 -0
  25. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/external_providers.py +714 -0
  26. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/i18n.py +506 -0
  27. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/llm_client.py +841 -0
  28. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/message_virtualization.py +136 -0
  29. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/model_manager.py +515 -0
  30. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/performance_monitor.py +141 -0
  31. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/prompts.py +102 -0
  32. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/__init__.py +1 -0
  33. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/chat_view.py +687 -0
  34. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/chat_view_html.py +1299 -0
  35. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/custom_prompt_dialog.py +175 -0
  36. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/markdown_renderer.py +484 -0
  37. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/model_download_dialog.py +355 -0
  38. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/ui/settings_dialog.py +1218 -0
  39. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/utils/__init__.py +25 -0
  40. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/utils/constants.py +138 -0
  41. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/utils/error_messages.py +92 -0
  42. thonny_codemate-0.1.0/thonnycontrib/thonny_codemate/utils/unified_error_handler.py +310 -0
@@ -0,0 +1,169 @@
+ # CLAUDE.md
+
+ This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+ ## Project Overview
+
+ This is a Thonny IDE plugin that integrates local LLM capabilities using llama-cpp-python (not an Ollama server) to provide GitHub Copilot-like features. The plugin will:
+ - Load GGUF models directly using llama-cpp-python when Thonny starts
+ - Provide code generation and explanation based on user instructions
+ - Support context-aware coding with multiple file understanding
+ - Include a portable USB deployment option
+
+ ## Documentation for AI
+
+ The files in `docs_for_ai/*.md` are documentation intended for AI assistants. They describe the project's purpose and goals so that an AI can understand this project more easily.
+
+ Before starting work, the AI should check `docs_for_ai/todo.md` to review the current tasks.
+ When a new task comes up, add it to that file.
+ When a task is completed, mark the corresponding entry in `docs_for_ai/todo.md` as done.
+
+ Whenever `README.md` is updated, update `README.ja.md` in Japanese as well.
+
+ ## Project Goals
+
+ 1. **Core Functionality**
+    - Direct GGUF model loading via llama-cpp-python (no Ollama server)
+    - Automatic model loading on Thonny startup
+    - Agentic coding with multi-file context understanding
+    - Code generation based on user instructions
+
+ 2. **User Features**
+    - Text selection in editor → context-aware Q&A with LLM
+    - "Code Explanation" context menu for selected text
+    - User skill level selection for appropriate responses
+    - Optional support for ChatGPT/Ollama/OpenRouter servers
+
+ 3. **Distribution**
+    - PyPI package: `pip install thonny-codemate`
+    - USB portable version with Thonny + plugin + models bundled
+
+ ## Project Structure
+
+ ```
+ thonny-codemate/
+ ├── docs_for_ai/
+ │   └── project_goal.md       # Detailed project requirements
+ ├── pyproject.toml            # Package configuration
+ ├── README.md
+ ├── thonnycontrib/
+ │   └── thonny_codemate/
+ │       ├── __init__.py       # load_plugin() implementation
+ │       ├── llm_client.py     # llama-cpp-python wrapper
+ │       ├── ui_widgets.py     # Thonny UI components
+ │       └── config.py         # Configuration management
+ ├── models/                   # GGUF model storage
+ ├── tests/                    # Unit tests
+ └── CLAUDE.md                 # This file
+ ```
+
+ ## Development Setup
+
+ ### Virtual Environment Setup
+ ```bash
+ python -m venv .venv
+ # Windows:
+ .venv\Scripts\activate
+ # macOS/Linux:
+ source .venv/bin/activate
+ pip install -U pip
+ ```
+
+ ### Install Dependencies
+ ```bash
+ # Install llama-cpp-python (CPU version)
+ pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
+
+ # For CUDA support:
+ pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
+
+ # Install Thonny for development
+ git clone https://github.com/thonny/thonny.git
+ pip install -e ./thonny
+
+ # Install plugin in editable mode
+ pip install -e .
+ ```
+
+ ### Download GGUF Models
+ ```bash
+ pip install -U "huggingface_hub[cli]"
+ huggingface-cli download TheBloke/Llama-3-8B-GGUF llama3-8b.Q4_K_M.gguf --local-dir ./models
+ ```
+
+ ## Common Commands
+
+ ```bash
+ # Run Thonny with plugin
+ python -m thonny
+
+ # Debug mode with debugpy
+ python -m debugpy --listen 5678 --wait-for-client -m thonny
+
+ # Run tests
+ pytest -v
+
+ # Build package
+ python -m build
+
+ # Quick restart workflow
+ # Ctrl+Q → Enter → ↑ → Enter
+ ```
+
+ ## Architecture
+
+ ### Core Components
+
+ 1. **LLM Client Module** (`llm_client.py`)
+    - Wraps llama-cpp-python for GGUF model loading
+    - Manages model lifecycle and memory
+    - Handles chat formatting and context management
+
+ 2. **UI Integration** (`ui_widgets.py`)
+    - Context menu for "Code Explanation"
+    - Assistant panel for interactive Q&A
+    - Progress indicators for model loading
+
+ 3. **Configuration** (`config.py`)
+    - Model path and selection
+    - User skill level settings
+    - Optional API endpoints for external services
+
+ ### Key Implementation Details
+
+ ```python
+ # Plugin entry point in __init__.py
+ def load_plugin():
+     from thonny import get_workbench
+     from .llm_client import LLMClient
+     from .ui_widgets import AssistantView
+
+     # Initialize LLM on startup
+     client = LLMClient()
+     client.load_model_async()
+
+     # Add UI components
+     get_workbench().add_view(AssistantView, "Assistant", "se")
+ ```
+
+ ## Development Notes
+
+ ### Debugging Tips
+ - Use the `logging` module - output goes to the System Shell (see the sketch below)
+ - Check `thonny.log` in the Thonny data folder for detailed logs
+ - Remote debugging from VS Code/PyCharm via debugpy is the most efficient workflow
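+
+ As an illustration of the logging tip above, a module-level logger is enough (a minimal sketch; the callback and message are placeholders, not the plugin's actual code):
+
+ ```python
+ import logging
+
+ # Module-level logger; Thonny shows log output in the System Shell / thonny.log
+ logger = logging.getLogger(__name__)
+
+ def on_model_loaded(path):  # hypothetical callback, for illustration only
+     logger.info("Loaded GGUF model from %s", path)
+ ```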
+
+ ### Performance Considerations
+ - Load models asynchronously to avoid blocking the UI (see the sketch below)
+ - Use threading for LLM inference
+ - Cache the model in memory between sessions if possible
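+
+ The asynchronous pattern above can be sketched with a plain background thread; `Llama` is the real llama-cpp-python entry point, but this class is illustrative, not the plugin's actual `llm_client.py`:
+
+ ```python
+ import threading
+ from llama_cpp import Llama
+
+ class AsyncModelLoader:
+     """Minimal sketch: load a GGUF model off the Tk main thread."""
+
+     def __init__(self, model_path: str):
+         self.model_path = model_path
+         self.model = None
+
+     def load_model_async(self, on_ready=None):
+         def _load():
+             # The slow part: reading and mmapping the GGUF file
+             self.model = Llama(model_path=self.model_path, n_ctx=4096)
+             if on_ready:
+                 on_ready()
+         # daemon=True so a pending load never blocks Thonny's shutdown
+         threading.Thread(target=_load, daemon=True).start()
+ ```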
+
+ ### Testing Strategy
+ - Mock `thonny.get_workbench()` for unit tests (see the sketch below)
+ - Test model loading separately from the UI
+ - Include integration tests with small GGUF models
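+
+ The mocking tip above might look like this with pytest-mock (included in the dev extras); the assertion is illustrative and may not match the real `load_plugin()` exactly:
+
+ ```python
+ def test_load_plugin_registers_view(mocker):
+     # Replace the real workbench before the plugin imports it
+     workbench = mocker.Mock()
+     mocker.patch("thonny.get_workbench", return_value=workbench)
+
+     from thonnycontrib.thonny_codemate import load_plugin
+     load_plugin()
+
+     # The plugin should have registered at least one view
+     assert workbench.add_view.called
+ ```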
+
+ ### Portability Requirements
+ - Bundle all dependencies for USB deployment
+ - Use relative paths for model files (see the sketch below)
+ - Include platform-specific llama-cpp-python wheels
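+
+ For the relative-path requirement, resolving the models directory against the package location (rather than the current working directory) keeps a USB bundle self-contained; a minimal sketch assuming the repository layout shown above:
+
+ ```python
+ from pathlib import Path
+
+ # Resolve models/ next to the package, not the CWD, so the bundle
+ # works from whatever drive letter the USB stick is assigned.
+ PLUGIN_DIR = Path(__file__).resolve().parent      # .../thonnycontrib/thonny_codemate
+ MODELS_DIR = PLUGIN_DIR.parent.parent / "models"  # assumed bundle layout
+
+ def find_gguf_models():
+     return sorted(MODELS_DIR.glob("*.gguf"))
+ ```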
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 tokoroten
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,13 @@
+ include README.md
+ include README.ja.md
+ include LICENSE
+ include pyproject.toml
+ include CLAUDE.md
+ recursive-include thonnycontrib *.py
+ recursive-include docs_for_ai *.md
+ recursive-exclude * __pycache__
+ recursive-exclude * *.py[co]
+ recursive-exclude * .DS_Store
+ recursive-exclude models *
+ recursive-exclude .venv *
+ recursive-exclude tests *
@@ -0,0 +1,307 @@
+ Metadata-Version: 2.4
+ Name: thonny-codemate
+ Version: 0.1.0
+ Summary: A Thonny IDE plugin that provides AI-powered coding assistance using local and cloud LLMs
+ Author-email: tokoroten <shinta.nakayama@gmail.com>
+ License: MIT
+ Project-URL: Homepage, https://github.com/tokoroten/thonny-codemate
+ Project-URL: Repository, https://github.com/tokoroten/thonny-codemate
+ Project-URL: Issues, https://github.com/tokoroten/thonny-codemate/issues
+ Keywords: thonny,plugin,llm,ai,code-assistant,education
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Intended Audience :: Education
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Education
+ Classifier: Topic :: Software Development :: Code Generators
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: thonny>=4.0.0
+ Requires-Dist: huggingface-hub[hf_xet]>=0.16.0
+ Requires-Dist: tkinterweb[javascript]>=3.24
+ Requires-Dist: pythonmonkey>=0.2.0
+ Requires-Dist: markdown>=3.5
+ Requires-Dist: pygments>=2.17
+ Requires-Dist: llama-cpp-python>=0.3.9
+ Provides-Extra: cuda
+ Provides-Extra: metal
+ Provides-Extra: external-only
+ Provides-Extra: dev
+ Requires-Dist: pytest>=7.0; extra == "dev"
+ Requires-Dist: pytest-cov>=4.0; extra == "dev"
+ Requires-Dist: pytest-mock>=3.10.0; extra == "dev"
+ Requires-Dist: debugpy>=1.6; extra == "dev"
+ Requires-Dist: black>=23.0; extra == "dev"
+ Requires-Dist: ruff>=0.1.0; extra == "dev"
+ Requires-Dist: llama-cpp-python>=0.3.9; extra == "dev"
+ Provides-Extra: test
+ Requires-Dist: pytest>=7.0; extra == "test"
+ Requires-Dist: pytest-cov>=4.0; extra == "test"
+ Requires-Dist: pytest-mock>=3.10.0; extra == "test"
+ Dynamic: license-file
+
+ # Thonny Local LLM Plugin
+
+ A Thonny IDE plugin that integrates local LLM capabilities using llama-cpp-python to provide GitHub Copilot-like features without requiring external API services.
+
+ ![image](https://github.com/user-attachments/assets/af94b175-bcc1-4e44-bbed-37402ee2850f)
+
+ ## Features
+
+ - 🤖 **Local LLM Integration**: Uses llama-cpp-python to load GGUF models directly (no Ollama server required)
+ - 🚀 **On-Demand Model Loading**: Models are loaded on first use (not at startup) to avoid slow startup times
+ - 📝 **Code Generation**: Generate code based on natural language instructions
+ - 💡 **Code Explanation**: Select code and get AI-powered explanations via context menu
+ - 🎯 **Context-Aware**: Understands multiple files and project context
+ - 💬 **Conversation Memory**: Maintains conversation history for contextual responses
+ - 🎚️ **Skill Level Adaptation**: Adjusts responses based on user's programming skill level
+ - 🔌 **External API Support**: Optional support for ChatGPT, Ollama server, and OpenRouter as alternatives
+ - 📥 **Model Download Manager**: Built-in download manager for recommended models
+ - 🎨 **Customizable System Prompts**: Tailor AI behavior with custom system prompts
+ - 📋 **Interactive Code Blocks**: Copy and insert code blocks directly from chat
+ - 🎨 **Markdown Rendering**: Optional rich text formatting with tkinterweb
+ - 💾 **USB Portable**: Can be bundled with Thonny and models for portable use
+ - 🛡️ **Error Resilience**: Advanced error handling with automatic retry and user-friendly messages
+ - ⚡ **Performance Optimized**: Message virtualization and caching for handling large conversations
+ - 🔧 **Smart Provider Detection**: Automatically detects Ollama vs LM Studio based on API responses
+ - 🌐 **Multi-language Support**: Japanese, Chinese (Simplified/Traditional), and English UI
+
+ ## Installation
+
+ ### From PyPI
+ ```bash
+ # Standard installation (includes llama-cpp-python for CPU)
+ pip install thonny-codemate
+ ```
+
+ **For GPU support**, see [INSTALL_GPU.md](INSTALL_GPU.md) for detailed instructions:
+ - NVIDIA GPUs (CUDA)
+ - Apple Silicon (Metal)
+ - Automatic GPU detection
+
+ ### Development Installation
+
+ #### Quick Setup with uv (Recommended)
+ ```bash
+ # Clone the repository
+ git clone https://github.com/tokoroten/thonny-codemate.git
+ cd thonny-codemate
+
+ # Install uv if not already installed
+ # Windows (PowerShell):
+ powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
+ # Linux/macOS:
+ curl -LsSf https://astral.sh/uv/install.sh | sh
+
+ # Install all dependencies (including llama-cpp-python)
+ uv sync --all-extras
+
+ # Or install with development dependencies only
+ uv sync --extra dev
+
+ # (Optional) Install Markdown rendering support
+ # Basic Markdown rendering:
+ uv sync --extra markdown
+ # Full JavaScript support for interactive features:
+ uv sync --extra markdown-full
+
+ # Activate virtual environment
+ .venv\Scripts\activate  # Windows
+ source .venv/bin/activate  # macOS/Linux
+ ```
+
+ #### Alternative Setup Script
+ ```bash
+ # Use the setup script for guided installation
+ python setup_dev.py
+ ```
+
+ ### Installing with GPU Support
+
+ By default, llama-cpp-python is installed with CPU support. For GPU acceleration:
+
+ **CUDA support**:
+ ```bash
+ # Reinstall llama-cpp-python with CUDA support
+ uv pip uninstall llama-cpp-python
+ uv pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
+ ```
+
+ **Metal support (macOS)**:
+ ```bash
+ # Rebuild with Metal support
+ uv pip uninstall llama-cpp-python
+ CMAKE_ARGS="-DLLAMA_METAL=on" uv pip install llama-cpp-python --no-cache-dir
+ ```
+
+ ## Model Setup
+
+ ### Download GGUF Models
+
+ Recommended models:
+ - **Qwen2.5-Coder-14B** - Latest high-performance model specialized for programming (8.8GB)
+ - **Llama-3.2-1B/3B** - Lightweight and fast models (0.8GB/2.0GB)
+ - **Llama-3-ELYZA-JP-8B** - Japanese-specialized model (4.9GB)
+
+ ```bash
+ # Install Hugging Face CLI
+ pip install -U "huggingface_hub[cli]"
+
+ # Qwen2.5 Coder (programming-focused, recommended)
+ huggingface-cli download bartowski/Qwen2.5-Coder-14B-Instruct-GGUF Qwen2.5-Coder-14B-Instruct-Q4_K_M.gguf --local-dir ./models
+
+ # Llama 3.2 1B (lightweight)
+ huggingface-cli download bartowski/Llama-3.2-1B-Instruct-GGUF Llama-3.2-1B-Instruct-Q4_K_M.gguf --local-dir ./models
+ ```
+
+ ## Usage
+
+ 1. **Start Thonny** - The plugin will load automatically
+ 2. **Model Setup**:
+    - Open Settings → LLM Assistant Settings
+    - Choose between local models or external APIs
+    - For local models: Select a GGUF file or download recommended models
+    - For external APIs: Enter your API key and model name
+ 3. **Code Explanation**:
+    - Select code in the editor
+    - Right-click and choose "Explain Selection"
+    - The AI will explain the code based on your skill level
+ 4. **Code Generation**:
+    - Write a comment describing what you want
+    - Right-click and choose "Generate from Comment"
+    - Or use the AI Assistant panel for interactive chat
+ 5. **Error Fixing**:
+    - When you encounter an error, click "Explain Error" in the assistant panel
+    - The AI will analyze the error and suggest fixes
+
+ ### External API Configuration
+
+ All three providers are configured through the same settings dialog; a request sketch follows the provider notes below.
+
+ #### ChatGPT
+ 1. Get an API key from [OpenAI](https://platform.openai.com/)
+ 2. In settings, select "chatgpt" as provider
+ 3. Enter your API key
+ 4. Choose model (e.g., gpt-3.5-turbo, gpt-4)
+
+ #### Ollama
+ 1. Install and run [Ollama](https://ollama.ai/)
+ 2. In settings, select "ollama" as provider
+ 3. Set base URL (default: http://localhost:11434)
+ 4. Choose installed model (e.g., llama3, mistral)
+
+ #### OpenRouter
+ 1. Get an API key from [OpenRouter](https://openrouter.ai/)
+ 2. In settings, select "openrouter" as provider
+ 3. Enter your API key
+ 4. Choose model (free models available)
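+
+ For orientation: all three providers can be reached through an OpenAI-style chat-completions endpoint (Ollama also exposes a native `/api/chat`). A minimal request sketch using only the standard library - illustrative, not the plugin's actual `external_providers.py`:
+
+ ```python
+ import json
+ import urllib.request
+
+ def chat_completion(base_url: str, api_key: str, model: str, prompt: str) -> str:
+     """One OpenAI-style chat request. base_url examples:
+     https://api.openai.com/v1, https://openrouter.ai/api/v1,
+     or http://localhost:11434/v1 for a local Ollama."""
+     req = urllib.request.Request(
+         f"{base_url}/chat/completions",
+         data=json.dumps({
+             "model": model,
+             "messages": [{"role": "user", "content": prompt}],
+         }).encode(),
+         headers={
+             "Content-Type": "application/json",
+             "Authorization": f"Bearer {api_key}",  # Ollama ignores the key
+         },
+     )
+     with urllib.request.urlopen(req) as resp:
+         return json.load(resp)["choices"][0]["message"]["content"]
+ ```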
+
+ ## Development
+
+ ### Project Structure
+ ```
+ thonny-codemate/
+ ├── thonnycontrib/
+ │   └── thonny_codemate/
+ │       ├── __init__.py    # Plugin entry point
+ │       ├── llm_client.py  # LLM integration
+ │       ├── ui_widgets.py  # UI components
+ │       └── config.py      # Configuration
+ ├── models/                # GGUF model storage
+ ├── tests/                 # Unit tests
+ ├── docs_for_ai/           # AI documentation
+ └── README.md
+ ```
+
+ ### Running in Development Mode
+ ```bash
+ # Normal mode
+ python run_dev.py
+
+ # Debug mode (for VS Code/PyCharm attachment)
+ python run_dev.py --debug
+
+ # Quick run with uv
+ uv run thonny
+ ```
+
+ ### Running Tests
+ ```bash
+ uv run pytest -v
+ ```
+
+ ## Configuration
+
+ The plugin stores its configuration in Thonny's settings system (see the sketch after this list). You can configure:
+
+ - **Provider Selection**: Local models or external APIs (ChatGPT, Ollama, OpenRouter)
+ - **Model Settings**: Model path, context size, generation parameters
+ - **User Preferences**: Skill level (beginner/intermediate/advanced)
+ - **System Prompts**: Choose between coding-focused, explanation-focused, or custom prompts
+ - **Generation Parameters**: Temperature, max tokens, etc.
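+
+ Thonny exposes its settings through the workbench option API (`set_default`, `get_option`, and `set_option` are real Thonny APIs); the option names below are illustrative, not the plugin's actual keys:
+
+ ```python
+ from thonny import get_workbench
+
+ def init_options():
+     wb = get_workbench()
+     # Register defaults once; values already saved by the user are kept
+     wb.set_default("llm.provider", "local")        # illustrative key
+     wb.set_default("llm.skill_level", "beginner")  # illustrative key
+
+ def set_skill_level(level: str) -> None:
+     get_workbench().set_option("llm.skill_level", level)
+
+ def get_provider() -> str:
+     return get_workbench().get_option("llm.provider")
+ ```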
+
+ ## Requirements
+
+ - Python 3.10+
+ - Thonny 4.0+
+ - llama-cpp-python (automatically installed)
+ - 4GB+ RAM (depending on model size)
+ - 5-10GB disk space for models
+ - uv (for development)
+ - tkinterweb with JavaScript support (for Markdown rendering and interactive features)
+   - Automatically installed with the plugin
+   - Includes PythonMonkey for JavaScript-Python communication
+   - Enables Copy/Insert buttons with direct Python integration
+
+ ## Contributing
+
+ Contributions are welcome! Please feel free to submit a Pull Request.
+
+ 1. Fork the repository
+ 2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
+ 3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
+ 4. Push to the branch (`git push origin feature/AmazingFeature`)
+ 5. Open a Pull Request
+
+ ## License
+
+ This project is licensed under the MIT License - see the LICENSE file for details.
+
+ ## Acknowledgments
+
+ - Inspired by GitHub Copilot's functionality
+ - Built on top of [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
+ - Designed for [Thonny IDE](https://thonny.org/)
+ - **99% of the code in this project was generated by [Claude Code](https://claude.ai/code)** - This project demonstrates the capabilities of AI-assisted development
+
+ ## Status
+
+ 🚧 **Under Development** - This plugin is currently in an early development stage.
+
+ ## Roadmap
+
+ - [x] Initial project setup
+ - [x] Development environment with uv
+ - [x] Basic plugin structure
+ - [x] LLM integration with llama-cpp-python
+ - [x] Chat panel UI (right side)
+ - [x] Context menu for code explanation
+ - [x] Code generation from comments
+ - [x] Error fixing assistance
+ - [x] Configuration UI
+ - [x] Multi-file context support
+ - [x] Model download manager
+ - [x] External API support (ChatGPT, Ollama, OpenRouter)
+ - [x] Customizable system prompts
+ - [ ] Inline code completion
+ - [ ] USB portable packaging
+ - [ ] PyPI release
+
+ ## Links
+
+ - [Thonny IDE](https://thonny.org/)
+ - [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
+ - [Project Documentation](docs_for_ai/)