cognautic-cli 1.1.2__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (32)
  1. cognautic_cli-1.1.2/LICENSE +21 -0
  2. cognautic_cli-1.1.2/PKG-INFO +601 -0
  3. cognautic_cli-1.1.2/README.md +538 -0
  4. cognautic_cli-1.1.2/cognautic/__init__.py +7 -0
  5. cognautic_cli-1.1.2/cognautic/ai_engine.py +2477 -0
  6. cognautic_cli-1.1.2/cognautic/auto_continuation.py +248 -0
  7. cognautic_cli-1.1.2/cognautic/cli.py +1184 -0
  8. cognautic_cli-1.1.2/cognautic/config.py +245 -0
  9. cognautic_cli-1.1.2/cognautic/file_tagger.py +194 -0
  10. cognautic_cli-1.1.2/cognautic/memory.py +419 -0
  11. cognautic_cli-1.1.2/cognautic/provider_endpoints.py +424 -0
  12. cognautic_cli-1.1.2/cognautic/rules.py +246 -0
  13. cognautic_cli-1.1.2/cognautic/tools/__init__.py +19 -0
  14. cognautic_cli-1.1.2/cognautic/tools/base.py +59 -0
  15. cognautic_cli-1.1.2/cognautic/tools/code_analysis.py +391 -0
  16. cognautic_cli-1.1.2/cognautic/tools/command_runner.py +292 -0
  17. cognautic_cli-1.1.2/cognautic/tools/file_operations.py +394 -0
  18. cognautic_cli-1.1.2/cognautic/tools/file_reader.py +218 -0
  19. cognautic_cli-1.1.2/cognautic/tools/registry.py +117 -0
  20. cognautic_cli-1.1.2/cognautic/tools/response_control.py +48 -0
  21. cognautic_cli-1.1.2/cognautic/tools/web_search.py +336 -0
  22. cognautic_cli-1.1.2/cognautic/utils.py +297 -0
  23. cognautic_cli-1.1.2/cognautic/websocket_server.py +485 -0
  24. cognautic_cli-1.1.2/cognautic_cli.egg-info/PKG-INFO +601 -0
  25. cognautic_cli-1.1.2/cognautic_cli.egg-info/SOURCES.txt +30 -0
  26. cognautic_cli-1.1.2/cognautic_cli.egg-info/dependency_links.txt +1 -0
  27. cognautic_cli-1.1.2/cognautic_cli.egg-info/entry_points.txt +2 -0
  28. cognautic_cli-1.1.2/cognautic_cli.egg-info/requires.txt +31 -0
  29. cognautic_cli-1.1.2/cognautic_cli.egg-info/top_level.txt +1 -0
  30. cognautic_cli-1.1.2/pyproject.toml +178 -0
  31. cognautic_cli-1.1.2/setup.cfg +4 -0
  32. cognautic_cli-1.1.2/setup.py +71 -0
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2025 Cognautic
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,601 @@
+ Metadata-Version: 2.4
+ Name: cognautic-cli
+ Version: 1.1.2
+ Summary: A Python-based CLI AI coding agent that provides agentic development capabilities with multi-provider AI support and real-time interaction
+ Home-page: https://github.com/cognautic/cognautic-cli
+ Author: Cognautic
+ Author-email: Cognautic <cognautic@gmail.com>
+ Maintainer-email: Cognautic <cognautic@gmail.com>
+ License: Proprietary - All Rights Reserved
+ Project-URL: Homepage, https://github.com/cognautic/cli
+ Project-URL: Documentation, https://cognautic.vercel.app/cognautic-cli.html
+ Project-URL: Repository, https://github.com/cognautic/cli.git
+ Project-URL: Issues, https://github.com/cognautic/cli/issues
+ Project-URL: Changelog, https://github.com/cognautic/cli/blob/main/CHANGELOG.md
+ Keywords: ai,cli,coding,assistant,development,automation
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Intended Audience :: Developers
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
+ Classifier: Topic :: Software Development :: Code Generators
+ Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: click>=8.0.0
+ Requires-Dist: websockets>=10.0
+ Requires-Dist: aiohttp>=3.8.0
+ Requires-Dist: pydantic>=2.0.0
+ Requires-Dist: rich>=13.0.0
+ Requires-Dist: requests>=2.28.0
+ Requires-Dist: beautifulsoup4>=4.11.0
+ Requires-Dist: psutil>=5.9.0
+ Requires-Dist: cryptography>=3.4.0
+ Requires-Dist: keyring>=23.0.0
+ Requires-Dist: openai>=1.0.0
+ Requires-Dist: anthropic>=0.7.0
+ Requires-Dist: google-generativeai>=0.3.0
+ Requires-Dist: together>=0.2.0
+ Requires-Dist: nest-asyncio>=1.5.0
+ Requires-Dist: prompt-toolkit>=3.0.0
+ Provides-Extra: tools
+ Requires-Dist: gitpython>=3.1.0; extra == "tools"
+ Requires-Dist: pyyaml>=6.0; extra == "tools"
+ Provides-Extra: dev
+ Requires-Dist: pytest>=7.0.0; extra == "dev"
+ Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
+ Requires-Dist: black>=22.0.0; extra == "dev"
+ Requires-Dist: flake8>=5.0.0; extra == "dev"
+ Requires-Dist: mypy>=1.0.0; extra == "dev"
+ Requires-Dist: coverage>=6.0.0; extra == "dev"
+ Provides-Extra: all
+ Requires-Dist: cognautic-cli[dev,tools]; extra == "all"
+ Dynamic: author
+ Dynamic: home-page
+ Dynamic: license-file
+ Dynamic: requires-python
+
+ # Cognautic CLI
+
+ **A Python-based CLI AI coding agent that provides agentic development capabilities with multi-provider AI support and real-time interaction.**
+
+ ⚠️ **Under Development** - Some features may be unavailable
+
+ ---
+
+ ## Overview
+
+ Cognautic CLI is a Python-based command-line interface that brings AI-powered development capabilities directly to your terminal. It provides agentic tools for file operations, command execution, web search, and code analysis with support for multiple AI providers. The tool is accessed through a single `cognautic` command with various subcommands.
+
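+ A minimal sketch of that command/subcommand pattern, using only commands documented later in this README:
+
+ ```bash
+ # Everything is a subcommand of the single `cognautic` entry point.
+ cognautic --help                # list subcommands and global options
+ cognautic setup --interactive   # one-time configuration wizard
+ cognautic chat                  # interactive, agentic chat session
+ ```
+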
+ > **⚠️ Development Notice:** Cognautic CLI is currently under development. Some features may be unavailable or subject to change.
+
+ ### Project Information
+
+ | Property | Value |
+ |----------|-------|
+ | **Developer** | Cognautic |
+ | **Written in** | Python |
+ | **Operating system** | Cross-platform |
+ | **Type** | AI Development Tool |
+ | **Status** | Under Development |
+ | **Repository** | [github.com/cognautic/cli](https://github.com/cognautic/cli) |
+
+ ---
+
+ ## Features
+
+ - **Multi-Provider AI Support**: Integrate with OpenAI, Anthropic, Google, Together AI, OpenRouter, and 15+ other AI providers
+ - **Local Model Support**: Run free open-source Hugging Face models locally without API keys (NEW! 🎉)
+ - **Agentic Tools**: File operations, command execution, web search, and code analysis
+ - **Intelligent Web Search**: Automatically searches the web when implementing features requiring current/external information (NEW! 🔍)
+ - **Real-time Communication**: WebSocket server for live AI responses and tool execution
+ - **Secure Configuration**: Encrypted API key storage and permission management
+ - **Interactive CLI**: Rich terminal interface with progress indicators, colored output, and command history
+
+ ---
+
+ ## Installation
+
+ ### Prerequisites
+
+ Ensure you have Python 3.8 or higher installed:
+
+ ```bash
+ python --version
+ ```
+
+ ### Download the Wheel File
+
+ Download the latest `.whl` file from the official repository:
+
+ ```bash
+ # Visit https://github.com/cognautic/cli/releases
+ # Download the latest cognautic_cli-z.z.z-py3-none-any.whl file
+ ```
+
+ ### Installation with pip
+
+ Install the downloaded wheel file using pip:
+
+ ```bash
+ # Navigate to your downloads folder
+ cd ~/Downloads
+
+ # Install the wheel file
+ pip install cognautic_cli-z.z.z-py3-none-any.whl
+ ```
+
+ ### Installation with pipx (Recommended)
+
+ For isolated installation, use pipx:
+
+ ```bash
+ # Install pipx if you don't have it
+ pip install pipx
+ pipx ensurepath
+
+ # Install Cognautic CLI with pipx
+ pipx install cognautic_cli-z.z.z-py3-none-any.whl
+ ```
+
+ ### Verify Installation
+
+ Check that Cognautic CLI is installed correctly:
+
+ ```bash
+ cognautic --version
+ ```
+
+ ### Updating Cognautic CLI
+
+ To update to a newer version, download the new wheel file and:
+
+ ```bash
+ # With pip (force reinstall)
+ pip install cognautic_cli-y.y.y-py3-none-any.whl --force-reinstall
+
+ # With pipx
+ pipx upgrade cognautic-cli
+ # Or force reinstall with pipx
+ pipx install cognautic_cli-y.y.y-py3-none-any.whl --force
+ ```
+
+ _**Note:** Replace `y.y.y` and `z.z.z` with actual version numbers (e.g., 1.0.0, 1.1.0)._
+
+ ### Uninstallation
+
+ To remove Cognautic CLI:
+
+ ```bash
+ # With pip
+ pip uninstall cognautic-cli
+
+ # With pipx
+ pipx uninstall cognautic-cli
+ ```
+
+ ---
+
+ ## Quick Start
+
+ ### Step 1: Install Cognautic CLI
+
+ ```bash
+ pip install cognautic_cli-x.x.x-py3-none-any.whl
+ ```
+
+ ### Step 2: Run Setup
+
+ ```bash
+ cognautic setup --interactive
+ ```
+
+ This will guide you through:
+ - Configuring API keys for your preferred AI providers
+ - Setting default provider and model
+ - Basic preferences
+
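+ If you prefer to skip the wizard, the same `setup` command also accepts a provider flag (see "Setup Command" below); a quick sketch:
+
+ ```bash
+ # Non-interactive variant documented under "Setup Command" later in this
+ # README; exact prompts for the matching API key may differ.
+ cognautic setup --provider openai
+ ```
+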
+ ### Step 3: Start Chatting
+
+ ```bash
+ cognautic chat
+ ```
+
+ Now you can chat with AI and use slash commands like:
+ - `/help` - Show available commands
+ - `/provider openai` - Switch AI provider
+ - `/model gpt-4` - Change model
+ - `/workspace ~/myproject` - Set working directory
+ - `/lmodel microsoft/phi-2` - Load local model
+
+ **That's it!** Start chatting and let the AI help you code.
+
+ ---
+
+ ## Available Slash Commands
+
+ Once you're in chat mode (`cognautic chat`), use these commands:
+
+ ### Workspace & Configuration
+
+ ```bash
+ /workspace <path>          # Change working directory (alias: /ws)
+ /setup                     # Run interactive setup wizard
+ /config list               # Show current configuration
+ /config set <key> <value>  # Set configuration value
+ /help                      # Show all available commands
+ ```
+
+ ### AI Provider & Model Management
+
+ ```bash
+ /provider [name]           # Switch AI provider (openai, anthropic, google, etc.)
+ /model [model_id]          # Switch AI model
+ /model list                # Fetch available models from provider's API
+ /lmodel <path>             # Load local Hugging Face model
+ /lmodel unload             # Unload current local model
+ ```
+
+ ### Session Management
+
+ ```bash
+ /session                   # Show current session info
+ /session list              # List all sessions
+ /session new               # Create new session
+ /session load <id>         # Load existing session
+ /session delete <id>       # Delete session
+ /session title <text>      # Update session title
+ ```
+
+ ### Display & Interface
+
+ ```bash
+ /speed [instant|fast|normal|slow]  # Set typing speed
+ /clear                             # Clear chat screen
+ /exit or /quit                     # Exit chat session
+ ```
+
+ ---
+
+ ## Command-Line Usage
+
+ Cognautic CLI provides these main commands:
+
+ ### Setup Command
+
+ ```bash
+ cognautic setup --interactive        # Interactive setup wizard
+ cognautic setup --provider openai    # Quick provider setup
+ ```
+
+ ### Chat Command
+
+ ```bash
+ cognautic chat                              # Start interactive chat
+ cognautic chat --provider anthropic         # Chat with specific provider
+ cognautic chat --model claude-3-sonnet      # Chat with specific model
+ cognautic chat --project-path ./my_project  # Set workspace
+ cognautic chat --session <id>               # Continue existing session
+ ```
+
+ ### Config Command
+
+ ```bash
+ cognautic config list                # Show all configuration
+ cognautic config set <key> <value>   # Set configuration value
+ cognautic config get <key>           # Get configuration value
+ cognautic config delete <key>        # Delete configuration key
+ cognautic config reset               # Reset to defaults
+ ```
+
+ ### Providers Command
+
+ ```bash
+ cognautic providers                  # List all AI providers and endpoints
+ ```
+
+ ---
+
+ ## Supported AI Providers
+
+ | Provider | Models | API Key Required |
+ |----------|--------|------------------|
+ | **OpenAI** | GPT models (GPT-4, GPT-3.5) | `OPENAI_API_KEY` |
+ | **Anthropic** | Claude models (Claude-3 Sonnet, Haiku) | `ANTHROPIC_API_KEY` |
+ | **Google** | Gemini models | `GOOGLE_API_KEY` |
+ | **Together AI** | Various open-source models | `TOGETHER_API_KEY` |
+ | **OpenRouter** | Access to multiple providers | `OPENROUTER_API_KEY` |
+ | **Local Models** | Hugging Face models (Llama, Mistral, Phi, etc.) | ❌ No API key needed! |
+
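+ As a hedged illustration of the table above, an API key can be supplied through the environment variable named in the third column before starting a chat (the exact precedence between environment variables and keys stored via `cognautic setup` is not specified here):
+
+ ```bash
+ # Illustrative only - replace with your real key and preferred provider.
+ export OPENAI_API_KEY="sk-..."
+ cognautic chat --provider openai
+ ```
+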
+ ### Using Local Models (NEW! 🎉)
+
+ Run free open-source AI models locally without any API keys:
+
+ ```bash
+ # Install dependencies
+ pip install transformers torch accelerate
+
+ # Start chat and load a local model
+ cognautic chat
+ /lmodel microsoft/phi-2
+ /provider local
+
+ # Now chat with your local model!
+ ```
+
+ **Popular local models:**
+ - `microsoft/phi-2` - Small and fast (2.7B)
+ - `TinyLlama/TinyLlama-1.1B-Chat-v1.0` - Ultra lightweight (1.1B)
+ - `meta-llama/Llama-2-7b-chat-hf` - High quality (7B)
+ - `mistralai/Mistral-7B-Instruct-v0.2` - Excellent performance (7B)
+
+ **Benefits:**
+ - ✅ Complete privacy - no data sent externally
+ - ✅ No API costs
+ - ✅ Works offline
+ - ✅ Full control over model behavior
+
+ 📖 **[Read the full Local Models Guide →](LOCAL_MODELS.md)**
+
+ ---
+
+ ## Intelligent Web Search (NEW! 🔍)
+
+ Cognautic CLI now features **intelligent web search** that automatically researches information when needed. The AI will search the web when:
+
+ - **Implementing APIs**: "Implement Stripe payment integration"
+ - **Using Latest Libraries**: "Create a React app with TailwindCSS"
+ - **Research Requests**: "What's the best way to implement real-time chat?"
+ - **Current Best Practices**: "Build a modern authentication system"
+
+ ### Example Usage
+
+ ```bash
+ You: Implement OpenAI API in my Python project
+
+ AI: 🔍 Searching for latest OpenAI API documentation...
+     ✅ Found: OpenAI API Reference
+     📝 Creating implementation with current best practices...
+
+     [Creates files with up-to-date API usage]
+ ```
+
+ ### When Web Search is Used
+
+ ✅ **Automatically triggered for:**
+ - Latest API documentation
+ - Current framework/library versions
+ - Modern best practices
+ - Technologies requiring external information
+
+ ❌ **Not used for:**
+ - Basic programming concepts
+ - Simple file operations
+ - General coding tasks
+
+ 📖 **[Read the full Web Search Guide →](docs/WEB_SEARCH_TOOL.md)** | **[Quick Reference →](docs/WEB_SEARCH_QUICK_REFERENCE.md)**
+
+ ---
+
+ ## Configuration
+
+ Configuration files are stored in `~/.cognautic/`:
+
+ - `config.json`: General settings and preferences
+ - `api_keys.json`: Encrypted API keys for AI providers
+ - `sessions/`: Chat session history and context
+ - `cache/`: Temporary files and model cache
+
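+ A purely illustrative sketch of how these files might be inspected; the key name passed to `config get` below is an assumption, not a documented key, and `cognautic config list` remains the documented way to see everything:
+
+ ```bash
+ # Peek at the general settings file directly...
+ cat ~/.cognautic/config.json
+ # ...or go through the CLI; "default_provider" is a hypothetical key name.
+ cognautic config get default_provider
+ cognautic config list
+ ```
+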
+ ---
+
+ ## Command Usage
+
+ All Cognautic CLI functionality is accessed through the single `cognautic` command. The general syntax is:
+
+ ```bash
+ cognautic <subcommand> [options] [arguments]
+ ```
+
+ ### Getting Help
+
+ ```bash
+ # Show general help
+ cognautic --help
+
+ # Show help for specific command
+ cognautic chat --help
+ ```
+
+ ### Version Information
+
+ ```bash
+ cognautic --version
+ ```
+
+ ---
+
+ ## WebSocket Server & Real-time Streaming
+
+ Cognautic CLI includes a powerful WebSocket server that enables **real-time, streaming AI responses**. Instead of waiting for the complete response, you receive AI-generated content as it's being produced, providing a much more interactive experience.
+
+ ### Starting the WebSocket Server
+
+ The WebSocket server starts automatically when you run chat mode:
+
+ ```bash
+ # Start with default settings (port 8765)
+ cognautic chat
+
+ # Specify custom port
+ cognautic chat --websocket-port 9000
+
+ # With specific provider and model
+ cognautic chat --provider openai --model gpt-4o-mini --websocket-port 8765
+ ```
+
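+ Once chat mode is running, you can sanity-check the endpoint from a second terminal. A hedged sketch using the interactive client bundled with the `websockets` dependency (assumes the default port above):
+
+ ```bash
+ # Opens a raw interactive WebSocket session; expect the server's JSON
+ # welcome message, as in the Python example further below.
+ python -m websockets ws://localhost:8765
+ ```
+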
+ ### Key Features
+
+ - ✨ **Real-time Streaming**: AI responses stream chunk-by-chunk as they're generated
+ - 🔄 **Bi-directional**: Full duplex WebSocket communication
+ - 🔐 **Session Management**: Automatic session creation and context preservation
+ - 🤖 **Multi-provider**: Works with all supported AI providers
+ - 🛠️ **Tool Execution**: Execute tools and file operations via WebSocket
+
+ ### Client Examples
+
+ **Python Client:**
+ ```bash
+ python examples/websocket_client_example.py
+
+ # Interactive mode
+ python examples/websocket_client_example.py interactive
+ ```
+
+ **Web Browser:**
+ ```bash
+ # Open in your browser
+ open examples/websocket_client.html
+ ```
+
+ ### Basic Usage Example
+
+ ```python
+ import asyncio
+ import json
+ import websockets
+
+ async def chat():
+     uri = "ws://localhost:8765"
+     async with websockets.connect(uri) as ws:
+         # Receive welcome message
+         welcome = json.loads(await ws.recv())
+         print(f"Connected! Session: {welcome['session_id']}")
+
+         # Send chat message with streaming enabled
+         await ws.send(json.dumps({
+             "type": "chat",
+             "message": "Explain Python async/await",
+             "stream": True
+         }))
+
+         # Receive streaming response in real-time
+         while True:
+             response = json.loads(await ws.recv())
+
+             if response['type'] == 'stream_chunk':
+                 print(response['chunk'], end='', flush=True)
+             elif response['type'] == 'stream_end':
+                 break
+
+ asyncio.run(chat())
+ ```
+
+ ### API Documentation
+
+ For complete WebSocket API documentation, see **[WEBSOCKET_API.md](WEBSOCKET_API.md)**.
+
+ ---
+
+ ## Examples
+
+ ### Simple Chat Session
+
+ Start chatting with AI:
+
+ ```bash
+ $ cognautic chat
+ ██████╗ ██████╗ ██████╗ ███╗ ██╗ █████╗ ██╗ ██╗████████╗██╗ ██████╗
+ ██╔════╝██╔═══██╗██╔════╝ ████╗ ██║██╔══██╗██║ ██║╚══██╔══╝██║██╔════╝
+ ██║ ██║ ██║██║ ███╗██╔██╗ ██║███████║██║ ██║ ██║ ██║██║
+ ██║ ██║ ██║██║ ██║██║╚██╗██║██╔══██║██║ ██║ ██║ ██║██║
+ ╚██████╗╚██████╔╝╚██████╔╝██║ ╚████║██║ ██║╚██████╔╝ ██║ ██║╚██████╗
+ ╚═════╝ ╚═════╝ ╚═════╝ ╚═╝ ╚═══╝╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═╝ ╚═════╝
+
+ 💡 Type '/help' for commands, 'exit' to quit
+ 🌐 WebSocket server: ws://localhost:8765
+ 📁 Workspace: /home/user/projects
+ --------------------------------------------------
+
+ You [projects]: Can you help me create a Python function?
+ AI: Of course! I'd be happy to help you create a Python function...
+
+ You [projects]: /workspace ~/myproject
+ ✅ Workspace changed to: /home/user/myproject
+
+ You [myproject]: Create a file called utils.py with helper functions
+ AI: I'll create that file for you...
+ ```
+
+ ### First-Time Setup
+
+ ```bash
+ $ cognautic
+ 🎉 Welcome to Cognautic! Let's get you set up.
+ 🔑 No API keys found. Let's configure them.
+
+ Which AI provider would you like to use?
+ 1. OpenAI (GPT-4, GPT-3.5)
+ 2. Anthropic (Claude)
+ 3. Google (Gemini)
+ 4. Other providers...
+
+ Choice [1-4]: 2
+ 🔐 Please enter your Anthropic API key: sk-ant-...
+ ✅ API key saved securely!
+
+ 🚀 Setup complete! You're ready to go.
+ ```
+
+ ### Using Local Models
+
+ Run AI models locally without API keys:
+
+ ```bash
+ $ cognautic chat
+ You: /lmodel microsoft/phi-2
+ 🔄 Loading local model from: microsoft/phi-2
+ ⏳ This may take a few minutes depending on model size...
+ Loading local model from microsoft/phi-2 on cuda...
+ ✅ Model loaded successfully on cuda
+ ✅ Local model loaded successfully!
+ 💡 Use: /provider local - to switch to the local model
+
+ You: /provider local
+ ✅ Switched to provider: local
+
+ You: Hello! Can you help me code?
+ AI: Hello! Yes, I'd be happy to help you with coding...
+ ```
+
+ ### Working with Multiple Providers
+
+ Switch between different AI providers:
+
+ ```bash
+ You: /provider openai
+ ✅ Switched to provider: openai
+
+ You: /model gpt-4o
+ ✅ Switched to model: gpt-4o
+
+ You: Write a Python function to sort a list
+ AI: Here's a Python function...
+
+ You: /provider anthropic
+ ✅ Switched to provider: anthropic
+
+ You: /model claude-3-sonnet-20240229
+ ✅ Switched to model: claude-3-sonnet-20240229
+ ```
+
+ ---
+
+ ## License
+
+ MIT