askcii 0.3.0 → 0.4.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: fdbabf6bbd313aa11a7868b5df4630a85798e6a59311872bc04cebeaa2ac49e3
- data.tar.gz: cff4449b247c8dcfe5273e122b054d72e54f57866f905854fd94aa39ee8e8547
+ metadata.gz: cecbd9c499b5c5736cf4059337fef530a6438272512b1bdbcd365c30be99fb91
+ data.tar.gz: 584719227c6c64f9886694abe922d8623b2a517988f788ef078144cfab78213d
  SHA512:
- metadata.gz: 5ca4009158417d0eee5a4452c4ec5c49fefedabdeb22495a2f6b6d08136391f19535e9468113c164f714e0a382eb8b57244e56027d6b824f23132fea679a4d29
- data.tar.gz: ee32e1ab823d7a2dc53445e3fa7cf064993272571512f6c4b18966bbb08c87163158262387521d073d5bfa0c158f80b8ab808f05c98f8e330aaaa555122814f2
+ metadata.gz: 22b9b01ee86ecd887db61892585f39217ea95c68c7a49dfbfbc12018d0ee8525a3e873464c2c962cff105b436ffeabd3462b93b07d0b93adca612822c23766be
+ data.tar.gz: e84ba6a8d2789be26e67bb8cd4527962fdc134ad11d6d3a30c6bd8cbfd8f0c201a9faec8055dfdcd2b1171f3c1a776252a8e69bfe5f3f35382814818848a93b5
data/.gitignore CHANGED
@@ -1,2 +1,5 @@
  *.gem
  **/.claude/settings.local.json
+
+ # Override global gitignore to allow CLAUDE.md in this repo
+ !CLAUDE.md
data/CLAUDE.md ADDED
@@ -0,0 +1,131 @@
+ # CLAUDE.md
+
+ This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+ ## Project Overview
+
+ Askcii is a Ruby gem that provides a command-line interface for interacting with multiple LLM providers (OpenAI, Anthropic, Gemini, DeepSeek, OpenRouter, Ollama). It supports multi-configuration management, session-based conversations, and streaming responses.
+
+ ## Development Commands
+
+ ### Installation and Setup
+ ```bash
+ bundle install # Install dependencies
+ bin/setup # Run setup script
+ ```
+
+ ### Running the Application
+ ```bash
+ bin/askcii 'Your prompt' # Run locally from source
+ bundle exec askcii # Run via bundler
+ ```
+
+ ### Gem Management
+ ```bash
+ bundle exec rake build # Build the gem
+ bundle exec rake install # Install gem locally
+ bundle exec rake release # Release new version (requires version bump in lib/askcii/version.rb)
+ ```
+
+ ### Testing
+ Note: This project currently has no test suite configured, despite the Rakefile referencing RSpec and the gemspec including minitest.
+
+ ### Console
+ ```bash
+ bin/console # Interactive Ruby console with askcii loaded
+ ```
+
+ ## Architecture
+
+ ### Core Components
+
+ **Entry Point (`bin/askcii`)**: Initializes the database, loads models, and starts the Application.
+
+ **Application Layer (`lib/askcii/application.rb`)**: The main application controller that:
+ - Parses CLI arguments via the CLI class
+ - Determines which configuration to use (explicit -m flag, default config, or env vars)
+ - Configures the RubyLLM library with provider-specific settings
+ - Delegates to ChatSession for actual LLM interaction
+
+ **CLI Parser (`lib/askcii/cli.rb`)**: Handles command-line argument parsing using OptionParser. Supports flags for:
+ - `-p/--private`: Private sessions (no history)
+ - `-r/--last-response`: Retrieve last assistant response
+ - `-c/--configure`: Interactive configuration management
+ - `-m/--model ID`: Use specific configuration by ID
+
+ **Configuration System**: Multi-layered configuration approach:
+ 1. **Multi-config mode** (`lib/askcii/models/config.rb`): Stores multiple provider configurations in SQLite with JSON serialization
+ 2. **Legacy env var mode**: Falls back to `ASKCII_API_KEY`, `ASKCII_API_ENDPOINT`, `ASKCII_MODEL_ID` environment variables
+ 3. **Configuration Manager** (`lib/askcii/configuration_manager.rb`): Interactive TUI for managing configurations
+
+ ### Data Models (Sequel ORM)
+
+ **Config Model** (`lib/askcii/models/config.rb`):
+ - Key-value store using `configs` table
+ - Multi-configurations stored as `config_1`, `config_2`, etc. with JSON values
+ - Each config contains: name, api_key, api_endpoint, model_id, provider
+ - Tracks default configuration via `default_config_id` key
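The key-value layout above can be pictured as a JSON round-trip; a minimal sketch, assuming a hypothetical stored row (the field names follow the list above, but this is not the gem's serialization code):

```ruby
require 'json'

# Hypothetical configuration hash as it would be stored under a slot
# key such as "config_1" in the 'configs' table.
config = {
  'name'         => 'Work OpenAI',
  'api_key'      => 'sk-example',
  'api_endpoint' => 'https://api.openai.com/v1',
  'model_id'     => 'gpt-4o-mini',
  'provider'     => 'openai'
}

# Serialize for the TEXT value column...
stored = JSON.generate(config)

# ...and deserialize when the configuration is selected.
loaded = JSON.parse(stored)
puts loaded['provider'] # => "openai"
```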
+
+ **Chat Model** (`lib/askcii/models/chat.rb`):
+ - Represents a conversation session identified by `context` (from the `ASKCII_SESSION` env var or random hex)
+ - Has many Messages
+ - Converts to/from RubyLLM::Chat objects
+ - Persists new messages and completions via callbacks
+
+ **Message Model** (`lib/askcii/models/message.rb`):
+ - Stores individual messages with role, content, token counts, and model_id
+ - Belongs to a Chat
+ - Handles UTF-8 encoding to prevent database issues
+
+ ### Database
+
+ **Location**: `~/.local/share/askcii/askcii.db` (SQLite via Amalgalite)
+
+ **Schema** (see `Askcii.setup_database` in `lib/askcii.rb`):
+ - `chats`: id, model_id, context, created_at
+ - `messages`: id, chat_id, role, content (TEXT), model_id, input_tokens, output_tokens, created_at
+ - `configs`: id, key (unique), value (TEXT for JSON storage)
+
+ ### Session Management
+
+ Sessions are controlled by the `ASKCII_SESSION` environment variable:
+ - If set: reuses the same chat context across invocations
+ - If unset: generates a random session ID per invocation
+ - Private mode (`-p`): creates an ephemeral RubyLLM chat without database persistence
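The session lookup can be sketched in a few lines; `resolve_session` and the 16-byte length are illustrative assumptions, not the gem's actual method or parameter:

```ruby
require 'securerandom'

# Reuse ASKCII_SESSION when set; otherwise mint a fresh random context,
# giving each invocation its own conversation history.
def resolve_session(env = ENV)
  env['ASKCII_SESSION'] || SecureRandom.hex(16)
end

resolve_session({ 'ASKCII_SESSION' => 'work-notes' }) # => "work-notes"
resolve_session({})                                   # => random 32-char hex string
```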
+
+ ### Provider Configuration
+
+ The `Askcii.configure_llm` method in `lib/askcii.rb` handles provider-specific setup:
+ - **OpenAI**: Sets `openai_api_key` and `openai_api_base`
+ - **Anthropic**: Sets `anthropic_api_key`
+ - **Gemini**: Sets `gemini_api_key`
+ - **DeepSeek**: Sets `deepseek_api_key`
+ - **OpenRouter**: Sets `openrouter_api_key`
+ - **Ollama**: Only requires an endpoint (default: `http://localhost:11434/v1`), no API key
+
+ All LLM interactions go through the `ruby_llm` gem (v1.3.0), which provides a unified streaming chat interface.
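The per-provider branching can be pictured as a dispatch from provider name to the RubyLLM credential setting it needs; this is an illustrative sketch under that assumption, not the gem's `configure_llm` source:

```ruby
# Map each provider string to the credential setting listed above.
# Ollama is the odd one out: endpoint only, no API key.
PROVIDER_KEYS = {
  'openai'     => :openai_api_key,
  'anthropic'  => :anthropic_api_key,
  'gemini'     => :gemini_api_key,
  'deepseek'   => :deepseek_api_key,
  'openrouter' => :openrouter_api_key,
  'ollama'     => nil
}.freeze

def credential_setting_for(provider)
  # fetch's block only fires when the key is absent, so 'ollama' => nil
  # passes through rather than raising.
  PROVIDER_KEYS.fetch(provider) { raise ArgumentError, "unknown provider: #{provider}" }
end

credential_setting_for('anthropic') # => :anthropic_api_key
credential_setting_for('ollama')    # => nil (endpoint-only)
```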
+
+ ### Chat Execution Flow
+
+ 1. `Application#run` determines configuration and calls `ChatSession#execute_chat`
+ 2. `ChatSession` creates either a private or persistent chat
+ 3. Persistent chats load history from database via `Chat#to_llm`
+ 4. System instruction sets terminal-friendly response style
+ 5. User prompt (optionally combined with stdin input) is sent to LLM
+ 6. Response streams to stdout character-by-character
+ 7. For persistent chats, callbacks save assistant messages to database
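Step 6 can be illustrated with plain Ruby blocks; `FakeChat` and its `ask` method are stand-ins to show the block-streaming shape, not `ruby_llm` classes:

```ruby
# A stand-in chat that yields response fragments as they "arrive".
class FakeChat
  def ask(_prompt)
    ['Hello', ', ', 'world', '!'].each { |chunk| yield chunk }
  end
end

# Print each fragment immediately and flush, so output appears
# before the full response is complete.
buffer = +''
FakeChat.new.ask('hi') do |chunk|
  print chunk
  $stdout.flush
  buffer << chunk
end
puts
```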
+
+ ## Important Patterns
+
+ **Provider Selection**: The system uses string provider names ('openai', 'anthropic', etc.) in configuration but converts them to symbols for RubyLLM. The `assume_model_exists: true` flag bypasses RubyLLM's model validation.
+
+ **Text Encoding**: Message content is always encoded as UTF-8 with `undef: :replace` to prevent encoding errors when saving to SQLite.
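A minimal illustration of the encoding guard: the pattern above names `undef: :replace`, and `invalid: :replace` is a common companion option I have added here as an assumption; the gem's exact option set may differ.

```ruby
# Bytes read from a pipe or file can arrive tagged as ASCII-8BIT
# (binary). Converting to UTF-8 with replacement options substitutes
# U+FFFD for anything without a UTF-8 mapping instead of raising
# Encoding::UndefinedConversionError.
raw = "caf\xC3\xA9".dup.force_encoding('ASCII-8BIT')

safe = raw.encode('UTF-8', invalid: :replace, undef: :replace)

safe.valid_encoding? # => true, so it is safe to persist to SQLite
```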
+
+ **Streaming**: All chat responses use streaming via blocks to provide immediate character-by-character output.
+
+ **Configuration Priority**:
+ 1. Explicit `-m ID` flag
+ 2. Default configuration from database
+ 3. Environment variables (legacy fallback)
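The priority order reads naturally as a cascade; a sketch with hypothetical names (`resolve_config` and its hash shape are stand-ins, not the gem's API):

```ruby
# Resolve which configuration wins, in priority order:
# explicit -m flag, then the stored default, then legacy env vars.
def resolve_config(explicit_id:, default_config:, env: ENV)
  return { source: :flag, id: explicit_id } if explicit_id
  return { source: :default }.merge(default_config) if default_config

  {
    source:       :env,
    api_key:      env['ASKCII_API_KEY'],
    api_endpoint: env['ASKCII_API_ENDPOINT'],
    model_id:     env['ASKCII_MODEL_ID']
  }
end

resolve_config(explicit_id: 2, default_config: nil)[:source]                     # => :flag
resolve_config(explicit_id: nil, default_config: { provider: 'ollama' })[:source] # => :default
```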
+
+ **Database Initialization**: The database and tables are created automatically on first run via `Askcii.setup_database` in the bin script.