askcii 0.1.0 → 0.2.0

This diff shows the content of publicly available package versions as released to one of the supported registries. It is provided for informational purposes only and reflects the changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 18a37a3681323f04071f337d1b0e8f7b0351fe2879de62224951d4b4c8c7d8d4
- data.tar.gz: 3e994f5ea755fed88a74fd07be374a412825c8e2227cc048f5c2d2dc219d2b5b
+ metadata.gz: 2fdbbba27b8846c738014ebaab5466f00b2a73b74c2b5c898bee87bd60752f6d
+ data.tar.gz: 737756f8559f8f518829b440d5a8d273a70d331731da02ed5be06230b0cad467
  SHA512:
- metadata.gz: 9c7c23cc79eb32819c310774894800f6f77cde9fa8466ac4f5674ebeaa626b006f39a487a8d99b8324b9e152a2f8393ecea0c61a9c1e822148c63732c4e9ea64
- data.tar.gz: e10d8a1726d612e6aeab85aaf717d48d6616eac31dcd744b56a761fad2e0ce71c1f8ad4e2e574590817e1c6a1b12f87c772793a72d1747e69095c9868547e548
+ metadata.gz: c2a8b74fd5e2ef08a7c1f386964f3a9ccae35e9e0d1ec9f37b28253a420f1849f909fbac88452b9615f5bb928db6814871633b68bd48ec376f2c1afb4951a089
+ data.tar.gz: 7bbd8c8a9157c5d7439be1d49dd20db68bb99ca9ef7fc47a9b4cd9a9c8372ebf89521a0edfdb4745587367852d72e017ef1f5fabd2e4cf153f92bfa2c259f31f
data/Gemfile.lock CHANGED
@@ -3,7 +3,7 @@ PATH
  specs:
  askcii (0.1.0)
  amalgalite (~> 1.9)
- ruby_llm (= 1.3.0rc1)
+ ruby_llm (= 1.3.0)
  sequel (~> 5.92)
 
  GEM
@@ -27,18 +27,20 @@ GEM
  faraday (~> 2.0)
  json (2.12.0)
  logger (1.7.0)
+ marcel (1.0.4)
  minitest (5.25.5)
  multipart-post (2.4.1)
  net-http (0.6.0)
  uri
  rake (13.2.1)
- ruby_llm (1.3.0rc1)
+ ruby_llm (1.3.0)
  base64
  event_stream_parser (~> 1)
- faraday (~> 2)
- faraday-multipart (~> 1)
- faraday-net_http (~> 3)
- faraday-retry (~> 2)
+ faraday (>= 1.10.0)
+ faraday-multipart (>= 1)
+ faraday-net_http (>= 1)
+ faraday-retry (>= 1)
+ marcel (~> 1.0)
  zeitwerk (~> 2)
  sequel (5.92.0)
  bigdecimal
data/README.md CHANGED
@@ -1,6 +1,6 @@
  # Askcii
 
- A command-line application for interacting with LLM models in a terminal-friendly way.
+ A command-line application for interacting with multiple LLM providers in a terminal-friendly way. Supports OpenAI, Anthropic, Gemini, DeepSeek, OpenRouter, and Ollama with multi-configuration management.
 
  ## Installation
 
@@ -20,52 +20,198 @@ Or install it yourself as:
 
  ## Usage
 
- ```
+ ### Basic Usage
+
+ ```bash
  # Basic usage
  askcii 'Your prompt here'
 
- # Pipe input
- echo 'Your context text' | askcii 'Your prompt here'
+ # Pipe input from other commands
+ echo 'Your context text' | askcii 'Analyze this text'
 
  # File input
- askcii 'Your prompt here' < input.txt
+ askcii 'Summarize this document' < document.txt
+
+ # Private session (no conversation history saved)
+ askcii -p 'Tell me a joke'
+
+ # Get the last response from your current session
+ askcii -r
+
+ # Use a specific configuration
+ askcii -m 2 'Hello using configuration 2'
+ ```
+
+ ### Session Management
 
+ ```bash
  # Set a custom session ID to maintain conversation context
- ASKCII_SESSION_ID="project-research" askcii 'What did we talk about earlier?'
+ ASKCII_SESSION="project-research" askcii 'What did we talk about earlier?'
 
- # Or add the following to your .bashrc or .zshrc to always start a new session
+ # Generate a new session for each terminal session
  export ASKCII_SESSION=$(openssl rand -hex 16)
+ ```
+
+ ### Command Line Options
+
+ ```
+ Usage: askcii [options] 'Your prompt here'
+
+ Options:
+ -p, --private Start a private session and do not record
+ -r, --last-response Output the last response
+ -c, --configure Manage configurations
+ -m, --model ID Use specific configuration ID
+ -h, --help Show help
+ ```
+
+ ## Configuration Management
+
+ Askcii supports multi-configuration management, allowing you to easily switch between different LLM providers and models.
 
- # Configure the API key, endpoint, and model ID
+ ### Interactive Configuration
+
+ Use the `-c` flag to access the configuration management interface:
+
+ ```bash
  askcii -c
+ ```
 
- # Get the last response
- askcii -r
+ This will show you a menu like:
+
+ ```
+ Configuration Management
+ =======================
+ Current configurations:
+ 1. GPT-4 [openai] (default)
+ 2. Claude Sonnet [anthropic]
+ 3. Gemini Pro [gemini]
+ 4. Local Llama [ollama]
+
+ Options:
+ 1. Add new configuration
+ 2. Set default configuration
+ 3. Delete configuration
+ 4. Exit
  ```
 
- ## Configuration
+ ### Supported Providers
+
+ 1. **OpenAI** - GPT models (gpt-4, gpt-3.5-turbo, etc.)
+ 2. **Anthropic** - Claude models (claude-3-sonnet, claude-3-haiku, etc.)
+ 3. **Gemini** - Google's Gemini models
+ 4. **DeepSeek** - DeepSeek models
+ 5. **OpenRouter** - Access to multiple models through OpenRouter
+ 6. **Ollama** - Local models (no API key required)
 
- You can configure your API key, endpoint, and model ID using the `-c` option:
+ ### Adding a New Configuration
 
+ When adding a configuration, you'll be prompted for:
+
+ 1. **Configuration name** - A friendly name for this configuration
+ 2. **Provider** - Choose from the supported providers
+ 3. **API key** - Your API key for the provider (not needed for Ollama)
+ 4. **API endpoint** - The API endpoint (defaults provided for each provider)
+ 5. **Model ID** - The specific model to use
+
+ Example for OpenAI:
  ```
- $ askcii -c
- Configuring askcii...
- Enter API key: your_api_key_here
- Enter API endpoint: http://localhost:11434/v1
- Enter model ID: gemma3:12b
- Configuration saved successfully!
+ Enter configuration name: GPT-4 Turbo
+ Provider (1-6): 1
+ Enter OpenAI API key: sk-your-api-key-here
+ Enter API endpoint (default: https://api.openai.com/v1): [press enter for default]
+ Enter model ID: gpt-4-turbo-preview
  ```
 
- Configuration settings are stored in a SQLite database located at `~/.local/share/askcii/askcii.db`.
+ Example for Ollama (local):
+ ```
+ Enter configuration name: Local Llama
+ Provider (1-6): 6
+ Enter API endpoint (default: http://localhost:11434/v1): [press enter for default]
+ Enter model ID: llama3:8b
+ ```
+
+ ### Using Configurations
+
+ ```bash
+ # Use the default configuration
+ askcii 'Hello world'
 
- You can also use environment variables to override the stored configuration:
+ # Use a specific configuration by ID
+ askcii -m 2 'Hello using configuration 2'
 
+ # List all configurations
+ askcii -c
  ```
+
+ ### Configuration Storage
+
+ Configuration settings are stored in a SQLite database located at `~/.local/share/askcii/askcii.db`.
+
+ ### Environment Variable Fallback
+
+ You can still use environment variables as a fallback when no configurations are set up:
+
+ ```bash
  ASKCII_API_KEY="your_api_key" askcii 'Hello!'
  ASKCII_API_ENDPOINT="https://api.example.com/v1" askcii 'Hello!'
  ASKCII_MODEL_ID="gpt-4" askcii 'Hello!'
  ```
 
+ ## Examples
+
+ ### Daily Workflow Examples
+
+ ```bash
+ # Code review
+ git diff | askcii 'Review this code change and suggest improvements'
+
+ # Log analysis
+ tail -100 /var/log/app.log | askcii 'Summarize any errors or issues'
+
+ # Documentation
+ askcii 'Explain how to set up a Redis cluster' > redis-setup.md
+
+ # Quick calculations
+ askcii 'Calculate compound interest: $1000 at 5% for 10 years'
+
+ # Text processing
+ cat customers.csv | askcii 'Convert this CSV to JSON format'
+ ```
+
+ ### Multi-Provider Usage
+
+ ```bash
+ # Use Claude for creative writing
+ askcii -m 1 'Write a short story about AI'
+
+ # Use GPT-4 for code analysis
+ askcii -m 2 'Explain this Python function' < function.py
+
+ # Use local Ollama for private data
+ askcii -m 3 -p 'Analyze this sensitive document' < confidential.txt
+ ```
+
+ ## Session Management
+
+ Askcii maintains conversation history unless you use the private mode (`-p`). Sessions are identified by:
+
+ 1. The `ASKCII_SESSION` environment variable
+ 2. A randomly generated session ID if not specified
+
+ ```bash
+ # Start a named session for a project
+ export ASKCII_SESSION="project-alpha"
+ askcii 'What is the project timeline?'
+ askcii 'What are the main risks?' # This will have context from previous question
+
+ # Use private mode for one-off questions
+ askcii -p 'What is the weather like?'
+
+ # Get the last response from current session
+ askcii -r
+ ```
+
  ## Development
 
  After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
data/askcii.gemspec CHANGED
@@ -26,9 +26,9 @@ Gem::Specification.new do |spec|
  spec.executables = ['askcii']
  spec.require_paths = ['lib']
 
- spec.add_dependency 'sequel', '~> 5.92'
  spec.add_dependency 'amalgalite', '~> 1.9'
- spec.add_dependency 'ruby_llm', '1.3.0rc1'
+ spec.add_dependency 'ruby_llm', '1.3.0'
+ spec.add_dependency 'sequel', '~> 5.92'
 
  spec.add_development_dependency 'minitest', '~> 5.25'
  spec.add_development_dependency 'rake', '~> 13.0'
data/bin/askcii CHANGED
@@ -4,99 +4,9 @@
  # Find the right load path
  $LOAD_PATH.unshift File.expand_path('../lib', __dir__)
  require 'askcii'
- require 'optparse'
 
  Askcii.setup_database
- Askcii.configure_llm
  Askcii.require_models
+ Askcii.require_application
 
- # Parse command-line options
- options = {}
- opt_parser = OptionParser.new do |opts|
- opts.banner = "Usage: askcii [options] 'Your prompt here'"
-
- opts.on('-r', '--last-response', 'Output the last response') do
- options[:last_response] = true
- end
-
- opts.on('-c', '--configure', 'Configure API key, endpoint, and model ID') do
- options[:configure] = true
- end
-
- opts.on('-h', '--help', 'Show this help message') do
- puts opts
- exit
- end
- end
-
- # Parse options, keeping remaining arguments in ARGV
- opt_parser.parse!
-
- # Handle configuration if requested
- if options[:configure]
- puts 'Configuring askcii...'
-
- # Prompt for API key
- print 'Enter API key: '
- api_key = gets.chomp
-
- # Prompt for API endpoint
- print "Enter API endpoint: "
- api_endpoint = gets.chomp
-
- # Prompt for model ID
- print 'Enter model ID: '
- model_id = gets.chomp
-
- # Save configuration to database
- Askcii::Config.set('api_key', api_key) unless api_key.empty?
- Askcii::Config.set('api_endpoint', api_endpoint) unless api_endpoint.empty?
- Askcii::Config.set('model_id', model_id) unless model_id.empty?
-
- puts 'Configuration saved successfully!'
- exit 0
- end
-
- context = ENV['ASKCII_SESSION']
- model_id = Askcii::Config.model_id || ENV['ASKCII_MODEL_ID']
- chat = Askcii::Chat.find_or_create(context: context, model_id: model_id).to_llm
-
- # Output last response if requested
- if options[:last_response]
- last_message = chat.messages.where(role: 'assistant').last
- if last_message
- puts last_message.content
- exit 0
- else
- puts 'No previous response found.'
- exit 1
- end
- end
-
- # Process input
- input = nil
- input = $stdin.read unless $stdin.tty?
-
- prompt = ARGV.join(' ')
-
- if prompt.empty?
- puts 'Usage:'
- puts " askcii [options] 'Your prompt here'"
- puts " echo 'Your prompt here' | askcii 'Your prompt here'"
- puts " askcii 'Your prompt here' < prompt.txt"
- puts ' askcii -r (to get the last response)'
- puts ' askcii -c (to configure API key, endpoint, and model ID)'
- puts "\nOptions:"
- puts ' -r, --last-response Output the last response'
- puts ' -c, --configure Configure API key, endpoint, and model ID'
- puts ' -h, --help Show help'
- exit 1
- end
-
- chat.with_instructions 'You are a command line application. Your responses should be suitable to be read in a terminal. Your responses should only include the necessary text. Do not include any explanations unless prompted for it.'
- prompt = "With the following text:\n\n#{input}\n\n#{prompt}" if input
-
- chat.ask(prompt) do |chunk|
- print chunk.content
- end
- puts ''
+ Askcii::Application.new.run
data/lib/askcii/application.rb ADDED
@@ -0,0 +1,70 @@
+ # frozen_string_literal: true
+
+ require_relative 'cli'
+ require_relative 'configuration_manager'
+ require_relative 'chat_session'
+
+ module Askcii
+ class Application
+ def initialize(args = ARGV.dup)
+ @cli = CLI.new(args)
+ end
+
+ def run
+ @cli.parse!
+
+ if @cli.show_help?
+ puts @cli.help_message
+ exit 0
+ end
+
+ if @cli.configure?
+ ConfigurationManager.new.run
+ exit 0
+ end
+
+ if @cli.show_usage?
+ puts @cli.usage_message
+ exit 1
+ end
+
+ selected_config = determine_configuration
+ configure_llm(selected_config)
+
+ chat_session = ChatSession.new(@cli.options, selected_config)
+ chat_session.handle_last_response if @cli.last_response?
+
+ input = read_stdin_input
+ chat_session.execute_chat(@cli.prompt, input)
+ end
+
+ private
+
+ def determine_configuration
+ if @cli.model_config_id
+ config = Askcii::Config.get_configuration(@cli.model_config_id)
+ return config if config
+ end
+
+ config = Askcii::Config.current_configuration
+ return config if config
+
+ # Fallback to environment variables
+ {
+ 'api_key' => ENV['ASKCII_API_KEY'],
+ 'api_endpoint' => ENV['ASKCII_API_ENDPOINT'],
+ 'model_id' => ENV['ASKCII_MODEL_ID']
+ }
+ end
+
+ def configure_llm(selected_config)
+ Askcii.configure_llm(selected_config)
+ end
+
+ def read_stdin_input
+ return nil if $stdin.tty?
+
+ $stdin.read
+ end
+ end
+ end
data/lib/askcii/chat_session.rb ADDED
@@ -0,0 +1,70 @@
+ # frozen_string_literal: true
+
+ require 'securerandom'
+
+ module Askcii
+ class ChatSession
+ def initialize(options, selected_config)
+ @options = options
+ @selected_config = selected_config
+ end
+
+ def handle_last_response
+ return unless @options[:last_response]
+
+ context = ENV['ASKCII_SESSION'] || SecureRandom.hex(8)
+ model_id = @selected_config['model_id']
+ chat_record = Askcii::Chat.find_or_create(context: context, model_id: model_id)
+
+ last_message = chat_record.messages.where(role: 'assistant').last
+ if last_message
+ puts last_message.content
+ exit 0
+ else
+ puts 'No previous response found.'
+ exit 1
+ end
+ end
+
+ def create_chat
+ if @options[:private]
+ create_private_chat
+ else
+ create_persistent_chat
+ end
+ end
+
+ def execute_chat(prompt, input = nil)
+ chat = create_chat
+
+ chat.with_instructions 'You are a command line application. Your responses should be suitable to be read in a terminal. Your responses should only include the necessary text. Do not include any explanations unless prompted for it.'
+
+ full_prompt = input ? "With the following text:\n\n#{input}\n\n#{prompt}" : prompt
+
+ chat.ask(full_prompt) do |chunk|
+ print chunk.content
+ end
+ puts ''
+ end
+
+ private
+
+ def create_private_chat
+ provider_symbol = @selected_config['provider'] ? @selected_config['provider'].to_sym : :openai
+ model_id = @selected_config['model_id']
+
+ RubyLLM.chat(
+ model: model_id,
+ provider: provider_symbol,
+ assume_model_exists: true
+ )
+ end
+
+ def create_persistent_chat
+ context = ENV['ASKCII_SESSION'] || SecureRandom.hex(8)
+ model_id = @selected_config['model_id']
+ chat_record = Askcii::Chat.find_or_create(context: context, model_id: model_id)
+ chat_record.to_llm
+ end
+ end
+ end
data/lib/askcii/cli.rb ADDED
@@ -0,0 +1,97 @@
+ # frozen_string_literal: true
+
+ require 'optparse'
+
+ module Askcii
+ class CLI
+ attr_reader :options, :prompt
+
+ def initialize(args = ARGV.dup)
+ @args = args
+ @options = {}
+ @prompt = nil
+ end
+
+ def parse!
+ option_parser.parse!(@args)
+ @prompt = @args.join(' ')
+ self
+ end
+
+ def show_help?
+ @options[:help]
+ end
+
+ def show_usage?
+ @prompt.empty? && !configure? && !last_response?
+ end
+
+ def configure?
+ @options[:configure]
+ end
+
+ def last_response?
+ @options[:last_response]
+ end
+
+ def private?
+ @options[:private]
+ end
+
+ def model_config_id
+ @options[:model_config_id]
+ end
+
+ def help_message
+ option_parser.to_s
+ end
+
+ def usage_message
+ <<~USAGE
+ Usage:
+ askcii [options] 'Your prompt here'
+ echo 'Your prompt here' | askcii 'Your prompt here'
+ askcii 'Your prompt here' < prompt.txt
+ askcii -p (start a private session)
+ askcii -r (to get the last response)
+ askcii -c (manage configurations)
+ askcii -m 2 (use configuration ID 2)
+
+ Options:
+ -p, --private Start a private session and do not record
+ -r, --last-response Output the last response
+ -c, --configure Manage configurations
+ -m, --model ID Use specific configuration ID
+ -h, --help Show help
+ USAGE
+ end
+
+ private
+
+ def option_parser
+ @option_parser ||= OptionParser.new do |opts|
+ opts.banner = "Usage: askcii [options] 'Your prompt here'"
+
+ opts.on('-p', '--private', 'Start a private session and do not record') do
+ @options[:private] = true
+ end
+
+ opts.on('-r', '--last-response', 'Output the last response') do
+ @options[:last_response] = true
+ end
+
+ opts.on('-c', '--configure', 'Manage configurations') do
+ @options[:configure] = true
+ end
+
+ opts.on('-m', '--model ID', 'Use specific configuration ID') do |model_id|
+ @options[:model_config_id] = model_id
+ end
+
+ opts.on('-h', '--help', 'Show this help message') do
+ @options[:help] = true
+ end
+ end
+ end
+ end
+ end
data/lib/askcii/configuration_manager.rb ADDED
@@ -0,0 +1,192 @@
+ # frozen_string_literal: true
+
+ module Askcii
+ class ConfigurationManager
+ PROVIDER_MAP = {
+ '1' => 'openai',
+ '2' => 'anthropic',
+ '3' => 'gemini',
+ '4' => 'deepseek',
+ '5' => 'openrouter',
+ '6' => 'ollama'
+ }.freeze
+
+ DEFAULT_ENDPOINTS = {
+ 'openai' => 'https://api.openai.com/v1',
+ 'anthropic' => 'https://api.anthropic.com',
+ 'gemini' => 'https://generativelanguage.googleapis.com/v1',
+ 'deepseek' => 'https://api.deepseek.com/v1',
+ 'openrouter' => 'https://openrouter.ai/api/v1',
+ 'ollama' => 'http://localhost:11434/v1'
+ }.freeze
+
+ def run
+ show_current_configurations
+ show_menu
+ handle_user_choice
+ end
+
+ private
+
+ def show_current_configurations
+ puts 'Configuration Management'
+ puts '======================'
+
+ configs = Askcii::Config.configurations
+ default_id = Askcii::Config.default_configuration_id
+
+ if configs.empty?
+ puts 'No configurations found.'
+ else
+ puts 'Current configurations:'
+ configs.each do |config|
+ marker = config['id'] == default_id ? ' (default)' : ''
+ provider_info = config['provider'] ? " [#{config['provider']}]" : ''
+ puts " #{config['id']}. #{config['name']}#{provider_info}#{marker}"
+ end
+ puts
+ end
+ end
+
+ def show_menu
+ puts 'Options:'
+ puts ' 1. Add new configuration'
+ puts ' 2. Set default configuration'
+ puts ' 3. Delete configuration'
+ puts ' 4. Exit'
+ print 'Select option (1-4): '
+ end
+
+ def handle_user_choice
+ choice = $stdin.gets.chomp
+
+ case choice
+ when '1'
+ add_new_configuration
+ when '2'
+ set_default_configuration
+ when '3'
+ delete_configuration
+ when '4'
+ puts 'Exiting.'
+ else
+ puts 'Invalid option.'
+ end
+ end
+
+ def add_new_configuration
+ print 'Enter configuration name: '
+ name = $stdin.gets.chomp
+
+ provider = select_provider
+ return unless provider
+
+ api_key = get_api_key(provider)
+ return unless api_key || provider == 'ollama'
+
+ endpoint = get_api_endpoint(provider)
+ model_id = get_model_id
+
+ return unless model_id
+
+ name = model_id if name.empty?
+ Askcii::Config.add_configuration(name, api_key || '', endpoint, model_id, provider)
+ puts 'Configuration added successfully!'
+ end
+
+ def select_provider
+ puts 'Select provider:'
+ puts ' 1. OpenAI'
+ puts ' 2. Anthropic'
+ puts ' 3. Gemini'
+ puts ' 4. DeepSeek'
+ puts ' 5. OpenRouter'
+ puts ' 6. Ollama (no API key needed)'
+ print 'Provider (1-6): '
+
+ provider_choice = $stdin.gets.chomp
+ provider = PROVIDER_MAP[provider_choice]
+
+ if provider.nil?
+ puts 'Invalid provider selection.'
+ return nil
+ end
+
+ provider
+ end
+
+ def get_api_key(provider)
+ return '' if provider == 'ollama'
+
+ print "Enter #{provider.capitalize} API key: "
+ api_key = $stdin.gets.chomp
+
+ if api_key.empty?
+ puts 'API key is required for this provider.'
+ return nil
+ end
+
+ api_key
+ end
+
+ def get_api_endpoint(provider)
+ default_endpoint = DEFAULT_ENDPOINTS[provider]
+ print "Enter API endpoint (default: #{default_endpoint}): "
+ api_endpoint = $stdin.gets.chomp
+ api_endpoint.empty? ? default_endpoint : api_endpoint
+ end
+
+ def get_model_id
+ print 'Enter model ID: '
+ model_id = $stdin.gets.chomp
+
+ if model_id.empty?
+ puts 'Model ID is required.'
+ return nil
+ end
+
+ model_id
+ end
+
+ def set_default_configuration
+ configs = Askcii::Config.configurations
+
+ if configs.empty?
+ puts 'No configurations available to set as default.'
+ return
+ end
+
+ print 'Enter configuration ID to set as default: '
+ new_default = $stdin.gets.chomp
+
+ if configs.any? { |c| c['id'] == new_default }
+ Askcii::Config.set_default_configuration(new_default)
+ puts "Configuration #{new_default} set as default."
+ else
+ puts 'Invalid configuration ID.'
+ end
+ end
+
+ def delete_configuration
+ configs = Askcii::Config.configurations
+
+ if configs.empty?
+ puts 'No configurations available to delete.'
+ return
+ end
+
+ print 'Enter configuration ID to delete: '
+ delete_id = $stdin.gets.chomp
+
+ if configs.any? { |c| c['id'] == delete_id }
+ if Askcii::Config.delete_configuration(delete_id)
+ puts "Configuration #{delete_id} deleted successfully."
+ else
+ puts 'Failed to delete configuration.'
+ end
+ else
+ puts 'Invalid configuration ID.'
+ end
+ end
+ end
+ end
data/lib/askcii/models/chat.rb CHANGED
@@ -5,9 +5,12 @@ module Askcii
  one_to_many :messages, class: 'Askcii::Message', key: :chat_id
 
  def to_llm
+ current_config = Askcii::Config.current_configuration
+ provider_symbol = current_config['provider'] ? current_config['provider'].to_sym : :openai
+
  @chat = RubyLLM.chat(
  model: model_id,
- provider: :openai,
+ provider: provider_symbol,
  assume_model_exists: true
  )
  messages.each do |msg|
data/lib/askcii/models/config.rb CHANGED
@@ -1,5 +1,7 @@
  # frozen_string_literal: true
 
+ require 'json'
+
  module Askcii
  class Config < Sequel::Model(Askcii.database[:configs])
  def self.set(key, value)
@@ -9,9 +11,10 @@ module Askcii
 
  def self.get(key)
  config = find(key: key)
- config ? config.value : nil
+ config&.value
  end
 
+ # Legacy methods for backward compatibility
  def self.api_key
  get('api_key')
  end
@@ -23,5 +26,89 @@ module Askcii
  def self.model_id
  get('model_id')
  end
+
+ # New multi-configuration methods
+ def self.configurations
+ where(Sequel.like(:key, 'config_%')).map do |config|
+ config_data = JSON.parse(config.value)
+ config_data.merge('id' => config.key.split('_', 2)[1])
+ end
+ rescue JSON::ParserError
+ []
+ end
+
+ def self.add_configuration(name, api_key, api_endpoint, model_id, provider)
+ config_data = {
+ 'name' => name,
+ 'api_key' => api_key,
+ 'api_endpoint' => api_endpoint,
+ 'model_id' => model_id,
+ 'provider' => provider
+ }
+
+ # Find the next available ID
+ existing_ids = configurations.map { |c| c['id'].to_i }.sort
+ next_id = existing_ids.empty? ? 1 : existing_ids.last + 1
+
+ set("config_#{next_id}", config_data.to_json)
+ end
+
+ def self.get_configuration(id)
+ config = get("config_#{id}")
+ return nil unless config
+
+ JSON.parse(config)
+ rescue JSON::ParserError
+ nil
+ end
+
+ def self.default_configuration_id
+ get('default_config_id') || '1'
+ end
+
+ def self.set_default_configuration(id)
+ set('default_config_id', id.to_s)
+ end
+
+ def self.delete_configuration(id)
+ config = find(key: "config_#{id}")
+ return false unless config
+
+ # Check if this is the default configuration
+ if default_configuration_id == id.to_s
+ # Reset default to the first remaining configuration
+ remaining_configs = configurations.reject { |c| c['id'] == id.to_s }
+ if remaining_configs.any?
+ set_default_configuration(remaining_configs.first['id'])
+ else
+ # If no configurations remain, clear the default
+ config_record = find(key: 'default_config_id')
+ config_record&.delete
+ end
+ end
+
+ config.delete
+ true
+ end
+
+ def self.current_configuration
+ default_id = default_configuration_id
+ config = get_configuration(default_id)
+
+ # Fallback to legacy configuration if no multi-configs exist
+ if config.nil? && configurations.empty?
+ {
+ 'api_key' => api_key,
+ 'api_endpoint' => api_endpoint,
+ 'model_id' => model_id,
+ 'provider' => 'openai'
+ }
+ else
+ # Ensure provider is set for backward compatibility
+ config ||= {}
+ config['provider'] ||= 'openai'
+ config
+ end
+ end
  end
  end
data/lib/askcii/models/message.rb CHANGED
@@ -7,7 +7,7 @@ module Askcii
  def to_llm
  RubyLLM::Message.new(
  role: role.to_sym,
- content: content,
+ content: content.to_s.encode('UTF-8', undef: :replace),
  tool_calls: {},
  tool_call_id: nil,
  input_tokens: input_tokens,
data/lib/askcii/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module Askcii
- VERSION = '0.1.0'
+ VERSION = '0.2.0'
  end
data/lib/askcii.rb CHANGED
@@ -3,7 +3,6 @@
  require 'sequel'
  require 'fileutils'
  require 'ruby_llm'
- require 'ruby_llm/model_info'
  require_relative './askcii/version'
 
  module Askcii
@@ -22,46 +21,75 @@ module Askcii
 
  # Initialize the database
  def self.setup_database
- database.create_table :chats do
- primary_key :id
- String :model_id, null: true
- String :context, null: true
- Datetime :created_at, null: false, default: Sequel::CURRENT_TIMESTAMP
- end unless database.table_exists?(:chats)
+ unless database.table_exists?(:chats)
+ database.create_table :chats do
+ primary_key :id
+ String :model_id, null: true
+ String :context, null: true
+ Datetime :created_at, null: false, default: Sequel::CURRENT_TIMESTAMP
+ end
+ end
 
- database.create_table :messages do
- primary_key :id
- foreign_key :chat_id, :chats, null: false
- String :role, null: true
- Text :content, null: true
- String :model_id, null: true
- Integer :input_tokens, null: true
- Integer :output_tokens, null: true
- Datetime :created_at, null: false, default: Sequel::CURRENT_TIMESTAMP
- end unless database.table_exists?(:messages)
+ unless database.table_exists?(:messages)
+ database.create_table :messages do
+ primary_key :id
+ foreign_key :chat_id, :chats, null: false
+ String :role, null: true
+ Text :content, null: true
+ String :model_id, null: true
+ Integer :input_tokens, null: true
+ Integer :output_tokens, null: true
+ Datetime :created_at, null: false, default: Sequel::CURRENT_TIMESTAMP
+ end
+ end
+
+ return if database.table_exists?(:configs)
 
  database.create_table :configs do
  primary_key :id
  String :key, null: false, unique: true
  Text :value, null: true
- end unless database.table_exists?(:configs)
+ end
  end
 
- def self.configure_llm
+ def self.configure_llm(selected_config = nil)
  RubyLLM.configure do |config|
  config.log_file = '/dev/null'
 
- # Try to get configuration from the database first, then fallback to ENV variables
- config.openai_api_key = begin
- Askcii::Config.api_key || ENV['ASKCII_API_KEY'] || 'blank'
- rescue StandardError
- ENV['ASKCII_API_KEY'] || 'blank'
- end
+ if selected_config
+ provider = selected_config['provider'] || 'openai'
+ api_key = selected_config['api_key']
 
- config.openai_api_base = begin
- Askcii::Config.api_endpoint || ENV['ASKCII_API_ENDPOINT'] || 'http://localhost:11434/v1'
- rescue StandardError
- ENV['ASKCII_API_ENDPOINT'] || 'http://localhost:11434/v1'
+ # Set the appropriate API key based on provider
+ case provider.downcase
+ when 'openai'
+ config.openai_api_key = api_key || 'blank'
+ config.openai_api_base = selected_config['api_endpoint'] || 'https://api.openai.com/v1'
+ when 'anthropic'
+ config.anthropic_api_key = api_key || 'blank'
+ when 'gemini'
+ config.gemini_api_key = api_key || 'blank'
+ when 'deepseek'
+ config.deepseek_api_key = api_key || 'blank'
+ when 'openrouter'
+ config.openrouter_api_key = api_key || 'blank'
+ when 'ollama'
+ # Ollama doesn't need an API key
+ config.openai_api_base = selected_config['api_endpoint'] || 'http://localhost:11434/v1'
+ end
+ else
+ # Legacy configuration fallback
+ config.openai_api_key = begin
+ Askcii::Config.api_key || ENV['ASKCII_API_KEY'] || 'blank'
+ rescue StandardError
+ ENV['ASKCII_API_KEY'] || 'blank'
+ end
+
+ config.openai_api_base = begin
+ Askcii::Config.api_endpoint || ENV['ASKCII_API_ENDPOINT'] || 'http://localhost:11434/v1'
+ rescue StandardError
+ ENV['ASKCII_API_ENDPOINT'] || 'http://localhost:11434/v1'
+ end
  end
  end
  end
@@ -71,4 +99,8 @@ module Askcii
  require_relative './askcii/models/message'
  require_relative './askcii/models/config'
  end
+
+ def self.require_application
+ require_relative './askcii/application'
+ end
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: askcii
  version: !ruby/object:Gem::Version
- version: 0.1.0
+ version: 0.2.0
  platform: ruby
  authors:
  - Roel Bondoc
@@ -10,47 +10,47 @@ cert_chain: []
  date: 1980-01-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
- name: sequel
+ name: amalgalite
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '5.92'
+ version: '1.9'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '5.92'
+ version: '1.9'
  - !ruby/object:Gem::Dependency
- name: amalgalite
+ name: ruby_llm
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - '='
  - !ruby/object:Gem::Version
- version: '1.9'
+ version: 1.3.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - '='
  - !ruby/object:Gem::Version
- version: '1.9'
+ version: 1.3.0
  - !ruby/object:Gem::Dependency
- name: ruby_llm
+ name: sequel
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - '='
+ - - "~>"
  - !ruby/object:Gem::Version
- version: 1.3.0rc1
+ version: '5.92'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - '='
+ - - "~>"
  - !ruby/object:Gem::Version
- version: 1.3.0rc1
+ version: '5.92'
  - !ruby/object:Gem::Dependency
  name: minitest
  requirement: !ruby/object:Gem::Requirement
@@ -98,6 +98,10 @@ files:
  - bin/console
  - bin/setup
  - lib/askcii.rb
+ - lib/askcii/application.rb
+ - lib/askcii/chat_session.rb
+ - lib/askcii/cli.rb
+ - lib/askcii/configuration_manager.rb
  - lib/askcii/models/chat.rb
  - lib/askcii/models/config.rb
  - lib/askcii/models/message.rb
@@ -122,7 +126,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.8
+ rubygems_version: 3.6.9
  specification_version: 4
  summary: Command line application for LLM interactions
  test_files: []