askcii 0.1.0 → 0.3.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 18a37a3681323f04071f337d1b0e8f7b0351fe2879de62224951d4b4c8c7d8d4
-   data.tar.gz: 3e994f5ea755fed88a74fd07be374a412825c8e2227cc048f5c2d2dc219d2b5b
+   metadata.gz: fdbabf6bbd313aa11a7868b5df4630a85798e6a59311872bc04cebeaa2ac49e3
+   data.tar.gz: cff4449b247c8dcfe5273e122b054d72e54f57866f905854fd94aa39ee8e8547
  SHA512:
-   metadata.gz: 9c7c23cc79eb32819c310774894800f6f77cde9fa8466ac4f5674ebeaa626b006f39a487a8d99b8324b9e152a2f8393ecea0c61a9c1e822148c63732c4e9ea64
-   data.tar.gz: e10d8a1726d612e6aeab85aaf717d48d6616eac31dcd744b56a761fad2e0ce71c1f8ad4e2e574590817e1c6a1b12f87c772793a72d1747e69095c9868547e548
+   metadata.gz: 5ca4009158417d0eee5a4452c4ec5c49fefedabdeb22495a2f6b6d08136391f19535e9468113c164f714e0a382eb8b57244e56027d6b824f23132fea679a4d29
+   data.tar.gz: ee32e1ab823d7a2dc53445e3fa7cf064993272571512f6c4b18966bbb08c87163158262387521d073d5bfa0c158f80b8ab808f05c98f8e330aaaa555122814f2
data/Gemfile.lock CHANGED
@@ -1,9 +1,9 @@
  PATH
    remote: .
    specs:
-     askcii (0.1.0)
+     askcii (0.3.0)
        amalgalite (~> 1.9)
-       ruby_llm (= 1.3.0rc1)
+       ruby_llm (= 1.5.1)
        sequel (~> 5.92)

  GEM
@@ -14,6 +14,7 @@ GEM
      arrayfields (4.9.2)
      base64 (0.2.0)
      bigdecimal (3.1.9)
+     csv (3.3.5)
      event_stream_parser (1.0.0)
      faraday (2.13.1)
        faraday-net_http (>= 2.0, < 3.5)
@@ -27,18 +28,21 @@ GEM
        faraday (~> 2.0)
      json (2.12.0)
      logger (1.7.0)
+     marcel (1.0.4)
      minitest (5.25.5)
      multipart-post (2.4.1)
      net-http (0.6.0)
        uri
+     ostruct (0.6.3)
      rake (13.2.1)
-     ruby_llm (1.3.0rc1)
+     ruby_llm (1.5.1)
        base64
        event_stream_parser (~> 1)
-       faraday (~> 2)
-       faraday-multipart (~> 1)
-       faraday-net_http (~> 3)
-       faraday-retry (~> 2)
+       faraday (>= 1.10.0)
+       faraday-multipart (>= 1)
+       faraday-net_http (>= 1)
+       faraday-retry (>= 1)
+       marcel (~> 1.0)
        zeitwerk (~> 2)
      sequel (5.92.0)
        bigdecimal
@@ -59,7 +63,9 @@ PLATFORMS

  DEPENDENCIES
    askcii!
+   csv (~> 3.0)
    minitest (~> 5.25)
+   ostruct (~> 0.6)
    rake (~> 13.0)

  BUNDLED WITH
data/README.md CHANGED
@@ -1,6 +1,6 @@
  # Askcii

- A command-line application for interacting with LLM models in a terminal-friendly way.
+ A command-line application for interacting with multiple LLM providers in a terminal-friendly way. Supports OpenAI, Anthropic, Gemini, DeepSeek, OpenRouter, and Ollama with multi-configuration management.

  ## Installation

@@ -20,52 +20,198 @@ Or install it yourself as:

  ## Usage

- ```
+ ### Basic Usage
+
+ ```bash
  # Basic usage
  askcii 'Your prompt here'

- # Pipe input
- echo 'Your context text' | askcii 'Your prompt here'
+ # Pipe input from other commands
+ echo 'Your context text' | askcii 'Analyze this text'

  # File input
- askcii 'Your prompt here' < input.txt
+ askcii 'Summarize this document' < document.txt
+
+ # Private session (no conversation history saved)
+ askcii -p 'Tell me a joke'
+
+ # Get the last response from your current session
+ askcii -r
+
+ # Use a specific configuration
+ askcii -m 2 'Hello using configuration 2'
+ ```
+
+ ### Session Management

+ ```bash
  # Set a custom session ID to maintain conversation context
- ASKCII_SESSION_ID="project-research" askcii 'What did we talk about earlier?'
+ ASKCII_SESSION="project-research" askcii 'What did we talk about earlier?'

- # Or add the following to your .bashrc or .zshrc to always start a new session
+ # Generate a new session for each terminal session
  export ASKCII_SESSION=$(openssl rand -hex 16)
+ ```
+
+ ### Command Line Options
+
+ ```
+ Usage: askcii [options] 'Your prompt here'
+
+ Options:
+   -p, --private          Start a private session and do not record
+   -r, --last-response    Output the last response
+   -c, --configure        Manage configurations
+   -m, --model ID         Use specific configuration ID
+   -h, --help             Show help
+ ```
+
+ ## Configuration Management
+
+ Askcii supports multi-configuration management, allowing you to easily switch between different LLM providers and models.

- # Configure the API key, endpoint, and model ID
+ ### Interactive Configuration
+
+ Use the `-c` flag to access the configuration management interface:
+
+ ```bash
  askcii -c
+ ```

- # Get the last response
- askcii -r
+ This will show you a menu like:
+
+ ```
+ Configuration Management
+ =======================
+ Current configurations:
+   1. GPT-4 [openai] (default)
+   2. Claude Sonnet [anthropic]
+   3. Gemini Pro [gemini]
+   4. Local Llama [ollama]
+
+ Options:
+   1. Add new configuration
+   2. Set default configuration
+   3. Delete configuration
+   4. Exit
  ```

- ## Configuration
+ ### Supported Providers
+
+ 1. **OpenAI** - GPT models (gpt-4, gpt-3.5-turbo, etc.)
+ 2. **Anthropic** - Claude models (claude-3-sonnet, claude-3-haiku, etc.)
+ 3. **Gemini** - Google's Gemini models
+ 4. **DeepSeek** - DeepSeek models
+ 5. **OpenRouter** - Access to multiple models through OpenRouter
+ 6. **Ollama** - Local models (no API key required)

- You can configure your API key, endpoint, and model ID using the `-c` option:
+ ### Adding a New Configuration

+ When adding a configuration, you'll be prompted for:
+
+ 1. **Configuration name** - A friendly name for this configuration
+ 2. **Provider** - Choose from the supported providers
+ 3. **API key** - Your API key for the provider (not needed for Ollama)
+ 4. **API endpoint** - The API endpoint (defaults provided for each provider)
+ 5. **Model ID** - The specific model to use
+
+ Example for OpenAI:
  ```
- $ askcii -c
- Configuring askcii...
- Enter API key: your_api_key_here
- Enter API endpoint: http://localhost:11434/v1
- Enter model ID: gemma3:12b
- Configuration saved successfully!
+ Enter configuration name: GPT-4 Turbo
+ Provider (1-6): 1
+ Enter OpenAI API key: sk-your-api-key-here
+ Enter API endpoint (default: https://api.openai.com/v1): [press enter for default]
+ Enter model ID: gpt-4-turbo-preview
  ```

- Configuration settings are stored in a SQLite database located at `~/.local/share/askcii/askcii.db`.
+ Example for Ollama (local):
+ ```
+ Enter configuration name: Local Llama
+ Provider (1-6): 6
+ Enter API endpoint (default: http://localhost:11434/v1): [press enter for default]
+ Enter model ID: llama3:8b
+ ```
+
+ ### Using Configurations
+
+ ```bash
+ # Use the default configuration
+ askcii 'Hello world'

- You can also use environment variables to override the stored configuration:
+ # Use a specific configuration by ID
+ askcii -m 2 'Hello using configuration 2'

+ # List all configurations
+ askcii -c
  ```
+
+ ### Configuration Storage
+
+ Configuration settings are stored in a SQLite database located at `~/.local/share/askcii/askcii.db`.
+
+ ### Environment Variable Fallback
+
+ You can still use environment variables as a fallback when no configurations are set up:
+
+ ```bash
  ASKCII_API_KEY="your_api_key" askcii 'Hello!'
  ASKCII_API_ENDPOINT="https://api.example.com/v1" askcii 'Hello!'
  ASKCII_MODEL_ID="gpt-4" askcii 'Hello!'
  ```

+ ## Examples
+
+ ### Daily Workflow Examples
+
+ ```bash
+ # Code review
+ git diff | askcii 'Review this code change and suggest improvements'
+
+ # Log analysis
+ tail -100 /var/log/app.log | askcii 'Summarize any errors or issues'
+
+ # Documentation
+ askcii 'Explain how to set up a Redis cluster' > redis-setup.md
+
+ # Quick calculations
+ askcii 'Calculate compound interest: $1000 at 5% for 10 years'
+
+ # Text processing
+ cat customers.csv | askcii 'Convert this CSV to JSON format'
+ ```
+
+ ### Multi-Provider Usage
+
+ ```bash
+ # Use Claude for creative writing
+ askcii -m 1 'Write a short story about AI'
+
+ # Use GPT-4 for code analysis
+ askcii -m 2 'Explain this Python function' < function.py
+
+ # Use local Ollama for private data
+ askcii -m 3 -p 'Analyze this sensitive document' < confidential.txt
+ ```
+
+ ## Session Management
+
+ Askcii maintains conversation history unless you use the private mode (`-p`). Sessions are identified by:
+
+ 1. The `ASKCII_SESSION` environment variable
+ 2. A randomly generated session ID if not specified
+
+ ```bash
+ # Start a named session for a project
+ export ASKCII_SESSION="project-alpha"
+ askcii 'What is the project timeline?'
+ askcii 'What are the main risks?' # This will have context from previous question
+
+ # Use private mode for one-off questions
+ askcii -p 'What is the weather like?'
+
+ # Get the last response from current session
+ askcii -r
+ ```
+
  ## Development

  After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
data/Rakefile CHANGED
@@ -1,8 +1,24 @@
  # frozen_string_literal: true

  require 'bundler/gem_tasks'
- require 'rspec/core/rake_task'
+ require 'rake/testtask'

- RSpec::Core::RakeTask.new(:spec)
+ Rake::TestTask.new(:test) do |t|
+   t.libs << 'test'
+   t.libs << 'lib'
+   t.test_files = FileList['test/**/*_test.rb']
+ end

- task default: :spec
+ Rake::TestTask.new(:comprehensive) do |t|
+   t.libs << 'test'
+   t.libs << 'lib'
+   t.test_files = FileList['test/comprehensive_test.rb']
+ end
+
+ Rake::TestTask.new(:simple) do |t|
+   t.libs << 'test'
+   t.libs << 'lib'
+   t.test_files = FileList['test/simple_test.rb']
+ end
+
+ task default: :comprehensive
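The new Rakefile replaces the single RSpec task with three Minitest task groups, with plain `rake` now running `:comprehensive`. If you need to trigger one of these tasks from Ruby rather than the rake CLI (for example from a CI script), a minimal sketch, assuming it runs from the gem root where this Rakefile lives:

require 'rake'

# Stand up a Rake application and load the gem's Rakefile so the
# :test, :comprehensive and :simple tasks defined above exist.
app = Rake.application
app.init
app.load_rakefile

# Equivalent to running plain `rake` (the default task is :comprehensive).
Rake::Task[:comprehensive].invoke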
data/askcii.gemspec CHANGED
@@ -6,7 +6,7 @@ Gem::Specification.new do |spec|
    spec.name = 'askcii'
    spec.version = Askcii::VERSION
    spec.authors = ['Roel Bondoc']
-   spec.email = ['roelbondoc@example.com']
+   spec.email = ['rsbondoc@gmail.com']

    spec.summary = 'Command line application for LLM interactions'
    spec.description = 'A terminal-friendly interface for interacting with LLM models'
@@ -26,10 +26,12 @@ Gem::Specification.new do |spec|
    spec.executables = ['askcii']
    spec.require_paths = ['lib']

-   spec.add_dependency 'sequel', '~> 5.92'
    spec.add_dependency 'amalgalite', '~> 1.9'
-   spec.add_dependency 'ruby_llm', '1.3.0rc1'
+   spec.add_dependency 'ruby_llm', '1.5.1'
+   spec.add_dependency 'sequel', '~> 5.92'

+   spec.add_development_dependency 'csv', '~> 3.0'
    spec.add_development_dependency 'minitest', '~> 5.25'
+   spec.add_development_dependency 'ostruct', '~> 0.6'
    spec.add_development_dependency 'rake', '~> 13.0'
  end
data/bin/askcii CHANGED
@@ -4,99 +4,9 @@
  # Find the right load path
  $LOAD_PATH.unshift File.expand_path('../lib', __dir__)
  require 'askcii'
- require 'optparse'

  Askcii.setup_database
- Askcii.configure_llm
  Askcii.require_models
+ Askcii.require_application

- # Parse command-line options
- options = {}
- opt_parser = OptionParser.new do |opts|
-   opts.banner = "Usage: askcii [options] 'Your prompt here'"
-
-   opts.on('-r', '--last-response', 'Output the last response') do
-     options[:last_response] = true
-   end
-
-   opts.on('-c', '--configure', 'Configure API key, endpoint, and model ID') do
-     options[:configure] = true
-   end
-
-   opts.on('-h', '--help', 'Show this help message') do
-     puts opts
-     exit
-   end
- end
-
- # Parse options, keeping remaining arguments in ARGV
- opt_parser.parse!
-
- # Handle configuration if requested
- if options[:configure]
-   puts 'Configuring askcii...'
-
-   # Prompt for API key
-   print 'Enter API key: '
-   api_key = gets.chomp
-
-   # Prompt for API endpoint
-   print "Enter API endpoint: "
-   api_endpoint = gets.chomp
-
-   # Prompt for model ID
-   print 'Enter model ID: '
-   model_id = gets.chomp
-
-   # Save configuration to database
-   Askcii::Config.set('api_key', api_key) unless api_key.empty?
-   Askcii::Config.set('api_endpoint', api_endpoint) unless api_endpoint.empty?
-   Askcii::Config.set('model_id', model_id) unless model_id.empty?
-
-   puts 'Configuration saved successfully!'
-   exit 0
- end
-
- context = ENV['ASKCII_SESSION']
- model_id = Askcii::Config.model_id || ENV['ASKCII_MODEL_ID']
- chat = Askcii::Chat.find_or_create(context: context, model_id: model_id).to_llm
-
- # Output last response if requested
- if options[:last_response]
-   last_message = chat.messages.where(role: 'assistant').last
-   if last_message
-     puts last_message.content
-     exit 0
-   else
-     puts 'No previous response found.'
-     exit 1
-   end
- end
-
- # Process input
- input = nil
- input = $stdin.read unless $stdin.tty?
-
- prompt = ARGV.join(' ')
-
- if prompt.empty?
-   puts 'Usage:'
-   puts "  askcii [options] 'Your prompt here'"
-   puts "  echo 'Your prompt here' | askcii 'Your prompt here'"
-   puts "  askcii 'Your prompt here' < prompt.txt"
-   puts '  askcii -r (to get the last response)'
-   puts '  askcii -c (to configure API key, endpoint, and model ID)'
-   puts "\nOptions:"
-   puts '  -r, --last-response  Output the last response'
-   puts '  -c, --configure      Configure API key, endpoint, and model ID'
-   puts '  -h, --help           Show help'
-   exit 1
- end
-
- chat.with_instructions 'You are a command line application. Your responses should be suitable to be read in a terminal. Your responses should only include the necessary text. Do not include any explanations unless prompted for it.'
- prompt = "With the following text:\n\n#{input}\n\n#{prompt}" if input
-
- chat.ask(prompt) do |chunk|
-   print chunk.content
- end
- puts ''
+ Askcii::Application.new.run
data/lib/askcii/application.rb ADDED
@@ -0,0 +1,86 @@
+ # frozen_string_literal: true
+
+ require_relative 'cli'
+ require_relative 'configuration_manager'
+ require_relative 'chat_session'
+
+ module Askcii
+   class Application
+     def initialize(args = ARGV.dup)
+       @cli = CLI.new(args)
+     end
+
+     def run
+       @cli.parse!
+
+       if @cli.show_help?
+         puts @cli.help_message
+         exit 0
+       end
+
+       if @cli.configure?
+         ConfigurationManager.new.run
+         exit 0
+       end
+
+       selected_config = determine_configuration
+       configure_llm(selected_config)
+
+       chat_session = ChatSession.new(@cli.options, selected_config)
+       chat_session.handle_last_response if @cli.last_response?
+
+       prompt, input = determine_prompt_and_input
+
+       if prompt.empty? && input.nil?
+         puts @cli.usage_message
+         exit 1
+       end
+
+       chat_session.execute_chat(prompt, input)
+     end
+
+     private
+
+     def determine_configuration
+       if @cli.model_config_id
+         config = Askcii::Config.get_configuration(@cli.model_config_id)
+         return config if config
+       end
+
+       config = Askcii::Config.current_configuration
+       return config if config
+
+       # Fallback to environment variables
+       {
+         'api_key' => ENV['ASKCII_API_KEY'],
+         'api_endpoint' => ENV['ASKCII_API_ENDPOINT'],
+         'model_id' => ENV['ASKCII_MODEL_ID']
+       }
+     end
+
+     def configure_llm(selected_config)
+       Askcii.configure_llm(selected_config)
+     end
+
+     def determine_prompt_and_input
+       stdin_content = read_stdin_input
+
+       if @cli.prompt.empty? && stdin_content
+         # No prompt provided via args, use stdin as prompt
+         [stdin_content.strip, nil]
+       elsif !@cli.prompt.empty? && stdin_content
+         # Both prompt and stdin provided, use stdin as input context
+         [@cli.prompt, stdin_content]
+       else
+         # Only prompt provided (or neither)
+         [@cli.prompt, nil]
+       end
+     end
+
+     def read_stdin_input
+       return nil if $stdin.tty?
+
+       $stdin.read
+     end
+   end
+ end
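The `determine_prompt_and_input` method above encodes the CLI's input contract: stdin alone becomes the prompt, while stdin combined with an argument prompt becomes context for that prompt. A minimal Minitest sketch of that contract (a hypothetical test, not shipped in the gem; the `route` helper mirrors the method body):

require 'minitest/autorun'

class PromptRoutingTest < Minitest::Test
  # Mirrors Askcii::Application#determine_prompt_and_input.
  def route(prompt, stdin_content)
    if prompt.empty? && stdin_content
      [stdin_content.strip, nil] # stdin alone becomes the prompt
    elsif !prompt.empty? && stdin_content
      [prompt, stdin_content]    # stdin becomes context for the prompt
    else
      [prompt, nil]              # argument prompt only (or neither)
    end
  end

  def test_stdin_only_becomes_prompt
    assert_equal ['hello', nil], route('', "hello\n")
  end

  def test_stdin_with_prompt_becomes_context
    assert_equal ['Summarize', 'doc text'], route('Summarize', 'doc text')
  end

  def test_prompt_only_passes_through
    assert_equal ['hi', nil], route('hi', nil)
  end
end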
data/lib/askcii/chat_session.rb ADDED
@@ -0,0 +1,70 @@
+ # frozen_string_literal: true
+
+ require 'securerandom'
+
+ module Askcii
+   class ChatSession
+     def initialize(options, selected_config)
+       @options = options
+       @selected_config = selected_config
+     end
+
+     def handle_last_response
+       return unless @options[:last_response]
+
+       context = ENV['ASKCII_SESSION'] || SecureRandom.hex(8)
+       model_id = @selected_config['model_id']
+       chat_record = Askcii::Chat.find_or_create(context: context, model_id: model_id)
+
+       last_message = chat_record.messages.select { |msg| msg.role == 'assistant' }.last
+       if last_message
+         puts last_message.content
+         exit 0
+       else
+         puts 'No previous response found.'
+         exit 1
+       end
+     end
+
+     def create_chat
+       if @options[:private]
+         create_private_chat
+       else
+         create_persistent_chat
+       end
+     end
+
+     def execute_chat(prompt, input = nil)
+       chat = create_chat
+
+       chat.with_instructions 'You are a command line application. Your responses should be suitable to be read in a terminal. Your responses should only include the necessary text. Do not include any explanations unless prompted for it.'
+
+       full_prompt = input ? "With the following text:\n\n#{input}\n\n#{prompt}" : prompt
+
+       chat.ask(full_prompt) do |chunk|
+         print chunk.content
+       end
+       puts ''
+     end
+
+     private
+
+     def create_private_chat
+       provider_symbol = @selected_config['provider'] ? @selected_config['provider'].to_sym : :openai
+       model_id = @selected_config['model_id']
+
+       RubyLLM.chat(
+         model: model_id,
+         provider: provider_symbol,
+         assume_model_exists: true
+       )
+     end
+
+     def create_persistent_chat
+       context = ENV['ASKCII_SESSION'] || SecureRandom.hex(8)
+       model_id = @selected_config['model_id']
+       chat_record = Askcii::Chat.find_or_create(context: context, model_id: model_id)
+       chat_record.to_llm
+     end
+   end
+ end
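End to end, `Application` resolves a configuration (from the `-m` flag, the stored default, or environment variables) and hands it to `ChatSession`. A rough sketch of driving that hand-off manually, assuming the database is initialized, a local Ollama server is running, and the configuration hash values are placeholders:

require 'askcii'

Askcii.setup_database
Askcii.require_models
Askcii.require_application

# Placeholder configuration; a real one comes from Askcii::Config or ENV.
config = {
  'provider' => 'ollama',
  'api_endpoint' => 'http://localhost:11434/v1',
  'model_id' => 'llama3:8b',
  'api_key' => nil
}

Askcii.configure_llm(config)

# private: true routes through RubyLLM.chat directly, so nothing is persisted.
session = Askcii::ChatSession.new({ private: true }, config)
session.execute_chat('Say hello in one line')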