ollama-ruby 0.1.0 → 0.3.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 9f88e75feb900230387f4b960f5762daa8622af0f9562c0145d6b22c4a01fb3d
- data.tar.gz: 4c6fea6ddc54c83ac6d003d4e5f3ccb6b90195b79a97aea6d98a95e32db36f99
+ metadata.gz: 510c31683b2251118a7c3b469620400016b29fd1ea6f17cc86e4f99f62ecea2f
+ data.tar.gz: a00c4275e42002a01a99f874a855386c83e502551360c1f4146eee0b74f2fd08
  SHA512:
- metadata.gz: 9002b36cd15680f89c50d6f3ee34b3a4e5d77362cd4982d4ca41fc9abbd0bcd1d44483579686f3bf9f20f9e10b1309c28ae8da789ba8ada66c7b5f6384643f87
- data.tar.gz: e2566f950aea15cf47d60381340db2b0e7a8e3a2fd79aaa8ee421f6cbd367baeae0f64b24e5d9baa801bd715695edf80c5a88d0ba41867c9358bc3076d1ed112
+ metadata.gz: 17e50d40e4c24b56b5c2923f0b73f3e4294587b61a302fbaf4cd8f891e879eb435541e1f7e195a4f88352fd99aa80ff44c8577b8c195d7ef31af1c33229018b7
+ data.tar.gz: a18bcef82e9481b75fee4a4c88598a7a374c48f841e9870b39174c0d1f5f235f1173bdbff939182481d431e4e9c8c85501be4f5ef917ca02865692cfc39fde5a
data/CHANGES.md ADDED
@@ -0,0 +1,105 @@
+ # Changes
+
+ ## 2024-09-05 v0.3.0
+
+ * **New Features**
+   * Created new file `ollama_cli` with Ollama CLI functionality.
+   * Added executable `ollama_cli` to s.executables in ollama-ruby.gemspec.
+   * Added `find_where` method in `documents.rb` to filter records by text size
+     and count.
+   * Added test for `find_where` method in `documents_spec.rb`.
+   * Features for `ollama_chat`
+     * Added `found_texts_count` option to `OllamaChatConfig`.
+     * Implemented `parse_rss` method for RSS feeds and `parse_atom` method
+       for Atom feeds.
+     * Added links to titles in RSS feed item summaries and Atom feed item
+       summaries.
+     * Updated `parse_source` method to handle different content types,
+       including HTML, XML, and RSS/Atom feeds.
+     * Added `/web [n] query` command to search the web and return n or 1
+       results in the chat interface.
+ * **Improvements**
+   * Improved validation for system prompts
+   * Extracted file argument handling into a separate module and method
+   * Added default value for config or model system prompt
+   * Improved input validation for `system_prompt` path
+   * Updated collection clearing logic to accept an optional tags parameter
+   * Updated `Tags` class to overload `to_a` method for converting to an array
+     of strings
+
+ ## 2024-09-03 v0.2.0
+
+ ### Changes
+
+ * **Added Web Search Functionality to `ollama_chat`**
+   + Added `/web` command to fetch search results from DuckDuckGo
+   + Updated `/summarize` command to handle cases where summarization fails
+   + Fixed a bug in parsing the content type of the source document
+ * **Refactored Options Class and Usage**
+   + Renamed `options` variable to use `Options[]` method in ollama_chat script
+   + Added `[](value)` method to Ollama::Options class for casting hashes
+   + Updated options_spec.rb with tests for casting hashes and error handling
+ * **Refactored Web Search Command**
+   + Added support for specifying a page number in `/web` command
+   + Updated regular expression to match the new format
+   + Passed page number as an argument to `search_web` method
+   + Updated content string to reference the query and sources correctly
+ * **DTO Class Changes**
+   + Renamed `json_create` method to `from_hash` in Ollama::DTO class
+   + Updated `as_json` method to remove the now unnecessary hash creation
+ * **Message and Tool Spec Changes**
+   + Removed `json_class` from JSON serialization in message_spec
+   + Removed `json_class` from JSON serialization in tool_spec
+ * **Command Spec Changes**
+   + Removed `json_class` from JSON serialization in various command specs
+     (e.g. generate_spec, pull_spec, etc.)
+ * **Miscellaneous Changes**
+   + Improved width calculation for text truncation
+   + Updated FollowChat class to display evaluation statistics
+   + Updated OllamaChatConfig to use EOT instead of end for heredoc syntax
+   + Added .keep file to tmp directory
+
+ ## 2024-08-30 v0.1.0
+
+ ### Change Log for New Version
+
+ #### Significant Changes
+
+ * **Document Splitting and Embedding Functionality**: Added `Ollama::Documents` class with methods for adding documents, checking existence, deleting documents, and finding similar documents.
+   + Introduced two types of caches: `MemoryCache` and `RedisCache`
+   + Implemented `SemanticSplitter` class to split text into sentences based on semantic similarity
+ * **Improved Ollama Chat Client**: Added support for document embeddings and web/file RAG
+   + Allowed configuration via YAML file
+   + Parsed user input for URLs or files to send images to multimodal models
+ * **Redis Docker Service**: Set `REDIS_URL` environment variable to `redis://localhost:9736`
+   + Added Redis service to `docker-compose.yml`
+ * **Status Display and Progress Updates**: Added `infobar.label = response.status` when available
+   + Updated infobar with a progress message on each call if total and completed are set
+   + Displayed error message from `response.error` if present
+ * **Refactored Chat Commands**: Simplified regular expression patterns for `/pop`, `/save`, `/load`, and `/image` commands
+   + Added whitespace to some command patterns for better readability
+
+ #### Other Changes
+
+ * Added `Character` and `RecursiveCharacter` splitter classes to split text into chunks based on character separators
+ * Added RSpec tests for the Ollama::Documents class(es)
+ * Updated dependencies and added new methods for calculating breakpoint thresholds and sentence embeddings
+ * Added `ollama_update` to executables in Rakefile
+ * Started using webmock
+ * Refactored chooser and added fetcher specs
+ * Added tests for Ollama::Utils::Fetcher
+ * Updated README.md
+
+ ## 2024-08-16 v0.0.1
+
+ * **New Features**
+   + Added missing options parameter to Embed command
+   + Documented new `/api/embed` endpoint
+ * **Improvements**
+   + Improved example in README.md
+ * **Code Refactoring**
+   + Renamed `client` to `ollama` in client and command specs
+   + Updated expectations to use `ollama` instead of `client`
+
+ ## 2024-08-12 v0.0.0
+
+ * Start
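The `Options[]` cast mentioned under v0.2.0 can be pictured with a plain-Ruby stand-in. `SketchOptions` below is a hypothetical name, not the gem's class; it only sketches the hash-casting behavior the changelog describes (building an options object from a string-keyed hash and rejecting unknown keys):

```ruby
# Hypothetical sketch of the Options.[] casting described above.
class SketchOptions
  ATTRS = %i[ temperature num_ctx seed ]

  attr_reader(*ATTRS)

  def initialize(**opts)
    opts.each do |key, value|
      ATTRS.include?(key) or raise ArgumentError, "unknown option #{key}"
      instance_variable_set("@#{key}", value)
    end
  end

  # Cast a hash (e.g. parsed from a config file) into an options object.
  def self.[](value)
    new(**value.to_h.transform_keys(&:to_sym))
  end
end

opts = SketchOptions[{ 'temperature' => 0.7, 'seed' => 1337 }]
```

This mirrors how `ollama_chat` can now pass `$config.model.options` straight through `Options[...]` instead of handling raw hashes.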
data/README.md CHANGED
@@ -43,7 +43,6 @@ ollama_chat [OPTIONS]
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to collection (multiple)
- -d use markdown to display the chat messages
  -v use voice output
  -h this help
  ```
@@ -153,19 +152,20 @@ subject - the young, blue-eyed cat.
  The following commands can be given inside the chat, if prefixed by a `/`:

  ```
- /paste to paste content
- /markdown toggle markdown output
- /list list the messages of the conversation
- /clear clear the conversation messages
- /pop [n] pop the last n exchanges, defaults to 1
- /model change the model
- /regenerate the last answer message
- /collection clear|stats|change|new clear or show stats of current collection
- /summarize source summarize the URL/file source's content
- /save filename store conversation messages
- /load filename load conversation messages
- /quit to quit
- /help to view this help
+ /paste to paste content
+ /markdown toggle markdown output
+ /list list the messages of the conversation
+ /clear clear the conversation messages
+ /pop [n] pop the last n exchanges, defaults to 1
+ /model change the model
+ /regenerate the last answer message
+ /collection clear [tag]|stats|change|new clear or show stats of current collection
+ /summarize source summarize the URL/file source's content
+ /web [n] query query web search & return n or 1 results
+ /save filename store conversation messages
+ /load filename load conversation messages
+ /quit to quit
+ /help to view this help
  ```

  ### ollama\_console
data/Rakefile CHANGED
@@ -18,7 +18,8 @@ GemHadar do
  '.utilsrc', '.rspec', *Dir.glob('.github/**/*', File::FNM_DOTMATCH)
  readme 'README.md'

- executables << 'ollama_console' << 'ollama_chat' << 'ollama_update'
+ executables << 'ollama_console' << 'ollama_chat' <<
+   'ollama_update' << 'ollama_cli'

  required_ruby_version '~> 3.1'

@@ -36,6 +37,7 @@ GemHadar do
  dependency 'complex_config', '~> 0.20'
  dependency 'search_ui', '~> 0.0'
  dependency 'amatch', '~> 0.4.1'
+ dependency 'pdf-reader', '~> 2.0'
  development_dependency 'all_images', '~> 0.4'
  development_dependency 'rspec', '~> 3.2'
  development_dependency 'utils'
data/bin/ollama_chat CHANGED
@@ -4,18 +4,22 @@ require 'ollama'
  include Ollama
  require 'term/ansicolor'
  include Term::ANSIColor
- require 'tins/go'
+ require 'tins'
  include Tins::GO
  require 'reline'
  require 'reverse_markdown'
  require 'complex_config'
  require 'fileutils'
+ require 'uri'
+ require 'nokogiri'
+ require 'rss'
+ require 'pdf/reader'

  class OllamaChatConfig
    include ComplexConfig
    include FileUtils

-   DEFAULT_CONFIG = <<~end
+   DEFAULT_CONFIG = <<~EOT
      ---
      url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
      model:
@@ -34,6 +38,7 @@ class OllamaChatConfig
        prompt: 'Represent this sentence for searching relevant passages: %s'
      collection: <%= ENV.fetch('OLLAMA_CHAT_COLLECTION', 'ollama_chat') %>
      found_texts_size: 4096
+     found_texts_count: null
      splitter:
        name: RecursiveCharacter
        chunk_size: 1024
@@ -41,7 +46,7 @@ class OllamaChatConfig
      redis:
        url: <%= ENV.fetch('REDIS_URL', 'null') %>
      debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
-   end
+   EOT

    def initialize(filename = nil)
      @filename = filename || default_path
@@ -109,14 +114,50 @@ class FollowChat
      end
      @say.call(response)
    end
-   response.done and @output.puts
+   if response.done
+     @output.puts
+     eval_stats = {
+       eval_duration: Tins::Duration.new(response.eval_duration / 1e9),
+       eval_count: response.eval_count,
+       prompt_eval_duration: Tins::Duration.new(response.prompt_eval_duration / 1e9),
+       prompt_eval_count: response.prompt_eval_count,
+       total_duration: Tins::Duration.new(response.total_duration / 1e9),
+       load_duration: Tins::Duration.new(response.load_duration / 1e9),
+     }.map { _1 * '=' } * ' '
+     @output.puts '📊 ' + color(111) { Utils::Width.wrap(eval_stats, percentage: 90) }
+   end
    self
  end
end

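The `eval_stats` string above is built with two compact Ruby idioms: a hash yields `[key, value]` pairs to `map` (so `_1` is the pair), and `Array#*` with a string argument joins. A minimal illustration with literal values, `Tins::Duration` omitted:

```ruby
# Each [key, value] pair joins with '=', then the pieces join with spaces.
stats = {
  eval_count: 119,
  eval_duration: '1.37s',
}.map { _1 * '=' } * ' '
# stats == "eval_count=119 eval_duration=1.37s"
```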
+ def search_web(query, n = 5)
+   query = URI.encode_uri_component(query)
+   url = "https://www.duckduckgo.com/html/?q=#{query}"
+   Ollama::Utils::Fetcher.new.get(url) do |tmp|
+     result = []
+     doc = Nokogiri::HTML(tmp)
+     doc.css('.results_links').each do |link|
+       if n > 0
+         url = link.css('.result__a').first&.[]('href')
+         url.sub!(%r(\A/l/\?uddg=), '')
+         url.sub!(%r(&rut=.*), '')
+         url = URI.decode_uri_component(url)
+         url = URI.parse(url)
+         url.host =~ /duckduckgo\.com/ and next
+         result << url
+         n -= 1
+       else
+         break
+       end
+     end
+     result
+   end
+ end
+
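The `sub!` calls in `search_web` strip DuckDuckGo's redirect wrapper from each result link before parsing it as a URL. A standalone sketch of that cleanup, using `URI.decode_www_form_component` as a widely available stand-in for the `URI.decode_uri_component` call above:

```ruby
require 'uri'

# Unwrap a DuckDuckGo HTML-result href of the form /l/?uddg=<encoded>&rut=...
def unwrap_result_href(href)
  href = href.sub(%r(\A/l/\?uddg=), '').sub(%r(&rut=.*), '')
  URI.parse(URI.decode_www_form_component(href))
end

url = unwrap_result_href('/l/?uddg=https%3A%2F%2Fexample.com%2Fdocs&rut=abc123')
# url.to_s == "https://example.com/docs"
```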
  def pull_model_unless_present(model, options, retried = false)
    ollama.show(name: model) { |response|
-     puts "Model #{bold{model}} with architecture #{response.model_info['general.architecture']} found."
+     puts "Model #{bold{model}} with architecture "\
+       "#{response.model_info['general.architecture']} found."
      if system = response.system
        puts "Configured model system prompt is:\n#{italic { system }}"
        return system
@@ -144,7 +185,7 @@ def load_conversation(filename)
    return
  end
  File.open(filename, 'r') do |output|
-   return JSON(output.read, create_additions: true)
+   return JSON(output.read).map { Ollama::Message.from_hash(_1) }
  end
end

@@ -189,19 +230,79 @@ def list_conversation(messages, markdown)
  end
end

+ def reverse_markdown(html)
+   ReverseMarkdown.convert(
+     html,
+     unknown_tags: :bypass,
+     github_flavored: true,
+     tag_border: ''
+   )
+ end
+
+ def parse_rss(source_io)
+   feed = RSS::Parser.parse(source_io, false, false)
+   title = <<~end
+     # #{feed&.channel&.title}
+
+   end
+   feed.items.inject(title) do |text, item|
+     text << <<~end
+       ## [#{item&.title}](#{item&.link})
+
+       updated on #{item&.pubDate}
+
+       #{reverse_markdown(item&.description)}
+
+     end
+   end
+ end
+
+ def parse_atom(source_io)
+   feed = RSS::Parser.parse(source_io, false, false)
+   title = <<~end
+     # #{feed.title.content}
+
+   end
+   feed.items.inject(title) do |text, item|
+     text << <<~end
+       ## [#{item&.title&.content}](#{item&.link&.href})
+
+       updated on #{item&.updated&.content}
+
+       #{reverse_markdown(item&.content&.content)}
+
+     end
+   end
+ end
+
  def parse_source(source_io)
-   case source_io&.content_type&.sub_type
-   when 'html'
-     ReverseMarkdown.convert(
-       source_io.read,
-       unknown_tags: :bypass,
-       github_flavored: true,
-       tag_border: ''
-     )
-   when 'plain', 'csv', 'xml'
+   case source_io&.content_type
+   when 'text/html'
+     reverse_markdown(source_io.read)
+   when 'text/xml'
+     if source_io.readline =~ %r(^\s*<rss\s)
+       source_io.rewind
+       return parse_rss(source_io)
+     end
+     source_io.rewind
+     source_io.read
+   when %r(\Atext/)
+     source_io.read
+   when 'application/rss+xml'
+     parse_rss(source_io)
+   when 'application/atom+xml'
+     parse_atom(source_io)
+   when 'application/json'
      source_io.read
+   when 'application/pdf'
+     reader = PDF::Reader.new(source_io)
+     result = +''
+     reader.pages.each do |page|
+       result << page.text
+     end
+     result
    else
-     STDERR.puts "Cannot import #{source_io.content_type} document."
+     STDERR.puts "Cannot import #{source_io&.content_type} document."
      return
    end
  end
@@ -211,7 +312,7 @@ def import_document(source_io, source)
    STDOUT.puts "Embedding disabled, I won't import any documents, try: /summarize"
    return
  end
- STDOUT.puts "Importing #{source_io.content_type} document #{source.to_s.inspect}."
+ infobar.puts "Importing #{italic { source_io.content_type }} document #{source.to_s.inspect}."
  text = parse_source(source_io) or return
  text.downcase!
  splitter_config = $config.embedding.splitter
@@ -290,7 +391,7 @@ def parse_content(content, images)
  case source_io&.content_type&.media_type
  when 'image'
    add_image(images, source_io, source)
- when 'text'
+ when 'text', 'application'
    import_document(source_io, source)
  else
    STDERR.puts(
@@ -354,19 +455,20 @@ end

def display_chat_help
  puts <<~end
-   /paste to paste content
-   /markdown toggle markdown output
-   /list list the messages of the conversation
-   /clear clear the conversation messages
-   /pop [n] pop the last n exchanges, defaults to 1
-   /model change the model
-   /regenerate the last answer message
-   /collection clear|stats|change|new clear or show stats of current collection
-   /summarize source summarize the URL/file source's content
-   /save filename store conversation messages
-   /load filename load conversation messages
-   /quit to quit
-   /help to view this help
+   /paste to paste content
+   /markdown toggle markdown output
+   /list list the messages of the conversation
+   /clear clear the conversation messages
+   /pop [n] pop the last n exchanges, defaults to 1
+   /model change the model
+   /regenerate the last answer message
+   /collection clear [tag]|stats|change|new clear or show stats of current collection
+   /summarize source summarize the URL/file source's content
+   /web [n] query query web search & return n or 1 results
+   /save filename store conversation messages
+   /load filename load conversation messages
+   /quit to quit
+   /help to view this help
  end
end

@@ -381,7 +483,6 @@ def usage
  -c CHAT a saved chat conversation to load
  -C COLLECTION name of the collection used in this conversation
  -D DOCUMENT load document and add to collection (multiple)
- -d use markdown to display the chat messages
  -v use voice output
  -h this help

@@ -393,7 +494,7 @@ def ollama
  $ollama
end

- opts = go 'f:u:m:s:c:C:D:dvh'
+ opts = go 'f:u:m:s:c:C:D:vh'

config = OllamaChatConfig.new(opts[?f])
$config = config.config
@@ -407,13 +508,13 @@ base_url = opts[?u] || $config.url
$ollama = Client.new(base_url:, debug: $config.debug)

model = choose_model(opts[?m], $config.model.name)
- options = $config.model.options
+ options = Options[$config.model.options]
model_system = pull_model_unless_present(model, options)
messages = []

if $config.embedding.enabled
  embedding_model = $config.embedding.model.name
- embedding_model_options = $config.embedding.model.options
+ embedding_model_options = Options[$config.embedding.model.options]
  pull_model_unless_present(embedding_model, embedding_model_options)
  collection = opts[?C] || $config.embedding.collection
  $documents = Documents.new(
@@ -456,23 +557,15 @@ end
if voice = ($config.voice if opts[?v])
  puts "Using voice #{bold{voice}} to speak."
end
-
- markdown = set_markdown(opts[?d] || $config.markdown)
+ markdown = set_markdown($config.markdown)

if opts[?c]
  messages.concat load_conversation(opts[?c])
else
- system = nil
- if system_prompt_file = opts[?s]
-   system = File.read(system_prompt_file)
- end
- system ||= $config.system
-
- if system
+ if system = Ollama::Utils::FileArgument.
+     get_file_argument(opts[?s], default: $config.system? || model_system)
    messages << Message.new(role: 'system', content: system)
    puts "Configured system prompt is:\n#{italic { system }}"
- elsif model_system.present?
-   puts "Using model system prompt."
  end
end

@@ -481,9 +574,10 @@ puts "\nType /help to display the chat help."
images = []
loop do
  parse_content = true
-
  input_prompt = bold { color(172) { message_type(images) + " user" } } + bold { "> " }
- case content = Reline.readline(input_prompt, true)&.chomp
+ content = Reline.readline(input_prompt, true)&.chomp
+
+ case content
  when %r(^/paste$)
    puts bold { "Paste your content and then press C-d!" }
    content = STDIN.read
@@ -500,11 +594,18 @@ loop do
    messages.clear
    puts "Cleared messages."
    next
- when %r(^/collection (clear|stats|change|new)$)
-   case $1
+ when %r(^/collection\s+(clear|stats|change|new)(?:\s+(.+))?$)
+   command, arg = $1, $2
+   case command
    when 'clear'
-     $documents.clear
-     puts "Cleared collection #{bold{collection}}."
+     tags = arg.present? ? arg.sub(/\A#*/, '') : nil
+     if tags
+       $documents.clear(tags:)
+       puts "Cleared tag ##{tags} from collection #{bold{collection}}."
+     else
+       $documents.clear
+       puts "Cleared collection #{bold{collection}}."
+     end
    when 'stats'
      collection_stats
    when 'change'
@@ -518,7 +619,7 @@ loop do
  when %r(^/pop?(?:\s+(\d*))?$)
    n = $1.to_i.clamp(1, Float::INFINITY)
    r = messages.pop(2 * n)
-   m = r.size
+   m = r.size / 2
    puts "Popped the last #{m} exchanges."
    next
  when %r(^/model$)
@@ -534,7 +635,15 @@ loop do
    end
  when %r(^/summarize\s+(.+))
    parse_content = false
-   content = summarize($1)
+   content = summarize($1) or next
+ when %r(^/web\s+(?:(\d+)\s+)?(.+)$)
+   parse_content = true
+   urls = search_web($2, $1.to_i)
+   content = <<~end
+     Answer the the query #{$2.inspect} using these sources:
+
+     #{urls * ?\n}
+   end
  when %r(^/save\s+(.+)$)
    save_conversation($1, messages)
    puts "Saved conversation to #$1."
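The `/web` pattern above makes the result count optional: `(\d+)` captures it when present, and the rest of the line becomes the query. A quick check of how the regexp (copied from the diff) splits input:

```ruby
WEB_RE = %r(^/web\s+(?:(\d+)\s+)?(.+)$)

with_count = '/web 3 ruby heredocs'.match(WEB_RE)
# with_count[1] == "3", with_count[2] == "ruby heredocs"

without_count = '/web ruby heredocs'.match(WEB_RE)
# without_count[1] is nil, so $1.to_i in the script yields 0
```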
@@ -557,19 +666,18 @@ loop do
    [ content, Utils::Tags.new ]
  end

- if $config.embedding.enabled
-   records = $documents.find(
+ if $config.embedding.enabled && content
+   records = $documents.find_where(
      content.downcase,
      tags:,
-     prompt: $config.embedding.model.prompt?
+     prompt: $config.embedding.model.prompt?,
+     text_size: $config.embedding.found_texts_size?,
+     text_count: $config.embedding.found_texts_count?,
    )
-   s, found_texts_size = 0, $config.embedding.found_texts_size
-   records = records.take_while {
-     (s += _1.text.size) <= found_texts_size
-   }
    found_texts = records.map(&:text)
    unless found_texts.empty?
-     content += "\nConsider these chunks for your answer:\n#{found_texts.join("\n\n---\n\n")}"
+     content += "\nConsider these chunks for your answer:\n"\
+       "#{found_texts.join("\n\n---\n\n")}"
    end
  end

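The `find_where` call above folds the old cumulative-size `take_while` into the `Documents` API and adds a result-count cap. A plain-Ruby sketch of the capping that the `text_size:`/`text_count:` keywords presumably apply, based on the removed inline logic (`Rec` and `cap_records` are stand-in names, not the gem's API):

```ruby
Rec = Struct.new(:text)

def cap_records(records, text_size: nil, text_count: nil)
  records = records.take(text_count) if text_count
  if text_size
    total = 0
    # Keep records while their cumulative text size stays within the budget.
    records = records.take_while { |r| (total += r.text.size) <= text_size }
  end
  records
end

recs = %w[aaaa bbbb cccc].map { |t| Rec.new(t) }
cap_records(recs, text_size: 8).size   # the first two records fit into 8 chars
cap_records(recs, text_count: 1).size  # caps the result list at one record
```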
@@ -577,15 +685,17 @@ loop do
  handler = FollowChat.new(messages:, markdown:, voice:)
  ollama.chat(model:, messages:, options:, stream: true, &handler)

- puts records.map { |record|
-   link = if record.source =~ %r(\Ahttps?://)
-     record.source
-   else
-     'file://%s' % File.expand_path(record.source)
-   end
-   [ link, record.tags.first ]
- }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
- $config.debug and jj messages
+ if records
+   puts records.map { |record|
+     link = if record.source =~ %r(\Ahttps?://)
+       record.source
+     else
+       'file://%s' % File.expand_path(record.source)
+     end
+     [ link, record.tags.first ]
+   }.uniq.map { |l, t| hyperlink(l, t) }.join(' ')
+   $config.debug and jj messages
+ end
rescue Interrupt
  puts "Type /quit to quit."
end
data/bin/ollama_cli ADDED
@@ -0,0 +1,68 @@
+ #!/usr/bin/env ruby
+
+ require 'ollama'
+ include Ollama
+ include Ollama::Utils::FileArgument
+ require 'tins'
+ include Tins::GO
+ require 'json'
+
+ def usage
+   puts <<~end
+     #{File.basename($0)} [OPTIONS]
+
+     -u URL     the ollama base url, OLLAMA_URL
+     -m MODEL   the ollama model to chat with, OLLAMA_MODEL
+     -M OPTIONS the ollama model options to use, OLLAMA_MODEL_OPTIONS
+     -s SYSTEM  the system prompt to use as a file, OLLAMA_SYSTEM
+     -p PROMPT  the user prompt to use as a file, OLLAMA_PROMPT
+     -H HANDLER the handler to use for the response, defaults to Print
+     -S         use streaming for generation
+     -h         this help
+
+   end
+   exit 0
+ end
+
+ opts = go 'u:m:M:s:p:H:Sh', defaults: { ?H => 'Print', ?M => '{}' }
+
+ opts[?h] and usage
+
+ base_url = opts[?u] || ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST')
+ model = opts[?m] || ENV.fetch('OLLAMA_MODEL', 'llama3.1')
+ options = Ollama::Options.from_hash(JSON(
+   get_file_argument(opts[?M], default: ENV['OLLAMA_MODEL_OPTIONS'])
+ ))
+ system = get_file_argument(opts[?s], default: ENV['OLLAMA_SYSTEM'])
+ prompt = get_file_argument(opts[?p], default: ENV['OLLAMA_PROMPT'])
+
+ if prompt.nil?
+   prompt = STDIN.read
+ elsif c = prompt.scan('%s').size
+   case c
+   when 0
+   when 1
+     prompt = prompt % STDIN.read
+   else
+     STDERR.puts "Found more than one plaeceholder %s. => Ignoring."
+   end
+ end
+
+ if ENV['DEBUG'].to_i == 1
+   puts <<~EOT
+     base_url = #{base_url.inspect}
+     model    = #{model.inspect}
+     system   = #{system.inspect}
+     prompt   = #{prompt.inspect}
+     options  = #{options.to_json}
+   EOT
+ end
+
+ Client.new(base_url:, read_timeout: 120).generate(
+   model:,
+   system:,
+   prompt:,
+   options:,
+   stream: !!opts[?S],
+   &Object.const_get(opts[?H])
+ )
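The `%s` handling in `ollama_cli` above substitutes standard input into the `-p` prompt only when exactly one placeholder is present. Its core, extracted into a standalone method for illustration (`fill_prompt` is a hypothetical name, not part of the gem):

```ruby
def fill_prompt(prompt, stdin)
  case prompt.scan('%s').size
  when 0 then prompt            # no placeholder: use the prompt as-is
  when 1 then prompt % stdin    # one placeholder: substitute STDIN into it
  else
    warn 'Found more than one placeholder %s. => Ignoring.'
    prompt                      # several placeholders: leave the prompt unchanged
  end
end

fill_prompt('Summarize this text: %s', 'Some input.')
# => "Summarize this text: Some input."
```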
data/bin/ollama_console CHANGED
@@ -5,8 +5,13 @@ include Ollama
  require 'irb'
  require 'irb/history'

- base_url = ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST')
- ollama = Client.new(base_url:)
+ def base_url
+   ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST')
+ end
+
+ def ollama
+   $ollama ||= Client.new(base_url:)
+ end
  IRB.setup nil
  IRB.conf[:MAIN_CONTEXT] = IRB::Irb.new.context
  IRB.conf[:HISTORY_FILE] = File.join(ENV.fetch('HOME'), '.ollama_console-history')