ollama-ruby 0.0.1 → 0.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 327051092cef37a7fd95d6a5c3a5a13aa6e52df2314da5f4cfd28f2890ffa820
- data.tar.gz: 86da0d23497f9717239abaa1830ac687562c8e9827a097781f0404e6b65e1e35
+ metadata.gz: 9f88e75feb900230387f4b960f5762daa8622af0f9562c0145d6b22c4a01fb3d
+ data.tar.gz: 4c6fea6ddc54c83ac6d003d4e5f3ccb6b90195b79a97aea6d98a95e32db36f99
  SHA512:
- metadata.gz: f3a9324f11877b0772d1f4bcb1902ea905e7132d94d73fe8b6b7e11a567c2aea04431838fa2b534557e4e1e76ae1ee47b13832aebb71e144ac329b13d15ea111
- data.tar.gz: d40adf27de7e7700158027f1c6bfec7e97189b4a6081a7da36758edc31b5229b6efef62ee5a192af19afa5b633224cf8e80666d38154a9d3c1cae974a98f2de5
+ metadata.gz: 9002b36cd15680f89c50d6f3ee34b3a4e5d77362cd4982d4ca41fc9abbd0bcd1d44483579686f3bf9f20f9e10b1309c28ae8da789ba8ada66c7b5f6384643f87
+ data.tar.gz: e2566f950aea15cf47d60381340db2b0e7a8e3a2fd79aaa8ee421f6cbd367baeae0f64b24e5d9baa801bd715695edf80c5a88d0ba41867c9358bc3076d1ed112
data/.envrc ADDED
@@ -0,0 +1 @@
+ export REDIS_URL=redis://localhost:9736
data/README.md CHANGED
@@ -28,7 +28,7 @@ to your Gemfile and run `bundle install` in your terminal.
 
  ## Executables
 
- ### ollama_chat
+ ### ollama\_chat
 
  This is a chat client that can be used to connect to an ollama server and enter
  a chat conversation with an LLM. It can be called with the following arguments:
@@ -36,20 +36,59 @@ chat conversation with an LLM. It can be called with the following arguments:
  ```
  ollama_chat [OPTIONS]
 
- -u URL the ollama base url, OLLAMA_URL
- -m MODEL the ollama model to chat with, OLLAMA_MODEL
- -M OPTIONS the model options as JSON file, see Ollama::Options
- -s SYSTEM the system prompt to use as a file
- -c CHAT a saved chat conversation to load
- -v VOICE use VOICE (e. g. Samantha) to speak with say command
- -d use markdown to display the chat messages
- -h this help
+ -f CONFIG config file to read
+ -u URL the ollama base url, OLLAMA_URL
+ -m MODEL the ollama model to chat with, OLLAMA_CHAT_MODEL
+ -s SYSTEM the system prompt to use as a file, OLLAMA_CHAT_SYSTEM
+ -c CHAT a saved chat conversation to load
+ -C COLLECTION name of the collection used in this conversation
+ -D DOCUMENT load document and add to collection (multiple)
+ -d use markdown to display the chat messages
+ -v use voice output
+ -h this help
  ```
 
  The base URL can be either set by the environment variable `OLLAMA_URL` or it
  is derived from the environment variable `OLLAMA_HOST`. The default model to
  connect can be configured in the environment variable `OLLAMA_MODEL`.
 
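Combining several of the documented switches, a hypothetical invocation might look like this (server URL, model, collection name, and document path are all illustrative, not defaults):

```shell
# Connect to a local Ollama server, select a model, load one document into
# a named embedding collection, and render replies as markdown.
# Every value here is an example, not a default.
ollama_chat -u http://localhost:11434 -m llama3.1 \
  -C my_docs -D ./README.md -d
```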
+ The YAML config file in `$XDG_CONFIG_HOME/ollama_chat/config.yml` can be
+ used for more complex settings; it looks like this:
+
+ ```
+ ---
+ url: <%= ENV['OLLAMA_URL'] || 'http://%s' % ENV.fetch('OLLAMA_HOST') %>
+ model:
+   name: <%= ENV.fetch('OLLAMA_CHAT_MODEL', 'llama3.1') %>
+   options:
+     num_ctx: 8192
+ system: <%= ENV.fetch('OLLAMA_CHAT_SYSTEM', 'null') %>
+ voice: Samantha
+ markdown: true
+ embedding:
+   enabled: true
+   model:
+     name: mxbai-embed-large
+     options: {}
+   collection: <%= ENV.fetch('OLLAMA_CHAT_COLLECTION', 'ollama_chat') %>
+   found_texts_size: 4096
+   splitter:
+     name: RecursiveCharacter
+     chunk_size: 1024
+ cache: Ollama::Documents::RedisCache
+ redis:
+   url: <%= ENV.fetch('REDIS_URL', 'null') %>
+ debug: <%= ENV['OLLAMA_CHAT_DEBUG'].to_i == 1 ? true : false %>
+ ```
+
+ If you want to store embeddings persistently, set an environment variable
+ `REDIS_URL` or update the `redis.url` setting in your `config.yml` file to
+ connect to a Redis server. Without this setup, embeddings will only be stored
+ in process memory, which is less durable.
+
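As a sketch of that persistent setup: this release's `.envrc` exports a local Redis URL, and something similar can be done in your own shell (port 9736 comes from that `.envrc` and assumes a Redis server is actually listening there):

```shell
# Persist embeddings between runs by pointing ollama_chat at Redis.
# The URL mirrors the .envrc shipped in this release; adjust host/port
# to match your own Redis server (the stock Redis default port is 6379).
export REDIS_URL=redis://localhost:9736
ollama_chat
```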
+ Some settings can be passed as arguments as well, e.g. if you want to choose a
+ specific system prompt:
+
  ```
  $ ollama_chat -s sherlock.txt
  Model with architecture llama found.
@@ -86,9 +125,7 @@ $ ollama_chat -m llava-llama3
  Model with architecture llama found.
  Connecting to llava-llama3@http://localhost:11434 now…
  Type /help to display the chat help.
- 📨 user> /image spec/assets/kitten.jpg
- Attached image spec/assets/kitten.jpg to the next message.
- 📸 user> What's on this image?
+ 📸 user> What's on this image? ./spec/assets/kitten.jpg
  📨 assistant:
  The image captures a moment of tranquility featuring a young cat. The cat,
  adorned with gray and white fur marked by black stripes on its face and legs,
@@ -116,19 +153,22 @@ subject - the young, blue-eyed cat.
  The following commands can be given inside the chat, if prefixed by a `/`:
 
  ```
- /paste to paste content
- /list list the messages of the conversation
- /clear clear the conversation messages
- /pop n pop the last n message, defaults to 1
- /regenerate the last answer message
- /save filename store conversation messages
- /load filename load conversation messages
- /image filename attach image to the next message
- /quit to quit.
- /help to view this help.
+ /paste to paste content
+ /markdown toggle markdown output
+ /list list the messages of the conversation
+ /clear clear the conversation messages
+ /pop [n] pop the last n exchanges, defaults to 1
+ /model change the model
+ /regenerate the last answer message
+ /collection clear|stats|change|new clear or show stats of current collection
+ /summarize source summarize the URL/file source's content
+ /save filename store conversation messages
+ /load filename load conversation messages
+ /quit to quit
+ /help to view this help
  ```
 
- ### ollama_console
+ ### ollama\_console
 
  This is an interactive console that can be used to try the different commands
  provided by an `Ollama::Client` instance. For example, this command generates a
data/Rakefile CHANGED
@@ -13,12 +13,12 @@ GemHadar do
  description 'Library that allows interacting with the Ollama API'
  test_dir 'spec'
  ignore '.*.sw[pon]', 'pkg', 'Gemfile.lock', '.AppleDouble', '.bundle',
- '.yardoc', 'tags', 'errors.lst', 'cscope.out', 'coverage', 'tmp'
+ '.yardoc', 'tags', 'errors.lst', 'cscope.out', 'coverage', 'tmp', 'corpus'
  package_ignore '.all_images.yml', '.tool-versions', '.gitignore', 'VERSION',
  '.utilsrc', '.rspec', *Dir.glob('.github/**/*', File::FNM_DOTMATCH)
  readme 'README.md'
 
- executables << 'ollama_console' << 'ollama_chat'
+ executables << 'ollama_console' << 'ollama_chat' << 'ollama_update'
 
  required_ruby_version '~> 3.1'
 
@@ -27,9 +27,21 @@ GemHadar do
  dependency 'term-ansicolor', '~> 1.11'
  dependency 'kramdown-parser-gfm', '~> 1.1'
  dependency 'terminal-table', '~> 3.0'
- development_dependency 'all_images', '~>0.4'
- development_dependency 'rspec', '~>3.2'
+ dependency 'redis', '~> 5.0'
+ dependency 'numo-narray', '~> 0.9'
+ dependency 'more_math', '~> 1.1'
+ dependency 'sorted_set', '~> 1.0'
+ dependency 'mime-types', '~> 3.0'
+ dependency 'reverse_markdown', '~> 2.0'
+ dependency 'complex_config', '~> 0.20'
+ dependency 'search_ui', '~> 0.0'
+ dependency 'amatch', '~> 0.4.1'
+ development_dependency 'all_images', '~> 0.4'
+ development_dependency 'rspec', '~> 3.2'
  development_dependency 'utils'
+ development_dependency 'webmock'
 
  licenses << 'MIT'
+
+ clobber 'coverage'
  end
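For downstream users, picking up this release is a one-line Gemfile change; the pessimistic version pin below is an illustrative suggestion, not something the diff mandates:

```ruby
# Gemfile — pull in the 0.1.0 release of the gem this diff describes.
# The '~> 0.1' constraint is an example choice, not a requirement.
gem 'ollama-ruby', '~> 0.1'
```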