llm_memory 0.1.7 → 0.1.9

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c20af963efe2764e2360a0e7c5149afc18b6c757b7c24dc178b3efe190330c20
- data.tar.gz: 90d3ee12fffc237846e9bd6342ad9863e257bed7ae573c378673cb6c1b4a42f5
+ metadata.gz: 723ea7f6c27fd43b5cb447f476e802af917b678c68e3aa999bbad1807a950b20
+ data.tar.gz: e65bf3f2d0555d0c101f5fe550f436760dcecd28b57e67f54d89a1bcecde3ab3
  SHA512:
- metadata.gz: 2240230ab5c96a4f8642f01114da452207d086e8a35432bf69c8652a170307b200fdacb8f48713c3027074d712bb33628da905a5aec1d5afb2c6db9659723d8f
- data.tar.gz: 907c03f50e230d7fccbd60244ff6a9c412c5dc7962b1202585c3e56619203c7e3b5dc57be2881d49228167d8296827958315b6e823b69a3e35c32dd08db3be36
+ metadata.gz: 975b970c1f2a8c447470755b74b115f3411b072f9bb9e2536c52462cdcedd120c93a342df169dd955ae4cbc9267569393039c5b319fea9487e1c49105a5cbf91
+ data.tar.gz: feb02279f9f41132237ef2649d0f3582a83d85c16240d8a79cbc0e8d85c485b659f2d7d2458367af5446ba08a856992bfd12f9bbd9c97510ccbee99bbf1140c6
data/Gemfile CHANGED
@@ -13,6 +13,7 @@ gem "webmock", "~> 3.18.1"
  gem "ruby-openai"
  gem "tokenizers"
  gem "redis"
+ gem "llm_memory_pgvector"
  # dev
  gem "dotenv"
  gem "pry"
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- llm_memory (0.1.7)
+ llm_memory (0.1.9)
  redis (~> 4.6.0)
  ruby-openai (~> 3.7.0)
  tokenizers (~> 0.3.3)
@@ -24,12 +24,18 @@ GEM
  json (2.6.3)
  language_server-protocol (3.17.0.3)
  lint_roller (1.0.0)
+ llm_memory_pgvector (0.1.2)
+ llm_memory (~> 0.1.7)
+ pg (~> 1.5.3)
+ pgvector (~> 0.2.0)
  method_source (1.0.0)
  mini_mime (1.1.2)
  multi_xml (0.6.0)
  parallel (1.23.0)
  parser (3.2.2.1)
  ast (~> 2.4.1)
+ pg (1.5.3)
+ pgvector (0.2.0)
  pry (0.14.2)
  coderay (~> 1.1)
  method_source (~> 1.0)
@@ -97,6 +103,7 @@ PLATFORMS
  DEPENDENCIES
  dotenv
  llm_memory!
+ llm_memory_pgvector
  pry
  rake (~> 13.0)
  redis
data/README.md CHANGED
@@ -14,7 +14,7 @@ This enables better integration with systems such as Rails and web services whil

  ## LLM Memory Components

- ![image](https://user-images.githubusercontent.com/1880965/236099477-421b2003-79d2-4a7c-8f80-1afac4fd616d.png)
+ ![llm_memory_diagram](https://github.com/shohey1226/llm_memory/assets/1880965/b77d0efa-3fec-4549-b98a-eae510de5c3d)

  1. LlmMemory::Wernicke: Responsible for loading external data (currently from files). More loader types are planned for future development.

@@ -61,6 +61,8 @@ To use LLM Memory, follow these steps:
  5. Use LlmMemory::Hippocampus to search for relevant information based on user queries
  6. Create and use ERB templates with LlmMemory::Broca to generate responses based on the information retrieved

+ For the details of each class, please refer to the [API reference document](https://github.com/shohey1226/llm_memory/wiki/API-Reference).
+
  ```ruby
  docs = LlmMemory::Wernicke.load(:file, "/tmp/a_directory")
  # docs is just an array of hash.
@@ -111,9 +113,28 @@ message2 = broca.respond(query_str: query_str2, related_docs: related_docs)

  The table below provides a list of plugins utilized by llm_memory. The aim is to keep the core llm_memory lightweight while allowing for easy extensibility through the use of plugins.

- | Plugin Name | Type | Module | Link |
- | ----------------------- | ------ | -------- | ------------------------------------------------------------- |
- | llm_memory_gmail_loader | Loader | Wernicke | [link](https://github.com/shohey1226/llm_memory_gmail_loader) |
+ To use a plugin, install it and pass the corresponding option.
+
+ For example, if you want to use pgvector:
+
+ ```
+ $ bundle add llm_memory_pgvector
+ ```
+
+ Then, load it instead of `:redis` (the default store is Redis).
+
+ ```ruby
+ # you may need an explicit require, depending on the project
+ # require "llm_memory_pgvector"
+ hippocampus = LlmMemory::Hippocampus.new(store: :pgvector)
+ ```
+
+ Please refer to the links below for details.
+
+ | Plugin Name | Type | Module | Link |
+ | ----------------------- | ------ | ----------- | ------------------------------------------------------------- |
+ | llm_memory_gmail_loader | Loader | Wernicke | [link](https://github.com/shohey1226/llm_memory_gmail_loader) |
+ | llm_memory_pgvector | Store | Hippocampus | [link](https://github.com/shohey1226/llm_memory_pgvector) |

  ## Development

@@ -123,7 +144,7 @@ To install this gem onto your local machine, run `bundle exec rake install`. To

  ## Contributing

- Bug reports and pull requests are welcome on GitHub at https://github.com/[USERNAME]/llm_memory. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/[USERNAME]/llm_memory/blob/master/CODE_OF_CONDUCT.md).
+ Bug reports and pull requests are welcome on GitHub at https://github.com/shohey1226/llm_memory. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/shohey1226/llm_memory/blob/master/CODE_OF_CONDUCT.md).

  ## License

@@ -131,4 +152,4 @@ The gem is available as open source under the terms of the [MIT License](https:/

  ## Code of Conduct

- Everyone interacting in the LlmMemory project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/[USERNAME]/llm_memory/blob/master/CODE_OF_CONDUCT.md).
+ Everyone interacting in the LlmMemory project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/shohey1226/llm_memory/blob/master/CODE_OF_CONDUCT.md).
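
Read together with the Hippocampus and store hunks below, the README additions above amount to the following usage. This is a minimal sketch under stated assumptions (explicit requires, an illustrative query string, and illustrative result keys), not code taken from the gem:

```ruby
# Minimal sketch: swap the default Redis store for the pgvector plugin.
# Assumes llm_memory_pgvector has been added to the Gemfile (see above).
require "llm_memory"
require "llm_memory_pgvector"

hippocampus = LlmMemory::Hippocampus.new(
  store: :pgvector,         # keyword renamed from store_name: in 0.1.9
  index_name: "llm_memory"  # default value shown in the Hippocampus diff below
)

# In 0.1.9, query simply delegates to the configured store's search.
related_docs = hippocampus.query("How do I switch the store?", limit: 3)
# Expected shape (per the Redis store changes further down): an array of
# hashes with symbolized keys whose :metadata has been JSON-decoded, e.g.
# [{ content: "...", metadata: { source: "..." } }, ...]
```

Only calls that appear in this diff are used above; everything else (the query text, the example keys) is illustrative.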
@@ -10,7 +10,7 @@ module LlmMemory
  embedding_name: :openai,
  chunk_size: 1024,
  chunk_overlap: 50,
- store_name: :redis,
+ store: :redis,
  index_name: "llm_memory"
  )
  LlmMemory.configure
@@ -19,8 +19,8 @@ module LlmMemory
  raise "Embedding '#{embedding_name}' not found." unless embedding_class
  @embedding_instance = embedding_class.new

- store_class = StoreManager.stores[store_name]
- raise "Store '#{store_name}' not found." unless store_class
+ store_class = StoreManager.stores[store]
+ raise "Store '#{store}' not found." unless store_class
  @store = store_class.new(index_name: index_name)

  # char count, not word count
@@ -50,17 +50,7 @@ module LlmMemory

  def query(query_str, limit: 3)
  vector = @embedding_instance.embed_document(query_str)
- response_list = @store.search(query: vector, k: limit)
- response_list.shift # the first one is the size
- # now [redis_key1, [],,, ]
- result = response_list.each_slice(2).to_h.values.map { |v|
- v.each_slice(2).to_h.transform_keys(&:to_sym)
- }
- result.each do |item|
- hash = JSON.parse(item[:metadata])
- item[:metadata] = hash.transform_keys(&:to_sym)
- end
- result
+ @store.search(query: vector, k: limit)
  end

  def forget_all
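
The hunks above rename the constructor keyword from `store_name:` to `store:` and move the search-reply parsing out of `Hippocampus#query` into the store. Callers that passed the option explicitly need a one-line update; a before/after sketch (hypothetical caller code, not from the gem):

```ruby
# 0.1.7: the store was selected with store_name:
LlmMemory::Hippocampus.new(store_name: :redis)

# 0.1.9: the keyword is now store:; callers relying on the default need no change
LlmMemory::Hippocampus.new(store: :redis)
```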
@@ -148,7 +148,17 @@ module LlmMemory
  "DIALECT",
  2
  ]
- @client.call(command)
+ response_list = @client.call(command)
+ response_list.shift # the first one is the size
+ # now [redis_key1, [],,, ]
+ result = response_list.each_slice(2).to_h.values.map { |v|
+ v.each_slice(2).to_h.transform_keys(&:to_sym)
+ }
+ result.each do |item|
+ hash = JSON.parse(item[:metadata])
+ item[:metadata] = hash.transform_keys(&:to_sym)
+ end
+ result
  end
  end
  end
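
This hunk moves the reply parsing from `Hippocampus#query` into the Redis store: the raw `FT.SEARCH` reply (a flat array of result count, key, and field/value pairs) becomes an array of symbol-keyed hashes with JSON-decoded `:metadata`. A self-contained sketch of the same transformation on a hypothetical reply (the keys, field names, and values are made up for illustration):

```ruby
require "json"

# Hypothetical raw FT.SEARCH reply: [result_count, key1, fields1, key2, fields2, ...]
response_list = [
  2,
  "llm_memory:doc1", ["content", "Alpha", "metadata", '{"source":"a.txt"}'],
  "llm_memory:doc2", ["content", "Beta", "metadata", '{"source":"b.txt"}']
]

response_list.shift # drop the leading result count
# Pair each redis key with its field array, keep only the field arrays,
# then turn each ["field", "value", ...] array into a symbol-keyed hash.
result = response_list.each_slice(2).to_h.values.map { |v|
  v.each_slice(2).to_h.transform_keys(&:to_sym)
}
# Decode the JSON metadata string and symbolize its keys as well.
result.each do |item|
  item[:metadata] = JSON.parse(item[:metadata]).transform_keys(&:to_sym)
end

p result
# => [{:content=>"Alpha", :metadata=>{:source=>"a.txt"}},
#     {:content=>"Beta",  :metadata=>{:source=>"b.txt"}}]
```

Because each store now returns results in this normalized shape, the simplified `Hippocampus#query` above can return the store's result directly.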
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module LlmMemory
- VERSION = "0.1.7"
+ VERSION = "0.1.9"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: llm_memory
  version: !ruby/object:Gem::Version
- version: 0.1.7
+ version: 0.1.9
  platform: ruby
  authors:
  - Shohei Kameda
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-05-11 00:00:00.000000000 Z
+ date: 2023-05-18 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: tokenizers