llm_memory 0.1.7 → 0.1.8

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c20af963efe2764e2360a0e7c5149afc18b6c757b7c24dc178b3efe190330c20
- data.tar.gz: 90d3ee12fffc237846e9bd6342ad9863e257bed7ae573c378673cb6c1b4a42f5
+ metadata.gz: baa56ad255b6c35f11a18779aa0e1bd3257b36fc53e1f6870cb5ae902dcb71a9
+ data.tar.gz: 1c03bea26b44ca3f74c7097155582087d645808756ba0e95e2f6767c912bd3f3
  SHA512:
- metadata.gz: 2240230ab5c96a4f8642f01114da452207d086e8a35432bf69c8652a170307b200fdacb8f48713c3027074d712bb33628da905a5aec1d5afb2c6db9659723d8f
- data.tar.gz: 907c03f50e230d7fccbd60244ff6a9c412c5dc7962b1202585c3e56619203c7e3b5dc57be2881d49228167d8296827958315b6e823b69a3e35c32dd08db3be36
+ metadata.gz: e698370bda45506cf5e5be841378d5e5ba0914890cdd09dc65f254cc838c6d009ae188ccf8d0253a0eea96fde20f8ff968a5eaadd2626b4f88f208eaf0ab4faa
+ data.tar.gz: f78e7cda1b8438c66f2bf672591cf819db1e1651a6528dd752c0ae9ddf7c3a7d8657de3c93d5e0f3c62478a76ec26bf944d13624809f1c45c44a00a7b88976ec
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
    remote: .
    specs:
-     llm_memory (0.1.7)
+     llm_memory (0.1.8)
      redis (~> 4.6.0)
      ruby-openai (~> 3.7.0)
      tokenizers (~> 0.3.3)
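
For a project consuming this release through Bundler, the bump corresponds to a one-line Gemfile entry. A minimal sketch, assuming a standard Bundler setup; the pessimistic pin is illustrative and not part of the diff, which only updates the locked version:

```ruby
# Gemfile — minimal sketch for picking up llm_memory 0.1.8.
# The "~> 0.1.8" pin is an illustrative assumption.
source "https://rubygems.org"

gem "llm_memory", "~> 0.1.8"
```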
data/README.md CHANGED
@@ -14,7 +14,7 @@ This enables better integration with systems such as Rails and web services whil
 
  ## LLM Memory Components
 
- ![image](https://user-images.githubusercontent.com/1880965/236099477-421b2003-79d2-4a7c-8f80-1afac4fd616d.png)
+ ![llm_memory_diagram](https://github.com/shohey1226/llm_memory/assets/1880965/b77d0efa-3fec-4549-b98a-eae510de5c3d)
 
  1. LlmMemory::Wernicke: Responsible for loading external data (currently from files). More loader types are planned for future development.
 
@@ -61,6 +61,8 @@ To use LLM Memory, follow these steps:
  5. Use LlmMemory::Hippocampus to search for relevant information based on user queries
  6. Create and use ERB templates with LlmMemory::Broca to generate responses based on the information retrieved
 
+ For the details of each class, please refer to the [API reference document](https://github.com/shohey1226/llm_memory/wiki/API-Reference).
+
  ```ruby
  docs = LlmMemory::Wernicke.load(:file, "/tmp/a_directory")
  # docs is just an array of hash.
@@ -111,9 +113,28 @@ message2 = broca.respond(query_str: query_str2, related_docs: related_docs)
 
  The table below provides a list of plugins utilized by llm_memory. The aim is to keep the core llm_memory lightweight while allowing for easy extensibility through the use of plugins.
 
- | Plugin Name | Type | Module | Link |
- | ----------------------- | ------ | -------- | ------------------------------------------------------------- |
- | llm_memory_gmail_loader | Loader | Wernicke | [link](https://github.com/shohey1226/llm_memory_gmail_loader) |
+ Install the plugin and update the corresponding option.
+
+ For example, if you want to use pgvector, then:
+
+ ```
+ $ bundle add llm_memory_pgvector
+ ```
+
+ Then, load it instead of `:redis` (the default store is `:redis`).
+
+ ```ruby
+ # may need to require it, depending on the project
+ # require "llm_memory_pgvector"
+ hippocampus = LlmMemory::Hippocampus.new(store: :pgvector)
+ ```
+
+ Please refer to the links below for details.
+
+ | Plugin Name | Type | Module | Link |
+ | ----------------------- | ------ | ----------- | ------------------------------------------------------------- |
+ | llm_memory_gmail_loader | Loader | Wernicke | [link](https://github.com/shohey1226/llm_memory_gmail_loader) |
+ | llm_memory_pgvector | Store | Hippocampus | [link](https://github.com/shohey1226/llm_memory_pgvector) |
 
  ## Development
 
@@ -123,7 +144,7 @@ To install this gem onto your local machine, run `bundle exec rake install`. To
 
  ## Contributing
 
- Bug reports and pull requests are welcome on GitHub at https://github.com/[USERNAME]/llm_memory. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/[USERNAME]/llm_memory/blob/master/CODE_OF_CONDUCT.md).
+ Bug reports and pull requests are welcome on GitHub at https://github.com/shohey1226/llm_memory. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/shohey1226/llm_memory/blob/master/CODE_OF_CONDUCT.md).
 
  ## License
 
@@ -131,4 +152,4 @@ The gem is available as open source under the terms of the [MIT License](https:/
 
  ## Code of Conduct
 
- Everyone interacting in the LlmMemory project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/[USERNAME]/llm_memory/blob/master/CODE_OF_CONDUCT.md).
+ Everyone interacting in the LlmMemory project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/shohey1226/llm_memory/blob/master/CODE_OF_CONDUCT.md).
@@ -10,7 +10,7 @@ module LlmMemory
    embedding_name: :openai,
    chunk_size: 1024,
    chunk_overlap: 50,
-   store_name: :redis,
+   store: :redis,
    index_name: "llm_memory"
  )
  LlmMemory.configure
@@ -19,8 +19,8 @@ module LlmMemory
    raise "Embedding '#{embedding_name}' not found." unless embedding_class
    @embedding_instance = embedding_class.new
 
-   store_class = StoreManager.stores[store_name]
-   raise "Store '#{store_name}' not found." unless store_class
+   store_class = StoreManager.stores[store]
+   raise "Store '#{store}' not found." unless store_class
    @store = store_class.new(index_name: index_name)
 
    # char count, not word count
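
In practical terms, 0.1.8 renames the `Hippocampus` initializer keyword from `store_name:` to `store:`, matching the README snippet above. A minimal calling sketch, assuming the built-in Redis store and the default index name shown in the hunk:

```ruby
require "llm_memory"

# 0.1.7 spelled this keyword `store_name:`; from 0.1.8 it is `store:`.
hippocampus = LlmMemory::Hippocampus.new(
  store: :redis,            # built-in store; plugins such as :pgvector can be swapped in
  index_name: "llm_memory"  # default index name from the initializer above
)
```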
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module LlmMemory
-   VERSION = "0.1.7"
+   VERSION = "0.1.8"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: llm_memory
  version: !ruby/object:Gem::Version
-   version: 0.1.7
+   version: 0.1.8
  platform: ruby
  authors:
  - Shohei Kameda
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-05-11 00:00:00.000000000 Z
+ date: 2023-05-18 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: tokenizers