llm_memory 0.1.0 → 0.1.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c3f9540b8e3ca30d11eacd2e3100b4aa4ef985d041017b2b09bd5fad52d2f776
- data.tar.gz: 9998bca0ea546d93b3a051ac7a6c62a9597ae2da9e85d8d5b36aab4212c59641
+ metadata.gz: 0615d07f9e76aa0b4c762d027e757bb0cb03cc1448b11d5b5cf684d5a75b381a
+ data.tar.gz: 6aa24ea5ddf2476876ca2d873400018785c3bcc6c385faf35cac3f59de94049a
  SHA512:
- metadata.gz: 9923ca576cf858eac89facf8da2f88d6284f7577366b252bb4e3ab197879a78035dfacba62e7890ab097dc193ee922f2b6abf16b321d4f1b09e6f1d30e4e9184
- data.tar.gz: 44ff089feebead7cc9e1ef8b555212d20a938fcd6a4c72c13cfb55ae3029ebf6699722ea42e977ac108375e5d91c83aad450177ec2b9ad1037f87a83b83eadf7
+ metadata.gz: 7f6ac569b18f28502e3b0e34614498b559dff4b6e085988f3bae2f5160aef9eb6491a7e61f78edc3d5e0c9be0ac8878f4e0747cc2d02358aa389071eea393c0c
+ data.tar.gz: b818a177d945d9e2544f1079f214231d792419772688b4907a50a8ceb2fde2435d5571e4e03307fa1bee8f8d22499e7ddf5821363203b1a1ecbd473d4edbdc2d
data/Gemfile CHANGED
@@ -13,3 +13,6 @@ gem "webmock", "~> 3.18.1"
  gem "ruby-openai"
  gem "tiktoken_ruby"
  gem "redis"
+ # dev
+ gem "dotenv"
+ gem "pry"
data/Gemfile.lock CHANGED
@@ -1,7 +1,10 @@
  PATH
  remote: .
  specs:
- llm_memory (0.1.0)
+ llm_memory (0.1.2)
+ redis (~> 4.6.0)
+ ruby-openai (~> 3.7.0)
+ tiktoken_ruby (~> 0.0.4)
 
  GEM
  remote: https://rubygems.org/
@@ -9,31 +12,31 @@ GEM
  addressable (2.8.4)
  public_suffix (>= 2.0.2, < 6.0)
  ast (2.4.2)
- connection_pool (2.4.0)
+ coderay (1.1.3)
  crack (0.4.5)
  rexml
  diff-lcs (1.5.0)
- faraday (2.7.4)
- faraday-net_http (>= 2.0, < 3.1)
- ruby2_keywords (>= 0.0.4)
- faraday-multipart (1.0.4)
- multipart-post (~> 2)
- faraday-net_http (3.0.2)
+ dotenv (2.8.1)
  hashdiff (1.0.1)
+ httparty (0.21.0)
+ mini_mime (>= 1.0.0)
+ multi_xml (>= 0.5.2)
  json (2.6.3)
  language_server-protocol (3.17.0.3)
  lint_roller (1.0.0)
- multipart-post (2.3.0)
+ method_source (1.0.0)
+ mini_mime (1.1.2)
+ multi_xml (0.6.0)
  parallel (1.23.0)
  parser (3.2.2.1)
  ast (~> 2.4.1)
+ pry (0.14.2)
+ coderay (~> 1.1)
+ method_source (~> 1.0)
  public_suffix (5.0.1)
  rainbow (3.1.1)
  rake (13.0.6)
- redis (5.0.6)
- redis-client (>= 0.9.0)
- redis-client (0.14.1)
- connection_pool
+ redis (4.6.0)
  regexp_parser (2.8.0)
  rexml (3.2.5)
  rspec (3.12.0)
@@ -64,11 +67,9 @@ GEM
  rubocop-performance (1.16.0)
  rubocop (>= 1.7.0, < 2.0)
  rubocop-ast (>= 0.4.0)
- ruby-openai (4.0.0)
- faraday (>= 1)
- faraday-multipart (>= 1)
+ ruby-openai (3.7.0)
+ httparty (>= 0.18.1)
  ruby-progressbar (1.13.0)
- ruby2_keywords (0.0.5)
  standard (1.28.0)
  language_server-protocol (~> 3.17.0.2)
  lint_roller (~> 1.0)
@@ -81,6 +82,7 @@ GEM
  lint_roller (~> 1.0)
  rubocop-performance (~> 1.16.0)
  tiktoken_ruby (0.0.4-arm64-darwin)
+ tiktoken_ruby (0.0.4-x86_64-linux)
  unicode-display_width (2.4.2)
  vcr (6.1.0)
  webmock (3.18.1)
@@ -93,7 +95,9 @@ PLATFORMS
  x86_64-linux
 
  DEPENDENCIES
+ dotenv
  llm_memory!
+ pry
  rake (~> 13.0)
  redis
  rspec (~> 3.0)
data/README.md CHANGED
@@ -14,6 +14,8 @@ This enables better integration with systems such as Rails and web services whil
 
  ## LLM Memory Components
 
+ ![image](https://user-images.githubusercontent.com/1880965/236099477-421b2003-79d2-4a7c-8f80-1afac4fd616d.png)
+
  1. LlmMemory::Wernicke: Responsible for loading external data (currently from files). More loader types are planned for future development.
 
  > Wernicke's area in the brain is involved in the comprehension of written and spoken language
@@ -28,15 +30,13 @@ This enables better integration with systems such as Rails and web services whil
 
  ## Installation
 
- TODO: Replace `UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG` with your gem name right after releasing it to RubyGems.org. Please do not do it earlier due to security reasons. Alternatively, replace this section with instructions to install your gem from git if you don't plan to release to RubyGems.org.
-
  Install the gem and add to the application's Gemfile by executing:
 
- $ bundle add UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG
+ $ bundle add llm_memory
 
  If bundler is not being used to manage dependencies, install the gem by executing:
 
- $ gem install UPDATE_WITH_YOUR_GEM_NAME_PRIOR_TO_RELEASE_TO_RUBYGEMS_ORG
+ $ gem install llm_memory
 
  ### Setup
 
@@ -55,7 +55,7 @@ end
  To use LLM Memory, follow these steps:
 
  1. Install the gem: gem install llm_memory
- 2. Set up Redis with Redisearch module enabled
+ 2. Set up Redis with the Redisearch module enabled - go to [Redis Cloud](https://redis.com/redis-enterprise-cloud/overview/) and get the Redis URL
  3. Configure LLM Memory to connect to your Redis instance
  4. Use LlmMemory::Wernicke to load data from your external sources
  5. Use LlmMemory::Hippocampus to search for relevant information based on user queries
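For steps 2 and 3, a minimal configuration sketch is shown below. The attribute names (`redis_url`, `openai_access_token`) are assumptions based on the gem's Redis and ruby-openai dependencies and are not spelled out in this diff; check `LlmMemory::Configuration` for the actual accessors.

```ruby
require "llm_memory"

# Assumed configuration keys; adjust to whatever LlmMemory::Configuration actually exposes.
LlmMemory.configure do |c|
  c.openai_access_token = ENV["OPENAI_ACCESS_TOKEN"] # consumed by ruby-openai
  c.redis_url = ENV["REDIS_URL"]                     # Redis instance with the Redisearch module enabled
end
```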
@@ -107,6 +107,14 @@ related_docs = hippocampus.query(query_str2, limit: 3)
  message2 = broca.respond(query_str: query_str2, related_docs: related_docs)
  ```
 
+ ## Plugins
+
+ The table below lists the plugins available for llm_memory. The aim is to keep the llm_memory core lightweight while allowing easy extensibility through plugins.
+
+ | Plugin Name | Type | Module | Link |
+ | ----------------------- | ------ | -------- | ------------------------------------------------------------- |
+ | llm_memory_gmail_loader | Loader | Wernicke | [link](https://github.com/shohey1226/llm_memory_gmail_loader) |
+
  ## Development
 
  After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
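Putting the README pieces together, a rough end-to-end sketch follows. The `hippocampus.query` and `broca.respond` calls mirror the usage excerpt above; the `Wernicke.load` arguments, the prompt template, and the model name are illustrative assumptions rather than the gem's documented defaults.

```ruby
require "llm_memory"

# 1. Load documents (loader name and path are hypothetical).
docs = LlmMemory::Wernicke.load(:file, "/path/to/docs")

# 2. Index them; Hippocampus defaults to the Redis store and the "llm_memory" index.
hippocampus = LlmMemory::Hippocampus.new
hippocampus.memorize(docs)

# 3. Retrieve context for a query and have Broca answer with it.
query_str = "What is LLM Memory?"
related_docs = hippocampus.query(query_str, limit: 3)

prompt = "Answer using the context.\nContext: <%= related_docs %>\nQuestion: <%= query_str %>" # template format is an assumption
broca = LlmMemory::Broca.new(prompt: prompt, model: "gpt-3.5-turbo")                            # model name is an assumption
message = broca.respond(query_str: query_str, related_docs: related_docs)
puts message
```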
@@ -12,6 +12,7 @@ module LlmMemory
  temperature: 0.7,
  max_token: 4096
  )
+ LlmMemory.configure
  @prompt = prompt
  @model = model
  @messages = []
@@ -13,6 +13,8 @@ module LlmMemory
  store_name: :redis,
  index_name: "llm_memory"
  )
+ LlmMemory.configure
+
  embedding_class = EmbeddingManager.embeddings[embedding_name]
  raise "Embedding '#{embedding_name}' not found." unless embedding_class
  @embedding_instance = embedding_class.new
@@ -26,7 +28,20 @@ module LlmMemory
    @chunk_overlap = chunk_overlap
  end
 
+ # validate the document format
+ def validate_documents(documents)
+   is_valid = documents.all? do |hash|
+     hash.is_a?(Hash) &&
+       hash.key?(:content) && hash[:content].is_a?(String) &&
+       hash.key?(:metadata) && hash[:metadata].is_a?(Hash)
+   end
+   unless is_valid
+     raise "Your documents need to have an array of hashes (content: string and metadata: hash)"
+   end
+ end
+
  def memorize(docs)
+   validate_documents(docs)
    docs = make_chunks(docs)
    docs = add_vectors(docs)
    @store.create_index unless @store.index_exists?
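For reference, `memorize` now validates its input up front: it expects an array of hashes, each with a String `:content` and a Hash `:metadata`. A minimal sketch of a valid payload (the metadata keys are arbitrary examples):

```ruby
docs = [
  {
    content: "LLM Memory gives LLM apps in-context memory.",
    metadata: {source: "README.md", chunk: 1} # any keys you like, as long as it is a Hash
  },
  {content: "Another chunk of text.", metadata: {}}
]

hippocampus = LlmMemory::Hippocampus.new
hippocampus.memorize(docs) # raises if any element is missing :content or :metadata
```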
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module LlmMemory
- VERSION = "0.1.0"
+ VERSION = "0.1.2"
  end
data/lib/llm_memory.rb CHANGED
@@ -20,5 +20,4 @@ module LlmMemory
  self.configuration ||= Configuration.new
  yield(configuration) if block_given?
  end
- configure # init for default values
  end
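Taken together with the `LlmMemory.configure` calls added to the Broca and Hippocampus initializers above, this change makes configuration lazy: nothing is built when the gem is required, and the first `configure` call (explicit or from a constructor) creates the `Configuration` once thanks to `||=`. A small sketch of the resulting behavior, assuming a readable `LlmMemory.configuration` accessor:

```ruby
require "llm_memory"

LlmMemory.configuration    # => nil; no longer initialized at require time
LlmMemory.configure        # builds the default Configuration (Broca/Hippocampus constructors do this too)
LlmMemory.configure do |c|
  # later calls reuse the same object because of ||=, so overrides still apply
end
```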
data/llm_memory.gemspec CHANGED
@@ -31,6 +31,9 @@ Gem::Specification.new do |spec|
 
  # Uncomment to register a new dependency of your gem
  # spec.add_dependency "example-gem", "~> 1.0"
+ spec.add_dependency "tiktoken_ruby", "~> 0.0.4"
+ spec.add_dependency "ruby-openai", "~> 3.7.0"
+ spec.add_dependency "redis", "~> 4.6.0"
 
  # For more information and examples about making a new gem, check out our
  # guide at: https://bundler.io/guides/creating_gem.html
metadata CHANGED
@@ -1,15 +1,57 @@
  --- !ruby/object:Gem::Specification
  name: llm_memory
  version: !ruby/object:Gem::Version
- version: 0.1.0
+ version: 0.1.2
  platform: ruby
  authors:
  - Shohei Kameda
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-05-04 00:00:00.000000000 Z
- dependencies: []
+ date: 2023-05-09 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+ name: tiktoken_ruby
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 0.0.4
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 0.0.4
+ - !ruby/object:Gem::Dependency
+ name: ruby-openai
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 3.7.0
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 3.7.0
+ - !ruby/object:Gem::Dependency
+ name: redis
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 4.6.0
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 4.6.0
  description: LLM Memory is a Ruby gem designed to provide large language models (LLMs)
  like ChatGPT with memory using in-context learning. This enables better integration
  with systems such as Rails and web services while providing a more user-friendly