ragify 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED

---
SHA256:
  metadata.gz: 0a8b50afa93cb29e25eda8a6e6c1bc902ed50b106bbb1b18db1c607672c92d86
  data.tar.gz: 02f99522340f1a8c2738c6fbc12c45e50fbfb5de4db9771580881a13385b6c11
SHA512:
  metadata.gz: 15c94ec278c69c2615c32ae88bd84bece4fec6d1cde2d524cebcd5ddc512152a15c4e2c4b138036baf3ab978940ae3412e4323342e6e10f89afa101896b4f58b
  data.tar.gz: 280a443ad84ed106454fbad4ac76b6ea0604ce6c80f2e74af02d350eda1c5a9c875db547dd28ab5cdc5372136a62432cc6bc9d41ccd1b03940abae308a1e4a34
data/README.md ADDED

# Ragify

RAG (Retrieval-Augmented Generation) for Rails. Add semantic search and AI chat to any ActiveRecord model using pgvector and OpenAI.

No external vector databases. No complex infrastructure. Just your PostgreSQL and a few lines of code.

## Requirements

- Ruby >= 3.2
- Rails >= 7.0
- PostgreSQL with the [pgvector](https://github.com/pgvector/pgvector) extension
- An OpenAI API key

## Installation

Add to your Gemfile:

```ruby
gem "ragify"
```

Then run:

```bash
bundle install
```

## Quick Start

### 1. Configure

```ruby
# config/initializers/ragify.rb
Ragify.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  config.embedding_model = "text-embedding-3-small" # default
  config.chat_model = "gpt-4o-mini" # default
  config.similarity_threshold = 0.2 # default
end
```

### 2. Add embedding column

```bash
rails generate ragify:install FaqEntry
rails db:migrate
```

This creates a migration that adds a `vector` column with an IVFFlat index.

### 3. Include in your model

```ruby
class FaqEntry < ApplicationRecord
  include Ragify::Embeddable

  ragify_content { |record| "#{record.question} #{record.answer}" }
end
```

### 4. Use it

```ruby
# Semantic search
results = FaqEntry.semantic_search("how does cashback work?")
# => [{ record: #<FaqEntry>, content: "...", similarity: 0.87 }, ...]

# RAG chat (search + LLM)
response = Ragify.chat("how does cashback work?",
  context_model: FaqEntry,
  system_prompt: "You are a helpful assistant for a loyalty program."
)
# => { answer: "Cashback is automatically...", context: [...], tokens: 342 }

# Generate embeddings for existing records
FaqEntry.embed_all!
```

## Features

### Embeddable

Include `Ragify::Embeddable` in any ActiveRecord model. Embeddings are automatically generated on create/update.

```ruby
class Article < ApplicationRecord
  include Ragify::Embeddable

  # Block form - full control over content
  ragify_content { |r| "#{r.title}\n#{r.body}" }

  # Or method form
  ragify_content :embedding_text
end
```

### Semantic Search

Find records by meaning, not keywords.

```ruby
Article.semantic_search("climate change effects",
  limit: 10,      # max results (default: 5)
  threshold: 0.3  # min similarity 0-1 (default: 0.2)
)
```

Returns an array of hashes with `:record`, `:content`, and `:similarity`.
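Under the hood (see `lib/ragify/search.rb` in this release), search over-fetches candidates, converts pgvector's cosine distance into a similarity (`1 - distance`), drops anything below the threshold, and truncates to the limit. A self-contained sketch of that filtering step, with made-up sample distances:

```ruby
# `neighbors` stands in for records returned by pgvector, each carrying
# a cosine distance (0 = same direction). Sample data is illustrative.
neighbors = [
  { content: "Cashback is credited monthly.", distance: 0.13 },
  { content: "Points expire after a year.",   distance: 0.55 },
  { content: "Unrelated shipping info.",      distance: 0.92 }
]

threshold = 0.2
limit     = 2

results = neighbors
  .map    { |n| { content: n[:content], similarity: (1 - n[:distance]).round(4) } }
  .select { |r| r[:similarity] >= threshold }
  .first(limit)

# results => [{ content: "Cashback is credited monthly.", similarity: 0.87 },
#             { content: "Points expire after a year.",   similarity: 0.45 }]
```

The third candidate (similarity 0.08) falls below the 0.2 threshold and is discarded before the limit is applied.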
### RAG Chat

Combine semantic search with an LLM for context-aware answers.

```ruby
Ragify.chat("what's your refund policy?",
  context_model: FaqEntry,
  system_prompt: "You are a customer support agent.",
  search_limit: 5,
  history: [
    { role: "user", content: "hi" },
    { role: "assistant", content: "Hello! How can I help?" }
  ]
)
```

Returns a hash:

```ruby
{
  answer: "Our refund policy allows...",
  context: [{ record: ..., content: "...", similarity: 0.91 }],
  tokens: 487,
  had_context: true
}
```

If the OpenAI call fails, `answer` is `nil`, `had_context` is `false`, and the hash includes an `:error` key with the message.
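Before the prompt is sent, search results are flattened into a single context string, tagged with relevance percentages and separated by `---`. A plain-Ruby sketch mirroring the gem's `build_context` (in `lib/ragify/chat.rb` below), with illustrative sample results:

```ruby
results = [
  { content: "Refunds are issued within 14 days.", similarity: 0.91 },
  { content: "Contact support for exchanges.",     similarity: 0.67 }
]

# Each result becomes a "[Relevance: NN%]" block; blocks are joined
# with a "---" separator so the LLM can tell chunks apart.
context = results.map do |result|
  "[Relevance: #{(result[:similarity] * 100).round}%]\n#{result[:content]}"
end.join("\n\n---\n\n")

puts context
# [Relevance: 91%]
# Refunds are issued within 14 days.
#
# ---
#
# [Relevance: 67%]
# Contact support for exchanges.
```

This string is then prepended to the user's question inside the final user message.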
137
+ ### Batch Embedding
138
+
139
+ Generate embeddings for all records (useful for initial setup):
140
+
141
+ ```ruby
142
+ FaqEntry.embed_all!
143
+
144
+ # Or embed specific records
145
+ Ragify.embed("some text")
146
+ Ragify.embed_batch(["text 1", "text 2", "text 3"])
147
+ ```
148
+
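`embed_batch` preserves input order even if the API returns items out of order: each response item carries the index of the input it belongs to, and the client places embeddings by that index (see `lib/ragify/embedding_client.rb` below). A plain-Ruby sketch of that mapping, with an illustrative out-of-order response and shortened vectors:

```ruby
texts = ["text 1", "text 2", "text 3"]

# Shape of the OpenAI embeddings response "data" array; order is not
# guaranteed, but "index" ties each embedding back to its input.
data = [
  { "index" => 2, "embedding" => [0.5, 0.6] },
  { "index" => 0, "embedding" => [0.1, 0.2] },
  { "index" => 1, "embedding" => [0.3, 0.4] }
]

embeddings = Array.new(texts.size)
data.each do |item|
  index = item["index"]
  embeddings[index] = item["embedding"] if index && index < texts.size
end

# embeddings => [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
```

Missing items simply leave a `nil` slot, so callers can tell which texts failed.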
149
+ ## Configuration
150
+
151
+ | Option | Default | Description |
152
+ |--------|---------|-------------|
153
+ | `openai_api_key` | `ENV["OPENAI_API_KEY"]` | OpenAI API key |
154
+ | `embedding_model` | `text-embedding-3-small` | Embedding model |
155
+ | `embedding_dimensions` | `1536` | Vector dimensions |
156
+ | `chat_model` | `gpt-4o-mini` | Chat completion model |
157
+ | `max_tokens` | `1000` | Max response tokens |
158
+ | `temperature` | `0.3` | LLM temperature |
159
+ | `similarity_threshold` | `0.2` | Min cosine similarity |
160
+ | `search_limit` | `5` | Default search results |
161
+ | `ivfflat_probes` | `10` | IVFFlat index probes |
162
+ | `logger` | `Rails.logger` | Logger instance |
163
+
164
+ ## How It Works
165
+
166
+ 1. **Embedding**: When a record is saved, `ragify_content` extracts text and sends it to OpenAI's embedding API. The resulting vector is stored in a pgvector column.
167
+
168
+ 2. **Search**: Queries are embedded the same way, then pgvector finds the nearest neighbors using cosine similarity with an IVFFlat index.
169
+
170
+ 3. **Chat**: The search results are injected as context into an LLM prompt, which generates a grounded answer.
171
+
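The steps above hinge on cosine similarity: the dot product of two vectors divided by the product of their magnitudes. pgvector computes this in SQL, but the math is simple enough to sketch in plain Ruby:

```ruby
# Cosine similarity between two equal-length numeric vectors.
# 1.0 = same direction, 0.0 = orthogonal (unrelated meanings).
def cosine_similarity(a, b)
  dot   = a.zip(b).sum { |x, y| x * y }
  mag_a = Math.sqrt(a.sum { |x| x * x })
  mag_b = Math.sqrt(b.sum { |x| x * x })
  dot / (mag_a * mag_b)
end

cosine_similarity([1.0, 0.0], [1.0, 0.0]) # => 1.0 (identical direction)
cosine_similarity([1.0, 0.0], [0.0, 1.0]) # => 0.0 (orthogonal)
```

pgvector's cosine *distance* operator returns `1 - similarity`, which is why the search code converts back with `1 - neighbor_distance`.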
## Contributing

Bug reports and pull requests are welcome at https://github.com/pashgo/ragify.

## License

MIT License. See [LICENSE](LICENSE) for details.
data/Rakefile ADDED

# frozen_string_literal: true

require "bundler/gem_tasks"
require "rspec/core/rake_task"

RSpec::Core::RakeTask.new(:spec)

require "rubocop/rake_task"

RuboCop::RakeTask.new

task default: %i[spec rubocop]
data/lib/generators/ragify/install_generator.rb ADDED

# frozen_string_literal: true

require "rails/generators"
require "rails/generators/active_record"

module Ragify
  class InstallGenerator < Rails::Generators::Base
    include ActiveRecord::Generators::Migration

    source_root File.expand_path("templates", __dir__)

    argument :model_name, type: :string, desc: "Model to add embeddings to (e.g., FaqEntry)"

    def create_migration
      migration_template "add_embedding_migration.rb.erb",
                         "db/migrate/add_ragify_embedding_to_#{table_name}.rb"
    end

    def show_post_install
      say ""
      say "Ragify installed! Next steps:", :green
      say ""
      say "  1. Run the migration:"
      say "     rails db:migrate"
      say ""
      say "  2. Add to your model:"
      say "     class #{model_name} < ApplicationRecord"
      say "       include Ragify::Embeddable"
      say '       ragify_content { |r| "#{r.question} #{r.answer}" }'
      say "     end"
      say ""
      say "  3. Generate embeddings for existing records:"
      say "     #{model_name}.embed_all!"
      say ""
      say "  4. Search:"
      say "     #{model_name}.semantic_search(\"your query\")"
      say ""
    end

    private

    def table_name
      model_name.underscore.pluralize
    end

    def dimensions
      Ragify.configuration.embedding_dimensions
    end
  end
end
data/lib/generators/ragify/templates/add_embedding_migration.rb.erb ADDED

class AddRagifyEmbeddingTo<%= table_name.camelize %> < ActiveRecord::Migration[<%= ActiveRecord::Migration.current_version %>]
  def change
    enable_extension "vector" unless extension_enabled?("vector")

    add_column :<%= table_name %>, :embedding, :vector, limit: <%= dimensions %>
    add_index :<%= table_name %>, :embedding, using: :ivfflat, opclass: :vector_cosine_ops
  end
end
data/lib/ragify/chat.rb ADDED

# frozen_string_literal: true

module Ragify
  # RAG chat: find relevant context via semantic search, then ask an LLM.
  #
  #   Ragify.chat("how does cashback work?",
  #     context_model: FaqEntry,
  #     system_prompt: "You are a helpful assistant."
  #   )
  #   # => { answer: "Cashback is...", context: [...], tokens: 342 }
  #
  class Chat
    OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"

    DEFAULT_SYSTEM_PROMPT = "You are a helpful assistant. Answer questions based on the provided context. " \
                            "If the context doesn't contain relevant information, say you don't know."

    def initialize(context_model:, system_prompt: nil, search_limit: nil, history: nil)
      @context_model = context_model
      @system_prompt = system_prompt || DEFAULT_SYSTEM_PROMPT
      @search_limit = search_limit
      @history = history || []
    end

    def call(query)
      search_results = Search.new(@context_model).call(query, limit: @search_limit)
      context = build_context(search_results)
      messages = build_messages(query, context)

      response = chat_completion(messages)
      answer = response.dig("choices", 0, "message", "content")&.strip
      tokens = response.dig("usage", "total_tokens") || 0

      {
        answer: answer,
        context: search_results,
        tokens: tokens,
        had_context: search_results.any?
      }
    rescue HttpClient::ApiError, Net::OpenTimeout, Net::ReadTimeout => e
      config.logger.error("[Ragify::Chat] API error: #{e.message}")
      { answer: nil, context: [], tokens: 0, had_context: false, error: e.message }
    end

    private

    def build_context(results)
      return "" if results.empty?

      results.map do |result|
        "[Relevance: #{(result[:similarity] * 100).round}%]\n#{result[:content]}"
      end.join("\n\n---\n\n")
    end

    def build_messages(query, context)
      messages = [{ role: "system", content: @system_prompt }]

      @history.each do |msg|
        messages << { role: msg[:role].to_s, content: msg[:content].to_s }
      end

      user_content = if context.present?
        "Context:\n#{context}\n\n---\n\nQuestion: #{query}"
      else
        query
      end

      messages << { role: "user", content: user_content }
      messages
    end

    def chat_completion(messages)
      HttpClient.new.post_json(OPENAI_CHAT_URL, body: {
        model: config.chat_model,
        messages: messages,
        max_tokens: config.max_tokens,
        temperature: config.temperature
      }, timeout: 60)
    end

    def config
      Ragify.configuration
    end
  end
end
data/lib/ragify/configuration.rb ADDED

# frozen_string_literal: true

require "logger"

module Ragify
  class Configuration
    attr_accessor :openai_api_key,
                  :embedding_model,
                  :embedding_dimensions,
                  :chat_model,
                  :max_tokens,
                  :temperature,
                  :similarity_threshold,
                  :search_limit,
                  :ivfflat_probes,
                  :logger

    def initialize
      @openai_api_key = ENV.fetch("OPENAI_API_KEY", nil)
      @embedding_model = "text-embedding-3-small"
      @embedding_dimensions = 1536
      @chat_model = "gpt-4o-mini"
      @max_tokens = 1000
      @temperature = 0.3
      @similarity_threshold = 0.2
      @search_limit = 5
      @ivfflat_probes = 10
      @logger = defined?(Rails) ? Rails.logger : Logger.new($stdout)
    end

    def openai_api_key!
      openai_api_key || raise(Error, "Ragify: openai_api_key is not configured. " \
        "Set it via Ragify.configure { |c| c.openai_api_key = '...' } or ENV['OPENAI_API_KEY']")
    end
  end
end
data/lib/ragify/embeddable.rb ADDED

# frozen_string_literal: true

require "active_support/concern"

module Ragify
  # Include in ActiveRecord models to add vector embeddings and semantic search.
  #
  #   class FaqEntry < ApplicationRecord
  #     include Ragify::Embeddable
  #     ragify_content { |record| "#{record.question} #{record.answer}" }
  #   end
  #
  #   FaqEntry.semantic_search("how does cashback work?")
  #   FaqEntry.embed_all!
  #
  module Embeddable
    extend ActiveSupport::Concern

    included do
      has_neighbors :embedding

      scope :with_embedding, -> { where.not(embedding: nil) }

      after_commit :ragify_generate_embedding, on: %i[create update], if: :ragify_content_changed?
    end

    class_methods do
      # Define how to build the text content for embedding.
      #
      #   ragify_content { |record| "#{record.title} #{record.body}" }
      #   ragify_content :embedding_text # calls record.embedding_text
      #
      def ragify_content(method_name = nil, &block)
        if block
          @ragify_content_proc = block
        elsif method_name
          @ragify_content_proc = ->(record) { record.public_send(method_name) }
        else
          raise ArgumentError, "ragify_content requires a method name or block"
        end
      end

      def ragify_content_proc
        @ragify_content_proc || raise(Error, "#{name}: call `ragify_content` to define what text to embed")
      end

      # Search for records semantically similar to the query string.
      #
      #   FaqEntry.semantic_search("refund policy", limit: 5, threshold: 0.3)
      #
      def semantic_search(query, limit: nil, threshold: nil)
        Search.new(self).call(query, limit: limit, threshold: threshold)
      end

      # Generate embeddings for all records (or a subset).
      #
      #   FaqEntry.embed_all!
      #   FaqEntry.where(embedding: nil).embed_all!
      #
      def embed_all!(batch_size: 50)
        client = EmbeddingClient.new
        find_each(batch_size: batch_size) do |record|
          text = ragify_content_proc.call(record)
          embedding = client.generate(text)
          record.update_column(:embedding, embedding) if embedding # rubocop:disable Rails/SkipsModelValidations
        end
      end
    end

    # Returns the text that will be embedded for this record.
    def ragify_text
      self.class.ragify_content_proc.call(self)
    end

    private

    def ragify_generate_embedding
      text = ragify_text
      return if text.blank?

      embedding = Ragify.embed(text)
      update_column(:embedding, embedding) if embedding # rubocop:disable Rails/SkipsModelValidations
    end

    def ragify_content_changed?
      # Always regenerate; the content proc may depend on any attribute
      true
    end
  end
end
data/lib/ragify/embedding_client.rb ADDED

# frozen_string_literal: true

module Ragify
  class EmbeddingClient
    OPENAI_URL = "https://api.openai.com/v1/embeddings"
    MAX_TEXT_LENGTH = 30_000

    def generate(text)
      return nil if text.nil? || text.to_s.strip.empty?

      response = request(truncate(text))
      response.dig("data", 0, "embedding")
    rescue HttpClient::ApiError, Net::OpenTimeout, Net::ReadTimeout => e
      config.logger.error("[Ragify] Embedding failed: #{e.message}")
      nil
    end

    def generate_batch(texts)
      return [] if texts.nil? || texts.empty?

      truncated = texts.map { |t| truncate(t) }
      response = request(truncated)

      data = response["data"] || []
      embeddings = Array.new(texts.size)

      data.each do |item|
        index = item["index"]
        embeddings[index] = item["embedding"] if index && index < texts.size
      end

      embeddings
    rescue HttpClient::ApiError, Net::OpenTimeout, Net::ReadTimeout => e
      config.logger.error("[Ragify] Batch embedding failed: #{e.message}")
      Array.new(texts.size)
    end

    private

    def request(input)
      HttpClient.new.post_json(OPENAI_URL, body: { model: config.embedding_model, input: input })
    end

    def truncate(text)
      text.to_s[0, MAX_TEXT_LENGTH]
    end

    def config
      Ragify.configuration
    end
  end
end
data/lib/ragify/http_client.rb ADDED

# frozen_string_literal: true

require "net/http"
require "json"
require "uri"

module Ragify
  # Minimal HTTP client using Net::HTTP. No external dependencies.
  class HttpClient
    class ApiError < Error; end

    def post_json(url, body:, timeout: 30)
      uri = URI.parse(url)

      http = Net::HTTP.new(uri.host, uri.port)
      http.use_ssl = true
      http.open_timeout = 10
      http.read_timeout = timeout

      request = Net::HTTP::Post.new(uri.request_uri)
      request["Content-Type"] = "application/json"
      request["Authorization"] = "Bearer #{Ragify.configuration.openai_api_key!}"
      request.body = JSON.generate(body)

      response = http.request(request)

      unless response.is_a?(Net::HTTPSuccess)
        raise ApiError, "OpenAI API error #{response.code}: #{response.body}"
      end

      JSON.parse(response.body)
    end
  end
end
data/lib/ragify/search.rb ADDED

# frozen_string_literal: true

module Ragify
  # Semantic vector search against an Embeddable model.
  #
  #   Search.new(FaqEntry).call("how does cashback work?")
  #   # => [{ record: #<FaqEntry>, content: "...", similarity: 0.87 }, ...]
  #
  class Search
    def initialize(model)
      @model = model
    end

    def call(query, limit: nil, threshold: nil)
      limit ||= config.search_limit
      threshold ||= config.similarity_threshold

      embedding = EmbeddingClient.new.generate(query)
      return [] if embedding.nil?

      set_ivfflat_probes

      # Over-fetch, then filter by threshold and truncate to the limit.
      results = @model
        .with_embedding
        .nearest_neighbors(:embedding, embedding, distance: "cosine")
        .limit(limit * 2)
        .to_a

      results
        .select { |record| (1 - record.neighbor_distance) >= threshold }
        .first(limit)
        .map { |record| format_result(record) }
    end

    private

    def set_ivfflat_probes
      @model.connection.execute("SET ivfflat.probes = #{config.ivfflat_probes}")
    end

    def format_result(record)
      similarity = (1 - record.neighbor_distance).round(4)
      content = record.ragify_text

      { record: record, content: content, similarity: similarity }
    end

    def config
      Ragify.configuration
    end
  end
end
data/lib/ragify/version.rb ADDED

# frozen_string_literal: true

module Ragify
  VERSION = "0.1.0"
end
data/lib/ragify.rb ADDED

# frozen_string_literal: true

module Ragify
  # Defined before the requires below: HttpClient::ApiError subclasses it at load time.
  class Error < StandardError; end
end

require_relative "ragify/version"
require_relative "ragify/configuration"
require_relative "ragify/http_client"
require_relative "ragify/embedding_client"
require_relative "ragify/embeddable"
require_relative "ragify/search"
require_relative "ragify/chat"

module Ragify
  class << self
    attr_writer :configuration

    def configuration
      @configuration ||= Configuration.new
    end

    def configure
      yield(configuration)
    end

    def chat(query, context_model:, system_prompt: nil, **options)
      Chat.new(context_model: context_model, system_prompt: system_prompt, **options).call(query)
    end

    def embed(text)
      EmbeddingClient.new.generate(text)
    end

    def embed_batch(texts)
      EmbeddingClient.new.generate_batch(texts)
    end
  end
end
data/sig/ragify.rbs ADDED

module Ragify
  VERSION: String
  # See the writing guide of rbs: https://github.com/ruby/rbs#guides
end
metadata ADDED

--- !ruby/object:Gem::Specification
name: ragify
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Pavel Skripin
autorequire:
bindir: exe
cert_chain: []
date: 2026-03-23 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: activerecord
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '7.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '7.0'
- !ruby/object:Gem::Dependency
  name: activesupport
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '7.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '7.0'
- !ruby/object:Gem::Dependency
  name: neighbor
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0.3'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0.3'
description: Add semantic search and AI-powered chat to any ActiveRecord model. Uses
  pgvector for vector storage, OpenAI for embeddings, and your existing PostgreSQL
  database.
email:
- skripin.pavel@gmail.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- README.md
- Rakefile
- lib/generators/ragify/install_generator.rb
- lib/generators/ragify/templates/add_embedding_migration.rb.erb
- lib/ragify.rb
- lib/ragify/chat.rb
- lib/ragify/configuration.rb
- lib/ragify/embeddable.rb
- lib/ragify/embedding_client.rb
- lib/ragify/http_client.rb
- lib/ragify/search.rb
- lib/ragify/version.rb
- sig/ragify.rbs
homepage: https://github.com/pashgo/ragify
licenses:
- MIT
metadata:
  homepage_uri: https://github.com/pashgo/ragify
  source_code_uri: https://github.com/pashgo/ragify
  changelog_uri: https://github.com/pashgo/ragify/blob/main/CHANGELOG.md
  rubygems_mfa_required: 'true'
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: 3.2.0
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.5.11
signing_key:
specification_version: 4
summary: RAG (Retrieval-Augmented Generation) for Rails with pgvector and OpenAI
test_files: []