endee-llamaindex 0.1.2__py3-none-any.whl → 0.1.5a1__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,140 +0,0 @@
- Metadata-Version: 2.4
- Name: endee-llamaindex
- Version: 0.1.2
- Summary: Vector Database for Fast ANN Searches
- Home-page: https://endee.io
- Author: Endee Labs
- Author-email: vineet@endee.io
- Classifier: Programming Language :: Python :: 3
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Operating System :: OS Independent
- Requires-Python: >=3.6
- Description-Content-Type: text/markdown
- Requires-Dist: llama-index>=0.12.34
- Requires-Dist: endee>=0.1.2
- Dynamic: author
- Dynamic: author-email
- Dynamic: classifier
- Dynamic: description
- Dynamic: description-content-type
- Dynamic: home-page
- Dynamic: requires-dist
- Dynamic: requires-python
- Dynamic: summary
-
- # Endee LlamaIndex Integration
-
- This package provides an integration between [Endee](https://endeedb.ai) (a vector database) and [LlamaIndex](https://www.llamaindex.ai/), allowing you to use Endee as a vector store backend for LlamaIndex.
-
- ## Features
-
- - **Vector Storage**: Use Endee for your LlamaIndex embeddings
- - **Multiple Distance Metrics**: Support for cosine, L2, and inner product distance metrics
- - **Metadata Filtering**: Filter search results based on metadata
- - **High Performance**: Optimized for speed and efficiency
-
- ## Installation
-
- ```bash
- pip install endee-llamaindex
- ```
-
- This installs the `endee-llamaindex` package along with its dependencies, `endee` and `llama-index`.
-
- ## Quick Start
-
- ```python
- import os
- from llama_index.core.schema import TextNode
- from llama_index.core.vector_stores.types import VectorStoreQuery
- from endee_llamaindex import EndeeVectorStore
-
- # Configure your Endee credentials
- api_token = os.environ.get("ENDEE_API_TOKEN")
- index_name = "my_llamaindex_vectors"
- dimension = 1536  # OpenAI ada-002 embedding dimension
-
- # Initialize the vector store
- vector_store = EndeeVectorStore.from_params(
-     api_token=api_token,
-     index_name=index_name,
-     dimension=dimension,
-     space_type="cosine"
- )
-
- # Create a node with embedding
- node = TextNode(
-     text="This is a sample document",
-     id_="doc1",
-     embedding=[0.1, 0.2, 0.3, ...],  # Your embedding vector
-     metadata={
-         "doc_id": "doc1",
-         "source": "example",
-         "author": "Endee"
-     }
- )
-
- # Add the node to the vector store
- vector_store.add([node])
-
- # Query the vector store
- query = VectorStoreQuery(
-     query_embedding=[0.2, 0.3, 0.4, ...],  # Your query vector
-     similarity_top_k=5
- )
-
- results = vector_store.query(query)
-
- # Process results
- for node, score in zip(results.nodes, results.similarities):
-     print(f"Node ID: {node.node_id}, Similarity: {score}")
-     print(f"Text: {node.text}")
-     print(f"Metadata: {node.metadata}")
- ```
-
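The Features list above advertises metadata filtering, but the packaged README does not show a filter syntax. A minimal sketch, continuing from the Quick Start and assuming `EndeeVectorStore` honors LlamaIndex's generic `MetadataFilters` passed through `VectorStoreQuery`:

```python
from llama_index.core.vector_stores.types import (
    MetadataFilter,
    MetadataFilters,
    VectorStoreQuery,
)

# Restrict results to nodes whose metadata has source == "example"
filters = MetadataFilters(filters=[MetadataFilter(key="source", value="example")])

filtered_query = VectorStoreQuery(
    query_embedding=[0.2, 0.3, 0.4, ...],  # Your query vector
    similarity_top_k=5,
    filters=filters,  # assumption: the store applies these as Endee-side metadata filters
)
results = vector_store.query(filtered_query)
```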
- ## Using with LlamaIndex
-
- ```python
- from llama_index.core import VectorStoreIndex, StorageContext
- from llama_index.embeddings.openai import OpenAIEmbedding
-
- # Initialize your nodes or documents
- nodes = [...]  # Your nodes with text but no embeddings yet
-
- # Setup embedding function
- embed_model = OpenAIEmbedding()  # Or any other embedding model
-
- # Initialize Endee vector store
- vector_store = EndeeVectorStore.from_params(
-     api_token=api_token,
-     index_name=index_name,
-     dimension=1536,  # Make sure this matches your embedding dimension
- )
-
- # Create storage context
- storage_context = StorageContext.from_defaults(vector_store=vector_store)
-
- # Create vector index
- index = VectorStoreIndex(
-     nodes,
-     storage_context=storage_context,
-     embed_model=embed_model
- )
-
- # Query the index
- query_engine = index.as_query_engine()
- response = query_engine.query("Your query here")
- print(response)
- ```
-
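If the Endee index was already populated in an earlier run, you usually do not want to re-insert nodes. A short sketch of the standard LlamaIndex pattern for attaching to an existing vector store (plain LlamaIndex API; nothing Endee-specific is assumed beyond the `vector_store` built above):

```python
from llama_index.core import VectorStoreIndex

# Wrap the existing store; no nodes are inserted, only query text is embedded
index = VectorStoreIndex.from_vector_store(
    vector_store,
    embed_model=embed_model,
)

query_engine = index.as_query_engine()
print(query_engine.query("Your query here"))
```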
- ## Configuration Options
-
- The `EndeeVectorStore` constructor accepts the following parameters:
-
- - `api_token`: Your Endee API token
- - `index_name`: Name of the Endee index
- - `dimension`: Vector dimension (required when creating a new index)
- - `space_type`: Distance metric, one of "cosine", "l2", or "ip" (default: "cosine")
- - `batch_size`: Number of vectors to insert in a single API call (default: 100)
- - `text_key`: Key to use for storing text in metadata (default: "text")
- - `remove_text_from_metadata`: Whether to remove text from metadata (default: False)
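Putting the options together, a sketch that assumes `from_params` forwards the same keyword arguments as the constructor (the Quick Start already passes four of them this way):

```python
vector_store = EndeeVectorStore.from_params(
    api_token=api_token,              # Endee API token
    index_name="my_llamaindex_vectors",
    dimension=1536,                   # required when creating a new index
    space_type="cosine",              # "cosine", "l2", or "ip"
    batch_size=100,                   # vectors per insert API call
    text_key="text",                  # metadata key that stores node text
    remove_text_from_metadata=False,  # keep node text in metadata
)
```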
@@ -1,6 +0,0 @@
- endee_llamaindex/__init__.py,sha256=ctCcicNLMO3LpXPGLwvQifvQLX7TEd8CYgFO6Nd9afc,83
- endee_llamaindex/base.py,sha256=g5o5020lZuccMuKdaeNTAQ3a8J368rhIQypeCkOZjFk,13888
- endee_llamaindex-0.1.2.dist-info/METADATA,sha256=7unMMmO3QT520VFRp7UIIpm75VmYVZsx5e_FfJXt1Us,4088
- endee_llamaindex-0.1.2.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- endee_llamaindex-0.1.2.dist-info/top_level.txt,sha256=AReiKL0lBXSdKPsQlDusPIH_qbS_txOSUctuCR0rRNQ,17
- endee_llamaindex-0.1.2.dist-info/RECORD,,