local-deep-research 0.3.0__py3-none-any.whl → 0.3.2__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,349 @@
+ Metadata-Version: 2.1
+ Name: local-deep-research
+ Version: 0.3.2
+ Summary: AI-powered research assistant with deep, iterative analysis using LLMs and web searches
+ Author-Email: LearningCircuit <185559241+LearningCircuit@users.noreply.github.com>, HashedViking <6432677+HashedViking@users.noreply.github.com>
+ License: MIT License
+
+ Copyright (c) 2025 LearningCircuit
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
+
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Project-URL: Homepage, https://github.com/LearningCircuit/local-deep-research
+ Project-URL: Bug Tracker, https://github.com/LearningCircuit/local-deep-research/issues
+ Requires-Python: >=3.10
+ Requires-Dist: langchain>=0.3.18
+ Requires-Dist: langchain-community>=0.3.17
+ Requires-Dist: langchain-core>=0.3.34
+ Requires-Dist: langchain-ollama>=0.2.3
+ Requires-Dist: langchain-openai>=0.3.5
+ Requires-Dist: langchain_anthropic>=0.3.7
+ Requires-Dist: duckduckgo_search>=7.3.2
+ Requires-Dist: python-dateutil>=2.9.0
+ Requires-Dist: typing_extensions>=4.12.2
+ Requires-Dist: justext
+ Requires-Dist: playwright
+ Requires-Dist: beautifulsoup4
+ Requires-Dist: flask>=3.1.0
+ Requires-Dist: flask-cors>=3.0.10
+ Requires-Dist: flask-socketio>=5.1.1
+ Requires-Dist: sqlalchemy>=1.4.23
+ Requires-Dist: wikipedia
+ Requires-Dist: arxiv>=1.4.3
+ Requires-Dist: pypdf
+ Requires-Dist: sentence-transformers
+ Requires-Dist: faiss-cpu
+ Requires-Dist: pydantic>=2.0.0
+ Requires-Dist: pydantic-settings>=2.0.0
+ Requires-Dist: toml>=0.10.2
+ Requires-Dist: platformdirs>=3.0.0
+ Requires-Dist: dynaconf
+ Requires-Dist: requests>=2.28.0
+ Requires-Dist: tiktoken>=0.4.0
+ Requires-Dist: xmltodict>=0.13.0
+ Requires-Dist: lxml>=4.9.2
+ Requires-Dist: pdfplumber>=0.9.0
+ Requires-Dist: unstructured>=0.10.0
+ Requires-Dist: google-search-results
+ Requires-Dist: importlib-resources>=6.5.2
+ Requires-Dist: setuptools>=78.1.0
+ Requires-Dist: flask-wtf>=1.2.2
+ Description-Content-Type: text/markdown
+
+ # Local Deep Research
+
+ <div align="center">
+
+ [![GitHub stars](https://img.shields.io/github/stars/LearningCircuit/local-deep-research?style=for-the-badge)](https://github.com/LearningCircuit/local-deep-research/stargazers)
+ [![License](https://img.shields.io/badge/License-MIT-green.svg?style=for-the-badge)](LICENSE)
+ [![Discord](https://img.shields.io/discord/1352043059562680370?style=for-the-badge&logo=discord)](https://discord.gg/ttcqQeFcJ3)
+ [![Reddit](https://img.shields.io/badge/Reddit-r/LocalDeepResearch-FF4500?style=for-the-badge&logo=reddit)](https://www.reddit.com/r/LocalDeepResearch/)
+
+ *AI-powered research assistant that performs deep, iterative analysis using multiple LLMs and web searches*
+
+ <a href="https://www.youtube.com/watch?v=0ISreg9q0p0">
+   <img src="https://img.youtube.com/vi/0ISreg9q0p0/0.jpg" alt="Local Deep Research Demo" width="500">
+ </a>
+
+ </div>
+
+ ## 📋 Overview
+
+ Local Deep Research is a powerful AI research assistant that:
+
+ 1. **Performs iterative, multi-source research** on any topic
+ 2. **Creates comprehensive reports or quick summaries** with proper citations
+ 3. **Runs locally** for complete privacy when using local LLMs
+ 4. **Searches across multiple sources** including academic databases & the web
+ 5. **Processes your own documents** with vector search (RAG)
+ 6. **Optimized for speed** with parallel search processing
+
+ Local Deep Research combines the power of large language models with intelligent search strategies to provide well-researched, properly cited answers to complex questions. It can process queries in just seconds with the Quick Summary option, or create detailed reports with proper section organization for more comprehensive analysis.
+
+ ## ⚡ Quick Start (Recommended)
+
+ ```bash
+ # 1. Install
+ pip install local-deep-research
+
+ # 2. Set up SearXNG for best results
+ docker pull searxng/searxng
+ docker run -d -p 8080:8080 --name searxng searxng/searxng
+ docker start searxng  # required after every reboot
+
+ # 3. Install Ollama and pull a model
+ # Download from https://ollama.ai and run:
+ ollama pull gemma3:12b
+
+ # 4. Start the web interface
+ python -m local_deep_research.web.app
+ ```
+
+ Then visit `http://127.0.0.1:5000` to start researching!
+
+ ### Alternative Installation Options
+
+ **Windows Installer**: Download the [Windows Installer](https://github.com/LearningCircuit/local-deep-research/releases/download/v0.1.0/LocalDeepResearch_Setup.exe) for one-click setup.
+
+ **Docker**: Run with Docker using:
+ ```bash
+ docker run --network=host \
+   local-deep-research
+ ```
+
+ **Command Line**: Alternatively, use the CLI version with:
+ ```bash
+ python -m local_deep_research.main
+ ```
+
+ ## 🔍 Research Capabilities
+
+ ### Two Research Modes
+
+ - **Quick Summary**: Fast results (30s-3min) with key information and proper citations
+   - Perfect for rapid exploration and answering straightforward questions
+   - Supports multiple search engines in parallel for maximum efficiency
+   - Tables and structured information can be included when relevant
+
+ - **Detailed Report**: Comprehensive analysis with structured sections, table of contents, and in-depth exploration
+   - Creates professional-grade reports with proper organization
+   - Conducts separate research for each section to ensure comprehensive coverage
+   - Integrates information across sections for a cohesive analysis
+   - Includes proper citations and reference tracking
+
+ ### Performance Optimization
+
+ - **Use Direct SearXNG**: For maximum speed (bypasses LLM calls needed for engine selection)
+ - **Adjust Iteration Depth** (see the sketch after this list):
+   - 1 iteration: Quick factual questions (~30 seconds)
+   - 2-3 iterations: Complex topics requiring deeper exploration (2-3 minutes)
+   - 3-5 iterations: Comprehensive research with follow-up investigation (5+ minutes)
+ - **Choose Appropriate Models**:
+   - 12B-30B parameter models offer a good balance of quality and speed
+   - For complex research, larger models may provide better synthesis
+ - **For Detailed Reports**: Expect multiple research cycles (one per section) and longer processing times
+
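+ The iteration counts above map directly onto the `iterations` parameter of the Python API shown later under Programmatic Access. A minimal sketch follows, assuming the remaining `quick_summary` parameters keep their defaults (actual timings depend on your model and hardware):
+
+ ```python
+ from local_deep_research import quick_summary
+
+ # Quick factual question: a single iteration keeps latency around ~30 seconds
+ fast = quick_summary(query="What year was the first tokamak built?", iterations=1)
+
+ # Deeper exploration: extra iterations trigger follow-up questions and searches
+ deep = quick_summary(
+     query="current approaches to solid-state battery manufacturing",
+     iterations=3,
+     questions_per_iteration=2,
+ )
+ print(deep["summary"])
+ ```
+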
+ ### Multi-Source Integration
+
+ - **Auto-Engine Selection**: The system intelligently selects the most appropriate search engines for your query
+ - **Academic Sources**: Direct access to Wikipedia, arXiv, PubMed, Semantic Scholar, and more
+ - **Web Search**: Via SearXNG, Brave Search, SerpAPI (for Google results), and more
+ - **Local Document Search**: Search through your private document collections with vector embeddings
+ - **Cross-Engine Filtering**: Smart result ranking across search engines for better information quality
+
+ ## 🤖 LLM Support
+
+ Local Deep Research works with both local and cloud LLMs:
+
+ ### Local Models (via Ollama)
+
+ Local models provide complete privacy and don't require API keys or an internet connection for the LLM component (only search queries go online).
+
+ ```bash
+ # Install Ollama from https://ollama.ai
+ ollama pull gemma3:12b # Recommended model
+ ```
+
+ Recommended local models:
+ - **Gemma 3 (12B)** - Great balance of quality and speed
+ - **Mistral (7B/8x7B)** - Fast performance on most hardware
+ - **Llama 3 (8B/70B)** - Good performance across various tasks
+
+ ### Cloud Models
+
+ Cloud models can provide higher quality results for complex research tasks.
+
+ API keys can be configured directly through the web interface in the settings panel or via environment variables:
+
+ ```bash
+ # Cloud LLM providers - add to your .env file if not using the web UI
+ LDR_LLM_ANTHROPIC_API_KEY=your-api-key-here # For Claude models
+ LDR_LLM_OPENAI_API_KEY=your-openai-key-here # For GPT models
+ LDR_LLM_OPENAI_ENDPOINT_API_KEY=your-key-here # For OpenRouter or similar services
+
+ # Set your preferred provider and model
+ LDR_LLM_PROVIDER=ollama # Options: ollama, openai, anthropic, etc.
+ LDR_LLM_MODEL=gemma3:12b # Model name to use
+ ```
+
+ ### Supported Providers
+
+ | Provider | Type | Setup | Models |
+ |----------|------|---------|--------|
+ | `OLLAMA` | Local | Install from [ollama.ai](https://ollama.ai) | Mistral, Llama, Gemma, etc. |
+ | `OPENAI` | Cloud | API key required | GPT-3.5, GPT-4, GPT-4o |
+ | `ANTHROPIC` | Cloud | API key required | Claude 3 Opus, Sonnet, Haiku |
+ | `OPENAI_ENDPOINT` | Cloud | API key required | Any OpenAI-compatible API |
+ | `VLLM` | Local | Requires GPU setup | Any supported by vLLM |
+ | `LMSTUDIO` | Local | Use LM Studio server | Models from LM Studio |
+ | `LLAMACPP` | Local | Configure model path | GGUF model formats |
+
+ You can easily switch between models in the web interface or via environment variables without reinstalling.
+
+ ## 🌐 Search Engines
+
+ The system leverages multiple search engines to find the most relevant information for your queries.
+
+ ### Core Free Engines (No API Key Required)
+
+ - **`auto`**: Intelligently selects the best engines based on your query (recommended)
+ - **`wikipedia`**: General knowledge, facts, and encyclopedic information
+ - **`arxiv`**: Scientific papers and academic research
+ - **`pubmed`**: Medical and biomedical research and journals
+ - **`semantic_scholar`**: Academic literature across all fields
+ - **`github`**: Code repositories, documentation, and technical discussions
+ - **`searxng`**: Comprehensive web search via a local SearXNG instance
+ - **`wayback`**: Historical web content from the Internet Archive
+
+ ### Paid Engines (API Key Required)
+
+ For enhanced web search capabilities, you can configure these additional engines through the settings interface or via environment variables:
+
+ ```bash
+ # Search API keys (if not using the web UI)
+ SERP_API_KEY=your-key-here # Google results via SerpAPI
+ GOOGLE_PSE_API_KEY=your-key-here # Google Programmable Search
+ BRAVE_API_KEY=your-key-here # Brave Search
+ ```
+
+ ### Search Engine Comparison
+
+ | Engine | Specialization | Privacy | Speed | Results Quality |
+ |--------|----------------|---------|-------|-----------------|
+ | SearXNG | General web | ★★★★★ | ★★★★★ | ★★★★½ |
+ | Wikipedia | Facts & concepts | ★★★★★ | ★★★★☆ | ★★★★☆ |
+ | arXiv | Scientific research | ★★★★★ | ★★★★☆ | ★★★★★ |
+ | PubMed | Medical research | ★★★★★ | ★★★★☆ | ★★★★★ |
+ | GitHub | Code & tech | ★★★★★ | ★★★☆☆ | ★★★★☆ |
+ | SerpAPI | Web (Google) | ★★☆☆☆ | ★★★★☆ | ★★★★★ |
+ | Brave | Web (privacy-focused) | ★★★★☆ | ★★★★☆ | ★★★★☆ |
+
+ ## 📚 Local Document Search (RAG)
+
+ Local Deep Research includes powerful Retrieval Augmented Generation (RAG) capabilities, allowing you to search and analyze your own private documents using vector embeddings.
+
+ ### Supported Document Types
+
+ - PDF files
+ - Markdown (.md)
+ - Plain text (.txt)
+ - Microsoft Word (.docx, .doc)
+ - Excel spreadsheets (.xlsx, .xls)
+ - CSV files
+ - And more
+
+ ### Using Document Collections
+
+ You can use your documents in research via:
+ - Auto-selection (when relevant to the query)
+ - Direct collection selection: `tool = "project_docs"`
+ - All collections: `tool = "local_all"`
+ - Query syntax: `collection:project_docs your query`
+
+ This allows you to integrate your private knowledge base with web search results for comprehensive research that includes your own documents and data.
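+
+ For programmatic use, a collection can be targeted through the same Python API. This is a minimal sketch, assuming the `search_tool` parameter of `quick_summary` accepts the collection names used by the `tool` setting above (`"project_docs"`, `"local_all"`); `"project_docs"` itself is just an example collection name:
+
+ ```python
+ from local_deep_research import quick_summary
+
+ # Restrict research to a single local collection (example name "project_docs")
+ results = quick_summary(
+     query="summarize the architecture decisions in our design notes",
+     search_tool="project_docs",  # assumption: collection names double as search tools
+ )
+ print(results["summary"])
+ ```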
+
+ ## 🛠️ Advanced Configuration
+
+ ### Web Interface
+
+ The easiest way to configure Local Deep Research is through the web interface, which provides:
+ - Complete settings management
+ - Model selection
+ - Search engine configuration
+ - Research parameter adjustment
+ - Local document collection setup
+
+ ### Configuration Documentation
+
+ For detailed configuration options, see our guides:
+ - [Environment Variables Guide](https://github.com/LearningCircuit/local-deep-research/blob/main/docs/env_configuration.md)
+ - [SearXNG Setup Guide](https://github.com/LearningCircuit/local-deep-research/blob/main/docs/SearXNG-Setup.md)
+ - [Docker Usage Guide](https://github.com/LearningCircuit/local-deep-research/blob/main/docs/docker-usage-readme.md)
+ - [Docker Compose Guide](https://github.com/LearningCircuit/local-deep-research/blob/main/docs/docker-compose-guide.md)
+
+ ### Programmatic Access
+
+ Use the Python API for integration with other tools or scripts:
+
+ ```python
+ from local_deep_research import quick_summary, generate_report
+
+ # Quick research with custom parameters
+ results = quick_summary(
+     query="advances in fusion energy",
+     search_tool="auto",
+     iterations=1,
+     questions_per_iteration=2,
+     max_results=30,
+     temperature=0.7
+ )
+ print(results["summary"])
+ ```
+
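+ The import above also exposes `generate_report` for the Detailed Report mode. The sketch below is hedged: it assumes `generate_report` accepts the same core keyword arguments as `quick_summary` and returns the generated markdown under a `"content"` key; see the tutorial linked below for the authoritative signature:
+
+ ```python
+ from local_deep_research import generate_report
+
+ # Detailed Report mode: slower, runs one research cycle per report section
+ report = generate_report(
+     query="the state of solid-state battery research",
+     search_tool="auto",  # assumption: same engine names as quick_summary
+ )
+
+ # Assumption: the structured report text is returned under a "content" key
+ with open("battery_report.md", "w", encoding="utf-8") as f:
+     f.write(report["content"])
+ ```
+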
+ For more examples, see the [programmatic access tutorial](https://github.com/LearningCircuit/local-deep-research/blob/main/examples/programmatic_access.ipynb).
+
+ ## 📊 Examples & Documentation
+
+ For more information and examples of what Local Deep Research can produce:
+
+ - [Example Outputs](https://github.com/LearningCircuit/local-deep-research/tree/main/examples)
+ - [Documentation](https://github.com/LearningCircuit/local-deep-research/tree/main/docs)
+ - [Wiki](https://github.com/LearningCircuit/local-deep-research/wiki)
+
+ ## 🤝 Community & Support
+
+ - [Discord](https://discord.gg/ttcqQeFcJ3): Discuss features, get help, and share research techniques
+ - [Reddit](https://www.reddit.com/r/LocalDeepResearch/): Announcements, updates, and community showcase
+ - [GitHub Issues](https://github.com/LearningCircuit/local-deep-research/issues): Bug reports and feature requests
+
+ ## 📄 License & Acknowledgments
+
+ This project is licensed under the MIT License.
+
+ Built with powerful open-source tools:
+ - [LangChain](https://github.com/hwchase17/langchain) framework for LLM integration
+ - [Ollama](https://ollama.ai) for local AI model management
+ - [SearXNG](https://searxng.org/) for privacy-focused web search
+ - [FAISS](https://github.com/facebookresearch/faiss) for vector similarity search
+ - [justext](https://github.com/miso-belica/justext) and [Playwright](https://playwright.dev) for web content analysis
+
+ > **Support Free Knowledge:** If you frequently use the search engines in this tool, please consider making a donation to organizations like [Wikipedia](https://donate.wikimedia.org), [arXiv](https://arxiv.org/about/give), or [PubMed](https://www.nlm.nih.gov/pubs/donations/donations.html).
@@ -1,9 +1,10 @@
- local_deep_research-0.3.0.dist-info/METADATA,sha256=WSZtS5ZFrspdz3dcvptFc3-KSgYuHKmUADJAsmFCBQI,20274
- local_deep_research-0.3.0.dist-info/WHEEL,sha256=tSfRZzRHthuv7vxpI4aehrdN9scLjk-dCJkPLzkHxGg,90
- local_deep_research-0.3.0.dist-info/entry_points.txt,sha256=GcXS501Rjh-P80S8db7hnrQ23mS_Jg27PwpVQVO77as,113
- local_deep_research-0.3.0.dist-info/licenses/LICENSE,sha256=Qg2CaTdu6SWnSqk1_JtgBPp_Da-LdqJDhT1Vt1MUc5s,1072
- local_deep_research/__init__.py,sha256=L2MM_sEYFSoZlmBB3VgmFrFArmN2F6LSvYM5ipdVfTw,870
+ local_deep_research-0.3.2.dist-info/METADATA,sha256=nEBQaPbFORrkT0zPmgOC2ja04hQUB0Dpu3C9tBAFx2k,15340
+ local_deep_research-0.3.2.dist-info/WHEEL,sha256=tSfRZzRHthuv7vxpI4aehrdN9scLjk-dCJkPLzkHxGg,90
+ local_deep_research-0.3.2.dist-info/entry_points.txt,sha256=GcXS501Rjh-P80S8db7hnrQ23mS_Jg27PwpVQVO77as,113
+ local_deep_research-0.3.2.dist-info/licenses/LICENSE,sha256=Qg2CaTdu6SWnSqk1_JtgBPp_Da-LdqJDhT1Vt1MUc5s,1072
+ local_deep_research/__init__.py,sha256=9wV3oonZMEHsE_JhyZU9P0hW2Uwv47zotGlbAB_gQiA,885
  local_deep_research/__main__.py,sha256=LIxK5iS6aLAKMFBDpUS3V-jDcxchqi3eSUsI2jAZUXk,371
+ local_deep_research/__version__.py,sha256=vNiWJ14r_cw5t_7UDqDQIVZvladKFGyHH2avsLpN7Vg,22
  local_deep_research/advanced_search_system/__init__.py,sha256=sGusMj4eFIrhXR6QbOM16UDKB6aI-iS4IFivKWpMlh0,234
  local_deep_research/advanced_search_system/filters/__init__.py,sha256=2dXrV4skcVHI2Lb3BSL2Ajq0rnLeSw7kc1MbIynMxa4,190
  local_deep_research/advanced_search_system/filters/base_filter.py,sha256=dFNQ7U2dj4bf3voT73YhcG-w9eW-BTlc4F9kstFcETY,969
@@ -35,11 +36,11 @@ local_deep_research/api/research_functions.py,sha256=SItLEuib94AXrhMsgmYDtykGrVm
  local_deep_research/app.py,sha256=U_92UX0dpVAQoaXciVNy_By_AyDEWGlXSeTwFpohALQ,155
  local_deep_research/citation_handler.py,sha256=MZVd6xl7g3xrWauFBPuVIC36z8onc-zQb8xI4dQXxsU,4307
  local_deep_research/config/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- local_deep_research/config/llm_config.py,sha256=Ot61pHJVAjgX9l3SmaoyxcaQg1pd2dmK24XYL8JyTs8,14743
+ local_deep_research/config/llm_config.py,sha256=r4ubJaNV3F5gLCpmpcNeCINH-ce_F6iEVO20QEjLGcY,14751
  local_deep_research/config/search_config.py,sha256=ruryPSS4Wy9-xi_02c-98KLKaELeLnZ10pnCpc0-ogg,2171
  local_deep_research/defaults/.env.template,sha256=_eVCy4d_XwpGXy8n50CG3wH9xx2oqJCFKS7IbqgInDk,491
  local_deep_research/defaults/__init__.py,sha256=C_0t0uZmtrVB4rM9NM9Wx8PJU5kFcT-qOHvws5W2iOg,1352
- local_deep_research/defaults/default_settings.json,sha256=5-oP17moaO6hgcK5nE1YcOih_JqE0VyfYtYD4be0bdo,114083
+ local_deep_research/defaults/default_settings.json,sha256=3buAVqMJqCvpg6J9lpXh9QprNLgDnEOcxx6kxYxZHk0,119769
  local_deep_research/main.py,sha256=umGmaQmW7bpx27wUAgSNjNr4oSHV6mDX5hoyfb22HEY,7033
  local_deep_research/migrate_db.py,sha256=S1h6Bv0OJdRW4BaH7MIMrUXBRV_yqgH2T6LVOZKTQjI,4634
  local_deep_research/report_generator.py,sha256=-G3KDEbsuU3PdxDfuo5v28DIX7RE1yJCCBU2KgRbNzI,9084
@@ -65,10 +66,10 @@ local_deep_research/web/models/settings.py,sha256=rXBI9vY5k3ndR8dPd3fZJy-6HwYltQ
  local_deep_research/web/routes/api_routes.py,sha256=S0UdCmfm0v1GEM4UiSbI0PE3xUOxiGaYFR2ZOE0256U,19075
  local_deep_research/web/routes/history_routes.py,sha256=6a_8nX349viuvi1zP5S7BaPPpAh133eTi1NVWO545A8,12622
  local_deep_research/web/routes/research_routes.py,sha256=JlzaP1z-7XAP3E0nkEjLIfYj_NKf5qDcrjxBmUouAhM,23492
- local_deep_research/web/routes/settings_routes.py,sha256=r9RbCCD37rqPsQPFyzlku6OElzQP-nuLbL4AH6e6Fgo,49233
+ local_deep_research/web/routes/settings_routes.py,sha256=DkG0JzYQZHzbMoJRxcEreHHYTbzwhhpSvIUHmEMYQAw,49227
  local_deep_research/web/services/research_service.py,sha256=vs_pWuv56rG2atgSamlDK4MdxpWTxbBVf3rHztr6y2A,39488
  local_deep_research/web/services/resource_service.py,sha256=yKgOC6GEOmHqRoGzwf52e19UaGCCS1DbDbOIXgWGvGc,4378
- local_deep_research/web/services/settings_manager.py,sha256=JRznmUCqPIr2efKO1fVN1-ONzE7okXeCfLbBRmADGbg,16376
+ local_deep_research/web/services/settings_manager.py,sha256=lHc0Arh9RR4D_Dubj6OxtlZw7MvHtdY8Db9p5LnX_ac,16376
  local_deep_research/web/services/settings_service.py,sha256=SgmjhMvGZjJE63hKKaqY7kPGphnUyXcQG8NFN5rTizs,3550
  local_deep_research/web/services/socket_service.py,sha256=jZGXk6kesBOf4bAdLiT3V4Ofod12pGKTsvxr3ml8ydY,7272
  local_deep_research/web/static/css/custom_dropdown.css,sha256=-pCx6oazWVgwqFAGq_eZ8OrTKMVQlgkKYCM6w-bACLs,7949
@@ -83,7 +84,7 @@ local_deep_research/web/static/js/components/logpanel.js,sha256=bRYkOf-BRTAHrHx9
  local_deep_research/web/static/js/components/progress.js,sha256=aTvtyZDQMkjyhqy62icuZuJ7Khyujgust6fpQFcRABk,41570
  local_deep_research/web/static/js/components/research.js,sha256=LQmZNqRrxkqa61pGaXHLiHGh7SNiq5XNMqfGMKCRDzM,81074
  local_deep_research/web/static/js/components/results.js,sha256=7fL18Yn0DwAjuelXvz-UlbDiLCFk-_UEEeqEjaDEVBA,32314
- local_deep_research/web/static/js/components/settings.js,sha256=0i8bwvgVrgyAeRus29txodBY5MeAVSpZ0byL5h_VRgY,168278
+ local_deep_research/web/static/js/components/settings.js,sha256=XRjA62jfaTBAH4YwAMbVbtgJUm6KI87bxVP4E0fWu7E,168497
  local_deep_research/web/static/js/components/settings_sync.js,sha256=LWDZ2EE8ChCxI5TPmPm9F4rOiYIEzEJxSCE1GLXk-2w,3925
  local_deep_research/web/static/js/main.js,sha256=NHOcEVytPCvF5tz3yPWg8Qu5ghVs5-GWKmpaKB87oi4,8440
  local_deep_research/web/static/js/research_form.js,sha256=qOZK0z_BE_xx2a1sx5vTjsCTW-ggHES_uj5eunO9Bo8,3632
@@ -113,7 +114,7 @@ local_deep_research/web/utils/formatters.py,sha256=Gj_a0oFveNXHtvkiFe1rwlEtzYerM
  local_deep_research/web_search_engines/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
  local_deep_research/web_search_engines/engines/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
  local_deep_research/web_search_engines/engines/full_search.py,sha256=6Pi_wj9oAtDHAyLsIbWGBeS8QBv6yCJEJ87LN68Cp-k,4703
- local_deep_research/web_search_engines/engines/meta_search_engine.py,sha256=wkeY2OkZi7fVl8ga4IgrDWmUjAuedm5avANqiK6BZPo,12822
+ local_deep_research/web_search_engines/engines/meta_search_engine.py,sha256=qUFl8yw5l7sfH-BRpXXrNQ2KrQ9LsaslhG1glb2AOIM,14715
  local_deep_research/web_search_engines/engines/search_engine_arxiv.py,sha256=3k8R4pyqIZf0RDMqXDw08xIGsfkp4ZR9kePDbmeuaH0,16603
  local_deep_research/web_search_engines/engines/search_engine_brave.py,sha256=y1j4CSLM0Ujw1LSBiWg1ZBnc2BvrkhDCorrQLnUBVtM,9149
  local_deep_research/web_search_engines/engines/search_engine_ddg.py,sha256=w9vRDpt_L0h5J-PWiNO_3J5uuRsfk5smlcIQjRofwB4,4649
@@ -131,4 +132,4 @@ local_deep_research/web_search_engines/engines/search_engine_wikipedia.py,sha256
  local_deep_research/web_search_engines/search_engine_base.py,sha256=PLU_sAWhWKTOQWcv32GINuhLdIwB0sEQy-pp9oG9Ggo,9835
  local_deep_research/web_search_engines/search_engine_factory.py,sha256=DghAkQvLKRJYl5xb9AUjUv7ydAQ4rPi-TvzrmqdyGxE,10890
  local_deep_research/web_search_engines/search_engines_config.py,sha256=rgKo3UQhXov_4QxPcdzMqnAfJc5a6tGXtfnjIzKeHdQ,4584
- local_deep_research-0.3.0.dist-info/RECORD,,
+ local_deep_research-0.3.2.dist-info/RECORD,,