titan-synapse 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (62)
  1. package/CONTRIBUTING.md +187 -0
  2. package/Cargo.lock +3976 -0
  3. package/Cargo.toml +10 -0
  4. package/LICENSE +190 -0
  5. package/PROGRESS.md +151 -0
  6. package/README.md +514 -0
  7. package/TEST_LOG.md +220 -0
  8. package/config/default.yaml +36 -0
  9. package/crates/synapse/Cargo.toml +70 -0
  10. package/crates/synapse/src/cli/bench.rs +44 -0
  11. package/crates/synapse/src/cli/eval.rs +395 -0
  12. package/crates/synapse/src/cli/export.rs +45 -0
  13. package/crates/synapse/src/cli/hub.rs +179 -0
  14. package/crates/synapse/src/cli/import.rs +35 -0
  15. package/crates/synapse/src/cli/learn.rs +53 -0
  16. package/crates/synapse/src/cli/mod.rs +10 -0
  17. package/crates/synapse/src/cli/models.rs +36 -0
  18. package/crates/synapse/src/cli/pull.rs +60 -0
  19. package/crates/synapse/src/cli/status.rs +52 -0
  20. package/crates/synapse/src/cli/train.rs +99 -0
  21. package/crates/synapse/src/config.rs +220 -0
  22. package/crates/synapse/src/dashboard.rs +281 -0
  23. package/crates/synapse/src/format/manifest.rs +57 -0
  24. package/crates/synapse/src/format/mod.rs +4 -0
  25. package/crates/synapse/src/format/packer.rs +213 -0
  26. package/crates/synapse/src/inference/engine.rs +361 -0
  27. package/crates/synapse/src/inference/kv_cache.rs +97 -0
  28. package/crates/synapse/src/inference/lora.rs +166 -0
  29. package/crates/synapse/src/inference/mod.rs +9 -0
  30. package/crates/synapse/src/inference/model.rs +167 -0
  31. package/crates/synapse/src/inference/sampler.rs +133 -0
  32. package/crates/synapse/src/inference/speculative.rs +153 -0
  33. package/crates/synapse/src/learn/cloud_fallback.rs +186 -0
  34. package/crates/synapse/src/learn/engine.rs +109 -0
  35. package/crates/synapse/src/learn/mod.rs +5 -0
  36. package/crates/synapse/src/main.rs +185 -0
  37. package/crates/synapse/src/memory/extractor.rs +201 -0
  38. package/crates/synapse/src/memory/graph.rs +332 -0
  39. package/crates/synapse/src/memory/hallucination.rs +259 -0
  40. package/crates/synapse/src/memory/mod.rs +7 -0
  41. package/crates/synapse/src/openai.rs +232 -0
  42. package/crates/synapse/src/server.rs +166 -0
  43. package/crates/synapse/src/streaming.rs +80 -0
  44. package/crates/synapse/src/swarm/coordinator.rs +198 -0
  45. package/crates/synapse/src/swarm/mod.rs +8 -0
  46. package/crates/synapse/src/swarm/orchestrator.rs +225 -0
  47. package/crates/synapse/src/swarm/pool.rs +64 -0
  48. package/crates/synapse/src/swarm/spawner.rs +199 -0
  49. package/crates/synapse/src/swarm/synthesizer.rs +26 -0
  50. package/crates/synapse/src/vram/manager.rs +67 -0
  51. package/crates/synapse/src/vram/mod.rs +3 -0
  52. package/docker-compose.yml +19 -0
  53. package/install.sh +311 -0
  54. package/package.json +36 -0
  55. package/python/Dockerfile.learn +18 -0
  56. package/python/requirements.txt +11 -0
  57. package/python/synapse_learn/__init__.py +0 -0
  58. package/python/synapse_learn/datasets.py +233 -0
  59. package/python/synapse_learn/real_eval.py +616 -0
  60. package/python/synapse_learn/server.py +431 -0
  61. package/python/synapse_learn/train_base.py +672 -0
  62. package/python/synapse_learn/train_specialists.py +787 -0
package/CONTRIBUTING.md
@@ -0,0 +1,187 @@
# Contributing to Titan Synapse

First off — thanks for considering a contribution. Synapse is a passion project, but it has real teeth and real ambitions. If you're here, you probably believe small models can outperform big ones when they work together. Good. You're in the right place.

---

## Building from Source

```bash
# Clone it
git clone https://github.com/Djtony707/titan-synapse.git
cd titan-synapse

# Build (CPU only)
cargo build --release

# Build with CUDA (Linux, NVIDIA GPU)
cargo build --release --features cuda

# Build with Metal (macOS, Apple Silicon)
cargo build --release --features metal

# The binary lands at:
./target/release/synapse
```

**Requirements:**
- Rust 2024 edition (install via [rustup.rs](https://rustup.rs))
- For CUDA builds: CUDA toolkit 12.x+ and `nvcc` in PATH
- For the learning sidecar: Python 3.10+ with `torch`, `peft`, `trl` (see `python/requirements.txt`)
- Patience during the first build. Candle compiles a lot of tensor ops. Go make coffee.

---

## Running Tests

```bash
# Run the full suite
cargo test

# Run tests for a specific module
cargo test --package synapse -- memory
cargo test --package synapse -- inference

# Run with output (for debugging)
cargo test -- --nocapture

# Run only ignored (slow) integration tests
cargo test -- --ignored
```

Tests should pass on both Linux and macOS. If they don't, that's a bug — file it.

---

## Project Structure

```
titan-synapse/
├── Cargo.toml                  # Workspace root
├── crates/synapse/src/
│   ├── main.rs                 # CLI entry point (clap): serve, status, models, pull, learn, bench
│   ├── server.rs               # Axum HTTP server on :6900
│   ├── openai.rs               # OpenAI-compatible API handlers
│   ├── config.rs               # Configuration loading
│   ├── cli/                    # Subcommand implementations (bench, eval, hub, train, ...)
│   ├── inference/
│   │   ├── engine.rs           # Core inference engine (candle)
│   │   ├── model.rs            # Model loading
│   │   ├── sampler.rs          # Temperature/top-p/top-k sampling
│   │   ├── kv_cache.rs         # PagedAttention-style KV cache
│   │   ├── speculative.rs      # Speculative decoding
│   │   └── lora.rs             # LoRA adapters
│   ├── swarm/
│   │   ├── coordinator.rs      # Routes queries to specialists
│   │   ├── orchestrator.rs     # Multi-specialist orchestration
│   │   ├── spawner.rs          # Specialist spawning
│   │   ├── pool.rs             # Specialist pool
│   │   └── synthesizer.rs      # Combines specialist answers
│   ├── memory/
│   │   ├── graph.rs            # SQLite knowledge graph
│   │   ├── extractor.rs        # Fact extraction + storage
│   │   └── hallucination.rs    # Hallucination detection
│   ├── learn/
│   │   ├── engine.rs           # QLoRA + DPO training orchestration
│   │   └── cloud_fallback.rs   # Cloud fallback for training
│   ├── vram/
│   │   └── manager.rs          # VRAM budgeting
│   └── format/
│       ├── manifest.rs         # .synapse bundle manifest
│       └── packer.rs           # .synapse bundle packing/unpacking
├── config/                     # Default configuration files
├── python/                     # Learning sidecar (QLoRA/DPO)
├── docker-compose.yml          # For the learning sidecar
└── target/                     # Build artifacts (gitignored)
```

The crate is a single binary (`synapse`) built from `crates/synapse/`. We may add more workspace members later (e.g., `crates/synapse-python` for the FFI bridge), but for now, simplicity wins.

---

## Code Style

**Edition:** Rust 2024. We use the latest stable features.

**General rules:**

- `cargo fmt` before every commit. Non-negotiable.
- `cargo clippy` should produce zero warnings. If Clippy is wrong (rare), add an `#[allow()]` with a comment explaining why.
- Error handling: use `anyhow::Result` for application code, `thiserror` for library-facing errors. No `.unwrap()` in production paths. Tests can `.unwrap()` — they're supposed to panic on failure; that's the point.
- Naming: structs are `PascalCase`, functions are `snake_case`, constants are `SCREAMING_SNAKE_CASE`. Standard Rust. Nothing weird.
- Comments: explain **why**, not **what**. The code should explain what. If it can't, refactor it until it does.
- Keep functions under ~50 lines when possible. If a function needs a scroll wheel, it needs a refactor.
- Async: we use Tokio. If you block the async runtime, you buy the team coffee. Metaphorically. We don't have a coffee fund yet.

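To make the error-handling rule concrete, here's a dependency-free sketch of the pattern. In real code you'd write `#[derive(thiserror::Error, Debug)]` with an `#[error("...")]` attribute and return `anyhow::Result`; `ModelNotFound` and `load_model` are made-up names for illustration only.

```rust
use std::fmt;

// Library-facing error type. The manual Display/Error impls below are the
// dependency-free equivalent of what the thiserror derive generates.
#[derive(Debug)]
pub struct ModelNotFound(pub String);

impl fmt::Display for ModelNotFound {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "model `{}` not found", self.0)
    }
}

impl std::error::Error for ModelNotFound {}

// Application code. `Box<dyn Error>` stands in for `anyhow::Error` here:
// errors get propagated with `?`, never `.unwrap()`ed in production paths.
fn load_model(name: &str) -> Result<String, Box<dyn std::error::Error>> {
    if name.is_empty() {
        return Err(Box::new(ModelNotFound(name.to_string())));
    }
    Ok(format!("loaded {name}"))
}

fn main() {
    // Tests may unwrap — panicking on failure is the point.
    assert_eq!(load_model("tiny-model").unwrap(), "loaded tiny-model");
    assert!(load_model("").is_err());
}
```

Callers then bubble the typed error up with `?`, and context stays intact all the way to the top-level handler.
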
**Commit messages:**

```
feat: add Metal acceleration for Apple Silicon
fix: prevent KV cache overflow on long contexts
refactor: split coordinator routing into separate module
test: add integration tests for .synapse bundle format
docs: update CUDA build instructions
```

Conventional commits. Keep the subject under 72 characters. Body is optional but appreciated for non-obvious changes.

---

## PR Process

1. **Fork and branch.** Branch names: `feat/thing`, `fix/thing`, `refactor/thing`.
2. **Make your changes.** Keep PRs focused — one feature or fix per PR. Mega-PRs get mega-delayed.
3. **Run the checks:**
   ```bash
   cargo fmt --check
   cargo clippy -- -D warnings
   cargo test
   ```
4. **Open a PR** against `main`. Fill in the template (what, why, how to test).
5. **Review.** We'll review within a few days. Don't take feedback personally — we're all trying to make this thing great.
6. **Merge.** Squash-merge is preferred for clean history.

If your PR adds a new feature, include tests. If it fixes a bug, include a test that would have caught it. "Works on my machine" is not a test plan.

---

## Where Help Is Needed

These are the areas where contributions would have the most impact:

### CUDA Optimization
The candle CUDA backend works, but there's room for custom kernels — especially for the Hebbian routing matrix updates and batched specialist inference. If you've written CUDA kernels before and enjoy staring at memory access patterns, we want you.

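If you want a feel for what those routing updates do before diving into kernels, here's a toy Hebbian update — illustrative only, every name is an assumption rather than the actual `swarm` code. The rule is plain "fire together, wire together": the specialist that produced a good answer gets reinforced, the rest decay, and the row is renormalized.

```rust
// Hypothetical routing matrix: weights[query_class][specialist].
struct RoutingMatrix {
    weights: Vec<Vec<f32>>,
    learning_rate: f32,
    decay: f32,
}

impl RoutingMatrix {
    fn new(classes: usize, specialists: usize) -> Self {
        Self {
            // Start uniform: every specialist equally likely per class.
            weights: vec![vec![1.0 / specialists as f32; specialists]; classes],
            learning_rate: 0.1,
            decay: 0.01,
        }
    }

    /// Reinforce the winning specialist, decay the others, renormalize.
    fn reinforce(&mut self, class: usize, specialist: usize, reward: f32) {
        let row = &mut self.weights[class];
        for (i, w) in row.iter_mut().enumerate() {
            if i == specialist {
                *w += self.learning_rate * reward;
            } else {
                *w *= 1.0 - self.decay;
            }
        }
        let sum: f32 = row.iter().sum();
        for w in row.iter_mut() {
            *w /= sum;
        }
    }

    /// Route a query class to its highest-weighted specialist.
    fn best(&self, class: usize) -> usize {
        self.weights[class]
            .iter()
            .enumerate()
            .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
            .map(|(i, _)| i)
            .unwrap()
    }
}

fn main() {
    let mut m = RoutingMatrix::new(2, 3);
    for _ in 0..5 {
        m.reinforce(0, 2, 1.0); // specialist 2 keeps winning on class 0
    }
    assert_eq!(m.best(0), 2);
}
```

The interesting CUDA work is doing that update batched across many rows at once without round-tripping to the host.
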
### New Specialists
The swarm currently has a handful of specialists. We want more: math reasoning, code generation, creative writing, structured output, tool calling. Each specialist is relatively self-contained — it's a great first contribution.

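As a rough sketch of what a specialist involves — the real definitions live under `crates/synapse/src/swarm/`, and every name here is hypothetical — think of it as a system prompt plus the triggers the coordinator routes on:

```rust
// Hypothetical shape, not the crate's actual API.
struct Specialist {
    name: &'static str,
    system_prompt: &'static str,
    triggers: Vec<&'static str>,
}

impl Specialist {
    /// Naive keyword routing; the real coordinator is smarter than this.
    fn matches(&self, query: &str) -> bool {
        let q = query.to_lowercase();
        self.triggers.iter().any(|&t| q.contains(t))
    }
}

fn main() {
    let math = Specialist {
        name: "math",
        system_prompt: "You are a careful, step-by-step math solver.",
        triggers: vec!["integral", "solve", "prove"],
    };
    assert_eq!(math.name, "math");
    assert!(!math.system_prompt.is_empty());
    assert!(math.matches("Solve x^2 = 9"));
    assert!(!math.matches("Write me a haiku"));
}
```
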
### Inference Performance
KV cache management, speculative decoding, continuous batching, quantization-aware optimizations. If you can make the inference engine faster without sacrificing quality, that's a direct win for every user.

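If speculative decoding is new to you, the core accept/reject loop fits in a few lines. This toy uses closures as stand-ins for the draft and target models and accepts on exact token equality; real speculative sampling verifies against the target's distribution, and `speculative_step` is an illustrative signature, not the one in `inference/speculative.rs`.

```rust
fn speculative_step(
    prefix: &mut Vec<u32>,
    k: usize,
    draft_next: impl Fn(&[u32]) -> u32,
    target_next: impl Fn(&[u32]) -> u32,
) -> usize {
    // 1. The cheap draft model proposes k tokens autoregressively.
    let mut ctx = prefix.clone();
    let mut proposed = Vec::with_capacity(k);
    for _ in 0..k {
        let t = draft_next(&ctx);
        ctx.push(t);
        proposed.push(t);
    }
    // 2. The target verifies in order; the first disagreement stops us.
    let mut accepted = 0;
    for &tok in &proposed {
        let want = target_next(prefix);
        prefix.push(want); // the target's token is always the one kept
        if want == tok {
            accepted += 1;
        } else {
            break;
        }
    }
    accepted
}

fn main() {
    // Draft guesses last+1; target agrees except right after multiples of 4.
    let draft = |ctx: &[u32]| ctx.last().unwrap() + 1;
    let target = |ctx: &[u32]| {
        let last = *ctx.last().unwrap();
        if last % 4 == 0 { last + 2 } else { last + 1 }
    };

    let mut seq = vec![1u32];
    assert_eq!(speculative_step(&mut seq, 3, &draft, &target), 3); // all accepted
    assert_eq!(seq, vec![1, 2, 3, 4]);

    let mut seq = vec![4u32];
    assert_eq!(speculative_step(&mut seq, 3, &draft, &target), 0); // draft's 5 rejected
    assert_eq!(seq, vec![4, 6]);
}
```

The win is that one target pass can confirm several draft tokens, so the expensive model runs fewer times per emitted token.
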
### Model Format (.synapse)
The `.synapse` bundle format is new and could use compression, integrity verification, streaming loads, and a proper spec document. If you like file formats (we don't judge), this one's for you.

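As a starting point for integrity verification, here's a std-only sketch of a checksum trailer: payload bytes followed by an 8-byte hash. FNV-1a just keeps the example dependency-free — a real bundle should use a cryptographic digest like SHA-256 — and the trailer layout is an assumption, not the actual `.synapse` layout.

```rust
// FNV-1a: a tiny non-cryptographic hash, fine for a sketch only.
fn fnv1a(bytes: &[u8]) -> u64 {
    let mut hash: u64 = 0xcbf2_9ce4_8422_2325;
    for &b in bytes {
        hash ^= u64::from(b);
        hash = hash.wrapping_mul(0x0000_0100_0000_01b3);
    }
    hash
}

/// Append the checksum of `payload` as a little-endian trailer.
fn seal(mut payload: Vec<u8>) -> Vec<u8> {
    let sum = fnv1a(&payload);
    payload.extend_from_slice(&sum.to_le_bytes());
    payload
}

/// Recompute the checksum over everything before the trailer and compare.
fn verify(bundle: &[u8]) -> bool {
    if bundle.len() < 8 {
        return false;
    }
    let (payload, tail) = bundle.split_at(bundle.len() - 8);
    fnv1a(payload).to_le_bytes().as_slice() == tail
}

fn main() {
    let bundle = seal(b"weights+manifest".to_vec());
    assert!(verify(&bundle));

    let mut tampered = bundle.clone();
    tampered[0] ^= 0xff; // flip bits in the payload
    assert!(!verify(&tampered));
}
```
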
### Testing
More integration tests, benchmarks, and edge-case coverage. The test suite is solid but not exhaustive. Especially needed: stress tests for concurrent inference, memory leak detection, and cross-platform CI.

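A concurrent stress test can start as simply as N threads hammering a shared engine. In this sketch an atomic counter stands in for completed `generate()` calls, and `run_clients` is a hypothetical harness shape, not the crate's API — a real test would hit the `:6900` server or the engine itself in the worker loop.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread;

// Spawn `clients` worker threads, each issuing `requests_each` requests,
// and return how many completed. Losing one means a concurrency bug.
fn run_clients(clients: usize, requests_each: usize) -> usize {
    let completed = Arc::new(AtomicUsize::new(0));
    let handles: Vec<_> = (0..clients)
        .map(|_| {
            let completed = Arc::clone(&completed);
            thread::spawn(move || {
                for _ in 0..requests_each {
                    // Real test: call the engine or the HTTP API here.
                    completed.fetch_add(1, Ordering::Relaxed);
                }
            })
        })
        .collect();
    for h in handles {
        h.join().expect("worker panicked");
    }
    completed.load(Ordering::Relaxed)
}

fn main() {
    // 8 concurrent clients x 100 requests: nothing dropped, no deadlock.
    assert_eq!(run_clients(8, 100), 800);
}
```
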
### Documentation
API docs, tutorials, example configs, deployment guides. If you can explain things clearly, that's a superpower we need.

---

## Setting Up for Development

```bash
# Recommended: install rust-analyzer for your editor
rustup component add rust-analyzer

# Watch mode (rebuild on save)
cargo install cargo-watch
cargo watch -x "test" -x "clippy"

# Generate docs locally
cargo doc --open --no-deps
```

---

## Ground Rules

- Be respectful. This is a technical project, not a battlefield.
- If you're stuck, open an issue or discussion before spending three days on something that might get rejected.
- No AI-generated PRs without understanding the code. We can tell. We literally build AI.

---

Built by Tony Elliott. Contributions make it better.