locus-cli 0.1.0a1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2026 Mattia Rizzo
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
@@ -0,0 +1,106 @@
1
+ Metadata-Version: 2.4
2
+ Name: locus-cli
3
+ Version: 0.1.0a1
4
+ Summary: Lightweight terminal app to quickly understand large codebases
5
+ Author: Mattia Rizzo
6
+ License: MIT
7
+ Project-URL: Homepage, https://github.com/Tech-Matt/locus
8
+ Project-URL: Repository, https://github.com/Tech-Matt/locus
9
+ Keywords: cli,tui,llm,codebase,textual,rich
10
+ Classifier: Development Status :: 3 - Alpha
11
+ Classifier: Environment :: Console
12
+ Classifier: Intended Audience :: Developers
13
+ Classifier: License :: OSI Approved :: MIT License
14
+ Classifier: Programming Language :: Python :: 3
15
+ Classifier: Programming Language :: Python :: 3 :: Only
16
+ Classifier: Programming Language :: Python :: 3.10
17
+ Classifier: Programming Language :: Python :: 3.11
18
+ Classifier: Programming Language :: Python :: 3.12
19
+ Classifier: Topic :: Software Development :: Documentation
20
+ Classifier: Topic :: Utilities
21
+ Requires-Python: >=3.10
22
+ Description-Content-Type: text/markdown
23
+ License-File: LICENSE
24
+ Requires-Dist: textual>=0.70.0
25
+ Requires-Dist: rich>=13.0.0
26
+ Dynamic: license-file
27
+
28
+ <div align="center">
29
+
30
+ # LOCUS 🗺️
31
+
32
+ **The free, 100% private, local-LLM codebase cartographer for your terminal.**
33
+
34
+ [![Python 3.x](https://img.shields.io/badge/python-3.x-blue.svg)](https://www.python.org/)
35
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
36
+ [![Status: Work in Progress](https://img.shields.io/badge/Status-WIP-orange.svg)]()
37
+
38
+ *Stop paying for expensive API keys just to understand a codebase. Locus is a lightweight terminal tool that automatically downloads and runs architecture-optimized local LLMs to instantly map and summarize large codebases—for free, and 100% privately.*
39
+
40
+ </div>
41
+
42
+ ---
43
+
44
+ ## Why Locus?
45
+ I wanted a tool to explore new, large codebases (like the *Linux Kernel* source 😄) with an intuitive, powerful UI. **Locus** provides instant visual overviews and directory trees right in your terminal, helping you map out the structure before you dive into the code.
46
+
47
+ ### Free, Open, Local Intelligence
48
+ You shouldn't have to pay for expensive Gemini, Claude, or OpenAI API keys just to understand a codebase. **Locus dynamically profiles** your PC's hardware (Apple Silicon, NVIDIA, AMD GPU, or CPU-only) and automatically downloads and runs **architecture-optimized local LLMs**. You get instant and private folder summaries running entirely on your own machine—for free. *(Cloud APIs are still supported if you prefer them).*
49
+
50
+ - **Zero-Friction:** Just run `pip install locus-cli`, then `locus`. No Docker, no manual model downloads. *(PyPI packaging is still in progress.)*
51
+ - **Fast & Native:** A fast TUI built with `Textual` and `Rich`.
52
+ - **Private by Default:** Your proprietary code never leaves your machine unless you explicitly configure a cloud provider.
53
+
54
+ ### Current Platform Support
55
+ - **macOS (Apple Silicon / arm64):** Supported
56
+ - **macOS (Intel / x86_64):** Not supported yet
57
+ - **Linux:** Supported
58
+ - **Windows:** Supported
59
+
60
+ ---
61
+
62
+ ## Installation
63
+
64
+ *Note: Locus is currently in active development. A v0.1 release is coming soon!*
65
+
66
+ ```bash
67
+ git clone https://github.com/Tech-Matt/locus.git
68
+ cd locus
69
+ python -m venv myEnv
70
+ source myEnv/bin/activate # Windows: `myEnv\Scripts\activate`
71
+ pip install -r requirements.txt
72
+ python main.py
73
+ ```
74
+
75
+ ---
76
+
77
+ ## Usage
78
+
79
+ Run `locus` in any directory you want to explore:
80
+
81
+ ```bash
82
+ cd /path/to/massive/codebase
83
+ locus
84
+ ```
85
+
86
+ **Keybindings:**
87
+ - `j` / `k` : Scroll Down / Up
88
+ - `d` : Toggle Dark/Light Mode
89
+ - `q` : Quit Locus
90
+ - `A` : (Coming Soon) Generate a quick summary for the selected folder
91
+
92
+ ---
93
+
94
+ ## Roadmap
95
+
96
+ - [x] **Phase 1: Visual Engine** (Recursive parsing, smart filtering, TUI scaffolding)
97
+ - [x] **Phase 2: Hardware Profiling** (Native GPU/RAM detection, model mapping)
98
+ - [ ] **Phase 3: Local Summaries** (Local inference engine, TUI integration)
99
+ - [ ] **Phase 4: Testing & Hardening** (Unit tests, CI/CD pipeline)
100
+ - [ ] **Phase 5: Packaging** (PyPI release, zero-dependency binaries)
101
+
102
+ ---
103
+
104
+ <div align="center">
105
+ Made with ❤️ by <a href="https://github.com/Tech-Matt">Tech-Matt</a>
106
+ </div>
@@ -0,0 +1,79 @@
1
+ <div align="center">
2
+
3
+ # LOCUS 🗺️
4
+
5
+ **The free, 100% private, local-LLM codebase cartographer for your terminal.**
6
+
7
+ [![Python 3.x](https://img.shields.io/badge/python-3.x-blue.svg)](https://www.python.org/)
8
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
9
+ [![Status: Work in Progress](https://img.shields.io/badge/Status-WIP-orange.svg)]()
10
+
11
+ *Stop paying for expensive API keys just to understand a codebase. Locus is a lightweight terminal tool that automatically downloads and runs architecture-optimized local LLMs to instantly map and summarize large codebases—for free, and 100% privately.*
12
+
13
+ </div>
14
+
15
+ ---
16
+
17
+ ## Why Locus?
18
+ I wanted a tool to explore new, large codebases (like the *Linux Kernel* source 😄) with an intuitive, powerful UI. **Locus** provides instant visual overviews and directory trees right in your terminal, helping you map out the structure before you dive into the code.
19
+
20
+ ### Free, Open, Local Intelligence
21
+ You shouldn't have to pay for expensive Gemini, Claude, or OpenAI API keys just to understand a codebase. **Locus dynamically profiles** your PC's hardware (Apple Silicon, NVIDIA, AMD GPU, or CPU-only) and automatically downloads and runs **architecture-optimized local LLMs**. You get instant and private folder summaries running entirely on your own machine—for free. *(Cloud APIs are still supported if you prefer them).*
22
+
23
+ - **Zero-Friction:** Just run `pip install locus-cli`, then `locus`. No Docker, no manual model downloads. *(PyPI packaging is still in progress.)*
24
+ - **Fast & Native:** A fast TUI built with `Textual` and `Rich`.
25
+ - **Private by Default:** Your proprietary code never leaves your machine unless you explicitly configure a cloud provider.
26
+
27
+ ### Current Platform Support
28
+ - **macOS (Apple Silicon / arm64):** Supported
29
+ - **macOS (Intel / x86_64):** Not supported yet
30
+ - **Linux:** Supported
31
+ - **Windows:** Supported
32
+
33
+ ---
34
+
35
+ ## Installation
36
+
37
+ *Note: Locus is currently in active development. A v0.1 release is coming soon!*
38
+
39
+ ```bash
40
+ git clone https://github.com/Tech-Matt/locus.git
41
+ cd locus
42
+ python -m venv myEnv
43
+ source myEnv/bin/activate # Windows: `myEnv\Scripts\activate`
44
+ pip install -r requirements.txt
45
+ python main.py
46
+ ```
47
+
48
+ ---
49
+
50
+ ## Usage
51
+
52
+ Run `locus` in any directory you want to explore:
53
+
54
+ ```bash
55
+ cd /path/to/massive/codebase
56
+ locus
57
+ ```
58
+
59
+ **Keybindings:**
60
+ - `j` / `k` : Scroll Down / Up
61
+ - `d` : Toggle Dark/Light Mode
62
+ - `q` : Quit Locus
63
+ - `A` : (Coming Soon) Generate a quick summary for the selected folder
64
+
65
+ ---
66
+
67
+ ## Roadmap
68
+
69
+ - [x] **Phase 1: Visual Engine** (Recursive parsing, smart filtering, TUI scaffolding)
70
+ - [x] **Phase 2: Hardware Profiling** (Native GPU/RAM detection, model mapping)
71
+ - [ ] **Phase 3: Local Summaries** (Local inference engine, TUI integration)
72
+ - [ ] **Phase 4: Testing & Hardening** (Unit tests, CI/CD pipeline)
73
+ - [ ] **Phase 5: Packaging** (PyPI release, zero-dependency binaries)
74
+
75
+ ---
76
+
77
+ <div align="center">
78
+ Made with ❤️ by <a href="https://github.com/Tech-Matt">Tech-Matt</a>
79
+ </div>
@@ -0,0 +1,48 @@
1
+ [build-system]
2
+ requires = ["setuptools>=69", "wheel"]
3
+ build-backend = "setuptools.build_meta"
4
+
5
+ [project]
6
+ name = "locus-cli"
7
+ version = "0.1.0a1"
8
+ description = "Lightweight terminal app to quickly understand large codebases"
9
+ readme = "README.md"
10
+ requires-python = ">=3.10"
11
+ license = { text = "MIT" }
12
+ authors = [{name = "Mattia Rizzo"}]
13
+ keywords = ["cli", "tui", "llm", "codebase", "textual", "rich"]
14
+ classifiers = [
15
+ "Development Status :: 3 - Alpha",
16
+ "Environment :: Console",
17
+ "Intended Audience :: Developers",
18
+ "License :: OSI Approved :: MIT License",
19
+ "Programming Language :: Python :: 3",
20
+ "Programming Language :: Python :: 3 :: Only",
21
+ "Programming Language :: Python :: 3.10",
22
+ "Programming Language :: Python :: 3.11",
23
+ "Programming Language :: Python :: 3.12",
24
+ "Topic :: Software Development :: Documentation",
25
+ "Topic :: Utilities",
26
+ ]
27
+ dependencies = [
28
+ "textual>=0.70.0",
29
+ "rich>=13.0.0"
30
+ ]
31
+
32
+ [project.urls]
33
+ Homepage = "https://github.com/Tech-Matt/locus"
34
+ Repository = "https://github.com/Tech-Matt/locus"
35
+
36
+ [project.scripts]
37
+ locus = "locus_cli.main:main"
38
+
39
+ [tool.setuptools]
40
+ package-dir = { "" = "src" }
41
+ include-package-data = true
42
+
43
+ [tool.setuptools.packages.find]
44
+ where = ["src"]
45
+ exclude = ["*.tests", "*.tests.*", "tests.*", "tests"]
46
+
47
+ [tool.setuptools.package-data]
48
+ locus_cli = ["ui/*.tcss"]
@@ -0,0 +1,4 @@
1
+ [egg_info]
2
+ tag_build =
3
+ tag_date = 0
4
+
File without changes
@@ -0,0 +1,133 @@
1
+ # This file takes a folder path like 'src' or '.' and returns a Tree
2
+ # object which the UI can then render
3
+
5
+ from pathlib import Path
6
+ from rich.tree import Tree
8
+ from rich.filesize import decimal
9
+ from rich.markup import escape
10
+ from ui.console import console
11
+
12
+ class LocusMap:
13
+ # Default list of folders to ignore
14
+ IGNORE_FOLDERS = {
15
+ "__pycache__",
16
+ "node_modules",
17
+ "venv",
18
+ "myEnv",
19
+ ".git",
20
+ ".idea",
21
+ ".vscode",
22
+ "dist",
23
+ "build",
24
+ "target", # Rust/Java
25
+ "bin", # C#
26
+ "obj", # C#
27
+ "vendor" # PHP/Go
28
+ }
29
+
30
+ # Maximum number of files to display per directory
31
+ # If there are more files, show "N more files..." instead
32
+ MAX_FILES_PER_DIR = 10
33
+
34
+ # The constructor gets called on the root folder of interest
35
+ def __init__(self, root_dir, max_depth):
36
+ self.root_dir = root_dir
37
+ self.max_depth = max_depth
38
+
39
+ def generate(self):
40
+ """
41
+ Starting from the root folder, it creates the tree and then returns it
42
+ """
43
+ # Get the folder name
44
+ # resolve() makes the Path absolute, resolving any symlink
45
+ # .name returns the last part, like "setup.py"
46
+ root_name = Path(self.root_dir).resolve().name
47
+
48
+ # Create the visual Root Node
49
+ tree = Tree(f"[bold blue]{root_name}[/]")
50
+
51
+ # Start walking in a recursive fashion
52
+ self._walk(self.root_dir, tree, current_depth=0)
53
+
54
+ return tree
55
+
56
+ # The walk is based on a DFS Search (Depth first search)
57
+ def _walk(self, directory, tree_node, current_depth):
58
+ """
59
+ It looks at 'directory' and adds items to 'tree_node'.
60
+ It calls itself if it finds a subfolder
61
+ """
62
+ # Get all the items in the directory
63
+ try:
64
+ # Sorting works in this way:
65
+ # - folders are shown first, then files. If same type, then order
66
+ # - by name.
67
+ paths = sorted(
68
+ Path(directory).iterdir(), # Yields PosixPath or WindowsPath entries
69
+ key=lambda p: (not p.is_dir(), p.name.lower())
70
+ )
71
+
72
+ except PermissionError:
73
+ # Handle folders we can't open
74
+ tree_node.add("[red]Access Denied[/]")
75
+ return
76
+
77
+ # Separate directories and files for the heuristic
78
+ directories = []
79
+ files = []
80
+
81
+ for path in paths:
82
+ # Filter: skip hidden files or folders
83
+ if path.name.startswith("."):
84
+ continue
85
+ # Filter: skip common dep/cache folders
86
+ if path.name in self.IGNORE_FOLDERS:
87
+ continue
88
+
89
+ if path.is_dir():
90
+ directories.append(path)
91
+ else:
92
+ files.append(path)
93
+
94
+ # Always show all directories (no limit)
95
+ for path in directories:
96
+ # Create a new branch for this folder
97
+ # escape() here escapes the folder name to delete possible rich tags
98
+ branch = tree_node.add(f"[bold green]{escape(path.name)}[/]")
99
+
100
+ # Recursion, dive into the folder only if we haven't hit the depth limit
101
+ if current_depth < self.max_depth - 1:
102
+ self._walk(path, branch, current_depth + 1)
103
+
104
+ # Limit the number of files shown
105
+ files_shown = 0
106
+ for path in files:
107
+ if files_shown >= self.MAX_FILES_PER_DIR:
108
+ break
109
+
110
+ # Calculate size for display
111
+ file_size = decimal(path.stat().st_size) if path.exists() else "?"  # guard against broken symlinks
112
+ # Add simple icons based on extension
113
+ icon = "🐍" if path.suffix == ".py" else "📄"
114
+ # Add the leaf node
115
+ tree_node.add(
116
+ f"{icon} {escape(path.name)} ([dim]{file_size}[/])"
117
+ )
118
+ files_shown += 1
119
+
120
+ # If there are more files than shown, add a summary message
121
+ remaining_files = len(files) - files_shown
122
+ if remaining_files > 0:
123
+ tree_node.add(
124
+ f"[dim italic]... {remaining_files} more file{'s' if remaining_files > 1 else ''}[/]"
125
+ )
126
+
127
+ # [REMOVE LATER] Just use this as temporary quick tests
128
+ if __name__ == "__main__":
129
+ # Debug testing with console.print()
130
+ root_folder = Path("~/LinuxSource/.").expanduser()
131
+ locus_map = LocusMap(root_folder, 3)  # renamed to avoid shadowing the built-in map()
132
+ tree = locus_map.generate()
133
+ console.print(tree)
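The directory ordering in `_walk` relies on tuple comparison: `not p.is_dir()` is `False` for directories, so they sort before files, and `p.name.lower()` breaks ties case-insensitively. A minimal standalone sketch of that key (the file names are illustrative):

```python
# Sketch of the sort key used in LocusMap._walk:
# (not p.is_dir(), p.name.lower()) puts directories first (False < True),
# then orders entries of the same type alphabetically, ignoring case.
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "Zed.py").touch()
    (root / "alpha.txt").touch()
    (root / "src").mkdir()
    (root / "Build").mkdir()

    paths = sorted(root.iterdir(), key=lambda p: (not p.is_dir(), p.name.lower()))

print([p.name for p in paths])  # → ['Build', 'src', 'alpha.txt', 'Zed.py']
```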
@@ -0,0 +1,94 @@
1
+ """
2
+ This module provides hardware profiling capabilities.
3
+ It detects system RAM and available GPUs or AI accelerators.
4
+ """
5
+
6
+ import psutil
7
+ import subprocess
8
+ import platform
9
+ import shutil
10
+
11
+
12
+ class HardwareProfiler:
13
+ def get_total_ram_gb(self) -> float:
14
+ """
15
+ Detects the total physical RAM of the system.
16
+ Returns the value in GB rounded to 2 decimal places
17
+ """
18
+ total_mem_bytes = psutil.virtual_memory().total
19
+ total_mem = total_mem_bytes / (1024 * 1024 * 1024) # Convert from Bytes to GigaBytes
20
+ return round(total_mem, 2)
21
+
22
+ def detect_gpu(self) -> dict:
23
+ """
24
+ Attempts to detect available AI accelerators (Apple Silicon, NVIDIA, AMD).
25
+ Returns a dictionary with 'type' and 'vram_gb' (if applicable).
26
+ """
27
+ # Default fallback
28
+ gpu_info = {"type": "CPU_ONLY", "vram_gb": 0.0}
29
+
30
+ # Get system infos
31
+ system = platform.system()
32
+ machine = platform.machine()
33
+
34
+ # 1. Apple Silicon Check
35
+ if system == "Darwin" and machine == "arm64":
36
+ # Apple uses unified memory, so its VRAM is just the system RAM!
37
+ vram = self.get_total_ram_gb()
38
+ gpu_info = {"type": "APPLE_SILICON", "vram_gb": vram}
39
+
40
+ # 2. Nvidia
41
+ # This checks whether the nvidia driver is installed. If so, the gpu vram
42
+ # is requested
43
+ if shutil.which("nvidia-smi") is not None:
44
+ result = subprocess.run(
45
+ ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
46
+ capture_output=True, text=True
47
+ )
48
+ # The result.stdout will be a string like "8192\n" (in MB).
49
+ if result.returncode == 0 and result.stdout.strip():
50
+ # If there are multiple GPUs nvidia-smi might output more lines, so we use split()
51
+ vram_mb = float(result.stdout.strip().split('\n')[0])
52
+ vram_gb = round(vram_mb / 1024, 2) # Convert to GB
53
+ gpu_info = {"type": "NVIDIA", "vram_gb": vram_gb}
54
+
55
+
56
+ # 3. AMD Check. In this case we are not going to check VRAM since CUDA is unavailable
57
+ # but Vulkan may be used
58
+ if gpu_info["type"] == "CPU_ONLY":
59
+ try:
60
+ if system == "Windows":
61
+ result = subprocess.run(
62
+ ["wmic", "path", "win32_VideoController", "get", "name"],
63
+ capture_output=True, text=True
64
+ )
65
+ if "AMD" in result.stdout.upper() or "RADEON" in result.stdout.upper():
66
+ gpu_info = {"type": "AMD", "vram_gb": 0.0}
67
+
68
+ elif system == "Linux":
69
+ result = subprocess.run(
70
+ ["lspci"],
71
+ capture_output=True, text=True
72
+ )
73
+ if "AMD" in result.stdout.upper() or "RADEON" in result.stdout.upper():
74
+ gpu_info = {"type": "AMD", "vram_gb": 0.0}
75
+
76
+ except FileNotFoundError:
77
+ # If the AMD GPU is not found we are simply ignoring the error
78
+ # and return the default CPU_ONLY Fallback
79
+ pass
80
+
81
+
82
+ return gpu_info
83
+
84
+
85
+
86
+ # [REMOVE LATER]
87
+ if __name__ == "__main__":
88
+ profiler = HardwareProfiler()
89
+ ram = profiler.get_total_ram_gb()
90
+ gpu_info = profiler.detect_gpu()
91
+ system = gpu_info.get("type")
92
+ vram = gpu_info.get("vram_gb")
93
+ print(f"System RAM detected: {ram} GB")
94
+ print(f"System type: {system}, VRAM detected {vram} GB")
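The nvidia-smi parsing in `detect_gpu` is easy to check in isolation: the query prints one line per GPU in MB (e.g. `"8192\n"`), and only the first GPU is used. A minimal sketch of that conversion with canned stdout strings (the values are illustrative):

```python
# Sketch of the VRAM parsing used in detect_gpu():
# nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits
# prints the total memory in MB, one line per GPU.
def parse_vram_gb(stdout: str) -> float:
    """Take the first GPU's memory.total (MB) and convert it to GB."""
    first_line = stdout.strip().split("\n")[0]
    return round(float(first_line) / 1024, 2)

print(parse_vram_gb("8192\n"))         # → 8.0
print(parse_vram_gb("24576\n8192\n"))  # multi-GPU: first one wins → 24.0
```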
@@ -0,0 +1,92 @@
1
+ import os
2
+ import platform
3
+ import urllib.request
4
+ from pathlib import Path
5
+
6
+ class Provisioner:
7
+ """
8
+ Maps Hardware profiles to specific AI models and inference binaries,
9
+ and handles downloading them to the local system.
10
+ """
11
+
12
+ # Model Matrix
13
+ # Format: {Tier: (Filename, HuggingFace Download URL)}
14
+ MODELS = {
15
+ 1: ("qwen2.5-coder-7b-instruct-q4_k_m.gguf", "https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct-GGUF/resolve/main/qwen2.5-coder-7b-instruct-q4_k_m.gguf"),
16
+ 2: ("qwen2.5-coder-3b-instruct-q4_k_m.gguf", "https://huggingface.co/Qwen/Qwen2.5-Coder-3B-Instruct-GGUF/resolve/main/qwen2.5-coder-3b-instruct-q4_k_m.gguf"),
17
+ 3: ("qwen2.5-coder-1.5b-instruct-q4_k_m.gguf", "https://huggingface.co/Qwen/Qwen2.5-Coder-1.5B-Instruct-GGUF/resolve/main/qwen2.5-coder-1.5b-instruct-q4_k_m.gguf"),
18
+ 4: ("qwen2.5-coder-0.5b-instruct-q4_k_m.gguf", "https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct-GGUF/resolve/main/qwen2.5-coder-0.5b-instruct-q4_k_m.gguf")
19
+ }
20
+
21
+
22
+ # The Binary Matrix (llama.cpp server releases)
23
+ # URLs will need to be updated to the latest release tag
24
+ # TODO: Port later to manifest.json
25
+ BINARIES = {
26
+ "Windows": {
27
+ "CUDA": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-win-cuda-13.1-x64.zip",
28
+ "Vulkan": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-win-vulkan-x64.zip",
29
+ "CPU": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-win-cpu-x64.zip"
30
+ },
31
+ "Linux": {
32
+ "CUDA": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-ubuntu-x64.tar.gz",
33
+ "Vulkan": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-ubuntu-vulkan-x64.tar.gz",
34
+ "CPU": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-ubuntu-x64.tar.gz"
35
+ },
36
+ "Darwin": {
37
+ "APPLE_SILICON": "https://github.com/ggml-org/llama.cpp/releases/download/b8133/llama-b8133-bin-macos-arm64.tar.gz"
38
+ }
39
+ }
40
+
41
+ def __init__(self):
42
+ # Define where Locus will store its data
43
+ self.locus_dir = Path.home() / ".locus"
44
+ self.models_dir = self.locus_dir / "models"
45
+ self.bin_dir = self.locus_dir / "bin"
46
+ # Handle directories already existing
47
+ self.locus_dir.mkdir(parents=True, exist_ok=True)
48
+ self.models_dir.mkdir(parents=True, exist_ok=True)
49
+ self.bin_dir.mkdir(parents=True, exist_ok=True)
50
+
51
+ def determine_tier(self, ram_gb: float, gpu_type: str, vram_gb: float) -> int:
52
+ """
53
+ Calculates the appropriate model tier based on hardware.
54
+ Tier 1: High-End (8GB+ VRAM or 16GB+ Apple Silicon)
55
+ Tier 2: Mid-Range (4-6GB VRAM or 16GB+ RAM CPU)
56
+ Tier 3: Low-End (8GB RAM CPU)
57
+ Tier 4: Potato PC (<8GB RAM CPU)
58
+ """
59
+ gpu = (gpu_type or "").upper()
60
+
61
+ # Apple Silicon
62
+ if gpu == "APPLE_SILICON":
63
+ if ram_gb >= 16:
64
+ return 1
65
+ if ram_gb >= 8:
66
+ return 3
67
+ return 4
68
+
69
+ # Discrete GPU
70
+ if vram_gb >= 8:
71
+ return 1
72
+ if vram_gb >= 4:
73
+ return 2
74
+
75
+ # CPU only
76
+ if ram_gb >= 16:
77
+ return 2
78
+ if ram_gb >= 8:
79
+ return 3
80
+
81
+ # Default case - Tier 4
82
+ return 4
83
+
84
+ def get_binary_preference(self, os_name: str, gpu_type: str, user_choice: str = "auto") -> str:
85
+ """
86
+ Determines which llama.cpp build to download.
87
+ user_choice can be 'CUDA', 'Vulkan', 'CPU' or 'auto'.
88
+ """
89
+ # TODO: Map the OS and GPU type to the correct key in self.BINARIES
90
+
91
+ def download_file(self, url: str, dest: Path):
92
+ pass
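The thresholds in `determine_tier` can be exercised without real hardware. A minimal standalone mirror of the logic above (the sample profiles are illustrative, not measured):

```python
# Standalone mirror of Provisioner.determine_tier's threshold logic:
# Apple Silicon uses unified memory, so its tier follows system RAM;
# otherwise discrete VRAM is checked first, then CPU-only RAM fallbacks.
def determine_tier(ram_gb: float, gpu_type: str, vram_gb: float) -> int:
    gpu = (gpu_type or "").upper()
    if gpu == "APPLE_SILICON":
        if ram_gb >= 16:
            return 1
        if ram_gb >= 8:
            return 3
        return 4
    if vram_gb >= 8:   # high-end discrete GPU
        return 1
    if vram_gb >= 4:   # mid-range discrete GPU
        return 2
    if ram_gb >= 16:   # CPU-only, plenty of RAM
        return 2
    if ram_gb >= 8:
        return 3
    return 4           # "potato PC" fallback

print(determine_tier(32, "APPLE_SILICON", 32))  # → 1 (M-series, 32 GB unified)
print(determine_tier(16, "NVIDIA", 6))          # → 2 (6 GB VRAM card)
print(determine_tier(8, "CPU_ONLY", 0))         # → 3
```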
File without changes
@@ -0,0 +1,70 @@
1
+ # This file will use the Textual library to create a UI
2
+
3
+ from textual.app import App, ComposeResult
4
+ from textual.widgets import Header, Footer, Static
5
+ from textual.containers import VerticalScroll
6
+ from pathlib import Path
7
+
8
+ from core.map import LocusMap
9
+
10
+ class LocusApp(App):
11
+ """
12
+ The Textual UI for locus
13
+ """
14
+ def __init__(self, root_dir, max_depth):
15
+ # Let's call the textual.App constructor first (necessary)
16
+ super().__init__()
17
+ self.root_dir = root_dir
18
+ self.max_depth = max_depth # Maximum depth of subfolders
19
+
20
+ # Textual uses CSS for the UI
21
+ CSS_PATH = "style.tcss"
22
+
23
+ # Bindings: allow user to press keys to do things
24
+ BINDINGS = [
25
+ ("q", "quit", "Quit Locus"),
26
+ ("d", "toggle_dark", "Toggle Dark Mode"),
27
+ ("j", "scroll_down", "Scroll Down"),
28
+ ("k", "scroll_up", "Scroll Up")
29
+ ]
30
+
31
+ def compose(self) -> ComposeResult:
32
+ """
33
+ Create child widgets for the APP
34
+ """
35
+ yield Header()
36
+ yield VerticalScroll(Static(id="map-view"), id="main-scroll")
37
+ yield Footer()
38
+
39
+
40
+ def on_mount(self) -> None:
41
+ """
42
+ This method runs exactly once, right after the app is
43
+ built and rendered. This is where data is fetched and
44
+ the UI updated
45
+ """
46
+ locus_map = LocusMap(self.root_dir, self.max_depth)
47
+ tree = locus_map.generate()
48
+ self.query_one("#map-view", Static).update(tree)
50
+
51
+ def action_toggle_dark(self) -> None:
52
+ self.theme = (
53
+ "textual-dark" if self.theme == "textual-light" else "textual-light"
54
+ )
55
+
56
+ def action_scroll_down(self) -> None:
57
+ scroll_view = self.query_one("#main-scroll", VerticalScroll)
58
+ scroll_view.scroll_down()
59
+
60
+ def action_scroll_up(self) -> None:
61
+ scroll_view = self.query_one("#main-scroll", VerticalScroll)
62
+ scroll_view.scroll_up()
63
+
64
+ def action_quit(self) -> None:
65
+ self.exit()
66
+
67
+ # [REMOVE LATER] For testing the UI directly
68
+ if __name__ == "__main__":
69
+ app = LocusApp(Path("~/LinuxSource/linux/").expanduser(), 3)
70
+ app.run()
@@ -0,0 +1,4 @@
1
+ # Rich will read the properties of the console being used and optimize the
2
+ # use of colors everywhere when it is used in other parts of the project
3
+ from rich.console import Console
4
+ console = Console()
File without changes
@@ -0,0 +1,106 @@
1
+ Metadata-Version: 2.4
2
+ Name: locus-cli
3
+ Version: 0.1.0a1
4
+ Summary: Lightweight terminal app to quickly understand large codebases
5
+ Author: Mattia Rizzo
6
+ License: MIT
7
+ Project-URL: Homepage, https://github.com/Tech-Matt/locus
8
+ Project-URL: Repository, https://github.com/Tech-Matt/locus
9
+ Keywords: cli,tui,llm,codebase,textual,rich
10
+ Classifier: Development Status :: 3 - Alpha
11
+ Classifier: Environment :: Console
12
+ Classifier: Intended Audience :: Developers
13
+ Classifier: License :: OSI Approved :: MIT License
14
+ Classifier: Programming Language :: Python :: 3
15
+ Classifier: Programming Language :: Python :: 3 :: Only
16
+ Classifier: Programming Language :: Python :: 3.10
17
+ Classifier: Programming Language :: Python :: 3.11
18
+ Classifier: Programming Language :: Python :: 3.12
19
+ Classifier: Topic :: Software Development :: Documentation
20
+ Classifier: Topic :: Utilities
21
+ Requires-Python: >=3.10
22
+ Description-Content-Type: text/markdown
23
+ License-File: LICENSE
24
+ Requires-Dist: textual>=0.70.0
25
+ Requires-Dist: rich>=13.0.0
26
+ Dynamic: license-file
27
+
28
+ <div align="center">
29
+
30
+ # LOCUS 🗺️
31
+
32
+ **The free, 100% private, local-LLM codebase cartographer for your terminal.**
33
+
34
+ [![Python 3.x](https://img.shields.io/badge/python-3.x-blue.svg)](https://www.python.org/)
35
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
36
+ [![Status: Work in Progress](https://img.shields.io/badge/Status-WIP-orange.svg)]()
37
+
38
+ *Stop paying for expensive API keys just to understand a codebase. Locus is a lightweight terminal tool that automatically downloads and runs architecture-optimized local LLMs to instantly map and summarize large codebases—for free, and 100% privately.*
39
+
40
+ </div>
41
+
42
+ ---
43
+
44
+ ## Why Locus?
45
+ I wanted a tool to explore new, large codebases (like the *Linux Kernel* source 😄) with an intuitive, powerful UI. **Locus** provides instant visual overviews and directory trees right in your terminal, helping you map out the structure before you dive into the code.
46
+
47
+ ### Free, Open, Local Intelligence
48
+ You shouldn't have to pay for expensive Gemini, Claude, or OpenAI API keys just to understand a codebase. **Locus dynamically profiles** your PC's hardware (Apple Silicon, NVIDIA, AMD GPU, or CPU-only) and automatically downloads and runs **architecture-optimized local LLMs**. You get instant and private folder summaries running entirely on your own machine—for free. *(Cloud APIs are still supported if you prefer them).*
49
+
50
+ - **Zero-Friction:** Just run `pip install locus-cli`, then `locus`. No Docker, no manual model downloads. *(PyPI packaging is still in progress.)*
51
+ - **Fast & Native:** A fast TUI built with `Textual` and `Rich`.
52
+ - **Private by Default:** Your proprietary code never leaves your machine unless you explicitly configure a cloud provider.
53
+
54
+ ### Current Platform Support
55
+ - **macOS (Apple Silicon / arm64):** Supported
56
+ - **macOS (Intel / x86_64):** Not supported yet
57
+ - **Linux:** Supported
58
+ - **Windows:** Supported
59
+
60
+ ---
61
+
62
+ ## Installation
63
+
64
+ *Note: Locus is currently in active development. A v0.1 release is coming soon!*
65
+
66
+ ```bash
67
+ git clone https://github.com/Tech-Matt/locus.git
68
+ cd locus
69
+ python -m venv myEnv
70
+ source myEnv/bin/activate # Windows: `myEnv\Scripts\activate`
71
+ pip install -r requirements.txt
72
+ python main.py
73
+ ```
74
+
75
+ ---
76
+
77
+ ## Usage
78
+
79
+ Run `locus` in any directory you want to explore:
80
+
81
+ ```bash
82
+ cd /path/to/massive/codebase
83
+ locus
84
+ ```
85
+
86
+ **Keybindings:**
87
+ - `j` / `k` : Scroll Down / Up
88
+ - `d` : Toggle Dark/Light Mode
89
+ - `q` : Quit Locus
90
+ - `A` : (Coming Soon) Generate a quick summary for the selected folder
91
+
92
+ ---
93
+
94
+ ## Roadmap
95
+
96
+ - [x] **Phase 1: Visual Engine** (Recursive parsing, smart filtering, TUI scaffolding)
97
+ - [x] **Phase 2: Hardware Profiling** (Native GPU/RAM detection, model mapping)
98
+ - [ ] **Phase 3: Local Summaries** (Local inference engine, TUI integration)
99
+ - [ ] **Phase 4: Testing & Hardening** (Unit tests, CI/CD pipeline)
100
+ - [ ] **Phase 5: Packaging** (PyPI release, zero-dependency binaries)
101
+
102
+ ---
103
+
104
+ <div align="center">
105
+ Made with ❤️ by <a href="https://github.com/Tech-Matt">Tech-Matt</a>
106
+ </div>
@@ -0,0 +1,17 @@
1
+ LICENSE
2
+ README.md
3
+ pyproject.toml
4
+ src/locus_cli/__init__.py
5
+ src/locus_cli/main.py
6
+ src/locus_cli.egg-info/PKG-INFO
7
+ src/locus_cli.egg-info/SOURCES.txt
8
+ src/locus_cli.egg-info/dependency_links.txt
9
+ src/locus_cli.egg-info/entry_points.txt
10
+ src/locus_cli.egg-info/requires.txt
11
+ src/locus_cli.egg-info/top_level.txt
12
+ src/locus_cli/core/map.py
13
+ src/locus_cli/core/profiler.py
14
+ src/locus_cli/core/provisioner.py
15
+ src/locus_cli/ui/app.py
16
+ src/locus_cli/ui/console.py
17
+ src/locus_cli/ui/style.tcss
@@ -0,0 +1,2 @@
1
+ [console_scripts]
2
+ locus = locus_cli.main:main
@@ -0,0 +1,2 @@
1
+ textual>=0.70.0
2
+ rich>=13.0.0
@@ -0,0 +1 @@
1
+ locus_cli