neuroloop-py 0.0.1__tar.gz
This diff shows the contents of publicly released package versions as they appear in their respective public registries. It is provided for informational purposes only.
- neuroloop_py-0.0.1/.gitignore +31 -0
- neuroloop_py-0.0.1/PKG-INFO +164 -0
- neuroloop_py-0.0.1/README.md +143 -0
- neuroloop_py-0.0.1/neuroloop/__init__.py +8 -0
- neuroloop_py-0.0.1/neuroloop/agent.py +813 -0
- neuroloop_py-0.0.1/neuroloop/main.py +134 -0
- neuroloop_py-0.0.1/neuroloop/memory.py +33 -0
- neuroloop_py-0.0.1/neuroloop/neuroskill/__init__.py +17 -0
- neuroloop_py-0.0.1/neuroloop/neuroskill/client.py +202 -0
- neuroloop_py-0.0.1/neuroloop/neuroskill/context.py +263 -0
- neuroloop_py-0.0.1/neuroloop/neuroskill/run.py +339 -0
- neuroloop_py-0.0.1/neuroloop/neuroskill/signals.py +494 -0
- neuroloop_py-0.0.1/neuroloop/prompts.py +117 -0
- neuroloop_py-0.0.1/neuroloop/tools/__init__.py +14 -0
- neuroloop_py-0.0.1/neuroloop/tools/protocol.py +179 -0
- neuroloop_py-0.0.1/neuroloop/tools/web_fetch.py +120 -0
- neuroloop_py-0.0.1/neuroloop/tools/web_search.py +161 -0
- neuroloop_py-0.0.1/neuroloop/ui.py +894 -0
- neuroloop_py-0.0.1/pyproject.toml +63 -0
- neuroloop_py-0.0.1/skills/neuroskill-data-reference/SKILL.md +161 -0
- neuroloop_py-0.0.1/skills/neuroskill-labels/SKILL.md +210 -0
- neuroloop_py-0.0.1/skills/neuroskill-protocols/SKILL.md +634 -0
- neuroloop_py-0.0.1/skills/neuroskill-recipes/SKILL.md +340 -0
- neuroloop_py-0.0.1/skills/neuroskill-search/SKILL.md +159 -0
- neuroloop_py-0.0.1/skills/neuroskill-sessions/SKILL.md +151 -0
- neuroloop_py-0.0.1/skills/neuroskill-sleep/SKILL.md +169 -0
- neuroloop_py-0.0.1/skills/neuroskill-status/SKILL.md +133 -0
- neuroloop_py-0.0.1/skills/neuroskill-streaming/SKILL.md +180 -0
- neuroloop_py-0.0.1/skills/neuroskill-transport/SKILL.md +185 -0
@@ -0,0 +1,31 @@ neuroloop_py-0.0.1/.gitignore

# Rust
/target/
**/*.rs.bk
Cargo.lock

# Models
models/
*.gguf
*.bin

# Build artifacts
/build/
*.o
*.so
*.a
*.dylib
*.dll

# IDE
.idea/
.vscode/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# Debug
*.pdb
@@ -0,0 +1,164 @@ neuroloop_py-0.0.1/PKG-INFO

Metadata-Version: 2.4
Name: neuroloop-py
Version: 0.0.1
Summary: NeuroLoop™ – EXG-aware AI agent (Python edition using aider/litellm)
Project-URL: Homepage, https://neuroloop.io
Project-URL: Repository, https://github.com/NeuroSkill-com/neuroloop
Author-email: NeuroSkill team <hello@neuroskill.com>
License: GPL-3.0-only
Keywords: agent,ai,aider,bci,eeg,exg,litellm,neurotech
Requires-Python: >=3.12
Requires-Dist: httpx>=0.24.0
Requires-Dist: litellm>=1.50.0
Requires-Dist: neuroskill-dev>=0.0.1
Requires-Dist: prompt-toolkit>=3.0.0
Requires-Dist: rich>=13.0.0
Provides-Extra: dev
Requires-Dist: pyright>=1.1; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Description-Content-Type: text/markdown

# neuroloop-py

**NeuroLoop™ – EXG-aware AI agent (Python edition)**

A Python clone of [neuroloop](../neuroloop) using **aider**'s LLM infrastructure
([litellm](https://github.com/BerriAI/litellm)) instead of the pi coding agent framework.

---

## What it does

neuroloop-py is an EXG-aware conversational AI agent that:

- **Reads your brainwaves** before every turn via `neuroskill status`
- **Injects your live mental state** into the system prompt so the AI can respond with
  full awareness of how you actually feel — cognitively, emotionally, somatically
- **Auto-labels notable moments** (awe, grief, deep focus, moral clarity, etc.) as
  permanent EXG annotations
- **Runs guided protocols** (breathing, meditation, grounding, somatic scans, etc.)
  step by step with OS notifications and EXG timestamps
- **Searches the web**, reads URLs, and maintains **persistent memory** across sessions
- **Pre-warms the compare cache** so session comparisons are instant when you ask

---
## Architecture

```
neuroloop/
├── main.py          Entry point — model selection, CLI args, asyncio.run()
├── agent.py         NeuroloopAgent — main loop, before_agent_start hook, tool dispatch
├── memory.py        ~/.neuroskill/memory.md — read/write persistent memory
├── prompts.py       STATUS_PROMPT + build_system_prompt()
├── neuroskill/
│   ├── run.py       run_neuroskill() — subprocess executor (npx neuroskill ...)
│   ├── signals.py   detect_signals() — 35+ regex-based domain signal detectors
│   └── context.py   select_contextual_data() — parallel neuroskill queries
└── tools/
    ├── web_fetch.py   web_fetch tool — URL → plain text
    ├── web_search.py  web_search tool — DuckDuckGo Lite (no API key)
    └── protocol.py    run_protocol tool — timed step execution + EXG labelling
```
### vs. the TypeScript original

| TypeScript (neuroloop) | Python (neuroloop-py) |
|-------------------------------|------------------------------------------|
| pi coding agent framework | aider / litellm |
| pi `ExtensionAPI` | `NeuroloopAgent` class |
| `before_agent_start` hook | `agent.before_agent_start()` async method |
| pi `registerTool` | `ALL_TOOLS` list (OpenAI function schema) |
| pi `InteractiveMode` | `asyncio` + `rich` console REPL |
| pi TUI (custom header/footer) | `rich` Markdown + rule separators |
| WebSocket EXG live panel | Per-turn `neuroskill status` query |
| `@sinclair/typebox` schemas | Plain Python dicts (OpenAI schema) |
| TypeScript `zod` | Python type hints |

> **Why litellm?** Aider uses litellm internally. Using it directly gives us the same
> multi-provider support (Anthropic, OpenAI, Gemini, Ollama, …) without requiring a
> full aider installation.
---

## Installation

```bash
cd /agent/ns/neuroloop-py
pip install .
# or, for development: pip install -e .
```

Requires Python ≥ 3.12.

---

## Usage

```bash
# Interactive mode
python -m neuroloop.main

# With a specific model
python -m neuroloop.main --model claude-3-5-sonnet-20241022

# With an initial message
python -m neuroloop.main "How is my focus today?"

# Via the console script (after pip install)
neuroloop-py
neuroloop-py --model gpt-4o
```

### Model selection (priority order)
1. `--model MODEL` CLI flag
2. `NEUROLOOP_MODEL` environment variable
3. Auto-detect: `ANTHROPIC_API_KEY` → claude, `OPENAI_API_KEY` → gpt-4o,
   `GEMINI_API_KEY` → gemini, local Ollama → first available model
4. Fallback: `claude-3-5-sonnet-20241022`
---

## Commands

| Command | Description |
|---------|-------------|
| `/exg` | Show live EXG snapshot |
| `/exg on` / `/exg off` | Toggle EXG display |
| `/neuro <cmd> [args]` | Run a neuroskill subcommand |
| `/memory` | Show agent memory |
| `/help` | Show all commands |
| `/quit` | Exit |

---

## Tools available to the AI

| Tool | Description |
|------|-------------|
| `web_fetch` | Fetch any URL → plain text |
| `web_search` | DuckDuckGo Lite search (no API key) |
| `memory_read` | Read `~/.neuroskill/memory.md` |
| `memory_write` | Write / append to memory |
| `neuroskill_label` | Create a timestamped EXG annotation |
| `neuroskill_run` | Run any neuroskill subcommand |
| `prewarm` | Start background `neuroskill compare` cache build |
| `run_protocol` | Execute a timed multi-step guided protocol |
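To show how such tools could be wired up: one OpenAI-style function schema (here `memory_read`, which takes no parameters) and a minimal dispatch loop. `complete` stands in for the litellm model call; the message shapes follow the OpenAI tool-calling convention, and everything here is a sketch rather than neuroloop's actual code.

```python
# Minimal sketch: an OpenAI-style tool schema plus a dispatch loop that
# keeps calling the model until it replies with plain text. `complete`
# is a stand-in for the real litellm call.
import json

MEMORY_READ_SCHEMA = {
    "type": "function",
    "function": {
        "name": "memory_read",
        "description": "Read ~/.neuroskill/memory.md",
        "parameters": {"type": "object", "properties": {}, "required": []},
    },
}

def run_turn(messages: list, registry: dict, complete) -> str:
    """Call the model, execute any requested tools, loop until plain text."""
    while True:
        reply = complete(messages)
        calls = reply.get("tool_calls")
        if not calls:
            return reply["content"]
        messages.append(reply)                      # assistant tool-call turn
        for call in calls:
            fn = registry[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"] or "{}")
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": str(fn(**args)),
            })
```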
---

## Skills

The `skills/` directory contains the same SKILL.md files as the TypeScript version.
They are loaded on demand based on context signals detected in the user's prompt
(protocols, sleep, HRV, etc.).
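A toy version of the signal detection that drives this on-demand loading might look like the following. The real `signals.py` has 35+ detectors; these three patterns and their skill mappings are illustrative assumptions.

```python
# Toy signal detection: map regex triggers to skill names. The real
# signals.py has 35+ detectors; these patterns are assumptions.
import re

SIGNALS = {
    "neuroskill-protocols": re.compile(r"\b(protocol|breathing|meditat\w*|grounding)\b", re.I),
    "neuroskill-sleep": re.compile(r"\b(sleep|insomnia|nap|circadian)\b", re.I),
    "neuroskill-sessions": re.compile(r"\b(session|compare|yesterday)\b", re.I),
}

def detect_signals(prompt: str) -> list[str]:
    """Return names of skills whose trigger patterns match the prompt."""
    return [skill for skill, pattern in SIGNALS.items() if pattern.search(prompt)]
```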
---

## Requirements

- Python ≥ 3.12
- `neuroskill` npm package (`npx neuroskill status`)
- At least one LLM API key (`ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, etc.)
  or a running Ollama instance
@@ -0,0 +1,143 @@ neuroloop_py-0.0.1/README.md

# neuroloop-py

**NeuroLoop™ – EXG-aware AI agent (Python edition)**

A Python clone of [neuroloop](../neuroloop) using **aider**'s LLM infrastructure
([litellm](https://github.com/BerriAI/litellm)) instead of the pi coding agent framework.

---

## What it does

neuroloop-py is an EXG-aware conversational AI agent that:

- **Reads your brainwaves** before every turn via `neuroskill status`
- **Injects your live mental state** into the system prompt so the AI can respond with
  full awareness of how you actually feel — cognitively, emotionally, somatically
- **Auto-labels notable moments** (awe, grief, deep focus, moral clarity, etc.) as
  permanent EXG annotations
- **Runs guided protocols** (breathing, meditation, grounding, somatic scans, etc.)
  step by step with OS notifications and EXG timestamps
- **Searches the web**, reads URLs, and maintains **persistent memory** across sessions
- **Pre-warms the compare cache** so session comparisons are instant when you ask

---

## Architecture

```
neuroloop/
├── main.py          Entry point — model selection, CLI args, asyncio.run()
├── agent.py         NeuroloopAgent — main loop, before_agent_start hook, tool dispatch
├── memory.py        ~/.neuroskill/memory.md — read/write persistent memory
├── prompts.py       STATUS_PROMPT + build_system_prompt()
├── neuroskill/
│   ├── run.py       run_neuroskill() — subprocess executor (npx neuroskill ...)
│   ├── signals.py   detect_signals() — 35+ regex-based domain signal detectors
│   └── context.py   select_contextual_data() — parallel neuroskill queries
└── tools/
    ├── web_fetch.py   web_fetch tool — URL → plain text
    ├── web_search.py  web_search tool — DuckDuckGo Lite (no API key)
    └── protocol.py    run_protocol tool — timed step execution + EXG labelling
```

### vs. the TypeScript original

| TypeScript (neuroloop) | Python (neuroloop-py) |
|-------------------------------|------------------------------------------|
| pi coding agent framework | aider / litellm |
| pi `ExtensionAPI` | `NeuroloopAgent` class |
| `before_agent_start` hook | `agent.before_agent_start()` async method |
| pi `registerTool` | `ALL_TOOLS` list (OpenAI function schema) |
| pi `InteractiveMode` | `asyncio` + `rich` console REPL |
| pi TUI (custom header/footer) | `rich` Markdown + rule separators |
| WebSocket EXG live panel | Per-turn `neuroskill status` query |
| `@sinclair/typebox` schemas | Plain Python dicts (OpenAI schema) |
| TypeScript `zod` | Python type hints |

> **Why litellm?** Aider uses litellm internally. Using it directly gives us the same
> multi-provider support (Anthropic, OpenAI, Gemini, Ollama, …) without requiring a
> full aider installation.

---

## Installation

```bash
cd /agent/ns/neuroloop-py
pip install .
# or, for development: pip install -e .
```

Requires Python ≥ 3.12.

---

## Usage

```bash
# Interactive mode
python -m neuroloop.main

# With a specific model
python -m neuroloop.main --model claude-3-5-sonnet-20241022

# With an initial message
python -m neuroloop.main "How is my focus today?"

# Via the console script (after pip install)
neuroloop-py
neuroloop-py --model gpt-4o
```

### Model selection (priority order)
1. `--model MODEL` CLI flag
2. `NEUROLOOP_MODEL` environment variable
3. Auto-detect: `ANTHROPIC_API_KEY` → claude, `OPENAI_API_KEY` → gpt-4o,
   `GEMINI_API_KEY` → gemini, local Ollama → first available model
4. Fallback: `claude-3-5-sonnet-20241022`

---

## Commands

| Command | Description |
|---------|-------------|
| `/exg` | Show live EXG snapshot |
| `/exg on` / `/exg off` | Toggle EXG display |
| `/neuro <cmd> [args]` | Run a neuroskill subcommand |
| `/memory` | Show agent memory |
| `/help` | Show all commands |
| `/quit` | Exit |

---

## Tools available to the AI

| Tool | Description |
|------|-------------|
| `web_fetch` | Fetch any URL → plain text |
| `web_search` | DuckDuckGo Lite search (no API key) |
| `memory_read` | Read `~/.neuroskill/memory.md` |
| `memory_write` | Write / append to memory |
| `neuroskill_label` | Create a timestamped EXG annotation |
| `neuroskill_run` | Run any neuroskill subcommand |
| `prewarm` | Start background `neuroskill compare` cache build |
| `run_protocol` | Execute a timed multi-step guided protocol |

---

## Skills

The `skills/` directory contains the same SKILL.md files as the TypeScript version.
They are loaded on demand based on context signals detected in the user's prompt
(protocols, sleep, HRV, etc.).

---

## Requirements

- Python ≥ 3.12
- `neuroskill` npm package (`npx neuroskill status`)
- At least one LLM API key (`ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, etc.)
  or a running Ollama instance