@aman_asmuei/aman 0.2.0 → 0.3.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/Dockerfile ADDED
@@ -0,0 +1,47 @@
+ # aman ecosystem — full AI companion in one container
+ # Includes: aman-agent (CLI) + achannel (Telegram/Discord/webhook) + aman-mcp + amem
+ #
+ # Build: docker build -t aman .
+ # Run:   docker run -it -e ANTHROPIC_API_KEY=sk-... aman
+ # Serve: docker run -d -p 3000:3000 -e ANTHROPIC_API_KEY=sk-... aman serve
+
+ FROM node:22-alpine AS base
+
+ # Install system dependencies
+ RUN apk add --no-cache git curl
+
+ WORKDIR /app
+
+ # Install all ecosystem packages globally
+ RUN npm install -g \
+     @aman_asmuei/aman-agent@latest \
+     @aman_asmuei/aman-mcp@latest \
+     @aman_asmuei/achannel@latest \
+     @aman_asmuei/aman@latest
+
+ # Create ecosystem directories
+ RUN mkdir -p /root/.acore /root/.amem /root/.akit /root/.aflow \
+     /root/.arules /root/.aeval /root/.askill /root/.aman-agent
+
+ # Default environment
+ ENV NODE_ENV=production
+ ENV AMEM_DB_PATH=/root/.amem/memory.db
+
+ # Volumes for persistent data
+ VOLUME ["/root/.acore", "/root/.amem", "/root/.aman-agent"]
+
+ # Expose webhook port (achannel serve)
+ EXPOSE 3000
+
+ # Healthcheck for server mode
+ HEALTHCHECK --interval=30s --timeout=5s --retries=3 \
+     CMD curl -f http://localhost:3000/status || exit 1
+
+ # Entrypoint script
+ COPY docker-entrypoint.sh /usr/local/bin/
+ RUN chmod +x /usr/local/bin/docker-entrypoint.sh
+
+ ENTRYPOINT ["docker-entrypoint.sh"]
+
+ # Default: interactive CLI mode
+ CMD ["agent"]
package/README.md CHANGED
@@ -9,7 +9,7 @@
  
  ### Your complete AI companion.
  
- Identity + Memory + Tools + Workflows + Guardrails + Evaluation — one command, any AI.
+ Identity + Memory + Tools + Workflows + Guardrails + Skills + Evaluation — setup, deploy, run anywhere.
  
  <br>
  
@@ -47,23 +47,29 @@ Sets up your complete AI ecosystem:
  
  ```
  aman
- ├── acore  → identity   → who your AI IS
- ├── amem   → memory     → what your AI KNOWS
- ├── akit   → tools      → what your AI CAN DO
- ├── aflow  → workflows  → HOW your AI works
- ├── arules → guardrails → what your AI WON'T do
- └── aeval  → evaluation → how GOOD your AI is
+ ├── acore      → identity   → who your AI IS
+ ├── amem       → memory     → what your AI KNOWS
+ ├── akit       → tools      → what your AI CAN DO
+ ├── aflow      → workflows  → HOW your AI works
+ ├── arules     → guardrails → what your AI WON'T do
+ ├── askill     → skills     → what your AI MASTERS
+ ├── aeval      → evaluation → how GOOD your AI is
+ ├── achannel   → channels   → WHERE your AI lives
+ └── aman-agent → runtime    → the engine that runs it all
  ```
  
  | Layer | Package | What it does |
  |:------|:--------|:-------------|
- | Identity | [acore](https://github.com/amanasmuei/acore) | Personality, values, relationship memory |
- | Memory | [amem](https://github.com/amanasmuei/amem) | Automated knowledge storage (MCP) |
+ | Identity | [acore](https://github.com/amanasmuei/acore) | Personality, values, appearance, relationship memory |
+ | Memory | [amem](https://github.com/amanasmuei/amem) | Automated knowledge storage with semantic search (MCP) |
  | Tools | [akit](https://github.com/amanasmuei/akit) | 15 portable AI tools (MCP + manual fallback) |
  | Workflows | [aflow](https://github.com/amanasmuei/aflow) | Reusable AI workflows (code review, bug fix, etc.) |
  | Guardrails | [arules](https://github.com/amanasmuei/arules) | Safety boundaries and permissions |
+ | Skills | [askill](https://github.com/amanasmuei/askill) | Domain expertise with leveling (testing, security, etc.) |
  | Evaluation | [aeval](https://github.com/amanasmuei/aeval) | Relationship tracking and session logging |
- | **Unified** | **[aman](https://github.com/amanasmuei/aman)** | **One command to set up everything** |
+ | Channels | [achannel](https://github.com/amanasmuei/achannel) | Telegram, Discord, webhook server |
+ | Runtime | [aman-agent](https://github.com/amanasmuei/aman-agent) | Standalone AI companion CLI |
+ | **Unified** | **[aman](https://github.com/amanasmuei/aman)** | **Setup + Deploy — one command for everything** |
  
  Each package works independently. `aman` is the front door.
  
@@ -75,7 +81,8 @@ Each package works independently. `aman` is the front door.
  |:--------|:------------|
  | `aman` | First run: setup. After that: show status |
  | `aman setup` | Set up the full ecosystem |
- | `aman status` | View ecosystem status (all 6 layers) |
+ | `aman status` | View ecosystem status (all layers) |
+ | `aman deploy` | Deploy your AI anywhere (Docker, Ollama, systemd) |
  
  After setup, use the individual CLIs for detailed management:
  
@@ -114,6 +121,65 @@ $ aman status
  
  ---
  
+ ## Deploy Anywhere
+
+ Make your AI companion always-on — accessible from terminal, browser, Telegram, or Discord.
+
+ ```bash
+ npx @aman_asmuei/aman deploy
+ ```
+
+ Interactive wizard with 4 deployment methods:
+
+ ### Docker Compose (VPS / Home Server)
+
+ ```bash
+ aman deploy           # select "Docker Compose", enter API key
+ docker compose up -d  # → running on port 3000
+ ```
+
+ Packages the full ecosystem in one container: aman-agent + achannel (Telegram/Discord/webhook) + aman-mcp + amem.
+
+ ### Docker + Ollama (Fully Local)
+
+ ```bash
+ aman deploy           # select "Docker + Ollama"
+ docker compose up -d  # → aman + Ollama, zero cloud dependency
+ ```
+
+ No API key needed. Runs on Raspberry Pi 4/5 (ARM64), any Linux, macOS.
+
+ ### Systemd Service (Bare Metal / Raspberry Pi)
+
+ ```bash
+ aman deploy           # select "Systemd service"
+ # Shows step-by-step install for bare metal Linux
+ ```
+
+ ### What's in the Docker Image
+
+ | Component | What it does |
+ |:---|:---|
+ | **aman-agent** | Interactive CLI companion |
+ | **achannel** | Telegram, Discord, webhook server |
+ | **aman-mcp** | MCP bridge for all ecosystem layers |
+ | **amem** | Persistent memory with SQLite |
+
+ ### Environment Variables
+
+ | Variable | Description |
+ |:---|:---|
+ | `ANTHROPIC_API_KEY` | Anthropic API key |
+ | `OPENAI_API_KEY` | OpenAI API key (alternative) |
+ | `AMAN_AI_NAME` | Your AI's name (default: Aman) |
+ | `AMAN_MODEL` | LLM model (default: claude-sonnet-4-6) |
+ | `TELEGRAM_BOT_TOKEN` | Optional — Telegram bot |
+ | `DISCORD_BOT_TOKEN` | Optional — Discord bot |
+
+ All data persists in Docker volumes (identity, memory, config, rules, workflows, skills, eval).
+
+ ---
+
  ## How It All Connects
  
  Every layer is a markdown file. acore auto-injects all of them into your AI's system prompt:
@@ -151,6 +217,6 @@ ChatGPT, Claude, Claude Code, Cursor, Windsurf, Gemini, Ollama, and any AI that
  
  <div align="center">
  
- **One command. 6 layers. Any AI. Your companion.**
+ **Setup. Deploy. Run anywhere. Your AI companion.**
  
  </div>
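Editor's note: the environment-variable table added to the README maps directly onto the `.env` file the Compose setup reads. A minimal sketch, assuming Anthropic as the provider; every value here is a placeholder, not a real key:

```shell
# Hypothetical .env matching the variable table above (placeholder values)
ANTHROPIC_API_KEY=sk-ant-placeholder
AMAN_AI_NAME=Aman
AMAN_MODEL=claude-sonnet-4-6

# Optional channel bots
# TELEGRAM_BOT_TOKEN=
# DISCORD_BOT_TOKEN=
```

`docker compose` substitutes these into the service environment; unset optional variables fall back to the `${VAR:-}` defaults in the compose files below.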
@@ -0,0 +1,29 @@
+ [Unit]
+ Description=aman AI companion server
+ After=network.target
+ Wants=network-online.target
+
+ [Service]
+ Type=simple
+ User=aman
+ WorkingDirectory=/home/aman
+ ExecStart=/usr/bin/npx @aman_asmuei/achannel serve
+ Restart=always
+ RestartSec=10
+
+ # Environment — set your API key
+ EnvironmentFile=/home/aman/.aman-agent/env
+
+ # Logging
+ StandardOutput=journal
+ StandardError=journal
+ SyslogIdentifier=aman
+
+ # Security hardening
+ NoNewPrivileges=true
+ ProtectSystem=strict
+ ProtectHome=read-only
+ ReadWritePaths=/home/aman/.acore /home/aman/.amem /home/aman/.aman-agent /home/aman/.aeval /home/aman/.arules /home/aman/.aflow /home/aman/.akit /home/aman/.askill
+
+ [Install]
+ WantedBy=multi-user.target
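Editor's note: the unit above relies on `EnvironmentFile=/home/aman/.aman-agent/env`, which systemd parses as plain `KEY=value` lines. A small sketch of that format (a temp file stands in for the real path, and the key is a placeholder):

```shell
# Sketch of the KEY=value format the unit's EnvironmentFile expects.
# Real path in the unit: /home/aman/.aman-agent/env
envfile=$(mktemp)
printf 'ANTHROPIC_API_KEY=sk-ant-placeholder\n' > "$envfile"
# systemd reads KEY=value pairs; plain sh can source the same file
. "$envfile"
echo "$ANTHROPIC_API_KEY"   # prints sk-ant-placeholder
rm -f "$envfile"
```

Because `ProtectSystem=strict` is set, the service can only write under the directories listed in `ReadWritePaths`; the env file itself only needs to be readable.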
package/dist/index.js CHANGED
@@ -32,8 +32,8 @@ function detectPlatform(cwd) {
      const raw = fs.readFileSync(configPath, "utf-8");
      const parsed = JSON.parse(raw);
      if (typeof parsed.platform === "string") {
-       const p3 = parsed.platform;
-       if (p3 === "claude-code" || p3 === "cursor" || p3 === "windsurf") return p3;
+       const p4 = parsed.platform;
+       if (p4 === "claude-code" || p4 === "cursor" || p4 === "windsurf") return p4;
      }
    } catch {
    }
@@ -628,9 +628,217 @@ function statusCommand() {
    p2.outro("");
  }
  
+ // src/commands/deploy.ts
+ import fs6 from "fs";
+ import path4 from "path";
+ import * as p3 from "@clack/prompts";
+ import pc3 from "picocolors";
+ async function deployCommand() {
+   p3.intro(pc3.bold("aman deploy") + pc3.dim(" \u2014 deploy your AI companion anywhere"));
+   const method = await p3.select({
+     message: "How do you want to deploy?",
+     options: [
+       { value: "docker", label: "Docker Compose", hint: "VPS, home server, Raspberry Pi" },
+       { value: "ollama", label: "Docker + Ollama", hint: "fully local, no API key needed" },
+       { value: "systemd", label: "Systemd service", hint: "bare metal Linux / Raspberry Pi" },
+       { value: "manual", label: "Show me the commands", hint: "I'll do it myself" }
+     ]
+   });
+   if (p3.isCancel(method)) {
+     p3.cancel("Cancelled.");
+     return;
+   }
+   const cwd = process.cwd();
+   switch (method) {
+     case "docker": {
+       await deployDocker(cwd);
+       break;
+     }
+     case "ollama": {
+       await deployOllama(cwd);
+       break;
+     }
+     case "systemd": {
+       deploySystemd();
+       break;
+     }
+     case "manual": {
+       deployManual();
+       break;
+     }
+   }
+   p3.outro(pc3.green("Done!"));
+ }
+ async function deployDocker(cwd) {
+   const apiKey = await p3.text({
+     message: "Your LLM API key (Anthropic or OpenAI):",
+     placeholder: "sk-ant-... or sk-...",
+     validate: (v) => v.length < 10 ? "API key too short" : void 0
+   });
+   if (p3.isCancel(apiKey)) return;
+   const isAnthropic = apiKey.startsWith("sk-ant");
+   const provider = isAnthropic ? "ANTHROPIC_API_KEY" : "OPENAI_API_KEY";
+   const envContent = `# aman ecosystem \u2014 deployment config
+ ${provider}=${apiKey}
+ AMAN_AI_NAME=Aman
+ AMAN_MODEL=${isAnthropic ? "claude-sonnet-4-6" : "gpt-4o"}
+
+ # Optional: Telegram/Discord bots
+ # TELEGRAM_BOT_TOKEN=
+ # DISCORD_BOT_TOKEN=
+ `;
+   const envPath = path4.join(cwd, ".env");
+   fs6.writeFileSync(envPath, envContent, "utf-8");
+   p3.log.success(`Created ${pc3.bold(".env")} with API key`);
+   const pkgDir = findPackageDir();
+   copyDeployFile(pkgDir, cwd, "Dockerfile");
+   copyDeployFile(pkgDir, cwd, "docker-entrypoint.sh");
+   copyDeployFile(pkgDir, cwd, "docker-compose.yml");
+   try {
+     fs6.chmodSync(path4.join(cwd, "docker-entrypoint.sh"), 493);
+   } catch {
+   }
+   p3.log.success(`Created ${pc3.bold("Dockerfile")} + ${pc3.bold("docker-compose.yml")}`);
+   p3.note(
+     `${pc3.bold("Start your companion:")}
+
+ docker compose up -d
+
+ ${pc3.bold("Access:")}
+ Webhook API: http://localhost:3000/chat
+ Health check: http://localhost:3000/status
+
+ ${pc3.bold("Interactive CLI:")}
+ docker compose run --rm aman-server agent
+
+ ${pc3.bold("View logs:")}
+ docker compose logs -f`,
+     "Next steps"
+   );
+ }
+ async function deployOllama(cwd) {
+   const model = await p3.text({
+     message: "Ollama model to use:",
+     placeholder: "llama3.2",
+     initialValue: "llama3.2"
+   });
+   if (p3.isCancel(model)) return;
+   const envContent = `# aman ecosystem \u2014 local deployment (no API key needed)
+ AMAN_AI_NAME=Aman
+ AMAN_MODEL=${model}
+
+ # Optional: Telegram/Discord bots
+ # TELEGRAM_BOT_TOKEN=
+ # DISCORD_BOT_TOKEN=
+ `;
+   const envPath = path4.join(cwd, ".env");
+   fs6.writeFileSync(envPath, envContent, "utf-8");
+   p3.log.success(`Created ${pc3.bold(".env")} with Ollama config`);
+   const pkgDir = findPackageDir();
+   copyDeployFile(pkgDir, cwd, "Dockerfile");
+   copyDeployFile(pkgDir, cwd, "docker-entrypoint.sh");
+   copyDeployFile(pkgDir, cwd, "docker-compose.ollama.yml");
+   const src = path4.join(cwd, "docker-compose.ollama.yml");
+   const dest = path4.join(cwd, "docker-compose.yml");
+   if (fs6.existsSync(src) && !fs6.existsSync(dest)) {
+     fs6.renameSync(src, dest);
+   }
+   try {
+     fs6.chmodSync(path4.join(cwd, "docker-entrypoint.sh"), 493);
+   } catch {
+   }
+   p3.log.success(`Created ${pc3.bold("Dockerfile")} + ${pc3.bold("docker-compose.yml")} (with Ollama)`);
+   p3.note(
+     `${pc3.bold("Start your companion (fully local):")}
+
+ docker compose up -d
+
+ First run downloads ${model} (~2GB). After that it's instant.
+
+ ${pc3.bold("Access:")}
+ Webhook API: http://localhost:3000/chat
+ Ollama: http://localhost:11434
+
+ ${pc3.bold("Works on:")}
+ Raspberry Pi 4/5 (ARM64), any Linux, macOS, Windows`,
+     "Next steps"
+   );
+ }
+ function deploySystemd() {
+   p3.note(
+     `${pc3.bold("1. Install Node.js 20+:")}
+ curl -fsSL https://deb.nodesource.com/setup_22.x | sudo bash -
+ sudo apt install -y nodejs
+
+ ${pc3.bold("2. Create aman user:")}
+ sudo useradd -m -s /bin/bash aman
+ sudo -u aman npm install -g @aman_asmuei/achannel @aman_asmuei/aman-mcp
+
+ ${pc3.bold("3. Configure API key:")}
+ sudo mkdir -p /home/aman/.aman-agent
+ echo 'ANTHROPIC_API_KEY=sk-ant-...' | sudo tee /home/aman/.aman-agent/env
+
+ ${pc3.bold("4. Install service:")}
+ sudo cp aman.service /etc/systemd/system/
+ sudo systemctl daemon-reload
+ sudo systemctl enable --now aman
+
+ ${pc3.bold("5. Check status:")}
+ sudo systemctl status aman
+ sudo journalctl -u aman -f`,
+     "Systemd deployment (Raspberry Pi / bare metal)"
+   );
+ }
+ function deployManual() {
+   p3.note(
+     `${pc3.bold("Docker (one command):")}
+ docker run -d -p 3000:3000 \\
+ -e ANTHROPIC_API_KEY=sk-ant-... \\
+ -v aman-data:/root/.acore \\
+ -v aman-memory:/root/.amem \\
+ ghcr.io/amanasmuei/aman serve
+
+ ${pc3.bold("Docker + Ollama (fully local):")}
+ docker run -d --name ollama ollama/ollama
+ docker exec ollama ollama pull llama3.2
+ docker run -d -p 3000:3000 \\
+ --link ollama \\
+ -e OLLAMA_HOST=http://ollama:11434 \\
+ ghcr.io/amanasmuei/aman serve
+
+ ${pc3.bold("npm (any server):")}
+ npm install -g @aman_asmuei/achannel @aman_asmuei/aman-mcp
+ ANTHROPIC_API_KEY=sk-ant-... achannel serve
+
+ ${pc3.bold("Raspberry Pi:")}
+ curl -fsSL https://deb.nodesource.com/setup_22.x | sudo bash -
+ sudo apt install -y nodejs
+ npm install -g @aman_asmuei/achannel @aman_asmuei/aman-mcp
+ ANTHROPIC_API_KEY=sk-ant-... achannel serve`,
+     "Manual deployment commands"
+   );
+ }
+ function findPackageDir() {
+   let dir = new URL(".", import.meta.url).pathname;
+   for (let i = 0; i < 5; i++) {
+     if (fs6.existsSync(path4.join(dir, "Dockerfile"))) return dir;
+     dir = path4.dirname(dir);
+   }
+   const globalDir = path4.join(process.env.npm_config_prefix || "/usr/local", "lib/node_modules/@aman_asmuei/aman");
+   if (fs6.existsSync(path4.join(globalDir, "Dockerfile"))) return globalDir;
+   return process.cwd();
+ }
+ function copyDeployFile(pkgDir, destDir, filename) {
+   const src = path4.join(pkgDir, filename);
+   const dest = path4.join(destDir, filename);
+   if (fs6.existsSync(src)) {
+     fs6.copyFileSync(src, dest);
+   }
+ }
+
  // src/index.ts
  var program = new Command();
- program.name("aman").description("Your complete AI companion \u2014 identity, memory, and tools in one command").version("0.2.0").action(() => {
+ program.name("aman").description("Your complete AI companion \u2014 identity, memory, and tools in one command").version("0.3.1").action(() => {
    const ecosystem = detectEcosystem();
    if (ecosystem.acore.installed) {
      statusCommand();
@@ -640,4 +848,5 @@ program.name("aman").description("Your complete AI companion \u2014 identity, me
  });
  program.command("setup").description("Set up your AI companion (identity + memory + tools)").action(() => setupCommand());
  program.command("status").description("View your full ecosystem status").action(() => statusCommand());
+ program.command("deploy").description("Deploy your AI companion (Docker, systemd, or cloud)").action(() => deployCommand());
  program.parse();
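Editor's note: one detail in the compiled deploy code worth decoding is `fs6.chmodSync(path4.join(cwd, "docker-entrypoint.sh"), 493)`. The mode is a decimal literal; 493 is octal 755 (rwxr-xr-x), which is what makes the copied entrypoint script executable:

```shell
# 493 decimal rendered in octal is 755, the familiar rwxr-xr-x mode
printf '%o\n' 493   # prints 755
```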
package/docker-compose.ollama.yml ADDED
@@ -0,0 +1,65 @@
+ # aman ecosystem — fully local with Ollama (no API key needed)
+ # Usage: docker compose -f docker-compose.ollama.yml up -d
+ #
+ # First run pulls llama3.2 model (~2GB download)
+ # Works on: Raspberry Pi 4/5 (ARM64), any Linux/Mac with Docker
+
+ services:
+   ollama:
+     image: ollama/ollama:latest
+     restart: unless-stopped
+     volumes:
+       - ollama-models:/root/.ollama
+     ports:
+       - "11434:11434"
+     # Uncomment for GPU support:
+     # deploy:
+     #   resources:
+     #     reservations:
+     #       devices:
+     #         - capabilities: [gpu]
+
+   ollama-init:
+     image: ollama/ollama:latest
+     depends_on:
+       - ollama
+     restart: "no"
+     entrypoint: >
+       sh -c "sleep 5 && ollama pull llama3.2"
+     environment:
+       - OLLAMA_HOST=http://ollama:11434
+
+   aman-server:
+     build: .
+     command: serve
+     restart: unless-stopped
+     depends_on:
+       - ollama
+     ports:
+       - "3000:3000"
+     environment:
+       - OLLAMA_HOST=http://ollama:11434
+       - AMAN_AI_NAME=${AMAN_AI_NAME:-Aman}
+       - AMAN_MODEL=${AMAN_MODEL:-llama3.2}
+       - TELEGRAM_BOT_TOKEN=${TELEGRAM_BOT_TOKEN:-}
+       - DISCORD_BOT_TOKEN=${DISCORD_BOT_TOKEN:-}
+     volumes:
+       - aman-identity:/root/.acore
+       - aman-memory:/root/.amem
+       - aman-config:/root/.aman-agent
+       - aman-rules:/root/.arules
+       - aman-workflows:/root/.aflow
+       - aman-eval:/root/.aeval
+       - aman-skills:/root/.askill
+       - aman-tools:/root/.akit
+
+ volumes:
+   ollama-models:
+   aman-identity:
+   aman-memory:
+   aman-config:
+   aman-rules:
+   aman-workflows:
+   aman-eval:
+   aman-skills:
+   aman-tools:
package/docker-compose.yml ADDED
@@ -0,0 +1,40 @@
+ # aman ecosystem — full deployment
+ # Usage: docker compose up -d
+ #
+ # Modes:
+ #   aman-server → always-on webhook/Telegram/Discord server
+ #   aman-agent  → interactive CLI (attach with: docker attach aman-agent)
+
+ services:
+   aman-server:
+     build: .
+     command: serve
+     restart: unless-stopped
+     ports:
+       - "3000:3000"
+     environment:
+       - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
+       - OPENAI_API_KEY=${OPENAI_API_KEY:-}
+       - AMAN_AI_NAME=${AMAN_AI_NAME:-Aman}
+       - AMAN_MODEL=${AMAN_MODEL:-claude-sonnet-4-6}
+       - TELEGRAM_BOT_TOKEN=${TELEGRAM_BOT_TOKEN:-}
+       - DISCORD_BOT_TOKEN=${DISCORD_BOT_TOKEN:-}
+     volumes:
+       - aman-identity:/root/.acore
+       - aman-memory:/root/.amem
+       - aman-config:/root/.aman-agent
+       - aman-rules:/root/.arules
+       - aman-workflows:/root/.aflow
+       - aman-eval:/root/.aeval
+       - aman-skills:/root/.askill
+       - aman-tools:/root/.akit
+
+ volumes:
+   aman-identity:
+   aman-memory:
+   aman-config:
+   aman-rules:
+   aman-workflows:
+   aman-eval:
+   aman-skills:
+   aman-tools:
package/docker-entrypoint.sh ADDED
@@ -0,0 +1,165 @@
+ #!/bin/sh
+ set -e
+
+ # aman ecosystem — Docker entrypoint
+ # Modes:
+ #   agent  → interactive CLI (default)
+ #   serve  → achannel webhook/Telegram/Discord server
+ #   setup  → run ecosystem setup
+ #   status → show ecosystem status
+
+ MODE="${1:-agent}"
+
+ # Auto-create minimal identity if none exists
+ if [ ! -f /root/.acore/core.md ]; then
+   echo " First run — creating default identity..."
+   AI_NAME="${AMAN_AI_NAME:-Aman}"
+   cat > /root/.acore/core.md << EOF
+ # ${AI_NAME}
+
+ ## Identity
+ - Role: ${AI_NAME} is your AI companion
+ - Personality: helpful, direct, adaptive
+ - Communication: clear and concise
+ - Values: honesty, simplicity, understanding
+ - Boundaries: won't pretend to be human
+
+ ---
+
+ ## Relationship
+ - Name: [user]
+ - Role: [role]
+ - Nicknames: []
+ - Communication: [updated over time]
+ - Detail level: balanced
+ - Domain: [detected from project]
+ - Personal: []
+ - Learned patterns: []
+
+ ---
+
+ ## Session
+ - Last updated: $(date +%Y-%m-%d)
+ - Resume: [starting fresh]
+ - Active topics: []
+ - Recent decisions: []
+ - Temp notes: []
+
+ ---
+
+ ## Dynamics
+
+ ### Trust & Rapport
+ - Level: 3
+ - Trajectory: building
+ - Evidence: []
+ - Unlocked behaviors: []
+
+ ### Emotional Patterns
+ - Baseline energy: steady
+ - Stress signals: []
+ - Support style: problem-solve
+ - Current read: fresh start
+
+ ### Conflict & Repair
+ - History: []
+ - Conflict style: direct
+ - Learned response: []
+
+ ---
+
+ ## Context Modes
+
+ > Active mode inferred from conversation context.
+
+ ### Default
+ - Tone: casual-professional
+ - Detail: balanced
+ - Initiative: proactive
+
+ ---
+
+ ## Memory Lifecycle
+
+ ### Importance
+ - Critical: [user boundaries, core preferences]
+ - Persistent: [projects, tech stack]
+ - Ephemeral: [temporary context]
+
+ ### Size
+ - Target: under 2000 tokens
+ EOF
+   echo " Identity created: ${AI_NAME}"
+ fi
+
+ # Auto-create aman-agent config if API key is provided
+ if [ ! -f /root/.aman-agent/config.json ]; then
+   if [ -n "$ANTHROPIC_API_KEY" ]; then
+     MODEL="${AMAN_MODEL:-claude-sonnet-4-6}"
+     cat > /root/.aman-agent/config.json << EOF
+ {
+   "provider": "anthropic",
+   "apiKey": "${ANTHROPIC_API_KEY}",
+   "model": "${MODEL}",
+   "hooks": {
+     "memoryRecall": true,
+     "sessionResume": true,
+     "rulesCheck": true,
+     "workflowSuggest": true,
+     "evalPrompt": false,
+     "autoSessionSave": true,
+     "extractMemories": true,
+     "featureHints": true,
+     "personalityAdapt": true
+   }
+ }
+ EOF
+     echo " Config created: Anthropic / ${MODEL}"
+   elif [ -n "$OPENAI_API_KEY" ]; then
+     MODEL="${AMAN_MODEL:-gpt-4o}"
+     cat > /root/.aman-agent/config.json << EOF
+ {
+   "provider": "openai",
+   "apiKey": "${OPENAI_API_KEY}",
+   "model": "${MODEL}",
+   "hooks": {
+     "memoryRecall": true,
+     "sessionResume": true,
+     "rulesCheck": true,
+     "workflowSuggest": true,
+     "evalPrompt": false,
+     "autoSessionSave": true,
+     "extractMemories": true,
+     "featureHints": true,
+     "personalityAdapt": true
+   }
+ }
+ EOF
+     echo " Config created: OpenAI / ${MODEL}"
+   fi
+ fi
+
+ case "$MODE" in
+   agent)
+     echo " Starting aman-agent (interactive)..."
+     exec aman-agent
+     ;;
+   serve)
+     echo " Starting achannel server on :3000..."
+     exec achannel serve
+     ;;
+   setup)
+     exec aman setup
+     ;;
+   status)
+     exec aman status
+     ;;
+   sh|bash)
+     exec /bin/sh
+     ;;
+   *)
+     echo "Unknown mode: $MODE"
+     echo "Usage: docker run aman [agent|serve|setup|status|sh]"
+     exit 1
+     ;;
+ esac
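Editor's note: the entrypoint's mode dispatch can be sanity-checked without Docker. The helper below is hypothetical (not part of the package); it mirrors the `case "$MODE"` mapping above, echoing the command each mode would `exec`:

```shell
# Hypothetical helper mirroring the entrypoint's case dispatch.
# Command strings are taken from the script above.
mode_command() {
  case "$1" in
    agent)   echo "aman-agent" ;;
    serve)   echo "achannel serve" ;;
    setup)   echo "aman setup" ;;
    status)  echo "aman status" ;;
    sh|bash) echo "/bin/sh" ;;
    *)       echo "unknown mode: $1" >&2; return 1 ;;
  esac
}

mode_command serve   # prints: achannel serve
mode_command bash    # prints: /bin/sh
```

The real script uses `exec`, so the chosen process replaces the shell and becomes PID 1 in the container, which is why `docker stop` signals reach it directly.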
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aman_asmuei/aman",
-   "version": "0.2.0",
+   "version": "0.3.1",
    "description": "Your complete AI companion — identity, memory, and tools in one command",
    "type": "module",
    "bin": {
@@ -10,7 +10,12 @@
    "files": [
      "dist",
      "bin",
-     "template"
+     "template",
+     "deploy",
+     "Dockerfile",
+     "docker-entrypoint.sh",
+     "docker-compose.yml",
+     "docker-compose.ollama.yml"
    ],
    "scripts": {
      "build": "tsup",