altimate-code 0.5.0 → 0.5.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/CHANGELOG.md +45 -0
  2. package/README.md +23 -65
  3. package/package.json +54 -14
package/CHANGELOG.md CHANGED
@@ -5,6 +5,51 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [0.5.2] - 2026-03-19
+
+ ### Added
+
+ - Trace history dialog (`/trace` command) — browse, search, and open past session traces from the TUI (#297)
+ - Docs showcase examples with screenshots (#292)
+
+ ### Fixed
+
+ - TUI trace dialog now respects custom `tracing.dir` config — previously always used default directory (#307)
+ - WebFetch `clearTimeout` leak — DNS failures no longer leak timer handles (#307)
+ - WebFetch User-Agent strategy inverted to honest-bot-first — reduces 403 blocks from TLS fingerprint mismatch (#303)
+ - Snowflake SDK stdout log noise suppressed in TUI via `additionalLogToConsole: false` (#305, #301)
+ - `cleanTitle` fallback in trace dialog no longer returns empty string (#307)
+ - Error logging added to `openTraceInBrowser` for debuggability (#307)
+ - `altimate_change` markers added to `webfetch.ts` for upstream merge compatibility (#307)
+
+ ### Changed
+
+ - Snowflake SDK minimum version bumped to `^2.0.3` for log suppression support (#305)
+ - Removed brew from docs and README (#299)
+ - Fixed README typo (`altimate` → `altimate-code`) (#293)
+
+ ## [0.5.1] - 2026-03-19
+
+ ### Added
+
+ - Simplified agent modes: 3 primary modes (`builder`, `analyst`, `plan`) replacing 7 — cleaner UX with focused roles (#282)
+ - SQL write access control — `builder` prompts for approval on write queries, `analyst` blocks them entirely, destructive SQL (`DROP DATABASE`, `TRUNCATE`) hard-blocked (#282)
+ - `core_failure` telemetry with PII-safe input signatures — captures tool failures with masked SQL literals and redacted secrets (#245)
+ - `peerDependencies` for database drivers in published npm packages (#273)
+ - Comprehensive docs restructuring with new Changelog, Getting Started, and Tools reference pages (#284)
+
+ ### Fixed
+
+ - Replace `escapeSqlString` with parameterized query binds in `finops/schema` modules (#277)
+ - Driver error messages now suggest `npm install` instead of `bun add` (#273)
+ - System prompt traced only once per session to avoid duplication (#287)
+
+ ### Changed
+
+ - Bump `@altimateai/altimate-core` to 0.2.5 — adds Rust-side failure telemetry with PII masking
+ - Removed 5 agent prompts: `executive`, `migrator`, `researcher`, `trainer`, `validator` (#282)
+ - README cleanup and updated branding (#288)
+
  ## [0.5.0] - 2026-03-18
 
  ### Added
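The WebFetch `clearTimeout` entry in 0.5.2 reflects a common Node.js failure mode: a timeout timer that is only cleared on the success path leaks its handle when the request rejects early (e.g. on DNS failure). A minimal sketch of the fix shape — the helper name is illustrative, not the package's actual code:

```typescript
// Hypothetical illustration of the clearTimeout-leak pattern fixed in 0.5.2:
// the timer must be cleared on every exit path, including early rejections
// such as DNS failures, or the handle keeps the event loop alive.
async function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  try {
    return await Promise.race([work, timeout]);
  } finally {
    clearTimeout(timer); // runs whether work resolved, rejected, or timed out
  }
}
```

The key design point is the `finally` block: it executes on resolution, rejection, and timeout alike, so no handle survives an early failure.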
package/README.md CHANGED
@@ -7,19 +7,16 @@
 
  **The open-source data engineering harness.**
 
- The intelligence layer for data engineering AI — 99+ deterministic tools for SQL analysis,
+ The intelligence layer for data engineering AI — 100+ deterministic tools for SQL analysis,
  column-level lineage, dbt, FinOps, and warehouse connectivity across every major cloud platform.
 
  Run standalone in your terminal, embed underneath Claude Code or Codex, or integrate
  into CI pipelines and orchestration DAGs. Precision data tooling for any LLM.
 
- [![npm](https://img.shields.io/npm/v/@altimateai/altimate-code)](https://www.npmjs.com/package/@altimateai/altimate-code)
- [![npm](https://img.shields.io/npm/v/@altimateai/altimate-core)](https://www.npmjs.com/package/@altimateai/altimate-core)
- [![npm downloads](https://img.shields.io/npm/dm/@altimateai/altimate-code)](https://www.npmjs.com/package/@altimateai/altimate-code)
+ [![npm](https://img.shields.io/npm/v/altimate-code)](https://www.npmjs.com/package/altimate-code)
  [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](./LICENSE)
- [![CI](https://github.com/AltimateAI/altimate-code/actions/workflows/ci.yml/badge.svg)](https://github.com/AltimateAI/altimate-code/actions/workflows/ci.yml)
- [![Slack](https://img.shields.io/badge/Slack-Join%20Community-4A154B?logo=slack)](https://altimate.ai/slack)
- [![Docs](https://img.shields.io/badge/docs-altimateai.github.io-blue)](https://altimateai.github.io/altimate-code)
+ [![Slack](https://img.shields.io/badge/Slack-Join%20Community-4A154B?logo=slack)](https://altimate.studio/join-agentic-data-engineering-slack)
+ [![Docs](https://img.shields.io/badge/docs-docs.altimate.sh-blue)](https://docs.altimate.sh)
 
  </div>
 
@@ -28,11 +25,7 @@ into CI pipelines and orchestration DAGs. Precision data tooling for any LLM.
  ## Install
 
  ```bash
- # npm (recommended)
- npm install -g @altimateai/altimate-code
-
- # Homebrew
- brew install AltimateAI/tap/altimate-code
+ npm install -g altimate-code
  ```
 
  Then — in order:
@@ -43,8 +36,6 @@ altimate # Launch the TUI
  /connect # Interactive setup — choose your provider and enter your API key
  ```
 
- > **No API key?** Select **Codex** in the `/connect` menu — it's built-in and requires no setup.
-
  Or set an environment variable directly:
  ```bash
  export ANTHROPIC_API_KEY=your_key # Anthropic Claude
@@ -58,7 +49,9 @@ altimate /discover
 
  `/discover` auto-detects dbt projects, warehouse connections (from `~/.dbt/profiles.yml`, Docker, environment variables), and installed tools (dbt, sqlfluff, airflow, dagster, and more). Skip this and start building — you can always run it later.
 
- > **Zero Python setup required.** On first run, the CLI automatically downloads [`uv`](https://github.com/astral-sh/uv), creates an isolated Python environment, and installs the data engine with all warehouse drivers. No `pip install`, no virtualenv management.
+ > **Headless / scripted usage:** `altimate --yolo` auto-approves all permission prompts. Not recommended with live warehouse connections.
+
+ > **Zero additional setup.** One command install.
 
  ## Why a specialized harness?
 
@@ -90,9 +83,9 @@ no hallucinated SQL advice, no guessing at schema, no missed PII.
  - **FinOps** — credit consumption, expensive query detection, warehouse right-sizing, idle resource cleanup
  - **PII Detection** — 15 categories, 30+ regex patterns, enforced pre-execution
 
- **Works seamlessly with Claude Code and Codex.** altimate is the data engineering tool layer — use it standalone in your terminal, or mount it as the harness underneath whatever AI agent you already run. The two are complementary.
+ **Works seamlessly with Claude Code and Codex.** Use `/configure-claude` or `/configure-codex` to set up integration in one step. altimate is the data engineering tool layer — use it standalone in your terminal, or mount it as the harness underneath whatever AI agent you already run. The two are complementary.
 
- altimate is a fork of [OpenCode](https://github.com/anomalyco/opencode) rebuilt for data teams. Model-agnostic — bring your own LLM or run locally with Ollama.
+ altimate-code is a fork of [OpenCode](https://github.com/anomalyco/opencode) rebuilt for data teams. Model-agnostic — bring your own LLM or run locally with Ollama.
 
  ## Quick demo
 
@@ -146,23 +139,19 @@ Teach your AI teammate project-specific patterns, naming conventions, and best p
 
  ## Agent Modes
 
- Each agent has scoped permissions and purpose-built tools for its role.
+ Each mode has scoped permissions, tool access, and SQL write-access control.
 
- | Agent | Role | Access |
+ | Mode | Role | Access |
  |---|---|---|
- | **Builder** | Create dbt models, SQL pipelines, and data transformations | Full read/write |
- | **Analyst** | Explore data, run SELECT queries, and generate insights | Read-only enforced |
- | **Validator** | Data quality checks, schema validation, test coverage analysis | Read + validate |
- | **Migrator** | Cross-warehouse SQL translation, schema migration, dialect conversion | Read/write for migrations |
- | **Researcher** | Deep-dive analysis, documentation research, and knowledge extraction | Read-only |
- | **Trainer** | Teach project-specific patterns, naming conventions, and best practices | Read + write training data |
- | **Executive** | Business-audience summaries — translates findings into revenue, cost, and compliance impact | Read-only |
+ | **Builder** | Create dbt models, SQL pipelines, and data transformations | Full read/write (write SQL prompts for approval; `DROP DATABASE`/`DROP SCHEMA`/`TRUNCATE` hard-blocked) |
+ | **Analyst** | Explore data, run SELECT queries, FinOps analysis, and generate insights | Read-only enforced (SELECT only, no file writes) |
+ | **Plan** | Outline an approach before acting | Minimal (read files only, no SQL or bash) |
 
- > **New to altimate?** Start with **Analyst mode** — it's read-only and safe to run against production connections.
+ > **New to altimate?** Start with **Analyst mode** — it's read-only and safe to run against production connections. Need specialized workflows (validation, migration, research)? Create [custom agent modes](https://docs.altimate.sh).
 
  ## Supported Warehouses
 
- Snowflake · BigQuery · Databricks · PostgreSQL · Redshift · DuckDB · MySQL · SQL Server
+ Snowflake · BigQuery · Databricks · PostgreSQL · Redshift · DuckDB · MySQL · SQL Server · Oracle · SQLite
 
  First-class support with schema indexing, query execution, and metadata introspection. SSH tunneling available for secure connections.
 
@@ -172,46 +161,13 @@ Model-agnostic — bring your own provider or run locally.
 
  Anthropic · OpenAI · Google Gemini · Google Vertex AI · Amazon Bedrock · Azure OpenAI · Mistral · Groq · DeepInfra · Cerebras · Cohere · Together AI · Perplexity · xAI · OpenRouter · Ollama · GitHub Copilot
 
- > **No API key?** **Codex** is a built-in provider with no key required. Select it via `/connect` to start immediately.
-
  ## Skills
 
  altimate ships with built-in skills for every common data engineering task — type `/` in the TUI to browse available skills and get autocomplete. No memorization required.
 
- ## Architecture
-
- ```
- altimate (TypeScript CLI)
-     |
- @altimateai/altimate-core (napi-rs → Rust)
-     SQL analysis, lineage, PII, safety — 45 functions, ~2ms per call
-     |
- Native Node.js drivers
-     10 warehouses: Snowflake, BigQuery, PostgreSQL, Databricks,
-     Redshift, MySQL, SQL Server, Oracle, DuckDB, SQLite
- ```
-
- The CLI handles AI interactions, TUI, and tool orchestration. SQL analysis is powered by the Rust-based `@altimateai/altimate-core` engine via napi-rs bindings (no Python required). Database connectivity uses native Node.js drivers with lazy loading.
-
- **No Python dependency**: All 73 tool methods run natively in TypeScript. No pip, venv, or Python installation needed.
-
- **dbt-first**: When working in a dbt project, the CLI automatically uses dbt's connection from `profiles.yml` — no separate warehouse configuration needed.
-
- ### Monorepo structure
-
- ```
- packages/
-   altimate-code/   TypeScript CLI (main entry point)
-   drivers/         Shared database drivers (10 warehouses)
-   dbt-tools/       dbt integration (TypeScript)
-   plugin/          Plugin system
-   sdk/             SDKs (includes VS Code extension)
-   util/            Shared utilities
- ```
-
  ## Community & Contributing
 
- - **Slack**: [altimate.ai/slack](https://altimate.ai/slack) — Real-time chat for questions, showcases, and feature discussion
+ - **Slack**: [Join Slack](https://altimate.studio/join-agentic-data-engineering-slack) — Real-time chat for questions, showcases, and feature discussion
  - **Issues**: [GitHub Issues](https://github.com/AltimateAI/altimate-code/issues) — Bug reports and feature requests
  - **Discussions**: [GitHub Discussions](https://github.com/AltimateAI/altimate-code/discussions) — Long-form questions and proposals
  - **Security**: See [SECURITY.md](./SECURITY.md) for responsible disclosure
@@ -220,10 +176,12 @@ Contributions welcome — docs, SQL rules, warehouse connectors, and TUI improve
 
  **[Read CONTRIBUTING.md →](./CONTRIBUTING.md)**
 
- ## What's New
+ ## Changelog
 
- - **v0.4.1** (March 2026) — env-based skill selection, session caching, tracing improvements
- - **v0.4.0** (Feb 2026) — data visualization skill, 99+ tools, training system
+ - **v0.5.0** (March 2026) — smooth streaming mode, builtin skills via postinstall, `/configure-claude` and `/configure-codex` commands, warehouse auth hardening
+ - **v0.4.9** (March 2026) — Snowflake auth overhaul (all auth methods), dbt tool regression fixes, parallel CI builds
+ - **v0.4.2** (March 2026) — yolo mode, Python engine elimination (all-native TypeScript), tool consolidation, path sandboxing hardening, altimate-dbt CLI, unscoped npm package
+ - **v0.4.0** (March 2026) — data visualization skill, 100+ tools, training system
  - **v0.3.x** — [See full changelog →](CHANGELOG.md)
 
  ## License
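The new Agent Modes table gates SQL by statement class (read vs. write vs. destructive). A hypothetical sketch of such a classifier — the names and regexes are illustrative, not altimate-code's actual API:

```typescript
// Illustrative (hypothetical) statement classifier matching the behavior the
// modes table describes: destructive DDL is hard-blocked, other writes prompt
// for approval in builder mode, reads pass through in analyst mode.
type SqlClass = "read" | "write" | "destructive";

function classifySql(sql: string): SqlClass {
  const s = sql.trim().toUpperCase();
  // DROP DATABASE / DROP SCHEMA / TRUNCATE are hard-blocked per the README.
  if (/^(DROP\s+(DATABASE|SCHEMA)|TRUNCATE)\b/.test(s)) return "destructive";
  // Any other DML/DDL counts as a write and requires approval in builder mode.
  if (/^(INSERT|UPDATE|DELETE|CREATE|ALTER|DROP|MERGE)\b/.test(s)) return "write";
  return "read";
}
```

A real implementation would parse the statement rather than pattern-match its prefix (CTEs, comments, and multi-statement batches defeat naive regexes), but the three-way split is the part the modes table specifies.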
package/package.json CHANGED
@@ -14,23 +14,63 @@
  "scripts": {
    "postinstall": "bun ./postinstall.mjs || node ./postinstall.mjs"
  },
- "version": "0.5.0",
+ "version": "0.5.2",
  "license": "MIT",
  "dependencies": {
-   "@altimateai/altimate-core": "^0.2.3"
+   "@altimateai/altimate-core": "^0.2.5"
  },
  "optionalDependencies": {
-   "@altimateai/altimate-code-linux-x64": "0.5.0",
-   "@altimateai/altimate-code-windows-arm64": "0.5.0",
-   "@altimateai/altimate-code-linux-arm64-musl": "0.5.0",
-   "@altimateai/altimate-code-darwin-x64": "0.5.0",
-   "@altimateai/altimate-code-windows-x64": "0.5.0",
-   "@altimateai/altimate-code-linux-x64-musl": "0.5.0",
-   "@altimateai/altimate-code-darwin-x64-baseline": "0.5.0",
-   "@altimateai/altimate-code-linux-x64-baseline-musl": "0.5.0",
-   "@altimateai/altimate-code-linux-x64-baseline": "0.5.0",
-   "@altimateai/altimate-code-linux-arm64": "0.5.0",
-   "@altimateai/altimate-code-darwin-arm64": "0.5.0",
-   "@altimateai/altimate-code-windows-x64-baseline": "0.5.0"
+   "@altimateai/altimate-code-linux-x64": "0.5.2",
+   "@altimateai/altimate-code-windows-arm64": "0.5.2",
+   "@altimateai/altimate-code-linux-arm64-musl": "0.5.2",
+   "@altimateai/altimate-code-darwin-x64": "0.5.2",
+   "@altimateai/altimate-code-windows-x64": "0.5.2",
+   "@altimateai/altimate-code-linux-x64-musl": "0.5.2",
+   "@altimateai/altimate-code-darwin-x64-baseline": "0.5.2",
+   "@altimateai/altimate-code-linux-x64-baseline-musl": "0.5.2",
+   "@altimateai/altimate-code-linux-x64-baseline": "0.5.2",
+   "@altimateai/altimate-code-linux-arm64": "0.5.2",
+   "@altimateai/altimate-code-darwin-arm64": "0.5.2",
+   "@altimateai/altimate-code-windows-x64-baseline": "0.5.2"
+ },
+ "peerDependencies": {
+   "pg": ">=8",
+   "snowflake-sdk": ">=1",
+   "@google-cloud/bigquery": ">=8",
+   "@databricks/sql": ">=1",
+   "mysql2": ">=3",
+   "mssql": ">=11",
+   "oracledb": ">=6",
+   "duckdb": ">=1",
+   "better-sqlite3": ">=11"
+ },
+ "peerDependenciesMeta": {
+   "pg": {
+     "optional": true
+   },
+   "snowflake-sdk": {
+     "optional": true
+   },
+   "@google-cloud/bigquery": {
+     "optional": true
+   },
+   "@databricks/sql": {
+     "optional": true
+   },
+   "mysql2": {
+     "optional": true
+   },
+   "mssql": {
+     "optional": true
+   },
+   "oracledb": {
+     "optional": true
+   },
+   "duckdb": {
+     "optional": true
+   },
+   "better-sqlite3": {
+     "optional": true
+   }
  }
  }
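The added `peerDependencies` block, with every driver marked `"optional": true` in `peerDependenciesMeta`, tells npm not to install (or warn about) drivers the user doesn't need. Consumers of this pattern typically pair it with a lazy loader that imports the driver on first use and turns a missing install into an actionable message (the 0.5.1 changelog's `npm install` hint). A minimal sketch under that assumption — the helper is hypothetical, not the package's actual code:

```typescript
// Hypothetical lazy loader for optional peer-dependency drivers: the module
// is only imported when a connection actually needs it, and a missing
// install produces an install hint instead of a raw MODULE_NOT_FOUND error.
async function loadDriver(pkg: string): Promise<unknown> {
  try {
    return await import(pkg);
  } catch {
    throw new Error(`Driver "${pkg}" is not installed. Run: npm install ${pkg}`);
  }
}
```

This is why the drivers can live outside `dependencies` without breaking installs: nothing touches them until a warehouse of that type is configured.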