@coldge.com/gitbase 1.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,15 @@
+ ISC License
+
+ Copyright (c) 2026, Coldge
+
+ Permission to use, copy, modify, and/or distribute this software for any
+ purpose with or without fee is hereby granted, provided that the above
+ copyright notice and this permission notice appear in all copies.
+
+ THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,178 @@
+ # GitBase
+
+ GitBase is a professional version control system for Supabase. It synchronizes your Live Database state with local SQL files, providing a robust workflow for tracking schema changes, managing multi-tenant environments, and performing safe database restores.
+
+ ---
+
+ ## 🚀 Key Features
+
+ - **Reverse Git Workflow**: Treats the Live Database as the Source of Truth.
+ - **Dependency-Aware Restoration**: Automatically orders SQL execution (e.g., Tables before Views) using a DAG engine.
+ - **Multi-Profile Branches**: Manage Production, Staging, and Development references using Git-like branches.
+ - **Smart Change Detection**: Canonicalizes SQL to eliminate "noise" from formatting or whitespace.
+ - **Remote History Backups**: Store and sync your history repository in Supabase Storage buckets.
+ - **Production Guard**: RBAC protection for high-risk operations on protected branches.
+
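The dependency-aware ordering above can be pictured as a topological sort over the pulled SQL files. The package's DAG engine itself is not part of this diff; the sketch below uses Kahn's algorithm, and the file names are illustrative:

```javascript
// Sketch of dependency-aware ordering (Kahn's algorithm).
// `deps` maps a SQL file to the files it depends on; the returned
// array lists files in a safe execution order (tables before views).
function orderSqlObjects(deps) {
  const nodes = new Set(Object.keys(deps));
  for (const reqs of Object.values(deps)) for (const r of reqs) nodes.add(r);
  const indegree = new Map([...nodes].map(n => [n, 0]));
  const dependents = new Map([...nodes].map(n => [n, []]));
  for (const [node, reqs] of Object.entries(deps)) {
    for (const req of reqs) {
      indegree.set(node, indegree.get(node) + 1); // one more unmet dependency
      dependents.get(req).push(node);
    }
  }
  const queue = [...nodes].filter(n => indegree.get(n) === 0).sort();
  const order = [];
  while (queue.length) {
    const n = queue.shift();
    order.push(n);
    for (const d of dependents.get(n)) {
      indegree.set(d, indegree.get(d) - 1);
      if (indegree.get(d) === 0) queue.push(d);
    }
  }
  if (order.length !== nodes.size) throw new Error('Dependency cycle detected');
  return order;
}
```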
+ ---
+
+ ## 🛠️ Installation & Setup
+
+ ### 1. Installation
+ Install the CLI globally via npm:
+ ```bash
+ npm install -g @coldge.com/gitbase
+ ```
+
+ ### 2. Authentication
+ Log in to your Supabase account using a Personal Access Token (PAT):
+ ```bash
+ gitb login
+ ```
+ *You can verify your status at any time with `gitb whoami`.*
+
+ ### 3. Initialization
+ Link your current directory to a Supabase project:
+ ```bash
+ gitb init
+ ```
+ *Use `--force` if you need to re-initialize an existing directory.*
+
+ ---
+
+ ## 🔄 Core User Flow
+
+ GitBase follows a simple, powerful cycle to keep your local and remote states in sync.
+
+ ### Phase 1: Tracking Changes (Live -> Local)
+ When you modify a Table, Function, or Policy in the Supabase Dashboard, GitBase helps you bring those changes into your version control.
+
+ 1. **Check Status**: `gitb status` identifies differences between your Live DB and local files.
+ 2. **Pull Changes**: `gitb pull` (or `gitb add .`) downloads the latest SQL from Supabase.
+ 3. **Snapshot**: `gitb commit -m "Describe your changes"` saves the state to your local history.
+
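Status comparisons rely on the "smart change detection" described in the features list. The canonicalization routine itself is not shown in this diff; as a sketch, comparing canonical forms might look like the following (stripping comments, collapsing whitespace, and lowercasing are assumptions about what counts as "noise"):

```javascript
// Sketch of SQL canonicalization for change detection:
// formatting-only edits should compare equal.
function canonicalizeSql(sql) {
  return sql
    .replace(/--[^\n]*/g, ' ')          // drop line comments
    .replace(/\/\*[\s\S]*?\*\//g, ' ')  // drop block comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .trim()
    .toLowerCase();                     // crude: also lowercases string literals
}
```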
+ ### Phase 2: Deployment (Local -> Live)
+ After merging a branch or making local adjustments, deploy them back to the database.
+
+ 1. **Review**: `gitb diff` shows what will change compared to the Live DB.
+ 2. **Push**: `gitb push` applies your local SQL files to the database.
+ 3. **Backup**: GitBase will automatically ask if you want to rename existing tables as backups before overwriting.
+
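This diff does not show how the pre-push backup rename is performed; purely as an illustration, the statement it issues might resemble this (the `_backup_` naming and timestamp format are hypothetical):

```javascript
// Hypothetical sketch only: rename a table to a timestamped backup
// before push overwrites it. Not the package's actual implementation.
function backupRenameSql(table, when) {
  const stamp = when.toISOString().replace(/[-:T]/g, '').slice(0, 14); // YYYYMMDDhhmmss
  return `ALTER TABLE "${table}" RENAME TO "${table}_backup_${stamp}";`;
}
```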
+ ### Phase 3: Disaster Recovery
+ If someone breaks production, you can reset the database to a known good state instantly.
+
+ 1. **Check Logs**: Run `gitb log` to find the stable commit hash.
+ 2. **Restore**: `gitb revert <hash>` restores the entire schema and logic to that exact point in time.
+
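This recovery flow rests on the commit objects GitBase writes (the `commit` implementation later in this diff stores `{ tree, parent, message, timestamp }` keyed by hash). Finding a commit with `gitb log` amounts to walking the `parent` chain from HEAD; the sketch below uses an in-memory store in place of the `.gitbase/objects` directory:

```javascript
// Walk commit objects from HEAD back to the root, newest first.
// `objects` stands in for .gitbase/objects: a map from commit hash
// to the serialized JSON commit object.
function logHistory(objects, head) {
  const entries = [];
  let hash = head;
  while (hash) {
    const commit = JSON.parse(objects[hash]);
    entries.push({ hash, message: commit.message });
    hash = commit.parent; // null at the first commit ends the walk
  }
  return entries;
}
```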
+ ---
+
+ ## 📖 Command Reference
+
+ ### Authentication
+ #### `login`
+ Authenticate with the Supabase Management API.
+ - **Usage**: `gitb login`
+
+ #### `whoami`
+ Displays the current logged-in user and organization access.
+ - **Usage**: `gitb whoami`
+
+ ### Project Configuration
+ #### `init`
+ Initializes a `.gitbase` repository and links a project.
+ - **Usage**: `gitb init [options]`
+ - **Arguments**:
+ | Flag | Alias | Description |
+ | :--- | :--- | :--- |
+ | `--force` | `-f` | Overwrite existing configuration. |
+
+ ### Synchronization
+ #### `status`
+ Summarizes differences between the live database and local `.sql` files.
+ - **Usage**: `gitb status`
+
+ #### `pull` | `add`
+ Fetches schema definitions from the database to your local `supabase/` directory.
+ - **Usage**: `gitb pull [files..]`
+ - **Arguments**:
+ | Argument | Description |
+ | :--- | :--- |
+ | `files` | Optional. Specific files or directories to pull (defaults to all). |
+
+ #### `push`
+ Applies local `.sql` files to the live database.
+ - **Usage**: `gitb push [files..]`
+ - **Arguments**:
+ | Argument | Description |
+ | :--- | :--- |
+ | `files` | Optional. Specific files to push. |
+
+ ### Versioning & Comparison
+ #### `commit`
+ Saves a snapshot of the current local schema to the history.
+ - **Usage**: `gitb commit -m <message>`
+ - **Arguments**:
+ | Flag | Alias | Description |
+ | :--- | :--- | :--- |
+ | `--message` | `-m` | **Required.** The description of the snapshot. |
+
+ #### `log`
+ Displays a chronological list of commits for the current branch.
+ - **Usage**: `gitb log`
+
+ #### `diff`
+ Shows a line-by-line comparison of SQL definitions.
+ - **Usage**: `gitb diff [commit]`
+ - **Arguments**:
+ | Argument | Description |
+ | :--- | :--- |
+ | `commit` | Optional. The commit hash to compare against the Live DB (defaults to local HEAD). |
+
+ ### Restoration
+ #### `revert` | `reset`
+ Hard-resets the live database and local files to a previous commit.
+ - **Usage**: `gitb revert [commit] [options]`
+ - **Arguments**:
+ | Argument | Description |
+ | :--- | :--- |
+ | `commit` | Optional. Hash of the target commit (defaults to HEAD). |
+ | `--files` | Optional. Restricts the revert to specific file paths. |
+
+ ### Multi-Environment Profiles
+ #### `branch`
+ Manages project environment profiles.
+ - **Usage**: `gitb branch [name] [options]`
+ - **Arguments**:
+ | Flag | Alias | Description |
+ | :--- | :--- | :--- |
+ | `name` | | Positional. Name of the branch to create/list. |
+ | `--delete` | `-d` | Deletes the specified branch profile. |
+
+ #### `checkout`
+ Switches the active environment profile.
+ - **Usage**: `gitb checkout <name>`
+
+ #### `merge`
+ Synchronizes configuration and history from another branch profile.
+ - **Usage**: `gitb merge <name>`
+
+ ### Cloud Sync
+ #### `remote`
+ Manages history backups in Supabase Storage.
+ - **Usage**: `gitb remote <subcommand>`
+ - **Subcommands**: `add`, `list`, `push`, `pull`.
+
+ ---
+
+ ## 🛡️ Security & Safety
+
+ > [!CAUTION]
+ > **Token Security**: Your Management API token is stored in `~/.config/gitbase/token.json`. Ensure this file is never shared or accidentally committed to public repositories.
+
+ > [!IMPORTANT]
+ > **Protected Branches**: By default, any branch named `production` requires **Owner** or **Administrator** organizational permissions to execute `push` or `revert`. This prevents unauthorized destructive changes to live environments.
+
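Combined with the `isProductionAdmin` helper in the API module later in this diff (which resolves the caller's `roleName` via the Management API), the protected-branch rule reduces to a branch-name check plus a role allowlist:

```javascript
// Protected-branch guard: destructive commands on 'production'
// require an Owner or Administrator role (matching the allowedRoles
// list in isProductionAdmin).
function canRunDestructive(branchName, roleName) {
  if (branchName !== 'production') return true;
  return ['Owner', 'Administrator'].includes(roleName);
}
```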
+ ---
+
+ ## ⚖️ License
+
+ ISC License • Copyright (c) 2026 Coldge
@@ -0,0 +1,166 @@
+ import axios from 'axios';
+ import { getToken } from '../utils/config.js';
+ import fs from 'fs/promises';
+ import path from 'path';
+ import pg from 'pg';
+ import chalk from 'chalk';
+ const { Client } = pg;
+ const API_URL = 'https://api.supabase.com/v1';
+ const GITBASE_DIR = '.gitbase';
+ const CONFIG_FILE = path.join(GITBASE_DIR, 'config');
+ export async function getProjects() {
+   const token = await getToken();
+   if (!token)
+     throw new Error('Not logged in. Run `gitb login` first.');
+   const response = await axios.get(`${API_URL}/projects`, {
+     headers: { Authorization: `Bearer ${token}` }
+   });
+   return response.data;
+ }
+ export async function getConfig() {
+   try {
+     const content = await fs.readFile(CONFIG_FILE, 'utf-8');
+     return JSON.parse(content);
+   }
+   catch {
+     return null;
+   }
+ }
+ export async function runQuery(projectRef, sql) {
+   // 1. Get Connection String from Config
+   let connectionString = '';
+   try {
+     const config = await getConfig();
+     if (config) {
+       // New format
+       if (config.branches) {
+         for (const b of Object.values(config.branches)) {
+           const branch = b;
+           if (branch.projectRef === projectRef) {
+             connectionString = branch.connectionString;
+             break;
+           }
+         }
+       }
+       // Old format compatibility
+       else if (config.projectRef === projectRef) {
+         connectionString = config.connectionString;
+         if (!connectionString && config.dbPassword) {
+           connectionString = `postgres://postgres:${config.dbPassword}@db.${projectRef}.supabase.co:5432/postgres`;
+         }
+       }
+     }
+   }
+   catch { }
+   if (!connectionString) {
+     throw new Error('Database connection string not found in config. Please re-run `gitb init --force`.');
+   }
+   // 2. Connect
+   // Use connection pooling URL provided by user
+   const client = new Client({
+     connectionString,
+     ssl: { rejectUnauthorized: false }
+   });
+   try {
+     await client.connect();
+     const res = await client.query(sql);
+     return res.rows; // Array of objects
+   }
+   catch (err) {
+     throw new Error(`Database Error: ${err.message}`);
+   }
+   finally {
+     await client.end();
+   }
+ }
+ // --- STORAGE API ---
+ export async function listObjects(projectRef, bucket, prefix = '') {
+   const token = await getToken();
+   // Using Supabase Storage API directly
+   const url = `https://${projectRef}.supabase.co/storage/v1/object/list/${bucket}`;
+   const response = await axios.post(url, {
+     prefix,
+     limit: 1000,
+     offset: 0,
+     sortBy: { column: 'name', order: 'asc' }
+   }, {
+     headers: { Authorization: `Bearer ${token}` }
+   });
+   return response.data;
+ }
+ export async function uploadObject(projectRef, bucket, path, content) {
+   const token = await getToken();
+   const url = `https://${projectRef}.supabase.co/storage/v1/object/${bucket}/${path}`;
+   // Upload history file content
+   await axios.post(url, content, {
+     headers: {
+       Authorization: `Bearer ${token}`,
+       'Content-Type': 'application/octet-stream',
+       'x-upsert': 'true'
+     }
+   });
+ }
+ export async function downloadObject(projectRef, bucket, path) {
+   const token = await getToken();
+   const url = `https://${projectRef}.supabase.co/storage/v1/object/${bucket}/${path}`;
+   const response = await axios.get(url, {
+     headers: { Authorization: `Bearer ${token}` },
+     responseType: 'text'
+   });
+   return response.data;
+ }
+ export async function ensureBucket(projectRef, bucket) {
+   const token = await getToken();
+   const url = `https://${projectRef}.supabase.co/storage/v1/bucket`;
+   try {
+     await axios.post(url, {
+       id: bucket,
+       name: bucket,
+       public: false
+     }, {
+       headers: { Authorization: `Bearer ${token}` }
+     });
+   }
+   catch (e) {
+     if (e.response?.status !== 409) {
+       throw e;
+     }
+   }
+ }
+ // --- RBAC / PERMISSIONS ---
+ export async function isProductionAdmin(projectRef) {
+   const token = await getToken();
+   if (!token)
+     throw new Error('Not logged in.');
+   try {
+     // 1. Get the organization for this project
+     const projects = await getProjects();
+     const project = projects.find((p) => p.id === projectRef);
+     if (!project)
+       throw new Error(`Project ${projectRef} not found in your account.`);
+     const orgId = project.organization_id;
+     // 2. Get the current user's profile to find their ID/Email
+     // Actually, the easiest way with Management API is to list organization members
+     // and find the one that matches our token's identity.
+     // However, the Management API token itself doesn't easily reveal "who am I"
+     // without an extra call.
+     const userResp = await axios.get(`${API_URL}/me`, {
+       headers: { Authorization: `Bearer ${token}` }
+     });
+     const myEmail = userResp.data.email;
+     // 3. Get org members and check role
+     const membersResp = await axios.get(`${API_URL}/organizations/${orgId}/members`, {
+       headers: { Authorization: `Bearer ${token}` }
+     });
+     const me = membersResp.data.find((m) => m.email === myEmail);
+     if (!me)
+       return false;
+     // Only Owners and Administrators can touch production
+     const allowedRoles = ['Owner', 'Administrator'];
+     return allowedRoles.includes(me.roleName);
+   }
+   catch (e) {
+     console.error(chalk.red(`Permission Check Failed: ${e.message}`));
+     return false;
+   }
+ }
@@ -0,0 +1,34 @@
+ import fs from 'fs/promises';
+ import path from 'path';
+ import chalk from 'chalk';
+ import { getStatus } from './status.js';
+ export async function add(argv) {
+   const changes = await getStatus();
+   if (!changes)
+     return;
+   // For "." or empty files, process all changes
+   const files = argv.files || [];
+   const isAll = files.length === 0 || files.includes('.');
+   const toProcess = isAll ? changes : changes.filter((c) => files.includes(c.path));
+   if (toProcess.length === 0) {
+     console.log(chalk.yellow('No changes to add.'));
+     return;
+   }
+   for (const change of toProcess) {
+     const fullPath = path.join('supabase', change.path);
+     if (change.type === 'deleted') {
+       try {
+         await fs.unlink(fullPath);
+         console.log(chalk.red(`Deleted: ${change.path}`));
+       }
+       catch (e) {
+         // Ignore if already deleted
+       }
+     }
+     else {
+       await fs.mkdir(path.dirname(fullPath), { recursive: true });
+       await fs.writeFile(fullPath, change.content, 'utf-8');
+       console.log(chalk.green(`Updated: ${change.path}`));
+     }
+   }
+ }
@@ -0,0 +1,96 @@
+ import fs from 'fs/promises';
+ import path from 'path';
+ import chalk from 'chalk';
+ import readline from 'readline';
+ import { getConfig } from '../api/supabase.js';
+ const GITBASE_DIR = '.gitbase';
+ const CONFIG_FILE = path.join(GITBASE_DIR, 'config');
+ export async function branch(argv) {
+   const config = await getConfig();
+   if (!config) {
+     console.error(chalk.red('Not initialized. Run `gitb init` first.'));
+     return;
+   }
+   const branchName = argv.name;
+   const isDelete = argv.d || argv.delete;
+   if (!branchName) {
+     // List branches
+     console.log(chalk.cyan('Branches:'));
+     for (const name of Object.keys(config.branches || {})) {
+       if (name === config.currentBranch) {
+         console.log(chalk.green(`* ${name}`));
+       }
+       else {
+         console.log(` ${name}`);
+       }
+     }
+     return;
+   }
+   if (isDelete) {
+     if (branchName === 'production') {
+       console.error(chalk.red('Cannot delete the protected "production" branch.'));
+       return;
+     }
+     if (branchName === config.currentBranch) {
+       console.error(chalk.red('Cannot delete the current branch. Switch to another branch first.'));
+       return;
+     }
+     if (!config.branches || !config.branches[branchName]) {
+       console.error(chalk.red(`Branch '${branchName}' does not exist.`));
+       return;
+     }
+     delete config.branches[branchName];
+     await fs.writeFile(CONFIG_FILE, JSON.stringify(config, null, 2));
+     console.log(chalk.red(`Deleted branch '${branchName}'.`));
+     return;
+   }
+   if (config.branches && config.branches[branchName]) {
+     console.error(chalk.red(`Branch '${branchName}' already exists.`));
+     return;
+   }
+   const rl = readline.createInterface({
+     input: process.stdin,
+     output: process.stdout
+   });
+   const ask = (q) => new Promise(r => rl.question(q, r));
+   console.log(chalk.blue(`Creating new branch: ${branchName}`));
+   const ref = await ask('Project Ref: ');
+   const connString = await ask('Connection URI: ');
+   if (!ref || !connString) {
+     console.error(chalk.red('Project Ref and Connection URI are required.'));
+     rl.close();
+     return;
+   }
+   const currentHead = await fs.readFile(path.join(GITBASE_DIR, 'HEAD'), 'utf-8').catch(() => null);
+   config.branches = config.branches || {};
+   config.branches[branchName] = {
+     projectRef: ref.trim(),
+     connectionString: connString.trim(),
+     head: currentHead
+   };
+   await fs.writeFile(CONFIG_FILE, JSON.stringify(config, null, 2));
+   console.log(chalk.green(`Branch '${branchName}' created.`));
+   rl.close();
+ }
+ export async function checkout(argv) {
+   const config = await getConfig();
+   if (!config) {
+     console.error(chalk.red('Not initialized. Run `gitb init` first.'));
+     return;
+   }
+   const branchName = argv.name;
+   if (!branchName) {
+     console.error(chalk.red('Branch name required.'));
+     return;
+   }
+   if (!config.branches || !config.branches[branchName]) {
+     console.error(chalk.red(`Branch '${branchName}' does not exist.`));
+     return;
+   }
+   // Switch to target branch profile
+   config.currentBranch = branchName;
+   await fs.writeFile(CONFIG_FILE, JSON.stringify(config, null, 2));
+   // Swapping local 'supabase/' folder to match branch HEAD is recommended but not implemented yet.
+   console.log(chalk.green(`Switched to branch '${branchName}'`));
+   console.log(chalk.yellow('Note: Your local files are unchanged. Run `gitb reset` to sync files with this branch\'s HEAD.'));
+ }
@@ -0,0 +1,65 @@
+ import fs from 'fs/promises';
+ import path from 'path';
+ import chalk from 'chalk';
+ import readline from 'readline';
+ import { downloadObject, listObjects } from '../api/supabase.js';
+ const GITBASE_DIR = '.gitbase';
+ const CONFIG_FILE = path.join(GITBASE_DIR, 'config');
+ const OBJECTS_DIR = path.join(GITBASE_DIR, 'objects');
+ const BUCKET_NAME = 'gitbase-remotes';
+ export async function clone(argv) {
+   const projectRef = argv.ref;
+   if (!projectRef) {
+     console.error(chalk.red('Project Reference ID required.'));
+     return;
+   }
+   console.log(chalk.blue(`Cloning GitBase history from project '${projectRef}'...`));
+   const rl = readline.createInterface({
+     input: process.stdin,
+     output: process.stdout
+   });
+   const ask = (q) => new Promise(r => rl.question(q, r));
+   // 1. Setup local dir
+   try {
+     await fs.mkdir(GITBASE_DIR, { recursive: true });
+     await fs.mkdir(OBJECTS_DIR, { recursive: true });
+   }
+   catch (e) { }
+   // 2. Download Config from Bucket
+   let config;
+   try {
+     const configStr = await downloadObject(projectRef, BUCKET_NAME, 'config.json');
+     config = JSON.parse(configStr);
+     await fs.writeFile(CONFIG_FILE, JSON.stringify(config, null, 2));
+   }
+   catch (e) {
+     console.error(chalk.red(`Failed to download project configuration from bucket: ${e.message}`));
+     console.log(chalk.yellow('Make sure the project has been initialized with `gitb init` and the bucket exists.'));
+     rl.close();
+     return;
+   }
+   // 3. Download All Objects
+   console.log(chalk.blue('Downloading commit history...'));
+   try {
+     const objects = await listObjects(projectRef, BUCKET_NAME, 'objects/');
+     for (const obj of objects) {
+       const hash = obj.name;
+       const content = await downloadObject(projectRef, BUCKET_NAME, `objects/${hash}`);
+       await fs.writeFile(path.join(OBJECTS_DIR, hash), content, 'utf-8');
+     }
+   }
+   catch (e) {
+     console.warn(chalk.yellow('Failed to download some history objects.'));
+   }
+   // 4. Update HEAD to match branch head
+   if (config.branches && config.branches[config.currentBranch]) {
+     const head = config.branches[config.currentBranch].head;
+     if (head) {
+       await fs.writeFile(path.join(GITBASE_DIR, 'HEAD'), head);
+     }
+   }
+   console.log(chalk.green(`\nSuccessfully cloned project '${projectRef}'.`));
+   console.log(chalk.yellow(`Current branch: ${config.currentBranch}`));
+   console.log(chalk.blue('Run `gitb reset` to sync your local files with the database.'));
+   rl.close();
+ }
@@ -0,0 +1,99 @@
+ import fs from 'fs/promises';
+ import path from 'path';
+ import chalk from 'chalk';
+ import { hashString } from '../utils/hashing.js';
+ const GITBASE_DIR = '.gitbase';
+ const OBJECTS_DIR = path.join(GITBASE_DIR, 'objects');
+ const HEAD_FILE = path.join(GITBASE_DIR, 'HEAD');
+ export async function commit(argv) {
+   const message = argv.message;
+   if (!message) {
+     console.error(chalk.red('Commit message required.'));
+     return;
+   }
+   // 1. Build Tree
+   const tree = {};
+   const supabaseDir = 'supabase';
+   // Recursive file walker
+   async function walk(dir, base) {
+     const entries = await fs.readdir(dir, { withFileTypes: true }).catch(() => []);
+     for (const entry of entries) {
+       const fullPath = path.join(dir, entry.name);
+       const relPath = path.join(base, entry.name).replace(/\\/g, '/'); // normalize path
+       if (entry.isDirectory()) {
+         await walk(fullPath, relPath);
+       }
+       else if (entry.isFile() && entry.name.endsWith('.sql')) {
+         const content = await fs.readFile(fullPath, 'utf-8');
+         const hash = hashString(content);
+         await saveObject(hash, content);
+         tree[relPath] = hash;
+       }
+     }
+   }
+   // Check if supabase dir exists
+   try {
+     await fs.access(supabaseDir);
+     await walk(supabaseDir, '');
+   }
+   catch {
+     console.log(chalk.yellow('Nothing to commit (supabase directory missing).'));
+     return;
+   }
+   if (Object.keys(tree).length === 0) {
+     console.log(chalk.yellow('Nothing to commit (empty working tree).'));
+     return;
+   }
+   // Save Tree Object
+   const sortedTree = Object.keys(tree).sort().reduce((acc, key) => {
+     acc[key] = tree[key];
+     return acc;
+   }, {});
+   const treeJson = JSON.stringify(sortedTree);
+   const treeHash = hashString(treeJson);
+   await saveObject(treeHash, treeJson);
+   // 2. Get Parent
+   let parent = null;
+   let parentTreeHash = null;
+   try {
+     parent = await fs.readFile(HEAD_FILE, 'utf-8');
+     if (parent) {
+       const parentCommitStr = await fs.readFile(path.join(OBJECTS_DIR, parent), 'utf-8');
+       const parentCommit = JSON.parse(parentCommitStr);
+       parentTreeHash = parentCommit.tree;
+     }
+   }
+   catch { }
+   // Prevent empty commits
+   if (parentTreeHash === treeHash) {
+     console.log(chalk.yellow('Nothing to commit (working tree clean).'));
+     return;
+   }
+   // 3. Create Commit Object
+   const commitObj = {
+     tree: treeHash,
+     parent: parent,
+     message: message,
+     timestamp: new Date().toISOString(),
+     author: process.env.USERNAME || process.env.USER || 'unknown'
+   };
+   const commitJson = JSON.stringify(commitObj, null, 2);
+   const commitHash = hashString(commitJson);
+   await saveObject(commitHash, commitJson);
+   // 4. Update HEAD
+   await fs.writeFile(HEAD_FILE, commitHash, 'utf-8');
+   // 5. Update branch HEAD in config
+   const { getConfig } = await import('../api/supabase.js');
+   const config = await getConfig();
+   if (config && config.branches && config.currentBranch) {
+     config.branches[config.currentBranch].head = commitHash;
+     await fs.writeFile(path.join(GITBASE_DIR, 'config'), JSON.stringify(config, null, 2));
+   }
+   console.log(chalk.green(`[${commitHash.substring(0, 7)}] ${message}`));
+ }
+ async function saveObject(hash, content) {
+   await fs.mkdir(OBJECTS_DIR, { recursive: true });
+   // Maybe use subdirectories like git (ab/cdef...) to avoid too many files in one dir
+   // For MVP, flat is fine.
+   await fs.writeFile(path.join(OBJECTS_DIR, hash), content, 'utf-8');
+ }
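The comment in `saveObject` above mentions git-style sharding as a future option; a sketch of that layout (a standalone path helper, not wired into `saveObject`):

```javascript
// Git-style fan-out: the first two hex characters of the hash become
// a subdirectory, so no single directory accumulates every object.
function shardedObjectPath(objectsDir, hash) {
  return `${objectsDir}/${hash.slice(0, 2)}/${hash.slice(2)}`;
}
```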