@fuzzle/opencode-accountant 0.3.0 → 0.4.0-next.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -122,12 +122,21 @@ providers:
  currencyField: Currency
  skipRows: 9
  delimiter: ';'
- renamePattern: 'transactions-ubs-{account-number}.csv'
+ renamePattern: 'ubs-{account-number}-transactions-{from-date}-to-{until-date}.csv'
  metadata:
  - field: account-number
  row: 0
  column: 1
  normalize: spaces-to-dashes
+ - field: from-date
+ row: 2
+ column: 1
+ - field: until-date
+ row: 3
+ column: 1
+ - field: closing-balance
+ row: 5
+ column: 1
  currencies:
  CHF: chf
  EUR: eur
@@ -201,9 +210,10 @@ The `import-pipeline` tool provides an atomic, safe import workflow using git wo
  - Creates an isolated git worktree
  - Syncs CSV files from main repo to worktree
  - Classifies CSV files by provider/currency
+ - Extracts accounts from rules and creates declarations in year journal
  - Validates all transactions have matching rules
  - Imports transactions to the appropriate year journal
- - Reconciles closing balance (if available in CSV metadata)
+ - Reconciles closing balance (auto-detected from CSV metadata or data analysis)
  - Merges changes back to main branch with `--no-ff`
  - Deletes processed CSV files from main repo's import/incoming
  - Cleans up the worktree
@@ -225,7 +235,14 @@ The `import-pipeline` tool is the single entry point for importing bank statemen

  #### Rules File Matching

- The tool matches CSV files to their rules files by parsing the `source` directive in each `.rules` file. For example, if `ubs-account.rules` contains:
+ The tool matches CSV files to their rules files using multiple methods:
+
+ 1. **Source directive matching** (primary): Parses the `source` directive in each `.rules` file and supports glob patterns
+ 2. **Filename matching** (fallback): If path matching fails, matches based on the rules filename prefix
+
+ **Example using source directive:**
+
+ If `ubs-account.rules` contains:

  ```
  source ../../import/pending/ubs/chf/transactions.csv
@@ -233,32 +250,58 @@ source ../../import/pending/ubs/chf/transactions.csv

  The tool will use that rules file when processing `transactions.csv`.

- **Note:** The `source` path should match your configured `{paths.pending}` directory structure.
+ **Example using filename matching:**
+
+ If you have a rules file named `ubs-1234-567890.rules` and a CSV file named `ubs-1234-567890-transactions-2026-01-05-to-2026-01-31.csv`, the tool will automatically match them based on the common prefix `ubs-1234-567890`.
+
+ This is particularly useful when CSV files move between directories (e.g., from `pending/` to `done/`) or when maintaining exact source paths is impractical.
+
+ **Note:** Name your rules files to match the prefix of your CSV files for automatic matching.

  See the hledger documentation for details on rules file format and syntax.

+ #### Account Declarations
+
+ The pipeline automatically manages account declarations:
+
+ - Scans rules files matched to the CSVs being imported
+ - Extracts all accounts (account1 and account2 directives)
+ - Creates or updates year journal files with sorted account declarations
+ - Ensures `hledger check --strict` validation passes
+
+ **No manual account setup required.** Account declarations are created proactively before import attempts.
+

  #### Unknown Postings

  When a transaction doesn't match any `if` pattern in the rules file, hledger assigns it to `income:unknown` or `expenses:unknown` depending on the transaction direction. The pipeline will fail at the validation step, reporting the unknown postings so you can add appropriate rules before retrying.

  #### Closing Balance Reconciliation

- For providers that include closing balance in CSV metadata (e.g., UBS), the tool automatically validates that the imported transactions result in the correct balance. Configure metadata extraction in `providers.yaml`:
+ The import pipeline automatically detects closing balance using the following fallback chain:
+
+ 1. **CSV Metadata Extraction**: For providers with closing balance in metadata (e.g., UBS)
+ 2. **CSV Data Analysis**: Extracts balance from the last transaction row if metadata unavailable
+ 3. **Manual Override**: Use the `closingBalance` parameter when automatic detection fails
+
+ Configure metadata extraction in `providers.yaml`:

  ```yaml
  metadata:
- - field: closing_balance
+ - field: closing-balance
  row: 5
  column: 1
- - field: from_date
+ - field: from-date
  row: 2
  column: 1
- - field: until_date
+ - field: until-date
  row: 3
  column: 1
  ```

- For providers without closing balance in metadata (e.g., Revolut), provide it manually via the `closingBalance` argument.
+ **Note:** For most CSV formats, the closing balance will be detected automatically. Manual override is only needed when:
+
+ - CSV has no balance information in metadata or data
+ - The auto-detected balance has low confidence

  ## Development

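The fallback chain described in the README hunk above can be sketched as follows. This is a minimal illustration; `resolveClosingBalance` and its return shape are hypothetical names, not the package's actual internals (the shipped logic lives in `determineClosingBalance` in `dist/index.js`, where a manually supplied `closingBalance` always takes precedence, then metadata, then high-confidence CSV data analysis):

```javascript
// Hypothetical sketch of the closing-balance precedence described above.
function resolveClosingBalance(manualOverride, metadataBalance, csvAnalysis) {
  if (manualOverride) return { balance: manualOverride, source: "manual" };
  if (metadataBalance) return { balance: metadataBalance, source: "metadata" };
  if (csvAnalysis && csvAnalysis.confidence === "high") {
    return { balance: csvAnalysis.balance, source: "csv-data" };
  }
  // Caller reports "No closing balance found" and asks for a manual value.
  return null;
}

console.log(resolveClosingBalance(undefined, "CHF 100.00", null));
// { balance: 'CHF 100.00', source: 'metadata' }
```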
@@ -89,9 +89,10 @@ The `import-pipeline` tool provides an **atomic, safe workflow** using git workt
  3. **Automatic Processing**: The tool creates an isolated git worktree and:
  - Syncs CSV files from main repo to worktree
  - Classifies CSV files by provider/currency
+ - Extracts required accounts from rules files and updates year journal
  - Validates all transactions have matching rules
  - Imports transactions to the appropriate year journal
- - Reconciles closing balance (if available in CSV metadata)
+ - Reconciles closing balance (auto-detected from CSV metadata or data, or manual override)
  - Merges changes back to main branch with `--no-ff`
  - Deletes processed CSV files from main repo's import/incoming
  - Cleans up the worktree
@@ -104,10 +105,31 @@ The `import-pipeline` tool provides an **atomic, safe workflow** using git workt
  ### Rules Files

  - The location of the rules files is configured in `config/import/providers.yaml`
- - Match CSV to rules file via the `source` directive in each `.rules` file
+ - Match CSV to rules file via the `source` directive in each `.rules` file, with automatic fallback to filename-based matching
+ - **Filename matching example:** If the rules file is named `ubs-1234-567890.rules`, it will automatically match CSV files like `ubs-1234-567890-transactions-2026-01.csv` based on the common prefix. This works even when CSV files move between directories.
+ - When account detection fails, recommend users either fix their `source` directive or rename their rules file to match the CSV filename prefix
  - Use field names from the `fields` directive for matching
  - Unknown account pattern: `income:unknown` (positive amounts) / `expenses:unknown` (negative amounts)

+ ### Automatic Account Declarations
+
+ The import pipeline automatically:
+
+ - Scans matched rules files for all account references (account1, account2 directives)
+ - Creates/updates year journal files (e.g., ledger/2026.journal) with sorted account declarations
+ - Prevents `hledger check --strict` failures due to missing account declarations
+ - No manual account setup required
+
+ ### Automatic Balance Detection
+
+ The reconciliation step attempts to extract closing balance from:
+
+ 1. CSV header metadata (e.g., UBS "Closing balance:" row)
+ 2. CSV data analysis (balance field in last transaction row)
+ 3. Manual override via `closingBalance` parameter (fallback)
+
+ For most providers, manual balance input is no longer required.
+
  ## Tool Usage Reference

  The following are MCP tools available to you. Always call these tools directly - do not attempt to replicate their behavior with shell commands.
@@ -138,12 +160,13 @@ The following are MCP tools available to you. Always call these tools directly -
  1. Creates isolated git worktree
  2. Syncs CSV files from main repo to worktree
  3. Classifies CSV files (unless `skipClassify: true`)
- 4. Validates all transactions have matching rules (dry run)
- 5. Imports transactions to year journal
- 6. Reconciles closing balance against CSV metadata or manual value
- 7. Merges to main with `--no-ff` commit
- 8. Deletes processed CSV files from main repo's import/incoming
- 9. Cleans up worktree
+ 4. Extracts accounts from matched rules and updates year journal with declarations
+ 5. Validates all transactions have matching rules (dry run)
+ 6. Imports transactions to year journal
+ 7. Reconciles closing balance (auto-detected from CSV metadata/data or manual override)
+ 8. Merges to main with `--no-ff` commit
+ 9. Deletes processed CSV files from main repo's import/incoming
+ 10. Cleans up worktree

  **Output:** Returns step-by-step results with success/failure for each phase

package/dist/index.js CHANGED
@@ -23458,6 +23458,18 @@ function findRulesForCsv(csvPath, mapping) {
  return rulesFile;
  }
  }
+ const csvBasename = path9.basename(csvPath);
+ const matches = [];
+ for (const rulesFile of Object.values(mapping)) {
+ const rulesBasename = path9.basename(rulesFile, ".rules");
+ if (csvBasename.startsWith(rulesBasename)) {
+ matches.push({ rulesFile, prefixLength: rulesBasename.length });
+ }
+ }
+ if (matches.length > 0) {
+ matches.sort((a, b) => b.prefixLength - a.prefixLength);
+ return matches[0].rulesFile;
+ }
  return null;
  }

@@ -24191,12 +24203,6 @@ function buildErrorResult4(params) {
  ...params
  });
  }
- function buildSuccessResult4(params) {
- return JSON.stringify({
- success: true,
- ...params
- });
- }
  function validateWorktree(directory, worktreeChecker) {
  if (!worktreeChecker(directory)) {
  return buildErrorResult4({
@@ -24235,7 +24241,7 @@ function findCsvToReconcile(doneDir, options) {
  const relativePath = path11.relative(path11.dirname(path11.dirname(doneDir)), csvFile);
  return { csvFile, relativePath };
  }
- function determineClosingBalance(csvFile, config2, options, relativeCsvPath) {
+ function determineClosingBalance(csvFile, config2, options, relativeCsvPath, rulesDir) {
  let metadata;
  try {
  const content = fs11.readFileSync(csvFile, "utf-8");
@@ -24246,25 +24252,50 @@ function determineClosingBalance(csvFile, config2, options, relativeCsvPath) {
  metadata = undefined;
  }
  let closingBalance = options.closingBalance;
- if (!closingBalance && metadata?.closing_balance) {
- const { closing_balance, currency } = metadata;
- closingBalance = closing_balance;
- if (currency && !closingBalance.includes(currency)) {
+ if (!closingBalance && metadata?.["closing-balance"]) {
+ const closingBalanceValue = metadata["closing-balance"];
+ const currency = metadata.currency;
+ closingBalance = closingBalanceValue;
+ if (currency && closingBalance && !closingBalance.includes(currency)) {
  closingBalance = `${currency} ${closingBalance}`;
  }
  }
  if (!closingBalance) {
+ const csvAnalysis = tryExtractClosingBalanceFromCSV(csvFile, rulesDir);
+ if (csvAnalysis && csvAnalysis.confidence === "high") {
+ closingBalance = csvAnalysis.balance;
+ return { closingBalance, metadata, fromCSVAnalysis: true };
+ }
+ }
+ if (!closingBalance) {
+ const retryCmd = buildRetryCommand(options, "CHF 2324.79", options.account);
  return {
  error: buildErrorResult4({
  csvFile: relativeCsvPath,
- error: "No closing balance found in CSV metadata",
- hint: "Provide closingBalance parameter manually",
+ error: "No closing balance found in CSV metadata or data",
+ hint: `Provide closingBalance parameter manually. Example retry: ${retryCmd}`,
  metadata
  })
  };
  }
  return { closingBalance, metadata };
  }
+ function buildRetryCommand(options, closingBalance, account) {
+ const parts = ["import-pipeline"];
+ if (options.provider) {
+ parts.push(`--provider ${options.provider}`);
+ }
+ if (options.currency) {
+ parts.push(`--currency ${options.currency}`);
+ }
+ if (closingBalance) {
+ parts.push(`--closingBalance "${closingBalance}"`);
+ }
+ if (account) {
+ parts.push(`--account "${account}"`);
+ }
+ return parts.join(" ");
+ }
  function determineAccount(csvFile, rulesDir, options, relativeCsvPath, metadata) {
  let account = options.account;
  if (!account) {
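The `buildRetryCommand` helper added in the hunk above assembles a copy-pasteable retry hint from the pipeline options. A standalone restatement of it, showing the string it produces (the example balance `"CHF 2324.79"` mirrors the placeholder hard-coded in the diff):

```javascript
// Restatement of buildRetryCommand from the diff above: rebuild an example
// `import-pipeline` invocation from the options the user already supplied.
function buildRetryCommand(options, closingBalance, account) {
  const parts = ["import-pipeline"];
  if (options.provider) parts.push(`--provider ${options.provider}`);
  if (options.currency) parts.push(`--currency ${options.currency}`);
  if (closingBalance) parts.push(`--closingBalance "${closingBalance}"`);
  if (account) parts.push(`--account "${account}"`);
  return parts.join(" ");
}

console.log(buildRetryCommand({ provider: "ubs", currency: "chf" }, "CHF 2324.79"));
// import-pipeline --provider ubs --currency chf --closingBalance "CHF 2324.79"
```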
@@ -24277,7 +24308,7 @@ function determineAccount(csvFile, rulesDir, options, relativeCsvPath, metadata)
  if (!account) {
  const rulesMapping = loadRulesMapping(rulesDir);
  const rulesFile = findRulesForCsv(csvFile, rulesMapping);
- const rulesHint = rulesFile ? `Add 'account1 assets:bank:...' to ${rulesFile} or use --account parameter` : `Create a rules file in ${rulesDir} with 'account1' directive or use --account parameter`;
+ const rulesHint = rulesFile ? `Add 'account1 assets:bank:...' to ${rulesFile} or retry with: ${buildRetryCommand(options, undefined, "assets:bank:...")}` : `Create a rules file in ${rulesDir} with 'account1' directive or retry with: ${buildRetryCommand(options, undefined, "assets:bank:...")}`;
  return {
  error: buildErrorResult4({
  csvFile: relativeCsvPath,
@@ -24289,6 +24320,70 @@ function determineAccount(csvFile, rulesDir, options, relativeCsvPath, metadata)
  }
  return { account };
  }
+ function tryExtractClosingBalanceFromCSV(csvFile, rulesDir) {
+ try {
+ const rulesMapping = loadRulesMapping(rulesDir);
+ const rulesFile = findRulesForCsv(csvFile, rulesMapping);
+ if (!rulesFile) {
+ return null;
+ }
+ const rulesContent = fs11.readFileSync(rulesFile, "utf-8");
+ const rulesConfig = parseRulesFile(rulesContent);
+ const csvRows = parseCsvFile(csvFile, rulesConfig);
+ if (csvRows.length === 0) {
+ return null;
+ }
+ const balanceFieldNames = [
+ "balance",
+ "Balance",
+ "BALANCE",
+ "closing-balance",
+ "Closing Balance",
+ "account_balance",
+ "Account Balance",
+ "saldo",
+ "Saldo",
+ "SALDO"
+ ];
+ const lastRow = csvRows[csvRows.length - 1];
+ let balanceField;
+ let balanceValue;
+ for (const fieldName of balanceFieldNames) {
+ if (lastRow[fieldName] !== undefined && lastRow[fieldName].trim() !== "") {
+ balanceField = fieldName;
+ balanceValue = lastRow[fieldName];
+ break;
+ }
+ }
+ if (balanceValue && balanceField) {
+ const numericValue = parseAmountValue(balanceValue);
+ let currency = "";
+ const balanceCurrencyMatch = balanceValue.match(/[A-Z]{3}/);
+ if (balanceCurrencyMatch) {
+ currency = balanceCurrencyMatch[0];
+ }
+ if (!currency) {
+ const amountField = rulesConfig.amountFields.single || rulesConfig.amountFields.credit || rulesConfig.amountFields.debit;
+ if (amountField) {
+ const amountStr = lastRow[amountField] || "";
+ const currencyMatch = amountStr.match(/[A-Z]{3}/);
+ if (currencyMatch) {
+ currency = currencyMatch[0];
+ }
+ }
+ }
+ const balanceStr = currency ? `${currency} ${numericValue.toFixed(2)}` : numericValue.toFixed(2);
+ return {
+ balance: balanceStr,
+ confidence: "high",
+ method: `Extracted from ${balanceField} field in last CSV row`
+ };
+ }
+ return null;
+ } catch {
+ return null;
+ }
+ }
  async function reconcileStatement(directory, agent, options, configLoader = loadImportConfig, hledgerExecutor = defaultHledgerExecutor, worktreeChecker = isInWorktree) {
  const restrictionError = checkAccountantAgent(agent, "reconcile statement");
  if (restrictionError) {
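The currency heuristic inside `tryExtractClosingBalanceFromCSV` above takes the first run of three uppercase letters in the balance cell, falling back to the amount cell. A simplified restatement (`detectCurrency` is an illustrative name, not part of the package):

```javascript
// Simplified sketch of the currency heuristic from the hunk above: look for a
// three-uppercase-letter code (e.g. "CHF") in the balance cell first, then in
// the amount cell, else return the empty string.
function detectCurrency(balanceValue, amountValue) {
  const fromBalance = balanceValue.match(/[A-Z]{3}/);
  if (fromBalance) return fromBalance[0];
  const fromAmount = (amountValue || "").match(/[A-Z]{3}/);
  return fromAmount ? fromAmount[0] : "";
}

console.log(detectCurrency("CHF 2324.79", ""));
// CHF
```

Note that the `/[A-Z]{3}/` pattern is a heuristic: it will also match any incidental three-letter uppercase run, which is why the extracted balance is only used when the overall result is tagged high-confidence.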
@@ -24311,11 +24406,11 @@ async function reconcileStatement(directory, agent, options, configLoader = load
  return csvResult.error;
  }
  const { csvFile, relativePath: relativeCsvPath } = csvResult;
- const balanceResult = determineClosingBalance(csvFile, config2, options, relativeCsvPath);
+ const balanceResult = determineClosingBalance(csvFile, config2, options, relativeCsvPath, rulesDir);
  if ("error" in balanceResult) {
  return balanceResult.error;
  }
- const { closingBalance, metadata } = balanceResult;
+ const { closingBalance, metadata, fromCSVAnalysis } = balanceResult;
  const accountResult = determineAccount(csvFile, rulesDir, options, relativeCsvPath, metadata);
  if ("error" in accountResult) {
  return accountResult.error;
@@ -24357,14 +24452,19 @@ async function reconcileStatement(directory, agent, options, configLoader = load
  });
  }
  if (doBalancesMatch) {
- return buildSuccessResult4({
+ const result = {
+ success: true,
  csvFile: relativeCsvPath,
  account,
  lastTransactionDate,
  expectedBalance: closingBalance,
  actualBalance,
  metadata
- });
+ };
+ if (fromCSVAnalysis) {
+ result.note = `Closing balance auto-detected from CSV data (no metadata available). Account: ${account}`;
+ }
+ return JSON.stringify(result);
  }
  let difference;
  try {
@@ -24429,8 +24529,125 @@ It must be run inside an import worktree (use import-pipeline for the full workf
  }
  });
  // src/tools/import-pipeline.ts
- import * as fs12 from "fs";
+ import * as fs13 from "fs";
  import * as path12 from "path";
+
+ // src/utils/accountDeclarations.ts
+ import * as fs12 from "fs";
+ function extractAccountsFromRulesFile(rulesPath) {
+ const accounts = new Set;
+ if (!fs12.existsSync(rulesPath)) {
+ return accounts;
+ }
+ const content = fs12.readFileSync(rulesPath, "utf-8");
+ const lines = content.split(`
+ `);
+ for (const line of lines) {
+ const trimmed = line.trim();
+ if (trimmed.startsWith("#") || trimmed.startsWith(";") || trimmed === "") {
+ continue;
+ }
+ const account1Match = trimmed.match(/^account1\s+(.+?)(?:\s+|$)/);
+ if (account1Match) {
+ accounts.add(account1Match[1].trim());
+ continue;
+ }
+ const account2Match = trimmed.match(/account2\s+(.+?)(?:\s+|$)/);
+ if (account2Match) {
+ accounts.add(account2Match[1].trim());
+ continue;
+ }
+ }
+ return accounts;
+ }
+ function getAllAccountsFromRules(rulesPaths) {
+ const allAccounts = new Set;
+ for (const rulesPath of rulesPaths) {
+ const accounts = extractAccountsFromRulesFile(rulesPath);
+ for (const account of accounts) {
+ allAccounts.add(account);
+ }
+ }
+ return allAccounts;
+ }
+ function sortAccountDeclarations(accounts) {
+ return Array.from(accounts).sort((a, b) => a.localeCompare(b));
+ }
+ function ensureAccountDeclarations(yearJournalPath, accounts) {
+ if (!fs12.existsSync(yearJournalPath)) {
+ throw new Error(`Year journal not found: ${yearJournalPath}`);
+ }
+ const content = fs12.readFileSync(yearJournalPath, "utf-8");
+ const lines = content.split(`
+ `);
+ const existingAccounts = new Set;
+ const commentLines = [];
+ const accountLines = [];
+ const otherLines = [];
+ let inAccountSection = false;
+ let accountSectionEnded = false;
+ for (const line of lines) {
+ const trimmed = line.trim();
+ if (trimmed.startsWith(";") || trimmed.startsWith("#")) {
+ if (!accountSectionEnded) {
+ commentLines.push(line);
+ } else {
+ otherLines.push(line);
+ }
+ continue;
+ }
+ if (trimmed.startsWith("account ")) {
+ inAccountSection = true;
+ const accountMatch = trimmed.match(/^account\s+(.+?)(?:\s+|$)/);
+ if (accountMatch) {
+ const accountName = accountMatch[1].trim();
+ existingAccounts.add(accountName);
+ accountLines.push(line);
+ }
+ continue;
+ }
+ if (trimmed === "") {
+ if (inAccountSection && !accountSectionEnded) {
+ accountLines.push(line);
+ } else {
+ otherLines.push(line);
+ }
+ continue;
+ }
+ if (inAccountSection) {
+ accountSectionEnded = true;
+ }
+ otherLines.push(line);
+ }
+ const missingAccounts = new Set;
+ for (const account of accounts) {
+ if (!existingAccounts.has(account)) {
+ missingAccounts.add(account);
+ }
+ }
+ if (missingAccounts.size === 0) {
+ return { added: [], updated: false };
+ }
+ const allAccounts = new Set([...existingAccounts, ...missingAccounts]);
+ const sortedAccounts = sortAccountDeclarations(allAccounts);
+ const newAccountLines = sortedAccounts.map((account) => `account ${account}`);
+ const newContent = [];
+ newContent.push(...commentLines);
+ if (newAccountLines.length > 0) {
+ newContent.push("");
+ newContent.push(...newAccountLines);
+ newContent.push("");
+ }
+ newContent.push(...otherLines);
+ fs12.writeFileSync(yearJournalPath, newContent.join(`
+ `));
+ return {
+ added: Array.from(missingAccounts).sort(),
+ updated: true
+ };
+ }
+
+ // src/tools/import-pipeline.ts
  class NoTransactionsError extends Error {
  constructor() {
  super("No transactions to import");
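The account extraction added above can be condensed to a single function over an in-memory rules snippet. Note one detail visible in the diff: the `account1` regex is anchored with `^` while the `account2` regex is not; the sketch below anchors both (a deliberate simplification, so it is slightly stricter than the shipped code):

```javascript
// Condensed sketch of extractAccountsFromRulesFile from the hunk above,
// operating on a string instead of a file on disk.
function extractAccounts(rulesContent) {
  const accounts = new Set();
  for (const line of rulesContent.split("\n")) {
    const trimmed = line.trim();
    // Skip comments and blank lines, as the shipped code does.
    if (trimmed.startsWith("#") || trimmed.startsWith(";") || trimmed === "") continue;
    const match = trimmed.match(/^account[12]\s+(.+?)(?:\s+|$)/);
    if (match) accounts.add(match[1].trim());
  }
  return accounts;
}

const accounts = extractAccounts(
  "# UBS CHF account\naccount1 assets:bank:ubs:chf\naccount2 expenses:unknown\n"
);
console.log([...accounts]);
// [ 'assets:bank:ubs:chf', 'expenses:unknown' ]
```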
@@ -24444,7 +24661,7 @@ function buildStepResult(success2, message, details) {
  }
  return result;
  }
- function buildSuccessResult5(result, summary) {
+ function buildSuccessResult4(result, summary) {
  result.success = true;
  result.summary = summary;
  return JSON.stringify(result);
@@ -24470,7 +24687,7 @@ function buildCommitMessage(provider, currency, fromDate, untilDate, transaction
  }
  function cleanupIncomingFiles(worktree, context) {
  const incomingDir = path12.join(worktree.mainRepoPath, "import/incoming");
- if (!fs12.existsSync(incomingDir)) {
+ if (!fs13.existsSync(incomingDir)) {
  return;
  }
  const importStep = context.result.steps.import;
@@ -24487,9 +24704,9 @@ function cleanupIncomingFiles(worktree, context) {
  continue;
  const filename = path12.basename(fileResult.csv);
  const filePath = path12.join(incomingDir, filename);
- if (fs12.existsSync(filePath)) {
+ if (fs13.existsSync(filePath)) {
  try {
- fs12.unlinkSync(filePath);
+ fs13.unlinkSync(filePath);
  deletedCount++;
  } catch (error45) {
  console.error(`[ERROR] Failed to delete $(unknown): ${error45 instanceof Error ? error45.message : String(error45)}`);
@@ -24520,6 +24737,86 @@ async function executeClassifyStep(context, worktree) {
  };
  context.result.steps.classify = buildStepResult(success2, message, details);
  }
+ async function executeAccountDeclarationsStep(context, worktree) {
+ const config2 = context.configLoader(worktree.path);
+ const pendingDir = path12.join(worktree.path, config2.paths.pending);
+ const rulesDir = path12.join(worktree.path, config2.paths.rules);
+ const csvFiles = findCsvFiles(pendingDir, context.options.provider, context.options.currency);
+ if (csvFiles.length === 0) {
+ context.result.steps.accountDeclarations = buildStepResult(true, "No CSV files to process", {
+ accountsAdded: [],
+ journalUpdated: "",
+ rulesScanned: []
+ });
+ return;
+ }
+ const rulesMapping = loadRulesMapping(rulesDir);
+ const matchedRulesFiles = new Set;
+ for (const csvFile of csvFiles) {
+ const rulesFile = findRulesForCsv(csvFile, rulesMapping);
+ if (rulesFile) {
+ matchedRulesFiles.add(rulesFile);
+ }
+ }
+ if (matchedRulesFiles.size === 0) {
+ context.result.steps.accountDeclarations = buildStepResult(true, "No matching rules files found", {
+ accountsAdded: [],
+ journalUpdated: "",
+ rulesScanned: []
+ });
+ return;
+ }
+ const allAccounts = getAllAccountsFromRules(Array.from(matchedRulesFiles));
+ if (allAccounts.size === 0) {
+ context.result.steps.accountDeclarations = buildStepResult(true, "No accounts found in rules files", {
+ accountsAdded: [],
+ journalUpdated: "",
+ rulesScanned: Array.from(matchedRulesFiles).map((f) => path12.relative(worktree.path, f))
+ });
+ return;
+ }
+ let transactionYear;
+ for (const rulesFile of matchedRulesFiles) {
+ try {
+ const result2 = await context.hledgerExecutor(["print", "-f", rulesFile]);
+ if (result2.exitCode === 0) {
+ const years = extractTransactionYears(result2.stdout);
+ if (years.size > 0) {
+ transactionYear = Array.from(years)[0];
+ break;
+ }
+ }
+ } catch {
+ continue;
+ }
+ }
+ if (!transactionYear) {
+ context.result.steps.accountDeclarations = buildStepResult(false, "Could not determine transaction year from CSV files", {
+ accountsAdded: [],
+ journalUpdated: "",
+ rulesScanned: Array.from(matchedRulesFiles).map((f) => path12.relative(worktree.path, f))
+ });
+ return;
+ }
+ let yearJournalPath;
+ try {
+ yearJournalPath = ensureYearJournalExists(worktree.path, transactionYear);
+ } catch (error45) {
+ context.result.steps.accountDeclarations = buildStepResult(false, `Failed to create year journal: ${error45 instanceof Error ? error45.message : String(error45)}`, {
+ accountsAdded: [],
+ journalUpdated: "",
+ rulesScanned: Array.from(matchedRulesFiles).map((f) => path12.relative(worktree.path, f))
+ });
+ return;
+ }
+ const result = ensureAccountDeclarations(yearJournalPath, allAccounts);
+ const message = result.added.length > 0 ? `Added ${result.added.length} account declaration(s) to ${path12.relative(worktree.path, yearJournalPath)}` : "All required accounts already declared";
+ context.result.steps.accountDeclarations = buildStepResult(true, message, {
+ accountsAdded: result.added,
+ journalUpdated: path12.relative(worktree.path, yearJournalPath),
+ rulesScanned: Array.from(matchedRulesFiles).map((f) => path12.relative(worktree.path, f))
+ });
+ }
  async function executeDryRunStep(context, worktree) {
  const inWorktree = () => true;
  const dryRunResult = await importStatements(worktree.path, context.agent, {
@@ -24591,8 +24888,8 @@ async function executeMergeStep(context, worktree) {
  throw new Error("Import or reconcile step not completed before merge");
  }
  const commitInfo = {
- fromDate: reconcileDetails.metadata?.from_date,
- untilDate: reconcileDetails.metadata?.until_date
+ fromDate: reconcileDetails.metadata?.["from-date"],
+ untilDate: reconcileDetails.metadata?.["until-date"]
  };
  const transactionCount = importDetails.summary?.totalTransactions || 0;
  const commitMessage = buildCommitMessage(context.options.provider, context.options.currency, commitInfo.fromDate, commitInfo.untilDate, transactionCount);
@@ -24612,7 +24909,7 @@ function handleNoTransactions(result) {
  result.steps.import = buildStepResult(true, "No transactions to import");
  result.steps.reconcile = buildStepResult(true, "Reconciliation skipped (no transactions)");
  result.steps.merge = buildStepResult(true, "Merge skipped (no changes)");
- return buildSuccessResult5(result, "No transactions found to import");
+ return buildSuccessResult4(result, "No transactions found to import");
  }
  async function importPipeline(directory, agent, options, configLoader = loadImportConfig, hledgerExecutor = defaultHledgerExecutor) {
  const restrictionError = checkAccountantAgent(agent, "import pipeline");
@@ -24656,6 +24953,7 @@ async function importPipeline(directory, agent, options, configLoader = loadImpo
  }
  try {
  await executeClassifyStep(context, worktree);
+ await executeAccountDeclarationsStep(context, worktree);
  await executeDryRunStep(context, worktree);
  await executeImportStep(context, worktree);
  await executeReconcileStep(context, worktree);
@@ -24693,7 +24991,7 @@ async function importPipeline(directory, agent, options, configLoader = loadImpo
  };
  }
  const transactionCount = context.result.steps.import?.details?.summary?.totalTransactions || 0;
- return buildSuccessResult5(result, `Successfully imported ${transactionCount} transaction(s)`);
+ return buildSuccessResult4(result, `Successfully imported ${transactionCount} transaction(s)`);
  } catch (error45) {
  result.steps.cleanup = buildStepResult(true, "Worktree cleaned up after failure (CSV files preserved for retry)", {
  cleanedAfterFailure: true,
@@ -24758,7 +25056,7 @@ This tool orchestrates the full import workflow in an isolated git worktree:
  }
  });
  // src/tools/init-directories.ts
- import * as fs13 from "fs";
+ import * as fs14 from "fs";
  import * as path13 from "path";
  async function initDirectories(directory) {
  try {
@@ -24766,8 +25064,8 @@ async function initDirectories(directory) {
  const directoriesCreated = [];
  const gitkeepFiles = [];
  const importBase = path13.join(directory, "import");
- if (!fs13.existsSync(importBase)) {
- fs13.mkdirSync(importBase, { recursive: true });
+ if (!fs14.existsSync(importBase)) {
+ fs14.mkdirSync(importBase, { recursive: true });
  directoriesCreated.push("import");
  }
  const pathsToCreate = [
@@ -24778,19 +25076,19 @@ async function initDirectories(directory) {
  ];
  for (const { path: dirPath } of pathsToCreate) {
  const fullPath = path13.join(directory, dirPath);
- if (!fs13.existsSync(fullPath)) {
- fs13.mkdirSync(fullPath, { recursive: true });
+ if (!fs14.existsSync(fullPath)) {
+ fs14.mkdirSync(fullPath, { recursive: true });
  directoriesCreated.push(dirPath);
  }
  const gitkeepPath = path13.join(fullPath, ".gitkeep");
- if (!fs13.existsSync(gitkeepPath)) {
- fs13.writeFileSync(gitkeepPath, "");
+ if (!fs14.existsSync(gitkeepPath)) {
+ fs14.writeFileSync(gitkeepPath, "");
  gitkeepFiles.push(path13.join(dirPath, ".gitkeep"));
  }
  }
  const gitignorePath = path13.join(importBase, ".gitignore");
  let gitignoreCreated = false;
- if (!fs13.existsSync(gitignorePath)) {
+ if (!fs14.existsSync(gitignorePath)) {
  const gitignoreContent = `# Ignore CSV/PDF files in temporary directories
  /incoming/*.csv
  /incoming/*.pdf
@@ -24808,7 +25106,7 @@ async function initDirectories(directory) {
  .DS_Store
  Thumbs.db
  `;
- fs13.writeFileSync(gitignorePath, gitignoreContent);
+ fs14.writeFileSync(gitignorePath, gitignoreContent);
  gitignoreCreated = true;
  }
  const parts = [];
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@fuzzle/opencode-accountant",
- "version": "0.3.0",
+ "version": "0.4.0-next.1",
  "description": "An OpenCode accounting agent, specialized in double-entry-bookkepping with hledger",
  "author": {
  "name": "ali bengali",