satif-ai 0.2.11__tar.gz → 0.2.12__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,175 @@
+ Metadata-Version: 2.3
+ Name: satif-ai
+ Version: 0.2.12
+ Summary: AI Agents for Satif
+ License: MIT
+ Author: Syncpulse
+ Maintainer: Bryan Djafer
+ Maintainer-email: bryan.djafer@syncpulse.fr
+ Requires-Python: >=3.10,<3.14
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Provides-Extra: xlsx
+ Requires-Dist: openai-agents (>=0.0.9,<0.0.10)
+ Requires-Dist: satif-sdk (>=0.1.0,<1.0.0)
+ Requires-Dist: sdif-mcp (>=0.1.0,<1.0.0)
+ Description-Content-Type: text/markdown
+
+ # SATIF AI
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
+ [![Python Version](https://img.shields.io/badge/python-3.10%2B-blue.svg)](https://www.python.org/downloads/)
+ [![Status: Experimental](https://img.shields.io/badge/Status-Experimental-orange.svg)](https://github.com/syncpulse-solutions/satif)
+
+ AI toolkit for transforming any input files into any output files.
+
+ ## ⚠️ Disclaimer
+
+ **EXPERIMENTAL STATUS**: This package is in early development and not production-ready. The API may change significantly between versions.
+
+ **BLOCKING I/O**: Despite the async API, some operations may contain blocking I/O. This package should be used for testing and experimental purposes only.
+
+ ## Installation
+
+ ```bash
+ pip install satif-ai
+ ```
+
+ ## Overview
+
+ SATIF AI enables automated transformation of heterogeneous data sources (CSV, Excel, PDF, XML, etc.) into any desired output format in two steps:
+
+ 1. **Standardization**: Ingests heterogeneous source files (CSV, Excel, PDF, XML, etc.) and transforms them into SDIF, a structured intermediate format.
+ 2. **Transformation**: Applies business logic to the standardized data to generate the target output files, with transformation code generated by AI.
+
+ ## Key Features
+
+ - **Any Format Support**: Process virtually any input, even challenging unstructured content (PDFs, complex Excel sheets)
+ - **AI-Powered Code Generation**: Automatically generate transformation code from examples and natural language instructions
+ - **Robust Schema Enforcement**: Handle input data drift and schema inconsistencies through configurable validation
+ - **SQL-Based Data Processing**: Query and manipulate all data using SQL
+ - **Decoupled Processing Stages**: Standardize once, transform many times with different logic
+
+ ## Usage
+
+ ### Basic Workflow
+
+ ```python
+ import asyncio
+ from satif_ai import astandardize, atransform
+
+ async def main():
+     # Step 1: Standardize input files into SDIF
+     sdif_path = await astandardize(
+         datasource=["data.csv", "reference.xlsx"],
+         output_path="standardized.sdif",
+         overwrite=True
+     )
+
+     # Step 2: Transform SDIF into desired output using AI
+     await atransform(
+         sdif=sdif_path,
+         output_target_files="output.json",
+         instructions="Extract customer IDs and purchase totals, calculate the average purchase value per customer, and output as JSON with customer_id and avg_purchase_value fields.",
+         llm_model="o4-mini"  # Choose AI model based on needs
+     )
+
+ if __name__ == "__main__":
+     asyncio.run(main())
+ ```
+
+ ## Architecture
+
+ ```
+ ┌─────────────────┐     ┌───────────────────────┐     ┌─────────────────┐
+ │  Source Files   │────▶│ Standardization Layer │────▶│    SDIF File    │
+ │ CSV/Excel/PDF/  │     │                       │     │ (SQLite-based)  │
+ │ XML/JSON/etc.   │     └───────────────────────┘     └────────┬────────┘
+ └─────────────────┘                                            │
+                                                                │
+ ┌─────────────────┐     ┌───────────────────────┐              │
+ │  Output Files   │◀────│ Transformation Layer  │◀─────────────┘
+ │   Any format    │     │ (AI-generated code)   │
+ └─────────────────┘     └───────────────────────┘
+ ```
+
+ SDIF (Standardized Data Interoperable Format) is the intermediate SQLite-based format that:
+
+ - Stores structured tables alongside JSON objects and binary media
+ - Maintains rich metadata about data origins and relationships
+ - Provides direct SQL queryability for complex transformations
+
+ ## Documentation
+
+ For detailed documentation, examples, and advanced features, visit the [SATIF Documentation](https://satif.io/docs).
+
+ ## Contributing
+
+ Contributions are welcome! Whether it's bug reports, feature requests, or code contributions, please feel free to get involved.
+
+ ### Contribution Workflow
+
+ 1. **Fork the repository** on GitHub.
+ 2. **Clone your fork** locally:
+
+    ```bash
+    git clone https://github.com/syncpulse-solutions/satif.git
+    cd satif/libs/ai
+    ```
+ 3. **Create a new branch** for your feature or bug fix:
+
+    ```bash
+    git checkout -b feature/your-feature-name
+    ```
+
+    or
+
+    ```bash
+    git checkout -b fix/your-bug-fix-name
+    ```
+ 4. **Set up the development environment**:
+
+    ```bash
+    make install  # or: poetry install
+    ```
+ 5. **Make your changes.** Ensure your code follows the project's style guidelines.
+ 6. **Format and lint your code:**
+
+    ```bash
+    make format
+    make lint
+    ```
+ 7. **Run type checks:**
+
+    ```bash
+    make typecheck
+    ```
+ 8. **Run tests** to ensure your changes don't break existing functionality:
+
+    ```bash
+    make test
+    ```
+
+    To also generate a coverage report:
+
+    ```bash
+    make coverage
+    ```
+ 9. **Commit your changes** with a clear and descriptive commit message.
+ 10. **Push your changes** to your fork on GitHub:
+
+     ```bash
+     git push origin feature/your-feature-name
+     ```
+ 11. **Submit a Pull Request (PR)** to the `main` branch of the original `syncpulse-solutions/satif` repository.
+
+ ## License
+
+ This project is licensed under the MIT License.
+
+ Maintainer: Bryan Djafer (bryan.djafer@syncpulse.fr)
+
@@ -0,0 +1,153 @@
+ # SATIF AI
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
+ [![Python Version](https://img.shields.io/badge/python-3.10%2B-blue.svg)](https://www.python.org/downloads/)
+ [![Status: Experimental](https://img.shields.io/badge/Status-Experimental-orange.svg)](https://github.com/syncpulse-solutions/satif)
+
+ AI toolkit for transforming any input files into any output files.
+
+ ## ⚠️ Disclaimer
+
+ **EXPERIMENTAL STATUS**: This package is in early development and not production-ready. The API may change significantly between versions.
+
+ **BLOCKING I/O**: Despite the async API, some operations may contain blocking I/O. This package should be used for testing and experimental purposes only.
+
+ ## Installation
+
+ ```bash
+ pip install satif-ai
+ ```
+
+ ## Overview
+
+ SATIF AI enables automated transformation of heterogeneous data sources (CSV, Excel, PDF, XML, etc.) into any desired output format in two steps:
+
+ 1. **Standardization**: Ingests heterogeneous source files (CSV, Excel, PDF, XML, etc.) and transforms them into SDIF, a structured intermediate format.
+ 2. **Transformation**: Applies business logic to the standardized data to generate the target output files, with transformation code generated by AI.
+
+ ## Key Features
+
+ - **Any Format Support**: Process virtually any input, even challenging unstructured content (PDFs, complex Excel sheets)
+ - **AI-Powered Code Generation**: Automatically generate transformation code from examples and natural language instructions
+ - **Robust Schema Enforcement**: Handle input data drift and schema inconsistencies through configurable validation
+ - **SQL-Based Data Processing**: Query and manipulate all data using SQL
+ - **Decoupled Processing Stages**: Standardize once, transform many times with different logic
+
+ ## Usage
+
+ ### Basic Workflow
+
+ ```python
+ import asyncio
+ from satif_ai import astandardize, atransform
+
+ async def main():
+     # Step 1: Standardize input files into SDIF
+     sdif_path = await astandardize(
+         datasource=["data.csv", "reference.xlsx"],
+         output_path="standardized.sdif",
+         overwrite=True
+     )
+
+     # Step 2: Transform SDIF into desired output using AI
+     await atransform(
+         sdif=sdif_path,
+         output_target_files="output.json",
+         instructions="Extract customer IDs and purchase totals, calculate the average purchase value per customer, and output as JSON with customer_id and avg_purchase_value fields.",
+         llm_model="o4-mini"  # Choose AI model based on needs
+     )
+
+ if __name__ == "__main__":
+     asyncio.run(main())
+ ```
+
+ ## Architecture
+
+ ```
+ ┌─────────────────┐     ┌───────────────────────┐     ┌─────────────────┐
+ │  Source Files   │────▶│ Standardization Layer │────▶│    SDIF File    │
+ │ CSV/Excel/PDF/  │     │                       │     │ (SQLite-based)  │
+ │ XML/JSON/etc.   │     └───────────────────────┘     └────────┬────────┘
+ └─────────────────┘                                            │
+                                                                │
+ ┌─────────────────┐     ┌───────────────────────┐              │
+ │  Output Files   │◀────│ Transformation Layer  │◀─────────────┘
+ │   Any format    │     │ (AI-generated code)   │
+ └─────────────────┘     └───────────────────────┘
+ ```
+
+ SDIF (Standardized Data Interoperable Format) is the intermediate SQLite-based format that:
+
+ - Stores structured tables alongside JSON objects and binary media
+ - Maintains rich metadata about data origins and relationships
+ - Provides direct SQL queryability for complex transformations
+
+ ## Documentation
+
+ For detailed documentation, examples, and advanced features, visit the [SATIF Documentation](https://satif.io/docs).
+
+ ## Contributing
+
+ Contributions are welcome! Whether it's bug reports, feature requests, or code contributions, please feel free to get involved.
+
+ ### Contribution Workflow
+
+ 1. **Fork the repository** on GitHub.
+ 2. **Clone your fork** locally:
+
+    ```bash
+    git clone https://github.com/syncpulse-solutions/satif.git
+    cd satif/libs/ai
+    ```
+ 3. **Create a new branch** for your feature or bug fix:
+
+    ```bash
+    git checkout -b feature/your-feature-name
+    ```
+
+    or
+
+    ```bash
+    git checkout -b fix/your-bug-fix-name
+    ```
+ 4. **Set up the development environment**:
+
+    ```bash
+    make install  # or: poetry install
+    ```
+ 5. **Make your changes.** Ensure your code follows the project's style guidelines.
+ 6. **Format and lint your code:**
+
+    ```bash
+    make format
+    make lint
+    ```
+ 7. **Run type checks:**
+
+    ```bash
+    make typecheck
+    ```
+ 8. **Run tests** to ensure your changes don't break existing functionality:
+
+    ```bash
+    make test
+    ```
+
+    To also generate a coverage report:
+
+    ```bash
+    make coverage
+    ```
+ 9. **Commit your changes** with a clear and descriptive commit message.
+ 10. **Push your changes** to your fork on GitHub:
+
+     ```bash
+     git push origin feature/your-feature-name
+     ```
+ 11. **Submit a Pull Request (PR)** to the `main` branch of the original `syncpulse-solutions/satif` repository.
+
+ ## License
+
+ This project is licensed under the MIT License.
+
+ Maintainer: Bryan Djafer (bryan.djafer@syncpulse.fr)
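The README above stresses that SDIF is a plain SQLite database with "direct SQL queryability". That means any SQLite client can open a freshly produced `.sdif` file; as a hedged illustration (the helper name is ours, and the table names inside a real SDIF file depend on what standardization produced), a minimal inspection sketch might look like:

```python
import sqlite3


def list_sdif_tables(sdif_path: str) -> list[str]:
    """Return the names of all tables stored in an SDIF (SQLite) file."""
    with sqlite3.connect(sdif_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]
```

Because `sqlite_master` exists in every SQLite database, this works as a quick sanity check on a `standardized.sdif` without assuming anything about its schema.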
@@ -1,6 +1,6 @@
  [project]
  name = "satif-ai"
- version = "0.2.11"
+ version = "0.2.12"
  description = "AI Agents for Satif"
  authors = [
      {name = "Syncpulse"}
@@ -63,6 +63,38 @@ class AIStandardizer(AsyncStandardizer):
      ) -> Optional[Type[AsyncStandardizer]]:
          return self.ai_standardizer_map.get(extension.lower())

+     def _resolve_file_path(
+         self, raw_path_item: Union[str, Path], temp_processing_dir: Path
+     ) -> List[Path]:
+         """
+         Resolves a single input path to a list of file paths.
+         This method contains blocking file system operations.
+         """
+         raw_path = Path(raw_path_item).resolve()
+         input_file_paths: List[Path] = []
+
+         if not raw_path.exists():
+             raise FileNotFoundError(f"Input path not found: {raw_path}")
+
+         if raw_path.is_file():
+             if raw_path.suffix.lower() == ".zip":
+                 # Zip extraction is handled asynchronously in the calling method
+                 return [raw_path]
+             else:
+                 input_file_paths.append(raw_path)
+         elif raw_path.is_dir():
+             logger.info(f"Processing directory datasource: {raw_path}")
+             for child_item in raw_path.iterdir():
+                 if child_item.is_file():
+                     input_file_paths.append(child_item)
+             # Deeper recursion to be implemented.
+         else:
+             logger.warning(
+                 f"Input path '{raw_path}' is not a file or directory and will be ignored."
+             )
+
+         return input_file_paths
+
      async def _resolve_input_files(
          self, datasource: Datasource, temp_processing_dir: Path
      ) -> List[Path]:
@@ -70,8 +102,8 @@ class AIStandardizer(AsyncStandardizer):
          Resolves the input datasource to a list of individual file paths.
          Handles single files, lists of files, and extracts ZIP archives.
          """
-         input_file_paths: List[Path] = []
          raw_paths_to_check: List[Union[str, Path]] = []
+         all_input_file_paths: List[Path] = []

          if isinstance(datasource, (str, Path)):
              raw_paths_to_check = [datasource]
@@ -88,12 +120,13 @@ class AIStandardizer(AsyncStandardizer):
          if not raw_paths_to_check:  # Should be caught by above, but defensive
              raise ValueError("No input datasource paths provided.")

+         # Process each path item in a thread to avoid blocking the event loop
          for raw_path_item in raw_paths_to_check:
-             raw_path = Path(raw_path_item).resolve()
-             if not raw_path.exists():
-                 raise FileNotFoundError(f"Input path not found: {raw_path}")
+             resolved_paths = await asyncio.to_thread(
+                 self._resolve_file_path, raw_path_item, temp_processing_dir
+             )

-             if raw_path.is_file():
+             for raw_path in resolved_paths:
                  if raw_path.suffix.lower() == ".zip":
                      zip_extract_target = (
                          temp_processing_dir
@@ -103,7 +136,7 @@ class AIStandardizer(AsyncStandardizer):
                          extracted_from_zip = await extract_zip_archive_async(
                              raw_path, zip_extract_target
                          )
-                         input_file_paths.extend(extracted_from_zip)
+                         all_input_file_paths.extend(extracted_from_zip)
                      except Exception as e_zip:
                          logger.error(
                              f"Failed to extract ZIP archive '{raw_path}': {e_zip}",
@@ -113,23 +146,14 @@ class AIStandardizer(AsyncStandardizer):
                          # For now, skipping problematic zips.
                          continue
                  else:
-                     input_file_paths.append(raw_path)
-         elif raw_path.is_dir():
-             logger.info(f"Processing directory datasource: {raw_path}")
-             for child_item in raw_path.iterdir():
-                 if child_item.is_file():
-                     input_file_paths.append(child_item)
-             # Deeper recursion to be implemeted.
-         else:
-             logger.warning(
-                 f"Input path '{raw_path}' is not a file or directory and will be ignored."
-             )
+                     all_input_file_paths.append(raw_path)

-         if not input_file_paths:
+         if not all_input_file_paths:
              # This means all inputs were invalid, unresolvable, or zips failed etc.
              logger.error("No processable files found after resolving datasource.")
              raise ValueError("Datasource resolution resulted in no processable files.")
-         return input_file_paths
+
+         return all_input_file_paths

      def _group_files_by_standardizer(
          self, file_paths: List[Path]
@@ -269,7 +293,7 @@ class AIStandardizer(AsyncStandardizer):

          return successful_intermediate_sdif_files, aggregated_file_configs

-     async def _consolidate_results(
+     def _consolidate_results(
          self,
          intermediate_sdif_files: List[Path],
          aggregated_file_configs: Optional[List[Dict[str, Any]]],
@@ -362,6 +386,59 @@ class AIStandardizer(AsyncStandardizer):
              file_configs=aggregated_file_configs if aggregated_file_configs else None,
          )

+     async def _setup_workspace(
+         self, output_path: Path, overwrite: bool
+     ) -> Tuple[Path, Path, Path]:
+         """
+         Sets up the temporary workspace directories and validates the output path.
+         Contains blocking file system operations.
+         """
+         final_sdif_file_target = output_path.resolve()
+
+         if final_sdif_file_target.is_dir():
+             raise ValueError(
+                 f"Target output_path '{final_sdif_file_target}' is a directory. "
+                 "It must be a full file path for the target SDIF SQLite database (e.g., data.sqlite or data.sdif)."
+             )
+         if not final_sdif_file_target.suffix:
+             logger.warning(
+                 f"Target output_path '{final_sdif_file_target}' has no file extension. "
+                 "It should be a path to an SDIF SQLite database file (e.g., data.sqlite or data.sdif)."
+             )
+         elif final_sdif_file_target.suffix.lower() not in (".sdif", ".sqlite", ".db"):
+             logger.warning(
+                 f"Target output_path '{final_sdif_file_target}' does not have a common SQLite extension. "
+                 "Ensure this is the intended SQLite file path."
+             )
+
+         # Create a unique temporary directory for this standardization run
+         run_temp_dir = Path(tempfile.mkdtemp(prefix="satif_aistd_run_"))
+         intermediate_sdif_files_dir = run_temp_dir / "intermediate_sdif_files"
+         intermediate_sdif_files_dir.mkdir(parents=True, exist_ok=True)
+         file_processing_temp_dir = run_temp_dir / "file_processing_temp"
+         file_processing_temp_dir.mkdir(parents=True, exist_ok=True)
+
+         return (
+             final_sdif_file_target,
+             intermediate_sdif_files_dir,
+             file_processing_temp_dir,
+         )
+
+     async def _cleanup_workspace(self, run_temp_dir: Path) -> None:
+         """
+         Cleans up the temporary workspace directory.
+         Contains blocking file system operations.
+         """
+         if run_temp_dir.exists():
+             try:
+                 await asyncio.to_thread(shutil.rmtree, run_temp_dir)
+                 logger.info(f"Cleaned up temporary run directory: {run_temp_dir}")
+             except Exception as e_clean:
+                 logger.error(
+                     f"Error cleaning up temporary run directory {run_temp_dir}: {e_clean}",
+                     exc_info=True,
+                 )

      async def standardize(
          self,
          datasource: Datasource,
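The hunk above moves workspace setup and teardown into dedicated helpers. One caveat worth flagging: the diff declares `_setup_workspace` with `async def` yet (later in this diff) dispatches it via `asyncio.to_thread(self._setup_workspace, ...)`; `to_thread` would then merely call the coroutine function in a worker thread and hand back an unawaited coroutine object. A standalone sketch of the same workspace lifecycle, using plain functions so it composes correctly with `to_thread` (helper names are ours; the subdirectory names are taken from the diff):

```python
import shutil
import tempfile
from pathlib import Path


def setup_workspace(prefix: str = "satif_aistd_run_") -> Path:
    """Create a unique per-run temp directory holding the two
    subdirectories the standardizer expects."""
    run_temp_dir = Path(tempfile.mkdtemp(prefix=prefix))
    (run_temp_dir / "intermediate_sdif_files").mkdir(parents=True, exist_ok=True)
    (run_temp_dir / "file_processing_temp").mkdir(parents=True, exist_ok=True)
    return run_temp_dir


def cleanup_workspace(run_temp_dir: Path) -> None:
    """Remove the whole run directory; safe to call more than once."""
    if run_temp_dir.exists():
        shutil.rmtree(run_temp_dir)
```

Pairing `setup_workspace()` with `cleanup_workspace()` in a `try`/`finally` mirrors the structure `standardize` ends up with after this release.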
@@ -387,31 +464,15 @@ class AIStandardizer(AsyncStandardizer):
          logger.info(
              f"AIStandardizer starting process for output SDIF file: {output_path}"
          )
-         final_sdif_file_target = Path(output_path).resolve()

-         if final_sdif_file_target.is_dir():
-             raise ValueError(
-                 f"Target output_path '{final_sdif_file_target}' is a directory. "
-                 "It must be a full file path for the target SDIF SQLite database (e.g., data.sqlite or data.sdif)."
-             )
-         if not final_sdif_file_target.suffix:
-             logger.warning(
-                 f"Target output_path '{final_sdif_file_target}' has no file extension. "
-                 "It should be a path to an SDIF SQLite database file (e.g., data.sqlite or data.sdif)."
-             )
-         elif final_sdif_file_target.suffix.lower() not in (".sdif", ".sqlite", ".db"):
-             logger.warning(
-                 f"Target output_path '{final_sdif_file_target}' does not have a common SQLite extension. "
-                 "Ensure this is the intended SQLite file path."
-             )
+         # Setup workspace and validate output path - moved to a separate async function
+         (
+             final_sdif_file_target,
+             intermediate_sdif_files_dir,
+             file_processing_temp_dir,
+         ) = await asyncio.to_thread(self._setup_workspace, Path(output_path), overwrite)

-         # Create a unique temporary directory for this standardization run
-         # This directory will hold intermediate files and ZIP extractions.
-         run_temp_dir = Path(tempfile.mkdtemp(prefix="satif_aistd_run_"))
-         intermediate_sdif_files_dir = run_temp_dir / "intermediate_sdif_files"
-         intermediate_sdif_files_dir.mkdir(parents=True, exist_ok=True)
-         file_processing_temp_dir = run_temp_dir / "file_processing_temp"
-         file_processing_temp_dir.mkdir(parents=True, exist_ok=True)
+         run_temp_dir = file_processing_temp_dir.parent

          try:
              resolved_files = await self._resolve_input_files(
@@ -419,9 +480,11 @@ class AIStandardizer(AsyncStandardizer):
              )
              logger.info(f"Resolved {len(resolved_files)} file(s) for standardization.")

-             grouped_by_std, unsupported = self._group_files_by_standardizer(
-                 resolved_files
+             # File grouping - potentially move to a thread if the list is very large
+             grouped_by_std, unsupported = await asyncio.to_thread(
+                 self._group_files_by_standardizer, resolved_files
              )
+
              if not grouped_by_std:
                  user_message = (
                      "No files found that can be handled by configured AI standardizers."
@@ -451,7 +514,8 @@ class AIStandardizer(AsyncStandardizer):
                  f"Successfully generated {len(intermediate_sdif_files)} intermediate SDIF SQLite file(s)."
              )

-             final_result = await self._consolidate_results(
+             final_result = await asyncio.to_thread(
+                 self._consolidate_results,
                  intermediate_sdif_files,
                  aggregated_file_configs,
                  final_sdif_file_target,
@@ -469,13 +533,5 @@ class AIStandardizer(AsyncStandardizer):
                  raise
              raise RuntimeError(f"AIStandardizer processing error: {e}") from e
          finally:
-             # Clean up the entire temporary directory for this run
-             if run_temp_dir.exists():
-                 try:
-                     shutil.rmtree(run_temp_dir)
-                     logger.info(f"Cleaned up temporary run directory: {run_temp_dir}")
-                 except Exception as e_clean:
-                     logger.error(
-                         f"Error cleaning up temporary run directory {run_temp_dir}: {e_clean}",
-                         exc_info=True,
-                     )
+             # Clean up using a dedicated async method
+             await self._cleanup_workspace(run_temp_dir)
@@ -90,7 +90,6 @@ async def atransform(
      # If code isn't provided, we need a builder. If a builder isn't provided, we create one.
      if current_transformation_code is None:
          if active_builder is None:
-             # Create SyncpulseTransformationBuilder
              _effective_mcp_server = mcp_server if mcp_server is not None else mcp

              _openai_mcp_instance = OpenAICompatibleMCP(mcp=_effective_mcp_server)
@@ -137,7 +136,6 @@ async def atransform(
      if current_transformation_code is None:
          raise ValueError("Transformation code could not be obtained or generated.")

-     # Code Executor and Transformation
      _code_executor = code_executor if code_executor is not None else LocalCodeExecutor()

      transformer = CodeTransformer(
@@ -243,7 +243,6 @@ class SyncpulseTransformationBuilder(AsyncTransformationBuilder):
          if resolved_output_target_files:
              for file_key_abs_path in list(resolved_output_target_files.keys()):
                  agent_facing_name = resolved_output_target_files[file_key_abs_path]
-                 print(f"Representing {agent_facing_name} from {file_key_abs_path}")
                  try:
                      # Representer uses the absolute path (file_key_abs_path) to read the example file.
                      representer = get_representer(file_key_abs_path)
satif_ai-0.2.11/PKG-INFO DELETED
@@ -1,23 +0,0 @@
- Metadata-Version: 2.3
- Name: satif-ai
- Version: 0.2.11
- Summary: AI Agents for Satif
- License: MIT
- Author: Syncpulse
- Maintainer: Bryan Djafer
- Maintainer-email: bryan.djafer@syncpulse.fr
- Requires-Python: >=3.10,<3.14
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Programming Language :: Python :: 3
- Classifier: Programming Language :: Python :: 3.10
- Classifier: Programming Language :: Python :: 3.11
- Classifier: Programming Language :: Python :: 3.12
- Classifier: Programming Language :: Python :: 3.13
- Provides-Extra: xlsx
- Requires-Dist: openai-agents (>=0.0.9,<0.0.10)
- Requires-Dist: satif-sdk (>=0.1.0,<1.0.0)
- Requires-Dist: sdif-mcp (>=0.1.0,<1.0.0)
- Description-Content-Type: text/markdown
-
- # SATIF AI
-
satif_ai-0.2.11/README.md DELETED
@@ -1 +0,0 @@
- # SATIF AI