sftp-push-sync 1.0.0
- package/.vscode/settings.json +3 -0
- package/LICENSE +674 -0
- package/README.md +104 -0
- package/bin/sync-sftp.mjs +651 -0
- package/package.json +23 -0
package/README.md
ADDED
# SFTP Synchronisation Tool

Implements a push synchronisation with a dry-run mode. It performs the following tasks:

1. Upload new files
2. Delete remote files that no longer exist locally
3. Identify files whose size or content has changed and upload them

Features:

- multiple connections in `sync.config.json`
- dry-run mode
- mirrors local → remote
- adds, updates, deletes files
- text diff detection
- binary files (images, video, audio, PDF, etc.): SHA-256 hash comparison
- hashes are cached in `.sync-cache.json` to speed up subsequent runs
- parallel uploads/deletions via a worker pool
- include/exclude patterns

The file `shell-scripts/sync-sftp.mjs` is pure JavaScript (ESM), not TypeScript. Node.js can execute it directly as long as `"type": "module"` is set in `package.json` or the file uses the `.mjs` extension.

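The SHA-256 comparison for binary files boils down to hashing the local bytes and comparing the digest against a cached value. A minimal sketch of the idea (the helper names `hashFile` and `hasChanged` and the flat cache shape are illustrative, not the script's actual API):

```javascript
// Minimal sketch of SHA-256-based change detection for binary files.
// hashFile/hasChanged and the cache shape are illustrative, not the script's API.
import { createHash } from 'node:crypto';
import { readFile } from 'node:fs/promises';

// Hash a file's contents; identical bytes always yield the identical hex digest.
async function hashFile(path) {
  const data = await readFile(path);
  return createHash('sha256').update(data).digest('hex');
}

// Compare against a previously cached digest (e.g. loaded from .sync-cache.json).
function hasChanged(cache, relativePath, currentHash) {
  return cache[relativePath] !== currentHash;
}
```

Because only the digests are compared, an unchanged file never needs to be re-uploaded, no matter how large it is.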
## Config file

Create a `sync.config.json` in the root folder of your project:

```json
{
  "connections": {
    "prod": {
      "host": "your.host.net",
      "port": 23,
      "user": "ftpuser",
      "password": "mypassword",
      "remoteRoot": "/folder/",
      "localRoot": "public",
      "syncCache": ".sync-cache.prod.json",
      "worker": 3
    },
    "staging": {
      "host": "ftpserver02",
      "port": 22,
      "user": "ftp_user",
      "password": "total_secret",
      "remoteRoot": "/web/my-page/",
      "localRoot": "public",
      "syncCache": ".sync-cache.staging.json",
      "worker": 1
    }
  },
  "include": [],
  "exclude": [
    "**/.DS_Store",
    "**/.git/**",
    "**/node_modules/**"
  ],
  "textExtensions": [
    ".html", ".xml", ".txt", ".json", ".js", ".css", ".md", ".svg"
  ]
}
```
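A script consuming this config would pick one of the named connections and merge in the global include/exclude patterns. A sketch of that, assuming the config layout above (`loadConnection` is a hypothetical helper, not part of the package):

```javascript
// Illustrative sketch: load sync.config.json and select one connection by name.
// loadConnection is a hypothetical helper, not the package's actual API.
import { readFile } from 'node:fs/promises';

async function loadConnection(configPath, name) {
  const config = JSON.parse(await readFile(configPath, 'utf8'));
  const conn = config.connections?.[name];
  if (!conn) {
    const known = Object.keys(config.connections ?? {}).join(', ');
    throw new Error(`Unknown connection "${name}". Available: ${known}`);
  }
  // The top-level include/exclude patterns apply to every connection.
  return { ...conn, include: config.include ?? [], exclude: config.exclude ?? [] };
}
```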

The sync can be conveniently started via the scripts in `package.json`:

```bash
# For example
npm run sync:staging
# or the short alias
npm run ss
```

This assumes you have defined the scripts in `package.json` as follows:

```json
"scripts": {
  "sync:staging": "node ./shell-scripts/sync-sftp.mjs staging",
  "sync:staging:dry": "node ./shell-scripts/sync-sftp.mjs staging --dry-run",
  "ss": "npm run sync:staging",
  "ssd": "npm run sync:staging:dry",

  "sync:prod": "node ./shell-scripts/sync-sftp.mjs prod",
  "sync:prod:dry": "node ./shell-scripts/sync-sftp.mjs prod --dry-run",
  "sp": "npm run sync:prod",
  "spd": "npm run sync:prod:dry"
},
```
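As the scripts show, the tool takes the connection name and an optional `--dry-run` flag on the command line. Parsing that could look like the following sketch (`parseCliArgs` is hypothetical; the real script may differ):

```javascript
// Sketch of parsing "node sync-sftp.mjs <connection> [--dry-run]".
// parseCliArgs is hypothetical, not the script's actual code.
function parseCliArgs(argv) {
  const args = argv.filter((a) => a !== '--dry-run');
  const dryRun = args.length !== argv.length; // flag was present if one was filtered out
  if (args.length !== 1) {
    throw new Error('Usage: node sync-sftp.mjs <connection> [--dry-run]');
  }
  return { connection: args[0], dryRun };
}
```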

A dry run is a great way to compare files and fill the cache without changing anything on the remote server.

## Which files are needed?

- `shell-scripts/sync-sftp.mjs` - the upload script (for details, see the script itself)
- `sync.config.json` - the configuration file (it stores passwords in plain text, so keep it out of your git repository)

## Which files are created?

- The cache files: `.sync-cache.*.json`

You can safely delete the local cache at any time. The first analysis afterwards will take longer again (because remote hashes are streamed again); after that, everything runs extremely fast again.

## Special features

The first run always takes a while, especially with lots of images, so be patient! Once the cache is filled, subsequent runs are much faster.
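
The speed after the first run also comes from the worker pool: with `"worker": 3`, up to three uploads or deletions are in flight at once. The general technique looks like this (an illustrative promise pool with a hypothetical `runPool`, not the script's actual implementation):

```javascript
// Generic promise pool: run async tasks with at most `concurrency` in flight.
// runPool is illustrative; the script's own pool may differ.
async function runPool(tasks, concurrency) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim the next task index (safe: JS is single-threaded)
      results[i] = await tasks[i]();
    }
  }
  // Start up to `concurrency` workers and wait until every task is done.
  await Promise.all(Array.from({ length: Math.min(concurrency, tasks.length) }, worker));
  return results;
}
```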
|