Description
I'm using this to sync files from my local server to S3-compatible storage, but I keep getting crashes without any errors.
I'm running it on Node 21 in a Docker container: just a tiny Node script that runs the sync every hour from a local folder to the cloud storage. The entire script is below.
I set a size limit on my bucket that was below its current usage, and when I then tried to upload I did get that error logged saying there was not enough storage. But most of the time it's crashing every 15 minutes or so without any output. It was crashing every 5 minutes before I added the 100 MB partSize line, so that has helped, but it is still crashing very regularly.
Is there any way I can enable more logging so that when it crashes I at least get error logs to see what's happening? I have transferred just over 100 GB to the bucket, but only 70 GB of files have been saved, so the regular crashing is leaving a lot of multipart uploads half-done.
import { S3Client } from '@aws-sdk/client-s3';
import { S3SyncClient, TransferMonitor } from 's3-sync-client';
import cron from 'node-cron';

const s3Client = new S3Client({
  endpoint: `xxx`,
  region: "xxx",
  credentials: {
    accessKeyId: "xxx",
    secretAccessKey: "xxx",
  },
});

const { sync } = new S3SyncClient({ client: s3Client });
const monitor = new TransferMonitor();
let running = false;

// Run once at startup, then every hour on the hour.
backup();
cron.schedule('0 * * * *', () => {
  backup();
});

async function backup() {
  if (running) return;
  running = true;
  // Log transfer progress every 2 seconds while the sync runs.
  const timeout = setInterval(() => console.log(monitor.getStatus()), 2000);
  await sync("/backup", "s3://xxx", {
    monitor,
    del: true,
    partSize: 100 * 1024 * 1024,
  });
  console.log(monitor.getStatus());
  clearInterval(timeout);
  running = false;
}