Problem with get /big option

Jun 8, 2010 at 7:23 PM
Edited Jun 8, 2010 at 7:25 PM
I am deploying s3.exe to a server to use for some simple DB backups. The "put" command with /big works just fine; I see the .000/.001/etc. files in the S3 bucket. When I list the bucket contents, it verifies the files are there:

Y:\S3\S3 Tasks>s3.exe list "buddyplatformbackup/db backup/project noah/ProjectNoah*"
6/7/2010 3:20:15 PM 500.0M db backup/project noah/ProjectNoah(buddy)Backup.000

However, when I try to get the file (with /big to recombine), it says it can't find the file that the list command displays... am I missing some magical syntax?

Y:\S3\S3 Tasks>s3.exe get "buddyplatformbackup/db backup/project noah/ProjectNoah(buddy)Backup" /big
Not found: db backup/project noah/ProjectNoah(buddy)Backup.000

-Jeff
Coordinator
Jun 9, 2010 at 1:54 AM
Hi there, I've been able to replicate this problem with a filename with brackets in it. I'll get a fix out, but if you remove the brackets you should be all right in the meantime. Cheers, Max
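For example, once the chunks are re-uploaded under a bracket-free name (the renamed key ProjectNoahBuddyBackup below is just an illustration), the same get syntax from above should work:

Y:\S3\S3 Tasks>s3.exe get "buddyplatformbackup/db backup/project noah/ProjectNoahBuddyBackup" /big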
Jun 9, 2010 at 9:37 PM
Edited Jun 9, 2010 at 9:37 PM

OK, I will rename the file and try again. Thanks for the quick response!

BTW, on a related topic: what scheme should be used when deciding how big the chunks should be (/big:x)? Currently I am doing 1 gig... is there a good or bad reason to use a smaller or bigger number?

-Jeff

Coordinator
Jun 10, 2010 at 2:03 AM
It's really just the maximum amount you don't mind having to re-upload if your network connection goes down. The /big option detects when the chunk you're uploading is already on the server, so with a chunk size of 1GB you'll never have to re-upload more than 1GB on a retry.
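For example, re-running the same put after a dropped connection picks up where it left off. A rough sketch, assuming put takes the bucket/key prefix followed by the local file and that /big:1 matches the "1 gig" chunks mentioned above (both worth confirming against s3.exe's own help output; the file name is hypothetical):

Y:\S3\S3 Tasks>s3.exe put "buddyplatformbackup/db backup/project noah/" ProjectNoahBackup.bak /big:1
(connection drops after several chunks have uploaded)
Y:\S3\S3 Tasks>s3.exe put "buddyplatformbackup/db backup/project noah/" ProjectNoahBackup.bak /big:1
(chunks already on the server are detected and skipped; only the interrupted chunk onwards is re-sent)

So a smaller chunk size caps the worst-case retransmit more tightly at the cost of more .00x objects in the bucket, while a larger one keeps the bucket tidier but risks bigger re-uploads.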
Jun 11, 2010 at 4:38 PM
Understood, thanks. BTW, I have confirmed that removing the brackets fixed the problem.
Coordinator
Jun 22, 2010 at 9:10 AM

(This bug is fixed in version 1.7.)