This project is read-only.


ID: 11806 | Uploaded by rcompton78 on Mar 21, 2012, 6:06 PM | Status: Being evaluated

I added the ability to set the MIME type (Content-Type) on the Put command:
s3.exe put <bucket> <filename> /mimetype:Application/Octet

There is no validation on the MIME type, and I'm not completely sure how much validation Amazon performs.

Rich Compton
Software Developer
Curve Dental Ltd

Work items: 3551

ID: 10339 | Uploaded by trench_ on Sep 4, 2011, 10:48 AM | Status: Being evaluated

I added a delete command with wildcard matching.
Example: "S3 DEL mybucket/myfolder/*.rar"

I also added support for last modified date, so that you can delete all files older than a certain date (ideal for backup automation):
Example to delete matching files older than 30 days: "S3 DEL mybucket/backups/*.rar /lmbefore:-30"
Example to delete matching files older than the specified date: "S3 DEL mybucket/backups/*.rar /lmbefore:20110719"
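The /lmbefore switch takes either a relative day count or an absolute yyyymmdd date. A rough Python sketch of that cutoff-and-wildcard logic (illustrative only; the function names are mine and the actual patch is C#):

```python
import fnmatch
from datetime import datetime, timedelta

def parse_lmbefore(value, now=None):
    """Interpret an /lmbefore value: '-30' means 30 days before now,
    '20110719' means midnight on that date."""
    now = now or datetime.utcnow()
    if value.startswith("-"):
        return now + timedelta(days=int(value))  # int(value) is negative
    return datetime.strptime(value, "%Y%m%d")

def keys_to_delete(objects, pattern, cutoff):
    """objects: iterable of (key, last_modified) pairs; return the keys
    that match the wildcard and were last modified before the cutoff."""
    return [key for key, last_modified in objects
            if fnmatch.fnmatch(key, pattern) and last_modified < cutoff]
```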

Anyway, I needed this to automate backups to S3 myself, and combined with WinRAR and Mailsend (http://www.muquit.com/muquit/software/mailsend/mailsend.html) I've set up the following batch file to back up my server to S3 every night:

@echo off
:: Setup Parameters
set AdminEmail=...
set S3Key=...
set S3Secret=...

:: Generate filename
for /f "tokens=1-3 delims=-/ " %%a in ("%DATE%") do (
    for /f "tokens=1-3 delims=:., " %%m in ("%TIME%") do (
        set ArchiveName=E:\Backup\Backup_%%a%%b%%c_%%m%%n%%o.rar
    )
)

:: Delete Previous Rar Items
del /Q E:\Backup\*.rar
del /Q E:\Backup\Backup.log

:: Rar Everything Together
call E:\Scripts\rar.exe a %ArchiveName% -dh -ep3 -hpMySuperPassword -r -se -msrar;zip;jpg;png;bak;mp3 @E:\Scripts\backup.lst >E:\Backup\Backup.log
IF %ERRORLEVEL% NEQ 0 GOTO Err

:: Remove S3 items older than 30 days
s3 del mywebserver/backup/*.rar /lmbefore:-30 /key:%S3Key% /secret:%S3Secret% >>E:\Backup\Backup.log
IF %ERRORLEVEL% NEQ 0 GOTO Err

:: Copy to S3
s3 put mywebserver/backup/ %ArchiveName% /key:%S3Key% /secret:%S3Secret% >>E:\Backup\Backup.log
IF %ERRORLEVEL% NEQ 0 GOTO Err

:: Success
mailsend +cc +bc -M "MyWebserver. Backup Completed!" -sub "MyWebserver. Backup Completed!" -d mywebserver.com -smtp 127.0.0.1 -f %AdminEmail% -t %AdminEmail% -a E:\Backup\Backup.log

GOTO End

:Err
mailsend +cc +bc -M "MyWebserver. Backup Failed!" -sub "MyWebserver. Backup Failed!" -d mywebserver.com -smtp 127.0.0.1 -f %AdminEmail% -t %AdminEmail% -a E:\Backup\Backup.log

:End

Work items: 2590, 5369

ID: 9128 | Uploaded by kma248 on Apr 12, 2011, 10:59 PM | Status: Being evaluated

Adds a /maxage parameter to Put so that you can easily set the Cache-Control max-age header when uploading. Example usage: s3 put mybucket my.file /maxage:86400 /acl:public-read
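The switch presumably just maps onto the Cache-Control header sent with the upload. A minimal Python sketch of that mapping (illustrative; the function names are mine and the real patch is C# inside the Put command):

```python
def parse_switch(arg):
    """Split an s3.exe-style '/name:value' switch into its parts."""
    name, _, value = arg.lstrip("/").partition(":")
    return name, value

def cache_control_for(arg):
    """Turn a /maxage:<seconds> switch into the header the upload
    would carry (a sketch, not the patch's actual code)."""
    name, value = parse_switch(arg)
    if name != "maxage":
        raise ValueError("expected a /maxage switch")
    seconds = int(value)
    if seconds < 0:
        raise ValueError("max-age must be non-negative")
    return {"Cache-Control": "max-age=%d" % seconds}
```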


ID: 8765 | Uploaded by chuchuva on Mar 16, 2011, 9:10 PM | Status: Being evaluated

An even better improvement to the /sync option if you have lots of files. See http://chuchuva.com/pavel/2011/03/improving-backup-to-amazon-s3/


ID: 8041 | Uploaded by fragged on Jan 13, 2011, 3:23 PM | Status: Being evaluated

This is an option to PUT files as the lowercase version of their current names.
I have used this where I needed to move static files across, but the names were inconsistent in casing and failed to resolve on S3.
This uploads all files to S3 as lowercase, and a rewrite rule on the web server ensures all requests to S3 are lowercase.
Example:

s3.exe PUT bucket.com/images/ c:\images\ /tolower
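The key mapping is simple enough to sketch (Python, illustrative only; the patch itself lives in the C# Put command):

```python
def s3_key_for(prefix, filename, tolower=False):
    """Build the S3 key for a local file under the given prefix;
    with tolower, lowercase the key the way the /tolower switch
    described above does."""
    key = prefix + filename.replace("\\", "/")
    return key.lower() if tolower else key
```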


ID: 7122 | Uploaded by danruehle on Oct 19, 2010, 10:03 PM | Status: Being evaluated

Adds a delete command that can delete based on a wildcard key, a specific key, or a bucket.

Work items: 2590

ID: 5226 | Uploaded by chuchuva on Feb 10, 2010, 12:17 AM | Status: Being evaluated

This patch fixes the high memory consumption of the previous patch.


ID: 5214 | Uploaded by chuchuva on Feb 9, 2010, 5:45 AM | Status: Being evaluated

Hi Max,

First of all, thank you for your tool - this is exactly what I need.

There is one problem, however. I back up a large number of files - about 450,000 - every night, using the /sync option. For each file, s3.exe issues a HEAD request to check the last modified date on the server. This takes a lot of time, and it costs $0.45 in total (450,000 HEAD requests / 10,000 * $0.01). A more efficient approach is to LIST all files in the bucket, store information about them in a lookup table, and then check last modified dates there. I tried this and it works. Now the cost is $0.018 (450,000 / 250 / 1,000 * $0.01) - a big win! :)
--
Pavel Chuchuva
http://chuchuva.com
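The lookup-table approach Pavel describes can be sketched roughly as follows (Python, illustrative only; the actual patch is C#):

```python
def build_lookup(list_pages):
    """Flatten paged LIST results into {key: last_modified}, so the
    sync can answer 'is the server copy older?' without any HEADs."""
    table = {}
    for page in list_pages:              # one LIST request per page
        for key, last_modified in page:
            table[key] = last_modified
    return table

def files_to_upload(local_files, remote):
    """local_files: iterable of (key, mtime) pairs; upload anything
    missing from the bucket or modified since the server copy."""
    return [key for key, mtime in local_files
            if key not in remote or mtime > remote[key]]
```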


ID: 6297 | Uploaded by wizzardmr42 on Jul 11, 2010, 9:47 PM | Status: Applied

The List command displays an extra column for the storage class: "S" for standard, "R" for reduced redundancy.


Applied Aug 10, 2010: Added support for storage classes with LIST and PUT, with thanks to wizzardmr42. The list command displays the storage class only when /storageclass is specified, as I know some folks are parsing the output from s3.exe and so don't like it when the format changes. Thanks for the patch! Max


ID: 6298 | Uploaded by wizzardmr42 on Jul 11, 2010, 10:05 PM | Status: Applied

This displays the storage class in the list bucket contents results: "S" for standard, "R" for reduced redundancy.

Replaces 6297: the storage class indicator has moved to just before the filename so that the tabbing isn't messed up.

This only requires a minor change on line 69 of Commands\List.cs


Applied Aug 10, 2010: Added support for storage classes with LIST and PUT, with thanks to wizzardmr42. The list command displays the storage class only when /storageclass is specified, as I know some folks are parsing the output from s3.exe and so don't like it when the format changes. Thanks for the patch! Max

