Update PUT function size limit


The new file size limit on S3 is 5TB; the put function needs to be updated to reflect that.


SectorNine50 wrote Jul 1, 2013 at 5:10 PM

Looks like someone attempted to update the maxFileBytes local variable in Put.Execute(), but it's short one multiplication by 1024L. Right now it indicates 5GB instead of 5TB.

I was unable to check out the code, so I can't submit a patch, but changing these two lines fixed the problem for me:

Line 123 in Put.cs:
const long maxFileBytes = 5L * 1024L * 1024L * 1024L * 1024L;
Line 173 in Put.cs:
Path.GetFileName(file), maxFileBytes / 1024 / 1024 / 1024 / 1024));

SectorNine50 wrote Jul 1, 2013 at 5:11 PM

Also Line 172 in Put.cs to match 173:
throw new ArgumentOutOfRangeException(string.Format("{0} is too big; maximum file size on S3 is {1}TB. Type s3 help and see the /big option.",

maxc wrote Jul 2, 2013 at 9:02 PM

I'm not sure this is right - my understanding is that the maximum transfer size is 5GB, but that files up to 5TB can be created using multipart upload (which s3.exe doesn't support).
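To illustrate the distinction, here is a minimal sketch of the two limits and of how a multipart upload would split a file into parts. The constant names and the PartCount helper are illustrative, not part of s3.exe; the 5GB single-PUT and 5TB multipart-object limits are the AWS-documented values discussed above.

```csharp
using System;

class S3Limits
{
    // AWS-documented limits: a single PUT caps out at 5 GB; larger objects
    // (up to 5 TB) can only be created via multipart upload, which s3.exe
    // does not currently implement.
    public const long MaxSinglePutBytes = 5L * 1024 * 1024 * 1024;              // 5 GB
    public const long MaxMultipartObjectBytes = 5L * 1024 * 1024 * 1024 * 1024; // 5 TB

    // Hypothetical helper: how many parts a multipart upload would need
    // for a given file and part size (rounding up for a final partial part).
    public static long PartCount(long fileBytes, long partBytes)
    {
        return (fileBytes + partBytes - 1) / partBytes;
    }
}
```

For example, a 100GB file uploaded in 100MB parts would need `S3Limits.PartCount(100L * 1024 * 1024 * 1024, 100L * 1024 * 1024)` = 1024 parts, well under S3's 10,000-part ceiling.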

SectorNine50 wrote Jul 2, 2013 at 9:17 PM

You're 100% right, my mistake... I misread the documentation.
