403 Forbidden Error on all operations

Jun 14, 2010 at 9:48 PM
Edited Jun 14, 2010 at 9:58 PM
This app looks like it is exactly what I need, but unfortunately I have been unable to get it working. I have done the following:
- Signed up for the S3 Account.
- Got an "Access Key ID" and "Secret Access Key" from the Access Credentials section of my Security Credentials page.
- Went into the AWS Management Console and created a new bucket called 'MyBucket1'.
- Granted access to 'Everyone' for all permissions on this bucket. (I assume that I can narrow this down to 'authenticated users' in the future.)
- Opened a command prompt.
- Ran "s3 auth keykeykeykeykeykeykeykey secretsecretsecretsecretsecretsecret"
- That part returns without error. Anything else I try to do returns an error:

s3 list MyBucket1
The remote server returned an error: (403) Forbidden.


s3 put MyBucket1 textfile.txt
The remote server returned an error: (403) Forbidden.
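In case it helps narrow things down, here is a rough Python sketch of the same two operations outside s3.exe (assuming the boto3 library; the bucket, key, and file names are just the ones from my steps above), so I can see whether the 403 comes back from S3 itself or only through s3.exe:

# Rough check of the same list/put operations outside s3.exe,
# using the boto3 library (names below are just the ones from my steps above).
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="keykeykeykeykeykeykeykey",                # same Access Key ID given to 's3 auth'
    aws_secret_access_key="secretsecretsecretsecretsecretsecret" # same Secret Access Key
)

# Equivalent of "s3 list MyBucket1"
print(s3.list_objects_v2(Bucket="MyBucket1").get("Contents", []))

# Equivalent of "s3 put MyBucket1 textfile.txt"
with open("textfile.txt", "rb") as f:
    s3.put_object(Bucket="MyBucket1", Key="textfile.txt", Body=f)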

If anyone can help me get over this hump, it would be greatly appreciated.

Thank you,
Rick Arthur
Jun 14, 2010 at 10:21 PM
Update:
The command 'S3 list' works and returns the right list of buckets! But I still can't get anything else to work.
Coordinator
Jun 15, 2010 at 12:52 AM
Hi there, I think this has to be a problem with the permissions on your buckets, since listing works. Maybe you could create a new bucket and try accessing it without changing the permissions?
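If you have Python handy, something like this rough boto3 sketch (placeholder credentials, and I'm guessing MyBucket1 as the bucket name) would show exactly which grants are set on the bucket:

# Rough sketch (boto3 library, placeholder credentials) to dump the bucket's grants.
import boto3

s3 = boto3.client("s3",
                  aws_access_key_id="YOUR_ACCESS_KEY_ID",
                  aws_secret_access_key="YOUR_SECRET_ACCESS_KEY")

acl = s3.get_bucket_acl(Bucket="MyBucket1")
for grant in acl["Grants"]:
    print(grant["Grantee"], grant["Permission"])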
Jun 15, 2010 at 2:22 AM

Unfortunately, same result when I create a new bucket.  I connected with the same key/secret using CloudBerry, a GUI tool for S3, and was able to read and write files in the bucket.  I must be entering something wrong with your tool.  I assume the following should work:

 s3 list HealthTech /key:AKIAJQEX6VNVGG37NWMA /secret:EnESYIeuWIwcj1jSucNBN9FPqfIUYfZ01xf3lfYc

(Yes, that's the actual key, but I'll just discard it in a day or two and create a new one.)



Thanks for your help.

Rick

Coordinator
Jun 15, 2010 at 3:20 AM
How odd. Do you mind if I have a go at creating a bucket on your account?
Jun 17, 2010 at 6:37 PM

Sorry for the delayed response, I got sidetracked.  But please do.  Create a bucket, add/remove files or whatever you want to try.  I'm not using the account in production yet.

Thanks,

Rick

Coordinator
Jun 19, 2010 at 5:38 AM

Hi Rick,

I created a bucket called maxtesting using S3Fox and didn't have any problems accessing it with s3.exe.  Then I tried to create a bucket with uppercase characters and I couldn't do it!  S3Fox doesn't allow it.  I then found the following advice online:

"European Bucket allows only lower case letters. Although Buckets created in the US may contain lower case and upper case both, Amazon recommends that you use all lower case letters when creating a bucket."

Anyway, looks like you'll be OK if you stick with lowercase bucket names.
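If it's useful, here's a rough check you could run over a name before creating the bucket (plain Python; the pattern is just my own approximation of the rules above, not an official validator):

# Rough bucket-name check based on the lowercase advice above
# (my own approximation of the rules, not an official validator).
import re

def looks_safe(name):
    # 3-63 characters, all lowercase letters, digits, dots or hyphens
    return bool(re.fullmatch(r"[a-z0-9.-]{3,63}", name))

print(looks_safe("maxtesting"))   # True
print(looks_safe("MyBucket1"))    # False - uppercase characters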

Max

Jun 20, 2010 at 1:40 AM

Thanks.  That seems to do the trick.   Lower case buckets it is.

Thanks for a great product that fills a needed gap, and for your help on this particular issue.

Rick

Jul 10, 2011 at 2:25 AM

I got the same problem when I used a "-" in my bucket names. The AWS console lets you create buckets with dashes just fine, but S3.exe returns a 403 when uploading to them.

Nov 25, 2013 at 7:51 AM
This also happens when the clock on the machine is out of sync - in my case the server was 30 minutes ahead.
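A quick way to check for skew is something like this rough Python sketch (assuming the requests library); it just compares the local clock to the Date header the S3 endpoint sends back:

# Rough skew check (assuming the requests library): compare the local clock
# to the Date header that the S3 endpoint returns.
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
import requests

response = requests.head("https://s3.amazonaws.com/", allow_redirects=False)
server_time = parsedate_to_datetime(response.headers["Date"])
skew = datetime.now(timezone.utc) - server_time
print("Clock skew:", skew)  # anything near 30 minutes would explain the 403s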
Dec 2, 2013 at 2:24 PM
I saw this error on non-US buckets.
So I created a US bucket (selecting the US Standard region when creating it) and everything works fine!
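In case it helps anyone doing it from code, creating the bucket in US Standard looks roughly like this (a boto3 sketch; the bucket names are just examples):

# Rough boto3 sketch: create the bucket in US Standard (us-east-1),
# which needs no LocationConstraint; the bucket names are just examples.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
s3.create_bucket(Bucket="myusstandardbucket")

# For comparison, a non-US bucket needs an explicit LocationConstraint, e.g.:
# s3.create_bucket(Bucket="myeubucket",
#                  CreateBucketConfiguration={"LocationConstraint": "eu-west-1"})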
Apr 29, 2014 at 1:11 AM
I thought I'd contribute to this:

When you create a brand new bucket, Amazon S3 will redirect your request to a temporary endpoint with a 307 response.

s3.exe follows the 307 to the new endpoint; however, it doesn't pass the 'Authorization:' request header through on the second request, so if your bucket is private you'll get a 403.

The only solution for non-US buckets is to keep trying until AWS updates all its S3 endpoints with your new bucket. Or, if you don't mind being insecure, give the bucket public read-write until the AWS S3 endpoints are all up to date.

The best way to check is to open up Fiddler and run the command until you stop getting a 307 back from AWS.
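If you don't have Fiddler handy, a rough Python alternative (assuming the requests library, path-style addressing on the global endpoint, and a placeholder bucket name) is to poll for the redirect yourself:

# Rough alternative to Fiddler (requests library, path-style URL on the global
# endpoint, bucket name is just a placeholder): poll until the 307 goes away.
import time
import requests

while True:
    response = requests.head("https://s3.amazonaws.com/your-new-bucket",
                             allow_redirects=False)
    print(response.status_code, response.headers.get("Location", ""))
    if response.status_code != 307:
        break       # no more temporary redirect; s3.exe should work now
    time.sleep(60)  # wait a minute and try again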

This is a bug and will need to be fixed; I'll raise an issue.