PHP (PHP5) Interface to Amazon’s S3 Service (Version 0.1)

Something I threw together in about 6 hours… it’s a good start (and it’s workable) but needs a lot of work (polish, etc.)… but right now there is literally nothing out there even remotely close to this class, so I thought I would share it with the world. Perhaps I’ll set up a project page and put it under Subversion if people care to actually contribute.

http://blog.apokalyptik.com/storage3.phps

Update: I’ve put together a “release” of this project. Please visit the Storage3 Interface to Amazon’s S3 “Simple Storage Service” home page.

14 thoughts on “PHP (PHP5) Interface to Amazon’s S3 Service (Version 0.1)”

  1. Here is the whole message that I received when I tried to run your PHP test code. I tested on XAMPP; could that be the cause of the problem?

     Bucket Made Sucessfully!

     Could Not Put File

     Array
     (
         [Code] => SignatureDoesNotMatch
         [Message] => The request signature we calculated does not match the signature you provided. Check your key and signing method.
         [RequestId] => 36F7DF6A7244B970
         [SignatureProvided] => 4omeXSG6ho99YWsURS3MWKMT+Mc=
         [StringToSignBytes] => 50 55 54 0a 0a 0a 57 65 64 2c 20 32 38 20 4a 75 6e 20 32 30 30 36 20 30 33 3a 34 31 3a 31 32 20 47 4d 54 0a 2f 38 37 39 38 37 39 38 37 39 2f 69 6e 66 69 6c 65
         [AWSAccessKeyId] => 0NSJHQTH7MZECMDJ8E02
         [HostId] => AaZNiVjTYRYQbY+XZNJzgKJIm01KAgUvVWFZdDsf6XAVMEMdC7XSYV3Dw6otyrx3
         [StringToSign] => PUT

     Wed, 28 Jun 2006 03:41:12 GMT
     /879879879/infile
         [int] => 403
     )
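
     The [StringToSign] in this dump is exactly what S3 expects to be signed: the
     HTTP verb, blank Content-MD5 and Content-Type lines, the Date header, and the
     canonicalized resource. As a rough sketch only (placeholder secret key; assumes
     PHP’s hash extension is available), the signing step looks like this:

     <?php
     // Rebuild the same string S3 reports in [StringToSign] / [StringToSignBytes]
     $stringToSign = "PUT\n"                            // HTTP verb
                   . "\n"                               // Content-MD5 (empty here)
                   . "\n"                               // Content-Type (empty here)
                   . "Wed, 28 Jun 2006 03:41:12 GMT\n"  // Date header sent with the request
                   . "/879879879/infile";               // canonicalized bucket + key

     // HMAC-SHA1 of that string, keyed with the *secret* key and base64-encoded,
     // goes into the Authorization header as: AWS <AWSAccessKeyId>:<signature>
     $secretKey = 'YOUR-SECRET-KEY';                    // placeholder
     $signature = base64_encode(hash_hmac('sha1', $stringToSign, $secretKey, true));

     // A SignatureDoesNotMatch response usually means the secret key is wrong or
     // the Date/resource used for signing differ from what was actually sent.
     ?>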

  2. Steve says:

    Hi

     I am facing a problem storing large files on Amazon S3:

     $string = file_get_contents($file);

     If the file is large, say 1 GB, this causes a memory problem, because the entire file has to be loaded into memory. Is there a solution to this? Can you help me, please?

     Thanks in advance,

    Steve

  3. Steve – you hit the nail on the head: as written, the entire file has to be loaded into memory to put it onto S3… The two options here are A) break the file into smaller pieces, or B) figure out a way to stream the file without loading it into memory.

     I’m sure B could be done, but it would probably require writing a custom library. Replacing HTTP_REQUEST with cURL might be a possibility, though I’m unsure whether cURL needs to load the whole file or not. A rough sketch of the cURL idea follows.
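
     One way cURL could stream the upload (a sketch only: the bucket, key, and
     Authorization value below are placeholders, and the request would still have
     to be signed the same way the class does now):

     <?php
     $file = '/path/to/large-file';                       // placeholder path
     $fh   = fopen($file, 'rb');

     $ch = curl_init('http://s3.amazonaws.com/my-bucket/large-file'); // placeholder URL
     curl_setopt($ch, CURLOPT_PUT, true);                 // HTTP PUT
     curl_setopt($ch, CURLOPT_INFILE, $fh);               // body is read from this handle in chunks
     curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file));
     curl_setopt($ch, CURLOPT_HTTPHEADER, array(
         'Date: ' . gmdate('D, d M Y H:i:s \G\M\T'),
         'Authorization: AWS ACCESSKEY:SIGNATURE',        // placeholder; build as in the class
     ));
     curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
     $response = curl_exec($ch);
     curl_close($ch);
     fclose($fh);
     ?>

     This keeps memory use flat regardless of file size, since cURL reads the file
     from the handle as it sends.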
