How to correctly queue image manipulation in Laravel with beanstalkd while uploading to Amazon S3?

I'm doing some tests with Laravel, uploading images to Amazon S3 and queuing the image manipulation with beanstalkd. Please note that this is just testing.

Here is my implementation:

// routes.php

Route::post('/', function()
{
    $validator = Validator::make(Input::all(), array(
        'title' => 'required',
        'file'  => 'required|mimes:jpeg,jpg,png',
    ));

    if( $validator->fails() )
    {
        return Redirect::to('/');
    }

    // Upload File
    $file = Input::file('file');

    $now = new DateTime;
    $hash = md5( $file->getClientOriginalName().$now->format('Y-m-d H:i:s') );
    $key = $hash.'.'.$file->getClientOriginalExtension();

    $s3 = AWS::createClient('s3');

    $s3->putObject(array(
        'Bucket'      => 'bellated',
        'Key'         => $key,
        'SourceFile'  => $file->getRealPath(),
        'ContentType' => $file->getClientMimeType(),
    ));

    // Create job
    Queue::push('\Proc\Worker\ImageProcessor', array(
        'bucket'   => 'bellated',
        'hash'     => $hash,
        'key'      => $key,
        'ext'      => $file->getClientOriginalExtension(),
        'mimetype' => $file->getClientMimeType(),
    ));

    Log::info('queue processed');

    return Redirect::to('/complete');
});

// image processor

<?php namespace Proc\Worker;

use Imagine\Gd\Imagine;
use Imagine\Image\Box;

class ImageProcessor {

    protected $width;
    protected $height;
    protected $image;

    public function fire($job, $data)
    {
        $s3 = \AWS::createClient('s3');

        // Fetch the original image that the upload route stored on S3
        try {
            $response = $s3->getObject(array(
                'Bucket' => $data['bucket'],
                'Key'    => $data['key'],
            ));
        } catch (\Exception $e) {
            // If the object is missing, skip this job
            return;
        }

        $imagine = new Imagine();
        $image = $imagine->load( (string) $response->get('Body') );

        // Generate a 100x100 thumbnail and upload it next to the original
        $size = new Box(100, 100);
        $thumb = $image->thumbnail($size);

        $s3->putObject(array(
            'Bucket'      => 'bellated',
            'Key'         => $data['hash'].'_100x100.'.$data['ext'],
            'Body'        => $thumb->get($data['ext']),
            'ContentType' => $data['mimetype'],
        ));

        // Remove the job from the queue once processing has finished
        $job->delete();
    }

}
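
For completeness, the queue driver itself is switched in config/queue.php. The relevant part, a sketch assuming the stock Laravel 5 config and the pda/pheanstalk package, looks like this:

// config/queue.php

'default' => 'beanstalkd',

'connections' => array(
    'beanstalkd' => array(
        'driver' => 'beanstalkd',
        'host'   => 'localhost',
        'queue'  => 'default',
        'ttr'    => 60,
    ),
),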

When I have 'sync' as the queue driver, everything works fine and I get both images (original and resized) in Amazon S3, but after I switched to 'beanstalkd' and ran php artisan queue:listen, I keep getting this error:

Next exception 'Aws\S3\Exception\S3Exception' 
  with message 'Error executing "GetObject" 
  on "https://s3.eu-central-1.amazonaws.com/bellated/cd05ec14f7a19047828d7ed79d192ee3.jpg";
 AWS HTTP error:  
 Client error: 404 NoSuchKey 
 (client): The specified key does not exist. - 
  <?xml version="1.0" encoding="UTF-8"?>
    <Error>
      <Code>NoSuchKey</Code>
      <Message>The specified key does not exist.</Message>
      <Key>cd05ec14f7a19047828d7ed79d192ee3.jpg</Key>
      <RequestId>9390AD2904820C3E</RequestId> 
      <HostId>
        nZK1ivZn3bs6xy0S/tGe+A7yoZgKKccLpUDObKuwS2Zmi8LXUgFI5JpkQWCkwchCw6tgW7jyvGE=
      </HostId>
    </Error>'
  in /home/vagrant/Code/laravel/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php:152

Any ideas on what might be causing this, or how I could proceed?

1 Answer

Answered by whoacowboy:

It looks like you are setting your S3 key to your file name, which might be causing you grief.

    $s3->putObject(array(
        'Bucket'      => 'bellated',
        'Key'         => $data['hash'].'_100x100.'.$data['ext'],
        'Body'        => $thumb->get($data['ext']),
        'ContentType' => $data['mimetype'],
    ));

The error message is what makes me think this:

Client error: 404 NoSuchKey 
(client): The specified key does not exist. - 
<Key>cd05ec14f7a19047828d7ed79d192ee3.jpg</Key>
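
One quick way to confirm whether the object under that key actually exists at the time the worker runs is to check for it before calling getObject. This is a sketch using the doesObjectExist helper on the same SDK client the question already creates:

    $s3 = \AWS::createClient('s3');

    // Bail out (or log) if the original object is not on S3 yet
    if ( ! $s3->doesObjectExist($data['bucket'], $data['key'])) {
        \Log::warning('Object not found on S3: '.$data['key']);
        return;
    }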

In general, it looks like you are doing this the hard way. I am not sure how to make your code work, but Laravel does a lot of what you are trying to do right out of the box.

Here is how I have done what you are trying to do.

You need to set up your environment.

.env

    S3_KEY=MYKEYMYKEYMYKEYMYKEY
    S3_SECRET=MYSECRETMYSECRETMYSECRETMYSECRETMYSECRET
    S3_REGION=us-east-1
    S3_BUCKET=bucketname

config/filesystems.php

    <?php
    return [
        'default' => 'local',
        'cloud' => 's3',
        'disks' => [
            'local' => [
                'driver' => 'local',
                'root'   => storage_path().'/app',
            ],
            's3' => [
                'driver' => 's3',
                'key'    => env('S3_KEY'),
                'secret' => env('S3_SECRET'),
                'region' => env('S3_REGION'),
                'bucket' => env('S3_BUCKET'),
            ],
        ],
    ];

routes.php (quick test)

    Route::get('s3',function(){
        $success = Storage::disk('s3')->put('hello.txt','hello');
        return ($success)?'Yeay!':'Boo Hoo';
    });

I know that this is with a text file, but the process is exactly the same for an image.
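
The same call works for an uploaded image by handing it the raw file contents. A sketch, reusing the Input facade from the question's route (the upload-test route name and the uploads/ key prefix are just placeholders):

    Route::post('upload-test', function () {
        $file = Input::file('file');

        // Push the raw bytes of the upload straight to the s3 disk
        $success = Storage::disk('s3')->put(
            'uploads/'.$file->getClientOriginalName(),
            file_get_contents($file->getRealPath())
        );

        return ($success) ? 'Yeay!' : 'Boo Hoo';
    });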

I would handle the queueing by using Laravel's Jobs (they used to be called Commands).

At the terminal, run the following command, which will create an app/Jobs/NewJob.php file:

    php artisan make:job NewJob --queued

Set up your job like this.

NewJob.php

    <?php

    namespace App\Jobs;

    use App\Jobs\Job;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Contracts\Bus\SelfHandling;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Storage;

    class NewJob extends Job implements SelfHandling, ShouldQueue
    {
        use InteractsWithQueue, SerializesModels;

        public $content;
        public $path;

        public function __construct($content, $path)
        {
            $this->content = $content;
            $this->path = $path;
        }

        public function handle()
        {
            // Write the queued content to the configured S3 disk
            Storage::disk('s3')->put($this->path, $this->content);
        }
    }
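
If the job should also do the resizing from the question, handle() could be adapted along these lines. This is only a sketch, assuming the Imagine GD library from the question is installed, that $this->content holds the raw image bytes, and that the _100x100 suffix mirrors the question's naming:

    public function handle()
    {
        $imagine = new \Imagine\Gd\Imagine();

        // Load the raw bytes that were passed to the job
        $image = $imagine->load($this->content);

        // Build a 100x100 thumbnail
        $thumb = $image->thumbnail(new \Imagine\Image\Box(100, 100));

        // Store the original, then the thumbnail, on the s3 disk
        $ext = pathinfo($this->path, PATHINFO_EXTENSION);
        $thumbPath = preg_replace('/(\.[^.]+)$/', '_100x100$1', $this->path);

        Storage::disk('s3')->put($this->path, $this->content);
        Storage::disk('s3')->put($thumbPath, $thumb->get($ext));
    }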

And your controller would look something like this:

    <?php

    namespace App\Http\Controllers;

    use App\Jobs\NewJob;

    class ImageController extends Controller
    {
        public function sendImage($content, $path)
        {
            $this->dispatch(new NewJob($content, $path));
        }
    }
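
Tying it back to the original route, the controller then only reads the upload and dispatches the job, and all of the S3 work happens on the queue. A sketch (assuming the base Controller pulls in the DispatchesJobs trait, as it does in a stock Laravel 5.1 app, and that the Input facade used in the question is available):

    public function store()
    {
        $file = \Input::file('file');

        // Mirror the question's hashed key naming
        $key = md5($file->getClientOriginalName().time())
            .'.'.$file->getClientOriginalExtension();

        // Hand the raw bytes and target key to the queued job
        $this->dispatch(new NewJob(
            file_get_contents($file->getRealPath()),
            $key
        ));

        return redirect('/complete');
    }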