How to encrypt & upload large files to Amazon S3 in Laravel

Creating Queueable Jobs

Next, let’s create the two queueable jobs that we use for encryption and uploading to S3:

php artisan make:job EncryptFile
php artisan make:job MoveFileToS3

This will create two files in app/Jobs: EncryptFile.php and MoveFileToS3.php. Each job accepts a single constructor parameter representing the filename, and the actual work of encrypting and uploading to S3 happens in the handle method. This is what the two jobs look like:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use SoareCostin\FileVault\Facades\FileVault;

class EncryptFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $filename;

    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct($filename)
    {
        $this->filename = $filename;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        FileVault::encrypt($this->filename);
    }
}

<?php

namespace App\Jobs;

use Exception;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Http\File;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class MoveFileToS3 implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $filename;

    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct($filename)
    {
        $this->filename = $filename . '.enc';
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        // Upload the encrypted file to S3
        $result = Storage::disk('s3')->putFileAs(
            '/',
            new File(storage_path('app/' . $this->filename)),
            $this->filename
        );

        // Force collection of any existing garbage cycles.
        // If we don't do this, in some cases the file remains locked.
        gc_collect_cycles();

        if ($result == false) {
            throw new Exception("Couldn't upload file to S3");
        }

        // Delete the encrypted file from the local filesystem
        if (!Storage::disk('local')->delete($this->filename)) {
            throw new Exception('File could not be deleted from the local filesystem');
        }
    }
}

As you can see, the EncryptFile job is simple — we are just using the FileVault package to encrypt a file and save it into the same directory, with the same name and the .enc extension. It’s exactly what we were doing before, in the HomeController’s store method.

For the MoveFileToS3 job, we first use Laravel's putFileAs method, which streams the file to S3 instead of loading it into memory, following the same directory convention we had on the local filesystem.
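
For reference, putFileAs writes to the s3 disk defined in config/filesystems.php. That disk isn't shown in this excerpt, but the stock Laravel configuration reads its credentials from environment variables and looks roughly like this:

's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],

The S3 driver also requires the league/flysystem-aws-s3-v3 Composer package to be installed.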

We then call the PHP gc_collect_cycles function to force collection of any existing garbage cycles. In some cases, skipping this call leaves the file locked, and we wouldn't be able to delete it in the next step.

Finally, we delete the encrypted file from the local filesystem, throwing an exception if either the upload or the delete fails.
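
To tie the two jobs together, they can be dispatched as a chain, so that MoveFileToS3 only runs after EncryptFile has completed. The controller code isn't part of this excerpt, so the snippet below is only a sketch of what that dispatch might look like; $filename is assumed to hold the uploaded file's path relative to storage/app:

use App\Jobs\EncryptFile;
use App\Jobs\MoveFileToS3;

// Sketch: dispatch the jobs as a chain from the controller's store method.
// MoveFileToS3 will only run if EncryptFile finishes without failing.
EncryptFile::withChain([
    new MoveFileToS3($filename),
])->dispatch($filename);

A queue worker (php artisan queue:work) then picks up the chain, encrypting the file first and uploading it to S3 once encryption succeeds.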
