Why does a Laravel 8 job queue give an HTTP 500 error after uploading a 250 MB CSV?


I am uploading a 250 MB CSV file with 800,000 records into MySQL using Laravel 8. The upload works, but after some time the page shows "This page is not working HTTP ERROR 500". After the timeout the data does get inserted into the student table, but since the error appears before the batch is returned, I cannot tell whether all the data was stored correctly. This code works for small files; the problem only appears with the 250 MB file.

php.ini file setting:

max_execution_time=10000
max_input_time=9000
memory_limit=2048M

my.conf

max_allowed_packet=120G

config/queue.php:

'database' => [
            'driver' => 'database',
            'table' => 'jobs',
            'queue' => 'default',
            'retry_after' => 120,
            'after_commit' => false,
        ],
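One thing worth checking in this config: `retry_after` is 120 seconds, but a chunk of 10,000 single-row inserts can easily run longer than that. When a job exceeds `retry_after`, the database queue worker assumes it crashed and releases it for another attempt, which can duplicate inserts. A hedged sketch (the value 600 is an assumption, not from the question):

```php
// config/queue.php — retry_after should comfortably exceed the runtime
// of the slowest job, or chunks may be picked up and processed twice.
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 600, // assumption: raised from 120 for long CSV jobs
    'after_commit' => false,
],
```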

controller:

public function store(Request $request)
{
    // Assumption: the question never shows where $x comes from; here it is
    // read from the uploaded file as an array of lines.
    $x = file($request->file('csv')->getRealPath(), FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    // Split the lines into chunks of 10,000 so each queued job stays small.
    $chunks = array_chunk($x, 10000);

    $batch = Bus::batch([])->dispatch();
    $header = [];
    foreach ($chunks as $key => $chunk) {
        $data = array_map('str_getcsv', $chunk);

        // The first row of the first chunk is the header; keep it and
        // drop it from the data to insert.
        if ($key == 0) {
            $header = $data[0];
            unset($data[0]);
        }

        $batch->add(new StudentCsvProcess($data, $header));
    }

    return $batch;
}
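Note that reading the whole file with `file()` plus `array_chunk()` keeps all 250 MB (and the chunk copies) in memory at once, which is a likely contributor to the 500 error. A sketch of a streaming alternative using `SplFileObject`, which reads one line at a time so memory use stays flat regardless of file size (the function name `readCsvRows` is illustrative, not from the question):

```php
<?php
// Stream CSV rows one at a time instead of loading the whole file.
// Yields each data row as an array; the header row is skipped.
function readCsvRows(string $path): \Generator
{
    $file = new \SplFileObject($path, 'r');
    $file->setFlags(
        \SplFileObject::READ_CSV
        | \SplFileObject::READ_AHEAD
        | \SplFileObject::SKIP_EMPTY
    );

    $header = null;
    foreach ($file as $row) {
        if ($row === [null] || $row === false) {
            continue; // trailing blank line artifact from SplFileObject
        }
        if ($header === null) {
            $header = $row; // first row is the header
            continue;
        }
        yield $row;
    }
}
```

In the controller, rows from this generator could be buffered into groups of 10,000 and dispatched with `$batch->add()` as each group fills, so only one chunk is ever in memory.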

StudentCsvProcess.php

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Bus\Batchable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use App\Models\Student;
use Throwable;

class StudentCsvProcess implements ShouldQueue
{
    use Batchable,Dispatchable, InteractsWithQueue, Queueable, SerializesModels;
    public $header; 
    public $data;
    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct($data,$header)
    {
        //
        $this->data=$data;
        $this->header=$header;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        foreach($this->data as $insert_data)
        {
            $studentData=array_combine($this->header,$insert_data);
            Student::create($studentData);
        }
        
    }

    public function failed(Throwable $exception)
    {
        // Send user notification of failure, etc...
    }
}
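A further point on `handle()`: calling `Student::create()` once per row issues 800,000 individual INSERTs, which keeps jobs running far past `retry_after`. A hedged sketch of the row-grouping step, so each group can feed one multi-row `Student::insert($batch)` instead (the helper `buildInsertChunks` is illustrative, not Laravel API; note that the query builder's `insert()` bypasses Eloquent timestamps and mutators):

```php
<?php
// Group CSV rows into column => value maps, 500 per group, so the job
// can issue one multi-row INSERT per group. The DB call itself is
// omitted here; each returned element would feed Student::insert().
function buildInsertChunks(array $header, array $rows, int $size = 500): array
{
    $chunks = [];
    foreach (array_chunk($rows, $size) as $group) {
        $batch = [];
        foreach ($group as $row) {
            $batch[] = array_combine($header, $row);
        }
        $chunks[] = $batch;
    }
    return $chunks;
}
```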

There is 1 answer

Answer by Rateb Habbab:

Make sure you have updated the following in php.ini:

upload_max_filesize = 500M
post_max_size = 500M
max_input_time = 300
max_execution_time = 300

You can also try updating the .htaccess file like:

php_value upload_max_filesize 500M
php_value post_max_size 500M
php_value max_input_time 300
php_value max_execution_time 300
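Also note that these php.ini values only govern the HTTP request that accepts the upload; the queued jobs run in a separate worker process with its own limits. A hedged addition (the flag values are assumptions): the worker's `--timeout` should sit a few seconds below `retry_after` (120 in the question's config), otherwise the job can be retried while still running.

```shell
php artisan queue:work --timeout=110 --tries=3
```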