In a recent project I needed to insert about 20K rows into a table.
My first attempt was to loop over the data source and issue a new insert for each row, which means
making 20K insert queries, each one a separate round trip to the database (very bad :(( ).
UserToken::recipients('fcm')->select('token', 'lang')->chunk(1000, function ($items) {
    foreach ($items as $item) {
        $data = [
            'notification_job_id' => 3,
            'token'               => $item->token,
            'locale'              => $item->lang,
        ];

        // One insert query per row: 20K queries in total
        DB::table('notification_recipients')->insert($data);
    }
});
After some searching I discovered that I can perform this task far more efficiently by passing a multidimensional array to insert(), so each 1,000-row chunk becomes a single query and the total drops from 20K insert queries to just 20:
UserToken::recipients('fcm')->select('token', 'lang')->chunk(1000, function ($items) {
    $list = [];

    foreach ($items as $item) {
        // Collect the rows for this chunk instead of inserting them one by one
        $list[] = [
            'notification_job_id' => 3,
            'token'               => $item->token,
            'locale'              => $item->lang,
        ];
    }

    // Perform a single insert query with 1K rows
    DB::table('notification_recipients')->insert($list);
});
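As a side note, the same callback can be written a bit more compactly with Laravel's collection methods, since chunk() hands the callback a Collection. This is just a stylistic sketch of the code above, assuming the same UserToken scope and notification_recipients table:

UserToken::recipients('fcm')->select('token', 'lang')->chunk(1000, function ($items) {
    // Build the whole 1K-row payload with map() instead of a foreach loop
    $list = $items->map(function ($item) {
        return [
            'notification_job_id' => 3,
            'token'               => $item->token,
            'locale'              => $item->lang,
        ];
    })->all();

    // Still one insert query per chunk of 1,000 rows
    DB::table('notification_recipients')->insert($list);
})

Keeping the chunk size around 1,000 is also a reasonable safety margin: a single insert with all 20K rows at once could run into database limits such as the maximum packet size or the driver's cap on bound placeholders.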