I have to pull data from a table that has 2 million rows. The Eloquent query looks like this:
$imagesData = Images::whereIn('file_id', $fileIds)
->with('image.user')
->with('file')
->orderBy('created_at', 'DESC')
->simplePaginate(12);
The above query works fine on a small table, but on the production site, where the Images table has over 2 million rows, it takes over 15 seconds to return a response. I use Laravel for the API only.
I have read through other discussions on this topic. I changed paginate() to simplePaginate(). Some suggest that a DB:: query with whereRaw might work better than whereIn (roughly like the sketch below). Some say the slowdown might come from how PDO in PHP processes whereIn, and some recommend using Images::whereIn, which I already do.
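If I understand the whereRaw suggestion correctly, it would look roughly like this. This is only my own sketch of the idea, not code from my app: the placeholder-building line is an assumption, and a plain DB::table() call would also drop the with() eager loading.

use Illuminate\Support\Facades\DB;

// Build one "?" placeholder per id so the values stay bound through PDO.
$placeholders = implode(',', array_fill(0, count($fileIds), '?'));

$imagesData = DB::table('images')
    ->whereRaw("file_id IN ($placeholders)", $fileIds)
    ->orderBy('created_at', 'DESC')
    ->simplePaginate(12);

I have not measured whether this is actually any faster than whereIn on the production data.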
I use MariaDB with the InnoDB engine, and the data is loaded into RAM. The SQL queries perform well everywhere else; only the ones that have to gather data from a huge table like this one are slow.
How can I optimise the above Laravel query so that the response time drops to a couple of seconds, if possible, when the table has millions of rows?