Description
I'm using Guzzle in my Laravel project, and I hit a memory crash when I make a request to an API that returns a huge payload.
This is at the top of my CURL.php class, which has a get() method that uses Guzzle:
use GuzzleHttp\Exception\GuzzleException;
use GuzzleHttp\Client;
class CURL {
    public static function get($url) {
        $client = new Client();
        $options = [
            'http_errors' => true,
            'force_ip_resolve' => 'v4',
            'connect_timeout' => 2,
            'read_timeout' => 2,
            'timeout' => 2,
        ];
        $result = $client->request('GET', $url, $options);
        $result = (string) $result->getBody();
        $result = json_decode($result, true);
        return $result;
    }
    ...
}
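On the memory side, I've been looking at Guzzle's documented 'sink' request option, which streams the response body straight to a file instead of buffering it in a PHP string. This is only a sketch of what I'd try; getLarge() is a name I made up for it, and the 30-second timeout is a guess sized for a payload like the one below:

public static function getLarge($url) {
    $client = new Client();
    // hypothetical variant: stream the body to a temp file as it arrives,
    // instead of casting the whole response body to a string in memory
    $tmp = tempnam(sys_get_temp_dir(), 'api_');
    $client->request('GET', $url, [
        'http_errors' => true,
        'sink' => $tmp,          // Guzzle writes the body to this file while downloading
        'connect_timeout' => 2,
        'timeout' => 30,         // guess: enough headroom for a ~23 MB payload
    ]);
    $data = json_decode(file_get_contents($tmp), true);
    unlink($tmp);
    return $data;
}

Decoding the JSON still loads it into memory at the end, but this at least avoids holding both the response object's buffered body and the string copy at the same time.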
When I call it like this in my application, it requests a large payload (account 30000, around 23 MB judging by the error below):
$url = 'http://site/api/account/30000';
$response = CURL::get($url)['data'];
I kept getting this error:
cURL error 28: Operation timed out after 2000 milliseconds with 7276200 out of 23000995 bytes received (see http://ift.tt/1mgwZgQ)
How do I avoid this? The numbers in the error suggest the transfer is just slow rather than stalled: about 7.3 MB of the 23 MB arrived within the 2000 ms limit, so the full download would need roughly 6 to 7 seconds at that rate. Should I increase these settings (see the sketch after them)?
'connect_timeout' => 2,
'read_timeout' => 2,
'timeout' => 2,
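If raising them is the right call, something like this is what I'd try; the values are guesses derived from the rate estimate above, not recommendations:

'connect_timeout' => 2,   // the connection itself succeeded; it was the transfer that timed out
'read_timeout' => 10,     // guess: inactivity limit per read
'timeout' => 30,          // guess: total budget, comfortably above the ~7 s transfer estimate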
Questions
How would one go about debugging this further? I'm open to any suggestions at this moment. Any hints, suggestions, or help will be much appreciated!