Large file upload using a REST API


I have created a small app which allows uploading a .gz file.

It lets me upload files smaller than 1 MB.

If the file size is between 1 MB and 2 MB, I get a timeout error.

My server is NGINX, and upload_max_filesize is set to 2M.
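For reference, an upload of this size has to pass several limits at once, on both the NGINX side and the PHP side; any one of them can reject the request or time it out. The values below are illustrative examples, not taken from your actual config:

```
# nginx.conf: maximum accepted request body (NGINX returns 413 if exceeded)
client_max_body_size 10M;

# php.ini: both limits must cover the whole upload
upload_max_filesize = 10M
post_max_size = 10M

# php.ini: time limits that can trigger timeouts on slow uploads
max_execution_time = 120
max_input_time = 120
```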

I have tried Connection: close, but no luck.

$app->post('/[{name}]', function ($request, $response, $args) {

$response->withHeader('Connection:close', '*');


Can you please guide me on what I am missing?


$app->post('/[{name}]', function ($request, $response, $args) {
    return $response->withHeader('Connection', 'close');

Not sure what you were trying to do with the * argument?
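Also note that `return` ends the route closure immediately, so any upload handling has to happen *before* the response is returned. A framework-free sketch of that pitfall (the `$handler` closure and its log entries are illustrative, not part of your app):

```php
<?php
// A route handler is just a closure: `return` ends it at once,
// so all the work must come before the final return.
$handler = function (): array {
    $log = [];
    $log[] = 'process upload';           // do the work first
    $log[] = 'attach Connection: close'; // then decorate the response
    return $log;                         // returning ends the handler
    $log[] = 'never reached';            // dead code after a return
};
```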

Hi JoeBengalen,

Thanks for the reply.

I am trying to upload and uncompress a .gz file.

I have used return $response->withHeader('Connection', 'close');

Below is the response I am getting:

The request has been aborted manually or because of the connection timeout. There was no response from the server, but the connection wasn't closed.

You can adjust the timeout in settings.

Try to:

add a "Connection: close" header, which should be used by the server to close the connection after it finishes generating the response

This is the complete code:

// Routes
$app->post('/[{name}]', function ($request, $response, $args) {

    $headers = request_headers();        // get headers
    $pheader = pull_headers($headers);   // parse headers

    $authorization = $pheader[0];
    $content_type  = $pheader[1];

    $file_name = $_FILES['gzip_file']['name'];    // uploaded file name

    $dstName  = "../public/gzipfile/";            // destination directory
    $temp     = explode(".", $file_name);
    $filename = round(microtime(true)) . '.csv';
    $src      = $_FILES["gzip_file"]["tmp_name"]; // temp upload path
    $des      = $dstName . $filename;

    uncompress($src, $des);                 // uncompress the gzip file
    //echo parse_transcational_csv($des);   // parse csv file
    //unlink_csv($des);                     // unlink csv file

    // return the header only after the work is done; returning earlier
    // would end the route before the upload is processed
    return $response->withHeader('Connection', 'close');
});
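The uncompress() helper isn't shown in the thread; a minimal sketch of what it might look like, with the signature assumed from the call above. It streams the .gz source in small chunks so large files never have to fit in memory at once:

```php
<?php
// Hypothetical uncompress(): stream a .gz source file to an
// uncompressed destination file in 8 KB chunks (requires zlib).
function uncompress(string $src, string $des): bool
{
    $in = gzopen($src, 'rb');   // zlib stream reader
    if ($in === false) {
        return false;
    }
    $out = fopen($des, 'wb');
    if ($out === false) {
        gzclose($in);
        return false;
    }
    while (!gzeof($in)) {
        fwrite($out, gzread($in, 8192)); // copy one chunk at a time
    }
    fclose($out);
    gzclose($in);
    return true;
}
```

Chunked streaming like this keeps memory flat regardless of file size, which matters once uploads grow past your PHP memory limit.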