PHP File Streaming with cat and passthru

by Jason on November 9, 2009

The problem

This weekend, I needed to figure out how to make an on-request, downloadable backup of an entire MySQL database (for Magento, in case you were wondering).
At first, I attempted the simplest approach: perform the backup via system() or exec(), then serve the file as a download using file_get_contents() or fopen()/fread(). This didn't work because PHP would run out of memory while reading the file. I even tried stream_get_contents(), with the same result.
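
A minimal sketch of that kind of approach (the path and filename here are just placeholders): file_get_contents() pulls the entire file into memory before a single byte is echoed, which is exactly what trips the memory limit on a large dump.

        //Sketch of the approach that fails on large files (placeholder paths)
        $dump = '/path/to/backup.sql.bz2';
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=\"backup.sql.bz2\"");
        //file_get_contents() reads the whole file into memory before echoing it
        echo file_get_contents($dump);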

Hmmm….

I'm sure there are other ways of getting this done, but my approach has been working flawlessly so far, and I thought I'd share it with you.

The solution

Here is the entire code segment which performs the backup, bzips the SQL file, then serves it as a download using passthru and cat.


        //Define where the backup will go
        $backup_folder = getenv("DOCUMENT_ROOT") . $settings['base_url'] . '/backup/';
        $backup_file   = 'aps_backup_' . date("Y-m-d-H-i-s") . '.sql.bz2';
        $backup_path   = $backup_folder . $backup_file;

        //Perform the database backup, piping the dump through bzip2
        $command = "mysqldump --opt -h$dbhost -u$dbuser -p$dbpass $dbname | bzip2 > $backup_path";
        error_log("Executing: $command");
        exec($command);

        //Serve the file as a download, streaming it with cat via passthru
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=\"$backup_file\"");
        passthru("cat $backup_path");

As you can see, the database backup command is executed, and the output is bzipped and stored in the "backup" folder.
Once the command completes, the headers are set up to serve a file download, and, using passthru, we simply cat the file.
This effectively streams the file as a download, which eliminates the memory-limit errors I was hitting when trying to open, read, and serve the large file myself.
Obviously, this wouldn’t work on Windows.

I'm curious whether anyone else has faced this problem, and what your solutions were.

5 comments

I know PHP-heads are fond of reinventing the wheel, but you may want to look into the X-Sendfile header.

http://www.adaniels.nl/articles/how-i-php-x-sendfile/
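
A minimal sketch, assuming Apache with mod_xsendfile installed and enabled (the path is just a placeholder):

        //With mod_xsendfile, PHP only sets headers; Apache streams the file itself
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=\"backup.sql.bz2\"");
        header("X-Sendfile: /path/to/backup.sql.bz2");
        exit;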

by Anonymous on November 9, 2009 at 6:16 pm. #

I did something like this before, but I didn't use passthru. I just read the file in small chunks of 8 KB and wrote out each chunk as I read it. Output buffering has to be disabled, of course.
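
Roughly like this (just a sketch, with a placeholder path):

        //Disable any output buffering so each chunk goes straight to the client
        while (ob_get_level()) {
            ob_end_clean();
        }
        //Read and echo the file 8 KB at a time; only one chunk is in memory at once
        $fp = fopen('/path/to/backup.sql.bz2', 'rb');
        while (!feof($fp)) {
            echo fread($fp, 8192);
            flush();
        }
        fclose($fp);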

Why write to a file at all? Why not passthru the output of ‘mysqldump | bzip2’ directly? That way the user doesn’t have to wait for the dump to complete.
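
Something like this, reusing the connection variables from the post above (again, just a sketch):

        //Stream the dump straight to the client; no temporary file on disk
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=\"backup.sql.bz2\"");
        passthru("mysqldump --opt -h$dbhost -u$dbuser -p$dbpass $dbname | bzip2");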

by Hongli Lai on November 10, 2009 at 1:02 am. #

Hi Anonymous, and Hongli – these are all great suggestions!

I've never dealt with serving large files for download like this before, so I'm a bit new to this arena.

The suggestions are much appreciated and I’ll try them both out!

by Jason on November 11, 2009 at 11:50 am. #

Why not split this operation into two separate steps? The first one creates the file, and the second does the import?

by Lempert on November 26, 2010 at 7:46 am. #

How about fpassthru? Does it eat up all the memory?

by Wojtek on January 25, 2012 at 3:50 pm. #
