PHP File Streaming with cat and passthru

by Jason on November 9, 2009

The problem

This weekend, I needed to figure out how to make an on-request, downloadable backup of an entire MySQL database (for Magento, in case you were wondering).
At first, I attempted the simplest approach: perform the backup via system() or exec(), then serve the file as a download using file_get_contents() or fopen()/fread(). This didn't work because PHP would run out of memory when reading the file. I even tried stream_get_contents(), with the same result.
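For reference, the failing approach looked roughly like this (a sketch; the function name and path are illustrative, not from the original code). The whole file is read into a PHP string before anything is sent, so a dump larger than memory_limit dies with an out-of-memory fatal:

```php
<?php
// Naive download handler (sketch). file_get_contents() reads the
// ENTIRE file into a PHP string first, so a multi-gigabyte dump
// exhausts memory_limit before a single byte reaches the client.
function serve_naive($path)
{
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    echo file_get_contents($path);
}
```

This works fine for small files, which is why the problem only surfaces once the database grows.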


I'm sure there are other ways of getting this done, but my approach has been working flawlessly so far, and I thought I'd share it with you.

The solution

Here is the entire code segment, which performs the backup, bzips the SQL file, then serves it as a download using passthru and cat.

		// Define where the backup will go
		$backup_folder = getenv("DOCUMENT_ROOT") . $settings['base_url'] . '/backup/';
		$backup_file = 'aps_backup_' . date("Y-m-d-H-i-s") . '.sql.bz2';
		$backupFile = $backup_folder . $backup_file;
		// Perform the database backup, piping the dump through bzip2
		$command = "mysqldump --opt -h$dbhost -u$dbuser -p$dbpass $dbname | bzip2 > $backupFile";
		error_log("Executing: $command");
		system($command);
		// Serve the file as a download
		header("Content-type: application/octet-stream");
		header("Content-Disposition: attachment; filename=\"$backup_file\"");
		passthru("cat $backupFile");

As you can see, the database backup command is executed, and the output is bzipped and stored in the “backup” folder.
Once the command completes, the headers are set up to serve a file download and, using passthru, we simply cat the file.
This effectively streams the file as a download, thus alleviating the memory limit errors I was receiving when trying to open/read/serve the large file download.
Obviously, this wouldn’t work on Windows.

I'm curious whether anyone else has faced this problem, and what your solutions were.