Catching errors when downloading massive files via PHP

Twicketattile

New Member
I am attempting to download fairly large files (up to, and possibly over, 1 GB) from a remote HTTP server through a PHP script. I am using fgets() to read the remote file line by line and write the contents into a local file created with tempnam(). However, the downloads of very large files (several hundred MB) are failing. Is there any way I can rework the script to catch the errors that are occurring?

Because the download is only part of a larger overall process, I would like to handle the downloads and deal with errors inside the PHP script rather than shelling out to wget or some other external process.

This is the script I am using now:

[code]
$tempfile = fopen($inFilename, 'w');
$handle = @fopen("https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID . "&format=" . $format . "&zipped=true", "r");
$firstline = '';
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        if ($firstline == '') $firstline = $buffer;
        fwrite($tempfile, $buffer);
    }
    fclose($handle);
    fclose($tempfile);
    return $firstline;
} else {
    throw new Exception('Unable to open remote file.');
}
[/code]
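For reference, here is a rough rework I have been sketching, not yet tested against the real endpoint. It keeps the same loop but checks every I/O call so a failure throws instead of silently producing a truncated file. The 'wb' mode, dropping the @ suppression on fopen(), and the set_time_limit(0) call are my own guesses (the last one assumes max_execution_time might be the culprit); variable names match my script above.

[code]
// Sketch only: same download loop, but every I/O call is checked so a
// failure throws instead of silently truncating the file.
set_time_limit(0);  // guess: long downloads may be hitting max_execution_time

$tempfile = fopen($inFilename, 'wb');   // 'wb' since the zipped payload is binary
if ($tempfile === false) {
    throw new Exception('Unable to open local temp file.');
}

$url = "https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID
     . "&format=" . $format . "&zipped=true";
$handle = fopen($url, 'rb');            // no @, so fopen warnings stay visible in the log
if ($handle === false) {
    fclose($tempfile);
    throw new Exception('Unable to open remote file.');
}

$firstline = '';
while (($buffer = fgets($handle, 4096)) !== false) {
    if ($firstline === '') {
        $firstline = $buffer;
    }
    if (fwrite($tempfile, $buffer) === false) {
        fclose($handle);
        fclose($tempfile);
        throw new Exception('Write to local file failed (disk full?).');
    }
}
// fgets() returns false both at end-of-file and on error; feof() tells them apart
if (!feof($handle)) {
    fclose($handle);
    fclose($tempfile);
    throw new Exception('Read from remote file failed before end of file.');
}

fclose($handle);
fclose($tempfile);
return $firstline;
[/code]

If that still fails silently, my fallback idea is the cURL extension: CURLOPT_FILE can stream the response straight into the temp file handle, and curl_error() would at least give me a concrete error message to work with.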
 