How to Import Large MySQL Data Files
One Step: Use BigDump
I've been so frustrated by the file size limits on uploading large MySQL data files, and BigDump is the answer. Works like a charm.
More details below...
* BigDump Website
* Import a MySQL data dump with BigDump | drupal.org
Basic Idea
1. Enter your database name and login credentials into the BigDump PHP script (see the sketch below).
2. Upload the script file and the MySQL data file to a writable, web-accessible directory on your website.
3. Run the script by navigating to its URL on your website.
Result: the script loads the data by breaking the large file into smaller chunks and importing each chunk into the database in a separate session. Very smooth.
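For reference, the database settings at the top of bigdump.php look roughly like this. The variable names below match the ones in the script's configuration section (double-check against your copy, as they can differ between versions); the values are placeholders you replace with your own host, database name, and credentials:

    <?php
    // Database configuration in bigdump.php (placeholder values; use your own)
    $db_server   = 'localhost';       // MySQL host, often 'localhost' on shared hosting
    $db_name     = 'my_database';     // target database the dump will be imported into
    $db_username = 'my_db_user';      // MySQL user with permission to create and fill tables
    $db_password = 'my_db_password';  // that user's password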
USAGE
1. Open bigdump.php in a text editor and adjust the database configuration (see the sketch under Basic Idea above)
2. Drop the old tables in the target database if your dump doesn't contain "DROP TABLE" statements (you can use phpMyAdmin)
3. Create the working directory (e.g. dump) on your web server
4. If you want to upload the dump files directly from the web browser, give the script write permission on the working directory (e.g. chmod 777 on a Linux-based system). Browser uploads are limited to the file size allowed by the web server's current PHP configuration. Alternatively, you can upload files of any size via FTP.
5. Upload bigdump.php and the dump files (*.sql or *.gz) via FTP to the working directory (use TEXT mode for bigdump.php and dump.sql but BINARY mode for dump.gz if uploading from MS Windows).
6. Run bigdump.php from your browser via a URL like http://www.yourdomain.com/dump/bigdump.php. You can then select the file to be imported from the listing of your working directory.
7. BigDump will start each subsequent import session automatically if JavaScript is enabled in your browser.
8. Relax and wait for the script to finish. Do not close the browser window!
9. IMPORTANT: Remove bigdump.php and your dump files from your server once the import is complete.
Note 1: BigDump will fail when processing large tables exported with extended inserts. An extended insert packs all of a table's rows into a single SQL query, which BigDump cannot split. In most cases BigDump will stop if a query includes too many lines, but if PHP complains that the allowed memory size is exhausted, or MySQL reports that the server has gone away, your dump probably also contains extended inserts. Turn off extended inserts when exporting the database from phpMyAdmin.
Note 2: Some web servers disallow script execution in directories with write permission for security reasons. If you changed the permissions on the working directory and you get a server error when running the script, restore the directory permissions to their normal state.
Note 3: If timeout errors still occur, you may need to adjust (usually lower) the $linespersession setting in bigdump.php (see the sketch after Note 4).
Note 4: If the MySQL server becomes overloaded, you can use the $delaypersession setting to make the script sleep for some milliseconds (or more) before starting the next session. This setting only works if JavaScript is activated.
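A minimal sketch of the two settings from Notes 3 and 4, also in bigdump.php's configuration section. The numbers here are only illustrative starting points, not recommendations from the BigDump authors:

    <?php
    // Per-session tuning in bigdump.php (illustrative values only)
    $linespersession = 3000;  // dump lines processed per session; lower this if you hit timeouts
    $delaypersession = 500;   // milliseconds to sleep between sessions; raise this if MySQL is overloaded
                              // (takes effect only when JavaScript drives the automatic session restarts)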
Note 5: BigDump is currently not able to restore a single dump file that contains multiple databases (switched with the USE statement).
Note 6: If you experience problems with non-Latin characters while using BigDump, adjust the $db_connection_char_set configuration variable in bigdump.php to match the encoding of your dump file.
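For example, assuming your dump file is UTF-8 encoded, the Note 6 setting would look roughly like this (the exact value must match your dump's encoding, e.g. latin1 for an ISO-8859-1 dump):

    <?php
    // Connection character set in bigdump.php; must match the dump file's encoding
    $db_connection_char_set = 'utf8';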