In a new post to his blog, Sameer Borate shares a handy bit of code you can use to split a large MySQL dump file into smaller, easier-to-digest chunks.
One of the frustrating things about working with MySQL is importing large SQL dump files. Either you get a 'max execution time exceeded' error from PHP or a 'max_allowed_packet' error from MySQL. In a recent task I needed to import a table of around a million records on a remote host, which quickly became an exercise in frustration due to various limitations on the server. SSH was of no help, as changing the configuration files was restricted to the root user. My last resort was to split the huge 'INSERT' statements into smaller files.
His script needs a little extra time to run (he sets the max execution time to 600 seconds) and reads the SQL file in line by line, splitting it back out into multiple files ("dump-split-*") based on a "count" value. Depending on the size of your files, using something like this might not be an option; you might need something more like the command-line "split" utility to keep the work outside of PHP's memory management altogether.
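The post includes the full script; a minimal sketch of the idea might look like the following (the input file name, the lines-per-chunk threshold, and the statement-boundary check are assumptions for illustration, not the author's exact code):

<?php
// Give the script extra time to run, as the post does (600 seconds).
set_time_limit(600);

$source       = 'dump.sql';  // assumed input file name
$linesPerFile = 5000;        // assumed "count" threshold per chunk
$fileIndex    = 1;
$lineCount    = 0;

$in  = fopen($source, 'r');
$out = fopen("dump-split-{$fileIndex}.sql", 'w');

// Read the dump line by line so the whole file never sits in memory at once.
while (($line = fgets($in)) !== false) {
    fwrite($out, $line);
    $lineCount++;

    // Rotate to a new "dump-split-*" file once the threshold is reached, but
    // only after a line that terminates a statement, so a multi-line INSERT
    // is never cut in half (an assumption; the original may split differently).
    if ($lineCount >= $linesPerFile && substr(rtrim($line), -1) === ';') {
        fclose($out);
        $fileIndex++;
        $out = fopen("dump-split-{$fileIndex}.sql", 'w');
        $lineCount = 0;
    }
}

fclose($in);
fclose($out);
?>

Each resulting dump-split-N.sql file can then be imported on its own, keeping every run comfortably under the server's execution-time and packet-size limits.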