posted by Sakila The Librarian on Mon 07 Sep 2009 12:11 UTC
Or, There is a Use for GUI Tools!

Recently I was working on a problem for a customer. They are converting a server with two InnoDB data files to "innodb_file_per_table":

$ ls -l ibdata*
-rwxr-xr-x 1 mysql mysql 2.0G Oct 14 13:10 ibdata1
-rwxr-xr-x 1 mysql mysql 350G Oct 14 13:10 ibdata2

I honestly don't recall ever seeing (or hearing about) so large a data file.

The method chosen for conversion boils down to this:

1. stop and start the server to enable "innodb_file_per_table"
2. alter all tables to MyISAM
3. stop the server and delete the ibdata files
4. restart the server
5. convert the tables back to InnoDB
6. add the foreign keys back

This post isn't about how we did it, or about whether or not it was the best way, or anything like that. This post is really about the last two steps.

There were over 500 InnoDB tables and something like 375 foreign keys, so I needed to somehow generate the "ALTER TABLE" statements for those 500 tables. To drop the foreign keys, I used the following "SELECT" statement:

SELECT CONCAT("mysql -u root -pPASS ", table_schema,
              " -e 'ALTER TABLE ", table_name,
              " DROP FOREIGN KEY ", constraint_name, ";' &") AS list_of_fks
  FROM information_schema.KEY_COLUMN_USAGE
 WHERE REFERENCED_TABLE_NAME IS NOT NULL
 ORDER BY TABLE_NAME, COLUMN_NAME;

This created a list of statements that I could then put in a batch file, edit a little, and run in parallel.

I used the following "SELECT" statement to find all InnoDB tables and prepare a command to convert each of them to MyISAM:

SELECT CONCAT("mysql -u root -pPASS ", TABLE_SCHEMA,
              " -e 'ALTER TABLE ", TABLE_NAME, " ENGINE=MYISAM;' &")
  FROM INFORMATION_SCHEMA.TABLES
 WHERE ENGINE='INNODB';

Again, the output can be put in a batch file, modified a little, and run in parallel.

Once everything was converted to MyISAM, I shut down the database, removed the InnoDB log and data files, and modified the "my.cnf" so I had "innodb_file_per_table" enabled and a much smaller InnoDB data file. That was the fairly straightforward part. Now the fun began.
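The "my.cnf" change described above might look roughly like this; the 10M autoextending data file is an illustrative baseline (it happens to be the server default), not a value taken from the post:

```ini
[mysqld]
# put each table's data and indexes in its own .ibd file
innodb_file_per_table
# a much smaller shared data file in place of the old 2.0G + 350G pair
innodb_data_file_path = ibdata1:10M:autoextend
```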
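As an aside, the "put the output in a batch file and run in parallel" trick above can be sketched roughly like this. This is my own illustration, not the author's actual script: each generated line already ends in "&", so sourcing the file backgrounds every statement at once, and "wait" blocks until they all finish. Stand-in echo commands are used here so the sketch runs without a server; the real file would contain the generated "mysql -u root -pPASS db -e '...' &" lines.

```shell
#!/bin/sh
# Hypothetical sketch: alters.sh stands in for the saved CONCAT output.
# Each line ends in '&', so every command is launched in the background.
cat > alters.sh <<'EOF'
echo "ALTER TABLE t1 DROP FOREIGN KEY fk_t1;" &
echo "ALTER TABLE t2 ENGINE=MYISAM;" &
EOF
. ./alters.sh   # source the file so the jobs belong to this shell
wait            # block until all backgrounded commands have finished
```

With 500-odd statements, sourcing the whole file fires everything at once; editing the file "a little" to group the lines into smaller batches (with a `wait` between groups) keeps the server from being swamped.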