Dumping a large live database

Dumping a large database while it is live requires a slightly different command:

mysqldump --single-transaction very_large_db > backup_of_db.sql

This dumps the database (or table) at a consistent point in time without locking the database. Note that --single-transaction gives a consistent snapshot only for InnoDB tables.
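A fuller version of the command above. The extra flags and the explicit user option are my assumptions, not part of the original note: --quick streams rows instead of buffering whole tables in memory, and --routines includes stored procedures and functions in the dump.

```shell
# Dump a live database at a consistent point in time without locking it.
# --single-transaction works for InnoDB tables; "username" and
# "very_large_db" are placeholders.
mysqldump -u username -p \
  --single-transaction \
  --quick \
  --routines \
  very_large_db > backup_of_db.sql
```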


Transfer database from one server to another

Sometimes you want to copy a database from one server to another. This can be painful, but I found an easy way to do it from a few other blogs:


$ mysqldump -u username -p'password' db-name | ssh user@remote.box.com mysql -u username -p'password' db-name
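One variation (my addition, not from the original blogs) compresses the stream, which can speed the transfer up considerably over a slow link:

```shell
# Pipe the dump through gzip before it crosses the network, and
# decompress it on the remote side before feeding it to mysql.
# Hosts, usernames, passwords, and db-name are placeholders.
mysqldump -u username -p'password' db-name \
  | gzip -c \
  | ssh user@remote.box.com "gunzip -c | mysql -u username -p'password' db-name"
```

Alternatively, ssh -C enables compression on the SSH connection itself, without the explicit gzip/gunzip pair.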


Change MySQL InnoDB to one file per table

This will create one data file per table instead of a single large InnoDB data file (ibdata1).

cd /usr/local/mysql/support-files/
sudo cp my-huge.cnf /etc/my.cnf
sudo vi /etc/my.cnf

Add this under the [mysqld] section:

innodb_file_per_table

Restart MySQL, then create the tables as required.
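To confirm the setting took effect after the restart, you can query the server variable. This check, and the ALTER TABLE rebuild hint, are my additions rather than part of the original note:

```shell
# Should print innodb_file_per_table ON once the setting is active
mysql -u root -p -e "SHOW VARIABLES LIKE 'innodb_file_per_table';"

# Tables created before the change stay in the shared ibdata file
# until they are rebuilt, e.g. (hypothetical db/table names):
# mysql -u root -p -e "ALTER TABLE mydb.some_table ENGINE=InnoDB;"
```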

2010 EC2 Setup Part 2

Setting up MySQL

1) sudo apt-get install mysql-server

2) Enter the root password when prompted

3) sudo service mysql start
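A quick sanity check after the steps above (my addition): make sure the server is running and the root password works.

```shell
# Confirm the service is up and that you can log in with the
# root password chosen during installation
sudo service mysql status
mysql -u root -p -e "SELECT VERSION();"
```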