
I have a local MySQL database called intranet_production, and on a remote server (to which I have root access, both to the server and to MySQL) there is a database called extranet.

Within the remote extranet database there are dozens of tables. I only need around five of them, so importing the entire database seems excessive.

I am looking for the best way of importing these tables; once the import is working I will set it up as a cron job that runs every 24 hours. For now, the remote data can simply overwrite the local copies, as we aren't writing anything to the imported tables.

So, given that the local database is called intranet_production and the remote database is called extranet, what's the best way to import tables called, for example, table1, table2 and table3?

I've come across the following method, but I believe it has to be run on the remote server to export the table, whereas I need to run the cron job on the local server.

mysqldump db-name foo | ssh user@remote.box.com mysql bar

Any pointers would be appreciated!

dannymcc

2 Answers


You can write a small script that does the following:

# On the remote host, dump only the tables you need
ssh user@remote.box.com "mysqldump db table1 table2 > /tmp/your_dump.sql"
# Copy the dump file to the local machine
rsync user@remote.box.com:/tmp/your_dump.sql /tmp/
# Load the dump into the local database
mysql db < /tmp/your_dump.sql
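
To run this every 24 hours, as the question mentions, you can save the three commands into a script on the local server and schedule it from cron. A minimal sketch, assuming the script is saved at the hypothetical path /usr/local/bin/pull_extranet.sh and made executable:

# Hypothetical crontab entry on the local server: run the sync script daily at 02:00
0 2 * * * /usr/local/bin/pull_extranet.sh >> /var/log/pull_extranet.log 2>&1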

Optionally, you can compress the file during transfer for slow links.
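
For example, one way to do that, reusing the same hypothetical host and database names as the script above, is to gzip the dump on the remote host, copy the smaller file, then decompress it into mysql locally:

# Compress the dump remotely, transfer the smaller file, then decompress and load it
ssh user@remote.box.com "mysqldump db table1 table2 | gzip > /tmp/your_dump.sql.gz"
rsync user@remote.box.com:/tmp/your_dump.sql.gz /tmp/
gunzip < /tmp/your_dump.sql.gz | mysql db

Alternatively, rsync's -z flag compresses the stream in transit without changing the dump commands at all.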

Khaled

You can use mysqldump with remote hosts:

mysqldump -h database.server.com -u username -pPassword dbName tblName > local.file.sql

Instead of redirecting to a file, you can just pipe it to mysql:

mysqldump <args> | mysql -u uname -pPwd localDbName
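
Putting that together for the example in the question, the whole pull-and-load could be a single pipeline along these lines (the host name and credentials are placeholders, not values from the question):

# Hypothetical one-liner: dump three tables from the remote 'extranet' database
# and load them straight into the local 'intranet_production' database
mysqldump -h database.server.com -u username -pPassword extranet table1 table2 table3 \
  | mysql -u localuser -pLocalPassword intranet_production

This assumes the remote MySQL server accepts connections from the local machine; if it only listens locally, the SSH-based approach in the other answer is the way to go.
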
mulaz