205

I'm trying to import a gzipped SQL file into mysql directly. Is this the right way?

mysql -uroot -ppassword mydb > myfile.sql.gz
7ochem
Alex

13 Answers

350
zcat /path/to/file.sql.gz | mysql -u 'root' -p your_database

Your original command (`mysql -uroot -ppassword mydb > myfile.sql.gz`) will write the output of the mysql command on stdout into the file myfile.sql.gz, which is most probably not what you want. Additionally, the command above will prompt you for the password of the MySQL user "root".

joschi
  • As a good security practice, I would put my password on the command line, I would let mysql ask for it. – Prof. Moriarty May 03 '10 at 08:19
  • Or even better: create `~/.my.cnf` with the credentials. ;) – joschi May 03 '10 at 10:46 (a minimal sketch follows after these comments)
  • As @Prof. Moriarty explains, you can modify the command to not use the password via `zcat /path/to/file.sql.gz | mysql -u 'root' -p your_database`. It will know the last parameter is the database you wish to use, not your password. – bafromca Jan 20 '15 at 00:20
  • To slightly correct @Prof.Moriarty's comment, a good security practice would be to *not* put my password on the command line (where it will get stored in history, or seen over your shoulder), and let MySQL ask for it. The -p flag alone will cause MySQL to ask at a prompt for the password. – George Aug 03 '16 at 08:00
  • Good point. I have updated the original answer. – joschi Aug 03 '16 at 15:12
  • How would you include some additional SQL, e.g. to disable foreign key checks before the file? – malhal Oct 27 '16 at 12:49 (one approach is sketched after these comments)
  • +1 for something that leaves the damn database dump compressed – Dmitri DB Jan 12 '17 at 19:18
  • If you run into issues with `zcat` you may also try `gzcat`. On my MacBook I ran into the error: `zcat: can't stat: file.sql.gz (file.sql.gz.Z): No such file or directory` – wsams Sep 14 '17 at 22:13
  • This chokes on mac because for whatever reason, OSX always appends a `.Z` to the file name. You can use `gunzip -c` in place of `zcat` though and it works fine. – mopsyd Apr 30 '18 at 17:58
  • Hi. From a performance point of view, is it faster to import the dump from the one-liner way? Or is it faster to uncompress to file.sql, then import? Thank you in advance for your opinion. – DarkCowboy Jun 06 '18 at 09:16
  • @DarkCowboy That depends on whether your CPU, your disk I/O, or the database I/O (which might be a combination of the previous two) is the bottleneck. – joschi Jun 07 '18 at 09:24
  • It would be nice if they created a `sourcegz` mysql command. – Marinos An Oct 03 '18 at 18:20
  • Any GUI tools for this purpose? – Banee Ishaque K May 25 '19 at 07:15
  • `pv file.sql.gz | zcat | mysql -u 'root' -p your_database` to see progress – Muhammad Omer Aslam Sep 23 '21 at 14:37
  • My database is about `10 GB` (data+index) and the .gz file is about `1.5 GB`. How long might this import take? – Soheil Nov 20 '21 at 13:40
  • @Soheil there’s no way one can answer that question without knowing the format of your data and the tech specs of your server. – bfontaine Nov 23 '21 at 14:52
  • @bfontaine yes! More data was needed to make a judgement. I was using 24 GB RAM and a 6-core CPU, and my tables had many indices; it took about 5 hours! – Soheil Nov 24 '21 at 05:42
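
Following up on the `~/.my.cnf` suggestion in the comments above, here is a minimal sketch (the user and password values are placeholders): the MySQL client reads this file automatically, so you can drop -p from the command, and the file should be readable only by you (chmod 600 ~/.my.cnf):

[client]
user=root
password=YourPasswordHere

With that in place, `zcat /path/to/file.sql.gz | mysql your_database` imports without prompting.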
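
As for the comment about running extra SQL (such as disabling foreign key checks) before the dump, one possible approach is to concatenate it on stdin; the file and database names are just the examples from this answer:

( echo "SET FOREIGN_KEY_CHECKS=0;"; zcat /path/to/file.sql.gz; echo "SET FOREIGN_KEY_CHECKS=1;" ) | mysql -u 'root' -p your_database

The -p prompt still works here because mysql reads the password from the terminal rather than from stdin.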
117

To display a progress bar while importing a sql.gz file, download pv and use the following:

pv mydump.sql.gz | gunzip | mysql -u root -p <database name>

On CentOS/RHEL, you can install pv with `yum install pv`.

On Debian/Ubuntu, `apt-get install pv`.

On macOS, `brew install pv`.

Banjer
  • `pv` seems to be in the Ubuntu repos too (at least in 12.04 LTS it is), but again you need to do `sudo apt-get install pv` to get it. Thanks Banjer, this is perfect for big database imports! – toon81 Feb 19 '15 at 15:25
  • I had to run `pv mydump.sql.gz | gunzip | mysql -u root my_database_name`. That was because I was importing tables and I don't have a password set for my root user – Cristiano Mendonça Feb 09 '16 at 22:19
  • On macOS, `brew install pv` – score Apr 28 '16 at 19:30
  • Great answer! I had to import a huge database (3 GB compressed, 50 GB plaintext) and having a progress bar really helped – borisano Aug 17 '22 at 14:34
69

The simplest way is to unzip the database file before importing. Also, as mentioned by @Prof. Moriarty, you shouldn't specify the password in the command (you'll be asked for it). This command, taken from webcheatsheet, will unzip and import the database in one go:

gunzip < myfile.sql.gz | mysql -u root -p mydb
icc97
20

If zcat fails with an error message that shows the file name with an extra .Z suffix, try using gzcat instead, as described at https://stackoverflow.com/questions/296717/zcat-wont-unzip-files-properly
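
For example, reusing the file and database names from the accepted answer:

gzcat /path/to/file.sql.gz | mysql -u 'root' -p your_database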

6

On macOS, I used this:

zcat < [Database].sql.gz | mysql -u root -p [Database Name in MySQL]

Enter your password, and voila!

Qasim
4

Also check whether there is a USE statement in the SQL file. Specifying the database on the command line doesn't guarantee that the data ends up there if a different destination is specified within the SQL file itself (a quick check is sketched below).

ikso
  • You just need to extend the command like so: `pv mydump.sql.gz | gunzip | mysql -u root -p your_database`. The accepted answer uses this approach. – bafromca Jan 20 '15 at 00:17
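
One way to make that check without extracting the whole dump is to peek at its first lines (the file name here is just an example):

zcat /path/to/file.sql.gz | head -n 50 | grep -iE '^(USE|CREATE DATABASE)'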
4

You can use the `-c` (`--stdout`, `--to-stdout`) option of the gunzip command.

For example:

gunzip -c file.sql.gz | mysql -u root -p database
Archemar
3

On macOS I've been using the following one-liner, with no need to install additional programs apart from the MySQL client itself.

$ cat /path/to/file.sql.gz | gzip -d | mysql -u root <db_name>

The first command, cat, prints the file. Its output, the file contents, is sent as input to the next command, gzip. With the -d option, gzip decompresses the input passed to it and writes the result to its output, which is finally used as input for the MySQL client, the mysql program. This output-to-input chaining is provided by the | (pipe) operator in bash and other shells.

This one-liner can also be used on popular Linux distros such as Ubuntu. I'm not sure whether gzip is always preinstalled, but if it isn't, it can easily be installed with:

$ sudo apt install gzip
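
For what it's worth, the leading cat is not strictly required; an equivalent form of the same pipeline is:

$ gzip -d < /path/to/file.sql.gz | mysql -u root <db_name>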
3

If the database dump is small, it can be simpler to extract it first and then import it. Here is the extraction command:

tar -xf dbname.sql.tar.gz 

Here is the import command:

mysql -u username -p new_database < dbname.sql
1

For bzip2 compressed files (.sql.bz2), use:

bzcat <file> | mysql -u <user> -p <database>

or, to see a progress bar:

pv <file> | bunzip2 | mysql -u <user> -p <database>

TimSparrow
0

Let's say you need to do all of this from a user_data provisioning script, MySQL installation included; try this:

# Run apt non-interactively
export DEBIAN_FRONTEND=noninteractive
apt-get update -yq
# Install MySQL only if it is missing, then stream the dump straight from S3 into mysql
dpkg -l | grep mysql-server || (echo "Installing MySQL..." \
    && apt-get install -yq mysql-server \
    && echo -e "\n[mysqld]\nbind-address=0.0.0.0\nskip-name-resolve=1" | tee -a /etc/mysql/my.cnf \
    && aws s3 cp --quiet s3://your-bucket/mysqldump_all_databases.sql.gz - | zcat | mysql -uroot \
    && systemctl restart mysql)
0

If you are using Windows, you can install gzip from Chocolatey:

choco install gzip -y

or download gzip from this link: http://gnuwin32.sourceforge.net/packages/gzip.htm

After gzip is installed, run this command:

gzip -cd c:\backups\myDbBackup.sql.gz | mysql -u[user] -p[password] [myDataBase]

0

To export to a .sql.gz file, the command is:

mysqldump -u username -p database | gzip > database.sql.gz

To import the .sql.gz file, the command is:

gunzip < database.sql.gz | mysql -u username -p newdatabase

Shailesh Thapa