
This morning I got a warning on my FreeNAS 9.3 machine about "Not enough space":

warning: The capacity for the volume 'SeanVolume' is currently at 85%, while the recommended value is below 80%.

Running a scrub (which takes quite a while) was of no use.

I logged in to the FreeNAS host over SSH and gathered some information:

[root@freenas] ~# zpool list
NAME           SIZE  ALLOC   FREE  EXPANDSZ   FRAG    CAP  DEDUP  HEALTH  ALTROOT
SeanVolume    21.8T  18.6T  3.16T         -    38%    85%  1.00x  ONLINE  /mnt
freenas-boot   111G   680M   110G         -      -     0%  1.00x  ONLINE  -

It shows the ALLOC space at 18.6T (85% capacity), but du -sh reports only 6.7T:

[root@freenas] ~# du -sh /mnt/SeanVolume/
6.7T    /mnt/SeanVolume/

Additional information:

[root@freenas] ~# zpool status
  pool: SeanVolume
 state: ONLINE
  scan: scrub repaired 0 in 20h32m with 0 errors on Thu Jul 16 07:48:34 2015
config:

        NAME                                            STATE     READ WRITE CKSUM
        SeanVolume                                      ONLINE       0     0     0
          raidz3-0                                      ONLINE       0     0     0
            gptid/f4986ea8-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f51a05f7-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f595d78e-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f62490d4-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f6a8e41e-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f7266471-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f7f3bf28-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
            gptid/f8b8fa7b-f822-11e4-a7d4-d05099265144  ONLINE       0     0     0
        cache
          gptid/f901ec00-f822-11e4-a7d4-d05099265144    ONLINE       0     0     0
          gptid/dfa53351-1baa-11e5-ba0e-d05099265144    ONLINE       0     0     0

errors: No known data errors

Result of df -TH command:

[root@freenas] ~# df -TH
Filesystem                                                  Type      Size    Used   Avail Capacity  Mounted on
freenas-boot/ROOT/default                                   zfs       115G    698M    114G     1%    /
devfs                                                       devfs     1.0k    1.0k      0B   100%    /dev
tmpfs                                                       tmpfs      33M    5.5M     28M    16%    /etc
tmpfs                                                       tmpfs     4.2M    8.2k    4.2M     0%    /mnt
tmpfs                                                       tmpfs      11G     67M     11G     1%    /var
freenas-boot/grub                                           zfs       114G    8.1M    114G     0%    /boot/grub
SeanVolume                                                  zfs       1.6T    822k    1.6T     0%    /mnt/SeanVolume
SeanVolume/CandyDataset                                     zfs       1.6T    299k    1.6T     0%    /mnt/SeanVolume/CandyDataset
SeanVolume/CandyDataset/Applications                        zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/CandyDataset/Applications
SeanVolume/CandyDataset/Documents                           zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/CandyDataset/Documents
SeanVolume/CandyDataset/Downloads                           zfs       2.9T    1.3T    1.6T    46%    /mnt/SeanVolume/CandyDataset/Downloads
SeanVolume/CandyDataset/Music                               zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/CandyDataset/Music
SeanVolume/CandyDataset/Pictures                            zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/CandyDataset/Pictures
SeanVolume/CandyDataset/Temporary                           zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/CandyDataset/Temporary
SeanVolume/CandyDataset/Videos                              zfs       1.8T    221G    1.6T    12%    /mnt/SeanVolume/CandyDataset/Videos
SeanVolume/PublicDataset                                    zfs       1.6T    243k    1.6T     0%    /mnt/SeanVolume/PublicDataset
SeanVolume/PublicDataset/Applications                       zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Applications
SeanVolume/PublicDataset/Documents                          zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Documents
SeanVolume/PublicDataset/Downloads                          zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Downloads
SeanVolume/PublicDataset/Music                              zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Music
SeanVolume/PublicDataset/Pictures                           zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Pictures
SeanVolume/PublicDataset/Temporary                          zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Temporary
SeanVolume/PublicDataset/Videos                             zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/PublicDataset/Videos
SeanVolume/SeanDataset                                      zfs       3.9T    2.3T    1.6T    60%    /mnt/SeanVolume/SeanDataset
SeanVolume/SeanDataset/AppData                              zfs       1.6T    224k    1.6T     0%    /mnt/SeanVolume/SeanDataset/AppData
SeanVolume/SeanDataset/Applications                         zfs       1.6T    261k    1.6T     0%    /mnt/SeanVolume/SeanDataset/Applications
SeanVolume/SeanDataset/Documents                            zfs       3.0T    1.5T    1.6T    48%    /mnt/SeanVolume/SeanDataset/Documents
SeanVolume/SeanDataset/Downloads                            zfs       2.2T    647G    1.6T    29%    /mnt/SeanVolume/SeanDataset/Downloads
SeanVolume/SeanDataset/Music                                zfs       1.6T    411k    1.6T     0%    /mnt/SeanVolume/SeanDataset/Music
SeanVolume/SeanDataset/Pictures                             zfs       1.6T    2.2M    1.6T     0%    /mnt/SeanVolume/SeanDataset/Pictures
SeanVolume/SeanDataset/Saved Games                          zfs       1.6T    243k    1.6T     0%    /mnt/SeanVolume/SeanDataset/Saved Games
SeanVolume/SeanDataset/Temporary                            zfs       1.6T    533k    1.6T     0%    /mnt/SeanVolume/SeanDataset/Temporary
SeanVolume/SeanDataset/Videos                               zfs       2.4T    850G    1.6T    35%    /mnt/SeanVolume/SeanDataset/Videos
SeanVolume/jails                                            zfs       1.6T    710M    1.6T     0%    /mnt/SeanVolume/jails
SeanVolume/jails/.warden-template-pluginjail                zfs       1.6T    619M    1.6T     0%    /mnt/SeanVolume/jails/.warden-template-pluginjail
SeanVolume/jails/.warden-template-pluginjail-9.3            zfs       1.6T    619M    1.6T     0%    /mnt/SeanVolume/jails/.warden-template-pluginjail-9.3
SeanVolume/jails/pluginjail                                 zfs       1.6T    660M    1.6T     0%    /mnt/SeanVolume/jails/pluginjail
SeanVolume/ownCloudDataset                                  zfs       1.8T    259G    1.6T    14%    /mnt/SeanVolume/ownCloudDataset
SeanVolume/.system                                          zfs       1.6T    3.4M    1.6T     0%    /var/db/system
SeanVolume/.system/cores                                    zfs       1.6T    1.4M    1.6T     0%    /var/db/system/cores
SeanVolume/.system/samba4                                   zfs       1.6T      6M    1.6T     0%    /var/db/system/samba4
SeanVolume/.system/syslog-cd1fc29ce94d4a81a24df77359252261  zfs       1.6T      6M    1.6T     0%    /var/db/system/syslog-cd1fc29ce94d4a81a24df77359252261
SeanVolume/.system/rrd-cd1fc29ce94d4a81a24df77359252261     zfs       1.6T    224k    1.6T     0%    /var/db/system/rrd-cd1fc29ce94d4a81a24df77359252261
devfs                                                       devfs     1.0k    1.0k      0B   100%    /mnt/SeanVolume/jails/pluginjail/dev
procfs                                                      procfs    4.1k    4.1k      0B   100%    /mnt/SeanVolume/jails/pluginjail/proc
SeanVolume/jails/customplugin_2                             zfs       1.6T    1.4G    1.6T     0%    /mnt/SeanVolume/jails/customplugin_2
SeanVolume/SeanDataset/ISOImage                             zfs       1.6T    318k    1.6T     0%    /mnt/SeanVolume/SeanDataset/ISOImage
devfs                                                       devfs     1.0k    1.0k      0B   100%    /mnt/SeanVolume/jails/customplugin_2/dev
procfs                                                      procfs    4.1k    4.1k      0B   100%    /mnt/SeanVolume/jails/customplugin_2/proc
/mnt/SeanVolume/ownCloudDataset                             nullfs    1.8T    259G    1.6T    14%    /mnt/SeanVolume/jails/customplugin_2/media

Scrub status (from the Web GUI):

Scrub status: Completed
Errors: 0     Repaired: 0     Date: Thu Jul 16 07:48:34 2015

What can I do to fix this (other than rebooting, since the ownCloud plugin needs to stay online)?


PS: 2015/07/17 Added:

[root@freenas] ~# zfs list
NAME                                                         USED  AVAIL  REFER  MOUNTPOINT
SeanVolume                                                  10.7T  1.34T   803K  /mnt/SeanVolume
SeanVolume/.system                                           304M  1.34T  3.21M  legacy
SeanVolume/.system/cores                                    5.53M  1.34T  1.32M  legacy
SeanVolume/.system/rrd-cd1fc29ce94d4a81a24df77359252261      219K  1.34T   219K  legacy
SeanVolume/.system/samba4                                    218M  1.34T  5.59M  legacy
SeanVolume/.system/syslog-cd1fc29ce94d4a81a24df77359252261  61.4M  1.34T  5.62M  legacy
SeanVolume/CandyDataset                                     1.42T  1.34T   292K  /mnt/SeanVolume/CandyDataset
SeanVolume/CandyDataset/Applications                         365K  1.34T   219K  /mnt/SeanVolume/CandyDataset/Applications
SeanVolume/CandyDataset/Documents                            365K  1.34T   219K  /mnt/SeanVolume/CandyDataset/Documents
SeanVolume/CandyDataset/Downloads                           1.22T  1.34T  1.22T  /mnt/SeanVolume/CandyDataset/Downloads
SeanVolume/CandyDataset/Music                                365K  1.34T   219K  /mnt/SeanVolume/CandyDataset/Music
SeanVolume/CandyDataset/Pictures                             365K  1.34T   219K  /mnt/SeanVolume/CandyDataset/Pictures
SeanVolume/CandyDataset/Temporary                            365K  1.34T   219K  /mnt/SeanVolume/CandyDataset/Temporary
SeanVolume/CandyDataset/Videos                               206G  1.34T   206G  /mnt/SeanVolume/CandyDataset/Videos
SeanVolume/PublicDataset                                    3.03M  1.34T   237K  /mnt/SeanVolume/PublicDataset
SeanVolume/PublicDataset/Applications                        365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Applications
SeanVolume/PublicDataset/Documents                           365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Documents
SeanVolume/PublicDataset/Downloads                           365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Downloads
SeanVolume/PublicDataset/Music                               365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Music
SeanVolume/PublicDataset/Pictures                            365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Pictures
SeanVolume/PublicDataset/Temporary                           365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Temporary
SeanVolume/PublicDataset/Videos                              365K  1.34T   219K  /mnt/SeanVolume/PublicDataset/Videos
SeanVolume/SeanDataset                                      8.04T  1.34T  2.14T  /mnt/SeanVolume/SeanDataset
SeanVolume/SeanDataset/AppData                              2.64M  1.34T   219K  /mnt/SeanVolume/SeanDataset/AppData
SeanVolume/SeanDataset/Applications                          178G  1.34T   256K  /mnt/SeanVolume/SeanDataset/Applications
SeanVolume/SeanDataset/Documents                            2.58T  1.34T  1.33T  /mnt/SeanVolume/SeanDataset/Documents
SeanVolume/SeanDataset/Downloads                            1.07T  1.34T   661G  /mnt/SeanVolume/SeanDataset/Downloads
SeanVolume/SeanDataset/ISOImage                              103G  1.34T   310K  /mnt/SeanVolume/SeanDataset/ISOImage
SeanVolume/SeanDataset/Music                                51.7G  1.34T   402K  /mnt/SeanVolume/SeanDataset/Music
SeanVolume/SeanDataset/Pictures                              131G  1.34T  2.08M  /mnt/SeanVolume/SeanDataset/Pictures
SeanVolume/SeanDataset/Saved Games                          75.7G  1.34T   237K  /mnt/SeanVolume/SeanDataset/Saved Games
SeanVolume/SeanDataset/Temporary                            9.63G  1.34T   520K  /mnt/SeanVolume/SeanDataset/Temporary
SeanVolume/SeanDataset/Videos                                846G  1.34T   792G  /mnt/SeanVolume/SeanDataset/Videos
SeanVolume/jails                                            3.80G  1.34T   678M  /mnt/SeanVolume/jails
SeanVolume/jails/.warden-template-pluginjail                6.33M  1.34T   591M  /mnt/SeanVolume/jails/.warden-template-pluginjail
SeanVolume/jails/.warden-template-pluginjail-9.3             597M  1.34T   591M  /mnt/SeanVolume/jails/.warden-template-pluginjail-9.3
SeanVolume/jails/customplugin_2                             1.17G  1.34T  1.27G  /mnt/SeanVolume/jails/customplugin_2
SeanVolume/jails/pluginjail                                 1.28G  1.34T   630M  /mnt/SeanVolume/jails/pluginjail
SeanVolume/ownCloudDataset                                  1.21T  1.34T   255G  /mnt/SeanVolume/ownCloudDataset
freenas-boot                                                 680M   107G   144K  none
freenas-boot/ROOT                                            670M   107G   144K  none
freenas-boot/ROOT/Initial-Install                              8K   107G   659M  legacy
freenas-boot/ROOT/default                                    670M   107G   665M  legacy
freenas-boot/grub                                           7.76M   107G  7.76M  legacy

2015/07/17 Added:

After running the zfs list -t snapshot command, it turned out there are a lot of tiny snapshots (2,534 in total...).

I am reviewing them and deleting those snapshots.

After deleting them, I will run df -TH again!
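
Roughly, the shell commands involved look like this (the snapshot name in the destroy example is hypothetical, and zfs destroy cannot be undone, so double-check every name first):

# list all snapshots, with the space each one holds exclusively, smallest first
zfs list -t snapshot -o name,used,referenced -s used -r SeanVolume

# destroy a single snapshot (example name only)
zfs destroy SeanVolume/SeanDataset/Documents@auto-20150601.0000-2w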


2015/07/20 Added:

I recovered some space after deleting many snapshots!

but...

The FreeNAS 9.3 Web GUI crashed...

After I cleared a large number of snapshots (approximately 1,500), the FreeNAS 9.3 Web GUI displays "An error occurred.", even after two days.

[root@freenas] /etc/defaults# df -TH
Filesystem                                                  Type      Size    Used   Avail Capacity  Mounted on
freenas-boot/ROOT/default                                   zfs       115G    698M    114G     1%    /
devfs                                                       devfs     1.0k    1.0k      0B   100%    /dev
tmpfs                                                       tmpfs      33M    5.5M     28M    16%    /etc
tmpfs                                                       tmpfs     4.2M    8.2k    4.2M     0%    /mnt
tmpfs                                                       tmpfs      11G     67M     11G     1%    /var
freenas-boot/grub                                           zfs       114G    8.1M    114G     0%    /boot/grub
SeanVolume                                                  zfs         4T    822k      4T     0%    /mnt/SeanVolume
SeanVolume/CandyDataset                                     zfs         4T    299k      4T     0%    /mnt/SeanVolume/CandyDataset
SeanVolume/CandyDataset/Downloads                           zfs       5.3T    1.3T      4T    25%    /mnt/SeanVolume/CandyDataset/Downloads
SeanVolume/CandyDataset/Videos                              zfs       4.2T    221G      4T     5%    /mnt/SeanVolume/CandyDataset/Videos
SeanVolume/PublicDataset                                    zfs         4T    224k      4T     0%    /mnt/SeanVolume/PublicDataset
SeanVolume/SeanDataset                                      zfs       9.6T    5.6T      4T    59%    /mnt/SeanVolume/SeanDataset
SeanVolume/jails                                            zfs         4T    710M      4T     0%    /mnt/SeanVolume/jails
SeanVolume/jails/.warden-template-pluginjail                zfs         4T    619M      4T     0%    /mnt/SeanVolume/jails/.warden-template-pluginjail
SeanVolume/jails/.warden-template-pluginjail-9.3            zfs         4T    619M      4T     0%    /mnt/SeanVolume/jails/.warden-template-pluginjail-9.3
SeanVolume/jails/pluginjail                                 zfs         4T    661M      4T     0%    /mnt/SeanVolume/jails/pluginjail
SeanVolume/ownCloudDataset                                  zfs       4.3T    336G      4T     8%    /mnt/SeanVolume/ownCloudDataset
SeanVolume/.system                                          zfs         4T    3.4M      4T     0%    /var/db/system
SeanVolume/.system/cores                                    zfs         4T    1.4M      4T     0%    /var/db/system/cores
SeanVolume/.system/samba4                                   zfs         4T    6.1M      4T     0%    /var/db/system/samba4
SeanVolume/.system/syslog-cd1fc29ce94d4a81a24df77359252261  zfs         4T     11M      4T     0%    /var/db/system/syslog-cd1fc29ce94d4a81a24df77359252261
SeanVolume/.system/rrd-cd1fc29ce94d4a81a24df77359252261     zfs         4T    224k      4T     0%    /var/db/system/rrd-cd1fc29ce94d4a81a24df77359252261
devfs                                                       devfs     1.0k    1.0k      0B   100%    /mnt/SeanVolume/jails/pluginjail/dev
procfs                                                      procfs    4.1k    4.1k      0B   100%    /mnt/SeanVolume/jails/pluginjail/proc
SeanVolume/jails/customplugin_2                             zfs         4T    1.4G      4T     0%    /mnt/SeanVolume/jails/customplugin_2
devfs                                                       devfs     1.0k    1.0k      0B   100%    /mnt/SeanVolume/jails/customplugin_2/dev
procfs                                                      procfs    4.1k    4.1k      0B   100%    /mnt/SeanVolume/jails/customplugin_2/proc
/mnt/SeanVolume/ownCloudDataset                             nullfs    4.3T    336G      4T     8%    /mnt/SeanVolume/jails/customplugin_2/media
asked by 陳敬翔
  • zpool list includes in ALLOC the space needed for the parity information, so ALLOC in zpool list is always more than what du shows. Try zfs list and post the output. – Sunzi Jul 16 '15 at 11:34
  • Could it be that you are using ZFS snapshots? They also allocate space, but will probably not be visible to a simple du command. You can find the snapshots and the space they use with the zfs list -t snapshot command. – Sunzi Jul 17 '15 at 07:43

1 Answer


I'll try to sum up the given information and give some explanation (where I know what it means):

About the zpool list output:
SIZE is the size of all disks, including the disks used for redundancy (together with the output of zpool status, that means you have put 8 3TB drives into a RAID-Z3 configuration).
ALLOC is the used space, also including the disks used for redundancy (so it is at least (5+3)/5 = 1.6 times the real data size, because 3 of the 8 disks are used for redundancy).
FREE is the free space, also including the disks used for redundancy (so the usable free space is at most 5/(5+3) = 0.625 of the shown free size).

The zfs list command says there is 1.34T available; zpool list says 3.16T free, so minus the redundancy space I would expect about 1.975T of usable free space.
I know that ZFS needs some space for metadata, but on my Linux system with 4 2TB drives in RAID-Z1 the difference is about 6%; your difference here is about 47%, and I can't say where it comes from.
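
Spelled out with the numbers above (a rough check that ignores metadata and assumes the 5-data/3-parity split of this pool):

3.16T raw free * 5/8        ≈ 1.975T expected usable free
1.975T - 1.34T (zfs AVAIL)  ≈ 0.64T unaccounted for, i.e. roughly 47% of the 1.34T reported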

About the output of df -TH:
df doesn't see ZFS snapshots, so it can't account for them, and the space it reports as used is not the space actually used in the ZFS filesystem. For allocated space in the ZFS filesystem, use zfs list.
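
To see how much of the used space is tied up in snapshots rather than live data, ZFS can break the accounting down per dataset; a minimal sketch, run against the pool in question:

zfs list -o space -r SeanVolume
# or with the columns spelled out explicitly:
zfs list -o name,avail,used,usedbysnapshots,usedbydataset,usedbychildren -r SeanVolume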

Deleting snapshots can take time, as you found out; that seems to be by design. You can take a look at http://nex7.blogspot.de/2013/03/readme1st.html
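
As a side note, if a big destroy seems to stall, pools with the async_destroy feature reclaim the space in the background; the amount still pending can be checked with the pool's freeing property (a sketch, assuming the feature is enabled on this pool):

zpool get freeing SeanVolume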

After deleting the snapshots, if I add up your df output, you have about 7.2T of data in your pool and about 4T free, making 11.2T in total.
If I take your complete size (21.8T) minus the redundancy disks, I would expect about 5/(5+3) * 21.8T = 13.6T (a difference of about 20%).

I think 20% is a little too much for the metadata needed by ZFS, but I can't tell you where it gets lost. Maybe you have many very small files, which use more metadata space than a smaller number of bigger files would.
I'm sorry, but I'm not deep enough into the ZFS internals to tell you where to read the metadata space usage.
You could run zfs list again and compare the used space there with the used space you get from df. Maybe that gives a hint.

Edited 2015/07/21:
About the size: 3TB drives actually hold 2.7 TiB (because drive manufacturers use 1 TB = 1,000,000,000,000 bytes, while computers use 1 TiB = 1024*1024*1024*1024 = 1,099,511,627,776 bytes).
2.7 TiB * 8 drives ≈ 21.8 TiB.
In normal RAID systems, the smallest drive determines the usable size of the larger drives.
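
If you want to verify the raw capacity of the individual member disks on FreeBSD/FreeNAS, something like the following should work (the device name ada0 is only an example):

# list the attached disks
camcontrol devlist
# show the raw capacity of one disk in bytes
diskinfo -v ada0 | grep mediasize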

answered by Sunzi
  • I really appreciate your detailed explanations; I learned a lot! In fact, I'm not sure why: I actually used 6 3TB and 2 4TB drives, which theoretically should give 24~26T, but it only comes to 21.8TB, and only 12.1TB after redundancy. As for why there is a 20% difference, maybe other ZFS experts can answer that. Thank you very much! – 陳敬翔 Jul 21 '15 at 07:53
  • I have a 24x1TB array in 4xRAIDZ2 (2 disks of redundancy per every 6 disks, a theoretical usable volume of 16TB) and `zpool list` shows 21.8TB. `zfs list` shows 13.3T. It seems that `zpool list` shows space projected for compression. – aaaaa says reinstate Monica Jul 21 '15 at 08:31
  • No. zpool list shows space including the redundancy drives. 1 TB = 0.909 TiB, and 0.909 TiB * 24 = 21.8 TiB, so zpool list simply shows the complete space of all drives. zfs list shows the usable space (minus redundancy, minus metadata). – Sunzi Jul 21 '15 at 11:53