4

For a project, I'm using SQL Server 2008 R2. One table has a FILESTREAM column.

I've run some load tests, and now the database uses ~20 GB.

I've emptied all the tables except a few configuration tables, but the database was still using a lot of space. So I used Tasks -> Shrink -> Database / Files, but the database still uses something like 16 GB.

I found that it's the FILESTREAM file that is still using most of the space.
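The database's files and their types can be listed with a query like this against the standard sys.database_files catalog view (the file names and paths will differ per install); the FILESTREAM container shows up with type_desc = 'FILESTREAM':

```sql
-- List each file in the current database with its type and on-disk location.
SELECT name, type_desc, physical_name
FROM sys.database_files;
```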

The problem is that I need to back up this database to export it to the final production server, and even if I tell it to compress the backup, I get a file of more than 3.5 GB. Not convenient to store and upload.

And I'm planning much bigger tests, so I want to know how to reclaim that empty space.

When I try to shrink the FILESTREAM file in SSMS, I get this exception:

The properties SIZE, MAXSIZE, or FILEGROWTH cannot be specified for the FILESTREAM data
file 'FileStreamFile'. (Microsoft SQL Server, Error: 5509)
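For reference, the shrink dialog essentially issues something like the following (the exact statement SSMS generates may differ, and 'MyDatabase' and the target size are assumptions); specifying size-related properties for a FILESTREAM data file is what the error is rejecting:

```sql
-- Hypothetical reproduction of what the dialog attempts.
USE MyDatabase;
DBCC SHRINKFILE (N'FileStreamFile', 1024);  -- a target size cannot be applied to a FILESTREAM data file
```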

So what should I do?

I found several topics mentioning this error, but they were about removing the FILESTREAM column.

J4N

2 Answers

3

Old versions of files are removed from FILESTREAM via a garbage collection process that runs during the checkpoint process.

See http://sqlskills.com/BLOGS/PAUL/post/FILESTREAM-garbage-collection.aspx for a full explanation.

So - you jump through all the hoops, run a log backup and a checkpoint, and then you... wait, because the stupid garbage collector only seems to delete files at a rate of about 4 or 5 per second.

New in 2012, I think, is sp_filestream_force_garbage_collection ( http://msdn.microsoft.com/en-us/library/gg492195.aspx ) - but I haven't used it, so I can't say how effective it is.
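Putting the steps above together, the sequence looks roughly like this on SQL Server 2012+ (the database name and backup path here are placeholders; under the FULL recovery model, orphaned FILESTREAM files only become eligible for garbage collection once the log records referencing them have been backed up):

```sql
-- Back up the log so deleted FILESTREAM blobs become eligible for GC,
-- then run a checkpoint and force a garbage collection pass.
USE MyDatabase;  -- placeholder database name
BACKUP LOG MyDatabase TO DISK = N'C:\Backups\MyDatabase.trn';  -- placeholder path
CHECKPOINT;
EXEC sp_filestream_force_garbage_collection @dbname = N'MyDatabase';
```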

bkr
  • I'm sorry, this project was paused for months. So I did a backup and a checkpoint about 2 hours ago, but the space is still not released, and the shrink file still doesn't work. What should I do? – J4N Dec 10 '12 at 13:51
  • 1
    what kind of backup did you do, you need to remove all references to the file before it will garbage collect it. depending on your recovery mode, a full backup is not enough, you also need to do a transaction backup as well. – bkr Mar 22 '13 at 18:39
  • 1
    sp_filestream_force_garbage_collection worked perfectly, and quick too! Cheers. Instantly freed up 130GB+ of garbage files. (You might possibly need to switch to Simple recovery model for this to work.) – Kris Oct 02 '18 at 18:18
0

The problem is that the space won't be freed until garbage collection has completed. Unfortunately there's no way to force this.

I used a PowerShell script to iterate through the FILESTREAM files and empty their content. (Note that this will remove the FILESTREAM data from all records, so only do this if you no longer need it.) This will leave the file associated with each record in place, but the field will appear empty.

You will need to change the path to your filestream folder, but this is essentially what I used:

# Collect every file in the FILESTREAM container, skipping the *.hdr header files.
$files = Get-ChildItem -Path "C:\MSSQL\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\MyDbRestore_9.MyDbRestore_Filestream" -Recurse -File -Exclude "*.hdr"

# Truncate each file to zero bytes, reclaiming the disk space.
ForEach ($file in $files) { Clear-Content $file.FullName }