Hey Checkyourlogs Fans,

I was working with a customer today, and we are finally ready to upgrade our primary Veeam Backup Repository that is running Microsoft Storage Spaces on Windows Server 2016. The new Operating System will be Windows Server 2019 LTSC, and the whole point of this upgrade is to allow us to use ReFS + Deduplication.

The Deduplication feature is now supported on ReFS in the LTSC builds of Windows Server, and the purpose of this post is to show you how to upgrade the OS, install the Deduplication feature, enable it, and test it.

Step 1 – Download the media from your Volume License Site (VLK)

Step 2 – Mount the ISO in the Target system that you want to be upgraded.

Note: You should always have a backup of your files before proceeding with any upgrade as a best practice.

Step 3 – Before we pull the trigger on the upgrade, take a look at your Storage Pool’s Virtual Disk for a before picture.

As you can see, we have 6.66 TB free of 29.9 TB. We have this Storage Pool set up as a 3-Way Mirror for maximum performance for the Veeam Backup Repository.
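If you prefer PowerShell to the GUI for this before picture, something like the following captures the pool and virtual disk state so you can compare after the upgrade. This is a minimal sketch; the property selection is just what I find useful, and your pool and disk names will differ.

```powershell
# "Before" picture of the Storage Spaces configuration.
# Excludes the primordial pool so only created pools are shown.
Get-StoragePool -IsPrimordial $false |
    Select-Object FriendlyName, HealthStatus, Size, AllocatedSize

# Confirm the resiliency setting (a 3-Way Mirror shows NumberOfDataCopies = 3)
Get-VirtualDisk |
    Select-Object FriendlyName, ResiliencySettingName, NumberOfDataCopies, Size, HealthStatus
```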

Step 4 – Run Setup.exe locally to start the upgrade

Step 5 – Choose Download updates, drivers, and optional features (Recommended) and select the I want to help make the installation of Windows better checkbox

Step 6 – Enter your product key

Step 7 – Choose your edition

Step 8 – Accept the License Agreement

Step 9 – Select Keep personal files and apps (this is the option that makes it an in-place upgrade)

Step 10 – Because we are running a Storage Spaces Pool with a Virtual Disk, choose Confirm on the warning: Setup has detected that one or more virtual drives are directly attached to physical devices. You might need to reconnect the virtual drives after the upgrade is complete.

Step 11 – On the Ready to Install screen click Install

Step 12 – Grab a cup of coffee and wait for the upgrade to complete.

Step 13 – 30 minutes later, and we are back in business.

The good news is our Storage Pool and Volumes all came back online without having to do anything.

Step 14 – Let’s Install the Deduplication Feature now
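You can add the feature through Server Manager, or with a single PowerShell line; a sketch:

```powershell
# Install the Data Deduplication role service (no reboot normally required)
Install-WindowsFeature -Name FS-Data-Deduplication
```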

Step 15 – Configure Deduplication for your Veeam Repository
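As a sketch of the configuration, assuming the repository lives on D: (substitute your own drive letter and pick the usage type that fits your workload):

```powershell
# Enable dedup on the repository volume (D: is an assumption)
Enable-DedupVolume -Volume "D:" -UsageType Default

# Backup files are written once, so process them immediately
# instead of waiting the default number of days
Set-DedupVolume -Volume "D:" -MinimumFileAgeDays 0
```

Setting MinimumFileAgeDays to 0 makes sense for a backup repository, where files don't change after they are written.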

Step 16 – Start the Deduplication jobs via Task Scheduler
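The built-in Deduplication tasks live under \Microsoft\Windows\Deduplication in Task Scheduler. You can kick them off from PowerShell instead of the GUI, or simply start a job directly; the D: volume below is an assumption:

```powershell
# Start the scheduled background optimization task...
Get-ScheduledTask -TaskPath "\Microsoft\Windows\Deduplication\" -TaskName "BackgroundOptimization" |
    Start-ScheduledTask

# ...or start an optimization job directly on the repository volume
Start-DedupJob -Volume "D:" -Type Optimization
```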

Step 17 – View the results via Get-DedupJob, Get-DedupStatus, and Get-DedupVolume
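For reference, here is roughly what those three checks look like; the property selection is just a convenient subset:

```powershell
# Jobs currently queued or running, with progress
Get-DedupJob

# Per-volume savings and file counts
Get-DedupStatus | Select-Object Volume, FreeSpace, SavedSpace, OptimizedFilesCount

# Overall savings rate per volume
Get-DedupVolume | Select-Object Volume, SavingsRate, SavedSpace
```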

You can continue to view the progress until all of the Deduplication jobs are finished.

So, I would suggest it’s coffee time again while you wait.

Step 18 – Monitor the progress. You can also do this from Task Manager by selecting the Microsoft File Server Data Management Host process. This is the Deduplication engine running. Interestingly enough, this was consuming 20% of the CPU and 54 GB of RAM during the first pass.

Step 19 – View the final results

As a side note, I’m impatient when trying something out, so I wanted to push the system on the first pass to speed it up.

The initial 54 GB of RAM wasn’t enough, so I did this.

I grabbed a piece of code from my friend Mikael Nystrom:


Function Wait-VIADedupJob {
    while ((Get-DedupJob).Count -ne 0) {
        Start-Sleep -Seconds 30
    }
}

foreach ($item in Get-DedupVolume) {
    $item | Start-DedupJob -Type Optimization -Priority High -Memory 80
    $item | Start-DedupJob -Type GarbageCollection -Priority High -Memory 80 -Full
    $item | Start-DedupJob -Type Scrubbing -Priority High -Memory 80 -Full
}
Wait-VIADedupJob

There, this looks better now:

Here is the memory usage for the Microsoft File Server Data Management Process now.

If you are curious about what is happening and which file the Deduplication engine is working on, you can open Resource Monitor and look for the fsdmhost.exe process. You will see the file being read and then broken up into the Chunk Store. In this case, we can see the chunks landing in D:\System Volume Information\Dedup\ChunkStore with an extension of ccc.new.
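A quick way to spot-check the engine's resource usage without opening Task Manager is to query the process directly; a sketch:

```powershell
# Check CPU time and memory used by the dedup engine (fsdmhost.exe)
Get-Process -Name fsdmhost |
    Select-Object Name, CPU, @{ n = 'WorkingSetGB'; e = { [math]::Round($_.WorkingSet64 / 1GB, 1) } }
```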

Until I checked this, I thought that the jobs were stalled. I was wrong; the Deduplication engine was working through some very large Veeam backup .VBK files.

NOTE: Here is my estimate of how long the initial pass will take. I have seen the disk throughput running consistently at around 300 MB/sec. The calculation looks like this: 300 MB/sec * 60 seconds * 60 minutes = roughly 1 TB of processing per hour. With 24 TB of data, as in our case, the initial pass should take approximately 24 hours to complete.

You can see the time the job started and stopped by checking the Deduplication event log. Look for Event ID 6148, which carries the message: Optimization job has started.
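You can pull those events from PowerShell as well; a sketch against the Deduplication operational log channel:

```powershell
# List the optimization start events (ID 6148) with their timestamps
Get-WinEvent -LogName "Microsoft-Windows-Deduplication/Operational" |
    Where-Object { $_.Id -eq 6148 } |
    Select-Object TimeCreated, Id, Message
```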

We can check for the completion event ID when we come back to this later for the complete duration of the post-processing.

And after one day, which was pretty much my estimate, the initial pass completed. I’ve now gained back about 14 TB of space with Windows Server 2019 Deduplication.

I hope you enjoyed reading this and happy upgrading to Windows Server 2019 with Deduplication on ReFS.