
I have a DS918+ with several backup tasks configured in HyperBackup. One of them has a threshold set for excessive storage usage, which I need to adjust. However, whenever I try to adjust it, I get a popup error that says:

'The operation failed. Please log in to DSM again and retry.'

Naturally, logging in and out and retrying does not work. Likewise, even a full reboot of the NAS does not work.

I am using the most recent HyperBackup from the Package Center, version 2.2.2-1113.

Using the Chrome Developer Tools, I can isolate the request to DSM that is failing:

task_id: 1
statistic_params: {"enable_target_max_size":false,"enable_target_growth":false,"target_max_size_value":1363652116.48,"target_growth_value":512000,"enable_new_count":false,"enable_modify_count":false,"enable_delete_count":false,"new_count_percent":50,"modify_count_percent":50,"delete_count_percent":50}
api: SYNO.SDS.Backup.Client.Common.Statistic
method: config_set
version: 1

Which gets this result:

code: 120
errors: {name: "statistic_params", reason: "type"}
success: false
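For reference, the failing call can be reconstructed outside the browser to confirm exactly what is being sent; a minimal sketch (the form encoding mirrors what Chrome shows, but the exact serialization DSM uses internally is an assumption):

```python
import json
from urllib.parse import urlencode

# The statistic_params object captured in Chrome; note the fractional
# target_max_size_value, which turns out to be the culprit.
statistic_params = {
    "enable_target_max_size": False,
    "enable_target_growth": False,
    "target_max_size_value": 1363652116.48,
    "target_growth_value": 512000,
    "enable_new_count": False,
    "enable_modify_count": False,
    "enable_delete_count": False,
    "new_count_percent": 50,
    "modify_count_percent": 50,
    "delete_count_percent": 50,
}

# Rebuild the POST body the DSM UI sends for the config_set call.
body = urlencode({
    "task_id": "1",
    "statistic_params": json.dumps(statistic_params, separators=(",", ":")),
    "api": "SYNO.SDS.Backup.Client.Common.Statistic",
    "method": "config_set",
    "version": "1",
})

# The floating-point threshold survives into the payload verbatim.
print("1363652116.48" in body)
```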

This happens even if I leave the default '1.3 TB' threshold in place and just try to disable the warning.

How can I fix this?


1 Answer

This version of HyperBackup doesn't accept floating-point values for these thresholds, and rejects any fractional value with the `type` error shown above.

Instead, set a whole number of GB, for example 1300 GB instead of 1.3 TB, and HyperBackup will accept the value.
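The difference shows up in how the value serializes. A minimal illustration (assuming, for the sake of the example, that the threshold is converted to GB before being sent; the real payload's unit is unclear, but the int-vs-float point is the same):

```python
import json

# 1.3 TB is 1331.2 GB, so after unit conversion the threshold is no
# longer integral and serializes as a JSON float, which DSM rejects:
print(json.dumps({"target_max_size_value": 1331.2}))
# → {"target_max_size_value": 1331.2}

# A whole number of GB stays a plain int in the payload and is accepted:
print(json.dumps({"target_max_size_value": 1300}))
# → {"target_max_size_value": 1300}
```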

Of course, when HyperBackup later reads the value back from the server, it will display something like 1.27 TB, which again cannot be submitted as-is.

Until Synology ships a fix, the best bet is to enter only whole numbers in these settings.
